The media this week are exploding with news that a company called Cambridge Analytica used shadily-obtained Facebook data to influence the US elections. The data was harvested by another shady company using an app that legally exploited Facebook’s privacy rules at the time, and then handed over to Cambridge Analytica, who used it to micro-target adverts over Facebook during the election, mostly aimed at getting Trump elected. The news is still growing, and it appears that Cambridge Analytica was up to a bunch of other shady stuff too – swinging elections in developing countries through fraud and honey-traps, getting Facebook data from other sources, and possibly colluding with the Trump campaign in violation of campaign finance laws – and it certainly looks like a lot of trouble is deservedly coming their way.
In response to this a lot of people have been discussing Facebook itself as if it is responsible for this problem, is itself a shady operator, or somehow represents a new and unique problem in the relationship between citizens, the media and politics. Elon Musk has deleted his company’s Facebook accounts, there is a #deleteFacebook campaign running around, and lots of people are suggesting that the Facebook model of social networking is fundamentally bad (see e.g. this Vox article about how Facebook is simply a bad idea).
I think a lot of this reaction against Facebook is misguided, does not see the real problem, and falls into the standard mistake of thinking a new technology must necessarily come with new and unique threats. I think it misses the real problem underlying Cambridge Analytica’s use of Facebook data to micro-target ads during the election and to manipulate public opinion: the people reading the ads.
We use Facebook precisely because of the unique benefits of its social and sharing model. We want to see our friends’ lives and opinions shared amongst ourselves, we want to be able to pass along things we like or approve of, and we want to be able to engage with what our friends are thinking and saying. Some people use Facebook as I do, carefully curating the content providers allowed on their feed to ensure they aren’t offensive or upsetting, and screening out political opinions they disagree with; others use it for the opposite purpose, to engage with their friends’ opinions, see how they are thinking, and openly debate and disagree about a wide range of topics in a social forum. Many of us treat it as an aggregator for cat videos and cute viral shit; some of us only use it to keep track of friends. But in all cases the ability of the platform to share and engage is why we use it. It’s the one thing that separates it from traditional mass consumption media. This is its revolutionary aspect.
But what we engage with on Facebook is still media. If your friend shares a Fox and Friends video of John Bolton claiming that Hillary Clinton is actually a lizard person, when you watch that video you are engaging with it just as if you were engaging with Fox and Friends itself. The fact that it’s on Facebook instead of TV doesn’t suddenly exonerate you of the responsibility and the ability to identify that John Bolton is full of shit. If Cambridge Analytica micro-targets you with an ad that features John Bolton claiming that Hillary Clinton is a lizard person, that means Cambridge Analytica have evidence that you are susceptible to that line of reasoning, but the fundamental problem here remains that you are susceptible to that line of reasoning. Their ad doesn’t become extra brain-washy because it was on Facebook. Yes, it’s possible that your friend shared it, and we all know that people trust their friends’ judgment. But if your friends think that shit is reasonable, and you still trust your friend’s judgement, then you and your friend have a problem. That’s not Facebook’s problem, it’s yours.
This problem existed before Facebook, and it exists now outside of Facebook. Something like 40% of American adults think that Fox News is a reliable and trustworthy source of news, and many of those people think that anything outside of Fox News is lying and untrustworthy “liberal media”. The US President apparently spends a lot of his “executive time” watching Fox and Friends and live tweeting his rage spasms. No one forces him to watch Fox and Friends, he has a remote control and fingers, he could choose to watch the BBC. It’s not Facebook’s fault, or even Fox News’s fault, that the president is a dimwit who believes anything John Bolton says.
This is a much bigger problem than Facebook, and it’s a problem in the American electorate and population. Sure, we could all be more media savvy, we could all benefit from better understanding how Facebook abuses privacy settings, shares our data for profit, and enables micro-targeting. But once that media gets to you it’s still media, and you still have a responsibility to see if it’s true or not, to assess it against other independent sources of media, to engage intellectually with it in a way that ensures you don’t just believe any old junk. If you trust your friends’ views on vaccinations or organic food or Seth Rich’s death more than you trust a doctor or a police prosecutor, then you have a problem. Sure, Facebook might improve the reach of people wanting to take advantage of that problem, but let’s not overdo it here: in the 1990s you would have been at a bbq party or a bar, nodding along as your friend told you that vaccines cause autism and believing every word of it. The problem then was you, and the problem now is you. In fact it is much easier now for you to not be the problem. Back in the 1990s at that bbq you couldn’t have surreptitiously whipped out your iPhone and googled “Andrew Wakefield” and discovered that he’s a fraud who has been struck off by the GMC. Now you can, and if you choose not to because you think everything your paranoid conspiracy theorist friend says is true, the problem is you. If you’re watching some bullshit Cambridge Analytica ad about how Hillary Clinton killed Seth Rich, you’re on the internet, so you have the ability to cross-reference that information and find out what the truth might actually be. If you didn’t do that, you’re lazy or you already believe it or you don’t care or you’re deeply stupid. It’s not Facebook’s fault, or Cambridge Analytica’s fault. It’s yours.
Facebook offers shady operatives like Robert Mercer the ability to micro-target their conspiracy theories and lies, and deeper and more effective reach for those lies through efficient use of advertising money and the multiplicative effect of the social network. It also gives them a little bit of a trust boost, because people believe their friends are trustworthy. But in the end the people consuming the media this shady group produces are still people with an education, judgment, a sense of identity and a perspective on the world. They are still able to look at junk like this and decide that it is in fact junk. If you sat through the 2016 election campaign thinking that this con-artist oligarch was going to drain the swamp, the problem is you. If you thought that Clinton’s email practices were the worst security issue in the election, the problem is you. If you honestly believed The Young Turks or Jacobin mag when they told you Clinton was more militarist than Trump, the problem is you. If you believed Glenn Greenwald when he told you the real threat to American security was Clinton’s surveillance and security policies, the problem is you. If you believed that Trump cared more about working people than Hillary Clinton did, then the problem is you. This stuff was all obvious and objectively checkable and easy to read, and you didn’t bother. The problem is not that Facebook was used by a shady right-wing mob to manipulate your opinions into thinking Clinton was going to start World War 3 and hand everyone’s money to the bankers. The problem is that when this utter bullshit landed in your feed, you believed it.
Of course the problem doesn’t stop with the consumers of media; it extends to the creators. Chris Cillizza is a journalist who hounded Clinton about her emails and her security issues before the election, and to this day continues to hound her, and he worked for reputable media organizations who thought his single-minded obsession with Clinton was responsible journalism. The NY Times was all over the email issues, and plenty of NY Times columnists like Maureen Dowd were sure Trump was less militarist than Clinton. Fox carefully curated its news feed to ensure the pussy-grabbing scandal was never covered, so more Americans knew about the emails than the pussy-grabbing. Obviously if no one is creating content about how terrible Trump is, then we on Facebook are not able to share it with each other. But again the problem here is not Facebook – it’s the American media. Just this week we learn that the Atlantic, a supposedly centrist publication, is hiring Kevin D. Williamson – a man who believes women who get abortions should be hanged – to provide “balance” to its opinion section. This isn’t Facebook’s fault. The utter failure of the US media to hold their government even vaguely accountable for its actions over the past 30 years, or to inquire with any depth or intelligence into the utter corruption of the Republican party, is not Facebook’s fault or ours, it’s theirs. But it is our job as citizens to look elsewhere, to try to understand the flaws in the reporting, to deploy our education to the benefit of ourselves and the civic society of which we are a part. That’s not Facebook’s job, it’s ours. Voting is a responsibility as well as a right, and when you prepare to vote you have the responsibility to understand the information available about the people you are going to vote for.
If you decide that you would rather believe Clinton killed Seth Rich to cover up a paedophile scandal, rather than reading the Democratic Party platform and realizing that strategic voting for Clinton will benefit you and your class, then the problem is you. You live in a free society with free speech, and you chose to believe bullshit without checking it.
Deleting Facebook won’t solve the bigger problem, which is that many people in America are not able to tell lies from truth. The problem is not Facebook, it’s you.
March 24, 2018 at 4:56 pm
While I agree that the problem is gullible people who are easily manipulated, it doesn’t absolve Facebook of selling user data for financial gain.
March 24, 2018 at 9:11 pm
Absolutely! And also their sly terms of service and disclosure process that doesn’t really help you understand what they’re going to actually do with your data. We all need to be more clued up about those things. But better sense of that won’t stop people being deluded by bullshit.
March 24, 2018 at 9:27 pm
I use FB but would rather not. I had to sign up for work and then people found me. It’s also where my entire overseas family hangs out. I wish they’d use something else. WordPress.com would be better. Google Circles is another option but I’m not sure what they do with data. I have noticed that younger members of my family use FB less. It’s mostly my parents’ generation and mine who actively use it.
March 29, 2018 at 5:26 pm
All true at a personal level. But many people don’t pay much attention to politics or indeed the wider world at all. Also they lack the training and background knowledge to sort through lies and misinformation. Finally, people are embedded in social networks where heterodox opinions may be met with hostility or worse. Most people don’t challenge the crazy uncle at the family get-together because, manners, family, politeness. They don’t challenge the boss, or start arguments at work. We who work in argumentative environments forget this.
Also, the US is deeply patronal and oligarchic in much of its social micro-structure, with widespread partisan identification (how often is a non-elected official or leader introduced elsewhere with their party affiliation after their name). It does not make for non-conformity.
March 29, 2018 at 9:22 pm
I think the problem with your theory is there clearly in the middle of it: “Most people don’t challenge the crazy uncle at the family get-together.” There were family get-togethers and crazy uncles before Facebook, and there will be around the trashcan fire once Trump starts a nuclear war. Facebook doesn’t offer that crazy uncle anything except a slightly bigger foghorn, and the reality is that if anyone at that bbq was likely to listen to him then they’re also likely to listen to him on Facebook. The only difference is that he’s more likely to be challenged – without the risk of having his spit in your face you can actually argue, and with links.
The same thing applies to your points about the US’s political culture. That has no connection with Facebook and there is no more partisan behavior on Facebook than in real life. My father is the classic crazy uncle and he isn’t on Facebook, but he carefully curates his media consumption to ensure that he never sees anything he doesn’t agree with. He only reads The Daily Mail and doesn’t trust any other source, and he is completely impervious to alternative information. This isn’t Facebook’s fault or the Daily Mail’s, it’s his, and he was 100% convinced to vote for Brexit without any tipping over the edge by Cambridge Analytica.
This belief that it’s technology’s fault is a common flaw of modern journalism, dating back at least to the advent of the VCR. Consider this recent Guardian opinion piece, which argues that while you might think you’re immune to targeted ads, you’re not, and says that we don’t know what impact this micro-targeting could be having on society. Actually, we do know, because targeted ads have been a go-to complaint of people criticizing media since at least the 1980s. TV has always targeted ads, and people have always accepted this – you see different ads during daytime TV than during kids’ TV, for example. For a while in the 2000s I think Canada banned advertising during children’s TV time because of the widespread concern about the damage targeted ads were doing. I’m sorry, but I grew up listening to complaints about how TV ads were targeted, along with how they have a different sound profile to the shows, and oh, we have to put certain magazines in paper bags because children might see them and get warped ideas about sexuality. Nothing Facebook is doing with its adverts is any different to this, except that – ideally – now my media will be full of ads for things I actually want[1]. I would love it if instead of random adverts for Air France (which I will never fly if I can help it!) I could see adverts for RPGs I want to buy, computer games I want to play, books I want to read. Yesterday Facebook targeted me with ads for jobs as a private English teacher, which given my age and career situation is a ridiculous waste of their time and Interac’s money.
We all have a responsibility to learn how to handle the information that saturates our world. For those of us who graduated before information saturation was a thing – back when the only sources of information were television and the daily newspaper – and who are used to only “authoritative” voices, some retraining and reworking is necessary. But those people are primarily older people, and they are already often very politically determined and ideological, so they’re both the least likely to retrain themselves to manage it and the least likely to be influenced by alternative ideas – or to be on social networks at all. Nonetheless, the issue for these people is education, not the platform their media is delivered on. If you think Fox News is objective and balanced, and you believe the conspiracy theories Tucker Carlson tells you, you’re in big trouble no matter how many people delete Facebook.
—
fn1: Would that it were so, but Facebook’s algorithms are terrible, and Instagram only ever manages to show me adverts for things I found on the web by myself.