The CPD Blog is intended to stimulate dialog among scholars and practitioners from around the world in the public diplomacy sphere. The opinions represented here are the authors' own and do not necessarily reflect CPD's views.

The Really Dark Side of Facebook

Apr 23, 2018

by Shaun Riordan

The world has been outraged by Cambridge Analytica scraping personal data from Facebook to facilitate targeted campaigning in the U.S. Presidential election and, possibly, the Brexit referendum. But this is small beer.

The real story is how Facebook, along with other social media platforms and search engines like Google, supports Russian information warfare while frustrating public diplomacy strategies.

Either Facebook starts sharing more information about how its algorithms work, or social media platforms will be reduced to battlefields for 21st century warfare. The advertisers won't like it.

Cambridge Analytica devised an app that enabled it to scrape personal data from Facebook users and develop targeted campaigning for the 2016 U.S. Presidential election (and possibly the Brexit vote). Outrage was universal. Congress summoned Zuckerberg to explain himself (which demonstrated only that the House of Representatives has a marginally better understanding of the internet than the Senate). The British House of Commons had to make do with a whistleblower from Cambridge Analytica. But there is nothing new here. Scraping data from the internet for nefarious (or beneficent) purposes has been around for years. It can be dealt with either by regulation (the EU's General Data Protection Regulation, or GDPR, for example, protects the data of EU citizens wherever in the world it is stored) or by education (teaching people to be careful about what they put online).

The really dirty secret about Facebook, other social media platforms and search engines is the way they facilitate Russian information warfare while frustrating the public diplomacy campaigns aimed at countering it. Social media platforms like Facebook are driven by algorithms that ensure users get content matching their known likes. They are designed to let advertisers target users with products they are likely to buy. But they also ensure that users receive only the news, opinions and friend suggestions that fit their known prejudices. This matters. As a growing number of people receive some or all of their news from social media, social media tend to reinforce, rather than challenge, their existing prejudices. This in turn reinforces the echo chamber effect, in which we listen only to news and opinions we already agree with. Political and social debate is increasingly fragmented.
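The reinforcement loop described above can be sketched in a few lines. This is a deliberately crude toy, not Facebook's actual system: the posts, topic labels and scoring rule are invented for illustration. It simply ranks posts by overlap with a user's past likes, which is enough to show how content the user disagrees with sinks out of view.

```python
# Toy sketch of an engagement-driven feed ranker (illustrative only).
def rank_feed(posts, liked_topics):
    """Score each post by overlap with the user's known likes and
    return posts in descending order of that score."""
    def score(post):
        return len(set(post["topics"]) & set(liked_topics))
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": ["politics-left"]},
    {"id": 2, "topics": ["politics-right"]},
    {"id": 3, "topics": ["politics-left", "sport"]},
]

# A user who has only ever liked left-leaning content...
feed = rank_feed(posts, liked_topics=["politics-left"])

# ...sees agreeable posts first; the dissenting post sinks to the bottom.
print([p["id"] for p in feed])  # prints [1, 3, 2]
```

Nothing here targets the user maliciously; the narrowing is a side effect of optimizing for predicted engagement, which is precisely why it also serves an information warrior who tailors messages to each echo chamber.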



Search engines use algorithms to rank the web pages returned by any search. The public may assume these algorithms objectively reflect relevance to the individual search, but a series of other factors come into play. Despite their complicated mathematics, algorithms are not objective; they reflect the epistemological biases of their designers. Experts who have studied how the algorithms of search engines like Google function have developed search engine optimization (SEO) techniques to ensure that a given web page gains priority in search results. In 2015, far-right political groups used SEO techniques to game Google: searches for how many people died in the Holocaust first produced web pages by Holocaust deniers.
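Why can a ranking be gamed at all? Because the formula rewards measurable signals, and anyone who infers the formula can manufacture those signals. The toy below is a hypothetical two-signal ranker (keyword frequency plus inbound links, with made-up weights); it bears no relation to Google's real algorithm, but it shows an "optimized" page outranking a more substantive one.

```python
# Toy ranker: rewards keyword repetition and inbound links (invented weights).
def toy_rank(pages, query):
    def score(page):
        keyword_hits = page["text"].lower().count(query.lower())
        return keyword_hits * 1.0 + page["inbound_links"] * 0.5
    return sorted(pages, key=score, reverse=True)

pages = [
    # A substantive page with real inbound links.
    {"name": "reference-site",
     "text": "official election results archive",
     "inbound_links": 4},
    # A thin page stuffed with the keyword to game the formula.
    {"name": "seo-optimized-site",
     "text": "election election election election",
     "inbound_links": 2},
]

results = toy_rank(pages, query="election")
print([p["name"] for p in results])  # the keyword-stuffed page ranks first
```

Real search engines use far richer signals and actively penalize keyword stuffing, but the structural point stands: whoever best reverse-engineers the scoring function controls what searchers see first.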

Russian information warfare aims at generating uncertainty in western societies. It does not aim to convince people of one interpretation of events, but to undermine confidence in all interpretations of events. Fake news does not seek to convince people that it is true, but rather that all news is equally fake. Information warfare uses disinformation to fragment social and political debate and so undermine citizens' confidence in the narratives of their governments. Russia uses surrogates, whether Moscow troll farms or innocent dupes, both to ensure plausible deniability and to avoid the need for consistency in the messages it promotes. This allows it to reach echo chambers with radically different views. Social media platforms like Facebook fit the needs of information warfare like a glove. Information warriors tailor their messages to the different echo chambers they want to inflame; Facebook's algorithms ensure they arrive. The hard thing about combating fake news is not that it is fake, but that each individual receives the fake news he or she wants to believe (political elites and liberals are equally vulnerable). Social media algorithms deliver the disinformation to targets already disposed to believe it, while concealing the role of the information warriors.

Public diplomacy aims to engage with foreign publics as a whole to create political and social environments favorable to subsequent specific policy proposals. Governments do public diplomacy because they believe that foreign publics can influence the decisions taken by their governments. Although modern public diplomacy centers on two (or more)-way conversations with foreign publics, it must be coherent to be effective. Although it uses surrogates, it does so not to conceal its origins, but because the surrogates are more credible or effective advocates than diplomats. Public diplomacy seeks to change opinions, not reinforce prejudices. The fragmentation of social and political debate produced by the echo chamber effect, reinforced by social media algorithms, makes it difficult to engage with foreign publics as a whole. To the extent that public diplomacy campaigns depend on social media like Facebook and Twitter, the same algorithms condemn them to reach only those who already agree with their premises. Social media algorithms not only make it hard for public diplomacy to change the opinions of foreign publics; they make it virtually impossible for public diplomats even to reach those who disagree with them. If you can't get to them, you can't change their minds.

To devise public diplomacy campaigns that counter Russian (and other) information warfare, western diplomats need a better understanding of the algorithms on which social media platforms and search engines operate. But neither Facebook nor Google is going to share this information lightly. These algorithms are the competitive advantage on which their business models depend. Zuckerberg would much prefer stringent regulation to risking the details of Facebook's algorithms becoming known to his competitors. However, if social media platforms and search engines do not collaborate with western governments, they may find those governments countering Russian information warfare with information warfare of their own. Facebook and other social media platforms would be reduced to the battlefields of 21st century information warfare. The advertisers who now fund these platforms and search engines would rapidly be scared off. Insofar as social media platforms and search engines are not simply commercial operations, but shape the environment of international relations, the companies that own them may find themselves facing coercive diplomacy.

Note from the CPD Blog Manager: This piece originally appeared on BideDao, Shaun Riordan's website.

