

You, Robot

Apr 12, 2017


To borrow from Dickens, it is the best of media, it is the worst of media. The times we are living in give us unparalleled access to information, but much of what we receive is biased, manipulated or deliberately false.

Two recent phenomena highlight our Internet predicament: Russia’s aggressive information warfare in foreign affairs and Facebook’s uncritical control over the way “news” gets delivered to its two billion-member “community.”

To understand why we should be so concerned about the way Facebook “curates” the news, we should first look at what happens when a state tries to distort the news. 

The word robot, as most students of Russian quickly learn, is related to the Russian word rabota, meaning “work.” (In Czech, robota means “forced labor.”) Robot has another derivative in “bot,” the term for the stealthy software that captures and/or proliferates messages across the Internet.

Bots exist in many forms — in commercial advertising, on social media platforms, and the like. And Russians are not the only ones who design and use them. But in politics, at the international level, no one puts bots to work more effectively than the Russians.

This was recently illustrated by Clinton Watts, a fellow at the Foreign Policy Research Institute and at the Center for Cyber and Homeland Security at George Washington University, in testimony before the Senate Intelligence Committee. Watts is the co-author of a report entitled “Trolling for Trump: How Russia Is Trying to Destroy Our Democracy.”

Watts described how bots — thousands of them — were set up in Russia last year to create fake identities on Twitter and other social media. These fake profiles had American-sounding names and photos and pumped out fake news during the 2016 election campaign.  

This is not, of course, the only charge of Russian “interference” being investigated by the Senate and House Intelligence Committees — and by the FBI.

Far more explosive is the allegation that the Russian government hacked into emails at the Democratic National Committee and then arranged for WikiLeaks to publish them in order to help Donald Trump win the White House. The entire U.S. intelligence community has “high confidence” that this was the case.

More explosive still is the charge — now under investigation — that associates of Donald Trump may actually have collaborated with the Russians in their cyberattacks.

But while these dramatic possibilities provoke concern and attention, it’s important not to lose sight of the everyday use of bots to spread fake news and sow uncertainty about whether any information can be trusted. Watts and many other experts assert that this is a goal of Russian foreign policy.  



That the Russian government supports substantial Internet “trolling” operations and provides financing to nationalist-populist candidates and political parties is no longer subject to doubt. Such efforts date back at least to the weaponized information campaign that the Kremlin unleashed against Ukraine more than three years ago.

To date, and scandalously so, the only Western response has been a weekly email from the European Union, sent to media and government officials, called “Disinformation Review.”

So how does this impact the other part of our “robotized” information environment — our “friends” at Facebook (and other social media platforms)?

A new study by the Tow Center for Digital Journalism at Columbia University’s Journalism School, The Platform Press: How Silicon Valley Reengineered Journalism, sheds important light, noting that more than 60 percent of Americans now get their news from social media, Facebook being the most popular platform among them. And, as the study further states, “In the Facebook information system, control over the distribution of journalism is ceded to the algorithm…The News Feed algorithm decides what users see first when they open Facebook, [and] this equation is constantly iterated in order to optimize the experience and maximize the commercial value of the News Feed…”
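The Tow Center describes the News Feed only in general terms. Purely as an illustration, here is a minimal sketch of what an engagement-optimized ranker reduces to; it assumes nothing about Facebook’s actual system, and the Post fields, weights, and scoring function are all invented for this example.

    from dataclasses import dataclass

    @dataclass
    class Post:
        # Hypothetical engagement signals; real ranking systems use
        # thousands of proprietary features, and none of these are Facebook's.
        clicks: int
        shares: int
        comments: int
        seconds_viewed: float

    def engagement_score(post: Post) -> float:
        # Invented weights. The point is only that the objective rewards
        # attention; nothing in it measures accuracy.
        return (1.0 * post.clicks
                + 3.0 * post.shares        # shares spread content furthest
                + 2.0 * post.comments
                + 0.1 * post.seconds_viewed)

    def rank_feed(posts: list[Post]) -> list[Post]:
        # "What users see first" is simply whatever scores highest.
        return sorted(posts, key=engagement_score, reverse=True)

    # A sensational fabrication that attracts shares outranks a sober,
    # accurate story that merely gets read.
    feed = rank_feed([
        Post(clicks=120, shares=400, comments=90, seconds_viewed=8000.0),
        Post(clicks=150, shares=30, comments=40, seconds_viewed=9000.0),
    ])

Under an objective like this, whatever best captures attention rises to the top, true or false; that, in essence, is the dynamic the study describes.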

Would a bot by any other name be so…robotic?

BuzzFeed’s Craig Silverman has chronicled Facebook’s incredible control over the distribution of news and found that “the top-performing fake election news stories on Facebook generated more engagement than the top stories from major news outlets such as the New York Times, Washington Post, Huffington Post, NBC News, and others.”

Isn’t this precisely the sort of news consumption that plays into the hands of the authors of Russian bots and disinformation?

Facebook’s management has been reluctant to accept responsibility for the flow of fake news on its platform, which seems to have played a significant role — just like the Kremlin’s covert campaign of bots, hacking and leaks — in last year’s election. If we are determined to go after Russian “bots” and trolls, shouldn’t we also demand more accountability from those who greatly influence the news we are “fed” via social media? The Tow Center report points out that “mass data collection and the decline of user internet privacy” are central to the phenomenon, and

"As David Carroll, Associate Professor of Media Design at The New School explains, if we consider how increasingly weaponized ad targeting has become, especially since this past summer when Google and Facebook consolidated our browsing histories into their user IDs, and we think about how anybody in the world could target anyone else in the United States with surgical precision by their susceptibilities and propensities, maybe this election was similar to a 9/11 moment, but non-violent and invisible, where we realize that our commercial infrastructure was used against us, and we don’t realize it until after the catastrophe?"

In truth, we need to “unmask” all the manipulators, from those who work in or with government, to those who shape our news consumption via algorithms that distribute fake news while “optimizing our experience.”

On the first front, there are finally signs of life from the U.S. government. Last December, a Defense Authorization bill was passed and signed into law that calls for the State Department to carry out anti-disinformation work through its Global Engagement Center. According to Senator Rob Portman (R-OH), a sponsor of the legislation, the State Department is now gearing up its effort. Last week, Portman had a useful exchange of views with EU Ambassador David O’Sullivan, to try to kick-start U.S.-EU cooperation in this field.

Secretary of State Rex Tillerson, in Moscow for his first official visit, is also expected to “confront” Russia over its activities aimed at undermining Western democracies.

There are also some signs that Facebook, Twitter and other social media platforms are being brought a bit more into line.

It will take a lot of “rabota” — work — to bring the “bots” under control, whether they are a function of Russian intentionality or of Facebook’s effort to monetize every bit of news on the Web. And it will take time to judge how successful such efforts will be in discouraging Russian bots and trolls. But the important thing is that they commence. There’s not a minute to lose.

Photo by Luiz Perez | CC BY 2.0
