Interview: Bret Schafer on Putin’s Internet Propaganda Push


Russian state-directed efforts to spread disinformation and influence political debate ahead of the 2016 US presidential election are now the subject of various government investigations. We sit down with Bret Schafer from the Alliance for Securing Democracy at the German Marshall Fund of the United States to learn more about the rise of targeted ‘fake news’ – or perhaps more appropriately, ‘fake polemic’ – and discuss what’s being done to insulate Western democracies from its effects.

** This interview was originally published on October 23, 2017

Could you briefly describe how Russia is spreading disinformation in Western countries?

Disinformation spreads in largely the same manner as legitimate information; the difference is that disinformation must be disseminated in a way that masks its intent and, frequently, its source. In a similar vein to the laundering of illicit finances, disinformation must be “cleaned” in order to establish a façade of legitimacy. Misleading information that can be directly traced to an unreliable source—say, a foreign intelligence agency—immediately loses its value. The process of spreading disinformation is therefore not simply about amplifying a message; it is about moving a message from a questionable to a credible source, or, at the very least, a source that appears credible.

The Russians, or more accurately, the Soviets, literally invented the word disinformation (most scholars trace its origins back to the Russian word dezinformatsiya), so they understand the framework outlined above better than anyone. The tools they are currently using are modern, but the techniques have been honed over a half-century of influence operations in the West. That’s important to stress because the spread of disinformation is not just about the manipulation of technology, it’s also about the manipulation of people.

But to directly answer your question, the Kremlin spreads disinformation in the West by first doing one of the following: publishing misleading content on one of its overt propaganda channels (RT or Sputnik); placing content on one of the many websites that are covertly funded by the Kremlin or Kremlin sympathizers; posting messages to Facebook and other social media sites (usually pseudonymously); commenting on articles or videos (again, pseudonymously); or promoting misleading content that they did not create but that serves their broader strategic purposes. Regardless of where the content originates, however, the Kremlin operates a network of bots and trolls that spreads the message, either by linking directly to the original story or by repackaging its themes into tweets or posts. As mentioned earlier, this serves not only to disseminate the message to a wider audience but also to conceal their direct involvement. The hope, of course, is that the misleading message is appropriated by the Western media, or by people online who connect with the message, without recognizing the messenger.

When did these Internet-focused Russian disinformation operations begin?

It is hard to pinpoint an exact beginning, but reports suggest that Kremlin-sponsored online influence operations date back to at least 2003. At that time, their focus was largely on countries with significant Russian-speaking minorities, particularly in Russia’s so-called “near abroad.” Active measures online increased during the 2008 war in Georgia, and then reached a fever pitch after the annexation of Crimea and the intensification of the war in Syria. In the West, we are just waking up to this threat, but the Baltics and Balkans have been dealing with influence operations for over a decade.

In terms of the actual messaging, what are some of the recurring themes in the tweets and stories distributed by Russian networks?

Content promoted by Kremlin-oriented networks can be roughly divided into two categories: geopolitical content that is of interest to Russia and social or political content that is of interest to specific foreign audiences. The strategy, in essence, is to gain credibility with a target audience by expressing support for their views, and then to use that credibility to promote the Kremlin’s views on various geopolitical issues.

On the Hamilton 68 dashboard, which tracks 600 Russian-influenced accounts on Twitter, we consistently see this dynamic in the themes that emerge within the network. Roughly half the content that is shared is an attempt to insinuate an account into organic American networks. Over the past three months, we have seen efforts by Kremlin-oriented accounts to amplify messages around Charlottesville, the NFL anthem protests, and the mass shooting in Las Vegas. Themes are fungible and tend to shift depending on the news cycle, but, broadly speaking, divisive social or political content is favored in order to establish a connection with a target audience.

The second set of content is generally promoted in order to rally support for, or limit opposition to, the Kremlin’s geopolitical agenda, particularly in Ukraine and Syria. Here, the messaging is very consistent. Beyond propagandistic content of the “we’re winning” ilk, we typically observe two general themes: anti-Americanism and Russia as either a global savior or a victim of the West.

Why are these themes chosen?

Moscow’s motive for amplifying pro-Kremlin views on geopolitics is transparent: they want to weaken opposition to their geopolitical agenda. In terms of the content that is promoted to connect with foreign audiences, I think “chosen” is probably the wrong word. That implies that there is a coordinated effort by the Kremlin to select themes that might resonate with audiences abroad. In reality, the pattern we see is far less structured and could best be described as a trial-and-error approach. The hope is that if they throw enough mud at the wall, something will stick.

Also, the themes promoted by Moscow are largely chosen by Americans (or whomever they are targeting). The Kremlin did not invent the NFL protests, or Charlottesville, or any other divisive, hot-button social issues in the United States. They simply exploit those issues for their own benefit.

How do Russian networks use “sock puppets” to influence foreign audiences?

First, it may be helpful to define the term. A sock puppet is a fictitious online persona that is used for the purpose of deception. Sometimes, that deception can be fairly benign; for example, an author who uses an alias to post a positive review of his or her own novel. Other times, sock puppets can be used to seriously damage the reputation of business competitors, ex-spouses, or political opponents, to name but a few examples.

Russian trolls cannot operate without the use of sock puppets, for many of the reasons that I have previously mentioned. Beyond the anonymity it provides, basic social psychology tells us that messages that come from within a group have more value than those that come from outside a group. In the context of the United States, for example, a post from “David from Des Moines” is going to have more currency than one from “Mikhail from Moscow.”

Trolls know this, of course, and thus adopt personas that fit the local context and the target audience. Sometimes, it’s as simple as the selection of a Western-sounding name. But oftentimes their attempts are more sophisticated. Trolls may use a profile picture of an attractive woman, or adopt a handle that suggests an affiliation with a group that lends credibility to the persona, say, the U.S. military. Recent reports, for example, suggest that Russian trolls created the @TEN_GOP account that spread divisive messages under the guise of being an official Republican account. It all goes back to influence: What kind of characteristics will lend the most credence to a message? That’s the persona they are going to adopt.

How many employees does it take to execute an effective influence operation? Are these private companies? Have any been found to have overt links to the Kremlin?

All it really takes is one person, operating numerous accounts, to influence an audience, particularly if that audience is relatively insular. Of course, it takes a larger network if you want to influence public discourse in a country the size of the United States or Germany. The more accounts you have flooding comment sections or retweeting messages, the more people you can potentially influence. It is safe to say that the Kremlin employs, in some capacity, several hundred, if not several thousand, people to carry out information warfare.

The question of whether or not these people are employees of public or private companies is somewhat irrelevant because the lines between the two are blurred in Russia. The Internet Research Agency in St. Petersburg (better known as the “troll farm”) allegedly employs up to 400 people and is clearly funded by the Kremlin, albeit indirectly. But the notion that Russian trolls are “employees” who work out of centralized office buildings where they receive daily talking points is not an accurate depiction. Most trolls operate in a far less structured environment, with only a loose connection to the Kremlin. They are essentially contractors or freelancers who are given wide lanes to operate within, but who do not work for any particular “troll farm” or, in some cases, even work in Russia. This gives the Kremlin plausible deniability, as evidenced by Vladimir Putin’s description of trolls as “patriotic hackers.” But the vast majority of them are being funded by those with ties to the Russian government.

What is the geographic scope of these operations? Which countries tend to be targeted?

If you want to know the countries the Kremlin wants to influence, all you need to do is follow the trail of RT and Sputnik. According to RT, the channel is now available in over 100 countries, so the scope is clearly global.

Obviously, though, there are certain countries and regions that are of greater importance to the Kremlin. As previously mentioned, post-Soviet states—the Baltics in particular—have been dealing with Russian disinformation for over a decade. But the clear epicenter of Russian influence operations is Ukraine. They have been on the front lines since 2014, and the breadth of disinformation operations there is unrivaled. Everything we are seeing now in the United States was test-run there first.

Have there been any notable successes for Russian disinformation ops, in the United States or elsewhere?

In order to determine whether or not an operation was “successful,” it is necessary to first define the Kremlin’s objectives. Unlike for-profit freelancers who peddle fake news, Russia is not interested in impressions or engagements; they are interested in influence, which is an inherently more difficult metric to measure.

Therefore, it would be inaccurate to claim that a Kremlin-funded disinformation campaign was successful simply because it gained traction in a target country. If that were the sole metric, there would be several notable successes, including the now-infamous “Lisa” case in Germany and the retracted Sputnik article that was cited by then-candidate Trump on the campaign trail. But again, to declare any disinformation operation a success we have to go back to the key question: What were the Kremlin’s goals and did any specific disinformation contribute to the achievement of those goals?

Russian government interference in the 2016 U.S. presidential election is a perfect case study of how difficult it is to untangle questions of contribution, attribution, and, more importantly, motive. If one were to take the position that the Kremlin interfered in the election simply to defeat Hillary Clinton, then, clearly, the Kremlin achieved its objective. But it is next to impossible to determine what effect, if any, Kremlin-funded disinformation had on the outcome of the election. Similarly, it is clear that Kremlin-oriented accounts spread divisive social messages in the United States before and after the election; what is not clear is to what degree those efforts have contributed to the broader divisions in our society.

Conversely, if one were to take the position that Russia’s active measures during the election were conducted with the goal of shifting U.S. policies in areas that are of interest to the Kremlin, one would have to say that Moscow’s efforts have dramatically backfired: bilateral relations between the United States and Russia have worsened, economic sanctions have widened (as a direct result of Russia’s interference), and the United States remains a roadblock in both Syria and Ukraine. Perhaps more importantly, if policy-makers in the United States were sleeping on the threat of Russian disinformation and interference, they are now wide awake.

What can Western democracies do to counter the threat of foreign state-directed disinformation campaigns?

Disinformation is extraordinarily difficult to regulate because of issues of free speech and, in the Internet age, the ease of placing and spreading misleading content. The first step, though, is acknowledging the problem. We are finally moving in that direction. Bipartisan and transatlantic initiatives, including our own, are raising awareness and acting as nexuses to bring together the various players on this issue. On the Hamilton 68 dashboard, for example, we are now able to show the themes being promoted to American audiences in real time, and we are developing policy options to deter future interference. Other efforts in this space are doing important work as well.

Governments and technology companies, though, will obviously play the most critical role. There have been some improvements from both sectors, but there is a lot more that needs to be done. Also, we need to do a better job of educating the public. Italy, for example, is adding media literacy training to its educational curriculum. That is something other countries should consider; the best way to inoculate the public is to give them the skills to defend themselves.

It has been almost a year since the U.S. presidential election. What has changed since then? Have the Russians ramped up their efforts? Has the U.S. government been able to mount an effective response?

As I mentioned earlier, the most important thing that has changed in the past year is awareness, both in the media and in government. Hardly a day goes by now without a major report exposing Russian government efforts to target and influence public opinion through disinformation. That is an important first step, but raising awareness is not enough. Now, we need to take action.

On that front, there has not been much progress. The tech companies have made some token gestures, but their collective response has not been adequate. At this point, one would be hard-pressed to say that it is more difficult to spread disinformation on social media today than it was at this point last year. That needs to change. Pressure from both the government and the public will go a long way towards achieving that goal.

In terms of Russian efforts, it is tough to say whether or not things have ramped up. What is clear is that our response so far has not been an effective deterrent, because they are still actively attempting to undermine democracies in the United States and Europe.

The majority of media and government focus has centered on Russia so far. Are there other countries engaged in these kinds of operations?

The short answer is, yes, there are absolutely other state actors engaged in disinformation operations. China is active in Taiwan and Australia. The Turkish government targets diaspora communities in Europe. Philippine President Rodrigo Duterte reportedly operates a “keyboard army” to silence critics. It is safe to say that many, many other governments engage in this kind of activity as well.

That said, the scale, scope, and sophistication of Russian efforts are unrivaled. There really is no comparison. And it is important to note that when we talk about influence operations, the Kremlin uses a wide array of tools to interfere in Western countries, including support for extremist groups, malign financial influence, and state economic influence. Disinformation is part of that toolkit, but it is part of a much larger puzzle.

 

The opinions, beliefs, and viewpoints expressed by the authors are theirs alone and don’t reflect any official position of Geopoliticalmonitor.com.
