Influence Operations Kill Chain

Influence operations are elusive to define. The Rand Corp.’s definition is as good as any: “the collection of tactical information about an adversary as well as the dissemination of propaganda in pursuit of a competitive advantage over an opponent.” Basically, we know it when we see it, from bots controlled by the Russian Internet Research Agency to Saudi attempts to plant fake stories and manipulate political debate. These operations have been run by Iran against the United States, Russia against Ukraine, China against Taiwan, and probably lots more besides.

Since the 2016 US presidential election, there has been an endless series of ideas about how countries can defend themselves. It’s time to pull those together into a comprehensive approach to defending the public sphere and the institutions of democracy.

Influence operations don’t come out of nowhere. They exploit a series of predictable weaknesses—and fixing those holes should be the first step in fighting them. In cybersecurity, this is known as a “kill chain.” That can work in fighting influence operations, too—laying out the steps of an attack and building a taxonomy of countermeasures.

In an exploratory blog post, I first laid out a straw man information operations kill chain. I started with the seven commandments, or steps, laid out in a 2018 New York Times opinion video series on “Operation Infektion,” a 1980s Russian disinformation campaign. The information landscape has changed since the 1980s, and these operations have changed as well. Based on my own research and feedback from that initial attempt, I have modified those steps to bring them into the present day. I have also changed the name from “information operations” to “influence operations,” because the former is traditionally defined by the US Department of Defense in ways that don’t really suit these sorts of attacks.

Step 1: Find the cracks in the fabric of society—the social, demographic, economic, and ethnic divisions. For campaigns that just try to weaken collective trust in government institutions, lots of cracks will do. But for influence operations that are more directly focused on a particular policy outcome, only cracks related to that issue will be effective.

Countermeasures: There will always be open disagreements in a democratic society, but one defense is to shore up the institutions that make that society possible. Elsewhere I have written about the “common political knowledge” necessary for democracies to function. That shared knowledge has to be strengthened, thereby making it harder to exploit the inevitable cracks. It needs to be made unacceptable—or at least costly—for domestic actors to use these same disinformation techniques in their own rhetoric and political maneuvering, and cooperation should be highlighted and encouraged when politicians honestly work across party lines. The public must learn to become reflexively suspicious of information that makes them angry at fellow citizens. These cracks can’t be entirely sealed, as they emerge from the diversity that makes democracies strong, but they can be made harder to exploit. Much of the work on “norms” falls here, although this is essentially an unfixable problem. That makes the countermeasures in the later steps even more important.

Step 2: Build audiences, either by directly controlling a platform (like RT) or by cultivating relationships with people who will be receptive to those narratives. In 2016, this consisted of creating social media accounts run either by human operatives or automatically by bots, making them seem legitimate, and gathering followers. In the years since, this has gotten subtler. As social media companies have gotten better at deleting these accounts, two separate tactics have emerged. The first is microtargeting, where influence accounts join existing social circles and engage with only a few people. The other is influencer influencing, where these accounts try to affect only a few proxies (see step 6)—either journalists or other influencers—who can carry their message for them.

Countermeasures: This is where social media companies have made all the difference. By allowing groups of like-minded people to find and talk to each other, these companies have given propagandists the ability to find audiences who are receptive to their messages. Social media companies need to detect and delete accounts belonging to propagandists as well as bots and groups run by those propagandists. Troll farms exhibit particular behaviors that the platforms need to be able to recognize. It would be best to delete accounts early, before those accounts have the time to establish themselves.

This might involve normally competitive companies working together, since operations and account names often cross platforms, and cross-platform visibility is an important tool for identifying them. Taking down accounts as early as possible is important, because it takes time to establish the legitimacy and reach of any one account. The NSA and US Cyber Command worked with the FBI and social media companies to take down Russian propaganda accounts during the 2018 midterm elections. It may be necessary to pass laws requiring Internet companies to do this. While many social networking companies have reversed their “we don’t care” attitudes since the 2016 election, there’s no guarantee that they will continue to remove these accounts—especially since their profits depend on engagement and not accuracy.
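The “particular behaviors” troll farms exhibit can be turned into concrete detection features. As a purely illustrative sketch—the features, thresholds, and weights below are hypothetical and are not any platform’s actual detection logic:

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int          # time since the account was created
    posts_per_day: float   # average posting rate
    duplicate_ratio: float # fraction of posts near-duplicating other accounts' posts
    active_hours_span: int # hours of the day with activity (humans sleep; shift-run farms don't)

def troll_farm_score(a: Account) -> float:
    """Toy risk score in [0, 1]: higher means more farm-like.
    Features and weights are illustrative only."""
    score = 0.0
    if a.age_days < 30:
        score += 0.25  # young accounts are cheap to replace
    if a.posts_per_day > 50:
        score += 0.25  # far above typical human posting rates
    if a.duplicate_ratio > 0.5:
        score += 0.30  # coordinated accounts repeat the same content
    if a.active_hours_span > 20:
        score += 0.20  # round-the-clock activity suggests shift work
    return score

suspect = Account(age_days=10, posts_per_day=120, duplicate_ratio=0.8, active_hours_span=24)
print(troll_farm_score(suspect))  # 1.0
```

Real platform classifiers use far richer signals (network structure, login infrastructure, cross-platform coordination), which is exactly why the cross-platform visibility discussed above matters: several of these features are only computable when companies compare notes.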

Step 3: Seed distortion by creating alternative narratives. In the 1980s, this was a single “big lie,” but today it is more about many contradictory alternative truths—a “firehose of falsehood”—that distort the political debate. These can be fake or heavily slanted news stories, extremist blog posts, fake stories on real-looking websites, deepfake videos, and so on.

Countermeasures: Fake news and propaganda are viruses; they spread through otherwise healthy populations. Fake news has to be identified and labeled as such by social media companies and others, including recognizing and identifying manipulated videos known as deepfakes. Facebook is already making moves in this direction. Educators need to teach better digital literacy, as Finland is doing. All of this will help people recognize propaganda campaigns when they occur, so they can inoculate themselves against their effects. This alone cannot solve the problem, as much sharing of fake news is about social signaling, and those who share it care more about how it demonstrates their core beliefs than whether or not it is true. Still, it is part of the solution.

Step 4: Wrap those narratives in kernels of truth. A core of fact makes falsehoods more believable and helps them spread. The releases of stolen emails from Hillary Clinton’s campaign chairman John Podesta and the Democratic National Committee, and of documents from Emmanuel Macron’s campaign in France, were both examples of that kernel of truth. Releasing stolen emails with a few deliberate falsehoods embedded among them is an even more effective tactic.

Countermeasures: Defenses involve exposing the untruths and distortions, but this is also complicated to put into practice. Fake news sows confusion just by being there. Psychologists have demonstrated that an inadvertent effect of debunking a piece of fake news is to amplify the message of that debunked story. Hence, it is essential to replace the fake news with accurate narratives that counter the propaganda. That kernel of truth is part of a larger true narrative. The media needs to learn skepticism about the chain of information and to exercise caution in how they approach debunked stories.

Step 5: Conceal your hand. Make it seem as if the stories came from somewhere else.

Countermeasures: Here the answer is attribution, attribution, attribution. The quicker an influence operation can be pinned on an attacker, the easier it is to defend against it. This will require efforts by both the social media platforms and the intelligence community, not just to detect influence operations and expose them but also to be able to attribute attacks. Social media companies need to be more transparent about how their algorithms work and make source publications more obvious for online articles. Even small measures like the Honest Ads Act, requiring transparency in online political ads, will help. Where companies lack business incentives to do this, regulation will be the only answer.

Step 6: Cultivate proxies who believe and amplify the narratives. Traditionally, these people have been called “useful idiots.” Encourage them to take action outside of the Internet, like holding political rallies, and to adopt positions even more extreme than they would otherwise.

Countermeasures: We can mitigate the influence of people who disseminate harmful information, even if they are unaware they are amplifying deliberate propaganda. This does not mean that the government needs to regulate speech; corporate platforms already employ a variety of systems to amplify and diminish particular speakers and messages. Additionally, the antidote to the ignorant people who repeat and amplify propaganda messages is other influencers who respond with the truth—in the words of one report, we must “make the truth louder.” Of course, there will always be true believers for whom no amount of fact-checking or counter-speech will suffice; this is not intended for them. Focus instead on persuading the persuadable.

Step 7: Deny involvement in the propaganda campaign, even if the truth is obvious. Although, since one major goal is to convince people that nothing can be trusted, rumors of involvement can work in an attacker’s favor. Denial was Russia’s tactic during the 2016 US presidential election; it encouraged rumors of its involvement during the 2018 midterm elections.

Countermeasures: When attack attribution relies on secret evidence, it is easy for the attacker to deny involvement. Public attribution of information attacks must be accompanied by convincing evidence. This will be difficult when attribution involves classified intelligence information, but there is no alternative. Trusting the government without evidence, as the NSA’s Rob Joyce recommended in a 2016 talk, is not enough. Governments will have to disclose.

Step 8: Play the long game. Strive for long-term impact over immediate effects. Engage in multiple operations; most won’t be successful, but some will.

Countermeasures: Counterattacks can disrupt the attacker’s ability to maintain influence operations, as US Cyber Command did during the 2018 midterm elections. The NSA’s new policy of “persistent engagement” (see the article by, and interview with, US Cyber Command Commander Paul Nakasone here) is a strategy to achieve this. So are targeted sanctions and indicting individuals involved in these operations. While there is little hope of bringing them to the United States to stand trial, the possibility of not being able to travel internationally for fear of being arrested will lead some people to refuse to do this kind of work. More generally, we need to better encourage both politicians and social media companies to think beyond the next election cycle or quarterly earnings report.

Permeating all of this is the importance of deterrence. Deterring influence operations will require a different theory. It will require, as the political scientist Henry Farrell and I have postulated, thinking of democracy itself as an information system and understanding “Democracy’s Dilemma”: how the very tools of a free and open society can be subverted to attack that society. We need to adjust our theories of deterrence to the realities of the information age and the democratization of attackers. If we can mitigate the effectiveness of influence operations, if we can publicly attribute them, and if we can respond either diplomatically or otherwise—we can deter these attacks from nation-states.

None of these defensive actions is sufficient on its own. Steps overlap and in some cases can be skipped. Steps can be conducted simultaneously or out of order. A single operation can span multiple targets or be an amalgamation of multiple attacks by multiple actors. Unlike a cyberattack, disrupting an influence operation will require more than disrupting any particular step. It will require a coordinated effort among government, Internet platforms, the media, and others.

Also, this model is not static, of course. Influence operations have already evolved since the 2016 election and will continue to evolve over time—especially as countermeasures are deployed and attackers figure out how to evade them. We need to be prepared for wholly different kinds of influence operations during the 2020 US presidential election. The goal of this kill chain is to be general enough to encompass a panoply of tactics but specific enough to illuminate countermeasures. But even if this particular model doesn’t fit every influence operation, it’s important to start somewhere.

Others have worked on similar ideas. Anthony Soules, a former NSA employee who now leads cybersecurity strategy for Amgen, presented this concept at a private event. Clint Watts of the Alliance for Securing Democracy is thinking along these lines as well. The Credibility Coalition’s Misinfosec Working Group proposed a “misinformation pyramid.” The US Justice Department developed a “Malign Foreign Influence Campaign Cycle,” with associated countermeasures.

The threat from influence operations is real and important, and it deserves more study. At the same time, there’s no reason to panic. Just as overly optimistic technologists were wrong that the Internet was the single technology that was going to overthrow dictators and liberate the planet, so pessimists are also probably wrong that it is going to empower dictators and destroy democracy. If we deploy countermeasures across the entire kill chain, we can defend ourselves from these attacks.

But Russian interference in the 2016 presidential election shows not just that such actions are possible but also that they’re surprisingly inexpensive to run. As these tactics continue to be democratized, more people will attempt them. And as more people, and multiple parties, conduct influence operations, they will increasingly be seen as how the game of politics is played in the information age. This means that the line will increasingly blur between influence operations and politics as usual, and that domestic influencers will be using them as part of campaigning. Defending democracy against foreign influence also necessitates making our own political debate healthier.

This essay previously appeared in Foreign Policy.

Posted on August 19, 2019 at 6:14 AM • 47 Comments


VinnyG August 19, 2019 7:59 AM

I have some serious doubts about whether this can work at all, and, if so, the probability that it can be worth the investment in money, time, and energy. “Basically, we know it when we see it” is a false premise. If that were literally true, there would be little or no need for countermeasures. I think what you really mean is “Basically, we experts presume to know it not long after the fact, when we have conducted what we believe is sufficient investigation and analysis.” Even that is imo debatable. Disinformation, even in the modern sense of the term, has, after all, been a major good produced by the CIA and its analogous, extremely well-funded and politically supported organizations since (at minimum) the end of WWII. Successfully countering disinformation at the level you appear to have in mind would require assurance that whoever is doing the analysis has reached the endpoint in the maze of rat holes (and the span of that maze ranges far beyond the electronic communications that are typically classified as within the domain of information science) created by those organizations, and I am quite skeptical that that endpoint can be identified with assurance. For better or worse, I think that detection of BS (not to gild the lily) will remain a fundamentally individual exercise, not a product that can be formulated at an expert level, with the results dumbed down and spoon-fed to the masses. Even if that process could work, to me it merely creates a choke point to later be captured or co-opted and controlled by would-be propagandists…

AlexT August 19, 2019 8:39 AM

Wow! I noticed that you left IBM not long ago but who did hire you? There is a lot to unravel in this one… Let’s get to work.

George H.H. Mitchell August 19, 2019 8:40 AM

Bruce, I agree with you 100% on this post, but the very rationality of your suggestions will probably delay their adoption.

On a sort of related tangent, though, can we all agree to ban the phrase “fake news” from civilized discourse? It was useful, maybe four years ago, but it has lost its legitimacy through overuse (especially by one particular citizen). I don’t have an ideal replacement (propaganda is too long); maybe pseudo news or anti-news. In the right context, just call it lies.

parabarbarian August 19, 2019 9:59 AM

This started well before the 1980s. Arguably it goes back to before WW2, but it is much easier to trace to the ’60s and ’70s, when the “counter culture” proved a fertile ground for Soviet-inspired nonsense. Back then it was dismissed as right-wing conspiracy theories.

Everything old is new again…

jones August 19, 2019 10:40 AM

In the 2016 presidential election, 0.625% of the vote separated Trump from Clinton.

I’m sure Russia interferes with our elections. I’m sure we interfere with theirs. We overthrow governments in Latin America to install dictators, build Panama Canals, control the cocaine trade, and to prevent the outbreak of social democracy.

As somebody, however, who is concerned with fair elections, I’m not too worried about Russia.

I am more worried about:

  1. Gerrymandering
  2. Deregulated political spending
  3. Systematic black disenfranchisement
  4. Black box electronic voting machines
  5. VoterID laws
  6. Drug war felons who can’t vote
  7. Domestic for-profit news propaganda
  8. Campaign strategies to game the Electoral College

Seriously, we can’t do much about Russia, but the short list above….

I find it troubling that only 0.625% of the vote separated Trump from Clinton; an election settled by less than the statistical margin of error by definition says nothing about voter preference.

We’re doing a great job wreaking havoc on our own elections.

Russia is a distraction from our problems abroad, and Russia-Trump collusion is a distraction from Trump’s business dealings in China.

Impossibly Stupid August 19, 2019 11:07 AM

I didn’t see a link to the local version of the exploratory post:

My comments from there still stand.

@George H.H. Mitchell

On a sort of related tangent, though, can we all agree to ban the phrase “fake news” from civilized discourse? It was useful, maybe four years ago, but it has lost its legitimacy through overuse (especially by one particular citizen). I don’t have an ideal replacement (propaganda is too long); maybe pseudo news or anti-news. In the right context, just call it lies.

The problem is that the label “fake news” did not originate as a description of lies, but rather as a way to lie about and discredit stories that reported inconvenient truths. I would agree that lies should just be called lies, but softer words often get used for whatever reason. The really offensive part of all of this is that liars no longer lose credibility these days. People have become so tribal that they support the lies of their allies rather than looking to associate with people who aren’t dishonest.

And it’s important to note that propaganda need not be lies. Indeed, the best kinds of propaganda are unslanted truths that the target just doesn’t want people to hear. Or sometimes a lack of information will get the gears of conspiracy theorists running out of control (e.g., the recent death of Jeffrey Epstein). I maintain that no “kill chain” will be successful if the public isn’t educated enough to understand how and why they’re being influenced.

gordo August 19, 2019 11:18 AM

Unless the targets are removed from the battlefield, it’s a game of whack-a-mole till kingdom come. That is, remove the means of targeting, i.e., the ability to microtarget citizens, by restoring to them their right to privacy: their expectation of a right to be let alone, not only by their government but also by the senders of any material not explicitly identified as political, i.e., no solicitation.

Step 6’s countermeasures, if undertaken via microtargeting of “the persuadables,” amount to thinly veiled re-education measures. The paragraph below is borrowed from a recent Tom Engelhardt piece:

He still has to visit Room 101. As his interrogator tells him, “You asked me once what was in Room 101. I told you that you knew the answer already. Everyone knows it. The thing that is in Room 101 is the worst thing in the world.” And that “worst thing” is always adjusted to the specific terrors of the specific prisoner.

Ross Snider August 19, 2019 11:19 AM

Seriously Bruce?!

How did you not include the United States as an influence operations actor?

Did it get edited out, or did you exclude it?

The way this post reads, influence operations are only things that US enemies perform, and they do it against US interests.

It’s narrow.

D Snyder August 19, 2019 11:38 AM

This is great stuff to help us think about, and respond to, threats from external actors. However, the piece could be enhanced by acknowledging and addressing the way that politicians can use “influence operations” unethically. For example, telling people that news media are producing “fake news” in attempts to undermine the legitimacy of the news media and the stories they are reporting.

Typically, news media are self-policing, in that if one journalist/reporter announces something, others seek to verify the information or else advise skepticism. On the other hand, competing news outlets may attempt to undermine the credibility of a report by doubting its veracity, which then requires another round of verification.

I’ll be interested to see recommendations on how to offset influence-operations attacks from internal sources.

Mark August 19, 2019 11:48 AM

The releases of stolen emails from Hillary Clinton’s campaign chairman John Podesta and the Democratic National Committee, and of documents from Emmanuel Macron’s campaign in France, were both examples of that kernel of truth. Releasing stolen emails with a few deliberate falsehoods embedded among them is an even more effective tactic.

Note that for the Clinton/Podesta case, DKIM signatures from the Clinton email server were used to validate the authenticity of the emails. I’m not aware of examples where the attackers have modified the Podesta emails. It’s conceivable that the attackers suppressed some of the emails that didn’t help their case, but again I’m unaware of claims that emails have been removed from the archive. It seems that for this “kernel of truth” method it can be sufficient to just dump the information as is.
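The tamper-evidence property Mark describes can be demonstrated in a few lines. Real DKIM uses public-key signatures over selected headers and the body, with the signer’s public key published in DNS (RFC 6376), so anyone can verify without a shared secret; the stdlib-only sketch below substitutes an HMAC for the public-key signature, and every name in it is hypothetical:

```python
import hashlib
import hmac

# Stand-in for the sending domain's DKIM private key (hypothetical).
key = b"mail-server-signing-key"

def sign(message: bytes) -> bytes:
    # DKIM computes a public-key signature; an HMAC stands in here.
    return hmac.new(key, message, hashlib.sha256).hexdigest().encode()

def verify(message: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(message), signature)

original = b"From: chair@example.org\r\nSubject: schedule\r\n\r\nMeet at noon."
sig = sign(original)

# Embedding a falsehood in an otherwise authentic message breaks the signature.
tampered = original.replace(b"noon", b"midnight")

print(verify(original, sig))   # True
print(verify(tampered, sig))   # False
```

This is why attackers who want their dumps believed have an incentive to leave signed messages unmodified and instead be selective about what they release.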

Sergey Babkin August 19, 2019 12:05 PM

The thing that has been left unsaid is that we MUST get angry at the fellow citizens who abuse and break the democratic (small-d) system. And this kind of abuse must be punished. If any behavior of this kind gets promptly investigated and, if confirmed, punished, that creates trust in the political system, where everyone knows that everyone else is honest. That’s the only real countermeasure.

Only the absence of this self-healing lets the propaganda take hold. If the known abuses go on and on unpunished, you never know what new abuses will start. As Solzhenitsyn said in The Gulag Archipelago, when the government is systematically covering up its abuses, we can rightfully extrapolate the worst from the bits of information we do have.

And reporting the abuses of Podesta and Clinton is not even propaganda. It’s a public service. For abuses of this kind, both should at least be removed from the political landscape with prejudice, if not spend time in prison. It’s the lack of punishment like this that allows the phony stories about “voter suppression” and such to take hold.

art fowl August 19, 2019 12:41 PM

I appreciate and agree with the analysis. One thing I don’t see discussed much is decreasing everyone’s reach. Disinformation works on democracy because of how highly connected our social media graph is, and this is due to just a small subset of people. It’s also a knob we directly control.

Basically, I don’t see why really big influencers and corporate ad campaigns need to exist at all. They’re an artifact of the technology.

Clive Robinson August 19, 2019 2:51 PM

@ Bruce,

Public attribution of information attacks must be accompanied by convincing evidence.

Hey, nice to hear. When I started saying this half a decade back, and since, I’ve had a fair amount of stick over it. I hope your journey will be more gentle.

However, I still maintain attribution is very hard and needs genuine HumInt, not just easy-to-fake SigInt.

@ ALL,

The real problem behind this is actually that we are becoming less parochial and more citizens of the world.

That is, as we reach out beyond our own village, town, or city, we become disconnected from our usual methods of establishing what we feel is true and what we feel is not. Instinctively, from millennia of social evolution, we have developed a distrust for those who are not of our tribe. But for other reasons we have also become, in effect, over-trusting. The two should balance, but they don’t, because tribalism is primeval and needs no articulation. Trust is implicit in those we can touch unless they hurt us; but for those we cannot touch we really only have what is articulable to go on, and in general we lack discrimination (hence the advantage con artists have when “telling the tale”).

The odd thing is we should all know this at more than a subconscious level but for some reason we don’t. Which is oddly why,

    On the Internet nobody knows you’re a dog.

is funny to most people.

So what to do about it?

Well, we could choose to stay in our village and repel strangers. It’s generally held not to be a good thing to do, for lots of reasons. Not least because it develops a “siege mentality,” which tends to drag society backwards by heightening fear and suspicion, which eats away at the foundations of society via paranoia. Which, if not checked, leads to the equivalent of a police state; people from East Germany can tell you in depth just why this is so bad.

We could, on the other hand, be open and welcome all to our village. While this does bring good things, it also has a flip side, and bad things can end up on your doorstep. Which unfortunately has the side effect of making a chunk of the original village paranoid, and thus fairly quickly a closed society repelling all.

So being too Closed as a society or too Open can and usually does go bad on people for various reasons.

The thing is, what is wrong is staying in the village. A reasonable percentage of the population needs to “get out and go visit,” in effect enlarging the touch of the village when they return. The problem is nobody really likes tourists, “because they are not us.” Being tribal is hard to shake off because it’s so deeply embedded.

The thing that makes strangers attractive is the new things they bring with them. Thus trade, not tourism, is the way to make the village larger.

The other thing about trade is that when it’s equitable and both sides gain, it dissipates tension, suspicion, and paranoia, but also builds up a form of mutual dependence that is sometimes called a symbiotic relationship.

There can be downsides, in that regional identities and even languages can be lost. However, one downside that tends not to happen when both sides feel things are equitable is armed conflict and the conquest or stealing of natural resources by force.

Two things are happening, although not many realise it. The first is our growing dependence on information. When you think about it, everything about you is dependent on three things,

1, Obtaining raw Resources,
2, The energy to process them,
3, The knowledge required to do both the previous steps.

That is, the key to all societies, wherever and however they exist, is “Knowledge,” which is derived from “Information” processed by “Intelligence.”

The Internet is at the end of the day the way we transport and share information.

Where information is not available, we have to “fill in the blanks”; sometimes we do it right, sometimes we don’t. One of the primary reasons we don’t do it right is lack of “Knowledge.” If you like, “Knowledge” is a set of rules or heuristics which we use to process “Information” into further “Knowledge.”

A crucial part of gaining “Knowledge” is observation, and the only way you can really do that is by being there. So if we want knowledge of how other people think and reason, we have to go and meet them in their environment.

But we cannot just turn up and consume; we have to give in equal measure, otherwise suspicion will arise.

Thus the question boils down to “How do we be there via the Internet?”, and the honest answer, in part, is by being transparent in the ways we behave. The problem is that working this out is fairly easy; the hard part is working out how to actually accomplish it in practice…

Ismar August 19, 2019 5:12 PM

A very methodical approach to a social problem, showing that Bruce is an engineer at heart.
As is usually the case in our modern societies, however, the steps try to tackle the symptoms rather than the root causes: a painkiller approach, if you will.
Nothing wrong with that as an interim measure, but for a long-term cure the doctors may have to perform some sort of surgery, and, more importantly, a change in lifestyle is needed.
These, in the American context, would include thorough education reform to make sure Americans understand their place in the world, as well as a larger effort to make university education broader (it is not enough to be very good as an electrical engineer; it is also important to understand the world in which you live, becoming somebody like Clive, for example).
There are other examples, but it is this narrow lens through which most Americans view the world that has enabled the majority of the attacks mentioned by Bruce. I don’t have time to go into more detail, as I need to drop my kids off at school now. (Aha, maybe another reason is also a lack of time to reflect on the broader society.)

VinnyG August 19, 2019 5:45 PM

@jones re .625% – May I inquire as to the source for that number? According to my information, the final tally had Clinton with ~48.2% of the popular vote and Trump with ~46.1%, a difference of ~2.1% in Clinton’s favor.

However, the result of a U.S. Presidential election is determined by the votes of the members of the Electoral College (electors), not by popular vote totals. Those electors are indirectly elected by popular vote (method details left to the individual states), but representation in the EC is apportioned according to a formula designed to prevent states with large populations from running roughshod over the citizens of less populous states: each state gets representation equal to its House membership (based on population) plus its Senate representation (two Senators per state), the U.S. being a Constitutional Republic of States, not a democracy. In 2016, the Electoral College vote was 304, or 57.25% (Trump), to 227, or 42.75% (Clinton), a difference of 14.5% (disregarding for convenience the handful of electors who voted for neither candidate).

If your contention is that a 0.625% change in the popular vote in various states at various times could potentially have altered the 2016 Electoral College results in Clinton’s favor, I suppose that is possible (I’d still like to see the numbers), but I’d bet there are many, many ways to slice that lemon. If you are a U.S. citizen and prefer to live in a true democracy (e.g., two wolves and a lamb voting on the luncheon menu), feel free to agitate for a Constitutional amendment to change the process. That isn’t my preference, but I don’t expect to see the change in my lifetime, if ever.

I should acknowledge that some states have passed legislation to force their EC representatives to follow the national popular vote; however, the impact is limited, as that does not change the relative power of larger and smaller states. Not to mention that those laws can be rescinded as easily as imposed.

If I were a legislator from a small-population state who voted in favor of such a change, I think I’d be looking over my shoulder a lot (and for a long time…)
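[Ed.: the Electoral College percentages above are easy to sanity-check, using the commenter’s own convention of excluding the seven faithless electors:]

```python
# Check of the Electoral College arithmetic in the comment above.
# 2016 totals: Trump 304 electors, Clinton 227; the 7 faithless electors
# are excluded, so the denominator is the 531 votes cast for either candidate.
trump, clinton = 304, 227
total = trump + clinton

trump_pct = 100 * trump / total
clinton_pct = 100 * clinton / total

print(round(trump_pct, 2))                # 57.25
print(round(clinton_pct, 2))              # 42.75
print(round(trump_pct - clinton_pct, 1))  # 14.5
```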

VinnyG August 19, 2019 6:22 PM

To exemplify my objection to what I see as the completely inadequate scoping of the problem stated in the OP:

In 1953, the US CIA used disinformation campaigns (among other tactics) to engineer the overthrow of the democratically elected Prime Minister of Iran, Mohammad Mosaddegh, in favor of the British lackey “Shah” Reza Pahlavi. It did so to curry favor with Great Britain, whose remaining post-WWII military might was fully occupied trying to damp down the many other political brush fires that British colonialism had ignited in Asia and Africa. The objective was to allow British Petroleum to continue, unimpeded, extracting crude oil from Iranian wells at a price (previously negotiated with other British-installed lackeys) far below fair market value.

In 1953, IBM announced the model 701, the very first “mass produced” computer. Also in 1953, MIT made the first successful experimental use of magnetic memory to supplant vacuum tubes.

Yes, information systems today have a much greater reach. But to propose that as a result, the problem is restricted to such systems, and that a solution that is similarly restricted can provide the ultimate remedy for the problem, is imo both egoistical and short-sighted.

Ironically, in 1952, a UNIVAC computer correctly predicted the result of that year’s US Presidential election for CBS, but CBS declined to broadcast the prediction for fear of unduly influencing the results (or possibly for fear of being wrong, based on my cynical opinion of the morals of TV executives)…

Charles August 19, 2019 6:52 PM

Bruce, first I want to thank you for writing this article. This is enormously important, and deserves to be widely read. In an effort to combat the information war in which we now find ourselves, I would like to propose legislation making identity verification mandatory for all major social media platforms. Doing so makes it much more difficult for foreign agents to acquire accounts, and other bad actors will face permanent enforceable bans. It also virtually eliminates the bot-nets that have made this an asymmetric cognitive battlefield. An added bonus is that this method could better protect user identities by offloading collection and processing of personally identifiable information to trusted third parties. After verifying their identity, a user that prefers to open an anonymous profile may do so while still having the added accountability that comes with an identity-verified account. Your backing of this proposal would go a long way in congress. I hope you will consider supporting the idea.
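[Ed.: the separation of roles Charles describes can be made concrete with a toy sketch. This is entirely hypothetical — the function names and the shared-key shortcut are illustrative, not a vetted protocol or anything proposed in the thread: a trusted third party verifies a user’s identity once, then issues a signed pseudonymous token that a platform can check without ever learning who the user is.]

```python
# Hypothetical sketch: identity verification offloaded to a trusted third party.
# The verifier checks the user's real identity out of band, then issues a
# random pseudonym plus an HMAC tag over it. The platform accepts the tagged
# pseudonym as proof of "some verified human" without seeing the identity.
import hashlib
import hmac
import secrets

VERIFIER_KEY = secrets.token_bytes(32)  # held by the identity verifier

def issue_token():
    """Verifier side: called only after the user's real identity is checked."""
    pseudonym = secrets.token_hex(16)   # random handle, unlinkable to identity
    tag = hmac.new(VERIFIER_KEY, pseudonym.encode(), hashlib.sha256).hexdigest()
    return pseudonym, tag

def platform_accepts(pseudonym, tag):
    """Platform side: verify the tag (shared key here purely for brevity)."""
    expected = hmac.new(VERIFIER_KEY, pseudonym.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

In a real deployment the platform would verify tokens against the issuer’s public key rather than share a secret, and a blind-signature scheme could prevent even the issuer from linking pseudonym to identity; the sketch only shows the separation of roles.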

Petre Peter August 19, 2019 7:02 PM

If Romania had had access to the Internet in December 1989, the overthrow of Ceausescu would have been much easier.

John Smith August 19, 2019 9:58 PM

“Propaganda is to a democracy what violence is to a dictatorship.” – William Blum

I’d like to see a Part 2 from the OP. How do we, the people, defend ourselves from influence operations by the State?

Take the 2003 Iraq war as an example, and the documented machinations of the US, UK, and other governments to sell the case for war, using fake “sexed up” intelligence.

How do we defend ourselves when the State itself is a “firehose of falsehood”? When social media work to eliminate “alternative narratives”? When our own intelligence and law enforcement agencies surveil, infiltrate, and disrupt dissenting groups?

What do you recommend when “Deny Degrade Disrupt Destroy” gets turned on us?

Not so Fast August 20, 2019 6:29 AM

“Find the cracks in the fabric of society­ — the social, demographic, economic, and ethnic divisions.”

So, the solution is to reduce these cracks.

The last 40 years of neoliberal economics in which the middle class has been eviscerated and wages have stagnated despite increases in productivity (hence the wealth gap) would seem like an obvious partial cause for deepening/widening these cracks.

Furthermore, with Citizens United it is clear that the legislature is doing the work of the rich (see the study which proves this). Additionally, the populace has lost confidence in the mainstream media, because they are doing the same: advancing the narratives wished for by the wealthy and the Military Industrial Intelligence Complex. See Manufacturing Consent.

You rightly point out that digital literacy is important, and this has not been taught.

So, you have a large amount of the population with low digital literacy skills, struggling to survive economically, who have no confidence in their government or corporate media.

A perfect setup to be exploited by influence operations.

Attack the cause, not the symptom.

Peter A. August 20, 2019 6:42 AM

I feel all this boils down to “benevolent dictatorship”, maybe in a collective, “aristocratic” fashion. People – on average – are stupid and not able to sort truth from lies, and therefore vulnerable to manipulation by “them”, so they cannot be relied upon to make competent decisions, either directly or indirectly (via elections). So the measures proposed are more or less equivalent to a “we know better what’s good for you” policy of regulating speech and the flow of information, to steer “the masses” into making the “right decisions”, because “we” think they cannot make right decisions by themselves. So “we” do the same thing as “they” do (spread propaganda), but ours is supposedly “good for the society”.

The next logical step is getting “the masses” out of the decision loop entirely, “for their own good” – if not formally (by abolishing or limiting elections), then in practice, by spoon-feeding them crafted propaganda to make them make the “right choice” and keep them under the illusion of control. Which is what actually happens, gradually. The only questions are who the “we” are – the aristocracy who decide what propaganda is good and what is bad – and how long, and how much, this “we” will care for “the masses” and not just for themselves.

Just my sad crazy thoughts.

@Clive Robinson: I fully agree with you on the benefits of trade. I keep repeating: free trade is the canvas of peace.

iceman August 20, 2019 10:30 AM


That’s a good question, Sir, and I notice that the very first comment post asking a similar question in a more pointed manner has been deleted. I must say I felt a chill reading this piece. Remember those heady days when it seemed that the biggest threat to “our” “free” Internet was out-of-control adware? Why, the Internet would simply route around censorship! How times change… These days who knows who the good guys are, if any.

Clive Robinson August 20, 2019 11:32 AM

@ Peter A.,

I keep repeating: free trade is the canvas of peace.

Especially when it improves the lives of all.

As many note, productivity goes up and profits go up, but wages – well, in reality they go down, the average standard of living drops, and even some of the densest of authoritarian-follower Guard Labour start to realise where that’s heading…

We have a politician in the UK called Jacob Rees-Mogg, who is known as “The MP for the 18th Century” not just because of the way he dresses or the manners he affects; it’s that his world view would have been considered long out of date even back then… But worse, his father was similarly odd, and wrote a book that makes “Atlas Shrugged’s” manifesto look tame.

Certain Silicon Valley corporate owners have read it and as a consequence bought up land on NZ’s South Island, a place as bleak as Scotland’s inaccessible north where only sheep grow :-S, and often called “The last bus stop to Antarctica” or “The last stop in civilisation”. Why? Because they have a good idea where it might go, and they have been building bunkers and stocking them with five years or more of supplies.

Not sure where those taking the blood of teenagers as a vampirish way to extend their lives are going to get their supplies from, as even frozen blood has a shelf life…

As someone once said, “You cannot make this stuff up…”.

Andy August 20, 2019 11:41 AM

Reads like some agency’s talking points.

The assumption that populations have no free will underlies this whole narrative of external controllers. During the Vietnam war protests there were officials claiming the protesters were all being whipped up and controlled by the USSR; the entire political left – communists, pacifists, socialists, and liberals – were all accused of harboring “anti-American” ideas. Totalitarian mindsets will always search for an explanation for societal dissent that keeps them as the ones in tune with public opinion, and the system they maintain as infallible. The poor old gullible public must be being manipulated from afar – yes, it must be that. The alternative explanation – that politics is demonstrably massively corrupt and society isn’t functioning well for the majority – is too uncomfortable.

No doubt the French Aristocracy would’ve blamed the internet and some great hidden hand manipulating their populace, had the internet been around at the time of the revolution.

JA August 20, 2019 11:36 PM

Disinformation campaigns rely on a number of physiologic processes in our stress-response system. Specifically they focus on creating unease in the receiver that is reduced by the receiver acting in a manner that the purveyor of the disinformation wants. This is similar to what is done in advertising. Humans have a regrettable tendency to prefer behaviors that reduce unease rather than those that reduce difficulty. So explaining to people that their choices are not helpful is not going to get them to make different choices if the original choices reduce unease quickly.

Stress physiology is involved because as unease increases, sympathetic activation increases and that tends to narrow the attention. In an untrained person the attention gets focused on what is causing the unease, magnifying it and causing an escalating cycle. This happens within seconds. The purveyor of the disinformation then presents the “solution” which reduces unease quickly, and the receiver goes for it, experiencing pleasure that reinforces the behavior.

For example, a message starts by causing unease about the presence of immigrants. That causes increased sympathetic activation and narrows the attention to the undesirable attributes of immigrants that are being portrayed. The receiver is less likely to pay attention to counter-examples. Then the message offers a connection with those who will protect the viewer from these dangerous immigrants. This connection reduces unease in the viewer and that relief reinforces the viewer’s connection with the anti-immigrant group.

The ability to step back and avoid having one’s attention hooked by unease and sympathetic activation is, in part, a physiologic skill. If a person who is uneasy can activate their parasympathetic nervous system then this can help them ‘unhook’ their attention and look for information that differs from what is being presented. This is what is meant by the saying “take a deep breath.” Unhooking from unease often causes a temporary increase in unease, and so doing this requires the necessary self-discipline to tolerate the increased unease in order to think more critically.

An important point is that the ability to think critically and recognize disinformation is facilitated by an awareness of one’s physiology and the ability to use that to work with one’s unease instead of being driven by it.

If we reflect on it, many social engineering techniques manipulate unease in the victim in order to cause them to engage in unhelpful (to them) behavior because it reduces their unease quickly. And training people to resist that is difficult. These techniques are also used for social control.

Note that Clive’s insightful comments about how getting out of one’s village and equitable trade reduce the effect of disinformation fits with this model. Uncertainty and lack of familiarity tend to increase unease. If we are more familiar with others, and if we have experiences of others being sources of goods we appreciate and buyers of goods we produce, we are less uneasy about them, and so the purveyor of disinformation will have a harder time increasing our unease about the other.

A journal article that describes the relationships between these components of stress in more detail can be found (no paywall) in Frontiers in Psychiatry. The Unease Modulation Model: An experiential model of stress with implications for health, stress-management and public policy.
http://www.frontiersin.org/articles/10.3389/fpsyt.2019.00379/full

VinnyG August 21, 2019 7:45 AM

@unlink re:can someone obtain the end of this linking ? Sure. Append the following parameters and arguments to your search strings:
“” “” (etc.)
Or were you proposing that internet providers use your personal biases as criteria to prevent Google News from presenting results that otherwise meet their aggregation formulae to certain widely defined audiences? I have serious issues with Google News, to the point that I spent several months trying to “tune” Topix to be a viable replacement for me (until Topix ceased operations not long ago,) but I wouldn’t view that proposal as an improvement…

Abnegor August 21, 2019 1:02 PM

A brief summary of the “countermeasures” (look especially at Steps 2, 5, and 6): identify people who publish views or facts you don’t like and censor them and suppress those views or facts, across all platforms, and enforce the censorship with direct personal punishment.

We have already seen that the ruling oligarchy in every country, including the USA, tries to censor people and suppress inconvenient facts and views. Any mechanisms built to defeat so-called influence operations will be used first and foremost to silence opponents of the ruling oligarchy.

Bruce does not propose any practical way, or any way at all, really, for the censorship system he advocates to distinguish a malign influence-operation from a benign democratic political campaign. An appeal to “truth” is utopian– ruling oligarchies simply declare everything they dislike to be false, and who could prove otherwise when Bruce’s system is used to censor their every utterance?

Antistone August 21, 2019 7:37 PM

“Much sharing of fake news is about social signaling, and those who share it care more about how it demonstrates their core beliefs than whether or not it is true.”

I realize that changing human behavior is really hard and so we might want to focus on solutions that don’t require it. However, I think we ALSO really ought to make a conscious push to evolve our culture toward a point where this is considered socially unacceptable.

The indiscriminate spread of misinformation causes significant harms to society. There are plenty of alternative ways to send social signals that don’t have such severe drawbacks for the public good. Do those things instead!

Choosing this particular method of social signaling should be considered selfish, deceptive, and antisocial. Like promising favors to your friends and then not following through, or encouraging people to depend on you for skills you don’t actually have, or spreading scurrilous lies about people behind their backs.

People who do these things should be shamed. If your friend does this, you should clearly express your disapproval both to that friend and to your mutual friends. If your friend does this repeatedly, you should institute a policy of not believing or sharing any stories they send you without corroborating them first, and in extremis you should stop being their friend.

Cherish friends who express their core beliefs by actually articulating them instead of by forwarding news stories they haven’t carefully read!

Don’t reward a reckless disregard for the truth with social acceptance!

Alyer Babtu August 21, 2019 9:22 PM

The way out of the maze of twisty passages, all alike and all different, is to recognize the principle of subsidiarity: what can be done by a lower power should not be done by a higher. This means local discussion, with a hierarchy rising as the domain of the goods involved becomes more general. And it seems perhaps impossible to honor this in large-scale polities such as modern states. Technology plays no essential role here, except to exacerbate the problems. Likewise most large news organizations.

Jack August 22, 2019 4:07 AM

” Find the cracks in the fabric of society­ — the social, demographic, economic, and ethnic divisions. ”

Here the author makes the assumption that those in charge of government policies (congressional committees, think tanks, special envoys appointed by the president, etc.) are benevolent in their endeavors, which is a glaring falsehood.

These cracks in our “society” exist as designed, to make exploits possible by those in charge. These so-called foreign “adversaries” or “influence operations” are exploiting the same cracks that were intentionally designed. This is the information security equivalent of bad actors exploiting government-mandated backdoors.

Alex Security, August 22, 2019 4:33 AM


How would you describe the type of disinformation campaign perpetrated by mainstream media during 2016 election cycle? There were an endless demonstration of republican president elect Trump thru out-of-context quotes and pictorial parodies to vilify his position on various subjects. On the other hand, the Clintons were given a free pass despite their various failures. Truth is, the cracks and distrust of the “establishment” exist for a good reason: because they’ve become awful at their doings.

1&1~=Umm August 22, 2019 9:18 AM

@Bruce Schneier:

“That shared knowledge has to be strengthened, thereby making it harder to exploit the inevitable cracks. It needs to be made unacceptable — or at least costly — for domestic actors to use these same disinformation techniques in their own rhetoric and political maneuvering, and to highlight and encourage cooperation when politicians honestly work across party lines.”

The “shared knowledge” is often wrong. You could take as a case in point “Going Dark”, pushed by most Law Enforcement seniors, and by many politicians in quite a few nations.

The “inevitable cracks” are there because a few in society realise what a load of bovine fecal matter is being sprayed out by Law Enforcement seniors and politicians. Some of them are actually “on the take”, directly or indirectly, from certain organisations that will benefit from the false “going dark” agenda, as they will supply, at vast cost to the taxpayer’s purse, the fairly useless tools (eg Palantir) demanded.

If it is made “unacceptable or at least costly”, then those who see the dung-heap arguments of “going dark” and similar for what they are will be treated like any other of your “domestic actor” groups and effectively shut down.

And instead of being able to “highlight and encourage cooperation”, the opposite will happen and politicians will certainly not “honestly work across party lines”; in fact it will encourage them to make unsupported and unwarranted claims that have little or no basis in reality, because division is a surefire way to get money for political campaign funds, the purpose of which is to further disadvantage the majority of citizens.

Too many people are buying into the “fake news mantra”, which can at the end of the day usher in Orwellian censorship.

If you want politicians to behave, then their lives must be transparent, and every dime they and those around them receive must be directly attributable, with any smell of fraud being treated quite severely. Further, there must be hard caps on campaign and political spending, and no media freebies or other “assistance”.

vs pp August 22, 2019 1:07 PM

@impossibly stupid has a very good point:

And it’s important to note that propaganda may not be lies. Indeed, the best kinds of propaganda are unslanted truths that the target just doesn’t want people to hear. Or sometimes the lack of information will get the gears of conspiracy theorists running out of control (e.g., the recent death of Jeffrey Epstein). I maintain that no “kill chain” will be successful if the public isn’t educated enough to be able to understand how and why they’re being influenced.

Are we able to handle unpleasant truths, or do we just suppress them from distribution/spreading based purely on their source?

@all related research
First of its kind mapping model tracks how hate spreads and adapts online:

@VinnyG almost always agree with your reasonable and logical posts.

VinnyG August 22, 2019 2:24 PM

@Alex Security re: whatever point you intended to illustrate with this butchering of entropy “…disinformation campaign perpetrated by mainstream media during 2016 election cycle? There were an endless demonstration of republican president elect Trump…”
How, in Mephistopheles’ lair, was Trump “president elect” during the 2016 election cycle? Unless the election itself was a preordained set-up (not making that claim, but it is the only scenario I can contrive that allows your statements to seem anything more than meaningless babble interspersed with a few buzz words…)

Jeb August 23, 2019 3:54 AM


The particular tellings from the recent Google “whistleblower” tell a different tale from yours. The search overlords are well prepared for the upcoming 2020 information war zone. Apparently, they believe it’s worth the money and that it could all “work”.

Alex Security August 23, 2019 4:33 AM


“Unless the election itself was a preordained set-up ”

Well, he was President Elect from the time the election results were official until he was sworn in, but I’m sure you already know that. At least we can agree it’s too bad your preordained set-up Hillary did not win it.

tds August 23, 2019 9:03 AM

@Alex Security, Jeb, VinnyG

@Alex: “At least we can agree” on a good read from David Ignatius

Perhaps China, too, thinks that President Trump is a good Return on Investment (ROI) for dollars spent marketing President Trump

also Opinion President Xi’s choice

Do the Washington Post links look longer than usual? Tracking info?

gordo August 23, 2019 9:05 PM

@Bruce Schneier,

Clint Watts of the Alliance for Securing Democracy is thinking along these lines as well.

Previously from Clint Watts:

Welcome to Hamilton 68 Dashboard Tracking Russian Propaganda

The New Blacklist
Russiagate may have been aimed at Trump to start, but it’s become a way of targeting all dissent
MARCH 5, 2018

The Russians, Hamilton 68 now said, were sowing discord on both sides of the gun control debate by pushing contradictory hashtags like #guncontrolnow and #NRA.

The New York Times put a piece about Russia’s Parkland meddling on page A1, the choicest real estate in American journalism, and outlets like Wired, Newsweek, Vanity Fair and countless others trumpeted the same story. Even Fox News, usually a Russiagate doubter, got in the act, citing Hamilton 68 to say: “Russian bots aren’t pro-Republican or Pro-Democrat. They’re just anti-American.”

Fox wrote the story in a way that used the Hamilton 68 data to make it seem like the Russians didn’t have an exclusive preference for Donald Trump. But the defense of Trump was really a distraction. The palmed card in this propaganda trick was the mere fact that right-wing media, too, were now accepting the core principle of projects like Hamilton 68: that a foreign enemy lurks everywhere in our midst, and the source of political discontent in this country comes not from within, but from without.

[. . .]

“I’m not convinced on this bot thing,” Hamilton 68 honcho and noted War on Terror vet Clint Watts incredibly told Buzzfeed recently. This was after he’d helped place a string of Russian-bots-are-everywhere stories in spots like the front page of the Times.

McCarthyism Inc.: Terror Cranks Sold America the Russia Panic
NOV 14, 2017
Max Blumenthal / AlterNet

The first part of this investigation introduced the ASD’s most prominent figure, Clint Watts, the self-styled national security expert whose high-profile—and factually deficient—Senate testimony introduced America to the supposed menace of Russian bots. Watts flaunts a bio that makes it appear as though his opinions on Russian “active measures” are backed by academic credentials. However, he has no record of scholarship on Russia, does not appear to speak Russian and has no professional experience inside Russia. He has, however, confessed to wasting “billions” of taxpayer dollars on a failed “influence operation” allegedly waged by the U.S. military in the Middle East. As a fellow at the right-wing Foreign Policy Research Institute, Watts has urged American intelligence agencies to encourage jihadists to carry out terrorist attacks against Russia and Iran.

Clint Watts’ World: How America’s top Russian disinformation expert pushes disinformation to justify censorship
When I challenged the high tech flim-flam man behind the Hamilton 68 Russian bots tracker, he ducked my questions. Then his fans asked me if I was a Russian asset.
July 2, 2018
By Ilias Stathatos

In Russia, this process has been long underway, as Natalia Rudakova’s book, “Losing Pravda: Ethics and The Press in Post-Truth Russia,” made clear. Here in the West, we are rapidly approaching a no less troubling scenario where insidious information warfare experts get to legislate the truth and the outliers are driven into oblivion. It is Watts’ vision, where censorship defends democracy and a passive media culture is the best defense against “active measures”, that brings us closer to a landscape that open societies are supposed to resist.


Vincent August 24, 2019 2:26 AM

@1&1~=Umm wrote, “Too many people are buying into the ‘fake news mantra’ which can at the end of the day usher in Orwellian censorship.”

This may have already come true, as evidenced by Zach Vorhies’ revelations of Google’s censorship programs circa 2016. The particular recordings and “document trove” certainly fit in line with known “progressive” liberal tactics of thought policing, which may or may not be limited to acts within Google.

Kudos to him for using the phrase “social engineering” in its original, and correct, context.

Clive Robinson August 24, 2019 2:52 AM

@ gordo,


Yup, there are many opinions on what is going on on the “Internet” and how it affects more traditional media…

But you seldom hear as much about what comes the other way. That is, how the more traditional media are being used to influence the Internet.

Once you realise that it is going on for quite deliberate reasons you start thinking about it in a little more depth. That is when you see a dog’s tail wagging, you have to ask which end of the animal is responsible, or perhaps neither, that is if it has been conditioned by the sound of the ringing bell.

The American author and humorist Mark Twain once advised that,

    If you don’t read the newspaper, you are uninformed. If you do read the newspaper, you are misinformed.

Which might have given rise to his other advice about first learning to read a newspaper, and a quite important observation,

    I’ve lived through some terrible things in my life, some of which actually happened.

Politics is a dirty game even when played by the most honest of men. But those attracted to politics usually have failed to learn the lessons of “power”, “masters” and how they control politicians. That is, there are those that have power, and those that are allowed to think they have power. Politicians are not masters, though they have been allowed to think they have power. The real game is mastery, so that you can tell politicians what to think and do.

It used to be done more discreetly, and in ways that encouraged a conservative methodology that almost always resisted change. There was a time when politicians courted “Press Barons” to ensure favourable coverage, as the Barons dictated “The freedom of the press”. But people are supposed to want instant answers; it’s what dictionaries, and later encyclopedias, were once supposed to give. They failed because, whilst it was easy to find “the meaning of a word”, it was difficult to find “the word for the meaning”; thus you had to “ask” someone more knowledgeable.

Thus communications was the real key to both knowledge and its acquisition. The old “you will be told” model broke down in the face of easy communications, giving way to the new “I will ask” model. The problem is: who do you ask?

Thus the secret to propaganda these days is identifying and subverting the so-called “influencers”. But few are actually any good at subverting influencers, so we see the heavy hand of censorship brought forth.

Which is what we see today: the likes of Facebook management failed to fully realise which way things were going to go, even though the lessons of history were fairly clear.

gordo August 24, 2019 6:09 AM

@ Clive Robinson,

The number of circular-firing-squad misfires is increasing, and with them the collateral damage, but some would say that ops like these are not misfires at all.

tds August 25, 2019 1:16 PM

I want to retract the following statement:

“Perhaps China, too, thinks that President Trump is a good Return on Investment (ROI) for dollars spent marketing President Trump”

An internet search for ‘chinese government interfering in US elections’ was inconclusive. For example, on page one of the search results, there were articles about President Trump and Vice President Pence claiming China meddled in the 2018 US elections. President Trump, while president, has averaged 12 false or misleading statements (or lies) per day.[1] IMO, neither Trump nor Pence are credible.

In addition,

“The Epoch Times is a multi-language newspaper[2] founded in 2000 by John Tang and a group of Chinese Americans associated with the Falun Gong spiritual movement.[3] The newspaper covers general interest topics with a focus on news about China and human rights issues there. It draws from a network of sources inside China, as well as Chinese expatriates living in the West.[4][5][6][7] Along with its critical coverage of the Communist Party of China, the paper is increasingly known for its support of U.S. President Donald Trump and favorable coverage of right-wing politicians in Europe.[8][9][10][11][12] Its news sites and YouTube channels have been criticized for spreading conspiracy theories such as QAnon and anti-vaccination propaganda.[13][14]”


AnonymousAP August 27, 2019 9:23 AM

Why is it that MOST talks on the subject mention other countries attacking other countries, or other countries attacking the US, but never the US attacking other countries? I get it: if we have to pick a side, it would be ours. But does that mean that MOST speech on the subject should be whitewashed, to the detriment of accuracy? Do scientists pick political sides?

It’s a genuine question; please don’t take it as an attack or flame-war starter.

Mike August 28, 2019 8:34 PM


The short answer would be that the US attacking other countries is the de facto standard. When you look at many news outlets around the world, their reporting on major events is often sourced from a U.S. outlet, because we are economically equipped to operate extensively large news networks around the globe. Thus, it is easy for our news outlets to project their world view onto other countries. However, journalism is long dead in the U.S. of A., replaced by tabloids and partisanship, and that holds true across the board. If there is a money motive in politics, news outlets will participate accordingly, because money runs everything.
