Should There Be Limits on Persuasive Technologies?

Persuasion is as old as our species. Both democracy and the market economy depend on it. Politicians persuade citizens to vote for them, or to support different policy positions. Businesses persuade consumers to buy their products or services. We all persuade our friends to accept our choice of restaurant, movie, and so on. It’s essential to society; we couldn’t get large groups of people to work together without it. But as with many things, technology is fundamentally changing the nature of persuasion. And society needs to adapt its rules of persuasion or suffer the consequences.

Democratic societies, in particular, are in dire need of a frank conversation about the role persuasion plays in them and how technologies are enabling powerful interests to target audiences. In a society where public opinion is a ruling force, there is always a risk of it being mobilized for ill purposes — such as provoking fear to encourage one group to hate another in a bid to win office, or targeting personal vulnerabilities to push products that might not benefit the consumer.

In this regard, the United States, already extremely polarized, sits on a precipice.

There have long been rules around persuasion. The US Federal Trade Commission enforces laws requiring that claims about products “must be truthful, not misleading, and, when appropriate, backed by scientific evidence.” Political advertisers must identify themselves in television ads. If someone abuses a position of power to force another person into a contract, undue influence can be argued to nullify that agreement. Yet there is more to persuasion than the truth, transparency, or simply applying pressure.

Persuasion also involves psychology, and that has been far harder to regulate. Using psychology to persuade people is not new. Edward Bernays, a pioneer of public relations and nephew to Sigmund Freud, made a marketing practice of appealing to the ego. His approach was to tie consumption to a person’s sense of self. In his 1928 book Propaganda, Bernays advocated engineering events to persuade target audiences as desired. In one famous stunt, he hired women to smoke cigarettes while taking part in the 1929 New York City Easter Sunday parade, causing a scandal while linking smoking with the emancipation of women. The tobacco industry would continue to market lifestyle in selling cigarettes into the 1960s.

Emotional appeals have likewise long been a facet of political campaigns. In the 1860 US presidential election, Southern politicians and newspaper editors spread fears of what a “Black Republican” win would mean, painting horrific pictures of what the emancipation of slaves would do to the country. In the 2020 US presidential election, modern-day Republicans used Cuban Americans’ fears of socialism in ads on Spanish-language radio and messaging on social media. Because of the emotions involved, many voters believed the campaigns enough to let them influence their decisions.

The Internet has enabled new technologies of persuasion to go even further. Those seeking to influence others can collect and use data about targeted audiences to create personalized messaging. Tracking the websites a person visits, the searches they make online, and what they engage with on social media, persuasion technologies enable those who have access to such tools to better understand audiences and deliver more tailored messaging where audiences are likely to see it most. This information can be combined with data about other activities, such as offline shopping habits, the places a person visits, and the insurance they buy, to create a profile of them that can be used to develop persuasive messaging that is aimed at provoking a specific response.

Our senses of self, meanwhile, are increasingly shaped by our interaction with technology. The same digital environment where we read, search, and converse with our intimates enables marketers to take that data and turn it back on us. A modern-day Bernays no longer needs to ferret out the social causes that might inspire you or entice you — you’ve likely already shared that by your online behavior.

Some marketers posit that women feel less attractive on Mondays, particularly first thing in the morning — and therefore that’s the best time to advertise cosmetics to them. The New York Times once experimented by predicting the moods of readers based on article content to better target ads, enabling marketers to find audiences when they were sad or fearful. Some music streaming platforms encourage users to disclose their current moods, which helps advertisers target subscribers based on their emotional states.

The phones in our pockets provide marketers with our location in real time, helping deliver geographically relevant ads, such as propaganda to those attending a political rally. This always-on digital experience enables marketers to know what we are doing — and when, where, and how we might be feeling at that moment.

All of this is not intended to be alarmist. It is important not to overstate the effectiveness of persuasive technologies. But while many of them are more smoke and mirrors than reality, it is likely that they will only improve over time. The technology already exists to help predict moods of some target audiences, pinpoint their location at any given time, and deliver fairly tailored and timely messaging. How far does that ability need to go before it erodes the autonomy of those targeted to make decisions of their own free will?

Right now, there are few legal or even moral limits on persuasion — and few answers regarding the effectiveness of such technologies. Before it is too late, the world needs to consider what is acceptable and what is over the line.

For example, it has long been known that people are more receptive to advertisements featuring people who look like them: in race, ethnicity, age, and gender. Ads have long been modified to suit the general demographic of the television show or magazine they appear in. But we can take this further. The technology exists to take your likeness and morph it with a face that is demographically similar to you. The result is a face that looks like you, but that you don’t recognize. If that turns out to be more persuasive than coarse demographic targeting, is that okay?

Another example: Instead of just advertising to you when they detect that you are vulnerable, what if advertisers craft advertisements that deliberately manipulate your mood? In some ways, being able to place ads alongside content that is likely to provoke a certain emotional response enables advertisers to do this already. The only difference is that the media outlet claims it isn’t crafting the content to deliberately achieve this. But is it acceptable to actively prime a target audience and then to deliver persuasive messaging that fits the mood?

Further, emotion-based decision-making is not the rational type of slow thinking that ought to inform important civic choices such as voting. In fact, emotional thinking threatens to undermine the very legitimacy of the system, as voters are essentially provoked to move in whatever direction someone with power and money wants. Given the pervasiveness of digital technologies, and the often instant, reactive responses people have to them, how much emotion ought to be allowed in persuasive technologies? Is there a line that shouldn’t be crossed?

Finally, for most people today, exposure to information and technology is pervasive. The average US adult spends more than eleven hours a day interacting with media. Such levels of engagement lead to huge amounts of personal data generated and aggregated about you — your preferences, interests, and state of mind. The more those who control persuasive technologies know about us, what we are doing, how we are feeling, when we feel it, and where we are, the better they can tailor messaging that provokes us into action. The unsuspecting target is grossly disadvantaged. Is it acceptable for the same services to both mediate our digital experience and to target us? Is there ever such a thing as too much targeting?

The power dynamics of persuasive technologies are changing. Access to tools and technologies of persuasion is not egalitarian. Many require large amounts of both personal data and computation power, turning modern persuasion into an arms race where the better resourced will be better placed to influence audiences.

At the same time, the average person has very little information about how these persuasion technologies work, and is thus unlikely to understand how their beliefs and opinions might be manipulated by them. What’s more, there are few rules in place to protect people from abuse of persuasion technologies, much less even a clear articulation of what constitutes a level of manipulation so great it effectively takes agency away from those targeted. This creates a positive feedback loop that is dangerous for society.

In the 1970s, there was widespread fear about so-called subliminal messaging, which claimed that images of sex and death were hidden in the details of print advertisements, as in the curls of smoke in cigarette ads and the ice cubes of liquor ads. It was pretty much all a hoax, but that didn’t stop the Federal Trade Commission and the Federal Communications Commission from declaring it an illegal persuasive technology. That’s how worried people were about being manipulated without their knowledge and consent.

It is time to have a serious conversation about limiting the technologies of persuasion. This must begin by articulating what is permitted and what is not. If we don’t, the powerful persuaders will become even more powerful.

This essay was written with Alicia Wanless, and previously appeared in Foreign Policy.

Posted on December 14, 2020 at 2:03 PM


Jesse December 14, 2020 2:53 PM

It’s a fine line between preventing manipulation and suppressing dissent. The law made to prevent misinformation could be exploited to further an agenda. Short of mandating education in problem solving, what can be done to prevent misinformation from spreading that does not cause as much harm as it cures? Even with perfect legalese, it could still be interpreted differently in practice.

Psychology is a fluid science that could have competing experts on either side of an issue that vehemently disagree. What is evidence of psychological manipulation? What is measurable?

For as massive a problem as this is, I can’t wrap my head around a solution that can’t be abused and land us in an even worse situation.

jones December 14, 2020 2:59 PM

C. Wright Mills discusses some of the threats posed by “the applied social sciences” in The Sociological Imagination, published in 1959. Mills was wary of the idea that the social sciences be put to use in the same way as, say, physics, to exert mastery over the field’s subject. Some snippets:

Among the slogans used by a variety of schools of social science, none is so frequent as, ‘The purpose of social science is the prediction and control of human behavior.’ Nowadays, in some circles we also hear much about ‘Human engineering’— an undefined phrase often mistaken for a clear and obvious goal. It is believed to be clear and obvious because it rests upon an unquestioned analogy between ‘the mastery of nature’ and ‘the mastery of society.’ Those who habitually use such phrases are very likely to be among those who are most passionately concerned to ‘make the social studies into real sciences,’ and conceive of their own work as politically neutral and morally irrelevant.

If a man has an apparatus of control, both subtle and powerful, over an army division on an isolated island with no enemy, he is, you must agree, in a position of control. If he uses his powers fully and has made definite plans, he can predict, within quite narrow margins, what each man will be doing at a certain hour of a certain day in a certain year. He can predict quite well even the feelings of various of these men, for he manipulates them as he would inert objects; he has power to override many of the plans they may have, and occasionally may properly consider himself an all-powerful despot. If he can control, he can predict. He is in command of ‘regularities.’

I want to make it clear in order to reveal the political meaning of the bureaucratic ethos. Its use has mainly been in and for non-democratic areas of society—a military establishment, a corporation, an advertising agency, an administrative division of government. It is in and for such bureaucratic organizations that many social scientists have been invited to work, and the problems with which they there concern themselves are the kinds of problems that concern the more efficient members of such administrative machines.
I do not see how anyone can reasonably disagree with Professor Robert S. Lynd’s comment on “The American Soldier:”

These volumes depict science being used with great skill to sort out and to control men for purposes not of their own willing. It is a significant measure of the impotence of liberal democracy that it must increasingly use its social sciences not directly on democracy’s own problems, but tangentially and indirectly; it must pick up the crumbs from private business research on such problems as how to gauge audience reaction so as to put together synthetic radio programs and movies, or, as in the present case, from Army research on how to turn frightened draftees into tough soldiers who will fight a war whose purposes they do not understand. With such socially extraneous purposes controlling the use of social science, each advance in its use tends to make it an instrument of mass control, and thereby a further threat to democracy.

Clive Robinson December 14, 2020 3:52 PM

@ Bruce, ALL,

Before it is too late, the world needs to consider what is acceptable and what is over the line.

It’s not just “the line”: for “persuasion” to really work it needs to be instilled below a conscious level. Which means the message has to be repeated frequently, but below a threshold where we consciously think about it. In effect it needs to creep up on us.

The problem with modern content delivery systems and persuasion is “choice”: there is rather too much of it, so you cannot get a message across if the target jumps between information sources.

So there are in effect three parts

1, Holding
2, Selling
3, Closing

Of which the hardest with the Internet is “holding” the target on the channel.

Which is why so much attention is being paid to getting the targets “addicted” to the channel and then manipulating the emotions in a cyclic manner.

So perhaps before we talk about the selling and closing aspects, where most would assume the danger of “persuasion systems” lies, we should first talk about the ethics of getting individuals addicted to a particular content channel or channels.

SpaceLifeForm December 14, 2020 4:40 PM

@ Clive, ALL

The “holding” issue, I believe, goes back to an individual’s early childhood, when they were in a seesaw pattern of attention and non-discipline.

Never learned boundaries.

Never learned to question.

Never learned what is trust.

Ross Snider December 14, 2020 4:48 PM

Schneier, this is the most important essay you’ve written in a very long time. Persuasion is at the heart of information security. Not “information technology security”, but the security a person has about their ability to make information-based decisions about which actions they can take in their best interest.

MrXENIX December 14, 2020 5:45 PM

The horror created by the intersection of psychology and mass data collection makes me think that there is some kind of human right at stake here. A first step might be to cut off the companies involved by prohibiting the use of such data for targeted advertising altogether.

vas pup December 14, 2020 6:01 PM

OK, Bruce and ALL:
These videos should help explain how our brains are hacked; disclosing those methods should at least make it harder to manipulate us:

Living on Autopilot


Us vs. Them

The Wings of Angels

@Bruce said: “Another example: Instead of just advertising to you when they detect that you are vulnerable, what if advertisers craft advertisements that deliberately manipulate your mood? In some ways, being able to place ads alongside content that is likely to provoke a certain emotional response enables advertisers to do this already.”

Very true! They put dogs and puppies in any TV ad even distantly related to the subject, because those images generate oxytocin – the hormone of trust!

Now, my nickel:
-what about FINE PRINT? I wish those guys were already in hell. That is a terrible and very unfair manipulative practice.
-what about the legal agreements an average Joe/Jane must sign to get basic service (including privacy notices that basically eliminate the user’s privacy altogether), which run to many pages of legalese not understandable even by lawyers? That is not a contract between two big businesses with equal power on both sides, i.e., big corporate legal teams.
So those so-called pro-forma contracts for Joe/Jane should be in plain English, understandable by a high school graduate.

But what I have observed for more than 20 years: Presidents come and go, legislators come and go (though they occupy their positions substantially longer), but NOTHING substantially changes on this subject matter.
Money is in charge – that is what you may not want to accept, but that is reality.

Cris Perdue December 14, 2020 6:23 PM

Seems like it might be desirable to tighten up the limits on the use of repetitive demonstrable falsehoods for personal or, yes, political gain.

Frank Wilhoit December 14, 2020 6:56 PM

“Should There Be Limits on Persuasive Technologies?”

Yes: they should be banned outright. They are an existential threat to the human spirit. They are a covert and dishonest form of violent aggression. A society that permits them has literally and entirely given up on itself, and none of its inhabitants can escape total immiseration.

Jon December 14, 2020 9:07 PM

@ Jesse

That such a solution can be abused is no reason not to enact one.

The fundamental point being that those who do abuse it be caught and punished. This is a sticky point, of course, because those who are in power are likely to use such techniques, and unlikely to punish themselves, but it can be done.

Everything can be abused. Take, for example, a screwdriver. One can (ab)use one to commit murder. But the point isn’t to outlaw screwdrivers, but to catch and punish those who commit murder (with or without a screwdriver).


Goat December 14, 2020 10:56 PM

The answer to the question in the title is of course yes, but a better question is: How?

Things that won’t work:

  • Removing political persuasion tools: censorship prevails
  • Banning politicians from paying ad money: they will just hire PR firms
  • Rigorous fact checking: it is a lot more difficult (and costly) than one might think
  • Doing away with universal adult franchise: democracy dies

Let’s ask Why? then:

  • In the short run people may act irrationally: the whole ad business model lives off this
  • Tech giants have business models based on manipulation: people think they are immune, and companies can get away with it
  • The human psyche is inclined to believe visuals and written words
  • In democracies, manipulative practices can make politicians get rich quick

Most of these seem to be psychological and can’t be fixed, but we can change the business model:

  • We must understand that we are prone to assumptions and we CAN be manipulated
  • Regulation against manipulation algorithms
  • Stop using algorithmically curated feeds (yes, that autoplay is censorship)
  • Changing the monetary aspects:
    • Ads are a stupid monetisation model; they undermine human potential
    • Not all things need to be done to make money
    • People who think they are producing free things while putting up ads (I have met them), or who use platforms that “censor,” should be shown the true daylight
    • We can pay for some things and financially support people and orgs, if payments are made easier and international (USD-only may not be an option)

Goat December 14, 2020 10:59 PM

A slight correction: I meant removal of political persuasion posts, not tools. (Tool removal may work in some cases; social media in the form it exists isn’t needed.)

Jon December 15, 2020 12:07 AM

@ Goat

In re “won’t work” item #2, that’s arguable. See, if the money’s traceable, it’s still punishable.

e.g. Murder is unlawful, and no, you can’t get around culpability just by hiring a hit-man. J.

probably December 15, 2020 12:56 AM

This is a pretty persuasive argument.

“To run SolarWinds products more efficiently, you may need to exclude certain files, directories and ports from anti-virus protection and GPO restrictions.”

“We also list the service accounts that should be added for optimal performance and to allow all Orion products to access to required files with required permissions.”


William Barr should have full confidence that any backdoors will be managed responsibly, and that the credentials to any repositories containing backdoors or accompanying documentation will likewise be managed responsibly.

It doesn’t matter whose backdoor it is, just as long as the bastard works, right?

Goat December 15, 2020 2:04 AM

@Jon, that’s quite right, but money laundering does happen and tracing origins isn’t always easy. That’s why I said it SHOULD be illegal, but that may not be the solution to the problem.

Goat December 15, 2020 2:13 AM

To clarify: I am talking about banning political ads. Even if this is done, manipulative posts may be spread, often by PR firms who hire clickers (real humans) from Bangladesh and India. This is something very close to whack-a-mole.

hooby December 15, 2020 4:52 AM

I agree that the amount of manipulation everyone is exposed to on a daily basis is way too much and mentally unhealthy.

But I’m not convinced that trying to crack down on those practices through laws and limitations is the best and most effective way to approach this.

I would favor making psychological manipulation a mandatory part of the school curriculum. People who know and understand how manipulation works are far more likely to detect, and thus resist, manipulation attempts made against them. Educating the populace to be more resistant to manipulation, harder to exploit, and more politically mature in general might be more effective than any laws and regulations ever could be.

Denton Scratch December 15, 2020 4:55 AM

“[…] voters are essentially provoked to move in whatever direction someone with power and money wants.”

This seems to me to be a case of letting voters off the hook.

Just under 50% of US voters voted for Trump. If that was the result of persuasive technologies run out-of-control, then there’s no hope for any of us. For any kind of public participation in decision-making to have a chance, we have to be able to assume that most people are capable of making rational decisions.

I am an optimist in that respect; I assume that all those Trump voters were more-or-less rational actors, not simply the puppets of advertisers. I conclude that the USA’s current political divisions are real, and not the result of malevolent campaigns of persuasion. Given that these divisions look to me to be much the same divisions that have blighted the country since the end of the Civil War, I further conclude that the Civil War never really ended (I’m a socialist, but I seem to share this view with many USAians).

I don’t think the English Civil War ever really ended either; Northern Ireland is my evidence. I certainly don’t advocate re-fighting either Civil War.

It’s tragic, and I offer no solution. But blaming all our problems on secretive persuaders is a doctrine of despair, because, as you note at the beginning of your article, persuasion is human nature – if human reason is defenseless against money and technology, then we’re all doomed.

Doomed, I say!

Winter December 15, 2020 5:16 AM

“I’d be hard pressed to understand what kind of laws would have prevented Bernays stunt of linking feminism to smoking cigarettes; something that was extremely effective.”

All the discussions here about PR firms etc. are caused by two American-only rules: that companies have free speech, and that advertising is somehow free speech too.

In most of the world, e.g., Europe, companies have no “human freedoms,” so they also have no free speech rights. Advertising is not free speech. If you are paid to say something, that is not protected under your “freedom of speech.” The person who pays you might have those rights, or not.

So, Bernays’s stunt would probably fall under paid promotion, and as such the actresses would be required to make it well known that they were paid to do so.

Petre Peter December 15, 2020 6:56 AM

“Warning! The following message has content that may affect you emotionally.” I think this would be a good start.

Internet Individual December 15, 2020 7:12 AM

I recently wrote an essay in a similar context. Persuasive technologies can assuredly be used as tools to take advantage of, or to impose, assert, or exploit the will upon, a target or target audience. I quickly realized there are more underlying, root issues at play that require a firmer grasp of the challenges we will be confronting in the impending future.

The goal is getting to the root core of online disinformation to try to mitigate it, principally without going full totalitarian North Korea style, with absolute control over any form of information communication. After considerable reflection over the past couple of years, I came to the realization that inherent vulnerabilities lie embedded in the very foundations of how we as humans communicate. I only speak English, or American English to be more precise, but I suspect each language/culture is similar in this respect. I explored the very basics from a practical and logical perspective, starting with “what are words,” as ridiculous or trivial as that sounded initially. I quickly realized human communication spikes off the charts in terms of complexity after accounting for only a few of the countless variables we subconsciously consider in a communication. It’s surreal to consider how the grunting and other interesting sounds we created and assigned to convey ideas have led us to the technical society in which we reside today, especially after understanding how poorly we convey or perceive others’ messages. Is that guy smiling and nodding his head in agreement with what you just stated because he understands the idea you were trying to convey? Or does he just think he understands it? How would you know? Ideas are subjective and comprehended through relative perception.

What does that mean? In short, take 20 different people, convey the same message to them, and see how each individual interpreted that message. You will see that the information interpreted by each individual might be vastly different from one person to the next. A recent example that comes to mind is when President Trump said, “Proud Boys, stand back and stand by.” Everyone saw and heard what was stated, yet there were many interpretations of what idea was being conveyed.

It’s similar to how a polygraph machine works: you can’t interpret someone’s perception into a scientifically objective binary datapoint. An example might be someone ranting online. They rant on a popular social media platform, “Head Magazine,” and say, “There are no jobs, unemployment is high, and it’s the fault of …” Assume in this scenario the individual doesn’t have any dishonest or ill intentions and is simply stating the facts as best they can convey them. In City A (where they reside) there are few jobs, unemployment is very high, and the policy of … was directly impacting the unemployment rate. However, in City B (a city 1000 miles away in the same country) unemployment is very low, there are plentiful jobs, and that very same policy that caused City A’s poor economic conditions directly contributed to the prosperous conditions of City B. So, is the online ranter spreading disinformation, or lying? Stirring trouble or civil dissension? It depends who received or read the message.

What’s interesting to me is how the online ranter might be identified as a mis/disinformation spreader, notably from the perspective of an online social media corporation. The ranter was expressing frustrations and only stated honest facts in good faith. Yet whether citizens from City B viewed the rant (which is out of the ranter’s control and unknown to them) dictated whether the ranter was spreading mis/disinfo.

Consequently, the ranter might be suspended/banned from the social media platform, put on some list, marginalized, etc. Associated with negativity or anti-government/establishment data-points on a record in a database, of the social media corporation in question. (which might eventually be acquired by data brokers, shared with government at a local and or federal level, or hacked and leaked.)

The ranter’s comment might get de-prioritized beneath 40k others as to not be seen. Or maybe 40k others’ comments were prioritized above the ranter’s. (which has the same practical outcome but different potential legal ramifications).

What might have happened if only individuals from City A viewed the message? What about if the ranter’s message aligned with the social media companies’ views? What about if it didn’t align but created more traffic and “buzz” to the social media website, creating more revenue for them? This sort of example happens daily online with personalities/influencers and the millions of viewers and that follow them.

In short, after lengthy theoretical and practical conjecture in search of potential solutions, I came to the conclusion that disinformation (incorrectly conveying an idea) is at its core subjective. This is why polygraph machines aren’t admissible in court, regardless of how many sphincter sensors and other technological doohickeys are added, or the efficacy of the administrator. I don’t even want to know how this metric might be concocted and presented. Disinformation has always been an issue, but it has been exacerbated exponentially due to global interconnectedness at near the speed of light (the internet). I got into it deeply in my writing, regarding ways to mitigate this as a democratic society that guarantees freedom of speech, but ultimately, regardless of any government, country, language, or culture, it will tear apart the fundamental constructs of society. It’s an unwinnable situation, and yet another unintended consequence of the pandora’s box unleashed on the world known as the internet.

Until then, it’s about mitigation in a red-vs-blue-team type of situation, with everyone losing in the end. Now that adversarial nation-states are at play online, this issue is going to become significant and impactful very quickly, and will eventually undermine society both locally and globally, and therefore how we function and behave as humans.

Michael P December 15, 2020 7:21 AM

As usual, Ian Betteridge was right.

This kind of essay would be early on the list of things to ban if governments adopted the kind of rule that this essay advocates.

Winter December 15, 2020 7:45 AM

@Internet Individual
“Starting with “what are words.””

Are you aware that linguists are unable to tell you what a word is?

To get an idea of what you are up against, read “Words and Rules” by Steven Pinker (384 pages). This is only one view of one aspect of “What is a word?”, and focused only on English-like aspects at that. It does not come to a definite conclusion.

Ray December 15, 2020 8:28 AM

This essay was written with Alicia Wanless

I think you ought to make this a little more prominent and at the beginning of the essay.

BP December 15, 2020 8:49 AM

Isn’t this just a rehash of the first day of any first-year college philosophy class? All that stuff about why Socrates drank the hemlock. Isn’t this just a modern rehash?

david raab December 15, 2020 8:58 AM

Some thoughts…

  • individual-level manipulation requires access to individual data such as mood, location, social status, etc. Some is volatile (mood), some is not (gender, race). Restricting access to this data and, especially, banning its use in ad targeting would mean that everyone sees the same messages at the same time, reducing the ability to manipulate individuals.
  • BUT: mass propaganda is effective even when everyone sees the same message. It relies on people being fed a steady diet of lies and distortions with little conflicting information. In totalitarian states, people have no choice because all media show the same messages. In free societies, individuals can still choose to restrict themselves to outlets that send a consistently distorted message.
  • This is where things get dicey. Everyone probably agrees that falsehoods and distortions are bad, but the problem is who decides what’s false or distorted. Free speech theory assumes people will figure this out for themselves but that fails if they don’t hear both sides. Having a fact-checking authority judge for them is obviously dangerous, although people of good will can probably resolve more issues than is commonly realized.
  • The other option is to require each outlet to fairly present opposing views, so people at least see them and are forced to judge for themselves. Along this line: there’s a plausible argument that polarization in the U.S. accelerated after repeal of the FCC Fairness Doctrine in 1987, when it was no longer required for each TV station to expose its viewers to both sides of controversial issues.
  • Of course, some (perhaps many) people will judge poorly even when presented with both sides. It’s hard to see how you avoid this possibility without serious encroachments on freedom.
  • These problems existed long before social media or personalized ad targeting. Social media and personalization do exacerbate them by making it easier for people to fall into a bubble of one-sided information, both because algorithms push them towards extreme content and because there is more extreme content available. Regulation of social media and personalization can reduce the problem, and is worth pursuing for that reason. But it won’t solve all problems.
  • There’s an even deeper crisis of authority among people who feel “the elites” have lied to them. That’s a root cause, but it’s a topic for another day.

Anonymous December 15, 2020 9:41 AM

“I would favor to make psychological manipulation a mandatory part of the school curriculum. People who know and understand how manipulation works are way more likely to detect and thus resist manipulation attempts made against them. Educating the populace to be more resistant to manipulation, harder to exploit, and more politically mature in general might be more effective than any laws and regulations ever could be.”

Agreed. I recall an article saying that Finland has made something like that part of its educational system. I’m afraid I did not bookmark the article, and it was many months ago at least.

Goat December 15, 2020 10:20 AM

“People who know and understand how manipulation works are way more likely to detect and thus resist manipulation attempts ”

Are you sure? People have become quite aware of privacy risks, but Facebook still seems to be paying its bills.

Winter December 15, 2020 10:23 AM

@David Raab
“Free speech theory assumes people will figure this out for themselves but that fails if they don’t hear both sides.”

That fails even more under the Kremlin approach, where people are drowned by a fire hose of fake news. That is like a comment section on Schneier’s blog without a moderator, but worse. Any attempt to find facts is thwarted.

BOb December 15, 2020 10:30 AM

Such a wonderfully persuasive piece. Of course, I’m guessing the author would exclude himself from regulation because he’s a good person. The other people should be muted. That’s how all of these arguments go. It’s all just censorship by the people who think they can get away with it.

Winter December 15, 2020 10:45 AM

“I’m guessing the author would exclude himself from regulation because he’s a good person. ”

Obviously, you have not been here much before? Then there is the question: why do you come here now, and why are you dismissive without having looked at the context?

Clive Robinson December 15, 2020 11:22 AM

@ david raab,

The other option is to require each outlet to fairly present opposing views, so people at least see them and are forced to judge for themselves.

It does not work, as we have found in the UK, where a government changed the Charter of the British Broadcasting Corporation (BBC) a few years back.

What it actually does is give undeserved air time to very minor, frequently socially unacceptable viewpoints.

Look at it this way: 99% of the population has no issue with religious tolerance. However, the “opposing view”, which is a politer version of “Heretics should be burned on earth as well as in hell”, is held by at most 1% of the population. But the rule means this “they will burn” minority view gets airtime approximately equal to the majority view.

In the past few years I’ve looked at “rules to ensure fair behaviour”, and they can all be followed to the letter yet produce the opposite effect to what you would expect, reinforcing unfair viewpoints and behaviours.

Billikin December 15, 2020 12:38 PM

Internet Individual wrote:

“You will see the information that was interpreted by each individual might be vastly different from one person to the next. A recent example that comes to mind is when President Trump said. “Proud Boys, Stand down and stand back.” Everyone saw and heard what was stated, yet there were many interpretations on what idea was being conveyed.”

An excellent example. 🙂 President Trump did not tell the Proud Boys to stand down. He told them to “stand back and stand by”. Big difference.

Billikin December 15, 2020 12:53 PM

Bring back Rhetoric to the curriculum.

Back in the 1990s I became acquainted with Critical Thinking. At least some courses applied critical thinking to ads. Is Adbusters magazine still around? I don’t know. A one-year high school course in Critical Thinking, mostly aimed at analyzing online ads and persuasive techniques, would be invaluable.

When I was in high school I had the benefit of reading my grandmother’s high school Rhetoric textbook. IMO it was better than Critical Thinking texts. Rhetoric was one of the seven liberal arts. Bring back Rhetoric, I say.

Internet Individual December 15, 2020 2:30 PM


Excellent catch. My mistake; I was juggling too many things this morning. He did indeed say “Stand back and stand by.”

Internet Individual December 15, 2020 4:26 PM

@Winter

Thanks for your reply. I wasn’t aware of the book, nor did I know linguists have failed to define exactly what a “word” is. That’s interesting; I’ll have to take a glance at it. I like to think I’m my own worst critic, and so when I broke things down to such a level, it sounded like nonsense when I tried to explain it to others. I got the impression people started thinking I was losing my mind. And who knows, maybe I have. At least it sounds like I’ll have some company.

Faustus December 15, 2020 4:28 PM


The evidence of fact is connection to a web of references. Bald statements, here and elsewhere, are unlikely to be fact if they refer to anything beyond logic or mathematics or philosophy.

Even the best and most intelligent posters here err.

After hearing a lot about feminism I bought a text to learn more. Despite containing many claims about gender in the realm of history and sociology and psychology, there was not a single reference. I’d never seen a college textbook without references. It had been invented from whole cloth.

I’m sure it will be used as a reference, but the fact is that it is without foundation. The book, that is. The assertions themselves, although unsupported, may be true. Or not.

David Leppik December 15, 2020 10:39 PM

While persuasion is an important issue, it is one of the things least affected by modern technology.

Humans have been engaged in psychological warfare for as long as there have been humans. In fact, all social animals engage in it to some extent, and chimpanzees are capable of a wide range of persuasive techniques. Dogs have evolved a facial muscle to mimic human sadness or remorse when indicating submission. It’s very cute. Humans may have huge brains as the result of an arms race in persuasion.

Propaganda techniques are not significantly different or more persuasive than they were a century ago. A lot of people are concerned that the last five years have been the result of some new propaganda tool that caused Brexit and the Trump presidency, rather than long-standing trends that weakened trust in institutions. Social media and bubble-producing algorithms aren’t innocent, but they are only part of the story.

You don’t need to look far back in history to see plenty of examples of demagogues and fascists using the same techniques to the same, or stronger effect. Well-fed people who feel respected are hard to radicalize. When people feel threatened and disenfranchised, it’s hard to keep them from getting radicalized.

Similarly religions across the world have a long history of making bizarre or easily disproven claims that nobody questions because they have been framed in ways that discourage that line of questioning.

New technology, from TV ads to banner ads, see a drop-off in effectiveness once people get used to them. That’s what makes advertising so hard: you need novelty to grab attention, and novelty is fleeting.

If persuasive technology were so effective, we’d all be drinking Coca-Cola. There’s a fair amount of evidence that advertising that is persuasive, rather than educational, doesn’t work even for targeted ads. That is, the billions of dollars that powers Google and Facebook are part of a feedback loop that does more to convince advertisers that their ad dollars are well spent than to actually persuade customers.

The most persuasive technique, whether in religion, politics, or elsewhere, is to convince you that everyone—experts and neighbors alike—agrees and therefore that if you disagree there must be something wrong with you. (That may be why so much money is spent on advertising.) The second most persuasive technique, which is related, is to convince you that something may be right but it’s impossible. Political change happens when something that was just recently impossible (abolishing slavery, public schools, gay marriage) is now possible or inevitable.

That said, there are some real threats here, but they aren’t just about persuasion.

Micro-targeting, or targeting people with highly directed messages, is what’s new. The main risk of micro-targeting has to do with targeting vulnerable populations and determining when people are most persuadable. For example, cults have long known to target people when they are emotionally vulnerable and prey on their weakened self-esteem. Facebook can automate this—in fact they have categories for people with particular insecurities.

Micro-targeting could be worse. I really don’t want Google (owner of FitBit) to know someone’s heart rate when they are watching a particular YouTube video, since it may reveal secret emotions and alliances.

Micro-targeting can be countered with strong privacy protections. That would also help fight discrimination. Also blackmail, which is an age-old persuasion technique.

David Leppik December 15, 2020 10:49 PM

Concerning education as a counter-measure:

It’s a good start, but limited. When I took psychology in college, just about every technique they described to manipulate people still worked when people knew the trick. Some more than others. For example, price lists tend to include a high priced item that’s there to make the other prices look lower. You might know that a price list with low, medium, and high is really good-enough, more-than-you-need, and nobody-needs-this, but it’s still hard to choose the lowest priced item.

Winter December 16, 2020 12:36 AM

“If persuasive technology were so effective, we’d all be drinking Coca-Cola. ”

Look around at the (over)size of the average person on the streets. Speculate about the origin of the excess calories.

Now again, do you still think persuasive technology does not work?

Goat December 16, 2020 4:20 AM

Most people here understand that all security is fallible; let’s also understand that all humans are gullible (meaning they CAN be influenced, though with varying difficulty).

Antistone December 16, 2020 4:26 AM

I’m inclined to say that any persuasion that knowingly promotes irrational decision-making is unethical, unless precautions are taken to ensure that the subject is not actually harmed thereby. (E.g. demonstrations of cognitive biases so that you can train people to resist them would be OK.) Even if the subject isn’t directly harmed by the decision you are persuading them to make (ha!), you are promoting habits of insanity.

This probably condemns >95% of modern marketing. (Ethical marketing is entirely possible. There’s nothing inherently unethical about informing people of the existence of your product or service, or conveying an accurate impression of its benefits. But most modern marketing doesn’t stop there.)

Not everything that’s unethical should be illegal (due to difficulties of enforcement, if no other reason). I’m not sure what the legal line ought to be.

Cassandra December 16, 2020 4:35 AM

@David Leppik

I submit it is better to have a good start than no start at all.

However, I agree that persuasive techniques work, even when you know about the technique. It is one of the paradoxes of being human that knowledge and behaviour are not as closely linked as people would like. I know I am overweight and should eat less. I remain overweight. And, I am definitely not a perfectly rational actor, beloved by economists.

If everyone were completely rational, a study showing that the placebo effect works even when the participants know they are receiving a placebo should give a null result. It doesn’t.


People are weird, and susceptible.

In past times, people learned rhetoric (the art of persuasion), which gave a good grounding in identifying false arguments, and recognising persuasive methods for what they are. It is sad the average person gets little education in that, or critical thinking.


hooby December 16, 2020 6:07 AM

I’m not claiming that education is the be-all end-all solution to all manipulation.

But compared to trying to limit manipulation through stricter legislation – I do feel that education might be more effective. I feel education tackles the underlying problems closer to their root causes – whereas enabling people to sue each other for manipulation after the fact… that feels like a quagmire to me.

I think that just that one tiny little step of allowing people to recognize and properly call out toxic rhetoric, logical fallacies and argumentative trickery could already have a massive impact on all political discourse, for example.

Clive Robinson December 16, 2020 8:01 AM

@ Cassandra,

I know I am overweight and should eat less. I remain overweight.

Welcome to the Western world, where most doctors are way behind the science curve and for more than a working lifetime have slavishly followed the words of a liar and fraud called Dr Ancel Keys[1], of the US Army “K ration”, which left US troops malnourished and unhealthy even though their calorific needs were more than met.

I know one doctor, more on the leading edge of the curve, who notes that “in the West virtually nobody is the weight we should be”. In part because of image issues, but as much due to the fact that we measure it wrong (BMI is not the right way; it was developed by the insurance industry a century and a half ago to be easy for assessors to use without complicated tables).

Another problem, which persists with some doctors, is the mantra that “all calories are equal”. Even in a flame that’s not true of food, and a few moments’ thought about the work involved in digestion shows that the mantra has to be untrue in living creatures as well, because the laws of thermodynamics demand it and brook no argument.

However, what really makes us fat and unable to shift it is fat storage and the hormonal effects it has on the brain. It’s why “crash diets” work for many but the weight just goes back on.

It’s a classic “ratchet effect”. If you have an excess of blood sugar, your body tries to store it away, first in the liver, then in fat cells. If you don’t have enough fat cells, your body makes more of them very quickly, which is a desirable outcome in environments where free calories in the form of simple carbohydrates are extremely rare, as they normally are. So the ratchet has a fast rise time, and for similar reasons a very slow fall time.

However, as those fat cells start to empty and the fat gets moved into storage in the liver, the fat cells send out hormonal messages indicating that you must be in a form of starvation mode... This causes your brain to seek out food. It kicks in around day four to six of most low-calorie diets, and even the euphoria of seeing weight loss on the scales does not get you to ignore the message for more than a few weeks.

The problem is that the more fat cells you have, the bigger that hormonal message; thus your brain tells you, in effect, to “binge eat”, which actually makes more fat cells, and so the ratchet clicks up a notch.

Given sufficient time your body will eventually kill and reabsorb fat cells, but that takes quite a period of time; depending on age and other factors, years may pass, with your brain telling you all the while that you are starving, because as far as it is concerned you are...

So how to break the ratchet?

The thing is that starvation is harmful, but it takes a while to start happening. There is a period of time where it’s not harmful, and that is fasting; your body/brain can, with just a little training, get over the bad eating habits sugar has caused.

The best time to fast is when you are asleep and after you wake up... Thus missing breakfast sounds like a good idea, and it actually is. But there is fasting and fasting; remember the aim of fasting, which is to reduce the intake of simple carbohydrates, or sugars. Thus a protein breakfast will give you material for protein building but not lift your blood sugar levels, and so won’t cause the “spike-and-drop” that causes mid-morning snacking, and thus excess blood sugar and fat-cell filling/making... Lunch should be low in simple carbs, with a few complex carbs, lots of fibre, and some fats. Ideally a vegetable soup made without root vegetables, seeds, or grains, though pulses are OK. This fills you up but only provides a gradual lift that replaces the energy you have used in the morning, whilst giving a significant portion of both the soluble and insoluble fibre your digestive tract needs to be healthy, and it stops secondary fast-carb cravings and that after-lunch drowsiness.

Your evening meal should really be a late tea, eaten at least three to four hours before you go to sleep. It should again be vegetable-rich, but contain the remainder of your energy requirements split amongst protein, fat, and complex carbohydrates.

But also, due to the way the liver works, healthy people can fast for up to a day and a half without harm, or even really noticing... In fact, science has found it is healthy to do so. That is, you miss out breakfast, or breakfast and lunch, to prolong your fast period. You can also have an evening meal, then not eat at all the following day, then start with breakfast or lunch the day after that. You can also do a 1/2, 2/2, 3/2 cycle, where you eat your base needs on a cycle over three days that has a partial fast at the beginning.

These fasts actually cause your base metabolism to rise to near where it should be, and in the process increase the movement of energy from the fat cells to the liver, without anywhere near the level of hunger a continuous reduced-calorie diet causes.

Until recently, the AMA recommendation for dealing with unstable type II diabetes was to overload with calories and up the insulin doses. A surefire recipe for obesity and ten to twenty years off your real life expectancy. The argument was that type II diabetes was incurable and that having sugar lows was potentially fatal.

The way to deal with unstable diabetes in many cases is to stop eating the foods that cause “sugar highs”, which reduces the need for the insulin that causes artificially low sugar levels. Also, those who have continuous blood glucose monitoring can up their physical workload as the blood glucose starts rising, to both burn it off and get some of that healthy daily cardiovascular exercise.

I know I make it sound easy, but I know it’s not[4]. I’m slowly losing weight and not getting the hunger issues I used to.

My big problem at the moment is getting exercise; walking/running and other high-impact exercise is “contraindicated” due to degenerative issues with my spine and joints, along with COVID putting paid to swimming and other low-impact exercise in gyms etc...

But the secret to real weight loss, as I indicated, is to “break the ratchet with minimum pain”, and that might be attainable for many just by changing the foods they eat to get rid of sugars[2].

[1] It was mostly about politics, tipped by the grain and corn industry into really dangerous diets that wrecked people’s internal organs, which is why type II diabetes is killing millions of people early every year.

[2] One way to do this is not to have any “pre-cooked” food in the house. If it’s easy to eat, then giving way to cravings is too easy (and don’t kid yourself if it’s fruit). Having only raw food in the house that has to be cooked slowly helps you get over the food cravings[3], as does doing exercise. It’s not the big changes that make the big changes, but the little everyday lifestyle changes.

[3] Almost as importantly, slow cooking has psychological benefits: it lifts mood and helps reduce the highs and lows of various forms of depression brought on by stress. It also slows you down with what is actually gentle exercise, because you are standing on your feet moving around, not sitting in a chair etc.

[4] There are tricks that can help. First off, socially acceptable drugs like caffeine, nicotine, and alcohol mess your body up and are thus best avoided. Hunger pangs can be reduced with hot drinks. A few shreds of green vegetables and cooked meat in a hot, low-sodium stock gives taste and smell and fills the stomach with few calories; certainly a lot fewer than a piece of fruit or a tiny handful of nuts or dried berries. Making your own stock is not difficult and can be immensely satisfying as an activity, but don’t go as far as marmites; that really is a lot of effort.

Faustus December 16, 2020 10:30 AM


That is a pretty irrelevant critique, missing the whole point. Why criticize someone for attempting to find out more about a subject rather than simply discounting it? You may not be able to learn from texts, but I read texts all the time, because they generally present good overviews and attempt to present the various sides of controversies. And generally they provide references for further research.

Don’t confuse your limitations for reality.

Obviously I have heard of feminism. But within it too there are many factual (rather than moral or philosophical) claims. I wanted to see how they were supported.

JonKnowsNothing December 16, 2020 4:41 PM


re: If everyone were completely rational, a study showing that the placebo effect works even when the participants know they are receiving a placebo should give a null result. It doesn’t.

There is the “placebo effect” and the “nocebo effect”.

A nocebo effect is said to occur when negative expectations of the patient regarding a treatment cause the treatment to have a more negative effect than it otherwise would have.

A placebo is a substance or treatment which is designed to have no therapeutic value.

Humans are complex animals, and medicine is an “art” which is “practiced on you”.

Feel like a canvas yet?

Erdem Memisyazici December 17, 2020 4:24 PM

We cannot simply say “allow it all” because it’s “progress”. There are a lot of technologies we consider “progress”, yet they are regulated. Especially in a world where a lot of people regard such “perfect timing” events as divine intervention. Not everybody is technologically literate, and I think there is a lot of abuse of the psychology of the uninformed masses here, and that’s just wrong. That’s why we have experts.

Mr. Schneier, you’re the man 🙂 and Happy Holidays!
