Political Disinformation and AI

Elections around the world are facing an evolving threat from foreign actors, one that involves artificial intelligence.

Countries trying to influence each other’s elections entered a new era in 2016, when the Russians launched a series of social media disinformation campaigns targeting the US presidential election. Over the next seven years, a number of countries—most prominently China and Iran—used social media to influence foreign elections, both in the US and elsewhere in the world. There’s no reason to expect 2023 and 2024 to be any different.

But there is a new element: generative AI and large language models. These have the ability to quickly and easily produce endless reams of text on any topic in any tone from any perspective. As a security expert, I believe it’s a tool uniquely suited to Internet-era propaganda.

This is all very new. ChatGPT was introduced in November 2022. The more powerful GPT-4 was released in March 2023. Other language and image production AIs are around the same age. It’s not clear how these technologies will change disinformation, how effective they will be or what effects they will have. But we are about to find out.

Election season will soon be in full swing in much of the democratic world. Seventy-one percent of people living in democracies will vote in a national election between now and the end of next year. Among them: Argentina and Poland in October, Taiwan in January, Indonesia in February, India in April, the European Union and Mexico in June, and the US in November. Nine African democracies, including South Africa, will have elections in 2024. Australia and the UK don’t have fixed dates, but elections are likely to occur in 2024.

Many of those elections matter a lot to the countries that have run social media influence operations in the past. China cares a great deal about Taiwan, Indonesia, India, and many African countries. Russia cares about the UK, Poland, Germany, and the EU in general. Everyone cares about the United States.

And that’s only considering the largest players. Every US national election from 2016 has brought with it an additional country attempting to influence the outcome. First it was just Russia, then Russia and China, and most recently those two plus Iran. As the financial cost of foreign influence decreases, more countries can get in on the action. Tools like ChatGPT significantly reduce the price of producing and distributing propaganda, bringing that capability within the budget of many more countries.

A couple of months ago, I attended a conference with representatives from all of the cybersecurity agencies in the US. They talked about their expectations regarding election interference in 2024. They expected the usual players—Russia, China, and Iran—and a significant new one: “domestic actors.” That is a direct result of this reduced cost.

Of course, there’s a lot more to running a disinformation campaign than generating content. The hard part is distribution. A propagandist needs a series of fake accounts on which to post, and others to boost it into the mainstream where it can go viral. Companies like Meta have gotten much better at identifying these accounts and taking them down. Just last month, Meta announced that it had removed 7,704 Facebook accounts, 954 Facebook pages, 15 Facebook groups, and 15 Instagram accounts associated with a Chinese influence campaign, and identified hundreds more accounts on TikTok, X (formerly Twitter), LiveJournal, and Blogspot. But that was a campaign that began four years ago, producing pre-AI disinformation.

Disinformation is an arms race. Both the attackers and defenders have improved, but also the world of social media is different. Four years ago, Twitter was a direct line to the media, and propaganda on that platform was a way to tilt the political narrative. A Columbia Journalism Review study found that most major news outlets used Russian tweets as sources for partisan opinion. That Twitter, with virtually every news editor reading it and everyone who was anyone posting there, is no more.

Many propaganda outlets moved from Facebook to messaging platforms such as Telegram and WhatsApp, which makes them harder to identify and remove. TikTok is a newer platform that is controlled by China and more suitable for short, provocative videos—ones that AI makes much easier to produce. And the current crop of generative AIs are being connected to tools that will make content distribution easier as well.

Generative AI tools also allow for new techniques of production and distribution, such as low-level propaganda at scale. Imagine a new AI-powered personal account on social media. For the most part, it behaves normally. It posts about its fake everyday life, joins interest groups and comments on others’ posts, and generally behaves like a normal user. And once in a while, not very often, it says—or amplifies—something political. These persona bots, as computer scientist Latanya Sweeney calls them, have negligible influence on their own. But replicated by the thousands or millions, they would have a lot more.

That’s just one scenario. The military officers in Russia, China, and elsewhere in charge of election interference are likely to have their best people thinking of others. And their tactics are likely to be much more sophisticated than they were in 2016.

Countries like Russia and China have a history of testing both cyberattacks and information operations on smaller countries before rolling them out at scale. When that happens, it’s important to be able to fingerprint these tactics. Countering new disinformation campaigns requires being able to recognize them, and recognizing them requires looking for and cataloging them now.
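
To make "fingerprinting" concrete, here is a minimal sketch, in Python, of one common coordination signal researchers look for: pairs of accounts that share a suspiciously similar set of links. It is illustrative only, not any platform's or agency's actual method; the helper names, sample accounts, links, and the 0.6 threshold are all hypothetical.

```python
# Toy coordination check: flag account pairs whose sets of shared links
# overlap heavily. Illustrative only; the accounts, links, and threshold
# are hypothetical, not any platform's or agency's real detection logic.
from itertools import combinations


def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two sets (0.0 if both are empty)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0


def flag_coordinated(shared_links: dict[str, set[str]], threshold: float = 0.6):
    """Return (account, account, score) triples whose link sets overlap above the threshold."""
    flagged = []
    for (acct1, links1), (acct2, links2) in combinations(shared_links.items(), 2):
        score = jaccard(links1, links2)
        if score >= threshold:
            flagged.append((acct1, acct2, round(score, 2)))
    return flagged


if __name__ == "__main__":
    # Hypothetical accounts and the links each posted during one week.
    sample = {
        "acct_a": {"example.com/1", "example.com/2", "example.com/3"},
        "acct_b": {"example.com/1", "example.com/2", "example.com/3"},
        "acct_c": {"news.example.org/x", "example.com/2"},
    }
    print(flag_coordinated(sample))  # [('acct_a', 'acct_b', 1.0)]
```

Real investigations layer many such signals (posting times, near-duplicate text, account-creation patterns) rather than relying on any single one, but even a toy like this shows why cataloging behavior early matters.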

In the computer security world, researchers recognize that sharing methods of attack and their effectiveness is the only way to build strong defensive systems. The same kind of thinking also applies to these information campaigns: The more that researchers study what techniques are being employed in distant countries, the better they can defend their own countries.

Disinformation campaigns in the AI era are likely to be much more sophisticated than they were in 2016. I believe the US needs to have efforts in place to fingerprint and identify AI-produced propaganda in Taiwan, where a presidential candidate claims a deepfake audio recording has defamed him, and other places. Otherwise, we’re not going to see them when they arrive here. Unfortunately, researchers are instead being targeted and harassed.

Maybe this will all turn out okay. There have been some important democratic elections in the generative AI era with no significant disinformation issues: primaries in Argentina, first-round elections in Ecuador, and national elections in Thailand, Turkey, Spain, and Greece. But the sooner we know what to expect, the better we can deal with what comes.

This essay previously appeared in The Conversation.

Posted on October 5, 2023 at 7:12 AM

Comments

Winter October 5, 2023 8:29 AM

@PaulBart

Still pushing the Russian disinformation (yawn).

I wonder who actually argues that there is no Russian disinformation. [1] Prigozhin invested more than enough money in it. [2]

Don’t look behind the Deep State Curtain (aka Twitter Files, as well as everything shown by Assange and WikiLeaks), might find a mirror into your soul.

That is a lot of whataboutism. Remember that whataboutism is an implicit admission that the original claim is true; it is not an endorsement of the truth of the counter-accusation. So, in the same comment, you have both dismissed and admitted that Russia wages a disinformation campaign.

Even if we set aside the fact that the “Deep State” is as nebulous and elusive as Bigfoot or the Loch Ness monster, there is nothing in either the Twitter Files or WikiLeaks that even suggests there was, or is, no Russian disinformation campaign.

Nor do these files claim that the US is attacking its own elections. That is not even necessary, as the US media are very good at airing all the attempts to interfere with voters (gerrymandering, redistricting, voter ID), vote counting (Trump et al.), and fake news (eg, Fox News, Breitbart).

[1] ‘https://www.rand.org/pubs/perspectives/PE198.html

[2] ‘https://www.themoscowtimes.com/2023/02/14/wagner-founder-prigozhin-admits-he-was-behind-russias-infamous-troll-farms-a80228

“I was never just a financier of the Internet Research Agency. I invented it, I created it, I ran it for a long time,” Prigozhin told journalists from Germany’s Der Spiegel magazine, adding that he had set the organization up “to protect the Russian information space from boorish, aggressive anti-Russian propaganda by the West.”

Gabriele October 5, 2023 8:49 AM

They expected the usual players—Russia, China, and Iran—and a significant new one: “domestic actors.”

I am a bit confused about election interference by “domestic actors”; isn’t that called democracy? I think that in every country there has been a debate about dirty tactics and irresponsible behavior by whomever you oppose. However, most people consider this a bad, but legitimate, part of politics.

If you consider it acceptable for local people and companies to be dishonest or misleading, I do not think that using generative AI to do it is worse than any other means. Most people in a democracy accept that we can be dishonest with ourselves, since it is impossible to find a fair and objective way to regulate it. We can exclude foreigners, since they have no standing in our own nation. And of course that is true for every country, but I am sure that no democracy, including the USA, ever did any of that.

My fear is that somebody is using this argument of domestic interference to go back to the time when the bad people, the poor, and minorities were excluded from participating in politics.

Winter October 5, 2023 9:27 AM

@Gabriele

I am a bit confused about election interference by “domestic actors”, isn’t that called democracy?

Have a look at Slovakia’s recent elections

‘https://www.wired.co.uk/article/slovakia-election-deepfakes

Or look at what fake news did in organizing an armed rebellion in the USA where a mob was close to murdering the vice-president and major elected members of Congress when they stormed the Capitol.

Personally, I think those who deny the fake news deluge are the ones who hope the fake news campaigns succeed, either before or after the elections.

Winter October 5, 2023 9:31 AM

@Wannabe Techguy

Who gets to decide what is “disinformation”?

You are one of those people who don’t believe in facts, but only in opinions?

Scott October 5, 2023 10:01 AM

It’s beyond the scope of this blog, but what adds to the risk factor here is the sheer number of citizens who want to believe the propaganda – either about their favored candidate or about the opposition. The number of voters/commenters/citizens who are willing to believe everything and anything positive about their side as well as everything and anything negative about the other side is daunting.

1&1!=2 October 5, 2023 10:16 AM

@PaulBart:

“Still pushing the Russian disinformation (yawn).”

It’s been said that,

‘There are none so blind as those who choose not to see.’

And your choosing is, clearly, in others’ view, a cognitive bias beyond repair.

Clive Robinson October 5, 2023 10:53 AM

@ALL,

Due to auto-mod.

Part 1

A small minority of US politicians, and those looking to make gelt from them, talk of Russia, China, Iran, North Korea, and others as threats, if not “The Four Horsemen”…

Such melodrama and faux angst: for whom is it created, and, importantly, why?

But you likewise have to ask who others see as being threats and why.

In the UK, for instance, there is the post-Brexit proven link back to one of the three families trying to control the GOP at the time, funneling money into various UK political “exit” activists through “Russian cutouts” in an attempt to cover their illegal activities, and, worse, much worse, the easy access given by the US Silicon Valley corporations to individuals’ profiles built from social media. Many have seen the US as perhaps the worst offender by far for this sort of “propaganda”.

Perhaps if the US cleaned its own stables…

“For the fifth labor, Eurystheus ordered Hercules to clean up King Augeas’ stables. Hercules knew this job would mean getting dirty and smelly, but sometimes even a hero has to do these things. Then Eurystheus made Hercules’ task even harder: he had to clean up after the cattle of Augeas in a single day.”

I’m not asking for “a day” but before November this year would be good…

Winter October 5, 2023 12:01 PM

@summer

Winter’s hyperbole

What kind of person dismisses as “hyperbole” an armed uprising that left two people dead and several in hospital? The revolutionaries were out to kill the vice-president and several Democratic representatives.

Attempts to overturn election outcomes by violence are serious stuff, especially when they happen inside the legislature.

Clive Robinson October 5, 2023 12:09 PM

@ ALL,

Due to the auto-mod.

Part 1,

Those mostly considered “the idiots on the Hill”, and those looking to make capital from them.

Nina Jankowicz October 5, 2023 12:16 PM

@Winter

You are one of those people who don’t believe in facts, but only in opinions?

Don’t you know? It’s MY opinions that ARE facts.
I worked so hard on Zelensky’s campaign that I earned the right to determine what constitutes facts. If you disagree with my opinions, I’ll just label you as a Russian actor and have all my associates in the media blacklist you. That’s how we do it here in the US.

JonKnowsNothing October 5, 2023 12:31 PM

@Winter, @Wannabe Techguy, All

re:
@WT: Who gets to decide what is “disinformation”?

@W: You are one of those people who don’t believe in facts, but only in opinions?

The huge problem with Generative AI is that not only can we no longer discern facts, we can no longer discern opinions. “Attribution is hard,” and it is made significantly harder because the presented facts may or may not be true in the absolute sense, and also because humans make selective choices about particular facts as suits their outlook on life.

The Scrabble game of Generative AI, fitted into a sentence diagram for a defined grammar, throws all sorts of nouns and verbs together in a montage which, given a particular viewpoint, will appear true and correct. Humans decide the “truth” level, but Generative AI doesn’t have any touchstone for “truth”; so it easily passes along the results, not only to itself but also to other AI crawlers.

Example:

  • Q: What are eggs?
  • Q: What is melting?
  • Q: Do eggs melt?
  • A: Eggs are a common food.
  • A: Melting is the transition phase when a solid becomes a liquid.
  • A: “Yes, eggs melt.” (an actual answer given by Generative AI)

Melting eggs is an opinion, aka a hallucination or fantasy, presented as fact, which in this case was picked up and regurgitated by other Generative AI programs. It’s firmly in the datasets.

Only humans with preexisting knowledge find the last item odd. Without preexisting knowledge, that statement will go unchallenged.

A person asking Generative AI to write a story about eggs was perhaps thinking of “Green Eggs and Ham”.

===

ht tps://en.wikipedia.o r g/wiki/Scrabble

  • Scrabble is a game where randomly drawn letters of a particular distribution are set on a grid to form identifiable words.

ht tps://en.wikipedia.o r g/wiki/Sentence_diagram

  • A sentence diagram is a pictorial representation of the grammatical structure of a sentence.

ht tps://en.wikipedia.o r g/wiki/Green_eggs_and_ham

  • Green Eggs and Ham is a children’s book by Dr. Seuss, first published on August 12, 1960

(urls fractured)

Lamont October 5, 2023 12:37 PM

They expected the usual players—Russia, China, and Iran—and a significant new one: “domestic actors.”

You don’t think there were domestic bullshit-spreading social media campaigns in the 2016 election?

JonKnowsNothing October 5, 2023 12:44 PM

@Nina Jankowicz, All

Your HAIL is showing…

  • I worked so hard on Zelensky’s campaign that I earned the right…
  • That’s how we do it here in the US…

So you are in the US meddling in a foreign election???

Better clean up that input query.

Winter October 5, 2023 1:47 PM

@Nina Jankowicz

My fear is that somebody is using this argument of domestic interference to go back to the time when the bad people, the poor, and minorities were excluded from participating in politics.

It is remarkable, but most of the Russian (and others’) disinformation is seen to target exactly the participation of minorities and the poorest in politics. Disinformation campaigns tend to tell one group of people that they are being shortchanged and should fight another.

So the experience is that it is not those fighting disinformation who try to exclude people from politics; it is those disseminating the disinformation who do their best to do so. And we see in the US that they are succeeding.

Clive Robinson October 5, 2023 1:54 PM

@ ALL,

Part 2,

That is, those whom our host used to only half-jokingly refer to as “The Four Horsemen…”.

Clive Robinson October 5, 2023 1:57 PM

@ ALL,

Part X,

The auto-mod is “holding for moderation” single sentences that have no rude words, questionable words, or anything else to “moderate”.

Thus, will this get through or not?

Clive Robinson October 5, 2023 2:03 PM

@ ALL,

Part 2,

They have to create such melodrama and faux honesty, having already done it to others…

Which obviously raises questions.

As I’ve indicated before, the questions are, in essence: for whom are they created, and, importantly, why?

Clive Robinson October 5, 2023 2:07 PM

@ ALL,

Part 3,

But you have to pause and think, and, like others who have posted, ask who other countries see as threats and why.

As noted, in the majority of cases when those on the Hill complain, it’s because they’ve actually already been caught doing it to others on other continents.

Nina Jankowicz October 5, 2023 2:11 PM

Read the caption on the picture in the article below. No HAIL here, just sunshine and facts that exacerbate the cognitive dissonance.

ht tps://nymag.com/intelligencer/2022/05/poorly-conceived-biden-disinformation-board-put-on-pause.html

Clive Robinson October 5, 2023 2:13 PM

@ ALL,

Part 4,

In Britain, for instance, there is the post-Brexit proven link back to one of the three families trying to control the GOP at the time, and their illegal funneling of money to various UK political “exit” activists through “Russian cutouts” in a failed attempt to cover their illegal activities, and, worse, much worse, the easy access given by Silicon Valley corporations to individual citizens’ profiles built from social media.

Thus many have seen the GOP’s backers as perhaps the worst offenders by far in the lead-up to this sort of “propaganda”.

Clive Robinson October 5, 2023 2:18 PM

@ ALL,

Part 5,

Perhaps if the US cleaned out its own stables, and flushed out the detritus and bovine scat, as was once a job suitable for a hero…

“For the fifth labor, Eurystheus ordered Hercules to clean up King Augeas’ stables. Hercules knew this job would mean getting dirty and smelly, but sometimes even a hero has to do these things. Then Eurystheus made Hercules’ task even harder: he had to clean up after the cattle of Augeas in a single day.”

I’m not asking for “a day” but before November this year would be nice. But I’m not holding my breath as that sort of colour does not look good on me…

Arclight October 5, 2023 3:08 PM

We already have a domestic terrorism problem. At least we are taking the far-right side of it seriously. I see periodic updates from investigative journalists on which groups are based where in the US, their structure, goals and so on.

I would like to see this same level of attention placed on the “Black Bloc” or Marxist-style protest groups. They were responsible for multiple shootings, arsons, and attacks on public buildings in 2020-2022. And yet we still have no idea who the real leaders of these organizations are, how funds flow from legitimate non-profits to the violent groups, or what the organizations really want. This is the real conspiracy.

Eldridge "Beaver" Cleaver October 5, 2023 3:48 PM

At least when the radical left engaged in conspiracy, it made up “cool” names like the Symbionese Liberation Army and the Weather Underground [1].

What do we get from the right? Proud Boys? Three Percenters?

Please, try harder.

[1] Ask your grandparents. [2]

[2] Markdown does strange things with superscripts.

Anonymous October 5, 2023 4:48 PM

I would like to see this same level of attention placed on the “Black Bloc” or Marxist-style protest groups. They were responsible for multiple shootings, arsons, and attacks on public buildings in 2020-2022. And yet we still have no idea who the real leaders of these organizations are, how funds flow from legitimate non-profits to the violent groups, or what the organizations really want. This is the real conspiracy.

A real conspiracy theory, that is. It’s really hard to find the people behind non-existent organizations responsible for things that didn’t happen. But if you’re worried about “subversive leftist groups”, worry not, the TLAs have been spying on and infiltrating perfectly peaceful and lawful leftist organizations since the 50s. Them keeping tabs on actually dangerous violent groups is the novelty.

lurker October 5, 2023 6:26 PM

We don’t need AI round here. Campaigns are in full swing: the sitting PM tests positive for C19 and pulls out of all public appearances for 5 days; the opposition wannabe PM calls him chicken.

‘https://www.rnz.co.nz/news/political/499257/labour-leader-chris-hipkins-says-luxon-putting-up-roadblocks-to-leaders-debate

bk5k sw5N October 5, 2023 6:57 PM

The level of political discussion is generally so low that it is in any case indistinguishable from dis- and misinformation.

John doe October 5, 2023 11:01 PM

I previously posted as Nina Jankowicz and want to clarify that those posts were made to demonstrate a concept and were a sort of satire; they were not actually by Nina Jankowicz and were not intended to be defamatory. Nina Jankowicz was appointed as the head of the Disinformation Governance Board, which only lasted a short time before being shut down. She no longer works in government and shouldn’t have been used as the face of the concept I was trying to demonstrate.

The comments were a crude attempt at explaining that all governments of the world have competing interests that extend into the information space, including the US. It’s an unfortunate reality that politics in any flavor will always involve some element of propaganda, and expecting politicians, NGOs, and anyone else with a vested interest to police the information space is a losing bet, as we’ve seen time and time again. Freedom of speech is the single most important thing in a democratic and free society, and I would rather deal with Russian bots on Twitter than have someone policing what I can or cannot say, or what information I have access to.

Erdem Memisyazici October 6, 2023 1:35 AM

While it is difficult to live at a time when any sort of media can be faked very easily and convincingly, I think it doesn’t really change much; faking only got cheaper and better. In fact, as the article mentions, it can simply be generated by more people now. In any serious or official matter, you simply cannot submit information anonymously if you want it to count as coming from you. If your A.I. writes falsehoods and you put your name under them, you’d simply be held accountable for what was generated, as we saw in the news with lawyers Steven Schwartz and Peter LoDuca of the firm Levidow, Levidow & Oberman.

In judicial matters there is no anonymity, so if people flood the Congress with well-written generated texts, then the persons behind them must own up to the words and be accounted for by the numbers. Maybe have official digital business be conducted only from the public library, and secure the hell out of that? It can’t be that hard to go to your local library to write a letter to your representatives and have people confirm sources there for election-related matters.

When it comes to swaying the opinion of the masses with automated disinformation campaigns, people must simply realize that the Internet is largely full of shit. Don’t believe more than half the stuff you come across, check references, and remember that there is no such thing as “surely it wasn’t all made up,” because anyone can get you to a website called “thishereisthewholetruth.com” and put on a live interactive conference with any person, with thousands of pages of text, images, and videos of anyone and anything.

In matters of archiving, a similar issue persists as well. In that case your memory, verifiable official sources, and printed media will always serve you best, of course. I recently found a full soccer match on YouTube, posted by some random account, where the results of the match as well as the footage were edited to produce a different outcome. I remembered what it was from years ago, but if I hadn’t, I might have thought that’s what had happened. Who would take the time to edit 90 minutes of footage from the early 2000s? Apparently someone did, and there it was.

Winter October 6, 2023 1:55 AM

@John doe [really?]

Freedom of speech is the single most important thing in a democratic and free society

Let’s look at the definition of Freedom of Speech:

the legal right to express one’s opinions freely [1]

This definition is also the translation of the corresponding right in many other languages. It is not Freedom of “Speech” but Freedom of “expressing one’s opinion” in, eg, French, German, and Dutch law. [2]

What characterizes disinformation is the fact that the person or institution that disseminates it does not believe it themselves. Therefore, they are not expressing their opinions, but manipulating people.

Freedom to publish any speech whatsoever is untenable. Even the US is unable to allow that and has many laws to limit fraud, defamation, or false information. [3] The only ones who proclaim a fundamental right to deceive, lie to, and cheat people are US constitutional fundamentalists.

The “Freedom to express one’s opinion” is fundamental to any Democracy. The Freedom to Cheat and Deceive is not.

[1] ‘https://www.merriam-webster.com/dictionary/freedom of speech

[2] Hence it is “easy” to limit commercial advertisements in these countries as paid speech is not “expressing an opinion” but simple manipulation.

[3] ‘https://www.law.cornell.edu/uscode/text/18/1038

Winter October 6, 2023 2:05 AM

@John Doe

Freedom of speech is the single most important thing in a democratic and free society

The definition of “Freedom of Speech” is [Merriam Webster]:

the legal right to express one’s opinions freely

Note that this is not about “speaking” but about expressing one’s opinion. This is also the right as written in the laws in other languages (eg, French, German, Dutch law). There is no legal right to defame, cheat, deceive, or defraud.

The characteristic of disinformation is that those who originate it do not believe it themselves. It is not their opinion.

Winter October 6, 2023 2:14 AM

Re: Freedom of Speech absolutism

Another form of disinformation

Hundreds of US schools hit by potentially organized swatting hoaxes, report says
‘https://arstechnica.com/tech-policy/2023/10/report-active-shooter-hoaxes-hitting-hundreds-of-us-schools-may-be-coordinated/

I assume there are not many people who want to protect the Freedom of these callers to Say Whatever They Want.

Winter October 6, 2023 5:53 AM

@Erdem Memisyazici

In judicial matters there is no anonymity so if people flood the Congress with well written generated texts then the persons behind them must own up to the words and be accounted for by the numbers.

Anything digital is just a string of numbers. These numbers can be created and manipulated to show whatever you want. Just as a drawing, painting, or text is no proof of anything, a digital file is no proof of anything.

The courts do not consider any object, movie, image, recording, or text as evidence unless they have a person who can and will vouch for their veracity under penalty of perjury.

Although the public cannot level penalties onto witnesses, the public can demand that someone will stake their reputation on the recording or documents presented. If there is no reputation to stake, the trust in the matter should be scaled down accordingly.

Which means that you should never trust any anonymous “evidence” without independent verification. It is no surprise that this is SOP for any halfway decent journalist or news outlet (and many have much more stringent ones).

The point is that, due to inadequate education, much of the public is still in the medieval mindset that holds printed texts and (painted) images as evidence of the truth. [1]

[1] Why else would there be a global QAnon movement based on a few anonymous emails? [2]

[2] I know, these people believe what they want to believe. The emails are just an excuse to organize around.

Kikeron October 6, 2023 9:04 PM

These persona bots, […] have negligible influence on their own. But replicated by the thousands or millions, they would have a lot more.

Apparently we don’t need AI to spread “disinformation.” This paragraph describes millions of actual flesh-and-blood Americans on social media.

vas pup October 8, 2023 6:21 PM

@Wannabe Techguy and ALL:

See, that is an old argument: our spies are intelligence officers, their spies (from unfriendly countries) are spies. That is just name-labeling in an attempt to change the emotional impact, but they are the same by the nature of the craft, just as plumbers are the same in any country around the globe, once you put the ideological bullsh1t aside.
Close to your question, this is the answer: “There are no facts, only interpretations.” ~ Friedrich Nietzsche

I can only trust the overlapping part of information from an unfriendly source, e.g. what is the same in both CNN’s and FOX’s output. That is, when you want to get facts, aka ‘truth’, versus emotionally charged statements designed to reduce your logical ability and switch you into an emotional state in which you are easy to manipulate. That tactic is always used by interrogators around the globe, regardless of ideological bs.

“Disinformation” is a spectrum. A 100% lie, with no existing facts behind it, is highly likely to be spotted as bs. But 95% truth mixed with 5% lies, where the former is supported by facts and the latter is just a poison pill you have to swallow, usually works through motivational bias: we would rather believe what matches our beliefs than a truth that contradicts them.

The most terrible thing is when the folks making decisions start believing the myths they themselves created for others’ consumption. That is a total abandonment of reality.

vas pup October 12, 2023 4:14 PM

X takes down hundreds of accounts with Hamas ties, flags content
https://www.timesofisrael.com/x-takes-down-hundreds-of-accounts-with-hamas-ties-flags-content/

Elon Musk’s social media platform X has removed hundreds of Hamas-linked accounts and taken down or labeled thousands of pieces of content since the terror group’s murderous assault on Israel, according to the CEO of the company formerly known as Twitter.

Linda Yaccarino on Thursday outlined efforts by X to get a handle on illegal content flourishing on the platform. She was responding to a warning from a top European Union official, who requested information on how X is complying during the Israel-Hamas war with tough new EU digital rules aimed at cleaning up social media platforms.

“So far since the start of the conflict X has identified and removed hundreds of Hamas-affiliated accounts from the platform,” Yaccarino said in a letter posted on X.

Under the EU’s Digital Services Act, which took effect in August, social media companies have to step up policing of their platforms for illegal content, under threat of hefty fines.

“There is no place on X for terrorist organizations or violent extremist groups and we continue to remove such accounts in real-time, including proactive efforts,” Yaccarino said.

Rivals such as TikTok, YouTube, and Facebook also are coping with a flood of unsubstantiated rumors and falsehoods about the Middle Eastern conflict, playing the typical whack-a-mole that erupts each time a news event captures world attention.

My nickel: Bravo Linda, bravo Elon. Good news, bad news: they did it AFTER the EU request, not on their own initiative. That reminds me of the old complaint of one wife to her female friend: my husband does everything and helps me with everything when I ask, but he never does it on his own initiative.

ResearcherZero October 14, 2023 10:58 PM

“deep stories” and “deep frames”
https://misinforeview.hks.harvard.edu/article/critical-disinformation-studies-history-power-and-politics/

Velocity, Virality, Anonymity, Homophily, Monopoly and Sovereignty: What conditions lead to disinformation fomenting in these online spaces?

‘https://pacscenter.stanford.edu/wp-content/uploads/2019/03/a6112278-190206_kaf_democracy_internet_persily_single_pages_v3.pdf

critical ignoring — choosing what to ignore and where to invest one’s limited attentional capacities.

‘https://journals.sagepub.com/doi/10.1177/09637214221121570
