Essays in the Category "Internet and Society"

Page 1 of 4

Don’t Talk to People Like They’re Chatbots

AI could make our human interactions blander, more biased, or ruder.

  • Albert Fox Cahn and Bruce Schneier
  • The Atlantic
  • January 17, 2024

For most of history, communicating with a computer has not been like communicating with a person. In their earliest years, computers required carefully constructed instructions, delivered through punch cards; then came a command-line interface, followed by menus and options and text boxes. If you wanted results, you needed to learn the computer’s language.

This is beginning to change. Large language models—the technology undergirding modern chatbots—allow users to interact with computers through natural conversation, an innovation that introduces some baggage from human-to-human exchanges. Early on in our respective explorations of ChatGPT, the two of us found ourselves typing a word that we’d never said to a computer before: “Please.” The syntax of civility has crept into nearly every aspect of our encounters; we speak to this algebraic assemblage as if it were a person—even when we know that …

How ChatGPT Hijacks Democracy

  • Nathan E. Sanders and Bruce Schneier
  • The New York Times
  • January 15, 2023

Launched just weeks ago, ChatGPT is already threatening to upend how we draft everyday communications like emails, college essays and myriad other forms of writing.

Created by the company OpenAI, ChatGPT is a chatbot that can automatically respond to written prompts in a manner that is sometimes eerily close to human.

But for all the consternation over the potential for humans to be replaced by machines in formats like poetry and sitcom scripts, a far greater threat looms: artificial intelligence replacing humans in democratic processes—not through voting, but through lobbying…

‘Grassroots’ Bot Campaigns Are Coming. Governments Don’t Have a Plan to Stop Them.

Artificial intelligence software can easily pass for real public comments

  • Henry Farrell and Bruce Schneier
  • The Washington Post
  • May 20, 2021

This month, the New York state attorney general issued a report on a scheme by “U.S. Companies and Partisans [to] Hack Democracy.” This wasn’t another attempt by Republicans to make it harder for Black people and urban residents to vote. It was a concerted attack on another core element of U.S. democracy—the ability of citizens to express their voice to their political representatives. And it was carried out by generating millions of fake comments and fake emails purporting to come from real citizens.

This attack was detected because it was relatively crude. But artificial intelligence technologies are making it possible to generate genuine-seeming comments at scale, drowning out the voices of real citizens in a tidal wave of fake ones…

The Peril of Persuasion in the Big Tech Age

Persuasion is essential to society and democracy, but we need new rules governing how companies can harness it.

  • Bruce Schneier and Alicia Wanless
  • Foreign Policy
  • December 11, 2020

Ukrainian translation

Persuasion is as old as our species. Both democracy and the market economy depend on it. Politicians persuade citizens to vote for them, or to support different policy positions. Businesses persuade consumers to buy their products or services. We all persuade our friends to accept our choice of restaurant, movie, and so on. It’s essential to society; we couldn’t get large groups of people to work together without it. But as with many things, technology is fundamentally changing the nature of persuasion. And society needs to adapt its rules of persuasion or suffer the consequences…

Bots Are Destroying Political Discourse As We Know It

They’re mouthpieces for foreign actors, domestic political groups, even the candidates themselves. And soon you won’t be able to tell they’re bots.

  • Bruce Schneier
  • The Atlantic
  • January 7, 2020

Spanish translation

Presidential-campaign season is officially, officially, upon us now, which means it’s time to confront the weird and insidious ways in which technology is warping politics. One of the biggest threats on the horizon: Artificial personas are coming, and they’re poised to take over political debate. The risk arises from two separate threads coming together: artificial-intelligence-driven text generation and social-media chatbots. These computer-generated “people” will drown out actual human discussions on the internet.

Text-generation software is already good enough to fool most people most of the time. It’s writing news stories, particularly in …

8 Ways to Stay Ahead of Influence Operations

With election meddling inevitable in 2020, the United States needs a powerful kill chain.

  • Bruce Schneier
  • Foreign Policy
  • August 12, 2019

Influence operations are elusive to define. The Rand Corp.’s definition is as good as any: “the collection of tactical information about an adversary as well as the dissemination of propaganda in pursuit of a competitive advantage over an opponent.” Basically, we know it when we see it, from bots controlled by the Russian Internet Research Agency to Saudi attempts to plant fake stories and manipulate political debate. These operations have been run by Iran against the United States, Russia against Ukraine, China against Taiwan, and probably lots more besides…

We Must Prepare for the Next Pandemic

We’ll have to battle both the disease and the fake news.

  • Bruce Schneier
  • The New York Times
  • June 17, 2019

When the next pandemic strikes, we’ll be fighting it on two fronts. The first is the one you immediately think about: understanding the disease, researching a cure and inoculating the population. The second is new, and one you might not have thought much about: fighting the deluge of rumors, misinformation and flat-out lies that will appear on the internet.

The second battle will be like the Russian disinformation campaigns during the 2016 presidential election, only with the addition of a deadly health crisis and possibly without a malicious government actor. But while the two problems—misinformation affecting democracy and misinformation affecting public health—will have similar solutions, the latter is much less political. If we work to solve the pandemic disinformation problem, any solutions are likely to also be applicable to the democracy one…

Democracy’s Dilemma

This is Democracy’s Dilemma: the open forms of input and exchange that it relies on can be weaponized to inject falsehood and misinformation that erode democratic debate.

  • Henry Farrell and Bruce Schneier
  • Boston Review
  • May 15, 2019

The Internet was going to set us all free. At least, that is what U.S. policy makers, pundits, and scholars believed in the 2000s. The Internet would undermine authoritarian rulers by reducing the government’s stranglehold on debate, helping oppressed people realize how much they all hated their government, and simply making it easier and cheaper to organize protests.

Today, we live in darker times. Authoritarians are using these same technologies to bolster their rule. Even worse, the Internet seems to be undermining democracy by allowing targeted disinformation, turning public debate into a petri dish for bots and propagandists, and spreading general despair. A new consensus is emerging that democracy is less a resilient political system than a free-fire zone in a broader information war…

Toward an Information Operations Kill Chain

  • Bruce Schneier
  • Lawfare
  • April 24, 2019

Cyberattacks don’t magically happen; they involve a series of steps. And far from being helpless, defenders can disrupt the attack at any of those steps. This framing has led to something called the “cybersecurity kill chain”: a way of thinking about cyber defense in terms of disrupting the attacker’s process.

On a similar note, it’s time to conceptualize the “information operations kill chain.” Information attacks against democracies, whether they’re attempts to polarize political processes or to increase mistrust in social institutions, also involve a series of steps. And enumerating those steps will clarify possibilities for defense…

Defending Democratic Mechanisms and Institutions against Information Attacks

  • Henry Farrell and Bruce Schneier
  • Defusing Disinfo
  • January 28, 2019

To better understand influence attacks, we proposed an approach that models democracy itself as an information system and explains how democracies are vulnerable to certain forms of information attacks that autocracies naturally resist. Our model combines ideas from both international security and computer security, avoiding the limitations of both in explaining how influence attacks may damage democracy as a whole.

Our initial account is necessarily limited. Building a truly comprehensive understanding of democracy as an information system will be a Herculean labor, involving the collective endeavors of political scientists and theorists, computer scientists, scholars of complexity, and others…

