Essays Tagged "Communications of the ACM"


AI and Trust

  • Communications of the ACM
  • June 12, 2025


Note: The text in this column is taken, for the most part verbatim, from a talk given by Mr. Schneier at the 2025 RSA Conference in San Francisco, CA, on April 29, 2025.

This is a discussion about artificial intelligence (AI), trust, power, and integrity. I am going to make four basic arguments:

  1. There are two kinds of trust—interpersonal and social—and we regularly confuse them. What matters here is social trust, which is about reliability and predictability in society.
  2. Our confusion will increase with AI, and the corporations controlling AI will use that confusion to take advantage of us…

Web 3.0 Requires Data Integrity

New integrity-focused standards are necessary to enable the trusted AI services of tomorrow.

  • Bruce Schneier and Davi Ottenheimer
  • Communications of the ACM
  • March 24, 2025

If you’ve ever taken a computer security class, you’ve probably learned about the three legs of computer security—confidentiality, integrity, and availability—known as the CIA triad. When we talk about a system being secure, that’s what we’re referring to. All are important, but to different degrees in different contexts. In a world populated by artificial intelligence (AI) systems and artificial intelligent agents, integrity will be paramount.

What is data integrity? It’s ensuring that no one can modify data—that’s the security angle—but it’s much more than that. It encompasses accuracy, completeness, and quality of data—all over both time and space. It’s preventing accidental data loss; the “undo” button is a primitive integrity measure. It’s also making sure that data is accurate when it’s collected—that it comes from a trustworthy source, that nothing important is missing, and that it doesn’t change as it moves from format to format. The ability to restart your computer is another integrity measure…
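One of the integrity measures mentioned above, detecting that data has been modified, can be sketched with a cryptographic hash. This is a minimal illustration, not anything from the column itself; a real system would also need to protect the stored fingerprint, for example with a signature or MAC.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 fingerprint of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Record a fingerprint when the data is collected...
original = b"sensor reading: 42.0"
stored_hash = fingerprint(original)

# ...and compare later to detect modification.
tampered = b"sensor reading: 99.0"
print(fingerprint(original) == stored_hash)   # unchanged data matches
print(fingerprint(tampered) == stored_hash)   # modified data does not
```

Note that this only covers the "no one can modify data" angle; the broader integrity properties the column describes, such as accuracy at collection time and completeness, cannot be checked by a hash alone.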

Lattice-Based Cryptosystems and Quantum Cryptanalysis

Quantum computers are probably coming—and when they arrive, they will, most likely, be able to break our standard public-key cryptography algorithms.

  • Communications of the ACM
  • May 25, 2024

Quantum computers are probably coming, though we don’t know when—and when they arrive, they will, most likely, be able to break our standard public-key cryptography algorithms. In anticipation of this possibility, cryptographers have been working on quantum-resistant public-key algorithms. The National Institute of Standards and Technology (NIST) has been hosting a competition since 2017, and there already are several proposed standards. Most of these are based on lattice problems.

The mathematics of lattice cryptography revolve around combining sets of vectors—that’s the lattice—in a multi-dimensional space. These lattices are filled with multi-dimensional periodicities. The …
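The "combining sets of vectors" idea can be made concrete: a lattice is the set of all integer linear combinations of its basis vectors, and the periodic grid those combinations form is what lattice cryptography builds on. The following is an illustrative toy in two dimensions with made-up basis vectors, not a cryptographic construction.

```python
import itertools

# A lattice: every point a*b1 + b*b2 for integer coefficients a, b.
# (Toy 2-D basis for illustration only.)
basis = [(3, 1), (1, 2)]

def lattice_points(basis, bound):
    """Enumerate lattice points with coefficients in [-bound, bound]."""
    points = set()
    for a, b in itertools.product(range(-bound, bound + 1), repeat=2):
        x = a * basis[0][0] + b * basis[1][0]
        y = a * basis[0][1] + b * basis[1][1]
        points.add((x, y))
    return points

pts = lattice_points(basis, 2)
print((0, 0) in pts)   # the origin is always a lattice point
print((4, 3) in pts)   # b1 + b2 = (4, 3) is also in the lattice
```

The hard problems underlying the proposed standards, such as finding the shortest nonzero vector, become intractable only in hundreds of dimensions; this sketch just shows what the object itself is.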

LLMs’ Data-Control Path Insecurity

Someday, some AI researcher will figure out how to separate the data and control paths. Until then, we’re going to have to think carefully about using LLMs in potentially adversarial situations—like on the Internet.

  • Communications of the ACM
  • May 12, 2024

Back in the 1960s, if you played a 2,600Hz tone into an AT&T pay phone, you could make calls without paying. A phone hacker named John Draper noticed that the plastic whistle that came free in a box of Cap’n Crunch cereal worked to make the right sound. That became his hacker name, and everyone who knew the trick made free pay-phone calls.

There were all sorts of related hacks, such as faking the tones that signaled coins dropping into a pay phone and faking tones used by repair equipment. AT&T could sometimes change the signaling tones, make them more complicated, or try to keep them secret. But the general class of exploit was impossible to fix because the problem was general: Data and control used the same channel. That is, the commands that told the phone switch what to do were sent along the same path as voices…
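The general class of exploit described above can be sketched in a few lines: when control tokens travel over the same channel as data, a receiver cannot distinguish data that merely contains a control token from a genuine command. This is a hypothetical mini-protocol for illustration, not any real phone-switch signaling format.

```python
# In-band signaling: one channel carries both voice (data) and commands.
CONTROL_TOKEN = "#DISCONNECT#"

def switch_process(channel: str) -> str:
    """A naive 'switch' that scans the shared channel for commands."""
    if CONTROL_TOKEN in channel:
        return "line disconnected"
    return "voice passed through"

print(switch_process("hello operator"))
print(switch_process("hello #DISCONNECT# operator"))  # data acts as control
```

The phone network eventually fixed this by moving signaling out of band, onto a separate channel the caller couldn’t reach; the column’s point is that LLMs currently have no such separation.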

In Memoriam: Ross Anderson, 1956-2024

  • Communications of the ACM
  • April 9, 2024

Ross Anderson unexpectedly passed away in his sleep on March 28th in his home in Cambridge. He was 67.

I can’t remember when I first met Ross. It was well before 2008, when we created the Security and Human Behavior workshop. It was before 2001, when we created the Workshop on Economics and Information Security (okay, he created that one, I just helped). It was before 1998, when we first wrote about the insecurity of key escrow systems. In 1996, I was one of the people he brought to the Newton Institute at Cambridge University, for the six-month cryptography residency program he ran (I made a mistake not staying the whole time)—so it was before then as well…

Psychology of Security

  • Bruce Schneier
  • Communications of the ACM
  • May 2007

The security literature is filled with risk pathologies, heuristics that we use to help us evaluate risks. I’ve collected them from many different sources.

Risks of Risks

Exaggerated Risks            Downplayed Risks
Spectacular                  Pedestrian
Rare                         Common
Personified                  Anonymous
Beyond one’s control         More under control
Externally imposed           Taken willingly
Talked about                 Not discussed
Intentional or man-made      Natural
Immediate                    Long-term or diffuse
Sudden                       Evolving slowly over time
Affecting them personally    Affecting others
New and unfamiliar           …

Risks of Third-Party Data

  • Bruce Schneier
  • Communications of the ACM
  • May 2005

Reports are coming in torrents. Criminals are known to have downloaded personal credit information of over 145,000 Americans from ChoicePoint’s network. Hackers took over one of Lexis Nexis’ databases, gaining access to personal files of 32,000 people. Bank of America Corp. lost computer data tapes that contained personal information on 1.2 million federal employees, including members of the U.S. Senate. A hacker downloaded the names, Social Security numbers, voicemail and SMS messages, and photos of 400 T-Mobile customers, and probably had access to all of their 16.3 million U.S. customers. In a separate incident, Paris Hilton’s phone book and SMS messages were hacked and distributed on the Internet…

Two-Factor Authentication: Too Little, Too Late

  • Bruce Schneier
  • Communications of the ACM
  • April 2005

Two-factor authentication isn’t our savior. It won’t defend against phishing. It’s not going to prevent identity theft. It’s not going to secure online accounts from fraudulent transactions. It solves the security problems we had 10 years ago, not the security problems we have today.

The problem with passwords is that it is too easy to lose control of them. People give their passwords to other people. People write them down, and other people read them. People send them in email, and that email is intercepted. People use them to log into remote servers, and their communications are eavesdropped on. Passwords are also easy to guess. And once any of that happens, the password no longer works as an authentication token because you can never be sure who is typing in that password…

The Non-Security of Secrecy

  • Bruce Schneier
  • Communications of the ACM
  • October 2004

Considerable confusion exists between the different concepts of secrecy and security, which often causes bad security and surprising political arguments. Secrecy usually contributes only to a false sense of security.

In June 2004, the U.S. Department of Homeland Security urged regulators to keep network outage information secret. The Federal Communications Commission requires telephone companies to report large disruptions of telephone service, and wants to extend that to high-speed data lines and wireless networks. DHS fears that such information would give cyberterrorists a “virtual road map” to target critical infrastructures…

Insider Risks in Elections

  • Paul Kocher and Bruce Schneier
  • Communications of the ACM
  • July 2004

Many discussions of voting systems and their relative integrity have been primarily technical, focusing on the difficulty of attacks and defenses. This is only half of the equation: it’s not enough to know how much it might cost to rig an election by attacking voting systems; we also need to know how much it would be worth to do so. Our illustrative example uses the most recent available U.S. data, but otherwise is not intended to be specific to any particular political party.

In order to gain a clear majority of the House in 2002, Democrats would have needed to win 13 seats that went to Republicans. According to Associated Press voting data, Democrats could have added 13 seats by swinging 49,469 votes. This corresponds to changing just over 1% of the 4,310,198 votes in these races and under 1/1000 of the 70 million votes cast in contested House races. The Senate was even closer: switching 20,703 votes in Missouri and New Hampshire would have provided Democrats with the necessary two seats…
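The arithmetic behind those fractions checks out and is easy to verify; the figures below are taken directly from the excerpt.

```python
# Vote-swing figures from the column's 2002 House example.
swing = 49_469            # votes needed to flip 13 seats
contested = 4_310_198     # total votes in those 13 races
total_house = 70_000_000  # votes cast in contested House races

print(round(100 * swing / contested, 2))  # just over 1% of the 13 races
print(swing / total_house < 1 / 1000)     # under 1/1000 of all votes
```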
