May 15, 2013
by Bruce Schneier
Chief Security Technology Officer, BT
A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.
You can read this issue on the web at <http://www.schneier.com/crypto-gram-1305.html>. These same essays and news items appear in the "Schneier on Security" blog at <http://www.schneier.com/blog>, along with a lively and intelligent comment section. An RSS feed is available.
In this issue:
- Refuse to be Terrorized
- Intelligence Analysis and the Connect-the-Dots Metaphor
- Transparency and Accountability
- Me on the Boston Terrorist Attacks
- The Boston Marathon Bomber Manhunt
- More Links on the Boston Terrorist Attack
- The Public/Private Surveillance Partnership
- Schneier News
- Michael Chertoff on Google Glass
As the details about the bombings in Boston unfold, it'd be easy to be scared. It'd be easy to feel powerless and demand that our elected leaders do something -- anything -- to keep us safe.
It'd be easy, but it'd be wrong. We need to be angry and empathize with the victims without being scared. Our fears would play right into the perpetrators' hands, magnifying the power of their victory in service of whatever goals the still-unidentified group behind this attack has. We don't have to be scared, and we're not powerless. We actually have all the power here, and there's one thing we can do to render terrorism ineffective: Refuse to be terrorized.
It's hard to do, because terrorism is designed precisely to scare people -- far out of proportion to its actual danger. A huge amount of research on fear and the brain teaches us that we exaggerate threats that are rare, spectacular, immediate, random -- in this case involving an innocent child -- senseless, horrific and graphic. Terrorism pushes all of our fear buttons, really hard, and we overreact.
But our brains are fooling us. Even though this will be in the news for weeks, we should recognize this for what it is: a rare event. That's the very definition of news: something that is unusual -- in this case, something that almost never happens.
Remember after 9/11 when people predicted we'd see these sorts of attacks every few months? That never happened, and it wasn't because the TSA confiscated knives and snow globes at airports. Give the FBI credit for rolling up terrorist networks and interdicting terrorist funding, but we also exaggerated the threat. We get our ideas about how easy it is to blow things up from television and the movies. It turns out that terrorism is much harder than most people think. It's hard to find willing terrorists, it's hard to put a plot together, it's hard to get materials, and it's hard to execute a workable plan. As a collective group, terrorists are dumb, and they make dumb mistakes; criminal masterminds are another myth from movies and comic books.
Even the 9/11 terrorists got lucky.
If it's hard for us to keep this in perspective, it will be even harder for our leaders. They'll be afraid that by speaking honestly about the impossibility of attaining absolute security or the inevitability of terrorism -- or that some American ideals are worth maintaining even in the face of adversity -- they will be branded as "soft on terror." And they'll be afraid that Americans might vote them out of office. Perhaps they're right, but where are the leaders who aren't afraid? What has happened to "the only thing we have to fear is fear itself"?
Terrorism, even the terrorism of radical Islamists and right-wing extremists and lone actors all put together, is not an "existential threat" against our nation. Even the events of 9/11, as horrific as they were, didn't do existential damage to our nation. Our society is more robust than it might seem from watching the news. We need to start acting that way.
There are things we can do to make us safer, mostly around investigation, intelligence, and emergency response, but we will never be 100-percent safe from terrorism; we need to accept that.
How well this attack succeeds depends much less on what happened in Boston than on our reactions in the coming weeks and months. Terrorism isn't primarily a crime against people or property. It's a crime against our minds, using the deaths of innocents and destruction of property as accomplices. When we react from fear, when we change our laws and policies to make our country less open, the terrorists succeed, even if their attacks fail. But when we refuse to be terrorized, when we're indomitable in the face of terror, the terrorists fail, even if their attacks succeed.
Don't glorify the terrorists and their actions by calling this part of a "war on terror." Wars involve two legitimate sides. There's only one legitimate side here; those on the other are criminals. They should be found, arrested, and punished. Meanwhile, we need to be vigilant not to weaken the very freedoms and liberties that make this country great just because we're scared.
Empathize, but refuse to be terrorized. Instead, be indomitable -- and support leaders who are as well. That's how to defeat terrorists.
This essay originally appeared on TheAtlantic.com.
It's a rewrite of something I wrote in 2006.
The essay received 42,000 Facebook likes and 6,800 tweets. The editor told me it had about 360,000 hits. That makes it the most popular piece I've ever written. It's interesting to see how much more resonance this idea has today than it did a dozen years ago.
And -- I can hardly believe it -- President Obama said "the American people refuse to be terrorized" in a press briefing right after the bombings.
The FBI and the CIA are being criticized for not keeping better track of Tamerlan Tsarnaev in the months before the Boston Marathon bombings. How could they have ignored such a dangerous person? How do we reform the intelligence community to ensure this kind of failure doesn't happen again?
It's an old song by now, one we heard after the 9/11 attacks in 2001 and after the Underwear Bomber's failed attack in 2009. The problem is that connecting the dots is a bad metaphor, and focusing on it makes us more likely to implement useless reforms.
Connecting the dots in a coloring book is easy and fun. They're right there on the page, and they're all numbered. All you have to do is move your pencil from one dot to the next, and when you're done, you've drawn a sailboat. Or a tiger. It's so simple that 5-year-olds can do it.
But in real life, the dots can only be numbered after the fact. With the benefit of hindsight, it's easy to draw lines from a Russian request for information to a foreign visit to some other piece of information that might have been collected.
In hindsight, we know who the bad guys are. Before the fact, there are an enormous number of potential bad guys.
How many? We don't know. But we know that the no-fly list had 21,000 people on it last year. The Terrorist Identities Datamart Environment, also known as the watch list, has 700,000 names on it.
We have no idea how many potential "dots" the FBI, CIA, NSA and other agencies collect, but it's easily in the millions. It's easy to work backwards through the data and see all the obvious warning signs. But before a terrorist attack, when there are millions of dots -- some important but the vast majority unimportant -- uncovering plots is a lot harder.
Rather than thinking of intelligence as a simple connect-the-dots picture, think of it as a million unnumbered pictures superimposed on top of each other. Or a random-dot stereogram. Is it a sailboat, a puppy, two guys with pressure-cooker bombs, or just an unintelligible mess of dots? You try to figure it out.
It's not a matter of not enough data, either.
Piling more data onto the mix makes it harder, not easier. The best way to think of it is as a needle-in-a-haystack problem; the last thing you want to do is increase the amount of hay you have to search through. The television show "Person of Interest" is fiction, not fact.
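The base-rate arithmetic behind that point is worth making concrete. Here is a minimal sketch with purely illustrative numbers -- the population, accuracy, and plotter counts below are assumptions chosen for the exercise, not figures from any real program:

```python
# False-positive arithmetic behind the needle-in-a-haystack problem.
# All numbers here are illustrative assumptions, not real statistics.
population = 300_000_000      # roughly the U.S. population
plotters = 300                # suppose 1 in a million is an actual plotter
true_positive_rate = 0.99     # a detector far better than anything real
false_positive_rate = 0.01    # flags only 1% of innocent people

flagged_guilty = plotters * true_positive_rate
flagged_innocent = (population - plotters) * false_positive_rate

# Of everyone flagged, what fraction are actual plotters?
precision = flagged_guilty / (flagged_guilty + flagged_innocent)
print(f"Innocent people flagged: {flagged_innocent:,.0f}")
print(f"Odds a flagged person is a real plotter: {precision:.4%}")
```

Even with a detector far more accurate than anything that exists, the innocents flagged outnumber the real plotters by roughly ten thousand to one. Collecting more dots only adds hay.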
There's a name for this sort of logical fallacy: hindsight bias. First explained by psychologist Baruch Fischhoff, and extensively researched by Daniel Kahneman and Amos Tversky, it's surprisingly common. Since what actually happened is so obvious once it happens, we overestimate how obvious it was before it happened.
We actually misremember what we once thought, believing that we knew all along that what happened would happen. It's a surprisingly strong tendency, one that has been observed in countless laboratory experiments and real-world examples of behavior. And it's what all the post-Boston-Marathon bombing dot-connectors are doing.
Before we start blaming agencies for failing to stop the Boston bombers, and before we push "intelligence reforms" that will shred civil liberties without making us any safer, we need to stop seeing the past as a bunch of obvious dots that need connecting.
Kahneman, a Nobel prize winner, wisely noted: "Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight." Kahneman calls it "the illusion of understanding," explaining that the past is only so understandable because we cast it as simple, inevitable stories and leave out the rest.
Nassim Taleb, an expert on risk engineering, calls this tendency the "narrative fallacy." We humans are natural storytellers, and the world of stories is much more tidy, predictable and coherent than the real world.
Millions of people behave strangely enough to warrant the FBI's notice, and almost all of them are harmless. It is simply not possible to find every plot beforehand, especially when the perpetrators act alone and on impulse.
We have to accept that there always will be a risk of terrorism, and that when the occasional plot succeeds, it's not necessarily because our law enforcement systems have failed.
This essay previously appeared on CNN.com.
The no-fly and watch lists.
As part of the fallout of the Boston bombings, we're probably going to get some new laws that give the FBI additional investigative powers. As with the Patriot Act after 9/11, the debate over whether these new laws are helpful will be minimal, but the effects on civil liberties could be large. Even though most people are skeptical about sacrificing personal freedoms for security, it's hard for politicians to say no to the FBI right now, and it's politically expedient to demand that *something* be done.
If our leaders can't say no -- and there's no reason to believe they can -- there are two concepts that need to be part of any new counterterrorism laws, and investigative laws in general: transparency and accountability.
Long ago, we realized that simply trusting people and government agencies to always do the right thing doesn't work, so we need to check up on them. In a democracy, transparency and accountability are how we do that. It's how we ensure that we get both effective and cost-effective government. It's how we prevent those we trust from abusing that trust, and protect ourselves when they do. And it's especially important when security is concerned.
First, we need to ensure that the stuff we're paying money for actually works and has a measurable impact. Law-enforcement organizations regularly invest in technologies that don't make us any safer. The TSA, for example, could devote an entire museum to expensive but ineffective systems: puffer machines, body scanners, FAST behavioral screening, and so on. Local police departments have been wasting lots of post-9/11 money on unnecessary high-tech weaponry and equipment. The occasional high-profile success aside, police surveillance cameras have been shown to be a largely ineffective police tool.
Sometimes honest mistakes lead organizations to invest in these technologies. Sometimes there's self-deception and mismanagement -- and far too often lobbyists are involved. Given the enormous amount of security money post-9/11, you inevitably end up with an enormous amount of waste. Transparency and accountability are how we keep all of this in check.
Second, we need to ensure that law enforcement does what we expect it to do and nothing more. Police powers are invariably abused. Mission creep is inevitable, and it results in laws designed to combat one particular type of crime being used for an ever-widening array of crimes. Transparency is the only way we have of knowing when this is going on.
For example, that's how we learned that the FBI is abusing National Security Letters. Traditionally, we use the warrant process to protect ourselves from police overreach. It's not enough for the police to want to conduct a search; they also need to convince a neutral third party -- a judge -- that the search is in the public interest and will respect the rights of those searched. That's accountability, and it's the very mechanism that NSLs were exempted from.
When laws are broken, accountability is how we punish those who abused their power. It's how, for example, we correct racial profiling by police departments. And it's a lack of accountability that permits the FBI to get away with massive data collection until exposed by a whistleblower or noticed by a judge.
Third, transparency and accountability keep both law enforcement and politicians from lying to us. The Bush Administration lied about the extent of the NSA's warrantless wiretapping program. The TSA lied about the ability of full-body scanners to save naked images of people. We've been lied to about the lethality of Tasers, when and how the FBI eavesdrops on cell-phone calls, and about the existence of surveillance records. Without transparency, we would never know.
Nearly two decades ago, the FBI was heavily lobbying Congress for a law to give it new wiretapping powers: a law known as CALEA. One of its key justifications was that existing law didn't allow it to perform speedy wiretaps during kidnapping investigations. It sounded plausible -- and who wouldn't feel sympathy for kidnapping victims? -- but when civil-liberties organizations analyzed the actual data, they found that it was just a story; there were no instances of wiretapping in kidnapping investigations. Without transparency, we would never have known that the FBI was making up stories to scare Congress.
If we're going to give the government any new powers, we need to ensure that there's oversight. Sometimes this oversight is before action occurs. Warrants are a great example. Sometimes it's after action occurs: public reporting, audits by inspectors general, open hearings, notice to those affected, or some other mechanism. Too often, law enforcement tries to exempt itself from this principle by supporting laws that are specifically excused from oversight...or by establishing secret courts that just rubber-stamp government wiretapping requests.
Furthermore, we need to ensure that mechanisms for accountability have teeth and are used.
As we respond to the threat of terrorism, we must remember that there are other threats as well. A society without transparency and accountability is the very definition of a police state. And while a police state might have a low crime rate -- especially if you don't define police corruption and other abuses of power as crime -- and an even lower terrorism rate, it's not a society that most of us would willingly choose to live in.
We already give law enforcement enormous power to intrude into our lives. We do this because we know they need this power to catch criminals, and we're all safer thereby. But because we recognize that a powerful police force is itself a danger to society, we must temper this power with transparency and accountability.
This essay previously appeared on TheAtlantic.com.
Poll on sacrificing personal freedom for safety.
The ineffectiveness of security cameras:
Airport full-body scanners save naked images:
The lethality of Tasers:
The FBI eavesdrops on cell phone calls:
The existence of FBI surveillance records:
I did a Q&A on the "Washington Post" blog.
I was on The Steve Malzberg Show, which I didn't realize was shouting conservative talk radio until it was too late.
I generally give the police a lot of tactical leeway in times like this. The very armed and very dangerous suspects warranted extraordinary treatment. They were perfectly capable of killing again, taking hostages, planting more bombs -- and we didn't know the extent of the plot or the group. That's why I didn't object to the massive police dragnet, the citywide lockdown, and so on.
Ross Anderson has a different take:
...a million people were under virtual house arrest; the 19-year-old fugitive from justice happened to be a Muslim. Whatever happened to the doctrine that infringements of one liberty to protect another should be necessary and proportionate?
In the London bombings, four idiots killed themselves in the first incident with a few dozen bystanders, but the second four failed and ran for it when their bombs didn't go off. It didn't occur to anyone to lock down London. They were eventually tracked down and arrested, together with their support team. Digital forensics played a big role; the last bomber to be caught left the country and changed his SIM, but not his IMEI. It's next to impossible for anyone to escape nowadays if the authorities try hard.
He has a point, although I'm not sure I agree with it.
Ross Anderson's argument:
Another excellent argument on the topic:
Slashdot thread on the topic:
Encouraging poll data says that maybe Americans are starting to have realistic fears about terrorism, or at least are refusing to be terrorized.
Good essay by Scott Atran on terrorism and our reaction.
This, on the other hand, is pitiful.
Reddit apologizes. I think this is a big story. The Internet is going to help in everything, including trying to identify terrorists. This will happen whether or not the help is needed, wanted, or even helpful. I think this took the FBI by surprise.
Here's a good commentary on this sort of thing.
"Hapless, Disorganized, and Irrational": John Mueller and Mark Stewart describe the Boston -- and most other -- terrorists.
Max Abrahms has two sensible essays.
Probably the ultimate in security theater: Williams-Sonoma stops selling pressure cookers in the Boston area "out of respect." They say it's temporary. (I bought a Williams-Sonoma pressure cooker last Christmas; I wonder if I'm now on a list.)
Shortly thereafter, Williams-Sonoma released a statement apologizing to anyone who might be offended.
A tragedy: Sunil Tripathi, whom Reddit and other sites wrongly identified as one of the bombers, was found dead in the Providence River. It seems that Sunil Tripathi died well before the Boston bombing. So while his family was certainly affected by the false accusations, he wasn't.
And worst of all, New York Mayor Bloomberg scares me more than the terrorists ever could. "In the wake of the Boston Marathon bombings, Mayor Michael Bloomberg said Monday the country's interpretation of the Constitution will 'have to change' to allow for greater security to stave off future attacks."
Terrorism's effectiveness doesn't come from the terrorist acts; it comes from our reactions to it. We need leaders who aren't terrorized.
Only indirectly related, but the Kentucky Derby banned "removable lens cameras" for security reasons.
And a totally unscientific CNN opinion poll: 57% say no to: "Is it justifiable to violate certain civil liberties in the name of national security?"
On the difference between mass murder and terrorism: "What the United States means by terrorist violence is, in large part, 'public violence some weirdo had the gall to carry out using a weapon other than a gun.'"
On fear fatigue -- and a good modeling of how to be indomitable:
The surprising dearth of terrorists:
Why emergency medical response has improved since 9/11:
What if the Boston bombers had been shooters instead.
"Let's not be terrorized":
The new terrorism -- from 2011, it's in five parts, and this is the first one.
This is kind of wordy, but it's an interesting essay on the nature of fear. And cats.
Glenn Greenwald: The Boston bombing produces familiar and revealing reactions
The Saudi Marathon Man: how a 20-year-old Saudi victim of the bombing was instantly, and baselessly, converted by the US media and government into a "suspect".
Four Highly Effective Responses to Terrorism
People being terrorized: Drastic security changes coming to large-scale public events, experts say
Thoughts on the Boston bombing: Don't let the bad guys win:
A time for resilience:
The best response is resiliency:
Why terrorism works:
Terrorist attacks have declined since the 1970s:
After the bomb, mass hysteria is the Boston terrorist's greatest weapon
We're learning a lot about how the FBI eavesdrops on cell phones from a recent court battle.
The Nemim.gen Trojan evades forensic examination by deleting its own components.
This article, from some internal NSA publication, is about Lambros Callimahos, who taught an intensive 18-week course on cryptology for many years and died in 1977. Be sure to notice the great redacted photo of him and his students on page 17.
About police shootouts and spectators.
An interesting discussion about redaction.
Securing members of Congress from transparency.
PhotographyIsNotACrime.com points out the obvious: after years of warning us that photography is suspicious, the police were happy to accept all of those amateur photographs and videos at the Boston Marathon.
I've talked about plant security systems. Specifically, I've talked about tobacco plants that call air strikes against insects that eat them, by releasing a scent that attracts predators to those insects. Here's another defense: the plants also tag caterpillars for predators by feeding them a sweet snack that makes them give off a strong scent.
In this video, Ellen makes fun of the "Internet Password Minder," which is -- if you think about it -- only slightly different than Password Safe.
Internet Password Minder:
A 92-year-old World War II Bletchley Park codebreaker has had a set of commemorative stamps issued in his honor.
The Internet anonymity service Tor needs people who are willing to run bridges. It's a goodness for the world; do it if you can.
xkcd on a bad threat model.
Turns out you can learn a lot if you ping the entire Internet.
I've already written about the guy who got a new trial because a virus ate his court records. Here's someone who will have to redo his thesis research because someone stole his only copy of the data. Remember the rule: no one ever wants backups, but everyone always wants restores.
I have no idea if that image is real or not, but I've been hearing such stories for at least two decades.
Really interesting article detailing how criminals steal from a company's accounts over the Internet. "The costly cyberheist was carried out with the help of nearly 100 different accomplices in the United States who were hired through work-at-home job scams run by a crime gang that has been fleecing businesses for the past five years." Basically, the criminals break into the bank account, move money into a bunch of other bank accounts, and use unwitting accomplices to launder the money. "The publication said the attack occurred on Apr. 19, and moved an estimated $1.03 million out of the hospital's payroll account into 96 different bank accounts, mostly at banks in the Midwest and East Coast."
Google is paying bug bounties. This is important; there's a market in vulnerabilities that provides incentives for their being kept secret and exploitable; for Google to buy and patch them makes us all more secure. The U.S. government should do the same.
The market in vulnerabilities:
FinFisher (also called FinSpy) is a commercially sold spyware package that is used by governments worldwide, including the U.S. There's a new report that has a bunch of new information.
Mozilla has sent them a cease and desist letter for using their name and code.
Interesting research on the risks of networked systems.
Here is a simple but clever idea: seed password files with dummy entries called "honeywords" that will trigger an alarm when used. That way a site can know when a hacker is trying to crack its stolen password file.
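The mechanism can be sketched in a few lines. This is a minimal illustration under assumptions of my own -- PBKDF2 hashing, a per-account decoy list, and invented function names -- not the actual proposal, which (due to Ari Juels and Ron Rivest) keeps the index of the real password on a separate, hardened "honeychecker" server:

```python
# Sketch of the honeywords idea: store decoy password hashes alongside
# the real one; a login that matches a decoy means the file was cracked.
import hashlib
import os
import secrets

def hash_pw(password: str, salt: bytes) -> str:
    # PBKDF2 so decoys and the real password are equally costly to crack
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000).hex()

def make_entry(real_password: str, n_decoys: int = 3):
    """Store one real password among n_decoys plausible honeywords."""
    salt = os.urandom(16)
    words = [secrets.token_urlsafe(8) for _ in range(n_decoys)]
    real_index = secrets.randbelow(n_decoys + 1)
    words.insert(real_index, real_password)
    entry = {"salt": salt, "hashes": [hash_pw(w, salt) for w in words]}
    # real_index belongs on a separate honeychecker server; the plaintext
    # words are returned here only so a demo can simulate an attacker
    return entry, real_index, words

def check(entry, real_index: int, attempt: str) -> str:
    h = hash_pw(attempt, entry["salt"])
    if h not in entry["hashes"]:
        return "wrong password"
    if entry["hashes"].index(h) == real_index:
        return "ok"
    # only someone who cracked the stolen file could know a decoy word
    return "ALARM: honeyword used; password file likely stolen"
```

An attacker who steals the password file and cracks the hashes sees several equally plausible passwords per account; guessing wrong trips the alarm.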
Is the U.S. government recording and saving all domestic telephone calls?
The efficacy of evacuation alerts at an airport.
Another example of how easy it is to reidentify anonymous data.
"The Economist" supports closing Guantanamo.
"The Onion" on browser security.
And while we're talking about "The Onion," they were recently hacked by Syria (either the government or someone on their side).
They responded in their own way.
Our government collects a lot of information about us. Tax records, legal records, license records, records of government services received -- it's all in databases that are increasingly linked and correlated. Still, there's a lot of personal information the government can't collect. Either they're prohibited by law from asking without probable cause and a judicial order, or they simply have no cost-effective way to collect it. But the government has figured out how to get around the laws, and collect personal data that has been historically denied to them: ask corporate America for it.
It's no secret that we're monitored continuously on the Internet. Some of the company names you know, such as Google and Facebook. Others hide in the background as you move about the Internet. There are browser plugins that show you who is tracking you. One Atlantic editor found 105 companies tracking him during one 36-hour period. Add data from your cell phone (who you talk to, your location), your credit cards (what you buy, from whom you buy it), and the dozens of other times you interact with a computer daily, and it's clear we live in a surveillance state beyond the dreams of Orwell.
It's all corporate data, compiled and correlated, bought and sold. And increasingly, the government is doing the buying. Some of this is collected using National Security Letters (NSLs). These give the government the ability to demand an enormous amount of personal data about people for very speculative reasons, with neither probable cause nor judicial oversight. Data on these secretive orders is obviously scant, but we know that the FBI has issued hundreds of thousands of them in the past decade -- for reasons that go far beyond terrorism.
NSLs aren't the only way the government can get at corporate data. Sometimes they simply purchase it, just as any other company might. Sometimes they can get it for free, from corporations that want to stay on the government's good side.
CISPA, a bill currently wending its way through Congress, codifies this sort of practice even further. If signed into law, CISPA will allow the government to collect all sorts of personal data from corporations, without any oversight at all, and will protect corporations from lawsuits based on their handing over that data. Without hyperbole, it's been called the death of the 4th Amendment. Right now, it's mainly the FBI and the NSA who are getting this data, but all sorts of government agencies have administrative subpoena power.
Data on this scale has all sorts of applications. From finding tax cheaters by comparing data brokers' estimates of income and net worth with what's reported on tax returns, to compiling a list of gun owners from Web browsing habits, instant messaging conversations, and locations -- did you have your iPhone turned on when you visited a gun store? -- the possibilities are endless.
Government photograph databases form the basis of any police facial recognition system. They're not very good today, but they'll only get better. But the government no longer needs to collect photographs. Experiments demonstrate that the Facebook database of tagged photographs is surprisingly effective at identifying people. As more places follow Disney's lead in fingerprinting people at its theme parks, the government will be able to use that to identify people as well.
In a few years, the whole notion of a government-issued ID will seem quaint. Between facial recognition, the unique signature from your smartphone, the RFID chips in your clothing and other items you own, and whatever new technologies will broadcast your identity, no one will have to ask to see ID. When you walk into a store, they'll already know who you are. When you interact with a policeman, she'll already have your personal information displayed on her Internet-enabled glasses.
Soon, governments won't have to bother collecting personal data. We're willingly giving it to a vast network of for-profit data collectors, and they're more than happy to pass it on to the government without our knowledge or consent.
This essay previously appeared on TheAtlantic.com.
The Internet is a surveillance state:
National Security Letters:
Fingerprinting at Disney properties:
Google and facial recognition:
I am speaking at QCon in New York on June 13th.
Earlier this month I spent a week at the Berkman Center for Internet and Society, talking to people about power, security, technology, and threats. As part of that week, I gave a public talk at Harvard. Because my thoughts are so diffuse and disjointed, I didn't think I could pull it all together into a coherent talk. Instead, I asked Jonathan Zittrain to interview me on stage. He did, and the results are here: both video and transcript.
Be warned, though. You're getting a bunch of half-formed raw thoughts, contradictions and all. I appreciate comments, criticisms, reading suggestions, and so on.
Interesting op-ed by former DHS head Michael Chertoff on the privacy risks of Google Glass.
Now imagine that millions of Americans walk around each day wearing the equivalent of a drone on their head: a device capable of capturing video and audio recordings of everything that happens around them. And imagine that these devices upload the data to large-scale commercial enterprises that are able to collect the recordings from each and every American and integrate them together to form a minute-by-minute tracking of the activities of millions.
That is almost precisely the vision of the future that lies directly ahead of us. Not, of course, with wearable drones but with wearable Internet-connected equipment. This new technology -- whether in the form of glasses or watches -- may unobtrusively capture video data in real time, store it in the cloud and allow for it to be analyzed.
It's not unusual for government officials -- the very people we disagree with regarding civil liberties issues -- to agree with us on consumer privacy issues. But don't forget that this person advocated for full-body scanners at airports while on the payroll of a scanner company.
One of the points he makes, that the data collected from Google Glass will become part of Google's vast sensory network, echoes something I've heard Marc Rotenberg at EPIC say: this whole thing would be a lot less scary if the glasses were sold by a company like Brookstone.
Chertoff's conflict of interest:
The ACLU comments on the story:
Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers "Liars and Outliers," "Beyond Fear," "Secrets and Lies," and "Applied Cryptography," and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT, and is on the Advisory Boards of the Electronic Privacy Information Center (EPIC) and the Electronic Frontier Foundation (EFF). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.
Copyright (c) 2013 by Bruce Schneier.
Photo of Bruce Schneier by Per Ervland.
Schneier on Security is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.