Crypto-Gram

April 15, 2016

by Bruce Schneier
CTO, Resilient Systems, Inc.
schneier@schneier.com
https://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <https://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <https://www.schneier.com/crypto-gram/archives/2016/…>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively and intelligent comment section. An RSS feed is available.


In this issue:
      Lawful Hacking and Continuing Vulnerabilities
      More Links on the San Bernardino iPhone Case
      Cryptography Is Harder Than It Looks
      News
      Memphis Airport Inadvertently Gets Security Right
      Schneier News
      New NIST Encryption Guidelines
      Resilient Systems News: IBM Has Bought Resilient Systems
      Hacking Lottery Machines
      IRS Security

Lawful Hacking and Continuing Vulnerabilities

The FBI’s legal battle with Apple is over, but the way it ended may not be good news for anyone.

Federal agents had been seeking to compel Apple to break the security of an iPhone 5c that had been used by one of the San Bernardino, Calif., terrorists. Apple had been fighting a court order to cooperate with the FBI, arguing that the authorities’ request was illegal and that creating a tool to break into the phone was itself harmful to the security of every iPhone user worldwide.

Last week, the FBI told the court it had learned of a possible way to break into the phone using a third party’s solution, without Apple’s help. On Monday, the agency dropped the case because the method worked. We don’t know who that third party is. We don’t know what the method is, or which iPhone models it applies to. Now it seems like we never will.

The FBI plans to classify this access method and to use it to break into other phones in other criminal investigations.

Compare this iPhone vulnerability with another, one that was made public on the same day the FBI said it might have found its own way into the San Bernardino phone. Researchers at Johns Hopkins University announced last week that they had found a significant vulnerability in the iMessage protocol. They disclosed the vulnerability to Apple in the fall, and last Monday, Apple released an updated version of its operating system that fixed the vulnerability. (That’s iOS 9.3—you should download and install it right now.) The Hopkins team didn’t publish its findings until Apple’s patch was available, so devices could be updated to protect them from attacks using the researchers’ discovery.

This is how vulnerability research is supposed to work.

Vulnerabilities are found, fixed, then published. The entire security community is able to learn from the research, and—more important—everyone is more secure as a result of the work.

The FBI is doing the exact opposite. It has been given whatever vulnerability it used to get into the San Bernardino phone in secret, and it is keeping it secret. All of our iPhones remain vulnerable to this exploit. This includes the iPhones used by elected officials and federal workers and the phones used by people who protect our nation’s critical infrastructure and carry out other law enforcement duties, including lots of FBI agents.

This is the trade-off we have to consider: do we prioritize security over surveillance, or do we sacrifice security for surveillance?

The problem with computer vulnerabilities is that they’re general. There’s no such thing as a vulnerability that affects only one device. If it affects one copy of an application, operating system or piece of hardware, then it affects all identical copies. A vulnerability in Windows 10, for example, affects all of us who use Windows 10. And it can be used by anyone who knows it, be they the FBI, a gang of cyber criminals, the intelligence agency of another country—anyone.

And once a vulnerability is found, it can be used for attack—like the FBI is doing—or for defense, as in the Johns Hopkins example.

Over years of battling attackers and intruders, we’ve learned a lot about computer vulnerabilities. They’re plentiful: vulnerabilities are found and fixed in major systems all the time. They’re regularly discovered independently, by outsiders rather than by the original manufacturers or programmers. And once they’re discovered, word gets out. Today’s top-secret National Security Agency attack techniques become tomorrow’s PhD theses and the next day’s hacker tools.

The attack/defense trade-off is not new to the US government. It even has a process for deciding what to do when a vulnerability is discovered: whether it should be disclosed to improve all of our security, or kept secret to be used for offense. The White House claims that it prioritizes defense, and that general vulnerabilities in widely used computer systems are patched.

Whatever method the FBI used to get into the San Bernardino shooter’s iPhone is one such vulnerability. The FBI did the right thing by using an existing vulnerability rather than forcing Apple to create a new one, but it should be disclosed to Apple and patched immediately.

This case has always been more about the PR battle and potential legal precedent than about the particular phone. And while the legal dispute is over, there are other cases involving other encrypted devices in other courts across the country. But while there will always be a few computers—corporate servers, individual laptops or personal smartphones—that the FBI would like to break into, there are far more such devices that we need to be secure.

One of the most surprising things about this debate is the number of former national security officials who came out on Apple’s side. They understand that we are singularly vulnerable to cyberattack, and that our cyberdefense needs to be as strong as possible.

The FBI’s myopic focus on this one investigation is understandable, but in the long run, it’s damaging to our national security.

This essay previously appeared in the Washington Post, with a far too click-bait headline.
https://www.washingtonpost.com/posteverything/wp/…

https://www.washingtonpost.com/world/…
https://www.washingtonpost.com/world/…

Johns Hopkins University iPhone vulnerability:
https://www.schneier.com/blog/archives/2016/03/…

US government officials who use iPhones:
http://www.theverge.com/2013/9/22/4760038/…
http://www.apple.com/r/store/government/
http://9to5mac.com/2012/11/21/…

Vulnerabilities equities process:
https://www.whitehouse.gov//2014/04/28/…
https://www.schneier.com/blog/archives/2012/06/…
https://www.lawfareblog.com/…
http://www.networkworld.com/article/2462706/…

Former national security officials on Apple’s side:
http://www.bloomberg.com/news/videos/2016-02-25/…
https://www.washingtonpost.com/opinions/…
http://www.npr.org/2016/03/14/470347719/…
http://www.theguardian.com/technology/2016/mar/02/…

To be fair, the FBI probably doesn’t know what the vulnerability is. And I wonder how easy it would be for Apple to figure it out. Given that the FBI has to exhaust all avenues of access before demanding help from Apple, we can learn which models are vulnerable by watching which lawsuits are abandoned now that the FBI knows about this method.
http://s.cfr.org/cyber/2016/03/29/…

Matt Blaze makes excellent points about how the FBI should disclose the vulnerabilities it uses, in order to improve computer security. That was part of a New York Times “Room for Debate” on hackers helping the FBI.
http://www.nytimes.com/roomfordebate/2016/03/30/…
http://www.nytimes.com/roomfordebate/2016/03/30/…


More Links on the San Bernardino iPhone Case

The FBI’s final reply to Apple is more of a character assassination attempt than a legal argument. It’s as if it only cares about public opinion at this point.

Although notice the threat in footnote 9 on page 22: “For the reasons discussed above, the FBI cannot itself modify the software on Farook’s iPhone without access to the source code and Apple’s private electronic signature. The government did not seek to compel Apple to turn those over because it believed such a request would be less palatable to Apple. If Apple would prefer that course, however, that may provide an alternative that requires less labor by Apple programmers.”

This should immediately remind everyone of the Lavabit case, where the FBI did ask for the site’s master key in order to get at one user. Ladar Levison commented on the similarities. He, of course, shut his service down rather than turn over the master key. A company as large as Apple does not have that option.

https://www.justsecurity.org/wp-content/uploads/…

Character assassination:
https://theintercept.com/2016/03/11/…
http://www.theguardian.com/technology/2016/mar/10/…

Lavabit:
http://www.theguardian.com/commentisfree/2014/may/…
http://www.zdnet.com/article/…

Marcy Wheeler wrote about this in detail.
https://www.emptywheel.net/2016/03/14/…

The New York Times reports that the White House might have overreached in this case.
http://www.nytimes.com/2016/03/14/technology/…

John Oliver has a great segment on this. With a Matt Blaze cameo!
https://www.youtube.com/watch?v=zsjZ2r9Ygzw

Good NPR interview with Richard Clarke. “Well, I don’t think it’s a fierce debate. I think the Justice Department and the FBI are on their own here. You know, the secretary of defense has said how important encryption is when asked about this case. The National Security Agency director and three past National Security Agency directors, a former CIA director, a former Homeland Security secretary have all said that they’re much more sympathetic with Apple in this case. You really have to understand that the FBI director is exaggerating the need for this and is trying to build it up as an emotional case, organizing the families of the victims and all of that. And it’s Jim Comey and the attorney general is letting him get away with it.”
http://www.npr.org/2016/03/14/470347719/…

Senator Lindsey Graham is changing his views: “‘It’s just not so simple,’ Graham said. ‘I thought it was that simple.'”
http://fortune.com/2016/03/10/apple-fbi-lindsay-graham/

Steven Levy on the history angle of this story.
https://backchannel.com/…

Benjamin Wittes on possible legislative options.
https://www.lawfareblog.com/new-front-second-crypto-war

Apple’s final response is pretty withering.
https://www.documentcloud.org/documents/…
http://www.wired.com/2016/03/…
http://techcrunch.com/2016/03/15/…

Commentary from Susan Crawford.
https://backchannel.com/…

FBI and China are on the same side.
http://www.theverge.com/2016/3/16/11244396/…

How this fight risks the whole US tech industry.
https://hbr.org/2016/02/…

Tim Cook interview.
http://time.com/magazine/us/4262476/…

Apple engineers might refuse to help the FBI, if Apple loses the case.
http://www.nytimes.com/2016/03/18/technology/…

And I should have previously posted this letter from racial justice activists, and this more recent essay on how this affects the LGBTQ community.
https://ssl.apple.com/pr/pdf/…
https://motherboard.vice.com/read/…

Interesting article on the Apple/FBI tensions that led to this case.
http://www.bloomberg.com/news/features/2016-03-20/…

My early speculation on who might be helping the FBI.
https://www.schneier.com/blog/archives/2016/03/…


Cryptography Is Harder Than It Looks

Writing a magazine column is always an exercise in time travel. I’m writing these words in early December. You’re reading them in February. This means anything that’s news as I write this will be old hat in two months, and anything that’s news to you hasn’t happened yet as I’m writing.

This past November, a group of researchers found some serious vulnerabilities in an encryption protocol that I, and probably most of you, use regularly. The group alerted the vendor, who is currently working to update the protocol and patch the vulnerabilities. The news will probably go public in the middle of February, unless the vendor successfully pleads for more time to finish their security patch. Until then, I’ve agreed not to talk about the specifics.

I’m writing about this now because these vulnerabilities illustrate two very important truisms about encryption and the current debate about adding back doors to security products:

  • Cryptography is harder than it looks.
  • Complexity is the worst enemy of security.

These aren’t new truisms. I wrote about the first in 1997 and the second in 1999. I’ve talked about them both in “Secrets and Lies” (2000) and “Practical Cryptography” (2003). They’ve been proven true again and again, as security vulnerabilities are discovered in cryptographic system after cryptographic system. They’re both still true today.

Cryptography is harder than it looks, primarily because it looks like math. Both algorithms and protocols can be precisely defined and analyzed. This isn’t easy, and there’s a lot of insecure crypto out there, but we cryptographers have gotten pretty good at getting this part right. However, math has no agency; it can’t actually secure anything. For cryptography to work, it needs to be written in software, embedded in a larger software system, managed by an operating system, run on hardware, connected to a network, and configured and operated by users. Each of these steps brings with it difficulties and vulnerabilities.

Although cryptography gives an inherent mathematical advantage to the defender, computer and network security are much more balanced. Again and again, we find vulnerabilities not in the underlying mathematics, but in all this other stuff. It’s far easier for an attacker to bypass cryptography by exploiting a vulnerability in the system than it is to break the mathematics. This has been true for decades, and it’s one of the lessons that Edward Snowden reiterated.
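As a toy illustration of this point (a deliberately simplified sketch I made up for this issue, not the code of any real product): the keystream generator below is built on a sound primitive, SHA-256, and the math is fine. A single operational mistake, reusing a nonce across two messages, breaks the whole thing without any cryptanalysis at all.

```python
# Toy stream cipher (illustrative only): a sound primitive, misused.
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Counter-mode keystream derived from SHA-256; the primitive itself is fine.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

key = b"0" * 32
nonce = b"fixed nonce"          # the bug: the same nonce for two messages
p1 = b"attack at dawn!"
p2 = b"retreat at dusk"
c1 = encrypt(key, nonce, p1)
c2 = encrypt(key, nonce, p2)

# XORing the two ciphertexts cancels the identical keystream, leaking the
# XOR of the two plaintexts -- the attacker never touches the math.
leak = bytes(a ^ b for a, b in zip(c1, c2))
assert leak == bytes(a ^ b for a, b in zip(p1, p2))
```

This is exactly the kind of flaw that lives in "all this other stuff": the system, not the algorithm.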

The second truism is that complexity is still the worst enemy of security. The more complex a system is, the more lines of code, interactions with other systems, configuration options, and vulnerabilities there are. Implementing cryptography involves getting everything right, and the more complexity there is, the more there is to get wrong.

Vulnerabilities come from options within a system, interactions between systems, interfaces between users and systems—everywhere. If good security comes from careful analysis of specifications, source code, and systems, then a complex system is more difficult and more expensive to analyze. We simply don’t know how to securely engineer anything but the simplest of systems.

I often refer to this quote, sometimes attributed to Albert Einstein and sometimes to Yogi Berra: “In theory, theory and practice are the same. In practice, they are not.”

These truisms are directly relevant to the current debate about adding back doors to encryption products. Many governments—from China to the US and the UK—want the ability to decrypt data and communications without users’ knowledge or consent. Almost all computer security experts have two arguments against this idea: first, adding this back door makes the system vulnerable to all attackers and doesn’t just provide surreptitious access for the “good guys,” and second, creating this sort of access greatly increases the underlying system’s complexity, exponentially increasing the possibility of getting the security wrong and introducing new vulnerabilities.

Going back to the new vulnerability that you’ll learn about in mid-February, the lead researcher wrote to me: “If anyone tells you that [the vendor] can just ‘tweak’ the system a little bit to add key escrow or to man-in-the-middle specific users, they need to spend a few days watching the authentication dance between [the client device/software] and the umpteen servers it talks to just to log into the network. I’m frankly amazed that any of it works at all, and you couldn’t pay me enough to tamper with any of it.” This is an important piece of wisdom.

The designers of this system aren’t novices. They’re an experienced team with some of the best security engineers in the field. If these guys can’t get the security right, just imagine how much worse it is for smaller companies without this team’s level of expertise and resources. Now imagine how much worse it would be if you added a government-mandated back door. There are more opportunities to get security wrong, and more engineering teams without the time and expertise necessary to get it right. It’s not a recipe for security.

Unlike what much of today’s political rhetoric says, strong cryptography is essential for our information security. It’s how we protect our information and our networks from hackers, criminals, foreign governments, and terrorists. Security vulnerabilities, whether deliberate backdoor access mechanisms or accidental flaws, make us all less secure. Getting security right is harder than it looks, and our best chance is to make the cryptography as simple and public as possible.

This essay previously appeared in IEEE Security & Privacy.
http://ieeexplore.ieee.org/stamp/stamp.jsp?…

It’s an update of something I wrote in 1997.
https://www.schneier.com/essays/archives/1997/01/…

That vulnerability I alluded to in the essay is the recent iMessage flaw.
https://www.schneier.com/blog/archives/2016/03/…


News

Companies handing source code over to governments.
https://www.schneier.com/blog/archives/2016/03/…

The Brennan Center has released a report on EO 12333, the executive order that regulates the NSA’s overseas surveillance. Much of what the NSA does here is secret and, even though the EO is designed for foreign surveillance, Americans are regularly swept up in the NSA’s collection operations.
https://www.brennancenter.org/publication/…
Here’s an article from the Intercept:
https://theintercept.com/2016/03/17/…
And this is me from “Data and Goliath” on EO 12333: “Executive Order 12333, the 1981 presidential document authorizing most of NSA’s surveillance, is incredibly permissive. It is supposed to primarily allow the NSA to conduct surveillance outside the US, but it gives the agency broad authority to collect data on Americans. It provides minimal protections for Americans’ data collected outside the US, and even less for the hundreds of millions of innocent non-Americans whose data is incidentally collected. Because this is a presidential directive and not a law, courts have no jurisdiction, and congressional oversight is minimal. Additionally, at least in 2007, the president believed he could modify or ignore it at will and in secret. As a result, we know very little about how Executive Order 12333 is being interpreted inside the NSA.”

Matthew Green and his team at Johns Hopkins University found and reported a significant iMessage encryption flaw last year.
https://www.washingtonpost.com/world/…
https://isi.jhu.edu/~mgreen/imessage.pdf
I wrote about this flaw in IEEE Security & Privacy earlier this year, in the essay above.

Related: A different iOS flaw was reported last week. Called AceDeceiver, it is a Trojan that allows an attacker to install malicious software onto an iOS device, bypassing Apple’s DRM protections. I don’t believe that Apple has fixed this yet, although it seems as if Apple just has to add a certificate revocation list, or make the certs nonreplayable by having some mandatory interaction with the iTunes store.
http://researchcenter.paloaltonetworks.com/2016/03/…

Observations on the surveillance that resulted in the capture of Salah Abdeslam.
https://medium.com/@thegrugq/…

This was newly released under FOIA at my request: Victor C. Williams, Jr., Donn B. Parker, and Charles C. Wood, “Impacts of Federal Policy Options for Nonmilitary Cryptography,” NTIA-CR-81-10, National Telecommunications and Information Administration, U.S. Department of Commerce, June 1981. It argues that cryptography is an important enabling technology. At this point, it’s only of historical value.
http://www.its.bldrdoc.gov/publications/2825.aspx

Last month, the FBI added two members of the Syrian Electronic Army to its cyber most-wanted list. I had no idea that the FBI had a cyber most-wanted list.
http://www.networkworld.com/article/3047172/…
http://www.theatlantic.com/technology/archive/2016/…
https://www.fbi.gov/wanted/cyber/@@wanted-group-listing

Hacking the lottery by manipulating the terminals.
https://www.lotterypost.com/news/301512

Interesting paper: Yochai Benkler, “Degrees of Freedom, Dimensions of Power,” Daedalus, winter 2016.
https://www.wzb.eu/sites/default/files/u32/…

Here’s a 1,300-page Congressional report on “surveillance technology” from 1976.
https://ia801702.us.archive.org/26/items/…

Research paper: Elizabeth Stoycheff, “Under Surveillance: Examining Facebook’s Spiral of Silence Effects in the Wake of NSA Internet Monitoring”:
https://www.schneier.com/blog/archives/2016/03/…

ISIS encryption opsec: tidbits from the New York Times.
https://www.schneier.com/blog/archives/2016/03/…

Long and interesting article about a fixer who hacked multiple elections in Latin America. This isn’t election hacking in the sense of manipulating the voting machines or the vote counting, but hacking and social-media dirty tricks in the run-up to the election.
http://www.bloomberg.com/features/…
http://fusion.net/story/287086/…

April Fool’s joke on election security.
http://avi-rubin.blogspot.com/2016/04/…

Reddit has received a National Security Letter.
https://boingboing.net/2016/03/31/…
https://yro.slashdot.org/story/16/04/01/0321257/…
http://www.bbc.com/news/technology-35943062
https://www.reddit.com/r/worldnews/comments/4ct1kz/…
http://arstechnica.com/tech-policy/2016/03/…
https://www.reddit.com/r/announcements/comments/…
I have long discounted warrant canaries. A gag order is serious, and this sort of high-school trick won’t fool judges for a minute. But so far they seem to be working. Now we have another question: now what? We have one piece of information, but not a very useful one. We know that NSLs can affect anywhere from a single user to millions of users. Which kind was this? We have no idea. Is Reddit fighting? We have no idea. How long will this go on? We don’t know that, either. When I think about what we can do to be useful here, I can’t think of anything.

Smart essay on the limits of anti-terrorism security.
https://www.washingtonpost.com/opinions/…

WhatsApp is now end-to-end encrypted.
http://www.wired.com/2016/04/…
https://www.whatsapp.com/security/
https://www.whatsapp.com/security/…
https://www.theguardian.com/technology/2016/apr/05/…
http://www.wsj.com/articles/…
http://arstechnica.com/tech-policy/2016/04/…
https://it.slashdot.org/story/16/04/05/1713244/…
https://news.ycombinator.com/item?id=11431108

CONIKS is a new, easy-to-use transparent key-management system.
https://coniks.cs.princeton.edu/
https://www.usenix.org/conference/usenixsecurity15/…
https://freedom-to-tinker.com//masomel/…

Bypassing phone security through social engineering:
https://www.schneier.com/blog/archives/2016/04/…

Interesting research: Suphannee Sivakorn, Iasonas Polakis and Angelos D. Keromytis, “I Am Robot: (Deep) Learning to Break Semantic Image CAPTCHAs.”
http://www.cs.columbia.edu/~polakis/papers/…
http://go.theregister.com/feed/…
http://news.softpedia.com/news/…

Security lessons from the game of Werewolf:
http://eaves.ca/2013/11/07/…

Scams from the 1800s. They feel quaint today.
http://www.npr.org/sections/npr-history-dept/2015/…

Ross Anderson liveblogged the 24th International Workshop on Security Protocols in Brno, Czech Republic.
https://www.lightbluetouchpaper.org/2016/04/07/…
https://www.engr.mun.ca/~spw2016/organization/schedule/

The company Cellebrite is developing a portable forensics device that would determine if a smartphone user was using the phone at a particular time. The idea is to test phones of drivers after accidents. They’re calling it a “textalyzer.” This is interesting technology. To me, it feels no more intrusive than a breathalyzer, assuming the textalyzer has all the privacy guards described in the article.
http://arstechnica.com/tech-policy/2016/04/…
https://www.nysenate.gov/legislation/bills/2015/…
https://yro.slashdot.org/story/16/04/11/2121240/…
https://www.reddit.com/r/technology/comments/4ecl6s/…

Story of Julie Miller, who cheated in multiple triathlon races.
http://www.nytimes.com/2016/04/10/sports/…


Memphis Airport Inadvertently Gets Security Right

A local newspaper recently tested airport security at Memphis Airport:

Our crew sat for 30 minutes in the passenger drop-off area Tuesday without a word from anyone, and that raised a number of eyebrows.

Certainly raised mine. Here’s my question: why is that a bad thing? If you’re worried about a car bomb, why do you think length of time sitting curbside correlates with likelihood of detonation? Were I a car bomber sitting in the front seat, I would detonate my bomb pretty damned quick.

Anyway, the airport was 100% correct in its reply:

The next day, the airport told FOX13 they take a customer-friendly “hassle free” approach.

I’m certainly in favor of that. Useless security theater that adds to the hassle of traveling without actually making us any safer doesn’t help anyone.

Unfortunately, the airport is now reviewing its procedures, because fear wins:

CEO Scott Brockman sent FOX13 a statement saying in part “We will continue to review our policies and procedures and implement any necessary changes in order to ensure the safety of the traveling public.”

http://www.fox13memphis.com/news/…

The airport PR person commented on my blog. “Jim Turner of the Cato Institute” is actually Jim Harper.
https://www.schneier.com/blog/archives/2016/03/…


Schneier News

I’m speaking at Infosecurity Mexico in Mexico City on April 21.
http://www.reedexpo.com/es/Eventos/4971?evEdId=4971

I’m speaking at the Rochester Institute of Technology in Rochester, NY on May 5.

I’m speaking at the Centrify Connect Conference in New York on May 12.
https://www.centrifyconnect.com/

A quote from Data and Goliath is the answer to a Wall Street Journal acrostic. It’s not the same as being a New York Times crossword puzzle answer, but it’s close.
http://www.wsj.com/public/resources/documents/…


New NIST Encryption Guidelines

NIST has published a draft of its new standard for encryption use: “NIST Special Publication 800-175B, Guideline for Using Cryptographic Standards in the Federal Government: Cryptographic Mechanisms.” In it, the Escrowed Encryption Standard from the 1990s, FIPS-185, is no longer certified, and neither is Skipjack, the NSA’s symmetric algorithm from the same period.

I see nothing sinister about decertifying Skipjack. In a world of faster computers and post-quantum thinking, an 80-bit key and 64-bit block no longer cut it.
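To put rough numbers on that (a back-of-the-envelope sketch of mine, not figures from the NIST document): a 64-bit block hits the birthday bound after only a few dozen gigabytes of data under one key, and an 80-bit keyspace is small enough that brute force is a matter of engineering budget rather than impossibility.

```python
# Back-of-the-envelope numbers for "80-bit key and 64-bit block no longer cut it."
import math

block_bits = 64
# Birthday bound: among about 2**(n/2) ciphertext blocks, block collisions
# become likely, which leaks plaintext information in modes like CBC.
blocks_at_bound = 2 ** (block_bits // 2)
data_gib = blocks_at_bound * (block_bits // 8) / 2**30
print(f"64-bit block: birthday bound after ~{data_gib:.0f} GiB under one key")

key_bits = 80
# Assume (hypothetically) a search rig testing 10^12 keys per second.
avg_seconds = (2 ** key_bits / 2) / 1e12
avg_years = avg_seconds / (365.25 * 24 * 3600)
print(f"80-bit key: ~{avg_years:.0f} years on average at 10^12 keys/sec")
```

Thirty-two gigabytes is a few minutes of traffic on a fast link, and the key-search figure shrinks linearly with every additional rig; neither margin is comfortable anymore, even before quantum computers enter the picture.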

NIST Special Publication 800-175B, Guideline for Using Cryptographic Standards in the Federal Government: Cryptographic Mechanisms:
http://csrc.nist.gov/publications/drafts/800-175/…

FIPS-185:
http://csrc.nist.gov/publications/fips/fips185/…

Skipjack:
http://csrc.nist.gov/groups/ST/toolkit/documents/…

Post-quantum thinking:
https://www.schneier.com/blog/archives/2015/08/…

My essays from 1998 on Skipjack and KEA.
https://www.schneier.com/crypto-gram/archives/1998/…
https://www.schneier.com/crypto-gram/archives/1998/…


Resilient Systems News: IBM Has Bought Resilient Systems

It’s officially final; IBM has “completed the acquisition” of Resilient Systems, Inc. We are now “Resilient, an IBM Company.”

As I expected when I announced this acquisition, I am staying on as the CTO of Resilient and something like Senior Advisor to IBM Security—we’re still working on the exact title. Everything I’ve seen so far indicates that this will be a good home for me. IBM knows what it’s getting, and it’s still keeping me on. I have no intention of changing what I write about or speak about—or to whom.

For the company, this is still a great deal. The acquisition was big news at the RSA Conference a month ago, and the response has been uniformly positive from analysts and mostly positive from customers.

https://www.resilientsystems.com/…

https://www-03.ibm.com/press/us/en/pressrelease/…
https://www.resilientsystems.com/

My original announcement:
https://www.schneier.com/blog/archives/2016/02/…

Here’s a video of Resilient CEO John Bruce talking with IBM Security General Manager Marc van Zadelhoff about the acquisition.
https://www.resilientsystems.com/…

And here’s an analyst talking about the acquisition.
http://marketrealist.com/2016/03/…


Hacking Lottery Machines

Interesting article about how a former security director of the US Multi-State Lottery Association hacked the random-number generator in lottery software so he could predict the winning numbers.

For several years, Eddie Tipton, the former security director of the US Multi-State Lottery Association, installed software code that allowed him to predict winning numbers on specific days of the year, investigators allege. The random-number generators had been erased, but new forensic evidence has revealed how the hack was apparently done.

[…]

The number generator had apparently been hacked to produce predictable numbers on three days of the year, after the machine had gone through a security audit.

Note that last bit. The software would only produce the non-random results *after* the software security audit was completed.
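To make the mechanism concrete, here is a hypothetical sketch (invented for illustration; this is not Tipton's actual code, and the trigger dates are made up) of how a number generator can look random year-round yet be fully predictable to an insider on chosen days:

```python
# Hypothetical backdoored lottery RNG (illustration only).
import datetime
import hashlib
import random

TRIGGER_DATES = {(5, 27), (11, 23), (12, 29)}  # made-up trigger days

def draw(today: datetime.date, n: int = 6, top: int = 49) -> list[int]:
    if (today.month, today.day) in TRIGGER_DATES:
        # Backdoor: seeded only from the date, so anyone who knows the
        # scheme can reproduce the "winning" numbers offline in advance.
        seed = int.from_bytes(
            hashlib.sha256(today.isoformat().encode()).digest()[:8], "big")
        rng = random.Random(seed)
    else:
        # Every other day: genuinely unpredictable, and indistinguishable
        # from the backdoored output to anyone auditing the draws.
        rng = random.SystemRandom()
    return sorted(rng.sample(range(1, top + 1), n))

# On a trigger date, the insider's offline prediction matches the machine:
d = datetime.date(2016, 5, 27)
assert draw(d) == draw(d)
```

The audit problem is obvious: unless the auditors read the source (or run the machine on a trigger date), the output is statistically indistinguishable from honest randomness.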

It’s getting harder and harder to trust opaque and unaccountable algorithms. Anyone who thinks we should have electronic voting machines—or worse, Internet voting—needs to pay attention.

https://www.theguardian.com/technology/2016/apr/08/…

Opaque and unaccountable algorithms:
https://aeon.co/essays/…

Internet voting:
https://www.verifiedvoting.org/resources/…


IRS Security

Monday is Tax Day. Many of us are thinking about our taxes. Are they too high or too low? What’s our money being spent on? Do we have a government worth paying for? I’m not here to answer any of those questions—I’m here to give you something else to think about. In addition to sending the IRS your money, you’re also sending them your data.

It’s a lot of highly personal financial data, so it’s sensitive and important information.

Is that data secure?

The short answer is “no.” Every year, the GAO—Government Accountability Office—reviews IRS security and issues a report. The title of this year’s report kind of says it all: “IRS Needs to Further Improve Controls over Financial and Taxpayer Data.” The details are ugly: failures in identification and authentication of network users, failures to encrypt data, failures in audit and monitoring, and failures to patch vulnerabilities and update software.

To be fair, the GAO can sometimes be pedantic in its evaluations. And the 43 recommendations for the IRS to improve security aren’t being made public, so as not to advertise our vulnerabilities to the bad guys. But this is all pretty basic stuff, and it’s embarrassing.

More importantly, this lack of security is dangerous. We know that cybercriminals are using our financial information to commit fraud. Specifically, they’re using our personal tax information to file for tax refunds in our name to fraudulently collect the refunds.

We know that foreign governments are targeting U.S. government networks for personal information on U.S. citizens: Remember the OPM data theft that was made public last year, in which a federal personnel database with records on 21.5 million people was stolen?

There have been some stories of hacks against IRS databases in the past. I think that the IRS has been hacked even more than is publicly reported, either because the government is keeping the attacks secret or because it doesn’t even realize it’s been attacked.

So what happens next?

If the past is any guide, not a lot. The GAO has been warning about problems with IRS security since it started writing these reports in 2007. In each report, the GAO has issued recommendations for the IRS to improve security. After each report, the IRS did a few of those things, but ignored most of the recommendations. In this year’s report, for example, the GAO complained that the IRS ignored 47 of its 70 recommendations from 2015. In its 2015 report, it complained that the IRS only mitigated 14 of the 69 weaknesses it identified in 2013. The 2012 report didn’t paint IRS security in any better light.

If I had to guess, I’d say the IRS’s security is this bad for the exact same reason that so much corporate network security is so bad: lack of budget. It’s not uncommon for companies to skimp on their security budget. The budget at the IRS has been cut 17% since 2010; I am certain IT security was not exempt from those cuts.

So we’re stuck. We have no choice but to give the IRS our data. The IRS isn’t doing a good job securing our data. Congress isn’t giving the IRS enough budget to do a good job securing our data. Last Tuesday, the Senate Finance Committee urged the IRS to improve its security. We all need to urge Congress to give it the money to do so.

Nothing is absolutely hacker-proof, but there are a lot of security improvements the IRS can make. If we have to give the IRS all our information—and we do—we deserve to have it taken care of properly.

This essay previously appeared on CNN.com.
http://www.cnn.com/2016/04/13/opinions/…

2016 GAO Report on the IRS:
http://www.gao.gov/assets/680/676097.pdf

IRS and Identity Theft:
http://www.usatoday.com/story/money/columnist/2016/…

OPM Hack:
https://www.washingtonpost.com/world/…
http://www.cnn.com/2015/07/09/politics/…

IRS Hack from 2016:
http://www.usatoday.com/story/money/2016/02/26/…

2015 GAO Report on the IRS:
http://gao.gov/assets/670/669108.pdf

2012 GAO Report on the IRS:
http://www.gao.gov/assets/590/589399.pdf

IRS Funding:
http://www.cbpp.org/research/federal-tax/…

Senate Hearings on IRS Security:
http://www.accountingtoday.com/news/tax-practice/…


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <https://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of 13 books—including his latest, “Data and Goliath”—as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation’s Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Resilient, an IBM Company. See <https://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of Resilient, an IBM Company.

Copyright (c) 2016 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.