Entries Tagged "intelligence"


Intelligence Oversight and How It Can Fail

Former NSA attorneys John DeLong and Susan Hennessey have written a fascinating article describing a particular incident of oversight failure inside the NSA. Technically, the story hinges on a definitional difference between the NSA’s and the FISA court’s meanings of the word “archived.” (For the record, I would have defaulted to the NSA’s interpretation, which feels more accurate technically.) But while the story is worth reading, what’s especially interesting are the broader issues about how a nontechnical judiciary can provide oversight over a very technical data collection-and-analysis organization—especially if the oversight must largely be conducted in secret.

From the article:

Broader root cause analysis aside, the BR FISA debacle made clear that the specific matter of shared legal interpretation needed to be addressed. Moving forward, the government agreed that NSA would coordinate all significant legal interpretations with DOJ. That sounds like an easy solution, but making it meaningful in practice is highly complex. Consider this example: a court order might require that “all collected data must be deleted after two years.” NSA engineers must then make a list for the NSA attorneys:

  1. What does deleted mean? Does it mean make inaccessible to analysts or does it mean forensically wipe off the system so data is gone forever? Or does it mean something in between?
  2. What about backup systems used solely for disaster recovery? Does the data need to be removed there, too, within two years, even though it’s largely inaccessible and typically there is a planned delay to account for mistakes in the operational system?
  3. When does the timer start?
  4. What’s the legally-relevant unit of measurement for timestamp computation—a day, an hour, a second, a millisecond?
  5. If a piece of data is deleted one second after two years, is that an incident of noncompliance? What about a delay of one day? ….
  6. What about various system logs that simply record the fact that NSA had a data object, but no significant details of the actual object? Do those logs need to be deleted too? If so, how soon?
  7. What about hard copy printouts?

And that is only a tiny sample of the questions that need to be answered for that small sentence fragment. Put yourself in the shoes of an NSA attorney: which of these questions—in particular the answers—require significant interpretations to be coordinated with DOJ and which determinations can be made internally?

Now put yourself in the shoes of a DOJ attorney who receives from an NSA attorney a subset of this list for advice and counsel. Which questions are truly significant from your perspective? Are there any questions here that are so significant they should be presented to the Court so that the government can be sufficiently confident that the Court understands how the two-year rule is really being interpreted and applied?
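
To make questions 3 through 5 concrete, here is a deliberately simplified sketch; every timestamp and threshold is hypothetical and not drawn from any real NSA system. The point is only that the same “delete after two years” rule yields different compliance verdicts depending on which clock granularity the attorneys settle on.

    from datetime import datetime

    # Hypothetical illustration only; nothing here reflects a real system.
    # Rule under interpretation: "all collected data must be deleted after two years."

    collected_at = datetime(2014, 6, 1, 9, 30, 15)               # collection timestamp
    deadline = collected_at.replace(year=collected_at.year + 2)  # a naive "two calendar years later"
    deleted_at = datetime(2016, 6, 1, 14, 0, 0)                  # when the purge job actually ran

    compliant_by_day = deleted_at.date() <= deadline.date()      # day-granularity reading
    compliant_by_second = deleted_at <= deadline                 # second-granularity reading

    print(compliant_by_day)     # True:  purged on the last permitted calendar day
    print(compliant_by_second)  # False: purged about four and a half hours past the deadline

Whether that afternoon purge is a reportable incident of noncompliance is exactly the kind of determination that has to be made somewhere: by NSA engineers, NSA attorneys, DOJ, or the court.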

In many places I have separated different kinds of oversight: are we doing things right versus are we doing the right things? This is very much about the first: is the NSA complying with the rules the courts impose on it? I believe that the NSA tries very hard to follow the rules it’s given, while at the same time being very aggressive about how it interprets any kind of ambiguity and using its nonadversarial relationship with its overseers to its advantage.

The only possible solution I can see to all of this is more public scrutiny. Secrecy is toxic here.

Posted on October 18, 2016 at 2:29 PM

NSA Contractor Arrested for Stealing Classified Information

The NSA has another contractor who stole classified documents. It’s a weird story: “But more than a month later, the authorities cannot say with certainty whether Mr. Martin leaked the information, passed them on to a third party or whether he simply downloaded them.” So maybe a potential leaker. Or a spy. Or just a document collector.

My guess is that there are many leakers inside the US government, even more than what’s on this list from last year.

EDITED TO ADD (10/7): More information.

Posted on October 7, 2016 at 6:07 AM

The 2016 National Threat Assessment

It’s National Threat Assessment Day. Published annually by the Director of National Intelligence, the “Worldwide Threat Assessment of the US Intelligence Community” is the US intelligence community’s one opportunity to talk publicly about threats in general. The document is the result of weeks of work and input from lots of people. For Clapper, it’s his chance to shape the dialog, set priorities, and prepare Congress for budget requests. The document is an unclassified summary of a much longer classified document. And the day also includes Clapper testifying before the Senate Armed Services Committee. (You’ll remember his now-famous lie to the committee in 2013.)

The document covers a wide variety of threats, from terrorism to organized crime, from energy politics to climate change. Although the document clearly says “The order of the topics presented in this statement does not necessarily indicate the relative importance or magnitude of the threat in the view of the Intelligence Community,” it does. And, as in 2015 and 2014, cyber threats are #1—although this year it’s called “Cyber and Technology.”

The consequences of innovation and increased reliance on information technology in the next few years on both our society’s way of life in general and how we in the Intelligence Community specifically perform our mission will probably be far greater in scope and impact than ever. Devices, designed and fielded with minimal security requirements and testing, and an ever-increasing complexity of networks could lead to widespread vulnerabilities in civilian infrastructures and US Government systems. These developments will pose challenges to our cyber defenses and operational tradecraft but also create new opportunities for our own intelligence collectors.

Especially note that last clause. The FBI might hate encryption, but the intelligence community is not going dark.

The document then calls out a few specifics like the Internet of Things and Artificial Intelligence—no surprise, considering other recent statements from government officials. This is the “…and Technology” part of the category.

More specifically:

Future cyber operations will almost certainly include an increased emphasis on changing or manipulating data to compromise its integrity (i.e., accuracy and reliability) to affect decisionmaking, reduce trust in systems, or cause adverse physical effects. Broader adoption of IoT devices and AI—in settings such as public utilities and health care—will only exacerbate these potential effects. Russian cyber actors, who post disinformation on commercial websites, might seek to alter online media as a means to influence public discourse and create confusion. Chinese military doctrine outlines the use of cyber deception operations to conceal intentions, modify stored data, transmit false data, manipulate the flow of information, or influence public sentiments—all to induce errors and miscalculation in decisionmaking.

Russia is the number one threat, followed by China, Iran, North Korea, and non-state actors:

Russia is assuming a more assertive cyber posture based on its willingness to target critical infrastructure systems and conduct espionage operations even when detected and under increased public scrutiny. Russian cyber operations are likely to target US interests to support several strategic objectives: intelligence gathering to support Russian decisionmaking in the Ukraine and Syrian crises, influence operations to support military and political objectives, and continuing preparation of the cyber environment for future contingencies.

Comments on China refer to the cybersecurity agreement from last September:

China continues to have success in cyber espionage against the US Government, our allies, and US companies. Beijing also selectively uses cyberattacks against targets it believes threaten Chinese domestic stability or regime legitimacy. We will monitor compliance with China’s September 2015 commitment to refrain from conducting or knowingly supporting cyber-enabled theft of intellectual property with the intent of providing competitive advantage to companies or commercial sectors. Private-sector security experts have identified limited ongoing cyber activity from China but have not verified state sponsorship or the use of exfiltrated data for commercial gain.

Also interesting are the comments on non-state actors, which discuss propaganda campaigns from ISIL, criminal ransomware, and hacker tools.

Posted on February 9, 2016 at 3:25 PM

UK Government Promoting Backdoor-Enabled Voice Encryption Protocol

The UK government is pushing something called the MIKEY-SAKKE protocol to secure voice communications. Basically, it’s an identity-based system that necessarily requires a trusted key-distribution center. So key escrow is inherently built in, and there’s no perfect forward secrecy. The only reasonable explanation for designing a protocol with these properties is to enable third-party eavesdropping.
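
To see why, here is a toy sketch of the trust structure. It is emphatically not MIKEY-SAKKE itself—the real protocol uses pairing-based identity-based cryptography (SAKKE)—and the identities, master secret, and HMAC construction below are purely illustrative. It shows only the property that matters: when every user’s key is derived by a central authority from one master secret, that authority can reconstruct any key.

    import hashlib
    import hmac

    # Toy model of centralized identity-based key distribution.
    # This is NOT the real MIKEY-SAKKE protocol; it only mimics its trust structure:
    # every user's long-term key is derived from one master secret held by the center.

    MASTER_SECRET = b"held-only-by-the-key-distribution-center"

    def user_key(identity: str) -> bytes:
        # A user's key is a pure function of its identity and the center's master secret.
        return hmac.new(MASTER_SECRET, identity.encode(), hashlib.sha256).digest()

    alice_key = user_key("alice@example.org")

    # Escrow by construction: whoever holds MASTER_SECRET can re-derive alice_key at any
    # time and decrypt recorded traffic retroactively, which also rules out forward secrecy.
    assert user_key("alice@example.org") == alice_key

That is the structural reason the protocol cannot offer perfect forward secrecy: the keys protecting a call can always be regenerated after the fact by whoever holds the center’s secret.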

Steven Murdoch has explained the details. The upshot:

The design of MIKEY-SAKKE is motivated by the desire to allow undetectable and unauditable mass surveillance, which may be a requirement in exceptional scenarios such as within government departments processing classified information. However, in the vast majority of cases the properties that MIKEY-SAKKE offers are actively harmful for security. It creates a vulnerable single point of failure, which would require huge effort, skill and cost to secure – requiring resource beyond the capability of most companies. Better options for voice encryption exist today, though they are not perfect either. In particular, more work is needed on providing scalable and usable protection against man-in-the-middle attacks, and protection of metadata for contact discovery and calls. More broadly, designers of protocols and systems need to appreciate the ethical consequences of their actions in terms of the political and power structures which naturally follow from their use. MIKEY-SAKKE is the latest example to raise questions over the policy of many governments, including the UK, to put intelligence agencies in charge of protecting companies and individuals from spying, given the conflict of interest it creates.

And GCHQ previously rejected a more secure standard, MIKEY-IBAKE, because it didn’t allow undetectable spying.

Both the NSA and GCHQ repeatedly choose surveillance over security. We need to reject that decision.

Posted on January 22, 2016 at 2:23 PM

France Rejects Backdoors in Encryption Products

For the right reasons, too:

Axelle Lemaire, the Euro nation’s digital affairs minister, shot down the amendment during the committee stage of the forthcoming omnibus digital bill, saying it would be counterproductive and would leave personal data unprotected.

“Recent events show how the fact of introducing faults deliberately at the request—sometimes even without knowing—the intelligence agencies has an effect that is harming the whole community,” she said according to Numerama.

“Even if the intention [to empower the police] is laudable, it also opens the door to the players who have less laudable intentions, not to mention the potential for economic damage to the credibility of companies planning these flaws. You are right to fuel the debate, but this is not the right solution according to the Government’s opinion.”

France joins the Netherlands on this issue.

And Apple’s Tim Cook is going after the Obama administration on the issue.

EDITED TO ADD (1/20): In related news, Congress will introduce a bill to establish a commission to study the issue. This is what kicking the can down the road looks like.

Posted on January 20, 2016 at 5:02 AM

Michael Hayden and the Dutch Government Are against Crypto Backdoors

Last week, former NSA Director Michael Hayden made a very strong argument against deliberately weakening security products by adding backdoors:

Americans’ safety is best served by the highest level of technology possible, and the country’s intelligence agencies have figured out ways to get around encryption.

“Before any civil libertarians want to come up to me afterwards and get my autograph,” he explained at a Tuesday panel on national security hosted by the Council on Foreign Relations, “let me tell you how we got around it: Bulk data and metadata [collection].”

Encryption is “a law enforcement issue more than an intelligence issue,” Hayden argued, “because, frankly, intelligence gets to break all sorts of rules, to cheat, to use other paths.”

[…]

“I don’t think it’s a winning hand to attempt to legislate against technological progress,” Hayden said.

[…]

“It’s a combination of technology and business,” Hayden said. “Creating a door for the government to enter, at the technological level, creates a very bad business decision on the parts of these companies because that is by definition weaker encryption than would otherwise be available … Both of those realities are true.”

This isn’t new, and is yet another example of the split between the law-enforcement and intelligence communities on this issue. What is new is Hayden saying, effectively: Hey FBI, you guys are idiots for trying to get back doors.

On the other side of the Atlantic Ocean, the Dutch government has come out against backdoors in security products, and in favor of strong encryption.

Meanwhile, I have been hearing rumors that “serious” US legislation mandating backdoors is about to be introduced. These rumors are pervasive, but without details.

Posted on January 12, 2016 at 1:22 PM
