Entries Tagged "traffic analysis"


The NSA Warns of TLS Inspection

The NSA has released a security advisory warning of the dangers of TLS inspection:

Transport Layer Security Inspection (TLSI), also known as TLS break and inspect, is a security process that allows enterprises to decrypt traffic, inspect the decrypted content for threats, and then re-encrypt the traffic before it enters or leaves the network. Introducing this capability into an enterprise enhances visibility within boundary security products, but introduces new risks. These risks, while not inconsequential, do have mitigations.

[…]

The primary risk involved with TLSI’s embedded CA is the potential abuse of the CA to issue unauthorized certificates trusted by the TLS clients. Abuse of a trusted CA can allow an adversary to sign malicious code to bypass host IDS/IPSs or to deploy malicious services that impersonate legitimate enterprise services to the hosts.

[…]

A further risk of introducing TLSI is that an adversary can focus their exploitation efforts on a single device where potential traffic of interest is decrypted, rather than try to exploit each location where the data is stored. Setting a policy to enforce that traffic is decrypted and inspected only as authorized, and ensuring that decrypted traffic is contained in an out-of-band, isolated segment of the network prevents unauthorized access to the decrypted traffic.

[…]

To minimize the risks described above, breaking and inspecting TLS traffic should only be conducted once within the enterprise network. Redundant TLSI, wherein a client-server traffic flow is decrypted, inspected, and re-encrypted by one forward proxy and is then forwarded to a second forward proxy for more of the same, should not be performed. Inspecting multiple times can greatly complicate diagnosing network issues with TLS traffic. Also, multi-inspection further obscures certificates when trying to ascertain whether a server should be trusted. In this case, the “outermost” proxy makes the decisions on what server certificates or CAs should be trusted and is the only location where certificate pinning can be performed. Finally, a single TLSI implementation is sufficient for detecting encrypted traffic threats; additional TLSI will have access to the same traffic. If the first TLSI implementation detected a threat, killed the session, and dropped the traffic, then additional TLSI implementations would be rendered useless since they would not even receive the dropped traffic for further inspection. Redundant TLSI increases the risk surface, provides additional opportunities for adversaries to gain unauthorized access to decrypted traffic, and offers no additional benefits.
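The certificate-pinning point is worth making concrete. Here is a minimal client-side sketch of pinning (my illustration, not anything from the advisory); the pinned fingerprint is a placeholder, and a real deployment would record the SHA-256 hash of the server's known-good certificate. A TLSI forward proxy that re-signs traffic with its embedded CA presents a different certificate, so the check fails.

import hashlib
import socket
import ssl

PINNED_SHA256 = "0" * 64  # placeholder; pin the known-good certificate's hash

def presented_fingerprint(host: str, port: int = 443) -> str:
    """Return the SHA-256 fingerprint of whatever certificate the path presents."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # we only want the raw certificate bytes,
    ctx.verify_mode = ssl.CERT_NONE  # even if a proxy's certificate would not validate
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der).hexdigest()

if presented_fingerprint("www.example.com") != PINNED_SHA256:
    print("Presented certificate does not match pin -- possible TLS inspection in path")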

Nothing surprising or novel. No operational information about who might be implementing these attacks. No classified information revealed.

News article.

Posted on November 22, 2019 at 6:16 AM

Russians Hack FBI Comms System

Yahoo News reported that the Russians have successfully targeted an FBI communications system:

American officials discovered that the Russians had dramatically improved their ability to decrypt certain types of secure communications and had successfully tracked devices used by elite FBI surveillance teams. Officials also feared that the Russians may have devised other ways to monitor U.S. intelligence communications, including hacking into computers not connected to the internet. Senior FBI and CIA officials briefed congressional leaders on these issues as part of a wide-ranging examination on Capitol Hill of U.S. counterintelligence vulnerabilities.

These compromises, the full gravity of which became clear to U.S. officials in 2012, gave Russian spies in American cities including Washington, New York and San Francisco key insights into the location of undercover FBI surveillance teams, and likely the actual substance of FBI communications, according to former officials. They provided the Russians opportunities to potentially shake off FBI surveillance and communicate with sensitive human sources, check on remote recording devices and even gather intelligence on their FBI pursuers, the former officials said.

It’s unclear whether the Russians were able to recover encrypted data or just perform traffic analysis. The Yahoo story implies the former; the NBC News story says otherwise. It’s hard to tell if the reporters truly understand the difference. We do know, from research Matt Blaze and others did almost ten years ago, that at least one FBI radio system was horribly insecure in practice — but not in a way that breaks the encryption. Its poor design just encourages users to turn off the encryption.

Posted on September 24, 2019 at 6:33 AM

Traffic Analysis of the LTE Mobile Standard

Interesting research in using traffic analysis to learn things about encrypted traffic. It’s hard to know how critical these vulnerabilities are. They’re very hard to close without wasting a huge amount of bandwidth.

The active attacks are more interesting.

EDITED TO ADD (7/3): More information.

I have been thinking about this, and now believe the attacks are more serious than I previously wrote.

Posted on July 2, 2018 at 9:35 AM

Detecting Drone Surveillance with Traffic Analysis

This is clever:

Researchers at Ben Gurion University in Beer Sheva, Israel have built a proof-of-concept system for counter-surveillance against spy drones that demonstrates a clever, if not exactly simple, way to determine whether a certain person or object is under aerial surveillance. They first generate a recognizable pattern on whatever subject — a window, say — someone might want to guard from potential surveillance. Then they remotely intercept a drone’s radio signals to look for that pattern in the streaming video the drone sends back to its operator. If they spot it, they can determine that the drone is looking at their subject.

In other words, they can see what the drone sees, pulling out their recognizable pattern from the radio signal, even without breaking the drone’s encrypted video.

The details have to do with the way drone video is compressed:

The researchers’ technique takes advantage of an efficiency feature streaming video has used for years, known as “delta frames.” Instead of encoding video as a series of raw images, it’s compressed into a series of changes from the previous image in the video. That means when a streaming video shows a still object, it transmits fewer bytes of data than when it shows one that moves or changes color.

That compression feature can reveal key information about the content of the video to someone who’s intercepting the streaming data, security researchers have shown in recent research, even when the data is encrypted.
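The detection then comes down to correlating a pattern you control with the size of the intercepted video stream over time. A toy sketch of the idea, with invented numbers:

flicker = [1, 0, 1, 0, 1, 1, 0, 0, 1, 0]  # on/off pattern we induce on the watched object
bitrate = [900, 310, 870, 300, 880, 910, 320, 290, 860, 305]  # observed KB per interval (made up)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A correlation near 1 means the delta frames grow exactly when our pattern
# changes, i.e., the drone is probably looking at the watched object.
print(round(pearson(flicker, bitrate), 2))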

Research paper and video.

Posted on January 24, 2018 at 5:28 AM

More on the NSA's Use of Traffic Shaping

“Traffic shaping” — the practice of tricking data to flow through a particular route on the Internet so it can be more easily surveilled — is an NSA technique that has gotten much less attention than it deserves. It’s a powerful technique that allows an eavesdropper to get access to communications channels it would otherwise not be able to monitor.

There’s a new paper on this technique:

This report describes a novel and more disturbing set of risks. As a technical matter, the NSA does not have to wait for domestic communications to naturally turn up abroad. In fact, the agency has technical methods that can be used to deliberately reroute Internet communications. The NSA uses the term “traffic shaping” to describe any technical means that deliberately reroutes Internet traffic to a location that is better suited, operationally, to surveillance. Since it is hard to intercept Yemen’s international communications from inside Yemen itself, the agency might try to “shape” the traffic so that it passes through communications cables located on friendlier territory. Think of it as diverting part of a river to a location from which it is easier (or more legal) to catch fish.

The NSA has clandestine means of diverting portions of the river of Internet traffic that travels on global communications cables.

Could the NSA use traffic shaping to redirect domestic Internet traffic — emails and chat messages sent between Americans, say — to foreign soil, where its surveillance can be conducted beyond the purview of Congress and the courts? It is impossible to categorically answer this question, due to the classified nature of many national-security surveillance programs, regulations and even of the legal decisions made by the surveillance courts. Nevertheless, this report explores a legal, technical, and operational landscape that suggests that traffic shaping could be exploited to sidestep legal restrictions imposed by Congress and the surveillance courts.

News article. NSA document detailing the technique with Yemen.

This work builds on previous research that I blogged about here.

The fundamental vulnerability is that routing information isn’t authenticated.
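To make that concrete: BGP announcements are accepted largely on trust, so monitoring for this kind of rerouting often amounts to watching whether a prefix you care about is suddenly originated by an unexpected autonomous system. A minimal sketch (not any particular tool; the prefix and AS numbers are example values, and real announcements would come from a BGP feed or route collector):

EXPECTED_ORIGINS = {
    "192.0.2.0/24": {64500},  # monitored prefix -> legitimate origin ASNs
}

def check_announcement(prefix, as_path):
    expected = EXPECTED_ORIGINS.get(prefix)
    if expected is None:
        return None              # not a prefix we watch
    origin = as_path[-1]         # the origin AS is the last ASN in the AS path
    if origin not in expected:
        return f"ALERT: {prefix} originated by AS{origin}, expected {sorted(expected)}"
    return None

# A rerouted or hijacked announcement trips the alert:
print(check_announcement("192.0.2.0/24", [64496, 64497, 64511]))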

Posted on July 12, 2017 at 6:32 AM

Disguising Exfiltrated Data

There’s an interesting article on a data exfiltration technique.

What was unique about the attackers was how they disguised traffic between the malware and command-and-control servers using Google Developers and the public Domain Name System (DNS) service of Hurricane Electric, based in Fremont, Calif.

In both cases, the services were used as a kind of switching station to redirect traffic that appeared to be headed toward legitimate domains, such as adobe.com, update.adobe.com, and outlook.com.

[…]

The malware disguised its traffic by including forged HTTP headers of legitimate domains. FireEye identified 21 legitimate domain names used by the attackers.

In addition, the attackers signed the Kaba malware with a legitimate certificate from a group listed as the “Police Mutual Aid Association” and with an expired certificate from an organization called “MOCOMSYS INC.”

In the case of Google Developers, the attackers used the service to host code that decoded the malware traffic to determine the IP address of the real destination and redirect the traffic to that location.

Google Developers, formerly called Google Code, is the search engine’s website for software development tools, APIs, and documentation on working with Google developer products. Developers can also use the site to share code.

With Hurricane Electric, the attacker took advantage of the fact that its domain name servers were configured so that anyone could register for a free account with the company’s hosted DNS service.

The service allowed anyone to register a DNS zone, which is a distinct, contiguous portion of the domain name space in the DNS. The registrant could then create A records for the zone and point them to any IP address.
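One way defenders can look for this particular disguise is to check whether the Host header in a flow actually resolves to the IP address the traffic is going to. A rough sketch of that heuristic (my illustration, not FireEye's detection logic; CDNs and shared hosting will produce false positives):

import socket

def host_matches_destination(host_header, dest_ip):
    """Return True if dest_ip is among the addresses the Host header resolves to."""
    try:
        _, _, addresses = socket.gethostbyname_ex(host_header)
    except socket.gaierror:
        return False  # an unresolvable Host header is itself suspicious
    return dest_ip in addresses

# Traffic claiming to be for update.adobe.com but sent to 203.0.113.7 (a
# documentation address standing in for a command-and-control server) is flagged.
if not host_matches_destination("update.adobe.com", "203.0.113.7"):
    print("Host header does not match destination -- possibly disguised traffic")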

Honestly, this looks like a government exfiltration technique, although it could be evidence that the criminals are getting even more sophisticated.

Posted on August 21, 2014 at 6:08 AM

Evading Internet Censorship

This research project by Brandon Wiley — the tool is called “Dust” — looks really interesting. Here’s the description of his Defcon talk:

Abstract: The greatest danger to free speech on the Internet today is filtering of traffic using protocol fingerprinting. Protocols such as SSL, Tor, BitTorrent, and VPNs are being summarily blocked, regardless of their legal and ethical uses. Fortunately, it is possible to bypass this filtering by reencoding traffic into a form which cannot be correctly fingerprinted by the filtering hardware. I will be presenting a tool called Dust which provides an engine for reencoding traffic into a variety of forms. By developing a good model of how filtering hardware differentiates traffic into different protocols, a profile can be created which allows Dust to reencode arbitrary traffic to bypass the filters.

Dust is different than other approaches because it is not simply another obfuscated protocol. It is an engine which can encode traffic according to the given specifications. As the filters change their algorithms for protocol detection, rather than developing a new protocol, Dust can just be reconfigured to use different parameters. In fact, Dust can be automatically reconfigured using examples of what traffic is blocked and what traffic gets through. Using machine learning a new profile is created which will reencode traffic so that it resembles that which gets through and not that which is blocked. Dust has been created with the goal of defeating real filtering hardware currently deployed for the purpose of censoring free speech on the Internet. In this talk I will discuss how the real filtering hardware works and how to effectively defeat it.
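The core idea, reencoding traffic so that nothing a DPI box can fingerprint survives on the wire, can be illustrated with a toy transform. This is not Dust itself, just a sketch of the concept: XOR the payload with a keystream derived from a shared secret, so the bytes on the wire carry no recognizable protocol structure or byte distribution.

import hashlib

def keystream(secret, length):
    """Derive a pseudorandom keystream from a shared secret (toy construction)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(secret + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def reencode(payload, secret):
    """XOR with the keystream; applying the same transform twice recovers the original."""
    return bytes(p ^ k for p, k in zip(payload, keystream(secret, len(payload))))

secret = b"shared-secret"
wire = reencode(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n", secret)
assert reencode(wire, secret).startswith(b"GET")  # symmetric: decoding is the same operation

This hides the protocol fingerprint from a filter that doesn't know the secret; the real system also has to deal with packet lengths and timing, and a production design would use a proper cipher rather than this sketch.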

EDITED TO ADD (9/11): Papers about Dust. Dust source code.

Posted on August 28, 2013 at 7:07 AM

More on NSA Data Collection

There’s an article from Wednesday’s Wall Street Journal that gives more details about the NSA’s data collection efforts.

The system has the capacity to reach roughly 75% of all U.S. Internet traffic in the hunt for foreign intelligence, including a wide array of communications by foreigners and Americans. In some cases, it retains the written content of emails sent between citizens within the U.S. and also filters domestic phone calls made with Internet technology, these people say.

[…]

The programs, code-named Blarney, Fairview, Oakstar, Lithium and Stormbrew, among others, filter and gather information at major telecommunications companies. Blarney, for instance, was established with AT&T Inc….

This filtering takes place at more than a dozen locations at major Internet junctions in the U.S., officials say. Previously, any NSA filtering of this kind was largely believed to be happening near points where undersea or other foreign cables enter the country.

[…]

The systems operate like this: The NSA asks telecom companies to send it various streams of Internet traffic it believes most likely to contain foreign intelligence. This is the first cut of the data. These requests don’t ask for all Internet traffic. Rather, they focus on certain areas of interest, according to a person familiar with the legal process. “It’s still a large amount of data, but not everything in the world,” this person says.

The second cut is done by NSA. It briefly copies the traffic and decides which communications to keep based on what it calls “strong selectors”—say, an email address, or a large block of computer addresses that correspond to an organization it is interested in. In making these decisions, the NSA can look at content of communications as well as information about who is sending the data. One U.S. official says the agency doesn’t itself “access” all the traffic within the surveillance system. The agency defines access as “things we actually touch,” this person says, pointing out that the telecom companies do the first stage of filtering.

The surveillance system is built on relationships with telecommunications carriers that together cover about 75% of U.S. Internet communications. They must hand over what the NSA asks for under orders from the secret Foreign Intelligence Surveillance Court. The firms search Internet traffic based on the NSA’s criteria, current and former officials say.
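The "strong selector" filtering described above is conceptually simple. A miniature sketch, with invented selector values:

import ipaddress

EMAIL_SELECTORS = {"target@example.org"}                       # invented example
NETWORK_SELECTORS = [ipaddress.ip_network("198.51.100.0/24")]  # invented example

def matches_selectors(sender, source_ip):
    """Keep a communication only if it hits an email or address-block selector."""
    if sender in EMAIL_SELECTORS:
        return True
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in NETWORK_SELECTORS)

print(matches_selectors("alice@example.com", "198.51.100.42"))  # True: address block
print(matches_selectors("bob@example.net", "203.0.113.5"))      # False: no selector hit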

The NSA seems to have finally found a PR agency with a TS/SI clearance, since there was a response to this story. They’ve also had a conference call with the press, and the Director of National Intelligence is on Twitter and Tumblr.

I am completely croggled by the fact that the NSA apparently had absolutely no contingency plans for this sort of thing.

Posted on August 27, 2013 at 1:19 PM
