Another Side Channel in Intel Chips

Not that serious, but interesting:

In late 2011, Intel introduced a performance enhancement to its line of server processors that allowed network cards and other peripherals to connect directly to a CPU’s last-level cache, rather than following the standard (and significantly longer) path through the server’s main memory. By avoiding system memory, Intel’s DDIO — short for Data-Direct I/O — increased input/output bandwidth and reduced latency and power consumption.

Now, researchers are warning that, in certain scenarios, attackers can abuse DDIO to obtain keystrokes and possibly other types of sensitive data that flow through the memory of vulnerable servers. The most serious form of attack can take place in data centers and cloud environments that have both DDIO and remote direct memory access enabled to allow servers to exchange data. A server leased by a malicious hacker could abuse the vulnerability to attack other customers. To prove their point, the researchers devised an attack that allows a server to steal keystrokes typed into a protected SSH (secure shell) session established between another server and an application server.
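The keystroke-recovery step works because an interactive SSH session sends one encrypted packet per keypress, so anyone who can observe packet arrival times — here, via the cache side channel — learns the inter-keystroke intervals, which correlate with which key pairs were typed. A minimal sketch of that timing step in Python, using made-up timestamps (the `arrivals` data is purely hypothetical):

```python
# Sketch of the keystroke-timing step: given observed packet arrival times
# (one packet per keypress in interactive SSH), recover the inter-keystroke
# intervals that a timing attack would feed into its statistical model.
from typing import List

def interarrival_times(arrivals_ms: List[float]) -> List[float]:
    """Intervals between consecutive observed packets, in milliseconds."""
    return [b - a for a, b in zip(arrivals_ms, arrivals_ms[1:])]

# Hypothetical timestamps: short gaps tend to indicate same-hand digraphs,
# longer gaps suggest hand alternation or a reach to a distant key.
arrivals = [0.0, 95.0, 310.0, 405.0, 790.0]
print(interarrival_times(arrivals))  # [95.0, 215.0, 95.0, 385.0]
```

The intervals themselves, not the packet contents, are the leak: SSH encrypts the payload, but not the timing.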

Posted on September 16, 2019 at 6:39 AM • 19 Comments


parabarbarian September 16, 2019 9:08 AM

An exploit that lets an attacker capture keystrokes from an ssh session seems pretty serious to me.

Bob September 16, 2019 9:13 AM

Between this, Spectre, and Meltdown, it’s looking more and more like Intel’s performance edge over AMD was accomplished by throwing out any and all semblance of secure operation.

Clive Robinson September 16, 2019 10:56 AM

@ Bob,

I suspect you are not the only person who thinks Intel have “Scr3w3d the pooch” over their incessant spec-manship (a look at UK “bus stop” advertising shows half-door-sized adverts with such nonsense).

But as we keep getting told “security does not sell” and “nobody cares about security”, it’s hard for any security-minded types to make much mileage where the decisions are made.

Alyer Babtu September 16, 2019 12:17 PM

Re: security does not sell

My solution: the tech analogue of the Chinese scholar-administrators refusing to come out of their caves to serve a new ruler they disapproved of. The hardware and software devs go the extra mile and design for security anyway, but don’t tell management, describing it generically as “technical improvements”.

Sigh …

Pit September 16, 2019 12:55 PM

“Not that serious, …”
Why not? In terms of residual risk, probably; but IMHO both “features” are welcome in large, multitenant environments, so they won’t switch them off…

Petre Peter September 16, 2019 7:40 PM

Yes, it’s not stealing, it’s making a guess based on a specific language. Anyway, I thought one “s” stands for “secure” in SSH.

ITguy September 17, 2019 5:17 AM

As far as I know, they only got the delay between keystrokes. I said “only”, but a study from 200? managed to narrow the possible password space to 1/50 of the original pool. I believe it would work for dictionary attacks as well. It’s not a game changer, but many services depend on VM isolation. In the last few years, Intel has managed to erode the very foundation of virtualization, cloud services, and all server-client systems. I hope these ripples won’t come back as a tsunami.
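To put that “1/50 of the original pool” figure in perspective: shrinking the candidate set by a factor k hands the attacker log2(k) bits of the password’s entropy. A quick back-of-envelope in Python (the 8-character printable-ASCII pool is just an illustrative assumption, not a number from the study):

```python
import math

def bits_saved(reduction_factor: float) -> float:
    """Entropy (in bits) gained by shrinking the search space by this factor."""
    return math.log2(reduction_factor)

original_pool = 95 ** 8              # assumed: 8 printable-ASCII characters
narrowed_pool = original_pool // 50  # the 1/50 reduction from timing data

print(f"bits gained: {bits_saved(50):.2f}")           # about 5.64 bits
print(f"{original_pool:.3g} -> {narrowed_pool:.3g} candidates")
```

Roughly 5.6 bits shaved off a brute-force search is not decisive on its own, but combined with a dictionary it compounds quickly.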


Rj Brown September 17, 2019 8:01 AM

This feature should be turned off in a multi-customer leased-VM environment for security reasons, but in a case where multiple servers are all operated (perhaps in a cluster) on behalf of a single owner or customer, and isolated behind a suitable firewall from the rest of the world, it could indeed provide a valuable performance enhancement. Think of a Hadoop cluster operated for a single customer, or an embedded application either air-gapped or firewalled from everything else. It looks like a valuable tool that must be operated in a responsible manner. If it were a prescription drug in the US, it would probably need a “black box” warning label.

Clive Robinson September 18, 2019 8:54 AM

@ Ali,

The Packet Chasing attack exploits a similar vulnerability to achieve web fingerprinting that works even without DDIO.

The paper is an interesting read.

The actual attack, though, reminds me of the cache-based attacks against AES, going back about two decades to the end of 2000, shortly after the finalist had been selected.

Back then, time-based side channels, whilst not unknown, were considered by many –incorrectly– as “theoretical attacks”.

It is interesting to note that some people believed at the time, and presumably still do, that the NSA were very well aware of time-based side channels, and deliberately kept quiet about them during the AES contest.

Thus I guess the same question is going to arise about Intel… That is, is this a bug that got in due to Intel taking short cuts? Or, more likely, are they deliberately putting in such low-level attack points –knowing the end user can’t do anything to stop them– as potential attack vectors that favour the SigInt entities?

RealFakeNews September 18, 2019 9:36 AM

I must clarify: I’m not suggesting that Intel are in some way responsible for the DMA vulnerability, but rather that they should know that allowing any hardware to access any memory it wants, without making checks, is a bad idea at best, remembering it can write as well as read.

Who? September 18, 2019 12:16 PM

For those who think this side channel is not a serious one, I would suggest reading “Silence on the Wire,” by Michal Zalewski. Like other hardware side channels, it is a pretty serious one.

Like Clive Robinson, I am seriously considering the possibility of deliberate weaknesses being implanted in hardware. My first suspicion was ten years ago; it is the very reason I bought my last workstation –in November 2017– with an i5 processor that had four real cores. Obviously, disabling any trace of simultaneous multithreading was not enough, as I discovered a month later. In most cases there is nothing we can do at the operating-system level to stop them. At least for NetCAT there is a chance of stopping it, if the BIOS allows Intel’s Data Direct I/O (DDIO) to be entirely disabled.

K.S. September 18, 2019 12:47 PM

I wonder why all these issues are being disclosed only now. I heard unconfirmed rumors of various vulnerabilities/backdoors in the Intel Management Engine for at least a decade. Why did it take so long for these to come to light? Was there a new breakthrough in tools that allowed the research to be conducted, or was Intel somehow successful in suppressing reports?

Clive Robinson September 18, 2019 4:41 PM

@ K.S.,

Was there a new breakthrough in tools that allowed the research to be conducted, or was Intel somehow successful in suppressing reports?

No, there was no new breakthrough in tools, and we’ve been aware that Intel CPUs had issues for at least thirty years or so.

The real answer is that although plenty of hardware engineers –myself included– were aware there were issues, as Intel used to publish “hardware errata” for each current chip “step”, the errata were at best very vague and abstract, and not easy to get hold of; then there were NDA issues on top of that. All of which I guess you could call “lying by omission”.

Further, most hardware engineers are creative at heart and want to make things, and for various reasons[1] hardware “security research” did not have job prospects. So the reality is that next to no hardware engineers went looking to turn hardware bugs into attack vectors until recently.

Likewise, in the open security research area, both commercial and academic, software bugs were not only widely known, they were easier to find, as the methods were “known quantities”. So it was seen as a much easier way “to make a name for yourself” or “earn a living with”, as it’s a very target-rich environment (which is why making a name for yourself is now not so easy in software security research).

We cannot say what the closed security research area, such as the various SigInt agencies, was up to. Put simply, in the US simple things like TEMPEST protection materials were in effect made illegal to have, as they were controlled substances, and research was likewise in theory “classified”. Even in Europe, where things were more open, such work was heavily discouraged[2] and “the doors only turned one way” at the best of times.

If you actually go for a little dig into Meltdown, you will see that several people started looking at about the same time, mostly as “private research” in their own time… That is, the idea had germinated and hatched. When I read about it I knew that not only had it come of age, but also, due to the noise it made, it was going to be the next “make your name” field of research; thus I christened it “The Xmas gift that would keep giving” and indicated we would get a lot more discovered for three to five years.

Part of the reason this sort of leading-edge research is done in an “in our own time with our own gear” way is that researchers have to eat like the rest of us. Thus their professional research is to a certain extent dictated by the funding available. For obvious reasons, in the same way the software industry used to attack software vulnerability researchers with lawyers, the hardware industry is not going to fund research unless it’s locked up under NDA or worse. I got out of the “officially secret” defence industry early on, despite a promising career, because certain things were causing me nightmares, one of which we now call “supply chain poisoning”, and there are a few others still crawling out of that Pandora’s box.

[1] Hardware engineers tend to spend much longer in training, and there is a prevalent ethos of not “rocking the boat”, or more correctly “not killing the goose that lays the golden eggs”. I’ve mentioned this before, as I ran afoul of it years ago when I showed an engineering manager how easy it was to fake fingerprints to get past the biometric fingerprint readers we were designing. Put simply, I had to go find another job without a reference… I likewise got the same “don’t kill the goose” attitude from people when I found a major but obvious hole in DNA testing; it was actively denied until an Australian researcher got the attention of ABC and people could no longer deny it. But they still prevaricated, and the chances are that the older, cheatable DNA test system will still be used…

[2] As an example, I discovered independently, decades ago, just how easy it was to spoof the GPS system, and gave some military people a demonstration; I got given the “keep it quiet whilst we investigate it further” nonsense. I also discovered independently, back in the 1980s, just how susceptible microprocessors were to EM radiation. You could cause them to go crazy with just a low-power, lowish-frequency unmodulated RF carrier. Heck, even a handheld transceiver with around 1 watt at VHF PMR would do it up to a couple of feet away. I however took it a lot further, using tricks such as “cross modulation” to work out what the code on the CPU was doing[3], and also using it to trigger modulation that caused program execution to be changed in a fairly reliable way. Again it got me into trouble: firstly with a pocket gambling machine where I could make it win way, way more than it should; secondly with an early “electronic wallet”. It was a few years later that the “smart card” / SIM people went firstly into denial and then into doing something about it in all the wrong ways. People talk about “DPA” from nearly two decades later, but that still required power-supply pin access, whereas my discovery could not just detect the on-chip signals, it could, unlike DPA, change the execution[4]… However, a certain person decided to patent DPA research and got such broad claims past the examiner that he in effect killed further research…

[3] Actually, determining what was going on with code in computers kind of started when I was either just a twinkle in my father’s eye or in nappies. Back then it was “Big Iron”, actually made of individual transistors, at clock speeds down at or below medium-wave frequencies and instruction cycle times down in the hundred-kilohertz range. That meant execution loops etc. were down in the audio range, so an ordinary AM radio would pick the signals up and make a garish cacophony of buzzsaw-like noise laid over an ordered series of beeps and whistles as the code progressed. If, however, the software got stuck in a loop, a clear tone or whistle would become predominant. Thus “listening in” was used by a number of computer operators, until clock speeds went up and the signals went beyond what a medium-wave receiver and human ear could pick up.

[4] As I said, I independently discovered the use of EM fields to cause problems in CPUs back last century, in my own time, with my own equipment, out of curiosity. However, the first published academic research on it I’m aware of was by a couple of researchers at the UK Cambridge Computer Labs a decade into this century, some quarter of a century or so later. They took a commercial IBM 32-bit true random number generator used in ATMs and, when “illuminating” it with an RF carrier, caused the entropy to fall from 32 bits to just below 8 bits. That is, the range dropped from one in four billion to around one in two hundred and twenty, thus making a “guessing attack” entirely feasible.
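The arithmetic behind that entropy drop is worth spelling out; a small Python check (the 7.8-bit figure is an assumed reading of “just below 8 bits”, not a number taken from the paper):

```python
# Search-space sizes implied by the entropy figures quoted above: n bits of
# entropy means 2**n roughly equally likely outputs for an attacker to guess.
full_entropy_bits = 32
degraded_entropy_bits = 7.8       # assumed reading of "just below 8 bits"

full_space = 2 ** full_entropy_bits
degraded_space = 2 ** degraded_entropy_bits

print(full_space)              # 4294967296 ("one in four billion")
print(round(degraded_space))   # 223 (roughly "one in two hundred and twenty")
```

A space of a couple of hundred values can be exhausted in a handful of transactions, which is why the guessing attack becomes feasible.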

Since then there has been little or nothing done, especially in “active fault injection attacks”. I suspect nothing will be until someone demonstrates it in a way that makes a lot of buzz in the industry; then it will become the new field of research to make your name in for a couple of years.

babyboy September 20, 2019 8:38 AM

How long until Universal Read Gadget?

Or rather, how long until it is disclosed that the NSA has had one for years…

TRX September 20, 2019 10:09 AM

In S.M. Stirling’s alternate-history novel “The Stone Dogs”, the Draka civilization is crippled by back-doored microprocessor designs. Stirling called it an “infovirus”, but it depended on compromised processors to run.

That was written in 1990.

I laughed; nobody could possibly be so stupid as to leave that sort of security hole in critical infrastructure…

