Apple's Tracking-Prevention Feature in Safari Has a Privacy Bug

Last month, engineers at Google published details of a very curious privacy bug in Apple’s Safari web browser. Apple’s Intelligent Tracking Prevention, a feature designed to reduce user tracking, has vulnerabilities that themselves allow user tracking. Some details:

ITP detects and blocks tracking on the web. When you visit a few websites that happen to load the same third-party resource, ITP detects the domain hosting the resource as a potential tracker and from then on sanitizes web requests to that domain to limit tracking. Tracker domains are added to Safari’s internal, on-device ITP list. When future third-party requests are made to a domain on the ITP list, Safari will modify them to remove some information it believes may allow tracking the user (such as cookies).
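The detection-and-sanitization behavior described above can be sketched in a few lines. This is a toy model, not Apple's implementation: the class names, the threshold of three distinct first parties, and the domain names are all assumptions for illustration.

```python
# Toy model of ITP-style tracker detection. The threshold and all names
# are assumptions for illustration, not Apple's actual implementation.
from collections import defaultdict

TRACKER_THRESHOLD = 3  # assumed: distinct first parties before a domain is flagged

class TrackerDetector:
    def __init__(self):
        # third-party domain -> set of first-party sites that loaded it
        self.seen_on = defaultdict(set)
        self.itp_list = set()  # per-user, on-device state

    def observe(self, first_party: str, third_party: str) -> None:
        """Record that `first_party` loaded a resource from `third_party`."""
        if third_party == first_party:
            return
        self.seen_on[third_party].add(first_party)
        if len(self.seen_on[third_party]) >= TRACKER_THRESHOLD:
            self.itp_list.add(third_party)

    def sanitize(self, third_party: str, request: dict) -> dict:
        """Strip identifying state (cookies) from requests to flagged domains."""
        if third_party in self.itp_list:
            return {k: v for k, v in request.items() if k != "cookies"}
        return request

detector = TrackerDetector()
for site in ["news.example", "shop.example", "blog.example"]:
    detector.observe(site, "ads.example")

print("ads.example" in detector.itp_list)  # True: seen on 3 distinct sites
print(detector.sanitize("ads.example", {"cookies": "id=42"}))  # cookies stripped
```

The key point for what follows is that `itp_list` is state that is unique to each user, because it is built from that user's own browsing.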


The details are surprising, because it turns out that ITP itself could effectively be used for:

  • information leaks: detecting websites visited by the user (web browsing history hijacking, stealing a list of visited sites)
  • tracking the user with ITP, making the mechanism function like a cookie
  • fingerprinting the user: in ways similar to the HSTS fingerprint, but perhaps a bit better
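The second item, using ITP itself as a cookie, can be simulated end to end. The idea: an attacker who controls several domains can deliberately force some of them onto a user's ITP list (writing bits), and later probe which of those domains get their requests sanitized (reading bits back). The sketch below is a toy simulation under assumed names and an assumed detection threshold; it is not the attack code Google published.

```python
# Toy simulation of the "ITP list as a cookie" attack. Domain names and
# the detection threshold are assumptions for illustration.
TRACKER_THRESHOLD = 3

class SafariModel:
    """Minimal per-user ITP state: a set of flagged tracker domains."""
    def __init__(self):
        self.hits = {}
        self.itp_list = set()

    def load_from(self, first_party, third_party):
        sites = self.hits.setdefault(third_party, set())
        sites.add(first_party)
        if len(sites) >= TRACKER_THRESHOLD:
            self.itp_list.add(third_party)

    def cookies_stripped(self, third_party):
        # Observable side channel: are requests to this domain sanitized?
        return third_party in self.itp_list

def write_id(browser, user_id, n_bits=8):
    # For each 1-bit i, force the attacker-controlled domain
    # bit<i>.evil.example onto the ITP list by referencing it from
    # enough distinct first parties.
    for i in range(n_bits):
        if (user_id >> i) & 1:
            for j in range(TRACKER_THRESHOLD):
                browser.load_from(f"site{j}.evil.example", f"bit{i}.evil.example")

def read_id(browser, n_bits=8):
    # Recover the ID by probing which domains have their cookies stripped.
    return sum(browser.cookies_stripped(f"bit{i}.evil.example") << i
               for i in range(n_bits))

browser = SafariModel()
write_id(browser, 0b10110101)
print(bin(read_id(browser)))  # 0b10110101: the ITP state acts like a cookie
```

The same read-back primitive explains the history-leak item: probing whether a domain is on the list reveals something about where the user has been.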

I am sure we all agree that we would not expect a privacy feature meant to protect against tracking to effectively enable tracking, or to accidentally allow any website to steal its visitors’ web browsing history. But web architecture is complex, and that is exactly what happened here.

Apple fixed this vulnerability in December, a month before Google published its findings.

If there’s any lesson here, it’s that privacy is hard—and that privacy engineering is even harder. It’s not that we shouldn’t try, but we should recognize that it’s easy to get it wrong.

Posted on February 10, 2020 at 6:06 AM


me February 10, 2020 8:24 AM

If the problem comes from the fact that state unique to each user exists, why not compile the list of trackers centrally and give every user the same list? It would work equally well for every user, too.
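The commenter's alternative can be made concrete with a small sketch (the list contents and names here are made up). Because every browser applies the same global list, the blocking decision is a pure function of the list and the domain, independent of any individual's browsing history, so probing it reveals nothing about a particular user.

```python
# Sketch of a shared, curated tracker list (contents made up for
# illustration). Identical for every user, so there is no per-user
# state to read back.
SHARED_TRACKER_LIST = {"ads.example", "metrics.example"}

def sanitize(third_party: str, request: dict) -> dict:
    # Pure function of (global list, domain): no per-user state involved.
    if third_party in SHARED_TRACKER_LIST:
        return {k: v for k, v in request.items() if k != "cookies"}
    return request

# Two different users get identical behavior, whatever their history.
print(sanitize("ads.example", {"cookies": "id=1"}))   # cookies stripped for everyone
print(sanitize("blog.example", {"cookies": "id=2"}))  # untouched for everyone
```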

But I remember reading somewhere on Twitter: if your solution to a security problem is “build a giant database” or “don’t do/use it”, you have failed badly at security.

And I agree.

Clive Robinson February 10, 2020 10:28 AM

@ Bruce,

If there’s any lesson here, it’s that privacy is hard — and that privacy engineering is even harder.

The underlying problem as is often the case these days is,

    Exceptions affecting behaviour

The moment you treat any entity differently from others, it becomes an exception. The more exceptions you have, the more identifiable you become in some way.

If a lot of people have the same exceptions, they have declared themselves part of a “group”. If a person has a unique series of exceptions, then they have given themselves a unique identity by which they can be recognized, or a “fingerprint”.

The finer-grained the rules of exception, the larger the amount of information leaked.
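This point can be quantified: membership in a group of n users out of a population of N leaks log2(N/n) bits of identifying information. The sketch below uses made-up numbers to show how finer exception sets shrink the group and raise the leaked bits.

```python
# Toy illustration: finer exception sets mean smaller groups and more
# identifying bits. All population numbers are made up.
import math
from collections import Counter

# Each user's browser state, modeled as a frozenset of "exceptions".
users = (
    [frozenset()] * 900                                          # default config: big crowd
    + [frozenset({"adblock"})] * 90                              # one common exception
    + [frozenset({"adblock", "itp:evil.example"})] * 9           # rarer combination
    + [frozenset({"adblock", "itp:evil.example", "font:rare"})]  # unique: a fingerprint
)

counts = Counter(users)
for config, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    # Surprisal: being one of n users out of len(users) leaks
    # log2(total / n) bits of identifying information.
    bits = math.log2(len(users) / n)
    print(f"{len(config)} exceptions -> group of {n:4d} -> {bits:.1f} bits leaked")
```

The user with the unique three-exception set leaks log2(1000) ≈ 10 bits: enough, in this toy population, to identify them outright.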

Thus to minimise the risk you have to as a minimum,

1, Stop exception information leaking.
2, Prevent external entities linking your activities.

Whilst there are partial solutions to the first problem, they tend to be negated by the second problem over which you have little or no effective control.

Whilst not quite “Traffic Analysis 101”, the problem is that one privacy measure negates another privacy measure. There will be a sweet spot, but the question is whether it will be sufficient, and the answer is probably no, especially with a unique list. Thus a “group” offers a degree more privacy, but still leaks information.

Jon February 10, 2020 11:48 PM

I suspect the problem was, fundamentally, trying to make it ‘intelligent’. Turns out it was, in some ways, kinda dumb. But the complexity made that unclear, until pointed out. I have no doubt there are other ways that it’s defective (possibly by design).

Keeping it simple is best – until you run into edge cases where certain sites just won’t work if you turn off tracking, and other sites you deliberately want to let track, and some other tracking that maybe other people want?

Note that the application was designed to ‘reduce’, not ‘eliminate’, tracking…

As Mr. Schneier pointed out, ‘privacy is hard’ – especially when some of your revenue depends upon breaking other people’s privacy.


