Hacking HTTP Status Codes

One website can learn if you’re logged into other websites.

When you visit my website, I can automatically and silently determine if you’re logged into Facebook, Twitter, Gmail and Digg. There are almost certainly thousands of other sites with this issue too, but I picked a few vulnerable, well-known ones to get your attention. You may not care that I can tell you’re logged into Gmail, but would you care if I could tell you’re logged into one or more porn or warez sites? Perhaps http://oppressive-regime.example.org/ would like to collect a list of their users who are logged into http://controversial-website.example.com/?
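The probe behind this is tiny. A minimal sketch of the technique (the URL and resource are hypothetical; real sites differ in which login-restricted resources they expose): the attacker's page loads a resource the target site only serves to logged-in users and watches whether the load succeeds.

```javascript
// Pure decision logic: interpret the outcome of a probe.
function interpretProbe(outcome) {
  // 'loaded'  -> the restricted resource was served: user is logged in.
  // 'errored' -> the server returned an error or a login redirect:
  //              user is (probably) logged out.
  return outcome === 'loaded' ? 'logged-in' : 'logged-out';
}

// Browser-side wiring (only runs where a DOM Image exists): fire the
// probe with an <img> element and route the result through the logic above.
function probe(url, report) {
  if (typeof Image === 'undefined') return; // needs a browser DOM
  const img = new Image();
  img.onload = () => report(interpretProbe('loaded'));
  img.onerror = () => report(interpretProbe('errored'));
  img.src = url; // e.g. an avatar image only served to logged-in users
}
```

Note that nothing here reads data out of the target site; the leak is purely the success or failure of the load.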

Posted on February 2, 2011 at 2:26 PM · 59 Comments

Comments

Kerrek SB February 2, 2011 2:46 PM

This is yet another repercussion of the incredibly short-sighted design of the whole JavaScript+DOM integration in clients. Surely there are countless more ways in which the arbitrary code downloaded from the internet and executed in the local context (able to communicate back to the internet, of course) will be able to expose client data. It’ll just keep coming. There isn’t even anyone to blame.

For now, JavaScript just stays off by default in my browser. It is really time for a complete redesign of a browser API that’s exposed to untrusted scripts! (Any ideas for that, Bruce?)

Eric Johnson February 2, 2011 2:56 PM

I keep telling people to use Firefox, and to install the extensions NoScript, RequestPolicy, and BetterPrivacy.

Stops this attack, and pretty much every other browser-based drive-by attack in its tracks.

Petréa Mitchell February 2, 2011 3:02 PM

Interesting that the Facebook/Twitter version requires JavaScript to be enabled for those sites. Even among the more security-savvy, those are likely to be in some people’s always-on lists.

Kerrek SB February 2, 2011 3:04 PM

Well, I’ve been thinking about this for some time actually. As I see it, there are two simple steps that would probably eradicate 99% of all popular browser-based ‘attacks’:

1) A client-side scripting API that obeys the simple rule: “No network activity shall ever result from any script execution.” I still like JS calendars and form validators and fading tab bars, but none of those need to use the networking component of the browser.

2) An active “same origin” policy togglably built into the browser. That is, when toggled (e.g. by default, overridable per site), the browser shall not load any resources from a domain different from that of the main document. Why doesn’t that exist?

Michael Taylor February 2, 2011 3:15 PM

2) An active “same origin” policy togglably built into the browser.
That is, … the browser shall not load any resources
from a domain different from that of the main
document. Why doesn’t that exist?

Because it could break a large percentage of Content Delivery Network (CDN) usage? That nearly all very large sites use to move media-rich content closer to the end-user in order to reduce delays caused by long hauling of popular, often large, files or media.

Second, the attacker would simply create broken domain-name records that get parsed so as to pass the security check, yet are served from the malicious site rather than the “trusted” or “target” site.

Ali February 2, 2011 3:21 PM

Isn’t this a kind of fear mongering? So what if you can tell I’m logged into Facebook? Who isn’t? And how do you know who it is, as in the individual? All you know is that the visiting IP is logged in.

Bob T February 2, 2011 3:22 PM

My chrome and microsoft browsers both give an “untrusted certificate” message for the cited website – grepular.com. I’m getting this for a lot of websites lately. Could a problem on my own machine be causing these?

Justin Long February 2, 2011 3:24 PM

As far as “NoScript” and all that jazz ends up going, it’s the equivalent of locking yourself in the basement with a gun.

You’re not experiencing even a remotely good version of the internet, just some tiny part.

I might be biased, being an ECMAScript (which is the standardized name for JavaScript) developer, but people who turn off JS support are losing out on the internet as a whole, and might as well use Lynx.

NobodySpecial February 2, 2011 3:42 PM

Interesting that this is a side effect of a security feature – restricting content to logged in users.

If you didn’t read the article – what it does is try and get you to read a resource from a site like Gmail where resources can be restricted to logged in users. If this works then you are logged in – it’s not reading information from your browser directly.

Kerrek SB February 2, 2011 3:47 PM

Because it could break a large percentage of Content Delivery Network (CDN)
usage? That nearly all very large sites use to move media-rich content
closer to the end-user in order to reduce delays caused by long hauling of popular,
often large, files or media.

Sure, but it’d be up to those large websites to give their CDN servers sensible DNS records. If you are http://www.example.com, just make sure that eu.cdn.example.com and au.cdn.example.com point to the right thing. Even if same-origin applied to *.example.com, it’d already suppress a lot of these problems of sneaking IMG elements into a website that contact a dodgy tracking server, no?

Ben T February 2, 2011 3:48 PM

oppressive-regime.example.org would need only to pass sneaky-executive-order.example.org forcing all-isps.example.net to make a backdoor for .. [connection-lost]

Phil P February 2, 2011 3:50 PM

Bob T: Mike’s site uses a cert from CA Cert, which is not in most browsers as a “trusted” CA. Their trust model is based on a web-of-trust of trained assurers and IMO gives better identity assurance than you can expect out of most CAs who are bundled.

max630 February 2, 2011 3:58 PM

After the CSS history vulnerability was discovered (by the way, it is not and probably will not be fixed for Firefox < 4), it looks like the JavaScript security model is not about keeping privacy.

@Justin Long: NoScript, for example, allows selective inclusion of JavaScript. But yes, there is a kind of tradeoff here.

Boris February 2, 2011 3:59 PM

Bruce, the “image” part of this attack is a non-issue. You don’t need to detect HTTP status codes via scripting. Just load the image. If you get an image of nonzero size, there was an image there; otherwise there wasn’t.

The only way to protect against that in the browser is to completely disable all cross-site image linking.
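Boris’s variant can be sketched in a couple of lines; the browser wiring aside, the decision itself needs no access to status codes at all, only the dimensions the image ended up with:

```javascript
// Dimension-based check: if a cross-site <img> loads with a nonzero
// natural size, the login-restricted image existed. Works on any
// object exposing naturalWidth/naturalHeight (e.g. an HTMLImageElement).
function imageIndicatesLogin(img) {
  return img.naturalWidth > 0 && img.naturalHeight > 0;
}
```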

Kerrek SB, see http://people.mozilla.com/~bsterne/content-security-policy/
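The Content Security Policy mechanism Boris links to lets a site declare roughly the restriction Kerrek asked for, though as a per-site server opt-in rather than a browser-wide toggle. A sketch of such a response header (early Mozilla builds used the experimental X-Content-Security-Policy name):

```
Content-Security-Policy: default-src 'self'
```

This tells the browser to load scripts, images, and other subresources for that page only from the document’s own origin.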

Gary C February 2, 2011 4:40 PM

The ironic part of this hack is that it affects Firefox, Chrome and Safari but not Internet Explorer or Opera, as the latter do not return an HTTP error code on the JavaScript call.

It’s a privacy-compromising hack where IE is more secure than Firefox. I’m still having trouble digesting this.

Dirk Praet February 2, 2011 4:47 PM

It’s probably different for most of us to which extent we care or should care who is watching what we are doing on-line. If I were in Egypt right now, there are indeed some pretty good privacy & security Firefox add-ons that offer additional protection to default browser behaviour, although some do indeed come at a cost. Or you can go Tor.

Fred P February 2, 2011 5:05 PM

@Justin Long-

I installed NoScript shortly after getting an extremely annoying advertisement; NoScript was simply the easiest way to block it from ever happening again.

As for scripts I want to execute, I can always turn them on.

John Thurston February 2, 2011 5:16 PM

@Justin Long
The first thing I do on every fresh windows install is install Firefox. The second thing I do is install NoScript. The third thing I do is clear NoScript’s default whitelist.

You may find it hard to believe, but I like it here in my basement. I can let the scripts in when I like, and keep them out when I don’t. My web experience is greatly enhanced by preventing the wiping, sliding, overlaying javascript and flash driven advertising drivel which people seem to think I’d like to see.

Heh heh heh, what a mess. February 2, 2011 5:33 PM

@Justin Long:

“As far as “NoScript” and all that jazz ends up going, it’s the equivalent of locking yourself in the basement with a gun.

You’re not experiencing even a remotely good version of the internet, just some tiny part.

I might be biased, being an ECMAScript (which is the name for JavaScript) developer, but people who turn off JS support, are losing out on the internet as a whole, and might as well use Lynx”

As a Tor user who uses Firefox, Privoxy, NoScript, Torbutton, and Torsocks, I find the web a much cleaner place to surf.

I don’t miss the ads, cookies, exploits, javascript, java, and other tripe. My mouse movements aren’t tracked (how long I hovered over a photo of a celeb while furiously employing one hand).

No, I don’t need Lynx, you’re stretching it a little too far there, buddy boy.

If I need some content from a site like YouTube, I can pull it off with youtube-dl program, handled via Privoxy without spilling privacy beans. If I need other content, I push it through a utility pointed at Privoxy, usually always with the option to use Torsocks.

I would argue those “experiencing” the “whole” fat of the internet are the ones in their virtual basement, guns pointed at their CPUs, just waiting for the next malware to drive them to apologetics-type forums where they cry about their infections or “what could this malware be?” and dump log after pathetic log from their proprietary OS and many of their closed applications.

No, the savvy net user is not in the basement; he’s on his own self-appointed throne of better understanding when it comes to surfing the Wild West Internet, and he’s armed with better tools than most of the casual users foaming at the mouth on Windows forums with their antiviruses, antitrojans and antimalware.

The joke is squarely placed on these fools, and so your “point” is played up, but easily destroyed.

me – 1
you – 0

Game over, man.

Petréa Mitchell February 2, 2011 5:44 PM

Justin Long:

Yes, you are biased. But it’s natural to be frustrated when people are ignoring your work completely.

I use NoScript, but, like most of its users, I use it selectively. Does your site have a really useful tool that requires JS? Is that video worth my time? I’m willing to give it a chance. In the meantime, my online experience is much improved by the removal of ads and the blocking of a popular attack vector.

Dr. T February 2, 2011 5:45 PM

@Justin Long: “… As far as “NoScript” and all that jazz ends up going, it’s the equivalent of locking yourself in the basement with a gun.

You’re not experiencing even a remotely good version of the internet, just some tiny part.”

What bullcrap. NoScript uses a white list so that JavaScript can be active on trusted sites. Firefox-NoScript users aren’t locked in basements.

The problem is web site developers like you who use JavaScript in place of ordinary HTML for simple things like hyperlinks, buttons, and login fields. When I go to such a site and cannot view any content (because JavaScript is inactive), I conclude that the site was designed by idiots and close the tab.

Thomas February 2, 2011 5:47 PM

@John Thurston

+1

What’s this “facebook” everyone keeps going on about?

Oh, that’s that “free” site that somehow made the owner the youngest billionaire ever.

Ever wonder what he sold to get that much money?
If you’re a facebook user, just look in the mirror…..

max February 2, 2011 8:28 PM

It seems that much of the problem is the cookie model employed by browsers. If a server could request that the browser only use the cookie when requesting resources for a page loaded from that server, the attack wouldn’t work. Nor would a whole host of CSRF attacks.

I haven’t looked into it much, but presumably this is unworkable for some reason, or was not practical 10 years ago, or something.
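For what it’s worth, max’s proposal closely resembles the SameSite cookie attribute that browsers eventually adopted: the server asks the browser to withhold the cookie from cross-site subresource requests, which defeats both this probe and a large class of CSRF. A sketch of such a response header (cookie name and value hypothetical):

```
Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Strict
```

With this attribute, an attacker’s page can still load the restricted resource, but the request arrives without the session cookie, so it behaves as if the user were logged out.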

Richard Steven Hack February 2, 2011 10:10 PM

I agree with the use of NoScript. The only problem with using it is that you then have to selectively activate the parts of the site you want to really see, such as the media. But the way Web sites are constructed these days, with a dozen or more external links, it can sometimes be hard to tell what is the minimum that should be re-activated, so in frustration one ends up temporarily activating the entire page.

It ends up being like Microsoft’s UAC – such an annoyance that you act to defeat its purpose.

Fortunately, when on a porn site, I usually don’t have to bother doing that – the pics can be downloaded quite well without activating any scripts or allowing other sites linked to the main site.

And NoScript does prevent porn sites from hijacking my browser, which is what I installed it to prevent. The rest of its protections are just a bonus – especially since I’m running on Linux, so 99.99 percent of the malware out there can’t touch me.

trapspam.honeypot February 2, 2011 10:51 PM

Firefox 3.16.15Pre browser
AskForSanitize 2.1
BeefTaco (Targeted Advertising Cookie Opt-Out) 1.3.2
BetterPrivacy 1.48.4
Ghostery 2.4.2
HTTPS-Everywhere 0.9.2

In between every web page change I select Clear History, which also clears LSO Flash cookies.

At the end of each browsing session I use:
PurgeFox
PurgeIE Pro
CCleaner
EasyCleaner
Evidence Eliminator (licensed since ver 1.0)
Also use Unlocker (http://cedrick.collomb.perso.sfr.fr/unlocker ) on index.dat and *.ie5 files to clear before a sweep of all unwanted browsing history within the operating system.

trapspam.honeypot February 2, 2011 10:52 PM

Also use all the same with Firefox 4.0b11Pre Minefield beta.

Works very well, no issues with a clean machine and operating system.

Throw in WinPatrol Pro, a good firewall, and antivirus.

Paul February 2, 2011 11:00 PM

Maybe I missed something, but couldn’t this same attack vector be very handy to figure out when and which internet bank you are logged into, and then send a few nefarious requests to it (from the other window using JS) to, say, transfer some funds…

Just asking.

Davi Ottenheimer February 3, 2011 1:49 AM

@Justin Long

“As far as “NoScript” and all that jazz ends up going, it’s the equivalent of locking yourself in the basement with a gun. You’re not experiencing even a remotely good version of the internet, just some tiny part.”

So many people have taken (Long) shots already, I almost didn’t want to join the list.

However, I really can’t resist.

Your analogy brings to mind many many examples from history…such as the Conquistadors who believed they approached the Incan Empire with illumination, when in reality they brought smallpox.

Why are scripts so inherently beneficial? I rarely enable them. This site has very few, for example. Are you calling Bruce’s blog basement-like for not using more?

I mean would you describe a glass of water as “not even a remotely good version” of Mountain Dew?

It also reminds me of the Law of Software Envelopment by jwz: “Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.”

The “good version” of the Internet is the version without all the unnecessary and evil scripts; in the end we all just end up reading text/mail anyway, some more efficiently than others.

Fredrik February 3, 2011 2:04 AM

I like to use this AdBlock filter: *$third-party.

It blocks tracking images, Google analytics, and this attack, but to be on the safe side, you should probably use NoScript and FlashBlock too.

Danny Moules February 3, 2011 3:08 AM

@BobT Check your date/time settings.

@Paul That’s a CSRF, a different attack altogether with its own security measures. You don’t need to check that they are logged in to attempt that; you just try it. You can’t ‘talk across windows’ anyway.

@Ali You’re assuming a) It’s not targeted at a specific user as part of another attack b) it’s not a government who can usefully and quickly collate that information – and there are plenty of oppressive governments that track people through the web and c) the link doesn’t require privileges. If it does you can use it to say ‘oh this user has access to [x], therefore has [y] privilege, therefore they must be [z]’

“It ends up being like Microsoft’s UAC – such an annoyance that you act to defeat its purpose.”

UAC doesn’t have rules that specify how to react in the future (this is by design, it would be useless otherwise). NoScript does – once it’s set up you’ll rarely have to invoke it one way or the other. Very different beasts.

I can see why Google would ignore this. My own experience with Google’s security team is they blow off far more serious things than this.

Mozilla users can use the ‘RequestPolicy’ extension to protect against this (or NoScript, properly used). It’s a known issue and not revolutionary. A web application pen tester should already be aware this is possible.

Clive Robinson February 3, 2011 4:17 AM

@ Paul,

“… to figure out when and which internet bank you are logged into and send a few nefarious requests to them (from the other window using JS)”

It’s a little bit more complicated than that, or should be… but for some browsers and banks in the not too distant past…

@ Justin Long,

JS has many niggly problems (i.e. it does not do quite the same thing on different browsers or even different versions).

Likewise many JS developers’ code has many niggles and snags, is rarely tested properly, and is full of incorrect assumptions about screen size etc. (if it looks OK in 16-colour standard VGA you are probably doing it right).

The worst offenders as far as I am concerned are those ignoring the people with some form of visual impairment (which is an offence in Europe and many other jurisdictions).

Thus any web site that indicates in various ways (usually by failing badly) that JS or any other “run in the browser” code is a major requirement is not worth visiting.

Likewise the sites of developers that insist on Flash or some other junk they think looks cool, but which needs a lot of bandwidth just for some pathetic logo manipulation.

Believe it or not, the majority of web users want pages that are readable, concise, and download quickly and cleanly, giving quick and easy access to the information they are after.

Especially those using mobile connectivity: they really don’t want to wait five minutes to sit through a 30-second promo that does not fit their screen and renders badly at best, with audio that sounds like a cat being eviscerated the hard way, before finding out that what they want to buy is not in stock.

As a simple rule of thumb, if your site cannot perform its primary function (disseminating information) without bells and whistles, then it should not be up on the web.

This is because for most humans text is king: it’s how they get the information they want, though sometimes simple figures and charts help. But most often people find your site through a text search, or through something they have read on another site.

Thus text should be easily readable: plain backgrounds are a must, as are high contrast and plain fonts. Also remember that menu bars take up more screen space than they are worth for many users, so design your site to look acceptable at 16-colour standard VGA resolution.

Pictures might be worth a thousand words to poets and marketing gurus, but few people need them at much more than 1/8th of the screen area (they can click on one if they want more detail), and no more than one or two per A4-page equivalent.

None of this needs JS or other in browser code to accomplish.

As for movie clips and animations, they might look clever but they chew up bandwidth and CPU cycles on smartphones and the like, and quickly become a very annoying distraction at best.

Dimitris Andrakakis February 3, 2011 4:25 AM

@Everybody taking a shot @Justin Long :

Guys, seriously now, that’s cheap. Yes, you may be tech-savvy enough to use NoScript effectively and without your browsing experience being crippled. I am too: I use Firefox+NoScript almost exclusively for my non-company-intranet browsing.

But 99.5% of the users out there are far from tech-savvy. And even if they are, what reliable criterion is there for the user to decide which script to allow and which not? Hint: there isn’t one.

Yes, there are tools that enable the user to have some real privacy. But for most people (that is, unless you’re a human rights group activist living in China) using combinations like “Firefox, Privoxy, NoScript, Torbutton, and Torsocks” is simply not a sensible tradeoff between perceived value (more privacy) and time consumed (to install/maintain etc.).

Cupcake February 3, 2011 5:01 AM

@Dimitris Andrakakis: give some more credit to that 99.5% of users.

My parents (retired white-collar worker and housewife, respectively) use Firefox, and installed NoScript a year or two ago. I was taken completely by surprise for the same reason that you mention, but it seems to work for them.

Of course there is no reliable criterion for them to determine with 100% accuracy which site is trustworthy and which isn’t. OTOH, there isn’t one for me, either, and some defense is better than no defense.

TL;DR: don’t be such an elitist.

Kerrek SB February 3, 2011 6:27 AM

@Dimitris: You say “99.5% of the users out there are far from tech-savvy”. Wouldn’t that actually be an argument IN FAVOUR of not giving them the extremely dangerous software that we do, and instead designing a client-side scripting API that is safe by construction? I.e. something that follows the simple rule from my first post: “no script activity shall cause any network activity”.

@trapspam.honeypot: You’re joking, right? I mean, the very fact that you suggest I use about 10 browser-specific, specialised pieces of add-on software to make my browsing vaguely safe should be a wakeup call. I don’t want to have to install hundreds of bits of extra software – I just want a browser that has a simple, built-in feature to keep things same-origin.

(I know that you could twist DNS records to circumvent name-based same origin, but that’s a far more involved attack than simply writing a website with a tracking image.)

Paul Crowley February 3, 2011 6:47 AM

From a server’s point of view, would it work to ignore cookies if the “referer” field doesn’t match the current site? Would this serve to implement “max”‘s proposal?

From a browser’s point of view: couldn’t that same filter be implemented at the browser end? What sort of problems might this cause?
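A minimal server-side sketch of the filter Paul Crowley describes (names are hypothetical; one snag is that many clients legitimately omit the Referer header, so a policy has to decide how to treat its absence):

```javascript
// Decide whether to honor the session cookie on a login-restricted
// resource. Returning true means "use the cookie"; a missing Referer
// is treated permissively here, which is a deliberate trade-off.
function shouldHonorCookie(refererHeader, ownHost) {
  if (!refererHeader) return true; // header omitted: allow
  try {
    return new URL(refererHeader).hostname === ownHost;
  } catch {
    return false; // malformed header: treat as foreign
  }
}
```

On the browser side the same rule would amount to stripping cookies from cross-site subresource requests, which is essentially max’s proposal above; the obvious breakage is legitimate cross-site embedding that relies on the cookie.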

Stew February 3, 2011 6:53 AM

Make sure you keep your tin hat on as well, trapspam.honeypot.

Why not just run your browser in a sandbox or a VMware image? The time it takes to load and close a guest OS would be far less than dicking around with all those programmes.

Dirk Praet February 3, 2011 7:16 AM

@ Dimitris Andrakakis

Nothing is black and white. Over the years, the web has evolved from a text and information based medium for professionals to a rich multimedia experience with loads of bells and whistles for the masses. As you correctly point out, the majority has no idea whatsoever as to the risks and pitfalls they’re being exposed to every time they go online, whether it be with a computer or smartphone.

Those of us that have been around for a long time and still use the web primarily as an information medium couldn’t care less about Flash, Java, Silverlight, JS and the like, because there is exactly zero need to create informational pages in such a way. In addition, all of these technologies offer new vulnerabilities and new possibilities for tracking a user’s activities, habits and data, whether by governments, commercial or criminal organisations. You don’t need to be a human rights activist or living in a country with an oppressive regime to find this undesirable.

What many uninformed people are not aware of is that in its current incarnation the web is just as much a user’s portal to the world as, to some folks, a portal into theirs, especially commercial organisations offering “free services”.

For the internet generation, going online holds just as many potential dangers as experimenting with sex, alcohol, drugs and guns. That is why education and awareness are crucial. It’s why I explain to my mom that there really is no such thing as a friendly Nigerian man wanting to share a hundred million with her, and why I talk my ten-year-old niece out of publishing everything she does on Facebook.

What people do with this knowledge is up to them to decide. One may choose to have unsafe sex or drink and drive and get away with it. In the same way, you can choose to have all or most functionality in your browser activated in a totally unmoderated way, or choose a slightly different and more effort-intensive experience by installing and maintaining privacy and security controls.

Mark R February 3, 2011 11:21 AM

Justin Long may have overstated his case, but he does have a point. Like most of the opposition, I disable scripts by default and enable them only when needed. It is undeniable that this removes some useful functionality (along with gobs and gobs of invasive and offensive crap).

To say that there is no legitimate use for client-side scripts in browsers is pretty short-sighted and curmudgeonly. There are cool and useful things you can do in-browser without requiring a round trip to a server-side script. For just one example, you can use client-side scripts to filter the rows displayed in an HTML table based on user-selected criteria.
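Mark R’s table-filter example really does need no network access at all, which is exactly the kind of script Kerrek’s proposed rule would still permit. A minimal sketch (the DOM wiring in filterTable only runs in a browser; rowMatches is a pure function):

```javascript
// Return true if any cell text contains the query, case-insensitively.
// An empty or whitespace-only query matches every row.
function rowMatches(cellTexts, query) {
  const q = query.trim().toLowerCase();
  return q === '' || cellTexts.some(t => t.toLowerCase().includes(q));
}

// Browser-side wiring: hide the non-matching rows of an HTML table.
function filterTable(table, query) {
  for (const row of table.tBodies[0].rows) {
    const cells = Array.from(row.cells, c => c.textContent);
    row.style.display = rowMatches(cells, query) ? '' : 'none';
  }
}
```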

We make a trade-off when we turn off scripts. Some sites don’t even render correctly. This trade-off makes sense to me in 99% of cases, but to deny that you’re missing anything is wrong.

What I’d like is more fine-grained control over what scripts can do, as some here have suggested.

paul February 3, 2011 11:26 AM

This also tells us something about the dangers of monoculture. If there are only a dozen sites that anyone might care about people being logged into, this kind of hack is much easier to perpetrate.

scottnottheotherscott February 3, 2011 12:51 PM

Those arguing with Justin Long are, I think, missing the forest for the trees.

The simple fact that NoScript requires a whitelist should be evidence enough to indicate that Javascript is already necessary to use the modern web. This is only going to become more so as websites transition to web applications. It may be currently possible to eke out some manner of scriptless user experience, but I think this will just get poorer and poorer.

I’m kind of surprised that people are focusing in on the particular technology and are not asking the more interesting question of, “How do we sort the good scripts from the bad?”

The issue is not that a website can view my login status, but rather that:

a) I didn’t give it permission to do so, and
b) we have no manageable framework that grants that level of control over arbitrary, unauthenticated content.

Oh, and the hack is a nifty hack.

Clive Robinson February 3, 2011 1:39 PM

@ scottnottheotherscott,

“I’m kind of surprised that people are focusing in on the particular technology and are not asking the more interesting question of, ‘How do we sort the good scripts from the bad?’”

The simple answer is that 99.99…% of users can’t, and of those that can, most can’t be bothered.

The reason is that JavaScript was not a good idea to start with, and it was badly and insecurely implemented in a badly designed and badly implemented environment, all of which ran on Win95, which hid a version of DOS underneath it.

Nobody ever sat down and said “hang on a minute guys, let’s clean this up” and it just got bit after bit thrown onto it by various browser designers year after year.

The best thing to do would be a “fire sale” and start again from scratch, but that is very unlikely to happen. The whole thing is built on shifting quicksand, and the more we add to it the more likely it is to just give way.

It may be possible to make JS secure but in all honesty it’s not worth the effort, as modern desktop etc systems have sufficient spare capacity to run better designed systems.

Richard Steven Hack February 3, 2011 7:27 PM

Clive: “Nobody ever sat down and said “hang on a minute guys, let’s clean this up” and it just got bit after bit thrown onto it by various browser designers year after year.”

Yup. You’ve just described the entire IT industry. Which is the real problem.

erica February 4, 2011 2:06 AM

I’d be delighted to run any javascript that comes with a fully valid quality assurance certificate from an internationally recognised validator/tester of javascript code.

If javascript purveyors want access to my computer, then all they need to do is ensure their code is certified to that level.

Meanwhile, noscript is my friend.

Come back to me when the infrastructure is in place to vet javascript code then I’ll reconsider.

Kerrek SB February 4, 2011 5:49 AM

@Mark R: “What I’d like is more fine-grained control over what scripts can do, as some here have suggested.” — You’re perfectly right that we need client-side scripting, but the problem with levels of control is that it never quite works: there’ll always be hacks and ways around it. That’s why I suggested a simple, global design rule (“no network access”) rather than fine-tuning. Actually, I bet that if we made a browser with a restricted API like that and opened it up to a “creative web site” competition, we’d get tons of brilliant designs.

@scottnottheotherscott: “How do we sort the good scripts from the bad?” — Why should the design of the scripting even allow for something like a “bad script”? What I’d prefer personally is something that just doesn’t allow anything bad to happen in the first place — one less thing for me to keep track of!

@Clive Robinson: Let’s not confuse the language and the API — the language is probably not such an issue; it’s the ill-conceived idea of exposing everything and anything through the DOM.

@Erica: “I’d be delighted to run any javascript that comes with a fully valid quality assurance certificate from an internationally recognised validator/tester of javascript code.” — Somehow I feel this is missing the point entirely. So many problems don’t come from scripts that are intentionally put into a website, but from a combination of vulnerabilities which allow scripts from other sources to be executed. Apart from the fact that you’d have to have an infallible, expert validation service (have you heard of the “underhanded C contest”?), any such system is almost guaranteed to be hackable and evadable.

Shane February 4, 2011 1:07 PM

I love his ‘solution’: “Some of these requests could be stopped by doing referrer checks; reject all external referrers for content only accessible when logged in.”

As if forging referrer headers were even a hurdle an attacker had to overcome, haha.

Sure buddy, tell me about all the porn sites I’m logged into, just as soon as you write enough site-specific content scrapers to cover them all.

Haha, what a load of crap.

erica February 4, 2011 1:31 PM

@Kerrek — of course it could be hackable and evadable. But that’s not my problem….Or at least not entirely my problem.

I’ll agree to allow your javascript to run on my machine, provided you fully indemnify against any loss or damage caused by your script.

So I only run scripts from (or authenticated by) an agency that has full insurance cover — insurance that I can claim on.

Those who absolutely need javascript to make their sites sparkle will be happy to pay to have them authenticated.

And the rest of us can run those authenticated scripts sure in the knowledge that any problems will be compensated by reputable industry bodies.

It would bring clarity, openness, honesty and trust to the WWW-JS interface.

What’s not to like?

Shane February 4, 2011 1:38 PM

@Erica – “What’s not to like?”

How about endless paperwork, lawsuits, EULAs, fixing software flaws with bureaucracies, quis custodiet ipsos custodes (who watches the watchmen?), etc. et al.

Besides the major fallacy that paying someone to say you can be trusted somehow makes you trustworthy 😛

Dazed and confused February 5, 2011 10:35 AM

“I tested this technique in Firefox, Safari, Chrome, Opera and various versions of Internet Explorer and it worked in them all. I reported it to Google and they described it as ‘expected behaviour’ and ignored it.”

“Unfortunately, this attack doesn’t seem to work in Internet Explorer or Opera, but does work in Firefox, Chrome and Safari.”

So does it or doesn’t it work in IE?

Will February 9, 2011 8:56 AM

@trapspam.honeypot

Is that all you do? I surf from within a Faraday cage in a disposable VM running on a PC with a totally read-only filesystem. I use robot arms to access the browser buttons from outside a hermetically sealed airtight room. Between each page I self-destruct the whole building and start again.

BOOM

Shane February 9, 2011 4:52 PM

@Will, @trapspam.honeypot

Wow you guys are working waaaaaaaaaaay too hard. I just use the neighbor’s computer when they’re out (haha).

js February 13, 2011 7:43 AM

” I might be biased, being an ECMAScript (which is the name for JavaScript) developer, but people who turn off JS support, are losing out on the internet as a whole, and might as well use Lynx ”

Interestingly, I find that not to be the case. I do lose some things, but the majority of sites are actually more usable without JavaScript. I am fairly certain this is due to the use I make of the Internet, though, which is usually searching for textual information rather than entertainment or business.

In some cases, though, I do lose something. I haven’t found this to be a big problem, given the small number of such cases. So, for a certain niche of people, it actually makes things better.

Definitely not something for the casual user though.

js February 13, 2011 7:48 AM

Oh, god. Sorry Justin, it seems everyone’s piling on you here. Maybe the niche is a bit larger than I expected, and I should have read the rest of the comments before deciding to give my view on this.

And yes, I know the feeling of having people decide to just ignore what you’ve done summarily for reasons you don’t think good 😛

