Interview with an Adware Developer


I should probably first speak about how adware works. Most adware targets Internet Explorer (IE) users because obviously they’re the biggest share of the market. In addition, they tend to be the less-savvy chunk of the market. If you’re using IE, then either you don’t care or you don’t know about all the vulnerabilities that IE has.

IE has a mechanism called a Browser Helper Object (BHO) which is basically a gob of executable code that gets informed of web requests as they’re going. It runs in the actual browser process, which means it can do anything the browser can do—which means basically anything. We would have a Browser Helper Object that actually served the ads, and then we made it so that you had to kill all the instances of the browser to be able to delete the thing. That’s a little bit of persistence right there.
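For background (this detail is not in the interview itself): a BHO is just a COM DLL that IE loads into its process at startup, and advertising one takes a single registry key naming the object’s CLSID. A sketch, with a placeholder CLSID:

```shell
:: Hypothetical example -- the CLSID here is a placeholder, not Direct Revenue's.
:: IE enumerates this key at launch and loads every CLSID listed under it
:: (the COM object itself must already be registered under HKCR\CLSID).
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\Browser Helper Objects\{00000000-1111-2222-3333-444444444444}" /f
```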

If you also have an installer, a little executable, you can make a Registry entry and every time this thing reboots, the installer will check to make sure the BHO is there. If it is, great. If it isn’t, then it will install it. That’s fine until somebody goes and deletes the executable.
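The check-and-reinstall logic is simple enough to sketch. This is a guess at the shape, not the actual code; a plain dict stands in for the HKLM Run key, since the real thing would use the Windows-only winreg API, and the path is made up:

```python
def ensure_installed(run_key, name, payload):
    """Re-create the autorun entry if anyone deleted it.

    run_key -- a dict standing in for HKLM\\...\\CurrentVersion\\Run
    name    -- the value name the installer claims
    payload -- the command line to persist (a hypothetical path here)
    """
    if run_key.get(name) != payload:
        run_key[name] = payload  # the "reinstall" step
```

Run at every boot, this restores the entry unless somebody also finds and deletes the installer itself, which is exactly the weakness the interview goes on to describe.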

The next thing that Direct Revenue did—actually I should say what I did, because I was pretty heavily involved in this—was make a poller which continuously polls about every 10 seconds or so to see if the BHO was there and alive. If it was, great. If it wasn’t, [ the poller would ] install it. To make sure the poller was less likely to be detected, we developed this algorithm (a really trivial one) for making a random-looking filename that was consistent per machine but was not easy to guess. I think it was the first 6 or 8 characters of the DES-encoded MAC address. You take the MAC address, encode it with DES, take the first six characters and that was it. That was pretty good, except the file itself would be the same binary. If you md5-summed the file it would always be the same everywhere, and it was always in the same location.
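The naming scheme can be sketched in a few lines. The standard library has no DES, so MD5 stands in for it below; the real algorithm, per the interview, encrypted the MAC address with DES and kept the first six characters. Everything here is illustrative:

```python
import hashlib
import uuid

def per_machine_name(mac=None, length=6):
    # Stable per machine (same MAC -> same name) but random-looking,
    # so no single fixed filename identifies the poller everywhere.
    if mac is None:
        mac = uuid.getnode()  # this machine's MAC address as an integer
    digest = hashlib.md5(mac.to_bytes(6, "big")).hexdigest()
    return digest[:length]
```

As the interview concedes, the file contents and location were still constant, so an md5sum of the binary identified it anyway.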

Next we made a function shuffler, which would go into an executable, take the functions and randomly shuffle them. Once you do that, then of course the signature’s all messed up. [ We also shuffled ] a lot of the pointers within each actual function. It completely changed the shape of the executable.
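A toy model of the shuffler (real ones rewrite PE sections and must fix up every call and jump target, which this ignores): treat each function as an opaque blob and permute the blobs with a per-machine seed, so the file hash differs from machine to machine while the contents are merely reordered.

```python
import hashlib
import random

def shuffle_functions(funcs, machine_seed):
    # funcs: list of byte-strings, one per "function".
    # A per-machine seed keeps the layout stable on one box but
    # different across boxes, breaking naive whole-file signatures.
    rng = random.Random(machine_seed)
    layout = list(funcs)
    rng.shuffle(layout)
    return b"".join(layout)

def signature(image):
    return hashlib.md5(image).hexdigest()
```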

We then made a bootstrapper, which was a tiny tiny piece of code written in Assembler which would decrypt the executable in memory, and then just run it. At the same time, we also made a virtual process executable. I’ve never heard of anybody else doing this before. Windows has this thing called Create Remote Thread. Basically, the semantics of Create Remote Thread are: You’re a process, I’m a different process. I call you and say “Hey! I have this bit of code. I’d really like it if you’d run this.” You’d say, “Sure,” because you’re a Windows process—you’re all hippie-like and free love. Windows processes, by the way, are insanely promiscuous. So! We would call a bunch of processes, hand them all a gob of code, and they would all run it. Each process would all know about two of the other ones. This allowed them to set up a ring…mutual support, right?

So we’ve progressed now from having just a Registry key entry, to having an executable, to having a randomly-named executable, to having an executable which is shuffled around a little bit on each machine, to one that’s encrypted—really more just obfuscated—to an executable that doesn’t even run as an executable. It runs merely as a series of threads. Now, those threads can communicate with one another; they would check to make sure that the BHO was there and up, and that whatever other software we had was also up.
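The mutual-support ring can be caricatured in portable Python. Watcher threads in one process stand in for the threads injected into other processes via CreateRemoteThread; each member knows its two neighbours and respawns either one if it dies. The bookkeeping is deliberately unlocked and toy-grade:

```python
import threading
import time

class WatchRing:
    """Each watcher polls its two ring neighbours and respawns any
    that have died. Toy version: one process, no locking."""

    def __init__(self, n=3, interval=0.02):
        self.n, self.interval = n, interval
        self.stop = threading.Event()
        self.ready = threading.Event()
        self.dead = set()        # members told to "die" (simulated kill)
        self.restarts = 0
        self.threads = {}
        for i in range(n):
            self._spawn(i)
        self.ready.set()         # all members exist; watching may begin

    def _spawn(self, i):
        t = threading.Thread(target=self._watch, args=(i,), daemon=True)
        self.threads[i] = t
        t.start()

    def _watch(self, i):
        self.ready.wait()
        while not self.stop.is_set():
            if i in self.dead:
                return           # simulate this member being killed
            for peer in ((i - 1) % self.n, (i + 1) % self.n):
                if not self.threads[peer].is_alive():
                    self.restarts += 1
                    self._spawn(peer)
            time.sleep(self.interval)
```

Kill one member and the other two bring it back; to take the ring down you have to take out every member within a single polling interval.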

There was one further step that we were going to take but didn’t end up doing, and that is we were going to get rid of threads entirely, and just use interrupt handlers. It turns out that in Windows, you can get access to the interrupt handler pretty easily. In fact, you can register with the OS a chunk of code to handle a given interrupt. Then all you have to do is arrange for an interrupt to happen, and every time that interrupt happens, you wake up, do your stuff and go away. We never got to actually do that, but it was something we were thinking we’d do.
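User-mode code can’t hook hardware interrupts on NT, so this plan presumably meant something like registered callbacks that wake on an event. The closest cross-platform analogue is a signal handler, sketched here with POSIX signals (so not Windows; purely illustrative):

```python
import signal

pings = []

def on_wakeup(signum, frame):
    # No resident thread to spot in a process listing: the code sits
    # idle in the OS until the signal fires, runs, and returns.
    pings.append(signum)

signal.signal(signal.SIGUSR1, on_wakeup)  # register the handler
signal.raise_signal(signal.SIGUSR1)       # arrange for the "interrupt"
```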

EDITED TO ADD (1/30): Good commentary on the interview, showing how it whitewashes history.

EDITED TO ADD (2/13): Some more commentary.

Posted on January 30, 2009 at 6:19 AM · 45 Comments


Randy January 30, 2009 6:51 AM

I wonder if someone at Microsoft will read this and say, “See! Someone likes all the insecurity!”

Are all of these capabilities also available in Vista and Windows 7?


Sean Cleary January 30, 2009 7:38 AM

OMG, in a sense, a look at true evil.
But an honest look, showing the vulnerabilities. This is a list of how a virus writer could improve their virus.
Do I get the feeling that shuffling functions would require a recompilation phase? Or is this all interpreted?
I am planning to move to Linux in the fall when Ubuntu gets out of beta. I may now do so earlier.

Phillip January 30, 2009 7:46 AM


None of this would be possible in the first place if the OS had been developed with any thought to security at all (rather than as an afterthought). On a Linux platform, unless you’re logged in as root (in which case you should be shot for browsing the web as root), this would never happen.

root January 30, 2009 8:25 AM

Dear Phillip,

Most of this is accomplished by running at the same level as the browser. The point, however, is that once this software is installed, it is VERY hard to uninstall. The actual infection is not mentioned at all, but the “adware” in the title suggests it is delivered with some other software.

Self-modifying code and interprocess communication are no secret on Linux; there’s just no commercial interest (yet) in making this happen.

And yes, a lot of software requires root rights to be installed.

PS: Vista and Windows 7 are vulnerable too, once the software is installed. It will be harder to install via a vulnerability, but bundled with other software it will still run.

David January 30, 2009 8:32 AM

On MacOSX or Ubuntu, you will be prompted for your password if you try to do something that requires root access. This should be fairly rare, and in my experience is mostly for installing software.

Therefore, for MacOSX or Ubuntu users, you need to trick them into thinking they’re installing something, so they will expect the password prompt. At that point, the malware is running with full root privileges (through a GUI version of sudo), and can do as it likes.

This isn’t as easy as on Windows, where privilege escalation is usually just clicking a button among a user community that’s basically trained to click OK to get some work done, but there’s nothing to stop the malware on any OS once it gets that many privileges.

gdfuego January 30, 2009 8:33 AM


What do you mean by “in the fall when Ubantu gets out of Beta”? Ubuntu has a new supported release that comes out every 6 months. There is a supported release right now, and the next one comes out in the Spring.

Tim January 30, 2009 8:49 AM

If I understand Linux correctly, ‘root’ is simply an admin account. I use Windows every day and NEVER log in with my admin account; I use RunAs and UAC (prompts to elevate, asks for password). There are ways to infect a machine without admin privileges if you shift from attacking the machine to attacking the user account. Example: to make a program start with Windows you can add an entry under HKLM\Software\Microsoft\Windows\CurrentVersion\Run pointing at your executable. To attack a user instead, you use HKCU\Software\Microsoft\Windows\CurrentVersion\Run (which on my system actually does require admin, but by default does not). Microsoft did create some questionable APIs like CreateRemoteThread and the Windows hook functions, and these need to be addressed in some way (they missed their chance in the move to 64-bit). Also, I know it has been said before: if 95% of the world used Linux, then 95% of the viruses and other malware would target Linux and find ways around everything once again. Profit is a powerful force. I’m starting to ramble.

dot tilde dot January 30, 2009 8:54 AM

@1 (randy):

or something like “eventually somebody turned out to be cool enough to use our advanced features”…


Bill January 30, 2009 9:09 AM

“If you’re using IE, then either you don’t care or you don’t know about all the vulnerabilities that IE has.”

If only! Most corporations have an IE-only policy because the benefits of easier support and manageability are considered to outweigh the cost.

Thereafter web-filtering proxies are deployed in the (increasingly mistaken) belief they protect IE from nasties.


Cdek January 30, 2009 9:25 AM

They are abusing functions put into place to allow legit code to run. That’s always been the case. It’s the same reason your house is insecure. Function and ease of use. Hell, it’s the reason everything is insecure. Sure some OS’s are more secure by default than others, but a properly configured OS, a competent user, and some decent security measures works well, MS, OSX, or any flavor of *nix.

Anonymous January 30, 2009 9:38 AM

When Ubuntu gets out of beta…?!

I’ve been using Ubuntu for almost three years now, and have never once used the “beta” or “release candidate” version. Been extremely pleased.

After the switch, Windows hurts my eyes when I look at it. I’M BEING SERIOUS. I have to turn the contrast and brightness down on my monitors when I load it in a VM. ;\

Mike W January 30, 2009 10:00 AM

@Randy (and all)

Isn’t the issue more that the Windows/Explorer combination is the prevalent configuration of systems accessing the Internet, rather than the particular vulnerabilities of Vista/XP?

While it’s easy (and fun) to poke at the security problems of Windows, a wholesale shift to Linux wouldn’t solve the problem. The security model of Linux, despite its superiority, isn’t bulletproof. If Ubuntu (or MacOS or any other OS) captured 90% of the market like Windows has, then virus or adware programmers would focus in on that platform, find the chinks in the armor, and exploit them.
The adware author came up with innovative ways to exploit Windows flaws because he had a financial motivation to do so. I’d guess that if he had sufficient time and motivation to exploit another OS, he could apply the same creativity in equally effective ways.

Nerull January 30, 2009 10:12 AM


On Ubuntu, at least, silently replacing the sudo executable a user runs is trivial (And yes, I’ve done it – I’m not just making this up), so that the next time they do an apt-get install or upgrade you can hijack the system. I haven’t tried replacing the graphical sudo, but I don’t see why it would be much harder. You don’t even have to emulate the proper behavior – just call the real sudo from your app, with an argument to run your malicious app. It’ll prompt the user for a password as if nothing is wrong and then run your application. You can then run whatever the user really wanted to keep from raising suspicions.

paul January 30, 2009 10:19 AM

Another thing that’s clear here is how professionally the malware people do their jobs. As with the analysis of the Storm worm, this interview shows that they’re as thoughtful and capable as many developers of more useful programs (and head-and-shoulders above the people who write code for proprietary one-offs like voting machines).

Cdek January 30, 2009 10:19 AM

Mike W. hits this nail on the head.

It was purely a function of market. If you are getting paid per-install, or per hijacked machine, you go after the largest demographic. Attacking a smaller section of the market gives you a smaller target surface, and won’t return numbers as big as a larger market section. Same thing goes for botnets really. I think that until the market changes somewhat, it’s just not worth making a target out of anyone else. There’s no money in it.

Carlo Graziani January 30, 2009 10:48 AM

Insofar as role-based security and Linux/OSX are concerned: Unfortunately, in the name of usability, there’s been some shading of what was once a clear role distinction.

It used to be that if you wanted to do something admin, you would have to assume a special role — “root”. Nowadays, you get to create other normal accounts, and with the click of a mouse in a menu, ascribe admin privileges to that account. Which means, unfortunately, that there are a lot of people running Ubuntu and Mac OSX who, when prompted for an admin credential, type their own password, instead of a root password. Since this is the same password they type to log in (a non-administrative operation), this amounts to a conflation of the user and admin roles, a loophole that the social engineers could drive a truck through.

It’s a terrible idea, even if it enhances “usability”. People who use networked computers, irrespective of platform OS, need to know about the security uses of roles, (need to be forced to think about security at all, for that matter), just as people who drive cars need to know about lane-change technique and road-sign interpretation. If they don’t, they are vulnerable, even if they are running OpenBSD — their incompetence will open doors that should be locked.

I wish we could give people Internet Violation tickets and fines for getting into accidents (getting recruited into a botnet, for example). I bet they’d learn much faster.

Bryan January 30, 2009 11:04 AM

Sadly, modern Windows root-kits are already using much more advanced techniques than what is described here. The appropriately named forums and articles can give you a deeper look at how the dark side hides its software.

Since adware, spyware, etc is a very active area right now, and there are lots of antivirus companies to keep them on their toes, this (abuse of) technology is advancing quickly.

scramasax January 30, 2009 11:16 AM

I like the part about:

“You had to go to some web site, download an uninstaller, take a short survey about why they were getting rid of us, and then it would actually remove us and we would also leave a Registry key to make sure we didn’t reinstall. Sadly, some misguided antivirus and anti-adware software would go in and remove that, which therefore meant that we would reinstall again.”

Such a difficult procedure ensures that lazy people will not remove it, and the people bright enough to follow it are probably the same ones who use anti-adware and antivirus software, so it stays anyway.

Timm Murray January 30, 2009 11:46 AM


“If I understand linux correctly ‘root’ is simply an admin account. I use Windows everyday and NEVER log in with my admin account.”

Root does equal admin. The difference is that by necessity for backward compatibility, normal Windows users tend to have more access to the complete system than normal Linux users do. Especially on common home user machines, which is what adware and botnets mostly target. User accounts are functionally identical to admin on many of these.

“Also I know it has been said before, if 95% of the world used Linux then 95% of the viruses and other malware would target Linux and find ways around anything once again.”

True. This is why it’s dumb to have 95% of the world running on the same fool thing.

McCoy Pauley January 30, 2009 2:53 PM

The vast majority of computer users have no understanding of the most elemental concepts of security—and no desire whatsoever to learn. I’m afraid I’m going to have to use the dreaded n-word here: NITWIT. There, I said it. Let the mob begin to howl.

Tomas V January 30, 2009 3:23 PM

It is true that malware will target the largest community, but there is a major difference between OSX/Windows and Linux. Linux is open source/free software, which means that everyone can look at it and modify it. If a vulnerability is found, it is easier for one or several companies to hire people to fix it. Microsoft and Apple don’t have any obligations towards their clients (well, do they listen to me whining about vulnerabilities?); the patches come when they come. You could hire developers from the Linux community to fix them a.s.a.p., and the fix would then be available to everyone.

The question that might arise is how to make it fair so that everyone who uses the software chips in. I don’t know. Taxes and a group of people who decide the priority of vulnerabilities?

Anonymous January 30, 2009 3:31 PM

You need to get the user to download and run your code. That code will have to change their PATH, and most likely add the change to their .bashrc.

I think that should suffice. Of course, that won’t help if the user is reasonably paranoid.

Anonymous January 30, 2009 3:38 PM

Schneier, you are a bit late in publishing this link, I saw it all over the place weeks ago.

Still, even two weeks ago people made a big deal about this.


I’ve written more complex pieces of malicious code in 9th grade. Not that I’m that good, but because if you look at what the adware actually did, it’s below par.

If you want your malware detected, please use a BHO. Oh, and CreateRemoteThread is more than standard at this point too. Processes polling each other? We’ve been using that for antimalware and malware alike.

I don’t know… I just… didn’t find the article that fascinating and I didn

sean January 30, 2009 11:18 PM

I recently read that Ubuntu is going to be quite a bit better and much more user friendly after a release planned this fall. No command line interface, just GUI. I may never need to know the underlying stuff, just put my store bought programs on it, no problem. In that case, I can easily move into it.


HumHo January 31, 2009 2:48 AM


“Thereafter web-filtering proxies are deployed in the (increasingly mistaken) belief they protect IE from nasties.”

QUESTION: why don’t web-filtering proxies protect IE? I am sure they protect IE from those nasties that come from certain websites.

Arslan January 31, 2009 7:51 PM

I still think that Linux/OSX offer better security than Windows. Not
perfect, but better, and can be hardened more easily.

The Browser Helper Object is a chunk of code controlled by the adware
(so it’s untrusted & malicious), and it runs with the same privileges
as the user. Firefox on Linux has no such ‘feature’…if I want to run
executable code on your Linux-based browser, I have to either:

  1. use java, which is sandboxed by Java’s applet security manager so has reduced privs.
  2. use javascript, which is sandboxed by the browser itself, again with reduced privs
  3. find & exploit a vulnerability in the browser

AFAIK this applies to all browsers on Linux or OSX. (Firefox on
Windows did have vulnerabilities in letting DirectX code run a few
years ago, but that’s Windows again.)

Of course, if you can exploit a Linux browser, and obtain user
privileges, you can do most of the evil the adware dev claimed. You
can install a malicious executable in the user’s home dir (no need for
root privileges), invoke it in .login (and .bashrc and .xinitrc
and…). Remember the Friar Tuck hack here; I suppose this corresponds
to the article’s support ring.

I don’t think it would be worthwhile making it easier to clean up a
pwned Linux machine: who would trust it? Reformat the drive (after
ensuring your backups are good, of course 🙂 and re-install. And
prevent further pwnage.

Clive Robinson February 1, 2009 6:07 AM

One of the reasons MS OS’s have more problems than they should is support of legacy code.

In the not so dim and distant past MS OS’s had no security and every program had whatever access it liked to a users machine.

Developers of commercial software went about their business however they liked and bits of their programs got sprayed all over the file system on a whim.

As MS started tightening up security and addressing the DLL-hell issues, older applications broke if not run with high enough privileges.

The result is that many commercial programs run at privilege levels way above what they should, and this makes privilege escalation attacks not just viable but in some cases way too easy.

As noted above by Tim, MS should have dumped this sort of legacy support and forced developers to clean up their act when moving to 64-bit.

Sometimes I get the feeling the only way for an end user/organisation to tighten up MS issues is to run apps in separate locked-down VMs, but then that has its own issues…

Clive Robinson February 1, 2009 6:51 AM

@ Arslan,

“…is a chunk of code controlled by the adware (so it’s untrusted & malicious), and it runs with the same privileges as the user.”

Not sure on that; I think you will find that, due to MS trying to build IE components into the OS and most of their other applications, the adware can, if written correctly, run with higher privileges than the user…

One problem with Linux is that it comes with development tools as well as scripting tools; often they are put on a machine to install a piece of software but, importantly, left available…

Back in the mid-to-late ’90s there was a worm that attacked Red Hat Linux boxes and used the compiler to replicate itself.

So if you are going to leave development and other tools on a Linux box, it would be wise to put them on a separately mounted partition and only mount it when required.

Rich Rumble February 1, 2009 9:12 AM

@ Tim and Timm

“Also I know it has been said before, if 95% of the world used Linux then 95% of the viruses and other malware would target Linux and find ways around anything once again.”

It’s not market share directly that dictates why M$ is a target; it’s the EASE of infecting such a large market share. If Linux and M$ market shares were reversed, M$ would still be the target, as ActiveX, BHOs and running as admin/root are not practiced on *nix. They’d have to work much harder to exploit Linux than M$, no question. Just running an alternate browser on M$ like Firefox/Opera/Chrome is practically enough to keep you safe; it’s an easy conclusion to come to after only a few days of use.

Arslan February 1, 2009 10:44 AM

@ Clive:

“”…is a chunk of code controlled by the adware (so it’s untrusted & malicious), and it runs with the same privileges as the user.””

“Not sure on that; I think you will find that, due to MS trying to build IE components into the OS and most of their other applications, the adware can, if written correctly, run with higher privileges than the user…”

Probably true, but that does not change my point. I suspect an ‘unprivileged’ Windows user still has enough privs to control what programs start up when they log on, consequently malware running with user privs would also have that ability. I think most of the tricks described in the interview don’t require admin privileges.

“One problem with Linux is that it comes with development tools as well as scripting tools; often they are put on a machine to install a piece of software but, importantly, left available…”

“Back in the mid-to-late ’90s there was a worm that attacked Red Hat Linux boxes and used the compiler to replicate itself.”

“So if you are going to leave development and other tools on a Linux box, it would be wise to put them on a separately mounted partition and only mount it when required.”

Perhaps so, but how can a worm invoke a compiler without the ability to execute arbitrary code with full user privs? The compiler is not the vulnerability that grants your attacker code execution. Furthermore, you can’t easily remove scripting ability from Linux, as the Bourne shell (or GNU bash) is a critical part of its infrastructure. Better to prevent untrusted code from running /bin/sh in the first place.

“The result is that many commercial programs run at privilege levels way above what they should, and this makes privilege escalation attacks not just viable but in some cases way too easy.”

When Apple introduced OSX, it forced developers to run with reduced privileges, consequently many OS9 apps wouldn’t work on OSX, or would force the user to enter an admin password. And there was much wailing and gnashing of teeth. And many developers took forever to port software to Mac. Photoshop was primarily a Mac app until OSX and one of the Mac’s biggest third-party apps; now it’s primarily a Windows app. I can’t help thinking that Apple’s tightening of security cost them Photoshop and other killer apps. At least I understand why MS doesn’t want to tighten up security THAT much.

Brandioch Conner February 1, 2009 2:00 PM

@Mike W and Cdek

If everyone ran Ubuntu then, of course, all of the viruses / worms / trojans would be written for Ubuntu.

So what?

The issue is whether the TOTAL number (or percentage) of such infections would be LOWER on Ubuntu than the total number (or percentage) of such infections on Windows.

10 million Windows machines and 10% of them infected means 1 million infections.

10 million Ubuntu machines and 1% of them infected means 100,000 infections.

Eventually, you get to a point where the infection rate falls BELOW the disinfection rate. At which point the virus/worm/trojan ceases to exist in the “wild”.

It’s all about populations.

A system does not have to be immune to all threats.

Just having a population that is highly resistant to threats is “good enough” because it is more likely that the threat will die before it hits YOUR machine than otherwise.

Cdek February 2, 2009 10:21 AM


You are right, at least in theory. I suppose we will have to wait until Ubuntu or any other OS has that kind of market share to see how it plays out in the wild. I personally believe that the common weak link will always be the user. I look at users like I look at drivers: a dangerous one is, well, dangerous…regardless of what car they are driving.

kangaroo February 2, 2009 12:59 PM

Knox is an insufferable fool, pretending to have some kind of genius when he’s using the oldest tools of the trade: invading other processes. Every OS has this issue. With their Intel machines, OSX just closed the Mach port calls that allowed one to overwrite the virtual memory of other processes; now that’s hidden in some setuid part of XCode. It’s for debugging.

Most Linux distributions leave users with the capability to do a ptrace attach, for debugging. Of course, most accounts should have that capability masked out.

Of course, I’d expect those holes to close up fairly quickly as OSX and Linux gain market share — OSX already closed that hole, for example.

But Windows has a massive backwards-compatibility problem. They cannot close the holes without killing their real customers: software vendors. Linux distributions are recompilable, and if you don’t stay up to date, someone will fork your software and update it. OSX has a controlled software base, so they just demand changes for the privilege of working on their platform.

But Windows? Didn’t they end 16-bit compatibility without protected memory just a few years ago? Unlike everyone else, their customers are software vendors; they stay on people’s machines as long as they satisfy the vendors, and not the consumers. So they never fix the bugs, or they create silly security theater like the annoying pop-ups to change your screen saver.

Why would anyone produce features like ActiveX and IE’s other gaping holes? Not for the end consumers, but to speed up software production: it was much easier to put that together so vendors could deliver “features” than to force the vendors to really think out their products.

It’s a function of their market position.

o.s. February 2, 2009 1:56 PM

You make a very good point. For the vast majority of Microsoft’s existence, features have ruled over security. I would say, however, that nowadays Microsoft pays quite a bit more attention to security than it did in the past.
The problem is that Microsoft values backwards compatibility above all else, because of the HIDEOUS undocumented hacks consistently cooked up by software vendors on their platform, instead of dropping old software components that were designed to appease a vendor and don’t really do much to help users of the operating system. It’s a death-spiral situation: supporting the old hacks and previous designs leaves Windows vulnerable to the same old exploits.

Interview with an Adware Developer February 2, 2009 2:37 PM

“Ruby instructor Matt Knox reveals in an interview how the lucrative adware business works, and also talks about security, especially the serious Windows and Internet Explorer security vulnerabilities that allow this business to prosper so much[…]”

Anonymous February 2, 2009 4:57 PM

@ o.s., kangaroo,

There are two problems, both of MS’s own making.

The first is that they made a truly appalling API in the form of Windows MFC: it was badly documented and full of hidden functions.

MS used those hidden functions in their own code, giving rise to accusations of “unfair practice to gain commercial advantage” over their rivals.

And this was the second problem MS made for itself. Those programmers had found how to leverage that “hidden advantage” for their own code. A “macho” programmer ethic built up around MFC, where the “secrets” were kept close to programmers’ chests for their own competitive advantage.

They charged high prices for their services as “contractors”. This made companies jump on the “code reuse” bandwagon as a form of financial self-defense.

Often this took the form of object-code DLLs, the replacement of which can mean significant hits not just in performance but in significant rewrites of both the DLL and the application.

It is this code reuse that MS has to support as legacy, and that is what really hurts. Both they and the software companies know this cannot go on, but change is difficult and expensive.

Often the temporary solution is a “shim” between the application and MS’s newer APIs. These have downsides in terms of functionality, performance, security, stability and testing.

Unix, however, has had fairly open APIs since the early days. In Linux, Linus has made it clear that certain APIs will remain supported and others will change without notice and so should not be used. In practice this affects very few applications (drivers, however…).

Also, for some reason Unix programmers have maintained an “open attitude”, freely passing on their “code hacks” not just within their organisations but to others outside.

The result is the mess that MS finds itself in today, and to be quite honest I’m not sure how they are going to pull their “butt out of the fire”. Forcing more open API use means that moving application code from the closed MS platform to the open Unix platform becomes easier and easier with time.

MS’s .NET initiative makes it just as easy for open developers to replicate the functionality, much as the Wine project has done for the older APIs.

The question is how MS will plot their future now that their grip on their own platform is lessening due to external pressures they cannot avoid.

Brian February 2, 2009 5:19 PM

Interesting article. If you spend any time reading in the reversing community, you will find a lot of interesting approaches to windows. For example, it’s pretty common now to pack a file to make it more difficult to disassemble. However, if you examine the way a packer works, it’s just instructions to decode instructions. This is a lot like what the guy is talking about above.

Now, once folks see problems with windows; for some strange reason, they want to jump to different OS’s. You often hear about how “secure” linux or macs are today. The only reason why you’re not seeing more articles like the one above about linux and macs is market share.

Once their market share goes up, it makes logical sense to write adware for Linux and Macs. And if you have run either, then you already know that neither is really secure by default. The approach to writing adware for either would just be different. Of course, if your browser is giving away private information, then the OS doesn’t really matter.

And given the large number of people who have no idea how to securely operate their PCs, adware will continue. And packers will just make it more difficult for virus detection programs to find it.
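The detection point can be made concrete with a tiny sketch (`hashlib` is Python’s real standard-library module; the payload and keys are invented): a scanner that blacklists the MD5 of a known sample misses the very same payload packed with a different key, which is why hash- and fixed-signature-based detection struggles against packers.

```python
# Toy sketch: the same payload packed with two different XOR keys yields
# two different byte strings, so a fixed MD5 blacklist matches only one.
import hashlib

payload = b"print('payload')"

def pack(data: bytes, key: int) -> bytes:
    """XOR-obfuscate the data with a single-byte key (illustration only)."""
    return bytes(b ^ key for b in data)

md5_a = hashlib.md5(pack(payload, 0x11)).hexdigest()
md5_b = hashlib.md5(pack(payload, 0x22)).hexdigest()
print(md5_a == md5_b)  # False: each repack defeats a fixed-hash signature
```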

Adam W. February 3, 2009 10:57 PM

Sounds to me like this trojan requires the user to intentionally install it before it will do anything. In that case, how is running Linux, MacOS or BSD any more secure than Windows?

mmmmm February 6, 2009 10:30 AM

Always, when there’s some bad story about the total lack of security in Windows, some biased Windows fanboys tell us that the number of problems would be the same for Linux if it had the same market share.

Windows has had a huge number of security problems since its beginning.
Linux has a more than outstanding security history. Never have I read about any Linux worm or virus or other malware. The only thing I heard of was more like a proof of concept – created in the lab – and it would never survive in the big world.

Still, there are many people who think Linux would suffer as much as Windows. It’s just totally ridiculous.

There are many huge differences between Linux and Windows. It’s not because there are only 10 million Linux boxes that they can’t be attacked; it’s because they are just in a different league compared with Windows.
The fact that Linux is open source is probably by far the best way to prevent security problems.

Microsoft always preferred supporting vendors’ wishes (like kangaroo said) and developing new features to attract more vendors over security. Linux has no such (horrible) history and it will never go in that direction – so it doesn’t matter if it were really to get 95% market share. Some problems might pop up, of course, but they would be much easier and faster to combat than would be possible with a closed-source OS. The problems that would rise up would never have as much impact as has happened numerous times with Windows boxes.

It’s plain and simple: the moment some security problem is discovered, the community (or Linus himself?) patches it. A Linux virus doesn’t stand much chance because of this. We can study the virus and easily adapt Linux to prevent the problem from spreading widely.
Studying a Windows virus or malware is hugely more difficult because of the closed source. Only a handful of Microsoft employees will be able to tell what the problem is. We all have to wait for that little group of experts to come up with a solution. As it has always been, that solution only comes if negative publicity is/stays substantial. Otherwise, Microsoft will do NOTHING about the security hole.

It’s not uncommon to find new Windows security holes which have been there in many Windows versions over many years.
That is very uncommon for Linux.

The utterly commercially driven Microsoft hasn’t been able to address the security issues – even with all its billions of profit each year. If all those billions can’t repair Windows, that should say enough: Microsoft still gives no priority to security at all. Its business model is: conquer the land first – after that, we will see how to stay in control of it. If consumers are the victims of that, Microsoft surely doesn’t give a damn about them. As long as consumers are locked into Windows because all the software vendors go with Microsoft, Microsoft will never improve its security in favour of the individual consumer. They just don’t care – as long as Microsoft can hold its near monopoly and expand into many new markets, Microsoft will be happy with just that.

Yes, Microsoft is evil. You knew it all along.

JardaP February 6, 2009 6:58 PM

Some of you mention the possibility of attacking Linux/OS X, if they’d have a sufficient share to drive the attacker’s interest. But social engineering is much easier on Windows. Often all you have to do is add an attachment to a mail and call it “I love you blabla.txt……….exe”. Or create a “gambling” web site and ask the user to install access software or whatever of this type (and you have a good chance that the user is running his computer as an admin). Or alternatively a porn-site access program.

On Linux this would look a bit more complicated. You’d have to add an attachment “I love you blabla_(please save this file, set the executable flag and double-click it).txt”.

It takes a much more stupid user on Linux to infect his computer.

Or it takes vulnerable software, which appears every so often on any platform. The thing is that the Linux holes are quite rapidly patched in most cases, while in Windows there are sometimes years-old holes that nobody bothers to fix. Also, nobody runs Linux from 1998 or even 1995 (and the Internet will probably be a safer place to go when all this Win 9x/Me shite dies out). Linux distros are probably mostly rather up to date. Windows is nowadays perhaps also, but not necessarily the software which runs on top of it. Windows Update somehow doesn’t care about software other than Windows, while e.g. apt-get does it all in one run. And often people don’t bother patching until they need a new feature (a feature, not a version patched for security). So even if Linux got 90% of the desktops, it would be a much harder target than Windows anyway. Add to it the fact that an exploit would have to target different distros with different versions of the kernel and software, all differently patched by the distributor, and many an attacker might prefer to go and do honest work, because it would be easier.

Clive Robinson February 7, 2009 5:45 AM

@ JardaP,

“Some of you mention the possibility of attacking Linux/OS X, if they’d have a sufficient share to drive the attacker’s interest. But social engineering is much easier on Windows.”

You have forgotten to take into account that both “crime” and “user limitations” are OS agnostic. And that MS never really developed a true multi-user OS like Unix, so never implemented the required level of security in their “server range” (NT) of OS products or their now defunct “user range” (DOS) OS products.

Crime is OS agnostic in that it will always focus on where the desirable assets are.

As Willie Sutton pointed out, he robbed banks because that was where the money was. And as was made famous in “All the President’s Men”, criminals “follow the money”. Therefore criminals will do whatever it takes to get at the money. The relative strength of security of one OS over another does not really matter except in the level of effort and resources required to achieve the desired objective.

Users are likewise OS agnostic: they care not a jot which OS they use; it’s the user interface and applications they care about.

One of the prime reasons for a graphics-based display system over a text-based one is “information richness”. Both text and graphics display systems can support a “windowing” information overlay system to effectively increase information capacity and thereby support user-level application multitasking.

However, the greater detail available to a graphics system allows a graphical, as opposed to text-based, control system for the overlay mechanism, which most people find easier to learn to use initially. Hence the popularity of a windows-based display User Interface (UI) system.

Therefore a graphical windows-based display UI allows effective use with less user training. This is one reason the use of the command line is considered “geeky” (though it is actually considerably more efficient in most respects once the slower learning curve has been negotiated).

Which brings us around to why MS Windows and OS products are less secure than Unix-based systems. The answer is history, and the fact that MS OS products have never been truly multi-user. They have always been single-user and network-server products.

The MS DOS OS became the default OS on IBM PC compatible hardware. However, it was originally not multitasking, due to hardware-related limitations, and had no mechanism to support more than one user application at a time. Thus MS DOS needed no process-related security mechanisms, so they were not designed in. It was only with the development of Terminate and Stay Resident (TSR) programs by non-MS organisations that it became possible to have two user applications in memory at the same time. MS then built support for TSR capabilities into MS DOS as the capabilities of the hardware (IBM AT), memory and processor (80286/80386) could support it. However, the AT platform was still seen as a single-user standalone system; even though it could run multiple applications, it did so via an inefficient task-switching mechanism in which each task assumed full control of the system resources and was switched by the user, not the OS. Thus it was not seen as multitasking and still needed no security.

With the advent of the 80286, AT&T decided that it should be possible to put Unix onto the Intel-processor IBM AT platform. They looked around for organisations to do the port. Although MS did not have the in-house capability to port Unix, they bid for and won the job. They then found three geeks working out of a garage who grandly called themselves the “Santa Cruz Operation”, which later became just SCO. They did the Unix port for MS and thus AT&T, but due to the typical contract shenanigans that MS excelled at, MS was paid royalties for all AT&T Unix code on Intel platforms (which accounts for part of their interest in the SCO problems of more recent times, and why you still find MS copyright messages in old AT&T Intel Unix header files).

It was doing this port, and the rapidly expanding capabilities of the Intel x86 platform, that made MS realise that they needed a proper (i.e. pre-emptive) multitasking operating system.

They took on a guy called Dave Cutler, who wanted to make a “better Unix than Unix”, and MS’s New Technology (NT) OS came into being. Due to time-to-market and hardware limitations, a lot of stuff was left out of NT, most of which had security implications. This was because MS had decided to compete with Novell’s network server, not with the Unix “multi-user” or IBM-type Big Iron “job processing” server systems.
Prior to version 3.0, MS Windows was considered inferior to most other windows systems. However, MS had the advantage of supplying the predominant OS, and thus of having distribution and other market-related factors in place, which they used to push their windows system. Importantly, the underlying OS remained separate.

As of Win95, MS effectively stopped supplying the OS and windows system separately.

However, it was still a single-tasking operating system with a pseudo-multitasking windows UI on top of it. Not only was there no security (as a single-user system it was still considered not required), it now had significant stability issues due to backwards compatibility, with applications walking all over each other’s memory spaces and creating DLL hell.

This silliness continued through to MS Windows 98, 98SE and ME.

The problem MS had was that they wanted the server product to also have the same “look and feel” and backwards compatibility, which meant that most of the security issues were not addressed in that product range either.

With NT4 it became obvious to any organisation with an Internet connection that MS NT did not have sufficient security. MS finally, after much public criticism, bit the bullet and started putting in the security that should have been there all along.

However, security is analogous to quality, in that it is a process that has to be built in from the start for it to be effective.

You could look at it this way,

One person can saddle and harness ten or more horses in stables in a morning without difficulty. They can maybe do two in a paddock, if the horses are co-operative or the paddock is small. But ten horses running free on open range are unlikely ever to be caught and saddled by one person…

MS’s problem is they are also trying to do it with both hands tied behind their back in the dark….

Clive Robinson August 11, 2009 5:17 AM

If you want to see how sneaky malware developers are getting at confusing the ordinary user, how about pretending to be a malware detector running as part of MS’s built-in security…

I was having a sniff around the net today, looking for dodgy malware sites found via search engines such as Google, and the following site struck me as being a good example (I have put the spaces in deliberately):

http : // go – in – search . net

What the malware does is fake the MS Security Center scanning your PC and finding some malware. It looks convincing on an MS platform (and obviously fake on any other OS).

Oh, when I go looking for this stuff I use a CD-ROM-only system on old PC hardware, for which I have all the BIOS flash and other tools to put it back to a known state. The one I use for hunting (“wolf”) has no HD or other normal mutable memory (except the flash BIOS, which I check out after use with some hardware I designed for the job). I have a number of OSes on CD-ROM that I use in it. This system connects via an old-fashioned hub (not a switch) which has various bits of network monitoring equipment on it. If I find something of interest, I have another PC (“goat”), again using old hardware, with removable HDs that are copied from a known good master, likewise the flash BIOS etc. After becoming infected it is checked out for changes and compared against the network monitoring logs.

Let’s just say that some of what goes on shows just how serious some of these malware writers are getting at their trade, and it does not bode at all well for the average PC user.

I can see a time in the not too distant future where the battle will effectively have been lost to the malware writers on the existing platform methodologies, simply because, for the more popular OSes, security was not built in from the very start, and retrofitting is a long, slow and painful process.

This is because malware writers are starting to move away from the OS, both upwards to applications and downwards to the BIOS etc. The upward trend has been happening for a while now, with the web browser or plug-ins being the target (Google Chrome tries to combat this by bringing OS security mechanisms into the app).

However, it is the downwards direction that is the scary one, as it requires a whole new methodology of protection that cannot be achieved by software alone…

One (partial) solution is to have a ROM-based system, but this has its own disadvantages; there are, however, workable methods around some of the problems (and no, I do not mean “use signed code” or TPM etc.).
