Zoom Exploit on macOS

This vulnerability was reported to Zoom last December:

The exploit works by targeting the installer for the Zoom application, which needs to run with special user permissions in order to install or remove the main Zoom application from a computer. Though the installer requires a user to enter their password on first adding the application to the system, Wardle found that an auto-update function then continually ran in the background with superuser privileges.

When Zoom issued an update, the updater function would install the new package after checking that it had been cryptographically signed by Zoom. But a bug in how the checking method was implemented meant that giving the updater any file with the same name as Zoom’s signing certificate would be enough to pass the test—so an attacker could substitute any kind of malware program and have it be run by the updater with elevated privilege.
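
The article doesn’t include the vulnerable code, but the class of bug it describes (string-matching a verification tool’s output instead of properly validating the signature and its certificate chain) is easy to sketch. The following Python fragment is purely illustrative; the signer strings and the use of pkgutil are assumptions for the sketch, not Zoom’s actual implementation:

```python
# Hypothetical sketch of the class of bug described above -- not Zoom's code.
# It models a signature "check" that greps a verification tool's output for
# the expected signer names instead of checking the tool's result and
# validating the full certificate chain.
import subprocess

REQUIRED_STRINGS = [
    "Developer ID Certification Authority",
    "Zoom Video Communications, Inc.",
]

def naive_signature_check(pkg_path: str) -> bool:
    # pkgutil --check-signature echoes the package's file name along with
    # the certificate chain. Because this check only looks for substrings
    # anywhere in that output, a package whose *file name* contains the
    # expected signer strings passes even if it is unsigned or signed by
    # someone else entirely.
    result = subprocess.run(
        ["pkgutil", "--check-signature", pkg_path],
        capture_output=True, text=True,
    )
    output = result.stdout
    return all(s in output for s in REQUIRED_STRINGS)

# e.g. an attacker-controlled package named something like
#   "Zoom Video Communications, Inc.Developer ID Certification Authority.pkg"
# would satisfy naive_signature_check() in this sketch, because the path is
# echoed back in the tool's output.
```

A correct check would, at minimum, require the verification tool to report success and confirm that the signing identity and chain match the expected signer, rather than searching for substrings in output that also echoes attacker-controlled file names.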

It seems that it’s not entirely fixed:

Following responsible disclosure protocols, Wardle informed Zoom about the vulnerability in December of last year. To his frustration, he says an initial fix from Zoom contained another bug that meant the vulnerability was still exploitable in a slightly more roundabout way, so he disclosed this second bug to Zoom and waited eight months before publishing the research.

EDITED TO ADD: Disclosure works. The vulnerability seems to be patched now.

Posted on August 17, 2022 at 6:11 AM

Comments

Q August 17, 2022 9:09 AM

I can’t help wondering whether allowing auto-updates is the real problem here.

Once an application has been verified to do what the user wants, and to have no extra “bonus” functionality like sending back telemetry, a single auto-update can undo all of that verification work.

With auto-update, people open themselves up to abuse: companies can add new unwanted “features” whenever they feel like it, ostensibly to “enhance your experience”, to “improve security”, or some other lie.

Lucy B. August 17, 2022 10:41 AM

@ Q,

I can’t help wondering whether allowing auto-updates is the real problem here.

Not directly. While I have a similar feeling, and agree they’re often abused, the data seems to show that auto-updates improve security overall. However, that’s predicated on the inability of programmers to write software correctly; if we could do that, auto-updates wouldn’t be needed. But we apparently can’t (even qmail has been exploited). So we just add another layer of sandboxing occasionally.

I think the larger problem in this case is the privilege model and/or the abuse of it. Why does Zoom’s installer “need to run with special user permissions”, and why do users put up with that? Why is an “installer” needed at all?

It doesn’t help that Zoom’s entirely proprietary; if we’d been able to review source code, and many people were using third-party implementations, the problem wouldn’t have been so big. But companies feel they can get more money by not revealing anything, and users let themselves be sold out.

Clive Robinson August 17, 2022 12:15 PM

@ Lucy B., Q, ALL,

Re : Correctly written software

“However, that’s predicated on the inability of programmers to write software correctly; if we could do that, auto-updates wouldn’t be needed. But we apparently can’t… “

But,

1, We have.
2, We can.
3, We do.

Write good software, that is. I’ve written a lot of software that had no reported functional or security defects up to the time it was EOL’d.

Some software I’m still supporting after four decades now… It was written for an Apple ][ in UCSD Pascal, has been ported a number of times, and has had updates as hardware became obsolete.

There was nothing magical or even exceptional about the way it was written, and we can still do the same if we wish to.

Back in the 1980s people were still doing what had been the norm in the 1970s and before, which was paying for bespoke development. It was, however, seen in the US as,

1, Expensive.
2, Slow.
3, Needlessly involved.

Because the first commercial/consumer applications were written using the same methodology, they tended not to have issues.

However, people wanted to “crank the money machine”, and thus things had to be done faster; steps that money-orientated investors saw as needless got pulled, and the result was that quality started to drop. Worse, “kitchen sink marketing” came along and wanted a huge slice of the pie for basically making “dumbass promises to customers” just to make a sale.

The result was Micro$haft behaviour. But that is still not close to what we see today…

To crank the handle even faster we got the “code reuse” mantra. The problem was you had to write not half a page to a page of code, but five to ten pages of “all things to all men” code, which unsurprisingly was not at all good.

One thing that happened was all sorts of input validation got removed.

Another was that all exception handling got turned into “Blue Screen of Death” code.

And the list goes on and on, every money man looking to,

1, Cut costs.
2, Use fewer developer resources.
3, Cut proper design methods.
4, Outsource to the cheapest labour.
5, Pressure developers with too-short time scales.
6, Add features that won’t be used but put vulnerable code in the tree.

And so on… And people wonder why their apps lock up, have security vulnerabilities, and crash out…

Annoying on a PC or smart pad, etc. But what about when “things get physical”?

Those 737s that dropped out of the sky were, as we know but people won’t officially admit, down to,

1, American Airlines being tight-fisted.
2, Boeing marketing pandering to the AA executives’ blackmail.
3, The money men pushing the aviation authorities to ease up on what the money men saw as needless.

I could go on, but as any halfway competent engineer will tell you, quality is a process that starts before a project and stays in place for the whole project. As such it has rules; if you break them you will not get quality, not even the illusion of quality, and yes, when energy and matter are involved, lack of quality kills, plain and simple.

Oh, what sort of software methodology do you think Google and others are using for their self-driving cars/trucks and unmanned aerial vehicles… I guess you probably do not want to know if you are ever going to get in one, or allow one to come in range of you…

Ted August 17, 2022 1:17 PM

@Lucy B., Q, Clive, All

It doesn’t help that Zoom’s entirely proprietary; if we’d been able to review source code…

It looks like Patrick Wardle kindly posted the slides for his Def Con talk “You’re M̶u̶t̶e̶d̶ Rooted.”

I’m trying to figure out what code is displayed in his presentation. Any ideas?

https://speakerdeck.com/patrickwardle/youre-muted-rooted

SpaceLifeForm August 17, 2022 1:58 PM

@ Ted

“Your winnings Sir”

He reverse engineered the binaries.

You have to scroll horizontally through the slides. Do not try it on mobile. It will be an eye test that you will fail.

nate August 17, 2022 4:36 PM

Most likely these fixes and validations come from work outsourced to the eastern part of the world with cheap labor? Don’t try to compare such trivial bugs with Boeing etc.

Ted August 17, 2022 9:27 PM

@SpaceLifeForm, All

He reverse engineered the binaries.

Thanks.

You have to scroll horizontally through the slides. Do not try it on mobile.

Dang. Zoom is the gift that keeps on giving. To be fair though, Wardle literally wrote the book on Mac malware. Glad there are a few people out there who know what’s up.

DoYouReally August 17, 2022 10:30 PM

Do you really want to run a binary blob from some third party you know cut every corner in the book? And give it root access to boot?

I didn’t. Turns out, you don’t need their binary. Once you click download, you can run Zoom in your web browser without installing anything.

Granted, given the state of Chrome et al, that’s not necessarily entirely safe either. But it’s Google or Firefox running a binary blob as root rather than some fly-by-night third party. And it works on a chromebook in guest mode, which can be cleared & reset afterwards.

Ismar August 17, 2022 10:39 PM

I find this even more interesting than the original bug:

“ But because of a subtlety of Unix systems (of which macOS is one), when an existing file is moved from another location to the root directory, it retains the same read-write permissions it previously had. So, in this case, it can still be modified by a regular user. And because it can be modified, a malicious user can still swap the contents of that file with a file of their own choosing and use it to become root.”
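
That behavior is easy to reproduce. Here is a minimal, hypothetical Python sketch (the file names and directories are made up, and nothing here is from Wardle’s exploit) showing that moving a file preserves its original permission bits, so a previously world-writable file stays writable by a regular user even after it is moved into another directory:

```python
# Illustrative sketch: on Unix-like systems, moving an existing file
# preserves its mode bits, so a file that was world-writable before the
# move stays world-writable afterwards -- even if a privileged process
# moves it into a protected, root-owned directory.
import os
import shutil
import stat
import tempfile

src_dir = tempfile.mkdtemp()
dst_dir = tempfile.mkdtemp()   # stand-in for a privileged destination

src = os.path.join(src_dir, "update.pkg")
with open(src, "w") as f:
    f.write("attacker-controlled contents\n")
os.chmod(src, 0o666)           # world-writable before the move

dst = os.path.join(dst_dir, "update.pkg")
shutil.move(src, dst)          # the move does not reset the mode bits

mode = stat.S_IMODE(os.stat(dst).st_mode)
print(f"mode after move: {oct(mode)}")   # still 0o666: any user can rewrite it
```

In the exploit scenario the move is performed by the privileged updater, but because the permission bits travel with the file, the unprivileged user who created it can still rewrite its contents afterwards.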

SpaceLifeForm August 20, 2022 9:15 AM

The fix was incomplete

‘https://nitter.net/theevilbit/status/1560123877086347264#m
