Online Privacy and Overfishing

Microsoft recently caught state-backed hackers using its generative AI tools to help with their attacks. In the security community, the immediate questions weren’t about how hackers were using the tools (that was utterly predictable), but about how Microsoft figured it out. The natural conclusion was that Microsoft was spying on its AI users, looking for harmful hackers at work.

Some pushed back at characterizing Microsoft’s actions as “spying.” Of course cloud service providers monitor what users are doing. And because we expect Microsoft to be doing something like this, it’s not fair to call it spying.

We see this argument as an example of our shifting collective expectations of privacy. To understand what’s happening, we can learn from an unlikely source: fish.

In the mid-20th century, scientists began noticing that the number of fish in the ocean—so vast as to underlie the phrase “There are plenty of fish in the sea”—had started declining rapidly due to overfishing. They had already seen a similar decline in whale populations, when the post-WWII whaling industry nearly drove many species extinct. In whaling and later in commercial fishing, new technology made it easier to find and catch marine creatures in ever greater numbers. Ecologists, specifically those working in fisheries management, began studying how and when certain fish populations had gone into serious decline.

One scientist, Daniel Pauly, realized that researchers studying fish populations were making a major error when trying to determine acceptable catch size. It wasn’t that scientists didn’t recognize the declining fish populations. It was just that they didn’t realize how significant the decline was. Pauly noted that each generation of scientists had a different baseline to which they compared the current statistics, and that each generation’s baseline was lower than that of the previous one.

Pauly called this “shifting baseline syndrome” in a 1995 paper. The baseline most scientists used was the one that was normal when they began their research careers. By that measure, each subsequent decline wasn’t significant, but the cumulative decline was devastating. Each generation of researchers came of age in a new ecological and technological environment, inadvertently masking an exponential decline.
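
To see how this plays out numerically, here is a small illustrative sketch in Python; the starting stock and the per-generation decline are hypothetical figures chosen for illustration, not Pauly's data.

```python
# Illustrative only: hypothetical numbers showing how per-generation
# comparisons hide a much larger cumulative decline.
population = 100.0               # original stock, as a percentage
decline_per_generation = 0.30    # assumed 30% loss witnessed by each cohort

for generation in range(1, 6):
    baseline = population                    # this cohort's "normal"
    population *= 1 - decline_per_generation
    print(f"generation {generation}: "
          f"{1 - population / baseline:.0%} below its own baseline, "
          f"{1 - population / 100:.0%} below the original stock")
```

Each generation sees only a 30 percent drop from the baseline it grew up with, yet after five generations the stock is down more than 80 percent from where it started.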

Pauly’s insights came too late to help those managing some fisheries. The ocean suffered catastrophes such as the complete collapse of the Northwest Atlantic cod population in the 1990s.

Internet surveillance, and the resultant loss of privacy, is following the same trajectory. Just as certain fish populations in the world’s oceans have fallen 80 percent, from previously having fallen 80 percent, from previously having fallen 80 percent (ad infinitum), our expectations of privacy have similarly fallen precipitously. The pervasive nature of modern technology makes surveillance easier than ever before, while each successive generation of the public is accustomed to the privacy status quo of their youth. What seems normal to us in the security community is whatever was commonplace at the beginning of our careers.

Historically, people controlled their computers, and software was standalone. The always-connected cloud-deployment model of software and services flipped the script. Most apps and services are designed to be always-online, feeding usage information back to the company. A consequence of this modern deployment model is that everyone—cynical tech folks and even ordinary users—expects that what you do with modern tech isn’t private. But that’s because the baseline has shifted.

AI chatbots are the latest incarnation of this phenomenon: They produce output in response to your input, but behind the scenes there’s a complex cloud-based system keeping track of that input—both to improve the service and to sell you ads.

Shifting baselines are at the heart of our collective loss of privacy. The U.S. Supreme Court has long held that our right to privacy depends on whether we have a reasonable expectation of privacy. But expectation is a slippery thing: It’s subject to shifting baselines.

The question remains: What now? Fisheries scientists, armed with knowledge of shifting-baseline syndrome, now look at the big picture. They no longer consider relative measures, such as comparing this decade with the last decade. Instead, they take a holistic, ecosystem-wide perspective to see what a healthy marine ecosystem and thus sustainable catch should look like. They then turn these scientifically derived sustainable-catch figures into limits to be codified by regulators.

In privacy and security, we need to do the same. Instead of comparing to a shifting baseline, we need to step back and look at what a healthy technological ecosystem would look like: one that respects people’s privacy rights while also allowing companies to recoup costs for services they provide. Ultimately, as with fisheries, we need to take a big-picture perspective and be aware of shifting baselines. A scientifically informed and democratic regulatory process is required to preserve a heritage—whether it be the ocean or the Internet—for the next generation.

This essay was written with Barath Raghavan, and previously appeared in IEEE Spectrum.

EDITED TO ADD (6/23): This essay has been translated into German.

Posted on June 5, 2024 at 7:00 AM

Comments

K.S. June 5, 2024 7:39 AM

Chaff generation/database poisoning needs to become easily available and mainstream. The general population adopted ad blocking; there is no reason the same could not be done here.

Mass commercial surveillance is only possible if generated data is mostly accurate. Undermine accuracy and there won’t be a good reason to implement industrial-scale privacy violations.
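
As a purely hypothetical sketch of what that could look like (the decoy topics, function names, and rates are invented for illustration; a real tool would have to make decoys statistically indistinguishable from genuine traffic to survive filtering):

```python
# Hypothetical illustration of query "chaff": mix decoy lookups in with
# real ones so that any profile built from the traffic is mostly noise.
import random
import time

DECOY_TOPICS = [
    "weather in oslo", "sourdough hydration ratio", "used bike prices",
    "cheap flights to lisbon", "python list comprehension", "compost tips",
]

def send_query(query: str) -> None:
    # Placeholder: a real implementation would issue this through the same
    # browser profile as genuine traffic, or the decoys are trivially separable.
    print(f"query: {query}")

def browse_with_chaff(real_queries, decoys_per_real=3):
    for query in real_queries:
        send_query(query)
        for _ in range(decoys_per_real):
            time.sleep(random.uniform(0.5, 2.0))  # jitter so timing isn't a tell
            send_query(random.choice(DECOY_TOPICS))

browse_with_chaff(["privacy-friendly email providers"])
```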

Related to the above, I really enjoyed all the gardening advice I have read on this blog.

Allen June 5, 2024 7:43 AM

The most amazing thing about the recent Microsoft Recall announcement was how muted the responses were, even in the DoD community in which I work. There was the initial shock, which went to acceptance within a day or two. Even 10 years ago no one would have accepted it.

K.S. June 5, 2024 8:11 AM

@Allen

I think the fight with MS about Recall will happen through JITC; I can't imagine it will be enabled for DoD deployments. The only question is whether this opt-out will be available to everyone or only to Gov folks.

jg June 5, 2024 10:21 AM

There was a time before there was (at least one) computer on every desk. We logged in to a large shared system over a POTS analogue phone line. Pre-history to “Historically, people controlled their computers, and software was standalone.”

Daniel Popescu June 5, 2024 10:34 AM

The fishing analogy and the shifting baselines are spot on, Bruce. At the start of my career I used to, among other things, service and maintain lots of different lab equipment of varying complexity, from simple analytical balances and electronic pipettes to HPLCs, gas and mass spectrometers, and so on, and it was always a struggle to explain to senior management and to the scientists who were the users of said equipment why tuning, calibrations, and periodic maintenance are so important, and costly (for them): if you want good results, you need a perfect baseline.

Then I became an ISO 17025 auditor, the explaining and the acceptance of it got easier, and I left the company :).

Jaime June 5, 2024 10:34 AM

Allen and K.S., “JITC” refers to a part of the U.S. Department of Defense. Will there really be a “fight”? I’m sure any Domain Administrator will be able to disable the feature via group policy, and that militaries are ardent users of group policy. Should they fight it? It won’t directly harm them, but if the general public loses any expectations of computers keeping secrets, maybe that has harmful follow-on effects (relating to how people treat classified data, for example). Or perhaps militaries and governments will take advantage of it in the opposite way: have all screenshots sent to management, once computer users are accustomed to the idea that someone is going to be grabbing their data.

Allen, your “no one would have accepted it” story reminds me of Windows (XP) Product Activation. My fellow university students at the time, and the internet at large, said the same thing; it’s an outrage, we won’t stand for this, I’ll never use Windows again, etc. One friend actually did switch to Mac. A couple of others switched to Windows 2000 for a bit, but eventually went back to the mainstream releases, having apparently stopped caring. Ultimately, there’s no evidence of any significant harm to Microsoft’s profits from that decision, which they never reversed.

Bruce and all, I can’t help but notice the anthropocentric biases associated with “overfishing”. The very term “fishery” makes it sound like something invented to satisfy humanity, when actually it mostly means the places humans go to catch fish—that is, the parts of the ocean where we’ve found the profitable ones to occur naturally. The concept of “fish stock” refers specifically to the species of fish “of interest” to the people catching and selling them. I’m not sure whether this matters much in relation to Bruce’s point. But, it’s a measurement bias; and, hypothetically, maybe it’s mostly humans (who depend on fish as food) who are harmed. Maybe the fish have become harder to catch, having moved out further to sea or to the sea floor; or have been replaced by other species altogether.

I wonder whether there is, similarly, a subset of privacy-seeking humans, going largely unnoticed by the general public. I won’t carry a cellular phone, for example, because I don’t want to be tracked, but the idea of being anywhere without a powered-on phone seems to be something completely alien to many people; during the early days of COVID, there were several times I had to explain to someone that I had no way to call their business on arrival, nor to fill out the online health questionnaire (inevitably, they didn’t really care; it was “safety theater” to appease other customers and maybe the government). Or if you’re using Google Analytics, software telemetry, or the like, to learn about your audience, keep in mind that some of us block that stuff. If you block “the country of TOR” and pat yourself on the back when the number of “blocked malicious requests” goes up, well, that’s confirmation bias; a good data scientist should always be considering falsifiability.

cybershow June 5, 2024 11:01 AM

one that respects people’s privacy rights while also allowing
companies to recoup costs for services they provide.

I would have stopped that sentence at the word “rights” and saved
eleven words Bruce. 🙂

Nobody gave tech companies permission to treat private data as
de-facto currency. No person or legal system I know of ever, even
tacitly or incidentally, granted leave to foist such an arrangement on
society. The idea that “we pay for services with data” is an
unexamined figment of West-coast US culture.

I neither asked for nor wanted the disservices provided, and where I
do I’ll happily pay for them – using real money – and conduct my
technological affairs with entities of my choice whom I trust.

Nobody owes these people a living. The baseline “expectation” we need
to adjust is BigTech’s expectation of charitable subsidy in kind and
legal exceptionalism.

BTW the shifting baselines observation is useful and definitely
something security engineers could take more care of to widen our
scope.

For those curious I strongly recommend studying the work of Forrester
and Meadows. There’s a great video of Dana’s lectures where she talks
about fishing, tipping points and problems of absolute measurements
while sampling. Macro systems theory is essential to understanding
some parts of cybersecurity, and these are subjects we often visit on
Cybershow.

regards,
Andy

What price common sense? June 5, 2024 11:37 AM

@Bruce Schneier

“In the security community, the immediate questions weren’t about how hackers were using the tools (that was utterly predictable), but about how Microsoft figured it out. The natural conclusion was that Microsoft was spying on its AI users, looking for harmful hackers at work.”

We know Microsoft are "spying", as has been said on this blog: "LLM and ML AI is a surveillance tool to gather PPI". Or as it has been put in

https://www.schneier.com/blog/archives/2024/02/microsoft-is-spying-on-users-of-its-ai-tools.html/

‘Thus the only way the current owners of AI systems based on LLMs can get their money back, and make money on them going forward is as “surveillance engines”.

Fix in your mind,

“Bedazzle, Beguile, Bewitch, Befriend, and BETRAY”’

Because this just shows it’s already in full progress and all your privacy will be “stripped bare” and sold over and over.

The days of “Personal Computing” are long gone, replaced by “Surveillance Conscription”: everything you type, view, or do will become part of your record.

And some modern day Cardinal Richelieu will tap a few keys on a keyboard and somebody’s record will fit some need for a show trial. And a probably innocent person will be hanged on high to sate the beast of blood lust such show trials engender.

Kent K June 5, 2024 12:14 PM

This analogy also pretty clearly explains why some of us older curmudgeons would say younger generations don’t care about data privacy. However – they do care, it is just that they operate from a different baseline than we do. They have no reference to what we saw 20, 30 or 40 years ago. Agreeing on a common yardstick would be a useful (but perhaps Quixotic) exercise.

Hal June 5, 2024 12:36 PM

@jg Prehistoric to that, you might’ve had to take a tray of ordered punchcards to the computer room. Prehistoric to that, “computer” was a human job title.

Jaime June 5, 2024 12:52 PM

cybershow wrote:

No person or legal system I know of ever, even tacitly or incidentally, granted leave to foist such an arrangement on society.

Tacit permission is given by people every time they accept such an arrangement; by legal systems every time they issue a meaninglessly-tiny fine for privacy breaches.

Or, as a specific example, personal credit ratings hardly existed before the 1960s, and U.S. Social Security cards explicitly said they were not to be used for identification. Companies started requesting SSNs anyway, and people provided them, and the government eventually removed that message—giving tacit permission. The U.S. government also started regulating the credit bureaus, thus tacitly allowing all the prior behavior that was not explicitly proscribed; notably, no consent was ever required for any company to share data.

Less tacitly, the E.U. keep explicitly modifying their privacy laws to ensure the U.S. can still process their data. Safe Harbor gets shot down by the courts, so then it was Privacy Shield; courts later declared that illegal—as before, only after data had been shared for years—and now it’s the Data Privacy Framework. And the E.U. aren’t the only ones explicitly reducing protections when they’re found to impede the status quo. Publishing subscriber names in phone books without consent was an obvious violation of Canadian privacy law, once it (PIPEDA) had been implemented in the year 2000; but the courts and/or regulators said it was okay, basically because people had grown to expect this privacy-invasion (and, also, one could pay for privacy, or give a fake name for free).

lurker June 5, 2024 2:25 PM

@Bruce

Surely “shifting baseline syndrome” must be a weakness of the education system producing those scientists. Why did they never look back for historic data?

Similarly with computer networking, my personal experience doesn’t go back as far as @jg’s POTS connections, but I knew people who had used it. Our connection was always at the edge even as the network topology and technology grew, so the network was always “out there”. Any system that requires constant connection to a “cloud” is a case of “here be dragons.” Failure to use dragon hunting measures is a failure by the users. Much as I hate Microsoft, I don’t blame them for taking advantage where they can.

What price common sense? June 5, 2024 6:03 PM

@Hal
@jg
@lurker
@ALL

“Prehistoric to that, you might’ve…”

You left out

  1. Wired up integrators
  2. Adjusted screws on track run lengths

On old electronic and mechanical “analog computers” we would now at best call “graphing calculators”.

(Both of which I’ve done in my time, but let’s leave out “ladder logic” with Norbits.)

But such compute engines got bomb and gunnery sights “on target” from planes and battleships in WWII and through the Vietnam war, and were used on rockets that took men and women into space and brought them safely back again.

Even now “mechanical calculators” can still be found at work in machine shops for cutting metal ‘piece work’ items.

They were, and still are, more robust designs than the overly expensive “Computer Numerical Control” (CNC) machines that came after them, which are in effect “subtractive” devices, the opposite of 3D printers, which are “additive” and all the rage.

Oh, and the reason the electronic circuit that gets used by the tens of millions each year is called an “op-amp” is that it’s short for “operational amplifier,” which was the mainstay component in analog computers and these days is found in just about all analog electronic circuitry, even in the more refined form of “instrumentation amps”.

The world goes around and quite a few things drop off and appear to get forgotten… but some things stick in places and ways that would not have been envisioned originally 😉

As a side note, the earliest machine you could call a computer after a loom was the “pattern cutter” that saw major service during the US Civil War, cutting out rifle stocks by the tens of thousands and similar parts en masse… The sad thing is, when you think about it, there is nothing like killing people en masse to get things automated 🙁

echo June 6, 2024 3:35 AM

I would have stopped that sentence at the word “rights” and saved
eleven words Bruce.

That’s the critical thing and a very definite blind spot on this blog and a manufactured loophole in other places. It’s definitely exploited for reasons of greed but also exploited politically. It’s pretty much why I don’t regard the US as a trustworthy partner. As for Russia and China? Yeesh.

In the UK the Tories are treading down this path. Thankfully they’re tanking in the polls but until they are hung, drawn, and quartered and their remains scattered to every corner of the earth I’m not going to feel remotely comfortable.

pup vas June 6, 2024 3:53 PM

https://www.fastcompany.com/91132643/how-to-find-the-balance-between-telling-it-like-it-is-and-displaying-good-people-skills

=Modern self-help advice often pushes us to be less bothered about what others think, to express our genuine and authentic thoughts and emotions without fear of repercussions, and to pick radical candor over polite conflict avoidance. According to this advice, a truth that hurts is better than a comforting lie, even if the lie is intended to enhance trust and promote positive work relations. The idea is that truth is bitter initially but sweet in the end, while the opposite is true for lies (pun intended).

More often than not, our work colleagues will not care much about what we think deep down about different issues, especially if they don’t pertain to work. Just because your views are controversial and likely to upset others doesn’t make them right (just like the conventional nature of opinions doesn’t make them wrong). Indeed, more often than not, telling it like it is is not indicative of intelligence. Poor social skills, if not a sign that you are too self-centered to care about what others think, give the impression you can’t contemplate the possibility that their views are right and yours are wrong.

That said, if you always tell people what they want to hear, you won’t be regarded as socially skilled or emotionally intelligent, but as a fraud, fake, or phony. We have all seen this type of person: someone who always compliments everybody, is exaggeratedly positive and nice, and never overtly disagrees with anyone. It’s that painful moment when someone crosses the line between political skills and acting like a politician.

It’s probably just as risky to trust individuals who are too inconsiderate to modulate or temper down their grumpy thoughts and negative opinions.

Although fake politeness may be preferable to genuine rudeness, they are equally incompatible with trust, and as likely to inhibit connections with others.

How much do you care about the other person’s reactions, versus putting out your opinion? Would you rather say what you want but hurt their feelings, or focus on building or maintaining a positive relationship, even if it requires not expressing your views?

For example, how important is it for your coworkers to hear how you feel about the upcoming presidential election, abortion, geopolitics, and the crisis in the Middle East?

In general, polarizing views can achieve only two outcomes: to either strengthen the already tribal connection between you and people who think like you or to antagonize those who don’t (or both). People rarely change their views on issues that deeply connect with their values, experiences, and identity.

Understand that voicing your views may ignite conflict and confrontation, so be sure to pick your battles. Just like you won’t have any credibility if you agree with everyone about everything all the time, you will also not be taken seriously if you disagree with everyone on everything all the time.

If you hope to persuade people to think like you, then be sure to assess whether this goal is attainable, and the pros and cons of winning or losing the argument.

Importantly, how you do this is even more critical than what you do and why (see next point).

People will care less about your views than your ability to convey them in a way that shows an awareness of social norms, the dominant etiquette, and, fundamentally, an attempt to consider other people’s views and feelings.

People will be able to tolerate your views more easily if you make an effort to be understood, don’t underestimate or patronize them, and display an ability to appreciate the nuances, caveats, and different perspectives on the issue.

Your degree of conviction doesn’t need to be matched by assertiveness or arrogance. You will more likely persuade others—or at least encourage them to consider your view—if you inject a dose of humility and empathy.

if you want to encourage others to tell you what they really think, you’d better display some tolerance, open-mindedness, and an ability to connect with other people’s ideas even when they are different.”

What price common sense? June 7, 2024 7:49 AM

@ Howard NYC

From the end of Charlie’s well written article about MS Co-Pillock

“Some commentators are snarking that Microsoft really really wants to make 2025 the year of Linux on the Desktop, and it’s kind of hard to refute them right now.”

I immediately thought of that old saying,

“Never interrupt your enemy when he is making a mistake”

(Allegedly by Sun Tzu from ‘The Art of War’, but others claim it was “Old Boney” Napoleon Bonaparte, part-time French Emperor and, like Emperor Nero, too often on the fiddle).

Jaime June 7, 2024 11:22 AM

Microsoft committing suicide? No, I think Charles and y’all are underestimating this “baseline effect”. I mentioned Product Activation already. There was also all kinds of press about the privacy implications of Windows 10 and its data collection. “Is it even legal for a lawyer, doctor, accountant, etc., to use Windows 10 on systems dealing with customer data?” How many such offices actually switched away from Microsoft stuff?

Microsoft know quite well that there’s a cycle to this outrage. The press get a hold of it, hype it up for a while… but as long as Microsoft don’t blink, it’ll die down without resolution. Because people need Windows, right? Nevermind Mac and Linux (including SteamOS); they’re handy names to throw at MS while upset, but the complainers are mostly unwilling to deal with the inconvenience of going ahead with their threats.

Jaime June 7, 2024 1:19 PM

Well, I guess I have to take back my comment, at least in part: Microsoft Will Switch Off Recall by Default After Security Backlash:

“We are updating the set-up experience of Copilot+ PCs to give people a clearer choice to [opt in] to saving snapshots using Recall,” reads a blog post from Pavan Davuluri[,] Microsoft’s corporate vice president, Windows + Devices. “If you don’t proactively choose to turn it on, it will be off by default.”

(The square brackets indicate grammatical corrections by me; apparently, neither Microsoft nor Wired bother with copy-editing anymore.)

What price common sense? June 7, 2024 3:16 PM

@Jaime

“Is it even legal for a lawyer, doctor, accountant, etc., to use Windows 10 on systems dealing with customer data?”

That depends.

Is the computer connected to a method of communications by which privileged conversations / information can be seen by others who are not privy to them?

History shows that people in the legal fraternity have been censured for writing with pen and paper in public places before. So from that you would assume the use of a modern device that can be viewed not just physically but electronically would carry the same or similar risks of censure.

However we know that in the US privileged phone calls to/from US prisons have not just been recorded by the third party telecom supplier, but details from privileged calls have ended up in the hands of investigators thus prosecutors. With no real action being taken by the courts or governments.

However I’m not aware of any discussion on the use of computers in the US other than with regard to “insider trading”. However there was objection made with respect to the Electronic Courts system in the UK by the legal profession when it was first proposed, and it in effect got kicked into the long grass, to die eventually during C19.

Oh and UK Members of Parliament were given MS Office 365 with everything going to servers in the Republic of Ireland… Yup, that included much that was “National Security” or “Privileged” or subject to other disclosure restrictions. We discovered that all of the MPs’ traffic was being recorded by GCHQ and that the MPs should have no expectation of privacy, when the head of GCHQ testified to a parliamentary committee…

So I would not expect any legal privilege to exist with electronic communications unless you take rather special precautions. None of which current commercial or consumer products come even remotely close to.

A Nonny Bunny June 7, 2024 3:47 PM

Well, at least exponential decline is better than linear.

linear 100, 90, 80, 70, 60, 50, 40, 30, 20, 10, 0
exponential 100, 90, 81, 72, 64, 57, 51, 45, 40, 36, 32, …
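
For what it’s worth, a quick Python check (assuming a 10 percent drop per step, truncated to whole numbers) reproduces the exponential row above:

```python
# 10% decline per step, truncated to an integer each time.
values = [100]
for _ in range(10):
    values.append(int(values[-1] * 0.9))
print(values)  # [100, 90, 81, 72, 64, 57, 51, 45, 40, 36, 32]
```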

The endless journey June 7, 2024 4:24 PM

@A Nonny Bunny

“Well, at least exponential decline is better than linear.”

Only if you never want to get there …

Kind of like the old

“Close but no cigar”

But others say

“It’s all about the journey, not the destination”

You takes your choice…

Me, I like just sitting and resting these days. I’ve climbed many mountains, jumped out on a chute too often, and sailed chunks of the world few get to see. I’ve crawled deserts on my belly and fought more fights than I care to remember, and said that final farewell to comrades too often to cry. These and many more things supposedly make me “well seasoned”.

But now, the only adventure I want to go on forever is that with those close to me in life. The smile in the morning, the quiet sigh at the end of the day: these are the real treasures in life; value them while you can.

Gratefully June 9, 2024 3:54 PM

@Dr. Schneier, it very much seems that when

a democratic regulatory process is required

as a baseline, society has become the boiled frog just the same, on many critical fronts.
