The Concept of "Return on Data"

This law review article by Noam Kolt, titled "Return on Data," proposes an interesting new way of thinking about privacy law.

Abstract: Consumers routinely supply personal data to technology companies in exchange for services. Yet, the relationship between the utility (U) consumers gain and the data (D) they supply -- "return on data" (ROD) -- remains largely unexplored. Expressed as a ratio, ROD = U / D. While lawmakers strongly advocate protecting consumer privacy, they tend to overlook ROD. Are the benefits of the services enjoyed by consumers, such as social networking and predictive search, commensurate with the value of the data extracted from them? How can consumers compare competing data-for-services deals? Currently, the legal frameworks regulating these transactions, including privacy law, aim primarily to protect personal data. They treat data protection as a standalone issue, distinct from the benefits which consumers receive. This article suggests that privacy concerns should not be viewed in isolation, but as part of ROD. Just as companies can quantify return on investment (ROI) to optimize investment decisions, consumers should be able to assess ROD in order to better spend and invest personal data. Making data-for-services transactions more transparent will enable consumers to evaluate the merits of these deals, negotiate their terms and make more informed decisions. Pivoting from the privacy paradigm to ROD will both incentivize data-driven service providers to offer consumers higher ROD, as well as create opportunities for new market entrants.
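To make the abstract's ratio concrete, here is a minimal sketch with entirely hypothetical numbers; the function and scores below are illustrative assumptions, not anything from the paper, which proposes no concrete units:

```python
# A minimal sketch of the article's ROD = U / D ratio. All numbers are
# hypothetical: utility gained and data supplied are scored on arbitrary
# 0-100 scales, since the paper itself proposes no units.

def rod(utility: float, data_supplied: float) -> float:
    """Return on data: utility gained per unit of personal data supplied."""
    if data_supplied <= 0:
        raise ValueError("data_supplied must be positive")
    return utility / data_supplied

# Two hypothetical data-for-services deals:
social_network = rod(utility=60, data_supplied=90)  # heavy data extraction
search_engine = rod(utility=70, data_supplied=40)   # lighter extraction

# Under this (naive) metric, the search deal is the better "spend" of data.
assert search_engine > social_network
```

As the comments below argue, the hard part is producing defensible values for U and D in the first place.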

Posted on May 20, 2019 at 1:30 PM • 38 Comments


Tino • May 20, 2019 1:59 PM

I think that is a really bad idea. First of all, people aren't that great at understanding what privacy means or how not having it could impact them, so evaluating the value of their assets is hard. Next to that it could mean that poor people would be more inclined to give away their privacy (their right) as they have less other currencies to pay with. I really hope lawmakers treat privacy as a fundamental right, irrespective of what someone is willing to pay for it.

pseudon • May 20, 2019 2:00 PM

Nearly impossible to quantify, however, is the risk of present and future harms from supplying various kinds of personal information. Risks may be due to lax use, re-use for unintended purposes, or abuse by the data collector, or due to data getting shared or breached and becoming available to others even more likely to be lax or abuse the data.

Xavier Lagraula • May 20, 2019 2:41 PM

Terrible idea. Like most money-driven ones, it overlooks every other facet of the deal.
The gist of it is that one's personal information (identifiers, habits, beliefs, tendencies of all kinds...) is not something one owns. It is what one IS. Once you have realized that, there can be no "deal" of any kind. It is not a commercial relationship. It is an imbalance of power.
This is not a matter of transferring ownership. It is a matter of forfeiting your identity. It is a matter of giving a mindless and amoral commercial entity a very special kind of power over you. The power of knowledge.
This kind of power has been repeatedly abused by authorities throughout history, which is why democracies usually have safeguards against excessive aggregation of data about citizens. Why would one grant a private entity the same kind of power one refuses one's own government?
Today, of course, there's that nagging issue that even democracies want this power for themselves, over their own citizens. So they won't help...

Jeremy • May 20, 2019 3:09 PM

Setting aside philosophical objections for a moment and coming at this from a purely mathematical perspective, this seems like a weird analogy because you can only invest any given dollar in a single place, but you can "sell" the same bit of data about yourself to multiple companies.

Additionally, the "cost" of giving up the data declines the more places you do so; at the extreme where you broadcast some bit of data publicly, the value of withholding it in any given transaction drops essentially to zero, and so you might as well "sell" it to as many places as possible.

I suppose one might argue that once you've given your data to a single company, you should assume they're going to resell it behind your back to the entire world, which is kind of like only being able to sell it once. But while reselling seems pervasive, I doubt "the entire world" is truly a useful approximation of who already knows your data (if it were, companies wouldn't still be asking for it!)

So even if you're inclined to view your personal data as a financial asset and try to optimize its use, it's unclear how you would make use of ROD in your decisions.

There's also the obvious problem that the ROD = U / D equation requires you to quantify the data (D) that you're giving away, which is a massive problem in its own right. How many phone numbers is a credit card number worth? (You could try to quantify it only in terms of information entropy, but it seems pretty clear that would give the wrong answer in practical terms; two facts about you could be equally obscure yet have totally different value.)
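The entropy caveat can be made concrete: two facts drawn from equally many possibilities carry identical entropy yet differ wildly in practical value. A minimal sketch, where the 10,000-value universes are made up for illustration:

```python
import math

# Two facts drawn uniformly from equally many possibilities have identical
# information entropy, yet very different practical value to an attacker.

def entropy_bits(n_possibilities: int) -> float:
    """Entropy (in bits) of a fact drawn uniformly from n equally likely values."""
    return math.log2(n_possibilities)

pin = entropy_bits(10_000)    # a 4-digit card PIN
paint = entropy_bits(10_000)  # your favorite of 10,000 paint shades

# Entropy cannot tell these apart, yet only one unlocks a bank account.
assert pin == paint
```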

Humdee • May 20, 2019 3:38 PM

The proposal furthers surveillance capitalism rather than undermining it, because it represents the ultimate step in the commodification of privacy. To be sure, there is a strong case that, as a de facto matter, consumers in the market already commodify privacy, but I think this reality should be resisted rather than buttressed. Alternatively, maybe the commodification of privacy is a ship that has already sailed and this proposal is a least-worst option. I dunno. Like others, I think it is a terrible idea, but maybe it's the best we can do right now.

Xavier Lagraula • May 20, 2019 3:42 PM

To add weight to Jeremy's last paragraph, I would add that a good deal of the expected value of the collected information comes not from the information itself but from what one can infer from it, and from aggregating data from different sources or points in time.
Not knowing what the buyer or its clients already know about you, you cannot possibly quantify the worth of your information.

So, at first sight, the idea is as morally objectionable as it is impossible to put into practice. By the way, I don't believe it is new in any way. I think I have seen this idea mentioned quite a few times in recent years already, and dismissed as it should be.

Clive Robinson • May 20, 2019 4:02 PM

Yes, it is a terrible idea if we set up a market, but what about using it to assess fines and damages?

But there is another aspect: the market for private data is based on the premise that it has value to a third party. I place a high value on my privacy, but the real question is not how highly I value it, or how cheaply a data broker will try to buy it, but what third-party organisations are prepared to pay.

And this is beginning to come unstuck for many. Some have even found that the aggregation of private data actually has a negative value. That is, the cost of collecting, storing and even processing private data exceeds the price they get for the data.

When you throw in risk from regulation and legislation the figures look even worse.

Thus the private data market might be approaching the likes of a Black Tulip Market.

albert • May 20, 2019 4:50 PM

Unfortunately for Mr. Kolt* and others of his ilk, some things can't be quantified. First of all, one doesn't 'invest' in a data harvesting corporation, and one doesn't 'buy' the service either. It is 'free'. The sort of service they provide is obvious.

I'll take one example. A Search Service(SS) does searches for you. If your business relies on an SS, then you -might- be able to quantify those benefits. I submit that such calculations would be difficult and time consuming. No one is going to bother. What does the SS get in return? Plenty. Your business searches would likely indicate the nature of your business. Your personal searches would produce an encyclopedia of your life, including personality indicators. There's a huge 'data market' that allows (for a price) companies (like online sales, email providers, even brick and mortar stores) to buy that data. -You- can't put a dollar value on this information, but the SS can. The last thing that SS wants is to expose -their- financial returns on -your- data. This will never happen. Regulation? Seriously?

I won't deal with Social Media(SM) here. Their shortcomings are more widely known.

Privacy is a one-way street, and it doesn't lead to your house. I wonder if Mr. Kolt would care to calculate Google's profits on its SS business, then divide by the number of Google SS users, for a profit-per-user figure.
That would be a fascinating number. If it's been done, someone please mention it.

We need fewer BS sidetracks, and stricter privacy laws.

*Mr. Kolt seems to have no web presence to speak of, except a few papers. I couldn't find a CV either.
. .. . .. --- ....

name.withheld.for.obvious.reasons • May 20, 2019 5:21 PM

This is a comedic arithmetic exercise, a simplistic boil-down of a complex set of affairs that probably requires a non-simultaneous calculus: a functional description spanning the value of privacy, the law that protects it (i.e. "secure in their persons, houses, papers, and effects...") to the degree limited by statute and the constitution (within the confines of the U.S.), and the harm or cost that individuals experience due to extensive/pervasive surveillance. There is ample evidence of pre-censorship effects; I myself find it necessary to carefully craft opinions and ideas to minimize the potential of falling within the scope of suspicious or deviant thinking. To my mind, the lack of intelligence within our government and law represents the largest risk to our society and possibly the world.

I do, however, recognize the value of a more formalized treatment of law. The issue is the ability to codify law well in a meaningful, just, and fair way. No confidence can be given to our current treatment of anybody or anything, given the nature of human frailty in passing judgement. We still have a fifteenth-century legal system; canon law and the Magna Carta still have sway irrespective of known, formal, and tested truths. Some have observed that there may be witch trials held in some portions of the world today.

1.) Proof of witches?
2.) Proof of being a witch?
3.) Proof of acts, by a witch, which is witch-like?

The last test was too much fun to write...

Thomas • May 20, 2019 5:21 PM

Sometimes "the whole is worth more than the sum of its parts", sometimes you get "diminishing returns".

So you interact with n services:

ROD0 = U0 / D0
ROD1 = U1 / D1
...
RODn = Un / Dn

Let's assume you manage to make a good deal on each of those transactions.

What's the sum total of the Utility of all those services?
What's all the aggregate data worth once it's combined at the back-end?

How do the individual transactions compare with:
ROD = aggregated(U) / aggregated(D)

I guess once you've signed up for so many things that "big data" knows everything about you the "aggregated(D)" suffers from "diminishing returns" so you might as well keep signing up...
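This worry can be sketched numerically. Assuming, purely hypothetically, that aggregated data is superadditive (worth more combined than the sum of its parts), each deal can look good individually while the aggregate ROD is poor:

```python
# Hypothetical numbers: each data-for-services deal looks fine in isolation,
# but the collector's combined data is worth more than the sum of its parts,
# so the consumer's true aggregate ROD is worse than any single deal suggests.

utilities = [10.0, 10.0, 10.0]   # U_i gained from each service
data_values = [2.0, 2.0, 2.0]    # D_i as you might value each datum alone

per_service_rod = [u / d for u, d in zip(utilities, data_values)]

# Combining datasets lets the back-end infer more than the parts reveal;
# model that superadditivity with a made-up synergy multiplier.
SYNERGY = 3.0
aggregated_d = SYNERGY * sum(data_values)
aggregate_rod = sum(utilities) / aggregated_d

assert all(r == 5.0 for r in per_service_rod)   # each deal looked good...
assert aggregate_rod < min(per_service_rod)     # ...the whole does not
```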

Jeff Root • May 20, 2019 5:24 PM

I think we should start (in the US, at least) by using the existing contract-law framework. In a contract, you can trade your data for something of value, but neither of you can bind a third party to your bargain. Yet that's exactly what gives Social Media companies their power: they do a deal with one user, then use that data to harvest the data of third parties. Facebook has said publicly that they hold significant data on people who are not, and never have been, customers of Facebook.

If we (again, in the US) were to limit data gathering to first-party only, we'd go a long way towards regaining control of our private data. It should go further, but this is a start, it should be fairly effective, and simply requires application of existing laws and regulations.

gordo • May 21, 2019 12:24 AM

@ Albert,

"We need fewer BS sidetracks, and stricter privacy laws."

-> We need both the enforcement of existing privacy laws and new legislation. . .

Democrats Need to Tame the Facebook Monster They Helped Create
Breaking up the social networking behemoth is one option. But first, Democrats need to start pointing the finger at regulators who won’t admit there’s a problem.
By MATT STOLLER May 18, 2019

The rationale for [Representative] Pallone to avoid FTC failures is clear. For one thing, Democrats want to pass a federal privacy bill which would place rules on companies that handle personal data. They need new authorities and a regulator to implement such a bill, and the regulator on hand is the FTC. So they can’t very well acknowledge that the regulator is an institutional catastrophe, and at the same time call for more of it. (It brings to mind the old joke, “this restaurant’s terrible, and the portions are so small.”) The second reason Democrats have a problem pointing the finger at the FTC is because the failures at the agency largely happened under the Obama administration.

And yet, recognizing that the privacy problem is really because of failures at the FTC is an essential first step to solving it. Facebook doesn’t encompass everything that’s wrong with our privacy regime, and clear rules around privacy wouldn’t address all of what people fear about Facebook. But generally speaking, Facebook and Google are the best examples of how business models based on pervasive surveillance structure our culture.

Facebook makes its money from behavioral targeted advertising. This means tailoring ads to each user based on what it knows about them, generating traffic through incendiary content so it can have a lot of ad slots, and then placing ads in the least expensive ad slot possible. This means the company has the incentive to collect as much personal information about each user as possible, and it has the incentive to prioritize poor quality content. Users and advertisers have nowhere else to go from an increasingly poor quality product, because Facebook has bought up its competition. Addressing a broken market structure like this one is the kind of problem the FTC was set up to address.


Do Not Track is back in the US Senate. And this time it means business. As in, fining businesses that stalk you online
Republican Senator preps proposed legislation ahead of hearing
By Kieren McCarthy 20 May 2019

New legislation that would put teeth into the web's Do Not Track option for internet users, by fining companies that ignore it, will be introduced this week in the US Senate.

[. . .]

In some respects, the proposed Do Not Track Act calls Big Tech's bluff. For years, privacy advocates have been calling for ways to limit or restrict the amount of personal information that online companies are able to capture, store and combine on their millions of users.

In response, the industry pushed for a "Do Not Track" feature in web browsers that allowed netizens to actively flag to websites that they do not want any non-essential information about them to be stored by third parties. It was proposed as a W3C standard in 2009, and America's Federal Trade Commission (FTC) endorsed the program in 2010.

And while the industry has repeatedly noted the program's existence as a way of pushing back against any new web privacy regulation and legislation, the industry has never agreed on how to implement it. The fact that it remains entirely voluntary has meant that in reality the Do Not Track flag is often completely ignored, and ultimately the draft technology died on the vine as ad networks and other sites refused to acknowledge netizens' requests for privacy.


-> Then stuff like what Tim Berners-Lee is advocating (which also contemplates the subject of this thread) becomes possible . . .

Jason • May 21, 2019 3:30 AM

@Clive Robinson wrote, "Thus the private data market might be approaching the likes of a Black Tulip Market."

When you speak of speculative markets, one must look at two factors, not just one: there is the numerator, but there is also a denominator.

What is an item value, if its value cannot be quantified?

Thus, with a speculative value, an "item of value" cannot be computed without some sort of "metrics" to denominate it in relative terms. The most common denominator is money, and the most commonly used money is the US Dollar.

If you think of this problem from a bottom-up perspective, one must ask what value the US Dollar places on black tulips. A "speculative" market must adhere to the metrics of money as well, which is an elastic calculation in its own right.

Gerard van Vooren • May 21, 2019 3:34 AM

@ gordo,

In reply to both your replies to albert, there is something I would like to add: Why don't you just forget about that US crap? Why don't you just think about other countries and the world instead of that crap that hangs around? In other words? Why don't you think about 2049 instead of 2019? Why not think about 30 years from now? Do you still want this all to happen or do you want a better political system with a better globally juridical system? That can only happen when you step out of it.

Winter • May 21, 2019 3:35 AM

There is a vibrant market in credit card numbers/details. So, what is the Return on Data for me, selling my credit card details?

Or the PIN of my debit card? To be more specific, what is a fair price for the keys to my house, or safe, or car?

The point is, those who want to buy my data often have the same motivation as those who would like to buy my CC number and details, or the keys to my car.

What this analysis (intentionally?) leaves out are what it delegates to be "externalities". The effects that selling my data has on my future financial, and real, health.

fajensen • May 21, 2019 5:25 AM

Why the hell would anyone be forced to sit down and "... evaluate the merits of these deals, negotiate their terms and make more informed decisions" whenever one buys a lightbulb online?

Most people just want simplicity and to not be totally screwed over and abused at every turn. That's all. Regulation and Enforcement of it would fix that for everyone just fine.

The more "choices" people have to make, the more "options" they have to configure in order to not get screwed over or just to complete simple, basic, tasks online, the higher the odds that they will make the wrong choice or set the wrong option so they will get screwed over and abused!

Anyone stupid or persistent enough to ever buy a ticket from Ryanair online knows Exactly the drill: Feverishly iterating the purchase pages, looking for that flyspeck "tick" in a wall of text and "options" that adds the unwanted travel insurance to all the tickets and 200 EUR to the final ticket price!

Or the swine at the car hire selling an upgrade, which is in fact the very car they already reserved for you in the first place and you would have got anyway by NOT making a choice, only much cheaper.

Why would anyone want this Everywhere, at Every Interaction!, to maybe grub back 1.50 EUR a year like we can on the "liberalised" electricity "market"!? Sod it!!

whysthatso • May 21, 2019 5:26 AM

I'm very glad that the majority of the comments unmasks the proposal for what it is: a weird attempt to commodify what is an essential human right.

thank you

Petre Peter • May 21, 2019 7:41 AM

How will my data be used? Who else has access to my data? Can I delete it? These answers might change the entire equation.

Denton Scratch • May 21, 2019 8:10 AM


You're right, it was Jaron Lanier. I was trying to find a proposal from Marlinspike along those lines. Not the same chap at all; I guess it's the haircuts that confused me.

Denton Scratch • May 21, 2019 8:42 AM

Looks like Lessig may also have suggested that personal data should be treated as property.

The article is annoying; I found one reference to Lanier, but I couldn't find it with a text-search - I had to actually skim the whole article (because all the references are in the form of page-by-page footnotes).

At any rate, I can't see that Kolt actually credits Lanier with coming up with the idea. Looks like it was 2013 - about six years ago.

gordo • May 21, 2019 9:55 AM

The author of this thread's article, while citing Jaron Lanier a couple of times, does cite Andreas Weigend and his book 'Data for the People' a couple of dozen times and seems to draw the 'ROD' from there. I haven't read Weigend's book. However, here's what might be a good thumbnail in an opinion piece by Weigend which I see reflected in the Kolt paper:

"Rather than focusing on how much our information is worth and asking for a paltry financial handout in return, we should demand something far more valuable: the right to experiment with our data and the settings that determine what data refineries show us. This "seat at the controls" will ensure that we can make the best decisions for ourselves, while still benefiting from these companies' recommendations."

Denton Scratch • May 21, 2019 10:57 AM


Thanks for pointing out that the author cites Lanier more than once; as I said, I just skimmed it.

Regarding the "seat at the controls": hardly anyone has the time for all that. Or the skills, or the inclination; it's only in aggregate that data acquires value. Google makes billions out of collecting personal data, but individuals will be lucky to earn a few tens of dollars for selling their name, address, political inclinations and purchasing preferences. After all, for the most part that information is already in the wild.

So it doesn't matter to Google how close I keep my cards to my chest; they care about personal data on entire populations, they don't care about me. And anyway, it's not a card game, with different cards each round; my personal data is roughly the same from one year to the next - they'll get to see my hand in due course.

Another problem is that nobody really knows how to put a value on personal data. Like, Google doesn't know - the value depends on what other data you can combine it with. And they don't know what other data they're going to be able to harvest down the road. So Google could combine my street name with (say) local purchase records, and make a good guess at who I am and where I live in the street. Or who my kids are, and where they live. And of course, Goo can sell the data on to some broker (Experian, maybe?) who can combine my data with other data they hold about me to make it even more valuable.

So Google and Experian can make a much better estimate of the value of my personal data than I can. That is, there is a lack of information and transparency and an asymmetry of information in this so-called market, and that works consistently to harm the data subject, and benefit the data collector. I can't see a way to remedy that; I can't see how an individual data subject can ever have the time and skills to negotiate a fair deal with a large corporate data collector, even if you ignore the likelihood that the collector will just sell the data on, without your permission, to a third-party that is not a party to your deal, and that gains a quite different benefit (this is effectively what Google does when they target ads at you on behalf of their customers).

The only remedy that I can see is to regulate the bejabers out of the data harvesters. But I don't like heavy regulation - I have anarchist leanings. Regulation's expensive, and regulators routinely get captured by the regulated. So then you need regulators for the regulators, and it's turtles all the way up.
And not all data collectors are easily subjected to regulation - e.g. the PRC government, or Cambridge Analytica. Well, OK - or the NSA. And of course, neo-cons hate government regulation, and prefer to set up schemes where industries run their own regulator at their own expense. Obviously, such schemes will always be toothless shams. Consider, for example, the absurd IPSO.

FWIW, Data and Goliath is on my shelf (yes, I read it).

mark • May 21, 2019 11:23 AM

I like this idea. But then, for decades, my attitude on wearing clothes with big corporate logos, like Aeropostale, has been, "you want me to wear a sandwich board, you pay me by the hour to wear it. Oh, you want *me* to pay *you* for the sandwich board? Excuse me, I'm going to roll on the floor laughing."

No, I don't think using their website is adequate return on the money they make by selling me as a product.

Alex • May 21, 2019 12:40 PM

I think this pairs up nicely with your "data is a toxic asset" essay. Maybe we need a unified theory of user data.

George • May 22, 2019 5:04 AM

@Jason wrote, "One must ask what value does the US Dollar place on black tulips? "

By your own guesstimation, the US Dollar in its own narrative is a worthless quantity. Its value is derived only from what "it can buy," not what it can redeem, because the gold standard is no longer in effect. Given that money is "elastic" and abstract, the value of money is determined by the aggregation of what "it can buy."

Thus, it is easy to see that black tulips are valuable to "the US Dollar" because they create an abstracted "value" for which it can be exchanged.

pete • May 22, 2019 9:17 AM

As if consumers have a real choice in how their data is used, how much they are tracked or targeted, etc.

Maybe in Europe, a little bit. Definitely not in America. Even in Europe the protections are mostly lip service.

Guest • May 22, 2019 2:49 PM

Disclosure + Privacy = Identity

By selecting what information you disclose to which people, you express your own identity. This constant framing of disclosure as opposed by secrecy is misleading; it advances a mass-surveillance model where the only options you have are total openness to everyone or total censorship of self (where you cannot express yourself selectively because the surveillance model takes for granted that you cannot conceal information about yourself in any situation), where privacy means inhibition / self-repression to avoid giving yourself away.

Higher-level identity constructs include integrity (the right to access your own historical disclosures, as currently under attack by Telegram's "unsend" feature which allows anyone who witnessed your identity to, at any future moment, erase your own digital memory/testament so that you cannot check who you were or prove that you ever said those things you think you have previously felt), where the desire for a "backdoor" qua secret "participant" in encrypted communications blurs the lines between "the other participant swears they didn't drunkenly/accidentally delete your conversation", "you both wonder who got hacked in a way that irrevocably destroyed the memories you both wanted to preserve", and "someone with access to the backdoor untraceably removed the details you were never aware they had been a party to".

gordo • May 22, 2019 7:00 PM

@ Denton Scratch,

The harms of aggregating and commodifying everyone's digital dossiers create moral hazards at societal scale. We're slowly being atomized and reconceptualized for profit, with little to no input except for what we output. Each of the four V's of big data (volume, variety, velocity, and veracity) has brought new problems to the table. Any notion of 'Return on Data' should start with Brandeis' observation regarding privacy: a "right to be let alone." Meaningful participation in society should not be predicated upon how well we can be accosted by digital gangsters of every stripe. I favor regulation.

George • May 25, 2019 12:02 AM

@Guest wrote, "Higher-level identity constructs include integrity "

There is no integrity to speak of if a "secret participant" can be used as a gateway to collect and duplicate any data at will. Any claim that such a system preserves integrity is a sham, or false advertising to say the least.

Wesley Parish • May 25, 2019 5:19 AM

Well, in keeping with my tendency towards the macabre (a headmaster told me I had a macabre sense of humour) I would say the price placed on personal details should be based on the worst-case scenario, i.e., what if you doofuses collecting my data stuff up your security as you so routinely do, and my data gets into the wrong hands? What sort of fine should be levied if I get a booby-prize home invasion by the Elbonian Defense Force, under the impression that I've been stashing their crown jewels for the last eighty years, as a result of some Doofus Data Collectors-n-Aggregators-R-Us data leak?

I would also levy the same high cost on every link of the chain that could lead up to such a disaster. And likewise for every occurrence and recurrence of such a disaster.

Let others slap their wrists - I'd rather body check them.

A Nonny Bunny • May 25, 2019 2:44 PM


@Tino: "Next to that it could mean that poor people would be more inclined to give away their privacy (their right) as they have less other currencies to pay with."
And since we're sold to advertisers as potential customers, poor people's data is probably valued less as well. So if there were any differentiation made, they'd have to give up more to get the same service.
I suppose the only value to the surveillance economy of someone without purchasing power would be to convince everyone else it's "normal" to give up your data. Kinda reminds me of the Judas goat :p

A Nonny Bunny • May 25, 2019 2:48 PM

@Xavier Lagraula

"The gist of it is that one's personal information (identifiers, habits, beliefs, tendencies of all kinds...) is not something one owns. It is what one IS."
Arguably you own yourself, so it can be both.
And once upon a time in a place, you could sell yourself into slavery. So it's not like we invented something new.

jay • May 28, 2019 7:40 AM

Since the dawn of Sears and Roebuck, your address has been a link to your identity. That info tied you to your history (and the phone book: everyone's name, address and phone number in a FREE book). It just gets more complicated now.

Regulation can't solve it. First because it will be impossible to realistically, economically (and fairly) evaluate the trillions of transactions, but additionally, heavily regulated organizations use it as a bulwark (via lawyers and complex laws and teams of accountants) against upstart competitors. Regulated industries do a bit of superficial squawking, but ultimately they love the 'protection'.

gordo • May 30, 2019 6:50 PM

@ Jay,

"It just gets more complicated now."

Not really. It's a feature of the medium, what McLuhan called "extensions of man", i.e., in the wide sense (of tracking/surveillance data: first party; third party; online; offline; etc.), what the digital ad-targeting industry refers to as "touchpoints".

"Regulation can't solve it. . . . trillions of transactions . . ."

Sure it can. The threat of privacy regulation, by itself, has proved enough to elicit both defensiveness and consequent changes in policy or the application of technological countermeasures.

"a bulwark (via lawyers and complex laws and teams of accountants) against upstart competitors."

Such bulwarks signal monopoly status.

gordo • June 6, 2019 8:19 PM

Interoperability as a way of combating network effects and fostering privacy, competition and innovation . . .

Regulating or breaking up Big Tech: an antitrust explainer
US regulators will investigate whether companies like Amazon, Facebook, and Google have too much power. Here’s an introduction to the issues.
by Angela Chen, Jun 5, 2019

Experts who don’t favor a breakup say it would be overkill and make our tech tools worse: search would become less functional, for example. Some think that problems could be addressed by making it easier to take data from one platform and use it at another (called “data portability”), or by forcing different services to work with each other (“data interoperability”), so they can’t lock users in. Facebook and another social network could communicate the same way that people with Yahoo and Gmail accounts can still email each other.

Interoperability = Privacy + Competition
By Gus Rossi and Charlotte Slaiman, April 26, 2019

As Congress and other relevant stakeholders debate how to protect Americans’ privacy, a key concern is making sure that new legislation doesn’t entrench the power of big tech incumbents. In this post, we argue that incorporating data interoperability into privacy legislation is essential to empowering consumers’ data rights and fostering a competitive marketplace.

Regulating Big Tech makes them stronger, so they need competition instead
Use antitrust to promote interoperability, says Cory Doctorow, an author and tech activist
By Cory Doctorow, Jun 6th 2019

In a monopolised market, sellers get to bargain by fiat. But interoperability—from ad-blocking to switching app stores—is a means by which customers can assay real counteroffers.

Tim Berners-Lee On Building A New Web
By PYMNTS, May 20, 2019

Berners-Lee said that there are plenty of instances where consumers and businesses should be able to tap into the data troves they’ve created at the web sites with which they engage. But they can’t because non-interoperable data silos are preventing them.

“Most people come with a huge back pocket full of concerns about [the web and] privacy.” Berners-Lee told Webster. “They aren’t worried about their pictures being shown to the wrong person so much as they are worried about being part of a system that is really dysfunctional.”

Berners-Lee contends that those concerns are “bigger, more powerful” and very much on the mark. The issue that should be taking center stage is interoperability and how it “empowers [the consumer] to do things [they] could never have before.”

gordo • June 8, 2019 1:41 PM

Follow-up on the previous post regarding data interoperability . . .

Sir Tim versus Black Mirror
Posted on June 6, 2019 by Ethan Zuckerman

There’s something very surreal about a moment in which thousands of researchers and pundits are studying what’s wrong with social media and the Web, and surprisingly few working on new models we can use to move forward. The man who built the web in the first place is now working on alternative models to save us from the Black Mirror universe and the broader academic and professional world seems… surprisingly uninterested.


Adversarial interoperability: reviving an elegant weapon from a more civilized age to slay today's monopolies
By Cory Doctorow, June 7, 2019

Today, consumers and toolsmiths confront a thicket of laws and rules that stand between them and technological self-determination. To change that, we need to reform the Computer Fraud and Abuse Act, Section 1201 of the Digital Millennium Copyright Act, patent law, and other rules and laws. Adversarial interoperability is in the history of every tech giant that rules today, and if it was good enough for them in the past, it's good enough for the companies that will topple them in the future.


Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Security.