How the Solid Protocol Restores Digital Agency

The current state of digital identity is a mess. Your personal information is scattered across hundreds of locations: social media companies, IoT companies, government agencies, websites you have accounts on, and data brokers you’ve never heard of. These entities collect, store, and trade your data, often without your knowledge or consent. It’s both redundant and inconsistent. You have hundreds, maybe thousands, of fragmented digital profiles that often contain contradictory or logically impossible information. Each serves its own purpose, yet there is no central override and control to serve you—as the identity owner.

We’re used to the massive security failures resulting from all of this data under the control of so many different entities. Years of privacy breaches have resulted in a multitude of laws—in US states, in the EU, elsewhere—and calls for even more stringent protections. But while these laws attempt to protect data confidentiality, there is nothing to protect data integrity.

In this context, data integrity refers to the accuracy, consistency, and reliability of data throughout its lifecycle. It means ensuring that data is not only accurately recorded but also remains logically consistent across systems, is up to date, and can be verified as authentic. When data lacks integrity, it can contain contradictions, errors, or outdated information—problems that can have serious real-world consequences.

Without data integrity, someone could classify you as a teenager while simultaneously attributing to you three teenage children: a biological impossibility. What’s worse, you have no visibility into the data profiles assigned to your identity, no mechanism to correct errors, and no authoritative way to update your information across all platforms where it resides.

Integrity breaches don’t get the same attention that confidentiality breaches do, but the picture isn’t pretty. A 2017 write-up in The Atlantic found error rates exceeding 50% in some categories of personal information. A 2019 audit of data brokers found that at least 40% of broker-sourced user attributes were “not at all” accurate. In 2022, the Consumer Financial Protection Bureau documented thousands of cases where consumers were denied housing, employment, or financial services based on logically impossible data combinations in their profiles. Similarly, the National Consumer Law Center’s report “Digital Denials” showed inaccuracies in tenant-screening data that blocked people from housing.

And integrity breaches can have significant effects on our lives. In one 2024 British case, two companies blamed each other for the faulty debt information that caused catastrophic financial consequences for an innocent victim. Breonna Taylor was killed in 2020 during a police raid on her apartment in Louisville, Kentucky, when officers executed a “no-knock” warrant on the wrong house based on bad data. They had faulty intelligence connecting her address to a suspect who actually lived elsewhere.

In some instances, we have rights to view our data, and in others, rights to correct it, but these sorts of solutions have only limited value. When journalist Julia Angwin attempted to correct her information across major data brokers for her book Dragnet Nation, she found that even after submitting corrections through official channels, a significant number of errors reappeared within six months.

In some instances, we have the right to delete our data, but—again—this only has limited value. Some data processing is legally required, and some is necessary for services we truly want and need.

Our focus needs to shift from the binary choice of either concealing our data entirely or surrendering all control over it. Instead, we need solutions that prioritize integrity in ways that balance privacy with the benefits of data sharing.

It’s not as if we haven’t made progress in better ways to manage online identity. Over the years, numerous trustworthy systems have been developed that could solve many of these problems. For example, imagine digital verification that works like a locked mobile phone—it works when you’re the one who can unlock and use it, but not if someone else grabs it from you. Or consider a storage device that holds all your credentials, like your driver’s license, professional certifications, and healthcare information, and lets you selectively share one without giving away everything at once. Imagine being able to share just a single cell in a table or a specific field in a file. These technologies already exist, and they could let you securely prove specific facts about yourself without surrendering control of your whole identity. This isn’t just theoretically better than traditional usernames and passwords; the technologies represent a fundamental shift in how we think about digital trust and verification.
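To make that kind of selective sharing concrete, here is a minimal TypeScript sketch of a credential that proves a single fact, loosely modeled on the W3C Verifiable Credentials data model. The issuer URL, WebID, credential type name, and signature value are made-up placeholders for illustration, not any vendor’s actual API.

```typescript
// Sketch only: a selectively disclosable credential, loosely following the
// W3C Verifiable Credentials data model. All URLs, names, and the signature
// value are hypothetical placeholders.

interface Proof {
  type: string;                // signature suite, e.g. an Ed25519-based one
  created: string;             // ISO 8601 timestamp
  verificationMethod: string;  // where a verifier finds the issuer's public key
  proofValue: string;          // the signature over the credential
}

interface VerifiableCredential<Subject> {
  "@context": string[];
  type: string[];
  issuer: string;              // who vouches for the claim
  issuanceDate: string;
  credentialSubject: Subject;  // only the claims actually being disclosed
  proof: Proof;
}

// Proves one fact ("over 18") without revealing name, address, or birth date.
const ageCredential: VerifiableCredential<{ id: string; ageOver: number }> = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "AgeOverCredential"],   // hypothetical type
  issuer: "https://dmv.example.gov/issuers/42",          // hypothetical issuer
  issuanceDate: "2025-01-15T00:00:00Z",
  credentialSubject: {
    id: "https://alice.example.net/profile/card#me",     // the holder's WebID
    ageOver: 18,                                         // the only disclosed claim
  },
  proof: {
    type: "Ed25519Signature2020",
    created: "2025-01-15T00:00:00Z",
    verificationMethod: "https://dmv.example.gov/keys/1#pub",
    proofValue: "zSignatureValuePlaceholder",
  },
};
```

The structural point is that the holder presents only the claim being asked for, and the verifier checks the issuer’s signature rather than demanding the underlying document.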

Standards to do all these things emerged during the Web 2.0 era. We mostly haven’t used them because platform companies have been more interested in building barriers around user data and identity. They’ve used control of user identity as a key to market dominance and monetization. They’ve treated data as a corporate asset, and resisted open standards that would democratize data ownership and access. Closed, proprietary systems have better served their purposes.

There is another way. The Solid protocol, invented by Sir Tim Berners-Lee, represents a radical reimagining of how data operates online. Solid stands for “SOcial LInked Data.” At its core, it decouples data from applications by storing personal information in user-controlled “data wallets”: secure, personal data stores that users can host anywhere they choose. Applications can access specific data within these wallets, but users maintain ownership and control.

Solid is more than distributed data storage. This architecture inverts the current data ownership model. Instead of companies owning user data, users maintain a single source of truth for their personal information. It integrates and extends all those established identity standards and technologies mentioned earlier, and forms a comprehensive stack that places personal identity at the architectural center.

This identity-first paradigm means that every digital interaction begins with the authenticated individual who maintains control over their data. Applications become interchangeable views into user-owned data, rather than data silos themselves. This enables unprecedented interoperability, as services can securely access precisely the information they need while respecting user-defined boundaries.

Solid ensures that user intentions are transparently expressed and reliably enforced across the entire ecosystem. Instead of each application implementing its own custom authorization logic and access controls, Solid establishes a standardized, declarative approach in which permissions are explicitly defined through access control lists or policies attached to resources. Users can specify who has access to what data with granular precision, using simple statements like “Alice can read this document” or “Bob can write to this folder.” These permission rules remain consistent, regardless of which application is accessing the data, eliminating the fragmentation and unpredictability of traditional authorization systems.
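As a rough sketch of what such a declarative rule can look like from a developer’s point of view, the following uses the universalAccess helpers from Inrupt’s open-source @inrupt/solid-client JavaScript library. The WebID and resource URLs are invented for illustration, and the exact API surface may differ between library versions, so treat this as a sketch rather than a recipe.

```typescript
// Sketch: expressing "Alice can read this document" as a declarative access
// rule on a Solid resource. URLs and WebIDs are hypothetical.
import { universalAccess } from "@inrupt/solid-client";
import { getDefaultSession } from "@inrupt/solid-client-authn-browser";

async function grantAliceRead(): Promise<void> {
  const resource = "https://bob.example.net/pod/documents/report.ttl";
  const aliceWebId = "https://alice.example.net/profile/card#me";

  // State the permission once; the Pod server then enforces it for every
  // application that later tries to read the resource.
  const updated = await universalAccess.setAgentAccess(
    resource,
    aliceWebId,
    { read: true, write: false, append: false },
    { fetch: getDefaultSession().fetch } // the owner's authenticated fetch
  );

  if (updated === null) {
    console.log("Access could not be updated (insufficient rights or unsupported server).");
  } else {
    console.log("Alice's access is now:", updated);
  }
}
```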

This architectural shift decouples applications from data infrastructure. Unlike Web 2.0 platforms like Facebook, which require massive back-end systems to store, process, and monetize user data, Solid applications can be lightweight and focused solely on functionality. Developers no longer need to build and maintain extensive data storage systems, surveillance infrastructure, or analytics pipelines. Instead, they can build specialized tools that request access to specific data in users’ wallets, with the heavy lifting of data storage and access control handled by the protocol itself.
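As a hypothetical illustration of how small such an application can be, the sketch below reads a single profile field from a user’s Pod with @inrupt/solid-client and keeps no database of its own. The assumption that the WebID document directly contains a foaf:name, like the URLs involved, is a simplification for the example.

```typescript
// Sketch: a "lightweight" Solid app that reads exactly one field from the
// user's Pod and stores nothing itself. Simplified for illustration.
import { getSolidDataset, getThing, getStringNoLocale } from "@inrupt/solid-client";
import { getDefaultSession } from "@inrupt/solid-client-authn-browser";
import { FOAF } from "@inrupt/vocab-common-rdf";

async function readDisplayName(webId: string): Promise<string | null> {
  // Fetch the profile document with the logged-in user's credentials so the
  // Pod's access rules are applied to this request.
  const dataset = await getSolidDataset(webId, {
    fetch: getDefaultSession().fetch,
  });

  const profile = getThing(dataset, webId);
  if (profile === null) return null;

  // Read only the one field the app needs (foaf:name); nothing else is copied.
  return getStringNoLocale(profile, FOAF.name);
}
```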

Let’s take healthcare as an example. The current system forces patients to spread pieces of their medical history across countless proprietary databases controlled by insurance companies, hospital networks, and electronic health record vendors. Patients frustratingly become a patchwork rather than a person, because they often can’t access their own complete medical history, let alone correct mistakes. Meanwhile, those third-party databases suffer regular breaches. The Solid protocol enables a fundamentally different approach. Patients maintain their own comprehensive medical record, with data cryptographically signed by trusted providers, in their own data wallet. When visiting a new healthcare provider, patients can arrive with their complete, verifiable medical history rather than starting from zero or waiting for bureaucratic record transfers.

When a patient needs to see a specialist, they can grant temporary, specific access to relevant portions of their medical history. For example, a patient referred to a cardiologist could share only cardiac-related records and essential background information. Or, in the other direction, the patient can share new and rich sources of related data with the specialist, such as health and nutrition data. The specialist, in turn, can add their findings and treatment recommendations directly to the patient’s wallet, with a cryptographic signature verifying medical credentials. This process eliminates dangerous information gaps while ensuring that patients retain an appropriate say in who sees what about them and why.
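A hedged sketch of the specialist’s side of that exchange, again using @inrupt/solid-client: the container URL, the vocabulary IRIs, and the idea of storing a detached signature next to the note are all assumptions made for illustration; Solid itself does not prescribe a medical vocabulary or a signature format.

```typescript
// Sketch: a specialist's app appending a signed finding to the patient's Pod.
// The record URL and predicate IRIs are hypothetical placeholders.
import {
  getSolidDataset,
  createThing,
  buildThing,
  setThing,
  saveSolidDatasetAt,
} from "@inrupt/solid-client";
import { getDefaultSession } from "@inrupt/solid-client-authn-browser";

async function addCardiologyFinding(note: string, signatureJws: string): Promise<void> {
  const recordUrl = "https://patient.example.net/pod/health/cardiology/findings.ttl";
  const fetch = getDefaultSession().fetch; // the specialist's authenticated fetch

  let dataset = await getSolidDataset(recordUrl, { fetch });

  // Build the new entry; the note and a signature produced elsewhere are stored
  // side by side so the patient (or anyone granted access) can verify it later.
  const finding = buildThing(createThing())
    .addStringNoLocale("https://example.org/health#clinicalNote", note)
    .addStringNoLocale("https://example.org/health#providerSignature", signatureJws)
    .addDatetime("http://purl.org/dc/terms/created", new Date())
    .build();

  dataset = setThing(dataset, finding);

  // Whether this write succeeds is decided by the patient's access rules,
  // not by the specialist's application.
  await saveSolidDatasetAt(recordUrl, dataset, { fetch });
}
```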

When a patient-doctor relationship ends, the patient retains all records generated during that relationship, unlike today’s system, where changing providers often means losing access to one’s historical records. The departing doctor’s signed contributions remain verifiable parts of the medical history, but the doctor no longer has direct access to the patient’s wallet without explicit permission.
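Under this model, ending the relationship is just another declarative access change; the doctor’s signed entries stay in the wallet, but their standing permission is withdrawn. A minimal sketch, assuming the same universalAccess helpers and made-up URLs as above:

```typescript
// Sketch: withdrawing a former provider's access while keeping their signed
// contributions in the Pod. URLs and WebIDs are hypothetical.
import { universalAccess } from "@inrupt/solid-client";
import { getDefaultSession } from "@inrupt/solid-client-authn-browser";

async function revokeDoctorAccess(): Promise<void> {
  const cardiologyFolder = "https://patient.example.net/pod/health/cardiology/";
  const doctorWebId = "https://dr-jones.example.org/profile/card#me";

  // The records remain; only the doctor's ability to read or write them changes.
  await universalAccess.setAgentAccess(
    cardiologyFolder,
    doctorWebId,
    { read: false, write: false, append: false },
    { fetch: getDefaultSession().fetch } // the patient's authenticated fetch
  );
}
```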

For insurance claims, patients can provide temporary, auditable access to specific information needed for processing—no more and no less. Insurance companies receive verified data directly relevant to claims, without needing to hold uncontrolled, hidden, comprehensive profiles or to retain information longer than privacy regulations allow. This approach dramatically reduces unauthorized data use, the risk of breaches (of both privacy and integrity), and administrative costs.

Perhaps most transformatively, this architecture enables patients to selectively participate in medical research while maintaining privacy. They could contribute anonymized or personalized data to studies matching their interests or conditions, with granular control over what information is shared and for how long. Researchers could gain access to larger, more diverse datasets while participants would maintain control over their information—creating a proper ethical model for advancing medical knowledge.

The implications extend far beyond healthcare. In financial services, customers could maintain verified transaction histories and creditworthiness credentials independently of credit bureaus. In education, students could collect verified credentials and portfolios that they truly own rather than relying on institutions’ siloed records. In employment, workers could maintain portable professional histories with verified credentials from past employers. In each case, Solid enables individuals to be the masters of their own data while allowing verification and selective sharing.

The economics of Web 2.0 pushed us toward centralized platforms and surveillance capitalism, but there has always been a better way. Solid brings different pieces together into a cohesive whole that enables the identity-first architecture we should have had all along. The protocol doesn’t just solve technical problems; it corrects the fundamental misalignment of incentives that has made the modern web increasingly hostile to both users and developers.

As we look to a future of increased digitization across all sectors of society, the need for this architectural shift becomes even more apparent. Individuals should be able to maintain and present their own verified digital identity and history, rather than being at the mercy of siloed institutional databases. The Solid protocol makes this future technically possible.

This essay was written with Davi Ottenheimer, and originally appeared on The Inrupt Blog.

Posted on July 24, 2025 at 7:04 AM

Comments

Robin July 24, 2025 8:39 AM

Improved identity management is a worthy, even essential aim, so I don’t want to detract from that ambition. However, it is also essential that exceptional cases are integrated into the design of solutions. An obvious case is when individuals are incapable of managing their own affairs; legal constructs exist to deal with this (Lasting Power of Attorney in the UK, for example), although getting such powers – and then being allowed to use them – is not trivial. Since it is already difficult enough for attorneys to get authority to use their powers from, e.g., banks, I am not optimistic that ordinary folks will be able to make full use of the kind of universal solutions discussed ATL.

But there are even more difficult cases, for example when people are only partially capable of managing their affairs but have not reached the point where attorneys need to take over. I regularly help out 3 people who have difficulties with “computer stuff”. All three are regular, long-term users of emails, online shopping, etc but are no longer capable of understanding newer concepts (new for them) such as 2FA to the point where I could introduce a separate app for authorisation or even a password manager (both of which I would dearly like to do). With 2 of them I use Rustdesk to remotely sort out computer problems (which they have sometimes brought on themselves) and to do that I often need to use their identity parameters. Of course the difference between me doing that and a scoundrel doing it is merely the intent.

Strict protocols, strictly enforced have the potential to exclude a lot of people.

a sheep July 24, 2025 8:49 AM

@Bruce,
“…yet there is no central override and control to serve you—as the identity owner.”

I am 100% certain that Palantir has it all in a central DB and is cashing in on it like it’s something they own. Some countries treat each citizen’s PII as that person’s own trademarked and copyrighted property, so others cannot use it without the owner’s written consent. I don’t know much about it but it has been brought to my attention recently. I might be wrong but I think it’s being discussed and/or rolled out in one of the Nordic/Scandinavian countries. Anyone who knows more about it, feel free to correct me. No hard feelings.

Shashank Yadav July 24, 2025 8:57 AM

People like to own things which accord them status or meaningful utility – which is where all expectations of users considering data ownership falter.

Moreover, while this may work for enterprise users, the vast majority of individual users cannot be expected to maintain such personal data pods. Hypothetically, let us say you make a law requiring this way of data management; there will immediately be third parties whom people would prefer to handle this for them. Kind of like the notion of consent managers in India’s data protection laws, because competent and continuous technical administration cannot be expected from ordinary users.

Peter Galbavy July 24, 2025 9:30 AM

Maybe I have failed to bone up on Solid, but the charming naivete of assuming that people will maintain their own personal data stores in an honest and trustworthy way is only slightly less laughable than how it’s done right now. Or maybe not.

Again, perhaps, because I have not spent any time looking at the actual protocol details I am confused where the veracity comes from? Or am I suddenly able to call myself an Admiral with a law degree and a healthy trust fund as a credit line?

Financial criminality would be democratised overnight, if nothing else.

Rontea July 24, 2025 10:46 AM

Considering the Solid protocol’s emphasis on user-controlled “data wallets,” it’s clear that such wallets should indeed belong to the individual. Privacy is at the core of this architecture, ensuring that personal information is safeguarded and managed solely by the user. The Solid protocol champions data security, empowering individuals to decide who can access their data and how it is used. Therefore, my data wallet rightfully belongs to me, as it embodies the principle of personal data ownership and the critical importance of privacy.

atanas entchev July 24, 2025 11:01 AM

The Solid protocol is charmingly naive. It assumes — like the early internet — good-will participation from everyone. We know that this is not how the real world functions.

What is to stop bad actors from building and presenting a fake profile / history / whatever?

Another Sheep-Never Anonymous-"Preferred" IP Address July 24, 2025 11:01 AM

Think of it this way: in most urban areas in just about any country on the planet, as soon as you’re out the door, physically, so is your privacy – cameras everywhere – so there are far fewer individuals committing crimes in public, because most potential criminals are aware of this and it makes them think twice.
Why can’t we have it like that on the Internet, everything an open book, about every one of us? I am all for it, but guess who’d have a problem with it? Those who have the power to enable/change it – those are the ones who’d be against it the most.
If you have everything transparent in the banking/financial sector, and are able to see everyone’s behaviour in the virtual world, you’d then see who needs attention before it’s too late.
Those who are trying very hard to keep the Status Quo, even in the Justice System in the USA, are the ones who have the power to change it but it does not benefit them so they aren’t changing it.
So, why does the USA celebrate July 4th, Independence Day – when English Case Law is still being used?
Independent from what, from whom? They are still using Inches, Feet, and other non-metric garbage.
Didn’t Australia and Canada get rid of the Case Law?

A CORRUPT JUDGE is standing between you and justice!
A CORRUPT JUDGE DENIES YOU TO TELL THE JURY HOW YOU GOT DESTROYED BY THE EVIL, CORRUPT COPS.
A CORRUPT JUDGE WHO WASN’T EVEN ELECTED BY THE PEOPLE – HE COVERS UP FOR THE CORRUPT COPS.
If you’re trying to get justice in a CORRUPT STATE in the USA – you will be reminded by A CORRUPT JUDGE THAT THERE WAS SOMEBODY ELSE BEFORE YOU, (MAYBE A 100 YEARS AGO) who tried to get Justice by suing CRIMINAL COPS hey but he got denied and I, THE CORRUPT JUDGE WILL APPLY THE SAME TEMPLATE TO YOU – YOU LOSER IMMIGRANT!!!
https://drive.google.com/drive/folders/16GB5NiUu4Zb07RD6B3ai08qerHbHxJhI

Peter A. July 24, 2025 11:11 AM

There’s also another problem: partial identities, pseudonymous/fake identities, companies that collect too much data, etc. Having a data store that has it all is a bit risky, as you can accidentally share too much, especially for people who are a little less competent with all that computer stuff.

For example, for online shopping in my corner of the woods, the only data really needed is the delivery address (which doesn’t need to be a residence address, and often is just a parcel locker), payment method (already handled by semi-trusted third parties) and optionally a way of contact (email, whatever). But the vendors require full name, address, zip code, phone number etc. – all mandatory. I just fill in semi-normal-looking garbage and a one-use or individual email (so if they start spamming it, I can easily block it, without affecting my other emails). When the order-handling system at the vendor is mostly automatic, it goes through, computers are indifferent. But when it is some old chap doing shipping and handling manually I sometimes get complaints about fake name etc. They say something about accounting purposes, but WTF, when they sell over the counter they do not ID customers.

TimH July 24, 2025 11:20 AM

The right to data deletion is weak because:
1. You have no idea whether it is actually deleted; it may merely be marked not-for-direct-disclosure but still be there for inferences
2. There’s no right to prevent re-accumulation, which could be the provider re-acquiring the same dataset from another provider
3. Private right of action is key. As NOYB keeps disclosing, European DPAs (e.g. Eire) simply ignore lawbreaking, making GDPR useless in practice.

pseudon July 24, 2025 1:18 PM

Some prior work was done here:
https://projectvrm.org/2012/11/08/the-identity-problem/

I would love to own and control all of my data. Nothing shared unless you prove to me why you need it, how you will use and secure it, and how you will correct or delete it and prevent secondary uses.

I shouldn’t have to cough up all of the details on my drivers license, when all I need to prove is that I’m over 13, 16, 18, 21, etc.?

I should be able to provide a unique proof-of-human credential, different for each entity I deal with to prevent cross-correlation.

C U Anon July 24, 2025 3:12 PM

pseudon : I shouldn’t have to cough up all of the details on my drivers license, when all I need to prove is that I’m over 13, 16, 18, 21, etc.?

Too late, as pointed out in,

https://www.schneier.com/blog/archives/2025/07/encryption-backdoors-and-the-fourth-amendment.html/#comment-446651

“All this “think of the children” nonsense with “age verification” has next to nothing to do with protecting anyone including children. In fact it is the very opposite, it’s being used to extract all sorts of personal and private data for profit.

https://www.theregister.com/2025/07/21/opinion_column_age_verification/

Read the article and you will find it’s “any age,” as the UK “age verification” law is quite deliberately so badly defined as to be meaningless. Because the legislators know not just that it’s not going to work, but that it can not work… So they’ve made the law such that if it goes wrong they can do a “Pontius Pilate” and go on to blame someone else for not doing the impossible, and thus avoid any responsibility.

And guess what: like any “hot potato game,” people are doing the same and “outsourcing it” to others who ask for everything and then sell it on… But those others have taken care to have no legal responsibility, as they are in effect out of scope / jurisdiction. And surprise, surprise, the data brokers and worse are in on it already, slurping it all up…

But why are the UK legislators doing this?

Because they hate the Internet, as it shows them up for the liars and worse that they are 100% of the time. And worse, because the “Blairites” have been desperate for decades to get National ID cards/etc in by any way they can, even though it’s been proved they will fail and fail badly (as they have almost everywhere they’ve been put in).

The dirty secret of Bio-metric Passports is that they have both positive and negative fail rates worse than 1 in 1000.

Whilst that is not as bad as experienced passport control officers, who are lucky to get a 75% success rate, or eyewitnesses, who usually manage less than 25% in a random line-up…

It’s in effect an open doors event to those who can afford to do a little “ID Shopping” on stolen credentials or even have them “stolen to order”. It’s well known but not widely spoken about, that this is what Mossad have done for as long as they have existed and still continue to do today.

Winter July 25, 2025 1:53 AM

@C U Anon

“All this “think of the children” nonsense with “age verification” has next to nothing to do with protecting anyone including children.

This is not about “protecting” users, but about criminalizing the whole practice.

When you look at what those who advocate “age verification” actually want, it is to forbid it by law. However, as this is not yet possible, they want to make it as cumbersome as possible.

And this is just the start. Everything that has been liberated since the civil war should be turned back to 19th-century puritanism. E.g.,
‘https://www.msnbc.com/opinion/msnbc-opinion/project-2025-porn-ban-lgbtq-transgender-rcna161562

C U Anon July 25, 2025 6:33 PM

Winter : This is not about “protecting” users, but about criminalizing of the whole practice.

Not sure what you mean by criminalise.

Do you mean,

1 – Criminalise “age verification”

That is, make any age verification system impossible, thus making fines a high income for Gov and its agencies, until it becomes clear to everyone that Online Age Verification will always fail, just as it does in “real life”.

2 – Criminalise what the age verification is allegedly needed for.

That is, things like allegedly addictive substances and activities reserved for adults, etc. Some of these are just outright unlawful to use or do at any age. Others, however, like alcohol, tobacco, vapes, and watching/playing certain movies or games, have an age threshold. Thus “criminalise” for all, much like was tried with “American Prohibition” of alcohol (which failed, created worse societal harm and more apparently addictive behaviours, and thus got repealed).

C U Anon July 26, 2025 4:44 AM

Winter :

This is fresh from the in-box,

Discord’s “UK Required” age verification test can be tricked with video game characters

https://www.thepinknews.com/2025/07/25/discord-video-game-characters-age-verification-checks-uk-online-safety-act/

I was not expecting such a simple work around so fast…

But consider that disinterested developers, cost-saving management, and near-zero specification were always going to produce, at best, “meets minimum requirements”.

Then throw in the real battle of smart kids V dumb politicos and bureaucrats where history shows the kids win almost every time, and you can see that it was going to go that way it was just a question of time.

If it’s found to be true, and for the above reasons there is a reasonable probability it will be, it just makes the point that online user verification simply can not be made reliable.

As has been pointed out on this blog in the past “Virtual is not Real Life” and between them is a gap where “Not only the Devil can play and win”.

As has been repeatedly stated Bio-metrics have always been a bad idea. Whilst most jump to the idea of “chopping off fingers and pulling out eyes” of Spy Movie Stories the reality is there are two issues.

The first is humans change all the time every day in ways that are not predictable. Finding a bio-metric that is non invasive, low cost, and fast to work that will cope with not just such change, but grime, etc is difficult at best and thus almost always a compromise. Hence the generally poor false positive and false negative figures.

The second is that there is a considerable gap between “Real Life ‘warm bodies'” and “Information measuring and converting ‘sensors'”. Even when the flesh is pressed against the glass measurements are going to have, scale, bias, offset, noise, and dropout etc issues.

Then the question of,

“What are you actually measuring?”

Arises because you are not comparing to an individual reference but Jo Average at one of those times in life when people change the fastest…

Throw in low cost easy to make AI deep fakes and the question that should pop up is,

“Why on earth does anyone think this can work, let alone reliably?”

When history shows over and over it can not.

Oh, and there are also the downsides, like the equivalent of “racial profiling” and covert databases. Those that used “23andMe” have had an ice-bath / cold-shower wake-up call,

“23andMe’s privacy statement, which all customers must accept to use the service, contains provisions that it may sell your personal information if it is ever involved in bankruptcy proceedings.”

https://www.techradar.com/health-fitness/23andme-is-bankrupt-and-about-to-sell-your-dna-heres-how-to-stop-that-from-happening

Such things will become more and more commonplace until consumers “wise up”… And it is made worse when the likes of lobbyists and unelected regulators with adverse political views stop sensible regulation,

https://www.techradar.com/pro/security/trump-administration-scuppers-plan-to-stop-data-brokers-from-putting-americans-sensitive-data-up-for-sale

Winter July 26, 2025 12:51 PM

@C U Anon

Criminalise what the age verification is allegedly needed for.

Especially any practice related to depictions of humans having intercourse or being depicted without clothes. In addition, anything even suggesting loving someone without this resulting in babies.

In short, anything religious fundamentalists and fascists object to.

I would like to remind you of the fact that in the state of Texas, which currently has age identification in law, the sale of likenesses of male members is forbidden for no reason a sane person can understand, except, of course, 19th century puritans.

‘https://en.wikipedia.org/wiki/Texas_obscenity_statute

Paul Sagi July 27, 2025 9:25 AM

I do agree that it’s a problem when the various healthcare personnel treating a patient are each seeing only fragments of a patient record.
SOLID is however naive.
1) Contradiction: The problem is stated as fragmented data: the whole dataset does not follow the patient. Giving a cardiologist only cardiology and background data is fragmenting the dataset.
2) The data lives in a digital wallet. How prone to failure is that?
3) It’s naive to think that there can be quality medical care if medical data (example below) can be parsed so that, for example, only cardiac and background info would be available to a cardiologist. Thinking such shows quite a lack of knowledge about medicine. Restricting the info a cardiologist has access to prevents full understanding of a patient, thus preventing proper diagnosis and proper treatment. A simple example: the kidneys and heart are tightly linked; if one has a problem, the other will also have a problem. The human body is composed of cells that form tissues, tissues that form organs, and organs that form systems. Those systems are linked through feedback loops such as hormones and nerves, creating systems of systems. A cardiologist trying to diagnose a patient without full knowledge of the patient would be like the parable: “The parable of the blind men and the elephant illustrates how individuals can have different perspectives based on their limited experiences. Each blind man touches a different part of the elephant and describes it differently, highlighting the importance of understanding that one’s viewpoint may only represent a part of the whole truth.” (Wikipedia, LibreTexts)
Because the human body is a system of systems, what appears to be a heart problem may not be a heart problem, it may be a symptom. Severe Vitamin B12 deficiency can cause heart failure. Treating the heart would be futile, what the patient needs is an infusion of B12. I know of a case where a patient had heart failure, a cardiologist wanted to give them heart medication. A physiologist reviewed blood tests of the patient, found the patient had hyponatremia (low blood level of sodium) and the patient was cured by an IV of sodium chloride (salt). That patient would have died if the cardiologist had continued considering only cardiac data without consideration of blood chemistry. For my final comment, blood pressure is controlled by multiple feedback loops. Treating a problem with blood pressure requires a broad approach, not merely selecting one aspect and being blind to the others.

Gateklons July 27, 2025 4:39 PM

But while these laws attempt to protect data confidentiality, there is nothing to protect data integrity.

That’s just not true. GDPR specifically addresses this in Article 5(1)(d) and (f), and it is part of the concept of security of processing (which also includes data availability).

Outside of public administration it doesn’t go the extra mile of data interlinking, but that is its own privacy risk (why would I want the database state to be overextended into the private sector?) and may lead to universal identifiers, central points of failure, etc.

The real issues here are 1) the existence of data brokers, 2) under-enforcement of data protection law and 3) the continued reliance on names, dates of birth, addresses and other details for authentication or to establish a link with your legal identity instead of using a purpose/service-specific identifier issued to you by a central population register (established in many developed democracies). Everywhere where that is not necessary the personal data should be pseudonymous and authentication managed with anonymous passkeys. Personal information management systems (PIMS), which have found their way into German law, can fill in some of the gaps.

Agammamon July 28, 2025 5:36 AM

Currently, I am not sure what value accurate and timely data about me, across the multitude of apps, platforms, and services, has for me.

I feel I am better protected when that data is fragmented and out of date. I have no desire for data brokers to be able to compose a complete and accurate identity file on me and there’s plenty of online interactions that were one-and-done and I don’t want them to be able to access up-to-date information on me from any source.

And then there’s government surveillance . . .

Agammamon July 28, 2025 5:39 AM

This also goes to security breaches. A lot of the time the info these people are getting is incorrect and old – more breaches, less valuable info on any given breach.

The alternative here is to have all the data maintained up to date, making it a massive target, and so when it’s breached – and it will be, it’s only a matter of when – it’s a gold mine.

Bonus – you know the security around it will be just as shite as it is everywhere else;)

Agammamon July 28, 2025 5:44 AM

For example, imagine digital verification that works like a locked mobile phone—it works when you’re the one who can unlock and use it, but not if someone else grabs it from you.

https://youtu.be/dUMH6DVYskc?si=_L7FZok4MtcVWVTt

Until a child can unlock their parent’s phone because the lock wasn’t very secure.

C U Anon July 28, 2025 7:36 AM

Agammamon: …unlock their parent’s phone because the lock wasn’t very secure.

There are when you get down to it two basic kinds of lock,

1 – Tangible / physical.
2 – Intangible / informational.

Both have issues some are general to all locks, others apply to individual types of lock.

But it needs to be said,

All locks have issues from before they are designed due to trade-offs, legislation, regulation, and manufacturing processes.

But as a rough rule, two things apply before the actual design,

1, Minimise costs.
2, Maximise convenience.

The result of this is,

Defects are built in from the very beginning of a lock’s life cycle.

I can break this down further, but yes the majority of locks have avoidable defects that bypass the intent of the lock from the owner / user perspective.

Agammamon July 29, 2025 9:57 AM

I don’t need it broken down.

Locks are fallible, that was my point. And if your lock is fallible it’s better to not have all your valuables locked in one place.

lurker July 30, 2025 2:01 PM

@C U Anon
” … and should act towards one another in a spirit of brotherhood.”

Cain – Abel

Searching for a true Original Sin might take you back to the act of creating the heavens and the earth.

Anon 97 August 1, 2025 3:28 AM

(1) I like Solid and I’m excited to see how it develops.

It seems like the essence is in the promised ecosystem, not the protocol. The protocol per se appears to be compatible with the status quo — does it really make a difference whether you enter your name and DOB through a closed platform’s UI or the platform requests the same info from your pod?

The hope must be that Solid enables a new ecosystem with a new status quo. As best I can understand it, it doesn’t prevent players from reconstructing the status quo within that ecosystem. I just hope it provides the technical means to tip the balance of power.

(2) Does the user control the license under which her pod data is readable and copyable?

B.M. Peachey August 13, 2025 4:49 AM

Reading through the comments, I am very curious how many of them are written by AI, as the alternative would be very uninformed people making very inexact statements and claims.

Statements along the lines of “I haven’t read the spec, but …” and “I might be wrong, but…” are just straw man arguments.

Statements regarding “This is unsafe” or “This is perfect for criminals” fall under the category of misunderstanding how this works. Yes, you can claim your shoe size is a 5 when it is actually a 6. The shop won’t care; they’ll still sell you shoes. But you don’t gain anything by lying. As soon as there is gain, there is also mitigation/remediation. There are already all sorts of mechanisms for ensuring data validity. Would Solid be 100% self-declared, without any checks and balances? Does that make sense? No, it doesn’t. It simply means that more security implementation is needed to match the more decentralised nature of Solid. Not that “Financial criminality would be democratised overnight”. Honestly, only an idiot would think that. (Ad hominem intended).

People not wanting Solid but preferring trusted third parties is also a straw man, albeit slightly more representative of the current state of affairs. A lot of people just fill in anything anywhere anyway. They simply have no idea of the repercussions. That is why it is essential that a user-centric solution is pushed forward, because that is the only path toward educating users about the results of their actions. This is why it is essential that trusted Solid third parties exist. Sadly, they are not here yet. But what if those existing trusted third parties simply start offering Solid Pods, Identities and related applications? That is why more exposure for Solid is needed, so more parties feel inclined to start offering those services.

Some real concerns about real-world problems are very valid. “Strict protocols, strictly enforced have the potential to exclude a lot of people” and the point that a clinician needs the full patient picture, so fragmenting data can harm diagnosis, are both real concerns. Both of these things are already happening with existing applications and closed-source systems. Solid explicitly aims to change that, not make it worse… But yes, this needs to be handled with great care if that ambition is to be achieved.

Now, to address individual statements and questions (as someone who has read the protocol, built Solid Apps, implemented Solid Servers, and works with Solid on a weekly basis).

It seems like the essence is in the promised ecosystem, not the protocol.
To a large extent, that is correct. The only way the Solid Protocol has any real-world use is if enough parties use it, which will only happen if the ecosystem becomes useful for consumers, businesses, and developers alike.

does it really make a difference whether you enter your name and DOB through a closed platform’s UI or the platform requests the same info from your pod?

Yes. Ownership, consent, and intent are important. Data autonomy should be as normal as your physical autonomy.

it doesn’t prevent players from reconstructing the status quo within that ecosystem

It does, through both technical and legal means. But, to be fair, this does become easier in regions where there are actual laws protecting digital citizens.

I am not optimistic that ordinary folks will be able to make full use of the kind of universal solutions discussed ATL

Part of the underlying mechanism for data access consists of providing consent to other agents (either apps or other users). Although it might be legally difficult to gain agency, it is technologically trivial. A user can give another entity agency over their entire Solid pod, or parts thereof. Solid also understands the concept of co-ownership, so a patient would not actively have to give a hospital (or app representing the hospital) consent. Access would already be available.

[users who] are no longer capable of understanding newer concepts

They shouldn’t have to. The majority of this takes place “under the hood”. The UI is “just” a regular web or mobile interface. In fact, Solid would make it easier to create multiple UIs, enabling developers or companies to provide simpler interfaces for users that need them.

Some countries treat each of their citizen’s PII as their own trademarked and copyrighted property so others cannot use it without the owner’s written consent.

Personally, I have never heard of anything remotely like this. But maybe that is just because I am privileged enough not to live in a region where this is an issue. Regardless, there is way more data that is useful to a user than just their PII. Also, even if the country “owns” the data, it would be beneficial for users to be able to decide how much of that data they expose to which proprietary service, which is part of what Solid is trying to achieve.

the vast majority of individual users cannot be expected to maintain such personal data pods.

They would not have to; that is the whole point. Things should “just work”. But the important part is that they could if they wanted to, which currently, unsurprisingly, they cannot.

am I suddenly able to call myself an Admiral with a law degree and a healthy trust fund as a credit line

Not without a source of proof, no.

What is to stop bad actors from building and presenting a fake profile / history / whatever?

Technologically enforceable proof mechanisms. Think zero-knowledge proofs, verifiable credentials, linked data proofs, etc.

There’s also another problem: partial identities, pseudonymous/fake identities, companies that collect too much data, etc.

True. These are real-world issues that need real-world solutions. Work is being done to address these needs, but the solutions are not fully there yet.

Having a data store that has it all is a bit risky, as you can accidentally share too much, especially the people that are a little less competent with all that computer stuff.

This is where a major task of prevention and education becomes an absolute must for Pod providers and Solid app creators. With greater power also comes greater responsibility.

You have no idea whether [data] is actually not deleted
[..] the provider re-acquir[es] the same dataset from another provider
[..] European DPAs (e.g. Eire) simply ignore lawbreaking making GDPR useless in practise.

Again, stricter technological and legal mechanisms are needed. Proprietary systems are not providing these. The Solid movement is at least trying. Will it be 100% successful? Most likely not. Will it make things better (or at least less bad)? Almost guaranteed.

I could go on, but at this point, if you want / need more you might as well just talk to me in person… https://www.linkedin.com/in/benpeachey/
