Academia and the “AI Brain Drain”

In 2025, Google, Amazon, Microsoft and Meta collectively spent US$380 billion on building artificial-intelligence tools. That number is expected to surge still higher this year, to $650 billion, to fund the building of physical infrastructure, such as data centers (see go.nature.com/3lzf79q). Moreover, these firms are spending lavishly on one particular segment: top technical talent.

Meta reportedly offered a single AI researcher, who had cofounded a start-up firm focused on training AI agents to use computers, a compensation package of $250 million over four years (see go.nature.com/4qznsq1). Technology firms are also spending billions on “reverse-acquihires”—poaching the star staff members of start-ups without acquiring the companies themselves. Eyeing these generous payouts, technical experts earning more modest salaries might well reconsider their career choices.

Academia is already losing out. Since the launch of ChatGPT in 2022, concerns have grown in academia about an “AI brain drain.” Studies point to a sharp rise in university machine-learning and AI researchers moving to industry roles. A 2025 paper reported that this was especially true for young, highly cited scholars: researchers who were about five years into their careers and whose work ranked among the most cited were 100 times more likely to move to industry the following year than were ten-year veterans whose work received an average number of citations, according to a model based on data from nearly seven million papers.1

This outflow threatens the distinct roles of academic research in the scientific enterprise: innovation driven by curiosity rather than profit, and the provision of independent critique and ethical scrutiny. The fixation of “big tech” firms on skimming the very top talent also risks eroding the idea of science as a collaborative endeavor, in which teams—not individuals—do the most consequential work.

Here, we explore the broader implications for science and suggest alternative visions of the future.

Astronomical salaries for AI talent buy into a legend as old as the software industry: the 10x engineer. This is someone who is supposedly capable of ten times the impact of their peers. Why hire and manage an entire group of scientists or software engineers when one genius—or an AI agent—can outperform them?

That proposition is increasingly attractive to tech firms that are betting that a large number of entry-level and even mid-level engineering jobs will be replaced by AI. It’s no coincidence that Google’s Gemini 3 Pro AI model was launched with boasts of “PhD-level reasoning,” a marketing strategy that is appealing to executives seeking to replace people with AI.

But the lone-genius narrative is increasingly out of step with reality. Research backs up a fundamental truth: science is a team sport. A large-scale study of scientific publishing from 1900 to 2011 found that papers produced by larger collaborations consistently have greater impact than do those of smaller teams, even after accounting for self-citation.2 Analyses of the most highly cited scientists show a similar pattern: their highest-impact works tend to be those papers with many authors.3 A 2020 study of Nobel laureates reinforces this trend, revealing that—much like the wider scientific community—the average size of the teams that they publish with has steadily increased over time as scientific problems increase in scope and complexity.4

From the detection of gravitational waves, which are ripples in space-time caused by massive cosmic events, to CRISPR-based gene editing, a precise method for cutting and modifying DNA, to recent AI breakthroughs in protein-structure prediction, the most consequential advances in modern science have been collective achievements. Although these successes are often associated with prominent individuals—senior scientists, Nobel laureates, patent holders—the work itself was driven by teams ranging from dozens to thousands of people and was built on decades of open science: shared data, methods, software and accumulated insight.

Building strong institutions is a much more effective use of resources than is betting on any single individual. Examples demonstrating this include the LIGO Scientific Collaboration, the global team that first detected gravitational waves; the Broad Institute of MIT and Harvard in Cambridge, Massachusetts, a leading genomics and biomedical-research center behind many CRISPR advances; and even for-profit laboratories such as Google DeepMind in London, which drove advances in protein-structure prediction with its AlphaFold tool. If the aim of the tech giants and other AI firms that are spending lavishly on elite talent is to accelerate scientific progress, the current strategy is misguided.

By contrast, well-designed institutions amplify individual ability, sustain productivity beyond any one person’s career and endure long after any single contributor is gone.

Equally important, effective institutions distribute power in beneficial ways. Rather than vesting decision-making authority in the hands of one person, they have mechanisms for sharing control. Allocation committees decide how resources are used, scientific advisory boards set collective research priorities, and peer review determines which ideas enter the scientific record.

And although the term “innovation by committee” might sound disparaging, such an approach is crucial to make the scientific enterprise act in concert with the diverse needs of the broader public. This is especially true in science, which continues to suffer from pervasive inequalities across gender, race and socio-economic and cultural differences.5

Need for alternative vision

This is why scientists, academics and policymakers should pay more attention to how AI research is organized and led, especially as the technology becomes essential across scientific disciplines. Used well, AI can support a more equitable scientific enterprise by empowering junior researchers who currently have access to few resources.

Instead, some of today’s wealthiest scientific institutions might think that they can deploy the same strategies as the tech industry uses and compete for top talent on financial terms—perhaps by getting funding from the same billionaires who back big tech. Indeed, wage inequality has been steadily growing within academia for decades.6 But this is not a path that science should follow.

The ideal model for science is a broad, diverse ecosystem in which researchers can thrive at every level. Here are three strategies that universities and mission-driven labs should adopt instead of engaging in a compensation arms race.

First, universities and institutions should stay committed to the public interest. An excellent example of this approach can be found in Switzerland, where several institutions are coordinating to build AI as a public good rather than a private asset. Researchers at the Swiss Federal Institute of Technology in Lausanne (EPFL) and the Swiss Federal Institute of Technology (ETH) in Zurich, working with the Swiss National Supercomputing Centre, have built Apertus, a freely available large language model. Unlike the controversially labelled “open source” models built by commercial labs—such as Meta’s LLaMa, which has been criticized for not complying with the open-source definition (see go.nature.com/3o56zd5)—Apertus is not only open in its source code and its weights (meaning its core parameters), but also in its data and development process. Crucially, Apertus is not designed to compete with “frontier” AI labs pursuing superintelligence at enormous cost and with little regard for data ownership. Instead, it adopts a more modest and sustainable goal: to make AI trustworthy for use in industry and public administration, strictly adhering to data-licensing restrictions and including local European languages.7

Principal investigators (PIs) at other institutions globally should follow this path, aligning public funding agencies and public institutions to produce a more sustainable alternative to corporate AI.

Second, universities should bolster networks of researchers from the undergraduate to senior-professor levels—not only because they make for effective innovation teams, but also because they serve a purpose beyond next quarter’s profits. The scientific enterprise galvanizes its members at all levels to contribute to the same projects, the same journals and the same open, international scientific literature—to perpetuate itself across generations and to distribute its impact throughout society.

Universities should take precisely the opposite hiring strategy to that of the big tech firms. Instead of lavishing top dollar on a select few researchers, they should equitably distribute salaries. They should raise graduate-student stipends and postdoc salaries and limit the growth of pay for high-profile PIs.

Third, universities should show that they can offer more than just financial benefits: they must offer distinctive intellectual and civic rewards. Although money is unquestionably a motivator, researchers also value intellectual freedom and the recognition of their work. Studies show that research roles in industry that allow publication attract talent at salaries roughly 20% lower than comparable positions that prohibit it (see go.nature.com/4cbjxzu).

Beyond the intellectual recognition of publications and citation counts, universities should recognize and reward the production of public goods. The tenure and promotion process at universities should reward academics who supply expertise to local and national governments, who communicate with and engage the public in research, who publish and maintain open-source software for public use and who provide services for non-profit groups.

Furthermore, institutions should demonstrate that they will defend the intellectual freedom of their researchers and shield them from corporate or political interference. In the United States today, we see a striking juxtaposition between big tech firms, which curry favour with the administration of US President Donald Trump to win regulatory and trade benefits, and higher-education institutions, which suffer massive losses of federal funding and threats of investigation and sanction. Unlike big tech firms, universities should invest in enquiry that challenges authority.

We urge leaders of scientific institutions to reject the growing pay inequality rampant in the upper echelons of AI research. Instead, they should compete for talent on a different dimension: the integrity of their missions and the equitableness of their institutions. These institutions should focus on building sustainable organizations with diverse staff members, rather than bestowing a bounty on science’s 1%.

References

  1. Jurowetzki, R., Hain, D. S., Wirtz, K. & Bianchini, S. AI Soc. 40, 4145–4152 (2025).
  2. Larivière, V., Gingras, Y., Sugimoto, C. R. & Tsou, A. J. Assoc. Inf. Sci. Technol. 66, 1323–1332 (2015).
  3. Aksnes, D. W. & Aagaard, K. J. Data Inf. Sci. 6, 41–66 (2021).
  4. Li, J., Yin, Y., Fortunato, S. & Wang, D. J. R. Soc. Interface 17, 20200135 (2020).
  5. Graves, J. L. Jr, Kearney, M., Barabino, G. & Malcom, S. Proc. Natl Acad. Sci. USA 119, e2117831119 (2022).
  6. Lok, C. Nature 537, 471–473 (2016).
  7. Project Apertus. Preprint at arXiv https://doi.org/10.48550/arXiv.2509.14233 (2025).

This essay was written with Nathan E. Sanders, and originally appeared in Nature.

Posted on March 13, 2026 at 7:04 AM • 21 Comments

Comments

Paul Sagi March 13, 2026 9:41 AM

I suppose AI can be used to flood with fake research papers, as a means of sabotage by competing nation-states.

Rontea March 13, 2026 10:45 AM

In the rush to monetize intelligence, we risk forgetting that science is not merely an engineering problem to optimize. When discovery becomes a compensation arms race, curiosity is reduced to throughput, and the elegant mess of inquiry is flattened into quarterly metrics. Security, ethics, and resilience emerge from the same principle: systems thrive when they are diverse, collaborative, and hard to game.

We need to remember that science is not just a machine that delivers answers. It is a human enterprise, and like cryptography, its strength lies in its capacity for creativity under constraint. To endure, science must become more like art—valued not only for its utility, but for its ability to reveal, provoke, and connect. Institutions that protect this spirit will produce the knowledge that lasts; those that treat research as a talent-extraction problem will find themselves with brittle systems and shallow insight.

Milan March 13, 2026 10:48 AM

I hope some universities follow your advice! We need powerful forces to combat the societal drift into self-serving oligarchy.

mark March 13, 2026 1:00 PM

And even for someone who’s not being offered millions of dollars, the salaries and benefits are higher… esp. when the odds are that you won’t ever get tenure. (1970s: 70% with tenure; now 30% and dropping)

lurker March 13, 2026 2:10 PM

How “open” are Chinese Open Source models?

While the West debates GPT-5 vs Claude 4, China’s open-source AI community has been quietly shipping production-grade tools that solve real problems. Some of these projects have tens of thousands of GitHub stars and massive user bases in Asia — yet remain virtually unknown on Reddit, Hacker News, or Dev.to.

Why Does This Gap Exist?
Three reasons:

* Language barrier. Tutorials, case studies, and community discussions are in Chinese. GitHub stars don’t translate into English-language awareness.
* Different ecosystems. Chinese developers share on WeChat, Zhihu, and Bilibili — platforms Western devs never visit.
* Discovery bias. Western tech media covers OpenAI and Hugging Face. Chinese OSS projects don’t have PR teams pitching TechCrunch.

https://dev.to/victorjia/5-chinese-ai-open-source-tools-the-west-doesnt-know-about-2026-18b0

Clive Robinson March 13, 2026 5:11 PM

@ lurker,

With regards

Why Does This Gap Exist?
Three reasons:

There are other “non technical reasons”…

It is said,

“You can not solve social problems with technological solutions”

Which is true enough but what happens when you flip the premises?

One answer you get is,

“you can solve social problems with non-technological solutions”

Which is also true enough.

But consider that most tech is developed to solve some aspect of a sociological need. As such you need a sociological solution to solve these things effectively…

Like it or not, even with the bad press we hear, the Chinese people tend to be more “pro-social” than we credit them with (though they are not the US caricature of “communists”, by and large). This makes for a more effective workforce than the American/Western neo-con capitalist rapacious answer, which is neither technologically nor sociologically based, but “greed front and center”.

Which is why the neo-cons, having failed in so many other “greed-based” systems, are now going “all in” on a perverted form of “self-entitled individualism”. Anyone who can think can see it’s not going to get very far, and the far-from-legal ideas it has to rely on to survive are seriously debilitating to the economy etc.

Rent seeking may be invoked in the US to “make AI pay its way”… by the use of surveillance. But it is not survivable, as surveillance is not workable in a pro-social community. In general it does not fit; it is just that,

“Bad LLM output is actually hard to distinguish for various reasons, so ‘safety guard rails’ will be held up as proof of an always-workable solution, even though we know it’s the wrong way with the wrong turn subset…”

Clive Robinson March 13, 2026 11:19 PM

@ Bruce,

Are you aware of just how short lived this is going to be,

“Moreover, these firms are spending lavishly on one particular segment: top technical talent.”

They can only do this if “general use AI” “brings home the bacon” or “investors are happy to burn money for no return”.

The simple fact is “general use AI” as an income earner is “doomed” from the get go, it won’t pay, and it’s certainly not going to pay on a high price subscription model.

I’ve explained the reasoning in answer to a question raised by @lurker on another thread,

https://www.schneier.com/blog/archives/2026/03/friday-squid-blogging-squid-in-byzantine-monk-cooking.html/#comment-452866

I’ve known for three decades and more that you cannot make an economic market work as a “free market” if the “Distance Cost Metric” is zero, or effectively zero to your customers, and there is no way to create an effective cartel or monopoly.

Put overly simply, it’s one interpretation of “supply and demand”. If people see profit in an activity, supply of the service will increase. Initially there may be profit, but soon supply will be higher than demand and the profit will fall close to zero, at which point “the game is over”.

What stops this happening in markets for tangible physical objects is that the “distance cost metric” cuts profit with range, thus forcing local and regional coverage areas, because the cost of long-distance shipping and of covering a retail area rises as a power law.

There is no “distance cost metric” with “intangible information objects”: it costs the same to send them anywhere on the globe where there is Internet access.

Whilst this allows an initial “first to market” advantage, unless there is a way to ensure a cartel or monopoly the law of “supply and demand” kicks in: as supply increases, profit decreases, going close to or below zero (think of enshittification reducing demand).

FOSS actually speeds this along, as it can turn anyone into a supplier of information objects very, very inexpensively.

One of the reasons China is doing well in the AI space race is “cooperation” and in effect making things “Free and Open” and the cost of hardware is the only impediment to supply of “general AI” sky-rocketing…

The novelty of “general AI” is now quite quickly wearing thin. And although people have not yet realised it, the supply of “general AI” based on current LLM and ML systems is rapidly approaching, if not already in, “over-supply”: thus not only no profit, but no return on investment either. So: death of the hype bubble that stupid investors put way more than pocket-change into.

If, as increasing numbers suspect/expect, this hype investing puts the US economy into recession, it will put the world economy into recession as well, especially if some other factor, such as the cost of power production from the petrochem energy supply, rises (which it is doing significantly at the moment).

Winter March 14, 2026 3:35 AM

@Clive

They can only do this if “general use AI” “brings home the bacon” or “investors are happy to burn money for no return”.

The competition is for investors. They buy the talent to reduce the competition for investors.

These start-ups are deprived of investment money because the talent that attracts the money has been taken out.

Heinrich March 14, 2026 5:48 AM

As much as I would like to have a “broad, diverse ecosystem in which researchers can thrive at every level”, the reality is that researchers do not thrive.

  • The distribution of Research Grant money is horribly inefficient — this failure is thoroughly documented by now. The problem is that much researcher time is spent judging whether a research plan has a chance of success — but for cutting-edge research this misses the fact that a) judges may fail to recognize novel ideas that do not fit their biases, and b) if the research can be planned, it doesn’t cover enough unknown territory to be novel. On a): new ideas never fill a void in an empty world; they have to fight their way through the molasses of minds who think they already know better (hint: they don’t).
  • The above means that the skill to write a good Research Grant is actually in opposition to the skill of doing good research. In the pyramid scheme junior → senior → professor, this means that the good researchers are likely to drop out.
  • Individual excellence is the only thing that can produce both novelty and handle unknown territory. As a case in point: you can’t replace Bruce Schneier’s voice on security. No pool of fungible researchers will give you that individual perspective that Bruce has.

Winter March 14, 2026 11:15 AM

@Heinrich

The distribution of Research Grant money is horribly inefficient

But fundamental research is itself extremely inefficient in the short term, and the most efficient way to generate knowledge in the long term.

Most really good fundamental research has as outcome that we are wrong in what we thought was true. Any research which finds what it expected does not tell us something new.

As there are infinitely many things to investigate, grant committees have to distribute the available money wisely. They have to strike a balance to pick answerable questions with a good chance to give a useful answer.

A negative result is a very useful answer if it teaches us something we didn’t know.

yet another bruce March 14, 2026 5:09 PM

@Bruce prime

Some of the assertions in this essay do not reflect my experience.

Curiosity rather than profit is a false dichotomy, just as curiosity vs grants or curiosity vs tenure are false dichotomies. Curious people are curious. Ambitious people are ambitious. Some people are both.

Companies may be paying star performers high salaries, but that does not mean they are hiring them for what they can do as lonely individuals. Cristiano Ronaldo is well paid, but rest assured, football is a team sport and a player like Ronaldo elevates the performance of the entire team.

Firstly, when I studied at an elite US university, I was surprised at how little the faculty cooperated on research. Grad students under the same advisor cooperated well, but groups were pretty well siloed from one another. I suspect the tenure process can lead to a view that other assistant professors are competitors rather than colleagues. When I graduated and took a job in a big industrial research organization, I saw much healthier teamwork and research cooperation across teams and even divisions. That aspect of the job was great.

While I am sure that money is a big factor in the AI brain drain you describe, I suspect that the desire to work on the cutting edge is also a factor. In many fields, the most innovative work and the work with the greatest impact is often being done in large companies rather than in universities.

Clive Robinson March 14, 2026 6:31 PM

@ Winter, Heinrich, ALL,

You say,

“A negative result is a very useful answer if it teaches us something we didn’t know.”

All “research” informs in some way.

Even if it “further confirms” what we think we know.

The Universe does not, as far as we can tell, run on “formulations of mathematical rules”; they are “models we ourselves make”, and the sort of research you cover with,

“But fundamental research is itself extremely inefficient in the short term, and the most efficient way to generate knowledge in the long term.”

is about the making and testing of those models and rules, and more often than most realise we find the models are wrong in some oft-small way, even if they do appear to represent the things we observe.

This sort of “arms race”, like any other, produces less and less at higher and higher cost. The point was not lost on Richard Feynman, who gave voice to it in writing. In essence, his advice was that those observing the macro-macro in terms of time and space should be honest about what they expect to bring to the table, but rarely if ever are. In part it is why politicians do not trust scientists: they see lies and half-truths in grant applications all the time, and no effective return on money from the public purse in their lifetime or that of the imaginable generations to come (about 100–200 years).

Thus researchers are seen as “dishonest” from the get go, so when they do have important results that have near term consequences they are easy to dismiss and ignore, which we see all the time.

It’s why “corporations” have been allowed to “capture and control” research, much to the cost of society, and society gets sicker and weaker because of it. Thus an even worse return on the use of public funds and, with two exceptions, we are all the more harmed, not just poorer, for it. Those two exceptions being those on either side of the line between corporate controllers and the hands-out, noses-in-the-trough, grease-my-wheel politicians and legislators.

I’ve made no secret of the fact that I enjoy doing short-term research and hopefully adding it to human knowledge without strings, and for the betterment of “the others”, not those on the aforementioned line. As long-term readers will have read, all I ask is a “hat tip and two drinks”: they buy @Bruce our host both of them, and at some point Bruce –if we ever meet up[1]– buys me one (an idea that somebody else later turned into “buy me a coffee” and similar[2]).

The sort of research I like doing falls in two areas,

1, Industrial archaeology.
2, Applied Engineering.

The first is a near-lifelong personal interest that I grew up in, thanks to my curiosity and my mother, whose profession it had been before becoming a mother.

The second has been across more fields of endeavor than I could conveniently list, from being a “self-taught lock pick” before my teens to, a little later, the design and manufacture of boats and antennas in glass-reinforced plastic long before leaving school. Alongside that I got into electronics, communications and “Pirate Radio”; then in my teens 8-bit CPUs brought personal computing to the world, which I fell into feet first, and I somehow always found myself on the “leading and oft bleeding edge” of design in both. The conversion of hobbies into professions is a major failing in my life, as hobbies are meant to be something relaxing you want to do, and professions are rarely relaxing, or after a relatively short time things you want to do.

And it is this last point that those striking out on their own should consider: the corporate waters are full of dangers, being a minnow in such waters gives a minimal life expectancy, and you probably will not like what you get turned into in order to survive. As for thriving in such an environment, research is not something you will be able to do, let alone control.

An example of which is your comment of,

“The competition is for investors. They buy the talent to reduce the competition for investors.

These startups are deprived of investment money because the talent attracting the money are taken out.”

And it’s not just “money” and “investment” it’s also “ego, control, power” and worse we’ve talked about in the past (dark pentad of abusive domination).

[1] We were nearly in the same room in Cambridge UK back in 06/24, but unfortunately ill health got the better of me, and the administration of surgeons, like that of judges, takes no account of the personal; if you don’t accept, you get punitive results.

[2] I do not like the likes of all the “buy me a XXX” sites, because like “tip jars” they are not under the control of either you or your intended recipient, so they don’t get the value you might put in, or at best some small fraction of it. Corporations and governments take big slices “off the top” and so get the cream, whilst the recipients get not milk but something more watered down than even the whey of harmful sugar water. Am I biased… Well yes, so what do others think,

https://medium.com/make-online/is-buy-me-a-coffee-worth-it-11538c6b9205

Winter March 14, 2026 10:41 PM

@Clive

The point was not lost on Richard Feynman, who gave voice to it in writing. In essence, his advice was that those observing the macro-macro in terms of time and space should be honest about what they expect to bring to the table, but rarely if ever are.

There is some duplicity in these accusations.

Mr Feynman was a brilliant and very successful scientist who was instrumental in many important technical advances.

All these advances were built upon science from a few decades before that had been derided without end as convoluted nonsense without any possible use: Quantum Mechanics. The physicists behind this research have been called frauds and con men many times.

The same accusations had been heaped upon Mr Faraday in his day for his useless work on this mirage called electricity. Or, to pick a non-physics subject, all the genetics work done on useless fruit flies, which is at the basis of modern agriculture and medicine.

Modern researchers are asked without fail to predict the future economic value of their proposed research. Nothing in their education or training prepares them for this task. There are very few, if any, people in the world who would be able to make evidence-based estimations of this kind.

So these researchers try to advocate for their projects with the tools they have, which is only their imagination.

When research delivers big money (think electricity, telecommunications, the internet, GPS, quantum mechanics, cures for AIDS, vaccines for crippling diseases, the very food we eat), everyone immediately forgets how the science behind it had been derided and made fun of, and the scientists doing it called lying frauds.

But new research proposals are required to explain how they will repeat these earlier success stories. And then they complain about these “predictions” not coming true.

Sean March 14, 2026 11:36 PM

Clive, I get around 90–95% of the donations I receive on sites like that, after deducting payment-processing fees (e.g. 2.9% + $0.30 for a credit-card payment over $3 on Patreon). The exceptions are things like porn-clip sites, credit-card transactions in countries sanctioned by the USA, and payments within iOS apps, where the payment processors have leverage. Regardless, it’s transparent on the recipient’s side, e.g. https://support.patreon.com/hc/en-us/articles/11111747095181-Creator-fees-overview

Clive Robinson March 15, 2026 4:44 AM

@ Sean,

From the page you link to,

“Patreon’s new, standard 10% plan + applicable taxes and payment processing fees.”

So three sets of deductions on the income…

1, Platform Fees,

“If you published your creator page after August 4, 2025, please note that you are on our standard 10% pricing plan”

So the original payment value less 10%

Deduct (100*0.1) = 10

2, Payment processing fees,


Deduct (100*0.039) + 0.3 = 4.2

3, Government Taxation,

“Where required by law, Patreon collects VAT, GST, or sales tax on platform fees.”

Deduct (100*0.2) = 20 VAT/Sales

Deduct (100*0.3) = 30 Income tax

Totting those up and applying,

100 – (10 + 4.2 + 20 + 30) = 35.8

You then might be able to get the “income tax” back in a couple of years, by jumping through hoops, that can necessitate “accountants fees” and other expenses, which is maybe why few do claim back…

But that paltry amount then goes into your service “account” which then attracts further “withdrawal” fees added to get the money out of the service account and into your bank account…

So probably 2/3rds of any money “sliced off the top” in fees and taxation … Is in your view a “good deal”?

Sean March 15, 2026 11:58 AM

Clive, a few things. In most jurisdictions sales tax etc. is added to a pledge. So a more realistic scenario is Alphonse pledges $5.00 to Gaston, 20% sales tax makes the bill $6.00, and $1 (20% of $5) goes to the taxman, $0.50 (10% of $5) goes to Patreon, and $0.45 ($0.30 + 2.9% of $5) goes to the credit card system, leaving $4.05 for Gaston.
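That worked example can be sketched in a few lines of Python. The rates are the ones quoted in this thread (20% sales tax added on top of the pledge, a 10% platform plan, and a $0.30 + 2.9% card fee), not official figures, and the assumption that fractional cents on the card fee round in the processor’s favor is mine:

```python
from math import ceil

def patreon_net_cents(pledge_cents, sales_tax=0.20, platform_fee=0.10,
                      card_fixed_cents=30, card_pct=0.029):
    """Fee breakdown for one pledge, all amounts in integer cents.

    Sales tax is billed on top of the pledge, so it comes out of the
    patron's pocket; the platform and card fees are deducted from the
    pledge itself. Fractional cents on the card fee are assumed to be
    rounded up, in the processor's favor.
    """
    tax = round(pledge_cents * sales_tax)          # added to the patron's bill
    platform = round(pledge_cents * platform_fee)  # platform's cut
    card = card_fixed_cents + ceil(pledge_cents * card_pct)  # processing fee
    patron_pays = pledge_cents + tax
    creator_gets = pledge_cents - platform - card
    return patron_pays, creator_gets

paid, net = patreon_net_cents(500)  # Alphonse pledges $5.00 to Gaston
print(f"patron pays ${paid/100:.2f}, creator receives ${net/100:.2f}")
# → patron pays $6.00, creator receives $4.05
```

Working in integer cents avoids the floating-point rounding surprises that plague money arithmetic; the result matches the $4.05 figure above.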

The original value proposition of Patreon was that 1) most pledges were donations, so no sales tax was due, and 2) multiple small donations to multiple webcomics, podcasts, and writers would be lumped into a single credit-card transaction per month, amortizing the payment-processing fees and avoiding the hassle of tracking many small payments in your ledgers. This was very widely discussed among people in the business of creating online. They changed their model for reasons, and it’s now most attractive for creators who charge at least $5/month and offer something of value in return.

More generally, this is just how making small purchases over electronic payment rails works. It’s the same as buying an energy drink at one of the retailers most of us have had to work at. If you walk into the village store and buy a Red Bull for £3 with a credit card, a surprising amount of that is remitted as sales tax or deducted by the credit-card company.

Sean March 15, 2026 12:14 PM

“But that paltry amount then goes into your service ‘account’ which then attracts further ‘withdrawal’ fees added to get the money out of the service account and into your bank account.” At present those are pretty minimal (e.g. PayPal will still send money into your bank account for free if you know which buttons to click), and note that all monthly pledges become one transfer to your bank account. The two that hurt are charging sales tax on donations that are not exchanges for goods or services, and deducting the full payment-processing fee on every small donation. Oh, and currency conversion is also ‘fun.’

This is elementary business, and numerate people in the trade will talk your ear off about it, like taxi drivers and restaurant owners will talk your ear off about the evils of credit cards. The bigger issues are that creative economies are winner-take-all, and that it’s easier to make money by posting often and encouraging your fans to develop parasocial relationships with you than by posting fewer things which require more effort and expertise.

Clive Robinson March 15, 2026 7:37 PM

@ Sean,

You avoided rather than answered the 2/3rds question…

By saying,

“In most jurisdictions sales tax etc. is added to a pledge. “

Actually that is not as true as you might think, and something those services rely on you not knowing so they can take and hold your money and trade it in “overnight lending” and the like.

There are three basic types of income into an entity that attract different rates of taxation,

1, Earned – from goods/services sales
2, Charitable – from giving to a “charity / church / foundation / trust etc”.
3, Gifting – limited sums that attract no taxes.

Those online “middle men” rarely let you receive income in the latter two categories, or claim back from authorities not in their jurisdiction (it’s a US IRS issue that the “financial middle man” also benefits from).

So in the UK a charity or exempt trust gets all of the payment ie no VAT or Sales/income tax.

Likewise direct payments of small amounts are treated like “boxes under the Xmas tree” or around the “birthday cake”.

Also small value goods shipped from abroad are/were not subject to import/export taxation or other tariffs.

All things those financial middle men directly exclude, not because of the US IRS as they tell you, but because it’s so lucrative for them.

Hence, buying our host two drinks is only subject to sales tax, plus “tip taxation” if you choose to give a gratuity to the “server” whom you pay.

The value you give is a “gift” and not subject to taxation that you or the recipient would have to pay. You could, however, give our host $10 and say “buy yourself a drink or two”, in which case it’s a “gift”, saving you both any taxation or potential “gratuity” tip liability.

This is well recognised under UK law; in fact, if you ask a UK barrister about their “court robes”, they will tell you about the “liripipe” –the flap over the shoulder– said to be a “gift or gratuity pocket”. Gifting is a tradition well enshrined in law (and it’s not just the legal profession that has a liripipe – my graduation robes have one as well; as I was told on having it “fitted” by the tailor, it is “the mark of a professional in service for the public good”).

Sean March 15, 2026 9:21 PM

Clive, I didn’t bother answering the hypothetical 2/3 question; I just showed our host’s audience where the money goes in a specific real-world case (1/3 to the government and the finance industry, the rest to the creator). Most of the time the loss is lower (e.g. my Patreon still has only a 5% fee deducted, and tips through a simple PayPal account never pay sales tax). The situations where the creator only gets 1/3 are things like “buying porn clips” or “buying a rug with a credit card in Iran while the US is sanctioning it” (fortunately I had cash).

In Canada, where I live, sticker prices are before sales tax (PST, GST, and/or HST). So if you buy something advertised for $2.00 to which those taxes apply, you pay, say, $2.24. That is what I meant by “sales tax is added to a pledge.”

Sean March 15, 2026 10:11 PM

Parenthetically, ko-fi is another service that only charges sales tax if the creator requests it. https://help.ko-fi.com/hc/en-us/articles/10792069957661-How-tax-works-on-Ko-fi#01H8PEJ62C264XWGQCQJCG4H8G The only service of this nature which I have caught improperly charging sales tax is Patreon (and Patreon encourages creators to offer something to donors, which pretty quickly blurs the line between “tipping the busker”, with no sales tax, and “buying the pin at the gig”, with sales tax).

It’s also unfortunately the case that more people give me money through Patreon than PayPal or ko-fi, so even though Patreon keeps the most, it’s still a very good deal. Bind not the mouth of the ox that treads out the corn, etcetera etcetera.

Sudha Academy March 20, 2026 2:28 PM

This post highlights a critical issue as AI talent shifts from academia to industry. Balancing innovation with open research is essential to ensure knowledge remains accessible and benefits society broadly.

