Princess Diana stumbling through a parkour park. Team USA taking gold at the Bong Olympics. Tank Man breakdancing in Tiananmen Square. Kurt Cobain playing pogs. Tupac Shakur seeking poutine in Costco. OpenAI’s Sora 2 artificial intelligence video generator debuted this month, and the internet’s mind-benders pounced upon it. Hilarious and harmless? Or a symbol of how we are kissing reality goodbye, entering an age where nobody can ever trust video again?
It’s the latest example of how AI is transforming the world. But the problem goes deeper than just creating energy-sucking brainrot; it’s becoming a major threat to democracy itself. Billions of people today experience the internet not through high-quality news and information portals but through algorithmically generated clickbait, misinformation and nonsense.
This phenomenon forms the “slop economy”: a second-tier internet where those who don’t pay for content are inundated with low-quality, ad-optimized sludge. Platforms such as TikTok, Facebook and YouTube are filled with content churned out at maximal volume and minimal cost by algorithms that scrape and remix bits of human-written material into a synthetic slurry. Bots are creating and spreading countless fake-author clickbait blogs, how-to guides, political memes and get-rich-quick videos.
Today, nearly 75% of new web content is at least partially generated by AI, but this deluge is not spread evenly across society. People who pay for high-quality news and data services enjoy credible journalism and fact-checked reports. But billions of users cannot afford paywalled content or just prefer to rely on free platforms. In the developing world, this divide is pronounced: As billions come online for the first time via cheap phones and patchy networks, the slop flood often becomes synonymous with the internet itself.
This matters for democracy in two key ways. First, democracy depends on an informed citizenry sharing a base of facts and a populace capable of making sense of the issues that affect them. The slop economy misleads voters, erodes trust in institutions and fuels polarization by amplifying sensational content. Beyond the much-discussed problem of foreign disinformation campaigns, this insidious slop epidemic reaches far more people on a daily basis.
Second, people can become susceptible to extremism simply through prolonged exposure to slop. When users are scrolling different algorithmic feeds, we lose consensus on basic truths as each side literally lives in its own informational universe. It’s a growing problem in the United States, with AI-generated news becoming so prolific (and so realistic) that consumers believe this “pink slime” news is more factual than reporting from real news sources.
Demagogues know this and are exploiting the world’s information have-nots. For example, AI-generated misinformation is already a pervasive threat to electoral integrity across Africa and Asia, with deepfakes in South Africa, India, Kenya and Namibia reaching tens of millions of first-time voters via cheap phones and apps.
Why did slop take over our digital world, and what can we do about it? To find answers, we surveyed 421 coders and developers in Silicon Valley who design the algorithms and platforms that mediate our information diet. We found a community of concerned tech insiders who are constrained from making positive change by market forces and corporate leaders.
Developers told us that the ideology of their bosses strongly shapes what they build. More than 80% said their CEO or founder’s personal beliefs influence product design.
Nor are CEOs the only ones who put business success ahead of ethics and social responsibility. More than half the developers we surveyed regretted the negative social impact of their products, and yet 74% would still build tools that restrict freedoms, such as surveillance platforms, even if doing so troubled them. Resistance is hard in tech’s corporate culture.
This reveals a troubling synergy: Business incentives align with a culture of compliance, resulting in algorithms that favor divisive or low-value content because it drives engagement. The slop economy exists because churning out low-quality content is cheap and profitable. Solutions to the slop problem must realign business incentives.
Firms could filter out slop by down-ranking clickbait farms, clearly labeling AI-generated content and removing demonstrably fake information. Search engines and social feeds shouldn’t treat a human-written investigative piece and a bot-written pseudo-news article as equals. There are already calls in the U.S. and Europe to enforce quality standards for the algorithms that decide what we see.
Imaginative solutions are possible. One idea is to create public nonprofit social networks. Just as you tune into public radio, you could tap into a public social AI-free news feed that competes with TikTok’s scroll but delivers real news and educational snippets instead of conspiracies. And given that 22% of Gen Z hates AI, the private sector’s billion-dollar idea might simply be a YouTube competitor that promises a total ban of AI slop, forever.
We can also defund slop producers by squeezing the ad-money pipeline that rewards content farms and spam sites. If ad networks refuse to bankroll websites with zero editorial standards, the flood of junk content would slow. It has worked for extremist disinformation: When platforms and payment processors cut off the money, the volume of toxic content drops.
Our research offers a ray of hope. Most developers say they want to build products that strengthen democracy rather than subvert it. Reversing the slop economy requires tech creators, consumers and regulators to work together to build a healthier digital public sphere. Durable democracy, from local communities to the global stage, depends on closing the gap between those who get facts and those who are fed nonsense. Let’s end digital slop before it eats democracy as we know it.
Jason Miklian is a research professor at the University of Oslo in Norway. Kristian Hoelscher is a research professor at Peace Research Institute Oslo in Norway.