AI Slop Is Eating the Internet. We Helped.
The headline is not a metaphor. AI slop is everywhere: subreddits that read like chatbot summaries, comment sections that answer questions with plausible-sounding wrong specifics, Stack Overflow threads where the top answer was obviously generated and has two hundred upvotes anyway. A piece documenting this landed 800 points on Hacker News last week. The discussion ran for hours. The consensus was grim: online communities that took years to build are being hollowed out.
Here is the part most hot takes skip: we built the infrastructure for this before AI existed.
The platforms did not need AI to create a slop problem. Reddit gave karma to posts that generated the most engagement in the shortest window. YouTube ranked videos by watch time and clicks. LinkedIn figured out carousels got reshared, and then every growth consultant made carousels until LinkedIn was all carousels. Google rewarded content that matched search queries, and articles got written specifically to match search queries, and we called it content marketing and charged four hundred dollars an hour for it.
None of that required AI. Humans were already doing it. AI just learned to do it faster, at zero marginal cost.
The barrier to slop production was never quality control. It was friction. Writing a bad Reddit comment took time. Faking a forum post took effort. Building a spam persona took at least one bad weekend. That friction was never enough to stop determined bad actors, but it kept the casual slop rate manageable. When posting takes effort, low-effort posting has a natural ceiling.
AI removed that ceiling. Not gradually. In about eighteen months.
One community moderator cited in the original write-up is now banning around 600 AI content accounts per month from a niche creative forum. Six hundred. Per month. From a niche community. The moderation cost now exceeds the creation cost by an order of magnitude. That inversion is the actual structural change. Slop is free. Policing it is not.
What Is Actually Being Destroyed
When people say AI slop is killing communities, they usually mean it is annoying. That is true, but it is the smaller problem.
The larger problem is the signal. The forum post that used to answer your question with hard-won personal experience is now indistinguishable, at first glance, from a plausible-sounding AI response that gets the specifics exactly wrong. You still get an answer. You just cannot trust it.
This is also what is happening in music. Forty-four percent of new tracks uploaded to Deezer are AI-generated. The listening numbers for those tracks? Essentially nothing. The content exists. Nobody is consuming it. That is slop in its cleanest form: content optimized to exist, not to be used.
Communities that were supposed to be immune to this built their immunity on friction. Art forums required demonstrating skill. Technical forums required demonstrated expertise. Active subreddits required account history. Those mechanisms were imperfect, but they set a floor. Without them, you get what we have now.
The Sorting Is Happening Faster Now
Here is what is actually true: the internet has always had a slop problem. Spun articles. Keyword-stuffed product pages. Copied forum posts. Astroturfed reviews. None of that is new. What is new is the velocity.
AI's tendency to optimize for measurable proxies rather than actual goals plays out at the community level too. The platforms built systems that rewarded engagement metrics. AI learned to optimize for engagement metrics. The result looks exactly like what the design would predict if you were paying attention.
The platforms that relied on passive friction to maintain quality are now discovering what zero friction actually means. Quality has to come from somewhere else. Credibility. Accountability. Skin in the game.
The communities surviving this look the same as they always have. Professional forums where being wrong has professional consequences. Hobby groups where members have known each other for years and recognize a fake. Subscription publications where the author name is the product. Anywhere the incentive runs opposite to slop: places where a bad take follows you.
That is not a new rule. It is the oldest one. Communities built on reputation survive. Communities built on volume do not. AI is just accelerating the sorting.
Here is the part that should bother you: a lot of the communities now being hollowed out were already fragile. The moderation was thin. The verification was absent. The economic incentive pointed entirely toward volume. AI did not create those vulnerabilities. It showed up with a scale model and said: you built this.
The internet is not dying. The parts of it that could only survive because posting was hard are dying. The parts that invested in actual community structure are going to be fine.
Start paying attention to which side of that line your favorite places are on.
The About.chat newsletter covers AI developments like this every week. Subscribe here.