The Enshittification of Content - and its Future
Please stop wearing that in public.
You’ve all noticed this: scrolling through X or LinkedIn (my two drugs of choice - I’m sure yours produce the same phenomenon) has gotten even more depressing than it used to be, for one very specific reason: it’s all AI slop.
Specifically, there are two similar but distinct brands of AI slop:
Pure AI Slop: this is content that’s created with
“Claude make me a banger social media post. Make no mistakes”-style prompts. The depressing bit here is not that people do it (it’s actually a useful way to instantly gauge the moral qualities and intellectual capabilities of the posters), it’s the fact that it works. These posts are actual bangers: people like them, comment “what amazing insights”, “great read”, “very well written” and so on. Which gives you a glimpse into the real world. Before, you could imagine your posts weren’t working while others’ did because of the content, or because you’re unlikable, or the algorithm, or whatever - but now you have unfiltered access to the truth: people like the slop. I guess that when you think about it, this is nothing new, as it’s happened in essentially all industries, and it’s probably even a bit late for it to happen to content, but still. [And fuck me, I’ve just done one of those “it’s not just X, it’s actually Y” above organically - AI slop has performed perfect inception on my brain.]
Good AI Slop: this is good content, from good people, who write a seed of an idea, insight, or topic and then say
“chat turn this idea into a good social media post”. This stuff is the worst for a few reasons:
It looks like all other AI slop: so it is super easy to dismiss.
Even if you realize it’s good stuff: it’s super tiring to read as you constantly notice it’s in some other droid’s voice.
You instinctively realize where it’s all going to go: to shit.
These past few days I’ve made an effort to call people out on this, and I’m sure it won’t help my already edgy way of making friends - but I only do it to people in the second category.
Aside from quite a bit of support, the most worrying comments I’ve got are from people who asked “how could you tell?”. I’ve discarded these comments as glitches in the matrix and I’m going to continue this post with that same mindset.
It’s just a phase
The obvious pushback here would be: you’re a goddamn boomer, just get on the train and use Claude to write your stuff, which sucks anyway.
Problem is, Claude still sucks at writing, or you’d be spared this, I promise.
We’re at that weird point where AI-created/edited content is better than average or low-quality human-written content, but worse than good writers’ content.
Will it last? Probably not. But it’s like showing up in the first machine-made sweater, just after the machine was invented, to a club full of people dressed in handmade clothes: you would have looked ridiculous.
I contend that you can still wear handmade sweaters while knowing that machine-made stuff is the obvious future, and in the meantime use machine-made stuff to clean your toilets until it’s good enough to actually wear.
The future of content
I don’t think I even own a single handmade clothing item, and that’s ok. I do realize that handmade stuff can be gorgeous, last forever, and so on - but the world has just moved on: it’s too expensive.
So we’ve made a very conscious decision: we’ve accepted lower-quality garments in order to have an almost infinite amount of them over our lifetimes and change them whenever we grow tired of them.
In the clothing market, you can make the case (and many companies did make the case and won it) that there is ~infinite demand for new clothes - therefore bringing down the price of each one of them is a reasonable thing as you can then sell so many more.
Can we do the same thing with content? I’m still not so sure.
There clearly is infinite demand for good content, but the supply had already saturated the market long before AI came around (there’s no way to “finish” X, Instagram, etc.).
Therefore, I’m not so sure that lowering the cost of producing average content has the same benefits as a new cheap pair of jeans.
I already have ~100-1000x more good content (books, podcasts, movies, articles, tweets, shorts, etc.) than I can realistically consume in my lifetime, so adding another supply of shitty content is really not gonna help here.
The only things that are going to matter in the content market are:
can you create way better content than other people thanks to AI?
can you create much more personalized (or personalizable) content?
On the first point, you need to keep in mind that everyone will have access to the same tools as you, therefore whatever technological arbitrage you’re betting on will be closed in a matter of days.
The second one is the more interesting one, but it’s also hard to grasp exactly what it means:
Will your agent write 1000 posts a day, and each one will reach only that one person who would find it valuable?
Will the machine-makers/owners churn out a gazillion posts a day and you’ll just have to choose between what the supermarket of ideas has in stock?
Will it be new platforms where AIs write all the content you consume?
Whatever happens, I think we all have a vested interest in making sure AI writing brings about a flourishing of unique ideas, and therefore the continued evolution of human thinking, rather than a descent into dystopian infinite slop factories.
Do your part for now: stop writing AI slop.
Leaving you with a helpful / funny / depressing video of a professional enshittificator.


