What Actually Survives the AI SaaS Reckoning
A follow-up on the AI SaaS shakeout and the moats that actually survive.
11 days ago I said 30-80% of AI SaaS companies would be dead by December.
14 reactions. 13 comments. Some strong opinions in the thread.
The response split into two camps: people who agreed with the destruction thesis but wanted to know what comes next, and people who thought I was being dramatic.
Both camps asked the same question: "What actually survives?"
I was heavy on the destruction in that post. Light on the construction. Here's the other side.
Proprietary Data Plays
Not "we fine-tuned GPT on some data." That's a weekend project now. I mean genuine proprietary data that took years to collect and that nobody else has access to.
Medical imaging datasets built over a decade of hospital partnerships. Industrial sensor patterns from thousands of machines across hundreds of factories. Legal precedent databases compiled by teams of lawyers who actually read the case law.
The key word is "years." If your data advantage can be replicated in months by a well-funded competitor with API access, it's not a moat. If it took years of real-world relationships, domain expertise, and physical-world data collection, you have something defensible.
The models are getting smarter every quarter. But they still can't generate data they've never seen. Original data from original sources remains valuable precisely because it's scarce.
Regulated Industries
Healthcare. Finance. Legal. Government.
Compliance isn't a feature you can clone in a hackathon. If your product exists because a team spent 18 months navigating FDA clearance, you have at least 18 months of breathing room before anyone catches up. If your financial product is SOC 2 certified with a dedicated compliance team maintaining it, a weekend wrapper built on Claude isn't a threat.
Regulation is often criticized as a barrier to innovation. In the context of AI disruption, it's also a barrier to commoditization. Companies embedded in regulatory frameworks have time that pure-software plays don't.
This isn't permanent protection. Eventually, AI-native companies will navigate regulation too. But "eventually" might be years, and years of lead time in a market moving this fast is worth a lot.
Infrastructure
The picks-and-shovels layer. Compute providers, model hosting platforms, evaluation tools, observability systems, vector databases, deployment pipelines.
Developers need infrastructure regardless of which model wins or which wrapper dies. When the gold rush shakes out, the people who built the shovels are still in business.
Infrastructure also benefits from switching costs. Once a company has built its entire ML pipeline on your platform, migration is expensive and risky. That's a real moat, even if it's not a glamorous one.
Network Effects
If your product gets better because more users are on it, that advantage compounds in a way AI capabilities alone can't replicate.
Hugging Face has millions of developers sharing models, datasets, and applications. Stack Overflow's value comes from decades of accumulated answers from real practitioners. GitHub's value comes from the social graph of developers and the massive repository of existing code.
These aren't defensible because of AI capabilities. They're defensible because of network density. A new entrant would need to convince millions of users to switch simultaneously. That's not a technical problem — it's a coordination problem, and those are much harder to solve.
Network effects built on user-generated content, community knowledge, or collaboration dynamics are the hardest moats to breach. The AI underneath might become a commodity. The network on top doesn't.
Deep Vertical Integration
Not "AI for X" as a marketing label. Actually embedded in the customer's daily workflow. Connected to their ERP. Trained on their edge cases. Integrated at a level where ripping it out would mean ripping out processes that touch every department.
The integration itself is the moat. It took a year of implementation, custom development, and domain-specific configuration. Nobody's replacing that with a ChatGPT wrapper, because the wrapper doesn't know that this specific client's invoicing system has a quirk where line items over a certain threshold need secondary approval routed through a different department.
Enterprise complexity is ugly. It's also deeply defensible.
What Doesn't Survive
• "Custom AI" that's a system prompt behind a paywall. The moment someone publishes that prompt (and they always do), the entire value proposition evaporates overnight.
• Any product whose entire value is calling an API and formatting the response with a nice UI. The API providers are building their own interfaces. The formatting layer is getting automated. There's no sustainable middle ground.
• Tools that compete purely on "our UI is nicer" when the underlying model providers keep improving their own interfaces. Claude, ChatGPT, and every other provider are rapidly improving their native experiences. Competing on presentation when you don't control the capability underneath is a losing position.
• "AI-powered" features bolted onto existing products as a marketing play. If the AI feature was added in a sprint to check a trend box, users can tell. When the hype cycle moves on, those features quietly disappear from the roadmap.
• Analytics dashboards that just display what the model already knows how to produce. If your product takes AI output and puts it in a chart, that's a formatting layer. The model will learn to make its own charts.
The Moat Test
I keep coming back to one question:
Can a motivated developer with Claude and a weekend replicate your core value?
If yes — you're a feature, not a company. Your exit window is now, and it's narrowing fast.
If no — what specifically is stopping them? That answer is your actual moat. And it should be something more substantive than "we had a head start" or "our brand is established."
Head starts evaporate when the tools move this fast. Brand matters only when it's backed by something tangible that others genuinely can't deliver.
Honest Reflection
I might be wrong about timing. The dot-com bust took longer than most people predicted. Some wrappers will limp along on existing enterprise contracts and slow-moving buyers who don't evaluate alternatives frequently. Inertia is real, and it's powerful in enterprise sales.
But the direction is clear. Foundation model capabilities are expanding every quarter. API costs are dropping. Every model update makes the wrapper layer thinner and less defensible.
I've been in tech long enough to know that "inevitable" doesn't mean "immediate." Some of these companies have 18 months. Some have 6. Some are already dead and just haven't noticed yet because the revenue is still on auto-renew.
The developers and founders who understand this aren't panicking. They're already building the next layer — the infrastructure, the deep integrations, the data plays, the things that genuinely can't be replicated with a good prompt and a weekend.
That's where the real opportunity is. Not in wrapping the model. In building something real underneath it.
What's your moat? Genuinely — I want to hear from founders who think their product survives this.
Amir Brooks
Software Engineer & Designer