For a new golden age of FOSS
Feb 23, 2026

The generative AI trend is arguably one of the biggest paradigm shifts the tech industry has seen in decades, even though the field has had its share of upheaval: the fading of OOP as the default mental model, the rise of distributed systems and big data, the quiet comeback of low-level languages (Rust, Zig), and even no-code frameworks promising to abstract the programmer away entirely.
This latest shift has split opinion sharply. On one side, a chorus of voices argues that developers are a dying breed, that the craft is being automated away into irrelevance. On the other, an equally loud camp warns that the "vibe-coding" trend is a plague slowly killing projects, demoralising key contributors, and ultimately creating more problems than it solves.
Both camps are, to some extent, missing the point.
I want to argue something different: that the AI productivity wave is a wonderful, and underappreciated, opportunity for Free Software and Open Source. Not despite the disruption, but because of it.
Software Was Already Eating the World
When Marc Andreessen coined the phrase "software is eating the world" in 2011, it felt like a provocation. Fourteen years later it reads like an understatement.
SaaS was the dominant economic model of the last two decades, and its winners compounded advantages ruthlessly. The clearest illustration is the cloud wars: AWS, GCP, and Azure didn't win by owning better hardware; they won because they could afford to build, staff, and iterate on software abstractions faster than any competitor. Their moats weren't physical. They were organisational and financial: the ability to sustain large engineering teams attacking large problems, continuously, for years. That is arguably exactly what European competitors lack right now, and OpenStack did not help much.
This concentration of power is not confined to the giants of the datacenter world. The same pattern plays out at every scale, in every corner of the data and infrastructure ecosystem. Storage, ETL pipelines, artifact management, data transformation: each of these markets has its own set of companies that captured it early, established switching costs, and gradually shifted their energy from innovation to retention.
What made these positions durable wasn't superior technology. It was the cost of execution. Building a credible alternative to these solutions meant years of engineering, a VC round or two (if the market was the "next big thing"; otherwise, forget it), a dedicated team, and a long road to feature parity before you could even begin the sales conversation. The idea was often not the hard part. Assembling the execution capacity to turn the idea into something real was.
That is rapidly changing.
The Slow Death of FOSS at Scale
To understand the opportunity ahead, it helps to understand what went wrong, or rather, what was always structurally fragile.
The open source ecosystem has been living with a quiet tension for years. The original promise was simple: share the code, share the burden, share the benefit. In practice, maintaining a widely used open source project at scale is expensive. It requires sustained engineering investment, infrastructure, community management, and increasingly, legal and security resources. The idealism doesn't pay the bills, and many great projects have died because of that (a recent example being Scrapoxy by Fabien Vauchelles).
The industry's response has been the open core model: release the core as open source, sell the enterprise features, and use the community as a distribution channel. It worked for a while, before the cloud. But it has been fraying, and the fraying is accelerating.
MinIO is the most striking recent example, and one of the fastest turnarounds in recent memory. The S3-compatible object storage project that became a cornerstone of countless self-hosted and cloud-native stacks quietly archived its open source repository to make way for AIStor, a proprietary fork repositioned around AI workloads. The community didn't get a slow pivot. It got a fait accompli. The failure mode here is the rug pull: in retrospect, the open source project can be read as a customer acquisition funnel. When the market shifted, the funnel got redirected.
HashiCorp's BSL relicensing of Terraform followed the same pattern two years earlier, triggering the OpenTofu fork: a reminder that the community can fight back, but only when it moves fast enough.
The dbt and Fivetran story is a different failure mode, consolidation by absorption, and it plays out in two acts. In the first, dbt Labs built genuine momentum as an open source data transformation tool, then acquired SDF Labs to push further into the SQL intelligence space. Fivetran then acquired dbt Labs, folding an independent ecosystem player into a commercial platform. What was an independent node in the ecosystem became an asset on a balance sheet.
The second act is subtler and more damaging. dbt Labs announced the transition from dbt Core (the Apache 2.0-licensed engine) to dbt Fusion, a rewritten engine released under the Elastic License v2 (ELv2). ELv2 is not open source by any definition the OSI would recognise: it prohibits offering the software as a hosted service, which is precisely the use case that made dbt Core valuable to the ecosystem. The open source project hasn't disappeared yet (a release even landed in February 2026), but it is clear that the bulk of the company's innovation and investment is going into dbt Fusion. It's a rug pull with extra steps: slower, deniable, but just as final.
Then there is the quieter, less dramatic failure mode that Sonatype's Nexus and JFrog's Artifactory represent: innovation stall. No rug pull, no hostile acquisition, just a gradual calcification. These artifact repository tools captured their markets early, established deep enterprise integrations, and then largely stopped innovating in any meaningful sense. Pricing crept up. The UI stagnated. Feature development slowed to a pace dictated by enterprise sales cycles rather than user needs. They didn't fail; they just became the kind of expensive, slightly resented infrastructure that teams budget for because replacing them feels too painful to contemplate, and the alternatives are either partial or tied to a cloud provider.
Each of these stories has a different surface cause. But underneath, they share the same root: sustaining FOSS innovation at scale, in a market with well-capitalised incumbents, was too costly relative to the prize. The economics just didn't work, and individual goodwill can only compensate for so long.
The Gap Between Idea and Execution
Here is where the argument turns.
There is a basic principle at work in any market: when the cost of producing something falls, more of it gets produced. This is the law of supply at work, and it is about to reshape the software landscape in ways the AI discourse has largely missed.
For most of software history, the gap between idea and execution was wide enough to be a meaningful filter. Having the right insight about what to build was table stakes. What separated successful projects from abandoned GitHub repositories was execution capacity: the engineering hours, the sustained attention, the infrastructure, the tooling, the documentation. The idea was cheap. Everything else was expensive.
Generative AI is compressing that gap at a rate that is easy to underestimate. Not uniformly (quality still matters enormously), but the cost of turning a well-scoped idea into working software is falling faster than at any point since the commoditisation of cloud compute.
The implications for FOSS are asymmetric, and this is the part that rarely gets said plainly: the cost reduction benefits free and open source projects disproportionately.
A proprietary vendor still needs to recoup its engineering investment through revenue. A VC-backed startup still needs to justify its burn multiple. But a FOSS project only needs to cross an activation energy threshold: enough working software to be useful, enough documentation to be approachable, enough momentum to attract contributors. That threshold has always been the hard part. It is now lower.
Think about what it used to take to build a credible challenger to Artifactory. Three years minimum. A funded team. A long crawl to feature parity across package formats. A sales motion to crack enterprise procurement. The idea — "a better, cheaper, open artifact registry" — was never the scarce resource. The execution capacity was.
Now consider what that same project would look like if a single maintainer with strong domain knowledge and AI assistance decided to tackle it and disrupt a space that hasn't moved in 15 years. You don't need to imagine it: look at ArtifactKeeper (45+ package formats, system and library repositories, distributed proxy support, SSO and security included, under an MIT licence). At the time of writing (22 February 2026), all of these features are in place, and the project only started on 15 January 2026, roughly a month ago 🤯. I'm not judging the quality (I haven't tested it yet), but the ambition is clear, and I salute Brandon Geraci's motivation.
This is not hypothetical. The signals are already there in projects like this, and a growing number of infrastructure tools being built by very small teams to very high levels of polish. These aren't flukes. They are early evidence of a structural shift in what a small, motivated team can produce.
FOSS as the Natural Beneficiary
The backlash against AI-generated pull requests is real and worth taking seriously. The reviewer fatigue, the low-signal noise, the erosion of the human craft at the heart of collaborative software development. These are genuine problems, not just personal anxieties.
But they are, at their core, governance problems, not productivity problems. The productivity gain, to me, is real. The question is who captures it, and under what terms.
This is where the proprietary model has a structural disadvantage it rarely acknowledges. A commercial vendor capturing the AI productivity gains does so to protect margins, accelerate roadmaps, and deepen competitive moats. The gains flow to shareholders and, partially, to customers through better products. But because the software itself remains locked, the moat only deepens.
A FOSS project capturing the same gains operates under entirely different incentives. The productivity goes into shipping more, faster, under a licence that ensures the code stays free. There is no margin to protect. There is no rug-pull option if the core is Apache-2.0 or AGPLv3 from day one. The community retains the right to fork, and under a copyleft licence like GPLv3 those freedoms carry forward to every derivative. The switching costs stay low by design.
This is why the licensing question matters more now than it did five years ago. Projects that embed permissive or copyleft licences from the start are structurally protected against the failure modes we saw with Minio, HashiCorp, and the dbt ecosystem. The rug pull requires the rug. If the licence doesn't allow it, the option doesn't exist.
The opportunity is particularly acute in markets like those of Nexus, Artifactory, and Fivetran: expensive, stale, critical infrastructure with high switching costs and low innovation velocity. These are markets whose moats were built on execution cost, the cost of the alternative being too high to justify. That moat is eroding.
A well-designed FOSS alternative in any of these spaces, built by a small team leveraging the current generation of AI tooling, with a clean licence and a genuine community, is a credible threat in a way it simply wasn't three years ago. The incumbents know this, which is partly why the pace of proprietary pivots and acquisitions is accelerating. The window for pre-emptive consolidation is closing.
For a New Golden Age of FOSS
The first golden age of open source happened because the internet eliminated the cost of distribution. Linux didn't win by outspending SCO or Sun. It won because the economics of sharing code shifted so dramatically that the proprietary model could no longer justify its own overhead for most use cases. The infrastructure of the modern internet (Apache, MySQL, OpenSSH, Linux itself) was built largely by contributors who couldn't have done it without that distribution revolution.
We are at an analogous inflection point, but on the production side. The cost of writing software is falling the way the cost of distributing software fell in the nineties. The implications are the same: the competitive advantage that large engineering organisations held through execution capacity is being democratised.
That doesn't mean expertise stops mattering. It doesn't mean quality is free. It doesn't mean every ill-conceived FOSS project is suddenly viable. The governance challenges around AI-assisted contributions are real and will require new norms: better review tooling, clearer contribution standards, more explicit signal-to-noise filtering.
But it does mean that the class of problems that were previously "too expensive to FOSS" is shrinking. The stale, expensive, proprietary dominant players in tech are facing a structural shift in the cost curve of their competition. For the first time in a while, the economics of building a genuinely free alternative are on the right side of viable, at least for the bootstrapping phase.
The AI wave is not the enemy of free software. Handled well — with good licences, healthy governance, and the willingness to build — it might be its best chance in a generation to actually take back control.