James Governor's Monkchips

Open Source Foundations Considered Helpful



In the era of the open source rug pull, the role of open source foundations is more important than ever. The “rug pull” here refers to companies that have used open source as a distribution mechanism, building a community and user base, before switching to a restrictive license that is no longer truly open source. “This is capitalism, yo. We’ve got shareholders to satisfy. It’s time to relicense that software, move to a Business Source license.”

Why does open source even matter? My colleague has covered this question in some depth here.

The answer is that instead of today’s embarrassment of riches – open source projects that developers may take up and use for whatever they choose, without reference or restriction – we’d have a world dominated by projects carrying varying, conflicting usage restrictions, rendering their licenses incompatible with one another and unusable by some.

Where open source used to be a sustainable commitment, today too often it feels like a short term tactic. Commercial open source isn’t what it used to be. Which means that open source foundations, which provide ongoing governance and intellectual property management for open source projects, are in an interesting position, in some cases becoming more adversarial than they historically have been with vendors.

The Linux Foundation (LF) and Cloud Native Computing Foundation (CNCF) are predicated on trust. Linux is of course one of the most important pieces of infrastructure in tech history, so it’s no surprise the LF plays a key industry role as its steward, but latterly container-based infrastructure has taken a similarly important role, thus emphasizing the importance of the CNCF. Meanwhile the Apache Software Foundation (ASF) has done a great job of fostering sustainable, commercial, open source for decades now, most notably in the data infrastructure space – think Hadoop, Spark, Kafka, Flink etc.

One premise behind the CNCF is that user organisations can within reason trust it to stand behind the projects it incubates and manages. While not an explicit commitment, adopters generally, and enterprises specifically, have seen the CNCF imprimatur as one that they can rely on. In the era of the open source rug pull this kind of promise becomes even more important.

In this post we’ll look at some projects to get a lie of the land for foundations in 2024 – Flux CD, OpenTofu (a fork of Terraform) and Valkey (a fork of Redis). One thing is clear – the LF and CNCF are taking a somewhat more interventionist approach than in the past. The CNCF was born in the era of the zero interest rate phenomenon (ZIRP), and it seems like we’re finding out how it, and other foundations, function in the new era, where money is harder to come by. So what protections are available to adopters?

Sid Sijbrandij, CEO of GitLab, has argued that open source companies should commit to an Open Charter as a mechanism to protect users from open source rug pulls.

Open source software isn’t useful if people can’t rely on the project remaining open source. Adopting Open Charter offers open source users predictability amidst the growing licensing switch trend.

With a CNCF project, though, the need for this kind of charter becomes less important, because the code is by design not single source, but has a diverse set of contributors. Which is to say that open source foundations can make rug pulls a lot less likely than adoption of open source technology built by a single company. Relying on benevolent dictators is generally pretty risky. And recently the benevolent dictators have seemed… less benevolent.

In theory then foundations play a critical role in trusted open source infrastructure. And yet they are far from universally appreciated.

Every once in a while someone writes a piece that complains that such and such foundation isn’t working the way they think it should. The piece gets picked up on Hacker News, goes viral, and everyone jumps in with their own complaints and opinions. A classic of the genre was a post from a few years back by Mikeal Rogers – Apache Software Considered Harmful (now deleted) – in which he complained about the ASF and its hesitance in adopting Git, at the time a new way of working with source code.

Over the years the Linux Foundation, the Apache Software Foundation and the Eclipse Foundation have all received their fair share of brickbats. Depending on who you speak to these organisations are either too commercial, or not commercial enough. Too consensus oriented, or too top down. Too opinionated, or not enough. Or both. Captured by major vendors, and not responsive enough to community needs. But here is the thing – commercial open source would almost certainly never have achieved critical mass and continued success without foundations in the mix. The ASF was founded in 1999, and underpinned the adoption of open source middleware in the enterprise.

The LF is a non-profit organisation, which is not to say it’s uncommercial. It runs profitable conferences, including Kubecon. It sells corporate sponsorships to vendors and enterprise organisations to fund its operations. It also runs certification programs.

The CNCF acts as a subsidiary of the LF. It hosts projects including Kubernetes, Open Telemetry, Prometheus, Backstage, Envoy, and ContainerD, and encourages end users to contribute to these projects. It also offers certification through the Certified Kubernetes Administrator (CKA) exam, a 2 hour online exam costing $395.

Open source foundations provide governance but they don’t provide direct revenue models for vendor contributions – which can be frustrating for commercial companies operating in their ecosystems.

On the “not being commercial enough” side of the ledger, earlier this year the CNCF said it would need to reassess Linkerd as a graduated project, when Buoyant, the commercial company behind it, announced it would now charge for stable releases.

There is a paradox here that mirrors that of open source. While open source is a phenomenal distribution and community model, it is not in itself a business model. While open source foundations are a fantastic way to encourage the growth of a community around a project, they do not in themselves drive revenue to the commercial companies building those projects.

The ASF, while also a non-profit, is somewhat less commercial in terms of its culture and operations. The Eclipse Foundation, now based in Brussels, is also a non-profit organisation and functions somewhat more like a traditional standards body, leaning into domains where standardisation is all important – such as automotive, supply chain security, and regulatory compliance. The organisation has perfected the art of standardisation through implementation rather than just specification.

A new entrant on the foundation scene is Commonhaus, founded with the idea it could offer fewer restrictions than other foundations – claiming “minimum viable governance”:

Adhering to a “community-first” model, we offer support that respects project autonomy, ensuring governance is effective without being restrictive.

It has attracted some widely used projects such as Hibernate and the up-and-coming OpenRewrite.

But now let’s move on to those case studies for how governance and IP management can be helpful.

Sustaining Flux CD

A few weeks before Kubecon Europe 2024 in Paris a company called Weaveworks unfortunately went out of business. Weaveworks was a leading Kubernetes contributor, and it was also the commercial company behind the widely adopted Flux CD project. At Kubecon North America 2023 it had won an award for making an outsize contribution to CNCF projects relative to its size.

Flux CD is a deployment tool, designed to work with Kubernetes, used in production at big banks, telcos, and a number of cloud companies. So what would happen to Flux when Weaveworks shut down?

Sustainability doesn’t happen without a lot of hard work. Weaveworks CEO Alexis Richardson immediately began hustling and making calls, talking to the CNCF and other companies using the software about what would happen next. The ecosystem rallied round. A number of companies stepped up to announce ongoing support for Flux including Aenix, Aviator, Microsoft Azure, Cisco, Edgecell, Fairwinds, Giant Swarm, Gimlet, GitLab, Nearform, Opsmx, Opsworks, OSO, Teracloud, and TNG. With names like Cisco, GitLab and Microsoft in the mix the future of the project looks reasonably solid. A UK services company called Controlplane hired one maintainer, Stefan Prodan, with a view to providing enterprise support for Flux.

The CNCF was a vehicle for Richardson’s hustle. It stepped quickly into the breach, and arguably for the first time really proved its longtime value proposition of project sustainability. When interest rates are zero, and VC money is cheap, it’s pretty easy for everybody to feel comfortable and confident that everything will get funded. In the current environment, where VCs are tightening their belts, everything is a lot harder. The CNCF in this case was able to show that Cloud Native software wasn’t a zero interest rate phenomenon. There’s still plenty of work to be done but the most important thing is that Flux got a soft landing.

Helpful yes, but Richardson would have definitely preferred the CNCF to be active in helping Weaveworks actually sell the product as part of its charter. This is what he had to say about the Linkerd situation in February, and the same would apply to Flux.

People and companies work on projects for economic and intrinsic motivations.

eg Economics of CNCF needs to be justified to vendors.

Intrinsic motivations include belief that the project will flourish and be sustainable, popular, cool.

End users often are a mix of both. They will need support over time (in 3-10 year horizon) and in short term want to mitigate new tech risk

CNCF needs to take this a lot more seriously or it will be the home of poorly staffed second tier projects.

TOC+GB+EUC needs to engage at the level of product, marketing and long term business support, or find people who can.

The Relicensing Era

Of course there are going to be some issues around open source that are contentious in this post zero interest rate era. There is a lot more friction, in and around open source, as companies have been pressured by investors, shareholders, and so on to deliver greater returns. An increasing number of companies have therefore relicensed in favour of restricted licenses that are no longer open source, notably in the data management space. Company executives and investors have felt that returns needed to increase, and that open source was preventing this increase. They have also been driven by the idea that cloud companies – notably AWS – are taking advantage of the open source contributions of these companies to build businesses without needing to invest accordingly.

RedMonk has a clear position here – we don’t believe relicensing is necessary in order to build successful businesses. My colleague Rachel Stephens recently examined Software Licensing Changes and Their Impact on Financial Outcomes, by considering the cases of MongoDB, Elastic, Confluent and HashiCorp. We don’t see a clear relationship between license changes and revenue growth or share price increase, or, to be fair, decrease for that matter.

Any company is within its rights to relicense its software, but it can certainly be problematic from a community and project health perspective. Which is exactly why open source foundations are more important than ever.

Redis and The Post-Valkey World

On March 20th 2024 Redis announced it was moving to a source available model:

Future Redis releases will continue to offer free and permissive use of the source code under dual RSALv2 and SSPLv1 licenses; these releases will combine advanced data types and processing engines previously only available in Redis Stack.

Beginning today, all future versions of Redis will be released with source-available licenses. Starting with Redis 7.4, Redis will be dual-licensed under the Redis Source Available License (RSALv2) and Server Side Public License (SSPLv1). Consequently, Redis will no longer be distributed under the three-clause Berkeley Software Distribution (BSD).

What was surprising in this case was the rapidity of the community response – by March 28th the Linux Foundation had announced Valkey, an “open source alternative” to Redis, which would allow for continued use and distribution under the Berkeley Software Distribution (BSD) 3-clause license. The announcement came with support from Amazon Web Services (AWS), Google Cloud, Oracle, Ericsson, and Snap Inc, which all committed to making contributions to the new project.

Getting AWS, Google and Oracle lawyers to commit to a fork of Redis in under a week? That is… impressive.

One could argue that AWS would of course commit to a fork – after all, one of the main rationales behind Redis Labs’ decision to relicense was the complaint that AWS was gaining too much advantage from the code it was writing. But it’s important to also note the community aspects here. Madelyn Olson, an AWS employee and long-time Redis committer, led the community charge.

“I worked on open source Redis for six years, including four years as one of the core team members that drove Redis open source until 7.2. I care deeply about open source software, and want to keep contributing. By forming Valkey, contributors can pick up where we left off and continue to contribute to a vibrant open source community,” said Madelyn Olson, former Redis maintainer, co-creator of Valkey, and a principal engineer at AWS.

It was notable that Salvatore Sanfilippo, the original creator of the Redis project, said on Hacker News that freedom to fork was why he’d chosen to use a permissive license for the project.

When I chose BSD for Redis, I did it exactly for these reasons. Before Redis, I mostly used the GPL license. Then my beliefs about licensing changed, so I picked the BSD, since it’s an “open field” license, everything can happen. One of the things I absolutely wanted, when I started Redis, was: to avoid that I needed some piece of paper from every contributor to give me the copyright and, at the same time, the ability, if needed, to take my fork for my products, create a commercial Redis PRO, or alike. At the same time the BSD allows for many branches to compete, with different licensing and development ideas.

When authors pick a license, it’s a serious act. It’s not a joke like hey I pick BSD but mind you, I don’t really want you to follow the terms! Make sure to don’t fork or change license. LOL. A couple of years ago somebody forked Redis and then sold it during some kind of acquisition. The license makes it possible, and nobody complained. Now Redis Inc. changes license, and other parties fork the code to develop it in a different context. Both things are OK with the license, so both things can be done.

The concerns of a business and a project founder may diverge over time. Redis is perfectly within its rights to move forward and close the license. But others are within their rights to fork it.

So that’s a chunk of the community, big vendors, and the LF all in alignment – which adds up to a significant challenge for Redis. The vendor is putting its best foot forward by shipping new functionality, in critical areas such as graph database and vectors for AI, and most enterprise customers are fairly sanguine about the relicensing. But for community members and indie developers there is a feeling that a social compact has been broken. Valkey therefore feels like a landmark for the Linux Foundation but also the industry as a whole. Relicensing doesn’t look nearly as attractive or indeed, worry free, as it did just a couple of months ago. Vendors are going to have to give such moves a bit more thought – it’s no longer clear that relicensing concerns will simply blow over. My colleague Stephen calls this the Post-Valkey World.

Valkey represents – whatever the project’s ultimate fate might be – the first real, major pushback from a market standpoint against the prevailing relicensing trend.

To be clear, no one should get carried away and expect Valkeys to begin popping up everywhere. It’s important to note that there are many variables that impact the friction involved in forking a project and the viability of sustaining it long term. Some projects are easier to fork than others, unquestionably, and Redis – if only because it was a project with many external contributors – was lower friction than some.

Not every project that is re-licensed can or will be forked. But investors, boards and leadership that are pushing for re-licensing as a core strategy will, moving forward, have to seriously consider the possibility of a fork as a potentially more meaningful cost. Where would-be re-licensors previously expected no major project consequences from their actions, the prospect of a Valkey-like response is a new consideration.

So what about Terraform?


Terraform, the widely adopted automation engine from HashiCorp, was also relicensed in 2023 and a number of vendors, including systems integrators and cloud companies, were unhappy about the situation. The OpenTofu fork didn’t get the kind of immediate traction that Valkey did though. Contributors are mostly smaller companies – including Gruntwork, Spacelift, Harness, Env0, Scalr. Without a hyperscaler on board it’s harder to see the fork achieving true breakout status, although it continues to make progress, for example recently launching a registry of providers, modules and resources.

HashiCorp is still in the process of being acquired by IBM, so we’ll see what happens going forwards. A rapprochement between HashiCorp and OpenTofu may yet be possible, although it seems unlikely.

But again, from the perspective of companies being able to be fairly confident that the bets they’re making are sustainable, a foundation’s support is a very good hedge. Companies relicensing usually claim they need to do so for sustainability reasons, so that they can keep investing in a product – providing all the good stuff like bug fixes, security patches, documentation and new features. But open source offers its own sustainability value, which is amplified with a well funded foundation in the mix. Foundations aren’t perfect – some of the complaints about them of course contain some truth. But it’s hard to see a model where they could effectively act as a reseller of specific projects and companies. That’s not the value they provide. Open source is a model proven over decades and foundations have been a big part of it.

A final point worth considering in terms of the value of foundations is project diversity. The CNCF has done a frankly stellar job of improving and sustaining diversity in the Kubernetes ecosystem – for example with dedicated programs to bring in new committers and contributors from a diverse set of backgrounds. The ecosystem is constantly being refreshed with new faces, addressing one of the biggest challenges facing any open source project.

A word of caution about sustainability – while the CNCF has some huge backers with very deep pockets – Google, AWS, Microsoft and so on, it also needs a healthy ecosystem of smaller companies around it. It can’t just be a playground for trillion dollar companies. Many of the companies with those expensive booths at Kubecon are currently struggling to raise new funding rounds in a tough economic climate. VCs only want to invest in AI right now, and new funding rounds for cloud native infrastructure are a lot harder to come by. Vendors need to be able to illustrate better fundamentals.

But on balance yes – Open source foundations considered helpful.

Disclosure: Amazon Web Services, IBM, Elastic, Hashicorp, GitLab, Google, Microsoft, MongoDB, Oracle, Redis Labs are clients. While Weaveworks was not a client at the time of its shuttering, Richardson is a good friend of mine, which is worth disclosing.

4 comments

  1. Great foundational article. I posted the following on Linkedin yesterday and I believe it relates.

    MHO on “Fair Use Licensing” (See: https://lnkd.in/gfC3vabZ)
    1.) Open source has consistently accelerated innovation in the computing industry, ensuring foundational technologies can be shared by others who then expand upon them
    2.) The creators of the foundational technology almost always gain the largest share of revenue from monetizing that foundation “IF” they build good marketing, sales, services teams, and maintain a good partner ecosystem.
    3.) Without open source viral adoption, these same companies would almost never have achieved the same success
    4.) Their internal debate to later change their licensing model, is almost always driven by a desire to make even more money, e.g., Company A hits $100M ARR, and has now stalled, Company A’s executives, investors, and sales team, sees others making money from their project and desire to stop them, assuming those same clients will come to them.
    5.) However, a.) there is no assurance that those same customers would come to Company A, b.) there may be a reason Company A lost out to those competitors, and that should be investigated first (e.g., they provide better service, they have better sales strategies, they have better add-on products).
    6.) However, there is another reason open source and open core companies may look to change their model, 1.) if Global System Integrators begin providing free support for their products, given they use these technologies in major projects so their ROI to support them is easy, and 2.) if other major corporations bundle these open source projects into their platforms, cutting off revenue to the originators of the open source, but not necessarily increasing their own sales, (I can mention lots of examples here that will make people mad).
    7.) Therefore, there is a need to understand open core or FCL / FSL models.

  2. Many of these organisations are US-based and have no trusted governance mechanism for an international audience. Just for example, US regulations are perfectly fine with elected association board members getting voted out by other board members. That would be unthinkable in Europe.

    The US bias often shows in neglect over other languages than English, tokenish representation of other communities at best, and difficulties of subsidiary governance. The US Culture often amounts to a kind of corruption in the open, where positions are traded.

    Personally I trust Belgium associations more to strike the right balance required for international affairs of open source. I just don’t want to get governed by an alien culture and get shoved down their poisonous “code of conduct” policies mandated by their corporate US sponsors that would even disqualify St. Peter in the Vaticans and their anglosaxon imperialism against other working languages.

    There is a good reason why the Document Foundation was created in Europe and the community did not – as desired by IBM – join the ASF OpenOffice effort. I would not trust the Apache Foundation and now they are even polluted by the woke discourse that finds their own name offensive. I think it is quite telling that the ASF has no branch in Europe or Asia!

  3. Thanks for the article, I do think open source foundations do a lot of great work on sustaining open source projects, with some of those foundations running on shoe string budgets like the ASF. There’s also a lot of work that PROFESSIONAL open source foundations do behind the scenes, like deal with patent trolls which have been an issue for open source projects and the downstream consumers of them: https://www.linuxfoundation.org/press/linux-foundation-and-cncf-expand-partnership-with-unified-patents-to-defend-open-source-software-from-non-practicing-entities
