tecosystems

The Cyber Resilience Act: A Five Alarm Fire


On October 21, 2016, CNN’s website was knocked offline. So were the BBC’s and the Guardian’s. Amazon, Etsy and Shopify too, along with Quora, Reddit, and Twitter – among others. Huge swaths of the internet were taken down by a series of attacks on the DNS provider Dyn. These Distributed Denial of Service (DDoS) attacks were conducted by legions of bots, which is to say tens of thousands of internet-connected devices that had been compromised by malware called Mirai. It wasn’t an army of servers, or at least not just servers: it included cameras, printers, routers and even baby monitors. All of these devices from the Internet of Things (IoT) had been harnessed and retasked by Mirai, in this case in service of making a substantial portion of the internet unavailable.

The early versions of Mirai principally relied on unchanged default settings; later variations more directly attacked known vulnerabilities in the software running these internet-enabled devices.

This attack was possible for two basic reasons.

  1. First, and most obviously, software is hard to secure and protect. All software is vulnerable. Even the best and most dedicated software authors in the world are not able to produce perfectly secure software.
  2. Making matters more complicated was (and is) the fact that, as experts like Bruce Schneier have been saying for years, there was little if any economic incentive for many providers of these IoT devices to protect them. It is hard to write secure software and keep it up to date and patched, and activities that are hard to do are by definition expensive. Nor were there any real penalties for not investing in producing secure artifacts, which is why most IoT vendors slapped together some cheap hardware and software, shipped it and called it good.

Even for those vendors that might feel obligated, either ethically or financially, to try to maintain their devices over time, there was another critical problem: they didn’t write some or even most of the software on the devices they shipped – open source developers did. This meant that in many cases the device manufacturers did not understand in any meaningful way the software they relied on. It also meant that vendors often had no practical mechanism to get a given piece of open source software patched short of begging – or worse, badgering – the developers in question, developers who were already burning out in huge numbers under unreasonable demands from users to fix critical issues for free.

The industry reality, therefore, had (and has) a lot in common with a house of cards and was perfectly captured by Randall Munroe here. The European Union, to its credit, looked at this and said “maybe we shouldn’t be building on a house of cards.” Thus was the mandate for the Cyber Resilience Act (CRA) born.

The CRA’s initial remit was to try and tackle the IoT problem. But then came Log4Shell. Present from 2013 on, the critical vulnerability in Log4j wasn’t discovered and disclosed until November of 2021. Then, thanks to the ubiquity of Log4j, all hell broke loose. Jen Easterly, the director of the United States Cybersecurity and Infrastructure Security Agency (CISA), called the exploit “one of the most serious I’ve seen in my entire career, if not the most serious.”

In the wake of the catastrophic incident, the EU decided to reconsider and revise the CRA’s mandate. No longer would it be strictly focused on software powering IoT devices. Instead it would encompass software, full stop.

On paper, the CRA makes sense. The world of software, after all, cannot be built indefinitely on a foundation that includes Munroe’s “project thanklessly maintained by a single developer from Nebraska since 2003” as a load bearing component. Change was and is necessary, as are new incentives – and penalties.

The question, therefore, isn’t whether something like the CRA is necessary and inevitable. The question is whether the CRA as written today is the appropriate tool for the job. And after multiple briefings on the subject, it seems safe to say that the jury is still very much out.

The CRA introduces a broad set of mandates for parties involved in the production of software, with the specific responsibilities varying depending on role. Among the many goals and requirements of the CRA are:

  • Shipping software free from known exploitable vulnerabilities
  • Shipping in a secure by default state
  • Ensuring that vulnerabilities can be patched easily
  • Making security updates available “for a minimum of 10 years or the remainder of the support period, whichever is longer”
  • Notification of actively exploited vulnerabilities within 24 hours and of general vulnerabilities within 72 hours, both of which should include plans for mitigation and workarounds
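To make the two reporting windows above concrete, here is a minimal sketch of how they translate into filing deadlines. The function and category names are illustrative, not drawn from the regulation’s text:

```python
from datetime import datetime, timedelta

# Illustrative mapping of the CRA's two notification windows as
# described above: 24 hours for actively exploited vulnerabilities,
# 72 hours for general ones. Category names are assumptions.
NOTIFICATION_WINDOWS = {
    "actively_exploited": timedelta(hours=24),
    "general": timedelta(hours=72),
}

def notification_deadline(detected_at: datetime, category: str) -> datetime:
    """Return the latest time a notification may be filed."""
    return detected_at + NOTIFICATION_WINDOWS[category]

detected = datetime(2025, 6, 1, 9, 0)
print(notification_deadline(detected, "actively_exploited"))  # 2025-06-02 09:00:00
print(notification_deadline(detected, "general"))             # 2025-06-04 09:00:00
```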

The good news is that the CRA in 2025 is much improved from its initial incarnations in 2022. Those were characterized by one participant in the discussions as a “near-death experience for open source.” Indeed, without some of the current carveouts for open source developers, one plausible and even likely outcome would have been geo-fracturing of open source, specifically via the introduction of licenses that prohibited the usage of projects within the EU’s borders. That horrific outcome is potentially still on the table, but we’ll come back to that later.

For now it’s enough to know that open source foundations like Apache, Eclipse, Linux, Mozilla and many others all lobbied hard on behalf of open source and its developers to try and protect them from some of the act’s more far-reaching requirements and penalties – it’s almost, as an aside, as if open source foundations could be considered helpful. Their collective efforts have tempered some of the CRA’s more problematic provisions from an open source perspective, but questions remain, and a host of implementation details and specifications have yet to be finalized, so evaluating the act is challenging.

As an example, many of these details are currently being worked out in the Open Regulatory Compliance Working Group (ORC WG), which has a very useful FAQ here. It defines the new term “open source steward,” codified for the first time in this document, and its responsibilities. When it comes to how a steward can demonstrate that it has met its reporting and attestation obligations, or what happens if it does not, the FAQ’s answer is “No answer yet.” Likewise, it’s clear that if you’re the creator of an open source project but do not monetize it, you are under no obligations under the CRA. If you do monetize it, though, the act’s implications are less clear – specifically, at what thresholds obligations are triggered, as the current wording includes legally vague conditions like “the intention of making a profit.” Mere contributors to open source, at least, are specifically exempt from CRA requirements.

If the implications for open source are less dire than in the initial draft, there is one thing that is inarguable: the CRA is a veritable five alarm fire for manufacturers.

At present, and as described above, manufacturers rely heavily on a wide variety of open source projects to produce devices of all shapes and sizes. From databases to operating systems to runtimes, open source is the foundation on which everything from baby monitors to cars to lunar rovers rests. Effectively zero manufacturers have commercial relationships with the producer of every project, framework or library they incorporate, which means they are likely to struggle with some of the reporting and attestation requirements. And the penalties if they fall short are quite severe.

The penalty for non-compliance, for example, is a fine of up to 15,000,000 EUR or up to 2.5% of the manufacturer’s total worldwide annual turnover for the preceding financial year, whichever is higher. The penalty for providing incomplete or inaccurate information, meanwhile, is a fine of up to 5,000,000 EUR or up to 1% of that turnover, whichever is higher.
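The “whichever is higher” structure means the percentage cap, not the flat figure, is what bites for large companies. A quick sketch of the arithmetic, using the two penalty tiers quoted above (the function name and category labels are invented for illustration):

```python
def cra_fine_cap(turnover_eur: float, violation: str) -> float:
    """Maximum fine: the higher of a flat amount or a share of
    worldwide annual turnover, per the figures quoted above."""
    caps = {
        "non_compliance": (15_000_000, 0.025),   # 15M EUR or 2.5% of turnover
        "inaccurate_info": (5_000_000, 0.01),    # 5M EUR or 1% of turnover
    }
    flat, pct = caps[violation]
    return max(flat, turnover_eur * pct)

# For a manufacturer with 1B EUR annual turnover, the percentage dominates:
print(cra_fine_cap(1_000_000_000, "non_compliance"))   # 25000000.0
print(cra_fine_cap(1_000_000_000, "inaccurate_info"))  # 10000000.0
```

For smaller firms the flat cap dominates instead: at 100M EUR turnover, 2.5% is only 2.5M EUR, so the non-compliance ceiling stays at 15M EUR.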

Based on the financial downsides here, optimistic observers are concluding that the CRA could be a game changer in open source economics. As companies digest the potential penalties involved, they will be obligated to establish commercial relationships with open source projects they currently rely on at no cost. That means more money going from vendors relying on open source to those producing it, which would be a boon for developers. It also raises the question of what happens to product prices when manufacturers are compelled to pay for software they have to date consumed at no cost, but that’s outside the scope of this exercise.

Pessimists evaluating the potential impacts of the CRA on open source software, on the other hand, see a world in which greater commercial interest and monetization is more than offset by a sea of manufacturers flooding maintainers of popular projects with requests – or more likely, given the penalties involved, demands – for project-related services to meet the CRA-mandated reporting obligations. The act could also mean less open source software overall, as manufacturers bring some software back in house. Or it could produce the aforementioned geo-fracturing, as open source developers who want nothing to do with the CRA either abandon their projects – which the aforementioned ORC WG FAQ felt compelled to discourage – or attempt to prohibit usage of their software within the EU’s jurisdiction.

The CRA’s requirements notably do not take effect until 2027, which is perhaps why it has received so little attention to date. But given the scale and scope of the effort required to comply – which dwarfs the effort made for GDPR and is perhaps more comparable to Y2K remediation – any manufacturer not already planning for the CRA is behind and likely to be facing a mad scramble in the years ahead.