When it was first released in August 2006, Amazon’s Elastic Compute Cloud – shortened to EC2 – was the definition of a minimum viable product, which naturally led many of the vendors that EC2 would compete with and eventually dominate to dismiss it as little more than a toy. It wasn’t just the lack of features and polish that made it seem unthreatening, however. Its most important feature at the time – immediate, elastic availability – was not a priority for buyers.
It was a priority for developers, quite obviously – the constituency that represented the critical mass that propelled AWS to where it is today. Enterprises, governments and other buyers at the time, however, not only didn’t prioritize the speed that EC2 enabled, they actively and aggressively discouraged it. For most buyers then – and for large numbers today – the priority for technical infrastructure was reliability and stability, traits that were viewed at the time as antithetical to velocity. With IT typically viewed less as a strategic differentiator and more as a cost center, senior technology leaders were unlikely to get a raise or recognition for pushing the boundaries of what was possible with technology. They were highly likely to be fired, however, in the event of unexpected or prolonged downtime.
This was the world perfectly encapsulated and described by Nicholas Carr’s 2003 Harvard Business Review article “IT Doesn’t Matter” and his 2004 book “Does IT Matter?” After decades of escalating technology investments by institutions both private and public, Carr asked the basic question of whether the returns justified the investments. With exceptions, his answer was that they did not. As he put it in the original HBR piece:
IT management should, frankly, become boring. The key to success, for the vast majority of companies, is no longer to seek advantage aggressively but to manage costs and risks meticulously. If, like many executives, you’ve begun to take a more defensive posture toward IT in the last two years, spending more frugally and thinking more pragmatically, you’re already on the right course. The challenge will be to maintain that discipline when the business cycle strengthens and the chorus of hype about IT’s strategic value rises anew.
Carr’s research symbolized the post-dot-com backlash against technology providers, one which led buyers to view infrastructure as they might a utility. It also, importantly, fueled large-scale offshoring and outsourcing. If technology didn’t matter, after all, why pay a premium to have software and applications developed locally, when the cost of programming talent in geographies such as India or the Philippines was a fraction of what it was in the United States or Europe?
In some respects, this remains true. Enterprises are increasingly content to outsource large swaths of their infrastructure that would have been self-managed in the past. Just as it has become obvious that owning and managing email infrastructure, as one example, is unlikely to deliver differentiated value to an organization, the same conclusion can be – and has been – reached by millions of customers of the various cloud computing vendors. Every customer running on AWS, Azure or GCP is, to some extent, saying that IT does not matter, at least not enough for the customer to own that infrastructure themselves.
But the real answer to the question depends on how IT is defined. If a narrow definition is used and IT is taken to mean nothing more than base infrastructure, then Carr’s viewpoint remains correct. If, however, the definition of IT encompasses the entirety of an organization’s technology portfolio and strategy, the assertion that IT doesn’t matter could not be less accurate today.
Whatever one thinks about the idea of Digital Transformation – whether it’s perceived to be a legitimate description of a real phenomenon or a term hopelessly corrupted by marketing departments – the fact is that virtually every business on the planet is in fact attempting to transform itself, digitally. Alarmed by the prospect of “software eating the world” and desperate to not be the next Blockbuster or Kodak, businesses are attempting to dramatically expand their in-house technology capabilities, which in many cases means bringing them back in from abroad.
While vendors tend to talk about Digital Transformation using industry terms such as Agile, CI/CD, DevOps or similar, at heart what most of them are focused on is velocity – the same velocity that was once anathema to conservative, stability-minded organizations. For companies that grew up on the web, digital natives like Amazon, Facebook or Google, the basis of competition in the digital realm has always been speed. How quickly can you spin up new products or services? How fast can those be iterated and improved upon? Can they be rolled out efficiently? When you’re dealing with literally unprecedented questions of scale, a conservative, measured approach to software engineering simply isn’t an option.
When these digital natives were content to stay in their lanes, all was well. But as soon as they – or the companies they inspired, or the ones their ex-employees founded – began to aggressively enter traditional industries with an eye towards disrupting them via technology, the game changed. Gone were the days when banks and insurance companies could move at the measured pace they were accustomed to. Speed kills, as the saying goes.
In their efforts to move quickly enough, then, organizations are taking a two-pronged approach. First, they’re determining which areas of their technology infrastructure are mere commodities and which can offer them a competitive advantage. And second, once that determination has been made, they’re putting a premium on every area above the commodity layer – any investment that will allow their organization to move at a faster pace.
The implication is that while not all IT may matter, some of it may spell the difference between organizational life and death. In that sense, it’s safe to say we’re living in a post-“Does IT Matter?” world.
Betteridge’s Law, for once, is wrong.