According to AT&T, 15 of its 38 available smartphones are LTE-capable. None of those are iPhones. For Verizon Wireless, the number is 23 of 46. Again, no iPhones. While it may seem unusual that Apple’s flagship – and indeed only – smartphone cannot technically access the fastest available wireless network, it is not. Few remember it today, but the original iPhone was no different: when it launched in June 2007, the device was limited to GPRS/EDGE – so-called 2G networks – rather than the 3G already in place in many major markets at the time.
LTE will almost certainly arrive in the next version of the iPhone, just as 3G arrived with the iPhone 3G a year after the original. The belated arrival of next-generation network capabilities, and its lack of a demonstrable impact on product sales, speaks to the importance – or lack thereof – of technology in Apple’s design process and in its customers’ buying decisions. The sustained success of the iPhone suggests that Apple’s customers care more about the overall product than about whether the device incorporates the newest technology. Technology, in other words, is but one piece of the equation, rather than the equation itself.
Which seems like an obvious conclusion. And yet it’s one that the technology industry seems unable to grasp in any fundamental way. This would be understandable if the evidence were limited to Apple. Few would argue that the industry’s largest entity is, in nearly every respect, unique. But the history of the industry is littered with projects that have been dismissed at one time or another because their technology was deemed outdated, inferior or otherwise non-competitive. Linux was dismissed by industry analysts and observers for years; it is now the de facto operating system of the cloud, as well as the foundation for the volume leader in mobile shipments. MySQL was routinely pilloried by DBAs and competing vendors alike; it is today, in all likelihood, the most widely used relational database on the planet. PHP has been the object of scorn from high-end technologists for years; all these years later, even its critics admit it’s everywhere. Firefox is too simple. Eclipse isn’t competitive with professional IDEs. Ubuntu is just for desktops. Android is a poor re-implementation of iOS. VMware is just a toy. Same with AWS [coverage]. And so on.
The point isn’t to debate the individual merits of these claims; some have substance, others do not. The point is rather to highlight the relative irrelevance of the debate itself. Regardless of technical experts’ opinions of the individual projects, all have demonstrated volume success and adoption. The fact that open source was widely perceived, at least until recently, to be fundamentally unable to innovate has hardly seemed to throttle usage.
Because technology is, appropriately, driven by engineers, it is natural that the industry’s perspectives are heavily informed by engineering values. The challenge is that, as Apple has proven, technology is only one part of a product, and often far from the most important part. What amounts to a cognitive bias is seemingly endemic to this industry.
More specifically, this bias leads us to:
- Believe that we can predict outcomes based on technical merits
- Undervalue factors such as convenience or packaging [coverage]
- Overvalue elegance or degree of difficulty of technical implementation
- Undervalue emerging markets
- Disparage solutions that are deemed less technically sophisticated
If it’s true – and it seems self-evident here that it is – that software is eating the world, technology will only become more important moving forward. In that context, quality of implementation will naturally play a role in sifting the winners from the losers. But as we move forward, it is worth remembering that the best technology not only doesn’t win all the time, it may not even win most of the time. For better and for worse.