“The exciting convergence of technology consumption and production models” is the topic of the next book Vinnie Mirchandani is working on (his last book had a good, optimistic take on the need for innovation). He recently asked about the role of developers in all that soup – at least I think that’s what he asked. I typed up the below in response, explaining something I’ve noticed of late: this is a really good time to be a software developer. But why?
One of my developer friends, John Joseph Bachir, said something refreshingly optimistic at SXSW this year: “this is a great time to be a developer.” And he’s right: there’s a mixture of (a.) technology choice and diversity, (b.) a focus on simplifying technology to deliver more frequent functionality, and (c.) a huge extension of what a “computer” is and, therefore, of what you can program.
Years ago, I was suspicious of the “software is in everything” line from big vendors like IBM – especially when that story fit nicely with Rational buying Telelogic. But now the broader idea of “the Internet of things” is real enough to play around with and, for some, to do something meaningful with. There are three high-level things driving this, and no doubt more:
(1.) Networking is everywhere and it finally works well enough
Whether it’s the Internet, cellphones/smartphones, tablets, laptops, TVs, or whatever, chances are any device can be online all the time and people can be “always on.” I have a Withings scale that has wifi built into it, for example. There are cheesy things like the Chumby, and it really doesn’t seem like sci-fi ridiculousness to have your appliances wired up. A friend of mine showed me at our weekly breakfast how he could use his iPhone to turn lights on and off at his home.
We’ve had the potential for all of this for years – here in Austin, IBM (again!) has been well known for a decade (or more!) for having one of those “labs of the future” where the fridge tells you when you need to buy more milk. What’s exciting here is that this networking is in common devices, not just luxury ones. For example, we’d expect BMW to have iPod syncing and fancy bluetooth, but now low-end Fords have the same technology. This week Google threw its lot in with the home automation crew with Android@Home – we’ll see if they can finally be the ones that deliver on the home front.
That Withings scale is terribly expensive, for a scale, but the price will inevitably go down. Having the network everywhere means, first, that consumers can do more with their devices, hopefully doing new things – like turning their porch light on if they’re staying out on a late-night bender. But it also throws gas on another fire:
(2.) Open systems are now more the norm than not
The rainbow and sandals crew (kisses!) did much work to build, explain, and then execute on the idea of open source. For all of the polemics, business-driven subterfuge (you gotta make a return on that VC money somehow!), and IP-crap that open source stews in, there was a key idea that’s become part of what it means to do programming now: every component should be open, if only just enough, for you to tinker with and extend it.
Completely closed systems are abhorrent to programmers; they generally think they’re crap. Compare a cable box to a Google TV (or a Roku, if you prefer). Traditional cable boxes have all these lovely ports on them (USB, Firewire, HDMI, and such), but you can’t do anything with them. There’s no way to hook something up and play around with it. Newer Internet-TV boxes have “app store” equivalents where you can contribute channels, apps, and otherwise get in there and do something.
Services like Facebook and Twitter are examples of open platforms as well: all of the extra “plugins” and applications built on top of those two communities help contribute to their overall value. Just look at how Zynga has built on top of Facebook – never mind the money Zynga is making (for the point being made here), look at all the entertainment value people get out of it.
More nerdy things like REST vs. SOA and cloud vs. enterprise architecture show how more open systems are attractive and enabling for developers. Using simple, more open, REST-based (or at least REST-inspired) APIs lets developers innovate and extend existing software faster than top-down, architected SOAs do. The lower-priced, anyone-can-come-use-it nature of cloud lets developers do the same with the infrastructure they need to deliver their applications: they don’t have to wait on the more cumbersome machinations (there largely as safety nets) that enterprise architecture and IT put in front of them.
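To make that concrete, here’s a minimal sketch of what that “simple, more open” style looks like in practice: one HTTP GET and some JSON, no WSDL contracts or generated stubs. The endpoint, the host, and the StatusUpdate shape are hypothetical stand-ins (not any particular vendor’s API), and the sketch assumes a runtime with a global fetch, like a modern browser or a recent Node:

```typescript
// A hypothetical REST-style client: one resource URL, one GET, one JSON payload.
// Nothing here is a real vendor API; swap in any JSON-over-HTTP endpoint you like.

interface StatusUpdate {
  id: string;
  user: string;
  text: string;
}

async function latestUpdates(baseUrl: string): Promise<StatusUpdate[]> {
  // The whole "contract" is a URL you can also poke at with curl or a browser.
  const response = await fetch(`${baseUrl}/statuses/recent`);
  if (!response.ok) {
    throw new Error(`request failed with status ${response.status}`);
  }
  return (await response.json()) as StatusUpdate[];
}

// Usage against an imaginary host:
latestUpdates("https://api.example.com")
  .then((updates) => updates.forEach((u) => console.log(`${u.user}: ${u.text}`)))
  .catch((err) => console.error(err));
```

Compare that to generating client stubs from a governance-blessed schema before you can make your first call, and the appeal to developers is obvious.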
And then there’s the more straightforward open source itself: there’s so much software, from top to bottom, freely available to use as components, tools, and even whole applications that a programmer practically has too much choice.
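As a small, hand-wavy illustration (assuming nothing more than an npm install of the open-source lodash library and its type definitions), here’s what leaning on one of those freely available components looks like; the boring parts are already written, tested, and documented by someone else:

```typescript
// Pull in battle-tested helpers from an open-source component instead of
// hand-rolling sorting and grouping. Assumes `npm install lodash @types/lodash`.
import { groupBy, orderBy } from "lodash";

interface Commit {
  author: string;
  files: number;
}

const commits: Commit[] = [
  { author: "alice", files: 3 },
  { author: "bob", files: 1 },
  { author: "alice", files: 7 },
];

// Sort by size, then bucket by author: two calls, zero new code to maintain.
const byAuthor = groupBy(orderBy(commits, ["files"], ["desc"]), "author");
console.log(byAuthor);
```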
In short: developers have more functionality available to them than in the past, and there are more outlets for their software, more ways to reach users.
(3.) The Return of Apple
These two things and others do a lot to lay the groundwork developers need for that developer renaissance John pointed to, but it was Apple that finally made it all happen. It’s easy to be an Apple fan-boy and glowingly speak of them as if they could do no wrong. What Apple did, however, was reboot our expectations of what computers could do for us. Let’s be honest: before the iPhone, people pretty much thought computers were a nuisance. They were difficult, expensive, slow, and overall a dancing bear (to crib from The Inmates Are Running the Asylum): you see a bear dancing and you think it’s amazing, you gape at it…but compared to how well even a rudely trained human could dance, the bear’s skill is total crap.
We’d lived through decades of crappy desktop, web, and phone interfaces. Good user experience was always a leftover idea: it was like flossing. We all knew we should be doing it and praised its effectiveness, and were amazed when people did it and had stupendous results…but we never did it ourselves. Right before Apple came back, most of the offenders had started turning to good UX (Facebook vs. MySpace is a good touch point as well) – Ajax was amazing, though for all that amazingness, compare Google Calendar to something like iCal or Outlook’s desktop calendar.
The release and (importantly) maturation of the iPhone made people look at computers in a new way (as did the iPod before it, to a lesser extent): you didn’t expect the computer to get in the way, you didn’t need to learn how to deal with its eccentricities. And if you did need to learn, you soon forgot that awkward time (for example, note how people complain that Android is awkward at first – mostly they’re just used to iOS and have to “learn” its naturalness).
There’s been much ink spilled on this topic, but to crib a phrase from the enterprise world, Apple figured out business/IT alignment in the consumer space: they figured out consumer/IT alignment. The iPod was not a computer with a music-browsing and MP3-syncing application, it was a “service” that played music, a device: Apple and its users don’t think of it as a computer, let alone software. It’s just a thing. The iPhone is largely the same, though much wider in function than the single-function iPod (well, the iPod did/does have calendars, solitaire, and some games).
What’s equally remarkable is that the rest of the industry learned quickly from this: they learned that the most important part of computers, software, and programming is the end-user, and that the focus should be on making the end-user’s experience with the computer excellent, or at least efficient. Hold up the iPhone next to a traditional CRM system (or a call center terminal) and you’ll get the idea. To me, this is what “the consumerization of IT” means: focusing on users above all else. And that’s the final thing that makes this a great time to program: you have to – get to! – pay attention to how people are using your software.
More underpinnings
There are lots of details behind those three things. The biggest and most exciting is cloud computing, which is indicative of a lot of this: it’s a way to make programmers’ lives easier and let them more rapidly deliver features and applications to people. I call that “frequent functionality.” The unexpected rise of JavaScript as a “real language” is an under-appreciated, important event, and a multi-year struggle to improve something as boring as source control (first with Subversion, then git, and now the social programming that GitHub brings) helps make a programmer’s life better. And there are all the practices around how you do that: Agile development, dev/ops, rapid feedback loops fed by real-time access to user behavior data, and so on.
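On the JavaScript-as-a-“real-language” point, here’s a rough sketch of what that shift looks like, assuming only Node.js and its built-in http module (the specifics are mine, not anything from the projects mentioned above): the same language that used to just animate web pages now serves them, and can log the kind of real-time usage data those feedback loops feed on.

```typescript
// JavaScript (written here as TypeScript) doing server-side work in Node.js,
// not just browser glue. Assumes Node and @types/node for the built-in http module.
import { createServer } from "http";

const server = createServer((req, res) => {
  // Log every request: raw material for the rapid feedback loops mentioned above.
  console.log(`${new Date().toISOString()} ${req.method} ${req.url}`);
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ message: "frequent functionality, delivered" }));
});

server.listen(3000, () => console.log("listening on http://localhost:3000"));
```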
The torch is passed from The Enterprise
A side conclusion from all this: business is the last place you want to look now for innovation in software. Enterprises had their chance to be the flagship of IT, as did the military and science before them – and all of those leaders did really well: relational databases, GPS, transactional systems, processors and memory that perform beyond our wildest dreams. (Compare the rate of innovation in the auto industry to the computer industry – the older car industry looks like a bunch of people sitting on their thumbs! Where are our rocket cars?! Only now are we even getting electric cars!)
All of these things were good, but then business got wrapped up in developing software that was not focused on being usable, but intended to be compliant, safe, and affordable above all else. With rare exception, business no longer looked at computers as a way to differentiate and drive the business (at least, their speed at adopting new technologies belied any such attempts): businesses were dragged into e-commerce, and banks took forever to do rational online banking.
The consumer space is where interesting technology innovation happens now, and taking a consumer mind-set is what works when applying technologies. Consumers don’t think in terms of risk management: if I use Facebook, I’m opening myself to these risks; if I sign up with the Sony PlayStation Network, what’s the risk/reward of fun vs. credit card leaks; if I use this Android phone, what happens if Oracle wins its lawsuit against Google (and how did that new Galaxy Tab know about all the wifi networks I’d joined on other Android devices)? Instead, they look at computers as a way to get the functionality they need, not as part of some grand, management-chain plan.
Consumer tech is focused on the user, not the “stakeholder.” Enterprise tech has come to care only about the latter.
Disclosure: see the RedMonk client list for clients mentioned.