
Revisiting the 2012 Predictions


As has become the tradition in this space, with January comes the annual predictions piece, which forecasts technology industry developments for the calendar year ahead. But it’s difficult to weigh the value of the predictions absent context; in this case, an evaluation of how the prior year’s predictions fared. From last year’s performance, as well as the performance of years prior (2010, 2011 parts 1 & 2), readers may properly value the predictions for the months ahead.

Below then are thoughts on the hits and misses from the 2012 predictions. Before we get there, however, a quick list of the most popular posts on this blog for 2012.

Most Popular Posts

In a new feature this year, here are the five most popular posts of 2012 by traffic:

  1. “The RedMonk Programming Language Rankings: February 2012”
  2. “The RedMonk Programming Language Rankings: September 2012” (sensing a trend?)
  3. “Microsoft Surface and the Future of Software”
  4. “On the Decline of the GPL”
  5. “Amazon DynamoDB: First Look”

And without further ado, on to the predictions.

Data & The Last Mile

[Data visualization and analytics], then, will be an area of focus in 2012, for both innovation (look for assisted anomaly and correlation identification, a la Google Correlate) and M&A.

While the anticipated wave of analytics/visualization merger and acquisition activity did not materialize, the space was nevertheless an area of substantial focus. From Nathan Yau of Flowing Data’s membership program for creating visualizations to RStudio’s hire of Hadley Wickham to improvements in Google Docs’ charting capabilities to spikes in interest in D3 to the near doubling in size of the Tableau Customer Conference to the Datameer tooling that generated this map of the Hadoop ecosystem to some of the most widely viewed visualizations of the year at FiveThirtyEight, it was a busy year for the last mile of data. The business of making data useful is not nearly as mature as the industry around processing data, but it took some important steps forward in 2012.
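For readers unfamiliar with the technique, here is a minimal sketch of the “assisted correlation identification” the prediction pointed to: ranking candidate time series by their Pearson correlation with a target series, roughly the idea popularized by Google Correlate. The series names and data below are hypothetical, and this illustrates the general approach rather than any vendor’s implementation.

    # Minimal sketch: rank candidate time series by Pearson correlation with a
    # target series, roughly the idea behind tools like Google Correlate.
    # All series names and data here are hypothetical.
    import numpy as np

    def rank_by_correlation(target, candidates):
        """Return (name, r) pairs sorted by correlation with the target series."""
        scores = {}
        for name, series in candidates.items():
            # np.corrcoef returns the 2x2 correlation matrix; [0, 1] is r
            scores[name] = np.corrcoef(target, series)[0, 1]
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    if __name__ == "__main__":
        weeks = np.arange(52)
        # e.g. weekly search volume for a term of interest
        target = np.sin(weeks / 8.0) + np.random.normal(0, 0.1, 52)
        candidates = {
            "query_a": np.sin(weeks / 8.0) + np.random.normal(0, 0.2, 52),  # in phase
            "query_b": np.cos(weeks / 8.0) + np.random.normal(0, 0.2, 52),  # out of phase
            "query_c": np.random.normal(0, 1, 52),                          # noise
        }
        for name, r in rank_by_correlation(target, candidates):
            print(f"{name}: r = {r:.2f}")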

I’ll count this as a hit.

Desktop Importance Declines

The desktop is simply not as important as it once was. Mobile usage is eroding the central role PCs once played; while they are still the dominant form of computing, the trendline is declining and there is no reason to expect it to invert. It’s been suggested that mobile computing in general is additive; that it’s being used to extend the usage of computing to areas where PCs were not employed, and is thus non-competitive. But our data as well as Asymco‘s indicates that, at least in part, mobile usage is coming at the expense of traditional platforms. General search volume data, as we’ve seen, validates this assertion.

According to Horace Dediu, unit sales of tablets will exceed those of PCs this year. That particular forecast may or may not prove to be correct (NPD concurs), for what it’s worth, but it would be difficult if not impossible to make the case that the importance of desktops – in all contexts – did not decline over the past year.

October was dominated by headlines such as “PC sales continue slump amid iPad takeover” and “PC Sales Go Into a Tailspin,” even as iPads sold as quickly as they could be manufactured and Android showed some life in the tablet market with one million Nexus 7s sold.

Tablets and PCs in many cases are complementary devices, with tablets far more portable but significantly worse – barring some real innovation in virtual keyboards – for content creation. Tablets, therefore, are not a mortal threat, as their market entrance does not make it a zero sum game. But the salad days of desktop computers are, in all probability, gone forever, as their role shifts from general purpose computing platform to more specialized niches such as content creation, gaming, or workstation.

I’ll count this as a hit.

Developer Shortages

[Embedded map: Stanford ML-Class Student Map]

This is a long term process, so obvious progress within 2012 will be minimal, and talent shortages will continue. But we will in the next twelve months begin to see distance trained students hired at scale, and this will be one of the first steps towards lower talent costs as well as, possibly, the restoration of middle class employment opportunities.

Interest in online courses has been, as expected, robust. By the time Stanford Engineering’s new online courses officially began on October 10th, the 66,000 students enrolled in the Introduction to Databases course had collectively watched 290,000 videos and taken 10,000 tests. The 72,000 students enrolled in Machine Learning, meanwhile, had viewed 850,000 videos. The Artificial Intelligence course attracted more students than the other two put together: 160,000 from 190 countries. The program was so successful that one of the Stanford professors involved, Sebastian Thrun, has left Stanford to found an online learning startup.

If there were questions about the appetite for online learning, then, those are over. From Stanford’s programs to EdX to Khan Academy to Coursera, multiple entities are targeting the surging demand for online education.

The prediction above, however, went further and asserted that students trained online would be hired at scale. And there is no direct evidence yet that the tens of thousands of students enrolled in programs like Stanford’s are successfully parlaying their online training into gainful employment at scale. Online learning institutions, in fact, appear to be working to establish the credibility of their coursework through a variety of mechanisms, from robot graders to partnerships with existing educational institutions. So while there are indications that the stigma of online education from an employer’s standpoint is beginning to fade, this prediction has to be called into question. I still believe in the substance of the prediction, however; it was probably made a year or two too early.

This is, technically at least, a miss.

Monitoring as a Service

The proliferation of these services is a direct response to the increasingly heterogeneous nature of application architecture and the reality that the substrate is frequently network based, rather than local. Given accelerating rather than declining consumption of network resources, we predict a strong increase in interest and adoption of MaaS tools. Much as I don’t care for the term itself.

In 2012, New Relic claimed to have nearly tripled its user base in a year, and perhaps the best evidence of its success came in a patent infringement suit filed by CA in November. Boundary, meanwhile, raised $15M in a Series B round that closed in July. Similarly, 10gen, the company behind MongoDB – and its related Monitoring and Management Service (MMS) – landed $42M in another round in May. And yet another MaaS player, Tracelytics, was snapped up in June.

The implications of the M&A and investments in the space are clear: the substantial, sustained growth – not to mention the potential for annuity-like revenue streams – is indicative of a high growth, high upside market.

In spite of the fact that the majority of entities in this space are substantially underleveraging the asset that is their collected data, I’ll call this one a hit.

Open Source and the Paradox of Choice

This paradox of choice [from a proliferation of open source projects], or too much of a good thing, will become more problematic over time rather than less as contributions will continue to rise. The net impact is likely to be increased commercial opportunities around selection, and therefore attention to vendors like Black Duck, Open Logic, Palamida and Sonatype.

It is certainly true that the paradox of choice with respect to the selection of open source software became more problematic. In the database space alone, confusion is the rule rather than the exception as users attempt to determine first which type of datastore is appropriate and, second, which is the best choice within the category. Consider the following actual queries pulled from RedMonk Analytics: “hadoop vs mongo vs redis” or “what is nosql.” And even in categories once thought settled, rising new projects such as nginx are creating new choices or new decisions to make, depending on your perspective.

The attendant commercial opportunities have gone somewhat under the radar, but Black Duck’s growth has been strong enough to have it publicly discussed as an IPO candidate. Sonatype, meanwhile, was named one of Inc. Magazine’s “fastest-growing private companies” in August, a month after taking another $25M in its latest funding round. Open Logic, for its part, reported an increase of 730% in open source scans from Q1 2011 to Q2 2012. Growth, in other words, is there, even if the largest opportunities – dramatically streamlining the open source selection process – remain unrealized.

I’ll call this a hit.

PaaS: The New Standard

In 2012, this will become more apparent. PaaS platforms will emerge as the new standard from a runtime and deployment perspective, the middleware target for a new generation of application architectures.

In 2012, PaaS took some important steps forward, led by projects like Cloud Foundry and OpenShift, but it cannot credibly be argued that the category fulfilled its predicted role as the new middleware standard. It may yet get there, but it certainly hasn’t yet.

This is a miss.

Service Proliferation

With the inevitable adoption of multiple third party services – varying cloud resources, multiple, possibly overlapping, management and monitoring services and so on – will come challenges in making sense of the whole. Overall, instrumentation and visibility at the per-service level have improved, but aggregating these views into a cohesive picture of overall architectural health and performance is likely to be highly problematic, not least because the services themselves may present conflicting information and data. Google Analytics and New Relic, for example, are frequently at odds over load times and other delivery related performance metrics. Introduce into that mix services like Boundary or CloudWatch and the picture becomes that much more complex. Connecting their data back to underlying log management and monitoring solutions such as 10gen’s MMS or Splunk is more complicated still.

The challenges of service integration will create commercial opportunities for aggregating services which consume individual performance streams, normalize them and present customers with a consolidated single picture of their network performance. Commercial solutions will not fully deliver on this vision in 2012, but we will see progress and announcements in this direction.

The opportunity here, in my opinion, remains. Apart from applications like Cloudability, Cloudyn, Newvem or PlanForCloud (purchased by Rightscale), which sought to centralize visibility and management of usage or pricing metrics, or players like Boundary, who traversed complex infrastructures with a number of moving parts, no one played even a superficial role in aggregating different service feeds into a horizontal view of infrastructure.
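To make the aggregation idea concrete, the sketch below shows one way two monitoring feeds might be normalized into a single consolidated view. The feed payloads and field names are invented for illustration; real services such as New Relic or CloudWatch each expose their own formats and APIs.

    # Minimal sketch of normalizing per-service monitoring feeds into one view.
    # The feed payloads and field names are hypothetical, not any vendor's API.
    from dataclasses import dataclass

    @dataclass
    class Metric:
        source: str     # which service reported it
        name: str       # normalized metric name, e.g. "response_time_ms"
        value: float
        timestamp: int  # unix epoch seconds

    def from_feed_a(payload):
        # hypothetical feed reporting response times in seconds
        return Metric("feed_a", "response_time_ms", payload["resp_s"] * 1000, payload["ts"])

    def from_feed_b(payload):
        # hypothetical feed already reporting milliseconds
        return Metric("feed_b", "response_time_ms", payload["latency_ms"], payload["time"])

    def consolidated_view(metrics):
        """Group normalized metrics by name and show the spread across sources."""
        by_name = {}
        for m in metrics:
            by_name.setdefault(m.name, []).append(m)
        for name, ms in by_name.items():
            values = [m.value for m in ms]
            sources = sorted({m.source for m in ms})
            print(f"{name}: min={min(values):.0f} max={max(values):.0f} sources={sources}")

    if __name__ == "__main__":
        metrics = [
            from_feed_a({"resp_s": 0.42, "ts": 1356998400}),
            from_feed_b({"latency_ms": 510.0, "time": 1356998400}),
        ]
        # the two sources may disagree, as noted above; the consolidated view
        # at least makes the disagreement visible in one place
        consolidated_view(metrics)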

This is a miss.

Telemetry Usage

Five years ago, we began publicly discussing revenue models based around what we termed telemetry, or product generated datastreams. The context was providing open source commercial vendors with a viable economic model that better aligned customer and vendor needs, but the approach is by no means limited to that category: Software-as-a-Service vendors, as an example, are well positioned to leverage the data because they maintain the infrastructure. In 2011, we finally began seeing vendors besides Spiceworks take the first steps towards incorporating data-based revenue models. For products like Sonatype Insight [coverage], data is not a byproduct, but the product.

In 2012, this trend will accelerate as necessary monitoring capabilities are added to product portfolios and industry understanding and acceptance of the model overcomes conservative privacy concerns. Many more vendors will begin to realize that, like New Relic, which observed a decline in commercial application server usage, their accumulated data is full of insights on both customer behaviors and wider market trends.

On the surface, little has changed. With obvious exceptions like Sonatype’s Insight product (launched late in 2011) or New Relic’s App Speed Index (2012), most vendors remain publicly silent on the topic of putting aggregated telemetry to work. Behind the scenes, however, the tone of the conversations has changed completely. For the first time, vendors are now asking us about the trend, rather than us introducing it to them. A few are further along the curve here, and are at least building out their collection capabilities. Having watched Apple learn the lesson that data is a much more effective barrier to entry than software, vendors are finally, after all these years, beginning to perceive the value that data represents.
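As a rough illustration of what a product generated datastream might look like in practice, the sketch below shows a hypothetical telemetry event emitted by an installed product, and a trivial aggregation across a batch of them. The fields and values are invented, not any vendor’s actual format.

    # Hypothetical sketch of product generated telemetry: events emitted by an
    # installed product, then aggregated by the vendor for market insight.
    # Field names and values are invented for illustration.
    from collections import Counter

    def make_event(product_version, app_server, feature):
        return {
            "product_version": product_version,
            "app_server": app_server,  # the kind of signal behind trends like the
                                       # application server decline noted above
            "feature": feature,
        }

    def aggregate(events):
        """Count application server usage across anonymized, aggregated events."""
        return Counter(e["app_server"] for e in events)

    if __name__ == "__main__":
        events = [
            make_event("2.1", "tomcat", "dashboard"),
            make_event("2.1", "jetty", "alerts"),
            make_event("2.0", "tomcat", "dashboard"),
        ]
        print(aggregate(events))  # Counter({'tomcat': 2, 'jetty': 1})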

In spite of the lack of high profile public examples, then, I’ll call this a hit.

Value of Software Will Continue to Decline

Capital markets have not, traditionally, been overly fond of software firms, perhaps because comparatively few of them eclipse annual revenue marks of a billion dollars – fewer than twenty, by Forbes‘ count. Microsoft’s share price has languished for over a decade in spite of having not one but two licenses to print money. The mean age of PwC’s Top 20 software firms by revenue is 47 years, a fact which cannot be encouraging to startups.

Higher valuations instead are being awarded to entities that employ software to some end, rather than attempting to realize revenue from it directly. Startups today realize this, and the value of software in their models has been commensurately adjusted downward. Tom Preston-Werner, for example, describes the GitHub philosophy as “open source (almost) everything.” Facebook, LinkedIn, Rackspace, Twitter and others exhibit a similar lack of protectiveness regarding their software assets, all having open sourced core components of their software infrastructure that even five years ago would have been fiercely guarded.

This is becoming the expectation rather than the exception because it is nothing more or less than an intelligent business strategy. Businesses can and will keep private assets they believe represent competitive differentiation, but it will be increasingly apparent that less and less software is actually differentiating. As a result, 2012 will see even less emphasis on the value of software and more on what the software can be used to achieve.

It’s hard to measure this prediction accurately. There are so many different types of software with so many use cases that a comprehensive answer would be novel-sized – and a long novel, at that. Still, one exercise conducted over the last twelve months looked at the profits and margins of businesses built in whole or in part on software. Specifically, I examined the importance of software at IBM, Oracle and Microsoft. Among the findings were the conclusions that IBM’s software group accounts for 23% of the company’s total revenue, that Microsoft’s gross margin has been in steady decline since approximately 1998 and that Oracle’s percentage of revenue from software has declined from 80% (2007) to 69% (2012). None of which means that the companies involved cannot make money from software; quite the reverse, in fact. It does indicate, however, that each is facing a more challenging environment than in years past.

Conversely, we have seen consumer companies like Netflix (Asgard, Hystrix, etc) and enterprise vendors like Cloudera (Impala) continue to release important new technologies as open source that, a decade ago, would have been closely held proprietary code.

So I’ll call this one a hit. And it will continue.

Bonus Prediction:

Facebook’s Most Important Feature in 2012 will be Timeline. Mark it down.

Having cleverly failed to define “important,” I can call this a win merely by whim today. But while features like “Promoted/Targeted Posts” are undoubtedly more relevant to the bottom line, and Poke is certainly the most reviled, I’d argue that Timeline – in reorienting how Facebook’s core asset, its users, engage with the property – is the most important feature of 2012. If you don’t agree, Google “facebook businesses timeline.”

I’ll call this one a hit.

The Final Tally

Tallying them up, then, we get seven correct predictions against three misses for a 70% success rate. For contextual purposes, that’s better than my 2010 predictions (67%) but down substantially from 2011 (82%). This is, in part at least, a function of the aggressiveness of the picks per year. And in general, when discussing predictions with those in the industry, the preference skews towards aggressive, as highly accurate but non-surprising predictions are of comparatively low value. So for this year, a few of the predictions anticipated far more progress than was actually made. Given that in the majority of cases, however, I still feel good about the prediction itself, even if my timing was off, this is an approach I will stick with.

Thursday, then, look for moderately aggressive predictions for the year ahead. Even though it’s already begun.

Disclosure: The following companies are RedMonk clients: 10gen, Black Duck, Cloudera, IBM, Microsoft, New Relic, Sonatype, PlanForCloud (Rightscale), Red Hat (OpenShift), Tracelytics, and VMware (CloudFoundry). These companies or institutions, meanwhile, are not currently RedMonk clients: Apple, Boundary, Datameer, Cloudability, Cloudyn, Coursera, Facebook, Google, Khan Academy, Open Logic, Oracle, Netflix, Newvem, RStudio, Stanford University, and Tableau.

Comments

You could make a reasonable argument that tools like Graphite/Riemann/Statsd and companies like Librato are helping to deal with the service-proliferation problem.

