In early January, I released my annual set of predictions for the upcoming year. Among them was the contention that “Ubuntu will emerge as the de facto alternative at the expense of SuSE.”
As might be imagined, the people at Novell had questions about this assertion. These we discussed during a briefing on January 26th. To their credit, the SUSE team was polite and respectful, even as we fundamentally disagreed.
My own view is that SUSE faces some fundamental challenges.
The disconnect is simple to explain: we prioritize different metrics. Our belief at RedMonk is that more often than not, sustainable performance is most strongly correlated with developer traction, visibility and usage. Revenue, shipments and profit are excellent at measuring how you’re performing now; they are less adept at anticipating future direction.
Developer adoption, on the other hand, is, in our view, highly predictive. It is difficult to identify cases where volume adoption of a given technology has not resulted in its success. Profit potential varies, of course, according to a number of variables from licensing model to addressable market size. But historically, optimizing for developer adoption is a sound strategy; it is arguably true that this correlation is becoming stronger.
The metrics we examine at RedMonk, then, are intended to quantitatively assess developer opinion. There is no single metric for accomplishing this; instead we employ – depending on the subject – varying combinations of public and private data sources to form a larger narrative. Even seemingly trivial data, like Google Trends search interest, has significance when used in context.
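As a purely illustrative sketch – not RedMonk’s actual tooling, and using invented numbers rather than real Google Trends data – comparing the relative search interest of two projects from Trends-style weekly scores (a 0–100 scale) might look like this:

```python
# Hypothetical illustration: given Google Trends-style weekly interest
# scores (0-100 scale) for two projects, compute what fraction of the
# combined interest one project captures. All numbers below are invented.

def relative_share(series_a, series_b):
    """Return series_a's fraction of the combined weekly interest."""
    total_a = sum(series_a)
    total_b = sum(series_b)
    return total_a / (total_a + total_b)

# Invented sample scores for two hypothetical search terms
project_a = [70, 75, 72, 80]
project_b = [30, 25, 28, 20]

share = relative_share(project_a, project_b)
print(f"Project A share of combined interest: {share:.0%}")
```

A single ratio like this is obviously crude; in practice such a signal would only matter alongside other public and private data sources, as the paragraph above describes.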
Sometimes the conclusions reached merely validate the conventional wisdom; Amazon Web Services is one such example – it is just as popular as commonly believed. More often, our explorations of developer traction turn up less understood strengths, weaknesses and areas for improvement.
We perform these analyses regularly for research aimed at particular developer communities; my FOSDEM talk “The Rise and Fall and Rise of Java” [coverage] was one such. But we also perform them regularly on behalf of clients, whether they want us to measure them at a corporate level or to look at specific products or platforms.
And while Novell is not currently a RedMonk client, I had prepared a few charts for my call with the firm, and I thought the embedded slides might be useful both for those wondering how I came to my conclusions regarding Novell and for those curious about how RedMonk tries to quantitatively measure developer traction. This is just a sample report, and does not include the executive summary, remediation recommendations or backup data, but it communicates the essence of what RedMonk Analytics Custom offers.
As W. Edwards Deming once said, “In God we trust; all others must bring data.” The State of Novell, and indeed all of our “State of $YOURCOMPANY” reports, are our attempt to comply. The slides above measure how we see the firm performing according to a few of the metrics we believe are important. As such, they are a complement to, rather than a replacement for, traditional research services. We do not wholesale discount cited revenue performance metrics, marquee account wins and so on. We simply believe that, relative to bottom-up adoption and usage, they are less likely to be predictive.
If you’d like to see how your company or product is performing, we’ve got RedMonk Analytics Custom packages available for existing clients and a la carte options for those who aren’t working with us.
For the developers in the audience, we hope you find this research of interest, and feel free to let us know if there’s something you want us to look at.
Disclosure: Novell, as disclosed above, is not a RedMonk client. Mentioned competitors Canonical (Ubuntu), Microsoft, and Red Hat (Fedora) are RedMonk customers.