The technology industry, by nature, turns niche activities into mainstream ones. Even Thomas J. Watson, founder of IBM, so the apocryphal story goes, claimed the world would never need more than five computers. The flipside? Remember Bill Gates saying he wanted to put a PC in every home? Oh, how we laughed.
Well it seems the latest technology that is going to be pushed into mainstream business life is statistics and deep data mining and analysis. Back in 2006 BusinessWeek heralded the age of the “Quant” in an article entitled Math Will Rock Your World.
With the SPSS deal that era may be arriving. My ex-colleague at InformationWeek, Bob Evans, certainly thinks so:
I think IBM’s acquisition of SPSS will mark a seminal moment in that company’s evolution, and that it will also accelerate — perhaps even greatly accelerate — the broader evolution of the IT industry from one fixated on boxes and code that run internal operations to one that’s focused on providing insight and expertise that helps customers grow.
Google shows us every day the results of relying on data, and the law of large numbers, for decision-making. The Google spell checker, for example, is based on normalizing all of the different spellings people use of a word, rather than on a traditional dictionary-based approach. Google’s idea of great design is… count the times. Google’s disease tracking efforts, meanwhile, are showing real success against more traditional approaches – see Flu Trends.
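That count-the-times approach to spelling can be sketched in a few lines. This is a toy illustration, not Google’s actual implementation: the corpus here stands in for query logs, and the `edits1`/`correct` names are my own.

```python
from collections import Counter

# Hypothetical toy corpus standing in for a pile of real-world query logs.
corpus = "the fox receives a prize the fox receives praise he recieves nothing".split()
counts = Counter(corpus)

def edits1(word):
    """All strings one edit (delete, transpose, replace, insert) away from word."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def correct(word):
    """Prefer the most frequently observed candidate, not a dictionary entry."""
    candidates = {w for w in edits1(word) | {word} if w in counts}
    return max(candidates, key=counts.get) if candidates else word

print(correct("recieves"))  # "receives" wins because people type it more often
```

The point is that no dictionary ever enters the picture: the “right” spelling is simply the variant observed most often, which is exactly the large-numbers bet the post describes.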
So, apart from the fact that it was quants, the rocket scientists on Wall Street, whose financial models apparently confused inches with centimeters and who nearly brought down the world economy… now the argument is that we just didn’t have enough data. Because of the Wall Street Gaussian model fugazi I am skeptical of IBM’s claim that SPSS brings it predictive analysis, but really deep analysis is plenty useful.
Banks, for example, are now looking to analyse not just five days of data to calculate market risk, but five years.
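Why does the window size matter so much? A minimal sketch of historical-simulation Value at Risk shows the effect – the numbers below are made up, the function name is mine, and real risk models are vastly more involved:

```python
# Historical-simulation VaR: the loss threshold exceeded on a small
# fraction of observed days. Toy daily returns, not a bank's model.
def historical_var(returns, confidence=0.999):
    """99.9% VaR from an observed return series."""
    ordered = sorted(returns)            # worst losses first
    index = int((1 - confidence) * len(returns))
    return -ordered[index]

calm_week = [-0.002, 0.001, 0.003, -0.001, 0.002]        # ~5 quiet trading days
long_history = calm_week * 250 + [-0.08, -0.09, -0.12]   # ~5 years, incl. a crash

print(historical_var(calm_week))     # tiny: no crash in the window
print(historical_var(long_history))  # much larger: tail events finally show up
```

A five-day window simply never contains the tail event, so the model reports almost no risk; stretch the window to five years and the crash days dominate the estimate. That is the whole argument for throwing more data at the problem.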
The scale of data we’re dealing with has fundamentally changed. We stand on the cusp of a big step forward in environmental sciences, for example, again based on Big Data, as we start trying to mitigate the effects of climate change. Think of carbon-trading on a global scale and the data volumes involved. Think about retailers certifying and tracking every tree from the forest to the futon.
In retail and supply chain, RFID tags generate obscene amounts of data. Cue SPSS.
The reality-based community believes in data. Science relies on data, not just opinion. Where climate research under the last US administration was about editing press releases, the new EPA is all about the data. The entire US government apparatus is arguably a lot more data-focused than it has been in a long time. I have written about Stimulus and Smarter Planet alignment before.
Every IBM Smarter Planet deal is going to have a major statistics and Big Data aspect. Smarter Planet is an engagement model: IBM now owns all the tools to sell into these big deals. It’s probably worth reminding ourselves what SPSS stands for – Statistical Package for the Social Sciences. Will SPSS have a role to play in IBM’s burgeoning Services Sciences play? You would have thought so, certainly.
It’s important to note the wider context – it’s not just IBM that wants to make quantitative approaches more mainstream. The New York Times recently wrote a gushing article, Data Analysts Captivated by R’s Power, about another statistics software technology, a language called “R”, more commonly associated with statistics, maths and physics classes than business schools. A new company, Revolution Computing, was recently formed to commercialise R.
The world is going mad for maths, or as my colonial side of the family might put it, mad for math. We even have “hot” computational search engines now – anyone for Wolfram Alpha, which could change software?
Dr. Jim Goodnight, founder of SPSS’s biggest competitor, the bigger, more commercially oriented and successful SAS Institute, must be feeling good today. IBM just strongly validated SAS’s model. So while SAS will find itself competing more directly with IBM Software Group, it will also have IBM effectively marketing Big Data analysis. I expect IBM Global Services will also continue to partner with SAS.
Just to dork out a little and get specific: one very interesting area I haven’t seen anyone else comment on is mainframe pricing. IBM has done a lot of really solid work making the mainframe less expensive for non-CICS and IMS workloads, with specialty engines for Linux (IFL), DB2 (zIIP) and WebSphere (zAAP). IBM is determined to drive data warehousing workloads to the mainframe. But SAS Institute was a “stick in the mud”, effectively forcing users to pay capacity-based mainframe charges, and so making it less likely customers would run Big Data analytics on z. Well, now IBM is in a great position to offer specialist offload processors for data analytics workloads, and also to push SAS Institute into a price war that can only benefit customers interested in mainframe consolidation – and don’t think that’s an isolated group. What was the first thing SAS claimed about the acquisition? That it would force up prices. Good luck with that… One other thing IBM mainframe customers will be able to do: analyse all of their CICS transactions for patterns.
Like SAS, another company in an interesting position right now is SAP. It recently signed a deal to resell SPSS tooling as part of its Business Objects portfolio. This is the second time in a month or so that SAP has lost an ecosystem partner – see my analysis of Software AG’s purchase of IDS Scheer.
It wouldn’t surprise me in the least to see HP acquire Teradata now.
IBM has a major marketing job on its hands convincing the world of the value of statistics and deeper analytics-driven decision making, but since when did IBM not like a major marketing job?
It is definitely worth checking out Forrester’s analysis too.
disclosure: IBM and SAP are clients, Software AG and SAS Institute are not.