MMS 2007: Update on DSI, System Center, and Microsoft in IT Management

Yesterday, Microsoft ran its annual MMS (Microsoft Management Summit) analyst day. About 25 or 30 analysts were given a day of presentations and talks updating us on Microsoft’s IT management products and ambitions (or “aspirations” as they like to call them). Additionally, I spoke with six different sets of people yesterday afternoon and this morning about various aspects of the Microsoft IT management stack.

Below is my general overview and update of Microsoft’s IT management suite as I heard about it at MMS. I’ll follow up with other posts on narrower topics, such as SML/SDM/CML, PowerShell, System Center Essentials (the SMB offering, which should be interesting to several RedMonk clients), Softricity, and other topics that fall out of my notes.

While MMS goes on through the week, I had to leave early for the Adobe Analyst Summit in New York. (In fact, I’m writing this on the plane from San Diego to JFK; mid-week, I was lucky enough to get a row of 3 seats to myself…yuh!)

Delivering Part of the Vision

Compared to last year, this year was focused on deliverables like System Center Operations Manager (MOM), System Center Essentials, System Center Configuration Manager (SMS), and previews of products like the virtualization manager and the Service Desk/CMDB stack, Service Manager.

That’s a lot of products to rattle off, and it doesn’t include other components like Data Protection (back-up) and the security offerings, and recent acquisitions like AssetMetrix and Softricity (both quite interesting). Paula Rooney has a concise round-up.

What is IT Management?

Pulling back a bit, what is IT management? Boiled down, it means providing and taking care of all the computers, software, and associated administrative artifacts that a business or organization needs to run. Ideally, a business forms a plan for making money, figures out how IT can help to execute and maintain that plan, and then asks the IT department to provide and keep that IT running.

How all of that is done is a matter of great nuance and countless Big Thuds of thinking, guidelines, and software. But it’s a good enough definition of the ideal, a priori approach to IT management without getting bogged down in the muck.

In the end, the thing to keep your eye on is using IT to help the business make, keep, and grow money instead of sucking money away.

The DSI Story

Any large IT management suite has a conceptual story wrapped around it that lays out:

  • a narrative and a “world-view” of why the software exists
  • how it should be used
  • what success looks like

Microsoft calls its story for IT management the “Dynamic Systems Initiative” (DSI). At the moment, Microsoft is in the 4th year of a 10-year plan for delivering on DSI.

Last year, at the same event, much more emphasis was placed on explaining DSI. This year, as Microsoft said, was a delivery year, so there was less talk about the DSI story. In brief, DSI revolves around the idea of taking IT departments along a maturity model from a “basic” level of fire-fighting to a “dynamic” level where they’re responsive and helpful to the needs of the business. IT moves out of the Carr dog-house of being a necessary evil and becomes a strategic asset to the business. That is, IT helps the business compete and make money.

This is the familiar story of “aligning IT with business,” which all IT management vendors aspire to nowadays. There’s nothing wrong with that goal at all, or with it being “familiar.” The approach that Microsoft takes is (currently) about understanding what IT you have, modeling it, and then adding the tools on top of those models to enable rapid change in IT.

The Next 3 Layers of the Onion: Models and Virtualization

There are 3 pillars of DSI that answer the question of “how does it do it?”:

  • Design for Operations – assuring that software, servers, and whatever else is to be managed is properly instrumented, has help and troubleshooting “knowledge,” and is properly “modeled” so that it fits in with IT. That is, the chunk of “IT” should be “self-describing” and contained when it comes to the needs of an IT admin: there should be as little mystery as possible as to what it is, how it should behave, and how it can be fixed. Think of the “self-describing” dreams people used to ascribe to XML.
  • Knowledge-Driven Management – with that foundation, running IT can be managed by assessing the current state of any given “piece” of IT. You can compare its health and performance to the expected health and performance and, as things go wrong, quickly figure out how to fix them. Another key aspect, enabled by proper modeling, is knowing how a piece of IT fits in with other IT (see the sketch after this list).
  • Virtualization – while the first two are architectural and design pillars, the third is, in comparison, a “low-level” technological stake in the ground: IT should be as virtualized as possible, from servers to applications themselves.
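
To make that “assess and compare” loop a bit more concrete, here’s a minimal PowerShell sketch of the idea, not of how System Center actually implements it. The service name and the fix are made up for illustration:

```powershell
# A made-up "model" of how a piece of IT is expected to look.
$expected = @{ Name = "W3SVC"; Status = "Running" }

# Assess the current state of that piece of IT...
$actual = Get-Service -Name $expected.Name

# ...compare it to the expected state, and apply the "knowledge" of how to fix it.
if ($actual.Status -ne $expected.Status) {
    Write-Output "Drift: $($expected.Name) is $($actual.Status), expected $($expected.Status)."
    Restart-Service -Name $expected.Name
}
else {
    Write-Output "$($expected.Name) matches its expected state."
}
```

The point is the shape of it: a model of what “healthy” means, a check against reality, and canned knowledge about what to do when the two disagree.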

This is a clipped-down version of DSI; check out this Feb. 2007 white paper for more details. It’s actually quite self-contained and complete.

Configuration First

In a subtle way, the DSI approach leads with “configuration” (knowing what’s running, in what, well, configuration, and keeping it up-to-date) and follows with monitoring and management (making sure what’s running is working correctly, as in fast and available enough). I point this out because the “traditional” approach to IT management (as seen in established vendors and most new vendors and projects) is concerned with monitoring and management first rather than knowing “the state of things” first.

In both mind-sets, you do both, of course: it’s more a question of which of the two — configuration or monitoring/management — is done first and in most depth. Indeed, having “done” the monitoring and management work long ago, the more established vendors have been circling their configuration wagons for several years now with their own CMDB and IT modeling stories.

To me, it seems natural that Microsoft would lead with configuration. Managing Windows-land is about keeping all those desktops up-to-date and operational. There are many, many more Windows desktops out there than servers and services of whatever ilk. Taking a configuration approach of just keeping all those desktops in a good state is about all you can do: you can’t really respond to every single CPU slow-down on every desktop machine. The pay-off isn’t worth it.

That said, Microsoft wants to be — and is — about more than just the desktop. Server management may be more about monitoring and management, but the history of desktop management, I’d wager, is what puts configuration first in the order of concepts.

System Center

The above 3 pillars translate into actual code in the form of System Center. System Center is the (year-old) brand for all of Microsoft’s management software, primarily the venerable SMS and MOM, renamed to System Center Configuration Manager and System Center Operations Manager.

IT management is full of product names that don’t immediately explain themselves. Thankfully, Microsoft’s products do. Operations Manager does the monitoring and management, while Configuration Manager does exactly what its name implies.

My sense is that slowly, over the years, these two product lines are being transformed from the “point products” they began as, and are aligning more with the story of DSI. As mentioned above, we’re just in year 4 of 10.

Above the base-level of monitoring and management support, System Center provides the tools to model and instrument (Design for Operations) IT and IT assets. This modeling takes the form of “management packs,” which contain the definition of what’s being managed, explanation and help for managing it (“knowledge”), and the “tasks” for doing management. An authoring tool allows developers and admins to create new management packs, ranging from a low-level, script-driven approach to easier, wizard-driven builders.
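
For a rough feel of what a management pack bundles together, here’s a hypothetical sketch in PowerShell (not the actual management pack format) of its three parts: a definition of what’s being managed, the “knowledge” for the admin, and a task to run. All the names are invented.

```powershell
# Hypothetical shape of a management pack's contents; the names are invented.
$managementPack = @{
    # The definition of what's being managed and how to find it
    Definition = @{
        Name      = "Contoso Order Service"
        Discovery = { Get-Service -Name "ContosoOrderSvc" -ErrorAction SilentlyContinue }
    }
    # The help and troubleshooting "knowledge" shown to the admin
    Knowledge  = "If the service stops, check the database connection first, then restart it."
    # The "tasks" an admin can run from the console
    Tasks      = @{
        Restart = { Restart-Service -Name "ContosoOrderSvc" }
    }
}

# Discover the managed thing and surface the knowledge alongside it:
& $managementPack.Definition.Discovery
Write-Output $managementPack.Knowledge
```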

Also, the announcement today was that Microsoft has OEM’ed some of EMC’s Smarts technology into Operations Manager to flesh out network management. Not being familiar with EMC’s Smarts at the moment, I can’t judge how cool or un-cool that is.

ITIL, Service Desk, CMDB => Service Manager

The other part in the hopper is moving up to the conceptual level of “service management” that currently dominates IT management thinking. Boiled down, this means manifesting ITIL thinking and practices in the IT management software. The first rung in this, for vendors, has been implementing a help desk. Or, put in more flowery language, a service desk. When providing IT as a service to the rest of a company (or whoever is the consumer of IT), you’re taking requests for new IT or updates to existing IT, getting “tickets” about things that aren’t working, and then doing all the “low level” stuff to keep everything up and running.

The in vogue way of doing the last part is to maintain a Configuration Management Database, or CMDB, that keeps a record of “everything” that IT is concerned with: physical assets, logical services being provided, and even people and roles. The idea — much as with Microsoft’s idea of modeling everything — is that if you just have everything “written down,” it’ll be easier and quicker to manage it.
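
To illustrate the “write it all down” idea, here’s a tiny, made-up example of the kinds of records a CMDB links together: physical assets, logical services, and the people responsible for them. It has nothing to do with Service Manager’s actual schema.

```powershell
# A toy CMDB: servers, services, and people, linked by name. Purely illustrative.
$cmdb = @(
    @{ Type = "Server";  Name = "web01";        RunsService = "Online Store"  }
    @{ Type = "Service"; Name = "Online Store"; Owner       = "jane.doe"      }
    @{ Type = "Person";  Name = "jane.doe";     Role        = "Service Owner" }
)

# With everything "written down," a question like "who owns the service running
# on web01?" becomes a lookup instead of archaeology:
$service = ($cmdb | Where-Object { $_.Type -eq "Server" -and $_.Name -eq "web01" }).RunsService
($cmdb | Where-Object { $_.Type -eq "Service" -and $_.Name -eq $service }).Owner
```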

Of course, the assumption behind all of this is that you have the tools at your disposal to keep things running and healthy. That “low-level” work is often implicit and not talked about when aspiring to establish help desk processes for responding to requests and a catalog of the known IT universe in a CMDB.

This year, Microsoft gave more detail on its in-progress service desk and CMDB, giving us an update on the product once code-named “Service Desk” and now officially named Service Manager. Service Manager contains a help desk and a CMDB, though, from talking with Shawn Bice and Eric Berg, the two can be separated as needed or desired. The overview of Service Manager is essentially the same as last year, with a bit more meat on the bones:

  • the data-model for the CMDB is based on SDM/SML/CML (more on those in another post)
  • a self-service portal allows users to help themselves with requests as much as possible
  • workflows for doing incident, problem, change, and asset management will be provided. These workflows will be done in WinFX Workflow Foundation
  • reporting is provided over the request information, workflows, and CMDB

Microsoft codified an “implementation of ITIL” some years ago in the Microsoft Operations Framework, or MOF. As ITILv3 comes out, Microsoft’s hope is that MOF can become closer to pure ITIL than an extension of it. As ITILv3 isn’t “out” yet, there are no promises, they say, but the hope is that it’ll be good enough to use more off-the-shelf than not.

Challenges

A SaaSier World

While biting off virtualization now will result in System Center and DSI managing the “future of IT,” I still worry about how the increasing use of Software-as-a-Service will displace the overall usefulness of the System Center we get 6 years from now. Application delivery, and IT in general, delivered as a service rather than the traditional and well-established behind-the-firewall approach is a challenge to all of this. Existing vendors, not having to spend their time laying the foundation of an IT management platform as Microsoft must do over the upcoming years, can adapt to managing Software-as-a-Service (both providing it and consuming it) as the need arises. A year ago, we heard a lot from Microsoft about the SaaS way of thinking, but that aspiration seems to have gone quiet of late. So, perhaps there’s something in the works. We’ll have to see. If there is, hopefully it’s permeated into the management wing of Redmond as well.

Developers

Also key to success for Microsoft is appealing to developers to actually “model” and provide the management packs for the software and services they develop. While Microsoft is doing “Design for Operations” for all of their own software, getting other software providers to do the same is a challenge. In essence, it’s more work for developers. Developers who’ve had to support the software they write understand the need for “Design for Operations,” but my sense is that most developers haven’t had the “pleasure” of suffering through support. I know it was an eye-opener for how I developed software. Convincing developers that they’ll “suffer less” if they “Design for Operations” is a big task ahead for Microsoft.

My sense after the last two days is that Microsoft doesn’t yet have a compelling story to tell all developers, Windows and non. What’s needed is a developer community around all of this, if only represented by outreach into existing developer communities.

As the modeling specifications are hashed out, the utility of those specs may help if they’re simple and helpful enough. To be brutally honest, if the SML 1.0 spec is a whiff of things to come, there’ll be problems ahead when it comes to delivering simplicity and pragmatic usefulness. It feels way, way too WS-*. Getting that right is vital for the IT management industry as a whole. While the landscape is littered with (overly complex and horky) attempts at creating a common IT model, we’ve been doing all of this littering because we desperately need that model. In the meantime, customers pay for the roach-motel incompatibilities that necessarily exist when everyone does their own thing.

In the meantime, PowerShell is an interesting bright light. The coolness of PowerShell is a good grappling hook to keep the attention of the overall developer community. Indeed, as the increasing coverage and explanation of PowerShell has shown, it may be an easy enough way to hook Windows-based applications into the Windows management world to motivate developers to start practicing “Design for Operations.” Thinking of PowerShell as a sort of action-over-model approach to making IT more manageable is a fun path to go down.
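
For a small taste of that “action-over-model” feel, here’s a generic sketch using built-in cmdlets; the idea is simply that an admin composes small actions against whatever is out there, whether or not a formal model exists. The remediation line is commented out so it reads as a harmless example:

```powershell
# Find services that should start automatically but aren't running, and report them.
# Un-commenting Start-Service would be the "action" half of the loop.
Get-WmiObject Win32_Service |
    Where-Object { $_.StartMode -eq "Auto" -and $_.State -ne "Running" } |
    ForEach-Object {
        Write-Output "Stopped auto-start service: $($_.Name)"
        # Start-Service -Name $_.Name
    }
```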

So What?

My sense of all this is that Microsoft is slowly but surely firming up its IT management portfolio to be on par with traditional enterprise IT management. That’s a hard row to hoe: the Big 4 and others have been at IT management for a long time and are massively entrenched. The trope from the competition is that everyone hates the offerings of the Big 4 because they are so entrenched. Little wonder that the vision painted by DSI is one of agility, change, and zip-zip-zip motion and flexibility.

The hope and promise of Microsoft’s approach is that they are the best option for managing Windows. At the moment, the Windows-centric approach is the primary weakness, and I’d hope to see a more rapid embrace of the non-Windows world and applications if Microsoft wants the System Center line to be a full-blown IT management platform rather than just a Windows management platform. Their strategy is to satisfy Windows-land and depend on partners to bring in heterogeneity.

From a strategy perspective, this approach is a sort of Moorean classic: winning over narrower markets and then expanding into the general market based on the credit established in those narrower markets. Of course, it’s also a story of Microsoft having to build up the full suite of technology needed for an IT management platform, including corralling their existing efforts into a more unified narrative and stack.

From what I saw this week at MMS, if they’re at year 4 of a 10-year plan, they’re doing all right. Numerous customer stories — take them as you will — attest to the usefulness of using System Center to manage medium to “small enterprise” scale Windows shops. The role of Microsoft virtualization is one waiting in the beta wings, but the plans for how it will help are well articulated and demoed.

The other 6 years, of course, give established and even new players plenty of room to maneuver. I suspect as the years tick off, they’ll have to take Microsoft more seriously.

Disclaimer: parts of Microsoft are clients. Microsoft paid for my travel and hotel to MMS2007: thanks! Check the clients page for relevant clients in the IT management space.

Categories: Conferences, Systems Management.

5 Responses

Continuing the Discussion

  1. […] to, if not at the center of, Microsoft’s overall IT management vision/plan (the Dynamic Systems Initiative, DSI) is the idea of modeling all of IT both ahead of time (a priori) and in real-time. Old hands will […]

  2. […] I won’t get into what I’d call the Microsoft IT management philosophy and view of the world. But I’ve written that up several times while at MMS’s and TechEd’s, most recently at MMS 2007. […]

  3. […] always had a soft spot for Microsoft’s IT Management efforts because their Dynamic Systems Initiative (DSI) was so damn dorky and developer oriented. DSI is a […]

  4. […] As with practically all elder companies (if not all), most of the cloud talk is about how IT must change to take advantage of cloud computing. Though it may seem opportune, Microsoft’s Dynamic Systems Initiative (DSI) doesn’t require too much square peg in round hole shoving to match up with the emerging vision and practices for cloud computing. DSI (in year 7 or a 10 year plan) has mostly to do with modeling and virtualization, a sort of Microsoft go at IBM’s forgotten Morlock utopia, Autonomic Computing. (See more extensive DSI coverage from my Microsoft Management Summit 2007 write-up.) […]
