RedMonk

Gary Edwards: $0.02 That Go a Long Way


OpenOffice.org Display at LinuxWorld SF 2005

Originally uploaded by webmink.

I’ve already indirectly surfaced this comment from OO.o’s Gary Edwards (the guy on the right in Simon Phipps’ inset Flickr photo) in my del.icio.us links, so those of you picking up my Feedburner feed that integrates those might already have read this. But James suggested, and I agree, that it’s worth explicitly calling out for the folks that a.) don’t get my main feed with the del.icio.us links, b.) don’t get the comment feed, and c.) stick principally to the front page. I feel fortunate for all of the comments I get here, b/c (knock on wood) our commenters tend to be thoughtful, articulate and best of all – respectful of one another. But when Gary checks in, he really checks in :) What follows are his thoughts on my post entitled “Open Source, Ecosystems, The Second Law of Thermodynamics, and a Random Belgian Mathematician.” They’re appreciated, as always, Gary – and I hope the rest of you find them as insightful as I do. For the record, while I don’t personally subscribe to every last nuance, I agree with most of it, particularly that Adobe’s Duane Nickull is very definitely a cosmic genius. What is it with all these absurdly smart Canadian technologists? Anyway, here’s Gary:

I think that for much of the technology industry, “the dynamics of interaction” have been forever changed by the Open Internet. As an ecosystem, everything else is just another layer in the rain forest.

Open Internet protocols and methods are showing up everywhere. So much so that applications and information systems are near worthless if they fail to connect and compute collaboratively through them.

I used to think about things in terms of platforms, frameworks, languages, and application environments. Increasingly, though, I tend to think about these things in terms of how well they make use of the Open Internet. For instance, SOA-ESB is just a collection of best practices and services for successfully embracing the Open Internet infrastructure. The Internet is becoming the most important point of interoperability between platforms, frameworks and applications.

I can’t help but notice that portable runtime engines (Java, Python, Mono, JavaScript, XPCOM-XUL, Ruby, Flash, etc.) are able to adapt platform-, framework-, and application-specific environments to the Open Internet faster than vendors can build consensus around proprietary initiatives. How does a vendor build an ecosystem around a proprietary initiative when the participants eat and run? It’s like dealing with subatomic particles: the mere act of observing alters the behavior. It happens instantly. Poof poof.

The question is no longer about whether a vendor can grow an ecosystem around a proprietary system. Rather, can a vendor homestead a significant piece of the global ecosystem, and create sympathetic dependencies of a lasting and profitable nature? Can they find a layer in the rain forest and thrive?

What matters today are dynamics of interaction based on Open Internet standards. Fading fast are the days of vendors competing on the basis of platform or application ecosystems, with attention increasingly shifting to the Open Standards and Consortia Standards organizations as places to meet and arbitrate issues of Internet based interaction.

Where vendors used to meet to talk about the interoperability of their systems, they now meet to arbitrate and advance protocol and method enhancements likely to favor the capabilities of their proprietary systems.

This isn’t “growing” an ecosystem in the traditional sense. This is more like seeking the favor and consent of the global ecosystem to advance the common infrastructure in new directions. The world long ago chose a common infrastructure unique in that it is owned by none and used by all. So today we argue more about which platform, application environment, or integrated stack of technology systems makes the best use of the Open Internet than we do about which platform or integrated stack is better.

The proof of this is that if you were to disconnect any platform or integrated stack from the Open Internet, who would want it?

The “short term proprietary / long term open source” transition model you mention is somewhat similar to what we briefly discussed at JavaOne when I asked cosmic genius Duane Nickull to comment on Bill Coleman’s prescient concept called “running the stack”. Ever the careful politician, Duane was coy in his explanation of how Adobe plans to mesh the Adobe stack with Open Standards. But it looks to me as though Adobe is doing exactly what Bill Coleman described as “running the stack”.

The basic idea behind “running the stack” is to build on a commons infrastructure where Open Standards are publicly embraced. Above the commons high-water mark, a vendor innovates and extends protocols, methods, and interfaces with the promise of returning these efforts to the commons infrastructure (as Open Standards proposals). This “promise” to customers and ecosystem participants kicks in when critical mass has been reached: that moment when interoperability with other information systems and technologies becomes as important an issue as the initiative itself. At that point the initiative is pushed down the stack into the commons infrastructure, raising the high-water line. The vendor of course moves higher up the stack to innovate and enhance once again.

Although Adobe doesn’t have a “public” promise to run the stack, they are pushing important proprietary technologies into the Open Standards commons. Shortly after JavaOne, Duane presented a key Adobe stack technology, the XMP-RDF Metadata model, to the OASIS OpenDocument TC for consideration. He has also submitted the critically important Adobe XFA method of extending XForms to the W3C for consideration. Adobe routinely describes PDF2 and the Adobe LiveCycle model as an XML framework, broadcasting loudly the importance of Open Standards across the entire stack.

At our JavaOne meeting you asked Duane where Adobe will draw the line. Personally, I don’t think there is any answer to that question other than the Bill Coleman response. His answer was that BEA is committed to running the stack faster than any of their competitors. Meaning, they would push initiatives into the commons faster, raising the high-water mark ever higher, and moving up the stack further with innovative enhancements.

Besides, when working with a commons infrastructure like the Open Internet, drawing the line is as much determined by competitors as it is by proprietary business interests. For instance, both IBM and Adobe are rushing to convert their proprietary systems to be fluidly interoperable Open XML frameworks. (I say “Open XML” because not all XML initiatives are open, unencumbered and unrestricted.) They agree on Open XML technologies below the line, and compete above. They agree on things like XForms, OpenDocument XML, XBRL, and Web Services. (Adobe used to agree on SVG, but that may have changed with the Macromedia acquisition.)

With Adobe’s support for WorkPlace, IBM’s cross-platform application environment, the two have even agreed on a common end-user interface into their applications and server-side systems.

Where they compete, though, is higher up the stack, where Adobe’s LiveCycle initiative truly threatens the heretofore untouchable Lotus Notes collaborative “intelligent” document platform.

From a customer’s point of view, this newfound interoperability between IBM and Adobe is great news. That it all takes place within a common XML framework is something that should get everyone’s attention.

As for IBM and Adobe, I think they have much more to gain from this interoperability than they have to lose from otherwise competitive challenges. Determining and forever reevaluating where this balance lies is the answer to the question of where the line is drawn. It’s also the reason why we are seeing more people like Duane Nickull. He represents a new breed of vendor executive: one able to first construct, out of existing services, highly integrated stack models embracing common infrastructures; then navigate the Open XML and Open Internet forums to present key interoperability initiatives; and finally to successfully arbitrate the differences in ways that fully validate and return the investment and trust of an anxious customer base.

There is a difference between “interoperable” and “integrated”, and I think it’s here that the importance of Open Source truly comes into play. Open Source communities provide highly interoperable components. Community-friendly vendors like Novell, IBM, Red Hat, Sun, Progeny, Xandros, Lindows, TiVo and SpikeSource provide integrated solutions based on interoperable components.

Microsoft, IBM, Adobe, Novell, Oracle and Sun all provide integrated stacks. The Sun integrated stack starts at the lowest level of binary compatibility and extends to the top of the enterprise heap. Microsoft starts at the OS level and is just now leveraging the desktop monopoly to work its way up into the enterprise stack. IBM is building an extremely flexible, if not galactic, integration model. What separates all integrated stack offerings is the measure of open interoperability at the component level. Integrated stacks built on Open Source components have an extremely high level of open interoperability, including, for some components, the future-proof, forever-open guarantee of the GPL. In contrast, the Microsoft XP stack is based on a cascading entanglement of interfaces, protocols, applications, developer tools, frameworks and server suites bolted together in ways unseen, unknown, or only partially known to those with permission. This approach works very well for Microsoft, but it is high risk for everyone else caught up in the Windows ecosystem.

Given a choice, everyone would choose some variation of the Open Stack model. What’s surprising is that even when not given a choice, and in the face of mounting barriers, outright intimidation, and even legal threats, the marketplace is moving to Open Source. And if they can’t make the leap to Open Source, then they are demanding at the least, Open Standards.

IMHO, the Open Internet spawned and made possible the digitally collaborative communities known as Open Source. And although the idea of the Internet preceded the invention of UNIX by four years, many would argue that it is UNIX that made the Internet possible. Perhaps the more important point to be made here, though, is that the Open Internet and Open Source share the UNIX model of piping together loosely coupled components.

Command line piping may be the tradition, but there is reason and ecosystem logic as to why Open Source components are so highly interoperable and easy to integrate. On their own, the efforts of Open Source communities are useless. It is only when they are piped together or professionally configured into integrated stacks that they make sense. The law of the Open Source jungle is interoperate or else. No community is an island. Cooperating with the ecosystem is the difference between having purpose or being banished.
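The pipe model invoked here is concrete enough to show in a few lines of shell. Below is a minimal sketch of the UNIX approach described above: four small, single-purpose tools composed into a pipeline, where a plain-text stream is the only contract between stages. The log lines are invented purely for illustration.

```shell
#!/bin/sh
# Summarize which paths a (hypothetical) web server was asked for,
# by piping four independent, single-purpose tools together.
printf 'GET /index.html\nGET /about.html\nGET /index.html\n' |
  awk '{print $2}' |   # keep only the requested path
  sort |               # group identical paths together
  uniq -c |            # count each distinct path
  sort -rn             # most-requested first
```

Any stage can be swapped out (`cut` for `awk`, a network stream for the `printf`) without the neighboring stages knowing or caring, which is exactly the loose coupling credited to Open Source components here.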

With the Open Internet we might describe and measure this interoperability in terms of Open Interfaces, Open Communication and Messaging protocols, Open Runtime Engines and Libraries, and Open XML technologies. What the new breed of vendor executives bring to the table is the vision and skill set needed to navigate and build on these terms of interoperability.

One might argue that Microsoft trounced Apple because they brought about a platform ecosystem far more inclusive and open to participation. In selling the Windows platform to software developers, hardware vendors, and consumers, Chairman Bill made a set of promises, all of which needed to be fulfilled before the Windows ecosystem could form and reach critical mass. He promised an open hardware reference platform that would ease application development and commoditise everything below the OS. He promised developers above the OS a level playing field defined by an Open Windows API. He even went so far as to assure everyone that there was a Chinese Wall between the Windows OS division and the Microsoft Applications and Developer Tools divisions. And he promised mass distribution of the Windows OS, creating a common infrastructure for digital solutions.

The problem Microsoft now faces is that the incredibly robust ecosystem they built now finds it more important to be part of the Open Internet than subservient to the Windows API. Part of this is due to the fact that Microsoft has proved time and again that they can’t be trusted as a common carrier of a digital ecosystem much less as the foundation of a newly emerging digital civilization. Part of it is due to the fact that Microsoft ruthlessly used their developer network to grow new market categories, only to swoop in later to seize all opportunities when the category finally became profitable. And part of this is due to the simple fact that the Open Internet has such a low barrier to entry, but high level of quality participation, that it was beyond unstoppable as a universal platform of connectivity and collaborative computing.

Even though the great herd of Windows users remains tethered to Microsoft, it is the Open Internet API that developers write to. Sure, you can cloak the Open Internet in .NET and MS XML garb, but they are always dancing around something owned by none, used by all. Chairman Bill pulled out all the stops to crush Netscape and Java, and yet the Open Internet invasion continues. The time-tested method of “embrace, extend, extinguish” just doesn’t work well with a technology platform built precisely to route around barriers.

From the get-go, the Apple iPod strategy has been to erect barriers and build a marketplace within the confines. Nothing new here. At least Apple is up front about it. Microsoft’s strategy is one of deception. They hope to erect unseen barriers so that by the time the great herd realizes they’re trapped, it will be too late to route around the problem. It’s worked before and it might work again. Never underestimate the great herd’s penchant for being deceived. The greatest threat to the Open Internet isn’t Microsoft. It’s the great herd’s determination to keep their heads down and their grazing uninterrupted, even while they’re being stampeded over a cliff.

And then there are those who respect the Open Internet and are willing to take their chances “running the stack”. Personally, I don’t think there is any way Microsoft can stop IBM, Novell, and Adobe from cutting into the great herd at the desktop level with WorkPlace alternatives, and moving the herd en masse to a dependence on Open Internet friendly APIs as the foundation for just about everything they do. We shall see.

Categories: Open Standards.

  • http://www.redmonk.com/sogrady sogrady

    having had exposure to the TwC group very early along, and having gotten to speak with folks like Scott Charney, i certainly wouldn't say Microsoft hasn't learned those lessons. it has, and the improvements are growing more apparent.

    that said, it's my contention that the "Integrated Innovation" approach that is core to MS's approach brings with it certain risks not inherent to other systems, simply b/c tight couplings almost invariably introduce additional security risks.

    in other words, you can and have gotten better, but the approach itself introduces risks.

  • http://blogs.msdn.com/mikechampion Mike Champion

    “The problem Microsoft now faces is that the incredibly robust ecosystem they built now finds it more important to be part of the Open Internet than subservient to the Windows API.” Uhh, I believe Chairman Bill figured that out about 10 years ago and announced it to the world on December 7, 1995.

    “Even though the great herd of Windows users remains tethered to Microsoft, it is the Open Internet API that developers write too. Sure you can cloak the Open Internet in .NET and MS XML garb, but they are always dancing around something owned by none, used by all. ”

    My sense is that the Open Internet API is *not* what mainstream developers write to. If people did write to the lowest-level open APIs rather than to APIs like XmlHttpRequest, why didn’t the “AJAX revolution” occur years ago? It wasn’t until convenient APIs for HTTP, DOM *extensions* that minimize the pain of raw XML, etc. became de facto standardized and ubiquitous that “Web 2.0” took off, even though the fundamental levels of the stack have been around much longer.

    “Microsoft’s strategy is one of deception. They hope to erect unseen barriers in hopes that by the time the great herd realizes they’re trapped, it will be to late to route around the problem.” Astonishingly enough that’s not the way I see it :-) The unseen barrier IMHO is the immense complexity of the “Open Internet” ecosystem — not the APIs, formats and protocols themselves, but the way they interact in ways that few really understand, the way they offer their power to the various vandals and scammers who pervert their potential, and so on.

    The common thread running through what most of us do (at MS, everywhere else I’ve worked in the last 10 years or so, and from what I see at most commercial software companies and open source projects) is to provide a level of convenience and security on top of that ecosystem that ordinary people can use to do real work. That creates a tradeoff for end users, to be sure — some narrowing of options in return for comfort and convenience. That’s true for commercial and F/OSS software – projects as well as products can become overhyped and sweep the herd toward a cliff.

    So, success will go to those who offer the most real value on top of the Open Internet, and those who help it evolve as new possibilities emerge. That’s what I see people doing at Microsoft, and I’m sure that’s the attitude at other places that are really helping the Web and related ecosystems evolve.

  • http://www.redmonk.com/sogrady sogrady

    hey mike,

    as an aside, you’re one of the individuals i had in mind when i was referring to the respectful discourse we’ve had in this space, and this post keeps that trend in line so thank you.

    anyhow, good points all around, but as you might expect we disagree in a few areas ;)

    “Uhh, I believe Chairman Bill figured that out about 10 years ago and announced it to the world on December 7 1995.”

    i’m not sure i agree with this. it could be argued, i think, that one of the reasons that Ajax was not pushed actively by Microsoft given its early lead in the space was the fact that it undermined the Windows API in favor of a thin client, web services based approach. while MS has definitely “gotten” the internet and done an admirable job of realigning the business around it, i think it’s a fair statement that much of MS’s behavior is about fiercely protecting the Windows platform from more heterogeneous alternatives. not that i’d expect MS to do anything differently, as you’ve made a bit of money in that area :), but it’s not the same as targeting the “open internet” that Gary refers to.

    “My sense is that the Open Internet API is *not* what mainstream developers write to.”

    if you’re referring to the Open Internet API as a single specification, i agree. but while i won’t speak for Gary here, i believe the Open Internet API to instead be a collection of open interfaces that leverage the existing infrastructure of the web. in James Snell terms, it’s chmod 777 web.

    “The unseen barrier IMHO is the immense complexity of the “Open Internet” ecosystem — not the APIs, formats and protocols themselves, but the way they interact in ways that few really understand, the way they offer their power to the various vandals and scammers who pervert their potential, and so on.”

    the complexity and security concerns, i think, are fair pushback on a set of wildly proliferating interoperability options. i tend to believe that will be self-correcting at some point, but there’s some truth in that statement.

    “The common thread running through what most of us do (at MS, everywhere else I’ve worked in the last 10 years or so, and from what I see at most commercial software companies and open source projects) is to provide a level of convenience and security on top of that ecosystem that ordinary people can use to do real work. That creates a tradeoff for end users, to be sure — some narrowing of options in return for comfort and convenience. That’s true for commercial and F/OSS software – projects as well as products can become overhyped and sweep the herd toward a cliff.”

    given that i’ve been fairly critical of “Integrated Innovation” in the past, it shouldn’t be a surprise to hear that i don’t buy this. i’ve been listening to the “integration as a means of simplifying things for customers” arguments for a long time, and while i think there are certainly instances where this is true, i think it obscures the fact that it actually introduces complexity (via unanticipated dependencies) and, in many cases, creates lock-in. again, these are at least marginally justifiable business decisions, but not ones that i think can be universally portrayed as serving customers. i’d caution against using security as an example, as well – i don’t think it’s viable to argue that integrating IE into Windows was a good security decision.

    so while i understand the point – and certainly think there are instances where it’s true – on the whole, i don’t buy it.

    “So, success will go to those who offer the most real value on top of the Open Internet, and those who help it evolve as new possibilities emerge.”

    surprise :) we’re in total agreement here.

  • http://blogs.msdn.com/mikechampion Mike Champion

    I agree that MS should be a little more genuinely fond of that loose coupling dogfood that it recommends to selected audiences. I suspect that will be one lesson people take away from the lllllooooonnnnnnnnggggggg time it took for the Whidbey/Yukon wave to ship. Still, loose coupling isn't a universal elixir for IT happiness. For example, it was plain to me a couple of years ago when I (ahem, temporarily) switched to OS X that the real power of the Mac platform came largely from its tight coupling to the hardware. Better integration with things like wireless adapters than Windows laptops of that vintage had, but then again you have utter lock-in to Apple hardware. A devil's bargain or the price of all that elegance? Opinions differ.

    The same thing happens all over. For example, supporting multiple platforms for your product is a great marketing story, and architecturally feasible, but it generally comes at a performance and development-complexity cost. The performance overhead of XML, for example, is quite steep (about 10x) compared with a custom format. Is that a devil's bargain? Some say so, and this is a hot topic these days. (Shameless pitch – I'm chairing a panel discussion on Efficient XML at the XML 2005 conference.)

    From my tiny little corner of the empire, it looks like the real strategy is to support both tight and loose coupling scenarios. Those who need the vendor/platform/version/etc. neutrality can use the XML, HTTP, web services, etc. technology in which MS invests heavily. Those who don't need it don't have to pay the performance/complexity price of all that Open Internet API stuff, but of course have a more expensive migration path if they decide to abandon the fold. Ya architects yer software and ya takes yer choice; MS will be very happy to sell the infrastructure for either.

  • http://blogs.msdn.com/mikechampion Mike Champion

    ” i’d caution against using security as an example, as well ”

    Fair enough. I think (or at least hope) that no one still employed here would think that the decisions made in the mid-’90s were correct from a security viewpoint. I’d caution, however, against assuming that Microsoft hasn’t learned these lessons extremely well.

  • Pingback: James Governor's MonkChips
