(NB: I am moving some years-old writings into Due Diligence for future reference. This piece originally appeared in early 1997 on Howard Rheingold's old Electric Minds site. Little or no effort has been expended on cleaning up links broken in the intervening decade.)
Last time I wrote about platforms in general, and how they can create
great value for society and businesses. Now I'll focus on a battle for
platform definition that has been going on since the late seventies. The
objective for the warriors may be stated as:
Share dominance in the specification of programming and user interfaces
for general purpose computing and communications, leading to a role as
de facto market definer for all other players.
There are other platform wars out there, but this is the one that counts.
The compounding experience curves and economies of scale in
general-purpose "PCs" have wiped out entire industries devoted
to special-purpose computing: dedicated word processors, LISP-coded AI
machines, and CAD-CAM workstations, to name a few. Special-purpose graphics
and transaction machinery may be the next to go under [Update 2003: Yup.], and even telephony
vendors are looking over their shoulder. If you want to play for all the
marbles, this is the game you play.
An Aside to Non-Geeks
To make some sense of what follows, you need to know a little about how
computer systems are built; in particular, you need to know about layered
architecture. Most every system is constructed of components that are
logically stacked on top of each other. As an example, let's do a simple
analysis of the system you must be running to see this:
You're connected either by a modem and phone line, or by a local area
network of some sort. That's your network layer. It provides a standard
facility and interface for moving data, called TCP/IP, to your web
browser, which is blissfully ignorant of what kind of network you have:
that's logical independence. In turn, the web browser provides a standard
display definition and interface called HTML [Update 2006: That layer's gotten quite a bit more complex since 1997 - and thereon hangs a tale...]. I wrote this column using
HTML, taking advantage of its logical independence, so that I don't have
to produce a different version for every browser out there. We've now
defined a system in three layers:
- Network connection, presenting a TCP/IP interface.
- Browser, using TCP/IP, presenting an HTML interface.
- This page, using HTML.
While the techies who've followed this know that I have glossed over many
intermediate layers and supporting functions, my example should illustrate
the following points:
- Computer systems are architected in layers.
- Each layer abstracts and hides its actual implementation, presenting a standard interface.
- The specification of the interface between the provider of service
(the bottom layer) and the user of service (the next layer) fulfills the
platform definition of interoperation without special arrangement.
In an economic sense, each layer's interface provides value by removing
specificity costs between the implementation of the layer and its users.
The Great Game is fought out over the definition and control of these
interfaces between layers, which you may also hear called APIs (application
programming interfaces), protocols, or standard data formats. A computing
and communications platform consists of a number of such APIs, dedicated to various functions
in the system.
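The layering described above can be sketched in a few lines of code. This is a minimal illustration, not anything from the original column: all class and method names here are hypothetical, chosen only to mirror the network/browser/page stack from the example.

```python
# A toy model of layered architecture: each layer presents a standard
# interface and hides its implementation from the layer above.

class Transport:
    """Network layer: the standard interface (think TCP/IP)."""
    def fetch(self, url: str) -> str:
        raise NotImplementedError

class ModemTransport(Transport):
    """One implementation: modem and phone line."""
    def fetch(self, url: str) -> str:
        return f"<html>page at {url} (via modem)</html>"

class LANTransport(Transport):
    """Another implementation: a local area network."""
    def fetch(self, url: str) -> str:
        return f"<html>page at {url} (via LAN)</html>"

class Browser:
    """Browser layer: consumes the Transport interface, presents HTML.
    It is 'blissfully ignorant' of which network sits beneath it."""
    def __init__(self, transport: Transport):
        self.transport = transport

    def render(self, url: str) -> str:
        html = self.transport.fetch(url)
        # Strip the markup to present the page to the user.
        return html.replace("<html>", "").replace("</html>", "")

# Swapping the bottom layer changes nothing for the layers above it:
# that independence is what removes specificity costs.
for net in (ModemTransport(), LANTransport()):
    print(Browser(net).render("example.com/column"))
```

The point of the sketch is that `Browser` is written once, against the interface, and works unchanged over either network implementation; the interface is where the platform value (and the fighting) lives.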
How To Make War
There's no manual of strategy and tactics for this kind of war. The
following set of strategies is my extrapolation from moves I've seen
repeatedly over twenty years in the industry. There is no canonicity to my
names -- they're just meant to evoke the right image when I use them
again. Along with the definition, I'll mention why the move is (maybe) good for its practitioner and (maybe)
good for the market or society at large.
Head-On Competition
This is pure competition for market share between solutions with similar
"top layers," providing more or less the same value: Word versus
WordPerfect; Oracle versus SQL Server versus DB2; LAN Manager versus NetWare.
Here's where marketing and financial power count. Brand power,
advertising, distribution leverage, margin slashing, and bundling are
weapons of choice. This is also the purest case of increasing returns for
both vendor and market, as I discussed last time. It's therefore hard to
come from behind with this strategy, unless your competitor screws up.
(Which does happen: heard much about dBase lately?) But if you're behind,
you're more likely to attempt one of the following moves to avoid a head-on fight.
Accretion: Adding a layer
Rather than go head-on, the competitor may opt to add a new architectural layer to the
platform, providing new capabilities. The benefits of reduced specificity
will extend to this new function within the platform, providing increased
value to its users and hopefully securing their continued loyalty. The
hope is also that the competing platform(s) will begin to lose market
share to the newly improved competitor, and that their purveyors will then
be forced to add the new architectural feature at a time and cost
disadvantage. An example of accretion was Apple's addition of easy-to-use
desktop publishing features to the Macintosh in the mid-eighties.
Accretion works best when the newly added function is one that is already
frequently used, but not yet standardized, among the users of the platform.
Since the need has already been filled to some degree by developers
extending the platform, they are often the victims of the accretion
strategy, while the end users typically get more functionality for the
same or less.
An accretion move can also be used aggressively, by attempting to
establish the same functional layer as a beachhead on top of the
opponent's platform. If the market accepts this move, the competitor's
ability to control how the function is added to his platform may be
reduced. In the best possible outcome, the increasing returns of adding
the function on all platforms may cause the first mover to dominate the
architectural layer. Examples of aggressive accretion moves include
Apple's move of QuickTime onto the Windows platform, and Microsoft's
support of OLE (via MS Office) on the Macintosh.
Commoditize a Layer
Suppose you've had the misfortune to lose control of a layer to a
competitor, who is thereby able to gain share and margin and to restrict
your ability to evolve your platform. Or perhaps fast-moving startups have
opened up a new area of functionality that's becoming important enough to
threaten your own definition of platform. If you have a good profit base
from the rest of your platform, you might want to commoditize the layer
where the competitor or startups are growing. Consider publishing your
specification of the layer free to all comers, or giving it to a standards
group, or adopting an already commoditized "open" alternative, or even
giving away your implementation.
Though you may not be able to recover your costs, the resulting margin
drops will reduce the benefit of the win to your competitor, and may
implode the upstarts' business models. Your users will be happy -- they're
getting something for (nearly) free.
Microsoft practiced this move when it adopted TCP/IP as a network solution
after losing this struggle to Novell. It's also used in a less obvious
fashion by tool vendors (such as Macromedia) when they give away the
players for their architectures. This has the effect of giving immediate
value to end users, while flattening competitors whose business models
required pay-per-copy to survive.
A commoditized layer in an architectural stack can also create a
"firebreak" that prevents players in lower levels from directly attacking
those above the commoditized layer. A router provider might create a
proprietary network protocol guaranteeing low latency, but would have great
difficulty leveraging the product through the commoditized TCP/IP layer to
create a successful proprietary networked gaming platform: game developers
would simply be unwilling to walk away from the size of the intervening market defined by TCP/IP
(the whole Internet) in order to gain the advantages of the proprietary approach.
One of this strategy's risks is the loss of control that may be suffered
if an "open" solution is adopted, or if the proprietary specification is
turned over to an independent group: the resulting market may take on a
life of its own. By the same argument as above, this market in the
commoditized layer may have the effect of reducing leverage from one side
of it to the other; an extreme example could result in the disaggregation
of a vertically integrated platform.
Hollowing Out
Some platforms, like the Macintosh, include all the layers from hardware
to user interface in one vertically integrated stack. But it's not
necessary to own all the layers to have a viable platform. Instead, one
can create a standardized downward interface to the supporting hardware
and other layers. If the platform is sufficiently popular, vendors will
compete to create systems that support the downward interface, and the
volumes and learning curves in that market will create great leverage for
the owner of the platform.
CP/M was the first platform to adopt this approach in microcomputers;
Windows is the current equivalent. Although this strategy is often referred
to as "being open," I'll call it "hollowing out" to distinguish it from the
UNIX flavor of open, which is itself more a variant of a commoditization strategy.
Although hollowing out can create great leverage for the platform owner,
and lower total prices to end users, there are hidden risks. The platform
owner must retain firm ownership and control of the layers between the
downward interface and the interfaces upward to the end user market.
Failure to do so may result in an opponent taking effective control of the
market defined by the downward interface.
The downward interface is always a point of struggle. Those who work below
the downward interface (think "clonemakers") are intensely motivated to
break through to the end user market by providing features that step
outside the existing scope of the platform. Consider how Intel
periodically attempts to define systems-level elements in the PC platform,
and is just as regularly slapped down when Microsoft defines its own
reference platform and covering APIs for the same functions.
A "hollow" platform is rarely as smoothly integrated as a vertically
integrated platform. Anyone who has set up both Mac and PC systems has
seen this at first hand. If the platform provider moves too slowly,
features may appear in systems below the downward interface that are not
directly supported by the platform, and if they are sufficiently valuable,
developers and users will utilize them directly, rather than waiting for
the platform to catch up (think "SoundBlaster"). The direct linkage
between supporting systems and end market begins to erode the usefulness
of the platform in reducing specificity, offering opportunities to
competing vertically integrated systems.
Change the Conversation
If you're losing an argument, try to change the topic. The most effective
come-from-behind move is to change the market's notion of what constitutes
a viable platform, and hopefully to make the incumbent's share look irrelevant.
Changing the conversation is the judo throw of platform war. When it works,
it's dazzling; when it fails, it's pitiful. There are three ways to change the conversation:
Change the scope. Redefine the functional scope expected of a platform or
the market area it should cover. "The network is the computer" was Sun's
attempt to change the scope of conversation. Network computers are a
current attempt [Update 2003: They're MIA.].
Define a new genre. If you can invent a really new platform, you will
start with the leading share. Occasionally it works: Visicalc. More often
it flops: Newton. When you read a press release that says something like
"defines a new category," this move is being attempted. Maybe 1 percent of
these genre-definers are really in the running; the rest are just
confessing that they are losing in their home category.
Create a new top layer. If you can define a new function that is not only
valuable to a large market, but also spans multiple existing platforms,
you might have a fighting chance to steal control by superimposing a layer
on all of them. The owner of the new layer gains the benefits of network
effects and reduced specificity, while the erstwhile platform players may
be reduced to struggling below the new downward interface. Netscape would
like their story to read this way [Update 2003: 'would have liked'].
Even partial success in these moves can gain a competitor new life, if a
large enough part of the market finds the increased utility in the
expanded scope or functionality to be more important than the specificity
costs induced by moving away from the dominant platform. One platform
company has spent its whole life making this type of move -- can you name it?