One deal closing later, and we're back...
You Didn't Ask This Either
Before we get to the 'Web 2.0 disrupts VC' story, let's dwell for a second on how standardized software architectures have fed, and are still feeding, the VC model. Using the language from Part One, here's the argument: Increasing software layering and standardization should lead to reduced transaction costs, which should push some amount of innovation from the hierarchical model into the contract model. In other words, projects that formerly were in-house, one-off software jobs should move to external purchase of standards-based solutions, some of them hopefully from VC-funded growth companies. Do we see this?
Lots of it. Take the analytics and business intelligence markets, for example. These don't even depend on web-related technology; they emerged in part from the standardization of database architectures around the relational model. RDBMS was news back in my grad student days, but it's now so commoditized you can get it for free, as MySQL. The consequences of a consensus on core data models have included the emergence of data warehousing, OLAP, ETL, analytics software, and the rest. Plenty of good growth investments there - just ask Larry Ellison.
The coming of web-based standardization has continued this migration away from in-house solutions. Check out SFA, CRM, SCM, or your other favorite TLA for enterprise software. There's been plenty of movement to externally sourced products or services, even before the shift toward XML-based data representations, simply as a consequence of a ubiquitous TCP/IP transport infrastructure and the use of HTML clients as a uniform presentation layer. We can look for newer architectural improvements to continue chipping away at the bastions of in-house development, and to deliver new growth markets to the software VCs.
Show Me The Entry Barriers
I've yet to write the apparently obligatory 'Top 10' list for VCs or entrepreneurs. My one abortive attempt in that direction did yield a slogan that seems apropos here. Let's call it Tim's Tautology:
"If you can get venture capital, you'd better take it."
Like all slogans, it's oversimplified. It tries to capture one essential issue: If you're creating a business in a market that fits the VC model, one of your competitors will take the money. And use it against you.
How?
One qualifier for VC investment is that the startup should be able to employ our capital to create an entry barrier for following competitors. Look at it this way: We know how you're going to make things easier for imitators: You're going to prove it's possible and remove uncertainty for a fast follower. So tell us how you're going to make life miserable for the imitator, to balance off that advantage and more. It can be a lot of things - intellectual property, brand, tying down major customers, hiring irreplaceable talent, creating a network effect of some sort. But it had better be there - we're not in the business of helping your competitors. And if you could do it, but don't take the money to make it possible, someone else may do so. Don't take my word on it. Here's an old acquaintance of mine, agonizing over whether or not to take the money. As he puts it: "Because of the threat of VC money in other people’s (competitors) hands. Kinda circular isn’t it?" Yup.
Before moving on, be careful to notice that entry barriers aren't the same as transaction costs. A trivial example is crude oil. Low transaction costs: standardized grades, a liquid market including futures, worldwide trade flow. Plenty of entry barrier: the price to play efficiently is an oil field or two.
Net Effects
Now it's time to go after what folks are actually discussing: damage to the VC model in software and services innovation, originating from the fuzzy cluster of technologies called Web 2.0. There are in fact two interrelated, but distinct, hypotheses lurking in this notion.
Hypothesis One: Web 2.0 technologies reduce the transaction costs of software innovation by making integration of the innovative element into the existing ecology of web services and user experience simpler and less risky, allowing the customer to adopt easily within a familiar framework. The effect is that the "content"-related transaction costs of a new venture go down. The standardized "VC container" for innovation that looked efficient when compared to corporate R&D labs now looks like useless overhead.
Hypothesis Two: Web 2.0 technologies also reduce the entry barriers (fixed costs) of innovation. The same architecture that allows simpler, less risky boundaries around the innovation also allows a particular market territory to be more finely dissected. The ante to build a useful online service isn't the creation of a new CompuServe or AOL, or even a general-purpose search engine like Google; instead it may be a specialized utility such as Megite. You don't build a whole walled-garden location-based content, directory, and advertising system. Instead, you fire a rifle shot like Plazes. So what if they're features or products, not companies? Perhaps that's the VCs' problem, not theirs.
Both hypotheses point in the same direction. Forgo the outside capital with its constraints on quantum size, necessity and timing of exit, and the like. Build your innovation on sweat equity, savings, credit cards, and a little bit of angel capital, and either run it or flip it at 'Stage One'. It's the new Long Tail of innovation, where the VCs are just like the MSM - busted.
Sounds like we software and services VCs are toast and had better find a new gig, or at least resign ourselves to a future of scratching for new enterprise TLA categories. Say goodbye to those fun mass market plays ;( The wisdom of the net has spoken!
What Could Go Wrong?
Well, quite a lot, actually. There is a repeated pattern of failure in such visions. It may be different this time, but some questions need to be answered. The failure pattern is to focus on the variable or fixed costs that are being reduced, while glossing over others that are waiting just behind, or are created by the 'solution'. Take another look at the transaction costs article. Count up the different cost elements. It's a long list, and not all are given explicitly. Here are a few failure cases in point:
- Some acquaintances of mine were involved in the creation of AMIX. In the late '80s it was "the world's first on-line market for information and expertise". It was also an abject failure. While it addressed many of the mechanistic elements of transaction costs, it required participation in a closed, proprietary system. In so doing, it blew a participant's specificity costs through the roof.
- How many virtual micro-transactions have you done today? From Xanadu to DigiCash to BitPass, the notion that we will buy information in small, coin-sized chunks has been revived with each new generation of technology promising to reduce settlement and other costs while increasing the scope of applicability. But none of it does much for the cognitive overhead of search and trust, which become the new transaction cost boundaries, keeping microtransactions a micro-market.
- Eight years or so back, we could see that all those inefficient, old-boy-style, bricks-and-mortar distributors were about to be 'disintermediated'. Startups were going to create marketplaces for everything from construction material to genetic material, and run right around them. Well, maybe not. In many cases, long-term relationships - the use of interpersonal trust to mitigate transaction costs - were more important than the efficiencies of a bid marketplace. The customers waited for their traditional suppliers to catch up, and the would-be marketplaces died.
There is a further pattern to these failures: Engineering-oriented entrepreneurs tend to overlook the human elements of transaction costs, both at the level of individual limits and habits, and at the social level of interconnections among people. Anyone attempting a play around reducing transaction costs should have a trusted analyst who has not consumed the Kool-Aid look at what hidden costs will emerge thereafter.
The 'dark side' is also a potential failure mode. Running some 'Red Team' thought experiments on how the Web 2.0 world will be polluted by bad actors is worthwhile, because every successful market breeds parasites. See, for instance, my earlier post on Index as Process.
Brave Old World
But, let's assume these misgivings are misplaced. Have we been this way before? Not exactly, of course, but there have been situations in which transaction and entry costs were lower, and their outcomes may be instructive.
For those of us of a certain age, there's early microcomputer software. We're talking S-100, Apple II, and C64 here. Products were simple because the machines were simple. Compatibility (other than with the platform) wasn't an issue - this was before PCs were business machines. The ante to create a product was rather modest. Packaging... remember software in baggies? But finding what you needed, that was a real problem. How did the story come out? Distributors - both branded software publishers and computer stores - made out. For the independent software author, it became a hit-driven market.
Videogames have followed a similar path. Early consoles had limited capabilities; you didn't need a render farm to put together Centipede. Entry costs were relatively modest. Compatibility outside the platform was a non-issue. And there are a lot of people who like the idea of writing games - plenty of supply of creativity. The outcome? The big winners are the platform vendors (license fees) and, again, branded software publishers. Game writing for independent shops is often a hand-to-mouth existence, and folks like Electronic Arts have been slowly rolling up that market. Note that VCs seldom play in this market: risk mitigation is through control of distribution, not portfolio theory, and we don't have distribution.
Let's try VB (Visual Basic) controls, and their relatives written in other languages or for the .NET framework. Component software, just like what was talked about in the early '90s, except it's not on a fashionable platform. A useful business, a profitable one for some. And nearly a cottage industry (a friend of mine runs one of the larger shops). Who makes out in this market? I think their HQ is in Redmond.
Cui bono?
So we've got some indications that when entry costs are low (presuming a platform) and strong standards are reducing transaction costs, those coming out ahead are the branded distributors and platform vendors.
But in the case of Web 2.0, we don't have a platform vendor, do we? It's a lot of open source, and anyway you can mash up all sorts of services and components from everywhere. And distribution, it's just the net, right?
Not so fast. There's this thing about reaching customers. Just like the 10th anonymous baggy hanging on a peg in an early computer shop, if there's nothing special to distinguish your service and raise it above the noise, you've got a problem. And if this is a list of Web 2.0 ventures in Canada alone, for Pete's sake, there's gonna be a LOT of noise.
Further, you can do these lightweight component services because they plug into other services: directories, search, geolocation, etc. So you need to pick a partner to turn on your services. Who will you pick? Would you perhaps call that a services platform?
Lastly, there's that dark side stuff I mentioned. The phishers, spammers, scammers, directory harvesters, BIN rollers, DDOSers, port scanners, and all the rest of the filth of mankind translated to the net. How much overhead of that sort can you absorb while keeping your project lightweight, yet not build something that will fall over as soon as it gets big enough to draw the attention of this disgusting lot?
Bottom Line
And the winners are: Google, Yahoo, and perhaps a very few others - Amazon? eBay? Branded distribution and services platform all rolled into one. Enough scale to hold off the dark side without bringing down the business model. Quite happy to have a lot of latter-day ISVs building out ideas that can be easily aggregated onto their platform feature set, when and if they prove to have an audience. This position might be available to some of the legacy media franchises, were they to aggressively convert their business models and audiences. With a few exceptions - Murdoch, Malone - they prefer their position as arrogant cash cows.
And the VCs? Well, I think those saying our conventional model will have trouble with this lower 'quantum size' end of the market are likely right. You'll notice that the examples I picked above have something else in common: very little venture capital in the category. Don't worry about us; we'll go snatch some more crumbs from the corporate in-house R&D folks. Since it's a highly adaptable business, some of the smaller capital funds that can 'move the needle' with smaller investments may come up with innovative ideas to knock down the overhead as well.
But I wouldn't start cheering if I were a software entrepreneur, though that's been the tone of some of the posts I've read. A breakdown in the VC model suggests less demand for equity in the componentized software and services category. Meanwhile, lower entry costs suggest more supply than heretofore, which seems to be borne out by the plethora of 'Web 2.0'-tagged innovations floating around. (Update: See Peter Rip's on-point take on this issue.) This makes suggestions such as this one, that VCs will partially cash out entrepreneurs early, just plain silly. That might happen if the supply/demand balance were strongly shifting the other way - it's not.
As always, be careful what you wish for. From here, the future of much of the 'Web 2.0' activity looks decidedly low-rent. A lot of Stage One ventures (in my Two Stage metaphor) fueled by savings, credit cards, angels, and friends and family. Likely a hit-driven future, with the winners decided by some combination of the service platform owners and VC specialists doing roll-ups and other strategies to aggregate enough functionality and market impact to warrant a Stage Two investment.