Open source = cheap input?

Nicholas Carr, commenting on Larry Ellison’s raid on Red Hat:

It’s always been clear that the [Open Source production] system, however you view it, imposes an economic vulnerability on the profit-making companies that engage in it. Those companies have to pay labor costs for developing a free good, a public good that they have no proprietary control over. Their rivals can reap the fruits of that labor without having to pay for it. That creates, in theory, a dangerous asymmetry in competition.

But what hasn’t been clear is whether that vulnerability actually matters, whether the danger that exists in theory also exists in reality. Are there economic or other barriers that prevent competitors from capitalizing on the investments of the open-source companies?

We’re about to get a lot closer to an answer to that question, thanks to that great clarifying force in the technology business, Larry Ellison. Yesterday, Ellison announced that his company, Oracle, fully intends to eat the fruits of the labor of Red Hat, the leading for-profit supplier of the open-source Linux operating system. Oracle is taking the version of Linux developed by Red Hat and distributing it under its own brand, as “Unbreakable Linux.” And, in a stab at Red Hat’s very heart, Ellison claims that Oracle will substantially undercut the open-source firm’s prices for supporting the software.

It seems like a claim that shouldn’t be hard to fulfill. After all, Oracle doesn’t have to pay those labor costs.

Once open source became a business, rather than a movement, the rules changed. Larry Ellison, who’s nothing if not a non-sentimentalist, understands that, and he doesn’t particularly care what “the community” thinks. His attack on Red Hat would never be called neighborly, but it is, as Business Week’s Steve Hamm puts it, “a ruthless and brilliant act of capitalism.”

It’s also something more. It illuminates a much broader and deeper tension in the digital world, a fault line that runs not only through the software industry but through every industry whose products or services exist, or can exist, as software. The tension is between social production and the profit motive. Volunteer labor means something very different in the context of a community than it does in the context of a business. In the context of a community, it’s an expression of fellowship, of the communal value of sharing. But in the context of a business, as Ellison’s move illustrates, it’s nothing more than a cheap input. Many of the most eloquent advocates of social production would prefer it if this tension didn’t exist. But it does, and it’s important.

This one will run and run. Carr is unduly impressed by the Ellison ploy, I think. There are subtle penalties for bad behaviour which the Oracle boss, being a corporate bully, is unlikely to understand. It’s worth remembering that when IBM, itself no slouch at the deployment of brute corporate force, decided that it would put Open Source products at the heart of its own offerings, it explicitly decided that good behaviour would be key to success. And IBM has, by and large, been a good neighbour in the Open Source community. It has also prospered mightily from it.

Microsoft vs. Open Source

Two Harvard economists have built a model to elucidate the battle between Windows and Linux. There’s an interesting interview with the authors in which they discuss their findings.

Their conclusion?

Our main result is that in the absence of cost asymmetries and as long as Windows has a first-mover advantage (a larger installed base at time zero), Linux never displaces Windows of its leadership position. This result holds true regardless of the strength of Linux’s demand-side learning. Furthermore, the result persists regardless of the intrinsically better design and potential differential value of Linux. In other words, harnessing demand-side learning more efficiently is not sufficient for Linux to win the competitive battle against Windows.

Having obtained this basic result, we investigate the conditions that will warrant that Linux ends up forcing Windows out. We do this by modifying the model in two ways. First of all, we look at the effect of having buyers such as governments and some large corporations committed to deployment of Linux in their organizations. We call such buyers strategic. In addition to cost-related reasons, governments back Linux because having access to the source code allows them to verify that sensitive data is treated securely. Binary code makes it hard to figure out who has access to information flowing in a network. Companies such as IBM, in contrast, back Linux because they see in OSS one way to diminish Microsoft’s dominance. We find that the presence of strategic buyers together with Linux’s sufficiently strong demand-side learning results in Windows being driven out of the market. This may be one main reason why Microsoft has been providing chunks of Windows’ source code to governments.

Second, we look at the role of cost asymmetries. In the base model we assume that the cost structures of Windows and Linux for the development, distribution, and support of software coincide. A natural question is then whether the central result that Windows survives in the long-run equilibrium regardless of the speed of Linux’s demand-side learning persists if there are cost asymmetries. We find that because OSS implies lower profits for Microsoft, the larger the cost differences are between Linux and Windows, the less able Microsoft is to guarantee the survival of Windows.

We also show that it is not all bad news for Microsoft. We analyze the effect of having forward-looking buyers and the presence of piracy, and conclude that both benefit Microsoft.

They also come to the counter-intuitive conclusion that piracy actually helps Microsoft!

In addition to this main result, we were also surprised to find that piracy may end up increasing Microsoft’s profits. To understand why, notice that there are two types of pirates: those who would not have bought Windows in the first place because it is too expensive, and those who would have bought Windows but now decide to pirate it. The first category increases Windows’ installed base without affecting sales. As a consequence, this group increases the value of Windows. And thanks to these pirates, Microsoft is able to set higher prices in the future (because the value of the system goes up). In addition, having these pirates means that Linux’s installed base does not grow as much as it would have if piracy weren’t there. The second type of pirates (those who in the absence of piracy would have bought Windows) reduces Windows’ sales and profit. Thus, if the proportion of first-type pirates is sufficiently large, Microsoft’s profits will increase with piracy…
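The arithmetic behind that argument is easy to illustrate. Here is a toy sketch in Python; it is my own illustration with invented numbers, not the authors’ model. The price Microsoft can charge rises with the installed base, type-one pirates swell the base without costing any sales, and type-two pirates are lost sales.

    # Toy illustration of the piracy argument; all numbers are invented.
    def profit(paying_users, installed_base, value_per_user=0.01):
        price = installed_base * value_per_user  # network value supports a higher price
        return paying_users * price

    sales, base = 100, 10_000
    without_piracy = profit(sales, base)

    # 2,000 type-one pirates (would never have bought) join the installed
    # base anyway; 10 type-two pirates (would have bought) are lost sales.
    with_piracy = profit(sales - 10, base + 2_000)

    print(without_piracy, with_piracy)  # 10000.0 vs 10800.0: piracy pays

Reverse the proportions (many type-two pirates, few type-one) and piracy hurts instead; the conclusion hangs entirely on which type predominates.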

One can almost hear the sighs of relief in Redmond. The only problem is that the entire hypothesis depends on the accuracy of a mathematical model.
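For the curious, here is roughly what the skeleton of such a model looks like. The Python below is a deliberately crude sketch of my own, not the economists’ actual model: the parameters are invented, and it ignores pricing strategy, forward-looking buyers, piracy and strategic buyers. All it keeps is the core mechanism, in which each cohort of new buyers picks whichever platform its installed base currently makes the better deal.

    # Crude duopoly sketch (invented parameters, not the paper's model).
    def simulate(periods=50, base_w=100.0, base_l=1.0, price_w=1.0,
                 learn_w=0.1, learn_l=0.3, buyers_per_period=5.0):
        w, l = base_w, base_l                 # installed bases at time zero
        for _ in range(periods):
            value_w = learn_w * w - price_w   # Windows: network value minus price
            value_l = learn_l * l             # Linux is free, so value alone
            if value_w >= value_l:            # each cohort joins the better deal
                w += buyers_per_period
            else:
                l += buyers_per_period
        return w, l

    # Despite Linux's stronger demand-side learning (learn_l > learn_w),
    # Windows's first-mover installed base keeps it on top:
    print(simulate())  # (350.0, 1.0)

Start Windows with no installed-base advantage (set base_w equal to base_l) and Linux’s faster learning wins instead. That asymmetry, rather than any claim about code quality, is what drives the paper’s main result.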

Our cognitive bias against openness

Lovely FT column by James Boyle. Sample:

Studying intellectual property and the internet has convinced me that we have another cognitive bias. Call it the openness aversion. We are likely to undervalue the importance, viability and productive power of open systems, open networks and non-proprietary production.

Test yourself on the following questions. In each case, it is 1991 and I have removed from you all knowledge of the past 15 years.

You have to design a global computer network. One group of scientists describes a system that is fundamentally open – open protocols and systems so anyone could connect to it and offer information or products to the world. Another group – scholars, businessmen, bureaucrats – points out the problems. Anyone could connect to it. They could do anything. There would be porn, piracy, viruses and spam. Terrorists could put up videos glorifying themselves. Your activist neighbour could compete with The New York Times in documenting the Iraq war. Better to have a well-managed system, in which official approval is required to put up a site; where only a few actions are permitted; where most of us are merely recipients of information; where spam, viruses, piracy (and innovation and anonymous speech) are impossible. Which would you have picked?

Set yourself the task of producing the greatest reference work the world has ever seen. It must cover everything from the best Thai food in Raleigh to the annual rice production of Thailand, the best places to see blue whales to the history of the Blue Dog Coalition. Would you create a massive organisation of paid experts with layers of editors producing tomes that are controlled by copyright and trademark? Or would you wait for hobbyists, scientists and volunteer encyclopedists to produce, and search engines to organise, a cornucopia of information? I know which way I would have bet in 1991. But I also know that the last time I consulted an encyclopedia was in 1998….

All systems go (on my Mac)

This morning’s Observer column — about virtualisation…

At this point, dear reader, I know what you’re thinking. However fascinating this ‘virtual machine’ nonsense may be to geeks, it’s of no interest to normal human beings. You may feel as Mrs Dave Barry did when her husband, the Miami Herald humorist, took her for a spin in a Humvee and proudly explained that the vehicle could inflate and deflate its tyres while in motion. Why, she asked, would anyone want to do that?

So what’s the point of virtualisation? Simply that it provides a vivid illustration of the most disruptive attribute of digital technology – its capability to break the link between an application and a physical platform. Once upon a time, if you bought a PC it ran Windows, and if you bought a Mac it ran Apple’s operating system. But now Macs run Windows, and IBM ThinkPads – which have the same processor – can run OS X (though of course Apple is doing its best to head off that possibility). And Linux runs on everything.

This disconnection of application/service from hardware is happening all over the place…

Open Content in action

I’ve written a little about the Net Neutrality debate, and posted some Blog entries about it — e.g. here, here and here. It’s a complex and interesting subject, and politicians have clearly had difficulty getting their heads around it. So I was interested to see how Wikipedia would approach the topic.

The entry seemed to me to be a model of its kind — well-informed, mostly well-referenced and balanced. But its ‘neutrality’ has been challenged and has triggered Wikipedia’s discussion process. The discussion page on the issue is fascinating. Here’s the bit about the bias complaint.

This article seems to me to be slanted towards the pro-net neutrality position. The primary problem is about “framing the debate”. I think it’s pretty clear that the term itself is a frame; an analogy would be if the abortion debate was called “the pro-life vs. anti-life debate”. The article falls for this framing by first discussing the general or abstract concepts of network neutrality. A better approach, I feel, would be to discuss the origins of the debate, namely that emerging internet applications that cost ISPs much more in bandwidth charges led them to ban certain devices or find ways to pass that charge on, by charging content providers instead of end users.

The other issue with this debate is that it seems to be an “astroturf” debate, with an inordinate amount of editorials on it.
Please see “Dispute from 71.140.198.6” below Hackajar 16:12, 16 May 2006 (UTC)

I would suggest talking about reframe here instead of forcing a NPOV [Wiki-speak for ‘Neutral Point of View’ — JN] Hackajar 16:17, 16 May 2006 (UTC)

Hackajar’s additions on May 16th are clearly biased and speculative, simply regurgitating Google’s fear-mongering tactics about the COPE Act. This sort of hysteria is part of the debate over NN regulations, but he shouldn’t be offering up such astroturf propaganda as if it were factual.

Statements were added as a matter of common sense (a UPS driver does not pay the city to use the road to drive to your house to deliver a package), not influenced by “fear-mongering” generated by any company. Hackajar 13:18, 17 May 2006 (UTC)

This is an encyclopedia. We publish verifiable information from reliable third party sources. Not “common sense.” Please review WP:NOR. Thank you. Nandesuka 13:31, 18 May 2006 (UTC)

I think I added some con-NetNeutrality stuff to balance it out. I’m not saying what position I have or whether I have one. John wesley 12:58, 18 May 2006 (UTC)

Once again, the page has been massively edited with a “net neutrality is good, non-regulation is bad” point of view. They’re bringing in all sorts of red herrings from the 90s and distorting the interests in the regulation fight.

Folks, Wikipedia is not supposed to be an extension of Moveon.org, it’s supposed to be a place where people can get the straight story without all the spin. Net neutrality is a complex issue, not a good guys vs. bad guys emotional drama. RichardBennett 20:39, 28 June 2006 (UTC)

It’s always irritating to have one’s views changed by other people’s better arguments, but this discussion has caused me to re-evaluate the original entry. I think the point about ‘framing’ is right. Wouldn’t it be nice if all public debate about complex issues were conducted this way? Then we really would have a deliberative democracy. I’m always puzzled by people’s hostility to Wikipedia: to me, it looks like one of the best things to have emerged from the Net.

The Vista problem: a candid internal view

One of the puzzles that really interests me is why delivering Vista — the new incarnation of Windows — has proved so traumatic for Microsoft. It’s interesting because it raises the question of whether these code-monsters have now grown so large and complex that they are beyond the capacity of any single organisation — even one as smart as Microsoft — to manage. Here’s an extensive excerpt from a fascinatingly candid Blog post by a Microsoft insider. Apologies for the length, but it has already been removed once after posting (though the author says he came under no company pressure)…

Vista. The term stirs the imagination to conceive of beautiful possibilities just around the corner. And “just around the corner” is what Windows Vista has been, and has remained, for the past two years. In this time, Vista has suffered a series of high-profile delays, including most recently the announcement that it would be delayed until 2007. The largest software project in mankind’s history now threatens to also be the longest.

[…]

Admittedly, this essay would be easier written for Slashdot, where taut lines divide the world crisply into black and white. “Vista is a bloated piece of crap,” my furry little penguin would opine, “written by the bumbling serfs of an evil capitalistic megalomaniac.” But that’d be dead wrong. The truth is far more nuanced than that. Deeper than that. More subtle than that.

I managed developer teams in Windows for five years, and have only begun to reflect on the experience now that I have recently switched teams. Through a series of conversations with other leaders that have similarly left The Collective, several root causes have emerged as lasting characterizations of what’s really wrong in The Empire.

[…]

The Usual Suspects

Ask any developer in Windows why Vista is plagued by delays, and they’ll say that the code is way too complicated, and that the pace of coding has been tremendously slowed down by overbearing process. These claims have already been covered in other popular literature. A quick recap for those of you just joining the broadcast:

Windows code is too complicated. It’s not the components themselves, it’s their interdependencies. An architectural diagram of Windows would suggest there are more than 50 dependency layers (never mind that there also exist circular dependencies). After working in Windows for five years, you understand only, say, two of them. Add to this the fact that building Windows on a dual-proc dev box takes nearly 24 hours, and you’ll be slow enough to drive Miss Daisy.

Windows process has gone thermonuclear. Imagine that each little email you send (asking someone else to fill out a spreadsheet, comment on a report, sign off on a decision) is a little neutron shooting about in space. Your innocent-seeming little neutron now causes your heretofore mostly-harmless neighbors to release neutrons of their own. Now imagine there are 9000 of you, all jammed into a tight little space called Redmond. It’s Windows Gone Thermonuclear, a phenomenon by which process engenders further process, eventually becoming a self-sustaining buzz of fervent destructive activity.

Let’s see if, quantitatively, there’s any truth to the perception that the code velocity (net lines shipped per developer-year) of Windows has slowed, or is slow relative to the industry. Vista is said to have over 50 million lines of code, whereas XP was said to have around 40 million. There are about two thousand software developers in Windows today. Assuming there are 5 years between when XP shipped and when Vista ships, those quick on the draw with calculators will discover that, on average, the typical Windows developer has produced one thousand new lines of shipped code per year during Vista. Only a thousand lines a year. (Yes, developers don’t just write new code, they also fix old code. Yes, some of those Windows developers were partly busy shipping 64-bit XP. Yes, many of them also worked on hotfixes. Work with me here.)

Lest those of you who wrote 5,000 lines of code last weekend pass a kidney stone at the thought of Windows developers writing only a thousand lines of code a year, realize that the average software developer in the US only produces around (brace yourself) 6200 lines a year. So Windows is in bad shape — but only by a constant, not by an order of magnitude. And if it makes you feel any better, realize that the average US developer has fallen in KLOC productivity since 1999, when they produced about 9000 lines a year. So Windows isn’t alone in this.
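[For the sceptical, the insider’s arithmetic checks out. Here it is as a few lines of Python, using only the figures quoted above — JN]

    # Back-of-envelope check using only the figures quoted above.
    vista_loc, xp_loc = 50_000_000, 40_000_000  # rough lines-of-code estimates
    developers, years = 2000, 5                 # headcount and XP-to-Vista gap

    per_dev_year = (vista_loc - xp_loc) / (developers * years)
    print(per_dev_year)                  # 1000.0 shipped lines per developer-year

    industry_avg = 6200                  # the quoted US average
    print(industry_avg / per_dev_year)   # 6.2: a constant factor, not an order of magnitude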

The oft-cited, oft-watercooler-discussed dual phenomenon of Windows code complexity and Windows process burden seems to have dramatically affected its overall code velocity. But code can be simplified and re-architected (as is indeed being done by a collection of veteran architects in Windows, none of whom, incidentally, look anything like Colonel Sanders). Process can be streamlined where inefficient, eliminated where unnecessary.

But that’s not where it ends. There are deeper causes of Windows’ propensity to slippage.

Cultured to Slip

Deep in the bowels of Windows, there remains the whiff of a bygone culture of belittlement and aggression. Windows can be a scary place to tell the truth.

When a vice president in Windows asks you whether your team will ship on time, they might well have asked you whether they look fat in their new Armani suit. The answer to the question is deeply meaningful to them. It’s certainly true in some sense that they genuinely want to know. But in a very important other sense, in a sense that you’ll come to regret night after night if you get it wrong, there’s really only one answer you can give.

After months of hearing of how a certain influential team in Windows was going to cause the Vista release to slip, I, full of abstract self-righteous misgivings as a stockholder, had at last the chance to speak with two of the team’s key managers, asking them how they could be so, please-excuse-the-term, I-don’t-mean-its-value-laden-connotation, ignorant as to proper estimation of software schedules. Turns out they’re actually great project managers. They knew months in advance that the schedule would never work. So they told their VP. And he, possibly influenced by one too many instances where engineering re-routes power to the warp core, thus completing the heretofore impossible six-hour task in a mere three, summarily sent the managers back to “figure out how to make it work.” The managers re-estimated, nipped and tucked, liposuctioned, did everything short of a lobotomy — and still did not have a schedule that fit. The VP was not pleased. “You’re smart people. Find a way!” This went back and forth for weeks, whereupon the intrepid managers finally understood how to get past the dilemma. They simply stopped telling the truth. “Sure, everything fits. We cut and cut, and here we are. Vista by August or bust. You got it, boss.”

Every once in a while, Truth still pipes up in meetings. When this happens, more often than not, Truth is simply bent over an authoritative knee and soundly spanked into silence.

The Joy of Cooking

Bundled with a tendency towards truth-intolerance, Windows also sometimes struggles with poor organizational decision-making. The good news is that the senior leaders already know this and have been taking active steps to change the situation.

There are too many cooks in the kitchen. Too many vice presidents, in reporting structures too narrow. When I was in Windows, I reported to Alec, who reported to Peter, to Bill, Rick, Will, Jim, Steve, and Bill. Remember that there were two layers of people under me as well, making a total path depth of 11 people from Bill Gates down to any developer on my team.

This isn’t necessarily bad, except sometimes the cooks flash-mob one corner of the kitchen. I once sat in a schedule review meeting with at least six VPs and ten general managers. When that many people have a say, things get confusing. Not to mention, since so many bosses are in the room, there are often negotiations between project managers prior to such meetings to make sure that no one ends up looking bad. “Bob, I’m giving you a heads-up that I’m going to say that your team’s component, which we depend on, was late.” “That’s fine, Sandy, but please be clear that the unforeseen delays were caused by a third party, not my team.”

Micromanagement, though not pervasive, is nevertheless evident. Senior vice presidents sometimes review UI designs of individual features, a nod to Steve Jobs that would in better days have betokened a true honor but for its randomizing effects. Give me a cathedral, give me a bazaar — really, either would be great. Just not this middle world in which some decisions are made freely while others are made by edict, with no apparent logic separating each from the other but the seeming curiosity of someone in charge.

In general, Windows suffers from a proclivity for action control, not results control. Instead of clearly stating desired outcomes, there’s a penchant for telling people exactly what steps they must take. By doing so, we risk creating a generation of McDevs. (For more on action control vs. results control, read Kenneth Merchant’s seminal work on the subject — all $150 of it, apparently).

Uncontrolled? Or Uncontrollable?

We shouldn’t forget despite all this that Windows Vista remains the largest concerted software project in human history. The types of software management issues being dealt with by Windows leaders are hard problems, problems that no other company has solved successfully. The solutions to these challenges are certainly not trivial.

An interesting question, however, is whether or not Windows Vista ever had a chance to ship on time to begin with. Is Vista merely uncontrolled? Or is it fundamentally uncontrollable? There is a critical difference.

It’s rumored that VPs in Windows were offered big bonuses contingent on shipping Vista by the much-publicized August 2006 date. Chris Jones even declared in writing that he wouldn’t take a bonus if Vista slips past August. If this is true, if folks like Brian Valentine held division-wide meetings where August 2006 was declared as the drop-dead ship date, if general managers were consistently told of the fiscal importance of hitting August, if everyone down to individual developers was told to sign on the dotted line to commit to the date, and to speak up if they had any doubts of hitting it — mind you, every last one of those things happened — and yet, and yet, the August date was slipped, one has to wonder whether it was merely illusory, given the collective failure of such unified human will, that Vista was ever controllable in the first place.

Are Vista-scale software projects essentially uncontrollable by nature? Or has Microsoft been beset by one too many broken windows? Talk amongst yourselves…

The answer to that question — Are Vista-scale software projects essentially uncontrollable by nature? — may well be “yes — if they’re done within a single organisation”. That’s why Steve Weber may well be right about Open Source: that it’s a better way of making unimaginably complex products.

Wrestling with the Vista monster

This morning’s Observer column

Microsoft’s problems with Windows may be an indicator that operating systems are getting beyond the capacity of any single organisation to handle them. Whatever other charges might be levelled against Microsoft, technical incompetence isn’t one. If the folks at Redmond can’t do it, maybe it just can’t be done.

Therein may lie the real significance of Open Source. In a perceptive book published in 2004, the social scientist Steve Weber argued that it’s not Linux per se but the collaborative process by which the software was created that is the real innovation. In those terms, Linux is probably the first truly networked enterprise in history.

Weber likened Open Source production to an earlier process which had a revolutionary impact – Toyota’s production system – which in time transformed the way cars are made everywhere. The Toyota ‘system’, in that sense, was not a car, and it was not uniquely Japanese. Similarly, Open Source is not a piece of software, and it is not unique to a group of hackers. It’s a way of building complex things. Microsoft’s struggles with Vista suggest it may be the only way to do operating systems in future…

The last hurrah?

Wall Street wiped $32 billion off the value of Microsoft yesterday as its share price dropped 11 per cent. The trigger was the company’s announcement of a dramatic shift in strategy: it plans to spend bucketloads of money trying to compete in emerging online markets. Here’s what the Financial Times’s Lex column had to say:

While Microsoft’s shares dropped like a stone after it revealed plans to pour cash into online and other new markets, Google’s stock barely budged. A warning, perhaps, of the ineffectiveness of Microsoft’s billions in the battle ahead?

The investment binge that will hammer Microsoft’s profits next year echoes other past spending sprees, such as the initial Xbox foray. The company spent years trying to convince Wall Street that it was swearing off such extravagances, so it is hardly surprising that the news was poorly received. In fact Microsoft has little choice. The coming Windows Vista product cycle could well mark the last hurrah of a truly wondrous business model. As more software moves to the web and mobile phones, Microsoft’s foothold on the PC will become progressively weaker as a place from which to shape the future of its industry. Putting Windows on servers was a nice stopgap, but further gains will become harder as Linux spreads…

I love that phrase — “the last hurrah of a truly wondrous business model”. Must remember it for future use.

Oh — and while we’re on the subject, Netcraft has revealed that Apache has overtaken Microsoft as the leading platform for secure web sites. Apache now serves 44.0% of secure web sites, compared with 43.8% for Microsoft’s server software.