MacroMyopia

Don Dodge has a nice post on an incurable disease which afflicts both mainstream media and the blogosphere…

There is a severe case of MacroMyopia spreading across the blogosphere. Today it is The Death of Email. Yesterday it was Inbox 2.0 – Email meets Social Networks. Macro-Myopia is the tendency to overestimate the short term impact of a new product or technology, and underestimate its long term implications on the marketplace, and how competitors will react.

Straight up and to the right – It is human nature to extrapolate the early success of a “new thing” to world domination, and to the death of the “old thing”. Insert any variable for “new thing” – like Facebook, Twitter, Text Messaging, Open Source, Linux, YouTube – and you can finish the sentence with the death of the “old thing”.

The best of both worlds – In most cases the early innovator of a product or technology wins some early success in a narrow market segment. The big winners come in later by incorporating the new technology into an existing product or service and creating a best of both worlds solution that appeals to a much broader market. I call this the “Innovate or Imitate – Fame or Fortune” scenario…

Lots more where that came from. Good stuff.

Lies, damn lies and Internet statistics (contd)

Following my post of yesterday, James Cridland has done some really interesting digging into the statistics from his Media UK site, which gets over 2m page-views a month. Since it’s his site, he knows what’s going on. He then compares what Google Analytics, Alexa and Compete claim is happening on the site.

He found some curious discrepancies, and concluded that

Compete is under-representing my traffic by over two-thirds – as well as demonstrably not following any trends in mediauk.com’s site traffic. The figures are almost entirely unrelated to my website’s traffic. This is bad. Are websites basing purchase decisions on Compete’s data? In which case, do I have a legal case against them?

We’ve not, yet, mentioned comScore. That’s a blog post for another day, I suspect: because there’s so much more there than meets the eye, it’s not funny.

And to think that investors and VCs base valuations on these numbers…

Stand by for the crash

The prime motive for a bubble in any field of human activity is the delusion that investing is a one-way bet. Britain (and, to an even greater extent, Ireland) is in the grip of a crazed property bubble. I don’t often agree with Will Hutton, but this time he’s spot on.

The risk of history repeating itself is known, but too few people believe it. Not the clubs of four or five young people ‘co-buying’ in order to have a chance of getting into the housing market. Not the wave of buyers of flats that are bought speculatively either to be let or which just stand vacant (and which now constitute one of the prime drivers of demand). Seventy percent of the 20,000 flats built in London last year were bought by buy-to-let speculators.

Neither they, nor those who lend the money, appear to be concerned that prices will fall. Cheltenham and Gloucester has just decided that it will finance small buy-to-let borrowers to buy up to nine properties rather than the three at present. The Bank of Ireland, according to the Financial Times, has just raised the maximum it will lend to any one entrepreneur by eight times – from £2.5m to £20m. It is risk-free lending. It may be that the yield from rents is lower than the costs of borrowed money, spelling disaster, but as property prices only rise, nobody worries. It is stories like these that prove we are in a bubble…

What’s funny about bubbles is that everyone knows, really, that they’re in one, but most assume that they personally will be OK.

Unrest in Cyberspace

Hmmm…. It’s not just Digg that’s been having trouble with restive users. It seems that Second Life is also having difficulties. Tech Review reports that:

The overseers of Second Life, a complex and booming virtual world hailed by many as the first step toward an immersive 3-D Internet, attempted yesterday to calm angry cyber-citizens who have petitioned for fixes to technical bugs recently plaguing the world.

The main problem, in members’ eyes: Second Life is growing so fast that it’s straining Linden Lab’s resources to the limit, including its developers’ ability to fix old bugs and roll out new software versions that don’t introduce new problems. In a town-hall meeting yesterday inside Second Life, the company appealed for patience.

“We are working to fix bugs and enable incremental improvement,” said Cory Ondrejka, chief technology officer at Linden Lab, the venture-funded San Francisco startup that launched Second Life in 2003. The town-hall meeting was hastily arranged in response to a damning open letter published by irritated Second Life residents on April 30. “At the same time, we are building the foundations for the next-gen architecture that will radically improve our ability to scale,” Ondrejka said.

Every day, some 25,000 computer owners, plus teams from dozens of major corporations, are rushing to join Second Life. But as these new members buy virtual land, set up house for their avatars, and start in-world businesses, the strain on the Second Life “grid” is increasing. Linden Lab is adding more than 120 new servers every week, according to Ondrejka, but users say that the company still isn’t keeping up. Complaints have piled up in Second Life forums and blogs from longtime users impatient over frequent slowdowns and crashes, property that goes missing, messages that aren’t delivered, search and friend-finder functions that don’t work, purchases that aren’t completed, and poor to nonexistent customer service and technical support.

The dissatisfaction culminated this week in the open letter, which demands that Linden Lab address the bugs “immediately,” before rolling out planned features such as voice chat. More than 3,000 Second Life users have signed the letter so far.

“People feel that Linden Lab is failing them because they are paying a great deal, in some cases, for a product that is failing to work acceptably, from a company that will no longer communicate with its customers,” says one signer, a United Kingdom-based IT manager known within Second Life as Inigo Chamerberlin…

TechBubble 2.0

Very astute thought from Dave Winer…

In the late 90s, the period of irrational exuberance, we knew the end would come, and we knew what the end would look like — a stock market crash of the dotcom sector. So, if Web 2.0 is a bubble, and if like all bubbles it bursts, how will we know when it happens?

I almost wrote a piece yesterday saying that since the Web 2.0 companies aren’t going public, they’re safe from busting in a visible, dramatic way. I almost said it will be hard to tell when the bust comes, it’ll be softer and slower, you won’t hear a crash or even a pop. But I was wrong, and today we got the first rumblings of the shock that will signal the end of the bubble.

Google stock will crash. That’s how we’ll know.

When I realized this, I should have known, because I’ve been saying for almost a year that Web 2.0 is nothing more than an aftermarket for Google. Startups slicing little bits of Google’s P/E ratio, acting as sales reps for Google ads, and getting great multiples for the revenue they generate by fostering the creation of new UGC to place ads on. When Google crashes, that’s the end of that, no more wave to ride, no more aftermarket, Bubble Burst 2.0. And the flip of this is also true — as long as Google’s stock stays up, no bubble burst.

Spot on. Google’s Price/Earnings ratio at the moment is around 60. That’s nuts.
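
To see why a P/E of 60 is "nuts", it helps to invert it. A minimal sketch (the 60 figure is the one quoted above; everything else is arithmetic):

```python
# What a Price/Earnings ratio of 60 implies.
pe_ratio = 60

# Earnings yield is the inverse of P/E: annual profit per dollar of share price.
earnings_yield = 1 / pe_ratio
print(f"Earnings yield: {earnings_yield:.2%}")

# Equivalently: at flat earnings, the company would need this many years
# of profit to "earn back" its own share price.
print(f"Payback period at current earnings: {pe_ratio} years")
```

In other words, buyers at that price are accepting under 2% in current earnings per dollar invested, a bet that only makes sense if earnings keep growing very fast for a very long time.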

The dictatorship of the presentation layer

Bill Thompson is eloquently sceptical about Web 2.0. (I prefer the term TechBubble 2.0 btw.) Here’s a sample of his Register blast:

If Web 2.0 is the answer then we are clearly asking the wrong question, and we must not be fooled by the cool sites and apparently open APIs. Most of the effort is – literally – window dressing, designed to attract venture capitalists to poorly-considered startups and get hold of enough first-round funding to build either a respectable user base or enough barely runnable alpha code to provide Google or Yahoo! with yet another tasty snack. We need to take a wider view of what is going on.

Back in the 1870s Karl Marx outlined the steps through which he believed a capitalist society needed to pass before it could reach socialism. After the revolution came the dictatorship of the proletariat, a painful but necessary stage of oppression and correction, during which the organs of the state would wither away as humanity achieved its true potential and coercion became unnecessary.

Web 2.0 marks the dictatorship of the presentation layer, a triumph of appearance over architecture that any good computer scientist should immediately dismiss as unsustainable.

Ajax is touted as the answer for developers who want to offer users a richer client experience without having to go to the trouble of writing a real application, but if the long term goal is to turn the network from a series of tubes connecting clients and servers into a distributed computing environment then we cannot rely on Javascript and XML since they do not offer the stability, scalability or effective resource discovery that we need.

There is a massive difference between rewriting Web pages on the fly with Javascript and reengineering the network to support message passing between distributed objects, a difference that too many Web 2.0 advocates seem willing to ignore. It may have been twenty years since Sun Microsystems trademarked the phrase ‘the network is the computer’ but we’re still a decade off delivering, and if we stick with Ajax there is a real danger that we will never get there…

YouTube starts to evaporate

From Good Morning Silicon Valley

YouTube’s fascinating catalog of Japanese television clips is quite a bit thinner today, thanks to complaints from an organization representing Japanese copyright holders. The video-sharing site deleted nearly 30,000 files after a Japanese entertainment group requested they be removed, saying they were posted without the authorization of copyright holders. According to The Japan Society for Rights of Authors, Composers and Publishers (JASRAC), an alliance of 23 Japanese TV stations, movie and music companies, 29,549 YouTube-hosted clips were posted in violation of copyright. That’s a pittance when one considers YouTube served up an average of 100 million video streams a day during July. Given that extraordinary number, who will miss a few lizard vs. humans-in-meat-hats game show clips?

Still, this first mass removal of clips should give YouTube boosters pause, because without those 29,549 videos, YouTube is that much less compelling. And if the JASRAC’s request is the beginning of a trend, we could see YouTube becoming increasingly more vanilla as it’s forced to clean up the copyright violations that proliferate on its service. As Forrester analyst Josh Bernoff pointed out earlier this year, this is the Napster scenario all over again. “YouTube is romancing media companies, just as Napster was,” Bernoff wrote. “YouTube will take down copyrighted content if you complain, just as Napster would. And YouTube’s model is based on masses of material available without regard for copyright status, just as Napster’s was. So, mark my words, YouTube will get sued. And it will lose. The tools it is talking about, that identify and remove copyrighted content, will have to be rushed into practice. And when nearly every clip that has copyrighted content — music in the background, video of Bart Simpson, photos stolen from movie posters — is gone, YouTube’s going to be a lot less interesting.”

Exactly. As I was saying only yesterday.

If (user-generated) content is king, why isn’t it getting paid?

Terrific Guardian column by Vic Keegan.

The creators of YouTube have done a great service in bringing video creation to the masses. But it was not because their technology was superior to others in the field (it wasn’t), but because they were in the right place at the right time when, unpredictably, YouTube suddenly attracted critical mass. This was a huge victory for garage start-ups over the likes of Google, Microsoft and Yahoo, which found to their cost that the mighty leverage arising from their big market shares in existing products buttered no parsnips in the new world of web creativity.

As a result YouTube, a company that has been mainstream for barely a year, attracted a price tag of $1.65bn, equivalent to almost $25m per employee (not that they will see much of it) or $123 for each of YouTube’s unique monthly users. The figure for those who actually generate the content would be far higher than $123 because only a small proportion of users actually put their own videos up.
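
Keegan's back-of-the-envelope figures are easy to check. A minimal sketch, using only the numbers quoted in the column (the implied headcount and user count are derived from those figures, not independent data):

```python
# Sanity-checking the valuation arithmetic in Keegan's column.
price = 1.65e9       # acquisition price: $1.65bn
per_employee = 25e6  # "almost $25m per employee"
per_user = 123       # "$123 for each ... unique monthly users"

# Dividing the price by each per-unit figure recovers the implied totals.
employees = price / per_employee
unique_users = price / per_user

print(f"Implied headcount: ~{employees:.0f} employees")
print(f"Implied unique monthly users: ~{unique_users / 1e6:.1f} million")
```

The division works out to roughly 66 employees and about 13 million unique monthly users, which is consistent with the column's framing: a tiny company whose valuation rests almost entirely on an audience it does not employ or pay.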

Yet without those content creators, YouTube – and Flickr, and all the others – would be nothing. Imagine what would happen if eBay tried to value itself on the basis of all the inventory it held on behalf of its sellers. It wouldn’t because it knows the inventory doesn’t belong to it.

There’s something deeply comical about TechBubble 2.0 — which is what I’ve decided to call the current round of irrational exuberance. Just to underscore how difficult it is to build and maintain a big, stable company in this febrile space, along come the reports of Yahoo’s difficulties — profits down 38%.

As far as user-generated content goes, the big question — as Vic Keegan says — is: where’s the value? The answer is that it’s in the stuff that people upload. But if people don’t like what you (the new corporate owner) start to do with the space then they can — and will — go elsewhere. As Steve Ballmer implied in his BusinessWeek interview the other day, no rational company would have paid $1.65 billion for YouTube. For once, I agree with him.