The (un)reliability of business journalism (contd.)

David Pogue had the neat idea of searching the Lexis-Nexis newspaper database to see how Apple was being covered a decade ago. Here’s a selection of what he found:

  • Fortune, 2/19/1996: “By the time you read this story, the quirky cult company…will end its wild ride as an independent enterprise.”
  • Time Magazine, 2/5/96: “One day Apple was a major technology company with assets to make any self respecting techno-conglomerate salivate. The next day Apple was a chaotic mess without a strategic vision and certainly no future.”
  • BusinessWeek, 10/16/95: “Having underforecast demand, the company has a $1 billion-plus order backlog…. Apart from some ideas, the only and best alternative: to merge with a company with the marketing and financial clout to help Apple survive the switch to a software-based company. The most likely candidate, many think, is IBM Corp.”
  • A Forrester Research analyst, 1/25/96 (quoted in, of all places, The New York Times): “Whether they stand alone or are acquired, Apple as we know it is cooked. It’s so classic. It’s so sad.”
  • Nathan Myhrvold (Microsoft’s chief technology officer), 6/97: “The NeXT purchase is too little too late. Apple is already dead.”
  • Wired, “101 Ways to Save Apple,” 6/97: “1. Admit it. You’re out of the hardware game.”
  • BusinessWeek, 2/5/96: “There was so much magic in Apple Computer in the early ’80s that it is hard to believe that it may fade away. Apple went from hip to has-been in just 19 years.”
  • Fortune, 2/19/1996: “Apple’s erratic performance has given it the reputation on Wall Street of a stock a long-term investor would probably avoid.”
  • The Economist, 2/23/95: “Apple could hang on for years, gamely trying to slow the decline, but few expect it to make such a mistake. Instead it seems to have two options. The first is to break itself up, selling the hardware side. The second is to sell the company outright.”
  • The Financial Times, 7/11/97: “Apple no longer plays a leading role in the $200 billion personal computer industry. ‘The idea that they’re going to go back to the past to hit a big home run…is delusional,’ says Dave Winer, a software developer.”

David Pogue’s conclusion: “When anyone asks me what the future of technology holds, or what kids will be bringing to school in 2016, I politely decline to answer.” Amen.

The Economist on Ndiyo

From the Economist’s current Technology Quarterly survey…

WHAT is the best way to make the benefits of technology more widely available to people in poor countries? Mobile phones are spreading fast even in the poorest parts of the world, thanks to the combination of microcredit loans and pre-paid billing plans, but they cannot do everything that PCs can. For their part, PCs are far more powerful than phones, but they are also much more expensive and complicated. If only there were a way to split the difference between the two: a device as capable as a PC, but as affordable and accessible as a mobile phone. Several initiatives to bridge this gap are under way. The hope is that the right combination of technologies and business models could dramatically broaden access to computers and the internet.

Perhaps the best-known project is the one dreamt up by a bunch of academics at the Massachusetts Institute of Technology, in Cambridge. The scheme, called “One Laptop Per Child”, aims to use a variety of novel technologies to reduce the cost of a laptop to $100 and to distribute millions of the machines to children in poor countries, paid for by governments. Nicholas Negroponte, the project’s co-founder, says he is in talks to deliver 1m units apiece to the governments of Argentina, Brazil, Nigeria and Thailand. But across the Atlantic in Cambridge, England, another band of brainy types has cooked up a different approach. They have developed a device that allows one PC to be used by many people at once.

The organisation is called Ndiyo (the Swahili word for “yes”), and was founded by Quentin Stafford-Fraser, a former researcher at AT&T. “We don’t want to have cut-down computers for poor people,” he says. “We want them to have what we have — so we need to find a better way to do it.” The system exploits a little-used feature in operating systems that permits multiple simultaneous users. Ndiyo’s small, cheap interface boxes allow multiple screens, keyboards and mice to be linked to a single PC via standard network cables.

This allows a standard PC running Linux, the open-source operating system, to be shared by between five and ten people. Computers today are many times more powerful than those of just a few years ago, but are idle much of the time. Ndiyo is returning computing to its roots, to a time when computers were shared devices rather than personal ones. “We can make computing more affordable by sharing it,” says Dr Stafford-Fraser, as he hunches over a ganglion of wires sprouting from machines in Ndiyo’s office. In much of the world, he says, a PC costs more than a house. Internet cafés based on Ndiyo’s technology have already been set up in Bangladesh and South Africa. Mobile phones are used to link the shared PCs to the internet…
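The sharing argument is easy to put into numbers. Here is a back-of-the-envelope sketch in Python; the hardware prices are purely hypothetical assumptions of mine, not Ndiyo’s actual figures:

    # Illustrative cost-per-seat arithmetic for a shared PC.
    # Both prices below are hypothetical, not Ndiyo's real figures.
    def cost_per_seat(pc_cost, terminal_cost, seats):
        """Total hardware cost spread across the people sharing one PC."""
        return (pc_cost + seats * terminal_cost) / seats

    # e.g. a $500 PC shared via $100 thin-client boxes:
    for seats in (1, 5, 10):
        print(f"{seats:2d} seat(s): ${cost_per_seat(500, 100, seats):.0f} per seat")
    # -> 1 seat: $600, 5 seats: $200, 10 seats: $150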

50 years’ hard

Something I’d forgotten — the hard drive had its 50th birthday the other day. IBM introduced the 305 RAMAC computer on September 13th, 1956. It was the first computer to include a disk drive — named the IBM 350 Disk File. The drive consisted of a stack of fifty 24″ spinning discs with a total storage capacity of about 4.4 MB. IBM leased it to customers for $35,000 a year. How times change.
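For a sense of scale, the lease works out at close to $8,000 per megabyte per year (using the figures quoted above):

    # RAMAC storage cost, using the figures quoted above.
    lease_per_year_usd = 35_000
    capacity_mb = 4.4
    print(f"${lease_per_year_usd / capacity_mb:,.0f} per MB per year")  # ≈ $7,955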

SanDisk unveils 4GB Mini SD card

From The Register

SanDisk yesterday took the wraps off a 4GB Mini SD memory card based on the “high capacity” version of the technology. So far, card makers have prepared SDHC incarnations of the regular-sized SD card, but this is the first we’ve seen to use the half-size form-factor.

Don’t expect to see it on sale any time soon, though: SanDisk is shipping sample product to card suppliers and phone makers, but it doesn’t believe the product will ship commercially until some time next year. The higher-capacity cards, which use the FAT32 file-system, are not compatible with existing Mini SD card slots…

Dabs.co.uk is selling 4GB SD cards (not mini-SD) for £45.83. Given the extent to which I’ve been using my Canon IXUS as an unobtrusive camcorder, this looks interesting. Hmmm….
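For a rough idea of what 4GB means in video terms, here is a quick estimate. The data rate is an assumption on my part (compact cameras of this sort record clips at very roughly one to two megabytes per second), not a Canon specification:

    # Rough recording time on a 4GB card at an assumed data rate.
    card_mb = 4 * 1024
    assumed_mb_per_sec = 1.5   # hypothetical Motion JPEG rate, not a Canon spec
    print(f"about {card_mb / assumed_mb_per_sec / 60:.0f} minutes")  # ≈ 46 minutes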

Me no Leica*

A new (tacky) tack in Leica’s attempts to counteract the threat of digital photography. Seen in the Financial Times’s absurd How To Spend It supplement. That whirring sound you hear is Oskar Barnack spinning in his grave.

*And yes, I do know that this was the headline on Dorothy Parker’s famous review of I Am a Camera, John Van Druten’s stage adaptation of Christopher Isherwood’s Berlin stories.

Gartner: Microsoft must turn to virtualization technology

From an interesting InformationWeek piece

Microsoft’s mistakes in Vista’s development have been well-chronicled, and the company’s leaders recognize that another five-year gap between major updates of their money maker could be disastrous. In July, chief executive Steve Ballmer told financial analysts “we will never repeat our experience with Windows Vista, we will never have a five-year gap between major releases of flagship products.”

But exactly how will Microsoft do this? How can it handle the increasingly unwieldy amount of code in Windows, better secure the operating system, and maintain backward compatibility with the legions of legacy applications? Gartner’s Gammage and two colleagues, Michael Silver and David Mitchell Smith, believe they know.

“Microsoft will have to move toward virtualization at its core to change direction,” said Gammage. “We think this is what will happen. Microsoft, at the moment, disagrees with us.

“But we don’t see another way of doing this.”

In the scheme that Gammage sees playing out, Microsoft will be forced into adding a “hypervisor,” a layer of virtualization software that runs between the operating system and hardware, to Vista by no later than 2009. Virtualization-enabled processors and chipsets, such as the newer offerings from both Intel and AMD, allow hypervisors to run, which in turn let developers separate functions of an OS into chunks, then have those pieces run simultaneously in multiple virtual machine partitions.

“We expect this hypervisor to provide the key enabling technology for reversing the trend in functional integration,” wrote Gammage, Silver, and Smith in a research report they issued nearly two weeks ago.

“This is how Microsoft will be able to deal with 25 years of backward compatibility,” Gammage said. Virtualization, he said, will allow a future Windows to run the legacy kernel — to support aged applications — alongside a new kernel, just as current virtual machine technologies let users run different operating systems side-by-side…

Anyone wondering why XenSource is going to be such a Big Deal need look no further. They’ve cracked the hypervisor problem. And the delicious irony is that their core technology is open source!
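The side-by-side trick is already routine with open-source hypervisors, and can be driven programmatically. The sketch below uses the libvirt toolkit’s Python bindings to list the guests running on a single Xen or KVM host; it is a generic illustration of current virtual-machine technology, not anything Microsoft-specific, and it assumes libvirt-python and a configured hypervisor host are available.

    # Minimal sketch: enumerate the guests running side-by-side on one
    # hypervisor host, via the open-source libvirt Python bindings.
    # Assumes libvirt-python is installed and a Xen or qemu/KVM host is up.
    import libvirt

    conn = libvirt.open("qemu:///system")   # use "xen:///" for a Xen host
    try:
        for dom_id in conn.listDomainsID():     # IDs of running guests
            dom = conn.lookupByID(dom_id)
            state, max_kb, mem_kb, vcpus, _ = dom.info()
            print(f"{dom.name():20s} {vcpus} vCPU(s) {mem_kb // 1024} MB")
    finally:
        conn.close()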

The Valley gets free Wi-Fi

From today’s New York Times

SAN FRANCISCO, Sept. 5 — A consortium of technology companies, including I.B.M. and Cisco Systems, announced plans Tuesday for a vast wireless network that would provide free Internet access to big portions of Silicon Valley and the surrounding region as early as next year.

The project is the largest of a new breed of wireless networks being built across the country. They are taking advantage of the falling cost of providing high-speed Internet access over radio waves as opposed to cable or telephone lines. The project will cover 1,500 square miles in 38 cities in San Mateo, Santa Clara, Alameda and Santa Cruz Counties, an area with 2.4 million residents. Its builders, going by the name Silicon Valley Metro Connect, said the service would provide free basic wireless access at speeds up to 1 megabit a second — roughly comparable to broadband delivered over telephone lines — in outdoor areas. Special equipment, costing $80 to $120, will be needed to bolster the signal enough to bring it inside homes or offices.

The consortium will also offer a fee-based service, with higher speeds and technical support, and will allow other companies to sell premium services over the network as well. Diana Hage, director of wireless services at I.B.M., said she expected the project to cost $75 million to $270 million. She said the project was meant to be a public service and, by showing the potential for the technology, to develop and promote the companies’ commercial interests.
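Taking the figures quoted above at face value, the projected cost works out at a fairly modest sum per resident covered:

    # Per-resident and per-square-mile cost, from the figures in the article.
    low, high = 75e6, 270e6        # projected cost range, USD
    residents, sq_miles = 2.4e6, 1500
    print(f"${low / residents:.0f}-${high / residents:.0f} per resident")        # ≈ $31-$112
    print(f"${low / sq_miles:,.0f}-${high / sq_miles:,.0f} per square mile")     # $50,000-$180,000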

I.B.M. is providing project management, and Cisco is providing equipment. They are joined in the project by Azulstar Networks, which plans to handle network operations, and SeaKay, a nonprofit group that focuses on providing Internet access to low-income areas.