Remember hard disks?

They were those spinning platters we used to have in laptops and iPods. But now,

Intel will ramp up its solid-state drive operation next quarter with the introduction of a range of notebook-oriented units running to 160GB of storage capacity.

According to Troy Winslow, Intel’s NAND Products Group Marketing Manager, interviewed by News.com, Q2 will see the chip giant roll out 1.8in and 2.5in SSDs with capacities ranging from 80GB to 160GB.

Intel currently offers a number of low-capacity, 2-4GB SSDs in a form handy for installation into compact handheld devices, like UMPCs. Moving up to laptop-friendly 1.8in and 2.5in drives will see Intel’s SSD operation step up a gear and bring it into direct competition with Samsung.

In January, Samsung said it would offer a 2.5in 128GB SSD in Q2. The Korean company said its drive will use a 3Gb/s SATA interface and offer a write speed of 70MB/s – a record for this type of drive, it claimed – and a read-speed of 100MB/s…

In praise of Twitter

Bill Thompson in lyrical mode

I didn’t see the crowd start to get restless and heckle Zuckerberg about the deeply-unpopular Beacon advertising system, or get a chance to grab the microphone and ask questions when Lacy threw the conversation open to the floor.

And yet I was there in another way, listening to and even interacting with some of my friends in the audience, picking up on the vibe in the room and even tuning in later as Sarah Lacy loudly defended herself.

I was there because I was plugged into Twitter, the instant messaging service that lets users send short text messages to anyone who cares to tune in, online or on their mobile phone.

As I sat at my desk a constant stream of ‘tweets’, as they are called, was being supplied by many of the people in the room and I was able to reply directly and feel that I too was participating…

Net neutrality: the case for an icepack

Net neutrality — which, in crude terms, is the principle that the Internet ought to treat every packet equally and not privilege some at the expense of others — is one of those interesting cases where righteousness may be the enemy of rationality. At the root of it is a visceral belief that the end-to-end architecture of the Net is something very precious (and the key to understanding why the network has sparked such a tidal wave of innovation); those of us who share that belief tend to be paranoid about the lobbying of large corporations who would like to violate the principle for what we see as narrow commercial ends.

But the truth is that net neutrality is a very complicated issue — as real experts like Jon Crowcroft often point out. Righteous adherence to neutrality may, for example, blind us to the fact that, in some circumstances, it does not always yield optimal results. Which is why I was interested to read this thoughtful piece in MIT’s Technology Review this morning.

At the end of February, the Federal Communications Commission (FCC) held a public hearing at Harvard University, investigating claims that the cable giant Comcast had been stifling traffic sent over its network using the popular peer-to-peer file-sharing protocol BitTorrent. Comcast argued that it acted only during periods of severe network congestion, slowing bandwidth-hogging traffic sent by computers that probably didn’t have anyone sitting at them, anyway. But critics countered that Comcast had violated the Internet’s prevailing principle of “Net neutrality,” the idea that network operators should treat all the data packets that travel over their networks the same way.

So far, the FCC has been reluctant to adopt hard and fast rules mandating Net neutrality; at the same time, it has shown itself willing to punish clear violations of the principle. But however it rules in this case, there are some Internet experts who feel that Net neutrality is an idea that may have outlived its usefulness…

The article goes on to cite the views of Mung Chiang, a Princeton computer scientist, who specialises in nonlinear optimization of communication systems. He argues that,

in the name of Net neutrality, network operators and content distributors maintain a mutual ignorance that makes the Internet less efficient. Measures that one group takes to speed data transfers, he explains, may unintentionally impede measures taken by the other. In a peer-to-peer network, “the properties based on which peers are selected are influenced to a large degree by how the network does its traffic management,” Chiang says. But the peer selection process “will have impact in turn on the traffic management.” The result, he says, can be a feedback loop in which one counterproductive procedure spawns another.

Programs using BitTorrent, for instance, download files from a number of different peers at once. But if a particular peer isn’t sending data quickly enough, Chiang says, the others might drop it in favor of one that’s more reliable. Activity patterns among BitTorrent users can thus change very quickly. Network operators, too, try to maximize efficiency; if they notice a bandwidth bottleneck, they route around it. But according to Chiang, they operate on a much different timescale. A bottleneck caused by BitTorrent file transfers may have moved elsewhere by the time the network operator responds to it. Traffic could end up being rerouted around a vanished bottleneck and down a newly congested pipe.
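The peer-churn dynamic Chiang describes can be made concrete with a toy sketch: a client keeps its fastest peers and periodically swaps the slowest one for a newcomer, so the set of active connections shifts faster than a network operator can react. This is loosely modelled on BitTorrent-style peer reselection; the peer names, rates, and the two-round loop are illustrative assumptions, not the real protocol’s constants or algorithm.

```python
# Toy sketch of rapid peer churn: drop the slowest active peer each round
# and admit a random candidate in its place. All figures are made up.
import random

random.seed(42)


def reselect_peers(active, candidates, measured_rate):
    """Drop the slowest active peer and admit a random candidate."""
    slowest = min(active, key=measured_rate)
    active.remove(slowest)
    newcomer = random.choice(candidates)
    candidates.remove(newcomer)
    active.append(newcomer)
    return slowest, newcomer


# Peer name -> observed download rate in KB/s (illustrative numbers).
rates = {"a": 80, "b": 55, "c": 10, "d": 95, "e": 40, "f": 70}
active = ["a", "b", "c", "d"]
candidates = ["e", "f"]

for _ in range(2):  # two reselection rounds
    dropped, added = reselect_peers(active, candidates, rates.get)
    print(f"dropped {dropped}, now trying {added}")
```

Run a few rounds and the active set looks quite different each time — which is the point: by the time an operator reroutes around the congestion these peers caused, the hotspot has already moved.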

Encyclopedia of life launches, then crashes

From Good Morning Silicon Valley

If your new site crashes under heavy traffic at launch, even when you’ve prepared for a surge, that’s a sign that you may be on to something. And by that standard, the Encyclopedia of Life got off to a healthy start Tuesday. The encyclopedia has set itself a modest goal — it simply wants to be a single, comprehensive collection of everything we know about every species on Earth. If you’re keeping score at home, that’s 1.8 million known species and an estimated 10 times that many yet to be cataloged. To fill the pages, the encyclopedia is using customized software to extract information from all manner of scholarly sources and display it in a standardized format. The data is then vetted by experts. The site hopes to have entries for all the known species within a decade, but for its public debut, it offered starter pages for 30,000 species, mostly plants, amphibians and fish. Still, that was enough to draw a crowd that exceeded the organizers’ optimistic estimates, bringing the site to its knees for a while. To give folks an idea of what a more fleshed out version of the site will look like, some demonstration pages were created, and of these, the one most viewed so far is about the death-cap mushroom, which founding chairman Jesse Ausubel whimsically attributes to society’s deep underlying homicidal tendencies.
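The aggregation model described above — pull records for a species from many scholarly sources, merge them into one standardised entry, then hold the result for expert vetting — can be sketched in a few lines. This is a minimal illustration under assumed field names and sources, not the Encyclopedia of Life’s actual software.

```python
# Minimal sketch: merge per-source species records into one standardised,
# unvetted entry. Source and field names here are purely illustrative.

def build_entry(species, records):
    """Merge per-source records into a single entry awaiting expert review."""
    entry = {"species": species, "sources": [], "fields": {}, "vetted": False}
    for rec in records:
        entry["sources"].append(rec["source"])
        for key, value in rec["data"].items():
            # First source to supply a field wins; duplicates are ignored.
            entry["fields"].setdefault(key, value)
    return entry


records = [
    {"source": "SourceA", "data": {"habitat": "freshwater", "family": "Salmonidae"}},
    {"source": "SourceB", "data": {"family": "Salmonidae", "status": "accepted"}},
]
entry = build_entry("Salmo trutta", records)
print(entry["fields"])  # merged fields drawn from both sources
```

The vetting step is deliberately left as a flag: in the scheme the article describes, automated extraction produces the draft and human experts flip it to approved.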

Posted in Web

ASUS (contd)

The little ASUS sub-notebook continues to amaze me. Tonight I plugged my 3G HSDPA modem into one of the USB ports. The machine instantly detected the modem, I just followed the instructions in the Network dialog box and, bingo! — I was on the Net. This is the way Linux machines ought to be. In fact, the modem was easier to set up here than it was on the MacBook Pro.

HP is planning a Linux sub-notebook

According to the Register,

HP’s going after the Eee PC with a compact laptop that sports an 8.9in display and more connectivity options than the elfin Asus machine currently offers.

So says Engadget, which has posted some pics and a very basic spec….

It’s interesting to see what ASUS started. Also interesting to find that you can’t buy an ASUS machine anywhere in the UK just now — they’re selling like the Nintendo Wii.

If HP is really entering this market, that’s good news because (a) the company makes nice kit, and (b) it further increases the penetration of Linux in new markets.

eBay overhauls its feedback system

From Nicholas Carr’s Blog

EBay has been struggling for some time with growing discontent among its members, and it has rolled out a series of new controls and regulations to try to stem the erosion of trust in its market. At the end of last month, it announced sweeping changes to its feedback system, setting up more “non-public” communication channels and, most dramatically, curtailing the ability of sellers to leave negative feedback on buyers. It turns out that feedback ratings were being used as weapons to deter buyers from leaving negative feedback about sellers…

This is an intriguing — and sobering — moment.


So is it really a big deal?

A Newsnight journalist rang me on Friday evening, just after we’d arrived in deepest Suffolk, to see if I’d be interested in coming on the programme to talk about the Microsoft-Yahoo deal. I declined gracefully on the grounds that (a) I like being in deepest Suffolk, and (b) I wasn’t sure the story was such a big deal anyway. Now, it looks as though I’m not alone in thinking that. Here’s John Markoff of the NYT on the subject:

SAN FRANCISCO — In moving to buy Yahoo, Microsoft may be firing the final shot of yesterday’s war.

That one was over Internet search advertising, a booming category in which both Microsoft and Yahoo were humble and distant also-rans behind Google.

Microsoft may see Yahoo as its last best chance to catch up. But for all its size and ambition, the bid has not been greeted with enthusiasm. That may be because Silicon Valley favors bottom-up innovation instead of growth by acquisition. The region’s investment money and brain power are tuned to start-ups that can anticipate the next big thing rather than chase the last one.

And what will touch off the next battle? Maybe it will be a low-power microprocessor, code-named Silverthorne, that Intel plans to announce Monday. It is designed for a new wave of hand-held wireless devices that Silicon Valley hopes will touch off the next wave of software innovation.

Or maybe it will be something else entirely.

No one really knows, of course, but gambling on the future is the essence of Silicon Valley. Everyone chases the next big thing, knowing it could very well be the wrong thing. And those who guess wrong risk their survival….

Update: Newsnight ran a piece with Charles Arthur and Robert Scoble. See it on YouTube here.

Gutenberg 2.0

This morning’s Observer column

Today’s Gutenberg is Sir Tim Berners-Lee, inventor of the web. In the 17 years since he launched his technology on an unsuspecting world, he has transformed it. Nobody knows how big the web is now, but estimates of the indexed part hover at around 40 billion pages, and the ‘deep web’ hidden from search engines is between 400 and 750 times bigger than that. These numbers seem as remarkable to us as the avalanche of printed books seemed to Brandt. But the First Law holds we don’t know the half of it, and it will be decades before we have any real understanding of what Berners-Lee hath wrought.

Occasionally, we get a fleeting glimpse of what’s happening. One was provided last week by the report of a study by the British Library and researchers at University College London…