Free Dorothy!

The Syrian authorities have now admitted that they have detained Dorothy Parvaz, a friend and a former Fellow on the Wolfson Press Fellowship Programme of which I am the Director. Dorothy works for Al Jazeera English and is based in Doha, Qatar. She’s a terrific journalist and a lovely person. Last Friday she flew to Damascus on a reporting mission and since then nobody has seen or heard anything from her. Through the MP for Cambridge, Julian Huppert, we have been pressing the British Foreign Office to make inquiries about her whereabouts. Al Jazeera have also been pressing very hard for her release. The Syrian admission is a step in the right direction, but none of us will rest until she is safely back in Qatar.

Her fiancé, Todd Barker, has recorded this video appeal, which is now on the Al Jazeera site.

Al Jazeera are maintaining a regularly-updated page about Dorothy.

There’s an interesting post by Swami Avi on the Free Dorothy Parvaz Facebook page which says:

I interviewed Syria’s chargé d’affaires to Canada today, Bashar Akbik, and he said Dorothy Parvaz was arrested in Syria for probably not registering herself with that country’s Ministry of Information–a requirement for foreign journalists. Interestingly, he noted that Al Jazeera is a tool of the Muslim Brotherhood, and are “working to undermine Syria’s regime.”

Life in the technology jungle: the salutary tale of the Flip

This morning’s Observer column.

The Flip was a delicious example of clean, functional design, and it sold like hot cakes. From the first day it appeared on Amazon it was the site’s bestselling camcorder, and eventually captured 35% of the camcorder market. I bought one as soon as it appeared in the UK, and soon found that my friends and colleagues were eyeing it enviously. One – a keen tennis player – bought one along with an ingenious bendy tripod called a GorillaPod and mounted it on the fence at the court where he was having lessons with his coach. (The coach was not impressed.) Another friend, this time a golfer, bought one and used it to analyse his swing when practising at the driving range. Thousands of YouTube videos were produced using Flips. It was what technology pundits call a “game changer”.

In March 2009 the giant networking company Cisco astonished the world by buying Pure Digital Technologies, the developer of the Flip, for $590m. This seemed weird because Cisco doesn’t do retail: it’s the company that provides the digital plumbing for the internet. It deals only with businesses. It was as if BP had suddenly announced that it was going into the perfume business. But, hey, we thought: maybe Cisco is getting cool in its old age.

How wrong can you be? Just over a week ago, Cisco announced that it was shutting down its Flip video camera division and making 550 people redundant. Just like that…

How Twitter could put an end to paywalls

Intriguing post by Dave Winer.

Now here’s a chilling thought.

If Twitter wanted to, tomorrow, they could block all links that went into a paywall. That would either be the end of paywalls, or the end of using Twitter as a way to distribute links to articles behind a paywall, which is basically the same thing, imho.

Twitter already has rules about what you can point to from a tweet, and they’re good ones, they keep phishing attacks out of the Twitter community, and they keep out spammers. But that does not have to be the end of it. And if you think Twitter depends on you, I bet Adobe felt that Apple depended on them too, at one point.

The ‘End of History’ Man — on photography

A story about associative linking that would make Ol’ Vannevar Bush proud.

I’ve been reading reviews of Francis Fukuyama’s new book The Origins of Political Order: From Prehuman Times to the French Revolution and wondering whether to buy it. It looks interesting. And then I came across a Newsweek photo essay about him which included an intriguing photograph of him with his camera case. That’s when I discovered that he was a serious photographer, so of course I then went looking for his pictures, but before I got to any I found this essay by him on WSJ.com.

Let’s begin with how photography has changed. Ansel Adams’s iconic images of the Sierras were taken with an 8-inch-by-10-inch view camera, a wooden contraption with bellows in which the photographer saw his subject upside-down and reversed under a black cloth. Joel Meyerowitz’s stunning photographs of Cape Cod were taken with a similar mahogany Deardorff view camera manufactured in the 1930s. These cameras produce negatives that contain up to 100 times the amount of information produced by a contemporary top-of-the-line digital SLR like a Canon EOS 5D or a Nikon D3. View cameras allow photographers to shift and tilt the lens relative to the film plane, which is why they continue to be used by architectural photographers who want to avoid photos of buildings with the converging vertical lines caused by the upward tilt of the lens on a normal camera. And their lenses can be stopped down to f/64 or even f/96, which allows everything to be in crystalline focus from 3 inches away to infinity. (Ansel Adams, Edward Weston and Imogen Cunningham were part of a group called “f/64” in celebration of this characteristic.)

Perhaps the most important feature of these older film cameras was their lack of convenience. They had to be mounted on tripods; it took many minutes to shoot a single frame; and they were hardly inconspicuous. In contrast to contemporary digital photographers who snap a zillion photos of the same subject and hope that one will turn out well composed, view camera photography is a more painterly activity that forces the photographer to slow down and think ahead carefully about subject, light, framing, time of day, and the like. These skills are in short supply among digital photographers.

Older cameras were far better built. A few years ago I was given a Leica M3 once owned by my uncle, who joined the U.S. Army to get out of an internment camp for Japanese-Americans during World War II. He was sent to Germany where he acquired the Leica around the time I was born. This camera, with its f/2 Summicron, a classic, fast, tack-sharp lens, still takes beautiful pictures. How many digital cameras will still be functioning five years from now, much less 50? Where are you going to buy new batteries and the media to store your photos in 2061?

Where indeed? It turns out that Fukuyama is also an audio buff with strong views on the capacity of MP3 compression to ruin audio quality.

And of course I had to check out what a GigaPan Epic 100 would cost. Answer: £414 on eBay.
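
Incidentally, Fukuyama’s “up to 100 times the amount of information” claim is easy to sanity-check with a bit of back-of-envelope arithmetic. The scanning resolutions and sensor size in the sketch below are my own assumptions, not figures from his essay:

```python
# Rough sanity check of the "up to 100 times the information" claim.
# Assumptions (mine, not Fukuyama's): an 8x10-inch negative digitised at
# 2000-4000 dpi, versus a ~21-megapixel full-frame digital SLR.
dslr_pixels = 21e6
for dpi in (2000, 4000):
    film_pixels = (8 * dpi) * (10 * dpi)
    print(f"{dpi} dpi scan: {film_pixels / 1e6:.0f} MP, "
          f"ratio {film_pixels / dslr_pixels:.0f}x")
# 2000 dpi -> ~320 MP (~15x); 4000 dpi -> ~1280 MP (~61x).
# "Up to 100 times" is at the optimistic end, but the right order of magnitude.
```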

From hero to zero in the blink of an eye: Cisco Shuts Down Flip

From today’s NYTimes.com.

It was one of the great tech start-up success stories of the last decade.

The Flip video camera, conceived by a few entrepreneurs in an office above Gump’s department store in San Francisco, went on sale in 2007, and quickly dominated the camcorder market.

The start-up sold two million of the pocket-size, easy-to-use cameras in the first two years. Then, in 2009, the founders cashed out and sold to Cisco Systems, the computer networking giant, for $590 million.

On Tuesday, Cisco announced it was shutting down its Flip video camera division.

Wow! I have a Flip. It’s a lovely gadget, and it came with quite elegant software. But I haven’t used it since I got an iPhone. Another illustration of the adage that the best camera is always the one you happen to have with you. It’s also a salutary lesson in how quickly this ecosystem can change.

Bet the guys who sold out to Cisco are laughing all the way to the bank.

Eye-Fi

Ten years ago, if you’d told me I would want an Internet-connected camera, I’d have said you were nuts. But acquiring an iPhone changed my view: I’ve found it really useful to be able to upload pictures from anywhere at any time without having to be tethered to a computer. The iPhone camera isn’t great, but, as the man said, the best camera is always the one you have with you, and I always have the phone. Still, it’d be nice to have instant uploads from a better camera.

Enter Eye-Fi, an SD card which can talk to a wireless network from inside your camera. I bought one from Amazon for just under £70 — which is expensive for an SD card, but what the hell. You install some software on a PC or laptop, register with Eye-Fi, put the card into your camera and — Bingo! Images are automatically uploaded. You can link your account to other services like Flickr and Facebook. And the card can do geolocation based on Wi-Fi network location.

Sounds too good to be true? In a way, it is. The system works fine, but uploads are slow unless one constrains the size and quality of the images. For shooting and uploading web-friendly jpegs it’s perfectly serviceable: in fact it might be a good way of getting stuff onto Flickr in near-real-time. But you can’t use the full range of image quality and size available on a decent camera. So it’s got its applications, but it looks as though the iPhone camera will still find plenty of use.
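
For what it’s worth, the size-versus-speed trade-off is easy to see if you shrink the files yourself before they ever touch the network. The sketch below is just an illustration using the Pillow library; the folder names are made up, and none of this is Eye-Fi’s own software:

```python
# Illustrative only: shrink full-resolution photos to web-friendly JPEGs,
# which is roughly what constraining the Eye-Fi's image size/quality
# settings does to keep uploads fast.
from pathlib import Path
from PIL import Image  # pip install Pillow

SRC, DST = Path("camera_roll"), Path("web_ready")   # hypothetical folders
DST.mkdir(exist_ok=True)

for photo in SRC.glob("*.jpg"):
    with Image.open(photo) as img:
        img.thumbnail((1600, 1600))                  # longest edge ~1600px, aspect preserved
        img.save(DST / photo.name, "JPEG", quality=80)
```

A multi-megabyte full-resolution JPEG comes out at a few hundred kilobytes this way, which is roughly the difference between a near-real-time upload and a long wait.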

It’s also got lots of embarrassing potential. Suppose, for example, you were careless with the upload settings: you might find that a set of, er, intimate pictures was attracting an admiring audience to your Flickr account or Facebook page. And then, of course, there’s this.

Paul Baran RIP

Paul Baran, the engineer who first thought of packet-switching (Donald Davies independently came up with the same idea later), has died at the age of 84.

Baran was one of the most entertaining and intriguing figures I came across when I was researching my history of the Internet way back in the 1990s. The story of how he came up with the idea — and of his hilarious experiences with AT&T — is told in Chapter 6. Essentially, AT&T’s position was: “this packet-switching stuff couldn’t work, but even if it did we wouldn’t allow it”. After he’d submitted his proposal for a packet-switched network to the Pentagon, Baran realised that the contract to build the pilot network would go to an agency staffed mainly by ex-AT&T engineers, concluded that they would make sure that it didn’t work and — rather than have them strangle his baby at birth — withdrew the proposal. It’s the kind of story that one couldn’t make up. And yet it happened.
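
For anyone who hasn’t met the idea, here is a toy sketch of what packet-switching involves (my illustration, not Baran’s actual design): the message is chopped into small, numbered packets that can travel independently, possibly by different routes and out of order, and are reassembled at the destination.

```python
# Toy illustration of packet-switching (not Baran's design): split a message
# into small packets, each carrying a sequence number, then reassemble them
# correctly even if they arrive out of order.
def packetise(message: bytes, size: int = 8):
    return [{"seq": n, "payload": message[i:i + size]}
            for n, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetise(b"this packet-switching stuff couldn't work")
packets.reverse()  # simulate out-of-order arrival over different routes
assert reassemble(packets) == b"this packet-switching stuff couldn't work"
```

The point of the exercise is that no dedicated end-to-end circuit is ever set up, which is precisely what AT&T’s circuit-switched worldview found so implausible.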

Why some Apps work — and some don’t

Om Malik has a thoughtful post about why some products work while others don’t — no matter how much VC money and industry plaudits they attract.

He picks up Gary Vaynerchuk’s idea of The Thank You Economy, in which the companies that provide the most value to their customers win. “It is a quaint notion”, writes Malik, “as old as the first bazaar, but somehow it got lost in postindustrial over-commercialization”.

When I use Marco Arment’s Instapaper, I quietly thank him, pretty much every single time. Why? Because he solved a problem for me and made my life more manageable. As a result I gladly upgraded to the paid version of the app. And when I am not saving or reading articles using Instapaper, I am telling everyone I can tell: Try it. That is what the “thank you economy” really is — me doing marketing for a product I have only an emotional or utilitarian connection to.

I look at all these great tablets coming to market. They are feature-laden, power-packed, and have bundles of computing oomph. And yet, they will all struggle because the makers are all looking through the wrong end of the telescope. My friend Pip Coburn emailed me, pointing out that people with iPads are the ultimate commercial for the device. The more people have them, the more people want them. “People will trust other people who do not carry an agenda to build revenues and manipulate you,” Pip wrote. Bing!

Don’t believe me? Put all the things that are part of your daily routine into these two buckets — happiness and utility — and you will see it for yourself that in the end those two are the driving forces behind a successful app, service, device or media property.

That rings lots of bells round here. Instapaper has solved lots of problems for me, and I really value it. Same goes for Dropbox, in spades. I’m currently finishing off a new book, and I’ve used Dropbox from the outset: it’s been a revelation compared with the last time I wrote a book — when I was continually fretting about back-ups, the location of different versions, etc.

Another revelation is how useful the iPad has become — for me, anyway. When it first came out, I was quite critical of it. What has changed is the ecosystem of apps which have transformed it into a really powerful mobile workstation. It’s still hopeless for my kind of blogging (which really needs multitasking), but for writing non-academic articles, reading and commenting on PDFs, note-taking in seminars and conferences and email-on-the-move it’s terrific. And Dropbox is the glue that binds it to my other ‘proper’ computers.

Thanks to Quentin, I’ve also found that the iPad is a pretty good thinking and presentation tool. It does run Keynote, which is fine if you like that kind of PowerPoint-type thing. But more importantly, it has a mind-mapping App which (unlike some iPad 1 Apps) can drive a projector, and I’ve found that audiences which are PowerPointed-out seem to like it. You just work out the map of what you want to say, and then talk through it, squeezing and pinching and swiping as you talk. And if they want a printed record, you can export the map as a jpeg and email it to them.

Vorsprung durch Technik (nein)

This morning’s Observer column.

Those whom the Gods wish to destroy, they first make infatuated with their own ingenuity. Witness the heady talk about “the internet of things”. The basic idea is that we are moving from an era when the network connected human beings to one where a majority of the nodes on it will be devices: printers, cameras, monitoring devices, domestic appliances – yea even unto the humble toaster.

Two forces are driving this trend…

Why computers can’t really ‘think’

Stanley Fish sparked off a lively debate with his NYT piece about IBM’s Watson machine. This is an excerpt from an interesting response by Sean Dorrance Kelly and Hubert Dreyfus.

The fact is, things are relevant for human beings because at root we are beings for whom things matter. Relevance and mattering are two sides of the same coin. As Haugeland said, “The problem with computers is that they just don’t give a damn.” It is easy to pretend that computers can care about something if we focus on relatively narrow domains — like trivia games or chess — where by definition winning the game is the only thing that could matter, and the computer is programmed to win. But precisely because the criteria for success are so narrowly defined in these cases, they have nothing to do with what human beings are when they are at their best.

Far from being the paradigm of intelligence, therefore, mere matching with no sense of mattering or relevance is barely any kind of intelligence at all. As beings for whom the world already matters, our central human ability is to be able to see what matters when. But, as we show in our recent book, this is an existential achievement orders of magnitude more amazing and wonderful than any statistical treatment of bare facts could ever be. The greatest danger of Watson’s victory is not that it proves machines could be better versions of us, but that it tempts us to misunderstand ourselves as poorer versions of them.

This comforting line of argument doesn’t square with Peter Wilby’s scepticism about the prevailing assurances of Western governments that “If enough people buckle down to acquiring higher-level skills and qualifications, Europeans and Americans will continue to enjoy rising living standards. If they work hard enough, each generation can still do better than its parents. All that is required is to bring schools up to scratch and persuade universities to teach ‘marketable’ skills.”

“Knowledge work”, supposedly the west’s salvation, is now being exported like manual work. A global mass market in unskilled labour is being quickly succeeded by a market in middle-class work, particularly for industries, such as electronics, in which so much hope of employment opportunities and high wages was invested. As supply increases, employers inevitably go to the cheapest source. A chip designer in India costs 10 times less than a US one. The neoliberals forgot to read (or re-read) Marx. “As capital accumulates the situation of the worker, be his payment high or low, must grow worse.”

We are familiar with the outsourcing of routine white-collar “back office” jobs such as data inputting. But now the middle office is going too. Analysing X-rays, drawing up legal contracts, processing tax returns, researching bank clients, and even designing industrial systems are examples of skilled jobs going offshore. Even teaching is not immune: last year a north London primary school hired mathematicians in India to provide one-to-one tutoring over the internet. Microsoft, Siemens, General Motors and Philips are among big firms that now do at least some of their research in China. The pace will quicken. The export of “knowledge work” requires only the transmission of electronic information, not factories and machinery. Alan Blinder, a former vice-chairman of the US Federal Reserve, has estimated that a quarter of all American service sector jobs could go overseas.

And John Markoff, in another essay, reports the intentions of IBM executives

to commercialize Watson to provide a new class of question-answering systems in business, education and medicine. The repercussions of such technology are unknown, but it is possible, for example, to envision systems that replace not only human experts, but hundreds of thousands of well-paying jobs throughout the economy and around the globe. Virtually any job that now involves answering questions and conducting commercial transactions by telephone will soon be at risk. It is only necessary to consider how quickly A.T.M.’s displaced human bank tellers to have an idea of what could happen.

To be sure, anyone who has spent time waiting on hold for technical support, or trying to change an airline reservation, may welcome that day. However, there is also a growing unease about the advances in natural language understanding that are being heralded in systems like Watson. As rapidly as A.I.-based systems are proliferating, there are equally compelling examples of the power of I.A. — systems that extend the capability of the human mind.