The great Chinese hardware hack: true or false?

This morning’s Observer column:

On 4 October, Bloomberg Businessweek published a major story under the headline “The Big Hack: How China Used a Tiny Chip to Infiltrate US Companies”. It claimed that Chinese spies had inserted a covert electronic backdoor into the hardware of computer servers used by 30 US companies, including Amazon and Apple (and possibly also servers used by national security agencies), by compromising America’s technology supply chain.

According to the Bloomberg story, the technology had been compromised during the manufacturing process in China. Undercover operatives from a unit of the People’s Liberation Army had inserted tiny chips – about the size of a grain of rice – into the motherboards.

The affected hardware then made its way into high-end video-compression servers assembled by a San Jose company called Supermicro and deployed by major US companies and government agencies…

Read on

The future of Search

This morning’s Observer column:

Type “What is the future of search?” into Google and in 0.47 seconds the search engine replies with a list of sites asking the same question, together with a note that it had found about 2,110,000,000 other results. Ponder that number for a moment, for it reflects the scale of the information explosion that was triggered by Tim Berners-Lee’s invention of the web in 1989-90. Back then there were no search engines because there was no need for them: there were very few websites in those early days.

Google turned 20 recently and the anniversary prompted a small wave of reflections by those who (like this columnist) remember a world BG (before Google), when information was much harder to find. The nicest one I found was a blog post by Ralph Leighton, who was a friend of Richard Feynman, the late, great theoretical physicist.

The story starts in 1977 when Feynman mischievously asked his friend “whatever happened to Tannu Tuva?” …

Read on

Now with added Blockchain, er… database

I get dozens of emails a week from PR firms breathlessly announcing the latest addition of “blockchain technology” to the toolsets of their clients. Most of these puffs are idiotic, but every so often they involve a large and ostensibly serious company.

Like Walmart. Today I find this report in the New York Times:

When dozens of people across the country got sick from eating contaminated romaine lettuce this spring, Walmart did what many grocers would do: It cleared every shred off its shelves, just to be safe.

Walmart says it now has a better system for pinpointing which batches of leafy green vegetables might be contaminated. After a two-year pilot project, the retailer announced on Monday that it would be using a blockchain, the type of database technology behind Bitcoin, to keep track of every bag of spinach and head of lettuce.

Impressive, eh? By this time next year, more than 100 farms that supply Walmart with leafy green vegetables will be required to input detailed information about their food into a blockchain. But… said blockchain will be run and — one presumes — hosted on IBM servers. Since the essence of a blockchain is that it’s a public ledger (so that control and oversight are decentralised), one wonders how a blockchain run on IBM servers is anything other than a fancy ol’ database?
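For what it’s worth, the “chain” part of a blockchain is easy to reproduce in an ordinary database: each record carries a cryptographic hash of the record before it, so any tampering breaks the chain. Here is a minimal, purely illustrative Python sketch of that idea (a toy, not Walmart’s or IBM’s actual system; the record fields are invented for the example):

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Create a block whose identity depends on its contents and its predecessor."""
    block = {
        "timestamp": time.time(),
        "data": data,              # e.g. a shipment record for a bag of spinach
        "prev_hash": prev_hash,
    }
    block_bytes = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(block_bytes).hexdigest()
    return block

def verify_chain(chain):
    """Check that the links are intact and no block has been altered."""
    for prev, current in zip(chain, chain[1:]):
        if current["prev_hash"] != prev["hash"]:
            return False
        contents = {k: v for k, v in current.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()
        if current["hash"] != expected:
            return False
    return True

# A toy supply-chain ledger with two entries.
genesis = make_block({"event": "harvested", "farm": "Example Farm"}, prev_hash="0" * 64)
shipped = make_block({"event": "shipped", "carrier": "Example Logistics"}, genesis["hash"])
print(verify_chain([genesis, shipped]))  # True, until whoever hosts it edits a block
```

The catch, of course, is that whoever controls the server can rewrite the blocks and recompute the hashes. What Bitcoin adds is a consensus protocol spread across thousands of independent nodes, which is precisely the property a ledger run and hosted on one company’s servers gives up.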

LATER: From the you-couldn’t-make-it-up department, the UK Chancellor of the Exchequer (Finance Minister), when asked (at the Tory Conference) how the government planned to avoid having a hard border in Northern Ireland, replied: “There is technology becoming available (…) I don’t claim to be an expert on it but the most obvious technology is blockchain.”

Computational photography

It’s hard to believe, but Apple has 800 people working just on the iPhone camera. Every so often, we get a glimpse of what they are doing. Basically, they’re using computation to enhance what can be obtained from a pretty small sensor. One sees this in the way HDR (High Dynamic Range) seems to be built into every iPhone X photograph. And now we’re seeing it in the way the camera can produce the kind of convincing bokeh (the blur in the out-of-focus parts of an image) that could hitherto only be obtained from particular kinds of optical lenses at wide apertures.

Matthew Panzarino, who is a professional photographer, has a useful review of the new iPhone XS in which he comments on this:

Unwilling to settle for a templatized bokeh that felt good and leave it at that, the camera team went the extra mile and created an algorithmic model that contains virtual ‘characteristics’ of the iPhone XS’s lens. Just as a photographer might pick one lens or another for a particular effect, the camera team built out the bokeh model after testing a multitude of lenses from all of the classic camera systems.

Really striking, though, is an example Panzarino uses of how a post-hoc adjustable depth of focus can be genuinely useful. He shows a photograph of himself with his young son perched on his shoulders.

And an adjustable depth of focus isn’t just good for blurring, it’s also good for un-blurring. This portrait mode selfie placed my son in the blurry zone because it focused on my face. Sure, I could turn the portrait mode off on an iPhone X and get everything sharp, but now I can choose to “add” him to the in-focus area while still leaving the background blurry. Super cool feature I think is going to get a lot of use.

Yep. Once, photography was all about optics. From now on it’ll increasingly be about computation.
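To make the “computation” point concrete, here is a crude, purely illustrative Python sketch of one way software can fake bokeh after the shot has been taken: given a sharp image and a per-pixel depth map, blur each pixel in proportion to its distance from a chosen focal plane. It’s a toy model, not Apple’s algorithm (which, as Panzarino notes, models the characteristics of real lenses), and the function and parameter names are invented for the example:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bokeh(image, depth, focal_depth, max_blur=8.0, layers=8):
    """Blur an all-in-focus image as a function of per-pixel depth.

    image:       H x W x 3 array with values in [0, 1]
    depth:       H x W array of normalised scene depth in [0, 1]
    focal_depth: the depth value that should stay sharp
    """
    # How far each pixel sits from the chosen focal plane (0 = in focus).
    defocus = np.abs(depth - focal_depth)
    # Assign each pixel to one of `layers` defocus bands.
    band = np.minimum((defocus * layers).astype(int), layers - 1)

    result = np.zeros_like(image)
    for i in range(layers):
        # Stronger blur for bands further from the focal plane.
        sigma = max_blur * i / (layers - 1)
        blurred = gaussian_filter(image, sigma=(sigma, sigma, 0)) if sigma > 0 else image
        mask = (band == i)[..., None]
        result = np.where(mask, blurred, result)
    return result

# “Refocusing” after capture is just a change of parameter:
# foreground_sharp = synthetic_bokeh(img, depth_map, focal_depth=0.1)
# background_sharp = synthetic_bokeh(img, depth_map, focal_depth=0.9)
```

Once a depth map exists, focus and blur become parameters you can change in software after the event, which is what makes the kind of adjustment Panzarino describes possible.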

The Google era

This morning’s Observer column:

This is a month of anniversaries, of which two in particular stand out. One is that it’s 10 years since the seismic shock of the banking crisis – one of the consequences of which is the ongoing unravelling of the (neo)liberal democracy so beloved of western ruling elites. The other is that it’s 20 years since Google arrived on the scene.

Future historians will see our era divided into two ages: BG and AG – before and after Google. For web users in the 1990s, search engines were a big deal because, as the network exploded, finding anything on it became increasingly difficult. Like many of my peers, I used AltaVista, a search tool developed by the then-significant Digital Equipment Corporation (DEC), which was the best thing available at the time.

And then one day, word spread like wildfire online about a new search engine with a name derived from a misspelling of googol, the mathematical term for a huge number (10 to the power of 100). It was clean, fast and delivered results based on a kind of peer review of all the sites on the web. Once you tried it, you never went back to AltaVista.

Twenty years on, it’s still the same story…

Read on
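The “kind of peer review” referred to in the column is the idea behind PageRank: a page matters if it is linked to by other pages that matter. That definition is circular, but an iterative computation resolves the circularity. Here is a minimal Python sketch (illustrative only, nothing like Google’s production system; the toy link graph is invented):

```python
import numpy as np

def pagerank(links, damping=0.85, iterations=50):
    """links[i] is the list of pages that page i links to."""
    n = len(links)
    rank = np.full(n, 1.0 / n)
    for _ in range(iterations):
        new_rank = np.full(n, (1.0 - damping) / n)
        for page, outgoing in enumerate(links):
            if outgoing:
                # A link is a "vote": the page shares its rank among its targets.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            else:
                # Pages with no outgoing links spread their rank everywhere.
                new_rank += damping * rank[page] / n
        rank = new_rank
    return rank

# A four-page toy web: everyone links to page 0, which links back to page 1.
toy_web = [[1], [0], [0, 1], [0]]
print(pagerank(toy_web))  # page 0 ends up with the highest score
```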

How to think about electric — and autonomous — cars

Lovely, perceptive essay by Benedict Evans. Here’s how it opens…

When Nokia people looked at the first iPhone, they saw a not-great phone with some cool features that they were going to build too, being produced at a small fraction of the volumes they were selling. They shrugged. “No 3G, and just look at the camera!”

When many car company people look at a Tesla, they see a not-great car with some cool features that they’re going to build too, being produced at a small fraction of the volumes they’re selling. “Look at the fit and finish, and the panel gaps, and the tent!”

The Nokia people were terribly, terribly wrong. Are the car people wrong? We hear that a Tesla is ‘the new iPhone’ – what would that mean?

This is partly a question about Tesla, but it’s more interesting as a way to think about what happens when ‘software eats the world’ in general, and when tech moves into new industries. How do we think about whether something is disruptive? If it is, who exactly gets disrupted? And does that disruption mean that one company wins in the new world? Which one?

Well worth reading in full.

What’s significant about Wikipedia

This morning’s Observer column:

Since its inception, it’s been the butt of jokes, a focus for academic ire and a victim of epistemological snobbery. I remember one moment when the vice-chancellor of a top university made a dismissive remark about Wikipedia, only to have a world-leading chemist in the audience icily retort that the pages on his particular arcane speciality were the most up-to-date summary currently available anywhere – because he wrote them. And this has been my experience; in specialist areas, Wikipedia pages are often curated by experts and are usually the best places to gain an informed and up-to-date overview.

Because Wikipedia is so vast and varied (in both range and quality), the controversies it engenders have traditionally been about its content and rarely about its modus operandi and its governance. Which is a pity, because in some ways these are the most significant aspects of the project. The political events of the last two years should have alerted us to the fact that Wikipedia had to invent a way of tackling the problem that now confronts us at a global level: how to get at some approximation to the truth…

Read on

Quote of the Day

“Services like Uber and online freelance markets like TaskRabbit were created to take advantage of an already independent work force; they are not creating it. Their technology is solving the business and consumer problems of an already insecure work world. Uber is a symptom, not a cause.”

Louis Hyman, an economic historian, writing in the New York Times

He has a new book coming out soon – Temp: How American Work, American Business, and the American Dream Became Temporary.

How digital technologies went from being instruments for democracy to tools for undermining it

This morning’s Observer column:

Here’s the $64,000 question for our time: how did digital technologies go from being instruments for spreading democracy to tools for undermining it? Or, to put it a different way, how did social media go from empowering free speech to becoming a cornerstone of authoritarian power?

I ask this as a distressed, recovering techno-utopian. Like many engineers of my generation, I believed that the internet would be the most empowering and liberating technology since the invention of printing by moveable type. And once the web arrived, and anyone who could type could become a global publisher, it seemed to me that we were on the verge of something extraordinary. The old editorial gatekeepers of the pre-internet media world would lose their stranglehold on public discourse; human creativity would be unleashed; a million flowers would bloom in a newly enriched and democratised public sphere. In such a decentralised world, authoritarianism would find it hard to get a grip. A political leader such as Donald Trump would be unthinkable.

Naive? Sure. But I was in good company…

Read on

Strategic changes are hard

I’ve been pondering the problem of how to make a reasonably successful organisation that’s been going for half a century realise that it may need to make some major shifts to address the challenges it will face in the next half-century. Headline: it ain’t easy. So then I started thinking about organisations that have managed the switch. Apple and the iPhone is one, I guess. But then I remembered that I’d once done an interview with Gordon Moore, the co-founder of Intel — a company which made that kind of radical switch when moving from making semiconductor memory to making processor chips. And that wasn’t easy either.

I then happened upon a famous essay – “Seven Chapters of Strategic Wisdom” by Walter Kiechel III – which discusses the Intel experience. Here’s the relevant bit:

Just how difficult it is to pull this off, or to make any major change in strategic direction, is wonderfully captured in “Why Not Do It Ourselves?” the fifth chapter in Andrew S. Grove’s Only the Paranoid Survive: How to Exploit the Crisis Points That Challenge Every Company, published in 1996. Grove tells how in 1985 he and Gordon Moore realized that Intel, the company they led, needed to get out of the business on which it was founded, making semiconductor memory chips, to concentrate instead on microprocessors. The reaction they encountered as they navigated their company through this “strategic inflection point” won’t surprise anyone who has tried to effect change in an organization. “How can you even think of doing this?” came the chorus from the heads of the company’s memory-chip operations. “Look at all the nifty stuff we’ve got in the pipeline” (even if we are losing our collective shirt to low-cost Japanese competitors).

Grove and Moore persisted, even though the effort entailed shutting down plants, laying off thousands of employees, and giving up what many thought of as the company’s birthright. Intel’s subsequent success in microprocessors, beginning with its celebrated “386” model, would soon make it the world’s largest semiconductor company. Read over the tale of what it took to get there if, in a delusional moment, you’re ever tempted to think that putting strategy into practice is easy, even a seemingly emergent strategy.