This morning’s Observer column.
To the technology trade, I am what is known as an “early adopter” (translation: gadget freak, mug, sucker). I had a mobile phone in the mid-1980s, for example, when they were still regarded as weird. It was the size of a brick, cost the best part of a grand and exposed me to ridicule whenever I took it out in public. But I didn’t care because the last Soviet president, Mikhail Gorbachev, used the same phone and he was cool in those days. Besides, it had always seemed absurd to me that phones should be tethered to the wall, like goats. I still have that Nokia handset, by the way: it sits at the bottom of a drawer and I sometimes take it out to show my grandchildren what phones used to be like.
Over the decades since, I have always had the latest-model phones – just like all the other early adopters. And of course I used them to make phone calls because basically that’s all you could do with those devices. (Well, almost all: one of mine had an FM radio built in.) And then in 2007 Steve Jobs launched the iPhone and the game changed. Why? Because the Apple device was really just a powerful computer that you could hold in your hand. And it was a real computer; its operating system was a derivative of BSD, the version of Unix developed by Bill Joy when he was a graduate student at Berkeley. (Note for non-techies: Unix is to Windows as a JCB is to a garden trowel.)
The fact that the iPhone could also make voice calls seemed, suddenly, a trivial afterthought. What mattered was that it provided mobile access to the internet. And that it could run programs, though it called them apps…
Further to my Observer column yesterday, this from Bloomberg.
Apple Inc. (AAPL) will reap fees from banks when consumers use an iPhone in place of credit and debit cards for purchases, a deal that gives the handset maker a cut of the growing market for mobile payments, according to three people with knowledge of the arrangement.
That’s a small cut on millions of daily transactions. Adds up to a formidable revenue stream.
This morning’s Observer column.
In the long view of history, though, the innovation that may be seen as really significant is Apple Pay – an ingenious blend of contactless payment technology with security features that are baked into the new iPhones. Apple Pay will, burbled Tim Cook, “forever change the way all of us buy things… it’s what makes the iPhone 6 the biggest advancement in the history of iPhones”.
The idea is to do away with the rigmarole of having to pull out a credit/debit card, insert it in a store’s card reader, type in a PIN, etc. Instead, you simply bump your iPhone (and, eventually, your Apple Watch) against the store’s contactless reader and – bingo! – you’ve paid, and the store never gets to see your card. Why? Because Apple has stored the card details in heavily encrypted form on your device and assigned each card a unique, device-specific number, which is accepted by the retailer’s contactless reader.
This only works, of course, if the retailer has already signed up with Apple. Cook claimed that 220,000 US retailers have already opted in to the system, as well as six major banks, plus MasterCard, Visa and American Express – which means that 83% of all US credit card payment volume can theoretically already be handled by Apple Pay.
If true, this is a really big deal, because it puts Apple at the heart of an unimaginable volume of financial transactions. In a way, the company is now doing to the card payment business what it did to the music business with the iTunes store…
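The tokenisation idea described in the excerpt – a device-specific number standing in for the real card, so the merchant never sees the card details – can be sketched roughly as follows. This is a simplified illustration, not Apple’s actual implementation: the function names, the random 16-digit token and the HMAC-based one-time cryptogram are all assumptions made for the example.

```python
import hashlib
import hmac
import os
import secrets


def provision_device_account_number(card_number: str) -> tuple[str, bytes]:
    """Stand-in for provisioning: the real card number (PAN) is replaced
    by a random, device-specific token. The mapping from PAN to token is
    held by the card network, never revealed to the merchant."""
    token = "".join(str(secrets.randbelow(10)) for _ in range(16))
    device_key = os.urandom(32)  # secret key held in the device's secure element
    return token, device_key


def transaction_cryptogram(device_key: bytes, token: str,
                           amount_cents: int, nonce: bytes) -> str:
    """One-time authorisation code for a single payment: the reader sees
    only the token and this cryptogram, not the underlying card details."""
    message = f"{token}:{amount_cents}:{nonce.hex()}".encode()
    return hmac.new(device_key, message, hashlib.sha256).hexdigest()


# A hypothetical payment: the contactless reader receives only
# the token and the per-transaction cryptogram.
token, key = provision_device_account_number("4111111111111111")
nonce = os.urandom(8)
crypto = transaction_cryptogram(key, token, 499, nonce)
```

Even in this toy form, the point of the design is visible: a cryptogram is tied to one amount and one nonce, so a merchant (or a thief) who captures it cannot replay it for a different transaction.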
The moment I saw Tim Cook introduce Apple Pay I thought: this is the big deal. Reassuring to learn that Dave Winer thought so too:
The way we pay for stuff today is as archaic as the way we bought music before Napster and the iPod. A few years ago, it was clear that all the big tech companies were going to become banks. What else could they possibly do with the piles of cash they were accumulating? They’re going to lend it to us, and we’re going to pay them interest. Over time, the fact that they make hardware or support customers, or have retail stores, will be interesting anachronistic sidelines. Apple, Amazon and Google investors will judge their companies on how well they work as financial institutions. It’s something investors understand, and the money you make in finance comes without the headaches of having to actually make anything.
Apple has hundreds of millions of credit card numbers, and they’ll be useful until they completely replace the banks. Apple is bigger than any of them, and has bank-sized financial resources. And the way we pay for stuff today with little plastic cards, some with chips on them, is backwards. The chips in our phones are much more capable. And putting them on our wrist in a big form factor isn’t interesting. They will be embedded in our keychain next, and then in our actual bodies. It won’t be much longer before we are at least part computer.
Anyway, Apple will be a much better bank than BofA, Citibank or Chase. Consumers will have more rights from Apple than we were given by the bankers and their Washington cronies. Apple still is a fucked up mega-corporation, but they don’t have any reason not to treat us a little better than the guys they’re replacing. It’ll make for the feel-good Christmas commercial, this year and every year from now on.
Apple is simply doing the obvious thing: following the money.
This morning’s Observer column.
Thirty years ago (on 24 January 1984, to be precise), a quirky little computer company launched a new product and in the process changed lives and maybe the world. The company was called Apple and the product was named after a particular type of Californian apple – the Macintosh.
With astonishing chutzpah, the company announced the product to the world via a single advertisement screened during the Super Bowl on 22 January. The film was directed by Ridley Scott and showed a dimly lit auditorium in which ranks of drably clad zombies are being harangued by a despotic figure shown on a huge screen. Into this auditorium comes a beautiful female athlete who runs towards the screen carrying a large hammer, pursued by goons attired in riot police gear. Just as the despot’s rant reaches a climax, the athlete stops, whirls the hammer four times and then launches it at the screen. When it strikes, the screen explodes and the camera pans to the zombies, whose mouths gape in bewilderment. “On January 24th,” intones a voice over the closing scene, “Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like Nineteen Eighty-Four.”
Most people who saw the ad were probably baffled by it. But for some of us, the symbology was clear…
This morning’s Observer column.
It’s 4.30 on a gloomy winter’s afternoon. I’m sitting with my grandson having one of those conversations in which grandsons explain complicated stuff to their grandads. He is four years old, omniscient in the way that four-year-olds are, and tolerant of my ignorance of important matters.
The conversation turns to computing and he inquires whether I have Talking Tom Cat on my iPad. “No,” I say. “What is it?” He explains that it’s a cool game that his grandma has on her iPad. There is a cat called Tom who listens to what you say to him and then repeats it in a funny voice. Also there’s a dog who does funny things.
So I dig out my iPad and we head over to the app store where, sure enough, Talking Tom Cat 2 is available as a free download. A few minutes later it’s running on my iPad…
Read on to find out what happens next.
In the last year, Google has bought just about every small company (i.e. eight companies) doing interesting work in robotics — including Boston Dynamics, whose creature is shown in this video.
In the same period, Apple has, er, instituted a share-buyback program and brought out some incrementally improved products.
So here’s my question (which is prompted by something Jason Calacanis said): which company is focussed on the distant future? The obvious inference seems to be that Apple can’t think of anything really radical to do with its mountain of cash.
UPDATE: Charles Arthur points out that, according to Wikipedia, Apple acquired ten companies in 2013, of which three are involved in mapping and two in semiconductors. So maybe they are up to something.
From one of the last interviews Jobs gave:
I have my own theory about why decline happens at companies like IBM or Microsoft. The company does a great job, innovates and becomes a monopoly or close to it in some field, and then the quality of the product becomes less important. The company starts valuing the great salesmen, because they’re the ones who can move the needle on revenues, not the product engineers and designers. So the salespeople end up running the company. John Akers at IBM was a smart, eloquent, fantastic salesperson, but he didn’t know anything about product. The same thing happened at Xerox. When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off. It happened at Apple when Sculley came in, which was my fault, and it happened when Ballmer took over at Microsoft. Apple was lucky and it rebounded, but I don’t think anything will change at Microsoft as long as Ballmer is running it.
This morning’s Observer column.
Poor Steve has gone to the great computer lab in the sky, but the church he founded endures. And it still knows what is best for its adherents. Recently, the company launched the latest release of its OS X operating system, codenamed Mavericks. What happened was this: one day, while millions of the devout were tapping industriously on their keyboards, a small dialogue box appeared on the top right-hand corner of their screens. It informed them that important upgrades were available for their computers.
For members of the Apple communion, such a message has much the same status as a text from the Vatican would have for devout Catholics. So they acted upon it. And lo! It came to pass that their computers were upgraded…
Terrific post by Dave Winer.
He starts by berating technology journalism for the way it obsesses over Apple.
All the while, tech news has come to dominate all the news, only Apple isn’t it. The big story is the NSA. It’s huge and has been building for 20 years. While we were all watching the public Internet grow, a private, secret one was being developed by the US military. But was it actually hidden? Where were all the comp sci grads going? Some were going to Redmond and Silicon Valley for sure. But a lot of them were going to Maryland and Virginia. The story was available to be grabbed by any enterprising news organization. It wasn’t.
We can learn from the Snowden leaks and adapt and reorganize the way we cover tech. Instead of accepting the stories that the industry feeds us, we can look more broadly, ask our own questions, and seek the answers outside the public relations departments of the big companies. This might result in small rebellions, like asking why the companies remove features from their products that users depend on. And big ones, like sensing things like the NSA’s social network before the leakers show up with all the documents spelling it out.
The sheer size of the Snowden leaks is itself a judgement on the inadequacy of tech journalism. Why were none of these stories broken before? Couldn’t sources have been found to talk off the record? Weren’t there people of conscience inside the tech companies who might tell the truth? Or were the reporters even available to listen to these people?
Tech is where big news is happening this decade. It’s time to start doing it seriously.