Web design and page obesity

My Observer column last Sunday (headlined “Graphics Designers are Ruining the Web”) caused a modest but predictable stir in the design community. The .Net site published an admirably balanced round-up of comments from designers pointing out where, in their opinions, I had got things wrong. One (Daniel Howells) said that I clearly “had no exposure to the many wonderful sites that leverage super-nimble, lean code that employ almost zero images” and that I was “missing the link between minimalism and beautiful designed interfaces.” Designer and writer Daniel Gray thought that my argument was undermined “by taking a shotgun approach to the web and then highlighting a single favoured alternative, as if the ‘underdesigned’ approach of Peter Norvig is relevant to any of the other sites he discusses”.

There were several more comments in that vein, all reasonable and reasoned — a nice demonstration of online discussion at its best. Reflecting on them brought up several thoughts:

  • The Columnist’s Dilemma: writing a column about technology for a mainstream newspaper means that one is always trying to balance the temptation to go into technical detail against the risk of losing the non-technical reader. Sometimes I get the balance wrong. In this particular case I thought that getting into the pros and cons of, say, using JavaScript to enhance usability would obscure the main point I was trying to make: that there is an epidemic of obesity in web pages, and that it has some downsides.
  • Then there’s the question of what columnists are for. I remember something that Alan Rusbridger, the Editor of the Guardian, said when asked why he employed columnists like Simon Jenkins who annoyed the (mainly left-of-centre) readers of the paper. The essence of Rusbridger’s response, as I remember it, was that he needed to avoid creating an echo chamber — a publication in which readers only received views with which they agreed. Grit in the oyster, if you like. So perhaps one of the responsibilities of a columnist is to be provocative.
  • One thing I wish I had mentioned is that it isn’t just designers who are responsible for data-intensive web pages: slot-in advertising is often the culprit, and there the responsibility for obesity lies with e-commerce. That connects the column to an earlier one, picking up Evgeny Morozov’s point about the way in which the Web has moved from being a cabinet of curiosities to an endless shopping mall.
  • The most common response to the column, though, was a casual shrug. So what if web pages are getting bigger and bigger? Network bandwidth will increase to meet the demand — and this may be a good thing: look at the way the demands of desktop publishing and, later, image and video editing pushed the development of personal computing technology. And of course there’s something in that argument: without the constant pressure to push the envelope, technology stagnates. The problem with that argument, however, is that for many Internet users bandwidth is not infinite. I don’t know what proportion of UK users in rural areas, for example, have a landline broadband connection that generally exceeds 2Mbps, but it sure as hell isn’t 100 per cent. And as more and more people access the Net via mobile connections, bandwidth constraints really matter, and will continue to do so for the foreseeable future.
  • Thanks to Seb Schmoller for the .Net link.

    From web pages to bloatware

    This morning’s Observer column.

    In the beginning, webpages were simple pages of text marked up with some tags that would enable a browser to display them correctly. But that meant that the browser, not the designer, controlled how a page would look to the user, and there’s nothing that infuriates designers more than having someone (or something) determine the appearance of their work. So they embarked on a long, vigorous and ultimately successful campaign to exert the same kind of detailed control over the appearance of webpages as they did on their print counterparts – right down to the last pixel.

    This had several consequences. Webpages began to look more attractive and, in some cases, became more user-friendly. They had pictures, video components, animations and colourful type in attractive fonts, and were easier on the eye than the staid, unimaginative pages of the early web. They began to resemble, in fact, pages in print magazines. And in order to make this possible, webpages ceased to be static text-objects fetched from a file store; instead, the server assembled each page on the fly, collecting its various graphic and other components from their various locations, and dispatching the whole caboodle in a stream to your browser, which then assembled them for your delectation.

    All of which was nice and dandy. But there was a downside: webpages began to put on weight. Over the last decade, the size of web pages (measured in kilobytes) has more than septupled. From 2003 to 2011, the average web page grew from 93.7kB to over 679kB.

    Quite a few good comments disagreeing with me. In the piece I mention how much I like Peter Norvig’s home page. Other favourite pages of mine include Aaron Sloman’s, Ross Anderson’s and Jon Crowcroft’s. In each case, what I like is the high signal-to-noise ratio.
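For readers who asked what “page weight” actually means: it’s the HTML itself plus everything the page pulls in — images, scripts, stylesheets. Here’s a toy Python sketch of the arithmetic (my own illustration, not anything from the column); it takes asset sizes as given rather than fetching them over the network:

```python
from html.parser import HTMLParser


class AssetCollector(HTMLParser):
    """Collects the URLs of external assets referenced by a page."""

    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.assets.append(attrs["href"])


def page_weight(html, asset_sizes):
    """Total bytes: the HTML plus every asset it references.

    asset_sizes maps asset URL -> size in bytes. Here the caller supplies
    the sizes; a real tool would issue an HTTP request per asset."""
    parser = AssetCollector()
    parser.feed(html)
    return len(html.encode("utf-8")) + sum(
        asset_sizes.get(a, 0) for a in parser.assets
    )
```

The point the toy makes is the one in the column: the markup is a rounding error next to the images and scripts it drags in. A real measurer would also chase CSS imports and dynamically loaded resources, which is exactly where the slot-in advertising hides.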

    Untethering the presenter

    I’m not a fan of PowerPoint etc. (for serious reasons and frivolous ones) but because of my book I’ve been doing quite a few talks and using Keynote as a way of providing some structure. I eventually realised that one of the reasons I hate using presentation software is that I felt tethered to my Mac — which is ridiculous given that (a) I hate tethered devices in other contexts and (b) simple wireless solutions are available — like this Kensington Wireless pointer. It communicates using 802.11 (so no line of sight or range problems), runs on 2 AAA batteries, plugs-and-plays with my Mac, has an inbuilt laser pointer (complete with H&S warnings) and — neatest of all — has a slot into which the USB dongle fits when not in use.

    I got one recently and it does what it says on the tin. Then I fell to wishing I was quicker on the uptake: I should have got one of these ages ago.

    The School of Data

    Here’s a fantastic initiative by the Open Knowledge Foundation. (Disclosure: I’m on the OKF’s Advisory Board). What lies behind it is an awareness that there’s a huge — and growing — skills gap in data-analysis, visualisation, etc.

    To address this growing demand, the Open Knowledge Foundation and P2PU are collaborating to create the School of Data.

    The School of Data will adopt the successful peer-to-peer learning model established by P2PU and Mozilla in their ‘School of Webcraft’ partnership. Learners will progress by taking part in ‘learning challenges’ – series of structured, achievable tasks, designed to promote collaborative and project-based learning.

    As learners gain skills, their achievements will be rewarded through assessments which lead to badges. Community support and on-demand mentoring will also be available for those who need it.

    So What Next?

    In order to get the School of Data up and running, the next challenges are:

    • To create a series of learning challenges for a Data Wrangling 101 course. Data wranglers will learn to find, retrieve, clean, manipulate, analyze and represent different types of data.
    • To recruit community leaders to act as ‘mentors’, providing community support and on-demand mentoring for those who need it.
    • To curate, update and extend the existing manuals and reference materials, e.g. the Open Data Handbook and the Data Patterns Handbook.
    • To design and implement assessments which evaluate achievements. Badges can then be issued which recognize the relevant skills and competencies.
    • To openly license all education content (challenges, manuals, references and materials) so that anyone can use, modify and re-use it, including instructors and learners in formal education.
    • To get the word out: promote Data Wrangling 101 to potential participants.
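To give a flavour of what “find, retrieve, clean, manipulate, analyze and represent” means in practice, here’s a tiny wrangling exercise of my own devising in Python (entirely hypothetical, not part of the OKF’s materials): take a messy CSV, normalise it, and produce a summary.

```python
import csv
import io
import statistics

# A deliberately messy sample: stray spaces, inconsistent case, a bad value.
RAW = """country, broadband_mbps
UK, 12.3
uk, 1.8
France, 24.0
france, n/a
"""


def clean_rows(text):
    """Parse the CSV, normalise country names and drop unparseable speeds."""
    rows = []
    for row in csv.DictReader(io.StringIO(text), skipinitialspace=True):
        try:
            speed = float(row["broadband_mbps"])
        except ValueError:
            continue  # 'n/a' and friends are silently discarded
        rows.append((row["country"].strip().lower(), speed))
    return rows


def mean_by_country(rows):
    """Aggregate the cleaned rows into a per-country average."""
    grouped = {}
    for country, speed in rows:
        grouped.setdefault(country, []).append(speed)
    return {c: statistics.mean(v) for c, v in grouped.items()}
```

Trivial as it looks, the shape of the exercise — messy input, a cleaning pass with explicit rules about what gets thrown away, then an aggregate — is most of what day-to-day data wrangling consists of.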

    Get Involved!

    At this stage, the OKF is seeking volunteers to help develop the project. Whether you would like to design educational materials, construct learning challenges, donate money or mentor on the course, we’d love to hear from you! Equally, if you are part of an organisation which would like to join with the Open Knowledge Foundation and P2PU to collaborate on the School of Data, please do get in touch via the linked registration form.

    Apple: ARMing OS X

    Fascinating piece by Charles Arthur in the Guardian pondering the implications of revelations that Apple has been porting OS X to the ARM chip.

    Written by Tristan Schaap, the paper describes working in the PTG [Apple’s Platform Technologies Group] for 12 weeks, porting Darwin to the MV88F6281 – an ARMv5-compatible processor that’s a couple of generations old now. They were then porting Snow Leopard, aka 10.6; Mac OS X is now onto 10.7 (“Lion”), released last year.

    “The goal of this project was to get Darwin building and booting into a full multi-user prompt,” Schaap wrote in the introduction that’s generally visible on the DUT page.

    But in the paper he goes significantly further: “The goal of this project is to get Darwin into a workable state on the MV88F6281 processor so that other teams can continue their work on this platform.” Emphasis added. That tells you: Apple is working on porting Mac OS X to ARM, and thus giving itself fresh options if the ARM architecture – known for its low power demands, but equally not until now seen as a competitor in processing heft to Intel – starts offering the horsepower users need.

    And there have been indications that ARM is moving up the horsepower ratings, even while Intel tries to lower the floor on its chips’ power consumption.

    Democratising web streaming

    Very interesting development. At the moment, webcasting is great but requires significant resources (server and bandwidth) to do on any scale. This could put it within the reach of just about everybody. It’s not ready yet, but should be out by the summer.

    Has Microsoft Word affected the way we write?

    This morning’s Observer column.

    Here’s a trick question: who’s produced the most books in the past 30 years? Answer: a guy called Charles Simonyi. Eh? Well, I said it was a trick question. Mr Simonyi, you see, is the chap who created Microsoft Word, which is the word-processing program used by perhaps 95% of all writers currently extant, and although Simonyi didn’t actually write any books himself, the tool he made has definitely affected the ways texts are created. As Marshall McLuhan was fond of saying, we shape our tools and afterwards they shape us.

    I write with feeling on the matter. When I started in journalism, I wrote on a manual typewriter. After I’d composed a paragraph, I would look at it, scribble between the lines, cross out words, type some more before eventually tearing the page out of the machine and retyping the para on a fresh sheet. This would go on until my desk was engulfed in a rising tide of scrunched-up balls of paper.

    So you can imagine my joy when Mr Simonyi’s program appeared…

    The piece has attracted some very thoughtful comments.

    The ideas man

    I’ve long been an addict of Edge.org, the website/salon founded by John Brockman. I finally got to interview him for the Observer.

    To say that John Brockman is a literary agent is like saying that David Hockney is a photographer. For while it’s true that Hockney has indeed made astonishingly creative use of photography, and Brockman is indeed a successful literary agent who represents an enviable stable of high-profile scientists and communicators, in both cases the description rather understates the reality. More accurate ways of describing Brockman would be to say that he is a “cultural impresario” or, as his friend Stewart Brand puts it, an “intellectual enzyme”. Brand goes on helpfully to explain that an enzyme is “a biological catalyst – an adroit enabler of otherwise impossible things”.

    The first thing you notice about Brockman, though, is the interesting way he bridges CP Snow’s “Two Cultures” – the parallel universes of the arts and the sciences. When profilers ask him for pictures, one he often sends shows him with Andy Warhol and Bob Dylan, no less. Or shots of the billboard photographs of his head that were used to publicise an eminently forgettable 1968 movie. But he’s also one of the few people around who can phone Nobel laureates in science with a good chance that they will take the call.

    The USB Typewriter

    This is the kind of stuff that makes my day — and makes normal people wonder what geeks are smoking.

    Lovely, lovely idea. Leading edge uselessness I call it. And that’s a compliment: it’s about doing ingenious things just for fun. You can get the kit from here. Now, where did I put that old Olivetti portable?

    Thanks to Brian for the link.

    The new Digital Divide

    From today’s NYTimes.

    The world’s congested mobile airwaves are being divided in a lopsided manner, with 1 percent of consumers generating half of all traffic. The top 10 percent of users, meanwhile, are consuming 90 percent of wireless bandwidth.

    Arieso, a company in Newbury, England, that advises mobile operators in Europe, the United States and Africa, documented the statistical gap when it tracked 1.1 million customers of a European mobile operator during a 24-hour period in November.

    The gap between extreme users and the rest of the population is widening, according to Arieso. In 2009, the top 3 percent of heavy users generated 40 percent of network traffic. Now, Arieso said, these users pump out 70 percent of the traffic.

    Michael Flanagan, the chief technology officer at Arieso, said the study did not produce a more precise profile of extreme users. But the group, he said, was probably diverse, with a mix of business users gaining access to the Internet over a 3G network while traveling, and individuals with generous or unlimited mobile data packages watching videos, the main cause of the excess traffic.

    Interesting data. At the moment, only about 13 per cent of the world’s 6.1 billion cellphones are smartphones, according to Ericsson, the leading maker of mobile network equipment, but the rate exceeds 30 per cent in larger markets like the United States, Germany and Britain. My (informal) guess, based purely on observing those around me in the street and on trains, is that the proportion of smartphones is much higher than that in the UK.

    The increasing penetration of smartphones is a one-way street — and, as Jonathan Zittrain, Tim Wu and others have pointed out — the destination it’s heading towards is not necessarily an attractive one in terms of freedom and innovation.

    As the NYT report puts it:

    The more powerful phones are rapidly replacing the simpler, less voracious devices in many countries, raising traffic levels and pressure on operators to keep pace. In countries like Sweden and Finland, smartphones now account for more than half of all mobile phones … About 35 percent of Finns also use mobile laptop modems and dongles, or modems in a USB stick; one operator, Elisa, offers unlimited data plans for as little as 5 euros, or $6.40, a month.

    As a result, Finns consume on average 1 gigabyte of wireless data a month over an operator’s network, almost 10 times the European average. As more consumers buy smartphones, the level of mobile data consumption and congestion will rise in other countries.
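Out of curiosity: the kind of concentration figure Arieso reports — the top 1 per cent of users generating half the traffic — is a straightforward calculation once you have per-user usage data. A minimal sketch, with made-up numbers of my own rather than Arieso’s:

```python
def top_share(usage, fraction):
    """Share of total traffic generated by the heaviest `fraction` of users.

    usage: per-user traffic figures (any consistent units);
    fraction: e.g. 0.01 for the top 1 per cent.
    At least one user is always counted."""
    ranked = sorted(usage, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)


# Toy distribution of ten users, totalling 1000 units of traffic:
usage = [500, 90, 80, 70, 60, 50, 50, 40, 30, 30]
# The top 10% (the single heaviest user) accounts for half the total.
print(top_share(usage, 0.1))  # 0.5
```

With a distribution this skewed, the headline numbers stop looking surprising: it only takes a handful of video-watchers on unlimited plans to dominate a cell’s capacity.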