Big Ideas in Computer Science

Went to an excellent lecture this morning by Chris Bishop of Microsoft Research. It was part of the wonderful Cambridge Science Festival (of which I’m a Patron) and his topic — “Great Ideas in Computer Science” — is very germane to something I’m writing at the moment.

It’s always intriguing to see what other people regard as key ideas. (I’ve had my own go at this recently in relation to the Internet). So I was agog to see Chris’s list.

They are:

1. Photolithography — the technology that powers Moore’s Law, enabling us to take for granted massive increases in computing power.

2. Algorithms.

3. Cryptography.

4. Machine learning.

5. Biology as computation.

As you’d expect from such an accomplished lecturer (he did the 2008 Royal Institution Christmas Lectures) it was a presentation beautifully tailored to its audience (keen children and their even-keener parents). He illustrated the idea of algorithms by getting five kids to act out a sorting algorithm on stage. For machine learning he had a very nice exposition of how ‘recommender’ engines (e.g. on Amazon) work. And he had some amazing animated simulations of DNA replication in action.
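
For anyone curious about what the children were enacting: a sort simple enough to act out on stage might look like the sketch below. Bubble sort is my guess at the kind of routine involved (the lecture didn’t name the algorithm); adjacent pairs compare heights and swap places until the whole line is in order.

```python
# A guess at a stage-friendly sorting algorithm: bubble sort, in which
# adjacent pairs compare themselves and swap until the whole line is ordered.
def bubble_sort(heights):
    heights = list(heights)  # sort a copy, leave the original alone
    for pass_end in range(len(heights) - 1, 0, -1):
        for i in range(pass_end):
            if heights[i] > heights[i + 1]:  # out of order? swap places
                heights[i], heights[i + 1] = heights[i + 1], heights[i]
    return heights

print(bubble_sort([152, 138, 160, 144, 149]))  # five 'children', by height in cm
```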
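
The recommender idea also fits into a few lines. Here is a minimal sketch of one crude form of collaborative filtering (‘people who bought this also bought…’): score items by how often they co-occur, in other people’s baskets, with things you already own. It illustrates the general idea only; I’m not claiming it is the method Bishop described, still less Amazon’s actual system.

```python
# A toy 'people who also bought' recommender: score candidate items by how
# often they co-occur, in other shoppers' baskets, with items already owned.
from collections import Counter

# Invented purchase histories, purely for illustration
baskets = [
    {"book_a", "book_b", "book_c"},
    {"book_a", "book_b"},
    {"book_b", "book_c", "book_d"},
    {"book_a", "book_d"},
]

def recommend(owned, baskets, n=3):
    scores = Counter()
    for basket in baskets:
        shared = owned & basket
        if shared:                       # this shopper overlaps with us...
            for item in basket - owned:  # ...so count what else they bought,
                scores[item] += len(shared)  # weighted by how much we overlap
    return [item for item, _ in scores.most_common(n)]

print(recommend({"book_a"}, baskets))  # -> ['book_b', 'book_c', 'book_d']
```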

But for me the best bits of the lecture were:

  • The way he introduced public-key cryptography. I was wondering how he was going to explain all the stuff about multiplying prime numbers etc., but instead he bypassed it brilliantly by doing it with colours. (It’s easy to mix yellow and blue to get green, but ferociously difficult to infer the precise shades of yellow and blue used from the green.) There’s a sketch of the arithmetic behind the analogy at the end of these notes.
  • His account of how the Xbox Kinect makes accurate spatial inferences from infra-red cameras and parallax (and the role of machine learning in that). The geometry behind this is also sketched at the end of these notes.
  • His startling assertion that DNA is actually a Turing-complete machine — and the inference that what we currently call “genetic engineering” is really just genetic tinkering and that a future based on computational biology is opening up.
  • I came away brooding about whether the term Computer Science might not be a bit of a misnomer. We use it when trying to persuade the government (and the public) that computing is an academic subject rather than a mere skill (like learning to use Microsoft Excel), because the word ‘science’ connotes difficulty, abstraction and law-like generalisations that are dependable, empirically supported and enduring. But as I walked back to my car I remembered a conversation I had with the late, great Roger Needham, in which he argued that what we call “computer science” actually involves an awful lot of “technology” as well as “science”.

    And, in a way, Chris Bishop’s lecture implicitly acknowledged that. Photolithography, for example, is a technology (though one based on the physics of light). Same goes for machine learning. Cryptography is mostly applied mathematics. So we’re left with the question: what is Computer Science?

    The Oxford CS department says that it’s “about learning and understanding the mathematical, scientific and engineering principles underlying every kind of computing system, from mobile phones and the internet, via systems that interpret natural language, to the supercomputers that forecast tomorrow’s weather or simulate the effects of disease on the human heart.” To be a successful Computer Science student, it continues, “you will need a curiosity about how things work, and the ability to use mathematics to solve problems creatively”. Cambridge describes CS as “a fast-moving field that brings together many disciplines, including mathematics, programming, engineering, the natural sciences, psychology and linguistics”.

    Carnegie Mellon says that “Computer science can organize information, build smaller, faster, more secure systems, create channels of communication and delve deep into complex data sets” and goes on to link it to something called “Computational Thinking” — defined as “how computer scientists determine what can be computed and how to compute it. By looking at the world through the lenses of algorithms and abstraction”.

    CMU makes a big deal of this Computational Thinking idea. (The phrase comes from a much-cited editorial in Communications of the ACM in 2006 by Jeannette Wing, a professor at CMU).

        Computational thinking is a way of solving problems, designing systems, and understanding human behavior that draws on concepts fundamental to computer science. To flourish in today’s world, computational thinking has to be a fundamental part of the way people think and understand the world.

        Computational thinking means creating and making use of different levels of abstraction, to understand and solve problems more effectively.

        Computational thinking means thinking algorithmically and with the ability to apply mathematical concepts such as induction to develop more efficient, fair, and secure solutions.

        Computational thinking means understanding the consequences of scale, not only for reasons of efficiency but also for economic and social reasons.

    Hmmm… I’m not sure how I’d explain that to Michael Gove.
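
As promised above, here is the colour-mixing trick rendered as arithmetic. The analogy maps most naturally onto Diffie-Hellman key exchange: ‘mixing’ becomes modular exponentiation, which is easy to perform but infeasible to reverse at realistic sizes. The toy numbers below are my own; real systems use primes hundreds of digits long.

```python
# Toy Diffie-Hellman key exchange: the arithmetic behind the colour-mixing
# analogy. 'Mixing paints' = modular exponentiation; 'un-mixing' = the
# discrete-logarithm problem, easy at this size but infeasible at real sizes.

p, g = 23, 5  # public 'base colour': toy prime modulus and generator

alice_secret = 6   # private 'shades', never transmitted
bob_secret = 15

# Each party mixes their secret with the public base and sends the result.
alice_public = pow(g, alice_secret, p)  # g^a mod p
bob_public = pow(g, bob_secret, p)      # g^b mod p

# Each then mixes the other's public value with their own secret;
# both arrive at the same shared 'colour', g^(a*b) mod p.
alice_shared = pow(bob_public, alice_secret, p)
bob_shared = pow(alice_public, bob_secret, p)

assert alice_shared == bob_shared
print(alice_shared)  # an eavesdropper sees p, g and the two public values,
                     # but must solve a discrete log to recover this number
```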
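
And, as also promised, the geometry behind the Kinect. The core idea is triangulation: the device projects an infra-red speckle pattern, and the apparent shift of each speckle between the projector’s and the camera’s viewpoints (the parallax, or ‘disparity’) fixes its distance via a one-line formula. The calibration numbers below are invented for illustration, and the machine-learning layer that recognises body parts from the resulting depth map sits on top of this, not in it.

```python
# Depth from parallax (triangulation): the geometric core of how a
# structured-light sensor like the Kinect turns pixel shifts into distances.
# depth = focal_length * baseline / disparity

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Distance to a point, given the shift (in pixels) of a projected
    IR speckle between the projector's and camera's viewpoints."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Illustrative numbers only, not the Kinect's actual calibration:
f = 580.0  # focal length, in pixels
b = 0.075  # projector-to-camera baseline, in metres
for d in (30.0, 15.0, 7.5):  # halving the disparity doubles the depth
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d, f, b):.2f} m")
```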

Postscript: Thanks to Miranda Gomperts, who was also at the lecture and provided me with the link to the DNA animations.