Turing, the NSA and the decision problem

Interesting post by George Dyson on The Edge site. Excerpt:

This is much bigger than the relative merits of national security vs. the fourth amendment to the U.S. Constitution, or any of the other debates by which the Snowden revelations have been framed. We are facing a fundamental decision (as Turing anticipated) between whether human intelligence or machine intelligence is given the upper hand. The NSA has defended wholesale data capture and analysis with the argument that the data (and metadata) are not being viewed by people, but by machines, and are therefore, legally, not being read. This alone should be cause for alarm.

And what of the current obsession with cyberterrorism and cyberwar? We should deliberately (and unilaterally if need be) abandon the weaponization of codes and the development of autonomous weapons—two different approaches to the same result. They both lead us into battles that can never be won. A good example to follow is the use of chemical and biological weapons—yes, they remain freely available, but we have achieved an almost universal consensus not to return to the horrors of poison gas in World War I. Do we have to repeat the mistake? We are currently taking precisely the wrong approach: fast-tracking the development of secret (and expensive) offensive weapons instead of an open system of inexpensive civilian-based defense.

Fourteen years ago, I spent an afternoon in La Jolla, California with Herbert York, the American physicist of Mohawk ancestry who became Eisenhower’s trusted advisor and one of the wisest and most effective administrators of the Cold War. York was appointed founding scientific director of ARPA and was instrumental both in the development of the hydrogen bomb and its deployment, in a few short years, by a working fleet of Intercontinental Ballistic Missiles, or ICBMs. He was sober enough to be trusted with the thermonuclear arsenal, yet relaxed enough about it that he had to be roused out of bed in the early morning of July 6, 1961, because he had driven someone else’s car home by mistake.

York understood the workings of what Eisenhower termed the military-industrial complex better than anyone I ever met. “The Eisenhower farewell address is quite famous,” he explained to me over lunch. “Everyone remembers half of it, the half that says beware of the military-industrial complex. But they only remember a quarter of it. What he actually said was that we need a military-industrial complex, but precisely because we need it, beware of it. Now I’ve given you half of it. The other half: we need a scientific-technological elite. But precisely because we need a scientific-technological elite, beware of it. That’s the whole thing, all four parts: military-industrial complex; scientific-technological elite; we need it, but beware; we need it but beware. It’s a matrix of four.”

We are much, much deeper in a far more complicated matrix now. And now, more than ever, we should heed Eisenhower’s parting advice. Yes, we need big data, and big algorithms—but beware.