As virtually every review of this book has noted, the title is slightly misleading. The book is not really about computer scientist Alan Turing at all. Its central character is John von Neumann, the brilliant mathematician who bridged the gap between mathematics and physics in the mid-20th century through the development of the first digital computers, and its setting is the Institute for Advanced Study in Princeton, where Turing spent only two years of his life. (For a quick comparison: von Neumann’s entries in the book’s index fill two and a half pages; Turing’s, half a page.)

I’d be curious to know whether the title was a publisher’s attempt to sell more books or a conscious decision by author George Dyson. In hindsight, it does make some sense: the book is primarily about the development of the first digital computers, which were based on Turing’s 1936 paper on computable numbers. In that paper, Turing set out to answer the *Entscheidungsproblem* that David Hilbert had posed in 1928. In addition to being difficult to pronounce, Hilbert’s problem is difficult to understand, let alone solve. In its most basic (and hopefully not too heavily butchered) form, the Entscheidungsproblem asks for an algorithm, a set of rules, that can answer Yes or No to the question:

*Is the input statement universally valid?*

Turing answered the problem in the negative by constructing a theoretical machine in the language of logic: there is no algorithm that can determine whether an arbitrary statement is universally valid, a result that mirrors the implications of Kurt Gödel’s incompleteness theorems. Just as important, Turing’s work gave engineers the blueprint for coding software: a binary system that answers a series of yes/no questions in the form of zeros and ones.
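To make the idea concrete, here is a minimal sketch of what Turing’s theoretical machine amounts to: a finite set of rules reading and writing symbols on a tape. Nothing here is from the book; the simulator, the `flip` rule table, and the tape contents are all hypothetical, chosen only to show the mechanics of states, symbols, and head movement.

```python
def run_turing_machine(program, tape, state="start", steps=1000):
    """Run a Turing machine until it halts or the step budget runs out.

    `program` maps (state, symbol) -> (symbol_to_write, move, next_state),
    `tape` maps integer positions to symbols (unwritten cells read as "0").
    """
    pos = 0
    for _ in range(steps):
        if state == "halt":
            return tape
        symbol = tape.get(pos, "0")
        write, move, state = program[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    raise RuntimeError("step budget exhausted (machine may not halt)")

# A toy program: flip every bit of the input, halting at the "_" end marker.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

tape = {0: "1", 1: "0", 2: "1", 3: "_"}
result = run_turing_machine(flip, dict(tape))
print("".join(result[i] for i in range(3)))  # -> "010"
```

The catch, and the heart of Turing’s negative answer, is that no such program can be written to decide, for every possible machine and input, whether the machine will ever reach its halt state: the `steps` budget above is a practical dodge, not a solution.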

Turing’s work inspired von Neumann’s team in Princeton, where an elite group of mathematicians, physicists and engineers was building the world’s first digital computers. These machines were put to work on weapons design, most notably the calculations behind the hydrogen bomb. The book is a careful historical account of the people and ideas that made these early computers possible, and it provides a thought-provoking view into the nature of computing and what it means for us both culturally and as a species. As Douglas Hofstadter did in *Gödel, Escher, Bach*, Dyson returns again and again to the connection between DNA and computing: both exist through the replication and transcription of short bits of information which, when strung together, provide the language of genetics in biology and of software in computing. It is an interesting connection to ponder, as it could suggest that computers represent the next stage of Earth’s evolution, their code evolving at a far more accelerated pace than our nearly 4-billion-year-old molecular technology. Von Neumann’s computers (built on Turing’s theory) are only a little more than half a century old. Look what’s happened in computing since then.

George Dyson (son of the physicist Freeman Dyson) has done a fantastic job of researching the many faces, papers, ideas and technologies of this time and place in computing history. At times it feels like he did almost too much research, as the biographies of individual contributors can stretch on in a somewhat dry fashion. I feel the book could be carefully edited down by 50 pages and still tell the same story. That said, if you’re able to commit a good chunk of time to this slow-paced book, you’ll be rewarded with a unique perspective on the newly forming digital universe we have already come to take for granted.
