Books

A History Of The Digital Age: Review Of Walter Isaacson’s ‘The Innovators’

Anand Gurumoorthy

Jul 10, 2016, 06:03 PM | Updated 06:03 PM IST


Steve Jobs and Bill Gates (TONY AVELAR/AFP/Getty Images)

Isaacson, Walter. The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. Simon & Schuster, 2014.

Walter Isaacson has the rare knack of turning out instant classics. His Einstein: His Life and Universe (2007) was a superlative study of one of the world’s most renowned scientists, while his Steve Jobs (2011) was a memorable portrayal of one of the greatest entrepreneurs of all time. In The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution (2014), Isaacson moves from studying single personalities to a team of pioneers who were instrumental in ushering in the information age.

But before going into The Innovators, I would like to compare it with an earlier book that had a tremendous impact on me: Joel Shurkin’s Engines of the Mind (1984), a history of computing from the beginnings of civilization to the 1980s. When I read Shurkin’s book in 2000, it came as a revelation; before that I was completely ignorant of the history of the computer.

While people like Howard Aiken, Konrad Zuse, John Atanasoff and Alan Turing share credit for the invention of the computer, John Mauchly and Presper Eckert of the University of Pennsylvania’s Moore School of Electrical Engineering were responsible for ENIAC, the first fully electronic computer. As Shurkin first pointed out, Mauchly and Eckert were denied their fair share of credit because the mathematician John von Neumann upstaged them, writing a report (“a first draft”) on their work that was distributed with von Neumann’s name as sole author.

Consequently, Mauchly and Eckert were unable to patent the ENIAC because von Neumann’s “first draft” had put their invention in the public domain.

Reading Isaacson’s version of this story, it seems that von Neumann made his own contributions to the “stored programme” concept and that the credit given to him was not wholly undeserved, contrary to the impression I got from reading Shurkin’s book. I think, on the whole, Joel Shurkin has been too harsh on von Neumann while Isaacson has been too gentle on him.

While Shurkin’s book was more of a straight narrative, Isaacson’s is a series of vignettes on selected characters who shaped the digital age into its present form. One of Isaacson’s aims is to show that the digital age was the product not of lone heroes but of a team of collaborators who stepped in where others left off.

As Isaacson says: “[T]he digital age may seem revolutionary, but it was based on expanding the ideas handed down from previous generations. The collaboration was not merely among contemporaries, but also between generations. The best innovators were those who understood the trajectories of technological change and took the baton from innovators who preceded them.”

The first generation of collaborators were, of course, Charles Babbage and Ada Lovelace. In the twentieth century, there were John Mauchly and Presper Eckert and their team of programmers, such as Jean Jennings and Kay McNulty, who developed the ENIAC. Vacuum tubes were then replaced by transistors, developed by William Shockley, John Bardeen and Walter Brattain. The integrated circuit was created independently by Robert Noyce and Jack Kilby. The Internet was created at the Advanced Research Projects Agency using concepts from J C R Licklider, Bob Taylor and Larry Roberts, among others. The PC generation and the importance of software were heralded by Steve Wozniak, Steve Jobs, Bill Gates and others. The World Wide Web was created in 1990 by Tim Berners-Lee with help from Robert Cailliau.

As Isaacson comments: “[M]ost of the innovations of the digital age were done collaboratively. There were a lot of fascinating people involved, some ingenious and a few even geniuses. This is the story of these pioneers, hackers, inventors, and entrepreneurs - who they were, how their minds worked, and what made them so creative. It’s also a narrative of how they collaborated and why their ability to work as teams made them even more creative.”

Towards the end of the book, Isaacson addresses the question “Can the computer be made to think?”, which is the holy grail of Artificial Intelligence (AI). Deep Blue’s win against Garry Kasparov in 1997 and Watson’s victory at Jeopardy! in 2011 were touted as successes of AI. But both relied on brute-force search to achieve their victories; these computers had negligible intelligence of their own. Isaacson says that one should aim instead for augmented intelligence, where human abilities are complemented by the skills of a computer.

As Isaacson writes:

“In other words, the future might belong to people who can best partner and collaborate with computers.”

I could list the things that Isaacson leaves out: the development of programming languages like FORTRAN and C; the development of fibre optics; the emergence of mobile computing; social networking and microblogging. Isaacson has held the thread steadily, however, and does a wonderful job of leading us in and out of the digital labyrinth.
