A History Of Modern Computing

  • Paul E. Ceruzzi
MIT Press: 1998. 338 pp. $53.95, £24.95

The computer as we know it is barely 50 years old, but it already has a rich and varied history. In writing A History of Modern Computing, Paul Ceruzzi, a curator at the Smithsonian's National Air and Space Museum, is one of the first professional historians to look back past the feeding frenzy of the present decade and describe how we got here.

A better title would have been “A History of Big Old American Computers”, but as a sourcebook for the years between 1945 and 1980 it is a useful collation of who did what, how much it cost and what happened in consequence.

In the beginning, as Ceruzzi states, the computer was a mathematical marvel and not much more. Its use in defence and research dominated the very early days, and continued to be important until the 1970s by stimulating research into semiconductors. Ceruzzi is strongest in his description of how the commercial world first came to computing. It's easy these days to forget how the omnipresence of Big Blue, as IBM was known, once had all the significance of ancient Rome's imperial purple, but this book charts the growth and the consequences of IBM's overweening inertia in some detail.

The coverage of the current state of computing is brief and not always accurate. To correct a few errors: IBM did not integrate the display system into the motherboard of the original PC; Intel's processors were not particularly ill-suited to networking; and to say that the World Wide Web got off to a slow start is to miss entirely the exponential growth that consistently describes so many aspects of modern computing. There is also no mention of the Intel 386 architecture (let alone the Pentium), a breathtaking oversight.

Perhaps these criticisms are not entirely fair: this is a history book and, as Ceruzzi himself says, it is still too early to write the history of the past ten years. He covers the complex interaction between software and hardware development — and, of course, Bill Gates's role in changing that balance for good. But those seeking technological insights will find the book best used as a gloss, a guide to where to look next.

For those whose life and work have involved computing before the microprocessor, A History of Modern Computing will be a readable and fascinating memento. It brings together much of importance, and much that would otherwise be forgotten. Yet it fails to demonstrate how the history of computing has set the context for our current experience, surely a prime curatorial concern. I am glad the book has been written, but much remains to be said.

More on computing

Introduction to Quantum Computers

by Gennady P. Berman, Gary D. Doolen, Ronnie Mainieri and Vladimir I. Tsifrinovich. World Scientific: 1998. $32, £23.

The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places

by Byron Reeves and Clifford Nass. Cambridge University Press: 1998. 323 pp. £10.95. Now available in paperback.