A factor that may prejudice my viewpoint is my age, some 2.2 gigaseconds. The most striking changes in my lifetime have been due to the digital computer. I have seen a vast panorama of calculational technology, starting with tables of four-figure logarithms (leaflets), five figures (booklets) and seven figures (tomes). Later came linear and circular slide rules, clanky electromechanical calculators, the electronic desk calculator, the mainframe computer, the mid-sized computer, the hand calculator, the programmable hand calculator, finally desktop machines: fast, powerful and cheap. Molecular processors will achieve yet more.

As an educator, I can set higher-quality examinations now because students have hand calculators. As a scientist, I no longer have to spend hours doing repetitive calculations on laboratory data and plotting the results by hand. I have a large stack of linear, semilog and log–log graph paper that I cannot bring myself to throw out, even though I shall never use it again. Data processing is quicker, more accurate and more fun. At home, our television set is little used but our two computers are used constantly for data processing, writing, sending e-mail, filing taxes and, yes, scanning the web or playing games. In view of these great advantages, it seems somewhat churlish to voice criticism of computers. But there are negative aspects.

The advent of frequency, rather than amplitude, modulation improved the quality of broadcast communication; satellites and signal digitization improved it further still, allowing a vast expansion of distance learning. Although cheap and efficient, distance learning can be a pedagogical disaster if economics is the sole driving force. On the other hand, recognition that most students learn best by interaction has led to effective instructor/student and student/student e-mail exchanges and newsgroups.

Kelvin's dictum, that if we can't put a number to something we don't really understand it, is true for most of science, but there are limitations. The great discoveries have been in developing a theory to fit (or, more spectacularly, predict) experimental results, best exemplified by the prediction of the existence of Neptune, Darwin's synthesis, Mendeleev's description of undiscovered elements, Einstein's gravitational bending of light and Bohr's explanation of the Balmer lines of the hydrogen spectrum. Computers would only have speeded up the calculations. Deciphering the genetic code, on the other hand, could not have happened without the computer: it is simply awesome bookkeeping.

Large-scale syntheses of new pharmaceutical products are now routine. Typically, an array of 96 samples (it can be as many as 15,000) differing in one or two characteristics is reacted automatically, then sampled and analysed by gas chromatography and mass spectrometry to find a substance with a particular property. Digitization and large-scale data storage are essential because of the enormous amount of information generated. But we are awash in data: almost every day we read of some new potential cure for a disease when all that has actually happened is that it has become easier to analyse data with statistics software and find weak correlations.
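To see how easily such weak correlations arise, here is a minimal Python sketch (mine, not the essay's; the numbers of "patients" and "markers" are arbitrary) that correlates a purely random outcome against thousands of purely random measurements and reports the strongest association it finds.

```python
import numpy as np

rng = np.random.default_rng(0)

n_patients, n_markers = 50, 5000                     # arbitrary sizes for the illustration
outcome = rng.normal(size=n_patients)                # pure noise standing in for a clinical response
markers = rng.normal(size=(n_markers, n_patients))   # pure noise standing in for measurements

# Pearson correlation of the outcome with every marker at once.
oc = (outcome - outcome.mean()) / outcome.std()
mc = (markers - markers.mean(axis=1, keepdims=True)) / markers.std(axis=1, keepdims=True)
r = mc @ oc / n_patients

best = int(np.argmax(np.abs(r)))
print(f"strongest of {n_markers} correlations: r = {r[best]:+.2f} (marker {best})")
```

With 5,000 candidate markers and only 50 patients, the best correlation typically comes out near 0.5, which looks impressive until one remembers that every value in the exercise is noise.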

Computer simulation of reacting systems has enabled great advances in fields where change can be represented analytically. In chemistry, for example, the approximations that hobbled chemical kinetics for most of the past century have been abandoned. The concept of reaction order is valid for elementary steps in a complex mechanism but is meaningless for most overall reactions of the real world. Similarly, steady-state and quasi-equilibrium hypotheses are unnecessary. Computer simulations of chemical systems model the behaviour of the statistical averages of ensembles with populations of perhaps 10²⁰ particles. Biological populations of cells or animals are typically more than ten orders of magnitude smaller, but the techniques are still valid. It is now relatively easy to simulate, say, complex cell growth with a sufficiently convoluted rate law. Computer simulation is seductive, but it does not banish empiricism, and when change is represented by a very complex rate law, a healthy application of Occam's razor may be needed.
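As an illustration of doing without such approximations, the following Python sketch (not from the essay; the mechanism and rate constants are arbitrary) integrates the consecutive reactions A → B → C directly and compares the intermediate's concentration with the classical steady-state estimate, which becomes a result to be checked rather than an assumption to be imposed.

```python
import numpy as np

# Consecutive mechanism A -> B -> C with arbitrary, illustrative rate constants.
k1, k2 = 1.0, 50.0        # per second; k2 >> k1 is where the steady-state idea works best
a0 = 1.0                  # initial concentration of A

def rates(y):
    a, b, c = y
    return np.array([-k1 * a,           # dA/dt
                     k1 * a - k2 * b,    # dB/dt
                     k2 * b])            # dC/dt

def rk4_step(y, dt):
    """One fourth-order Runge-Kutta step: no steady-state or quasi-equilibrium assumption."""
    s1 = rates(y)
    s2 = rates(y + 0.5 * dt * s1)
    s3 = rates(y + 0.5 * dt * s2)
    s4 = rates(y + dt * s3)
    return y + dt / 6.0 * (s1 + 2 * s2 + 2 * s3 + s4)

dt, n_steps = 1e-4, 50_000            # integrate out to t = 5 s
y = np.array([a0, 0.0, 0.0])
for step in range(1, n_steps + 1):
    y = rk4_step(y, dt)
    if step % 10_000 == 0:            # report once per simulated second
        t = step * dt
        b_ss = k1 * a0 * np.exp(-k1 * t) / k2   # the classical steady-state estimate of [B]
        print(f"t = {t:3.1f} s   [B] integrated = {y[1]:.6f}   steady state = {b_ss:.6f}")
```

The same brute-force integration handles mechanisms of dozens of coupled steps, for which no meaningful overall reaction order can be written down.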

The predator–prey oscillations observed by Hudson's Bay Company trappers a century ago require a pair of coupled nonlinear differential equations, the Lotka–Volterra equations, for their simulation, which are then easily solved numerically on the computer. The simulation of traffic flow has recently made important strides. Among gas molecules, the frequency of collisions is proportional to the square of the particle density, but this may not be true for (far smaller) populations of cars. Monte Carlo techniques show that noise level, and with it uncertainty, increases as the population decreases. This sets a limit to the validity of computer simulation.
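The point about noise and small populations can be made concrete with a short Monte Carlo sketch (mine, not the essay's; it uses a simple logistic birth-and-death process rather than the full predator–prey model, and the rates are arbitrary). A Gillespie-style stochastic simulation run at decreasing population sizes shows the relative fluctuation growing roughly as one over the square root of the population.

```python
import numpy as np

rng = np.random.default_rng(1)

def gillespie_logistic(K, b=1.0, d=0.5, t_end=100.0):
    """Exact stochastic simulation of a logistic birth-death process:
    per-capita birth rate b and death rate d + (b - d) * n / K, so the mean hovers near K."""
    n, t, samples = K, 0.0, []
    while t < t_end:
        birth = b * n
        death = (d + (b - d) * n / K) * n
        total = birth + death
        if total == 0.0:                       # the population has gone extinct
            break
        t += rng.exponential(1.0 / total)      # waiting time to the next event
        n += 1 if rng.random() < birth / total else -1
        if t > 20.0:                           # discard the initial transient
            samples.append(n)
    return np.array(samples)

for K in (1000, 100, 10):
    s = gillespie_logistic(K)
    noise = s.std() / s.mean() if s.size and s.mean() > 0 else float("nan")
    print(f"mean population about {K:4d}: relative fluctuation = {noise:.2f}")
```

For a population near a thousand the fluctuations amount to a few per cent; near ten they reach tens of per cent, and the population may even go extinct, which is precisely the regime in which a deterministic simulation stops being trustworthy.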

The digital computer has allowed unprecedented levels of accuracy in data transmission and capacity in data storage. Let us revel in the new technology, but not uncritically embrace the results. An intellect is still required to separate what is truly significant from background noise; digitization does not guarantee understanding. We could store every musical theme that has ever been written in a few gigabytes, but the computer would not be able to generate a single beautiful melody until we can define one.