Instruments and the imagination

Image and Logic: A Material Culture of Microphysics

University of Chicago Press: 1997. Pp. 955. $90, £71.95 (hbk); $34.95, £27.95 (pbk)

The experimenter … is not one person, but a composite…. He is a social phenomenon, varied in form and impossible to define precisely. One thing, however, he certainly is not. He is not the traditional image of a cloistered scientist working in isolation at the laboratory bench. (Alan Thorndike, quoted in Image and Logic)

At the end of the nineteenth century, physics was transformed. The atomic and subatomic world — the realm of the invisible — became the focus of much of the enterprise, and reductionist explanations in terms of the atomic and molecular constituents of matter began to assume a privileged role.

By the early 1920s, visions of unification — at the time, of physics and chemistry — and the belief that the phenomena of nature could be described by a limited number of elementary constituents, the dynamics and interactions of which could be represented by an ultimate, ‘fundamental’ theory, became part of the metaphysics of physics. The formulation of quantum mechanics in the mid-1920s, and its amazing empirical and instrumental successes, further bolstered reductionism and the drive towards a unified description of the forces of nature.

It had been usual until the early 1980s to tell the story of the advances of physics in the twentieth century as a grand narrative beginning with Max Planck and Albert Einstein and culminating with the formulation of the Standard Model of the electroweak and strong interactions in the 1970s. Theoretical understanding took pride of place in the story, and the commitment to reductionism and unification was seen as an important factor in explaining the success of the programme. The Kuhnian model of the growth of scientific knowledge, with its revolutionary paradigm shifts, buttressed the primacy of theory and the view that experimentation and instrumentation were subordinate to, and entrained by, theory.

The situation changed after Ian Hacking, Peter Galison, Bruno Latour, Simon Schaffer and other historians, philosophers and sociologists of science reanalysed and reassessed the practice and role of experimentation. It became clear that accounting for the growth of knowledge in the physical sciences during the twentieth century is more complex than had previously been believed.

In his splendid book, Image and Logic, Galison offers a framework for understanding what physics is about in the twentieth century. He makes a convincing case for regarding experimentation, instrumentation and theory as quasi-autonomous subcultures with languages and practices that are distinct, yet linked and coordinated. Experimental, theoretical and instrumental practices do not all change of a piece: each has its own periodicity, and their relation to one another varies depending on the specific historical situation in which each is embedded. One of Galison's aims is to demonstrate the deep continuity of experimental practices across theoretical and instrumental breaks, and he does this through an analysis of the instruments of modern microphysics that make the invisible both objective and believable.

Galison's focus is on detectors, the instruments, large and small, that were and are the mediators between the production of phenomena and the production of evidence. He identifies and describes the two competing traditions of instrument-making in the field. The image tradition — starting with C. T. R. Wilson's cloud chamber and its progeny of the 1920s and 1930s, continuing with the nuclear emulsions of the 1940s and 1950s, and culminating with the bubble chambers of the 1950s and 1960s — aspired to produce representations of natural processes in all their richness and complexity.

The aim of all the various forms of image-making in that tradition was the production of pictures of such clarity that a single picture would serve as evidence for a new particle or effect. In the logic tradition the instruments were electronic counters that were aggregated and embedded in a matrix of electronic logic circuitry to capture desired events. The first prototypes were the Geiger-Müller, scintillation and Cherenkov counters, the progenitors of the spark and wire chambers. These are counting — rather than image-making — devices that gather masses of data to make statistical arguments for the existence of a particle or an effect.

In the early 1970s, a symbiosis of the two traditions took place, resulting in electronic bubble chambers, which are detectors that produced computer-generated pictures using data generated by wire chambers. These advances were made possible by the revolution in microelectronic circuitry and the newly developed charge-coupled devices.

At the same time the successful operation of storage rings made experiments with colliding beams standard practice. In a conventional accelerator, the collision products in a fixed-target experiment travel primarily in the forward direction. With colliding beams, the collision products spew out in all directions, hence the requirement for larger, more complex, multicomponent 4π detectors. These multicomponent electronic detectors, such as the Mark I and the time projection chamber, introduced a further change of scale, in the physical size of the detector, in the number and kinds of people involved and, most importantly, in the role of the computer and of computing. These transformations altered both the definition of the experimenter and the structure of demonstration, and brought about a re-evaluation of what it meant to be an author. Galison characterizes this shift from pure to hybrid devices as a transition from the ‘modern’ to the ‘postmodern’.

In his introduction Galison indicates that the framing questions for his book were: “Who counts as an experimenter?” and “What counts as an experiment?”. Image and Logic establishes how the answers to these questions changed over the century as accelerators increased in energy, as the site where the experiments are carried out was transformed, as engineers took over the function of running the accelerators, as computers assumed an ever more central role in running the operation and in interpreting the data, and as the nature of the interactions between the actors that make experimentation possible — experimenters, engineers, theorists, computer programmers, calorimeters, computers and so on — changed.

But the book is much more than that. It offers a detailed, valuable account of what made possible the developments in experimental high-energy physics. Galison examines not only the general context, but also probes deeply into the lives of institutions and individuals. We obtain accounts of how universities, instrumentation and the management of scientific activities changed during the Second World War. We are given sensitive portraits of, among others, Charles T. R. Wilson, Marietta Blau, Cecil Powell, Donald Glaser, Luis Alvarez, Stanislaw Ulam and David Nygren, individuals who, by their innovations, were responsible for radically transforming the enterprise of high-energy physics.

We are also shown the cunning of reason at work: Glaser inventing a table-top ether-filled bubble chamber in the hope of salvaging the practices of cosmic-ray physics in which a single experimenter can be responsible for and in control of all aspects of an experiment, only to see the instrument transformed in Alvarez's hands into a very sizeable hydrogen-filled bubble chamber. Because this detector was ideal for the new accelerators coming on line, it became the centrepiece of an industrial-scale operation involving cryogenic and electronic engineers, and teams of scanners. It was able to produce such a huge mass of data that it required the automation of the scanning process. This in turn entrained a scientific way of life, in which data analysis essentially became the experiment, totally at odds with Alvarez's original vision. Eventually Alvarez himself left the field, finding its gargantuan aspects unpalatable.

The chapter on the time projection chamber describes what is involved in the design and construction of large, heterogeneous pieces of apparatus. Galison reports what it takes to carry out such projects successfully when the level of complexity makes it impossible for any one individual to master all the technical details; where consequently no ‘one’ is in charge and a collaborative style of interaction must be implemented, and where seemingly it is the computer and its languages that hold the enterprise together. Galison melds the technical factors, the managerial aspects and the interpersonal relationships into a seamless narrative that conveys both the drama and the impressive achievement of the project.

Galison has an ability to synthesize that overwhelms. The chapter on ‘artificial reality’ is a wonderful introduction to the art and science of simulation and modelling, to Monte Carlo methods and the genesis of these practices in the design of fission and fusion weapons, and to the way in which simulation has become a subdiscipline in many areas of science and has transformed practice in those areas. In high-energy physics it has generated a new kind of phenomenology that blurs the line between experiment and theory. The careful reader will also find out how a geometrical interpretation of logic was derived by Ulam from his computing activities, and many other such interesting insights.

As should be clear from the above, I consider Image and Logic to be a magnificent achievement. As a penetrating history of experimental high-energy physics during the twentieth century, it will be of great interest to physicists, historians, sociologists and philosophers of science. But Image and Logic is more than that. It is a brief for ‘mesoscopic history’: for writing history at a level between macroscopic, universalizing history and microscopic, nominalistic history.

Galison proposes treating the movement of ideas, objects and practices as one of local coordination — both social and epistemic — made possible through the establishment of pidgins and creoles. For Galison these ‘interlanguages’ become a central decentred metaphor. In the case of physics, the picture that Galison paints is that of separate but correlated subcultures bound and stabilized by these interlanguages. Galison's plea for this new historiography will be widely discussed, for his book will surely be very influential and generative.
