
Much of my time, like that of other researchers, is spent writing proposals to funding agencies. These generally start out by stating the scientific goals and expected results, and even if we think and hope that unforeseen discoveries are likely, they are not a good selling point. It is refreshing, then, to think specifically about paths to maximize the unexpected. By definition, we cannot foresee the unforeseeable, but we can hope to recognize the paths which in the past led to such discoveries, and to make sure we leave them open in the future.

Let us take as examples unforeseen discoveries that made for dramatic advances in cosmology in the past century. Consider the careful spectroscopic observations of nebulae started around 1912 by Vesto Slipher, using the modest 24-inch (61-cm) telescope of Lowell Observatory. The recession velocities of thousands of kilometres per second that he found were completely unexpected, and only in 1917 did de Sitter begin to understand them as galaxies in an expanding Universe. Dark matter, a still-unidentified form of matter that outweighs all normal matter in the Universe and controls the condensation of matter after the Big Bang, was discovered by Fritz Zwicky and Vera Rubin during spectroscopic observations made to weigh galaxies. The rapid motions they measured reflected a gravitational pull too large to be accounted for by ordinary matter in stars and interstellar material. The microwave background radiation, the cooled light of the Big Bang itself, was completely unforeseen by its Nobel prize-winning discoverers, Arno Penzias and Robert Wilson (although not by others).

What common threads do we find in these and the many other unexpected discoveries in astronomy? Very often the observers were pushing new equipment to its limits to accomplish a highly focused goal unrelated to the actual discovery. Slipher was hoping that the diffuse nebulae might turn out to be nearby planetary systems in formation. Zwicky and Rubin were using the largest telescopes to see fainter objects than ever before. Penzias and Wilson were testing a highly sensitive new type of radio telescope for communications.

More generally, unforeseen discoveries are made when the range of observations is enlarged. This happens whenever new parts of the electromagnetic spectrum are opened for observation. New vistas are also opened when larger telescopes or better detectors allow us to reach fainter objects or bigger samples, or to study brighter ones in more detail or with better time resolution. A good discussion of these topics is given by Martin Harwit in his book Cosmic Discovery (ref. 1).

We cannot discuss future telescopes without first pointing out a sea change in astronomy that is likely to drive new instruments in the coming decades. This is the exploration of exoplanets, the planets of other stars. The existence of other living worlds like our own has been the subject of speculation for centuries, but telescopes have not been powerful enough to discern extra-solar planets. In the Solar System, the Earth already occupies the prime location, and our neighbours appear too hot or too cold. Now at last, with clear evidence that giant planets exist around other stars, we have both the technical possibility and the incentive to build radical new telescopes to study them. With them we should be able to find planets even as small as Earth, and to search their spectra for the biochemical signs of life. The potential for unforeseen discovery is enormous. Already the first momentous discoveries have revealed completely unpredicted phenomena: planets of Earth's mass orbiting a pulsar, and planets with the mass of Jupiter orbiting closer to their stars than Mercury does to the Sun, a place far too hot for them to have formed. Who knows what surprises are in store as we begin to explore new worlds, and to turn on the rest of the Universe the new telescopes with the sensitivity to see exoplanets?

Looking ahead at the new technology that should lead to unforeseen discovery, we see first a wave of key advances coming in ground-based telescopes in the next decade, the fruits of investments made in the past decade. Space astronomy's next big leap will come later, the result of the current period of design and development of concepts for big, cold telescopes to study exoplanets and the formation of galaxies. Still further in the future, we may see huge space telescopes of gossamer construction, if the ideas now being formulated are finally realized. I will deal with these areas in turn.

Ground telescopes come of age

During the 40 years after the Palomar 200-inch (5-metre) telescope came into operation, the power of optical telescopes increased enormously, even though none significantly bigger was built. The gains were made by improving detector sensitivity 100-fold to reach the fundamental limit set by photon noise, by extending detector sensitivity into the infrared, and by multiplexing so that dozens or hundreds of objects could be analysed at once. But as these advances have reached their limits, the past decade has seen the construction of a new generation of much larger telescopes. Starting with the two 10-m Keck telescopes, there are now around a dozen of size 6.5–10 m in operation or under construction. With such an increase in collecting area, some profound unforeseen discoveries can be expected. This will be especially true when these telescopes are able to remove atmospheric blurring and are linked together as interferometers. Until now, astronomers have had to choose between small-aperture telescopes in space that are free from blurring (the Hubble Space Telescope, or HST) and large-aperture telescopes on the ground with blurring. Adaptive optics is a new technique that will give both advantages at once (ref. 2). In fact, because the natural limit to resolution set by diffraction improves in proportion to aperture, the bigger ground telescopes will be several times sharper than HST.
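To put numbers on that claim (a rough sketch, comparing apertures at an assumed common near-infrared wavelength of 2.2 μm):

$$\theta \approx 1.22\,\frac{\lambda}{D}\,; \qquad \theta_{10\,\mathrm{m}} \approx 1.22 \times \frac{2.2\times 10^{-6}\ \mathrm{m}}{10\ \mathrm{m}} \approx 0.055''\,, \qquad \theta_{2.4\,\mathrm{m}} \approx 0.23''\,,$$

so a fully corrected 10-m telescope would be about four times sharper than the 2.4-m HST observing at the same wavelength.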

The key element in adaptive optics is a mirror whose shape can be altered rapidly in response to the measured atmospheric distortion. By giving the mirror equal but opposite distortion, the original image sharpness is restored. It has been difficult to make the measurement and correction fast enough to keep up with the constantly changing turbulence, but today's detectors and computers make it possible. Already correction of bright objects has been accomplished, and in a few years we should start to see sharp images of even the faintest objects corrected at infrared wavelengths. When a target itself is too faint to allow fast measurement of atmospheric distortion, an artificial star created by a laser searchlight may be used as a surrogate. Experimental laser systems are in operation, although the combination of exquisite tuning and high power needed to excite scattering very high in the atmosphere is proving elusive.
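The control principle can be caricatured in a few lines of code. This is a minimal sketch, not any observatory's real control law: it assumes a hypothetical sensor that reports the residual wavefront directly at each actuator, whereas real systems reconstruct the wavefront from slope measurements at kilohertz rates:

```python
import numpy as np

rng = np.random.default_rng(0)
n_act = 64            # number of deformable-mirror actuators (assumed)
gain = 0.5            # loop gain; real systems tune this against lag and noise
dm = np.zeros(n_act)  # current mirror commands (zero = flat mirror)

def atmosphere(t):
    """Stand-in for the slowly drifting atmospheric phase error, plus noise."""
    return np.sin(0.01 * t + np.arange(n_act)) + 0.1 * rng.standard_normal(n_act)

for t in range(1000):                # real loops run at roughly 1 kHz
    residual = atmosphere(t) + dm    # wavefront after reflection off the mirror
    dm -= gain * residual            # push the mirror the opposite way

print("rms residual wavefront error:", np.std(atmosphere(1000) + dm))
```

The essential point is the sign: the mirror is driven towards the negative of the measured distortion, so the reflected wavefront stays nearly flat as the turbulence drifts.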

Once adaptive optics are in place, another technique to increase the scope for discovery becomes possible. This is interferometry, long used by radio astronomers, which relies on combining the waves from separate telescopes. By measuring the strengthening and weakening of intensity as the crests and troughs reinforce or cancel each other, images with greatly increased angular resolution are obtained. Interferometry can be extended to optical wavelengths for very high resolution imaging, but sensitivity is poor unless all the waves from each telescope add together coherently. In the presence of atmospheric distortion, coherent addition is possible only when very small apertures are used. But once the full apertures of the large telescopes are corrected with adaptive optics and used as interferometer elements, very faint objects will become accessible.
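In outline (a sketch that omits geometric factors of order unity), two telescopes separated by a baseline B produce fringes as the path difference varies, and the finest resolvable angle is set by the baseline rather than by the individual apertures:

$$I(\theta)\;\propto\;1 + V\cos\!\left(\frac{2\pi B \sin\theta}{\lambda}\right), \qquad \theta_{\min}\approx\frac{\lambda}{B},$$

where the fringe visibility V encodes the structure of the source.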

All of the largest telescopes have been built as multiple units in anticipation of interferometry: Europe's Very Large Telescope (VLT) consists of four telescopes while the Keck and Large Binocular Telescope (LBT) each have two (Fig. 1). Because the LBT mirrors are on a common mounting and closely spaced, as an interferometer it will act like a 22-m telescope with a resolution of 10 milliarcseconds and high infrared sensitivity to complex, faint sources over a wide field of view. It will be able to find and analyse the thermal spectrum emitted by self-luminous giant planets, younger versions of our own Jupiter. The VLT and Keck telescopes are separately mounted and widely spaced (up to 100 m) for resolution as high as 2 milliarcseconds over a field of 1 arcsecond, but at the price of lower sensitivity and some ambiguity in image structure. They should be able to resolve those Jupiter-sized exoplanets mentioned above that are so close to their stars that they are red hot.
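These quoted resolutions are consistent with the λ/B estimate above, as a quick check shows (the 1-μm observing wavelength is my assumption, not stated in the text):

```python
# Angular resolution ~ wavelength / baseline, converted to milliarcseconds.
RAD_TO_MAS = 206265e3    # radians -> milliarcseconds
wavelength = 1.0e-6      # metres (assumed near-infrared wavelength)

for name, baseline_m in [("LBT, 22-m effective aperture", 22.0),
                         ("VLT/Keck, 100-m baseline", 100.0)]:
    print(f"{name}: ~{wavelength / baseline_m * RAD_TO_MAS:.0f} mas")
# -> ~9 mas and ~2 mas, matching the quoted 10 and 2 milliarcseconds.
```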

Figure 1: The Large Binocular Telescope, with two 8.4-m mirrors side by side.

(Drawing by European Industrial Engineering, Mestre, Italy.)

Now that the construction phase of the current generation of big telescopes is winding down, their designers are looking at ways to build bigger optical telescopes. Californians are studying a single telescope of 30-m diameter (the California Extremely Large Telescope); Europeans, one of 100 m (the OverWhelmingly Large Telescope). Arizona's focus is on an imaging interferometer with two 20-m moving telescopes, the 20/20 telescope. With adaptive optics and interferometry, their sensitivity and resolution could far outstrip what we have now. The tasks we can foresee for such telescopes include observations of the very early Universe at much higher resolution and sensitivity than is possible with HST, and the detailed spectroscopic study of Jupiter-like exoplanets. If the closest stars have Earth-like planets, they should be visible in long exposures with the 20/20 telescope. In addition, the prospect of unforeseen discovery is a powerful motivator.

New discoveries are likely not only from such larger telescopes and higher-resolution images, but also from automated analysis of deep-sky images. In the past, photographic plates from 1.2-m telescopes covering 6° of the sky on a side have provided a rich source of data. Now electronic detectors are greatly increasing accuracy, sensitivity and spectral range. Two digital all-sky surveys with 1.2- and 2.5-m telescopes are in progress now, with charge-coupled device detectors for optical wavelengths in a mosaic of area 0.1 m² and near-infrared detectors of 0.001 m². The coming decade should see the construction of the 8-m Large Synoptic Survey Telescope with a 3° field (0.5-m² detector illuminated at f/1.25; ref. 3). The data rate will be prodigious: the night sky will be mapped repeatedly, every four clear nights, creating 10 terabytes of data per map. Two key goals will be finding and measuring the orbits of asteroids as small as 300 m that could hit the Earth, and mapping the spatial distribution of dark matter on large scales. Dark matter is detected through the small but systematic distortion it induces in the shapes of background galaxies. The data reduction for these tasks is enormous, but well defined. The challenge for computer gurus will be to spot the unforeseen — for example, new types of erratic variable objects or subtle large-scale correlations in the data.
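The quoted numbers hang together on a back-of-envelope estimate. Only the 0.5-m² detector area and 3° field are from the text; the pixel size, sampling depth and sky coverage below are my assumptions:

```python
# Back-of-envelope check on the survey data volume.
detector_area_m2 = 0.5
pixel_size_m = 10e-6                  # assumed 10-um pixels
bytes_per_pixel = 2                   # assumed 16-bit samples

n_pixels = detector_area_m2 / pixel_size_m**2          # ~5e9, i.e. 5 gigapixels
gb_per_exposure = n_pixels * bytes_per_pixel / 1e9     # ~10 GB per exposure
print(f"{n_pixels:.1e} pixels, {gb_per_exposure:.0f} GB per exposure")

field_deg2 = 3.14159 * (3 / 2) ** 2    # ~7 deg^2 for a 3-degree-wide field
sky_deg2 = 2e4                         # roughly the sky visible from one site
tb_per_map = sky_deg2 / field_deg2 * gb_per_exposure / 1e3
print(f"~{tb_per_map:.0f} TB per full sky map")
# -> tens of terabytes, within a factor of a few of the quoted 10 TB,
#    depending on overlap and compression choices.
```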

The limitless potential of space

Despite the evolution of much more powerful ground telescopes, space still holds unique and fundamental advantages. Complete freedom from atmospheric distortion will remain a major asset. Even when laser guide stars are used for adaptive correction on the ground, a nearby background star is still needed to sense the overall jitter of the sharpened image. Such stars are common enough for infrared imaging over most of the sky, but optical imaging, which demands faster and more accurate stabilization because of the shorter wavelengths, will be restricted to fields near a bright star. Tomographic, three-dimensional mapping of atmospheric turbulence with multiple laser guide stars may relieve this optical restriction and open up wide-field correction. Although possible in principle, this will be very challenging.

The second unique capability of telescopes in space is for observations in the ultraviolet and infrared spectral regions blocked by the atmosphere. Even where it transmits in the infrared, the atmosphere is bright from heat and molecular emission. Additional background heat comes from the ground telescope optics themselves, which cannot be cooled because they would frost over. Space telescopes can be made extremely cold, for a million-fold reduction in sky background.
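The quoted million-fold gain refers to the total background seen by the detector, which in deep space bottoms out at the faint glow of sunlit interplanetary dust; the telescope's own thermal emission falls even more steeply with cooling, as a quick Planck-law estimate illustrates (the wavelength and temperatures are my assumptions):

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck(wavelength_m, T):
    """Blackbody spectral radiance, W m^-3 sr^-1."""
    x = h * c / (wavelength_m * k * T)
    return 2 * h * c**2 / wavelength_m**5 / math.expm1(x)

wl = 10e-6   # 10 um, in the thermal infrared (assumed for illustration)
ratio = planck(wl, 280.0) / planck(wl, 50.0)
print(f"mirror emission drops by a factor of {ratio:.0e} on cooling 280 K -> 50 K")
# -> many orders of magnitude, far below the interplanetary-dust floor
```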

The current generation of space telescopes has already shown how these advantages can be exploited. HST records wide-field optical images typically ten times sharper than those from uncorrected ground-based telescopes, and reaches far into the vacuum ultraviolet. Its capability has been steadily improved by astronauts: for example, a new instrument was added to extend imaging and spectroscopic capability to the near infrared, where the natural sky background from sunlit interplanetary dust is much darker than on Earth. HST is likely to remain the only large optical telescope in space for astronomy for at least the next decade. Its dominance will be weakened by a strong challenge from ground telescopes, but perhaps that will allow time for more speculative use that could lead to unforeseen discovery.

HST's mirrors are too warm to be useful further into the infrared, where the atmosphere blocks transmission from the ground entirely. Here three other space telescopes, the Infrared Astronomical Satellite, the Cosmic Background Explorer and the Infrared Space Observatory, all cooled with liquid helium to within a few degrees of absolute zero, have revealed the very cold Universe. New discoveries can be expected from the Space Infrared Telescope Facility (SIRTF), a new helium-cooled telescope with a 0.85-m aperture and more advanced detectors, which is scheduled for launch in 2002. It will reach to a wavelength of 160 μm.

Beyond these telescopes, the potential for future space astronomy is limitless. In principle, arbitrarily high sensitivity and resolution can be achieved at any desired wavelength, by making the telescope large, accurate and cold enough. Space is an ideal environment, because not only is atmospheric degradation removed, but so also are gravity and wind. This is a huge advantage because, under such forces, big objects bend more than small ones, whereas no matter how big a telescope is made, the mirror surface accuracy cannot be relaxed if the waves are to be reflected to a sharp focus.

What might a future telescope look like, built for operation in orbits far from the Earth where distorting forces are minimal? The actual working part of a telescope mirror is the top layer of the reflecting surface, which is only a few hundred aluminium or silver atoms deep. Thus there is the possibility in space of using extremely thin membrane mirrors covering hundreds or thousands of square metres, if a way can be found to hold their shape. One way is to stretch a membrane flat from the edge, and then to curve it very slightly by electrostatic force. An experimental telescope made in this way from a membrane 600 nm thick (a few thousand atoms) was used to record the image shown in Fig. 2. While very small, this mirror paves the way for larger membrane telescopes, with tests planned first on the ground and then in space.

Figure 2: The moon imaged with a membrane telescope.

The f/500 primary mirror, curved by electrostatic force, had a thickness of 600 nm and a diameter of 6 mm. (Reproduced from ref. 5.)

For such gossamer telescopes in space, the main disturbance comes from sunlight, which both warms spacecraft and pushes them. Light pressure was strong enough to noticeably disturb the orbit of the 100-foot (30.5-m) Echo balloon satellite launched in 1960 (only 4 years after Sputnik), and would blow a 600-nm membrane right out of the Solar System. Gossamer telescopes would therefore have to be shielded from sunlight, but the shields must be lightweight too, and held off the telescope somehow, for solar pressure would push a freely orbiting shield into collision with the shielded mirror. Future gossamer telescopes may look more like sailing ships than the telescopes we now know.
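That claim survives an order-of-magnitude check: for a bare 600-nm film, radiation pressure at 1 AU outpulls the Sun's gravity (the film density below is an assumption):

```python
# Order-of-magnitude check that sunlight would expel an unshielded membrane.
S = 1361.0          # solar constant at 1 AU, W/m^2
c = 2.998e8         # speed of light, m/s
GM_sun = 1.327e20   # Sun's gravitational parameter, m^3/s^2
AU = 1.496e11       # astronomical unit, m

thickness = 600e-9  # m, from the text
density = 1400.0    # kg/m^3, assumed for a thin polymer film

a_rad = 2 * S / c / (thickness * density)   # perfectly reflecting, Sun-facing
a_grav = GM_sun / AU**2                     # Sun's gravity at 1 AU

print(f"radiation: {a_rad:.1e} m/s^2, gravity: {a_grav:.1e} m/s^2")
# radiation (~1e-2) exceeds solar gravity (~6e-3), so the film is unbound
```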

Suppose at some time we find a planet that shows clear chemical evidence of life. Could we image it? The largest space telescopes we can conceive use no mirrors at all and are extremely heavy, focusing light by gravitational bending. Figure 3a shows how multiple images of the same very distant galaxy are formed and distorted into magnified arcs by an intervening blob of dark matter, traced out by the brighter galaxies held in its gravitational field. The distortion was removed to yield the image shown in Fig. 3b. Our deepest views of the distant Universe will probably rely on finding chance alignments like this to aid our biggest ground and space telescopes. There is no chance that a nearby black hole will lie conveniently in front of an interesting planet and magnify it, but we could use the Sun as a gravitational lens by sending a detector out of the Solar System in the direction opposite to the planet. Once it is far enough away, more than 600 times the radius of the Earth's orbit, light from the planet bent around the Sun will be brought to a focus, forming a large image. Taking a detector to such a distance may sound difficult, but it is still far easier than interstellar travel to even the nearest star, 1,000 times further away.
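The "far enough" distance follows from the relativistic bending angle for light passing the Sun at impact parameter b; a textbook estimate for rays grazing the solar limb:

$$\alpha=\frac{4GM_\odot}{c^2 b}, \qquad d=\frac{b}{\alpha}=\frac{c^2 b^2}{4GM_\odot}\approx\frac{(3.0\times10^8)^2\,(6.96\times10^8)^2}{4\times1.33\times10^{20}}\ \mathrm{m}\approx 8\times10^{13}\ \mathrm{m}\approx 550\ \mathrm{AU},$$

so the focal line begins roughly 550 times the Earth–Sun distance away; rays passing higher above the limb, clear of the bright solar corona, focus beyond that, consistent with the figure of 600 quoted above.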

Figure 3: Imaging distant galaxies.

a, HST image showing multiple blue images of a distant background galaxy magnified by gravitational lensing. (From HST archive.) b, Rectified image of the galaxy. (Reproduced from ref. 6.)

The next space telescopes

Limited only by the laws of physics and our imagination, we can see enormous potential for space telescopes. But what path do we now take? We must be guided by a judicious mix of scientific goals and technological opportunity. The astronomy community has identified two key scientific goals to shape the development of more powerful space telescopes. One is to probe deeply the highly redshifted, early Universe of which we have caught a first glimpse with HST. It is best observed in the virtually unexplored 2–5-μm region of the electromagnetic spectrum, where the natural sky background is darkest. The other goal is the exploration of other planetary systems, with particular emphasis on looking for Earth-like planets, a search that from the ground will be restricted to only the very nearest stars. It is best attempted from space by detecting the infrared heat emitted by planets (10–20 μm) rather than the starlight they reflect. Their thermal emission is much brighter relative to the star, and the spectroscopic signatures of key diagnostic molecules, water and ozone, could be very strong, as they are in Earth's spectrum. Ozone would be of enormous interest, as in our atmosphere it is of biological origin (ref. 4).
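A rough blackbody estimate shows why the thermal infrared is so favourable. The Earth and Sun temperatures and radii are standard values; treating both bodies as blackbodies is of course an approximation:

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck(wl, T):
    """Blackbody spectral radiance (overall units cancel in the ratio)."""
    return 1.0 / wl**5 / math.expm1(h * c / (wl * k * T))

R_earth, R_sun = 6.37e6, 6.96e8    # radii in metres
area_ratio = (R_earth / R_sun) ** 2

for wl, label in [(10e-6, "10 um thermal"), (0.55e-6, "0.55 um thermal")]:
    contrast = planck(wl, 288.0) / planck(wl, 5778.0) * area_ratio
    print(f"{label}: planet/star ~ {contrast:.0e}")
# -> ~2e-7 at 10 um; in the visible the thermal term is utterly negligible,
#    and even the planet's reflected sunlight is only ~1e-10 of the star.
```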

Remarkably enough, both of these major scientific objectives involve similar technical advances at the previously neglected shorter wavelengths of the thermal infrared spectrum. Both need mirror temperatures in the range of 30–100 K, which should be reachable without expendable refrigerants by shielding carefully against solar heating and by lofting the telescope so far from the warm Earth that the planet appears no bigger than the Moon. Mirrors even larger than HST's 2.4 m are desirable in both cases, once the technology can be developed to make them. The extreme lightweighting of membrane structures is not yet needed, but the passive, rigid mirrors that are the current standard for all space telescopes will no longer be adequate. We will have to learn how to transfer to space the key technology that has made the new, big ground telescopes possible — namely, active control to correct distortion caused by mechanical or thermal disturbance. Ground telescopes automatically check star images as often as every minute and make appropriate corrections of alignment and mirror distortion to maintain optical quality.

NASA administrator Dan Goldin has encouraged astronomers and aerospace companies to set ambitious goals. Detailed studies are being made of a telescope envisaged to succeed HST, the Next Generation Space Telescope (NGST), with a primary mirror 8 m in diameter operating at a temperature of 50 K (Fig. 4). Because the practical limit for a telescope launched in one piece is 4–6 m, depending on the size of future rocket fairings, it would be necessary to build the mirror and telescope in space; planning has so far focused on automatic deployment from an unmanned payload. The whole spacecraft will have to weigh far less than HST, even though the telescope has ten times the mirror area, if it is to be carried a million miles from the Earth.

Figure 4: Evolution of infrared space telescopes.

Shown to relative scale are SIRTF, a 0.85-m cryogenic telescope to be launched in 2002, and a concept for the 8-m NGST. Both will orbit far from the Earth, with thermal shields on the sunward side.

The mind-stretching NGST goal has been a most effective way to focus research in universities, industry and government centres. New technologies have been identified, and development work is going on in many areas. As an example, Jim Burge at the University of Arizona has been devising ways to take the mature technology used at the Mirror Lab for 8-m glass mirrors for the ground (weighing 18 tons) and produce an 8-m mirror for space weighing less than 1 ton. The idea, demonstrated 4 years ago at the 0.5-m scale, is to keep only the first couple of millimetres of the reflecting glass surface, and replace the rest with an ultra-lightweight carbon-fibre truss. Figure 5 shows the 2-m diameter, 2-mm thick facesheet that will be connected to such a truss by 166 active screw actuators. The whole prototype mirror assembly is one-twentieth the weight of the 2.4-m HST mirror.
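The quoted facesheet mass is easy to check approximately. The corner-to-corner reading of the 2-m hexagon and the glass density are my assumptions; the size and thickness are quoted:

```python
import math

corner_to_corner = 2.0    # m
thickness = 0.002         # m
density = 2500.0          # kg/m^3, typical of borosilicate glass (assumed)

side = corner_to_corner / 2
area = 3 * math.sqrt(3) / 2 * side**2          # regular hexagon, ~2.6 m^2
mass = area * thickness * density
print(f"area {area:.2f} m^2, mass {mass:.0f} kg")
# -> ~13 kg, in good agreement with the 13.6 kg quoted in Fig. 5
```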

Figure 5: A 2-m hexagonal concave glass mirror is lifted clear after separation from the support on which it was thinned and polished.

It is 2 mm thick and weighs just 30 lb (13.6 kg) (ref. 7). (Image courtesy of J. Burge.)

Although ultra-lightweight, active mirrors are vital, they are but one of a range of challenging new technologies being studied for NGST. For example, passive cooling to cryogenic temperature is new, and to cool an 8-m telescope to 50 K the sunshield shown in Fig. 4 must be the size of a tennis court. It must be such a good insulator that less than one-thousandth of the heat from the full blast of the Sun is allowed to leak through to be radiated into space. And if, as currently conceived, the huge, flimsy NGST structure and sunshade are to deploy automatically, how do you test them beforehand in the Earth's gravity to be sure they will work?
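The one-thousandth figure follows from a simple energy balance: the shield must leak less power than a 50 K surface can radiate away. A sketch, assuming unit emissivity:

```python
# Allowed solar leakage for a passively cooled 50 K telescope.
sigma = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant near Earth, W/m^2
T = 50.0           # target telescope temperature, K

radiated = sigma * T**4
print(f"a 50 K surface radiates {radiated:.2f} W/m^2")
print(f"allowed leak fraction: {radiated / S:.0e}")
# -> ~3e-4, i.e. comfortably below one-thousandth of full sunlight
```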

An even more ambitious target has been set for the investigation of Earth-like planets. The basic approach is to detect their heat with a special form of interferometer in space. The beams from the different mirrors must be brought together with an accuracy of a nanometre despite separations of tens of metres. Such accuracy is needed to cancel out the star's electromagnetic waves fully, by maintaining the crests and troughs exactly out of phase while leaving the planet image intact, a technique known as nulling interferometry. A mission called Terrestrial Planet Finder (TPF) is envisaged to explore not just the nearest few stars, but over a hundred Sun-like stars out to a distance of 50 light years. This would require four or more cryogenic 4-m telescopes spread out over 100 m. They could be connected mechanically or flown separately in formation.
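The link between nanometre path control and starlight suppression can be made explicit with the standard two-beam null-depth estimate (the path-error values below are illustrative):

```python
import math

# Two equal beams combined pi out of phase leave a residual stellar
# leakage of sin^2(pi * delta / lambda) for a path-length error delta.
wavelength = 10e-6    # m, the mid-infrared band discussed in the text
for delta in (1e-9, 10e-9):
    leak = math.sin(math.pi * delta / wavelength) ** 2
    print(f"path error {delta * 1e9:.0f} nm -> stellar leakage ~ {leak:.0e}")
# -> ~1e-7 at 1 nm, roughly the Earth/Sun contrast at 10 um computed above;
#    at 10 nm the null is already a hundred times shallower.
```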

Reliable and affordable solutions

Now that these scientific goals and the range of technical challenges required to accomplish them have been identified, we must find the best path to solutions that are proven to work reliably and that will be affordable. We want our telescopes to make unforeseen discoveries, but we do not want to discover unforeseen flaws! I believe the best way is an evolutionary process based on intermediate missions, each aimed at three goals: first, testing one or more of the key technologies needed for the full-scale NGST and TPF missions; second, proving that new technology can be delivered affordably; and third, using it for astronomical exploration.

The last requirement I see as crucial. The only way to find unanticipated flaws is by attempting challenging observations in space. Because each of the multiple technological developments for NGST and TPF is profound, any one of them alone could be used to open new scientific frontiers. Consider what might be done relatively soon. Even an old-fashioned, rigid telescope, if taken far from the Earth and passively cooled to 30 K, could open a new window on the high-redshift Universe. A warm, 4-m, ultra-lightweight telescope could test autonomous active control for high-contrast images and see further than HST. A short nulling interferometer with proven, 1-m class cryogenic mirrors in a distant cold orbit could already find and obtain images and spectra of any Earth-like planets around the very nearest stars.

Such missions could also be configured to test solutions to what may be the most difficult problem: how to endow remote, cold observatories with HST-like longevity and productivity. If there were no possibility of repair and refurbishment, any big space telescope of HST-level or greater complexity would inevitably be risky, and would become dated. The Space Station provides a natural solution, as a base where complete large telescopes and spacecraft could be assembled and tested in the gravity-free space environment. With solar cells deployed, the structure would be pretty flimsy, but a low-thrust tug could, in a few months, gently push it to the cold, stable point a million miles from Earth. By the same means it could be brought back for repairs when necessary. The alternative is to leave the observatory in the cold, and to send out cryogenic robots to fix it in situ.

These are exciting times. While we sort out the challenges of space telescopes, a golden era of discovery should come from the much higher resolution, sensitivity and data throughput of big ground telescopes and interferometers. Then, from space, we should be ready to open wide the relatively unexplored infrared spectrum, which holds such promise for discovery, from the furthest reaches of the Universe to the planets of the nearest ordinary stars.