25 February 1999
Is the reliable prediction of individual earthquakes a realistic scientific goal?
The recent earthquake in Colombia (Fig. 1) has once
again illustrated to the general public the inability of science to predict
such natural catastrophes. Despite the significant global effort that
has gone into the investigation of the nucleation process of earthquakes,
such events still seem to strike suddenly and without obvious warning.
Not all natural catastrophes are so apparently unpredictable, however.
For example, the explosive eruption of Mount St Helens in 1980 was preceded
by visible ground deformation of up to 1 metre per day, by eruptions of
gas and steam, and by thousands of small earthquakes, culminating in the
magnitude 5 event that finally breached the carapace. In this example,
nearly two decades ago now, the general public had been given official
warning of the likelihood of such an event, on a timescale of a few months.
So, if other sudden onset natural disasters can be predicted to some degree,
what is special about earthquakes? Why have no unambiguous, reliable precursors
been observed, as they commonly are in laboratory tests (see, for example,
Fig. 2)? In the absence of reliable, accurate prediction
methods, what should we do instead? How far should we go in even trying
to predict earthquakes?
The idea that science cannot predict everything is not new; it dates
back to the 1755 Great Lisbon earthquake, which shattered contemporary
European belief in a benign, predictable Universe1. In the eighteenth-century
'Age of Reason', the picture of a predictable Universe
was based on the spectacular success of linear mathematics, such as Newton's
theory of gravitation. The history of science during this century has
to some extent echoed this earlier debate. Theories from the earlier part
of the century, such as Einstein's relativity, and the development of
quantum mechanics, were found to be spectacularly, even terrifyingly,
successful when tested against experiment and observation. Such success
was mirrored in the increasing faith that the general public placed in
science. However, the century is closing with the gradual realization
by both practitioners and the general public that we should not expect
scientific predictions to be infallible. Even simple nonlinear systems
can exhibit 'chaotic' behaviour, whereas more 'complex' nonlinear systems,
with lots of interacting elements, can produce remarkable statistical
stability while retaining an inherently random (if not completely chaotic)
component2. The null hypothesis to be disproved
is not that earthquakes are predictable, but that they are not.
The question to be addressed in this debate is whether the accurate,
reliable prediction of individual earthquakes is a realistic scientific
goal and, if not, how far we should go in attempting to assess the predictability
of the earthquake generation process. Recent research and observation
have shown that the process of seismogenesis is not completely random:
earthquakes tend to be localized in space, primarily on plate boundaries,
and seem to be clustered in time more than would be expected for a random
process. The scale-invariant nature of fault morphology, the earthquake
frequency-magnitude distribution, the spatiotemporal clustering of earthquakes,
the relatively constant dynamic stress drop, and the apparent ease with
which earthquakes can be triggered by small perturbations in stress are
all testament to a degree of determinism and predictability in the properties
of earthquake populations3,4. The debate here
centres on the prediction of individual events.
For the purposes of this debate, we define a sliding scale of earthquake
'prediction' as follows.
- Time-independent hazard. We assume that earthquakes are a
random (Poisson) process in time, and use past locations of earthquakes,
active faults, geological recurrence times and/or fault slip rates from
plate tectonic or satellite data to constrain the future long-term seismic
hazard5. We then calculate the likely occurrence
of ground-shaking from a combination of source magnitude probability
with path and site effects, and include a calculation of the associated
errors. Such calculations can also be used in building design and planning
of land use, and for the estimation of earthquake insurance.
- Time-dependent hazard. Here we accept a degree of predictability
in the process, in that the seismic hazard varies with time. We might
include linear theories, where the hazard increases with time after the
previous event6, or the idea of a 'characteristic
earthquake' with a relatively similar magnitude, location and approximate
repeat time predicted from the geological dating of previous events7.
Surprisingly, the tendency of earthquakes to cluster in space and time
includes the possibility of a seismic hazard that actually decreases
with time8. This would allow the refinement
of hazard to include the time and duration of a building's use as a
variable in calculating the seismic risk.
- Earthquake forecasting. Here we would try to predict some of
the features of an impending earthquake, usually on the basis of the
observation of a precursory signal. The prediction would still be probabilistic,
in the sense that the precise magnitude, time and location might not
be given precisely or reliably, but that there is some physical connection
above the level of chance between the observation of a precursor and
the subsequent event. Forecasting would also have to include a precise
statement of the probabilities and errors involved, and would have to
demonstrate more predictability than the clustering referred to in time-dependent
hazard. The practical utility of this would be to enable the relevant
authorities to prepare for an impending event on a timescale of months
to weeks. Practical difficulties include identifying reliable, unambiguous
precursors9-11, and the acceptance of an
inherent proportion of missed events or false alarms, involving evacuation
for up to several months at a time, resulting in a loss of public confidence.
- Deterministic prediction. Earthquakes are inherently predictable.
We can reliably know in advance their location (latitude, longitude
and depth), magnitude, and time of occurrence, all within narrow limits
(again above the level of chance), so that a planned evacuation can
be carried out.
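The time-independent case defined above can be made concrete with a short numerical sketch. The snippet below (illustrative only: the function names and the Gutenberg-Richter parameters a and b are hypothetical choices, not values from the text) combines the frequency-magnitude relation mentioned earlier with the Poisson assumption to estimate the probability of at least one large event during a fixed exposure time:

```python
import math

def gutenberg_richter_rate(magnitude, a, b):
    """Annual rate of events with magnitude >= `magnitude`, from the
    Gutenberg-Richter frequency-magnitude relation log10(N) = a - b*M."""
    return 10.0 ** (a - b * magnitude)

def poisson_exceedance_prob(annual_rate, years):
    """Probability of at least one event during `years` of exposure,
    under the time-independent (Poisson) assumption: 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-annual_rate * years)

# Hypothetical regional parameters, chosen purely for illustration:
# a = 4.0, b = 1.0 gives a rate of 10**(4 - 6) = 0.01 events/yr for M >= 6.
rate_m6 = gutenberg_richter_rate(6.0, a=4.0, b=1.0)
p_50yr = poisson_exceedance_prob(rate_m6, 50.0)  # 1 - exp(-0.5), about 0.39
print(f"P(at least one M>=6 event in 50 yr) = {p_50yr:.2f}")
```

Under the time-dependent refinements described above, the constant annual rate would instead be replaced by a conditional rate that rises, or for clustered seismicity falls, with the time elapsed since the last event.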
Time-independent hazard has now been standard practice for three decades, although new information from geological and satellite data is increasingly being used as a constraint. In contrast, few seismologists would argue that deterministic prediction as defined above is a reasonable goal in the medium term, if not for ever12. In the USA, the emphasis has long since shifted to a better fundamental understanding of the earthquake process and to an improved calculation of the seismic hazard, apart from an unsuccessful attempt to monitor precursors to an earthquake near Parkfield, California, which failed to materialize on time. In Japan, particularly in the aftermath of the Kobe earthquake in 1995, there is a growing realization that successful earthquake prediction might not be realistic13. In China, thirty false alarms have brought power lines and business operations to a standstill in the past three years, leading to recent government plans to clamp down on unofficial 'predictions'14.
So, if we cannot predict individual earthquakes reliably and accurately with current knowledge15-20, how far should we go in investigating the degree of predictability that might exist?
Department of Geology and Geophysics, University of Edinburgh, Edinburgh, UK
- Voltaire, Candide (Penguin, London, 1997; first published 1759).
- Bak, P. How Nature Works: The Science of Self-organised
Criticality (Oxford Univ. Press, 1997).
- Turcotte, D.L. Fractals and Chaos in Geology and Geophysics
(Cambridge Univ. Press, 1991).
- Main, I., Statistical physics, seismogenesis and seismic
hazard, Rev. Geophys. 34, 433-462 (1996).
- Reiter, L. Earthquake Hazard Analysis (Columbia Univ.
Press, New York, 1991).
- Shimazaki, K. & Nakata, T., Time-predictable recurrence
model for large earthquakes, Geophys. Res. Lett. 7, 279-283 (1980).
- Schwartz, D.P. & Coppersmith, K.J., Fault behavior
and characteristic earthquakes: Examples from the Wasatch and San Andreas
fault systems, J. Geophys. Res. 89, 5681-5696 (1984).
- Davis, P.M., Jackson, D.D. & Kagan, Y.Y., The longer
it's been since the last earthquake, the longer the expected time till
the next?, Bull. Seism. Soc. Am. 79, 1439-1456 (1989).
- Wyss, M., Second round of evaluation of proposed earthquake
precursors, Pure Appl. Geophys. 149, 3-16 (1991).
- Campbell, W.H. A misuse of public funds: UN support
for geomagnetic forecasting of earthquakes and meteorological disasters,
Eos Trans. Am. Geophys. Union 79, 463-465 (1998).
- Scholz, C.H. The Mechanics of Earthquakes and Faulting
(Cambridge Univ. Press, 1990).
- Main, I., Earthquakes - Long odds on prediction,
Nature 385, 19-20 (1997).
- Saegusa, A., Japan tries to understand quakes, not
predict them, Nature
- Saegusa, A., China clamps down on inaccurate warnings, Nature
397, 284 (1999).
- Macelwane, J.B., Forecasting earthquakes, Bull.
Seism. Soc. Am. 36, 1-4 (1946).
- Turcotte, D.L., Earthquake prediction, A. Rev.
Earth Planet. Sci. 19, 263-281 (1991).
- Snieder, R. & van Eck, T., Earthquake prediction:
a political problem?, Geol. Rdsch. 86, 446-463 (1997).
- Jordan, T.H., Is the study of earthquakes a basic
science?, Seismol. Res. Lett. 68, 259-261 (1997).
- Evans, R., Assessment of schemes for earthquake prediction:
editor's introduction, Geophys. J. Int. 131, 413-420 (1997).
- Geller, R.J., Earthquake prediction: a critical review,
Geophys. J. Int. 131, 425-450 (1997).
- Main, I.G., Sammonds P.R. & Meredith, P.G., Application
of a modified Griffith criterion to the evolution of fractal damage
during compressional rock failure, Geophys. J. Int. 115,
- Argus, D. & Lyzenga, G.A., Site velocities before
and after the Loma Prieta and the Gulf of Alaska earthquakes determined
from VLBI, Geophys. Res. Lett. 21, 333-336 (1994).