Fluctuations in heart rate can signal disease and may prove fatal. A measurement, based on entropy, of how 'surprising' the beat irregularity is, distinguishes healthy hearts from those suffering common forms of illness.
Although it goes unnoticed, the healthy heart beats irregularly. The intervals between beats fluctuate widely, following complicated patterns reminiscent of music [1]. The origin, significance and diagnostic value of this heart-rate variability are a subject of sustained interest for biologists and, more recently, for physicists too. Writing in Physical Review Letters, Madalena Costa and colleagues [2] analyse the statistics of samples of heart beats. They find that healthy people and patients with heart conditions can be consistently differentiated by a surprisingly simple measure based on a thermodynamic concept: entropy.
The problem Costa et al. tackle is that of how best to describe a discrete series of consecutive events, in this case the time series of consecutive beat-to-beat intervals. There are already many ways of addressing this problem, but no single one seems able to solve it. Like others before them, the authors have used entropy to capture the complexity of the time series. Entropy estimates the average uncertainty of a series of discrete events; in the case of the heart, it estimates, on average, how much the next beat-to-beat interval will surprise us. If the intervals are equally spaced, and thus unsurprising (for example, in the unrealistic case of a perfect clock), entropy is at a minimum. At the other extreme, if the beat-to-beat intervals are randomly distributed over a wide range, no single event can be easily anticipated, and entropy is at a maximum.
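The two extremes can be made concrete with a minimal sketch (not the authors' method): bin a series of intervals into a histogram and compute its Shannon entropy. The interval values and the bin count here are arbitrary illustrations.

```python
import math
import random

def shannon_entropy(intervals, n_bins=20):
    """Shannon entropy (in bits) of a histogram of beat-to-beat intervals."""
    lo, hi = min(intervals), max(intervals)
    width = (hi - lo) / n_bins or 1.0   # guard against a zero-width histogram
    counts = [0] * n_bins
    for x in intervals:
        counts[min(int((x - lo) / width), n_bins - 1)] += 1
    total = len(intervals)
    return 0.0 - sum((c / total) * math.log2(c / total) for c in counts if c)

random.seed(0)
clock = [0.8] * 1000                                      # a perfect pacemaker: identical intervals
noisy = [random.uniform(0.5, 1.1) for _ in range(1000)]   # intervals scattered over a wide range

print(shannon_entropy(clock))   # 0.0 bits: nothing surprises us
print(shannon_entropy(noisy))   # close to the maximum, log2(20) ~ 4.3 bits
```

The clock yields zero entropy because every interval falls in the same bin; the widely scattered intervals spread across all bins and approach the maximum possible uncertainty.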
A straightforward measurement of entropy can capture some of the predictability of heart-rate variability, but has limitations. For instance, in conditions such as atrial fibrillation (twitching of the upper chambers of the heart) that produce highly erratic rate fluctuations, entropy is actually comparable to that in perfectly healthy hearts. Therefore, researchers are adding new quantities derived from entropy in their attempts to find better ways of teasing apart different physiological and pathological states.
Costa et al. [2] were inspired by the fact that physiological time series often exhibit novel features at different timescales. They hypothesized that entropy could vary according to the timescale at which it is measured. By coarse-graining the original time series at various scales, from the raw sequence itself to averages over non-overlapping windows of up to twenty beats, they obtained a family of time series from which they calculated a measure of entropy at each scale. The resulting quantity, termed 'multiscale entropy', showed that for the heart-beat time series of healthy people, entropy asymptotically approaches a constant value as the measurement scale is increased (Fig. 1). Intuitively, this is what one would expect, because, as in the case of music, in healthy beat-to-beat time series new features come into play at each timescale. Consequently, regardless of the length of the intervals at which the series is observed, the uncertainty remains constant. On the other hand, the analysis of heart patients shows a departure from this 'constancy of surprise', with distinctive multiscale-entropy patterns. In patients with atrial fibrillation, entropy decreases, but the opposite is seen in patients with cardiac failure (Fig. 1). Thus, multiscale entropy finds differences that are missed by other approaches.
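The procedure can be sketched as follows. This is a simplified reading of the method: Costa et al. use 'sample entropy' with pattern length m = 2 and a tolerance r of 0.15 times the standard deviation of the original series, with r held fixed across scales. The synthetic white-noise record below is illustrative only, standing in for a series whose entropy falls with scale rather than staying constant as in healthy hearts.

```python
import math
import random

def coarse_grain(x, scale):
    """Average the series over non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m, r):
    """SampEn: -log of the conditional probability that two stretches matching
    for m points (within tolerance r) also match for m + 1 points."""
    def count(mm):
        n = len(x) - mm
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if all(abs(x[i + k] - x[j + k]) <= r for k in range(mm)):
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a and b else float("inf")

def multiscale_entropy(x, max_scale=20, m=2):
    mean = sum(x) / len(x)
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))
    r = 0.15 * sd   # tolerance fixed from the ORIGINAL series, as in Costa et al.
    return [sample_entropy(coarse_grain(x, s), m, r) for s in range(1, max_scale + 1)]

random.seed(1)
white = [random.gauss(0.8, 0.1) for _ in range(1000)]   # uncorrelated 'beat' intervals
mse = multiscale_entropy(white, max_scale=4)
print([round(v, 2) for v in mse])   # entropy falls as the scale grows
```

For uncorrelated noise, averaging shrinks the fluctuations relative to the fixed tolerance, so entropy drops with scale; the signature Costa et al. report for healthy hearts is precisely that it does not.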
Although most efforts, including those of Costa et al., are concerned with the statistics of thousands of beats, work is also being done on the dynamics of just a few beats. The precise sequence in which the heart rate fluctuates is very informative. For instance, it is known that the lengths of two consecutive beat-to-beat intervals cannot be made arbitrarily different, because they are limited by the intrinsic dynamics of the heart's own pacemaker. Think of a swing in motion: if you try to speed it up or slow it down with isolated pushes, you find that two consecutive oscillations cannot be made arbitrarily different. In fact, the degree to which an oscillator can be advanced or delayed by a single perturbation is a dynamical signature of the system, and has been used to predict many aspects of its dynamics [3].
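The swing analogy can be made concrete with a toy integrate-and-fire pacemaker, a deliberately crude stand-in for the heart's own: a voltage climbs steadily to a threshold, 'fires' a beat, and resets. A single isolated push can shift only the beat it lands in, and by no more than the size of the push itself.

```python
def firing_times(rate=1.0, threshold=1.0, n_beats=5, kick=(0.0, 0.0)):
    """Times of the first n_beats firings of an integrate-and-fire pacemaker.
    `kick` = (time, size) of one isolated perturbation to the voltage."""
    t, v = 0.0, 0.0
    kick_t, kick_v = kick
    kicked = False
    times = []
    dt = 1e-4
    while len(times) < n_beats:
        t += dt
        v += rate * dt
        if not kicked and t >= kick_t:
            v += kick_v          # a single, isolated push
            kicked = True
        if v >= threshold:       # threshold reached: a beat fires
            times.append(t)
            v = 0.0              # reset after each beat
    return times

unperturbed = firing_times()
pushed = firing_times(kick=(0.5, 0.2))   # one push of size 0.2 at t = 0.5

# The push advances the next beat by at most kick size / rate (~0.2 here),
# and the intervals after it return to the intrinsic period.
print(unperturbed[0] - pushed[0])
print(pushed[2] - pushed[1])
```

The bounded advance is the 'dynamical signature' referred to above: how much one perturbation can shift the next firing characterizes the oscillator.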
This reasoning can be reversed: by analysing the dynamics of just a few consecutive beat-to-beat intervals, the status of the system can be inferred. Tateno and Glass [4] showed that this can reveal important information. They developed a technique that could detect atrial fibrillation in patients, based on an analysis of fifty consecutive beat-to-beat intervals. Histograms generated from the differences between consecutive intervals had characteristic shapes that depended on the patient's heart rhythm. By comparing a patient's histogram with standard histograms, Tateno and Glass could identify, with high sensitivity and specificity, whether the patient's rhythm matched that of atrial fibrillation or some other form of arrhythmia. Atrial fibrillation is a common problem, associated with a high risk of stroke, so algorithms that can detect it, especially if they can be incorporated into portable devices, should be useful for optimizing the administration of drugs and other treatments.
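A rough sketch of this style of detection: build the distribution of differences between consecutive intervals from a short window and measure how far it is from a template. Here a plain Kolmogorov-Smirnov distance and synthetic series stand in for Tateno and Glass's clinically derived standard histograms and statistical tests; all values are illustrative.

```python
import random

def delta_rr(intervals):
    """Differences between consecutive beat-to-beat (RR) intervals."""
    return [b - a for a, b in zip(intervals, intervals[1:])]

def ks_statistic(sample, template):
    """Kolmogorov-Smirnov distance between two empirical distributions."""
    xs = sorted(set(sample) | set(template))
    def ecdf(data, x):
        return sum(1 for v in data if v <= x) / len(data)
    return max(abs(ecdf(sample, x) - ecdf(template, x)) for x in xs)

random.seed(2)
# Stand-in for a standard atrial-fibrillation (AF) difference histogram:
template = delta_rr([random.uniform(0.4, 1.2) for _ in range(500)])

# Synthetic 50-beat windows (illustrative, not clinical data):
sinus = [0.8 + random.gauss(0, 0.02) for _ in range(50)]   # steady rhythm: narrow spread
afib = [random.uniform(0.4, 1.2) for _ in range(50)]       # erratic rhythm: wide spread

print(ks_statistic(delta_rr(afib), template))    # small distance: consistent with AF
print(ks_statistic(delta_rr(sinus), template))   # large distance: not AF
```

A portable detector would slide such a 50-beat window along the record and flag windows whose distance to the AF template falls below a calibrated threshold.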
Neither advance, the study of just a few heart beats nor Costa and colleagues' investigation of many [2], touches on the physiological mechanisms that generate the fluctuations. It is well known, though often not well appreciated, that if one variable in an organism is to be kept relatively constant, something else must fluctuate. To maintain a relatively constant supply of nutrients, the heart rate (along with other cardiovascular variables) must somehow adjust to variations in demand; this might explain most heart-rate fluctuations. Of course, it is not known precisely how this happens, nor what mechanisms are responsible for the quantitative features of the fluctuations. A challenging task will be to create a model that, without writing the fluctuations in explicitly, can still reproduce beats possessing all the music we see in healthy hearts. Until the correct statistical behaviour arises out of the dynamics of a physiological model, our understanding of this riddle will not be complete.
1. Appel, M. L., Berger, R. D., Saul, J. P., Smith, J. M. & Cohen, R. J. J. Am. Coll. Cardiol. 14, 1139–1148 (1989).
2. Costa, M., Goldberger, A. L. & Peng, C.-K. Phys. Rev. Lett. 89, 068102 (2002).
3. Glass, L. & Mackey, M. C. From Clocks to Chaos: The Rhythms of Life (Princeton Univ. Press, Princeton, 1988).
4. Tateno, K. & Glass, L. Med. Biol. Eng. Comput. 39, 664–671 (2001).
Chialvo, D. Unhealthy surprises. Nature 419, 263 (2002). https://doi.org/10.1038/419263a