The coronavirus pandemic shows few signs of ending. After drastic lockdowns brought the first wave under control, second waves more severe than the first have emerged in many European nations. In the United States, infections have already entered a third wave, reaching unprecedented numbers of daily new cases. Aside from more than a million deaths worldwide, the pandemic has tested the fortitude and resilience of people everywhere, triggering heated arguments over public health policy.

It’s clear that strict lockdowns kept public health facilities from being overwhelmed during the first wave, and were wisely invoked by governments that were otherwise unprepared to stop the spread of the virus. But these harsh measures also brought enormous economic, social and psychological costs, and many people have grown restless and defiant, with sporadic protests and even riots recently breaking out in France, Italy and Germany.

Only a few nations, including South Korea and New Zealand, have been so proficient in testing, tracing and quarantining the infectious as to avoid the need for lockdown. Those of us living elsewhere face a paradox: we detest the very lockdown measures that have saved us. We cannot allow the pandemic to rage out of control, yet neither can we bear the awful costs of locking down. We’re looking into a future haunted by the spectre of sporadic and unpredictable lockdowns.

Logically, however, there may be another way, as a team of scientists and engineers have been exploring (M. Bin et al., Preprint at https://arxiv.org/abs/2003.09930; 2020). As children learn, it’s possible to put a finger into the flame of a candle without being burnt, as long as it moves in and out rapidly enough, giving sufficient time for cooling. This idea of high-frequency cycling offers a way to realize average conditions between two extremes, and forms the basis of countless engineering devices, from the automobile engine to the hard disk drives in most computers. An effective control strategy for the coronavirus pandemic, this research proposes, might also be found in a rapid, predictable weekly alternation between lockdown and no lockdown.
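The principle behind this intuition can be seen in a small numerical sketch; the relaxation time, targets and duty cycle below are purely illustrative assumptions, not values from the paper. A quantity relaxing towards a rapidly switched target settles near the duty-cycle-weighted average of the two extremes, whereas slow switching lets it swing between them.

```python
import numpy as np

# Minimal sketch of duty-cycle averaging (illustrative numbers, not from the paper).
# A temperature relaxes towards a switched target: "flame" while the finger is in
# the candle, "air" while it is out. If the switching is fast compared with the
# relaxation time, the temperature hovers near the duty-cycle-weighted average.
def simulate(period, duty, t_end=50.0, dt=0.001, tau=5.0, hot=600.0, cool=20.0):
    t = np.arange(0.0, t_end, dt)
    temp = np.empty_like(t)
    temp[0] = cool
    for i in range(1, len(t)):
        target = hot if (t[i] % period) < duty * period else cool
        temp[i] = temp[i - 1] + dt * (target - temp[i - 1]) / tau  # first-order relaxation
    return temp

def last_cycle_range(temp, period, dt=0.001):
    n = int(period / dt)
    return round(float(temp[-n:].min()), 1), round(float(temp[-n:].max()), 1)

duty = 0.3                              # fraction of each cycle spent "in the flame"
fast = simulate(period=0.1, duty=duty)  # switching much faster than tau
slow = simulate(period=20.0, duty=duty) # switching slower than tau
print("duty-cycle weighted average of targets:", duty * 600.0 + (1 - duty) * 20.0)
print("fast switching, range over last cycle:", last_cycle_range(fast, 0.1))
print("slow switching, range over last cycle:", last_cycle_range(slow, 20.0))
```

With fast switching the quantity barely ripples around the weighted average; with slow switching it swings between values near the two extremes.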

There’s much to question about the idea, perhaps most importantly whether people would accept it. But in principle it would allow more, though still limited, economic activity, give people the certainty they need to plan, and keep viral numbers in check.

As the pandemic drags on and people tire of restrictions, many nations have been examining the usefulness of short, strategic lockdowns applied intermittently. The idea is to trigger these on the basis of precise data, such as the demand on healthcare facilities, new daily infection numbers and local effective reproduction numbers. As Bin and colleagues note, however, this approach is exposed to the many sources of uncertainty affecting COVID-19 transmission: delays and errors in measuring and reporting the various numbers, and the inherent uncertainty in how long it takes an individual to become symptomatic, infectious and eventually recovered. Further uncertainty surrounds the variability of human behaviour in following safety measures.

Trying to make precise interventions based on such uncertain data can, as earlier studies have shown, lead to unexpected instabilities, which can themselves trigger secondary waves of the epidemic. Moreover, the unpredictability of such interventions burdens economic activity, as it makes planning difficult over anything but very short horizons. Hence, Bin and colleagues argue, a preferable alternative may be to devise policy that relies only on data averaged over intervals, making it insensitive to unavoidable but insignificant fluctuations.

In their study, the authors first derive theoretical results supporting the intuition that consistent pulsing of lockdown conditions should result in an average epidemic behaviour, an outcome lying between those of full lockdown and no lockdown. Using a standard model from epidemiology, they show that careful choice of the duty cycle (the proportion of lockdown to non-lockdown days within each period) yields an average behaviour that makes possible a compromise between supporting economic activity and limiting epidemic growth. This result may seem obvious, but it isn’t, as non-intuitive feedbacks and instabilities are entirely possible.
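A rough numerical sketch conveys the flavour of that result, using a basic SIR model rather than the authors’ more detailed analysis, and with purely illustrative transmission rates and duty cycle. Weekly switching between lockdown and no-lockdown transmission rates produces an epidemic peak close to that of the duty-cycle-averaged rate, sitting between the two extremes.

```python
import numpy as np

def sir(beta_of_day, gamma=0.1, days=400, i0=1e-4):
    """Discrete-day SIR model; beta_of_day maps a day index to a transmission rate."""
    s, i = 1.0 - i0, i0
    infected = []
    for day in range(days):
        beta = beta_of_day(day)
        new_inf = beta * s * i      # new infections this day
        new_rec = gamma * i         # new recoveries this day
        s, i = s - new_inf, i + new_inf - new_rec
        infected.append(i)
    return np.array(infected)

# Illustrative (assumed) transmission rates for no-lockdown and lockdown conditions.
beta_open, beta_lock = 0.3, 0.08
duty = 4 / 7                        # e.g. four lockdown days in every seven
beta_avg = duty * beta_lock + (1 - duty) * beta_open

peaks = {
    "no lockdown":        sir(lambda d: beta_open).max(),
    "permanent lockdown": sir(lambda d: beta_lock).max(),
    "weekly switching":   sir(lambda d: beta_lock if (d % 7) < 4 else beta_open).max(),
    "averaged rate":      sir(lambda d: beta_avg).max(),
}
for name, peak in peaks.items():
    print(f"peak fraction infected, {name}: {peak:.3f}")
```

In this toy setting the weekly-switched epidemic peaks at roughly the same level as the one driven by the averaged transmission rate, far below the no-lockdown peak, while permanent lockdown extinguishes the outbreak but at maximal economic cost.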

Also not so obvious is the difference this approach might make to public acceptance of policies as they change over time. Basing policies on instantaneous data, the authors note, is dangerous precisely because of data uncertainty: small fluctuations can demand rapid policy changes for no good reason, only to require reversal soon after. This not only wastes resources but also erodes public confidence. In contrast, averaging uncertain data over longer periods yields more stable long-term trends, such as whether mean levels of infection are rising or falling. This approach is the epidemiological equivalent of putting some inertia into the steering of a racing car, so that the driver doesn’t overcompensate for every minor mis-steer and end up in an unstable spin. Modern cars are engineered to avoid this; public health policies should be too.
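A toy example, with hypothetical numbers rather than data from the paper, illustrates the point: a policy triggered by each day’s raw case count flips back and forth as noise crosses the threshold, while one triggered by a trailing weekly average reverses itself far less often.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily case counts: a slowly rising trend buried in reporting noise.
days = np.arange(60)
trend = 100 + 2 * days                                          # true underlying level
reported = rng.poisson(trend) + rng.integers(-40, 41, size=days.size)

threshold = 160          # assumed trigger level for tightening measures

# Policy 1: react to each day's raw figure.
raw_decisions = reported > threshold
# Policy 2: react to a trailing 7-day average.
weekly_avg = np.convolve(reported, np.ones(7) / 7, mode="valid")
avg_decisions = weekly_avg > threshold

def flips(decisions):
    """Count how often the policy reverses itself from one day to the next."""
    return int(np.sum(decisions[1:] != decisions[:-1]))

print("policy reversals using raw daily data:", flips(raw_decisions))
print("policy reversals using 7-day averages:", flips(avg_decisions))
```

The averaged signal crosses the threshold essentially once, when the underlying trend does, whereas the raw signal repeatedly flips the policy on and off around that point.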

Now, it’s important to emphasize, as the paper does prominently on page one, that most of these authors are not epidemiologists. They’re engineers. Epidemiology has a very long history, and none of it should be overlooked. As these authors note, the idea of periodic intervention isn’t new to epidemiologists, and it already plays a role in many strategies for controlling diseases such as measles. What is new here is the emphasis on high-frequency variation of policy, and a careful study, from the perspective of control systems theory, of the plausibility of such an approach. Given the limited success of current policies, it’s an idea that, at the very least, deserves serious attention.