Abstract
Understanding noisy information engines is a fundamental problem of nonequilibrium physics, particularly in biomolecular systems agitated by thermal and active fluctuations in the cell. By the generalized second law of thermodynamics, the efficiency of these engines is bounded by the mutual information passing through their noisy feedback loop. Yet, direct measurement of the interplay between mutual information and energy has so far been elusive. To allow such examination, we explore here the entire phase space of a noisy colloidal information engine, and study efficiency fluctuations due to the stochasticity of the mutual information and extracted work. We find that the average efficiency is maximal for a nonzero noise level, at which the distribution of efficiency switches from bimodal to unimodal, and the stochastic efficiency often exceeds unity. We identify a line of anomalous, noise-driven equilibrium states that defines a refrigerator-to-heater transition, and test the generalized integral fluctuation theorem for continuous engines.
Introduction
The demon envisioned by Maxwell sees gas molecules in a vessel and, by exploiting his knowledge about their motion, extracts mechanical work, apparently violating the second law of thermodynamics^{1,2}. Resolving the paradox of this information engine revealed a deep link between the thermodynamic entropy of the system and the information transmitted about its microstate by the engine’s feedback loop (i.e., the demon)^{3,4,5,6}. But what if the information engine is noisy (or, in Maxwell’s terms, the demon is a bit myopic and cannot measure the molecules precisely)? In this case, one is faced with a fundamental problem of information theory: what is the effect of noise on the capacity of a communication channel to transmit information? A seminal result by Shannon is the noisy channel coding theorem: the capacity of the channel is the maximal mutual information between its input and output^{7,8}.
This “Maxwell meets Shannon” scenario of imperfect information engines is prevalent in nonequilibrium systems^{6,9,10,11}, especially in living systems where signaling and perception are prone to noise^{12,13,14,15}. For example, it was suggested that, by incorporating information feedback loops, the cell’s signal transduction system can adapt to become more resilient to environmental noise^{14}. Thus, owing to their fundamental significance, noisy information engines have been the subject of several theoretical models^{6,9,10,16} and experimental studies^{17,18}. Most of these studies are limited to the measurement of averaged thermodynamic quantities. For example, according to the generalized second law of stochastic thermodynamics^{6}, the average extracted work (or the average information-conversion efficiency) of the engine is bounded by the average acquired information. However, due to their stochastic nature, the thermodynamic observables can fluctuate wildly along a single trajectory and often exceed the bounds set on the averages. Thus, one cannot decipher the magnitude and distribution of these fluctuations solely from the mean values. Therefore, we aim here to measure fluctuations in the work, information, and efficiency of noisy information engines operating over a vast phase space of nonequilibrium steady states.
In advancing our understanding of noisy information engines, there remains a major obstacle: in general, one would expect the performance of the engine to depend on the channel’s capacity as measured by its mutual information. Yet, direct measurement of mutual information has so far been reported either in error-free colloidal engines^{19,20,21,22} or in discrete electronic systems^{17,18}. But more often the signal is noisy and continuous – as in the textbook colloidal models of stochastic thermodynamics or in ubiquitous molecular sensory systems^{13} – and therefore evaluating mutual information requires measurement of the complete input–output probability distribution, which further depends on the noise distribution. Moreover, testing the fundamental limits set by nonequilibrium fluctuation theorems, such as the integral fluctuation theorem generalized for feedback systems^{6,16,23}, necessitates an experiment in which the magnitude and distribution of the noise can be precisely controlled, which has not been achieved so far.
All these motivate us to examine noisy information channels within an experimental setting that can directly measure, control, and vary the mutual information passing through the feedback loop. This allows us to quantify the interplay between the performance of the engine and the capacity of the channel through the entire nonequilibrium phase space of the engine. To this end, we construct a cyclic Brownian information engine that is reset after each cycle of information transfer and work extraction. Such periodically reset engines and information channels are prevalent in stochastic thermodynamics^{24,25}, and especially in living systems; examples include molecular receptors that recurrently bind and unbind signaling ligands^{26} and the main synthesis pathways of the central dogma^{27}.
Our mutual information engine consists of a colloidal particle diffusing within the harmonic potential of an optical trap (Fig. 1). Each cycle begins with a practically instantaneous and error-free detection of the particle position x (Fig. 1a). A Gaussian noise of variance N is added to x, and the feedback loop (the demon) perceives this distorted value, y = x + error, as the particle position. The engine then responds by swiftly shifting the potential center to the perceived particle position y. This is followed by a relaxation step that lasts for a period τ. We measured the phase space of the consequent mutual information and thermodynamic quantities, such as work, heat, efficiency, and their fluctuations, as functions of the parameters τ and N. For finite τ, we obtained a rich variety of nonequilibrium steady states stemming from the feedback–measurement interplay. Besides the usual equilibrium state obtained at long relaxation times (τ → ∞), we also find a line of noise-driven equilibrium states, at equal levels of noise and signal. Across this line, the thermodynamic quantities, as well as their fluctuations, change their qualitative behavior. This line also signifies the transition of the engine from a refrigerator to a heater. We find that engines with a longer cycle period τ are more efficient, as expected. But for a given τ, the most efficient engine is one with a finite noise N (in Maxwell’s terms, a bleary-eyed demon). We show that the efficiency distribution exhibits a transition from bimodal to unimodal, and the stochastic efficiency often exceeds the bound of the generalized second law. The maximal efficiency at nonzero noise stems from the bimodal distribution of the efficiency fluctuations. Finally, we report the first examination of the validity of the generalized integral fluctuation theorem for a mutual information-fueled Brownian engine.
Results
The mutual information engine
The information engine consists of a colloidal particle stochastically moving within the harmonic potential V(x, t) = (k/2)(x – λ(t))^{2} of an optical trap in a bath of temperature k_{B}T = β^{–1} (the experimental setup is expounded in Methods). Here, k is the stiffness of the trap, whose center is at λ(t). The colloidal particle is 2.0 μm in diameter, and its thermally agitated motion is therefore well within the overdamped, low-Reynolds-number regime^{19,28}. Without feedback, the Boltzmann distribution of the particle position describes a Gaussian of variance S ≈ (27.4 nm)^{2} (Fig. 2a). From the variance, we calibrate the stiffness of the trap, k = k_{B}T/S ≈ 5.4 pN μm^{−1}. The timescale of the overdamped dynamics is the characteristic time it takes for the particle to relax towards equilibrium, τ_{R} = γ/k ≈ 3.5 ms, where γ is the Stokes friction coefficient.
During the relaxation step, the measured particle position exhibits a time-varying Gaussian distribution, p(x, t) = G(x, b(t), S(t))^{9,10} (See Supplementary Note 1). Here, G is a Gaussian distribution with time-dependent center b(t) and variance S(t). Let us follow the dynamics of the engine along the ith cycle, beginning when the particle is at position x (with respect to the trap’s center λ_{i−1}). First, the information engine detects x as y. The noisy information channel is represented by the input–output relation p(y∣x) = G(y, x, N)^{8}. The noise broadens the distribution of the perceived position y (Fig. 2b), and the particle position distribution right after the measurement becomes p(x∣y) (See Supplementary Note 1). Next comes the feedback step, when the engine shifts the center of the trap to the position perceived by the noisy channel, λ_{i−1} → λ_{i} = y (Fig. 1a). In the relative frame of reference, the trap center is fixed while the particle position x is reset to x – y. During the last step of the protocol, the system is allowed to relax for a time τ, and the subsequent cycle is repeated.
After many repetitions of the protocol, the engine adequately samples the shift distribution and all probabilities reach a steady state (Fig. 2). Therefore, the distribution of the particle position just after the reset step is exactly the distribution of errors of the Gaussian information channel, G(x, 0, N). The steady-state distribution after the relaxation (at the beginning of the next cycle) is then given by p(x) = G(x, 0, S^{*}), with the variance S^{*} (See Supplementary Note 2)

\(S^ \ast = S + \left( {N - S} \right)e^{ - 2\tau /\tau _R}\)  (1)
Figure 2a shows p(x) widening with the noise level N, in excellent agreement with Eq. (1). Faster engines can narrow or widen the distribution, depending on the noise level N/S, with a minimum, S^{*}/S = 1 – exp(–2τ/τ_{R}), for error-free engines (N/S = 0) (Fig. 2c). The distribution of the measurement outcome y is p(y) = G(y, 0, S^{*} + N) (Fig. 2b).
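The limiting cases of the steady-state variance quoted above can be checked numerically. The sketch below works in reduced units (S = 1, τ_R = 1); the helper name `steady_state_variance` is our own, encoding S*/S = 1 + (N/S − 1)e^{−2τ/τ_R}, which reproduces the error-free minimum stated in the text:

```python
import math

def steady_state_variance(tau, noise):
    """Reduced steady-state variance S*/S after many cycles,
    in units where S = 1 and tau_R = 1 (Eq. (1) of the text)."""
    return 1.0 + (noise - 1.0) * math.exp(-2.0 * tau)

# Error-free engine (N/S = 0): S*/S = 1 - exp(-2 tau / tau_R), as quoted
assert abs(steady_state_variance(0.5, 0.0) - (1.0 - math.exp(-1.0))) < 1e-12
# Long cycles (tau >> tau_R): ordinary equilibrium, S* -> S
assert abs(steady_state_variance(50.0, 0.3) - 1.0) < 1e-12
# Noise-driven equilibrium (N = S): S* = S for any cycle period
assert abs(steady_state_variance(0.1, 1.0) - 1.0) < 1e-12
# Ultrafast engine (tau -> 0): S* -> N, the channel's error variance
assert abs(steady_state_variance(0.0, 0.3) - 0.3) < 1e-12
```

The four assertions correspond exactly to the four regimes discussed in the text: error-free, period-driven equilibrium, noise-driven equilibrium, and ultrafast operation.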
Here, one can discern between two classes of noisy information engines (bleary-eyed demons) (Fig. 1b). The first is the relatively accurate engine (N < S), which utilizes the measurement–feedback steps to narrow the distribution from a variance S^{*} to a variance N < S^{*} (Fig. 2c). During the relaxation step, the distribution spreads back but still remains narrower than the equilibrium one, S^{*} ≤ S. At the extreme, a perfect engine shrinks the distribution down to a delta function just after the feedback, which then expands towards equilibrium during the relaxation step. The other class comprises the more erroneous engines with widely distributed errors (N > S). By performing feedback, such engines widen the distribution to N > S^{*}. When relaxing, the distributions shrink down towards equilibrium, S^{*} ≥ S. The departure of S^{*} from the equilibrium variance S, for any finite cycle τ, can be interpreted in terms of an effective temperature of the particle, k_{B}T_{eff} = kS^{*}. Thus, information engines with N < S perform as refrigerators (T_{eff}/T = S^{*}/S < 1), while those with N > S act as heaters (T_{eff}/T > 1), as shown in Fig. 2c. In this context, the perfect engine (N = 0) with τ = 0 essentially operates at T_{eff} = 0.
The performance of the information engine
In the overdamped regime the kinetic energy of the particle can be ignored, so the change in the potential energy of the particle is exchanged with the surroundings only as heat and work. The potential is shifted much faster (within 20 μs) than the typical relaxation time, such that the particle has no time to move and dissipate energy; therefore, all the potential energy gained by the shift is converted into work. During the relaxation step, since the trap center remains fixed, no work is performed on the particle, and only heat is dissipated. Thus, the work done on the particle during each shifting of the potential center is βW ≡ βΔV = (1/2)βk[(x – y)^{2} – x^{2}]. The average work done on the particle per cycle in steady state \(\left\langle {\beta W} \right\rangle\) and its fluctuation std(βW) are (See Supplementary Note 3)

\(\left\langle {\beta W} \right\rangle = \frac{1}{2}\left( {\frac{N}{S} - 1} \right)\left( {1 - e^{ - 2\tau /\tau _R}} \right)\)  (2)

\({\mathrm{std}}\left( {\beta W} \right) = \frac{1}{{\sqrt 2 }}\sqrt {\left( {\frac{N}{S}} \right)^2 + \left( {\frac{{S^ \ast }}{S}} \right)^2 } \)  (3)
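The steady-state work statistics can also be reproduced by a direct Monte Carlo simulation of the measure, shift, and relax cycle, using the exact Ornstein-Uhlenbeck update for the relaxation. This is a sketch in reduced units (S = 1, τ_R = 1, βk = 1); the function name `mean_work_per_cycle` is our own:

```python
import math, random

def mean_work_per_cycle(noise, tau, n_cycles=200_000, seed=7):
    """Monte Carlo estimate of <beta W> per cycle in reduced units
    (S = 1, tau_R = 1, beta*k = 1): measure x, add Gaussian error,
    shift the trap to the perceived position, relax for tau."""
    rng = random.Random(seed)
    decay = math.exp(-tau)                 # position memory over one relaxation
    kick = math.sqrt(1.0 - decay * decay)  # thermal noise accumulated in time tau
    x, w_sum = 0.0, 0.0
    for _ in range(n_cycles):
        err = rng.gauss(0.0, math.sqrt(noise))  # channel error, y = x + err
        # work of the instantaneous shift: (1/2) k [(x - y)^2 - x^2]
        w_sum += 0.5 * (err * err - x * x)
        x = -err                                # position relative to the shifted trap
        x = x * decay + kick * rng.gauss(0.0, 1.0)  # overdamped relaxation
    return w_sum / n_cycles

# Quasistatic, error-free engine extracts <-beta W> ~ 0.5 per cycle (Eq. 2)
assert abs(mean_work_per_cycle(0.0, 5.0) + 0.5) < 0.02
# Marginal engine (N = S) extracts no work on average
assert abs(mean_work_per_cycle(1.0, 1.0)) < 0.02
```

Because the shift resets the particle's relative position to the (sign-flipped) channel error, successive cycles decorrelate quickly and the sample mean converges rapidly.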
The steady-state average heat supplied to the system 〈βQ〉 during the relaxation step is minus the average work performed on the system during the feedback, 〈βQ〉 = –\(\left\langle {\beta W} \right\rangle\) (See Supplementary Note 3). This shows that for N < S the system is cooled immediately after the feedback control, and net heat flows from the reservoir to the system during the relaxation. The effective cooling decreases with increasing error level until N = S, at which 〈βQ〉 = 0. For N > S, the work performed on the system during the feedback is positive (heating), and net heat flows from the system to the reservoir during the relaxation. Note that our observation of cooling and heating of the system is protocol dependent. For example, previous theoretical work^{9} shows that for a system initially in thermal equilibrium, the average extracted work \(\left\langle { - \beta W} \right\rangle\) is always positive for an optimal protocol in which the particle position is instantaneously shifted to y·S/(S + N).
Figure 3a shows the distribution of extracted work –βW in several regimes of engine accuracy. A quasistatic (τ >> τ_{R}) and perfect engine (N = 0) always extracts positive work, with an average \(\left\langle { - \beta W} \right\rangle\) = 0.498 ± 0.003, in agreement with the theoretical value 0.5 (Eq. 2). Imperfect engines (N > 0) sometimes make mistakes in their feedback and have a nonzero probability of negative extracted work. Engines with relatively good accuracy (N < S) rarely make such mistakes and, on average, always extract positive work from the bath, \(\left\langle { - \beta W} \right\rangle\) > 0, performing as refrigerators. The distribution becomes symmetric for marginal engines (N = S), which extract no work on average, \(\left\langle { - \beta W} \right\rangle\) = 0. At the other extreme, the more erroneous engines (N > S) often shift the trap center too far from the particle, such that the average extracted work is negative, \(\left\langle { - \beta W} \right\rangle\) < 0, performing as heaters. Curves of the extracted work \(\left\langle { - \beta W} \right\rangle\) as a function of the noise level N/S (Fig. 3b) agree with the theoretical prediction of Eq. (2). The maximal work \(\left\langle { - \beta W} \right\rangle\) = 0.5 is extracted by perfect engines whose cycle is long enough to reach equilibrium. While the work extracted by ultrafast engines (τ → 0) vanishes, \(\left\langle { - \beta W} \right\rangle\) → 0, these engines extract the maximum average power, P ≡ \(\left\langle { - \beta W} \right\rangle\)/τ → (1 – N/S)/τ_{R}.
The information gain at the time of measurement is the mutual information between the particle position x and the measurement outcome y, and is given by I = ln[p(x∣y)/p(x)]. Then the average steady-state mutual information \(\left\langle I \right\rangle\) and its standard deviation std(I) are (See Supplementary Note 3)

\(\left\langle I \right\rangle = \frac{1}{2}\ln \left( {1 + \frac{{S^ \ast }}{N}} \right)\)  (4)

\({\mathrm{std}}\left( I \right) = \sqrt {\frac{{S^ \ast }}{{S^ \ast + N}}} \)  (5)
Since resetting the trap center erases any mutual information between x and y, 〈I〉 is the net average information gain per cycle, 〈ΔI〉 = 〈I〉. The measured 〈I〉 is smaller for larger noise level N and shorter cycles τ, agreeing with Eq. (4) (Fig. 3c and its inset). However, it always remains greater than the average extracted work, 〈I〉 ≥ \(\left\langle {\beta W} \right\rangle\), in accord with the generalized second law of thermodynamics^{6}. The information gained by perfect engines (N = 0) diverges.
Well-equilibrated engines that have enough time to relax, τ/τ_{R} → ∞, attain the capacity of the classical Gaussian channel^{8}, \(\left\langle I \right\rangle = (1/2)\ln (1 + S/N)\). At the other extreme, ultrafast engines (τ → 0) still gain \(\left\langle I \right\rangle = (1/2)\ln (2)\) nats (i.e., ~½ bit) from each cycle. This value is also the information gained by observing a particle fluctuating with a variance equal to the accuracy of the measurement, S^{*} = N (Fig. 3c inset). At this extreme, the feedback step does not alter the distribution, leading to noise-driven equilibrium. Finally, it follows from Eqs. (1) and (4) that increasing the cycle period improves the information capacity of relatively accurate engines (N < S), but worsens that of the more erroneous ones (N > S); consequently, the extracted work is suppressed, as shown in the inset of Fig. 3c.
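Both limits of the per-cycle information follow from Eq. (4) once the steady-state variance is inserted. A numerical sketch in reduced units (S = 1, τ_R = 1), with the hypothetical name `mean_information`:

```python
import math

def mean_information(noise, tau):
    """Average mutual information per cycle in nats, Eq. (4):
    <I> = (1/2) ln(1 + S*/N), in reduced units S = 1, tau_R = 1."""
    s_star = 1.0 + (noise - 1.0) * math.exp(-2.0 * tau)  # Eq. (1)
    return 0.5 * math.log(1.0 + s_star / noise)

# Well-equilibrated engine: classical Gaussian channel capacity (1/2) ln(1 + S/N)
assert abs(mean_information(0.25, 50.0) - 0.5 * math.log(5.0)) < 1e-9
# Ultrafast engine: (1/2) ln 2 nats (~1/2 bit), independent of the noise level
assert abs(mean_information(0.25, 0.0) - 0.5 * math.log(2.0)) < 1e-9
assert abs(mean_information(3.0, 0.0) - 0.5 * math.log(2.0)) < 1e-9
```

The second pair of assertions makes the noise-driven-equilibrium point explicit: at τ → 0 the steady-state variance equals the channel error variance, so every ultrafast engine gains the same ½ ln 2 nats per cycle.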
Figure 3d shows the fluctuations in extracted work, std(–βW), and mutual information, std(I), as functions of the normalized cycle period τ/τ_{R}. For the more erroneous engines (N > S), the fluctuations in work and mutual information decrease with increasing cycle period, as expected. However, they increase for relatively accurate engines (N < S).
Engines operated by perfect feedback loops are not the most efficient ones (at least for the current feedback protocol). To see this, we compute and measure the average efficiency of information-to-work conversion, \(\bar \eta \equiv \left\langle { - \beta W} \right\rangle /\left\langle I \right\rangle\) (Fig. 4a). For any given cycle period τ, the most efficient engines are noisy (bleary-eyed) ones. The global maximum \(\bar \eta \approx 0.48\) is obtained by a slower engine at N/S ≈ 0.36, which uses only 〈I〉 ≈ 0.67 nats ≈ 0.97 bits of information per cycle. Retrieving more information on the particle position would yield only a diminishing return. Ultrafast engines (τ = 0.5 ms) are most efficient at N/S ≈ 0.26, although the extracted work is very small, since the particle has little time to relax between cycles. These ultrafast engines use merely 〈I〉 ≈ 0.5 bits per cycle. Engines with N > S exhibit negative efficiency. Our observation of maximum efficiency at finite error could not be predicted from the recently demonstrated discrete information engine^{18}, whose efficiency is maximal at N/S → 0.
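In the quasistatic limit, the maximum of the average efficiency at finite noise can be located by combining the work and information expressions quoted in the text. A sketch in reduced units (S = 1, τ_R = 1); the function name `avg_efficiency` is our own:

```python
import math

def avg_efficiency(noise, tau):
    """Average efficiency <-beta W>/<I> in reduced units (S = 1, tau_R = 1),
    combining the steady-state work and information expressions."""
    e2 = math.exp(-2.0 * tau)
    s_star = 1.0 + (noise - 1.0) * e2                # Eq. (1)
    work_out = 0.5 * (1.0 - noise) * (1.0 - e2)      # <-beta W>
    info = 0.5 * math.log(1.0 + s_star / noise)      # <I>
    return work_out / info

# Scan the noise level for a quasistatic engine (tau >> tau_R)
grid = [i / 1000.0 for i in range(10, 990)]
best = max(grid, key=lambda n: avg_efficiency(n, 50.0))
# Maximum efficiency ~0.48 sits at a finite noise level near N/S ~ 0.36
assert 0.25 < best < 0.45
assert 0.46 < avg_efficiency(best, 50.0) < 0.50
# A nearly perfect detector (N/S -> 0) is *less* efficient
assert avg_efficiency(0.01, 50.0) < avg_efficiency(best, 50.0)
```

The scan reproduces the diminishing-return picture: below the optimum, extra positional information costs more (in nats) than the extra work it buys.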
Interestingly, DNA recognition by transcription factors is also optimal around the regime of ~1 bit per base pair^{29}. In this case, the efficiency measures how much information about the sequence can be extracted from one unit of binding energy. Thus, DNA recognition transforms energy to information most efficiently at a regime similar to that of a colloidal engine that transforms information to energy most efficiently. In both cases, the exchange between energy and information exhibits diminishing return beyond ~1 bit of information, suggesting this region as a generic optimality regime in stochastic energyinformation systems.
Test of integral fluctuation theorem
We also test, experimentally and theoretically, the generalized integral fluctuation theorem, 〈e^{–β(W–ΔF)–ΔI}〉 = 1, which is valid for a system under measurement and feedback control whose initial and final states are in equilibrium^{6,16}, and check how far the average deviates from unity for our cyclic information engine with nonequilibrium initial and final states. The value of the average 〈e^{–β(W–ΔF)–ΔI}〉 for the current feedback protocol, where ΔF = 0, is (See Supplementary Note 3)

\(\left\langle {e^{ - \beta W - \Delta I}} \right\rangle = \left[ {1 + \left( {S^ \ast + N} \right)\left( {S - S^ \ast } \right)/S^2} \right]^{ - 1/2}\)  (6)
which becomes unity only when S^{*} = S. This condition is achieved when the engine reaches equilibrium either by relaxing for long periods, τ/τ_{R} → ∞ (period-driven equilibrium), or when it mimics the equilibrium Boltzmann distribution by tuning the noise to the signal, N = S (noise-driven equilibrium). Experimentally (Fig. 4b), we find that 〈e^{–βW–ΔI}〉 = 1 regardless of error size for τ = 20 ms (black circles), for which the system is fully relaxed at the end of each cycle. For finite τ, 〈e^{–βW–ΔI}〉 deviates from unity, even near τ_{R}, for which the system reaches near equilibrium (blue circles). Furthermore, 〈e^{–βW–ΔI}〉 is found to be always less (greater) than one in the cooling (heating) region of the engine. The experimental test of a more general fluctuation theorem for the total entropy production, valid for arbitrary initial and final states^{30}, \(\left\langle {e^{ - \Delta S_{{\mathrm{tot}}} - \Delta I}} \right\rangle = 1\), would require the direct measurement of the system entropy change, heat dissipation, and mutual information along individual trajectories, which is beyond the scope of the present work.
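The three behaviors above (unity at N = S, below unity in the cooling region, above unity in the heating region) can be reproduced by direct two-dimensional quadrature of 〈e^{−βW−ΔI}〉 over the steady state. The sketch below is our own numerical construction in reduced units (S = 1, τ_R = 1); the function names are hypothetical:

```python
import math

def s_star(noise, tau):
    """Steady-state variance, Eq. (1), reduced units S = 1, tau_R = 1."""
    return 1.0 + (noise - 1.0) * math.exp(-2.0 * tau)

def ift_average(noise, tau, half_width=12.0, steps=480):
    """Midpoint quadrature of <exp(-beta W - Delta I)> over the joint
    steady-state density of position x and channel error eps."""
    ss = s_star(noise, tau)
    c = ss / (ss + noise)          # posterior shrinkage of the Gaussian channel
    v = c * noise                  # posterior variance
    log_const = 0.5 * math.log(ss / v)
    norm = 1.0 / (2.0 * math.pi * math.sqrt(ss * noise))
    h = 2.0 * half_width / steps
    total = 0.0
    for i in range(steps):
        x = -half_width + (i + 0.5) * h
        for j in range(steps):
            eps = -half_width + (j + 0.5) * h   # channel error, y = x + eps
            y = x + eps
            w = 0.5 * (eps * eps - x * x)       # beta W of the trap shift
            info = (x * x / (2.0 * ss)          # I = ln p(x|y) - ln p(x)
                    - (x - c * y) ** 2 / (2.0 * v) + log_const)
            # combine density and exponent into a single exp for stability
            expo = -x * x / (2.0 * ss) - eps * eps / (2.0 * noise) - w - info
            total += norm * math.exp(expo) * h * h
    return total

# Noise-driven equilibrium (N = S): the generalized IFT holds, average = 1
assert abs(ift_average(1.0, 0.7) - 1.0) < 2e-3
# Refrigerator region (N < S): average < 1; heater region (N > S): average > 1
assert ift_average(0.5, 1.0) < 1.0 - 1e-3
assert ift_average(2.0, 1.0) > 1.0 + 1e-3
```

The quadrature uses only the Gaussian steady state, the Gaussian channel, and the shift work; no fitting parameters enter.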
Efficiency fluctuations
Our measurement shows that the average efficiency \(\bar \eta \equiv \left\langle { - \beta W} \right\rangle /\left\langle I \right\rangle\) is maximal for a finite error level and a long cycle period. However, this maximal efficiency is practically useless, owing to the vanishing average power P ≡ \(\left\langle { - \beta W} \right\rangle\)/τ → 0 in this limit. On the other hand, thermal fluctuations and fluctuations in the signal received by the detector strongly affect the operation of these microscopic engines. For example, we can show from Eq. (2) that for N/S = 0 and large τ, the average extracted work is maximal, \(\left\langle { - \beta W} \right\rangle\) ≈ 0.5; however, it exhibits a large variance, std(–βW) ≈ 0.7, implying that the work obtained in individual realizations fluctuates violently around the mean. As a result, the average values alone are not sufficient for understanding and designing information engines, and one must take into account the fluctuations in thermodynamic quantities such as work, heat, and information. As is typical of fluctuating systems, the most probable efficiency, at the peak of the distribution, is more informative than the average. For small systems like ours, we find that the average and the most probable values have quite distinct physical behavior.
Recent studies demonstrated that, due to the fluctuations in work and heat, the efficiency of a stochastic heat engine driven by a nonequilibrium protocol is not bounded and often exceeds the limit of Carnot efficiency^{31,32,33,34,35}. Here, we study the stochastic efficiency η = –βW/I of an information engine arising from the fluctuations in work and mutual information (in the N ≤ S regime). Figure 5a exhibits double peaks in the distribution of efficiency p(η) for a smaller noise level (orange curve) at τ = 3 ms, which coalesce into a single peak (olive in Fig. 5b) at the noise level for which \(\bar \eta\) is maximal (N/S ≈ 0.32). For higher noise levels, p(η) broadens (wine in Fig. 5b inset) and its peak shifts towards η = 1. Similar behavior is observed for τ = 0.5 and 20 ms, except that the bimodal-to-unimodal transition occurs at smaller N/S values in engines with shorter periods.
The double peaks of the efficiency distribution p(η) stem from the interplay between the distributions of extracted work p(–βW) and mutual information p(I), which also bifurcate with decreasing error level (Supplementary Fig. 1). At low noise levels, the peaks of p(–βW) and p(I) are well separated, giving rise to double peaks. At higher error levels, they approach each other, owing to the sharp decrease in I, and eventually coalesce. In particular, the observed peaks result mainly from the negative values of βW for which I is positive. There is also a contribution from positive βW for which I is negative (red bars). For larger error levels, the distribution of βW spreads and broadens in the positive direction, and its peak shifts toward η = 1, resulting in a maximal average efficiency at a finite error level. This contribution of βW > 0 (the heater regime) to positive values of efficiency could not be predicted from the average values alone.
Figure 5c shows that the efficiency distribution exhibits a single peak near the origin for τ ≈ 0.1 ms, which bifurcates into a bimodal distribution for finite τ, while the second peak shifts towards η = 1 for τ ≥ τ_{R}. The origin of the double peaks in our system appears consistent with recent theoretical work on stochastic heat engines^{36}. However, it is noteworthy that we do not observe a minimum near η = 1; the observed peaks in our system arise mainly from βW < 0 and I > 0. The derivation of an exact form of p(η) in cyclic information engines, and in particular testing whether it asymptotically approaches the universal scaling p(η → ±∞) ~ η^{–2} ^{36,37}, should be an interesting future work. In a two-temperature heat engine, the efficiency distribution may exhibit peaks in the negative regime^{38}, whereas in our single-temperature information engine the peaks are always in the positive regime (Fig. 5c), suggesting that the information engine is capable of extracting positive work in most cycles. Interestingly, the ensemble-averaged efficiency 〈η〉 ≡ 〈–βW/I〉 has a global maximum near N/S ≈ 0.32 (Fig. 5d), for which the average efficiency \(\bar \eta \equiv \left\langle { - \beta W} \right\rangle /\left\langle I \right\rangle\) is also maximal, though their values differ.
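The qualitative features of the stochastic efficiency (values exceeding the second-law bound η = 1, and occasional negative values when the feedback misfires) can be seen in a per-cycle Monte Carlo sketch of η = −βW/I, in reduced units (S = 1, τ_R = 1) and with hypothetical names:

```python
import math, random

def sample_efficiencies(noise, tau, n_cycles=100_000, seed=3):
    """Per-cycle stochastic efficiency eta = -beta W / I of the cyclic
    engine in reduced units (S = 1, tau_R = 1); a Monte Carlo sketch."""
    rng = random.Random(seed)
    decay = math.exp(-tau)
    kick = math.sqrt(1.0 - decay * decay)
    ss = 1.0 + (noise - 1.0) * decay * decay   # steady-state variance S*
    c = ss / (ss + noise)                      # posterior shrinkage
    v = c * noise                              # posterior variance
    etas, x = [], 0.0
    for _ in range(n_cycles):
        err = rng.gauss(0.0, math.sqrt(noise))
        y = x + err
        w = 0.5 * (err * err - x * x)          # beta W of the trap shift
        info = (x * x / (2.0 * ss)             # stochastic mutual information
                - (x - c * y) ** 2 / (2.0 * v) + 0.5 * math.log(ss / v))
        etas.append(-w / info)
        x = -err                               # reset relative to the new trap center
        x = x * decay + kick * rng.gauss(0.0, 1.0)  # relax for tau
    return etas

etas = sample_efficiencies(0.32, 1.0)
# the stochastic efficiency often exceeds the second-law bound eta = 1 ...
assert sum(1 for e in etas if e > 1.0) > 0
# ... and is sometimes negative, when work and information have opposite signs
assert sum(1 for e in etas if e < 0.0) > 0
```

Histogramming the returned samples reproduces the broad, occasionally bimodal shape discussed above; the asserts only check the two sign features that hold robustly.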
Discussion
In conclusion, we examined the Maxwell-meets-Shannon problem by studying a mutual information-fueled Brownian engine. Information engines are a special class of feedback control systems that are capable of extracting work from thermal fluctuations. Here, we incorporated the effect of feedback into the formalism of stochastic thermodynamics to realize stochastic engines operating in isothermal conditions. Our analysis shows that the engine extracts work from information about the microstate of the system without affecting the energy balance, but only the entropy balance. Thus, such a feedback system affects the balance of the second law of thermodynamics but not that of the first law.
By directly controlling and measuring the mutual information passing through the noisy detector used by the engine, we fully characterized the information-energy interplay of a noisy Gaussian engine over a wide variety of nonequilibrium steady states, both in experiment and in theory. The present laws of stochastic thermodynamics cannot predict the fluctuations in the thermodynamic observables: extracted work, information, and efficiency. Nevertheless, there are recent attempts to link the fluctuations with the average dissipation by means of thermodynamic uncertainty relations^{39,40,41,42,43,44}.
Unlike previously reported two-bath stochastic engines^{45,46}, the distinctive feature of the present information engine is that one can use the noise to either heat or cool a system immersed in a single-temperature bath. We obtain a refrigerator if the noise level is smaller than the signal level, and a heater otherwise. In the refrigerator (N < S), the system is cooled immediately after the feedback, thereby inducing a temperature difference, and net heat flows from the reservoir back to the system during relaxation, with an average efficiency \(\bar \eta = \left\langle {\beta Q} \right\rangle /\left\langle I \right\rangle\). In contrast, without feedback, the net heat flow is zero, while in heaters (N > S) heat flows from the system to the reservoir. The heater and refrigerator regions in the dynamic phase diagram are separated by an anomalous, noise-driven equilibrium state along the N = S line, where all thermodynamic variables and their fluctuations switch their behavior.
We find that the most efficient engines utilize merely about 0.5–1 bits of positional information per cycle. A universal feature of our information engine, irrespective of cycle period, is the transition of the efficiency-fluctuation distribution from bimodal to unimodal. Moreover, information engines with slower cycles and finite error are occasionally capable of extracting work beyond the bound that the generalized second law sets for engines starting from equilibrium states.
The output power at maximum efficiency of our information engine near the quasistatic regime, τ ~ τ_{R}, is comparable to the power of molecular motors, but about an order of magnitude larger than the maximal power generated by a recently reported two-temperature Brownian engine^{46}. Almost all biological motors operate in noisy environments and exchange energy and information with a single-temperature bath, and hence cannot be understood on the basis of two-temperature heat engines. Our study of single-temperature information engines can shed light on the underlying operating principles of these biological motors.
The generalized integral fluctuation theorem was found to be valid only when the system is fully relaxed at the beginning of each cycle or at the noise-driven equilibrium, in which the noise level equals that of the signal. For an arbitrary nonequilibrium steady state, the average is less (greater) than unity in the refrigerator (heater) region. In the future, it would be interesting to test in our feedback system the validity of more general fluctuation theorems related to entropy production^{30}. This study can be useful in the design and understanding of efficient synthetic submicron devices, as well as biological micron-scale systems, in which fluctuations of the system and the detector are inevitable.
Methods
Experimental
The basic experimental setup of the colloidal information engine is shown in Supplementary Fig. 2. A 1064-nm laser is used for trapping the particle. The laser is fed to an acousto-optic deflector (AOD) via an isolator and a beam expander. The AOD is controlled via an analog voltage-controlled radio-frequency (RF) synthesizer driver. The AOD is mounted at the back focal plane of the objective lens so that k is essentially constant while shifting the potential center. A second laser, of 980 nm wavelength, is used for tracking the particle position. A quadrant photodiode (QPD) is used to detect the particle position. The electrical signal from the QPD is preamplified by a signal amplifier and sampled every τ with a field-programmable gate array (FPGA) data-acquisition card. The sample cell consists of a highly dilute solution of 2.0-μm-diameter polystyrene particles suspended in deionized water. All experiments were carried out at 293 ± 0.1 K. The parameters of the trap were calibrated by fitting the probability distribution of the particle position in thermal equilibrium, without a feedback process, to the Boltzmann distribution, a Gaussian of variance S ≈ (27.4 nm)^{2}. The trap stiffness is k = k_{B}T/S ≈ 5.4 pN μm^{−1} and the characteristic relaxation time is τ_{R} = γ/k ≈ 3.5 ms^{19}. The particle position measurement is nearly error-free, with a resolution of 1 nm, and the potential center is shifted practically instantaneously, within 20 μs. Each engine cycle of period τ includes three phases: measurement of the particle position, shift of the potential center, and relaxation (Fig. 1a). After the position x_{i} of the particle, relative to the trap center λ_{i}, is measured precisely, it is distorted with random Gaussian noise of variance N to obtain the demon-measured value y_{i}. The potential center is then shifted to y_{i}, and the particle relaxes for duration τ before the next cycle begins.
In the subsequent (i + 1)th cycle, the particle position x_{i+1} is measured with respect to the shifted potential center λ_{i} (the origin is reset) and the same feedback protocol is repeated. Since the origin is reset, the process does not depend on any previous measurements, even when the cycle period is shorter than the relaxation time.
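Because each reset erases the memory of previous cycles, the protocol above is straightforward to simulate directly. A sketch in reduced units (S = 1, τ_R = 1), with the hypothetical name `cycle_start_variance`; its output should settle to the steady-state variance S* = S + (N − S)e^{−2τ/τ_R} quoted in the Results:

```python
import math, random

def cycle_start_variance(noise, tau, n_cycles=100_000, seed=11):
    """Simulate the measure-shift-relax protocol and return the sample
    variance of the particle position at the start of each cycle
    (reduced units S = 1, tau_R = 1)."""
    rng = random.Random(seed)
    decay = math.exp(-tau)
    kick = math.sqrt(1.0 - decay * decay)
    x, xs = 0.0, []
    for _ in range(n_cycles):
        xs.append(x)                            # position when the cycle begins
        err = rng.gauss(0.0, math.sqrt(noise))  # demon-measured value y = x + err
        x = -err                                # shift trap to y: origin is reset
        x = x * decay + kick * rng.gauss(0.0, 1.0)  # relax for tau
    m = sum(xs) / len(xs)
    return sum((u - m) ** 2 for u in xs) / len(xs)

# The reset makes successive cycle-start positions independent, so the
# sample variance converges quickly to the steady-state value
for noise, tau in [(0.25, 0.5), (2.0, 0.5)]:
    expected = 1.0 + (noise - 1.0) * math.exp(-2.0 * tau)
    assert abs(cycle_start_variance(noise, tau) - expected) < 0.05 * expected
```

The loop mirrors the three experimental phases exactly: record, distort, shift, relax; only the thermal and channel noises are drawn from a pseudorandom generator instead of the bath and detector.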
Data availability
All data associated with this study are available from the corresponding author upon reasonable request.
References
 1.
Leff, H. S. & Rex, A. F. Maxwell’s Demon 2: Entropy, Classical and Quantum Information, Computing. (Princeton University Press, New Jersey, 2003).
 2.
Szilard, L. über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Z. Phys. 53, 840–856 (1929).
 3.
Brillouin, L. Maxwell’s Demon cannot operate: information and entropy. I. J. Appl. Phys. 22, 334–337 (1951).
 4.
Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 5, 183–191 (1961).
 5.
Bennett, C. H. The thermodynamics of computation—a review. Int. J. Theor. Phys. 21, 905–940 (1982).
 6.
Sagawa, T. & Ueda, M. Generalized Jarzynski equality under nonequilibrium feedback control. Phys. Rev. Lett. 104, 090602 (2010).
 7.
Shannon, C. E. A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423 (1948).
 8.
Cover, T. M. & Thomas, J. A. Elements of Information Theory. 2nd Edition (Wiley, Hoboken, 2006).
 9.
Abreu, D. & Seifert, U. Extracting work from a single heat bath through feedback. Europhys. Lett. 94, 10001 (2011).
 10.
Bauer, M., Abreu, D. & Seifert, U. Efficiency of a Brownian information machine. J. Phys. A Math. Theor. 45, 162001 (2012).
 11.
Bérut, A. et al. Experimental verification of Landauer’s principle linking information and thermodynamics. Nature 483, 187 (2012).
 12.
Selimkhanov, J. et al. Accurate information transmission through dynamic biochemical signaling networks. Science 346, 1370–1373 (2014).
 13.
Bialek, W. S. Biophysics: Searching For Principles. (Princeton University Press, 2012).
 14.
Ito, S. & Sagawa, T. Maxwell’s demon in biochemical signal transduction with feedback loop. Nat. Commun. 6, 7498 (2015).
 15.
Barato, A. C., Hartich, D. & Seifert, U. Efficiency of cellular information processing. New J. Phys. 16, 103024 (2014).
 16.
Parrondo, J. M. R., Horowitz, J. M. & Sagawa, T. Thermodynamics of information. Nat. Phys. 11, 131 (2015).
 17.
Koski, J. V., Maisi, V. F., Sagawa, T. & Pekola, J. P. Experimental observation of the role of mutual information in the nonequilibrium dynamics of a Maxwell demon. Phys. Rev. Lett. 113, 030601 (2014).
 18.
Masuyama, Y. et al. Information-to-work conversion by Maxwell’s demon in a superconducting circuit quantum electrodynamical system. Nat. Commun. 9, 1291 (2018).
 19.
Paneru, G., Lee, D. Y., Tlusty, T. & Pak, H. K. Lossless Brownian information engine. Phys. Rev. Lett. 120, 020601 (2018).
 20.
Toyabe, S., Sagawa, T., Ueda, M., Muneyuki, E. & Sano, M. Experimental demonstration of informationtoenergy conversion and validation of the generalized Jarzynski equality. Nat. Phys. 6, 988–992 (2010).
 21.
Paneru, G. et al. Optimal tuning of a Brownian information engine operating in a nonequilibrium steady state. Phys. Rev. E 98, 052119 (2018).
 22.
Gavrilov, M., Chétrite, R. & Bechhoefer, J. Direct measurement of weakly nonequilibrium system entropy is consistent with Gibbs–Shannon form. Proc. Natl Acad. Sci. USA 114, 11097–11102 (2017).
 23.
Seifert, U. Stochastic thermodynamics, fluctuation theorems and molecular machines. Rep. Prog. Phys. 75, 126001 (2012).
 24.
Fuchs, J., Goldt, S. & Seifert, U. Stochastic thermodynamics of resetting. Europhys. Lett. 113, 60009 (2016).
 25.
Evans, M. R. & Majumdar, S. N. Diffusion with Stochastic Resetting. Phys. Rev. Lett. 106, 160601 (2011).
 26.
Murugan, A., Huse, D. A. & Leibler, S. Discriminatory proofreading regimes in nonequilibrium systems. Phys. Rev. X 4, 021016 (2014).
 27.
Piñeros, W. D. & Tlusty, T. Kinetic Proofreading and the limits of thermodynamic uncertainty. Preprint at https://journals.aps.org/pre/accepted/1307bR0cYa41d017a73887a0e3881c7a30fa61887 (2019).
 28.
Wang, M. C. & Uhlenbeck, G. E. On the theory of the Brownian Motion II. Rev. Mod. Phys. 17, 323–342 (1945).
 29.
Savir, Y., Kagan, J. & Tlusty, T. Binding of transcription factors adapts to resolve information-energy trade-off. J. Stat. Phys. 162, 1383–1394 (2016).
 30.
Sagawa, T. & Ueda, M. Nonequilibrium thermodynamics of feedback control. Phys. Rev. E 85, 021104 (2012).
 31.
Verley, G., Esposito, M., Willaert, T. & Van den Broeck, C. The unlikely Carnot efficiency. Nat. Commun. 5, 4721 (2014).
 32.
Proesmans, K. & Van den Broeck, C. Stochastic efficiency: five case studies. New J. Phys. 17, 065004 (2015).
 33.
Verley, G., Willaert, T., Van den Broeck, C. & Esposito, M. Universal theory of efficiency fluctuations. Phys. Rev. E 90, 052145 (2014).
 34.
Park, J.-M., Chun, H.-M. & Noh, J. D. Efficiency at maximum power and efficiency fluctuations in a linear Brownian heat-engine model. Phys. Rev. E 94, 012127 (2016).
 35.
Manikandan, S. K., Dabelow, L., Eichhorn, R. & Krishnamurthy, S. Efficiency fluctuations in microscopic machines. Phys. Rev. Lett. 122, 140601 (2019).
 36.
Polettini, M., Verley, G. & Esposito, M. Efficiency statistics at all times: Carnot limit at finite power. Phys. Rev. Lett. 114, 050601 (2015).
 37.
Proesmans, K., Dreher, Y., Gavrilov, M., Bechhoefer, J. & Van den Broeck, C. Brownian duet: a novel tale of thermodynamic efficiency. Phys. Rev. X 6, 041010 (2016).
 38.
Rana, S., Pal, P. S., Saha, A. & Jayannavar, A. M. Singleparticle stochastic heat engine. Phys. Rev. E 90, 042146 (2014).
 39.
Barato, A. C. & Seifert, U. Thermodynamic Uncertainty Relation for Biomolecular Processes. Phys. Rev. Lett. 114, 158101 (2015).
 40.
Gingrich, T. R., Horowitz, J. M., Perunov, N. & England, J. L. Dissipation bounds all steadystate current fluctuations. Phys. Rev. Lett. 116, 120601 (2016).
 41.
Seifert, U. Stochastic thermodynamics: from principles to the cost of precision. Phys. A: Stat. Mech. Appl. 504, 176–191 (2018).
 42.
Barato, A. C. & Seifert, U. Coherence of biochemical oscillations is bounded by driving force and network topology. Phys. Rev. E 95, 062409 (2017).
 43.
Horowitz, J. M. & Gingrich, T. R. Thermodynamic uncertainty relations constrain nonequilibrium fluctuations. Nat. Phys. https://doi.org/10.1038/s41567-019-0702-6 (2019).
 44.
Paneru, G., Dutta, S., Tlusty, T. & Pak, H. K. Approaching and violating thermodynamic uncertainty bounds in measurements of fluctuation-dissipation trade-offs of information engines. Preprint at https://arxiv.org/abs/1911.09835 (2019).
 45.
Blickle, V. & Bechinger, C. Realization of a micrometresized stochastic heat engine. Nat. Phys. 8, 143–146 (2012).
 46.
Martínez, I. A. et al. Brownian Carnot engine. Nat. Phys. 12, 67 (2015).
Acknowledgements
This work was supported by the taxpayers of South Korea through the Institute for Basic Science, project code IBS-R020-D1. We thank Dr. Jae Sung Lee and Prof. Albert Libchaber for insightful discussions.
Author information
Affiliations
Contributions
G.P. and H.K.P. designed the research. G.P. performed the experiment. G.P. and S.D. analyzed the data. S.D., T.S., and T.T. contributed in theory. T.T. and H.K.P. supervised the research. All authors discussed the results and implications and wrote the paper.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Peer review information Nature Communications thanks the anonymous reviewer(s) for their contribution to the peer review of this work. Peer reviewer reports are available.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Paneru, G., Dutta, S., Sagawa, T. et al. Efficiency fluctuations and noise-induced refrigerator-to-heater transition in information engines. Nat. Commun. 11, 1012 (2020). https://doi.org/10.1038/s41467-020-14823-x