Nature | News Feature

The new thermodynamics: how quantum physics is bending the rules

Experiments are starting to probe the limits of the classical laws of thermodynamics.


Illustration by Edgar Bąk

It would take a foolhardy physicist to dare attempt to break the laws of thermodynamics. But it turns out that there may be ways to bend them. At a lab at the University of Oxford, UK, quantum physicists are trying to do so with a small lump of synthetic diamond. At first, the diamond is barely visible, nestled inside a chaotic mess of optical fibres and mirrors. But when they switch on a green laser, defects in the diamond are illuminated, and the crystal begins to glow red.

In that light, the team has found preliminary evidence of an effect that was theorized only a few years ago1: a quantum boost that would push the diamond's power output above the level prescribed by classical thermodynamics. If the results hold up, they will be a tangible boon for the study of quantum thermodynamics, a relatively new field that aims to uncover the rules that govern heat and energy flow at the atomic scale.

There is reason to suspect that the laws of thermodynamics, which are based on how large numbers of particles behave, are different in the quantum realm. Over the past five years or so, a quantum-thermodynamics community has grown around that idea. What was once the domain of a handful of theoreticians now includes a few hundred theoretical and experimental physicists around the globe. “The field is moving so fast I can barely keep up,” says Ronnie Kosloff, an early pioneer of the field at the Hebrew University of Jerusalem in Israel.

A number of quantum thermodynamicists hope to find behaviour outside the remit of conventional thermodynamics that could be adapted for practical purposes, including improving lab-based refrigeration techniques, creating batteries with enhanced capabilities and refining technology for quantum computing.

But the field is still in its infancy. Experiments such as the one taking place at Oxford are just starting to put theoretical predictions to the test. And physicists working at the periphery are watching such tests closely for evidence of the useful applications that theorists have predicted. “Quantum thermodynamics is clearly hot — pardon the pun,” says Ronald Walsworth, a physicist at Harvard University in Cambridge, Massachusetts, who specializes in developing precision atomic-scale tools. “But for those of us looking in from the outside, the question is: can it really shed new light on the development of technologies?”

Breaking the law

The development of the classical laws of thermodynamics stretches back to the nineteenth century. They emerged from the effort to understand steam engines and other macroscopic systems. Thermodynamic quantities such as temperature and heat are statistical in nature and defined in reference to the average motion of large ensembles of particles. But back in the 1980s, Kosloff began pondering whether this picture would continue to make sense for much smaller systems.

It wasn't a popular line of research at the time, says Kosloff, because the questions being asked were largely abstract, with little hope of connection to experiments. “The field developed very slowly,” he says. “I was alone for years.”

Jonas Becker

An experimental set-up at the University of Oxford is used to investigate whether quantum effects can enhance the power output of a diamond.

That changed dramatically around a decade ago, as questions about the limits of technological miniaturization became more pressing and experimental techniques advanced. A flurry of attempts were made to calculate how thermodynamics and quantum theory might combine. But the resulting proposals created more confusion than clarity, Kosloff says. Some claimed that quantum devices could violate classical thermodynamic constraints with impunity and so act as perpetual-motion machines, capable of performing work without needing any energy input. Others, suggesting that the laws of thermodynamics should hold unmodified at very small scales, were equally perplexing. “In some sense, you can use the same equations to work out the performance of a single atom engine and your car engine,” says Kosloff. “But that seems shocking, too — surely as you get smaller and smaller you should hit some quantum limit.” In classical thermodynamics, a single particle doesn't have a temperature. So as both the system generating work and its environment approach that limit, it becomes increasingly absurd to imagine that they would obey standard thermodynamic rules, says Tobias Schaetz, a quantum physicist at the University of Freiburg in Germany.

The preponderance of conflicting theoretical claims and predictions initially undermined the burgeoning field's credibility. “I have been very critical of the field because there is far too much theory and not enough experiment,” says quantum physicist Peter Hänggi, at the University of Augsburg in Germany. But the community is beginning to coalesce more formally around core questions in an effort to cut through the chaos. One goal has been to use experiments to uncover the point at which the classical laws of thermodynamics no longer perfectly predict the thermal behaviour of quantum systems.

Experiments are starting to pin down that quantum–classical boundary. Last year, for example, Schaetz and his colleagues showed that, under certain conditions, strings of five or fewer magnesium ions in a crystal do not reach and remain in thermal equilibrium with their surroundings like larger systems do2. In their test, each ion started in a high-energy state and its spin oscillated between two states corresponding to the direction of its magnetism — 'up' and 'down'. Standard thermodynamics predicts that such spin oscillations should die down as the ions cool by interacting with the other atoms in the crystal around them, just as hot coffee cools when its molecules collide with molecules in the colder surrounding air.

Such collisions transfer energy from the coffee molecules to the air molecules. A similar cooling mechanism is at play in the crystal, where quantized vibrations in the lattice called phonons carry heat away from the oscillating spins. Schaetz and his colleagues found that their small ion systems did stop oscillating, suggesting that they had cooled. But after a few milliseconds, the ions began oscillating vigorously again. This resurgence has a quantum origin, says Schaetz. Rather than dissipating away entirely, the phonons rebounded at the edges of the crystal and returned, in phase, to their source ions, reinstating the original spin oscillations.
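The collapse-and-revival behaviour Schaetz's team observed in a finite phonon bath can be illustrated with a toy dephasing model (a minimal sketch; the mode count, frequency spacing and time units below are invented for illustration, not the experiment's parameters). Averaging a spin's oscillation over a finite set of discrete bath frequencies produces an amplitude that first collapses, as if the spin had thermalized, and then revives once the modes come back into phase:

```python
import numpy as np

def spin_signal(n_modes, t):
    """Toy dephasing model: the spin's oscillation amplitude is the
    average of n_modes cosines with evenly spaced frequencies.
    A finite bath dephases (apparent cooling) but rephases at the
    recurrence time 2*pi/dw, reviving the oscillation."""
    dw = 0.1  # frequency spacing, arbitrary units (hypothetical)
    freqs = 1.0 + dw * np.arange(n_modes)
    return np.mean(np.cos(np.outer(t, freqs)), axis=1)

t = np.linspace(0, 80, 4001)
s = spin_signal(50, t)

# Apparent "thermalization": the amplitude collapses at intermediate times...
mid = np.max(np.abs(s[(t > 20) & (t < 40)]))
# ...but revives near the recurrence time 2*pi/0.1 ~ 63.
rev = np.max(np.abs(s[(t > 55) & (t < 70)]))
print(mid, rev)  # the revival peak is far larger than the collapsed level
```

In a genuinely infinite bath the recurrence time diverges and the decay looks permanent; the revival is purely a finite-size effect, which is why it bites only at the scale of a few atoms.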

Schaetz says that his experiment sends a warning to engineers attempting to reduce the size of existing electronics. “You may have a wire that is only 10 or 15 atoms wide, and you may think that it has successfully carried the heat away from your chip, but then boop — suddenly this quantum revival happens,” Schaetz says. “It is very disturbing.”

Rebounding phonons could present a challenge in some applications, but other quantum phenomena could turn out to be useful. Efforts to identify such phenomena had been stalled by the difficulty in defining basic quantities, such as heat and temperature, in quantum systems. But the solution to a famous thought experiment, laid out 150 years ago by Scottish physicist James Clerk Maxwell, provided a clue about where to turn, posing an intriguing link between information and energy. Maxwell imagined an entity that could sort slow- and fast-moving molecules, creating a temperature difference between two chambers simply by opening and closing a door between them.

Such a 'demon', as it was later called, thus generates a hot and a cold chamber that can be harnessed to produce useful energy. The problem is that by sorting particles in this way, the demon reduces the system's entropy — a measure of the disorder of the particles' arrangements — without having done any work on the particles themselves. This seemingly violates the second law of thermodynamics.

But physicists eventually realized that the demon would pay a thermodynamic price to process the information about the molecules' speeds. It would need to store, erase and rewrite that information in its brain. That process consumes energy and creates an overall increase in entropy3. Information was once thought to be immaterial, “but Maxwell's demon shows that it can have objective physical consequences”, says quantum physicist Arnau Riera, at the Institute of Photonic Sciences in Barcelona, Spain.
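The price the demon pays is quantified by Landauer's principle: erasing one bit of information at temperature T dissipates at least k_B T ln 2 of heat. A quick back-of-envelope check (the function name is mine, for illustration only):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the SI)

def landauer_cost(n_bits, temperature):
    """Minimum heat dissipated when erasing n_bits of information
    at the given temperature: k_B * T * ln(2) per bit (Landauer bound)."""
    return n_bits * k_B * temperature * math.log(2)

# Erasing a single bit at room temperature (300 K):
q = landauer_cost(1, 300.0)
print(q)  # roughly 3e-21 joules
```

The number is minuscule, which is why the demon's bookkeeping cost went unnoticed for a century, but it is exactly what restores the second law once the demon's memory is included in the accounting.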

Finding the limit

Inspired by the idea that information is a physical quantity — and that it is intimately linked to thermodynamics — researchers have attempted to recast the laws of thermodynamics so that they work in the quantum regime.

Perpetual-motion machines may be impossible. But an early hope was that limits prescribed by quantum thermodynamics might be less stringent than those that hold in the classical realm. “This was the train of thought we had learned from quantum computing — that quantum effects help you beat classical bounds,” says Raam Uzdin, a quantum physicist at the Technion–Israel Institute of Technology in Haifa.

Disappointingly, Uzdin says, this is not the case. Recent analyses suggest that quantum versions of the second law, which governs efficiency, and the third law, which prohibits systems from reaching absolute zero, retain similar and, in some cases, more-stringent constraints than their classical incarnations.

Some differences arise because the macroscopic thermodynamic quantity 'free energy'— the energy a system has available to do work — doesn't have just one counterpart at the microscale, but many, says Jonathan Oppenheim, a quantum physicist at University College London. Classically, the free energy is calculated by assuming that all states of the system, determined by the arrangement of particles at a given energy, are equally likely. But that assumption isn't true on tiny scales, says Oppenheim; certain states might be much more probable than others. To account for this, additional free energies need to be defined in order to accurately describe the system and how it will evolve. Oppenheim and his colleagues propose that individual second laws exist for each type of free energy, and that quantum devices must obey all of them4. “Since the second law tells you what you aren't allowed to do, in some ways, it seems that having more laws on the microscale leaves you worse off,” says Oppenheim.
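The family of second laws that Oppenheim and colleagues describe can be phrased in terms of Rényi divergences between a state's occupation probabilities and the thermal (Gibbs) distribution: one generalized free energy per order alpha, each of which must not increase in an allowed transition. A minimal classical sketch, using a hypothetical two-level system with made-up occupations (the function and numbers are illustrative, not taken from the paper):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Classical Rényi divergence D_alpha(p || q), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(alpha, 1.0):  # Kullback-Leibler limit
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1))

# Hypothetical two-level system with energies 0 and 1 (units of k_B*T)
energies = np.array([0.0, 1.0])
gibbs = np.exp(-energies) / np.exp(-energies).sum()  # thermal distribution

state = np.array([0.9, 0.1])  # an out-of-equilibrium occupation
# Each alpha defines a generalized free energy (up to constants,
# k_B*T times the divergence). A thermodynamically allowed transition
# must decrease all of them, not just the standard alpha = 1 case.
for alpha in (0.5, 1.0, 2.0):
    print(alpha, renyi_divergence(state, gibbs, alpha))
```

All the divergences vanish only at the thermal state itself, which is why, in this framework, every one of the extra second laws becomes an independent constraint on what a tiny machine can do.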


Much of the work done to calculate equivalents of the second and third laws remains, for now, theoretical. But proponents argue that it can help to illuminate how thermodynamic bounds are physically enforced at small scales. For instance, a theoretical analysis carried out by a pair of quantum physicists based in Argentina showed that as a quantum refrigerator nears absolute zero, photons will spontaneously appear in the vicinity of the device5. “This dumps energy into the surroundings, causing a heating effect that counters the cooling and stops you ever reaching absolute zero,” explains team member Nahuel Freitas of the University of Buenos Aires.

Theory has also revealed some potential wiggle room. In a theoretical analysis examining information flow between hot and cold chambers, or 'baths', of particles, a team based in Barcelona that included Riera and quantum physicist Manabendra Nath Bera discovered a strange scenario in which the hot bath seemed to spontaneously get hotter, while the cold bath became colder6. “At first, this looks crazy, like we can violate thermodynamics,” says Bera. But the researchers soon realized that they had overlooked the quantum twist: the particles in the baths can become entangled. In theory, making and breaking these correlations provides a way to store and release energy. Once this quantum resource was budgeted for, the laws of thermodynamics fell into place.

A number of independent groups have proposed using such entanglement to store energy in a 'quantum battery', and a group at the Italian Institute of Technology in Genoa is attempting to confirm the Barcelona team's predictions with batteries built from superconducting quantum bits, or 'qubits'7. In principle, such quantum batteries could charge considerably faster than their classical equivalents. “You won't be able to extract and store more energy than the classical bound allows — that's set by the second law,” says Riera. “But you may be able to speed things up.”

Some researchers are looking for easier ways to manipulate qubits for quantum-computing applications. Quantum physicist Nayeli Azucena Rodríguez Briones at the University of Waterloo in Canada and her colleagues have devised8 an operation that might enhance the cooling needed for quantum-computing operations by manipulating pairs of qubit energy levels. They are currently planning to test this idea in the lab using superconducting qubits.

A small spark

The concept that quantum effects could be exploited to improve thermodynamic performance also inspired the diamond experiment under way at Oxford, which was first proposed by Kosloff, Uzdin and Amikam Levy, also at the Hebrew University1. Defects created by nitrogen atoms scattered through the diamond can serve as an engine — a machine that performs an operation after being brought into contact with first a hot reservoir (in this case a laser) and then a cold one. But Kosloff and his colleagues expect that such an engine can be operated in an enhanced mode, by exploiting a quantum effect that enables some of the electrons to exist in two energy states simultaneously. Maintaining these superpositions by pulsing the laser light rather than using a continuous beam should enable the crystal to emit microwave photons more rapidly than it otherwise would (see 'Building a quantum heat engine').

Last week, the Oxford-based team posted a preliminary analysis9 showing evidence of the predicted quantum boost. The paper has yet to be peer reviewed, but if the work holds up, then “it is a groundbreaking result,” says Janet Anders, a quantum physicist at the University of Exeter, UK. But, she adds, it's still not clear exactly what enables this feat. “It seems to be a magic fuel, not so much adding energy, but enabling the engine to extract energy faster,” Anders says. “Theoretical physicists will need to examine just how it does this.”

Focusing on experiments is a major step in the right direction for revitalizing the field, says Hänggi. But, for him, the experiments are not yet bold enough to give truly ground-breaking insights. There is also the challenge that quantum systems can be irrevocably disturbed by measurement and interaction with the environment. These effects are rarely sufficiently accounted for in theoretical proposals for new experiments, he says. “That is difficult to calculate, and much more difficult to implement in an experiment,” he says.

Ian Walmsley, who heads the Oxford lab where the diamond experiment was conducted, is also circumspect about the future of the field. Although he and other experimenters have been drawn to quantum thermodynamics research in recent years, he says that their interest has been largely “opportunistic”. They have spotted the chance to carry out relatively quick and easy experiments by piggybacking on set-ups already in place for other uses; the diamond-defect set-up, for instance, is already being widely studied for quantum computing and sensor applications. Today, quantum thermodynamics is fizzing with energy, Walmsley says. “But whether it will continue to sparkle, or just explode into nothing, well, we will have to wait and see.”



  1. Uzdin, R., Levy, A. & Kosloff, R. Phys. Rev. X 5, 031044 (2015).

  2. Clos, G., Porras, D., Warring, U. & Schaetz, T. Phys. Rev. Lett. 117, 170401 (2016).

  3. Bennett, C. H. Int. J. Theor. Phys. 21, 905–940 (1982).

  4. Brandão, F., Horodecki, M., Ng, N., Oppenheim, J. & Wehner, S. Proc. Natl Acad. Sci. USA 112, 3275 (2015).

  5. Freitas, N. & Paz, J. P. Phys. Rev. E 95, 012146 (2017).

  6. Bera, M. N., Riera, A., Lewenstein, M. & Winter, A. Preprint at (2017).

  7. Ferraro, D., Campisi, M., Andolina, G. M., Pellegrini, V. & Polini, M. Preprint at (2017).

  8. Rodríguez-Briones, N. A. et al. Preprint at (2017).

  9. Klatzow, J. et al. Preprint at (2017).

Author information


  1. Zeeya Merali is a freelance writer based in London. She writes occasionally for FQXi, which funds research in physics and cosmology, including areas related to this story.



  1. Avatar for Pentcho Valev
    Pentcho Valev
    "The Formation of the Floating Water Bridge..." Electric fields manage to convert the chaotic thermal motion of water molecules into macroscopically expressed perpetual (limited only by the deterioration of the system) flows capable of doing work at the expense of ambient heat. In 2002 I tried, for the first time, to call the attention of the scientific community to the effect (but failed): " two vertical constant-charge capacitor plates partially dip into a pool of a liquid dielectric (e.g. water), the liquid between them rises high above the surface of the rest of the liquid in the pool. Evidently, if one punches a macroscopic hole in one of the plates, nothing could prevent the liquid between the plates from leaking out through the hole and generating an eternal waterfall outside the capacitor. This hypothesis has been discussed on many occasions but so far no serious counter-argument has been raised." Pentcho Valev, The Law of Self-Acting Machines and Irreversible Processes with Reversible Replicas Pentcho Valev
  2. Avatar for Pentcho Valev
    Pentcho Valev
    Nature 2002: "Second law broken. Researchers have shown for the first time that, on the level of thousands of atoms and molecules, fleeting energy increases violate the second law of thermodynamics. [...] They found that over periods of time less than two seconds, variations in the random thermal motion of water molecules occasionally gave individual beads a kick. This increased the beads' kinetic energy by a small but significant amount, in apparent violation of the second law." No thermal fluctuation can produce a kick lasting for so long (up to two seconds). Only a local FLOW can push the bead in this way, and the flow, generated by a local electric field, is essentially identical to the flows in this system: "The Formation of the Floating Water Bridge..." In the electric field between the plates of a capacitor, the same flows can be seen: "Liquid Dielectric Capacitor" Pentcho Valev
  3. Avatar for Pentcho Valev
    Pentcho Valev
    The electric field is created by the laser beam: "A deviation from the second law of thermodynamics has been demonstrated experimentally for the first time. [...] To test the idea, the researchers put about 100 latex beads - each 6.3 µm across - into a water-filled cell, which was placed on the stage of a microscope. The researchers focused a laser onto one of the beads, which induced a dipole moment in the bead and drew it towards the most intense region of the electric field in the laser beam." "Optical tweezers are capable of manipulating nanometer and micron-sized dielectric particles by exerting extremely small forces via a highly focused laser beam. The beam is typically focused by sending it through a microscope objective. The narrowest point of the focused beam, known as the beam waist, contains a very strong electric field gradient. Dielectric particles are attracted along the gradient to the region of strongest electric field, which is the center of the beam." In an electric field, water develops a specific pressure that pushes in all directions and can result in an eternal (limited only by the deterioration of the system) motion, on condition that the system provides suitable channels for water to move through. The pressure is NON-CONSERVATIVE. This means that, if suitably harnessed, the pressure will do work AT THE EXPENSE OF AMBIENT HEAT (in violation of the second law of thermodynamics). Here is a simple example: "A plane capacitor with rectangular plates is fixed in a vertical position. [...] The capacitor is charged and disconnected from the battery. [...] The lower part of the capacitor is now brought into contact with a dielectric liquid: When the plates contact the liquid's surface, a force in the upward direction is exerted on the dielectric liquid. The total charge on each plate remains constant and there is no energy transferred to the system from outside." [END OF QUOTATION] There IS energy transferred to the system from outside. 
The rising water can do work, e.g. by lifting a floating weight, and this work can only be done at the expense of AMBIENT HEAT. What is the molecular mechanism behind the effect? Here is a schematic presentation of water dipoles in the electrical field: If it were not for the indicated (with an arrow) dipole, other dipoles in the picture are perfectly polarized as if there were no thermal motion. Of course, this is an oversimplification – thermal motion is a factor which constantly disturbs the polarization order. The crucial point is that, as can be inferred from the picture, any thermal disturbance contributes to the creation of a pressure between the plates. Consider the indicated dipole. It has just received a strong thermal stroke and undergone rotation. As a result, it pushes adjacent dipoles electrostatically, towards the plates. Macroscopically, the sum of all such disturbances is expressed as a pressure exerted on the plates. One can also say, somewhat figuratively, that the indicated dipole has absorbed heat and now, by pushing adjacent dipoles, is trying to convert it into work. Pentcho Valev
  4. Avatar for Pentcho Valev
    Pentcho Valev
    Clifford Truesdell, The Tragicomical History of Thermodynamics, 1822-1854, p. 6: "Finally, I confess to a heartfelt hope - very slender but tough - that even some thermodynamicists of the old tribe will study this book, master the contents, and so share in my discovery: Thermodynamics need never have been the Dismal Swamp of Obscurity that from the first it was and that today in common instruction it is; in consequence, it need not so remain." [...] p. 333: "Clausius' verbal statement of the "Second Law" makes no sense, for "some other change connected therewith" introduces two new and unexplained concepts: "other change" and "connection" of changes. Neither of these finds any place in Clausius' formal structure. All that remains is a Mosaic prohibition. A century of philosophers and journalists have acclaimed this commandment; a century of mathematicians have shuddered and averted their eyes from the unclean." Jos Uffink, Bluff your way in the Second Law of Thermodynamics: "Before one can claim that acquaintance with the Second Law is as indispensable to a cultural education as Macbeth or Hamlet, it should obviously be clear what this law states. This question is surprisingly difficult. The Second Law made its appearance in physics around 1850, but a half century later it was already surrounded by so much confusion that the British Association for the Advancement of Science decided to appoint a special committee with the task of providing clarity about the meaning of this law. However, its final report (Bryan 1891) did not settle the issue. Half a century later, the physicist/philosopher Bridgman still complained that there are almost as many formulations of the second law as there have been discussions of it. And even today, the Second Law remains so obscure that it continues to attract new efforts at clarification." 
As Clifford Truesdell suggests, the confusion started with Clausius's 1850 idiotic argument - later formulations of the second law of thermodynamics have all been defective. However previous formulations - those of Carnot - were both clear and correct. The simplest one is this: "A cold body is necessary" That is, heat cannot be cyclically converted into work unless a hot body, source of heat, and a cold body, receiver of heat, are available. The problem is that in 1824 Carnot deduced "A cold body is necessary" from a postulate that eventually turned out to be false: Carnot's (false) postulate: Heat is an indestructible substance (caloric) that cannot be converted into work by the heat engine. Unpublished notes written in the period 1824-1832 reveal that, after realizing that his postulate was false (and discovering the first law of thermodynamics), Carnot found "A cold body is necessary" implausible: Sadi Carnot, REFLECTIONS ON THE MOTIVE POWER OF HEAT, p. 225: "Heat is simply motive power, or rather motion which has changed form. It is a movement among the particles of bodies. Wherever there is destruction of motive power there is, at the same time, production of heat in quantity exactly proportional to the quantity of motive power destroyed. Reciprocally, wherever there is destruction of heat, there is production of motive power." p. 222: "Could a motion (that of radiating heat) produce matter (caloric)? No, undoubtedly; it can only produce a motion. Heat is then the result of a motion. Then it is plain that it could be produced by the consumption of motive power, and that it could produce this power. All the other phenomena - composition and decomposition of bodies, passage to the gaseous state, specific heat, equilibrium of heat, its more or less easy transmission, its constancy in experiments with the calorimeter - could be explained by this hypothesis. 
But it would be DIFFICULT TO EXPLAIN WHY, IN THE DEVELOPMENT OF MOTIVE POWER BY HEAT, A COLD BODY IS NECESSARY; why, in consuming the heat of a warm body, motion cannot be produced." Generally, a cold body is not necessary, that is, the second law of thermodynamics is false. The cold body is only TECHNOLOGICALLY necessary – non-isothermal heat engines are fast-working and powerful. Heat engines working under isothermal conditions (in the absence of a cold body) are commonplace but are too slow and impuissant to be of any technological importance. Except, perhaps, for the case where water is placed in an electric field - the non-conservative force (pressure) that emerges seems to be able to convert ambient heat into work quite vigorously: Wolfgang K. H. Panofsky, Melba Phillips, Classical Electricity and Magnetism, pp.115-116: "Thus the decrease in force that is experienced between two charges when they are immersed in a dielectric liquid can be understood only by considering the effect of the PRESSURE OF THE LIQUID ON THE CHARGES themselves." "However, in experiments in which a capacitor is submerged in a dielectric liquid the force per unit area exerted by one plate on another is observed to decrease... [...] This apparent paradox can be explained by taking into account the DIFFERENCE IN LIQUID PRESSURE in the field filled space between the plates and the field free region outside the capacitor." Tai Chow, Introduction to Electromagnetic Theory: A Modern Perspective, p. 267: "The strictly electric forces between charges on the conductors are not influenced by the presence of the dielectric medium. The medium is polarized, however, and the interaction of the electric field with the polarized medium results in an INCREASED FLUID PRESSURE ON THE CONDUCTORS that reduces the net forces acting on them." "Liquid Dielectric Capacitor" "The Formation of the Floating Water Bridge including electric breakdowns" Pentcho Valev
  5. Avatar for Vyacheslav Somsikov
    Vyacheslav Somsikov
    For the verification of the deterministic mechanism of irreversibility, which obtained within the framework of the classical mechanics laws [Somsikov V.M. Non-Linearity of Dynamics of the Non-Equilibrium Systems. World Journal of Mechanics, 2017, Vol.7 No.2, 11-23], we was performed the numerical calculations of the change of D-entropy for the system with different number of the potentially interacting material points (MP) when it moves through a potential barrier. D – entropy is a relation of the value of change of the systems internal energy to its full value. [Somsikov V. M. and Andreev A. B. On criteria of transition to a thermodynamic description of system dynamics. Russian Physics Journal, Vol. 58, No. 11, March, 2016; Volume 4 – May 2015 (05)]. The calculations were carried 400 times for a given number of particles for different initial states of the system, but for the same predetermined amount of energy. This made it possible to determine the change of the D-entropy for different states of the system for a given value of its energy and a given number of MP. It was found that the fluctuations of internal energy decreasing with increasing number of particles in the system for different initial conditions. When number of particles less 64, the D –entropy can be as positive as negative.When number of particles more 64 then none of the 400 numerical experiments gave a negative value change of the internal energy. This means that when number of particles more 64 the dynamics of the system becomes irreversible. Therefore, the number 64 can be called as a first critical number of the system, beyond which the system becomes irreversible. When number of particles more than 1000, the dispersion of the internal energy reaches to the minimum. With further increase in the number of MP the increment of the internal energy is not changed. This number can be called as a second critical number. 
Thus if the system consist from number of particles more than 1000, the thermodynamic description is a correct. Obviously, in the general case, these critical numbers will depend on the parameters of the task, for example, the width and height of the barrier.
  6. Avatar for Pentcho Valev
    Pentcho Valev
    "Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics." It was Clausius who "noticed" that the entropy is a state function, but was he correct?

    Here is the story: if you define the entropy S as a quantity that obeys the equation dS = dQrev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS. Clausius was very impressed by this statefunctionness and decided to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be disintegrated into small Carnot cycles, and nowadays this deduction remains the only justification of "Entropy is a state function":

    "Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

    "Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum."

    The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles. Conclusion: the belief that the entropy is a state function is totally unjustified.

    Any time scientists use the term "entropy", they don't know what they are talking about:

    "My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

    Pentcho Valev
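For reference, the textbook construction disputed in the comment above can be stated in two lines. Clausius defines entropy through reversible heat exchange, and "S is a state function" is equivalent to the vanishing of the cyclic integral (the Clausius equality):

```latex
% Clausius' definition of entropy via reversible heat exchange at temperature T
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% "S is a state function" is equivalent to the Clausius equality:
% the integral of dS around any reversible cycle vanishes
\oint \frac{\delta Q_{\mathrm{rev}}}{T} = 0
```

In the standard derivation, the equality is obtained by tiling an arbitrary reversible cycle with infinitesimal Carnot cycles, which is precisely the step the comment challenges.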
  7. Pentcho Valev
    The version of the second law of thermodynamics known as "Entropy always increases" (a version which, according to A. Eddington, holds "the supreme position among the laws of Nature") is in fact a theorem deduced by Clausius in 1865.

    Jos Uffink, "Bluff Your Way in the Second Law of Thermodynamics", p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state."

    Clausius' deduction was based on three postulates:

    Postulate 1 (implicit): The entropy is a state function.
    Postulate 2: Clausius' inequality (formula 10 on p. 33 in Uffink's paper) is correct.
    Postulate 3: Any irreversible process can be closed by a reversible process to become a cycle.

    All three postulates remain totally unjustified even nowadays. Postulate 1 can easily be disproved by considering cycles (heat engines) converting heat into work in ISOTHERMAL conditions. Postulate 3 is also false.

    Uffink, p. 39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible] process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."

    Note that, even if Clausius's theorem were true (it is not), it only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". This means that (even if Clausius's theorem were true) applications of "Entropy always increases" to processes which do not begin and end in equilibrium, that is, to processes in Nature, not in a cylinder with a piston, would still be incorrect.

    Jos Uffink, in the same article: "I therefore argue for the view that the second law has nothing to do with the arrow of time. [...] This summary leads to the question whether it is fruitful to see irreversibility or time-asymmetry as the essence of the second law. Is it not more straightforward, in view of the unargued statements of Kelvin, the bold claims of Clausius and the strained attempts of Planck, to give up this idea? I believe that Ehrenfest-Afanassjewa was right in her verdict that the discussion about the arrow of time as expressed in the second law of the thermodynamics is actually a RED HERRING."

    Pentcho Valev
  8. Raji Heyrovska
    I just saw the interesting article by Merali [1]. In this context, I wish to draw attention to the First International Conference on Quantum Limits to the Second Law [2]. In my contribution [3] to that conference, I point out that thermodynamic functions and laws were developed over the years to "bridge" the gap between the equations of state and thermal properties of matter.

    In [3] the thermodynamic properties are incorporated into the equation of state, thereby forming one simple composite equation. The heat capacity difference is introduced in place of the gas constant in my earlier concise equation of state for gases, based on free volume and molecular association/dissociation. This provides a new and simple relation between the P, V, T properties, internal energy (E), enthalpy (H), Gibbs (G) and Helmholtz (A) free energies, heat energy (Q), entropy (S), partition function (f) and the thermodynamic laws.

    Since a proper definition of "heat" is essential for the discussion of the second law, Q for a gas at the given P, V, T, S is defined by TS = PVflnW, where W is the thermodynamic probability related to f. The latter is expressed as the ratio of free volume to the volume corresponding to the de Broglie wavelength. Also, for the first time, experimental heat capacities at various P, V and T are correlated with the extent of molecular association. The available data for nitrogen have been used to demonstrate the validity of the new equation of state.

    References:
    1. Merali, Z., Nature 551, 20–22 (2 November 2017), doi:10.1038/551020a
    2. Quantum Limits to the Second Law: First International Conference on Quantum Limits to the Second Law, 29–31 July 2002, San Diego, California (USA), ed. Daniel P. Sheehan, AIP Conference Proceedings vol. 643 (20 November 2002), ISBN 0-7354-0098-9
    3. Heyrovska, R., AIP Conference Proceedings 643, 157–162 (2002)
  9. Pentcho Valev
    The second law of thermodynamics has an absurd implication that proves its falsehood: if we have a reversible chemical reaction and a catalyst increases the rate of the forward reaction by a factor of, say, 745492, it obligatorily increases the rate of the reverse reaction by exactly the same factor, 745492, despite the fact that the two reactions - forward and reverse - may be entirely different (e.g. the diffusion factor is crucial for one but not important for the other) and accordingly require entirely different catalytic mechanisms.

    The absurd implication is usually referred to as "Catalysts do not shift chemical equilibrium":

    "A catalyst reduces the time taken to reach equilibrium, but does not change the position of the equilibrium. This is because the catalyst increases the rates of the forward and reverse reactions BY THE SAME AMOUNT."

    "In the presence of a catalyst, both the forward and reverse reaction rates will speed up EQUALLY... [...] If the addition of catalysts could possibly alter the equilibrium state of the reaction, this would violate the second rule of thermodynamics..."

    The absurd implication is not obeyed by chemical reactions, of course. Here is a publication in Nature describing a catalyst accelerating the forward and SUPPRESSING the reverse reaction: Yu Hang Li et al., "Unidirectional suppression of hydrogen oxidation on oxidized platinum clusters".

    Another example of disobedience: perpetual (limited only by the deterioration of the system) motion of dimer A_2 and monomer A between two catalytic surfaces, S1 and S2 (a time crystal par excellence). See the explanations here:

    That catalysts can violate the second law of thermodynamics by shifting chemical equilibrium is presented by Wikipedia as a fact:

    "Epicatalysis is a newly identified class of gas-surface heterogeneous catalysis in which specific gas-surface reactions shift gas phase species concentrations away from those normally associated with gas-phase equilibrium. [...] A traditional catalyst adheres to three general principles, namely: 1) it speeds up a chemical reaction; 2) it participates in, but is not consumed by, the reaction; and 3) it does not change the chemical equilibrium of the reaction. Epicatalysts overcome the third principle..."

    Pentcho Valev
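For reference, the textbook kinetic argument quoted in the comment above ("increases the rates of the forward and reverse reactions BY THE SAME AMOUNT") rests on the relation between the equilibrium constant and the rate constants of an elementary reversible reaction:

```latex
% Equilibrium constant of an elementary reversible reaction
K = \frac{k_{\mathrm{f}}}{k_{\mathrm{r}}}

% A catalyst that multiplies both rate constants by the same factor \alpha
% leaves K, and hence the equilibrium position, unchanged:
K' = \frac{\alpha\, k_{\mathrm{f}}}{\alpha\, k_{\mathrm{r}}} = K
```

This is the standard statement; the comment's claim is that real catalysts need not obey it.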
  10. Pentcho Valev
    The second law of thermodynamics has long been under attack, but only for small, microscopic, quantum etc. systems.

    Nature, 2002: "Second law broken. Researchers have shown for the first time that, on the level of thousands of atoms and molecules, fleeting energy increases violate the second law of thermodynamics."

    The truth is that MACROSCOPIC systems violating the second law of thermodynamics are COMMONPLACE. The problem is that misleading education diverts the attention from relevant examples:

    "A necessary component of a heat engine, then, is that two temperatures are involved. At one stage the system is heated, at another it is cooled."

    So educators present the two temperatures as NECESSARY and deal with non-isothermal heat engines only:

    "All materials react to heat in some way. But this new shape-changing polymer reacts to temperatures as small as the touch of human skin to contract - in the process lifting as much as 1,000 times its own weight."

    "Stretchy Science: A Rubber Band Heat Engine. Learn how a rubber band can turn heat into mechanical work with this simple activity. [...] Your blow dryer essentially turned your rubber band into a heat engine - a machine that turns thermal energy into mechanical work."

    The second law of thermodynamics would be long forgotten if isothermal analogs which almost obviously violate it had been analyzed (one should only evaluate the work involved in a quasi-static cycle):

    "When the pH is lowered (that is, on raising the chemical potential, μ, of the protons present) at the isothermal condition of 37°C, these matrices can exert forces, f, sufficient to lift weights that are a thousand times their dry weight."

    A. Katchalsky, "Polyelectrolytes and Their Biological Interactions", p. 15, Figure 4: "Polyacid gel in sodium hydroxide solution: expanded. Polyacid gel in acid solution: contracted; weight is lifted."

    The following four-step isothermal cycle, if carried out quasi-statically (reversibly), clearly violates the second law of thermodynamics:

    1. The polymer is initially stretched. The operator adds hydrogen ions (H+) to the system. The force of contraction increases.
    2. The polymer contracts and lifts a weight.
    3. The operator removes the same amount of H+ from the system. The force of contraction decreases.
    4. The operator stretches the polymer and restores the initial state of the system.

    The net work extracted from the cycle is positive unless the following is the case: the operator, as he decreases and then increases the pH of the system (steps 1 and 3), does (loses; wastes) more work than the work he gains from weight-lifting. However, electrochemists know that, if both adding hydrogen ions to the system and then removing them are performed quasi-statically, the net work involved is virtually zero (the operator gains work if the hydrogen ions are transported from a high to a low concentration and then loses the same amount of work in the backward transport). That is, the net work involved in steps 1 and 3 is zero, and the net work extracted from steps 2 and 4 is positive, in violation of the second law of thermodynamics.

    Pentcho Valev
  11. Pentcho Valev
    Philip Ball explains why Frank Wilczek's time crystals are bogus:

    "But to make that happen, the researchers must deliver kicks to the spins, provided by a laser or pulses of microwaves, to keep them out of equilibrium. The time crystals are sustained only by constant kicking, even though - crucially - their oscillation doesn't match the rhythm of the kicking. The experiments are ingenious and the results show that this modified version of Wilczek's vision is feasible. But are we right to award the new findings this eye-catching new label, or are they really just a new example of a phenomenon that has been going on since the first primeval heart started beating? If these fancy arrangements of quantum spins deserve to be called time crystals, can we then say that we each already have a time crystal pulsing inside of us, keeping us alive?"

    That is, Frank Wilczek's time crystals are regularly "kicked" by the experimentalist. However, there are genuine time crystals "kicked" by ambient heat and breathtakingly violating the second law of thermodynamics. Here is perpetual (limited only by the deterioration of the system) motion of water in an electric field, obviously able to produce work - e.g. by rotating a waterwheel:

    "The Formation of the Floating Water Bridge including electric breakdowns": "The water movement is bidirectional, i.e., it simultaneously flows in both directions."

    The work will be done at the expense of what energy? The first hypothesis that comes to mind is: at the expense of electric energy. The system is, essentially, an electric motor. However, close inspection would suggest that the hypothesis is untenable. Scientists use triply distilled water to reduce the conductivity and the electric current passing through the system to a minimum. If, for some reason, the current is increased, the motion stops - the system cannot be an electric motor. If the system is not an electric motor, then it is ... a perpetual-motion machine of the second kind!

    Here, arguments describing perpetual-motion machines as impossible, idiotic, etc. are irrelevant - the following conditional is valid: IF THE SYSTEM IS NOT AN ELECTRIC MOTOR, then it is a perpetual-motion machine of the second kind. In other words, if the work is not done at the expense of electric energy, then it is done at the expense of ambient heat, in violation of the second law of thermodynamics. No third source of energy is conceivable.

    In the electric field between the plates of a capacitor, the same perpetual motion of water can be seen (we have a time crystal again): "Liquid Dielectric Capacitor". In the capacitor system the rising water can repeatedly do work, e.g. by lifting floating weights. The crucial question is: the work (lifting floating weights) will be done at the expense of what energy? Obviously "electric energy" is not the correct answer - the capacitor is not an electric motor. Then the only possible answer remains "ambient heat". The system is a heat engine violating the second law of thermodynamics!

    Pentcho Valev