The language of theoretical physics implies a hierarchy. We celebrate 'unified' models that encompass previously disjoint entities. The stated goal of many theoretical physicists is a 'theory of everything', which is taken to mean a unified theory of gravity and the standard model of particle physics. Yet should not a theory of everything also account for literature, penguins and the Holy Roman Empire? Implicit in this terminology is the assumption that such a theory would make this possible: it is simply a matter of following the mathematics until we derive these phenomena.

There are insurmountable practical reasons why social science or Brexit can never be derived from fundamental physics. We cannot even model complex molecules directly from the standard model. There is no way to derive DNA, human organs or all the complex structure that lies between the standard model and the behaviour of a society from their physical foundations. As Kohn pointed out in his 1998 Nobel Prize acceptance lecture [1], to solve a system of just 100 electrons would require minimizing a function across 10^150 dimensions. Thus "traditional wavefunction methods ... are generally limited to molecules with a small total number of chemically active electrons". The size of our observable universe is approximately 10^184 Planck volumes, so even an ideal computer capable of storing detailed values for each of these dimensions at the quantum-gravity scale could not sufficiently explore the space required to model a glucose molecule accurately at the subatomic level. To do so would require over 3^(3×192) ≈ 10^274 minimizations for the electrons and protons alone. As photosynthesis is explained in chemical formulae for glucose that are understood and tested by high-school students the world over, why should we claim that this explanation is less fundamental than one framed in such a way that it can never be calculated? Occam's razor cuts deeply into such explanations. In practice, we take the available data and construct a map that compresses the information into a useful form that actually works. If the map is bigger than the object it represents, there is no need for the map.
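To unpack that last estimate (a back-of-envelope sketch following Kohn's argument, and assuming his lower bound of p = 3 sampled values per coordinate): glucose, C6H12O6, contains 96 electrons and 96 protons, so N = 192 particles and 3N = 576 coordinates, giving

```latex
% Kohn's 'exponential wall': the number of parameters M needed to minimize
% the wavefunction of N particles, sampling p values per coordinate.
\[
  M \;=\; p^{3N} \;=\; 3^{3 \times 192} \;=\; 3^{576} \;>\; 10^{274}
\]
```

parameters to minimize over, a number that dwarfs the roughly 10^184 Planck volumes of the observable universe quoted above.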

The solution to this problem is that we make approximations, simplifications and effective laws. Effective field theories are used to describe phenomena that cannot in practice be derived from a more fundamental level, and they are often the tools with which we make predictions that are confirmed by observation. As Rovelli notes [2], "There is no physics without approximations". Follow the chain of emergence from quarks to atoms to molecules to cells to neurons to lobes to brains to people to social groups: at each stage we do not derive the behaviour directly, but instead study the system in terms of new variables appropriate to that level. We model and test the larger system to ensure that our approximations hold, so the science of the larger system is established through empirical observation, independently of its constituent parts. It is no less scientifically valid to talk of the behaviour of atoms instead of quarks, or of people instead of neurons. To insist on the more 'fundamental' description is to ignore the primacy of observation: the model in question is tested and found to work, and is therefore a valid scientific description of reality.

To insist that any scientific model must descend from fundamental physics neglects the question of the model's utility. No engineer building a bridge ever lost sleep over the inability of theorists to reconcile quantum mechanics and general relativity. Nor should they. Science is the modelling of reality. When a model is tested and found to work in some regime, it is anchored in reality. It is immaterial that we cannot derive this model from something more 'fundamental', because its scientific truth is borne out by its utility. If we insist that only the most 'fundamental' model is true, then science becomes a house of cards. As we do not have a confirmed candidate for this most fundamental model, the situation would be dire: we would have neither determined nor tested the foundations on which the hierarchy is built. Moreover, any error in any of the layers between the foundations and the model in question would bring the whole structure down.

This, fortunately, is not how science is done. A revolutionary observation that confirmed string theory would be immaterial to the condensed-matter physicist. Their observations and models would remain unchanged, and the practical challenge to the fundamental model would be to reproduce the known observations at the higher level. The burden of finding agreement lies not with the established macro-level theory, but with the microscopic hypothesis. What is most fundamental is that a model matches testable reality, and that applies at each level. As Philip Anderson has written [3], "This principle of emergence is as pervasive a philosophical foundation of the viewpoint of modern science as is reductionism. It underlies, for example, all of biology, as emphasized especially by Ernst Mayr, and much of geology. It represents an open frontier for the physicist, a frontier which has no practical barriers in terms of expense or feasibility, merely intellectual ones."