As a child, I read a joke about someone who invented the electric plug and had to wait for the invention of a socket to put it in. Who would invent something so useful without knowing what purpose it would serve? Mathematics often displays this astonishing quality. Trying to solve real-world problems, researchers often discover that the tools they need were developed years, decades or even centuries earlier by mathematicians with no prospect of, or care for, applicability. And the toolbox is vast, because, once a mathematical result is proven to the satisfaction of the discipline, it doesn't need to be re-evaluated in the light of new evidence or refuted, unless it contains a mistake. If it was true for Archimedes, then it is true today.

The mathematician develops topics that no one else can see any point in pursuing, or pushes ideas far into the abstract, well beyond where others would stop. Chatting with a colleague over tea about a set of problems that ask for the minimum number of stationary guards needed to keep every point in an art gallery under observation, I outlined the basic mathematics, noting that it works only on a two-dimensional floor plan and breaks down in three-dimensional situations, such as when the art gallery contains a mezzanine. “Ah,” he said, “but if we move to 5D we can adapt ...” This extension and abstraction without apparent direction or purpose is fundamental to the discipline. Applicability is not the reason we work, and plenty that is not applicable contributes to the beauty and magnificence of our subject.


There has been pressure in recent years for researchers to predict the impact of their work before it is undertaken. Alan Thorpe, then chair of Research Councils UK, was quoted by Times Higher Education (22 October 2009) as saying: “We have to demonstrate to the taxpayer that this is an investment, and we do want researchers to think about what the impact of their work will be.” The US National Science Foundation is similarly focused on broader impacts of research proposals (see Nature 465, 416–418; 2010). However, predicting impact is extremely problematic. The latest International Review of Mathematical Sciences (Engineering and Physical Sciences Research Council; 2010), an independent assessment of the quality and impact of UK research, warned that even the most theoretical mathematical ideas “can be useful or enlightening in unexpected ways, sometimes several decades after their appearance”.

There is no way to guarantee in advance what pure mathematics will later find application. We can only let the process of curiosity and abstraction take place, let mathematicians obsessively take results to their logical extremes, leaving relevance far behind, and wait to see which topics turn out to be extremely useful. If we don't, when the challenges of the future arrive, we won't have the right piece of seemingly pointless mathematics to hand.

To illustrate this, I asked members of the British Society for the History of Mathematics (including myself) for unsung stories of the unplanned impact of mathematics (beyond the use of number theory in modern cryptography, or that the mathematics to operate a computer existed when one was built, or that imaginary numbers became essential to the complex calculations that fly aeroplanes). Here follow seven; for more, see http://www.bshm.org. Peter Rowlett

Mark McCartney & Tony Mann: From quaternions to Lara Croft

University of Ulster, Newtownabbey, UK; University of Greenwich, London

Famously, the idea of quaternions came to the Irish mathematician William Rowan Hamilton on 16 October 1843 as he was walking over Brougham Bridge, Dublin. He marked the moment by carving the equations into the stonework of the bridge. Hamilton had been seeking a way to extend the complex-number system into three dimensions: his insight on the bridge was that it was necessary instead to move to four dimensions to obtain a consistent number system. Whereas complex numbers take the form a + ib, where a and b are real numbers and i is the square root of −1, quaternions have the form a + bi + cj + dk, where the rules are i² = j² = k² = ijk = −1.

Hamilton spent the rest of his life promoting the use of quaternions, as mathematics both elegant in its own right and useful for solving problems in geometry, mechanics and optics. After his death the torch was carried by Peter Guthrie Tait (1831–1901), professor of natural philosophy at the University of Edinburgh. William Thomson (Lord Kelvin) wrote of Tait: “We have had a thirty-eight-year war over quaternions.” Thomson agreed with Tait that they would use quaternions in their important joint book the Treatise on Natural Philosophy (1867) wherever they were useful. However, their complete absence from the final manuscript shows that Thomson was not persuaded of their value.

By the close of the nineteenth century, vector calculus had eclipsed quaternions, and mathematicians in the twentieth century generally followed Kelvin rather than Tait, regarding quaternions as a beautiful, but sadly impractical, historical footnote.

So it was a surprise when a colleague who teaches computer-games development asked which mathematics module students should take to learn about quaternions. It turns out that they are particularly valuable for calculations involving three-dimensional rotations, where they have various advantages over matrix methods. This makes them indispensable in robotics and computer vision, and in ever-faster graphics programming.
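
To give a flavour of why, here is a minimal sketch in Python of the standard 'sandwich' product q p q⁻¹ (a toy illustration of my own, not the code of any particular engine): a rotation by a given angle about a unit axis is encoded as a unit quaternion and applied to a point.

```python
import math

def quat_mult(a, b):
    # Multiply two quaternions (w, x, y, z) using i² = j² = k² = ijk = −1
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(point, axis, angle):
    # Unit quaternion encoding a rotation by `angle` about the unit vector `axis`
    s, c = math.sin(angle / 2), math.cos(angle / 2)
    q = (c, axis[0]*s, axis[1]*s, axis[2]*s)
    q_conj = (q[0], -q[1], -q[2], -q[3])
    p = (0.0, *point)                      # embed the point as a 'pure' quaternion
    w, x, y, z = quat_mult(quat_mult(q, p), q_conj)
    return (x, y, z)

# Rotate (1, 0, 0) by 90 degrees about the z-axis: expect roughly (0, 1, 0)
print(rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2))
```

Composing two rotations is a single quaternion multiplication, and interpolating smoothly between orientations is straightforward, which is part of their appeal over rotation matrices.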

Tait would no doubt be happy to have finally won his 'war' with Kelvin. And Hamilton's expectation that his discovery would be of great benefit has been realized, after 150 years, in gaming, an industry estimated to be worth more than US$100 billion worldwide.

Graham Hoare: From geometry to the Big Bang

Correspondence editor, Mathematics Today

In 1907, Albert Einstein's formulation of the equivalence principle was a key step in the development of the general theory of relativity. His idea, that the effects of acceleration are indistinguishable from the effects of a uniform gravitational field, depends on the equivalence between gravitational mass and inertial mass. Einstein's essential insight was that gravity manifests itself in the form of space-time curvature; gravity is no longer regarded as a force. How matter curves the surrounding space-time is expressed by Einstein's field equations. He published his general theory in 1915; its origins can be traced back to the middle of the previous century.

In his brilliant Habilitation lecture of 1854, Bernhard Riemann introduced the principal ideas of modern differential geometry — n-dimensional spaces, metrics and curvature, and the way in which curvature controls the geometric properties of space — by inventing the concept of a manifold. Manifolds are essentially generalizations of shapes, such as the surface of a sphere or a torus, on which one can do calculus. Riemann went far beyond the conceptual frameworks of Euclidean and non-Euclidean geometry. He foresaw that his manifolds could be models of the physical world.

The tools developed to apply Riemannian geometry to physics were initially the work of Gregorio Ricci-Curbastro, beginning in 1892, and were later extended with his student Tullio Levi-Civita. In 1912, Einstein enlisted the help of his friend, the mathematician Marcel Grossmann, to use this 'tensor calculus' to articulate his deep physical insights in mathematical form. He employed Riemannian manifolds in four dimensions: three for space and one for time (space-time).

It was the custom at the time to assume that the Universe is static. But Einstein soon found that his field equations, when applied to the whole Universe, did not have any static solutions. In 1917, to make a static Universe possible, Einstein added the cosmological constant to his original field equations. Reasons for believing in an explosive origin to the Universe, the Big Bang, were put forward by Alexander Friedmann in his 1922 study of Einstein's field equations in a cosmological context. Grudgingly accepting the irrefutable evidence of the expansion of the Universe, Einstein deleted the constant in 1931, referring to it as “the biggest blunder” of his life.
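
In modern notation, the field equations, with the cosmological constant Λ that Einstein added and later removed, read:

```latex
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```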

Edmund Harriss: From oranges to modems

University of Arkansas, Fayetteville

In 1998, mathematics was suddenly in the news. Thomas Hales of the University of Pittsburgh, Pennsylvania, had proved the Kepler conjecture, showing that the way greengrocers stack oranges is the most efficient way to pack spheres. A problem that had been open since 1611 was finally solved! On the television a greengrocer said: “I think that it's a waste of time and taxpayers' money.” I have been mentally arguing with that greengrocer ever since: today the mathematics of sphere packing enables modern communication, being at the heart of the study of channel coding and error-correction codes.

In 1611, Johannes Kepler suggested that the greengrocer's stacking was the most efficient, but he was not able to give a proof. It turned out to be a very difficult problem. Even the simpler question of the best way to pack circles was only proved in 1940 by László Fejes Tóth. Also in the seventeenth century, Isaac Newton and David Gregory argued over the kissing problem: how many spheres can touch a given sphere with no overlaps? In two dimensions it is easy to prove that the answer is 6. Newton thought that 12 was the maximum in 3 dimensions. It is, but only in 1953 did Kurt Schütte and Bartel van der Waerden give a proof.

The kissing number in 4 dimensions was proved to be 24 by Oleg Musin in 2003. In 5 dimensions we can say only that it lies between 40 and 44. Yet we do know that the answer in 8 dimensions is 240, proved back in 1979 by Andrew Odlyzko of the University of Minnesota, Minneapolis, and Neil Sloane. The same paper had an even stranger result: the answer in 24 dimensions is 196,560. These proofs are simpler than the result for three dimensions, and relate to two incredibly dense packings of spheres, called the E8 lattice in 8 dimensions and the Leech lattice in 24 dimensions.

This is all quite magical, but is it useful? In the 1960s an engineer called Gordon Lang believed so. Lang was designing the systems for modems and was busy harvesting all the mathematics he could find.

He needed to send a signal over a noisy channel, such as a phone line. The natural way is to choose a collection of tones for signals. But the sound received may not be the same as the one sent. To solve this, he described the sounds by a list of numbers. It was then simple to find which of the signals that might have been sent was closest to the signal received. The signals can then be considered as spheres, with wiggle room for noise. To maximize the information that can be sent, these 'spheres' must be packed as tightly as possible.
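
A rough sketch of the idea in Python (a toy example of my own, not Lang's actual modem design): a small 'codebook' of signal points, Gaussian noise on the channel, and a decoder that picks the nearest legitimate point. Denser sphere packings allow more signal points, and hence more information, for the same noise tolerance.

```python
import math
import random

codebook = {                      # four 2-dimensional signal points
    "00": (1.0, 1.0), "01": (1.0, -1.0), "10": (-1.0, 1.0), "11": (-1.0, -1.0),
}

def transmit(bits, noise=0.3):
    # The channel nudges the sent point by a little random noise
    x, y = codebook[bits]
    return (x + random.gauss(0, noise), y + random.gauss(0, noise))

def decode(received):
    # Choose the codeword whose signal point is closest to what was received
    return min(codebook, key=lambda b: math.dist(received, codebook[b]))

sent = "10"
print(sent, "->", decode(transmit(sent)))
```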

In the 1970s, Lang developed a modem with 8-dimensional signals, using E8 packing. This helped to open up the Internet, as data could be sent over the phone, instead of relying on specifically designed cables. Not everyone was thrilled. Donald Coxeter, who had helped Lang understand the mathematics, said he was “appalled that his beautiful theories had been sullied in this way”.

Juan Parrondo & Noel-Ann Bradshaw: From paradox to pandemics

University of Madrid; University of Greenwich, London

In 1992, two physicists, Armand Ajdari of the School of Industrial Physics and Chemistry in Paris and Jacques Prost of the Curie Institute in Paris, proposed a simple device to turn thermal fluctuations at the molecular level into directed motion: a 'Brownian ratchet'. It consists of a particle in a flashing asymmetric field; switching the field on and off induces the directed motion.

Parrondo's paradox, discovered in 1996 by one of us (J.P.), captures the essence of this phenomenon mathematically, translating it into a simpler and broader language: gambling games. In the paradox, a gambler alternates between two games, both of which lead to an expected loss in the long term. Surprisingly, by switching between them, one can produce a game in which the expected outcome is positive. The term 'Parrondo effect' is now used to refer to an outcome of two combined events being very different from the outcomes of the individual events.
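
A commonly used formulation of the two games can be simulated in a few lines of Python (the parameter values below follow the textbook version of the paradox and are not taken from the original paper): both games lose on their own, yet randomly switching between them wins on average.

```python
import random

EPS = 0.005  # small bias that makes both games losing on their own

def play_game_a(capital):
    # Game A: a biased coin with win probability just under one-half
    return capital + 1 if random.random() < 0.5 - EPS else capital - 1

def play_game_b(capital):
    # Game B: the win probability depends on whether capital is a multiple of 3
    p = 0.10 - EPS if capital % 3 == 0 else 0.75 - EPS
    return capital + 1 if random.random() < p else capital - 1

def random_switch(capital):
    # Before each round, flip a fair coin to decide which losing game to play
    return play_game_a(capital) if random.random() < 0.5 else play_game_b(capital)

def average_final_capital(strategy, rounds=1000, trials=2000):
    total = 0
    for _ in range(trials):
        capital = 0
        for _ in range(rounds):
            capital = strategy(capital)
        total += capital
    return total / trials

print("Game A only:     ", average_final_capital(play_game_a))    # negative
print("Game B only:     ", average_final_capital(play_game_b))    # negative
print("Random switching:", average_final_capital(random_switch))  # positive
```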

A number of applications of the Parrondo effect are now being investigated in which chaotic dynamics can combine to yield non-chaotic behaviour. For example, the effect can be used to model the population dynamics in outbreaks of viral diseases and offers prospects of reducing the risks of share-price volatility. It even plays a leading part in the plot of Richard Armstrong's 2006 novel, God Doesn't Shoot Craps: A Divine Comedy.

Peter Rowlett: From gamblers to actuaries

University of Birmingham, UK

In the sixteenth century, Girolamo Cardano was a mathematician and a compulsive gambler. Tragically for him, he squandered most of the money he inherited and earned. Fortunately for modern actuarial science, he wrote in the mid-1500s what is considered to be the first work in modern probability theory, Liber de ludo aleae, finally published in a collection in 1663.

Around a century after the creation of this theory, another gambler, Chevalier de Méré, had a dilemma. He had been offering a game in which he bet he could throw a six in four rolls of a die, and had done well out of it. He varied the game in a way that seemed sensible, betting he could throw a double six with two dice in 24 rolls. He had calculated the chances of winning in both games as equivalent, but found he lost money in the long run playing the second game. Confused, he asked his friend Blaise Pascal for an explanation. Pascal wrote to Pierre de Fermat in 1654. The ensuing correspondence laid the foundations for probability theory, and when Christiaan Huygens learned of the results he wrote the first published work on probability, De Ratiociniis in Ludo Aleae (published 1657).
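
The resolution is a short calculation in modern notation (the numerical working below is mine, not from the 1654 correspondence). De Méré's two bets are not equivalent at all:

```latex
P(\text{at least one six in 4 rolls}) = 1 - \left(\tfrac{5}{6}\right)^{4} \approx 0.518,
\qquad
P(\text{at least one double six in 24 rolls}) = 1 - \left(\tfrac{35}{36}\right)^{24} \approx 0.491.
```

The first bet is favourable and the second is not, which is exactly what de Méré's purse had been telling him.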

In the late seventeenth century, Jakob Bernoulli recognized that probability theory could be applied much more widely than to games of chance. He wrote Ars Conjectandi (published, after his death, in 1713), which consolidated and extended the probability work by Cardano, Fermat, Pascal and Huygens. Bernoulli built on Cardano's discovery that with sufficient rolls of a fair, six-sided die we can expect each outcome to appear around one-sixth of the time, but that if we roll one die six times we shouldn't expect to see each outcome precisely once. Bernoulli gave a proof of the law of large numbers, which says that the larger a sample, the more closely the sample characteristics match those of the parent population.
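
A toy illustration of the principle (my own example, not Bernoulli's): simulate ever-longer runs of die rolls and watch the observed share of sixes settle towards one-sixth, even though a run of just six rolls rarely contains exactly one six.

```python
import random

for n in (6, 60, 600, 6000, 600000):
    sixes = sum(1 for _ in range(n) if random.randint(1, 6) == 6)
    print(f"{n:>6} rolls: proportion of sixes = {sixes / n:.4f}")  # approaches 1/6 ≈ 0.1667
```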

Insurance companies had been limiting the number of policies they sold. As policies are based on probabilities, each policy sold seemed to incur an additional risk, the cumulative effect of which, it was feared, could ruin a company. From the eighteenth century onwards, companies adopted their current practice of selling as many policies as possible, because, as Bernoulli's law of large numbers showed, the bigger the volume, the more likely their predictions are to be accurate.

Julia Collins: From bridges to DNA

University of Edinburgh, UK

When Leonhard Euler proved to the people of Königsberg in 1735 that they could not traverse all of their seven bridges in one trip, he invented a new kind of mathematics: one in which distances didn't matter. His solution relied only on knowing the relative arrangements of the bridges, not on how long they were or how big the land masses were. In 1847, Johann Benedict Listing finally coined the term 'topology' to describe this new field, and for the next 150 years or so, mathematicians worked to understand the implications of its axioms.
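
Euler's argument reduces to counting (a toy reconstruction in Python; the vertex labels are my own): a walk crossing every bridge exactly once can exist only if the number of land masses touched by an odd number of bridges is 0 or 2, and in Königsberg all four land masses have odd degree.

```python
from collections import Counter

# The seven bridges as a multigraph: A = the island, B and C = the two banks,
# D = the eastern land mass (labels chosen for illustration)
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

odd = [land for land, d in degree.items() if d % 2 == 1]
print(dict(degree))                      # {'A': 5, 'B': 3, 'C': 3, 'D': 3}
print("odd-degree land masses:", odd)    # all four, so no such walk exists
```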

For most of that time, topology was pursued as an intellectual challenge, with no expectation of it being useful. After all, in real life, shape and measurement are important: a doughnut is not the same as a coffee cup. Who would ever care about 5-dimensional holes in abstract 11-dimensional spaces, or whether surfaces had one or two sides? Even practical-sounding parts of topology such as knot theory, which had its origins in attempts to understand the structure of atoms, were thought to be useless for most of the nineteenth and twentieth centuries.

Suddenly, in the 1990s, applications of topology started to appear: slowly at first, then gaining momentum, until now it seems as if there are few areas in which topology is not used. Biologists learn knot theory to understand DNA. Computer scientists are using braids — intertwined strands of material running in the same direction — to build quantum computers, while colleagues down the corridor use the same theory to get robots moving. Engineers use one-sided Möbius strips to make more efficient conveyor belts. Doctors depend on homology theory to do brain scans, and cosmologists use it to understand how galaxies form. Mobile-phone companies use topology to identify the holes in network coverage; the phones themselves use topology to analyse the photos they take.

It is precisely because topology is free of distance measurements that it is so powerful. The same theorems apply to any knotted DNA, regardless of how long it is or what animal it comes from. We don't need different brain scanners for people with different-sized brains. When Global Positioning System data about mobile phones are unreliable, topology can still guarantee that those phones will receive a signal. Quantum computing won't work unless we can build a robust system impervious to noise, so braids are perfect for storing information because they don't change if you wiggle them. Where will topology turn up next?

Chris Linton: From strings to nuclear power

Loughborough University, UK

Series of sine and cosine functions were used by Leonhard Euler and others in the eighteenth century to solve problems, notably in the study of vibrating strings and in celestial mechanics. But it was Joseph Fourier, at the beginning of the nineteenth century, who recognized the great practical utility of these series in heat conduction and began to develop a general theory. Thereafter, the list of areas in which Fourier series were found to be useful grew rapidly to include acoustics, optics and electric circuits. Nowadays, Fourier methods underpin large parts of science and engineering and many modern computational techniques.
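
In modern notation, such a series represents a function on the interval [−π, π] as a sum of sines and cosines, with coefficients obtained by integration:

```latex
f(x) \sim \frac{a_0}{2} + \sum_{n=1}^{\infty} \bigl( a_n \cos nx + b_n \sin nx \bigr),
\qquad
a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos nx \, \mathrm{d}x,
\quad
b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin nx \, \mathrm{d}x.
```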

However, the mathematics of the early nineteenth century was inadequate for the development of Fourier's ideas, and the resolution of the numerous problems that arose challenged many of the great minds of the time. This in turn led to new mathematics. For example, in the 1830s, Gustav Lejeune Dirichlet gave the first clear and useful definition of a function, and Bernhard Riemann in the 1850s and Henri Lebesgue in the 1900s created rigorous theories of integration. What it means for an infinite series to converge turned out to be a particularly slippery animal, but this was gradually tamed by theorists such as Augustin-Louis Cauchy and Karl Weierstrass, working in the 1820s and 1850s, respectively. In the 1870s, Georg Cantor's first steps towards an abstract theory of sets came about through analysing how two functions with the same Fourier series could differ.

The crowning achievement of this mathematical trajectory, formulated in the first decade of the twentieth century, is the concept of a Hilbert space. Named after the German mathematician David Hilbert, this is a set of elements that can be added and multiplied according to a precise set of rules, with special properties that allow many of the tricky questions posed by Fourier series to be answered. Here the power of mathematics lies in the level of abstraction and we seem to have left the real world behind.

Then in the 1920s, Hermann Weyl, Paul Dirac and John von Neumann recognized that this concept was the bedrock of quantum mechanics, since the possible states of a quantum system turn out to be elements of just such a Hilbert space. Arguably, quantum mechanics is the most successful scientific theory of all time. Without it, much of our modern technology — lasers, computers, flat-screen televisions, nuclear power — would not exist.