The turn of the 20th century saw a burst of revolutionary developments in physics that left heads spinning. In the words of Robert Millikan [1], it was “… a period in which new viewpoints and indeed wholly new phenomena follow one another so rapidly across the stage of physics that the actors themselves scarcely know what is happening.” Not least among the dizzying discoveries was that of a new elementary particle, the electron, having a universal elementary charge, e.

The concept of an elementary charge dates to 1874, when George Stoney introduced “a single definite quantity of electricity” as one of three base quantities for a natural system of measurement units [2]. Stoney’s idea arose from Michael Faraday’s 1834 law of electrolysis, and from experiments at that time Stoney estimated a value 16 times smaller than today’s value for e. The terms ‘electron’ and ‘atom of electricity’ were introduced by Stoney in the early 1890s, but the concept remained restricted to electrolysis. The electric charges and currents in all other electromagnetic phenomena were viewed in terms of vortex-like tubular structures in the all-pervading ether, as proposed by Faraday and formalized by James Clerk Maxwell in his famous equations. Heinrich Hertz’s observation in 1887 of waves propagating at precisely the speed predicted by these equations was considered definitive confirmation of the etherial model.

[Photograph of Robert Millikan. Credit: Granger Historical Picture Archive / Alamy Stock Photo]

In 1897, J. J. Thomson demonstrated [3] that cathode rays are composed of charged ‘corpuscles’, whose charge he estimated in 1899 to be about 1.4 times the present value of e. Thomson argued that these subatomic objects were constituents of all matter, a revolutionary idea at a time when atoms were considered indivisible, but he still viewed them in terms of various conformations of the ether [4]. It was another decade before atomistic models, bolstered by the spontaneous decay of atoms and the scattering of alpha particles from metal foils, finally displaced the etherial picture. With the existence of e established, its value needed to be measured precisely. Robert Millikan (pictured) developed his oil drop method to reach a relative uncertainty of 1 part in 10³ in 1917 [1], but electrolysis methods soon surpassed this and by 1969 the relative uncertainty was a mere 4 × 10⁻⁶.

In contrast, the history of e in metrology is far from one of steady progress. A system of electrical units adopted in the United States in 1893 defined the ampere in a way that would have pleased Stoney: 1 A is the current that would cause silver to deposit on a cathode at a rate of 1.118 mg s⁻¹. By the time it was adopted internationally in 1908, this definition could be restated as: 1 A is a flow of 1.118 mg/M_Ag elementary charges per second, where M_Ag is the mass of a silver atom. This is equivalent to fixing the value of e in coulombs to be (M_Ag/1.118 mg) C, since 1 A = 1 C s⁻¹. It thus became arguably the first measurement standard based on atomic physics. Everything changed in 1948 when, in order to harmonize electrical and mechanical measurements, the MKSA system chose Ampère’s law over Faraday’s law in defining 1 A as the current that generates a force of 2 × 10⁻⁷ N m⁻¹ between two parallel, infinite wires 1 m apart in vacuum. With this definition, carried over into the International System of Units (SI) in 1960, the concept of an elementary charge played no role in the definition of any unit — that is, it had disappeared from metrology! Of course, the microscopic picture of current as a flow of electrons was still valid and widely used, but as a measurement standard it lay outside the carefully constructed, self-consistent definitions that make up the SI.
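To see the equivalence numerically, here is a minimal sketch (not from the original column) that inserts today’s value for the mass of a silver atom into e = (M_Ag/1.118 mg) C; the deposition rate of 1.118 mg s⁻¹ is the only input taken from the 1908 definition.

```python
# Sketch: the 'silver ampere' implicitly fixes e = (M_Ag / 1.118 mg) C.
# The atomic data below are modern reference values, used here only to check
# that the 1908 definition is consistent with today's elementary charge.

N_A = 6.02214076e23          # Avogadro constant, mol^-1 (exact in the revised SI)
A_Ag = 107.8682              # standard atomic weight of silver, g/mol

M_Ag_mg = A_Ag / N_A * 1e3   # mass of one silver atom, in mg
e = M_Ag_mg / 1.118          # elementary charge in coulombs implied by 1.118 mg/s

print(f"e ≈ {e:.4e} C")      # ≈ 1.602e-19 C, in line with the accepted value
```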

A new chapter for e began in 1990 with the adoption of quantum electrical standards for voltage and resistance based on the Josephson and quantum Hall effects, respectively. Both effects involve e and the Planck constant h: the relevant combinations are the Josephson constant K_J = 2e/h and the von Klitzing constant R_K = h/e². Together, the new definitions for 1 V and 1 Ω (and hence 1 A through Ohm’s law) are equivalent to fixing the values of both e and h. A role for e in electrical metrology was thereby restored, albeit somewhat indirectly.
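A short numerical sketch (again, not part of the original column) makes the equivalence explicit: inverting K_J = 2e/h and R_K = h/e² gives e = 2/(K_J R_K) and h = 4/(K_J² R_K), and plugging in the conventional 1990 values K_J−90 and R_K−90 recovers the familiar numbers.

```python
# Sketch: fixing 1 V and 1 Ω through the Josephson and quantum Hall effects
# amounts to fixing e and h. The 1990 conventions adopted exact values for
# K_J = 2e/h and R_K = h/e^2; inverting these relations yields e and h.

K_J_90 = 483597.9e9       # conventional Josephson constant, Hz/V (exact by convention)
R_K_90 = 25812.807        # conventional von Klitzing constant, ohm (exact by convention)

e = 2.0 / (K_J_90 * R_K_90)        # elementary charge, C
h = 4.0 / (K_J_90**2 * R_K_90)     # Planck constant, J s

print(f"e ≈ {e:.5e} C")            # ≈ 1.60218e-19 C
print(f"h ≈ {h:.5e} J s")          # ≈ 6.62607e-34 J s
```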

The role of e in metrology is about to change again [5]. In May 2019, several SI definitions will be rewritten in the form of a fixed value for a fundamental constant, just as was done when the speed of light c was fixed in 1983. By fixing SI values for h and e, the quantum electrical standards will be brought within the SI and Stoney’s atom of electricity will once again have a central place in our system of units.