Introduction

The 45th Cambridge Ophthalmological Symposium took place in the United Nations ‘Year of Light and Light-Based Technologies’ (the UN General Assembly, at its 68th Session, proclaimed 2015 the International Year of Light and Light-based Technologies (IYL 2015; A/RES/68/221)). It was therefore thought appropriate to review light in man’s environment and the way in which that environment has changed with increasing sophistication and technology.

Light played such an important role in the life of early man that virtually all religious texts cite light as an early part of creation. For example, the opening verses of Genesis (The First Book of Moses, chapter 1, verses 1–3) in the King James version of the Bible read: ‘In the beginning God created the heaven and the earth. And the earth was without form, and void; and darkness was upon the face of the deep… And God said, Let there be light’ (The Holy Bible). Such spiritual obsession with light preceded the Judeo-Christian and Islamic traditions by hundreds of years and seems to have been ubiquitous throughout the ancient world. This is perhaps not surprising given that darkness settled at the end of each day and that the period of darkness varied with the tilt of the Earth at different times of the year. Imagine, with no knowledge of science, the overriding awe that this transition from light to darkness must have induced, and the pressure on primitive civilisations to pray that the cycle would continue and that light-dependent food supplies would remain plentiful. Many civilisations, particularly that of Egypt, worshipped the sun and placed sun gods in a very special position. Almost universally, light is associated with goodness and dark with evil.

It is not known how man first controlled light in the form of fire, but it must have been obvious that the world was illuminated as a consequence of fires started by lightning strikes. When man mastered fire he had light under his control. Fire-based light sources stretch from the firebrands and torches of the earliest civilisations through oil lamps and candles and, ultimately, gas. Fire thus provided the light in man’s environment for several thousand years, until the advent of electrical light sources in the late 1800s; it should be remembered, however, that the author’s grandmother still had gas lighting in her home in the 1950s, as did many others.

Fire-based lighting had huge constraints and obvious safety issues: (1) too much heat was generated when attempts were made to increase light levels; and (2) the more fires created, the greater the fire risk. When William the Conqueror arrived in London he was appalled by the proximity of wooden-framed, thatch-roofed dwellings, and in 1068 he introduced the ruling of ‘Couvre-Feu’ (cover fire; corrupted to ‘curfew’), whereby all fires and fire-based light sources had to be extinguished by 2000 hours throughout the country. The danger he recognised was demonstrated in many fires throughout Europe and in the Great Fire of London in 1666, although the latter was said to have been caused by a baker failing to extinguish his ovens. Some months later, in February 1667, Samuel Pepys wrote: ‘to this very day I cannot sleep a night without great terrors of fire and this very night could not sleep until 2 AM through thoughts of fire’ (The Diary of Samuel Pepys).

Of the many technologies that impacted upon man’s desire to control light, the advent of printed text was huge. Before the printing press was invented in about 1440, manuscripts were handwritten in relatively large text and read only by an extremely small fraction of the community, particularly clerics. The advent of print meant that ideas and information could be disseminated to a growing educated minority. Most individuals did not have the luxury of reading during their working day and consequently wanted light by which to read at night. Many of the educated members of society were elderly and presbyopic: for any given visual task the over-50s require 50–100% more light, as well as optical assistance to read relatively small printed text. There was thus an increasing demand for spectacles, and in 1629 the Worshipful Company of Spectacle Makers was granted a Royal Charter to protect the standards of spectacles available in England. Under their ordinances they could seize spectacles said to be of poor standard and break them on the London Stone. Not surprisingly, the Liverymen used their powers to restrict imports of ‘cheap Dutch spectacles’! The combination of an ageing eye and poor light led Samuel Pepys to write in his diary on Friday, 31 May 1669: ‘and this ends all that I doubt I shall ever be able to do with my own eyes in the keeping of my journal. I being not able to do it any longer having done now so long as to undo my eyes almost every time that I take a pen in my hand.’

It is frequently forgotten that cities were extremely dark places after sunset, with very little, if any, street lighting.1, 2 There was huge pressure on city authorities to protect their citizens when darkness fell, which is why most cities were surrounded by walls with gates that closed at sunset. Remnants of the gate structure survive in London in the nomenclature of areas such as Aldersgate, Moorgate and Cripplegate. Such was the threat of night that even under the laws of 1766 crimes of night and day were distinguished, as were the potential penalties. Intruders in a house at night were burglars, and if they were killed by the householder there were few, if any, consequences, since the absence of light meant they would have been difficult to identify and bring to justice. By contrast, daytime intruders were housebreakers, and because they could be recognised and brought to law, undue violence against them was discouraged and had consequences if perpetrated.

Many individuals tried to maximise the use of sunlight in their homes through windows and architectural design. Unfortunately, the government, as always, was looking for increased revenue and, having decided that only the wealthy could afford glazed windows, levied the window tax between 1696 and 1852. This is why many of the Georgian houses we see today have windows bricked up in an attempt to avoid the tax; the term ‘daylight robbery’ came into being at the time and is still used today in relation to unfair practices. After the repeal of the window tax, Victorian architects sought to maximise the use of sunlight within dwellings by increasing the height of the windows and, as a consequence, of the rooms. For example, if the sunlit environment had an illuminance of 5000 lux, then in order to achieve a level of 150 lux at a point 15 feet from the window, the top of the window had to be 11 feet high. For the wealthy, after sunset the rooms could be illuminated by candelabra or by more sophisticated oil lamps; many of the poor, however, simply went to bed by candlelight.
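In modern daylighting terms the figures above correspond to a daylight factor of about 3%; a minimal worked calculation, using the standard daylight-factor definition (the formalism, though not the numbers, is an addition here):

\[
\mathrm{DF} \;=\; \frac{E_{\text{indoor}}}{E_{\text{outdoor}}} \times 100\% \;=\; \frac{150\ \mathrm{lux}}{5000\ \mathrm{lux}} \times 100\% \;=\; 3\%
\]

The quoted geometry is also broadly consistent with the common rule of thumb that useful daylight penetrates to roughly one and a half times the height of the window head (here 15 ft ≈ 1.4 × 11 ft).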

It wasn’t until the middle of the 1600s that cities began to think about illuminating their streets after dark. Until that time many cities used torch boys to guide the unwary through the dark streets and, not surprisingly, many individuals were robbed as a result of collusion between the torch boys and rogues. Paris was the first city in modern times to introduce street lighting, in 1667, followed by Amsterdam in 1669, Berlin in 1682 and London in 1683. This represented a significant change in metropolitan thinking, but lighting was still provided only at low levels and in major thoroughfares.

It was not until 1809 that gas lighting was introduced in London, in Pall Mall. This source, still based on fire, produced a level of illumination at which individuals could be readily recognised. It had a mixed reception: the ladies of the night felt it was a restriction of trade, and the gentlemen availing themselves of the ladies’ services felt it was a breach of their confidentiality! Nevertheless, with the advent of gas lighting, general light levels began to increase within cities. It should be remembered, however, that the poor could still afford only minimal lighting in their homes after dark and that their light sources were for the most part naked flames from candles or oil lamps, with the consequence of many fires in slum areas.

Although incandescent lamps of the type invented by Swan and Edison appeared in 1879, few individuals other than the very wealthy could afford such lighting, which depended on household generators; many years would pass before a grid system supplied electricity to domestic properties. Between the wars electric lighting became more established and cities took on a much brighter aspect, particularly wealthy cities such as London, which was beginning to use lighting for street advertising as well as illumination. The advent of the motor vehicle also drove relatively rapid advances in electrical light sources, as compact battery-driven sources were required. The improvement in city lighting was so dramatic that when Draconian laws were introduced during the Second World War to limit lighting, the so-called blackout regulations, the consequences were appalling. In a letter to the BMJ in 1939 the King’s surgeon Wilfred Trotter wrote: ‘by frightening HM Government into blackout regulations, the Luftwaffe was able to kill 600 British citizens per month without ever taking to the air, at a cost to itself of exactly nothing’.3 Such was the dependency of the general public on city lighting that adapting to the new regulations in so short a period proved extremely difficult, and they were subsequently modified to minimise the risk of air attack while maximising the safety of civilians.

Electrically based artificial light sources truly revolutionised lighting in man’s environment because, unlike in nature, where levels of illumination increase gradually in the morning and decrease gradually at night, the environment could now be flooded with light instantly, at the flick of a switch.4 Fluorescent lighting became available in 1937 and has dominated commercial and industrial lighting for over 50 years; it meant that near-daylight levels were instantly available without a significant increase in heat. It is interesting to note that when regulations for office lighting were first recommended at the turn of the last century, levels in the order of 50 lux were not uncommon. Today many offices are lit in excess of 1000 lux, and levels even reach 2000 lux or more. Given that the elderly need more light for any given visual task, and that it is the elderly, that is, management, who set lighting levels, it is not surprising that many young workers find the lit environment of their offices unpleasant. This has not been helped by many architects seizing on the fact that light sources also generate heat and consequently submitting plans for buildings with apparently reduced heating costs as a result of increased lighting.

The current trend for reducing energy consumption has resulted in the so-called low-energy light sources. The original systems were compact fluorescents, but these suffered from a slow start-up time and very limited light output. Many individuals turned to low-voltage quartz-halogen sources, but these rarely reduced costs significantly. Today the advent of LEDs is heralding yet another revolution in domestic and commercial lighting. Unfortunately, little attention has been paid to the emission characteristics of these sources. Incandescent bulbs give out light over almost the full sphere (4π steradians), that is, in all directions. Their light is biologically friendly in that there is little emission at short wavelengths, blue and ultraviolet (UV), but unfortunately much of their energy is dissipated in the form of infrared radiation, or heat. By contrast, fluorescent lighting has rather peculiar emission characteristics, with one mercury emission line contributing only 8% of the light but more than 40% of the potentially biologically unfriendly emissions. The low-energy sources, particularly LEDs, also emit strongly in the blue and UV, and these emissions are not retina friendly.

Studies of light damage to the retina show that in the phakic eye the peak of light damage occurs at 441 nm, and that this peak moves into the UV when the natural lens is removed. It should be remembered that the ageing lens yellows and that in the macular region there is a yellow pigment in the inner connecting fibres of the photoreceptor cells, the fibre layer of Henle. Both of these pigments attenuate blue light, particularly around 440 nm. Further, there are no short-wavelength (blue) cones in the fovea, resulting in foveal tritanopia; thus blue light plays very little role in high-acuity vision. By contrast, recent studies have shown that a small percentage of retinal ganglion cells contain a pigment, melanopsin, with an absorption peak at around 470 nm, which is responsible for triggering circadian rhythms and, in particular, stops us from suffering from seasonal affective disorder.
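One standard way of quantifying how ‘retina friendly’ a lamp is, not spelled out above but consistent with the damage spectrum described, is the blue-light-hazard weighting used in lamp-safety standards such as IEC 62471, in which the spectral output of the source is weighted by a hazard function B(λ) that peaks near 435–440 nm, close to the 441 nm damage maximum cited above:

\[
L_B \;=\; \int_{300\,\mathrm{nm}}^{700\,\mathrm{nm}} L_\lambda(\lambda)\, B(\lambda)\, \mathrm{d}\lambda
\]

where \(L_\lambda\) is the spectral radiance of the source (W m\(^{-2}\) sr\(^{-1}\) nm\(^{-1}\)) and \(L_B\) the blue-light-weighted radiance. Two sources of equal luminance can thus differ markedly in \(L_B\) if one concentrates its emission near 440 nm, which is why spectral emission characteristics, and not just overall light output, matter.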

Given that novel light sources are rapidly reaching the marketplace for domestic and commercial lighting, attention must be paid to the spectral emission of such sources as well as to their overall luminance. It should be remembered that millions of years of evolution were governed by an approximately 12-h light–dark cycle and that we are really the first generation to have had daylight levels of lighting under fingertip control. In nature, transitions from dark to light and light to dark were gradual, giving biological systems time to adjust; by contrast, light levels induced by artificial sources change abruptly and can occur at any point in the circadian cycle. Worldwide, light levels are now approaching what could be described as light pollution. The impact of such levels, with novel wavelengths and random illumination times, must be considered in relation to age-related eye diseases. Dermatologists have long known the impact of light on diseases such as malignant melanoma, and the contrast between Victorian society, in which sunlight was avoided, and the modern mindset, in which tanned bodies are so desirable that many individuals pay to lie on UV radiation sources such as sunbeds. Given the importance of light sources in all their forms in today’s environment, it is reassuring that Public Health England has a group devoted to determining the potential risks of all novel light-emitting devices.5