Let There Be Light!

How light became a beacon for science and human progress

 

 
The story of light is the story of life itself; all species on earth in some way depend on it to survive. But humankind is unique, for we actively strive to use light to help us understand more about the world in which we live. This goes way back to the very dawn of human existence; indeed, if we recognise ‘science’ as the systematic study of the world through observation and experiment, then light has always been humanity’s greatest tool. Today, light-based technology forms the bedrock of modern science, but to understand how we got here, we have to go right back to the beginning.
 
Early humans popped up around 2 million years ago, and right from the start, we were completely dependent on the sun. But after a while, we worked out a way of making light work for us. Somewhere between 1.7 million and 200,000 years ago, people learned how to make the hot crackly stuff; and with fire, we were suddenly able to explore dark corners of the world, hunt by night, and protect ourselves from predators.
 
Fire was the first big step. From then on, humanity continued to find new ways of using light to help us understand and negotiate the world. The Ancient Greeks brought about a scientific revolution, fundamentally changing the way we perceived light. While the ancient scholars held on to the theistic view of the sun as a god, they also attempted to rationally comprehend light in a way that hadn’t been done before.
 
The early Grecian theories about light were actually pretty sensible. The great mathematician Pythagoras suggested that light was a stream of tiny particles. Aristotle piped up a couple of centuries later, arguing that light actually moved in waves like the sea. As it turns out, they were both right. Scientists have since discovered that light somehow operates as both a wave and a particle; it’s amazing stuff, but we’ll get back to that later.
Robert Hooke's drawing of a flea
 
Fast forward about 2,000 years from Ancient Greece, and a Renaissance mathematician and astronomer named Nicolaus Copernicus was working on a manuscript that would forever alter our relationship with light. Published in 1543, On the Revolutions of the Heavenly Spheres declared the unthinkable: the Earth was not the centre of the universe; it revolved around the sun. The theory was later condemned as heresy and suppressed by the Catholic Church. Nonetheless, our ideas about light and its physical properties were evolving. The Age of Enlightenment was dawning and scientists were more curious than ever.
 
In 1665, Robert Hooke published Micrographia, a historic book which contained drawings of the world as seen through a very novel tool: the microscope. This invention bent and focussed light in such a way as to allow the viewer to see objects in previously unimaginable detail. Hooke’s Micrographia contained sketches of a fly’s eye and plant cells: things that had never been seen before.
 
Light was becoming more and more important as a tool for science. About 100 years later, Benjamin Franklin began conducting his own (very risky) investigations into light’s most dramatic form: lightning. Franklin’s most famous experiment saw him fly a kite with a key attached into a lightning storm in an attempt to prove that the deadly flashes were indeed electricity. This demonstrated something extremely significant: with a little work, electricity could be harnessed.
 
But to really capitalise on all that light had to offer, scientists first had to understand more about it. In 1800, William Herschel did just that; he discovered infrared, an invisible form of light. Herschel shone sunlight through a prism and observed the way it split into different colours. This experiment had been performed way back in 1666 by Isaac Newton, who had deduced that visible light is made up of a number of different colours. But Herschel went a step further; he placed a thermometer in each colour to try and determine its energy level. What he found was a region just beyond red with a much higher temperature; we can’t see this light, but we can experience it as heat – and so infrared was born.
There are now almost 50 synchrotrons around the world
 
Just a year later, in 1801, another monumental discovery was made about the nature of light. Johann Wilhelm Ritter shone different colours of light onto paper soaked in silver chloride to examine the ways in which light could be used to prompt chemical reactions. Unexpectedly, he found that invisible rays just beyond the violet end of the spectrum caused the reaction to take place much more quickly. Ultraviolet leapt onto the scene, and with it came new possibilities for scientific research.
 
Things were moving pretty quickly, even by science standards. Throughout the early 1800s, Michael Faraday’s experiments into the curious science of electromagnetism uncovered new and exciting properties of certain materials: work that revolutionised electricity production and brought the dream of widespread electric power that bit closer. Thomas Edison’s light bulb, demonstrated in 1879, made incandescent light a commercially viable product. It wasn’t long before people began packing up their candles and installing electric light in their homes.
 
By 1895, the academic community was grappling with yet another form of intriguing light. Wilhelm Röntgen had submitted a paper, On a New Kind of Rays: A Preliminary Communication, describing the effects of a mysterious phenomenon: X-rays. Röntgen had discovered that these novel rays were capable of passing straight through solid matter. He was even kind enough to test the effect on his wife’s hand, after which the poor woman famously remarked: “I have seen my death!”
 
By the 20th century, these new forms of light had become integral to scientific enquiry. In 1912, William Henry Bragg and his son Lawrence discovered that X-rays could be used to determine the atomic structure of matter; and so the science of X-ray crystallography took off, giving us penicillin, vaccines, and the DNA double helix. Meanwhile, ultraviolet was being used to observe objects in space and as a catalyst for chemical reactions, whilst infrared was exploited in lasers, geochemistry, and the determination of molecular structures. In 1980, the world’s first dedicated synchrotron light source opened at Daresbury, Cheshire. This pioneering machine produced special kinds of light (X-rays, ultraviolet, and infrared) for use in scientific experiments.
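The Braggs’ insight can be captured in a single relation, now known as Bragg’s law: X-rays bouncing off parallel planes of atoms reinforce one another only at particular angles, which depend on how far apart the planes are.

```latex
% Bragg's law: condition for constructive interference of X-rays
% scattered from parallel planes of atoms in a crystal.
%   n      - diffraction order (a whole number)
%   lambda - wavelength of the X-rays
%   d      - spacing between the atomic planes
%   theta  - angle between the incoming beam and the planes
n\lambda = 2d\sin\theta
```

By measuring the angles at which these bright reflections appear, crystallographers can work backwards to the spacing d, and from many such spacings reconstruct the full arrangement of atoms in the crystal.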
 

And that brings us right up to the present day. There are now almost 50 synchrotrons around the world, used to study everything from medicine to nanotechnology to advanced engineering. These powerful machines exploit the vast potential of light, and hundreds of thousands of scientists visit them each year to use that light in their experiments.

To celebrate all that light has given us, UNESCO has declared 2015 the International Year of Light and Light-based Technologies. Diamond is one of many institutions, including Downing Street and CERN, celebrating the historic year with a host of activities and events.
 
But the story of light is not over yet; there’s still so much more to discover. Even today, scientists aren’t 100% sure about the nature of light itself: is it a wave or a particle? It appears to be both, behaving in strange ways that defy our everyday intuitions about physics. There’s still work to be done towards pinning down the characteristics of light and identifying the best ways we can harness it. But one thing is for certain: we owe almost everything we know about the world to light and the science it makes possible.
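The modern answer to the wave-or-particle question is neatly summed up by two relations from early quantum theory. Planck and Einstein showed that light of a given frequency arrives in particle-like packets (photons) with a definite energy, while de Broglie’s relation ties a particle’s momentum to a wavelength:

```latex
% Planck-Einstein relation: energy of one photon of frequency nu
E = h\nu
% de Broglie relation: wavelength associated with momentum p
\lambda = \frac{h}{p}
% h is Planck's constant, about 6.626 x 10^{-34} joule-seconds
```

The same beam of light can therefore be described by a frequency (a wave property) or by the energy of its photons (a particle property); the two pictures are linked, not in conflict.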
 
 

Read more about cutting-edge research in Diamond's magazine: