The Engineer’s founding in the midst of the industrial revolution gave our predecessors front-row seats at the start of many of the developments that have shaped our modern world, whether they realised it or not. Less than 20 years before Charles Healey put the first issue of the journal together from his offices on the Strand, Michael Faraday was in the basement of the Royal Institution, a few hundred yards away, laying the first foundations for the revolution that would see electricity become our primary source of energy for many of the activities that make up our lives.
Faraday was a polymath, as involved in chemistry as in physics. His most significant discovery for the energy sector was electromagnetic induction, demonstrated in a series of experiments beginning in 1831: moving a small coil of wire carrying an electric current into or out of a larger coil causes a current to flow in the larger coil. Faraday’s realisation that a changing magnetic field produces an electric field underpins field theory, and with it much of our understanding of the fundamental forces of nature; but its significance for the energy sector is that it led Faraday to build the first electrical generator, a simple piece of equipment with a metal disc rotating in a plane perpendicular to a static magnetic field. Though not a practical method for generating useful electricity, the Faraday disc contains the vital elements that still form the basis of the generators in every form of energy generation used today apart from photovoltaic solar.
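Faraday’s observation is captured today by his law of induction, stated here in its modern mathematical form (a later formulation, not notation Faraday himself used): the electromotive force induced in a circuit equals the rate of change of the magnetic flux through it.

```latex
\mathcal{E} = -\frac{\mathrm{d}\Phi_B}{\mathrm{d}t},
\qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}
```

The second form, one of Maxwell’s equations, is the field-theoretic statement of the same fact: a magnetic field changing in time produces an electric field.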
Such was the interest in electricity from the mid- to late-19th century that Faraday’s design remained a source of inspiration even after it became obvious that, in itself, it wasn’t especially useful. Subsequent improvements incorporated magnets into the rotating component and, in 1883, A. Floyd Delafield of Connecticut patented a simplified dynamo which did not require a commutator (a moving switch that reverses the direction of current flow between the rotor and external circuit). Delafield and Faraday’s work inspired the Serbian-American genius Nikola Tesla, who patented his own design of dynamo in 1889. Tesla had moved to the US in 1884 to work for Thomas Edison, where he completely redesigned Edison’s motor and generator, an event that seems to have led directly to the two men falling out: Tesla thought Edison had promised him $50,000 if he could complete the task, but Edison claimed to have been joking and offered Tesla a $10 per week raise on his $18 per week salary; the Serb promptly resigned and founded his own electric light company.
Tesla developed an induction motor in 1887, which (via his friendship with the editor of the journal Electrical World) brought him to the attention of George Westinghouse, whose engineers had been trying to develop an alternating-current motor and power system. Electricity was spreading through cities by that time, thanks to the near-simultaneous invention of the electric light bulb by Edison in the US and Joseph Swan in the UK; Westinghouse took Tesla on as a consultant and set him to work on developing an AC system to power streetcars in Pittsburgh.
This period led to a conflict known as the War of the Currents, in which Edison promoted his DC technology (and incandescent light) over Westinghouse’s AC (with arc lights and a different design of incandescent light). The War of the Currents concerned whether DC or AC was better for long-distance distribution and which was safer; the simultaneous introduction of the electric chair as a method of execution in the US lent the whole farrago a lurid air. The ‘War’ came to an end with a series of scandals over executions, accidental electrocutions and a merger which sidelined Edison, a vocal opponent of AC, and led to his departure from the electrical business. But the crucial factor was probably technological: the adoption of a practical transformer developed by the Hungarian team of Károly Zipernowsky, Ottó Bláthy and Miksa Déri in 1884. The ZBD high-efficiency closed-core shunt transformer allowed AC to be sent at high voltage along relatively small cables, then reduced in voltage to a level that could be used by consumers (DC, by contrast, had to be distributed at the voltage used in homes along large, expensive wires, and needed generating plants to be near to loads).
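The advantage the transformer gave AC comes down to simple arithmetic: for a fixed delivered power P at voltage V, the line current is I = P/V, and resistive losses are I²R, so stepping the voltage up cuts losses by the square of the voltage ratio. A quick sketch with assumed, illustrative figures (the power, resistance and voltages below are not from the article):

```python
# Illustrative arithmetic only: all figures are assumptions for the sketch.
# For power P delivered at voltage V down a line of resistance R,
# the current is I = P / V and the resistive loss is I**2 * R.

def line_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Resistive loss (watts) in a line delivering a given power at a given voltage."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

P = 100_000.0  # 100 kW delivered (assumed figure)
R = 5.0        # line resistance in ohms (assumed figure)

loss_low = line_loss(P, 100.0, R)      # household-level voltage, as early DC systems used
loss_high = line_loss(P, 10_000.0, R)  # Deptford-style high-voltage transmission

# Raising the voltage 100x cuts the loss by 100**2 = 10,000x.
print(loss_low / loss_high)
```

The low-voltage case actually loses more power in the line than it delivers, which is precisely why DC plants had to sit close to their loads, while a ZBD-style transformer let AC travel at high voltage and be stepped down at the consumer’s end.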
The UK was largely an onlooker in these events; the Savoy theatre, built for Richard D’Oyly Carte on the Strand, became the first public building to be lit entirely by electricity when it opened in 1881, the House of Commons was lit with electric lights in 1881, and the Electric Lighting Act, passed in 1882, allowed supply systems to be established.
The country’s first AC power station opened in Deptford in 1891. It was designed by Liverpool-born Sebastian Ziani de Ferranti, a prodigy who designed an arc-light system at 13 and a generator at 16, and who patented a generator with the US patent office in 1886. Deptford, coal-fired and regarded as the world’s first high-voltage power station, supplied electricity at the then unheard-of voltage of 10,000V. It operated until 1957 (its sister plant, Deptford West, was finally demolished in 1991).
The National Grid started operating in 1933, allowing the spread of electricity across the UK; the post-war Labour government nationalised the electricity industry in 1947, creating the body that would become the Central Electricity Generating Board. By then, the research that was to give the UK its pioneering place in the next major phase of power generation technology was well underway.
The UK was involved in the Manhattan Project to build the first nuclear weapons, with Britain’s own nuclear weapon research (known cryptically as the Tube Alloys Project, and triggered by work on nuclear fission of uranium at the University of Birmingham by émigré physicists Otto Frisch and Rudolf Peierls, working under Australian physicist Mark Oliphant) subsumed into the larger endeavour.
After the war, the UK continued with its own nuclear weapons programme (hindered by the US classifying all Manhattan Project results, even from allies, which made it necessary to repeat some experiments). The first UK nuclear reactors were built at the Atomic Energy Research Establishment at Harwell, and work began on the first civil nuclear reactors at Windscale in Cumbria in 1953. The first, Calder Hall, was a dual-purpose Magnox station, producing both plutonium for weapons and commercial power (it stopped making military plutonium in 1964). Scaled-up Magnox stations were built across the country in the first civil nuclear programme, and a new type of reactor, the Advanced Gas-cooled Reactor (AGR), was developed at Windscale, with a prototype plant coming on-stream in 1963. Seven twin-reactor AGR power stations were built between 1963 and 1989, but a combination of technical problems, financing difficulties and changing political philosophies meant that an adapted version of the Westinghouse pressurised water reactor (PWR), at Sizewell in Suffolk, was the last reactor to be built in the UK until the reactivation of the nuclear build programme in the late 1990s that has led to the current, continuing controversy over the Hinkley Point C project.
Meanwhile, the UK has continued to be active in the development of renewable energy, particularly in the marine environment, with the contribution of Stephen Salter of Edinburgh University to wave power particularly notable, along with work on tidal stream energy. Nuclear fusion research has been at its most successful in the UK, with the Joint European Torus experiment at Culham leading the way for the design of ITER, planned to be the first reactor to demonstrate the feasibility of magnetically confined fusion on a large scale. Michael Faraday, equally at ease with conceptual atoms and practical electromagnets, would have approved.