
Going nuclear: splitting the atom

by Rahul Rao, Varun Rao and Yasir Aheer


Summary:

  • The splitting of the nitrogen atom in 1918 by Ernest Rutherford opened up a whole new branch of physics - nuclear physics

  • Splitting the nucleus of the atom (nuclear fission) provides us with a way to convert mass to energy via Einstein’s famous equation E = mc²

  • Nuclear physics paved the way for the development of the atomic bombs used near the end of World War II.

  • Post-war, scientific attention turned towards the use of nuclear fission for the production of clean electricity.

 

“Nuclear” is a term often used negatively in popular media. Phrases that come to mind are “the world is on the brink of a nuclear war” and “the nuclear reactor accident at...”. HBO’s recent series Chernobyl vividly brings to life the human cost of our atom-splitting endeavours. The US-Iran nuclear deal has been in the news several times over the past few years. It seems some mention of the atomic nucleus must creep into our lives every few months. It wasn’t always so.


Exploration of the atomic nucleus was still cutting-edge science as recently as the early 1900s. The tool of choice for this work was the alpha particle, which is emitted by decaying radioactive matter. These particles made perfect probes for the atom - small enough to interact with individual atomic nuclei, yet heavy enough not to simply pass through undisturbed.


In 1918, Kiwi scientist Ernest Rutherford’s experiments on the bombardment of the nitrogen atom with alpha particles produced an interesting result - the nitrogen atom split into a then-unknown isotope of oxygen and a hydrogen nucleus (now known to be the proton). This seminal discovery, the first artificial splitting of an atom, would lay the foundations for the Manhattan Project and modern-day nuclear weapons.


Research on atomic nuclei continued for the next couple of decades, yielding the discovery of nuclear fission in Germany in 1938 - just a year before World War II broke out. Researchers at the Kaiser Wilhelm Institute for Chemistry (now known as the Max Planck Institute for Chemistry) bombarded uranium with neutrons and found that uranium-235 atoms had been split roughly in half. They identified one of the new elements as barium but were unable to identify the second (now known to be the noble gas krypton).


The unusual outcome of this splitting was that the sum of the masses of the new elements fell just short of the mass of the original uranium-235 atom. Where did this missing mass go? The sketch below puts a number on the loss.
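As a rough illustration, here is a short Python sketch of that mass bookkeeping for one common fission channel (a neutron splitting uranium-235 into barium-141, krypton-92 and three neutrons), using approximate atomic masses from standard nuclear data tables - these particular figures are ours for illustration, not the exact products the German researchers measured:

```python
# Mass bookkeeping for one common fission channel of uranium-235:
#   n + U-235  ->  Ba-141 + Kr-92 + 3n
# Approximate atomic masses in unified atomic mass units (u).
m_neutron = 1.008665
m_U235    = 235.043930
m_Ba141   = 140.914411
m_Kr92    = 91.926156

mass_before = m_neutron + m_U235                 # neutron absorbed by U-235
mass_after  = m_Ba141 + m_Kr92 + 3 * m_neutron   # fragments plus 3 free neutrons

print(f"Mass before fission: {mass_before:.6f} u")
print(f"Mass after fission:  {mass_after:.6f} u")
print(f"Missing mass:        {mass_before - mass_after:.6f} u")  # ~0.186 u
```

Roughly 0.19 atomic mass units - about a fifth of a neutron - simply vanishes from the ledger in every such fission. To answer where it went, we must consider perhaps the most famous equation in the history of science.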


E = mc²


Einstein’s Theory of Special Relativity postulated that energy (E) and mass (m) are interchangeable via the equation above. The third parameter, c, is a constant equal to the speed of light in a vacuum - a staggering 300 million metres per second, or just over a billion kilometres per hour. Because c is squared, even the smallest loss of mass produces colossal amounts of energy. A loss of just 10g of mass - about the weight of two sheets of paper - releases roughly 900 trillion joules, more than ten times the energy of the bomb later dropped on Hiroshima. Even the Tsar Bomba, the most powerful nuclear device ever detonated, corresponded to the conversion of only about two kilograms of mass.
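A quick Python sketch makes that arithmetic concrete - the 15-kiloton Hiroshima yield and the ~0.186 u fission defect above are approximate figures used only for scale:

```python
# E = m * c^2 for a 10 g loss of mass.
c = 299_792_458            # speed of light in vacuum, m/s (exact by definition)
m = 0.010                  # mass converted to energy, kg (10 g)

energy = m * c**2
print(f"Energy released: {energy:.3e} J")            # ~9.0e14 J

# For scale: one ton of TNT is defined as 4.184e9 joules,
# and the Hiroshima bomb yielded roughly 15 kilotons of TNT.
TON_OF_TNT = 4.184e9
print(f"= {energy / TON_OF_TNT:,.0f} tons of TNT")   # ~215,000 tons
print(f"= {energy / (15_000 * TON_OF_TNT):.1f} Hiroshima bombs")

# The ~0.186 u lost in a single fission converts the same way:
U_TO_KG = 1.66053907e-27   # kilograms per atomic mass unit
print(f"Per fission: {0.186 * U_TO_KG * c**2:.2e} J")  # ~2.8e-11 J
```

The energy per fission is tiny, but a single gram of uranium-235 contains more than 10²¹ atoms - which is how fission scales from laboratory curiosity to city-destroying weapon.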


World War II was still months away at this point and governments had not yet co-opted scientific endeavours for the furthering of nationalistic agendas. The German scientists shared their discovery with the world in January 1939 through Nobel Laureate Niels Bohr at the Fifth Washington Conference on Theoretical Physics.


Present at that conference was American theoretical physicist Edward Teller, later to be known as the father of the hydrogen bomb. He recalled the discussion after Bohr’s revelation being subdued, and his neighbour at the conference whispering to him, “Perhaps we should not discuss this. Clearly something obvious has been said, and it is equally clear that the consequences will be far from obvious.” The conference attendees knew the world had been irreversibly changed; history tells us their instincts were correct.



The atom bomb

On August 2nd 1939, a month before Germany invaded Poland, Albert Einstein and Leo Szilard penned a letter to US President Franklin Roosevelt urging his personal attention to the development of an atomic bomb in the US, given the possibility of German scientists building one first. Made famous in later years as the Einstein-Szilard letter, it set the scene for the creation of the Manhattan Project, which would lead the development of such a weapon.


The Japanese attack on Pearl Harbor in 1941 suddenly dragged the US into World War II; the war was now personal to Americans and the stakes were higher. The very next month, Roosevelt approved the construction of an atom bomb for use in the war. Later that year, several independent research efforts were consolidated into the Manhattan Project.


Things moved swiftly from here. In early 1943, Robert Oppenheimer was named director of the Los Alamos Laboratory, where he worked with Edward Teller. A mere two and a half years later, on July 16th 1945, the Trinity Test was conducted - the first ever detonation of a nuclear weapon - and in a blink humanity was catapulted into the nuclear age. The mushroom cloud above the detonation site stood 40,000ft tall and was unlike anything the world had ever seen. When Oppenheimer saw the devastation the explosion caused, he recalled a line from the Bhagavad Gita: “Now I am become Death, the destroyer of worlds”. The line was to be forever emblematic of the nuclear bomb.




The events of World War II that followed the US development of the bomb are well known, as is the devastation brought upon the Japanese citizens of Hiroshima and Nagasaki. Ever since, the spectre of nuclear war has hovered over humanity. Every time political tensions between nuclear-armed countries rise, we collectively hold our breath and hope for cooler heads to prevail.


Despite these fears, the splitting of the atom was not wholly negative. As with the Space Race we have written about before, technology first developed by the military for destructive purposes spilled over into civilian life for more wholesome applications. Shortly after the war, scientific attention turned towards the use of nuclear fission for good - the generation of clean electricity. In late 1951, the Experimental Breeder Reactor I (EBR-I) in Idaho became the first facility to generate electricity from nuclear energy. We will cover the highs and lows of nuclear energy in a future article.


We leave you with the man who started it all - Ernest Rutherford. For his contribution to nuclear physics, Rutherford was honoured on New Zealand’s 1c and 7c stamps in 1971 (pictured below), 100 years after his birth.




 

 

Disclaimer: This article is based on our personal opinion and does not reflect or represent the views of any organisation that we might be associated with.
