Nuclear Power In The United States
Nuclear energy is in trouble. Nuclear power has fallen from favour due to high costs, and concerns about plant safety and radioactive waste disposal. In addition, there is the energy efficiency option – we can avoid wasting so much energy, so that it becomes easier to meet our needs from renewable sources.
Once again the atomic visionaries rushed into print, repeating the old 1940s predictions with a few new ones. There would be nuclear-powered planes, trains, ships and rockets; nuclear energy would genetically alter crops and preserve grains and fish; and nuclear reactors would generate very cheap electricity.
Commercial production of electric power from nuclear fuel is obviously here; however, because of the economics involved, it is less likely now than ever that predictions of large-scale production of electric power alone from a nuclear reactor will ever be realized.
Electrical power from nuclear reactors was touted by utilities as an energy ‘truly too cheap to meter’, but all we have learned after more than 50 years is that the utilities are ‘too cheap to truly meter it’.
The Real Nuclear Product – Plutonium
The first thing I want to talk about is what is produced at a nuclear plant.
Contrary to what most Americans believe, large-scale electric power from nuclear fuel comes only as a co-product of the manufacture of plutonium.
All commercial nuclear reactors produce plutonium in the core through neutron capture on uranium-238. By the end of a typical fuel cycle (about 18 months), fission of that bred plutonium accounts for a substantial share of the reactor’s power output.
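To put a rough scale on this, here is a back-of-envelope sketch of how much plutonium a large power reactor accumulates in a year. Every figure below is an assumed round number for illustration (the conversion ratio and in-core burnup fraction especially are assumptions, not plant data), but the result lands near the commonly cited figure of a few hundred kilograms per gigawatt-year.

```python
# Back-of-envelope estimate of plutonium accumulating in discharged fuel.
# All inputs are rough, rounded assumptions for illustration only.

FISSION_G_PER_MW_DAY = 1.05   # ~1 g of heavy metal fissioned per thermal MW-day
thermal_power_mw = 3000        # a typical ~1 GWe plant runs at ~3000 MW thermal
capacity_factor = 0.9          # assumed fraction of the year at full power
conversion_ratio = 0.6         # assumed: Pu atoms bred per atom fissioned
in_core_burnup = 0.6           # assumed fraction of bred Pu fissioned in place

fissioned_g_per_year = FISSION_G_PER_MW_DAY * thermal_power_mw * 365 * capacity_factor
net_pu_kg_per_year = fissioned_g_per_year * conversion_ratio * (1 - in_core_burnup) / 1000

print(f"Heavy metal fissioned: ~{fissioned_g_per_year / 1000:.0f} kg/year")
print(f"Net plutonium left in discharged fuel: ~{net_pu_kg_per_year:.0f} kg/year")
```

Under these assumptions the answer is on the order of 250 kg of plutonium per reactor per year sitting in spent fuel.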
Unfortunately for the nuclear power industry, there are un-shakeable links between civilian and military plutonium stocks. For decades, governments and utilities alike maintained that the two could be kept functionally separate, but this distinction became very blurred in the post–Cold War 1990s.
A fist-sized chunk of plutonium is enough for a terrorist’s bomb – the bomb dropped on Nagasaki used only about 6 kg. Even setting aside concerns about personal safety, or even genetic hazard, the bomb and proliferation make nuclear energy unique.
We live in what could be defined as a plutonium economy.
The number one goal has always been to produce only as much electrical power as does not affect the amount or quality of the plutonium being created. If removing heat energy from the reactor caused an appreciable decrease in plutonium output, the decision to produce power from a nuclear power plant would face serious economic problems.
Nuclear technology was developed and deployed for the purpose of killing as many humans per bomb as possible. The reason nuclear power was deployed was to supply the killers with the materials to make those bombs. There’s a reason they call ’em “Weapons of MASS Destruction.”
Once the fuel has been irradiated, the fission products left behind are intensely radioactive and extremely dangerous to personnel. The operation of the reactor, and the processing of the irradiated fuel, have to be carried out by remote control behind air-tight shields several feet thick.
After the fission products are removed, the relatively small amount of plutonium is separated from the bulk of the parent uranium and purified, and is then ready for incorporation into a nuclear weapon.
Approximately 24,000 kWh of heat energy is released when one gram of U-235 is fissioned. According to my latest energy bill, which charges me $0.071 per kWh, that comes to a grand total of $1,704 worth of electrical energy at current market value in Illinois.
Now take the same gram of fissioned U-235 and consider its value as a military-grade explosive, and you will find it is valued far higher than the electrical power it can generate.
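The arithmetic behind that dollar figure is simple enough to spell out. The 24,000 kWh-per-gram figure and the $0.071 rate are the numbers quoted above:

```python
# Value of the heat released by fissioning one gram of U-235,
# at a retail Illinois electricity rate.

kwh_per_gram = 24_000        # ~heat energy per gram of U-235 fissioned
rate_usd_per_kwh = 0.071     # retail rate from the bill quoted above

value = kwh_per_gram * rate_usd_per_kwh
print(f"${value:,.2f}")
```

Note that this values the raw heat at the retail electricity rate; since only roughly a third of that heat becomes electricity, the comparison if anything flatters the electrical product.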
In a sweep of publicity, the facts are often ignored or hidden beneath a flood of over-enthusiastic extrapolation and speculation.
Electric Power From Nuclear Fuel?
This leads us to the second thing I want to talk about: generating electrical power from nuclear energy and fission.
The generation of electrical power is not a direct product of fissioning nuclear fuel; it took over a decade to engineer and build the additional equipment needed to remove heat from the reactors and raise steam of a quality thermodynamically capable of driving a turbine and producing electricity.
Nuclear plants need uranium to fuel them and inevitably produce dangerous long-lived wastes, whereas most renewables need no ‘fuel’ and produce no wastes, and so there are no fuel or ‘backend’ costs.
The process of nuclear fission (‘splitting the atom’ or, more precisely, ‘splitting the atomic nucleus’) releases immense amounts of energy. Under controlled conditions within a nuclear reactor, this process can release over a million times more energy per atom than any chemical reaction, including combustion. So it is hardly surprising that considerable efforts have been made over the past 60 years to harness this ‘source of energy’ – or that so many of them have failed.
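The ‘million times’ claim is actually conservative, as a quick per-atom comparison shows. The inputs below are standard textbook round numbers (roughly 200 MeV per fission, a few eV per atom for combustion), used here only for illustration:

```python
# Energy-per-atom comparison: fission of U-235 vs burning carbon.
# Both inputs are round textbook figures, not precise values.

fission_ev = 200e6   # ~200 MeV released per U-235 fission
combustion_ev = 4.0  # ~4 eV per carbon atom oxidised to CO2

ratio = fission_ev / combustion_ev
print(f"Fission releases ~{ratio:.0e} times more energy per atom")
```

On these figures the ratio is tens of millions, well over the million quoted.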
The ‘fast neutron’, ‘fast breeder’ or simply ‘fast’ reactor is a type designed to produce more fuel while it is generating power. The reactor core requires the use of a more highly enriched uranium fuel, in order to sustain a chain reaction based on fast neutrons alone.
Reprocessing has proved more difficult and expensive than uranium enrichment; furthermore, the fast reactor itself has proved technically complicated, with an intensely hot core subjecting the surrounding materials to high fluxes of both heat and neutrons.
Nuclear power plants are not efficient: enormous amounts of the heat produced in nuclear reactors are wasted. The thermal efficiency of a typical light-water reactor – the fraction of heat energy converted to electrical power – is only around one-third, meaning roughly two-thirds of the heat is rejected to the environment.
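A minimal sketch makes the scale of the waste heat concrete, assuming a thermal efficiency of about 33% (a typical textbook value for light-water reactors) and an assumed core output of 3000 MW thermal:

```python
# Waste-heat sketch for a large light-water reactor.
# Both inputs are assumed round numbers for illustration.

thermal_power_mw = 3000   # assumed core thermal output (~1 GWe class plant)
efficiency = 0.33         # assumed heat-to-electricity conversion efficiency

electric_mw = thermal_power_mw * efficiency
waste_mw = thermal_power_mw - electric_mw
print(f"Electricity: {electric_mw:.0f} MW, heat rejected: {waste_mw:.0f} MW")
```

In other words, for every megawatt delivered to the grid, roughly two megawatts of heat go into the river, the cooling tower, or the atmosphere.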
To understand this, let’s look at the financial strategy that a nuclear power plant employs to maximize profits.
Construction Of A Nuclear Power Plant
The earliest large nuclear reactors built in the USA, Britain, USSR and China were all designed to make weapons-grade plutonium for atomic bombs.
The first electricity to be generated by nuclear power, in 1951, actually came from a small breeder reactor in Idaho, named EBR-1. But the US nuclear research programme was directed less at electricity generation and more towards propulsion for submarines and expanding the plutonium stockpile for nuclear weapons.
The first large American power reactor, which began operating in 1957 at Shippingport, Pennsylvania, was a 60 MW unit hastily modified from the US military submarine design, and originally destined for a nuclear aircraft carrier.
Thus a compact American configuration, intended for the restricted space on board a submarine, evolved into the most commonly used reactor types today: the pressurised water reactor (PWR) and the related boiling water reactor (BWR).
As governments sought to support an electricity generation industry based around nuclear power, the size of power reactors grew. But energy was cheap, and there was little incentive for industry to invest in nuclear technology without heavy government subsidies.
American electricity utilities refused to participate at first, on the grounds of cost, risk and the availability of cheap oil and abundant coal. The US government responded by building a series of demonstration reactors using different technologies. Most of these performed poorly.
The rate at which nuclear energy was introduced in the United States was determined largely by competitive market forces rather than by any perception of energy demand. Nuclear power in the former Soviet Union was introduced at less than one-fifth the rate it was here.
Beginning with a 500 MW BWR at Oyster Creek, New Jersey, in 1963, a string of fixed-price (and often loss-making) commercial contracts for nuclear power stations was let by competitors General Electric and Westinghouse.
Losses of up to $100 million per plant are thought to have been sustained by the manufacturers in their determination to build up the market.
In the nuclear power heyday of the 1960s and ’70s, the default rate on construction/financing loans for nuclear plants was 50%. Now, after TMI, Chernobyl and Fukushima, the default rate would be calculated by any decent risk analyst at 100%.
Although about 40 reactors were ordered in the United States in each of 1973 and 1974, scarcely a single power station ordered since has been completed, and no new orders were placed after 1978. Some US reactors were abandoned when more than 90% complete.
In the early 1970s, the United States was predicted to have around 1000 nuclear power stations in operation by the year 2000 – instead it managed barely more than 100.
There are many reasons the nuclear renaissance has come to a screeching halt, but one of the largest is the cost to construct and operate a nuclear power station.
The cost to build a new nuclear power station is so astronomical that the government is forced to provide large subsidies and financing support to ensure that stations can be completed.
Moreover, when nuclear plants come to the end of their working life, they must be decommissioned, a very expensive process that generates yet more wastes. Nuclear facilities are also potential terrorist targets, whereas most renewables are unlikely to attract the attention of terrorists.
This forces regulators to view existing stations in a different light: if one is taken off the grid, there is no guarantee that a new one will be built. To cope with this, regulators have taken the approach that plants can operate until they fail.
When they fail, they can melt down and render huge swaths of land uninhabitable for generations.
In 1978, the average cost to build a nuclear power station was under US $5 million; after 1979 the cost immediately jumped to between US $3 billion and $5 billion. The construction of a new nuclear power plant today is conservatively estimated at US $8 billion to $12 billion.
This does not include future costs that may be incurred for retrofits, broken equipment, permanent spent-fuel storage, and other items that can add billions to the cost of each reactor over the course of its lifetime.
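To see what these construction figures alone mean for the price of electricity, spread an assumed $10 billion build cost over an assumed 40-year life at a 1 GWe rating and 90% capacity factor. All of these inputs are round assumptions, and financing costs are ignored entirely, which makes this a best case:

```python
# Capital component of the electricity price, construction cost only.
# All inputs are assumed round numbers; interest during construction
# and financing costs are ignored (a best case).

construction_usd = 10e9      # assumed build cost, mid-range of today's estimates
capacity_kw = 1_000_000      # assumed 1 GWe net rating
life_years = 40
capacity_factor = 0.9        # assumed fraction of hours at full output

lifetime_kwh = capacity_kw * life_years * 8760 * capacity_factor
capital_cents_per_kwh = construction_usd / lifetime_kwh * 100
print(f"~{capital_cents_per_kwh:.1f} cents/kWh for construction alone")
```

Roughly three cents of every kilowatt-hour goes to paying off the concrete and steel before a single operating, fuel, waste, or decommissioning cost is counted.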
License renewal does not require replacement of parts – it is a blanket extension of the operational life of the reactor and its entire support system.
In its final stages, decommissioning involves the handling of very large pieces of radioactive material from the reactor core and associated cooling and heat transfer systems. Cooling water for concrete-cutting saws, dust, cleaning solvents, etc. may all be agents for the spread of radioactivity and need to be carefully contained.
Many stations will have cost upwards of 20 billion dollars from construction through decommissioning, with permanent fuel storage costs added.
When a nuclear reactor is taken off the power grid, the utility must purchase reserve power to ensure that consumers do not face an unexpected outage without recourse. That is estimated to cost the utility approximately US $1 million per day.
Let’s do some math.
US $1 million × 365 days × 40 years = US $14.6 billion
So purchasing reserve power for each station every day over the entire operating license period would actually cost less than it does to build, operate, and decommission each reactor.
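The comparison above can be spelled out in two lines, using the $1 million per day reserve-power estimate and the 40-year license period from the text:

```python
# Reserve power for an entire 40-year license period, at the
# ~$1 million/day estimate quoted above.

reserve_usd_per_day = 1_000_000
total_usd = reserve_usd_per_day * 365 * 40

print(f"${total_usd / 1e9:.1f} billion")
```

At $14.6 billion, that is still below the $20+ billion lifecycle figure cited above for building, operating, and decommissioning a single station.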
More recently regulators have been forced to allow venting during normal operations to deal with the excess heat and steam that threaten the stability of nuclear reactors.
This, combined with faulty and archaic piping, has led to many reports of contamination in the environment, which is generally not acknowledged by the utility until the NRC or an environmental monitoring agency intervenes.
Many Americans were shocked to learn of the vast amount of tritium leakage from nuclear power plants in the United States, especially after watching Vermont Yankee ‘operate’ for the last 40 years.
Much of the world’s more dangerous radioactive waste (i.e. spent reactor fuel) is still in interim storage, mostly at the reactor sites themselves.
While many countries have explored or supported deep geologic disposal as the best method for isolating highly radioactive, long-lived waste, no country has yet commissioned such a repository, although several have opened central interim stores for used reactor fuel and high-level wastes (HLW).
Sweden and Finland are arguably the most advanced in their plans, promising operational repositories ‘some time after 2010’, but most other countries have vague timetables beyond 2020 or even 2030.
US government plans for a US $10 billion repository at a remote but geologically active site at Yucca Mountain, Nevada, originally scheduled to open in 2010, look likely never to become a reality.
The safety of nuclear power did not really become an issue until 1979 – it was previously dismissed on the grounds of statistical analysis of component failures, each with a low probability.
Although there was no breach of the pressure vessel or direct loss of life, the March 1979 loss of coolant and partial core meltdown in one of two PWRs at Three Mile Island near Harrisburg, Pennsylvania, was a severe setback for nuclear power worldwide, leading to moratoria on future construction in Italy, Belgium, Sweden and elsewhere.
The economic prospects for nuclear power were therefore looking worse still when, in April 1986, a large leak of radioactivity was detected in the atmosphere over Sweden.
One of four 1,000 MW graphite-moderated RBMK reactors at Chernobyl had suffered a steam explosion, graphite fire and core meltdown two days earlier, but the then-secretive Soviet government had initially tried to cover up the accident.
There is little doubt that Chernobyl contributed to the 1989 fall of the Berlin Wall and the 1991 demise of the former Soviet Union.
It also marked a symbolic opening of the world nuclear industry to careful scrutiny by economists, investors and the public. Nuclear technology experts from the United States, Japan and Western Europe have been visiting and advising their Eastern European and Asian counterparts ever since, in the knowledge that if another reactor melts down or leaks badly anywhere in the world, their industry faces almost certain shutdown.
That inevitable disaster occurred this March at Fukushima Daiichi, and the full ramifications and sequence of events will not be known for decades, if ever. The most recent assessments show that the disaster recovery efforts will easily exceed US $200 billion.
Nuclear power stations also have very visible effects on the environment during normal operations. At the Hanford site in Washington, for example, an appreciable rise in the temperature of the Columbia River could be registered whenever one of the plutonium-production piles was operating.
Only about 1–2% of the world’s new power plants are nuclear, and today most of the reactors under construction, on order or planned are in Asian countries such as China and India.
Nevertheless, pressure has been mounting for a reassessment of the nuclear option in the United States and Europe, in part because of the potential role that nuclear power can play in response to climate change.
In 1998 the nuclear industry journal Nucleonics Week (22 October 1998) said, perhaps rather tongue in cheek, that ‘nuclear needs climate change more than climate change needs nuclear’, and that issue remains central. Nuclear proponents claim that it can help respond to climate change since nuclear plants do not emit carbon dioxide during operation.
But wherever nuclear expansion or extension is considered, as this brief history has illustrated, there will be many technical and financial obstacles that the industry has to overcome, as well as a range of strategic security issues that threaten not only our own nation, but our Allies as well.
The Toughest Summer On Nuclear
This was made evident here in the United States this year, which has been the toughest on nuclear power stations that I have ever seen.
Two nuclear power plants were closed due to flooding, and jellyfish and other debris have forced other stations to trip and rely on off-site power. The Cooper Nuclear Power Plant may never reopen because of the costs incurred from this summer’s flood damage.
There have also been multiple reactor scrams due to earthquakes across the country. The most powerful rocked the North Anna Nuclear Power Plant, shaking huge spent-fuel casks which weigh many tons and are precisely positioned. The utility, Dominion, has also been forced to admit that these casks may have moved twice due to aftershocks.
A Virginia Tech study scheduled for release later this year is examining whether this earthquake is an indicator of a larger quake to come.
To explain why this is so dangerous, let’s look at the North Anna nuclear power plant event this August.
The nuclear fuel assemblies and control rods are secured at one end only. The fuel assemblies are held at the top and rest inside the reactor; the control rods are also gripped at the top of the rod and have been thought to fail in seismic events.
In my opinion, the North Anna operators are likely to be shocked in a few weeks when they open the reactor to refuel it and find that they cannot retrieve the fuel or the control rods, which are likely wedged inside the reactor.
When the earthquake occurred, a negative neutron flux rate signal caused the reactor to trip and a scram to occur, but the sequence of events recorded by the equipment did not follow the normal pattern.
In my opinion, it is likely that the earthquake set the fuel assemblies moving and dislodged at least one control rod, causing it to drop – known in the nuclear industry as ‘raining rods’ – producing the negative flux signal and tripping the reactor.
During the scram the rest of the control rods were released to drop into the reactor, but with movement still occurring inside the vessel due to the top-secured assemblies, I would not be surprised at all if multiple control rods are stuck and will require some serious engineering to remove.
This event has serious implications for the industry, and yet that is not the news you read in the newspaper.
When you look at this event, combined with the other lessons of this year and the history of nuclear energy, it is easy to understand that our nuclear power stations are at risk not from one threat but from a plethora that no one can ‘predict’ or ‘safeguard’ against while remaining financially profitable.
Regulators are forced to admit that the risks come not just from operator error or component failures, but from the environment, from unforeseen events, and even from terrorist attacks. The nuclear industry is in itself an INCOMPLETE industry, because there is no plan for permanent storage of nuclear waste.
As I’ve shown, it is not financially beneficial for many of the residents who end up using the electricity generated by these stations, and every nuclear power station poses a very real threat not just to those living around it, but to all within range of the deadly materials held within its fiery bowels.
I would encourage each and every one of you to educate yourselves on the role that energy plays in our nation, and in your community.
Find out how you can contribute by being a more responsible and informed citizen, and how together we can work to achieve our goals to provide clean and responsible energy production and management throughout our country.
The key to a future in which we are not enslaved by our dependence on energy is to develop and implement strategies and technologies that help make electrical power generation more responsible, more efficient, more available, more dependable, and A HELL OF A LOT SAFER.
- Tags: BWR, Chernobyl, Contamination, Decommissioning, Fission, Fukushima Daiichi, General Electric, Nuclear, Nuclear Fuel, Nuclear Industry, Nuclear Meltdown, Nuclear Regulatory Commission, Nuclear Waste, Plutonium, Radioactive Material, Spent Fuel, Three Mile Island, Uranium, Vermont Yankee Nuclear, Westinghouse