In 2007, nuclear power accounted for 19% of the total electricity generated in the United States, according to government data. That is roughly the amount of electricity used to power California, New York, and Texas, our three most populous states, combined. With 104 nuclear power plants spread across the country, the United States produced a record of nearly 807 billion kilowatt-hours, more than any other country in the world.
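Those two figures can be combined: dividing nuclear output by its 19% share recovers the implied total U.S. generation for 2007. A minimal sketch, using only the numbers stated above:

```python
# The article gives two 2007 figures: nuclear output of roughly
# 807 billion kWh and a 19% share of total U.S. generation.
# Dividing the first by the second yields the implied total.
NUCLEAR_KWH = 807e9     # ~807 billion kWh (from the article)
NUCLEAR_SHARE = 0.19    # 19% of total U.S. electricity (from the article)

implied_total_kwh = NUCLEAR_KWH / NUCLEAR_SHARE
print(f"Implied total U.S. generation: {implied_total_kwh / 1e9:.0f} billion kWh")
# → roughly 4,247 billion kWh
```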
Nuclear power provides 30% of the electricity in the European Union, and France leads the pack with nearly 80% of its electricity produced by nuclear reactors. The percentage of electricity that the U.S. derives from nuclear power is considerably lower than in many other industrial nations.
Although the operations and maintenance costs of a nuclear power plant are lower than those of conventional plants, nuclear plants are more expensive to construct and take longer to build.
The range in cost is greatly disputed among available sources, but as recently as June 2008, Moody's Investors Service estimated that the final cost of installing new nuclear capacity in the United States may rise above $7,000 per kilowatt.
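To give that per-kilowatt figure some scale, the sketch below applies it to a reactor of a hypothetical size; the 1,100 MW capacity is an illustrative assumption, not a figure from the article.

```python
# Rough construction-cost sketch using Moody's June 2008 estimate of
# $7,000 per kilowatt of installed capacity. The 1,100 MW reactor
# size is a hypothetical figure chosen for illustration only.
COST_PER_KW = 7_000     # dollars per kilowatt (Moody's estimate)
CAPACITY_MW = 1_100     # hypothetical reactor capacity in megawatts

capacity_kw = CAPACITY_MW * 1_000
total_cost = COST_PER_KW * capacity_kw
print(f"Estimated construction cost: ${total_cost / 1e9:.1f} billion")
# → Estimated construction cost: $7.7 billion
```

At that rate, even a single mid-sized reactor implies a multi-billion-dollar construction bill, which is why the cost estimates are so hotly disputed.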
According to the Energy Information Administration (EIA), it would take six years to build a new nuclear power plant. By contrast, a new coal- or natural-gas-fired plant would take only two to four years to build, according to the EIA.
Nuclear energy is emission-free, reliable, and less expensive than most other traditional energy sources, and it is increasingly viewed by countries across the world as an enticing form of electricity generation.
Accidents at the Chernobyl and Three Mile Island nuclear power plants caused a great swing in public opinion against the use of nuclear power. Critics believe that nuclear power is a potentially dangerous energy source, while proponents contend that new technology has made nuclear power safer than ever. There are various groups on each side, with both camps enjoying the backing of influential people.