Thursday, March 31, 2011

The Maturing of Radiation Understanding

While I am much more interested in other things, reading about Fukushima and comparing that event to others has made me more than a little curious about radiation. Our understanding of the impacts of radiation is still maturing. So it is understandable that there is so much confusion in the public, the press and even among the experts.

Ann Coulter makes a living pissing Democrats off. She is very good at it, by the way. She commented on the Bill O'Reilly show that radiation has health benefits. Being an effective Democrat pisser-offer, there is enough truth in her statement to ensure her job security. Higher background radiation is not consistently indicative of higher cancer rates, and some studies indeed indicate that certain levels of radiation promote DNA repair mechanisms in the human body. There are plenty of odd things in medicine that seem to go against common reasoning. Chiropractic medicine has plenty of "real" medical antagonists, but does have varying degrees of effectiveness. Many will accept chiropractic, holistic, and herbal medicine, but doubt radiation therapy. The human mind is truly amazing.

The Ann Coulters, Rush Limbaughs, and Bill O'Reillys of the world will continue making good livings tweaking the sensibilities of the Left until the Left realizes how easily it is manipulated. It is not just the Left; the Right is also quite gullible. Each just has different buttons that can be pushed. Humorists will win the day ultimately. A great deal of the more accurate reporting on the condition of the world comes from comics. Perhaps I should help create a new political party, the Comicunist Party (CP).

The CP is much better equipped to deal with the irrational fear and policies of the less humorous powers that be. I have understood that for a long time and would have voted for Pat Paulsen had I not been so agnostic at the time. I did vote for H. Ross Perot. Unfortunately, I later found out he was serious.

The real tragedy in Japan makes it difficult to find any humor. Real and irrational fears are fanned by the all too ignorant mainstream media. There are still many in Japan and around the world finding ironic and real humor. That is one of the nice things about human nature. Crisis brings out the best in us, provided we are actually dealing with the crisis. Being removed from the crisis tends to bring out the idiotic nature of humanity. Greenpeace, in search of what they are sure will be examples of governmental conspiracy, is helping the Japanese survey radiation. The Church of Scientology sees the need to convert the heathen Japanese to the light of the Lord following what surely must have been divine intervention. Various humanitarian groups are offering to adopt the thousands of newly orphaned Japanese children, who do not exist.

In the midst of all of this, humanity is being forced to learn more about one of its irrational fears, radiation. The intense surveys in Japan will produce more understanding, just like Three Mile Island and Chernobyl. Before TMI, radon in homes was never considered. Tests after TMI showed higher than expected levels of radon. In 1980, radon was the crisis du jour. The thousands of home radon surveys proved one thing: radiation in our lives exists, like it or not.

Since then, places with very high background radiation levels have been discovered, like Guarapari, Brazil, with a background radiation level of 175,000 microsieverts per year, and Ramsar, Iran, at 260,000 microsieverts per year. Now you can expect more background radiation maps to appear on the internet to provide a better perspective.

Our understanding of radiation is maturing. Hopefully, our understanding of ourselves will as well. If not, we will always be able to fall back on the Comicunist Party.

Tuesday, March 29, 2011

The Uncertainty of the Impact of Radiation - Fukushima

The impact of Fukushima's radioactive fallout will be compared to three major things: background radiation, Three Mile Island and Chernobyl. The actual long term impact cannot be exactly stated. Statistical probabilities with levels of uncertainty will be used to estimate the impact. Statistical methodology is a powerful mathematical tool in the proper hands and a powerful propaganda tool in the hands of the biased or poorly trained. Abuse of statistics is unfortunately all too common.

Mrs. Cindy Folker was recently on C-SPAN stating her concerns about nuclear power. She stated, basically, that it is not the job of the public to prove the dangers of radiation, but the job of the government to disprove the danger of radiation due to nuclear power. She is partially right. It is the job of the government to state the statistical probability of health impacts of radiation by type, source and exposure. To Mrs. Folker, there is no safe level of radiation. While that is probably true, there is no way to avoid exposure to radiation.

Everyone will make their own determination of the potential danger of radiation. Some will make totally uninformed determinations; some will study to become informed before making a decision. The major problem is that many outspoken, caring and passionate people will sway the opinion of many with often uninformed, biased and passionate rhetoric. Often the sources they cite, if indeed they cite sources, are biased toward their opinion. Not intentionally; it is human nature. You are more likely to find and believe what you expect to exist than what may exist. This is one of the major problems with statistical methodology.

Mrs. Folker referenced two TMI studies. From Wikipedia, this is the opening paragraph:
"The health effects of the 1979 Three Mile Island nuclear accident are widely, but not universally, agreed to be very low level. According to the official radiation release figures, average local radiation exposure was equivalent to a chest X-ray, and maximum local exposure equivalent to less than a year's background radiation. Local activism based on anecdotal reports of negative health effects led to scientific studies being commissioned. A variety of studies have been unable to conclude that the accident had substantial health effects, but a debate remains about some key data (such as the amount of radiation released, and where it went) and gaps in the literature."

For all its faults, Wikipedia is an easy source and generally fairly accurate. The "not universally agreed" sources are the ones Mrs. Folker cited, quite passionately. If conspiracies are your cup of tea, there is no way that you can make an informed decision, so you may as well find something else to read.

One of the sources, again per Wikipedia: Dr. Steven Wing was hired by lawyers for 2,000 residents of the TMI area to perform a study to counter the government's study. "Wing found cancer rates raised within a 10-mile radius two years after the accident by 0.034% +/- 0.013%, 0.103% +/- 0.035%, and 0.139% +/- 0.073% for all cancer, lung cancer, and leukemia, respectively.[15] An exchange of published responses between Wing and the Columbia team followed." Dr. Wing's report concluded that a small but statistically significant increase in cancers in the TMI area was evident.

At the Columbia team's recommendation, another study was performed by the University of Pittsburgh, which found a slight increase in mortality but no direct attribution of causation to TMI radiation. To which Dr. Wing responded, per Wikipedia: "Wing et al. criticized the Pittsburgh study for making the same assumption as Columbia: that the official statistics on low doses of radiation were correct - leading to a study "in which the null hypothesis cannot be rejected due to a priori assumptions."[20] Hatch et al. noted that their assumption had been backed up by dosimeter data,[17] though Wing et al. noted the incompleteness of this data, particularly for releases early on."

In 2005, there was another study concluding, "In 2005 R. William Field, an epidemiologist at the University of Iowa, who first described radioactive contamination of the wild food chain from the accident[citation needed] suggested that some of the increased cancer rates noted around TMI were related to the area's very high levels of natural radon, noting that according to a 1994 EPA study, the Pennsylvania counties around TMI have the highest regional screening radon concentrations in the 38 states surveyed."

Then in 2008, a study often quoted by nuclear power advocates: "A 2008 study on thyroid cancer in the region found rates as expected in the county in which the reactor is located, and significantly higher than expected rates in two neighbouring counties beginning in 1990 and 1995 respectively. The research notes that "These findings, however, do not provide a causal link to the TMI accident."[23] Mangano (2004) notes three large gaps in the literature: no study has focused on infant mortality data, or on data from outside the 10-mile zone, or on radioisotopes other than iodine, krypton, and xenon.[4]"

Regarding the note that no study has focused on infant mortality data: per Wikipedia, "the Pennsylvania Department of Health, examining death rates within the 10-mile area around TMI for the 6 months after the accident, said that the TMI-2 accident did not cause local deaths of infants or fetuses." While there may not have been a follow-up study, the basic data would be available at the Pennsylvania Department of Vital Statistics should you wish to form your own opinion.

The passionate statements of Mrs. Folker are not supported by evidence. She did not even provide the results of the sources she cited with their margins of error, nor mention the higher than normal background radon concentration of the areas surrounding TMI.

There will be many passionate pleas from all segments of society. Passionate pleas with no basis in fact are useless for making informed decisions. It is likely that the impact of Fukushima will be greater than TMI. The question is, when the data is available, will rational, informed decisions be made?

If you would like another opinion.

Monday, March 28, 2011

The Renewal of the Nuclear Debate

Following a brief period in the late 1950s when the atom was the new wonder of modern technology, reality set in. Taming the atom was not as simple and inexpensive as predicted by the visionaries of the day. The debate over the safety of nuclear power has had its ebbs and flows. As with most political debates, facts are of little use.

Design flaws and potential design flaws of the forty-year-old veteran General Electric Mark I nuclear design are having a déjà vu moment. I have had my own criticisms of the design and the emergency response procedures. I, of course, feel my criticisms are founded in reality, while of course everyone else feels the same. Among what I feel are the more informed people commenting on the design, I really have only one point where I disagree. That disagreement is not particularly a major issue.

The lack of a corium catch basin is one flaw experts criticize in the GE Mark I design. Corium, the molten combination of fuel and other reactor components, should be contained in the event of a total meltdown. The GE Mark I is perceived to be inadequate to contain the corium that could develop in a large power reactor. I say "perceived" because it is not proven. Critics of the design use the typical words, could, maybe, possibly, etc., without any real numbers to back up their points.

I am not particularly a fan of GE, mega scale reactors or boiling water reactors. I can do math fairly well and I also can compare real examples from the past to possibilities in the future. Like TMI for example.

From Wikipedia, "Following corium relocation to the lower plenum, the potential exists for corium to breach the primary pressure boundary (in light water reactors, this is the reactor pressure vessel). What happens when the corium reaches the bottom of the reactor pressure vessel in a Western light water reactor is the subject of actual experience and considerable speculation, and depends on temperatures, the age of the fuel, the amount of activity the fuel has been exposed to, as well as the physical composition of the RPV, the dimensions of the RPV, the pressure of the primary coolant system (whether or not pressurized) and numerous other considerations. It is not likely for the corium to remain critical in the bottom of the RPV unless - first - the corium is quenched by a large excess of coolant water and turned back into solid phase, allowing the interposition of a water moderator and the formation of a critical geometry - second - after the quench of the corium, there remains sufficient unborated water in the lower plenum to moderate the reaction and support criticality - third - the corium remains unadulterated with a neutron-absorptive alloy or substance from the melt of the control rods, such as boron carbide or cadmium." RPV is reactor pressure vessel. As you can see there are a lot of ifs. It cannot be completely ruled out. The statistical probability is very low however.

This situation occurred at TMI but ran into a few ifs. "It was later found that about half the core had melted, and the cladding around 90% of the fuel rods had failed,[9][37] with five feet of the core gone, and around 20 tons of uranium flowing to the bottom head of the pressure vessel, forming a mass of corium.[38] The reactor vessel, the second level of containment after the cladding, maintained integrity and contained the damaged fuel with nearly all of the radioactive isotopes in the core.[39]"

Note: there is one minor error in the Wikipedia quote. The RPV is actually the third level of containment: cladding first, fuel composition second, reactor pressure vessel third and containment building fourth. Really there is a fifth level of containment, the overburden of the containment building. I'll let people think about that final one.

So there is actual real world data to determine the ability of the RPV to contain corium from a 50% core melt. There is actual real world data available to estimate the probability of a total melt and relocation of all corium to the base of the RPV. Many of the arguments now are based on suppositions posed prior to TMI. It is probably impossible to have the arguments limited to "real" possibilities with "real" probabilities. I can dream though.

My other main concern with the GE Mark I design is emergency response planning. First, the pressure vessel should be vented as soon as a real loss of coolant accident (LOCA) is determined. The lower pressure increases the chance of adequate water flow from backup cooling systems to prevent fuel damage. TMI showed that vented steam posed a much lower public health risk than advertised. With that knowledge the Japanese operators should have been able to make a better decision at the onset. Sorry, those are just the facts. Trying to do too much does too much damage. At Fukushima the limited availability of cooling water caused more damage than no cooling water would have. Quenching the hot reactor, letting it heat up again, then quenching it again damages both the core and containment components. The mysterious cause of the radioactive core water leak will be found to be piping connections weakened by repeated heating and quenching. No, I am not clairvoyant; it is basic metallurgy. The piping connections are the weakest point.

So once again the debate will lead to less nuclear power of any kind or, worse, more pressure on operators not to vent for irrational fear reasons, which will lead to more damage and potential fallout.

It will be about five years until all the results are in, check back then.

Radiation Stuff - It is Maddening I Tell You!

Just when you start to get a handle on one unit of radiation measurement, the nuclear guys throw a curve ball. The becquerel is a small unit of radiation measurement that is really confusing. A becquerel is one count of one pop, one decay, per second. It doesn't describe the type of pop, how much energy the pop has or anything related to health, other than that there is some radiation present. On top of that, the pops are given relative to different quantities: square kilometers, square meters, square centimeters, kilograms and grams.

Becquerels are most useful when used to describe food or water contamination. If you consume the food or water, then the majority of the radiation has a chance to do something. As a background level, becquerels are almost useless. Sieverts are used to describe the health impacts of absorbed radiation.

The milk and vegetables around the reactors have been measured in becquerels per kilogram (Bq/Kg). Spinach, for example, was tested at levels of 15,000 Bq/Kg for the isotope iodine-131. The normal Japanese standard for iodine-131 is 2,000 Bq/Kg. So anything over 2,000 Bq/Kg is bad, right? Maybe. At 15,000 Bq/Kg, one report said you would have to eat 40 kilograms (88 pounds) of spinach to have a possible increase in cancer risk. Different folks have different risk factors. The younger you are, the higher the risk. That is why water at 100 becquerels per kilogram is the Japanese limit for infants and 300 Bq/Kg for adults.
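To put those Bq/Kg numbers in health terms you need a dose coefficient that maps ingested becquerels to sieverts. Here is a minimal sketch, assuming an approximate ICRP adult ingestion coefficient for iodine-131 of 2.2e-8 Sv/Bq; the 1 kg and 40 kg portions are illustrative, not from any study:

```python
# Rough sketch: activity concentration to committed effective dose.
# The dose coefficient is an assumption (approximate ICRP adult
# ingestion value for I-131); real dose assessments are more involved.

I131_DOSE_COEFF_SV_PER_BQ = 2.2e-8  # assumed adult ingestion coefficient

def ingestion_dose_msv(activity_bq_per_kg, kg_eaten,
                       dose_coeff=I131_DOSE_COEFF_SV_PER_BQ):
    """Committed effective dose in millisieverts from eating contaminated food."""
    return activity_bq_per_kg * kg_eaten * dose_coeff * 1000.0  # Sv -> mSv

# Spinach measured at 15,000 Bq/kg of I-131:
per_kilogram = ingestion_dose_msv(15_000, 1)   # ~0.33 mSv per kilogram eaten
forty_kilos = ingestion_dose_msv(15_000, 40)   # ~13 mSv for 40 kg (88 lb)
```

So even the 88-pound spinach binge lands in the range of a handful of CT scans, which is roughly consistent with the "possible increase in cancer risk" framing above.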

There is a big difference between 2,000 Bq/Kg for spinach and 100 Bq/Kg for water. Does that mean that infants don't eat spinach? Probably not, but by the time they do eat spinach their body mass is high enough that they can statistically tolerate a few grams of slightly radioactive spinach.

The Japanese radiation standards are very conservative. That is why there is some confusion about the moving radiation targets. There is no international standard I know of that properly addresses the issue of radiation, either background or in foodstuffs. That is because there are huge variations in background radiation and in the natural radioactivity of certain foods.

Bananas, for example, contain about 3,520 picocuries per kilogram of radioactive potassium-40, which converts to approximately 130 Bq/Kg. Brazil nuts have about 207 Bq/Kg of potassium-40 and up to 259 Bq/Kg of radium-226, for a total of 466 Bq/Kg. No other food contains anywhere near that amount of total radiation. White potatoes average 125 Bq/Kg. These are all naturally occurring radiation levels that have not been impacted by any atomic tests or accidents. That is just the way it is in nature.
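The curie-to-becquerel conversion behind that banana number is simple arithmetic: one curie is defined as 3.7e10 decays per second, so one picocurie is 0.037 Bq. A quick sketch:

```python
# Unit conversion: picocuries to becquerels.
# 1 curie = 3.7e10 Bq by definition, so 1 picocurie = 0.037 Bq.

BQ_PER_PICOCURIE = 3.7e10 * 1e-12  # = 0.037

def pci_to_bq(picocuries):
    """Convert an activity in picocuries to becquerels."""
    return picocuries * BQ_PER_PICOCURIE

# The ~3,520 pCi/kg of potassium-40 in bananas works out to:
banana_bq_per_kg = pci_to_bq(3520)  # ~130 Bq/kg
```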

So why are there no international food radiation standards? Because it is not that simple. People living in areas with different normal background radiation levels typically have different tolerances and genetic predispositions to forms and levels of radiation. A Scandinavian, for example, is much more susceptible to sunburn than a member of a second-or-more-generation Scandinavian family living in the tropics, and much, much more so than a tropical native. It is the same with other forms of radiation; the human body adapts.

To figure out your own radiation tolerances you would have to look at your average diet, lifestyle, local background radiation level and work exposure to radiation. Outdoors types build more tolerance than indoor types, vegetarians differ from omnivores, and genetics play a very large role. So national standards reflect national averages, which have meaning, while international standards would not.

The worst thing about all this confusion and most often irrational fear is that the use of radiation for treating foods is not looked at rationally. Irradiation of foods kills the nasty bacteria and other organisms that cause the majority of food related deaths. Buzz words like "organic" are helpful for reducing some levels of pesticides, genetic modification and hormones, but do nothing to address salmonella, E. coli or other more damaging contaminants. For food sensitive people, organic food is a quality-of-life saver. Irradiated food is a real life saver. Any mention of radiation, though, is a frightening red flag. The impact that fear may have on nuclear energy is meaningless. The real tragedy lies in ignoring the potential of irradiated food to reduce food related fatalities, extend the useful life of stored foods and improve international health standards, and that potential will only suffer more because of this event.

The largest adverse health impact of the Fukushima nuclear crisis will be emotional stress. Education is the only thing that can reduce that stress. Enough education may actually allow more people to lead healthy lives.

Sunday, March 27, 2011

Odd Things About Natural and Background Radiation

I was reading the Fukushima reports and noted a cesium-137 level of 150 becquerels per square meter reported in one area. Normal background is in the 100 becquerel range. So I was wondering if the cesium resulted in a total radiation level of about 250, the 100 background plus the cesium measured. I don't know yet, but suppose the total is 250 becquerels per square meter. That is high enough to be somewhat concerning, especially if you have young children. Anyway, I left a comment with a scientist guy living in Japan about the report and my questions about the background level of cesium-137. I was corrected that cesium-137 does not occur naturally in nature.

I thought the same thing myself, but there is still background cesium-137 radiation from nuclear weapons tests, Chernobyl and a variety of other nuclear industry sources. Not much, but enough that it can be used to date wines and whiskeys made after 1950 and after Chernobyl. Curiosity kept me researching and I stumbled across something new to me: a natural nuclear reactor, the Oklo deposit in Gabon.

So guess what? Cesium-137 does occur naturally in nature. Very, very small amounts, but cesium-137 and even plutonium occur in nature. Weird stuff.

Saturday, March 26, 2011

A Dollar a Watt?

One of my fantasies is to build a "green" RV. Not that I am a particularly big-time green guy, I just like the idea. Hydrogen of course is a part of the fantasy, and either wind and/or solar. My RV is 34 feet long and nearly 10 feet wide. Allowing for basic access to junk on the roof, I may have 300 square feet of space for solar cells. Let's round that off to 30 square meters. Sun energy down here is about 1,000 watts per square meter. But solar cells are not that efficient, so let's use the Nanosolar estimate of 15% efficiency. So I could get about 4.5 kilowatts while the sun is shining. The sun doesn't shine all the time, so say 6 hours per day. That's 27 kWh per day. More than enough for my RV. Less shipping and installation, that solar array would cost me $4,500 at a dollar a watt. I would need to spend about $800 for a power inverter, some batteries plus some wiring and stuff, so let's say $6,000 so nothing too unexpected comes up.
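The arithmetic above, as a sketch; all inputs are the round numbers from the paragraph, so treat the results as back-of-the-envelope:

```python
# Back-of-the-envelope solar sizing for the RV roof.
area_m2 = 30            # usable roof area, rounded from ~300 sq ft
insolation_w_m2 = 1000  # peak sun at ground level
efficiency = 0.15       # assumed panel efficiency (the Nanosolar figure)
sun_hours = 6           # equivalent full-sun hours per day

peak_kw = area_m2 * insolation_w_m2 * efficiency / 1000  # 4.5 kW
kwh_per_day = peak_kw * sun_hours                        # 27 kWh/day
cost_at_dollar_per_watt = peak_kw * 1000 * 1.00          # $4,500
```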

Assuming I did everything right and it all lasts 10 years, how neat would this system be? My average monthly electric bill is about $75. That is not much because I have gas heat, stove and hot water. If nothing broke, I would come out $3,000 ahead in ten years ($9,000 of electricity for a $6,000 system). Nothing is perfect, so stuff will break, eating up that $3,000, but the system should last longer, so everything is free after 10 years. A ten year payback is a lot sportier than I would have had with solar three years ago. So solar is finally making up some ground. Summer is my maximum energy use time. The solar array provides enough power, but not when I need it, at night for the A/C. I only allowed for 5 deep cycle batteries, which would be pressed hard in the summer, but will make it. They would need to be swapped out every 18 months to 2 years, but they will do the job.
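A quick sanity check on that payback, using the $6,000 system cost and $75 monthly bill from above. Ignoring breakage, the simple payback actually comes in under seven years; ten is the conservative figure once repairs eat into the margin:

```python
# Simple payback sketch for the $6,000 system against a $75/month bill.
system_cost = 6000
monthly_bill = 75

annual_savings = monthly_bill * 12                   # $900 per year
simple_payback_years = system_cost / annual_savings  # ~6.7 years, if nothing breaks
ten_year_net = annual_savings * 10 - system_cost     # $3,000 ahead over ten years
```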

My ideal fantasy system would have a reverse cycle fuel cell and store about 60 kilograms of hydrogen. I will continue with those estimates later.

America - The Saudi Arabia of Trash

I like looking at opportunities. One of our country's greatest un-natural resources is trash. We have trash everywhere. Mountains of garbage entombed in landfills all over the country. Treasure troves of not so raw materials and potential energy just waiting for the right time to exploit. How close is that time?

With all the hype related to global warming, energy security, protecting our environment and high energy prices what is being done with the garbage crisis? Not a lot!

Garbage is biomass. That warm and fuzzy buzz word implying our salvation from big everything is waiting in our own back yards. It just might be back there in Mount Trashmore.

Incinerators are so 20th century. Much more elegant designs remove virtually all of the nasty pollutants incinerators were known for in the past. Handling the garbage is the main problem. Sorting and separating often becomes a hands-on situation that even illegal immigrants avoid. Nasty job! Mechanical mining and separation of garbage is less distasteful. Machines don't gag when they encounter junior's decade-old dirty diaper. When this valuable un-natural energy source is combined with other clean combustibles, magic happens. Energy from waste, reduced trash leaching into our ground water and fewer unsightly Mount Trashmores. Sounds like a plan, right? It is, and a better plan than many think.

Not only is there energy in them thar trash mountains, there is biofuel! Yep, chemical engineers can not only convert trash to cash, but into trash you can pump into your gas tank. So why aren't we doing this? Stupidity is the only thing that pops into my mind. We have to be careful though; we could run into the great trash shortage in the not too distant future. Damn the bad luck!

Friday, March 25, 2011

Why Waste Heat?

Waste heat is money down the drain. Unfortunately, we cannot help but waste heat. If we could avoid it, we would have perpetual motion machines. No energy problem would exist. In the real world there is energy loss. How much is really a matter of money. This is not some conspiracy; it is thermodynamics versus finance. You make energy decisions every day. You have a ten year old car that gets 20 MPG, but you could drive a new hybrid that gets 48 MPG. Is it worth $45,000 to you to more than double your gas mileage? Just like the laws of thermodynamics limit engine efficiency, your financial situation limits your energy efficiency choices.

Some say, "There is so much more we can do to conserve energy." Of course there is, but there are financial and physical limits. You can add insulation to your house or even paint your roof white. You can't live in a thermos bottle. So there is a point where adding more insulation is not cost effective. Painting the roof white is fine, but you have to clean the roof to keep it white.

Is there a conspiracy by big oil and Detroit to keep 100 MPG cars off the road? No, there are real technological, financial and utility limits. You can build something close, but it is not very useful. A one-seater car with hard rubber tires and no get up and go limits consumer demand. The GM EV1 was a neat battery powered car. It had a limited consumer base and high maintenance costs, and was not worth the money for GM to support. It was cheaper for them to shred the cars than keep them on the road. Sad but true. A lot of the new high efficiency ideas require the public putting their cash on the line. There is a bunch of public out there with different needs, wants and financial resources.

A surprisingly large number of the public are second-hand buyers. That is not just people buying used cars; it is also municipalities waiting for proven technology. There is no magic wand that anyone can wave to change things. You have to work with what you've got.

As prices for energy grow, the options to save energy increase because they become more affordable. Take the car example: with gas at $1.50 a gallon it would be foolish to invest so much to save so little. With gas at $5.00 a gallon that new hybrid starts to look more attractive. The EV1 would still be on the road with lots of buddies had gas prices stayed high. It is not that bright to artificially inflate gas prices just so you can see more Volts than Volkswagens. Trust me, poverty sucks.
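Here is a sketch of that comparison with the 20 MPG and 48 MPG figures from above; the 12,000 miles per year is my assumed mileage for illustration, not a figure from any study:

```python
# Fuel-cost comparison: 20 MPG old car vs. 48 MPG hybrid at two gas prices.

def annual_fuel_cost(miles_per_year, mpg, price_per_gallon):
    """Dollars per year spent on gasoline."""
    return miles_per_year / mpg * price_per_gallon

miles = 12_000  # assumed miles driven per year

for price in (1.50, 5.00):
    old_car = annual_fuel_cost(miles, 20, price)
    hybrid = annual_fuel_cost(miles, 48, price)
    savings = old_car - hybrid
    years = 45_000 / savings  # years to recover the hybrid's price in fuel savings
    print(f"${price:.2f}/gal: ${savings:,.0f}/yr saved, payback in {years:.0f} years")
```

At $1.50 a gallon the fuel savings never recover the price of the car within its lifetime; at $5.00 the picture improves a lot, though it is still a long wait.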

Estimating future energy prices is needed to determine the value of energy options. Now, energy security issues add to the criteria, and potential global warming is in the mix. That all combines to make more energy options viable. Financially, bang for the buck is still a major issue with either your money or the people's.

Security wise, the Department of Defense will lead the way on transportation fuels. That limits fuels to NATO compatible blends. There is some use of waste heat that can be included in manufacture of alternate energy types of these fuels.

The big uses for waste heat are in industry. The concrete industry can use power plant exhaust heat, not for the full process, but to preheat materials prior to final processing. They use waste heat now; it will just mean that it is worth the money to use more. Lumber kilns can use waste heat; even ice-making companies can use waste heat.

In the US, somewhere around 60% of the energy used by power plants is wasted. At the time they were built, technology and finances greatly limited efficiency. In today's world that waste can be reduced to 50%, and with a little extra effort even 40%. That would decrease energy use by 20% with only one catch: the improved efficiency will not be just electrical output. It requires joint government/business partnerships to explore waste heat recovery options.
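The 20% figure follows from a little algebra: producing the same useful output at a higher efficiency needs proportionally less energy input. A minimal sketch:

```python
# Fractional reduction in energy input for the same useful output
# when plant efficiency improves.

def fuel_reduction(old_eff, new_eff):
    """Returns 1 - old/new: the fraction of input energy no longer needed."""
    return 1 - old_eff / new_eff

# 60% wasted is 40% efficient; 50% wasted is 50% efficient.
reduction = fuel_reduction(0.40, 0.50)  # 0.20, the 20% decrease mentioned above
```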

It is all in the Sales Pitch

Getting warm and fuzzy folks to even think about any energy efficient solution using coal in any form is like getting a 5 year old to eat broccoli. Spinach is easier because you can dig out old Popeye cartoons for the sales pitch. Biomass is like spinach, an easy sell.

The Green Car Congress is about as warm and fuzzy as you can get. They like this idea, http://www.greencarcongress.com/2010/12/liu-20101209.html because it uses "biomass". FutureGen, the prototype coal plant of the future, is broccoli. Blah! If you add 15% to 25% biomass to FutureGen, that is like covering the broccoli with cheese and bacon bits.

Why you have to play games with the warm and fuzzies is something I am beginning to understand. Actual thinking about real world issues is too far removed from the abstract Utopian vision they share. Compromise is taboo. Why else would activists fall for the dihydrogen monoxide skit? They definitely did not show effective reasoning skills. So we have to sugar coat things or cover them with cheese and bacon bits to get them to taste reality.

Thursday, March 24, 2011

Energy Infrastructure

I have hit on infrastructure a few times. Today on Dr. Curry's blog one of the commenters, ChE, made a statement about the confusion many people have with "The National Grid".

It is impossible to have a true "national electrical grid" without futuristic room temperature superconductive wires. That is a long way from happening. I have commented on various ways to extend the usefulness of the existing grid, but never really explained why. Wire is not a perfect conductor, so there are line losses. The wires get hot, and that heat is the lost energy. Voltages are kicked way high to reduce current flow, which is the main culprit behind line losses.
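The effect is easy to see with Ohm's law: the loss is I²R and I = P/V, so the loss falls with the square of the transmission voltage. A sketch with plausible but made-up numbers; the 100 MW load and 10-ohm line resistance are assumptions for illustration only:

```python
# Resistive transmission loss: P_loss = I^2 * R, where I = P / V.

def loss_fraction(power_w, voltage_v, resistance_ohm):
    """Fraction of transmitted power lost as heat in the line."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm / power_w

load_w = 100e6   # 100 MW, assumed load
r_line = 10.0    # assumed line resistance in ohms

low_v = loss_fraction(load_w, 115e3, r_line)   # ~7.6% lost at 115 kV
high_v = loss_fraction(load_w, 345e3, r_line)  # ~0.8% lost at 345 kV
# Tripling the voltage cuts the loss fraction by a factor of nine.
```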

Spreading power plants in anticipation of current load reduces the line losses. So more and smaller power stations are part of the solution for extending the useful life of the grid(s) serving the country. Wind power in the middle of nowhere has more line losses, which is another limit of that form of energy. Wind is effective if a reasonable load is within a reasonable distance of the wind farm, provided base load or peak power generation is available to balance out the no-wind times. Massive solar power farms would have the same problem.

Allowing growth of base load demand is another reason I prefer smaller modular power plants. They are not a great idea for huge metropolitan areas, but perfect for mid-size areas nearing the current/power rating of the grid serving them.

This is also the reason I promote off peak hydrogen production. Even at a lower than desired efficiency, it allows storage of energy where the power source is not easily adjustable with demand. It also provides fuel for peak demand generation as needed.

Hydrogen storage works for all scales of energy requirements. A home solar or wind system with hydrogen storage is capable of providing consistent electricity under more varied situations. Subterranean hydrogen storage can provide prolonged peak power as needed on large scales.

Critical?

Critical generally means bad. He/she is in critical condition, bad thing. He/she was critical of my paper, bad thing. People try to dress up critical, "it was just constructive criticism." No matter how you dress it up, critical is not considered much of anything but bad.

Engineers use the term critical for not always bad things. Critical steam is a good thing. If you are making energy with atoms, critical is a good thing. Explaining good critical to the non-engineering masses is tough. Maybe we need a new buzz word?

"Critical situation" deserves a little PR work as well. Degrees of risk are part of determining whether a situation is "critical". Why not just communicate degrees of risk and let others figure out how critical that is?

Take Fukushima for example, the risk of the nuclear fuel creating a nuclear explosion was zero. That possibility was not a "critical" situation. The risk of the nuclear fuel melting and burning through the pressure vessel containment was very low. Nobody, other than a very few that no one listens to, mentioned that. The risk of the nuclear fuel melting, burning through the pressure vessel and then burning through the concrete containment vessel was ridiculously low. That was never communicated very well. The risk of additional radiation release by continued cooling in the pressure vessel after fuel damage was never communicated. What would have happened if everyone ran like hell and did not try to add water to the pressure vessel after the fuel damage? There would have been no water to make the steam that carried the radiation away from the power plant. The very low risk of burning through the pressure vessel containment would have increased. The extremely low risk of the molten fuel burning through the concrete containment building would have increased. Without water to provide the steam to carry the radiation, the radiation release would have been decreased.

Of course, had the molten nuclear material performed the near impossible, there could have been a steam explosion if it hit the ground water. Then you would want to know the risk of a large enough steam explosion to expel enough radioactive material either up through the concrete containment building or around the base of the containment building to cause public health issues. Contaminated ground water risk would be nice to know as well as mitigation plans to prevent the spread of contamination.

I guess I am the only member of the lay public that would like to know all that.

The risk of the spent fuel in the pools going "critical" is also very low. Wouldn't it be nice to know how low? In a dry pool, the risk of "fire" from the cladding is low. The risk of that "fire" spreading is lower. Two things I would like to know are how low for each. The risk of spreading more radiation trying to recover a dry pool is high. How much and how high? The only thing good about Fukushima is I may get those questions answered.

Supercritical steam for power plants using any type of fuel greatly increases the efficiency and provides options to increase efficiency even further. Doesn't that inspire more questions?

I will try to find some of the answers.

Our Hydrogen Economy and Synfuels

"Implementing our 'Hydrogen Economy' with Synfuels" is an interesting paper. I have not discussed this because I am biased toward fuel cell vehicles. Still, we have a butt load of internal combustion engines and gas turbines that would go to waste if some liquid fuel is not available. If I were king, I would phase out IC engines, but gear heads would have my head on a platter.

Using coal to produce synthetic fuels has been around for a long time. FutureGen is a prototype coal project intended to maximize the efficiency of coal while combining carbon capture and sequestration. Sequestering carbon is not cheap. All those nasty carbon atoms need a useful purpose in life. Augmenting synfuel production with hydrogen from a clean or "green" source gives those poor misunderstood carbon atoms a purpose.

Of course, this "Carbon Recycling" concept is lost on the warm and fuzzy crowd bearing torches and pitch forks. Read the paper. Later we will discuss possibilities. BTW, you can leave comments, you can even call me an idiot if you like. I have a thick skin so I won't cry too long.

Wednesday, March 23, 2011

Future Energy Scenarios

People get paid big bucks to do detailed commodity futures analysis. This is free, so remember you are getting what you pay for.

Strategic needs seem to be the winner as far as short term driver of energy policy. The Global Warming guys have shot themselves in the foot and are still trying to figure out how that could have happened. I don't have a dog in that hunt, so I could not care less. AGW will probably be in the range of 1 to 3 C by the end of this century, worthy of consideration. Gas at 6 bucks a gallon by the end of the decade is more worthy. Doesn't really matter.

Transportation fuel is the biggest short term concern. Drill, Baby, Drill is now one of our new president's battle cries. He made the obligatory comments about staying the course even if oil prices drop so we feel comfortable again. Same message I have heard since the mid 70's. This time though, our fearless leader popped a few caps at the North African wing nut. Wing nuts are like red neck families, they fight all the time among themselves, but will take on all comers that slight fat cousin Betty. So expect more foreign oil supply problems.

Deeper and more remote areas will be drilled to get crude, even though California has crude burbling up off their coast. Coal gasification for synfuels is on the back burner because of cost and the anti-coal lobby. Natural gas for large trucks is growing much more attractive and will really kick in with the trans-Canada gas pipeline. Bio-fuels are nearing their peak using food crops, which should start a real push for non-food-crop biomass production. It will be at least a decade for that to gain traction.

Electric generation is switching to natural gas and higher efficiency coal. While more nuclear is likely, Generation IV reactors are still a couple decades off, spent fuel storage policy is killing rapid expansion. Wind is nearing its peak due to grid requirements. Solar is still the energy of the future, though the nanosolar concept is promising. The condition of the aging national grid and natural gas infrastructure will dictate what and where, until something real happens.

To make any major progress, somebody is going to have to take a get-real pill. Hydrogen as energy storage will suck hind teat because it would require actual forethought to include it in the infrastructure build-out. Something we are not noted for. The defense department is not going to be much help, because their needs are NATO based. The Department of Energy is throwing fair amounts of money at the issue, though the lion's share will need to be focused on infrastructure.

Nudging things toward a more optimum direction is complicated by the spent fuel issue, infrastructure and fear of nearly everything that happens to not be "your" idea. There is absolutely nothing new here, just restating the obvious. Real change, tailored to our future needs, will require what seems to be the lost Art of Compromise.

The Political Aspect of a Hydrogen Economy

Before the exciting diversion of Fukushima, which I pronounce oddly, I was building on misunderstandings of other transportation fuel options. Natural gas makes perfectly good sense from an energy security perspective, but squat from a CO2 emission perspective.

Politically, the global warming issue is losing ground mainly because proponents suffer from the same lack of trust they toss around about individual governments. They seem to believe that one global government is a better option than a bunch of individual governments doing their own thing. Conspiracy theories are so 20th century. The reality is that a majority government by third world leaders is no better than a minority government led by first world capitalists. Bribery, corruption and stupidity are not reduced by increasing the number of players. Gridlock is, so that may be the only advantage of one world government.

Infrastructure cost is, for some odd reason, one of the political stumbling blocks for hydrogen. An initial basic hydrogen infrastructure is only about $2 billion. That is a fraction of the subsidies for wind and nuclear power, which no one complains about too loudly. That basic infrastructure is enough to start making Fuel Cell Vehicles (FCV) economically viable. If you can't fill up on the road, why buy one? The basic infrastructure would initially limit FCV use to the more densely populated areas, but everything has to start somewhere.

America has an advantage in leading the alternate transportation energy front. For whatever reason, the rest of the world covets our neat toys. Other countries bitch about our gas guzzling SUVs, but foreign sales have remained strong. As I have said before, FCV's will be big gas guzzling looking vehicles that happen to be efficient and "green". Damn the bad luck. Shouldn't we all drive Mini Coopers packed to the gills with kids to look like we care about the environment? Maybe doing something instead of appearing to, makes sense?

Anyway, is a road system full of "green" Hummers really that absurd?

Tuesday, March 22, 2011

Time to get Back to the Fun Stuff!

Now that the Fukushima situation is almost over but the clean-up, I should be able to get back to pondering our energy future. There was more radiation leakage than I initially thought, but the forty-year-old design still hung in there. Even some staunch anti-nuclear folks are re-thinking their views. How the heck could that old, tired nuke plant facility not melt down? News flash: three of them did. At Three Mile Island, about 50% of the core (the fuel) melted, and the molten fuel burned less than an inch into the roughly 4 inches of steel at the bottom of the pressure vessel. Even the old bottom control rod design doesn't make it easy for a China Syndrome moment. Newer versions of the same plant have more safety features, and they are being replaced by newer, better designs.

The whole incident has taught a lot of people a lot of lessons. Hopefully, some of the main stream media stayed awake in class. There are real issues that need to be addressed with spent fuel storage. Maybe now the congress will get off their butts and let some fuel reprocessing get started. The nuclear industry is doing a good job despite the idiots they have to deal with. Nuclear boy didn't poo.

More Radiation Stuff From Japan

First, I do have a life of sorts. My trip today was postponed, but I have been shopping, went to the laundry, hung out at the dock with buddies, went to a meeting and done a few other things today. I am fascinated with the goings on in Japan, so I have been digging up stuff instead of watching the tube.

The radiation map thing, and its thinly veiled allegation that there is a Japanese government cover-up going on, has been on my mind. I found plenty of data on radiation readings, including fallout for iodine (I-131) and cesium (Cs-137). The readings are of course in new units, which means more calculations to see what they mean.

These two radioactive isotopes have half-lives: I-131 is 8 days and Cs-137 is 30.17 years, per Wikipedia. Cs-137 is the nastier of the two. This link has the fallout readings for most of Japan, but is not complete for two prefectures because of limited access following the earthquake.

Ibaraki has the highest fallout, with 85,000 I-131 and 12,000 Cs-137 recorded in MBq/km². An MBq is a megabecquerel and km² is kilometers squared. The IAEA report is in meters squared, and one km² equals 1,000,000 m². That makes the conversion easy, since MBq/km² is equal to Bq/m². But the IAEA report is in MBq/m², so we have to divide the 12,000 by a million, which gives 0.012 MBq/m².

So for the nasty Cs-137, the fallout is 0.012 MBq/m². There is also the I-131, which would be about 0.085 MBq/m², for a total of about 0.1 MBq/m² for simplicity. Now what does that mean? A Bq is one pop, or nucleus decay, per second. That really does not mean a lot until it is converted to sieverts, the measure of damaging absorbed radiation. Since I was not there when the IAEA guys did the test, I will ratio the high reading in the report, 160 microsieverts per hour at 0.9 MBq/m², to get about 17 microsieverts per hour.
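For anyone who wants to check the arithmetic, here it is laid out in Python. The 160 microsieverts per hour at 0.9 MBq/m² reference pair comes from the IAEA report mentioned above; scaling dose rate in simple proportion to deposition is strictly a back-of-envelope assumption, not health physics:

```python
# Convert the reported fallout numbers and scale them to a dose rate,
# following the same rough ratio used in the text. Treating dose rate
# as proportional to deposition is a back-of-envelope assumption.

i131_mbq_per_km2 = 85_000   # Ibaraki I-131 fallout, MBq/km^2
cs137_mbq_per_km2 = 12_000  # Ibaraki Cs-137 fallout, MBq/km^2

# 1 MBq/km^2 = 1e6 Bq / 1e6 m^2 = 1 Bq/m^2, and 1e6 Bq/m^2 = 1 MBq/m^2
i131_mbq_per_m2 = i131_mbq_per_km2 / 1e6    # ~0.085 MBq/m^2
cs137_mbq_per_m2 = cs137_mbq_per_km2 / 1e6  # 0.012 MBq/m^2
total = i131_mbq_per_m2 + cs137_mbq_per_m2  # ~0.1 MBq/m^2

usv_per_hr = total * (160 / 0.9)  # scale by the IAEA reference pair
print(f"total deposition ~{total:.3f} MBq/m^2 -> ~{usv_per_hr:.0f} uSv/h")
```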

Now if that wasn't complicated enough, I-131 has a half-life of 8 days, so every 8 days the pops drop by half: roughly 80,000 today, 40,000 in 8 days, 20,000 in 16 days and 10,000 in 24 days. So in a couple of months, only Cs-137 is really a concern. Then the fallout radiation drops to about 1.7 microsieverts per hour. That would be about 14,900 microsieverts per year. Unless you happen to be a Brazilian airline pilot, that is high (see the chart and radiation readings on this link for Ibaraki).
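Half-life arithmetic is easy to script. This sketch uses the half-lives and the rounded starting activity from the text:

```python
# Half-life decay: activity halves every half-life.

def activity(initial, half_life_days, elapsed_days):
    return initial * 0.5 ** (elapsed_days / half_life_days)

i131_start = 80_000  # rounded starting figure from the text
for days in (0, 8, 16, 24):
    print(f"day {days:>2}: I-131 activity ~{activity(i131_start, 8, days):,.0f}")

# Cs-137 (half-life 30.17 years) barely budges over the same window:
print(f"Cs-137 left after 60 days: {100 * 0.5 ** (60 / (30.17 * 365.25)):.1f}%")
```

Two months in, the I-131 is down by a factor of about 200 while the Cs-137 has hardly moved, which is why the long-lived isotope dominates the long-term picture.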

That was simple enough, but wait! There is more! From Wikipedia: "Health officials have predicted that over the next 70 years there will be a 2% increase in cancer rates in much of the population which was exposed to the 5–12 EBq (depending on source) of radioactive contamination released from the reactor." (Re: Chernobyl). So what the hell is an EBq? That is an exabecquerel, which is 10^18 becquerels, or a trillion (10^12) MBq! So if there is a 2% increase in cancer rates for a 12 EBq total release, what is the increase for 0.1 MBq/m²? Don't know. It doesn't work like a linear relationship. That is why we underpay the scientist types and have to trust our governments. I doubt that anyone on CNN, FOX or the BBC has a clue either! I wouldn't drink any milk or eat any leafy green veggies from that area until next year. Once the rain washes the Cs-137 into the soil, things will be different. Stay tuned!
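Since the SI prefixes are where coverage of these numbers usually goes off the rails, here is the prefix arithmetic alone, with no health claims attached:

```python
# SI prefix arithmetic only. An exabecquerel (EBq) is 10^18 becquerels;
# a megabecquerel (MBq) is 10^6 becquerels.
EBQ_IN_BQ = 1e18
MBQ_IN_BQ = 1e6

print(f"1 EBq  = {EBQ_IN_BQ / MBQ_IN_BQ:.0e} MBq")       # a trillion MBq
print(f"12 EBq = {12 * EBQ_IN_BQ / MBQ_IN_BQ:.0e} MBq")  # Chernobyl-scale release
```

Note the comparison in the text is apples-to-oranges anyway: an EBq figure is a total release, while MBq/m² is deposition per square meter.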

More Main Stream Media Fun



Luckily, lawyers can find work as weather experts.



They are thinking about changing the name to Club Meltdown. Oh, let's not forget Sky News.

So How is the World Press Doing?

I have harped on trustworthiness and on clearly communicating the information people in any disaster need to make informed decisions. Several Japanese media outlets have expressed some aggravation at the way the foreign press has fanned the flames of fear with sensationalized and inaccurate reporting.

From the Japan Times:

" According to another "fact," authorities have been warning those in a position to leave Tokyo to flee the city immediately, because another severe quake or an eruption at Mount Fuji could spark a meltdown at the "Shibuya Eggman nuclear reactor" — which in reality is a live house, or concert hall, in Tokyo."

And from the Japan Probe:

"Hikaru added: ‘There is a lot of anger in Japan against the French. We are not particularly close to that country and their officials have been coming out very publicly and accusing the Japanese of mishandling the power-plant disaster.

‘How dare the French attack us when they are always the first to collapse and run away at the first sign of any trouble?’"

And recently on all the major US main stream media outlets there has been the report of radiation levels 1,600 times normal 12 miles away from Fukushima. The International Atomic Energy Agency report did indeed state that the measured combined beta/gamma radiation was 2 to 161 microsieverts per hour, compared to a background radiation of 0.1 microsieverts per hour. That is a high dose rate. It is cause for concern. It is not a cause for panic.

For emergency workers at the plant, 100,000 microsieverts (100 millisieverts) is the maximum dosage recommended before they are removed from the site for precautionary treatment, with stable iodine for example. Younger people, who would be more susceptible to radiation exposure, should closely monitor reliable official announcements to find out if they should evacuate the area and where would be best to relocate. If that level is sustained, those evacuation notices will be announced. At the maximum tested radiation rate, residents would have nearly 13 days to determine where and if they need to relocate before they were exposed to one half the limit of plant emergency workers. That has not been reported in the press, nor has the fact that the readings will be monitored to see whether they are sustained or a short-term spike that may not require any action.
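The 13-day figure checks out with simple division. A sketch, taking the maximum measured rate at face value and assuming someone stands in that exact spot continuously, with no decay and no spikes:

```python
# How long until a resident standing in the 161 uSv/h spot accumulates
# half of the 100,000 uSv emergency-worker limit? Assumes the person
# stays put and the rate holds steady -- a deliberate simplification.

max_rate_usv_per_hr = 161
worker_limit_usv = 100_000

hours = (worker_limit_usv / 2) / max_rate_usv_per_hr
print(f"{hours:.0f} hours, or about {hours / 24:.0f} days")
```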

The press has made note of past situations where the electric utility company, TEPCO, has been reprimanded for failures to perform various tests of equipment. The press has not, to my knowledge, made attempts to determine the seriousness of those issues nor any corrective action taken on those past irregularities.

The US announcement that US citizens should evacuate from a 50-mile radius of the incident has also been sensationalized. Evacuation of non-essential personnel from an emergency area is common practice. Staged evacuation greatly simplifies the overall evacuation should it be required.

The Japanese people are industrious and intelligent. They have a reliable government, worthy of trust to do the right thing in this situation. Under extremely difficult conditions, they are doing an admirable job that should get much more focus in the foreign press.

Just my thoughts.

Monday, March 21, 2011

More on Radiation Dosage

I just borrowed a neat radiation dosage chart that has just about everything you ever wanted to know about radiation on it. One problem, to read the chart I have to write some nonsense to get the chart below the advertising junk that no one cares about. So I am going to insert some line breaks to get that crap out of the way.

I will also stick in a silly picture to fill the gap: some Blues Brothers imitators. They were good, by the way.


This is a link for the current radiation doses in areas of Japan. The readings are in nGy/h, or nanograys per hour. One microrad (10⁻⁶ rad) equals 10 nanograys, so 100 rads equal one gray. The conversion to sieverts depends on the type of radiation, but 1 sievert is approximately equal to 100 rems. Or if you like, 1 nanogray per hour is approximately equal to 0.001 microsieverts per hour. Hmm? I wonder if the map maker is anti-nuke?
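The whole unit chain fits in one small function. The quality factor of 1 (so 1 Gy ≈ 1 Sv) is only a reasonable assumption for beta/gamma radiation; it would not hold for alpha or neutron radiation:

```python
# Unit chain from the map readings (nGy/h) to the more familiar uSv/h.
# Quality factor ~1 assumes beta/gamma radiation.

def ngy_per_hr_to_usv_per_hr(ngy_per_hr, quality_factor=1.0):
    gy_per_hr = ngy_per_hr * 1e-9           # nano -> base units (gray)
    sv_per_hr = gy_per_hr * quality_factor  # Gy -> Sv for beta/gamma
    return sv_per_hr * 1e6                  # Sv -> microsieverts

# A reading of 50 nGy/h works out to 0.05 uSv/h:
print(ngy_per_hr_to_usv_per_hr(50))
```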

Original link for this image xkcd.com

Xkcd.com is a very neat site you should visit. It has some great cartoons. They recently posted this chart of radiation doses, which I have found at several blogs I have visited. Please visit their site often so they don't think I am some kind of graphics stealer, which I am in this case, but I am trying very hard not to tick them off too much. This is, though, about the coolest radiation chart I have ever seen!

A Renewed Interest in Pool-Type Reactors?

One man's trash is another man's treasure. Pool-type reactors are used in training and in rare cases, hot water production. Spent Fuel Storage Pools (SFSP) are not pool-type reactors because the storage racks and boric acid added to the water are used to prevent reaction. The Chinese are looking to use SFSP's as a solution to their nuclear waste storage issues.

Light water reactors create neutron poisons, isotopes that capture neutrons, limiting the chain reaction in the reactor. This is one of the safety factors of LWR technology and also one of its problems: it generates more waste that has to be reprocessed or stored. Nuclear non-proliferation places limits on reprocessing in the United States, which adds to the waste storage problem. Most nuclear waste is not really waste; it is wasted energy, because of political issues.

Spent fuel still generates decay heat and can still sustain a less efficient chain reaction. The Chinese idea is to use the energy in the spent fuel as it decays to produce hot water for district heating and desalination plants. Spent fuel ponds can also be used for hydrogen production.

The boric acid added to the SFSP increases the production of hydrogen from near zero for pure water to usable quantities as the percentage of boric acid increases. Different impurities can be added to the water to increase hydrogen production.

Eventually, spent fuel will either be reprocessed into new fuel or added to Generation IV reactors for more complete burn up. Until then, spent fuel trash may be a treasure.

Sunday, March 20, 2011

Concerns for US Nuclear Power Post Fukushima

The Fukushima events have revealed a number of legitimate questions about the safety of nuclear power in the United States. A thorough review of safety here at home is warranted and has commenced. While I am a nobody, here are my major concerns.

1. Pressure relief venting. The design of the GE Mark I initially allowed for venting into the exterior containment buildings. GE recommended a modification to reroute relief pressure in excess of the suppression pool capacity, which appears not to have been implemented at Fukushima. That omission caused the explosions damaging several buildings and also appears to have caused damage to one of the reactors' main containment suppression pool structures.

2. Spent Fuel Storage Pool design may have caused a greater than expected release of radiation. One of the primary reasons appears to be the design of the SFSP weir gate, which allowed more water leakage than expected, or make-up water loss for a greater than expected duration. Make-up water is a critical consideration that appears to have been compromised.

3. The design of the racks used in the SFSP and the spent fuel loading configuration needs to be verified. While considered an unlikely possibility, oxidation of the spent fuel rod cladding can occur and possibly spread if adequate dry pool passive ventilation is not available. The exact conditions at Fukushima have never been analyzed, only worst cases of totally dry pool conditions. The situation provides a unique opportunity to study mid-range situations assumed to be less critical than worst-case situations.

4. Due to tsunami destruction, offsite response equipment was greatly delayed and onsite equipment damage was greater than expected. Availability of offsite equipment was reduced by the massive destruction caused by the combined earthquake and tsunami. Assigning proper priority of the available equipment to the degree of risk appears to have been in question. Emergency response planning needs to be reviewed.

Those are the major considerations. "Real" risk in terms of released radiation needs to be clarified. Decisions by the plant control operators appear to have been influenced by less likely risks to the public, resulting in greater overall risk. Plant designs allow for human error, but the impact of human error can be reduced through better training.

Saturday, March 19, 2011

What Nuclear Power Designs Should be in Our Future?

With the Fukushima crisis, a number of questions arise about what kind of nuclear energy, if any, should be in our future.

If, is a decision for the public. Nuclear agencies are tasked with safety standards that require the nuclear industry's designs and procedures to anticipate a huge variety of combinations of potential events and responses to those events. Public perception of the competence of these agencies to do that job is very important when making that decision. In our democracy, the opinion of all individuals is considered and open for the public to review.

A number of opinions on the safety of nuclear plant design, spent fuel storage and other elements have been raised; the NRC has addressed these, and they are now open to more rigorous review. I want to touch on a few of those before discussing my view on plant design.

First is partial melt down. Melt down is not a very good term in my opinion; fuel damage is more accurate. The first and second levels of containment are the fuel (pellets in this case) and the cladding that houses the fuel pellets. The cladding is made of a material called Zircaloy. It is a zirconium metal alloy that has different material properties than the fuel pellets. Zircaloy has a melting point of ~1,820 degrees C, which is lower than the melting point of the fuel, oxides of uranium and in one case plutonium in Mixed Oxide (MOX), which is ~2,700 degrees C (http://www.ornl.gov/~webworks/cpr/v823/rpt/109264.pdf). Zircaloy can also burn at temperatures between 900 C and 1,200 C (corrected from an earlier 1,000 C), depending on several factors. The fuel rod(s) are damaged at a temperature much lower than that of the fuel.

Reactors are designed for worst-case scenarios. This worst case is where the reaction was not stopped by the control rods and cooling was lost. A complete melt of the fuel is assumed in this case, and the third level of containment, the reactor pressure vessel, and the fourth level, the reactor containment building, are designed based on this situation.

From the NRC design requirements:

"Criterion 50--Containment design basis. The reactor containment structure, including access openings, penetrations, and the containment heat removal system shall be designed so that the containment structure and its internal compartments can accommodate, without exceeding the design leakage rate and with sufficient margin, the calculated pressure and temperature conditions resulting from any loss-of-coolant accident. This margin shall reflect consideration of (1) the effects of potential energy sources which have not been included in the determination of the peak conditions, such as energy in steam generators and as required by § 50.44 energy from metal-water and other chemical reactions that may result from degradation but not total failure of emergency core cooling functioning, (2) the limited experience and experimental data available for defining accident phenomena and containment responses, and (3) the conservatism of the calculational model and input parameters."

The clause "degradation but not total failure of emergency core cooling functioning" can be interpreted as not considering a total cooling loss, which should be reviewed. I have a request for clarification from the designer, General Electric, for the design as modified at Fukushima. One design modification I would recommend relates to the proper handling of the steam produced by a loss of cooling. The Fukushima Mark I design has a pressure suppression system that condenses relief steam in the suppression pool within the containment building. An external condensation pool, recommended by the original designer, should be considered in all installations. Improperly timed and diverted steam relief appears to have contributed to the current difficulties at Fukushima.

Second, the design of the Spent Fuel Storage Pools should allow for total loss of mechanical cooling, including total loss of water in the pool. Questions about the possibility of renewed criticality and Zircaloy cladding fire potential in the SFSP have been raised. I have reviewed SFSP designs, which appear to be adequate, with a reasonable increase in radioactive material release anticipated for the short term. If the pools are dry, that greatly complicates regaining control of the SFSP radiation levels. In a passive air cooling situation, the spent fuel will become hotter than the boiling point of water. Adding water will produce steam, which will carry radiation out of the pool. Some believe that this may also contribute to cladding oxidation and hydrogen production. Recovery from a dry SFSP is not clearly addressed in the literature I have located so far.

In a loss of cooling, make-up water is designed to maintain radiation containment. The original design allows for boiling of the water in the pool at maximum decay heat. In general situations, the total decay heat of a normal SFSP spent fuel load would warm the water, causing increased surface evaporation but not boiling. (As stated by a TEPCO representative.) The make-up water should be provided by hard-piped make-up water systems. Should that system fail, water from tankers or other emergency back-up sources can be provided. I have requested more information from the manufacturer to better explain design considerations for this worst-case scenario.

Added: I don't want the link police after me so here, http://www.osti.gov/bridge/purl.cover.jsp?purl=/6135335-5voofL/

Speaking of link police, I made a comment on another blog that the SFSP do not need cooling. They of course attacked me as an uninformed idiot. Well, SFSPs do not require mechanical cooling; evaporation provides all the cooling they need for safety (the water could boil, so that is a safety risk if you are around the pool). They do require make-up water. A hot SFSP can go about 4 days with no cooling and no make-up water. A hot pool is one with all fresh rods with less than 96 days of decay. The pool at Fukushima unit 4 was hotter (decay heat wise) than the rest, but far from a "hot" pool.
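The "about 4 days" figure depends entirely on the pool's decay heat and water inventory. Here is the shape of the energy balance with assumed round numbers for a hot pool; every figure below is for illustration only, not actual plant data:

```python
# Rough energy balance for a spent fuel pool losing all cooling and
# make-up water: heat the water to boiling, then boil it all away.
# Every number here is an assumed round figure, not Fukushima data.

decay_heat_mw = 10.0   # assumed decay heat for a "hot" pool of fresh rods, MW
water_m3 = 1_400.0     # assumed water volume in the pool, m^3
start_temp_c = 40.0    # assumed starting temperature, deg C

heat_to_boil_j = water_m3 * 1000 * 4186 * (100 - start_temp_c)  # sensible heat
heat_to_evaporate_j = water_m3 * 1000 * 2.26e6                  # latent heat

seconds = (heat_to_boil_j + heat_to_evaporate_j) / (decay_heat_mw * 1e6)
print(f"~{seconds / 86400:.1f} days to boil the pool dry, under these assumptions")
```

With these made-up inputs the answer lands around 4 days; a pool with older, cooler fuel (a fraction of that decay heat) would last proportionally longer.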

Small modular reactors like the NuScale modular designs better address fuel damage, which is an important economic and public health consideration for future nuclear power plants. The spent fuel storage issue, both SFSP and dry cask storage, is an important issue to be settled before nuclear power in the United States is expanded.

While I have determined that these issues are adequately addressed (contingent on expanded Dry Cask Storage), they are concerns worthy of much more intense scrutiny. I will try to provide more information on the important and complex issues of worst-case nuclear power plant design and waste storage.

Small Modular Reactor (SMR) designs include a variety of reactor technologies. Light Water Reactor design is the basis for the NuScale modulars, which appear most likely to be available for near-term deployment. In my opinion, their design is very similar to US Navy designs, which have a proven safety and reliability record. There are people who disagree and would prefer to wait for the increased safety of Generation IV designs. I totally agree that Generation IV designs, especially Molten Salt Reactors, are inherently safer, but the current generation of reactor designs offers adequate safety for responsible expansion of nuclear power. I will try to get an opposing view to expand the quality of the discussion.

Friday, March 18, 2011

The Fantastical World of The Hypothetical

I spent way too much time debating the nuclear situation with real scientists the past few days, mainly the climate science guys and their followers. The only reason I am involved in the climate science debate is that I really do not trust a lot of their statistical methodology. The nuclear debate was illuminating.

Probability is an important tool for scientists. Nuclear energy would not be possible without probability and statistics. Significant probabilities are the things lurking in the hypothetical world that need to be discovered. Possibility is meaningless in science. Solid statistical probability is the pooh.

Events with probabilities approaching zero were being contemplated as potential reality by scientists who should know better. Not all by any means, but a few. That is the real world of climate science. A few, unfortunately more vocal, members of the climate science community are lost in the fantastical world of the hypothetical. They honestly believe they have discovered the climatological equivalent of perpetual motion. Trees located so that they magically represent the sweet spot of climate teleconnection. Novel statistical methods to manufacture missing data to incredible certainty. The mystical insight to prove that the Japanese earthquake was caused by climate change. Amazing feats of the metaphysical!

Some of the players were realists. One, Eli Rabbet, surprised me with a display of common sense and realism that is truly worthy of an engineer. Another, James Annan, living in Japan, was the purest model of pragmatism. Those are two useful isms.

Japanese Nuclear Crisis - Radiation Impact

The effects of radiation exposure are an unpredictable puzzle, both scary and fascinating. What I think of as weird contradictions are well documented.

In 2008, about 20 years after Three Mile Island, a thyroid cancer study found that cases were much lower in the county containing the reactor than in the surrounding counties. A study of thyroid cancer following Chernobyl found a much greater number of cases. In the Chernobyl area, contaminated milk produced by cows eating contaminated grass was found to be the main source of the radiation causing thyroid cancer. The TMI study did not go into the specifics of the source or the reason for the reduction in cases. Heavier isotopes were dispersed at Chernobyl, while lighter tritium was the main material released at TMI. The difference in the type of radioactive material explains most of the differences, but why tritium would seem to act as a vaccination against thyroid cancer is not completely clear to me. It makes some sense, but the radiation dose at TMI was so small that I am surprised any significant result was found. Perhaps that is an endorsement of homeopathic medicine?

The recommended range of exposure limits is much wider than I would expect. The NRC's recommended maximum public exposure is much lower because of unknown factors. Limits for workers in the nuclear industry are 50 times greater than the public exposure limit. The acute dose where the health impact is not significant is 100 times the public exposure limit, and the acute dose with a 50% chance of being lethal is 500 times the public limit. A reasonable guideline for the Japanese situation would seem to be the public limit as a minimum, with the maximum acute dose as a maximum. Reporting the range with its uncertainty would be more informative for the public than just the low end, yet the news media rarely report the range and uncertainty properly, in my opinion. If people have better information they can make better personal choices. Children and pregnant women are at the highest risk, so they should definitely take greater precautions. The government announcements do not clearly define the risk per age, dose and radiation type, to avoid unneeded panic. Honesty in reporting uncertainty would be important to me.
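The limit ratios above can be sketched in a few lines. This is only an illustration of the multipliers as stated in this post; the assumed base public limit of 100 millirem per year is my own round number, not a quoted NRC figure.

```python
# Illustrative sketch of the exposure-limit ratios described above.
# The base public limit is an assumed round number (100 mrem/year);
# the multipliers are the rough ratios quoted in the post.

PUBLIC_LIMIT_MREM = 100  # assumed annual public dose limit, in millirem

LIMIT_MULTIPLIERS = {
    "occupational_worker": 50,          # ~50x the public limit
    "acute_no_significant_impact": 100,  # acute dose with no significant impact
    "acute_50pct_lethal": 500,           # acute dose with ~50% lethality
}

def limits_in_mrem(base=PUBLIC_LIMIT_MREM):
    """Return each named limit as an absolute dose in millirem."""
    return {name: base * mult for name, mult in LIMIT_MULTIPLIERS.items()}

print(limits_in_mrem())
# worker limit -> 5000 mrem; acute thresholds -> 10000 and 50000 mrem
```

Laying the numbers out this way makes the point about ranges: the span between "minimum limit" and "maximum acute dose" covers a factor of 500.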

My assessment of the value of nuclear energy is based on a comparison of the TMI and Chernobyl accidents. There are huge differences between the two cases; design, isotopes released, fallout distribution and government trustworthiness are some of the main factors I considered. Democratic governments, for all their faults, are much more trustworthy than any other form of government. That is a very important factor, because you can't make good decisions without good data.

The location, prevailing wind and the apparent blend of isotopes indicate that the Japanese crisis will be more similar to TMI. Reactor design is the main reason the impact will be much lower than indicated in the media. There are several points where the radioactive release could have been reduced, but it is hard at this moment to determine how much design changes would have affected the outcome.

I hope everyone will keep an open mind, take reasonable precautions and not spread too much incorrect information. Human nature means people will exaggerate the situation, so bear that in mind. Considering the magnitude of the events, the design and the people involved performed admirably. But that is just my opinion. Unfortunately, it will take time to know the whole story.

My prayers are with the nation of Japan.

Thursday, March 17, 2011

The Japanese Nuclear Crisis

While tragic in many ways, the Japanese nuclear crisis is interesting in even more ways. Nuclear reactor designs have to meet criteria that are often unfathomable: events with probabilities so low they would normally be insignificant still have to be considered. Since few things in life have zero probability, the design elements of nuclear reactors and waste containment are causing considerable panic.

Designers also have to account for one element that is extremely difficult to predict: the impact of heroic measures. Emergency workers are valiantly risking their lives to control the situation. In this case, the result has been well-intentioned but not very prudent actions.

The type of reactors involved have design features to contain and minimize dangers. Once a significant amount of damage is done to the fuel (a partial meltdown), the design of the pressure vessel and the concrete containment building should handle the maximum amount of molten radioactive fuel possible. The possibility of all the fuel melting, then all the molten fuel melting through the metal pressure vessel, then all of that fuel remaining molten as it falls to the concrete floor of the containment building dry well, is extremely remote. That is the basis of the design: an extremely remote possibility.

Should the emergency actions cause water to be introduced into the "dry" well, the possibility increases of a steam explosion that could spread the heavier, more dangerous radioactive elements. The Japanese military is dumping sea water on the concrete containment buildings, which increases the potential for disaster rather than decreasing it.

The shape, weight and design of the concrete plug at the top of the containment building take even this remote possibility into consideration. The plug is heavy enough not to move out of position until the pressure in the containment building is high enough to potentially damage the building itself. If the plug is displaced by a major steam explosion, the shape and size of the containment structure relative to the plug location are designed to let the pressure from the explosion be relieved without destroying the structure. More water than normal can create a jet of steam from the containment building until either the water evaporates or the fuel is sufficiently cooled. While the jet is evidence of some cooling of the fuel, it is also evidence of more radioactivity being spread. Health-wise, it is best not to dump sea water on the reactor.

The spent fuel cooling pools are another overly complicated situation. There is a remote possibility that the spent fuel can return to criticality, and a remote possibility that the Zircaloy cladding of the spent fuel rods can "burn" or oxidize. Both of these are extremely remote possibilities, and they should be reported in terms the general public can relate to. The criticality probability is equivalent to winning the Irish sweepstakes, being struck by lightning a couple of times, and then still being able to spend the money without your ex-wife getting half of it. The probability of a cladding fire is much greater; while I don't have all the information needed to give a more exact estimate, it is more like winning the Irish sweepstakes and not having to give your ex-wife half.

There is also a huge misunderstanding about how much and what kind of radiation is a problem. Too much too fast is without a doubt a problem, but how much is too much too fast? A good guideline is 5,000 millirads per year, or 10,000 millirads in rapid doses. Below those numbers there is no measurable increase in cancer risk, per the NRC fact sheet on average dosage. There is a large gray area between 10,000 millirads and 50,000 millirads, the point where significant risk is verified. Even the significance of that risk depends on the individual.
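The guideline numbers above break acute doses into three rough bands. A minimal sketch, using exactly the thresholds quoted in this post (these are the blog's approximations, not medical advice):

```python
def classify_acute_dose(millirads):
    """Rough risk band for a rapid dose, using this post's guideline
    numbers: below 10,000 millirads no measurable risk increase, a gray
    area up to 50,000, and verified significant risk above that."""
    if millirads < 10_000:
        return "no measurable increase in cancer risk"
    elif millirads < 50_000:
        return "gray area: risk possible but not verified"
    else:
        return "significant risk, severity depends on the individual"

# A dose well inside the first band:
print(classify_acute_dose(5_000))   # no measurable increase in cancer risk
```

Reporting which band a measured dose falls in, rather than a single scary number, is the kind of range-with-uncertainty framing the media rarely uses.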

So while the situation is tragic, it is also a good learning experience.

Monday, March 14, 2011

How Great an Idea is Natural Gas Powered Vehicles?

Since the main focus of this blog is supposed to be hydrogen in our energy economy, maybe I should write something about hydrogen. A commenter mentioned that hydrogen is a store of energy, not a source of energy. He is right, and I mention that all the time. Natural gas, on the other hand, is an energy source. It is being pushed pretty hard because it produces less carbon than coal, is available in abundance and is pretty cheap. Like hydrogen, natgas has to be compressed or liquefied to be a portable energy source for trucks and such. Compressed is easier, but takes up space, just like hydrogen, so bigger vehicles are better suited for compressed natgas.

Some people think that switching to natgas for your truck or bus will cut down on carbon emissions. Carbon is all the rage of course, so how true is that thought? Hmmm?

Here is a link to the Engineering Toolbox. I am going to borrow a table from their site because I am lazy today. I am giving them a link and talking them up, but I really should ask permission. Hopefully, they won't get too hacked off.


[Table from the Engineering Toolbox: fuel properties, including specific CO2 emission per unit of energy]

1) Commonly viewed as a Bio fuel

2) Bio Energy is produced from biomass derived from any renewable organic plant, including

dedicated energy crops and trees
agricultural food and feed crops
agricultural crop wastes and residues
wood wastes
aquatic plants
animal wastes
municipal wastes and other waste materials

I will touch up the table later, but notice the specific emission of CO2 column (the last column on the bizarro chart). Natural gas is 0.23 and diesel is 0.24. Yep, that is right, natgas produces a whopping 0.01 kilogram less CO2 per kilowatt-hour of energy! So do you think putting a million hydrogen fuel cell vehicles on the road is the same as converting a million semis to natgas? Not even close; hydrogen rules!
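The arithmetic behind that "whopping" savings is worth spelling out. A back-of-the-envelope sketch using the two emission factors read from the borrowed table (the 500,000 kWh/year truck fuel use is an invented figure purely for illustration):

```python
# Specific CO2 emission per unit of fuel energy, kg CO2 per kWh,
# as read from the Engineering Toolbox table quoted above.
FUEL_KG_CO2_PER_KWH = {"natural_gas": 0.23, "diesel": 0.24}

def annual_savings_kg(kwh_per_year, old="diesel", new="natural_gas"):
    """CO2 saved per year by switching fuels, given yearly fuel energy use."""
    delta = FUEL_KG_CO2_PER_KWH[old] - FUEL_KG_CO2_PER_KWH[new]
    return delta * kwh_per_year

# A semi burning (say) 500,000 kWh of fuel energy a year saves only about
# 5,000 kg of CO2, roughly 4% of its total emissions.
print(annual_savings_kg(500_000))
```

In other words, the conversion trims emissions by about 0.01/0.24, around four percent, which is why it is nothing like taking the vehicle's carbon off the road.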

The table, now that it isn't so bizarre, has very useful information. I need to double-check, but the values seem accurate. The Engineering Toolbox site is very informative if you have questions or want to double-check my stuff. There are a few guesstimates in previous posts that I should tighten up.

What Happened at the Japanese Nuclear Reactors?

UPDATE: Thanks to Eli the Wacky Wabbit, here is a link with more complete information. It has a few minor corrections to what I have here. Some of the very short-lived isotopes I did not mention are included, and he neatly lists cesium and iodine and why they are a minor worry. Also, the portable generators appear to have arrived in time; they just didn't have the right plug to hook up the power! I need to double-check that, because it should only take minutes to splice in directly.

The earthquake and tsunami that struck Japan is a tragic combination of natural disasters. I feel deeply for the Japanese people. On top of the huge destruction by natural forces is the nuclear incident.

As far as worst case scenarios go, the Fukushima reactor is about as close as you get. While accounts are still sketchy, it appears that the plant went through its standard "scram" or shutdown for the situation. The control rods were positioned to stop the reaction. While the reaction was stopped, the reactor core was still hot, so cooling water had to be pumped through the reactor core until the temperature dropped, ideally to below 100 degrees C, but below the temperature where the core is damaged will do. Since the tsunami appears to have flooded the backup diesel generators, the plant only had 4 hours of battery operation for emergency cooling. Because of all the other emergencies caused by the combination earthquake and tsunami, portable cooling equipment was not brought to the reactor in time. Without cooling, the reactor core experienced fuel damage, which is a partial meltdown.

Engineers are not really sick individuals, but incidents like this are learning experiences. For a truly walk-away, passively safe design, there would have been no damage to the fuel in this case. Terms have different meanings, so "passively safe" can mean no significant release of ionizing radiation, meaning no public health disaster. Whether I agree with that meaning of the term or not is immaterial; the plant designers have to explain their meaning to the buyers, who have to explain it to the public. The circumstances of the incident do qualify it as an extremely unlikely event, or combination of events. As such, the final chain of safeties, the containment, is designed to prevent significant release of radiation.

The pressure vessel containing the reactor core is the first level of containment. Holding pressure is the vessel's first job; it is designed to hold much more than the operating pressure of the system. Three times the operating pressure is a good rating, with an average burst pressure about twice that. This is just my recollection of design criteria, not the actual design of this plant. Pressure relief valves have different settings, normally starting at ten percent over operating pressure. So this plant, with an operating pressure of 1000 psi, would have its first relief valve at 1100 psi, a 3000 psi pressure vessel rating and an average burst rating of 6000 psi. Since there was an apparent steam explosion, it probably happened outside the main containment building. This is often a design feature, so that an explosion in a worst case scenario causes the least possible damage; the worst damage here would be a major release of radioactive material.
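Those design points all fall out of the operating pressure by simple ratios. A quick sketch using this post's recollected rules of thumb (1.1x for the first relief valve, 3x for the vessel rating, roughly 2x the rating for average burst); these are not the actual plant's figures:

```python
def pressure_design_points(operating_psi):
    """Rough pressure-vessel design points from recollected rules of
    thumb: relief at 110% of operating, rating at 3x, burst near 2x rating."""
    rating = operating_psi * 3
    return {
        "first_relief_valve": operating_psi * 1.10,
        "vessel_rating": rating,
        "average_burst": rating * 2,
    }

print(pressure_design_points(1000))
# {'first_relief_valve': 1100.0, 'vessel_rating': 3000, 'average_burst': 6000}
```

The wide margin between the relief setting (1100 psi) and average burst (6000 psi) is the point: relief valves act long before the vessel is anywhere near failure.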

The containment building is the second level of containment. It is the reinforced concrete building that houses the reactor; three feet or more of reinforced concrete is a nominal design. While a containment building may be designed to contain some pressure, its main job is to contain heat and radiation. The floor of the containment building is the thickest construction, to contain molten radioactive fuel in the unlikely event that it burns through the pressure vessel.

With this design, the radiation most likely to be released is tritium, a radioactive form of hydrogen. Tritium forms during operation when water, the moderator of the reaction, captures two neutrons. Thanks to fairly complex statistics, the amount of tritium formed in the reactor cooling water is a small percentage of the normal water and heavy water (also formed in operation, and a stable isotope) in the cooling system. So the total amount of harmful tritium that can be released is small. Tritium occurs in nature and we are exposed to it all the time; the concentration of tritium is the factor of concern. Maximum allowed tritium releases are normally a fraction, roughly a quarter, of the tritium that may be found in drinking water in some locations. It is naturally occurring, so zero exposure is impossible even without a nuclear reactor. The released tritium, as a gas, is spread over a wide area. This appears to be a problem, but the wide area decreases the tritium concentration to safe, often unmeasurable, levels, rendering it harmless. It is a popular misconception that widespread release of radioactivity is always a bad thing; dispersion is actually an important natural safety factor.

The absolute worst case scenario is a complete burn-through, which should not be confused with a meltdown. A burn-through is when the nuclear material melts down, then burns through the pressure vessel and the containment building. Should this happen, the molten radioactive material can cause a steam explosion with ground water, releasing the more toxic, heavier radioactive elements. For this to happen, an extremely unlikely sequence of events would have to occur. First, enough of the fuel would have to melt, pretty much all of it, then it would have to burn through the bottom of the pressure vessel. Again, nearly all of the fuel would have to drop through the bottom of the pressure vessel, then melt through the concrete floor of the containment building. The thermal mass of the containment building is figured into the design to cool several times the amount of "possible" molten fuel that may fall out of the pressure vessel. Then enough molten fuel would have to strike enough ground water to cause an explosion of enough force to blow a path out of the containment building and allow the spread of the heavier radioactive materials. Without going through all the math, the probability of this happening is about as likely as being struck by lightning every day for a couple of weeks in a row.
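The "math" alluded to above is just multiplication: each link in the chain has to fail, so the small probabilities compound. The per-step numbers below are invented placeholders purely to show the compounding effect, not actual engineering estimates:

```python
from functools import reduce

# Each step in a burn-through must happen, conditional on the previous
# one, so the (already small) probabilities multiply together.
# These values are illustrative placeholders, NOT real failure rates.
chain = {
    "near-total fuel melt": 1e-3,
    "melt through pressure vessel bottom": 1e-2,
    "melt through containment floor": 1e-3,
    "steam explosion breaching containment": 1e-2,
}

total = reduce(lambda p, q: p * q, chain.values(), 1.0)
print(f"combined probability with these placeholders: {total:.0e}")  # 1e-10
```

Even with individually "only" one-in-a-hundred or one-in-a-thousand steps, the chain lands at one in ten billion, which is the lightning-every-day-for-weeks territory described above.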

Finally, should the nearly impossible happen, the spread of the heavy radioactive elements is greatly limited by their weight. So the vast majority of the radiation does not go very far. That is bad for the area of the site but a good thing for the local population. The area around the reactor has several rings outlining levels of safety (on a site map not painted on the floor). I will try to dig up a drawing showing regions of risk. Chernobyl, BTW, did not have a containment building, so I will try to compare the two designs.

So what should you expect the actual damages to be, in terms of human life lost due to the event? Emergency workers in the vicinity of the reactor have a large risk of being caught in a steam explosion. God love them, firefighting is a dangerous job. In an event of this type it is often best to stay away and let the design do its job; hopefully, emergency operations managers will recognize this in the future. If you exclude emergency workers, by which I am in no way implying their lives are less valuable, the loss of life should be very near zero.

The biggest lesson that should be learned from this incident is trust the design and walk away. The overall design provides for more than enough time to actually walk away to a safe distance.

One reason I promote small modular light water reactors is that the smaller amount of fuel in the reactor allows the system to be "truly" walk-away safe. While mega-scale reactors are most common for various reasons, the scale of the project increases the probability of reactor core damage. Reactor core damage turns the former containment building into a few-billion-dollar monument to poor choices. Even though that possibility is small, it is significant enough to be of concern, as this case illustrates.

When this situation plays out, try to keep an open mind about the real damages, instead of the sci-fi hype of potential catastrophe damages.

Sunday, March 13, 2011

How to Overly Complicate a Simple Problem

In the United States, energy is at the core of a variety of real and perceived problems, both now and projected into the future. Due to an amazingly wide array of political and technological limitations, a clear path to energy related solutions is obscured in noise generated by agenda driven groups. Honest risk assessment is key to resolving this overly complicated issue.

1. Reduced dependence on potentially unstable sources of energy is in the best interest of our nation. It is impossible to make a case where a reduction in the supply of foreign energy will not have an adverse impact on America.

2. Every source of energy has risks and rewards. Honest evaluation of the "real" risks related to health, welfare, environmental and financial concerns is poorly communicated. All of these risks are known; even the "unknown" events can be quantified in terms of risk. Understanding of the "real" risks is inadequate for the general population to agree on appropriate action.

3. Sustainable energy sources are defined by both availability and cumulative negative impacts. Sustainable should not be interpreted as infinite. Technological advances and scientific evaluation of impact progresses as the "unknowns" are identified. This requires flexibility and diversity of energy choices to limit negative impacts.

Based on the three facts stated above each energy source should be evaluated.

Oil - Oil is primarily used as a transportation fuel. The methods used to convert any form of oil into energy are limited by the efficiency of the process used. Internal combustion engines are approaching a plateau of maximum efficiency. Efficiency limits the reduction of negative impacts possible with mass use of oil. For example, lower efficiency leads to more rapid depletion, increased waste products and an increased probability of "unknown" impacts.

Coal - As with oil, efficiency is a major consideration. Unlike oil, the average efficiency in use is much lower than both the maximum limit of current technology and the theoretical limit of efficiency. Current technology allows efficiencies up to 80%, versus a current industry standard of 40% (see efficiency options). Environmental concerns with coal production and use are real, but highly variable. Increased conversion efficiency will reduce many of the risks, but methods of extraction and transport must also be evaluated to minimize negative impacts.

Natural Gas - Natural gas can be used to fuel a variety of processes. Efficiency considerations vary with use; utility scale power plant applications are where the highest efficiency operation is required (see efficiency options). The environmental concerns of natural gas are fairly well known.

Efficiency options: The theoretical maximum efficiency is 95% for utility scale fossil fuel power plants. While maximum efficiency is desired in most cases, there are limits and exceptions. Co-generation efficiencies of 80% are obtainable where waste heat (which can also produce chilled water or ice) can be used for heat-related processes, including building heating and cooling. In applications where waste products are included as a percentage of fuel, lower efficiencies may be desired.

Nuclear - Efficiency of conversion to energy has to be considered differently in the case of nuclear. Lower efficiency in many designs increases safety. Widespread use of nuclear energy requires lower-efficiency reactor designs for greater geographical coverage, with centralized high-efficiency reactors to process fuel from the lower-efficiency reactors in a more secure environment. As the technology is proven, a variety of passively safe designs will provide one or more alternatives to current technology. Clearly communicating the "real" risks associated with nuclear technology is much more problematic than for most energy sources. False perceptions of risk have been instilled in society through generations of science fiction and action entertainment productions that greatly overstate real risks and create unreal scenarios of mass destruction. Despite the real and the grotesquely portrayed unreal risks, the environmental and financial benefits of responsible use of nuclear energy are much greater than those of many alternatives.

Wind - Efficiency of conversion has primarily financial and environmental impacts. Environmental impacts are less likely to be completely understood because of the limited time they have been studied. Some impact on birds and bats has been noted, as has sound and light pollution (pulsating shadows of the moving blades). The degree of these impacts is not well understood. There are also instances of designs failing prematurely, blade icing posing problems, grid capacity limitations, and requirements for other energy sources to compensate for intermittent availability. Real costs have been artificially deflated with subsidies, while long term costs cannot be accurately determined. Wind energy is currently limited to no more than 20% of total production due to the intermittent nature of its source.

Solar - Efficiency plays a huge role in its viability, both in terms of energy conversion and cost of production. While technological advancement is slowing, the proper balance between product efficiency and production efficiency has not been resolved. Reduction in overall cost continues, which reduces the wisdom of major expansion in the face of a falling market. Environmental impacts appear to be low, but wide scale implementation may produce unanticipated negative impacts related to disposal, manufacturing waste and land use.

Biomass - Efficiency in terms of production, land use and end use poses a complex blend of real and potential issues. Use of commodities such as corn, soybeans and sugar subjects prices to the radical fluctuations common to the commodities markets. Diverting acreage suitable for food production to fuel production decreases the real sustainability of biomass as an energy source. Biomass using non-arable acreage shows promise, though current production is too low for a complete analysis. Environmental impact varies greatly depending on the acreage used, methods used, resources used and percentage of total energy produced.

Standard Hydroelectric - Management of water resources and flood control are the major considerations. Environmental impacts are well known and have to be weighed against the water resource and flood control needs. Energy only expansion is unlikely to be significant. Efficiency is not an issue.

Non-Standard Hydroelectric - This is a potentially huge area that has too few current applications installed for proper evaluation. Waves, currents, tides and ocean temperature differentials can be used to produce energy. Transmission distance is a current limiting factor that may be reduced by energy storage. Environmental impact has to be determined by design and method of transmission and/or storage. Efficiency, as it relates to cost, is an issue that can be simplified to the comparative cost of usable energy output.

Geo-Thermal - This area also has a wide variety of applications. Currently, environmental impacts and scale of produced usable energy are limiting factors. Smaller scale hybrid geo-thermal applications for heating and cooling are promising. Large scale electrical production is limited by geographical and technological constraints.

Other - Undiscovered energy production potential and hybridization of various technologies to better exploit waste are interesting and often overlooked potential energy sources. Normally lumped in with other energy categories, this deserves a place of its own. Everything from landfill mining, Stirling engines, super-conductive technology, hydrogen production and methane production from waste can be exciting additions to our energy portfolio.

Energy from trash (waste) combines energy production with disposal of certain types of problematic waste: petroleum coke, used tires and combustible, recyclable waste. The quality of the waste as a combustion fuel dictates the type of process used. Liquefaction of waste under heat and pressure allows easier separation of non-combustible material, such as metal tire reinforcement wires; the liquid waste fuel can then be used for a variety of purposes. When waste is used as a fuel mix in coal or natural gas fired plants, the plant design must allow for removal of non-combustibles after firing. Lower thermal efficiency is offset by the benefit of waste disposal. The environmental impact of the raw waste and of the combustion waste must be compared when evaluating these systems.

I will try to expand the detail and add links to promising new designs in the future.

Health risks - Coal, nuclear and combustible waste carry the greatest health risks of the energy sources. Each requires special considerations to reduce the introduction of harmful pollutants that can cause health problems. Ionizing radiation is a harmful by-product of all three. In nuclear applications, containment and reactor design are used to minimize the release of harmful radiation. With coal and waste fuels, dispersion minimizes the health impact, but non-radioactive pollutants are more prevalent. Pre-combustion additives can be used to chemically react with certain elements during the combustion process. Post-combustion, neutralizing chemical scrubbers, electrostatic precipitation and filtration can be used in combination to capture harmful pollutants; 100% capture is not economically feasible with current direct combustion technology. The Environmental Protection Agency (EPA) has minimum standards for all known pollutants from coal fired plants. Waste fueled (incinerator) and combined waste fueled power plants have a greater potential for unknown pollutants. Current EPA requirements appear to be more than adequate for public health protection.

CO2 - Carbon dioxide has recently been added to the EPA pollutant list. CO2 is not considered a pollutant in terms of human health, but it may lead to harmful climate disruption and ocean pH reduction in sufficiently high concentrations. The fossil fuels, oil, coal and natural gas, produce CO2 and are considered positive carbon contributors. The amount of carbon dioxide per unit of energy is a necessary consideration for all fossil fuel applications. At equal thermal efficiency, coal produces the most, followed by oil, then natural gas. A coal plant that is 60% more efficient than a natural gas plant will produce roughly the same CO2 as the natural gas plant. Integrated Gasification Combined Cycle and other experimental combustion cycles can capture carbon dioxide; if the CO2 is used for practical purposes or sequestered, the process can approach carbon neutral. CO2 scrubbing is an experimental process using salt water and lime to react with flue gas to capture CO2. Algae can also be used to scrub CO2, producing biomass as a by-product.
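The 60%-more-efficient-coal claim can be sanity-checked with a couple of lines. The natural gas thermal emission factor comes from the Engineering Toolbox table quoted earlier in this blog; the coal factor of 0.37 kg/kWh and the 40% baseline efficiency are my own assumed round numbers:

```python
# CO2 per delivered kWh = thermal emission factor / conversion efficiency.
GAS_KG_PER_KWH_TH = 0.23    # from the Engineering Toolbox table
COAL_KG_PER_KWH_TH = 0.37   # assumed round number; varies by coal type

def kg_co2_per_delivered_kwh(thermal_factor, efficiency):
    """Emissions per kWh of useful output, given plant efficiency."""
    return thermal_factor / efficiency

gas = kg_co2_per_delivered_kwh(GAS_KG_PER_KWH_TH, 0.40)          # baseline plant
coal = kg_co2_per_delivered_kwh(COAL_KG_PER_KWH_TH, 0.40 * 1.6)  # 60% more efficient

print(round(gas, 3), round(coal, 3))  # both land near 0.58 kg per kWh
```

With these assumed figures the two come out within about half a percent of each other, which matches the rough-equality claim above.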

Biomass also produces carbon dioxide, but is considered carbon neutral. Not all biomass is carbon neutral, however; only biomass derived from annual bio-crops is. Peat and mature trees are carbon positive. If the age of the biomass is equal to the atmospheric residence time of carbon dioxide, it should be considered a fossil fuel. Time to harvest is a major consideration for bio-crops.

NOTE: I HAVE NOT COMPLETELY REVIEWED THIS POST SO THERE MAY BE SOME ERRORS.

Friday, March 11, 2011

An Open Mind Doesn't Mean Letting Your Brains Leak Out

The greatest thing about being able to stand back and look objectively at things is that you learn a lot! Sticking with a single ideology is really boring. My greatest strength as a manager was actually listening to the people who worked for me. Even though many had much lower educational backgrounds, they all had things to teach. It didn't take long for me to realize that everyone has a little Einstein and a little Dilbert in them. Of course, some had an affinity for their Dilbert, but I directed them to other career paths where I am sure their opportunities to get in touch with their inner Einstein were expanded.

I have even known liberals who were capable of rational thought. The problem is that the Einstein/Dilbert ratio fluctuates in a pseudo-noise encrypted pattern. Since some of you may have forgotten your spread spectrum communications primers, pseudo-noise is a sequence of information intended to appear completely incoherent, like noise. If you know the sequence, you can decipher the message concealed in it. The actress Hedy Lamarr is credited with the concept and design of spread spectrum communications. Her first system was called frequency hopping: a message was divided among a number of different radio frequencies, so that if you knew the sequence of frequencies, you could hear the entire message by jumping to the next frequency at the right time.

The reason I propose that Einstein/Dilbert fluctuates in pseudo-noise patterns rather than truly random patterns is that a pattern appears to exist. This is fairly obvious if you discuss any meaningful but controversial subject. Knowing the person's ideological tendencies, you can anticipate, to a point, the Einstein/Dilbert responses to elements of the topic. By synchronizing your Einstein/Dilbert with the other person's, a coherent conversation can take place. As the Einstein/Dilbert ratio approaches infinity, the quality of the conversation approaches perfection. An Einstein/Dilbert ratio greater than or equal to 1 is the minimum required for normal understanding.

Improving the communication of information on blogs, for example, requires adjusting the Einstein/Dilbert responses; interjecting or removing Dilberts as required will maintain a reasonable level of conversation. The advantage of removing Dilberts is obvious: eliminate the Dilberts and there are only Einsteins. The advantage of interjecting Dilberts requires an understanding of pseudo-noise encrypted communications theory in a chaotic communications environment and the law of biased ideological phraseology. As Einsteins are limited to a small percentage of the population and Dilberts are unbounded, pure Einsteinian conversation is only theoretical.

In our next installment we will discuss the basics of Ideological Phraseology.
