Saturday, April 30, 2011

Where is the Future?

(Chart legend: Blue = trend for 1979 to 1998; Red = 1979 to 2011; Yellow = 1999 to 2011)

While still playing with the open-source charts, I made this one from the RSS global surface temperature data. There is nothing really special about RSS; it is just a smaller data file that is easier to play with. The linear regression for the three plots extends to the year 2100 (2097 actually; I set the scale so that 1998 shows up better).
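For anyone who wants to reproduce the trend lines, here is a minimal Python sketch (using numpy) of fitting a least-squares line and stretching it out to 2097. The yearly numbers below are made-up stand-ins for the RSS series, not the real data:

```python
import numpy as np

def extend_trend(years, anomalies, end_year=2097):
    """Fit a least-squares line to temperature anomalies and
    extrapolate it out to end_year (the chart's right edge)."""
    slope, intercept = np.polyfit(years, anomalies, 1)
    future = np.arange(years[0], end_year + 1)
    return future, slope * future + intercept, slope

# Made-up yearly stand-in for the RSS series: ~0.14 C/decade plus noise
years = np.arange(1979, 2012)
anoms = 0.014 * (years - 1979) + np.random.default_rng(0).normal(0, 0.1, years.size)

future, fit, slope = extend_trend(years, anoms)
print(f"trend: {slope * 10:.3f} C per decade, {fit[-1] - fit[0]:.2f} C rise by 2097")
```

Swap in the actual RSS monthly file and the three date ranges to get the blue, red and yellow lines.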

My main thing with this blog is hydrogen and alternate energy. The only reason I have any of these climate charts and other climate stuff posted is that climate change, aka global warming, is one of the reasons hydrogen as a transportation fuel is being considered. Except for saving the planet, hydrogen is pretty limited in most cases. It is costly to develop and risky to use.

More people using hydrogen drives down the cost of electrolysers, fuel cells and storage media. Without mass production, hydrogen for residential power storage is just an expensive hobby. The data in the above chart shows why entrepreneurial hydrogen use and production development is lagging. Governments can do a lot to compel citizens to use certain products. Governments, though, are inefficient. Entrepreneurs are the real drivers of any capitalistic country. The entrepreneurs that make the best products are smart enough to know when to pull the trigger. The above chart scares the alternate energy guys to death.

The chart indicates the drastic impact of human-caused global warming is greatly overestimated. The chart doesn't prove anything; it is just an indication that climate change is not an immediate concern. Without an immediate concern, the entrepreneurs that are investing their own money or reputations are not ready for the risk. There are still niche markets, but they seriously limit the potential profits. Too many players for what does not appear to be a rapidly expanding market.

Ballard Power and their partner Plug Power have a decent niche which could be somewhat rapidly expanding. The new battery cars, the Chevy Volt for example, have serious limitations. Thirty miles to a charge will not appeal to a very wide customer base. It is basically a toy for the wannabe green crowd and government motor pools. The general American public wants a car with at least a 200 mile range, a bigger body (which is assumed to be safer) and convenient refueling. Ballard Power's FCvelocity fuel cell used in material handling is a good upgrade for the Volt battery pack. Even with the cumbersome storage tanks for compressed hydrogen, a Ballard Power upgrade can deliver the 200 mile range and peppy performance. Refueling, though, is still an issue limiting the potential of the upgraded Volt.

I mentioned before that the Honda Home Hydrogen station is a natural gas reforming unit. Molten salt fuel cells are designed to run on natural gas, and natural gas and propane are much easier to find for refueling now. So why bother with the Proton Exchange Membrane (PEM) fuel cell manufactured by Ballard if you have no access to inexpensive hydrogen refueling? No reason, really.

A molten salt fuel cell for a vehicle is a little more complicated than the PEM. While the molten salt fuel cell generates electricity, it generates a lot more heat. That heat, which is at high temperature (over 500 C), would require co-generation to dissipate the heat and generate more of something while it is at it. A small gas turbine driving a generator is the most likely co-generation. Depending on the temperature range and heat available, a turbine using, say, a refrigerant for its working gas could generate electricity from the waste heat of the natural gas fuel cell. It is entirely doable, but the extra complexity means higher initial cost, more risk for the consumer and higher maintenance cost. So doable or not, it ain't likely to happen. This goes right back to the matter of scale; utilities with government subsidies may give it a go, but consumers needing a cost effective alternative will still be screwed. Without a hydrogen infrastructure, the Volt, Ballard Power and my dreams of a hydrogen RV are dead.

The Home Hydrogen Stations I was counting on could cost less than $4000 at over 100,000 units per year of production, and less than $2500 at 250,000-plus units per year. Manufacturing scale is a really big deal. Right now, you would be lucky to find a high pressure, water-splitting home unit for less than $50,000. If the greens really wanted to put their money where their mouth is, they should be more involved in strengthening this weak link.

There are a number of low pressure PEM based electrolysers that can be used, with some value added design, to make home refueling units. Besides, low pressure hydrogen production is not exactly rocket science. Dangerous Laboratory's design looks complicated, but it can be tightened up fairly easily. Also, some of the idiots playing with HHO or Brown's gas, aka oxyhydrogen, have homemade systems that could be modified to produce reasonable quality hydrogen. It is all in separating the anode gas (O2) from the cathode gas (H2). A fairly easily made plastic part could separate the gas streams. Porous nickel anodes and cathodes with an alkaline electrolyte (KOH) can produce lots of hydrogen reasonably inexpensively.

What is reasonably inexpensive depends on your power source and your needs. Wanting the highest efficiency is admirable, but getting the job done affordably is the ticket. In the island home example, forty percent efficiency is more than acceptable. After spending $10,000 for a 10 kW solar array, another $10,000 for storing energy is not a big deal. The cost per gge (gallon of gasoline equivalent) can be over $6, but in the island house case, transporting, storing and convenience are worth it when gasoline is damn near $5 a gallon at the fuel dock now. I can make the same case for the RV of the future. No way can I do it for the Tahoe FCV.

Interestingly, with the Island Home or the RV of the Future, I may be able to make a case for a FCV boat or a FCV convenience vehicle for the RV. More likely, a hydrogen fueled Internal Combustion Engine (ICE) would be better despite the lower efficiency.

So now the focus needs to be on reasonably affordable hydrogen production. Since we need pressurized hydrogen, the compressor is the one component that has to be absolutely right. I am fairly confident that the metal diaphragm pump will prove to be the best for overall safety and efficiency.

A multistage diaphragm pump doesn't have a lot of moving parts. Reed valves direct the gas flow and are not very complicated. The diaphragms are flexed by a rotating motor shaft with cam lobes. That is about it: the cam lobes push on the center of the metal diaphragm to displace the gas volume, forcing the gas out of the discharge reed valve while the intake reed is forced closed. When the diaphragm relaxes, the discharge reed closes and the lower pressure opens the intake reed valve. The discharge of a lower stage is piped to the intake of the next stage. A simple and elegant design with no piston rings or lubrication required in the gas stream. Just to make the pump last longer, Teflon coated metal diaphragms would be nice. The Teflon would absorb the energy of the tiny hydrogen molecules, reducing damage to the metal diaphragm. The cams and motor shaft can be lubricated outside of the gas stream, or oil-free shaft-to-cam surfaces can be used. A non-sparking electric motor adds to the cost, but not significantly. The motor can be either really beefy for higher flow and higher pressure, or not so beefy for low flow and medium pressure. The cost of the storage tanks is a limiting factor. 4500 to 5000 psi is currently the maximum cost effective pressure for storage, with the 3000 psi range considerably more affordable.
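To get a feel for what each stage has to do, here is a quick Python sketch of the usual equal-ratio-per-stage design rule, using the 60 psi electrolyser output and 3000 psi storage figures from this post. The three-stage choice is just an assumption for illustration:

```python
def stage_pressures(p_in, p_out, stages):
    """Discharge pressure of each stage, assuming an equal
    compression ratio per stage (the usual multistage design rule)."""
    r = (p_out / p_in) ** (1.0 / stages)
    return [p_in * r ** i for i in range(1, stages + 1)], r

# 60 psi electrolyser output up to 3000 psi storage, three stages assumed
pressures, ratio = stage_pressures(60.0, 3000.0, 3)
print(f"ratio {ratio:.2f} per stage: " + ", ".join(f"{p:.0f} psi" for p in pressures))
```

A ratio under about 4 per stage keeps discharge temperatures manageable, which is why two stages (ratio ~7) would be pushing it for this pressure range.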

The 3000 psi range for storage also extends the life of the unlined storage tanks. Aluminum cylinders (tanks) have a burst pressure nearly twice the 3000 psi range. So the wall thickness is great enough to handle the constant bombardment of the active little hydrogen atoms at medium pressure for a reasonable number of years. As a guesstimate, approximately ten years, so figure eight years to allow a safety margin.

For the Island Home, the size of the electrolyser is not very important. It is more important for the RV of the future. Considering that, I will start looking at electrolysers designed for 4 kW input power (the rough size of the RV of the Future solar array) and a total space requirement of 4 cubic feet. A target efficiency of 50% will be used, but after all the what-ifs, 30% to 50% will probably be enough to justify the design.

That 30% to 50% range will drive most mechanical engineers nuts. After conversion to useful energy, the actual system efficiency would only be 15% to 40%. They can handle the 40%, but the 15% they would laugh at. Remember though, we are using fixed cost solar energy. Any power not used is lost, so recovering 15% of something otherwise wasted can be a big deal. We have to compare that against the initial cost, weight, maintenance cost and replacement cost of batteries, plus the cost of converting stored battery energy to useful work. Some of the work we may want, transportation, is a key factor.
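The arithmetic behind that 15% to 40% band is just the electrolyser efficiency times the conversion efficiency. A short Python sketch makes it explicit; the 50% and 80% conversion figures are inferred from the band itself, not from any spec sheet:

```python
def round_trip(electrolyser_eff, conversion_eff):
    """Overall efficiency of solar power -> hydrogen -> useful work."""
    return electrolyser_eff * conversion_eff

low = round_trip(0.30, 0.50)   # worst-case electrolyser, ~50% conversion
high = round_trip(0.50, 0.80)  # best-case electrolyser, ~80% conversion
print(f"system efficiency: {low:.0%} to {high:.0%}")
```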

So I am off to research affordable ways to make hydrogen under medium pressure. I have seen a lot of websites with big promises, but are there any that really have the goods? With a $10,000 budget and $6 gge goal, let's see if it is doable today.

Wednesday, April 27, 2011

Our Hydrogen Home - Is it Worth it?

An energy self-contained home is the dream of many. Living in the islands of the Florida Keys, I know of several islands that have no connection to the main electrical power grid. Some of the owners of the electrically remote island homes are very wealthy, others not so much. They deal with their power needs differently based on their financial situations. Hmmm, kinda like nations have to.

Alternate energy is a matter of cost and convenience. The less cash you have, the more inconvenience you can deal with. I like to look at the true cost of an energy self-contained home (or recreational vehicle) every few years to see how things are shaping up. Wind and solar are the main energies chosen by the island guys, with internal combustion engine (ICE) generators as backup. They all have to have backup generators because they cannot store enough energy while the wind or sun is working to cover the times they need energy. That adds significantly to the cost. Very significantly, as fuel costs are over $3.75 a gallon. Fuel was less than a dollar a gallon when most of the island home energy systems were installed.

Because of building height limits, the massive, tall wind turbines are out of the question. This is an island tourist destination after all. Not creating an eyesore is important. Some of the older hydrogen homes built in the past decade have nearly a dozen large propane-style storage tanks, which are not only potential eyesores, they tend to float away in hurricane flooding. That is an expensive and potentially dangerous problem. The hurricane winds also tend to relocate wind generators.

As you can see, there are issues involved in designing an energy self sufficient tropical home. Even more if that home is to be a sustainable energy self sufficient home.

Learning from the screw ups of others is a lot cheaper than learning through your own screw ups. There is no way to avoid screwing up, but you can endeavor (I like that word) to minimize your screw ups and their magnitude.

Storage of energy is the big problem. Batteries are expensive, require maintenance and can be dangerous. While having some batteries is unavoidable, you want to get the right balance between cost, longevity, maintenance cost and safety. The telephone companies seem to have hit on the best battery storage ideas. They use massive 2 volt cells connected in series and parallel to provide the DC power they need. A compromise for residential consumers is large 4 and 6 volt lead acid batteries. Big banks of lithium ion sound great, but the cost is prohibitive. Also, do-it-yourself installation and maintenance can be more dangerous. The 6 volts are cheapest initially, but specialty 4 volt batteries are a better choice. The 4 volt batteries have a warranted life of ten years and can last longer if properly maintained. Maintenance is pretty easy: just maintain water levels with deionized water and keep the connections clean. Any isolated island home should have at least three, since three in series make up the basic, less expensive 12 volt system. The high quality lead acid batteries would store nearly 15 kWh of energy at a cost of about $3000 USD. Medium quality 6 volts would require a bank of six batteries, $1300 USD worth, for about half the kWh and a quarter to a half of the life expectancy, depending on use and maintenance. The lower capacity of the 6 volt batteries increases the chance of draw-downs that limit the life. With batteries you basically get what you pay for, so I would not recommend a bigger bank of the six volts unless you get a killer warranty.
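A quick way to compare the two battery options is cost per kWh of capacity per year of service. A Python sketch using the rough numbers above; the four-year life for the 6 volt bank is my assumption from the "quarter to half" estimate:

```python
def cost_per_kwh_year(price_usd, kwh, years_of_life):
    """Rough lifetime storage cost: dollars per kWh of capacity per year."""
    return price_usd / (kwh * years_of_life)

four_volt = cost_per_kwh_year(3000, 15.0, 10)  # warranted ten-year life
six_volt = cost_per_kwh_year(1300, 7.5, 4)     # half the kWh, assumed 4-year life
print(f"4 V bank: ${four_volt:.0f}/kWh-year, 6 V bank: ${six_volt:.0f}/kWh-year")
```

The cheaper bank costs roughly twice as much per stored kWh over its life, which is the "you get what you pay for" point in numbers.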

Hydrogen is by far a superior energy storage medium (a battery, if you will). Hydrogen, because of cost, has not been used much in the Keys. That is starting to change. Home hydrogen electrolysers have not only started to drop in price, most offer something very nice: pressurized hydrogen. Compressing hydrogen has been a big issue in the past. The newer electrolysers produce decent amounts of hydrogen at a variety of pressures. Ultra high pressure, 5,000 to 10,000 psi, is ideal, but expensive. High pressure, up to 2,900 psi, is more affordable. Hydrogen is a small, active molecule that tends to want to be free. That puts some interesting, but not too complicated, problems on the table for storage. Soft materials absorb the energy of the hydrogen bouncing around, where hard materials tend to get micro-fractures fighting with the hydrogen. Luckily, steel or aluminum tanks can be internally coated with high density plastic to have both soft absorption and hard strength. The larger tanks or bottles are rated for 4500 psi, which is fine for the 2,900 psi electrolyser with some room for extra compression.

There are plenty of hydrogen electrolyser system manufacturers. Most are very proud of their products, meaning they are expensive. There are advantages to the higher cost systems. They are completely self-contained with all the bells and whistles for those that just want a working system without the design headaches. HySTAT has complete systems for off grid power using solar and/or wind if you have the cash. Their systems have a fairly good sized footprint; being self-contained, you have to be able to live with that. Mix and match systems offer more flexibility with, of course, more headaches, which I am now exploring. The price is pretty frightening too, which sends me into do-it-yourself mode.

The plumbing of the tanks, since it will take several to store enough hydrogen, will require either more expensive material or more frequent maintenance. Stainless steel is a fairly soft metal, which makes it a good trade-off between maintenance cost and initial cost. You can go into any dive shop and see air fill storage tanks with manifolds that look like what you would need for home hydrogen storage; the only difference is that it is a good idea to have the tanks lined with plastic.

Lining the tanks with plastic is pretty simple if you have the equipment. A process called rotational molding can coat the inside of the tanks: powdered plastic is put inside the bottles, which are then heated while being slowly rotated in all directions to melt the plastic into a uniform layer inside. It is a little pricey, but it more than doubles the life of the fairly expensive tanks. So this headache is not too hard to cure if you have space for low pressure storage.

You need to first be able to make the hydrogen and then use it. To make the hydrogen you need water, an electrolyser and power. The self-contained, higher priced systems use any combination of electrolyser pressure and/or compressors. Every component added is an extra headache. Home hydrogen fueling systems can reduce some of that headache. Designed to refuel hydrogen vehicles (more forklifts than on-road vehicles right now), they produce hydrogen at the pressure you need, approximately 5000 psi. To avoid warranty issues, the storage system would have to be manually filled.

Honda Motors has a pretty good idea. The Honda Home Energy Station uses waste heat for home hot water heating and provides backup power. Unfortunately, it generates hydrogen from natural gas, which is kinda stupid, since fuel cells can run off natural gas to begin with.

Low pressure hydrogen with compression is still an option. Looking back at the first solar powered home, Mike has ten 1,000 gallon propane tanks to store about forty gallons worth of gasoline-equivalent hydrogen gas at 200 psi. Living in the country with not many neighbors, Mike can get away with this. Mike also has one helluva expensive system that is like a Rube Goldberg design compared with what we need. The electrolyser he has is pretty much what we need though.

Building from scratch, like Dangerous Laboratories, is not for most folks. Cal State University built a hydrogen filling station that has a few good ideas worth looking into. Sizing a system requires learning that 1 kilogram of hydrogen occupies about 11 cubic meters of space. All the hydrogen generators list production in Nm3/hr, normal cubic meters per hour, where normal means standard temperature and pressure (STP). One low pressure hydrogen electrolyser, the M2-EL2500-12.5, produces 0.83 Nm3/hr at 60 PSI while drawing 2,500 watts. That works out to 1 kg of hydrogen per 13 hours, or just under 2 kg per day. Since we are planning a 12 volt system, that is a good choice. Production could be doubled by using two, or by doubling the voltage and using the M2-EL2500-25. At 60 PSI we would need to compress to around 4000 PSI. Hydrogen compressors are a little special. Hydrogen is an energetic molecule in more than one way. Being the tiniest of molecules, it is hard to contain. When it is not contained properly, it tends to cause explosions. We don't want that, so a compressor specifically rated for hydrogen is what we want. Hydropak makes the low pressure compressors we need; unfortunately, their main website is in Turkish.
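The production arithmetic is easy to check in Python with the numbers above (0.83 Nm3/hr, 2,500 W, about 11 cubic meters per kilogram), assuming round-the-clock operation:

```python
M3_PER_KG_H2 = 11.1  # one kilogram of hydrogen occupies ~11 m3 at normal conditions

def daily_production(nm3_per_hr, hours=24):
    """Kilograms of hydrogen per day from a rated Nm3/hr output."""
    return nm3_per_hr * hours / M3_PER_KG_H2

kg_per_day = daily_production(0.83)  # M2-EL2500-12.5 rating
kwh_per_kg = 2.5 * 24 / kg_per_day   # 2,500 W input around the clock
print(f"{kg_per_day:.2f} kg/day at about {kwh_per_kg:.0f} kWh per kg")
```

On solar power alone you only get a fraction of those 24 hours, which is why the realistic figure later in this post is closer to 1 kg on a good day.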

Converting the hydrogen into the power you want has three options: burning it like propane, running a motor for a generator, or using a fuel cell for conversion to electricity. For the island home, the motor running a generator is the easy, cheap way to go. That is fine for a place that is not occupied all the time. The motor for the generator will need oil changes and maintenance just like any gas powered motor. Other than that aggravation, the generator simplifies things quite a bit. The basic use for the generator would be high load things like air conditioning and backup charging of the batteries if there is a high demand. A fairly inexpensive inverter can be used for 120 volt electric with the batteries providing power. Shopping wisely can reduce the demand from 120 volt appliances. This is very similar to most of the island off grid designs; the only difference is storing energy as hydrogen rather than buying more batteries.

Using a hydrogen fuel cell is another option. Ballard Power makes a home hydrogen fuel cell designed as a 2 kW backup power system for $2600 USD. Two kW gives you 18 amps at 110 volts AC, enough for most of the basic stuff. For air conditioning, a little known option is a gas driven condensing unit. The Trane Company used to make a condensing unit for air conditioning with a natural gas powered motor driving the compressor. It looks like that design has passed away, but the idea is good for the island home; here you can just run a hydrogen gas driven generator when there is a need for cooling. Beefing up the fuel cell and inverters is always a possibility if you have the cash.

Ballard also makes a 20 kW fuel cell for $10,000. This is about the lowest cost per kW of any fuel cell on the market, thanks to mass production for the materials handling industry. This would require a few more controls than the other units and better consideration of the DC voltage range of the overall design.

This brings us to solar cells. Since the 12 volt system is the standard for most residential solar applications, that is what we would most likely go with, unless we opted for the larger 20 kW fuel cell. In that case, matching the voltage ranges of the fuel cell and solar panel array makes more sense. This would also require another look at the hydrogen electrolyser input voltage. We only want to make hydrogen from "free" energy, which in this case is the solar panel array. Finally, the amount of hydrogen storage would depend on that decision as well.

Adequate hydrogen storage depends on demand and replenishment rate. Compressed hydrogen makes the space required more tolerable. At 2900 PSI, the space required is a lot less than at the 200 PSI Mike used in his video. There are a variety of manufacturers of suitable storage tanks. They do not come with the plastic liner; you would have to arrange that on your own at a rotational molding facility. Unlined will do, the tanks just need to be inspected annually and may require replacement in five years or so. All the plumbing should be inspected annually. Being able to buy a ready made system would be great if you have the bucks, but the makers seem to be going out of business pretty regularly. If you have ever been to a dive shop to get a cylinder refilled, that is the basic system you want. A good compressor that doesn't contaminate breathing air will work, and the tank manifold is rated for 4500 psi in most cases.

These 200 bar storage and compression systems are not cheap; you will be looking at $5000 USD easy. Home CNG (Compressed Natural Gas) systems will work also. The price is about the same once you add the storage. There are a couple of things you need to know now. First, I am not responsible if you blow your silly ass up. This is a do-it-yourself thing. Second, with reasonable precautions this system will work fine.

The first precaution is remembering that hydrogen is an energetic gas. At some time there will be leakage, and you don't want that leakage to build up where it can go boom. A nice open detached shed should be the hydrogen system's home. You can screen the shed in and cover it with tasteful looking lath to make it look like a tiki bar, but don't box it in. Second, think valves, lots of high quality valves. Things break and you need to be able to isolate the broken things. Third, you need a low pressure tank for the electrolyser output, which the compressor can then boost to pressure in the storage system.

A fairly good sized propane tank can serve as the discharge tank for the low pressure electrolyser. The output pressure of a low pressure electrolyser is about 60 PSI. The volume of this tank just determines how often your expensive boost compressor runs. I would recommend a 100 pound propane tank as a minimum for a small compressor, with a hundred gallon propane tank not a bad choice if you have the room. If you use a medium pressure electrolyser, up to 200 psi, either of these tanks will be fine. Above that pressure, you can skip the compressor and size the storage tanks for the pressure and volume you need. Right now, the low pressure electrolysers are cheap enough to justify the five grand for the compressor and higher pressure storage. Remember that the price is dropping on some of the home refueling systems, so this stage may soon not be needed, other than for storage.
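How often the boost compressor cycles falls out of a simple ideal-gas estimate. A Python sketch, assuming a 100 pound propane tank holds roughly 24 gallons and the compressor kicks in at 60 psig and draws the buffer down to about 5 psig (both of those are my assumptions):

```python
ATM_PSI = 14.7
M3_PER_GALLON = 0.003785

def buffer_hours(tank_gallons, max_psig, output_nm3_per_hr, min_psig=5.0):
    """Hours of electrolyser output the buffer tank absorbs between
    compressor cycles, estimated with the ideal gas law."""
    usable_nm3 = tank_gallons * M3_PER_GALLON * (max_psig - min_psig) / ATM_PSI
    return usable_nm3 / output_nm3_per_hr

hours = buffer_hours(24, 60, 0.83)  # ~24 gallon (100 lb propane) tank
print(f"compressor cycles about every {hours * 60:.0f} minutes")
```

With the small tank the compressor would cycle every half hour or so while the electrolyser runs, which is exactly why the hundred gallon tank is worth the room if you have it.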

The low pressure electrolyser I selected above only makes about 1 kilogram a day on a good day. One kilogram is about 33 kilowatt hours of energy. So if you use the generator-only design, you get about 1/3 of that as useful energy, or 10 kilowatt hours, because of engine efficiency. With a fuel cell you get nearly 2/3, or 20 kilowatt hours, out of one kilogram of hydrogen. So unless you are a party hardy energy hog, you really don't need that much storage. Mike in the video had ten 1,000 gallon propane tanks at 200 psi that only held 40 kilograms of hydrogen. One of the standard 5 foot tall air cylinders has a volume of about 2700 cubic inches, or 1.6 cubic feet, which is equal to about 12 gallons. At 3000 psi, the energy density of hydrogen is roughly 1,900 watt-hours per gallon, so each tank would hold roughly 20 kWh, which is close to 2/3 of a kilogram of hydrogen. So ten 8 inch round by 55 inch tall cylinders at 2900 psi will hold over 200 kWh of gross energy, about 6 kilograms of hydrogen.
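The tank math can be checked in Python with the 1,900 watt-hour-per-gallon rule of thumb and roughly 33 kWh per kilogram; it lands a bit above the rounded 200 kWh / 6 kg figures:

```python
KWH_PER_KG = 33.3             # energy content of a kilogram of hydrogen, roughly
WH_PER_GALLON_3000PSI = 1900  # rule-of-thumb energy density at 3000 psi

def bank_storage(gallons_per_tank, n_tanks):
    """Gross kWh and kilograms of hydrogen in a bank of cylinders."""
    kwh = gallons_per_tank * WH_PER_GALLON_3000PSI * n_tanks / 1000.0
    return kwh, kwh / KWH_PER_KG

kwh, kg = bank_storage(12, 10)  # ten 5-foot cylinders of ~12 gallons each
print(f"{kwh:.0f} kWh gross, {kg:.1f} kg of hydrogen")
```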

1,900 watt-hours per gallon at 3000 psi is a good number to remember. That is not a lot of energy per unit volume. Also remember hydrogen likes to leak. The choice of high quality valves and plumbing is important. Lining the tanks with plastic helps reduce leakage; high density plastic or Teflon® seals are all important considerations. Ventilation, as I said, is a must. Hydrogen flashes in concentrations from 4% to 75% by volume. The number of connections with a ten tank storage system is about as much as I would even think about. Any more, and larger high pressure tanks are definitely worth the extra money.

Using the 20 kWh per tank we can start completing our design. With the generator-only system we would get about 6.7 kWh of useful energy per tank. On a good sunny day, the cheap electrolyser would come close to refilling a tank. As long as you are not an energy hog, that is not bad. Remember, most cooling is required when you have the most sun, so the solar panels can take up a good deal of the air conditioning slack. With the hybrid system, the 2 kW fuel cell would deliver nearly 12 kWh per tank because of its higher efficiency. Depending on use, the air conditioning dedicated generator drops the kWh per tank toward the 6.7 number. With the more expensive larger fuel cell, you would stay near the 12 kWh number. This makes a big difference in the number of tanks needed for storage. With the bigger fuel cell, three or four tanks should be plenty; six for the hybrid system. These configurations would give you about two low-to-nearly-no solar energy days of backup. Since the generator really doesn't care if it has hydrogen, propane or regular gasoline, the hybrid and stand-alone generator systems (both can charge the battery bank while running) give you plenty of backup.
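Tank count for a two-day backup window is a one-line calculation. A Python sketch, where the 18 kWh daily demand is my assumption for a modest island household:

```python
import math

def tanks_needed(daily_kwh, backup_days, useful_kwh_per_tank):
    """Cylinders required to ride out a stretch of low-solar days."""
    return math.ceil(daily_kwh * backup_days / useful_kwh_per_tank)

DAILY_KWH = 18  # assumed demand for a modest island household
print(tanks_needed(DAILY_KWH, 2, 6.7), "tanks, generator only (~1/3 recovered)")
print(tanks_needed(DAILY_KWH, 2, 12.0), "tanks, fuel cell (~12 kWh useful per tank)")
```

That lands right on the three-or-four versus six split above once you round the demand up or down a bit.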

Cheating up to 4500 PSI is an option with the right tank system. That would increase the storage per tank to 31 kWh, or very nearly one kilogram per tank. Remember, each tank is only 12 gallons (1.6 cubic feet) of volume. The higher pressure would increase leakage and decrease the life of the unlined tanks. So the cheating up should only be for a short term occupied mode, with 2900 psi the normal unoccupied storage mode. I will have to double check, but leakage should scale roughly with the ratio of the pressure change, so there would be about 55% more leakage at the higher pressure. Nominal leakage at 2900 psi is only about 0.05% per day, livable. At the higher pressure, the pump should be the weak leakage link, which may require solenoid operated isolation valves to protect the pump when not running. While 55% more of a small number doesn't sound like much, damaging the expensive pump would not be a nice thing.
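If leakage does scale with the pressure ratio, the daily loss at 4500 psi is easy to estimate in Python from the 0.05%-per-day figure at 2900 psi. The scaling law itself is the part I still need to double check:

```python
def daily_leakage(p_new, p_ref=2900.0, loss_at_ref=0.0005):
    """Daily loss fraction, assuming leakage scales with the pressure
    ratio (0.05% per day at 2900 psi is the nominal figure)."""
    return loss_at_ref * p_new / p_ref

loss = daily_leakage(4500)
print(f"{loss * 100:.3f}% per day at 4500 psi")
```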

While I am a big fan of hydrogen only, for the island off grid house I would go with the hybrid system, with either gasoline or propane as a backup energy source. That being the case, a 4 to 6 kW solar array is all that is required. That will provide enough energy for the electrolyser and battery charging, with basic high solar conditions covering most of the daily electric needs, refrigerator and lighting. With expected higher air conditioning use, a larger 8 to 10 kW array should be considered. Nanosolar has the lowest cost per watt at $1 USD; unfortunately, they are not for sale to the general public yet. Most panels currently available are close to $2 USD per watt. That would tend to bring the budget minded down to 3 kW for the minimum array size. At 3 kW you can still generate hydrogen with the 2.5 kW electrolyser and maintain battery charging if you are careful.
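Array size for a given panel budget is just budget divided by dollars per watt. A Python sketch with a hypothetical $6,000 panel budget shows how $2 per watt pushes the budget minded down to 3 kW:

```python
def array_kw(budget_usd, usd_per_watt):
    """Array size a panel budget buys at a given price per watt."""
    return budget_usd / usd_per_watt / 1000.0

# Hypothetical $6,000 panel budget at Nanosolar's $1/W vs today's ~$2/W
print(f"$1/W buys {array_kw(6000, 1.0):.1f} kW, $2/W buys {array_kw(6000, 2.0):.1f} kW")
```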

If you have been following my fantasy hydrogen designs, you may notice a few changes. One is the 200 bar home electrolyser for $2300 USD: the only one using water as a source is kaput. Also, the Honda Phill, a compressed natural gas system which could compress and store hydrogen in a pinch, is kaput too. As mentioned earlier, the Honda home fill system is a natural gas reformer, not suited to the off grid Island Design. This is making me take a new look at Dangerous Laboratories. Making hydrogen from water is a piece of cake; separating the hydrogen from the oxygen produced is the trick. There are several low pressure PEM electrolyser designs, but you really don't need a PEM to make hydrogen. The second change is storage. Without the less expensive home electrolysers that can produce the higher 3000 to 5000 PSI, storage space increases yet again. There are some tricks there I am researching. It seems that only Ballard Power is sticking to their game plan and staying financially solvent with their materials handling angle.

For the hydrogen compressor, I am pretty sure a metallic diaphragm two stage compressor will fill the bill. The solid metallic diaphragm is a natural protection for the pump in case of high pressure leakage. Being oil free, the diaphragm design is also likely to have a longer life. I just have to locate one that is small enough for a home application that doesn't have the medical gas compressor cost. That will bring everything back to being viable again. I'll be back to this when I find out.

Monday, April 25, 2011

The Trends?

While I am playing with OpenOffice, trying to get back into charting and simple models, I thought it might be neat to post these trends(?) from the RSS satellite temperature series. I plotted the continental US just because, and extended the year out to 2097 to see the trends of the lower troposphere (surface, kinda) and the upper atmosphere (stratosphere, kinda). The temperature in 2097 would be about 1.6 degrees C higher than the average of the 1978 to 2011 data provided by RSS. The RSS baseline average is not the same as the UAH baseline, but it is close enough to be no big deal.

Then I did the same for the RSS global (-82.5 to +82.5) and the tropics (-20 to +20) latitudes. Who would have thunk it: their trends to 2097 are about the same (1.5-ish) for the surface, with monster differences in the upper trends. The upper data is pretty wild, to the point that I am not really impressed with its utility. It can make 10 C swings from one month to the next, so trying to make sense of it for seasons is pretty useless. That also means trying to track stratospheric cooling versus tropospheric warming is an exercise in futility. That is mainly due to the envelope included in the upper temperature series: it overlaps the mid-troposphere enough that it is useless. I am quite surprised that the UAH and RSS teams have not tried to fine tune their filters to better separate atmospheric layers.

The lower troposphere (surface) has been fine tuned to death in order to match the surface station data. That is fine, I guess, but the value of the surface temperature record is not as important as the atmospheric layers where you are trying to see the start of the change in greenhouse gas concentration impact.

The data both UAH and RSS have provided, are impressive, considering that the Microwave Sounding Units were not intended to be the last word in atmospheric temperature measurement. I am sure that filtering the data by pressure must be a major bitch, but it would be nice if you really want to see the physics in action. That is beyond my pay grade, so I am not going to waste my time trying to do it myself. Besides, as an unpaid blogger, my job is to just pick nits right? Since all the hoopla is over increased radiation capture in the upper troposphere, wouldn't it be nice to see a little better what is happening there? See I picked a nit just like I am supposed to. Based on the data available, you can't say crap about what will happen in 89 years, so get the right data or shut the hell up!

Now that that rant is out of the way, notice the US 48 chart without a trend. That one shows the "noise" of the series that has to be overcome to evaluate any "real" trends. What may be a trend starting around 1998 should be pretty flat if it continues. Since that happens to be a predicted climate shift (Tsonis et al.) that coincides with a shift in the Pacific Decadal Oscillation, it could continue for another 10 to 15 years. That would decrease the trend (now approximately 1.5 C at 2097) by a considerable amount. The impact of CO2 is not linear, it is logarithmic, so I should add a log curve to show that, but with the noise of the temperature data it would not be all that informative.
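
That logarithmic behavior is easy to illustrate with the widely cited simplified forcing expression, dF = 5.35 ln(C/C0) from Myhre et al. (1998); the 5.35 coefficient and the 280 ppm pre-industrial baseline are standard textbook values, not anything derived from my charts:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing in W/m^2 relative to a pre-industrial baseline,
    using the simplified log expression dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each doubling adds the same ~3.7 W/m^2, no matter where you start,
# which is why a straight-line fit overstates the late-century response.
print(co2_forcing(560))   # one doubling from 280 ppm
print(co2_forcing(1120))  # two doublings: exactly twice the forcing
```
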

Back to my original objective, I am trying to combine spectra for the basic atmospheric layers up to the top of the stratosphere. Since the main gases in the stratosphere are O2, O3, N2, CO2 and a trace of H2O, the biggest issue is weighting for the relative gas concentrations at shared spectral lines and across the overall spectrum. None of these are even close to being effectively saturated, so it is unlikely they can block all radiation at their spectral signatures. Accurate temperatures would help, since radiative absorption/emission depends on pressure and temperature (the phase should only be gas, but ice crystals may have some significant impact). So completing the picture of my "picture window" scenario of the Whacky Tropopause is far from simple. I will try to overlay all the gases except water vapor, which should show a clean window; water vapor will tend to fog it up.

Since the RSS data has more stuff, I plotted the rest below. The quality of the charts is pretty crappy; I will work some more to clear them up. If you look carefully, you can see that the northern polar region is pretty whacked out and the south pole upper shows more cooling due to the more variable ozone layer. For surface trends, the northern hemisphere shows what I consider "regional" variability; it does have a higher percentage of land versus ocean. If you are up on the debate, you know that the medieval warm period is considered "regional". Because it was regional, it supposedly doesn't impact "global" temperatures. What's good for the goose is good for the gander. So the top charts represent global and tropic temperatures with an ~1.5 degree trend at 2097. The US 48 just happens to agree with that trend. The poles are showing opposite trends, a "regional" impact?

So it looks to me that a whole lot of stuff is going on. To tease out some valid indication of CO2 radiative effect, I will have to do some looking. Possibly, I can use just the data for the Sahara desert. Don't know.

Saturday, April 23, 2011

Scale Matters

After my little frustrated rant on Synfuels, I was struck by the variety of attitudes in play among environmentalists, climate scientists, anti-global-warmists and the rest of the crowd still trying to find labels. There are a lot of very smart people that seem to be clueless. While the arguments may sound intelligent (or not), there is little common sense involved.

Every issue is pigeonholed. The groups have these narrow little tunnels which make them oblivious to the broader picture. They can't or won't see common solutions that, while far from perfect, achieve progress toward common goals. Whether the climate is warming due to man or not, energy insecurity is a common problem. It seems a no-brainer to me that progress toward energy security that also has a positive impact on pollution would be an easy sell. Life is not so simple.

Synfuels, to me, are a perfect starting point. They can help resolve several problems with one effort. Thinking far into the future, there will always be a use for Synfuel products. Even if the transportation fuel of the future is something novel, lubricants, backup power, aviation fuels and small power applications can always make use of Synfuels. Properly designed, Synfuel plants can use nearly any fuel source to produce synfuels. It is all a matter of scale.

Scale matters with synfuel plants because the plant has to be large enough to cost-effectively produce its product. Large plants, though not MEGA-size ones, offer the option to produce meaningful quantities from a variety of feed stocks at high efficiency. That scale, along with uncertain commodity prices, tends to require government backing to be successful. Oil prices have such large potential variability that big oil producers can easily squeeze Synfuels out of the market for the near term. That is why I think Synfuels are a perfect project for the military to address the core of energy security. They can budget an average fuel cost over a decade or two to justify fairly large-scale Synfuel production. Some large corporations can also justify the long-term investment.

The basic chemical building blocks produced through Synfuel processes can be turned into more valuable oil-based commodities with higher quality control. Synthetic lubricants, which are very expensive, are just one example. There are already examples where Fischer-Tropsch (F-T) Synfuel plants are being planned in India and China using South African designs.

Synfuel is opposed by the warmists since it does not cut the fossil fuel craving. "Cleaner" in terms of GHG pollution is not an option they will consider, even though the F-T process can be used to convert CO2 to fuel and other products, albeit at a higher cost. For example, a standard F-T plant can produce gasoline at a cost of roughly $2.40 a gallon and CO2-converted gasoline at a cost of roughly $3.40 a gallon. That higher cost CO2 gasoline is not that attractive, but it is close enough to being viable that it should justify some prototype plants to see if co-generation or other applications could make it more competitive. At $2.40 a gallon, the more standard F-T process is competitive as a co-generation process or waste disposal application, and right at the ragged edge of being profitable as a stand-alone application. Future oil commodity prices should only work to the advantage of F-T Synfuels.

Solar, the darling energy of the warmists' future, has applications, but only at smaller scales. It is being incorporated in numerous communities of the future, where it can produce a sizable portion of community-scale energy demand. In areas with high electrical rates, solar can be somewhat competitive, though subsidies are required in nearly all cases. The warmists can rationalize subsidies that match their agenda, of course. While they complain about subsidies for other industries, often for research to meet their high demands to begin with, they have no problem with pork if it is in their trough. There are a few actually putting their money where their mouth is. Google has invested substantially in solar; Nanosolar, one of its main investments, has promise. Active participation, instead of activist obstruction, is a positive sign that should be expanded.

James Annan, a British climate scientist, has been waxing technological about Japan and solar following the earthquake and nuclear situation. He does seem to "get" some of the scale issues, but is still a little on the warm and fuzzy side. One part of US government research into solar is the adaptation of solar panels as an integral part of residential construction. If solar panels can replace most of the typical roofing cost of a new home, that will considerably offset the initial cost of solar energy. Nanosolar received a 20 million dollar research grant for solar roofing integration. While that is not a lot of money, it has a high potential return on investment. Solar, in this case, has very good promise of becoming cost effective. With more improvement in efficiency and/or reduced production costs, solar could eventually be the savior envisioned by the environmental crowd. It is still not ready for prime time though. Active promotion of solar, with that realization, is healthy for pursuing part of the energy puzzle. It will not solve the transportation fuel problem or all of our energy problems. There will have to be a mix of options.

Nuclear, as I have mentioned, suffers from a problem of scale. Bigger is not necessarily better in nuclear, as illustrated by Fukushima. The Generation IV technologies have promise, but just like solar, promise don't cut it. Nuclear co-generation is greatly limited with current technology. As I have also mentioned, off-peak nuclear power production is well suited to backing the highly variable supply of energy from wind. You can't just start and stop a nuclear plant of any scale, but you can divert energy output to other processes to balance load. This is a key part of the energy puzzle that seems to be missed by the warm and fuzzy crowd's all-or-nothing mentality. It is all about options and opportunities, which is lost on the negatively thinking crowd.

The same logic, diverting output to other processes rather than trying to ramp down large power plants, applies elsewhere. Efficiency suffers when you ramp down output. While not critical in all cases, maintaining maximum efficiency offsets the less efficient processes of, say, producing hydrogen through electrolysis and/or F-T Synfuel production. Even if the hydrogen is not used most effectively as a stored fuel, enrichment of F-T or NatGas has promise of increasing overall energy efficiency.

There are logical steps to reducing energy insecurity that also reduce greenhouse gas and other pollution cost-effectively. They are not the big-bang solutions many dream about, but reality is not a dream. In the background, many governments and private enterprises are actively working on solutions. It would be nice if all the polarized groups would start throwing their support and money at action to solve the issues we face, rather than political actions that stymie progress. The thinking is much easier than the actual doing, though. Transportation fuels are the main issue, and battery-powered golf carts are not that great a solution.

Friday, April 22, 2011

That Simple Greenhouse Effect

While playing with OpenOffice, trying to figure out how to create charts that have years instead of some garbage on the X-axis, I wandered across a funny post by Gavin Schmidt explaining why the Stratosphere cools and the Troposphere warms due to extra greenhouse gases. If you check out the link you will see that Gavin had a face-plant moment. After a couple of updates, he linked to a newer post called The Sky is Falling.

Making the complicated as simple as possible is the trick for climate scientists. It falls back to the first thing I was taught in Thermodynamics: Keep It Simple, Stupid (KISS). The radiation balance with increased greenhouse gases is not all that simple, but it can be communicated a little more simply.

Unfortunately, one of the more complicated parts is the first thing that needs to be explained: the radiation windows. As I have been on about in the Whacky Tropopause posts, heat flows from warmer to cooler. That is the second law of thermodynamics and it is a LAW. It is not like a recommendation, that's the way it is. Radiation has its own set of laws, but when it transfers heat energy, it still obeys thermodynamics. Every element in the known universe has a spectral signature, which means the radiation a body emits depends on the elements it is made of. Before space probes, astronomers could tell what a star or any other astronomical body was made of, as long as it gave off enough radiation as heat, light or X-rays for them to "see". Science only started to become aware of the "hidden" parts of the spectrum over the past couple hundred years. Infrared radiation is one of those things.

The photo is from the Wikipedia article Absorption Spectroscopy. As you can hopefully see, our sun has a very broad electromagnetic spectrum. Each wavelength has different properties, and there are a few black lines, which are gaps: wavelengths missing due to the chemical makeup of the Sun. That is our Sun's spectral signature. Other stars will have different signatures unless they have the same chemical makeup, pressure and temperature. The electromagnetic energy of the Sun powers our climate, and the different wavelengths respond in different ways to different elements at different pressures and/or phases, like solid, liquid or gas.

For the greenhouse effect, we are mainly concerned with the atmospheric gases. Not just the greenhouse gases, all the gases. Nitrogen and oxygen are not greenhouse gases, but they still have an absorption spectrum, which means they have an emission spectrum. Anything with heat will emit radiation. The greenhouse effect involves special gases that tend to block outgoing infrared electromagnetic radiation. That energy can still be transferred to non-greenhouse gases by means other than direct radiative transfer. In other words, they can be bumped into. If the greenhouse gases could not transfer heat to the other gases in the atmosphere, we wouldn't be here; the world would be about 30 degrees colder.
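
That "about 30 degrees colder" figure comes from a standard back-of-envelope energy balance, which is easy to check with textbook numbers:

```python
# Effective emission temperature of an Earth with no greenhouse effect,
# from a zero-dimensional energy balance (standard textbook values).
SOLAR_CONSTANT = 1366.0   # W/m^2 at the top of the atmosphere
ALBEDO = 0.30             # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4     # averaged over the sphere
t_effective = (absorbed / SIGMA) ** 0.25         # ~255 K without greenhouse gases
t_surface = 288.0                                # observed global mean, K
print(f"no-greenhouse temperature: {t_effective:.0f} K")
print(f"greenhouse warming: {t_surface - t_effective:.0f} K")
```

The difference works out to roughly 33 K, which is where the round "30 degrees" in the paragraph above comes from.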

Barrett Bellamy Climate has the above chart on their website, which I am borrowing. You should pay them a visit. The Sun's energy, in red, shows gaps that are the same as the black lines in the other picture. Infrared radiation leaving the Earth is in blue, and below, in gray, you can see the spectra of common greenhouse gases. Unfortunately, nitrogen, which makes up most of our atmosphere, is not shown. I will try to dig up the nitrogen spectrum later.

Any infrared radiation leaving Earth's surface or atmosphere has a radiation window if there is no gas absorption spectrum blocking its path. So at different altitudes and temperatures there are different windows. Water vapor, the main greenhouse gas, decreases with altitude to the point that it is nearly gone by the top of the troposphere. There are traces above that have an impact, but the main impact is in the lower troposphere, with a different kinda impact above the top of the troposphere. Because the other greenhouse gases are overwhelmed by the impact of water vapor in a great deal of the lower atmosphere, the CO2, methane, etc. gases play their role in the drier air: deserts, poles and the upper atmosphere.

Pressure, which decreases with altitude, plays an important role in the physics of the greenhouse effect. In my best Rod Serling voice, "Imagine if you will, the rarefied air of the upper atmosphere, to the point where molecular collisions are reduced, allowing infrared radiation to become the major component of heat transfer." Up here things get interesting.

As in earlier posts, I call this the Tropopause Sink, because infrared radiation has a picture window up here. With little water vapor and few molecules, the peaks of the spectra of some of the greenhouse gases are not as tall. They still absorb in their spectra, but fewer molecules will absorb less infrared. More CO2 and more of the other greenhouse gases will absorb more, but how much more? The principle behind Stratospheric cooling is that CO2 will block more infrared in the spectrum that O2 and ozone absorb, reducing the temperature of the ozone layer in the Stratosphere. Ozone is of course the main absorber up there, but ozone is created by ultraviolet (UV) light breaking up O2 so that O3, ozone, is formed. Things are happening in both directions in the stratosphere. Since the Stratosphere is warmer than the top of the Troposphere, it can heat the top of the Troposphere. This is a bit of a conundrum: if the Stratosphere cools, there is less heat it can pass to the Troposphere, either directly or through a cascade of radiative transfers to greenhouse gases.

Comparing the Stratosphere temperatures to the middle Troposphere temperatures collected by satellites, you can see that conundrum messing with the principles. According to the global warming crowd, greenhouse gases took over in 1950 and have been doing most of the warming ever since. If that were the case, I would expect a rise in the Troposphere to correspond with a drop in the Stratosphere temperatures. Since the satellite temperatures were only measured starting in 1979, only the last decade has shown that relationship.

The delay in the inverse relationship of the Troposphere and the Stratosphere can be interpreted a couple of ways: first, that the measurements aren't all that great, or second, that the CO2 forcing is not as great as thought. While I am far from certain, I tend to go with the second, because the CO2 forcing estimate is a compromise, not a calculation. The first choice still has its place, and a combination of the two is quite possible. Either way, things are not going according to plan in the upper troposphere. Hence, my Tropopause Sink theory.

Upper tropospheric warming, by either Stratospheric radiation or CO2 interaction, has plenty of window to get the hell out of Dodge. If the simple models underestimate the heat transfer to other atmospheric gases that can radiate infrared outside the greenhouse gas or O2/O3 spectra, that would explain most of the "failure to indicate" anthropogenic warming. As I said, this does not mean there will be no warming, just less, with the potential of increased convection (thunderstorms and tropical cyclones) modulating the impact. I need to figure out the OpenOffice plotting thing better, then I can post the chart showing the unusual relationship between Tropo and Strato. I am not going into what happens in the upper atmosphere until I fine-tune the Whacky Tropopause data.

Middle Troposphere (red) versus Stratosphere (blue), for tropics in anomalies by month starting 01/79 ending 12/10

Okay, I am still dealing with charting issues in OpenOffice (I miss the Lotus 123 I used for years). I did figure out how to export the chart as a jpeg. The labeling of the X-axis is still an issue: the chart above is labeled in months instead of years, starting in 1979. As you can see, the correlation is not super, but you can see the inverse relationship a little in the later months. There is an odd direct relationship in the months corresponding with 1998.
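
The eyeball correlation test can be made quantitative. Here is a sketch using synthetic stand-ins for the two monthly anomaly series (assumption: the real mid-troposphere and stratosphere series would be loaded from the RSS files rather than generated):

```python
import numpy as np

# Synthetic monthly anomalies, 01/79 through 12/10, with a partial
# inverse relationship built in to mimic the expected Strat/Trop link.
rng = np.random.default_rng(1)
months = 384
tropo = rng.normal(0, 0.3, months)
strato = -0.5 * tropo + rng.normal(0, 0.3, months)

# Pearson correlation: a clean inverse relationship would approach -1,
# while a noisy partial one lands somewhere in between.
r = np.corrcoef(tropo, strato)[0, 1]
print(f"tropo/strato correlation: r = {r:.2f}")
```

Running the same calculation on the actual RSS channels would put a number on "not super" instead of leaving it to the eyeball.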

In this chart of potential temperatures, the tropics are the center hump. Again you can see the smaller northern hemisphere hump and the southern hemisphere playing more nicely. I am going to do a north pole and south pole comparison like the one above. The polar coverage is not equal, so I may have to fiddle with those a bit. The north pole should show more of the poor correlation between Strat and Trop, but the south pole may have better inverse correlation, though without much amplitude.

Well, I am making some progress with OpenOffice; at least I have the year on the axis. The chart above is for the North Pole middle troposphere versus the upper, including the Stratosphere. Not much inverse correlation, and lots of noise, with a warming trend in the mid Trop and slight cooling in the upper Trop/lower Strat. The trends at the South Pole were very small, with not much correlation.

Update: Sorry about that, the HTML went whacky on me. Now the photos and text should be in order. Since I was fixing things, you have to go read Michael Tobis whipping out the F-word.

Thursday, April 21, 2011

It is Time to Pull the Trigger on Synfuels!

The economy sucks. Gas prices are insane. Employment has lots of room to rebound. Despite all the less than ideal timing, it is time to start building Synfuel plants.

Waffling environmentalists have proven one thing: they can't lead. Leadership requires action after thought, not just more thought. Like it or not, Synfuels are not only a potentially cleaner option for transportation fuels, they are a political message. If we start building synfuel plants, the oil-producing world WILL take notice.

Synfuel plants can be fine-tuned with hydrogen enrichment, co-generation and biomass (trash) feed stocks to provide high quality fuels with a smaller carbon footprint and a competitive price. Yes, that's right, a competitive price. The target price per gallon of synfuel is $2.31 a gallon. We have passed that with flying colors. Allow fifty percent for overly optimistic estimation and ridiculous over-regulation by the "Greens", and you still have about $3.50 a gallon. There will of course be an oil price drop with any significant Synfuel production, but do you really think that oil prices and construction prices won't go back up again? Planning ahead saves money in the long run. Synfuel processes are proven and they work. Unlike the energies of the future that are still in the future, Synfuels can send a message now.

That is the end of my rant for the day.

Wednesday, April 20, 2011

More on that Whacky Tropopause

I borrowed the graph above from Dr. Ryan Maue of Florida State University. Dr. Maue is a bit skeptical about the impact of CO2-induced global warming. He seems like a pretty intelligent guy, even though he is a Seminole instead of a Gator. Might be one of those mixed marriage things. Anyway, he does a lot of work studying tropical cyclones, hurricanes in my neck of the woods. Accumulated Cyclone Energy (ACE) is one of the things he tracks. One ACE unit is one day of an average hurricane, which releases roughly 6x10^14 Watts. That is a crap load of energy, as I have said before.

The Whacky Tropopause sucks up all that energy without missing a beat. 1998 was a high ACE year and about the hottest year ever according to all the temperature averages, and all of that is blamed on the humongous El Nino of that year. The TTS chart I used the other day shows the Tropopause warmed a lot that year and dropped right back to near normal by the next year.

That impresses me, but I guess I am easily impressed sometimes. Since the Whacky Tropopause only has to lose an extra 4 to 8 Watts per meter squared to offset most or all of the CO2-doubling warming, I figured I would look at how well it does with hurricane energy. Of course the units are all over the place, so I will have to do some converting. Since I can go brain dead trying to use the OpenOffice spreadsheet, I may be back to fix some screw-ups. Feel free to do your own double checking.

First, the average hurricane energy from NASA is 6.0 x 10^14 Watts, based on latent heat loss (rain), for an average radius of 665 kilometers. The area of the Earth is 510x10^6 kilometers squared, which is 510x10^12 meters squared or 5.1x10^14 m^2. You can get that area from Wikipedia. So an average hurricane day results in a heat loss globally of about 1.2 W/m^2 per day. That is about 0.05 W/m^2 per hour due to the rainfall. Hurricanes also have huge cloud formations, which they are famous for, that reflect sunlight. The 665 kilometer average radius listed in the NASA website frequently asked questions gives an average area of 1.39x10^12 m^2. This is where I may screw up. The NASA estimate for average solar irradiation is 342 W/m^2. That is an Earth average, and hurricanes are mainly tropical, where the irradiation is higher. The NASA energy budget cartoon also has a factor for cloud reflection already, but I am just going to figure what the average hurricane is reflecting and deal with that later. That gives me 4.75 x 10^14 W of reflected energy, with no time requirement since the clouds are there all the time the hurricane is churning. Taking that and dividing by the Earth's area, I get about 1 W/m^2. I am using "about" since I have to make an approximation anyway.

While the clouds of the hurricane reflect a lot of sunlight, they don't get it all. This is one of the big questions in the climate debate: how much is the net radiation impact of the clouds? I have been through a few hurricanes and know it is pretty damn dark even in the day under those clouds. So my not-too-scientific wild-ass guess is the clouds reflect about 50% of the Sun's energy over the course of the day, and that should account a little for longwave radiation at night. (Earlier, when I was working on the precipitation rule of thumb, I used a lower number, but that was for general convection, not hurricanes.) Since the NASA cartoon is a daily average (as I understand it, anyway), that gives me a total hourly global hurricane impact of about 0.55 W/m^2. If I didn't screw up too badly, I should be able to just multiply the ACE number by 0.55 and get a general annual energy. Remember that 10% of that number is due to latent heat sucked up by the Tropopause. (I know somebody else has already done this, but I like to exercise my brain from time to time.)

The average global ACE is about 1200, based on Dr. Maue's graph. That is an annual total of daily values, and I am working in hours now, so the hourly ACE is 50. If all that is right, 27.5 W/m^2 is the average total contribution of negative feedback by hurricanes per year (on an hourly basis), with 2.75 W/m^2 the latent portion. Both of these numbers should be included as part of the energy budget under cloud reflection and latent heat.
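
For anyone checking my arithmetic, here is the whole chain in one place, with the same inputs as above. My rounding in the text was a little looser (0.55 instead of ~0.51), so the totals here come out near 26 rather than 27.5:

```python
import math

# Back-of-envelope hurricane energy accounting, same inputs as the text.
EARTH_AREA = 5.1e14        # m^2, surface area of the Earth
LATENT_POWER = 6.0e14      # W, average hurricane latent heat release (NASA)
CLOUD_RADIUS = 665e3       # m, average hurricane cloud-shield radius (NASA)
SOLAR_AVG = 342.0          # W/m^2, average solar irradiation (NASA cartoon)
REFLECT_FRACTION = 0.5     # my wild-ass guess for cloud reflection

latent_hourly = LATENT_POWER / EARTH_AREA / 24        # ~0.05 W/m^2 per hour
cloud_area = math.pi * CLOUD_RADIUS ** 2              # ~1.39e12 m^2
reflected = SOLAR_AVG * cloud_area / EARTH_AREA * REFLECT_FRACTION
per_ace_hour = latent_hourly + reflected              # "about 0.55" in the text

ace_hourly = 1200 / 24                                # average annual ACE, hourly form
total = ace_hourly * per_ace_hour                     # total hurricane feedback, W/m^2
latent_part = ace_hourly * latent_hourly              # latent (rain) portion, W/m^2
print(f"per ACE-hour: {per_ace_hour:.2f} W/m^2, "
      f"total: {total:.1f}, latent: {latent_part:.1f}")
```
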

Hurricane ACE varies. From Dr. Maue's graph, 1998 was a big year at about 2200, which would be 92 in hourly form, for a total of 50 W/m^2. Ten percent of that is 5 W/m^2, which contributed somewhat to the temperature increase of the Tropopause.

I am going to stop here a second. Looking at the two graphs, ACE seems to follow the TTS temperature only in the peak years. If you look at the surface temperature, ACE follows temperature more closely. That makes sense, because you would expect warmer temperatures to create more hurricanes. The only rub is that ACE has dropped pretty hard since 2006. That makes me go hmmm. Also, you should note that my average ACE of 1200 is more like the bottom of the ACE range than the average. The reason I used 1200 is because I am comparing ACE to the Tropopause. Most of the temperature bumps in the TTS record are the big hurricane years. You can't see it because the TTS graph I stole, er, borrowed, doesn't have a year scale. The big peak near the middle is 1998 and the fatter hump to the left is around 1993. The big hump on the right is 2010, so the 2006 ACE impact is lost in the noise before the last hump. That means there is no direct correlation, though 1993 and 1998 tend to indicate that hurricanes may be a player. I just can't tell to what degree. I am going to try to overlay the TTS on the ACE graph so you can see it better. That should be interesting, because the computer I am using likes to fight me on that kinda stuff.

To be continued..... Okay, that's not happening today. The dots on the TTS graph are one year apart, with the right side at 2011 and the middle peak at 1998. Here is a link to the RSS website so you can look at it better. Part of the problem is that the TTS is not really the Tropopause. It includes a good deal of the upper troposphere and the lower stratosphere, so the tropopause is in there, but there is a lot of other noise. There is a graph showing the weighting of the TTS channel on the RSS link. I really need to tweak the weighting for the tropics, or average the TTS and TLS channels, to home in better on the tropical Tropopause. Eyeballing the average of the TTS and TLS, 1993 is weird. There seems to be top-down warming, while in 1998 it looks like bottom-up warming, like I would expect. That may mean that the Tropopause can suck up energy better than I expected, or that the MSU data sucks. I have to assume that the MSU data is not too suckie or I may tick those guys off.

BTW, while I am not into the whole back radiation thing, the Tropopause is cooler than the surface and the Stratosphere. So there is real "back radiation" from the Stratosphere to the Tropopause and upper troposphere that obeys the laws of thermodynamics without the fancy back radiation crap. That is consistent with my goofy Tropopause Sink theory. I will get into the radiation spectrum later to show the picture window that longwave radiation above and below 14.7 microns sees in the area of the Tropopause (14.7 microns is the main CO2-impacted radiation wavelength). This of course flies in the face of the general theory that the upper troposphere warms first. Since I haven't Tom Sawyered anyone into taking this mess over yet, I have a few more things to work on before I can start pulling it all together.
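
To put a rough number on the picture window idea, here is a sketch using the standard Planck function to compare emission at the 14.7 micron band from the warm surface versus a cold tropical tropopause. The 200 K tropopause temperature is my round-number assumption, not a measured value:

```python
import math

# Physical constants for the Planck spectral radiance function.
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Planck spectral radiance, W / (m^2 sr m)."""
    a = 2 * H * C ** 2 / wavelength_m ** 5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1
    return a / b

wl = 14.7e-6                      # the CO2 band wavelength used in the post
surface = planck(wl, 288)         # warm surface, ~288 K
tropopause = planck(wl, 200)      # assumed cold tropical tropopause
print(f"surface emits {surface / tropopause:.1f}x more at 14.7 microns")
```

The surface emits several times more energy in that band than the cold tropopause, which is part of why the upper layers look like a window to outgoing longwave.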

More Natural Gas Pains

The savior of our energy lifestyle, plentiful Natural Gas (NatGas), is being analyzed by everyone that even thinks they have a clue. I have written a bit about NatGas: how it is not all that "green" for transportation fuel and how the pipelines (infrastructure) need some serious upgrading. NatGas (CH4) is a greenhouse gas. It is short-lived as CH4, being converted over time to CO2 and water. With all the current buzz, I will once again step outside of my comfort zone and try to make sense of the whole deal.

According to Wikipedia, CH4 lasts about 10 years in the lower atmosphere, and should it make it through the lower atmosphere, it may last 12 years in the stratosphere. NatGas, or methane, is about 20 times more potent than carbon dioxide as a greenhouse gas and is responsible for about 12% of the greenhouse effect. Most sources of atmospheric methane are natural; it is a byproduct of nearly every biological function. Vegetarians produce more methane than omnivores and carnivores. So cows, grazing animals and PETA members are primary natural animal sources :). Decay of vegetation, incomplete burning, etc. are all sources. The question is how much is due to NatGas production for energy leaking into the atmosphere?
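
That ~10-year lifetime can be turned into a rough decay curve, assuming simple first-order removal (a common approximation, not a full atmospheric chemistry model):

```python
import math

# Fraction of a CH4 pulse still in the air after t years, assuming
# first-order removal with the ~10-year lifetime quoted above.
LIFETIME_YEARS = 10.0

def ch4_remaining(t_years):
    return math.exp(-t_years / LIFETIME_YEARS)

for t in (10, 20, 50):
    print(f"after {t} years: {ch4_remaining(t):.1%} of the pulse remains")
```

So a methane leak today is mostly gone within a couple of decades, which is the sense in which it is "short-lived" compared to CO2.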

During most of our petroleum production era, NatGas was not the main target of our drilling efforts. It has been "flared" off, or burned, until crude oil production from a well could be established. Since there is more NatGas than oil, when oil wells were close enough to a NatGas customer, the gas was piped in for use. Fairly recently, Liquefied Natural Gas (LNG) has been shipped to customers from more remote regions. Shipping LNG requires pretty solid contracts, because NatGas is so plentiful that a customer could shop providers and build a dedicated pipeline.

This is where the new shale deposits of NatGas enter the picture. There is a good article out there on Marcellus shale.

The map above shows that Marcellus shale is perfectly located for use near the high population areas of the US East coast. It also shows the depth of the shale formations. One thing I do not hear discussed by the talking heads is the proximity and depth of the NatGas to be produced by the Marcellus shale drilling operations. While there is bound to be some leakage, the short transport distance and the depth of the formations will tend to reduce the percentage leakage. Fracking is just tight-formation fracturing, as I have discussed before, and at those formation depths, leakage due to the process of "fracking" itself is about the stupidest conversation I have witnessed on the internet. If someone has a sensible, intelligent explanation of why fracking Marcellus shale is a problem, I would love to hear it. Normal drilling practice, now, is something else. Is there excess leakage at the well head? Is there ground water pollution? Did the drillers get your cousin drunk? There is nothing unusual about fracking; it has been used for decades for both oil and gas. So add "Fracking is bad" to your bullshit detector.

So whether shale formation gas production is bad for the environment boils down to just two things: is the top-side drilling operation sufficiently tight, and is the piping infrastructure up to the job? As for the infrastructure, most of the NatGas will be used for power generation. That means dedicated piping and on-demand use. Instead of storing the NatGas, it can be left in the formations and used as needed. That "greatly" reduces potential leakage. Greatly is in quotes because that is a bullshit detector trigger; I don't know how much "greatly" is in this case. I do know that new piping and less secondary storage will leak less than the old crap. Also, there have been some big leaks caused by trying to store gas in underground formations that the gas didn't come out of. So do your own research!

Anyway, the whole NatGas thing really is just normal stuff blown out of proportion. The 100 year old NatGas infrastructure sucks. The old infrastructure is not rated for modern pressures and is not designed for hydrogen enriched gas. That will be the next aw shit moment. And old equipment ain't as good as new equipment, most of the time. It is just normal engineering stuff. Don't kill the plentiful source, deal with the issues.

Tuesday, April 19, 2011

Nuclear Power Thoughts with Fukushima in Mind

I am pro nuclear power, with a few caveats. The Fukushima situation has strengthened most of my opinions about nuclear and anti-nuclear activists.

Activists for the most part are driven by their agenda and have little use for facts or fact checking. There are exceptions of course, but not that many really. They will quote any source that gives them ammunition for their crusade, no matter how outdated or wrong it may be. The activists have been out in full force with Fukushima, but this time it seems their zeal, despite the evidence, has bitten them in the ass.

George Monbiot, a left wing type blogger, picked up on the lack-of-facts thing. In Why Does It Matter, he is focused on the facts for a change after being duped for quite some time. Why he has suddenly started to use logic and reason is a bit of a puzzle. The nonsense has been out there for a long time; all he had to do was question the logic, i.e. build a bullshit detector. His revelation may be past due, but it has come.

I had mentioned a few passionate activists spreading lies without having a clue they were just pawns of other bullshitters. The Penn and Teller Dihydrogen Monoxide video is a way of life for many of the activists. Some though are the bullshit originators. As the Japanese radiation situation unfolded, one activist posted a radiation map. The data is supposedly real time, updated daily by SPEEDI, whoever they are. The original posting of the map mentioned that the data for Fukushima and a neighboring prefecture were not posted, and insinuated that a possible Japanese Government cover-up was involved. Well, the real time data from SPEEDI stopped 17 days ago and the little anti-government note is missing. No, I do not believe government censorship was involved; more likely the situation was getting better, which went against his agenda. By the way, data for Fukushima prefecture and the neighboring prefecture were available from very early on at the MEXT website. It is in PDF form, so the concerned activist would have actually had to look.

There will be more quotes available for activist use soon as the legal process for Fukushima begins. Expert witnesses can be found to testify for or against anything. So the misinformation hasn't even started. It will be interesting to find if the Japanese legal system is bullshit tolerant, for there will be plenty of bullshit to spread. All of this will confuse the real issues which I had hoped would finally be addressed. My main one is the degree of safety of light water reactor designs.

The basic light water reactor design is very safe, though not terribly efficient. Using regular water as a coolant and moderator makes these reactors self regulating. A maximum chain reaction requires the water moderator. Should the water overheat, steam bubbles (voids) displace the moderator and slow the reaction. If all the water disappears, the reaction nearly stops. It really is a very safe design. The problem with Fukushima and Three Mile Island is the size of the reactors. While they still have the inherent safety of the design, the mass of fuel and the residual heat of that mass is enough to cause fuel damage, i.e. melting, in a loss of coolant event.

Because of the fuel damage, imaginations run wild. The China Syndrome comes to mind, and people forget that it is a fantasy, a movie, not the real world. Once people start with the theoretical, it is Katy bar the door, anything goes. The designs, even the old General Electric Mark I without the recommended modifications, will never produce a China Syndrome event. Much of the radiation released in Japan is due to attempts to save the population from something that cannot happen, a China Syndrome.

While imaginative use of the theoretical can convince people that doom is on its way, the real physics behind the design shows there is very little probability of a total meltdown burning through the reactor pressure vessel much less the concrete containment building. Unfortunately, this design feature has never been fully tested, so the theoretical can be brought into play by activists.

Smaller light water reactors eliminate even the theoretical possibility of a catastrophic meltdown. A few hours after the control rods are inserted to stop the reaction, the energy output of the reactor mass drops to about 1.5% of its maximum operating energy. The Fukushima reactors were rated for about 750 megawatts electric with roughly a 33% efficiency. The maximum thermal power of the reactors, allowing for efficiency, is about 2,250 megawatts; 1.5% of that is 33.75 megawatts, or 33,750 kilowatts. That is enough energy to melt the fuel rod cladding and some of the fuel oxide, which has a higher melting point. That is enough energy to melt some of the way through the massive bottom of the reactor pressure vessel, but not enough to penetrate en masse. A smaller light water reactor designed for, say, 200 megawatts at the same efficiency would have a maximum thermal power of about 600 megawatts and, a few hours after shutdown, a residual heat output of 9 megawatts. This amount of energy could cause some melting of the fuel cladding, but is unlikely to melt the fuel oxide, so there is little chance of a major fuel damage scenario. That means the accident would be just that, an accident that is fairly easily corrected, and the reactor could quite possibly be placed back into service. There is a huge potential cost difference between the design scales.

An even smarter design is 50 megawatts: with 150 megawatts maximum thermal power reduced to 1.5%, you have 2.25 megawatts residual heat. That is 2,250 kilowatts, roughly enough to power an average US neighborhood, not just a household (an average household peaks at about 20 kilowatts). It is unlikely that there would be any damage at all to the reactor core in a loss of coolant accident.
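The scaling argument above is easy to sanity-check. Here is a minimal sketch, using the post's own rough assumptions (thermal power about 3x electric at ~33% efficiency, decay heat at ~1.5% of thermal a few hours after shutdown); these are illustrative round numbers, not a reactor safety calculation.

```python
# Rough decay-heat comparison for different light water reactor sizes.
# Assumptions (from the post, not a safety analysis):
#   - thermal power ~ 3x electric output (i.e. ~33% efficiency)
#   - residual decay heat ~ 1.5% of thermal power a few hours after scram

THERMAL_MULTIPLIER = 3.0   # thermal MW per electric MW (assumed)
RESIDUAL_FRACTION = 0.015  # decay heat fraction a few hours after shutdown

def residual_heat_mw(electric_mw):
    """Approximate decay heat in MW a few hours after shutdown."""
    thermal_mw = electric_mw * THERMAL_MULTIPLIER
    return thermal_mw * RESIDUAL_FRACTION

# Fukushima-class, mid-size, and small modular examples from the text
for size in (750, 200, 50):
    print(f"{size} MWe -> ~{residual_heat_mw(size):.2f} MW decay heat")
# 750 MWe -> ~33.75 MW, 200 MWe -> ~9.00 MW, 50 MWe -> ~2.25 MW
```

The point of the sketch is just that decay heat scales linearly with reactor size, so the small design has an order of magnitude less residual energy to deal with in a loss of coolant event.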

These smaller designs also do something very interesting, by eliminating the theoretical dooms day scenario, they can actually be evaluated by insurance actuaries so that they can be insured by the utility instead of the government. They can even be used in destructive testing to "prove" what would happen in a realistic worst case accident. Eliminating the bizarre fantasies of the theoretical with realism is a good thing.

A less-than-realistic destructive test, say one where the reactor is run full bore without coolant, would cause some meltdown, damaging the pressure vessel and core controls to the point that the reactor is unusable, but even that would not produce a very impressive accident. Then you could assume that Homer Simpson's less intelligent cousin was in charge and tried to cool the reactor with just enough water to produce maximum steam, and therefore maximum radioactive fallout, kinda like at Fukushima, which would produce nearly a Three Mile Island event. In other words, small modular light water reactors are damn near idiot proof! (Added: Here is another view of small modular reactors from safety and security standpoints.)

Small modular reactors in the range of 50 to 150 megawatts each, used in nuclear power parks, can produce equivalent power output at a more reasonable cost with much greater safety. That makes way too much sense though; I mean, realistic regulations might even become part of life if we lose the theoretical.

Since facts are not much use in the activist world, we can always go with man powered treadle devices to recharge our electric cars in a few days. Then, there is no real need to think, is there?

Monday, April 18, 2011

Future Farming in Fukushima Prefecture

The radioactive fallout in Fukushima is a concern for many, but the farmers and residents of the prefecture are the ones with the real concerns. Many who purchase foodstuffs grown in the prefecture are greatly concerned that the meats and produce be safe.

One area the prefecture can be compared to is Bikini Atoll, the site of hydrogen bomb tests following World War II. This is a very interesting comparison in that it involves both real and legal assessment of the risks. Legally, the once and future residents want the cleanest possible conditions, which means much lower background radiation standards than many places in the world. A battle between the US Environmental Protection Agency (EPA) and the Nuclear Regulatory Commission (NRC) is at the heart of the issue.

To many, including the NRC, the EPA regulatory limits are impractically restrictive. From the Bikini Atoll website: "To give one an idea of how strict this 15-millirem standard is, the EPA stated that the standard means that a person living eleven miles from the Yucca Mountain site, the distance to which the standard applies, will absorb less radiation annually than a person receives from two round-trip transcontinental flights in the United States. The EPA also stated that background radiation exposes the average American to 360 millirem of radiation annually, while three chest x-rays total about 18 millirem." The Yucca Mountain case was used by the Bikinians to press for tighter clean-up standards.

It is understandable that people are worried and afraid of radiation. The standards are so poorly defined that few, even experts in the field, agree on what is reasonable. 100 millirem per year, equivalent to 1,000 microsieverts per year, is considered a reasonable background dosage, even though many places in the world have higher natural background radiation doses. To my knowledge, there are no studies showing any indication of increased health risk at 1,000 millirem per year of background exposure. The 100 millirem standard, which works out to approximately 0.1 microsieverts per hour, already appears to be very conservative.

1,000 millirem per year (~1.1 microsieverts per hour) is greater than the effective annual average combined dose (background and other exposure) of 3,000 microsieverts per year (~0.34 microsieverts per hour), but much less than areas with high natural background radiation and no evident radiation health risk. "In Guarapari, Brazil, a city of 80,000 inhabitants built on the seaside, peak measurements made by EFN on the thorium-rich beach were as high as 40 microSv/hour (about 200 times higher than the average natural background radiation in other areas of the world)." Based on other regions with higher background radiation, it would seem that 1 to 10 microsieverts per hour is not an unreasonable estimate of safe background exposure.
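These unit conversions trip up a lot of the radiation arguments, so here is a minimal sketch of the arithmetic, assuming the standard relations 1 millirem = 10 microsieverts and 8,760 hours per year.

```python
# Convert annual radiation doses in millirem to hourly rates in microsieverts.
# Standard conversions: 1 millirem = 10 microsieverts; 8,760 hours per year.

HOURS_PER_YEAR = 365 * 24  # 8,760
USV_PER_MREM = 10.0

def annual_mrem_to_usv_per_hour(mrem_per_year):
    """Annual dose in millirem -> continuous hourly rate in microsieverts."""
    return mrem_per_year * USV_PER_MREM / HOURS_PER_YEAR

print(annual_mrem_to_usv_per_hour(100))   # the 100 mrem/yr standard, ~0.11 uSv/h
print(annual_mrem_to_usv_per_hour(1000))  # 1,000 mrem/yr, ~1.1 uSv/h
```

Run against the numbers in the text, this confirms that 100 millirem per year is roughly 0.1 microsieverts per hour and 1,000 millirem per year is roughly 1.1 microsieverts per hour, tiny compared to the 40 microsieverts per hour measured on the Guarapari beach.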

Ultimately, that decision should be up to the residents of the area impacted by the Fukushima NPP accident. Another question for farmers in the area is the uptake of radioactive isotopes by crops. Cesium-137 is the primary fallout isotope. According to the same Bikini Atoll study, potassium fertilizers inhibit plant uptake of cesium-137. Proper fertilization to build potassium levels, along with normal tilling and erosion, should enable most affected farmland to be productive without any significant increase over normal radiation levels.

Realistically, the radiation damage is much less than most in the press have indicated. Legally it is a different matter. Many residents may opt for much tighter standards and the larger compensation they are likely to win through the courts.

The situations in both Japan and Bikini Atoll illustrate the need for a much clearer definition of radiation exposure standards. With the inconsistent and overly conservative current limits, the nuclear industry may not be viable. Moving to more realistic standards is sure to be opposed by the anti-nuclear groups responsible for the much tighter standards now in place. It will be interesting to see whether logic or passion wins.

The Zonal Potential Temperature plot above is from this Atmospheric Physics PDF. If you read it, you will see there is a lot of atmospheric physics to learn. The big thing I want to look at on the plot is that up-bump in the northern hemisphere.

When I posted my crappy drawing, it was not to scale. The reason simple two dimensional radiation models are so popular is that they assume the arc of the Earth at any point is pretty much flat with respect to any single point above it. So my CO2 molecule would look more like a BB in a parking lot than a sphere above a sphere. The atmosphere ain't that simple though. Looking at the potential temperature plot, there is a big hump in the tropics and a smaller hump near the north pole. Think of them as high chairs for the CO2 molecule. There is a much better view from the high chair, meaning more radiation window for the infrared heat to see.

These high chairs are also not fixed. The atmosphere is like a 60's lava lamp, with stuff moving up and down in a somewhat random pattern. Each one of the up-bubbles can dump tons of heat to the tropopause, and from there to space through the bigger picture window.

There are a lot more of these up-bubbles in the northern hemisphere because of the land mass to sea surface ratio. As surface warming from either natural or greenhouse gas forcing occurs, the size and frequency of the up-bubbles will increase. With an increase in water vapor due to warming, much more heat will be transported up. So this is a major part of the Earth's temperature control system.

Cloud cover is one of the big questions in climate change: will it increase? Of course it will; where it will increase is the bigger question in my mind. It is pretty obvious that it will increase at certain times of the year in the northern hemisphere. Should it increase near the tropics, it could completely wipe out the impact of greenhouse gas forcing, though that is a bit unlikely. There will be negative forcing from increased cloud cover, but it is unlikely to be greater than 25% of the radiative forcing of the greenhouse gas increase. Still, 25% is much greater than the less than 10% treated as natural unforced variation in the climate models. This will tend to greatly limit the positive water vapor feedback assumed by many climate scientists.

It kinda sorta brings Arrhenius' second estimate of 1.6 degrees C for a doubling (2.1 with water vapor) back into the picture. That 1.6 estimated by Arrhenius has been reduced by most modern re-estimates to the 1 to 1.2 degree C range. Now all I have to do is prove it! Fat chance, right?
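For what an estimate like Arrhenius' implies in practice, here is a minimal sketch. It assumes the standard logarithmic CO2 response (warming scales with log2 of the concentration ratio) and uses the post's 1.6 degrees C per doubling; the 280 ppm pre-industrial and 390 ppm circa-2011 concentrations are illustrative round numbers, not from the post.

```python
import math

# Logarithmic CO2 response: warming per doubling times log2 of the
# concentration ratio. Sensitivity of 1.6 C/doubling is the post's figure.

def warming_c(c_ppm, c0_ppm=280.0, sensitivity_per_doubling=1.6):
    """Temperature change (C) for a CO2 rise from c0_ppm to c_ppm."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

print(warming_c(560))  # one full doubling -> 1.6 C by construction
print(warming_c(390))  # roughly 2011-era CO2, well under 1 C
```

The same function with a 1.0 to 1.2 sensitivity, as in the modern re-estimates mentioned above, scales every result down proportionally.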

The bugger is that data for the troposphere, especially in the Arctic, is pretty sparse and does not go back very far. There is a larger gap in the North Pole satellite data because of the orbits, so a good deal of what is happening in the far North is a mystery. There are more Arctic tropopause maps available now, though, thanks to guys looking into the wild and whacky Tropopause.

So while it will take me probably forever to even figure out how to solve this puzzle, I will keep playing with it. I will probably come back to this post to add some pretty pictures and hopefully an animation of the Tropopause Lava Lamp.

Sunday, April 17, 2011

Things that Go Bump in the Twilight

Above is the NASA Earth Energy Budget cartoon. While it is simplified, it still shows that there is a lot going on. It only takes a small change in any of the percentages shown to make a pretty big difference in climate. If you combine the incoming reflected arrows, 30% of the Sun's energy is reflected to space. Clouds are the largest reflected component. Cloud cover can increase by a couple of percent and wipe out all of the increased greenhouse gas forcing. With rising temperature there is more water vapor, so there is a good likelihood there will be more clouds. That is not my theory; plenty of other people are arguing over that one. Surface reflection is only 4% and atmospheric reflection is only 6%, with most of that happening above the tropopause.

For surface reflection to impact climate, there has to be a larger percentage change in surface albedo to have the same impact as a cloud cover change. That is not to say that surface albedo is not important; it just has to change more to have an impact similar to the constant changes in cloud cover. The biggest surface albedo changes are snow cover, sea ice and land use.

Since we are talking about reflection, it is easy to see how bright white snow is reflective. That bright white gets less bright with dirt, dust and ash falling out on the snow. Most of that fallout we have some control over. In the US and other top economies, air pollution is a big deal, so there has been great progress cleaning the air by removing ash from energy use. Developing nations are finding out that too much air pollution is really bad, so they will be forced to reduce it, either because of their own problems or because neighboring nations get hacked off. So in the normal course of growing, ash or black carbon will be reduced. A little extra bitching can speed that process up, so I am not particularly concerned with the black carbon part of the puzzle.

Land use is a lot more complicated. Farming and housing expansion are the main things changing land use. The great dust bowl of the 30's forced the US and Canada to rethink farming practices, which was watched closely by other nations, so dust from agriculture is becoming less of a problem. Part of the solution though, irrigation, adds a new dimension to the puzzle. Areas which once had lower humidity are now irrigated, increasing the local temperatures, mainly at night. Wetlands have been drained to increase farmland and provide more land for housing expansion. Forests have been cut down as well. While it is harder to visualize, forests and natural wetlands have more than just an albedo impact. Many plants, especially trees, can control their temperature somewhat. It is cooler under trees not only because of the shade, but also because photosynthesis is an endothermic chemical process. That means leaves absorb heat energy and basically exhale cooler air.

Humans have been whacking down trees forever. Trees provide shelter and fuel. They also block sunlight needed for farming. More trees are a good thing, but with the drive for biofuels and food, forest areas are still under huge pressure. This is a dicey issue. You can't ask people to starve for the good of mankind. You can bite the bullet and choose less "clean" and "safe" energy sources to reduce agricultural demand. You can also support a more symbiotic relationship between humans and the environment. This forces opinionated buttheads to realize that the enemy of my enemy is my friend. The PETA and enviro whack jobs have to understand that the hunting whack jobs share a common goal: more natural wood lands and wet lands. The hunting whack jobs have done more to preserve natural game habitats in the past few decades than the PETA and enviro whack jobs have. Will the whack jobs come to their senses? I doubt it; some people are pretty dense. PETA guys, for example, are big into "organic" things like bananas and neat tropical fruits. Where are the bananas and neat tropical fruits grown? In areas that were once rain forests. They would have a bigger impact trying to ban coffee, bananas, palm oil and ethanol. So PETA promoting "EAT MOR Chikin", the other white meat, or Hasenpfeffer, would be supporting more food protein per unit of land than banning meat. Most humans are omnivores, get over it!

So surface albedo is pretty much either going to take care of itself or not. Cloud cover changes are going to be debated. CO2 is going to continue to rise until people embrace energy reality. Nukes are scary, dangerous and expensive, but they are very clean compared to the other choices which seem to always be the energy of the future. Until then, people are still going to screw up pushing bright ideas that have unintended consequences. That is just the way it is.

So that brings me back to the tropopause and why people are not the sharpest tacks in the box. People are opinionated and suffer from politically focused tunnel vision. That causes us to miss the obvious. This planet has been around a lot longer than us, so it probably has a few tricks left up its sleeve. Man has not even come close to exploring the atmosphere's response to our impact.

So why did I title this post Things that Go Bump in the Twilight? Because when you model the troposphere in day or night mode, you can miss the subtle activity of the in-between. Day and night may be the big players, but it only takes a few percent to make big changes. The radiation balance of the tropopause changes a lot in the twilight. There is no up or down as far as radiation is concerned, only hard or easy. It is easy to see the change in incoming solar: the angle is different, scattering increases, and there is less absorption and more reflection. Why would we assume it is any different for longwave? Longwave interacts with the greenhouse gases more than the inert gases, which heats them. Water vapor has a much broader spectrum than the other greenhouse gases, so that heat, in the form of infrared radiation, flows to cooler areas. In the tropopause, everything in every direction is cooler except for down. So as heat is transferred to the tropopause by convection, greenhouse warming or mixing rising air, it has a lot of open paths to escape to space. The remarkably stable temperature trend of the tropopause shows that those paths are used; we just don't know how well.

Perhaps thinking from a twilight perspective is the easiest way to explain the wondrous protection the tropopause gives us as a thermostat to control temperatures. Looking at the tropopause, very small changes would give us that few percent that has the big impact on climate.
