Global Warming
Fudge Factor Replaces Science
Saturation Precludes
Gary Novak


Temperature Effects cannot be Determined from Radiative Transfer Equations.

It's extremely strange that radiation was calculated when there is no method of converting radiation into temperature. The radiation was calculated using radiative transfer equations, with the results expressed as the three-component fudge factor. From this, the primary effect of carbon dioxide heating the atmosphere is supposedly determined.


To model temperature increase due to CO2, a primary effect is the starting point, and then secondary (feedback) effects are added. The primary effect is described as "radiative forcing due to CO2 without feedback," which is the fudge factor, and this is converted to a temperature with a "conversion factor." The result is said to be 1C for doubling CO2 in the atmosphere. The simple math is this:

Primary Effect

5.35 × ln(2) = 3.7 W/m² = 1C.
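
The 3.7 figure is the simplified forcing expression ΔF = 5.35 ln(C/C0), with the 5.35 W/m² coefficient coming from Myhre et al. (1998), mentioned below. The arithmetic itself is easy to check:

```python
import math

# The simplified CO2 forcing expression the "primary effect" rests on:
# delta_F = 5.35 * ln(C / C0), with 5.35 W/m^2 from Myhre et al. (1998).
def co2_forcing(concentration_ratio):
    return 5.35 * math.log(concentration_ratio)

print(round(co2_forcing(2.0), 3))   # 3.708 W/m^2 for doubling CO2
```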

This little calculation is promoted as an unquestionable law of physics upon which all else is based. It is the primary effect from which secondary effects are modeled. Only the secondary effects are in question. The calculation is total fakery.

What this means is that climatologists claim that the primary effect upon doubling carbon dioxide in the atmosphere will heat the atmosphere by 1C, and then modeling is used to determine secondary effects, which supposedly add 2C for a total of about 3C. The primary effect is not questioned, supposedly being an absolute law of physics, and only the secondary effects are studied and argued.

A source for this claimed law of physics cannot be determined. A quote for it comes from Stefan Rahmstorf, "Anthropogenic Climate Change: Revisiting the Facts," pp. 34-53 in "Global Warming: Looking Beyond Kyoto," edited by Ernesto Zedillo, 2008, where he said this: "Without any feedbacks, a doubling of CO2 (which amounts to a forcing of 3.7 W/m²) would result in 1C global warming, which is easy to calculate and is undisputed." Rahmstorf's citation is this: "IPCC, Climate Change 2001: Synthesis Report." There is nothing resembling the Rahmstorf claim in the IPCC reports.

Inquiring scientists cannot find a source for the calculation which Rahmstorf refers to. Attempts to explain it result in endless complexity and confusion. The simple reason is because there is no way to determine temperature from radiation in the atmosphere.

Applying the Stefan-Boltzmann Constant

Climatologists presumably use the Stefan-Boltzmann constant (SBC) to derive the temperature of 1C from the 3.7 watts per square meter, because they all say the number is easy to calculate, and only such a simple calculation as the SBC shows a relationship between radiation and temperature. Physicists claim the relationship goes both ways: from temperature to radiation, and from radiation to temperature.

The SBC is this:

     W/m² = 5.670373 × 10⁻⁸ × T⁴   (T in kelvin)

The global average, near-surface temperature is said to be 15C (288 K). Average emissivity is said to be 0.64. Presumably the SBC is linearized, giving a sensitivity of 4εσT³ ≈ 3.5 W/m² per degree at those values, and the claimed forcing is divided by that sensitivity:

3.708/3.486 = 1.064C

The result is the desired 1C for the primary effect of doubling CO2 in the atmosphere, as if climatologists could calculate such things with extreme precision. They claim about 1% error on this factor. However, the SBC shows about 20 times too much radiation at normal temperatures. Reducing the radiation in the SBC by a factor of 20 shows this:


3.708/0.175 = 21.189C

The result shows 20 times as much temperature increase as climatologists claim, when the SBC is corrected for too much radiation. None of these results are real, as the claimed radiation (3.708 W/m² upon doubling CO2) was contrived for the purpose of eliminating the significance of saturation. With saturation, no radiation change would occur to increase temperatures as claimed for global warming.
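
Both divisions above can be reproduced. This is a sketch of the presumed arithmetic, assuming the divisor is the linearized SBC at 288 K with emissivity 0.64 (which gives about 3.47 rather than the 3.486 used in the text, a rounding difference):

```python
import math

# Reproducing the arithmetic in the text: the claimed forcing
# (5.35 * ln 2) divided by the linearized Stefan-Boltzmann relation
# W = eps * sigma * T^4, at the assumed 15C (288 K) and emissivity 0.64.
SIGMA = 5.670373e-8        # Stefan-Boltzmann constant, W/m^2/K^4
EPS = 0.64                 # emissivity assumed in the text
T = 288.0                  # 15C in kelvin

forcing = 5.35 * math.log(2)        # about 3.708 W/m^2
dW_dT = 4 * EPS * SIGMA * T**3      # sensitivity, about 3.47 W/m^2 per K
dT = forcing / dW_dT                # about 1.07 C

# The text's correction: the SBC shows about 20 times too much radiation.
dT_corrected = forcing / (dW_dT / 20)   # about 21 C
print(dT, dT_corrected)
```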

In addition to the quantitative absurdities, it is not valid to reverse the Stefan-Boltzmann constant as a method of determining temperature, and there is no other method of getting temperature out of any scientific calculation. Temperature is determined by the total energy dynamics of changing systems, with heterogeneity in complex systems. The forward direction of the SBC looks only at a definable surface, while the reverse of the SBC is influenced by the total dynamics. Yet the result of the radiative transfer equations is translated into the temperature of the near-surface atmosphere based upon a claimed reduction in emission at the top of the troposphere.

What this shows is that climatologists started at the end point of 1C being the desired near-surface temperature increase upon doubling CO2 in the atmosphere, but correcting the math (SBC) shows 20 times more than they would have wanted for a result.

Ocean Fraud: Notice that the calculations above show a fixed relationship between the 3.7 W/m² and 1C. These numbers have existed since the seventies. Myhre et al. stated in 1998 that these numbers were only off by 15%. It means there is no place for subtracting ocean heat in these calculations.

Recently, the explanation for the "pause" is that measurements of ocean heat were re-adjusted, and the missing heat was found. Then the claim emerged that 90% of the heat caused by CO2 ended up in the oceans. If so, that 90% must be subtracted from the 1C claimed, because it is a calculated total. But no subtracting for ocean heat is being done in the claims for air temperature increases.

If the oceans are absorbing 90% of the heat produced by CO2 in the atmosphere, the primary effect in the near-surface atmosphere needs to be reduced from 1C to 0.1C upon doubling the amount of CO2 in the air. This result would be irrelevant, because the social concern has been that the temperature should not go up to 2C.

Such contradictions cannot be resolved, because there is no real science. The numbers are contrived by starting at the desired end points, which leaves no space for changes afterwards.

Perhaps more importantly, why did the oceans just start to absorb heat from the atmosphere when the pause began in 1998? Why weren't the oceans absorbing the heat before then? The surface area of the oceans has not changed significantly. If something is causing variations in the ability of oceans to absorb heat, how reliable are any of the temperature measurements?

There are no real answers to these absurd questions, because there is no real science to the subjects. The claims are totally fabricated.

The real reason for the temperature pause which supposedly began in 1998 is that temperature measurements were contrived by lowering earlier measurements and increasing recent measurements. That process can only be done once. One of the tricks was to discard cold-reading stations, which was a one-time process.

So, how real is the error in the Stefan-Boltzmann constant? Saying that a cold basement wall at 15C is giving off 390 W/m² is totally preposterous. Physicists are not exactly saying otherwise; they are saying that it also absorbs that amount, so you do not notice anything. But that claim is absurd also, because biological processes, and other complexities such as ice melting, would be sensitive to the difference between emission and absorption, and skin cells would be fried by that much energy being absorbed, regardless of how or when it is re-emitted.
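
The 390 W/m² figure for a 15C surface is just the Stefan-Boltzmann relation evaluated at 288 K with an emissivity of 1; a quick check:

```python
# The 390 W/m^2 figure for a 15C surface is the Stefan-Boltzmann
# relation evaluated at 288 K with emissivity 1.
SIGMA = 5.670373e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T = 288.0             # 15C in kelvin

W = SIGMA * T**4
print(round(W))   # 390
```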

Before absorbed radiation can be re-emitted, it must first be converted into heat, which means molecules vibrating. Those vibrating molecules increase in their chemical reactivity as their temperature increases. Biological systems will not tolerate significant increases in temperature without being destroyed.

Biology is like a thermometer which can tell the difference between radiation absorption and emission. If a cold basement wall were emitting 390 W/m², you wouldn't be able to get near it without skin cells being rapidly heated and damaged.

It means there is no mysterious cancellation of the high absorption and emission indicated by the SBC, and it means climatologists started at a desired end point and contrived the method of getting there. It's the only thing they do in climatology, because the randomness and complexities of climate cannot be reduced to scientific analysis.

Another problem with such an analysis is that the SBC is not appropriate for the purpose. It relates to the surface of an opaque solid. But the temperature in question is the near-surface air temperature. A transparent gas radiates in a vastly different manner than the surface of an opaque solid.

The absurdity of the Stefan-Boltzmann constant shows up in an extreme way in the attempts to create an energy flow chart for the planet, described here.

Alternative Explanation

I've seen an alternative explanation for deriving the 1C primary effect. It uses the assumed temperature increase each month due to global warming and correlates it with satellite measurements of radiation escaping from the tropopause into space. There is circular logic involved: the assumed global warming must be measured to calculate the assumed global warming. Similarly, satellite measurements of radiation going into space are used to calculate the amount of radiation going into space.

If measurements are needed to determine the results, why do calculations? The only thing calculations do is get rid of saturation, while they cannot be used for anything else, and measurements must determine the results. Rationalizers use crude measurements to get high precision with their calculations. The more they mess with it, the more perfect it gets.

Besides the circular logic being used as a rationalization, the measurements are worthless. Measuring global warming with thermometers is a total farce, as only about 10% of the earth's surface is measured, with very few measurements at the poles or in impoverished countries, and none over the oceans. Satellite measurements showed almost no increase, until they were re-adjusted (in the wrong direction) to conform to thermometer measurements. And even then, thermometer measurements showed no significant results in the raw data, until earlier measurements were lowered, and later measurements were increased, to show an increase. Scientists cannot find out how or why the adjustments were made for measured temperatures. And then no increase has occurred since 1998.

Satellite measurements of radiation being emitted from the atmosphere are just as farcical. Absolute values are needed for the claimed forcing, while satellites can only determine relative change. For some purposes, relative values can be referenced to absolute values, such as by using weather balloons. But weather balloons only measure temperature; they cannot reference radiation going into space. Calibrating satellites for absolute values of radiation is impossible, because each wavelength has a different intensity, and the total aggregate requires evaluating each separately, which satellites cannot do. Crude measurements are, however, used in the rationalizations.

So a primary effect cannot be determined, yet it is needed to produce an amount of heating to be acted upon by secondary effects. Without a primary effect, there is no value to be entered into computer models for heat resulting from carbon dioxide. So a primary effect is contrived and said to be an unquestionable fact, because modeling cannot proceed without such a number to build upon.

Calculating the relationship between radiation and temperature is totally impossible due to infinite complexities. One of the problems is that radiation being absorbed by a molecule is partially re-emitted at black body wavelengths. How the energy is distributed before being re-emitted determines the temperature increase. No theory can say how the energy is distributed.

This image shows how energy is re-distributed when radiation is absorbed by carbon dioxide.

[Image: molecules radiating; emission peak]

This is a one-time absorption of radiation.

When a molecule of CO2 in the atmosphere absorbs fingerprint radiation (the only thing in question), it increases in vibratory motion, which is heat. As it bumps into surrounding molecules (mostly nitrogen gas), it imparts some motion, which reduces its own motion while increasing the motion of the other molecule. This bumping goes from molecule to molecule as the energy spreads through the atmosphere.

The vibrating motion of molecules sends out waves of infrared radiation. As the molecular motion decreases, the intensity of the radiation and its frequency get lower. Since the atmosphere already has heat in it, radiation is constantly being emitted. Absorption of fingerprint radiation just slightly increases the intensity. With slight changes in energy, a CO2 molecule probably has to bump nearby molecules a hundred times or more to give up half of its gained energy.

If the average wavelength of emission is 25 microns, there are 83 femtoseconds for each initial bump (frequency equals velocity over wavelength; time equals the inverse of frequency: 3×10⁸ ÷ 25×10⁻⁶ = 1.2×10¹³ Hz, and the inverse is 83×10⁻¹⁵ seconds). A hundred bumps would give up half of the energy in 8.3 picoseconds. Maybe it actually requires 200 bumps to give up half of its energy; this would require 17 picoseconds. No one knows exactly how the energy disperses through the surrounding environment. So no one knows how much energy is retained before it is radiated out again. Each of the molecules which receive energy will emit some outflowing radiation, but how much and when cannot be determined. So no one knows how much temperature increase occurs for the few molecules which radiate the energy away. It might take 10 or 20 molecules to re-radiate the energy, while the other 2,480 surrounding air molecules are unaffected.

There is extreme variation in the amount of time required (the number of bumps involved) to release absorbed energy, because the intensity varies with the temperature of the emitting molecules, and the rate of release varies with the nature of the wobble induced in the absorbing molecules. With equal temperatures for emitting molecules and absorbing molecules, as few as 5 bumps could release the absorbed energy. Five times 83 femtoseconds would be 415 femtoseconds to give up the energy of fingerprint radiation.
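
The timing figures above follow from the wave period at 25 microns; a sketch of the stated arithmetic (the 5-, 100-, and 200-bump counts are the text's assumptions):

```python
# Timing arithmetic from the paragraphs above: the period of a 25-micron
# wave, and the time for the bump counts the text assumes.
C = 3e8                 # speed of light, m/s
WAVELENGTH = 25e-6      # 25 microns

frequency = C / WAVELENGTH    # 1.2e13 Hz
period = 1 / frequency        # about 83 femtoseconds
print(period)

for bumps in (5, 100, 200):   # bump counts assumed in the text
    print(bumps, bumps * period)   # roughly 0.4 ps, 8.3 ps, 17 ps
```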

This bumping and emitting is happening to every molecule in the air at all times, while the net result is equilibrium between the energy entering from the sun and the energy being emitted into space. The resulting equilibrium temperature is determined by the temperature of the entire emitting mass of the atmosphere and surface of the earth, not some minuscule greenhouse gases.

Continuous Energy Transfer

The required 2,500C temperature for CO2 to heat the whole atmosphere by 1C average is a one-time addition. It only shows the nature of dilution. In actuality, the radiation flows create a dynamic system with energy flowing in and out constantly.

The one-time analysis does not indicate what the actual temperature would have to be for each CO2 molecule to heat the surrounding 2,500 molecules to 1C average. If the heat loss were extremely fast, the temperature of each CO2 molecule would have to be close to 2,500C. If the heat loss were extremely slow, the CO2 would be less than 2,500C.

The heat loss is extremely fast, which means each CO2 molecule would have to be close to 2,500C to create an average atmospheric temperature increase of 1C.

The task would be like trying to heat a brick building by heating one brick. What temperature would that brick have to be to heat the whole building 1C average? If the heat were disappearing very fast, the brick would have to be extremely hot. If the heat were disappearing slowly, the brick would not have to be so hot.

A Calculation for Continuous Dissipation

An analysis of continuous absorption and re-emission of radiation goes like this: each time a CO2 molecule bumps a surrounding molecule, it loses half its energy through conduction, and it also loses energy through radiation. But the energy is constantly being replaced. Losing half of the energy while being replaced at the same rate would result in three fourths of the energy being retained. Reduction to three fourths would be 1,875C.

The CO2 molecule is also emitting radiation while absorbing radiation. If the rates were equal, another reduction by half would be required for the loss of energy to sustain the needed temperature, which results in 2,188C being the required temperature for transferring enough heat to the surrounding 2,500 air molecules for a 1C average air temperature increase. The remaining 312 units of heat, as temperature increase, would be distributed between the molecules which the CO2 bumps into. It would bump into about five surrounding molecules, heating each one by 62C, to get the required 1C total over 2,500 molecules.
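
The numbers in that passage appear to reduce to simple fractions of the 2,500 figure; a sketch reproducing them (the fractions and the five-molecule count are assumptions taken from the text):

```python
# The dilution arithmetic appears to reduce to fractions of the 2,500
# figure (one CO2 molecule per ~2,500 air molecules; 2,500C for a 1C
# average). The fractions and the five-molecule count are assumptions
# taken from the text.
TOTAL = 2500

retained_conduction = TOTAL * 3 / 4      # 1875: half lost, half replaced
retained_with_emission = TOTAL * 7 / 8   # 2187.5: a further halving of the loss
remainder = TOTAL - retained_with_emission   # 312.5 units of heat
per_molecule = remainder / 5                 # about 62C per bumped molecule
print(retained_conduction, retained_with_emission, remainder, per_molecule)
```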

However, the absorbed radiation is fingerprint radiation, which is weaker than the emitted radiation, which is black body radiation. The fingerprint radiation which CO2 absorbs is 8% of black body radiation. This means emission is 12.5 times stronger than absorption.

But equilibrium would require emission to equal absorption. The higher tendency to emit than to absorb would drag down the temperature increase by CO2. Hypotheticals break down at this point, because the needed 2,500C is a total absurdity to start with.

What would actually occur is that the CO2 would only be heated trillionths of a degree centigrade, and no greenhouse effect would occur. Why trillionths of a degree? Because radiation is extremely weak. Its energy is dissipated in femtoseconds. The energy cannot build up. This effect rides on top of normal temperatures, which are mostly produced through conduction, convection and evaporation.

If energy were coming from a warm earth and going into a cold atmosphere, more time would be required to re-emit the energy. But most of the radiation in the atmosphere moves less than ten meters, because saturation occurs within ten meters.

Change and variation in the flow of heat, including time factors, cannot be determined. Sure, a calorie of heat entering a gram of water will produce a 1C temperature increase; but when change and variation are added to the temperature of the mass, calculations cannot be made. The change is too variable. The second law of thermodynamics says heat dissipates. That means it constantly moves from more to less concentrated areas. It moves not only through conduction and convection but also radiation. The complexities cannot be calculated; they can only be measured, with limitations on the ability to measure the complexities. Modeling of global warming is not possible for these reasons.

For the same reason temperature cannot be calculated, radiation cannot be calculated, because they inter-convert; one transforms into the other. Superficially, these effects are not being calculated; they are measured for small slices of the atmosphere and then added up. The problem with that approach is that the results are expressed as 3.7 W/m² upon doubling the amount of CO2 in the atmosphere. The atmosphere does not have square meters; it has cubic meters. Square meters are assumed to represent the amount of radiant energy which falls onto the surface of the earth. It means there is no accounting for the heating of the atmosphere due to absorption of radiation by CO2.

Where then do the watts per square meter come from? They are the difference between the amount of radiation assumed to go into space, based on the calculations of the radiative transfer equations, and the amount entering the earth from the sun. Who cares where or how that difference in radiation creates heat; it has to create heat someplace.

One of the problems is that the calculations are not direct enough to do a comparison between calculated radiation at the top of the atmosphere and the total energy entering from the sun. The differences are extreme and render all such analysis so absurd that any result has to be a predetermined contrivance.

The radiative transfer equations must start with some radiation which goes through the atmosphere and ends up in outer space. There is nothing resembling a starting point for such an analysis. No radiation or heat on planet earth has an identifiable or quantifiable starting point other than the total entering from the sun. Implicitly, the radiation at the starting point is that which is emitted from the surface of the earth. No one has a clue as to what that quantity would be, and it is almost irrelevant to the process.

The NASA energy budget claims 41% of the heat on the surface of the earth leaves as radiation. It's a preposterous guess. Only white hot metals give off 41% of their energy as radiation. Cooling fans would never be used if that much radiation were emitted from a cold and rough surface with wind blowing over it. The real number would be closer to 1-3% on land, very little from oceans. The Kiehl-Trenberth model shows 79%. That model is forced into a ridiculously high number, because the Stefan-Boltzmann constant was applied, and it is in error. There is about a 100% difference between these two official sources. How reliable can the radiative transfer equations be in picking some such starting point?

But the problem is even worse due to the fact that very little of the radiation in the atmosphere gets there by radiating from the surface of the earth. Most heat gets into the atmosphere through conduction, convection and evaporation from the surface, and quite a bit from solar energy. Kiehl-Trenberth says 29% of the solar energy enters the atmosphere rather than striking the earth's surface. The NASA model says 19%.

Much of the energy in the atmosphere is converted into radiation, as all matter emits radiation in proportion to its temperature. How fast the transformation of energy occurs is anyone's guess. In other words, the application of radiative transfer equations involves no ability to determine how much radiation is coming from where or going to where. And yet the equations are portrayed as being so precise in their latest rendition that they could determine that earlier calculations were only off by 15%.


The amount of fakery with this subject is beyond description and is nothing resembling science.

Fake Calculations of Ocean Heating

The recent, sort-of claim is that global warming heated the oceans by 0.2C. It's unlikely that a published science study made such a claim, as it is too disgraceful, and journalists have been making up most of the claimed science which the public sees.

Air has no ability to heat the oceans, because its heat capacity is too low.

One cubic meter of air, cooling through a given temperature change, can only heat 0.29 millimeters of water through the same temperature change. And no cooling of the air has been attributed to ocean heating.


The calculations are these, with rounding and approximations:

The density of air is 1.2 kilograms per cubic meter (kg/m³). Its specific heat is 1 kilojoule per kilogram per degree (kJ/kg/C). Therefore, one cubic meter of air would release 1.2 kJ of heat in dropping 1C.

The density of water is 1,000 kg/m³. Its specific heat is 4.18 kJ/kg/C. For 1,000 kg, that's 4,180 kJ per degree. Dividing the 1.2 kJ of the air by the 4,180 kJ of the water yields 0.00029 meters of water, which is 0.29 mm.
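
The comparison can be written out directly, using the density and specific heat values given above:

```python
# Volumetric heat capacity comparison, using the values in the text.
AIR_DENSITY = 1.2        # kg/m^3
AIR_CP = 1.0             # kJ/kg/C
WATER_DENSITY = 1000.0   # kg/m^3
WATER_CP = 4.18          # kJ/kg/C

air_kj = AIR_DENSITY * AIR_CP        # 1.2 kJ per m^3 per degree C
water_kj = WATER_DENSITY * WATER_CP  # 4180 kJ per m^3 per degree C

depth_mm = air_kj / water_kj * 1000  # water depth matched by 1 m of air
print(round(depth_mm, 2))   # 0.29
```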

To what depth would one expect the ocean to be heated by the claimed 0.2C? The new Argo project measures down to 700 meters. At first it found slight cooling; so the lower measurements were thrown out to achieve no change. Somehow, since then, an increase of 0.2C is being claimed. If the depth of the heating is half of the measured depth, it would be 350 meters. The air would have to be cooled the same amount to a height of 1,207 km. That is approximately 100 times the height of the normal atmosphere (troposphere), not considering pressure reduction. Nothing resembling it is happening.
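
The air-column figure follows from the same heat-capacity ratio; a sketch of the arithmetic (the 350-meter heating depth is the text's assumption):

```python
# Height of the air column that would have to cool by the same amount
# to heat 350 m of water, using the 1.2/4180 heat-capacity ratio above
# (pressure reduction with altitude ignored, as the text notes).
RATIO = 1.2 / 4180       # meters of water per meter of air, same delta-T
WATER_DEPTH = 350.0      # meters, half of the 700 m Argo depth

air_height_km = WATER_DEPTH / RATIO / 1000
print(round(air_height_km))   # about 1219 km, vs a troposphere of ~12 km
```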

On top of that, the calculated near-surface atmospheric temperature increase is stated to be 1C upon doubling the amount of CO2 in the air, presumably based upon the Stefan-Boltzmann constant, as indicated above. No reduction for heating the oceans is allowed for in that calculation.

These contradictions are too ridiculous to be called science, yet they are in our faces day in and day out as the claimed reason for destroying the energy systems and increasing the cost of electricity by several hundred percent.

The number one panic over global warming is oceans rising, which could swamp Miami and the Maldives. (Who cared what happened to Detroit.) Since air cannot cause ocean temperatures to increase, much less cause ice to melt, whatever causes the oceans to rise, it won't be carbon dioxide. Oceans rise 400 feet (130 meters) between ice ages, as glaciers melt. Glaciers are almost as melted as they can get, which has reduced ocean rising to little or none at this time, beyond the same fakery/fraud that goes with everything else on this subject.