Science Errors

Chapter 3: The Fakery of Global Warming Science


Chapter Summary

There is no such thing as a greenhouse gas, because there is no such thing as trapping heat in the atmosphere. Absorbed radiation is re-emitted in femtoseconds. That's why cool-down occurs at night.

For all matter, each vibration is a wave of infrared radiation being emitted. It's not a lot of energy, but the amount of fingerprint radiation absorbed by CO2 is even less.

All heat is the same. Why isn't it all trapped? Conduction, convection and evaporation put most heat into the atmosphere. It dissipates into space at the same rate it enters from the sun, a state called equilibrium. Claiming CO2 absorption is different is unscientific.

Minuscule amounts of radiation were exaggerated by a factor of 20-40. Supposedly, 79% of the energy leaving the surface of the earth is radiation, with the remaining 21% being in the form of conduction and evaporation. White hot metals could not easily give off 79% radiation under atmospheric conditions. The real number would be about 1-3% radiation. This requires reducing the claimed global warming by a factor of 20-40 in calculations, though there is no real effect.

Then secondary effects are said to do two thirds of the heating. To evaluate secondary heating, every influence over climate for the next century must be evaluated. Climate is too complex and random for the questions being asked.

◆ ◆ ◆ ◆ ◆ ◆ ◆ ◆ ◆ ◆ ◆

[Figure: radiation peaks]

Incompetents in science imagined global warming and contrived unreal science to get there. Climate is too complex and random for the tools of science. The science is total fakery. Real scientists do not go down the path climatologists follow pretending to measure complexities and randomness which cannot be measured. A measurement requires that all influences over the results be identified and separated from other influences. Climate has too many interacting complexities to do that.

For these reasons, there is nothing resembling real science to the subject of global warming. Science is a process, not a conclusion. Conclusions come out of a dark pit in global warming science. Fake procedures are claimed, with no explanation or logical purpose. Necessary scientific standards are defied in extreme ways attempting to contrive a subject without accountability.

Conservative critics of global warming have been saying that the underlying science is correct, but global warming is not occurring because of the effects of clouds. They haven't looked at the underlying science, because they don't understand it. Their position has left society with no significant criticism of the basic science of global warming. As a result, criticism is brushed off with the claim that it has been disproven and needs to stop. Conversely, nothing has been shown to be correct in the science. The burden of proof should be on the scientists, not the critics.

One reason for this situation is that hired scientists cannot be significantly critical without being kicked out of science or being denied grants or the ability to publish. There is a long list of scientists who met that fate. (Firing Scientists) This practice alone is a major fraud upon the public. How can science (or anything else) be right, when no one is allowed to criticize it? Truth benefits from criticism. The opposition to criticism points to an unjustifiable position.

Criticism is stymied by an absence of validly published research. Research publications on climatology lack the necessary descriptions of methodology. Key information needed for evaluation is omitted in an attempt to obfuscate the subject. Without proper publications, the only way criticism can be produced is to draw upon 500 years of evolved knowledge and show that the conclusions are self-contradictory impossibilities.

Most scientists are not aware of the frauds at the origins of global warming science. Scientists are so specialized and wrapped up in their narrowly defined subjects that they cannot spend much time looking into the large amount of related material. It took me decades of detective work as an independent scientist to determine the nature of the frauds at the origins of global warming science.

A flat-earther is supposedly someone who can't understand that absorption of radiation means heat. Five hundred years of science has produced a lot more knowledge than that. After absorption, then what? These proofs explain the rest of the science.
Here are six proofs of science fraud at the origins of global warming.

1. Dilution Factor

Climatologists skipped over the dilution factor. Each CO2 molecule in the air would have to be 2,500C to heat the air 1C, an impossibility, because there are 2,500 air molecules around each CO2 molecule.

If a brick building has 2,500 bricks, heating one brick won't heat the building.

There cannot be greenhouse gases creating global warming for this reason. Climatologists admit that the CO2 in the air is about the same temperature as the air, as it would have to be. They are thereby implying that CO2 is a cold conduit for heat. There is no such thing as a cold conduit for heat, as thermal conductivity coefficients show.

2. White Hot Metals

The amount of energy given off by the surface of the earth is claimed to be 79% radiation and 21% conduction and vaporization. White hot metals could not emit 79% radiation under atmospheric conditions. The real proportion would be 1-3% radiation. Reducing the radiation by a factor of 40 would reduce the calculated global warming by a factor of 40.

3. Trapping Heat

The term "heat trapping gas" is a scientific fraud. Heat cannot be trapped, because it is too dynamic. It flows into and out of the atmosphere in femtoseconds.

Each vibration by molecules in the air is a wave of radiation being emitted. There are typically 83 femtoseconds per bump (both directions). About five bumps removes added energy. Five bumps occur in 415 femtoseconds. That's half of a picosecond. Half of a picosecond of holding heat is not trapping heat.

The amount of heat entering from the sun during the day is the amount that leaves during the night. A minuscule amount is not going to get trapped while the rest radiates into space.

The claim by some scientists that only greenhouse gases heat the atmosphere is another fraud. Most heat gets into the atmosphere through conduction, convection and evaporation.

4. Heat Capacity

The air has too little heat capacity to warm ocean water or melt Arctic ice. Twelve-year-olds were supposed to learn what heat capacity is, but physicists didn't.

To heat oceans with air requires a ratio of 3,483 by volume for the same temperature change. The volumetric heat capacity of air is 1.2 kJ/m³/C, while for water it is 4,180 kJ/m³/C. To heat the oceans 0.2C to a depth of 350 meters would require air losing 0.2C to a height of 1,219 kilometers (at constant surface pressure). That's 100 atmospheres. The oceans cannot be heated by the atmosphere.
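The heat-capacity arithmetic above can be checked with a few lines of Python; the values used are the ones stated in the text.

```python
# Reproduces the heat-capacity arithmetic, using the text's figures.
C_AIR = 1.2       # volumetric heat capacity of air, kJ/(m^3*C), per the text
C_WATER = 4180.0  # volumetric heat capacity of water, kJ/(m^3*C), per the text

# Volumes of air needed per volume of water for the same temperature change
ratio = C_WATER / C_AIR
print(round(ratio))  # ~3483

# Air column needed to cool by the same amount that warms 350 m of ocean
ocean_depth_m = 350
air_column_km = ocean_depth_m * ratio / 1000
print(round(air_column_km))  # ~1219 km
```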

Melting ice with air is even more absurd, as an additional "heat of fusion" is required, which is 334 kJ/kg, amounting to an additional 278,000 m of air per C per m of ice. In other words, ice in contact with air sucks the heat out of the air with almost no effect upon the ice. With a small amount of ice and a lot of air, the cool air gets replaced with warm air, but on a global scale the replacing does not occur. It means ice melting has nothing to do with global warming.
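A minimal sketch of the heat-of-fusion figure above; the ice density of 1,000 kg/m³ is an assumption chosen here to match the text's round number.

```python
# Sketch of the "278,000 m of air per C per m of ice" figure.
L_FUSION = 334.0      # heat of fusion of ice, kJ/kg, per the text
ICE_DENSITY = 1000.0  # kg/m^3 -- an assumed round value, not from the text
C_AIR = 1.2           # volumetric heat capacity of air, kJ/(m^3*C)

# Metres of air, cooling by 1C, needed to melt one metre of ice
air_per_metre_of_ice = L_FUSION * ICE_DENSITY / C_AIR
print(round(air_per_metre_of_ice))  # ~278,333
```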

The Arctic is warming due to warm Pacific Ocean water flowing over the Bering Strait, not a minuscule air temperature increase. With the recent El Niño, the northern Pacific Ocean has warmed, causing warm water to flow over the Bering Strait, heating the Arctic and melting Arctic ice.

5. Temperature Measurements are Fake

Not only are humans not the cause of global warming; a temperature increase did not actually occur. The temperature measurements were faked. The original data show no temperature increase over at least the past 35 years, while contrivers lowered earlier measurements and raised recent measurements to show a false increase. Critics have been studying these fabrications for the past six years and have found endless examples. Satellite measurements have shown no significant temperature increase since they began in the late seventies. Only satellite measurements are suitable for the purposes of climatology, because they average over a wide area and cover everything, while land-based measurements cover about 10% of the earth and have no standards for cross-comparison or uniformity.

6. Starting at the End-Point

For a mechanism, climatologists used radiative transfer equations to supposedly show 3.7 watts per square meter less radiation leaving the planet than entering from the sun due to carbon dioxide. There can never be a difference between energy inflow and outflow beyond minor transitions, because of equilibrium, as climatologists recognize. Yet they claim the 3.7 W/m² is a permanent representation of global warming upon doubling CO2. This number is supposed to result in a 1C near-surface temperature increase as the primary effect of CO2. However, watts per square meter are units of rate, and rates produce continuous change, not a fixed 1C. The 1C was supposedly produced by reversing the Stefan-Boltzmann constant, but reversing it is not valid. (Secondary effects supposedly triple the 1C to 3C.)
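A sketch of the inversion described above, assuming the linearized form dT = dF / (4·σ·T³) and an effective emission temperature of 255 K (both are assumptions for illustration, not figures from the text):

```python
# Hedged sketch: converting the claimed 3.7 W/m^2 into a temperature change
# by inverting the Stefan-Boltzmann relation, linearized around T_EFF.
SIGMA = 5.670373e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)
T_EFF = 255.0        # assumed effective emission temperature, K
FORCING = 3.7        # W/m^2, per the text

delta_T = FORCING / (4 * SIGMA * T_EFF**3)
print(round(delta_T, 2))  # ~0.98, i.e. roughly the 1C the text refers to
```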

It means climatologists started at the desired end point of 1C and applied the Stefan-Boltzmann constant in the forward direction to get the 3.7 W/m² attributed to radiative transfer equations. Radiative transfer equations cannot produce any such number, because radiation leaves from all points in the atmosphere, with 15-30% going around greenhouse gases. That dynamic, combined with equilibrium, is beyond scientific quantitation.

The atmosphere is cooled by radiation which goes around greenhouse gases. The amount going around doesn't matter. A gate half open won't keep in half the sheep. The cooling occurs until equilibrium is established with the amount of energy coming in from the sun.

The amount of radiation going around greenhouse gases is said in Wikipedia to be 15-30%. Calculations by climatologists are based upon none going around. They calculate the amount of radiation getting to the top of the atmosphere using "radiative transfer equations." Those equations cannot account for equilibrium, which is a response to every influence upon temperature.

If the amount going around were calculated without equilibrium, there would be a 100% error in the range (15-30%), while the product of the radiative transfer equations is said to have about 1% error. That product serves as the primary effect by carbon dioxide, which no one in science questions. Only secondary effects are argued.

But the gate is not 15-30% open. Each molecule in the atmosphere radiates energy, with 15-30% going directly into space. That which is absorbed by greenhouse gases is re-emitted with 15-30% going into space. It means the gate is about 99.99% open. The atmosphere cools as fast as heat enters it leaving very low temperatures in equilibrium with the sun's energy. The equilibrium temperatures are very cold, because heat leaves in all directions during all hours, while it enters from one direction, half the time. The energy from the sun lands on the surface (mostly), while it leaves from the entire atmosphere at a depth of 12-15 kilometers.
Evolution of the Concept

The initial concept of global warming was that more carbon dioxide in the atmosphere would absorb more radiation and heat the atmosphere. Scientists then found that laboratory tests were not showing an increase in radiation absorption with an increase in CO2 for a very simple reason: A very small amount of CO2 absorbs all radiation available to it in a short distance. Adding more CO2 only shortened the distance required for absorbing all of the radiation. Climatologists refer to this concept as "saturation."

     [Figure: CO2 molecules]

But hold-outs were sure global warming must be caused by increases in CO2 and looked for explanations. During the seventies, as computers became available, complex modeling was used to show heating of the atmosphere upon increases in CO2.

In 1979, a quasi-governmental office created a study group to clarify the climate influences of carbon dioxide. The result was a publication by Charney et al., 1979 (1), who used modeling of atmospheric effects. Their conclusion was that a doubling of CO2 in the atmosphere would result in a temperature increase of 3C. The claims were total fakery. Scientists do not have the slightest ability to convert the details, complexities and randomness of the atmosphere into measurement or calculation, which is why weathermen cannot predict more than a few days ahead for simple elements such as temperature and precipitation.

Charney et al claimed to model such things as "horizontally diffusive heat exchange" and "heat balance." The terms used are nothing but word salad. There is no such thing as horizontally diffusive heat exchange in the atmosphere or oceans. In large fluids, diffusion would cover no more than a few nanometers before convection renders it irrelevant. Why add "heat exchange?" There needs to be two mediums with an interface for heat exchange. If atmosphere and oceans were the interface, there is no "horizontally diffusive" element to it. Diffusion is a chemistry concept, not an energy concept. Heat moves through conduction, not diffusion. There is also no such thing as "heat balance." Heat migrates and transforms to and from other forms of energy. There is nothing balanced about it.

To model heat through the atmosphere resulting from carbon dioxide, the starting point must be some quantity of heat which is supposed to be moving through the atmosphere. Yet that quantity was the end result of the Charney study rather than the starting point. Numerous other studies used the same basic modeling concepts.

In 1984 and 1988, Hansen et al (2,3) used similar modeling but started with a concept of how much heat carbon dioxide should produce determined as "empirical observation," by which they meant the assumed historical record of carbon dioxide heating the atmosphere. [The assumed historical record is that humans increased the amount of carbon dioxide in the air by 100 parts per million (ppm) (280-380 ppm) when the first 0.6C temperature increase occurred in the near-surface atmosphere.] Modeling then had the purpose of showing complex future temperatures with no clear source or method. Implicitly, the atmosphere would add secondary effects to the primary effect of carbon dioxide. But the historical record included the secondary effects, which means the secondary effects were compounded. In other words, there is no clear concept of a purpose or a logical set of cause-and-effect relationships.

Yet Hansen et al. arrived at approximately the same conclusion as Charney et al.: that the expected temperature increase upon doubling the amount of CO2 in the atmosphere would be about 3C. This result is always given for hundreds of such studies with widely varying procedures, which shows that it is nothing but a contrived end result, with nothing but fakery for a method of deriving it. How could the same number be produced with and without a starting point for the amount of heat CO2 supposedly produces based on the historical record?

The reason for the invariable 3C increase upon doubling CO2 is that journalists said they would not be concerned unless the temperature increase were 3C. Otherwise, why not just use the historical record? If it is extended, it would indicate a temperature increase of 1.7C upon doubling CO2 in the atmosphere (280/100 × 0.6 ≈ 1.7). To get some number other than 1.7C upon doubling CO2 is to say the atmosphere is going to do something different than it did in the past. There is no explanation of why it should.
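The extrapolation in the parentheses can be written out explicitly; all of the numbers are the text's.

```python
# Linear extrapolation of the text's historical record to a CO2 doubling.
BASELINE_PPM = 280      # pre-industrial CO2, ppm, per the text
ADDED_PPM = 100         # human-added CO2, ppm, per the text
OBSERVED_RISE_C = 0.6   # temperature rise over that interval, per the text

warming_per_doubling = BASELINE_PPM / ADDED_PPM * OBSERVED_RISE_C
print(round(warming_per_doubling, 2))  # ~1.68, the text's "1.7C"
```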

In 1998, Myhre et al (4) did "radiative transfer equations," which supposedly defined the energy increase due to CO2 with extreme precision, which required the world's largest computers. They claimed that the primary effect by CO2 causes 3.7 watts per square meter of energy to accumulate in the atmosphere upon doubling the amount of CO2. One of the absurdities is that there is no accumulation of energy in the atmosphere, as equilibrium requires the average amount of radiation leaving to equal the amount coming in from the sun.

Radiative transfer equations also remove the saturation problem. Saturation can be determined in a laboratory in a few minutes. It leads other scientists to conclude that no greenhouse effect can occur for this reason alone. Heinz Hug did such a measurement and said all relevant radiation is absorbed by the time it travels 10 meters in the atmosphere. He wasn't allowed to publish such significant criticism, but he put it on the internet. It means the radiative transfer equations calculated away saturation.
Dilution Requires Extreme Temperature

Total carbon dioxide is 400 parts per million in the atmosphere. That means 2,500 air molecules surround each CO2 molecule. To heat the air 1C, each CO2 molecule would have to be 2,500C, an impossibility.

On top of that, only a small percent of the CO2 is increasing and adding the heat due to saturation. No one can say what percent is unsaturated, but the highest number used is 5%. That means increasing the extremeness by a factor of 20, which is 50,000C required for each CO2 molecule.
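The two ratios above follow directly from the parts-per-million figure; the 5% unsaturated fraction is the text's number.

```python
# Dilution arithmetic from the text's 400 ppm and 5% figures.
CO2_PPM = 400
air_per_co2 = 1_000_000 / CO2_PPM  # air molecules per CO2 molecule
print(round(air_per_co2))          # 2500

# Per-molecule temperature needed for each 1C of air warming (text's logic)
per_molecule_C = air_per_co2 * 1.0
print(round(per_molecule_C))       # 2500

# If only 5% of the CO2 is unsaturated, the requirement scales by 20x
UNSATURATED = 0.05
print(round(per_molecule_C / UNSATURATED))  # 50000
```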

The temperature of each CO2 molecule cannot be much higher than the temperature of the emitting surfaces, which is 15C for the claimed average surface of the earth, and somewhat less for the atmosphere. Climatologists admit that the temperature of the CO2 molecules is not much different than that of the atmosphere. They are thereby implying that CO2 functions as a cold conduit for heat. There is no such thing as a cold conduit for heat, as thermal conductivity coefficients show.

The second law of thermodynamics says energy can only move from more concentrated to less concentrated. Temperature is the concentration of heat. Net (total for effects) temperature can only decrease, never increase. This means the temperature of any CO2 molecule in the air can never be higher than the temperature of the sources of the energy. The sun's energy concentrates the heat on the earth's surface and raises temperatures slightly, sometimes 40C, but nothing resembling the 2,500C required for the fake greenhouse effect, and not in the atmosphere.

The proportionalities must be maintained at 2,500 to one, because rate of heat loss is similar for the CO2 molecules and air molecules, which means heat must be replaced at the same rate.

The temperature increase of CO2 in the atmosphere due to radiation from heated molecules or surfaces is usually determined using the Stefan-Boltzmann constant. It states the amount of radiation emitted by matter at any temperature.

CO2 is said to absorb about 8% of the black body infrared radiation which leaves the surface of the earth.

[Figure: radiation peaks]

The only method of calculating temperature which physicists have for this subject is the Stefan-Boltzmann constant (SBC). It is this:

     W/m² = 5.670373 × 10⁻⁸ × K⁴

It indicates the amount of radiation given off by an opaque surface, as watts per square meter, at any given absolute temperature K. The SBC is off by about a factor of 20 at normal temperatures, and could be off by 30-50 at the chilly temperatures of the earth's surface average.

With the SBC used by physicists, there would be 390 W/m² given off by the surface of the earth at the claimed average temperature of 15C. An emissivity of 0.64 reduces the emission to 249.667 W/m². With 8% of it being absorbed by CO2, that is 19.97 W/m² going from the earth to the CO2.
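The chain of numbers above can be reproduced from the formula; 288 K (the claimed 15C average) and the text's emissivity and absorption figures are used.

```python
# Forward Stefan-Boltzmann arithmetic using the text's figures.
SIGMA = 5.670373e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)
T_SURFACE = 288.0    # K, i.e. the claimed 15C average surface temperature
EMISSIVITY = 0.64    # per the text
CO2_SHARE = 0.08     # fraction absorbed by CO2, per the text

black_body = SIGMA * T_SURFACE**4
print(round(black_body, 1))   # ~390.1 W/m^2

emitted = black_body * EMISSIVITY
print(round(emitted, 2))      # ~249.67 W/m^2

to_co2 = emitted * CO2_SHARE
print(round(to_co2, 2))       # ~19.97 W/m^2
```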

Climatologists reverse the SBC to get temperature from radiation, but doing so is not valid, because there is not a definable surface for emission or absorption in the reverse direction. With reverse analysis, 19.97 W/m² would correlate with a 5.73C temperature increase for the CO2 molecules. This amount is certainly not the 2,500C required to create a 1C temperature increase for the nearby atmosphere. This table shows the watts per square meter per degree centigrade:

[Table: watts per square meter per degree centigrade]

After emissivity is applied, there are 3.486 W/m² per C. Dividing this into the 19.97 W/m² going into CO2 gives a 5.73C increase in the temperature of the CO2 molecules. It doesn't come close to the required 2,500C. It's off by a factor of 436. If we divide the claimed global warming of a 1C increase upon doubling CO2 in the atmosphere by 436, it is 0.0023C.

It means doubling the amount of CO2 in the air would increase the near-surface, atmospheric temperature by 0.0023C, if everything else were true about the claims of climatologists.

Correcting the SBC by a factor of 20 doesn't fix the problem. The ratios are about the same.

These numbers apply to the total amount of CO2 in the atmosphere, which is 400 ppm. If only 5% of the CO2 adds heat upon doubling the amount of CO2, the deficiency is a factor of 8720 instead of 436. Decreasing the claimed 1C global warming by a factor of 8720 yields 0.00012C for global warming.
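The factors of 436 and 8,720 used above follow from the numbers already given:

```python
# Reproduces the "factor of 436 / 8720" arithmetic from the text's figures.
REQUIRED_C = 2500.0  # per-molecule temperature the text says is needed
W_PER_C = 3.486      # W/m^2 per degree C after emissivity, per the text
ABSORBED = 19.97     # W/m^2 absorbed by CO2, per the text

rise_C = ABSORBED / W_PER_C
print(round(rise_C, 2))        # ~5.73

factor = REQUIRED_C / rise_C
print(round(factor))           # ~436

print(round(1.0 / factor, 4))  # ~0.0023C, the scaled-down "1C"

# With only 5% of the CO2 unsaturated, the deficiency grows 20-fold
print(round(factor) * 20)      # 8720
```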

So how much error is due to reversing the SBC? Since a transparent gas gives off a lot more radiation per degree centigrade than an opaque solid, there would be fewer degrees centigrade per watt per square meter. So the global warming temperature would be smaller than 0.0023C or 0.00012C. How much is impossible to say.

There is another major problem. If CO2 is increasing 5.73C by absorbing 8% of the radiation from the surface of the earth, the other molecules which absorb the remaining 92% must be increasing by 72C (minus whatever goes around greenhouse gases). Physicists say, no problem, because the molecules emit as much as they absorb, and the temperature changes slightly due to delayed emission. Then the CO2 molecules must be emitting as much as they absorb with slight delay, and they are heating a small fraction of the 5.73C or the 0.0023C or 0.00012C.

In other words, if a surface at 15C radiates into a surface at 15C, nothing changes. If 400 ppm of that surface is some other substance, it won't be heating any more than the rest of it, which is none at all. A transparent gas will be somewhat different than an opaque solid, but the temperature differences won't be any greater, at least when the differences are supposed to be zero.

Of course the greenhouse effect is supposed to be additive, as less radiation goes around a greenhouse gas. Adding a small amount to what is already there is of no significance. Adding 5% to 5.73C gives 6.02C, which still misses the required 2,500C by a long way, even before reducing it a large amount due to delayed re-emission.

It's 5%, because most of the CO2 is already saturated. But even with zero saturation, the 5.73C would be doubled to 11.5C, which misses the required 2,500C by a factor of 217, even before the 11.5C is reduced a large amount due to delayed re-emission. In other words, what climatologists are missing is the fact that temperature due to radiation absorption cannot be much different than the temperature of the nearby emitting surface. This includes the temperature of dilute molecules. Climatologists missed the dilution factor for the amount of heat involved.

The fact that climatologists missed the greenhouse effect for CO2 by a factor of 436 or more shows that there is no science to the subject. The claimed global warming science is totally contrived, as all other evidence shows.

Even critics within science are saying the primary effect by carbon dioxide is an unquestionable law of physics, while they argue secondary effects. There is no primary effect by carbon dioxide.
Calculating Temperature is Impossible

Calculating the relationship between radiation and temperature is totally impossible due to infinite complexities. One of the problems is that radiation being absorbed by a molecule is partially re-emitted at black body wavelengths. How the energy is distributed before being re-emitted determines the temperature increase. No theory can say how the energy is distributed.

This image shows how energy is re-distributed when radiation is absorbed by carbon dioxide in the atmosphere.

     [Figure: molecules radiating]

When a molecule of CO2 in the atmosphere absorbs fingerprint radiation (the only thing in question) it increases in vibratory motion, which is heat. As it bumps into surrounding molecules (mostly nitrogen gas), it imparts some motion, which reduces its own motion, while increasing the motion of the other molecule. This bumping goes from molecule to molecule, as the energy spreads through the atmosphere.

The vibrating motion of molecules sends out waves of infrared radiation. As the molecular motion decreases, the intensity of the radiation and its frequency get lower.

If the average wavelength of emission is 25 microns, there are 83 femtoseconds for each initial bump. (Frequency equals velocity over wavelength; time equals the inverse of frequency: 3×10⁸ ÷ 25×10⁻⁶ = 1.2×10¹³, inverse = 83×10⁻¹⁵.)
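The femtosecond figure comes straight out of the wave relations; a quick check, using the text's 25-micron wavelength and five-bump count:

```python
# Wave period at 25 microns, and the five-bump total from the text.
C_LIGHT = 3e8        # speed of light, m/s
WAVELENGTH = 25e-6   # m (25 microns), per the text

frequency = C_LIGHT / WAVELENGTH      # ~1.2e13 Hz
period_fs = 1.0 / frequency * 1e15    # period in femtoseconds
print(round(period_fs, 1))            # ~83.3 fs per wave

BUMPS = 5
print(round(BUMPS * period_fs))       # ~417 fs; the text rounds to 415
```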

The number of waves or bumps required to liberate absorbed radiation depends upon the relative temperatures of emitting and absorbing molecules. From a hot surface to a cold atmosphere, strong radiation is absorbed by CO2, and weak radiation is emitted. But most radiation travels less than 10 meters, because the saturation distance is ten meters at sea level air pressure. For this, weak radiation is emitted, absorbed and re-emitted.

With weak radiation, five bumps or waves should dissipate the energy absorbed from one wave. The emitted radiation is in fact stronger than the absorbed radiation, because radiation is absorbed by CO2 as fingerprint radiation at three narrow bands and is emitted as black body radiation. But more than one wave emitted per wave absorbed would be required, because energy is being imparted to surrounding molecules through collisions. Those molecules also emit radiation, which means a few bumps should send the added radiation out.

At 5 bumps (two-directional) or waves and 83 femtoseconds per cycle, the energy radiates away in 415 femtoseconds, which is about half of a picosecond. A hundred bumps would give up half of the energy in 8.3 picoseconds. No one knows exactly how the energy disperses through the surrounding environment. So no one knows how much energy is retained before it is radiated out again. Each of the molecules which receive energy will emit some outflowing radiation, but how much and when cannot be determined. So no one knows how much temperature increase occurs, but it is minuscule to the point of irrelevance when one CO2 molecule out of 2,500 air molecules is doing the absorbing and the rest are emitting. Of course, most of the heat enters the atmosphere through conduction and evaporation, and that heat is emitted through radiation also.

Change and variation in the flow of heat, including time factors, cannot be determined. Sure, a calorie of heat entering a gram of water will produce 1C temperature increase; but when change and variation are added to the temperature of the mass, calculations cannot be made. The change is too variable. The second law of thermodynamics says heat dissipates. That means it constantly moves from more to less concentrated areas. It moves not only through conduction and convection but also radiation. The complexities cannot be calculated, they can only be measured with limitations on the ability to measure the complexities. Modeling of global warming is not possible for these reasons.

For the same reason temperature cannot be calculated, radiation cannot be calculated, because they inter-convert. One transforms into the other. Superficially, these effects are not being calculated; they are measured for small slices of the atmosphere and then added up. The problem with that line is that the results are expressed as 3.7 W/m² upon doubling the amount of CO2 in the atmosphere. The atmosphere does not have square meters; it has cubic meters. Square meters are assumed to represent the amount of radiant energy which falls onto the surface of the earth. It means there is no accounting for the heating of the atmosphere due to absorption of radiation by CO2.

Where then do the watts per square meter come from? They are the difference between the amount of radiation assumed to go into space based on the calculations of the radiative transfer equations and the amount entering the earth from the sun. The attitude is: who cares where or how that difference in radiation creates heat; it has to create heat someplace.

One of the problems is that the calculations are not direct enough to do a comparison between calculated radiation at the top of the atmosphere and the total energy entering from the sun. The differences are extreme and render all such analysis so absurd that any result has to be a predetermined contrivance.

The radiative transfer equations must start with some radiation which goes through the atmosphere and ends up in outer space. There is nothing resembling a starting point for such an analysis. No radiation or heat on planet earth has an identifiable or quantifiable starting point other than the total entering from the sun. Implicitly, the radiation at the starting point is that which is emitted from the surface of the earth. No one has a clue as to what that quantity would be, and it is almost irrelevant to the process.

The NASA and Kiehl-Trenberth energy distribution models show 79% of the heat on the surface of the earth leaves as radiation. Only white hot metals could give off 79% of their energy as radiation. Cooling fans would never be used if that much radiation were emitted from a cold and rough surface with wind blowing over it. The real number would be closer to 1-3% on land, very little from oceans. The models are forced into a ridiculously high number for radiation, because the Stefan-Boltzmann constant was applied, and it is in error. It shows about 20-50 times too much radiation being given off at normal temperatures.

Much of the energy in the atmosphere is converted into radiation, as all matter emits radiation in proportion to its temperature. How fast the transformation of energy occurs is anyone's guess. In other words, the application of radiative transfer equations involves no ability to determine how much radiation is coming from where or going to where. And yet the equations are portrayed as being so precise in their latest rendition that they could determine that earlier calculations were only off by 15%.


The amount of fakery with this subject is beyond description and is nothing resembling science.
Radiative Transfer Equations

Radiative transfer equations produce the magic of defying laws of physics in showing heat where none is possible. The world's largest computers were used, which means critics cannot look into the methodology or produce alternative results.

The procedure is to slice the atmosphere into numerous thin sections and calculate radiation going into and out of each one. The need for this method supposedly lies in the fact that the atmosphere gets thinner as it goes upward, which makes each section different. When all added up, there is supposedly 3.7 watts per square meter less energy escaping into space than coming in from the sun upon doubling the CO2, which causes a buildup of heat. There cannot be an average difference between radiation leaving the earth and entering from the sun, due to equilibrium. How could they have gotten just the right amount if their procedures weren't flawless? By starting at the desired end point.

Climatologists admit that equilibrium exists, yet they base the entire subject of global warming on a difference between energy leaving the earth and energy entering from the sun. That contradiction does not get resolved. It is extremely absurd, because CO2 levels vary in major ways. There was 5 times as much CO2 in the air during dinosaur years, and 20 times as much when modern photosynthesis began. When does disequilibrium occur? Volcanoes were putting a lot of CO2 in the air until recently. If there was disequilibrium in the past, why was it zero before humans influenced the result? Such contradictions cannot be resolved when contriving a subject where there is none.

The degree of complexity in heat transfer and transformation in the atmosphere is beyond scientific measurement or calculation. In fact, it is impossible to determine what the procedure was supposed to be with the radiative transfer equations. Implicitly, the amount of radiation getting to the top of the atmosphere was determined, but almost no radiation gets to the top. It almost all exits from throughout the atmosphere.

The amount of radiation leaving from anywhere in the atmosphere cannot be determined, as radiation is continuously being emitted, absorbed, transformed and re-emitted from every point in the atmosphere. All the while, 15-30% of the radiation is escaping into space; Wikipedia states that 15-30% of the infrared radiation goes around greenhouse gases and into space. A range of 15 to 30% means 50 to 100% uncertainty. The time intervals are even more uncertain. Yet the radiative transfer equations supposedly determined the total that did not escape with about 1% error.

There can never be an average difference between the amount of radiation exiting the planet and entering, because equilibrium makes both the same. Climatologists say the equilibrium temperature is shifted upward. That claim is a contradiction, but even if it were true, there can never be the 3.7 W/m² difference where there is equilibrium. There is a closed loop of contradictions in saying there is a difference, which is used for calculating temperature increase, while equilibrium does not allow a difference.

Equilibrium is so complex that it cannot be observed, measured or calculated for pinning down quantities. And the equilibrium temperature cannot be pushed upward, because exiting events and locations are virtually unlimited, resulting in an equilibrium temperature determined by the total radiating mass, including the atmosphere and the surface of the earth, not some miniscule churning somewhere in the middle of it all. A gate half open will not keep in half the sheep. The CO2 molecules in the air supposedly close the gate by some miniscule amount, which the sheep could not care less about.

But CO2 does not close the gate by any amount, because it doesn't matter how heat gets into the atmosphere. Large amounts of energy move back and forth between the ground and atmosphere, as shown in the energy distribution schemes of climatologists. Regardless of where the energy is or how it gets there, it will exit into space until the temperature of the total mass causes the escape into space to equal the amount entering from the sun. The total, average temperature stays the same regardless of the other factors.

Radiative transfer equations erase the problem of saturation. Direct measurements easily show the saturation, so obfuscation was needed to erase the saturation. The math for the radiative transfer equations requires a concept of how much radiation is going from each parcel and how much goes into the next parcel. Accounting for saturation must occur to determine these numbers. Therefore, the erasure of saturation occurs in deciding how much radiation goes where.

The general explanation is that the heating occurs at some high elevation in the atmosphere. The height is given as either 5 kilometers up or 9 km up based upon the rationalization. The variations get endless on explaining how the heating occurs. The general concept is that saturation does not occur in the thin atmosphere up there, so more CO2 causes more radiation to be absorbed.

Why do the radiative transfer equations start at the bottom of the atmosphere, when the bottom of the atmosphere has nothing to do with the heating? There is no logic as to why the radiative transfer equations are used at all. Picking a point where saturation no longer exists has nothing to do with the radiative transfer equations. The explanations concern where the bandwidth narrows for CO2 absorption and separates from overlap by water vapor. Sometimes, the Stefan-Boltzmann constant is used to determine the height at which the heating should occur based upon the temperature at which radiation leaves the planet in the same quantity it enters.

The effect of radiative transfer equations is to show that less radiation leaves the atmosphere than enters from the sun, until equilibrium is restored at a warmer temperature for the atmosphere. In doing so, saturation is reduced to trivia, and the mechanism is reduced to simple absorption of radiation throughout the atmosphere, based on the premise that the radiation all starts at ground level and moves upward. Most energy gets into the atmosphere through conduction, convection and evaporation. Therefore, there is no means of determining how far radiation must travel to get out of the atmosphere, which must be known to calculate how much fails to be emitted at the top of the atmosphere.

Logic Contradictions

Climatologists say the equilibrium temperature is shifted upward, while admitting that equilibrium exists. Such a situation would require a change in the emissivity of the Stefan-Boltzmann constant. Emissivity is an adjustment made to the Stefan-Boltzmann constant due to variations for different materials. For the average of the earth's surface, climatologists say the emissivity is 0.64. The average surface temperature is said to be 15C. The Stefan-Boltzmann constant says that any substance at 15C will give off 390 W/m² of radiation minus an adjustment for emissivity. When adjusting the 390 W/m² given off by the earth's surface with an emissivity of 0.64, the result should be 250 W/m². But the energy schemes show 390 W/m² being given off by the earth's surface, on average. They forgot to take the emissivity into account. That's 56% error. Yet the end result for the primary effect is supposed to have an error of something around 1%. The publications did not provide such details, as all science should, but earlier work was said to be off by 15%, which implies an accuracy of around 1% error.
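The 56% figure follows from the two numbers in the paragraph above; this sketch simply restates the arithmetic.

```python
# Arithmetic from the paragraph above: applying the claimed emissivity
# of 0.64 to the 390 W/m² surface figure from the energy schemes.
emissivity = 0.64
claimed_flux = 390.0              # W/m² shown in the energy schemes
adjusted_flux = claimed_flux * emissivity
print(round(adjusted_flux))       # → 250 (W/m² after the adjustment)

# Discrepancy between the unadjusted and adjusted figures
error = (claimed_flux - adjusted_flux) / adjusted_flux
print(round(error * 100))         # → 56 (percent)
```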

The product of the radiative transfer equations was said to be 3.7 W/m² less radiation being emitted into space than entering from the sun upon doubling the amount of CO2 in the air. This radiation does not translate into a temperature. The implicit method of translating the radiation into temperature is to apply the Stefan-Boltzmann constant in reverse, as doing so yields the desired 1C temperature increase from 3.7 W/m². However, this procedure is in conflict with the concept of shifting the equilibrium temperature upward. First, equilibrium means the same amount of radiation exits and enters. If the temperature changes with a fixed amount of radiation, the emissivity must be changed. If the emissivity is changed, a reversal of the Stefan-Boltzmann constant will not yield the same result. Climatologists would need to know how much to change the emissivity. The radiative transfer equations do not provide that type of information.

In other words, climatologists contradicted themselves with every point they made. It means their infinite precision in calculating and measuring the temperature increase was totally contrived by starting at the desired end point and faking a method of getting there.
The Fudge Factor

Through whatever mysterious means, the end result of the radiative transfer analysis is a fudge factor for determining how much primary effect CO2 produces in heating the atmosphere. Critics are not entirely sure that the fudge factor originates with the publication of Myhre et al., 1998, but no other source can be located.

The fudge factor is the method of calculating the primary effect of carbon dioxide heating the atmosphere. The difference between the primary effect and secondary effects is everything in climatology. You will not see the term "primary effect" in climatology. This term is something I use to clarify a bunch of muddle. Instead of "primary effect," you will see "forcing" and sometimes "sensitivity." No two climatologists use these terms in the same way. Whatever they happen to be talking about will be called forcing or sensitivity.

So to clarify, the primary effect is what CO2 supposedly does to add heat to the atmosphere. There is no apparent disagreement in determining the primary effect by climatologists or critics. They plug in the fudge factor and pretend that it is flawless science. They only disagree upon secondary effects, which means a small amount of heating caused by CO2 will cause other things to happen. It is mostly increased water vapor that is the concern of secondary effects, as water vapor is said to be a stronger greenhouse gas than CO2. Heat caused by CO2 supposedly causes more water to vaporize and heat the atmosphere.

The primary effect was supposedly determined through radiative transfer equations, while secondary effects are determined by modeling the climate.

The general assumption by promoters of global warming is that the secondary effects are larger than the primary effect. The total of the primary and secondary effects is generally said to be 3C upon doubling the amount of CO2 in the air.

     [figure: fudge factor curve]

The fudge factor is this: heat increase = 5.35 × ln(C/C0). Temperature increase is 0.75 times heat increase. C/C0 is the ratio of the expected amount of CO2 divided by the previous amount. Since the usual question is what happens upon doubling CO2 in the air, the ratio is 2. "ln" is the natural log. So the result is 5.35 times the natural log of 2 for doubling CO2 in the atmosphere. This quantity is 3.7 watts per square meter.
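The formula can be evaluated directly; this sketch only restates the equation and conversion factor given above.

```python
import math

# The fudge-factor formula quoted above: heat increase = 5.35 · ln(C/C0)
def forcing(c_ratio):
    return 5.35 * math.log(c_ratio)

delta_f = forcing(2.0)           # doubling CO2, so C/C0 = 2
print(round(delta_f, 2))         # → 3.71 (watts per square meter)

# The text's stated conversion: temperature change = 0.75 × heat change
print(round(0.75 * delta_f, 1))  # → 2.8 (degrees C)
```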

One of the absurdities of the fudge factor is that it always shows 3.7 W/m² upon doubling CO2 in the air regardless of how small the starting quantity. If it is one molecule, the second molecule supposedly adds 3.7 W/m².

Rationalizers sometimes claim a limited range for the fudge factor, such as 250 to 1000 ppm CO2. It's nothing but an arbitrary attempt to hide the ridiculousness of the fudge factor. There is nothing in nature that would start or stop such a curve. Since the fudge factor curve is totally contrived in contradiction to the effects of saturation, adding range limits is adding contrivance to contrivance.

One of the most significant things about the fudge factor is that it sort of eliminates saturation, as do the radiative transfer equations which produced the fudge factor. The fudge factor curve would need to have a flat top where saturation occurs. The natural log curve never does have a flat top, but it approaches flatness at a ridiculously high amount of CO2 in the air.

Critics measure saturation in a laboratory, which can be done in minutes, and say saturation is close to total after radiation travels 10 meters and is approximately achieved with very little CO2, about 10% of the existing amount in the air.
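The saturation claim has the shape of simple exponential (Beer-Lambert) absorption. The coefficient below is an assumed illustrative value, not a measured one, chosen so that absorption is near-total at 10 meters as the text states.

```python
import math

# Beer-Lambert style illustration of saturation over a path length:
# transmitted fraction = exp(-k * x), absorbed fraction = 1 - exp(-k * x).
# The coefficient k is an illustrative assumption, not a measured value.
k = 0.7  # per meter (assumed for illustration)

def absorbed_fraction(x_meters):
    return 1.0 - math.exp(-k * x_meters)

print(round(absorbed_fraction(10.0), 3))  # near-total absorption at 10 m
print(round(absorbed_fraction(1.0), 3))   # partial absorption at 1 m
```

The shape matters more than the coefficient: an exponential of this form flattens quickly, which is the saturation behavior the paragraph describes.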

The overwhelming question for decades has been, how much CO2 is not saturated for producing the claimed heating? Rationalizers used to say about 5% is non-saturated. The IPCC documents have never said the amount. The amount has to be known to calculate radiative transfer equations. Instead, the mystery of radiative transfer equations produces a number with no clue given for the assumed amount of saturation. The pretense is that the amount of radiation failing to exit into space is the saturation. Implicitly, the use of radiative transfer equations is a de facto calculation of saturation. But the sweeping conclusion (fudge factor) includes everything that influences radiation emission into space, which is totally beyond any mathematical analysis or measurement.

Another absurdity is that the curve for the fudge factor does not start at zero. It must do so to represent actual heating. Instead, the curve has to be slid to the right to get present-time quantities located where they are said to exist. This means the bottom of the curve would need to make a sharp turn toward the left to reach zero. Nothing in nature does anything that ridiculous.

The curve would need to look like this to start at zero and have a flat top:

     [figure: chopped curve]

These absurdities stem from the fact that climatologists started at the desired end point pretending to have a method of getting there, while there is no method of evaluating the extreme complexities, and certainly no way to put the complexities in the form of a three-component fudge factor.

The fudge factor shows heat as 3.7 W/m² upon doubling CO2 in the air. There is no way to determine temperature from heat due to extreme complexities. Climatologists use the Stefan-Boltzmann constant in reverse to do this. Reversing the Stefan-Boltzmann constant is not valid, because the causes and effects are not definable in the reverse direction. This is particularly true for the atmosphere, because the Stefan-Boltzmann constant was designed for emissions from the surface of opaque solids, not transparent gases.

Stefan Rahmstorf is quoted as saying, "Without any feedbacks, a doubling of CO2 (which amounts to a forcing of 3.7 W/m2) would result in 1C global warming, which is easy to calculate and is undisputed," in a book attributed to Ernesto Zedillo, 2008.

Applying the Stefan-Boltzmann Constant

Climatologists apparently use the Stefan-Boltzmann constant (SBC) to derive the temperature of 1C from the 3.7 watts per square meter, because they say the number is easy to calculate, and only such a simple calculation as the SBC shows a relationship between radiation and temperature. Physicists claim the relationship goes both ways: from temperature to radiation, and from radiation to temperature.

Applying the SBC to this question is extremely nonscientific, because the claimed heat (3.7 W/m²) would dissipate in femtoseconds in a dynamic atmosphere which is liberating 235 W/m² continuously in equilibrium with energy entering from the sun. The SBC was designed to show how much radiation leaves in a continuous manner from the surface of an opaque solid, not from a transparent gas such as the atmosphere.

The SBC is this:

     W/m² = 5.670373 × 10⁻⁸ × K⁴

The global average, near-surface temperature is said to be 15C. Average emissivity is said to be 0.64.


3.708/3.486 = 1.064C

The result is the desired 1C for the primary effect of doubling CO2 in the atmosphere, as if climatologists could calculate such things with extreme precision. They claim about 1% error on this factor. However, the SBC shows about 20 times too much radiation at normal temperatures. Reducing the radiation in the SBC by a factor of 20 shows this:


3.708/0.175 = 21.189C

The result shows 20 times as much temperature increase as climatologists claim, when the SBC is corrected for too much radiation. None of these results are real, as the claimed radiation (3.708 W/m² upon doubling CO2) was contrived for the purpose of eliminating the significance of saturation. With saturation, no radiation change would occur to increase temperatures as global warming.
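For readers who want to check the quoted numbers, here is a minimal sketch of the arithmetic. The 4·ε·σ·T³ sensitivity term is my reconstruction of where the 3.486 divisor comes from; it yields about 3.47, close to but not exactly the value quoted above.

```python
# Reproducing the Stefan-Boltzmann arithmetic discussed above.
SIGMA = 5.670373e-8    # Stefan-Boltzmann constant, W/m² per K⁴
T = 288.0              # 15C expressed in kelvins
emissivity = 0.64      # the average emissivity cited in the text

flux = SIGMA * T**4
print(round(flux))                 # → 390 (W/m²)
print(round(flux * emissivity))    # → 250 (W/m² after emissivity)

# Flux-to-temperature sensitivity, dW/dT = 4·ε·σ·T³ (my reconstruction
# of the divisor); dividing the claimed 3.708 W/m² by it gives roughly
# the 1C primary effect the text describes.
dw_dt = 4 * emissivity * SIGMA * T**3
temp_change = 3.708 / dw_dt
print(round(temp_change, 2))       # → 1.07 (degrees C)
```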

In addition to the quantitative absurdities, it is not valid to reverse the Stefan-Boltzmann constant as a method of determining temperature, and there is no other method of getting temperature out of any scientific calculation. Temperature is determined by the total energy dynamics of changing systems, with heterogeneity in complex systems. The forward direction of the SBC looks only at a definable surface, while the reverse of the SBC is influenced by the total dynamics. Yet the result of the radiative transfer equations is translated into the temperature of the near-surface atmosphere based upon a claimed reduction in emission at the top of the troposphere.

What this shows is that climatologists started at the end point of 1C being the desired near-surface temperature increase upon doubling CO2 in the atmosphere, but correcting the math (SBC) shows 20 times more than they would have wanted for a result.

Notice that the calculations above show a fixed relationship between the 3.7 W/m² and 1C. These numbers have existed since the seventies. Myhre et al stated in 1998 that these numbers were only off by 15%. It means there is no place for subtracting ocean heat in these calculations.

Recently, the explanation for the "pause" is that measurements of ocean heat were re-adjusted, and the missing heat was found. Then the claim emerged that 90% of the heat caused by CO2 ended up in the oceans. If so, that 90% must be subtracted from the 1C claimed, because it is a calculated total. The numbers should say there has been 0.09C near-surface temperature increase so far, and it will increase to 0.1C upon doubling the amount of CO2 in the air. The concern is said to be temperatures going over 2C.
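The subtraction described above is simple arithmetic; this sketch restates the paragraph's own numbers.

```python
# Arithmetic restating the paragraph above: if 90% of the heat claimed
# from CO2 ends up in the oceans, only 10% of the claimed 1C total
# remains as near-surface warming.
claimed_total = 1.0        # degrees C, claimed upon doubling CO2
ocean_fraction = 0.90      # fraction said to go into the oceans
near_surface = claimed_total * (1 - ocean_fraction)
print(round(near_surface, 2))   # → 0.1 (remaining near-surface increase, C)
```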

Such contradictions cannot be resolved, because there is no real science. The numbers are contrived by starting at the desired end points, which leaves no space for changes afterwards.

So, how real is the error in the Stefan-Boltzmann constant? Saying that a cold basement wall at 15C is giving off 390 W/m² is totally preposterous. Physicists are not exactly saying otherwise; they are saying that it also absorbs that amount, so you do not notice anything. But that claim is absurd also, because biological processes, and other complexities such as ice melting, would be sensitive to the difference between emission and absorption, and skin cells would be fried by that much energy being absorbed, regardless of how or when it is re-emitted.

Before absorbed radiation can be re-emitted, it must first be converted into heat, which means molecules vibrating. Those vibrating molecules increase in their chemical reactivity as their temperature increases. Biological systems will not tolerate significant increases in temperature without being destroyed.

Biology is like a thermometer which can tell the difference between radiation absorption and emission. If a cold basement wall were emitting 390 W/m², you wouldn't be able to get near it without skin cells being rapidly heated and damaged.

It means there is no mysterious cancellation of the high absorption and emission indicated by the SBC, and it means climatologists started at a desired end point and contrived the method of getting there. It's the only thing they do in climatology, because the randomness and complexities of climate cannot be reduced to scientific analysis.
What Really Happens

The earth is cooled by radiation which goes around greenhouse gases (GHG). About thirty percent of the infrared, black body radiation goes around greenhouse gases. It can cool the planet, because it is emitted from all sides, while the sun's energy enters from only one side.

   [figures: rays / radiation]

Energy gets into the atmosphere mostly through conduction and convection from the surface and evaporation from the oceans. Very little radiates from the surface.

   [figures: from surface / from atmosphere]

Most radiation leaves the planet and goes into space from the atmosphere, and a small amount leaves from the surface of the earth.

This energy cools the planet and establishes equilibrium with the amount of energy entering from the sun. Equilibrium is not a stroke of luck. All major forces in nature proceed until they can't proceed any farther, which is equilibrium.

Most importantly, whatever is blocked, it can't change one iota, because saturation occurred with the first 2% of the CO2 in the air.

Real scientists are accustomed to looking at complex and dynamic systems in terms of how the component elements interact. Nowhere in nature do complexities interact as global warming promoters assume. Unrealistic persons assume that greenhouse gases are blocking just the right amount of radiation to keep the temperature of the planet just right. It's absurdly preposterous. There is no such thing as "just right" in nature, no such thing as an absence of constant change and no such thing as a delicate balance. Photosynthesis has one twentieth the amount of carbon dioxide it evolved on. That's not a delicate balance.

The absurdities are a result of incompetents pushing their way into science and shoving out rational persons. This is why the global warming issue blew up out of nowhere a few decades ago, while the concept has been promoted by a few radicals in science since 1850.

There are a significant number of scientists who criticize global warming, not simply as being wrong but as being junk science. They are denied grants and the ability to publish. Their views were assembled by Marc Morano who accumulated a list of 1,000 such scientists (5).

Modeling is the only basis for the subject of global warming. Modeling is nothing resembling real science. What real science should be is defined by its purpose. The purpose of real science is to verify through reproducible measurements. Modeling was never a part of science a few decades ago except in physics, which hasn't been real science since 1686 beyond Newton's laws.

The first problem with modeling is that it is nothing but an expression of opinion. There seems to be a growing drive to sanctify expert opinions as a replacement for real science. Expert opinion is good and necessary, but only with explanations which allow each person to evaluate for themselves. Let the expertise show up in the value of the product rather than justifying fakery which cannot be questioned. Stripped of the logic and explanations, expertise is nothing but charlatanism.

The problem with modeling is the same problem with statistical procedures. Garbage in; garbage out. Statistical procedures were designed and are used to muddle a subject while arbitrarily controlling the results.

It's all about objective reality. There is no place for subjectivity in science. In fact, everywhere else in society, people have a right to expect objectivity without having someone else's subjectivity forced onto them. Objective reality is supposed to be the medium which allows constructive social interactions.

Modeling in global warming, like statistics, is used for nothing but obfuscation. The obfuscation is maximized rather than minimized attempting to conceal the total fakery and force motives onto society. It's not something that other scientists can evaluate. There is no clue given in what was done in producing the models. "Just trust us" is the entire basis for claimed global warming science.

The social debate over global warming wouldn't exist if there were a real science to the subject. Science does not produce that much arguing. Real science clarifies truth which rapidly ends arguments. Around the periphery and leading edge of science arguing needs to occur, but those disputes are so trivial that they aren't even visible. When they become visible, it is only the absence of real science that creates the dispute.

The concept of greenhouse gases creating global warming was ridiculous to real scientists when they were in control of science. The climate involves huge forces which totally swamp the miniscule effects claimed for humans. An ice age occurs every hundred thousand years. One of the most ridiculous things about the concept of global warming is the assumption that all of those forces are "delicately balanced," and humans are upsetting the balance. This notion seems to be borrowed from ecology, where human effects are not so miniscule. Ecology wasn't so delicately balanced, until humans mowed the surface of the earth. The vegetation was easy to strip and intricately webbed into the total biology.

There is a ridiculous concept of balance with global warming, where the cumulative effects reach just the right quantities for survival. This result is supposed to be somewhere between 180 and 280 parts per million carbon dioxide in the air. For a billion years? How absurd can anyone be? There was five times as much carbon dioxide in the air during dinosaur years, and twenty times as much when modern photosynthesis began. How many species can live on one twentieth as much food as they evolved on?

All biology is on the verge of becoming extinct due to a shortage of carbon dioxide in the air, which is needed for photosynthesis. Oceans absorb carbon dioxide and convert it into calcium carbonate and limestone. As a result, carbon dioxide almost disappeared from the atmosphere 300 million years ago. In the nick of time, the level bounced back to the amount during the dinosaur years. The bounce-back would have been due to volcanoes, as tectonic plates started moving around and separating at that time. (Tectonic plates continually get thicker as the planet cools. They were just getting thick enough to do things during the dinosaur years.) Three hundred million years ago is also when conifers evolved. They have needle-like leaves as a method of maximizing surface area for absorbing carbon dioxide in low supply. The pointedness of the needles also reduces the tendency of dinosaurs to bite into them, though the wood would have been more relevant. Dinosaurs ate nonwoody plants.

The claim of oceans acidifying due to the absorption of carbon dioxide is nothing but more fakery. No one has ever found any other pH than 8.1 in the oceans beyond isolated environments such as estuaries. The claims otherwise are all predictions, projections, aquarium tests, etc., at least as they leave the science domain; and then nonscientists take it from there and create everything up to the green Martian disease with it. The reason why the ocean pH never varies is because it is buffered by calcium, which combines with the carbon dioxide to form carbonate. The calcium never runs out, so the ocean pH never varies.

All major global effects equilibrate. This means they change until they cannot change further in their interactions. These equilibrium forces are large, and they established themselves billions of years ago. For this reason, the temperature of the globe is in equilibrium with the energy being absorbed by the sun. Equilibrium is not a delicate balance.

The planet is cooled by radiation which goes around greenhouse gases, not through them. The equilibrium temperature is independent of how heat gets into the atmosphere. Moving energy around does not change the equilibrium temperature. A jar of pickles will absorb radiation, but it doesn't heat the kitchen. A gate half open won't keep in half the sheep.
Ocean Heat


The climate is too complex and random to be reduced to scientific measurements. Real scientists do not go down the path climatologists follow pretending to measure complexities and randomness which cannot be measured. A measurement requires that all influences over the results be identified and separated from other influences. Climate has too many interacting complexities to do that.

Another major problem is the "signal to noise ratio." Minute effects are supposedly measured, while huge influences overwhelm the measurements.

An example is the pretense of measuring an increase in ocean temperatures. Supposedly, the average ocean temperature increased 0.2C over the past several decades. The heterogeneity of the oceans is too extreme and rapidly changing for the trivial measurements being made. A claimed point of measurement doesn't say whether it represents five feet of water or five hundred miles of water. The temperatures can vary by several degrees in a few feet.

Physicists, including climatologists, have a bad habit of pretending to get a representative average out of several measurements, with no accounting for the unknowable variations. A large part of what they do is framed in statistical analysis; yet they don't bother with the statistical impossibility of getting a properly representative average out of unknown variations.

Any temperature in the oceans could be due to any number of unknown effects other than humans putting carbon dioxide into the air. Evaporation removes much more heat from the oceans than the miniscule quantities supposedly being measured. The heat gets replaced by the sun and accumulates, which also occurs in huge quantities compared to the miniscule effects claimed for carbon dioxide influences in the air.

The sun is said to add 168 watts per square meter to the earth, including oceans, while the claimed temperature increase for the oceans due to carbon dioxide in the air is said to be added through 0.27 W/m² (published article). That's a noise to signal ratio of 622 to 1. Most of the sun's energy leaves through evaporation, but some stays in the ocean. How much is impossible to guess.
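The quoted ratio follows directly from the two figures in the paragraph above; this sketch restates the division.

```python
# Noise-to-signal arithmetic from the paragraph above.
solar_input = 168.0    # W/m² from the sun (figure quoted above)
co2_input = 0.27       # W/m² claimed from CO2 in the air
ratio = solar_input / co2_input
print(round(ratio))    # → 622 (noise to signal, 622 to 1)
```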

Between ice ages, solar heat slowly accumulates in the oceans. The average increase is imperceptibly slow, while 168 W/m² from the sun is going into and out of the oceans constantly. The amount equilibrates, which means it stabilizes with all interacting forces. If 168 W/m² is equilibrating, why is 0.27 W/m² accumulating and showing a short-term increase of 0.2C? Any such effect would equilibrate to zero short-term effect along with the other 168 W/m². In other words, if 168 W/m² is going to exit the oceans to equilibrate, 168.27 W/m² is going to exit for the same reasons.

According to climatologists, there is a lot more radiation than 168 W/m² going into the oceans. They have huge amounts of radiation going into the oceans from the atmosphere and radiating back out. They use extreme amounts of radiation, because it is necessary for contriving a greenhouse effect. This image shows the numbers they use.

     [figure: radiation numbers]

They have an additional 324 W/m² radiating from atmosphere to earth including oceans (which are 71% of the surface area of the earth) and 390 W/m² radiating back out, with only 78 W/m² leaving by evaporation and 24 W/m² leaving by conduction. That's 0.05% of the energy staying in the oceans and adding heat, while 99.95% exits the oceans (0.27 ÷ (168 + 324) ≈ 0.0005). That's a noise to signal ratio of 1,822 to one.
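Using the larger radiation figures from the energy scheme quoted above, the percentages work out as follows; the numbers are the text's, only the arithmetic regrouping is mine.

```python
# Ratio using the climatologists' larger radiation figures quoted above.
incoming = 168.0 + 324.0   # W/m²: solar input plus back-radiation
claimed_gain = 0.27        # W/m² said to stay in the oceans
print(round(claimed_gain / incoming, 4))   # → 0.0005 (i.e., 0.05%)
print(round(incoming / claimed_gain))      # → 1822 (noise to signal)
```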

In actuality, there is almost no radiation emitted by the cold, average surface temperature of the earth, said to be 15C (59F). I estimate that 1% of the energy leaving the surface of the earth is in the form of radiation, with the rest being conduction and evaporation.

Climatologists don't determine the 0.27 W/m² in a direct manner; they claim to measure the temperature increase of the oceans and then calculate how much radiation would be required to produce that much heat when accumulating over 40 years. How come the 168 W/m² of energy which the sun puts into the oceans comes back out with no detectable amount staying in, while the 0.27 W/m² of energy which humans put into the oceans stays there and accumulates for 40 years with not an iota coming back out? Logic is sacrificed to contrivance in climatology.

The primary absurdity is that determining an average temperature of the oceans, repeated over the past 40 years, is totally impossible. The oceans are extremely heterogeneous, with rivers of motion and mountains of hot and cold temperatures.

A few years ago, a project called ARGO used 3,000 diving buoys to measure temperatures over the top 700 meters of the oceans. That's 351 kilometers of space between each one on average (3.7×10⁸ km² ÷ 3,000 ≈ 123,000 km² per buoy; √123,000 ≈ 351 km). There is an infinite amount of variation every 351 km in the oceans, and it changes constantly.
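The buoy-spacing estimate can be reproduced from the ocean area the text implies, 3.7×10⁸ km²; each buoy covers a patch whose side length is the square root of its share of the area.

```python
import math

# Buoy-spacing arithmetic from the paragraph above.
ocean_area_km2 = 3.7e8     # ocean surface area implied by the text, km²
buoys = 3000
area_per_buoy = ocean_area_km2 / buoys
print(round(area_per_buoy))             # → 123333 (km² per buoy)
print(round(math.sqrt(area_per_buoy)))  # → 351 (km on a side)
```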

The mentality throughout climatology and physics is that if you collect enough nonsensical data points for any question, they will average out to a representative average. The implication of that assumption is that infinite measurements are made over the domain in question. Yet physicists/climatologists can only measure miniscule fractions of the variations that occur. The error is in assuming that a miniscule number of measurements will produce the same average as an infinite number of measurements.

Climatologists get a huge amount of random error in their miniscule measurements—so much so that they can't get anything close to an actual number for temperature of the oceans, and hence for global average surface temperatures. The claimed measurements are contrived by starting at desired end points and faking a method of getting there.

Science requires standards for these reasons. Standards are not arbitrary in science. They are needed to overcome corruption of the process. It's not just a question of precision or convenience; it's a question of verifying with reliability. Science has the purpose of verifying. If it doesn't verify, it's not science.

The main problem is that procedures are not explained. Scientific criticism is blocked by that standard. Blocking criticism and accountability is not an alternative in science, it is a total absence of science.

In place of methodology, a few mockeries are splattered onto the page, such as a few simple math equations. They tell nothing. Methodology has to explain why and how in terms that can be checked out. It doesn't exist in physics or climatology. The pretense is that everything is too complex, as if an encyclopedia would be required. Bull roar. If it isn't described, it isn't science. Anything that is actually done can be described.
The Heat Capacity of Air

It's Too Low to Heat Oceans or Melt Polar Ice

The talk of oceans heating and polar ice melting due to carbon dioxide is nonscientific for the simple reason that there is not enough heat capacity in air to do that.

The heat capacity (called specific heat) of air is 1.0035 joules per gram per degree centigrade (J/g/C), which is the same as kilojoules per kilogram per degree centigrade (kJ/kg/C).

The specific heat of water is 4.1813 J/g/C or 1 calorie/g/C.

The density of air at 15C and sea level is 1.225 kilogram per cubic meter.

The density of water is 1,000 kilograms/m³. Sea water is slightly more dense, but we will ignore that.

Therefore, a cubic meter of water holds 3,401 times as much heat as a cubic meter of air at the same temperature. (4.1813 ÷ 1.0035 × 1,000 ÷ 1.225 = 3,401)

This means that heating a cubic meter of water by 0.2C from air would require 3,401 cubic meters of air losing 0.2C. If the oceans were heated 0.2C to a depth of 350 meters (half the depth of ARGO measurements), there would need to be 167 atmospheres of air losing 0.2C. (350 m × 3,401 ÷ 5,000 m × 70% = 167) (The height of the normal atmosphere is 12-15 km. A rough average is to assume it is all at sea level pressure to a height of 5 km. Oceans cover 70% of the earth's surface.) This means 167 times planet earth's atmosphere to do the heating of oceans which is claimed.
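
The ratio and the atmosphere count can be reproduced in a short Python sketch, using only the figures quoted above (specific heats, densities, a 5 km sea-level-pressure atmosphere, 70% ocean cover):

```python
# Heat-capacity comparison: heat held per cubic meter per degree, water vs. air.
cp_water, cp_air = 4.1813, 1.0035    # specific heats, kJ/kg/C
rho_water, rho_air = 1000.0, 1.225   # densities, kg/m^3

ratio = (cp_water * rho_water) / (cp_air * rho_air)
print(round(ratio))   # 3,401 cubic meters of air per cubic meter of water

# Atmospheres of air needed to warm the top 350 m of ocean by the same
# temperature step, over the 70% of the surface that is ocean.
atmospheres = 350 * ratio / 5000 * 0.70
print(round(atmospheres))   # 167
```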

There isn't that much air, yet fakes claim the oceans have been heated by 0.2C due to global warming. It's total contrivance. There isn't anywhere near enough heat in the air to heat the oceans the slightest amount.

Temperature Increase in the Oceans

If the atmosphere gave up 0.2C to the oceans, the amount of heat that it could transfer to the oceans would theoretically create about 0.001C temperature increase for the top one tenth of the oceans.

The calculations are these: Air has a heat capacity of 1 kJ/kg/C. The density of air is 1.23 kg/m³. The atmosphere has an equivalent of 5 km height at sea level pressure. A one square meter column has 6,150 kJ/C (1 × 1.23 × 5,000 = 6,150). Transferring 0.2C makes 1,230 kJ available (6,150 × 0.2 = 1,230). Water has a heat capacity of 4.18 kJ/kg/C. Its density is 1,000 kg/m³. A column to a depth of 350 m has a capacity of 1.46×10⁶ kJ/C (4.18 × 1,000 × 350 = 1.46×10⁶). Dividing the 1,230 kJ available by this capacity gives about 0.001C temperature increase in the ocean.
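
The column arithmetic above, restated as a short Python check (per square meter of surface, using the same figures):

```python
# Air column: cp (kJ/kg/C) * density (kg/m^3) * 5 km equivalent height.
air_column_kj_per_C = 1.0 * 1.23 * 5000      # 6,150 kJ/C per m^2 column
heat_released_kj = air_column_kj_per_C * 0.2 # air column cooling 0.2 C

# Ocean column: cp * density * 350 m depth.
ocean_column_kj_per_C = 4.18 * 1000 * 350    # ~1.46e6 kJ/C

dT_ocean = heat_released_kj / ocean_column_kj_per_C
print(round(air_column_kj_per_C))   # 6150
print(round(heat_released_kj))      # 1230
print(round(dT_ocean, 4))           # ~0.0008, i.e. about 0.001 C
```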

Melting Ice

Melting polar ice with air is even more ridiculous, because melting ice requires a lot of heat, called heat of fusion, which is 334 kJ/kg. Each cubic meter of ice melted would require 261,000 m³ of air losing 1C (334,000 ÷ 1.28 = 261,000). (A cubic meter of water or ice is about 1,000 kg. Melting requires 334 kJ/kg. Combined, it's 334,000 kJ/m³. The specific heat of air is 1 kJ/kg/C with a density of 1.28 kg/m³ at 0C.)

Dividing this number by the height of the atmosphere, which is equivalent to 5 km at normal pressure, gives 52 atmospheres of height above the ice (261,000 ÷ 5,000 = 52). That's for one meter of ice depth and 1C of global warming. If the ice is 10 meters thick, 520 atmospheres above it would be required to hold enough heat to melt it.
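
The same ice-melt arithmetic in code form, a sketch using the figures given above (heat of fusion 334 kJ/kg, air density 1.28 kg/m³ at 0C, 5 km equivalent atmosphere):

```python
heat_of_fusion = 334.0        # kJ/kg to melt ice
ice_kg_per_m3 = 1000.0        # mass of a cubic meter of ice, roughly
air_kj_per_m3_C = 1.0 * 1.28  # cp (kJ/kg/C) * air density (kg/m^3) at 0 C

# Cubic meters of air cooling 1 C to melt one cubic meter of ice.
air_m3 = heat_of_fusion * ice_kg_per_m3 / air_kj_per_m3_C
atmospheres = air_m3 / 5000   # against a 5 km equivalent atmosphere

print(round(air_m3))       # ~261,000 m^3 of air
print(round(atmospheres))  # 52 atmospheres per meter of ice depth
```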

Of course the air would not circulate well enough at more than a few kilometers of height. What really happens is that the air above polar ice rapidly becomes the same temperature as the ice, and nothing melts. It takes warm ocean currents to melt polar ice. The melting that has been occurring at the North Pole results from warm Pacific Ocean water flowing over the Bering Strait and into the North Pole area.

Ice at the South Pole keeps getting thicker, because it sits over land. Warming ocean currents put more moisture in the air which adds snow inland over Antarctica. Around the edges, a small amount of ice melts due to warming ocean currents. Why ocean currents warm and cool, no one knows, except that ocean temperatures slowly increase between ice ages, and oceans are extremely heterogeneous for temperature.

The glaciers on mountains are totally irrelevant, because they are usually too small. Only the Himalayas are large, and they are not melting, because they are too high to be reached by warm air currents. The low level ice melted shortly after the last ice age. The edges of the mountain glaciers constantly increase and decrease for random reasons. This effect was shown by the "iceman" found in the Alps after some ice melted. He died there about 5,000 years ago. This means there was no ice where he was about 5,000 years ago, then ice covered over him, and then the ice melted again a few years ago. Such ice melting and reforming has nothing to do with greenhouse gases.

Where does the heat come from?

The latest claim is that 90% of the heat produced by greenhouse gases went into the oceans. This claim is one of the attempts to explain why there has been no detectable change in the average, near-surface temperature over the past 18 years.

Climatologists already took care of all this, and that isn't the result they got. They didn't account for an iota of heat going into the oceans until recently. In fact, when the first ARGO measurements were made, about ten years ago, the result was that the oceans were slightly cooling. So the coldest measurements were thrown out, and the temperature was stable.

Then in 2015, a controversial calculation was made showing that the oceans heated 0.2C due to global warming, and this is why air temperatures have not been going up as expected. That's more heat than greenhouse gases can account for, as explained above. The past 40 years of climatology supposedly accounted for all heat, and none of it went into heating the oceans. Should not the past 40 years of calculations and measurements be done over? No one is saying a word about it. Contradictions of this sort exist in every point made in climatology, because the subject is totally contrived with no relationship to anything happening in nature. The contradictions are ignored rather than resolved.

The obfuscated methodology was this: The heat produced by carbon dioxide (primary effect) was calculated using the "radiative transfer equations" showing that 3.7 W/m² of energy less than the sun's energy gets trapped in the atmosphere and does not exit into space upon doubling the amount of CO2 in the air. This 3.7 W/m² is translated into a 1C near-surface temperature increase by applying the Stefan-Boltzmann constant in reverse.

As the atmosphere gets close to doubling the CO2 content, the supposed measurements are getting close to showing the expected 1C temperature increase, showing the godly precision and wisdom of climatology. None of this accounted for any of the heat going into the oceans.

For secondary effects, something about the oceans was calculated and modeled, but no explanations were published. If the primary effect did nothing to heat the oceans, why would the secondary effects? The models showed a continuous increase, which did not occur over the past 18 years; and then ocean heat was used as the explanation. Since the models showed an increase, while none occurred, the models must not have accounted for ocean heat.

Why was the expected 1C increase found with such precision, if 90% of the heat was going into the oceans and not accounted for in the analysis of either the primary or secondary effects? The answer is simple: It's impossible to contrive falsehoods without contradictions.
A Mechanism is Not Known

The endless claim that every molecule of CO2 means more heat in the atmosphere is absurd. Saturation means CO2 did all it can do long ago. It absorbed all radiation available to it. Increasing the CO2 only shortens the distance radiation travels before being completely absorbed. Shortening the distance is not increasing the heat.
There is a false concept that increased CO2 in the atmosphere causes the absorption spectrum to widen and thus absorb more radiation. Somehow, radiative transfer equations slide into this concept. It doesn't happen, because the bandwidth is determined by the energy state of the molecules. In other words, some bonds stretch more with more energy, causing them to absorb at a different frequency. But increasing the amount of CO2 does absolutely nothing to change the energy state of the molecules. Increased air pressure does increase the energy state, because then the molecules bump into each other harder, which stretches the bonds farther. Increasing the amount of CO2 in the air does not increase the pressure, so it doesn't change the energy state of the molecules, and it doesn't widen the bandwidth.

To rationalize this error, a World War Two graph of absorption is used, created from a propeller aircraft high in the atmosphere. The graph is nothing but engine noise. It has sine waves and spikes within the sine waves. Absorption spectra are never sine waves; they are always bell curves.

At the center of the main peak for CO2, all available radiation gets absorbed in ten meters at ground level (Heinz Hug) (6). On the shoulders of the curve are CO2 molecules which have stretched bonds causing them to absorb at slightly different wavelengths. There are fewer of these molecules; so they don't absorb as much radiation, and the distance traveled is greater before all such radiation gets absorbed. Supposedly, it is these shoulder molecules which heat the atmosphere, because they are not saturated. No dice. It's impossible to get them thin enough to not saturate.

In 2001, the IPCC (AR3) (7) stated that saturation exists in these terms: "Carbon dioxide absorbs infrared radiation in the middle of its 15 mm [sic] band to the extent that radiation in the middle of this band cannot escape unimpeded: this absorption is saturated. This, however, is not the case for the bands wings. It is because of these effects of partial saturation..."

A few years ago, scientists were saying that five percent of the CO2 molecules were unsaturated for creating global warming. This should mean that instead of the radiation traveling ten meters before getting totally absorbed, it should travel about twenty times as far, which is 200 meters. There is no significant difference between 10 meters and 200, as the air mixes over such short distances. Doubling the amount of CO2 in the air does nothing but cut those distances in half. Changing the distance is not increasing the heat.
Equilibrium Shift: After back radiation began to lose credibility, the rationalization changed, at least for some persons, to a shift in equilibrium temperature. Supposedly, the top of the atmosphere radiates energy into space at the same rate the sun adds energy to the earth. The equal rates are called equilibrium. When carbon dioxide in the atmosphere increases, radiation is said to not escape as easily, and a warmer temperature is required to liberate the same amount of energy as the sun provides.

One problem with this claim is that the planet is cooled by radiation which goes around greenhouse gases, not through them. About thirty percent of the infrared radiation given off by the earth and atmosphere is not obstructed by greenhouse gases. The unobstructed radiation cools the planet. A gate half open does not keep in half the sheep.

Another problem is that the location where CO2 no longer obstructs the flow of radiation outward is not at the top of the normal atmosphere (troposphere) but much farther out into the stratosphere. The pressure at the top of the troposphere is one tenth that of the earth's surface. With saturation occurring in 10 meters at the surface, it would occur in 100 meters at the top of the troposphere.

To then say that the near surface temperature increases is to say heat leaves from the near surface to heat the exiting point at 9 km up and does not radiate into space from any other place in the atmosphere but 9 km up. Supposedly heat moves upward through an "adiabatic effect," which creates the temperature gradient with height in the atmosphere. There is no adiabatic effect, since expansion is required, and no expansion occurs. It is radiation being emitted from all points in the atmosphere and going around greenhouse gases which creates the temperature gradient or assumed adiabatic effect.

Another problem with the latest rationalization for a mechanism high in the atmosphere is a reversal of cause and effect. The claim is that increased CO2 restricts the escape of radiation at the top of the atmosphere, and therefore, more heat is needed at ground level to increase the temperature at the top of the atmosphere causing more radiation to escape and achieve equilibrium with the energy entering from the sun. Needing energy at ground level is not going to produce it. If a higher temperature is needed at the top of the atmosphere, reduced emission will cause the increase in temperature. Nothing could possibly change at ground level in that mechanism. The claim is not being made that back radiation heats the earth's surface with that mechanism.

Notice that these rationalizations appear decades after the result has been decided. The claimed 3C temperature increase upon doubling carbon dioxide hasn't changed since 1979. These mechanisms are so vague that no quantitation is possible, yet all depends upon quantitating the mechanism—a mechanism that changes every few years.

Satellite: A sometimes-mentioned claim is that satellite measurement shows radiation to be emitted from 9 kilometers up for the wavelengths which are absorbed by CO2, which shows that CO2 absorption is not saturated at that height. Satellites cannot determine the height specific wavelengths come from. Satellites can only determine the height of total radiation, because it is a shift in wavelength which indicates the height. Shorter wavelengths do not go as far through the atmosphere. Wavelength cannot shift for the CO2 absorption spectrum, and therefore, the height cannot be determined for CO2 absorption wavelengths.
Back Radiation Doesn't Happen

The location of the primary effect of global warming is being placed high in the atmosphere, usually said to be 9 km (5.6 mi.) up, because saturation supposedly does not occur up there. Wrong. At the top of the normal atmosphere (troposphere), air pressure is one tenth that at sea level. A factor of ten is a joke. The saturation distance changes from 10 meters to 100 m. Looking at shoulder molecules as if they were 5%, the distance is still only 2 km. Doubling the CO2 reduces the distance to 1 km. Changing the distance is not increasing the heat. Complete blocking is still occurring at the top of the troposphere, which is 12 km up. The 9 km concept is nothing but rationalism with no relationship to reality.

The proof of the absurdity is an inability to get the heat to the surface from high in the atmosphere. "Back radiation" is said to be the mechanism. Back radiation would require a very large temperature increase high in the atmosphere to produce any amount of heating near the surface of the earth, while no temperature increase has been found high in the atmosphere. 
The obstacles to getting energy radiated back to the surface are numerous. It takes a lot of heat to create a lot of radiation. As others have noted, half of the radiation will go upward instead of downward. This means there would need to be twice as much temperature increase up high as occurs down low. Another major factor is that the temperature at 9 km height is -43C, which emits 40% as much radiation as near surface temperatures, according to the Stefan-Boltzmann constant. That means the temperature increase at 9 km height must be 2.5 times as much as the temperature increase near the surface. Only 30% of the radiation will go around greenhouse gases, which means 3.3 times as much temperature increase must occur at 9 km up. About 30% of the radiation will be reflected, which requires 30% more temperature increase at 9 km up. Totaling these effects requires a 24C temperature increase at 9 km up to heat an equal amount of air near the surface of the earth by 1C. (2 × 2.5 × 3.3 ÷ 0.70 = 24)

This does not account for oceans, which are 70% of the earth's surface. The oceans absorb radiation to a depth of 10 meters (30 ft) and do not release it easily. The oceans constantly accumulate heat between ice ages due to the absorption of radiation, mostly coming from the sun. Only an ice age cools them back down. Ice ages have been occurring at precisely 100 thousand year intervals. If 70% of the back radiation disappeared into the oceans, the temperature at 9 km would have to increase by 80C to heat the near surface by 1C. (24 ÷ (1 − 0.70) = 80) No temperature increase is occurring at 9 km up, because everything about the mechanism is ridiculous.
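
Multiplying out the factors claimed above can be done in a few lines of Python; this sketch simply chains the stated factors (half upward, colder emitter, 30% passing around greenhouse gases, 30% reflection, 70% ocean absorption):

```python
upward_loss = 2.0    # half the radiation goes up, so twice the warming needed
colder_air = 2.5     # -43 C emits only ~40% as much radiation (1 / 0.4)
around_ghg = 3.3     # only ~30% goes around greenhouse gases (~1 / 0.3)
reflection = 0.70    # ~30% reflected, so divide by 0.70

factor = upward_loss * colder_air * around_ghg / reflection
print(round(factor))   # ~24 C at 9 km per 1 C near the surface

# If 70% of the back radiation is absorbed by the oceans:
factor_with_oceans = round(factor) / (1 - 0.70)
print(round(factor_with_oceans))   # 80
```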
Absurd Distances between Molecules

There is a strange way of projecting the horror of greenhouse gases by contriving a potency out of absorption of radiation. Methane absorbs more strongly than carbon dioxide and is therefore portrayed as a greater potential danger. The fact is, the logic is reversed due to saturation. The strongest absorbers of radiation saturate more readily and have less for shoulder radiation which is supposed to do the magic. It's hypothetical, since none of them can do anything, but the assault on logic permeating through the media in describing such dreads as methane is suffocating.
If nonsaturation is where radiation gets to the top of the normal atmosphere (troposphere), the molecules would have to be 1,700 times as far apart as usual for carbon dioxide. This is because the top of the troposphere is about 17 km high (it varies from 12 to 17 km), while saturation occurs in 10 meters at the center of the absorption curve (Heinz Hug) (6). The distance between CO2 molecules is normally 500 nanometers. This is 200 picometers for each air molecule divided by 400 parts per million CO2 in the air. Multiplying this distance times 1,700 for nonsaturated shoulder molecules yields a distance of 0.85 millimeters between each CO2 molecule which heats the globe. This distance is visible with the naked eye. Molecules far enough apart to see, if they were visible, won't heat anything. The molecules which supposedly do the heating would have four million air molecules between each one.

To determine the same thing for methane, we are told that its potency is somewhere between 20 and 120 times that of CO2. We are told that residence time in the atmosphere reduces the potency to something like one half. So we will estimate methane to be 30 times stronger than CO2. The concentration of methane in the atmosphere is about 1.8 ppm. So we take the 10 m of travel distance for CO2 and divide it by 30 and then multiply this by the concentration ratio of 400 ppm divided by 1.8 ppm; and the result is that methane should saturate at the center of its absorption peak after traveling 74 meters compared to carbon dioxide's 10 meters. Next we divide this into the distance to the top of the troposphere (17 km) and get a ratio of 230 times as much distance on the supposed nonsaturated shoulders as at the center of the absorption peak. The distance between each methane molecule is 200 pm for each air molecule divided by the methane concentration of 1.8 ppm, which is 111 micrometers (microns). The shoulder molecules of concern are 230 times this far apart, which is 26 millimeters. This is approximately an inch.
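
The molecule-spacing arithmetic for both gases can be reproduced in Python; the figures are the text's own (200 pm between air molecules, 400 ppm CO2, 1.8 ppm methane, 10 m saturation distance, 17 km troposphere, 30x assumed methane potency):

```python
air_spacing_pm = 200.0   # assumed spacing between air molecules, picometers

# CO2: spacing scales inversely with its 400 ppm concentration.
co2_spacing_nm = air_spacing_pm / 400e-6 / 1000   # 500 nm between CO2 molecules
shoulder_ratio_co2 = 17000 / 10                   # troposphere / saturation = 1,700
co2_shoulder_mm = co2_spacing_nm * shoulder_ratio_co2 / 1e6
print(round(co2_shoulder_mm, 2))   # 0.85 mm between shoulder CO2 molecules

# Methane: 30x potency assumed, 1.8 ppm concentration.
ch4_travel_m = 10 / 30 * (400 / 1.8)        # ~74 m to saturate
shoulder_ratio_ch4 = 17000 / ch4_travel_m   # ~230
ch4_spacing_um = air_spacing_pm / 1.8e-6 / 1e6   # ~111 micrometers
ch4_shoulder_mm = ch4_spacing_um * shoulder_ratio_ch4 / 1000
print(round(ch4_shoulder_mm, 1))   # ~25.5 mm, roughly an inch
```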

So at this time, the globe is supposedly being heated by carbon dioxide molecules which are nearly a millimeter apart and methane molecules which are an inch apart. If the amount of these dreaded things doubles in the atmosphere, they will be twice as close together, and the distance for saturation will be half way to the edge of the troposphere. Reducing the distance by half is not increasing the heat.

What the saturation problem shows is that the so-called potent greenhouse gases, such as methane, saturate sooner than the others; and therefore, they are even farther from overcoming saturation and less of a threat, not more. For example, the U.S. Government is now planning to outlaw a refrigerant which is said to be 10,000 times more potent of a greenhouse gas than carbon dioxide. This would mean that it saturates somewhere in the millimeter range instead of the ten meters for carbon dioxide.

These contradictions prove that global warming cannot occur. At ground level, the molecules cannot get thin enough to not saturate. They would be almost a millimeter apart for carbon dioxide when saturating at the top of the troposphere. Changing the location to nine kilometers up requires at least 24C of heating, before back radiation can get enough heat to the surface of the earth to create 1C temperature increase.

The distances discussed here are never mentioned in the science of climatology. Science cannot be constructed out of phantom claims. Descriptions are omitted as a pattern and practice of contriving a subject out of fraud. The measurements produced by Heinz Hug (6) could not be published, because these points were discussed and distances were evaluated in his write-up.

The prevailing standard is an attempt to reduce science to revelation rather than clarification and verification. As with relativity, the justification is that the absence of something cannot be proven wrong. It's nonfalsifiable. Science is not the absence of something; it's the presence of something.

There are endless strange twists to this subject, as rationalizers attempt to resolve contradictions, while they can only compound the fraud, since there is no valid science to the subject, and the global warming which they try to justify doesn't exist. An example is the application of the Stefan-Boltzmann constant (SBC) to the atmosphere. The SBC was designed to show the amount of radiation which leaves the surface of a solid at any given temperature. It is totally inappropriate for gases such as the atmosphere, because they do not have a surface. Even if a strange definition of a surface is used, such as a kilometer thickness of the atmosphere, which is sometimes implied without a specific thickness, as one might see from outer space, a gas is still vastly different from the surface of a solid, since radiation can be emitted from all points in a transparent gas.

So climatologists often make this claim: The SBC says the earth must emit its 235 watts per square meter (W/m²) at a temperature of -19C. This temperature is found at a height of five kilometers in the atmosphere. Therefore, the earth is cooled by radiation which leaves from a height of five kilometers in the atmosphere. Any idiot could see the total absurdity of that claim. What keeps radiation from leaving at other heights, such as three or one kilometers up? Nothing does. Add up all the radiation from different heights, and it is immensely more than 235 W/m². In fact there is no way to add it all up, because there is no way to determine how many layers there are. There are infinite layers. So some other method must be used to determine how much radiation is leaving from any height. No theory exists to straighten that mess out.
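
The -19C figure quoted in the claim does follow from the Stefan-Boltzmann law; a quick check, using the standard constant and the 235 W/m² figure from the text:

```python
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
flux = 235.0      # W/m^2, the emission figure quoted above

# Temperature of a blackbody surface emitting 235 W/m^2: T = (flux/sigma)^(1/4)
T_kelvin = (flux / sigma) ** 0.25
print(round(T_kelvin))            # ~254 K
print(round(T_kelvin - 273.15))   # ~-19 C
```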

Radiative transfer equations tell how radiation is depleted at different concentrations of a gas, as Myhre et al vaguely referred to, but that methodology does nothing to resolve these contradictions, as there is no fix for applying the SBC to the atmosphere.

Heinz Hug reported that CO2 absorbs 99.94% of the available radiation within ten meters of travel at the center of its main peak when near the surface of the earth. But rationalizers say the shoulders are not saturated. If the shoulders are not saturated, the problem is distance. What distance is non-saturation? It doesn't exist.
If 5% of CO2 molecules are not saturated due to shoulder characteristics, as some rationalizers were saying a few years ago, they would be spread over 20 times as much distance as the other 95%. Not only do they represent 1/20th the heat captured by CO2, but they produce 1/20th as much temperature change with each unit of heat, since they are spread through 20 times as much atmosphere. Multiplying 1/20 times 1/20 equals 1/400th as much temperature change as the other 95% of the CO2. If these shoulder molecules are responsible for the 1C temperature increase due to the primary effect, the other 95% of the CO2 molecules would have had to produce 400 times that much temperature increase by the time they saturated, which is 400C. The existing atmosphere could not have been heated 400C.
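
The 1/400 factor above is simple to verify; a two-line sketch using the 5% shoulder fraction stated in the text:

```python
shoulder_fraction = 0.05          # 5% of CO2 molecules on the shoulders
spread = 1 / shoulder_fraction    # radiation spread over ~20x the distance

# 1/20th the heat times 1/20th the warming per unit of heat = 1/400;
# inverted, the saturated 95% would imply 400x the shoulder warming.
temp_factor = spread * spread
print(round(temp_factor))   # 400, i.e. 400 C if the shoulders alone give 1 C
```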

If shoulder molecules allow radiation to travel to the top of the normal atmosphere (troposphere) the distances would graph like this image. Doubling the amount of CO2 in the air would reduce the distances to half without changing the total heat absorbed. Changing the distance is not increasing the heat. If reducing the distance is increasing the temperature near the surface of the earth, then the temperature must be reduced proportionately higher up. Instead, at least some climatologists say the heating is created higher up.
Most radiation leaves the planet from the atmosphere, not the surface. About 30% of the emitted infrared radiation (called black body radiation) goes around greenhouse gases. Wikipedia says 15-30%. This includes radiation leaving from the atmosphere. This radiation cools the planet, establishing an equilibrium with radiation from the sun. Nothing more than that is relevant to global warming, because the planet cools independent of greenhouse gases. Rationalizers ignore this fact and contrive a mechanism at the top of the troposphere, where saturation is supposedly not relevant.

As carbon dioxide increases in the atmosphere, the distance traveled by radiation before being absorbed is shortened. Shortening the distance is not increasing the heat. This is as true for radiation emitted from the atmosphere as from the surface of the earth. Nothing reduces the amount of radiation escaping into space.

Radiation emitted from the atmosphere goes in all directions. Therefore, it does not appear to move energy toward space, and only the radiation which goes around the greenhouse gases goes into space. However, there is a subtle effect, where absorption by "greenhouse gases" and re-emission functions like conduction. Energy actually does move outward simply through the tendency to move from higher to lower concentrations of energy. This effect would cool the planet, even if no radiation went around greenhouse gases. Saturation is not relevant to this question, as most heat enters the atmosphere through conduction and evaporation. I estimate that less than ten percent of the energy gets to the top of the atmosphere and escapes this way, based on the fact that several hours are required for cool-down at night.

In other words, if no radiation could go around greenhouse gases; past, present or future; heat would work its way toward the upper atmosphere and into space by transferring through emission and re-absorption of radiation in the atmosphere. Whatever this does, whatever it means, increasing carbon dioxide has not the slightest effect upon the result. This is the truism throughout this subject—whatever was happening, is happening or will be happening, it cannot change the slightest amount with changes in carbon dioxide or other greenhouse gases in the atmosphere.
This method of slowly moving energy outward through absorption and re-emission is what creates the gradient of temperatures with height in the atmosphere. It's like heating a metal rod at one end and getting a gradient of temperatures to the other end. In the atmosphere, heat radiates into space all along the way creating a temperature gradient. The temperature gradient in the atmosphere ends at the top of the troposphere, because water vapor ends there, and it is the primary "greenhouse gas" by a long ways. Where the water vapor ends, the temperature gradient ends.

The usual claim is that an "adiabatic" effect creates the gradient of temperatures. Adiabatic means expansion of a gas results in cooling of temperature without an actual loss of energy. It is absurd for the atmosphere, because there is no significant expansion of the atmosphere. It has been at approximately the same pressure for billions of years. Vertical convection is required for expansion of the atmosphere, and there is very little large scale, vertical convection. Only cumulus clouds rise significantly, and they are quite rare. If there were significant vertical convection, clouds would mix and not be visible.

In other words, those who assume an adiabatic effect in the atmosphere cannot tell the difference between expansion of a gas and a pressure gradient. This point traces back, to some extent, into science. It's impossible to say nowadays what the knowledge of science is; it's elusive, because it's exploitive. Where there is real science, scientists study all points of new information and rapidly acquire common knowledge. There isn't enough real science left to acquire common knowledge. Too much of it is falsehood. As a result, scientists each have their own version of reality nowadays, and the contradictions don't get resolved.

The contradictions in these points are never addressed in climatology. To produce such a degree of fraud, publications have to be totally devoid of real science. The publications are nothing but news blurbs without a trace of indication of what sort of science produced the claimed results. Afterwards, the results all contradict each other, because a consistent set of falsehoods cannot be produced.

From a scientific perspective, what this subject looks like, within the science and the childish explanations on the internet, is tangential rationalizations. Some trivial irrelevancy will be studied, and monumental conclusions are drawn from the results. The given meaning is not in the data. Graphs which tell nothing are forever produced and given a meaning, while infinite alternatives are possible.

Normally, a measurement in science tells something, but not in climatology. The difference is that climatology is so infinitely complex that it totally evades scientific evaluation. To then pretend that something can be seen in the results of a measurement is total contrivance. It's on the order of astrology or palm reading. It's tangential, because the measurements have no relationship to the conclusions. Any measurement which can be made will supposedly answer some question, while there is no relationship between the measurement and the conclusion. Similarly, climatologists evaluate some effect as if it can be separated from infinite influences. Their evaluations are preposterous, because they omit infinite other influences.
An example is a pretense of determining how much oceans have heated due to global warming caused by humans. Oceans are extremely heterogeneous with rivers and mountains of cold and warm water, because heat is stored and doesn't release easily. To then assign a number for human influence is total contrivance. One of the points being missed is that oceans are continuously heating, because they trap energy from the sun and a small amount from geothermal energy. Only ice ages cool the oceans back down. A large part of the change that is occurring is due to oceans continually heating.

This graph is a proxy measurement of ocean temperatures using sea shell analysis. Each peak is an ice age. The past few ice ages have been occurring at exactly 100 thousand year intervals.

Similarly, mountainous glaciers are constantly going through long-term cycles of forming and melting. Notice that the "ice man" was in a location in the Alps without ice 5,300 years ago and was then covered with ice which didn't melt until recently. Yet glaciers melting are attributed to human production of carbon dioxide. Some scientists will correct such assumptions, while other scientists promote them.

My impression is that the jagged edges on the lines of the temperature graph are largely due to variations in solar influences, but they cannot create an ice age, because ocean temperatures must heat up substantially, particularly around the Arctic, to put enough moisture in the air to cause more snow to accumulate than can melt.
Temperature Measurements are Fake

Temperatures stopped rising, because measurements are fake, and they can't be faked upward any farther. Some of the procedures used to show a false increase could only be done once, such as eliminating colder stations, so they cannot be carried further.

fake temperature

The biggest concern with global warming over the past few years is why temperatures have not been increasing since 1998. Since temperatures never were increasing, why did the fakery stop in 1998? The obvious answer is that the fakery cannot be carried further. The discrepancy with actual measurements is getting too extreme to keep adding fake excuses for alterations of data.

After the email scandal, commonly referred to as climategate, which occurred in 2009, critics looked into temperature measurements, and everywhere they looked they saw tampering which created a temperature increase where raw data showed none. Earlier measurements were being lowered, and recent measurements were being increased to show an upward curve.

The TV Station KUSI in San Diego stated it this way: "Skeptical climate researchers have discovered extensive manipulation of the data within the U.S. Government's two primary climate centers: the National Climate Data Center (NCDC) in Asheville, North Carolina and the NASA Goddard Institute for Space Studies (GISS) at Columbia University in New York City. These centers are being accused of creating a strong bias toward warmer temperatures through a system that dramatically trimmed the number and cherry-picked the locations of weather observation stations they use to produce the data set on which temperature record reports are based."

Renowned meteorologist Joseph D'Aleo stated it this way: "NOAA is seriously complicit in data manipulation and fraud. ... NOAA appears to play a key role as a data gatherer/gatekeeper for the global data centers at NASA and CRU."

D'Aleo and Anthony Watts conducted a study (8) drawing these conclusions: "The startling conclusion that we cannot tell whether there was any significant global warming at all in the 20th century is based on numerous astonishing examples of manipulation and exaggeration of the true level and rate of global warming.

That is to say, leading meteorological institutions in the USA and around the world have so systematically tampered with instrumental temperature data that it cannot be safely said that there has been any significant net global warming in the 20th century."

Itemizing conclusions, they stated:

1. Instrumental temperature data for the pre-satellite era (1850-1980) have been so widely, systematically, and unidirectionally tampered with that it cannot be credibly asserted there has been any significant global warming in the 20th century.
2. All terrestrial surface-temperature databases exhibit very serious problems that render them useless for determining accurate long-term temperature trends.
3. All of the problems have skewed the data so as greatly to overstate observed warming both regionally and globally.
4. Global terrestrial temperature data are gravely compromised because more than three-quarters of the 6,000 stations that once existed are no longer reporting.
5. There has been a severe bias towards removing higher-altitude, higher-latitude, and rural stations, leading to a further serious overstatement of warming.
6. Contamination by urbanization, changes in land use, improper siting, and inadequately-calibrated instrument upgrades further overstates warming.
7. Numerous peer-reviewed papers in recent years have shown the overstatement of observed longer term warming is 30-50% from heat-island contamination alone.
8. Cherry-picking of observing sites combined with interpolation to vacant data grids may make heat-island bias greater than 50% of 20th-century warming.
9. In the oceans, data are missing and uncertainties are substantial. Comprehensive coverage has only been available since 2003, and shows no warming.
10. Satellite temperature monitoring has provided an alternative to terrestrial stations in compiling the global lower-troposphere temperature record. Their findings are increasingly diverging from the station-based constructions in a manner consistent with evidence of a warm bias in the surface temperature record.
11. NOAA and NASA, along with CRU, were the driving forces behind the systematic hyping of 20th-century global warming.
12. Changes have been made to alter the historical record to mask cyclical changes that could be readily explained by natural factors like multidecadal ocean and solar changes.
13. Global terrestrial data bases are seriously flawed and can no longer be trusted to assess climate trends or VALIDATE model forecasts.
14. An inclusive external assessment is essential of the surface temperature record of CRU, GISS and NCDC chaired and paneled by mutually agreed to climate scientists who do not have a vested interest in the outcome of the evaluations.
15. Reliance on the global data by both the UNIPCC and the US GCRP/CCSP also requires a full investigation and audit.

This was January 23, 2010. Not a word of it was seen or heard by most persons, while the charade of climate change goes on, and critics are excluded from the media under the pretense of protecting the truth from contamination. Real truth is not produced through the safekeeping of the wise. Real truth is strengthened through criticism. Only fraud needs to be sheltered from criticism.

Now there is a concern about a "pause" in the temperature increase. The rationalizations are moving in the direction that the oceans are absorbing the heat, and the increase will resume in the near future. Even if that were true, it shows that there are natural influences over global average temperature beyond human influences. What the evidence really indicates is that there never was a significant temperature increase in this century. The increase was contrived through manipulations. But the manipulations cannot be carried further, because most of the change was produced by lowering earlier temperatures and throwing out stations which show colder temperatures. That routine cannot be carried any further; hence the pause in the fake temperature increase.

There are images which show temperature changes in grids superimposed onto an image of the globe. They show variations over the oceans. There are no weather stations over the oceans to determine what is happening. Satellite measurements show no significant temperature increase in recent decades. This means the variations shown over oceans are contrivances. Scientists have no ability to determine what is happening over the oceans without satellite measurements.

Very little of the earth's surface can actually be measured through weather stations. The gaps are filled in through contrivance, as indicated by the exposed emails called "climategate." The claimed global average temperature changes are totally contrived.

A recent explanation is this: in April 2015, this issue of fake temperature data was in the news, and Senator Inhofe said his committee would conduct a hearing on it. The fakes countered by saying they increased the temperatures because earlier measurements were made during the afternoons, and now they are made during the mornings. Adjustments were said to compensate for the difference. Apparently, Inhofe's committee did not conduct the hearing.

This claim shows the frivolous pattern of contriving shameless lies and the inability of nonscientists to deal with them. The temperature measurements are made at weather stations, which have no uniform standards. No one would have been told if they changed the time of measurement. But there is no single time of measurement. Weather data is read several times per day and always states the highs and lows for the day. Why not just use the highs? They would have, since daily highs do not depend upon a time of reading. To say the time changed was a blatant lie.

Furthermore, most of the discrepancy was due to lowering earlier measurements and much less due to increasing recent measurements. Not only did the explanation fail to account for most of the discrepancy, it went the wrong way for most of it. Also, the alteration showed a long term incline for about thirty years, and at the same stations. One change would not produce a long term incline; it would show up as a one-time jump. Everything about the fakery is totally incredible.
Secondary Effects are Unscientific

Climatologists claim warming by a greenhouse gas results in secondary effects, mostly due to increased water vapor, and these effects are twice as large as the primary effect. The first absurdity is that natural variations in temperature are extreme. If they were producing twice as much secondary effect, everything would be frying. The second absurdity is that water vapor in the air is not determined by air temperature but by ocean temperature. The third absurdity is that any secondary heating which is greater than the primary heating would produce hysteresis, which means thermal runaway.

The claim that secondary effects produce most of the global warming is preposterous. If it were true, all temperature variations in the lower atmosphere would have to be mostly due to secondary effects. A need developed for increasing 1C caused by carbon dioxide into a 3C effect, so 2C was tacked on as a secondary effect.

The assumed historical record was indicating that if CO2 in the atmosphere were doubled, an increase in temperature of 1.7C (but usually assumed to be 1C) would occur. (The assumed historical record appears to be faked, as explained in the section on temperature measurements.) But critics were saying the increase would have to be 3C before they would be concerned. And abracadabra, another 2C showed up. It was said to be a secondary effect. The historical record included secondary effects. So there was an inherent contradiction in adding another 2C as secondary effect (called feedback). It was rationalism in contempt for facts and logic.

From season to season, temperatures typically vary by 25-40C. If 1C triples to 3C, why does not 25C triple to 75C? Perhaps only long term averages are relevant; but not quite. Temperature changes do what they do in hours, or not at all.

If the secondary effects are primarily due to water vapor, as claimed, dry air should be producing a lot less heat than humid air, like maybe one third as much heat. But we see the opposite, as desert air is the hottest, and ocean air is the coolest.

The claim is that global warming due to carbon dioxide increases the holding capacity of the atmosphere for water vapor, and water vapor is said to be something like 100 times as strong of a greenhouse gas as carbon dioxide. (Numbers are hard to pin down, since there is no objective reality to it.) Extending from that starting point, increased holding capacity is said to result in increased water vapor. Specifics are non-existent, as the modeling is never published beyond the equivalent of a news blurb.

Climatologists err in claiming the amount of water vapor in the air is determined by holding capacity. If so, the air would always tend toward saturation. No one knows what the global average humidity is, but a usual guess is somewhere in the vicinity of half saturation. Saturation is typically 3% for warm air, so an average is usually considered to be 1-1.5%. Do the modelers have a better number? Without it, how can their results have less than about 50% error? They used to claim 15% error, but now they give a range with various degrees of certainty. (The errors are additive for the hundreds of effects which they claim to model.)

Humidity is primarily determined by ocean temperatures. Air gets dryer the farther it gets from the oceans. Cooling draws moisture out of the air by creating precipitation, which includes uprising over mountains. Changes in holding capacity due to supposed effects of greenhouse gases will not occur over oceans, as air temperature over oceans equilibrates with the surface temperature of the oceans. This means greenhouse gases will not determine how much moisture enters the air, which is the essence of the claimed secondary effect.

The forces which remove moisture from the air would also swamp supposed effects by greenhouse gases. Simple changes in temperature do not remove much moisture from the air, only precipitation conditions do. Precipitation conditions involve dramatic effects including lower air pressure and collisions with cold fronts. The process of precipitation then releases massive amounts of heat, as much heat as was absorbed in the evaporation which put the moisture in the air.

A 1C global air temperature increase would disappear in such major forces associated with precipitation conditions. But climatologists supposedly have the effect calculated and modeled over the next hundred years. Weathermen can't say much about it for more than a few days. Why don't climatologists reveal their superior predictive abilities to the weathermen? When they publish nothing more than a summary and number, they produced nothing more than a summary and number. They start at the endpoint and juggle numbers to get there.

One of the basic pretenses of physicists including climatologists is that they can read any effect through any amount of noise. They are dealing with miniscule effects, which they supposedly can calculate in disregard to major effects which overwhelm their process. Miniscule effects do not survive major opposing forces.

In fact, the 1C upon doubling CO2 in the air has not yet occurred. Humans supposedly caused a 0.2C increase because of CO2 up to this point, and still twice as much secondary effect (0.4C) has supposedly occurred. This pattern indicates that, in the claims of climatologists, the most miniscule effects survive the opposing forces of nature.


If 1C caused 2C additional increase due to feedback, the additional 2C would cause another 4C increase, and these increases would keep compounding. But the claim is that the secondary increase due to water vapor feedback cannot increase more than 2C.

Putting a cap on feedback or secondary effects is an impossibility and contrivance. The claimed maximum would have been reached long ago due to the compounding effect, and no further increase would be possible at this time, if there is a 2C cap on it. There is no concept of why the cap would be any different now than in earlier times. Why would the cap have changed now due to human activities? This concept is nonsensical. Putting a 2C cap on secondary effects is an oxymoron or self-contradiction, because nothing can have two temperatures simultaneously. A cap says 2C above some temperature, but the starting point disappears due to the secondary effect. The effect would have to be a force for increase, which could not have a cap on it.
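The compounding described above can be checked with a short numerical sketch. This is purely illustrative: the 1C primary figure and the ratio of 2 (each secondary increment doubling the previous one) are the numbers quoted in the text, and the function name is mine.

```python
# Illustrative sketch of the compounding argument: if a 1C primary increase
# produces a 2C secondary increase, each round of feedback doubles the
# previous increment, and the summed series never settles at any cap.

def compounded_warming(primary_c, ratio, rounds):
    """Sum the first `rounds` increments of a geometric feedback series."""
    total = 0.0
    increment = primary_c
    for _ in range(rounds):
        total += increment
        increment *= ratio
    return total

for rounds in (1, 5, 10, 20):
    print(rounds, compounded_warming(1.0, 2.0, rounds))
# With a ratio of 2 the totals are 1, 31, 1023, 1048575 ... growing without bound.
```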

This effect is called hysteresis. It's a force, not a number. In electronics, hysteresis is caused by positive feedback from the output of an amplifier to the noninverting input. It causes the output to go rapidly to one of the voltage rails. A form of this is used for digital outputs, because it locks rapidly at either the plus or minus rail with nothing in between. The input must cross a threshold voltage to change the output.

There is no rail or maximum for temperatures. If temperatures are limited for some reason, that limit cannot be set by some hypothetical starting point, such as a 2C cap above a hypothetical primary effect. There is no such temperature as 2C above itself. In other words, an upper limit for temperature would have to be determined by some external requirements, not a hypothetical starting point called a primary effect.

If a secondary effect can influence itself, it creates a dramatic event. An example is a nuclear reaction. In electronics, thermal runaway is such an example. It burns up components. Combustion is another example. In the atmosphere, no such events have ever occurred, not the least reason being that nothing in the atmosphere can cause heat to generate more heat than it started with, as falsely claimed for greenhouse gases.

Even if greenhouse gases are assumed to create some heat, the amount of secondary heating would have to be less than the amount of primary heating, or heat generating itself would result in a hysteresis effect. Heat from a secondary effect would do the same thing as heat from the primary effect, which means heat generating itself. Secondary heating would have to be less than the primary heating to prevent hysteresis. Yet the secondary effect of 2C is said to be greater than the primary effect of 1C for carbon dioxide in the atmosphere.

If the secondary effect of heating is greater than the primary effect, it's hysteresis, which is preposterous for the atmosphere. In the atmosphere, temperatures change by a large amount, and they have never crossed a hysteresis threshold. A hysteresis threshold would have to be set by external factors, as combustion is set by oxidization of a substance. The hysteresis threshold cannot be set by a primary heating effect, as external factors must determine the threshold. Otherwise there is no definable threshold.

Of course, the promoters of global warming are not assuming a secondary effect is a hysteresis effect. But anytime a secondary effect is greater than the primary effect, and the secondary effect produces the same result as the primary effect, it is a hysteresis effect. In the atmosphere, a primary effect could be any temperature. Temperatures change drastically all the time. To say a 1C increase triggers hysteresis is preposterous, and to put a temperature dependent cap on it is preposterous, as only external factors can set limits for hysteresis.

The concept of secondary effects is an absurdity patched into the analysis for the purpose of showing more heating than could be attributed to carbon dioxide alone.
The Hockey Stick Graph

The pretense of humans upsetting a delicate balance was dependent upon creating the impression in the minds of the public that nature was forever the way it is now. The "hockey stick graph" had that purpose.

Hockey Stick Graph

It needed a straight handle showing the invariability of nature followed by an upward spike showing the dastardliness of human activity. A climb in temperature a thousand years ago called "the Medieval Warm Period" and a drop in temperature afterwards called "the Little Ice Age" needed to be obscured with modern, impeccable scientific measurement. Tree ring data from northern Siberia was used for that. It showed no change through measurable history. They couldn't lose, because tree ring data doesn't show temperature; it shows rainfall. Supposedly, the northern climate would make temperature more significant than rainfall. No one has conducted studies to show that assumption to be true.

Then an increasing temperature since the industrial revolution needed to be grafted onto the tree ring data, which by then was showing an aberrant decline, which could have resulted from anything. So "hide the decline" showed up in the climategate emails and became the song and dance of deniers.

What rationalizers say about the flat handle which omitted the Medieval Warm Period and Little Ice Age is that those things only occurred in the northern hemisphere. One problem is that the tree ring data was only collected in Siberia, which is in the northern hemisphere. Why was it flat? If the global average were flat, the opposite must have occurred in the southern hemisphere. Both hemispheres constantly changing doesn't point to the stable average and delicate balance that fakes promote.

It's impossible to determine a global average atmospheric temperature due to technical limitations. There is nothing for measurements over the oceans, very little in the southern hemisphere or "developing countries," very little over the poles, and worst of all, the weather stations were not designed for climate, because they have nothing for uniformity or proper standards including maintenance. Under these conditions, satellite measurements would be a major improvement; but they showed no change. So satellite measurements were altered to conform with contrived land-based measurements.
Recent measurements with thermometers show that atmospheric temperatures vary wildly for no identifiable reasons. Even in the past seventeen years, where the average is said to show no significant change, the actual measurements vary wildly from year to year, and only the average shows no significant change.

This means scientists don't have a clue as to the factors influencing atmospheric temperatures; and there is nothing in atmospheric temperatures that says a thing about so-called global warming.

century temperatures

recent temperatures

These graphs show how wildly atmospheric temperatures vary when using the same instruments each year. What global average is doing, scientists don't have the slightest ability to determine.
The Intimidation of Scientists

Numerous scientists are prevented from getting grants or publishing and often fired for being critical of global warming claims. Criticism is being disallowed in science and journalism. Scientists have no problem with criticism; only frauds do.

This mentality of science promoting and protecting some cause is the height of fraud. Science cannot promote and protect. It can only acquire evidence through measurement.

If there were such a thing as separating good science from bad, global warming would not exist, nor would relativity or the misdefinition of energy. Only the power of truth allows scientific knowledge to evolve forward.

Science is a measurement business, not a political or religious business. There are no mechanisms in science for differentiating good values from bad. In fact, there are no mechanisms in science for separating good science from bad science. Each scientist determines the difference between good science and bad, and there is no agreement on the subject. Bad science is simply ignored and falls to the wayside.

Truth evolves in science, or it doesn't exist. Truth cannot be arbitrated, and the only persons who try are corrupters.

Here are some quotes from scientists:

Tim Ball (9), climatologist, reported by the Toronto Sun, February 13, 2010: "If people knew just how deep and dark this conspiracy is (yes, conspiracy), they'd be amazed," he explains. "More and more academics are standing up to refute climate-change theories, but it's still dangerous to do so. It can mean the end of a career, the targeting of someone by well-organized fanatics."

Zbigniew Jaworowski (10), reported by Lawrence Solomon, Financial Post, May 4, 2007:

"...Because of the high importance of this realization, in 1994 Dr. Jaworowski, together with a team from the Norwegian Institute for Energy Technics, proposed a research project on the reliability of trace-gas determinations in the polar ice. The prospective sponsors of the research refused to fund it, claiming the research would be "immoral" if it served to undermine the foundations of climate research.

"The refusal did not come as a surprise. Several years earlier, in a peer-reviewed article published by the Norwegian Polar Institute, Dr. Jaworowski criticized the methods by which CO2 levels were ascertained from ice cores, and cast doubt on the global-warming hypothesis. The institute's director, while agreeing to publish his article, also warned Dr. Jaworowski that "this is not the way one gets research projects." Once published, the institute came under fire, especially since the report soon sold out and was reprinted. Said one prominent critic, "this paper puts the Norsk Polarinstitutt in disrepute." Although none of the critics faulted Dr. Jaworowski's science, the institute nevertheless fired him to maintain its access to funding."

Bill Gray (11), a climatologist at Colorado State University. Reported by channel 9 news KUSA TV in Colorado: "There's a lot of chicanery involved with pushing this global warming business," he said.

Gray, who has gained fame through his hurricane forecasts, says he has been a skeptic of global warming for two decades.

"We're persona non grata in a lot of circles," he said. "I've been told I'm no longer a credible scientist and I've lost grants ... I've had trouble getting papers published."

Professor Philip Stott (12), reported by The Telegraph, January 30, 2010: "There are many more scientists who think the way I do ... But they don't want to stick their heads above the parapet. They don't want to lose their jobs."

Professor Lennart Bengtsson (13), climate scientist. Reported by the Telegraph, May 15, 2014: "I have been put under such an enormous group pressure in recent days from all over the world that has become virtually unbearable to me. If this is going to continue I will be unable to conduct my normal work and will even start to worry about my health and safety."

There is no place for pressure in science. It is not science under such conditions. Incompetents in science cannot understand the process of science. They cannot understand that their subjective concerns are not supposed to be part of science. They cannot understand that the process of science takes care of itself when done in a valid manner. They cannot understand that truth takes care of itself through rationality. They cannot understand that they are not the protectors of truth. Truth has no masters; it evolves through the interactions of realities.
The Kiehl-Trenberth Model

The Kiehl-Trenberth Model was produced in 1997 (14) to balance some of the energy flows into and out of the planet. This was done using the Stefan-Boltzmann constant which shows about twenty times too much radiation being given off by matter at normal temperatures. The result was very little energy left for conduction of heat from the earth's surface into the atmosphere.

The ridiculously small amount of conduction shows that the Stefan-Boltzmann constant is wrong. Yet it is used throughout climatology and physics. And get this: Without such excessive radiation, the claimed greenhouse effect would not exist. With the Stefan-Boltzmann constant, there is not enough conduction; without it, there is not enough radiation.

According to the Stefan-Boltzmann constant, the surface of the earth must be giving off 390 watts per square meter of radiation at its average temperature of 15C (59F). To get their numbers to balance, climatologists have only 24 W/m² leaving the surface by conduction and convection. That's 6% as much conduction and convection as radiation, even though the earth's surface has a lot of wind moving across it. Cooling fans would never be used if only a 6% increase in cooling could be achieved. Fans remove far more heat than radiation alone would.

Here are the numbers:

claimed numbers

Average radiation from sun to earth: 235 watts per square meter.

Radiation from sun onto earth's surface: 235 - 67 = 168 W/m².

Radiation from atmosphere to earth: 324 W/m².

Total on earth's surface: 324 + 168 = 492 W/m².

From surface by conduction (air rising): 24 W/m².

From surface by evaporation: 78 W/m².

From surface as radiation: 390 W/m².

Of the 390 W/m²: 40 W/m² directly into space and 350 W/m² into atmosphere.

Net radiation from surface to atmosphere: 350 - 324 = 26 W/m².

Net energy from surface to atmosphere: 24 + 78 + 26 = 128 W/m².

From sun to atmosphere: 67 W/m².

Emitted from atmosphere to space: 128 + 67 = 195 W/m².

Total into space: 195 + 40 = 235 W/m².
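The arithmetic of the list above can be verified in a few lines of code. All values are in watts per square meter and are the figures quoted in the text; the variable names are my own.

```python
# Re-deriving the Kiehl-Trenberth budget figures listed above.
solar_in = 235                    # average radiation from sun to earth
absorbed_by_atmosphere = 67       # from sun to atmosphere
solar_to_surface = solar_in - absorbed_by_atmosphere           # 168
back_radiation = 324              # from atmosphere to earth
total_on_surface = solar_to_surface + back_radiation           # 492
conduction = 24                   # from surface by conduction (air rising)
evaporation = 78                  # from surface by evaporation
surface_radiation = 390           # from surface as radiation
direct_to_space = 40              # surface radiation passing straight to space
into_atmosphere = surface_radiation - direct_to_space          # 350
net_radiation = into_atmosphere - back_radiation               # 26
net_surface_to_atmosphere = conduction + evaporation + net_radiation   # 128
emitted_to_space = net_surface_to_atmosphere + absorbed_by_atmosphere  # 195
total_to_space = emitted_to_space + direct_to_space            # 235

assert total_on_surface == 492
assert total_to_space == solar_in   # the budget balances at 235
```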

Alarmist climatologists use this procedure to show that the numbers can be balanced when using the Stefan-Boltzmann constant, and the greenhouse effect is supposed to be a necessary method of getting the surface temperature up from the -18C which liberates 235 W/m², based on the Stefan-Boltzmann constant, to 15C.

But there is only 24 W/m² leaving the surface as conduction, with 390 W/m² leaving as radiation. That's 16 times as much radiation as conduction. Nothing resembling that happens below the temperature of white hot metals. Those numbers were forced into the method because of the preposterously high radiation indicated by the Stefan-Boltzmann constant.


To account for the extremely high radiation indicated by the Stefan-Boltzmann constant, there had to be a lot of radiation interacting with the earth's surface: specifically, 324 W/m² going from the atmosphere back to the surface. This amount left almost nothing for conduction, convection and evaporation. The 390 W/m² being emitted from the surface included 40 W/m² going into space and 350 W/m² going into the atmosphere. The 324 W/m² coming back out of the atmosphere and onto the surface had to be less than the 350 W/m² going in. The 324 W/m² left almost no space for conduction, convection and evaporation, because most of it had to be used to create the 390 W/m².

An important thing to notice about alarmist science is how sloppy everything is. Throughout the subject, there are contradictions. That isn't how science is supposed to work. When things don't look right, you find out what the problem is. You don't say the science is settled. Climatologists pushed themselves into a corner with fake numbers and false claims, and they can't remove the resulting contradictions.

In addition to the absurdly high radiation of 390 W/m² required by the Stefan-Boltzmann constant, this number is supposed to be adjusted for emissivity, which is nowadays said to be 0.64 for the earth's surface. This means 0.64 times 390, which equals 250 W/m² instead of 390 W/m². Yet a recently produced NASA energy budget continues to show the same 390 W/m² of the Kiehl-Trenberth model.

Presumably, when the Kiehl-Trenberth model was produced in 1997, a number did not exist for the emissivity of the earth's surface, so it was omitted. Later, a model by NASA reduced the radiation from 79% to 41%, presumably attempting to make it look more credible. But by then, the Kiehl-Trenberth number had been enshrined in several editions of the IPCC reports, so NASA apparently felt maintaining the same number would be less incriminating than reducing it to almost one half. And still, emissivity was not used to reduce the number to 250 W/m², which shows that a consistent absurdity was more important to them than correct scientific procedures.

Balancing ridiculous numbers was more important to alarmist climatologists than credible logic. Throughout the global warming issue, logic is sacrificed to absurd claims and fake mathematics, including falsified data. It also means physicists made up the Stefan-Boltzmann constant off the top of their heads with no relationship to objective reality. It rationalizes fake math and numbers for greenhouse gases, which require a lot of radiation, but it contradicts logic and evidence.

An approximate correction would look like this:

If the Stefan-Boltzmann constant were reduced to one twentieth, then at 15C (59F) only about 5% of the heat leaving the earth's surface would be radiation, while the remainder would leave as conduction, convection and evaporation, which is more in line with what is really happening. No greenhouse gas effect would be involved.

Instead of 390 W/m² radiation given off by the earth's surface at the average global temperature of 15C, the amount would be about 20 W/m². About half would go into the atmosphere and half past the atmospheric gases and into space, which is 10 W/m² each.

corrected numbers

15C surface: 20 W/m² radiation = 10 into space, 10 into atmosphere.

Sun's energy onto surface: 168 W/m².

Energy from the sun would heat up the surface and the air around it through conduction, convection and evaporation.

Claimed from sun to atmosphere: 67 W/m², then back into space.

From atmosphere to space: 158 + 67 = 225 W/m².

Same up and down: 225 W/m² onto surface.

Total into space: 225 + 10 = 235 W/m².

Total onto surface: 168 + 225 = 393 W/m².

Conduction, convection and evaporation from the surface: 393 - 20 = 373 W/m².

Total into atmosphere: 373 + 10 + 67 = 450 W/m².

Total leaving atmosphere: 225 up, 225 down = 450 W/m².

This means conduction, convection and evaporation heat the atmosphere at 373 W/m² with no greenhouse effect involved.
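The arithmetic of these corrected figures can be checked directly. The sketch below is a plain restatement of the numbers above, not an independent model; it verifies that energy into space matches energy arriving from the sun, and that the atmosphere's inputs and outputs balance:

```python
# Corrected energy-budget figures from the text, in W/m^2.
SUN_TO_SURFACE = 168      # solar energy absorbed by the surface
SUN_TO_ATMOSPHERE = 67    # claimed solar energy absorbed by the atmosphere
SURFACE_RADIATION = 20    # corrected surface radiation (vs. the claimed 390)
RAD_TO_SPACE = 10         # half of surface radiation escapes directly
RAD_TO_ATMOSPHERE = 10    # half is absorbed by the atmosphere
ATM_TO_SPACE = 225        # radiation from atmosphere upward
ATM_TO_SURFACE = 225      # radiation from atmosphere downward

# Total onto the surface: sun plus downward atmospheric radiation.
onto_surface = SUN_TO_SURFACE + ATM_TO_SURFACE                 # 393
# Conduction, convection and evaporation carry off the rest.
conduction_etc = onto_surface - SURFACE_RADIATION              # 373
# Everything entering the atmosphere.
into_atmosphere = conduction_etc + RAD_TO_ATMOSPHERE + SUN_TO_ATMOSPHERE  # 450
# Everything leaving the atmosphere.
leaving_atmosphere = ATM_TO_SPACE + ATM_TO_SURFACE             # 450
# Total escaping to space must match the energy arriving from the sun.
into_space = ATM_TO_SPACE + RAD_TO_SPACE                       # 235

assert onto_surface == 393
assert conduction_etc == 373
assert into_atmosphere == leaving_atmosphere == 450
assert into_space == SUN_TO_SURFACE + SUN_TO_ATMOSPHERE == 235
```

Every total balances, confirming that the list above is internally consistent.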

The 10 W/m² radiating from the surface of the earth and being picked up by molecules in the atmosphere do not create a greenhouse effect, because there is no difference between heat entering the atmosphere through radiation and heat entering through conduction, convection and evaporation. Heat entering by one route simply offsets heat entering by another. All heat is the same.

There is no analysis which says what temperature should result from energy moving around. The temperature equilibrates, and only measurement tells what it does.
The Stefan-Boltzmann Constant is in Error

Here is the Stefan-Boltzmann constant:

     W/m² = 5.67051 × 10⁻⁸ × K⁴

This result is the number of watts per square meter of infrared radiation supposedly given off by matter at a temperature of K kelvins (degrees Celsius plus 273).

For exactness, this calculation must be adjusted for emissivity, which means variation from the Stefan-Boltzmann constant. For rough, nonreflective materials, emissivity is usually in the range of 75-95%. These variations show the influence of chemistry.

At a normal temperature of 27C (80F), the Stefan-Boltzmann constant without emissivity indicates 459 W/m² being radiated.

At the assumed average temperature of the earth (15C, 59F), it's 390 W/m².

At the freezing temperature of water (0C, 32F), it's 315 W/m².

On a hot day of 37C (98F), it's 524 W/m².
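These four figures follow directly from the formula, using K = C + 273 as stated above. A quick check:

```python
SIGMA = 5.67051e-8  # Stefan-Boltzmann constant quoted in the text, W/(m^2 K^4)

def radiation(celsius):
    """W/m^2 given off per the Stefan-Boltzmann formula, without emissivity."""
    kelvin = celsius + 273
    return SIGMA * kelvin ** 4

assert round(radiation(27)) == 459   # normal temperature
assert round(radiation(15)) == 390   # assumed global average
assert round(radiation(0)) == 315    # freezing point of water
assert round(radiation(37)) == 524   # hot day
```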

If freezing water were emitting and absorbing the heat of three 100-watt light bulbs per square meter as radiation, the heat would interfere with the freezing process. Freezing would be highly finicky and prone to variation.

Simple observations indicate that the Stefan-Boltzmann constant projects 20 times too much radiation at normal temperatures. How accurate the constant is at higher temperatures is hard to say. The excessive radiation at normal temperatures is used to rationalize greenhouse gases.

Such a constant would not really exist. It's nothing but a fudge factor. Chemistry and other forces would create too much complexity for a single curve, as indicated by emissivity, which is an attempt to adjust for obvious errors.

With a corrected Stefan-Boltzmann constant, the surface of the earth without an atmosphere would emit 235 W/m² at a temperature of something like 50C, not -19C. With an atmosphere, the surface average is 15C. The atmosphere cools the surface, as it should, because the atmosphere is like a heat sink. This means the atmosphere picks up energy through conduction and convection, which removes heat much faster than radiation alone. Heat sinks (usually made of aluminum) are used for this reason throughout electronics to speed cooling.

Besides the Stefan-Boltzmann constant yielding too high a quantity at low temperatures, the constant is applied to solids and gases equally, which is absurd. Gases have a three-dimensional surface and low density, which promote the escape of radiation. Therefore, gases should have a much higher quantity for radiation emission than solids. But there would not be a single constant for gases, because the chemical composition would determine how radiation escapes.

Fake scientists rely heavily upon the Stefan-Boltzmann constant as a rationalization gimmick. They focus upon such claims as radiation leaving the earth at a height of 5 km, attempting to plant in people's minds the concept of a greenhouse effect. Global warming propaganda is all about impressions.

The Stefan-Boltzmann constant is expressed in terms of square meters, because it is supposed to be applied to the opaque surfaces of solids. The atmosphere is not opaque. How thick should the substitute for a surface be? What keeps radiation from leaving from other heights? Nothing could. The shamelessness of pretending such absurd effects shows no concern for honesty.
A Ridiculous 33C

The number one "fact" produced by nonscientists, particularly bureaucratic authorities, is that greenhouse gases supposedly heated the planet by 33C. This number, or a description of it, is at the top of nearly every website on the dangers of "climate change," which nearly every state and local government maintains. The derivation of this number is disgusting; yet it is repeated by Ph.D.s.

The sun puts an average of 235 watts per square meter of energy onto the earth. The Stefan-Boltzmann constant says that matter emits this amount of radiation at a temperature of -18C. But with an atmosphere the global average temperature near the earth's surface is 15C, which is 33C higher. Therefore, greenhouse gases supposedly heated the earth by 33C.
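The derivation is a one-line inversion of the Stefan-Boltzmann formula. Worth noting: 235 W/m² actually corresponds to about -19C (254 K), so the difference from 15C comes to about 34C; the commonly quoted -18C and 33C apparently rest on rounding the input up to roughly 240 W/m². A sketch:

```python
SIGMA = 5.67051e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR_IN = 235       # average solar energy onto the earth, W/m^2
SURFACE_AVG_C = 15   # assumed global average surface temperature, C

# Invert W/m^2 = sigma * K^4 to find the temperature matching 235 W/m^2.
t_kelvin = (SOLAR_IN / SIGMA) ** 0.25
t_celsius = t_kelvin - 273

# The difference from the 15C surface average is the claimed greenhouse heating.
claimed_heating = SURFACE_AVG_C - t_celsius

assert round(t_celsius) == -19       # 235 W/m^2 corresponds to about -19C
assert round(claimed_heating) == 34  # close to the widely quoted "33C"
```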

They missed the conduction, convection and evaporation which heat the atmosphere without greenhouse gases influencing the result. No real scientists could miss the conduction, convection and evaporation.
Quantities Don't Add Up

Supposedly, humans put 30% of the CO2 in the air. It comes from 8.6 gigatons of carbon per year produced by humans. A gigaton (GT) is a billion tons. The atmosphere contains 780 GT of carbon (GTC). Dividing shows that 8.6 GTC is 1.1% of that amount. The human amount would have to accumulate for 27 years to reach 30%. But half of the human amount is said to go into the oceans, so 54 years would be required.
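These ratios take three lines to verify, using the figures just stated:

```python
HUMAN_GTC_PER_YEAR = 8.6  # human carbon output, gigatons of carbon per year
ATMOSPHERE_GTC = 780      # carbon in the atmosphere, gigatons
HUMAN_SHARE = 0.30        # claimed human share of atmospheric CO2

# Yearly human output as a fraction of atmospheric carbon.
yearly_fraction = HUMAN_GTC_PER_YEAR / ATMOSPHERE_GTC
# Years of accumulation needed to reach the claimed 30% share.
years_to_30_percent = HUMAN_SHARE * ATMOSPHERE_GTC / HUMAN_GTC_PER_YEAR
# If half of the human output goes into the oceans, twice as long is needed.
years_with_ocean_uptake = 2 * years_to_30_percent

assert round(yearly_fraction * 100, 1) == 1.1
assert round(years_to_30_percent) == 27
assert round(years_with_ocean_uptake) == 54
```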

Vegetation exchanges 100 GTC per year with the atmosphere. That's 12 times as much as humans produce. How come the carbon coming from vegetation doesn't accumulate the way the human sourced carbon does?

Why does the human amount stop accumulating in 54 years? Maybe the human amount never stops accumulating, but no one is keeping a cumulative total. Instead, a dynamic state is described, where carbon dioxide has a "residence time." And afterwards, where does it go? No explanation. The residence time is usually stated to be 100-200 years.

The explanation is, "The amount of carbon dioxide taken out of the atmosphere by plants is almost perfectly balanced with the amount put back into the atmosphere by respiration and decay. Small changes as a result of human activities can have a large impact on this delicate balance."

There is an extreme shortage of CO2 in the air for plants to grow on. That's not a delicate balance. There was 20 times as much CO2 in the air when modern photosynthesis evolved. All biology is on the verge of becoming extinct due to the shortage of CO2 for photosynthesis. Greenhouse operators often add three times as much CO2 to the air to promote plant growth.

What rationalizers are trying to say is they like the temperature as it is. But it varies every few kilometers north or south. They could just move if they don't like it. But they also say weather will get extreme. It always is and always will be as extreme as 15 kilometers of troposphere will allow. What caused the drought of the thirties, the storms of the 50s or the little ice age a few centuries back? Why is the Sahara desert different from the Amazon rain forest?

The amount of carbon dioxide in the air has nothing to do with respiration and decay. Oceans continuously remove CO2 from the air. If oceans absorb half of the CO2 which humans produce, the amount would be 4.3 GTC per year. But humans were putting only about 4.3 GTC/Y into the air around 1970, so why didn't the oceans absorb all of it then? In fact, why were the oceans not absorbing vast amounts of CO2 from the air before humans came along? The answer to all of the above is that oceans could absorb everything humans produce, but ocean temperatures determine the amount absorbed, and ocean temperatures are increasing, as they continuously do between ice ages. There is a shortage of CO2 in the oceans for marine biology, just as in the atmosphere.
Absurdity of Ocean Acidification

One of the emphatic concerns which keep climate change alarmists on edge is acidification of the oceans. As oceans absorb carbon dioxide, they supposedly get more acidic. What might the pH of the oceans have been during dinosaur years, when there was five times as much CO2 in the air as now? Would sea creatures have been dying off?

Supposedly, the shells of sea creatures will dissolve if the pH of the oceans gets lower, because calcium carbonate dissolves at lower pH. Calcium carbonate is a soft substance. It doesn't create seashells without a lot of other things with it. Seashells are like teeth, hardened with numerous substances. Seashells also have coatings like paint, often proteinaceous. To claim that unmeasurable reductions in pH are killing sea life is mindless. Evolution has been creating these things for half a billion years, and they are going to die when someone sneezes? Someone just doesn't grasp what biology and evolution are, and they supposedly conduct laboratory experiments showing the dangers to sea creatures from a minuscule amount of CO2 in the air.

Rationalizers have been using the word "precipitation" in regard to the formation of calcium carbonate in the shells of sea creatures, the purpose being to use physical-chemical properties for analysis. Physical chemistry is vastly different from biochemistry. Reducing complex biological products to physical-chemistry analysis is reductionism, which guarantees an improper result.

Every atom and molecule in biological systems is controlled through enzymes, structures and environments to produce a defined and complex result. Shells include oxides, zinc, magnesium and other exotic elements in addition to calcium carbonate, to control hardness and resistance to acids, combined with proteins and other organics for structure and protection (similar to paint). The result is nothing resembling precipitated calcium carbonate.

The pH of the oceans can never get significantly lower than 8.1, which is thirteen times more alkaline than neutrality. There are thirteen times more hydrogen ions (which create acidity) in neutral water than in the oceans. This pH cannot change under usual conditions, because it is buffered by calcium, which combines with CO2 in the water to form calcium carbonate. No one has ever measured any other pH in the oceans beyond isolated environments. The talk of acidified oceans is nothing but contrivance and incompetent science.

Sometimes, fake scientists will say the pH of the oceans was 8.25 before human influences, and now it is 8.14 (more acidic). They don't have the slightest ability to determine what it was before human influences, as shown by the ridiculous claim that human minutiae could influence the result. Life would never have existed for billions of years if the climate were that finicky and biology so vulnerable. Fake scientists and gullible persons assume that nothing ever changes, and that climate was never any different. The minuscule effects being viewed as catastrophic are beyond belief.

Gullible persons are supposed to assume that such claims are measured science. If you look at the details, you will notice that nothing was measured; it was theorized. Not even proxy measurements will produce such minuscule resolution as the difference between 8.25 and 8.14. Throughout global warming "science," guessing and theorizing are promoted as measured science.
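Both pH comparisons in this section are simple powers of ten. A pH of 8.1 has about 10^(8.1-7.0) ≈ 13 times fewer hydrogen ions than neutral water, and the alleged drop from 8.25 to 8.14 corresponds to roughly a 29% rise in hydrogen-ion concentration. A sketch:

```python
def h_ion(ph):
    """Hydrogen-ion concentration in mol/L for a given pH."""
    return 10 ** (-ph)

# Neutral water (pH 7.0) vs. ocean water at pH 8.1.
ratio_vs_neutral = h_ion(7.0) / h_ion(8.1)
assert round(ratio_vs_neutral) == 13   # ~13x fewer H+ ions in the oceans

# The claimed pre-industrial 8.25 vs. present-day 8.14.
ratio_change = h_ion(8.14) / h_ion(8.25)
assert round((ratio_change - 1) * 100) == 29  # ~29% more H+ ions
```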

The oceans have had four billion years to absorb carbon dioxide. Why are they not acidic, when by all assumptions the atmosphere had more CO2 in it for 100% of that time than before human influences? No one ever claims that there was less CO2 in the air than at the start of human influences. There was five times as much CO2 in the air during dinosaur years, and twenty times as much when modern photosynthesis began. So why weren't the oceans more acidic then?

If carbon dioxide kills sea creatures by acidifying the oceans, why aren't they all dead? How did they survive five times as much CO2 in the air during dinosaur years? Even the previous ice age cycle (100 thousand years ago) would have had more CO2 in the air, because the graphs show a continuous downward slope. Fake scientists have to be total idiots to claim the slight increase in CO2 caused by humans is detrimental to sea creatures, when CO2 has never been this low in the past 500 million years.
Weather Claims Ignore the Obvious

Weather is primarily controlled by the oceans. The first thing this should mean is that greenhouse gases could not have much influence on weather, even if they were creating global warming.

Oceans have a large influence on air temperature, since the surface of the earth is seventy percent oceans. Air sweeping over such an expanse acquires a similar temperature. But even more important is humidity. The temperature of the ocean surface determines the amount of water vapor that enters the air moving over it. When the ocean surface warms up, a lot of precipitation results, as El Niños show. And as the ocean surface cools down, the air gets dry, as La Niñas show.

Since air temperature has no significant ability to influence ocean temperature (due to lack of heat capacity in air and huge heat capacity in oceans), the large-scale, average, air temperature increase which greenhouse gases are supposed to create would have no significant influence on weather.

Yet all weather effects are attributed to global warming. Often, scientists will say that severe weather events, such as hurricanes or tornados, were not caused by global warming, but to no avail, as journalists and nonscientists totally swamp scientific statements with their propaganda.

There was a major weather shift in North America starting around 1980. It was caused by warming of the surface of the Pacific Ocean. In the northern plains, whose weather is seldom influenced by the Gulf of Mexico air that sweeps northward over Iowa and Illinois, the corn belt moved about a hundred miles farther west due to increased precipitation. The warmer Pacific Ocean air had much more moisture in it. One of the results was warmer winters and cooler summers due to increased cloudiness.

Around 1998, the surface of the Pacific Ocean started to cool back down, and precipitation on the northern plains returned to the normal pattern of sporadic dryness and storms with cold winters and hot summers.

Changing Weather is not Global Warming

A lot of nonscientists think they can look out the window and see global warming. This is probably why society cannot be told that the science of global warming is not there. They can see it out their window.

Extreme incompetence in recent science promotes the problem. Scientists claim that another ice age will be caused by some mysterious cooling effect, which results in snow not melting, reflecting away sunlight and creating more cooling. It's nauseating stupidity.

There is a law of physics which says energy cannot be created or destroyed. Seventh graders are supposed to learn this. Where then is the initial cool-down supposed to come from? Scientists have the cooling occurring before the snow accumulates to reflect away sunlight.

There are significant changes in solar intensity, which apparently caused the "little ice age" of several centuries ago. But these cannot create an ice age, because they don't create enough long-lasting snow for reflecting away sunlight. The summers melt the snow. And there is less snow during such cool-downs, because cold air does not hold much moisture.

Ice ages have been occurring at 100 thousand year intervals for the past million years. The general assumption is that the earth's orbit changes in cyclic ways to cause ice ages. Those cycles are too trivial and complex to be of much significance. When the earth tilts, it still gets the same amount of solar radiation, only the location changes. And why have the recent ice age cycles been occurring for only one million years?

The cause of recent ice age cycles is a water clock in the Pacific Ocean. It cycles about every 80 or 90 years. Shifting of tectonic plates created the present conditions. As oceans heat up and rise between ice ages, warm water from the Pacific Ocean flows over the Bering Strait and melts Arctic ice. The water flows out of the Arctic into the Atlantic. There it can stop the ocean conveyor, which used to be called the "Gulf Stream," as it did a few years ago, causing some panic before restarting. After several years, the Pacific Ocean cools back down, and the Arctic freezes over again.

With each cycle, the oceans get warmer, as constantly occurs between ice ages. The sun's energy penetrates into oceans about 10 meters (30 ft), and it does not escape easily. So it accumulates between ice ages. Only an ice age can cool the oceans back down. As ice forms on land, the ocean level drops about 130 meters (400 ft). As the ice melts, cold water flows back into the oceans.

An ice age begins when warm Pacific Ocean water flows over the Bering Strait melting Arctic ice and warming the Arctic area to such an extent that a large amount of snow falls in northern areas. Warm water causes a lot of moisture to enter the air through evaporation. In the north, the result is a lot of snow. If the snow cannot all melt during the summer, it reflects away so much solar energy that a precipitous cool-down occurs causing the next ice age.

During the eighties, the Pacific Ocean was getting so warm that it put a lot of moisture in the air and caused a lot of precipitation in the northern USA. The increased precipitation caused the corn belt to shift about a hundred miles further west allowing corn and soybeans to be grown where there were normally prairie grasslands. The winters were warmer and the summers cooler due to increased precipitation and cloud cover. This condition continued sporadically during the nineties and two thousands. These changes were attributed to carbon dioxide in the air by global warming alarmists. Several years ago, they were saying, if this heat-up continues, the Arctic ice could melt by the year 2050 and cause some sort of disaster. Over the next few years, most of the Arctic ice melted, as warm Pacific Ocean water flowed over the Bering Strait and melted the Arctic ice. There was a heavy accumulation of snow in northern Canada, but not enough to trigger the next ice age, as it melted during the summer. Alaska, of course, was warmed immensely, as the normal ice was replaced by warm Pacific Ocean water.

All this is attributed by global warming alarmists to carbon dioxide, while it is caused by endless cycles of a water clock. Arctic ice melted sometime around 1900, as plans were made for a shipping route through the "Northwest Passage." It was navigated by Amundsen in 1903-1906. But ice soon closed it back up. Scientists do not claim that humans were causing global warming at that time.

Heating of the Arctic by warm Pacific Ocean water causes a lot of heat loss from the planet due to ice thawing and refreezing. The heat radiates into space at a greater rate than occurs when the Arctic is cold. For this reason, a cool-down of ocean temperatures follows the heating. This cool-down resulted in the drought of the thirties in the USA. The drought lasted about ten years. There appears to be a drought beginning again in the USA, as the Pacific Ocean has gotten colder, at least on the surface. If this cycle is similar to the last one, the drought will probably last ten years again.

These cycles get more extreme each time due to solar energy accumulating in the oceans. The next ice age will be triggered when the Arctic gets so warm that more snow accumulates in northern areas than can melt during the summers. Notice, it is warming of Arctic ocean water that causes the ice age to begin, not some coldness transplanted from who knows where.

The short cycles appear to be quite variable. What is consistent is the amount of time required for oceans to heat between ice ages. There needs to be enough heat in the oceans to cause a lot of precipitation to occur in northern areas where snow accumulates.
Wrap Up

The primary effect of carbon dioxide was supposedly determined through radiative transfer equations plus modeling which converted radiation into a global average temperature. The claimed result was that about a 1C temperature increase would occur upon doubling the amount of CO2 in the atmosphere. A fudge factor was produced for calculating this quantity (heat increase = 5.35 × ln(C/C₀) W/m²; temperature increase = 0.75 times heat increase). Then secondary effects were modeled, claiming that the primary effect would be approximately tripled to 3C, mostly due to increased water vapor in the air, which is said to be a much stronger greenhouse gas than CO2.
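Evaluating the fudge factor as written is a two-line calculation. Note that the stated 0.75 multiplier actually yields about 2.8C per doubling, close to the 3C figure that includes secondary effects; the ~1C primary figure would correspond to a multiplier of roughly 0.3. The sketch simply evaluates the formula as given:

```python
import math

def heat_increase(c, c0):
    """Claimed forcing in W/m^2 for a CO2 rise from concentration c0 to c."""
    return 5.35 * math.log(c / c0)   # natural logarithm, per the fudge factor

# Doubling CO2: the ratio c/c0 is 2.
forcing_per_doubling = heat_increase(2, 1)
temp_increase = 0.75 * forcing_per_doubling

assert round(forcing_per_doubling, 2) == 3.71  # W/m^2 per doubling of CO2
assert round(temp_increase, 1) == 2.8          # close to the claimed 3C, not 1C
```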

Saturation was skipped over, implying that radiative transfer equations took care of everything influencing the absorption of radiation by carbon dioxide in the air. The problem is, saturation cannot be overcome. If some math gets heat out of the result, the math is wrong. Even if Heinz Hug was wrong about the distance for saturation, which he claimed is 10 meters based upon measurements which he and others did, the problem of saturation cannot be overcome, because the molecules get too thin at any distance. There would be 4 million air molecules surrounding each CO2 molecule which does the heating, if Heinz Hug was right in showing saturation to occur in 10 meters. Even if he were off by a factor of a thousand, there would be 4 thousand air molecules surrounding each CO2 molecule which does the heating. To get a 1C average temperature increase would require an equivalent heat of 4000C for each of those CO2 molecules.
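The proportionality in this paragraph is straightforward: if only one molecule in n does the heating, each such molecule must carry n times the average temperature rise. Taking the stated ratios at face value (the 4-million and 4-thousand figures are the text's, not independently verified here):

```python
AVG_INCREASE_C = 1.0   # claimed average warming for doubled CO2, in C

def per_heater_equivalent(avg_increase, molecules_per_heater):
    """Equivalent temperature rise each heating molecule must carry
    if the average rise is spread over molecules_per_heater molecules."""
    return avg_increase * molecules_per_heater

# If one molecule in 4,000 does the heating (Hug off by a factor of 1,000):
assert per_heater_equivalent(AVG_INCREASE_C, 4000) == 4000.0
```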

To then model the radiation, whatever its level, for converting radiation into heat could not produce a correct result, because no mechanism is known for converting radiation into heat. The claimed location shifted several times, and a different mechanism is required for each location. The current assumption seems to be (it varies from person to person) that the radiation which escapes from the top of the atmosphere is diminished upon increasing CO2.

Two absurdities: the top is endlessly beyond the troposphere, and most radiation which cools the planet leaves from the troposphere, more from the bottom than from the top, because the bottom is warmer and more dense.

Why would the official scientists not have measured the saturation distance, when it is extremely easy to do and the central concern? They undoubtedly measured it many times over. Since there is no published evidence of their doing so, they obviously didn't like the result and used a method of erasing the question, which is the only reason for doing radiative transfer equations.

Radiative transfer equations are a ridiculously complex method of determining radiation transmission through a gas, when it could be directly measured with a few hours of work. It's like a boxer who is too heavy for the welterweight class and knows he would lose in the heavyweight class. So instead of using a scale to determine his weight, he uses infinite complexity including diameter, buoyancy, color, height, hair length and time of day, and then he fits in the welterweight class.

The modeling goes back to the seventies, and the primary effect was modeled in 1998 producing the invariable and undisputed fudge factor. If a real mechanism were used or known, the description would not vary from person to person and year to year. If the assumed mechanism really did change that much, the result would not have always been the same 3C temperature increase upon doubling CO2 in the atmosphere.

The secondary effect requires the same modeling assumptions as for the primary effect, since water vapor functioning as a greenhouse gas is assumed to produce most of the secondary effect. Water vapor is assumed to be a much stronger greenhouse gas than CO2 (more or less 100 times stronger), which means it saturates in a much shorter distance. In other words, the saturation problem is much more extreme for water vapor than CO2, and yet water vapor is said to produce twice as much heating as a secondary effect than the CO2 does as the primary effect.

Whatever is being done in the dark holes of radiative transfer equations and modeling atmospheric effects, it couldn't be anything resembling correct science, because the contradictions in the claims cannot be resolved through math or modeling.

The procedures and publishing used for this subject are so far removed from scientific methods that they should not be viewed as science. The reason is that no one can determine what was done to verify the results. It's like not filing tax forms and saying, just trust us. The government wouldn't get much for taxes, if no forms were filed. This leaves nothing but the impeccable standards of climatologists to base reliability upon, while they shove out anyone who criticizes them, preventing them from getting grants or publishing and often firing them. Real science does not require a suppression of criticism. The end result is to base the entire subject on the claim that 97% of the scientists agree, while no one in science is allowed to disagree. It's like having the Mafia do the banking with no bookkeeping, since they can do no wrong.

Creationists say dinosaurs drowned in a flood ten thousand years ago, because the Bible says so. They have a geologist who says rocks can form in 500 years, and that's why fossils are found in rocks on mountains. It's the same story for carbon dioxide. Nonscientists feed us their imagination, because they don't have a clue as to what scientists are saying, but it couldn't be wrong, because they agree with it. But with climatology, even the scientists don't know what other scientists are doing, not only because it isn't published, but because there isn't enough objective reality to it to even guess what they are doing beyond fakery and propaganda. No two scientists produce the same description of a mechanism or its related complexities.


1. Charney, J.G., Arakawa, A., Baker, D., Bolin, B., Dickinson, R., Goody, R., Leith, C., Stommel, H.M. & Wunsch, C.I. 1979. Carbon Dioxide and Climate: A Scientific Assessment. Washington, DC: National Academy of Sciences Press.

2. Hansen, J., A. Lacis, D. Rind, G. Russell, P. Stone, I. Fung, R. Ruedy, and J. Lerner, 1984. Climate Sensitivity: Analysis of Feedback Mechanisms. Geophys. Mono. 29:130-163.

3. Hansen, J., I. Fung, A. Lacis, D. Rind, S. Lebedeff, R. Ruedy, G. Russell, and P. Stone, 1988. Global Climate Changes as Forecast by Goddard Institute for Space Studies Three-Dimensional Model. J. Geophys. Res. 93:9341-9364.

4. Myhre, G., E.J. Highwood, K.P. Shine, and F. Stordal, 1998. New estimates of radiative forcing due to well mixed greenhouse gases. Geophys. Res. Lett. 25:2715-2718.

5. Marc Morano. U.S. Senate Minority Report: More than 1000 scientists dissent on global warming. Climate Depot.

6. Heinz Hug. The Climate Catastrophe - A Spectroscopic Artifact? July 31, 1998.

7. IPCC, AR3, Section 1.3.1, 2001.

8. Joseph D'Aleo and Anthony Watts. Surface Temperature Records: Policy Driven Deception? August 27, 2010, Science and Public Policy Institute.

9. Tim Ball. Michael Coren. Climatology expert threatened for climate change views. February 13, 2010, Toronto Sun.

10. Zbigniew Jaworowski. Lawrence Solomon. The ice-core man. May 23, 2007, National Post.

11. Bill Gray. Adam Chodak. Stolen e-mails spark climate change firestorm. November 25, 2009, 9NEWS.

12. Philip Stott. James Delingpole. Climategate: time for the tumbrils. January 30, 2010, The Telegraph.

13. Lennart Bengtsson. Lucy Kinder. Climate scientist forced from position after 'McCarthy-style' pressure. May 15, 2014, The Telegraph.

14. The Kiehl-Trenberth Model. Jeffrey T. Kiehl and Kevin Trenberth. Earth's Annual Global Mean Energy Budget. Bulletin of the American Meteorological Society, Vol. 78, No. 2, 1997, pp. 197-208.