Temperature Effects Cannot Be Determined from Radiative Transfer Equations
It is extremely strange that radiation was calculated when there is no method of converting radiation into temperature. The radiation was calculated using radiative transfer equations, with the results expressed as the three-component fudge factor. From this, the primary effect of carbon dioxide heating the atmosphere is supposedly determined.
To model temperature increase due to CO2, a primary effect is the starting point, and then secondary (feedback) effects are added. The primary effect is described as "radiative forcing due to CO2 without feedback," which is the fudge factor, and this is converted to a temperature with a "conversion factor." The result is said to be 1°C for doubling CO2 in the atmosphere. The simple math is this:

3.708 W/m² ÷ 3.486 W/m² per °C = 1.064°C ≈ 1°C
This little calculation is promoted as an unquestionable law of physics upon which all else is based. It is the primary effect from which secondary effects are modeled. Only the secondary effects are in question. The calculation is total fakery.
What this means is that climatologists claim that the primary effect upon doubling carbon dioxide in the atmosphere will heat the atmosphere by 1°C, and then modeling is used to determine secondary effects, which supposedly add 2°C for a total of about 3°C. The primary effect is not questioned, supposedly being an absolute law of physics, and only the secondary effects are studied and argued.
A source for this claimed law of physics cannot be determined. A quote for it comes from Stefan Rahmstorf, "Anthropogenic Climate Change: Revisiting the Facts," pp. 34-53 in Global Warming: Looking Beyond Kyoto, edited by Ernesto Zedillo (2008), where he said this: "Without any feedbacks, a doubling of CO2 (which amounts to a forcing of 3.7 W/m²) would result in 1°C global warming, which is easy to calculate and is undisputed." Rahmstorf's citation is this: "IPCC, Climate Change 2001: Synthesis Report." There is nothing resembling the Rahmstorf claim in the IPCC reports.
Inquiring scientists cannot find a source for the calculation which Rahmstorf refers to. Attempts to explain it result in endless complexity and confusion. The simple reason is because there is no way to determine temperature from radiation in the atmosphere.
Climatologists presumably use the Stefan-Boltzmann constant (SBC) to derive the temperature of 1°C from the 3.7 watts per square meter, because they all say the number is easy to calculate, and only such a simple calculation as the SBC shows a relationship between radiation and temperature. Physicists claim the relationship goes both ways: from temperature to radiation, and from radiation to temperature.
The SBC is this:
W/m² = 5.670373 × 10⁻⁸ × K⁴
The global average, near-surface temperature is said to be 15°C (288 K). Average emissivity is said to be 0.64. At those values, the change in radiation per degree (the slope of the SBC, 4 × 0.64 × 5.670373 × 10⁻⁸ × 288³) comes to about 3.486 W/m² per °C. Dividing the claimed forcing by that slope:

3.708/3.486 = 1.064°C
The result is the desired 1°C for the primary effect of doubling CO2 in the atmosphere, as if climatologists could calculate such things with extreme precision. They claim about 1% error on this factor. However, the SBC shows about 20 times too much radiation at normal temperatures. Reducing the radiation in the SBC by a factor of 20 shows this:
3.708/0.175 = 21.189°C
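The arithmetic above can be sketched in a few lines. This is a minimal sketch assuming, as argued here, that the conversion factor is the slope of the SBC at 15°C with 0.64 emissivity; no official source spells the derivation out, so the numbers are illustrative and rounding differs slightly from the figures in the text.

```python
# Presumed conversion from forcing (W/m^2) to temperature (C),
# using the slope of the Stefan-Boltzmann relation -- an assumption,
# since the official derivation cannot be found.

SIGMA = 5.670373e-8   # Stefan-Boltzmann constant, W/m^2/K^4
EMISSIVITY = 0.64     # claimed average emissivity
T = 288.0             # 15 C in kelvin (rounded)
FORCING = 3.708       # claimed W/m^2 upon doubling CO2

# Slope of the SBC curve at T: derivative of eps*sigma*T^4.
slope = 4 * EMISSIVITY * SIGMA * T**3   # about 3.47 W/m^2 per C

# The claimed primary effect of doubling CO2.
delta_t = FORCING / slope               # about 1.07 C

# Reducing the SBC radiation by the claimed factor of 20
# multiplies the resulting temperature by 20.
delta_t_corrected = FORCING / (slope / 20)   # about 21 C

print(round(slope, 3), round(delta_t, 2), round(delta_t_corrected, 1))
```

With these rounded inputs the slope lands near 3.47 rather than the 3.486 used in the text; the conclusion (about 1°C before the correction, about 21°C after) is unchanged.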
The result shows 20 times as much temperature increase as climatologists claim, when the SBC is corrected for too much radiation. None of these results are real, as the claimed radiation (3.708 W/m² upon doubling CO2) was contrived for the purpose of eliminating the significance of saturation. With saturation, no radiation change would occur to increase temperatures as global warming.
In addition to the quantitative absurdities, it is not valid to reverse the Stefan-Boltzmann constant as a method of determining temperature, and there is no other method of getting temperature out of any scientific calculation. Temperature is determined by the total energy dynamics of changing systems, with heterogeneity in complex systems. The forward direction of the SBC looks only at a definable surface, while the reverse of the SBC is influenced by the total dynamics. Yet the result of the radiative transfer equations is translated into the temperature of the near-surface atmosphere based upon a claimed reduction in emission at the top of the troposphere.
What this shows is that climatologists started at the end point of 1°C being the desired near-surface temperature increase upon doubling CO2 in the atmosphere, but correcting the math (SBC) shows 20 times more than they would have wanted for a result.
So, how real is the error in the Stefan-Boltzmann constant? Saying that a cold basement wall at 15°C is giving off 390 W/m² is totally preposterous. Physicists are not exactly saying otherwise; they are saying that it also absorbs that amount, so you do not notice anything. But that claim is absurd also, because biological processes, and other complexities such as ice melting, would be sensitive to the difference between emission and absorption, and skin cells would be fried by that much energy being absorbed, regardless of how or when it is re-emitted.
Before absorbed radiation can be re-emitted, it must first be converted into heat, which means molecules vibrating. Those vibrating molecules increase in their chemical reactivity as their temperature increases. Biological systems will not tolerate significant increases in temperature without being destroyed.
Biology is like a thermometer which can tell the difference between radiation absorption and emission. If a cold basement wall were emitting 390 W/m², you wouldn't be able to get near it without skin cells being rapidly heated and damaged.
It means there is no mysterious cancellation of the high absorption and emission indicated by the SBC, and it means climatologists started at a desired end point and contrived the method of getting there. It's the only thing they do in climatology, because the randomness and complexities of climate cannot be reduced to scientific analysis.
Another problem with such an analysis is that the SBC is not appropriate for the purpose. It relates to the surface of an opaque solid. But the temperature in question is the near-surface air temperature. A transparent gas radiates in a vastly different manner than the surface of an opaque solid.
The absurdity of the Stefan-Boltzmann constant shows up in an extreme way in the attempts to create an energy flow chart for the planet, described here.
I've seen an alternative explanation for deriving the 1°C primary effect. It uses the assumed temperature increase each month due to global warming and correlates it with satellite measurements of radiation escaping from the tropopause into space. There is circular logic involved: the assumed global warming must be measured to calculate the assumed global warming. Similarly, satellite measurements of radiation going into space are used to calculate the amount of radiation going into space.
If measurements are needed to determine the results, why do calculations at all? The only thing the calculations accomplish is to get rid of saturation; they cannot be used for anything else, and measurements must determine the results. Rationalizers use crude measurements to get high precision with their calculations. The more they mess with it, the more perfect it gets.
Besides the circular logic being used as a rationalization, the measurements are worthless. Measuring global warming with thermometers is a total farce, as only about 10% of the earth's surface is measured, with very few measurements at the poles or in impoverished countries, and none over the oceans. Satellite measurements showed almost no increase, until they were re-adjusted (in the wrong direction) to conform to thermometer measurements. And even then, thermometer measurements showed no significant results in the raw data, until earlier measurements were lowered, and later measurements were increased, to show an increase. Scientists cannot find out how or why the adjustments were made for measured temperatures. And then no increase has occurred since 1998.
Satellite measurements of radiation being emitted from the atmosphere are just as farcical. Absolute values are needed for the claimed forcing, while satellites can only determine relative change. For some purposes, relative values can be referenced to absolute values, such as by using weather balloons. But balloons only measure temperature; they cannot reference radiation going into space. Calibrating satellites for absolute values of radiation is impossible, because each wavelength has a different intensity, and the total aggregate requires evaluating each separately, which satellites cannot do. Crude measurements are nevertheless used in the rationalizations.
So a primary effect cannot be determined, yet it is needed to produce an amount of heating to be acted upon by secondary effects. Without a primary effect, there is no value to be entered into computer models for heat resulting from carbon dioxide. So a primary effect is contrived and said to be an unquestionable fact, because modeling cannot proceed without such a number to build upon.
This image shows how energy is re-distributed when radiation is absorbed by carbon dioxide.
This is a one-time absorption of radiation.
When a molecule of CO2 in the atmosphere absorbs fingerprint radiation (the only thing in question), it increases in vibratory motion, which is heat. As it bumps into surrounding molecules (mostly nitrogen gas), it imparts some of its motion, which reduces its own motion while increasing the motion of the other molecule. This bumping goes from molecule to molecule as the energy spreads through the atmosphere.
The vibrating motion of molecules sends out waves of infrared radiation. As the molecular motion decreases, the intensity of the radiation and its frequency get lower. Since the atmosphere already has heat in it, radiation is constantly being emitted. Absorption of fingerprint radiation just slightly increases the intensity. With slight changes in energy, a CO2 molecule probably has to bump nearby molecules a hundred times or more to give up half of its gained energy.
There is extreme variation in the amount of time required (the number of bumps involved) to release absorbed energy, because the intensity varies with the temperature of the emitting molecules, and the rate of release varies with the nature of the wobble induced in the absorbing molecules. With equal temperatures for emitting and absorbing molecules, as few as 5 bumps could release the absorbed energy. Five times 83 femtoseconds would be 415 femtoseconds to give up the energy of fingerprint radiation.
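The timing figures above can be tabulated directly. The 83-femtosecond collision interval is the number assumed in the text, not a measured constant, and the bump counts are the illustrative fast and slow cases.

```python
# Time for a CO2 molecule to shed absorbed fingerprint energy through
# collisions, at an assumed 83 femtoseconds per bump.

BUMP_INTERVAL_FS = 83  # femtoseconds between collisions (assumed figure)

def release_time_fs(bumps):
    """Total elapsed time, in femtoseconds, for a given number of bumps."""
    return bumps * BUMP_INTERVAL_FS

print(release_time_fs(5))    # fast case from the text: 415 fs
print(release_time_fs(100))  # slow case ("a hundred times or more"): 8300 fs
```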
This bumping and emitting is happening to every molecule in the air at all times, while the net result is equilibrium between the energy entering from the sun and the energy being emitted into space. The resulting equilibrium temperature is determined by the temperature of the entire emitting mass of the atmosphere and surface of the earth, not some minuscule amount of greenhouse gases.
The required 2,500°C temperature for CO2 to heat the whole atmosphere by 1°C average (one CO2 molecule per 2,500 air molecules) is a one-time addition. It only shows the nature of dilution. In actuality, the radiation flows create a dynamic system with energy flowing in and out constantly.
The one-time analysis does not indicate what the actual temperature would have to be for each CO2 molecule to heat the surrounding 2,500 molecules to 1°C average. If the heat loss were extremely fast, the temperature of each CO2 molecule would have to be close to 2,500°C. If the heat loss were extremely slow, the CO2 would be less than 2,500°C.
The heat loss is extremely fast, which means each CO2 molecule would have to be close to 2,500°C to create an average atmospheric temperature of 1°C.
The task would be like trying to heat a brick building by heating one brick. What temperature would that brick have to be to heat the whole building 1°C average? If the heat were disappearing very fast, the brick would have to be extremely hot. If the heat were disappearing slowly, the brick would not have to be so hot.
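The dilution arithmetic behind the 2,500°C figure is simple enough to verify. The 400 parts per million concentration is an assumed round number for current CO2 levels.

```python
# Dilution sketch: one CO2 molecule per ~2,500 air molecules means a
# one-time transfer would require each CO2 molecule to carry ~2,500 C
# to raise the surrounding air by 1 C on average.

CO2_PPM = 400                          # assumed round CO2 concentration
air_per_co2 = 1_000_000 / CO2_PPM      # 2,500 air molecules per CO2 molecule

target_rise_c = 1.0                    # desired average air temperature rise
required_co2_temp = air_per_co2 * target_rise_c   # 2,500 C, one-time dilution

print(air_per_co2, required_co2_temp)
```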
A Calculation for Continuous Dissipation
An analysis of continuous absorption and re-emission of radiation goes like this: Each time a CO2 molecule bumps a surrounding molecule, it loses half its energy through conduction, and it also loses energy through radiation. But the energy is constantly being replaced. Losing half of the energy while being replaced at the same rate would result in three fourths of the energy being retained. Reduction to three fourths of 2,500°C would be 1,875°C.
The CO2 molecule is also emitting radiation while absorbing radiation. If the rates were equal, another halving of the loss would be required to sustain the needed temperature, which results in 2,188°C being the required temperature for transferring enough heat to the surrounding 2,500 air molecules for a 1°C average air temperature increase. The remaining 312 units of heat as temperature increase would be distributed between the molecules which CO2 bumps into. It would bump into about five surrounding molecules, heating each one by 62°C; that is to get the required 1°C total over 2,500 molecules.
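The continuous-dissipation arithmetic can be followed step by step in code; the retained fractions are taken straight from the reasoning above, so this is a check of the arithmetic, not an independent physical model.

```python
# Continuous-dissipation arithmetic, starting from the 2,500 C
# one-time dilution requirement.

ONE_TIME_C = 2500.0

# Conduction loses half while energy is replaced at the same rate:
# three fourths of the energy is retained.
after_conduction = ONE_TIME_C * 3 / 4     # 1875.0 C

# Equal radiative emission halves the remaining loss:
# seven eighths retained, the sustained requirement.
sustained = ONE_TIME_C * 7 / 8            # 2187.5, rounded to 2188 C

# Heat handed off to neighbors, spread over ~5 bumped molecules.
handed_off = ONE_TIME_C - sustained       # 312.5 units
per_molecule = handed_off / 5             # 62.5 C each

print(after_conduction, sustained, handed_off, per_molecule)
```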
However, the absorbed radiation is fingerprint radiation, which is weaker than the emitted black body radiation. The fingerprint radiation which CO2 absorbs is 8% of black body radiation, which means emission is 12.5 times stronger than absorption.
But equilibrium would require emission to equal absorption. The higher tendency to emit than to absorb would drag down the temperature increase by CO2. Hypotheticals break down at this point, because the needed 2,500°C is a total absurdity to start with.
What would actually occur is that the CO2 would only be heated trillionths of a degree centigrade, and no greenhouse effect would occur. Why trillionths of a degree? Because radiation is extremely weak. Its energy is dissipated in femtoseconds. The energy cannot build up. This effect rides on top of normal temperatures, which are mostly produced through conduction, convection and evaporation.
If energy were coming from a warm earth and going into a cold atmosphere, more time would be required to re-emit the energy. But most of the radiation in the atmosphere moves less than ten meters, because saturation occurs within ten meters.
For the same reason temperature cannot be calculated, radiation cannot be calculated, because they inter-convert. One transforms into the other. Superficially, these effects are not being calculated; they are measured for small slices of the atmosphere and then added up. The problem with that line of reasoning is that the results are expressed as 3.7 W/m² upon doubling the amount of CO2 in the atmosphere. The atmosphere does not have square meters; it has cubic meters. Square meters are assumed to represent the amount of radiant energy which falls onto the surface of the earth. It means there is no accounting for the heating of the atmosphere due to absorption of radiation by CO2.
Where then do the watts per square meter come from? They are the difference between the amount of radiation assumed to go into space, based on the calculations of the radiative transfer equations, and the amount entering the earth from the sun. Who cares where or how that difference in radiation creates heat? It has to create heat someplace.
One of the problems is that the calculations are not direct enough to do a comparison between calculated radiation at the top of the atmosphere and the total energy entering from the sun. The differences are extreme and render all such analysis so absurd that any result has to be a predetermined contrivance.
The NASA energy budget claims 41% of the heat on the surface of the earth leaves as radiation. It's a preposterous guess. Only white hot metals give off 41% of their energy as radiation. Cooling fans would never be used if that much radiation were emitted from a cold and rough surface with wind blowing over it. The real number would be closer to 1-3% on land, very little from oceans. The Kiehl-Trenberth model shows 79%. That model is forced into a ridiculously high number, because the Stefan-Boltzmann constant was applied, and it is in error. There is about a 100% difference between these two official sources. How reliable can the radiative transfer equations be in picking some such starting point?
But the problem is even worse due to the fact that very little of the radiation in the atmosphere gets there by radiating from the surface of the earth. Most heat gets into the atmosphere through conduction, convection and evaporation from the surface, and quite a bit from solar energy. Kiehl-Trenberth says 29% of the solar energy enters the atmosphere rather than striking the earth's surface. The NASA model says 19%.
Much of the energy in the atmosphere is converted into radiation, as all matter emits radiation in proportion to its temperature. How fast the transformation of energy occurs is anyone's guess. In other words, the application of radiative transfer equations involves no ability to determine how much radiation is coming from where or going to where. And yet the equations are portrayed as being so precise in their latest rendition that they could determine that earlier calculations were only off by 15%.
Air has no ability to heat the oceans, because its heat capacity is too low.
One cubic meter of air can only produce the same temperature change in a layer of water 0.29 millimeters deep. No cooling of the air has been attributed to ocean heating.
The calculations are these, with rounding and approximations:
The density of air is 1.2 kilograms per cubic meter (kg/m³). Its specific heat is 1 kilojoule per kilogram per °C (kJ/kg/°C). Therefore, one cubic meter of air would release 1.2 kJ of heat to drop 1°C.
The density of water is 1000 kg/m³. Its specific heat is 4.18 kJ/kg/°C. For 1000 kg, that is 4180 kJ per °C. Dividing the 1.2 kJ of the air by the 4180 kJ of the water yields 0.00029 meters of water depth, which is 0.29 mm.
What depth would one expect the ocean to be heated by the claimed 0.2°C? The new Argo project measures down to 700 meters. At first it found slight cooling; so the lower measurements were thrown out to achieve no change. Somehow since then, an increase of 0.2°C is being claimed. If the depth of the heating is half of the measured depth, it would be 350 meters. The air would have to be cooled the same amount to a height of about 1,200 km. That is approximately 100 times the height of the normal atmosphere (troposphere), not considering pressure reduction. Nothing resembling it is happening.
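Both heat-capacity calculations can be checked with the rounded constants given above; small differences from the figures in the text come only from rounding.

```python
# Air-vs-water heat capacity arithmetic, using the rounded constants above.

AIR_DENSITY = 1.2           # kg/m^3
AIR_SPECIFIC_HEAT = 1.0     # kJ/(kg*C)
WATER_DENSITY = 1000.0      # kg/m^3
WATER_SPECIFIC_HEAT = 4.18  # kJ/(kg*C)

air_kj_per_c = AIR_DENSITY * AIR_SPECIFIC_HEAT        # 1.2 kJ per m^3 per C
water_kj_per_c = WATER_DENSITY * WATER_SPECIFIC_HEAT  # 4180 kJ per m^3 per C

# Depth of water one cubic meter of air can heat by the same amount.
depth_m = air_kj_per_c / water_kj_per_c   # ~0.00029 m, i.e. ~0.29 mm

# Height of air column that must cool 0.2 C to warm 350 m of ocean by 0.2 C.
heated_ocean_depth_m = 350
air_column_km = heated_ocean_depth_m * (water_kj_per_c / air_kj_per_c) / 1000
# roughly 1,200 km, i.e. ~100x the height of the troposphere

print(round(depth_m * 1000, 2), round(air_column_km))
```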
On top of that, the calculated near-surface atmospheric temperature increase is stated to be 1°C upon doubling the amount of CO2 in the air, presumably based upon the Stefan-Boltzmann constant, as indicated above. No reduction for heating the oceans is allowed for in that calculation.
These contradictions are too ridiculous to be called science, yet they are in our faces day in and day out, and they are the claimed reason for destroying the energy systems and increasing the cost of electricity by several hundred percent.
The number one panic over global warming is oceans rising, which could swamp Miami and the Maldives. (Who cared what happened to Detroit?) Since air cannot cause ocean temperatures to increase, much less cause ice to melt, whatever causes the oceans to rise, it won't be carbon dioxide. Oceans rise 400 feet (130 meters) between ice ages, as glaciers melt. Glaciers are almost as melted as they can get, which has reduced ocean rising to little or none at this time, beyond the same fakery/fraud that goes with everything else on this subject.