Sunday, June 6, 2010
Basic Climate Sensitivity = 0.15 C
The Earth's surface receives about 180 Watts/m2, of which about 60 Watts are returned
by radiation at a mean surface temperature of 15 C, and 120 Watts by convection coupled with evaporation/condensation. By Stefan-Boltzmann's radiation law, the 60 Watts radiated correspond to a temperature drop of about 15 C, in accordance with a radiation temperature of 0 C at the stratopause.
Suppose now that the radiative properties of the atmosphere are changed by 1%, which is the estimated effect of doubled CO2. This could require an extra 0.6 Watts to be radiated, which by Stefan-Boltzmann would correspond to an increase of surface temperature of 0.15 C.
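The arithmetic behind the 0.15 C figure can be checked in a few lines. This is only a sketch: the 60 W/m2 radiative flux, the 15 C drop, and the 1% perturbation are the post's figures, and the linearized Stefan-Boltzmann coefficient is evaluated at a 288 K (15 C) surface:

```python
# Back-of-the-envelope check of the 0.15 C figure (input values from the post).
sigma = 5.67e-8      # W/m2/K^4, Stefan-Boltzmann constant
T_surface = 288.0    # K, mean surface temperature of 15 C

# Linearized Stefan-Boltzmann: a change dQ in radiated flux corresponds to
# a temperature change dT = dQ / (4*sigma*T^3).
coeff = 4 * sigma * T_surface**3   # ~5.4 W/m2 per K at the surface

radiated = 60.0           # W/m2 radiated from the surface (post's figure)
extra = 0.01 * radiated   # 1% change from doubled CO2 -> 0.6 W/m2

# The post's proportionality: 60 W across a 15 C drop, i.e. 4 W per C,
# so 0.6 W corresponds to 0.15 C.
sensitivity = extra * 15.0 / radiated
print(coeff)        # ~5.4 W/m2/K
print(sensitivity)  # 0.15 C
```

Note as a rough consistency check that strict linearization at 288 K gives 60 / 5.4, about 11 C, of the same order as the 15 C drop used in the proportionality.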
This argument suggests a climate sensitivity of 0.15 C. With an even more simplistic argument based on Stefan-Boltzmann, the IPCC instead suggests 1 C, which is elevated by feedbacks
to an alarming 3 C. Starting instead from 0.15 C gives no reason for alarm.
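For comparison, the IPCC-style no-feedback figure of about 1 C can be reproduced the same way. Again a sketch: the 240 W/m2 outgoing longwave flux and the 3.7 W/m2 forcing for doubled CO2 are the standard textbook values, not stated in the post:

```python
# IPCC-style no-feedback sensitivity via the effective emission temperature.
sigma = 5.67e-8                  # W/m2/K^4, Stefan-Boltzmann constant
Q_out = 240.0                    # W/m2, mean outgoing longwave flux (standard figure)
T_eff = (Q_out / sigma) ** 0.25  # effective emission temperature, ~255 K

dF = 3.7                           # W/m2, canonical forcing for doubled CO2
dT = dF / (4 * sigma * T_eff**3)   # linearized Stefan-Boltzmann at T_eff
print(T_eff)  # ~255 K
print(dT)     # ~1 C
```

The contrast between the two estimates is thus whether the perturbation is applied to the 60 W radiative channel alone (giving 0.15 C) or to the full blackbody budget (giving about 1 C).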
Which argument do you think is more correct? Both are simplistic and require no more than common sense to evaluate.