According to Strahler and Strahler, "Modern Physical Geography," 1987, pg
60, 68% of solar (shortwave, i.e., light) energy that strikes the earth
per year is absorbed by the atmosphere (15%), clouds (3%) and the ground
(50%). (Ground includes water surfaces.) The atmosphere and ground both
act as heat storage devices.
In response to the absorption of the shortwave energy, the planet radiates
the same percentage (68%) of that energy back to space in longwave form
(heat). The direct radiation loss to space from the ground (which Adam is
talking about above) represents only 8 percentage points of that 68%. The
other 60 points of longwave radiation come directly from the atmosphere
itself!
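For reference, here is the arithmetic from those Strahler and Strahler
figures written out as a quick sanity check (the percentages are the
book's; the little Python script is just my bookkeeping):

# Strahler & Strahler (1987) annual energy budget, expressed as
# percentages of the incoming solar (shortwave) radiation.
shortwave_absorbed = {"atmosphere": 15, "clouds": 3, "ground": 50}
total_absorbed = sum(shortwave_absorbed.values())
print("Shortwave absorbed:", total_absorbed, "%")        # 68 %

# In balance, the same 68% must leave again as longwave (heat) radiation.
longwave_to_space = {"direct from ground": 8, "from atmosphere": 60}
total_emitted = sum(longwave_to_space.values())
print("Longwave emitted to space:", total_emitted, "%")  # 68 %

assert total_absorbed == total_emitted  # the budget balances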
> With all the vulcanism and cosmic debris
> raining down the atmosphere would be optically thick and very likely
> things would be a lot hotter than at present due to the resulting
> greenhouse effect.
Opacity to visible light does not necessarily imply opacity to longwave
radiation. But even if the atmosphere became opaque or translucent to
longwave radiation and absorbed all the longwave radiation from the
ground, the longwave energy absorbed by the atmosphere would only increase
from 90% to 98% of the total solar input energy (a relative increase of
only about 10%). The average temperature of the atmosphere might increase
by about 10%.
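Spelling out that step (the 90% and 98% figures come from my reading of
the budget above, not directly from the book):

# Hypothetical extreme case: the atmosphere captures the 8% of longwave
# that now escapes directly from the ground to space.
current_share = 90.0   # % of solar input handled by the atmosphere now
opaque_share = 98.0    # % if the direct ground-to-space loss were captured

relative_increase = (opaque_share - current_share) / current_share * 100
print(f"Relative increase: {relative_increase:.1f} %")   # about 8.9%, i.e. ~10%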
A catastrophe would introduce energy into the system, some of which would
be absorbed by the ground and some by the atmosphere. In the case of a
large asteroid explosion, some of the energy would escape directly into
space after blowing a huge hole out of the atmosphere. So far I have not
found useful information that would relate this additional energy to the
global energy balance, but I'm still looking.
Longwave opacity may also reduce the longwave radiation escaping from the
atmosphere itself, and so further increase the temperature of the
atmosphere. However, radiated power rises steeply with temperature, so the
increase in temperature would itself increase the amount of energy
available for radiation and thus partly offset the loss of radiating
ability.
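To illustrate what I mean by that offset, treat the atmosphere as a single
blackbody layer (a big simplification, and the temperatures below are only
illustrative):

# Blackbody radiated flux goes as T^4 (Stefan-Boltzmann), so even a modest
# warming increases the energy radiated away noticeably.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_flux(temp_kelvin):
    """Blackbody flux in W/m^2 at the given temperature."""
    return SIGMA * temp_kelvin ** 4

t_now, t_warmer = 255.0, 265.0   # illustrative effective temperatures, K
gain = radiated_flux(t_warmer) / radiated_flux(t_now) - 1
print(f"A 10 K rise increases radiated flux by about {gain * 100:.0f}%")  # ~17%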
> >You hear of how Noah and Family would be broiled alive in extreme
> >temperatures. But where is the calculations of corresponding heat loss
> >into space?
> The point is the output goes up, but only with the average temperature.
> To lose heat rapidly it has to be really, really hot!
Not so, because you are not considering the direct longwave radiation from
the atmosphere itself, which is where the major portion of heat loss to
space comes from.
As I said before, where are the calculations on heat loss to space? None
that I have seen consider radiation from the atmosphere itself.
> The impact/vulcanism energy input you're discussing would probably cause
> substantial amounts of evaporation and water is a super-greenhouse gas,
> not mention all the extra CO2 from lava, fires and oxidation of biomass.
> Plus methane from biomass decay and vulcanism...
>
> So much of the heat outflow would be impeded and then things would
> really heat up! And hot seawater doesn't hold on to CO2 that well
> either, so the process could accelerate into a final state much hotter
> than any life inside or outside of the Ark could stand.
You are right that water vapor and CO2 in the atmosphere absorb heat. But
unless they impede direct longwave radiation from the atmosphere, they
won't reduce the rate of heat loss by the atmosphere directly into space.
They would just increase the quantity of heat which the atmosphere could
store.
Another thing to consider is that any optical translucency of the
atmosphere would greatly reduce the amount of energy absorbed by the
ground. Normally, 50% of the incoming solar shortwave (light) radiation is
absorbed by the ground, another 32% is reflected by scattering, clouds,
and the ground, and 18% is absorbed by the atmosphere and clouds
(50 + 32 + 18 = 100%). Reflection would likely increase due to dust
particles in the high atmosphere, and atmospheric absorption is also
likely to increase because of the dust.
One might expect reflection to increase from 32% to perhaps 35 to 40%, and
atmospheric absorption to increase from 18% to perhaps 20 to 25%. That
would reduce the radiation absorbed by the ground from 50% down to roughly
45 to 35%, which would greatly reduce the amount of absorbed shortwave
energy converted to heat and radiated back into the atmosphere.
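Putting rough numbers on both the normal budget and those guesses (the
dusty-atmosphere figures are mine, picked from the ranges above):

# Shortwave budget: present-day values vs. guessed "dusty" loadings.
normal = {"reflected": 32, "atmosphere": 18, "ground": 50}
dusty_cases = {
    "mild dust":  {"reflected": 35, "atmosphere": 20},   # low end of guess
    "heavy dust": {"reflected": 40, "atmosphere": 25},   # high end of guess
}

for label, case in dusty_cases.items():
    ground = 100 - case["reflected"] - case["atmosphere"]
    print(f"{label}: ground absorbs {ground}% (vs. {normal['ground']}% now)")
# mild dust: ground absorbs 45% (vs. 50% now)
# heavy dust: ground absorbs 35% (vs. 50% now)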
Thus an increase in air temperature due to longwave opacity would be
offset by the loss of absorbed shortwave energy caused by dust in the
atmosphere. The net result would not be the runaway hothouse you have
described.
I am still researching how fast the system could adjust to changes.
Allen