Sunday 30 December 2012

Global Warming Denial vs Hysteria Industry Database

I am listed in the Desmogblog Extensive Database of Individuals Involved in the Global Warming Denial Industry, a database which also includes many serious climate scientists. A corresponding list of Individuals Involved in the Global Warming Hysteria Industry would be much shorter.

Negative Climate Sensitivity: Global Cooling 1

The preceding posts lead to the conclusion that the Earth (and Venus), including the atmosphere up to a pressure of 0.2 - 0.3 bar, has a TOA temperature at the tropopause equal to the bolometric temperature determined by the distance to the Sun, as a minimal temperature.

The surface temperature would then be determined by a lapse rate, observed to be 6.5 C/km, resulting from atmospheric thermodynamics driven by radiative forcing of the Earth surface, to be compared with the dry adiabatic lapse rate of g/c_p = 9.8 C/km, with g the gravitational acceleration and c_p the heat capacity of air at constant pressure.
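The g/c_p figure can be checked with a minimal Python sketch (standard values for g and c_p assumed, since the post does not give them):

```python
# Dry adiabatic lapse rate g/c_p, compared with the observed 6.5 C/km.
g = 9.81         # gravitational acceleration, m/s^2
c_p = 1004.0     # heat capacity of dry air at constant pressure, J/(kg K)

dry_adiabatic = g / c_p * 1000.0   # convert C/m to C/km
print(f"dry adiabatic lapse rate: {dry_adiabatic:.1f} C/km")   # 9.8 C/km
print("observed lapse rate:      6.5 C/km")
```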

The thermodynamics in the atmosphere would thus have the effect of reducing the dry adiabatic lapse rate, which represents a possible state without radiative forcing and thermodynamics, and thus the effect of reducing the surface temperature.

Doubling the atmospheric CO2 is estimated by IPCC to correspond to a radiative forcing of 2 - 4 W/m2, to be added to the 180 - 40 = 140 W/m2 effectively absorbed by the Earth surface, with 180 W/m2 incoming and 40 W/m2 directly outgoing through the atmospheric window. The effect on the surface temperature would then be determined by the lapse rate, with the bolometric temperature of the TOA at the tropopause unchanged because the distance to the Sun is unchanged.
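The budget and lapse-rate arithmetic can be sketched in Python; all numbers are taken from these posts (the tropopause figures of -18 C at 5 km appear elsewhere in the post series), so this is only a consistency check:

```python
# Effective surface absorption: 180 W/m2 in, 40 W/m2 out through the window.
absorbed = 180.0 - 40.0                          # = 140 W/m2

# Surface temperature from the tropopause temperature and the lapse rate.
toa_temp, toa_height, lapse = -18.0, 5.0, 6.5    # C, km, C/km
surface_temp = toa_temp + lapse * toa_height     # = 14.5 C, close to +15 C

print(absorbed, surface_temp)
```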

The effect of additional effective radiative forcing of the Earth surface would be more active thermodynamics which would tend to further reduce the lapse rate and thus the Earth surface temperature.

Climate sensitivity as the increase of the Earth surface temperature upon doubling of CO2, would thus be negative: More CO2 would tend to be cooling rather than warming, but the effect would probably be so small that it could not be observed. Climate sensitivity would thus seem to be non-positive and the risk of global warming would (very likely) be small (with a most likely value of 0).

(This insight is now quickly eating its way into the minds of both people and politicians and global warming hysteria is already history).

Compare with the climate sensitivity of +3 C by IPCC, which is obtained by a combination of (i) radiative forcing increasing the bolometric temperature, as if the Earth were moved closer to the Sun, and (ii) positive thermodynamic feedback, as if thermodynamics could be slowed down by additional forcing.

The IPCC view is presented by its Swedish representative Lennart Bengtsson with the following key argument:
  • .... the Earth energy balance can temporarily be changed by reduced radiation to outer space by increased concentration of greenhouse gases. 
We see here the idea of heating (less outgoing radiation) with necessarily a warming effect from greenhouse gases, which is the core of global warming propaganda. But LB eliminates the warming effect by stating that it is only temporary, and thus seems to say that the effect in the end is zero.

PS Notice that in the IPCC and LB greenhouse gas argument, the TOA would be put at 5 km at a bolometric temperature of - 18 C corresponding to 240 W/m2 outgoing radiation, and would then be shifted upwards to cooler levels under increased concentration of greenhouse gases and then eventually cause surface warming.  But there is no TOA other than the tropopause (as concerns thermodynamics), and shifting an artificial TOA up or down would lack physical meaning.

Saturday 29 December 2012

The Earth-Atmosphere System as Blackbody

Data suggests that the Earth including the troposphere can be viewed as a blackbody heated by 140 W/m2 absorbed by the Earth surface and emitting 140 W/m2 at a bolometric temperature of about -55 C  (according to Stefan-Boltzmann's Law) attained at the tropopause at 0.1 - 0.3 bar as its "outer boundary".

The 140 W/m2 absorbed by the Earth surface comes from 240 W/m2 absorbed by the Earth-atmosphere with an albedo of 0.3 out of a total of 340 W/m2 incoming from the Sun, minus 60 W/m2 absorbed and re-emitted by the atmosphere minus 40 W/m2 directly radiated from the Earth surface to outer space through the "atmospheric window".

Including the troposphere and the stratosphere in the Earth-atmosphere system makes the stratopause  at 0.001 bar the outer boundary with an observed temperature of 0 C with corresponding blackbody radiation of 320 W/m2, which is close to the total of 340 W/m2.
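The Stefan-Boltzmann arithmetic behind both blackbody views can be checked with a few lines of Python (sigma = 5.67e-8 W/m2K4 assumed; the quoted -55 C is evidently a rough figure, the formula giving about -50 C):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

# View 1: Earth + troposphere emitting 140 W/m2.
t_tropopause = (140.0 / SIGMA) ** 0.25 - 273.15
print(f"{t_tropopause:.0f} C")          # about -50 C, roughly the quoted -55 C

# View 2: stratopause at 0 C; the corresponding blackbody flux.
flux_stratopause = SIGMA * 273.15 ** 4
print(f"{flux_stratopause:.0f} W/m2")   # about 316 W/m2, close to the quoted 320
```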

The internal thermodynamics/radiation of the Earth-atmosphere system is very complex and difficult to model accurately, but it thus appears that it is possible to view the system as a single blackbody in two ways:
  1. Earth + troposphere 
  2. Earth + troposphere + stratosphere 
with an observed temperature of the "outer boundary" in reasonable correspondence in both cases with the bolometric temperature as the blackbody temperature of the same irradiance.

Viewing the Earth + troposphere as a blackbody with outer boundary or top of the atmosphere TOA temperature determined as bolometric temperature, makes the surface temperature depend on the lapse rate, and with the lapse rate mainly determined by thermodynamics, the surface temperature would be insensitive to small changes in the internal radiative composition of the atmosphere. In other words, there would be no detectable greenhouse effect.

For Venus the observed temperature at 0.25 bar is -20 C which is again equal to the bolometric temperature.

We thus find evidence that for both the Earth and Venus, including atmospheres up to about 0.3 bar, the TOA temperature is the bolometric temperature determined by distance to the Sun, and the surface temperature is determined by a lapse rate mainly set by thermodynamics.

Most Likely IPCC AR5 Very Likely Uncertain

The leaked IPCC AR5 states: 
  • Despite considerable advances in climate models and in understanding and quantifying climate feedbacks, the assessed literature still supports the conclusion from AR4 that climate sensitivity is likely in the range 2–4.5°C, and very likely above 1.5°C. The most likely value remains near 3°C. 
  • A few studies argued for low values of climate sensitivity, but almost all of them have received criticism in the literature.
  • Equilibrium climate sensitivity, transient climate response and climate feedbacks are useful concepts to characterize the response of a model to an external forcing perturbation. However, there are limitations to the concept of radiative forcing. 
  • Projections of climate change are uncertain, firstly because they are primarily dependent on scenarios of future anthropogenic and natural forcings, secondly because of incomplete understanding and inadequate models of the climate system and finally because of the existence of natural climate variability
  • The term climate projection tacitly implies these uncertainties and dependencies. 
  • Nevertheless, as greenhouse gas concentrations continue to rise, we expect to see future changes to the climate system that are greater than those already observed and attributed to human activities. 
  • It is possible to understand future climate change using models and to use models to quantify likely outcomes and uncertainties dependent on assumptions about future forcing scenarios. 
We read contradictory statements expressing that the most likely value of climate sensitivity is 3 C, while projections of climate change are uncertain. In short, IPCC AR5 tells the world: 
  • It is uncertain that climate sensitivity very likely is above 1.5 C and most likely is 3 C.
One may ask if it is in fact very uncertain or most uncertain that the value is very likely above 1.5 C or most likely is  3 C? 

In any case, it seems to be very likely that the terminology developed by IPCC most likely does not represent considerable advances in climate models and in understanding.

Recall that the IPCC climate sensitivity of 3 C is obtained from a starting guess of radiative sensitivity of 1 C from a guess of radiative forcing of 4 W/m2, combined with a freely invented thermodynamic feedback factor of 3, giving 3 C as inventive guesswork.

Leaked Climate Sensitivity of 0.3 C

The leaked Second Order Draft IPCC AR5 essentially repeats the AR4 estimate of a climate sensitivity of 3 C:
  • Equilibrium climate sensitivity is likely in the range 2°C–4.5°C, and very likely above 1.5°C. The most likely value is near 3°C. Equilibrium climate sensitivity greater than about 6°C–7°C is very unlikely. 
Let me here leak the following update of my previous 10 times smaller estimate of climate sensitivity, coming down to 0.3 C, based on the following argument using the standard numbers of
  • Earth surface temperature: + 15 C
  • top of the atmosphere TOA temperature: - 18 C at 5 km altitude
  • lapse rate: 6.5 C/km 
  • dry adiabatic lapse rate: 10 C/km
  • transported from surface to TOA by thermodynamics: 120 W/m2 
  • transported from surface to TOA by radiation: 60 W/m2.
Assuming that thermodynamics reduces the lapse rate from 10 to 6.5 C/km, thermodynamics would thus have the effect of decreasing the temperature increase from TOA to Earth surface by 5 x 3.5 ≈ 18 C, thus with a relative decrease of 18/120 = 0.15 Cm^2/W. 

The corresponding number for radiation would be an increase of 33/60 ≈ 0.55, here rounded to 0.5 Cm^2/W. 

The combined effect would thus be with a partition of 2/3 thermodynamics and 1/3 radiation:
  • 1/3 x 0.5 - 2/3 x 0.15  = 1/6 - 1/10 = 5/30 - 3/30 = 2/30 = 1/15 Cm^2/W.
An assumed radiative forcing of 4 W/m2 would thus lead to a warming of 4/15 ≈ 0.27 C, which is less than 0.3 C, thus a factor 10 smaller than IPCC's most likely value of 3 C. 
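The arithmetic above can be reproduced exactly with Python fractions, using the post's rounded inputs of 0.5 and 0.15 Cm^2/W:

```python
from fractions import Fraction

thermo = Fraction(3, 20)   # 18/120 = 0.15 Cm^2/W, thermodynamic decrease
rad = Fraction(1, 2)       # 33/60 rounded to 0.5 Cm^2/W, radiative increase

# Partition: 2/3 thermodynamics, 1/3 radiation.
combined = Fraction(1, 3) * rad - Fraction(2, 3) * thermo
print(combined)                      # 1/15 Cm^2/W

forcing = 4                          # W/m2, assumed forcing for doubled CO2
print(float(combined * forcing))     # about 0.27, i.e. less than 0.3 C
```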

An alarm of 3 C would thus be replaced by a harmless 0.3 C, which could never be noticed.  

Do you say that the above argument is simplistic? Yes, it is, but it may well be more realistic than the IPCC argument  leading to a climate sensitivity probably inflated by a factor 10.

In business a value inflated by a factor 10, would be viewed as fraudulent.    

Friday 28 December 2012

Blackbody as Universal Reference Thermometer

The analysis on Computational Blackbody Radiation identifies a blackbody by a radiation spectrum with all frequencies of the same brightness or light intensity,  thus sharing a common temperature (equidistribution), together with maximal high frequency cut-off (expressing Wien's displacement law).

This makes a blackbody into a universal thermometer which can be used to define the temperature of an arbitrary body with arbitrary absorption/emission characteristic, through radiative equilibrium with the blackbody by near-resonance.

This makes it possible to compare the temperatures of two different bodies with different absorption/emission spectra, and in particular infer radiative transfer of heat energy from hot to cold.

Viewing a blackbody as a reference thermometer may remove some of the mystery of the common notion of a blackbody as an empty cavity absorbing all incident radiation. This may help to understand that the equally mysterious greenhouse gas effect is non-existent.

Thursday 27 December 2012

Evidence of Zero Greenhouse Gas Effect 4

The above picture from Giant Planets of Our Solar System by P. Irwin, shows observed cloud structure and temperature variation through the atmospheres of the outer gaseous planets: Jupiter, Saturn, Uranus and Neptune. We find minimum temperature close to observed bolometric temperature as the temperature of a blackbody of the same irradiance:
  • Jupiter: 124 K (167 K)
  • Saturn: 95 K (138 K)
  • Uranus: 59 K (79 K)
  • Neptune: 59 K (70 K)
which we can identify as temperatures attained at a pressure of about 0.3 bar (with those at 1 bar in parenthesis).

We compare with a bolometric temperature of the Earth + atmosphere assuming an irradiance of 200 W/m2 = 240 W/m2 absorbed from the Sun (with albedo 0.3) minus 40 W/m2 radiated through the atmospheric window directly from the Earth surface, which equals (200/5.67)^0.25 x 10^2 ≈ 244 K. This is close to the temperature of the Earth atmosphere at 0.3 bar.

We thus find evidence that the bolometric temperature  of a planet including atmosphere, is close to the minimum temperature attained at a pressure of about 0.3 bar, which we may view to define the "outer boundary" of the planet plus atmosphere.
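A quick check of the Earth figure above (sigma = 5.67e-8 W/m2K4 assumed):

```python
SIGMA = 5.67e-8               # Stefan-Boltzmann constant, W/(m^2 K^4)
flux = 240.0 - 40.0           # absorbed minus atmospheric-window radiation, W/m2
temp = (flux / SIGMA) ** 0.25
print(f"{temp:.0f} K")        # about 244 K, the value quoted above up to rounding
```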

The internal temperature including the temperature of a planet surface would then be determined by the bolometric temperature and a lapse rate specified by thermodynamics, both being independent of the specific radiative properties of the atmosphere, thus supporting the idea of zero greenhouse gas effect. 

Wednesday 26 December 2012

Evidence of Zero Greenhouse Gas Effect 3

It is not clear (not even for an ideal blackbody like a cavity) from where in the interior of a radiating body the radiation passing through its boundary effectively originates: Somehow a coordinated wave motion or oscillation is formed inside the body, which is then transmitted through its boundary to the outside as radiation.

This may be similar to an idea somehow formed as a coordinated wave motion involving the whole brain, which is then transmitted to the exterior through the mouth.

Evidence of Zero Greenhouse Gas Effect 2

By universality of blackbody radiation the temperature of a planet orbiting the Sun, both assumed to be ideal blackbodies, would only depend on the distance to the Sun and not on the composition of the planet including its atmosphere. We would then expect the blackbody temperature to be the "surface" temperature of the planet including its atmosphere as the temperature of the "outermost layer" of the atmosphere, which would also be the smallest temperature assumed.

Data like that above suggests that this layer is defined by an atmospheric pressure of 200-400 millibar, which for a gaseous planet like Jupiter defines an "effective outer boundary" as a radiating body.

The solid surface temperature of a planet, like the Earth or Venus, would then be determined by a lapse rate set by thermodynamics, and would thus be independent of the radiative properties of the atmosphere including the concentration of CO2, thus contradicting the presence of any greenhouse gas effect, on both the Earth and Venus.

See also No Greenhouse Effect with the observation that the temperature of Venus is about 1.176 times that of the Earth at the same pressure level, which conforms with the difference of blackbody temperatures from different distances to the Sun.

PS The albedo would then be taken into account by reducing the effective blackbody temperature at a certain distance from the Sun, alternatively effectively increasing the distance to the Sun. For example, with albedo = 0, the effective Earth blackbody temperature would be 273 K, and with the observed  albedo = 0.4, it would be 240 K attained at a pressure of about 300 millibar.
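Both checks can be sketched numerically; the orbital distance of Venus (0.723 AU) is a standard value assumed here, not given in the post:

```python
# (i) Venus/Earth blackbody temperature ratio from T ~ d^(-1/2).
d_venus = 0.723                     # AU, assumed standard orbital distance
ratio = (1.0 / d_venus) ** 0.5
print(f"{ratio:.3f}")               # 1.176, the quoted ratio

# (ii) Albedo-adjusted Earth blackbody temperature, T ~ (1 - albedo)^(1/4).
t0, albedo = 273.0, 0.4             # K at albedo 0, and the albedo used above
print(f"{t0 * (1.0 - albedo) ** 0.25:.0f} K")   # about 240 K, as quoted
```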

Tuesday 25 December 2012

Evidence of Zero Greenhouse Gas Effect 1

The above plots of temperature vs pressure in planetary atmospheres indicate that the surface temperature of a planet with an atmosphere is determined by its blackbody temperature, attained as the temperature of the atmosphere at a pressure of about 0.3 bar at a certain altitude, and then moderated by a certain lapse rate (less than g/c_p, where g is the gravitational acceleration and c_p the specific heat of the atmosphere at constant pressure) to set the surface temperature. The blackbody temperature would then be the minimum temperature of the atmosphere, attained at a pressure of about 0.3 bar, and the planet surface temperature would be independent of the composition of the atmosphere, as long as the lapse rate is not subject to change.

The greenhouse effect, supposedly influencing the temperature of the Earth surface, as an effect depending on the radiative properties of the atmosphere, would thus be zero. In particular, with a given lapse rate the Earth surface temperature would be independent of the amount of CO2 in the atmosphere.

The rationale would be that a gas at a pressure of more than 0.3 bar radiates like a blackbody. Is this true?

Thursday 20 December 2012

The Arrow of Time and Another New Year

As the year 2012 is now coming to an end it is maybe again time to contemplate the irreversibility of the world and our lives in particular. A clock can be made to tick backwards, but not our lives. A clock is reversible but our life is irreversible. Why? Why is there an arrow of time pointing from past to future?

The standard answer you hear is that by the 2nd law of thermodynamics the entropy always increases, and increasing entropy is what defines the direction of time. If you say that you don't understand, then you are in good company: The clever mathematician John von Neumann pointed out that referring to entropy is a secure way of winning any argument, because nobody knows what entropy is. So, referring to increase of entropy as the definition of increase of time says nothing.

If you are interested in understanding irreversibility and why you get older as time passes, you may get wiser by browsing The Clock and the Arrow: A Brief Theory of Time, where a new approach is explored based on an idea of finite precision computation.

The basic idea is illustrated in the picture above showing two sawlike structures on top of each other, which can slide with respect to each other one way, but not the other.  Why?

Because what happens when pulling the top structure to the right with a certain force F, assuming the bottom structure is fixed, is that the top structure moves right while being lifted up, and the speed of motion is then determined by the power supplied and the weight of the top structure.

But motion to the left is impossible, because infinite power would be required to lift the top structure the saw-tooth depth in zero time.
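The power argument can be illustrated with a toy computation; the mass and tooth depth are my own hypothetical numbers:

```python
# Average power m*g*h/dt needed to lift the top structure one tooth depth h
# in time dt: it blows up as dt -> 0, so sliding left in zero time,
# which would require the lift instantaneously, is impossible.
m, g, h = 1.0, 9.81, 0.01    # kg, m/s^2, m (hypothetical values)

for dt in [1.0, 0.1, 0.01, 0.001]:
    print(f"dt = {dt:6}: power = {m * g * h / dt:8.1f} W")
```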

Why then are our lives like a saw gliding on top of another, always to the right, towards an inescapable end without possibility of return? This is what the Clock and Arrow book seeks to explain. Take a look if you feel you want to know.

Wednesday 19 December 2012

Francis Bacon Quotes on New Theory of Flight

Here are some quotes by Francis Bacon (1561-1626) relating to science in general and to the New Theory of Flight in particular:
  • If a man's wit be wandering, let him study the mathematics. 
  • A prudent question is one-half of wisdom.
  • Age appears to be best in four things; old wood best to burn, old wine to drink, old friends to trust, and old authors to read. 
  • Who questions much, shall learn much, and retain much. 
  • Nature, to be commanded, must be obeyed. 
  • Read not to contradict and confute, nor to believe and take for granted... but to weigh and consider. 
  • Truth emerges more readily from error than from confusion. 
  • A sudden bold and unexpected question doth many times surprise a man and lay him open.
  • Science is but an image of the truth.
  • Studies perfect nature and are perfected still by experience.
  • Truth is so hard to tell, it sometimes needs fiction to make it plausible. 
  • They are ill discoverers that think there is no land, when they can see nothing but sea. 
  • Silence is the virtue of fools. 
  • The subtlety of nature is greater many times over than the subtlety of the senses and understanding.
  • Truth is a good dog; but always beware of barking too close to the heels of an error, lest you get your brains kicked out. 
  • Write down the thoughts of the moment. Those that come unsought for are commonly the most valuable.
  • The genius, wit, and the spirit of a nation are discovered by their proverbs. 

Tornados and New Theory of Flight

The New Theory of Flight identifies rotational separation at the trailing edge as the physical mechanism allowing a wing to generate large lift at the price of small drag. Rotational separation is illustrated above left in the case of flow of air past a circular cylinder.

The swirling flow is similar to the swirling flow of a tornado, generated by rising hot air sucking air towards a low pressure center in a spiraling motion, illustrated above right.

NBCNEWS reports today on Man-made tornadoes could power the future:
  • Coiled up in a tornado is as much energy as an entire power plant. So a Canadian engineer has a plan to spin up his own twister and extract energy from its tethered tail.

Tuesday 18 December 2012

Separation in Slightly Viscous Flow

The New Theory of Flight presented on The Secret of Flight offers a new analysis of the central problem of separation in slightly viscous flow modeled by the Navier-Stokes equations with a slip boundary condition. This analysis is fundamentally different from the classical analysis by Prandtl of boundary layer separation with a no-slip boundary condition, which has dominated modern fluid mechanics.

Separation with slip is shown to take one of the following forms, illustrated in the oil film visualizations above:
  1. Rotational separation with opposing flows (top)
  2. Parallel flow separation (bottom)
The analysis is presented in the upcoming book The Secret of Flight in Chapters 24 - 30. Why not take a look!

Saturday 15 December 2012

New Theory vs Old Peer Review System

Our article New Theory of Flight, first rejected by AIAA Journal and now under review by Journal of Mathematical Fluid Mechanics (JMFM), raises an important question as concerns the peer review system with anonymous referees used in scientific publishing.

The negative referee reports from AIAA show that the fluid dynamics community will do whatever is needed to suppress the New Theory in defense of the Old Theory.

If JMFM were to choose referees from this community the New Theory would have to be rejected by JMFM. On the other hand, JMFM has already published our article Resolution of d'Alembert's Paradox underlying the New Theory by relying on expertise outside the fluid dynamics community, with basis in mathematics and computation.

But Wikipedia, being controlled by the fluid dynamics community, has blocked every reference to this published article in the Wikipedia article on d'Alembert's paradox, with details recorded on Wikipedia Inquisition, and so the publication in JMFM does not count. JMFM is outside, while Journal of Fluid Mechanics (JFM) and AIAA Journal are inside.

It is conceivable that JMFM could likewise publish the New Theory, by finding referees endorsing the New Theory. But these referees could not come from inside the fluid dynamics community and would thus lack scientific credibility. In short:
  • Referees from inside the fluid dynamics community will reject the New Theory, even if it is correct, because it challenges the Old Theory. 
  • Referees outside the fluid dynamics community cannot endorse the New Theory, even if it is correct, because they lack scientific credibility.   
This shows that the conventional referee system does not seem to be functional in this case of a New Theory challenging an established Old Theory: Either the New Theory will be rejected without good reason, or it will be published without good reason.

The only possibility in this case is to publish the article without the usual refereeing process and invite to an open discussion of the New Theory vs the Old Theory.

What seems to be needed is thus
  • A: open refereeing process after publishing with non-anonymous referees,
instead of
  • B: closed refereeing process before publishing with anonymous referees.
If there once, in the era of analog printing on paper, was a reason for preferring B over A, because what was once printed on paper remained printed on paper, this reason no longer holds in the digital era of electronic publishing, where rejection and erasing can be done at any time.

Functional science like functional society can only be maintained through open discussion by people with names and faces. Science and politics behind closed doors is against the basic principle of science and democracy of open critical inquiry.

In this light the censorship by KTH of my book Mathematical Simulation Technology recorded as KTH-gate, is troublesome for KTH and a nuisance for me.  

Monday 10 December 2012

Light-Matter Interaction: Nobel Prize in Physics 2012

The 2012 Nobel Prize in Physics, awarded to Serge Haroche and David Wineland, concerns experimental investigations of light-matter or radiation-matter interaction for small (quantum) systems: ions in a harmonic trap and photons in a cavity.

There is a connection to the study of blackbody radiation on Computational Blackbody Radiation based on a mathematical model of the form W + R = E of radiation-matter interaction, including the following components:
  • wave equation for matter (W)
  • small radiative damping: outgoing radiation from matter (R)
  • excitation from incoming radiation (E).
The study brings out a basic aspect of near-resonance resulting in matter waves out-of-phase with excitation waves leading to a balance between incoming and outgoing radiation in the form of Planck's radiation law.
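The near-resonance balance can be illustrated with a single damped oscillator, a crude analog of my own choosing (not the wave model itself): forcing cos(wt) with small damping gamma gives a steady state whose displacement lags the forcing by nearly 90 degrees, with mean input power exactly balancing the small "radiative" dissipation.

```python
import math

gamma, w0, w = 0.05, 1.0, 0.999    # small damping, near-resonant forcing

# Steady state u = A*cos(w*t - phase) of u'' + gamma*u' + w0^2*u = cos(w*t).
denom = math.hypot(w0**2 - w**2, gamma * w)
amp = 1.0 / denom
phase = math.atan2(gamma * w, w0**2 - w**2)

mean_input = 0.5 * amp * w * math.sin(phase)    # time average of F*u'
mean_out = 0.5 * gamma * (amp * w) ** 2         # time average of gamma*u'^2
print(f"phase lag: {math.degrees(phase):.0f} degrees")         # 88 degrees
print(f"input {mean_input:.4f} vs dissipated {mean_out:.4f}")  # equal
```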

Of particular interest is the following information suggesting a connection to near-resonance:
  • In most experiments performed by Haroche’s group, the atom and field have slightly different frequencies. 
The basic idea of Computational Blackbody Radiation is to replace statistical quantum mechanics by deterministic wave mechanics with finite precision high-frequency cut-off.  Change of spin may then be thought of as change of phase-shift.

It remains to be seen if this idea is productive. Statistical quantum mechanics is troubled with unsolved riddles and it is not clear (to me at least) that deterministic finite-precision wave mechanics cannot model small systems. 

Saturday 8 December 2012

New Theory of Flight and Practice

The New Theory of Flight revealing the Secret of Flight is based on 3d rotational separation at the trailing edge which allows the flow to leave the wing without the high pressure of potential flow destroying the lift created at the leading edge, as illustrated in the above left picture, in the case of a circular cylinder.

The right picture shows vortex generators on an upper wing surface with the objective of delaying stall, according to the new version P180 3xFJ33 of the P180 Avanti. We see a close connection between theory and practice.

New Theory for Wind Turbines

                                                 Transport of 50 meter wind turbine blade.

Wind turbines commonly use stall control to limit the power at higher wind speeds, as described in the 2001 thesis Flow Separation on Wind Turbine Blades by Gustave Paul Corten. The thesis starts with the following description of state-of-the-art:
  • Presently it is impossible to calculate the lift and drag characteristics of an airfoil accurately. Especially beyond the stall angle, the calculations can be off by tens of percents. For that reason airfoil characteristics still have to be determined in wind tunnels.
We show on The Secret of Flight based on New Theory of Flight that today it is possible to compute lift and drag characteristics of airfoils in the full range of angles of attack including stall and beyond to a tolerance of 5%, by solving the Navier-Stokes equations with slip boundary condition. This opens new possibilities in wind turbine design and control.

Thursday 6 December 2012

Liberation from the Prandtl Spell 2

Continuing the argument in the preceding post, we consider a (laminar) shear layer L between two regions of fluid with different velocity. If epsilon is the viscosity, the width of the shear layer then scales like square-root-of-epsilon = epsilon^0.5 and the velocity gradient like 1/epsilon^0.5, which gives a total dissipation
  • D = integral-over-L epsilon x |grad u|^2 ds ~ A x epsilon^0.5 
where A is the area of L. If the area is bounded, which is the case for a boundary layer of a bluff body, then D tends to zero with epsilon, which indicates a vanishing effect with vanishing viscosity, contrary to Prandtl's basic postulate.

On the other hand, the length of the rolls of 3d rotational separation stretching into the flow behind the body may increase with decreasing viscosity thus keeping D positive under vanishing viscosity. The effect of the rolls on the boundary of a bluff body and total dissipation, which determine drag and lift, would thus stay the same under vanishing viscosity, while the rolls after the body would get longer.

In the case of a turbulent shear layer, D ~ A x epsilon^0.2 according to experimental data, but the conclusion would be similar.
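The laminar scaling D ~ A x epsilon^0.5 can be checked numerically for a model shear profile u(y) = tanh(y/delta) with delta = epsilon^0.5 (my own choice of profile, for illustration):

```python
import math

def dissipation(eps, n=100000, span=20.0):
    """Integrate eps*(du/dy)^2 across a shear layer u = tanh(y/delta)."""
    delta = math.sqrt(eps)
    dy = 2 * span * delta / n           # midpoint rule over [-span*delta, span*delta]
    total = 0.0
    for i in range(n):
        y = -span * delta + (i + 0.5) * dy
        dudy = (1.0 / delta) / math.cosh(y / delta) ** 2   # d/dy tanh(y/delta)
        total += eps * dudy ** 2 * dy
    return total

for eps in [1e-2, 1e-4, 1e-6]:
    d = dissipation(eps)
    # D / eps^0.5 is constant (= 4/3 for this profile): D vanishes like eps^0.5.
    print(f"eps = {eps:.0e}: D/eps^0.5 = {d / math.sqrt(eps):.4f}")   # 1.3333
```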

See also The Secret of Flight.

Wednesday 5 December 2012

Liberation from the Prandtl Spell 1

The New Theory of Flight explaining for the first time the mathematics and physics of the flight of birds and airplanes is based on solving the Navier-Stokes equations with slip boundary conditions as a model for slightly viscous flow.

The Navier-Stokes equations with slip boundary conditions and vanishingly small viscosity are scale invariant, in the sense that a change of scale in space leaves the equations invariant (assuming a vanishingly small viscosity has no scale). We see this effect in large-scale features of bluff body flow reflecting the geometry of the body, which remain the same under mesh refinement (modulo gradient sharpening effects), such as the 3d rotational separation pattern of the flow around a circular cylinder shown above, which is also seen at the trailing edge of a wing and which lies behind the secret of flight.

It is the scale invariance of solutions to the Navier-Stokes equations with slip boundary condition and vanishing viscosity which makes solutions computable on meshes resolving only the large scale features of the flow, in what can be referred to as Large Eddy Simulation (LES), without the need of user-defined turbulence models.

This makes computational solution of slightly viscous flow possible by removing the Prandtl spell of having to resolve thin boundary layers from no-slip boundary conditions, which has paralyzed fluid mechanics for so long. Follow the story in the upcoming book The Secret of Flight.

In fact, a no-slip boundary condition is mathematically incompatible with vanishing viscosity, and Prandtl's insistence on using no-slip can be seen as a consequence of Prandtl's limitation as a mathematician, as witnessed by e.g. von Karman.

Saturday 1 December 2012

Incorrect Explanation of Magnus Effect

Above is the standard explanation of the Magnus effect, giving a backspin tennis ball an upward lift force, as potential flow (left) augmented with large scale circulation (middle) to give an unsymmetric flow with lift (right). The same type of argument is used in the circulation theory of Kutta-Zhukovsky, supposedly explaining the lift of a wing.

However, the above standard explanation of the Magnus effect is incorrect. The correct explanation is by unsymmetric separation as seen in the following picture of real flow:

and not by large scale circulation as in the above standard explanation. We see that in reality the incoming flow is not affected by the rotation of the ball, while in the standard explanation the incoming flow is changed from horizontal direction without circulation to a slanted upward direction with circulation.

We thus see that the standard circulation theory for the Magnus effect is unphysical, and so is Kutta-Zhukovsky circulation theory for the lift of a wing. For a correct theory see The Secret of Flight.