Atmospheric Scattering


In this chapter we will learn about atmospheric scattering. We recommend that you also read the lessons on volume rendering and subsurface scattering, which share almost the same concepts with this topic. In fact, atmospheric scattering can be seen as an extension of volume rendering.

Introduction

For centuries, the sky has surely been a subject of fascination to many artists who tried to depict its colors with as much accuracy as possible. Many generations of physicists and mathematicians before the 19th-20th century also probably got obsessed with trying to figure out what could cause the sky to appear orange at sunset and sunrise, and blue during the day. Historically, the discovery of atmospheric scattering is attributed to Rayleigh (also known as John Strutt), an English physicist and Nobel prize winner whose main body of work was produced in the late 19th century (Rayleigh succeeded Maxwell, who is known for his work on electromagnetism, at Cambridge University; a large portion of computer graphics research relates to Maxwell's work. Rayleigh also studied black bodies with Max Planck). Atmospheric scattering also has an effect on an object's appearance. This effect is known as aerial perspective and was first observed, studied and reproduced by Leonardo Da Vinci in his paintings (in fact, this technique can be found in paintings prior to Da Vinci's work). As the distance between an object and an observer increases, the colors of the object are replaced with the colors of the atmosphere (usually blue during the day and red-orange at sunrise and sunset). This is an important visual clue, as we humans are used to evaluating the distance of objects in a landscape by comparing how blue they appear in relation to each other. In a (digital) painting, aerial perspective can greatly help add depth to an image (the brain can more easily process that an object is further away than another), as illustrated by the following reproduction of a Da Vinci painting (Madonna of the Rocks).

In more recent history, that is, the history of computer graphics, Nishita wrote a seminal paper in 1993 entitled "Display of the Earth Taking into Account Atmospheric Scattering" in which he describes an algorithm to simulate the colors of the sky. Interestingly enough, this paper and the follow-up he wrote in 1996 on the same topic ("Display Method of the Sky Color Taking into Account Multiple Scattering") were not entirely about atmospheric simulation but more about realistic rendering of the Earth (including rendering of ocean surfaces, clouds and continents) seen from outer space.

Figure 1: top, a real photograph of the Earth from outer space; bottom, a simulation using Nishita's model. These images are taken from the paper "Display of the Earth Taking into Account Atmospheric Scattering" written by Nishita in 1993 (c) Siggraph.

This reminds us that in those years, the development of computer graphics techniques was driven by manufacturing industries needing accurate simulations of their products or 3D virtual training environments, rather than by the entertainment industries (films and games). The technique described by Nishita to simulate the sky hasn't changed much since his time. Most of the research made in this area has focused on implementing his algorithms on the GPU without any noticeable improvement to the simulation quality (the focus was more on speed and real-time simulation). Another sky model exists though, described by Preetham et al. in a paper published at Siggraph in 1999 ("A Practical Analytic Model for Daylight"). As its title suggests, this is an analytical model which provides an accurate simulation of the sky colors; however, it is more restrictive than the technique proposed by Nishita (for instance, the model only works for an observer located on the ground). It is also worth mentioning a paper published by Jensen et al. at Siggraph in 2001 on the simulation of a night sky ("A Physically-Based Night Sky Model"). This chapter will propose an implementation of the Nishita model. In the following chapters, we will study the other algorithms.

In the next paragraph, we will describe the various phenomena which, once put together, are responsible for the colors of the sky. We will then show how Nishita's algorithm simulates these phenomena. Once we have the model implemented in a C++ program, we will show that we can easily extend the technique to simulate aerial perspective, and that by varying the parameters of the model we can also create alien skies.

Atmospheric Model

An atmosphere is a layer of gases surrounding a planet, held in place by the planet's gravity. What defines an atmosphere is mainly its thickness and its composition (the different elements it is made of). The Earth's atmosphere is about 100 km thick (delimited by the Kármán line) and is made of oxygen, nitrogen, argon (a gas discovered by Rayleigh), carbon dioxide and water vapor. However, in the simulation we will be using a thickness of 60 km (we will only consider scattering in the first two layers of the Earth's atmosphere, the troposphere and the stratosphere). If you have read the lesson on volume rendering (we invite you to do so now before you continue if you haven't yet), you will know that photons are scattered when they collide with particles. When light travels through a volume (the atmosphere) in a certain direction, photons are deflected in other directions when they collide with particles. This phenomenon is called scattering. How frequently light is scattered by particles depends both on the particles' properties (mainly their size) and on their density in the volume (how many particles/molecules are in one unit of volume). We know the size of the molecules the Earth's atmosphere is made of, and we also know their concentration/density; we will use these scientific measurements for our atmospheric scattering simulation. One more important aspect of the atmosphere surrounding the planet is that the density of these particles decreases with height. In other words, we find many more molecules per cubic unit of air at sea level than 2 km above the sea. This is an important fact, as the amount of scattering depends on the molecular density, as we just mentioned. As light travels through the atmosphere from the upper layer to the ground, we will need to sample the atmosphere density along the ray at regular intervals and compute the amount of scattering based on the atmosphere density at these sample positions (we perform a numerical integration, which we explain in detail further down). Most papers (including Nishita's) make the assumption that the atmosphere density decreases exponentially with height. In other words, it can be modelled with a simple equation of the form:

$$density(h) = density(0)\, e^{-\frac{h}{H}}$$

where density(0) is the air density at sea level, h is the current height (altitude) at which we measure the atmosphere density, and H is the thickness the atmosphere would have if its density were uniform (in scientific papers, H is called the scale height; it relates to the altitude over which the density of the atmosphere decreases by a factor of e, and it depends on temperature). Some papers claim that this model is a good approximation; however, some others claim it to be incorrect and prefer to model the atmosphere density as a series of concentric layers, each determined by its thickness and density (XX please quote paper here XX).
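In code, this falloff is a one-liner. Here is a minimal sketch (the function name is ours, and the sea-level density is normalized to 1):

#include <cmath>

// Normalized exponential density falloff: density(h) = density(0) * exp(-h/H),
// with density(0) = 1. The altitude h and the scale height H are in meters.
template<typename T>
T atmosphereDensity(T h, T scaleHeight)
{
    return exp(-h / scaleHeight);
}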

Modelling the sky densities as a series of layers (called spherical shells in Nishita's paper) also presents an advantage from a computational point of view. As we will perform a numerical integration along the path of the light towards the viewer, samples taken at positions in the atmosphere where the density is high matter more than samples taken where the density is low. Performing a "constant step" integration along the ray would therefore result in poor convergence. Nowadays computers are so fast that we can brute-force this computation by taking regular steps, but in Nishita's time optimising the algorithm was necessary. He therefore proposed a model where the atmosphere is represented as a series of spherical shells, separated by small intervals at low altitude and long intervals at high altitude. Note however, in the diagram that was published in the paper, that the thickness of the layers very much seems to follow an exponential variation. This is not so surprising, as the radius of each shell is precisely determined in his model by the density distribution of air molecules (and the formula he uses to compute this density is very similar to the one we have written above).

Figure 2: aerosols present in the atmosphere are responsible for haze.

The final point to understand to complete our atmospheric model is that the atmosphere is made of small particles (air molecules) which, at low altitude, are also mixed with bigger particles called aerosols. These particles can be anything from dust or sand raised by the wind to particles resulting from air pollution. They certainly have an impact on the atmosphere's appearance, especially as they don't scatter light the same way air molecules do. The scattering of light by air molecules is called Rayleigh scattering, and the scattering of light by aerosols is called Mie scattering. In short, Rayleigh scattering (the scattering of light by air molecules) is responsible for the blue color of the sky (and its red-orange colors at sunrise and sunset), while Mie scattering (the scattering of light by aerosols) is usually responsible for the white-grey haze that you typically see above polluted cities (haze obscures the clarity of the sky, see figure 2).

Figure 3: a simple graphical representation of the atmospheric model we will be using in this lesson. It is defined by the radius of the planet (Re) and the radius of the atmosphere (Ra). The atmosphere is made of aerosols (mainly present at low altitude) and air molecules. Density of particles decreases exponentially with altitude. This diagram is not to scale.

The model would be incomplete without mentioning that the user will be required to specify the radius of the planet and the radius of the atmosphere. These numbers will be used to compute the altitude of the sample positions taken along the view and light rays. For Earth, we will be using Re = 6360 km (e for Earth) and Ra = 6420 km (a for atmosphere). All distances and parameters of our model which relate to distance should be expressed in the same unit system (either km, m, etc.).

For atmospheric models, this can be tricky as planets are large. For such dimensions, the kilometer is usually a more suitable choice. However, scattering coefficients are very small and are more easily expressed in terms of meters or millimeters (even nanometers for light wavelengths). Choosing a good unit is also driven by floating-point precision limitations (if the numbers are too big or too small, the floating-point representation of these numbers by the computer can become inaccurate, which would result in calculation errors). Input parameters to the model can be expressed in different units and internally converted by the program.

Figure 4: the sun is so far away from the Earth, that each light ray reaching the Earth's atmosphere can be considered as being parallel to each other.

Finally, we will finish the description of our model by noting that the main source of illumination of our sky is the sun. The sun is so far away from the Earth that we can assume that all the light rays reaching the atmosphere are parallel to each other. We will show further down how this observation can simplify the implementation of our model.

Rayleigh Scattering

The scattering of light by air molecules was discovered by Rayleigh in the late 19th century. He showed in particular that this phenomenon has a strong wavelength dependency, and more precisely that air molecules scatter blue light more than green and red light. This form of scattering, and the equation he came up with to compute the scattering coefficients for volumes made of molecules such as the ones we find in the atmosphere, only applies to particles whose sizes are much smaller than the wavelengths of visible light (the particle should be smaller than roughly a tenth of the scattered wavelength). Wavelengths of visible light vary from 380 to 780 nm; 440, 550 and 680 nm are considered the peaks for blue, green and red light, and we will be using these values for the rest of this lesson. The Rayleigh scattering equation provides the scattering coefficients of a volume for which we know the molecular density. In the literature on atmospheric scattering, the scattering coefficients are denoted by the Greek letter β (beta):

$$\beta_R^S(h, \lambda) = \frac{8\pi^3 (n^2 - 1)^2}{3 N \lambda^4} e^{-\frac{h}{H_R}}$$

The superscript S stands for scattering and the subscript R stands for Rayleigh (to differentiate these coefficients from the Mie scattering coefficients). In this equation, h is the altitude, λ (lambda) is the wavelength, N is the molecular density at sea level, n is the index of refraction of air, and H_R is the scale height. In our simulation we will be using H_R = 8 km (search for "scale height" on the web for more accurate measurements if needed). As you can see from this equation, light with a short wavelength (such as blue light) gives a higher value for β than light with a long wavelength (red), which explains why the sky appears blue during the day: as light from the sun travels through the atmosphere, more blue light is scattered towards the observer than green and red light. Why does the sky appear red-orange at sunrise and sunset then? In that particular configuration, the light from the sun has to travel a much longer portion of the atmosphere to reach the observer's position than it does when the sun is above your head (the zenith position). That distance being considerably longer, most of the blue light has been scattered away before it can reach the observer's location, while some of the red light, which is not scattered as often as the blue light, still remains. Hence the red-orange appearance of the sky at sunrise and sunset (which can sometimes also be purple or slightly green).
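As a quick sanity check of the 1/λ⁴ dependency, compare the coefficients for blue (440 nm) and red (680 nm) light, all other factors being equal:

$$\frac{\beta_R^S(440)}{\beta_R^S(680)} = \left(\frac{680}{440}\right)^4 \approx 5.7$$

Blue light is thus scattered roughly 5.7 times more strongly than red light, which is consistent with the measured sea-level coefficients given in the next paragraph.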

Figure 5: when the sun is at the zenith, sunlight travels a short distance before it reaches the eye. At sunset (or sunrise), sunlight travels a much longer distance before it reaches the observer's eyes. In the first case, blue light is scattered towards the eye and the sky appears blue. In the second case, most of the blue light has been scattered away before it has a chance to reach the eye; mainly red-orange light reaches the observer's position, which explains why the sky looks red-orange when the sun is at the horizon.

We could use measurements for N, n, etc. in order to compute a value for β, but we will instead use precomputed values corresponding to the scattering coefficients of the sky at sea level: 33.1e-6 m^-1, 13.5e-6 m^-1 and 5.8e-6 m^-1 for wavelengths 440, 550 and 680 nm respectively (note how the scattering coefficient for blue light is greater than the coefficients for green and red light). By just applying the exponential part of the equation (the rightmost term), we can adjust these coefficients for any given altitude h.
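As an illustration, here is a hedged sketch of this altitude adjustment (the Vec3 type matches the code given later in this chapter and is assumed to support multiplication by a scalar; the function name and the blue-green-red ordering of the coefficients are our own choices):

#include <cmath>

// Sea-level Rayleigh scattering coefficients in m^-1, stored in the order
// given above (440, 550, 680 nm), rescaled by exp(-h/Hr) for an altitude h
// expressed in meters
template<typename T>
Vec3<T> rayleighCoefficients(T h, T Hr = 7994)
{
    const Vec3<T> betaSeaLevel(33.1e-6, 13.5e-6, 5.8e-6);
    return betaSeaLevel * exp(-h / Hr);
}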

One of the main problems with Nishita's papers and the other papers we mention in this lesson is that they often don't give the values that were used for N, n, etc. to produce the images shown in the papers, and when measurements are used, the authors don't usually cite their sources. Finding scientific data for molecular density at sea level is not obvious, and if you have any useful link or information on this topic that you could provide us with, we would like to hear from you. The sea-level scattering coefficients that we are using in this lesson can be found in two papers: "Efficient Rendering of Atmospheric Phenomena" by Riley et al. and "Precomputed Atmospheric Scattering" by Bruneton et al.

We won't go into too much detail about the density of air, but interesting information can be found on the internet on this matter. In this lesson, all we really need are average scattering coefficients at sea level that work well for recreating the colors of the sky, but learning how to compute these coefficients will be of interest to any curious reader who wishes to have more control over the atmospheric model.

If you have read the lesson on Volume Rendering, you will know that the physical quantities used to render participating media are the scattering coefficients, the absorption coefficients and the phase function, which describes how much and in which directions light is scattered when it collides with particles. In this lesson so far, we have only given values (and explained how to compute them) for the scattering coefficients of the Earth's atmosphere. For atmospheric scattering, it is generally accepted that absorption is negligible; in other words, we will assume that the atmosphere doesn't absorb light. Remember from the lesson on Volume Rendering that the extinction coefficient needed in the physical model used to render a participating medium is the sum of the absorption and scattering coefficients; therefore (since the absorption coefficient is zero) we can write for the extinction coefficient:

$$\beta_R^E = \beta_R^S$$

The Rayleigh phase function looks like this:

$$P_R(\mu) = \frac{3}{16\pi}(1 + \mu^2)$$

where μ is the cosine of the angle between the light and the view directions (see figure 8).

Mie Scattering

Mie scattering is similar to Rayleigh scattering in a way, but applies to particles whose size is greater than the scattered wavelength. This is the case for the aerosols found at low altitudes in the Earth's atmosphere. If we were to apply the Rayleigh equation to aerosols, we would not get a convincing image. The Mie equation for the scattering coefficients looks like this:

$$\beta_M^S(h, \lambda) = \beta_M^S(0, \lambda)\, e^{-\frac{h}{H_M}}$$

where the subscript M stands for Mie. Note that there is a specific scale height value (H_M) for Mie scattering, which is usually set to 1.2 km. Contrary to Rayleigh scattering, we don't provide an equation to compute the Mie scattering coefficients; we will use values from measurements made at sea level instead. For Mie scattering we will use βS = 210e-5 m^-1. Aerosols too have their density decreasing exponentially with altitude, and as for Rayleigh scattering, we simulate this effect by including an exponential term on the right side of the equation (modulated by the scale height factor H_M). The Mie extinction coefficient is about 1.1 times the value of its scattering coefficient. And the Mie phase function equation is:

$$P_M(\mu) = \frac{3}{8\pi} \frac{(1 - g^2)(1 + \mu^2)}{(2 + g^2)(1 + g^2 - 2g\mu)^{\frac{3}{2}}}$$

The Mie phase function includes a term g (which the Rayleigh phase function doesn't have) that controls the anisotropy of the medium. Aerosols exhibit a strong forward directivity. Some papers use g = 0.76, which we will also be using as our default value.
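Translated directly into code, the two phase functions might look like the following sketch (mu is the cosine of the angle between the view and light directions, as defined above):

#include <cmath>

// Rayleigh phase function: P(mu) = 3/(16*pi) * (1 + mu^2)
template<typename T>
T phaseRayleigh(T mu)
{
    return 3 / (16 * M_PI) * (1 + mu * mu);
}

// Mie phase function; g is the mean cosine (0.76 gives a strong forward lobe)
template<typename T>
T phaseMie(T mu, T g = 0.76)
{
    return 3 / (8 * M_PI) * ((1 - g * g) * (1 + mu * mu)) /
           ((2 + g * g) * pow(1 + g * g - 2 * g * mu, T(1.5)));
}

The same two expressions appear inlined in the computeIncidentLight() function given further down.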

The Concept of Optical Depth

Before we look at a practical implementation of the algorithm in C++, let's put all the pieces of the puzzle together. First, we should note that the sky is nothing more than a spherically shaped volume surrounding a solid sphere. Because it is nothing more than a volume, to render the sky we can use the ray-marching algorithm, which is the most common technique for rendering participating media and which we studied in detail in the lesson on Volume Rendering. When we render a volume, the observer (or the camera) can either be inside the volume or outside.

Figure 6: the camera can either be inside the atmosphere or outside. When it is inside, we are only interested in the intersection point between the camera ray and the atmosphere. When the camera is outside, the ray can intersect the atmosphere at two points.

When the camera is inside, we need to find the point where the viewing ray exits the volume. When it is outside, we need to find the points where the viewing ray enters and exits the volume. Because the sky is a sphere, we will use a ray-sphere intersection routine to analytically compute these points. We have presented a couple of techniques to compute the intersection of a ray and a sphere in the basic section (Ray-Quadratic Shapes Intersection). We know whether the camera is inside or outside the atmosphere by testing its altitude using the ground as a reference. If this altitude is greater than the atmosphere thickness, then the camera is outside the atmosphere and the viewing ray may intersect the atmosphere at two points.
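The intersect() routine called by the code further down is not reproduced in this chapter. A possible implementation, assuming a sphere centred at the world origin (which is how the planet and the atmosphere are set up here) and a Ray<T> type exposing origin and direction members, could look like this sketch (see the Ray-Quadratic Shapes Intersection lesson for a more numerically robust version):

#include <cmath>

// Intersect a ray with a sphere of the given radius centred at the origin.
// Returns false when there is no intersection; otherwise t0 <= t1 are the
// signed distances to the two intersection points along the ray.
template<typename T>
bool intersect(const Ray<T> &r, T radius, T &t0, T &t1)
{
    // solve |origin + t * direction|^2 = radius^2, a quadratic in t
    T a = r.direction.dot(r.direction);
    T b = 2 * r.direction.dot(r.origin);
    T c = r.origin.dot(r.origin) - radius * radius;
    T discr = b * b - 4 * a * c;
    if (discr < 0) return false;
    T sqrtDiscr = sqrt(discr);
    t0 = (-b - sqrtDiscr) / (2 * a);
    t1 = (-b + sqrtDiscr) / (2 * a);
    return true;
}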

For now, we will assume that the camera is on the ground (or one meter above the ground), but the algorithm will work for any arbitrary camera position (we will render an image of the sky from outer space at the end of this lesson). Let's assume that we have a viewing ray (corresponding to one pixel in the frame) and that we know where this ray intersects the upper limit of the atmosphere. The problem we need to solve at this point is to find out how much light travels along this ray in the direction of the viewer. As we can see in the following figure, it is unlikely that our camera or observer will be looking directly at the sun (which is dangerous for your eyes).

Figure 7: single scattering is responsible for the sky color. A viewer rarely looks directly at the sun (which is dangerous). However, when we look away from the sun, the atmosphere has a color which is the result of blue sunlight being scattered in the direction of the observer's eyes.

Logically, if you look in the direction of the sky, you shouldn't see anything, because you are looking at outer space, which is empty (the space between celestial bodies). However, the fact is that you see a blue color. This is caused by light from the sun entering the atmosphere and being deflected by air molecules towards the viewing direction. In other words, there is no direct light from the sun traveling along the viewing direction (unless the viewing direction points directly at the sun), but as the light coming from the sun is scattered by the atmosphere, some of that light ends up traveling in the direction of your eye. This phenomenon is called single scattering (which is fully explained in the lesson on Volume Rendering).

Figure 8: to compute the sky color we first trace a ray from Pc to Pa, and then integrate the amount of light coming from the sun (with direction L) that is scattered along that ray towards the viewer (V).

What do we know so far? We know that to render the sky color for one pixel in the frame, we first cast a viewing ray (V) from the point Pc (the camera position) in the direction of interest. We compute the point Pa where this ray intersects the atmosphere. Finally, we need to compute the amount of light traveling along this ray due to single scattering. This value can be computed with the volume rendering equation, which we have studied in the lesson on Volume Rendering:

$$L(P_c, P_a) = \int_{P_c}^{P_a} T(P_c, X)\, L_{sun}(X)\, ds$$

where T is the transmittance between point Pc and X (the sample position along the viewing direction) and Lsun is the amount of light scattered in the volume at X. All it says is that the total amount of light reaching the observer at Pc is equal to all the sunlight being scattered along the viewing direction V. This quantity can be obtained by summing up the light at various positions (X) along V; this technique is called numerical integration. The more samples we take along the ray, the better the result, but the longer it takes to compute. The transmittance term T in the equation accounts for the fact that light scattered along V towards the viewer at each sample position X is also attenuated as it travels from X to Pc.

Recall from the lesson on volume rendering that light is attenuated as it travels from one point to another in the volume due to absorption and out-scattering. Let's call the amount of light we have at Pb, Lb, and the amount of light arriving at Pa from Pb, La. In the presence of a participating medium, La (the light received from Pb) will be lower than Lb. Transmittance represents the ratio of La over Lb, and T is therefore in the range zero to one. In equation form, transmittance looks like this:

$$T(P_a, P_b) = \frac{L_a}{L_b} = \exp\left(-\int_{P_a}^{P_b} \beta_e(h)\, ds\right)$$

In plain English, this equation means that we need to measure the extinction coefficient βe of the atmosphere at various sample positions along the ray path Pa-Pb, sum these values up, multiply them by the segment length ds (that is, the distance Pa-Pb divided by the number of samples used), take the negative of this value and feed it to an exponential function.

Figure 9: as light (Lb) travels from Pb to Pa, it gets attenuated because of out-scattering and absorption. The result is La. Transmittance is the amount of light received at Pa from Pb, after it was attenuated while traveling through the atmosphere (that is T=La/Lb).

Going back to what we said about Rayleigh scattering, you will remember that βs (from which we compute βe) can be computed using the scattering coefficients at sea level, modulated by the exponential of the altitude h over the scale height H:

$$\beta_s(h) = \beta_s(0) \exp\left(-\frac{h}{H}\right)$$

For Rayleigh scattering, absorption can be ignored and we can write:

$$\beta_e(h) = \beta_s(h) + \beta_a(h) = \beta_s(h) + 0 = \beta_s(h)$$

You can move βe(0) out of the integral (because it is a constant) and rewrite the transmittance equation as:

$$T(P_a, P_b) = \exp\left(-\beta_e(0) \int_{P_a}^{P_b} \exp\left(-\frac{h}{H}\right) ds\right)$$

The sum inside the exponential function can be seen as the average density value between Pa and Pb, and the equation itself is usually referred to in the literature as the optical depth of the atmosphere between the points Pa and Pb.
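To make the numerical integration concrete, here is a small sketch that estimates this optical depth between two points and turns it into a transmittance, for a single wavelength (the helper name is ours; in the full program further down the coefficients are RGB vectors and this computation is inlined rather than isolated in a function; Vec3 is assumed to support addition, subtraction, scalar multiplication, magnitude() and normalize()):

#include <cmath>

// Transmittance between Pa and Pb: T = exp(-betaE0 * sum of exp(-h/H) * ds),
// where h is the altitude of each sample above sea level
template<typename T>
T transmittance(Vec3<T> Pa, Vec3<T> Pb, T betaE0, T H, T radiusEarth,
                unsigned numSamples = 16)
{
    Vec3<T> dir = Pb - Pa;
    T length = dir.magnitude();
    dir.normalize();
    T ds = length / numSamples, opticalDepth = 0;
    for (unsigned i = 0; i < numSamples; ++i) {
        Vec3<T> X = Pa + dir * ((i + T(0.5)) * ds);  // middle of the i-th segment
        T h = X.magnitude() - radiusEarth;           // altitude of the sample
        opticalDepth += exp(-h / H) * ds;
    }
    return exp(-betaE0 * opticalDepth);
}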

Adding the Sunlight

Finally, we will use the rendering equation presented in the previous paragraph to compute the sky color. Let's write it again:

$$L(P_c, P_a) = \int_{P_c}^{P_a} T(P_c, X)\, L_{sun}(X)\, ds$$

We have explained how to compute the transmittance. Let's now have a look at the term Lsun(X) and explain how to evaluate it. Lsun(X) corresponds to the amount of sunlight scattered at the sample position X along the viewing direction. To evaluate it, we first need to compute the amount of light arriving at X. Taking the sunlight intensity directly is not enough: if light enters the atmosphere at Ps, it is also attenuated as it travels to X. We therefore need to compute this attenuation using an equation similar to the one we have been using to compute the attenuation of light traveling along the viewing ray from Pa to Pc (equation 1):

$$L_{sun}(X) = SunIntensity \cdot T(X, P_s)$$

Equation 1

Technically, for each sample position X along the viewing ray, we cast a ray in the sunlight direction (L) and find where this light ray intersects the atmosphere (Ps). We then use the same numerical integration technique we used for the viewing ray to evaluate the transmittance (or optical depth) term in equation 1. The light ray is cut into segments and the density of the atmosphere is evaluated at the centre of each light segment.

One of the last elements in building an algorithm to compute the sky color is to account for how much light is scattered in the viewing direction, based on the light direction itself and the view direction. This is the role of the phase function. Rayleigh and Mie scattering have their own phase functions, which we gave further up. If you need a refresher on what the phase function is, read the lesson on Volume Rendering. In short, the phase function describes how much light coming from direction L is scattered in direction V. The Mie phase function contains an extra term g, called the mean cosine (among many other possible names), which defines whether light is mainly scattered along the forward direction (L) or the backward direction (-L). For forward scattering, g is in the range (0:1), and for backward scattering, g is in the range (-1:0). When g equals zero, light is equally scattered in all directions and we say that the scattering is isotropic. For Mie scattering we will set g to 0.76.

$$L_{sun}(X) = SunIntensity \cdot T(X, P_s) \cdot P(V, L)$$

Finally, we need to account for the fact that only some of the light is scattered along the view direction (for Rayleigh scattering, mainly blue light). To reflect this, we multiply the result of the previous equation by the scattering coefficients βs, which gives the full equation for the light part. Recall, however, that βs changes with altitude (its value is a function of height), where h is the height of X in relation to the ground (sea level, more precisely):

$$L_{sun}(X) = SunIntensity \cdot T(X, P_s) \cdot P(V, L) \cdot \beta_S(h)$$

For the Earth's atmosphere, the sun is the only light source in the sky. However, to write a generic form of the previous equation, we should account for the fact that light might come from many directions. In scientific papers you will usually see this equation written as an integral over 4π, the sphere of incoming directions:

$$L(X) = \int_{4\pi} LightIntensity \cdot T(X, P_s) \cdot P(V, L) \cdot \beta_S(h)\, d\omega$$

This more generic equation would also account for sunlight reflected by the ground (multiple scattering), which we will ignore in this chapter.

Computing the Sky Color

Putting all the elements together we have (equation 2):

$$SkyColor(P_c, P_a) = \int_{P_c}^{P_a} T(P_c, X)\, L_{sun}(X)\, ds$$

Equation 2

with (equation 3):

$$L_{sun}(X) = SunIntensity \cdot T(X, P_s) \cdot P(V, L) \cdot \beta_S(h)$$

Equation 3

If we substitute equation 3 into equation 2, we get:

$$SkyColor(P_c, P_a) = \int_{P_c}^{P_a} T(P_c, X) \cdot SunIntensity \cdot P(V, L) \cdot T(X, P_s) \cdot \beta_s(h)\, ds$$

The result of the phase function is constant along the ray (since the sun rays are parallel, μ does not change with the sample position), as is the sunlight intensity. We can therefore move these two terms out of the integral (equation 4):

$$SkyColor(P_c, P_a) = SunIntensity \cdot P(V, L) \int_{P_c}^{P_a} T(P_c, X) \cdot T(X, P_s) \cdot \beta_s(h)\, ds$$

Equation 4

This is the final equation for rendering the color of the sky for a particular view and light direction. As such, it's not that complicated, but it is computationally demanding, since we compute an integral and make many calls to the exponential function (through the transmittance term). Plus, you need to remember that the sky color is actually the result of both Rayleigh and Mie scattering. We therefore need to compute this equation twice, once for each type of scattering:

$$SkyColor(P_c, P_a) = SkyColor_{Rayleigh}(P_c, P_a) + SkyColor_{Mie}(P_c, P_a)$$

Remember from calculus that the product of two exponential functions is a single exponential function whose argument is the sum of the arguments of the first two:

$$e^a e^b = e^{a+b}$$

We can use this property to our advantage (we compute one exponential instead of two) and rewrite the product of the two transmittance terms in equation 4 as:

$$T(P_c, X) = e^{-\beta_{e0}}, \qquad T(X, P_s) = e^{-\beta_{e1}}$$

$$T(P_c, X) \cdot T(X, P_s) = e^{-\beta_{e0}} \cdot e^{-\beta_{e1}} = e^{-(\beta_{e0} + \beta_{e1})}$$

Equation 5

Let's now have a look at the implementation of the atmospheric model in a C++ program.

Implementation (C++)

We now have all the elements needed to implement a version of Nishita's algorithm in a C++ program. As usual, the program will be written so that the algorithm can easily be understood. Some optimisations that could make the program run faster will be suggested at the end of this paragraph. However, real time is not the goal here, and images of the sky can still be created with this program in a matter of seconds.

First we will create an Atmosphere class that we will use to specify all the parameters of our system: the radius of the planet and of the atmosphere (Re, Ra), the Rayleigh and Mie scattering coefficients at sea level, the Rayleigh and Mie scale height (Hr and Hm), the sun direction, the sun intensity and the mean cosine. All distances are expressed in meters (as well as the scattering coefficients).

template<typename T>
class Atmosphere
{
public:
    Atmosphere(
        Vec3<T> sd = Vec3<T>(0, 1, 0),
        T re = 6360e3, T ra = 6420e3,
        T hr = 7994, T hm = 1200) :
        Hr(hr), Hm(hm),
        radiusEarth(re), radiusAtmosphere(ra),
        sunDirection(sd),
        sunIntensity(20),
        g(0.76)
    {}

    static const Vec3<T> betaR;  // Rayleigh scattering coefficients at sea level
    static const Vec3<T> betaM;  // Mie scattering coefficients at sea level
    T Hr;                        // Rayleigh scale height
    T Hm;                        // Mie scale height
    T radiusEarth;               // Earth radius
    T radiusAtmosphere;          // Atmosphere radius
    Vec3<T> sunDirection;        // Sun direction
    T sunIntensity;              // Sun intensity
    T g;                         // Mean cosine

    // the ray is passed by non-const reference because its tmin/tmax are
    // clipped against the atmosphere inside the function
    Vec3<T> computeIncidentLight(Ray<T> &r) const;
};

template<typename T>
const Vec3<T> Atmosphere<T>::betaR(5.5e-6, 13.0e-6, 22.4e-6);

template<typename T>
const Vec3<T> Atmosphere<T>::betaM(21e-6);

We will render the sky as if it were seen through a fisheye lens: the camera looks straight up and captures a 360-degree view of the sky. To create an animation of the sky rendered for different positions of the sun, we render a series of frames. In the first frame, the sun is at the zenith (centre of the frame); in the last frame, the sun is slightly below the horizon.

template<typename T>
void renderSkydome(const Vec3<T> sunDirection, const char *filename)
{
    Atmosphere<T> atmosphere(sunDirection);
    const unsigned width = 512, height = 512;
    Vec3<T> *image = new Vec3<T>[width * height], *p = image;
    memset(image, 0x0, sizeof(Vec3<T>) * width * height);
    for (unsigned j = 0; j < height; ++j) {
        T y = 2 * (j + 0.5) / T(height - 1) - 1;
        for (unsigned i = 0; i < width; ++i, ++p) {
            T x = 2 * (i + 0.5) / T(width - 1) - 1;
            T z2 = x * x + y * y;
            if (z2 <= 1) {
                T phi = atan2(y, x);
                T theta = acos(1 - z2);
                Vec3<T> dir(sin(theta) * cos(phi), cos(theta), sin(theta) * sin(phi));
                // the camera is 1 meter above sea level, looking straight up
                Ray<T> ray(Vec3<T>(0, atmosphere.radiusEarth + 1, 0), dir);
                *p = atmosphere.computeIncidentLight(ray);
            }
        }
        fprintf(stderr, "\b\b\b\b%3d%c", (int)(100 * j / (height - 1)), '%');
    }
    // save the image to a PPM file
    ...
    delete [] image;
}

int main(int argc, char **argv)
{
    unsigned nangles = 128;
    for (unsigned i = 0; i < nangles; ++i) {
        char filename[1024];
        sprintf(filename, "./skydome.%d.ppm", i);
        float angle = i / float(nangles - 1) * M_PI * 0.6;
        fprintf(stderr, "Rendering image %d, angle = %0.2f\n", i, angle * 180 / M_PI);
        renderSkydome<float>(Vec3<float>(sin(angle), cos(angle), 0), filename);
    }
    return 0;
}

Finally, here is the function that computes equation 4 (that is, the color of the sky for a particular camera ray). The first thing we do is find the intersection point of the camera ray with the atmosphere (line 5). Then we compute the value of the Rayleigh and Mie phase functions, using the sun and camera ray directions (lines 15 and 17). The first loop (line 18) creates samples along the camera ray. Note that the sample position (X in the equations) is the middle point of each segment (line 19). From there, we can compute the sample's altitude (line 20). We compute exp(-h/H) multiplied by ds for Rayleigh and Mie scattering (using Hr and Hm respectively). These values are accumulated (lines 24 and 25) to compute the optical depth at X; we also use them later to scale the scattering coefficients (βs(h) in equation 4, lines 43 and 44). Then we compute the amount of light coming from the sun at X (lines 32 to 39): we cast a ray in the direction of the sun (rays are parallel) and find its intersection with the atmosphere. This light ray is cut into segments and we evaluate the density in the middle of each segment (lines 36 and 37). Accumulating these values gives us the optical depth of the light ray. Note that we test whether each light sample is above or below the ground; if it is below, the light ray is in the shadow of the Earth and we can discard the contribution of this ray (lines 35 and 40). Note also that the Mie extinction coefficient is about 1.1 times the value of the Mie scattering coefficient (line 41). Finally, using the trick from equation 5, we compute the combined transmittance of the light and camera rays by summing their optical depths inside one single exponential call (lines 41 and 42). At the end of the function, we return the final color of the sky for this particular ray, which is the sum of the Rayleigh and Mie in-scattered contributions, each multiplied by its respective phase function and scattering coefficients; this sum is also multiplied by the sun intensity (line 48).

 1  template<typename T>
 2  Vec3<T> Atmosphere<T>::computeIncidentLight(Ray<T> &r) const
 3  {
 4      T t0, t1;
 5      if (!intersect<T>(r, radiusAtmosphere, t0, t1) || t1 < 0) return Vec3<T>(0);
 6      if (t0 > r.tmin && t0 > 0) r.tmin = t0;
 7      if (t1 < r.tmax) r.tmax = t1;
 8      unsigned numSamples = 16;
 9      unsigned numSamplesLight = 8;
10      T segmentLength = (r.tmax - r.tmin) / numSamples;
11      T tCurrent = r.tmin;
12      Vec3<T> sumR(0), sumM(0);
13      T opticalDepthR = 0, opticalDepthM = 0;
14      T mu = r.direction.dot(sunDirection);
15      T phaseR = 3 / (16 * M_PI) * (1 + mu * mu);
16      T g = 0.76;
17      T phaseM = 3 / (8 * M_PI) * ((1 - g * g) * (1 + mu * mu)) / ((2 + g * g) * pow(1 + g * g - 2 * g * mu, 1.5));
18      for (unsigned i = 0; i < numSamples; ++i) {
19          Vec3<T> samplePosition = r(tCurrent + T(0.5) * segmentLength);
20          T height = samplePosition.magnitude() - radiusEarth;
21          // compute the Rayleigh and Mie optical depth of the camera ray segment
22          T hr = exp(-height / Hr) * segmentLength;
23          T hm = exp(-height / Hm) * segmentLength;
24          opticalDepthR += hr;
25          opticalDepthM += hm;
26          // now compute the optical depth of the light ray
27          Ray<T> lightRay(samplePosition, sunDirection);
28          intersect(lightRay, radiusAtmosphere, lightRay.tmin, lightRay.tmax);
29          T segmentLengthLight = lightRay.tmax / numSamplesLight, tCurrentLight = 0;
30          T opticalDepthLightR = 0, opticalDepthLightM = 0;
31          unsigned j = 0;
32          for (j = 0; j < numSamplesLight; ++j) {
33              Vec3<T> samplePositionLight = lightRay(tCurrentLight + T(0.5) * segmentLengthLight);
34              T heightLight = samplePositionLight.magnitude() - radiusEarth;
35              if (heightLight < 0) break;  // light sample below the ground: Earth's shadow
36              opticalDepthLightR += exp(-heightLight / Hr) * segmentLengthLight;
37              opticalDepthLightM += exp(-heightLight / Hm) * segmentLengthLight;
38              tCurrentLight += segmentLengthLight;
39          }
40          if (j == numSamplesLight) {
41              Vec3<T> tau = betaR * (opticalDepthR + opticalDepthLightR) + betaM * 1.1 * (opticalDepthM + opticalDepthLightM);
42              Vec3<T> attenuation(exp(-tau.x), exp(-tau.y), exp(-tau.z));
43              sumR += hr * attenuation;
44              sumM += hm * attenuation;
45          }
46          tCurrent += segmentLength;
47      }
48      return sunIntensity * (sumR * phaseR * betaR + sumM * phaseM * betaM);
49  }

Each frame only takes a few seconds to compute on a 2.5 GHz processor (this time depends on the number of view and light samples you use). A few features can be added to this program. Some papers run a tone mapping pass on the resulting image before saving it out. This can help reduce the contrast between the brightness of the sky around the sun (very bright and white) and the rest of the sky. In addition, you can save the resulting image to a floating-point image format (HDR, EXR), as values can easily be greater than one depending on your sun intensity (remapping and clipping the values is not a great choice). This function can easily be added to any existing renderer (see the paragraph on Aerial Perspective for more details). Note in the images how the color of the sky turns red-orange when the sun is at or slightly below the horizon. You can also comment out the contribution of the Mie scattering to see the contribution of the Rayleigh scattering independently (and vice versa). However, just by looking at the resulting images and remembering what you have learned about these two scattering models, you can easily guess that Mie scattering is responsible for the main white halo around the sun and the brightening/desaturation of the sky at the horizon, while Rayleigh scattering is responsible for the blue/red/orange colors of the sky.
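As an example of such a pass, here is a minimal exposure-plus-gamma curve that could be applied to each channel of each pixel before quantizing to 8 bits (this is one possible curve among many, not the one used by any particular paper):

#include <algorithm>
#include <cmath>

// Simple tone mapping: exponential exposure followed by a 2.2 gamma encoding.
// Maps an HDR radiance value (possibly greater than 1) to a displayable [0,1] value.
template<typename T>
T toneMap(T radiance, T exposure = 1)
{
    T mapped = 1 - exp(-radiance * exposure);
    return pow(std::max(T(0), std::min(T(1), mapped)), T(1 / 2.2));
}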

An interesting project could consist of passing a date and time to the program, computing the sun position from this data, rendering a frame with this sun direction, and comparing the result to a photograph of the real sky taken at the same time on the same day. How closely would they match? The sun position is related to its declination and the solar hour angle (in other words, the time of day).
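For reference, the standard solar position formula relates these quantities to the elevation of the sun above the horizon. A hedged sketch (all angles in radians; this ignores refinements such as the equation of time and atmospheric refraction):

#include <cmath>

// Solar elevation angle from the observer's latitude, the solar declination
// and the hour angle: sin(el) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(ha)
double solarElevation(double latitude, double declination, double hourAngle)
{
    return asin(sin(latitude) * sin(declination) +
                cos(latitude) * cos(declination) * cos(hourAngle));
}

The sun direction passed to renderSkydome() could then be built from this elevation, the same way the animation loop in main() builds it from its angle parameter.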

Optimisation: Nishita observed that the sky is symmetrical about the axis defined by the sun position and the centre of the Earth. Based on this observation, the paper proposes a technique to bake the optical depth into a 2D table which you can access using μ (the cosine of the angle between the view and light directions) and h (the height of the sample point). You will need to create a table for each scattering model (Rayleigh and Mie). These tables can be computed beforehand, saved into a file and reused each time the program is run. Using precomputed values for the optical depth rather than computing it on the fly (which requires a few expensive exponential calls) would speed up the render quite a lot. If speed is important for your application, this optimisation is probably the first one you should implement (see Nishita's paper for the details).
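A sketch of what such a baking step might look like; the table resolution, the flat-atmosphere approximation in the inner loop and the exact parameterisation are our own illustrative simplifications (see Nishita's paper for his spherical-shell scheme):

#include <cmath>
#include <vector>

// Bake optical depth into an N x N table indexed by the sample altitude h and
// the cosine mu of the ray direction with the vertical. At render time the
// table is read (and interpolated) instead of re-integrating exp(-h/H).
std::vector<float> bakeOpticalDepth(unsigned N, float thickness, float H)
{
    std::vector<float> table(N * N);
    const unsigned steps = 64;
    for (unsigned j = 0; j < N; ++j) {
        float h = j / float(N - 1) * thickness;     // sample altitude
        for (unsigned i = 0; i < N; ++i) {
            float mu = 2 * (i / float(N - 1)) - 1;  // direction cosine
            float depth = 0, ds = thickness / steps;
            for (unsigned s = 0; s < steps; ++s) {
                // altitude along the ray (flat-atmosphere approximation)
                float hs = h + mu * (s + 0.5f) * ds;
                if (hs < 0 || hs > thickness) break;
                depth += exp(-hs / H) * ds;
            }
            table[j * N + i] = depth;
        }
    }
    return table;
}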

Light Shafts

Atmospheric effects can sometimes create spectacular visuals, and light shafts usually make these effects more dramatic. If light traveled freely through the volume, the volume would appear uniformly lit. But if we place some objects in the scene, areas of the volume shadowed by these objects appear darker than areas which are not occluded. Light shafts are beams of light corresponding to regions of the volume illuminated by a light source (the sun); they are only visible because other regions of the volume are shadowed by objects in the scene (in the case of the Earth's atmosphere, mainly mountains and clouds).

Figure 10: light shafts appear when some regions of the volume are shadowed by objects in the scene. In this example, some areas of the atmosphere are shadowed by the mountains.

Simulating Aerial Perspective

Computing aerial perspective actually doesn't require many changes to our code. As you can see in the following figure, the color of the mountain is affected by the presence of the blue atmosphere. Because the distance from the eye to the ground is shorter for ray A than for ray B, the contribution of the atmosphere is less noticeable in the color of ray A than in the color of ray B.

First, you should render the color of the geometry (the terrain) for the ray. Then you should compute the transmittance and the color of the atmosphere along the ray (using the eye position and the intersection point of the ray with the geometry for Pc and Pa respectively), and composite the two colors together using the following blending formula (note that the opacity of the atmosphere is 1 minus its transmittance):

Ray Color = Object Color * Transmittance + Atmosphere Color

In this lesson though, to keep the code simple, we won't be rendering the color of the ground, which by default will be black. If you wish to understand in detail how this compositing operation can be done, check the lesson on volume rendering. Here are a couple of results we got with two different positions of the sun. The camera is so far away that the Earth is visible in the bottom half of the frame. We first find whether the primary ray intersects the Earth; if it does, we then compute the color of the atmosphere from the eye position to this intersection point.
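If you do want to composite the terrain, a minimal sketch of the blending step could look like this (objectColor would come from your ray tracer, and transmittance and atmosphereColor from the atmosphere integration between Pc and Pa; the names are illustrative, and Vec3 is assumed to support component-wise multiplication):

// Composite the terrain with the atmosphere in front of it: the object's
// light is attenuated by the atmosphere's transmittance, and the light
// in-scattered by the atmosphere along the ray is added on top
template<typename T>
Vec3<T> compositeAerialPerspective(const Vec3<T> &objectColor,
                                   const Vec3<T> &transmittance,
                                   const Vec3<T> &atmosphereColor)
{
    return objectColor * transmittance + atmosphereColor;
}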

These images can be computed with a very basic ray-tracer. If the camera ray hits the Earth, all we need to do is update the ray tmin/tmax variables (lines 23-24) before we call the function that computes the atmosphere color.

 1  template<typename T>
 2  void renderSkydome(const Vec3<T> sunDirection, const char *filename)
 3  {
 4      Atmosphere<T> atmosphere(sunDirection);
 5      const unsigned width = 640, height = 480;
 6      Vec3<T> *image = new Vec3<T>[width * height], *p = image;
 7      memset(image, 0x0, sizeof(Vec3<T>) * width * height);
 8      T aspectRatio = width / T(height);
 9      T fov = 65;
10      T angle = tan(fov * M_PI / 180. * 0.5);
11      unsigned numPixelSamples = 4;
12      for (unsigned y = 0; y < height; ++y) {
13          for (unsigned x = 0; x < width; ++x, ++p) {
14              for (unsigned m = 0; m < numPixelSamples; ++m) {
15                  for (unsigned n = 0; n < numPixelSamples; ++n) {
16                      T rayx = (2 * (x + (m + drand48()) / numPixelSamples) / T(width) - 1) * aspectRatio * angle;
17                      T rayy = (1 - (y + (n + drand48()) / numPixelSamples) / T(height) * 2) * angle;
18                      Vec3<T> rayDirection(rayx, rayy, -1);
19                      rayDirection.normalize();
20                      // camera is 1 km above the ground (and offset 30 km along z)
21                      Ray<T> ray(Vec3<T>(0, atmosphere.radiusEarth + 1000, 30000), rayDirection);
22                      T t0, t1;
23                      if (intersect<T>(ray, atmosphere.radiusEarth, t0, t1) && t1 > 0)
24                          ray.tmax = std::max(T(0), t0);
25                      *p += atmosphere.computeIncidentLight(ray);
26                  }
27              }
28              *p *= T(1) / (numPixelSamples * numPixelSamples);
29          }
30      }
31      // save the image to a PPM file
32      ...
33      delete [] image;
34  }

Alien Skies

By changing the parameters of the Atmosphere model, it's quite easy to create skies which look very different from our Earthly sky. One could, for instance, imagine re-creating the atmosphere of the planet Mars, or simply create one's own imaginary sky. The scattering coefficients and the thickness of the atmosphere are the most obvious parameters you can play with to change its look. Increasing the contribution of Mie scattering quickly turns the atmosphere into a very foggy, hazy sky which, combined with light shafts, can create expressive, moody images.

What about Multiple Scattering ?

The color of the sky is the result of light being scattered once or multiple times in the atmosphere towards the viewer. However, the literature emphasises that single scattering predominates; rendering images of the sky while ignoring multiple scattering therefore still gives very plausible results. Most models in the literature ignore multiple scattering or do not provide a technique to take it into account. Bruneton, however, suggests that the amount of light reflected by the ground is large enough to influence the sky color, and proposes a model in which light scattered by the ground is taken into account (in this model, the Earth is assumed to be a perfect sphere).

Conclusion

The main ideas you need to remember from this chapter are that the sky can be rendered as a very large volume in the shape of a sphere (surrounding a solid sphere representing the Earth). Any volume (as learned in the lesson on Volume Rendering) can be defined by absorption and scattering coefficients as well as a phase function. The sky's appearance is the combination of Rayleigh and Mie scattering. Rayleigh scattering is responsible for the blue color of the sky (and its red-orange color at sunrise and sunset) and is caused by light being scattered by air molecules whose size is much smaller than the light wavelengths; air molecules scatter blue light more than green and red light. Mie scattering is responsible for the whitish, hazy look of the atmosphere. It is caused by light being scattered by particles bigger than the light wavelengths; these particles are known as aerosols, and they scatter light of all wavelengths roughly equally.

References

"Display of the Earth Taking into Account Atmospheric Scattering", Nishita et al., Siggraph 1993.

"Display Method of the Sky Color Taking into Account Multiple Scattering", Nishita et al., Siggraph 1996.

"A Practical Analytic Model for Daylight", A. J. Preetham et al., Siggraph 1999.

"A Physically-Based Night Sky Model", Jensen et al., Siggraph 2001.

"Efficient Rendering of Atmospheric Phenomena", Riley et al., Eurographics Symposium on Rendering 2004.

"Precomputed Atmospheric Scattering", Eric Bruneton and Fabrice Neyret, Eurographics 2008.
