
Local Illumination and Shading

Kalpathi Subramanian

Computer Science
The University of North Carolina at Charlotte

Last Modified: October 2024


Illumination and Shading

Learning Objectives
1. Explain the motivation behind shading in computer graphics.
2. Distinguish between local and global illumination.
3. List the illumination models, and their rationale, that make up the local illumination model used in the graphics pipeline for shading 3D objects.
4. Illustrate how the ambient, diffuse and specular reflection models are evaluated through examples.
5. Distinguish between the Phong and Gouraud shading models used to shade polygonal meshes, and state their computational tradeoffs.
Illumination and Shading
Why Shade Objects?
We want renderings to be realistic, i.e., to look the way objects would appear in the real world.
It is difficult to perceive objects in 3D under constant illumination; for example, objects appear flat.
Need to understand how light interacts with matter
Governed by the Rendering Equation (Kajiya ’86), a scattering
model.
$$I(x, x') = g(x, x')\left[\epsilon(x, x') + \int_S \rho(x, x', x'')\, I(x', x'')\, dx''\right]$$

where $x$, $x'$, $x''$ are object points, $\rho$ is the surface reflectance, and $g$ is a geometry term accounting for visible-surface calculations.
Cannot solve analytically, so must approximate the solution, leading
to a number of global illumination algorithms
We will initially focus on local illumination models.
Light Scattering/Absorption

Light bounces around

(Courtesy: Angel/Shreiner, Fig. 6.2)


Light interaction with a surface results in absorption and scattering
(reflection, transmission)
Particle, Wave models for light transport
We are interested in the illumination that reaches the viewer
Global Effects

Multiple Scattering

(Courtesy: Angel/Shreiner, Fig. 6.2)


$$I(x, x') = g(x, x')\left[\epsilon(x, x') + \int_S \rho(x, x', x'')\, I(x', x'')\, dx''\right]$$
Local vs. Global Illumination

Local Illumination:
▶ Assumes a set of designated lights in the environment
▶ Accounts only for illumination arriving at object points directly from the designated light sources in the scene, at points visible to the viewer.
Global Illumination:
▶ Also consider illumination from all other object points, due to all other
light transport mechanisms, such as transmission, reflection.
▶ Each object point in the scene can act as a secondary light
source, receiving and sending illumination along various paths.
▶ Primary light sources are emitters; secondary sources usually are not.
Most often, in computer graphics, we can get away with local
illumination models for a large number of applications, which can take
advantage of existing graphics hardware.
In recent years, significant advances have been made in supporting
global illumination algorithms in hardware.
What influences Shading on 3D objects?

Light-Material Interactions

Light striking an object is partly reflected and partly absorbed.


Reflected illumination determines object color.
Reflection is governed by surface characteristics, illumination
spectrum, orientation of incoming illumination and surface
orientation.
What influences Shading on 3D objects?
Light Sources: Types, Characteristics

Area Sources: Illumination comes from all points in the source, and
generally expensive to model/compute.
Point Source: Model with position, color
Distant Source: Model with orientation, all illumination rays are
parallel.
Spotlight: Restrict illumination to a specified angle around the point
source.
Ambient: Constant source of illumination, used to model illumination
from secondary sources (reflection, transmission), global illumination.
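As a rough sketch (not from the slides), these source types can be captured as simple data records; the Python class and field names below are hypothetical.

from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]   # (x, y, z) position/direction or (r, g, b) color

@dataclass
class PointLight:
    position: Vec3        # radiates in all directions from a point
    color: Vec3

@dataclass
class DistantLight:
    direction: Vec3       # all illumination rays are parallel
    color: Vec3

@dataclass
class SpotLight:
    position: Vec3
    direction: Vec3       # axis of the cone
    cutoff_deg: float     # illumination restricted to this angle about the axis
    color: Vec3

@dataclass
class AmbientLight:
    color: Vec3           # constant, non-directional contribution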
What influences Shading on 3D objects?
Surface Types
Smooth Surface: Reflected illumination is concentrated in a direction, governed by the law of reflection (approaching a mirror).
Rough Surface: Scatters light in all directions with equal intensity.
Translucent Surface: Allows part of the light to pass through the surface, dependent on the refractive index of the material; some of the incident light is also reflected.

(Courtesy: Angel/Shreiner Fig. 6.4)


What influences Shading on 3D objects?

Accuracy, Sampling Density, Illumination Models


Actual object geometry vs. polyhedral approximations.
Object precision vs. image precision algorithms.
Sampling density, one sample/pixel versus multiple samples/pixel.
Illumination models, true optics vs. approximations, “hacks”,
“kludges”.
Local Illumination Models in Computer Graphics
Only consider direct illumination from designated light sources
Uses 3 models: Ambient, Specular, Diffuse
Illumination is the sum of the 3 models’ illumination components
Ambient models the remaining (global) illumination; diffuse and specular model rough and smooth surfaces, respectively.
Approximations used in both ambient and specular models.
Diffuse model is based on Lambert’s law.

Problems with Local Models


Does not account for secondary sources of illumination through
scattering.
Superposing the ambient, specular and diffuse models as a linear combination has no physical basis.
Specular reflection model is purely empirical.
Local Illumination Models in Computer Graphics
Ambient Reflection Model
Role is to provide uniform illumination in environments.
Most real world environments have some ambient illumination,
typically through inter-object scattering.
Expensive to compute, hence approximated by a constant.

Computing Ambient Illumination

$$I_{ambient} = k_a I_a$$

where $I_a$, the ambient intensity, is a constant, and $k_a$ is the ambient reflectance coefficient that limits the ambient contribution.
Purely empirical model; has no physical basis.
Models inter-object light interactions with a constant-intensity, non-directional light source.
Typically used in conjunction with the diffuse and specular models.
Local Illumination Models in Computer Graphics
Diffuse Reflection Model

Assumes a perfectly diffuse reflector, a Lambertian surface


Lambertian Reflection: Diffuse scattering of light is uniform and
independent of direction (microfacet model).
Dull, matte surfaces exhibit diffuse reflection.
Reflected energy varies with the direction of the incident beam; for shallow angles it is lower (spread over a larger area) than for a more direct beam (one that aligns with the surface normal).
Reflected light intensity is constant in all directions, and proportional to the cosine of the angle between the direction to the light source and the surface normal, i.e., intensity $\propto \cos\theta$.
Local Illumination Models in Computer Graphics

Diffuse Reflection Model

Diffuse reflection intensity is proportional to the projected area orthogonal to the incident light, i.e., to the cosine of the angle between the surface normal $\vec{N}$ and the incident light direction $\vec{L}$:

$$I_{diffuse} = I_p k_d \cos\theta = I_p k_d\,(\vec{N} \cdot \vec{L})$$
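A minimal Python sketch of the diffuse term, assuming unit-length vectors and the usual clamping of back-facing contributions to zero (the clamp is standard practice, not stated explicitly above).

import numpy as np

def diffuse(Ip, kd, N, L):
    """Lambertian diffuse term: I_diffuse = Ip * kd * max(N . L, 0)."""
    N = N / np.linalg.norm(N)   # ensure unit length
    L = L / np.linalg.norm(L)
    return Ip * kd * max(np.dot(N, L), 0.0)

# A light 60 degrees off the normal contributes cos(60 deg) = 0.5
L = np.array([np.sin(np.radians(60)), np.cos(np.radians(60)), 0.0])
print(diffuse(1.0, 0.8, np.array([0.0, 1.0, 0.0]), L))   # -> 0.4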
Local Illumination Models in Computer Graphics

Specular Reflection Model

Reflections from smooth or shiny surfaces.


Reflection is modeled as taking place at the top of the surface.
Reflection intensity is concentrated in the direction and plane governed by the law of reflection.
Responsible for generation of highlights.
Sensitive to viewer position.
Local Illumination Models in Computer Graphics
Phong Illumination Model

[Plot: $\cos\alpha$, $\cos^4\alpha$ and $\cos^{64}\alpha$ for $\alpha$ in $(-90°, 90°)$, showing the specular lobe narrowing as the exponent increases]

Phong proposed an empirical model for specular reflection.


The specular illumination is proportional to the cosine of the angle
between the perfect reflection direction and the direction to the
viewer, raised to a positive power, also termed the shininess
coefficient.

$$I_{spec} = k_s \cos^n \alpha$$

where $\alpha = \cos^{-1}(\vec{V} \cdot \vec{R})$ is the angle between the viewer and reflection directions.
Local Illumination Models in Computer Graphics
Modified Phong Model (Blinn’77)

(Courtesy Angel/Shreiner, Fig. 6.20)

The Phong model requires computing the reflection direction.


Blinn suggested a more efficient approximation to the Phong model by using the halfway vector $\vec{H}$ between the light and viewer directions.
Specular illumination is proportional to $(\vec{H} \cdot \vec{N})^n$, where

$$\vec{H} = \frac{\vec{L} + \vec{V}}{|\vec{L} + \vec{V}|}, \qquad I_{spec} = k_s \cos^n \psi = k_s\,(\vec{H} \cdot \vec{N})^n$$
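A small Python sketch contrasting the two specular formulations; it assumes unit vectors, clamps negative cosines to zero, and uses hypothetical parameter values.

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def phong_specular(ks, n, N, L, V):
    """Phong: ks * (V . R)^n, with R the mirror reflection of L about N."""
    R = normalize(2.0 * np.dot(N, L) * N - L)
    return ks * max(np.dot(V, R), 0.0) ** n

def blinn_phong_specular(ks, n, N, L, V):
    """Blinn-Phong: ks * (N . H)^n, with H the halfway vector between L and V."""
    H = normalize(L + V)
    return ks * max(np.dot(N, H), 0.0) ** n

N = np.array([0.0, 1.0, 0.0])
L = normalize(np.array([1.0, 1.0, 0.0]))    # light 45 degrees off the normal
V = normalize(np.array([-1.0, 1.0, 0.0]))   # viewer exactly along the mirror direction
print(phong_specular(1.0, 32, N, L, V))         # alpha = 0 -> 1.0
print(blinn_phong_specular(1.0, 32, N, L, V))   # psi = 0  -> 1.0

Away from the mirror direction the two terms fall off at different rates, so matching highlight widths generally requires a larger exponent with the halfway-vector form.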
Local Illumination Models in Computer Graphics

Combining the 3 models,

$$I_{local} = I_{amb} k_a + I_{diff} k_d\,(\vec{N} \cdot \vec{L}) + I_{spec} k_s \cos^n \alpha$$

Light sources are parameterized by wavelength $\lambda$. The model needs to be evaluated for each wavelength of the light source,

$$I_{local,\lambda} = I_{amb,\lambda} k_{a\lambda} + I_{diff,\lambda} k_{d\lambda}\,(\vec{N} \cdot \vec{L}) + I_{spec,\lambda} k_{s\lambda}\,(\vec{V} \cdot \vec{R})^n$$

For the R,G,B model used in computer graphics, we use a 3-wavelength version of this local illumination model.
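A Python sketch of the combined model evaluated per R,G,B channel, mirroring the equations above; the vector names follow the slides, the parameter values are hypothetical, and negative cosines are clamped to zero as usual.

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def local_illumination(Ia, ka, Ip, kd, ks, n, N, L, V):
    """I = Ia*ka + Ip*kd*(N.L) + Ip*ks*(V.R)^n, evaluated per channel."""
    NdotL = max(np.dot(N, L), 0.0)
    R = normalize(2.0 * np.dot(N, L) * N - L)       # mirror reflection of L about N
    VdotR = max(np.dot(V, R), 0.0) if NdotL > 0.0 else 0.0
    return Ia * ka + Ip * kd * NdotL + Ip * ks * VdotR ** n

I = local_illumination(
    Ia=np.array([0.2, 0.2, 0.2]), ka=np.array([1.0, 0.5, 0.5]),
    Ip=np.array([1.0, 1.0, 1.0]),
    kd=np.array([0.7, 0.2, 0.2]), ks=np.array([0.9, 0.9, 0.9]), n=32,
    N=np.array([0.0, 0.0, 1.0]),
    L=normalize(np.array([0.0, 1.0, 1.0])),
    V=np.array([0.0, 0.0, 1.0]))
print(I)   # one intensity per channel; clamped to [0, 1] before display in practice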
Local Illumination Models in Computer Graphics

Light Source Attenuation


For more accuracy, lighting models can include an attenuation factor,
representing light energy attenuation as a function of distance.
The inverse square law can be used, but given the approximate lighting model, it is better to also incorporate a linear component.

$$f_{atten} = \frac{1}{a + b\,d + c\,d^2}$$
Each component of the lighting model is scaled by the above term,
where d is the distance of the light source to the object point.
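A minimal Python sketch of the attenuation factor, with hypothetical coefficients; in practice the ambient term is usually left unattenuated.

def attenuation(d, a=1.0, b=0.0, c=0.0):
    """f_atten = 1 / (a + b*d + c*d^2); a, b, c are chosen per light source."""
    return 1.0 / (a + b * d + c * d * d)

# With a=1, b=0.1, c=0.01, a light 5 units away is scaled by about 0.57
print(attenuation(5.0, a=1.0, b=0.1, c=0.01))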
Local Illumination Models in Computer Graphics
Example: Teapot Model with varying reflection model parameters
Computing local Illumination - Problems

For the following problems, compute the diffuse and specular components of the illumination with the given parameters:
1. Eye = (0, 0, 0), LightPos = (0, 4, 0), $\vec{N}$ = (−1, 0, 0), Shininess = 4, Object Point = (1, 1, 0)
2. Eye = (0, 0, 0), LightPos = (2, 4, 7), $\vec{N}$ = (0, 0, 1), Shininess = 4, Object Point = (5, 1, −2)
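One way to set up problem 1 in Python, as a sketch: the problem does not specify the light or material intensities, so Ip, kd, ks default to 1 here, and the Phong (reflection-vector) form of the specular term is assumed. The same routine applies to problem 2.

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def diffuse_and_specular(eye, light_pos, N, point, shininess, Ip=1.0, kd=1.0, ks=1.0):
    """Diffuse and Phong specular terms at 'point' for a single point light."""
    N = normalize(np.asarray(N, dtype=float))
    L = normalize(np.asarray(light_pos, dtype=float) - point)   # toward the light
    V = normalize(np.asarray(eye, dtype=float) - point)         # toward the viewer
    NdotL = max(np.dot(N, L), 0.0)
    R = normalize(2.0 * np.dot(N, L) * N - L)                   # reflection of L about N
    VdotR = max(np.dot(V, R), 0.0) if NdotL > 0.0 else 0.0
    return Ip * kd * NdotL, Ip * ks * VdotR ** shininess

point = np.array([1.0, 1.0, 0.0])
Id, Is = diffuse_and_specular(eye=(0, 0, 0), light_pos=(0, 4, 0),
                              N=(-1, 0, 0), point=point, shininess=4)
print(Id, Is)   # roughly 0.316 and 0.64 with unit Ip, kd, ks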
Polygonal Shading Models

Motivation
In the early days of computer graphics, realtime rendering of polygons
was expensive
Graphics hardware was in its infancy, memory was expensive
Need for more efficient algorithms to render large polygonal meshes
Computing Normals: Polygons

The equation of the (plane of the) polygon is given by

$$Ax + By + Cz + D = 0$$

where $\vec{N} = (A, B, C)$ is the normal vector to the polygon surface.
If $V_1$, $V_2$, $V_3$ are non-collinear points on the polygon,

$$\vec{N} = (\vec{V}_2 - \vec{V}_1) \times (\vec{V}_3 - \vec{V}_1) = (A, B, C)$$
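A small Python sketch of the cross-product construction above; the vertices are hypothetical and assumed to be listed counter-clockwise around the face.

import numpy as np

def polygon_normal(v1, v2, v3):
    """Unit normal of the plane through three non-collinear vertices: (V2-V1) x (V3-V1)."""
    v1, v2, v3 = (np.asarray(v, dtype=float) for v in (v1, v2, v3))
    n = np.cross(v2 - v1, v3 - v1)
    return n / np.linalg.norm(n)

# Triangle in the z = 0 plane -> normal along +z
print(polygon_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))   # [0. 0. 1.]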
Polygonal Shading Models
Computing Normals: Sphere

Sphere equation: $f(x, y, z) = x^2 + y^2 + z^2 = 1$

The normal at any point $(x, y, z)$ is given by the gradient,

$$\nabla f = \left(\frac{\partial f}{\partial x},\; \frac{\partial f}{\partial y},\; \frac{\partial f}{\partial z}\right) = (2x,\; 2y,\; 2z)$$

For polygonal meshes derived from approximations of curved surfaces, subdivision algorithms evaluate the tangent vectors at the vertices and use cross products to determine vertex normals.
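A tiny Python sketch of the gradient computation above; for the unit sphere the normalized gradient (2x, 2y, 2z) is just the point itself.

import numpy as np

def sphere_normal(p):
    """Gradient of f(x, y, z) = x^2 + y^2 + z^2 at p, normalized to unit length."""
    g = 2.0 * np.asarray(p, dtype=float)   # (2x, 2y, 2z)
    return g / np.linalg.norm(g)

p = np.array([0.0, 0.6, 0.8])              # a point on the unit sphere
print(sphere_normal(p))                     # equals p itself on the unit sphere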
Polygon Shading Algorithms

Constant (Flat, Faceted) Shading

A constant intensity is used for the entire polygon.


Assumptions:
▶ Light source is at infinity, so $\vec{N} \cdot \vec{L}$ is constant.
▶ Viewer is at infinity, so $\vec{N} \cdot \vec{V}$ is constant.
▶ Polygon represents the exact surface being modeled.
Fast to compute, illumination equation is evaluated once per polygon,
but produces unrealistic images in most cases.
Polygon Shading Algorithms
Interpolated Shading (Wylie, Romney, Evans, Erdahl, Gouraud,
Phong)

Interpolates shading intensities at each point on the polygon,


using the shades computed at the vertices of the polygon
Originally developed for triangular facets and generalized to arbitrary polygons by Henri Gouraud; hence called Gouraud shading.
Can be easily integrated with a scan-line algorithm.
Interpolation replaces the more expensive application of the
illumination model at each interior point of the polygon.
Assumes polygon represents the exact surface being modeled.
Polygon Shading Algorithms
Gouraud Shading: How to get vertex normals in a mesh?

(Courtesy, Angel/Shreiner, Fig. 6.28)


Face Normals are averaged to obtain vertex normals.
Vertex Intensities (I1 , I2 , I3 ) are obtained by application of a lighting
model.
Intensities at interior points are obtained by linear interpolation of the vertex intensities.
For curved surfaces, normals at grid vertices are computed from the higher-order surfaces.
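A Python sketch of the averaging step for an indexed triangle mesh; face normals of all triangles sharing a vertex are summed and renormalized (an unweighted average, one of several common choices). The mesh below is hypothetical.

import numpy as np

def vertex_normals(vertices, triangles):
    """Average the unit face normals of the triangles sharing each vertex."""
    vertices = np.asarray(vertices, dtype=float)
    normals = np.zeros_like(vertices)
    for i0, i1, i2 in triangles:
        n = np.cross(vertices[i1] - vertices[i0], vertices[i2] - vertices[i0])
        normals[[i0, i1, i2]] += n / np.linalg.norm(n)   # accumulate at the 3 vertices
    return normals / np.linalg.norm(normals, axis=1, keepdims=True)

# Two triangles sharing an edge: the shared vertices get the averaged normal
verts = [(0, 0, 0), (1, 0, 0), (0.5, 1, 0.5), (0.5, 1, -0.5)]
tris = [(0, 1, 2), (1, 0, 3)]
print(vertex_normals(verts, tris))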
Polygon Shading Algorithms

Phong Shading

(Courtesy, Angel/Shreiner, Figs. 6.30, 6.31)


Compute Normals at vertices of polygon.
Interpolate each component of normal at any interior point of polygon
Apply illumination model.
Can easily be incorporated into a scanline algorithm.
More expensive than Gouraud shading. Why?
Shading Interpolation: Problem
Given the above triangle ABC with vertex coordinates and pixel
intensities, compute the intensity at D and E.

$I(D) = ?$
$I(E) = ?$
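The triangle figure with the specific coordinates and intensities is not reproduced here. As a generic Python sketch, the intensity at an interior point such as D or E is a linear (barycentric) combination of the vertex intensities; the coordinates and intensities below are purely hypothetical.

import numpy as np

def interpolated_intensity(P, A, B, C, IA, IB, IC):
    """Linearly interpolate vertex intensities at point P inside triangle ABC."""
    A, B, C, P = (np.asarray(v, dtype=float) for v in (A, B, C, P))
    # Solve P = A + u*(B - A) + v*(C - A) for the barycentric weights u, v
    u, v = np.linalg.solve(np.column_stack((B - A, C - A)), P - A)
    return (1.0 - u - v) * IA + u * IB + v * IC

A, B, C = (0, 0), (10, 0), (0, 10)      # hypothetical 2D vertex positions
IA, IB, IC = 0.2, 0.8, 0.5              # hypothetical vertex intensities
print(interpolated_intensity((5, 0), A, B, C, IA, IB, IC))   # midpoint of AB -> 0.5
print(interpolated_intensity((4, 3), A, B, C, IA, IB, IC))   # -> 0.53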
Transforming Normals by an Affine Transform
Normals are direction vectors, not object points!
Applying an arbitrary affine transform M to a normal vector n⃗ can
result in incorrect normals.
Consider a tangent vector $\vec{t}$ in the tangent plane at a surface point, and the surface normal $\vec{n}$, which is orthogonal to $\vec{t}$.
We want the normal, transformed by some matrix $N$, to remain orthogonal to the tangent transformed by $M$, i.e.,

$$(M\vec{t}) \cdot (N\vec{n}) = 0$$
$$(M\vec{t})^T (N\vec{n}) = 0 \quad \text{(writing the dot product as a matrix product)}$$
$$\vec{t}^{\,T} (M^T N)\,\vec{n} = 0$$

This holds for every tangent $\vec{t}$ if $M^T N = I$, so

$$M^T N = I, \qquad N = (M^T)^{-1}$$

Thus, transforming the object points of a surface by $M$ requires multiplying the normals on its surface by $(M^T)^{-1}$.
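A short Python check of the inverse-transpose rule, using a non-uniform scale (a case where reusing M on normals visibly fails); the plane and vectors are hypothetical.

import numpy as np

M = np.diag([2.0, 1.0, 1.0])      # non-uniform scale: stretch x by 2
N = np.linalg.inv(M.T)            # normal matrix (M^T)^(-1), as derived above

t = np.array([1.0, -1.0, 0.0])    # a tangent vector in the plane x + y = const
n = np.array([1.0, 1.0, 0.0])     # its (unnormalized) surface normal

t_new = M @ t                     # transform the tangent by M
print(np.dot(M @ n, t_new))       # 3.0 -> transforming the normal by M breaks orthogonality
print(np.dot(N @ n, t_new))       # 0.0 -> the inverse-transpose preserves it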
Per Vertex Shading in WebGL: Vertex Shader
Color is calculated at each incoming vertex using a local illumination model
attribute vec4 vPosition;
attribute vec4 vNormal;
varying vec4 fColor;
uniform vec4 ambientProduct, diffuseProduct, specularProduct, lightPos;
uniform mat4 modelView, projection;
uniform float shininess;

void main() {
    vec3 pos = (modelView * vPosition).xyz;
    vec3 light = lightPos.xyz;
    vec3 L = normalize(light - pos);                 // direction to the light

    vec3 E = normalize(-pos);                        // camera is at origin
    vec3 H = normalize(L + E);                       // halfway vector (Blinn-Phong)
    vec3 N = normalize((modelView * vNormal).xyz);   // transform normal

    // Compute terms in the illumination equation
    vec4 ambient = ambientProduct;
    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * diffuseProduct;
    float Ks = pow(max(dot(N, H), 0.0), shininess);
    vec4 specular = Ks * specularProduct;

    if (dot(L, N) < 0.0) {                           // light source is not visible
        specular = vec4(0.0, 0.0, 0.0, 1.0);
    }
    fColor = ambient + diffuse + specular;
    fColor.a = 1.0;
    gl_Position = projection * modelView * vPosition;
}
Per Vertex Shading in WebGL: Fragment Shader

precision mediump float;   // default float precision, required in WebGL fragment shaders
varying vec4 fColor;

void main() {
    gl_FragColor = fColor;
}

Each fragment’s position in the triangle is assigned an interpolated color.


Colors are interpolated from the vertices of the triangle (computed in the vertex shader) to the fragment's position.
Per Fragment Shading in WebGL: Vertex Shader

Color is calculated in the fragment shader using interpolated values of vertex


position, normal vector, light vector and vector to the viewer. No lighting
calculations in the vertex shader.

attribute vec4 vPosition, vNormal;
varying vec3 N, L, E;
uniform mat4 modelView, projection;
uniform vec4 lightPos;

void main() {
    vec3 pos = (modelView * vPosition).xyz;
    vec3 light = lightPos.xyz;
    L = normalize(light - pos);                 // direction to the light (varying)
    E = normalize(-pos);                        // camera is at origin (varying)
    N = normalize((modelView * vNormal).xyz);   // transformed normal (varying)

    gl_Position = projection * modelView * vPosition;
}
Per Fragment Shading in WebGL: Fragment Shader
Local lighting model is applied to each position using interpolated values of
vertex position, normal vector, light vector and vector to the viewer

precision mediump float;   // default float precision, required in WebGL fragment shaders
uniform vec4 ambientProduct, diffuseProduct, specularProduct, lightPos;
uniform float shininess;
varying vec3 N, L, E;

void main() {
    vec4 fColor;
    vec3 H = normalize(L + E);                 // halfway vector from interpolated L, E
    vec4 ambient = ambientProduct;
    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * diffuseProduct;
    float Ks = pow(max(dot(N, H), 0.0), shininess);
    vec4 specular = Ks * specularProduct;

    if (dot(L, N) < 0.0) {                     // light source is not visible
        specular = vec4(0.0, 0.0, 0.0, 1.0);
    }
    fColor = ambient + diffuse + specular;
    fColor.a = 1.0;
    gl_FragColor = fColor;
}
