The Definitive Guide To Lighting in The High Definition Render Pipeline
2020 LTS Edition
Contents
Introduction
Installation
    System requirements
    Unity Hub
Project Settings
    Graphics Settings
    Quality Settings
    Optimizing HDRP
        Forward rendering
        Deferred shading
Anti-aliasing
    Post-processing Anti-aliasing
Volumes
    Volume Overrides
    Overrides workflow
    Exposure override
    Fixed mode
    Curve Mapping
    Physical Camera
Lights
    Light types
    Shapes
    Additional properties
    Light Layers
    Units
    HDRI Sky
    Gradient Sky
    Global fog
    Volumetric Lighting
Shadows
    Shadow maps
    Shadow Cascades
    Micro shadows
Reflections
    Reflection Probes
    Sky reflection
    Reflection hierarchy
Post-processing
    Post-processing overrides
    Tonemapping
    Bloom
    Depth of Field
    White Balance
    Color Adjustments
    Channel Mixer
    Lens Distortion
    Vignette
    Motion Blur
Rendering Debugger
Ray tracing
    Setup
    Overrides
Next steps
Introduction
It all begins with light.
Unity built the High Definition Render Pipeline (HDRP) to help creators like you
realize your vision, tapping the power of high-end PC or console hardware to
reach new levels of graphical realism.
This guide was created to help existing Unity artists and developers explore
HDRP’s physically based lighting techniques. HDRP represents a technological
advance in Unity’s real-time rendering, allowing you to work with light just as
it behaves in the real world.
Build game environments that transport your players anywhere, from the
familiar to the fantastical.
— Fog: Add depth and dimension to your scenes with fog. Enable volumetrics
to integrate fog effects with your foreground objects and render cinematic
shafts of light. Maintain per-light control of volumetric light and shadows
and use the Local Volumetric Fog component for fine control of fog density
with a 3D mask texture.
— Volume system: HDRP features an intuitive system that lets you block
out different lighting effects and settings based on camera location or by
priority. Layer and blend volumes to allow expert-level control over every
square meter of your scene.
Completely new to HDRP? Make sure you read Getting started with the High
Definition Render Pipeline.
— HDRP 12.x: Unity 2021.2
— HDRP 11.x: Unity 2021.1
Tying the HDRP graphics packages to a specific Unity release helps ensure
compatibility. However, you can also switch to a custom version of HDRP by
overriding the manifest file.
System requirements
HDRP only works on console and desktop platforms that support compute
shaders. HDRP does not support OpenGL or OpenGL ES devices. Refer to the
documentation for more complete requirements and compatibility information.
See Virtual Reality in the High Definition Render Pipeline to learn about
supported VR Platforms and devices.
To get started, create a new Project. Select 3D Sample Scene (HDRP) from the
available templates (called High Definition RP in older versions of the Hub).
This imports the HDRP package with some example presets.
If you create your project with the 3D Core template, Unity uses the older
Built-In Render Pipeline. You can migrate the project to HDRP manually
from the Package Manager (Window > Package Manager).
Find the High Definition RP package in the Unity Registry (or use the Search
field to locate it), and install.
If there is a conflict with the current Project Settings, the HDRP Render Pipeline
Wizard will appear to help you troubleshoot (also found under Window >
Render Pipeline > HDRP Render Pipeline Wizard; in Unity 2021.2/HDRP 12, it is
located under Window > Rendering > HDRP Wizard).
Click Fix All under Configuration Checking, or click Fix for each issue to repair
individually. This checklist can help you migrate from a non-SRP project.
When the wizard finishes, a prompt will ask you to create a new HDRP Pipeline
Asset. This is a file on disk that will hold your specific pipeline settings. Select
Create One to add a new Render Pipeline Asset and assign the file here.
Note that manual installation from a blank project does not import the
3D Sample Scene (HDRP). Use the 3D Sample Scene template if you want
access to the example assets shown in this guide.
We’ll use this project to demonstrate many of HDRP’s features throughout
this guide.
Use the WASD keys and mouse to drive the FPS Controller around the level.
— Room 1 is a round platform lit by the overhead sunlight. Decals add grime
and puddles of water to the concrete floor.
For a deeper look at the HDRP 3D Sample Scene, please check out this
blog post from Unity technical artist Pierre Yves Donzallaz, which describes the
template scene in more detail.
You might find some other projects helpful once you’re done exploring the HDRP
3D Sample Scene.
Though not originally intended for gaming, the Auto Showroom project
demonstrates a highly detailed vehicle with realistic lighting. Change the stage
lights, car paint materials, and backdrop in an interactive demo. This project
is available via the Unity Hub.
The Spaceship Demo primarily showcases the Visual Effect Graph, but it also
features many HDRP features in a sci-fi environment. You can download it from
Unity’s GitHub repository.
To learn how to make cinematics or animated films, use our Cinematic Studio
Sample, which showcases how to set up and light the shots of a funny short
movie called Mich-L, mixing stylized and photoreal rendering.
We invite you to explore these additional projects as you discover the High
Definition Render Pipeline.
Note: HDRP Default Settings is called HDRP Global Settings in Unity 2021.2/
HDRP 12 and above.
Project Settings
Graphics Settings
The top field, the Scriptable Render Pipeline Settings, represents a file on disk
that stores all of your HDRP settings.
You can have multiple such Pipeline Assets per project. Think of each one as
a separate configuration file. For example, you might use them to store
specialized settings for different target platforms (Xbox, PlayStation, and so on),
or to represent different visual quality levels that players can swap at runtime.
The Quality Settings panel lets you associate each of your Pipeline Assets with
a predefined quality level. Select a Level at the top to activate a specific Render
Pipeline Asset, shown in the Rendering options.
Be aware that enabling more features in the Pipeline Asset will consume more
resources. In general, optimize your project to use only what you need to achieve your
intended effect. If you don’t need a feature, you can turn it off to improve performance
and save resources.
Here are some typical features that you can disable if you don’t use them:
— In the camera’s Frame Settings (Main Camera, cameras used for integrated
effects like reflections, or additional cameras used for custom effects):
Refraction, Post-Process, After Post-Process, Transmission, Reflection Probe,
Planar Reflection Probe, and Big Tile Prepass
The HDRP Default Settings section (called HDRP Global Settings in Unity 2021.2/
HDRP 12) determines the baseline configuration of which features you have enabled
when you start the project. You will override these with local settings based on
camera position and/or priority (see Volumes below).
Global Settings are saved in their own separate Pipeline Asset, defined in the top
field. Set up the default rendering and post-processing options here.
As you develop your project, you may need to return to the HDRP Default Settings
to toggle a specific feature on or off. Some features will not render unless the
corresponding checkbox in the Default Settings is explicitly enabled. Be aware that
certain settings appear in the Volume Profiles and some features appear in the Frame
Settings, depending on usage.
While familiarizing yourself with HDRP’s feature set, make use of the top right Search
field in the Project Settings. This will only show you the relevant panels with the
search terms highlighted.
Enabling a feature in the HDRP Default Settings does not guarantee that it is active. To
verify, you must also locate the Pipeline Asset currently selected as the Quality level.
Check the corresponding option there as well. HDRP features must be enabled in
both the global and local settings in order to function.
You may want to understand how these rendering paths work to see how Lit
Shader Mode will impact the other settings in our pipeline.
Forward rendering
In Forward rendering, the graphics card splits the on-screen geometry into
vertices. Those vertices are further broken down into fragments, or pixels,
which render to screen to create the final image.
Each object passes, one at a time, to the graphics API. Forward rendering comes
with a cost for each light. The more lights in your Scene, the longer rendering
will take.
Forward rendering draws each light in a separate pass. If multiple lights
hit the same GameObject, this can create significant overdraw, slowing
rendering when many lights and objects are present.
Deferred shading
HDRP can also use deferred shading, where lighting is not calculated per object.
Instead, deferred shading postpones the heavy lighting work to a later stage and
uses two passes.
Deferred shading applies lighting to a buffer instead of each object. Each of these passes contributes
to the final rendered image.
In the first pass, or the G-buffer geometry pass, Unity renders the GameObjects.
This pass retrieves several types of geometric properties and stores them in a
set of textures (e.g., diffuse and specular colors, surface smoothness, occlusion,
normals, and so on).
In the second pass, or lighting pass, Unity renders the Scene’s lighting after the
G-buffer is complete. Hence, it defers the shading. The deferred shading path
iterates over each pixel and calculates the lighting information based on the
buffer instead of the individual objects.
For more information about the technical differences between the rendering
paths, see Forward and Deferred rendering in the HDRP documentation.
In your active Pipeline Asset, set the Lit Shader Mode to Forward Only. Next
select MSAA 2x, MSAA 4x, or MSAA 8x for the Multisample Anti-aliasing
Quality. Higher values result in better anti-aliasing, but they are slower.
We can see this more clearly when we zoom into the camera view.
Original scene
Post-processing Anti-aliasing
Adjust Post Anti-aliasing on your camera when using deferred shading.
Note: When combining Post-processing Anti-aliasing with Multisample Anti-aliasing, be aware of the rendering cost.
As always, optimize your project to balance visual quality with performance.
A Volume is just a placeholder object with a Volume component. You can create
one through the GameObject > Volume menu by selecting a preset. Otherwise,
simply make a GameObject with the correct components manually.
The Light Explorer can list all the Volumes in the open Scene(s).
Set the Volume component’s Mode setting to either Global or Local, depending
on context.
A global Volume works as a “catch-all” without any boundaries, and it affects all
cameras in the Scene. In the HDRP template scene, the VolumeGlobal defines
an overall baseline of HDRP settings for the entire level.
A local Volume defines a limited space where its settings take effect. It uses a
Collider component to determine its boundaries. Enable IsTrigger if you don’t
want the Collider to impede the movement of any physics bodies like your FPS
player controller.
As your camera moves around the scene, the global settings take effect until
your player controller enters a local Volume, where that Volume’s settings take over.
Performance tip
Volume Profiles
You can also switch to another Profile you already have saved. Having the
Volume Profile as a file makes it easier to reuse previous settings and share
Profiles between your Volumes.
Use the Profile field to switch Volume Profiles or create a new one.
Note that changes made to Volume Profiles in Play mode are not lost when you
exit Play mode.
Each Volume Profile begins with a set of default properties. To edit their values,
use Volume Overrides and customize the individual settings. For example,
Volume Overrides could modify the Volume’s Fog, Post-processing,
or Exposure.
Once you have your Volume Profile set, click Add Override to customize the
Profile settings. A Fog override could look like this:
Each of the Volume Override’s properties has a checkbox at the left, which you
can enable to edit that property. Leaving the box disabled means HDRP uses
the Volume’s default value.
Each Volume object can have several overrides. Within each one, edit as many
properties as needed. You can quickly check or uncheck all of them with the All
or None shortcut at the top left.
The higher-level Volume settings serve as the defaults for lower-level Volumes.
Here, the HDRP Default Settings pass down to the global Volume. This, in turn,
serves as the “base” for the local Volumes.
The Global Volume overrides the HDRP Default Settings. The Local Volumes, in
turn, override the Global Volume. Use the Priority, Weight, and Blend Distance
(outlined below) to resolve any conflicts from overlapping Volumes.
Debugging a Volume
To debug the current values of a given Volume component, you can use the
Volume tab in the Rendering Debugger.
You can find a complete Volume Overrides List in the HDRP documentation.
Because you often need more than one Volume per level, HDRP allows you to
blend between Volumes. This makes transitions between them less abrupt.
At runtime, HDRP uses the camera position to determine which Volumes affect
the final HDRP settings.
Blend Distance determines how far outside the Volume’s Collider to begin fading
on or off. A value of 0 for Blend Distance means an instant transition, while a
positive value means the Volume Overrides begin blending once the camera
enters the specified range.
The Volume framework is flexible and allows you to mix and match Volumes
and overrides as you see fit. If more than one Volume overlaps the same space,
HDRP relies on Priority to decide which Volume takes precedence. Higher
values mean higher priority.
Use Blend Distance, Weight, and Priority when overlapping local Volumes.
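To build intuition for how Blend Distance behaves, here is a simplified sketch that models a local Volume’s effective weight from the camera’s distance outside its Collider. This is an illustrative model for intuition only, not HDRP’s actual blending code.

```python
def blend_weight(distance_outside_collider, blend_distance, volume_weight=1.0):
    """Illustrative model of HDRP's Blend Distance: the Volume's influence
    ramps from 0 at the outer edge of the blend region to 'volume_weight'
    at the Collider surface."""
    if blend_distance <= 0.0:
        # Blend Distance 0 means an instant transition at the Collider surface.
        return volume_weight if distance_outside_collider <= 0.0 else 0.0
    t = 1.0 - min(max(distance_outside_collider / blend_distance, 0.0), 1.0)
    return volume_weight * t

print(blend_weight(0.0, 4.0))   # 1.0 at the Collider surface
print(blend_weight(2.0, 4.0))   # 0.5 halfway through the blend region
print(blend_weight(5.0, 4.0))   # 0.0 outside the blend region
```

Priority then decides which Volume wins where two blended Volumes overlap the same space.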
Your exposure range in HDRP will typically fall somewhere along this spectrum:
Greater exposure values allow less light into the camera and are appropriate for
more brightly lit situations. Here, an EV value between 13 and 16 is suitable for a
sunny daytime exterior. In contrast, a dark, moonless, night sky might use an EV
between -3 and 0.
You can vary a number of factors in an actual camera’s settings to modify your
Exposure value:
— The aperture (f-number), the size of the lens opening
— The shutter speed, the length of time the image sensor is exposed to light
— The ISO, the sensitivity of the image sensor
Photographers call this the exposure triangle. In Unity, as with a real camera, you can
arrive at the same exposure value using different combinations of these numbers.
HDRP expresses all exposure values in EV100, which fixes the sensitivity to that of
ISO 100 film (ISO: International Organization for Standardization).
Exposure formula
EV100 = log2(N² / t) − log2(S / 100), where N is the f-number, t the shutter
speed in seconds, and S the ISO sensitivity.
It’s a logarithmic base-2 scale. As the exposure value increases 1 unit, the
amount of light entering the lens decreases by half.
HDRP allows you to match the exposure of a real image. Simply shoot a digital
photo with a camera or smartphone. Grab the metadata from the image to
identify the f-number, shutter speed, and ISO.
Then, calculate the exposure value using the formula above. If you use the same
value in the Exposure override (see below), the rendered image should fall in line
with the real-world image exposure.
In this way, you can use digital photos as references when lighting your level.
While your goal isn’t necessarily to recreate the image perfectly, matching an
actual photograph can take the guesswork out of your lighting setups.
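The calculation described above can be sketched numerically. The example below computes EV100 from the f-number, shutter speed, and ISO you would read out of a photo’s Exif data (a minimal sketch of the standard EV100 formula, not Unity code):

```python
import math

def ev100(f_number, shutter_seconds, iso):
    """EV100 = log2(N^2 / t) - log2(S / 100): exposure value
    normalized to ISO 100 sensitivity."""
    return math.log2(f_number ** 2 / shutter_seconds) - math.log2(iso / 100)

# A bright daylight exposure ("Sunny 16"): f/16, 1/100 s, ISO 100.
print(round(ev100(16.0, 1 / 100, 100), 2))   # ≈ 14.64, inside the 13-16 daytime range
```

Plugging the resulting value into the Exposure override’s Fixed mode should bring the rendered image in line with the reference photo.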
In the Mode dropdown, you can select one of the following: Fixed, Automatic,
Automatic Histogram, Curve Mapping, and Physical Camera.
Compensation allows you to shift or adjust the exposure. You would typically
use this to apply minor adjustments and “stop” the rendered image up and
down slightly.
Fixed mode
Automatic mode
While Automatic mode will work under many lighting situations, it can also
unintentionally overexpose or underexpose the image when pointing the camera
at a very dark or very bright part of the scene.
Use the Limit Min and Limit Max to keep the exposure level within a desirable
range. Playtest to verify that your limits stay within your expected exposure
throughout the level.
Metering Mode, combined with mask options, determines what part of the
frame to use for autoexposure.
The Adaptation mode controls how the autoexposure changes as the camera
transitions between darkness and light, with options to adjust the speed. Just
like with the eye, moving the camera from a very dark to a very light area, or vice
versa, can be briefly disorienting.
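The adaptation behavior described above can be sketched as exponential convergence toward the metered target. This is an illustrative model of eye-like adaptation, with an assumed convergence rate, not HDRP’s exact implementation:

```python
import math

def adapt(current_ev, target_ev, speed, dt):
    """One frame of exponential adaptation toward the target exposure.
    'speed' is an assumed convergence rate per second."""
    blend = 1.0 - math.exp(-speed * dt)
    return current_ev + (target_ev - current_ev) * blend

# Walking from a dark interior (EV 5) into daylight (EV 14),
# simulated for one second at 60 fps:
ev = 5.0
for _ in range(60):
    ev = adapt(ev, 14.0, 3.0, 1 / 60)
print(round(ev, 2))   # ≈ 13.55 — most of the way to the target after one second
```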
Automatic, Automatic Histogram, and Curve Mapping modes use Metering mode
to control what part of the frame to use when calculating exposure. You can set
the Metering mode to:
— Spot: The camera only uses the center of the screen to measure exposure.
— Center Weighted: The camera favors pixels in the center of the image and
feathers out toward the edges of frame.
Automatic Histogram mode takes Automatic mode a step further. This computes
a histogram for the image and ignores the darkest and lightest pixels when
setting exposure.
By rejecting very dark or very bright pixels from the exposure calculation,
you may experience a more stable exposure whenever extremely bright or
dark pixels appear on the frame. This way, intense emissive surfaces or black
materials won’t underexpose or overexpose your rendered output as severely.
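The idea of rejecting histogram extremes can be sketched as follows: sort the per-pixel exposure values and average only those between assumed low and high percentile bounds (illustrative only; HDRP computes its histogram on the GPU):

```python
def histogram_exposure(pixel_evs, low_pct=10.0, high_pct=90.0):
    """Average per-pixel exposure values, ignoring the darkest and
    brightest percentiles of the frame."""
    ordered = sorted(pixel_evs)
    n = len(ordered)
    kept = ordered[int(n * low_pct / 100):int(n * high_pct / 100)]
    return sum(kept) / len(kept)

# A mostly mid-gray frame (EV 10) with a few intense emissive pixels (EV 30):
pixels = [10.0] * 90 + [30.0] * 10
print(histogram_exposure(pixels))   # 10.0 — the bright outliers are rejected
```

A naive average of the same frame would land at EV 12, overexposing the midtones the player actually looks at.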
Curve Remapping lets you remap the exposure curve as well (see Curve
Mapping below).
Automatic Histogram mode uses pixels from the middle of the histogram to calculate exposure.
Here, the x-axis of the curve represents the current exposure, and the y-axis
represents the target exposure. Remapping the exposure curve can generate
very precise results.
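A piecewise-linear curve is one way to picture this remapping: x is the metered exposure, y is the exposure actually applied. The sketch below stands in for the editor’s curve widget (the control points are made up for illustration):

```python
def remap_exposure(current_ev, curve_points):
    """Evaluate a piecewise-linear remapping curve:
    x = current exposure, y = target exposure."""
    pts = sorted(curve_points)
    if current_ev <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if current_ev <= x1:
            t = (current_ev - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return pts[-1][1]

# Compress a wide metered range (-3..16 EV) into a narrower applied range (0..14 EV):
curve = [(-3.0, 0.0), (16.0, 14.0)]
print(remap_exposure(6.5, curve))   # 7.0 — the midpoint of the input range
```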
Those familiar with photography may find Physical Camera mode helpful for
setting camera parameters.
Switch the Exposure override’s Mode to Physical Camera, then locate the
Main Camera.
Important to exposure are the ISO (sensitivity), Aperture (or f-number), and
Shutter Speed. If you are matching reference photos, copy the correct settings
from the image’s Exif data. Otherwise, this table can help you guesstimate
Exposure Value based on f-number and shutter speed.
Though not related to exposure, other Physical Camera properties can help you
match the attributes of real-world cameras.
For example, we normally use Field of View in Unity (and many other 3D
applications) to determine how much of the world a camera can see at once.
In real cameras, however, the field of view depends on the size of the sensor
and focal length of the lens. Rather than setting the field of view directly, the
Physical Camera settings allow you to fill in the Sensor Type, Sensor Size,
and Focal Length from the actual camera data. Unity will then automatically
calculate the corresponding Field of View value.
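The relationship Unity evaluates here is the standard pinhole-camera formula, fov = 2·atan(sensor / (2·focal length)). A quick check with common full-frame values:

```python
import math

def vertical_fov_degrees(sensor_height_mm, focal_length_mm):
    """Field of view from sensor size and focal length:
    fov = 2 * atan(sensor / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_height_mm / (2 * focal_length_mm)))

# A full-frame 36 x 24 mm sensor with a 50 mm lens:
print(round(vertical_fov_degrees(24.0, 50.0), 1))   # ≈ 27.0 degrees vertical FOV
```

A shorter focal length or larger sensor widens the view, which matches how swapping lenses behaves on a real camera.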
Use the physical camera parameters to reshape the aperture. The five-bladed iris can show a pentagonal (left)
or circular (right) bokeh.
Light types
These Light types are available, similar to the other render pipelines in Unity:
— Directional: This behaves like light from an infinitely distant source, with
perfectly parallel light rays that don’t diminish in intensity. Directional lights
often stand in for sunlight. In an exterior scene, this will often be your
key light.
— Spot: This is similar to a real-world spotlight, which can take the shape of
a cone, pyramid, or box. A spot falls off along the forward z-axis, as well
as toward the edges of the cone/pyramid shape.
— Point: This emits light from a single point in space equally in all
directions, like a bare bulb.
— Area: This projects light from the surface of a specific shape (a rectangle,
tube, or disc). An area light functions like a broad light source with a
uniform intensity in the center, like a window or fluorescent tube.
Modify how the spot, point, and area lights fall off with the Range. Many HDRP
Lights diminish using the inverse square law, like light sources in the real world.
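The inverse square law means received light drops with the square of the distance from the source:

```python
def inverse_square_intensity(intensity_at_1m, distance_m):
    """Light received at a distance falls off with the square of the distance
    (the inverse square law)."""
    return intensity_at_1m / (distance_m ** 2)

print(inverse_square_intensity(400.0, 2.0))   # 100.0 — doubling the distance quarters the light
print(inverse_square_intensity(400.0, 4.0))   # 25.0  — quadrupling it cuts the light to 1/16
```

This is why small changes to a light’s position near a surface alter its brightness far more than the same move would farther away.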
Spot and area lights have additional shapes for controlling how each light falls off.
— Cone: Projects light from a single point to a circular base. Adjust the Outer
Angle (degrees) and Inner Angle (percentage) to shape the cone and
modify its angular attenuation.
— Pyramid: Projects light from a single point onto a square base. Adjust the
pyramid shape with the Spot Angle and Aspect Ratio.
— Tube: Projects light from a single line in every direction, out to a defined
Range. This light only works in Realtime Mode.
— Disc: Projects light from a disc shape in the local positive Z direction,
out to a defined Range. This light only works in Baked Mode.
All HDRP Light types have Emission properties that define the light’s appearance.
You can switch the Light Appearance to Color and specify an RGB color. Otherwise,
change this to Filter and Temperature for more physically accurate input.
Color temperature sets the color based on degrees Kelvin. See the Lighting and
Exposure Cheat Sheet for reference.
You can also add another color that acts like a Filter, tinting the light with another
hue. This is similar to adding a color gel in photography.
Additional properties
HDRP also includes some advanced controls under the More Options button
at the top right of the Inspector properties. Click it to reveal additional controls.
These include toggles for Affect Diffuse and Affect Specular. In cutscene or
cinematic lighting, for example, you can separate Lights that control the bright
shiny highlights independently from those that produce softer diffuse light.
You can also use the Intensity Multiplier to adjust the overall intensity of the
light without actually changing the original intensity value. This is useful for
brightening or darkening multiple Lights at once.
HDRP allows you to use Light Layers to make Lights only affect specific meshes
in your Scene. These are LayerMasks that you can associate with a Light
component and MeshRenderer.
In the Light properties, click the More Options button (or, in newer versions,
select Show Additional Properties from the More Items menu (☰)). This displays
the Light Layer dropdown under General. Choose which LayerMasks you want to
associate with the Light.
Next, set up the MeshRenderers with the Rendering Layer Mask. Only Lights on
the matching LayerMask will affect the mesh. This feature can be invaluable for
fixing light leaks, making sure that lights only strike their intended targets. It can
also be part of a cutscene lighting workflow, so that characters only receive
dedicated cinematic lights.
Set the Rendering Layer Mask so that only specific Lights affect the mesh.
To set up your Light Layers, go to the HDRP Default Settings. The Layers
Names section lets you set the string names for Light Layers 0 through 7.
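LayerMasks behave as bitfields, so the matching rule above amounts to a bitwise AND. The sketch below illustrates that rule; the layer names are hypothetical, and Unity’s actual LayerMask type wraps the same integer logic:

```python
def light_affects_renderer(light_layer_mask, rendering_layer_mask):
    """A Light affects a mesh only if its Light Layer mask shares at least
    one bit with the renderer's Rendering Layer Mask."""
    return (light_layer_mask & rendering_layer_mask) != 0

# Hypothetical names for Light Layer bits 0 and 1:
ENVIRONMENT = 1 << 0
CINEMATIC = 1 << 1

character_mask = CINEMATIC   # this character only receives cinematic lights
print(light_affects_renderer(CINEMATIC, character_mask))     # True
print(light_affects_renderer(ENVIRONMENT, character_mask))   # False
```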
For more information, including the full list of Light properties, see the
Light component documentation.
Units
Physical Light Units can include units of both luminous flux and illuminance.
Luminous flux represents the total amount of light emitted from a source, while
illuminance refers to the total amount of light received by an object (often in
luminous flux per unit of area).
For recreating a real lighting source, switch to the unit listed on the tech specs and
plug in the correct luminous flux or luminance. HDRP will match the Physical Lighting
Units, eliminating much of the guesswork when setting intensities.
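The relationship between the two units can be sketched for the simplest case, a point source radiating uniformly in all directions: the emitted flux (lumens) spreads over a sphere, giving the illuminance (lux) received at a distance. Real fixtures are directional, so treat this as an idealized check, not a fixture calculation:

```python
import math

def illuminance_from_point_source(luminous_flux_lumens, distance_m):
    """Illuminance (lux) from an idealized uniform point source:
    the flux spreads over a sphere of area 4 * pi * r^2."""
    return luminous_flux_lumens / (4 * math.pi * distance_m ** 2)

# A typical 800 lm household bulb measured at 2 m:
print(round(illuminance_from_point_source(800.0, 2.0), 1))   # ≈ 15.9 lux
```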
Click the icon to choose presets for Exterior, Interior, Decorative, and Candle.
These settings provide a good starting point if you are not explicitly matching a
specific value.
The following cheat sheet contains the color temperature values and light intensities
of common real-world light sources. It also contains exposure values for different
lighting scenarios.
You can find a complete table of common illumination values in the Physical Light
Units documentation.
Make your point, spot, and area lights more closely mimic the falloff of real lights
using an IES profile. This works like a light cookie to apply a specific manufacturer’s
specs to a pattern of light. IES profiles can give your lights an extra boost of realism.
Import an IES profile from Assets > Import New Asset. The importer will
automatically create a Light Prefab with the correct intensity. Then just drag the
Prefab into the Scene view or Hierarchy and tweak its color temperature.
Real-world manufacturers
— Philips
— Lithonia Lighting
— Efficient Lighting Systems
— Atlas
— Erco
— Lamp
— Osram
Artist sources
— Renderman
In HDRP, you can use the Visual Environment override to define the sky and
general ambience for a scene.
Use Ambient Mode: Dynamic to set the sky lighting to the current override
that appears in the Visual Environment’s Sky > Type. Otherwise, Ambient Mode:
Static defaults to the sky setup in the Lighting window’s Environment tab.
Even with your other light sources disabled, the SampleScene receives general
ambient light from the Visual Environment.
Left: environment lighting only – with the sun directional light disabled, the sky
still provides ambient light. Right: direct sunlight combined with the environment lighting.
Adding the key light of the sun completes the general illumination of the scene.
The environment light helps fill in the shadow areas so that they don’t appear
unnaturally dark.
HDRP includes three different techniques for generating skies. Set the Type
to either HDRI Sky, Gradient Sky, or Physically Based Sky. Then, add the
appropriate override from the Sky menu.
Applying a Visual Environment sky is similar to wrapping the entire virtual world
with a giant illuminated sphere. The colored polygons of the sphere provide a
general light from the sky, horizon, and ground.
HDRI Sky allows you to represent the sky with a cubemap made from high-
dynamic range photographs. You can find numerous free and low-cost sources of
HDRIs online. A good starting point is the Unity HDRI Pack in the Asset Store.
If you’re adventurous, we also have a guide to shooting your own HDRIs.
HDRI Sky
Once you’ve imported your HDRI assets, add the HDRI Sky override to load the
HDRI Sky asset. This lets you also tweak options for Distortion, Rotation, and
Update Mode.
Because the sky is a source of illumination, specify the Intensity Mode, then
choose a corresponding Exposure/Multiplier/Lux value to control the strength of
the environmental lighting. Refer to the Lighting and Exposure Cheat Sheet above
for example intensity and exposure values.
Fog opacity depends on an object’s distance from the camera. Fog can
also hide the camera’s far clipping plane, blending your distant geometry back
into the scene.
Set up the Fog override on a Volume in your Scene. The Base Height determines
a boundary where constant, thicker fog begins to thin out traveling upward.
Above this value, the fog density continues fading exponentially, until it reaches
the Maximum Height.
Likewise, the Fog Attenuation Distance and Max Fog Distance control how
fog fades with greater distance from the camera. Toggle the Color Mode
between a Constant Color or the existing Sky Color.
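The height and distance behaviors described above can be sketched with simple exponential models. These are illustrative approximations; the decay scale in the height model is an assumption, not an HDRP parameter:

```python
import math

def height_fog_density(height, base_height, max_height, base_density=1.0):
    """Constant density up to Base Height, then exponential thinning toward
    Maximum Height (decay scale assumed for illustration)."""
    if height <= base_height:
        return base_density
    falloff = (max_height - base_height) / 4.0   # assumed decay scale
    return base_density * math.exp(-(height - base_height) / falloff)

def distance_fog_opacity(distance, attenuation_distance):
    """Fog opacity grows with camera distance: 1 - exp(-d / D)."""
    return 1.0 - math.exp(-distance / attenuation_distance)

print(height_fog_density(0.0, 10.0, 50.0))          # 1.0 — constant fog below Base Height
print(round(distance_fog_opacity(64.0, 64.0), 3))   # ≈ 0.632 at one attenuation distance
```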
Volumetric Fog Distance sets the distance (in meters) from the Camera’s
Near Clipping Plane to the back of its volumetric lighting buffer. This fills the
atmosphere with an airborne material, partially occluding GameObjects
within range.
Each Light component (except area lights) has a Volumetrics group. Check
Enable, then set the Multiplier and Shadow Dimmer. A Real-time or Mixed
Mode light will produce ’god rays’ within Volumetric Fog. The Multiplier dials
the intensity, while the Shadow Dimmer controls how shadow casting surfaces
cut into the light.
Room 2 in the Sample Scene features a skylight and Volumetric Fog. The
frame of the glass case carves volumetric shadows out of the sunbeams
from the ceiling. Dial up the Shadow Dimmer and exaggerate the Multiplier to
intensify the effect.
Local Volumetric Fog appears in a bounding box. Volumetric Lighting dramatically illuminates the Volumetric Fog areas.
Shadow maps
For a directional light, the shadow map covers a large portion of the scene,
which can lead to a problem called perspective aliasing. Shadow map pixels
close to the camera look jagged and blocky compared to those farther away.
Contact Shadows
Contact Shadows ray march the depth buffer to capture small, detailed shadows that shadow maps miss. Adjust the settings for the shadow thickness, quality, and fade.
Micro shadows
HDRP can extend even smaller shadow details into your Materials. Micro
shadows use the normal map and ambient occlusion map to render very fine
surface shadows without using the mesh geometry itself.
Simply add the Micro Shadows override to a Volume in your Scene and adjust
the Opacity. Micro shadows only work with directional lights.
HDRP offers several reflection techniques:
— Screen Space Reflections
— Reflection Probes
— Sky reflections
Each reflection type can be resource intensive, so select the method that works
best depending on your use case. If more than one reflection technique applies
to a pixel, HDRP blends the contribution of each reflection type. Bounding
surfaces called Influence Volumes partition the 3D space to determine what
objects receive the reflections.
Screen Space Reflections use the depth and color buffer to calculate reflections.
Thus they can only reflect objects currently in camera view and may not render
properly at certain positions on screen. Glossy floorings and wet planar surfaces
are good candidates for receiving Screen Space Reflections.
Screen Space Reflections ignore all objects outside of the frame, which can be a
limitation to the effect.
Make sure you have Screen Space Reflection enabled in the Frame Settings
(HDRP Default Settings or the Camera’s Custom Frame Settings) under
Lighting. Then, add the Screen Space Reflection override to your Volume object.
Material surfaces must exceed the Minimum Smoothness value to show Screen
Space Reflections. Lower this value if you want rougher materials to show the
SSR, but be aware that a lower Minimum Smoothness threshold can add to the
computation cost. If Screen Space Reflection fails to affect a pixel, then HDRP
falls back to using Reflection Probes.
Use the Quality dropdown to select a preset number of Max Ray Steps. Higher
Max Ray Steps increase quality but come with a cost. As with all effects,
balance performance with visual quality.
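The smoothness threshold can also be adjusted at runtime through the Volume profile. This is a hedged sketch: it assumes the profile already contains a Screen Space Reflection override, and the `minSmoothness` field name follows HDRP 10.x.

```csharp
// Minimal sketch: lower the SSR smoothness threshold so rougher materials
// also receive screen-space reflections (at additional cost).
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class SsrSetup : MonoBehaviour
{
    public Volume volume;

    void Start()
    {
        if (volume.profile.TryGet(out ScreenSpaceReflection ssr))
        {
            ssr.enabled.Override(true);
            ssr.minSmoothness.Override(0.7f); // lower value = more surfaces reflect, higher cost
        }
    }
}
```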
Each Scene can contain several probes and blend their results, so localized
reflections can change as your camera moves through the environment.
— Baked probes capture the cubemap once in the Editor, so they only reflect
the scene as it appeared at bake time.
— Real-time probes create the cubemap at runtime in the Player rather than
in the Editor. This means that reflections are not limited to static objects,
but be aware that real-time updates can be resource intensive.
A Planar Reflection Probe allows you to recreate a flat, reflective surface, taking
surface smoothness into account. This is perfect for a shiny mirror or floor.
The probe then stores the resulting mirror image in a 2D RenderTexture. Drawn
to the boundaries of the rectangular probe, this creates a planar reflection.
A Planar Reflection Probe captures a mirror image by reflecting a camera through a plane.
A Reflection Probe shows the surrounding room whereas a sky reflection reflects the Gradient Sky.
Reflection hierarchy
When one technique does not fully determine the reflection at a pixel, HDRP falls
back to the next technique. In other words, Screen Space Reflection falls back
to the Reflection Probes, which in turn fall back to sky reflections.
It’s important to set up your Influence Volumes for your Reflection Probes
properly. Otherwise, you could experience light leaking from unwanted
sky reflections.
For more details about the reflection hierarchy, see Reflection in the HDRP documentation.
Because the capture point of a Reflection Probe is fixed and rarely matches
the Camera position near the Reflection Probe, there may be a noticeable
perspective shift in the resulting reflection. As a result, the reflection might
not look connected to the environment.
A Reflection Proxy Volume helps you partially correct this. It reprojects the
reflections more accurately within the proxy Volume, based on the
Camera position.
A Reflection Proxy Volume reprojects the cubemap to match the room’s world space position.
Screen Space Global Illumination (SSGI) uses the depth and color buffer of the
screen to calculate bounced, diffuse light. Much like how lightmapping can
bake indirect lighting into the surfaces of your static level geometry, SSGI more
accurately simulates how photons can strike surfaces and transfer color and
shading as they bounce.
In Room 2 of the Sample Scene, we can see the green of the moving tree leaves
transfer through bounced light onto the wall with SSGI enabled.
Screen Space Global Illumination captures the bounced light from the foliage in real time.
Enable SSGI in the Frame Settings under Lighting; it must also be
enabled in the Pipeline Asset’s Lighting section.
Note: We recommend that you use Unity 2021.2 or above with Screen Space
Global Illumination. HDRP 12 includes significant improvements to SSGI quality.
The Screen Space Refraction override helps simulate how light behaves when
passing through a medium denser than air. HDRP’s Screen Space Refraction
uses the depth and color buffer to calculate refraction through a transparent
material like glass.
To enable this effect through the HDRP/Lit shader, make sure your material has
a Surface Type of Transparent.
HDRP post-processing uses the Volume system to apply the image effects to
the camera. Once you know how to add overrides, the process of applying more
post effects should already be familiar.
Many of these post-processing overrides for controlling color and contrast may
overlap in functionality. Finding the right combinations of them may require
some trial and error.
You won’t need every effect available. Just add the overrides necessary to
create your desired look and ignore the rest.
Post-processing overrides
Tonemapping
Tonemapping remaps HDR color values into a range the display can show. Choose the Neutral or ACES preset, or build a Custom curve.
Shadows, Midtones, Highlights
The Shadows, Midtones, Highlights override separately controls the tonal and
color range for shadows, midtones, and highlights of the render. Activate each
trackball to affect the respective part of the image. Then, use the Shadow and
Highlight Limits to prevent clipping or pushing the color correction too far.
Bloom creates the effect of light bleeding around the light source. This conveys
the impression that the light source is intensely bright and overwhelming
the camera.
Bloom override
Adjust the Intensity and Scatter to control the Bloom’s size and brightness.
Lens Dirt applies a texture of smudges or dust to diffract the Bloom effect.
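Bloom is handy to animate from code, for example to brighten a flash during a cutscene. The sketch below assumes the profile already contains a Bloom override; `intensity` and `scatter` are the standard HDRP parameter names.

```csharp
// Minimal sketch: drive Bloom Intensity and Scatter from a script.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class BloomSetup : MonoBehaviour
{
    public Volume volume;

    void Start()
    {
        if (volume.profile.TryGet(out Bloom bloom))
        {
            bloom.intensity.Override(0.4f); // overall brightness of the light bleed
            bloom.scatter.Override(0.6f);   // how far the glow spreads from the source
        }
    }
}
```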
Depth of Field simulates the focus properties of a real camera lens. Objects
nearer or farther from the camera’s focus distance appear to blur.
When Depth of Field is active, an out-of-focus blur effect called a bokeh can
appear around a bright area of the image. Modify the camera aperture’s shape
to change the appearance of the bokeh (see Additional Physical Camera
Parameters above).
The White Balance override adjusts a Scene’s color so that the color white is
rendered correctly in the final image. You can also push the Temperature to
shift between yellow (warmer) and blue (cooler), while Tint adjusts the color
cast between green and magenta.
In the HDRP Sample Project, the local Volumes include White Balance overrides
for each room.
White Balance
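White Balance can likewise be set per Volume from a script. This sketch assumes a White Balance override already exists on the profile; `temperature` and `tint` are clamped parameters in the -100 to 100 range.

```csharp
// Minimal sketch: adjust White Balance Temperature and Tint from a script.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class WhiteBalanceSetup : MonoBehaviour
{
    public Volume volume;

    void Start()
    {
        if (volume.profile.TryGet(out WhiteBalance wb))
        {
            wb.temperature.Override(15f); // shift between yellow and blue (-100..100)
            wb.tint.Override(-5f);        // shift between green and magenta (-100..100)
        }
    }
}
```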
Color Curves
Grading curves allow you to adjust specific ranges in hue, saturation, or luminosity.
Select one of the eight available graphs to remap your color and contrast.
Color Curves
Color Adjustments
Use this effect to tweak the overall tone, brightness, hue, and contrast of the
final rendered image.
Color Adjustments
The Channel Mixer lets one color channel impact the “mix” of another. Select
an RGB output, then adjust the influence of one of the inputs. For example,
dialing up the Green influence in the Red Output Channel will tint all green
areas of the image with a reddish hue.
Channel Mixer
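The green-into-red example above can be sketched in code. The field names (`redOutGreenIn` and siblings such as `redOutRedIn`) match recent Unity versions but should be treated as assumptions and verified against your HDRP release.

```csharp
// Minimal sketch: tint green areas of the image with a reddish hue
// by feeding the green input into the red output channel.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class ChannelMixerSetup : MonoBehaviour
{
    public Volume volume;

    void Start()
    {
        if (volume.profile.TryGet(out ChannelMixer mixer))
        {
            // Red output channel: boost the influence of the green input.
            // (redOutRedIn defaults to 100; the other inputs default to 0.)
            mixer.redOutGreenIn.Override(50f);
        }
    }
}
```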
Lens Distortion
Lens Distortion simulates radial patterns that arise from imperfections in the
manufacture of real-world lenses. This results in straight lines appearing slightly
bowed or bent, especially with zoom or wide-angle lenses.
Vignette imitates an effect from practical photography, where the corners of the
image darken and/or desaturate. This can occur with wide-angle lenses or result
from an appliance (a lens hood or stacked filter rings) blocking the light. This
effect can also be used to draw the attention of the viewer to the center
of the screen.
Motion Blur
Real-world objects appear to streak or blur in an image when they move
significantly during the camera’s exposure time. The Motion Blur override
simulates that effect.
To minimize performance cost, reduce the Sample Count, increase the Minimum
Velocity, and decrease the Maximum Velocity. You can also reduce the Camera
Clamp Mode parameters under the Additional Properties.
Rendering Debugger
The Debugger can help you troubleshoot a specific rendering pass. On the
Lighting panel, you can enter Fullscreen Debug Mode and choose features
to debug.
With the fullscreen Debug mode active, the Scene and Game views switch to a
temporary visualization of a specific feature. This can serve as a useful diagnostic.
Lighting Debug Mode or Fullscreen Debug Mode can help you understand the sources of illumination in your scene.
HDRP includes preview support for ray tracing with select GPU hardware and
the DirectX 12 API. See Getting started with ray tracing for a specific list of
system requirements.
Setup
In order to enable ray tracing (in Preview), you need to change the default
graphics API of your HDRP project to DirectX 12.
Open the Render Pipeline Wizard (Window > Render Pipeline > HD Render
Pipeline Wizard).² Click Fix All in the HDRP + DXR tab, then restart the Editor.
Follow the Pipeline Wizard prompts to activate any disabled features.
² In Unity 2021, the HDRP Wizard is located under Window > Rendering > HDRP Wizard.
Once you enable ray tracing for your project, check that your HDRP Global or
Camera Frame Settings also has ray tracing activated. Make sure you are using
a compatible 64-bit architecture in your Build Settings and validate your scene
objects from Edit > Rendering > Check Scene Content for HDRP Ray Tracing.
Overrides
Ray tracing adds some new Volume overrides and enhances many of the
existing ones in HDRP:
Use the Quality setting for complex interior environments that benefit from
multiple bounces and samples. Performance mode (limited to one sample
and one bounce) works well for exteriors, where the lighting mostly comes
from the primary directional light.
— If you’re migrating from the Built-In Render Pipeline, see this chart for a
detailed feature comparison between the two render pipelines.
Remember that the 3D Sample Project is available from the Unity Hub when
you want to explore further. Be sure to check out the additional resources
listed below, and you can always find tips on the Unity Blog or
HDRP community forum.
At Unity, we want to empower artists and developers with the best tools to
build real-time content. Lighting is both an art and a science – and where
these meet, it’s a little bit like magic.