The Definitive Guide To Lighting in The High Definition Render Pipeline Unity 2021 Lts Edition
Contents
Introduction
Installation
    System requirements
    Unity Hub
Project Settings
    Graphics settings
    Quality settings
    Optimizing HDRP
        Forward rendering
        Deferred shading
Anti-aliasing
    Post-processing Anti-aliasing
Volumes
    Volume Profiles
    Volume Overrides
    Overrides workflow
Exposure
    Exposure override
    Fixed mode
    Automatic mode
    Automatic Histogram
    Curve Mapping
    Physical Camera
Lights
    Light types
    Shapes
    Additional properties
    Light Layers
    Light Anchors
    Units
    HDRI Sky
    Gradient Sky
    Color tip
    Global fog
    Volumetric Lighting
Clouds
    Cloud Layer
    Volumetric Clouds
Shadows
    Shadow maps
    Shadow Cascades
    Contact Shadows
    Micro shadows
Reflections
    Reflection Probes
    Optimization tip
    Sky reflection
    Reflection hierarchy
Post-processing overrides
    Tonemapping
    Bloom
    Depth of Field
    White Balance
    Color Curves
    Color Adjustments
    Channel Mixer
    Lens Distortion
    Vignette
    Motion Blur
Lens Flare
Dynamic resolution
    NVIDIA DLSS
    AMD FSR
    TAA Upscale
Rendering Debugger
    Setup
    Overrides
More resources
Next steps
Introduction
It all begins with light.
Unity built the High Definition Render Pipeline (HDRP) to help creators like you
realize your vision, tapping the power of high-end PC or console hardware to
reach new levels of graphical realism.
This guide was created to help existing Unity artists and developers in
exploring HDRP’s physically based lighting techniques. HDRP represents a
technological advance in Unity’s real-time rendering, allowing you to work with
light just as it behaves in the real world.
Build game environments that transport your players anywhere, from the familiar to the fantastical.
— Fog: Add depth and dimension to your scenes with fog. Enable volumetrics
to integrate fog effects with your foreground objects and render cinematic
shafts of light. Maintain per-light control of volumetric light and shadows
and use the Local Volumetric Fog component for fine control of fog density
with a 3D mask texture.
— Volume system: HDRP features an intuitive system that lets you block
out different lighting effects and settings based on camera location or by
priority. Layer and blend volumes to allow expert-level control over every
square meter of your scene.
Completely new to HDRP? Make sure you read Getting started with the High
Definition Render Pipeline.
Installation
Unity 2021 LTS and above includes the HDRP package with the installation to
ensure that you’re always running on the latest verified graphics code. When
you install the most recent Unity release, it also installs the corresponding
version of HDRP.
For example, HDRP package version 12.x ships with Unity 2021 LTS, and 13.x ships with Unity 2022.1.
Tying the HDRP graphics packages to a specific Unity release helps ensure
compatibility. However, you can also switch to a custom version of HDRP by
overriding the manifest file.
System requirements
HDRP only works on console and desktop platforms that support compute
shaders. HDRP does not support OpenGL or OpenGL ES devices. Refer to the
documentation for more complete requirements and compatibility information.
See Virtual Reality in the High Definition Render Pipeline to learn about
supported VR Platforms and devices.
To get started, create a new project. Select either the 3D (HDRP) empty
template or 3D Sample Scene (HDRP) from the available templates (also called
High Definition RP in older versions of the Hub). Choose the latter to import the HDRP package with some example presets.
Load the Sample Scene. You should see something like this:
If you create your project with the 3D Core template, Unity uses the older
Built-In Render Pipeline. You can migrate the project to HDRP manually
from the Package Manager (Window > Package Manager).
Find the High Definition RP package in the Unity Registry (or use the Search
field to locate it), and install.
If there is a conflict with the current Project Settings, the HDRP Render Pipeline Wizard will appear to help you troubleshoot (in Unity 2021.2/HDRP 12 and above, it is found under Window > Rendering > HDRP Wizard).
Click Fix All under Configuration Checking, or click Fix for each issue to repair
individually. This checklist can help you migrate from a non-SRP project.
When the wizard finishes, a prompt will ask you to create a new HDRP Pipeline
Asset. This is a file on disk that will hold your specific pipeline settings. Select
Create One to add a new Render Pipeline Asset and assign the file here.
Note that manual installation from a blank project does not import the
3D Sample Scene (HDRP). Use the 3D Sample Scene template if you want
access to the example assets shown in this guide.
We’ll use this project to demonstrate many of HDRP’s features throughout
this guide.
Use the WASD keys and mouse to drive the FPS Controller around the level.
— Room 1 is a round platform lit by the overhead sunlight. Decals add grime
and puddles of water to the concrete floor.
For a deeper look at the HDRP 3D Sample Scene, please check out this
blog post from Unity technical artist Pierre Yves Donzallaz, which describes the
template scene in more detail.
You might find some other projects helpful once you’re done exploring the
HDRP 3D Sample Scene.
Though not originally intended for gaming, the Auto Showroom project
demonstrates a highly detailed vehicle with realistic lighting. Change the stage
lights, car paint materials, and backdrop in an interactive demo. This project
is available via the Unity Hub.
The Spaceship Demo primarily showcases the Visual Effect Graph, but it also
features many HDRP features in a sci-fi environment. You can download it from
Unity’s GitHub repository or get the playable on Steam.
Install the Cinematic Studio Template from the Unity Hub to learn how to
make cinematics or animated films. The template teaches how to set up and
light shots using a funny short movie called Mich-L, which mixes stylized and
photoreal rendering.
We invite you to explore these additional projects as you discover the High
Definition Render Pipeline.
Note: HDRP Default Settings is called HDRP Global Settings in Unity 2021.2/
HDRP 12 and above.
Project Settings
Graphics settings
The top field, the Scriptable Render Pipeline Settings, represents a file on disk
that stores all of your HDRP settings.
You can have multiple such Pipeline Assets per project. Think of each one as
a separate configuration file. For example, you might use them to store
specialized settings for different target platforms (Xbox, PlayStation, and so on),
or they could also represent different visual quality levels that the player could
swap at runtime.
The 3D Sample Scene includes low-, medium-, and high-quality Pipeline Assets.
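If you expose these quality tiers to players, you can also switch them from a script. The sketch below uses Unity's QualitySettings API; the quality index is an assumption that must match one of the levels you define under Project Settings > Quality.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical settings-menu helper: switches the active Quality Level,
// and with it the HDRP Pipeline Asset assigned to that level.
public class QualityTierSwitcher : MonoBehaviour
{
    public void SetQualityTier(int qualityIndex)
    {
        // The index must match a level defined under Project Settings > Quality.
        QualitySettings.SetQualityLevel(qualityIndex, applyExpensiveChanges: true);

        // Optional: log which Render Pipeline Asset is now active.
        RenderPipelineAsset active = QualitySettings.renderPipeline != null
            ? QualitySettings.renderPipeline
            : GraphicsSettings.renderPipelineAsset;
        Debug.Log($"Active pipeline asset: {(active != null ? active.name : "none")}");
    }
}
```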
Quality settings
The Quality settings let you associate one of your Pipeline Assets with a predefined quality level. Select a Level at the top to activate a specific Render
Pipeline Asset, shown in the Rendering options.
You can customize the defaults or create additional Quality Levels, each paired
with additional Pipeline Assets.
A Quality Level represents a specific set of visual features active in the pipeline.
For example, you could create several graphics tiers within your application. At
runtime, your players could then choose the active Quality Level, depending on
hardware.
Edit the actual pipeline settings in the Quality/HDRP subsection. You can
also select the Pipeline Asset in the Project view and edit the settings in the
Inspector.
Be aware that enabling more features in the Pipeline Asset will consume more
resources. In general, optimize your project to use only what you need to
achieve your intended effect. If you don’t need a feature, you can turn it off to
improve performance and save resources.
Review the Pipeline Asset for typical features that you can disable if you don't use them.
Read more in this blog post about getting acquainted with HDRP settings for enhanced performance.
The HDRP Global Settings section (or HDRP Default Settings prior to version
12) determines the baseline configuration of the project. You can override these
settings in the scene by placing local or global Volume components, depending
on the camera position (see Volumes below).
Global Settings save in their own separate Pipeline Asset defined at the top
field. Set up the default rendering and post-processing options here.
As you develop your project, you might need to return to the Global settings
to toggle a specific feature on or off. Some features will not render unless the
corresponding checkbox in HDRP Global Settings is enabled. Make sure you
only enable features you require because they might negatively impact the
rendering performance and memory usage. Also, certain settings will appear
in the Volume Profiles, while other features appear in the Frame Settings,
depending on usage.
Enabling a feature in the HDRP Global Settings does not guarantee it can be
rendered at any time by any camera. You must ensure that the Render Pipeline
Asset whose Quality level is selected under Project Settings > Quality
supports that feature as well. For instance, to ensure cameras can render
Volumetric Clouds, you must toggle them under HDRP Global Settings > Frame
Settings > Camera > Lighting and in the active Render Pipeline Asset, under
Lighting > Volumetrics.
Forward vs Deferred rendering
When configuring your HDRP settings in the Pipeline Asset, you will usually
start with the Lit Shader Mode under Rendering. Here you can choose between
Deferred, Forward, or Both. These represent the rendering path, a specific
series of operations related to how the pipeline will render and light
the geometry.
Choose Forward or Deferred in the Lit Shader Mode to set your default rendering path.
HDRP is flexible and also allows you to choose Both. This option lets you use
one render path for most rendering and then override it per camera.
However, this approach uses more
GPU memory. In most cases, it is
better to choose either Forward or
Deferred.
You may want to understand how these rendering paths work to see how Lit Shader Mode will impact the other settings in your pipeline.
Forward rendering
In Forward rendering, the graphics card splits the on-screen geometry into
vertices. Those vertices are further broken down into fragments, or pixels,
which render to screen to create the final image.
Each object passes, one at a time, to the graphics API. Forward rendering comes
with a cost for each light. The more lights in your Scene, the longer rendering
will take.
Forward rendering draws lights in separate passes. If you have multiple lights
hitting the same GameObject, this can create significant overdraw, slowing
down when a lot of lights and objects are present.
Unlike traditional forward rendering, HDRP does add some efficiencies to the
forward renderer. For example, it culls and renders several lights together in a
single pass per object material. However, it’s still a relatively expensive process.
If performance is an issue, you may want to use Deferred Shading instead.
Deferred shading
HDRP can also use deferred shading, where lighting is not calculated per object.
Instead deferred shading postpones heavy rendering to a later stage and uses
two passes.
In the first pass, or the G-buffer geometry pass, Unity renders the GameObjects.
This pass retrieves several types of geometric properties and stores them in a
set of textures (e.g., diffuse and specular colors, surface smoothness, occlusion,
normals, and so on).
In the second pass, or lighting pass, Unity renders the Scene’s lighting after the
G-buffer is complete. Hence, it defers the shading. The deferred shading path
iterates over each pixel and calculates the lighting information based on the
buffer instead of the individual objects.
For more information about the technical differences between the rendering
paths, see Forward and Deferred rendering in the HDRP documentation.
Anti-aliasing
The rendering path in the Lit Shader Mode influences how you can use anti-
aliasing to remove the jagged edges from your renders. HDRP offers several
anti-aliasing techniques, depending on your production needs.
In your active Pipeline Asset, set the Lit Shader Mode to Forward Only. Next
select MSAA 2x, MSAA 4x, or MSAA 8x for the Multisample Anti-aliasing
Quality. Higher values result in better anti-aliasing, but they are slower.
We can see this more clearly when we zoom into the camera view.
— Because MSAA only deals with polygon edge aliasing, it cannot prevent
aliasing found on certain textures and materials hit by sharp specular
lighting. You may need to combine MSAA with another post-processing anti-aliasing technique if that is an issue.
Post-processing Anti-aliasing
Post-processing anti-aliasing – compare the results of FXAA, SMAA, and TAA settings.
Volumes
A Volume is just a placeholder object with a Volume component. You can create
one through the GameObject > Volume menu by selecting a preset. Otherwise,
simply make a GameObject with the correct components manually.
The Light Explorer can list all the Volumes in the open Scene(s).
Set the Volume component’s Mode setting to either Global or Local, depending
on context.
A global Volume works as a “catch-all” without any boundaries, and it affects all
cameras in the Scene. In the HDRP template scene, the VolumeGlobal defines an
overall baseline of HDRP settings for the entire level.
As your camera moves around the scene, the global settings take effect until your player controller enters a local Volume, where that Volume's settings take over.
Volume Profiles
Use the Profile field to switch Volume Profiles or create a new one.
Note that changes made to Volume Profiles in Play mode are not lost when you exit Play mode.
Volume Overrides
Each Volume Profile begins with a set of default properties. To edit their values,
use Volume Overrides and customize the individual settings. For example,
Volume Overrides could modify the Volume’s Fog, Post-processing,
or Exposure.
Once you have your Volume Profile set, click Add Override to customize the
Profile settings. A Fog override could look like this:
Each of the Volume Override’s properties has a checkbox at the left, which you
can enable to edit that property. Leaving the box disabled means HDRP uses
the Volume’s default value.
Each Volume object can have several overrides. Within each one, edit as many
properties as needed. You can quickly check or uncheck all of them with the All
or None shortcut at the top left.
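You can drive the same overrides from a script at runtime. The sketch below assumes a Volume whose profile contains an Exposure override; the parameter name (fixedExposure) mirrors the Inspector field, so verify it against your HDRP version.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class VolumeOverrideTweaker : MonoBehaviour
{
    [SerializeField] Volume volume; // the Volume whose profile holds the override

    void Start()
    {
        // volume.profile returns a runtime copy; use sharedProfile to edit the asset itself.
        if (volume.profile.TryGet<Exposure>(out var exposure))
        {
            // Override() sets the value and ticks the property checkbox,
            // just like enabling it in the Inspector.
            exposure.fixedExposure.Override(11f); // EV100 value, e.g. an overcast exterior
        }
    }
}
```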
The higher-level Volume settings serve as the defaults for lower-level Volumes.
Here, the HDRP Default Settings pass down to the global Volume. This, in turn,
serves as the “base” for the local Volumes.
The Global Volume overrides the HDRP Default Settings. The Local Volumes, in
turn, override the Global Volume. Use the Priority, Weight, and Blend Distance
(outlined below) to resolve any conflicts from overlapping Volumes.
To debug the current values of a given Volume component, you can use the
Volume tab in the Rendering Debugger.
Debugging a Volume
You can find a complete Volume Overrides List in the HDRP documentation.
Because you often need more than one Volume per level, HDRP allows you to
blend between Volumes. This makes transitions between them less abrupt.
At runtime, HDRP uses the camera position to determine which Volumes affect
the final HDRP settings.
Blend Distance determines how far outside the Volume’s Collider to begin
fading on or off. A value of 0 for Blend Distance means an instant transition,
while a positive value means the Volume Overrides begin blending once the
camera enters the specified range.
The Volume framework is flexible and allows you to mix and match Volumes
and overrides as you see fit. If more than one Volume overlaps the same space,
HDRP relies on Priority to decide which Volume takes precedence. Higher
values mean higher priority.
Use Blend Distance, Weight, and Priority when overlapping local Volumes.
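The same blending controls are available from code if you build Volumes procedurally. A minimal sketch, assuming you already have a VolumeProfile asset to assign:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class LocalVolumeFactory
{
    // Builds a box-shaped local Volume around a position.
    public static Volume Create(Vector3 center, Vector3 size, VolumeProfile profile)
    {
        var go = new GameObject("Local Volume");
        go.transform.position = center;

        // Local Volumes need a trigger collider to define their bounds.
        var box = go.AddComponent<BoxCollider>();
        box.size = size;
        box.isTrigger = true;

        var volume = go.AddComponent<Volume>();
        volume.isGlobal = false;      // Local mode: only affects cameras inside the bounds
        volume.priority = 10f;        // higher values win over the global Volume
        volume.blendDistance = 2f;    // start blending 2 m outside the collider
        volume.sharedProfile = profile;
        return volume;
    }
}
```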
Exposure
Your exposure values in HDRP will typically fall somewhere along a spectrum ranging from dark, moonless nights to bright, sunlit exteriors.
Greater exposure values allow less light into the camera and are appropriate for
more brightly lit situations. Here, an EV value between 13 and 16 is suitable for a
sunny daytime exterior. In contrast, a dark, moonless, night sky might use an EV
between -3 and 0.
You can vary a number of factors in an actual camera’s settings to modify your Exposure value:
— The shutter speed, the length of time the image sensor is exposed to light
— The aperture (f-number), the size of the opening that lets light through the lens
— The ISO, the sensitivity of the sensor or film to light
Exposure triangle
HDRP expresses all exposure values in EV100, which fixes the sensitivity to that of ISO 100 film.
Exposure formula
It’s a logarithmic base-2 scale: as the exposure value increases by 1 unit, the amount of light entering the lens decreases by half.
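The formula referenced here is the standard EV100 definition from photography, with N the aperture (f-number), t the shutter speed in seconds, and S the ISO sensitivity:

EV100 = log2(N² / t) − log2(S / 100)

At ISO 100 this reduces to EV100 = log2(N² / t). For example, f/16 at 1/125 s and ISO 100 gives log2(256 × 125) ≈ 15, squarely within the sunny-exterior range mentioned above.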
To match a reference photograph, note its aperture, shutter speed, and ISO, then calculate the exposure value using the formula above. If you use the same
value in the Exposure override (see below), the rendered image should fall in line
with the real-world image exposure.
In this way, you can use digital photos as references when lighting your level.
While your goal isn’t necessarily to recreate the image perfectly, matching an
actual photograph can take the guesswork out of your lighting setups.
Exposure override
In the Mode dropdown, you can select one of the following: Fixed, Automatic,
Automatic Histogram, Curve Mapping, and Physical Camera.
Compensation allows you to shift or adjust the exposure. You would typically
use this to apply minor adjustments and “stop” the rendered image up
and down slightly.
Fixed mode
Fixed mode is the simplest option: you set a single, constant exposure value by hand. It works well when the lighting in the area covered by a Volume doesn't change much.
Automatic mode
While Automatic mode will work under many lighting situations, it can also
unintentionally overexpose or underexpose the image when pointing the camera
at a very dark or very bright part of the scene.
Use the Limit Min and Limit Max to keep the exposure level within a desirable
range. Playtest to verify that your limits stay within your expected exposure
throughout the level.
Metering Mode, combined with mask options, determines what part of the
frame to use for autoexposure.
The Adaptation mode controls how the autoexposure changes as the camera
transitions between darkness and light, with options to adjust the speed. Just
like with the eye, moving the camera from a very dark to a very light area, or
vice versa, can be briefly disorienting.
Automatic, Automatic Histogram, and Curve Mapping modes use Metering mode
to control what part of the frame to use when calculating exposure. You can set
the Metering mode to:
— Spot: The camera only uses the center of the screen to measure exposure.
— Center Weighted: The camera favors pixels in the center of the image and
feathers out toward the edges of frame.
Automatic Histogram
Automatic Histogram mode takes Automatic mode a step further. This computes
a histogram for the image and ignores the darkest and lightest pixels when
setting exposure.
Curve Remapping lets you remap the exposure curve as well (see Curve
Mapping below).
Automatic Histogram mode uses pixels from the middle of the histogram to calculate exposure.
Curve Mapping
Curve Mapping mode remaps the scene's current exposure along a custom curve, giving you precise control over how dark and bright areas translate to final exposure values.
Physical Camera
Those familiar with photography may find Physical Camera mode helpful for
setting camera parameters.
Switch the Exposure override’s Mode to Physical Camera, then locate the
Main Camera.
Important to exposure are the ISO (sensitivity), Aperture (or f-number), and
Shutter Speed. If you are matching reference photos, copy the correct settings
from the image’s Exif data. Otherwise, this table can help you guesstimate
Exposure Value based on f-number and shutter speed.
Though not related to exposure, other Physical Camera properties can help you
match the attributes of real-world cameras.
For example, we normally use Field of View in Unity (and many other 3D
applications) to determine how much of the world a camera can see at once.
In real cameras, however, the field of view depends on the size of the sensor
and focal length of the lens. Rather than setting the field of view directly, the
Physical Camera settings allow you to fill in the Sensor Type, Sensor Size,
and Focal Length from the actual camera data. Unity will then automatically
calculate the corresponding Field of View value.
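As a sketch of that relationship, the script below sets assumed full-frame values (36 × 24 mm sensor, 50 mm focal length) on a physical camera and reads back the field of view Unity derives, roughly 2 × atan(sensor height / (2 × focal length)).

```csharp
using UnityEngine;

public class PhysicalCameraSetup : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.usePhysicalProperties = true;

        // Assumed example values: full-frame sensor with a 50 mm lens.
        cam.sensorSize = new Vector2(36f, 24f); // millimeters
        cam.focalLength = 50f;                  // millimeters

        // Unity derives the vertical field of view: 2 * atan(24 / (2 * 50)) ≈ 27 degrees.
        Debug.Log($"Computed field of view: {cam.fieldOfView:F1} degrees");
    }
}
```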
Use the physical camera parameters to reshape the aperture. The five-bladed iris can show a pentagonal (left)
or circular (right) bokeh.
Lights
HDRP includes a number of different Light types and shapes to help you control
illumination in your scene.
Light types
These Light types are available, similar to the other render pipelines in Unity:
— Directional: This behaves like light from an infinitely distant source, with
perfectly parallel light rays that don’t diminish in intensity. Directional lights
often stand in for sunlight. In an exterior scene, this will often be your
key light.
— Point: This emits light in all directions from a single point in space, like a bare lightbulb, with intensity diminishing over distance.
— Spot: This is similar to a real-world spotlight, which can take the shape of
a cone, pyramid, or box. A spot falls off along the forward z-axis, as well as
toward the edges of the cone/pyramid shape.
— Area: This projects light from the surface of a specific shape (a rectangle,
tube, or disc). An area light functions like a broad light source with a
uniform intensity in the center, like a window or fluorescent tube.
Modify how the spot, point, and area lights fall off with the Range. Many HDRP
Lights diminish using the inverse square law, like light sources in the real world.
Spot and area lights have additional shapes for controlling how each light
falls off.
— Cone: Projects light from a single point to a circular base. Adjust the Outer
Angle (degrees) and Inner Angle (percentage) to shape the cone and
modify its angular attenuation.
— Pyramid: Projects light from a single point onto a square base. Adjust the
pyramid shape with the Spot Angle and Aspect Ratio.
— Tube: Projects light from a single line in every direction, out to a defined
Range. This light only works in Realtime Mode.
— Disc: Projects light from a disc shape in the local positive Z direction,
out to a defined Range. This light only works in Baked Mode.
All HDRP Light types have Emission properties that define the light’s appearance.
You can switch the Light Appearance to Color and specify an RGB color. Otherwise,
change this to Filter and Temperature for more physically accurate input.
Color temperature sets the color based on degrees Kelvin. See the Lighting and
Exposure Cheat Sheet for reference.
You can also add another color that acts like a Filter, tinting the light with another
hue. This is similar to adding a color gel in photography.
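The same Temperature-plus-Filter setup can be applied from a script using Unity's standard Light API; 6500 K and the warm filter color below are example values, not values from this guide.

```csharp
using UnityEngine;

public class DaylightKeyLight : MonoBehaviour
{
    void Start()
    {
        var keyLight = GetComponent<Light>();

        // Physically based color input: a temperature in Kelvin plus a filter tint.
        keyLight.useColorTemperature = true;
        keyLight.colorTemperature = 6500f;           // approximate daylight
        keyLight.color = new Color(1f, 0.96f, 0.9f); // acts as the Filter, like a warm gel
    }
}
```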
Additional properties
HDRP also includes some advanced controls under the More Items menu (⋮) at
the top right of the Inspector properties. Select Show Additional Properties to
see extra options.
These include toggles for Affect Diffuse and Affect Specular. In cutscene or
cinematic lighting, for example, you can separate Lights that control the bright
shiny highlights independently from those that produce softer diffuse light.
Light Layers
HDRP allows you to use Light Layers to make Lights only affect specific meshes
in your Scene. These are LayerMasks that you can associate with a Light
component and MeshRenderer.
In the Light properties, select Show Additional Properties from the More
Items menu (☰). This displays the Light Layer dropdown under General.
Choose what LayerMasks you want to associate with the Light.
For example, if you wanted to prevent the lights inside of a building from
accidentally penetrating the walls to the outside, you could set up specific Light
Layers for the interior and exterior. This ensures that you have fine-level control
of your Light setups.
Set the Rendering Layer Mask so that only specific Lights affect the mesh.
To set up your Light Layers, go to the HDRP Default Settings. The Layer Names section lets you set the string names for Light Layers 0 through 7.
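You can assign the masks from a script as well. Renderer.renderingLayerMask is standard Unity API; the light-side property used below (HDAdditionalLightData.lightlayersMask) is an assumption based on the HDRP scripting API, so check the exact name in your HDRP version.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

public class InteriorLightLayerSetup : MonoBehaviour
{
    const uint InteriorLayer = 1u << 1; // e.g. Light Layer 1, named "Interior" in Global Settings

    [SerializeField] Light interiorLight;
    [SerializeField] Renderer interiorWall;

    void Start()
    {
        // The mesh only receives light from layers included in its Rendering Layer Mask.
        interiorWall.renderingLayerMask = InteriorLayer;

        // Assumed HDRP property name: restrict the light to the same layer.
        var hdLight = interiorLight.GetComponent<HDAdditionalLightData>();
        hdLight.lightlayersMask = (LightLayerEnum)InteriorLayer;
    }
}
```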
For more information, including the full list of Light properties, see the
Light component documentation.
Light Anchors
Unity 2021 LTS and above includes a Light Anchor system to help you set up
lights quickly, by controlling the angle and distance between the camera and
subject. It also enables you to select common lighting angles via nine presets.
First, make sure your Camera is tagged as MainCamera for the Light Anchor
to work. Then, add a Light Anchor component on the Spot Light you want to
control. Align the light on the subject; its position is now the anchor point of the
Spot Light. Increase the Distance between the anchor point and the Spot Light.
You can now adjust the position of the light around the anchor point by tuning
the Orbit, Elevation, and Roll of the light within the Game View, rather than
having to manually adjust the Transform of the light in the Scene view.
Aim your lights with Light Anchors instead of changing their transforms directly.
Units
Physical Light Units can include units of both luminous flux and illuminance.
Luminous flux represents the total amount of light emitted from a source, while
illuminance refers to the total amount of light received by an object (often in
luminous flux per unit of area).
— Candela: The unit of luminous intensity. One candela is roughly equivalent to the light of a single wax candle, which is why it is also commonly called candlepower.
— Lumen: The SI unit of luminous flux, defined as 1 candela emitted over a solid angle of 1 steradian. You will commonly see lumens on commercial lightbulb specifications. Use it with Unity spot, point, or area lights.
— Lux: A light source that emits 1 lumen onto an area of 1 square meter has
an illuminance of 1 lux. Real-world light meters commonly read lux, and
you will often use this unit with directional lights in Unity.
For recreating a real lighting source, switch to the unit listed on the tech specs
and plug in the correct luminous flux or luminance. HDRP will match the Physical
Lighting Units, eliminating much of the guesswork when setting intensities.
Click the icon to choose presets for Exterior, Interior, Decorative, and Candle.
These settings provide a good starting point if you are not explicitly matching a
specific value.
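Physical units can also be set from a script. The SetIntensity(value, unit) call below comes from the HDRP scripting API for HDAdditionalLightData; treat the exact signature as an assumption and verify it for your HDRP version. 800 lumens is roughly a 60 W household bulb.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

public class HouseholdBulb : MonoBehaviour
{
    void Start()
    {
        // Assumed HDRP API: set a point light to 800 lumens.
        var hdLight = GetComponent<HDAdditionalLightData>();
        hdLight.SetIntensity(800f, LightUnit.Lumen);
    }
}
```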
The following cheat sheet contains the color temperature values and light
intensities of common real-world light sources. It also contains exposure values for
different lighting scenarios.
You can find a complete table of common illumination values in the Physical Light
Units documentation.
Guidance for lighting and exposure levels
Make your point, spot, and area lights more closely mimic the falloff of real lights
using an IES profile. This works like a light cookie to apply a specific manufacturer’s
specs to a pattern of light. IES profiles can give your lights an extra boost of realism.
Import an IES profile from Assets > Import New Asset. The importer will
automatically create a Light Prefab with the correct intensity. Then just drag the
Prefab into the Scene view or Hierarchy and tweak its color temperature.
Real-world manufacturers
— Philips
— Lithonia Lighting
— Efficient Lighting Systems
— Atlas
— Erco
— Lamp
— Osram
Artist sources
IES Profile and Import Settings — Renderman
Environment lighting
In the real world, light reflects and scatters around us. The sky and ground
contribute to environment lighting as random photons bounce between the
atmosphere and earth and ultimately arrive at the observer.
In HDRP, you can use the Visual Environment override to define the sky and
general ambience for a scene.
Use Ambient Mode: Dynamic to set the sky lighting to the current override
that appears in the Visual Environment’s Sky > Type. Otherwise, Ambient Mode:
Static defaults to the sky setup in the Lighting window’s Environment tab.
Environment lighting only – with the sun directional light disabled, the sky still provides ambient light. Direct sunlight combined with the environment lighting.
Adding the key light of the sun completes the general illumination of the scene.
The environment light helps fill in the shadow areas so that they don’t appear
unnaturally dark.
HDRP includes three different techniques for generating skies. Set the Type
to either HDRI Sky, Gradient Sky, or Physically Based Sky. Then, add the
appropriate override from the Sky menu.
Applying a Visual Environment sky is similar to wrapping the entire virtual world
with a giant illuminated sphere. The colored polygons of the sphere provide a
general light from the sky, horizon, and ground.
HDRI Sky
HDRI Sky allows you to represent the sky with a cubemap made from high-
dynamic range photographs. You can find numerous free and low-cost sources
of HDRIs online. A good starting point is the Unity HDRI Pack in the Asset Store.
Because the sky is a source of illumination, specify the Intensity Mode, then
choose a corresponding Exposure/Multiplier/Lux value to control the strength
of the environmental lighting. Refer to the Lighting and Exposure Cheat Sheet
above for example intensity and exposure values.
You can animate your HDRI Sky by distorting the HDRI map either procedurally
or with a flow map. This allows you to fake a wind effect on a static HDRI or to
create more specific VFX.
Gradient Sky
A Gradient Sky generates a simple sky from three colors: Top, Middle, and Bottom. Blend the color ramp with Gradient Diffusion, and dial in the Intensity to set the strength of the lighting.
The Top, Middle, and Bottom colors blend into a Gradient Sky.
Color tip
Choose the ground color according to the average color of your actual ground
(e.g., terrain) where your objects won't be affected by reflection probes.
Global fog
Fog obscures objects as they recede from the camera, adding atmospheric depth to a scene. Its opacity depends on the object's distance from the camera. Fog can also hide the camera's far clipping plane, blending your distant geometry back into the scene.
HDRP implements global fog as a Fog override. Here, the fog fades exponentially
with its distance from the camera and its world space height.
Set up the Fog override on a Volume in your Scene. The Base Height determines
a boundary where constant, thicker fog begins to thin out traveling upward.
Above this value, the fog density continues fading exponentially, until it reaches
the Maximum Height.
Likewise, the Fog Attenuation Distance and Max Fog Distance control how fog
fades with greater distance from the camera. Toggle the Color Mode between a
Constant Color or the existing Sky Color.
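Like any other override, the Fog settings can be driven from a script through a Volume Profile. The parameter names below (meanFreePath for Fog Attenuation Distance, baseHeight, maximumHeight) follow the HDRP Fog override, but verify them against your HDRP version.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class GlobalFogSetup : MonoBehaviour
{
    [SerializeField] Volume globalVolume;

    void Start()
    {
        if (globalVolume.profile.TryGet<Fog>(out var fog))
        {
            fog.enabled.Override(true);
            fog.meanFreePath.Override(150f);  // Fog Attenuation Distance, in meters
            fog.baseHeight.Override(0f);      // constant fog below this world-space height
            fog.maximumHeight.Override(50f);  // thins out exponentially up to here
        }
    }
}
```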
Volumetric Lighting
Each Light component (except area lights) has a Volumetrics group. Check Enable,
then set the Multiplier and Shadow Dimmer. A Real-time or Mixed Mode light will
produce ’god rays’ within Volumetric Fog. The Multiplier dials the intensity, while the
Shadow Dimmer controls how shadow casting surfaces cut into the light.
If you want more detailed fog effects than the Fog override can provide, HDRP
additionally offers Local Volumetric Fog (called a Density Volume component
before HDRP 12).
This is a separate component, outside the Volume system. Create a Local Volumetric
Fog GameObject from the menu (GameObject > Rendering > Local Volumetric Fog)
or right-click over the Hierarchy (Rendering > Local Volumetric Fog).
This generates a fog-filled bounding box. Adjust the size, axis control, and
blending/fading options.
Add some Scroll Speed for animation and adjust the Tiling. Your Volumetric Fog
then can gently roll through the scene.
In HDRP versions 12 and above, you can enable local volumetric resolutions up
to 256x256x256 in your HDRP Pipeline Asset, allowing for more precise, large-
scale effects.
Clouds
Skies would not look complete without clouds. In HDRP 12 and above, the
Cloud Layer system can generate natural-looking clouds to complement your
Sky and Visual Environment overrides. Volumetric Clouds produce realistic
clouds with an actual thickness that react to the lighting and wind.
Cloud Layer
The Cloud Layer is a 2D texture that you can animate with a flowmap, which
uses the red and green channels to control vector displacement. The clouds
can add some slight motion to your skies in Play mode, making the background
more dynamic. The Cloud Layer sits in front of the sky, with an option to cast
shadows on the ground.
The cloud map itself is a texture using a cylindrical projection, where the RGBA
channels all contain different cloud textures (cumulus, stratus, cirrus, and
wispy clouds, respectively). You can then use the Cloud Layer controls to blend
each channel and build up your desired cloud formation. Two Layers with four
channels lets you simulate and blend from up to eight cloudscapes.
Volumetric Clouds
If you need clouds to interact with light, use Volumetric Clouds. These can render shadows, receive fog, and create volumetric shafts of light. Combine these with Cloud Layer clouds or add them separately. To enable them:
— In your HDRP Asset: enable Lighting > Volumetric Clouds > Volumetric Clouds.
— In the Frame Settings (HDRP Global Settings or the Camera's Custom Frame Settings): enable Volumetric Clouds under Lighting.
— On a Volume in your Scene: add the Volumetric Clouds override and enable it.
The Advanced and Manual Cloud Control options allow you to define maps for
each type of cloud.
Refer to the Clouds in HDRP documentation for the Cloud Layer and
Volumetric Clouds overrides. Please see New Lighting Features in Unity 2021.2
for an in-depth look at Volumetric Clouds.
Shadows
Shadow maps
Like the other render pipelines in Unity, HDRP renders shadows using shadow maps: depth textures captured from each Light's point of view that the pipeline samples to decide whether a pixel is lit or in shadow. The resolution of those maps limits how sharp shadows can be.
Shadow Cascades
For a directional light, the shadow map covers a large portion of the scene, which can lead to a problem called perspective aliasing: shadow map pixels close to the camera look jagged and blocky compared to those farther away. Shadow Cascades address this by splitting the camera frustum into several zones, each with its own map, so shadows near the camera receive more shadow map resolution.
Perspective aliasing with blocky shadows; Shadow Cascades reduce perspective aliasing.
HDRP gives you extra control over your Shadow Cascades with the Shadows
override. Use the cascade settings per Volume to fine-tune where each cascade
begins and ends.
Shadows override
Contact Shadows
Shadow maps often fail to capture the small details, especially at discernible
edges where two mesh surfaces connect. HDRP can generate these Contact
Shadows using the Contact Shadows override.
Contact Shadows are a screen space effect and rely on information within the
frame in order to calculate. Objects outside of the frame do not contribute to
Contact Shadows. Use them for shadow details with a small onscreen footprint.
Make sure that you enable Contact Shadows in the Frame Settings. You can also
adjust the Sample Count in the Pipeline Asset under Lighting Quality Settings.
Adjust the settings for the shadow thickness, quality, and fade.
This feature was improved in 2021 LTS and newer versions to work better with Terrain and SpeedTree. Read more on the blog.
Micro shadows
HDRP can extend even smaller shadow details into your Materials. Micro
shadows use the normal map and ambient occlusion map to render really fine
surface shadows without using the mesh geometry itself.
Simply add the Micro shadows override to a Volume in your Scene and adjust
the Opacity. Micro shadows only work with directional lights.
Reflections
HDRP can render reflections using several techniques:
— Screen Space Reflections
— Reflection Probes
— Sky reflections
Each reflection type can be resource intensive, so select the method that works
best depending on your use case. If more than one reflection technique applies
to a pixel, HDRP blends the contribution of each reflection type. Bounding
surfaces called Influence Volumes partition the 3D space to determine what
objects receive the reflections.
Screen Space Reflections use the depth and color buffer to calculate reflections.
Thus they can only reflect objects currently in camera view and may not render
properly at certain positions on screen. Glossy floorings and wet planar surfaces
are good candidates for receiving Screen Space Reflections.
Screen Space Reflections ignore all objects outside of the frame, which can be a
limitation to the effect.
Make sure you have Screen Space Reflection enabled in the Frame Settings
(HDRP Default Settings or the Camera’s Custom Frame Settings) under
Lighting. Then, add the Screen Space Reflection override to your Volume object.
Material surfaces must exceed the Minimum Smoothness value to show Screen
Space Reflections. Lower this value if you want rougher materials to show the
SSR, but be aware that a lower Minimum Smoothness threshold can add to the
computation cost. If Screen Space Reflection fails to affect a pixel, then HDRP
falls back to using Reflection Probes.
Use the Quality dropdown to select a preset number of Max Ray Steps. Higher
Max Ray Steps increase quality but come with a cost. As with all effects,
balance performance with visual quality.
Note: In HDRP versions 12 and above, you can choose between a physically
based algorithm using accumulation (new) or a less accurate approximate
algorithm (default).
Reflection Probes
A Reflection Probe captures a cubemap of its surroundings from a single point, and nearby materials sample that cubemap as a reflection source. Each Scene can have several probes and blend the results. Localized reflections then can change as your camera moves through the environment.
— Baked probes capture the cubemap once in the Editor, so they only reflect static objects, but they are inexpensive at runtime.
— Real-time probes create the cubemap at runtime in the Player rather than in the Editor. This means that reflections are not limited to static objects, but be aware that real-time updates can be resource intensive.
A Planar Reflection Probe allows you to recreate a flat, reflective surface, taking
surface smoothness into account. This is perfect for a shiny mirror or floor.
The probe then stores the resulting mirror image in a 2D RenderTexture. Drawn
to the boundaries of the rectangular probe, this creates a planar reflection.
A Planar Reflection Probe captures a mirror image by reflecting a camera through a plane.
A Reflection Probe shows the surrounding room whereas a sky reflection reflects the Gradient Sky.
Reflection hierarchy
When one technique does not fully determine the reflection at a pixel, HDRP falls
back to the next technique. In other words, Screen Space Reflection falls back
to the Reflection Probes, which in turn fall back to sky reflections.
It’s important to set up your Influence Volumes for your Reflection Probes
properly. Otherwise, you could experience light leaking from unwanted
sky reflections.
For more details about determining Reflection Hierarchy, see the Reflection in
the HDRP documentation page.
Because the capture point of a Reflection Probe is fixed and rarely matches
the Camera position near the Reflection Probe, there may be a noticeable
perspective shift in the resulting reflection. As a result, the reflection might
not look connected to the environment.
A Reflection Proxy Volume helps you partially correct this. It reprojects the
reflections more accurately within the proxy Volume, based on the
Camera position.
A Reflection Proxy Volume reprojects the cubemap to match the room’s world space position.
Screen Space Global Illumination (SSGI) uses the depth and color buffer of the
screen to calculate bounced, diffuse light. Much like how lightmapping can
bake indirect lighting into the surfaces of your static level geometry, SSGI more
accurately simulates how photons can strike surfaces and transfer color and
shading as they bounce.
In Room 2 of the Sample Scene, we can see the green of the moving tree leaves
transfer through bounced light onto the wall with SSGI enabled.
Screen Space Global Illumination captures the bounced light from the foliage in real time.
SSGI is enabled under the Frame Settings under Lighting, and it must be
enabled in the Pipeline Asset’s Lighting section as well.
Note: We recommend that you use Unity 2021.2 or above with Screen Space
Global Illumination. HDRP 12 includes significant improvements to SSGI quality.
The Screen Space Refraction override helps to simulate how light behaves when
passing through a denser medium than air. HDRP’s Screen Space Refraction
uses the depth and color buffer to calculate refraction through a transparent
material like glass.
To enable this effect through the HDRP/Lit shader, make sure your material has a Surface Type of Transparent. Then choose a Refraction Model and an Index of Refraction under Transparency Inputs. Use the Sphere Refraction Model for solid objects. Choose Thin (like a bubble) or Box (with some slight thickness) for hollow objects.
Transparency Inputs to control refraction
HDRP post-processing uses the Volume system to apply the image effects to
the camera. Once you know how to add overrides, the process of applying more
post effects should already be familiar.
Many of these post-processing overrides for controlling color and contrast may
overlap in functionality. Finding the right combinations of them may require
some trial and error.
You won’t need every effect available. Just add the overrides necessary to
create your desired look and ignore the rest.
Post-processing overrides
Tonemapping
Tonemapping remaps the wide range of HDR values the pipeline produces into a range your display can show. HDRP offers Neutral, ACES, and Custom modes.
Shadows, Midtones, Highlights
The Shadows, Midtones, Highlights override separately controls the tonal and
color range for shadows, midtones, and highlights of the render. Activate each
trackball to affect the respective part of the image. Then, use the Shadow and
Highlight Limits to prevent clipping or pushing the color correction too far.
Bloom creates the effect of light bleeding around the light source. This conveys
the impression that the light source is intensely bright and overwhelming
the camera.
Bloom override
Adjust the Intensity and Scatter to adjust the Bloom’s size and brightness.
Lens Dirt applies a texture of smudges or dust to diffract the Bloom effect.
Use the Threshold to keep sharpness on non-bright pixels.
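Like the other overrides, Bloom can be driven from a script through the Volume profile, for example to pulse it during a specific sequence. The parameter name (intensity) mirrors the Inspector field; verify it against your HDRP version.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class BloomPulse : MonoBehaviour
{
    [SerializeField] Volume postVolume;
    Bloom bloom;

    void Start()
    {
        postVolume.profile.TryGet(out bloom);
    }

    void Update()
    {
        if (bloom == null) return;
        // Gently pulse the bloom intensity, e.g. for a dream sequence.
        bloom.intensity.Override(0.3f + 0.1f * Mathf.Sin(Time.time));
    }
}
```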
Depth of Field simulates the focus properties of a real camera lens. Objects nearer or farther than the camera's focus distance appear to blur. You can control the focus in several ways:
— The Volume Override with the Manual Ranges Focus Mode: Here the
Volume itself controls the focus distance. For example, you could use
this to make your camera intentionally blurry based on location (e.g.,
underwater scene).
— Using a Cinemachine camera with the Volume Settings Extension: This lets
you follow and autofocus on a target.
— The Physical Camera properties using Physical Camera Focus Mode: This
allows you to animate the Focus Distance parameter from the
Camera component.
When Depth of Field is active, an out-of-focus blur effect called a bokeh can
appear around a bright area of the image. Modify the camera aperture’s shape
to change the appearance of the bokeh (see Additional Physical Camera
Parameters above).
For cinematics or offline rendering, you can try choosing a more expensive but
Physically Based depth of field by enabling Additional Settings
and Custom Quality.
White Balance
The White Balance override adjusts a Scene’s color so that the color white is
correctly rendered in the final image. You could also push the Temperature to
shift between yellow (warmer) and blue (cooler). Tint adjusts the color cast
between green and magenta.
White Balance
Color Curves
Color Curves give you fine control over grading: each curve remaps a specific range of hue, saturation, or luminance in the final image.
Color Adjustments
Use this effect to tweak the overall tone, brightness, hue, and contrast of the
final rendered image.
Color Adjustments
The Channel Mixer lets one color channel impact the “mix” of another. Select
an RGB output, then adjust the influence of one of the inputs. For example,
dialing up the Green influence in the Red Output Channel will tint all green
areas of the image with a reddish hue.
Channel Mixer
Lens Distortion
Lens Distortion simulates radial patterns that arise from imperfections in the
manufacture of real-world lenses. This results in straight lines appearing slightly
bowed or bent, especially with zoom or wide-angle lenses.
Vignette imitates an effect from practical photography, where the corners of the
image darken and/or desaturate. This can occur with wide-angle lenses or result
from an appliance (a lens hood or stacked filter rings) blocking the light.
This effect can also be used to draw the attention of the viewer to the center
of the screen.
Motion Blur
Real-world objects appear to streak or blur in a resulting image when they move
faster than the camera exposure time. The Motion Blur override simulates
that effect.
To minimize performance cost, reduce the Sample Count, increase the Minimum
Velocity, and decrease the Maximum Velocity. You can also reduce the Camera
Clamp Mode parameters under the Additional Properties.
Lens Flare
A lens flare is an artifact that appears when bright light is shining onto a
camera lens. A lens flare can appear as one bright glare or as numerous colored
polygonal flares matching the shape of the camera's aperture. In real life, flares are usually unwanted, but they can serve a narrative or artistic purpose. For
example, a strong lens flare can be used to grab the player’s attention or change
the mood of a setting or scene. Lens flares are essentially a post-processing
effect since they are rendered in the later stages of the rendering process.
The appearance of the flare is defined in the Flare asset, but to render the effect, you need to add a Lens Flare (SRP) component to an object in the Scene, for instance, a light source or another object generating the flare. The component controls the effect's general intensity, scale, and occlusion parameters. It also has an option to keep running off-screen, so a flare whose source sits outside the camera view, such as a static light, can still project visible effects into the scene.
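A flare can also be attached from a script. The component type (LensFlareComponentSRP) and field names below come from the SRP core API, but treat them as assumptions and confirm them in your Unity version.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class HeadlightFlare : MonoBehaviour
{
    [SerializeField] LensFlareDataSRP flareData; // a Flare asset, e.g. from the package samples

    void Start()
    {
        // Attach the flare to this GameObject (a light source or any flare-emitting object).
        var flare = gameObject.AddComponent<LensFlareComponentSRP>();
        flare.lensFlareData = flareData;
        flare.intensity = 1f;
        flare.scale = 0.5f;
        flare.allowOffScreen = false; // assumed field name; enable for static off-screen sources
    }
}
```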
The best way to get familiar with Lens flares is by installing the samples from
the Package Manager. This will add a set of predefined Flare assets and modify the HDRP Sample Scene to include lens flare effects. It also contains a test scene to
browse lens flares and help you build your own.
You can find Lens Flare samples in the Package Manager under High Definition RP
The Lens Flare samples come with presets ranging from realistic lens effects to more fictional ones.
Lens Flares are made up of Lens Flare Elements, where each element represents one of the artifacts the flare produces. An element's shape can be a polygon, a circle, or a custom image.
Dynamic resolution
In Unity 2021 LTS, HDRP offers multiple upscaling choices built on some of the latest super-sampling technologies.
NVIDIA DLSS (Deep Learning Super Sampling) is natively supported for HDRP
in Unity 2021 LTS. NVIDIA DLSS uses advanced AI rendering to produce image
quality that’s comparable to native resolution while only conventionally rendering
a fraction of the pixels.
With real-time ray tracing and NVIDIA DLSS, you can create beautiful worlds
running at higher frame rates and resolutions on NVIDIA RTX GPUs. DLSS also
provides a substantial performance boost for traditional rasterized graphics.
Learn more about it on NVIDIA’s Unity developer page, blog post
and documentation.
Read this blog to learn more about the inner workings of the DLSS technology that allows games, such as Naraka: Bladepoint by 24 Entertainment, to run at 4K.
AMD FidelityFX™ Super Resolution is now available in Unity 2021 LTS, which
includes built-in FSR support for HDRP and URP. You can use FSR by enabling
dynamic resolution in HDRP assets and cameras, and then selecting FidelityFX
Super Resolution 1.0 under the Upscale filter option.
You can either force the scaling in the HDRP Asset or code your logic to adjust
the scale dynamically.
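"Code your logic" here means registering a scaler callback with the dynamic resolution system. The sketch below uses DynamicResolutionHandler.SetDynamicResScaler from the SRP core API; the frame-time heuristic is just an assumed example, not a recommended policy.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class DynamicResolutionDriver : MonoBehaviour
{
    float currentScale = 100f; // screen percentage

    void Start()
    {
        // Register a callback that HDRP queries every frame for the screen percentage.
        DynamicResolutionHandler.SetDynamicResScaler(
            ComputeScale,
            DynamicResScalePolicyType.ReturnsPercentage);
    }

    float ComputeScale()
    {
        // Assumed, naive heuristic: drop resolution when the frame takes longer than ~16.7 ms.
        float frameMs = Time.unscaledDeltaTime * 1000f;
        currentScale = frameMs > 16.7f
            ? Mathf.Max(60f, currentScale - 5f)
            : Mathf.Min(100f, currentScale + 1f);
        return currentScale;
    }
}
```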
To set up dynamic resolution in your project and get guidance on how to choose
the best algorithm for your needs, please read this page from
the documentation.
Rendering Debugger
The Rendering Debugger window (Window > Analysis > Rendering Debugger)
contains debugging and visualization tools specific to the Scriptable Render
Pipeline. The left side is organized by category. Each panel allows you to isolate
issues with lighting, materials, volumes, cameras, and so on.
Rendering Debugger
The Debugger can help you troubleshoot a specific rendering pass. On the
Lighting panel, you can enter Fullscreen Debug Mode and choose features
to debug.
These Debug modes let you play “pixel detective” and identify the source of
a specific lighting or shading issue. The panels on the left can show you vital
statistics from your cameras, Materials, Volumes, and so on, to help optimize
your render.
Lighting Debug Mode or Fullscreen Debug Mode can help you understand the sources of illumination in your scene.
You can also debug several common material properties. On the Material screen,
select from the Common Material Properties: albedo, normal, smoothness,
specular, and so on.
Ray tracing
Ray tracing in HDRP is a hybrid system that still relies on rasterized rendering as
a fallback and includes preview support for ray tracing with a subset of select
GPU hardware and the DirectX 12 API. See Getting started with ray tracing for a
specific list of system requirements.
Setup
In order to enable ray tracing (in Preview), you need to change the default
graphics API of your HDRP project to DirectX 12.
Open the HDRP Wizard (Window > Rendering > HDRP Wizard in Unity 2021; in earlier versions, Window > Render Pipeline > HD Render Pipeline Wizard). Click Fix All in the HDRP + DXR tab; you will be asked to restart the Editor.
Once you enable ray tracing for your project, check that your HDRP Global or
Camera Frame Settings also has ray tracing activated. Make sure you are using
a compatible 64-bit architecture in your Build Settings and validate your scene
objects from Edit > Rendering > Check Scene Content for HDRP Ray Tracing.
Overrides
Ray tracing adds some new Volume overrides and enhances many of the existing ones in HDRP, such as ambient occlusion, reflections, and global illumination.
Use the Quality setting for complex interior environments that benefit
from multiple bounces and samples. Performance mode (limited to one
sample and one bounce) works well for exteriors, where the lighting
mostly comes from the primary directional light.
Ray tracing can produce natural-looking shadows that soften as their distance
from the caster increases, like in real life.
Ray-traced shadows soften as they fall farther from the caster to achieve a different effect than with shadow mapping.
Directional, Point, and Spot Lights can also generate semi-transparent shadows.
Watch Activate ray tracing with HDRP for a walkthrough of the High Definition
Render Pipeline’s ray-tracing features in Preview. See the ray tracing
documentation on the HDRP microsite for more information.
More resources
— If you’re migrating from the Built-In Render Pipeline, see this chart for a detailed feature comparison between the two pipelines.
— Delve into HDRP settings for enhanced performance with this blog post.
Remember that the 3D Sample Project is available from the Unity Hub when
you want to explore further. If you have questions or feedback on this guide, please let us know in this forum thread.
Be sure to check out the additional resources listed below, and you can always
find tips on the Unity Blog or HDRP community forum.
At Unity, we want to empower artists and developers with the best tools to
build real-time content. Lighting is both an art and a science – and where
these meet, it’s a little bit like magic.