Ray tracing (graphics)
Ray tracing is capable of simulating a variety of optical effects,[3] such as reflection, refraction, soft
shadows, scattering, depth of field, motion blur, caustics, ambient occlusion and dispersion phenomena
(such as chromatic aberration). It can also be used to trace the path of sound waves in a similar fashion to
light waves, making it a viable option for more immersive sound design in video games by rendering
realistic reverberation and echoes.[4] In fact, any physical wave or particle phenomenon with approximately
linear motion can be simulated with ray tracing.
Ray tracing-based rendering techniques that involve sampling light over a domain generate image noise
artifacts that can be addressed by tracing a very large number of rays or using denoising techniques.
History
The idea of ray tracing comes from as early as the 16th century when it was described by Albrecht Dürer,
who is credited with its invention.[5] Dürer described multiple techniques for projecting 3-D scenes onto an
image plane. Some of these project chosen geometry onto the image plane, as is done with rasterization
today. Others determine what geometry is visible along a given ray, as is done with ray tracing. [6][7]
Using a computer for ray tracing to generate shaded pictures was first accomplished by Arthur Appel in
1968.[8] Appel used ray tracing for primary visibility (determining the closest surface to the camera at each
image point) by tracing a ray through each point to be shaded into the scene to identify the visible surface.
The closest surface intersected by the ray was the visible one. This
non-recursive ray tracing-based rendering algorithm is today called
"ray casting". His algorithm then traced secondary rays to the light
source from each point being shaded to determine whether the point
was in shadow or not.

[Image: "Draughtsman Making a Perspective Drawing of a Reclining Woman" by Albrecht Dürer, possibly from 1532, shows a man using a grid layout to create an image. The German Renaissance artist is credited with first describing the technique.]

Later, in 1971, Goldstein and Nagel of MAGI (Mathematical Applications Group, Inc.)[9] published "3-D Visual Simulation", wherein ray tracing was used to make shaded pictures of solids. At the ray-surface intersection point found, they computed the surface normal and, knowing the position of the light source, computed the brightness of the pixel on the screen. Their publication describes a short (30 second) film “made using the University of Maryland’s display hardware outfitted with a 16mm camera. The film showed the helicopter and a simple ground level gun emplacement. The helicopter was programmed to undergo a series of maneuvers including turns, take-offs, and landings, etc., until it eventually is shot down and crashed.” A CDC 6600 computer was used. MAGI produced an animation video called MAGI/SynthaVision Sampler in 1974.[10]

[Image: Dürer woodcut of Jacob de Keyser's invention. With de Keyser's device, the artist's viewpoint was fixed by an eye hook inserted in the wall. This was joined by a silk string to a gun-sight style instrument, with a pointed vertical element at the front and a peephole at the back. The artist aimed at the object and traced its outline on the glass, keeping the eyepiece aligned with the string to maintain the correct angle of vision.]

Another early instance of ray casting came in 1976, when Scott Roth created a flip book animation in Bob Sproull's computer graphics course at Caltech. The scanned pages are shown as a video in the accompanying image. Roth's computer program noted an edge point at a pixel location if the ray intersected a bounded plane different from that of its neighbors. Of course, a ray could intersect multiple planes in space, but only the surface point closest to the camera was noted as visible. The platform was a DEC PDP-10, a Tektronix storage-tube display, and a printer which would create an image of the display on rolling thermal paper. Roth extended the framework, introduced the term ray casting in the context of computer graphics and solid modeling, and in 1982 published his work while at GM Research Labs.[11]

[Image: Flip book created in 1976 at Caltech]
Turner Whitted was the first to show recursive ray tracing for mirror
reflection and for refraction through translucent objects, with an
angle determined by the solid's index of refraction, and to use ray
tracing for anti-aliasing.[12] Whitted also showed ray traced shadows.
He produced a recursive ray-traced film called The Compleat
Angler[13] in 1979 while an engineer at Bell Labs. Whitted's deeply recursive ray tracing algorithm reframed
rendering from being primarily a matter of surface visibility determination to being a matter of light
transport. His paper inspired a series of subsequent work by others that included distribution ray tracing and
finally unbiased path tracing, which provides the rendering equation framework that has allowed computer
generated imagery to be faithful to reality.
For decades, global illumination in major films using computer-generated imagery was approximated with
additional lights. Ray tracing-based rendering eventually changed that by enabling physically-based light
transport. Early feature films rendered entirely using path tracing include Monster House (2006), Cloudy
with a Chance of Meatballs (2009),[14] and Monsters University (2013).[15]
Algorithm overview
Scenes in ray tracing are described mathematically by a programmer or by a visual artist (normally using
intermediary tools). Scenes may also incorporate data from images and models captured by means such as
digital photography.
Typically, each ray must be tested for intersection with some subset of all the objects in the scene. Once the
nearest object has been identified, the algorithm will estimate the incoming light at the point of intersection,
examine the material properties of the object, and combine this information to calculate the final color of the
pixel. Certain illumination algorithms and reflective or translucent materials may require more rays to be re-
cast into the scene.
It may at first seem counterintuitive or "backward" to send rays away from the camera, rather than into it (as
actual light does in reality), but doing so is many orders of magnitude more efficient. Since the
overwhelming majority of light rays from a given light source do not make it directly into the viewer's eye, a
"forward" simulation could potentially waste a tremendous amount of computation on light paths that are
never recorded.
Therefore, the shortcut taken in ray tracing is to presuppose that a given ray intersects the view frame. After
either a maximum number of reflections or a ray traveling a certain distance without intersection, the ray
ceases to travel and the pixel's value is updated.
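In outline, this process maps to a small recursive function. The sketch below is a minimal C++ illustration, not any particular renderer's API; the Scene, Ray, Hit and Color types, their member functions, and the fixed reflection coefficient are all assumptions made for this example.

```cpp
#include <optional>

struct Vec3  { double x, y, z; };
struct Ray   { Vec3 origin, dir; };
struct Color { double r, g, b; };
struct Hit   { double t; Vec3 point, normal; bool reflective; };

// Hypothetical scene interface: nearest intersection, direct shading,
// and construction of the mirror-reflected ray at a hit point.
struct Scene {
    std::optional<Hit> intersect(const Ray& ray) const;
    Color shade(const Hit& hit) const;
    Ray reflectedRay(const Ray& ray, const Hit& hit) const;
};

// Trace one ray: find the nearest surface, estimate its direct lighting,
// and re-cast a ray for reflective materials up to a fixed bounce limit.
Color trace(const Scene& scene, const Ray& ray, int depth) {
    const int    kMaxDepth = 5;   // the "maximum number of reflections"
    const double kr        = 0.5; // illustrative reflection coefficient
    if (depth > kMaxDepth) return {0, 0, 0};
    std::optional<Hit> hit = scene.intersect(ray);
    if (!hit) return {0, 0, 0};   // no intersection: the ray ceases to travel
    Color c = scene.shade(*hit);
    if (hit->reflective) {
        Color r = trace(scene, scene.reflectedRay(ray, *hit), depth + 1);
        c = {c.r + kr * r.r, c.g + kr * r.g, c.b + kr * r.b};
    }
    return c;
}
```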
Calculate rays for rectangular viewport

On input we have (in calculation we use vector normalization and cross product):

eye position $E \in \mathbb{R}^3$
target position $T \in \mathbb{R}^3$
field of view $\theta \in [0, \pi]$ - for humans, we can assume $\theta \approx \pi/2$ rad $= 90°$
numbers of square pixels on viewport vertical and horizontal direction $m, k \in \mathbb{N}$
numbers of actual pixel $i, j \in \mathbb{N}$, $1 \le i \le k$, $1 \le j \le m$
vertical vector $\vec{w} \in \mathbb{R}^3$ which indicates where up and down are, usually $\vec{w} = [0, 1, 0]$ - a roll
component which determines viewport rotation around point $C$ (where the axis of rotation is the
$ET$ section)

The idea is to find the position of each viewport pixel center $P_{i,j}$, which allows us to find the line going from
eye $E$ through that pixel and finally get the ray described by point $E$ and vector $\vec{R}_{i,j} = P_{i,j} - E$ (or its
normalisation $\vec{r}_{i,j}$). First we need to find the coordinates of the bottom-left viewport pixel $P_{1,1}$ and find the
next pixel by making a shift along directions parallel to the viewport (vectors $\vec{b}_n$ and $\vec{v}_n$) multiplied by the size of
the pixel. Below we introduce formulas which include the distance $d$ between the eye and the viewport.
However, this value will be reduced during ray normalisation (so you might as well accept that $d = 1$
and remove it from calculations).

Pre-calculations: let's find and normalise the viewing vector $\vec{t}$ and the vectors $\vec{b}, \vec{v}$ which are parallel to the viewport:

$$\vec{t} = T - E, \qquad \vec{b} = \vec{w} \times \vec{t}$$

$$\vec{t}_n = \frac{\vec{t}}{\|\vec{t}\|}, \qquad \vec{b}_n = \frac{\vec{b}}{\|\vec{b}\|}, \qquad \vec{v}_n = \vec{t}_n \times \vec{b}_n$$

Note that the viewport center is $C = E + \vec{t}_n d$. Next we calculate the viewport sizes divided by 2, $g_x$ and $g_y$, including the inverse aspect ratio $\frac{m-1}{k-1}$:

$$g_x = \frac{h_x}{2} = d \tan\frac{\theta}{2}, \qquad g_y = \frac{h_y}{2} = g_x \frac{m-1}{k-1}$$

and then we calculate the next-pixel shifting vectors $\vec{q}_x, \vec{q}_y$ along directions parallel to the viewport ($\vec{b}_n, \vec{v}_n$), and the bottom-left pixel center $\vec{p}_{1,1}$:

$$\vec{q}_x = \frac{2 g_x}{k-1}\,\vec{b}_n, \qquad \vec{q}_y = \frac{2 g_y}{m-1}\,\vec{v}_n, \qquad \vec{p}_{1,1} = \vec{t}_n d - g_x \vec{b}_n - g_y \vec{v}_n$$

The center of the pixel at position $(i, j)$ and the corresponding unit ray direction are then

$$\vec{p}_{i,j} = \vec{p}_{1,1} + \vec{q}_x (i-1) + \vec{q}_y (j-1), \qquad \vec{r}_{i,j} = \frac{\vec{p}_{i,j}}{\|\vec{p}_{i,j}\|}$$
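These formulas translate directly into code. The following C++ sketch computes the unit ray direction for pixel (i, j) under the conventions above (it assumes k, m > 1; the Vec3 helper and the name pixelRay are illustrative, not from the article):

```cpp
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};
Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
Vec3 normalize(Vec3 a) {
    return a * (1.0 / std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z));
}

// Unit direction of the ray through pixel (i, j), 1-based, on a k x m viewport.
Vec3 pixelRay(Vec3 E, Vec3 T, Vec3 w, double theta, int k, int m, int i, int j) {
    const double d = 1.0;                   // eye-viewport distance; cancels out
    Vec3 tn = normalize(T - E);             // t_n: viewing direction
    Vec3 bn = normalize(cross(w, tn));      // b_n: viewport horizontal axis
    Vec3 vn = cross(tn, bn);                // v_n: viewport vertical axis
    double gx = d * std::tan(theta / 2.0);  // half-width of the viewport
    double gy = gx * (m - 1.0) / (k - 1.0); // half-height (square pixels)
    Vec3 qx = bn * (2.0 * gx / (k - 1.0));  // horizontal pixel-to-pixel shift
    Vec3 qy = vn * (2.0 * gy / (m - 1.0));  // vertical pixel-to-pixel shift
    Vec3 p11 = tn * d - bn * gx - vn * gy;  // bottom-left pixel center
    return normalize(p11 + qx * (i - 1.0) + qy * (j - 1.0));
}
```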
A reflection ray is traced in the mirror-reflection direction. The closest object it intersects is
what will be seen in the reflection.
A refraction ray traveling through transparent material works similarly, with the addition that a
refractive ray could be entering or exiting a material. Turner Whitted extended the
mathematical logic for rays passing through a transparent solid to include the effects of
refraction.[19]
A shadow ray is traced toward each light. If any
opaque object is found between the surface and
the light, the surface is in shadow and the light
does not illuminate it.
These recursive rays add more realism to ray traced
images.
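For illustration, the reflection and refraction directions can be computed with the standard mirror formula and Snell's law; the C++ sketch below illustrates those formulas and is not code from any cited source (the Vec3 helper and function names are assumptions). A shadow ray needs no direction formula of its own: it simply runs from the surface point toward the light, and any opaque hit in between means shadow.

```cpp
#include <cmath>
#include <optional>

struct Vec3 {
    double x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Mirror reflection of unit direction d about unit surface normal n.
Vec3 reflect(Vec3 d, Vec3 n) { return d - n * (2.0 * dot(d, n)); }

// Refraction by Snell's law with relative index eta = n1 / n2; d points into
// the surface, n against it. Empty result means total internal reflection.
std::optional<Vec3> refract(Vec3 d, Vec3 n, double eta) {
    double cosi = -dot(d, n);
    double k = 1.0 - eta * eta * (1.0 - cosi * cosi);
    if (k < 0.0) return std::nullopt;
    return d * eta + n * (eta * cosi - std::sqrt(k));
}
```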
While the direct illumination is generally best sampled using eye-based ray tracing, certain indirect effects
can benefit from rays generated from the lights. Caustics are bright patterns caused by the focusing of light
off a wide reflective region onto a narrow area of (near-)diffuse surface. An algorithm that casts rays directly
from lights onto reflective objects, tracing their paths to the eye, will better sample this phenomenon. This
integration of eye-based and light-based rays is often expressed as bidirectional path tracing, in which paths
are traced from both the eye and lights, and the paths subsequently joined by a connecting ray after some
length.[22][23]
Photon mapping is another method that uses both light-based and eye-based ray tracing; in an initial pass,
energetic photons are traced along rays from the light source so as to compute an estimate of radiant flux as
a function of 3-dimensional space (the eponymous photon map itself). In a subsequent pass, rays are traced
from the eye into the scene to determine the visible surfaces, and the photon map is used to estimate the
illumination at the visible surface points.[24][25] The advantage of photon mapping versus bidirectional path
tracing is the ability to achieve significant reuse of photons, reducing computation, at the cost of statistical
bias.
An additional problem occurs when light must pass through a very narrow aperture to illuminate the scene
(consider a darkened room, with a door slightly ajar leading to a brightly lit room), or a scene in which most
points do not have direct line-of-sight to any light source (such as with ceiling-directed light fixtures or
torchieres). In such cases, only a very small subset of paths will transport energy; Metropolis light transport
is a method which begins with a random search of the path space, and when energetic paths are found,
reuses this information by exploring the nearby space of rays.[26]
To the right is an image showing a simple example of a path of rays recursively generated from the camera
(or eye) to the light source using the above algorithm. A diffuse surface reflects light in all directions.
First, a ray is created at an eyepoint and traced through a pixel and into the scene, where it hits a diffuse
surface. From that surface the algorithm recursively generates a reflection ray, which is traced through the
scene, where it hits another diffuse surface. Finally, another reflection ray is generated and traced through
the scene, where it hits the light source and is absorbed. The color of the pixel now depends on the colors of
the first and second diffuse surface and the color of the light emitted
from the light source. For example, if the light source emitted white
light and the two diffuse surfaces were blue, then the resulting color
of the pixel is blue.
[Image: Recursively generated rays from the "eye" (and through an image plane) to a light source after encountering two diffuse surfaces]

Example

As a demonstration of the principles involved in ray tracing, consider how one would find the intersection between a ray and a sphere. This is merely the math behind the line–sphere intersection and the subsequent determination of the colour of the pixel being calculated. There is, of course, far more to the general process of ray tracing, but this demonstrates an example of the algorithms used.
Any point on a ray starting from point $S$ with direction $\vec{d}$ (here $\vec{d}$ is a unit vector) can be written as

$$P = S + t\vec{d},$$

where $t$ is its distance between $P$ and $S$. In our problem, we know $C$ (the center of the sphere), $r$ (its radius), $S$ (e.g. the position of a light source) and $\vec{d}$, and we need to find $t$. Therefore, we substitute for $P$ in the equation of the sphere $\|P - C\|^2 = r^2$:

$$\|S + t\vec{d} - C\|^2 = r^2$$

Writing $\vec{v} = S - C$ and expanding, using $\|\vec{d}\| = 1$:

$$t^2 + 2(\vec{v}\cdot\vec{d})\,t + \|\vec{v}\|^2 - r^2 = 0$$

$$t = -(\vec{v}\cdot\vec{d}) \pm \sqrt{(\vec{v}\cdot\vec{d})^2 - (\|\vec{v}\|^2 - r^2)}$$

The two values of $t$ found by solving this equation are the two ones such that $S + t\vec{d}$ are the points where the ray intersects the sphere.

Any value of $t$ which is negative does not lie on the ray, but rather in the opposite half-line (i.e. the one starting from $S$ with opposite direction).

If the quantity under the square root (the discriminant) is negative, then the ray does not intersect the sphere.

Let us suppose now that there is at least a positive solution, and let $t$ be the minimal one. In addition, let us suppose that the sphere is the nearest object in our scene intersecting our ray, and that it is made of a reflective material. We need to find in which direction the light ray is reflected. The laws of reflection state that the angle of reflection is equal and opposite to the angle of incidence between the incident ray and the normal to the sphere, which is

$$\vec{n} = \frac{Y - C}{\|Y - C\|},$$

where $Y = S + t\vec{d}$ is the intersection point found before. The reflection direction can be found by a reflection of $\vec{d}$ with respect to $\vec{n}$, that is

$$\vec{r} = \vec{d} - 2(\vec{n}\cdot\vec{d})\,\vec{n}.$$
Now we only need to compute the intersection of the latter ray with our field of view, to get the pixel which
our reflected light ray will hit. Lastly, this pixel is set to an appropriate color, taking into account how the
color of the original light source and the one of the sphere are combined by the reflection.
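The quadratic above maps directly to code. Below is a minimal C++ sketch in the same notation ($S$ ray origin, $\vec{d}$ unit direction, $C$ sphere center, $r$ radius); the Vec3 helper and the name hitSphere are illustrative:

```cpp
#include <cmath>
#include <optional>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
};
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Smallest non-negative t with ||S + t*d - C||^2 = r^2, if it exists.
std::optional<double> hitSphere(Vec3 S, Vec3 d, Vec3 C, double r) {
    Vec3 v = S - C;                             // v = S - C, as in the text
    double b = dot(v, d);                       // v . d
    double disc = b * b - (dot(v, v) - r * r);  // the discriminant above
    if (disc < 0.0) return std::nullopt;        // ray misses the sphere
    double root = std::sqrt(disc);
    double t = -b - root;                       // nearer of the two solutions
    if (t < 0.0) t = -b + root;                 // nearer point lies behind S
    if (t < 0.0) return std::nullopt;           // both points lie behind S
    return t;
}
```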
Example: let Kr = 0.5 for a set of surfaces. Then from the first surface the maximum contribution is 0.5, for
the reflection from the second: 0.5 × 0.5 = 0.25, the third: 0.25 × 0.5 = 0.125, the fourth: 0.125 × 0.5 =
0.0625, the fifth: 0.0625 × 0.5 = 0.03125, etc. In addition we might implement a distance attenuation factor
such as 1/D2, which would also decrease the intensity contribution.
For a transmitted ray we could do something similar but in that case the distance traveled through the object
would cause even faster intensity decrease. As an example of this, Hall & Greenberg found that even for a
very reflective scene, using this with a maximum depth of 15 resulted in an average ray tree depth of 1.7.[27]
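As a quick numeric check of this attenuation argument, the short program below (an illustrative sketch, not code from the cited paper) shows that with Kr = 0.5 the maximum possible contribution drops below one display quantum (1/255) after eight bounces, so deeper recursion cannot visibly change the pixel:

```cpp
#include <cstdio>

int main() {
    const double kr = 0.5;            // reflection coefficient of every surface
    const double cutoff = 1.0 / 255;  // below one display quantum: stop tracing
    double contribution = 1.0;
    int depth = 0;
    while (contribution >= cutoff) {
        contribution *= kr;           // each bounce scales the max contribution
        ++depth;
        std::printf("bounce %d: max contribution %.6f\n", depth, contribution);
    }
    return 0;                         // exits after bounce 8 (0.5^8 < 1/255)
}
```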
Bounding volumes
Enclosing groups of objects in sets of bounding volume hierarchies (BVH) decreases the amount of
computations required for ray tracing. A cast ray is first tested for an intersection with the bounding volume,
and then if there is an intersection, the volume is recursively divided until the ray hits the object. The best
type of bounding volume will be determined by the shape of the underlying object or objects. For example,
if the objects are long and thin, then a sphere will enclose mainly empty space compared to a box. Boxes
are also easier to use when building hierarchical bounding volumes.
Note that using a hierarchical system like this (assuming it is done carefully) changes the intersection
computational time from a linear dependence on the number of objects to something between linear and a
logarithmic dependence. This is because, for a perfect case, each intersection test would divide the
possibilities by two, and result in a binary tree type structure. Spatial subdivision methods, discussed below,
try to achieve this. Furthermore, this acceleration structure makes the ray-tracing computation output-
sensitive. I.e. the complexity of the ray intersection calculations depends on the number of objects that
actually intersect the rays and not (only) on the number of objects in the scene.
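Axis-aligned boxes are popular bounding volumes partly because the ray-box test is so cheap. The C++ sketch below shows the standard "slab" method (the function name and parameter layout are illustrative assumptions):

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { double x, y, z; };

// "Slab" test: the ray origin + t*dir hits the box iff the parameter
// intervals in which it lies inside the x-, y- and z-slabs all overlap.
// invDir holds precomputed reciprocals of the ray direction components.
bool hitAABB(Vec3 origin, Vec3 invDir, Vec3 boxMin, Vec3 boxMax) {
    double tmin = 0.0;                     // only the forward half of the ray
    double tmax = INFINITY;
    const double o[3]   = {origin.x, origin.y, origin.z};
    const double inv[3] = {invDir.x, invDir.y, invDir.z};
    const double lo[3]  = {boxMin.x, boxMin.y, boxMin.z};
    const double hi[3]  = {boxMax.x, boxMax.y, boxMax.z};
    for (int a = 0; a < 3; ++a) {
        double t0 = (lo[a] - o[a]) * inv[a];
        double t1 = (hi[a] - o[a]) * inv[a];
        if (t0 > t1) std::swap(t0, t1);    // handle negative direction components
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;     // slab intervals stopped overlapping
    }
    return true;
}
```

In a BVH traversal this test acts as the cheap filter: children whose boxes the ray misses are pruned, which is what turns the linear sweep over objects into the near-logarithmic behavior described above.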
Kay & Kajiya give a list of desired properties for hierarchical bounding volumes:
Subtrees should contain objects that are near each other and the further down the tree the
closer should be the objects.
The volume of each node should be minimal.
The sum of the volumes of all bounding volumes should be minimal.
Greater attention should be placed on the nodes near the root since pruning a branch near the
root will remove more potential objects than one farther down the tree.
The time spent constructing the hierarchy should be much less than the time saved by using it.
The next interactive ray tracer, and the first known to have been labeled "real-time", was credited at the 2005
SIGGRAPH computer graphics conference as being the REMRT/RT tools developed in 1986 by Mike
Muuss for the BRL-CAD solid modeling system. Initially published in 1987 at USENIX, the BRL-CAD ray
tracer was an early implementation of a parallel network distributed ray tracing system that achieved several
frames per second in rendering performance.[30] This performance was attained by means of the highly
optimized yet platform independent LIBRT ray tracing engine in BRL-CAD and by using solid implicit CSG
geometry on several shared memory parallel machines over a commodity network. BRL-CAD's ray tracer,
including the REMRT/RT tools, continues to be available and developed today as open source software.[31]
Since then, there have been considerable efforts and research towards implementing ray tracing at real-time
speeds for a variety of purposes on stand-alone desktop configurations. These purposes include interactive 3-
D graphics applications such as demoscene productions, computer and video games, and image rendering.
Some real-time software 3-D engines based on ray tracing have been developed by hobbyist demo
programmers since the late 1990s.[32]
In 1999 a team from the University of Utah, led by Steven Parker, demonstrated interactive ray tracing live
at the 1999 Symposium on Interactive 3D Graphics. They rendered a 35 million sphere model at 512 by 512
pixel resolution, running at approximately 15 frames per second on 60 CPUs.[33]
The OpenRT project included a highly optimized software core for ray tracing along with an OpenGL-like
API in order to offer an alternative to the current rasterization based approach for interactive 3-D graphics.
Ray tracing hardware, such as the experimental Ray Processing Unit developed by Sven Woop at the
Saarland University, was designed to accelerate some of the computationally intensive operations of ray
tracing.
The idea that video games could ray trace their graphics in real time received media attention in the late 2000s. During that time, a researcher named Daniel Pohl, under the guidance of graphics professor Philipp Slusallek and in cooperation with the Erlangen University and Saarland University in Germany, equipped Quake III and Quake IV with an engine he programmed himself, which Saarland University then demonstrated at CeBIT 2007.[34] Intel, a patron of Saarland, became impressed enough that it hired Pohl and embarked on a research program dedicated to ray traced graphics, which it saw as justifying increasing the number of its processors' cores.[35]: 99–100 [36] On June 12, 2008, Intel demonstrated a special version of Enemy Territory: Quake Wars, titled Quake Wars: Ray Traced, using ray tracing for rendering, running in basic HD (720p) resolution. ETQW operated at 14–29 frames per second on a 16-core (4 socket, 4 core) Xeon Tigerton system running at 2.93 GHz.[37]

[Image: Quake Wars: Ray Traced]
At SIGGRAPH 2009, Nvidia announced OptiX, a free API for real-time ray tracing on Nvidia GPUs. The
API exposes seven programmable entry points within the ray tracing pipeline, allowing for custom cameras,
ray-primitive intersections, shaders, shadowing, etc. This flexibility enables bidirectional path tracing,
Metropolis light transport, and many other rendering algorithms that cannot be implemented with tail
recursion.[38] OptiX-based renderers are used in Autodesk Arnold, Adobe After Effects, Bunkspeed Shot,
Autodesk Maya, 3ds Max, and many other renderers.
In 2014, a demo of the PlayStation 4 video game The Tomorrow Children, developed by Q-Games and Japan
Studio, showed new lighting techniques developed by Q-Games, notably cascaded voxel cone ray
tracing, which simulates lighting in real time and produces more realistic reflections than screen-space
reflections.[39]
Nvidia introduced their GeForce RTX and Quadro RTX GPUs in September 2018, based on the Turing
architecture that allows for hardware-accelerated ray tracing. The Nvidia hardware uses a separate functional
block, publicly called an "RT core". This unit is somewhat comparable to a texture unit in size, latency, and
interface to the processor core. The unit features BVH traversal, compressed BVH node decompression, ray-
AABB intersection testing, and ray-triangle intersection testing.[40] The GeForce RTX, in the form of
models 2080 and 2080 Ti, became the first consumer-oriented brand of graphics card that can perform ray
tracing in real time,[41] and, in November 2018, Electronic Arts' Battlefield V became the first game to take
advantage of its ray tracing capabilities, which it achieves via Microsoft's new API, DirectX Raytracing.[42]
AMD, which already offered interactive ray tracing on top of OpenCL through its Radeon ProRender,[43][44]
unveiled in October 2020 the Radeon RX 6000 series, its second generation Navi GPUs with support for
hardware-accelerated ray tracing at an online event.[45][46][47][48][49] Games that render their
graphics by such means have since appeared, which has been credited to improvements in hardware and to
efforts to make more APIs and game engines compatible with the technology.[50] Current home gaming
consoles implement dedicated ray tracing hardware components in their GPUs for real-time ray tracing
effects, which began with the ninth-generation consoles PlayStation 5, Xbox Series X and Series
S.[51][52][53][54][55]
On November 4, 2021, Imagination Technologies announced their IMG CXT GPU with hardware-
accelerated ray tracing.[56][57] On January 18, 2022, Samsung announced their Exynos 2200 AP SoC with
hardware-accelerated ray tracing.[58] On June 28, 2022, Arm announced their Immortalis-G715 with
hardware-accelerated ray tracing.[59] On November 16, 2022, Qualcomm announced their Snapdragon 8
Gen 2 with hardware-accelerated ray tracing.[60][61]
On September 12, 2023, Apple introduced hardware-accelerated ray tracing in its chip designs, beginning
with the A17 Pro chip for iPhone 15 Pro models.[62][63] Later the same year, Apple released the M3 family of
processors with hardware-accelerated ray tracing support.[64] Currently, this technology is accessible across
iPhones, iPads, and Mac computers via the Metal API. Apple reports up to a 4x performance increase over
previous software-based ray tracing on the phone[63] and up to a 2.5x speedup comparing M3 to M1 chips.[64] The
hardware implementation includes acceleration structure traversal and dedicated ray-box intersections, and
the API supports RayQuery (Inline Ray Tracing) as well as RayPipeline features.[65]
Computational complexity
Various complexity results have been proven for certain formulations of the ray tracing problem. In
particular, if the decision version of the ray tracing problem is defined as follows[66] (given a light ray's
initial position and direction and some fixed point, does the ray eventually reach that point?), then the
referenced paper proves the following results:
Ray tracing in 3-D optical systems with a finite set of reflective or refractive objects represented
by a system of rational quadratic inequalities is undecidable.
Ray tracing in 3-D optical systems with a finite set of refractive objects represented by a
system of rational linear inequalities is undecidable.
Ray tracing in 3-D optical systems with a finite set of rectangular reflective or refractive objects
is undecidable.
Ray tracing in 3-D optical systems with a finite set of reflective or partially reflective objects
represented by a system of linear inequalities, some of which can be irrational, is undecidable.
Ray tracing in 3-D optical systems with a finite set of reflective or partially reflective objects
represented by a system of rational linear inequalities is PSPACE-hard.
For any dimension equal to or greater than 2, ray tracing with a finite set of parallel and
perpendicular reflective surfaces represented by rational linear inequalities is in PSPACE.
Software architecture
Middleware
GPUOpen
Nvidia GameWorks
API
Metal (API)
Vulkan
DirectX
See also
Beam tracing
Cone tracing
Distributed ray tracing
Global illumination
Gouraud shading
List of ray tracing software
Parallel computing
Path tracing
Phong shading
Progressive meshes
Shading
Specular reflection
Tessellation
Per-pixel lighting
References
1. Shirley, Peter (July 9, 2003). Realistic Ray Tracing. A K Peters/CRC Press; 2nd edition.
ISBN 978-1568814612.
2. "Sponsored Feature: Changing the Game - Experimental Cloud-Based Ray Tracing" (https://w
ww.gamasutra.com/view/feature/134692/sponsored_feature_changing_the_.php).
www.gamasutra.com. Retrieved March 18, 2021.
3. "Disney explains why its 2D animation looks so realistic" (https://www.engadget.com/2016-08-0
2-disney-explains-hyperion-renderer.html). Engadget. Retrieved March 18, 2021.
4. "The Next Big Steps In Game Sound Design" (https://www.gamasutra.com/view/feature/13264
5/the_next_big_steps_in_game_sound_.php). www.gamasutra.com. January 28, 2010.
Retrieved March 18, 2021.
5. Georg Rainer Hofmann (1990). "Who invented ray tracing?". The Visual Computer. 6 (3): 120–
124. doi:10.1007/BF01911003 (https://doi.org/10.1007%2FBF01911003). S2CID 26348610 (ht
tps://api.semanticscholar.org/CorpusID:26348610).
6. Steve Luecking (2013). "Dürer, drawing, and digital thinking - 2013 FATE Conference" (http://w
ww.brian-curtis.com/text/conferpape_steveluecking.html). brian-curtis.com. Retrieved
August 13, 2020.
7. Steve Luecking. "Stephen J Luecking" (https://www.academia.edu/34722794). Retrieved
August 13, 2020.
8. Appel, Arthur (April 30, 1968). "Some techniques for shading machine renderings of solids".
Proceedings of the April 30--May 2, 1968, spring joint computer conference on - AFIPS '68
(Spring) (http://graphics.stanford.edu/courses/Appel.pdf) (PDF). pp. 37–45.
doi:10.1145/1468075.1468082 (https://doi.org/10.1145%2F1468075.1468082).
S2CID 207171023 (https://api.semanticscholar.org/CorpusID:207171023).
9. Goldstein, Robert; Nagel, Roger (January 1971), "3-D Visual simulation", Simulation, 16 (1):
25–31, doi:10.1177/003754977101600104 (https://doi.org/10.1177%2F00375497710160010
4), S2CID 122824395 (https://api.semanticscholar.org/CorpusID:122824395)
10. Syntha Vision Sampler (https://archive.org/details/synthavisionsampler). 1974 – via Internet
Archive.
11. Roth, Scott D. (February 1982), "Ray Casting for Modeling Solids", Computer Graphics and
Image Processing, 18 (2): 109–144, doi:10.1016/0146-664X(82)90169-1 (https://doi.org/10.10
16%2F0146-664X%2882%2990169-1)
12. Whitted T. (1979) An Improved Illumination Model for Shaded Display (http://citeseerx.ist.psu.e
du/viewdoc/summary?doi=10.1.1.156.1534). Proceedings of the 6th annual conference on
Computer graphics and interactive techniques
13. The Compleat Angler (https://archive.org/details/thecompleatangler1978). Bell Laboratories.
1978 – via Internet Archive.
14. "Food for Laughs" (https://www.cgw.com/Publications/CGW/2009/Volume-32-Issue-9-Sep-200
9-/Food-for-Laughs.aspx). Computer Graphics World.
15. M.s (May 28, 2013). "This Animated Life: Pixar's Lightspeed Brings New Light to Monsters
University" (https://thisanimatedlife.blogspot.com/2013/05/pixars-chris-horne-sheds-new-light-o
n.html). This Animated Life. Retrieved May 26, 2020.
16. Hart, John C. (June 1995), "Sphere Tracing: A Geometric Method for the Antialiased Ray
Tracing of Implicit Surfaces" (http://graphics.stanford.edu/courses/cs348b-20-spring-content/up
loads/hart.pdf) (PDF), The Visual Computer
17. Hart, John C.; Sandin, Daniel J.; Kauffman, Louis H. (July 1989), "Ray Tracing Deterministic 3-
D Fractals" (http://graphics.stanford.edu/courses/cs348b-20-spring-content/uploads/hart.pdf)
(PDF), Computer Graphics, 23 (3): 289–296, doi:10.1145/74334.74363 (https://doi.org/10.114
5%2F74334.74363)
18. Tomas Nikodym (June 2010). "Ray Tracing Algorithm For Interactive Applications" (https://web.
archive.org/web/20160303180450/https://dip.felk.cvut.cz/browse/pdfcache/nikodtom_2010bac
h.pdf) (PDF). Czech Technical University, FEE. Archived from the original (https://dip.felk.cvut.c
z/browse/pdfcache/nikodtom_2010bach.pdf) (PDF) on March 3, 2016.
19. Whitted, T. (1979). "An Improved Illumination Model for Shaded Display" (http://citeseerx.ist.ps
u.edu/viewdoc/summary?doi=10.1.1.156.1534). Proceedings of the 6th annual conference on
Computer graphics and interactive techniques. CiteSeerX 10.1.1.156.1534 (https://citeseerx.is
t.psu.edu/viewdoc/summary?doi=10.1.1.156.1534). ISBN 0-89791-004-4.
20. Chalmers, A.; Davis, T.; Reinhard, E. (2002). Practical Parallel Rendering. AK Peters. ISBN 1-
56881-179-9.
21. Aila, Timo; Laine, Samulii (2009). "Understanding the Efficiency of Ray Traversal on GPUs".
HPG '09: Proceedings of the Conference on High Performance Graphics 2009. pp. 145–149.
doi:10.1145/1572769.1572792 (https://doi.org/10.1145%2F1572769.1572792).
ISBN 9781605586038. S2CID 15392840 (https://api.semanticscholar.org/CorpusID:1539284
0).
22. Eric P. Lafortune and Yves D. Willems (December 1993). "Bi-Directional Path Tracing" (http://w
ww.graphics.cornell.edu/~eric/Portugal.html). Proceedings of Compugraphics '93: 145–153.
23. Péter Dornbach (1998). "Implementation of bidirectional ray tracing algorithm" (https://old.cesc
g.org/CESCG98/PDornbach/paper.pdf) (PDF). Retrieved June 11, 2008.
24. Global Illumination using Photon Maps (http://graphics.ucsd.edu/~henrik/papers/photon_map/g
lobal_illumination_using_photon_maps_egwr96.pdf) Archived (https://web.archive.org/web/200
80808140048/http://graphics.ucsd.edu/~henrik/papers/photon_map/global_illumination_using_
photon_maps_egwr96.pdf) 2008-08-08 at the Wayback Machine
25. "Photon Mapping - Zack Waters" (http://web.cs.wpi.edu/~emmanuel/courses/cs563/write_ups/z
ackw/photon_mapping/PhotonMapping.html).
26. Veach, Eric; Guibas, Leonidas J. (1997). "Metropolis Light Transport". SIGGRAPH '97:
Proceedings of the 24th annual conference on Computer graphics and interactive techniques.
pp. 65–76. doi:10.1145/258734.258775 (https://doi.org/10.1145%2F258734.258775).
ISBN 0897918967. S2CID 1832504 (https://api.semanticscholar.org/CorpusID:1832504).
27. Hall, Roy A.; Greenberg, Donald P. (November 1983). "A Testbed for Realistic Image
Synthesis". IEEE Computer Graphics and Applications. 3 (8): 10–20.
CiteSeerX 10.1.1.131.1958 (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.131.19
58). doi:10.1109/MCG.1983.263292 (https://doi.org/10.1109%2FMCG.1983.263292).
S2CID 9594422 (https://api.semanticscholar.org/CorpusID:9594422).
28. "【Osaka University 】 LINKS-1 Computer Graphics System" (http://museum.ipsj.or.jp/en/com
puter/other/0013.html). IPSJ Computer Museum. Information Processing Society of Japan.
Retrieved November 15, 2018.
29. Defanti, Thomas A. (1984). Advances in computers. Volume 23 (http://www.vasulka.org/archiv
e/Writings/VideogameImpact.pdf#page=29) (PDF). Academic Press. p. 121. ISBN 0-12-
012123-9.
30. See Proceedings of 4th Computer Graphics Workshop, Cambridge, MA, USA, October 1987.
Usenix Association, 1987. pp 86–98.
31. "About BRL-CAD" (https://web.archive.org/web/20090901222818/http://brlcad.org/d/about).
Archived from the original (http://brlcad.org/d/about) on September 1, 2009. Retrieved
January 18, 2019.
32. Piero Foscari. "The Realtime Raytracing Realm" (http://www.acm.org/tog/resources/RTNews/d
emos/overview.htm). ACM Transactions on Graphics. Retrieved September 17, 2007.
33. Parker, Steven; Martin, William (April 26, 1999). "Interactive ray tracing" (https://dl.acm.org/citat
ion.cfm?id=300537). Proceedings of the 1999 symposium on Interactive 3-D graphics. I3D '99.
Vol. 5. pp. 119–126. CiteSeerX 10.1.1.6.8426 (https://citeseerx.ist.psu.edu/viewdoc/summary?
doi=10.1.1.6.8426). doi:10.1145/300523.300537 (https://doi.org/10.1145%2F300523.300537).
ISBN 1581130821. S2CID 4522715 (https://api.semanticscholar.org/CorpusID:4522715).
Retrieved October 30, 2019.
34. Mark Ward (March 16, 2007). "Rays light up life-like graphics" (http://news.bbc.co.uk/1/hi/techn
ology/6457951.stm). BBC News. Retrieved September 17, 2007.
35. Peddie, Jon (2019). Ray Tracing: A Tool for All (https://books.google.com/books?id=CS2oDwA
AQBAJ). Springer Nature Switzerland. ISBN 978-3-030-17490-3. Retrieved November 2, 2022.
36. Abi-Chahla, Fedy (July 22, 2009). "When Will Ray Tracing Replace Rasterization?" (https://ww
w.tomshardware.com/reviews/ray-tracing-rasterization,2351.html). Tom's Hardware. Archived
(https://archive.today/20221103235551/https://www.tomshardware.com/reviews/ray-tracing-ras
terization,2351.html) from the original on November 3, 2022. Retrieved November 4, 2022.
37. Valich, Theo (June 12, 2008). "Intel converts ET: Quake Wars to ray tracing" (https://web.archiv
e.org/web/20081002030022/http://www.tgdaily.com/html_tmp/content-view-37925-113.html).
TG Daily. Archived from the original (http://www.tgdaily.com/html_tmp/content-view-37925-113.
html) on October 2, 2008. Retrieved June 16, 2008.
38. Nvidia (October 18, 2009). "Nvidia OptiX" (http://www.nvidia.com/object/optix.html). Nvidia.
Retrieved November 6, 2009.
39. Cuthbert, Dylan (October 24, 2015). "Creating the beautiful, ground-breaking visuals of The
Tomorrow Children on PS4" (http://blog.eu.playstation.com/2014/10/24/creating-striking-unusu
al-visuals-tomorrow-children-ps4-2/). PlayStation Blog. Retrieved December 7, 2015.
40. Kilgariff, Emmett; Moreton, Henry; Stam, Nick; Bell, Brandon (September 14, 2018). "NVIDIA
Turing Architecture In-Depth" (https://developer.nvidia.com/blog/nvidia-turing-architecture-in-de
pth). Nvidia Developer. Archived (https://web.archive.org/web/20221113010753/https://develop
er.nvidia.com/blog/nvidia-turing-architecture-in-depth/) from the original on November 13,
2022. Retrieved November 13, 2022.
41. Takahashi, Dean (August 20, 2018). "Nvidia unveils GeForce RTX graphics chips for real-time
ray tracing games" (https://venturebeat.com/games/nvidia-unveils-geforce-rtx-graphics-chips-f
or-real-time-ray-tracing-games). VentureBeat. Archived (https://web.archive.org/web/20221113
013850/https://venturebeat.com/games/nvidia-unveils-geforce-rtx-graphics-chips-for-real-time-r
ay-tracing-games) from the original on November 13, 2022. Retrieved November 13, 2022.
42. Chacos, Brad (November 14, 2018). "RTX on! Battlefield V becomes the first game to support
DXR real-time ray tracing" (https://www.pcworld.com/article/402902/battlefield-v-dxr-rtx-ray-tra
cing.html). PCWorld. Archived (https://web.archive.org/web/20221113014909/https://www.pcw
orld.com/article/402902/battlefield-v-dxr-rtx-ray-tracing.html) from the original on November 13,
2022. Retrieved November 13, 2018.
43. "Real-Time Ray Tracing with Radeon ProRender" (https://gpuopen.com/announcing-real-time-r
ay-tracing). GPUOpen. March 20, 2018. Archived (https://web.archive.org/web/202211130300
34/https://gpuopen.com/announcing-real-time-ray-tracing) from the original on November 13,
2022. Retrieved November 13, 2022.
44. Harada, Takahiro (November 23, 2020). "Hardware-Accelerated Ray Tracing in AMD
Radeon™ ProRender 2.0" (https://gpuopen.com/learn/radeon-prorender-2-0). GPUOpen.
Archived (https://web.archive.org/web/20221113025101/https://gpuopen.com/learn/radeon-pro
render-2-0) from the original on November 13, 2022. Retrieved November 13, 2022.
45. Garreffa, Anthony (September 9, 2020). "AMD to reveal next-gen Big Navi RDNA 2 graphics
cards on October 28" (https://www.tweaktown.com/news/75066/amd-to-reveal-next-gen-big-na
vi-rdna-2-graphics-cards-on-october-28/index.html). TweakTown. Retrieved September 9,
2020.
46. Lyles, Taylor (September 9, 2020). "AMD's next-generation Zen 3 CPUs and Radeon RX 6000
'Big Navi' GPU will be revealed next month" (https://www.theverge.com/2020/9/9/21429127/am
d-zen-3-cpu-big-navi-gpu-events-october). The Verge. Retrieved September 10, 2020.
47. "AMD Teases Radeon RX 6000 Card Performance Numbers: Aiming For 3080?" (https://www.
anandtech.com/show/16150/amd-teases-radeon-rx-6000-card-performance-numbers-aiming-f
or-3080). anandtech.com. AnandTech. October 8, 2020. Retrieved October 25, 2020.
48. "AMD Announces Ryzen "Zen 3" and Radeon "RDNA2" Presentations for October: A New
Journey Begins" (https://www.anandtech.com/show/16077/amd-announces-ryzen-zen-3-and-r
adeon-rdna2-presentations-for-october-a-new-journey-begins). anandtech.com. AnandTech.
September 9, 2020. Retrieved October 25, 2020.
49. Judd, Will (October 28, 2020). "AMD unveils three Radeon 6000 graphics cards with ray
tracing and RTX-beating performance" (https://www.eurogamer.net/articles/digitalfoundry-2020
-10-28-amd-unveils-three-radeon-6000-graphics-cards-with-ray-tracing-and-impressive-perfor
mance). Eurogamer. Retrieved October 28, 2020.
50. Marrs, Adam; Shirley, Peter; Wald, Ingo (2021). Ray Tracing Gems II: Next Generation Real-
Time Rendering with DXR, Vulkan, and OptiX. Apress. pp. 213–214, 791–792.
hdl:20.500.12657/50334 (https://hdl.handle.net/20.500.12657%2F50334).
ISBN 9781484271858.
51. Warren, Tom (June 8, 2019). "Microsoft hints at next-generation Xbox 'Scarlet' in E3 teasers"
(https://www.theverge.com/2019/6/8/18658147/microsoft-xbox-scarlet-teaser-e3-2019). The
Verge. Retrieved October 8, 2019.
52. Chaim, Gartenberg (October 8, 2019). "Sony confirms PlayStation 5 name, holiday 2020
release date" (https://www.theverge.com/2019/10/8/20904351/sony-ps5-playstation-5-confirme
d-haptic-feedback-features-release-date-2020). The Verge. Retrieved October 8, 2019.
53. Warren, Tom (February 24, 2020). "Microsoft reveals more Xbox Series X specs, confirms 12
teraflops GPU" (https://www.theverge.com/2020/2/24/21150578/microsoft-xbox-series-x-specs
-performance-12-teraflops-gpu-details-features). The Verge. Retrieved February 25, 2020.
54. Warren, Tom (September 9, 2020). "Microsoft reveals Xbox Series S specs, promises four
times the processing power of Xbox One" (https://www.theverge.com/2020/9/9/21428792/micr
osoft-xbox-series-s-specs-cpu-teraflops-performance-gpu). The Verge. Retrieved
September 9, 2020.
55. Vandervell, Andy (January 4, 2020). "Making sense of the rampant Xbox Series X rumour mill"
(https://www.wired.co.uk/article/xbox-series-x-release-date-uk-price-specs). Wired. Archived (ht
tps://web.archive.org/web/20221113010340/https://www.wired.co.uk/article/xbox-series-x-relea
se-date-uk-price-specs) from the original on November 13, 2022. Retrieved November 13,
2022.
56. 93digital (November 4, 2021). "Imagination launches the most advanced ray tracing GPU" (http
s://www.imaginationtech.com/news/imagination-launches-the-most-advanced-ray-tracing-gp
u/). Imagination. Retrieved September 17, 2023.
57. "Ray Tracing" (https://www.imaginationtech.com/products/ray-tracing/). Imagination. Retrieved
September 17, 2023.
58. "Samsung Introduces Game Changing Exynos 2200 Processor With Xclipse GPU Powered by
AMD RDNA 2 Architecture" (https://news.samsung.com/global/samsung-introduces-game-cha
nging-exynos-2200-processor-with-xclipse-gpu-powered-by-amd-rdna-2-architecture).
news.samsung.com. Retrieved September 17, 2023.
59. "Gaming Performance Unleashed with Arm's new GPUs - Announcements - Arm Community
blogs - Arm Community" (https://community.arm.com/arm-community-blogs/b/announcements/
posts/gaming-performance-unleashed). community.arm.com. June 28, 2022. Retrieved
September 17, 2023.
60. "Snapdragon 8 Gen 2 Defines a New Standard for Premium Smartphones" (https://www.qualco
mm.com/news/releases/2022/11/snapdragon-8-gen-2-defines-a-new-standard-for-premium-sm
artphone). www.qualcomm.com. Retrieved September 17, 2023.
61. "New, Snapdragon 8 Gen 2: 8 extraordinary mobile experiences, unveiled" (https://www.qualco
mm.com/news/onq/2022/11/new-snapdragon-8-gen-2-8-extraordinary-mobile-experiences-unv
eiled). www.qualcomm.com. Retrieved September 17, 2023.
62. Smith, Ryan; Bonshor, Gavin. "The Apple 2023 Fall iPhone Event Live Blog (Starts at 10am
PT/17:00 UTC)" (https://www.anandtech.com/show/20051/the-apple-2023-fall-iphone-event-liv
e-blog). www.anandtech.com. Retrieved September 17, 2023.
63. "Apple unveils iPhone 15 Pro and iPhone 15 Pro Max" (https://www.apple.com/newsroom/202
3/09/apple-unveils-iphone-15-pro-and-iphone-15-pro-max/). Apple Newsroom. Retrieved
October 27, 2024.
64. "Apple unveils M3, M3 Pro, and M3 Max, the most advanced chips for a personal computer" (h
ttps://www.apple.com/newsroom/2023/10/apple-unveils-m3-m3-pro-and-m3-max-the-most-adv
anced-chips-for-a-personal-computer/). Apple Newsroom. Retrieved October 27, 2024.
65. "Discover ray tracing with Metal - WWDC20 - Videos" (https://developer.apple.com/videos/play/
wwdc2020/10012/). Apple Developer. Retrieved October 27, 2024.
66. "Computability and Complexity of Ray Tracing" (https://www.cs.duke.edu/~reif/paper/tygar/raytr
acing.pdf) (PDF). CS.Duke.edu.
External links
Interactive Ray Tracing: The replacement of rasterization? (https://web.archive.org/web/20190
520020303/http://www.few.vu.nl/~kielmann/theses/avdploeg.pdf)
The Compleat Angler (1978) (https://www.youtube.com/watch?v=WV4qXzM641o)
Writing a Simple Ray Tracer (scratchapixel) (http://www.scratchapixel.com/lessons/3d-basic-re
ndering/introduction-to-ray-tracing) Archived (https://web.archive.org/web/20150228122303/htt
p://scratchapixel.com/lessons/3d-basic-rendering/introduction-to-ray-tracing) February 28,
2015, at the Wayback Machine
Ray tracing a torus (https://marcin-chwedczuk.github.io/ray-tracing-torus)
Ray Tracing in One Weekend Book Series (https://raytracing.github.io/)
Ray Tracing with Voxels - Part 1 (https://jacco.ompf2.com/2024/04/24/ray-tracing-with-voxels-in
-c-series-part-1/)