Physically based rendering
Physically based rendering (PBR) is a computer graphics approach that seeks to render images by modelling lights and surfaces according to the optics of the real world. It is often referred to as "physically based lighting" or "physically based shading". Many PBR pipelines aim to achieve photorealism. Feasible and fast approximations of the bidirectional reflectance distribution function (BRDF) and of the rendering equation are of central mathematical importance in this field. Photogrammetry may be used to help discover and encode accurate optical properties of materials. PBR principles may be implemented in real-time applications using shaders, or in offline applications using ray tracing or path tracing.
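For reference, the rendering equation that PBR techniques approximate can be written (in one common formulation; notation varies between texts) as:

$$ L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i $$

where $f_r$ is the BRDF, $L_i$ is the incoming radiance, and $n$ is the surface normal at the point $x$. PBR systems differ mainly in how they approximate $f_r$ and how they evaluate the integral.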
History
Starting in the 1980s, a number of rendering researchers worked on establishing a solid theoretical basis for rendering, including physical correctness. Much of this work was done at the Cornell University Program of Computer Graphics; a 1997 paper from that lab[1] describes the work done at Cornell in this area to that point.
"Physically Based Shading" was introduced by Yoshiharu Gotanda during the course Physically-Based Shading Models in Film and Game Production at the SIGGRAPH 2010. And followed by the course Physically Based Shading in Theory and Practice organised by Stephen Hill and Stephen McAuley between 2012 and 2020.
The phrase "Physically Based Rendering" was more widely popularized by Matt Pharr, Greg Humphreys, and Pat Hanrahan in their book of the same name from 2004, a seminal work in modern computer graphics that won its authors a Technical Achievement Academy Award for special effects.[2] The book is now in its fourth edition.[3]
The first successful, though partial, implementation of physically based rendering in a video game can be found in the 2013 title Remember Me, which, despite being built on a game engine that did not natively support the technology (Unreal Engine 3), was modified to accommodate it.[4] Although a moderate approach to PBR, its accuracy was further refined in later titles such as Ryse and Killzone Shadow Fall, released the same year, and in the continuing PBR advancements of the 2020s.[5][6]
Process
PBR is, as Joe Wilson puts it, "more of a concept than a strict set of rules",[4] but the concept contains several distinctive points of note. One of these is that, unlike many previous models that sought to divide surfaces into non-reflective and reflective, PBR recognizes that in the real world, as John Hable puts it, "everything is shiny".[7] Even "flat" or "matte" surfaces such as concrete reflect a small degree of light, and many metals and liquids reflect a great deal of it. PBR workflows also integrate photogrammetry, measurements taken from photographs of real-world materials, to study and replicate real physical ranges of values and so accurately simulate albedo, gloss, reflectivity, and other physical properties. Finally, PBR puts a great deal of emphasis on microfacets, and often uses additional textures and mathematical models to represent small-scale specular highlights and cavities resulting from surface smoothness or roughness, in addition to traditional specular or reflectivity maps.
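As one illustration of a microfacet model widely used in PBR pipelines (though not mandated by them), the GGX/Trowbridge-Reitz normal distribution function describes how microfacet normals are statistically distributed for a surface with roughness parameter $\alpha$:

$$ D_{\mathrm{GGX}}(h) = \frac{\alpha^2}{\pi\left((n \cdot h)^2(\alpha^2 - 1) + 1\right)^2} $$

where $h$ is the half-vector between the light and view directions; a larger $\alpha$ spreads specular highlights over a wider range of angles.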
Surfaces
PBR pipelines often utilize bidirectional scattering distribution functions (BSDFs) to calculate the visible light reflected at a given point on a surface. Common techniques use simplified models that are fitted to more accurate data obtained from more time-consuming methods or from laboratory measurements (such as those of a gonioreflectometer).
As described by researcher Jeff Russell of Marmoset, a surface-focused physically based rendering pipeline may also focus on the following areas of research, several of which are combined in the sketch after this list:[6]
- Reflection
- Diffusion
- Translucency and transparency
- Conservation of energy
- Metallicity
- Fresnel reflection
- Subsurface scattering
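To make several of these terms concrete, the following Python sketch evaluates a minimal single-channel metallic/roughness BRDF combining a GGX microfacet distribution, a Smith-style geometry term, Schlick's Fresnel approximation, and a simple energy-conservation split between diffuse and specular reflection. It is an illustrative sketch under common conventions, not the formulation used by any particular engine; the function names and parameter conventions are assumptions made for this example.

```python
import math

def ggx_distribution(n_dot_h: float, alpha: float) -> float:
    """Trowbridge-Reitz (GGX) microfacet normal distribution."""
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

def smith_ggx_geometry(n_dot_v: float, n_dot_l: float, alpha: float) -> float:
    """Schlick-GGX / Smith shadowing-masking approximation."""
    k = alpha / 2.0  # one common remapping; engines differ here
    g_v = n_dot_v / (n_dot_v * (1.0 - k) + k)
    g_l = n_dot_l / (n_dot_l * (1.0 - k) + k)
    return g_v * g_l

def fresnel_schlick(v_dot_h: float, f0: float) -> float:
    """Schlick's approximation of Fresnel reflectance."""
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def metallic_roughness_brdf(n_dot_l, n_dot_v, n_dot_h, v_dot_h,
                            albedo, metallic, roughness):
    """Evaluate a simple single-channel metallic/roughness BRDF.

    Dot products between the surface normal (n), light (l), view (v) and
    half-vector (h) are assumed to be precomputed and clamped to (0, 1].
    """
    alpha = roughness * roughness                      # perceptual-to-squared roughness
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic   # base reflectance at normal incidence

    d = ggx_distribution(n_dot_h, alpha)
    g = smith_ggx_geometry(n_dot_v, n_dot_l, alpha)
    f = fresnel_schlick(v_dot_h, f0)

    specular = (d * g * f) / (4.0 * n_dot_v * n_dot_l)
    # Energy conservation: light reflected specularly is not diffused,
    # and pure metals have no diffuse term.
    diffuse = (1.0 - f) * (1.0 - metallic) * albedo / math.pi
    return diffuse + specular

if __name__ == "__main__":
    # Rough dielectric lit and viewed near the surface normal.
    print(metallic_roughness_brdf(0.9, 0.9, 0.95, 0.9,
                                  albedo=0.5, metallic=0.0, roughness=0.6))
```

In practice the same evaluation is run per colour channel (or spectrally) inside a shader; the scalar version above only shows how the listed ingredients fit together.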
Volumes
PBR is also often extended to volume rendering, with areas of research such as the following (a minimal attenuation sketch follows the list):
- Lens effects such as angle of view and depth of field
- Caustics
- Light scattering
- Participating media
- Atmospheric visual properties
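As a minimal illustration of light transport in participating media (one of the areas above), the following Python sketch implements the Beer-Lambert transmittance law and the Henyey-Greenstein phase function, two building blocks commonly used for fog, smoke, and atmospheric scattering. The function names are chosen for this example; production renderers typically work with per-channel or spectral coefficients and heterogeneous media.

```python
import math

def beer_lambert_transmittance(sigma_t: float, distance: float) -> float:
    """Fraction of light that survives travelling `distance` through a
    homogeneous participating medium with extinction coefficient sigma_t."""
    return math.exp(-sigma_t * distance)

def henyey_greenstein_phase(cos_theta: float, g: float) -> float:
    """Henyey-Greenstein phase function: density of scattering by an angle
    whose cosine is cos_theta; g in (-1, 1) controls forward/backward bias."""
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)

if __name__ == "__main__":
    # Light passing through 10 units of thin fog (sigma_t = 0.05 per unit).
    print(beer_lambert_transmittance(0.05, 10.0))   # roughly 0.61
    # Strongly forward-scattering medium evaluated in the forward direction.
    print(henyey_greenstein_phase(1.0, 0.8))
```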
Application
Thanks to the high performance and low cost of modern hardware,[8] it has become feasible to use PBR not only for industrial but also for entertainment purposes wherever photorealistic images are desired, such as in video games or filmmaking.[2] Today's mid- to high-end hardware is capable of producing and rendering PBR content, and there is a market of easy-to-use software that allows designers of all experience levels to take advantage of physically based rendering methods, such as:
- Brikl
- 3ds Max
- O3DE
- OGRE
- Maya
- Babylon.js
- Blender
- Cinema 4D
- CryEngine
- Enscape
- Vue
- Godot
- Houdini (SideFX)
- jME
- MicroStation
- Minecraft GLSL Shaders
- Rhinoceros 3D
- Roblox Studio
- Sketchfab
- Stride
- Three.js
- Unigine
- Unity
- Unreal Engine
- Webots
A typical application provides an intuitive graphical user interface that allows artists to define and layer materials with arbitrary properties and to assign them to a given 2D or 3D object to recreate the appearance of any synthetic or organic material. Environments can be defined with procedural shaders or textures as well as procedural geometry, meshes, or point clouds.[5] Where possible, all changes are shown in real time, allowing for quick iteration. Sophisticated applications allow savvy users to write custom shaders in a shading language such as HLSL or GLSL, though node-based material editors, which provide a graph-based workflow with native support for important concepts such as light position, levels of reflection, emission, and metallicity, along with a wide range of other math and optics functions, are increasingly replacing hand-written shaders for all but the most complex applications.
References
- ↑ Greenberg, Donald P. (1 August 1999). "A framework for realistic image synthesis". Communications of the ACM 42 (8): 44–53. doi:10.1145/310930.310970. Archived from the original on 24 September 2018. https://web.archive.org/web/20180924033321/http://www.graphics.cornell.edu/pubs/1997/GTS+97.pdf. Retrieved 27 November 2017.
- ↑ 2.0 2.1 Pharr, Matt; Humphreys, Greg; Hanrahan, Pat (2004). Physically Based Rendering: From Theory to Implementation (1st ed.). Morgan Kaufmann. ISBN 9780080538969.
- ↑ Pharr, Matt; Jakob, Wenzel; Humphreys, Greg (2023). Physically Based Rendering: From Theory to Implementation (4th ed.). The MIT Press. ISBN 9780262048026.
- ↑ 4.0 4.1 Wilson, Joe. "Physically Based Rendering – And You Can Too!". Retrieved on 12 January 2017.
- ↑ 5.0 5.1 "Point Clouds" (in en-US). https://help.sketchfab.com/hc/en-us/articles/209143806-Point-Clouds.
- ↑ 6.0 6.1 Russell, Jeff. "PBR Theory". Retrieved on 20 August 2019.
- ↑ Hable, John. "Everything Is Shiny". Retrieved on 14 November 2016.
- ↑ Kam, Ken. "How Moore's Law Now Favors Nvidia Over Intel" (in en). Forbes. https://www.forbes.com/sites/kenkam/2018/04/23/how-moores-law-now-favors-nvidia-over-intel/.
Original source: https://en.wikipedia.org/wiki/Physically based rendering.