Heuristic42
# Shading
Shading sits at the centre of many 3D rendering components. After surface intersections have been found along viewing rays, a colour needs to be computed based on the [geometry's material](/12/rendering/geometry/) at the intersection and the [lights](/15/rendering/lights/) in the scene. Shading is the process that links materials and lights, solving or approximating the [rendering equation](https://en.wikipedia.org/wiki/Rendering_equation). While this is where the name comes from, it is often generalized to computing colour in a rendered image, which is not always a physically based operation.

The rendering equation describes how light interacts at a point on a surface, being absorbed, reflected, refracted and sometimes emitted. It includes a bidirectional reflectance distribution function (BRDF), which is the primary part of most materials. For a given incoming light colour and direction, the BRDF gives the amount of light reflected in all other directions. Typically this function is sampled once, i.e. for incoming light from a light source and an outgoing direction towards the camera. The function itself can take a number of forms. It may be a physically based model that approximates certain lighting effects, which is often very efficient, for example the Lambertian diffuse shading below. It may instead be measured from a real material, which involves taking many photos from different directions under controlled lighting conditions; the memory requirements for this are quite large, so it is used more for offline rendering. The two approaches can also be combined, measuring the BRDF by rendering a detailed model of a surface instead of taking photos.

In real-time rendering with dedicated graphics processors, *shaders* are small programs which are executed on the GPU to compute shading.
A colour may be computed for the vertices of a mesh and interpolated during [rasterization](/16/rendering/rasterization/), or alternatively computed after rasterization at the pixel level, giving vertex and fragment shaders their names. These shader programs are in turn executed on shader cores, which are the parallel processors of the GPU. In offline (non-real-time) or software rendering, the components which compute shading are also called shaders.

While the shading process covers all possible shading operations, a monolithic physically based shader, which can compute results for interactions between all light and material combinations, is normally too expensive and cumbersome to maintain. Shaders are normally small dedicated programs which handle just a few cases. When other cases occur, with different materials or lights, different shaders are used to compute the results. The colour, or light intensity, computed by many shaders can be summed together to produce a more complete image.

# Diffuse Shading

[Diffuse](https://en.wikipedia.org/wiki/Diffuse_reflection) shading, which calculates the amount of directly scattered light, is the most common lighting calculation a shader will do. The image below shows how light becomes spread out over a surface the further it faces away from the light. A widely used model is [Lambertian reflectance](https://en.wikipedia.org/wiki/Lambertian_reflectance), which scales the light intensity by the cosine of the angle between the surface normal and the light direction to give the amount of scattered light. Since the dot product of two unit vectors is exactly this cosine:

![Light spreading out over a surface angled away from the light][1]

$$ \mathsf{colour} = (\hat{\mathbf{N}} \cdot \hat{\mathbf{L}}) \times \mathsf{light\,colour} \times \mathsf{material\,colour} $$

# Material Effects

Diffuse shading models just one property of light interacting with geometry. Below is a list of other common graphics effects performed by shaders. These general effects are combined to create complex materials and approximate multiple surface layers.
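A minimal sketch of the Lambertian calculation in Python (plain functions rather than real shader code; the dot product is clamped to zero so surfaces facing away from the light receive nothing, a detail usually written `max(0, N·L)` in shaders):

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def diffuse(normal, to_light, light_colour, material_colour):
    """Lambertian diffuse shading: scale the light by the cosine of the
    angle between the surface normal and the light direction, clamped
    so back-facing light contributes nothing."""
    n = normalize(normal)
    l = normalize(to_light)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(n_dot_l * lc * mc
                 for lc, mc in zip(light_colour, material_colour))

# Light directly overhead a flat surface gives full intensity.
print(diffuse((0, 1, 0), (0, 1, 0), (1, 1, 1), (0.8, 0.2, 0.2)))
```

In a fragment shader the same expression would run once per pixel, with the normal interpolated from the vertices.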
For example, a diffuse and a reflection effect can model a glazed surface even though it is composed of a thin glass layer over ceramic.

## Specular Highlights

Modelling specular highlights is among the first effects commonly discussed after diffuse shading. Shiny objects are somewhat reflective, and the most noticeable reflections are the light sources themselves, which is what specular highlights model. Computing specular highlights is much cheaper than a full reflection, as only information about the light source is needed rather than potential secondary light from the entire scene. Many objects have slightly rough surfaces, giving a partly diffused look with larger and softer highlights, which are often adjusted by a "roughness" or "shininess" material attribute.

## Reflection

Reflections are more complex than specular highlights as light comes from many parts of the scene, rather than just the light sources, i.e. diffused light from other objects is reflected as well as the direct light. A raytracer may simply continue tracing a reflected ray and use the colour of the surface this reflected ray hits. A rasterizer only computes the first intersection, so reflections are often implemented by sampling an image pre-rendered from the point of view of the object. The amount of light reflected changes based on the angle at which it hits the surface. For example, you can see the bottom of a lake looking directly down, but towards the horizon all you can see is the reflection of the sky. This effect is described by the [Fresnel equations](https://en.wikipedia.org/wiki/Fresnel_equations) and changes based on the material's refractive index, or how optically dense it is, as well as that of the air or whatever material the light travels through before hitting the boundary between the two.

### Glossy Reflection

Reflective materials with rough or scratched surfaces produce a blurry looking reflection, as light mostly reflects but the reflection direction is perturbed slightly.
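Returning to the angle-dependent reflectance above: the full Fresnel equations are fairly involved, so real-time shaders commonly substitute Schlick's approximation. A sketch (the air-to-water `f0` value is derived from the refractive indices here, not taken from this post):

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of the Fresnel equations: the fraction of
    light reflected, given the cosine of the angle between the view
    direction and the surface normal, and f0, the reflectance when
    looking straight at the surface."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# f0 computed from the refractive indices of air (1.0) and water (1.33).
n1, n2 = 1.0, 1.33
f0 = ((n1 - n2) / (n1 + n2)) ** 2

print(fresnel_schlick(1.0, f0))   # straight down: ~2% reflected
print(fresnel_schlick(0.05, f0))  # grazing angle: mostly reflection
```

This reproduces the lake example: almost no reflection looking straight down, nearly total reflection towards the horizon.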
In the case of brushed metal, the scratches are all in one direction, and the reflection is blurred much more perpendicular to the scratches. In this case the glossy effect is said to be anisotropic, which is often an adjustable attribute of the material. The same effect applies to refraction at rough surfaces.

## Indirect Lighting

Indirect lighting, or indirect illumination, models light bouncing multiple times before being captured by a camera. For example, diffuse or reflected light from an object becomes a light source for another diffuse object, which in turn scatters light to yet more objects, and so on. The term global illumination (GI) is also used, though it often implies a more complete lighting system that includes many forms of indirect light. Indirect lighting is much more expensive to compute than direct light; until around 2010 it was not used in games and was typically a property of better-looking offline rendering. A raytracer may model indirect illumination by generating secondary rays at diffuse surfaces. [Radiosity][2] is a rendering technique that models light transfer at the geometry level. Preprocessing the scene to create some form of indirect light field and then providing this information to a shader allows much faster rendering, although the discretized storage often reduces accuracy and introduces light "bleeding" issues. [Cascaded light propagation volumes](https://www.youtube.com/watch?v=vPQ3BbuYVh8) is one such technique.

## Refraction

So far, purely opaque materials have been discussed. In the case of glass or water, light passes through the surface and travels inside the material. However, rather than simply continuing, it bends a little depending on the densities of the materials on either side of the surface, e.g. air to glass or vice versa. The angle it bends by is given by [Snell's law](https://en.wikipedia.org/wiki/Snell's_law).
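Snell's law, n₁ sin θ₁ = n₂ sin θ₂, is simple enough to sketch directly (angles measured from the surface normal; the function name is just illustrative):

```python
import math

def snell_refract_angle(theta_incident, n1, n2):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2). Returns the
    refracted angle in radians, or None when the sine would exceed 1,
    i.e. total internal reflection."""
    s = (n1 / n2) * math.sin(theta_incident)
    if abs(s) > 1.0:
        return None  # total internal reflection: no refracted ray
    return math.asin(s)

# Light entering glass (n ~ 1.5) from air at 45 degrees bends toward
# the normal, to roughly 28 degrees.
theta = snell_refract_angle(math.radians(45), 1.0, 1.5)
print(math.degrees(theta))
```

Going the other way, from glass back into air at a steep enough angle, the function returns `None`: the light is entirely reflected back inside, which is how optical fibres work.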
Different coloured light refracts in slightly different directions, which is responsible for the colours in rainbows and for chromatic aberration in optical systems. Refraction is often paired with reflection, as the Fresnel equations linked above give the ratio between the two.

## Absorption

Partially transparent, or see-through, materials absorb light as it passes through them. [Beer's law](https://en.wikipedia.org/wiki/Beer%E2%80%93Lambert_law) models the attenuation of the light, or how quickly it is absorbed over the distance it travels. For example, sand under the water at the beach looks green because seawater absorbs more red and blue light. The deeper the water, the more light is absorbed and the darker it looks. The same effect occurs in coloured glass. In cases such as fog and smoke, where opaque particles are suspended in the material, absorption is often paired with scattering: although direct light is reduced, the light is diffused by the particles and reaches other parts of the material through secondary paths.

### Transparency

Transparency, specifically alpha transparency as there are a number of interpretations, can be seen physically as an instantaneous absorption effect occurring at a surface. An alpha value specifies the ratio of light that can pass through the surface to that which is reflected by the other methods above.

## Scattering

Scattering is light deviating from its path as it travels through a material. For example, full-beam headlights shone into fog appear very bright due to the light hitting particles and being reflected. Scattering in the atmosphere is responsible for the sky being blue. Sub-surface scattering refers to scattering within solid objects, often with high density so the range is limited. A common example is skin, which is a very tricky material to make look realistic, particularly as skin has many layers which light reaches and is scattered by before returning to the surface.

# Camera Effects

Light doesn't just interact with materials.
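Beer's law has a simple exponential form. The sketch below uses made-up per-channel absorption coefficients for a seawater-like medium (the values are illustrative only, not measured data):

```python
import math

def transmittance(absorption, distance):
    """Beer-Lambert law: the fraction of light remaining after
    travelling `distance` through a medium, per colour channel."""
    return tuple(math.exp(-a * distance) for a in absorption)

# Hypothetical absorption per metre for (red, green, blue): seawater
# absorbs red and blue faster than green.
water = (0.45, 0.05, 0.30)
print(transmittance(water, 1.0))   # shallow: a slight green tint
print(transmittance(water, 10.0))  # deep: dark and strongly green
```

The deeper-water result shows the beach example above: with distance, red and blue fall off much faster than green.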
Both real cameras and the human eye exhibit a number of optical phenomena. Sometimes these effects aren't desirable as they hide detail in the image, but as with much of material shading, we're used to seeing them, which generally makes the final image more relatable. *Post-processing* shaders, named because they run after rendering the geometry, operate on the final image.

## Exposure and White Balance

Cameras have light meters to estimate how long to expose the image for. Our eyes also adapt to the amount of light around us, but have a much wider range of intensities they can see at once. A virtual scene can be constructed in such a way that materials and light intensities always produce the right brightness for the fixed number and range of colours in our display devices. However, the real world has much more contrast than this allows. Adjusting the image brightness to better show detail after rendering can allow an animated sequence to better portray high contrasts. The term *high dynamic range* (HDR) refers to storing a larger range of contrast than display devices have; dynamic exposure and tone mapping are used to display it. A similar phenomenon occurs when there are large differences in colours and our eyes adjust to an average. In addition to altering the total brightness, a shift in the red, green and blue intensities can show more detail without losing realism.

## Bloom

Light behaves like a wave: as it passes objects, even without intersecting them, it spreads out and *diffracts* (see the [double slit experiment](https://en.wikipedia.org/wiki/Double-slit_experiment)). This happens at the aperture of a camera, so even objects that are perfectly focused do not produce a perfectly sharp image. The distribution of light from a focused point produces an [Airy disk](https://en.wikipedia.org/wiki/Airy_disk) on the image, which actually extends indefinitely but falls off very quickly.

Normally this isn't noticed as the diffracted light is of such low relative intensity. However, when the light source is overexposed and bright enough, the scattered light appears as a glow around the bright areas. The bloom effect models this behaviour by artificially blurring the image. HDR is often used to provide the large intensities needed, although tone mapping can be used at the cost of precision.

## Depth of Field

The depth of field is the range of distances at which objects appear acceptably in focus with a lens-based camera. As a graphics effect, however, the name often refers to the presence of out-of-focus, blurry areas of an image, which are expensive to compute.

## Lens Flares

Rather than light purely refracting through a lens, lens flares are an artefact of light being reflected back and forth by the camera's lens elements (the individual lenses forming a lens system) and the barrel of the lens. The effect is normally very weak, although for very bright light sources such as the sun it can be strong enough to see, or even obscure parts of the image. Lens flares can add realism and a better sense of direction, due to their animated movement.

[1]: /u/img/238650ddd4a5.svg
[2]: https://en.wikipedia.org/wiki/Radiosity_%28computer_graphics%29
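The bloom pipeline described above can be sketched on a 1D row of HDR pixels (a toy example, not from the post: a box blur stands in for the usual 2D Gaussian, and the Reinhard operator `x / (1 + x)` is one common tone mapping choice):

```python
def bright_pass(pixels, threshold):
    """Keep only the over-threshold portion of each pixel."""
    return [max(0.0, p - threshold) for p in pixels]

def box_blur(pixels, radius):
    """Crude box blur standing in for a proper Gaussian blur."""
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def reinhard(pixels):
    """Reinhard tone mapping: compress HDR values into 0..1 for display."""
    return [p / (1.0 + p) for p in pixels]

# One overexposed pixel (5.0) in a dim HDR row spreads into a glow,
# then the combined image is tone mapped down to the displayable range.
row = [0.1, 0.1, 5.0, 0.1, 0.1]
bloomed = [p + g for p, g in zip(row, box_blur(bright_pass(row, 1.0), 1))]
print(reinhard(bloomed))
```

The bright centre pixel stays brightest after tone mapping, but its neighbours are lifted well above the dim background, producing the glow.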