Bump mapping

[Image: A sphere without bump mapping.]
[Image: The bump map that is applied to the sphere below.]
[Image: The same sphere with a bump map applied. The geometry is unchanged, but the altered shading gives it the appearance of a bumpy texture resembling that of an orange.]
[Image: A modern render of the iconic Utah teapot model developed by Martin Newell (1975). In this example of fake bump mapping, a technique used in ray tracing as well as accelerated imaging, the surface normal is perturbed with a marble texture, creating a more realistic-looking metallic surface.]

Bump mapping is a computer graphics technique in which a perturbation to the surface normal of the object being rendered is looked up in a texture map at each pixel and applied before the illumination calculation is done (see, for instance, Phong shading). The result is a richer, more detailed surface representation that more closely resembles the details inherent in the natural world. Normal mapping and parallax mapping are now the most commonly used techniques for creating bumps, and have largely made greyscale-based bump mapping obsolete.[1]

The difference between displacement mapping and bump mapping is evident in the example images; in bump mapping, the normal alone is perturbed, not the geometry itself. This leads to artifacts in the silhouette of the object (the sphere still has a circular silhouette).

Bump mapping was introduced by Blinn in 1978.[2]

Bump mapping basics

Bump mapping is a computer graphics technique to make a rendered surface look more realistic by modeling the interaction of a bumpy surface texture with lights in the environment. Bump mapping does this by changing the brightness of the pixels on the surface in response to a heightmap that is specified for each surface.

When rendering a 3D scene, the brightness and colour of the pixels are determined by the interaction of a 3D model with lights in the scene. Once it is determined that an object is visible, the 'geometric' surface normal of the object, a vector defined at each pixel position on the surface, is calculated.

The geometric surface normal then defines how strongly the object interacts with light coming from a given direction, using Phong shading or a similar lighting algorithm. Light arriving along the surface normal (perpendicular to the surface) contributes more strongly than light arriving at a grazing angle. After the initial geometry calculations, a coloured texture is often applied to the model to make the object appear more realistic.
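
For example, in a simple Lambertian (diffuse) model the contribution of a light is proportional to the cosine of the angle between the unit surface normal and the unit direction toward the light, which is just their dot product clamped to zero. A minimal sketch in C (the function name diffuse_term is illustrative, not part of any particular API):

    /* Lambertian diffuse term: strongest when the light direction is
     * aligned with the surface normal, zero when the light is behind
     * the surface. Both vectors are assumed to be unit length. */
    float diffuse_term(float nx, float ny, float nz,  /* surface normal     */
                       float lx, float ly, float lz)  /* direction to light */
    {
        float cos_angle = nx * lx + ny * ly + nz * lz;
        return cos_angle > 0.0f ? cos_angle : 0.0f;
    }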

After texturing, a calculation is performed for each pixel on the object's surface:

  1. Look up the position on the heightmap that corresponds to the position on the surface.
  2. Calculate the surface normal of the heightmap.
  3. Add the surface normal from step 2 to the geometric surface normal so that the normal points in a new direction.
  4. Calculate the interaction of the new 'bumpy' surface with lights in the scene using, for example, Phong shading.

The result is a surface that appears to have real depth. The algorithm also ensures that the surface appearance changes as lights in the scene move around. Normal mapping is the most commonly used bump mapping technique, but there are other alternatives, such as parallax mapping.
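
A minimal sketch of the four steps above, written in C under some simplifying assumptions: the heightmap is a greyscale array with values in [0, 1], the surface's tangent and bitangent vectors are supplied by the renderer, and lighting is a simple Lambertian diffuse term. The names (height_at, bump_mapped_diffuse, bump_strength, and so on) are illustrative rather than taken from any particular API:

    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    static Vec3 v3_normalize(Vec3 v)
    {
        float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
        Vec3 r = { v.x / len, v.y / len, v.z / len };
        return r;
    }

    static float v3_dot(Vec3 a, Vec3 b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    /* Step 1: look up the heightmap value (in [0, 1]) nearest to (u, v). */
    static float height_at(const float *heightmap, int w, int h, float u, float v)
    {
        int x = (int)(u * (w - 1)), y = (int)(v * (h - 1));
        if (x < 0) x = 0; if (x > w - 1) x = w - 1;
        if (y < 0) y = 0; if (y > h - 1) y = h - 1;
        return heightmap[y * w + x];
    }

    /* Steps 2-4: perturb the geometric normal with the heightmap's slope,
     * then light the pixel with a Lambertian diffuse term. */
    float bump_mapped_diffuse(const float *heightmap, int w, int h,
                              float u, float v,
                              Vec3 normal, Vec3 tangent, Vec3 bitangent,
                              Vec3 light_dir, float bump_strength)
    {
        float du = 1.0f / w, dv = 1.0f / h;

        /* Step 2: finite differences give the heightmap's slope at (u, v). */
        float h_here  = height_at(heightmap, w, h, u, v);
        float slope_u = (height_at(heightmap, w, h, u + du, v) - h_here) / du;
        float slope_v = (height_at(heightmap, w, h, u, v + dv) - h_here) / dv;

        /* Step 3: add the slope, scaled along the surface tangents, to the
         * geometric normal so that the normal points in a new direction. */
        Vec3 n = {
            normal.x - bump_strength * (slope_u * tangent.x + slope_v * bitangent.x),
            normal.y - bump_strength * (slope_u * tangent.y + slope_v * bitangent.y),
            normal.z - bump_strength * (slope_u * tangent.z + slope_v * bitangent.z)
        };
        n = v3_normalize(n);

        /* Step 4: evaluate the lighting with the 'bumpy' normal (Lambert). */
        float d = v3_dot(n, v3_normalize(light_dir));
        return d > 0.0f ? d : 0.0f;
    }

A real renderer would evaluate this per pixel in a shader and combine the result with the colour texture; the sketch only illustrates how the heightmap perturbs the normal before the lighting calculation.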

The difference between displacement mapping (also known as 'true' bump mapping) and 'fake' or '2D' bump mapping is that fake bump mapping perturbs only the surface normals instead of the geometry. The difference can be seen in object silhouettes and shadows. In 'true' bump mapping, the bumps are applied to the geometry, leading to a 'bumpy' silhouette. Fake bump mapping is computationally efficient and can be performed in real-time by 3D accelerator cards, while true bump mapping is generally reserved for off-line (non-realtime) ray-traced images.

For the purposes of rendering in real-time, bump mapping is often referred to as a 'pass', as in multi-pass rendering, and can be implemented as multiple passes (often three or four) to reduce the number of trigonometric calculations that are required.

Realtime bump mapping techniques

3D graphics programmers sometimes use a lower-quality but faster technique to simulate bump mapping. One such method alters texel indices (texture-coordinate lookups) instead of surface normals, and is often used for '2D' bump mapping. Starting with GeForce 2-class cards, this technique is implemented in graphics accelerator hardware.
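
A rough sketch of the idea in C, under the assumption that an offset derived from the heightmap's local slope is simply added to the texture coordinates before the colour lookup (the names perturbed_texcoord, slope_u, slope_v, and strength are illustrative):

    /* '2D' bump mapping by texel index alteration: instead of bending the
     * surface normal, nudge the texture coordinates by an offset derived
     * from the heightmap, so the colour lookup itself shifts with the bumps. */
    void perturbed_texcoord(float u, float v,
                            float slope_u, float slope_v, /* heightmap slope */
                            float strength,
                            float *out_u, float *out_v)
    {
        *out_u = u + strength * slope_u;
        *out_v = v + strength * slope_v;
    }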

Full-screen 2D fake bump mapping, which could be easily implemented with a very simple and fast rendering loop, was a common visual effect when bump mapping was first introduced.

Emboss bump mapping

This technique uses texture maps to generate bump mapping effects without requiring a custom renderer. The multi-pass algorithm is an extension and refinement of texture embossing. It duplicates the first texture image, shifts it by the desired bump offset, darkens the texture underneath, cuts out the appropriate shape from the texture on top, and blends the two textures into one. Because it requires two textures, this is called two-pass emboss bump mapping.

It is simple to implement and requires no custom hardware, though it is limited by the speed of the CPU. It also affects only diffuse lighting, and the illusion breaks down depending on the angle of the light.
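
A minimal sketch of the underlying arithmetic in C, assuming a greyscale heightmap and a light direction already projected onto the surface's texture axes; the function names sample_height and emboss_brightness are illustrative:

    /* Nearest-texel heightmap lookup, clamped at the edges. */
    static float sample_height(const float *hm, int w, int h, float u, float v)
    {
        int x = (int)(u * (w - 1)), y = (int)(v * (h - 1));
        if (x < 0) x = 0; if (x > w - 1) x = w - 1;
        if (y < 0) y = 0; if (y > h - 1) y = h - 1;
        return hm[y * w + x];
    }

    /* Emboss bump mapping: sample the heightmap twice, once at the original
     * texel and once shifted toward the light. Their difference approximates
     * a diffuse term that brightens slopes facing the light and darkens
     * slopes facing away; the result modulates the base colour texture. */
    float emboss_brightness(const float *hm, int w, int h, float u, float v,
                            float light_u, float light_v, /* light direction
                                                             on the surface */
                            float shift_scale)
    {
        float h0 = sample_height(hm, w, h, u, v);
        float h1 = sample_height(hm, w, h,
                                 u + shift_scale * light_u,
                                 v + shift_scale * light_v);
        float brightness = 0.5f + (h0 - h1); /* bias around mid-grey */
        if (brightness < 0.0f) brightness = 0.0f;
        if (brightness > 1.0f) brightness = 1.0f;
        return brightness;
    }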

Environment mapped bump mapping

[Image: Matrox G400 tech demo with EMBM.]

The Matrox G400 chip supports a texture-based surface detailing method called Environment Mapped Bump Mapping (EMBM). It was originally developed by BitBoys Oy and licensed to Matrox. EMBM was first introduced in DirectX 6.0.

The Radeon 7200 also includes hardware support for EMBM, which was demonstrated in the tech demo "Radeon's Ark". However, EMBM was not supported by other graphics chips such as NVIDIA's GeForce 256 through GeForce 2, which only supported the simpler Dot-3 BM. Due to this lack of industry-wide support, and its toll on the limited graphics hardware of the time, EMBM only saw limited use during G400's time. Only a few games supported the feature, such as Dungeon Keeper 2 and Millennium Soldier: Expendable.

EMBM initially required specialized hardware within the chip for its calculations, such as the Matrox G400 or Radeon 7200. It could also be rendered by the programmable pixel shaders of later DirectX 8.0 accelerators like the GeForce 3 and Radeon 8500.

References

  1. ^ Jon Radoff, "Anatomy of an MMORPG", http://radoff.com/blog/2008/08/22/anatomy-of-an-mmorpg/
  2. ^ Blinn, James F., "Simulation of Wrinkled Surfaces", Computer Graphics, Vol. 12 (3), pp. 286-292, SIGGRAPH-ACM (August 1978)
