LightsprintSDK 2021.08.08
Pixel buffer

Pixel buffer (2d texture) is designed for storing per-pixel data for a single object (light map, ambient occlusion map, bent normal map).

Suitable for

                                    global or direct or
                                    indirect illumination   ambient occlusion   bent normals
  static objects                    YES                     YES                 YES
  dynamic objects                   NO                      YES                 YES
  realtime calculated illumination  NO                      NO                  NO
  precalculated illumination        YES                     YES                 YES

Advantages

  • High precision and detail without additional vertices.
  • No need for good triangulation.
  • Very low resolution is sufficient (with good unwrap) for ambient maps (lightmaps with indirect illumination). Ambient maps contain mostly low frequencies, no sharp edges.
  • If you don't precompute bent normals, ambient lighting values don't depend on view angle, so rendering is very fast.
  • If you do precompute bent normals, normal maps keep working even in shadows, because the direction of incoming indirect light is known.

Disadvantages

  • One more uv channel is needed, compared to vertex buffers. (Meshes need an additional uv channel with unwrap for mapping the texture. If you don't have it, ask your 3d artists to bake the unwrap into meshes as an additional uv channel, or build the unwrap automatically with RRObjects::buildUnwrap(); see the sketch below.)
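    A minimal sketch of building the unwrap automatically. RRObjects::buildUnwrap() is the function named above; the parameter list shown here (unwrap resolution, abort flag) is an assumption, so check RRObject.h in your SDK version:
    rr::RRObjects& objects = ...; // your static objects, e.g. loaded from a scene
    bool aborting = false;
    // ASSUMPTION: exact parameters may differ between SDK versions,
    // consult RRObject.h for the current buildUnwrap() signature
    objects.buildUnwrap(256, aborting); // bakes unwrap into an additional uv channel of each mesh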

Interface, implementations

Pixel buffer is rr::RRBuffer created with type rr::BT_2D_TEXTURE, see rr::RRBuffer::create() in the examples below.

Instances

Pixel buffers are stored per object in rr::RRObjectIllumination layers and accessed via object->illumination.getLayer(layerNumber), as shown in the rendering examples below.

Rendering

  • Map the pixel buffer onto your object using the uv channel with the object's unwrap, read the per-pixel value from the texture in the pixel shader, and interpret it as light level, ambient occlusion or bent normal.

Examples

  • Generic example: Creating lightmap.
    // 256x256 2d texture (BT_2D_TEXTURE), 1 slice, floating point RGB (BF_RGBF, 96 bits per pixel),
    // created in system memory with no initial data
    lightmap = rr::RRBuffer::create(rr::BT_2D_TEXTURE,256,256,1,rr::BF_RGBF,true,nullptr);
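    The created buffer is typically attached to one of the object's illumination layers so that the rendering examples below find it via getLayer(0). A minimal sketch, assuming getLayer() returns a writable reference to the layer's buffer pointer:
    // store lightmap in layer 0 of the object's illumination
    // (the same layer the rendering examples below read from)
    object->illumination.getLayer(0) = lightmap;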
  • OpenGL example: Rendering with lightmap.
    rr::RRSolver* dynamicSolver;
    GLuint program;
    ...
    // set program created from shaders below
    glUseProgram(program);
    // bind lightmap to texture0
    glActiveTexture(GL_TEXTURE0);
    getTexture(object->illumination.getLayer(0))->bindTexture();
    // set sampler to use texture0
    glUniform1i(glGetUniformLocation(program,"lightmap"),0);
    // enable stream with texture coordinates
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    // set pointer to texture coordinates
    glTexCoordPointer(2, GL_FLOAT, 0, array with uv values of unwrap);
    // render primitives
    glDrawElements...
    // cleanup
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    Using uv coordinates in GLSL vertex shader:
    varying vec2 lightmapCoord;
    void main()
    {
       ...
       lightmapCoord = gl_MultiTexCoord0.xy;
    }
    Sampling and using illumination value in GLSL fragment shader:
    uniform sampler2D lightmap;
    varying vec2 lightmapCoord;
    void main()
    {
       vec4 light = texture2D(lightmap, lightmapCoord);
       ...
       gl_FragColor = ... + materialColor * light;
    }
  • Direct3D 9 example: Rendering with lightmap.
    IDirect3DDevice9* device;
    IDirect3DVertexShader9* vertexShader;
    IDirect3DPixelShader9* pixelShader;
    // create vertex declaration that includes lightmap uv channel as TEXCOORD0
    IDirect3DVertexDeclaration9* vertexDeclaration;
    device->CreateVertexDeclaration(description, &vertexDeclaration);
    rr::RRSolver* dynamicSolver;
    ...
    // set rendering pipeline to use shaders below
    device->SetVertexShader(vertexShader);
    device->SetPixelShader(pixelShader);
    // set sampler to use lightmap
    rr::RRBuffer* lightmap = object->illumination.getLayer(0);
    ...
    // set vertex declaration for your mesh data,
    // including uv channel with unwrap
    device->SetVertexDeclaration(vertexDeclaration);
    // set pointer to your mesh in stream 0
    // (vertices, uv channel, possibly normals etc.)
    device->SetStreamSource(0, ...);
    // render primitives
    device->DrawPrimitive...
    // cleanup
    device->SetStreamSource(0, nullptr, 0, 0);
    Using uv coordinates in HLSL vertex shader:
    void vertexShader(in float2 iLightmapCoord: TEXCOORD0,
                      ..., out float2 oLightmapCoord: TEXCOORD0)
    {
       ...
       oLightmapCoord = iLightmapCoord;
    }
    Sampling and using illumination value in HLSL pixel shader:
    sampler lightmap;
    void pixelShader(in float2 iLightmapCoord: TEXCOORD0,
                     ..., out float4 oColor: COLOR)
    {
       float4 light = tex2D(lightmap, iLightmapCoord);
       ...
       oColor = ... + materialColor * light;
    }
  • Alternatively, texturing could be done in the fixed-function pipeline, without shaders, but that is beyond the scope of this documentation.
  • See Direct3D, OpenGL or your engine documentation for more details on texturing and rendering with lightmaps.