
Nobigi

Compress baked lightmaps into tiny neural networks for interactive global illumination in WebGL.

Blender / Python / TensorFlow / GLSL / WebGL2
Research, ML, rendering, tooling

Nobigi (Neural Optimized Baked Interactive Global Illumination) explores how to replace heavy baked lightmaps with tiny neural networks that run directly in shaders. The result is a scene that looks like traditional baked GI but responds smoothly to dynamic inputs such as light rotation. The full demo lives at jure.github.io/nobigi, with background in the blog post.

Dataset and baking pipeline

The training set is generated from Blender using rotate_light_and_bake.py (a minimal sketch of the loop follows the list):

  • A target scene (the coffee cup) is baked at many light orientations over two axes.
  • Each baked frame stores diffuse direct + indirect lighting for a surface, producing a dense grid of lightmap images.
  • Filenames encode light rotations, which become the dynamic inputs for training.
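
A minimal sketch of that baking loop using Blender's Python API (object names, angle step counts, bake settings, and output paths are assumptions; the real rotate_light_and_bake.py may differ):

    import math
    import bpy

    light = bpy.data.objects["Light"]        # assumed light object name
    cup = bpy.data.objects["Cup"]            # assumed target object name
    steps = 16                               # samples per rotation axis (assumed)

    # The target must be the active object, with its lightmap image selected
    # in the material, before baking.
    bpy.context.view_layer.objects.active = cup
    cup.select_set(True)

    for i in range(steps):                   # first light-rotation axis
        for j in range(steps):               # second light-rotation axis
            light.rotation_euler[0] = 2 * math.pi * i / steps
            light.rotation_euler[2] = 2 * math.pi * j / steps

            # Bake diffuse direct + indirect lighting into the lightmap image.
            bpy.ops.object.bake(type='DIFFUSE',
                                pass_filter={'DIRECT', 'INDIRECT'},
                                use_clear=True)

            # Encode the light rotation in the filename so it can be parsed
            # back into training inputs.
            img = bpy.data.images["lightmap"]        # assumed baked image name
            img.filepath_raw = f"//bakes/cup_{i:02d}_{j:02d}.png"
            img.file_format = 'PNG'
            img.save()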

Neural representation

The Colab notebook trains a compact SIREN-style MLP per surface (a sketch follows the list):

  • Inputs: x, y, and two light-angle parameters (4D total).
  • Outputs: r, g, b values for that surface.
  • Sine-activated layers with weighted skip connections keep the model small (a few thousand parameters) while preserving high-frequency detail.
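
A minimal sketch of such a network in TensorFlow/Keras (layer width, depth, the omega_0 frequency, the 50/50 skip weighting, and the sigmoid output are assumptions; the notebook's exact architecture may differ):

    import tensorflow as tf

    def sine(omega):
        # Sine activation wrapped in a Lambda layer so it composes with Keras tensors.
        return tf.keras.layers.Lambda(lambda t: tf.sin(omega * t))

    def build_surface_mlp(hidden=32, depth=3, omega_0=30.0):
        # Inputs: surface x, y plus two light-angle parameters (4D total).
        inp = tf.keras.Input(shape=(4,))
        h = sine(omega_0)(tf.keras.layers.Dense(hidden)(inp))
        skip = h
        for _ in range(depth - 1):
            h = sine(1.0)(tf.keras.layers.Dense(hidden)(h))
            # Weighted skip connection; the 50/50 blend here is an assumption.
            h = 0.5 * h + 0.5 * skip
        # Outputs: r, g, b for this surface (sigmoid range is an assumption).
        out = tf.keras.layers.Dense(3, activation="sigmoid")(h)
        return tf.keras.Model(inp, out)

    model = build_surface_mlp()
    model.compile(optimizer="adam", loss="mse")
    # model.fit(coords_and_angles, rgb_targets, ...) with the baked grid as targets.

With hidden=32 and depth=3 this comes to roughly 2,400 parameters, in line with the "few thousand parameters" budget mentioned above.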

GLSL export and runtime

The trained weights are converted into GLSL with model_to_shadertoy, packing parameters into vec4/mat4 blocks to avoid shader compilation bottlenecks (see the sketch at the end of this section). At runtime:

  • Each pixel runs inference in the fragment shader.
  • Lighting smoothly interpolates between baked states (no stepping or video textures).
  • The shader payload stays in the tens of kilobytes, far smaller than video lightmaps.

Figure: coffee cup demo still with neural-baked lighting.
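
To make the packing concrete, here is an illustrative sketch of how one 4-wide sine layer can be emitted as GLSL mat4/vec4 constants (function names, layout, and the helper code are assumptions, not the actual model_to_shadertoy):

    import numpy as np

    def glsl_vec4(v):
        return "vec4(" + ", ".join(f"{x:.6f}" for x in v) + ")"

    def glsl_mat4(m):
        # GLSL mat4 constructors are column-major.
        return "mat4(" + ", ".join(glsl_vec4(m[:, c]) for c in range(4)) + ")"

    def emit_sine_layer(name, W, b, omega=1.0):
        # Emit a GLSL function computing sin(omega * (W @ x + b)) for a 4x4 layer.
        W = np.asarray(W, dtype=np.float32).reshape(4, 4)
        b = np.asarray(b, dtype=np.float32).reshape(4)
        return (
            f"vec4 {name}(vec4 x) {{\n"
            f"    const mat4 W = {glsl_mat4(W)};\n"
            f"    const vec4 b = {glsl_vec4(b)};\n"
            f"    return sin({omega:.1f} * (W * x + b));\n"
            f"}}\n"
        )

    # Example: pack one layer's trained weights into shader source.
    W = np.random.randn(4, 4).astype(np.float32)
    b = np.zeros(4, dtype=np.float32)
    print(emit_sine_layer("layer0", W, b, omega=30.0))

Wider layers would be packed as arrays of these mat4/vec4 blocks and chained in the fragment shader, where each pixel feeds its x, y and the two light angles through the emitted functions to get an rgb value.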