1257 Samalas
WebXR strategy game built for JS13K where you grow and conquer volcanic islands.
1257 Samalas is a WebXR and desktop real-time strategy game set after the 1257 Lombok eruption. You command banyan tree sanctuaries and volcanic islands, sending fleets to capture nodes and flip ownership. It was built for the JS13K game jam, so the core simulation, rendering, text, and audio systems are all compact and procedural.
Gameplay loop
You drag from a player-owned island to another to send half your troops. Units travel as a fleet, collide at the destination, and either reinforce the target or whittle it down until it flips. Troops regenerate over time based on island size, and the map can switch between tutorial, wide field, and full surround layouts.
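The arithmetic of that loop can be sketched in a few lines. This is an illustrative reference, not the game's actual API; `sendFleet`, `resolveArrival`, and the regeneration constant are assumptions.

```javascript
// Hypothetical sketch of the conquest loop described above.
// Function names and the regeneration rate are illustrative assumptions.

function sendFleet(island) {
  // Dragging from an owned island sends half its garrison, rounded down.
  const sent = Math.floor(island.troops / 2);
  island.troops -= sent;
  return { from: island.owner, size: sent };
}

function resolveArrival(fleet, target) {
  if (fleet.from === target.owner) {
    target.troops += fleet.size;       // reinforce a friendly node
  } else {
    target.troops -= fleet.size;       // whittle down an enemy node
    if (target.troops < 0) {
      target.owner = fleet.from;       // ownership flips on depletion
      target.troops = -target.troops;  // survivors become the new garrison
    }
  }
  return target;
}

function regenerate(island, dt) {
  // Regeneration scales with island size; the 0.5 rate is an assumed constant.
  island.troops += island.size * 0.5 * dt;
  return island;
}
```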
Rendering and world building
- The world sits inside a gradient sky sphere with custom shader code and simple directional + ambient lighting.
- Islands are morph-targeted cylinders with a custom fragment shader that blends ownership colors, animates lava for enemy nodes, and pulses on selection.
- Player-owned islands sprout instanced banyan trees; enemy nodes grow into a volcano form. Tree scale reflects troop counts, and vertex shaders add subtle sway.
- The dashed connection line uses `Line2` and a custom line material so the route reads clearly in both desktop and VR.
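The ownership blend the island shader performs can be mirrored on the CPU. This is a minimal reference, assuming a linear `mix` between owner colors during capture; the palette and blend rule are illustrative, not the game's exact values.

```javascript
// CPU-side reference for the ownership color blend (illustrative palette).

const OWNER_COLORS = {
  player:  [0.2, 0.8, 0.3], // banyan green
  enemy:   [0.9, 0.2, 0.1], // lava red
  neutral: [0.6, 0.6, 0.6],
};

// Linear interpolation per channel, same as GLSL mix().
function mix(a, b, t) {
  return a.map((v, i) => v + (b[i] - v) * t);
}

// As a node is captured, its color eases from the old owner's to the new one's.
function islandColor(prevOwner, nextOwner, captureProgress) {
  return mix(OWNER_COLORS[prevOwner], OWNER_COLORS[nextOwner], captureProgress);
}
```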
GPU simulation
Fleets are simulated on the GPU using a 64x64 float texture (4096 slots). Fragment shaders act as compute passes:
- `computeVelocity.glsl` handles attraction to targets, repulsion from other islands, and ship avoidance.
- `computePosition.glsl` integrates movement for moving units only.
- `computeAggregate.glsl` packs position and type data for CPU readback.
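Mapping the 4096 slots onto the 64x64 texture is the standard GPGPU indexing trick: each slot owns one texel, sampled at its center. A minimal sketch (names are illustrative):

```javascript
// Each of the 4096 fleet slots maps to one texel of the 64x64 simulation texture.

const SIM_SIZE = 64;

function slotToUV(slot) {
  const x = slot % SIM_SIZE;
  const y = Math.floor(slot / SIM_SIZE);
  // Sample at texel centers so the shader reads exactly one slot.
  return [(x + 0.5) / SIM_SIZE, (y + 0.5) / SIM_SIZE];
}

function uvToSlot([u, v]) {
  return Math.floor(v * SIM_SIZE) * SIM_SIZE + Math.floor(u * SIM_SIZE);
}
```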
The renderer reads back the aggregate buffer asynchronously with WebGL2 `readPixels` to detect collisions, update troop counts, trigger conquest, and schedule sound effects.
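Once the aggregate pixels arrive on the CPU, decoding them is plain array walking. The RGBA layout below (x, z, unit type, active flag) is an assumption for illustration; the game's actual packing may differ.

```javascript
// Hypothetical decoder for the aggregate buffer read back with readPixels.
// The RGBA layout (x, z, unit type, active flag) is an assumed packing.

const SIM_SIZE = 64;

function decodeAggregate(pixels /* Float32Array, length SIM_SIZE*SIM_SIZE*4 */) {
  const units = [];
  for (let i = 0; i < SIM_SIZE * SIM_SIZE; i++) {
    const o = i * 4;
    if (pixels[o + 3] <= 0) continue; // slot inactive
    units.push({ slot: i, x: pixels[o], z: pixels[o + 1], type: pixels[o + 2] });
  }
  return units;
}

// CPU-side collision check: a unit within an island's radius has "arrived".
function arrivals(units, island) {
  return units.filter(
    (u) => Math.hypot(u.x - island.x, u.z - island.z) < island.radius
  );
}
```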
Units and visual effects
Knights are an instanced mesh whose per-instance UVs index into the GPU textures. The vertex shader orients each unit to its velocity vector, while the fragment shader uses speed and ownership to tint the fleet green or red.
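The orient-to-velocity step reduces to an `atan2` yaw, and the speed-based tint to a normalized magnitude. A CPU-side reference of both (function names are illustrative, not the shader's):

```javascript
// Reference math for the knight shaders: heading from velocity, tint from speed.

function headingFromVelocity(vx, vz) {
  // atan2 gives the rotation about the up axis that faces the motion direction.
  return Math.atan2(vx, vz);
}

function speedTint(vx, vz, maxSpeed) {
  // Normalized speed in [0, 1], usable as a brightness or mix factor.
  const speed = Math.hypot(vx, vz);
  return Math.min(speed / maxSpeed, 1);
}
```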
Text rendering system
Troop counts and UI labels use a custom instanced text renderer:
- A monospace glyph atlas is drawn into a canvas texture.
- Each text instance writes character indices into a `DataTexture`, which the shader decodes per fragment.
- Labels can follow camera rotation in VR to stay readable.
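Looking a character up in a monospace atlas is a modulo-and-divide over the grid. The layout below (16 columns, glyphs starting at ASCII 32) is an assumption to make the sketch concrete:

```javascript
// Sketch of mapping a character to its cell in a monospace glyph atlas.
// The 16x6 grid starting at ASCII 32 is an assumed layout.

const COLS = 16;
const ROWS = 6; // 96 printable ASCII glyphs
const FIRST_CHAR = 32; // space

function glyphIndex(ch) {
  return ch.charCodeAt(0) - FIRST_CHAR;
}

function glyphUV(ch) {
  const i = glyphIndex(ch);
  const col = i % COLS;
  const row = Math.floor(i / COLS);
  // Offset of the glyph's cell in normalized [0, 1] atlas coordinates.
  return [col / COLS, row / ROWS];
}
```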
A stress test pushed millions of letters per second through a single instanced draw call.
Audio
Audio is fully procedural:
- A custom music engine uses Web Audio oscillators, a pooled synth voice setup, and a convolution reverb.
- An AudioWorklet generates a noise bed shaped by envelope functions and filtered for movement.
- Unit arrivals trigger positional audio notes synthesized on the fly.
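Synthesizing a note on the fly needs a pitch and a decay curve. Equal-temperament tuning and exponential decay are the standard choices; the game's exact tuning and envelope shape are assumptions here.

```javascript
// Minimal reference for on-the-fly note synthesis parameters.

function midiToFreq(note) {
  // Equal temperament: A4 = MIDI 69 = 440 Hz, one octave per 12 semitones.
  return 440 * Math.pow(2, (note - 69) / 12);
}

function decayEnvelope(t, halfLife) {
  // Amplitude halves every halfLife seconds, like a plucked-string decay.
  return Math.pow(0.5, t / halfLife);
}
```

In a real voice these values would drive an `OscillatorNode`'s `frequency` and a `GainNode`'s gain ramp.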
Input and XR
Desktop input uses raycasts against hidden collision spheres and pointer drag to select start and end nodes. In VR, controller rays and trigger presses handle the same interactions, while thumbstick input rotates or moves the player rig.
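The picking test behind those hidden collision spheres is a ray-sphere intersection. A self-contained sketch with vector math inlined (names are illustrative):

```javascript
// Ray-sphere intersection for picking. origin/dir/center are [x, y, z];
// dir must be normalized.

function raySphereHit(origin, dir, center, radius) {
  const ox = center[0] - origin[0];
  const oy = center[1] - origin[1];
  const oz = center[2] - origin[2];
  // Distance along the ray to the closest approach.
  const t = ox * dir[0] + oy * dir[1] + oz * dir[2];
  if (t < 0) return false; // sphere is behind the ray
  // Compare closest-approach distance against the sphere radius.
  const dx = ox - t * dir[0];
  const dy = oy - t * dir[1];
  const dz = oz - t * dir[2];
  return dx * dx + dy * dy + dz * dz <= radius * radius;
}
```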
AI and pacing
The enemy AI picks targets based on island size and troop counts, with difficulty controlling both the attack interval and troop generation rates.
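A heuristic like that can be expressed as a single score per island: prefer large islands (more regeneration value) with small garrisons (easier capture). The exact weighting below is an assumption, not the shipped heuristic.

```javascript
// Hedged sketch of target scoring: big island, small garrison scores highest.

function scoreTarget(island) {
  // +1 in the denominator avoids division by zero on empty nodes.
  return island.size / (island.troops + 1);
}

function pickTarget(islands, enemyOwner) {
  const candidates = islands.filter((i) => i.owner !== enemyOwner);
  candidates.sort((a, b) => scoreTarget(b) - scoreTarget(a));
  return candidates[0] || null;
}
```

Difficulty then only has to scale the interval at which `pickTarget` is called and the troop generation rate, as the section above describes.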
Development snapshots
The earliest playable prototype already had GPU-based movement and the core conquest loop.