NULLSEED

dev diary

NULLSEED is a post-collapse voxel survival sandbox set generations after society fell due to misuse of advanced technology. Players live out of a camper van, build an off-grid base, and travel between small settlements to trade, take work, and gather supplies.

The most valuable loot and the most dangerous problems come from null-zones. These are localized pockets of contamination where matter and space behave inconsistently. Players enter null-zones for short, high-stakes salvage runs. They stabilize hazards, recover artifacts, and extract before conditions worsen.

The game supports solo play and co-op. It aims for fast drop-in sessions while keeping long-term persistence through base building, world changes, and relationships.

This simple page documents development progress and the interesting problems I run into along the way.

If you want to contribute, help with problems, or ask questions, you can contact me on Discord: @aze.music

Screenshot 50

Added an atmospheric scattering simulation for the skybox, haze, and fog. This is still work in progress; the goal is to simulate an atmosphere with pollutants, water droplets, and dust.

This implementation uses a combination of the Rayleigh and Mie scattering equations, optimized with a small optical-depth LUT. Rayleigh scattering approximates how different wavelengths scatter in the atmosphere, creating the sky's color; Mie scattering simulates light scattering forward through a volume of particles, creating haze.
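As a rough illustration of the two phase functions involved (not the engine's actual shader code), here is a minimal Python sketch: the Rayleigh phase function plus the Henyey-Greenstein approximation commonly used as a stand-in for Mie scattering. The function names and the choice of Henyey-Greenstein are my assumptions.

```python
import math

def rayleigh_phase(cos_theta):
    # Rayleigh phase function: near-isotropic, slightly favoring
    # forward/backward scattering. The wavelength dependence (blue sky)
    # lives in the scattering coefficient, not the phase function.
    return (3.0 / (16.0 * math.pi)) * (1.0 + cos_theta * cos_theta)

def mie_phase_hg(cos_theta, g):
    # Henyey-Greenstein approximation for Mie scattering.
    # g in (-1, 1): higher g = stronger forward lobe = tighter haze glow.
    denom = 1.0 + g * g - 2.0 * g * cos_theta
    return (1.0 - g * g) / (4.0 * math.pi * denom ** 1.5)
```

With g near 0.9 the Mie lobe is strongly forward-peaked, which matches the "high particle anisotropy" sunset shot below; lowering g spreads the haze more evenly.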

There are three layers of fog:
- light scattering haze (softens distant edges)
- height based depth fog with animated wind texture (creates a sense of volume)
- distant band of fog that blends to skybox color (fades geometry to avoid hard edges)
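The three layers above can be sketched as a per-pixel composite. This is a toy CPU-side version; all constants and names are illustrative, and the animated wind texture from the second layer is omitted.

```python
import math

def fog_factor(depth, height, sky_color, scene_color,
               haze_density=0.002, height_falloff=0.1,
               fog_base_height=0.0, band_start=800.0, band_end=1000.0):
    # Layer 1: light-scattering haze (Beer-Lambert extinction over depth).
    haze = 1.0 - math.exp(-haze_density * depth)
    # Layer 2: height-based fog, denser near fog_base_height
    # (wind-texture animation omitted in this sketch).
    height_fog = math.exp(-height_falloff * max(height - fog_base_height, 0.0))
    height_fog *= 1.0 - math.exp(-haze_density * depth)
    # Layer 3: distant band blending geometry into the skybox color.
    band = min(max((depth - band_start) / (band_end - band_start), 0.0), 1.0)
    # Composite: lerp scene toward sky by the combined factor.
    f = min(haze + height_fog + band, 1.0)
    return tuple(s * (1.0 - f) + k * f for s, k in zip(scene_color, sky_color))
```

Past `band_end` the factor saturates to 1, so distant geometry fades exactly to the skybox color and hard edges disappear.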

I'll be adding SDF ray marching for more volumetric behavior. I'm trying to avoid froxels for performance reasons.

Screenshot 56

High particle anisotropy, low Mie coefficient -> colourful sunset

Screenshot 54

Increased humidity -> higher haze (increased Mie coefficient, low particle anisotropy)

The ultimate intent is smooth blending across times of day, weather, nearby anomalies, and biomes.

Screenshot 48

First attempt at indirect shadows. The Chebyshev occlusion and weighting are still work in progress, but the first results look good. Denser cascades are required, but everything still runs nicely at ~3.5 ms on an RTX 3060.
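"Cheb occlusion" presumably refers to the Chebyshev upper bound used in variance/moment shadow techniques; a minimal sketch of that bound (my reconstruction, not the engine code):

```python
def chebyshev_occlusion(mean, mean_sq, receiver_depth, min_variance=1e-4):
    # Chebyshev's inequality gives an upper bound on the probability that
    # the occluder depth exceeds the receiver depth, using only the first
    # two depth moments stored in the shadow data.
    if receiver_depth <= mean:
        return 1.0  # receiver is in front of the occluder distribution
    variance = max(mean_sq - mean * mean, min_variance)
    d = receiver_depth - mean
    return variance / (variance + d * d)
```

The `min_variance` clamp is the usual trick to avoid divide-by-zero banding on planar occluders; the weighting mentioned above would go on top of this bound.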

Also added a very fast 4-tap weighted cross blur using LDS that generates 7 mips for specular (~10 µs per level), plus further optimizations.
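A CPU-side sketch of one level of a weighted cross-blur downsample. The real version runs in a compute shader with the tile cached in LDS, and likely folds the five logical taps (center plus four arms) into 4 fetches via bilinear filtering; the weights here are illustrative.

```python
import numpy as np

def cross_blur_downsample(img, w_center=0.5, w_arm=0.125):
    # One level: weighted cross blur (center + 4 axis neighbors),
    # then a 2x decimation. Weights sum to 1 so energy is preserved.
    up = np.pad(img, 1, mode='edge')
    c = up[1:-1, 1:-1]
    blurred = (w_center * c
               + w_arm * (up[:-2, 1:-1] + up[2:, 1:-1]
                          + up[1:-1, :-2] + up[1:-1, 2:]))
    return blurred[::2, ::2]

def build_mip_chain(img, levels=7):
    # Repeatedly blur + downsample to produce the specular mip chain.
    mips = [img]
    for _ in range(levels):
        mips.append(cross_blur_downsample(mips[-1]))
    return mips
```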

Screenshot 46

Added a tiny material blend on voxel edges. Materials and structures now feel more grounded and don't look like they float.

To achieve this I had to implement dynamic materials created from config, so texture packing becomes automatic. The upside is I now have a texture array for each texture type. The shader needs access to all textures in order to create a blend: it scans neighbors for each edge, and since it leverages the voxel-material 3D map and LUT already resident on the GPU for GI material sampling, this was quite easy to achieve.
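A hypothetical CPU-side sketch of the neighbor scan, assuming a dense 3D material-id grid with 0 meaning air; the real version runs in the shader against the material map and LUT, and the blend-weight heuristic here is purely my invention.

```python
def edge_blend_materials(mat, x, y, z):
    # For a voxel near an edge, gather material ids of the six
    # face-adjacent neighbors; differing ids mark an edge to blend across.
    center = mat[x][y][z]
    neighbors = [mat[x + 1][y][z], mat[x - 1][y][z],
                 mat[x][y + 1][z], mat[x][y - 1][z],
                 mat[x][y][z + 1], mat[x][y][z - 1]]
    others = [m for m in neighbors if m != center and m != 0]  # 0 = air
    # Illustrative blend weight: fraction of solid neighbors whose
    # material differs from the center voxel.
    solid = [m for m in neighbors if m != 0]
    weight = len(others) / len(solid) if solid else 0.0
    return center, others, weight
```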

Another big upside is that material properties now live directly in the block asset definition, right next to other gameplay properties. This also let me automatically render a UI icon for a given block on load.

The perf impact is a few microseconds. The downside is one small artifact I will get rid of later..

Screenshot 45

Specular fixed. It was caused by the mip blur on radiance probes not wrapping around edges: when the octa tiles were wrapped around a sphere, the corners created a sharp seam that presented itself as a reflection artifact. Prefiltering of the roughness mips now taps across edges when blurring.
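The fix amounts to wrapping blur taps across octahedral tile edges. A sketch of the standard octahedral decode plus the edge-wrap rule for a tap that steps one texel off the tile (my reconstruction of the usual scheme, not the engine code):

```python
import math

def _sign(v):
    return 1.0 if v >= 0.0 else -1.0

def octa_decode(u, v):
    # (u, v) in [0,1]^2 -> unit direction via octahedral mapping.
    px, py = 2.0 * u - 1.0, 2.0 * v - 1.0
    z = 1.0 - abs(px) - abs(py)
    if z < 0.0:  # lower hemisphere: unfold the corner triangles
        px, py = (1.0 - abs(py)) * _sign(px), (1.0 - abs(px)) * _sign(py)
    l = math.sqrt(px * px + py * py + z * z)
    return px / l, py / l, z / l

def octa_wrap(x, y, n):
    # Map a texel coordinate that stepped off an n x n octahedral tile to
    # the texel that is actually adjacent on the sphere: reflect back
    # across the crossed edge and mirror the other axis.
    if (x < 0 or x >= n) and (y < 0 or y >= n):
        return (x + n) % n, (y + n) % n   # corner: diagonally opposite texel
    if x < 0 or x >= n:
        return ((-x - 1) if x < 0 else (2 * n - 1 - x)), n - 1 - y
    if y < 0 or y >= n:
        return n - 1 - x, ((-y - 1) if y < 0 else (2 * n - 1 - y))
    return x, y
```

Blurring with wrapped taps keeps the filtered radiance continuous across the seams that appear when the octa tile is folded back onto the sphere.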

You can see the walls reflecting the ground, sky, and color correctly now, with the correct specular response. (These are traced speculars, not SSR; SSR is blended on top, and only for very smooth surfaces.)

Screenshot 43
Now I have to fix the specular and radiance response of the GI system.. I have already smoothed out most problems (most were related to GatherRed not wrapping around edges), but this screenshot illustrates my biggest one: for some reason the MIP selection differs based on view direction and face orientation.. there are also artifacts in the corners.
Screenshot 42
The result - light-tight 1 voxel wide interior with a small opening in the ceiling, light bouncing off the red floor.
Screenshot 40

I may have found the optimal path: a Chebyshev-dilation-accelerated polycube EDT for the near-field SDF, plus a low-res far-field SDF for distant geometry. JFA was fast, but it tended to overestimate.

The current approach builds a truncated, neighbor-aware signed distance field per chunk by treating solid voxels as cubes (a polycube field), assembling a 34^3 halo, then using Chebyshev (box) dilations to classify most voxels and only doing an exact Euclidean cube-distance search in the boundary band. This gives usable, sign-correct distances that preserve 1-voxel-thin features, while keeping bandwidth low and allowing incremental rebuilds via small per-chunk bricks. The main optimization tricks are a bitset halo, separable dilation, and a distance-sorted offset list, which let most voxels early-out to +/- and minimize expensive searches. The con is that it is still an approximate SDF due to truncation and grid sampling (still not a full EDT), so conservative ray stepping is required and far-field coverage needs a separate coarse SDF.
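A brute-force CPU sketch of the truncated, sign-correct field using the distance-sorted offset list with early-out. It omits the bitset halo and the separable Chebyshev dilation pre-classification (so every voxel runs the search), measures center-to-center rather than exact cube distance, and treats out-of-bounds as same-state instead of reading a neighbor halo.

```python
import math

def truncated_sdf(solid, t_max):
    # solid[x][y][z]: True = solid voxel. Returns a truncated signed
    # distance field sampled at voxel centers (negative inside).
    nx, ny, nz = len(solid), len(solid[0]), len(solid[0][0])
    r = int(math.ceil(t_max))
    # Distance-sorted offset list: the first opposite-state voxel found
    # is guaranteed to be the nearest one, so we can early-out.
    offsets = sorted(
        ((ox, oy, oz, math.sqrt(ox * ox + oy * oy + oz * oz))
         for ox in range(-r, r + 1)
         for oy in range(-r, r + 1)
         for oz in range(-r, r + 1)),
        key=lambda o: o[3])
    out = [[[0.0] * nz for _ in range(ny)] for _ in range(nx)]
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                inside = solid[x][y][z]
                d = t_max
                for ox, oy, oz, dist in offsets:
                    if dist >= t_max:
                        break  # truncation bound reached
                    px, py, pz = x + ox, y + oy, z + oz
                    if (0 <= px < nx and 0 <= py < ny and 0 <= pz < nz
                            and solid[px][py][pz] != inside):
                        d = dist
                        break  # sorted order: this is the nearest
                out[x][y][z] = -d if inside else d
    return out
```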

Using an SM5-optimized, minimal-bandwidth approach and a shared-memory kernel, the time for a 64-chunk batch is well under 2 ms!! Plus, with an async queue it presents itself only as minor background noise in the profiler :)

On the image: left side, real-time SDF; right side, brute-force offline-calculated reference.

EDTs were quite OK, but the performance hit of calculating new chunks as the player moved was too high (~100 ms per 64-chunk batch), and the total SDF memory footprint was too high to be considered viable (~300 MB for SDF data alone). Now I am exploring a 4-level 128x128x128 R16G16_UNorm 3D slicemap atlas using the jump flood algorithm. With 0.5 - 1 - 2 - 4 spacing per level, the memory footprint is way smaller and the perf hit is 10x smaller.
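Back-of-the-envelope for the new atlas footprint, assuming four full 128^3 levels at 4 bytes per texel:

```python
levels = 4               # cascade levels with 0.5 / 1 / 2 / 4 voxel spacing
res = 128                # each level is a 128^3 volume
bytes_per_texel = 4      # R16G16_UNorm: two 16-bit channels
total_bytes = levels * res ** 3 * bytes_per_texel
print(f"{total_bytes / 2**20:.0f} MiB")  # prints "32 MiB"
```

32 MiB versus ~300 MB for the full EDT data is roughly the 10x reduction mentioned above.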
So it appears that approximating the SDF using Chamfer-style masks introduces some directional drift that inherently confuses the SDF tracer. I wanted to avoid EDTs (Euclidean distance transforms), but it seems I won't avoid the math..
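The directional drift can be seen with a tiny 2D example using the classic <3,4> chamfer mask (illustrative, not the engine's actual mask):

```python
import math

def chamfer_cost(dx, dy):
    # 2D chamfer distance with the <3,4> mask (dx, dy >= 0):
    # axial steps cost 3, diagonal steps cost 4. Dividing by 3
    # makes it comparable to Euclidean distance.
    diag = min(dx, dy)
    axial = max(dx, dy) - diag
    return 4 * diag + 3 * axial

for dx, dy in [(10, 0), (10, 10), (10, 5)]:
    approx = chamfer_cost(dx, dy) / 3.0
    exact = math.hypot(dx, dy)
    print(f"({dx},{dy}) chamfer={approx:.3f} exact={exact:.3f} "
          f"ratio={approx / exact:.3f}")
```

The ratio swings from ~0.94 (under-estimate on diagonals) to ~1.04 (over-estimate in between), and it is exact only on the axes. A sphere tracer stepping by an over-estimated distance can tunnel straight through surfaces, which is exactly the kind of drift described above.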
Screenshot 39
After investigating lighting leaks, I added a planar, probe-centered SDF slice renderer to visualise the SDF world around a probe as a 2D cross-section along an adjustable yaw. This basically converts the 3D SDF world into a single ortho 2D image. After pointing it at the leak source, I noticed my SDF gradients cut off at chunk edges.. this effectively tells the SDF tracer there's ~4m of free space ahead even though there's actually a flat wall right in front, because the chunk-to-SDF converter doesn't see into neighboring chunks. This misleads the tracer and effectively makes it skip walls at or near chunk boundaries..
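The slice renderer boils down to sampling the SDF over an oriented plane. A minimal sketch with a hypothetical `sample_sdf` callback (names and parameters are my own):

```python
import math

def render_sdf_slice(sample_sdf, center, yaw, size, res):
    # Ortho 2D cross-section of a 3D SDF: a vertical plane through
    # `center`, rotated by `yaw` around the up (Y) axis.
    right = (math.cos(yaw), 0.0, math.sin(yaw))
    img = []
    for j in range(res):
        row = []
        for i in range(res):
            u = (i / (res - 1) - 0.5) * size
            v = (j / (res - 1) - 0.5) * size
            p = (center[0] + right[0] * u,
                 center[1] + v,
                 center[2] + right[2] * u)
            row.append(sample_sdf(p))
        img.append(row)
    return img
```

Visualising `img` as a signed heatmap makes a missing gradient at a chunk boundary show up as an abrupt jump in the contour lines.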
Screenshot 36
Thin geometry is going to be an issue :/ On to rethinking probe placement again. Probes spawned inside the tunnel, but they need to be relocated outside, within the same cell, when the camera moves..
Screenshot 35
Shading fixed, probe positioning almost fixed - indoor - nearly there, but still a bit too dark for my liking
Screenshot 32
Shading fixed, probe positioning almost fixed - lit outdoor
Screenshot 34
Shading fixed, probe positioning almost fixed - lit/ambient cave
Screenshot 33
Shading fixed, probe positioning almost fixed - ambient cave
Screenshot 31
Caves are OK, interiors are terrible.. time to dive back into depth moments
Screenshot 30
irradiance probes for diffuse
Screenshot 29
radiance probes for specular
Screenshot 28
radiance probes for specular
Screenshot 27
diffuse probes
Screenshot 26
After weeks of tweaking, caves look like caves
Screenshot 25
Added hooks from the screenspace compute into the URP deferred stencil shader and it finally looks like light!
Screenshot 24
GBuffer full of noisy normals, time to reconstruct a clean gbuffer purely for GI
Screenshot 23
I mean look at that..
Screenshot 22
Deleted the entire composite shader and moved everything to a screenspace compute shader, after several iterations and tons of papers read, finally something that looks right!!
Screenshot 21
Although rough, the diffuse probes finally work
Screenshot 20
First indirect light?!
Screenshot 19
Probe cascades, more coherent composite, overblown specular..
Screenshot 18
First prototype of a composite shader.. well.. it's something
Screenshot 17
First hits of SDF material ray marching!!
Screenshot 16
Reworked probes completely to a depth-based spawn
Screenshot 15
First shot at frustum probe cascade
Screenshot 14
Finally the SDF "almost" works (with some holes)
Screenshot 13
Converting the world to signed distance field cascades
Screenshot 12
Experimenting with fixed probes and raytracing
Screenshot 11
First attempt at Global Illumination.. well..
Screenshot 10
Chunk column ticketing instrumentation helped a lot
Screenshot 9
Wait.. nevermind, still broken..
Screenshot 8
OK, Edit ingress and persistence finally working.. we cookin?
Screenshot 7
Didn't realize I need to read neighbor chunks' buffer slices..
Screenshot 6
Greedy mesher maybe a bit TOO greedy?
Screenshot 5
.. Something is very off?
Screenshot 4
Second iteration on voxel engine with pooled ticketed approach...
Screenshot 3
Looks nice?
Screenshot 2
Greedy meshing attempts
Screenshot 1
First attempts at voxel terrain generator (this is the first screenshot after I realized I want to document my journey :) )