Early work on what is now known as the SUPERVERSE game was often interrupted by side attractions. Most of them were small graphics or game projects created ad hoc, fueled by a simple desire to understand and implement a novel graphics technique or an interesting programming concept.
One of the biggest detours from the main road was the terrain rendering project. It started as a simple exercise in more complex spatial structures and some interesting shading techniques that had recently been published at the time, and it ended up being quite a comprehensive testing ground (pun intended) for realtime terrain rendering.
Much of it actually ended up in the SUPERVERSE code base later, for example atmospheric shading, advanced materials and high quality post-processing effects.
It all started with geo-mipmapped terrain. One of the major problems any terrain engine must solve is keeping the polycount at a sane number. Brute-forcing a pile of triangles onto the screen may work for some small landscapes, but drawing up to a few kilometers into the distance while keeping close terrain features detailed enough requires a proper solution. Similar to mipmaps, geomipmapping employs a pyramid of progressively smaller heightmaps. To start, the terrain is sliced into a grid of patches. In my case, top level patches were grids of 32×32 quads (64×64 in later iterations). Lower detail patches were generated by simply skipping every odd vertex, ending with 16×16 quads, then 8×8 and so on. Different metrics can be used to determine which level of detail a patch should get. To avoid nasty t-junctions (when two patches with different LOD levels sit next to each other), indices were reordered so that neighboring patches stitched together correctly.

Having an axis-aligned bounding box for each patch helped with frustum culling, and some primitive occlusion culling was added later. I experimented with rasterizing occluders on the CPU to allow dynamic rejection of arbitrary geometry (similar to work done on the Frostbite engine), but never got it into a working state. Later versions used a single static vertex buffer with dynamic index buffers per patch that were only updated when needed, depending on the current camera position. The VBO was a simple 2D grid and the actual height was sampled in both the vertex and fragment shaders.
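To make the LOD selection concrete, here is a minimal sketch of a distance-based pick, written for this post rather than taken from the original code; the Patch struct, constants and function names are hypothetical, and classic geomipmapping would use a screen-space error metric rather than raw distance.

```cpp
// Minimal sketch of distance-based geomipmap LOD selection, assuming
// 32x32-quad top-level patches. A production version would derive the
// level from a screen-space error metric instead of camera distance.

#include <algorithm>
#include <cmath>

struct Patch {
    float centerX, centerZ;   // world-space patch center
    int   lod;                // 0 = full detail (32x32 quads)
};

constexpr int   kMaxLod      = 5;      // 32x32 quads down to 1x1
constexpr float kLodDistance = 128.0f; // distance per LOD step, a tuning value

int SelectLod(const Patch& p, float camX, float camZ)
{
    const float dx = p.centerX - camX;
    const float dz = p.centerZ - camZ;
    const float dist = std::sqrt(dx * dx + dz * dz);
    // Every kLodDistance units halves the patch resolution.
    const int lod = static_cast<int>(dist / kLodDistance);
    return std::min(lod, kMaxLod);
}
```

With five levels, a 32×32 patch degrades down to a single quad near the horizon, and the per-patch index buffer only needs rebuilding when the selected level (or a neighbor's) actually changes.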
In the end I had quite large terrains (up to 4096×4096 quads) running very well on modest hardware.
At that time, two AAA games were doing some pretty interesting stuff regarding terrain shading and lighting. Crytek's Crysis and Battlefield: Bad Company by DICE both had large landscapes, dynamic texture splatting and great lighting, and both studios had published some details on the tech behind it.
Shader splatting as done in the Frostbite engine looked like a good choice for a simple terrain engine. Being able to change the terrain simply by modifying a few parameters, and using just heightmaps without any additional processing (no saving of precomputed terrain ambient occlusion or low-res baked shadows), helped me focus on coding rather than tweaking various textures for the desired look. Splatting, however, wasn't optimized, and having a large number of different shader layers wasn't easy on the GPU. Frostbite generated a shader for each combination of layers over a single tile, but I never got around to implementing a similar system. In my case, 4 to 5 layers were good enough anyway.
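The core idea is easier to show than to describe: each layer's blend weight is computed procedurally from terrain attributes instead of being painted into blend textures. Here is a CPU-side sketch of that weighting; the layer rules, names and the 4-layer setup are illustrative assumptions, not Frostbite's actual implementation.

```cpp
// Procedural splat weights from height and slope. In the real thing the
// same rules run per fragment in the terrain shader; this CPU version
// just demonstrates the weighting logic.

#include <algorithm>
#include <array>

struct Layer {
    float minHeight, maxHeight; // height range where the layer appears
    float maxSlope;             // 0 = flat ground only, 1 = any slope
};

// Example stack: sand, grass, rock, snow (bottom to top).
constexpr std::array<Layer, 4> kLayers = {{
    {  0.0f,  10.0f, 0.3f },
    {  5.0f,  60.0f, 0.5f },
    {  0.0f, 120.0f, 1.0f },
    { 80.0f, 120.0f, 0.4f },
}};

// Returns normalized blend weights for one terrain sample.
// 'slope' is 1 - normal.y, so 0 on flat ground.
std::array<float, 4> SplatWeights(float height, float slope)
{
    std::array<float, 4> w{};
    float total = 0.0f;
    for (size_t i = 0; i < kLayers.size(); ++i) {
        const Layer& l = kLayers[i];
        const float hw = (height >= l.minHeight && height <= l.maxHeight) ? 1.0f : 0.0f;
        const float sw = std::max(0.0f, 1.0f - slope / l.maxSlope);
        w[i] = hw * sw;
        total += w[i];
    }
    if (total > 0.0f)
        for (float& x : w) x /= total;
    return w;
}
```

In the shader version each weight then scales the corresponding layer's albedo and normal samples, so changing the look of the whole terrain really is just a matter of editing a few parameters.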
I was well aware that having decent quality shading is just one side of the story. High quality lighting that would simulate natural scattering in the sky and the color of the Sun was the second part. Crysis had state of the art sky rendering based on work by Nishita. My first attempt at solving the scattering integral wasn't completely successful, missing the correct constants. The publication of Sean O'Neil's single-scattering sky rendering (also based on a simplified Nishita model) helped with the second and third implementations, eventually ending in a mixed CPU/GPU solution. Finally I was able to produce a believable sky with the deep blues of a clear afternoon and beautiful sunsets. Scattering was computed on the CPU, sent to the sky shader, and packed into SH lighting for accurate shading of terrain and models. Dynamic time-of-day was possible, with the scattering calculation spread over several frames to ease the cost. Of course, the generated data could be saved for a super-fast static sky.
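To give a flavor of the CPU side, here is a heavily simplified single-scattering march in the spirit of the Nishita/O'Neil approach, not the original code: Rayleigh-only, crude constants, a fixed sun-path length, and no Mie term, ozone or planet intersection handling.

```cpp
// Single scattering, brute force: march the view ray, and at each sample
// attenuate sunlight by the optical depth along both the sun path and the
// view path. All constants and names here are illustrative assumptions.

#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

constexpr float kPlanetRadius = 6360e3f; // meters
constexpr float kRayleighH    = 8000.0f; // Rayleigh scale height
constexpr Vec3  kBetaR = {5.8e-6f, 13.5e-6f, 33.1e-6f}; // RGB scattering coeffs
constexpr int   kSteps = 16;

// Exponential falloff of air density with altitude above the surface.
static float Density(Vec3 p)
{
    const float h = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - kPlanetRadius;
    return std::exp(-h / kRayleighH);
}

// Optical depth along a segment, integrated with midpoint samples.
static float OpticalDepth(Vec3 from, Vec3 dir, float length)
{
    const float ds = length / kSteps;
    float depth = 0.0f;
    for (int i = 0; i < kSteps; ++i)
        depth += Density(from + dir * ((i + 0.5f) * ds)) * ds;
    return depth;
}

// In-scattered light along a view ray; O(n^2) and unoptimized on purpose.
Vec3 InScatter(Vec3 camera, Vec3 viewDir, float rayLength, Vec3 sunDir)
{
    const float ds = rayLength / kSteps;
    Vec3 sum = {0, 0, 0};
    for (int i = 0; i < kSteps; ++i) {
        const Vec3 p = camera + viewDir * ((i + 0.5f) * ds);
        const float t = OpticalDepth(p, sunDir, 100e3f)
                      + OpticalDepth(camera, viewDir, (i + 0.5f) * ds);
        const Vec3 att = {std::exp(-kBetaR.x * t),
                          std::exp(-kBetaR.y * t),
                          std::exp(-kBetaR.z * t)};
        sum = sum + Vec3{att.x * kBetaR.x, att.y * kBetaR.y, att.z * kBetaR.z}
                  * (Density(p) * ds);
    }
    return sum; // still to be scaled by sun intensity and the Rayleigh phase function
}
```

Evaluating something like this per vertex of a sky dome on the CPU, then packing the result into SH, matches the mixed CPU/GPU split described above, and spreading the marches over several frames is what made dynamic time-of-day affordable.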
Procedural grass was generated each time the camera moved, by placing cylindrical billboards on a jittered grid around the player. Grass billboards are shaded with modified terrain normals and fade out with distance, blending seamlessly with the landscape. The top vertices of each billboard could be animated to produce the waving of blades in the wind. Each billboard's position was calculated taking into account the height and slope of the terrain, the same rules as for the grass blend layer. Having each blade as actual geometry was planned, but never finished.
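A minimal sketch of that placement step, assuming hypothetical terrain-query helpers: the jitter is derived from a per-cell hash, so regenerating the set as the camera moves keeps every billboard at a stable position instead of making the grass "swim".

```cpp
// Grass billboards on a jittered grid around the camera. TerrainHeight and
// TerrainSlope are stand-ins for the engine's real terrain queries; trivial
// analytic stubs are used here so the sketch is self-contained.

#include <cmath>
#include <cstdint>
#include <vector>

struct GrassInstance { float x, y, z; };

static float TerrainHeight(float x, float z) { return 4.0f * std::sin(x * 0.05f) * std::cos(z * 0.05f); }
static float TerrainSlope(float x, float z)  { return 0.2f + 0.2f * std::sin(x * 0.1f + z * 0.1f); }

// Cheap 2D integer hash mapped to [0, 1).
static float Hash01(int32_t ix, int32_t iz, uint32_t seed)
{
    uint32_t h = static_cast<uint32_t>(ix) * 374761393u
               + static_cast<uint32_t>(iz) * 668265263u + seed;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h & 0xFFFFFFu) / 16777216.0f;
}

std::vector<GrassInstance> PlaceGrass(float camX, float camZ,
                                      float radius, float cellSize)
{
    std::vector<GrassInstance> out;
    const int32_t minX = static_cast<int32_t>(std::floor((camX - radius) / cellSize));
    const int32_t maxX = static_cast<int32_t>(std::floor((camX + radius) / cellSize));
    const int32_t minZ = static_cast<int32_t>(std::floor((camZ - radius) / cellSize));
    const int32_t maxZ = static_cast<int32_t>(std::floor((camZ + radius) / cellSize));

    for (int32_t iz = minZ; iz <= maxZ; ++iz) {
        for (int32_t ix = minX; ix <= maxX; ++ix) {
            // Jitter the billboard inside its cell with a per-cell hash.
            const float x = (ix + Hash01(ix, iz, 1u)) * cellSize;
            const float z = (iz + Hash01(ix, iz, 2u)) * cellSize;
            // Reject steep slopes, the same rule the grass blend layer uses.
            if (TerrainSlope(x, z) > 0.4f) continue;
            out.push_back({x, TerrainHeight(x, z), z});
        }
    }
    return out;
}
```

Only the ring of cells entering or leaving the radius actually changes between updates, which keeps regeneration cheap enough to run whenever the camera moves.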
Other notable features were: water rendering with full scene reflection, refraction and underwater fog; a Valve SMD model loader (without animations); simple screen-space ambient occlusion; and an HDR pipeline with linear colors, wide bloom, auto-exposure and several tonemappers at the end (Reinhard, exponential, and everybody's favorite today, filmic).
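For the curious, here are minimal sketches of those three tonemappers, applied per channel to linear HDR values. The filmic curve shown is the widely used Hable (Uncharted 2) fit, an assumption on my part since the exact operator isn't named above.

```cpp
// Three classic tonemapping operators, mapping linear HDR to [0, 1].

#include <cmath>

float TonemapReinhard(float x)    { return x / (1.0f + x); }
float TonemapExponential(float x) { return 1.0f - std::exp(-x); }

static float HableCurve(float x)
{
    // Hable's published constants: shoulder, linear, and toe parameters.
    const float A = 0.15f, B = 0.50f, C = 0.10f, D = 0.20f, E = 0.02f, F = 0.30f;
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
}

float TonemapFilmic(float x)
{
    const float whitePoint = 11.2f; // linear value that maps to pure white
    return HableCurve(x) / HableCurve(whitePoint);
}
```

Auto-exposure simply scales the input before the curve, and the division by HableCurve(whitePoint) normalizes the filmic curve so it actually reaches white.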
Traveling down the path is often more rewarding than reaching the destination, especially when one's end goal is constantly changing. Many techniques from the terrain renderer (used in AAA games at the time) were later moved into SUPERVERSE and allowed for the more optimized, higher quality graphics that we are proud of. But don't expect to land on the planets you may encounter while blasting through the game.
Well, not just yet.