We have been pretty quiet lately. The reason is that we have shifted our attention to the development side. Quite a few things have been achieved in the meantime, and more will be coming soon.
These are the most important achievements from the last two months:
WE HAVE A NEW ICON
Take a look above. The icon was designed by Daniel Djarmati. How do you like it?
Dejan will be responsible for the complete audio experience, including music and sound effects in the game.
We added support for devices featuring a touch screen, which is especially interesting for tablets. An on-screen virtual game pad can be used to control the game.
SUPERVERSE running on Acer Aspire P3
How to activate it? Just tap the screen!
At the moment the game supports a large number of players playing locally. The limiting factor is the number of game controllers you can attach to your PC; the theoretical limit is currently 21 players.
Several players can play on the keyboard, up to 16 using game controllers, and one using the touch screen – not that this would be practical, but it is possible. :)
Given the crowd we would have if all 21 players played at once, we will have to limit the number of players to something reasonable. How many players will be allowed to play simultaneously in the final version of the game is yet to be decided.
Some effort has been put into making the game run on as many graphics cards as possible. In addition to dedicated Nvidia and AMD/ATI graphics chips, the game now supports integrated Intel HD 4000 and newer models.
The game is slowly taking the shape of a real game; we are gradually moving away from the phase where it was “just” a technical demonstration.
The energy and energy recovery system is in place, along with the ability to respawn another player who has just lost a life. The first enemies (non-player controlled objects) have been added. The arsenal already has two weapons with some variations. The bullet time effect is there, along with a few other items for players to pick up.
A more 3D-ish perspective has been added to the game by changing the way objects fly into the scene. We’ve also added a system that enables us to track the way players are playing the game – this will give us plenty of useful data that we can use to tweak and adjust the gameplay.
Early work on what is now known as the SUPERVERSE game was often interrupted by side attractions. Most of them were small graphics or game projects created ad hoc, fueled by a simple desire to understand and implement a novel graphics technique or an interesting programming concept.
One of the biggest detours from the main road was a terrain rendering project. Started as a simple exercise in more complex spatial structures and some interesting shading techniques that had recently been published at the time, it ended up being quite a comprehensive testing ground (pun intended) for realtime terrain rendering.
Much of it later actually ended up in the SUPERVERSE code base, for example atmospheric shading, advanced materials and high quality post-processing effects.
Companion cubes from Portal :)
It all started with a geomipmapped terrain. One of the major problems any terrain engine must solve is keeping the polygon count within sane limits. Brute-forcing a bunch of triangles onto the screen may work for some small landscapes, but drawing up to a few kilometers into the distance while keeping close terrain features detailed enough requires a proper solution.

Similar to mipmaps, geomipmapping employs a pyramid of progressively smaller heightmaps. To start, the terrain is sliced into a grid of patches. In my case, top-level patches were grids of 32×32 quads (64×64 in later iterations). Lower-detail patches were generated by simply skipping every odd vertex, ending with 16×16 quads, then 8×8 and so on. Different metrics are used to determine which level of detail should be used. To avoid nasty T-junctions (which appear when two patches with different LOD levels are next to each other), indices were reordered to stitch neighboring patches correctly.

Having axis-aligned bounding boxes for each patch helped with frustum culling. Some primitive occlusion culling was added later. I experimented with rasterization of occluders on the CPU to allow for dynamic rejection of arbitrary geometry (similar to work done on the Frostbite engine), but never got it into a working state. Later versions had a single static vertex buffer with dynamic index buffers per patch that were only updated when needed, depending on the current camera position. The VBO was a simple 2D grid and the actual height was sampled in both the vertex and fragment shaders.

In the end I had quite large terrains (up to 4096×4096 quads) running very well on modest hardware.
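The vertex-skipping scheme can be sketched like this. Names and the exact triangle layout are illustrative, not the original code: each LOD level doubles the step between vertices of a shared grid, quartering the quad count per patch (stitching of neighboring LODs is omitted here).

```cpp
#include <cstdint>
#include <vector>

// Build the index buffer for one terrain patch at a given geomipmap level.
// The patch is `size` x `size` quads at LOD 0; each higher level skips
// every other vertex, so the step doubles: 1, 2, 4, ...
std::vector<uint32_t> buildPatchIndices(int size, int lod)
{
    const int step = 1 << lod;   // distance between used vertices
    const int pitch = size + 1;  // vertices per row in the shared grid VBO
    std::vector<uint32_t> indices;
    for (int y = 0; y < size; y += step) {
        for (int x = 0; x < size; x += step) {
            uint32_t tl = y * pitch + x;          // top-left corner
            uint32_t tr = tl + step;              // top-right
            uint32_t bl = (y + step) * pitch + x; // bottom-left
            uint32_t br = bl + step;              // bottom-right
            // two triangles per quad
            indices.insert(indices.end(), { tl, bl, tr, tr, bl, br });
        }
    }
    return indices;
}
```

With a single static vertex buffer, as described above, only these index buffers need regenerating when a patch changes LOD.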
At that time, two AAA games were doing some pretty interesting things with terrain shading and lighting. Crytek’s Crysis and Battlefield: Bad Company by DICE both had large landscapes, dynamic texture splatting and great lighting. Both studios also published some details on the tech behind them.
Shader splatting as done in the Frostbite engine looked like a good choice for a simple terrain engine. Being able to change the terrain simply by modifying several parameters, and using just heightmaps without the need for additional processing (saving precomputed terrain ambient occlusion and low-res baked shadows), helped me focus on coding rather than tweaking various textures for the desired look. Splatting, however, wasn’t optimized, and having a large number of different shader layers wasn’t easy on the GPU. Frostbite generated a shader for each combination of layers over a single tile, but I never got around to implementing a similar system. In my case, having 4 to 5 layers was good enough anyway.
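As a rough illustration of what procedural splatting rules can look like: each layer gets a weight from simple height/slope functions and the weights are normalized before blending the layer materials. The layer names and thresholds below are made up for the sketch; they are not Frostbite’s or this project’s actual parameters.

```cpp
#include <algorithm>
#include <cmath>

// Per-texel splat weights derived from normalized height and slope.
// On the GPU this would run in the fragment shader; shown on the CPU
// for clarity.
struct SplatWeights { float grass, rock, snow, sand; };

SplatWeights computeSplatWeights(float height01, float slope01)
{
    SplatWeights w{};
    w.sand  = std::clamp(1.0f - height01 * 8.0f, 0.0f, 1.0f);   // shorelines
    w.snow  = std::clamp((height01 - 0.7f) * 5.0f, 0.0f, 1.0f); // peaks
    w.rock  = std::clamp((slope01 - 0.5f) * 4.0f, 0.0f, 1.0f);  // steep faces
    w.grass = std::clamp(1.0f - w.sand - w.snow - w.rock, 0.0f, 1.0f);
    // Normalize so the blended albedo always sums to one.
    float sum = w.grass + w.rock + w.snow + w.sand;
    w.grass /= sum; w.rock /= sum; w.snow /= sum; w.sand /= sum;
    return w;
}
```

The appeal mentioned above is visible even in this toy version: retexturing the whole terrain is a matter of editing a few constants rather than repainting splat maps.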
I was well aware of the fact that decent quality shading is just one side of the story. High quality lighting that would simulate natural scattering in the sky and the color of the Sun was the second part. Crysis had state of the art sky rendering based on the work of Nishita. My first attempt at solving the scattering integral wasn’t completely successful, missing the correct constants. The publication of Sean O’Neil’s single-scattering sky rendering (also based on a simplified Nishita model) helped with the second and third implementations, later ending with a mixed CPU/GPU solution. Finally I was able to produce a believable sky, with the deep blues of clear afternoons and beautiful sunsets. Scattering was computed on the CPU, sent to the sky shader and packed into SH lighting for accurate shading of terrain and models. Dynamic time-of-day was possible, with the scattering calculation spread over several frames to ease the cost. Of course, the generated data could be saved for a superfast static sky.
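For reference, these are the two phase functions a Nishita-style single-scattering model evaluates per sample: Rayleigh for air molecules and a Henyey-Greenstein approximation for aerosols (Mie). This is just the textbook form of those functions, not this project’s actual shader code or constants.

```cpp
#include <cmath>

const float PI = 3.14159265358979f;

// Rayleigh phase: how much light air molecules scatter toward the viewer.
// cosTheta is the cosine of the angle between the view ray and the Sun.
float rayleighPhase(float cosTheta)
{
    return 3.0f / (16.0f * PI) * (1.0f + cosTheta * cosTheta);
}

// Henyey-Greenstein phase for aerosols; g in (-1, 1) controls how strongly
// light scatters forward. Values around 0.76-0.99 are typical for Mie.
float miePhaseHG(float cosTheta, float g)
{
    float g2 = g * g;
    return (1.0f - g2) /
           (4.0f * PI * std::pow(1.0f + g2 - 2.0f * g * cosTheta, 1.5f));
}
```

Both functions integrate to one over the sphere, which is what lets the in-scattered light be accumulated along the ray without gaining or losing energy.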
Procedural grass was generated each time the camera moved, placing cylindrical billboards on a jittered grid around the player. Grass billboards are shaded with modified terrain normals and fade away with distance, blending seamlessly with the landscape. The top vertices of each billboard could be animated to produce blades waving in the wind. Each billboard position was calculated taking into account the height and slope of the terrain, just like the grass blending layer. Having each blade as actual geometry was planned, but never finished.
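The key trick in jittered-grid placement is making the jitter deterministic per cell, so regenerating around a moved camera yields the same billboard positions and nothing pops. A sketch under assumed names and thresholds (the hash, the 0.8 height and 0.5 slope cutoffs are all illustrative):

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

struct GrassBlade { float x, z; };

// Cheap deterministic 2D hash -> [0, 1); same cell always jitters the same.
float cellHash(int ix, int iz)
{
    uint32_t h = uint32_t(ix) * 374761393u + uint32_t(iz) * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h & 0xFFFFFFu) / float(0x1000000);
}

// Place one billboard per grid cell within `radius` of the camera,
// skipping cells where the terrain is too high or too steep for grass.
std::vector<GrassBlade> placeGrass(float camX, float camZ, float radius,
                                   float spacing,
                                   float (*height)(float, float),
                                   float (*slope)(float, float))
{
    std::vector<GrassBlade> blades;
    int minX = int(std::floor((camX - radius) / spacing));
    int maxX = int(std::floor((camX + radius) / spacing));
    int minZ = int(std::floor((camZ - radius) / spacing));
    int maxZ = int(std::floor((camZ + radius) / spacing));
    for (int iz = minZ; iz <= maxZ; ++iz)
        for (int ix = minX; ix <= maxX; ++ix) {
            float jx = cellHash(ix, iz), jz = cellHash(iz, ix);
            float x = (ix + jx) * spacing, z = (iz + jz) * spacing;
            if (height(x, z) > 0.8f || slope(x, z) > 0.5f)
                continue; // no grass on peaks or cliffs
            blades.push_back({ x, z });
        }
    return blades;
}
```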
Other notable features were: water rendering with full scene reflection, refraction and underwater fog; a loader for Valve’s SMD model format (without animations); simple screen-space ambient occlusion; and an HDR pipeline with linear colors, wide bloom, auto-exposure and several tonemappers at the end (Reinhard, exponential and today everybody’s favorite – filmic).
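The three tonemappers mentioned, applied per color channel to linear HDR values. The filmic curve shown is John Hable’s published “Uncharted 2” fit, used here as a representative filmic operator; whether the project used those exact constants is an assumption.

```cpp
#include <cmath>

// Simple Reinhard: compresses [0, inf) into [0, 1).
float tonemapReinhard(float x)    { return x / (1.0f + x); }

// Exponential: also maps [0, inf) into [0, 1), with a softer shoulder.
float tonemapExponential(float x) { return 1.0f - std::exp(-x); }

// Hable's "Uncharted 2" filmic curve (published constants).
float hablePartial(float x)
{
    const float A = 0.15f, B = 0.50f, C = 0.10f,
                D = 0.20f, E = 0.02f, F = 0.30f;
    return ((x * (A * x + C * B) + D * E) /
            (x * (A * x + B) + D * F)) - E / F;
}

float tonemapFilmic(float x)
{
    const float whitePoint = 11.2f; // linear value that maps to pure white
    return hablePartial(x) / hablePartial(whitePoint);
}
```

The filmic curve’s built-in toe is what gives it the crunchy blacks that made it everybody’s favorite; Reinhard by comparison lifts shadows noticeably.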
Traveling down the path is often more rewarding than reaching the destination, especially when one’s end goal is constantly changing. Many techniques from the terrain renderer (used in AAA games at the time) were later moved into SUPERVERSE and allowed for the more optimized, higher quality graphics that we are proud of. But don’t expect to land on the planets you may encounter while blasting through the game.
Lately we have been spending time developing the co-operative multiplayer game mode. Two players are now able to play together, although technically there is no limit on the number of players except the number of controllers attached to the computer.
New weapons are being designed, such as this missile.