The game is being developed at a steady tempo (too sloooow), so it's time for a new update on the progress we've made so far.
First of all, the critical bugs that caused the game to crash on specific setups and in specific situations have been resolved. At this moment, we don't have any major bugs in the game that could crash it. At least, none that we are aware of.
We have spent some time improving the performance of certain aspects of the game. Collision detection and the depth of field effect have been rewritten to be much more efficient, although our intention is to dedicate time to optimization in the last phase of development, once we freeze the gameplay and all other aspects of the game. By the way, the DOF effect looks much better now.
The game playfield, the area the player can fly around in, has been expanded quite a lot. In fact, the playfield size can now differ from level to level.
The camera behaves much better now: it zooms out as the player's speed increases, letting the player see more. Similarly, in multiplayer, as players move away from each other the camera zooms out as far as possible while keeping them all inside the frustum.
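The zoom-to-fit behaviour described above can be sketched as a small helper. This is a hypothetical `cameraDistance` function, not the game's actual code: given player positions on the 2D playfield and the camera's half field-of-view angles, it returns the distance that keeps everyone inside the frustum, with a minimum distance so the camera doesn't zoom in too far when players cluster together.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Distance the camera must back off so every player stays inside a
// symmetric frustum with the given horizontal/vertical half-angles.
// 'minDist' keeps a sensible zoom when players are close together.
float cameraDistance(const std::vector<Vec2>& players,
                     float halfFovX, float halfFovY, float minDist)
{
    Vec2 center{0.0f, 0.0f};
    for (const Vec2& p : players) { center.x += p.x; center.y += p.y; }
    center.x /= players.size();
    center.y /= players.size();

    float maxX = 0.0f, maxY = 0.0f;
    for (const Vec2& p : players) {
        maxX = std::max(maxX, std::fabs(p.x - center.x));
        maxY = std::max(maxY, std::fabs(p.y - center.y));
    }
    // Widest required distance along either frustum axis.
    float dist = std::max(maxX / std::tan(halfFovX),
                          maxY / std::tan(halfFovY));
    return std::max(dist, minDist);
}
```

Speed-based zoom for a single player works the same way, just with the extent replaced by a look-ahead distance proportional to velocity.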
New sound effects were added, and all sounds now have a position in space, creating a 3D sound ambience.
Players can now enhance their weapons by collecting weapon upgrade items. The laser can be extended to fire more beams, multiple missiles can be launched simultaneously, and we've introduced a new weapon: a minigun.
These are the most notable features and improvements we've been working on since the last update. We are approaching the point where we would like to show the game to a limited number of people and get their feedback, as only a few people have actually seen the game in action so far.
We have been pretty quiet lately. The reason is that we have been focusing on the development side. Quite a few things have been achieved in the meantime, and more will be coming soon.
These are the most important achievements from the last two months:
WE HAVE A NEW ICON
Take a look above. The icon was designed by Daniel Djarmati. How do you like it?
Dejan will be responsible for the complete audio experience, including music and sound effects in the game.
We added support for devices featuring a touch screen, which is especially interesting for tablets. An on-screen virtual gamepad can be used to control the game.
How to activate it? Just tap the screen!
The game currently supports lots of players playing locally. The limiting factor is the number of game controllers you can attach to your PC; the theoretical limit is 21 players at the moment.
Several players can play on the keyboard, up to 16 using game controllers, and one using the touch screen. Not that this would be practical, but it is possible.
Because of the crowd we would have in the game if all 21 players played at once, we will have to limit the number of players to something reasonable. How many players will be allowed to play simultaneously in the final version is yet to be decided.
Some effort has been put into making the game run on as many graphics cards as possible. In addition to dedicated Nvidia and AMD/ATI GPUs, the game now supports integrated Intel HD 4000 and newer chips.
The game is slowly taking the shape of a real game; we are gradually moving away from the phase where it was "just" a technical demonstration.
We have the energy and energy-recovery system in place, along with the ability to respawn a player who has just lost a life. The first enemies (non-player-controlled objects) have been added. The arsenal already has two weapons with some variations. The bullet-time effect is there, plus a few other items for players to pick up.
A more 3D-ish perspective has been added to the game by changing the way objects fly into the scene. We've also added a system that lets us track how players play the game; this will give us plenty of useful data for tweaking and adjusting the gameplay.
Early work on what is now known as the SUPERVERSE game was often interrupted by side attractions. Most of them were small graphics or game projects created ad hoc, fueled by a simple desire to understand and implement a novel graphics technique or an interesting programming concept.
One of the biggest detours from the main road was a terrain rendering project. It started as a simple exercise in more complex spatial structures and some interesting shading techniques that had recently been published, and it ended up being quite a comprehensive testing ground (pun intended) for realtime terrain rendering.
Much of it actually ended up in the SUPERVERSE code base later, for example atmospheric shading, advanced materials and high-quality post-processing effects.
It all started with geomipmapped terrain. One of the major problems any terrain engine must solve is keeping the polygon count sane. Brute-forcing a bunch of triangles onto the screen may work for some small landscapes, but drawing up to a few kilometers into the distance while keeping nearby terrain features detailed enough requires a proper solution. Similar to mipmaps, geomipmapping employs a pyramid of progressively smaller heightmaps.

To start, the terrain is sliced into a grid of patches. In my case the top-level patches were grids of 32×32 quads (64×64 in later iterations). Lower-detail patches were generated by simply skipping every odd vertex, ending up with 16×16 quads, then 8×8, and so on. Different metrics were used to determine which level of detail should be used. To avoid nasty T-junctions (where two patches with different LOD levels sit next to each other), indices were reordered so that neighboring patches stitched together correctly.

Having axis-aligned bounding boxes for each patch helped with frustum culling, and some primitive occlusion culling was added later. I experimented with rasterizing occluders on the CPU to allow dynamic rejection of arbitrary geometry (similar to work done on the Frostbite engine), but never got it into a working state. Later versions had a single static vertex buffer with dynamic index buffers per patch that were only updated when needed, depending on the current camera position. The VBO was a simple 2D grid, and the actual height was sampled in both the vertex and fragment shaders.
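The vertex-skipping scheme above can be illustrated with a small sketch (a hypothetical `patchIndices` helper, not the project's actual code): all LOD levels of a patch index into the same shared vertex grid, and a higher LOD level simply strides over the grid with a larger step.

```cpp
#include <cstdint>
#include <vector>

// Triangle-list indices for one terrain patch of 'quads' x 'quads' cells
// laid over a shared (quads+1) x (quads+1) vertex grid. Each higher 'lod'
// skips every other vertex of the level below, so level 1 of a 32x32
// patch is effectively a 16x16 patch, level 2 is 8x8, and so on.
std::vector<uint32_t> patchIndices(uint32_t quads, uint32_t lod)
{
    const uint32_t step  = 1u << lod;   // vertex stride at this LOD
    const uint32_t pitch = quads + 1;   // vertices per grid row
    std::vector<uint32_t> idx;
    for (uint32_t y = 0; y < quads; y += step) {
        for (uint32_t x = 0; x < quads; x += step) {
            uint32_t i0 = y * pitch + x;           // top-left
            uint32_t i1 = i0 + step;               // top-right
            uint32_t i2 = (y + step) * pitch + x;  // bottom-left
            uint32_t i3 = i2 + step;               // bottom-right
            idx.insert(idx.end(), { i0, i2, i1,    // first triangle
                                    i1, i2, i3 }); // second triangle
        }
    }
    return idx;
}
```

Stitching at LOD boundaries (the T-junction fix mentioned above) would replace the border rows of this buffer with fans that match the coarser neighbor, which is why per-patch dynamic index buffers were convenient.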
In the end I had quite large terrains (up to 4096×4096 quads) running very well on modest hardware.
At that time, two AAA games were doing some pretty interesting stuff with terrain shading and lighting. Crytek's Crysis and DICE's Battlefield: Bad Company both had large landscapes, dynamic texture splatting and great lighting, and both studios published some details on the tech behind them.
Shader splatting as done in the Frostbite engine looked like a good choice for a simple terrain engine. Being able to change the terrain simply by modifying a few parameters, and using just heightmaps without additional processing (no saving of precomputed terrain ambient occlusion or low-res baked shadows), helped me focus on coding rather than tweaking various textures for the desired look. Splatting, however, wasn't optimized, and having a large number of different shader layers wasn't easy on the GPU. Frostbite generated a shader for each combination of layers over a single tile, but I never got around to implementing a similar system. In my case, having 4 to 5 layers was good enough anyway.
I was well aware that having decent-quality shading is just one side of the story. High-quality lighting that would simulate natural scattering in the sky and the color of the Sun was the second part. Crysis had state-of-the-art sky rendering based on work by Nishita. My first attempt at solving the scattering integral wasn't completely successful, missing the correct constants. The publication of Sean O'Neil's single-scattering sky rendering (also based on a simplified Nishita model) helped with the second and third implementations, which ended in a mixed CPU/GPU solution. Finally I was able to produce a believable sky, with the deep blues of clear afternoons and beautiful sunsets. Scattering was computed on the CPU, sent to the sky shader, and packed into SH lighting for accurate shading of terrain and models. Dynamic time-of-day was possible, with the scattering calculation spread over several frames to ease the cost. Of course, the generated data could be saved for a superfast static sky.
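Two of the ingredients of these single-scattering models are simple enough to show here. This is a generic sketch of the standard formulas, not the implementation described above: the Rayleigh phase function decides how much light is scattered toward the viewer at a given angle, and exponential transmittance (with an extinction coefficient roughly proportional to 1/λ⁴) is what removes blue light from the direct sun ray and makes the sky blue.

```cpp
#include <cmath>

// Rayleigh phase function P(theta) = 3/(16*pi) * (1 + cos^2(theta)):
// the angular distribution of light scattered by air molecules, with
// theta the angle between the sun direction and the view ray.
float rayleighPhase(float cosTheta)
{
    const float pi = 3.14159265358979f;
    return 3.0f / (16.0f * pi) * (1.0f + cosTheta * cosTheta);
}

// Beer-Lambert transmittance along a path: 'opticalDepth' is the density
// integrated along the ray, 'beta' the per-channel extinction coefficient.
// Rayleigh beta scales as ~1/lambda^4, so blue is attenuated far more
// than red -- blue sky by day, red sun at sunset.
float transmittance(float beta, float opticalDepth)
{
    return std::exp(-beta * opticalDepth);
}
```

The expensive part, and the one that was moved between CPU and GPU in the post above, is numerically integrating the optical depth and in-scattered light along each view ray.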
Procedural grass was regenerated each time the camera moved, placing cylindrical billboards on a jittered grid around the player. Grass billboards were shaded with modified terrain normals and faded out with distance, blending seamlessly with the landscape. The top vertices of each billboard could be animated to produce blades waving in the wind. Each billboard's position was calculated taking into account the height and slope of the terrain, as was the grass blending layer. Having each blade as actual geometry was planned, but never finished.
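The key trick with a jittered grid that is rebuilt on camera movement is that the jitter must be deterministic per cell, otherwise billboards slide or pop every frame. A minimal sketch under that assumption (the hash constants and `placeGrass` helper are illustrative, not the project's code):

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

struct GrassBlade { float x, z; };

// Deterministic per-cell jitter in 0..1: hashing the integer cell
// coordinates means a billboard gets the exact same offset every time
// the grid around the camera is rebuilt (no popping or sliding).
float cellJitter(int32_t cx, int32_t cz, uint32_t salt)
{
    uint32_t h = static_cast<uint32_t>(cx) * 374761393u
               + static_cast<uint32_t>(cz) * 668265263u + salt;
    h = (h ^ (h >> 13)) * 1274126177u;
    return static_cast<float>(h & 0xffffu) / 65535.0f;
}

// One billboard per grid cell in a square of 'radius' cells around the
// camera, jittered inside its cell. Height/slope rejection (as described
// in the post) would then filter this list against the terrain.
std::vector<GrassBlade> placeGrass(float camX, float camZ,
                                   int radius, float cellSize)
{
    std::vector<GrassBlade> blades;
    int32_t cx0 = static_cast<int32_t>(std::floor(camX / cellSize));
    int32_t cz0 = static_cast<int32_t>(std::floor(camZ / cellSize));
    for (int32_t cz = cz0 - radius; cz <= cz0 + radius; ++cz)
        for (int32_t cx = cx0 - radius; cx <= cx0 + radius; ++cx)
            blades.push_back({ (cx + cellJitter(cx, cz, 1u)) * cellSize,
                               (cz + cellJitter(cx, cz, 2u)) * cellSize });
    return blades;
}
```

Distance fading then only has to look at each blade's distance from the camera, since positions are stable between rebuilds.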
Other notable features were: water rendering with full-scene reflection, refraction and underwater fog; a loader for Valve's SMD models (without animations); simple screen-space ambient occlusion; and an HDR pipeline with linear colors, wide bloom, auto-exposure and several tonemappers at the end (Reinhard, exponential and today everybody's favorite – filmic).
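For reference, the two best-known of those tonemapping curves fit in a few lines. This is a generic sketch of the standard operators rather than the pipeline's actual code; the filmic version shown is the common Hejl/Burgess-Dawson approximation of the filmic look (which also bakes in an approximate gamma, unlike plain Reinhard).

```cpp
#include <algorithm>
#include <cmath>

// Classic Reinhard operator: maps HDR values [0, inf) into [0, 1),
// nearly linear in the darks with a soft roll-off in the highlights.
float reinhard(float hdr)
{
    return hdr / (1.0f + hdr);
}

// Widely used filmic approximation (Hejl / Burgess-Dawson): adds a toe
// that crushes blacks and a shoulder similar to film stock response.
float filmicApprox(float hdr)
{
    float x = std::max(0.0f, hdr - 0.004f);
    return (x * (6.2f * x + 0.5f)) / (x * (6.2f * x + 1.7f) + 0.06f);
}
```

Auto-exposure in such a pipeline just rescales the HDR input by the inverse of the average scene luminance before one of these curves is applied.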
Traveling down the path is often more rewarding than reaching the destination, especially when your end goal is constantly changing. Many techniques from the terrain renderer (used in AAA games at the time) later moved into SUPERVERSE and enabled the more optimized, higher-quality graphics that we are proud of. But don't expect to land on the planets you may encounter while blasting through the game.