This is a space-simulation-focused game engine implemented in C++ and OpenGL, which I have been working on over the course of three years.
The purpose of the project is to provide an easy starting point for OpenGL graphics programming projects such as games and tech demos. It can also serve as an accessible testing environment for real-time graphics, and individual parts such as shaders can be reused in other projects.
Its main distinguishing feature is the ability to render planets at any distance in real time with completely smooth level-of-detail transitions.
It is implemented in Visual Studio and uses SDL2, Catch, FreeImage, FreeType, Bullet Physics, OpenAL-soft, stb_vorbis, RTTR, GLAD, MikkTSpace and Assimp.
It also includes a modified version of Eric Bruneton's atmospheric scattering implementation.
Rendering
- Deferred and forward rendering
- Postprocessing shader support (FXAA, Bloom, HDR)
- PBR and IBL pipeline
- Environment Maps
- Precomputed atmospheric scattering
- Starfield with real star data
- Realtime planet terrain generation (custom CDLOD)
- Cascading shadow maps
- Frustum Culling
- Distance field font rendering
- Font generation from .TTF
- Stereographic normal encoding for optimization
- Sprite and Primitive Rendering
- GLSL Shader preprocessing
- OpenGL state management
- Multiple lights | inverse square falloff
Framework
- Scenegraph entity hierarchy with parent-child relationships
- Entity Component Framework
- Factory pattern for content management
- Singleton pattern for context objects like Time, Camera, Input and Window
- Input manager using the observer pattern
- Custom Math Library
- Platform Agnostic File System (currently Windows and Linux)
- 3D audio with OpenAL (.ogg (Vorbis) and .wav (PCM) formats)
- Rigid body physics with Bullet
- Logging System for console, file and debug
Utility
- Custom Mesh Loading
- GLTF support
- Mesh Filter builds vertex buffer depending on material requirements
- Atomic types (int32, uint16)
- Performance measurement
- JSON parser
- Engine settings loaded from file
- Project generation with GENie
- Continuous Integration on AppVeyor
- Unit testing setup with catch.hpp
- Full unit test coverage for math library and file system
Deferred Rendering
Whether or not an object is rendered deferred depends on its material settings. Forward rendering is supported to accommodate special shaders that don't use the fixed shader workflow of the renderer, such as glass, emissive or toon shaders.
In a first pass, all deferred objects render into the g-buffer, which splits the data across three RGB textures. The layout of this buffer is shown below. It stores position, normal, base color, single-channel specular, single-channel ambient occlusion, metalness and roughness - in other words, it is designed to support physically based rendering.
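To illustrate, here is a minimal sketch of how a g-buffer with three color targets and a reusable depth texture could be created in OpenGL; the exact texture formats and channel packing in this sketch are assumptions for demonstration, not the engine's actual layout.

```cpp
#include <glad/glad.h>

// Creates a g-buffer with three color targets and a depth texture.
// The formats chosen here (high precision for position, half floats for
// normal/material data, 8 bit for base color) are illustrative assumptions.
GLuint CreateGBuffer(GLsizei width, GLsizei height, GLuint outColor[3], GLuint& outDepth)
{
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    const GLenum formats[3] = { GL_RGBA32F, GL_RGBA16F, GL_RGBA8 };
    GLenum drawBuffers[3];
    glGenTextures(3, outColor);
    for (int i = 0; i < 3; ++i)
    {
        glBindTexture(GL_TEXTURE_2D, outColor[i]);
        glTexImage2D(GL_TEXTURE_2D, 0, formats[i], width, height, 0, GL_RGBA, GL_FLOAT, nullptr);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        drawBuffers[i] = GL_COLOR_ATTACHMENT0 + i;
        glFramebufferTexture2D(GL_FRAMEBUFFER, drawBuffers[i], GL_TEXTURE_2D, outColor[i], 0);
    }
    glDrawBuffers(3, drawBuffers); // write all three targets in a single pass

    // A depth texture rather than a renderbuffer, so the forward pass can
    // reuse the deferred depth results later.
    glGenTextures(1, &outDepth);
    glBindTexture(GL_TEXTURE_2D, outDepth);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
                 GL_DEPTH_COMPONENT, GL_FLOAT, nullptr);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, outDepth, 0);

    return fbo;
}
```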
In order to use as little texture bandwidth as possible, the normal maps are encoded stereographically into two channels and decoded in the second pass. An example of the first pass including normal map encoding can be found here.
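The idea behind the two-channel encoding can be sketched as plain C++ math, assuming stereographic projection of view-space normals; the engine's exact formulation may differ, and the helper types here are illustrative rather than the engine's math library.

```cpp
// Illustrative helper types; the engine provides its own math library.
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Encode a unit normal into two channels via stereographic projection.
// Assumes view-space normals, which rarely point directly away from the
// camera (n.z == -1 would divide by zero).
Vec2 EncodeNormal(Vec3 n)
{
    float scale = 1.f / (1.f + n.z);
    return { n.x * scale, n.y * scale };
}

// Invert the projection to recover the full unit normal in the second pass.
Vec3 DecodeNormal(Vec2 enc)
{
    float lenSq = enc.x * enc.x + enc.y * enc.y;
    float denom = 1.f / (1.f + lenSq);
    return { 2.f * enc.x * denom, 2.f * enc.y * denom, (1.f - lenSq) * denom };
}
```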
In the second pass, all the information is extracted from the g-buffer (including decoding the normals) and used for whatever form of shading is required, together with lighting information. This pass renders into a framebuffer.
Once the deferred passes are complete, all objects with materials using forward rendering are rendered on top. To make them integrate seamlessly rather than simply appearing painted over the scene, the z-buffer from the deferred render pass is copied first, so that the depth test fails if a forward rendered object sits behind a deferred rendered object. The rendering pipeline code is executed in the root draw call of each scene.
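A minimal sketch of such a depth copy with glBlitFramebuffer, assuming hypothetical framebuffer handles:

```cpp
#include <glad/glad.h>

// Copy only the depth buffer from the g-buffer to the framebuffer the forward
// pass renders into. Handles and dimensions are hypothetical parameters.
void CopyDeferredDepth(GLuint gBufferFbo, GLuint targetFbo, GLsizei width, GLsizei height)
{
    glBindFramebuffer(GL_READ_FRAMEBUFFER, gBufferFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, targetFbo);
    glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
        GL_DEPTH_BUFFER_BIT, GL_NEAREST); // depth blits must use GL_NEAREST
    glBindFramebuffer(GL_FRAMEBUFFER, targetFbo);
    // Forward rendered objects now depth test against the deferred scene.
}
```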
Shading with multiple lights
The shading part of the framework fully supports physically based rendering.
Currently it supports directional / sun lights and point lights with inverse square falloff. Lights are implemented as components of an entity, and they are rendered as light volumes, which allows frustum culling to skip light sources that are not visible from the camera's perspective.
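For illustration, here is a sketch of the inverse square falloff and of how a bounding radius for the culled light volume could be derived from it; the cutoff threshold is an assumed value, not necessarily what the engine uses.

```cpp
#include <algorithm>
#include <cmath>

// Inverse square falloff: light contribution scales with 1 / distance^2.
float Attenuation(float distance)
{
    return 1.f / std::max(distance * distance, 0.0001f); // guard against d == 0
}

// Radius at which the contribution of a light with the given intensity drops
// below 'threshold'; the sphere of this radius bounds the volume that gets
// frustum culled. The 0.01 cutoff is an assumed value.
float LightVolumeRadius(float intensity, float threshold = 0.01f)
{
    return std::sqrt(intensity / threshold);
}
```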
The rendering pipeline also supports the use of a single HDR environment map, which stores prefiltered data for increasing roughness levels in its mipmaps for PBR support.
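As an illustration of how such a map is typically sampled, the usual roughness-to-mip mapping looks like the sketch below; whether the engine uses exactly this linear mapping is an assumption.

```cpp
// Rougher surfaces sample blurrier, prefiltered mips of the environment map.
// 'maxMipLevel' depends on the map's resolution; the linear mapping is an
// assumed simplification, to be used with textureLod in the shader.
float EnvironmentMipLevel(float roughness, float maxMipLevel)
{
    return roughness * maxMipLevel;
}
```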
Postprocessing
The postprocessing step utilizes framebuffers with GLSL shaders similar to the other 3D shaders, but applied to a screen-space quad. All prior render passes render to framebuffer textures until the last postprocessing step has been applied, which in turn renders to the backbuffer and presents.
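A minimal sketch of what one such step could look like, assuming the shader program, source texture and fullscreen quad VAO already exist:

```cpp
#include <glad/glad.h>

// One post-processing step: sample the previous pass's color texture and draw
// a fullscreen quad into the next framebuffer in the chain (0 targets the
// backbuffer for the final step). All handles are assumed to exist already.
void PostProcessStep(GLuint targetFbo, GLuint sourceTexture, GLuint shader, GLuint quadVao)
{
    glBindFramebuffer(GL_FRAMEBUFFER, targetFbo);
    glDisable(GL_DEPTH_TEST); // a screen-space quad needs no depth testing
    glUseProgram(shader);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, sourceTexture);
    glBindVertexArray(quadVao);
    glDrawArrays(GL_TRIANGLES, 0, 6); // two triangles covering the screen
}
```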
This allows techniques such as high dynamic range rendering, which in turn enables adjusting the camera exposure. That comes in handy in high contrast environments, such as indoor versus outdoor settings, or space scenes with a bright planet and sun against a rather dark starfield.
Furthermore, gamma correction allows a linear texture workflow for more accurate shading; accordingly, all textures can be loaded in either sRGB or linear RGB mode. Less common effects such as toon shading (for instance a Sobel filter for outlines) can also be applied.
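Per channel, the exposure and gamma math boils down to something like the following sketch; the exponential tone mapping operator here is an assumption, not necessarily the curve the engine applies.

```cpp
#include <cmath>

// Exposure tone mapping: compresses an unbounded HDR value into [0, 1).
// A simple exponential operator is assumed here.
float ToneMap(float hdrValue, float exposure)
{
    return 1.f - std::exp(-hdrValue * exposure);
}

// Gamma correction: converts the linear result to display (sRGB-like) space.
float GammaCorrect(float linearValue, float gamma = 2.2f)
{
    return std::pow(linearValue, 1.f / gamma);
}
```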