Extra Terrestrial Engine

banner for open gl graphics framework showing parts of the gbuffer

This is my personal game engine implemented in C++, which I have been continuously plugging away at since 2016.

The long-term goal for this project is to have my own tech stack perfectly suited to serve as a baseline for space-sim games in particular, while remaining adaptable to other kinds of software projects and rendering tech demos. Along the way it has been an excellent testbed for trying out programming ideas and learning new techniques. The codebase has grown to a considerable size over the years and contains many systems that can be used for other projects – a lot of thought has been put into building a modular architecture that decouples systems so they can be used on their own or replaced with others.

Among its defining features is a planet rendering system that implements my own algorithm for smooth level-of-detail transitions (no popping) and consistent on-screen geometry density regardless of viewing distance, which I adapted from planar CDLOD and wrote a research paper about.
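As a rough illustration of the underlying idea (not the engine's actual code, which extends this to spherical terrain), a CDLOD-style morph blends each vertex towards the position it would occupy in the next-coarser grid as it nears the end of its LOD range:

```cpp
#include <algorithm>
#include <cmath>

struct Vec2 { float x, y; };

// morphStart / morphEnd bound the transition region at the end of a LOD level's range
float MorphFactor(float distance, float morphStart, float morphEnd)
{
    return std::clamp((distance - morphStart) / (morphEnd - morphStart), 0.f, 1.f);
}

// gridPos is the vertex position in integer grid cells within a terrain patch;
// odd vertices slide onto the next-coarser (half resolution) grid as morphK -> 1
Vec2 MorphVertex(Vec2 gridPos, float morphK)
{
    Vec2 const frac = { std::fmod(gridPos.x, 2.f), std::fmod(gridPos.y, 2.f) };
    return { gridPos.x - frac.x * morphK, gridPos.y - frac.y * morphK };
}
```

Because the morph factor reaches 1.0 exactly where the coarser level takes over, the transition produces no visible pop.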

Link to the ETEngine github repository

It also contains a PBR scene renderer, a data-driven asset conditioning pipeline with resource management, and my own implementation of an archetype-based Entity Component System with scene hierarchy support – allowing for a data-oriented development process. See below for a full feature breakdown, or browse the codebase yourself on Github –>

The engine is written to be cross-platform and uses a render hardware interface that is currently implemented with OpenGL. You can find a full breakdown of all the libraries used here.

Features

Rendering

  • Realtime planet terrain generation (custom CDLOD)
  • Precomputed atmospheric scattering based on Eric Bruneton's implementation.
  • PBR and IBL pipeline
  • Hierarchical Material System
  • Deferred and forward rendering
  • Cascading shadow maps
  • Postprocessing shader support
    (FXAA, Bloom, HDR)
  • Environment Maps
  • Starfield with real star data
  • Frustum Culling
  • GUI & Distance field font rendering – Font generation from .TTF
  • Stereographic normal encoding for optimization
  • Texture Compression BC1-7
  • Sprite and Primitive Rendering
  • Abstract RHI with OpenGL backend, OpenGL state management and automated texture binding
  • Multi viewport support and generic scene rendering interface
  • Render scene is decoupled from game scene to layout data optimized for rendering
  • Mesh import with GLTF and Collada

Framework

  • Entity Component System
    • Archetype-based for cache-friendly iteration
    • Built in scene hierarchy
    • Access patterns and dependency definitions allow automated execution ordering, with support for multithreading
  • Data driven asset conditioning pipeline
    • Reflection system allows automatic serialization and deserialization to either a binary format or JSON (see the sketch after this list)
    • Resource manager with support for asset dependencies
    • Content cooker converts from edit friendly formats to runtime optimized formats
  • GUI System built on top of RmlUI for weblike UI design
  • WIP editor with Blender-like window management and application framework
  • 3D audio with OpenAL – (.ogg(vorbis) and .vam(pcm) formats)
  • Rigid Body Physics with BULLET
  • Input manager
  • Custom Math Library
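As a hypothetical sketch of how reflection metadata can drive both of the output formats mentioned above – the type names and structures below are illustrative placeholders, not ETEngine's actual reflection API:

```cpp
#include <cstddef>
#include <cstdint>
#include <string>
#include <vector>

// A reflected field: name for text formats (JSON), offset + size for binary
struct Field { std::string name; size_t offset; size_t size; };
struct TypeInfo { std::string name; std::vector<Field> fields; };

// Example reflected component (placeholder, not an engine type)
struct TransformComponent { float pos[3]; float scale; };

TypeInfo const g_transformType{ "TransformComponent",
    { { "pos",   offsetof(TransformComponent, pos),   sizeof(float) * 3 },
      { "scale", offsetof(TransformComponent, scale), sizeof(float)     } } };

// Binary path: copy raw field bytes in declaration order; a JSON writer would
// walk the same field list but emit "name": value pairs instead
std::vector<uint8_t> SerializeBinary(void const* obj, TypeInfo const& type)
{
    std::vector<uint8_t> out;
    for (Field const& f : type.fields)
    {
        uint8_t const* const src = static_cast<uint8_t const*>(obj) + f.offset;
        out.insert(out.end(), src, src + f.size);
    }
    return out;
}
```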

Utility

  • Extensive debug features:
    • Trace and logging system collects logs from multiple executables on a single server
    • ImGUI integration for debug features
    • Debug console with various render debug commands
    • Commandline commands
    • HashStrings can be resolved back into regular strings for non-shipping builds
  • Memory management with custom smart pointers
  • Custom container types such as Slot maps and linear hash maps
  • Platform Agnostic File System (currently Windows and Linux)
  • Atomic types (int32, uint16)
  • Performance measurement
  • JSON and XML parsers
  • Network Sockets
  • Engine settings loaded from file
  • Continuous Integration on AppVeyor
  • Unit testing with coverage for math, file & asset IO, smart pointers, ECS
  • CMake project generation

Architecture

Here is a rough overview of the engine's architecture:

History

This project started off in 2016 as an OpenGL graphics framework based on the “Overlord Engine” (DX11) from the Graphics Programming course at Howest University.

In parallel I was writing my graduation work on realtime planet rendering, and in 2017 I merged the two projects into this engine.

Since then the engine has gone through a lot of foundational structural change in order to support a scalable game development approach, and the vast majority of code has been completely rewritten.

I am pretty pleased with where the architecture is at now; the countless quality-of-life features make working with the engine pleasant and fast. However, there are a few things I would still like to complete before I consider the engine ready to support developing smaller indie titles:

Roadmap

Editor and UI

I am currently working on moving the editor away from GTK as a UI framework and onto my own application and UI framework, built on the runtime UI system running with RmlUI. Using GTK for cross-platform support has proven difficult to maintain on Windows, which is what most game developers use and should therefore be prioritized. This change should also ensure that any UI features developed for the editor are readily available for games. The scene editor is also not complete; one of the primary features still needed is property editing based on reflected data.

Framework

With the entity component system implemented, this is largely where it needs to be. However, future, more complex games will need multithreading support, which means implementing a job system and then plugging it into the ECS. This should go reasonably smoothly, as ECS systems already differentiate between read and write access to components, which should make managing mutability straightforward to automate. I anticipate more work around asynchronous asset loading, however. It would also be nice to use the job system to speed up content cooking.
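A minimal sketch of how declared read/write access could drive parallel scheduling once a job system exists – the types and the greedy batching below are illustrative assumptions rather than the engine's planned implementation, and explicit ordering constraints between systems are ignored for brevity:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

using ComponentMask = uint64_t; // one bit per component type

struct SystemAccess
{
    ComponentMask reads;
    ComponentMask writes;
};

// Two systems conflict (must not run concurrently) if either writes data the
// other one reads or writes
bool Conflicts(SystemAccess const& a, SystemAccess const& b)
{
    return ((a.writes & (b.reads | b.writes)) != 0u)
        || ((b.writes & (a.reads | a.writes)) != 0u);
}

// Greedily group systems into batches whose members could be dispatched as
// parallel jobs
std::vector<std::vector<size_t>> BuildBatches(std::vector<SystemAccess> const& systems)
{
    std::vector<std::vector<size_t>> batches;
    for (size_t i = 0u; i < systems.size(); ++i)
    {
        bool placed = false;
        for (std::vector<size_t>& batch : batches)
        {
            bool compatible = true;
            for (size_t const j : batch)
            {
                if (Conflicts(systems[i], systems[j])) { compatible = false; break; }
            }

            if (compatible) { batch.push_back(i); placed = true; break; }
        }

        if (!placed) { batches.push_back({ i }); }
    }

    return batches;
}
```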

Rendering

To support more modern rendering techniques I intend to implement a DirectX 12 backend. This will require some changes to the RHI, including:

  • Introducing command lists as a concept. These will be written into a flat buffer and then consumed by each backend, in order to be cache friendly and minimize virtual function calls (a rough sketch follows this list).
  • Introducing pipeline state objects. The RHI will consume backend-agnostic PSO descriptors; the OpenGL backend will use them to manage its state, whereas DX12 will convert them to DX12 PSO descriptors and cache the resulting PSOs.
  • Shaders are currently authored in GLSL and use individual uniforms. This will be switched to authoring in HLSL and translating to GLSL using DXC and SPIRV-Cross. Uniforms will be replaced with CBVs and UAVs / their OpenGL equivalents, differentiating between global / per-view / per-material data etc. This will also help define a root signature for DX12 PSOs.
  • For DX12, memory management including heaps and allocators will be implemented. To help, I might use a library such as D3D12MA.
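As referenced in the first bullet above, here is a hedged sketch of what such a flat command buffer and a backend-agnostic PSO descriptor could look like; the command set, descriptor fields and names are placeholders, not the planned ETEngine API:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

enum class CmdType : uint8_t { SetPipeline, Draw };

struct PsoDesc        // backend-agnostic pipeline state (fields are examples)
{
    uint32_t shaderId;
    bool     depthTest;
    bool     blendEnabled;
};

struct SetPipelineCmd { CmdType type; PsoDesc pso; };
struct DrawCmd        { CmdType type; uint32_t vertexCount; uint32_t firstVertex; };

// Commands are recorded contiguously so a backend can consume them in one
// linear pass without per-command virtual dispatch
class CommandList
{
public:
    template <typename TCmd>
    void Record(TCmd const& cmd)
    {
        size_t const offset = m_Buffer.size();
        m_Buffer.resize(offset + sizeof(TCmd));
        std::memcpy(m_Buffer.data() + offset, &cmd, sizeof(TCmd));
    }

    uint8_t const* Data() const { return m_Buffer.data(); }
    size_t Size() const { return m_Buffer.size(); }

private:
    std::vector<uint8_t> m_Buffer;
};
```

A backend would then walk the buffer linearly, reading the leading CmdType of each command to translate it into OpenGL state changes or DX12 command list calls, and using the PSO descriptor either to set state directly or to look up a cached native PSO.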

Other

The above summarises the main changes that will have a foundational impact on the engine's architecture. Most other features should be able to slot into the existing framework, and can therefore be implemented as needed by various projects. Some examples include controller support, a config data loader, particle systems or various modern rendering features.


Implementation

I occasionally document the implementation of various features on my blog; see the various tags and categories if you're interested:

Below are some semi-outdated implementation details of how things were rendered in the past. I'm leaving them here for now because they are still somewhat interesting; however, they are due for an overhaul, which I will probably do in the form of more blog posts.

Deferred Rendering

Whether an object is rendered deferred depends on its material settings. Forward rendering is supported to accommodate special shaders that don't use the fixed shader workflow of the renderer, such as glass, emissive or toon shaders.

In a first pass, all deferred objects render into the g-buffer, which splits the data across three RGB textures. The layout of this buffer is shown below. It stores position, normal, base color, single-channel specular, single-channel AO, metalness and roughness – in other words, it's designed to support physically based rendering.

To use as little texture bandwidth as possible, normals are encoded with a stereographic projection into two channels and decoded in the second pass. An example of the first pass including the normal encoding can be found here.
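For illustration, a minimal encode/decode pair based on a stereographic projection might look like the following; the exact scaling and packing the engine uses may differ:

```cpp
#include <cmath>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Project the unit-length view-space normal onto a plane: two floats describe it.
// Degenerate for n.z == -1 (a normal pointing directly away from the camera),
// which visible surfaces effectively never produce.
Vec2 EncodeNormal(Vec3 n)
{
    float const d = 1.f + n.z;
    return { n.x / d, n.y / d };
}

// Inverse stereographic projection recovers the full unit normal
Vec3 DecodeNormal(Vec2 enc)
{
    float const lenSq = enc.x * enc.x + enc.y * enc.y;
    float const denom = 1.f + lenSq;
    return { 2.f * enc.x / denom, 2.f * enc.y / denom, (1.f - lenSq) / denom };
}
```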

In the second pass, all the information is extracted from the g-buffer (the normals are decoded again) and used for whatever form of shading is required, including lighting. This step renders into a framebuffer.

Once the deferred passes are complete, all objects with materials using forward rendering are rendered on top. To make them integrate seamlessly instead of simply being drawn over the result, the z-buffer from the deferred render pass is copied so that the depth test fails if a forward-rendered object sits behind a deferred-rendered one. The rendering pipeline code is executed in the root draw of each scene.
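The depth copy itself can be done with a framebuffer blit; a minimal sketch assuming an active OpenGL context, a loaded function pointer set (glad is used here only as an example loader) and externally created framebuffer objects:

```cpp
#include <glad/glad.h> // or whichever GL loader is in use

// gBufferFbo / targetFbo are placeholder names for framebuffers created elsewhere
void CopyGBufferDepth(GLuint gBufferFbo, GLuint targetFbo, int width, int height)
{
    // Blit only the depth attachment so forward-rendered objects are depth-tested
    // against the geometry laid down by the deferred pass
    glBindFramebuffer(GL_READ_FRAMEBUFFER, gBufferFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, targetFbo);
    glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                      GL_DEPTH_BUFFER_BIT, GL_NEAREST);

    // Continue rendering the forward pass into the target framebuffer
    glBindFramebuffer(GL_FRAMEBUFFER, targetFbo);
}
```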

Graphic showing the layout of the Gbuffer in the open gl framework

Shading with multiple lights

The shading part of the framework has full support for physically based rendering.

Currently it supports directional / sun lights and point lights with inverse-square falloff. Lights are implemented as components of an entity and are rendered as volumes, which allows frustum culling to discard light sources that wouldn't be visible from the camera's perspective.
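For example, with inverse-square falloff the radius of a point light's volume follows from choosing a cutoff below which its contribution is considered negligible (the cutoff value here is an illustrative assumption, not a value taken from the engine):

```cpp
#include <cmath>

// attenuation(d) = intensity / d^2  ->  solve  intensity / d^2 = cutoff  for d
float LightVolumeRadius(float intensity, float cutoff = 0.01f)
{
    return std::sqrt(intensity / cutoff);
}
```

A sphere of this radius can then be frustum-culled like any other renderable before the light volume is drawn.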

The rendering pipeline also supports the use of a single HDR environment map, which stores roughness information in its mipmaps for PBR support.

Screenshot of the open gl shading with multiple lights

Postprocessing

The postprocessing step uses framebuffers with GLSL shaders similar to the other 3D shaders, but applied to a screen-space plane. All prior render passes render into framebuffer textures until the last postprocessing step has been applied, which in turn renders to the backbuffer and presents.

This allows techniques such as high dynamic range rendering, which in turn enables adjusting the camera exposure. It comes in handy when dealing with high-contrast environments such as indoor vs. outdoor settings, or space scenes with a bright planet and sun but a rather dark starfield.

Furthermore, gamma correction allows for a linear texture workflow for more accurate shading, and as a result all textures can be loaded in either sRGB or linear RGB mode. Less common effects such as toon shading (for instance a Sobel filter for outlines) can also be applied.
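As a minimal illustration of the exposure, tone mapping and gamma steps described above (using a simple Reinhard operator as a stand-in for whatever operator the engine actually applies):

```cpp
#include <cmath>

float TonemapChannel(float hdrValue, float exposure, float gamma = 2.2f)
{
    float const exposed = hdrValue * exposure;       // camera exposure
    float const mapped  = exposed / (1.f + exposed); // Reinhard tone mapping to [0, 1)
    return std::pow(mapped, 1.f / gamma);            // gamma correction for display
}
```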

Screenshot of the open gl postprocessing with tonemapping (gamma | exposure)