See Compositor for the main article about this subject.
|Caution The feature discussed below is currently experimental and should be considered a proof of concept for the time being. Aircraft developers are not yet encouraged to update their models to use PBR, as the HDR pipeline will not be production-ready for some time; for now it only showcases what is being worked on. If you'd like to learn more, please get in touch via the developers mailing list.|
|The FlightGear forum has a subforum related to: Effects & Shaders|
|Description||A modern rendering pipeline that targets relatively powerful systems|
|Contributor(s)||Fernando García Liñán |
The HDR pipeline is a Compositor-based rendering pipeline that attempts to bring modern rendering techniques to FlightGear, namely high dynamic range (HDR) rendering and physically based rendering (PBR). It is implemented entirely in FGData, using XML for the Compositor pipeline definition and the Effects, and GLSL for the shaders. As of 07/2021, the HDR pipeline is not yet usable for normal flying, but it can be enabled with a command-line argument.
The Classic pipeline still relies on legacy OpenGL features, so rather than improving or reworking it, the idea of creating an entirely separate rendering pipeline from scratch started taking shape. The Compositor played the biggest role in enabling this effort: it allows new rendering pipelines to be created entirely in FGData space, without any C++ changes whatsoever. This greatly reduced the amount of work required and made the iterative process of testing and debugging much faster and more comfortable.
Last updated: 07/2021
The HDR pipeline is not yet ready for day-to-day flying, but it is currently available on next for anyone adventurous enough to try it. Expect a lot of breakage, though.
PBR and glTF
This pipeline introduces a PBR Effect (fgdata/Effects/model-pbr.eff). The Effect can be used as usual, by adding an <effect> tag to the model XML and configuring it as you would configure model-combined. However, the recommended way to use PBR is through the glTF file format.
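As a sketch, assigning the PBR Effect to a model object follows the usual Effect-inheritance convention in the model XML. The object name "Fuselage" below is only an example; check the Effect's documentation for the parameters it actually accepts:

```xml
<PropertyList>
  <effect>
    <inherits-from>Effects/model-pbr</inherits-from>
    <object-name>Fuselage</object-name>
  </effect>
</PropertyList>
```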
The HDR pipeline implements a deferred rendering pipeline. Instead of computing the lighting right away, the pipeline writes information about the geometry (normals, materials...) to a series of textures that are later used to light the scene on a full-screen pass. This is identical to how Rembrandt worked. The main motivation behind implementing a deferred renderer is to keep forward passes to a minimum. FlightGear's scene graph is not very well optimized, so traversing it as few times as possible is always going to yield better performance. The alternative was to implement a modern hybrid forward renderer with a depth pre-pass, but we would be traversing the scene graph twice in this case.
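The G-buffer idea above can be sketched in a few lines of Python. This is purely illustrative (not FlightGear code): a geometry pass records per-pixel attributes during a single scene traversal, and a separate full-screen pass computes the lighting from those stored attributes, so lighting cost no longer multiplies scene-graph traversals.

```python
def geometry_pass(fragments):
    """One scene traversal: record normal and albedo per pixel (the G-buffer)."""
    gbuffer = {}
    for pixel, normal, albedo in fragments:
        gbuffer[pixel] = (normal, albedo)  # closest-surface data only
    return gbuffer

def lighting_pass(gbuffer, light_dir):
    """Full-screen pass: shade every pixel from the stored attributes."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    image = {}
    for pixel, (normal, albedo) in gbuffer.items():
        n_dot_l = max(0.0, dot(normal, light_dir))  # simple Lambert term
        image[pixel] = tuple(c * n_dot_l for c in albedo)
    return image

# One fragment facing the light head-on: shaded colour equals its albedo.
frags = [((0, 0), (0.0, 0.0, 1.0), (1.0, 0.5, 0.25))]
img = lighting_pass(geometry_pass(frags), (0.0, 0.0, 1.0))
```

Adding more lights only re-runs the cheap lighting pass; the scene graph is never traversed again.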
The pipeline is designed to work exclusively with PBR values, so both materials and lighting are internally assumed to be based on real-life magnitudes. Legacy materials are approximately "translated" from the ambient/diffuse/specular model to the PBR metalness/roughness model. This may result in some incorrect lighting on aircraft that are completely unaware of the HDR pipeline.
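The kind of heuristic such a translation involves can be sketched as follows. This is not the exact conversion used by fgdata, only an assumed illustration: a strong, coloured specular term hints at a metal, and a high Phong shininess maps to a low roughness.

```python
def legacy_to_pbr(diffuse, specular, shininess):
    """Illustrative legacy-to-PBR heuristic (NOT the fgdata translation).

    diffuse/specular are RGB tuples in [0, 1]; shininess is Phong [0, 128].
    """
    # Strong specular with a dark diffuse suggests a metal; otherwise dielectric.
    metalness = 1.0 if max(specular) > 0.5 and max(diffuse) < 0.1 else 0.0
    base_color = specular if metalness else diffuse
    # High Phong shininess corresponds to a smooth (low-roughness) surface.
    roughness = (1.0 - shininess / 128.0) ** 0.5
    return base_color, metalness, roughness

# A matte red dielectric: weak grey specular, moderate shininess.
color, metal, rough = legacy_to_pbr((0.8, 0.1, 0.1), (0.04, 0.04, 0.04), 32.0)
```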
All lighting computations are done in HDR space, i.e. colour values are not clamped to the [0,1] range until the end of the rendering process, when tone mapping transforms the HDR colour values into LDR values that can be displayed on a monitor. An exposure parameter is calculated automatically from the average scene luminance (as a camera on the Auto setting would do), but the exposure can also be lowered or raised manually if the user feels the scene is too bright or too dark.
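A minimal sketch of this auto-exposure and tone-mapping step, assuming a log-average luminance estimate and a simple Reinhard operator (the actual pipeline may use different curves and constants):

```python
import math

def auto_exposure(luminances, key=0.18):
    """Scale factor so the log-average (geometric mean) luminance maps to mid-grey."""
    log_avg = math.exp(sum(math.log(max(l, 1e-6)) for l in luminances)
                       / len(luminances))
    return key / log_avg

def reinhard(l):
    """Simple Reinhard tone-mapping operator: L / (1 + L), output in [0, 1)."""
    return l / (1.0 + l)

scene = [0.5, 2.0, 8.0, 50.0]           # unbounded HDR luminances
exposure = auto_exposure(scene)          # camera "Auto" behaviour
ldr = [reinhard(exposure * l) for l in scene]
```

Manual exposure compensation would simply multiply `exposure` by a user-chosen factor before tone mapping.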
To allow truly PBR-based materials, real-time environment mapping is used: at the start of the frame the scene is rendered to a cubemap, and subsequent passes use this information to evaluate indirect lighting.
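For readers unfamiliar with cubemaps, the lookup works by letting the dominant axis of a direction vector select one of six faces (the standard OpenGL convention); a sketch:

```python
def cubemap_face(direction):
    """Select the cubemap face a lookup direction falls on (OpenGL convention).

    The axis with the largest absolute component decides the face;
    its sign decides between the positive and negative face.
    """
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+X" if x > 0 else "-X"
    if ay >= az:
        return "+Y" if y > 0 else "-Y"
    return "+Z" if z > 0 else "-Z"

face = cubemap_face((0.2, -0.9, 0.3))  # mostly downward along -Y
```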
Near the end of the rendering process, miscellaneous post-processing passes, such as FXAA and ambient occlusion, are applied.