From FlightGear wiki
Revision as of 06:52, 8 November 2019 by Bugman (Talk | contribs) (Background: Switch to the {{gitorious source}} template.)

This article describes content/features that may not yet be available in the latest stable version of FlightGear (2020.1).
You may need to install some extra components, use the latest development (Git) version or even rebuild FlightGear from source, possibly from a custom topic branch using special build settings: -DENABLE_COMPOSITOR=ON.

This feature is scheduled for FlightGear 2019.2. 100% completed

If you'd like to learn more about getting your own ideas into FlightGear, check out Implementing new features for FlightGear.

Compositor Framework
[Image: ALS Compositor pipeline.jpg]
Started in: 01/2018 (available since FlightGear 2019.2)
Description: Dynamic rendering pipeline configured via the property tree and XML
Contributor(s): Fernando García Liñán
Status: Stable

The Compositor aims to bring multi-pass rendering to FlightGear. It encapsulates a rendering pipeline and exposes its parameters through a property tree interface. At startup, FlightGear reads the pipeline definition file for each physical viewport defined in the CameraGroup settings. If no Compositor file is specified for a physical camera, the one given by the --compositor= startup option is used. If that startup option is not given either, FlightGear looks for a valid Compositor file in $FG_ROOT/Compositor/default.xml.

The Compositor introduces a new dedicated fgdata directory for new/custom rendering pipelines: fgdata/Compositor.


Background

The idea was first discussed in 03/2012, during the early Rembrandt days, when Zan (Lauri Peltonen) came up with a set of patches demonstrating how to create an XML-configurable rendering pipeline.

Back then, this work was considered to look pretty promising [1], and plans were discussed at the time to unify it with the ongoing Rembrandt implementation (no longer maintained).

Adopting Zan's approach would have meant that efforts like Rembrandt (deferred rendering) could have been implemented without requiring C++ space modifications, i.e. purely in Base package space.

Rembrandt's developer (FredB) suggested extending the format to avoid duplicating the stages when there is more than one viewport: a pipeline would be specified as a template, with conditions like in Effects, and the current camera layout would refer to the pipeline, which would then be duplicated, resized and positioned for each declared viewport. [2]

Zan's original patches can still be found in his newcameras branches which allow the user to define the rendering pipeline in preferences.xml: FlightGear, SimGear.

At that point, it didn't have everything Rembrandt's pipeline needed, but it could most likely have been enhanced easily to support those things.

Basically, the original version added support for multiple camera passes, texture targets, texture formats, passing textures from one pass to another, etc., while preserving the standard rendering path if the user wanted that. [3]

Since the early days of Zan's groundwork, providing the (hooks) infrastructure to let base package developers prototype, test and develop distinct rendering pipelines without requiring C++ space modifications has been a long-standing idea. It gained traction after the Canvas system became available in early 2012, which demonstrated how RTT buffers (FBOs) could be set up, created and manipulated procedurally (i.e. at run-time) using XML, the property tree and Nasal scripting. [4]

The new Compositor is an improved re-implementation of Zan's original work using not just XML, but also properties and a handful of Canvas concepts.


Main features

  • Completely independent of other parts of the simulator, i.e. it's part of SimGear and can be used in a standalone fashion if needed, à la Canvas.
  • Although independent, its aim is to be fully compatible with the current rendering framework in FG. This includes the Effects system, CameraGroup, Rembrandt and ALS (and obviously the Canvas).
  • Its functionality overlaps Rembrandt's: what can be done with Rembrandt can be done with the Compositor, but not vice versa.
  • Fully configurable via an XML interface without compromising performance (ala Effects, using PropertyList files).
  • Flexible, expandable and compatible with modern graphics.
  • It doesn't increase the hardware requirements; it expands the hardware range FG can run on. People with integrated GPUs (Intel HD etc.) can run a Compositor with a single pass that renders directly to the screen like before, while people with more powerful cards can run a Compositor that implements deferred rendering, for example.
  • Static branching support. Every pipeline element can be enabled/disabled at startup via a <condition> block.
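For reference, a <condition> block follows the same PropertyList syntax used by the Effects system. A minimal sketch, with an illustrative property path:

```xml
<condition>
  <!-- Enable this pipeline element only if the property is true at startup -->
  <property>/sim/rendering/shadows/enabled</property>
</condition>
```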

How to enable the Compositor

Currently the Compositor can only be enabled at compile time via the -DENABLE_COMPOSITOR=ON CMake flag in FlightGear. SimGear doesn't require any extra parameters. Once you run a binary with the Compositor enabled, you will be presented with the default rendering pipeline; at the time of writing, this is the low-spec rendering pipeline. If you want to try the ALS pipeline, start FlightGear with the command line argument --compositor=Compositor/ALS/als

Notes for aircraft developers


Lights

The Compositor introduces a new way of defining lights that is renderer-agnostic, so every rendering pipeline will be able to access lights defined this way. As of 2019/11, the only pipeline that supports dynamic lights is the ALS pipeline. The resulting light volumes can be visualized for debugging purposes by setting the property /sim/debug/show-light-volumes to true. A light can have the following parameters:

  • name. An <animation> will be able to reference the light by this name. Most animations will work as expected (rotate, translate, spin etc).
  • type. spot or point.
  • position. The position of the light source in model space and in meters.
  • direction. Only available in spot lights. It indicates the direction of the spotlight. This parameter can be specified in three different ways:
Direction vector: a vector in model space that specifies the direction. It doesn't have to be normalized.
Look-at point: the spotlight will calculate its direction by looking at this position from the light position. The point is in model space and in meters.
Rotation angles: a three angle rotation in degrees that rotates the spotlight around the three axes. A 0 degree angle in all axes makes the spotlight point downwards (negative Z).
  • ambient, diffuse and specular. Four-component vectors that specify the light color.
  • attenuation (Optional). Three-component vector where <c> specifies the constant factor, <l> the linear factor and <q> the quadratic factor. These factors are plugged into the OpenGL light attenuation formula 1.0 / (c + l·d + q·d²), where d is the distance of the fragment to the light source. If no attenuation has been specified, the inverse-square law will be used. See this table for a list of attenuation values based on the range of the light.
  • range-m (Optional). Maximum range from the light source position in meters. This value will be used by the renderers to determine if a fragment is illuminated by this source. Every fragment outside this range isn't guaranteed to be affected by the light, even if the attenuation factor isn't 0 in that particular fragment. If no value has been specified, it will be calculated automatically based on the attenuation.
  • cutoff. Only available in spot lights. It specifies the maximum spread angle of the light source. Only values in the range [0, 90] are accepted. If the angle between the direction of the light and the direction from the light to the fragment being lit is greater than the spot cutoff angle, the fragment won't be lit.
  • exponent. Only available in spot lights. Higher spot exponents result in a more focused light source, regardless of the spot cutoff angle.
  • debug-color (Optional). Sets the color of the debug light volume. By default it's red.
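Putting the parameters above together, a spot light definition might look like the following sketch. The tag names for position, direction and the color vectors are written here as one plausible reading of the descriptions above, and all values are purely illustrative:

```xml
<light>
  <name>landing-light</name>
  <type>spot</type>
  <!-- Position in model space, meters -->
  <position>
    <x>2.0</x> <y>0.0</y> <z>-1.0</z>
  </position>
  <!-- Direction given as a (not necessarily normalized) vector -->
  <direction>
    <x>1.0</x> <y>0.0</y> <z>0.0</z>
  </direction>
  <ambient type="vec4d">0.0 0.0 0.0 1.0</ambient>
  <diffuse type="vec4d">1.0 1.0 0.9 1.0</diffuse>
  <specular type="vec4d">1.0 1.0 1.0 1.0</specular>
  <attenuation>
    <c>1.0</c>
    <l>0.045</l>
    <q>0.0075</q>
  </attenuation>
  <range-m>150.0</range-m>
  <cutoff>30.0</cutoff>
  <exponent>20.0</exponent>
</light>
```

An <animation> can then reference this light by its name, "landing-light", to rotate or translate it like any other object.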


Low-Spec pipeline

A fixed-function forward rendering pipeline mainly targeted at low-spec systems. It imitates the classic forward pipeline used before multi-pass rendering was introduced, by using two near/far cameras rendering directly to the screen.

Screenshot showing OSG stats of the Compositor-based low-spec rendering pipeline.


ALS pipeline

The ALS pipeline tries to bring multipass rendering to the current ALS framework, effectively combining the best of ALS and Project Rembrandt.

Cascaded shadow mapping

The main issue with shadow mapping in FlightGear is the complexity of the scene graph. Culling times can become huge if we don't carefully select which parts of the scene graph we want to render in the shadow maps. Some possible optimizations:

  • Study the minimum shadow map distance we can get without noticeable light leaking. Select an appropriate number of cascades (more cascades = more passes over all geometry, and in general we want to keep the number of forward passes to a minimum). We should have at least three cascades: the first just for cockpit/internal shadows, the second for the whole aircraft and the third for the rest of the scenery geometry. A fourth can be added if the transition between the second and the third is too harsh.
  • Improve the culling masks (simgear/scene/util/RenderConstants.hxx). The CASTSHADOW_BIT flag is present in almost every object in the scene graph. Turning this flag off for trees, random buildings and other geometry intensive objects improves framerates by a very considerable amount. Should the user be able to select which objects cast shadows?
  • Should the terrain cast shadows? The terrain is rarely steep enough to cast shadows. Apart from that, the terrain in FlightGear messes with automatic near/far computations for the shadow passes since the geometry is not tessellated enough. Also, the terrain LOD is not good enough to have decent cull times at far cascades.
  • Adding an "internal only" shadow flag for aircraft developers. This allows farther shadow cascades to cull complex objects that are only visible in the nearest cascades. (A very important optimization for aircraft with complex cockpit geometry.)


Gamma correction, night vision and other ALS filters should happen in a quad pass. The current filter_combined() should be left for post-processing that requires as much precision as possible (e.g. dithering to prevent banding). HDR is not a planned feature for now, so ALS will be using rgba8 buffers for most of its features.

Real-time dynamic reflections

Rendering dynamically to a cubemap is possible. As with shadow mapping, minimizing the object count and number of forward passes is vital to get good performance in FlightGear. Rendering to six cubemap faces requires six forward passes, but we can render to a dual paraboloid map instead, reducing this number to two.


Transparent objects

When shadows (and multipass rendering in general) come into play, transparent objects have to be treated differently, even when we are dealing with a forward renderer. In OSG there are two ways to separate transparent surfaces:

  • Using RenderBins. After a single scene cull traversal, surfaces which belong to a special RenderBin type (DepthSortedBin) are removed or moved to another camera. This is how Rembrandt does it and it is the most backwards compatible approach since RenderBins can be changed directly inside Effects.
void removeTransparentBins(simgear::EffectCullVisitor *cv,
                           osgUtil::RenderBin::RenderBinList &transparent_bins)
{
    osgUtil::RenderStage *stage = cv->getRenderStage();
    osgUtil::RenderBin::RenderBinList &rbl = stage->getRenderBinList();
    for (auto rbi = rbl.begin(); rbi != rbl.end(); ) {
        // Depth sorted (back-to-front) bins contain the transparent surfaces
        if (rbi->second->getSortMode() == osgUtil::RenderBin::SORT_BACK_TO_FRONT) {
            transparent_bins.insert(std::make_pair(rbi->first, rbi->second));
            // Reconstructed continuation: remove the bin from this stage's list
            rbi = rbl.erase(rbi);
        } else {
            ++rbi;
        }
    }
}
  • Using cull masks. Two separate traversals are done: one for opaque objects and another for translucent objects. This requires offering aircraft developers another way of tagging a surface as transparent. A trivial approach would be to add a new <animation> type called 'transparent', but that wouldn't be backwards compatible. Maybe we can add some kind of system where we can change cull masks inside Effects? Would that be too hacky or out of place?

Creating a custom rendering pipeline

Since the Compositor is completely data-driven, new rendering pipelines can be created by writing a custom XML pipeline definition. This section tries to document most of the available parameters, but the best and most up-to-date resource is the Compositor parsing code in SimGear (simgear/simgear/scene/viewer).


Buffers

A buffer represents a texture or, more generically, a region of GPU memory. It can have the following parameters:

name
Passes will be able to address the buffer by this name.
type
Any texture type allowed by OpenGL: 1d, 2d, 2d-array, 2d-multisample, 3d, rect or cubemap.
width
Texture width. It's possible to write 'screen' to use the physical viewport width.
screen-width-scale (Optional)
If 'screen' was used, this controls the width scaling factor.
height
Texture height. It's possible to write 'screen' to use the physical viewport height.
screen-height-scale (Optional)
If 'screen' was used, this controls the height scaling factor.
depth
Texture depth.
format
Specifies the texture format. It corresponds to the internalformat, format and type arguments of the OpenGL function glTexImage2D. See simgear/simgear/scene/viewer/CompositorBuffer.cxx for the latest available values.
min-filter and mag-filter (Optional)
They change the minification and magnification filtering respectively. Possible values are: linear, linear-mipmap-linear, linear-mipmap-nearest, nearest, nearest-mipmap-linear and nearest-mipmap-nearest. The default value for both filters is linear.
wrap-s, wrap-t and wrap-r (Optional)
They change the wrap mode for each coordinate. Possible values are: clamp, clamp-to-edge, clamp-to-border, repeat and mirror. The default value for every coordinate is clamp-to-border.
condition (Optional)
A valid boolean condition to enable the buffer at startup (doesn't work at runtime).

A typical property tree structure describing a buffer may be as follows:
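A sketch of such a buffer definition, with an illustrative name and values:

```xml
<buffer>
  <name>color-buffer</name>
  <type>2d</type>
  <!-- 'screen' takes the size of the physical viewport -->
  <width>screen</width>
  <height>screen</height>
  <format>rgba8</format>
  <min-filter>linear</min-filter>
  <mag-filter>linear</mag-filter>
  <wrap-s>clamp-to-border</wrap-s>
  <wrap-t>clamp-to-border</wrap-t>
</buffer>
```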



Passes

A pass wraps around an osg::Camera. All passes have some common parameters:

clear-color, clear-accum, clear-depth and clear-stencil
Default values are black, black, 1.0 and 0 respectively.
clear-mask (Optional)
Controls the camera clear mask. Default value is "color depth".
effect-scheme (Optional)
The pass will try to use the specified effect scheme to draw every object.
condition (Optional)
A valid boolean condition to enable the pass at startup (doesn't work at runtime).

Passes can render to a buffer (Render to Texture), to several buffers (Multiple Render Targets) or directly to the framebuffer. This is accomplished by the <attachment/> tag. Possible parameters of an attachment are:

buffer
The name of the buffer to output to.
component
FBO attachment point. Possible values are color0 to color15, depth, stencil and depth-stencil.
level (Optional)
Controls the mip map level of the texture that is attached. Default value is 0.
face (Optional)
Controls the face of texture cube map or z level of 3d texture. Default value is 0.
mipmap-generation (Optional)
Controls whether mipmap generation should be done for texture. Default value is false.
multisample-samples (Optional)
MSAA samples. Default value is 0.
multisample-color-samples (Optional)
MSAA color samples. Default value is 0.
condition (Optional)
A valid boolean condition to enable the attachment at startup (doesn't work at runtime).

Passes can also receive buffers as input and use them in their shaders. This is accomplished by the <binding/> tag, which has the following parameters:

buffer
The name of the buffer to bind.
unit
The texture unit to place the texture on. Effects will be able to access the buffer on this texture unit.
condition (Optional)
A valid boolean condition to enable the binding at startup (doesn't work at runtime).
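For example, a post-processing pass that samples a previously rendered buffer on texture unit 0 might declare (buffer name illustrative):

```xml
<binding>
  <buffer>color-buffer</buffer>
  <!-- Effects can sample this buffer from texture unit 0 -->
  <unit>0</unit>
</binding>
```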

There are specific pass types, each with their own set of custom parameters:


scene

Renders the scene from the point of view given by the CameraGroup.

cull-mask
A 32 bit number that specifies the cull mask to be used. See simgear/scene/util/RenderConstants.hxx to know which bits enable what.
z-near and z-far
They change the depth range to be used. If both of them are zero, the default Z range in the CameraGroup is used.
Enables the use of clustered forward rendering for this pass.
Ignores the given view and projection matrices and uses a custom one that renders the scene as if it was seen from inside a cubemap looking towards the specified face.


quad

Renders a fullscreen quad with an optional effect applied. Useful for screen space shaders (like SSAO, Screen Space Reflections or bloom) and deferred rendering.

geometry
Specifies the x, y, width and height of the fullscreen quad inside the viewport using normalized coordinates.
effect
The quad will use this effect.


shadow-map

Renders the scene from a light's point of view.

light-num
The OpenGL light number to use for this shadow map.
near-m and far-m
They specify the range of the shadow map.

Example XML for a scene type pass:

  <clear-color type="vec4d">0 0 0 0</clear-color>
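A fuller sketch of a scene pass that renders to a color and a depth buffer; the structure follows the parameters described above, and the pass and buffer names are illustrative:

```xml
<pass>
  <name>forward</name>
  <type>scene</type>
  <clear-color type="vec4d">0 0 0 0</clear-color>
  <!-- Multiple Render Targets: one attachment per output buffer -->
  <attachment>
    <buffer>color-buffer</buffer>
    <component>color0</component>
  </attachment>
  <attachment>
    <buffer>depth-buffer</buffer>
    <component>depth</component>
  </attachment>
</pass>
```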




Planned features

  • Bring back distortion correction.
  • Some kind of versioning system to be able to make breaking changes in the future if/when the compositor is updated in any significant way, without people having to manually update their configs.
  • Bring back Canvas integration so aircraft devs have access to the rendering pipeline. This allows rendering exterior views in cockpit displays, etc.

Known Issues

  • Setting a buffer scale factor different from 1.0 and rendering to it might not scale the splash screen correctly.
  • Clustered shading crashes FG if compiled under OSG 3.6. This is related to osg::TextureBuffer changing definition from OSG 3.4 to OSG 3.6 (Images vs BufferData).