Howto:Shader programming in FlightGear

This is meant to become an introduction to shader programming in FlightGear. For the time being (03/2010) it is a work in progress; please feel free to ask questions or suggest topics.

Your help in improving and updating this article is appreciated, thanks!

Tutorials about GLSL Programming in general are collected at GLSL Shader Programming Resources

For an OpenGL quick reference, please see http://www.khronos.org/files/opengl-quick-reference-card.pdf; for a GLSL quick reference, see glsl_quickref.pdf

Intro

GLSL (OpenGL Shading Language, also known as "GLslang") is the official OpenGL shading language. It allows you to write programs, so-called "shaders", in a high-level shading language based on the C programming language, to create OpenGL fragment (pixel) and vertex shaders.

With the recent advances in graphics cards, new features have been added to allow for increased flexibility in the rendering pipeline at the vertex and fragment level. Programmability at this level is achieved with the use of fragment and vertex shaders.

GLSL was created to give developers more direct control of the graphics pipeline without having to use assembly language or hardware-specific languages. Shaders make it possible to process individual vertices or fragments, so that complex rendering tasks can be accomplished without stressing the CPU. Support for shaders was first introduced via extensions in OpenGL 1.5 and is now part of the core OpenGL 2.0 standard.

Shaders are written and stored as plain text files, which can be uploaded (as strings) and executed on the GPU (processor of the graphics card).

What is a Shader

A shader is a programmable replacement for parts of the fixed OpenGL function pipeline; you can imagine it as a kind of "plugin" that customizes rendering for specific scene elements.

GLSL shaders are not stand-alone applications; they require an application that utilizes the OpenGL API. A shader is a program: to be run, it must be loaded, compiled and linked. Shaders are compiled when the 3D application starts, at which point they are validated and optimized for the current hardware.

Each vertex or fragment shader must have exactly one entry point (the main function), but you can create and link additional shaders.

GLSL shaders themselves are simply a set of strings that are passed to the hardware vendor’s driver for compilation from within an application using the OpenGL API's entry points. Shaders can be created on the fly from within an application or read in as text files, but must be sent to the driver in the form of a string.

GLSL has explicit ties to the OpenGL API, to the extent that much of the OpenGL 'state' (e.g. which light sources are bound, what material properties are currently set up) is presented as pre-defined global variables in GLSL.


Shaders offer:

  • Opportunity for Improved Visual Quality
  • Algorithm Flexibility
  • Performance Benefits

Shaders have access to the render state (parameters, matrices, lights, materials ...) and textures. A "pass" is the rendering of a 3D Model with a vertex and pixel shader pair. An effect can require multiple passes, while each pass can use a different shader and/or model pair. A Pass can render to a texture (to be used by another pass). Think of the "fixed functionality" as the default Shader.

To make it simple, a shader is a program that is loaded on the GPU and called for every vertex or pixel: this gives programmers the possibility to implement techniques and visual effects and execute them faster. In modern games or simulators lots of shaders are used: lights, water, skinning, reflections and much more.

We can create as many shader programs as needed. You can have many shaders of the same type (vertex or fragment) attached to the same program, but only one of them can define the entry point: the main() function.

Each shader program is assigned a handle, and you can have as many programs linked and ready to use as you want (and your hardware allows). During rendering, we can switch from program to program, and even go back to fixed functionality, within a single frame.

To really understand shaders, you should have some knowledge of the rendering pipeline; this helps to understand where and when the shaders act in the rendering process. In broad strokes: vertices are collected and processed by vertex shaders, primitives are then assembled and rasterized, fragment shaders apply colors and textures to the resulting fragments, and finally the frame is written to the buffer.

Some benefits of using GLSL are:

  • Cross platform compatibility on multiple operating systems, including Linux, Mac OS and Windows.
  • The ability to write shaders that can be used on any hardware vendor’s graphics card that supports the OpenGL Shading Language.
  • Each hardware vendor includes the GLSL compiler in their driver, thus allowing each vendor to create code optimized for their particular graphics card’s architecture.

Language Features

While GLSL has a C-like syntax, it introduces some new types and keywords. For a detailed view of the language, please see the GLSL specification, available at http://www.opengl.org/documentation/glsl/

The OpenGL Shading Language provides many operators familiar to those with a background in using the C programming language. This gives shader developers flexibility when writing shaders. GLSL contains the operators in C and C++, with the exception of pointers. Bitwise operators were added in version 1.30.

Similar to the C programming language, GLSL supports loops and branching, including if, else, if/else, for, do-while, break, continue, etc.

User defined functions are supported, and a wide variety of commonly used functions are provided built-in as well. This allows the graphics card manufacturer the ability to optimize these built-in functions at the hardware level if they are inclined to do so. Many of these functions are similar to those found in the math library of the C programming language such as exp() and abs() while others are specific to graphics programming such as smoothstep() and texture2D().
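
For instance, several of these built-ins can be combined in a few lines. A minimal fragment-shader sketch; the intensity uniform is a hypothetical application-supplied value:

uniform float intensity; // hypothetical value supplied by the application

void main(void)
{
    // smoothstep() ramps smoothly from 0.0 to 1.0 as abs(intensity) moves from 0.2 to 0.8
    float fade = smoothstep(0.2, 0.8, abs(intensity));
    // exp() behaves just like its C math library namesake
    gl_FragColor = vec4(fade, exp(-fade), 0.0, 1.0);
}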

Shader Types

There are two types of shaders in GLSL: "vertex shaders" and "fragment shaders" (with geometry shaders being a part of OpenGL 3.2).

These are executed by vertex and fragment processors in the graphics hardware.

  • Vertex shaders transform vertices, set up data for fragment shaders
  • Fragment shaders operate on fragments generated by rasterization
  • Geometry shaders create geometry on the GPU

Typically, vertex shader files use the file extension ".vert", while fragment shader files use the ".frag" extension. In FlightGear, these files can be found in the "Shaders" subdirectory of the base package, i.e. $FG_ROOT/Shaders

For a list of currently available shaders, you may want to take a look at: http://gitorious.org/fg/fgdata/trees/master/Shaders

So, shaders generally come in pairs: one shader (the "vertex shader") is a short program that takes in one vertex from the main CPU and produces one vertex that is passed on to the GPU rasterizer, which uses the vertices to create triangles and then chops them up into individual pixel-sized fragments.

A vertex shader is run once per vertex, while a fragment shader is run once per fragment covered by the primitive being rendered (a point, a line or a triangle). A fragment equates to a pixel, except in the case of multi-sampling, where a pixel can be the weighted average of several fragments. Multi-sampling is used to remove aliasing and jagged edges. Many such executions can happen in parallel, and there is no communication or ordering between executions. Vertex shaders are flexible and quick.


Vertex Shaders

Input: Vertex attributes

Output: At least vertex position (in the clip space)

Restrictions: Cannot access any vertex other than the current one

Note: Loading a vertex shader turns off parts of the OpenGL pipeline (vertex shaders fully replace the "Texturing & Lighting unit")

Objects in a computer graphics scene are usually meshes made up of polygons. The corner of each of those polygons is called a "vertex". A vertex shader receives input in the form of per-vertex variables called "attribute variables" and per-primitive variables called "uniform variables". The vertex shader must specify the coordinates of the vertex in question; this way, the geometry of the object can be modified.

Vertex shaders operate on each vertex: the vertex shader is executed for every vertex-related OpenGL call (e.g. glVertex* or glDrawArrays). This means, for example, that for a mesh containing 5000 vertices, the vertex shader will be executed 5000 times.

A single vertex is composed of a number of "attributes" (vertex attribs), the most common being position, texture coordinates, normal and color. The position attribute is the most important one. The coordinates (x, y and z) of the vertex's incoming position are those given by the 3D modeler during the creation of the 3D model. The vertex's position is defined in the local space of the mesh (or object space).

A vertex shader provides almost full control over what is happening with each vertex. Consequently, all per-vertex operations of the fixed function OpenGL pipeline are replaced by the custom vertex shader.

Vertex Shaders take application geometry and per-vertex attributes as input and transform the input data in some meaningful way.


  • A vertex shader MUST write to gl_Position
  • A vertex shader CAN write to gl_PointSize, gl_ClipVertex
  • gl_Vertex is an attribute supplying the untransformed vertex coordinate
  • gl_Position is a special output variable for the transformed vertex coordinate

A vertex shader can also set other variables, called "varying variables". The values of these variables are passed on to the second kind of shader, the "fragment shader". The fragment shader is run for every pixel on the screen where the polygons of the mesh appear. The fragment shader is responsible for setting the final color of that little piece of the mesh.

Common tasks for a vertex shader include:

  • Vertex position transformation
  • Per vertex lighting
  • Normal transformation
  • Texture coordinates transformation or generation
  • Vertex color computation
  • Geometry skinning
  • Animation
  • Setting up data for fragment shaders

The vertex shader runs from start to end for each and every vertex that is passed into the graphics card; the fragment processor does the same thing at the pixel level. In most scenes there are far more pixel fragments than there are vertices, so the performance of the fragment shader is vastly more important, and any work we can move into the vertex shader, we probably should.

A minimal vertex shader example may look like this:

void main(void)
{
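    // ftransform() reproduces the fixed-function vertex transform exactly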
    gl_Position = ftransform();
}

Fragment Shaders

Input: Interpolation of the vertex shader outputs

Output: Usually a fragment color.

Restrictions: Fragment shaders have no knowledge of neighboring pixels.

Note: Loading a fragment shader turns off parts of the OpenGL pipeline (pixel shaders fully replace the "Texturing Unit")

The other shader (the "Fragment shader" - also known (incorrectly) as the "pixel shader") takes one pixel from the rasterizer and generates one pixel to write or blend into the frame buffer.

A fragment shader can write to the following special output variables:

  • gl_FragColor to set the color of the fragment
  • gl_FragData[n] to output to a specific render target
  • gl_FragDepth to set the fragment depth


Common tasks of fragment shaders include:

  • Texturing (even procedural)
  • Per pixel lighting and material application
  • ray tracing
  • Fragment color computation
  • Operations on Interpolated Values
  • Doing operations per fragment to make pretty pictures


A minimal fragment shader may look like this:

void main(void)
{
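    // output a constant opaque red for every fragment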
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}

A fragment shader takes perspective-correct interpolated attribute values as input and either discards the fragment or outputs the fragment's color.

Fragment shaders operate on every fragment produced by rasterization and give you nearly full control over what happens with each fragment. Just like a vertex shader, a fragment shader replaces all per-fragment operations of the fixed function OpenGL pipeline.

Data Types in GLSL

Note that there is no implicit type conversion in GLSL, all conversions and initializations have to be done using explicit constructor calls!
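
For example (a minimal sketch):

void main(void)
{
    int i = 3;
    float f = float(i);  // explicit int-to-float conversion via constructor
    vec3 v = vec3(0.0);  // fills all three components with 0.0
    vec4 w = vec4(v, f); // builds a vec4 from a vec3 plus a scalar
    gl_FragColor = w;
}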

Scalars

  • float - 32 bit, very nearly IEEE-754 compatible
  • int - at least 16 bit, but not backed by a fixed-width register
  • bool - like C++, but must be explicitly used for all flow control

Vectors

  • vec2, vec3, vec4 2D, 3D and 4D floating point vector
  • ivec2, ivec3, ivec4 2D, 3D and 4D integer vector
  • bvec2, bvec3, bvec4 2D, 3D and 4D boolean vectors

Accessing a vector's components can be done using letters as well as standard C selectors; this is known as "swizzling". One can use the letters x,y,z,w to access vector components; r,g,b,a for color components; and s,t,p,q for texture coordinates. Components can be read in any combination and order.
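
A few examples of swizzling (a minimal sketch):

void main(void)
{
    vec4 color = vec4(1.0, 0.5, 0.25, 1.0);
    vec3 rgb  = color.rgb;  // read the first three components
    vec2 st   = color.st;   // the same data under texture-style names
    vec4 bgra = color.bgra; // reorder components in a single expression
    color.a   = 0.5;        // write a single component
    gl_FragColor = vec4(rgb * st.s, color.a); // vec4 built from a vec3 plus a scalar
}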

Matrices

  • mat2 2x2 floating point matrix
  • mat3 3x3 floating point matrix
  • mat4 4x4 floating point matrix

Samplers

In GLSL, textures are represented and accessed through so-called "samplers", which are used for sampling textures and must be declared as uniforms. The following samplers are available:

  • sampler1D, sampler2D, sampler3D 1D, 2D and 3D texture
  • samplerCube Cube Map texture
  • sampler1Dshadow, sampler2Dshadow 1D and 2D depth-component texture
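
A sampler is declared as a uniform and read with a lookup function such as texture2D(); the application (or, in FlightGear, the effect file) assigns the texture unit. A minimal fragment-shader sketch, with a hypothetical sampler name:

uniform sampler2D base_texture; // hypothetical name; bound to a texture unit by the application

void main(void)
{
    // fetch the texel at the interpolated texture coordinate
    gl_FragColor = texture2D(base_texture, gl_TexCoord[0].st);
}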

Arrays

GLSL supports the same syntax for creating arrays that is already known from C or C++, e.g.:

vec2 foo[10];

So, arrays can be declared using the same syntax as in C, but cannot be initialized when declared. Accessing an array's elements is done as in C.

Structures

Structures can also be created like in C or C++, e.g.:

struct foo {
 vec3 pos;
};

Global Storage Qualifiers

Used for communication between shaders and application:

  • const - for declaring non-writable, compile-time constant variables
  • attribute - For frequently changing (per vertex) information passed from the application to a vertex shader (no integers, bools, structs, or arrays)
  • uniform - for infrequently changing (per primitive) information passed from the application to a vertex or fragment shader: constant shader parameters that can be changed between draws (cannot be written to in a shader, do not change per-vertex or per-fragment)
  • varying - for information passed from a vertex shader to a fragment shader, will be interpolated in a perspective-correct manner during rasterization (can write in vertex shader, but only read in fragment shader)
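
Put together, a vertex shader using all four qualifiers might look like this (a minimal sketch; all user-declared names here are hypothetical):

const float scale = 0.5;      // compile-time constant
attribute vec3 instance_pos;  // hypothetical per-vertex input from the application
uniform float fog_density;    // set by the application, constant for the draw
varying float fog_factor;     // written per vertex, interpolated for the fragment shader

void main(void)
{
    fog_factor = fog_density * scale * length(instance_pos);
    gl_Position = ftransform();
}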

Functions

  • Much like C++
  • Entry point into a shader is void main()
  • Overloading based on parameter type (but not return type)
  • No support for direct or indirect recursion
  • Call by value-return calling convention

As in C, a shader is structured in functions. Each type of shader must have at least a main function, declared with the following syntax: void main(). User-defined functions may also be defined. As in C, a function may have a return value and use the return statement to pass out its result; a function can also be void. The return type can be of any type except an array.

Parameter Qualifiers

The parameters of a function may have the following qualifiers:

  • in - copy in, but don't copy back out (still writable within function)
  • out - only copy out; undefined at function entry point
  • inout - copy in and copy out

If no qualifier is specified, by default it is considered to be in.
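
A small worked example of all three qualifiers (a minimal fragment-shader sketch):

// 'scale' is copied in, 'value' is copied in and back out, 'doubled' is output only
void scale_and_double(in float scale, inout float value, out float doubled)
{
    value = value * scale;
    doubled = value * 2.0;
}

void main(void)
{
    float v = 1.5;
    float d;
    scale_and_double(2.0, v, d); // afterwards v == 3.0 and d == 6.0
    gl_FragColor = vec4(v, d, 0.0, 1.0);
}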

Built-ins

Vertex Shader

  • vec4 gl_Position; must be written
  • vec4 gl_ClipVertex; may be written
  • float gl_PointSize; may be written

Fragment Shader

  • vec4 gl_FragColor; may be written
  • float gl_FragDepth; may be read/written
  • vec4 gl_FragCoord; may be read
  • bool gl_FrontFacing; may be read

Vertex Attributes

Only available in vertex shaders.

  • attribute vec4 gl_Vertex;
  • attribute vec3 gl_Normal;
  • attribute vec4 gl_Color;
  • attribute vec4 gl_SecondaryColor;
  • attribute vec4 gl_MultiTexCoordN; (N = 0..7)
  • attribute float gl_FogCoord;

Uniforms

  • uniform mat4 gl_ModelViewMatrix;
  • uniform mat4 gl_ProjectionMatrix;
  • uniform mat4 gl_ModelViewProjectionMatrix;
  • uniform mat3 gl_NormalMatrix;
  • uniform mat4 gl_TextureMatrix[n];
struct gl_MaterialParameters {
    vec4 emission;
    vec4 ambient;
    vec4 diffuse;
    vec4 specular;
    float shininess;
};
  • uniform gl_MaterialParameters gl_FrontMaterial;
  • uniform gl_MaterialParameters gl_BackMaterial;
struct gl_LightSourceParameters {
    vec4 ambient;
    vec4 diffuse;
    vec4 specular;
    vec4 position;
    vec4 halfVector;
    vec3 spotDirection;
    float spotExponent;
    float spotCutoff;
    float spotCosCutoff;
    float constantAttenuation;
    float linearAttenuation;
    float quadraticAttenuation;
};
  • uniform gl_LightSourceParameters gl_LightSource[gl_MaxLights];
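
These built-ins make the classic per-vertex diffuse lighting computation compact. A minimal vertex-shader sketch using gl_NormalMatrix, gl_FrontMaterial and gl_LightSource:

varying vec4 diffuse_color;

void main(void)
{
    // transform the normal into eye space and renormalize
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);
    // light direction; for a directional light, position.xyz already holds the direction
    vec3 l = normalize(gl_LightSource[0].position.xyz);
    // modulate the material's diffuse color by the light's diffuse color
    diffuse_color = gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse * max(dot(n, l), 0.0);
    gl_Position = ftransform();
}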

Varyings

An interface between vertex and fragment shaders is provided by varying variables: vertex shaders compute values per vertex, and fragment shaders consume values per fragment. The value of a varying variable defined in a vertex shader will be interpolated in a perspective-correct manner over the primitive being rendered, and the interpolated value can then be accessed in the fragment shader.

Varying variables can only be used with the data types float, vec2, vec3, vec4, mat2, mat3, mat4. (and arrays of them too.)

  • varying vec4 gl_FrontColor; // vertex
  • varying vec4 gl_BackColor; // vertex
  • varying vec4 gl_FrontSecColor; // vertex
  • varying vec4 gl_BackSecColor; // vertex
  • varying vec4 gl_Color; // fragment
  • varying vec4 gl_SecondaryColor; // fragment
  • varying vec4 gl_TexCoord[]; // both
  • varying float gl_FogFragCoord; // both
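
A matched pair passing a user-defined varying from the vertex stage to the fragment stage might look like this (a minimal sketch):

// Vertex shader
varying vec2 tex_coord;

void main(void)
{
    tex_coord = gl_MultiTexCoord0.st; // computed per vertex
    gl_Position = ftransform();
}

// Fragment shader
varying vec2 tex_coord; // receives the perspective-correct interpolated value

void main(void)
{
    gl_FragColor = vec4(tex_coord, 0.0, 1.0);
}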

Anatomy of a Shader

A shader's entry point is the main function, which returns void and takes no arguments.


Anatomy of a Vertex Shader

The function 'void main()' is called afresh for each vertex in the 3D object model:

// Vertex Shader
void main() {
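 // pass the raw vertex through untransformed (only correct if it is already in clip space)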
 gl_Position = gl_Vertex;
}

Anatomy of a Fragment Shader

The function 'void main()' is called afresh for each fragment produced by rasterizing the 3D object model:

// Fragment Shader
void main() {
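 // color every fragment opaque white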
 gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

A Practical Application in FlightGear

ALS Landing Lights / Spotlight

The ALS Landing Lights/Spotlight (we'll call it ALS Lights from now on) is a good example of how to incorporate a shader effect into FlightGear, as it touches many parts of the visuals we see and many parts of the coding pipeline.

In the case of ALS Lights, you have to add the effect to every visual item rendered on the screen that you want to see a light shining on. If you want it to be capable of shining on everything, you have to account for each separate item and how that item is rendered. That is a lot of code to touch.

The list might include

  • runway
  • terrain
  • models
    • ai
    • aircraft
    • tree
  • weather
    • fog
    • clouds
    • hazes
  • water
    • inland
    • ocean
    • stream

Some of these items may be controlled or rendered by the same effect and shader file, yet they may still have to be accounted for individually, and they may have special lighting influences that need attention. You have to take each one separately and account for all its needs.

The example highlighted in this article is what was added to tree.eff to shine the lights on trees.

Program Flow Simplified

Preferences/Nasal/XML >> Property Tree >> Effect File >> Shader >> Rendered to Screen

Preferences/Nasal/XML

Note: need link to Preferences, Nasal and XML docs here

Any combination of Preferences, Nasal or XML manipulates data in the property tree. In this case, the switch to turn on the landing or spot light, plus a couple of other needed data containers, are defined in $FG_ROOT/preferences.xml with the following lines.

<als-secondary-lights>
 <use-searchlight type="bool">false</use-searchlight>
 <use-landing-light type="bool">false</use-landing-light>
 <use-alt-landing-light type="bool">false</use-alt-landing-light>
 <landing-light1-offset-deg type="float">0.0</landing-light1-offset-deg>
 <landing-light2-offset-deg type="float">0.0</landing-light2-offset-deg>
</als-secondary-lights>

They show up in the Property Tree under sim/rendering/als-secondary-lights and can be activated or manipulated by normal Nasal calls or XML.

Effects File

Note: need link to Effects doc here to explain the details of the effects file

Parameters

Parameter entries defined in the effect file correspond to property tree data containers (static or variable). They contain the data needed by the shader program to perform its magic. The information in the property tree might be program control data, or variable/static data that the shader program can manipulate before sending it on to be rendered. In the case of ALS Lights, below is some of the data passed to, and used by, the shader.

<display_xsize><use>/sim/startup/xsize</use></display_xsize>
<display_ysize><use>/sim/startup/ysize</use></display_ysize>
<view_pitch_offset><use>/sim/current-view/pitch-offset-deg</use></view_pitch_offset>
<view_heading_offset><use>/sim/current-view/heading-offset-deg</use></view_heading_offset>
<view_fov><use>/sim/current-view/field-of-view</use></view_fov>
<use_searchlight><use>/sim/rendering/als-secondary-lights/use-searchlight</use></use_searchlight>
<use_landing_light><use>/sim/rendering/als-secondary-lights/use-landing-light</use></use_landing_light>
<use_alt_landing_light><use>/sim/rendering/als-secondary-lights/use-alt-landing-light</use></use_alt_landing_light>
<landing_light1_offset><use>/sim/rendering/als-secondary-lights/landing-light1-offset-deg</use></landing_light1_offset>
<landing_light2_offset><use>/sim/rendering/als-secondary-lights/landing-light2-offset-deg</use></landing_light2_offset>
<quality_level><use>/sim/rendering/shaders/landmass</use></quality_level>
<tquality_level><use>/sim/rendering/shaders/transition</use></tquality_level>

Note the "use-searchlight" entry, it is pointing to the use-searchlight entry in the property tree under "sim/rendering/als-secondary-lights" that was defined in preferences.xml.

Some of this data may play a dual role inside the shader program; in other words, it might be used to control other functions in addition to ALS Lights. There will also be other parameter entries that have nothing to do with ALS Lights; they might be used for other actions or effects the shader is handling.

Technique

In general, the shader program and the uniforms are handled in between the technique tags. The technique to run is selected by its <predicate> section: the renderer works through the techniques from lowest to highest number and uses the first one whose predicate matches (see the forum comments below).

Shader Program

Next comes the entry defining which shader program the parameter data is going to be passed to. This is where you specify the shader code to be used by the technique. The ALS techniques carry the lowest numbers, since higher-quality techniques precede lower-quality ones.

<program>
 <fragment-shader>Shaders/tree-ALS.frag</fragment-shader>
 <fragment-shader>Shaders/secondary_lights.frag</fragment-shader>
</program>

In the case of ALS Lights, so far we only have to deal with the fragment shader.

The program section of the effect file is a nifty mechanism that allows users to add shaders to FlightGear without having to touch the C++ code base. The C++ base is programmed to recognize the XML tag pair <program></program> and to load the GLSL program files referenced between the tags. Otherwise you would have to add the GLSL program calls in the C++ base, requiring a completely different set of programming skills, and you would need to recompile FlightGear every time you wanted to add a new shader. It can work this way because shader programs are compiled at run time.

We'll describe the contents of the shader programs below. For now, suffice it to say that tree-ALS.frag contains the main program, while secondary_lights.frag holds functions that receive uniform data, manipulate it, and return the result to main for processing.
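
As an illustration only, and not the actual file contents: because GLSL links the listed files into one program, main() in tree-ALS.frag can simply declare and call a helper that lives in secondary_lights.frag. A hypothetical helper in that style might look like this:

// hypothetical helper compiled from a separate file and linked into the program
float light_falloff(float dist, float range)
{
    // full strength at the source, fading smoothly to zero at 'range'
    return 1.0 - smoothstep(0.0, range, dist);
}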

Uniforms

The uniforms section is used to bind the values declared in the parameters section to the uniform variables the shader program declares; without these bindings the shader cannot see the property data (see the forum comments below).
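
On the GLSL side the shader then declares matching uniforms. As an assumption of how tree-ALS.frag might declare a few of the parameters listed above (the names simply mirror the parameters section):

uniform int display_xsize;           // assumed: window width from /sim/startup/xsize
uniform int display_ysize;           // assumed: window height from /sim/startup/ysize
uniform float view_fov;              // assumed: current field of view in degrees
uniform bool use_searchlight;        // assumed: the preferences.xml switch
uniform float landing_light1_offset; // assumed: beam offset in degrees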

Shader Program

Uniform Input

Variable Assignments

Main Program

Files that are directly touched by this effect include

  • preferences.xml
  • airfield.eff
    • inherits properties from terrain-default
    • adds program shaders (technique 2)
      • fragment airfield-ALS.frag & secondary_lights.frag
    • adds uniforms (technique 2)
  • building.eff
    • inherits properties from model-combined-deferred << model-combined
    • adds program shaders (technique 4)
      • fragment model-ALS-ultra.frag & secondary_lights.frag
    • inherits uniforms from model-combined-deferred << model-combined
  • model-combined.eff
    • inherits properties from model-default
    • adds program shaders (technique 4)
      • fragment model-ALS-ultra.frag & secondary_lights.frag
    • adds uniforms (technique 4)
  • model-default.eff
    • adds properties
    • adds program shaders (technique 5)
      • fragment model-ALS-base.frag & secondary_lights.frag
    • adds uniforms (technique 5)
  • runway.eff
    • inherits properties from terrain-default
    • adds program shaders (technique 2)
      • fragment runway-ALS.frag & secondary_lights.frag
    • adds uniforms (technique 2)
  • terrain-default.eff
    • adds properties
    • adds program shaders (technique 3)
      • fragment terrain-ALS-ultra.frag & secondary_lights.frag
    • adds uniforms (technique 3)
  • tree.eff
    • adds properties
    • adds program shaders (technique 4 and 5)
      • fragment tree-ALS.frag & secondary_lights.frag
    • adds uniforms (technique 4 and 5)
  • urban.eff
    • inherits properties from terrain-default
    • adds program shaders (technique 1 and 2)
      • fragment urban-ALS.frag & secondary_lights.frag
    • adds uniforms (technique 1 and 2)
  • water.eff
    • inherits properties from terrain-default
    • adds program shaders (technique 1)
      • fragment water-ALS-high.frag & secondary_lights.frag
    • adds uniforms (technique 1)
  • water-inland.eff
    • inherits properties from terrain-default
    • adds program shaders (technique 2)
      • fragment water-ALS-high.frag & secondary_lights.frag
    • adds uniforms (technique 2)

General Comments from Forum Discussion

An effect is a container for a series of techniques which are all the possible things you could do with some object.

The <predicate> section inside each effect determines which of the techniques we actually run. Typically the predicate section contains conditionals on a) rendering framework properties b) OpenGL extension support and c) quality level properties. The renderer searches the effects from lowest to highest technique number and uses the first one that fits the bill (which is why high quality levels precede low quality levels).

The rendering itself is specified as <pass> - each pass runs the renderer over the whole scene. We may sometimes make a quick first pass to fill the depth buffer or a stencil buffer, but usually we render with a single pass.

Techniques may not involve a shader at all, for instance 12 in terrain-default is a pure fixed pipeline fallback technique, but all the rendering you're interested in use a shader, which means they have a <program> section which determines what actual code is supposed to run.

If you run a shader, that needs parameters specified as <uniform>, and textures which need to be assigned to a texture unit _and_ be declared as a uniform sampler.

In addition, each technique contains a host of parameter configuring the fixed pipeline elements, like alpha tests before the fragment stage, or depth buffer writing/reading, alpha blending,... you can ignore them on the first go.

So if you want to look at which shader code is executed when you call the water effect at a certain quality level, you need to work your way through the predicate sections of the techniques from lowest to highest till you find the one that matches, and then look into the program section of that technique.

Now, to make matters more complicated, all the parameters and textures that are assigned to uniforms and texture units in the techniques need to be declared and linked to properties were applicable in the <parameters> section heading each effect declaration. So a texture you want to use has to appear three times - in the parameters section heading the effect, inside the technique assigned to a texture unit and assigned to a uniform sampler.

Now, inheritance moves all the declared parameters and techniques from one effect to the one inheriting. In the case of water, that means the technique is actually _not_ inherited because terrain-default.eff doesn't contain a technique 1 at all, but the general <parameters> section is inherited. So you don't need to declare the additions to <parameters> again, but you do need to make the changes to the <programs> and the <uniform> sections.[1]
— Thorsten Renk
Work in progress: This article or section will be worked on in the upcoming hours or days; more to follow.
  1. Thorsten Renk (Wed Oct 08, 2014 1:58 -0700). ALS landing lights. http://forum.flightgear.org/viewtopic.php?f=47&t=24226&start=15#p220152