Howto:Shader programming in FlightGear

{{Stubforum|47|Effects & Shaders}}{{Rendering}}<!--{{WIP|More to follow}}-->
This is meant to become an introduction to '''shader programming in FlightGear'''. For the time being (03/2010) this is a work in progress; please feel free to ask questions or suggest topics.
Your help in improving and updating this article is appreciated, thanks!
For a GLSL quick reference, please see [ glsl_quickref.pdf].
= Intro =

== What is GLSL? ==
''GLSL'' (''OpenGL Shading Language'', also known as "GLslang") is the official OpenGL shading language. It allows you to write programs, so-called "shaders", in a high-level shading language based on the C programming language, to create OpenGL fragment (pixel) and vertex shaders.
With the recent advances in graphics cards, new features have been added to allow for increased flexibility in the rendering pipeline at the vertex and fragment level. Programmability at this level is achieved with the use of fragment and vertex shaders.
Shaders are written and stored as plain text files, which can be uploaded (as strings) and executed on the GPU (processor of the graphics card).
== What is a shader? ==
A ''shader'' is a programmable replacement for parts of the fixed OpenGL function pipeline; you can imagine it as a sort of "plugin" to customize rendering for specific scene elements.
GLSL shaders are not stand-alone applications; they require an application that utilizes the OpenGL API.
Each vertex and fragment shader must have exactly one entry point (the <code>main()</code> function), but you can create and link additional shader objects into the same program.
GLSL shaders themselves are simply a set of strings that are passed to the hardware vendor's driver for compilation from within an application using the OpenGL API's entry points. Shaders can be created on the fly from within an application or read in as text files, but must be sent to the driver in the form of a string.
GLSL has explicit ties to the OpenGL API - to the extent that much of the OpenGL "state" (for example which light sources are bound, what material properties are currently set up) is presented as pre-defined global variables in GLSL.
Shaders offer:
* Performance benefits.
Shaders have access to textures and to the render state (parameters, matrices, lights, materials, etc.).
A "pass" is the rendering of a 3D Model with a vertex and pixel shader pair.
An effect can require multiple passes, while each pass can use a different shader and/or model pair.
To put it simply, a shader is a program that is loaded on the GPU and called for every vertex or pixel: this gives programmers the ability to implement custom techniques and visual effects and execute them quickly. Modern games and simulators use lots of shaders: lights, water, skinning, reflections and much more.
We can create as many shader programs as needed. You can have many shaders of the same type (vertex or fragment) attached to the same program, but only one of them can define the entry point, the <code>main()</code> function.
Each shader program is assigned a handle, and you can have as many programs linked and ready to use as you want (and your hardware allows).
* Cross platform compatibility on multiple operating systems, including Linux, Mac OS and Windows.
* The ability to write shaders that can be used on any hardware vendor’s graphics card that supports the OpenGL Shading Language.
* Each hardware vendor includes the GLSL compiler in their driver, thus allowing each vendor to create code optimized for their particular graphics card's architecture.
== Language features ==
While GLSL has a C-like syntax, it introduces some new types and keywords. For a detailed view of the language, please see the GLSL specification.
The OpenGL Shading Language provides many operators familiar to those with a background in using the C programming language. This gives shader developers flexibility when writing shaders. GLSL contains the operators in C and C++, with the exception of pointers. Bitwise operators were added in version 1.30.
Similar to the C programming language, GLSL supports loops and branching, including <code>if</code>, <code>else</code>, <code>for</code>, <code>do-while</code>, <code>break</code> and <code>continue</code>. User-defined functions are supported, and a wide variety of commonly used functions are provided built-in as well. This allows the graphics card manufacturer the ability to optimize these built-in functions at the hardware level if they are inclined to do so. Many of these functions are similar to those found in the math library of the C programming language, such as <code>exp()</code> and <code>abs()</code>, while others are specific to graphics programming, such as <code>smoothstep()</code> and <code>texture2D()</code>.

== Error reports, debugging, troubleshooting ==
Shaders are compiled at FlightGear startup. Shader compilation errors can be found in the fgfs.log file; more about [[Commonly_used_debugging_tools#fgfs.log|fgfs.log here]].

As of FG 2016.4.4, shaders do not seem to recompile upon Debug/Reload Aircraft Model or File/Reset, so the only option to recompile and test a shader is to quit and restart FlightGear altogether.

== Shader types ==
There are two types of shaders in GLSL: ''vertex shaders'' and ''fragment shaders'' (with geometry shaders being a part of OpenGL 3.2).
These are executed by vertex and fragment processors in the graphics hardware.
* Geometry shaders create geometry on the GPU
Typically, vertex shader files use the file extension <code>.vert</code>, while fragment shader files use the <code>.frag</code> extension. In FlightGear, these files can be found in the <code>Shaders</code> subdirectory of the base package, i.e. <code>$FG_ROOT/Shaders</code>.
For a list of currently available shaders, you may want to take a look at:{{fg/fgdata/trees/master/root file|Shaders}}.
So, shaders generally come in pairs: one shader (the ''vertex shader'') is a short program that takes in one vertex from the main CPU and produces one vertex that is passed on to the GPU rasterizer, which uses the vertices to create triangles and then chops those up into individual pixel-sized fragments.
A vertex shader is run once per vertex, while a fragment shader is run once per fragment covered by the primitive being rendered (a point, a line or a triangle). A fragment equates to a pixel, except in the case of multi-sampling, where a pixel can be the weighted average of several fragments. Multi-sampling is used to remove aliasing and jagged edges.
Many such executions can happen in parallel; there is no communication or ordering between executions. Vertex shaders are flexible and quick.
=== Vertex shaders ===
{{FGCquote
|The vertex shader doesn't know anything about the mesh it renders; it just knows one single vertex at a time and all the info that is attached to the vertex (normals, tangents, binormals, color, ...). And the vertex shader doesn't really draw anything; it just takes care of all the things which have to do with "where in space" you are.

The way this works is that for all the vertices of an object you want to render, the position of the object gets attached to all vertices (currently in the color spot). The vertex shader then just adds the offset vector to the vertex coordinate with respect to the origin.
|{{cite web |url=
|title=<nowiki>Re: [Flightgear-devel] cities in FG & how to move forward</nowiki>
|author=<nowiki>Renk Thorsten</nowiki>
}}}}
{| class="wikitable"
! Input
| Vertex attributes
|-
! Output
| At least the vertex position (in clip space)
|-
! Restrictions
| Cannot access any vertex other than the current one
|-
! Note
| ''Loading a vertex shader turns off parts of the OpenGL pipeline (vertex shaders fully replace the "Texturing & Lighting unit").''
|}
Objects in a computer graphics scene are usually meshes that are made up of polygons. The corner of each of those polygons is called a ''vertex''. A vertex shader receives input in the form of per-vertex variables called ''attribute variables'', and per-polygon variables called ''uniform variables''.
The vertex shader must specify the coordinates of the vertex in question. This way, the geometry of the object can be modified.
Vertex shaders operate on each vertex: the vertex shader is executed for every vertex-related OpenGL call (for example <code>glVertex*</code> or <code>glDrawArrays</code>). Accordingly, for a mesh that contains 5000 vertices, the vertex shader will be executed 5000 times.
A single vertex itself is composed of a number of ''attributes'' (vertex attribs), such as position, texture coordinates, normal and color, to name the most common.
The position attribute is the most important one. The coordinates (x, y and z) of the vertex's incoming position are those given by the 3D modeler during the creation of the 3D model. The vertex's position is defined in the local space of the mesh (or object space).
A vertex shader provides almost full control over what is happening with each vertex. Consequently, all per-vertex operations of the fixed function OpenGL pipeline are replaced by the custom vertex shader.
Vertex shaders take application geometry and per-vertex attributes as input and transform the input data in some meaningful way.
* A vertex shader '''must''' write to <code>gl_Position</code>
* A vertex shader '''can''' write to <code>gl_PointSize</code> and <code>gl_ClipVertex</code>
* <code>gl_Vertex</code> is an attribute supplying the untransformed vertex coordinate
* <code>gl_Position</code> is a special output variable for the transformed vertex coordinate
A vertex shader can also set other variables, called ''varying variables''. The values of these variables are passed on to the second kind of shader, the ''fragment shader''. The fragment shader is run for every pixel on the screen where the polygons of the mesh appear. The fragment shader is responsible for setting the final color of that little piece of the mesh.
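As a minimal sketch of this mechanism (the variable name <code>vColor</code> is just an illustrative choice, not a built-in), a varying written per vertex and read per fragment might look like this:
<syntaxhighlight lang="glsl">
// --- Vertex shader ---
varying vec4 vColor;        // hypothetical varying; the name is arbitrary

void main() {
    vColor = gl_Color;      // computed once per vertex
    gl_Position = ftransform();
}

// --- Fragment shader (a separate file) ---
varying vec4 vColor;        // same declaration on the receiving side

void main() {
    gl_FragColor = vColor;  // value arrives interpolated across the primitive
}
</syntaxhighlight>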
Common tasks for a vertex shader include:
* Vertex position transformation
* Per vertex lighting
* Setting up data for fragment shaders
The vertex shader runs from start to end for each and every vertex that's passed into the graphics card; the fragment shader does the same thing at the pixel level. In most scenes there are a heck of a lot more pixel fragments than there are vertices, so the performance of the fragment shader is vastly more important, and any work we can move into the vertex shader, we probably should.
A minimal vertex shader may look like this:
<syntaxhighlight lang="glsl">
void main(void)
{
    gl_Position = ftransform();
}
</syntaxhighlight>

=== Fragment shaders ===
{{FGCquote
|The fragment shader basically knows only the pixel it is about to render and whatever information is passed from the vertex shader. Based on "where" the vertex shader says the pixel is, the rasterizer stage determines what the texture for the pixel should be. But there are techniques to do this in a different way; for instance the water depth map uses world coordinates to look up a world texture, and the gardens would have to be drawn in a similar way.
|{{cite web |url=
|title=<nowiki>Re: [Flightgear-devel] cities in FG & how to move forward</nowiki>
|author=<nowiki>Renk Thorsten</nowiki>
|date=<nowiki>2014-05-11</nowiki>
}}}}
{| class="wikitable"
! Input
| Interpolation of the vertex shader outputs
|-
! Output
| Usually a fragment color
|-
! Restrictions
| Fragment shaders have no knowledge of neighboring pixels
|-
! Note
| ''Loading a fragment shader turns off parts of the OpenGL pipeline (pixel shaders fully replace the "Texturing Unit").''
|}
The other shader (the ''fragment shader'', also (incorrectly) known as the "pixel shader") takes one fragment from the rasterizer and generates one pixel to write or blend into the frame buffer.
A fragment shader can write to the following special output variables:
* <code>gl_FragColor</code> to set the color of the fragment
* <code>gl_FragData[n]</code> to output to a specific render target
* <code>gl_FragDepth</code> to set the fragment depth
Common tasks of fragment shaders include:
* Texturing (even procedural)
* Per pixel lighting and material application
* Ray tracing
* Fragment color computation
* Operations on interpolated values
* Doing operations per fragment to make pretty pictures
A minimal fragment shader may look like this:
<syntaxhighlight lang="glsl">
void main(void)
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
</syntaxhighlight>
A fragment shader takes perspective-correct interpolated attribute values as input and either discards the fragment or outputs the fragment's color.
Fragment shaders operate on every fragment which is produced by rasterization. Fragment shaders give you nearly full control over what is happening with each fragment. However just like vertex shaders, a fragment shader replaces all per-fragment operations of the fixed function OpenGL pipeline.
== Data types in GLSL ==
Note that there is no implicit type conversion in GLSL, all conversions and initializations have to be done using explicit constructor calls!
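For example, each of the following requires an explicit constructor call (a sketch; early GLSL versions reject the implicit forms):
<syntaxhighlight lang="glsl">
float f = 1.0;           // "float f = 1;" would be an implicit int-to-float conversion
int   i = int(f);        // float -> int must use the int() constructor
bool  b = bool(i);       // likewise for bool: non-zero becomes true
vec3  v = vec3(1.0);     // one scalar replicated into all three components
vec4  w = vec4(v, 1.0);  // a vec3 plus one float assembled into a vec4
</syntaxhighlight>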
=== Scalars ===
* <code>float</code> – 32 bit, very nearly IEEE-754 compatible
* <code>int</code> – at least 16 bit, but not backed by a fixed-width register
* <code>bool</code> – like C++, but must be explicitly used for all flow control
=== Vectors ===
* <code>vec2</code>, <code>vec3</code>, <code>vec4</code> – 2D, 3D and 4D floating point vectors
* <code>ivec2</code>, <code>ivec3</code>, <code>ivec4</code> – 2D, 3D and 4D integer vectors
* <code>bvec2</code>, <code>bvec3</code>, <code>bvec4</code> – 2D, 3D and 4D boolean vectors
Accessing a vector's components can be done using letters as well as standard C selectors: <code>x</code>, <code>y</code>, <code>z</code>, <code>w</code> for positions, <code>r</code>, <code>g</code>, <code>b</code>, <code>a</code> for colors, and <code>s</code>, <code>t</code>, <code>p</code>, <code>q</code> for texture coordinates.
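A short sketch of this component access (often called "swizzling"):
<syntaxhighlight lang="glsl">
vec4 pos = vec4(1.0, 2.0, 3.0, 4.0);
float x   = pos.x;          // same component as pos[0]
vec2  xy  = pos.xy;         // first two components
vec3  col = pos.rgb;        // r,g,b address the same slots as x,y,z
vec4  rev = pos.wzyx;       // swizzles may reorder components
pos.xw = vec2(5.0, 6.0);    // and may appear on the left-hand side
</syntaxhighlight>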
=== Matrices ===
* <code>mat2</code> – 2×2 floating point matrix
* <code>mat3</code> – 3×3 floating point matrix
* <code>mat4</code> – 4×4 floating point matrix
=== Samplers ===
In GLSL, textures are represented and accessed using so-called "samplers", which are used for sampling textures and which have to be declared uniform. The following samplers are available:
* <code>sampler1D</code>, <code>sampler2D</code>, <code>sampler3D</code> – 1D, 2D and 3D texture
* <code>samplerCube</code> – cube map texture
* <code>sampler1DShadow</code>, <code>sampler2DShadow</code> – 1D and 2D depth-component texture
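As a sketch, a fragment shader samples a 2D texture like this (the sampler name <code>baseTexture</code> is hypothetical; its texture unit is set by the application, and the vertex shader is assumed to have written <code>gl_TexCoord[0]</code>, e.g. via <code>gl_TexCoord[0] = gl_MultiTexCoord0;</code>):
<syntaxhighlight lang="glsl">
uniform sampler2D baseTexture;  // hypothetical name; bound to a texture unit by the application

void main() {
    // look up the texel at the interpolated texture coordinate
    vec4 texel = texture2D(baseTexture, gl_TexCoord[0].st);
    gl_FragColor = texel;
}
</syntaxhighlight>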
=== Arrays ===
GLSL supports the same syntax for creating arrays that is already known from C or C++, e.g.:
<syntaxhighlight lang="glsl"> vec2 foo[10];</syntaxhighlight>
So, arrays can be declared using the same syntax as in C, but cannot be initialized when declared. Accessing an array's elements is done as in C.
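A small sketch of these rules (the array name and helper function are illustrative only):
<syntaxhighlight lang="glsl">
vec2 offsets[4];  // declaration only; an initializer list here is not allowed in classic GLSL

void init_offsets() {
    offsets[0] = vec2(0.0, 0.0);  // elements are assigned one at a time, C-style indexing
    offsets[3] = vec2(1.0, 1.0);
}
</syntaxhighlight>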
=== Structures ===
Structures can also be created like in C or C++, e.g.:
<syntaxhighlight lang="glsl">
struct foo {
    vec3 pos;
};
</syntaxhighlight>
== Global storage qualifiers ==
Used for communication between shaders and application:
* <code>const</code> – For declaring non-writable, compile-time constant variables
* <code>attribute</code> – For frequently changing (per vertex) information passed from the application to a vertex shader (no integers, bools, structs, or arrays)
* <code>uniform</code> – For infrequently changing (per primitive) information passed from the application to a vertex or fragment shader: constant shader parameters that can be changed between draws (cannot be written to in a shader; do not change per-vertex or per-fragment)
* <code>varying</code> – For information passed from a vertex shader to a fragment shader, will be interpolated in a perspective-correct manner during rasterization (can write in vertex shader, but only read in fragment shader)
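To illustrate the qualifiers side by side, here is a sketch of a vertex shader; the names <code>windFactor</code>, <code>simTime</code> and <code>eyeDistance</code> are hypothetical, not part of any actual FlightGear effect:
<syntaxhighlight lang="glsl">
const float PI = 3.14159265;    // const: compile-time constant

attribute float windFactor;     // attribute: varies per vertex, fed by the application
uniform   float simTime;        // uniform: constant for the whole draw call
varying   float eyeDistance;    // varying: interpolated, then read by the fragment shader

void main() {
    vec4 pos = gl_Vertex;
    pos.x += sin(simTime * 2.0 * PI) * windFactor;         // simple per-vertex sway
    eyeDistance = length((gl_ModelViewMatrix * pos).xyz);  // eye-space distance, e.g. for fog
    gl_Position = gl_ModelViewProjectionMatrix * pos;
}
</syntaxhighlight>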
== Functions ==
* Much like C++
* Entry point into a shader is <code>void main()</code>
* Overloading based on parameter type (but not return type)
* No support for direct or indirect recursion
* Call by value-return calling convention
As in C, a shader is structured in functions; each type of shader must have at least a main function, declared with the following syntax: <code>void main()</code>
User-defined functions may be defined. As in C, a function may have a return value and use the <code>return</code> statement to pass out its result. A function can be <code>void</code>. The return type can be any type except an array.
=== Parameter qualifiers ===
The parameters of a function may have the following qualifiers:
* <code>in</code> – copy in, but don't copy back out (still writable within the function)
* <code>out</code> – only copy out; undefined at function entry
* <code>inout</code> – copy in and copy out
If no qualifier is specified, it is considered to be <code>in</code> by default.
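A sketch of all three qualifiers in one (hypothetical) function:
<syntaxhighlight lang="glsl">
float attenuate(in float dist, out float invSq, inout float total)
{
    dist  = max(dist, 1.0);       // "in": local copy is writable, caller's value untouched
    invSq = 1.0 / (dist * dist);  // "out": undefined on entry, must be written before use
    total += invSq;               // "inout": arrives with the caller's value, copied back out
    return invSq;
}
</syntaxhighlight>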
== Built-ins ==

=== Vertex shader ===
* <code>vec4 gl_Position;</code> – '''must''' be written
* <code>vec4 gl_ClipVertex;</code> – may be written
* <code>float gl_PointSize;</code> – may be written
=== Fragment shader ===
* <code>vec4 gl_FragColor;</code> – may be written
* <code>float gl_FragDepth;</code> – may be read/written
* <code>vec4 gl_FragCoord;</code> – may be read
* <code>bool gl_FrontFacing;</code> – may be read
=== Vertex attributes ===
Only available in vertex shaders.
* <code>attribute vec4 gl_Vertex;</code>
* <code>attribute vec3 gl_Normal;</code>
* <code>attribute vec4 gl_Color;</code>
* <code>attribute vec4 gl_SecondaryColor;</code>
* <code>attribute vec4 gl_MultiTexCoordn;</code>
* <code>attribute float gl_FogCoord;</code>
=== Uniforms ===
* <code>uniform mat4 gl_ModelViewMatrix;</code>
* <code>uniform mat4 gl_ProjectionMatrix;</code>
* <code>uniform mat4 gl_ModelViewProjectionMatrix;</code>
* <code>uniform mat3 gl_NormalMatrix;</code>
* <code>uniform mat4 gl_TextureMatrix[n];</code>
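Using the matrix uniforms, the fixed-function vertex transform can be written out by hand; this is nearly what <code>ftransform()</code> computes, although <code>ftransform()</code> additionally guarantees invariance with the fixed pipeline:
<syntaxhighlight lang="glsl">
void main() {
    // transform the vertex from object space to clip space
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    // equivalently, in two steps via eye space:
    // vec4 eye = gl_ModelViewMatrix * gl_Vertex;
    // gl_Position = gl_ProjectionMatrix * eye;
}
</syntaxhighlight>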
<syntaxhighlight lang="glsl">
struct gl_MaterialParameters {
    vec4 emission;
    vec4 ambient;
    vec4 diffuse;
    vec4 specular;
    float shininess;
};
</syntaxhighlight>
* <code>uniform gl_MaterialParameters gl_FrontMaterial;</code>
* <code>uniform gl_MaterialParameters gl_BackMaterial;</code>
<syntaxhighlight lang="glsl">
struct gl_LightSourceParameters {
    vec4 ambient;
    vec4 diffuse;
    vec4 specular;
    vec4 position;
    vec4 halfVector;
    vec3 spotDirection;
    float spotExponent;
    float spotCutoff;
    float spotCosCutoff;
    float constantAttenuation;
    float linearAttenuation;
    float quadraticAttenuation;
};
</syntaxhighlight>
* <code>uniform gl_LightSourceParameters gl_LightSource[gl_MaxLights];</code>
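A sketch of how these built-in material and light structs are typically combined for simple per-vertex diffuse lighting (assuming a directional light in slot 0; the varying name <code>litColor</code> is illustrative):
<syntaxhighlight lang="glsl">
varying vec4 litColor;  // passed on to the fragment shader

void main() {
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);
    vec3 l = normalize(gl_LightSource[0].position.xyz);  // directional light assumed
    float NdotL = max(dot(n, l), 0.0);

    // ambient plus diffuse contribution from light 0
    litColor = gl_FrontMaterial.ambient * gl_LightSource[0].ambient
             + gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse * NdotL;

    gl_Position = ftransform();
}
</syntaxhighlight>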
=== Varyings ===
An interface between vertex and fragment shaders is provided by varying variables: vertex shaders compute values per vertex and fragment shaders compute values per fragment. The value of a varying variable defined in a vertex shader will be interpolated (perspective-correct) over the primitive being rendered, and the interpolated value can be accessed in the fragment shader.

Varying variables can only be used with the data types <code>float</code>, <code>vec2</code>, <code>vec3</code>, <code>vec4</code>, <code>mat2</code>, <code>mat3</code> and <code>mat4</code> (and arrays of them too).

* <code>varying vec4 gl_FrontColor; // vertex</code>
* <code>varying vec4 gl_BackColor; // vertex</code>
* <code>varying vec4 gl_FrontSecColor; // vertex</code>
* <code>varying vec4 gl_BackSecColor; // vertex</code>
* <code>varying vec4 gl_Color; // fragment</code>
* <code>varying vec4 gl_SecondaryColor; // fragment</code>
* <code>varying vec4 gl_TexCoord[]; // both</code>
* <code>varying float gl_FogFragCoord; // both</code>
== Anatomy of a shader ==
A shader's entry point is the <code>main</code> function, which returns <code>void</code> and takes no arguments.
=== Anatomy of a vertex shader ===
The function <code>void main()</code> is called afresh for each vertex in the 3D object model:
<syntaxhighlight lang="glsl">
// Vertex shader
void main() {
    gl_Position = gl_Vertex;
}
</syntaxhighlight>
=== Anatomy of a fragment shader ===
The function <code>void main()</code> is called afresh for each fragment/pixel in the 3D object model:
<syntaxhighlight lang="glsl">
// Fragment shader
void main() {
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
</syntaxhighlight>
== Practical application – ALS landing lights – spotlight ==
[[File:ALS Secondary Light Proof of Concept.png|thumb|300px|ALS secondary light proof of concept]]
[[File:Als Secondary Lights combined with Fog Effect.jpg|thumb|300px|Weather settings to produce fog and ALS landing lights on a runway.]]
[[File:Model on Water and Trees on Land.jpg|thumb|300px|Model on water and trees on land ALS lights effect]]
[[File:Model on Water.jpg|thumb|300px|ALS lights effect over model and water.]]
[[File:ALS Lights over Model and Terrain.jpg|thumb|300px|ALS lights over model and terrain]]
The ALS landing lights-spotlight (we'll call it ALS lights from now on) is a good example for showing how to incorporate a shader effect into FlightGear, as it touches many parts of the visuals we see and many parts of the coding pipeline.
In the case of ALS lights, you have to add the effect to every visual item rendered on the screen that you want to see a light shining on. If you want it to be capable of shining on everything, you have to account for each separate item and how that item is rendered. That is a lot of code to touch. The list might include:
* Terrain
** Runway
** Dirtrunway
** Agriculture
* Models
** AI
** Aircraft
** Tree
** Buildings
* Weather
** Fog
** Clouds
** Hazes
* Water
** Inland
** Ocean
** Stream

Some of these items may be controlled or rendered by the same effect and shader file. They might have to be accounted for individually. They may have special lighting influences that have to be accounted for. You have to take each one separately and account for all its needs.

The example highlighted in this article is what was added to <code>tree.eff</code> to shine the lights on trees.

=== Program flow simplified ===
Preferences/Nasal/XML → Property tree → Effect file → Shader → Rendered to screen

=== Preferences/Nasal/XML ===
Any combination of preferences, [[Nasal|Nasal]] or [[Xml|XML]] manipulates data in the [[Property Tree|property tree]]. In this case the switch to turn on the landing or spot light, and a couple of other needed data containers, are defined in <code>$FG_ROOT/preferences.xml</code> with the following lines:
<syntaxhighlight lang="xml">
<als-secondary-lights>
  <use-searchlight type="bool">false</use-searchlight>
  <use-landing-light type="bool">false</use-landing-light>
  <use-alt-landing-light type="bool">false</use-alt-landing-light>
  <landing-light1-offset-deg type="float">0.0</landing-light1-offset-deg>
  <landing-light2-offset-deg type="float">0.0</landing-light2-offset-deg>
</als-secondary-lights>
</syntaxhighlight>
They show up in the property tree under <code>sim/rendering/als-secondary-lights</code> and can be activated or manipulated by normal Nasal calls or XML.
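For instance, once these properties exist, a Nasal snippet (illustrative only, using the standard <code>setprop</code>/<code>getprop</code> calls) can toggle the effect at run time:
<syntaxhighlight lang="nasal">
# switch the ALS landing light on and nudge its offset
setprop("/sim/rendering/als-secondary-lights/use-landing-light", 1);
setprop("/sim/rendering/als-secondary-lights/landing-light1-offset-deg", 5.0);

# read the current state back
var lightOn = getprop("/sim/rendering/als-secondary-lights/use-landing-light");
</syntaxhighlight>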
=== Property tree ===
The [[Property Tree|property tree]] is like the CPU of the [[FlightGear]] program at a user level. It's a go-between that allows the user to see and influence many aspects at the heart of the program in almost real time. More of the internals of FlightGear are being exposed to the property tree than ever before. This allows us to have user-level access to areas of the code that used to be reserved for [[Programming Resources|programmers]]. Because of the manner in which the property tree is fed information, one step removed from the C++ source, care must be taken in how it is used: depending on how it is used, it won't be as responsive to manipulation as it would be if you were to change the same information at the source level.

=== Effects file ===
The effects file is the mechanism we use to combine and manipulate all the necessary data to create stunning visual effects. It's the link between the data contained and produced in Nasal, XML and the property tree, and the graphics rendering pipeline. It's there to allow us to create these effects without having to know or use the C++ code base. Its flexible framework allows for an almost infinite range of sophisticated effects. See this page for more details: [[Effect Framework]]

==== Parameters ====
Parameter entries defined in the effect file correspond to property tree data containers (static or variable). They contain the data needed by the shader program to perform its magic. The type of information contained in the property tree might be program control data, or variable/static data that the shader program can manipulate prior to sending it on to render. In the case of ALS lights, below is some of the data passed to, and used by, the shader program.
<syntaxhighlight lang="xml">
<display_xsize><use>/sim/startup/xsize</use></display_xsize>
<display_ysize><use>/sim/startup/ysize</use></display_ysize>
<view_pitch_offset><use>/sim/current-view/pitch-offset-deg</use></view_pitch_offset>
<view_heading_offset><use>/sim/current-view/heading-offset-deg</use></view_heading_offset>
<view_fov><use>/sim/current-view/field-of-view</use></view_fov>
<use_searchlight><use>/sim/rendering/als-secondary-lights/use-searchlight</use></use_searchlight>
<use_landing_light><use>/sim/rendering/als-secondary-lights/use-landing-light</use></use_landing_light>
<use_alt_landing_light><use>/sim/rendering/als-secondary-lights/use-alt-landing-light</use></use_alt_landing_light>
<landing_light1_offset><use>/sim/rendering/als-secondary-lights/landing-light1-offset-deg</use></landing_light1_offset>
<landing_light2_offset><use>/sim/rendering/als-secondary-lights/landing-light2-offset-deg</use></landing_light2_offset>
<quality_level><use>/sim/rendering/shaders/landmass</use></quality_level>
<tquality_level><use>/sim/rendering/shaders/transition</use></tquality_level>
</syntaxhighlight>
Note the <code>use-searchlight</code> entry: it points to the <code>use-searchlight</code> entry in the property tree under <code>sim/rendering/als-secondary-lights</code> that was defined in <code>preferences.xml</code>. Some of this data may play a dual role inside the shader program; in other words, it might be used to control other functions in addition to ALS lights. There will also be other parameter entries that have nothing to do with ALS lights. They might be used for other actions or effects the shader is handling.

==== Technique ====
In general, the shader program and the uniforms are defined between the technique tags. The technique is assigned an index to distinguish one technique from another (<code>technique n="1"</code>). As is the case with <code>tree.eff</code>, sometimes the shader program and its uniforms are defined and needed in more than one technique.
In the case of <code>tree.eff</code> it is used in techniques 4 and 5, which means that in FlightGear, the tree shader set to either of the two highest shader settings still produces ALS lights when activated.

==== Shader program ====
Next comes the entry to define what shader program the parameter data is going to be passed to. This is where you specify what shader program is to be used by the technique. ALS has the lowest techniques, with higher quality preceding lower quality.
<syntaxhighlight lang="xml">
<program>
  <fragment-shader>Shaders/tree-ALS.frag</fragment-shader>
  <fragment-shader>Shaders/secondary_lights.frag</fragment-shader>
</program>
</syntaxhighlight>
In the case of ALS lights, so far we only have to deal with the fragment shader. The program section of the effect file is a nifty method that allows users to add shaders to FlightGear without having to add code to the C++ language base. The C++ base is programmed to recognize the XML tag pair <code>&lt;program&gt;&lt;/program&gt;</code> and thus incorporate the GLSL program files pointed to between the tags. Otherwise you would have to add the GLSL program calls in the C++ base, requiring a completely different set of programming skills and the necessity of recompiling FlightGear every time you want to add a new shader. It can work this way because shader programs are compiled at run time. We'll describe the contents of the shader programs below. For now, suffice it to say <code>tree-ALS.frag</code> contains the main program and <code>secondary_lights.frag</code> has functions that are passed uniform data that is manipulated and returned to main for processing.
==== Uniforms ====
The uniforms section is the mechanism that feeds the parameter data to the shader program.
<syntaxhighlight lang="xml">
<uniform>
  <name>view_pitch_offset</name>
  <type>float</type>
  <value><use>view_pitch_offset</use></value>
</uniform>
<uniform>
  <name>view_heading_offset</name>
  <type>float</type>
  <value><use>view_heading_offset</use></value>
</uniform>
<uniform>
  <name>field_of_view</name>
  <type>float</type>
  <value><use>view_fov</use></value>
</uniform>
<uniform>
  <name>landing_light1_offset</name>
  <type>float</type>
  <value><use>landing_light1_offset</use></value>
</uniform>
<uniform>
  <name>landing_light2_offset</name>
  <type>float</type>
  <value><use>landing_light2_offset</use></value>
</uniform>
<uniform>
  <name>use_searchlight</name>
  <type>int</type>
  <value><use>use_searchlight</use></value>
</uniform>
<uniform>
  <name>use_landing_light</name>
  <type>int</type>
  <value><use>use_landing_light</use></value>
</uniform>
<uniform>
  <name>use_alt_landing_light</name>
  <type>int</type>
  <value><use>use_alt_landing_light</use></value>
</uniform>
<uniform>
  <name>display_xsize</name>
  <type>int</type>
  <value><use>display_xsize</use></value>
</uniform>
<uniform>
  <name>display_ysize</name>
  <type>int</type>
  <value><use>display_ysize</use></value>
</uniform>
</syntaxhighlight>
Note that the name <code>use_searchlight</code>, which was originally defined in <code>preferences.xml</code> and then became an entry in parameters, is now being passed to the shader program by the uniform. Below in the "Shader programs" section, we will show you how the shader receives the uniform's data.

=== Shader programs ===
The shader programs used in this example are <code>tree-ALS.frag</code> and <code>secondary_lights.frag</code>.

=== secondary_lights.frag ===
<code>secondary_lights.frag</code> consists of:
* Uniform inputs (data coming into the shader to be manipulated)
* Functions that manipulate the uniform data

Following is the actual GLSL code in <code>secondary_lights.frag</code>.
==== Uniform input ====
<syntaxhighlight lang="glsl">
uniform int display_xsize;
uniform int display_ysize;
uniform float field_of_view;
uniform float view_pitch_offset;
uniform float view_heading_offset;
</syntaxhighlight>

==== Functions ====
<syntaxhighlight lang="glsl">
float light_distance_fading(in float dist)
{
    return min(1.0, 10000.0 / (dist * dist));
}

float fog_backscatter(in float avisibility)
{
    return 0.5 * min(1.0, 10000.0 / (avisibility * avisibility));
}

vec3 searchlight()
{
    vec2 center = vec2(float(display_xsize) * 0.5, float(display_ysize) * 0.4);
    float headlightIntensity;
    float lightRadius = float(display_xsize) * 9.16 / field_of_view;
    float angularDist = length(gl_FragCoord.xy - center);

    if (angularDist < lightRadius)
    {
        headlightIntensity = pow(cos(angularDist / lightRadius * 1.57075), 2.0);
        //headlightIntensity *= clamp(1.0 + 0.15 * log(1000.0/(dist*dist)), 0.0, 1.0);
        return headlightIntensity * vec3(0.5, 0.5, 0.5);
    }
    else return vec3(0.0, 0.0, 0.0);
}

vec3 landing_light(in float offset)
{
    float fov_h = field_of_view;
    float fov_v = float(display_ysize) / float(display_xsize) * field_of_view;

    float yaw_offset;
    if (view_heading_offset > 180.0)
        {yaw_offset = -360.0 + view_heading_offset;}
    else
        {yaw_offset = view_heading_offset;}

    float x_offset = float(display_xsize) / fov_h * (yaw_offset + offset);
    float y_offset = -(float(display_ysize) / fov_v * view_pitch_offset);

    vec2 center = vec2(float(display_xsize) * 0.5 + x_offset, float(display_ysize) * 0.4 + y_offset);
    float landingLightIntensity;
    float lightRadius = float(display_xsize) * 9.16 / field_of_view;
    float angularDist = length(gl_FragCoord.xy - center);

    if (angularDist < lightRadius)
    {
        landingLightIntensity = pow(cos(angularDist / lightRadius * 1.57075), 2.0);
        //landingLightIntensity *= min(1.0, 10000.0/(dist*dist));
        return landingLightIntensity * vec3(0.5, 0.5, 0.5);
    }
    else return vec3(0.0, 0.0, 0.0);
}
</syntaxhighlight>

=== tree-ALS.frag ===
<code>tree-ALS.frag</code> consists of:
* Uniform inputs (data coming into the shader to be manipulated)
* Functions that manipulate the uniform data

Following is the actual GLSL code in <code>tree-ALS.frag</code>. While there is significantly more code in <code>tree-ALS.frag</code>, only the code that was added for the ALS lights is shown and discussed here.

==== Uniform input ====
Uniform data is brought into the shader in the following manner.

<syntaxhighlight lang="glsl">
uniform float landing_light1_offset;
uniform float landing_light2_offset;
uniform int use_searchlight;
uniform int use_landing_light;
uniform int use_alt_landing_light;
uniform int quality_level;
uniform int tquality_level;
</syntaxhighlight>

Note <code>use_searchlight</code> and how it is declared as incoming uniform data.

==== Variable data ====
Variables can also be defined inside the shader program. An example of a variable defined in the shader program that is needed for the ALS lights is:

<syntaxhighlight lang="glsl">
vec3 secondary_light = vec3 (0.0,0.0,0.0);
</syntaxhighlight>

==== Functions ====
The declarations for the functions defined in <code>secondary_lights.frag</code> are:

<syntaxhighlight lang="glsl">
vec3 searchlight();
vec3 landing_light(in float offset);
</syntaxhighlight>

You don't have to use any path information or includes in the code, because the GLSL compiler takes care of linking all the shader programs together as long as they are listed correctly in the "program" section of the effect file. Variable data can be passed to and returned from GLSL functions just like in any other language.

==== Main program ====
The <code>main()</code> function is the heart of the shader program.
This is where the shader program manipulates all the data made available to it.

<syntaxhighlight lang="glsl">
void main()
{
    vec3 secondary_light = vec3 (0.0, 0.0, 0.0);

    if ((quality_level > 5) && (tquality_level > 5))
    {
        if (use_searchlight == 1)
        {
            secondary_light += searchlight();
        }
        if (use_landing_light == 1)
        {
            secondary_light += landing_light(landing_light1_offset);
        }
        if (use_alt_landing_light == 1)
        {
            secondary_light += landing_light(landing_light2_offset);
        }
    }

    vec4 fragColor = vec4(gl_Color.rgb + secondary_light * light_distance_fading(dist), 1.0) * texel;
}
</syntaxhighlight>

Note how <code>use_searchlight</code> is used in the main function to determine whether the property defined in <code>preferences.xml</code> and manipulated in the property tree is set to true (1). Some of the variable data contained in the shader program (such as <code>dist</code> and <code>texel</code> above) is used for other purposes and is introduced into the shader program by other property, parameter, and uniform definitions not pertaining to ALS lights.

==== File list ====
Files that are directly touched by this effect include:

* <code>preferences.xml</code>
* <code>agriculture.eff</code>:
** Inherits properties from <code>crop</code> << <code>terrain-default</code>
** Adds program shaders (technique 2)
*** Fragment <code>agriculture-ALS.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 2)
* <code>airfield.eff</code>:
** Inherits properties from <code>terrain-default</code>
** Adds program shaders (technique 2)
*** Fragment <code>airfield-ALS.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 2)
* <code>building.eff</code>:
** Inherits properties from <code>model-combined-deferred</code> << <code>model-combined</code>
** Adds program shaders (technique 4)
*** Fragment <code>model-ALS-ultra.frag</code> and <code>secondary_lights.frag</code>
** Inherits uniforms from <code>model-combined-deferred</code> << <code>model-combined</code>
* <code>dirt-runway.eff</code>:
** Inherits properties from <code>crop</code> << <code>terrain-default</code>
** Adds program shaders (technique 2)
*** Fragment <code>drunway-ALS.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 2)
* <code>model-combined.eff</code>:
** Inherits properties from <code>model-default</code>
** Adds program shaders (technique 4)
*** Fragment <code>model-ALS-ultra.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 4)
* <code>model-default.eff</code>:
** Adds properties
** Adds program shaders (technique 5)
*** Fragment <code>model-ALS-base.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 5)
* <code>runway.eff</code>:
** Inherits properties from <code>terrain-default</code>
** Adds program shaders (technique 2)
*** Fragment <code>runway-ALS.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 2)
* <code>terrain-default.eff</code>:
** Adds properties
** Adds program shaders (technique 3)
*** Fragment <code>terrain-ALS-ultra.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 3)
* <code>tree.eff</code>:
** Adds properties
** Adds program shaders (techniques 4 and 5)
*** Fragment <code>tree-ALS.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (techniques 4 and 5)
* <code>urban.eff</code>:
** Inherits properties from <code>terrain-default</code>
** Adds program shaders (techniques 1 and 2)
*** Fragment <code>urban-ALS.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (techniques 1 and 2)
* <code>water.eff</code>:
** Inherits properties from <code>terrain-default</code>
** Adds program shaders (technique 1)
*** Fragment <code>water-ALS-high.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 1)
* <code>water-inland.eff</code>:
** Inherits properties from <code>terrain-default</code>
** Adds program shaders (technique 2)
*** Fragment <code>water-ALS-high.frag</code> and <code>secondary_lights.frag</code>
** Adds uniforms (technique 2)

== General comments from forum discussion ==
{{cquote|In principle, we always do the same steps in the fragment shaders to determine the color of a pixel:

* texel color - what is the base color of the pixel, fully lit and unfogged
* lighting - how is this color changed by the light falling onto that pixel; usually the relation is something like fragColor equals texel * light
* fogging - how much is the color hidden by haze; usually the relation is something like gl_FragColor equals mix(fragColor, hazeColor, transmission_factor)

What is displayed on the screen in the end is whatever gl_FragColor is set to.

But the location where this happens isn't always obvious - often (part of) the light is computed in the vertex shader already, in which case it typically enters the fragment shader as gl_Color. So, the lighting equation in tree-haze.frag is indeed

vec4 fragColor equals vec4 (gl_Color.rgb, 1.0) * texel;

and your change to the light should happen just before that. But you can't do

gl_Color.rgb equals gl_Color.rgb + my_light;

because gl_Color.rgb is a varying variable type, and you can't assign new values to those inside the shader, so you need to either make a new variable or just do

vec4 fragColor equals vec4 ((gl_Color.rgb + my_light), 1.0) * texel;

(Note that color.rgb is the same as color.xyz; GLSL doesn't really care which convention you use, but it took me a few months to learn that, so early code by myself often uses the xyz indexing convention for color vectors as well.)<ref>{{cite web |url= |title=ALS landing lights |author=Thorsten Renk |date= Tue Oct 07, 2014 12:04 -0700}}</ref>|Thorsten Renk}}

{{cquote|An effect is a container for a series of techniques, which are all the possible things you could do with some object. The <predicate> section inside each effect determines which of the techniques we actually run. Typically the predicate section contains conditionals on a) rendering framework properties, b) OpenGL extension support and c) quality level properties.
The renderer searches the effects from lowest to highest technique number and uses the first one that fits the bill (which is why high quality levels precede low quality levels). The rendering itself is specified as <pass> - each pass runs the renderer over the whole scene. We may sometimes make a quick first pass to fill the depth buffer or a stencil buffer, but usually we render with a single pass.

Techniques may not involve a shader at all - for instance, technique 12 in terrain-default is a pure fixed-pipeline fallback technique - but all the rendering you're interested in uses a shader, which means it has a <program> section which determines what actual code is supposed to run. If you run a shader, that needs parameters specified as <uniform>, and textures, which need to be assigned to a texture unit _and_ be declared as a uniform sampler. In addition, each technique contains a host of parameters configuring the fixed-pipeline elements, like alpha tests before the fragment stage, or depth buffer writing/reading, alpha blending, ... you can ignore them on the first go.

So if you want to look at which shader code is executed when you call the water effect at a certain quality level, you need to work your way through the predicate sections of the techniques from lowest to highest till you find the one that matches, and then look into the program section of that technique.

Now, to make matters more complicated, all the parameters and textures that are assigned to uniforms and texture units in the techniques need to be declared, and linked to properties where applicable, in the <parameters> section heading each effect declaration. So a texture you want to use has to appear three times - in the parameters section heading the effect, inside the technique assigned to a texture unit, and assigned to a uniform sampler.

Now, inheritance moves all the declared parameters and techniques from one effect to the one inheriting.
In the case of water, that means the technique is actually _not_ inherited, because terrain-default.eff doesn't contain a technique 1 at all, but the general <parameters> section is inherited. So you don't need to declare the additions to <parameters> again, but you do need to make the changes to the <programs> and the <uniform> sections.<ref>{{cite web |url= |title=ALS landing lights |author=Thorsten Renk |date= Wed Oct 08, 2014 1:58 -0700}}</ref>|Thorsten Renk}}

{{cquote|At low water shader quality, water isn't rendered separately from the terrain, i.e. it runs via terrain-default.eff - since you modified that to allow light, light works for water out of the box. At high water shader slider settings, the techniques in water.eff kick in, and only then do you need to modify the specific water shader code.

Now, the peculiar thing about water is that there are two effects (water.eff and water_inland.eff) sharing the same shader code (but calling it with somewhat different parameters). So the "function not found" error was presumably caused by you modifying water.eff whereas the water you were seeing was initiated by water_inland.eff, and in that effect, secondary_lights.frag wasn't initially in the program section. So if you alter the water shader code, you need to modify two effect files rather than one to pass the right parameters and access the functions you need.

[...]

Adding them _after_ fogging isn't what you want - you'll see that if visibility is <100 m, everything will go black at night, because fog color is black at night, so finalColor.rgb of a heavily fogged object will also be black, and then when you light it up, you multiply your light value with black and it'll be black. Or, if you add the light value (which you should only do if the object you add it to is a light color itself), then you'll get a featureless grey.
You want to add light after texel color and before fogging.<ref>{{cite web |url= |title=ALS landing lights |author=Thorsten Renk |date= Wed Oct 08, 2014 11:19 -0700}}</ref>|Thorsten Renk}}

{{cquote|So, in old times when rendering textures was slow and complicated, we rendered objects with monochromatic surface colors. Then the (schematic) lighting equation (without specular, and with the sum of ambient and diffuse already computed) was

visibleColor.rgb equals objectColor.rgb * light.rgb + objectEmissive.rgb

Now we have textures, and so we get

visibleColor.rgb equals objectColor.rgb * texel.rgb * light.rgb + objectEmissive.rgb + lightMapTexel.rgb

Since we can have the full color information in the texture, objectColor.rgb is usually (1.0, 1.0, 1.0), because the info is redundant. But if you don't use a texture, of course objectColor.rgb has the actual color value. (Incidentally, I think we shouldn't work with monochromatic untextured surfaces at all - they create a jarring visual impression which can be cured by using even a small texture...) But if you do, the rendering pipeline is set up to compute

color.rgb equals objectColor.rgb * light.rgb

in the vertex shader, so the equation we have in the fragment shader is something like

visibleColor.rgb equals color.rgb * texel.rgb + objectEmissive.rgb + lightMapTexel.rgb

and if we add a secondary light like

visibleColor.rgb equals (color.rgb + secLight.rgb) * texel.rgb

it of course can never recover the color information, because color.rgb is zero at night, since you multiplied the actual color with zero sunlight, and the texel doesn't carry information for an untextured object.

Since the secondary light is in screen coordinates, it can't appear in the vertex shader, so the solution would be to pass the actual color and light rather than their product to the fragment shader.
Which is expensive, because we need another varying vec3, and varying variable types fill memory and need to be computed and interpolated per vertex/per fragment - which is why I'm not sure whether we shouldn't accept the loss of the color...<ref>{{cite web |url= |title=ALS landing lights |author=Thorsten Renk |date= Sat Oct 11, 2014 1:28 -0700}}</ref>|Thorsten Renk}}

{{cquote|Inheritance works like this: the base effect has a list of things

A1 B1 C1 D1

The second effect inherits from 1 but just declares C2 and E2, so then the list is

A1 B1 C2 D1 E2

The third effect inherits from 2 and declares A3, so then the list is

A3 B1 C2 D1 E2

whereas if the third effect would inherit from 1, then the list would be

A3 B1 C1 D1

So if already present, inheritance overrides; if not present, it adds. I suspect that's why programs need to be indexed, so that they override and don't add...<ref>{{cite web |url= |title=ALS landing lights |author=Thorsten Renk |date= Sat Oct 11, 2014 11:33 -0700}}</ref>|Thorsten Renk}}

<references/>
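The three-step ordering described in the forum comments above (texel color, then lighting, then fogging) can be sketched as a skeleton fragment shader. The variable names and constant values below are illustrative only and are not taken from an actual FlightGear shader; in real ALS code the haze color and transmission factor come from the haze model, and the texture coordinate is passed in by the vertex shader:

<syntaxhighlight lang="glsl">
uniform sampler2D baseTexture;

void main()
{
    // 1. texel color: the base color of the pixel, fully lit and unfogged
    vec4 texel = texture2D(baseTexture, gl_TexCoord[0].st);

    // 2. lighting: add secondary light to the vertex-stage light (gl_Color);
    //    gl_Color is a varying, so the sum goes into a new variable
    vec3 secondary_light = vec3(0.1, 0.1, 0.1);   // e.g. a landing light term
    vec4 fragColor = vec4(gl_Color.rgb + secondary_light, 1.0) * texel;

    // 3. fogging: blend towards the haze color; this step must come last,
    //    so lights are dimmed by haze rather than painted over it
    vec3 hazeColor = vec3(0.8, 0.85, 0.9);
    float transmission_factor = 0.3;
    gl_FragColor = vec4(mix(fragColor.rgb, hazeColor, transmission_factor), fragColor.a);
}
</syntaxhighlight>

Adding a secondary light term between steps 1 and 3, as shown, is the pattern used throughout the ALS light examples in this article.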
[[Category:Howto|Shader Programming in FlightGear]]
[[Category:Shader development]]
[[Category: Core developer documentation]]
