Coordinate systems

From FlightGear wiki
Revision as of 10:00, 2 October 2011 by Zan

Introduction

This page describes the various coordinate reference frames used in FlightGear. It is mostly targeted at shader development; the Cartesian, geodetic, and other geographic coordinate systems are covered on other pages.

Coordinates in shader programs

Vertex program

A vertex program handles every vertex of a model. It is executed once per vertex, and can read attributes such as the vertex position, color, and normal. The vertex program's purpose is to transform vertices into screen space (i.e. to calculate their positions on the display).

gl_Vertex holds the vertex in the object's reference frame. Multiplying it by gl_ModelViewMatrix gives the vertex coordinate relative to the camera (eye point); we refer to this below as the eye frame. Multiplying further by gl_ProjectionMatrix (or using the combined gl_ModelViewProjectionMatrix) gives a screen-space coordinate, referred to below as the screen or view frame.
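The chain of transforms above can be sketched as a minimal vertex shader. This is a sketch using the legacy GLSL built-in matrices named in the text; the `eyePos` varying is an illustrative name, not a FlightGear convention:

```glsl
// Minimal vertex shader sketch (legacy GLSL 1.20 built-ins).
varying vec4 eyePos;  // eye-frame position, passed on to the fragment shader

void main()
{
    // Object frame -> eye frame
    eyePos = gl_ModelViewMatrix * gl_Vertex;

    // Eye frame -> screen (clip) space; equivalent to
    // gl_ProjectionMatrix * (gl_ModelViewMatrix * gl_Vertex)
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```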

Fragment program

A fragment program handles every pixel (fragment) that is rasterized to the screen. It cannot change a fragment's position; it can only calculate its color and transparency. It receives parameters from the vertex shader, interpolated between the values computed at the vertices.

You can access the screen coordinate with gl_FragCoord, and write the fragment's depth via gl_FragDepth.
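A minimal fragment shader sketch, consuming an interpolated eye-frame position (the `eyePos` varying is an illustrative name, and the fog distance of 5000.0 is an arbitrary value for the example):

```glsl
// Minimal fragment shader sketch (legacy GLSL 1.20 built-ins).
varying vec4 eyePos;  // interpolated eye-frame position from the vertex shader

void main()
{
    // Example: fade toward a haze color with distance from the eye point.
    float dist = length(eyePos.xyz);
    float fog  = clamp(dist / 5000.0, 0.0, 1.0);  // arbitrary fog range
    vec4 base  = vec4(1.0, 1.0, 1.0, 1.0);
    gl_FragColor = mix(base, vec4(0.5, 0.6, 0.8, 1.0), fog);

    // gl_FragCoord.xy holds the window-space position of the fragment;
    // gl_FragDepth may be written to override the default depth, e.g.:
    // gl_FragDepth = gl_FragCoord.z;
}
```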