{{cquote|In principle, we always do the same steps in the fragment shaders to determine the color of a pixel. But the location where this happens isn't always obvious - often (part of) the light is computed in the vertex shader already, in which case it typically enters the fragment shader as gl_Color.<ref>{{cite web |url=http://forum.flightgear.org/viewtopic.php?f=47&t=24226
|title=ALS landing lights
|author=Thorsten Renk |date=Wed Oct 08, 2014 11:19 -0700}}</ref>|Thorsten Renk}}
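The arrangement described above - lighting computed per vertex and arriving in the fragment shader already multiplied into gl_Color - can be sketched roughly as follows (a GLSL 1.20-era sketch using fixed-function built-ins; the variable names are illustrative and not taken from the actual ALS shaders):

<syntaxhighlight lang="glsl">
// --- vertex shader (sketch) ---
// Compute ambient + diffuse here and pass the PRODUCT of light and
// material color downstream as gl_FrontColor (read as gl_Color in
// the fragment shader).
varying vec2 texCoord;

void main() {
    vec3 n        = normalize(gl_NormalMatrix * gl_Normal);
    vec3 lightDir = normalize(gl_LightSource[0].position.xyz);
    float NdotL   = max(dot(n, lightDir), 0.0);

    vec4 light = gl_LightSource[0].ambient
               + gl_LightSource[0].diffuse * NdotL;

    gl_FrontColor = light * gl_Color;   // lit color, pre-multiplied
    texCoord      = gl_MultiTexCoord0.st;
    gl_Position   = ftransform();
}

// --- fragment shader (sketch) ---
// The lighting already happened; only the texel modulation is left.
uniform sampler2D texture;
varying vec2 texCoord;

void main() {
    gl_FragColor = gl_Color * texture2D(texture, texCoord);
}
</syntaxhighlight>

Note that at night the light term is (near) zero, so gl_Color.rgb arrives in the fragment shader as (near) zero - the surface hue is already gone before the fragment shader runs.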
{{cquote|So, in old times when rendering textures was slow and complicated, we rendered objects with monochromatic surface colors. Then the (schematic) lighting equation (without specular, and the sum of ambient and diffuse already computed) was <code>color.rgb = light.rgb * surfaceColor.rgb</code>. If a secondary light is then added to that result in the fragment shader, it of course can never recover the color information, because color.rgb is zero at night since you multiplied the actual color with zero sunlight and the texel doesn't carry information for an untextured object.
Since the secondary light is in screen coordinates, it can't appear in the vertex shader, so the solution would be to pass the actual color and light rather than their product to the fragment shader. Which is expensive, because we need another varying vec3, and varying variable types fill memory and need to be computed and interpolated per vertex/per fragment - which is why I'm not sure whether we shouldn't accept the loss of the color...<ref>{{cite web |url=http://forum.flightgear.org/viewtopic.php?f=47&t=24226&start=45#p220321
|title=ALS landing lights
|author=Thorsten Renk |date= Sat Oct 11, 2014 1:28 -0700}}</ref>|Thorsten Renk}}
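The alternative weighed in the quote - passing light and material color to the fragment shader separately, at the cost of one extra varying vec3 - might look like this (again an illustrative sketch, not the actual ALS code; <code>secondaryLight</code> is a hypothetical stand-in for the screen-space landing-light term):

<syntaxhighlight lang="glsl">
// --- vertex shader (sketch) ---
// Pass light and base color SEPARATELY; the extra varying is the
// memory/interpolation cost mentioned in the quote.
varying vec3 lightColor;   // ambient + diffuse, not yet multiplied
varying vec3 baseColor;    // material color: the extra varying vec3
varying vec2 texCoord;

void main() {
    vec3 n        = normalize(gl_NormalMatrix * gl_Normal);
    vec3 lightDir = normalize(gl_LightSource[0].position.xyz);
    float NdotL   = max(dot(n, lightDir), 0.0);

    lightColor  = gl_LightSource[0].ambient.rgb
                + gl_LightSource[0].diffuse.rgb * NdotL;
    baseColor   = gl_Color.rgb;
    texCoord    = gl_MultiTexCoord0.st;
    gl_Position = ftransform();
}

// --- fragment shader (sketch) ---
// The secondary (screen-space) light is added to the LIGHT term
// before the multiplication, so the surface color survives even
// when the primary light is zero at night.
uniform sampler2D texture;
uniform vec3 secondaryLight;   // hypothetical landing-light term
varying vec3 lightColor;
varying vec3 baseColor;
varying vec2 texCoord;

void main() {
    vec4 texel = texture2D(texture, texCoord);
    vec3 rgb = (lightColor + secondaryLight) * baseColor * texel.rgb;
    gl_FragColor = vec4(rgb, texel.a);
}
</syntaxhighlight>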
<references/>