HDR Pipeline: Difference between revisions

16 April 2021
A few implementation details for the curious:


* It's a deferred renderer. Most post-processing effects integrate very nicely with a deferred pipeline, and modern forward rendering techniques usually require a depth prepass. I haven't found an easy way to share culling results between passes in OSG so that the main forward pass reuses the culling information gathered during the depth prepass, so I preferred to just go deferred.
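The core deferred idea can be sketched in a few lines of Python. This is an illustration only: the real pipeline operates on GPU framebuffers via OSG, and the per-pixel attribute layout below is invented for the example. A geometry pass writes surface attributes into a G-buffer, and a separate lighting pass shades from those attributes without rasterizing the scene again.

```python
# Minimal sketch of deferred shading: a geometry pass fills a per-pixel
# G-buffer, and a lighting pass shades purely from those stored attributes.

def geometry_pass(fragments):
    """fragments: list of (pixel, albedo, normal) tuples from rasterization."""
    gbuffer = {}
    for pixel, albedo, normal in fragments:
        gbuffer[pixel] = {"albedo": albedo, "normal": normal}
    return gbuffer

def lighting_pass(gbuffer, light_dir):
    """Shade every G-buffer pixel with one directional light (Lambert term)."""
    image = {}
    for pixel, attrs in gbuffer.items():
        n = attrs["normal"]
        ndotl = max(0.0, sum(a * b for a, b in zip(n, light_dir)))
        image[pixel] = tuple(c * ndotl for c in attrs["albedo"])
    return image
```

Because lighting cost depends on G-buffer pixels rather than scene geometry, adding more lights or post-processing passes never re-traverses (or re-culls) the scene, which is the property the prepass-sharing problem above is about.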


* Entirely PBR-based. Old/legacy materials try to "translate" from the legacy ambient/diffuse/specular model to PBR metalness/roughness. Still, the classic pipeline isn't going anywhere. If a particular aircraft is very broken under this pipeline, the user is free to use ALS/classic.
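To illustrate what such a translation might look like, here is a hypothetical heuristic in Python. The pipeline's actual mapping is not described here; every rule below (the metalness test, the shininess-to-roughness curve) is an assumption made for the example.

```python
# Hypothetical heuristic for mapping a legacy ambient/diffuse/specular
# material to metalness/roughness -- NOT the pipeline's actual rule,
# just an illustration of the kind of translation involved.

def legacy_to_pbr(diffuse, specular, shininess):
    """diffuse, specular: RGB tuples in [0,1]; shininess: 0..128 (OpenGL range)."""
    spec_intensity = max(specular)
    diff_intensity = max(diffuse)
    # A bright specular color paired with a dark diffuse suggests a metal.
    metalness = 1.0 if spec_intensity > 0.5 and diff_intensity < 0.1 else 0.0
    # Higher shininess exponent -> smoother surface -> lower roughness.
    roughness = 1.0 - min(1.0, shininess / 128.0) ** 0.5
    # Metals take their base color from the specular tint, dielectrics
    # from the diffuse color.
    base_color = specular if metalness == 1.0 else diffuse
    return base_color, metalness, roughness
```

Any such heuristic is lossy, which is exactly why some aircraft can end up "very broken" and the ALS/classic fallback matters.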


* HDR and eye adaptation. Lighting values are written to HDR buffers and then tone mapped based on an exposure setting. This exposure parameter is calculated automatically based on the average scene luminance (like a camera on the Auto setting would do), but there is the possibility of lowering/increasing the exposure manually if the user feels the scene is too bright/dark. Maybe the pilot is wearing sunglasses? :)
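The auto-exposure idea can be sketched with the common "key value over log-average luminance" rule and a Reinhard-style tone mapping curve. The pipeline's actual operator and averaging scheme may differ, so treat this as a generic illustration rather than the implementation.

```python
import math

# Sketch of automatic exposure plus tone mapping, assuming a Reinhard-style
# operator and the classic key/log-average-luminance exposure rule.

def log_average_luminance(luminances, eps=1e-4):
    """Geometric mean of scene luminance; eps keeps log() safe on black pixels."""
    return math.exp(sum(math.log(eps + l) for l in luminances) / len(luminances))

def auto_exposure(luminances, key=0.18):
    """Scale factor mapping the average scene luminance to a mid-grey key."""
    return key / log_average_luminance(luminances)

def tonemap_reinhard(l, exposure, bias=0.0):
    """Map HDR luminance to [0,1); `bias` mimics a manual EV offset."""
    l = l * exposure * 2.0 ** bias
    return l / (1.0 + l)
```

The manual override mentioned above corresponds to the `bias` term: a positive value brightens the image by a stop per unit, a negative value darkens it, on top of whatever the automatic metering chose.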


* Real-time environment mapping. At the start of the frame we render to a cubemap, and subsequent passes use this information to evaluate indirect lighting.
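For readers unfamiliar with cubemaps: when a later pass looks up indirect lighting, the 3D lookup direction selects one of the six rendered faces by its dominant axis. The sketch below shows standard cubemap face addressing; it is generic graphics behaviour, not anything OSG-specific.

```python
# Standard cubemap face selection: the axis with the largest absolute
# component decides which of the six faces the direction samples.

def cubemap_face(direction):
    """Return '+x', '-x', '+y', '-y', '+z' or '-z' for a 3D direction."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x >= 0 else "-x"
    if ay >= az:
        return "+y" if y >= 0 else "-y"
    return "+z" if z >= 0 else "-z"
```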


* Miscellaneous post-processing passes like ambient occlusion, FXAA, bloom, etc.
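As an example of how one of these passes is structured, here is a minimal one-dimensional bloom sketch: threshold the bright pixels, blur them, and add the glow back. Real implementations work on downscaled GPU buffers with better blur kernels; this only shows the shape of the pass.

```python
# 1-D bloom sketch: bright-pass threshold, naive box blur, additive composite.

def bright_pass(pixels, threshold=1.0):
    """Keep only the energy above the threshold (HDR values can exceed 1)."""
    return [max(0.0, p - threshold) for p in pixels]

def box_blur(pixels, radius=1):
    """Naive box blur with edge clamping."""
    n = len(pixels)
    out = []
    for i in range(n):
        window = [pixels[min(n - 1, max(0, j))]
                  for j in range(i - radius, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out

def bloom(pixels, threshold=1.0, strength=0.5):
    """Composite the blurred bright regions back over the original image."""
    glow = box_blur(bright_pass(pixels, threshold))
    return [p + strength * g for p, g in zip(pixels, glow)]
```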
