Slaving for Dummies
Back in the early days of the project (pre-dating OSG and multi-window/multi-view support), multi-screen setups used a master/slave configuration based on the "netfdm" hack - i.e. multiple standalone fgfs instances (usually running on different computers) slaved to a single master via networking, with state (mainly FDM properties, sent as plain C structs) synchronized over UDP: Property_Tree/Native_Protocol_Slaving
There are a few hard-coded protocols for sync'ing other state across multiple instances.
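Besides the hard-coded native protocols, additional properties can be mirrored using FlightGear's generic protocol, which is defined in an XML file. The following is only an illustrative sketch - the file name and the property paths chosen here are made up for the example:

```xml
<?xml version="1.0"?>
<PropertyList>
 <generic>
  <output>
   <line_separator>newline</line_separator>
   <var_separator>,</var_separator>
   <chunk>
    <name>gear position</name>
    <node>/gear/gear/position-norm</node>
    <type>float</type>
    <format>%f</format>
   </chunk>
   <chunk>
    <name>flap position</name>
    <node>/surface-positions/flap-pos-norm</node>
    <type>float</type>
    <format>%f</format>
   </chunk>
  </output>
 </generic>
</PropertyList>
```

Assuming this were saved as $FG_ROOT/Protocol/sync-demo.xml, the master would add something like --generic=socket,out,10,slave-host,5600,udp,sync-demo and the slave --generic=socket,in,10,,5600,udp,sync-demo (with a matching input section in the protocol file).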
But overall, it is and remains a huge, ugly hack that happens to work well enough for some use cases, yet fails the instant someone wants to sync multiple subsystems (think AI/ATC or weather/environment).
Equally, instrumentation stuff (especially hard-coded MFDs) isn't easily sync'ed:
The main problem here is lack of consistency: we've seen half a dozen glass cockpit related efforts over the years - including OpenGC (early 2000s), FGGC (mid 2000s), and quite a few others in the meantime. At the end of the day, this always meant we had competing, and even conflicting, technology stacks - where one technology (instrument/MFD) would not work within the other's run-time environment. Canvas, coupled with HLA (or even just remote/telnet properties), has the potential to solve this once and for all.
You should be aware of glass cockpit related efforts, especially Canvas - most airliners & jets will sooner or later benefit from being ported to Canvas, e.g. to use Gijs' NavDisplay framework, or at least Philosopher's MapStructure framework for mapping purposes.
Thus, if this is also about the actual display itself, people should be aware of related Canvas efforts, especially FGCanvas. In the long term, I really want to support distributed FlightGear setups like those at FSWeekend/LinuxTag, where multiple computers may be used to run a single simulator session - including properly synchronized glass instruments like the PFD/ND. This would also help improve the multiplayer experience, especially dual-pilot setups.
Canvas-based MFDs can in theory be explicitly sync'ed using a generic protocol and/or a telnet connection (which does have support for basic "on demand" push semantics).
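As a sketch of the telnet route (the port number is arbitrary, and the subscribe command of the props/telnet server may not be available in every FlightGear version):

```shell
# Start fgfs with the built-in props/telnet server listening on port 5401
fgfs --telnet=5401

# From another terminal, attach with netcat:
#   "data" switches off the interactive prompt,
#   "subscribe" asks the server to push the property on change
nc localhost 5401
data
subscribe /instrumentation/altimeter/indicated-altitude-ft
# the server now pushes a line with the new value whenever the property changes
```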
But in reality, using a single instance with multiple views/windows tends to work better for more involved use cases, simply because much of FG hasn't been designed with a distributed IG setup in mind: Howto:Configure_camera_view_windows
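For reference, such a multi-window setup is configured through the camera-group settings in the property tree (e.g. in preferences.xml or a file passed via --config); the sketch below is a minimal, unverified example - the window name and view angle are made up, see the linked howto for the authoritative format:

```xml
<PropertyList>
 <sim>
  <rendering>
   <camera-group>
    <window>
     <name type="string">left-view</name>
     <display>0</display>
     <screen>0</screen>
     <width>1024</width>
     <height>768</height>
    </window>
    <camera>
     <window>
      <name>left-view</name>
     </window>
     <view>
      <!-- look 60 degrees to the left of the master view -->
      <heading-deg type="double">60</heading-deg>
     </view>
    </camera>
   </camera-group>
  </rendering>
 </sim>
</PropertyList>
```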
Obviously, there are performance issues, and in particular the restriction of only supporting slaved views - i.e. CompositeViewer support still is "pie in the sky" unfortunately, despite being regularly brought up: CompositeViewer_Support
FG devs are currently re-inventing CIGI functionality on top of HLA (see FGViewer), so that could be a more appropriate workaround than some generic protocol hacks:
Given that CIGI support doesn't exist so far, jumping on the HLA bandwagon would seem to be the right thing for a "proper" IG-based setup, though a workaround would seem possible using existing/extended I/O means. For FlightGear and any professional users, HLA and/or CIGI would obviously seem more relevant/interesting, because there are already so many hacks in various places - which is how $FG_SRC/Networking came into existence, i.e. with tons of C structs put on the wire via UDP ...
It is worth noting though that the existing multi-screen/multi-window implementation seems to be particularly prone to race conditions unfortunately: Howto:Activate_multi_core_and_multi_GPU_support
Examples
First, let's start up a fgfs slave instance, with the FDM being disabled so that a native FDM socket can drive the instance:
fgfs --airport=KSFO --runway=28R --aircraft=ufo --native-fdm=socket,in,60,,5500,udp --fdm=null
Next, start the master and tell it to send native FDM packets to the address specified:
fgfs --airport=KSFO --runway=28R --aircraft=ufo --native-fdm=socket,out,60,,5500,udp
And here's how to start up a master that's driven by a standalone JSBSim instance:
fgfs --airport=KSFO --runway=28R --aircraft=ufo --native-fdm=socket,out,60,,5500,udp --fdm=null --native-fdm=socket,in,60,,5600,udp
In the JSBSim aircraft configuration, you'll want to add this to the top-level fdm_config section:
<output name="localhost" type="FLIGHTGEAR" port="5600" rate="60" protocol="UDP"/>
And then, start up JSBSim with the --realtime parameter.
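For example (the script name below is only a placeholder - substitute whatever JSBSim script or aircraft you actually run):

```shell
# Run standalone JSBSim pinned to wall-clock time, so the UDP output
# arrives at the rate FlightGear expects (60 Hz, per the <output> line above)
JSBSim --realtime --script=scripts/c1723.xml
```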