Take a custom-built radio-controlled quadcopter, a heap of radio transmission gear, a Netduino, some sensors and some awesome software, and you'll end up with a long-range flying platform that puts you, the pilot, in a virtual 'cockpit' so immersive it will feel like you're in the air.
Real-time telemetry from onboard sensors (ultrasonic range, gyroscopes, accelerometers, GPS and barometric pressure) will be overlaid onto your HUD to enhance flight readiness, and will also feed flight stabilization and various autopilot modes (altitude hold, waypoint navigation, safe flight).
Follow along as I gradually build and extend the platform's capabilities.
Built entirely with Microsoft .NET technologies on the software side, and with open-source hardware.
A future phase of this project will connect the onboard video output to an Oculus Rift virtual reality headset, which my friend and colleague RobG is working on. (Computer vision and Oculus integration powered by RobG: http://hoverboard.io/robg.)
The UAV has various modes of operation, built around the concept of "autopilots" that the human pilot can select and switch between during flight. Not every autopilot needs access to every sensor, and with so many onboard sensors there would otherwise be a lot of 'noise' for each autopilot to filter out (messages it receives but has no interest in).
This is where the message bus architecture comes in. Using a lightweight, custom-built design, there are two message buses on board: avionics and control (both implement the same interface). The avionics bus receives messages from every sensor and pushes them to any interested subscriber. The control bus receives messages from the telemetry link and from autopilots; messages on this bus are primarily consumed by the flight actuator, a component whose job is to process throttle and attitude commands.
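As a rough illustration of the pattern (not the project's actual code), a type-filtered publish/subscribe bus can be sketched in desktop C# like this. The type names here are made up for the example, and note that the Netduino's .NET Micro Framework has limited generics support, so the real onboard implementation would likely use a simpler dispatch scheme:

```csharp
using System;
using System.Collections.Generic;

// Marker interface for anything that travels on a bus.
public interface IMessage { }

// Hypothetical sensor message; the real project's types may differ.
public class GyroReading : IMessage
{
    public double Roll, Pitch, Yaw;
}

public class MessageBus
{
    // One subscriber list per concrete message type.
    private readonly Dictionary<Type, List<Action<IMessage>>> _subscribers =
        new Dictionary<Type, List<Action<IMessage>>>();

    public void Subscribe<T>(Action<T> handler) where T : IMessage
    {
        if (!_subscribers.TryGetValue(typeof(T), out var handlers))
        {
            handlers = new List<Action<IMessage>>();
            _subscribers[typeof(T)] = handlers;
        }
        handlers.Add(m => handler((T)m));
    }

    public void Publish(IMessage message)
    {
        // Only subscribers registered for this exact message type are
        // invoked, so an autopilot never sees sensor 'noise' it didn't
        // ask for.
        if (_subscribers.TryGetValue(message.GetType(), out var handlers))
        {
            foreach (var handler in handlers)
                handler(message);
        }
    }
}
```

Because filtering happens at publish time, an altitude-hold autopilot can subscribe to barometric messages only, while a waypoint autopilot subscribes to GPS messages, each ignoring the rest of the traffic.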
For the ground station computer to operate, it needs to receive vehicle orientation data so it can overlay it onto a virtual Heads-Up Display (HUD). The UAV sends this information to the ground station over a relatively slow (but capable) 9600 bps serial communications link.
In addition, the same serial link acts as a transport for commands sent from the ground station computer to the aircraft.
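To give a feel for the bandwidth budget: 9600 bps is roughly 960 bytes per second, so a compact binary frame makes far better use of the link than a verbose text protocol. The layout below is a hypothetical sketch, not the project's actual wire format:

```csharp
using System;

// Hypothetical 8-byte orientation frame: start marker, roll/pitch/yaw
// packed as centidegree 16-bit integers, and an XOR checksum.
public static class TelemetryFrame
{
    public static byte[] Pack(double roll, double pitch, double yaw)
    {
        var frame = new byte[8];
        frame[0] = 0xA5; // start-of-frame marker
        WriteInt16(frame, 1, (short)(roll * 100));
        WriteInt16(frame, 3, (short)(pitch * 100));
        WriteInt16(frame, 5, (short)(yaw * 100));
        byte checksum = 0;
        for (int i = 0; i < 7; i++) checksum ^= frame[i];
        frame[7] = checksum;
        return frame;
    }

    private static void WriteInt16(byte[] buffer, int offset, short value)
    {
        // Little-endian, to match the typical .NET convention.
        buffer[offset] = (byte)(value & 0xFF);
        buffer[offset + 1] = (byte)((value >> 8) & 0xFF);
    }
}
```

At 8 bytes per frame, the link could in theory carry around 120 orientation updates per second, leaving headroom for the command traffic flowing the other way.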
The milestone involved hooking a Pololu servo control board up to a WCF service that receives data from a Windows Phone 8 app and sends interpreted signals to the board via a USB-to-serial interface.
The phone was used as a WCF client because it has a triple-axis inclinometer, standing in for the Oculus Rift, which hasn't arrived yet.
We have even tested using the phone app over the public internet to control the gimbal, and latency (prior to optimization) was tolerable.
Now, we have a working early-alpha phone control interface, the beginnings of a server API (for sending control data from the Oculus Rift), and an abstracted sensor layer that lets us swap control inputs.
The first step? Design and build a custom quadcopter using largely off-the-shelf components (for simplicity and cost), with a minimum payload capacity of 500 g.
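That payload figure drives motor selection. As a back-of-envelope check, assuming (hypothetically) a 1200 g airframe plus the 500 g payload, and the common 2:1 thrust-to-weight rule of thumb for stable quadcopter flight:

```csharp
// Rough motor sizing: all-up weight times the desired thrust-to-weight
// ratio, divided across the motors. The 1200 g airframe weight and the
// 2:1 ratio are illustrative assumptions, not measured figures.
public static class MotorSizing
{
    public static double RequiredThrustPerMotorGrams(
        double airframeGrams, double payloadGrams,
        int motorCount = 4, double thrustToWeight = 2.0)
    {
        double allUpWeightGrams = airframeGrams + payloadGrams;
        return allUpWeightGrams * thrustToWeight / motorCount;
    }
}

// RequiredThrustPerMotorGrams(1200, 500) -> 850 g of thrust per motor
```

Under those assumptions, each of the four motor/propeller combinations needs to produce about 850 g of thrust, which is well within reach of off-the-shelf hobby-grade motors.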