The Boeing/Insitu ScanEagle UAV platform is extensively used in theater, having logged more than 500,000 combat flight hours. NPS CAVR maintains and operates seven of these vehicles, which are often employed during USSOCOM-NPS TNT experimentation. However, the platform's utility for autonomy research is limited by its semi-autonomous operation: a dedicated pilot commands the vehicle remotely, with the option of a few basic autonomous behaviors (such as loitering). To be useful in advanced applications, the platform must be able to adapt its behavior as the mission evolves, an adaptation accomplished through onboard sensing and decision-making.
This project extends the autonomy capability of the ScanEagle UAV by developing and implementing a secondary-autopilot architecture, or "backseat driver," that allows the stock autopilot to be tasked from an onboard computer while leveraging the autopilot's proven capabilities for execution. The secondary autopilot consists of hardware and software components. On the hardware side, an onboard computer was integrated as a ScanEagle payload and connected to the stock autopilot. On the software side, the MATLAB Simulink and Stateflow environments were used for algorithm development. These algorithms execute in either a real-time (RT) or non-real-time (NRT) environment, using OROCOS and ROS, respectively. Two interfaces with the stock autopilot were implemented: high-level waypoint commands and low-level angular-rate commands. A mission management module was developed and implemented in the NRT environment, and mission execution has been demonstrated both in simulation and in flight. Additionally, a real-time path-generation capability and a path-following controller were integrated into the secondary-autopilot architecture. Limitations of the stock autopilot have been identified, most notably limited control over data access and update rates (e.g., for UAV state data); as a result, testing of the low-level interface will require additional effort.
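The high-level (waypoint) tasking loop described above can be sketched as a simple mission manager that advances through a waypoint list and issues commands to the stock autopilot each cycle. This is a minimal illustrative sketch, not the project's actual implementation: the class and command names (`BackseatDriver`, `GOTO`, `LOITER`) and the capture-radius logic are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float

class BackseatDriver:
    """Hypothetical NRT mission manager: tasks the stock autopilot with
    high-level waypoint commands and advances when each waypoint is reached."""

    def __init__(self, mission: List[Waypoint], capture_radius_m: float = 50.0):
        self.mission = mission
        self.capture_radius_m = capture_radius_m
        self.index = 0

    def active_waypoint(self) -> Optional[Waypooint] if False else Optional[Waypoint]:
        return self.mission[self.index] if self.index < len(self.mission) else None

    def update(self, distance_to_wp_m: float) -> Tuple[str, Optional[Waypoint]]:
        """Called once per cycle with the distance to the active waypoint;
        returns the command to forward to the stock autopilot."""
        wp = self.active_waypoint()
        if wp is None:
            return ("LOITER", None)  # mission complete: fall back to a stock behavior
        if distance_to_wp_m < self.capture_radius_m:
            self.index += 1  # waypoint captured; advance the mission
            wp = self.active_waypoint()
            if wp is None:
                return ("LOITER", None)
        return ("GOTO", wp)
```

In the actual architecture, the `update` call would be driven by state data received from the stock autopilot, and the returned command would be translated into the autopilot's native waypoint interface.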
- Waypoint segment and orbit execution
- Non-expert user interface (GUI updating of mission)
- Low communications overhead for the secondary autopilot (due to onboard decision-making)
- Multi-vehicle software-in-the-loop simulation capability
- Real-time path planning and execution (note: the low-level interface has not been tested due to data-rate limits)
- Flight demonstrations
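The path-following capability listed above can be illustrated with a common guidance law: command a heading that blends the path's direction with a proportional correction for cross-track error. This is a generic sketch under a flat-earth, local-plane assumption; the function name, gain `k`, and coordinate convention are illustrative and not taken from the project's controller.

```python
import math

def path_follow_heading(p, a, b, k=0.05):
    """Illustrative path-following law for the segment from a to b.

    p, a, b are (x, y) positions in a local plane. Returns a commanded
    heading (radians): the path heading plus an arctangent correction
    proportional to the signed cross-track error, so the correction is
    naturally capped at +/- pi/2.
    """
    ax, ay = a
    bx, by = b
    px, py = p
    tx, ty = bx - ax, by - ay          # path tangent vector
    norm = math.hypot(tx, ty)
    path_heading = math.atan2(ty, tx)
    # signed cross-track error: positive when p is left of the path
    e = (tx * (py - ay) - ty * (px - ax)) / norm
    return path_heading - math.atan(k * e)  # steer back toward the path
```

With the low-level interface, a heading command like this would be converted into angular-rate commands for the stock autopilot; with the high-level interface, it could instead drive short look-ahead waypoints.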
- Integration with a standard GUI
- Command over low-bandwidth, beyond-line-of-sight communications (e.g., satellite)
- Possible replacement of the stock autopilot with a better-supported commercial unit that allows cross-platform compatibility