Collaborative Research: Towards Attack-Resilient Vision-Guided Unmanned Aerial Vehicles: An Observability Analysis Approach

Project Details

Description

This grant will fund research that improves the resilience of future mobile robotic technologies to malicious cyber-attacks for safety-critical applications in transportation, aerospace, and military systems, thereby promoting the progress of science, advancing prosperity, and securing the national defense. Unmanned aerial vehicles and other mobile robots rely on image streams from onboard cameras, as well as communication with other autonomous systems, to perform cooperative tasks such as navigation and collision avoidance. Vulnerabilities in networked vision-guided systems can be exploited using powerful machine-learning methods to launch adversarial attacks that compromise function, endanger human lives, and damage property. This research project advances a new foundational framework for detecting and responding to stealthy and malicious cyber-attacks that simultaneously target mission planning, control, perception, and sensor data. This framework will enhance the reliability of vision-guided autonomous systems such as self-driving cars and networked aerial vehicles, hardening them against cyber threats and malicious actions. Efforts aimed at achieving broader impact include integration of research activities in project-based graduate courses, as well as engagement of undergraduate students in faculty-mentored summer research projects that motivate their interest in STEM, graduate education, and careers in high-tech industries. Outreach programs will use mobile robotic demonstration platforms to inspire K-12 students to pursue engineering-related education paths.

This research aims to make fundamental and rigorous contributions to the use of control theory and machine-learning tools for characterizing and defending against stealthy attacks that exploit vulnerabilities in vision-guided, networked autonomous systems, with particular emphasis on high-dimensional visual data. It achieves this outcome by developing an online learning framework able to synthesize control policies from image frames in real time, removing the current restriction to offline implementations that exclude the system dynamics from the control loop. For multi-agent vision-guided dynamical systems, this research provides a novel and holistic framework that characterizes stealthy attacks based on the unobservable subspaces of both the physical system dynamics and the neural network model used for perception. Using stochastic optimization and simulation, the attack detection methodology extends existing model-based observer methods for linear time-invariant systems to handle stealthy attacks against a networked system whose autonomous agents are represented by general time-varying and nonlinear models. Evaluation of the theoretical framework relies on experiments with quadcopters in an indoor laboratory environment, as well as the use of advanced simulation software.
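The unobservable-subspace idea underlying the attack characterization can be illustrated for the classical linear time-invariant case. The sketch below (illustrative only, not the project's actual method; the system matrices are made up for the example) computes the null space of the observability matrix of a discrete-time LTI system and shows that a state perturbation along that subspace leaves the output sequence unchanged, which is exactly what makes such an attack stealthy to any output-based detector.

```python
import numpy as np

# Example LTI system x_{k+1} = A x_k, y_k = C x_k (matrices are illustrative)
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.5]])
C = np.array([[1.0, 0.0, 0.0]])  # only the first state chain is measured

n = A.shape[0]
# Observability matrix O = [C; CA; ...; CA^{n-1}]
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

# Unobservable subspace = null space of O, obtained from the SVD
_, s, Vt = np.linalg.svd(O)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T  # columns span the unobservable subspace

def outputs(x0, steps=10):
    """Simulate the autonomous output sequence y_k = C A^k x0."""
    ys, x = [], x0
    for _ in range(steps):
        ys.append(C @ x)
        x = A @ x
    return np.array(ys)

x0 = np.array([1.0, -2.0, 0.7])
attack = 5.0 * null_basis[:, 0]  # additive state attack in a stealthy direction
y_nominal = outputs(x0)
y_attacked = outputs(x0 + attack)

# The attacked output is numerically identical to the nominal one
print(np.max(np.abs(y_nominal - y_attacked)))
```

Here the third state is decoupled from the measured output, so the unobservable subspace is nontrivial and the residual seen by a Luenberger-style observer is zero along it; the project's contribution is extending this kind of analysis beyond LTI models to time-varying, nonlinear, and perception (neural-network) components.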

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Status: Active
Effective start/end date: 10/1/18 to 4/30/25

Funding

  • National Science Foundation: $309,811.00
