Autonomy & Control Systems Research
About
We develop autonomy and control methods for autonomous vehicles operating in complex, dynamic, and safety-critical environments. Our research focuses on motion planning, trajectory tracking, decision-making, and adaptive control under uncertainty, integrating model-based and learning-enabled approaches with real-time sensor feedback. Emphasis is placed on safe, robust, and scalable deployment in real-world autonomous driving systems.
Projects
Consumer Acceptance of Automated Vehicles
Understanding consumer acceptance is critical to the safe and effective deployment of automated vehicles, as public trust, perceived safety, and real-world use behavior directly shape how and when these systems are adopted. This project investigates consumer acceptance of automated vehicles across SAE Levels 2–4 through a multi-scale research framework that combines national survey analysis with real-world experimentation in Chattanooga’s smart city environment. While prior studies largely rely on stated preferences derived from hypothetical scenarios, this research explicitly compares what consumers say they would do with what they actually do when exposed to automated driving systems.
Neuro-Adaptive Control for Real-World Autonomous Vehicles
Our autonomous vehicle (AV) research develops control algorithms that enhance vehicle stability, safety, and ride quality under real-world driving uncertainties. These efforts focus on neuro-adaptive control strategies operating across the perception-localization-control pipeline to compensate for unmodeled dynamics and changing road conditions. This work includes residual steering assistance that learns road friction, tire effects, and handling limits for smooth lane-centering; neuro-adaptive speed control that learns grade, aerodynamic drag, and rolling resistance to improve acceleration, braking smoothness, and energy efficiency; and uncertainty-aware adaptive control methods that handle adversarial driving conditions such as sudden drag forces, wind disturbances, friction changes, and unexpected vehicle interactions. By learning residual vehicle dynamics online and adjusting steering, throttle, and braking commands in real time, these controllers achieve robustness and performance beyond conventional model-based control.
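The residual-learning idea above can be sketched in a few lines: a nominal feedback law tracks a speed setpoint while an online learner fits the unmodeled longitudinal disturbances (grade, rolling resistance, aerodynamic drag) and feeds the estimate forward. This is a minimal illustrative sketch, not the group's actual controller: the proportional gain, the normalized-LMS adaptation law, the (1, v, v^2) feature basis, and all plant numbers are assumptions chosen for clarity.

```python
class NeuroAdaptiveSpeedController:
    """Toy residual-adaptive speed controller (illustrative sketch only)."""

    def __init__(self, kp=2.0, lr=0.5):
        self.kp = kp                  # nominal proportional gain (assumed)
        self.lr = lr                  # adaptation rate for normalized LMS
        self.w = [0.0, 0.0, 0.0]      # residual weights over (1, v, v^2)

    def _features(self, v):
        return [1.0, v, v * v]

    def residual(self, v):
        """Learned disturbance estimate, in acceleration units (m/s^2)."""
        return sum(wi * fi for wi, fi in zip(self.w, self._features(v)))

    def command(self, v, v_ref):
        """Acceleration command: nominal feedback plus residual feedforward."""
        return self.kp * (v_ref - v) + self.residual(v)

    def adapt(self, v, a_cmd, a_meas):
        """Normalized-LMS step toward the observed disturbance."""
        observed = a_cmd - a_meas            # acceleration the plant "lost"
        err = observed - self.residual(v)
        phi = self._features(v)
        norm = 1.0 + sum(fi * fi for fi in phi)
        self.w = [wi + self.lr * err * fi / norm
                  for wi, fi in zip(self.w, phi)]


def simulate(steps=2000, dt=0.02, v_ref=15.0):
    """Closed loop against a toy plant with drag, rolling resistance, grade."""
    ctrl = NeuroAdaptiveSpeedController()
    v, m = 0.0, 1200.0                       # speed (m/s), mass (kg)
    drag, roll, grade = 0.4, 120.0, 300.0    # unknown to the controller
    for _ in range(steps):
        a_cmd = ctrl.command(v, v_ref)
        force = m * a_cmd
        a_meas = (force - drag * v * v - roll - grade) / m
        ctrl.adapt(v, a_cmd, a_meas)
        v += a_meas * dt
    return v, ctrl
```

Because the disturbance here is exactly linear in the chosen features, the learned residual converges to the true disturbance at the operating speed and the tracking error vanishes; a neural network in place of the linear learner follows the same adapt-and-feedforward pattern.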
Open-World Hazard Detection and Captioning for Autonomous Driving
Autonomous driving systems must operate reliably in dynamic environments where uncommon or unanticipated hazards frequently arise. Traditional perception models often rely on closed-set recognition, restricting them to predefined object categories and limiting their ability to address novel or rare obstacles. To tackle these open-world challenges, we developed an integrated pipeline that merges image enhancement, optical flow, depth estimation, semantic segmentation, and vision-language models.
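The structure of such a pipeline can be illustrated with a stage-by-stage skeleton. Everything below is a stub standing in for a real model (image enhancement, optical flow, depth estimation, closed-set segmentation, vision-language captioning); the stage names, the toy frame/region format, and the KNOWN_CLASSES vocabulary are assumptions made for this sketch, not the project's actual interfaces.

```python
# Closed-set vocabulary of the segmentation stub; anything outside it
# is treated as an open-world "unknown".
KNOWN_CLASSES = {"car", "pedestrian", "cyclist", "traffic_sign"}


def enhance(ctx):
    # Stand-in for low-light / adverse-weather image enhancement.
    ctx["frame"] = dict(ctx["frame"], enhanced=True)
    return ctx


def estimate_motion_and_depth(ctx):
    # Stand-in for optical flow + depth estimation: in this toy format
    # each region already carries 'speed' (px/frame) and 'depth' (m).
    ctx["regions"] = ctx["frame"]["regions"]
    return ctx


def segment(ctx):
    # Closed-set segmentation stub: out-of-vocabulary labels become
    # 'unknown' -- the open-world case the pipeline exists to handle.
    for r in ctx["regions"]:
        r["category"] = r["label"] if r["label"] in KNOWN_CLASSES else "unknown"
    return ctx


def flag_hazards(ctx):
    # Unknown objects that are close or moving are promoted to hazards.
    ctx["hazards"] = [r for r in ctx["regions"]
                      if r["category"] == "unknown"
                      and (r["depth"] < 30.0 or r["speed"] > 1.0)]
    return ctx


def caption(ctx):
    # Stand-in for a vision-language model describing each hazard.
    for r in ctx["hazards"]:
        r["caption"] = f"unrecognized object at {r['depth']:.0f} m"
    return ctx


def run_pipeline(frame):
    ctx = {"frame": frame}
    for stage in (enhance, estimate_motion_and_depth, segment,
                  flag_hazards, caption):
        ctx = stage(ctx)
    return ctx
```

Running this on a frame containing a known "car" and an out-of-vocabulary "mattress" region flags only the mattress as a captioned hazard; the design point is that each stage exchanges a shared context dict, so any stub can be swapped for a learned model without changing the pipeline.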