Immersing Drivers in Virtual Reality to Test Shared Control Concepts
In ongoing collaboration with Toyota Research Institute, we are interested in driver assistance systems that improve safety on the road while keeping drivers in the loop. Working together, the automated system and driver can navigate safely and keep the vehicle progressing toward its destination in situations that either agent would struggle to handle alone. We envision shared control systems that continuously blend steering, throttle, and braking inputs from the system and driver. With these systems, drivers can enjoy the activity of driving while taking advantage of the safety benefits afforded by high-frequency sensing, computation, and actuation on board vehicles of the future.
Prototyping and testing these systems on real vehicles across a wide range of test cases poses several challenges. To safely run tests involving other road users, practitioners often use dummy pedestrians and bubble cars, which don’t always appear realistic to drivers and can be difficult to control. Our solution is to use Human&Vehicle-in-the-Loop (Hu&ViL), a test setup in which the driver dons a virtual reality (VR) headset while operating a real vehicle. In VR, the driver sees a first-person view of themselves operating a virtual vehicle whose motion mirrors that of the real vehicle, localized by a high-precision GPS signal. With a Hu&ViL platform, we can safely and repeatably create a nearly endless set of scenarios, including overtaking maneuvers and pedestrian crossings, on a wide variety of road networks.
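The core of this setup is mapping the real vehicle's GPS pose into the virtual scene every frame. The sketch below illustrates that idea under stated assumptions: the names (`GpsFix`, `VirtualPose`, `mirror_pose`) and the flat local east/north frame are hypothetical, not the actual platform API.

```python
# Illustrative sketch (hypothetical names, not the actual Hu&ViL API):
# drive the virtual vehicle's pose directly from the real vehicle's
# high-precision GPS fix, expressed relative to a chosen scene origin.
from dataclasses import dataclass

@dataclass
class GpsFix:
    east_m: float       # position east of a local origin, meters
    north_m: float      # position north of a local origin, meters
    heading_rad: float  # yaw angle, radians

@dataclass
class VirtualPose:
    x: float
    y: float
    yaw: float

def mirror_pose(fix: GpsFix, origin: GpsFix) -> VirtualPose:
    """Map a GPS fix into the virtual scene frame so the rendered
    vehicle moves exactly as the physical vehicle does."""
    return VirtualPose(
        x=fix.east_m - origin.east_m,
        y=fix.north_m - origin.north_m,
        yaw=fix.heading_rad - origin.heading_rad,
    )

# Example: a fix 3 m east and 4 m north of the scene origin.
origin = GpsFix(east_m=100.0, north_m=200.0, heading_rad=0.0)
fix = GpsFix(east_m=103.0, north_m=204.0, heading_rad=0.1)
pose = mirror_pose(fix, origin)
```

Running this mapping at the headset's frame rate, with a well-calibrated origin, is what keeps the driver's visual and vestibular cues consistent.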
Further, the conditions in which autonomous and shared control systems can be tested are typically limited by the road surface friction and the size of the test track. At Thunderhill Raceway Park in Willows, CA, where we typically test our vehicles, we don’t see any snow (or even much rain) and don’t have access to endless open space for testing. To expand these limits, we have designed the Hu&ViL platform around our four-wheel steer-by-wire (SBW) vehicle, called X1, which can modify the underlying handling characteristics of the vehicle. On this platform, we can realistically emulate the experience of driving on low-friction surfaces and of driving at much higher speeds than the test vehicle is actually traveling. The high-speed emulation approach leverages the virtual reality headset by showing the virtual vehicle traveling at 2-3x the speed of the test vehicle, expanding the usable testing area by that same 2-3x factor.
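One way to implement this speed scaling, sketched below under our own simplifying assumptions (planar kinematics, a single constant scale factor, hypothetical function and parameter names): integrate the virtual vehicle's position using the measured speed multiplied by the scale factor while keeping the measured heading. The virtual trajectory then becomes a spatially magnified copy of the real one, which is why the usable area grows by the same factor.

```python
# Illustrative sketch, not the actual platform code: advance the virtual
# vehicle's planar pose at `scale` times the physical vehicle's measured
# speed. Keeping the measured yaw unscaled means virtual positions are
# the real positions magnified by `scale`, so a small test track maps
# onto a 2-3x larger virtual road network.
import math

def advance_virtual_pose(x, y, yaw, speed_mps, yaw_rate_rps, dt, scale=2.5):
    """One integration step of the virtual pose.

    x, y          -- virtual position, meters
    yaw           -- heading, radians (taken from the real vehicle)
    speed_mps     -- measured speed of the physical test vehicle
    yaw_rate_rps  -- measured yaw rate of the physical test vehicle
    dt            -- timestep, seconds
    scale         -- speed multiplier (2-3x in our experiments)
    """
    v = scale * speed_mps
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += yaw_rate_rps * dt  # heading tracks the real vehicle
    return x, y, yaw

# Example: driving straight east at 10 m/s for one 0.1 s step with a
# 2x scale moves the virtual vehicle 2 m instead of 1 m.
x, y, yaw = advance_virtual_pose(0.0, 0.0, 0.0, 10.0, 0.0, 0.1, scale=2.0)
```

A side effect of this design choice is that curves feel gentler in VR than on the track (curvature shrinks by the scale factor), which is one reason accurate sensory feedback to the driver needs careful attention.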
In recent work presented at the 2022 Intelligent Vehicles Symposium, we demonstrated these different driving conditions through tests of a novel nonlinear model predictive control (NMPC)-based shared control system. Experiments included navigating a hairpin turn on a snowy day and overtaking a vehicle while avoiding oncoming traffic at highway speeds. Our work highlights the necessity of providing accurate sensory feedback to drivers in all test conditions, so that their interaction with the shared control system is as natural as possible. We also experimented with heads-up display graphics that visually communicate the assistance system’s behavior back to the driver, a necessary feature for drivers to understand the automated systems on their vehicle. The Hu&ViL platform is an invaluable research tool for testing shared control concepts, and we plan to continue developing driver assistance systems that work closely with the driver to keep them and other road users safe.