
Human Behaviors and Interaction for In-Car Experiences


Principal Investigators:

Maneesh Agrawala, Michael Bernstein, and James Landay

TRI Liaisons:

Tiffany Chen

Project Summary

From car dashboards that make the most relevant controls readily accessible, to car environments that help people transition mentally from home to work, we imagine a future in which cars understand a broad range of human behaviors and help shape them. Today’s car systems cannot do this, partly because human activities are too broad and multifaceted to be easily captured by designers and programmers. We propose to bring knowledge of human behavior into applications by mining human behaviors from rich textual, audio, and video sources. For example, by mining more than one billion words of fiction to predict thousands of user activities from surrounding objects, we can identify and predict common actions that people take with certain objects in specific contexts. These knowledge bases can power new activity-based applications, such as a phone that silences itself when it is unlikely that you will answer it, a vehicle that prompts you when you are taking an unusual or dangerous action, or a car that uses music, scent, and vibration to relax a driver after a stressful day. These models will also enable better simulation and data collection for a wide variety of human behaviors in the vehicle.
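The object-to-activity prediction idea above can be sketched in miniature: mine text for co-occurrences between context objects and activity verbs, then rank likely activities given the objects around the user. This is only an illustrative toy, not the project's actual pipeline; the tiny corpus, the hand-picked word lists, and the function names are all hypothetical stand-ins for mining a billion-word fiction corpus with a real NLP pipeline.

```python
from collections import Counter, defaultdict

# Hypothetical miniature corpus standing in for mined fiction sentences.
SENTENCES = [
    "she picked up the phone and answered it",
    "he ignored the phone while driving the car",
    "she started the car and turned on the radio",
    "he answered the phone at his desk",
    "she turned on the radio and relaxed",
]

# Hand-picked for illustration; a real system would tag parts of speech.
ACTIVITIES = {"picked", "answered", "ignored", "started", "turned", "relaxed", "driving"}
OBJECTS = {"phone", "car", "radio", "desk"}

def build_cooccurrence(sentences):
    """Count how often each activity verb co-occurs with each object."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        tokens = sentence.split()
        objects = [t for t in tokens if t in OBJECTS]
        activities = [t for t in tokens if t in ACTIVITIES]
        for obj in objects:
            for act in activities:
                counts[obj][act] += 1
    return counts

def predict_activities(counts, nearby_objects, k=3):
    """Rank likely activities given the objects currently near the user."""
    merged = Counter()
    for obj in nearby_objects:
        merged.update(counts.get(obj, Counter()))
    return [act for act, _ in merged.most_common(k)]

counts = build_cooccurrence(SENTENCES)
print(predict_activities(counts, ["phone"]))  # "answered" ranks first
```

An application like the self-silencing phone would query such a model with the user's current context objects and act only when the target activity (answering) ranks low.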

Research Goals

  • Mine the long tail of human behaviors from fiction, audio, and video sources on the Web
  • Model these behaviors so that we can predict or simulate human behaviors given nearby objects or previous behaviors
  • Produce APIs and applications that let driver experiences draw on this knowledge base for AI training and interaction
  • Adapt car displays for driver values, culture, and emotion