Data-Driven Analysis of Multi-Modal Sensory Inputs for the Design of Smart Environments
We propose to design and develop novel data-driven mathematical tools and algorithms with the aim of realizing a pervasive ambient intelligent system, such as a smart house or smart car cabin. Such a system senses the environment and the people in it, and is capable of predicting and responding to people's needs. We envision a multitude of sensors and actuators embedded in the environment through which the system can infer the users’ intent and predict future actions, so as to provide highly relevant information and adapt the environment to the activities being performed. Key to our design is the idea of information integration, not only across multiple sensory modalities of the immediate setting (audio, video, 3D scans, temperature, humidity, etc.) but also over a very large amount of relevant data stored in the cloud that has been digested and indexed over time. Our measure of success is the specificity of the information the system provides and the relevance of the actions it performs using the available actuators. We aim for a system that deeply adapts to each specific environment and each specific user, exploiting big data to perform sharp inferences and achieve personalized responses.
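The integration of per-modality evidence into a single inference about user intent can be illustrated with a minimal late-fusion sketch. All names here (modalities, intents, weights) are illustrative assumptions, not part of the proposed system:

```python
def fuse_intent_scores(modality_scores, weights):
    """Late fusion: weighted average of per-modality intent scores.

    modality_scores: {modality: {intent: score}}, e.g. from audio, video,
    or 3D-scan classifiers; weights: {modality: reliability weight}.
    """
    fused = {}
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 1.0)
        for intent, score in scores.items():
            fused[intent] = fused.get(intent, 0.0) + w * score
    total = sum(weights.get(m, 1.0) for m in modality_scores)
    return {intent: s / total for intent, s in fused.items()}

def predict_intent(modality_scores, weights):
    """Return the highest-scoring intent after fusion."""
    fused = fuse_intent_scores(modality_scores, weights)
    return max(fused, key=fused.get)
```

For example, if an audio classifier and a video classifier both assign most of their mass to "cooking", the fused prediction is "cooking", with per-modality weights encoding how much each sensor is trusted in a given environment.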
Design new knowledge representations enabling pervasive ambient intelligent systems, such as a smart house or smart car cabin
- Representation of knowledge about humans and their environments in homes, offices and vehicles.
- Acquisition of this knowledge from sensor data and its organization/indexing for efficient retrieval.
- Transfer of acquired knowledge to new settings in a highly customizable and specific fashion, especially in cross-modality (e.g., from 3D scan to image) and cross-domain (e.g., from home to vehicle) cases.
- Scalability to big data collections recording multiple diverse environments and the human actions performed in them.
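The requirements above (indexed acquisition, efficient retrieval, cross-domain transfer) can be sketched with a toy knowledge store. The schema and method names are assumptions for illustration only, not the representation the project would design:

```python
from collections import defaultdict

class KnowledgeStore:
    """Toy store of facts indexed by (domain, modality) for retrieval."""

    def __init__(self):
        # Index on (domain, modality) so retrieval is a single lookup.
        self._by_key = defaultdict(list)

    def add(self, domain, modality, concept, attributes):
        """Record a fact acquired from sensor data, e.g. a 3D-scanned object."""
        self._by_key[(domain, modality)].append((concept, attributes))

    def query(self, domain, modality):
        """Retrieve all facts known for a given domain and modality."""
        return list(self._by_key[(domain, modality)])

    def transfer(self, src_domain, dst_domain):
        """Cross-domain transfer: reuse concepts learned in one setting
        (e.g. "home") in another (e.g. "vehicle")."""
        for (domain, modality), facts in list(self._by_key.items()):
            if domain == src_domain:
                self._by_key[(dst_domain, modality)].extend(facts)
```

A real system would replace the dictionary index with learned embeddings and the copy-based transfer with domain adaptation, but the interface captures the acquisition/retrieval/transfer cycle the bullets describe.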