
Microsoft Research Webinar Series


Available on-demand. Register now.

Building multimodal, integrative AI systems with Platform for Situated Intelligence

In the last decade, we’ve seen fast-paced progress in many individual AI areas, such as computer vision, speech, and machine translation. However, anyone who’s tried bringing multiple AI technologies together in end-to-end systems designed for real-time, real-world interactions knows that constructing such systems remains a demanding task. Apart from research challenges that may arise in the context of a given application, the construction of multimodal, integrative AI systems is often daunting from an engineering perspective.

In this webinar, Dan Bohus, a Senior Principal Researcher in the Perception and Interaction Group at Microsoft Research, will introduce Platform for Situated Intelligence, an open-source framework that aims to address these challenges and accelerate and simplify the development, study, debugging, and maintenance of multimodal, integrative AI systems. The framework provides infrastructure for working with temporal streams of data; an efficient model for parallel, coordinated computation; rich tools for multimodal data visualization, annotation, and processing; and an open ecosystem of components that encapsulate various AI technologies. Bohus will break down the capabilities of Platform for Situated Intelligence and demonstrate how to write a very simple application using the framework, as well as how to use the available visualization tools.

Together, you’ll explore:

  • Challenges with building multimodal, integrative AI systems
  • A model for parallel, coordinated computation over temporal streams of data
  • Tools for data visualization and debugging
  • The available open ecosystem of components

Dan Bohus joined Microsoft Research in 2007 after graduating from Carnegie Mellon University. His work focuses on the study and development of computational models for multimodal, physically situated interaction. The long-term question driving his research agenda is how we can create systems that reason more deeply about their surroundings and seamlessly participate in interactions and collaborations with people in the physical world. His work spans areas such as embodied conversational agents, human-robot interaction, intelligent spaces, and augmented and virtual reality.