- PhD, MIT Media Lab 2018
- SM, MIT Media Lab 2013
Misha Sra is the John and Eileen Gerngross Assistant Professor and directs the Perceptual Engineering Lab in the Computer Science department at UCSB. Misha received her PhD from the MIT Media Lab in 2018. She has published at the most selective HCI and VR venues, including CHI, UIST, VRST, and DIS, where she has received multiple best paper awards and honorable mentions. From 2014 to 2015, she was a Robert Wood Johnson Foundation wellbeing research fellow at the Media Lab. In spring 2016, she received the Silver Award in the annual Edison Awards Global Competition, which honors excellence in human-centered design and innovation. MIT selected her as an EECS Rising Star in 2018. Her research has received extensive coverage from leading media outlets (e.g., Engadget, UploadVR, MIT Tech Review, and Forbes India) and has drawn the attention of industry research labs such as Samsung and Unity 3D.
I am interested in the following areas:
- Perception (e.g., novel sensing and stimulation devices, body-to-mind interfaces)
- Interaction Design (e.g., body-based interaction, haptic feedback, input techniques)
- Machine Learning (e.g., content generation, AI avatars, physiological signals)
- Applications (e.g., learning, creativity, multiplayer gaming)
Over the last 50 years, interaction with digital systems has evolved from punch cards to mouse and keyboard to touch and voice. Motivated by the belief that digital interactions must continue to evolve, the Perceptual Engineering Lab creates immersive systems and devices that bring the ease and expressiveness with which our bodies interact with the physical world into digital interaction. We work to understand and create technology that can improve human capabilities through cross-disciplinary research, publishing across science, engineering, and design.
My research explores the use of cognitive illusions to alter perception; physiological sensing to support natural interaction (CHI’18 Honorable Mention); embodiment to enable remote collaborative learning; tangible interactions for play (CHI’18 Best Paper Award); deep learning to generate virtual worlds; 3D reconstruction to automatically map real-world spaces to virtual worlds (VRST’16 Best Paper Award); haptic feedback devices to enhance immersion; and galvanic vestibular stimulation (GVS) to guide locomotion in VR and MR and to reduce motion sickness in VR, self-driving cars, and zero-g flights. I am currently collaborating with the two-time Academy Award-winning composer AR Rahman to explore the design of novel storytelling techniques.
I am hiring a few talented PhD students to work with me. I am looking for students who are highly motivated, excited about research, hard-working, and, at the very least, proficient in programming (e.g., Python, C#, or Java). I will seriously consider students with an electronics or mechanical engineering background who are interested in designing new types of hardware for VR/AR input and output, e.g., haptic feedback devices. Interest and experience in machine learning would be very welcome.