UROP Openings


Exploring Learning from Vision, Touch and Sound in Humans


Term:

Spring

Department:

CSAIL: Computer Science and Artificial Intelligence Lab

Faculty Supervisor:

Pulkit Agrawal

Faculty email:

pulkitag@mit.edu

Apply by:

Feb 10, 2020

Contact:

pulkitag@mit.edu

Project Description

Information from different sensory modalities is tightly coupled. The sight of strawberries evokes their sweet taste; hearing a vroom-vroom sound instantly conjures images of a motorbike or a car! There is plenty of evidence that humans represent the world by integrating information from multiple sensory modalities. This project aims to tease apart the exact nature of these representations and explore how predictable touch, audio, vision, and taste signals are from one another. Furthermore, we will investigate whether high-level human preferences can be inferred from low-level sensory cues, and how many independent factors of variation explain the representations of different sensory systems. You can find more information about the lab here: http://people.csail.mit.edu/pulkitag/
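For intuition, here is a minimal sketch in PyTorch of the kind of cross-modal prediction question the project describes: can features of one modality predict features of another? Everything here (the feature dimensions, the random stand-in data, the small regressor) is a hypothetical illustration, not the lab's actual setup.

```python
# A minimal sketch (hypothetical data and architecture, not the lab's actual
# model): how well can one modality's features predict another's?
import torch
import torch.nn as nn

# Stand-in batches: 32 paired image and audio feature vectors. In a real
# experiment these would come from recorded vision/touch/audio stimuli.
image_feats = torch.randn(32, 512)
audio_feats = torch.randn(32, 128)

# A small regressor that predicts audio features from image features. If its
# loss drops well below a shuffled-pairs baseline, the modalities share
# predictable structure.
predictor = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 128),
)

optimizer = torch.optim.Adam(predictor.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    optimizer.zero_grad()
    pred = predictor(image_feats)
    loss = loss_fn(pred, audio_feats)
    loss.backward()
    optimizer.step()

print(f"final cross-modal prediction loss: {loss.item():.4f}")
```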

Pre-requisites

- Running experiments involving human subjects.
- Ability to code in Python.
- Experience with deep learning packages such as PyTorch is a big plus.

BCS students might be a better fit for this position, but EECS students are welcome to apply as well.