Exploring Learning from Vision, Touch and Sound in Humans
CSAIL: Computer Science and Artificial Intelligence Lab
Feb 10 2020
Information from different sensory modalities is tightly coupled: the sight of strawberries evokes their sweet taste, and hearing a vroom-vroom sound instantly conjures images of a motorbike or a car. There is ample evidence that humans represent the world by integrating information from multiple sensory modalities. This project aims to tease apart the exact nature of these representations and to explore how predictable touch, audio, vision, and taste signals are from one another. Furthermore, we will investigate whether high-level human preferences can be inferred from low-level sensory cues, and how many independent factors of variation explain the representations of different sensory systems. You can find more information about the lab here: http://people.csail.mit.edu/pulkitag/
- Running experiments involving human subjects.
- Ability to code in Python.
- Experience with deep learning packages such as PyTorch is a big plus.

BCS students might be a better fit for this position, but EECS students are welcome to apply as well.