UROP Openings

Transformers for graph representations

Term: Spring
Department: CSAIL: Computer Science and Artificial Intelligence Lab
Faculty Supervisor: Tommi Jaakkola
Faculty email: tommi@csail.mit.edu
Apply by: 02/15/2021
Contact: Octavian Ganea: oct@mit.edu

Project Description

The goal of this project is to fully replace graph neural networks (GNNs) with a modified transformer architecture that improves long-range graph interactions and graph representations, and that addresses the known problems of GNNs on molecules and graphs in general:

- oversmoothing
- vanishing/exploding gradients (similar issues as in RNNs)
- the squashing bottleneck
- GNNs work best with few layers and degrade if too many layers are added
- (related to the above) GNNs are very weak at capturing distant node interactions, which are captured only in the final pooling layer
- GNNs have difficulty differentiating graphs that are structurally almost identical but semantically very different
- GNNs' node embeddings bear little relation to distortion-based node embeddings (one would like to reconcile these directions of research)
- GNNs have been outperformed by simpler/faster/lighter label- and error-propagation models

I can give more details in a call if your profile matches the requirements for this project. The project involves running a large set of experiments for graph representations on many datasets using PyTorch Geometric.

Reference: https://arxiv.org/pdf/1905.12712.pdf
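For concreteness, here is a minimal sketch (in plain PyTorch, which PyTorch Geometric builds on) of the basic idea of treating a graph's nodes as transformer tokens: self-attention connects every pair of nodes directly, so distant node interactions are modeled without stacking many message-passing layers. This is only an illustration under assumptions of my own, not the project's actual architecture; the class name GraphTransformerSketch and all hyperparameters are hypothetical, and injecting graph structure (edges, positional/structural encodings) is precisely where the modified architecture would come in.

```python
import torch
import torch.nn as nn

class GraphTransformerSketch(nn.Module):
    """Hypothetical sketch: a standard transformer encoder over graph nodes.

    Each node is treated as a token, so attention links every node pair
    in one layer, sidestepping the many-hop message passing of GNNs.
    Note: this toy version ignores edge information entirely.
    """

    def __init__(self, in_dim, d_model=64, nhead=4, num_layers=3, num_classes=2):
        super().__init__()
        self.embed = nn.Linear(in_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.readout = nn.Linear(d_model, num_classes)

    def forward(self, x, padding_mask=None):
        # x: (batch, num_nodes, in_dim) dense node features.
        # padding_mask: (batch, num_nodes) bool, True where a node is padding.
        h = self.encoder(self.embed(x), src_key_padding_mask=padding_mask)
        # Mean-pool node embeddings into a graph-level representation.
        if padding_mask is not None:
            keep = (~padding_mask).unsqueeze(-1).float()
            h = (h * keep).sum(1) / keep.sum(1).clamp(min=1)
        else:
            h = h.mean(1)
        return self.readout(h)

# Usage with PyTorch Geometric batches, via torch_geometric.utils.to_dense_batch:
#   x_dense, node_mask = to_dense_batch(data.x, data.batch)  # node_mask: True = real node
#   logits = GraphTransformerSketch(data.num_node_features)(x_dense, ~node_mask)
```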

Prerequisites

I am looking for someone who:

- is excited to push forward research on graph representations in a large-scale project
- has good grades in machine learning lectures and, ideally, also in mathematics/statistics lectures
- has experience coding in Python and some experience coding ML models in PyTorch or another ML framework
- is at least somewhat familiar with the transformer architecture and graph neural networks, and has ideally coded such models before