Ajay Jain

Doctoral student (Ph.D.), UC Berkeley

NSF Graduate Research Fellow

NVIDIA Research

Previous: Google Brain, Uber ATG, Meta AI, MIT


I'm a PhD student at UC Berkeley, where I'm affiliated with Berkeley Artificial Intelligence Research and a recipient of the NSF Graduate Research Fellowship. My research focuses on unsupervised machine learning: how can machines automatically recognize, reconstruct, and generate the structure within raw data? I'm particularly excited by creative applications of AI, such as supercharging design and editing with generative models.

Previously, I was a visiting researcher at Google Brain and Uber ATG, and worked on ML at Meta (Facebook) AI, Kensho, and Juniper Networks. I graduated from MIT with an S.B. in Computer Science, where I was a director of the Machine Intelligence Community and organized the HackMIT hackathon. I also enjoy the outdoors, board games, and cooking.

I've been fortunate to be advised by really great people along the way, including Pieter Abbeel, Ben Poole, Jon Barron, Joseph Gonzalez, Ion Stoica, Deepak Pathak, Saman Amarasinghe, and Raquel Urtasun.

Email: ajayj at berkeley dot edu


New! July 2022: Dream Fields wins the Best Poster award at AI4CC
New! May 2022: AdaCat accepted to UAI 2022
March 2022: Dream Fields accepted to CVPR 2022
Aug 2021: ContraCode accepted to EMNLP 2021
July 2021: DietNeRF accepted to ICCV 2021

Publications * indicates equal contribution

Optimizing a flexible 1D AdaCat distribution.

Adaptive Categorical Discretization for Autoregressive Models

Colin Li, Ajay Jain, Pieter Abbeel
UAI 2022 Conference on Uncertainty in Artificial Intelligence

AdaCat learns expressive autoregressive models by capturing fine-grained variation in continuous distributions with discrete density estimators.
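The core idea, that both the bin widths and the bin masses of a K-way categorical are learned so resolution concentrates where the continuous density needs it, can be sketched in a few lines. This is a minimal NumPy illustration of an adaptively discretized 1D density, not the paper's implementation; the parameterization below (softmax over widths and masses) is an assumption for the sketch.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def adacat_density(x, w_logits, h_logits):
    """Piecewise-constant density on [0, 1] with learned bin widths and masses.

    w_logits, h_logits: unnormalized parameters for the K bin widths and K bin
    masses. Because widths are learned, narrow bins can capture fine-grained
    variation while wide bins cover flat regions.
    """
    widths = softmax(w_logits)           # bin widths, sum to 1
    masses = softmax(h_logits)           # bin probabilities, sum to 1
    edges = np.concatenate([[0.0], np.cumsum(widths)])
    k = np.searchsorted(edges, x, side="right") - 1
    k = np.clip(k, 0, len(widths) - 1)   # locate each x in its bin
    return masses[k] / widths[k]         # density = mass / width inside a bin

# The resulting density integrates to one over [0, 1]:
rng = np.random.default_rng(0)
w, h = rng.normal(size=8), rng.normal(size=8)
xs = np.linspace(0.0, 1.0, 100_001)
total_mass = np.trapz(adacat_density(xs, w, h), xs)
```

In a full autoregressive model, `w_logits` and `h_logits` would be produced per dimension by a network conditioned on previous dimensions.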

Zero-Shot Text-Guided Object Generation with Dream Fields

Ajay Jain, Ben Mildenhall, Jon Barron, Pieter Abbeel, Ben Poole
CVPR 2022 Conference on Computer Vision and Pattern Recognition

We combine neural rendering with multi-modal image and text representations to synthesize diverse 3D objects solely from natural language descriptions.

Conceptual diagram illustrating Contrastive Code Representation Learning.

Contrastive Code Representation Learning

Paras Jain*, Ajay Jain*, Tianjun Zhang, Pieter Abbeel, Joseph E. Gonzalez, Ion Stoica
EMNLP 2021 Empirical Methods in Natural Language Processing

We learn representations of software functionality for automated software engineering tasks like type inference, clone detection, and summarization, improving the robustness of ML4Code.

Overview of DietNeRF's semantic consistency loss.

Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis

Ajay Jain, Matthew Tancik, Pieter Abbeel
ICCV 2021 International Conference on Computer Vision

CLIP + NeRF: Given only a few images of an object or scene, we reconstruct its 3D structure & render novel views using prior knowledge contained in large image encoders.

Sampling four 256x256 px photos of faces from a Denoising Diffusion Probabilistic Model.

Denoising Diffusion Probabilistic Models

Jonathan Ho, Ajay Jain, Pieter Abbeel
NeurIPS 2020 34th Conference on Neural Information Processing Systems

High-quality likelihood-based image generation; connects diffusion models to denoising score matching and Langevin dynamics; enables compression, reconstruction, and interpolation.
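The diffusion forward process admits a closed form, x_t = sqrt(ᾱ_t) x_0 + sqrt(1 − ᾱ_t) ε, which is what makes training by noise prediction tractable. Below is a minimal NumPy sketch of that forward noising step under a linear β schedule; the schedule endpoints are illustrative, and this is a toy sketch rather than the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear variance schedule beta_1..beta_T (endpoint values are illustrative).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_bar = np.cumprod(1.0 - betas)   # cumulative signal-retention factor

def q_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) in one shot:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps."""
    eps = rng.normal(size=x0.shape)
    xt = np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * eps
    return xt, eps

x0 = rng.normal(size=(4, 8))           # stand-in for a batch of image data
xt, eps = q_sample(x0, t=T - 1, rng=rng)
```

At t = T − 1 the signal coefficient is nearly zero, so x_t is approximately pure Gaussian noise; a denoising network trained to predict ε from (x_t, t) can then be used to reverse the process.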

Cleaning up errors in the sparse graphical memory data structure.

Sparse Graphical Memory for Robust Planning

Scott Emmons*, Ajay Jain*, Michael Laskin*, Thanard Kurutach, Pieter Abbeel, Deepak Pathak
NeurIPS 2020 34th Conference on Neural Information Processing Systems

Provably robust and efficient long-horizon monocular navigation combining sparse graph planning and RL. We propose two-way consistency to identify landmark memories and build a topological map.

Locally Masked Convolution applies customizable masks to patches of an image

Locally Masked Convolution for Autoregressive Models

Ajay Jain, Pieter Abbeel, Deepak Pathak
UAI 2020 36th Conference on Uncertainty in Artificial Intelligence

Efficient operation allows PixelCNNs to generate images in arbitrary orders.
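The mechanism is a convolution whose kernel mask can differ at every spatial location, so one weight tensor can realize many autoregressive generation orders. Here is a naive NumPy sketch of that per-position masking, written as an explicit loop; the paper's efficient vectorized implementation is not reproduced here.

```python
import numpy as np

def locally_masked_conv2d(x, weight, masks):
    """Naive 2D convolution with a separate kernel mask at each location.

    x:      (H, W) input image
    weight: (k, k) shared convolution kernel
    masks:  (H, W, k, k) binary mask per output location -- varying the mask
            over positions is what lets the model respect different
            pixel-generation orders with a single weight tensor.
    """
    k = weight.shape[0]
    pad = k // 2
    xp = np.pad(x, pad)
    H, W = x.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            patch = xp[i:i + k, j:j + k]
            out[i, j] = np.sum(patch * (weight * masks[i, j]))
    return out
```

With all-ones masks this reduces to an ordinary (unmasked) convolution; zeroing mask entries hides "future" pixels under whichever ordering is chosen.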

Checkmate: Breaking the Memory Wall with Optimal Tensor Rematerialization

Paras Jain*, Ajay Jain*, Aniruddha Nrusimha, Amir Gholami, Pieter Abbeel, Kurt Keutzer, Ion Stoica, Joseph E. Gonzalez
MLSys 2020 3rd Conference on Machine Learning and Systems

Use up to 5x less memory when training DNNs by recomputing activations
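The memory saving comes from storing only a subset of activations during the forward pass and recomputing the rest when they are needed. Checkmate solves for an optimal recompute schedule; the toy sketch below uses a fixed-stride checkpointing policy purely to illustrate the store-vs-recompute trade-off, and is not the paper's method.

```python
# Toy rematerialization: keep only every `stride`-th activation during the
# forward pass, and recompute any other activation on demand from the
# nearest stored checkpoint.

def forward_with_checkpoints(fns, x, stride):
    saved = {0: x}                       # activation index -> stored value
    for i, f in enumerate(fns):
        x = f(x)
        if (i + 1) % stride == 0:
            saved[i + 1] = x             # checkpoint this activation
    return x, saved

def rematerialize(fns, saved, i):
    """Recompute activation i (output of the first i layers) by replaying
    the forward pass from the nearest checkpoint at or before i."""
    j = max(k for k in saved if k <= i)
    x = saved[j]
    for f in fns[j:i]:
        x = f(x)
    return x

# A toy 8-layer "network" of scalar functions.
fns = [lambda v, c=c: v * 2 + c for c in range(8)]
out, saved = forward_with_checkpoints(fns, 1.0, stride=4)
```

During a real backward pass, each rematerialized activation would be used for a gradient computation and then freed, trading extra compute for peak memory.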

DRF-Net 10 second forecast

Discrete Residual Flow for Probabilistic Pedestrian Behavior Prediction

Ajay Jain*, Sergio Casas Romero*, Renjie Liao*, Yuwen Xiong*, Song Feng, Sean Segal, Raquel Urtasun
CoRL 2019 3rd Conference on Robot Learning, Spotlight talk

Multimodal, long-range behavior forecasts by predicting state marginals

Revec: Program Rejuvenation through Revectorization

Charith Mendis*, Ajay Jain*, Paras Jain, Saman Amarasinghe
CC 2019 28th International Conference on Compiler Construction

Achieve performance portability for hand-vectorized programs, with up to 1.88x speedup

Autonomy for Surface Ship Interception

C. Mirabito, D.N. Subramani, T. Lolla, P.J. Haley Jr., A. Jain, P.F.J. Lermusiaux, C. Li, D. Yue, Y. Liu, F. Hover, N. Pulsone, J. Edwards, K. Railey, and G. Shaw
OCEANS 2017 60th OCEANS Conference, MTS/IEEE Aberdeen

Time-optimal path planning for underwater robots

Short papers

Learning Automatic Schedulers with Projective Reparameterization

Ajay Jain, Saman Amarasinghe
ISCA 2019 46th International Symposium on Computer Architecture
Workshop on Machine Learning for Systems, Jun 2019, Talk

Supervised learning of schedulers, with correctness constraints

Using effective dimension to analyze feature transformations in deep neural networks

Kavya Ravichandran, Ajay Jain, Alexander Rakhlin
ICML 2019 36th International Conference on Machine Learning
Workshop on Identifying and Understanding Deep Learning Phenomena, Jun 2019

The Case for GPU Multitenancy

Paras Jain, Xiangxi Mo, Ajay Jain, Alexey Tumanov, Joseph E. Gonzalez, Ion Stoica
arXiv 2019 arXiv:1910.02653, Oct 2019

Dynamic Space-Time scheduling for GPU inference

Paras Jain, Xiangxi Mo, Ajay Jain, Harikaran Subbaraj, Rehan Sohail Durrani, Alexey Tumanov, Joseph E. Gonzalez, and Ion Stoica
NeurIPS 2018 32nd Annual Conference on Neural Information Processing Systems
Workshop on Systems for Machine Learning, Dec 2018

Demonstrate 2.5x-4.9x speedups for deep learning inference workloads via GPU multitenancy

Invited talks

Service and teaching

Machine Learning Software

More Software