I am an AI researcher working broadly in the domains of probabilistic generative models and meta-learning. I also explore applications of deep learning to novel problems that can benefit society. Previously, I completed my undergraduate studies in Computer Science and Engineering at IIT Ropar, where I was fortunate to be part of the LSAIML team headed by Dr. Narayanan C. K.

I am passionate about building robust and efficient systems that can reason probabilistically and make interpretable decisions. I enjoy connecting ideas across disciplines, and I have a strong academic background in engineering and machine learning. If you would like to collaborate, please feel free to reach out via email.

I will be joining the StARLing Lab headed by Dr. Sriraam Natarajan as a Ph.D. student, starting Fall '22. Excited!

## news

- May 16, 2022: Our work *VQ-Flows: Vector Quantized Local Normalizing Flows* has been accepted to UAI 2022.
- Our work *Machine Learning Methods Trained on Simple Models Can Anticipate Critical Transitions in Complex Systems* has been accepted to the Royal Society Open Science journal.
- Our work *Task Attended Meta-Learning for Few-Shot Learning* has been accepted to the NeurIPS 2021 Workshop on Meta-Learning.
- Our work *On Characterizing GAN Convergence Through Proximal Duality Gap* has been accepted to ICML 2021.
- Our work *On Duality Gap as a Measure for Monitoring GAN Training* has been accepted to IJCNN 2021.

## selected publications

1. On Characterizing GAN Convergence Through Proximal Duality Gap
In Proceedings of the 38th International Conference on Machine Learning (ICML) 2021
2. Task Attended Meta-Learning for Few-Shot Learning
In Fifth Workshop on Meta-Learning at the Conference on Neural Information Processing Systems (NeurIPS) 2021
3. Stress Testing of Meta-learning Approaches for Few-shot Learning
In AAAI Workshop on Meta-Learning and Co-Hosted Challenge 2021
4. On Duality Gap as a Measure for Monitoring GAN Training
In International Joint Conference on Neural Networks (IJCNN) 2021