Biography
I am a Research Scientist (Career) at Lawrence Berkeley National Laboratory. I also lead the Robust Deep Learning Group at the International Computer Science Institute (ICSI), an affiliated institute of UC Berkeley. Prior to this role, I was an Assistant Professor (Tenure-Track) for Data-Driven Modeling and Artificial Intelligence in the Department of Mechanical Engineering and Materials Science at the University of Pittsburgh, from September 2021 to December 2022. Before joining Pitt, I was a postdoctoral researcher in the Department of Statistics at UC Berkeley, where I worked with Michael Mahoney, and I was also part of the RISELab in the Department of Electrical Engineering and Computer Sciences (EECS). Before that, I was a postdoc in the Department of Applied Mathematics at the University of Washington (UW), working with Nathan Kutz and Steven Brunton. I earned my PhD in Statistics from the University of St Andrews in December 2017; I also hold an MSc in Applied Statistics from St Andrews.
I am broadly interested in the intersection of deep learning, dynamical systems, and robustness: how can we build robust and intelligent dynamical systems that are computationally efficient and expressive? I am also interested in leveraging tools from randomized numerical linear algebra to build modern algorithms for data-intensive applications such as fluid flows and climate science. Projects in my group span the development of robust, dynamical systems-inspired neural network architectures; efficient training strategies for extracting knowledge from limited data; and data-driven modeling of scientific data.
News
- Two papers accepted at ICLR 2024 (one as a spotlight)
Robustifying State-space Models for Long Sequences via Approximate Diagonalization (preprint).
Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs (preprint).
- One paper accepted at AISTATS 2024
Boosting Model Robustness to Common Corruptions (preprint).
- One paper accepted at AISTATS 2023 (as an oral presentation)
Error Estimation for Random Fourier Features (preprint).
- Two papers accepted at ICLR 2022 (one as a spotlight)
Noisy Feature Mixup (preprint).
Long Expressive Memory for Sequence Modeling (preprint).
Current Group Members
Krti Tallam, joint Postdoc with Michael Mahoney
Kareem Hegazy, joint Postdoc with Michael Mahoney
Pu Ren, joint Postdoc with Michael Mahoney
Dongwei Lyu, Research Intern
Alumni
Junyi Guo (Graduate researcher 2022-24; now PhD student at University of Notre Dame).
Jialin Song (Graduate researcher 2021-24; now PhD student at Simon Fraser University).
Yixiao Kang (Graduate researcher 2023-24; now ML Engineer at Meta).
Daniel Barron (Graduate researcher 2023-24; now Software Engineer at Amazon).
Olina Mukherjee (High School student researcher, 2021-22; now undergraduate student at CMU).
Ziang Cao (Undergraduate research intern 2020-21; now graduate student at Stanford).
Francisco Utrera (Graduate researcher 2019-22; now Senior ML Engineer at Erithmitic).
Evan Kravitz (Graduate researcher 2019-20; now Software Engineer at Amazon).
Vanessa Lin (Undergraduate research intern 2018-19; now at Google).
Qixuan Wu (Undergraduate research intern 2018-19; now at Goldman Sachs).