N. Benjamin Erichson

Group Leader

ICSI

Biography

I am a Senior Research Scientist and Research Group Leader at the International Computer Science Institute (ICSI), an Affiliated Institute of UC Berkeley, where I lead the Robust Deep Learning Group. I am also affiliated with Lawrence Berkeley National Laboratory. Prior to this role, I was an Assistant Professor (Tenure-Track) for Data-driven Modeling and Artificial Intelligence in the Department of Mechanical Engineering and Materials Science at the University of Pittsburgh from Sep. 2021 to Dec. 2022. Before joining Pitt, I was a postdoctoral researcher in the Department of Statistics at UC Berkeley, where I worked with Michael Mahoney, and I was also part of the RISELab in the Department of Electrical Engineering and Computer Sciences (EECS). Before that, I was a postdoc in the Department of Applied Mathematics at the University of Washington (UW), working with Nathan Kutz and Steven Brunton. I earned my PhD in Statistics from the University of St Andrews in Dec. 2017; my MSc in Applied Statistics is also from the University of St Andrews.

Email: erichson @ icsi dot berkeley dot edu

I am broadly interested in the intersection of deep learning, dynamical systems, and robustness: how can we build robust and intelligent dynamical systems that are computationally efficient and expressive? I am also interested in leveraging tools from randomized numerical linear algebra to build modern algorithms for data-intensive applications such as fluid flows and climate science. Projects in my group span robust, dynamical-systems-inspired neural network architectures; efficient training strategies for extracting knowledge from limited data; and data-driven modeling of scientific data.

Full list of publications: Google Scholar.

  • Current Postdocs, Students, and Staff:

    • Kareem Hegazy (Postdoc, joint with Michael Mahoney)
    • Krti Tallam (Postdoc, joint with Michael Mahoney)
    • Dongwei Lyu (Undergraduate research intern)
    • Pu Ren (Visiting Postdoc)
    • Illan Naiman (Visiting researcher from BGU)
    • Junyi Guo (ML Engineer)
    • Jialin Song (ML Engineer)
    • Kasey Lee (MEng student, UC Berkeley)
    • Yixiao Kang (MEng student, UC Berkeley)
    • Garry Gao (MEng student, UC Berkeley)
    • Daniel Barron (MEng student, UC Berkeley)
  • Former group members:

    • Olina Mukherjee (High school student researcher)
    • Ziang Cao (Undergraduate research intern, now MSc student at Stanford)
    • Vanessa Lin (Undergraduate research intern, now at Google)
    • Francisco Utrera (Graduate researcher, now Senior ML Engineer at Erithmitic)
    • Evan Kravitz (Graduate researcher, now Software Engineer at Amazon)
    • Qixuan Wu (Undergraduate research intern, now at Goldman Sachs)

I am looking for highly motivated and driven postdocs to join my group. Drop me an email if my research sparks your interest.

Service:

  • Area Chair: ICML (2022, 2023), NeurIPS (2022, 2023), ICLR (2023).

Teaching:

  • Fall 2022 - Linear Algebra for Machine Learning (ME2300).
  • Fall 2021 - Linear Algebra for Machine Learning (ME2300).
  • Spring 2020 - Linear Algebra for Data Science (Stat89a), joint with Michael Mahoney.

Interests

  • Deep Learning
  • Transfer Learning
  • Dimension Reduction
  • Sequence Modeling
  • Dynamical Systems

Education

  • PhD in Statistics, 2017

    University of St Andrews

  • MSc in Applied Statistics, 2013

    University of St Andrews

News

Two papers accepted at ICLR 2022 (one as a spotlight)

Noisy Feature Mixup (preprint) and Long Expressive Memory for Sequence Modeling (preprint).

Two papers accepted at NeurIPS 2021

Noisy Recurrent Neural Networks (preprint) and Compressing Deep ODE-Nets using Basis Function Expansions (preprint), which is joint work with Google Research.

Two papers accepted at ICLR 2021

Lipschitz Recurrent Neural Networks (preprint) and Adversarially-Trained Deep Nets Transfer Better (preprint).

Two papers accepted at ICML 2020

Forecasting sequential data using consistent Koopman autoencoders (preprint) and Error Estimation for Sketched SVD via the Bootstrap (preprint).

Recent & Upcoming Talks

  • Continuous Networks for Sequential Predictions
  • Noisy Recurrent Neural Networks
  • Stabilized Dynamic Autoencoders