N. Benjamin Erichson

Group Leader



I am a Senior Research Scientist and Research Group Leader, heading the Robust Deep Learning group at the International Computer Science Institute (ICSI), an affiliated institute of UC Berkeley. I am also affiliated with Lawrence Berkeley National Laboratory. Prior to this role, I was a tenure-track Assistant Professor for Data-Driven Modeling and Artificial Intelligence in the Department of Mechanical Engineering and Materials Science at the University of Pittsburgh, from Sep. 2021 to Dec. 2022. Before joining Pitt, I was a postdoctoral researcher in the Department of Statistics at UC Berkeley, where I worked with Michael Mahoney; I was also part of the RISELab in the Department of Electrical Engineering and Computer Sciences (EECS) at UC Berkeley. Before that, I was a postdoc in the Department of Applied Mathematics at the University of Washington (UW), working with Nathan Kutz and Steven Brunton. I earned my PhD in Statistics from the University of St Andrews in Dec. 2017; my MSc in Applied Statistics is also from the University of St Andrews.

Email: erichson @ icsi dot berkeley dot edu

I am broadly interested in the intersection of deep learning, dynamical systems, and robustness: how can we build robust and intelligent dynamical systems that are computationally efficient and expressive? I am also interested in leveraging tools from randomized numerical linear algebra to build modern algorithms for data-intensive applications such as fluid flows and climate science. Projects in my group span the development of robust, dynamical-systems-inspired neural network architectures, efficient training strategies for extracting knowledge from limited data, and data-driven modeling of scientific data.

Full list of publications: Google Scholar.

I am looking for highly motivated and driven postdocs to join my group. Drop me an email if my research sparks your interest.

  • Current Students, Postdocs, and Staff:

    • Junyi Guo (Grad student at UC Berkeley).
    • Jialin Song (ML Engineer at ICSI).
  • Former students:

    • Olina Mukherjee (High School student researcher).
    • Ruixuan Tang (PhD student at Pitt).
    • Elahe Mehdizade (PhD student at Pitt).
    • Mohammad Ensaf (PhD student at Pitt).
    • Hanxiao Wang (Graduate researcher at Pitt).
    • Ziang Cao (Undergrad researcher at Pitt).
    • Vanessa Lin (Undergrad researcher, now at Google).
    • Francisco Utrera (Graduate researcher, now Senior ML Engineer at Erithmitic).
    • Evan Kravitz (Graduate researcher, now Software Engineer at Amazon).
    • Qixuan Wu (Undergrad researcher, now postgrad at LSE).


  • Area Chair: ICML (2022, 2023), NeurIPS (2022, 2023).


  • Fall 2022 - Linear Algebra for Machine Learning (ME2300).
  • Fall 2021 - Linear Algebra for Machine Learning (ME2300).
  • Spring 2020 - Linear Algebra for Data Science (Stat89a), joint with Michael Mahoney.


  • Deep Learning
  • Transfer Learning
  • Dimension Reduction
  • Sequence Modeling
  • Dynamical Systems


  • PhD in Statistics, 2017

    University of St Andrews

  • MSc in Applied Statistics, 2013

    University of St Andrews


Two papers accepted at ICLR 2022 (one as spotlight)

Noisy Feature Mixup (preprint) and Long Expressive Memory for Sequence Modeling (preprint).

Two papers accepted at NeurIPS 2021

Noisy Recurrent Neural Networks (preprint) and Compressing Deep ODE-Nets using Basis Function Expansions (preprint), which is joint work with Google Research.

Two papers accepted at ICLR 2021

Lipschitz Recurrent Neural Networks (preprint) and Adversarially-Trained Deep Nets Transfer Better (preprint).

Two papers accepted at ICML 2020

Forecasting sequential data using consistent Koopman autoencoders (preprint) and Error Estimation for Sketched SVD via the Bootstrap (preprint).

Recent & Upcoming Talks

Continuous Networks for Sequential Predictions
Noisy Recurrent Neural Networks
Stabilized Dynamic Autoencoders