First Workshop on Scientific-Driven Deep Learning (SciDL)
Deep learning is playing a growing role in fluid dynamics, climate science, and many other scientific disciplines. Classically, deep learning has focused on model-agnostic approaches that ignore prior knowledge about the problem under consideration. However, limited data can severely challenge our ability to train complex and deep models for scientific applications. This workshop focuses on scientific-driven deep learning, exploring challenges and solutions for more robust and interpretable learning.
Keynote Speakers
- George Em Karniadakis (Brown University)
paper
video
- Title: DeepONet: Learning nonlinear operators based on the universal approximation theorem of operators
- Michael P. Brenner (Harvard University)
video
- Title: Machine Learning for Partial Differential Equations
Invited Speakers
- Frank Noé (FU Berlin)
paper
video
- Title: PauliNet: Deep neural network solution of the electronic Schrödinger Equation
- Alejandro Queiruga (Google, LLC)
video
- Title: Continuous-in-Depth Neural Networks
- Michael Muehlebach (UC Berkeley)
paper
video
- Title: Optimization with Momentum: Dynamical, Control-Theoretic, and Symplectic Perspectives
- Yasaman Bahri (Google Brain)
video
- Title: Learning Dynamics of Wide, Deep Neural Networks: Beyond the Limit of Infinite Width
- Elizabeth Qian (MIT)
paper
video
- Title: Lift & Learn: Analyzable, Generalizable Data-Driven Models for Nonlinear PDEs
- Lars Ruthotto (Emory University)
video
- Title: Deep Neural Networks Motivated by PDEs
- Tess Smidt (LBL)
video
- Title: Neural Networks with Euclidean Symmetry for Physical Sciences
- Omri Azencot (UCLA)
paper
video
- Title: Robust Prediction of High-Dimensional Dynamical Systems using Koopman Deep Networks
Organizers
- N. Benjamin Erichson (UC Berkeley)
- Michael W. Mahoney (UC Berkeley)
- Tess Smidt (LBL)
- Steven L. Brunton (University of Washington)
- J. Nathan Kutz (University of Washington)