Generalized Physics-Informed Learning through Language-Wide Differentiable Programming (Video)
March 31, 2020 in Differential Equations, Mathematics, Science, Scientific ML | Tags: physics-informed machine learning, pinn, scientific machine learning, scientific ml, sciml | Author: Christopher Rackauckas
Chris Rackauckas (MIT), “Generalized Physics-Informed Learning through Language-Wide Differentiable Programming”
Scientific computing is increasingly incorporating advancements in machine learning to allow for data-driven physics-informed modeling approaches. However, re-targeting existing scientific computing workloads to machine learning frameworks is both costly and limiting, as scientific simulations tend to use the full feature set of a general-purpose programming language. In this manuscript we develop an infrastructure for incorporating deep learning into existing scientific computing code through Differentiable Programming (∂P). We describe a ∂P system that is able to take gradients of full Julia programs, making automatic differentiation a first-class language feature and making compatibility with deep learning pervasive. Our system utilizes the one-language nature of Julia package development to augment the existing package ecosystem with deep learning, supporting almost all language constructs (control flow, recursion, mutation, etc.) while generating high-performance code without requiring any user intervention or refactoring to stage computations. We showcase several examples of physics-informed learning that directly utilize this extension to existing simulation code: neural surrogate models, machine learning on simulated quantum hardware, and data-driven stochastic dynamical model discovery with neural stochastic differential equations.
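As a purely illustrative sketch of what the language-wide ∂P claim means in practice with Zygote.jl (the system described in the talk): an ordinary Julia function containing a loop and a branch can be differentiated directly, with no tracing types, framework-specific rewrites, or staging. The function below is mine, not from the paper.

```julia
using Zygote

# An ordinary Julia function with control flow — no special
# framework types or refactoring required to differentiate it.
function loss(xs)
    s = 0.0
    for x in xs
        s += x > 0 ? sin(x)^2 : x^2  # branch taken per element
    end
    return s
end

# `gradient` returns a tuple with one entry per argument.
g, = Zygote.gradient(loss, [1.0, -2.0, 3.0])
# g[i] ≈ sin(2 * xs[i]) for positive entries, 2 * xs[i] otherwise
```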
Code is available at https://github.com/MikeInnes/zygote-paper
AAAI 2020 Spring Symposium on Combining Artificial Intelligence and Machine Learning with Physics Sciences, March 23-25, 2020 (https://sites.google.com/view/aaai-mlps)
https://figshare.com/articles/presentation/Generalized_Physics-Informed_Learning_through_Language-Wide_Differentiable_Programming/12751934
Ryan Soklaski says:
Thank you for the interesting post – this is exciting work.
In this video, you reference a “very recent theory” that the number of terms needed in a neural network grows polynomially with the number of dimensions of your data (for large dimensions).
Can you point me to the source for this? I would love to dig in deeper.
Christopher Rackauckas says:
Yes, https://arxiv.org/abs/1908.10828 is one such result.
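Schematically, and glossing over that paper's precise assumptions, results of this type show that the number of parameters $N$ a network needs to approximate the target to accuracy $\varepsilon$ in dimension $d$ satisfies

$$N(d, \varepsilon) \le c \, d^{p} \, \varepsilon^{-q}$$

for constants $c, p, q > 0$ independent of $d$ and $\varepsilon$, i.e. polynomial rather than exponential growth in the dimension.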