DDPS Seminar Talk: Generalizing Scientific Machine Learning and Differentiable Simulation Beyond Continuous Models
November 12 2023 in Uncategorized | Tags: data-driven physics, ddps, physics-informed machine learning, piml, sciml | Author: Christopher Rackauckas
I’m pleased to share a talk I gave in the DDPS seminar series!
Data-driven Physical Simulations (DDPS) Seminar Series
Abstract: The integration of scientific models into deep learning structures, commonly referred to as scientific machine learning (SciML), has made great strides in the last few years in incorporating models such as ODEs and PDEs into deep learning through differentiable simulation. However, the vast space of scientific simulation also includes models like jump diffusions, agent-based models, and more. Is SciML constrained to these simple continuous cases, or is there a way to generalize to more advanced model forms? This talk will dive into the mathematical aspects of generalizing differentiable simulation to discuss cases like chaotic simulations, differentiating stochastic simulations like particle filters and agent-based models, and solving inverse … READ MORE
Accurate and Efficient Physics-Informed Learning Through Differentiable Simulation (ASA Seminar Talk)
July 14 2022 in Uncategorized | Tags: | Author: Christopher Rackauckas
Abstract: Scientific machine learning (SciML) methods allow for the automatic discovery of mechanistic models by infusing neural network training into the simulation process. In this talk we will start by showcasing some of the ways that SciML is being used, from the discovery of extrapolatory epidemic models to nonlinear mixed effects models in pharmacology. From there, we will discuss some of the increasingly advanced computational techniques behind the training process, focusing on the numerical issues involved in handling differentiation of highly stiff and chaotic systems. Viewers will leave with an understanding of how compiler techniques are being infused into the simulation stack to increasingly automate the process of developing mechanistic models.
Bio: Dr. Chris Rackauckas is the Director of Scientific Research at Pumas-AI, the Director of Modeling … READ MORE
Keynote: New Horizons in Modeling and Simulation with Julia (Modelica Conference 2021)
June 25 2022 in Uncategorized | Tags: | Author: Christopher Rackauckas
Keynote Address: New Horizons in Modeling and Simulation in Julia
Presenters: Viral Shah (Julia Computing, CEO and Co-Founder), Chris Rackauckas (Julia Computing, Director of Modeling and Simulation), and Christopher Laughman (Mitsubishi Electric Research Laboratories, Principal Member Research Staff)
Abstract: As modeling has become more ubiquitous, our models keep growing. The time to build models, verify their behavior, and simulate them is increasing exponentially as we seek more precise predictions. How will our tools change to accommodate the future? Julia’s language design has led to new opportunities. The combination of multiple dispatch, staged compilation, and Julia’s composable libraries has made it possible to build a next-generation symbolic-numeric framework. Julia’s abstract interpretation framework enables capabilities such as automatic differentiation, automatic surrogate generation, symbolic tracing, uncertainty propagation, and automatic … READ MORE
Improved ForwardDiff.jl Stacktraces With Package Tags
December 19 2021 in Uncategorized | Tags: | Author: Christopher Rackauckas
You may have seen some hilariously long stacktraces when using ForwardDiff. In the latest releases of OrdinaryDiffEq.jl we have fixed this, and the fix is rather safe. I want to take a second to describe some of the technical details so that others can copy this technique.
The reason for these long stacktraces is the tag parameter. The Dual number type is given by Dual{T,V,N}, where V is an element type (usually Float64), N is a chunk size (some integer), and T is the tag. The tag prevents perturbation confusion by erroring if two incompatible dual numbers try to interact. The key requirement for it to prevent perturbation confusion is for the type to be unique in the context of the user. For example, if the user is differentiating f, and then differentiating x->derivative(f,x), you want the tag to be … READ MORE
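To make the tag mechanics concrete, here is a minimal sketch of seeding a dual number with a package-specific tag type; the name MyPkgTag is hypothetical, and this is a simplified illustration rather than the exact code used in OrdinaryDiffEq.jl:

```julia
using ForwardDiff

# A package-specific tag type (hypothetical name). Using a short custom tag
# keeps the Dual's type signature compact in stacktraces, instead of
# embedding the full type of the differentiated function.
struct MyPkgTag end

f(x) = x^2 + sin(x)

x = 1.0
# Seed a dual number carrying the custom tag, with a single partial set to 1.
d = ForwardDiff.Dual{MyPkgTag}(x, one(x))
y = f(d)

ForwardDiff.value(y)        # f(x)
ForwardDiff.partials(y, 1)  # f'(x) = 2x + cos(x)
```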
The Use and Practice of Scientific Machine Learning
November 18 2021 in Uncategorized | Tags: | Author: Christopher Rackauckas
The Use and Practice of Scientific Machine Learning
Scientific machine learning (SciML) methods allow for the automatic discovery of mechanistic models by infusing neural network training into the simulation process. In this talk we will start by showcasing some of the ways that SciML is being used, from the discovery of extrapolatory epidemic models to nonlinear mixed effects models in pharmacology. From there, we will discuss some of the increasingly advanced computational techniques behind the training process, focusing on the numerical issues involved in handling differentiation of highly stiff and chaotic systems. Viewers will leave with an understanding of how compiler techniques are being infused into the simulation stack to increasingly automate the process of developing mechanistic models.
Benchmarks behind this talk can be found at READ MORE
When do the mean and variance define an SDE?
November 1 2021 in Uncategorized | Tags: | Author: Christopher Rackauckas
I recently saw a paper that made the following statement:
“Innes et al. [22] trained neural SDEs by backpropagating through the operations of the solver, however their training objective simply matched the first two moments of the training data, implying that it could not consistently estimate diffusion functions.”
However, given that the statement “could not consistently estimate diffusion functions” had no reference attached and no proof in the appendix, I was interested in figuring out the mathematical foundation behind the claim. Furthermore, I know from the DiffEqFlux documentation example that there is at least one case where a second-order method of moments seems to estimate the diffusion function. So a question arose: when do the mean and variance define an SDE?
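To make the question concrete, here is a minimal sketch of the moment-matching objective under discussion, assuming a toy scalar SDE with a constant diffusion parameter; the drift, time span, and target moments are invented for illustration:

```julia
using DifferentialEquations, Statistics

# Toy scalar SDE dX = -X dt + p[1] dW; p[1] is the diffusion parameter
# that moment matching attempts to recover.
f(u, p, t) = -u          # drift
g(u, p, t) = p[1]        # constant diffusion (the unknown)

function moment_loss(p, data_mean, data_var; trajectories = 100)
    prob = SDEProblem(f, g, 1.0, (0.0, 1.0), p)
    ens = solve(EnsembleProblem(prob), SOSRI(); trajectories = trajectories)
    u_end = [sol.u[end] for sol in ens]
    # The objective matches only the first two moments of the terminal state.
    (mean(u_end) - data_mean)^2 + (var(u_end) - data_var)^2
end

# Example call with made-up target moments:
moment_loss([0.5], 0.37, 0.11)
```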
Of course, it being 2021, a Twitter thread captures the full discussion. But I want to take a step back … READ MORE
GPU-Accelerated ODE Solving in R with Julia, the Language of Libraries
August 24 2020 in Differential Equations, Julia, Programming, R, Uncategorized | Tags: diffeqr, differentialequations, gpu, high-performance, jit, r | Author: Christopher Rackauckas
R is a widely used language for data science, but for performance reasons most of its underlying libraries are written in C, C++, or Fortran. Julia is a relative newcomer to the field which, since its 1.0 release, has broken out to become one of the top 20 most-used languages thanks to its high-performance libraries for scientific computing and machine learning. Julia’s value proposition has been high performance in a high-level language, known as solving the two-language problem, which has allowed the language to build a robust, mature, and expansive package ecosystem. While this has been a major strength for package developers, the fact remains that there are still large and robust communities in other high-level languages like R and Python. Instead of spawning distracting language wars, we should ask the … READ MORE
Universal Differential Equations for Scientific Machine Learning (Video)
March 6 2020 in Uncategorized | Tags: neural ODEs, neural ordinary differential equations, scientific machine learning, scientific ml, sciml | Author: Christopher Rackauckas
Colloquium with Chris Rackauckas
Department of Mathematics
Massachusetts Institute of Technology
“Universal Differential Equations for Scientific Machine Learning”
Feb 19, 2020, 3:30 p.m., 499 DSL
https://arxiv.org/abs/2001.04385
Abstract:
In the context of science, the well-known adage “a picture is worth a thousand words” might well be “a model is worth a thousand datasets.” Scientific models, such as Newtonian physics or biological gene regulatory networks, are human-driven simplifications of complex phenomena that serve as surrogates for the countless experiments that validated the models. Recently, machine learning has been able to overcome the inaccuracies of approximate modeling by directly learning the entire set of nonlinear interactions from data. However, without any predetermined structure from the scientific basis behind the problem, machine learning approaches are flexible but data-expensive, requiring large databases of homogeneous labeled training data. … READ MORE
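As a rough illustration of the universal differential equation idea behind the talk, here is a minimal sketch combining a known mechanistic term with a neural correction term; the decay rate, network size, and setup are invented for illustration and follow the general DifferentialEquations.jl/Flux.jl pattern rather than the paper's exact code:

```julia
using Flux, OrdinaryDiffEq

# Known physics (linear decay) plus a neural network term to be learned.
nn = Chain(Dense(1 => 8, tanh), Dense(8 => 1))
p, re = Flux.destructure(nn)   # flatten parameters into a vector

function ude!(du, u, p, t)
    du[1] = -0.5f0 * u[1] + re(p)([u[1]])[1]   # mechanistic term + NN term
end

u0 = Float32[1.0]
prob = ODEProblem(ude!, u0, (0.0f0, 5.0f0), p)
sol = solve(prob, Tsit5(); saveat = 0.25f0)
```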
Recent advancements in differential equation solver software
October 16 2019 in Differential Equations, Julia, Mathematics, Scientific ML, Uncategorized | Tags: | Author: Christopher Rackauckas
This was a talk given at the Modelica Jubilee Symposium – Future Directions of System Modeling and Simulation.
Recent Advancements in Differential Equation Solver Software
Since the ancient Fortran methods like dop853 and DASSL were created, many advancements in numerical analysis, computational methods, and hardware have accelerated computing. However, many applications of differential equations still rely on the same older software, possibly to their own detriment. In this talk we will describe the recent advancements being made in differential equation solver software, focusing on the Julia-based DifferentialEquations.jl ecosystem. We will show how high-order Rosenbrock and IMEX methods have proven advantageous over traditional BDF implementations in certain problem domains, and the types of issues that give rise to the general performance differences between the methods. Extensions of these … READ MORE
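As a minimal sketch of the kind of comparison discussed, here is how one might solve a classic stiff problem with both a high-order Rosenbrock method and a BDF-style method in the DifferentialEquations.jl ecosystem; the problem choice is illustrative, not taken from the talk's benchmarks:

```julia
using OrdinaryDiffEq

# Classic stiff Robertson chemical kinetics problem.
function rober!(du, u, p, t)
    y1, y2, y3 = u
    du[1] = -0.04y1 + 1.0e4 * y2 * y3
    du[2] =  0.04y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2^2
    du[3] =  3.0e7 * y2^2
end

prob = ODEProblem(rober!, [1.0, 0.0, 0.0], (0.0, 1.0e5))

# A high-order Rosenbrock method versus a BDF-style multistep method
# in the spirit of DASSL/CVODE.
sol_rosenbrock = solve(prob, Rodas5())
sol_bdf        = solve(prob, QNDF())
```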
Scientific AI: Domain Models with Integrated Machine Learning
July 25 2019 in Uncategorized | Tags: | Author: Christopher Rackauckas
Modeling practice seems to be partitioned into scientific models defined by mechanistic differential equations and machine learning models defined by parameterizations of neural networks. While the ability of interpretable mechanistic models to extrapolate from little information is seemingly at odds with the big data “model-free” approach of neural networks, the next step in scientific progress is to utilize these methodologies together in order to emphasize their strengths while mitigating weaknesses. In this talk we will describe four separate ways that we are merging differential equations and deep learning through the power of the DifferentialEquations.jl and Flux.jl libraries. Data-driven hypothesis generation of model structure, automated real-time control of dynamical systems, accelerated PDE solving, and memory-efficient deep learning workflows will all be shown to derive from this common computational structure … READ MORE
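As a rough sketch of that common computational structure, here is a minimal neural ODE built from Flux.jl and OrdinaryDiffEq.jl; the network architecture, initial condition, and time span are invented for illustration:

```julia
using Flux, OrdinaryDiffEq

# A small neural network defines the ODE right-hand side (a neural ODE).
nn = Chain(Dense(2 => 16, tanh), Dense(16 => 2))
p, re = Flux.destructure(nn)   # flatten parameters for use as solver params

dudt!(du, u, p, t) = (du .= re(p)(u))

u0 = Float32[1.0, 0.0]
prob = ODEProblem(dudt!, u0, (0.0f0, 1.0f0), p)
sol = solve(prob, Tsit5(); saveat = 0.1f0)
```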