DifferentialEquations.jl 2.0: State of the Ecosystem
May 8 2017 in CUDA, Differential Equations, HPC, Julia, Mathematics, Stochastics, Uncategorized | Tags: | Author: Christopher Rackauckas
In this blog post I want to summarize what we have accomplished with DifferentialEquations’ 2.0 release and detail where we are going next. I want to put the design changes and development work into a larger context so that everyone can better understand what has been achieved and how we plan to tackle our next challenges.
If you find this project interesting and would like to support our work, please star our Github repository. Thanks!
Now let’s get started.
DifferentialEquations.jl 1.0: The Core
Before we start talking about 2.0, let’s understand first what 1.0 was all about. DifferentialEquations.jl 1.0 was about answering a single question: how can we put the wide array of differential equations into one simple and efficient interface? The result of this was the common interface explained in the first blog post. Essentially, we created … READ MORE
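To make that concrete, here is a minimal sketch of the common interface, written against today's f(u, p, t) argument order (the 2017-era signature differed slightly, so treat this as illustrative rather than the release's exact API): an ODE and an SDE are posed as problem types and handed to the very same solve call.

```julia
using DifferentialEquations

# Drift for the ODE/SDE and a diffusion term for the SDE.
f(u, p, t) = 1.01u
g(u, p, t) = 0.3u

# Two different classes of problems...
odeprob = ODEProblem(f, 0.5, (0.0, 1.0))
sdeprob = SDEProblem(f, g, 0.5, (0.0, 1.0))

# ...solved through the exact same `solve(prob, alg)` entry point.
odesol = solve(odeprob, Tsit5())
sdesol = solve(sdeprob, SRIW1())

odesol(0.5)   # solutions act as continuous functions via interpolation
```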
Some Fun With Julia Types: Symbolic Expressions in the ODE Solver
May 4 2017 in Differential Equations, Julia, Mathematics, Uncategorized | Tags: | Author: Christopher Rackauckas
In Julia, you can naturally write generic algorithms which work on any type that has specific “actions”. For example, an “AbstractArray” is a type which has a specific set of functions implemented. This means that in any generically-written algorithm that wants an array, you can give it an AbstractArray and it will “just work”. This kind of abstraction makes it easy to write a simple algorithm and then use that same exact code for other purposes. For example, distributed computing can be done by just passing in a DistributedArray, and the algorithm can be run on the GPU by passing in a GPUArray. Because Julia’s functions auto-specialize on the types you give them, Julia automatically makes efficient versions specifically for the types you pass in, which at compile time strips away the cost of the abstraction.
This means … READ MORE
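As a tiny illustration of that genericity (my own snippet, not code from the post), here is a function written once against AbstractArray that works unchanged on vectors, ranges, and any other array type implementing the interface:

```julia
# Written once against the AbstractArray interface; no concrete array type is assumed.
function mysum(A::AbstractArray)
    s = zero(eltype(A))
    for x in A
        s += x
    end
    return s
end

mysum(rand(10))             # Vector{Float64}
mysum(1:100)                # UnitRange{Int} — no array is ever allocated
mysum(reshape(1:9, 3, 3))   # a lazily reshaped range works too
```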
Building a Web App in Julia: DifferentialEquations.jl Online
January 17 2017 in Differential Equations, Julia, Mathematics | Tags: | Author: Christopher Rackauckas
Web apps are all the rage because of accessibility. However, there are usually problems with trying to turn existing software into a web app: specifically, the user interface and performance. For a web application to perform well, it needs a clean user interface with a “smart” backend which can automatically adapt to the user’s needs, yet still return a nice-looking response in under a second. This has typically limited the types of applications which can become responsive web applications.
Scientific software is one domain where the user interface has been obstructive at best, and which is always in need of better performance. However, the programming language Julia has allowed us to build an ecosystem of numerical differential equation solvers over the last 8 months that is both easy to use and highly performant. Thus we had … READ MORE
6 Months of DifferentialEquations.jl: Where We Are and Where We Are Going
December 15 2016 in Differential Equations, Julia, Mathematics, Programming, Stochastics, Uncategorized | Tags: DifferentialEquations.jl, julia | Author: Christopher Rackauckas
So around 6 months ago, DifferentialEquations.jl was first registered. It was at first made to be a library which can solve “some” types of differential equations, and that “some” didn’t even include ordinary differential equations. The focus was mostly fast algorithms for stochastic differential equations and partial differential equations.
Needless to say, Julia makes you too productive. Ambitions grew. By the first release announcement, much had already changed. Not only were there ordinary differential equation solvers, there were many. But the key difference was a change in focus. Instead of just looking to give a production-quality library of fast methods, a major goal of DifferentialEquations.jl became to unify the various existing packages of Julia to give one user-friendly interface.
Since that release announcement, we have made enormous progress. At this point, I believe we have both the most expansive and flexible … READ MORE
Modular Algorithms for Scientific Computing in Julia
December 7 2016 in Julia, Programming | Tags: | Author: Christopher Rackauckas
When most people think of the Julia programming language, they usually think about its speed. Sure, Julia is fast, but what I emphasized in a previous blog post is that what makes Julia tick is not just-in-time compilation but rather specialization. In that post, I talked a lot about how multiple dispatch allows Julia to automatically specialize functions on the types of the arguments. That micro-specialization is what allows Julia to compile functions which are as fast as those from C/Fortran.
However, there is a form of “macro specialization” that Julia’s design philosophy and multiple dispatch let you build libraries around. It allows you to design an algorithm in a very generic form, essentially writing your full package with inputs saying “insert scientific computing package here”, allowing users to specialize the entire overarching algorithm to the specific problem. … READ MORE
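A hypothetical sketch of the idea, with made-up type and function names purely for illustration: the overarching routine is written against an abstract solver argument, and users swap in whichever concrete method (possibly from another package) they prefer.

```julia
using LinearAlgebra

# Hypothetical solver types standing in for "insert scientific computing package here".
abstract type AbstractLinearSolver end
struct BackslashSolver <: AbstractLinearSolver end
struct QRSolver        <: AbstractLinearSolver end

solve_linear(::BackslashSolver, A, b) = A \ b
solve_linear(::QRSolver,        A, b) = qr(A) \ b

# The overarching routine never names a concrete method; dispatch specializes the whole thing.
function residual_correction(solver::AbstractLinearSolver, A, b, x)
    return x .+ solve_linear(solver, A, b .- A * x)
end

A = rand(4, 4) + 4I
b = rand(4)
residual_correction(QRSolver(), A, b, zeros(4))
```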
7 Julia Gotchas and How to Handle Them
October 4 2016 in Julia, Programming | Tags: | Author: Christopher Rackauckas
Let me start by saying Julia is a great language. I love the language; it is what I find to be the most powerful and intuitive language that I have ever used. It’s undoubtedly my favorite language. That said, there are some “gotchas”: tricky little things you need to know about. Every language has them, and one of the first things you have to do in order to master a language is to find out what they are and how to avoid them. The point of this blog post is to help accelerate this process for you by exposing some of the most common “gotchas” and offering alternative programming practices.
Julia is a good language for understanding what’s going on because there’s no magic. The Julia developers like to have clearly defined rules for how things act. This means that all behavior … READ MORE
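As a taste of what the post covers, here is one classic gotcha (my own illustration, not a snippet from the post): non-constant global variables have no fixed type, so loops that read them cannot be optimized.

```julia
# Non-constant globals can be rebound to any type at any time,
# so code that reads them cannot be compiled to tight loops.
a = 1.0
function slow_sum(n)
    s = 0.0
    for _ in 1:n
        s += a          # type of `a` is unknown at compile time
    end
    return s
end

const b = 1.0            # `const` fixes the type and restores full speed
function fast_sum(n)
    s = 0.0
    for _ in 1:n
        s += b
    end
    return s
end

# @time slow_sum(10^7); @time fast_sum(10^7)
```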
Introducing DifferentialEquations.jl
August 1 2016 in Differential Equations, FEM, Julia, Programming, Stochastics | Tags: DifferentialEquations.jl, julia | Author: Christopher Rackauckas
Edit: This post is very old. See this post for more up-to-date information.
Differential equations are ubiquitous throughout mathematics and the sciences. In fact, I myself have studied various forms of differential equations stemming from fields including biology, chemistry, economics, and climatology. What was interesting is that, although many different people are using differential equations for many different things, pretty much everyone wants the same thing: to quickly solve differential equations in their various forms, and make some pretty plots to describe what happened.
The goal of DifferentialEquations.jl is to do exactly that: to make it easy to solve differential equations with the latest and greatest algorithms, and put out a pretty plot. The core idea behind DifferentialEquations.jl is that, while differential equations are easy to describe, they have such diverse behavior that experts have spent over a century compiling … READ MORE
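In practice that workflow looks roughly like this (a minimal sketch in the current API, which differs from the 2016-era interface this post was written against):

```julia
using DifferentialEquations, Plots

f(u, p, t) = 0.98u                        # a simple exponential growth model
prob = ODEProblem(f, 1.0, (0.0, 1.0))
sol  = solve(prob)                        # an appropriate algorithm is chosen automatically
plot(sol, xlabel = "t", ylabel = "u(t)")  # the plot recipe works directly on the solution
```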
Using Julia’s Type System For Hidden Performance Gains
June 7 2016 in Julia, Programming | Tags: DifferentialEquations.jl, julia, performance | Author: Christopher Rackauckas
What I want to share today is how you can use Julia’s type system to hide performance gains in your code. What I mean is this: in many cases you may find that the optimal way to do some calculation is not a “clean” solution. What do you do? What I want to do is show how you can define special arrays which are wrappers, such that these special “speedups” are performed in the background, without having to keep all of that muck in your main algorithms. This is easiest to show by example.
The examples I will be building towards are useful for solving ODEs and SDEs. Indeed, these tricks have all been implemented as part of DifferentialEquations.jl and so these examples come from a real use case! They really highlight a main feature of Julia: … READ MORE
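Here is a toy example of the general flavor of the trick (my own, not the actual arrays from the post): wrap the data in an AbstractArray subtype so the special behavior lives in getindex rather than in the solver code.

```julia
# A lazily scaled array: the scaling is applied element-by-element in getindex,
# so algorithms written for AbstractArray consume it unchanged and nothing is allocated.
struct ScaledArray{T,N,A<:AbstractArray{T,N}} <: AbstractArray{T,N}
    data::A
    scale::T
end

Base.size(S::ScaledArray) = size(S.data)
Base.getindex(S::ScaledArray, i::Int...) = S.scale * S.data[i...]

v = ScaledArray(rand(5), 2.0)
sum(v)        # generic code sees an ordinary AbstractArray
```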
Finalizing Your Julia Package: Documentation, Testing, Coverage, and Publishing
May 16 2016 in Julia | Tags: AppVeyor, coverage, documentation, Documenter.jl, julia, testing, Travis CI | Author: Christopher Rackauckas
In this tutorial we will go through the steps to finalizing a Julia package. At this point you have some functionality you wish to share with the world… what do you do? You want to have documentation, code testing each time you commit (on all the major OSs), a nice badge which shows how much of the code is tested, and to register the package in METADATA so that people can install it just by typing Pkg.add("Pkgname"). How do you do all of this?
Note: At any time feel free to check out my package repository DifferentialEquations.jl, which should be a working example.
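For the testing piece, the shape of the test file is roughly the following (a hedged sketch using the modern Test standard library; the 2016-era post targets Base.Test and the older package layout, and the function names below are made up for illustration):

```julia
# test/runtests.jl — run by `Pkg.test("Pkgname")` and by the CI services on every commit.
using Test
using Pkgname                        # placeholder for your package's name

@testset "Pkgname basics" begin
    @test Pkgname.double(2) == 4     # `double` is a made-up function, purely for illustration
    @test_throws MethodError Pkgname.double("two")
end
```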
Generate the Package and Get it on Github
First you will want to generate your package and get it into a Github repository. Make sure you have a Github account, and then set up the environment variables in the git shell:
Optimal Number of Workers for Parallel Julia
April 16 2016 in HPC, Julia, Programming, Stochastics | Tags: BLAS, hyperthreading, julia, parallel computing, workers | Author: Christopher Rackauckas
How many workers do you choose when running a parallel job in Julia? The answer is easy, right? The number of physical cores. We always default to that number. For my Core i7 4770K, that means it’s 4, not 8, since 8 would include the hyperthreads. On my FX8350 there are 8 cores, but only 4 floating-point units (FPUs) which do the math, so for mathematical projects I should use 4, right? I want to demonstrate that it’s not that simple.
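For concreteness, a typical experiment looks something like this (a sketch in today's Julia, where Distributed must be loaded explicitly; the Monte Carlo job is my own stand-in, not the benchmark from the post):

```julia
using Distributed                  # in 2016-era Julia, addprocs was available without this

addprocs(4)                        # try 4, then 8, and time the same job under each setting

@everywhere function monte_carlo_pi(n)
    hits = 0
    for _ in 1:n
        hits += rand()^2 + rand()^2 <= 1
    end
    return 4 * hits / n
end

# Split the work across the workers; rerun with different worker counts and compare timings.
@time estimates = pmap(monte_carlo_pi, fill(10^7, nworkers()))
```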
Where the Intuition Comes From
Most of the time when doing scientific computing you are doing parallel programming without even knowing it. This is because a lot of vectorized operations are “implicitly parallel”, meaning that they are multi-threaded behind the scenes to make everything faster. In other languages like Python, MATLAB, and R, this is also the case. Fire up MATLAB … READ MORE
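That implicit parallelism is mostly multithreaded BLAS, and you can see and control it directly (a small illustration of my own, not code from the post):

```julia
using LinearAlgebra

A = rand(2000, 2000); B = rand(2000, 2000)

BLAS.set_num_threads(1)
@time A * B                        # a single-threaded matrix multiply

BLAS.set_num_threads(4)
@time A * B                        # the same call, now implicitly parallel across 4 threads
```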