Physics-based Deep Learning
This notebook can run directly on Bohrium Notebook. To begin, click the Connect button on the top panel and select OK.
This is part of the book Physics-based Deep Learning originally available at https://physicsbaseddeeplearning.org.
To navigate through the book, use the Collection panel at the bottom of the page.
If you find this book helpful, please star the original Github repo and cite the book!
Version fetched on 2023.9.19. Slight modifications were made to enhance the reading experience on Bohrium.
License: Apache
1.2.2 Introduction :: Simple Forward Simulation of Burgers Equation with phiflow
This chapter gives an introduction to running forward simulations, i.e., regular simulations that start with a given initial state and numerically approximate a later state, and introduces the ΦFlow framework. ΦFlow provides a set of differentiable building blocks that directly interface with deep learning frameworks, and hence is a very good basis for the topics of this book. Before going for deeper and more complicated integrations, this notebook (and the next one) will show how regular simulations can be done with ΦFlow. Later on, we'll show that these simulations can be easily coupled with neural networks.
The main repository for ΦFlow (in the following "phiflow") is https://github.com/tum-pbs/PhiFlow, and additional API documentation and examples can be found at https://tum-pbs.github.io/PhiFlow/.
For this Jupyter notebook (and all following ones), you can find a "[run in colab]" link at the end of the first paragraph (alternatively you can use the launch button at the top of the page). This will load the latest version from the PBDL GitHub repo in a Colab notebook that you can execute on the spot.
Model
As physical model we'll use Burgers equation. This equation is a very simple, yet non-linear and non-trivial, model equation that can lead to interesting shock formations. Hence, it's a very good starting point for experiments, and its 1D version (from equation {eq}`model-burgers1d`) is given by:

$$\frac{\partial u}{\partial t} + u \frac{\partial u}{\partial x} = \nu \frac{\partial^2 u}{\partial x^2}$$
Importing and loading phiflow
Let's get some preliminaries out of the way: first we'll import the phiflow library, more specifically the numpy operators for fluid flow simulations: phi.flow (differentiable versions for a DL framework X are loaded via phi.X.flow instead).
Note: Below, the first command with a "!" prefix will install the phiflow python package from GitHub via pip in your python environment once you uncomment it. We've assumed that phiflow isn't installed yet; if you have already installed it, just leave the first line commented out (the same will hold for all following notebooks).
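The install and import cell isn't reproduced here; a minimal sketch could look like the following (the exact pip commands and version pin are assumptions):

```python
# !pip install --quiet phiflow==2.2.0                          # install from PyPI (version pinned to match the output below)
# !pip install --quiet git+https://github.com/tum-pbs/PhiFlow  # alternative: install the latest version from GitHub

from phi.flow import *  # numpy-based operators; phi.torch.flow / phi.tf.flow load the differentiable versions
import phi

print("Using phiflow version: {}".format(phi.__version__))
```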
Using phiflow version: 2.2.0
Next we can define and initialize the necessary constants (denoted by upper-case names): our simulation domain will have N=128 cells as discretization points for the 1D velocity u in a periodic domain over the interval [-1, 1]. We'll use 32 time STEPS for a time interval of 1, giving us DT=1/32. Additionally, we'll use a viscosity NU of ν=0.01/π.

We'll also define an initial state given by u(x,0) = -sin(πx) in the numpy array INITIAL_NUMPY, which we'll use to initialize the velocity in the simulation in the next cell. This initialization will produce a nice shock in the center of our domain.
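The corresponding cell isn't shown here, but a minimal sketch could look as follows; the interval [-1, 1], the viscosity value, and the -sin(πx) profile are the standard Burgers benchmark choices assumed above:

```python
import numpy as np

N = 128              # number of cells
DX = 2. / N          # cell size for the assumed domain [-1, 1]
STEPS = 32           # number of time steps
DT = 1. / STEPS      # time step size
NU = 0.01 / np.pi    # viscosity (assumed standard benchmark value)

# initial velocity -sin(pi x), sampled at the cell centers of [-1, 1]
INITIAL_NUMPY = np.asarray([-np.sin(np.pi * x) for x in np.linspace(-1 + DX/2, 1 - DX/2, N)])
```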
Phiflow is object-oriented and centered around field data in the form of grids (internally represented by a tensor object). I.e. you assemble your simulation by constructing a number of grids, and updating them over the course of time steps.
Phiflow internally works with tensors that have named dimensions. This will be especially handy later on for 2D simulations with additional batch and channel dimensions, but for now we'll simply convert the 1D array into a phiflow tensor that has a single spatial dimension 'x'.
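A sketch of this conversion, using phiflow's math.tensor together with the spatial dimension constructor:

```python
from phi.flow import *  # provides math, spatial, CenteredGrid, ...

# wrap the numpy array as a phiflow tensor with a single spatial dimension named 'x'
INITIAL = math.tensor(INITIAL_NUMPY, spatial('x'))
```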
Next, we initialize a 1D velocity grid from the INITIAL numpy array that was converted into a tensor. The extent of our domain is specified via the bounds parameter, and the grid uses periodic boundary conditions (extrapolation.PERIODIC). These two properties are the main difference between a tensor and a grid: the latter has boundary conditions and a physical extent.
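A sketch of the grid construction under the assumptions above (phiflow versions around 2.2 also accept the older Box[-1:1] slicing syntax for the bounds):

```python
# centered grid with periodic boundary conditions on the assumed domain [-1, 1]
velocity = CenteredGrid(INITIAL, extrapolation.PERIODIC, x=N, bounds=Box(x=(-1, 1)))
# velocity = CenteredGrid(Noise(), extrapolation.PERIODIC, x=N, bounds=Box(x=(-1, 1)))  # alternative: random initialization
```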
Just to illustrate, we'll also print some info about the velocity object: it's a phi.math tensor with a size of 128. Note that the actual grid content is contained in the values of the grid. Below we're printing five entries by using the numpy() function to convert the content of the phiflow tensor into a numpy array. For tensors with more dimensions, we'd need to specify the additional dimensions here, e.g., 'y,x,vector' for a 2D velocity field. (For tensors with a single dimension we could leave it out.)
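The print statements producing the output below could look roughly like this (the exact formatting is an assumption):

```python
print("Velocity tensor shape: " + format(velocity.shape))    # spatial dimension x with 128 entries
print("Velocity tensor type: " + format(type(velocity.values)))
print("Velocity tensor entries 10 to 14: " + format(velocity.values.numpy('x')[10:15]))
```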
Velocity tensor shape: (xˢ=128) Velocity tensor type: <class 'phi.math._tensors.CollapsedTensor'> Velocity tensor entries 10 to 14: [0.49289819 0.53499762 0.57580819 0.61523159 0.65317284]
Running the simulation
Now we're ready to run the simulation itself. To compute the diffusion and advection components of our model equation we can simply call the existing diffusion and semi_lagrangian operators in phiflow: diffuse.explicit(u,...) computes an explicit diffusion step via central differences for the ν ∂²u/∂x² term of our model. Next, advect.semi_lagrangian(f,u) is used for a stable first-order approximation of the transport of an arbitrary field f by a velocity u. In our model the velocity transports itself (the u ∂u/∂x term), hence we use the semi_lagrangian function to transport the velocity with itself in the implementation:
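A sketch of the corresponding time-stepping loop, collecting all intermediate states in a velocities list (as discussed below):

```python
velocities = [velocity]  # keep all intermediate states for plotting later on

for i in range(STEPS):
    v1 = diffuse.explicit(velocities[-1], NU, DT)   # explicit step for the diffusion term
    v2 = advect.semi_lagrangian(v1, v1, DT)         # transport the velocity by itself
    velocities.append(v2)

print("New velocity content at t={}: {}".format(STEPS * DT, velocities[-1].values.numpy('x,vector')[0:5]))
```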
New velocity content at t=1.0: [[0.0057228 ] [0.01716715] [0.02861034] [0.040052 ] [0.05149214]]
Here we're actually collecting all time steps in the list velocities. This is not necessary in general (and could consume lots of memory for long-running sims), but it's useful here to plot the evolution of the velocity states later on.
The print statements show a few of the velocity entries, and already indicate that something is happening in our simulation, but it's difficult to get an intuition for the behavior of the PDE from these numbers alone. Hence, let's visualize the states over time to show what is happening.
Visualization
We can visualize this 1D case easily in a graph: the following code shows the initial state in blue, followed by three later points in time in green, cyan and purple.
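A plotting cell along these lines might look as follows; the choice of matplotlib, the grid_to_numpy helper, and the particular intermediate steps (10, 20 and 32) are assumptions:

```python
import matplotlib.pyplot as plt

def grid_to_numpy(v):
    # flatten the 1D grid content of a phiflow grid into a numpy array
    return v.values.numpy('x')

x = np.linspace(-1, 1, N)
for step, color in zip([0, 10, 20, 32], ['blue', 'green', 'cyan', 'purple']):
    plt.plot(x, grid_to_numpy(velocities[step]), lw=2, color=color, label="t={:.2f}".format(step * DT))
plt.xlabel('x'); plt.ylabel('u'); plt.legend()
plt.show()
```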
This nicely shows the shock developing in the center of our domain, which forms from the collision of the two initial velocity "bumps": the positive one on the left (moving right) and the negative one to the right of the center (moving left).
As these lines can overlap quite a bit we'll also use a different visualization in the following chapters that shows the evolution over the course of all time steps in a 2D image. Our 1D domain will be shown along the Y-axis, and each point along X will represent one time step.
The code below converts our collection of velocity states into a 2D array, repeating individual time steps 8 times to make the image a bit wider. This is purely optional, of course, but makes it easier to see what's happening in our Burgers simulation.
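One way to implement this conversion (reusing the grid_to_numpy helper from the previous sketch) and display the result:

```python
# stack all states into an [N, STEPS+1] array, then repeat each time step 8 times along the time axis
vels = np.stack([grid_to_numpy(v) for v in velocities], axis=1)
vels_img = np.repeat(vels, 8, axis=1)  # purely for display: makes the image wider

plt.figure(figsize=(16, 5))
plt.imshow(vels_img, origin='upper', cmap='inferno')
plt.xlabel('time'); plt.ylabel('x'); plt.colorbar(); plt.title('Velocity')
plt.show()
```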
This concludes a first simulation in phiflow. It's not overly complex, but because of that it's a good starting point for evaluating and comparing different physics-based deep learning approaches in the next chapter. But before that, we'll target a more complex simulation type in the next section.
Next steps
Some things to try based on this simulation setup:
- Feel free to experiment - the setup above is very simple, you can change the simulation parameters or the initialization. E.g., you can use a noise field via Noise() to get more chaotic results (cf. the comment in the velocity cell above); see the sketch after this list.
- A bit more complicated: extend the simulation to 2D (or higher). This will require changes throughout, but all operators above support higher dimensions. Before trying this, you probably will want to check out the next example, which covers a 2D Navier-Stokes case.
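For the first suggestion, a randomized initialization could look like the following sketch; Noise() is part of phiflow, while the scale and smoothness values here are arbitrary choices:

```python
# replace the smooth initial state with a random noise field for more chaotic dynamics
velocity = CenteredGrid(Noise(scale=10, smoothness=1.5), extrapolation.PERIODIC, x=N, bounds=Box(x=(-1, 1)))
```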