

Learning DeepXDE
💭 This article walks through how to use DeepXDE, section by section:
- solving a basic system of ODEs;
- learning an operator from data with a deep operator network (DeepONet).
Since the article is fairly long, it tries to stay concise to keep the reading load low and the information density high. To avoid extra explanatory overhead, some terms and concepts may be used somewhat loosely.
DeepXDE is an open-source, highly modular scientific computing library built around deep learning. It provides a range of data-driven, physics-based, and hybrid models such as PINN, DeepONet, and MFNN, and it supports defining and solving several classes of differential equations (ordinary and partial differential equations, among others), which makes it effective for complex scientific computing problems; its paper can be found here. The main advantages of the DeepXDE framework are:
- Highly modular: DeepXDE provides modules that can be called and combined freely, such as computational geometry, boundary conditions, differential equations, neural networks, training, and prediction, making it easy for users to assemble models of physical systems;
- Multiple equation types: supports user-defined ordinary differential equations, partial differential equations, integro-differential equations, and more to describe a specific problem;
- Extensibility: users can add their own numerical algorithms, models, or other functionality as needed;
- Visualization: a rich set of visualization tools helps users understand the computed results intuitively;
- Multiple backends: DeepXDE supports the mainstream AI frameworks as its model backend, which simplifies deployment.
1 Solving a system of ODEs
- Original equations:
- Boundary (initial) conditions:
- Exact solution:
Installing dependencies
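The exact install command is not preserved in the text; judging from the index URL in the log below, it was presumably something along these lines:

```
pip install deepxde -i https://pypi.tuna.tsinghua.edu.cn/simple
```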
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting deepxde
  Downloading DeepXDE-1.9.3-py3-none-any.whl (165 kB)
Collecting scikit-learn
Collecting matplotlib
Collecting scikit-optimize>=0.9.0
Requirement already satisfied: numpy in /opt/mamba/lib/python3.10/site-packages (from deepxde) (1.24.2)
Requirement already satisfied: scipy in /opt/mamba/lib/python3.10/site-packages (from deepxde) (1.10.1)
...
Installing collected packages: threadpoolctl, pyparsing, pyaml, pillow, kiwisolver, joblib, fonttools, cycler, contourpy, scikit-learn, matplotlib, scikit-optimize, deepxde
Successfully installed contourpy-1.1.1 cycler-0.11.0 deepxde-1.9.3 fonttools-4.42.1 joblib-1.3.2 kiwisolver-1.4.5 matplotlib-3.8.0 pillow-10.0.1 pyaml-23.9.6 pyparsing-3.1.1 scikit-learn-1.3.1 scikit-optimize-0.9.0 threadpoolctl-3.2.0
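The backend message below is printed the first time DeepXDE is imported; assuming the imports used throughout the rest of the article are simply:

```python
import numpy as np
import deepxde as dde
```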
No backend selected. Finding available backend...
Using backend: pytorch
Other supported backends: tensorflow.compat.v1, tensorflow, jax, paddle.
paddle supports more examples now and is recommended.
Found pytorch
Setting the default backend to "pytorch". You can change it in the ~/.deepxde/config.json file or export the DDE_BACKEND environment variable. Valid options are: tensorflow.compat.v1, tensorflow, pytorch, jax, paddle (all lowercase)
1.1 Defining the time domain
dde.geometry.TimeDomain(t0, t1)
A class that defines the time domain, i.e. the time interval over which the differential equations are solved.
Here it specifies the time interval used for this problem, starting at t = 0.
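The original code cell is not shown; a minimal sketch, assuming the interval [0, 10] of the standard DeepXDE ODE-system demo (only the start point t = 0 is confirmed by the text):

```python
# Time domain [t0, t1]; the end time 10 is an assumption.
geom = dde.geometry.TimeDomain(0, 10)
```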
1.2 Defining the ODE system
dde.gradients.jacobian(y, t, i)
Computes the derivative of y with respect to t; the parameter i indicates which component of y is differentiated.
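A sketch of the system definition. The equations themselves were not preserved in the text; the form below (y1' = y2, y2' = -y1) is an assumption taken from the standard DeepXDE demo, and it is consistent with the initial values y1(0) = 0, y2(0) = 1 and with the comparison against a known exact solution later in this section:

```python
def ode_system(t, y):
    # y has two components: y1 = y[:, 0:1] and y2 = y[:, 1:2]
    y1, y2 = y[:, 0:1], y[:, 1:2]
    dy1_dt = dde.gradients.jacobian(y, t, i=0)  # d y1 / d t
    dy2_dt = dde.gradients.jacobian(y, t, i=1)  # d y2 / d t
    # Assumed equations in residual form: y1' = y2 and y2' = -y1
    return [dy1_dt - y2, dy2_dt + y1]
```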
1.3 Checking for boundary (initial-time) points
The boundary-check function takes two arguments:
- x: the coordinates of the space-time point;
- on_initial: whether DeepXDE already classifies the current point as lying at the initial time.
np.isclose(a, b)
Checks whether the two floating-point numbers a and b are equal (within a small tolerance).
The function works as follows: it first accesses the first component of the space-time point via x[0], i.e. the time coordinate (the only coordinate in this problem), and then uses np.isclose()
to check whether that coordinate equals 0. If it does, the point is considered to lie on the initial-time boundary and the function returns True, meaning the condition applies there; otherwise it returns False.
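A sketch of the predicate described above (the function name boundary is an assumption):

```python
def boundary(x, on_initial):
    # True when the time coordinate of the point is (approximately) 0,
    # i.e. the point lies at the initial time.
    return np.isclose(x[0], 0)
```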
1.4 Defining the initial conditions
dde.icbc.IC(geom, func, on_initial, component)
Specifies the state of the system of differential equations at the initial time. Here two conditions are imposed: on the initial-time points, the y1 component equals 0 and the y2 component equals 1 (a sketch follows the notes below). Parameters:
🔖 Code notes
- geom: the space-time domain of the system;
- func: a function giving the state of the system at the initial time;
- on_initial: a predicate that selects the space-time points where the condition is imposed (here, the boundary function from section 1.3);
- component: the index of the solution component to which this initial condition applies.
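A sketch of the two initial conditions, reusing the geom and boundary objects defined above:

```python
# y1(0) = 0 and y2(0) = 1; component selects which output the condition acts on.
ic1 = dde.icbc.IC(geom, lambda x: 0, boundary, component=0)
ic2 = dde.icbc.IC(geom, lambda x: 1, boundary, component=1)
```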
1.5 Defining the reference function
Used to compare against the true solution of the ODE system.
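The exact solution itself was not preserved in the text; assuming the sin/cos solution of the standard demo (which satisfies y1(0) = 0, y2(0) = 1), a sketch:

```python
def func(t):
    # Assumed exact solution: y1 = sin(t), y2 = cos(t)
    return np.hstack((np.sin(t), np.cos(t)))
```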
1.6 Feeding the problem into the solver
dde.data.PDE(geom, eqn, ic, num_domain, num_boundary, solution, num_test)
Uses the PDE data class to set up the ODE problem (DeepXDE treats an ODE as a special case of a PDE). Parameters (a sketch follows the notes below):
🔖 Code notes
- eqn: the mathematical expression (residual form) of the system of equations;
- ic: the initial conditions, either a single IC object or a list of IC objects;
- num_domain: the number of training points sampled inside the space-time domain;
- num_boundary: the number of points sampled on the boundary of the domain for the boundary/initial conditions;
- solution: the analytical solution of the system, used to assess the accuracy of the result;
- num_test: the number of points sampled when testing the accuracy of the result.
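A sketch combining the pieces defined above. The point counts (35 domain points, 2 boundary points, 100 test points) are assumptions borrowed from the standard demo; the article does not state them:

```python
data = dde.data.PDE(
    geom, ode_system, [ic1, ic2],
    num_domain=35, num_boundary=2,
    solution=func, num_test=100,
)
```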
1.7 Defining the network architecture
dde.nn.FNN(layer_size, activation, initializer)
Uses a fully connected feed-forward neural network. Parameters (a sketch follows the notes below):
🔖 Code notes
- layer_size: a list giving the number of neurons in each layer of the network, including the input, hidden, and output layers;
- activation: the type of activation function used by the network;
- initializer: the weight-initialization scheme of the network.
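A sketch of the network. The sizes are assumptions (1 input for t, 3 hidden layers of 50 neurons, 2 outputs for y1 and y2, as in the standard demo); the article does not give them:

```python
layer_size = [1] + [50] * 3 + [2]   # input t -> 3 hidden layers -> (y1, y2)
activation = "tanh"
initializer = "Glorot uniform"
net = dde.nn.FNN(layer_size, activation, initializer)
```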
1.8 Compiling the model
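A sketch of building and compiling the model. The optimizer, learning rate, and metric are assumptions chosen to match the usual DeepXDE workflow; the article only shows the compile output:

```python
model = dde.Model(data, net)
model.compile("adam", lr=0.001, metrics=["l2 relative error"])
```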
Compiling model...
'compile' took 0.000612 s
1.9 Training the model
The number of training iterations is chosen by the user (a sketch of the training call follows the notes below). When training a PINN model with DeepXDE, the output generally consists of the following parts:
🔖 Code notes
- Step: the current training step;
- Train loss: the loss values on the training points. This is a list with one entry per loss term; for this problem it has length 4: the residuals of the two ODE equations followed by the two initial-condition losses;
- Test loss: the loss values on the test points, with the same meaning as for the training set;
- Test metric: the evaluation metric on the test set.
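A sketch of the training call that produces the log below; the 20000 iterations match the final step in the log:

```python
# Train for a user-chosen number of iterations
losshistory, train_state = model.train(iterations=20000)
```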
Training model...

Step      Train loss                                   Test loss                                    Test metric
0         [3.30e-03, 6.57e-03, 0.00e+00, 1.00e+00]     [3.26e-03, 6.73e-03, 0.00e+00, 1.00e+00]     [1.00e+00]
1000      [8.85e-03, 7.09e-03, 7.85e-09, 3.17e-04]     [9.95e-03, 7.09e-03, 7.85e-09, 3.17e-04]     [7.82e-01]
2000      [5.82e-03, 4.90e-03, 7.93e-07, 1.33e-04]     [6.21e-03, 5.12e-03, 7.93e-07, 1.33e-04]     [6.46e-01]
3000      [3.39e-03, 3.85e-03, 1.04e-06, 7.31e-05]     [3.54e-03, 4.07e-03, 1.04e-06, 7.31e-05]     [5.04e-01]
4000      [1.71e-03, 2.19e-03, 7.57e-07, 3.25e-05]     [1.82e-03, 2.12e-03, 7.57e-07, 3.25e-05]     [3.45e-01]
5000      [7.35e-04, 9.64e-04, 1.85e-06, 9.08e-06]     [8.26e-04, 7.74e-04, 1.85e-06, 9.08e-06]     [1.98e-01]
6000      [3.27e-04, 3.88e-04, 3.17e-07, 3.08e-06]     [3.45e-04, 2.75e-04, 3.17e-07, 3.08e-06]     [1.11e-01]
7000      [7.87e-05, 1.20e-04, 1.50e-07, 3.37e-07]     [7.53e-05, 8.04e-05, 1.50e-07, 3.37e-07]     [4.44e-02]
8000      [2.96e-05, 5.78e-05, 2.04e-08, 6.77e-08]     [2.89e-05, 4.05e-05, 2.04e-08, 6.77e-08]     [2.19e-02]
9000      [3.63e-05, 5.01e-05, 2.87e-06, 1.51e-08]     [3.80e-05, 4.70e-05, 2.87e-06, 1.51e-08]     [1.40e-02]
10000     [2.20e-05, 2.00e-05, 2.52e-06, 1.13e-06]     [2.25e-05, 1.39e-05, 2.52e-06, 1.13e-06]     [1.04e-02]
11000     [3.54e-05, 1.86e-05, 8.43e-06, 3.51e-06]     [3.86e-05, 1.90e-05, 8.43e-06, 3.51e-06]     [7.30e-03]
12000     [1.28e-04, 8.36e-05, 2.84e-05, 3.24e-06]     [1.30e-04, 7.24e-05, 2.84e-05, 3.24e-06]     [1.34e-02]
13000     [3.58e-06, 3.32e-06, 3.58e-10, 1.19e-09]     [4.29e-06, 3.48e-06, 3.58e-10, 1.19e-09]     [3.01e-03]
14000     [3.23e-06, 2.73e-06, 1.94e-10, 1.67e-09]     [3.97e-06, 3.01e-06, 1.94e-10, 1.67e-09]     [2.88e-03]
15000     [2.85e-06, 2.32e-06, 8.76e-09, 8.29e-09]     [3.60e-06, 2.64e-06, 8.76e-09, 8.29e-09]     [2.25e-03]
16000     [2.47e-06, 1.98e-06, 1.25e-10, 3.96e-10]     [3.10e-06, 2.30e-06, 1.25e-10, 3.96e-10]     [1.97e-03]
17000     [2.24e-06, 1.72e-06, 1.23e-09, 7.85e-10]     [2.90e-06, 2.06e-06, 1.23e-09, 7.85e-10]     [2.00e-03]
18000     [4.16e-06, 4.60e-06, 4.93e-07, 1.16e-07]     [4.89e-06, 5.32e-06, 4.93e-07, 1.16e-07]     [3.39e-03]
19000     [1.99e-06, 1.55e-06, 7.84e-08, 1.62e-09]     [2.65e-06, 1.97e-06, 7.84e-08, 1.62e-09]     [2.28e-03]
20000     [2.22e-05, 1.30e-05, 1.04e-05, 1.76e-06]     [2.33e-05, 1.09e-05, 1.04e-05, 1.76e-06]     [6.12e-03]

Best model at step 19000:
  train loss: 3.62e-06
  test loss: 4.70e-06
  test metric: [2.28e-03]

'train' took 59.388652 s



2 Expression fitting
2.1 A single expression
A simple function to be fitted.
Definition of the computational domain.
16 points sampled inside the domain are used for training, and 100 points are used for testing.
Next, we choose a fully connected neural network of depth 4 (i.e. 3 hidden layers) and width 20,
with tanh as the activation function and Glorot uniform as the initializer.
Once the function-approximation problem and the network are set up, we build a model and choose the Adam optimizer with a learning rate of 0.001 (a sketch of the whole workflow follows).
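A sketch of this section. The target function and the domain bounds were not preserved in the text; f(x) = x·sin(5x) on [-1, 1] is an assumption taken from the standard DeepXDE function-approximation demo. The remaining settings (16/100 points, network size, tanh, Glorot uniform, Adam with lr 0.001, 10000 iterations) follow the article and the log below:

```python
import numpy as np
import deepxde as dde


def func(x):
    # Assumed target function: f(x) = x * sin(5x)
    return x * np.sin(5 * x)


geom = dde.geometry.Interval(-1, 1)            # computational domain (assumed bounds)
data = dde.data.Function(geom, func, 16, 100)  # 16 training points, 100 test points

# Depth 4 (3 hidden layers), width 20, tanh activation, Glorot uniform init
net = dde.nn.FNN([1] + [20] * 3 + [1], "tanh", "Glorot uniform")

model = dde.Model(data, net)
model.compile("adam", lr=0.001, metrics=["l2 relative error"])
losshistory, train_state = model.train(iterations=10000)
```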
Compiling model...
'compile' took 0.001053 s
Training model...

Step      Train loss    Test loss     Test metric
0         [2.75e-01]    [2.36e-01]    [1.06e+00]
1000      [1.85e-03]    [1.54e-03]    [8.59e-02]
2000      [1.14e-04]    [1.33e-04]    [2.52e-02]
3000      [4.76e-05]    [7.54e-05]    [1.90e-02]
4000      [3.03e-05]    [5.68e-05]    [1.65e-02]
5000      [9.66e-06]    [3.39e-05]    [1.27e-02]
6000      [9.93e-07]    [2.01e-05]    [9.80e-03]
7000      [2.99e-07]    [1.81e-05]    [9.30e-03]
8000      [2.41e-07]    [1.82e-05]    [9.33e-03]
9000      [2.29e-05]    [3.74e-05]    [1.34e-02]
10000     [4.78e-06]    [2.17e-05]    [1.02e-02]

Best model at step 8000:
  train loss: 2.41e-07
  test loss: 1.82e-05
  test metric: [9.33e-03]

'train' took 13.906669 s


3 Learning an operator from an aligned dataset
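Only the training log of this section survived. Below is a minimal sketch of a DeepONet trained on aligned data, assuming the standard DeepXDE antiderivative-operator demo: the dataset files (antiderivative_aligned_train.npz / antiderivative_aligned_test.npz), the branch/trunk sizes, the ReLU activation, and Adam with lr 0.001 are all assumptions; only the single-term loss and the 10000 training steps are confirmed by the log below.

```python
import numpy as np
import deepxde as dde

# Load an aligned dataset: every input function is observed at the same sensor
# locations, and every output function at the same query points.
d = np.load("antiderivative_aligned_train.npz", allow_pickle=True)
X_train = (d["X"][0].astype(np.float32), d["X"][1].astype(np.float32))
y_train = d["y"].astype(np.float32)
d = np.load("antiderivative_aligned_test.npz", allow_pickle=True)
X_test = (d["X"][0].astype(np.float32), d["X"][1].astype(np.float32))
y_test = d["y"].astype(np.float32)

data = dde.data.TripleCartesianProd(
    X_train=X_train, y_train=y_train, X_test=X_test, y_test=y_test
)

# DeepONet for aligned data: the branch net takes the input function sampled at
# 100 sensors, the trunk net takes the 1-D query coordinate.
net = dde.nn.DeepONetCartesianProd(
    [100, 40, 40], [1, 40, 40], "relu", "Glorot normal"
)

model = dde.Model(data, net)
model.compile("adam", lr=0.001, metrics=["mean l2 relative error"])
losshistory, train_state = model.train(iterations=10000)
```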
Compiling model...
'compile' took 0.000265 s

Training model...

Step      Train loss    Test loss     Test metric
0         [1.45e-01]    [1.45e-01]    [7.77e-01]
1000      [4.79e-05]    [4.79e-05]    [2.89e-02]
2000      [1.56e-05]    [1.56e-05]    [1.68e-02]
3000      [9.47e-06]    [9.47e-06]    [1.29e-02]
4000      [6.99e-06]    [6.99e-06]    [1.10e-02]
5000      [5.62e-06]    [5.62e-06]    [9.77e-03]
6000      [5.96e-05]    [5.96e-05]    [1.76e-02]
7000      [3.97e-06]    [3.97e-06]    [7.98e-03]
8000      [3.35e-06]    [3.35e-06]    [7.31e-03]
9000      [2.98e-06]    [2.98e-06]    [6.84e-03]
10000     [2.66e-06]    [2.66e-06]    [6.40e-03]

Best model at step 10000:
  train loss: 2.66e-06
  test loss: 2.66e-06
  test metric: [6.40e-03]

'train' took 21.000665 s












Linfeng Zhang
yangshaoyi