Differential equations parameterized by neural networks become expensive to
solve numerically as training progresses. We propose a remedy that encourages
learned dynamics to be easier to solve. Specifically, we introduce a
differentiable surrogate for the time cost of standard numerical solvers.
We propose a data-driven integration method, Taylor-Lagrange NODEs (TL-NODEs), which performs numerical integration with a fixed-order Taylor expansion while learning to estimate the expansion's approximation error. This preserves accuracy while using only a low-order Taylor expansion, greatly reducing computational cost. A suite of numerical experiments shows that TL-NODEs are more than an order of magnitude faster than existing approaches with no loss in performance.
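To make the fixed-order Taylor-expansion integration concrete, here is a minimal sketch in plain Python. It is an illustration under stated assumptions, not the TL-NODE implementation: the learned estimator of the truncation (Lagrange remainder) error is omitted, and the example ODE dx/dt = -x (whose solution derivatives are known in closed form) is an illustrative choice.

```python
import math

def taylor_step(derivs, x, h):
    """Advance x by one step of size h using the truncated Taylor series
    x(t+h) ~= sum_k (h^k / k!) * d^k x / dt^k, where `derivs(x)` supplies
    the time derivatives of the solution as a list [x, x', x'', ...]."""
    terms = derivs(x)
    return sum(h**k / math.factorial(k) * d for k, d in enumerate(terms))

def derivs_decay(x, order=3):
    # For dx/dt = -x the solution's time derivatives alternate: x, -x, x, -x, ...
    return [((-1) ** k) * x for k in range(order + 1)]

# Integrate from x(0) = 1 to t = 1 with a third-order Taylor scheme.
x, h = 1.0, 0.1
for _ in range(10):
    x = taylor_step(derivs_decay, x, h)
print(abs(x - math.exp(-1.0)))  # small truncation error of the low-order scheme
```

In a neural-ODE setting the solution derivatives would come from repeated differentiation through the learned vector field rather than a closed form, and the learned error estimate would control the step size; the fixed, low expansion order is what keeps the per-step cost constant.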