Details
Original language | English
---|---
Article number | 100035
Journal | Examples and Counterexamples
Volume | 1
Publication status | Published - Nov. 2021
Abstract
In this work, we discuss some pitfalls when solving differential equations with neural networks. Due to the highly nonlinear cost functional, the optimization may converge to local minima, yielding functions that do not solve the problem. The main reason for these failures is a sensitivity to the initial guesses of the nonlinear iteration. We apply known algorithms and corresponding implementations, including code snippets, and present an example and a counterexample for the logistic differential equation. These findings are further substantiated by variations of the collocation points and learning rates.
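The pitfalls described above arise in the usual collocation approach: a feedforward network u_θ(t) is trained to minimize the squared ODE residual at a set of collocation points together with a penalty on the initial condition. Below is a minimal PyTorch sketch of that approach for the logistic equation u'(t) = r·u(t)·(1 − u(t)) with u(0) = u0; the network architecture, optimizer, learning rate, number of collocation points, and the values r = 1, u0 = 0.5 are illustrative assumptions, not the settings used in the paper.

```python
# Minimal PINN-style collocation sketch for the logistic ODE
#   u'(t) = r * u(t) * (1 - u(t)),  u(0) = u0.
# All hyperparameters below are illustrative assumptions.
import torch

torch.manual_seed(0)

r, u0 = 1.0, 0.5            # assumed growth rate and initial value
T, n_coll = 5.0, 50         # time horizon and number of collocation points

# Small feedforward network mapping t -> u(t)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 20), torch.nn.Tanh(),
    torch.nn.Linear(20, 20), torch.nn.Tanh(),
    torch.nn.Linear(20, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

t = torch.linspace(0.0, T, n_coll).reshape(-1, 1).requires_grad_(True)
t0 = torch.zeros(1, 1)

for step in range(5000):
    opt.zero_grad()
    u = net(t)
    # du/dt at the collocation points via automatic differentiation
    du = torch.autograd.grad(u, t, grad_outputs=torch.ones_like(u),
                             create_graph=True)[0]
    residual = du - r * u * (1.0 - u)                       # ODE residual
    loss = (residual ** 2).mean() + (net(t0) - u0).pow(2).mean()
    loss.backward()
    opt.step()

# Compare with the closed-form solution u(t) = 1 / (1 + (1/u0 - 1) * exp(-r t))
with torch.no_grad():
    t_test = torch.linspace(0.0, T, 11).reshape(-1, 1)
    u_exact = 1.0 / (1.0 + (1.0 / u0 - 1.0) * torch.exp(-r * t_test))
    print("max abs error:", (net(t_test) - u_exact).abs().max().item())
```

Rerunning the sketch with different random seeds, learning rates, or numbers of collocation points gives a feel for the sensitivity to initial guesses and training parameters that the abstract points to.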
ASJC Scopus subject areas
- Mathematics (all)
- Computational Mathematics
- Mathematics (all)
- Applied Mathematics
- Mathematics (all)
- Mathematics (miscellaneous)
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
Knoke, T., & Wick, T. (2021). Solving differential equations via artificial neural networks: Findings and failures in a model problem. In: Examples and Counterexamples, Volume 1, 100035, Nov. 2021.
Publication: Contribution to journal › Article › Research › Peer review
TY - JOUR
T1 - Solving differential equations via artificial neural networks
T2 - Findings and failures in a model problem
AU - Knoke, Tobias
AU - Wick, Thomas
PY - 2021/11
Y1 - 2021/11
N2 - In this work, we discuss some pitfalls when solving differential equations with neural networks. Due to the highly nonlinear cost functional, the optimization may converge to local minima, yielding functions that do not solve the problem. The main reason for these failures is a sensitivity to the initial guesses of the nonlinear iteration. We apply known algorithms and corresponding implementations, including code snippets, and present an example and a counterexample for the logistic differential equation. These findings are further substantiated by variations of the collocation points and learning rates.
AB - In this work, we discuss some pitfalls when solving differential equations with neural networks. Due to the highly nonlinear cost functional, the optimization may converge to local minima, yielding functions that do not solve the problem. The main reason for these failures is a sensitivity to the initial guesses of the nonlinear iteration. We apply known algorithms and corresponding implementations, including code snippets, and present an example and a counterexample for the logistic differential equation. These findings are further substantiated by variations of the collocation points and learning rates.
KW - Feedforward neural network
KW - Logistic equation
KW - Numerical optimization
KW - Ordinary differential equation
KW - PyTorch
UR - http://www.scopus.com/inward/record.url?scp=85124032029&partnerID=8YFLogxK
U2 - 10.1016/j.exco.2021.100035
DO - 10.1016/j.exco.2021.100035
M3 - Article
VL - 1
JO - Examples and Counterexamples
JF - Examples and Counterexamples
M1 - 100035
ER -