
Eigen-informed neural ordinary differential equations: incorporating knowledge of system properties

Using neural ordinary differential equations (ODEs) to model complex systems is still a challenging venture that fails for various reasons, often resulting in convergence to unsatisfactory local minima or unintended termination of the training process because of solver issues. The root of the problem lies in either the complexity of the optimization problem or a lack of data. The negative effects of both aspects can be reduced by incorporating a priori knowledge, a common strategy in Scientific Machine Learning. Especially when modeling physical systems, there is almost always more system knowledge available than is used for training. Examples range from generic properties such as “the system is stable” to more quantifiable attributes such as “the system oscillates in a known frequency range”. Such knowledge can be used to achieve faster convergence and better generalization.

In this paper, we focus specifically on system properties that can be expressed in terms of eigenvalues, such as (partial) stability, oscillation capability, frequencies, damping, and stiffness. We explain how such properties can be intuitively integrated into the training of neural ODEs, provide open-source software for this, and finally show in three academic examples that such eigen-informed neural ODEs converge in fewer steps (median up to factor ), generalize better (median up to factor ), and are solvable more efficiently (median up to factor ) compared to pure neural ODEs, and can even reconstruct a given system from undersampled data.
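The core idea of the abstract — expressing system knowledge through the eigenvalues of the (linearized) dynamics and feeding it back into training as a loss term — can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's open-source implementation: the function names, the finite-difference Jacobian, and the choice of penalizing positive real parts (i.e. encoding "the system is stable") are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Finite-difference Jacobian of f: R^n -> R^n at point x (illustrative only)."""
    n = x.size
    fx = f(x)
    J = np.zeros((n, n))
    for i in range(n):
        xp = x.copy()
        xp[i] += eps
        J[:, i] = (f(xp) - fx) / eps
    return J

def stability_penalty(f, x):
    """Penalize eigenvalues of the Jacobian with positive real part.

    For a stable linearization, all eigenvalues have non-positive real
    parts, so the penalty is zero; unstable modes contribute quadratically.
    Such a term could be added to the data-fitting loss of a neural ODE.
    """
    eigvals = np.linalg.eigvals(jacobian_fd(f, x))
    return float(np.sum(np.maximum(eigvals.real, 0.0) ** 2))

# Example on linear systems dx/dt = A x, where the Jacobian is simply A:
A_stable = np.array([[-1.0, 0.0], [0.0, -2.0]])    # eigenvalues -1, -2
A_unstable = np.array([[0.5, 0.0], [0.0, -2.0]])   # eigenvalues 0.5, -2
x0 = np.array([1.0, 1.0])

print(stability_penalty(lambda x: A_stable @ x, x0))    # ≈ 0.0
print(stability_penalty(lambda x: A_unstable @ x, x0))  # ≈ 0.25 (= 0.5**2)
```

Analogous penalties could encode the other properties named in the abstract, e.g. bounding the imaginary parts to a known oscillation frequency range, or bounding the eigenvalue spread to limit stiffness.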

Metadata
Author:Tobias Thummerer, Lars Mikelsons
URN:urn:nbn:de:bvb:384-opus4-1247822
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/124782
ISSN:0925-2312
Parent Title (English):Neurocomputing
Publisher:Elsevier BV
Place of publication:Amsterdam
Type:Article
Language:English
Year of first Publication:2025
Publishing Institution:Universität Augsburg
Release Date:2025/09/04
Volume:654
First Page:131358
DOI:https://doi.org/10.1016/j.neucom.2025.131358
Institutes:Fakultät für Angewandte Informatik
Fakultät für Angewandte Informatik / Institut für Informatik
Fakultät für Angewandte Informatik / Institut für Informatik / Lehrstuhl für Ingenieurinformatik mit Schwerpunkt Mechatronik
Dewey Decimal Classification:0 Computer science, information & general works / 00 Computer science, knowledge & systems / 004 Data processing; computer science
Licence:CC-BY 4.0: Creative Commons: Attribution (with Print on Demand)