Empowering deep neural quantum states through efficient optimization

  • Computing the ground state of interacting quantum matter is a long-standing challenge, especially for complex two-dimensional systems. Recent developments have highlighted the potential of neural quantum states to solve the quantum many-body problem by encoding the many-body wavefunction into artificial neural networks. However, this method has faced the critical limitation that existing optimization algorithms are not suitable for training modern large-scale deep network architectures. Here, we introduce a minimum-step stochastic-reconfiguration optimization algorithm, which allows us to train deep neural quantum states with up to 10^6 parameters. We demonstrate our method for paradigmatic frustrated spin-1/2 models on square and triangular lattices, for which our trained deep networks approach machine precision and yield improved variational energies compared to existing results. Equipped with our optimization algorithm, we find numerical evidence for gapless quantum-spin-liquid phases in the considered models, an open question to date. We present a method that captures the emergent complexity in quantum many-body problems through the expressive power of large-scale artificial neural networks.
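The abstract refers to a minimum-step stochastic-reconfiguration (MinSR) optimizer that makes stochastic reconfiguration feasible for networks with very many parameters. As a rough illustration only, not the authors' implementation, the sketch below shows how a stochastic-reconfiguration-style update can be solved in the sample basis rather than the parameter basis; the function name minsr_update, the centred log-derivative matrix O, the centred local energies e_loc, and the pseudo-inverse regularisation are all illustrative assumptions not taken from the paper.

```python
import numpy as np

def minsr_update(O, e_loc, lr=1e-2, rcond=1e-12):
    """Illustrative stochastic-reconfiguration step solved in sample space.

    O      : (n_samples, n_params) complex array of centred log-derivatives,
             O[k, i] = d log(psi(s_k)) / d theta_i minus its sample mean.
    e_loc  : (n_samples,) centred local energies, E_loc(s_k) - mean(E_loc).
    Returns a parameter update of shape (n_params,).
    """
    n_samples = O.shape[0]
    O = O / np.sqrt(n_samples)        # so that O^dag O estimates the S matrix
    e = e_loc / np.sqrt(n_samples)

    # Conventional SR solves (O^dag O) dtheta = -lr * O^dag e in parameter space,
    # an n_params x n_params problem. Here the small n_samples x n_samples system
    # (O O^dag) x = e is solved instead and mapped back with O^dag, which is
    # cheap when n_samples << n_params.
    T = O @ O.conj().T                                  # (n_samples, n_samples)
    x = np.linalg.pinv(T, rcond=rcond, hermitian=True) @ e
    return -lr * (O.conj().T @ x)

# Toy usage with random data (real log-derivatives would come from Monte Carlo
# sampling of the neural quantum state):
rng = np.random.default_rng(0)
O = rng.normal(size=(512, 10_000)) + 1j * rng.normal(size=(512, 10_000))
O -= O.mean(axis=0)                   # centre per parameter
e_loc = rng.normal(size=512) + 1j * rng.normal(size=512)
e_loc -= e_loc.mean()
dtheta = minsr_update(O, e_loc)
```

The design point of this kind of reformulation is scaling: with on the order of 10^6 parameters but only a few thousand samples, the linear solve acts on a matrix whose size is set by the sample count rather than the parameter count.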

Metadata
Author:Ao Chen, Markus Heyl
URN:urn:nbn:de:bvb:384-opus4-1139176
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/113917
ISSN:1745-2473
ISSN:1745-2481
Parent Title (English):Nature Physics
Publisher:Springer
Place of publication:Berlin
Type:Article
Language:English
Year of first Publication:2024
Publishing Institution:Universität Augsburg
Release Date:2024/07/09
Volume:20
Issue:9
First Page:1476
Last Page:1481
DOI:https://doi.org/10.1038/s41567-024-02566-1
Institutes:Mathematisch-Naturwissenschaftlich-Technische Fakultät
Mathematisch-Naturwissenschaftlich-Technische Fakultät / Institut für Physik
Mathematisch-Naturwissenschaftlich-Technische Fakultät / Institut für Physik / Lehrstuhl für Theoretische Physik III
Dewey Decimal Classification:5 Natural sciences and mathematics / 51 Mathematics / 510 Mathematics
Licence:CC-BY 4.0: Creative Commons: Attribution (with Print on Demand)