Neural-network-based simulation of rare event processes at the water/oxide interface
Subproject P12
Atomistic computer simulations of processes occurring at the water/oxide interface are challenging in several ways. The calculation of atomic forces based on ab initio methods is computationally very demanding, and rare barrier-crossing events may require very long simulation times. Both of these aspects severely limit the accessible system sizes and simulation times.
In project P12, we will address these challenges using a combination of machine learning and advanced rare event sampling methods. In particular, using software developed in our group and collaborating with P03 Kresse, we will train neural network potentials based on the Behler–Parrinello approach for oxide/water interfaces, starting with the Fe_{3}O_{4}/water system studied in P11 Backus. We will pay special attention to error estimation and the correct treatment of long-range interactions. With the new potential, we will study the structure and dynamics of water near the oxide surface to provide the atomistic information necessary to rationalize the spectroscopy experiments of P11 Backus. Another important goal of P12 is to explore how deep generative models can be used to enhance rare event simulations. For this purpose, we will apply normalizing flows, represented by deep neural networks, to trajectory space. The resulting improved transition path sampling simulations will be used to study reactive processes investigated experimentally in other subprojects of TACO.
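As an illustration of the descriptor machinery underlying the Behler–Parrinello approach, the sketch below evaluates a radial atom-centred symmetry function with a cosine cutoff. All parameter values (eta, r_s, r_c) and the neighbour distances are hypothetical, chosen only for demonstration.

```python
import numpy as np

def cutoff(r, r_c):
    """Cosine cutoff function f_c(r): decays smoothly to zero at r = r_c."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def g2(r_ij, eta, r_s, r_c):
    """Radial Behler-Parrinello symmetry function G^2 for one central atom,
    summed over the distances r_ij to its neighbours."""
    return float(np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * cutoff(r_ij, r_c)))

# Three neighbours at illustrative distances (Angstrom); parameters are made up.
r = np.array([1.0, 2.5, 5.0])
val = g2(r, eta=0.5, r_s=0.0, r_c=6.0)
```

In a full high-dimensional neural network potential, a vector of many such functions (with different eta, r_s and angular variants) forms the input layer for each atom.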
Publications
2021 

Bircher, Martin P.; Singraber, Andreas; Dellago, Christoph: Improved description of atomic environments using low-cost polynomial functions with compact support. Machine Learning: Science and Technology 2(3), 035026 (2021). DOI: 10.1088/2632-2153/abf817. Tags: P12, pre-TACO.
The prediction of chemical properties using machine learning techniques calls for a set of appropriate descriptors that accurately describe atomic and, on a larger scale, molecular environments. A mapping of conformational information on a space spanned by atom-centred symmetry functions (SFs) has become a standard technique for energy and force predictions using high-dimensional neural network potentials (HDNNPs). An appropriate choice of SFs is particularly crucial for accurate force predictions. Established atom-centred SFs, however, are limited in their flexibility, since their functional form restricts the angular domain that can be sampled without introducing problematic derivative discontinuities. Here, we introduce a class of atom-centred SFs based on polynomials with compact support, called polynomial symmetry functions (PSFs), which enable a free choice of both the angular and the radial domain covered. We demonstrate that the accuracy of PSFs is on par with or considerably better than that of conventional atom-centred SFs. In particular, a generic set of PSFs with an intuitive choice of the angular domain inspired by organic chemistry considerably improves prediction accuracy for organic molecules in the gaseous and liquid phase, with reductions in force prediction errors over a test set approaching 50% for certain systems. Contrary to established atom-centred SFs, the computation of PSFs does not involve any exponentials, and their intrinsic compact support supersedes the use of separate cutoff functions, facilitating the choice of their free parameters. Most importantly, the number of floating point operations required to compute the polynomial SFs introduced here is considerably lower than that of other state-of-the-art SFs, enabling their efficient implementation without the need for highly optimised code structures or caching, with speedups with respect to other state-of-the-art SFs reaching a factor of 4.5 to 5. This low-effort performance benefit substantially simplifies their use in new programs and on emerging platforms such as graphical processing units. Overall, polynomial SFs with compact support improve the accuracy of both energy and force predictions with HDNNPs while enabling significant speedups compared to their well-established counterparts.
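To make the compact-support idea concrete, here is a minimal sketch of one compactly supported polynomial: a C²-continuous fifth-order interpolation that equals exactly 1 below r_low and exactly 0 above r_high, so that no separate cutoff function is needed. The specific form and the parameter names are illustrative, not the exact functions from the paper.

```python
import numpy as np

def poly_compact(r, r_low, r_high):
    """C2-continuous fifth-order polynomial with compact support:
    returns 1 for r <= r_low, 0 for r >= r_high, and interpolates
    smoothly in between (first and second derivatives vanish at
    both endpoints of the transition region)."""
    x = np.clip((r - r_low) / (r_high - r_low), 0.0, 1.0)
    return ((15.0 - 6.0 * x) * x - 10.0) * x**3 + 1.0

# Illustrative parameters: plateau up to 2 Angstrom, support ends at 6 Angstrom.
r = np.linspace(0.0, 8.0, 9)
vals = poly_compact(r, r_low=2.0, r_high=6.0)
```

Because the function is identically zero outside its support, neighbours beyond r_high contribute nothing and can be skipped entirely, which is one source of the speedups quoted above.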
2020 

Wohlfahrt, Oliver; Dellago, Christoph; Sega, Marcello: Ab initio structure and thermodynamics of the RPBE-D3 water/vapor interface by neural-network molecular dynamics. The Journal of Chemical Physics 153(14), 144710 (2020). DOI: 10.1063/5.0021852. Tags: P12, pre-TACO.
Aided by a neural network representation of the density functional theory potential energy landscape of water in the revised Perdew–Burke–Ernzerhof approximation corrected for dispersion, we calculate several structural and thermodynamic properties of its liquid/vapor interface. The speed of the neural network allows us to bridge the size and time scale gaps required to sample the properties of water along its liquid/vapor coexistence line with unprecedented precision.
2019 

Michl, Jakob; Sega, Marcello; Dellago, Christoph: Phase stability of the ice XVII-based CO_{2} chiral hydrate from molecular dynamics simulations. The Journal of Chemical Physics 151(10), 104502 (2019). DOI: 10.1063/1.5116540. Tags: P12, pre-TACO.
We computed the phase diagram of CO_{2} hydrates at high pressure (HP), from 0.3 to 20 kbar, by means of molecular dynamics simulations. The two CO_{2} hydrates known to occur in this pressure range are the cubic structure I (sI) clathrate and the HP hydrate, whose water framework is the recently discovered ice XVII. We investigated the stability of both hydrates upon heating (melting) as well as the phase changes upon compression. The CO_{2}-filled ice XVII is found to be more stable than the sI clathrate and than the mixture of ice VI and dry ice at pressures ranging from 6 to 18 kbar and in a wide temperature range, although a phenomenological correction suggests that the stability range is more realistically 6.5 to 13.5 kbar. Our simulation results support the current hypothesis that the HP hydrate is stable at temperatures above the melting curve of ice VI.
Singraber, Andreas; Morawietz, Tobias; Behler, Jörg; Dellago, Christoph: Parallel Multistream Training of High-Dimensional Neural Network Potentials. Journal of Chemical Theory and Computation 15(5), 3075–3092 (2019). DOI: 10.1021/acs.jctc.8b01092. Tags: P12, pre-TACO.
Over the past years, high-dimensional neural network potentials (HDNNPs), fitted to accurately reproduce ab initio potential energy surfaces, have become a powerful tool in chemistry, physics and materials science. Here, we focus on the training of the neural networks that lies at the heart of the HDNNP method. We present an efficient approach for optimizing the weight parameters of the neural network via multistream Kalman filtering, using potential energies and forces as reference data. In this procedure, the choice of the free parameters of the Kalman filter can have a significant impact on the fit quality. Carrying out a large parameter study, we determine optimal settings and demonstrate how to optimize training results of HDNNPs. Moreover, we illustrate our HDNNP training approach by revisiting previously presented fits for water and by developing a new potential for copper sulfide. This material, accessible in computer simulations so far only via first-principles methods, forms a particularly complex solid structure at low temperatures and undergoes a phase transition to a superionic state upon heating. Analyzing MD simulations carried out with the Cu_{2}S HDNNP, we confirm that the underlying ab initio reference method indeed reproduces this behavior.
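The Kalman-filter parameter update at the core of such a training scheme can be sketched in a few lines. For clarity, the model below is a one-parameter linear fit rather than a neural network (for which H would be the Jacobian of the network outputs with respect to the weights), and the noise settings R and Q are hypothetical.

```python
import numpy as np

def ekf_update(w, P, H, y, y_pred, R, Q):
    """One extended-Kalman-filter step on model parameters w.
    H: Jacobian of predictions w.r.t. w (m x n), P: parameter covariance,
    R: measurement-noise covariance, Q: process-noise covariance."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    w = w + K @ (y - y_pred)            # parameter update from the residual
    P = P - K @ H @ P + Q               # covariance update
    return w, P

# Toy example: recover y = 2*x with a linear model y = w*x.
rng = np.random.default_rng(0)
w, P = np.zeros(1), np.eye(1)
for _ in range(50):
    x = rng.uniform(-1.0, 1.0, size=(1, 1))   # one "reference data point"
    y = 2.0 * x[:, 0]
    w, P = ekf_update(w, P, x, y, x @ w,
                      R=1e-3 * np.eye(1), Q=1e-6 * np.eye(1))
```

In the multistream variant described in the paper, many such residuals (energies and force components) are processed in parallel streams per update.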
Singraber, Andreas; Behler, Jörg; Dellago, Christoph: Library-Based LAMMPS Implementation of High-Dimensional Neural Network Potentials. Journal of Chemical Theory and Computation 15(3), 1827–1840 (2019). DOI: 10.1021/acs.jctc.8b00770. Tags: P12, pre-TACO.
Neural networks and other machine learning approaches have been successfully used to accurately represent atomic interaction potentials derived from computationally demanding electronic structure calculations. Due to their low computational cost, such representations open up the possibility of large-scale reactive molecular dynamics simulations of processes involving bonding situations that cannot be described accurately with traditional empirical force fields. Here, we present a library of functions developed for the implementation of neural network potentials. Written in C++, this library incorporates several strategies resulting in a very high efficiency of neural network potential-energy and force evaluations. Based on this library, we have developed an implementation of the neural network potential within the molecular dynamics package LAMMPS and demonstrate its performance using liquid water as a test system.
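For orientation, a LAMMPS input using such a neural network pair style typically looks roughly like the fragment below. The pair style name, keywords, weight directory, and cutoff value are placeholders that depend on the library version, so consult the library's own documentation before use.

```
# Hypothetical fragment of a LAMMPS input using a neural network potential.
# The directory with the trained weights, the unit-conversion factors, and
# the cutoff (in LAMMPS length units) are illustrative, not verified settings.
units metal
pair_style nnp dir "nnp-data" showew no maxew 100 &
    cflength 1.8897261328 cfenergy 0.0367493254
pair_coeff * * 6.36
```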
Cheng, Bingqing; Engel, Edgar A.; Behler, Jörg; Dellago, Christoph; Ceriotti, Michele: Ab initio thermodynamics of liquid and solid water. Proceedings of the National Academy of Sciences 116(4), 1110–1115 (2019). DOI: 10.1073/pnas.1815117116. Tags: P12, pre-TACO.
A central goal of computational physics and chemistry is to predict material properties by using first-principles methods based on the fundamental laws of quantum mechanics. However, the high computational costs of these methods typically prevent rigorous predictions of macroscopic quantities at finite temperatures, such as heat capacity, density, and chemical potential. Here, we enable such predictions by marrying advanced free-energy methods with data-driven machine-learning interatomic potentials. We show that, for the ubiquitous and technologically essential system of water, a first-principles thermodynamic description not only leads to excellent agreement with experiments but also reveals the crucial role of nuclear quantum fluctuations in modulating the thermodynamic stabilities of different phases of water.
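One of the simplest free-energy methods of this kind is thermodynamic integration, where the free-energy difference is obtained as ΔF = ∫₀¹ ⟨∂U/∂λ⟩_λ dλ from ensemble averages collected at a series of coupling parameters λ. The sketch below approximates that integral with the trapezoidal rule; the input averages are invented for illustration.

```python
import numpy as np

def thermodynamic_integration(lambdas, du_dlambda_means):
    """Free-energy difference Delta F = integral of <dU/dlambda> over lambda,
    approximated with the trapezoidal rule from per-window ensemble averages."""
    lam = np.asarray(lambdas, dtype=float)
    means = np.asarray(du_dlambda_means, dtype=float)
    return float(np.sum(0.5 * (means[1:] + means[:-1]) * np.diff(lam)))

# Invented averages from five hypothetical simulation windows:
lam = np.linspace(0.0, 1.0, 5)
dF = thermodynamic_integration(lam, 2.0 * lam)  # exact integral of 2*lambda is 1.0
```

In practice each per-window average would come from a long simulation at fixed λ, which is exactly where fast machine-learning potentials make such calculations tractable.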