# Machine-learning methods for structure prediction of multi-component perovskites

### Subproject P09

The connection between the composition and function of complex multi-component oxides is intricate, and our knowledge of it is extremely limited. Current models can at most predict whether a stoichiometric composition is stable, which is only the coarsest structural information. P09 will develop accelerated machine-learning (ML) models to predict the structural details that determine the functionality of perovskites. We will implement two approaches:

First, evolutionary algorithms (EAs) will be combined with a neural-network (NN) potential trained on the fly to explore the energy landscape of perovskite surfaces quickly and predict their detailed structures. In collaboration with experimental partners (P02 Diebold, P04 Parkinson), those predictions will be tested, and potentially falsified, by direct comparison with diffraction data from existing surfaces. Additionally, the implementation, inputs, and results of the machine-learned force fields (MLFFs) will be shared with the theoretical partners for cross-validation.

Second, generative adversarial networks (GANs) will be trained on known compositions to identify the key features of real perovskite structures and to propose new stable ones.
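To make the first approach concrete, here is a minimal, self-contained sketch of an on-the-fly surrogate loop: an evolutionary search ranks candidates with a cheap model, verifies only the most promising one against an expensive oracle, and retrains on the new label. Everything here is a toy stand-in chosen for illustration (an analytic "energy", polynomial descriptors, a linear surrogate); the actual project would couple an EA such as CMA-ES to an NN potential and DFT.

```python
import numpy as np

rng = np.random.default_rng(0)

def dft_energy(x):
    # Stand-in for an expensive ab-initio evaluation (toy double-well landscape).
    return float(np.sum(x**4 - 2.0 * x**2))

def descriptors(x):
    # Hypothetical descriptor vector; real work would use atomic-environment
    # descriptors rather than raw polynomial features.
    return np.concatenate([x**2, x**4, [1.0]])

class Surrogate:
    """Linear least-squares model retrained on the fly as labels arrive."""

    def __init__(self):
        self.X, self.y, self.w = [], [], None

    def add(self, x, e):
        self.X.append(descriptors(x))
        self.y.append(e)
        self.w, *_ = np.linalg.lstsq(np.asarray(self.X), np.asarray(self.y),
                                     rcond=None)

    def predict(self, x):
        return float(descriptors(x) @ self.w)

dim, pop, sigma = 4, 16, 0.3
best = rng.normal(size=dim)
best_e = dft_energy(best)
model = Surrogate()
model.add(best, best_e)

for generation in range(40):
    # Mutate, rank cheaply with the surrogate, verify only the champion with
    # the expensive oracle, and feed that new label back into the model.
    candidates = best + sigma * rng.normal(size=(pop, dim))
    champion = min(candidates, key=model.predict)
    e = dft_energy(champion)
    model.add(champion, e)
    if e < best_e:
        best, best_e = champion, e

print(best_e)  # monotonically improved from the random starting point
```

The design point is that the oracle is called once per generation rather than once per candidate, which is where the "orders-of-magnitude savings" of surrogate-assisted search come from.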

## Expertise

We develop and apply atomistic models for theoretical chemistry and materials science. Our expertise covers both classical and quantum methods, as well as multiscale calculations and machine-learning techniques. The group has taken part in the development and public release of a range of packages for atomistic calculations, including:

- WIEN2k, a popular all-electron density functional theory implementation;
- BoltzTraP and BoltzTraP2, two packages used to interpolate electronic band structures and calculate transport coefficients;
- ShengBTE, the first open-source solver of the Boltzmann transport equation for phonons, which enables predictive calculations of the thermal conductivity of nanostructures;
- almaBTE, a software package for multiscale thermal transport simulation based on first principles;
- Clinamen, an implementation of the covariance matrix adaptation evolutionary algorithm that helps explore complex energy landscapes.

These are some of the methods we have used to study solids, liquids, surfaces, and nanostructures:

- Density functional theory (DFT);
- Classical and ab-initio molecular dynamics (MD);
- Self-consistent anharmonic free energy calculations;
- The Boltzmann transport equation (BTE);
- Traditional and particle-filter Monte Carlo (MC);
- Covariance matrix adaptation evolutionary algorithm (CMA-ES);
- Classification and regression random forests based on phenomenological information;
- Algorithmically differentiable machine-learning (ML) force fields based on JAX;
- High-throughput (HT) materials screening.
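The "algorithmically differentiable ML force fields based on JAX" item above can be made concrete: once the potential energy is written as a differentiable function of the atomic positions, forces come for free as the negative gradient. The sketch below uses a hypothetical toy descriptor and a trivial one-parameter "network" (not the group's actual code) purely to show the JAX pattern.

```python
import jax
import jax.numpy as jnp

def toy_descriptors(positions):
    # Hypothetical descriptor: inverse pairwise distances. A real model would
    # use richer atomic-environment descriptors (e.g. spherical Bessel ones).
    diff = positions[:, None, :] - positions[None, :, :]
    n = positions.shape[0]
    # Adding the identity keeps the self-distance finite on the diagonal.
    r = jnp.sqrt(jnp.sum(diff**2, axis=-1) + jnp.eye(n))
    return 1.0 / r

def energy(params, positions):
    # Minimal differentiable "potential": a nonlinear readout per atom, summed.
    d = toy_descriptors(positions).sum(axis=1)
    return jnp.sum(jnp.tanh(d * params["w"]) * params["b"])

# Automatic differentiation w.r.t. the positions gives dE/dR directly,
# so forces F = -dE/dR need no hand-derived expressions.
grad_e = jax.grad(lambda pos, p: energy(p, pos))

params = {"w": 0.5, "b": 1.0}
pos = jnp.array([[0.0, 0.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 1.2, 0.0]])
f = -grad_e(pos, params)
print(f.shape)  # (3, 3): one force vector per atom
```

Because the toy energy depends only on interatomic distances, it is translation-invariant, and the forces sum to zero, a quick sanity check that the gradient is physically sensible.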

## Publications

### 2021

Montes-Campos, Hadrián; Carrete, Jesús; Varela, Luis M; Madsen, Georg K H

*A Differentiable Neural-Network Force Field for Ionic Liquids* (journal article)

In: Pre-Print (arXiv:2106.16220), 2021.

Tags: P09, pre-print

```bibtex
@article{MontesCampos2021,
  title     = {A Differentiable Neural-Network Force Field for Ionic Liquids},
  author    = {Hadrián Montes-Campos and Jesús Carrete and Luis M Varela and Georg K H Madsen},
  year      = {2021},
  date      = {2021-06-30},
  journal   = {Pre-Print (arXiv:2106.16220)},
  abstract  = {We present NeuralIL, a model for the potential energy of an ionic liquid that accurately reproduces first-principles results with orders-of-magnitude savings in computational cost. Based on a multilayer perceptron and spherical Bessel descriptors of the atomic environments, NeuralIL is implemented in such a way as to be fully automatically differentiable. It can thus be trained on ab-initio forces instead of just energies, to make the most out of the available data, and can efficiently predict arbitrary derivatives of the potential energy. We parametrize the model for the case of ethylammonium nitrate. We discuss the best way to include chemical information in the atom-centered descriptors for a many-component system. Furthermore, we demonstrate an ensemble-learning approach to the detection of extrapolation. With out-of-sample accuracies better than 0.1 kcal/mol in the energies and 100 meV/Å in the forces, our potential model considerably outperforms molecular-mechanics force fields and opens the door to large-scale thermodynamical calculations with ab-initio-like accuracy for ionic liquids. Including the forces does away with the idea that vast amounts of atomic configurations are required to train a neural network force field based on atom-centered descriptors. We also find that a separate treatment of long-range interactions is not required to achieve a high-quality representation of the potential energy surface of these dense ionic systems.},
  keywords  = {P09, pre-print},
  pubstate  = {published},
  tppubtype = {article}
}
```
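The abstract mentions an ensemble-learning approach to detecting extrapolation. A generic sketch of that idea, with toy data and bootstrap-trained polynomial members standing in for neural networks (this is an illustration of the general technique, not the NeuralIL implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: the ensemble is fitted only on x in [-1, 1], so any query well
# outside that interval is an extrapolation.
x_train = rng.uniform(-1.0, 1.0, 64)
y_train = np.sin(3.0 * x_train)

def fit_member(seed):
    # Each member is trained on a bootstrap resample of the data, so the
    # members agree in-distribution and diverge out-of-distribution.
    r = np.random.default_rng(seed)
    idx = r.integers(0, x_train.size, x_train.size)
    return np.polyfit(x_train[idx], y_train[idx], deg=5)

members = [fit_member(seed) for seed in range(8)]

def disagreement(x):
    # The spread across members serves as an extrapolation flag.
    preds = np.array([np.polyval(coeffs, x) for coeffs in members])
    return preds.std(axis=0)

print(disagreement(0.3) < disagreement(2.5))  # True: spread grows off-distribution
```

In a force-field setting, the same spread can be monitored during a simulation to decide when a configuration lies outside the training set and a new ab-initio label is needed.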
