Staged Training of Machine-Learning Potentials from Small to Large Surface Unit Cells: Efficient Global Structure Determination of the RuO2(100)-c(2 × 2) Reconstruction and (410) Vicinal

Yonghyuk Lee, Jakob Timmermann, Chiara Panosetti, Christoph Scheurer, Karsten Reuter

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Machine-learning (ML) potentials trained with density functional theory (DFT) data boost the sampling capabilities in first-principles global surface structure determination. Particular data efficiency is thereby achieved by iterative training protocols that blend the creation of new training data with the actual surface exploration process. Here, we extend this to a staged training from small to large surface unit cells. With many geometric motifs learned from small unit cell data, successively fewer new DFT structures in computationally demanding large surface unit cells need to be queried. We demonstrate the fully automated workflow in the context of rutile RuO2 surfaces. For a Gaussian approximation potential (GAP) initially trained on (1 × 1) surface structures, only limited additional data are necessary to efficiently recover the only recently identified structures for the RuO2(100)-c(2 × 2) reconstruction. The same holds when retraining this GAP for the (410) vicinal, the optimized structure of which is found to involve c(2 × 2) reconstructed terraces. Due to the high stability of this structure, (410) vicinals appear in the predicted Wulff equilibrium nanoparticle shape.
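The staged, uncertainty-driven querying described in the abstract can be illustrated with a minimal toy sketch. This is NOT the authors' workflow or the GAP code: `dft_energy`, `NearestNeighborModel`, the sampling ranges standing in for growing unit cells, and the uncertainty threshold are all hypothetical stand-ins chosen only to show the control flow (train on cheap small-cell data first, then query the expensive reference calculation in larger cells only where the surrogate is uncertain).

```python
# Hedged sketch of a staged active-learning loop. All names here are
# illustrative stand-ins, not the paper's actual implementation: a real
# workflow would fit a GAP to DFT energies/forces and drive a global
# structure search, not a 1D toy function.
import random

random.seed(0)


def dft_energy(x):
    """Stand-in for an expensive DFT calculation (toy double-well)."""
    return (x ** 2 - 1.0) ** 2


class NearestNeighborModel:
    """Toy surrogate: predicts from the closest training point; its
    'uncertainty' proxy is simply the distance to that point."""

    def __init__(self):
        self.data = []  # list of (x, energy) training pairs

    def fit(self, points):
        self.data.extend(points)

    def predict(self, x):
        nearest = min(self.data, key=lambda p: abs(p[0] - x))
        return nearest[1], abs(nearest[0] - x)


def staged_training(stages, n_samples=50, threshold=0.15):
    """Each stage samples a wider range (mimicking a larger unit cell),
    but only queries dft_energy where the surrogate is uncertain."""
    model = NearestNeighborModel()
    dft_calls_per_stage = []
    for lo, hi in stages:
        stage_calls = 0
        for _ in range(n_samples):
            x = random.uniform(lo, hi)
            if not model.data:
                uncertain = True  # no data yet: always query
            else:
                _, dist = model.predict(x)
                uncertain = dist > threshold
            if uncertain:
                # Expensive reference call only where the model is unsure.
                model.fit([(x, dft_energy(x))])
                stage_calls += 1
        dft_calls_per_stage.append(stage_calls)
    return dft_calls_per_stage


# Stages mimic growing cells: a (1 x 1)-like small range first, then
# progressively larger ones that reuse everything already learned.
calls = staged_training([(-1, 1), (-2, 2), (-3, 3)])
print(calls)
```

In the sketch, later stages query the reference function mostly in the regions the earlier stages did not cover, which is the data-efficiency mechanism the abstract describes at the level of surface unit cells.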

Original language: English
Pages (from-to): 17599-17608
Number of pages: 10
Journal: Journal of Physical Chemistry C
Volume: 127
Issue number: 35
DOIs
State: Published - 7 Sep 2023
