Abstract
When creating training data for machine-learned interatomic potentials (MLIPs), it is common to create initial structures and evolve them using molecular dynamics (MD) to sample a larger configuration space. We benchmark two other modalities of evolving structures, contour exploration (CE) and dimer-method (DM) searches, against MD for their ability to produce diverse and robust density functional theory training data sets for MLIPs. We also discuss in detail the generation of initial structures, sourced either from known structures or from random structures, to help formalize structure-sourcing processes in future work. The polymorph-rich zirconium-oxygen composition space is used as a rigorous benchmark system for comparing the performance of MLIPs trained on structures generated by these structural evolution methods. Using Behler–Parrinello neural networks as our MLIP models, we find that CE and DM searches are generally superior to MD in terms of spatial descriptor diversity and statistical accuracy.
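As an illustration of the MD-based sampling workflow the abstract describes (initial structure, MD evolution, periodic snapshots as candidate training configurations), the following minimal sketch uses ASE. It is not the paper's setup: the Cu cell, the EMT calculator (a stand-in for DFT), and all thermostat parameters are placeholder assumptions.

```python
# Illustrative sketch only: evolve an initial structure with MD and collect
# snapshots as candidate MLIP training configurations.
# Assumes ASE; EMT is a placeholder for the DFT calculator used in practice.
from ase.build import bulk
from ase.calculators.emt import EMT
from ase.md.langevin import Langevin
from ase import units

atoms = bulk("Cu", "fcc", a=3.6).repeat((2, 2, 2))  # placeholder initial structure
atoms.calc = EMT()                                   # placeholder for DFT

training_frames = []

def snapshot():
    # Store a copy of the current configuration as a candidate training point.
    training_frames.append(atoms.copy())

dyn = Langevin(atoms, timestep=2.0 * units.fs, temperature_K=800, friction=0.02)
dyn.attach(snapshot, interval=50)  # sample every 50 MD steps
dyn.run(1000)
```

In the paper's comparison, this MD evolution step is the baseline against which CE and DM structure evolution are benchmarked.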
Subject
Condensed Matter Physics, General Materials Science
Cited by
3 articles.