Convergence acceleration in machine learning potentials for atomistic simulations†
Abstract
Machine learning potentials (MLPs) for atomistic simulations hold enormous prospective impact for materials modeling, offering orders-of-magnitude speedups over density functional theory (DFT) calculations without appreciably sacrificing accuracy in the prediction of material properties. However, generating the large datasets needed to train MLPs is daunting. Herein, we show that MLP-based material property predictions converge faster with respect to Brillouin zone integration precision than DFT-based property predictions. We demonstrate that this phenomenon is robust across material properties for different metallic systems. Further, we provide statistical error metrics to accurately determine a priori the precision level required of DFT training datasets for MLPs to ensure accelerated convergence of material property predictions, thus significantly reducing the computational expense of MLP development.