Pretraining of attention-based deep learning potential model for molecular simulation
npj Computational Materials (IF 9.7). Pub Date: 2024-05-07. DOI: 10.1038/s41524-024-01278-7
Duo Zhang, Hangrui Bi, Fu-Zhi Dai, Wanrun Jiang, Xinzijian Liu, Linfeng Zhang, Han Wang

Machine learning-assisted modeling of the inter-atomic potential energy surface (PES) is revolutionizing the field of molecular simulation. With the accumulation of high-quality electronic structure data, a model that can be pretrained on all available data and finetuned on downstream tasks with a small additional effort would bring the field to a new stage. Here we propose DPA-1, a Deep Potential model with a gated attention mechanism, which is highly effective for representing the conformation and chemical spaces of atomic systems and learning the PES. We tested DPA-1 on a number of systems and observed superior performance compared with existing benchmarks. When pretrained on large-scale datasets containing 56 elements, DPA-1 can be successfully applied to various downstream tasks with a great improvement of sample efficiency. Surprisingly, for different elements, the learned type embedding parameters form a spiral in the latent space and have a natural correspondence with their positions on the periodic table, showing interesting interpretability of the pretrained DPA-1 model.
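The abstract's central component is a gated attention mechanism over atomic environments. As a rough illustration only, the sketch below shows one generic form of gated self-attention applied to per-neighbor embeddings: standard scaled dot-product attention whose output is modulated elementwise by a sigmoid gate computed from the input. All names, shapes, and the specific gating form here are illustrative assumptions, not the actual DPA-1 architecture from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_attention(G, Wq, Wk, Wv, Wg):
    """Hypothetical gated self-attention over neighbor embeddings.

    G  : (n_neighbors, d) embedding of each neighbor atom
    Wq, Wk, Wv, Wg : (d, d) learned projection matrices (assumed names)
    Returns an updated (n_neighbors, d) representation.
    """
    Q, K, V = G @ Wq, G @ Wk, G @ Wv
    d = Q.shape[-1]
    # Scaled dot-product attention across all neighbor pairs.
    A = softmax(Q @ K.T / np.sqrt(d), axis=-1)
    out = A @ V
    # Sigmoid gate derived from the input modulates the attention output.
    gate = 1.0 / (1.0 + np.exp(-(G @ Wg)))
    return gate * out
```

In this sketch the gate lets the network suppress attention updates for neighbors where the plain input signal should dominate; the paper's actual formulation may differ in where the gate is computed and applied.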



Updated: 2024-05-08