Poster
Injecting Domain Knowledge from Empirical Interatomic Potentials to Neural Networks for Predicting Material Properties
Zeren Shui · Daniel Karls · Mingjian Wen · Ilia Nikiforov · Ellad Tadmor · George Karypis

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #621

For decades, atomistic modeling has played a crucial role in predicting the behavior of materials in numerous fields ranging from nanotechnology to drug discovery. The most accurate methods in this domain are rooted in first-principles quantum mechanical calculations such as density functional theory (DFT). Because these methods have remained computationally prohibitive, practitioners have traditionally focused on defining physically motivated closed-form expressions known as empirical interatomic potentials (EIPs) that approximately model the interactions between atoms in materials. In recent years, neural network (NN)-based potentials trained on quantum mechanical (DFT-labeled) data have emerged as a more accurate alternative to conventional EIPs. However, the generalizability of these models relies heavily on the amount of labeled training data, which is often still insufficient to generate models suitable for general-purpose applications. In this paper, we propose two generic strategies that take advantage of unlabeled training instances to inject domain knowledge from conventional EIPs into NNs in order to increase their generalizability. The first strategy, based on weakly supervised learning, trains an auxiliary classifier on EIPs and selects the best-performing EIP to generate energies to supplement the ground-truth DFT energies in training the NN. The second strategy, based on transfer learning, first pretrains the NN on a large set of easily obtainable EIP energies, and then fine-tunes it on ground-truth DFT energies. Experimental results on three benchmark datasets demonstrate that the first strategy improves baseline NN performance by 5% to 51% while the second improves baseline performance by up to 55%. Combining them further boosts performance.
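
To make the second (transfer-learning) strategy concrete, here is a minimal PyTorch sketch of the two-stage training schedule described above: pretrain an energy model on abundant EIP-labeled energies, then fine-tune it on a smaller set of ground-truth DFT energies. The network architecture, feature dimension, optimizer settings, and synthetic data are illustrative assumptions rather than the paper's actual setup.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Synthetic stand-ins for the two data sources described in the abstract:
# a large pool of cheaply generated EIP-labeled configurations and a small
# set of ground-truth DFT-labeled configurations. The 64-dimensional feature
# vectors are placeholders for an actual atomic-environment representation.
eip_data = TensorDataset(torch.randn(5000, 64), torch.randn(5000))
dft_data = TensorDataset(torch.randn(500, 64), torch.randn(500))
eip_loader = DataLoader(eip_data, batch_size=64, shuffle=True)
dft_loader = DataLoader(dft_data, batch_size=64, shuffle=True)

# A toy fully connected energy model standing in for the NN potential.
model = nn.Sequential(nn.Linear(64, 128), nn.SiLU(), nn.Linear(128, 1))

def train(model, loader, epochs, lr):
    """Minimize the MSE between predicted and reference energies."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for features, energy in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(features).squeeze(-1), energy)
            loss.backward()
            optimizer.step()

# Stage 1: pretrain on the abundant EIP-labeled energies.
train(model, eip_loader, epochs=5, lr=1e-3)

# Stage 2: fine-tune on the scarce ground-truth DFT energies,
# typically with a smaller learning rate.
train(model, dft_loader, epochs=5, lr=1e-4)
```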

Author Information

Zeren Shui (University of Minnesota)
Daniel Karls (University of Minnesota - Twin Cities)
Mingjian Wen (University of Houston)
Ilia Nikiforov (University of Minnesota - Twin Cities)
Ellad Tadmor (University of Minnesota - Twin Cities)

Ellad Tadmor is a Professor of Aerospace Engineering and Mechanics at the University of Minnesota. He received his B.Sc. and M.Sc. in Mechanical Engineering from the Technion - Israel Institute of Technology in 1987 and 1991, and his Ph.D. from Brown University (USA) in 1996. He pioneered computer simulation methods and theories that span multiple length and time scales to predict the behavior of materials and nanostructures, including 2D materials, from their atomic structure. He has published over 75 papers in this area and two graduate-level textbooks. Professor Tadmor leads several efforts to advance the quality and effectiveness of scientific research. He is the Director of the [NSF Open Knowledgebase of Interatomic Models](https://openkim.org), a web-based cyberinfrastructure tasked with developing standards and improving the reliability of molecular simulations, and of the [ColabFit project](https://colabfit.org) for advancing the use of machine learning in materials science. Tadmor is on the Editorial Board of the *Journal of Elasticity*.

George Karypis (University of Minnesota, Minneapolis)
