
Deep-DFT: Physics-ML hybrid method to predict DFT energy using Transformer
Youngwoo Cho · Seunghoon Yi · Jaegul Choo · Joonseok Lee · Sookyung Kim

Computing the energy of molecules plays a critical role in molecule design. Classical ab-initio methods using Density Functional Theory (DFT) often suffer from scalability issues due to their extreme computational cost. A growing number of data-driven, neural-network-based DFT surrogate models have been proposed to address this challenge. Once trained on ab-initio reference data, these models significantly accelerate the energy prediction of molecular systems, circumventing the need to numerically solve the Schrödinger equation. However, the performance of these models is often limited to the scope of the training data distribution. It is also challenging to extract physical insights from their predictions due to the lack of interpretability of neural networks. In this paper, we aim to design a physics-ML hybrid DFT surrogate model that is both physically interpretable and generalizable beyond the training data distribution. To achieve these goals, we propose a physics-driven approach that fits the energy to an equation combining Coulomb and Lennard-Jones potentials: the model first predicts the equation's sub-parameters, then computes the energy from that equation. Our experimental results demonstrate the effectiveness of the proposed approach in terms of performance, generalizability, and interpretability.
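To illustrate the kind of physics-driven energy equation the abstract describes, the sketch below computes a total energy as a sum of pairwise Coulomb and Lennard-Jones terms, given per-atom parameters that would, in the proposed approach, be predicted by the neural network. All names, units, and the Lorentz-Berthelot mixing rules are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

# Conversion constant for Coulomb's law in kcal*Angstrom/(mol*e^2),
# a common molecular-mechanics convention (an assumption, not from the paper).
COULOMB_K = 332.0636

def pair_energy(r, qi, qj, eps, sigma):
    """Coulomb + Lennard-Jones energy for a single atom pair at distance r."""
    coulomb = COULOMB_K * qi * qj / r
    sr6 = (sigma / r) ** 6
    lj = 4.0 * eps * (sr6 ** 2 - sr6)  # 4*eps*[(sigma/r)^12 - (sigma/r)^6]
    return coulomb + lj

def total_energy(coords, charges, eps, sigma):
    """Sum pairwise energies over all unique atom pairs.

    coords: (N, 3) atom positions; charges, eps, sigma: per-atom parameters
    (in the hybrid model, these would be the network-predicted sub-parameters).
    Per-pair eps/sigma are combined with Lorentz-Berthelot mixing rules,
    which is an assumption; the paper may parameterize pairs directly.
    """
    n = len(coords)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(coords[i] - coords[j])
            eps_ij = np.sqrt(eps[i] * eps[j])       # Berthelot (geometric mean)
            sigma_ij = 0.5 * (sigma[i] + sigma[j])  # Lorentz (arithmetic mean)
            energy += pair_energy(r, charges[i], charges[j], eps_ij, sigma_ij)
    return energy
```

Because the final energy is an explicit closed-form function of the predicted parameters, each parameter retains a physical meaning (charge, well depth, contact radius), which is the source of the interpretability the abstract claims.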

Author Information

Youngwoo Cho (Korea Advanced Institute of Science and Technology)
Seunghoon Yi (Seoul National University)
Jaegul Choo (Korea Advanced Institute of Science and Technology)
Joonseok Lee (Google Research)

Joonseok Lee is a research engineer at Google Research. He mainly works on content-based video recommendation and multi-modal video representation learning. He earned his Ph.D. in Computer Science from the Georgia Institute of Technology in August 2015, under the supervision of Dr. Guy Lebanon and Prof. Hongyuan Zha. His thesis is about local approaches for collaborative filtering, with recommendation systems as the main application. He completed three internships during his Ph.D.: Amazon (Summer 2014), Microsoft Research (Spring 2014), and Google (Summer 2013). Before coming to Georgia Tech, he worked at NHN Corp. in Korea (2007-2010). He received his B.S. degree in computer science and engineering from Seoul National University, Korea. His paper "Local Collaborative Ranking" received the best student paper award at the ACM WWW (2014) and IEEE ICDM (2016) conferences. He has co-organized the YouTube-8M Large-Scale Video Understanding Workshop as a program chair since 2017, and served as the publicity chair for the AISTATS 2015 conference. He has served as a program committee member for many conferences, including NIPS, ICML, ICLR, AAAI, CVPR, I/ECCV, WSDM, and CIKM, and for journals including JMLR, ACM TIST, and IEEE TKDE. More information is available on his website (http://www.joonseok.net).

Sookyung Kim (PARC)
