

Poster

Scientific Consistency Improves upon Multi-Task Learning in Molecular Science

Yuxuan Ren · Dihan Zheng · Chang Liu · Peiran Jin · Yu Shi · Lin Huang · Jiyan He · Shengjie Luo · Tao Qin · Tie-Yan Liu


Abstract:

In recent years, machine learning has demonstrated impressive capability on molecular science tasks. To support various molecular properties at scale, machine learning models are trained in the multi-task learning paradigm. Nevertheless, data for different molecular properties are often not aligned: some quantities, e.g., equilibrium structure, are costlier to compute than others, e.g., energy, so their data are often generated by cheaper computational methods at the cost of lower accuracy, a gap that multi-task learning alone cannot close. Moreover, it is not straightforward to leverage abundant data from other tasks to benefit a particular task. To handle these data heterogeneity challenges, we exploit a specialty of molecular tasks: scientific laws connect them. We design consistency training approaches that allow different tasks to exchange information directly so as to improve one another. In particular, we demonstrate that the more accurate energy data can improve the accuracy of structure prediction. We also find that consistency training can directly leverage force and off-equilibrium structure data to improve structure prediction, demonstrating a broad capability for integrating heterogeneous data.
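One scientific law that connects two of the tasks mentioned above is the force–energy relation F = -dE/dx: a model's force predictions and its energy predictions are not independent, so disagreement between them can be penalized as an extra training signal. The following is a minimal, self-contained sketch of such a consistency loss; the toy `energy_model` and `force_model` functions are hypothetical one-dimensional stand-ins invented for illustration, not the paper's actual networks or training objective.

```python
def energy_model(x, k=1.0):
    # Toy "energy head": a harmonic potential E(x) = 0.5 * k * x^2.
    return 0.5 * k * x * x

def force_model(x, k=1.0):
    # Toy "force head": the exact force -k*x plus a constant bias,
    # simulating a force task trained on less accurate data.
    return -k * x + 0.05

def consistency_loss(x, h=1e-4):
    # Penalize disagreement between the force head and the force implied
    # by the energy head, F = -dE/dx, estimated by central differences.
    dE_dx = (energy_model(x + h) - energy_model(x - h)) / (2 * h)
    return (force_model(x) - (-dE_dx)) ** 2

# Average the consistency penalty over a grid of sample structures.
xs = [0.1 * i for i in range(-10, 11)]
loss = sum(consistency_loss(x) for x in xs) / len(xs)
```

In a real multi-task setup, a term like `loss` would be added to the per-task supervised losses, letting the (typically more accurate) energy data constrain the force predictions, and vice versa, without requiring paired labels for both quantities on the same molecule.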
