Workshop
Sat Dec 08 05:00 AM -- 03:30 PM (PST) @ Room 519
Machine Learning for Molecules and Materials
José Miguel Hernández-Lobato · Klaus-Robert Müller · Brooks Paige · Matt Kusner · Stefan Chmiela · Kristof Schütt

Website http://www.quantum-machine.org/workshops/nips2018/

The success of machine learning has been demonstrated time and time again in classification, generative modelling, and reinforcement learning. This revolution in machine learning has largely taken place in domains with at least one of two key properties: (1) the input space is continuous, so classifiers and generative models can smoothly generalize to unseen data that is ‘similar’ to the training distribution, or (2) it is trivial to generate data, as in controlled reinforcement learning settings such as Atari or Go, where agents can replay the game millions of times.
Unfortunately, many important learning problems in chemistry, physics, materials science, and biology, problems where the input is molecular or material data, do not share these attractive properties.

Accurate prediction of atomistic properties is a crucial ingredient toward rational compound design in the chemical and pharmaceutical industries. Many discoveries in chemistry can be guided by screening large databases of computational molecular structures and properties, but high-level quantum-chemical calculations can take up to several days per molecule or material at the required accuracy, placing the ultimate achievement of in silico design out of reach for the foreseeable future. In large part, the current state of the art for such problems is the expertise of individual researchers or, at best, highly specific rule-based heuristic systems. Efficient methods in machine learning, applied to the prediction of atomistic properties as well as to compound design and crystal structure prediction, can therefore have a pivotal impact in enabling chemical discovery and fostering fundamental insights.
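As a concrete illustration of this surrogate-modelling idea, the minimal sketch below (not taken from any specific workshop contribution) fits a kernel ridge regression model on sorted Coulomb-matrix eigenvalues, in the spirit of the descriptors used in [7, 10]. The molecules, target values, and hyperparameters are random placeholders chosen only to make the snippet runnable; a real application would use computed quantum-chemical data.

import numpy as np
from sklearn.kernel_ridge import KernelRidge

def coulomb_eigenvalues(Z, R, n_max):
    # Sorted eigenvalue spectrum of the Coulomb matrix, zero-padded to n_max atoms.
    n = len(Z)
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                M[i, j] = 0.5 * Z[i] ** 2.4                          # self-interaction term
            else:
                M[i, j] = Z[i] * Z[j] / np.linalg.norm(R[i] - R[j])  # nuclear repulsion term
    eig = np.sort(np.linalg.eigvalsh(M))[::-1]
    return np.pad(eig, (0, n_max - n))

# Toy placeholder data: random 5-atom "molecules" with synthetic target values.
rng = np.random.default_rng(0)
n_mol, n_atoms = 200, 5
Zs = rng.integers(1, 9, size=(n_mol, n_atoms))          # nuclear charges
Rs = rng.normal(scale=1.5, size=(n_mol, n_atoms, 3))    # Cartesian coordinates
X = np.array([coulomb_eigenvalues(Z, R, n_atoms) for Z, R in zip(Zs, Rs)])
y = X.sum(axis=1) + rng.normal(scale=0.1, size=n_mol)   # stand-in "energies"

# Laplacian-kernel ridge regression, as used for atomization energies in [7, 10].
model = KernelRidge(kernel="laplacian", alpha=1e-6, gamma=1e-2)
model.fit(X[:150], y[:150])
print("held-out MAE:", np.abs(model.predict(X[150:]) - y[150:]).mean())

In practice, the same fit-and-predict pattern applies regardless of the descriptor; the choice of representation and kernel is where much of the cited work differs.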

Because of this, the past few years have seen a flurry of work on designing machine learning techniques for molecule and material data [1-38]. These works have drawn inspiration from, and made significant contributions to, areas of machine learning ranging from learning on graphs to models in natural language processing. Recent advances have accelerated molecular dynamics simulations, contributed to a better understanding of interactions within quantum many-body systems, and increased the efficiency of density-based quantum-mechanical modeling methods. This young field offers unique opportunities for machine learning researchers and practitioners, as it presents a wide spectrum of challenges and open questions, including but not limited to representations of physical systems, physically constrained models, manifold learning, interpretability, model bias, and causality.
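For readers less familiar with graph-based models, the following minimal NumPy sketch shows the message-passing pattern underlying approaches such as [18, 31]: atoms carry feature vectors, each step aggregates features over bonded neighbours, and a permutation-invariant readout produces a molecule-level representation. The adjacency matrix, features, and weights here are untrained placeholders used purely for illustration, not parameters from any published model.

import numpy as np

def mpnn_forward(A, H, weights):
    # A: (n, n) adjacency matrix, H: (n, d) atom features,
    # weights: list of (W_msg, W_self) pairs, one per message-passing step.
    for W_msg, W_self in weights:
        messages = A @ H @ W_msg                     # sum features over bonded neighbours
        H = np.maximum(messages + H @ W_self, 0.0)   # ReLU node update
    return H.sum(axis=0)                             # permutation-invariant graph readout

rng = np.random.default_rng(0)
d = 8
A = np.array([[0, 1, 0, 0],                          # toy 4-atom chain "molecule"
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = rng.normal(size=(4, d))                          # placeholder atom features
weights = [(rng.normal(scale=0.1, size=(d, d)),      # untrained placeholder weights
            rng.normal(scale=0.1, size=(d, d))) for _ in range(3)]
graph_vector = mpnn_forward(A, H, weights)
print(graph_vector.shape)   # (8,) -- would be fed to a downstream property regressor

The readout's invariance to atom ordering is the key design choice: it lets the same model handle molecules of different sizes and compositions.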

The goal of this workshop is to bring together researchers and industrial practitioners in the fields of computer science, chemistry, physics, materials science, and biology who are working to innovate and apply machine learning to the challenges involving molecules and materials. In a highly interactive format, we will outline the current frontiers and present emerging research directions. We aim to use this workshop as an opportunity to establish a common language between all communities, to actively discuss new research problems, and to collect datasets against which novel machine learning models can be benchmarked. The program is a collection of invited talks alongside contributed posters. A panel discussion will provide different perspectives and experiences of influential researchers from these fields and engage participants in open conversation. An expected outcome of this workshop is the interdisciplinary exchange of ideas and the initiation of collaborations.

Call for papers:

The one-day NIPS 2018 Workshop on Machine Learning for Molecules and Materials is calling for contributions on theoretical models, empirical studies, and applications of machine learning for molecules and materials. We also welcome challenge papers on possible applications or datasets. Topics of interest include (but are not limited to): chemoinformatics, applications of deep learning to the prediction of molecular properties, drug discovery and material design, retrosynthesis and synthetic route prediction, modeling and prediction of chemical reaction data, and the analysis of molecular dynamics simulations. We invite submissions that either address new problems and insights for chemistry and quantum physics or present progress on established problems. The workshop includes a poster session, giving the opportunity to present novel ideas and ongoing projects. Submissions should be no longer than 10 pages in any format. Please email all submissions to: nips2018moleculesworkshop@gmail.com

References
[1] Behler, J., Lorenz, S., Reuter, K. (2007). Representing molecule-surface interactions with symmetry-adapted neural networks. J. Chem. Phys., 127(1), 07B603.
[2] Behler, J., Parrinello, M. (2007). Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett., 98(14), 146401.
[3] Kang, B., Ceder, G. (2009). Battery materials for ultrafast charging and discharging. Nature, 458(7235), 190.
[4] Bartók, A. P., Payne, M. C., Kondor, R., Csányi, G. (2010). Gaussian approximation potentials: The accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett., 104(13), 136403.
[5] Behler, J. (2011). Atom-centered symmetry functions for constructing high-dimensional neural network potentials. J. Chem. Phys., 134(7), 074106.
[6] Behler, J. (2011). Neural network potential-energy surfaces in chemistry: a tool for large-scale simulations. Phys. Chem. Chem. Phys., 13(40), 17930-17955.
[7] Rupp, M., Tkatchenko, A., Müller, K.-R., von Lilienfeld, O. A. (2012). Fast and accurate modeling of molecular atomization energies with machine learning. Phys. Rev. Lett., 108(5), 058301.
[8] Snyder, J. C., Rupp, M., Hansen, K., Müller, K.-R., Burke, K. (2012). Finding density functionals with machine learning. Phys. Rev. Lett., 108(25), 253002.
[9] Montavon, G., Rupp, M., Gobre, V., Vazquez-Mayagoitia, A., Hansen, K., Tkatchenko, A., Müller, K.-R., von Lilienfeld, O. A. (2013). Machine learning of molecular electronic properties in chemical compound space. New J. Phys., 15(9), 095003.
[10] Hansen, K., Montavon, G., Biegler, F., Fazli, S., Rupp, M., Scheffler, M., Tkatchenko, A., Müller, K.-R. (2013). Assessment and validation of machine learning methods for predicting molecular atomization energies. J. Chem. Theory Comput., 9(8), 3404-3419.
[11] Bartók, A. P., Kondor, R., Csányi, G. (2013). On representing chemical environments. Phys. Rev. B, 87(18), 184115.
[12] Schütt, K. T., Glawe, H., Brockherde, F., Sanna, A., Müller, K.-R., Gross, E. K. U. (2014). How to represent crystal structures for machine learning: towards fast prediction of electronic properties. Phys. Rev. B, 89(20), 205118.
[13] Ramsundar, B., Kearnes, S., Riley, P., Webster, D., Konerding, D., Pande, V. (2015). Massively multitask networks for drug discovery. arXiv preprint arXiv:1502.02072.
[14] Rupp, M., Ramakrishnan, R., & von Lilienfeld, O. A. (2015). Machine learning for quantum mechanical properties of atoms in molecules. J. Phys. Chem. Lett., 6(16), 3309-3313.
[15] Botu, V., Ramprasad, R. (2015). Learning scheme to predict atomic forces and accelerate materials simulations. Phys. Rev. B, 92(9), 094306.
[16] Hansen, K., Biegler, F., Ramakrishnan, R., Pronobis, W., von Lilienfeld, O. A., Müller, K.-R., Tkatchenko, A. (2015). Machine learning predictions of molecular properties: Accurate many-body potentials and nonlocality in chemical space. J. Phys. Chem. Lett., 6(12), 2326-2331.
[17] Alipanahi, B., Delong, A., Weirauch, M. T., Frey, B. J. (2015). Predicting the sequence specificities of DNA- and RNA-binding proteins by deep learning. Nat. Biotechnol., 33(8), 831-838.
[18] Duvenaud, D. K., Maclaurin, D., Aguilera-Iparraguirre, J., Gomez-Bombarelli, R., Hirzel, T., Aspuru-Guzik, A., Adams, R. P. (2015). Convolutional networks on graphs for learning molecular fingerprints. NIPS, 2224-2232.
[19] Faber, F. A., Lindmaa, A., von Lilienfeld, O. A., Armiento, R. (2016). Machine learning energies of 2 million elpasolite (ABC2D6) crystals. Phys. Rev. Lett., 117(13), 135502.
[20] Gomez-Bombarelli, R., Duvenaud, D., Hernandez-Lobato, J. M., Aguilera-Iparraguirre, J., Hirzel, T. D., Adams, R. P., Aspuru-Guzik, A. (2016). Automatic chemical design using a data-driven continuous representation of molecules. arXiv preprint arXiv:1610.02415.
[21] Wei, J. N., Duvenaud, D., Aspuru-Guzik, A. (2016). Neural networks for the prediction of organic chemistry reactions. ACS Cent. Sci., 2(10), 725-732.
[22] Sadowski, P., Fooshee, D., Subrahmanya, N., Baldi, P. (2016). Synergies between quantum mechanics and machine learning in reaction prediction. J. Chem. Inf. Model., 56(11), 2125-2128.
[23] Lee, A. A., Brenner, M. P., Colwell, L. J. (2016). Predicting protein-ligand affinity with a random matrix framework. Proc. Natl. Acad. Sci., 113(48), 13564-13569.
[24] Behler, J. (2016). Perspective: Machine learning potentials for atomistic simulations. J. Chem. Phys., 145(17), 170901.
[25] De, S., Bartók, A. P., Csányi, G., Ceriotti, M. (2016). Comparing molecules and solids across structural and alchemical space. Phys. Chem. Chem. Phys., 18(20), 13754-13769.
[26] Schütt, K. T., Arbabzadah, F., Chmiela, S., Müller, K.-R., Tkatchenko, A. (2017). Quantum-chemical insights from deep tensor neural networks. Nat. Commun., 8, 13890.
[27] Segler, M. H., Waller, M. P. (2017). Neural-symbolic machine learning for retrosynthesis and reaction prediction. Chem. Eur. J., 23(25), 5966-5971.
[28] Kusner, M. J., Paige, B., Hernández-Lobato, J. M. (2017). Grammar variational autoencoder. arXiv preprint arXiv:1703.01925.
[29] Coley, C. W., Barzilay, R., Jaakkola, T. S., Green, W. H., Jensen, K. F. (2017). Prediction of organic reaction outcomes using machine learning. ACS Cent. Sci., 3(5), 434-443.
[30] Altae-Tran, H., Ramsundar, B., Pappu, A. S., Pande, V. (2017). Low data drug discovery with one-shot learning. ACS Cent. Sci., 3(4), 283-293.
[31] Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., Dahl, G. E. (2017). Neural message passing for quantum chemistry. arXiv preprint arXiv:1704.01212.
[32] Chmiela, S., Tkatchenko, A., Sauceda, H. E., Poltavsky, I., Schütt, K. T., Müller, K.-R. (2017). Machine learning of accurate energy-conserving molecular force fields. Sci. Adv., 3(5), e1603015.
[33] Ju, S., Shiga, T., Feng, L., Hou, Z., Tsuda, K., Shiomi, J. (2017). Designing nanostructures for phonon transport via Bayesian optimization. Phys. Rev. X, 7(2), 021024.
[34] Ramakrishnan, R., von Lilienfeld, O. A. (2017). Machine learning, quantum chemistry, and chemical space. Reviews in Computational Chemistry, 225-256.
[35] Hernandez-Lobato, J. M., Requeima, J., Pyzer-Knapp, E. O., Aspuru-Guzik, A. (2017). Parallel and distributed Thompson sampling for large-scale accelerated exploration of chemical space. arXiv preprint arXiv:1706.01825.
[36] Smith, J., Isayev, O., Roitberg, A. E. (2017). ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chem. Sci., 8(4), 3192-3203.
[37] Brockherde, F., Li, L., Burke, K., Müller, K.-R. (2017). Bypassing the Kohn-Sham equations with machine learning. Nat. Commun., 8, 872.
[38] Schütt, K. T., Kindermans, P. J., Sauceda, H. E., Chmiela, S., Tkatchenko, A., Müller, K.-R. (2017). SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. NIPS, 30.

Schedule
Contributed Work (Posters)
Invited Talk Session 1 (Talk)
Invited Talk Session 2 (Talk)
Invited Talk Session 3 (Talk)
Invited Talk Session 4 (Talk)