Website: http://www.quantum-machine.org/workshops/nips2018/
The success of machine learning has been demonstrated time and time again in classification, generative modelling, and reinforcement learning. This revolution has largely taken place in domains with at least one of two key properties: (1) the input space is continuous, so classifiers and generative models can smoothly model unseen data that is ‘similar’ to the training distribution, or (2) data is trivial to generate, as in controlled reinforcement learning settings such as Atari or Go, where agents can replay the game millions of times.
Unfortunately, many important learning problems in chemistry, physics, materials science, and biology do not share these attractive properties: problems where the input is molecular or material data.
Accurate prediction of atomistic properties is a crucial ingredient of rational compound design in the chemical and pharmaceutical industries. Many discoveries in chemistry can be guided by screening large databases of computational molecular structures and properties, but high-level quantum-chemical calculations can take up to several days per molecule or material at the required accuracy, placing the ultimate goal of in silico design out of reach for the foreseeable future. The current state of the art for such problems remains, in large part, the expertise of individual researchers or, at best, highly specific rule-based heuristic systems. Efficient machine learning methods, applied to the prediction of atomistic properties as well as to compound design and crystal structure prediction, can therefore have a pivotal impact in enabling chemical discovery and fostering fundamental insights.
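To illustrate the surrogate-model idea behind such methods, the following sketch trains a kernel ridge regression model on synthetic data standing in for molecular descriptors and target properties. It is a minimal, hypothetical example in the spirit of Coulomb-matrix kernel models [7], not an implementation of any particular published method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for molecular descriptors (e.g. flattened Coulomb
# matrices [7]) and a target property (e.g. atomization energy).
X_train = rng.normal(size=(200, 16))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=200)
X_test = rng.normal(size=(20, 16))

def gaussian_kernel(A, B, sigma=4.0):
    """Pairwise Gaussian kernel between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# Kernel ridge regression: solve (K + lambda*I) alpha = y once at training time.
lam = 1e-3
K = gaussian_kernel(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)

# Predicting a property for an unseen "molecule" now costs a kernel
# evaluation rather than a days-long quantum-chemical calculation.
y_pred = gaussian_kernel(X_test, X_train) @ alpha
print(y_pred.shape)  # (20,)
```

The appeal is that the expensive quantum-chemical computation is paid only for the training set; inference on new candidates is nearly free, which is what makes large-scale screening feasible.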
Because of this, the past few years have seen a flurry of work on machine learning techniques for molecule and material data [1-38]. These works have drawn inspiration from, and made significant contributions to, areas of machine learning as diverse as learning on graphs and models in natural language processing. Recent advances have accelerated molecular dynamics simulations, contributed to a better understanding of interactions within quantum many-body systems, and increased the efficiency of density-based quantum mechanical modelling methods. This young field offers unique opportunities for machine learning researchers and practitioners, as it presents a wide spectrum of challenges and open questions, including but not limited to representations of physical systems, physically constrained models, manifold learning, interpretability, model bias, and causality.
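As a concrete example of learning on graphs, molecules are naturally graphs of atoms connected by bonds, and the message-passing family of models [31] operates directly on this structure. The sketch below is a deliberately simplified, hypothetical illustration of one such layer (random weights, toy adjacency matrix), not any specific published architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy molecular graph: 5 atoms with a symmetric adjacency matrix A
# (1 = bond) and initial atom features H (e.g. element-type embeddings).
A = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 1, 0],
              [0, 1, 0, 0, 1],
              [0, 1, 0, 0, 0],
              [0, 0, 1, 0, 0]], dtype=float)
H = rng.normal(size=(5, 8))
W = rng.normal(size=(8, 8)) / np.sqrt(8)  # shared learned weights (random here)

# Three message-passing steps: each atom sums its neighbours' features,
# then applies the shared transformation and a nonlinearity.
for _ in range(3):
    messages = A @ H                  # aggregate over bonded neighbours
    H = np.tanh((H + messages) @ W)   # update atom states

# A permutation-invariant readout (here a sum over atoms) yields a
# molecule-level representation for property prediction.
mol_embedding = H.sum(axis=0)
print(mol_embedding.shape)  # (8,)
```

Because both the neighbour aggregation and the sum readout are invariant to atom ordering, the same model applies to molecules of any size, which is one reason graph-based models have become a workhorse in this area.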
The goal of this workshop is to bring together researchers and industrial practitioners in the fields of computer science, chemistry, physics, materials science, and biology, all working to innovate and apply machine learning to challenges involving molecules and materials. In a highly interactive format, we will outline the current frontiers and present emerging research directions. We aim to use this workshop as an opportunity to establish a common language between all communities, to actively discuss new research problems, and to collect datasets against which novel machine learning models can be benchmarked. The program is a collection of invited talks alongside contributed posters. A panel discussion will provide the perspectives and experiences of influential researchers from both fields and encourage open conversation among participants. An expected outcome of this workshop is the interdisciplinary exchange of ideas and the initiation of collaborations.
Call for papers:
The one-day NIPS 2018 Workshop on Machine Learning for Molecules and Materials is calling for contributions on theoretical models, empirical studies, and applications of machine learning for molecules and materials. We also welcome challenge papers on possible applications or datasets. Topics of interest include (but are not limited to): chemoinformatics, applications of deep learning to the prediction of molecular properties, drug discovery and material design, retrosynthesis and synthetic route prediction, modeling and prediction of chemical reaction data, and the analysis of molecular dynamics simulations. We invite submissions that either address new problems and insights for chemistry and quantum physics or present progress on established problems. The workshop includes a poster session, giving the opportunity to present novel ideas and ongoing projects. Submissions should be no longer than 10 pages in any format. Please email all submissions to: nips2018moleculesworkshop@gmail.com
References
[1] Behler, J., Lorenz, S., Reuter, K. (2007). Representing molecule-surface interactions with symmetry-adapted neural networks. J. Chem. Phys., 127(1), 07B603.
[2] Behler, J., Parrinello, M. (2007). Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett., 98(14), 146401.
[3] Kang, B., Ceder, G. (2009). Battery materials for ultrafast charging and discharging. Nature, 458(7235), 190.
[4] Bartók, A. P., Payne, M. C., Kondor, R., Csányi, G. (2010). Gaussian approximation potentials: The accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett., 104(13), 136403.
[5] Behler, J. (2011). Atom-centered symmetry functions for constructing high-dimensional neural network potentials. J. Chem. Phys., 134(7), 074106.
[6] Behler, J. (2011). Neural network potential-energy surfaces in chemistry: a tool for large-scale simulations. Phys. Chem. Chem. Phys., 13(40), 17930-17955.
[7] Rupp, M., Tkatchenko, A., Müller, K.-R., von Lilienfeld, O. A. (2012). Fast and accurate modeling of molecular atomization energies with machine learning. Phys. Rev. Lett., 108(5), 058301.
[8] Snyder, J. C., Rupp, M., Hansen, K., Müller, K.-R., Burke, K. (2012). Finding density functionals with machine learning. Phys. Rev. Lett., 108(25), 253002.
[9] Montavon, G., Rupp, M., Gobre, V., Vazquez-Mayagoitia, A., Hansen, K., Tkatchenko, A., Müller, K.-R., von Lilienfeld, O. A. (2013). Machine learning of molecular electronic properties in chemical compound space. New J. Phys., 15(9), 095003.
[10] Hansen, K., Montavon, G., Biegler, F., Fazli, S., Rupp, M., Scheffler, M., Tkatchenko, A., Müller, K.-R. (2013). Assessment and validation of machine learning methods for predicting molecular atomization energies. J. Chem. Theory Comput., 9(8), 3404-3419.
[11] Bartók, A. P., Kondor, R., Csányi, G. (2013). On representing chemical environments. Phys. Rev. B, 87(18), 184115.
[12] Schütt, K. T., Glawe, H., Brockherde, F., Sanna, A., Müller, K.-R., Gross, E. K. U. (2014). How to represent crystal structures for machine learning: towards fast prediction of electronic properties. Phys. Rev. B, 89(20), 205118.
[13] Ramsundar, B., Kearnes, S., Riley, P., Webster, D., Konerding, D., Pande, V. (2015). Massively multitask networks for drug discovery. arXiv preprint arXiv:1502.02072.
[14] Rupp, M., Ramakrishnan, R., & von Lilienfeld, O. A. (2015). Machine learning for quantum mechanical properties of atoms in molecules. J. Phys. Chem. Lett., 6(16), 3309-3313.
[15] Botu, V., Ramprasad, R. (2015). Learning scheme to predict atomic forces and accelerate materials simulations. Phys. Rev. B, 92(9), 094306.
[16] Hansen, K., Biegler, F., Ramakrishnan, R., Pronobis, W., von Lilienfeld, O. A., Müller, K.-R., Tkatchenko, A. (2015). Machine learning predictions of molecular properties: Accurate many-body potentials and nonlocality in chemical space. J. Phys. Chem. Lett., 6(12), 2326-2331.
[17] Alipanahi, B., Delong, A., Weirauch, M. T., Frey, B. J. (2015). Predicting the sequence specificities of DNA-and RNA-binding proteins by deep learning. Nat. Biotechnol., 33(8), 831-838.
[18] Duvenaud, D. K., Maclaurin, D., Aguilera-Iparraguirre, J., Gomez-Bombarelli, R., Hirzel, T., Aspuru-Guzik, A., Adams, R. P. (2015). Convolutional networks on graphs for learning molecular fingerprints. NIPS, 2224-2232.
[19] Faber, F. A., Lindmaa, A., von Lilienfeld, O. A., Armiento, R. (2016). Machine learning energies of 2 million elpasolite (ABC2D6) crystals. Phys. Rev. Lett., 117(13), 135502.
[20] Gomez-Bombarelli, R., Duvenaud, D., Hernandez-Lobato, J. M., Aguilera-Iparraguirre, J., Hirzel, T. D., Adams, R. P., Aspuru-Guzik, A. (2016). Automatic chemical design using a data-driven continuous representation of molecules. arXiv preprint arXiv:1610.02415.
[21] Wei, J. N., Duvenaud, D, Aspuru-Guzik, A. (2016). Neural networks for the prediction of organic chemistry reactions. ACS Cent. Sci., 2(10), 725-732.
[22] Sadowski, P., Fooshee, D., Subrahmanya, N., Baldi, P. (2016). Synergies between quantum mechanics and machine learning in reaction prediction. J. Chem. Inf. Model., 56(11), 2125-2128.
[23] Lee, A. A., Brenner, M. P., Colwell L. J. (2016). Predicting protein-ligand affinity with a random matrix framework. Proc. Natl. Acad. Sci., 113(48), 13564-13569.
[24] Behler, J. (2016). Perspective: Machine learning potentials for atomistic simulations. J. Chem. Phys., 145(17), 170901.
[25] De, S., Bartók, A. P., Csányi, G., Ceriotti, M. (2016). Comparing molecules and solids across structural and alchemical space. Phys. Chem. Chem. Phys., 18(20), 13754-13769.
[26] Schütt, K. T., Arbabzadah, F., Chmiela, S., Müller, K.-R., Tkatchenko, A. (2017). Quantum-chemical insights from deep tensor neural networks. Nat. Commun., 8, 13890.
[27] Segler, M. H., Waller, M. P. (2017). Neural-symbolic machine learning for retrosynthesis and reaction prediction. Chem. Eur. J., 23(25), 5966-5971.
[28] Kusner, M. J., Paige, B., Hernández-Lobato, J. M. (2017). Grammar variational autoencoder. arXiv preprint arXiv:1703.01925.
[29] Coley, C. W., Barzilay, R., Jaakkola, T. S., Green, W. H., Jensen K. F. (2017). Prediction of organic reaction outcomes using machine learning. ACS Cent. Sci., 3(5), 434-443.
[30] Altae-Tran, H., Ramsundar, B., Pappu, A. S., Pande, V. (2017). Low data drug discovery with one-shot learning. ACS Cent. Sci., 3(4), 283-293.
[31] Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., Dahl, G. E. (2017). Neural message passing for quantum chemistry. arXiv preprint arXiv:1704.01212.
[32] Chmiela, S., Tkatchenko, A., Sauceda, H. E., Poltavsky, I., Schütt, K. T., Müller, K.-R. (2017). Machine learning of accurate energy-conserving molecular force fields. Sci. Adv., 3(5), e1603015.
[33] Ju, S., Shiga T., Feng L., Hou Z., Tsuda, K., Shiomi J. (2017). Designing nanostructures for phonon transport via bayesian optimization. Phys. Rev. X, 7(2), 021024.
[34] Ramakrishnan, R., von Lilienfeld, O. A. (2017). Machine learning, quantum chemistry, and chemical space. Reviews in Computational Chemistry, 225-256.
[35] Hernandez-Lobato, J. M., Requeima, J., Pyzer-Knapp, E. O., Aspuru-Guzik, A. (2017). Parallel and distributed Thompson sampling for large-scale accelerated exploration of chemical space. arXiv preprint arXiv:1706.01825.
[36] Smith, J., Isayev, O., Roitberg, A. E. (2017). ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chem. Sci., 8(4), 3192-3203.
[37] Brockherde, F., Li, L., Burke, K., Müller, K.-R. (2017). By-passing the Kohn-Sham equations with machine learning. Nat. Commun., 8, 872.
[38] Schütt, K. T., Kindermans, P.-J., Sauceda, H. E., Chmiela, S., Tkatchenko, A., Müller, K.-R. (2017). SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. NIPS, 30.
Author Information
Jose Miguel Hernández-Lobato (University of Cambridge)
Klaus-Robert Müller (TU Berlin)
Brooks Paige (Alan Turing Institute)
Matt Kusner (University of Oxford)
Stefan Chmiela (Technische Universität Berlin)
Kristof Schütt (TU Berlin)