The NIPS 2017 Workshop on Machine Learning for Molecules and Materials is calling for contributions on theoretical models, empirical studies, and applications of machine learning for molecules and materials. We also welcome challenge papers on possible applications or datasets.
We invite submissions that either address new and interesting problems and insights for chemistry and quantum physics or present progress on established problems. The workshop includes a poster session, which offers the opportunity to present novel ideas and ongoing projects.
Topics of interest include (though the list is not exhaustive): chemoinformatics, applications of deep learning to the prediction of molecular properties, drug discovery and materials design, retrosynthesis and synthetic route prediction, modeling and prediction of chemical reaction data, and the analysis of molecular dynamics simulations.
Submission Instructions
Researchers interested in contributing should upload non-anonymized papers of up to 10 pages (including text, figures, and bibliographic references) by Wednesday, November 1, 2017. Papers should adhere to the NIPS conference paper format, using the NIPS LaTeX style file.
Please email all submissions to: qm.nips2017@gmail.com
Deadline:
November 1, 2017
Abstract
The success of machine learning has been demonstrated time and again in classification, generative modelling, and reinforcement learning. Recently, we have also seen promising applications of ML in the natural sciences (chemistry, physics, materials science, neuroscience, and biology), where data is often scarce and costly to obtain. This workshop will focus on the unique challenges of applying machine learning to molecules and materials.
Accurate prediction of chemical and physical properties is a crucial ingredient of rational compound design in the chemical and pharmaceutical industries. Many discoveries in chemistry can be guided by screening large databases of computational molecular structures and properties, but high-level quantum-chemical calculations can take up to several days per molecule or material at the required accuracy, placing the ultimate goal of in silico design out of reach for the foreseeable future. For such problems, the current state of the art remains largely the expertise of individual researchers or, at best, highly specific rule-based heuristic systems. Efficient machine learning methods for property and structure prediction can therefore have a pivotal impact in enabling chemical discovery and fostering fundamental insights.
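As a minimal, purely illustrative sketch of the surrogate-modeling idea described above (not part of the workshop materials), kernel ridge regression can map a vector representation of a molecule to a target property; here both the descriptors and the "property" are synthetic placeholders standing in for a physically motivated representation and quantum-chemical reference data.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """Pairwise Gaussian (RBF) kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_krr(X, y, sigma, lam):
    """Kernel ridge regression: solve (K + lam * I) alpha = y."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_krr(X_train, alpha, X_new, sigma):
    """Predict targets for new descriptors from the dual coefficients."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy stand-in for (descriptor, property) pairs.
X = np.array([[0.0], [0.5], [1.0], [1.5], [2.0]])
y = np.sin(np.pi * X[:, 0])
alpha = fit_krr(X, y, sigma=1.0, lam=1e-8)
y_fit = predict_krr(X, alpha, X, sigma=1.0)  # close to y on the training set
```

Once trained on a set of reference calculations, such a model predicts properties in microseconds rather than days, which is what makes large-scale screening feasible.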
Because of this, the past few years have seen a flurry of work on designing machine learning techniques for molecular [1, 2, 4-11, 13-18, 20, 21, 23-32, 34-38] and materials data [1-3, 5, 6, 12, 19, 24, 33]. These works have drawn inspiration from, and made significant contributions to, areas of machine learning as diverse as learning on graphs and models in natural language processing. Recent advances have enabled the acceleration of molecular dynamics simulations, contributed to a better understanding of interactions within quantum many-body systems, and increased the efficiency of quantum mechanical modeling methods based on density functional theory. This young field offers unique opportunities for machine learning researchers and practitioners, as it presents a wide spectrum of challenges and open questions, including but not limited to representations of physical systems, physically constrained models, manifold learning, interpretability, model bias, and causality.
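To make the graph-learning theme above concrete, here is a minimal, illustrative sketch of message passing on a molecular graph (a simplified scheme in the spirit of the cited work, not a reproduction of any specific model); the molecule, features, and weights are random placeholders.

```python
import numpy as np

def message_passing(node_feats, adj, W_msg, W_upd, steps=2):
    """Minimal message passing on a molecular graph: in each step, every
    atom aggregates its neighbours' transformed states and updates its own;
    a final sum over atoms yields a molecule-level vector."""
    h = node_feats
    for _ in range(steps):
        messages = adj @ h @ W_msg        # sum of transformed neighbour states
        h = np.tanh(h @ W_upd + messages)  # update each atom's state
    return h.sum(axis=0)                   # order-independent readout

# Hypothetical 3-atom molecule: a path graph A-B-C with 4-dim atom features.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
rng = np.random.default_rng(0)
feats = rng.normal(size=(3, 4))
W_msg = 0.1 * rng.normal(size=(4, 4))
W_upd = 0.1 * rng.normal(size=(4, 4))
readout = message_passing(feats, adj, W_msg, W_upd)
```

Relabelling the atoms (permuting rows of the features and the adjacency matrix accordingly) leaves the readout unchanged, which illustrates the kind of invariance requirement that representations of physical systems must satisfy.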
The goal of this workshop is to bring together researchers and industrial practitioners from computer science, chemistry, physics, materials science, and biology, all working to innovate in and apply machine learning to the challenges posed by molecules and materials. In a highly interactive format, we will outline the current frontiers and present emerging research directions. We aim to use this workshop as an opportunity to establish a common language between all communities, to actively discuss new research problems, and to collect datasets against which novel machine learning models can be benchmarked. The program is a collection of invited talks alongside contributed posters. A panel discussion will present the perspectives and experiences of influential researchers from the different fields and open the conversation to all participants. An expected outcome of this workshop is the interdisciplinary exchange of ideas and the initiation of collaborations.
Schedule
08:00 | Opening Remarks | Klaus-Robert Müller

Session: Introduction to Machine Learning and Chemistry
08:20 | Invited Talk | Machine Learning for Molecular Materials Design | Alán Aspuru-Guzik
08:45 | Invited Talk | [TBA] | Robert A. DiStasio Jr.
09:00 | Invited Talk | New Density Functionals Created by Machine Learning | Kieron Burke
09:25 | Q/A Session
09:35 | Poster Spotlights
10:15 | Poster Session & Coffee

Session: Machine Learning Applications in Chemistry
10:45 | Invited Talk | Quantum Machine Learning | O. Anatole von Lilienfeld
11:05 | Invited Talk | Machine Learning in Organic Synthesis Planning and Execution | Klavs F. Jensen
11:25 | Invited Talk | Neural-network Quantum States | Giuseppe Carleo
11:40 | Invited Talk | Quantitative Attribution: Do Neural Network Models Learn the Correct Chemistry? | Lucy Colwell
11:55 | Q/A Session
12:05 | Lunch

Session: Kernel Learning with Structured Data
13:35 | Invited Talk | Differentiable System Learning | Alexander J. Smola
14:00 | Invited Talk | ChemTS: An Efficient Python Library for De Novo Molecular Generation | Koji Tsuda
14:15 | Invited Talk | Symmetry Matters: Learning Scalars and Tensors in Materials and Molecules | Michele Ceriotti
14:35 | Short Talk | Towards Exact Molecular Dynamics Simulations with Machine-Learned Force Fields | Stefan Chmiela
14:45 | Q/A Session
14:55 | Poster Session & Coffee

Session: Deep Learning Approaches
15:25 | Invited Talk | N-body Neural Networks: A General Compositional Architecture For Representing Multiscale Physical Systems | Risi Kondor
15:45 | Invited Talk | Distilling Expensive Simulations with Neural Networks | Oriol Vinyals
16:05 | Invited Talk | Automatic Chemical Design Using a Data-driven Continuous Representation of Molecules | David Duvenaud
16:20 | Invited Talk | Planning Chemical Syntheses with Neural Networks and Monte Carlo Tree Search | Marwin Segler
16:40 | Q/A Session
16:50 | Panel Discussion
17:20 | Closing Remarks | José Miguel Hernández-Lobato
17:35 | Poster Session
Accepted Papers
- A Chemical Bond-based Representation of Materials [arXiv]
- Van-Doan Nguyen, Le Dinh Khiet, Pham Tien Lam, Dam Hieu Chi
- Automatically Extracting Action Graphs From Materials Science Synthesis Procedures (Spotlight Talk) [arXiv]
- Sheshera Mysore, Edward Kim, Emma Strubell, Ao Liu, Haw-Shiuan Chang, Srikrishna Kompella, Kevin Huang, Andrew McCallum, Elsa Olivetti
- Bayesian Protein Optimization
- Stephan Eismann, Karen Sarkisyan, Stefano Ermon
- Calibrated Boosting-forest [arXiv]
- Haozhen Wu
- ChemNet: A Transferable and Generalizable Deep Neural Network for Small-molecule Property Prediction (Spotlight Talk)
- Garrett B. Goh, Charles Siegel, Abhinav Vishnu, Nathan Hodas
- Constrained Bayesian Optimization for Automatic Chemical Design [arXiv]
- Ryan-Rhys Griffiths, José Miguel Hernández-Lobato
- Deep Learning for Prediction of Synergistic Effects of Anti-cancer Drugs
- Kristina Preuer, Richard P.I. Lewis, Sepp Hochreiter, Andreas Bender, Krishna C. Bulusu, Günter Klambauer
- Deep Learning Yields Virtual Assays
- Thomas Unterthiner, Günter Klambauer, Andreas Mayr, Sepp Hochreiter
- End-to-end Learning of Graph Neural Networks for Latent Molecular Representations (Spotlight Talk)
- Masashi Tsubaki, Masashi Shimbo, Atsunori Kanemura, Hideki Asoh
- “Found in translation”: Predicting Outcomes of Complex Organic Chemistry Reactions Using Neural Sequence-to-sequence Models (Spotlight Talk) [arXiv]
- Philippe Schwaller, Théophile Gaudin, Dávid Lányi, Costas Bekas, Teodoro Laino
- Learning a Generative Model for Validity in Complex Discrete Structures (Spotlight Talk) [arXiv]
- David Janz, Jos van der Westhuizen, Brooks Paige, Matt J. Kusner, José Miguel Hernández-Lobato
- Learning Hard Quantum Distributions With Variational Autoencoders [arXiv]
- Andrea Rocchetto, Edward Grant, Sergii Strelchuk, Giuseppe Carleo, Simone Severini
- Ligand Pose Optimization With Atomic Grid-based Convolutional Neural Networks (Spotlight Talk) [arXiv]
- Matthew Ragoza, Lillian Turner, David Ryan Koes
- Machine Learning-enabled Study of Proton Transfer Reaction Mechanisms on Titania Surfaces
- Qian Yang, Muralikrishna Raju, Matthias Ihme, Evan J. Reed
- Neural Network for Learning Universal Atomic Forces
- Pham Tien Lam, Hiori Kino, Takashi Miyake, Nguyen Viet Cuong, Dam Hieu Chi
- Overcoming Data Scarcity With Transfer Learning [arXiv]
- Maxwell L. Hutchinson, Erin Antono, Brenna M. Gibbons, Sean Paradiso, Julia Ling, Bryce Meredig
- Pure Density Functional for Strong Correlations and the Thermodynamic Limit From Machine Learning
- Li Li, Thomas E. Baker, Steven R. White, Kieron Burke
- Semi-supervised Continuous Representation of Molecules (Spotlight Talk) [arXiv]
- Rafael Gómez-Bombarelli, Jennifer N. Wei, David Duvenaud, José Miguel Hernández-Lobato, Benjamín Sánchez-Lengeling, Dennis Sheberla, Timothy D. Hirzel, Jorge Aguilera-Iparraguirre, Ryan P. Adams, Alán Aspuru-Guzik
- Semi-supervised Learning of Hierarchical Representations of Molecules Using Neural Message Passing [arXiv]
- Hai Nguyen, Shin-ichi Maeda, Kenta Oono
- Syntax-directed Variational Autoencoder for Molecule Generation (Spotlight Talk) [PDF]
- Hanjun Dai, Yingtao Tian, Bo Dai, Steven Skiena, Le Song
- Toxicity Prediction Using Self-normalizing Networks
- Günter Klambauer, Thomas Unterthiner, Andreas Mayr, Sepp Hochreiter
- Unsupervised Learning of Dynamical and Molecular Similarity Using Variance Minimization
- Brooke E. Husic, Vijay S. Pande
Sponsors
The Alan Turing Institute
The Alan Turing Institute is the national institute for data science, headquartered at the British Library. Our mission is to make great leaps in data science research in order to change the world for the better.
Research excellence is the foundation of the Institute: the sharpest minds from the data science community investigating the hardest questions. We work with integrity and dedication. Our researchers collaborate across disciplines to generate impact, both through theoretical development and application to real-world problems. We are fuelled by the desire to innovate and add value.
Data science will change the world. We are pioneers; training the next generation of data science leaders, shaping the public conversation, and pushing the boundaries of this new science for the public good.
BenevolentAI: Artificial Intelligence for Scientific Innovation
BenevolentAI has built a leading position in artificial intelligence by developing technologies that deliver previously unimaginable scientific advances, rapidly accelerate scientific innovation, and disrupt traditional methods of scientific discovery. The technology has been validated in drug discovery, specifically in one of the most challenging fields of human biology: the identification of new disease targets.
By amplifying a researcher's ability to grasp an entire corpus of data and to iterate the scientific method at exponentially faster rates, BenevolentAI brings highly advanced tools to traditional R&D programmes, enabling artificial intelligence to be applied to the scientific discovery process. In just two years, the Company has developed a pipeline of twenty-two pre-clinical and clinical drug programmes, a process that normally takes 10 to 15 years.
BenevolentAI is hiring Machine Learning Researchers with expertise in NLP, Reinforcement Learning, or Chemistry Machine Learning to join its New York and London offices.
If you are interested, visit our website | Contact us: careers@benevolent.ai | Follow us: @benevolent_ai
Technische Universität Berlin
With around 32,000 students, circa 100 course offerings, and 40 institutes, the historic Technische Universität Berlin is one of Germany's largest and most internationally renowned technical universities. Located in Germany's capital city, at the heart of Europe, TU Berlin is characterized by outstanding achievements in research and teaching, the imparting of skills to excellent graduates, and a modern service-oriented administration.
With six institutes, 60 professors, and more than 500 scientific staff members, Faculty IV is one of the leading university faculties of its kind in Germany. The Faculty's scientific productivity is reflected both in its large number of publications and in its high level of external funding. Our research collaborations with top universities in North America, Europe, and Asia ensure an ongoing international exchange of ideas and information. The numerous honors awarded to the Faculty's scientists are yet another reason for our outstanding reputation.
Visit our website at http://www.ml.tu-berlin.de/menue/machine_learning/.
References
- [1]
- Behler, J., Lorenz, S., Reuter, K. (2007). Representing molecule-surface interactions with symmetry-adapted neural networks. J. Chem. Phys., 127(1), 07B603.
- [2]
- Behler, J., Parrinello, M. (2007). Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett., 98(14), 146401.
- [3]
- Kang, B., Ceder, G. (2009). Battery materials for ultrafast charging and discharging. Nature, 458(7235), 190.
- [4]
- Bartók, A. P., Payne, M. C., Kondor, R., Csányi, G. (2010). Gaussian approximation potentials: The accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett., 104(13), 136403.
- [5]
- Behler, J. (2011). Atom-centered symmetry functions for constructing high-dimensional neural network potentials. J. Chem. Phys., 134(7), 074106.
- [6]
- Behler, J. (2011). Neural network potential-energy surfaces in chemistry: a tool for large-scale simulations. Phys. Chem. Chem. Phys., 13(40), 17930-17955.
- [7]
- Rupp, M., Tkatchenko, A., Müller, K.-R., von Lilienfeld, O. A. (2012). Fast and accurate modeling of molecular atomization energies with machine learning. Phys. Rev. Lett., 108(5), 058301.
- [8]
- Snyder, J. C., Rupp, M., Hansen, K., Müller, K.-R., Burke, K. (2012). Finding density functionals with machine learning. Phys. Rev. Lett., 108(25), 253002.
- [9]
- Montavon, G., Rupp, M., Gobre, V., Vazquez-Mayagoitia, A., Hansen, K., Tkatchenko, A., Müller, K.-R., von Lilienfeld, O. A. (2013). Machine learning of molecular electronic properties in chemical compound space. New J. Phys., 15(9), 095003.
- [10]
- Hansen, K., Montavon, G., Biegler, F., Fazli, S., Rupp, M., Scheffler, M., Tkatchenko, A., Müller, K.-R. (2013). Assessment and validation of machine learning methods for predicting molecular atomization energies. J. Chem. Theory Comput., 9(8), 3404-3419.
- [11]
- Bartók, A. P., Kondor, R., Csányi, G. (2013). On representing chemical environments. Phys. Rev. B, 87(18), 184115.
- [12]
- Schütt, K. T., Glawe, H., Brockherde, F., Sanna, A., Müller, K.-R., Gross, E. K. U. (2014). How to represent crystal structures for machine learning: towards fast prediction of electronic properties. Phys. Rev. B, 89(20), 205118.
- [13]
- Ramsundar, B., Kearnes, S., Riley, P., Webster, D., Konerding, D., Pande, V. (2015). Massively multitask networks for drug discovery. arXiv preprint arXiv:1502.02072.
- [14]
- Rupp, M., Ramakrishnan, R., & von Lilienfeld, O. A. (2015). Machine learning for quantum mechanical properties of atoms in molecules. J. Phys. Chem. Lett., 6(16), 3309-3313.
- [15]
- Botu, V., Ramprasad, R. (2015). Learning scheme to predict atomic forces and accelerate materials simulations. Phys. Rev. B, 92(9), 094306.
- [16]
- Hansen, K., Biegler, F., Ramakrishnan, R., Pronobis, W., von Lilienfeld, O. A., Müller, K.-R., Tkatchenko, A. (2015). Machine learning predictions of molecular properties: Accurate many-body potentials and nonlocality in chemical space. J. Phys. Chem. Lett, 6(12), 2326-2331.
- [17]
- Alipanahi, B., Delong, A., Weirauch, M. T., Frey, B. J. (2015). Predicting the sequence specificities of DNA-and RNA-binding proteins by deep learning. Nat. Biotechnol., 33(8), 831-838.
- [18]
- Duvenaud, D. K., Maclaurin, D., Aguilera-Iparraguirre, J., Gomez-Bombarelli, R., Hirzel, T., Aspuru-Guzik, A., Adams, R. P. (2015). Convolutional networks on graphs for learning molecular fingerprints. NIPS, 2224-2232.
- [19]
- Faber, F. A., Lindmaa, A., von Lilienfeld, O. A., Armiento, R. (2016). Machine learning energies of 2 million elpasolite (ABC2D6) crystals. Phys. Rev. Lett., 117(13), 135502.
- [20]
- Gomez-Bombarelli, R., Duvenaud, D., Hernandez-Lobato, J. M., Aguilera-Iparraguirre, J., Hirzel, T. D., Adams, R. P., Aspuru-Guzik, A. (2016). Automatic chemical design using a data-driven continuous representation of molecules. arXiv preprint arXiv:1610.02415.
- [21]
- Wei, J. N., Duvenaud, D, Aspuru-Guzik, A. (2016). Neural networks for the prediction of organic chemistry reactions. ACS Cent. Sci., 2(10), 725-732.
- [22]
- Sadowski, P., Fooshee, D., Subrahmanya, N., Baldi, P. (2016). Synergies between quantum mechanics and machine learning in reaction prediction. J. Chem. Inf. Model., 56(11), 2125-2128.
- [23]
- Lee, A. A., Brenner, M. P., Colwell L. J. (2016). Predicting protein-ligand affinity with a random matrix framework. Proc. Natl. Acad. Sci., 113(48), 13564-13569.
- [24]
- Behler, J. (2016). Perspective: Machine learning potentials for atomistic simulations. J. Chem. Phys., 145(17), 170901.
- [25]
- De, S., Bartók, A. P., Csányi, G., Ceriotti, M. (2016). Comparing molecules and solids across structural and alchemical space. Phys. Chem. Chem. Phys., 18(20), 13754-13769.
- [26]
- Schütt, K. T., Arbabzadah, F., Chmiela, S., Müller, K.-R., Tkatchenko, A. (2017). Quantum-chemical insights from deep tensor neural networks. Nat. Commun., 8, 13890.
- [27]
- Segler, M. H., Waller, M. P. (2017). Neural-symbolic machine learning for retrosynthesis and reaction prediction. Chem. Eur. J., 23(25), 5966-5971.
- [28]
- Kusner, M. J., Paige, B., Hernández-Lobato, J. M. (2017). Grammar variational autoencoder. arXiv preprint arXiv:1703.01925.
- [29]
- Coley, C. W., Barzilay, R., Jaakkola, T. S., Green, W. H., Jensen K. F. (2017). Prediction of organic reaction outcomes using machine learning. ACS Cent. Sci., 3(5), 434-443.
- [30]
- Altae-Tran, H., Ramsundar, B., Pappu, A. S., Pande, V. (2017). Low data drug discovery with one-shot learning. ACS Cent. Sci., 3(4), 283-293.
- [31]
- Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., Dahl, G. E. (2017). Neural message passing for quantum chemistry. arXiv preprint arXiv:1704.01212.
- [32]
- Chmiela, S., Tkatchenko, A., Sauceda, H. E., Poltavsky, I., Schütt, K. T., Müller, K.-R. (2017). Machine learning of accurate energy-conserving molecular force fields. Sci. Adv., 3(5), e1603015.
- [33]
- Ju, S., Shiga, T., Feng, L., Hou, Z., Tsuda, K., Shiomi, J. (2017). Designing nanostructures for phonon transport via Bayesian optimization. Phys. Rev. X, 7(2), 021024.
- [34]
- Ramakrishnan, R., von Lilienfeld, O. A. (2017). Machine learning, quantum chemistry, and chemical space. Reviews in Computational Chemistry, 225-256.
- [35]
- Hernandez-Lobato, J. M., Requeima, J., Pyzer-Knapp, E. O., Aspuru-Guzik, A. (2017). Parallel and distributed Thompson sampling for large-scale accelerated exploration of chemical space. arXiv preprint arXiv:1706.01825.
- [36]
- Smith, J., Isayev, O., Roitberg, A. E. (2017). ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chem. Sci., 8(4), 3192-3203.
- [37]
- Brockherde, F., Li, L., Burke, K., Müller, K.-R. (2017). By-passing the Kohn-Sham equations with machine learning. Nat. Commun., 8, 872.
- [38]
- Schütt, K. T., Kindermans, P.-J., Sauceda, H. E., Chmiela, S., Tkatchenko, A., Müller, K.-R. (2017). MolecuLeNet: A continuous-filter convolutional neural network for modeling quantum interactions. NIPS (accepted).
Contact: Please direct any questions to qm.nips2017@gmail.com.