Tuesday, February 23 at 2 p.m. via Zoom
Speaker: Professor Grant M. Rotskoff, Chemistry, Stanford University
Abstract: The surprising flexibility and undeniable empirical success of machine learning algorithms have inspired many efforts to employ neural networks in computational and theoretical chemistry. Is this a good idea? I will briefly introduce a perspective, based on statistical mechanics, that gives insight into the trainability and accuracy of neural networks in high-dimensional learning problems and also provides some prescriptions and design principles for learning. Bolstered by the performance of these algorithms on high-dimensional problems, I will turn to a central problem in computational chemistry and condensed matter physics---that of computing reaction pathways. Often, these problems appear hopeless without a thorough physical understanding of the reaction mechanism; they are not only high-dimensional but also dominated by rare events. With neural networks in the toolkit, however, the high dimensionality becomes somewhat less intimidating. Putting all of this together, I will describe an algorithm that combines stochastic gradient descent with importance sampling to optimize a representation of a reaction pathway for an arbitrary molecular system. Finally, I will discuss the power and limitations of this approach.