January 17, 2018
Multilayered artificial neural networks are becoming a pervasive tool in a host of application fields. At the heart of this deep learning revolution are familiar concepts from applied and computational mathematics; notably, in calculus, approximation theory, optimization and linear algebra. This article provides a very brief introduction to the basic ideas that underlie deep learning from an applied mathematics perspective. Our target audience includes postgraduate and final ...
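To make that applied-mathematics viewpoint concrete, here is a minimal from-scratch sketch (in NumPy; the toy task and network sizes are our own choices, not code from the article): a two-layer network is fitted to a synthetic classification problem by gradient descent, with the forward pass built from linear algebra and the backward pass from the chain rule.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                 # synthetic inputs in the plane
y = (X[:, 0] * X[:, 1] > 0).astype(float)     # XOR-like labels (toy problem)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases of a 2 -> 8 -> 1 network.
W1, b1 = 0.5 * rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = 0.5 * rng.normal(size=(8, 1)), np.zeros(1)

lr = 0.5
for step in range(3000):
    # Forward pass: two affine maps with sigmoid nonlinearities.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)[:, 0]
    # Backward pass: chain rule, starting from the cross-entropy gradient.
    dz2 = (p - y)[:, None] / len(y)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * h * (1.0 - h)
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    # Gradient-descent step on every parameter.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("training accuracy:", ((p > 0.5) == y).mean())
```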
January 15, 2021
We review, for a general audience, a variety of recent experiments on extracting structure, via machine learning, from mathematical data that have been compiled over the years. Focusing on supervised machine learning on labeled data from fields ranging from geometry to representation theory, and from combinatorics to number theory, we present a comparative study of the accuracies achieved on different problems. The paradigm should be useful for conjecture formulation, finding more eff...
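For a flavour of that paradigm, here is a toy instance (our own construction, not one of the paper's datasets): a standard classifier is trained on labeled number-theoretic data, with test accuracy as the figure of merit being compared.

```python
# Toy supervised learning on labeled mathematical data: predict whether an
# integer is squarefree from its binary digits (illustrative task only).
from sympy import factorint
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def is_squarefree(n):
    return all(e == 1 for e in factorint(n).values())

N = 2 ** 12
X = [[(n >> k) & 1 for k in range(12)] for n in range(2, N)]  # digit features
y = [int(is_squarefree(n)) for n in range(2, N)]              # labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```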
May 2, 2019
Artificial Intelligence (AI), in its simplest definition, is a technological tool that makes machines intelligent. Since learning is at the core of intelligence, machine learning is a core sub-field of AI. Deep learning, in turn, is a subclass of machine learning that addresses the limitations of its predecessors. AI has gained prominence over the past few years owing to considerable progress in various fields. AI has vastly in...
February 20, 2024
Machine learning techniques are increasingly powerful, leading to many breakthroughs in the natural sciences, but they are often stochastic, error-prone, and black-box. How, then, should they be utilized in fields such as theoretical physics and pure mathematics that place a premium on rigor and understanding? In this Perspective we discuss techniques for obtaining rigor in the natural sciences with machine learning. Non-rigorous methods may lead to rigorous results via conjec...
August 25, 2019
Deep learning has sparked a network of mutual interactions between different disciplines and AI. Naturally, each discipline focuses on and interprets the workings of deep learning in different ways. This diversity of perspectives on deep learning, from neuroscience to statistical physics, is a rich source of inspiration that fuels novel developments in the theory and applications of machine learning. In this perspective, we collect and synthesize different intuitions scattered a...
November 14, 2018
We use deep autoencoder neural networks to draw a chart of the heterotic $\mathbb{Z}_6$-II orbifold landscape. Even though the autoencoder is trained without knowing the phenomenological properties of the $\mathbb{Z}_6$-II orbifold models, we are able to identify fertile islands in this chart where phenomenologically promising models cluster. Then, we apply a decision tree to our chart in order to extract the defining properties of the fertile islands. Based on this informati...
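Schematically, the pipeline looks like the sketch below, with random stand-in features and labels in place of the actual orbifold data: an autoencoder with a two-dimensional bottleneck charts the models without seeing their phenomenological labels, after which a decision tree fitted on the chart yields human-readable defining rules.

```python
import torch
import torch.nn as nn
from sklearn.tree import DecisionTreeClassifier, export_text

torch.manual_seed(0)
X = torch.randn(500, 20)                       # stand-in model features
good = (X[:, 0] + X[:, 1] > 1).numpy()         # stand-in "promising" label

# Autoencoder with a 2D bottleneck; training uses no labels at all.
enc = nn.Sequential(nn.Linear(20, 16), nn.Tanh(), nn.Linear(16, 2))
dec = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 20))
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)

for _ in range(1000):
    opt.zero_grad()
    loss = ((dec(enc(X)) - X) ** 2).mean()     # reconstruction error
    loss.backward()
    opt.step()

chart = enc(X).detach().numpy()                # the 2D chart of the landscape
tree = DecisionTreeClassifier(max_depth=3).fit(chart, good)
print(export_text(tree))                       # readable rules for the islands
```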
February 2, 2021
Modern machine learning techniques, including deep learning, are rapidly being applied, adapted, and developed for high energy physics. Given the fast pace of this research, we have created a living review with the goal of providing a nearly comprehensive list of citations for those developing and applying these approaches to experimental, phenomenological, or theoretical analyses. As a living document, it will be updated as often as possible to incorporate the latest develop...
January 2, 2020
Generative models in deep learning allow for sampling probability distributions that approximate data distributions. We propose using generative models for making approximate statistical predictions in the string theory landscape. For vacua admitting a Lagrangian description this can be thought of as learning random tensor approximations of couplings. As a concrete proof-of-principle, we demonstrate in a large ensemble of Calabi-Yau manifolds that Kähler metrics evaluated at ...
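As a minimal illustration of that underlying operation (a small GAN on synthetic one-dimensional data, not the paper's ensemble or architecture), the generator below learns to sample an approximation of the data distribution:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
real = 0.5 * torch.randn(2048, 1) + 2.0        # stand-in "coupling" samples

G = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
optG = torch.optim.Adam(G.parameters(), lr=1e-3)
optD = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    z = torch.randn(256, 4)
    fake = G(z)
    idx = torch.randint(len(real), (256,))
    # Discriminator: distinguish real samples from generated ones.
    lossD = (bce(D(real[idx]), torch.ones(256, 1))
             + bce(D(fake.detach()), torch.zeros(256, 1)))
    optD.zero_grad(); lossD.backward(); optD.step()
    # Generator: produce samples the discriminator accepts as real.
    lossG = bce(D(fake), torch.ones(256, 1))
    optG.zero_grad(); lossG.backward(); optG.step()

samples = G(torch.randn(10000, 4)).detach()
print("data  mean/std:", real.mean().item(), real.std().item())
print("model mean/std:", samples.mean().item(), samples.std().item())
```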
November 26, 2018
Among the many unsolved puzzles in the theory of Deep Neural Networks (DNNs), three fundamental challenges stand out as demanding solutions: expressibility, optimisability, and generalisability. Although there has been significant progress in seeking answers using various theories, e.g. information bottleneck theory, sparse representation, statistical inference, Riemannian geometry, etc., so far there is no single theory that is able to provide solutions to al...
November 27, 2021
This article is intended for physical scientists who wish to gain deeper insights into machine learning algorithms which we present via the domain they know best, physics. We begin with a review of two energy-based machine learning algorithms, Hopfield networks and Boltzmann machines, and their connection to the Ising model. This serves as a foundation to understand the phenomenon of learning more generally. Equipped with this intuition we then delve into additional, more "pr...
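For a taste of the first of these, here is a compact Hopfield-network sketch (NumPy, using the standard Hebbian construction rather than code from the article): stored patterns become minima of an Ising-like energy $E(s) = -\frac{1}{2}\, s^{T} W s$, so asynchronous updates pull a corrupted state back toward the nearest memory.

```python
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 64))        # three random +-1 patterns
W = sum(np.outer(p, p) for p in patterns) / 64.0    # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                            # no self-coupling

s = patterns[0].copy()
flip = rng.choice(64, size=12, replace=False)       # corrupt 12 of 64 spins
s[flip] *= -1

for _ in range(5):                                  # asynchronous spin updates
    for i in rng.permutation(64):
        s[i] = 1 if W[i] @ s >= 0 else -1           # align with the local field

print("recovered pattern 0:", np.array_equal(s, patterns[0]))
```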