September 22, 2020
The purpose of this article is to review the achievements made in the last few years towards the understanding of the reasons behind the success and subtleties of neural network-based machine learning. In the tradition of good old applied mathematics, we will give attention not only to rigorous mathematical results, but also to the insight we have gained from careful numerical experiments as well as the analysis of simplified models. Along the way, we also list the open problems...
August 22, 2023
This paper presents a novel, interdisciplinary study that leverages a Machine Learning (ML) assisted framework to explore the geometry of affine Deligne-Lusztig varieties (ADLV). The primary objective is to investigate the nonemptiness pattern, dimension and enumeration of irreducible components of ADLV. Our proposed framework demonstrates a recursive pipeline of data generation, model training, pattern analysis, and human examination, presenting an intricate interplay betwee...
January 15, 2021
We review, for a general audience, a variety of recent experiments on extracting structure from machine-learning mathematical data that have been compiled over the years. Focusing on supervised machine-learning on labeled data from different fields ranging from geometry to representation theory, from combinatorics to number theory, we present a comparative study of the accuracies on different problems. The paradigm should be useful for conjecture formulation, finding more eff...
March 9, 2021
We use differentiable programming and gradient descent to find unitary matrices that can be used in the period finding algorithm to extract period information from the state of a quantum computer after application of the oracle. The standard procedure is to use the inverse quantum Fourier transform. Our findings suggest that this is not the only unitary matrix appropriate for the period finding algorithm. There exist several unitary matrices that can effect the same t...
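The search described in this abstract can be sketched numerically. The following is a minimal illustration, not the paper's code: a unitary is parametrized as U = expm(-iH) with H Hermitian (so U is unitary by construction), and finite-difference gradient descent pushes it toward the inverse quantum Fourier transform on a 2-qubit (N = 4) register; the target matrix, learning rate, and step count are all assumptions made for the sketch.

```python
# Sketch (illustrative, not the paper's method): gradient descent over
# unitaries via a Hermitian generator, targeting the inverse QFT.
import numpy as np
from scipy.linalg import expm

N = 4  # 2-qubit register
omega = np.exp(-2j * np.pi / N)
# Inverse quantum Fourier transform: the standard period-finding unitary.
iqft = np.array([[omega ** (j * k) for k in range(N)]
                 for j in range(N)]) / np.sqrt(N)

rng = np.random.default_rng(0)
params = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) * 0.1

def hermitian(p):
    # Hermitian part of p, so expm(-1j * H) is exactly unitary.
    return (p + p.conj().T) / 2

def loss(p):
    # Squared Frobenius distance between the learned unitary and the iQFT.
    diff = expm(-1j * hermitian(p)) - iqft
    return float(np.real(np.sum(diff * diff.conj())))

initial_loss = loss(params)
eps, lr = 1e-6, 0.1
for _ in range(200):
    # Finite-difference gradient over real and imaginary parts of each entry
    # (a stand-in for the paper's differentiable-programming machinery).
    grad = np.zeros_like(params)
    for i in range(N):
        for j in range(N):
            e = np.zeros((N, N))
            e[i, j] = eps
            dre = (loss(params + e) - loss(params - e)) / (2 * eps)
            dim = (loss(params + 1j * e) - loss(params - 1j * e)) / (2 * eps)
            grad[i, j] = dre + 1j * dim
    params = params - lr * grad
```

By construction every iterate stays exactly unitary, so the descent explores only candidate period-finding unitaries rather than arbitrary matrices.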
June 20, 2021
For the last few decades, classical machine learning has allowed us to improve the lives of many through automation, natural language processing, predictive analytics and much more. However, a major concern is the fact that we're fast approaching the threshold of the maximum possible computational capacity available to us by means of classical computing devices including CPUs, GPUs and Application Specific Integrated Circuits (ASICs). This is due to the exponential increase ...
September 24, 2020
Neural network-based machine learning is capable of approximating functions in very high dimension with unprecedented efficiency and accuracy. This has opened up many exciting new possibilities, not just in traditional areas of artificial intelligence, but also in scientific computing and computational science. At the same time, machine learning has also acquired the reputation of being a set of "black box" type of tricks, without fundamental principles. This has been a real ...
November 18, 2022
Fano varieties are 'atomic pieces' of algebraic varieties, the shapes that can be defined by polynomial equations. We describe the role of computation and database methods in the construction and classification of Fano varieties, with an emphasis on three-dimensional Fano varieties with mild singularities called Q-Fano threefolds. The classification of Q-Fano threefolds has been open for several decades, but there has been significant recent progress. These advances combine c...
December 7, 2018
Quantum machine learning has the potential for broad industrial applications, and the development of quantum algorithms for improving the performance of neural networks is of particular interest given the central role they play in machine learning today. In this paper we present quantum algorithms for training and evaluating feedforward neural networks based on the canonical classical feedforward and backpropagation algorithms. Our algorithms rely on an efficient quantum subr...
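The classical feedforward and backpropagation loop that this abstract takes as its starting point can be sketched in a few lines. This is an illustrative classical baseline only, not the paper's quantum routine; the network size, learning rate, and toy regression target are assumptions made for the sketch.

```python
# Sketch (illustrative classical baseline): canonical feedforward pass and
# backpropagation for a one-hidden-layer network on a toy regression task.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(64, 2))            # inputs
y = (X[:, :1] ** 2 + X[:, 1:]) / 2      # toy target function

W1 = rng.normal(size=(2, 8)) * 0.5      # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)) * 0.5      # hidden -> output weights
b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    return h, h @ W2 + b2               # prediction

losses = []
lr = 0.1
for _ in range(200):
    h, pred = forward(X)
    err = pred - y                      # dLoss/dpred for 0.5 * MSE
    losses.append(0.5 * float(np.mean(err ** 2)))
    # Backpropagation: chain rule through output and hidden layers.
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)    # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

The forward pass and the gradient computations above are the two subroutines whose quantum counterparts the paper develops.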
May 30, 2021
In modelling complex processes, the potential past data that influence future expectations are immense. Models that track all this data are not only computationally wasteful but also shed little light on what past data most influence the future. There is thus enormous interest in dimensional reduction: finding automated means to reduce the memory dimension of our models while minimizing the impact on their predictive accuracy. Here we construct dimensionally reduced quantum mode...
May 11, 2022
As datasets used in scientific applications become more complex, studying the geometry and topology of data has become an increasingly prevalent part of the data analysis process. This can be seen for example with the growing interest in topological tools such as persistent homology. However, on the one hand, topological tools are inherently limited to providing only coarse information about the underlying space of the data. On the other hand, more geometric approaches rely p...