ID: hep-ph/9806432

Vegas Revisited: Adaptive Monte Carlo Integration Beyond Factorization

June 22, 1998


Similar papers 4

Quasi-Monte Carlo Software

February 15, 2021

83% Match
Sou-Cheng T. Choi, Fred J. Hickernell, R. Jagadeeswaran, ... , Aleksei G. Sorokin
Mathematical Software
Numerical Analysis

Practitioners wishing to experience the efficiency gains from using low discrepancy sequences need correct, robust, well-written software. This article, based on our MCQMC 2020 tutorial, describes some of the better quasi-Monte Carlo (QMC) software available. We highlight the key software components required by QMC to approximate multivariate integrals or expectations of functions of vector random variables. We have combined these components in QMCPy, a Python open-source lib...
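The core ingredient — a deterministic low-discrepancy point set in place of pseudo-random draws — can be illustrated without any library. This is a minimal sketch, not QMCPy's actual API; the function names are mine:

```python
def halton(n, base):
    """n-th element (1-indexed) of the van der Corput sequence in `base`;
    taken coordinate-wise with coprime bases this gives a Halton point set."""
    x, f = 0.0, 1.0
    while n > 0:
        f /= base
        x += f * (n % base)
        n //= base
    return x

def qmc_integrate(f, dim, n_points, bases=(2, 3, 5)):
    """Average f over a low-discrepancy point set in the unit cube [0,1]^dim."""
    total = 0.0
    for i in range(1, n_points + 1):
        total += f([halton(i, bases[d]) for d in range(dim)])
    return total / n_points

# Exact integral of x1 + x2 + x3 over the unit cube is 1.5.
estimate = qmc_integrate(lambda x: sum(x), dim=3, n_points=2000)
```

For smooth integrands the error decays close to O(1/N) rather than the O(1/sqrt(N)) of plain Monte Carlo, which is the efficiency gain the abstract refers to.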


PAGANI: A Parallel Adaptive GPU Algorithm for Numerical Integration

April 13, 2021

83% Match
Ioannis Sakiotis, Kamesh Arumugam, Marc Paterno, Desh Ranjan, ... , Mohammad Zubair
Distributed, Parallel, and Cluster Computing

We present a new adaptive parallel algorithm for the challenging problem of multi-dimensional numerical integration on massively parallel architectures. Adaptive algorithms have demonstrated the best performance, but efficient many-core utilization is difficult to achieve because the adaptive work-load can vary greatly across the integration space and is impossible to predict a priori. Existing parallel algorithms utilize sequential computations on independent processors, whi...
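The paper's GPU scheduling is far beyond a few lines, but the adaptive idea it parallelizes — repeatedly split the subregion with the largest error estimate — can be sketched serially. The helper names here are mine, not the authors' code:

```python
import heapq

def adaptive_integrate(f, a, b, n_splits=200):
    """Keep a priority queue of subregions ordered by a local error estimate
    (|trapezoid - midpoint|) and repeatedly split the worst one."""
    def region(lo, hi):
        mid = 0.5 * (lo + hi)
        trap = 0.5 * (f(lo) + f(hi)) * (hi - lo)
        midp = f(mid) * (hi - lo)
        # Negate the error so heapq's min-heap pops the largest error first.
        return (-abs(trap - midp), lo, hi, midp)

    heap = [region(a, b)]
    for _ in range(n_splits):
        _, lo, hi, _ = heapq.heappop(heap)
        mid = 0.5 * (lo + hi)
        heapq.heappush(heap, region(lo, mid))
        heapq.heappush(heap, region(mid, hi))
    # Sum the midpoint estimates of all surviving subregions.
    return sum(r[3] for r in heap)

result = adaptive_integrate(lambda x: x * x, 0.0, 1.0)   # exact value: 1/3
```

The unpredictability the abstract mentions is visible here: which region gets split next depends on all previous evaluations, which is what makes many-core load balancing hard.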


MadNIS -- Neural Multi-Channel Importance Sampling

December 12, 2022

83% Match
Theo Heimel, Ramon Winterhalder, Anja Butter, Joshua Isaacson, Claudius Krause, Fabio Maltoni, ... , Tilman Plehn
Computational Physics

Theory predictions for the LHC require precise numerical phase-space integration and generation of unweighted events. We combine machine-learned multi-channel weights with a normalizing flow for importance sampling, to improve classical methods for numerical integration. We develop an efficient bi-directional setup based on an invertible network, combining online and buffered training for potentially expensive integrands. We illustrate our method for the Drell-Yan process wit...
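Stripped of the normalizing flow, the multi-channel weighting can be sketched as sampling from a mixture of channel densities and dividing by the mixture density. This is a toy one-dimensional stand-in, not the MadNIS implementation:

```python
import math
import random

def multichannel_estimate(f, channels, alphas, n_samples, rng):
    """Multi-channel importance sampling on [0, 1]: draw from the mixture
    q(x) = sum_k alpha_k p_k(x) and weight each point by f(x)/q(x)."""
    total = 0.0
    for _ in range(n_samples):
        k = rng.choices(range(len(channels)), weights=alphas)[0]
        x = channels[k][0](rng)                      # channel k's sampler
        q = sum(a * pdf(x) for a, (_, pdf) in zip(alphas, channels))
        total += f(x) / q
    return total / n_samples

rng = random.Random(1)
# Two toy channels: uniform density, and p(x) = 2x sampled as sqrt(u).
channels = [
    (lambda r: r.random(),            lambda x: 1.0),
    (lambda r: math.sqrt(r.random()), lambda x: 2.0 * x),
]
estimate = multichannel_estimate(lambda x: x * x, channels, [0.5, 0.5], 20000, rng)
# Exact integral of x^2 on [0, 1] is 1/3.
```

In MadNIS the fixed channel weights `alphas` become learned functions of the phase-space point and each channel mapping is a normalizing flow; the estimator structure stays the same.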


Monte Carlo Methods in Quantum Field Theory

May 30, 2007

83% Match
I. Montvay
High Energy Physics - Lattice

In these lecture notes some applications of Monte Carlo integration methods in Quantum Field Theory - in particular in Quantum Chromodynamics - are introduced and discussed.


Accelerating HEP simulations with Neural Importance Sampling

January 17, 2024

82% Match
Nicolas Deutschmann, Niklas Götz
Computational Physics
Data Analysis, Statistics and Probability

Virtually all high-energy-physics (HEP) simulations for the LHC rely on Monte Carlo using importance sampling by means of the VEGAS algorithm. However, complex high-precision calculations have become a challenge for the standard toolbox. As a result, there has been keen interest in HEP for modern machine learning to power adaptive sampling. Despite previous work proving that normalizing-flow-powered neural importance sampling (NIS) sometimes outperforms VEGAS, existing resear...
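For reference, the VEGAS idea that this line of work builds on — move grid edges so each bin contributes equally to the accumulated |f| — fits in a short one-dimensional sketch. This is not Lepage's full algorithm (no stratification, no damping, single dimension):

```python
import random

def vegas_1d(f, n_bins=50, n_iter=5, n_samples=2000, seed=0):
    """Importance-sample from a piecewise-constant density defined by adaptive
    bin edges on [0, 1]; after each iteration move the edges so every bin holds
    an equal share of the accumulated |f|."""
    rng = random.Random(seed)
    edges = [i / n_bins for i in range(n_bins + 1)]
    estimate = 0.0
    for _ in range(n_iter):
        weights = [0.0] * n_bins
        total = 0.0
        for _ in range(n_samples):
            k = rng.randrange(n_bins)            # pick a bin uniformly
            lo, hi = edges[k], edges[k + 1]
            x = lo + (hi - lo) * rng.random()
            p = 1.0 / (n_bins * (hi - lo))       # sampling density at x
            fx = f(x)
            total += fx / p
            weights[k] += abs(fx)
        estimate = total / n_samples
        # Rebuild the edges by inverting the cumulative |f| profile.
        cum = [0.0]
        for w in weights:
            cum.append(cum[-1] + w)
        new_edges = [0.0]
        k = 0
        for j in range(1, n_bins):
            goal = cum[-1] * j / n_bins
            while cum[k + 1] < goal:
                k += 1
            span = cum[k + 1] - cum[k]
            frac = (goal - cum[k]) / span if span > 0 else 0.0
            new_edges.append(edges[k] + frac * (edges[k + 1] - edges[k]))
        new_edges.append(1.0)
        edges = new_edges
    return estimate

result = vegas_1d(lambda x: 3.0 * x * x)   # exact integral on [0, 1] is 1
```

The factorized grid is exactly the limitation that neural importance sampling targets: a product of one-dimensional densities cannot adapt to correlated peaks, whereas a normalizing flow can.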


(MC)**3 -- a Multi-Channel Markov Chain Monte Carlo algorithm for phase-space sampling

April 16, 2014

82% Match
Kevin Kroeninger, Steffen Schumann, Benjamin Willenberg
Data Analysis, Statistics and Probability

A new Monte Carlo algorithm for phase-space sampling, named (MC)**3, is presented. It is based on Markov Chain Monte Carlo techniques but at the same time incorporates prior knowledge about the target distribution in the form of suitable phase-space mappings from a corresponding Multi-Channel importance sampling Monte Carlo. The combined approach inherits the benefits of both techniques while typical drawbacks of either solution get ameliorated.
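The combination can be caricatured as an independence Metropolis-Hastings sampler whose proposal is a global importance-sampling channel density rather than a local random walk. This is a toy sketch of that structure, not the (MC)**3 algorithm itself:

```python
import math
import random

def independence_mh(target_pdf, propose, proposal_pdf, n_steps, rng):
    """Metropolis-Hastings whose proposal is a fixed global density (a
    'channel' mapping) instead of a local random-walk step."""
    x = propose(rng)
    chain = []
    for _ in range(n_steps):
        y = propose(rng)
        # Acceptance ratio for an independence proposal.
        a = (target_pdf(y) * proposal_pdf(x)) / (target_pdf(x) * proposal_pdf(y))
        if rng.random() < a:
            x = y
        chain.append(x)
    return chain

rng = random.Random(3)
# Target: pi(x) = 3 x^2 on [0, 1]; channel-style proposal: p(x) = 2 x,
# sampled by inverse transform as sqrt(u).
chain = independence_mh(lambda x: 3.0 * x * x,
                        lambda r: math.sqrt(r.random()),
                        lambda x: 2.0 * x,
                        20000, rng)
mean = sum(chain) / len(chain)   # exact E[x] under pi is 3/4
```

The better the channel density matches the target, the closer the acceptance ratio stays to one — which is the sense in which the phase-space mappings "incorporate prior knowledge" into the chain.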


Model Evidence with Fast Tree Based Quadrature

May 22, 2020

82% Match
Thomas Foster, Chon Lok Lei, Martin Robinson, ... , Ben Lambert
Machine Learning
Mathematical Software
Computation

High dimensional integration is essential to many areas of science, ranging from particle physics to Bayesian inference. Approximating these integrals is hard, due in part to the difficulty of locating and sampling from regions of the integration domain that make significant contributions to the overall integral. Here, we present a new algorithm called Tree Quadrature (TQ) that separates this sampling problem from the problem of using those samples to produce an approximation...


Layered Adaptive Importance Sampling

May 18, 2015

82% Match
L. Martino, V. Elvira, ... , J. Corander
Computation
Machine Learning

Monte Carlo methods represent the "de facto" standard for approximating complicated integrals involving multidimensional target distributions. In order to generate random realizations from the target distribution, Monte Carlo techniques use simpler proposal probability densities to draw candidate samples. The performance of any such method is strictly related to the specification of the proposal distribution, such that unfortunate choices easily wreak havoc on the resulting e...
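A minimal form of such proposal adaptation — re-fit a Gaussian proposal to the importance-weighted samples after each layer — might look like this. Illustrative only; the paper's layered MCMC-driven scheme is considerably more elaborate:

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

def adaptive_is(target, n_iters, n_per_iter, rng):
    """Adapt a Gaussian proposal by re-fitting its mean and width to the
    importance-weighted samples after each iteration ('layer')."""
    mu, sigma = 0.0, 3.0                     # deliberately poor starting proposal
    for _ in range(n_iters):
        xs = [rng.gauss(mu, sigma) for _ in range(n_per_iter)]
        ws = [target(x) / gauss_pdf(x, mu, sigma) for x in xs]
        wsum = sum(ws)
        mu = sum(w * x for w, x in zip(ws, xs)) / wsum
        var = sum(w * (x - mu) ** 2 for w, x in zip(ws, xs)) / wsum
        sigma = max(math.sqrt(var), 1e-3)    # keep the proposal from collapsing
    return mu, sigma

rng = random.Random(11)
# Unnormalized target: a normal centered at 2 (normalization cancels in the fit).
mu, sigma = adaptive_is(lambda x: math.exp(-((x - 2.0) ** 2) / 2.0), 8, 5000, rng)
```

The "havoc" the abstract warns about shows up when the proposal is narrower than the target: a few samples then carry enormous weights and the estimator's variance explodes, which is why the adaptation direction (widen when in doubt) matters.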


Parallelizing MCMC Sampling via Space Partitioning

August 7, 2020

82% Match
Vasyl Hafych, Philipp Eller, ... , Allen Caldwell
Computation
Data Analysis, Statistics and Probability

Efficient sampling of many-dimensional and multimodal density functions is a task of great interest in many research fields. We describe an algorithm that allows parallelizing inherently serial Markov chain Monte Carlo (MCMC) sampling by partitioning the space of the function parameters into multiple subspaces and sampling each of them independently. The samples of the different subspaces are then reweighted by their integral values and stitched back together. This approach a...
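The reweight-and-stitch step can be shown with plain Monte Carlo standing in for the per-subspace MCMC chains — a simplification of the paper's method, with invented function names:

```python
import random

def stitched_estimate(f, g, regions, n_per_region, rng):
    """Estimate E[g] under an unnormalized density f by treating each region
    independently: estimate the per-region integral Z_r and local mean mu_r,
    then stitch via E[g] = sum_r Z_r * mu_r / sum_r Z_r."""
    zs, mus = [], []
    for lo, hi in regions:
        f_sum, gf_sum = 0.0, 0.0
        for _ in range(n_per_region):
            x = lo + (hi - lo) * rng.random()
            fx = f(x)
            f_sum += fx
            gf_sum += g(x) * fx
        zs.append((hi - lo) * f_sum / n_per_region)   # MC estimate of Z_r
        mus.append(gf_sum / f_sum)                    # self-normalized local mean
    z_total = sum(zs)
    return sum(z * m for z, m in zip(zs, mus)) / z_total

rng = random.Random(7)
# Bimodal unnormalized density on [0, 1], symmetric about 0.5, so E[x] = 0.5.
f = lambda x: (x - 0.25) ** 2 * (x - 0.75) ** 2 + 0.01
est = stitched_estimate(f, lambda x: x, [(0.0, 0.5), (0.5, 1.0)], 20000, rng)
```

Because each region is handled independently, the per-region loop is what the paper distributes across workers; only the final reweighting needs the regions' integral estimates.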


Advanced Multilevel Monte Carlo Methods

April 24, 2017

82% Match
Ajay Jasra, Kody Law, Carina Suciu
Computation
Numerical Analysis
Methodology

This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation,...
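The telescoping estimator E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}] can be sketched with a toy level-dependent discretization; the `payoff` function below is an invented stand-in for a real multilevel simulation:

```python
import random

def mlmc(sample_level, n_per_level, rng):
    """Telescoping multilevel estimator: a cheap, heavily sampled base level
    plus lightly sampled correction terms between successive levels."""
    total = 0.0
    for level, n in enumerate(n_per_level):
        acc = 0.0
        for _ in range(n):
            u = rng.random()
            if level == 0:
                acc += sample_level(0, u)
            else:
                # Same random input at both levels, so the correction's
                # variance shrinks as the approximations converge.
                acc += sample_level(level, u) - sample_level(level - 1, u)
        total += acc / n
    return total

def payoff(level, u):
    """Invented level-l approximation: evaluate u^2 on a grid of spacing
    2^-(l+2), standing in for a progressively finer discretization."""
    h = 2.0 ** -(level + 2)
    return (round(u / h) * h) ** 2

rng = random.Random(5)
# E[U^2] for U ~ Uniform(0,1) is 1/3; sample counts fall with level.
est = mlmc(payoff, [40000, 10000, 2500], rng)
```

The cost saving comes from the falling sample counts: coupling the two levels through the same input `u` makes the correction terms low-variance, so only a few expensive fine-level evaluations are needed.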
