ID: 1204.5243

Repulsive Mixtures

April 24, 2012


Similar papers 3

BayesBinMix: an R Package for Model Based Clustering of Multivariate Binary Data

September 22, 2016

86% Match
Panagiotis Papastamoulis, Magnus Rattray
Computation

The BayesBinMix package offers a Bayesian framework for clustering binary data with or without missing values by fitting mixtures of multivariate Bernoulli distributions with an unknown number of components. It allows the joint estimation of the number of clusters and model parameters using Markov chain Monte Carlo sampling. Heated chains are run in parallel and accelerate the convergence to the target posterior distribution. Identifiability issues are addressed by implementi...
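The core model here, a finite mixture of multivariate Bernoulli distributions, can be illustrated with a bare-bones Gibbs sampler. This is a hedged toy sketch, not the package's implementation: K is fixed (BayesBinMix treats it as unknown), there are no heated parallel chains, and the data, the Beta(1,1)/Dirichlet(1) priors, and the iteration count are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two well-separated multivariate Bernoulli clusters
X = np.vstack([rng.binomial(1, 0.9, size=(50, 8)),
               rng.binomial(1, 0.1, size=(50, 8))])
n, d = X.shape
K = 2  # fixed here for simplicity; BayesBinMix estimates K jointly

z = rng.integers(K, size=n)  # initial random cluster assignments
for _ in range(200):
    # Sample weights and success probabilities given assignments
    counts = np.bincount(z, minlength=K)
    ones = np.vstack([X[z == k].sum(axis=0) for k in range(K)])
    theta = rng.beta(1 + ones, 1 + counts[:, None] - ones)  # Beta(1,1) prior
    w = rng.dirichlet(1 + counts)                           # Dirichlet(1) prior
    # Sample assignments given parameters
    logp = np.log(w) + X @ np.log(theta).T + (1 - X) @ np.log1p(-theta).T
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = (rng.random(n)[:, None] < p.cumsum(axis=1)).argmax(axis=1)
```

With well-separated clusters like these, the two blocks of observations are recovered within a few sweeps.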


Flexible clustering via hidden hierarchical Dirichlet priors

January 18, 2022

86% Match
Antonio Lijoi, Igor Prünster, Giovanni Rebaudo
Methodology
Statistics Theory

The Bayesian approach to inference stands out for naturally allowing borrowing information across heterogeneous populations, with different samples possibly sharing the same distribution. A popular Bayesian nonparametric model for clustering probability distributions is the nested Dirichlet process, which however has the drawback of grouping distributions in a single cluster when ties are observed across samples. With the goal of achieving a flexible and effective clustering ...


A Bayesian approach for clustering skewed data using mixtures of multivariate normal-inverse Gaussian distributions

May 6, 2020

86% Match
Yuan Fang, Dimitris Karlis, Sanjeena Subedi
Computation

Non-Gaussian mixture models are gaining increasing attention for mixture model-based clustering particularly when dealing with data that exhibit features such as skewness and heavy tails. Here, such a mixture distribution is presented, based on the multivariate normal inverse Gaussian (MNIG) distribution. For parameter estimation of the mixture, a Bayesian approach via Gibbs sampler is used; for this, a novel approach to simulate univariate generalized inverse Gaussian random...


CPU- and GPU-based Distributed Sampling in Dirichlet Process Mixtures for Large-scale Analysis

April 19, 2022

86% Match
Or Dinari, Raz Zamir, ..., Oren Freifeld
Machine Learning

In the realm of unsupervised learning, Bayesian nonparametric mixture models, exemplified by the Dirichlet Process Mixture Model (DPMM), provide a principled approach for adapting the complexity of the model to the data. Such models are particularly useful in clustering tasks where the number of clusters is unknown. Despite their potential and mathematical elegance, however, DPMMs have yet to become a mainstream tool widely adopted by practitioners. This is arguably due to a ...
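The DPMM's ability to adapt the number of clusters to the data comes from its Chinese restaurant process (CRP) representation of the prior over partitions. A minimal sketch of simulating that prior follows; it is not the paper's distributed CPU/GPU sampler, and the concentration value alpha=1.0 is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(4)

def crp_partition(n, alpha):
    """Draw a partition of n items from the Chinese restaurant process."""
    z, counts = [0], [1]  # first item starts the first cluster
    for _ in range(1, n):
        # Join cluster k w.p. counts[k] / (i + alpha); new cluster w.p. alpha / (i + alpha)
        p = np.array(counts + [alpha], dtype=float)
        k = int(rng.choice(len(p), p=p / p.sum()))
        if k == len(counts):
            counts.append(1)   # open a new cluster
        else:
            counts[k] += 1
        z.append(k)
    return z, counts

z, counts = crp_partition(200, alpha=1.0)
```

The number of occupied clusters grows with the sample size (roughly logarithmically), which is what makes the DPMM useful when the number of clusters is unknown.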


Normalized Random Measures with Interacting Atoms for Bayesian Nonparametric Mixtures

February 17, 2023

86% Match
Mario Beraha, Raffaele Argiento, ..., Alessandra Guglielmi
Statistics Theory
Probability
Methodology

The study of almost surely discrete random probability measures is an active line of research in Bayesian nonparametrics. The idea of assuming interaction across the atoms of the random probability measure has recently spurred significant interest in the context of Bayesian mixture models. This allows the definition of priors that encourage well separated and interpretable clusters. In this work, we provide a unified framework for the construction and the Bayesian analysis of...
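One concrete way atoms can interact, in the spirit of repulsive mixture priors, is to multiply an i.i.d. base prior on the component locations by a pairwise penalty that vanishes as two atoms coincide. A hedged sketch of evaluating one such log-penalty; the functional form prod_{i<j}(1 - exp(-||mu_i - mu_j||^2 / tau)) and the scale tau are illustrative choices, not the paper's construction.

```python
import numpy as np

def log_repulsion(mu, tau=1.0):
    """Log of prod_{i<j} (1 - exp(-||mu_i - mu_j||^2 / tau)).

    Tends to -inf as any two atoms coincide, and to 0 as all atoms separate,
    so it encourages well separated, interpretable clusters.
    """
    mu = np.asarray(mu, dtype=float)
    d2 = ((mu[:, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)
    i, j = np.triu_indices(len(mu), k=1)
    return float(np.log1p(-np.exp(-d2[i, j] / tau)).sum())

# Well-separated atoms are barely penalized; nearby ones strongly so
far = log_repulsion([[0.0], [5.0]])
near = log_repulsion([[0.0], [0.1]])
```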


Stochastic Gradient MCMC with Repulsive Forces

November 30, 2018

86% Match
Victor Gallego, David Rios Insua
Machine Learning

We propose a unifying view of two different Bayesian inference algorithms, Stochastic Gradient Markov Chain Monte Carlo (SG-MCMC) and Stein Variational Gradient Descent (SVGD), leading to improved and efficient novel sampling schemes. We show that SVGD combined with a noise term can be framed as a multiple chain SG-MCMC method. Instead of treating each parallel chain independently from others, our proposed algorithm implements a repulsive force between particles, avoiding col...
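The repulsive mechanism is visible in the plain SVGD update: the kernel-weighted score pulls particles toward high-density regions, while the kernel-gradient term pushes them apart, avoiding collapse. A minimal 1-D sketch for a standard-normal target; the fixed bandwidth, step size, and particle count are hypothetical, and the paper's SG-MCMC variants additionally inject noise and use stochastic gradients.

```python
import numpy as np

def svgd_step(x, grad_logp, h=0.5, eps=0.05):
    """One SVGD update: phi_i = mean_j [k(x_j, x_i) grad log p(x_j) + d/dx_j k(x_j, x_i)]."""
    diff = x[:, None] - x[None, :]       # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))    # RBF kernel matrix
    gradK = -diff / h**2 * K             # gradient w.r.t. x_j: the repulsive term
    phi = (K @ grad_logp(x) + gradK.sum(axis=0)) / len(x)
    return x + eps * phi

rng = np.random.default_rng(1)
x = rng.normal(5.0, 0.1, size=50)        # particles start far from the target
for _ in range(2000):
    x = svgd_step(x, lambda t: -t)       # target N(0, 1): grad log p(x) = -x
```

Without the repulsive term all particles would collapse onto the mode at 0; with it, they spread out to approximate the target distribution.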


A Random Finite Set Model for Data Clustering

March 14, 2017

86% Match
Dinh Phung, Ba-Ngu Vo
Machine Learning

The goal of data clustering is to partition data points into groups to minimize a given objective function. While most existing clustering algorithms treat each data point as a vector, in many applications each datum is not a vector but a point pattern or a set of points. Moreover, many existing clustering methods require the user to specify the number of clusters, which is not available in advance. This paper proposes a new class of models for data clustering that addresses se...


BayesMix: Bayesian Mixture Models in C++

May 17, 2022

86% Match
Mario Beraha, Bruno Guindani, ..., Alessandra Guglielmi
Computation
Other Statistics

We describe BayesMix, a C++ library for MCMC posterior simulation for general Bayesian mixture models. The goal of BayesMix is to provide a self-contained ecosystem to perform inference for mixture models to computer scientists, statisticians and practitioners. The key idea of this library is extensibility, as we wish the users to easily adapt our software to their specific Bayesian mixture models. In addition to the several models and MCMC algorithms for posterior inference ...


Fitting A Mixture Distribution to Data: Tutorial

January 20, 2019

86% Match
Benyamin Ghojogh, Aydin Ghojogh, ..., Fakhri Karray
Other Statistics
Machine Learning
Methodology

This paper is a step-by-step tutorial for fitting a mixture distribution to data. It merely assumes the reader has the background of calculus and linear algebra. Other required background is briefly reviewed before explaining the main algorithm. In explaining the main algorithm, first, fitting a mixture of two distributions is detailed and examples of fitting two Gaussians and Poissons, respectively for continuous and discrete cases, are introduced. Thereafter, fitting severa...
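The two-Gaussian case the tutorial details reduces to alternating a responsibility (E) step with a weighted maximum-likelihood (M) step. A compact sketch on synthetic data; the component means, sample sizes, and initial values below are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-3, 1, 300), rng.normal(3, 1, 300)])

pi = np.array([0.5, 0.5])     # initial mixing weights
mu = np.array([-1.0, 1.0])    # initial means
sigma = np.array([1.0, 1.0])  # initial standard deviations

for _ in range(100):
    # E-step: responsibilities r[i, k] proportional to pi_k * N(x_i | mu_k, sigma_k)
    dens = (pi * np.exp(-((x[:, None] - mu) ** 2) / (2 * sigma**2))
            / (sigma * np.sqrt(2 * np.pi)))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted maximum-likelihood updates
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
```

On well-separated data like this, the estimated means converge close to the true values of -3 and 3, and the weights to roughly 0.5 each.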


Mixture models with a prior on the number of components

February 22, 2015

86% Match
Jeffrey W. Miller, Matthew T. Harrison
Methodology

A natural Bayesian approach for mixture models with an unknown number of components is to take the usual finite mixture model with Dirichlet weights, and put a prior on the number of components---that is, to use a mixture of finite mixtures (MFM). While inference in MFMs can be done with methods such as reversible jump Markov chain Monte Carlo, it is much more common to use Dirichlet process mixture (DPM) models because of the relative ease and generality with which DPM sampl...
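The MFM construction is easy to simulate from the prior: draw K, then Dirichlet weights, then assignments, and count the occupied components. A sketch under hypothetical choices (K - 1 ~ Poisson(1), symmetric Dirichlet(1) weights, n = 100); unlike the DPM prior, the induced number of clusters here stays bounded by K rather than growing with n.

```python
import numpy as np

rng = np.random.default_rng(3)

def mfm_num_clusters(n, gamma=1.0):
    """Occupied components in one MFM prior draw (K - 1 ~ Poisson(1))."""
    K = 1 + rng.poisson(1.0)              # prior on the number of components
    w = rng.dirichlet(np.full(K, gamma))  # symmetric Dirichlet weights
    z = rng.choice(K, size=n, p=w)        # component assignments
    return len(np.unique(z))              # components actually used

# Induced prior on the number of clusters for n = 100 observations
draws = [mfm_num_clusters(100) for _ in range(2000)]
```

With these choices, the induced number of clusters concentrates around small values (its mean is close to E[K] = 2).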
