May 26, 2009
We introduce a statistical mechanics formalism for the study of constrained graph evolution as a Markovian stochastic process, in analogy with that available for spin systems, deriving its basic properties and highlighting the role of the 'mobility' (the number of allowed moves for any given graph). As an application of the general theory we analyze the properties of degree-preserving Markov chains based on elementary edge switchings. We give an exact yet simple formula for the mobility in terms of the graph's adjacency matrix and its spectrum. This formula allows us to define acceptance probabilities for edge switchings, such that the Markov chains become controlled Glauber-type detailed balance processes, designed to evolve to any required invariant measure (representing the asymptotic frequencies with which the allowed graphs are visited during the process). As a corollary we also derive a condition, in terms of simple degree statistics, sufficient to guarantee that, in the limit where the number of nodes diverges, the invariant measure of the process is uniform even for state-independent acceptance probabilities of the proposed moves. We test our theory on synthetic graphs and on larger, realistic graphs such as those studied in cellular biology.
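The abstract describes the construction but not the closed-form mobility formula itself, so the following is only a minimal illustrative Python sketch of the idea as stated: the mobility n(A) is obtained here by brute-force enumeration of the allowed switchings (not the paper's adjacency-matrix formula), and a proposed switch from graph A to A' is accepted with the Glauber-type probability n(A)/(n(A)+n(A')), which enforces detailed balance with respect to the uniform measure on graphs with the prescribed degree sequence. The function names and the enumeration routine are our own assumptions, not the authors' code.

import random
from itertools import combinations


def allowed_swaps(edges):
    # All switchings {(i,j),(k,l)} -> {(i,l),(k,j)} or {(i,k),(j,l)} that keep the
    # graph simple (no self-loops, no multiple edges).  len(result) is the mobility
    # n(A); the paper derives a closed formula for it from the adjacency matrix,
    # here we simply count by enumeration.
    edge_set = {frozenset(e) for e in edges}
    moves = []
    for e1, e2 in combinations(list(edge_set), 2):
        i, j = tuple(e1)
        k, l = tuple(e2)
        if len({i, j, k, l}) < 4:
            continue  # edge pairs sharing a node do not give an elementary switching
        for new1, new2 in ((frozenset((i, l)), frozenset((k, j))),
                           (frozenset((i, k)), frozenset((j, l)))):
            if new1 not in edge_set and new2 not in edge_set:
                moves.append((e1, e2, new1, new2))
    return moves


def edge_swap_chain(edges, steps, rng=random):
    # Degree-preserving chain: draw a move uniformly from the allowed switchings
    # and accept it with the Glauber-type probability n(A) / (n(A) + n(A')),
    # which gives detailed balance with respect to the uniform measure.
    edges = {frozenset(e) for e in edges}
    for _ in range(steps):
        moves = allowed_swaps(edges)
        if not moves:
            break
        n_a = len(moves)
        old1, old2, new1, new2 = rng.choice(moves)
        proposal = (edges - {old1, old2}) | {new1, new2}
        n_a_prime = len(allowed_swaps(proposal))
        if rng.random() < n_a / (n_a + n_a_prime):
            edges = proposal
    return edges


if __name__ == "__main__":
    # Tiny check on a 6-cycle: every sampled graph keeps all degrees equal to 2.
    cycle = [(i, (i + 1) % 6) for i in range(6)]
    sample = edge_swap_chain(cycle, steps=200)
    print(sorted(tuple(sorted(e)) for e in sample))

If the drawn move is instead always accepted (state-independent acceptance), the same chain visits each graph with frequency proportional to its mobility rather than uniformly; the corollary mentioned in the abstract gives a degree-statistics condition under which this bias disappears as the number of nodes diverges.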
Similar papers
March 5, 2012
We analyze the properties of degree-preserving Markov chains based on elementary edge switchings in undirected and directed graphs. We give exact yet simple formulas for the mobility of a graph (the number of possible moves) in terms of its adjacency matrix. These formulas allow us to define acceptance probabilities for edge switchings, such that the Markov chains become controlled Glauber-type detailed balance processes, designed to evolve to any required invariant measure (r...
December 20, 2011
Randomising networks using a naive 'accept-all' edge-swap algorithm is generally biased. Building on recent results for nondirected graphs, we construct an ergodic detailed balance Markov chain with non-trivial acceptance probabilities for directed graphs, which converges to a strictly uniform measure and is based on edge swaps that conserve all in- and out-degrees. The acceptance probabilities can also be generalized to define Markov chains that target any alternative desire...
March 24, 2011
One of the most influential recent results in network analysis is that many natural networks exhibit a power-law or log-normal degree distribution. This has inspired numerous generative models that match this property. However, more recent work has shown that while these generative models do have the right degree distribution, they are not good models for real life networks due to their differences on other important metrics like conductance. We believe this is, in part, beca...
December 29, 2005
We study the problem of generating connected random graphs with no self-loops or multiple edges and that, in addition, have a given degree sequence. The generation method we focus on is the edge-switching Markov-chain method, whose functioning depends on a parameter w related to the method's core operation of an edge switch. We analyze two existing heuristics for adjusting w during the generation of a graph and show that they result in a Markov chain whose stationary distribu...
December 18, 2003
We discuss various aspects of the statistical formulation of the theory of random graphs, with emphasis on results obtained in a series of our recent publications.
October 30, 2012
Markov chains are convenient means of generating realizations of networks with a given (joint or otherwise) degree distribution, since they simply require a procedure for rewiring edges. The major challenge is to find the right number of steps to run such a chain, so that we generate truly independent samples. Theoretical bounds for mixing times of these Markov chains are too large to be practically useful. Practitioners have no useful guide for choosing the length, and tend ...
June 28, 2019
This paper presents a novel theoretical Monte Carlo Markov chain procedure in the framework of graphs. It specifically deals with the construction of a Markov chain whose empirical distribution converges to a given reference one. The Markov chain is constrained over an underlying graph, so that states are viewed as vertices and the transition between two states can have positive probability only in presence of an edge connecting them. The analysis is carried out on the basis ...
February 15, 2012
Markov chains are a convenient means of generating realizations of networks, since they require little more than a procedure for rewiring edges. If a rewiring procedure exists for generating new graphs with specified statistical properties, then a Markov chain sampler can generate an ensemble of graphs with prescribed characteristics. However, successive graphs in a Markov chain cannot be used when one desires independent draws from the distribution of graphs; the realization...
January 21, 2006
Inspired by theories such as Loop Quantum Gravity, a class of stochastic graph dynamics was studied in an attempt to gain a better understanding of discrete relational systems under the influence of local dynamics. Unlabeled graphs in a variety of initial configurations were evolved using local rules, similar to Pachner moves, until they reached a size of tens of thousands of vertices. The effect of using different combinations of local moves was studied and a clear relations...
March 7, 2018
An important problem arising in the study of complex networks, for instance in community detection and motif finding, is the sampling of graphs with fixed degree sequence. The equivalent problem of generating random 0,1 matrices with fixed row and column sums is frequently used as a quantitative tool in ecology. It has however proven very challenging to design sampling algorithms that are both fast and unbiased. This article focusses on Markov chain approaches for sampling,...