October 18, 2004
The brain, a remarkably complex device, has been studied from various angles. It is now well established that neurons are the seat of all brain function. The dynamical properties of a single neuron and of a collection of neurons may differ widely, owing to the clustering properties of a group of neurons. Consequently, the theory of complex physical systems has been increasingly employed to study the behaviour of neurons and neur...
February 3, 2023
Model complexity remains a key feature of any proposed data generating mechanism. Measures of complexity can be extended to complex patterns such as signals in time and graphs. In this paper, we are concerned with the well-studied class of exchangeable graphs. Exchangeability for graphs implies a distributional invariance under node permutation and is a suitable default model that can widely be used for network data. For this well-studied class of graphs, we make a choice to ...
November 28, 2023
One of the main challenges in the study of time-varying networks is the interplay of memory effects with structural heterogeneity. In particular, different nodes and dyads can have very different statistical properties in terms of both link formation and link persistence, leading to a superposition of typical timescales, sub-optimal parametrizations and substantial estimation biases. Here we develop an unbiased maximum-entropy framework to study empirical network trajectories...
March 31, 2023
We introduce a novel method, called Dispersion Entropy for Graph Signals, $DE_G$, as a powerful tool for analysing the irregularity of signals defined on graphs. We demonstrate the effectiveness of $DE_G$ in detecting changes in the dynamics of signals defined on synthetic and real-world graphs, by defining mixed processes on random geometric graphs or graphs exhibiting small-world properties. Remarkably, $DE_G$ generalises the classical dispersion entropy for univariate...
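As an illustration (not taken from the paper), the classical univariate dispersion entropy that $DE_G$ generalises can be sketched as follows; the function name and parameter defaults are illustrative choices, and the signal is assumed non-constant:

```python
import math
from collections import Counter

def dispersion_entropy(x, m=2, c=3):
    """Normalised dispersion entropy of a 1-D signal.

    m: embedding dimension, c: number of classes. Assumes a non-constant
    signal (the normal-CDF mapping divides by the standard deviation).
    """
    mu = sum(x) / len(x)
    sd = math.sqrt(sum((v - mu) ** 2 for v in x) / len(x))
    # Map each sample into (0, 1) via the normal CDF, then into c classes.
    y = [0.5 * (1 + math.erf((v - mu) / (sd * math.sqrt(2)))) for v in x]
    z = [min(c, max(1, round(c * yi + 0.5))) for yi in y]
    # Count dispersion patterns of length m and take their Shannon entropy.
    patterns = Counter(tuple(z[i:i + m]) for i in range(len(z) - m + 1))
    total = sum(patterns.values())
    h = -sum((n / total) * math.log(n / total) for n in patterns.values())
    return h / math.log(c ** m)  # normalise by the maximum entropy ln(c^m)
```

A regular signal concentrates on few patterns and yields a value near 0; an irregular one spreads over many patterns and approaches 1.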
January 28, 2018
Entropy is a classical measure to quantify the amount of information or complexity of a system. Various entropy-based measures, such as functional and spectral entropies, have been proposed in brain network analysis. However, they are less widely used than traditional graph-theoretic measures such as global and local efficiencies, because they are either not well defined on a graph or difficult to interpret biologically. In this paper, we propose a new entropy-based gr...
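For concreteness, one common spectral-entropy construction (a sketch, not necessarily the measure this paper proposes) normalises the Laplacian eigenvalues of a network into a probability distribution and takes their Shannon entropy:

```python
import numpy as np

def spectral_entropy(adj):
    """Shannon entropy of the normalised Laplacian spectrum of a graph.

    adj: symmetric adjacency matrix (numpy array).
    """
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    eig = np.linalg.eigvalsh(lap)
    p = eig / eig.sum()        # eigenvalues rescaled to sum to one
    p = p[p > 1e-12]           # drop (numerically) zero eigenvalues
    return float(-(p * np.log(p)).sum())
```

For the complete graph $K_4$ the nonzero eigenvalues are equal, so the entropy is $\ln 3$, the maximum for three nonzero modes.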
October 20, 2014
A central issue of the science of complex systems is the quantitative characterization of complexity. In the present work we address this issue by resorting to information geometry. Specifically, we propose a constructive way to associate with a network - in principle, any network - a differentiable object (a Riemannian manifold) whose volume is used to define an entropy. The effectiveness of the latter in measuring network complexity is successfully proved through its capability of detecting...
October 1, 2021
Entropy metrics (for example, permutation entropy) are nonlinear measures of irregularity in time series (one-dimensional data). Some of these entropy metrics can be generalised to data on periodic structures such as a grid or lattice pattern (two-dimensional data) using its symmetry, thus enabling their application to images. However, these metrics have not been developed for signals sampled on irregular domains, defined by a graph. Here, we define for the first time an entr...
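As a point of reference (not code from the paper), the classical permutation entropy for one-dimensional data that this line of work generalises can be sketched as:

```python
import math
from collections import Counter

def permutation_entropy(x, m=3):
    """Normalised permutation entropy of a 1-D time series, order m."""
    # Each length-m window is mapped to the permutation that sorts it.
    patterns = Counter(
        tuple(sorted(range(m), key=lambda k: x[i + k]))
        for i in range(len(x) - m + 1)
    )
    total = sum(patterns.values())
    h = -sum((n / total) * math.log(n / total) for n in patterns.values())
    return h / math.log(math.factorial(m))  # 0 = fully regular, 1 = maximal
```

A monotone series produces a single ordinal pattern and hence entropy 0; the graph-signal setting replaces the sliding window with neighbourhoods on an irregular domain.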
November 6, 2017
Using an information-theoretic point of view, we investigate how a dynamics acting on a network can be coarse grained through the use of graph partitions. Specifically, we are interested in how aggregating the state space of a Markov process according to a partition affects the resulting lower-dimensional dynamics. We highlight that for a dynamics on a particular graph there may be multiple coarse-grained descriptions that capture different, incomparable features of th...
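The aggregation step can be made concrete with a standard construction (an illustrative sketch, not the paper's specific method): weight the states inside each block by the stationary distribution and sum transition probabilities block-to-block:

```python
import numpy as np

def aggregate_chain(P, partition):
    """Coarse-grain a Markov transition matrix P by a state partition.

    partition: list of lists of state indices, e.g. [[0, 1], [2, 3]].
    States within a block are weighted by the stationary distribution,
    the usual construction when the partition is not exactly lumpable.
    """
    n = P.shape[0]
    pi = np.ones(n) / n
    for _ in range(500):
        pi = pi @ P            # power iteration for the stationary distribution
    k = len(partition)
    Q = np.zeros((k, k))
    for a, A in enumerate(partition):
        wa = pi[A].sum()
        for b, B in enumerate(partition):
            Q[a, b] = sum(pi[i] / wa * P[i, B].sum() for i in A)
    return Q
```

Different partitions of the same chain yield different matrices Q, each preserving some features of the original dynamics and discarding others.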
January 2, 2014
Pathways of diffusion observed in real-world systems often require stochastic processes going beyond first-order Markov models, as implicitly assumed in network theory. In this work, we focus on second-order Markov models, and derive an analytical expression for the effect of memory on the spectral gap and thus, equivalently, on the characteristic time needed for the stochastic process to asymptotically reach equilibrium. Perturbation analysis shows that standard first-order ...
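The quantities involved are easy to state concretely (a minimal sketch, assuming a row-stochastic transition matrix; not the paper's second-order derivation): the spectral gap is one minus the second-largest eigenvalue modulus, and its inverse sets the relaxation time towards equilibrium:

```python
import numpy as np

def spectral_gap(P):
    """Spectral gap of a transition matrix and the implied relaxation time."""
    eig = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
    gap = 1.0 - eig[1]         # 1 minus the second-largest eigenvalue modulus
    return gap, 1.0 / gap      # relaxation time scales as 1 / gap
```

A larger gap means faster mixing; memory effects, as the paper shows, shift this second eigenvalue and hence the relaxation time.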
September 15, 2012
In this paper, weighted or unweighted, directed or undirected graphs are associated with Discrete Time Markov Chains (DTMCs) as well as Continuous Time Markov Chains (CTMCs). The equilibrium and transient behaviour of such Markov chains is studied, and the entropy dynamics (Shannon entropy) of certain structured Markov chains is investigated. Finally, certain structured graphs and their associated Markov chains are studied.
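The Shannon entropy rate of a DTMC, the central quantity in such an analysis, can be computed directly from its transition matrix (a minimal sketch; function name and power-iteration approach are illustrative): $H = -\sum_i \pi_i \sum_j P_{ij} \ln P_{ij}$, with $\pi$ the stationary distribution.

```python
import numpy as np

def entropy_rate(P):
    """Shannon entropy rate of a DTMC with row-stochastic matrix P."""
    n = P.shape[0]
    pi = np.ones(n) / n
    for _ in range(500):
        pi = pi @ P                       # converge to the stationary pi
    logP = np.log(np.where(P > 0, P, 1.0))  # log(1)=0 masks zero entries
    return float(-(pi[:, None] * P * logP).sum())
```

For the random walk on an unweighted graph, P is the degree-normalised adjacency matrix, linking the entropy rate back to graph structure.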