October 22, 2008
Biologists rely heavily on the language of information, coding, and transmission that is commonplace in the field of information theory as developed by Claude Shannon, but there is open debate about whether such language is anything more than facile metaphor. Philosophers of biology have argued that when biologists talk about information in genes and in evolution, they are not talking about the sort of information that Shannon's theory addresses. First, philosophers have sugg...
April 14, 2015
In order to transmit biochemical signals, biological regulatory systems dissipate energy with concomitant entropy production. Additionally, signaling often takes place in challenging environmental conditions. In a simple model regulatory circuit given by an input and a delayed output, we explore the trade-offs between information transmission and the system's energetic efficiency. We determine the maximally informative network, given a fixed amount of entropy production and d...
December 30, 2009
Central to the functioning of a living cell is its ability to control the readout or expression of information encoded in the genome. In many cases, a single transcription factor protein activates or represses the expression of many genes. As the concentration of the transcription factor varies, the target genes thus undergo correlated changes, and this redundancy limits the ability of the cell to transmit information about input signals. We explore how interactions among the...
November 18, 2008
We introduce a quantitative measure of the capacity of a small biological network to evolve. We apply our measure to a stochastic description of the experimental setup of Guet et al. (Science 296:1466, 2002), treating chemical inducers as functional inputs to biochemical networks and the expression of a reporter gene as the functional output. We take an information-theoretic approach, allowing the system to set parameters that optimize signal processing ability, thus enumerat...
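The information-theoretic optimization this abstract alludes to rests on the notion of channel capacity: the maximum mutual information between a channel's input and output over all input distributions. As a generic sketch (not the paper's actual method or model), the capacity of any discrete memoryless channel can be computed with the Blahut-Arimoto iteration; the toy channel below is a binary symmetric channel with crossover probability 0.1, whose capacity is known in closed form to be 1 - H(0.1) ≈ 0.531 bits.

```python
import math

def blahut_arimoto(W, iters=200):
    """Capacity (bits) of a discrete channel W[x][y] = P(y|x),
    computed by the Blahut-Arimoto iteration. Illustrative sketch only."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                      # start from a uniform input
    for _ in range(iters):
        # output distribution induced by the current input distribution
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # multiplicative update: p[x] is scaled by exp(D(W[x] || q))
        r = []
        for x in range(nx):
            d = sum(W[x][y] * math.log(W[x][y] / q[y])
                    for y in range(ny) if W[x][y] > 0)
            r.append(p[x] * math.exp(d))
        s = sum(r)
        p = [v / s for v in r]
    # capacity = mutual information at the optimizing input distribution
    q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    cap = 0.0
    for x in range(nx):
        for y in range(ny):
            if W[x][y] > 0:
                cap += p[x] * W[x][y] * math.log2(W[x][y] / q[y])
    return cap

# Binary symmetric channel, crossover 0.1: capacity = 1 - H(0.1)
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(blahut_arimoto(bsc))
```

For this symmetric channel the uniform input is already optimal, so the iteration converges immediately; for asymmetric channels the update genuinely reshapes the input distribution.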
April 22, 2014
The concept of positional information is central to our understanding of how cells in a multicellular structure determine their developmental fates. Nevertheless, positional information has neither been defined mathematically nor quantified in a principled way. Here we provide an information-theoretic definition in the context of developmental gene expression patterns and examine which features of expression patterns increase or decrease positional information. We connect pos...
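As an illustrative aside (invented toy data, not the paper's): the information-theoretic definition referred to here reads positional information as the mutual information I(x; g) between a nucleus's position x and a gene's expression level g. A minimal estimate from a discretized joint count table might look like:

```python
import math

def mutual_information(joint):
    """Mutual information (in bits) of a joint count table (rows = x)."""
    total = float(sum(sum(row) for row in joint))
    p = [[c / total for c in row] for row in joint]
    px = [sum(row) for row in p]          # marginal over position
    pg = [sum(col) for col in zip(*p)]    # marginal over expression
    mi = 0.0
    for i, row in enumerate(p):
        for j, pij in enumerate(row):
            if pij > 0:
                mi += pij * math.log2(pij / (px[i] * pg[j]))
    return mi

# Toy joint counts: 3 positions x 3 expression bins. A perfectly sharp
# position-to-expression mapping would carry log2(3) bits; the noisy
# off-diagonal entries reduce the positional information below that.
counts = [[8, 1, 1],
          [1, 8, 1],
          [1, 1, 8]]
print(mutual_information(counts))
```

Sharper (less noisy) expression patterns raise this quantity toward log2 of the number of distinguishable positions, which matches the abstract's question of which features of a pattern increase or decrease positional information.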
July 15, 2011
This paper introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. Simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Furthermore, this century will see a convergence of information theory and neuroscience; information theory will expand ...
February 24, 2012
A recurring motif in gene regulatory networks is transcription factors (TFs) that regulate each other, and then bind to overlapping sites on DNA, where they interact and synergistically control transcription of a target gene. Here, we suggest that this motif maximizes information flow in a noisy network. Gene expression is an inherently noisy process due to thermal fluctuations and the small number of molecules involved. A consequence of multiple TFs interacting at overlappin...
December 16, 2011
Information is a key concept in evolutionary biology. Information is stored in biological organisms' genomes, and used to generate the organism as well as to maintain and control it. Information is also "that which evolves". When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive ...
January 15, 2014
We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdős-Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We...
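The giant-component example mentioned here can be made concrete with a toy simulation (invented code, not the authors' pipeline): in an Erdős-Rényi random graph G(n, p), the largest connected component jumps from a vanishing fraction of the nodes to a macroscopic one as the mean degree np crosses 1.

```python
import random
from collections import Counter

def largest_component_fraction(n, p, seed=0):
    """Fraction of nodes in the largest connected component of G(n, p),
    using union-find with path halving. Illustrative sketch only."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    # Include each of the n(n-1)/2 possible edges independently with prob p
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj

    sizes = Counter(find(i) for i in range(n))
    return max(sizes.values()) / n

# Below the threshold (np = 0.5) vs. above it (np = 2) for n = 1000
print(largest_component_fraction(1000, 0.0005))
print(largest_component_fraction(1000, 0.002))
```

Below the threshold the largest component holds only a few dozen nodes; above it, roughly 80% of the graph joins a single component, matching the classical result for mean degree 2.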
Developing and maintaining life requires a great deal of computation, carried out by gene regulatory networks, yet we have little understanding of how this computation is organized. I show that there is a direct correspondence between the structural and functional building blocks of regulatory networks, which I call regulatory motifs. I derive a simple bound on the range of functions that these motifs can perform, in terms of the local network structure. I prove that this range is a...