February 7, 2022
The defining property of an artificial physical self-replicating system, such as a self-replicating robot, is its ability to make copies of itself from basic parts. Three questions immediately arise in the study of such systems: 1) How complex is the whole robot in comparison to each basic part? 2) How disordered can the parts be while the robot still replicates successfully? 3) What design principles can enable complex self-replicating systems to function in disordered environments generation after generation? Consequently, much of this article focuses on exploring different concepts of entropy as a measure of disorder, and on how symmetries can aid reliable self-replication, both at the level of assembly (by reducing the number of wrong ways that parts could be assembled) and as a parity check when replicas manufacture parts generation after generation. The mathematics underpinning these principles, which quantify artificial physical self-replicating systems, is articulated here by integrating ideas from information theory, statistical mechanics, ergodic theory, group theory, and integral geometry.
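The two ideas in the abstract can be illustrated with a minimal sketch (the part counts, orientation counts, and function names below are hypothetical examples, not taken from the paper): Shannon entropy quantifies the disorder of a supply of parts, and a part's symmetry group shrinks the number of physically distinguishable, and hence potentially wrong, ways it can be placed during assembly.

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a discrete distribution given by raw counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def distinct_orientations(n_orientations, symmetry_order):
    """Number of distinguishable placements of a part: for a cyclic rotational
    symmetry group of order |G| dividing the n candidate orientations, only
    n / |G| placements are physically distinct."""
    return n_orientations // symmetry_order

# Hypothetical bin of four part types in unequal amounts: its entropy
# measures how disordered the parts supply is.
bin_counts = [8, 4, 2, 2]
H_parts = shannon_entropy(bin_counts)  # -> 1.75 bits

# An asymmetric square tile admits 4 distinct placements (3 of them wrong);
# a fully 4-fold symmetric tile admits only 1, so no placement is wrong.
asym = distinct_orientations(4, 1)  # -> 4
sym = distinct_orientations(4, 4)   # -> 1
```

In this toy picture, increasing a part's symmetry reduces the assembly error channel, which is one reading of the abstract's claim that symmetry aids reliable self-replication.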
Similar papers
November 1, 2017
The concept of complexity appears in virtually all areas of knowledge. Its intuitive meaning shares similarities across fields, but disagreements over its details hinder a general definition, leading to a plethora of proposed measurements. While each might be appropriate to the problems it addresses, the lack of an underlying fundamental principle prevents the development of a unified theory. Here it is shown that the statistics of the amount of symmetry broken by system...
February 7, 2022
This is a review on entropy in various fields of mathematics and science. Its scope is to convey a unified vision of the classical as well as some newer entropy notions to a broad audience with an intermediate background in dynamical systems and ergodic theory. Due to the breadth and depth of the subject, we have opted for a compact exposition whose contents are a compromise between conceptual import and instrumental relevance. The intended technical level and the space limit...
December 17, 2009
We present a quantitative measure of physical complexity, based on the amount of information required to build a given physical structure through self-assembly. Our procedure can be adapted to any given geometry, and thus to any given type of physical system. We illustrate our approach using self-assembling polyominoes, and demonstrate the breadth of its potential applications by quantifying the physical complexity of molecules and protein complexes. This measure is particula...
September 6, 2012
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production i...
April 9, 2022
Even today, the concept of entropy is perceived by many as quite obscure. The main difficulty is analyzed as being fundamentally due to the subjectivity and anthropocentrism of the concept, which prevent us from gaining sufficient distance to embrace it. However, it is pointed out that the lack of coherence of certain presentations, and certain preconceived ideas, do not help. They are of three kinds: 1) axiomatic thermodynamics; 2) inconsistent solutions of certain paradoxes; 3) re...
October 28, 2003
Herein we consider various concepts of entropy as measures of the complexity of phenomena and in so doing encounter a fundamental problem in physics that affects how we understand the nature of reality. In essence the difficulty has to do with our understanding of randomness, irreversibility and unpredictability using physical theory, and these in turn undermine our certainty regarding what we can and what we cannot know about complex phenomena in general. The sources of comp...
July 1, 2008
This paper applies Algorithmic Information Theory to simple examples of replication processes to illustrate how replicating structures can generate and maintain order in a non-equilibrium system. Variation in replicating structures enhances the system's ability to maintain homeostasis in a changing environment by allowing it to evolve to a more restricted region of its state space. Stability is further enhanced when replicating systems develop dependencies, by sharing informa...
August 23, 2019
We propose a new interpretation of measures of information and disorder by connecting these concepts to group theory in a new way. Entropy and group theory are connected here by their common relation to sets of permutations. A combinatorial measure of information and disorder is proposed, in terms of integers and discrete functions, that we call the integer entropy. The Shannon measure of information is the limiting case of a richer, more general conceptual structure that rev...
May 16, 1994
We present a theoretical as well as experimental investigation of a population of self-replicating segments of code subject to random mutation and survival of the fittest. Under the assumption that such a system constitutes a minimal system with characteristics of life, we obtain a number of statements on the evolution of complexity and the trade-off between entropy and information.
May 9, 2012
Concepts used in the scientific study of complex systems have become so widespread that their use and abuse has led to ambiguity and confusion in their meaning. In this paper we use information theory to provide abstract and concise measures of complexity, emergence, self-organization, and homeostasis. The purpose is to clarify the meaning of these concepts with the aid of the proposed formal measures. In a simplified version of the measures (focusing on the information produ...