The Complexity Demon

Laplace (1749-1827) believed that if we knew the current state of everything in the universe and the forces acting on it, we could predict all future events with certainty. This hypothetical intellect is known as Laplace's Demon. In the early 20th century the demon was banished from physics by quantum mechanics, which regards the universe as inherently random. Then came chaos theory, according to which even deterministic systems can be unpredictable. Today complexity makes the demon even less appealing, since the computability of the universe itself is limited.

Imagine a computer made of atoms. Since information cannot flow from atom to atom faster than the speed of light, there is a limit to its computing power. According to Bremermann, no material system, whether artificial or living, can compute more than 2 × 10^47 bits per second per gram of its mass (http://pespmc1.vub.ac.be/ASC/BREMER_LIMIT.html).
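
To get a feel for these numbers, here is a back-of-the-envelope sketch in Python. The 2 × 10^47 figure is Bremermann's; the sample masses are purely illustrative assumptions, not from the text.

```python
# Back-of-the-envelope use of Bremermann's limit: no material system
# can compute more than ~2e47 bits per second per gram of its mass.

BREMERMANN_LIMIT = 2e47  # bits per second per gram

def max_bits_per_second(mass_grams: float) -> float:
    """Upper bound on the computation rate of a system of the given mass."""
    return BREMERMANN_LIMIT * mass_grams

# Illustrative masses (assumptions chosen for this sketch):
for label, grams in [("1 gram of matter", 1.0),
                     ("1 kilogram computer", 1_000.0)]:
    print(f"{label}: at most {max_bits_per_second(grams):.1e} bits per second")
```

Even a one-kilogram computer, however cleverly engineered, would be capped at about 2 × 10^50 bits per second.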

Many similar demons dwell in our conceptual world; they might be called complexity demons. One is the notion that the universe is a computer and we are its computations. Another is the universal computer of Wolfram's A New Kind of Science (NKS), whose universality is compromised by Bremermann's limit. We may therefore distinguish between two kinds of complexity: one which can be generated by NKS-style computation, and a complexity demon which cannot.
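
As a concrete instance of the first kind, here is a minimal sketch of NKS-style complexity: the elementary cellular automaton Rule 30, Wolfram's canonical example of complex behavior emerging from a trivially simple rule. The code is an illustration written for this passage, not taken from the text.

```python
# Rule 30: a one-dimensional cellular automaton whose update table is
# encoded in the binary digits of the rule number (30 = 00011110).

RULE = 30

def step(cells, rule=RULE):
    """Apply one synchronous update to a ring of cells."""
    n = len(cells)
    out = []
    for i in range(n):
        # Read the (left, self, right) neighborhood as a 3-bit index 0..7.
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> idx) & 1)
    return out

def run(width=63, steps=30):
    cells = [0] * width
    cells[width // 2] = 1  # start from a single black cell
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)

if __name__ == "__main__":
    run()
```

Running it prints the familiar irregular triangle: a complex, seemingly random pattern produced by a rule that fits in a single byte.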

Now imagine that life on earth is a computer and its processes are computations. Life, too, is constrained by Bremermann's limit. Moreover, it does not generate its complexity from scratch: its computations start from a baseline, the complexity of a cell. This issue was discussed in another section.
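
For a rough sense of what the limit means for a single cell, one can plug in a typical bacterial mass of about 10^-12 grams; that mass is an assumed figure used only for illustration.

```python
BREMERMANN_LIMIT = 2e47   # bits per second per gram (Bremermann)
CELL_MASS_GRAMS = 1e-12   # assumed mass of a typical bacterial cell

# Even at the physical limit, a single cell could process
# at most ~2e35 bits per second.
print(f"{BREMERMANN_LIMIT * CELL_MASS_GRAMS:.1e} bits per second")
```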

