Functional Information - a way of measuring evolution and AGI?

Measuring evolution

This is a topic I find quite interesting. In physics we have the second law of thermodynamics, which says that entropy will increase (and order will decrease) in a closed system. However, there isn’t really a law that encapsulates evolution (that is, a local increase in order and decrease in entropy) in such a tidy and elegant fashion as the second law of thermodynamics.

I like Stephen Wolfram’s Computational Irreducibility; however, it does not provide a method to evaluate how far a system has evolved, rather it highlights that, in some cases, we simply have to wait and see how a system unfolds. Michael Levin has also done some very interesting work around computation being the basis of evolution, effectively a practical exploration-and-exploitation approach, but I have not seen his work provide a way to measure the state of evolution.

Researchers Robert Hazen and Michael Wong recently did a video with Quanta Magazine, as well as an article, discussing their 2023 paper: On the roles of function and selection in evolving systems.

In this work they’ve highlighted measurable common attributes that all evolving systems appear to share.

Common attributes of an evolving system

Evolving systems appear to be conceptually equivalent in that they display three notable attributes (a toy code sketch follows the list):

  1. They form from numerous components that have the potential to adopt combinatorially vast numbers of different configurations (e.g. photons, atoms, genes, people, and artificial neurons).

  2. Processes exist that generate numerous different configurations (e.g. mutations, Darwin’s theory of evolution, human interaction, and gradient descent).

  3. Configurations are preferentially selected based on function (e.g. fitness functions, market selection, and cost functions).
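To make these three attributes concrete, here is a minimal toy sketch of my own (not from the paper; the names and numbers are invented for illustration): bit strings stand in for configurations, random bit-flips are the process that generates new configurations, and a simple scoring function drives selection.

```python
import random

random.seed(0)

N_BITS = 16        # each configuration is a 16-bit string, so 2**16 possible configurations
POP_SIZE = 50
GENERATIONS = 30

def score(config):
    """Toy 'function': the number of 1-bits. Selection prefers higher-scoring configurations."""
    return sum(config)

def mutate(config, rate=0.05):
    """Process that generates new configurations: flip each bit with a small probability."""
    return [bit ^ (random.random() < rate) for bit in config]

# 1. Numerous components with a combinatorially vast space of possible configurations
population = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # 2. A process that generates numerous different configurations
    offspring = [mutate(c) for c in population]
    # 3. Configurations preferentially selected based on function
    population = sorted(population + offspring, key=score, reverse=True)[:POP_SIZE]

print("best score after selection:", score(population[0]), "out of", N_BITS)
```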

Their “Law of Increasing Functional Information” states that a system will evolve “if many different configurations of the system undergo selection for one or more functions.”

Calculating a system’s Functional Information

The way to calculate the amount of Functional Information in a system is quite elegant, and it provides a clear counterpoint to the second law of thermodynamics, under which the order in a closed system (and the information effectively stored within it) decays.

Functional Information Bits = - log_2 (# of functional configurations / total # of configurations)

Clearly the hard work is in deciding what counts as a functional configuration and in counting the total number of possible configurations. So whilst the formula is elegant, applying it to complex systems is non-trivial.
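As a quick numerical sketch with made-up numbers (my own illustration, not from the paper): suppose a system has 2^20 possible configurations and 1,024 of them achieve the function at the chosen threshold; the functional fraction is 2^-10, so the functional information is 10 bits.

```python
from math import log2

def functional_information_bits(n_functional, n_total):
    """FI = -log2(fraction of configurations that achieve the function).
    The rarer the functional configurations, the more bits of functional information."""
    if not 0 < n_functional <= n_total:
        raise ValueError("need 0 < n_functional <= n_total")
    return -log2(n_functional / n_total)

# Assumed toy numbers: 1,024 functional configurations out of 2**20 possible ones
print(functional_information_bits(1_024, 2**20))   # -> 10.0 bits
print(functional_information_bits(2**20, 2**20))   # every configuration works -> 0.0 bits
```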

It can be applied across domains: to physics and mineralogy, where materials evolve to perform many functions; to animal evolution, where animals display many functional configurations; and through to human civilization and computing, with their growing functional complexity.

They also identify the fundamental sources of selection (static persistence, dynamic persistence, and novelty generation); however, I haven’t fully looked into those yet, so I’ll just leave that here as a note.

Here is a nice 13-minute video from Quanta Magazine in which the authors discuss the key points of their work.

I say this tongue-in-cheek (or do I??!): could this be a way to evaluate AGI? Would we consider an AI general when it has a functional information value equal to or greater than a human’s?
