physics as communications


Is the speed of light a compression limit for embedded manifolds, pointing to boundary entities whose area solves to a Kolmogorov, or inherent randomness, constant?
This suggests that information may have, or account for, mass. Taken to an extreme, mass-energy equivalence may also describe an information-energy equivalence; if not, then we must know where it fails. If the information capacity, or potential, of space is finite, then gravity itself may be felt as a spatial distortion arising from physical communication limits.
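The closest established result to an information-energy equivalence is Landauer's principle: erasing one bit dissipates at least kT ln 2 of energy. A minimal sketch of the arithmetic follows; the constants are standard, the room-temperature value is an assumption, and the "mass per bit" reading remains speculative:

```python
from math import log

# Landauer's principle: erasing one bit costs at least k*T*ln(2) of energy.
# Combined with E = m*c^2 this gives a (speculative) mass equivalent per bit.
K_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
C = 299_792_458      # speed of light, m/s
T = 300.0            # assumed room temperature, K

energy_per_bit = K_B * T * log(2)     # ~2.9e-21 joules
mass_per_bit = energy_per_bit / C**2  # ~3.2e-38 kilograms

print(energy_per_bit, mass_per_bit)
```

The numbers are tiny per bit, but nonzero, which is what the question above turns on.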

When talking of space, it is possible to flatten the simulation into a two-dimensional communication network whose nodes are connected by wires, with wire length translating to propagation delay. Space is an illusion that arises from differences in wire lengths. Space may be a function that represents a holographic unification of the propagation-delay values into domains where information has integrity and communication is possible. Despite the many path-length differences, space is the apparent domain in which some kind of frame-synchronous communication is possible.
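The flattening can be sketched as a toy network; the node names and wire lengths here are invented for illustration. "Distance" is never stored anywhere: an observer infers it from delay, so geometry emerges from the wiring rather than from a layout.

```python
# Sketch: space as propagation delay in a flat network (assumed toy model).
C = 299_792_458  # signal speed in the wires, m/s

# adjacency: node -> {neighbor: wire length in metres} (hypothetical values)
wires = {
    "A": {"B": 1.0, "C": 5.0},
    "B": {"A": 1.0, "C": 1.0},
    "C": {"A": 5.0, "B": 1.0},
}

def delay(u, v):
    """One-hop propagation delay in seconds."""
    return wires[u][v] / C

def apparent_distance(u, v):
    """The 'space' an observer infers from a one-hop delay."""
    return delay(u, v) * C

# A and C look far apart directly, yet the two-hop route via B is shorter:
direct = apparent_distance("A", "C")
via_b = apparent_distance("A", "B") + apparent_distance("B", "C")
print(direct, via_b)
```

The triangle inequality fails from the wires' point of view, which is exactly the kind of path-length difference the paragraph above calls an illusion of space.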

Energy Potential of Information

Is the free energy of information greater than zero?

What effect does information have on the entropy of a system?

How much accumulated or invested energy is there in the human genome?

Is an interactive collection of information sets a simulation?

What is the simulation potential of a universe?

Do simulations have a hierarchy or ordering and is it derived from relative frame rates?

Can simulations compete or arrive at competitive equilibriums?

When considering the meaning of life, might we also ask about the life of meaning, and how all observable life is, at its root, a self-propagating simulation?

When constructing a self-optimizing and self-revising simulation model, what balance is needed between symbol persistence and symbol substitution? Is this solved by tuning the simulation to the frame rate that admits exactly as much outside information as is needed to maintain sync? To create a self-healing model, there would have to be a dual input function that takes the new symbol from the outside and the symbol from the inside, plus a random function or noise script to pick one or the other: sometimes the symbol from the outside, sometimes the symbol from the inside. Over time there would be persistence of the internal simulation, and there would be sync to the outside for anagenesis. This exchange requires a noise source, which may be assumed to be an input derived not from the simulation itself but from some other source that, in the language of simulations, would be another simulation, probably at a higher simulation index.
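The dual input function described above is small enough to write out directly. This is a minimal sketch; the function names and the single tuning parameter `p_outside` are invented here, and the seeded generator stands in for the external noise source:

```python
import random

def dual_input_step(inside, outside, p_outside, rng):
    """Pick the outside symbol with probability p_outside,
    otherwise keep the inside one."""
    return outside if rng.random() < p_outside else inside

def sync(internal, external, p_outside=0.3, seed=0):
    """One pass over two equal-length symbol strings.
    The rng is the required noise source, assumed external to the model."""
    rng = random.Random(seed)
    return "".join(
        dual_input_step(i, o, p_outside, rng)
        for i, o in zip(internal, external)
    )

# p_outside tunes persistence versus sync: 0.0 keeps the internal
# simulation intact, 1.0 copies the outside signal verbatim.
print(sync("aaaaaaaa", "bbbbbbbb", p_outside=0.5, seed=1))
```

Tuning `p_outside` is the balance the paragraph asks about: low values favor symbol persistence, high values favor substitution from outside.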

There are multiple definitions of information.

They differ on whether information is a product or a function.

Let's consider the following thought experiment involving data symbol compression:

There is a datum to be compressed, a compression library, and a processor. The processor uses the library to convert the datum into another representation scheme in which every combination is enumerated with a unique number, or register, in an index. Duplicates are counted and stored in a string; this smaller list is then subjected to another round of compression, and the count of pattern matches is stored as a smaller string that is eventually saved as a description file for data restoration in the library format. A client receives a datum in compressed form and uses a library to restore the data recursively until the original input is assembled.
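One way to make the scheme concrete is a byte-pair-style sketch: a hypothetical illustration, not the exact scheme above. Repeated pairs are enumerated with fresh symbols, counts drive the recursion, and the accumulated library serves as the description file the client uses for recursive restoration:

```python
from collections import Counter

def compress_once(data):
    """Replace the most frequent adjacent pair with a fresh symbol.
    Returns (new_data, library_entry), or (data, None) when nothing repeats."""
    pairs = Counter(zip(data, data[1:]))
    if not pairs:
        return data, None
    pair, count = pairs.most_common(1)[0]
    if count < 2:
        return data, None  # no duplicates left: another pass cannot help
    fresh = max(data) + 1  # next unused integer symbol
    out, i = [], 0
    while i < len(data):
        if i + 1 < len(data) and (data[i], data[i + 1]) == pair:
            out.append(fresh); i += 2
        else:
            out.append(data[i]); i += 1
    return out, (fresh, pair)

def compress(data):
    """Recursively compress; the library is the description file."""
    library = []
    while True:
        data, entry = compress_once(data)
        if entry is None:
            return data, library
        library.append(entry)

def restore(data, library):
    """Client side: expand symbols recursively until the original is assembled."""
    for fresh, pair in reversed(library):
        out = []
        for s in data:
            out.extend(pair if s == fresh else [s])
        data = out
    return data

original = [1, 2, 1, 2, 1, 2, 3]
packed, lib = compress(original)
print(packed, lib, restore(packed, lib))
```

The recursion halts on its own when no pair repeats, which is the compression limit the next paragraph turns to.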

There is a limit in compression beyond which any further processing is no longer compression, the subtraction of symbols from the datum, but encryption, the substitution of symbols into the description file. That is, by overcompressing the file you get an inflated description instead. The limit of effective compression may be the inherent randomness, incompressibility, or effective information of the payload. Since this cannot be reduced without wasting more energy, it may approximate a Kolmogorov constant.

The residue left incompressible at this equilibrium limit may be considered the effective information in the payload, or exchange. The result is a change in the entropy of a system, similar in action to chemical catalysis.