Here I begin an investigation into the measure of information, and into the gap between the theoretical measure and the real value.

The measure of information answers the following question: given an object X, how much information is inside it? The first person to try to answer this question was Shannon, with his famous “A Mathematical Theory of Communication”. Shannon chose to define the amount of information with the logarithmic function, and it is very interesting to read in the paper that this choice was made for reasons of convenience, without proof, simply by listing the practical reasons why the logarithmic function is a good measure of information.

So for Shannon the information “I” of an object “X” is **I = log(X)**.

The correct definition of information is given by Kolmogorov with his *Kolmogorov complexity*, defined as the length of the shortest program that computes the object X, so **I = K(X)**.

The incredible thing is the correctness of Shannon’s intuition about the simple logarithmic function: looking at the two functions, it is clear that they are very similar.

They are very similar, but the difference between the two functions is very important: on this difference rest the whole MDL principle and the development of algorithmic information theory.

My intention with this post is not to compare Shannon information and Kolmogorov complexity (for that purpose I suggest the paper “Shannon Information and Kolmogorov Complexity” by Peter Grünwald and Paul Vitányi) but to inspect the difference between a mathematical domain and the real domain.

I think the real information Kr is very different from the theoretical information K, and below is an approximation of my new function.

The difference between the real and the theoretical information lies in the limit on available information. I think we live inside an environment with a limit on information: a big limit, but still a limit! The theoretical definition has no such limit, so we can use objects like 2^n without worrying about the value of “n”, but in reality something like 2^1000000 is nonsense; such an object simply does not exist.

I think there is a lot of empirical evidence that follows from this fact; I posted some of it in **The Compression Paradox** and NFL, and there are other fields to investigate, for example: why does the simplex algorithm work? Why do randomized algorithms work?

Starting from this concept, the Shannon information Sr in a real domain becomes **I = min( log(X) , M/log(X) )**, where **M** is the limit on the amount of information.
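As a sketch of how this bounded measure behaves, here is a small Python illustration. The function names and the value of M are my own assumptions chosen for the example, not part of any established library:

```python
import math

def shannon_info(x):
    """Theoretical Shannon-style information: log2 of the object."""
    return math.log2(x)

def real_info(x, M):
    """Hypothetical bounded measure Sr from this post:
    I = min(log(x), M / log(x)), with M the total information limit."""
    length = math.log2(x)
    return min(length, M / length)

M = 1_000_000                  # assumed environment limit, in bits

# Far below the limit, the two measures agree:
small = 2 ** 20
print(shannon_info(small))     # 20.0
print(real_info(small, M))     # 20.0

# Near the limit, the bounded measure collapses:
big = 2 ** 500_000             # an object of length M/2
print(real_info(big, M))       # 2.0, not 500000.0
```

The crossover happens where log(X) equals M/log(X), i.e. at length sqrt(M); beyond that point the bounded measure shrinks as objects grow.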

In this simple case I suppose the programs work in an environment with a static, fixed limit M on the amount of information. It is possible to develop different and more complex scenarios, but as a starting point I think this hypothesis is interesting to investigate.

Roughly speaking, what this formula means is that we have exponential behaviour when the length of the object X is not comparable to the limit M, but when the size of the object approaches M this is no longer true. For example, given an object X1 of length M/2, how many different objects of the same length can we have? The answer is 2, not 2^(M/2) … an absolutely different answer!
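The counting argument above can be checked in a few lines; M here is an assumed budget, picked only for illustration:

```python
M = 1_000_000             # assumed total information budget, in bits
L = M // 2                # length of the object X1

# Without a limit, every bit pattern of length L counts as an object:
unbounded_count = 2 ** L  # 2^(M/2), astronomically large

# With the limit, the environment holds only M bits in total, so at most
# M // L distinct objects of length L can exist side by side:
bounded_count = M // L
print(bounded_count)      # prints 2
```

The point of the sketch is only the ratio M // L: two objects of length M/2 already exhaust the whole budget.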

This is the basic idea.

From this point it is possible to derive a new *universal real distribution* and a *real Kolmogorov complexity*.

In the next posts I will examine the other aspects.

I don’t like to make theoretical conjectures without bringing them to the real world, so I will talk about a developing project that uses this new tool.
