I have almost finished the book “Asymmetry: The Foundation of Information” by Scott J. Muller.
I like its treatment of the concepts of entropy, symmetry, and algorithmic information, but the author did not understand what information is.
As evidence, I cite Table 4.1 on page 96:
Symmetry   Entropy   Information
High       Low       Low
Low        High      High
There is a big mistake here:
Low entropy = low information
High symmetry = low information
Low Kolmogorov complexity = low information
High entropy != high information
Low symmetry != high information
High Kolmogorov complexity = high information
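A minimal sketch of the implications that do hold (my own illustration, not from the book; using zlib-compressed length as a crude, computable stand-in for Kolmogorov complexity is my assumption):

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Empirical Shannon entropy of the symbol distribution, in bits per symbol."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

def compressed_len(s: str) -> int:
    """zlib-compressed length in bytes: a rough upper bound on Kolmogorov complexity."""
    return len(zlib.compress(s.encode()))

# A maximally symmetric, zero-entropy string compresses to almost nothing:
uniform = "a" * 10000

# A string of i.i.d. random symbols has empirical entropy near the
# 1 bit/symbol maximum and stays large even after compression:
random.seed(0)
noisy = "".join(random.choices("ab", k=10000))
```

Note that the converse direction already fails here: `"ab" * 5000` also has per-symbol entropy of 1 bit, yet it compresses to a few dozen bytes, because per-symbol entropy is blind to the string's structure.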
This is the correct table:
Symmetry   Entropy   Information
High       Low       Low
Low        High      ?
When we have low entropy, we know a way to shorten the description of the object. When we have high entropy, that fact by itself does not give us a way to shorten the description, but it does not mean that no such way exists.
Moreover, there are examples of high-entropy objects with low Kolmogorov complexity, and therefore with low information needed to describe them. An example is again the Rule 30 cellular automaton, which produces high-entropy data, yet to describe that data you only need the rule number plus about log(N) bits if you want a complete measure.
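A sketch of that Rule 30 point (my own implementation; the cyclic tape of width 257 and the single-1 seed are my assumptions, not taken from the book):

```python
def rule30_bits(n: int, width: int = 257) -> str:
    """Center column of Rule 30 on a cyclic tape, starting from a single 1.

    Rule 30 updates each cell as: new = left XOR (center OR right),
    which is Wolfram's elementary rule number 30.
    """
    cells = [0] * width
    cells[width // 2] = 1  # single-cell seed
    out = []
    for _ in range(n):
        out.append(str(cells[width // 2]))
        cells = [cells[i - 1] ^ (cells[i] | cells[(i + 1) % width])
                 for i in range(width)]
    return "".join(out)

bits = rule30_bits(2000)
# The complete description of these 2000 bits is just: "Rule 30, width 257,
# single-1 seed, 2000 steps" -- a handful of numbers, roughly log2(N) bits
# for the step count -- even though the output looks statistically random,
# with 0s and 1s in nearly equal proportion.
```

Wolfram used exactly this center-column construction as a pseudo-random generator, which is why it works as a counterexample: high-entropy-looking data with a tiny description.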
This mistaken concept runs through the entire book.