Sometimes the complexity of the whole is much smaller than almost all of its parts.

in mathematics •  8 years ago  (edited)

There is a saying that goes "The whole is greater than the sum of its parts". However, in information theory, or algorithmic information theory to be precise, the whole can be much, much smaller than almost all of its parts.

Example 1

Let us start with a simple example: a set broken into two subsets, each of which has much larger complexity than the whole set.

Let the whole set be the square of pairs of integers:

M = {(x,y): x and y integers in {1, ..., N}}


To describe the set M with an algorithm (i.e. a program that prints out all members of the set, with a way to tell them apart, and then halts) one only needs the number N, which can be described with log2(N) bits. Hence the program will have a size of about log2(N) bits plus a constant. So the length of the shortest program that prints out M and halts satisfies:

K(M) <= O(log2(N))

Here O(y) denotes a function bounded by a constant times y.
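To make this concrete, here is a minimal Python sketch (the code and names are my own illustration, not from any particular source): a program that prints all of M and halts, where the only part that depends on N is the literal value of N itself, i.e. about log2(N) bits of source text.

```python
# A complete description of M = {(x, y) : 1 <= x, y <= N}.
# Everything below is a fixed constant amount of code except the
# literal N, which takes about log2(N) bits to write down.

def members(n):
    """Generate every pair in the n-by-n grid of integers."""
    for x in range(1, n + 1):
        for y in range(1, n + 1):
            yield (x, y)

N = 8  # the only N-dependent part of the program
for pair in members(N):
    print(pair)
```

The program's length grows only with the number of digits of N, which is the informal content of K(M) <= O(log2(N)).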

[Figure: the N×N square M cut into two subsets A and B by a jagged path of vertical line segments]

Now break the set M into two subsets A and B as in the picture, so that M is the union of A and B. The cut can be made in many different ways; let each vertical line segment in the path that separates A from B encode part of a message. Each segment has an x-coordinate in {1, ..., N}, and a number in {1, ..., N} carries approximately log2(N) bits of information. Since there is room for N such segments in the cut, the boundary between A and B can carry N*log2(N) bits of information. Choose the cut so that it encodes an incompressible message (such messages exist in abundance; see my previous post). Then both A and B have shortest descriptions of length approximately:

K(A) ≈ N*log2(N) + constant


Therefore the complexity of M in relation to A is small for large N:

K(M)/K(A) <= O(1/N)
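The cut construction above can be sketched in a few lines of Python (my own encoding, for illustration only): the jagged boundary is represented by one x-coordinate per row, so describing A takes N numbers of about log2(N) bits each, on top of the O(log2(N)) bits needed for N alone.

```python
import math
import random

N = 16

# One x-coordinate per row describes where the cut passes:
# N numbers, each needing about log2(N) bits -> N*log2(N) bits in total.
# A random cut stands in for an incompressible message.
cut = [random.randint(1, N) for _ in range(N)]

bits_in_cut = N * math.log2(N)

# A is everything on or left of the cut in each row, B is the rest.
A = {(x, y) for y in range(1, N + 1) for x in range(1, cut[y - 1] + 1)}
B = {(x, y) for y in range(1, N + 1) for x in range(cut[y - 1] + 1, N + 1)}

assert A | B == {(x, y) for x in range(1, N + 1) for y in range(1, N + 1)}
print(f"the cut carries about {bits_in_cut:.0f} bits; "
      f"describing M itself needs only ~{math.log2(N):.0f} bits")
```

For N = 16 the cut already carries 64 bits versus 4 bits for N, and the gap K(A)/K(M) ≈ N grows without bound.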


Example 2

We saw that two subsets can be more complex than their union, but what about the individual members of a set? Let us look at another set: the set of all binary strings of length at most N:

M = {all strings x: l(x)<=N}


Similar to above, we see that this set has Kolmogorov complexity at most:

K(M) <= O(log2(N))

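Again a short sketch (my own illustration): a program that prints every binary string of length at most N needs only the value of N, so K(M) is O(log2(N)) here too.

```python
from itertools import product

def strings_up_to(n):
    """Yield every binary string of length <= n, including the empty string."""
    for length in range(n + 1):
        for bits in product("01", repeat=length):
            yield "".join(bits)

N = 3  # again, the only N-dependent part of the program
for s in strings_up_to(N):
    print(repr(s))
```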

Using a counting argument similar to the one in this previous post (replace n - c in its first formula with k), at most 2^k - 1 strings can have complexity below k, since there are fewer than 2^k programs of length less than k. Hence the number of strings in M with complexity at least k is bounded below by:

#{x in M : K(x) >= k} >= |M| - (2^k - 1)

The total number of elements in M is easily computed with a geometric sum: |M| = 2^(N+1) - 1 (counting the empty string). Putting k = O(log2(N)), the proportion of elements in M with complexity at least as large as K(M) is at least:

1 - (2^k - 1)/(2^(N+1) - 1)

We see that this proportion is very close to 1: with k = O(log2(N)) the subtracted term is asymptotically N^2/2^N, which goes to zero very fast as N becomes large.
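Plugging in numbers makes the convergence visible. The check below is my own, assuming only the counting bound above (at most 2^k - 1 strings have complexity below k) and taking k = 2*log2(N) as a representative O(log2(N)) choice:

```python
import math

def proportion_bound(N, k):
    """Lower bound on the fraction of strings of length <= N with K(x) >= k."""
    size_M = 2 ** (N + 1) - 1        # geometric sum over lengths 0..N
    low_complexity = 2 ** k - 1      # at most this many strings have K(x) < k
    return 1 - low_complexity / size_M

for N in (10, 20, 40):
    k = math.ceil(2 * math.log2(N))  # a representative k = O(log2(N))
    print(N, proportion_bound(N, k))
```

Already at N = 40 the bound is within about 10^-9 of 1.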

Hence, in conclusion: almost all strings in the set of all strings of length at most N have larger complexity than the entire set.

