# The Pigeonhole Principle

The Pigeonhole Principle, also referred to as the Counting Theorem, is a handy tool for mathematicians, and naturally, computer programmers.

The loose version of this principle says "After placing n pigeons into m compartments, if n is greater than m, you will find that some compartment must contain more than one pigeon."

Seems obvious, and perhaps it is, but at least in the world of data compression it must be trotted out from time to time in order to bludgeon dreams back to reality.
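The loose version can be sketched in a few lines of Python. The function name and the hash-based assignment below are illustrative choices for this sketch, not part of any standard library:

```python
# A minimal sketch of the loose principle: any assignment of n pigeons
# to m < n compartments must double up somewhere.

def find_collision(pigeons, compartments):
    """Assign each pigeon to a compartment (here: by hash modulo m)
    and return the first two pigeons that share one."""
    m = len(compartments)
    seen = {}  # compartment index -> first pigeon placed there
    for p in pigeons:
        slot = hash(p) % m
        if slot in seen:
            return seen[slot], p  # guaranteed to happen when n > m
        seen[slot] = p
    return None  # only possible when n <= m

# Five pigeons, four compartments: a collision is unavoidable.
print(find_collision(["a", "b", "c", "d", "e"], range(4)))
```

No matter how clever the assignment function is, with five pigeons and four compartments the `return` inside the loop always fires.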

## Impossible Compression

A common dream for the novice is the creation of a compressor that will reduce the size of *all* files (often touted as the ability to compress "random" data). For example, Dr. Constant Wong of Recursiveware has been polishing his technique for compressing random data since 2003. And the USENET newsgroup comp.compression always has at least one thread dedicated to thrashing a new and eager theorist with a (flawed) idea.

The Pigeonhole Principle quickly puts this idea to rest. If a file is n bits long, there are 2^{n} possible input files. If a compressor could reduce the size of *every* file, each output would be at most n-1 bits long, and there are only 2^{0} + 2^{1} + ... + 2^{n-1} = 2^{n}-1 bit strings shorter than n bits. With 2^{n} inputs and only 2^{n}-1 possible outputs, the Pigeonhole Principle tells us that at least two input files must compress to the same output. And since those outputs are identical, the decompressor cannot recover two different input files from them.
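The counting argument is easy to check exhaustively for a small n. This sketch uses n = 3: there are 2^3 = 8 three-bit inputs but only 2^3 - 1 = 7 shorter bit strings, so any shortening map must collide.

```python
from itertools import product

# All 2**3 = 8 three-bit inputs, and every bit string shorter than
# three bits (the empty string plus lengths 1 and 2): 1 + 2 + 4 = 7.
n = 3
inputs = ["".join(bits) for bits in product("01", repeat=n)]
shorter = [""] + ["".join(bits)
                  for k in range(1, n)
                  for bits in product("01", repeat=k)]

print(len(inputs), len(shorter))  # 8 inputs, only 7 possible outputs

# Whatever mapping we try, two inputs must collide. As one
# illustration, "compress" each input to its first n-1 bits:
compress = {s: s[:-1] for s in inputs}
outputs = list(compress.values())
assert len(set(outputs)) < len(inputs)  # at least one duplicate output
```

Swap in any shortening scheme you like for the `compress` dictionary; the assertion can never fail, because 8 inputs cannot map one-to-one into 7 outputs.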

## And More

Wikipedia has another nice example of the principle in use. Imagine that you have a party with n people attending. At random, people shake hands with one another as they mill about. At the end of the night, we check the number of distinct people each guest has shaken hands with. What are the odds that two guests will have shaken hands with the same number of people?

The answer is of course that there will always be two people who have shaken the same number of hands. Each person's count can range from 0 to n-1, but 0 and n-1 cannot both occur: if someone shook hands with everyone, then no one shook hands with nobody. That leaves at most n-1 possible counts for n people. There are n-1 pigeonholes, and n pigeons, *QED*.
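The handshake claim holds for every party, not just on average, which a quick simulation can illustrate. The function below is a sketch for this article: each pair shakes hands with probability 1/2, and we check that some count always repeats.

```python
import random

def handshake_counts(n, rng):
    """Simulate one party: each pair of guests shakes hands with
    probability 1/2; return how many hands each guest shook."""
    counts = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < 0.5:  # this pair shakes hands
                counts[i] += 1
                counts[j] += 1
    return counts

rng = random.Random(42)
for trial in range(1000):
    counts = handshake_counts(8, rng)
    assert len(set(counts)) < len(counts)  # some count repeats
print("no counterexample in 1000 random parties")
```

The assertion never fires, and by the pigeonhole argument above it never can, no matter how many trials you run.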

## Don't Go There

If you ever find yourself spiraling down the rabbit hole of impossible data compression, I urge you to grab the life jacket of the Pigeonhole Principle before you are lost. It will save you a lot of pointless effort, and it will also clue you in to the fact that there are two IBM employees with the same number of hairs on their head.