
181 points ekiauhce | 1 comment | source
ball_of_lint ◴[] No.42232776[source]
This strategy or something like it legitimately wins the challenge.

The challenge (based on expected value) is not to find some compression scheme that will compress 100% or even 50% of files to less than their original size. Instead it's to find any (set of) compression schemes that will compress more than 1/50 = 2% of arbitrary random files to less than their original size. You can construct such a program essentially as they described by asking for a large file size and then searching for a valid decompression program within that file. With standard tools you can make some valid decompression program quite short, which will allow you a reasonable probability that it will appear within a large randomly-generated file.
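A rough sketch of why this works, in Python with illustrative numbers (the snippet length and file size are my assumptions, not figures from the thread): the chance that one specific k-byte string occurs somewhere in n random bytes is roughly n/256^k, so a 3-byte fragment is more likely than not to turn up in a ~50 MB random file.

```python
# Sketch: estimate the odds that a specific short byte string (standing in
# for a tiny decompressor fragment) appears inside a large random file.

def find_embedded(data: bytes, snippet: bytes) -> int:
    """Offset of `snippet` inside `data`, or -1 if it is absent."""
    return data.find(snippet)

k = 3              # snippet length in bytes (assumed)
n = 50_000_000     # ask the challenger for a ~50 MB random file (assumed)

# Probability that a fixed k-byte string appears at least once among the
# n - k + 1 positions (approximation: positions treated as independent):
p_hit = 1 - (1 - 256.0 ** -k) ** (n - k + 1)
```

With these numbers `p_hit` comes out around 0.95, which is the heart of the trick: you only need the search to succeed often enough to beat the contest odds, not always.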

The cost you pay for a compression scheme like this is that in the cases where it works you win a few bytes, but in the far more common case where it doesn't you lose some, even with the simple `cat $1 > original` default decompressor.
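To make that win/lose bookkeeping concrete, here is a minimal Python sketch (my own illustration under assumed encodings, not Patrick's actual scheme): if a known short snippet happens to occur in the file, the "compressed" output cuts it out and records a 4-byte offset, saving len(snippet) - 4 bytes; when the snippet is absent, the scheme has nothing to offer.

```python
# Toy scheme: win a few bytes only when a known snippet (shared with the
# decompressor) occurs in the input; otherwise give up. The 4-byte
# offset header is an assumption of this sketch.

def compress(data: bytes, snippet: bytes):
    i = data.find(snippet)
    if i < 0:
        return None  # the common case: no saving possible
    header = i.to_bytes(4, "big")
    return header + data[:i] + data[i + len(snippet):]

def decompress(blob: bytes, snippet: bytes) -> bytes:
    i = int.from_bytes(blob[:4], "big")
    rest = blob[4:]
    return rest[:i] + snippet + rest[i:]
```

The saving is positive only when the snippet is longer than the 4-byte header, which is why the trick wants a large file: the longer the snippet you can afford to search for, the more bytes each hit is worth.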

Now, a clever contest creator might deliberately choose files that defeat every such decompressor they can think of, but Mike appears not to have done so.

replies(3): >>42232984 #>>42233212 #>>42239358 #
hinkley ◴[] No.42239358[source]
No. You sound as if you’re familiar with the culture but it’s clear that you’re not.

The problem with comp.compression was always that its Eternal September is new people showing up every week claiming they've found a universal compression algorithm that compresses everything: the perpetual motion machine of information theory. Tired of explaining to people who think they're fighting "dogma" rather than the laws of the universe, the regulars started looking for other ways to make them put up or shut up, so they could have some peace and quiet.

Like sending them a file full of RNG output and waiting for them to realize this was their teachable moment.

Winning the challenge without engaging in out-of-band bullshit (the "compressor + output" rule should be the giveaway) doesn't prove you have found a universal compression algorithm. It only proves the RNG is flawed. That would be very interesting, but it wouldn't break Shannon.

The problem is that "compressor + file" means "no out-of-band data" to a reasonable person, and we have already established we are dealing with unreasonable people.

Not:

> My main motivation was to "out-trick the tricker". I thought the chances of me making any money were very remote.

replies(2): >>42240067 #>>42248467 #
ball_of_lint ◴[] No.42248467[source]
I'm not familiar with the culture.

Nothing in the challenge says anything about doing universal compression. In fact, the key point I'm trying to make is that you _don't_ have to make a universal compressor or break entropy to win; instead, just find some scheme that makes more than 2% of random files shorter than the original, even if it makes the other 98% much, much longer. The overall entropy is increased. I wouldn't use this for anything else. But it does beat the challenge as stated.
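The expected-value arithmetic behind that 2% figure, with hypothetical stakes (the dollar amounts below are assumptions for illustration, not the actual contest terms): at 50:1 odds the break-even win rate is 1/51, a little under 2%, so any scheme that shortens even 3% of random files is profitable in expectation.

```python
# Hypothetical stakes: entry fee f, prize 50*f (the 50:1 odds implied by
# the 1/50 = 2% figure above; not the actual contest numbers).
f = 100.0
prize = 50 * f
p_win = 0.03                      # assumed: scheme shortens 3% of files
ev = p_win * prize - (1 - p_win) * f
break_even = f / (prize + f)      # = 1/51, just under 2%
```

With these assumed numbers `ev` works out to +$53 per attempt, positive despite losing 97% of the time.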

I'd agree that Patrick didn't follow the "spirit" of the challenge or do anything interesting with compression. But in doing this I think he made a good point, which is roughly that you need to be very careful and explicit when posing challenges like this, or people will use your sloppy wording against you.

replies(1): >>42258402 #
hinkley ◴[] No.42258402[source]
> Nothing in the challenge says anything about doing universal compression

Because it's quoted out of context. It also makes the winner seem more sympathetic, which is why it's no accident the context isn't provided. It's the Missing, Missing Reason.