
I wonder how the use of static world-prior information is considered here.

Or said in a different way: What stops me from supplying a hard-coded pre-compressed file to circumvent their RAM and HDD limits?



It's stopped by the fact that the quantity measured under the official rules (http://prize.hutter1.net/hrules.htm) is the size of the compression program plus the size of the compressed output (a "zero-input decompressor", i.e. code + data that can reproduce the uncompressed file). So you can use static world-prior information, but it counts against you twice: it has to be contained in both the "compressor" and the "decompressor".

For most entries the size of the "compressor" is negligible compared to the data, but if your "compressor" is essentially a hard-coded pre-compressed file that it copies into the "decompressor", then the few percent of gains don't outweigh the fact that you've just doubled the size being scored. Including hard-coded priors could be useful iff they are very slow to compute but can be expressed in a small amount of storage.
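To make the double-counting concrete, here is a small back-of-the-envelope sketch. The scoring rule (compressor size + compressed output size) comes from the rules linked above; the byte counts and the 3% improvement figure are made-up illustrative numbers, not contest data.

```python
# Sketch of the Hutter Prize scoring rule described above:
# scored size = size of the compressor program
#             + size of the compressed output (the "zero-input decompressor").
# All concrete numbers below are hypothetical.

def score(compressor_bytes: int, compressed_output_bytes: int) -> int:
    """Total scored size: compressor program plus its compressed output."""
    return compressor_bytes + compressed_output_bytes

# Baseline: a 50 KB compressor emitting a 115 MB self-extracting archive.
baseline = score(50_000, 115_000_000)

# With a 10 MB hard-coded prior: it sits inside the compressor AND must be
# reproduced inside the decompressor, so it is counted twice. Suppose it
# shaves 3% off the compressed data.
with_prior = score(50_000 + 10_000_000,
                   int(115_000_000 * 0.97) + 10_000_000)

# The doubled prior far outweighs the 3% gain on the data.
assert with_prior > baseline
```

The break-even point is when twice the prior's size is smaller than the reduction it buys in the compressed output, which is why only priors that are tiny to store but expensive to compute are worth embedding.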




This reads a bit unfriendly.

Anyway, I read their FAQ: handing in offline-prepared data is allowed, but twice its size will be added to your result, so it's typically a disadvantage unless your statically prepared data has exceptionally high knowledge content.



