Lossy codecs intentionally allow distortions that are too small to see with the naked eye (without zooming in). They're designed to operate at some "normal" viewing distance. If you zoom in, you defeat that technique.
If you actually want to compress images specifically for viewing zoomed in, you should use different codecs, higher quality settings, or configure codecs differently (e.g. in classic JPEG, use a quantization table that preserves high frequencies more).
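As a rough illustration of that last point, here is a minimal sketch using Pillow, assuming it's installed; the flat table of 8s is an arbitrary example, not a tuned table (standard JPEG tables use much larger divisors for the high-frequency coefficients, which is exactly what discards fine detail):

    # Re-encode with a custom quantization table that divides high-frequency
    # DCT coefficients less aggressively, so fine detail survives zooming.
    from PIL import Image

    img = Image.open("input.png").convert("RGB")

    # One 8x8 table as 64 entries. Near-uniform small values keep high
    # frequencies that a standard table would quantize heavily.
    flat_table = [8] * 64

    img.save(
        "output.jpg",
        qtables=[flat_table, flat_table],  # luma and chroma tables
        subsampling=0,                     # 4:4:4 also helps detail when zoomed in
    )

The trade-off, of course, is a much larger file for detail that nobody sees at normal viewing distance.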
But for a benchmark that claims to compare codecs in general, you should only use a normal viewing distance. It's debatable whether the norm is still ~100 dpi or should be "Retina"/2x density, but it's definitely not some zoomed-in 5 dpi.