They do... But that's down to poor design decisions.
Human vision extends from about 10^5 cd/m^2 to 10^-5 cd/m^2 [1].
Typical backlights use something like PWM to modulate the brightness. That means you need to be able to scale your backlight by a factor of 10^10 to cover the whole human vision range. And if your CPU runs at 1 GHz (10^9), then at the dimmest setting even being on for a single clock cycle would be a flash every 10 seconds - clearly unacceptable, and even then, few backlight controllers could handle a 1-nanosecond pulse!
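For what it's worth, the arithmetic checks out; a quick back-of-the-envelope sketch (Python, using the figures from the comment above):

```python
# Back-of-the-envelope check of the PWM dynamic-range problem described above.
clock_hz = 1e9           # 1 GHz clock -> shortest possible pulse is 1 ns
dynamic_range = 1e10     # 10^5 cd/m^2 down to 10^-5 cd/m^2

min_pulse_s = 1 / clock_hz              # 1 ns minimum "on" time
period_s = min_pulse_s * dynamic_range  # period needed for a 10^-10 duty cycle

# period_s works out to ~10 s: one 1 ns flash every 10 seconds
# at the dimmest setting, if you try to cover the full range with PWM alone.
```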
The fix is to use PWM combined with settable 'modes'. The modes could be 'bright light' (using all the backlight LEDs, perhaps a few watts total) and a 'dim light' mode, which uses just one LED through a high-value resistor, drawing just a few tens of microwatts. Your two modes then have a factor of 10^5 between them in brightness, so the PWM on your CPU only needs a resolution of 10^5, which it can do.
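A minimal sketch of that mode-plus-PWM split, assuming the 10^5 ratios above (Python; the function and mapping are illustrative, not any real driver API):

```python
# Hypothetical split of a 10^10 brightness range into a mode selection plus
# a 10^5-step PWM duty cycle, as described above.
PWM_STEPS = 10**5    # PWM resolution available within one mode
MODE_RATIO = 10**5   # brightness ratio between 'bright' and 'dim' modes

def backlight_setting(level):
    """Map level in [0, 10^10) to (mode, duty) with duty in [0, PWM_STEPS)."""
    if level >= PWM_STEPS:                  # top 10^5x of the range
        return ("bright", level // MODE_RATIO)
    return ("dim", level)                   # bottom 10^5 steps, dim mode
```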
> That means you need to be able to scale your backlight by a factor of 10^10 to cover the whole human vision range.
Uh, phones don't produce brightness anywhere near that high. And the low end of that range is "too low" to be useful anyway. Your math is off by a few orders of magnitude.
> And if your CPU runs at 1 GHz (10^9), then at the dimmest setting even being on for a single clock cycle would be a flash every 10 seconds - clearly unacceptable, and even then, few backlight controllers could handle a 1-nanosecond pulse!
The CPU isn't doing the PWM; that's a separate chip.
And you can't realistically have more than a few kHz of PWM frequency anyway, because of switching losses.
It's not really a technical problem in the first place; technically it's easy: just have two current sources, and when you need low light you PWM the low-current one. Not "just putting a resistor in series" - that's extremely wasteful - and not a "single LED", which wouldn't illuminate evenly.
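A rough sketch of that two-current-source scheme (Python; the current values and names are assumed for illustration, not from any real design):

```python
# Two current sources: a high-current one for normal brightness and a much
# smaller one that gets PWMed only when very low brightness is needed.
I_HIGH_MA = 20.0   # assumed full-brightness LED current
I_LOW_MA = 0.02    # assumed low-range current source, 1000x smaller

def drive(target_fraction):
    """Map target_fraction in [0, 1] of full brightness to (source, duty)."""
    low_range = I_LOW_MA / I_HIGH_MA          # fraction covered by low source
    if target_fraction <= low_range:
        return ("low", target_fraction / low_range)   # PWM the low source
    return ("high", target_fraction)                  # PWM the high source
```

Each source only ever needs a modest PWM duty-cycle resolution, since the 1000:1 current ratio carries most of the dynamic range.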
Unfortunately, at these low currents you tend to get a tiny amount of leakage across the LED junction - just a few microamps, but when you're trying to make a small amount of light it becomes an issue. Worse, you can't even calibrate for it, because the leakage is highly temperature-dependent.
Compare your hypothetical 10^10 range to the range offered by typical LCD monitors, where the maximum is in the 300...400 cd/m² range and a minimum brightness of less than 30 cd/m² is considered "outstanding". I.e. roughly a 10:1 range.
[1]: https://www.researchgate.net/figure/Approximate-luminance-ra...