Hacker News | orbital-decay's comments

The problem is that not disturbing the fibers is impossible if you work with it at all, and workers in Russia still suffer life-changing injuries. Disposing of it safely is also not realistically possible. The regulator just doesn't care, it's as simple as that. Of course they don't "think it's safe" as GP said; there's a ton of research and practice showing the opposite, and they set a specific (pretty low) exposure limit. But they turn a blind eye to the fact that it's impossible to enforce and will never be followed in practice as long as asbestos is still being used anywhere. This is why asbestos use is banned everywhere else, and this is the issue with Russian regulations: they give a tiny bit of the economy priority over public health, using convenient research that pretty much "stretches an owl over a globe" (i.e. forces a far-fetched conclusion) to downplay the hazard, if you actually read the relevant studies in Russian.

You are spreading disinformation. The actual regulations are very different from how you make them seem.

They're saying that the visual performance is indirectly affected by invisible wavelengths somehow. Not that you can see the difference between two types.

It's a poorly controlled experiment.

https://news.ycombinator.com/item?id=46764382


They are saying that, and most real world LED lighting uses very cheap diodes, like, 99.9999% of them, which create very poor colour compared with incandescent bulbs, which create perfect colour representation.

It's a big thing, and you can buy LEDs which produce a better colour range, but they're much more expensive and not as energy efficient, because creating bold reds costs hard energy that no diode trick will ever get around.


>They are saying that, and most real world LED lighting uses very cheap diodes, like, 99.9999% of them, which create very poor colour compared with incandescent bulbs, which create perfect colour representation.

Have you actually read the study? It's about infrared and has nothing to do with color rendering and visible spectrum. They're vaguely speculating about some mitochondrial mechanism of action not related to vision at all.


That's the interesting thing about this study. A lot of people here are speculating around explanations connected to metamerism, but the control (Figs 7 and 9) partly rules that out.

What's being attacked in this particular case?

The phone. It's the same attacks that secure boot tries to protect against. The issue is that these old, vulnerable versions have a valid signature allowing them to be installed.

>people who voluntarily give information to third parties

Is it the case with BitLocker? The voluntary part.


Sure. You voluntarily use Windows. You could use something else, or nothing, so you chose to use it. You are not compelled to use it by law; you are just strongly compelled by a small carrot and a large stick. The same applies to a smartphone, BTW.

From a legal perspective, yes, since the government didn't force you to.

The wording is not hand-wavy. They said "not necessarily invalidated", which could mean that innocuous reason and nothing extra.

I really think it is. The primary function of these publications is to validate science. When we find invalid citations, it shows they're not doing their job. When they get called on that, they cite the volume of work their publication puts out and point to the only potentially non-disqualifying outcome.

Seems like CYA, seems like hand wave. Seems like excuses.


Even if some of those innocuous mistakes happen, we'll all be better off if we accept people making those mistakes as acceptable casualties in an unforgiving campaign against academic fraudsters.

It's like arguing against strict liability for drunk driving because maybe somebody accidentally let their grape juice sit too long and didn't know it was fermented... I can conceive of such a thing, but that doesn't mean we should go easy on drunk driving.


>1 owner logging everything.

Everything in Discord is also filtered through a classifier or a generative model, so their provider also has access.


>"Back in the day," computers displayed white text on a dark background (usually a blue background) out of the box. This was deemed the most legible.

That just prevented CRT degradation, and it meant less ghosting and flickering, especially since most CRTs in the home computer era were just terrible home TVs, and CRTs in the mainframe era were equally terrible. The saturated blue background was absolutely insufferable; I had ghosting and shifted color perception for minutes after long sessions in NC and Borland software. I loathe it to this day, just like the garish CGA colors, which were an assault on my eyes.

80's and 90's had a general concept of a desktop with windows as paper documents, because the first real use case for personal computers and workstations was assisting office jobs.

Funny how you call the normal light scheme inverted. IIRC PC text/graphics modes used this term for dark backgrounds.


It was not a screen-saver. Less ghosting? Most likely. But the fact remains that reading text off of a light bulb blasting in your face all day sucks, and once upon a time people knew that... but "forgot" it when vendors shoved inverse color schemes on them by default.

"80's and 90's had a general concept of a desktop with windows as paper documents"

Yes, I noted that; but the analogy to a piece of paper fails because paper does not EMIT light.

Everyone with sufficient computing experience calls "light" schemes inverted. This was even documented in instruction manuals from the early PC era: https://imgur.com/a/aLV8tn0


Anyone experienced remembers that any 60 Hz CRT was a flickering mess, especially computer monitors, which used shorter-lived phosphors, and that any old TV had terrible burn-in. That's why you wanted to reduce the number of bright pixels on it. That's not a legibility thing.

A display is not a light bulb unless you specifically make it one against a poorly lit environment. There's no difference between reflected and emitted light; what you actually need is much better lighting in the room, so your display doesn't stand out at a brightness level that provides sufficient contrast (and also because working in a poorly lit room is unhealthy).

Moreover, a light scheme in a well-lit environment is less eye-straining, because your pupils contract and adapt to the light. If you're using a dark display against a dark background, your eyes adapt to the dark and are then hit with the bright text. And if you want to display more than just text, dark mode becomes a problem, because most content (e.g. pictures, videos) is not largely dark.

tl;dr avoid excessive contrast and flickering. Everything else is individual eyesight differences, opinions, and snake oil.
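The excessive-contrast point can be made concrete with the WCAG 2.x contrast-ratio formula. This is my addition, not something the comment cites; it's just the standard way to quantify contrast between text and background colors:

```python
def srgb_to_linear(c):
    """Linearize one 8-bit sRGB channel per the WCAG 2.x definition."""
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) tuple of 8-bit values."""
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, always >= 1, order-independent."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Pure white text on pure black hits the maximum possible ratio of 21:1 --
# exactly the "excessive contrast" case above. Softer dark palettes
# deliberately score lower while staying above the WCAG 4.5:1 floor.
print(contrast_ratio((255, 255, 255), (0, 0, 0)))  # 21.0
```

WCAG recommends at least 4.5:1 for body text; the argument above is that going all the way to 21:1 on an emissive display in a dark room is where the eye strain comes from.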


A lot of software is dark-mode first, but it's still not right. Good dark schemes are just really hard to design; there are too many nuanced differences. Color perception is maybe 10% of it. Typography, line thickness, optical balance, accounting for massively increased contrast, antialiasing, layout, picture rendering: absolutely everything should be done differently on dark backgrounds.

And it depends too much on your environment, the type of display, and its pixel density, unlike light mode, which is way more forgiving of external factors.


My mirror disagrees.
