
Almost everything you said is untrue.

I almost get the sense you aren't a US citizen.


Mmm. Joined in 2008.

I still get value. But there's too much noise, I think. I also think I'm older now and I'm more averse to "done it, seen it".

The odd "made an app to do basic thing xyz" doesn't get my attention anymore.

Also, posts related to science get hard scrutiny. Most are just cross-references to poorly moderated subreddits, or they're posts taken directly from arXiv with no external paper references. I'm not a scientist, but I know enough to know when to ignore bullshit.

Most posts to HN are still bullshit. But I know enough to occasionally spot something interesting. That's maybe 4 times a year at this point.

...

I also want to add more context. I've been in the Anathem ideology for about 20 years now. It has shaped my world view. Call me Fraa geuis.

If you want to present what you think are new ideas, be prepared to back them up. When we get the Lorites up and running, I'll refer you to them.


Why would I want a piece of software monitoring me intentionally? It already happens against my will. It's bad enough that Apple/Google monitor my sleep alarms; I've removed most of those.

Add in ad tracking and the rest, and what is this adding to the value of my life? You hope to get bought in a few years because you have a dataset they already have?


Getting people to pay to be monitored while you take their data feels like a fever dream for the current tech elite. Imagine getting all that juicy data, not just for free, but with its own recurring revenue! Think of the stock price!

Coming soon, paying your company for the privilege of doing your job.


Is there a single repo that has all of these "aha" images? I could see the clown right away, and vines/plants were my first thought in the 2nd example, but organic shapes are harder to be sure about.

That also brings to mind that first exposure to this dataset affects how well the rest of it works. On initial exposure you'll definitely get the "aha" moment, but if all of the images in the dataset are of the same type, your brain quickly learns the pattern and the "aha" moment vanishes.

If they ran each test subject through all of the images, the results after maybe the first 5 are basically useless for drawing any definitive conclusions.


r/showerthoughts


Please, just provide the answer. Maybe it's obvious, but for people like me all I can think of is recipes for butter sauces for crab legs involving pine nuts. Which actually sounds quite good.


It's something often compared to oranges.


This word combines with Tim to make a meme.


It is answered in the article.


They're deliberately avoiding spoilers; you can decode the answer by pasting YXBwbGU= into https://www.base64decode.org/
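(If you'd rather decode it locally than paste it into a website, a one-liner in a terminal does the same thing. This assumes a base64 binary that accepts --decode, which both the GNU and macOS versions do.)

    # decode the base64-encoded spoiler locally
    echo 'YXBwbGU=' | base64 --decode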


Unlike many other languages, English has grown because it's adaptable. It has almost as many borrowed words for advanced concepts as "native" words; it's hard to even distinguish them anymore.

If anything, a solid counterargument can be made that Romance languages (descended from Latin) lack the flexibility of English and other Germanic languages.

Non-native English speakers frequently complain that English is more complicated than other languages. This is true. I'm a native speaker and can read only limited Spanish; where I get hung up is the dependence on grammatical gender. I had a similar experience with Japanese when I was studying it a few years ago.

I completely believe that primary language has a physical effect on the brain in terms of neural structure. It must have.

But since English is so adaptable, if there's a concept that is better expressed in another language, we tend to adopt that language's words to express it.

However, other languages seem less adaptable. For example, France has or had an official government ministry that for decades has managed new foreign words entering the French language. To this day, there are newly coined, specifically French words for technologies coming from English-speaking countries.

Another good example is some YouTube videos from India I've run across (I turn on subtitles). Say the speaker is talking in Hindi: many of the more technical terms are English words or phrases freely interspersed with the Hindi. They're borrowing the English words, with a Hindi accent shaping the pronunciation.

Going back to Japanese, we see the same thing. I don't know if the Japanese government has a language ministry.

But if you look at written Japanese text, you definitely see that most numbers are written with Western 0-9 numerals mixed in with katakana or hiragana. And when you hear people speaking, once your ear is oriented toward Japanese sounds, you can start to pick out the adopted English words, pronounced with a native accent.


> However, other languages seem less adaptable. For example, France has or had an official government ministry that for decades has managed new foreign words entering the French language. To this day, there are newly coined, specifically French words for technologies coming from English-speaking countries.

It's not that French isn't adaptable; it's that France wants to keep the language "pure". The same thing exists in Quebec.

When a new word appears, they insist there should be a French equivalent instead of just using the original word. They mainly do this for English (no French replacement was coined for tsunami or iceberg) because they assume French will slowly disappear if they don't protect it.


Language imports are a poor substitute for grammatical flexibility. That's why French, for example, has limited need to import words directly: it can recreate the same meaning with native words. German is another great example: its grammar provides a lot of the flexibility that was lost in English. It is almost impossible to translate German philosophy into English without losing the natural flavor of the word combinations that make German so adaptable.


For those interested, there's a geologist YouTuber who does some really good geology videos in Minecraft: https://m.youtube.com/@gneissname


I have a PC I built when the 2080 Ti came out. How is Linux driver support for those cards today? The machine is still more than powerful enough for my needs, but I haven't used it in a couple of years because Windows really is just complete garbage. I'd like to be able to take advantage of new to moderately old hardware without dealing with Windows.


I've been running a 2080 Ti with an AMD 5600X on Linux Mint Cinnamon, 24/7 for three months, with no graphics issues. I previously ran the proprietary nvidia-driver-550 but now use the open source nouveau drivers. There are five choices for Nvidia drivers: open source, closed source, or open kernel (up to 570/580 now). This was a complete switch off Windows 10, which I've only had to boot twice in three months, to transfer some data.
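If you want to confirm which of those drivers is actually loaded after switching, here's a quick check from a terminal (a sketch using the standard pciutils lspci, nothing Mint-specific):

    # show the GPU and the kernel driver currently bound to it
    lspci -nnk | grep -iA3 'vga\|3d'
    # look for "Kernel driver in use: nouveau" or "nvidia"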

Every game I've tried on Linux was either gold or platinum on ProtonDB and ran fine so far. WINE worked for running a couple non-game apps. Lutris is another way to run programs but I haven't needed it yet.

Definitely try it if you have a machine sitting there. There is so much support for Linux and Mint on the web that it was easy to answer any questions I had while setting things up.


A 1070 Ti has worked perfectly on Arch for the past ~6 months with the latest drivers (better than Debian stable!). The card is old enough that only the closed source drivers support it, but it seems to work fine.


You will be absolutely fine


If it helps, I used to use a gaming laptop for work that had a mobile RTX 2060. I was able to run some recent games like Elden Ring (including mods and online play), and some older but still demanding titles like The Witcher 3, all without tinkering too much on an out-of-the-box Ubuntu LTS install (I later switched to Pop!_OS because I don't like snaps that much).


Someone else recommended Pop!_OS in other comments. I pulled up the page but haven't looked at it extensively. In what ways is it different from a recent Ubuntu LTS install?


The two are very similar in a lot of respects: they both make it very easy to install proprietary Nvidia GPU drivers, they both use Debian as a base, and the default desktop for both is GNOME.

In the past I migrated to Pop!_OS mainly due to these two things:

1. I prefer flatpaks/system packages over snaps[1]. If you run `sudo apt install firefox` on Ubuntu you'll get a snap application, which to me is an anti-pattern from Canonical to force adoption of snaps (you can verify this yourself; see the check after this list).

2. I found some weird performance issues with my laptop when I installed Ubuntu. Thankfully I managed to find a fix after hours of tinkering[2], but I was surprised I had to do these workarounds in the first place (I've been running Linux as my main OS since I was a teenager and never had to do anything similar).
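As mentioned in point 1, here's a quick way to see the snap redirection on a stock Ubuntu install (a sketch, assuming snapd is present, which it is by default):

    # apt presents firefox as a transitional package that pulls in the snap
    apt policy firefox
    # the actual browser then shows up in the snap list
    snap list firefox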

If gaming is your main goal, I'd consider using something like Bazzite or CachyOS. These two distros will still serve you if you want to run work/office apps too; they just come preconfigured with a lot of gaming goodies (like Steam and Nvidia drivers out of the box) but won't get in your way if you plan to install other stuff.

[1] https://snapcraft.io/ [2] https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1973434...


Needs to be even narrower (iPhone 16 Pro, landscape, Safari).


I certainly won't be upgrading to this version. I already don't really like the current version and see no reason to inflict a Windows Vista-like experience on myself.

