Hacker News | new | past | comments | ask | show | jobs | submit | NackerHughes's comments | login

It absolutely will not make formal verification go mainstream.

What it will make go mainstream, and in fact has already started to, is “ChatGPT verified it so it must be OK.”


I wouldn't be surprised if a fair majority of them have been taught to see goto as nothing but a vestige of the 70s which should never be used under any circumstances except as a meme or to deliberately obfuscate code.

I have recently become quite fond of goto-based error handling and find it a lot cleaner and more readable than the if-else-mountains you otherwise end up with. I just make sure to leave a comment with a link to xkcd.com/292 so anyone else reading it knows I'm aware of what I'm doing. Now with this URL trick I can do both in one line. :)


And the corresponding `goto http` is the chef's kiss :D love it


Any time you want to sign up for any "open social" platform, you have to rush to claim your domain name. If someone else got there first, too bad. And that domain name applies to every app using this protocol. So no chance to claim it on another app, ever.

So what, exactly, is the difference between this and internet handles? In fact, isn't this worse?


I suppose the difference is that you only have to rush to claim your domain name (the DNS kind) once and then you get to use it for all "open social" platforms rather than doing that for your username on each platform.


What do you mean every time? You only need one. Plus there are loads of TLDs for different people to have similar names.


3-in-1. Lack.


“The loudest voices preaching about taste and AI are often the ones who never demonstrated taste before AI.”

Yes, and if even these people can tell that AI generated stuff is godawful and tasteless, that tells you everything you need to know about AI.


AI stuff is tasteless in a different sense: It's like food that has no flavor. It "lacks salt" in some non-literal sense.

And I think that is not all that surprising, because much of what it was trained on was corporate-speak, which has the same problem.


OTOH, I've read that the success rate at detecting it is only about 50-50. AI text does leave some clues, like those infamous em dashes, but those can be patched with some simple edits. AI images are more obvious because many are intentionally overwrought.


It depends on what the AI is trying to do. If you write a novel and ask the AI to improve the prose, it becomes very obvious if you've dealt with AI prose before.


I would argue the opposite: most of the AI slop I see is gaudy and overwrought, the result of too much flavour carelessly applied.


I agree. To me it feels like food with too many additives, like fast food.


Survivorship-bias airplane meme: that's only the AI text you notice.


Does it include the author of the blog post? Your comment is doing it too...

Or is it only bad when "AI people" do it?


"Detect AI" is such a stupid goalpost given how much the caliber of output varies from one individual to another when coding with an LLM.

Do these people detect all AI? Of course not; they detect crap.


The quoted part is really amazing. The author of the article just makes claims and clearly hasn't been exposed to any real art or music education. I suppose to some extent he means taste in programming, but anyone who is writing such an article does not have that either.

We can also talk about taste in articles, which seems to have degenerated to "any pro-AI article will be voted up and defended".


I'm not sure if this counts as a pro-AI article, but I agree. It's void of substance.

The most ironic part:

> When someone preaches about AI taste, ask them to show you their work from before AI. If they can’t demonstrate taste in their pre-AI work, they’re not qualified to lecture you about it now.

Talking about the lack of self-awareness...


The designer wants huge amounts of screen space wasted on unnecessary padding, massive Fisher-Price rounded corners, and fancy fading and sliding animations that get in the way and slow things down. (Moreover, the designer just happens to want to completely re-design everything a few months later.)

The customer “ooh”s and “aah”s at said fancy animations running on the salesman’s top of the line macbook pro and is lured in, only realising too late that they’ve been bitten in the ass by the enormous amount of bloat that makes it run like a potato on any computer that costs less than four thousand dollars.

And US/EU laws are written by clueless bureaucrats whose most recent experience with technology is not even an electric typewriter.

What’s your point?


I think their point is that you might not have much of a choice, taking laws and modern aesthetic and economic concerns into consideration.

We "in the know" might agree, but we're not going to get it sold.


I think blind people should be able to use websites.


Wow, those are some jaded and cynical views.


Be honest. How much of this article did you write, and how much did ChatGPT write?


To the latter: absolutely all of it, though I'd put my money on Claude; the article has more of its prominent patterns.


Be honest. Did you write this comment with an LLM?

Why should it matter beyond the correctness of the content, which you and the author need to evaluate either way?

Personally, I'm exhausted with this sentiment. There's no value in questioning how something gets written; only the output matters. Otherwise we'd be asking the same about pencils, typewriters, dictionaries and spellcheck in some pointless pursuit of purity.


What, surely you’re not implying that bangers like the following are GPT artifacts!? “The changes aren’t just cosmetic; they represent a fundamental shift in how we approach server-side JavaScript development.”


And now we need to throw the entire article out because we have no idea whether any of these features are just hallucinations.


You don't know if the author knows what they're talking about with or without AI in the picture.


True! But writing an egregiously erroneous blogpost used to take actual effort. The existence of a blogpost was a form of proof of work that a human at least thought they knew enough about a topic to write it down and share it.

Now the existence of this blogpost is only evidence that the author has sufficient AI credits they are able to throw some release notes at Claude and generate some markdown, which is not really differentiating.


I think we have enough Node developers here to judge its truthfulness.


I'm not sure - a lot of the top comments are saying that this article is great and they learned a lot of new things. Which is great, as long as the things they learned are true things.


This article could have been two lines. It takes some serious stretching of school-essay-writing muscles to inflate it to this many pages of waffle.


I think a paragraph could have been enough to describe the issue.

My goal with this post was to describe my personal experience with the problem, research, and the solution - the overall journey. I also wanted to write it in a manner that a non-technical person would be able to follow. Hence, being more explicit in my elaborations.


> GPTBot et al. will probably do the same, as more people use AI to replace search.

It really won’t. It will steal your website’s content and regurgitate it back out in a mangled form to any lazy prompt that gets prodded into it. GPT bots are a perfect example of the parasites you speak of that have destroyed any possibility of an open web.


Only if the GPT companies can resist the temptation of all that advertising $$$.

I'll give them at most 3 years before sponsored links begin appearing in the output and "AI optimization" becomes a fashionable service alongside the SEO snake oil. Most publishers won't care whether their content is mangled or not, as long as it is regurgitated with the right keywords and links.


What do you mean sponsored links? It'll be a sponsored reply, no outbound links required.


Links to a site where the sponsor's product or service can be purchased, obviously.


That was my hunch. My initial post on robots.txt: https://evgeniipendragon.com/posts/i-am-disallowing-all-craw... - revolved around blocking AI models from doing that because I do not believe that it will bring more traffic to my website - it will use the content to keep people using their service. I might be proven wrong in the future, but I do not see why they would want to let go of an extra opportunity to increase retention.
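For what it's worth, the blocking itself is only a few stanzas in robots.txt. A sketch using crawler names that are publicly documented (GPTBot by OpenAI, CCBot by Common Crawl, Google-Extended by Google); note that compliance is entirely voluntary on the crawler's part:

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```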


Which is all you need a lot of the time. If you're a hotel, or restaurant, or selling a product, or have a blog to share information important to you, then all you need is for the LLM to share it with the user.

"Yes, there are a lot of great restaurants in Chicago that cater to vegans and people who enjoy musical theater. Dancing Dandelions in River North is one." or "One way to handle dogs defecating in your lawn is with Poop-Be-Gone, a non-toxic product that dissolves the poop."

It's not great for people who sell text online (journalists, I guess, who else?). But that's probably not the majority of content.


You're making a great point. In some cases, having your data as available as possible is the best thing you can do for your business. Letting them crawl and scrape creates the means by which your product is found and advertised.

In other cases, like technical writing, you might want to protect the data. There is a danger that your content will be stolen and nothing will be given in return - traffic, money, references, etc.


So just think how much greater the browser could be if Mozilla put more of the money they get into improving Firefox instead of into pointless UI redesigns that only slow things down, or breaking existing functionality - not to mention all the other frivolous nonsense they seem preoccupied with instead of being a credible competitor to Google.

With how they've been in recent years it's almost as if they're trying to be inept competition, as if they're being paid by Google to suck - in fact, that is all but established by now.


One man's "useless UI redesigns" is another man's feature.

Every person I talk to has a completely different idea of what Mozilla should be doing. Keep pocket, or pocket is stupid, or pocket is the second coming of Christ, or the VPN is stupid, or the VPN is a revenue stream not dependent on Google, or whatever fucking bullshit.

Firefox does not "suck" - this is a legitimate psyop. It has all the features of Chrome minus the horrendous privacy violations. It has all the performance of Chrome, too.

I mean, what are we missing? Web USB? Give me a fucking break.

If you really, really need Web USB then fine - use Google Chrome. You win. 99.99% of people I've ever talked to don't even know what Web USB is, let alone do they rely on it.

