
It's an arms race between human writers and AI. Writers want to sound less like AI, and AI wants to sound more like writers, so no indicator stays reliable for long. Today typos indicate a real writer, so tomorrow LLMs will inject them where appropriate. Yesterday em dashes indicated an LLM, so now LLMs use them less.

Beyond these surface-level tells, though, anyone who's read a lot of both unassisted human writing and AI output should be able to pick up on many subtler cues, which persist partly because they're harder to describe (and therefore harder to RLHF out of LLMs).

But even today, when it's not too hard to sniff out AI writing, it's quite scary to me how bad many (most?) people's chatbot-detection senses are, as this article indicates. Mistaking human writing for LLM output is a false positive, which is bad but not catastrophic; the reverse error seems much worse. The long-term social impact, becoming "post-truth", seems poised to be what people have been raving / warning about for years with respect to other technologies like the internet.

Today feels like the WW1 equivalent for information warfare: society has been caught with its pants down by the speed of innovation.





> society has been caught with its pants down by the speed of innovation.

Or rather by the slowness of regulation and enforcement in the face of blatant copyright violation.

We've seen this before, for example with YouTube, which became the go-to place for videos by allowing copyrighted material to be uploaded and hosted en masse. Then a company that was already a search-engine monopoly was somehow allowed to acquire YouTube, extending and reinforcing Google's monopolization of the web.


Innovation has always been faster when copyright is lax. The US copied British and other European inventions left and right during the industrial age, and its economy took off because of it.


