One of the big complaints with LLMs is the confident hallucination of incorrect facts, like software APIs that don’t exist.
But the way I see it, if ChatGPT thinks the Python list object should have a .is_sorted() property, that’s a pretty good indication that maybe it should.
I work in PM (giant company, not Python), and one of these days my self-control will fail me and I will open a bug for “product does not support full API as specified by ChatGPT”.
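For reference, a minimal sketch of what such a helper might look like (is_sorted is a hypothetical name here; the stdlib list type has no such method, and the usual idiom is a pairwise comparison):

```python
# Hypothetical helper: Python lists have no built-in is_sorted(),
# so this is the common one-liner people reach for instead.
def is_sorted(xs):
    """Return True if xs is in non-decreasing order."""
    return all(a <= b for a, b in zip(xs, xs[1:]))

print(is_sorted([1, 2, 3]))  # True
print(is_sorted([3, 1, 2]))  # False
```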
> LLMs is the confident hallucination of incorrect facts
This is a very common feature of delirium in people. Chatting with an LLM feels a lot like talking to a clever person with encyclopedic knowledge who is just waking up from anaesthesia or is sleep-talking.
When a new person is born their entire life is hallucinated in its entirety by the all great and powerful GPT. Deviation from His plan is met with swift and severe consequences.
Love it. Get ChatGPT to write the missing method, execute it just this once, then store it in a file and add an import to the current source file so it's cached for next time.
I can't find it, but someone already built a Python module that plugs into GPT-3 and automatically generates functions on the fly as you call them - and then does the same for methods on the returned values, etc.
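A minimal sketch of the pattern both of these comments describe, under stated assumptions: llm_complete() is a stand-in for a real GPT client, hard-coded here (to the is_sorted helper above) so the example runs offline, and the cache file name is made up for illustration:

```python
import importlib.util
from pathlib import Path

CACHE = Path("llm_generated.py")  # made-up cache file name

def llm_complete(prompt: str) -> str:
    # Stand-in for a real LLM API call; hard-coded so the sketch
    # is self-contained and runnable without network access.
    return (
        "def is_sorted(xs):\n"
        "    return all(a <= b for a, b in zip(xs, xs[1:]))\n"
    )

def get_function(name: str):
    # Generate the function once, append it to the cache file,
    # and import it from there on every later call.
    cached = CACHE.exists() and f"def {name}(" in CACHE.read_text()
    if not cached:
        source = llm_complete(f"Write a Python function named {name}.")
        with CACHE.open("a") as f:
            f.write(source + "\n")
    spec = importlib.util.spec_from_file_location("llm_generated", CACHE)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return getattr(module, name)

is_sorted = get_function("is_sorted")  # generated and cached on first call
print(is_sorted([1, 2, 3]))  # True
```

A fancier version would hook module-level __getattr__ (PEP 562) so merely referencing a missing name triggers generation, which is presumably what the module mentioned above does.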