It's also just not in its training corpus. It's not. It's in journal articles, etc. But here we go again thinking ChatGPT can give us something from nothing. And a hallucination involving a chemical reaction could easily mean death. It's just not a good tool for this.