Depending on the decision, yes. An LLM might confidently hallucinate incorrect information and misinform you, which is worse than simply not knowing.