People constantly make this mistake, so just to clarify: absolutely nothing about what I just said implies that LLMs are not helpful.
Having an accurate mental model for what a tool is doing does not preclude seeing its value, but it does preclude getting caught up in unrealistic hype.