This might actually be better in a certain way: if you change a real customer-facing API, customers will complain when you break their code, whereas an LLM will likely adapt. So the interface is more flexible.
But perhaps an LLM could write an adapter that gets cached until something changes?
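A minimal sketch of that caching idea, assuming a hypothetical `generate_adapter` step standing in for the LLM call: the adapter is keyed by a fingerprint of the target API's signature, so it is only regenerated when the interface actually changes.

```python
import hashlib
import inspect
from typing import Callable

# Cache of generated adapters, keyed by a fingerprint of the target API.
_adapter_cache: dict[str, Callable] = {}

def interface_fingerprint(func: Callable) -> str:
    """Hash the function's name and signature; a changed signature means the
    cached adapter is stale and must be regenerated."""
    sig = str(inspect.signature(func))
    return hashlib.sha256(f"{func.__name__}{sig}".encode()).hexdigest()

def generate_adapter(func: Callable) -> Callable:
    """Hypothetical stand-in for the LLM call that writes glue code
    against the current shape of the API."""
    def adapter(*args, **kwargs):
        return func(*args, **kwargs)
    return adapter

def get_adapter(func: Callable) -> Callable:
    """Return the cached adapter, regenerating only on a cache miss
    (i.e. when the interface fingerprint has changed)."""
    key = interface_fingerprint(func)
    if key not in _adapter_cache:
        _adapter_cache[key] = generate_adapter(func)
    return _adapter_cache[key]
```

In this sketch the expensive step (the LLM writing the adapter) runs once per interface shape; repeated calls hit the cache, and a signature change naturally invalidates it.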