> UB in the language specification allows compilers to optimize based on the assumption that the programs you write won't have undefined behavior.
Given that that assumption has proven to be completely false, I don't think there's any justification for compilers continuing to make it. Whatever performance gains they are getting are simply not worth the unreliability they are courting.
> Given that that assumption has proven to be completely false
This part is correct. The problem is how to deal with it. If you want the compiler to produce correct results even for code that has undefined behavior, often the only option is to assume that any code might have undefined behavior. That means almost every operation gets a runtime check and branch, which is completely incompatible with how modern hardware works.
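To make that concrete, here is a rough sketch (mine, purely illustrative, not what any particular compiler emits) of what "assume every operation might be UB" would mean for a trivial loop: each iteration picks up guards for null dereference and signed overflow, and those extra branches are exactly what keeps values out of registers and defeats vectorization. Sanitizer builds like UBSan do roughly this, which is why they're debugging tools and not the default.

```c
#include <limits.h>
#include <stdlib.h>

/* What the programmer writes. */
int sum(const int *a, int n) {
    int s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Roughly what a compiler would have to emit if it could not assume the
 * absence of UB: a branch guarding each potentially undefined operation.
 * (Bounds can't even be checked here, since the array's size is unknown.) */
int sum_checked(const int *a, int n) {
    int s = 0;
    for (int i = 0; i < n; i++) {
        if (a == NULL)                      /* guard: null dereference */
            abort();
        int v = a[i];
        if ((v > 0 && s > INT_MAX - v) ||   /* guard: signed overflow */
            (v < 0 && s < INT_MIN - v))
            abort();
        s += v;
    }
    return s;
}
```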
The rest is wrong, but again, this is a common misconception. Language designers and compiler writers are not idiots, contrary to popular belief. UB as a concept exists for a reason. It's not there for marginal performance boosts; it exists to make compiler-based transformations possible at all, and to give the language a notion of portability across hardware.
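A standard example of the kind of transformation that relies on UB (my sketch, under the assumption of a typical 64-bit target): because signed overflow is undefined, the optimizer may assume the induction variable never wraps, compute the trip count up front, widen `i` to a pointer-sized register, and vectorize. With defined wraparound it would have to preserve the degenerate case where `i` wraps to a negative value and the loop never terminates. The portability angle is the same mechanism: the standard doesn't pin down what overflowing hardware does, so the compiler isn't forced to emit fix-up code emulating one machine's behavior on every other machine.

```c
/* Because signed overflow is UB, the compiler may assume `i` never wraps
 * past INT_MAX. The trip count is then a simple function of `n`, so the
 * loop can be vectorized and `i` kept in a 64-bit register. With wrapping
 * semantics, n == INT_MAX would make this an infinite loop with `i` going
 * negative, and the compiler would have to preserve that behavior. */
void scale_evens(float *a, int n) {
    for (int i = 0; i < n; i += 2)
        a[i] *= 2.0f;
}
```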