Maybe he is right and LLMs are a dead end, maybe even the transformer architecture as a whole, but the attention mechanism isn't. It's the crucial, ingenious part of today's AI, and it can certainly be used to create world models that don't just talk about things but really experience them.