The problem is that the bandwidth costs become too high. Internet-streamed video can be encoded with very efficient algorithms, it can be cached at your ISP, and it doesn't care much about latency. None of these are true for game streaming: people will want a higher-quality image with low latency, and since every session is unique it's not something that can ever be cached.
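A back-of-envelope sketch in Python, just to put a scale on it (every number below is an assumption, not a measurement):

    # Rough sketch; all figures here are assumed for illustration.
    STREAM_MBPS = 20          # assumed bitrate for a decent 1080p60 game stream
    HOURS_PER_MONTH = 30      # assumed play time per user per month

    # Game streaming: every session is unique, so every bit crosses the backbone.
    game_gb = STREAM_MBPS / 8 * 3600 * HOURS_PER_MONTH / 1000
    print(f"game streaming per user: ~{game_gb:.0f} GB/month of uncacheable traffic")

    # Video streaming: most of the same bits can be served from an ISP-local cache.
    CACHE_HIT_RATE = 0.95     # assumed hit rate for a Netflix-style caching appliance
    video_gb = game_gb * (1 - CACHE_HIT_RATE)
    print(f"comparable video service: ~{video_gb:.0f} GB/month actually leaves the origin")

Under those assumptions a single player pushes hundreds of GB per month that can't be offloaded to a cache, which is the whole cost argument in one number.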
Two words: edge compute. The major ISPs are in the process of rolling this out in trials, with nationwide rollouts likely to start in 2021. The general idea is that you build some amount of CPU / GPU capacity in the boxes on the poles outside your house. Initial versions likely won't be able to support the full Stadia 4k experience, but it'll be enough to play games like Fortnite or Minecraft.
Considering those last-mile edge devices often cost in the tens of thousands of dollars and use a lot of power, adding a few hundred ARM cores isn't a huge cost increase for the benefit it provides.
You're not going to get a lot of gaming done on ARM cores, when the games Stadia is trying to sell are built for x86 and x64.
Games that run on ARM devices (Fortnite, Minecraft) can already run on Stadia's end-user hardware (phones, low-end computers, even many Android-based smart TVs), and don't require streaming infrastructure.
Why not? Think Nintendo Switch (which uses an ARM-based Tegra X1), not PS5. And give the hardware 3-5 years; it will be able to do 4k. It'll be just another port.
Also, a single Netflix blade can support caching for hundreds of users. Stadia blades support only one user per blade, and it's not like the hardware is cheap or the space in cabinets unlimited.
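You can see the gap in cost per concurrent user with another rough sketch; the prices and capacities below are purely hypothetical placeholders, only the one-session-per-blade constraint comes from the comment above:

    # Illustrative only; hardware costs and capacities are assumptions.
    CACHE_BLADE_COST = 15_000       # assumed cost of a video caching appliance
    CACHE_USERS_PER_BLADE = 500     # assumed concurrent streams it can serve

    GAME_BLADE_COST = 3_000         # assumed cost of one GPU game-server blade
    GAME_USERS_PER_BLADE = 1        # one live session per blade

    print("video cache, cost per concurrent user: "
          f"${CACHE_BLADE_COST / CACHE_USERS_PER_BLADE:.0f}")
    print("game streaming, cost per concurrent user: "
          f"${GAME_BLADE_COST / GAME_USERS_PER_BLADE:.0f}")

Even with generous guesses for the game blade, the per-user capital cost ends up orders of magnitude higher than for a cache box, because it can't be shared.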