Bandwidth isn't the problem. Latency is. Multiple gigabits of bandwidth won't help you when it takes half a second for your character to start jumping after you hit the button.
For a few of the people involved, the latency actually did exceed a second.
The problem is that the latency that matters is to the nearest Google DC that actually has graphics cards in its servers, which might not be your nearest Google DC at all. And Google probably fucked up other things too, considering the amount of fail in this product.
I think there's potential for a huge change in how games are made. One possibility is a "thin client" that only captures and sends input events, with all rendering done on a remote machine (rough sketch below).
A drastic improvement to input/rendering delay for "cloud gaming" (down to 10-30ms) plus lots of bandwidth could even attract professional gamers. Having your gaming machine in the same DC as the game servers is a huge advantage if everything else is optimized.
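To make the thin-client idea concrete: input events travel on their own lightweight channel and never wait on the video stream coming back. Here's a minimal Python sketch; the server address and event format are entirely invented for illustration.

    import json
    import socket
    import time

    # Hypothetical rendering server sitting in the DC next to the game servers.
    RENDER_SERVER = ("203.0.113.10", 9000)

    # UDP: a lost input packet is cheaper than a late one, so we skip
    # TCP's retransmission delays entirely.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_input(event_type, payload):
        """Fire-and-forget a timestamped input event."""
        event = {
            "t": time.monotonic(),  # lets the server drop or reorder stale events
            "type": event_type,     # e.g. "key_down", "mouse_move"
            "data": payload,
        }
        sock.sendto(json.dumps(event).encode(), RENDER_SERVER)

    # Press jump, flick the mouse; the video stream coming back would be a
    # completely separate (and much heavier) connection.
    send_input("key_down", {"key": "space"})
    send_input("mouse_move", {"dx": 12, "dy": -3})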
That type of thing is alright for action inputs like shooting, running, etc. I've yet to use a game streaming service, including Steam Link inside my own house, that doesn't make first-person camera movement nauseating.
I know about it. Trust me, NOBODY will play with a teammate who has 500ms latency. On EU servers my usual latency is 43-47ms, which is really far from "half a second".
I (EU) played regularly with someone in Australia on R6:S and it wasn't much of a problem. He was used to latency issues, so he played as well as anyone else would.
It's not possible to get used to latency that huge (500ms). He would be a burden on a team in competitive games. That's a fact (if that "friend from Australia" exists at all).
That friend certainly exists and he's not a burden on the team. Maybe you focus too much on the latency number and forget that some people play to have fun, while still playing very decently.
I think it's very rude that you suggest I don't know people in Australia (do you even know anyone from Australia?).
I still don't believe you. I have experience playing a competitive shooter with 250ms latency, and it was nearly impossible to aim. And you're talking about 500ms latency. Let's just agree to disagree.
Well, he plays with between 100 and 300ms latency on average, at least whenever I check the player screen. He's perfectly capable of hitting targets reliably.
On a similar note, I used to play World of Tanks with around 400ms latency before I got a better internet uplink. It's certainly bad for the first month, but you eventually get used to it and your brain compensates. Of course, if raw reaction time matters you lose, but in a lot of games you can adapt your playstyle so that pure reaction time doesn't matter as much.
There are a bunch of tricks that go into the netcode, though. A slight delay on hit recognition is noticeable and sometimes annoying, but movement can be silky smooth (see the sketch below). With Stadia, wouldn't that latency open you up to input lag?
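For context, the main movement trick is client-side prediction with server reconciliation: apply your own inputs locally right away, then replay any unacknowledged ones on top of each authoritative server update. A toy sketch, with every name and number invented:

    from collections import deque

    class PredictedPlayer:
        def __init__(self):
            self.x = 0.0
            self.seq = 0
            self.pending = deque()  # inputs applied locally, not yet acked by the server

        def apply_input(self, dx):
            """Move immediately (no waiting for the server), but remember the
            input so it can be replayed if the server corrects us."""
            self.seq += 1
            self.x += dx
            self.pending.append((self.seq, dx))
            return self.seq  # would be sent to the server alongside the input

        def on_server_state(self, acked_seq, server_x):
            """Reconciliation: snap to the authoritative position, then replay
            the inputs the server hasn't processed yet."""
            while self.pending and self.pending[0][0] <= acked_seq:
                self.pending.popleft()
            self.x = server_x
            for _, dx in self.pending:
                self.x += dx  # re-predict on top of the corrected state

    p = PredictedPlayer()
    p.apply_input(1.0)  # moves on screen instantly
    p.apply_input(1.0)
    p.on_server_state(acked_seq=1, server_x=0.9)  # small correction from the server
    print(p.x)  # 1.9: smooth movement despite the round trip

And note that the trick depends on simulating and rendering locally, which is exactly what Stadia-style streaming takes away, so yes, that latency lands straight on your inputs.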
I've used Parsec. It's not perfect, but it's better than playing on a low to mid-range gaming machine.
The input delay isn't as much of a problem as bandwidth, but it could definitely be improved. The problems arise when setting it to 120+ FPS.
60FPS works surprisingly well, and for single player games you probably wouldn't even notice the input delay.
Of course there is, but there are tricks for that too :) Modern software (and AI) can predict some user actions quite precisely. There will still be input lag, no doubt, even with the smartest AI, but for some games it's pretty acceptable, especially non-competitive ones.
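The crude version of that prediction is plain dead reckoning: assume the hand keeps moving the way it just was, and render that before the server confirms it. A toy sketch (the sample interval, horizon, and numbers are all invented):

    def predict_next_delta(recent_deltas, horizon_ms, sample_ms=8):
        """Extrapolate the next mouse delta from recent samples to cover
        roughly horizon_ms of network lag. Works mid-flick, guesses wrong
        at sharp direction changes, which is exactly where prediction feels off."""
        if len(recent_deltas) < 2:
            return (0.0, 0.0)
        steps = horizon_ms / sample_ms  # how many samples of lag to hide
        avg_dx = sum(d[0] for d in recent_deltas) / len(recent_deltas)
        avg_dy = sum(d[1] for d in recent_deltas) / len(recent_deltas)
        return (avg_dx * steps, avg_dy * steps)

    # Mouse moving steadily right at ~5 px per 8ms sample, hiding ~40ms of lag:
    print(predict_next_delta([(5, 0), (5, 1), (6, 0)], horizon_ms=40))
    # -> roughly (26.7, 1.7): the camera renders there before the server confirms

Which is why prediction is fine for steady motion and falls apart on twitchy, competitive play.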