Hacker News

I'll suggest a longer-term vision: no individual consumer will be able to afford the hardware needed to use anything more than a thin client. Games of the future will require more powerful hardware than what you are using now, but even a typical gaming computer of today won't be affordable in the future, simply because it won't be produced for the mass market anymore. In effect, all your computation will run on platforms owned by someone else. And maybe it will even be considered by society and the government to be an obviously sensible safety measure: the same way citizens shouldn't be able to possess nuclear weapons, they shouldn't be able to produce dangerously powerful software on their own, without government monitoring and approval.

Wow, fuck, I seriously scared myself just now.



Before you get too scared, please watch these ads sponsored by Ad, Inc. between your corridor transitions. Btw, your game developer didn't intend that at all, it's just for our platform to be profitable.

You can remove it by going premium (tracking not included) for only $RTX3080Ti/year.


High-quality graphics aren't that important to making good games, and any graphical improvement has diminishing returns. The jumps from PS1->PS2 and PS2->PS3 (using consoles because they're static comparison points, but it applies to PC games too) were noticeable and huge, but each new generation since has brought smaller and smaller improvements. And then there are all the games that don't go for cutting-edge graphics.


> I'll suggest a longer term vision: no individual consumer will be able to afford any hardware needed to use anything more than a thin client.

Compute power has become cheaper and cheaper over time. If you just look at the ridiculous powerhouses smartphones have become in a very short time, it is apparent that this will not happen.

Also, consoles are pretty damn affordable - certainly compared to how expensive console/PC games are. Major game technology updates are mainly driven by new console generations. You might get some extra resolution or niche technologies on an ultra-high-end PC, but the main underlying tech of modern games is still tailored to the current generation of consoles. I own an i7 with a 2080 Ti - and sure, games might look a bit better than their console versions (if available at all) - but the game on my 4K desktop PC is still essentially the same one I could run on a $300 PS4.


The Ready Player One movie comes to mind: https://en.wikipedia.org/wiki/Ready_Player_One_(film)

I had a similar reaction when initially thinking through such a centralized architecture - and it made me understand that decentralized computational infrastructure ("personal computers") is a necessary failsafe against the pitfalls of such centralization; for a similar reason, I think mesh network technology should potentially be everywhere, though not used as the default network.

These are the systems and failsafes, along with the canaries we build into systems, that we need to educate everyone about - so the public can clearly understand them and have a document available to refer to.


Seeing how, imho, the best games of all time were all made in the 1990s, I don't think gaming is all that reliant on powerful hardware.


Agree with your point but I'd say 1980s!


Agree with your point but I'd say {decades in which you were a child}!


Childhood nostalgia absolutely plays a role, but in my experience game developers tended to trade quality for graphics as the years advanced. In the 1980s, games like Ultima 4 couldn't use graphics as a crutch and thus had to be really engrossing. My other gripe is the ever-decreasing difficulty of games. Back in the 1980s most games were difficult (if not very difficult, or nearly impossible) to finish. You could play a game back then for months or years and never finish it. There was no internet to search for answers to the problems or riddles you couldn't solve. Games were not designed to be easily completed by virtually anyone who played. I can't remember how many hours I spent playing Zork before I was able to solve all the puzzles and beat the game (let alone Zork II and Zork III).


I'd say that in the 80s most game concepts were invented, but many suffered from the technical limitations of the time. In the 90s, most of these limitations were gradually lifted. The high point probably differs from one game category to another.

For instance, Pac-Man, Tetris, and Galaga are still good fun today. Ditto Super Mario Bros. Flight Simulator II? Sorcery? Not so much. There are plenty of 3D games from the early 90s that are perfectly fun to play, and some, like "Thief", haven't IMO been bested in gameplay.


The 80s laid the groundwork, the 90s perfected it, and from the 00s onwards it was all downhill, with a few exceptions here and there.


Yeah, that is basically the end game I was hinting at in my last sentence. I just didn't want to start with it because it'd sound too tinfoil-hatty :-P (and some already told me I am assuming too much :-P).



