Carmack: Next-gen games will still target 30 fps (develop-online.net)
43 points by Pr0 on Dec 19, 2012 | hide | past | favorite | 57 comments


Insomniac Games (the developers of Ratchet and Clank, and the Resistance series) wrote a blog post some time ago about their decision to focus on using a lower framerate (all their previous games were 60 fps).

http://www.insomniacgames.com/how-much-does-framerate-matter...

The summary is that prettier games get better reviews and sell more copies.


Turn 10, of Forza Motorsport fame, refuses to compromise render rate in the series. Forza 2 to 4 (no idea for the 1st one, which ran on the original Xbox) run at a steady 60fps (physics loops at 360ips), even in split screen (at the cost of a few effects, notably cockpit mirror reflections). Technically this really matters when judging the braking point at 400kph down Old Mulsanne. In terms of enjoyment, sure there's only reduced antialiasing, but driving down the Bernese Alps close to the guard rail in Forza 4, especially with Kinect head tracking (not enough cash nor room for a triple-head+seat setup), is incredibly immersive.

Forza Horizon though, developed by an all-star team[0] distinct from Turn 10, chose to drop to 30fps, notably for full AA, effects such as motion blur, and features like a continuous day/night cycle and realistic car headlights (old bulbs look deliciously warm, and the chromatic effect on the crisp edge of xenon headlights is awesome). As realistic and gorgeous as Forza Horizon is, at speed it feels buttery smooth but not quite as crisp as Forza 4. It's as if you took a low-resolution image and tried to make it look nicer with a blur filter, only applied to time instead of space, and thus to movement.

[0]: http://www.eurogamer.net/articles/digitalfoundry-the-making-...


Disclaimer: I worked on Forza Horizon.

I think the main issue with Horizon was actually the open world, which proved render-intensive even with some serious LoDing. Also the motion blur, as you mentioned. There are always trade-offs to be made, and the 360 isn't getting any faster.

One thing I think perhaps more studios should try is running the main game loop at a different rate to the render loop. As you rightly pointed out, the physics simulation runs at 360fps in Forza; similarly, there's nothing to stop the game from double-buffering some of the game state and running the game loop (including input) at 60fps. A big part of the problem with 30fps games (especially racing ones) is that input lag becomes much larger than at 60fps. You're normally looking at at least two frames of input lag - one to process the game, one to render it - so a minimum of 66ms for any button press to take effect. In practice it tends to be about 100ms for 30fps games, and 50ms or so for 60fps, which makes the game feel a lot more responsive.
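The two-frame arithmetic above can be sketched in a few lines (a toy illustration, not engine code; the simple "one frame to simulate, one frame to render" pipeline is the assumption the comment describes):

```python
# Toy illustration of minimum input-to-display latency, assuming a
# simple two-frame pipeline: one frame to run the game logic, one
# frame to render the result.
def min_input_lag_ms(fps, pipeline_frames=2):
    frame_ms = 1000.0 / fps          # duration of one frame in ms
    return pipeline_frames * frame_ms

print(round(min_input_lag_ms(30), 1))  # 66.7 -> the "66ms minimum" at 30fps
print(round(min_input_lag_ms(60), 1))  # 33.3 -> roughly halved at 60fps
```

Halving the frame time halves the floor on latency, which is why 60fps games feel snappier even when the pipeline depth is identical.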


Well, a better example would be Gran Turismo 5, which runs at 1080p @ 60fps, but that's beside the point. Games that look good in screenshots will always sell better.


GT5 is definitely not a better example, because it suffers huge framerate drops in some situations, which completely defeats the immersion effect. Try a race in cockpit view with multiple cars in front of you: when one goes off track and kicks up sand clouds, the framerate often dips far below 30 fps.

What makes Forza 2 and 3 so impressive is that they keep up 60 fps all the time. If a game can't manage that, I'd much rather have a constant 30 fps than an erratic 60 fps.


This is also interesting in light of the backlash against The Hobbit, where people are objecting to the 48fps film rate (most movies are 24fps): http://www.theverge.com/2012/12/18/3780274/48-fps-how-we-acc...

and the recent Doing Gravity Right post on HN, which shows that some games still use physics equations that don't properly account for sub-60fps framerates: http://www.niksula.hut.fi/~hkankaan/Homepages/gravity.html
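The linked post's core point can be shown numerically (a minimal sketch of my own, not code from the article): naive Euler integration of gravity produces a trajectory that depends on the timestep, while adding the constant-acceleration correction term makes it exact at any framerate.

```python
# Sketch of timestep-dependent gravity: naive per-frame integration
# gives different fall distances at 30Hz vs 60Hz, while the corrected
# form (exact for constant acceleration) does not.
G = -9.81  # m/s^2

def fall_distance_naive(dt, steps):
    v, y = 0.0, 0.0
    for _ in range(steps):
        y += v * dt            # position uses stale velocity
        v += G * dt
    return y

def fall_distance_corrected(dt, steps):
    v, y = 0.0, 0.0
    for _ in range(steps):
        y += v * dt + 0.5 * G * dt * dt  # exact kinematics term
        v += G * dt
    return y

# One second of free fall, stepped at 30Hz vs 60Hz:
print(fall_distance_naive(1/30, 30), fall_distance_naive(1/60, 60))
print(fall_distance_corrected(1/30, 30), fall_distance_corrected(1/60, 60))
```

The corrected version lands on the analytic answer (0.5 · g · t² = -4.905 m) at either rate; the naive version undershoots by different amounts, so jump heights change with framerate.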


There's a clear distinction between games and film here, and that's the interactive element. Games react to player input, and a lower framerate results in a pretty poor experience for players (especially in fast-paced games like first person shooters). Add to this the fact that games have only recently started adding motion blur, and a much higher framerate than film's 24 is important for maintaining a fluid image.

As someone who primarily plays first person shooters (TF2 and Quake Live), anything less than 60FPS looks awful to me; I play on a 120Hz LCD.


Except that the #1 selling game of all time, MW3, runs at 60fps, as do the other games in the series:

http://www.escapistmagazine.com/news/view/111353


Perhaps the answer is to have two modes, one at 60FPS with graphical compromises and one at 30FPS without.

That way the harder core can play at 60FPS and those more concerned about looks can play at 30.

The only worry is they introduce a 1FPS mode just for screenshots ;)


PC gamers view settings this way (and so I'd agree with you, I'd love it), but I don't see the rest of the console players agreeing. The idea behind a console is that it should be plug-and-play because all of the hardware is uniform.

Though, they do offer customization of keymaps (errr. button-maps...), so perhaps there is a case for this?


It could be an issue when you have online play, with those at 60FPS kicking ass. However if you default online to 60, the mags can take their screenshots from campaign ;)


Thirty frames per second is depressing. I don't know how some people can stand it. I get annoyed when I notice the frames drop below forty five.


I suspect it matters less on consoles where you are trying to move/look around by pushing a tiny little joystick with your thumb. With a mouse though it is absolutely crippling. Even minecraft is painful at 30Hz with a mouse.


"Crippling" seems a bit hyperbolic. I'd reckon most PC gamers probably don't even notice, though once you start noticing it's hard to "unsee" it, but that doesn't make the game unplayable.


If any of you have Dark Souls on PC, with a decent graphics card, use the 'dsfix' mod to toggle between 30fps (the default) and an unlocked 60fps mode.

It's night and day - suddenly the game becomes that much more fluid, and when toggling between the two, 30fps feels amazingly stuttery.


To me it's crippling, to the point that I will stop playing the game.

I used to play Quake 3, particularly defrag, at 125Hz. 60Hz in normal games is fine to me, but anything less feels like a nightmare where your legs have turned to concrete and you can't turn your neck. It honestly is an unpleasant experience for me.


I would play Diablo 3 on my laptop and desktop and would get around 10 to 20 FPS if I was lucky. So all this talk of 30 FPS is making me jealous!


Those people were talking about FPS games. Diablo's point-and-click interface is not as immediate and while it does benefit from going beyond 10 FPS, you won't see too much of a difference from 30 to 60 FPS.


Frankly, I would be happy if next-gen games actually achieved a steady 30 fps. Doubly so if they achieved it at some reasonable resolution. That would already be a massive improvement over the current state of affairs, where games dip to 20 fps or so at sub-HD resolutions.


Agreed - I can settle for 30fps if the game maintains that. Dark Souls on PS3 and 360 did dip down to 15/12 sometimes depending on the environments.

30fps to 60fps is night and day though -- running Dark Souls on a gaming PC with the 'dsfix' mod to toggle between 30 and 60, it's astonishing how much smoother and more fluid the game is.


I am not a gamer, but isn't high FPS tiring to the eyes because of the lack of natural motion blur, which gives a sense of unnatural movement? Isn't that more important than increasing FPS?


No tired eyes; the real world is effectively infinite frame rate. You get better 'natural' (psychovisual) motion blur as you approach that infinite frame rate. The artificial blur required to hide the lack of smooth motion is reduced, though it can always be increased for artistic effect if desired.

Of course LCD displays will be locked to particular refresh rates, on a TV probably 24Hz, 50Hz or 60Hz (although it may then apply motion-compensated interpolation to take them up to 96-120Hz or even 192-240Hz, but this is often turned off in 'Game' mode to reduce latency and provide less lag between input and result).

* Infinite frame rate is effectively whatever your mind can't distinguish from infinite.


I was going to pick you up on 'infinite' - eyes clearly see at a certain framerate (just look at helicopter blades or the spokes of a spinning wheel to see the results of that), but I guess it depends on the person.

That's quite an interesting thought actually, how many fps do human eyes see at?

To answer that, I imagine you'd need to get into how the human eye works (rods vs cones), how moving images are created, and how interlacing works in movies and the like compared to game engines, which don't interlace.

http://www.100fps.com/how_many_frames_can_humans_see.htm


Eyes don't have a "frame rate" the same way that computer displays and cameras do. For example, under steady light, your eyes will never experience the common effect where wheels appear to spin backwards due to aliasing.

There is obviously a limit to how fast our eyes see, and that limit is somewhere within an order of magnitude or so of 100fps. But it's still a mostly continuous process. There's no hard dividing line between one frame and the next.


I could swear I've seen bicycle wheels "turn backward" in sunlight. Maybe an effect of parallax between the layers of spokes?


I've seen an effect like that which comes from the angle of reflected light changing as the wheel moves. Completely different effect, although it can look somewhat similar. It's independent of the speed of rotation, though.


The spokes and spinning car wheel effects are only visible under flickering lights (eg. fluorescent) and in videos. The eye does not have a "frame rate", although individual cells in the retina and the visual system have slow integration times.


Ah right, I could have sworn I've seen that effect in normal light.

Flickering light does make sense, and of course videos make sense as you're limiting the frame rate.


You may be confusing film with game rendering? If you shoot a movie with a camera that keeps the shutter open for a little while, you will get some motion blur. This lets the motion in the film look smooth. But if you render a game at only 24 fps, each frame will have no motion blur and it will look choppy. (Some games try to add motion blur as well, which helps a little.) In film, if you increase the frame rate, you will get less motion blur and I guess it might look less smooth. But if you increase the fps of a game, it will look smoother. (If the game adds motion blur separately, you could still have huge amounts of motion blur even with a high frame rate, because it's a fake effect.)
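The shutter-integration point above can be sketched numerically (my own toy illustration, not from the thread; the sample count and motion function are arbitrary assumptions): a film camera averages the scene over the exposure window, while a naive game render samples a single instant.

```python
# Toy model of film exposure vs. instantaneous game rendering.
# A "film" frame averages sub-samples across the shutter-open
# interval (producing motion blur); a "game" frame samples one
# instant (producing a crisp but disjointed image).
def film_frame(position_at, t, exposure, samples=8):
    return sum(position_at(t + i * exposure / samples)
               for i in range(samples)) / samples

def game_frame(position_at, t):
    return position_at(t)  # single instantaneous sample: no blur

pos = lambda t: 100.0 * t  # object moving at 100 units/second

print(film_frame(pos, 0.0, 1/48))  # position smeared over the exposure
print(game_frame(pos, 0.0))        # crisp instant
```

A longer exposure smears the position over a wider interval, which is exactly the blur that makes low-framerate film look smooth and its absence what makes a 24fps game look choppy.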


Good examples of frame interlacing here: http://www.100fps.com/


Ideally, you have 60fps AND motion blur, for a picture that best approximates reality. The fancy games have motion blur.

Any game without motion blur is going to show momentary disjointed images. However, you still want the high frame rate for smoothness and a low response time for inputs.


Increasing the FPS of a game isn't going to change the refresh rate of the monitor.

It makes the game smoother because there are more game loops, so more calculations on a smaller time delta.
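Decoupling the two loops is usually done with the classic fixed-timestep pattern (a sketch under my own assumptions, not any specific engine's code): simulate at a fixed rate, render whenever the display is ready, and interpolate between the two latest states so rendering stays smooth regardless of frame pacing.

```python
# Sketch of a fixed-timestep game loop with render interpolation:
# the simulation advances in constant SIM_DT steps, and each render
# blends the two most recent states by the leftover accumulator time.
SIM_DT = 1.0 / 60.0  # fixed simulation step

def step(state, dt):
    return state + 1.0 * dt  # toy physics: constant velocity of 1 unit/s

def run(frame_times):
    state_prev, state_curr = 0.0, 0.0
    accumulator = 0.0
    rendered = []
    for frame_dt in frame_times:
        accumulator += frame_dt
        while accumulator >= SIM_DT:          # catch the sim up
            state_prev, state_curr = state_curr, step(state_curr, SIM_DT)
            accumulator -= SIM_DT
        alpha = accumulator / SIM_DT          # blend factor for rendering
        rendered.append(state_prev + (state_curr - state_prev) * alpha)
    return rendered

# Rendering at an uneven ~30 fps still yields smoothly advancing positions:
print(run([1/30, 1/25, 1/30]))
```

This is why a 30fps render rate doesn't have to mean 30fps physics or 30fps input sampling.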


Carmack's comment:

For the record, just in case it wasn't clear, we continue to target 60fps, and 120fps will be an option on PC. 120fps stereo VR, even. Eventually we will get displays with gaze tracked foveal insets, which will be rendered as a separate view. 120fps × stereo × foveal = 480 vps

https://twitter.com/ID_AA_Carmack/status/281409030369472512


30fps? I remember the mid-to-late 90s "3d accelerator" boom and the games that came out of it. Unless you dumped serious $ into your PC, you were lucky to get 15fps.


Bought a Dell for just under $2K in 1999 (remember prices were much different then) with some serious gaming power for an off the shelf PC. 30 FPS was generally a minimum for this rig when playing games like Q3, UT and HL. If that's costly to you, the TNT2 and other cards were cheaper than my voodoo3 3000 AGP and still delivered good results.


The problem is the consoles. Both the PS3 and the Xbox 360 are absolutely ancient. Let's not even talk about the Wii.

I think I might prefer something like a Steambox, where upgrades would be more frequent. I generally prefer PC gaming, but I like being able to play from the couch with a controller.


Anyone can play on a PC from the couch with the right controller.

Even old games, just use Logitech controller mapper to convert the gamepad input to simulated mouse + keyboard.

Honestly, that old complaint is very tired by now.


Not really. The setup is complicated and fragile, the hardware is bulky and annoying. Some games can't really work with mouse mapped to analogs. Not everything can be done with just a controller (OS updates, etc.)

It's not great, but a Linux machine with Steam on it could be.


You are only giving your opinions.

I on the other hand, have done it. As expected, I use the computer more for movies than games (girlfriend demands). The gamepad is actually a very good Media Center controller.

The only real issue is passwords. But you don't have to enter them all the time.

My point is: having to get up and use a keyboard from time to time doesn't kill the experience of gaming on a computer and a big TV.

I want the Steam box because of the possibility of better games and a general MS distrust, not because the actual experience is totally unmanageable, as some people here seem to assume.


But I never have to get up to use a keyboard with my PS3. That's the experience I want.

Customizability and linux are just very, very nice perks.


Most people do not like to have a PC in their living room.

PC gaming can be done from a couch but the entire form factor / experience traditionally has not been optimized for that. Thankfully it appears as though Valve is working on changing that, but there is a lot of work left to be done.


>“We always do 30 frames per second on consoles, otherwise it wouldn’t be possible to fit in vehicles, effects, scale and all players.”

I'm with Carmack on this one. I'd rather have a good game with a solid 60fps than 20+ vehicles and hundreds of effects. That's all filler anyway. A solid 60fps game is a much more immersive experience than the choppy 20-30fps most developers achieve.


I concur. I played through Skyrim originally on PS3, then got a powerful PC and tried it on there.

I thought the game looked uglier. Now I realize that it might be that the FPS is too high.


If you pull up a Nintendo 64 emulator or a Playstation 1 emulator and run it at 1080P, it will look far worse than you remember it, because where the low resolution of the old CRT screens used to cover over a lot of flaws, the high fidelity will stick them in your face.

Likewise with a lot of console/PC hybrids. What looks pretty decent on your TV at what is actually 720P or even 640P upscaled and blurry is revealed to be somewhat lower detail than you thought on a crisp 2560x1440 screen.

On the other hand, if you get something meant to be played there... wow.


You're not concurring. A jittery framerate is bad, but JC isn't saying that a consistently higher framerate will look worse. He's saying that developers will try to push as much content and single-frame fidelity in there as possible while staying above 30 fps.

The whole argument really only applies to consoles.


Game looking uglier? I don't think that's really related to 60fps; I'm sure it's more down to low-resolution textures (that look bad on your HD monitor) and things looking janky.

Skyrim with mods to upgrade textures and shaders and things looks freaking amazing.*

*And I'm not actually a fan of Skyrim, I dislike Bethesda sandbox games where nothing you do has an impact, but I digress...


Then what the next generation consoles need is low-latency motion interpolation hardware.


This makes me sad and angry. Carmack of all people should be pushing for more performance, but I guess he outgrew that after iD lost most of the market.


You didn't actually read the article, did you. Christ:

Games developed for the next-generation of consoles will still target a performance of 30 frames per second, claims id Software co-founder John Carmack. Taking to Twitter, the industry veteran said he could “pretty much guarantee” developers would target the standard, rather than aiming for anything as high as 60 fps. ID Software games such as Rage and the Call of Duty series both hit up to 60 fps, but many titles in the current generation fall short such as the likes of Battlefield 3, which runs at 30 fps on consoles. “Unfortunately, I can pretty much guarantee that a lot of next gen games will still target 30 fps,” said Carmack.


...

Carmack, is that you?


Higher FPS and higher performance aren't necessarily identical.

Give me, say, a year-2016 machine and I could target 300 FPS with what currently reaches 30 FPS, but no-one except some GPU geek would notice. Or I could use all that added performance to do many more interesting things at a much larger scale than today, all at once -- physics, fluids, crowds and all other kinds of sims -- while still targeting "only" 30 FPS.


And it makes perfect sense for them to do so! They are optimizing for gameplay.


I think you have that backwards.

If they were optimizing for gameplay they would sacrifice graphical fidelity until they hit 60fps. Targeting 30fps means they want to sacrifice gameplay for graphical effects.


Not just graphical effects: more NPCs and gameplay entities on screen, better AI, better physics, etc.

In terms of trade-offs, 60fps is pretty much always the first thing to be sacrificed unless it's REALLY important, like in a twitch shooter or racing game.

The truth is that for most games 30fps is not a deal-breaker for the majority of people. If even a significant minority of players cared, the intensely focus-test-led games industry would follow.


That seems to be exactly what they are not optimizing for. Lowering your standards to 30Hz lets you crank graphical fanciness and engine features, but at the expense of gameplay. There is a reason Quake players used to turn their graphics down to mud, even though they could play with better graphics at a lower fps.


Targeting a lower frame rate also allows you to create a world that is more interactive, a la the quote in the article, by including more game objects and other elements. Graphical quality is still important... hence the lower frame rate.


Those may make the game more enjoyable, and 30Hz may be the correct decision for that reason, but if those things come at the expense of fps they come at the expense of playability.

It is undeniably a trade-off; which way it's worth making is of course subjective.



