This whole 30FPS BS


scorp

8,783 posts

229 months

Thursday 6th November 2014
Pentoman said:
I paid for a 120hz screen on my laptop though, expecting something special. However I can see no difference. Am I missing something?
The 120hz doesn't mean your PC is feeding it at that frame rate; it means the same frame is scanned twice on the display (resulting in 60 unique images per second). This is an attempt to get rid of the motion-blur/ghosting that the LCD itself introduces. Some displays advertise 240hz now, which is 60 x 4 duplicate frames. And that's ignoring the latency-introducing motion prediction that seems to be popular on some HDTV displays.
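
A back-of-envelope illustration of that duplication, using only the numbers above (not measurements of any real panel):

```python
# A 120hz panel fed 60 unique frames per second: each frame is scanned twice.
# Illustrative arithmetic only, not a measurement of any real display.
refresh_hz = 120
unique_fps = 60

refresh_interval_ms = 1000 / refresh_hz                      # ~8.33 ms per scan
scans_per_frame = refresh_hz // unique_fps                    # 2 duplicate scans
frame_on_screen_ms = refresh_interval_ms * scans_per_frame    # ~16.7 ms per unique frame

print(refresh_interval_ms, scans_per_frame, frame_on_screen_ms)
```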

Some displays will overdrive the pixels to prevent ghosting: say you have a pure white pixel which is turning to mid-grey on the next frame; the LCD will send black to drive the white pixel to grey faster, a kind of inverse ghosting. (http://www.blurbusters.com/faq/lcd-overdrive-artifacts/ explains it better than me)
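
A minimal sketch of the idea, with a made-up gain value (real panels use per-transition lookup tables tuned by the manufacturer):

```python
# Toy LCD overdrive: command a value past the target so the slow pixel
# settles on the target sooner. The gain and 0-255 range are illustrative.

def overdriven_level(current, target, gain=1.5):
    commanded = target + gain * (target - current)
    return max(0, min(255, commanded))  # clamp to the panel's drivable range

# White (255) heading to mid-grey (128): the command is pushed all the way
# to black -- exactly the "send black to get to grey faster" trick above.
print(overdriven_level(255, 128))  # -> 0
```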

I know some old CRTs can do 120hz and can be fed at that rate, but I'm not sure you can see a difference above 50hz.


Pentoman said:
I'd like to hear more from scorp too. More cunning developer tricks for speeding up games please?
biggrin Really there are no one-size-fits-all rules for optimisation; it's a huge topic in itself. Generally it means doing as little as possible, or finding very clever ways to do/draw as little as possible; sometimes simply keeping textures small enough to fit inside caches can yield huge improvements on modern hardware. You would be surprised at the amount of 'waste' that goes on in graphics: things get drawn over, stuff off screen gets lighting/shadow calculations it never needed, etc.
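
To give a flavour of the "draw as little as possible" point, here's a toy sphere-vs-frustum cull; the plane representation and names are illustrative, not from any particular engine:

```python
# Skip objects entirely outside the view volume before spending any
# drawing/lighting/shadow work on them.

def sphere_visible(center, radius, planes):
    """planes: (normal, d) pairs, normals pointing into the view volume."""
    for (nx, ny, nz), d in planes:
        dist = nx * center[0] + ny * center[1] + nz * center[2] + d
        if dist < -radius:   # wholly on the wrong side of one plane
            return False     # -> no draw, no lighting, no shadow pass
    return True

# One-plane 'frustum': everything with x >= 0 is in view.
frustum = [((1.0, 0.0, 0.0), 0.0)]
print(sphere_visible((5.0, 0.0, 0.0), 1.0, frustum))   # True: draw it
print(sphere_visible((-9.0, 0.0, 0.0), 1.0, frustum))  # False: wasted work avoided
```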


Edited by scorp on Thursday 6th November 03:40

Mr Whippy

29,046 posts

241 months

Thursday 6th November 2014
Oakey said:
Disastrous said:
A lot of people misunderstanding the frame rate/realism arguments here.

You should watch this:

https://vimeo.com/videoschool/lesson/56/frame-rate...

I think frame rate should be an artistic choice that benefits the game. For driving games, I'd guess the higher the better, but in a cinematic, immersive game I want 24fps, thank you!
Can you provide an example of the sort of game you want to play at 24fps?

People keep comparing film 24fps to gaming 24fps as if they're similar, but ignore that a game doesn't render motion blur the way film captures it, so the frames stutter when in motion
They could do motion blur as a post-process if they wanted.

But I agree.

On one hand we have top film-makers wanting to push higher-FPS films on us, yet on the other we have people telling us that lower is more cinematic.

People just seem to have an expectation based on FPS, and lower is accepted as more 'cinematic'.

Personally I hate high FPS frame interpolation on films, it looks cheap and horrible.


In the end though, I can accept that if 30fps is plenty for the game at hand, gives it a certain 'feel', and lets them invest more time in other details, then great. If it's a good, well-made, well-designed game, great.

But in the end a really great game should work well at any FPS and at most resolutions, from a PS3 through to a high-end PC on 4k screens.


It's a non-argument; only if you're pushing crap does such a consideration become relevant to begin with.

To think the FPS of a game is the focus talking point... doesn't say much for the game if the FPS is that important hehe

Dave

Steven_RW

1,729 posts

202 months

Friday 7th November 2014
Cinema and controlling an FPS game are not really a fair comparison.

I don't control the cinema experience and am not trying to react in split tenths of a second. When competing in an FPS game I am.

Cut scenes: 24 fps and all the motion blur all the way please.

FPS: as many frames as humanly possible, to allow excellent aiming and chances of winning the gun fight.

That's my view.

Interesting to hear from scorp. I'd very much like to move into the games industry but can't quite work out how best. I need to get some experience with people programming games; I could be a project manager as per my current blue-chip experience, but I don't know where to start to get any relevant experience, even at a Kickstarter level!


I fancy a gaming PC based on all this chat... :-)

Regards,
Steven_RW

130R

6,810 posts

206 months

Friday 7th November 2014
Every frame cap I have seen on a PC game can be removed via the debug console or by editing an ini file, etc. Of course none of this would be necessary if developers actually did their job and didn't release some unoptimised mess of a console port. Consoles are where the big money is though, so I wouldn't expect things to get any better. Also, 1080p @ 60fps is hardly "next gen"! Either Microsoft should have released better hardware or game developers need to better optimise their code.

mrmr96

13,736 posts

204 months

Friday 7th November 2014
What is this 30fps lark about? Both PS4 and Xbox One run CoD AW at 60fps in multiplayer. What else do you need?
http://www.gamespot.com/articles/xbox-one-call-of-...

PS - the guy saying 1080p @60fps isn't "next gen"? Really? Why would I pay more for a console capable of chucking out higher than that? My TV can't handle more than that, so a dearer console would be a waste of money.

130R

6,810 posts

206 months

Friday 7th November 2014
mrmr96 said:
What is this 30fps lark about? Both PS4 and Xbox One run CoD AW at 60fps in multiplayer. What else do you need?
Other games that don't run at 60fps ..

mrmr96 said:
PS - the guy saying 1080p @60fps isn't "next gen"? Really? Why would I pay more for a console capable of chucking out higher than that? My TV can't handle more than that, so a dearer console would be a waste of money.
Sure, but 1080p has been around for years and years; it's not really impressive that they have finally got games running at 60fps at this resolution. 1080p is pretty much the minimum resolution most people would spec in a new monitor.

Oakey

27,585 posts

216 months

Friday 7th November 2014
CoD might run at 1080p/60fps but it also looks like absolute garbage

rhinochopig

17,932 posts

198 months

Friday 7th November 2014
Human optical critical fusion rate is circa 16hz anyway so 30 is plenty.

Oakey

27,585 posts

216 months

Friday 7th November 2014
That's on the Xbox One; on the PS4 it's native 1080p.

The Xbox One has dynamic res which goes to native 1080p in the campaign but spends most of its time at that other res; for MP it's always at the lower res.

The PS4 has fluctuating framerate in some places whereas the Xbox One version suffers screen tearing.

According to Digital Foundry anyway.

On the PC I find the depth of field makes things look ugly. I walked right up to a guy and he was out of focus, but instead of looking defocused he just looked like a low-res cutout, unlike, say, Alien Isolation, which uses DoF well.

Pentoman

4,814 posts

263 months

Sunday 9th November 2014
scorp said:
Pentoman said:
I paid for a 120hz screen on my laptop though, expecting something special. However I can see no difference. Am I missing something?
The 120hz doesn't mean your PC is feeding it at that frame rate; it means the same frame is scanned twice on the display (resulting in 60 unique images per second). This is an attempt to get rid of the motion-blur/ghosting that the LCD itself introduces. Some displays advertise 240hz now, which is 60 x 4 duplicate frames. And that's ignoring the latency-introducing motion prediction that seems to be popular on some HDTV displays.
Thanks. That makes perfect sense, I just couldn't work out why 120hz was better. The screen doesn't seem to blur. Well, not like old TFT screens did!

scorp said:
Some displays will overdrive the pixels to prevent ghosting: say you have a pure white pixel which is turning to mid-grey on the next frame; the LCD will send black to drive the white pixel to grey faster, a kind of inverse ghosting. (http://www.blurbusters.com/faq/lcd-overdrive-artifacts/ explains it better than me)
Also interesting, thanks.

scorp said:
Pentoman said:
I'd like to hear more from scorp too. More cunning developer tricks for speeding up games please?
biggrin Really there are no one-size-fits-all rules for optimisation; it's a huge topic in itself. Generally it means doing as little as possible, or finding very clever ways to do/draw as little as possible; sometimes simply keeping textures small enough to fit inside caches can yield huge improvements on modern hardware. You would be surprised at the amount of 'waste' that goes on in graphics: things get drawn over, stuff off screen gets lighting/shadow calculations it never needed, etc.


Edited by scorp on Thursday 6th November 03:40
Ta. That sounds familiar. I guess I was wondering if you wanted to tell us about any particular favourite clever things that you've done, or you've seen done, in the name of performance?

Sorry for minor thread hijack...

paranoid airbag

2,679 posts

159 months

Monday 10th November 2014
rhinochopig said:
Human optical critical fusion rate is circa 16hz anyway so 30 is plenty.
http://www.testufo.com/#test=framerates

If you can't tell the difference, you may actually be blind. The point at which a human *won't notice extra details* is not accurately known but is well into the hundreds of Hz, possibly thousands.

EDLT

15,421 posts

206 months

Monday 10th November 2014
Oakey said:
On the PC I find the depth of field makes things look ugly. I walked right up to a guy and he was out of focus, but instead of looking defocused he just looked like a low-res cutout, unlike, say, Alien Isolation, which uses DoF well.
Depth of field is the first thing I turn off when I start a game; it's horrible and only good for arty screenshots. The second option I turn off is motion blur, which isn't good for anything. Did I miss a decision ten years ago where every developer decided games would be better if you can't see what you're doing?

I think trying to hobble cross-platform games for the benefit of the Xbone is going to come back and bite them when smaller developers are making games like Arma, The Witcher and even Serious Sam which look a generation ahead of current console games.

scorp

8,783 posts

229 months

Tuesday 11th November 2014
anonymous said:
[redacted]
Yeah, those will be high-end PC displays; I'm not familiar with those. Home HDTVs usually don't accept 120hz inputs and are kind of misleading on the 120/240fps thing.

I used to play a bit of Quake 2/3 on a CRT which went up to 120hz; I could not perceive any motion difference between 60 and 120hz. From doing animation on CRTs I reckon 'smooth' movement starts somewhere around 45-50hz.

Mr Whippy

29,046 posts

241 months

Tuesday 11th November 2014
60hz is nice and smooth to me.

I used to run higher on a CRT too, if the GPU could handle it... but usually the cost of lower res and higher hz wasn't worth it.

Ie, 100hz did feel more solid, but it still *felt* solid at 75hz, which was all that mattered.


It's when you get down to about 30fps or so that you can notice juddering, and that is where things like motion blur and DOF are probably added to mask the lower-quality asset LODs and soften the jerkiness etc.



I still don't think any of this will be an issue for any half decent games and game developers. What game developer would want to start making content that looks last generation when they can be making next generation (ie, two steps ahead) NOW on PC to be enjoyed?



If any games come out for PC that look like this, just avoid them like the plague and send a clear message.

I know that is harder without a demo and no refunds via Steam etc, but just keep your wallet in your pocket, keep the pre-ordering fervour at bay, and wait for good honest reviews before buying! Simple way to not end up with a naff game... and forces developers to not just sell on pre-release hype, but actually sell on a decent quality game!

Dave

Oakey

27,585 posts

216 months

Tuesday 11th November 2014
EDLT said:
Depth of field is the first thing I turn off when I start a game; it's horrible and only good for arty screenshots. The second option I turn off is motion blur, which isn't good for anything. Did I miss a decision ten years ago where every developer decided games would be better if you can't see what you're doing?

I think trying to hobble cross-platform games for the benefit of the Xbone is going to come back and bite them when smaller developers are making games like Arma, The Witcher and even Serious Sam which look a generation ahead of current console games.
The problem with these effects is that they're binary: they're either on or off, and it's jarring. DoF, for example; in games objects don't blend into focus, it's just IN FOCUS / OUT OF FOCUS, and it's about as subtle as a smack in the face.

UncappedTag

2,102 posts

185 months

Tuesday 11th November 2014
I was showing my stepson the difference between Forza Horizon 2 capped at 30fps and Assetto Corsa capped at 75fps. No competition; he now wants a gaming laptop.

I tried Watchdogs on the PC courtesy of Ubisoft; never again. Lazy port or what! It ruined what would have been a good game.

paranoid airbag

2,679 posts

159 months

Tuesday 11th November 2014
EDLT said:
it's horrible and only good for arty screenshots.
That could be read as a far wider criticism ;-).

I can be unbearably smug, mind; I'm entirely happy with the games I have. But there does seem to be a conflict of interest there. "Cinematic" stuff and quick time events sure look great.

Edited by paranoid airbag on Tuesday 11th November 12:35

ManOpener

12,467 posts

169 months

Tuesday 11th November 2014
Even if it were true, it wouldn't really matter, as it would be relatively simple to disable whatever hard-coded frame limit was in place via a third-party patch. But it most likely isn't true, for all the reasons outlined so far in the thread.

Mr Whippy

29,046 posts

241 months

Tuesday 11th November 2014
anonymous said:
[redacted]
DOF is far from binary, but as you probably know we render a fixed frame and then try to apply filtering as a post-process to do DOF.

So it's blurring against Z-depth and other techniques, but there's no way to pick up the info of items hidden behind object edges that would be visible from one part of the lens but not the other.
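
A toy sketch of that Z-depth approach: a per-pixel blur radius (circle of confusion) computed from depth. The constants are illustrative, not from any real engine, and a real version runs as a GPU shader:

```python
# Toy screen-space depth of field: blur radius ("circle of confusion")
# derived from Z-depth, zero at the focal plane and growing with distance.

def circle_of_confusion(depth, focus_depth, aperture=2.0, max_radius=8.0):
    """Blur radius in pixels for a pixel at the given depth."""
    coc = aperture * abs(depth - focus_depth) / max(depth, 1e-6)
    return min(coc, max_radius)

for depth in (1.0, 2.0, 5.0, 50.0):
    print(depth, round(circle_of_confusion(depth, focus_depth=2.0), 2))
# The catch: pixels behind an edge were never rendered, so no amount of
# blurring the flat frame can reveal what a real defocused lens would see.
```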

It's just like motion-blurring a static frame using per-pixel velocity info: there is no way to blur stuff that is hidden in the static frame you're blurring, even though in an ideally blurred frame that hidden data would affect the appearance.
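
And the velocity version, equally simplified: a 1D "image" and made-up numbers, just to show samples being averaged along the motion vector:

```python
# Toy post-process motion blur: average taps along each pixel's
# screen-space velocity. Real engines do this per-pixel on the GPU.

def motion_blur_1d(image, velocity, samples=4):
    n = len(image)
    out = []
    for x in range(n):
        acc = 0.0
        for s in range(samples):
            # Step backwards along the motion vector; clamp at the edges.
            tap = min(max(round(x - velocity[x] * s / samples), 0), n - 1)
            acc += image[tap]
        out.append(acc / samples)
    return out

image = [0, 0, 0, 1, 0, 0, 0]          # a single bright pixel...
velocity = [2] * len(image)            # ...everything moving 2px per frame
print(motion_blur_1d(image, velocity)) # the highlight smears along the motion
```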



DOF is great for leading the player in cinematics, replays, cutscenes etc, but for actual in-game play it serves no purpose, because no one knows where the player will be looking, or wants to look, so as to set the DOF appropriately. Maybe one day with eye-tracking it'd be a cool thing to do though.

All the recent games I've played with DOF it just made no sense to have on, except in the situations noted above.



I agree on the other stuff though. I'd prefer to do more thinking per pixel on screen than just put more pixels, with less thinking, on screen.

Textures, meshes, multi-sampling, lighting and rendering.


I can look at a landscape photo on my 1440p screen, and no game comes close to looking that good. If we can't even get the 1440p worth of pixels we have right now looking like a photo, then we're already further ahead on res than we need to be.



I'm most excited about that new GI stuff which is like voxel-based LUTs for ambient lighting info.

Also it looks like some newer API/GPU features are improving alpha management/blending, so hopefully 'bad' transparency results will stop being a concern as time goes forward.


Hmmmmm

Dave

Mr Whippy

29,046 posts

241 months

Tuesday 11th November 2014
Like lots of effects, now that it's not 'new' any more it'll increasingly be used with discretion and realism, rather than just shown off to wow customers.

I remember the DOF in Operation Flashpoint 2 or whatever it was when you used the iron sights. Eeek.

And also in NFS Shift 2 iirc, the dashboards on interiors would blur at high speed. Fine if you're looking outside, but if you glance down? Suddenly it breaks the illusion.


Remember the 'HDR' blooms we saw in the mid-00s... now thankfully replaced with actual HDR rendering and tone mapping, so the effect mostly looks natural.
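
For illustration, the classic Reinhard curve is one way the tone-mapping half works; it's a standard textbook operator, not necessarily what any game mentioned here uses:

```python
# Toy tone mapping: compress unbounded HDR luminance into the 0-1 display range.

def reinhard(luminance):
    """Maps [0, inf) HDR luminance to [0, 1); bright values compress smoothly."""
    return luminance / (1.0 + luminance)

for lum in (0.1, 1.0, 10.0, 100.0):   # dark shadow ... blinding highlight
    print(lum, round(reinhard(lum), 3))
# Instead of clipping everything above 1.0 (and faking it with bloom),
# the whole range is squeezed so highlights keep some detail.
```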


As for consoles, from what I see they've moved on from the PS3/Xbox 360, but PCs were at that level already when they arrived. We've moved on to another generation, and then another again by now. We're about to go into another generation with the PC hardware... it leaves the current-generation consoles where decent gaming PCs were about 4 years ago, I'd say.

Of course, console devs always push things a long way, so I expect to see very nice things as we did on PS3/Xbox 360... but I fear they're lagging behind PCs too much to really throw any benefits the PC users' way.

From what I see, all the new innovation in rendering/lighting and asset density in scenes is being led on PC-based GPUs.

Dave