The type of game I play absolutely has to have at least ~200 FPS (for reasons of audio latency processing, not visuals), or it's nearly unplayable. I do have a 120 Hz monitor, so I prefer at least 120 FPS for most games; choppiness bothers me considerably.
MMORPGs and point-and-click games like FNAF don't need 60 FPS. Other games are fine below that, though there are plenty that suck at anything lower than 50. Racing games, PvP, and shooters require precision to be played properly. So the question of whether games need 60 FPS is too broad.
If it isn't running at 60, it doesn't deserve to be played. And pretty much any game that "doesn't need to run at 60" probably isn't much of a game in the first place. Honestly, though, 60 FPS is merely playable; even that's not ideal. 100 is where it's at.
People say they don't care about framerate until they're chugging along at 15 FPS in an action game that needs slick timing. Then you're playing a goddamn slideshow and trying super hard to convince yourself it's not bothering you.
It also depends on the game. Games that require precision and fast reactions for competitive play will obviously need a locked 60 FPS; on the other hand, games with a passive or slow pace, such as RPGs, adventure games, and sandboxes, play perfectly well at 30 FPS and probably don't require more than that.
Still, some people will say 60 FPS looks weird, while for others, 30 FPS makes a game unplayable.
I play mostly older games, so I couldn't care less about FPS. I just got a new PC and monitor, and I can barely notice what framerate most of my games run at. As long as it's consistent and doesn't freeze or stutter, I'm fine.
30 is fine for me, although when developing, I want performance adaptation so the game always locks to 60 FPS (with vsync) to simplify animation interpolation (blending) -- basically so everything always looks smooth and gameplay is always consistent.
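To show what I mean: with a locked 60 FPS update, the blend step per frame is a constant, so crossfades between animation clips never speed up or slow down. A rough sketch in C++ (all names hypothetical, not any engine's actual API):

```cpp
// With the game locked to 60 FPS, dt is constant, so a crossfade
// between two animation clips advances by the same amount every frame.
#include <algorithm>
#include <cstdio>

struct CrossFade {
    float weight = 0.0f;     // 0 = fully old clip, 1 = fully new clip
    float duration = 0.25f;  // crossfade length in seconds

    void update(float dt) {
        weight = std::min(1.0f, weight + dt / duration);
    }
};

int main() {
    const float dt = 1.0f / 60.0f;  // fixed step under the 60 FPS vsync lock
    CrossFade fade;
    for (int frame = 0; frame < 16; ++frame) {
        fade.update(dt);
        std::printf("frame %2d: blend weight %.3f\n", frame, fade.weight);
    }
}
```

If the framerate were allowed to float instead, `update` would need a measured dt, and any frame-time spike would visibly jerk the blend.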
Fallout 4 on Very High visual settings: 30-ish to 60 FPS on outdoor maps (depending on clutter and object density) and a stable 60 FPS indoors. Good thing I at least have an SSD and a relatively new GPU to handle the game somewhat.
I really don't understand people who can't play at less than 60 FPS. I mean, you know the game feels somewhat slower paced, but how can people make such a huge deal out of it?
The difference between 60 and 30 is really noticeable, but it depends on the genre. I mostly play RPGs, and unless you have a high-end GPU, I feel you have to make way too many graphical sacrifices to maintain 60. I played through The Witcher 3 at 30 because it was either that or making it look like a console game from 2012. I'd never play a shooter or a fighting game at 30, though.
As long as it's not PowerPoint-presentation levels of bad framerate, I won't complain. Sure, 60 is super nice, but to what end? Even competitive play doesn't seem to benefit all that much from 60 (albeit above that may be different), and frame-perfect timing requirements can only lead to frustration. Hell, people made a fuss over Tales of Zestiria not being 60 FPS, with the excuse that "with a game like that, you'd figure it'd be a main thing."
I agree with you for the most part, but in most 3D Tales games, battles have never dropped below 60 FPS, so when Zestiria suddenly did, it was kind of jarring. I actually ended up getting the mod that unlocks the framerate, and gameplay is pretty stable with it for the most part.
True, though it still felt the same to me, and the game isn't super stable overall. Dragon's Dogma is the exception: somehow a game that originally ran at something like 15 FPS on console went to 60 FPS and worked out better than any console-to-PC port I've ever dealt with, which is quite surprising.
Framerate is not as important as aesthetic, but there are plenty of aesthetics out there that definitely demand 60 FPS. A lot of console games had 30 FPS limits as a result of hardware rather than design, and the real meat of the argument these days is when those same limits are hard-coded into PC ports and the publishers suddenly claim "cinematic vision."
Besides, we've had 60 FPS since the NES days, if not earlier, so it seems odd to suddenly dismiss it when you're throwing multi-million-dollar budgets and hardware at it.
I'm just too out of date to care. I'm still trying to rebuild my PS1/2 collection while everyone else I know has played their PS4 and Xbox One collections to death. I love games, but apparently not enough to be considered a gamer.
Mostly depends on the game. Competitive games at anything lower than 60 FPS are a no-go, and I prefer to play 2D pixel-art games at 30 FPS, but otherwise I don't really have any preference whatsoever.
PC Master Race all the way. If I can't adjust my settings low enough to get mostly consistent 60 FPS in an action game, I probably won't play it. There are occasional exceptions - GTA San Andreas inexplicably feels and looks fine at 24/25 FPS, notably - and there are genres where it just doesn't matter, but for action that's my rule.
What really gets under my skin is mandatory Vsync, specifically the terrible mouse input lag that results from it. Tying engine calculations to framerate instead of ticks should've stopped in the 90s.
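The alternative I mean is the classic fixed-timestep (accumulator) loop; a rough sketch in C++ with hypothetical function names, not any particular engine's API:

```cpp
// Simulation advances in fixed ticks; rendering runs at whatever rate
// vsync or the hardware allows, so game speed never depends on framerate.
#include <chrono>

using Clock = std::chrono::steady_clock;

void updateSimulation(double dt) { /* input, physics, game logic */ }
void renderFrame(double alpha)   { /* draw, interpolating by alpha */ }

int main() {
    const double tick = 1.0 / 60.0;  // 60 simulation ticks per second
    double accumulator = 0.0;
    auto previous = Clock::now();

    while (true) {  // a real game would exit on a quit event
        auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed ticks as the elapsed time covers.
        while (accumulator >= tick) {
            updateSimulation(tick);
            accumulator -= tick;
        }

        // Render once, blending between the last two simulation states
        // by the leftover fraction of a tick.
        renderFrame(accumulator / tick);
    }
}
```

Sampling input per tick rather than per rendered frame is also what keeps mouse response consistent even when vsync caps the framerate.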
A decent framerate IS required for certain games (have you tried Mario Kart 8 with one or two players? Yes? Now try it with three or four), and a higher framerate is ALWAYS enjoyable. I don't understand the people falling for the "cinematic" argument.