Why set framerate higher than your display's framerate?

I've seen this allowed in many games' graphics options, and I've noticed that players like to boast that their hardware configuration can run a certain game at XXX FPS on max settings.

From what I know, your display's refresh rate is the limit of how many frames you'll actually see per second, and setting a higher framerate in your graphics card settings will either cause an "input not supported" message on your monitor/TV or simply do nothing.

I've tried unlocking FPS in several games and I haven't noticed any changes, but maybe I'm not looking for the right signs.

Is there any benefit to raising or unlocking the upper limit of your game's framerate? Or is it allowed just for benchmarking purposes?

Best Answer

Note: This answer is mostly based on my personal experiences.

A higher framerate is usually useful, because games tend to run on a loop where input is checked once per frame. The more input checks you get per second, the lower the input delay, and input delay is something that keeps coming up, especially in competitive games.
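To make that concrete, here is a minimal sketch (not from the answer, and the function names are hypothetical) of a typical frame-capped game loop. Input is only sampled at the top of each frame, so the worst-case delay between pressing a key and the game seeing it is one full frame: about 16.7 ms at 60 FPS, about 4.2 ms at 240 FPS.

```python
import time

TARGET_FPS = 60            # hypothetical frame cap; raise or remove it
FRAME_TIME = 1.0 / TARGET_FPS

def poll_input():
    # Placeholder: a real game would read keyboard/controller state here.
    return []

def update(events, dt):
    pass  # advance game state based on input and elapsed time

def render():
    pass  # draw the frame

def game_loop(frames=3):
    last = time.perf_counter()
    for _ in range(frames):
        events = poll_input()          # input is sampled once per frame
        now = time.perf_counter()
        update(events, now - last)
        last = now
        render()
        # Frame cap: sleep off the remainder of the frame budget.
        # A higher cap means poll_input() runs more often per second,
        # shrinking the worst-case gap between a key press and a reaction.
        elapsed = time.perf_counter() - now
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)

game_loop()
```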

But keep in mind that some games use a frame-based timescale, which means that the speed at which events and animations play out depends on your framerate. Sometimes a higher framerate starts breaking the game engine and causing weird issues: in GTA: San Andreas, swimming and vehicle handling break; in Dark Souls, you might fall through the ground when using a ladder; in Need for Speed: Rivals, the whole game speeds up.
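Here is a hedged sketch (my own illustration, with made-up constants) of why frame-based timing misbehaves: if state advances a fixed amount per frame rather than per second, doubling the framerate doubles the in-game speed, which is exactly the class of bug in the games above.

```python
UNITS_PER_FRAME = 5      # frame-based: distance moved each frame
UNITS_PER_SECOND = 300   # delta-based: distance moved per real second

def simulate(fps, seconds=1.0):
    frames = int(fps * seconds)
    dt = 1.0 / fps
    frame_based = sum(UNITS_PER_FRAME for _ in range(frames))
    delta_based = sum(UNITS_PER_SECOND * dt for _ in range(frames))
    return frame_based, delta_based

for fps in (30, 60, 120):
    fb, db = simulate(fps)
    print(f"{fps:>3} FPS: frame-based moved {fb} units, delta-based moved {db:.0f}")
# Frame-based distance doubles with framerate (150 -> 300 -> 600),
# while delta-based movement stays at 300 units regardless of FPS.
```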

Also, capping the framerate with vsync can cause stutters as the highest stable framerate changes: on a 60 Hz display, when the framerate drops below 60, the game will fall back to a cap of 30. This is the main reason I go for an uncapped framerate in games like Skyrim, where the framerate can be anywhere between 40 and 100 depending on the location and the direction of the camera.
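A small sketch of my own (assuming classic double-buffered vsync, which only presents frames on a vblank) shows why the framerate snaps to divisors of the refresh rate rather than degrading smoothly:

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms per refresh

def vsynced_fps(render_ms):
    # A finished frame waits for the next vblank, so the effective rate
    # is the refresh rate divided by the number of intervals consumed.
    intervals = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / intervals

for render_ms in (10.0, 16.0, 17.0, 20.0, 34.0):
    print(f"render {render_ms:>4.1f} ms -> {vsynced_fps(render_ms):.1f} FPS under vsync")
# Missing the 16.7 ms budget by even a fraction drops the game from
# 60 straight to 30 FPS, which is the stutter described above.
```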