Why do resolution [and framerate] need to be higher in video games than in TV and cinema


I've recently started playing video games again after around 10 years, and I've noticed that a lot of attention is paid to resolution.

Now, putting aside that this could be as much of an obsession as megapixels are for cameras, I wonder whether increasing the resolution matters more in video games than it does in movies.

After all, movies already "look real" at 576p (not even counting the comparative loss due to anamorphic widescreen), so why this rush toward ever-higher resolutions?

Are there technical grounds for giving great importance to a resolution of 1440p, rather than 720p?

Edit: originally the question was exclusively about resolution; somebody then edited it and added framerate. There are already answers covering the latter, and since it somewhat adds value to the question, I'll leave it in.

Best Answer

Yes, there are good reasons why games need a higher resolution and framerate than movies.

TL;DR

Basically, it all boils down to the fact that a PC has to compute everything it wants to show, while a movie camera simply records what it sees. A recording therefore reproduces the real world more faithfully than a game can, even at a lower resolution and framerate.


Framerate

When you take pictures with a camera, you sometimes end up with a blurry picture. This can happen when you move too much, or when the scene is so dark that the camera needs a longer exposure. That's because even though a camera is supposed to capture a single moment, it can't: it captures everything that happens during its exposure time. Ideally that exposure is short, well under 100 ms, so the result isn't too blurry.

Movies are made of many such pictures every second. Since every frame contains a small amount of natural motion blur, consecutive frames blend into one another and the motion appears seamless. Thanks to that, a movie can still look decent at framerates as low as 24 FPS.

PC games are quite different in that regard. Each frame is rendered for one exact point in time, so there is no natural blur, and without that blur nothing smooths over the gap between consecutive frames. The gap therefore stands out much more in a game and is far more noticeable to the human eye.
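To make the difference concrete, here is a minimal sketch (my own illustration, not anything from a real engine or camera) contrasting the two: a dot moving across a 1D "screen" is either sampled at one exact instant per frame, like a game, or averaged over an exposure window, like a camera. The numbers (24 FPS, a 1/48 s exposure, 480 px/s) are arbitrary assumptions chosen to make the gap easy to see.

```python
# A toy comparison of a game frame (instantaneous sample) and a camera frame
# (light averaged over the exposure time). All numbers are made up.

import numpy as np

width = 96                 # 1D "screen" of 96 pixels
fps = 24
speed_px_per_s = 480       # the dot crosses the screen in 0.2 s
exposure_s = 1 / 48        # a typical 180-degree shutter at 24 fps

def dot_position(t):
    """Position of the moving dot at time t (seconds)."""
    return speed_px_per_s * t

def game_frame(t):
    """A game renders the scene at one exact instant: a single sharp pixel."""
    frame = np.zeros(width)
    frame[int(dot_position(t)) % width] = 1.0
    return frame

def camera_frame(t, samples_per_frame=64):
    """A camera integrates light over its exposure: the dot becomes a smear."""
    frame = np.zeros(width)
    for dt in np.linspace(0, exposure_s, samples_per_frame):
        frame[int(dot_position(t + dt)) % width] += 1.0 / samples_per_frame
    return frame

t0, t1 = 0.0, 1 / fps      # two consecutive frame times
print("game frame 0 lit pixels:  ", np.nonzero(game_frame(t0))[0])
print("game frame 1 lit pixels:  ", np.nonzero(game_frame(t1))[0])
print("camera frame 0 lit pixels:", np.nonzero(camera_frame(t0))[0])
# The game dot jumps 20 pixels between frames with nothing in between,
# while each camera frame already smears the dot across 10 of those pixels.
```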

But isn't there plenty of blur in modern games? Yes, but that blur is applied to each individual frame after the fact and is quite different from the natural blur captured in movies. Modern rendering techniques blur each frame based on how objects have moved since the previous frame, which makes motion blur in games look more natural and can help a given framerate appear smoother than it really is.
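As a rough idea of how such per-frame motion blur works, here is a simplified sketch of a velocity-based blur pass: each pixel is averaged with samples taken backwards along its screen-space motion since the previous frame. This is my own toy version, not any particular engine's implementation; real implementations run on the GPU and handle object boundaries far more carefully.

```python
# Toy velocity-based motion blur: average each pixel with samples taken
# backwards along its per-pixel motion vector.

import numpy as np

def motion_blur(color, velocity, num_samples=8):
    """color: (H, W, 3) rendered frame; velocity: (H, W, 2) per-pixel motion
    in pixels since the last frame. Returns the blurred frame."""
    h, w, _ = color.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros_like(color, dtype=float)
    for i in range(num_samples):
        # step backwards along the motion vector in num_samples increments
        t = i / max(num_samples - 1, 1)
        sample_x = np.clip((xs - velocity[..., 0] * t).astype(int), 0, w - 1)
        sample_y = np.clip((ys - velocity[..., 1] * t).astype(int), 0, h - 1)
        out += color[sample_y, sample_x]
    return out / num_samples

# Tiny example: a white square that moved 6 pixels to the right last frame.
frame = np.zeros((32, 32, 3))
frame[12:20, 12:20] = 1.0
vel = np.zeros((32, 32, 2))
vel[12:20, 12:20, 0] = 6.0          # x-motion of the square's pixels
blurred = motion_blur(frame, vel)
print(np.round(blurred[15, 10:22, 0], 2))
# The trailing (left) edge of the square now ramps up gradually instead of
# cutting off sharply, which reads as motion to the eye.
```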


Resolution

With resolution, we have a similar phenomenon happening, but for a different reason.

As a gamer, you're surely familiar with the term "aliasing". Aliasing does not happen in movies, because each pixel is naturally "blurred": every pixel on the camera's sensor collects all of the light that falls on it, so when light from several objects lands on the same pixel, it gets mixed together. The seam between one object and the next is therefore blended, and we see no aliasing.

The PC operates differently. It doesn't record a scene; it computes one. To determine the color of a pixel, it samples the scene at a single, exact point. For the PC, a pixel has no size; it is merely a point in the scene that needs to be rendered. For us, a pixel has a very real size, and unless you have a screen with a very high pixel density (commonly measured in DPI or PPI), those pixels are pretty big.
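Here is a small sketch of what "a pixel is just a point" means in practice. The scene below is a made-up function (white above a slanted edge, black below), and each pixel's colour is decided by a single sample at the pixel centre, which is why the edge comes out as a hard staircase.

```python
# Point sampling: one sample per pixel, taken exactly at the pixel centre.

import numpy as np

def scene(x, y):
    """An idealised scene: white above a slanted edge, black below."""
    return 1.0 if y > 0.35 * x + 2.0 else 0.0

def render_point_sampled(width=16, height=8):
    image = np.zeros((height, width))
    for py in range(height):
        for px in range(width):
            # one sample at the pixel centre, nothing in between
            image[py, px] = scene(px + 0.5, py + 0.5)
    return image

for row in render_point_sampled():
    print("".join("#" if v else "." for v in row))   # visible jagged steps
```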

To avoid aliasing, the PC therefore needs to treat a single pixel as if it were made of several smaller pixels (a.k.a. supersampling: rendering the frame at a higher resolution than it is displayed at, then averaging it down). Obviously, the smaller those sub-pixels, the more accurate the anti-aliasing, and the more resource-intensive it becomes. There are of course other, less resource-intensive (and less accurate) anti-aliasing methods, like FXAA, which essentially applies a subtle blur along detected edges to hide the jaggies.
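Continuing the toy scene from the previous sketch, this is roughly what supersampling does: split every displayed pixel into an N×N grid of sub-samples, evaluate the scene at each one, and average the results. Edge pixels come out grey in proportion to how much of the edge covers them, which is what hides the jaggies. Again, this is just an illustration; real SSAA renders the whole frame at the higher resolution and filters it down.

```python
# Toy supersampling: n*n sub-samples per pixel, averaged together.

import numpy as np

def scene(x, y):
    """Same idealised scene: white above a slanted edge, black below."""
    return 1.0 if y > 0.35 * x + 2.0 else 0.0

def render_supersampled(width=16, height=8, n=4):
    image = np.zeros((height, width))
    for py in range(height):
        for px in range(width):
            # n*n samples spread across the pixel instead of a single point
            samples = [scene(px + (i + 0.5) / n, py + (j + 0.5) / n)
                       for i in range(n) for j in range(n)]
            image[py, px] = sum(samples) / (n * n)
    return image

shades = " .:-=+*#"   # coarse brightness ramp for printing
for row in render_supersampled():
    print("".join(shades[int(v * (len(shades) - 1))] for v in row))
# Edge pixels now come out as intermediate shades (partial coverage) instead
# of jumping straight from black to white, so the staircase is far less visible.
```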

Increasing the resolution doesn't exactly get rid of aliasing, but with small enough pixels, it becomes more difficult to spot.