Internally, modern HDTVs usually support 60 frames per second.
Using an RF adapter or a composite (3-plug) connection, you'll be restricted to the following:
NTSC is 60 frames per second interlaced at 640x480 resolution (480i).
PAL is 50 frames per second interlaced at 720x576 resolution (576i).
Interlaced means that only half the lines on the screen update on each frame.
Progressive means that every line updates on every frame.
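To make the difference concrete, here's a toy Python sketch (not a real video pipeline, just an illustration) of how the two scan modes update a screen's lines:

```python
# Toy illustration: interlaced scanning updates every other line per
# refresh (alternating "fields"), progressive updates all of them.

LINES = 8  # tiny "screen" for demonstration

def interlaced_field(new_frame, screen, field):
    """Update only every other line: even lines on one field,
    odd lines on the next. Each refresh carries half the image."""
    for y in range(field % 2, LINES, 2):
        screen[y] = new_frame[y]
    return screen

def progressive_frame(new_frame, screen):
    """Update every line on every refresh."""
    for y in range(LINES):
        screen[y] = new_frame[y]
    return screen

screen = ["old"] * LINES
frame = [f"new{y}" for y in range(LINES)]
interlaced_field(frame, screen, field=0)
print(screen)  # even lines updated, odd lines still show stale content
```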
Modern HDTVs have no problem doing 60fps progressive from Component (5 plug), DVI, or HDMI... or from broadcast TV channels if they're using HD. Usually, you'll see things like 720p (1280x720 progressive) or 1080p (1920x1080 progressive).
One issue you will have is that not all games run at 60fps.
Yes, there are good reasons why games need a higher resolution and framerate than movies.
TL;DR
Basically, it all boils down to the fact that a PC needs to compute everything it wants to show, while a movie simply records everything it sees. Therefore a movie can display the real world more accurately than a game, using less precision.
Framerate
When you take pictures with a camera, you sometimes end up with a blurry picture. This can happen when you move too much or the scene is too dark. That's because even though a camera is supposed to capture a single moment, it can't: it captures light over an exposure window. Ideally, that window is short (well under 100 ms), so the blur stays subtle.
Movies are made of several such pictures every second. Because every frame carries a small amount of motion blur, consecutive frames blend together seamlessly. Thanks to that, a movie can still look decent even at framerates as low as 24 FPS.
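As a rough illustration, here is a small Python sketch of that idea: a camera pixel integrates light over its exposure window, which we can approximate by averaging samples over time. The `scene_brightness` function is a made-up stand-in for a real scene, not anything from the original answer:

```python
# Sketch: why an open shutter produces blur. A camera integrates light
# over its exposure; we approximate that integral by averaging many
# time samples of a (hypothetical) moving scene.

def scene_brightness(x, t):
    """Made-up 1-D scene: a bright object whose position moves with
    time t; brightness is 1.0 wherever the object covers x."""
    object_pos = 100.0 * t  # object moves 100 units per second
    return 1.0 if abs(x - object_pos) < 1.0 else 0.0

def exposed_pixel(x, t_open, exposure, samples=64):
    """Average the scene over [t_open, t_open + exposure]. A longer
    exposure smears the moving object across more positions."""
    total = 0.0
    for i in range(samples):
        t = t_open + exposure * i / samples
        total += scene_brightness(x, t)
    return total / samples

# Short exposure: the object is simply there (sharp).
# Long exposure: fractional brightness, i.e. motion blur.
print(exposed_pixel(x=0.5, t_open=0.0, exposure=0.001))  # ~1.0, sharp
print(exposed_pixel(x=0.5, t_open=0.0, exposure=0.100))  # ~0.15, blurred
```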
PC games are quite different in that regard. Each frame is rendered for one exact instant in time, so there is no natural blur, and without blur the jump between consecutive frames is abrupt and much more noticeable to the human eye.
But isn't there plenty of blur in modern games nowadays? Yes, but it is applied to each individual frame after the fact and is quite different from the natural blur found in movies. Modern rendering techniques blur each frame based on how objects have moved since the last frame, which makes motion blur in games look more natural and can make lower framerates feel smoother.
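A heavily simplified sketch of that idea follows; real engines do this on the GPU with per-pixel velocity buffers, while the code below is just a 1-D toy version:

```python
# Simplified sketch of per-object motion blur: blur each pixel along
# the direction the object moved since the last frame.

def motion_blur_1d(frame, velocity, taps=5):
    """frame: list of pixel values; velocity: pixels moved per frame.
    Sample backwards along the motion vector and average."""
    blurred = []
    for x in range(len(frame)):
        total = 0.0
        for i in range(taps):
            # step back along the motion path, clamped to the frame
            sx = min(max(int(x - velocity * i / taps), 0), len(frame) - 1)
            total += frame[sx]
        blurred.append(total / taps)
    return blurred

sharp = [0, 0, 0, 1, 0, 0, 0, 0]  # a bright object at x = 3
print(motion_blur_1d(sharp, velocity=3))
# the bright pixel smears across the pixels it passed through
```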
Resolution
With resolution, we have a similar phenomenon happening, but for a different reason.
As a gamer, you're surely familiar with the term "aliasing". Aliasing does not happen in movies, because each pixel is naturally "blurred": when the camera records several light sources within a single pixel, it mixes that light together. The seam between one object and the next is therefore blurred, so we do not see aliasing.
The PC operates differently. It doesn't record a scene; it computes it. To determine which color a specific pixel needs, the PC calculates the color at a single mathematical point in the scene. For the PC, a pixel has no size; it is merely a point in the scene which needs to be rendered. For us, a pixel has a very real size, and unless you have a screen with a very high pixel density (commonly measured in DPI or PPI), those pixels are pretty big.
In order to avoid aliasing, the PC therefore needs to treat a single pixel as if it were made of several smaller pixels (i.e. supersampling: rendering a frame at a higher resolution than you display it). Obviously, the smaller those sub-pixels, the more accurate the anti-aliasing, and the more resource-intensive it becomes. There are of course other, less resource-intensive (and less accurate) anti-aliasing methods, like FXAA, which simply applies a very weak blur to hide aliasing.
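Here's a minimal Python sketch of the difference between sampling a pixel at a single point and supersampling it. The `scene` function is a made-up stand-in for a real renderer:

```python
# Sketch of supersampling: to color one pixel, evaluate the scene at
# several sub-pixel points and average, instead of sampling one point.

def scene(x, y):
    """Hypothetical scene: everything below the line y = x is 'object'."""
    return 1.0 if y < x else 0.0

def pixel_point_sampled(px, py):
    """Naive: sample once at the pixel center. Edge pixels snap to
    fully on/off, producing jagged 'staircase' aliasing."""
    return scene(px + 0.5, py + 0.5)

def pixel_supersampled(px, py, n=4):
    """Supersampling: n*n sub-samples per pixel, averaged. Edge pixels
    get intermediate values, so the edge looks smooth."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += scene(px + (i + 0.5) / n, py + (j + 0.5) / n)
    return total / (n * n)

# A pixel the diagonal edge passes through:
print(pixel_point_sampled(2, 2))  # 0.0 or 1.0, nothing in between
print(pixel_supersampled(2, 2))   # 0.375: a partial, "gray" edge pixel
```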
Increasing the resolution doesn't exactly get rid of aliasing, but with small enough pixels, it becomes more difficult to spot.
First things first: not every game server runs at a 60 Hz tick rate. Valorant servers run at 128 ticks per second, and CS:GO private tournaments run at 128 as well. But even when the server doesn't respond as quickly as your monitor refreshes, that doesn't mean you get no advantage from a 144 Hz or 240 Hz monitor.
Higher refresh rates mean lower input lag, so the game reads your input sooner. Animations look smoother (it's easier to see if an enemy is using a certain ability in Overwatch, Paladins, Sekiro...), you can track moving targets WAY more easily because your eyes receive more information in the same time interval, and the monitor responds more quickly when you have to correct the point you are aiming at. This applies to almost any other game too (not fighting games, which are often hard-capped at 60 FPS). In Rocket League, with a lower monitor response time, you can turn your car and correct your air or ground trajectory sooner.

I switched from 60 Hz to 144 Hz last year and saw noticeable benefits in every game I tried that wasn't hard-capped. BO4 was way easier, I was more competitive in R6, and I was more precise with car movement and shots in Rocket League. If you are still not sure, this page says that 96% of R6 pros (real pros) use a monitor capable of displaying at least 144 frames per second. I mean, they probably have a reason to do so.
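For a rough sense of the numbers behind this, here's a quick back-of-envelope calculation of the interval between updates at various tick rates and refresh rates; on average, a new piece of information waits about half a refresh interval before the screen can show it:

```python
# Back-of-envelope: interval between server ticks / monitor refreshes,
# plus the average extra wait before the screen can display new info
# (on average, half a refresh interval).

def interval_ms(rate_hz):
    return 1000.0 / rate_hz

for name, rate in [("64-tick server", 64), ("128-tick server", 128),
                   ("60 Hz monitor", 60), ("144 Hz monitor", 144),
                   ("240 Hz monitor", 240)]:
    print(f"{name:>16}: {interval_ms(rate):5.2f} ms between updates, "
          f"~{interval_ms(rate) / 2:4.2f} ms average display wait")

# 60 Hz  -> 16.67 ms per refresh (~8.3 ms average wait)
# 144 Hz ->  6.94 ms per refresh (~3.5 ms average wait)
# 240 Hz ->  4.17 ms per refresh (~2.1 ms average wait)
# Even with a 128-tick server (7.81 ms between ticks), the faster
# monitor shows each tick's result sooner.
```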