DIFFERENCE BETWEEN REFRESH RATE AND FRAMES PER SECOND
So you're in the market for a new TV, and you've probably been shopping around for the best deal that offers all the bits and bobs that take your fancy. Think features such as a stunning 55" 4K screen with smart functionality. But there is one little thing you're probably not too sure about, and that is the refresh rate rating, which is usually measured in Hz.
Your content source, whether it be a computer's graphics card, a plug-in streaming device, satellite TV or even a TV streaming app, outputs its pictures as a number of frames per second. The frame rate defines the perception of motion and is usually somewhere between 25 and 60 FPS. In some situations, such as gaming, it can jump up to 120 frames per second.
But what has this got to do with the refresh rate of your display?
The refresh rate describes the number of times per second the display can physically redraw the picture on screen. Modern digital sources deliver their frames progressively, meaning each frame arrives as a complete picture, so the display needs a refresh rate at least equal to the frame rate to show every frame. So, if you are watching Netflix at 60 frames per second, the minimum required refresh rate would be 60Hz. (Older interlaced broadcasts split each frame into two fields, which is why they needed twice as many refreshes as frames.)
Likewise, if you are gaming on an Xbox Series X at 120 frames per second, the minimum required refresh rate would be 120Hz.
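To put that relationship into plain numbers, here is a minimal Python sketch. The `minimum_refresh_rate_hz` helper is purely illustrative (it is not part of any TV or console API) and simply assumes one refresh per frame for progressive content and two refreshes per frame for interlaced content.

```python
def minimum_refresh_rate_hz(frames_per_second: float, interlaced: bool = False) -> float:
    """Return the lowest refresh rate (Hz) that can show every frame of a signal.

    Progressive content needs one panel refresh per frame; interlaced
    content splits each frame into two fields, so it needs two.
    """
    refreshes_per_frame = 2 if interlaced else 1
    return frames_per_second * refreshes_per_frame


# Netflix streaming at 60 fps (progressive): at least 60 Hz is needed.
print(minimum_refresh_rate_hz(60))                    # 60.0
# An Xbox Series X game running at 120 fps: at least 120 Hz.
print(minimum_refresh_rate_hz(120))                   # 120.0
# A 1080i broadcast at 30 fps arrives as 60 fields per second: 60 Hz.
print(minimum_refresh_rate_hz(30, interlaced=True))   # 60.0
```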
But what happens if your display doesn't meet those requirements? The source device detects your display's capabilities over whichever connection you're using and reduces its output, so the frames per second you actually see are capped at the refresh rate of your display.
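As a rough illustration of that capping behaviour, here is another short sketch. The `effective_frame_rate` helper is hypothetical and simply assumes the display can show at most as many frames per second as its refresh rate.

```python
def effective_frame_rate(source_fps: float, display_hz: float) -> float:
    """Frames per second the viewer actually sees on a fixed-rate panel.

    A display cannot show more frames than it refreshes, so anything
    above its refresh rate is effectively dropped or never rendered.
    """
    return min(source_fps, display_hz)


print(effective_frame_rate(120, 60))    # 60.0 -> a 60 Hz TV caps a 120 fps game
print(effective_frame_rate(60, 120))    # 60.0 -> a 120 Hz TV just repeats each frame twice
```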
Take a look at the video below by the YouTube channel Greg Salazar on Refresh Rate vs. Frames Per Second...