Monitors Articles

What Is Monitor Refresh Rate? (Definitive Guide)

Have you ever been curious about what monitor refresh rate is?

In brief, monitor refresh rate is the number of times the monitor updates the content on the screen in a second.

This definition is sufficient for someone experienced; for a newbie, however, we may have to go a bit deeper into its meaning.

You may have heard this term come up several times when searching for a new monitor or a TV.

How does it work? What does it mean for you as a customer? Is a higher refresh rate always the better option? Would it make your gaming and viewing experience worth the money?

I understand if you have a ton of questions. Here, I will try to dissect each and every angle of what refresh rate is and hopefully present it in a simplified manner.

This is the first of many comprehensive guides on TV/Monitor terminologies on TechGearoid. Please follow us for more updates.

If you perform a simple search for Refresh Rate, you may naturally visit the Wikipedia page which reads:

“The refresh rate (most commonly the “vertical refresh rate“, “vertical scan rate” for cathode ray tubes) is the number of times in a second that a display hardware updates its buffer.”

Note: Buffer in the above definition is a hardware component. In simplified terms, just understand that the buffer here represents the internal hardware. Moreover, Cathode Ray Tubes (CRTs) have been more or less phased out entirely, so there is no need to worry much about that.

As is the case with almost all technical definitions found on Wikipedia, this is certainly a mouthful.

If I hadn’t done my proper research on refresh rate, a definition like this would go right over my head.

Therefore, let us break it down a bit.

However, before we attempt to understand what is monitor refresh rate in its entirety, let us look at what frequency is.

Let me warn you, the things below WILL confuse you in the beginning; however, I assure you that in the end it’ll all fit together in your brain like a jigsaw puzzle. Also note that Monitor and TV refresh rates are the SAME THING.

TL;DR: What is Monitor Refresh Rate? The number of times the monitor updates the content on the screen in a second. 60Hz = 60 updates in a second. TV refresh rate works the same way as monitor refresh rate.

What is Frequency?

If you have ever taken a basic physics course, you must have definitely heard of the term frequency. This is represented with the unit Hz (Hertz).

waveform animation

A Waveform with 4Hz Frequency

Frequency (in electrical devices) basically means the number of times SOMETHING completes its CYCLE in a SECOND. Generally, this is represented as a waveform.

Waveform 2

1 Cycle Shown Here (AKA wavelength)

In the case of monitors and TVs, that SOMETHING is the internal hardware – how fast the internal hardware updates the picture on the screen.

Different frequencies have different names. We have the radio frequencies, microwave frequencies, mechanical frequencies etc.

The Frequency of the Monitor or TV is known as the REFRESH RATE.

The CYCLE here is one complete refresh – the internal hardware refreshing itself and repainting a new picture on the display.

If you are confused, just hang in there, by the end of this section, you will become an expert on monitor and TV frequencies.

Since a picture speaks a thousand words, this should help you.

Have a look at the three different waveforms below.

waveform comparison

4Hz vs 12 Hz vs 24 Hz

  • The first Waveform completes only 4 CYCLEs in a second. Therefore, it has a frequency of 4Hz.
  • The second Waveform completes 12 cycles in a second. Therefore, it has a frequency of 12Hz.
  • The third Waveform completes 24 cycles in a second. Thus it has a frequency of 24Hz.
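
The cycles-per-second relationship above is trivial to express in code. Here is a toy Python sketch (the function name is mine, purely for illustration):

```python
def frequency_hz(cycles: int, seconds: float = 1.0) -> float:
    """Frequency is simply the number of completed cycles per second."""
    return cycles / seconds

# The three waveforms above:
print(frequency_hz(4))   # -> 4.0 (Hz)
print(frequency_hz(12))  # -> 12.0
print(frequency_hz(24))  # -> 24.0
```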

Hopefully now you have a grasp of what Frequency is.

Let us apply this to Monitors and TVs.

A monitor or a TV with a 4Hz refresh rate would cycle its hardware only 4 times in a second. Therefore, it paints only 4 pictures (frames) in a second.

Can you imagine how shaky your video would be if there were only 4 frames in a second?

A monitor or a TV with 12Hz refresh rate would cycle its hardware 12 times in a second. It would paint 12 consecutive frames in a second. This would still be unbearably slow. It’ll certainly give you a headache if you watch it for too long.

Related: What Is Monitor Contrast Ratio?

waveform video comparison

Sample Ball Moving at Different Refresh Rates

The animation above lasts for one second, i.e., it has a frame rate of 24 FPS. However, notice how different refresh rates change the SMOOTHNESS of the video.

The higher the refresh Rate, the lower would be the MOTION BLUR.

Typical monitors and TVs have a refresh rate of 60Hz, meaning the hardware cycles, or REFRESHES, the image on the screen 60 times in a second.

In other words, with a 60Hz refresh rate, the monitor or TV has the capacity to paint an ENTIRELY new picture 60 times in a second.

This is the most crucial idea that you need to remember!

The higher the refresh rate, the more pictures (frames) the TV or the Monitor can POTENTIALLY paint on the display in a second.

I say POTENTIALLY because there is the other half of the equation i.e the frame rate, to take into consideration.

In other words, just because a monitor has the CAPACITY to show 60 frames per second does not mean that it will. That depends upon the frame rate of the video.
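
That "potential vs. actual" relationship boils down to taking the minimum of the two numbers. A toy Python sketch (the function name is my own, for illustration):

```python
def displayed_fps(refresh_rate_hz: int, source_fps: int) -> int:
    """A display can never show more unique frames per second than
    either its refresh rate or the source's frame rate allows."""
    return min(refresh_rate_hz, source_fps)

print(displayed_fps(60, 24))   # -> 24: a 24 FPS movie on a 60Hz panel
print(displayed_fps(60, 120))  # -> 60: the monitor caps a 120 FPS game
```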

Movies have a frame rate of 24 frames per second NOT 60 frames per second. So how does it all work?

Related: What Is DCR Setting On A Monitor?

TL;DR: The slower the Refresh Rate of a monitor/TV, the blurrier, jerkier, and jumpier the video. The faster the Refresh Rate, the smoother the video. Just because a monitor has a 60Hz Refresh Rate does not mean it will show 60 new frames every second.

How Does Refresh Rate Work?

A TV or a monitor can have a variety of refresh rates these days. The most common is 60Hz refresh rate.

Therefore, in order to understand how refresh rate works, let’s take 60Hz as an example.

We have already established above that a 60Hz refresh rate means that the monitor/TV has the ability to show 60 frames in a second.

Let us represent those as boxes below.

60 refreshes

60Hz Refresh Rate = 60 Screen Updates

This creates the FIRST issue here: There are more refreshes than there are frames.

How do you solve this? Well, this issue is solved through the principle of DUPLICATION.

However, before we go any further, let us solidify the concept of Refresh Rate and Frame Rate.

Related: What Is ACM Setting On A Monitor?

How are Refresh Rate and Frame Rate Related?

A frame is basically a picture on the screen painted by the monitor/TV. The Frame Rate is the number of pictures painted in a SECOND.

There are two types:

  • Rendered Frame Rate
  • Real Time Frame Rate

Rendered = Movies

Movies and videos are mostly RENDERED at a 24 FPS frame rate, meaning they show 24 pictures in a second. Since they are pre-rendered, there is nothing you can do to increase this frame rate.

You can throw in the most powerful computer hardware, yet the frame rate will remain 24 frames per second.

No one really has too big of an issue with a 24 frame rate when it comes to movies.

However, that is not the case with gaming.

Real Time Frame Rate = Gaming

REAL TIME rendering occurs when the graphics card in a PC is actively rendering each and every frame, well, in real-time.

Gaming is the prime example.

Nothing that you do when gaming is pre-rendered. Everything is being rendered actively by the graphics card and the graphic engine right then and there.

A weaker graphics card would not be able to pump a high frame rate. Whereas, a high end graphics card would have no issue with that.

The higher the graphics settings you choose, the more a graphics card would need to work to render everything in real time.

However, it doesn’t matter if the graphics card is running at its PEAK capacity. If it’s weak, it IS weak.


If your Intel iGPU (one of the weakest graphics processors on the market) is giving you 10 frames per second while playing Battlefield 5, there is absolutely nothing you can do to the iGPU to improve it.

The only option you have is to get a better graphics card. Real Time rendering is a RESOURCE HEAVY TASK.

For gamers, playing a game at 24 frames per second would literally give you a headache. Let alone the fact that 24 FPS is far from something you would use for competitive gaming.

Yet for movies 24 FPS is the way to go.

Gamers WANT more frames!

How many frames per second can your monitor POTENTIALLY support?

A 60Hz monitor can only support 60 frames per second EVEN if your graphics card has the capacity to render more frames than that.

If you want to see more than 60 frames in a second, you would go for a monitor that has a higher refresh rate like 120 Hz. Of course, you will need a graphics card that is capable of delivering 120 frames per second to fully utilize it.

If you get a 120Hz monitor, but your graphics card can only push a 40-45 frame rate, then you will have wasted your money. Sticking with 60Hz would then be the right option.

In short (For Real Time Rendering),

  • Refresh Rate = Function of the Monitor
  • Frame Rate = Function of the Graphics Card

TL;DR: Movies are produced at 24 FPS. Frames for games are generated in real time by the graphics card. The sky is the limit for gamers.

How Does It All Work?

Now that you have understood the difference between frame rate and refresh rate and their relation, let us continue on with understanding how it all works.

As mentioned earlier, a typical monitor/TV has a 60Hz refresh rate. Therefore it has the potential to show up to 60 frames per second.

Movies, on the other hand, operate at only a 24 FPS frame rate.

This creates the obvious issue that we talked about earlier: 60 is not evenly divisible by 24, i.e., there are more refreshes than there are movie frames.


60 Screen Refreshes vs 24 Movie Frames


This first issue is solved via the process of duplication. Basically, each frame is stretched to fit multiple refreshes.

refresh rate vs frame rate

Since 24 frames do not fit evenly into 60 refreshes, they are basically duplicated onto the screen by the TV/Monitor.

Let’s zoom into the first three movie frames and see how they are duplicated.

zoomed 3 frames

3 Movie Frames and Duplication During Refreshes

Here you can see the principle of duplication being applied.

The first frame of the movie occupies 3 refreshes. The second frame occupies two refreshes. The third frame occupies 3 refreshes and so on…

The Cyan color boxes represent the refreshes where a frame update occurs.
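
The duplication pattern above can be reproduced with one line of arithmetic: the frame shown on refresh r is floor(r × fps ÷ refresh rate). A Python sketch of this idea (my own illustration, not how any particular TV implements it):

```python
from collections import Counter

def duplication_schedule(refresh_hz: int, fps: int):
    """For each screen refresh in one second, return which source
    frame is shown: frame for refresh r is floor(r * fps / refresh_hz)."""
    return [r * fps // refresh_hz for r in range(refresh_hz)]

schedule = duplication_schedule(60, 24)
print(schedule[:8])  # -> [0, 0, 0, 1, 1, 2, 2, 2]: the 3:2:3... pattern

counts = Counter(schedule)
print(counts[0], counts[1], counts[2])  # -> 3 2 3 refreshes per frame
```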

Another way that a movie occupies all refreshes is through the principle of Motion Interpolation. We discuss this below in detail.

This brings us to the NEXT ISSUE: JUDDER/De-Synchronization.

TL;DR: Duplication stretches each frame of the movie to fit the refreshes of the monitor/TV.

The Issue of De-Synchronization

Judder occurs when movie frames do not occupy the refreshes perfectly. This makes the video look jittery and shaky.

Since movies are shown at 24 frames per second and a typical monitor/LCD has a refresh rate of 60Hz, the movie does not occupy all the refreshes perfectly.

i.e. 60/24 = 2.5 – NOT A WHOLE NUMBER.

Here is what it looks like.


Half a Refresh Rate Worth of LAG

This graphic shows just 2 movie frames for the purpose of illustration.

Notice how there is a lag between when the 2nd movie frame is SUPPOSED to show and when it ACTUALLY shows. There is half a refresh worth of lag. This is all due to the fact that 24 frames per second is not perfectly synchronized with a 60Hz TV.

So even after duplication occurs to fill up the refreshes of the monitor/TV, the frame rate of the video is not synchronized with the refresh rate of the monitor.

The first two refreshes of the monitor would show the first frame of the video.

The problem occurs on the third refresh. According to the movie, the 2nd frame should be updated onto the screen. However, due to the de-sync, there is lag and the monitor continues to show the 1st frame for an additional half a refresh.
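
The half-refresh lag can be computed directly: a frame's ideal time is frame/fps, but the panel can only switch on a refresh boundary. A small sketch using exact fractions to avoid floating-point noise (the function name is mine):

```python
import math
from fractions import Fraction

def judder_lag_ms(refresh_hz: int, fps: int, frame: int) -> float:
    """How late does frame number `frame` appear, in milliseconds?
    The ideal time is frame/fps; the panel can only switch on a whole
    refresh boundary, so the frame waits for the next refresh."""
    ideal = Fraction(frame, fps)
    actual = Fraction(math.ceil(ideal * refresh_hz), refresh_hz)
    return float((actual - ideal) * 1000)

print(judder_lag_ms(60, 24, 1))  # -> ~8.33 ms: half a refresh (1/120 s) late
print(judder_lag_ms(60, 24, 2))  # -> 0.0: every other frame lands exactly
```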

There are many techniques used to resolve the de-sync between refresh rate and frame rate.

TL;DR: A typical 24 FPS movie does not fit evenly on 60Hz Refresh Rate of the Monitor/TV even after duplication.

For Cinephiles - Judder Is not an Issue

Is This Judder/Lag/De-Sync Really an Issue?

Technically, no. The human eye/brain cannot perceive these lags. Another important consideration is the movie culture.

The Movie Culture/Art

We have unconsciously and collectively adopted 24 frames per second as the standard for the movies we watch. It is an element of the movie art.

To change movies to a higher frame rate would be like asking to flatten or level the pointillism inspired painting “Parade de Cirque” with a smooth brush.

But Wait! Don’t higher frame rates make a movie MORE realistic?

Well, as the saying goes: if you want real, look outside the window. Movies are not supposed to LOOK real.

Van Gogh’s The Starry Night is a masterpiece not because it looks REAL. It is a masterpiece because it looks anything but.

Van Gogh's The Starry Night

Van Gogh's "The Starry Night" does not look REAL

24 frames per second is the oldest standard for a comfortable frame rate for movies and it is still used to this date.

The older monochrome silent movies used to have a much slower frame rate like 16 fps. Those certainly looked choppy as the slower frame rate was perceivable by the brain.

That is not the case with 24 FPS. It is comfortable and has come to be realized as the CINEMATIC STANDARD.

In short, for “MOVIE MAGIC” to work, it has to be shot and watched at 24 FPS.

The Right Balance Between Budget and Art

Of course, there was the monetary consideration when choosing 24 fps as the standard.

Film makers wanted to strike the perfect balance between a smooth video frame rate AND affordability.

24 frames per second was conceived during a time when videos were still made on physical film. There was no digital software, nor computers.

A higher frame rate would mean you would need more film. For example, going from 12 FPS to 24 FPS would require double the amount of film.

On top of that, more frames meant more manual labor in editing the movie.

24 FPS was chosen as the golden standard. It was smooth AND economically viable.

Learn more about movie magic here: The illusion of Motion

When Do Higher Frame Rates Make Sense?

I hope that you have realized by now that a higher frame rate = more realism.

The fact that we have chosen 24 FPS as the Cinematic Standard has nothing to do with Realism and everything to do with “Movie Magic”.

So when would higher frame rates actually make sense?


Gaming is the holy grail of higher frame rates.

The higher the frame rates, the smoother the experience.

In fact, gamers and streamers can certainly be considered the pioneers of propagating and promoting higher frame rates.

Movie Industry Seems to Hate Higher Frame Rates…

The movie industry DID TRY higher frame rates. Peter Jackson’s The Hobbit Trilogy was shot at 48 frames per second (twice the normal).

However, it failed to convince viewers and critics with its “realistic” cinematography.

Jack Coyle of the AP wrote:

“Critics said the film seemed overamplified and that the increased clarity yielded a discombobulating hyper-realism that contrasted poorly with the set design.“

So there goes all the ambition of pulling higher frame rates on movies.

TL;DR: The uneven fit of 24 FPS movies on the 60Hz refresh rate of monitors/TVs is part of the Cinematic Standard. People generally have no issues with the judder or the slow frame rate. However, gamers cannot fathom running games at a mere 24 FPS.

Solving the De Sync

Since most people are okay with the cinematic experience of 24 FPS, they do not want it changed.

However, the current trend, especially among gamers, is that they want things changed.

It is a well-known adage among gamers that 60 FPS can cure cancer.

60 FPS cures cancer

There are plenty of techniques to increase the frame rate and to fully utilize the refresh rate of your display.

1. Shooting at Higher Frame Rates

If you want more realism, simply shoot at higher frame rates.

But, it is not as easy as that.

You need the equipment for it. A quality camera that shoots 60 FPS will definitely be more expensive. Let alone the fact that you will need a huge amount of disk space to save a video file shot at 60 FPS.

On top of that, if you are a film maker, good luck convincing Cinephiles to watch your “Life-Like” video instead of the cinematic 24 FPS movies.

However, if you are adamant about getting a high-FPS camera, you can find such models in all categories, including the best action cameras.

It should be noted here that while 24 FPS is the standard for movies, 30 FPS has been the standard for television.

If you ever have a different feeling when watching movies compared to Television, well, now you know why.

Basically, TV shows are smoother than movies.

Now 30 FPS is a great number. It divides 60Hz perfectly. Therefore, at 30 FPS you have one video frame for every two screen refreshes. This, in turn, means smoother video.

2. 120Hz Refresh Rate

Although 60Hz monitors and TVs are still the most common out there, 120Hz displays will most certainly become the new standard.

One of the best things about 120Hz TVs/Monitors is that all common film and video frame rate standards fit into 120Hz perfectly.

  • 24 FPS x 5 = 120 FPS
  • 30 FPS x 4 = 120 FPS
  • 60 FPS x 2 = 120 FPS
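
This divisibility check is easy to sketch in Python (a toy illustration; the function name is my own):

```python
def frames_per_refresh(refresh_hz: int, fps: int):
    """Return how many refreshes each frame occupies, or None if the
    frame rate does not divide the refresh rate evenly (i.e. judder)."""
    return refresh_hz // fps if refresh_hz % fps == 0 else None

for fps in (24, 30, 60):
    print(fps, "FPS on 120Hz:", frames_per_refresh(120, fps), "refreshes/frame")
# -> 5, 4, and 2 refreshes per frame respectively

print(frames_per_refresh(60, 24))  # -> None: 60/24 = 2.5, hence judder
```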

Basically, if you are playing a 24 FPS movie on a 120 Hz monitor/TV, each frame of the movie will occupy 5 screen refreshes.

The best part here is that, there will be no overlaps as we saw with 24 FPS on 60 Hz above.

24 vs 30 vs 60 movie frames

24 vs 30 vs 60 Movie Frames on 120Hz Refresh Rate

No overlaps = no lag = no JUDDER = smoother video.

BUT, if you want EVEN MORE smoothness to your movies and videos, you must get a TV with MOTION INTERPOLATION.

3. Motion Interpolation (SOAP OPERA EFFECT)

This is where processors, algorithms, software and all that fancy jargon becomes relevant.

Motion Interpolation is an advanced technique employed by many TVs that enables them to GUESS and GENERATE a frame in between two video frames.

Here is how that works:

interpolation slow

Simplified Representation of Motion Interpolation

We can see that the motion interpolation algorithm has guessed and generated a new frame in between.

Since all the frames are now closer to each other, this will result in a reduced MOTION BLUR.

This makes the videos more fluid and smooth.
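
Here is a crude sketch of the idea, treating a frame as a flat list of pixel brightness values and generating the in-between frame as a plain average. Real TV interpolation engines estimate motion vectors rather than blending, but the goal is the same: a plausible middle frame.

```python
def interpolate(frame_a, frame_b):
    """Generate an in-between frame by averaging pixel values.
    A toy stand-in for real motion-compensated interpolation."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

frame1 = [0, 0, 255, 0]  # a bright pixel on the left...
frame2 = [0, 0, 0, 255]  # ...that has moved one step right
print(interpolate(frame1, frame2))  # -> [0.0, 0.0, 127.5, 127.5]
```

Note how the simple average smears the moving pixel across both positions; this is exactly the kind of situation where real engines must guess the motion instead, and where their guesses can produce artifacts.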

On the downside, it eliminates the Cinematic Essence of a jumpy 24 FPS movie.

The resulting video look is commonly referred to as the SOAP OPERA EFFECT.

As mentioned earlier, TV shows traditionally run at a higher frame rate as compared to films (30fps vs 24fps respectively). Therefore, the feel between the two is naturally different.

Motion Interpolation essentially eliminates the distinctive feel of a 24 FPS film and makes it resemble the quality of a TV show.

Drawback: Visual Artifacts

Interpolation is governed by complex software, and sometimes it can make MISTAKES.

Interpolation is simple for the software on slow, linear scenes – like a car moving slowly in a straight line. However, it starts to generate artifacts when there is simply too much happening on the screen.

During explosions or extremely fast scenes, there is a high chance that the interpolation engine on your TV will produce visual anomalies.

4. Adaptive Refresh Rate (Gamers Rejoice)

ADAPTIVE REFRESH RATE is the newest technology that aims to synchronize the frame rate with the refresh rate of the monitor.

There are two pioneering technologies behind this: FreeSync by AMD and G-Sync by NVIDIA.

Most gaming monitors these days offer either of the two technologies.

If you haven’t read the rest of the article, just remember the following key concepts:

  • Refresh Rate is the Function of the Monitor
  • Frame Rate is the Function of the gaming hardware i.e Graphics Card

How does it work?

Monitors and TVs traditionally have a static refresh rate, meaning it does not change. The most common is 60Hz.

Adaptive refresh rate, aka, dynamic refresh rate technology enables the monitors to ADJUST their refresh rates depending upon the frame rate of the game.

In essence the monitor and graphics card COMMUNICATE with each other.

We know that frame rates JUMP all over the place. Sometimes you will be getting 75 FPS, and at other times it will drop all the way to 40 FPS.

With the adaptive sync technology, if the frame rate drops, the refresh rate drops.

If the frame rate goes higher than the supported refresh rate, then the monitor tells the graphics card to ease down a little.
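
That two-way communication can be sketched as a simple clamp of the instantaneous frame rate into the panel's supported range. The 40–144Hz window below is a made-up example; real adaptive-sync ranges vary per monitor.

```python
def adaptive_refresh(frame_times_ms, min_hz=40, max_hz=144):
    """With adaptive sync, the monitor refreshes when a frame arrives,
    so the instantaneous refresh rate tracks the frame rate, clamped
    to the panel's supported range (min_hz/max_hz are assumptions)."""
    rates = []
    for ft in frame_times_ms:
        hz = 1000 / ft                       # instantaneous frame rate
        rates.append(max(min_hz, min(max_hz, hz)))
    return rates

# Frame times of ~13.3ms (~75 FPS) and 25ms (40 FPS):
print(adaptive_refresh([13.3, 25.0]))  # -> roughly [75.2, 40.0]
```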

amd freesync 1

Notice How a PART of the Frame 2 is repeated into the 3rd Refresh, when the 3rd Refresh should be reserved for the Frame 3 alone!

amd freesync 2

Adaptive FreeSync by AMD in Action as it changes the Refresh Rate

What is the Benefit of this?

When a desynchronization occurs between the refresh rate and the frame rate, it causes two issues:

  • Screen Stutter
  • Screen Tear

Screen Tear

Screen tearing occurs when you have very powerful hardware that can pump more frames than the monitor can support.

For example, if you have a monitor with a 60Hz refresh rate, but your graphics card is feeding it a game running at 75 frames per second, there is obviously an issue.

Basically, your gaming rig gets bottlenecked by the monitor.

Related: Does Using Multiple Screens Affect The FPS?

screen tear

A representation of What Screen Tearing Looks Like [Image Credit: AMD]

When this happens, your monitor is practically getting fed MORE than a single frame per monitor refresh.

Traditionally, this issue was solved through enabling V-Sync.

However, V-Sync has its own issues. It forces the graphics card to supply ONLY as many frames as the monitor can support.

Unfortunately, when a game gets too intensive, with explosions here and there and a ton of particles, the frame rate drops all of a sudden.

This leads to the second problem i.e SCREEN STUTTER.

Screen Stutter/Lag

Screen Stutter, commonly known as lag, occurs in two instances:

  1. You have a weak graphics card
  2. You had enabled V-Sync, but a sudden intensive scene in gaming dropped the frame rate way below the monitor’s refresh rate

In the first instance, if your monitor has a refresh rate of 60Hz but your gaming hardware is only pumping out 40 frames per second, then there is a high chance you will experience screen stutter or lag. To resolve this issue, the adaptive-sync feature makes the monitor reduce its refresh rate to match the frame rate of the game.

In the second instance, your rather powerful hardware got into trouble just because you had V-Sync enabled. Well, with adaptive-sync technology, you do not need to keep V-Sync enabled at all.

TL;DR: There are many ways to solve the de-sync between movie/game frame rates and the monitor/TV refresh rate. Synchronization, however, is a double-edged sword. While it eliminates judder in movies, it also eliminates the cinematic feel for Cinephiles. On the other hand, synchronization is a blessing for gamers in every way possible.

People Also Ask (FAQs)

If you have read and tried to understand all the stuff above, you must have a ton of follow-up questions.

Let me try to answer a few.

Why Do TVs and Monitors Have a 60Hz Refresh Rate?

To answer this question, we will have to visit history and the beginning of the electrical grid.

When Alternating Current (AC) won the battle against Direct Current (DC), it became the norm in the U.S. and in Europe (and subsequently in the entire world).

Unlike DC power, AC power is supplied at a STANDARD FREQUENCY. The US adopted 60Hz as the standard supply frequency (Europe adopted 50Hz).

This frequency was selected for multiple reasons.

Essentially it was a compromise between lowering the visible FLICKER vs higher transmission losses.

A higher electrical supply frequency reduces the visible flicker of lights because the dips in brightness come too fast for the eye to follow. On the other hand, higher frequencies mean greater losses as the current travels along the transmission lines.


Flicker is Noticeable on Slow AC Line Frequencies

A higher frequency also means the equipment would run much hotter and would require cooling.

An AC current looks like a sine wave. Meaning the bulb is turned on and off as the sine wave reaches the peak and the trough (bottom) respectively.

A supply frequency of 60Hz ensures that the bulb switches on and off 60 times in a second – so fast that the flicker is almost invisible to the eye.

Now imagine if the supply frequency was 1Hz. That would mean that the light bulb would turn on for half a second and off for half a second.
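
Using the article's simplified on/off model, the duration of each phase is easy to compute (a toy sketch; the function name is mine):

```python
def on_off_duration_s(supply_hz: float) -> float:
    """In the simplified model above, the bulb spends half of each
    cycle bright and half dim, so each phase lasts 1/(2*f) seconds."""
    return 1 / (2 * supply_hz)

print(on_off_duration_s(1))   # -> 0.5 s: a very visible blink
print(on_off_duration_s(60))  # -> ~0.0083 s: far too fast to notice
```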

That would be highly annoying since the constant dimming and brightening would certainly give you a headache.

Considering all that, a standard frequency was adopted.

The standard 60Hz supply frequency was chosen long before electronics came into being let alone the CRT and the LCD televisions.

Hence, almost all electronic devices, not just display devices, were made to follow the standard frequency.

When the CRT TVs were invented, they too were designed to run at 60Hz.

Since the Cathode Ray Tubes were highly sensitive components, a refresh rate different from the electrical supply frequency introduced problems such as interference, “hums”, flashing screens etc.

Therefore, both screen refresh rate and AC line frequency were HARMONIZED.

When LCDs were introduced, they did not suffer from the same issues as CRT monitors. Their components were not as sensitive to the issues that CRTs faced.

However, 60Hz had long been established as the norm in the video industry. Thus, they simply carried on with the same standard.

These days, however, LCD monitors offer a wide variety of refresh rates. It is not uncommon to find monitors with 100, 120, 144, 165, or 240Hz refresh rates.

They function perfectly fine, without any “hums” or “interference” from the AC line frequency, despite the fact that they are NOT harmonized.

TL;DR: It has to do with history.

Why Not Have 24Hz TVs When Movies Have a 24 FPS Frame Rate?

We talked about the issues of a lower frequency earlier.

In our bulb example above, a slower line Current frequency means the flicker on the bulb is more noticeable.

The same applies to the LCD TVs/Monitor. LCD monitors and TVs are comprised of PIXELS. Each pixel has to be updated upon a refresh to paint a new frame.

A slower refresh rate means the pixels take longer to update. In fact, at 24Hz it is long enough for the eyes to notice. 

This phenomenon is called FLICKERING.

It is for this reason that TVs/Monitors with slow refresh rate are not made.

slow refresh rate and flicker

Slow Refresh Rate Causes Flickering (TV Image Source: Clipart Library)

A slow frequency i.e refresh rate = more noticeable flickering.

While you can technically have a TV with 24Hz refresh rate, the flickering would be so noticeable that it would be an unbearable experience.

In fact, some 60Hz TVs do offer a 48Hz mode, because 48Hz divides evenly by 24 FPS (the standard FPS of movies).

In this mode, each frame occupies two cycles. No overlap and no judder.

However, many have reported noticing flickering even when running at 48Hz.

Another huge drawback of flickering is the EYE STRAIN and HEADACHES it causes. A high amount of flickering can be damaging to your eyes, let alone the fact that constant flickering is dangerous for those with epileptic disorders. Plus, for people who work on spreadsheets and reports for prolonged hours, flickering can be a serious concern.

TL;DR: Slower refresh rates cause visible flicker.

Why Are Movies Made at 24 Frame Rate? Why Not Go Higher?

As mentioned above, it is certainly possible to shoot movies with higher frame rates than 24FPS.

However, a 24 FPS frame rate has come to be recognized as an artistic element of movies. 24 FPS is the CINEMATIC STANDARD that delivers the famous “Movie Magic”.

While a higher frame rate will certainly make movies more realistic, cinephiles do not seek realism. They deliberately want the cinematic feel.

TL;DR: People are okay with 24 frame rate. They call it the “movie magic”.

Should You Get More Than 60Hz TV if Not Gaming?

It is assumed that only gamers benefit from a TV or a monitor with a higher refresh rate.

That is hardly the case.

We have talked extensively about movies running at 24 FPS. A 60Hz TV or a Monitor refreshes 60 times in a second.

The 60 refreshes are not divided equally among the 24 frames of the movie (i.e. 60 / 24 = 2.5). This introduces JUDDER. In other words, it doesn’t look smooth.

If you get a 120Hz TV, it has the potential to refresh 120 times in a second.

24 is easily distributed among 120 refreshes of the screen (i.e 120/24 = 5; 1 movie frame for every 5 screen refreshes). This removes the judder and makes the movie look very smooth.

On top of that, most modern TVs come with MOTION INTERPOLATION, which we talked about earlier. This feature further improves the smoothness of movies.

Therefore, to answer the question: would you benefit from a TV with a higher refresh rate if you are not a gamer? Certainly!

However, it should be noted that watching a movie on a TV with higher refresh rate does not have the same CINEMATIC FEEL to it as compared to watching it on a 60Hz TV/Monitor.

TL;DR: 120Hz will make videos smoother, but eliminate the cinematic feel when watching movies. In other words, the choice is yours.

What is The Best Refresh Rate For You?

After this long and arduous read, you should be able to tell which refresh rate is best for you.

However, if you just want the gist of it, we have this small guide for you.

1. Casual Buyers

You can either go for 60Hz or 120Hz TVs.

As for monitors, there is no point in going for anything higher than 60Hz, since high refresh rate monitors are costly. They cater specifically to gamers.

60Hz TV Pointers for You

  • They are cheaper
  • You get the CINEMATIC FEEL when watching movies (since there is judder). You can read about this above.

120Hz TV Pointers for You

  • Smoother videos
  • No cinematic feel, since there is no judder (i.e. 24 FPS is divided equally among 120 refreshes; 24×5 = 120).
  • Motion Interpolation: 120Hz is the minimum standard to support motion interpolation, which can make your videos smoother still. The effect this technology produces is called the SOAP OPERA EFFECT.
  • They are more expensive

2. Console Gamers

Console gamers can also choose between 60Hz and 120Hz TVs/Monitors.

Traditionally console games were capped at 60 frames per second. Therefore, 120Hz TVs did not make sense in that case.

However, Microsoft recently removed the cap on the Xbox One X so that it can support games at 120Hz.

Unfortunately, this does not make much sense to most critics, since the Xbox One X does not have sufficient hardware to render and deliver games at 120 FPS.

“Games may still struggle to achieve consistent 120 fps frame rates, but they may produce frame rates that fall between 120 fps and the current 60 fps limit.” – Kevin Murnane

Basically, if you want to future proof yourself, get the 120Hz TV. Otherwise, you are good to go with 60Hz TV for console gaming.

Also, there is no doubt that the next gen console will deliver much higher than 60FPS.

3. PC Gamers Monitor Refresh Rate

With PC Gamers, refresh rate standards are all over the place.

While 60Hz monitors are still okay for basic gaming, you should certainly look for a monitor with a very high refresh rate if you play anything beyond the simplest games.

What refresh rates can you get as a gamer?

Well you have:

  • 75 Hz
  • 144 Hz
  • 165 Hz
  • 240 Hz

…and who knows what in between.

As a rule of thumb, however, for gamers, the higher the better.

Anything that can give them a slight competitive advantage is worth investing top dollar into.

Yet there is a STANDARD REFRESH RATE for most gaming monitors: 144Hz!

This is a great number since it is evenly divisible by 24 (24×6 = 144), the standard movie frame rate.

Here you must be wondering, well why not go for 120Hz like the TVs do? Why choose such an odd number as the standard gaming refresh rate?

Sure, 144Hz is marginally faster than 120Hz, but why go through all the trouble of making it the standard when 120Hz makes the most sense?

Some say it is just a marketing gimmick to set PC master race apart from the rest (of the peasants).

Others argue that it tends to hit more of the common divisible sweet spots, i.e. 24, 36, 48, 72. In comparison, 120Hz has divisible sweet spots at 24, 30, 40, 60.

The idea here is that if you have a powerful enough graphics card, it will be hitting frame rates closer to the 70s than the 60s. If you have a moderately powerful graphics card, it will hit frame rates closer to the 50s than the 40s and 30s. In that sense, 144Hz tends to cover more sweet spots.
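
The "sweet spots" of a refresh rate are just its divisors above some minimum useful frame rate. A quick Python sketch (my own illustration):

```python
def sweet_spots(refresh_hz: int, lo: int = 24):
    """Frame rates that divide the refresh rate evenly (no judder),
    ignoring trivially low values below `lo`."""
    return [d for d in range(lo, refresh_hz) if refresh_hz % d == 0]

print(sweet_spots(144))  # -> [24, 36, 48, 72]
print(sweet_spots(120))  # -> [24, 30, 40, 60]
```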

Yet there is no TRUE answer as to why 144Hz!

Well then What About 240 Hz?

competitive gaming

If you play here, then 240Hz makes sense

240Hz is a refresh rate that is rare, expensive, and reserved mostly for competitive gamers.

Competitive gamers are nothing short of superhuman with super fast reflexes.

We have already established that a higher refresh rate reduces MOTION BLUR.

240Hz is the best you can get and a competitive gamer wants the best possible. The lower the motion blur, the better would be your chances of spotting your enemy first.

Is 240Hz worth it for Casual Gamers?

For casual gamers, the difference between 1/144th of a second (144Hz refresh rate) and 1/240th of a second (240Hz refresh rate) is not discernible.

While 240Hz may not seem feasible if you’re not a competitive gamer, do not let me, or anyone else, make your gaming decisions!

What about Everything in Between?

Gaming is crazy, as you can tell by the number of different refresh rates you can have on monitors.

There are plenty more refresh rates, like 165Hz monitors! These are also reasonably common.

Essentially, when it comes to choosing a refresh rate higher than 144Hz, it boils down to your overall configuration and preference.

You may come across scenarios where you can:

  • Play at 165 FPS at 1440p, or
  • Play at 240 FPS at 1080p.

If you prefer playing at 1440p, you would go for the 165Hz monitor and vice versa.