rockcity
everyday is a chance to get better
- Joined
- Apr 10, 2005
- Location
- Greenville, NC
What's the difference in performance and picture quality?
Rob
TV Refresh Rate Explained: 60Hz vs. 120Hz, 240Hz, and Beyond
If you're in the market for an HDTV, you've probably heard a lot about "speed." When ads and reviews talk about how fast an HDTV is, they're referring to the display's refresh rate, or how often it changes the picture. Television and movies don't show actual motion so much as dozens of still frames per second, like a reel of film or a huge flipbook. The faster the HDTV, the more frames it displays per second.
By Will Greenwald February 3, 2011
So the faster the refresh rate, the better the HDTV, right? More frames look smoother, right? In theory: yes. In marketing: very yes. In practice: not so much.
Pulldown and the Film-Video Dance
To answer those questions, you have to understand two important things about video. First, you can't add detail beyond what is already in the source footage. Second, the source footage never exceeds 60Hz. When you watch a movie on Blu-ray, you're seeing a 1080p picture at 60Hz: the disc delivers 60 interlaced fields or 30 progressive frames at 1,920-by-1,080 resolution per second of video. For movies that were recorded on film, the original footage is actually 24 frames per second, upconverted to 30 frames through a process known as 2:3 pulldown, which distributes the 24 source frames so they spread evenly across 30 frames per second. Those frames are then interlaced (split into alternating fields) to 60 fields per second to match the 60Hz refresh rate of the vast majority of TVs you can buy today. In the case of 1080p60 televisions, the frames are pulled down to 60 full frames per second, and both the players and HDTVs skip the interlacing step entirely.
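The 2:3 cadence the article describes can be sketched in a few lines of Python. This is an illustration only, with made-up frame labels; it just shows how alternately holding each film frame for two and three fields turns 24 frames into the 60 fields a 60Hz set displays.

```python
# Sketch of 2:3 pulldown: 24 film frames per second are spread across
# 60 fields by alternately holding each frame for 2 and 3 fields.
# Frame labels ("F0", "F1", ...) are invented for illustration.

def pulldown_2_3(frames):
    """Map a list of film frames to the field sequence a 60Hz set shows."""
    fields = []
    for i, frame in enumerate(frames):
        hold = 2 if i % 2 == 0 else 3   # the repeating 2-3 cadence
        fields.extend([frame] * hold)
    return fields

film = [f"F{i}" for i in range(24)]      # one second of 24fps film
fields = pulldown_2_3(film)
print(len(fields))                       # 12*2 + 12*3 = 60 fields
```

Note how no new picture information is created: every field is a copy of an existing film frame, just repeated on an uneven schedule.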
This is a time-honored tradition, because American TVs have displayed 30 (actually, 29.97) frames per second and functioned at 60Hz since time immemorial. It's not really a problem, because between interlacing and frame pulldown, the process doesn't attempt to add information to the picture. It's simply converting it to function on the TV, because it wouldn't work otherwise. 1080p60 is the current high-end standard for HDTVs, and no commercial media exceeds that resolution or frame rate. In fact, many movies on Blu-ray even turn the frame rate down and display 1080p24, or 1,920-by-1,080 video at 24 frames per second, to make the footage look as close to film as possible. The various refresh rate-increasing technologies on HDTVs destroy that effect.
Going Too Far
Enhanced refresh rates like 120Hz, 240Hz, and various other speed-boosting features on modern HDTVs, on the other hand, push the concept too far. Remember what I said earlier about not being able to add detail beyond what's in the source footage? That's exactly what those higher refresh rates do. They interpolate data between each frame to produce additional frames. But the data in those combined frames can only be based on the source frames and whatever mathematical magic the HDTV is employing to figure out the middle ground. This technique can help reduce judder, or the jerkiness that manifests when displaying footage on a display that doesn't share its native frame rate (like, for example, a 24-frame-per-second film clip pulled down to 30 fps, then interlaced to 60Hz). Some plasma HDTVs can even reach a 600Hz refresh rate, which, when you consider that the source footage is going to be between 24 to 60 frames per second, is downright overkill.
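The "mathematical magic" above amounts to synthesizing frames that were never in the source. Real TVs use motion estimation, but a deliberately naive stand-in makes the point: in this sketch (all names and values invented for illustration) each new frame is just a blend of its neighbors, i.e. invented data.

```python
# Naive frame interpolation: insert blended frames between source pairs.
# Real sets do motion-compensated interpolation; this simple average
# only illustrates that the in-between frames are manufactured, not
# recovered from the source. "Frames" here are flat lists of pixel values.

def interpolate(frames, factor=2):
    """Insert (factor - 1) blended frames between each source pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, factor):
            t = k / factor                         # blend weight
            out.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out

source = [[0, 0], [60, 100]]       # two toy "frames"
print(interpolate(source))         # middle frame is data the camera never captured
```

Doubling the frame count this way smooths motion, but everything between the original frames is a guess, which is exactly why the result can look artificial.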
This interpolation can give video a distinctly artificial, unnatural feel. Motion can appear too smooth, almost dreamlike compared with the films and television shows we've spent decades teaching our brains to enjoy. Action can seem slightly sped up to the point of looking unreal, and it can take you out of the experience faster than any judder would. Indeed, to many videophiles judder is just as natural as film grain, and a subtle but necessary part of watching TV and movies.
Do You Need Super-Fast Refresh Rates?
When flat-panel HDTVs were in their infancy, they suffered from motion blur. LCDs in particular tended to display distinct blurriness during very fast on-screen movements because of "ghosting," or the afterimage left after the image on the screen has changed. LCD technology has progressed a great deal over the past several years, and now ghosting and motion blur have been all but eliminated. If you've purchased an LCD HDTV in the last two years, it probably won't show noticeable blur at its standard 60Hz refresh rate.
Tastes can vary, and you might enjoy the potential judder-reducing, motion-smoothing effects of an HDTV with 120Hz or 240Hz modes. But they don't add any actual detail to the video, and they certainly shouldn't be considered dealbreakers when you're shopping for an HDTV. Even if you get a set that supports 120Hz or 240Hz (or even 480Hz or 600Hz) video modes, you might want to disable them, and watch the video without any interpolation or judder-reducing effects.
For more HDTV shopping tips, read How to Buy an HDTV. And for a look at the top televisions we've tested, check out The 10 Best HDTVs.
Here ya go. This explains it pretty well in layman's terms.
http://www.pcmag.com/article2/0,2817,2379206,00.asp
Quote: "Well yeah, that's what I was getting at. That was an article I had found a while back and knew where to go to get it, and it does a fairly good job of explaining it to someone that has no idea about refresh rates. The main point I was trying to make was not to buy into the super high refresh rate hype. That it's just hype."

That's not exactly true.
OTA TV is always 60Hz (or actually 59.94, but that doesn't really matter). Film sources (movies) are shot at 24fps, and film-source Blu-rays are often 1080p at 24fps. The magic of a 120Hz refresh rate is that it can display 60fps or 24fps video at its native cadence without having to do a 3:2 pulldown conversion. If it's TV source, it shows each image twice; if it's film source, it shows each image five times in a row (120 frames per second divided by five = 24fps).
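The arithmetic behind that post is just divisibility: a panel can show a source at native cadence only when the refresh rate is an even multiple of the frame rate. A quick sketch (function name my own, not from any spec):

```python
# A refresh rate shows a source "natively" only if it divides evenly.

def repeats(refresh_hz, source_fps):
    """How many times each source frame is shown, or None if uneven."""
    if refresh_hz % source_fps == 0:
        return refresh_hz // source_fps
    return None

for hz in (60, 120):
    for fps in (24, 30, 60):
        print(f"{hz}Hz panel, {fps}fps source -> {repeats(hz, fps)}")
# 60Hz can't show 24fps evenly (60/24 = 2.5), so it needs pulldown;
# 120Hz shows each 24fps frame 5 times and each 60fps frame 2 times.
```

That mismatch at 60Hz is the whole reason 3:2 pulldown exists, and the clean 5x and 2x multiples are why 120Hz can skip it.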
When a 60Hz TV tries to display 24fps source material, the frames don't divide evenly into the refresh rate, so it has to stretch them to fit. That's the 3:2 pulldown conversion: alternating frames are held for two and three refreshes, and depending on how well the TV handles that uneven cadence, it can cause a wide variety of artifacts and defects in the displayed image.
Anything more than 120Hz is pure marketing bullshit. Some high-speed footage is shot faster than 60fps (slow-mo), but nothing is transmitted at higher than 60Hz, and your cable box, Blu-ray player, or DVD player isn't capable of storing, playing, or generating higher frame rates than that.
Truth be told, if you're asking this question, you probably won't be able to tell the difference anyway.
Well yeah, that's what I was getting at. That was an article I had found a while back and knew where to go to get it, and it does a fairly good job of explaining it to someone that has no idea about refresh rates. The main point I was trying to make was not to buy into the super high refresh rate hype. That it's just hype.
That said, my TV is 120hz.
You want the 120 especially if you watch sports.
Again, that's a myth.
ESPN is only shooting at 60Hz. ABC? 60Hz. CBS? ... 60Hz.
You can definitely tell the difference between 60 and 120. For lack of a better explanation, 120 looks more fluid and lifelike.