60Hz vs 120Hz on TVs

Eh, I don't watch that much TV. I mean, I'm still using a CRT in the living room! :lol: So, is it going to make a difference, so much so that I should steer away from a 60Hz set, considering I'd be upgrading from a CRT TV?
 
You'll notice it, especially if you watch sports. It's hard to explain the phenomenon, but you'll pick up on it even with a 120Hz unit. I noticed it watching an old racing movie on a 120Hz set; it looked like live footage. My plasma has a 600Hz refresh, and I don't notice any of that.
 
Here ya go. This explains it pretty well in layman's terms.

http://www.pcmag.com/article2/0,2817,2379206,00.asp

TV Refresh Rate Explained: 60Hz vs. 120Hz, 240Hz, and Beyond

By Will Greenwald, February 3, 2011
If you're in the market for an HDTV, you've probably heard a lot about "speed." When ads and reviews talk about how fast an HDTV is, they're referring to the display's refresh rate, or how often it changes the picture. Television and movies don't show actual motion so much as dozens and hundreds of frames per second, like a reel of film or a huge flipbook. The faster the HDTV, the more frames it displays per second.
So the faster the refresh rate, the better the HDTV, right? More frames look smoother, right? In theory: yes. In marketing: very yes. In practice: not so much.
Pulldown and the Film-Video Dance
To answer those questions, you have to understand two important things about video. First, you can't add detail beyond what is already in the source footage. Second, the source footage is never greater than 60Hz. When you watch a movie on Blu-ray, it's a 1080p picture at 60 Hz. The disc displays 60 interlaced or 30 progressive frames at 1,920-by-1,080 resolution per second of video. For movies that were recorded on film, the original footage is actually 24 frames per second, upconverted to 30 frames through a process known as 2:3 pulldown. It distributes the source frames so they can be spread across 30 instead of 24 frames per second. Those frames are then interlaced (combined and shuffled) to 60 "frames" per second to match the 60Hz refresh rate of the vast majority of TVs you can buy today. In the case of 1080p60 televisions, the frames are pulled down to 60 full frames per second, and both the players and HDTVs outright skip any interlacing step.
This is a time-honored tradition, because American TVs have displayed 30 (actually, 29.97) frames per second and functioned at 60Hz since time immemorial. It's not really a problem, because between interlacing and frame pulldown, the process doesn't attempt to add information to the picture. It's simply converting it to function on the TV, because it wouldn't work otherwise. 1080p60 is the current high-end standard for HDTVs, and no commercial media exceeds that resolution or frame rate. In fact, many movies on Blu-ray even turn the frame rate down and display 1080p24, or 1,920-by-1,080 video at 24 frames per second, to make the footage look as close to film as possible. The various refresh rate-increasing technologies on HDTVs destroy that effect.
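
(An illustrative aside, not part of the PCMag article: here's a minimal Python sketch of the 2:3 pulldown cadence described above. The function name and frame labels are made up; it simply shows how 24 film frames per second get repeated to fill 60 refreshes per second.)

```python
# Minimal sketch of 2:3 pulldown (illustrative only): 24 film frames per second
# are repeated in an alternating 2-3 pattern so they fill 60 refreshes per second.

def pulldown_2_3(film_frames):
    """Expand a list of 24fps frames into a 60Hz display sequence."""
    displayed = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3  # alternate 2, 3, 2, 3, ...
        displayed.extend([frame] * repeats)
    return displayed

# Four film frames (1/6 second at 24fps) become ten refreshes (1/6 second at 60Hz):
print(pulldown_2_3(["A", "B", "C", "D"]))  # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

Over a full second, the alternating 2-3 pattern turns 24 source frames into exactly 60 displayed refreshes without inventing any new picture information.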
Going Too Far
Enhanced refresh rates like 120Hz, 240Hz, and various other speed-boosting features on modern HDTVs, on the other hand, push the concept too far. Remember what I said earlier about not being able to add detail beyond what's in the source footage? That's exactly what those higher refresh rates do. They interpolate data between each frame to produce additional frames. But the data in those combined frames can only be based on the source frames and whatever mathematical magic the HDTV is employing to figure out the middle ground. This technique can help reduce judder, or the jerkiness that manifests when displaying footage on a display that doesn't share its native frame rate (like, for example, a 24-frame-per-second film clip pulled down to 30 fps, then interlaced to 60Hz). Some plasma HDTVs can even reach a 600Hz refresh rate, which, when you consider that the source footage is going to be between 24 to 60 frames per second, is downright overkill.
Actually, this effect can produce a distinctly artificial, unnatural feel to video. Motion can appear too smooth, almost dreamlike compared with the films and television shows we've spent decades teaching our brains to enjoy. Action can seem just slightly sped up to the point of looking unreal, and it can take you out of the experience quicker than any judder. Indeed, to many videophiles judder is just as natural as film grain, and a subtle but necessary part of watching TV and movies.
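
(Another aside, not from the article: a toy sketch of where those interpolated frames come from. Real sets use motion estimation and compensation rather than this simple blend; the functions and pixel values here are purely illustrative.)

```python
# Toy illustration of frame interpolation: synthesizing an "in-between" frame
# from two real frames. Real TVs use motion estimation/compensation, not a
# simple blend; this only shows where the extra frames come from.

def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames (flat lists of pixel values) at position t between them."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    """Turn a 60fps sequence into 120fps by inserting one synthetic frame
    between each pair of source frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(interpolate(a, b))  # the invented frame the camera never captured
    out.append(frames[-1])
    return out

# Two tiny 1x3 "frames": a bright spot moves one pixel to the right.
frames = [[255, 0, 0], [0, 255, 0]]
print(double_frame_rate(frames))  # the middle frame is a guess: [127.5, 127.5, 0]
```

The synthesized middle frame is nothing the camera ever captured; it's a guess, which is why the result can look smoother yet slightly "off."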
Do You Need Super-Fast Refresh Rates?
When flat-panel HDTVs were in their infancy, they suffered from motion blur. LCDs in particular tended to display distinct blurriness during very fast on-screen movements because of "ghosting," or the afterimage left after the image on the screen has changed. LCD technology has progressed a great deal over the past several years, and now ghosting and motion blur have been all but eliminated. If you've purchased an LCD HDTV in the last two years, it probably won't show noticeable blur at its standard 60Hz refresh rate.
Tastes can vary, and you might enjoy the potential judder-reducing, motion-smoothing effects of an HDTV with 120Hz or 240Hz modes. But they don't add any actual detail to the video, and they certainly shouldn't be considered dealbreakers when you're shopping for an HDTV. Even if you get a set that supports 120Hz or 240Hz (or even 480Hz or 600Hz) video modes, you might want to disable them, and watch the video without any interpolation or judder-reducing effects.
For more HDTV shopping tips, read How to Buy an HDTV. And for a look at the top televisions we've tested, check out The 10 Best HDTVs.
 
Irrelevant to picture quality.
As mentioned above, it's about motion and time.
The thing is, virtually all LCD-based TVs nowadays have some kind of frame interpolation scheme as an option, dubbed 120, 240, or whatever. In fact, I'd wager that any TV that is "only" 60Hz and has no timing/motion-fix options of any kind is probably a lower-end, very basic, or older model. So as far as an "investment" in a TV goes, probably not the best.

Now, as far as 120 vs. 240 or whatever: meh, the difference is really small. Best you can do is just go look at the TVs and see what you think. Find one at Sears where you can get a remote, turn the "features" off and on, and see if you can tell a difference.

And BTW, the rate is basically irrelevant for a plasma because the way it operates is different.
 
Here ya go. This explains it pretty well in layman's terms.

http://www.pcmag.com/article2/0,2817,2379206,00.asp

That's not exactly true.

OTA TV is always 60Hz (or actually 59.94, but that doesn't really matter). Film sources (movies) are shot at 24fps, and film-source Blu-rays are often 1080p at 24fps. The magic of a 120Hz refresh rate is that it can display 60Hz or 24Hz video at native speed without having to do a 3:2 pulldown conversion. If it's TV source, it shows the same image twice; if it's film source, it shows the same image five times in a row (120 refreshes per second divided by five = 24fps).

When a 60Hz TV displays 24Hz source material, 24 doesn't divide evenly into 60, so the frames have to be repeated unevenly (or the set has to guess at what a "middle" frame should look like). That's the 3:2 pulldown conversion, and depending on how well the TV handles it, it can cause a wide variety of artifacts and defects in the displayed image.
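
(Illustrative sketch, not from the post above: assuming plain frame repetition with no interpolation, this shows how long each film frame stays on screen on each kind of panel. The function and the simple "hand out the leftover refreshes" scheme are assumptions for illustration.)

```python
# Why 24fps judders on a 60Hz panel but not on a 120Hz one (simple repetition,
# no interpolation). The alternating hand-out of leftover refreshes below is an
# illustrative assumption, not any specific TV's algorithm.

def frame_durations(panel_hz, film_fps=24):
    """Return how many refreshes each film frame occupies in one second."""
    base, extra = divmod(panel_hz, film_fps)
    durations = []
    for i in range(film_fps):
        bonus = 1 if (i % 2 == 1 and extra > 0) else 0  # every other frame gets one more
        durations.append(base + bonus)
        extra -= bonus
    return durations

print(frame_durations(120))  # [5, 5, 5, ...]      every frame shown equally long
print(frame_durations(60))   # [2, 3, 2, 3, ...]   uneven on-screen time = judder
```

On the 120Hz panel every film frame gets exactly 5 refreshes; on the 60Hz panel the 2-3-2-3 alternation means frames sit on screen for different lengths of time, which is the judder being described.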

Anything more than 120Hz is pure marketing bullshit. Some high-speed stuff is shot faster than 60fps (slow-mo), but nothing is transmitted at higher than 60Hz, and your cable box, Blu-ray player, or DVD player isn't capable of storing, playing, or generating higher frame rates than that.

Truth be told, if you're asking this question, you probably won't be able to tell the difference anyway.
 
Well yeah, that's what I was getting at. That was an article I had found a while back and knew where to go to get it, and it does a fairly good job of explaining things to someone who has no idea about refresh rates. The main point I was trying to make was not to buy into the super-high refresh rate hype; that it's just hype.

That said, my TV is 120Hz.
 
That, and it's apples and oranges with plasma vs. LCD when it comes to refresh rates, so don't let the 480/600Hz thing bother you.
 
Well yeah, that's what I was getting at. That was an article I had found a while back and knew where to go to get it, and it does a fairly good job of explaining things to someone who has no idea about refresh rates. The main point I was trying to make was not to buy into the super-high refresh rate hype; that it's just hype.

That said, my TV is 120Hz.

Right, and the guy does make some good points about some higher-refresh TVs interpolating their own intermediate frames... so rather than the TV showing frame one twice (for 60Hz source on a 120Hz panel), it tries to figure out an intermediate frame between frames one and two. Mine does that, but I leave it turned off. It seemed like he was saying that there was no real reason for a 120Hz native refresh rate, and that's where I disagree.
 
Again, that's a myth.

ESPN is only shooting at 60Hz. ABC? 60Hz. CBS? ... 60Hz.

My intent is not to argue but to provide useful information. This is coming straight from the mouth of a personal friend who is a former executive producer for ABC and CBS, shoots live TV for a living, and has been in the business for 30+ years. Not a guy that builds off-road vehicles for a living. After talking with him about it and having him show me in the studio, you can definitely tell a difference, although he did say 240 is pushing it for being worth the money. It's hard for me to believe it's a myth when you sit there and watch the difference.
 
You can definitely tell the difference between 60Hz and 120Hz. For lack of a better explanation, 120Hz looks more fluid and lifelike.
 
If I want to see lifelike, I turn off the television and see the real world. How come it is not "cool" to have TV look like TV?

...Soldiering on with my Sony WEGA 32" HD CRT...
 
my 32" Sony trinitron CRT TV is finally failing. cable connector is internally compromised, somehow... :rolleyes: now the picture flickers in and out every 5 seconds and has a buzz. It works great when I wiggle the coax connector on the back, but wires seem really loose on the back and I just don't have the time or energy to see if I can fix it. Maybe its an easy fix. Either way, this TV has given me years of trouble free service, so I cant complain too much! :D
 