Refresh rates, video pull-down, and all that shit
neodata686 All American 11577 Posts |
So I remember talking to Noen about this a while back. I need a little bit of clarification.
It is my understanding that movies, TV shows, etc are typically filmed at 24 frames per second. Some reality shows are filmed at 30 frames per second. They're going to film the Hobbit at 48 frames per second.
Now the way this translates to displays (TVs, monitors, projectors, etc) is the display either adjusts to the fps of the material or the processor uses some type of pull down.
So if I were watching a 24fps movie on a 60hz display I would get 3:2 pulldown (first frame is displayed 3 times, next frame displayed 2 times) to compensate for 60 not dividing evenly by 24. This causes jitter (judder), which is why you see it when watching a movie on a computer monitor.
Ideally you want a 1:1 ratio for pulldown. So if you're watching 24p content you want 24, 48, 72, 96, 120, etc. for 1:1, 2:2, 3:3, 4:4, 5:5 pulldown.
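To make that cadence arithmetic concrete, here is a small illustrative Python sketch (hypothetical code, not anything a TV or player actually runs) that works out how many refresh cycles each source frame gets held for:

def pulldown_pattern(content_fps, refresh_hz, frames=8):
    # How many refresh cycles each of the next `frames` source frames stays
    # on screen. An uneven pattern like [3, 2, 3, 2, ...] is the judder.
    pattern, shown = [], 0
    for frame in range(1, frames + 1):
        # frame N must stay up until N/content_fps seconds have been covered
        # by whole refresh cycles (integer ceiling division, no floats)
        target = (frame * refresh_hz + content_fps - 1) // content_fps
        pattern.append(target - shown)
        shown = target
    return pattern

print(pulldown_pattern(24, 60))   # [3, 2, 3, 2, 3, 2, 3, 2] -> 3:2 pulldown, jitter
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> 5:5, every frame held equally
print(pulldown_pattern(30, 60))   # [2, 2, 2, 2, 2, 2, 2, 2] -> 2:2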
The part I want to clarify is that all the TV manufacturers are advertising "48hz, 72hz, or 120hz compatible" LCDs. Why?
Assuming all that true-motion crap is off, what's the advantage of having 48hz (or 120hz) over 24hz for watching 24p content? An LCD doesn't "flash" like a CRT or movie theater projector, so there's no need for 2:2 or 3:3 pulldown. I understand that 120hz is the ideal rate because you can do 30x4 (4:4) or 24x5 (5:5), but if I'm watching 24p content on an LCD there's no advantage of 48, 72, or 120 over 24hz...correct?
I understand that movie theaters (film) or projectors (not LCD?) will often do 48hz (every frame displayed twice) or 72hz (every frame displayed 3 times) simply because it reduces flicker, but that's because the frames are actually disappearing and reappearing. With an LCD there's no point in anything higher than 1:1 pulldown.
Or am I totally wrong? Basically the only 2 types of content I have are 24 or 30. I have my TV set to 24 by default and for the tiny bit of 30fps material I have it will switch the TV to 60hz (2:2 because it doesn't support a 30hz refresh). Almost all the time I watch everything in 24hz 1:1. 9/10/2012 4:00:25 PM |
smoothcrim Universal Magnetic! 18966 Posts |
lots of marketing confusing all of it. watch tv and then a bluray on a 240hz tv, a 120hz tv, and then a 60hz tv (assuming led/lcd). I prefer watching at 60hz for tv, otherwise I see that ultra fast unnatural motion. blurays I'll watch at 240 sometimes but often will leave it at 60. 120hz is universally bad at all times to me. 9/10/2012 4:22:30 PM |
neodata686 All American 11577 Posts |
I think you're missing the point of my question.
Technically Blurays are usually 24p content. You want to watch it at 24hz on the TV. Watching it at 48, 72, or 120hz won't make a difference because the source content is still 24p. LCDs don't "flicker" like CRTs or older projectors so there's no need to watch Blurays at anything higher than 24hz.
I think what you're talking about is the "smooth motion" effect that a lot of TVs have on by default. So 120hz and 240hz+ TVs aren't actually getting source material any higher than 24, 25, or 30; they're just using interpolation to create fake frames to throw in there to fake a higher refresh rate.
120hz is the best format for everything because it uses 5:5 pulldown for Blurays (24p content) and 4:4 pulldown for 30p content (reality tv shows, sports, etc). If you're watching Blurays at 60hz then you're watching them with jitter (3:2 pulldown) and you're seeing one frame 3 times and the next 2 times.
-And just to add this in. HDTVs don't currently go higher than 60hz (input) unless it's 3D (in which case it's 120hz). Anything higher than 60hz is just the processor in the TV "faking" frames. You can on the other hand buy a true 120hz computer monitor and it receives a true 120hz signal (from a computer) via dual link DVI. The only need for it is gaming or 3D movies though.
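For what it's worth, here is a deliberately crude Python illustration of that "fake frames" idea (real TVs use motion-compensated interpolation, not a simple cross-fade; the function and values here are made up purely for illustration):

import numpy as np

def fake_intermediate_frames(frame_a, frame_b, extra=4):
    # Invent `extra` blended frames between two real 24p frames; 4 extras per
    # real frame is roughly how a "120hz" set fills the gaps with guesses.
    frames = []
    for i in range(1, extra + 1):
        t = i / (extra + 1)
        blend = (1 - t) * frame_a.astype(float) + t * frame_b.astype(float)
        frames.append(blend.astype(frame_a.dtype))
    return frames

a = np.zeros((1080, 1920, 3), dtype=np.uint8)      # real frame N (all black)
b = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # real frame N+1 (all white)
print(len(fake_intermediate_frames(a, b)))         # 4 invented in-between frames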
[Edited on September 10, 2012 at 4:30 PM. Reason : s] 9/10/2012 4:27:16 PM |
Igor All American 6672 Posts |
Although an LCD does not flash like a CRT (where the ray gun was scanning the screen to excite the phosphors one at a time), it still refreshes pixels at a certain rate. If the refresh rate is 60hz, then you run into the pulldown issues (some frames are visible for a slightly longer duration than others). At 120hz both 30fps (most non-movie programming) and 24fps (most movies) can be shown with each frame visible for exactly the same duration (4 or 5 refresh cycles each, respectively) without the TV having to adjust its refresh rate.
Now, why all 60hz LCD TVs were not also made 48hz capable, I do not know. Some 60hz TVs do slow down to 48hz when they detect 24p content. In that case, just like in a movie theater, each frame lasts 2 refresh cycles.
Rates higher than 120hz are useless for 2D programming unless you are into artificial smoothing, because the highest content rate commonly available is 60fps. For 3D content a higher refresh rate may reduce flicker. There are also smoothing algorithms that take 30 or 60 fps content and make interframes that smooth the motion and make sports games and such more lifelike. This smoothing makes movies look like complete shit though. First thing you do when you get a 120hz TV is turn the smoothing off, unless you watch primarily ESPN 9/10/2012 4:37:24 PM |
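The 120hz point is just least common multiples: 120 is the lowest refresh rate that both 24fps and 30fps divide into evenly. A quick illustrative check in Python (nothing more than arithmetic):

from math import lcm  # Python 3.9+

print(lcm(24, 30))  # 120
for hz in (48, 60, 72, 96, 120):
    # does this refresh rate hold every frame for a whole number of cycles?
    print(hz, "24p:", hz % 24 == 0, "30p:", hz % 30 == 0)
# 48 -> 24p only, 60 -> 30p only, 72 -> 24p only, 96 -> 24p only, 120 -> both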
neodata686 All American 11577 Posts |
Thanks and that's a very thorough explanation of everything but it still doesn't address my main question. I can't currently set my TV to 48, 72, 96, or 120 hz at 1080p. It will only do 1080p24.
http://www.highdefdigest.com/news/show/1015
Quote : | "In order to get this to work, the signal must be transmitted to a compatible television that can properly sync with the 24 Hz frame rate, or convert it to an even multiple such as 48 Hz, 72 Hz, 96 Hz, or 120 Hz. Most HDTVs will not accept a 1080p24 input signal at all, and even among those that will, some simply convert the signal back to 60 Hz by applying their own 3:2 pulldown and re-introducing the judder. In other words, even if you can get the 1080p24 output of the disc player to work, your TV may still not be able to benefit from any improvement it promises." |
I know the article is old but it points out that as long as the TV can do a multiple of 24hz then you're fine and you'll be watching stutter free 24p content.
My question still remains:
Is there an advantage to 48hz, 72hz, 96hz, or 120hz for displaying 24p content? LCDs do not flicker, flash or whatever. They just refresh. So unlike CRTs, 24p content on a 24hz refresh setting should look identical to content on a 48hz, 72hz, 96hz or 120hz refresh setting. Right?
The reason I'm asking is my computer sets static refresh rates and my Samsung HDTV recognizes them. I can't currently get it to do 48hz or 72hz (at 1920 by 1080). I can mainly do 24hz or 60hz (haven't really messed around with different refresh rates much). Right now 24p content displays at 24hz (1:1) and 30p content displays at 60hz (2:2). Everything looks perfect and I'm happy. I just wanted to understand why anyone would do 48, 72, 96, or 120hz if they could simply do 24hz (other than the obvious 120hz advantage of being able to do 4:4 or 5:5).
tl;dr - Is there a physical/visual difference between 1:1, 2:2, 3:3, 4:4, and 5:5 on an LCD display (not CRT or old school projector)
[Edited on September 10, 2012 at 4:54 PM. Reason : tl;dr]9/10/2012 4:49:43 PM |
Lionheart I'm Eggscellent 12775 Posts |
I think I'm getting what you're saying, but I'm not sure, so I'll try to provide some input.
LCD information is a little bit skewed and marketed poorly. LCDs typically suffer from ghosting and blurring issues because they are slow to refresh. LCD, as you probably know, stands for Liquid Crystal Display. Realigning the liquid crystal matrix, which controls how much of the backlight is allowed to pass through each pixel, is a slower process than lighting or adjusting a diode or plasma pixel. The refresh rate of the screen is therefore more or less fixed and lower than that of some other display technologies.
What LCD makers have ultimately done is change the signal processing to try to improve the perceived refresh: they create interpolated transition frames and manipulate how many of which frames are sent to the panel, so that areas of the picture that are changing rapidly look clearer. The 120hz figure reflects the frequency at which this happens. So 120hz signal processing will actually be very different than 60hz, because the image is sampled and replaced more frequently for this purpose. If your TV allows that adjustment you would actually be getting different frames altogether, not just different numbers of the frames from your signal.
This may be a clearer explanation since I'm not sure I did that all that well. http://gizmodo.com/290237/the-trouble-with-lcd-tvs-motion-blur-and-the-120hz-solution
Just go plasma, better color, no ghosting, deeper blacks, cheaper set. 9/10/2012 5:22:41 PM |
neodata686 All American 11577 Posts |
Quote : | "Just go plasma, better color, no ghosting, deeper blacks, cheaper set." |
I find that hard to believe now. Especially with LED lighting technology. That article is 5 years old. I think you can make the argument either way now.
Quote : | "The refresh rate of the screen is therefore more or less fixed and lower than that of some other display technologies." |
I'm not 100% sure what you mean. My computer LCD monitor refreshes at 120hz. It gets a 120hz signal from my computer. I can play a game at 120hz and it will spit out 120 frames per second. How is this anything other than 120hz?
My TV (also hooked up to a computer) only goes up to 60hz. HDTVs only take 60hz signals (unless it's 3d and hence 2 60hz signals for 120hz).
My issue is playing 24p content. Is there an advantage of setting my TV refresh rate to 24hz versus 48, 72, 96, or 120hz? I didn't think there was a visual difference because of how LCDs actually work.
i.e - displaying the same frame in 1/24th of a second IS THE SAME as displaying that exact same frame twice in the same time period (at 1/48th of a second or 48hz).
Quote : | "The 120HZ reflects the frequency at which this happens. So a 120HZ signal processing will actually be very different than 60HZ because the frequency with which the image is sampled for these purposes is increased and when its replaced. You would actually be getting different frames all together if your tv allows adjustment not just different numbers of the frames from your signal." |
I don't think that applies to what I'm talking about. I'm NOT talking about image processing. I'm talking about a direct signal from my computer to a display. I'm very aware of which refresh rate my computer is outputting and my HDTV is able to input. It's either 24 or 60hz. TVs can't accept a 120hz input. If they display at 120hz it's simply using 5:5 for 24p or 4:4 for 30p or some image processing technique, which is not in scope for this discussion. A computer on the other hand can display 120hz content on a 120hz computer monitor (where the computer is actually sending 120 images per second to the monitor).
I think we're talking about 2 different things.
-if you're talking strictly about ways to prevent ghosting, I don't think that's as much of an issue anymore, but it still isn't really part of my question.
[Edited on September 10, 2012 at 5:57 PM. Reason : s]9/10/2012 5:37:20 PM |
Prospero All American 11662 Posts |
Quote : | "Is there an advantage to 48hz, 72hz, 96hz, or 120hz for displaying 24p content. LCDs do not flicker, flash or whatever. They just refresh. So unlike CRTs 24p content on a 24hz refresh setting should look identical to content on a 48hz, 72hz, 96hz or 120hz refresh setting. Right?" |
Assuming no smooth motion feature is turned on, yes, it should look exactly the same.
Quote : | "tl;dr - Is there a physical/visual difference between 1:1, 2:2, 3:3, 4:4, and 5:5 on an LCD display (not CRT or old school projector)" |
Physical difference, yes the frames are still doubled/tripled. Visual difference, no, again, presuming smooth motion is turned off.
In the last year or so, some manufacturers have moved away from quoting a refresh rate toward a proprietary "clear motion rate" that's a combination of refresh rate multiplied by backlight timing (how fast the LEDs strobe). IMHO, this is a good indicator, as no matter how good the image processor is, the LEDs ultimately have to keep up to prevent motion blur and ghosting. As backlights catch up to 120Hz, I think you'll see more and more native 120Hz+ TVs showing up.
Basically without any image manipulation software, the LCD will not produce anything more than the source content. So if you have a 120Hz screen and you have 24p content, the output buffer will just keep the frame in there for 1/24th of a second and each frame is shown 5 times.
[Edited on September 10, 2012 at 6:09 PM. Reason : .]9/10/2012 5:59:08 PM |
neodata686 All American 11577 Posts |
Quote : | "Physical difference, yes the frames are still doubled/tripled. Visual difference, no, again, presuming smooth motion is turned off." |
So you're saying displaying a frame in 1/24th of a second is physically different than displaying the same frame twice in the same time period (twice each 1/48th of a second)?
How? If I have an accurate understanding of LCD technology and my display has a 2ms refresh rate then physically 24hz should be the same as 48hz if the content is 24p.
I mean how can you transition pixels from the same image to the same image? Isn't that the same as displaying the same image for twice the time?
Quote : | "that's a combination of refresh rate multiplied by backlight timing (how fast the LEDs strobe)" |
Ok this confused me. What's the difference between the refresh rate (the signal the TV is receiving) and the backlight timing? Let's use an example. Say you're watching a Bluray which sends a 24p signal to the TV, but the TV takes that signal and uses 5:5 pulldown for 120hz. Is the backlight timing different than if the TV were to display that same 24p signal at 24hz for 1:1?
[Edited on September 10, 2012 at 6:11 PM. Reason : s]9/10/2012 6:08:49 PM |
Prospero All American 11662 Posts |
That's why I said visually there's no difference.
You're confusing technologies. Backlight strobing (probably better defined as pixel response time) is different than image processing (refresh rate).
Pixel response time is native to the type/quality of the LEDs used, whereas image processing is part of the TV's cpu or whatever it's called. You could have 480Hz but if pixel response isn't up to par you'll see ghosting and image blurring. It's not directly related to your question, I just brought it up as it's a new way of determining how well LCDs respond.
I dunno, this is my understanding but maybe I'm wrong. It makes sense to me.
[Edited on September 10, 2012 at 6:19 PM. Reason : .] 9/10/2012 6:11:40 PM |
Prospero All American 11662 Posts |
Quote : | "So you're saying displaying a frame in 1/24th of a second is physically different then displaying the same frame twice in the same time period (twice each 1/48th of a second)?" |
By physically different I meant it's electronically/programmed different, yes. Again, visually no different.
Quote : | "How? If I have an accurate understanding of LCD technology and my display has a 2ms refresh rate then physically 24hz should be the same as 48hz if the content is 24p." |
2ms is not refresh, it's response time. It's how fast the pixel can change state and has no impact on refresh rate; that's determined by the image processor. So basically a 2ms response time should support up to 500Hz without ghosting, that's all it means. So for 24p content, your screen either shows 1:1 at 24Hz if it slows down, does 3:2 pulldown for 60Hz, or displays it as 5:5 at 120Hz, depending on your settings.
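The 500Hz figure is just the reciprocal of the response time - a quick sanity check (illustrative Python only):

response_time_ms = 2.0
print(1000.0 / response_time_ms)  # 500.0 -> fastest refresh a 2ms pixel could keep up with
print(1000.0 / 120)               # ~8.33ms per refresh at 120Hz, so 2ms has plenty of headroom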
Quote : | "Is there an advantage to 48hz, 72hz, 96hz, or 120hz for displaying 24p content." |
Simply put, no. Visually it will look the same unless there's smooth motion software running (which interpolates those intermediate frames). They are all different refresh rates, so yes, something physically is changing, but it's not visible on screen. 120Hz has obvious advantages as already stated when changing content between 24/30/60.
So there's no advantage other than the fact that you don't have to change the TV settings when switching from TV (30fps) to Blu-ray (24fps) to Gaming (60fps)
[Edited on September 10, 2012 at 6:45 PM. Reason : .]9/10/2012 6:24:57 PM |
neodata686 All American 11577 Posts |
Quote : | "2ms is not refresh, it's response time, it's how fast the pixel can change state, no impact on refresh rate, that's determined by image processor. so basically a 2ms response time should support up to 500Hz without ghosting, that's all it means. So for 24p content, your screen either show 1:1 at 24Hz if it slows down, or does 3:2 pulldown for 60Hz, or it displays it as 5:5 at 120Hz, depending on your settings." |
Yeah sorry response time. I usually relate response time to gaming. I guess because ghosting used to be a major factor as gamers started to make the transition from CRTs to LCDs. Now it's really not a factor anymore. Most half-way decent LCDs are below 10ms anyway.
When you say my screen can show 1:1 at 24hz "if it slows down" - why would it slow down? I set the refresh rate based upon my input signal, or the software I run sets the input rate. If my content is 30p it will automatically switch the display to 60hz (the HDTV doesn't appear to support 30hz), or if it's 24p it will switch it to 24hz (again, HDTVs don't support 48, 72, 96, or 120hz inputs - from what I've seen). They only support 24hz or 60hz. I have yet to see an HDTV panel that will take any signal other than 24p or 60p (at 1920 by 1080). I have been able to hit 75hz but it reduces the resolution to like 800 by 600.
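The switching logic being described is simple enough to sketch. This is hypothetical Python in the spirit of what a player like XBMC does, not its actual code:

def pick_refresh_rate(content_fps, supported_hz, default_hz=60):
    # Prefer the lowest supported rate that is an exact multiple of the
    # content frame rate (1:1, 2:2, ...); otherwise fall back to the default
    # and accept 3:2 pulldown.
    exact = [hz for hz in supported_hz if hz % content_fps == 0]
    return min(exact) if exact else default_hz

supported = [24, 60]                     # what this particular HDTV accepts at 1080p
print(pick_refresh_rate(24, supported))  # 24 -> 1:1
print(pick_refresh_rate(30, supported))  # 60 -> 2:2 (no 30hz mode)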
Quote : | "So there's no advantage other than the fact that you don't have to change the TV settings when switching from TV (30fps) to Blu-ray (24fps) to Gaming (60fps) " |
My software (XBMC) does this all for me based upon the content. I don't game on my TV. Gaming is done on my desktop, which has a true 120hz monitor hooked up to it. So when I game (depending on settings) I'll be v-sync'd at 120fps. 9/10/2012 7:07:26 PM |
Prospero All American 11662 Posts |
I just meant you slow it down by changing the settings. The native refresh is probably 60Hz. And yea, computers can change all of the output on the fly just like the TV. 9/10/2012 7:19:34 PM |
neodata686 All American 11577 Posts |
Yeah the only thing that is ever displayed on my HDTV is coming from a computer. I never thought I could notice 3:2 pulldown until I actually watched something at 24hz and noticed how much smoother it was. 9/10/2012 7:24:58 PM |
neodata686 All American 11577 Posts |
So I guess my point then is that most TVs more than a couple of years old can't accept a true 24p signal from a computer or Bluray player, so either the TV or the device (Bluray player) is using 3:2 pulldown to show a movie, giving the picture a very subtle stuttering effect.
I really never thought about it until I actually switched my TV to 24hz last year and everything was all of a sudden smooth. 9/10/2012 7:58:34 PM |
stopdropnrol All American 3908 Posts |
Quote : | " Just go plasma, better color, no ghosting, deeper blacks, cheaper set." |
agreed
Quote : | "I find that hard to believe now. Especially with LED lighting technology. That article is 5 years old. I think you can make the argument either way now. " |
led has closed the gap a bit, but you're still paying A LOT more than you would for a comparable plasma 9/10/2012 10:43:51 PM |
Igor All American 6672 Posts |
24p should not be "smooth". Are you sure the smoothing is not on? 9/10/2012 10:52:39 PM |
neodata686 All American 11577 Posts |
^Smooth as in I don't see the 3:2 pulldown stutter I see on most people's TVs that are playing 24p content on a 60hz HDTV. It's set at 24hz.
^^I still prefer LCD for various reasons. At the time I bought my LED Samsung it was exactly what I wanted. Very happy with it.
-and I'm pretty sure LED backlit LCD panels have surpassed plasma (yes, for a premium) but I'll take the lower power consumption and lighter weight over the cheaper cost of a plasma any day. Pretty sure the gold standard for black levels is now the Sharp Elite Pro LED LCD series, which surpasses any plasma.
[Edited on September 10, 2012 at 11:43 PM. Reason : Hm] 9/10/2012 11:26:59 PM |
Stein All American 19842 Posts |
Quote : | "lighter weight over the cheaper cost of a plasma any day." |
I've never understood this argument. How often do people move TVs?
Also, not all LED backlights are created equal. That one phrase could mean like 3 or 4 different types of TVs that drastically vary in quality. 9/11/2012 2:40:09 AM |
Igor All American 6672 Posts |
Looking at the previous posts, I think you are under the assumption that an LCD monitor does not refresh the screen if there are several identical consecutive frames in a row. It does. I am not sure about the exact physics behind why - maybe the backlight is actually pulsing, or maybe the LCD pixels are flashing - but it does flicker even when the image does not change from cycle to cycle. I see it all the time when filming at a <200fps shutter speed. If you have access to a digital camera, point it at your computer LCD while displaying a still picture and see what happens.
So the Hz and the Frames Per Second (fps) should not be lumped together for the sake of clarity, although technically Hz means just that, cycles per second. Refresh rate in Hz is the frequency of the monitor "flashing" the image, and frame rate in FPS is the quantity of individual signals that come out of your source every second. So you could have watched the TV at 24 fps, but most likely at 48Hz (or 96 or 120), as 24Hz would have induced a noticeable flicker. Watching a 24fps signal at 60Hz would have created a slight stutter due to pulldown. 9/11/2012 2:43:51 AM |
neodata686 All American 11577 Posts |
Quote : | "So the Hz and the Frames Per Second (fps) should not be lumped together for the sake of clarity, although technically Hz means just that, cycles per second. Refresh rate in Hz is the frequency of the monitor "flashing" the image, and frame rate in FPS is the quantity of individual signals that come out of your source every second. So you could have watched the TV at 24 fps, but most likely at 48Hz (or 96 or 120), as 24Hz would have induced a noticeable flicker." |
I got some opinions from people in the industry and this is incorrect. Yes, Hz and frames per second (fps) in my case SHOULD be lumped together, because my software is adjusting the refresh rate of the TV to match the frames per second of the content. 24p gets 24hz and 30p gets 60hz (it won't do 30hz).
When you say "I could have watched" that's incorrect. I know exactly what I'm watching my content at. I can read the refresh rate the HDTV is set to both on the OSD of the TV and the stats info on my software telling me what the source content is and what it's outputting to the display.
The camera effect you're talking about is when the shutter speed does not match up with the refresh rate of the display and you get the flashing effect (kind of like the wagon-wheel effect - I think). This is not relevant to refresh rate versus content frames per second.
Quote : | "as 24Hz would have induced a noticeable flicker." |
Again incorrect. 24hz does not produce flicker on LCDs. It DOES produce flicker on CRTs or older projectors because the image actually "flickers". This is why 2:2 (48hz) or 3:3 (72hz) pulldown is used with CRTs, projectors, or older movie theaters. If you remember the days of CRTs, when you set your refresh rate too low you could actually see the screen "flicker". LCDs do not do this. If you're displaying one image the pixel buffer (or whatever it's called) keeps one static image. It doesn't flicker in and out.
This is why there is no visual difference (to my understanding) between 24hz, 48hz, 72hz, 96hz, 120hz, or 24,000 hz on an LCD monitor displaying 24 frames per second content. As I've said already displaying 1 frame once in 1/24th of a second is the exact same as displaying the same frame 5 times over 1/24th of a second (5:5 or 120hz).
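A tiny check of that claim (illustrative Python only): at any exact multiple of 24hz, each source frame's total time on screen is identical, and only the number of identical refreshes inside that window changes.

content_fps = 24
for hz in (24, 48, 72, 96, 120):
    repeats = hz // content_fps              # refreshes per source frame
    hold_time = repeats / hz                 # total seconds the frame is on screen
    print(hz, repeats, round(hold_time, 6))  # hold_time is always ~0.041667s (1/24s)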
Another test I did was set my computer LCD monitor to multiples of 24 (24hz up to 120hz) and played a 24 frames per second movie. 24hz looked the same as 120hz to my eye.
http://en.wikipedia.org/wiki/Refresh_rate
Quote : | "Refresh rate or the temporal resolution of an LCD is the number of times per second in which the display draws the data it is being given. Since activated LCD pixels do not flash on/off between frames, LCD monitors exhibit no refresh-induced flicker, no matter how low the refresh rate. High-end LCD televisions now feature up to 600 Hz refresh rate, which requires advanced digital processing to insert additional interpolated frames between the real images to smooth the image motion. However, such high refresh rates may not be actually supported by pixel response times and the result can be visual artifacts that distort the image in unpleasant ways." |
Quote : | "On smaller CRT monitors (up to about 15"), few people notice any discomfort below 60–72 Hz. On larger CRT monitors (17" or larger), most people experience mild discomfort unless the refresh is set to 72 Hz or higher. A rate of 100 Hz is comfortable at almost any size. However, this does not apply to LCD monitors. The closest equivalent to a refresh rate on an LCD monitor is its frame rate, which is often locked at 60 frame/s. But this is rarely a problem, because the only part of an LCD monitor that could produce CRT-like flicker—its backlight—typically operates at around 200 Hz." |
Now I have no idea how fast LEDs strobe but I'm betting it's way higher than anyone can notice and not relevant here. Point still remains if you display a static image on an LCD the LCD does not refresh those pixels.
[Edited on September 11, 2012 at 9:22 AM. Reason : s]9/11/2012 9:15:13 AM |
Lionheart I'm Eggscellent 12775 Posts |
Quote : | " Pretty sure the gold standard for black levels are now the Sharp Elite pro LED LCD series which surpass any plasma." |
Anything with a backlight is going to have worse blacks than a plasma screen, where black pixels can just not be lit at all. It's possible for an LCD to have a greater contrast ratio over all colors, but I've yet to see anything with a backlight get even close to my 5 year old plasma. That, and it really bothers me if I feel like colors are looking desaturated because of the backlight.
Power consumption is a bitch and it puts out some heat, but moving the thing was never an issue for my 46 inch, which I managed to get upstairs and onto my TV console by myself.
But to each his own.
[Edited on September 11, 2012 at 10:30 AM. Reason : quotefail]9/11/2012 10:30:25 AM |
neodata686 All American 11577 Posts |
Quote : | "Anything with a backlight is going to have worse blacks that a plasma screen where black pixels can just not be lit at all." |
That's what I thought too but multiple sources say LED TVs are now better. Such as:
http://www.pcmag.com/article2/0,2817,2387377,00.asp
Quote : | "In our tests, we measure white and black levels by luminance using a Chroma Meter. A mediocre HDTV might produce black levels of 0.05 to 0.07 cd/m2, while an excellent HDTV will offer levels of 0.01 to 0.03 cd/m2. Historically, plasma HDTVs have produced the best black levels, specifically the discontinued Pioneer Kuro HDTV brand. The Kuro's screen got so satisfyingly dark that it remained a popular HDTV for enthusiasts long after Pioneer stopped making the sets. The domination of plasma in this field, however, is over. Our current Editors' Choice HDTV, the LED-based Sharp Elite Pro-60X5FD $4,695.00 at Amazon Marketplace, puts out 0.01 cd/m2, the best level we can measure. That any LED-backlit LCD can get that dark shows how far the technology has come." |
9/11/2012 10:34:37 AM |
Lionheart I'm Eggscellent 12775 Posts |
^I'd be curious to see the methods they use. The measurement is candela per square meter. This is a measure of luminance ("brightness" or energy output) over a region. But depending on the size of the region and type of images used for the testing this could be misleading.
In a region of half blacks and half full whites I expect a plasma will put out more total luminance than an LCD in pretty much every case over the region, but depending on the pixel arrangement that could have huge implications on how the blacks are perceived and judged. If the pixels are interspersed evenly, say
BWBWBWBW BWBWBWBW BWBWBWBW
then the black pixels will get washed out regardless, and whichever screen has the lowest overall luminance will have the best blacks.
But let's say we have a region like
BBBBBWWWW BBBBBWWWW BBBBBWWWW
Here the pixels along the boundary get washed out but the others show whatever the base luminance level for blacks is. In a plasma this should ultimately be 0 cd/m2 since the pixel is off. Realistically there would be reflection from ambient light so pure 0 is impossible but you get the idea. Even the best backlit system is probably going to let some light through. Therefore the region on the left would be blacker even if the luminance over the region was the same for both displays.
The key then is defining what the average case is between larger regions or more interspersed regions.
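Toy numbers make the point (completely made-up luminance values, illustrative Python only): two panels can average exactly the same over a 50/50 black/white region even though one has true blacks and the other leaks backlight into its blacks.

panel_a = {"black": 0.00, "white": 0.20}  # plasma-like: black pixels simply off
panel_b = {"black": 0.03, "white": 0.17}  # backlit-LCD-like: some leakage into blacks
for name, panel in (("A", panel_a), ("B", panel_b)):
    avg = 0.5 * panel["black"] + 0.5 * panel["white"]  # cd/m^2 averaged over the region
    print(name, avg)  # both print 0.1, yet their blacks would look very different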
And for all I know maybe they used a pure black screen for the test, but I don't know 9/11/2012 10:49:41 AM |
neodata686 All American 11577 Posts |
Ok I see your point. So I know that "full array" or truly back-lit LED LCDs (not side lit like most thin HDTVs) have local dimming and actually shut off certain portions of the screen for pure black.
Quote : | "If local dimming is implemented, this means that each LED or a specific group of LEDs can be turned on and off independently within certain areas of the screen, thus providing more control of the brightness and darkness for each those areas, depending on the source material being displayed." |
http://hometheater.about.com/od/televisions/qt/ledlcdtvfacts.htm
I assume this could technically lead to black levels on par with plasmas.9/11/2012 10:56:51 AM |
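A crude sketch of the local dimming idea from that quote (illustrative Python, hypothetical zone layout; real sets add temporal filtering, halo control, and so on): split the backlight into zones and drive each zone only as bright as the brightest pixel it has to show.

import numpy as np

def zone_backlight(frame, zones=(4, 8)):
    # frame: 2D array of pixel brightness 0..1; returns one backlight level per zone
    h, w = frame.shape
    zh, zw = h // zones[0], w // zones[1]
    levels = np.zeros(zones)
    for i in range(zones[0]):
        for j in range(zones[1]):
            levels[i, j] = frame[i * zh:(i + 1) * zh, j * zw:(j + 1) * zw].max()
    return levels

frame = np.zeros((1080, 1920))     # an all-black scene...
frame[500:580, 900:1000] = 1.0     # ...with one small bright object
print(zone_backlight(frame))       # only the zones under the object light up; the rest stay 0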
Lionheart I'm Eggscellent 12775 Posts |
^Yeah if they can set the filter to allow no light from the backlight then there's no reason a backlit screen couldn't get the same blacks as a plasma over a single image.
The problem comes in how good they are at adjusting on the fly to moving images and regions of black so they can shut off that region. Does the region have to fit a certain shape or size, for instance?
Let's say a panda is walking across the screen; this is reasonably slow and could be handled well. But then let's say a soccer ball is rolling off a pass and going fast; here the processing is going to have to be really good to get something that small and fast-moving to dim that region.
I just think it's harder to accomplish over moving images. Also, half the battle is a visual perception thing that varies slightly person to person depending on their own vision, so once things get to a certain point anyway it's a moot point.
I just like talking about this stuff (<<CS Grad with a focus on Visualization and Perception )
[Edited on September 11, 2012 at 11:19 AM. Reason : can't spell] 9/11/2012 11:03:15 AM |
neodata686 All American 11577 Posts |
Ah that's awesome! Yeah I'm not in the field but I've always been interested in stuff like this. I don't really watch sports and most of my content is 24p or 30p. I'm perfectly content with 24hz on an LCD for almost 95% of my viewing. Gaming is a different story. Ever since I got a 120hz monitor I can tell a difference up to about 90 frames per second. 90-120 is pretty indistinguishable for me. That jump from 60 to 90 is noticeable though in faster motion stuff. 9/11/2012 11:14:00 AM |