Question about Hz

Slate99

Original Poster:

Sunday 4th October 2009
Been looking at tellies and I'm a little confused by the hertz.

I was under the impression that Hz in tellies refers to fps? Most 1080p TVs are either 50 or 100 Hz, and Sony now have the new Z-series which is 200 Hz.

This seems like a pointless exercise, as the eye sees movement at about 35 fps (which I think is 35 Hz). If that's the case, what's the point in 200 Hz when 50 Hz is enough? Surely there is no difference, as they are all over our eye's maximum?

Or am I completely wrong?

FlossyThePig

Sunday 4th October 2009
If you think the Sony 200 Hz is over the top, read the marketing twaddle put out by Panasonic to justify their 600 Hz models.

I may be wrong, but the coming 3D systems will need a higher refresh rate for the electronic goggles to switch between each eye without causing nausea.
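
Rough arithmetic on the goggles point, with assumed numbers rather than anything from Panasonic's spec sheets: active-shutter 3D shows left-eye and right-eye frames alternately, so each eye only sees half the panel's refresh rate. A minimal sketch:

```python
# Active-shutter 3D alternates left-eye and right-eye frames,
# so each eye effectively sees half the panel's refresh rate.
def per_eye_rate(panel_hz: float) -> float:
    return panel_hz / 2

for panel_hz in (50, 100, 200):
    print(f"{panel_hz} Hz panel -> {per_eye_rate(panel_hz):.0f} Hz per eye")
# 50 Hz panel  -> 25 Hz per eye (unwatchable flicker)
# 100 Hz panel -> 50 Hz per eye
# 200 Hz panel -> 100 Hz per eye
```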

Graham E

Monday 5th October 2009
The refresh rate in Hz was immensely important on CRTs, as the emission of light was directly linked to the refresh rate.

LCDs work very differently - the backlight is always on, and the LCD turns each pixel on or off by a clever electrostatic / light-rotating process through a polarised filter. The 100 / 200 / 1 bjillion Hz claims are IMO utter bks, as there is no benefit in refreshing a panel 200 times a second when there are only 24 frames per second on the DVD in the first place. With plasma there _may_ be some argument (plasma is at least an emission-based technology), but I strongly doubt it (if it was worth doing, then Pioneer / Panasonic would have done it long ago).
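
As an aside, a small sketch of why the panel rate and the source rate are separate things (my own illustration, not how any particular set does it): a fixed-rate panel simply repeats each source frame some number of times, and the repeat pattern is only even when the panel rate is a multiple of the source rate.

```python
# How many times a fixed-rate panel shows each source frame.
def cadence(source_fps: int, panel_hz: int, n_frames: int = 8) -> list[int]:
    repeats, shown = [], 0
    for frame in range(1, n_frames + 1):
        # Total panel refreshes that should have elapsed by the end of this frame.
        target = frame * panel_hz // source_fps
        repeats.append(target - shown)
        shown = target
    return repeats

print(cadence(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even cadence, no judder
print(cadence(24, 60))   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven 3:2 pulldown judder
```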

ymwoods

Monday 5th October 2009
With a CRT the Hz figure mattered because it is the number of times the whole screen is redrawn per second (Hz is, by definition, "per second"). A CRT works by firing a specific colour mix of three beams at one point on the screen (a pixel), then moving on to the next; the phosphor at each point keeps glowing for a few milliseconds after the beam has passed (turn a CRT off with the lights out and it still glows faintly for a while). At 1 Hz, each pixel would only be refreshed once per second. At normal rates, by the time the beam gets back to the first pixel your eye won't have noticed the fade - but photograph a CRT with a fast shutter and you can see exactly where the beam had got to at that instant.

Your own eyes also refresh themselves - take a new "photo", so to speak - somewhere around 40 to 60 times per second, I think, which is why people said there was no point getting a CRT monitor rated at any more than 60 Hz.

This 200 Hz rating I am assuming is the same measurement: each LCD pixel updates itself, or checks to see if it should change, 200 times per second. On an LCD a higher rating helps with ghosting, i.e. some pixels not updating when others have, so when a fast sequence such as a car chase comes on you can see a ghost of the last image (hopefully this makes sense).

They used to quote a "response time of 4 ms", which meant each pixel could change in 4 ms... I am guessing that since the rate is now so much higher they have started quoting Hz again.
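
Response time (how quickly one pixel can change) and refresh rate (how often the whole panel is redrawn) measure different things, though you can relate them. A rough, purely illustrative conversion:

```python
# A pixel that needs 4 ms to settle can't usefully track a panel that is
# redrawn more often than once every 4 ms, so response time puts an
# upper bound on the usable refresh rate.
def max_usable_hz(response_time_ms: float) -> float:
    return 1000.0 / response_time_ms

print(max_usable_hz(4))   # 250.0 -> a 4 ms pixel could in principle track 250 Hz
print(max_usable_hz(16))  # 62.5  -> a 16 ms pixel smears anything much above 60 Hz
```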

scorp

Monday 5th October 2009
Slate99 said:
Been looking at tellies and I'm a little confused by the hertz.

I was under the impression that Hz in tellies refers to fps? Most 1080p TVs are either 50 or 100 Hz, and Sony now have the new Z-series which is 200 Hz.

This seems like a pointless exercise, as the eye sees movement at about 35 fps (which I think is 35 Hz). If that's the case, what's the point in 200 Hz when 50 Hz is enough? Surely there is no difference, as they are all over our eye's maximum?

Or am I completely wrong?
You need quite a high refresh rate to reduce strobing effects. If you updated a CRT at 35 Hz the scanning would become quite noticeable, as the (phosphor?) pixels fall off in brightness quickly - it would end up looking like an old-fashioned radar display smile

Not sure what the deal is with LCDs scanning at 100/200 Hz, as there is no source footage around at that rate anyway..
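
A toy model of that radar-display effect (the 5 ms persistence figure below is invented, purely for illustration): if phosphor brightness decays exponentially after the beam passes, the image sags a long way between passes at 35 Hz but far less at 100 Hz.

```python
import math

PERSISTENCE = 0.005  # invented time constant: brightness falls to 1/e in 5 ms

def residual_brightness(refresh_hz: float) -> float:
    """Fraction of brightness left just before the beam returns."""
    return math.exp(-(1.0 / refresh_hz) / PERSISTENCE)

for hz in (35, 50, 100):
    print(f"{hz:>3} Hz: {residual_brightness(hz):.1%} of the brightness is left")
#  35 Hz:  0.3% -> visible strobing, the 'radar sweep' look
#  50 Hz:  1.8%
# 100 Hz: 13.5% -> much steadier
```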

scorp

Monday 5th October 2009
ymwoods said:
This 200 Hz rating I am assuming is the same measurement: each LCD pixel updates itself, or checks to see if it should change, 200 times per second. On an LCD a higher rating helps with ghosting, i.e. some pixels not updating when others have, so when a fast sequence such as a car chase comes on you can see a ghost of the last image (hopefully this makes sense).
Don't think that makes a difference, as on->off transitions of each pixel will still happen at the rate of the source video, e.g. 50/60 Hz. I could be wrong though..

ETA: Apparently new sets come with motion compensation and create synthetic (interpolated) frames in between the source frames, creating the illusion of smoothness.

Edited by scorp on Monday 5th October 07:07
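
A crude illustration of what those synthetic frames are - a naive linear blend between two source frames. Real motion-compensated sets estimate motion vectors and shift objects rather than cross-fading, so treat this strictly as a sketch of the idea:

```python
import numpy as np

def interpolate_frames(a: np.ndarray, b: np.ndarray, n_between: int) -> list[np.ndarray]:
    """Create n_between synthetic frames between source frames a and b by
    linear cross-fading. Motion-compensated sets move objects along their
    estimated motion instead of fading, but the principle is the same:
    invented in-between frames raise the displayed rate above the source rate."""
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)
        frames.append(((1 - t) * a + t * b).astype(a.dtype))
    return frames

# 50 fps source -> 100 Hz display: one synthetic frame per source pair.
frame_a = np.zeros((1080, 1920), dtype=np.float32)
frame_b = np.ones((1080, 1920), dtype=np.float32)
mid = interpolate_frames(frame_a, frame_b, n_between=1)
print(len(mid), float(mid[0].mean()))  # 1 0.5
```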

Slate99

Original Poster:

Monday 5th October 2009
Thanks for the interesting replies - it seems that this is left over from the CRT days. The thing is that manufacturers seem to be making a big deal (read: big price increase) of a telly with 100 or 200 Hz.

As said, if the footage is at 50 or 60 Hz the TV creates artificial frames to create smoothness. The thing is, I don't notice that 50 or 60 Hz isn't smooth. Can we really notice the smoothness on a day-to-day basis? I take the point about a car chase or perhaps a game of tennis, but even then I don't think "damn this 60 Hz telly, it's messing up my viewing".

Linking it back to TVs, a 100 Hz 1080p is just fine then!

RizzoTheRat

Monday 5th October 2009
Slate99 said:
Linking it back to TVs, a 100 Hz 1080p is just fine then!
Unless it's very big or you're sat very close to it, you're not going to see a difference between 720p and 1080p either. http://s3.carltonbale.com/resolution_chart.html Worth it if you ever plan to use it as a computer monitor, though.


Slate99

Original Poster:

Monday 5th October 2009
RizzoTheRat said:
Slate99 said:
Linking it back to TVs, a 100 Hz 1080p is just fine then!
Unless it's very big or you're sat very close to it, you're not going to see a difference between 720p and 1080p either.
Ooh really - I thought that's where the main difference was. I won't be using it as a monitor; it's for Sky HD and gaming. I was under the impression that you could notice the difference between 720 and 1080 - I'm looking to get a 37" or 42", sitting about 8 feet away.

ETA - Thanks for that chart - it's really helpful! thumbup

Edited by Slate99 on Monday 5th October 09:10
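
The arithmetic behind charts like that one is just visual acuity: a 20/20 eye resolves roughly one arcminute, so for a given screen size there is a distance beyond which individual pixels can no longer be distinguished. A rough sketch using that rule of thumb (the one-arcminute figure is the usual approximation, not exact science):

```python
import math

ARCMINUTE = math.radians(1 / 60)  # ~0.000291 rad, the 20/20 acuity rule of thumb

def max_benefit_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Farthest distance (feet) at which single pixels are still resolvable
    on a 16:9 screen of the given diagonal."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pixel_in = width_in / horizontal_px
    return pixel_in / math.tan(ARCMINUTE) / 12

for size in (37, 42):
    print(f'{size}": 1080p pays off inside ~{max_benefit_distance_ft(size, 1920):.1f} ft, '
          f'720p inside ~{max_benefit_distance_ft(size, 1280):.1f} ft')
# 37": 1080p ~4.8 ft, 720p ~7.2 ft
# 42": 1080p ~5.5 ft, 720p ~8.2 ft
# So at 8 feet even a 42" set is at the edge of the 720p zone - consistent with the chart.
```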

Slate99

Original Poster:

Monday 5th October 2009
From that chart it appears that 720p is fine - I fall right into the "Benefits of 720p" area - that will save me a few quid if that graph is anything to go by.

RizzoTheRat

Monday 5th October 2009
I've got a 37" Samsung. I've not tried it with a 1080p source, but at about 9-10 feet I can only just notice the difference between an HD channel (720p, I believe) and a decent-quality standard channel; the difference is quite noticeable at about 8 feet, though, which suggests that chart isn't far off.

Apparently contrast ratio is really more important than resolution.

PJ S

Tuesday 6th October 2009
Slate99 said:
From that chart it appears that 720p is fine - I fall right into the "Benefits of 720p" area - that will save me a few quid if that graph is anything to go by.
Indeed, but if you were to opt for the Panasonic plasma, then whilst you'll not be able to fully resolve/appreciate a 1080p Blu-ray movie, the increase in static contrast ratio of the G10 over the X10 model, plus the built-in FreeSat decoder and a few other features, makes the Full HD G10 a viable proposition - you don't necessarily have to limit yourself to the X10.

If you can find a decent local reseller, go and take a look at them with a couple of DVDs you know intimately, and see what you think then. Otherwise, I'd suggest the extra ££ spent on the G10 or even V10, looked at over the 3-5 years before you'll even contemplate changing it, works out at a paltry sum per month.

RizzoTheRat

Tuesday 6th October 2009
Exactly why I ended up buying a 1080 despite knowing I wouldn't see a difference from a 720 in terms of pixels. Going up to the 1080 models also got me a better contrast ratio, an extra HDMI port, and a VGA input.