32 inches and HD
Discussion
poprock said:
In layman’s terms, 1080p is HD, 720p is halfway there.
Not quite. 720p is still "Full HD". 1080p is "1080p Full HD". The average person will struggle to see the difference between the two unless you're sitting close to a large display. Furthermore, the average person will bring their new TV/display home, plug it in and start watching without doing any sort of calibration/testing, thereby negating the difference in picture resolution.
Rabbo said:
Not quite. 720p is still "Full HD". 1080p is "1080p Full HD". The average person will struggle to see the difference between the two unless you're sitting close to a large display. Furthermore, the average person will bring their new TV/display home, plug it in and start watching without doing any sort of calibration/testing, thereby negating the difference in picture resolution.
720p is HD ready, 1080p is Full HD.
andytk said:
Does anyone actually broadcast in 1080p yet?
I thought sky was 'only' 1080i or 720p.
I could well be wrong.
Not wrong, broadcast HD is only those two resolutions.
1080p has 2.25x the pixel data of 720p (and double that of 1080i), which makes it currently cost-prohibitive as a broadcast standard.
It's currently only a Blu-ray disc resolution standard, and primarily a marketing tool to persuade consumers to upgrade their TVs, DVD players and movie collections.
In the vast majority of domestic setups, you won't see any additional benefit over a 720p broadcast of the same film.
1080i is best left out of the equation, as by the time it's processed into a progressive signal the panel can make use of, it's only 810 lines of vertical resolution.
You'd struggle to see any appreciable difference over setting the box to 720p once the TV has upscaled either signal to fill the 1080 lines it has.
If your TV is not Full HD (1920x1080 native pixels), then it'll be 1366x768, in which case it'd be downscaling to make 810 lines fit 768.
720p means the TV doesn't need to deinterlace the signal (often a weak point, given the cost of the chips used), and the minor upscaling to 768 goes completely unnoticed. At typical viewing distances on Full HD panels, the softening of the image is again all but imperceptible.
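As a sanity check on those numbers, here's a minimal sketch of the raw pixel rates involved. These are uncompressed figures rather than broadcast bit rates, and the 0.75 interlace factor behind the "810 lines" claim is a commonly quoted assumption, not a spec:

```python
# Rough uncompressed pixel-rate comparison for the common HD broadcast
# formats (PAL-region frame/field rates). Real broadcasts are compressed,
# so treat the ratios as indicative of relative cost, not actual bandwidth.

formats = {
    # name: (width, lines per unit, units per second)
    "720p50":  (1280, 720, 50),   # 50 full frames per second
    "1080i50": (1920, 540, 50),   # 50 fields per second, 540 lines each
    "1080p50": (1920, 1080, 50),  # 50 full frames per second
}

baseline = 1280 * 720 * 50  # 720p50 pixel rate

for name, (width, lines, rate) in formats.items():
    pixels_per_second = width * lines * rate
    print(f"{name}: {pixels_per_second / 1e6:6.2f} Mpixels/s "
          f"({pixels_per_second / baseline:.2f}x 720p50)")

# The "810 lines" figure assumes a ~0.75 interlace (Kell-type) factor
# applied to 1080i's nominal 1080 lines after deinterlacing:
print(f"Assumed effective 1080i vertical resolution: {round(1080 * 0.75)} lines")
```

On those figures 1080p50 carries 2.25x the pixels of 720p50 and exactly double those of 1080i50, which is the cost gap being described.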
PJ S said:
Mark. said:
blindswelledrat said:
...you only start getting the benefit of 1080 below about 7 feet at 32 inches. Is that right?
[Chart: benefit of each resolution by screen size and viewing distance]
Note to OP - just buy the biggest your budget will allow. We got a 50" as it was in range of the chart, but in all honesty a 55 or even a 60 would have been so much better.
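For anyone wanting to sanity-check that chart, here's a minimal sketch of the acuity calculation such charts are usually built on. It assumes 20/20 vision resolves roughly one arcminute per pixel, and both function names are made up for illustration:

```python
import math

def screen_height_inches(diagonal_inches, aspect=(16, 9)):
    """Panel height from its diagonal, for a given aspect ratio."""
    w, h = aspect
    return diagonal_inches * h / math.hypot(w, h)

def full_benefit_distance_feet(diagonal_inches, lines):
    """Farthest distance at which a 20/20 eye (~1 arcminute per pixel)
    can still resolve every one of `lines` vertical pixels."""
    pixel_inches = screen_height_inches(diagonal_inches) / lines
    one_arcminute = math.radians(1 / 60)
    return pixel_inches / math.tan(one_arcminute) / 12

for lines in (720, 1080):
    d = full_benefit_distance_feet(32, lines)
    print(f'32" panel, {lines} lines: fully resolvable within ~{d:.1f} ft')
```

On those assumptions a 32" panel's 720 lines are fully resolvable inside about 6 feet and its 1080 lines inside about 4 feet, so the extra detail of 1080 only starts to matter somewhere in that range - the same ballpark as the 7-foot figure quoted above. Charts that mark where the benefit "starts" tend to draw the line more generously than this full-benefit distance.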
PJ S said:
1080i is best left out of the equation, as by the time it's processed into a progressive signal the panel can make use of, it's only 810 lines of vertical resolution.
I've seen this posted before on here (it may have been your good self), but I'm not sure if you're mixing up the number of 'lines' on a typical 2.40:1 film, as the black bars mean that only about 810 of the 1080 lines are used to show the picture? Either that, or are you talking about temporal resolution?
EDIT: Meant to add that my kitchen TV is 32" 1080p but it's only about 5 feet away when seated at the table, so it's possible to see that BBC 1 HD does look sharper than ordinary BBC 1 for example, which is of course a 1080i broadcast.
Mark. said:
The problem with this old chestnut is the scale. When the question relates to a 32" TV the detail is too vague. How valid to most people is a 110" screen viewed at 38 feet? A useful chart would cover screens up to 55" and viewing distances up to about 15 feet.
My purely subjective experience: I have a 32" 1080p Panasonic TV which I watch at 9 feet, and I can certainly see an improvement in the picture on the two BBC HD channels, which broadcast at 1080i, compared to the same programmes broadcast in SD.
OldSkoolRS said:
PJ S said:
1080i is best left out of the equation, as by the time it's processed into a progressive signal the panel can make use of, it's only 810 lines of vertical resolution.
I've seen this posted before on here (it may have been your good self), but I'm not sure if you're mixing up the number of 'lines' on a typical 2.40:1 film, as the black bars mean that only about 810 of the 1080 lines are used to show the picture? Either that, or are you talking about temporal resolution?
EDIT: Meant to add that my kitchen TV is 32" 1080p but it's only about 5 feet away when seated at the table, so it's possible to see that BBC 1 HD does look sharper than ordinary BBC 1 for example, which is of course a 1080i broadcast.
The data rate confirms this, since it's more than 720p, but not nearly as much as 1080p.
No confusion over the origin - it's the effective resolution once deinterlaced, and hence why it's a broadcast standard: it's cost-effective to transmit.
1080i originates from the US: NTSC signals are broadcast as 525 interlaced lines, of which roughly 480 carry picture - the rest is blanking used for timecode, captions and other technical overhead. 1080i was chosen as it's roughly double SD, making for easy line-doubling upscaling of older, non-film SD.
Our PAL standard of 625 interlaced lines carries 576 lines of picture, output as 576p when stripped and deinterlaced by your DVD player over its component output (and HDMI, if so equipped).
I'm sure that in one of the previous replies you're thinking of, I put a link to a source which explains why 1080i doesn't equal 1080p merely by deinterlacing it. There's more to it than simply swapping letters of the alphabet about.
The viewing distance of the kitchen TV sits in the zone in the diagram earlier in the thread, but the picture will be upscaled to fit the available pixels. If your TV is a genuine 1080p native panel, that'll be the case; otherwise it's downscaled to fit the 768 pixels of vertical resolution.
From the diagram you'll see 32" is at the pointy end, meaning there's a strong chance that if you sat 4-odd feet from it and switched the output of a Blu-ray player between i and p at 1080, you'd not notice any difference, the reason being eye acuity and viewing distance.
If you tried to sit nearer to see the extra information, you'd be close to the range where you can see the panel's pixel matrix.
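To put numbers on the scaling being described, here's a small sketch for a typical 1366x768 "HD Ready" panel. The 810-line figure is the assumed effective vertical resolution of deinterlaced 1080i from earlier in the thread, not a broadcast format in its own right:

```python
# Scaling a 1366x768 panel applies to each source's vertical resolution.
PANEL_LINES = 768

sources = [("720p", 720), ("1080i (effective)", 810), ("1080p", 1080)]

for label, lines in sources:
    factor = PANEL_LINES / lines
    direction = "upscale" if factor > 1 else "downscale"
    print(f"{label:>17}: x{factor:.3f} ({direction} of {abs(1 - factor) * 100:.1f}%)")
```

The 6.7% upscale from 720p and the 5.2% downscale from an effective 810 lines are both small enough to go unnoticed, which is the point being made about 720p output above.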
Sorry, I still disagree... if the 1080i source is video-based, which much TV content would be, then there are 50 'fields' which can simply be deinterlaced to effectively give you 1080/25p. That's almost the same as many Blu-rays, which are either 1080/24p or, in some video-sourced cases, 1080/50i or 1080/60i - and those again can simply be deinterlaced to an effective 1080/25p or 1080/30p. The main difference is that Blu-rays have a higher bit rate, which will make them look better. You still end up with 1080 'lines' on the screen that haven't been 'created': they are the same lines (or rather pixels) that were there in the original source.
In other words, on a 1080p display the temporal resolution of deinterlaced 1080/50i is 1080/25p, pretty much the same as a Blu-ray at 1080/24p.
Of course, with 'film'-based content at 1080/50i there can be some resolution loss, though from 1080/60i discs it is possible to use a lossless 'reverse telecine' process that will send the display 1080/24p. This isn't possible on UK HD broadcasts, as we use 1080/50i (so there's no 3:2 pulldown to reverse). So I agree that UK HD TV broadcasts of film sources will lose resolution in the deinterlacing, as effectively there is some 'guesswork' (similar to upscaling). Having said that, UK 'films' are often simply sped up to 25 frames per second (about 4% faster) so that they can be neatly interlaced to 50i for transmission, then deinterlaced back to 25 frames for display on the TV - therefore no resolution loss from the 1080/25p source.
It's a moot point comparing 1080p to 1080i as things stand, since there is no 1080/50p or 1080/60p content on disc, only computer games. The best we can currently have is 1080/24p, or video-based 1080/60i which equates to 1080/30p - which IMHO (the lower bit rate aside) is the same resolution, once deinterlaced onto a 1080p display, as a Blu-ray disc.
Edited by OldSkoolRS on Tuesday 20th December 08:44
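To put numbers on the speed-up and pulldown described above - a sketch only, assuming nothing beyond the frame and field rates themselves:

```python
from fractions import Fraction

# PAL speed-up: 24 fps film simply played at 25 fps.
speedup = Fraction(25, 24)
print(f"PAL speed-up: {float(speedup - 1) * 100:.1f}% faster")
print(f"A 120-minute film runs in {120 / float(speedup):.1f} minutes")

# 3:2 pulldown: 24 film frames spread across 60 interlaced fields per
# second (NTSC 1080/60i). Alternate frames are held for 2 then 3 fields.
fields_per_second = sum(2 if frame % 2 == 0 else 3 for frame in range(24))
print(f"Fields per second after 3:2 pulldown: {fields_per_second}")

# Reverse telecine discards the repeated fields, recovering the original
# 1080/24p frames losslessly - which is why it has no equivalent for
# 1080/50i broadcasts, where no fields are repeated.
```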