DVI or analogue output for TFT screens
Author
Discussion

TUS 373

Original Poster:

5,055 posts

305 months

Tuesday 12th October 2004
quotequote all
I'm seriously considering getting a TFT monitor to reclaim a huge portion of my desk from my old but trusty 17" CRT monitor.

The graphics card I have now is quite adequate, though 4 years old now and therefore has a standard analogue video output.

Can someone please advise whether it would be of any gain to go completely digital and get a card with DVI output and a TFT monitor with DVI input? Indeed, do monitors come with both these days, or do you have to choose one or the other? I don't have a particular TFT screen in mind at the moment, but believe that getting one with a fast response time, ~16 ms, is the way to go to get near-CRT performance (I'm not much of a gamer!).

Tx

RobDickinson

31,343 posts

278 months

Tuesday 12th October 2004
quotequote all
If you're not a gamer then TFTs are nice.

I think the cheaper screens are still analogue only, but you're better off with the digital connection if you can be bothered.

JamieBeeston

9,294 posts

289 months

Tuesday 12th October 2004
quotequote all
DVI will be far better quality than analogue.

IIRC the throughput on DVI is a few Gbit/second.

Initially, I think they had to artificially limit the throughput, as it was going to allow 'better than cinema' quality on home sets.

DVI all the way (until the next generation comes out)
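For reference, the "few Gbit/second" figure checks out: single-link DVI runs a 165 MHz pixel clock over three TMDS channels, each transmitting 10 bits per clock (8 data bits after TMDS encoding). A quick back-of-envelope check in Python, using published DVI 1.0 figures rather than anything from this thread:

```python
# Back-of-envelope check of single-link DVI bandwidth (DVI 1.0 figures).
pixel_clock_hz = 165_000_000   # max single-link TMDS pixel clock
channels = 3                   # one TMDS channel per colour component
bits_per_clock = 10            # 8 data bits encoded into 10 transmitted bits

raw_gbps = pixel_clock_hz * channels * bits_per_clock / 1e9
data_gbps = pixel_clock_hz * channels * 8 / 1e9  # payload after TMDS decode

print(f"raw link rate : {raw_gbps:.2f} Gbit/s")   # 4.95 Gbit/s
print(f"video payload : {data_gbps:.2f} Gbit/s")  # 3.96 Gbit/s

# Plenty for a big desktop mode, e.g. 1600x1200 @ 60 Hz, 24 bits per pixel:
needed_gbps = 1600 * 1200 * 60 * 24 / 1e9
print(f"1600x1200@60 needs ~{needed_gbps:.2f} Gbit/s")  # ~2.76 Gbit/s
```

So a single link comfortably drives any TFT you'd have bought in 2004.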

roop

6,018 posts

308 months

Tuesday 12th October 2004
quotequote all
Your regular video connection uses a DAC (Digital to Analogue Converter) on the video card to generate a signal suitable to drive an analogue display such as a CRT. When it's fed via the regular D-sub connector and cable to a TFT, the electronics in the screen use an ADC (Analogue to Digital Converter) to convert the signal back into a digital feed to drive the display. It then uses smart electronics to automatically adjust the display for the best image.

The problem is that you will always get errors introduced in the DAC and ADC, plus you have an analogue signal running along the cable that's wide open to EMI and such.

With a DVI connector it's a digital feed all the way. The video card can interrogate the screen to find out its specification, and then individually control every single pixel on the TFT, making for a perfect image.
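The degradation described above can be sketched numerically: model the analogue path as quantise, add a little cable/converter noise, re-quantise, versus the DVI path, which passes the 8-bit values through untouched. The noise level here is an arbitrary illustration, not a measured figure:

```python
import random

random.seed(42)

def vga_path(pixels, noise=2.0):
    """DAC -> noisy analogue cable -> ADC: each 0-255 value picks up error."""
    out = []
    for p in pixels:
        analogue = p + random.gauss(0, noise)          # EMI / converter error
        out.append(min(255, max(0, round(analogue))))  # ADC re-quantises
    return out

def dvi_path(pixels):
    """Digital end to end: the values arrive exactly as sent."""
    return list(pixels)

pixels = [random.randrange(256) for _ in range(10_000)]
vga_errors = sum(a != b for a, b in zip(pixels, vga_path(pixels)))
dvi_errors = sum(a != b for a, b in zip(pixels, dvi_path(pixels)))

print(f"VGA path: {vga_errors} of {len(pixels)} pixel values altered")
print(f"DVI path: {dvi_errors} of {len(pixels)} pixel values altered")  # 0
```

In practice the screen's auto-adjust hides most of this, but the digital path simply has nothing to hide.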

DVI all the way. I have DVI on my Dell 19" TFT coupled to an ATI Radeon Dogs Doodahs card and it's great.

Muncher

12,235 posts

273 months

Tuesday 12th October 2004
quotequote all
DVI all the way, much better picture quality.

TUS 373

Original Poster:

5,055 posts

305 months

Tuesday 12th October 2004
quotequote all
Thanks guys, exactly the information I need. Maybe it's time for a new video card too then (on an early ATI Radeon here!).

Xmas is only 10 weeks away, may as well ask Santa for something that is actually useful and wanted!

squirrelz

1,186 posts

295 months

Wednesday 13th October 2004
quotequote all
DVI is definitely better - you get a little bit of ghosting with the analogue connection.

With DVI it's:

Digital -> Digital

With the D-sub connector it's:

Digital -> Analogue -> Digital

Think of it as the difference between copying a CD with CD-copying software, and playing it from a CD player into your line-in and then recording that to CD.