Graphics Cards and LCD Monitors


Alicat

Original Poster:

226 posts

231 months

Monday 4th February 2008
We have a 22" widescreen on one of our PCs at home and the picture quality is not as good as I think it should be.

I am running the screen at its native resolution (1680 x 1050). My question is: how much effect, if any, does the graphics card have on the quality of the picture?

The graphics card is a few-years-old Nvidia with 64MB of RAM.

Thanks

Alicat

Zad

12,710 posts

237 months

Monday 4th February 2008
That sounds like quite an elderly card. If you look at the connector, is it one of the older-style 15-pin high-density D connectors, or is it one of the newer digital connectors? My suspicion is that you are feeding the monitor with an analogue signal. If the monitor is capable of being driven with a digital signal (DVI or HDMI), then a new video card would undoubtedly improve the image.


Alicat

Original Poster:

226 posts

231 months

Tuesday 5th February 2008
Yes, it is an elderly card, and upgrading to something newer is not expensive. However, looking at the monitor, it does not have a DVI interface.

It would appear that I can use a card with a DVI-I output and use an adapter to connect it to the monitor.

Either way, will a better graphics card, say one with 256MB, provide a better image irrespective of the input to the monitor?

robbieduncan

1,981 posts

237 months

Tuesday 5th February 2008
Alicat said:
Either way, will a better graphics card, say one with 256MB, provide a better image irrespective of the input to the monitor?
Perhaps, but then again perhaps not. As the screen does not have a DVI connector, going from a DVI-I output to a D-sub connection does not gain you much: you are still sending an analogue signal to the screen. The real issue here is the D->A conversion on the graphics card and then the A->D conversion at the screen. The real advantage of running DVI (at both ends) is that it is a digital connection that removes these two quality-sapping conversions.

Note that the amount of graphics RAM on the card means nothing here.

It is possible that a newer/better card may be able to perform the D->A conversion better, resulting in a cleaner signal. In the past, Matrox cards were significantly better at this than the competition, giving cleaner images. Unfortunately they were not able to keep up with the 3D wars and are now not worth considering. It is also possible that a new/higher-quality VGA cable could improve the signal to the screen.
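
A quick way to see why those two conversions matter is to model the round trip roughly in a few lines of Python. This is only a toy model, and the cable-noise level is an arbitrary assumption, purely for illustration:

import numpy as np

rng = np.random.default_rng(0)

# Toy model of the VGA path: digital pixels -> DAC -> slightly noisy analogue
# cable -> ADC in the monitor. The noise level is an arbitrary assumption,
# purely to show that the round trip is not bit-exact the way DVI is.
pixels = rng.integers(0, 256, size=1680)               # one 8-bit scanline

analogue = pixels / 255.0                               # DAC output, normalised
analogue = analogue + rng.normal(0, 0.002, size=analogue.shape)  # cable noise (assumed)

recovered = np.clip(np.round(analogue * 255), 0, 255).astype(int)  # monitor's ADC

changed = np.count_nonzero(recovered != pixels)
print(f"{changed} of {pixels.size} pixels changed value after the D->A->D round trip")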

ginettag27

6,300 posts

270 months

Tuesday 5th February 2008
What's the make and model of the monitor?

Mr_Yogi

3,279 posts

256 months

Tuesday 5th February 2008
Was your monitor very cheap? I find it strange for a newish monitor (other than 17" 4:3/5:4 or 19" widescreens) not to have a DVI input.

Most video cards for the last 8 years or so have had decent 2D performance, so I would not think your card is to blame. The better 2D cards used to come into their own at very high bandwidths (high resolutions with high refresh rates, 1600x1200@100Hz and above); however, your LCD will be 60Hz, so even at 1680x1050 your card should be fine.

As has been said the problem will be the digital to analogue conversion at the video card and then the analogue to digital conversion at the monitor.

The video card may not be great, but I think it has a 400MHz RAMDAC, which should be fine.
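
For what it's worth, the numbers bear this out. A rough pixel-clock estimate (the ~30% blanking overhead below is an assumed ballpark rather than an exact figure for any particular mode) puts both the old high-refresh CRT modes and this panel comfortably inside a 400MHz RAMDAC:

# Rough pixel-clock estimate: width x height x refresh, plus blanking overhead.
# The 30% blanking figure is an assumption in the right ballpark for analogue
# timings, not an exact number for any specific mode.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.30):
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

print(f"1680x1050 @ 60Hz : ~{pixel_clock_mhz(1680, 1050, 60):.0f} MHz")
print(f"1600x1200 @ 100Hz: ~{pixel_clock_mhz(1600, 1200, 100):.0f} MHz")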

If your monitor does have a DVI port then changing to a newer video card (with DVI) should make a large difference.

IIRC, DVI comes in three main flavours: DVI-I, DVI-D and DVI-A.

DVI-A is pretty much just your analogue VGA output through a DVI connector.

DVI-D is digital only.

DVI-I contains both the analogue DVI-A and digital DVI-D bits together.

Most modern video cards are DVI-I so they can support older monitors with DVI-to-VGA adapters.

Alicat

Original Poster:

226 posts

231 months

Tuesday 5th February 2008
The monitor is an ACER AL2202W. Not that old.

I have some doubts about the screen.

Alicat.

twister

1,454 posts

237 months

Tuesday 5th February 2008
Does the monitor allow you to tweak the phase/clock/width/height/etc. settings, or are they hidden behind an "auto-adjust" option? If it gives you manual control, try adjusting the width/height/position so the desktop fills the screen, then tweak the clock and phase settings to sharpen/stabilise the image as much as possible (you might be able to use the auto-adjust values as a starting point if they're not totally off).

On the VGA-only panel I've got at work, if the clock/phase settings are off, the whole desktop looks a bit blurry and certain patterns of pixels start to shimmer/flicker quite noticeably. With all the settings tweaked properly though (the auto-set function gets them *almost* right), the panel looks almost as clear as the DVI panel I've got at home.
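
If you want something to tune against, the classic target is a pattern of alternating single-pixel vertical columns displayed at native resolution: it shimmers and shows moiré when the clock/phase are off and settles into a steady fine pattern when they're right. A minimal sketch of generating one (Pillow is just one convenient way to do it; any paint program would work):

from PIL import Image

# Alternating one-pixel black/white columns at the panel's native resolution.
# Shown full-screen over VGA, shimmer or moire means the clock/phase are off;
# a steady fine pattern (near-flat grey from a distance) means they're right.
WIDTH, HEIGHT = 1680, 1050

img = Image.new("L", (WIDTH, HEIGHT))
img.putdata([255 if x % 2 else 0 for y in range(HEIGHT) for x in range(WIDTH)])
img.save("clock_phase_test.png")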

scorp

8,783 posts

230 months

Tuesday 5th February 2008
If you're absolutely stuck with an analogue connection (VGA, etc.) then I would suggest lowering the refresh rate. With VGA, cable quality often makes a massive difference too.

Zad

12,710 posts

237 months

Tuesday 5th February 2008
scorp said:
If you're absolutely stuck with an analogue connection (VGA, etc.) then I would suggest lowering the refresh rate. With VGA, cable quality often makes a massive difference too.
You beat me to it; that idea sprang into my head this morning too. So, as they say: "what he said".


Mr_Yogi

3,279 posts

256 months

Tuesday 5th February 2008
But don't nearly all LCDs only refresh at 60Hz (with a few at 72Hz), so there shouldn't be anything lower available?