HDMI Cables


jmorgan

36,010 posts

284 months

Tuesday 6th December 2016
E planes and H planes...... my head hurts....

TheExcession

11,669 posts

250 months

Tuesday 6th December 2016
jmorgan said:
E planes and H planes...... my head hurts....
hehe Especially as most people don't realise that over a 'wire' there is often a two-way conversation going on, and then we start to spin our heads over CDMA.

It's delicious: one minute we're stating that it's only 1s and 0s travelling down a wire, the next we've trampled all over that and discovered that 'coding' the data can make a massive difference too! biggrin

Can't argue with 'every day is a school day'. I'm not expecting many readers to grasp the maths, but an appreciation of 'oh my god, it isn't quite as simple as I thought' would be welcome.

Back on topic - I will never be an advocate of spending massive money on AV cables, but they do play a part.


Elderly

Original Poster:

3,493 posts

238 months

Tuesday 6th December 2016
I've borrowed a recently purchased high speed HDMI cable (presume 1.4)
and I'll try that tomorrow.

It appears that my onboard graphics card supports 2560 x 1600 @ 60Hz,
so I could remove my Nvidia GT730 from the system, use the onboard and see if that makes any difference;

I can't recall what type of outputs the onboard card has and the back of my PC is a real pain to get at rolleyes.

anonymous-user

54 months

Tuesday 6th December 2016
I use pound shop 1m HDMI cables on my Sky+ HD box and Xbox One. Both work well with no issues at all at 1080p.

As for the OP's problem, I have no idea. All I came to say was pound shop HDMI!

KamSandhu44

272 posts

168 months

Wednesday 7th December 2016
Elderly said:
I've borrowed a recently purchased high speed HDMI cable (presume 1.4)
and I'll try that tomorrow.

It appears that my onboard graphics card supports 2560 x 1600 @ 60Hz,
so I could remove my Nvidia GT730 from the system, use the onboard and see if that makes any difference;

I can't recall what type of outputs the onboard card has and the back of my PC is a real pain to get at rolleyes.
It won't make any difference if the monitor cannot support the resolution through the HDMI port.

Elderly

Original Poster:

3,493 posts

238 months

Wednesday 7th December 2016
Thanks gents; so it looks like the original info I got from the people who built the PC,
that I could achieve the monitor's max res over HDMI, is WRONG furious.
I wouldn't have chosen that monitor had I known.

I'll try a DVI to HDMI cable but it doesn't look encouraging (see below):

http://thumbsnap.com/bf6eiPvJ

ETA. My graphics card has three display connectors:

DVI-D: which will give my monitor's max res.
VGA: which I know will only give 2048 x 1536.
HDMI: why shouldn't HDMI give anything over 2048 x 1536?
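FWIW, the usual reason an HDMI output tops out around the 2048-wide modes is that the port, or the driver, is only signalling at single-link DVI rates, i.e. a 165MHz pixel clock, while HDMI 1.3/1.4 allows up to 340MHz. A rough back-of-envelope sketch - the raster totals below are assumed CVT reduced-blanking figures, not anything measured from this kit:

```python
# Back-of-envelope: pixel clock needed for a few modes vs. the two
# common ceilings. Total raster sizes (active + blanking) are assumed
# CVT reduced-blanking values, for illustration only.
SINGLE_LINK_MHZ = 165.0    # single-link DVI, or HDMI driven at DVI rates
HDMI_1_3_PLUS_MHZ = 340.0  # HDMI 1.3/1.4 maximum TMDS clock

modes = {
    # name: (horizontal total, vertical total) including blanking
    "2048 x 1152 @ 60Hz": (2208, 1185),
    "2560 x 1440 @ 60Hz": (2720, 1481),
    "2560 x 1600 @ 60Hz": (2720, 1646),
}

for name, (h_total, v_total) in modes.items():
    clock_mhz = h_total * v_total * 60 / 1e6
    print(f"{name}: ~{clock_mhz:.0f} MHz "
          f"| fits 165 MHz: {clock_mhz <= SINGLE_LINK_MHZ} "
          f"| fits 340 MHz: {clock_mhz <= HDMI_1_3_PLUS_MHZ}")
```

On those numbers, 2048 x 1152 just squeezes under the 165MHz limit while 2560 x 1440/1600 need the higher HDMI clock, which is why the card's output mode and driver settings matter more than the cable here.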

Edited by Elderly on Wednesday 7th December 08:51


Edited by Elderly on Wednesday 7th December 08:53

jmorgan

36,010 posts

284 months

Wednesday 7th December 2016
TheExcession said:
Back on topic - I will never be an advocate of spending massive money on AV cables, but they do play a part.
Aye, without an analyser on the end of mine I have no idea how close to the edge they are, but if I see no issues then my high speed £4 ones will do......

scorp

8,783 posts

229 months

Wednesday 7th December 2016
jmorgan said:
Aye, without an analyser on the end of mine I have no idea how close to the edge they are, but if I see no issues then my high speed £4 ones will do......
Not all HDMI outputs are the same either; in some cases a cheap product will struggle with a cheap cable, as the combination of the two will push it over the edge.

Elderly

Original Poster:

3,493 posts

238 months

Wednesday 7th December 2016
Say I get one of these: https://www.amazon.co.uk/Cable-Matters-Plated-Fema...

It claims to support 2560 x 1600 .... (which is the max res of my monitor)

.... but I think what you are saying is that the limiting factor would still be the HDMI input to my monitor, despite then using the DVI-D output of the graphics card at the other end of the adapter?

TonyRPH

12,972 posts

168 months

Wednesday 7th December 2016
OP, watch out for directional adaptors.

Some will only convert DisplayPort out to DVI in.

This device will convert the HDMI output of your graphics card to DisplayPort in (on your monitor), but it's not cheap.

EDIT:

The DisplayPort spec allows the source device to generate a DVI-type signal, so you can use an adapter or DisplayPort to DVI cable with a monitor that has a DVI input.

But not the other way around (you can't use a video card's DVI output with a monitor's DisplayPort input).


OP - can you get your monitor changed?

Or get a graphics card with a display port output?

EDIT: OP, this thread on Reddit might help.

Reddit said:
I have the MSI GTX560Ti and just bought the Dell U2515H as well.
Exact same problem of having max resolution of 2048x1152.

Played around with the Nvidia Control Panel and finally managed to get 2560x1440.

The solution: after you select a custom resolution of 2560x1440,
DO NOT leave the 'Timing' Standard set to Automatic. I selected 'CVT' and the full potential of the U2515H is unveiled.
Edited by TonyRPH on Wednesday 7th December 11:36
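For anyone entering the timings by hand rather than letting the driver pick them, these are roughly the standard CVT reduced-blanking figures for 2560x1440 @ 60Hz - a small illustrative sketch using the published CVT-RB numbers, not something read back from the Nvidia dialog in this thread:

```python
# Rough CVT reduced-blanking (CVT-RB) timings for 2560x1440 @ 60Hz,
# hard-coded from the published CVT-RB figures (the same numbers the
# Linux `cvt -r 2560 1440 60` tool prints). Illustrative only.
h_active, h_front_porch, h_sync, h_back_porch = 2560, 48, 32, 80   # pixels
v_active, v_front_porch, v_sync, v_back_porch = 1440, 3, 5, 33     # lines
refresh_hz = 60.0

h_total = h_active + h_front_porch + h_sync + h_back_porch   # 2720
v_total = v_active + v_front_porch + v_sync + v_back_porch   # 1481
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6        # ~241.7

print(f"Total raster {h_total} x {v_total}, "
      f"pixel clock ~{pixel_clock_mhz:.1f} MHz "
      f"(well inside HDMI 1.3/1.4's 340 MHz ceiling)")
```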

Foliage

3,861 posts

122 months

Wednesday 7th December 2016
It's your monitor, it's crap; ignore all the stuff about cables. UltraSharp is marketing speak for oddly sized panels that are completely different to everyone else's, and you'll need to fk about with your graphics card settings to get it to work right.

You're using a GT730 - so download and install the latest Nvidia software; you need to set up a custom profile for the resolution, and you may need to use 50Hz as well because it's such a low-power card.

Note I'm running 4K from a GTX770 over a Poundland HDMI cable with no issues whatsoever; my machine isn't capable of 4K gaming but 4K video is spectacular.


Elderly

Original Poster:

3,493 posts

238 months

Wednesday 7th December 2016
TonyRPH said:
EDIT: OP, this thread on Reddit might help.

Reddit said:
I have the MSI GTX560Ti and just bought the Dell U2515H as well.
Exact same problem of having max resolution of 2048x1152.

Played around with the Nvidia Control Panel and finally managed to get 2560x1440.

The solution: after you select a custom resolution of 2560x1440,
DO NOT leave the 'Timing' Standard set to Automatic. I selected 'CVT' and the full potential of the U2515H is unveiled.
That seems hopeful thumbup

My PC manufacturer (Palicomp) has had its support phone lines down for at least two weeks and all technical queries are being handled by a slow exchange of emails, which is far from satisfactory furious.
They are looking into it rolleyes.

ZesPak

24,428 posts

196 months

Wednesday 7th December 2016
Foliage said:
It's your monitor, it's crap; ignore all the stuff about cables. UltraSharp is marketing speak for oddly sized panels that are completely different to everyone else's, and you'll need to fk about with your graphics card settings to get it to work right.
Ultrasharp are basically 16:10, hardly exotic really?

Foliage

3,861 posts

122 months

Wednesday 7th December 2016
ZesPak said:
Foliage said:
It's your monitor, it's crap; ignore all the stuff about cables. UltraSharp is marketing speak for oddly sized panels that are completely different to everyone else's, and you'll need to fk about with your graphics card settings to get it to work right.
Ultrasharp are basically 16:10, hardly exotic really?
Is it a standard resolution in Nvidia or AMD's software? Is it used by more than one manufacturer? Is it an industry standard specified by an organisation or association? The answer is no to those questions, which makes it exotic in my eyes. And to be honest there is probably some 'profit' reason for it, not that it's better for the consumer.



ZesPak

24,428 posts

196 months

Wednesday 7th December 2016
Foliage said:
Is it a standard resolution in Nvidia or AMD's software? Is it used by more than one manufacturer? Is it an industry standard specified by an organisation or association? The answer is no to those questions, which makes it exotic in my eyes. And to be honest there is probably some 'profit' reason for it, not that it's better for the consumer.
The answer is yes to all.
16:9 comes from TVs. A 16:10 monitor will have more pixels and surface area for any given diagonal, making it more expensive and a harder sell.

Foliage

3,861 posts

122 months

Wednesday 7th December 2016
anonymous said:
[redacted]
True, it runs at WQHD and will run on a standard card after some messing around, but why do Dell produce this resolution when no one else seems to bother? I suppose because it's halfway between 1080p and 4K.

I hope the OP gets it sorted.

Foliage

3,861 posts

122 months

Wednesday 7th December 2016
OK, I'm wrong.

BUT :P it's not the cable that's the problem; the OP needs to look at his graphics card settings.

Agreed, mountain out of a molehill, sorry.

Elderly

Original Poster:

3,493 posts

238 months

Wednesday 7th December 2016
TonyRPH said:
Reddit said:
I have the MSI GTX560Ti and just bought the Dell U2515H as well.
Exact same problem of having max resolution of 2048x1152.

Played around with the Nvidia Control Panel and finally managed to get 2560x1440.

The solution: after you select a custom resolution of 2560x1440,
DO NOT leave the 'Timing' Standard set to Automatic. I selected 'CVT' and the full potential of the U2515H is unveiled.
That has solved it, I'm now getting 2560 x 1440 over HDMI beer
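If anyone wants to double-check what the desktop is actually running at after a change like this, here's a quick, purely illustrative Python check on Windows (standard Win32 calls via ctypes; nothing taken from this thread):

```python
import ctypes

# Opt out of DPI virtualisation so GetSystemMetrics reports real pixels.
ctypes.windll.user32.SetProcessDPIAware()

SM_CXSCREEN, SM_CYSCREEN = 0, 1  # primary display width/height in pixels
width = ctypes.windll.user32.GetSystemMetrics(SM_CXSCREEN)
height = ctypes.windll.user32.GetSystemMetrics(SM_CYSCREEN)
print(f"Primary display is running at {width} x {height}")
```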

TonyRPH

12,972 posts

168 months

Wednesday 7th December 2016
thumbup