monitor connection

3 replies
Joined: 06/24/2012
Posts: 2

I'm getting a new monitor, the ASUS PA238Q, and I don't know what connection to use. It has all four: VGA (which I would never use again), DVI, HDMI, and DisplayPort. I'm using HDMI on my current monitor/TV, and I have a GTX 670 for a video card. Which connection would you advise?

3dGameMan
Joined: 12/31/2000
Posts: 5401

Kinda depends on your setup, but generally speaking, for a 1080p monitor DVI, HDMI, or DisplayPort would all be fine. If you want to send audio and video together on the same cable, use HDMI. Note that most people just use HDMI these days, but if the display includes a free DVI cable, use it. No point spending more if you don't have to.

Rodney Reynolds,

Junkyard Dawg
Joined: 02/01/2012
Posts: 23

I actually have a personal beef with using HDMI cables on monitors. It seems that when you use an HDMI cable, Windows treats the display as a television. The problem this has caused for me in the past is when you want to use features like scaling.

Let's say you have a monitor with a 16:10 resolution, like 1920 x 1200, and you have an image with a 16:9 resolution, like 1920 x 1080. Obviously there will be some blank space because of the slight difference in vertical resolution. Now let's say the image you are displaying, a game maybe, is in a nearly square aspect like 1280 x 1024 (that's actually 5:4, not quite 4:3). I had this issue when I wanted to play Silent Hill 2 for PC on a widescreen display connected via HDMI. As I said before, when you use HDMI, Windows tends to treat the device as a television.

As you may know, televisions have their own built-in functions to handle scaling. You can use the OSD menu to control whether images are automatically configured (aspect ratio is dynamic, depending on the source), scaled to fit (the image is expanded as large as it can be without losing any of the image or its aspect ratio), expanded to fit (the image is stretched to fill the entire screen, often losing the aspect ratio of the source), or shown at the original source resolution.
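The "scale to fit" mode above is just aspect-ratio arithmetic. A minimal sketch of what the scaler computes (the function name and example resolutions are my own illustration, not any vendor's API):

```python
def scale_to_fit(src_w, src_h, panel_w, panel_h):
    """Largest size that fits the panel while keeping the source aspect ratio."""
    # Scale by the tighter of the two constraints so neither dimension overflows.
    scale = min(panel_w / src_w, panel_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# 1920x1080 (16:9) on a 1920x1200 (16:10) panel: fits at native size,
# leaving (1200 - 1080) / 2 = 60 px black bars top and bottom.
print(scale_to_fit(1920, 1080, 1920, 1200))  # (1920, 1080)

# 1280x1024 (5:4) on the same panel: scaled up until the height fills
# the screen, with black bars at the sides.
print(scale_to_fit(1280, 1024, 1920, 1200))  # (1500, 1200)
```

Whether the GPU or the display runs this math is exactly what the HDMI-as-television behavior decides for you.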

When you use HDMI, you can no longer use options like the GPU scaling that NVIDIA and AMD graphics cards offer, because the driver assumes the display will control how the image is scaled. The problem is, if the display you are using is a monitor, the monitor's own scaling options may be disabled too, because it expects the computer to handle any changes. When you use DVI or DisplayPort, the display is not treated as a television, so the graphics software still lets you control GPU scaling. I have been able to work around the HDMI issue with registry hacks, but you would have to do it for every display that is ever connected to the machine.

In short, DVI is still my favorite video output for monitors. It still handles the largest resolutions available in the consumer market, it can be adapted to other standards like HDMI, and it can even carry audio that way. DisplayPort, I think, needs a revision or two before it can replace DVI in my book.

Sorry for the long explanation before I finally got to the point, but I wanted to make sure anyone would completely understand my argument before lashing back. I hope this helps you.

XFX nForce 780i SLI (LGA 775)
Intel Core 2 Extreme QX9770 (3.2 GHz, 1600 MHz FSB)
4 x PNY XLR8 DDR2 800 MHz (2 GB, 4-4-4-12)
2 x BFG Tech GeForce GTX 295 (Quad SLI)