Having recently upgraded to a GTX 1070 Ti graphics card to keep up with my dual 4K monitor configuration, I came across an issue that plagues many Nvidia chipset users – namely, what happens when you power off DisplayPort-connected monitors.

In the case of my 4K monitors, as they are connected by DisplayPort, powering them off via the front panel controls results in them "disconnecting" from the graphics card. Sometimes this doesn't happen immediately, but the result is always the same – Windows thinks there is no monitor connected, so the desktop is not rendered and remote control of the computer (via VNC in my case) is either not possible or suffers strange graphical glitches.

Prior to my upgrade, the solution to this issue on my GTX 970 Ti was to use a VGA dummy plug, which I built myself from an old DVI to VGA adapter and a few resistors. This worked just fine, but the GTX 1070 Ti takes it a step further by removing all analog output altogether – the only options are HDMI or DisplayPort. In theory, I could use the old dummy plug with a DisplayPort adapter, but that's just becoming complicated.
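For anyone curious about the dummy plug itself: the usual construction (I won't claim these are the exact values I used) is to terminate the three analog colour lines of the VGA connector with 75 Ω resistors, matching the input impedance of a real monitor so the card's load-detect circuit believes a display is attached. A sketch of that common wiring, assuming the standard DE-15 pinout:

```
VGA (DE-15) dummy plug – typical wiring:

  Pin 1 (Red video)   ──[ 75 Ω ]── Pin 6 (Red ground)
  Pin 2 (Green video) ──[ 75 Ω ]── Pin 7 (Green ground)
  Pin 3 (Blue video)  ──[ 75 Ω ]── Pin 8 (Blue ground)
```

On a DVI-to-VGA adapter you solder the resistors across the corresponding pins on the DVI-I side; double-check your adapter's pinout before soldering, as analog pin placement is what matters here.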