Hi folks,
I have a strange issue with my server. It sits in the back with a monitor plugged in but usually switched off, and I don't touch it for months at a time.
Today I went to log in to check something. I turned on the monitor and it worked for about a minute while I was trying to log in, then all of a sudden it turned off. I tried to power it back on (it's a BenQ with a soft-switch power button), and the power LED would turn green for a second or two, then the monitor would shut off again.
So I figured the monitor was fried. I plugged it into another computer and confirmed it won't power up. Then I took a known-good monitor and plugged it into the server. Turned it on... no signal detected! I checked the VGA cable, all good, still no signal. The server itself is up and I can access it over the network, but locally I get no video.
Next I tried an HDMI cable plugged into the same video card on the server (the HDMI port is right next to the VGA port). At first it didn't work either, then after more jiggling and waiting and ripping my hair out, it works! So the HDMI output from the video card now seems to work, but VGA doesn't. I've confirmed the VGA cable is OK. The card should be giving output on either HDMI or VGA, and if it doesn't detect one, then the other.
When I booted the server I did get the early POST output over the VGA connection, then after that it stopped. I don't know if that was the monitor or the card, and I can't reboot for a few more hours until the office closes, since the server is running everything. So I am still wondering what is going on.
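For what it's worth, since the server is reachable over the network, there is a check that can be run over SSH without touching the console, assuming it's a Linux box (a guess on my part) that exposes the DRM subsystem in sysfs. A rough sketch; connector names like card0-VGA-1 vary per machine and driver:

#!/usr/bin/env python3
# Ask the kernel what it thinks each video connector is doing,
# without needing a working local display. Assumes Linux with the
# DRM subsystem mounted under /sys/class/drm.
import glob, os

for conn in sorted(glob.glob("/sys/class/drm/card*-*")):
    name = os.path.basename(conn)          # e.g. "card0-VGA-1"
    with open(os.path.join(conn, "status")) as f:
        status = f.read().strip()          # connected / disconnected / unknown
    # A non-empty edid blob means the card talked DDC to a monitor on
    # that connector, i.e. the port is at least electrically alive.
    with open(os.path.join(conn, "edid"), "rb") as f:
        edid = f.read()
    print(f"{name}: {status}, EDID {len(edid)} bytes")

If the VGA connector shows "connected" with a valid EDID, the port hardware is probably fine and it's an output-selection problem rather than a damaged VGA stage.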
We know:
1. The VGA cable is good, but there is no VGA signal from the ASUS graphics card in the server.
2. The HDMI output on the same card (right next to the VGA port) works, but only after some random fiddling that I still can't pin down. So the video card is otherwise working.
3. The known-good monitor works on both its VGA and HDMI inputs, so the monitor I'm using now is not the problem.
4. The previous monitor on the server appears to have a power-up issue, likely failed caps? I will investigate that later.
So could the video card have a bad VGA output yet still have a good HDMI output? Could it just be a loose pin on the VGA port? Could the monitor with the power issue have caused problems with the VGA circuit (it was previously plugged into the server's VGA output) and damaged some VGA output chip/stage? Is the VGA output on my server's video card damaged in some way? Or is there some setting I am neglecting, or something in the BIOS at boot?
Any advice would be helpful. Trying to figure out what is happening has been quite erratic: at first I was getting no output at all and thought my entire video card was damaged, yet now I see it works over HDMI.
EDIT:
Could it be that the video card detected HDMI plugged in, set that as the primary display output, and left the secondary output (the VGA port) off? Maybe if I reboot with the monitor plugged into VGA from the start, it will use that as the primary display. I have a feeling that's what is going on: in all the confusion over which cables and which monitors were working, the server was booted with nothing plugged in (or with HDMI first), so HDMI was set as the primary output and no secondary display was ever configured.
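If that's the case, it might even be fixable without waiting for the reboot. Assuming a Linux box running X11 (if the server is console-only this won't apply), xrandr can show which outputs the driver sees and try to bring VGA back up. DISPLAY=:0 and the VGA-1 connector name below are guesses that vary per setup:

#!/usr/bin/env python3
# Sketch: query the X driver's view of the outputs, then try to
# enable the VGA output at its preferred mode. Assumes Linux + X11
# with xrandr installed.
import os, subprocess

env = dict(os.environ, DISPLAY=":0")

# List every output and whether it is connected / has an active mode.
result = subprocess.run(["xrandr", "--query"], env=env,
                        capture_output=True, text=True)
print(result.stdout)

# If VGA-1 shows "connected" but inactive, ask the driver to bring it
# up alongside (or instead of) the HDMI output.
subprocess.run(["xrandr", "--output", "VGA-1", "--auto"], env=env)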