Author Topic: Server monitor/video card issue  (Read 2058 times)


Offline edyTopic starter

  • Super Contributor
  • ***
  • Posts: 2387
  • Country: ca
    • DevHackMod Channel
Server monitor/video card issue
« on: November 28, 2016, 11:05:21 pm »
Hi folks,

I have this strange issue with my server. It sits at the back with a screen plugged in but normally switched off, and I don't touch it for months.

Today I went to log in to check something. I turned on the monitor and it worked for about a minute as I was trying to log in. Then it suddenly turned off. I tried to start it up again (BenQ soft-switch power button) and the LED would turn green for a second or two, then it would turn off again.

So I figured the monitor was fried. I took it and plugged it into another computer, and confirmed it won't power up. Then I took a known GOOD monitor and plugged it into the server. Turned it on... no signal detected! Checked the VGA cable, all good, still no signal detected. The server is on and I can access it from the network, but locally I get no signal.

Next thing I tried was the HDMI cable, plugged into the same video card on the server (right next to the VGA port). At first it didn't work, then after more jiggling, waiting and ripping my hair out, it worked! So the HDMI output from the video card now seems to work, but VGA doesn't, and I confirmed the VGA cable is OK. It should be giving output on either HDMI or VGA, and if one isn't detected, then the other.

When I booted the server I had some early boot/POST output showing over the VGA connection, then it stopped. I don't know if that was the monitor or the card, and I can't reboot for a few more hours until the office closes, since the server is running everything. So I am still wondering what is going on.

We know:

1. The VGA cable is good, but there is no VGA signal from the ASUS graphics card in the server.
2. The HDMI output on the same card (right next to the VGA port) works after some random fiddling, though I'm still not sure what I did. So the video card works otherwise.
3. My current monitor is good on both VGA and HDMI inputs, so I know it's not the monitor I'm using now.
4. The previous monitor on the server appears to have a power-up issue, likely failed caps? I will investigate later.

So could the video card have a bad VGA output yet still have a good HDMI output? Is it possibly just a loose pin on the VGA port? Could the monitor with the power issue have caused problems with the VGA circuit (since it was previously connected to the server's VGA output) and damaged some VGA output chip/stage? Is my server video card's VGA output damaged in some way? Is there some setting I am neglecting, or something in the BIOS at boot?

Any advice would be helpful. Trying to figure out what is happening has been quite sporadic: at first I was getting no output at all and thought the entire video card was damaged, yet now I see it works.

EDIT:

Could it be that the video card detected HDMI plugged in and set that as the primary display output, leaving the secondary display (the VGA port) off? Maybe if I reboot with the VGA cable plugged into the monitor from the start, it will use that as the primary display. I have a feeling that's what is going on: in all the confusion over which cables and monitors were working, the server was booted with nothing plugged in, or with HDMI first, and HDMI was set as the primary with no secondary display configured.
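One way to check that theory before rebooting, if the server happens to run Linux with KMS/DRM drivers (just an assumption, the OS isn't stated here), is to look over SSH at which connectors the card currently reports as connected:

```shell
#!/bin/sh
# Sketch only: assumes a Linux server exposing KMS connectors under
# /sys/class/drm (e.g. card0-VGA-1, card0-HDMI-A-1). Prints each
# connector name and whether the kernel currently sees a monitor on it.
for status in /sys/class/drm/card*-*/status; do
    [ -e "$status" ] || continue                # no KMS connectors exposed
    conn=$(basename "$(dirname "$status")")     # e.g. card0-VGA-1
    printf '%s: %s\n' "$conn" "$(cat "$status")"  # "connected"/"disconnected"
done
```

If an X session is running, `xrandr --query` over SSH (with the right DISPLAY set) gives the same information plus which output is actually active.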
« Last Edit: November 29, 2016, 12:34:12 am by edy »
YouTube: www.devhackmod.com LBRY: https://lbry.tv/@winegaming:b Bandcamp Music Link
"Ye cannae change the laws of physics, captain" - Scotty
 

Offline Rasz

  • Super Contributor
  • ***
  • Posts: 2617
  • Country: 00
    • My random blog.
Re: Server monitor/video card issue
« Reply #1 on: December 01, 2016, 07:42:03 am »
Quote from: edy
(BenQ soft-switch on button) and it would turn green for a second or two, then turn off again

It's just a bad monitor (caps, sometimes transistors; there are even whole repair kits for them).
Who logs in to gdm? Not I, said the duck.
My fireplace is on fire, but in all the wrong places.
 

Offline CJay

  • Super Contributor
  • ***
  • Posts: 4136
  • Country: gb
Re: Server monitor/video card issue
« Reply #2 on: December 01, 2016, 07:55:39 am »
Entirely possible it's set VGA as a secondary output or disabled it. I've an HP machine here that doesn't enable VGA unless it's connected to a powered monitor at POST time.
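For what it's worth, if the server runs Linux (the thread never says), the kernel can be told to force a connector on regardless of what was attached at POST, using the `video=` parameter on the kernel command line; `VGA-1` is the usual name of the first VGA connector, but check yours first:

```
# Hypothetical example: append to the kernel command line (e.g. in
# GRUB_CMDLINE_LINUX in /etc/default/grub) to force the VGA connector on:
video=VGA-1:e

# Or force a specific mode as well, e.g. 1024x768 at 60 Hz:
video=VGA-1:1024x768@60e
```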

 

Offline Z80user

  • Contributor
  • Posts: 11
  • Country: es
Re: Server monitor/video card issue
« Reply #3 on: December 01, 2016, 12:09:29 pm »
Press some random keys too; it's common for the video card to sit in low power, with nothing active, until you press a key. I recommend a key that doesn't do anything in any program, such as Shift or Control, because the problem may just be that the monitor is off or in a standby mode.
If the computer has more than one video card, one of them is surely unused, though that isn't common in a server. Removing it can also give back some power if the server's memory is shared between the processor and the graphics card.
 

