Having problems with a GTX 1060 I bought to replace a 650 Ti: I'm not getting any display on the monitor.
Sometimes I'll get the BIOS splash screen and then the Windows logo before everything goes blank; sometimes I don't even get the BIOS startup screen. If I do make it to the desktop, the system runs stable and I can play graphically demanding games on ultra settings no problem.
All will be fine for a few days and then I will have to keep rebooting 10-15 times before I can get to the desktop, and then all is stable again for a while.
Nothing else has changed - just a straight swap from the old card to the new, power requirements for both cards are similar (maybe slightly less for the new card).
I currently have a 650W supply (very old though) and am only running the card and an SSD off one rail, with everything else powered by the other rail. Tried swapping rails, no difference. The 650W is more than enough for the system, but I know what counts is what's delivered through the rails, not the overall wattage.
Does it sound like a PSU problem? Don't want to spend if it could be something else. Should mention my monitor only has a VGA connection, so I've had to run it through a DisplayPort-to-VGA adapter. That couldn't be an issue, could it?
Presuming it is, any recommendations for a sub-£90 PSU of at least 600W? Corsair and Antec seem to be the best choices, any others to consider?
Don't have an option in the BIOS for AGP voltage anyway; there are options for HT, NB, NB 1.8V and SB voltages though. Don't know what they mean, so I'm not messing with them.
New PSU it is then.
I do remember Trig though, haven't read any of his posts since he was stuck in the sanitarium. Is he still around? And that fella who thought the CIA/FBI/NSA were camping in his loft and assassinating everyone he knows!!
No idea. I bought my current card about 6 years ago, so it's ancient in tech terms, but it still plays everything I want it to. I haven't really kept up with numbers and models since then. However, for a decent card you'd probably be looking to spend around £200, which is probably about $20 these days.
It was the displayport to VGA adapter causing the problem.
The card only has digital outputs, so I needed an adapter to convert the signal to analogue to work with my VGA monitor, but the card couldn't supply enough power to run the adapter.
Bought an HDMI-to-VGA adapter with a 5V USB power input for £5.99 and have had no problems since. Cheaper than a new PSU!
It is an LCD monitor, only a couple of years old, just not a digital one.
It works perfectly well, so why should I change it? Is there any advantage to an HDMI/DVI-D monitor over a standard VGA one? Both are just a means of displaying an image, after all.
Quality might be better, refresh rates and response times faster, but would I really notice any difference? I mean a really noticeable difference? I can barely tell the difference between a standard TV picture and a high-def one!