Okay, so where do I start? Well, let me start by saying I thought my RX 590 kicked the bucket yesterday. This is after I sold my 5700 due to its driver crashes. I was pretty upset, actually very close to crying because it stressed me out, a lot. Anyway, the long and the short of it is: I loaded up Rage 2 and after about half an hour of gameplay I started getting really, really bad artifacting. It looked to me like that Power Boozer (lol) Polaris 30 had finally drawn too much current and his logic had fried.
If you're interested, here's what I saw in the game.
EPILEPSY WARNING. Please DO NOT watch this if you have photosensitive epilepsy (PSE). This is truly a Fuster Cluck of flashing and it almost sent me into a daze, let alone someone with PSE. With that out of the way, here's the video.
... Yeah. On top of that, if I tried to alter the clock speed in-game (using the Radeon Overlay with Wattman) the entire system would 'brown-screen' and lock up. In all honesty, I just pulled the card out of my system and almost teared up. At that point I went to Amazon and bought a GTX 1660 SUPER base model with a single fan (more on that in a moment...) for £199 with free same-day shipping (Amazon Prime FTW). It was a quick purchase, because I was pretty pissed off. Anyway...
Sash gets baby Turing again, this time SUPER-charged (lol).
...Fast forward to today, and the card arrived this evening. Here's the little guy.
As you can see, it's a pretty small card. That's actually a nice thing for me, since it lets me get my fat hands/fingers around it to reach the other components without taking the card out. Anyway... I installed the thing and fired up my system, DDU'd the old AMD driver and pulled down the latest Nvidia driver...
'DCH' Driver doesn't include the Nvidia Control Panel. And Microsoft Store still sucks.
...Except that didn't install the Nvidia Control Panel, and the advice was to get that from the Microsoft Store (this is already going just great...). Except...
The Microsoft Store wasn't going to co-operate. By the way, the poor reviews of this 'app' on the store are from people with my exact same issue. That's already inspiring me with confidence!
Anyway, after some googling and hoop-jumping I found out that this is down to the 'DCH' driver model: Microsoft's packaging rules mean Nvidia can't bundle the Control Panel with the driver itself, so it has to come from the Microsoft Store instead. Well, that's nice, if the Store would actually let me download the app so I can configure my newly purchased graphics card.
The fix is to simply search for the 'Standard' driver package from Nvidia (believe it or not, the default one on their website is the 'DCH' one). So I pulled that down, and everything is fine and dandy. Fired up Warframe and the card is A-okay.
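By the way, if you're not sure which flavour you already have installed, the Nvidia Control Panel's System Information page has a 'Driver Type' field that says DCH or Standard. If you'd rather script it, here's a tiny Python sketch. Fair warning: the DCHUVen registry value it checks for is just a community-reported indicator of a DCH package, not something I've seen Nvidia officially document, so treat the result as a best guess.

```python
# Hypothetical DCH-vs-Standard check. The "DCHUVen" value under the nvlddmkm
# service key is a community-reported marker of a DCH driver install, not an
# officially documented one, so don't treat the result as gospel.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Services\nvlddmkm"

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        try:
            winreg.QueryValueEx(key, "DCHUVen")
            print("DCHUVen value present -> this looks like a DCH driver package")
        except FileNotFoundError:
            print("No DCHUVen value -> this looks like a Standard driver package")
except FileNotFoundError:
    print("nvlddmkm service key not found - is an Nvidia driver installed at all?")
```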
This card is efficient. Like, really efficient. It's also remarkably cool and quiet for a teeny, tiny single-fan model. TU116 is a pretty damn efficient chip - even boosting to 1900 MHz it's barely hitting 70C, and I can't hear it over my case fans (admittedly they are quite loud...).
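If you want to log clocks and temperatures yourself rather than eyeballing an overlay, here's a quick Python sketch that polls nvidia-smi every couple of seconds (assuming nvidia-smi is on your PATH - it comes with the driver):

```python
# Minimal GPU monitor: polls nvidia-smi for temperature, graphics clock,
# board power and fan speed, and prints one CSV line per sample.
import subprocess
import time

QUERY = "temperature.gpu,clocks.gr,power.draw,fan.speed"

while True:
    sample = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(sample)   # temperature (C), graphics clock, power draw, fan speed
    time.sleep(2)   # sample every two seconds while the game is running
```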
Priceless, Rare Artifacts are so nice!
I'm always a bit anxious when using a new component for the first time. But everything was fine! Well, it was until I tried RAGE 2 again, but I bet you already figured out what I'm going to say.
I managed to progress a bit, and the area that artifacted on the RX 590 was running fine on the 1660 SUPER, so I assumed the 590 was at fault. Until I entered a new area and the textures went all stretchy and artifacty on my shiny new Nvidia card. Yeah.
Logical deduction tells me this isn't a GPU or driver issue, since both cards are absolutely rock-solid in Warframe at 100% load, and that game does push the shaders pretty hard (and generates a lot of heat). There's no way my luck is that bad, so I guess the issue is software: either RAGE 2 itself or my Windows install. Or my RAM and CPU - which is unlikely, actually, since I run WCG 20/7 and I've yet to return an invalid result that wasn't down to AMD's unstable drivers (>_>).
I'm going to re-install the game, and if necessary Windows, and see if that fixes it. If not, then I'm just not going to play RAGE 2, lol. ¯\_(ツ)_/¯
I'm glad Polaris didn't die. But I'm going to keep the GTX 1660 SUPER.
I'm happy that my RX 590 is still going, even though it's a bit of a pig and needs some tweaking to stop it exploding in my face and pulling 300 Watts. The 8GB is nice, and I will for sure use it in one of my other PCs as a backup. Which brings me to say: I am going to keep the GTX 1660 SUPER as my main PC's graphics card. There are a few reasons for this, which I will list.
Firstly, this card is extremely power efficient, both under load and at idle. With all my Radeon cards, idle power use at 144 Hz is extremely high because the GDDR5/GDDR6 memory doesn't drop into its idle power-saving states - that's about 30-35W at idle, just for the graphics card. That is really bad for leaving my PC on 24/7, so I had to set the monitor to 120 Hz, which fixes the issue. I mean, it wasn't the worst thing in the world; but the 1660 SUPER has ultra-low idle use even at 144 Hz, so I guess that's one less annoyance to deal with.
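If you want to check whether your own card's memory actually downclocks at the desktop (on the Nvidia side, anyway), a one-shot query of the memory clock and board power does the trick - same caveat as the sketch above, this assumes nvidia-smi is available:

```python
# One-shot idle check: run this at the desktop with nothing heavy open.
# If the memory clock is still sat at its full 3D rate, the VRAM isn't idling down.
import subprocess

idle = subprocess.run(
    ["nvidia-smi", "--query-gpu=clocks.mem,power.draw", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(idle)
```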
In games, this thing just sips power: 120W peak board power, which is 50W less than the RX 590 used for the GPU core alone - and a whopping 90W less comparing full board power. I get that power efficiency doesn't matter much to many gamers, but for me it does. I pay directly for my power use, I tend to do fairly consistent 2-3 hour runs of high-refresh Warframe, and that juice goes on top of my WCG farm running 24/7, so it does make a bit of difference. That, and it puts out significantly less heat.
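Back-of-the-envelope, just to show it's not nothing - the numbers below are my rough usage pattern and a placeholder electricity price, so plug in your own tariff:

```python
# Rough yearly saving from ~90W lower board power during gaming sessions.
# ASSUMPTIONS: ~3 hours of gaming per day, and a placeholder 0.15 GBP/kWh tariff.
watts_saved = 90          # full-board difference quoted above
hours_per_day = 3         # roughly my nightly Warframe run
price_per_kwh = 0.15      # GBP - placeholder, not my actual rate

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, roughly £{kwh_per_year * price_per_kwh:.2f}/year saved")
```

Not a fortune, but it adds up, and that's before counting the reduced heat.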
Secondly, I've been pretty disappointed with AMD's driver stability lately. The crashing on the desktop was really the final straw, as I was dumping hours of WCG tasks due to having to hard-reboot. I'm cautiously optimistic that Nvidia's drivers will be more stable for leaving my system on 24/7. This is a secondary point, since I only experienced the crashing on the Navi-based RX 5700; either way, my GTX 770 system has yet to crash at all, so it seems a fair expectation.
Thirdly, Bricklink's Stud.io renderer, Eyesight, has a CUDA plugin. From my testing, my GTX 770 is not-insignificantly faster than my Ryzen 7 2700 with the ray-tracing software. This should allow me to spew out my models quicker, since I do end up waiting 10-15 minutes for the 3700X to complete them. I mean, the 3700X is no slouch, but the 1660 SUPER should be a lot faster. That said, I've taken to simply doing the renders on the GTX 770 system, since rendering makes the system sluggish and this way I can edit the image in PDN whilst the other machine does the render.
Turing is pretty wedge for compute, so the 1660S should be notably faster than the 770, which should make up for having to wait for renders to complete without needing my second system for it. Sorry, I'm babbling. Jeez, I typed a lot.
Stop babbling, Sash.
I know. I'm sorry. :< Anyway, the GTX 1660 SUPER offers me performance I am happy with, in a tiny, power-efficient package that should be more stable for my WCG work, and can also do stud.io hardware rendering. So it's a win/win. Of course, the smaller VRAM capacity will likely mean I'll have to turn Fallout 76's textures from Ultra to High or Medium, but since that game is an unoptimised pile of trash with shitty textures that look no different on any setting, I think it'll be okay.
Oh, and I don't actually play Fallout 76 anymore. :D
I have once again returned to my historic happy-place GPU budget of between 200 and 300 pounds. The cheaper, the better. Within reason. Unless something big and stupid happens with me (very likely), this card should be fine for me for the foreseeable future. And I plan to invest in some pretty wedge CPU upgrades soon. :3900X!
Did you see the cat-face emote 12-core just there? I'm really happy with that. :3(900X!)
There is always MORE SCIENCE! to be done.
I'm going to stop typing now. :D