Yes. Well, I put the question mark there after Ultimate because I do not know if this will be the last chapter in its long and treacherous journey, all I know is that it is the latest. But, it would be nice to think it was the last one. So I suppose you want a bit of backstory, do you not? Well, for the one or two people that read my blog (LOVE YOU MAIK AND ELLERTIS), you'll probably know it has something to do with this. Yeah, that thing.
It has been through quite a lot. It has then been through even more on top of that, then some more. And then some more. In fact, it is pretty much eternal at this point, not only in utility as a graphics processing device, but also as a physical product I can hold in my hands, abuse, and have it still work like the day it was bought back in 2018 (well, not quite, but you get the message).
I never posted about what I thought was its final moment, not because of sadness, but because I was taking a break from posting on my blog. So, I feel it deserves a mention here; a small nod to what I thought were its last moments in service as a sturdy, noble gaming steed. After that horrific CPU-tower-cooler experiment, it eventually refused to give a display output in the system I was using it in; after several attempts to get it working, I gave up. I peeled the thing out of the case and threw it aside into the box of 'metal bits and electrical bobs that Sash needs to sort out at some point but never will'. There it remained for months, until I finally retrieved it and placed it, propped up, on a plastic box on a dresser next to such legends as my R9 380X, 280X, 270X and 290X... and some not-quite-legends such as a GTX 770 and a GT 730 that I killed a while back (don't ask, I forgot how). There, I thought, my Polaris-chan would remain forever.
I was wrong. (I came back to type this, forgive the next bit; it has some 'minor' digressions).
Since I sold my RTX 2080 Ti (I don't believe I even updated my blog on that) for what it was worth on eBay a week or so ago (no, I didn't scalp; I sold the liquid-cooled 2080 Ti for £600), I found myself using my trusty GTX 1650 SUPER, since I had also sold the GTX 1660 SUPER a while back (for the same as I bought it for in 2020, woo). And, well, the GTX 1650 SUPER's salvaged memory bus at 128 bits doesn't permit 6GB symmetrically, so you get 4 or 8, and it's Nvidia, so you get 4.
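For anyone wondering why a 128-bit bus forces the 4-or-8 choice, here's a quick back-of-the-envelope sketch (assuming the usual 32-bit-per-chip GDDR interface and the common 1 GB / 2 GB chip densities):

```python
# Why a 128-bit GDDR bus gives you 4 GB or 8 GB (symmetric configs only).
# Assumptions: each GDDR chip exposes a 32-bit interface, and chips come
# in 8 Gb (1 GB) or 16 Gb (2 GB) densities - the common options.
BUS_WIDTH_BITS = 128
BITS_PER_CHIP = 32
CHIP_DENSITIES_GB = [1, 2]

chips = BUS_WIDTH_BITS // BITS_PER_CHIP  # 4 chips fill the bus
for density in CHIP_DENSITIES_GB:
    print(f"{chips} x {density} GB chips = {chips * density} GB")
```

With every chip the same size, capacity can only be 4 GB or 8 GB; a 6GB config would need mismatched chips and an asymmetric bus.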
That just ain't gonna cut it. Well, maybe it would cut it for Warframe, Deep Rock Galactic and Fallout 76 (the latter with medium textures at 1080p) - these are the games I routinely play, and they run fine on the baby budget model 1650 SUPER at 1080p. If it were GPU core-bound, my gaming needs would be covered by this entry model (I am not a GaMEr)...
Perhaps more fittingly, it wasn't even gaming-related workloads where I felt the 4GB really hurt; it was Adobe Premiere Pro on my 4K screen. I need 4K to work on my biggest projects because at 1080p on my 24" main monitor, the tracks for the video and audio layers are so compacted I have to use tiny scrollbars if I want a preview composed of more than sixteen pixels (not quite, but you get the point). The 4K 40-inch screen (it's actually a TV) lets me work with ~8 stacked video and audio tracks and the preview at at least 1080p resolution, all on-screen, which is not just handy, it's more or less essential. Well, good luck fitting that for a 4K video in 4GB, with or without Mercury Playback Engine CUDA acceleration.
That's right: even just running the desktop interface with the Premiere Pro application maxes out the 4GB buffer, and it stutters. It stutters a lot. To the point of 'unusability' (that is not a word, but I am going to use it anyway). Fast-forwarding past the attempts to disable Windows desktop GPU acceleration and have my Ryzen 9 5950X (did I even mention that in a blog post? Oh my, I have been slacking a lot) process the graphics (which failed, btw; the GPU still has to hold the screen data in memory), we get to my predicament.
Okay, so, I may have regretted selling the 2080 Ti for creative reasons (including this), but I had done it, so it was time to cobble something together so I could keep producing lovely made-up spaceships in this waste of oxygen of a life that I have... well, that escalated quickly. Regardless...
Oh, so I haven't even told you about the RX 590 yet? Well, at this point you can probably guess what I'm going to say about it, but one more thing - actually, I'll cover the TITAN part now, because it happened, chronologically, before the RX 590 part. My friend - the same Borb who sold me the 2080 Ti in the first place (in his defence, he told me not to sell the 2080 Ti, and he was right; if you're reading this, Borbossa, once again I should have listened :c)... Well, Borbossa has a TITAN X (Maxwell gen) and a TITAN Xp (the full GP102 card). Both of these models have 12GB of video memory - just what I need!
Well, after agreeing a price for the Xp (I did consider the Maxwell, but it's getting on a bit these days), the plan is this: the TITAN Xp handles CUDA/memory-intensive creative applications hooked up to my 4K TV, while the GTX 1650S runs the 1080p monitor for GaYmInG, as it's more efficient and has higher feature support. The TU116-based 1650S also has a better video codec engine, which I do use for transcoding, so there's that. Interestingly enough, my acquisition of a Pascal-based card (and the most powerful one for desktop, at that) will be a boon for my use of Stud.io's photorealistic renderer, which to this day is broken on Turing (16 and 20 series) cards and doesn't even start on Ampere (30 series) cards. Borbossa has an RTX 3090, so we already tested that and confirmed it on the Bricklink forums (unless they fixed it already... doubt it). The 5950X is no slouch in it, but the TITAN Xp will be several times faster, especially in denoise, with the 12GB of GDDR5X memory producing a nice 528 GB/s (vs 62 GB/s peak on my DDR4-4200 on the 5950X).
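Where do those bandwidth numbers come from? A rough sketch, assuming ~11 GT/s effective GDDR5X on the Xp's 384-bit bus and dual-channel (128-bit combined) DDR4-4200 on the desktop side; the ~62 GB/s I quoted above is presumably a measured figure sitting a bit under the theoretical peak:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * data_rate_gtps

# TITAN Xp: 384-bit GDDR5X bus at ~11 GT/s effective (assumed)
print(bandwidth_gbs(384, 11.0))  # 528.0 GB/s

# Dual-channel DDR4-4200: 2 x 64-bit channels at 4.2 GT/s
print(bandwidth_gbs(128, 4.2))   # 67.2 GB/s theoretical peak
```

Roughly an eight-fold gap, which is why VRAM-resident denoising runs so much faster than spilling to system memory.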
Wow this has been a long post, and I haven't even mentioned the new episode with the RX 590 yet, okay, let's fix that.
Since I am waiting for Borbossa to post me the TITAN Xp, I am typing this post looking at a monitor that is connected to that RX 590. That's right, it still works. And it doesn't just still work, it also runs at stock speeds and, as far as I can tell, stock voltages (I tried 1542 MHz at 1.15V - yes, I know stock is 1545 for the 590, but you tell that to the slidery draggy thing in Afterburner, unless you can type in those numbers? Because I didn't know that). This card has always used 1.15V for stock, since day one, too. Well, I tested it in Adobe Premiere Pro with the same 4K video editing on the same 4K TV, same project, same effects - and it's smooth as butter, because Adobe actually have an OpenCL path for the Mercury Playback Engine to use AMD GPU acceleration, yay!
I've gamed on it a little bit, but it's just nice to have a graphics card that can actually display my creative software without imploding into stutter (if you're saying "Oh, Sash, you should have known a 4GB card wasn't going to be enough for 4K video editing, you idiot!" then you can shut up, because I was so far into Sashland that I forgot about that one detail).
ANYWAY! I gamed in Fallout 76 on the 590 a bit and it's running great. I've decided to run the card at ever-so-slightly above RX 480 speeds, to give it a break and run nice and chill, at around 1 volt. Same chip, so it's nice to be able to replicate that card's performance for my videos (well, sometimes). It's also nice to be able to use high textures without performance issues.
Here is a short video of that gaming session:
I also tested it at almost AMD reference RX 590 speeds, at the stock voltage state of 1150mV, and that also seems solid for this game:
Suffice it to say, I am quite happy that my ol' buddy ol' pal is still working. Very happy, in fact.
Now, I wonder if it will still do 1600 MHz...