Sounds crazy, right? Why would I give up the power of a top-of-the-line GPU for a midrange card?
Truth be told, I only grabbed the 3060 as a secondary card for my Ubuntu system so I could pass the 6900XT through to a Windows VM for gaming and Adobe products.
However, I was surprised by a few things after I made the switch:
- The 3060 is less jerky in ProtonDB gameplay
- The 3060 is better at Stable Diffusion
- The 3060 has CUDA, which is far easier to work with than ROCr/OpenCL/ROCm/HIP/…
The 3060 is less jerky in ProtonDB games
Again, I bought this 3060 specifically so I could pass the 6900XT through to a Windows VM to game with. So imagine my shock when the games I previously had installed on Linux ran smoother on it! This blew my mind, as I had totally drunk the “AMD is better on Linux than Nvidia” Kool-Aid.
Anno 1800 ran at about a 10% lower frame rate on the 3060, but was actually playable. Moving around the map was smooth and felt native. With the 6900XT, sure, I had a few more frames, but the game was constantly twitching and jerking. I couldn’t stand to play it on the 6900XT, whereas the 3060 was fine.
Satisfactory took a much larger frame rate hit on the 3060, dropping from an average in the mid 70s to the mid 40s on ultra graphics. However, it was a smooth 40. There were none of the long stretches where no frames were displayed at all, like I saw with the 6900XT.
The 3060 outperforms the 6900XT in Stable Diffusion
This surprised me. When I started generating AI images on Windows with my 6900XT using OnnxDiffusersUI (AMD GPUs only), it took about 1 second per iteration for a 512×512 image.
Switching to Linux and InvokeAI, that went up to 6+ iterations per second on the 6900XT. Sadly, the coil whine was terrible, and my lights flickered every time it finished an image from the 240+ watts of instantaneous power draw, but it was fast!
Then when my 3060 arrived I tested it out, and it produced images at around 7 iterations per second. On top of that, it has no coil whine and was drawing about a hundred watts less!
I didn’t save the stats while the 6900XT was installed, but I’m throwing the 3060 numbers here in case I ever spend the day and a half to swap it back in and do a proper apples-to-apples comparison.
- RTX 3060 - Driver Version: 525.85.12 - CUDA Version: 12.0
- stable-diffusion-1.5 - k_euler_a
- 512x512 - 50 iterations - CFG 7.5 - Seed 3009636919
- 141 watts
- 7.34 iterations per second
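For reference, here is roughly how I’d reproduce a run like that outside InvokeAI, using the Hugging Face diffusers library instead. This is only a sketch under my own assumptions: the model ID, the prompt, the scheduler mapping, and the timing approach are mine, not what InvokeAI does internally.

# Rough benchmark sketch using Hugging Face diffusers (not InvokeAI).
# Assumes diffusers + torch with CUDA and access to a Stable Diffusion 1.5
# checkpoint; the model ID below may need to be swapped for a mirror.
import time
import torch
from diffusers import StableDiffusionPipeline, EulerAncestralDiscreteScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
# k_euler_a in InvokeAI roughly corresponds to the Euler ancestral scheduler.
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)

generator = torch.Generator("cuda").manual_seed(3009636919)
start = time.time()
pipe("a lighthouse at sunset",  # placeholder prompt, not the one I used
     height=512, width=512, num_inference_steps=50,
     guidance_scale=7.5, generator=generator)
elapsed = time.time() - start
print(f"{50 / elapsed:.2f} iterations per second (very rough)")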
The 3060 has CUDA and easier driver installs
It’s hard to express how nice it is to have a standard compute library that is compatible with everything out of the box. If I’m coding with TensorFlow or doing any image recognition or generation, it just works. I don’t need a separate library or weird install steps to get the encoders working. It’s just beautiful.
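As a quick illustration (a minimal sketch, assuming the stock tensorflow pip package on a machine with the NVIDIA driver and CUDA libraries installed, not my exact setup), this is roughly all it takes to confirm the GPU is usable:

# Minimal sketch: confirm TensorFlow can see a CUDA GPU and run work on it.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)
if gpus:
    # Tiny matmul on the GPU as a smoke test.
    with tf.device("/GPU:0"):
        a = tf.random.normal((1024, 1024))
        b = tf.random.normal((1024, 1024))
        print("matmul OK, result shape:", tf.matmul(a, b).shape)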
Just to get the AMD graphics stack installed and working, I spent a few hours with their docs before finally coming up with this monstrosity:
amdgpu-install --usecase=dkms,graphics,opencl,hip,rocm,rocmdev,rocmdevtools,amf,lrt,hiplibsdk --opencl=rocr,legacy --vulkan=amdvlk,pro
AMD seems to think that no one in the history of computing would ever want a $1000 GPU that could both game and run compute workloads. Nvidia’s compute setup, meanwhile, boiled down to:
apt install cuda
Also, in this case AMD’s open-source-ness actually works against them, because there are at least four commonly recommended driver variants: Pro (required for the workloads I wanted), the regular amdgpu driver, Oibaf, and Kisak. And each of them can give you a different gaming experience per game.
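The difference shows up in day-to-day sanity checks too. Here’s a small sketch (my own, not from any vendor docs) of how I’d check which compute backend a PyTorch install actually ended up with; on the CUDA side this just works, while on the AMD side it depends on having installed the right variant of everything above:

# Sketch: report which GPU compute backend this PyTorch build can use.
# Assumes a standard torch install; these attributes are part of the public API.
import torch

print("torch.cuda.is_available() (True on both CUDA and ROCm builds):",
      torch.cuda.is_available())
print("CUDA toolkit version (None on ROCm builds):", torch.version.cuda)
print("ROCm/HIP version (None on CUDA builds):", torch.version.hip)
if torch.cuda.is_available():
    print("Device 0:", torch.cuda.get_device_name(0))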
Still keeping the 6900XT
The 6900XT was designed to be the raster king for gaming, and I’m going to use it that way! Even though my main machine runs Ubuntu, I have a virtual machine set up to use the 6900XT directly, so I get all of its raw power!
There are plenty of guides on GPU passthrough, and I followed them to get it working. It’s not for the faint of heart, and it isn’t 100% reliable or as good as a pure Windows desktop, but it’s nice not to be stuck in a Windows world for everything.
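If you want to gauge whether your hardware is even a candidate before following one of those guides, a common first step is seeing how your GPU falls into IOMMU groups. A quick sketch, assuming a Linux host with IOMMU enabled in the BIOS and kernel (amd_iommu=on or intel_iommu=on) and lspci available:

# Sketch: list IOMMU groups and the PCI devices in each, a common first
# check before attempting GPU passthrough.
import os
import subprocess

base = "/sys/kernel/iommu_groups"
for group in sorted(os.listdir(base), key=int):
    for dev in os.listdir(os.path.join(base, group, "devices")):
        # lspci -nn -s prints the device name plus vendor:device IDs.
        desc = subprocess.run(["lspci", "-nn", "-s", dev],
                              capture_output=True, text=True).stdout.strip()
        print(f"Group {group}: {desc}")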
Final Thoughts
Apologies for the lack of hard numbers on these tests; I never actually expected to be doing a direct comparison of these cards. I was simply so amazed at how much easier and smoother the RTX 3060 performed versus the 6900XT that I had to share my findings.
No, I am not being paid by Nvidia. I actually worked for AMD and am rooting for them to make their products (including software) better! And at the end of the day if I could only have one, it would still be the AMD 6900 XT.