Well, it's here - our full NVIDIA GeForce RTX 3080 Founders Edition review. NVIDIA's RTX 30 series debut, which we put to the test at 4K and 1440p across a wide range of games. And we get it to trace a few rays too. The results? Well, that headline should give you an idea of how it stacks up against the RTX 2080 and even the RTX 2080 Ti.
Any review or deep dive into a new piece of tech, especially a graphics card, is an ambient occlusionary tale full of whimsical numbers, charts, and talk about frames, games, and names. Like Ampere, Turing, and GDDR6X. Which we’ll get to – that is, if you can resist the temptation to scroll. Go on, resist.
Our Full NVIDIA GeForce RTX 3080 Founders Edition Review
Looks pretty great. I'm starting to think the demand for the 3090 won't be too high and I may actually get one. Everyone will want a 3080 and later the 3070.
The third-party benchmarks are coming in and look to be largely in line with NVIDIA's numbers.
The 3080 is up to about twice as fast as the 2080, more typically about 80% faster. It's up to 40% faster than the 2080 Ti, and much cheaper.
It's worth noting that lots of benchmarks have improved across the board since the 2000-series launch, as CPU and PCIe bus bottlenecks have actually started to matter across these generations. Be careful, because you may get more bang for your buck from a CPU/motherboard upgrade while sitting on, say, a 2070, if you've ignored that side for a few generations.
I've decided to sit this launch out and wait for the Ti or a 3090 partner card. There's just too much hype and real value here, which means stocks will be low and real-world prices will be spectacular for a while.
The engine is a bottleneck-ridden mess. Apparently the recent patch, which is seconds away from being finished, improves things performance-wise. It also breaks controller sensitivity settings, which sucks because the default sensitivity is stupidly high. It's also CPU bound: my GPU rarely hits 100% usage while my CPU gets thrashed. Hopefully Zen 3 will fix that.
I've noticed Mwave have gone from a first-come, first-served system for purchasing the FE cards to a raffle... for the right to purchase a card. Here's me thinking I'd easily get a 3090 FE. Scalpers are going to have a field day with this.
Doesn't bother me too much if I don't snag one as I'm waiting on the Zen 3 announcement before deciding on a system to build, but still...
Trog... yeah, nothing can at 4K... based on the engine I'm wondering if it's even possible at Ultra. Don't think it's a slight on the 3080... the 2080 Ti can only manage 30 frames per second... more of a statement that it's basically a PC killer.
I'll expand on that to make it clearer, though.
Am I wrong in thinking that there's a not-insignificant hidden cost in the energy use of the 30 series? All the reviews seem to only acknowledge the TDP in the context of "yer gonna need a bigger PSU", but I'd think that an extra 100W is going to leave a dent in the average user's power bill too.
Back of the envelope calc (RTX 3080 FE TDP 320W vs RTX 2080 FE TDP 225W):
~100 more watts used for 10 hours a day is an extra ~1 kWh per day; at 30c per kWh, that's over $100 a year more on a power bill than the 2080.
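That back-of-envelope works out like so. Note the units: 100 W for 10 hours is ~1 kWh per day, not 1000 kWh. A quick sketch, assuming the full TDP delta is drawn for 10 hours every day at an assumed 30c/kWh tariff:

```python
# Back-of-envelope: extra yearly power cost of an RTX 3080 vs RTX 2080,
# assuming the full TDP delta is drawn for 10 hours every day.
TDP_3080_W = 320
TDP_2080_W = 225
HOURS_PER_DAY = 10      # assumed heavy-use figure from the post above
PRICE_PER_KWH = 0.30    # assumed tariff in dollars

extra_kwh_per_day = (TDP_3080_W - TDP_2080_W) * HOURS_PER_DAY / 1000
extra_cost_per_year = extra_kwh_per_day * 365 * PRICE_PER_KWH

print(f"{extra_kwh_per_day:.2f} kWh/day extra")    # 0.95 kWh/day
print(f"${extra_cost_per_year:.2f}/year extra")    # $104.03/year
```

So with those (worst-case) assumptions, "over $100 a year" checks out.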
Maybe it doesn't get anywhere near those numbers unless you're crypto mining or running Crysis the whole time, but it still seems like a huge jump for a single generation.
I'd be interested in seeing some kind of deep dive into how many kilowatt-hours an average PC versus the new-gen games consoles might use in a year of average use.
Interesting point, Dan... time will tell, of course. Haven't been able to go super in-depth and see what the power usage has been over time. But I do know that in regular desktop usage it's quiet and doesn't draw much juice.
Also, while interesting academically, I don't think that power draw is necessarily the primary concern of the 3080 desktop user and early adopter.
We're talking about a market segment that commonly straps multiple graphics cards together and installs complex aftermarket liquid cooling alongside delicate electronics, just so the magic smoke stays on the inside.
Oh, for sure, the typical early adopter won't give it a second thought beyond "can my PSU handle it", and it's been designed with that market in mind.
It's the magnitude of the TDP increase that I found remarkable. The 1080 Founders was 180W, the 2080 was 225W, and the 3080 is 320W.
There's obviously a huge performance leap too, but I can't think of any comparable leap between annual computer hardware iterations that made a ~42% jump in power consumption. Historically, I thought it had only been the "Titan" style superclocked versions that punched that high, never the flagship products, but I could be wrong.
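For the record, the generational jumps from those quoted TDP figures can be sanity-checked quickly:

```python
# Percent TDP increase between Founders Edition flagships,
# using the figures quoted above (180W, 225W, 320W).
tdps = [("GTX 1080", 180), ("RTX 2080", 225), ("RTX 3080", 320)]

for (prev_name, prev_w), (name, w) in zip(tdps, tdps[1:]):
    jump = (w - prev_w) / prev_w * 100
    print(f"{prev_name} -> {name}: +{jump:.0f}%")
# GTX 1080 -> RTX 2080: +25%
# RTX 2080 -> RTX 3080: +42%
```

So the 2080-to-3080 step really is ~42%, against ~25% for the previous generation.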
The 'maybe' in my previous post was just me having no idea what kind of numbers a 2080 and 3080 would run at under average load, but I still figured that with those max values it was feasible the real-world difference could still be in the ballpark of an extra 1 kWh a day for a typical user. And if just upgrading a GPU could add $100 a year to an energy bill, that seems worth talking about.
I'm curious whether it's mostly just because it's early days for their 8nm fabrication and the TDP of the revisions will level out quickly as it matures, or whether we might continue to see similar jumps in power consumption going forward. Also, whether it means it's going to take longer than usual to get 3080s into laptops.
@trog Flight Sim seems to be CPU bound -- one thread at 100%.
welp, didn't even stand a chance. tried multiple sites. managed to get to checkout with umart but it shat itself during the payment process.
Pre-ordered a 3080 through PC Case Gear. No ETA, but assuming it'll be a few weeks at least. That's fine with me.
I think 2x 3090 in SLI or whatever they call it now should juuuuuust about do the trick
I've ordered an 8Ware Ultra 8K DisplayPort v1.4 cable, 3m (model# RC-DP38K), from PC Case Gear. Because, yes, I am future-proofing my setup. I currently use a KOGAN 50" TV as my monitor (and so I can watch TV without leaving my chair). Can't afford an 8K TV at the moment (some of them cost more than a CAR 8( ), but I'm so getting...
The difference will be in the core clocks, which are both TBA. Are you going to SLI those 3090s? Because don't. NVIDIA have basically abandoned SLI support for games. I had so many problems with my SLI 1080 Tis that I grabbed a 2080 Ti as a replacement.
Even if you are going SLI, 1600W is probably 600 more watts than you'll need; 1200W if you want to feel extra safe. I recently saw a YouTube video of a Ryzen 3950X and 2080 Ti running stable on a 450W PSU. That's still lower than anyone would recommend, but 1600W is just a waste.
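The usual headroom math behind that kind of recommendation looks something like the sketch below. The component wattages are illustrative assumptions (only the 3080's 320W TDP comes from the thread above), and the 30% headroom factor is a common rule of thumb, not an official figure:

```python
# Rough PSU sizing: sum worst-case component draw, then add headroom.
# All wattages except the 3080 TDP are illustrative assumptions.
components_w = {
    "RTX 3080 (TDP)": 320,
    "high-end desktop CPU": 105,
    "motherboard + RAM + fans": 75,
    "drives + peripherals": 30,
}

total = sum(components_w.values())   # worst-case simultaneous draw
recommended = total * 1.3            # ~30% headroom rule of thumb

print(f"peak draw ~{total} W, PSU ~{recommended:.0f} W recommended")
# peak draw ~530 W, PSU ~689 W recommended
```

Even padded generously, a single-GPU build lands around 700-750W, which is why 1600W looks like overkill outside of dual-GPU setups.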
F*** me that's a lot of power. Just looked at my last PC order (2012 - PC still going strong although I lent it to a mate recently & he popped a new video card in it), and it has a 620W power supply, which I remember thinking at the time was ludicrous.