NVIDIA GeForce RTX 3080 Founders Edition Review
Review By @ 11:04pm 16/09/20


Product: NVIDIA GeForce RTX 3080 Founders Edition
Type: Graphics Card
Price: $1139.00
Availability: September 17

Since its debut earlier this month, NVIDIA has captured the eyes and attention of the gaming world with its new GeForce RTX 30 Series graphics cards. A revolutionary new cooling design, a look so stylish it feels like it belongs somewhere in Tony Stark’s lab, plus talk of performance (in the case of the RTX 3080 reviewed here) that can, in some cases, double that of the previous generation’s GeForce RTX 2080.

The immediate questions we had heading into this review were no doubt the same ones shared by many of you, all stemming from that impressive first showing. Does the NVIDIA GeForce RTX 3080 Founders Edition run cool and quiet? Does it look as slick in person as it does in the black void of a 3D render with awesome lighting? Does it really offer “up to” double the performance of the GeForce RTX 2080?

The answer to all three, spoiler alert, is yes*.

The devil, as they say, is in the details. The asterisk. The RTX 3080 story is nuanced, varied, but always impressive. A true crank-up-the-details 4K powerhouse, the sort of generational leap that usually takes three or four years – not two. When paired with the right games and NVIDIA’s own advances in the field of AI, it also points to a future where the two may become completely intertwined.

From Turing to Ampere




Any review or deep dive into a new piece of tech, especially a graphics card, is an ambient occlusionary tale full of whimsical numbers, charts, and talk about frames, games, and names. Like Ampere, Turing, and GDDR6X. Which we’ll get to – that is, if you can resist the temptation to scroll. Go on, resist.


The RTX 3080 story is nuanced, varied, but always impressive. A true crank-up-the-details 4K powerhouse, the sort of generational leap that usually takes three or four years – not two.



Okay, okay. For the impatient ones out there, the RTX 3080 is around 36% faster than the RTX 2080 Ti (the previous $2,000 AUD flagship from NVIDIA) when it comes to 4K game performance. And it’s 50-60% faster when you factor in ray-tracing. It blows away the RTX 2080 SUPER, the 3080’s like-for-like comparison in terms of price-point and naming.

Ahem. Back to the show.

We’re on the cusp of a technological storm, in that there’s more new gaming hardware due in the next few months than we’re likely to ever see again. Well, for several years at least. From brand-new consoles to new graphics cards, there’s enough to keep us all busy for several days and nights, tinkering away and marvelling at high frame-rates and ray-traced goodness.

The latter of which is already here thanks to NVIDIA’s Turing line of graphics cards, first brought to the scene in 2018 with the GeForce RTX 2080 and GeForce RTX 2080 Ti. The original ‘RTX On’, the 20 series of GPUs from NVIDIA made that tech-dream of real-time ray-tracing a reality.


Without turning this into a ray-tracing primer, it’s worth remembering just how taxing it can be to compute light rays bouncing around a scene – the calculations used to render reflections, shadows, and the way the glow of a sign might impact a nearby wall.

In terms of support and performance, real-time ray-tracing took a while to take off. Hardware appeared on shelves long before the first ray-traced games did, and from there everything from game optimisation to Windows 10 support to NVIDIA’s own AI-based DLSS rendering went through a growth period.


The original ‘RTX On’, the 20 series of GPUs from NVIDIA made that tech-dream of real-time ray-tracing a reality.



The results, though, arrived. Remedy’s Control with DLSS 2.0 is without a doubt one of the most impressive visual feats we’ve seen in a long time. Not to mention DLSS paired with Minecraft RTX. In 2019 and the early parts of 2020, NVIDIA hardware provided these glimpses into the future. And with games like Cyberpunk 2077 and Watch Dogs: Legion coming soon – to help push it all even further and onto the next generation of consoles – there’s plenty of room for a sequel.

This is where Ampere comes in, the architecture that powers NVIDIA’s RTX 30 series. Think of it as the second generation of RTX – the PlayStation 2 to the original PlayStation. Where we went from the sheer marvel of 3D gaming to something that presents it in a way we hadn’t seen before. Ampere features second generation RT Cores (NVIDIA’s dedicated ray-tracing hardware) and third-generation Tensor Cores (the AI stuff that makes DLSS the perfect match for hardware intensive ray-tracing in addition to simply boosting performance).

But these are only two of the RTX 3080 stars. There’s also cutting-edge memory in the form of the fast 10GB of GDDR6X developed in partnership with Micron, the new custom Samsung 8nm process, and a re-designed board that does a ‘Select All’ to a few of the RTX 2080 numbers and multiplies them by two. Plus, a new way to deliver power (via a new small 12-pin connector) and the Founders Edition keeping all those silicon bits nice and cool in a stunning minimal and modern design.

Of Numbers and Heat Sinks




With the RTX 20 series, NVIDIA introduced a GPU design that featured stuff like RT Cores and AI-based Tensor Cores – both critical in making real-time ray-tracing a reality. That is, in addition to the CUDA Cores that have driven PC graphics for many years. With the RTX 3080, comparing the spec-sheet numbers to the RTX 2080 should have been a lot easier than comparing the 2080 to the GeForce GTX 1080. It isn’t.

The CUDA Core count in the RTX 3080 is double that of the RTX 2080 Ti, let alone the RTX 2080. Which brings up the question of ‘How in the…’? In addition to the… ‘What the hell are you talking about, Core Boy?’


This is where Ampere comes in, the architecture that powers NVIDIA’s RTX 30 series. Think of it as the second generation of RTX – the PlayStation 2 to the original PlayStation. Where we went from the sheer marvel of 3D gaming to something that presents it in a way we hadn’t seen before.



Now, if you’re looking for a CUDA Core expert, you’ve come to the wrong place – just think of it as a, well, core part of the NVIDIA GPU. Ampere effectively doubles the calculations each of its Streaming Multiprocessors (SM) can do each cycle thanks to more CUDA action and the use of an 8nm Samsung process. Floating Point calculations, Integers, and other things that require a degree in smart-school-studies to understand all happen inside the SM. It’s this doubling-up architectural design of Ampere that has led to NVIDIA calling it the biggest generational leap in its history.
  • Architecture (GPU): Ampere
  • CUDA Cores: 8704
  • RT Cores: 68
  • Tensor Cores: 272
  • GPU Boost Clock: 1710 MHz
  • Memory Capacity: 10GB GDDR6X
  • Memory Interface/Speed: 320-bit/19 Gbps
  • Memory Bandwidth: 760 GB/s
  • TDP: 320W
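If you want to sanity-check a couple of those headline numbers, some quick napkin maths shows how they fall out of the spec sheet. A rough sketch of theoretical peaks only – real-world throughput depends on sustained clocks, thermals, and workload:

```python
# Napkin maths for two RTX 3080 spec-sheet figures. Theoretical peaks only --
# sustained clocks and real workloads land below these numbers.

cuda_cores = 8704            # from the spec sheet above
boost_clock_ghz = 1.710      # GPU Boost Clock: 1710 MHz
ops_per_core_per_clock = 2   # a fused multiply-add counts as two FP32 operations

# Peak single-precision compute: cores x ops-per-clock x clock speed
fp32_tflops = cuda_cores * ops_per_core_per_clock * boost_clock_ghz / 1000
print(f"Peak FP32 compute: ~{fp32_tflops:.1f} TFLOPS")   # ~29.8 TFLOPS

bus_width_bits = 320         # Memory Interface: 320-bit
speed_gbps_per_pin = 19      # Memory Speed: 19 Gbps

# Bandwidth: bus width in bytes x per-pin data rate
bandwidth_gb_s = bus_width_bits / 8 * speed_gbps_per_pin
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")    # 760 GB/s, matching the spec sheet
```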

No doubt that’s pure tech talk; in terms of what it all means and can lead to in the games we play – well, that’s what makes the RTX 3080 such an exciting release. Paired with DirectX 12 Ultimate, the future of high-end games won’t simply need more power to get better results – Variable Rate Shading, seen in the likes of Wolfenstein and Gears 5, offers up notable performance increases with no loss in visual quality. A technology that will also make its way to the Xbox Series X.

DLSS 2.0 (NVIDIA’s hugely impressive AI upscaling) also benefits from the new tech in the RTX 30 series. It’s now possible to play a game with ray-tracing turned on, in 4K, and with no loss in picture quality – at over 60 frames-per-second. When RTX first appeared, you needed a 2080 Ti to hit that at 1080p – which is roughly a quarter of the pixels. Of course, since those first RTX moments the implementation has improved – an RTX 2060 SUPER can handle a bit of ray-traced Control, albeit at a lower resolution and settings. But 4K 60 gaming with ray-tracing is a thing thanks to the RTX 30 series.

Improvements and new features have arrived alongside the RTX 30 series too – like RTX IO, which uses Ampere and the brand-new DirectX DirectStorage tech to drastically lower loading times. As with the 2080 launch, this is another case of NVIDIA future-proofing – at least until game support is added. Elsewhere, NVIDIA Reflex already improves latency in games like Fortnite and Apex Legends, and RTX Broadcast turns any space into a streaming setup with AI-powered voice and audio tech (that can go so far as to remove the background noise of a hair-dryer or lawn-mower) and green-screen-free green-screen effects. All leveraging a mix of software and RTX hardware.

Game Performance




Strip all of that away and the modern PC graphics card is what we all use to play games on a gaming PC. And that’s exactly what we did, running a suite of games in 4K with detail settings dialled up to 11, using the following gear.
  • GPU: NVIDIA GeForce RTX 3080 Founders Edition
  • CPU: AMD Ryzen 7 3800X
  • Motherboard: MSI MEG X570 UNIFY
  • Memory: HyperX FURY DDR4 RGB (32GB at 3600 MHz)


Right off the bat, in games where the RTX 2080 Ti struggled to maintain 4K 60-fps with max detail settings, the RTX 3080 was more than capable. Assassin’s Creed Odyssey, a modern-day Crysis in how its Ultra settings add so many performance-killing extras, maintains 60-fps on the RTX 3080. Same story with Borderlands 3. Compared to the 2080 SUPER, this Assassin’s Creed performance is over 50% faster. Compared to the RTX 2080 Ti, it’s a 33% increase.

The biggest surprise though was the massive jump in Horizon Zero Dawn performance, which has improved quite a bit on its own thanks to recent patches and updates from Guerrilla Games. Even taking all of that into account, the RTX 3080 performs 88% faster than the RTX 2080 SUPER and 48% faster than the RTX 2080 Ti.

No doubt these are generational improvements the likes of which we haven’t seen before. The RTX 3080 has no issue running DOOM Eternal in 4K at 144fps. Even this, perhaps the most well-optimised game currently playable on PC, runs over 50% faster than on the RTX 2080 Ti. Yeah, insane.

Here are the same games running in 1440p, with detail settings maxed.



Without a doubt the RTX 3080 is a 4K card – its HDMI 2.1 output can even support up to 8K with HDR. That said, several high refresh-rate monitors and UltraWide displays are QHD or 1440p – offering up low latency, G-Sync compatibility, and other features tailor-made for gaming. On that note, and on a slight tangent, we recently received a review sample of the Samsung Odyssey G7, a QHD 1440p QLED display with a 240 Hz refresh rate. What high-end card can even push that many frames at 1440p without sacrificing visual quality? Well, DOOM Eternal’s answer is the GeForce RTX 3080.

At 1440p though, we get to see bottlenecks compared to the 2080 Ti. With DirectX 11, DirectX 12, different engines, optimisations, CPU usage, and other factors, games don’t always run as intended, or expected. Games like Star Wars Jedi: Fallen Order and Far Cry: New Dawn only offer single-digit improvements at 1440p over the RTX 2080 Ti – 2% in the case of Jedi.

But this isn’t really a downside. It points to a 1440p-to-4K drop-off that is, well, incredible on the RTX 3080. The worst-case 1440p example, Star Wars Jedi: Fallen Order, is a 4K success story, with the RTX 3080 running that great slice of Star Wars action 43% faster than the RTX 2080 Ti in 4K. A testament to Ampere’s design and the overall improvements the RTX 30 series can bring.

Ray Tracing and DLSS




Speaking of which, the second-generation RT Cores and the third-generation Tensor Cores paint a tasty picture for what will no doubt be the most talked-about game this holiday season – CD Projekt RED’s Cyberpunk 2077, a game that will feature ray-tracing and DLSS support. So seeing the RTX 3080 double the RTX 2080’s ray-tracing capabilities at the higher resolutions of 1440p and 4K has been one of those ‘rub the eyes to make sure the numbers are real’ moments. In some cases, the RTX 3080 shows a higher than 100% improvement over the last generation. The asterisk has arrived.


Even compared to the RTX 2080 Ti, which is still very capable in the ray-tracing department, the improvement on average is around 50%. Minecraft with RTX and Quake II RTX, titles that offer full path tracing, show a near-identical 60% or so improvement over the RTX 2080 Ti – a good indicator of the ray-tracing capabilities of the RTX 30 series. And with RTX-supported titles like Cyberpunk 2077, Watch Dogs: Legion, and Call of Duty: Black Ops Cold War on the horizon, no doubt the demand for the RTX 3080 and other cards in the series will only increase.

Seeing Control at 1440p hit 100 frames-per-second with all RTX details on high and DLSS set to Quality (which results in a better-looking image than native) is simply jaw-dropping. Maintaining 60 at 4K? Equally impressive. DLSS, NVIDIA’s secret weapon, provides such an incredible performance boost that something like Death Stranding, without any sort of ray-tracing, can run over 30 frames-per-second faster with a more detailed image.

And It Looks Good Doing it Too




Make no mistake, the 20 series is still more than capable of powering modern PC games. The RTX 3080 though, well, that’s a different story. It’s the most impressive bit of gaming tech in years, and it does all of this whilst looking like something designed and sent back from the distant future. Perhaps an alternate dimension or reality where humanity has solved all the world’s problems and is instead spending its time designing the coolest tech that can play retro videogames from the year 2020.


It’s now possible to play a game with ray-tracing turned on, in 4K, and with no loss in picture quality – at over 60 frames-per-second.



The GeForce RTX 3080 Founders Edition is gorgeous. It’s slick and minimal, with fins in between the fans on either side. It features an exhaust on the back underneath the HDMI and DisplayPort connectors, with cool air drawn into your PC case through the top fan on its way out via the system fan. The Founders Edition design is something of a radical departure for NVIDIA and the industry, so it’s great to report that it not only works but does so in style. The RTX 3080 idles in the low 40s (that is, 40-45 degrees Celsius), with both fans turned off for silent desktop usage. When heating up it remains quiet, even when temperatures top out at around 75 degrees.

Now, 75 degrees may not sound all that cool, but if you tweak the fan curve with a third-party app you can easily keep the RTX 3080 Founders at around 60-65 degrees without getting anywhere near 100% fan speed. NVIDIA has created not only one of the most striking and stylish bits of tech in 2020 – gaming or otherwise – it’s created an engineering marvel to boot. The only real downside is that the 8-pin to 12-pin adapter sits horizontally, blocking some of the view. Of course, partner cards will feature more traditional designs, with their own high-end cooling techniques, better support for overclocking (stay tuned to AusGamers on that front), and RGB lighting.

But there’s just something about the look of the GeForce RTX 3080 Founders Edition that feels next-gen. Getting to put it through its paces (using out-of-the-box settings with no tweaks) has not only been a treat, but somewhat cathartic. While we’ve been waiting months for next-gen console news, previews, and early looks at game demos, NVIDIA has announced its next-generation GeForce RTX line and released it in the space of a few weeks. And when it comes to that next-gen feel, the RTX 3080 sets the bar so high across look, feel, thermals, and in-game performance that we’re inclined to call a victory before the battle has even begun.
What we liked
Gorgeous design
That is also functional, with great cooling and quiet operation
Incredible 4K performance
A true high-res ray-tracing card
A massive generational leap over the RTX 20 series
What we didn't like
8-pin to 12-pin adapter covers some of its beauty
320W TDP is even higher than the 2080 Ti's
Faster than a 2080 Ti, but even still, Microsoft Flight Simulator on Ultra settings at 4K 60 isn't a thing yet (a PC killer, that one)
We gave it:
10.0
OUT OF 10
Latest Comments
fpot
Posted 04:27am 17/9/20
Looks pretty great. I'm starting to think the demand for the 3090 won't be too high and I may actually get one. Everyone will want a 3080 and later the 3070.
Hogfather
Posted 07:28am 17/9/20
The 3rd party benchmarks are coming in and look to be largely in line with nvidia numbers.

3080 is up to about twice as fast as 2080, more likely about 80% faster. Up to 40% faster than 2080ti and much cheaper.

It's worth noting that lots of benchmarks have improved across the board from the 2000 launch, as CPU and pci bus bottlenecks have actually started to matter across these generations. Be careful because you may get more bang for your buck from a cpu-mobo upgrade and sitting on say a 2070 card if you've ignored it for a few generations.

I've decided to sit this launch out and wait for the Ti or a 3090 partner card. There's just too much hype and real value here, which means stocks will be low and real prices will be spectacular for a while.
trog
Posted 09:11am 17/9/20
"Doesn't run Microsoft Flight Simulator on Ultra settings at 4K 60"


I assume this means it's just not powerful enough to do it? Is there anything available that can do it yet, or is that still a future hardware thing?
fpot
Posted 09:17am 17/9/20
The engine is a bottleneck ridden mess. Apparently the recent patch which is seconds away from being finished improves things performance-wise. It also breaks controller sensitivity settings which sucks because the default sensitivity is stupidly high. It's also CPU bound. My GPU rarely hits 100% usage while my CPU gets thrashed. Hopefully Zen 3 will fix that.
Jimmy
Posted 09:23am 17/9/20
I've noticed Mwave have gone from a first-in-first-served system for purchasing the FE cards to a raffle... for the right to purchase a card. Here's me thinking I'd easily get a 3090 FE. Scalpers are going to have a field day with this.
Doesn't bother me too much if I don't snag one as I'm waiting on the Zen 3 announcement before deciding on a system to build, but still...
KostaAndreadis
Posted 10:06am 17/9/20
Trog... yeah nothing can at 4K... based on the engine I'm wondering if it's even possible at Ultra. Don't think it's a slight on the 3080... the 2080 Ti can only manage 30 frames-per-second... more of a statement that it's basically a PC killer

I'll expand on that to make it more clear though
Dan
Posted 11:21am 17/9/20
Am I wrong in thinking that there's a not insignificant hidden cost in the energy use of the 30 series? All the reviews seem to only acknowledge the TDP in the context of "yer gonna need a bigger PSU", but I'd think that an extra 100W is going to leave a dent on the average user's power bill too.

Back of the envelope calc (RTX 3080 FE TDP 320W vs RTX 2080 FE TDP 225W):
~100 more Watts used for 10 hours a day is an extra ~1kWh per day (~350kWh a year); at 30c per kWh, that's over $100 a year more on a power bill than the 2080.
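Or as a quick script with the same assumptions baked in (worst case -- assumes the full TDP gap for every hour of use):

```python
# Rough extra running cost of an RTX 3080 over an RTX 2080, using FE TDPs.
# Worst case: assumes both cards sit at full TDP for every hour of use.

tdp_3080_w = 320        # RTX 3080 FE TDP
tdp_2080_w = 225        # RTX 2080 FE TDP
hours_per_day = 10      # generous daily usage estimate
dollars_per_kwh = 0.30  # ~30c per kWh

extra_kw = (tdp_3080_w - tdp_2080_w) / 1000            # 0.095 kW difference
extra_kwh_per_year = extra_kw * hours_per_day * 365    # ~347 kWh a year
extra_dollars = extra_kwh_per_year * dollars_per_kwh   # ~$104 a year

print(f"~{extra_kwh_per_year:.0f} kWh extra per year, roughly ${extra_dollars:.0f} on the bill")
```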

Maybe it doesn't get anywhere near those numbers unless you're crypto mining or running crysis the whole time, but still seems like a huge jump for a single generation.

I'd be interested in seeing some kind of deep dive into how many kilowatt-hours an average PC versus the new-gen games consoles might use in a year of average use.
Hogfather
Posted 01:31pm 17/9/20
> Maybe it doesn't get anywhere near those numbers unless you're crypto mining or running crysis the whole time

Maybe?
KostaAndreadis
Posted 01:45pm 17/9/20
Interesting point Dan... time will tell of course. Haven't been able to go super in-depth and see what the power usage has been over time. But I do know that in regular desktop usage it's quiet and doesn't drain the juice.
Hogfather
Posted 02:39pm 17/9/20
Also while interesting academically, I don't think that power draw is necessarily the primary concern of the 2080 desktop user and early adopter.

We're talking about a market segment that commonly straps multiple graphics cards together and installs complex aftermarket liquid cooling alongside delicate electronics just so the magic smoke stays on the inside.
Dan
Posted 06:05pm 17/9/20
Oh for sure, the typical early adopter won't give it a second thought beyond "can my PSU handle it", and it's been designed with that market in mind.

It's the magnitude of the TDP increase that I found remarkable. The 1080 Founders was 180W, the 2080 was 225W, and the 3080 is 320W.

There's obviously a huge performance leap too, but I can't think of any comparable leap between annual computer hardware iterations that made a ... 42%? jump in power consumption. Historically, I thought it had only been the "titan" style superclocked versions that punched that high, never the flagship products, but I could be wrong.

The 'maybe' in my previous post was just me having no idea about the kind of numbers a 2080 and 3080 would run at when under average load, but I still figured that with those max values it was feasible the middle difference could still be in the ballpark of extra 1kwh a day for a typical user. And if just upgrading a GPU could also add $100 a year to an energy bill that seems worth talking about.

I'm curious whether it's mostly just because its early days for their 8nm fabrication and the TDP of the revisions will level out quick as it matures, or whether we might continue to see similar jumps in power consumption going forward. Also, whether it means it's going to take longer than usual to get 3080s into laptops.
Nightblade
Posted 10:14pm 17/9/20
@trog Flight Sim seems to be CPU bound -- one thread on 100%.
ravn0s
Posted 11:42pm 17/9/20
welp, didn't even stand a chance. tried multiple sites. managed to get to checkout with umart but it shat itself during the payment process.
Murf
Posted 01:04am 18/9/20
Pre-ordered a 3080 through PC Case Gear. No ETA, but assuming it'll be a few weeks at least. That's fine with me.
Fr33kSh0w2012
Posted 10:57am 18/9/20
I think 2x 3090 in SLI or whatever they call it now should juuuuuust about do the trick
Fr33kSh0w2012
Posted 11:36am 18/9/20

I've ordered a Model# RC-DP38K 8Ware Ultra 8K DisplayPort v1.4 Cable 3M from PC Case Gear because, yes, I am future-proofing my setup. I currently use a KOGAN 50" TV screen for my monitor (and so I can watch TV without leaving my chair). Can't afford an 8K TV at the moment (some of them cost more than a CAR 8( ). I'm so getting:

Corsair AX1600i Digital Titanium Modular 1600W Power Supply
MSI GeForce RTX 3090 Gaming X Trio 24GB

What screws me up is literally what is the difference between the

MSI GeForce RTX 3090 Gaming Trio 24GB
MSI GeForce RTX 3090 Gaming X Trio 24GB

They are Identical I tell you:

https://www.msi.com/Graphics-card/GeForce-RTX-3090-GAMING-X-TRIO-24G/Specification

https://www.msi.com/Graphics-card/GeForce-RTX-3090-GAMING-TRIO-24G/Specification

So what gives?


fpot
Posted 05:10pm 18/9/20
The difference will be in the core clocks which are both TBA. Are you going to SLI those 3090s? Because don't. Nvidia have basically abandoned SLI support for games. I had so many problems with my SLI 1080tis that I grabbed a 2080ti as a replacement.

Even if you are going SLI 1600W is probably 600 more watts than you'll need. 1200W if you want to feel extra safe. I recently saw a youtube video of a Ryzen 3950x and 2080ti running stable on a 450W PSU. That's still lower than anyone would recommend but 1600W is just a waste.
trog
Posted 12:10pm 22/9/20
F*** me that's a lot of power. Just looked at my last PC order (2012 - PC still going strong although I lent it to a mate recently & he popped a new video card in it), and it has a 620W power supply, which I remember thinking at the time was ludicrous.