I’m skipping the Nvidia GeForce RTX 40-series — here’s why

The Nvidia GeForce RTX 4080 (Image credit: Nvidia)

The first three Nvidia GeForce RTX 40-series graphics cards were officially unveiled last week (September 20) and I haven’t been able to stop thinking about them. Not because I’m excited about the prospect of shiny new graphics cards, but because of my disappointment with the GPUs. While I understand Nvidia’s need to stay ahead of its competitors (namely, AMD and Intel), it’s hard for me to see who these cards are for beyond the ultra-hardcore PC gamer who absolutely must have the latest tech in their gaming rig.

I realize I’m a week late to the party since I was on vacation when Nvidia revealed its new Lovelace cards, but I needed to air my thoughts, as I’ve seen similar sentiments from like-minded individuals. While the RTX 40-series looks to be the company’s most powerful GPU line yet, these cards don’t seem like components most people need to rush out and buy.

I already couldn’t get excited about the Nvidia GeForce RTX 40-series cards. Now, I’m flat-out skipping this GPU generation and sticking with my RTX 3080 Ti. Here’s why.

The Nvidia GeForce RTX 30-series cards are still solid GPUs

Most of the conversation about the Nvidia RTX 30-series has centered around how unavailable and overpriced they've been since launch. We can blame this on the global pandemic and semiconductor shortage, but scalpers and crypto miners didn’t make things any better. The only way I was able to snag an RTX 3080 Ti without waiting months was to buy a pre-built PC with the elusive card… though the Ampere GPU still cost $200 more than retail.

These woes aside, the RTX 30-series was and is a fantastic GPU line that’s capable of running graphically demanding games at high resolutions and frame rates. The high-end RTX 3080 and RTX 3090 cards have little difficulty running games like Cyberpunk 2077 and Red Dead Redemption 2 at max settings, and even “mid-tier” GPUs like the RTX 3060 and RTX 3070 can play most modern games at medium to high settings without a hitch.


The current RTX 30-series cards are still powerful enough to handle even the most demanding modern video games. (Image credit: Nvidia)

The pandemic set the gaming industry back by at least two years. Because of that, Ampere GPUs will likely remain viable for longer. Developers will continue optimizing their games for Nvidia’s last-gen GPUs even with the 40-series cards out in the wild. And considering that RTX 30-series cards are now more readily available, they will likely be the de facto Nvidia cards for quite some time. I don’t see 8K gaming becoming mainstream anytime soon, after all.

The price isn’t right

Graphics cards aren’t the most affordable PC components, but the pricing for the RTX 40-series has upset a lot of prospective buyers. At launch, the flagship RTX 4090 will cost an eye-watering $1,599, with the RTX 4080 16GB coming in at $1,199 and the RTX 4080 12GB at $899.


We didn’t expect the RTX 40-series to be cheap, but these cards each cost hundreds of dollars more than their last-gen equivalents. (Image credit: Nvidia)

Admittedly, the RTX 4090 is $400 cheaper than the $1,999 RTX 3090 Ti that launched earlier in 2022. However, it’s $100 more than its last-gen equivalent, the RTX 3090. Nvidia claims the 4090 is two to four times more powerful than the RTX 3090 Ti, which would seemingly make it a great deal since it’s more affordable. Still, $1,599 is a steep price to pay, especially given the current state of the world economy.

I’m not a fan of the RTX 4090’s price, but I can almost forgive it when compared to the pricing for the two RTX 4080 GPUs. At $1,199 for the 16GB version and $899 for the 12GB edition, the GPUs cost $400 and $200 more than the RTX 3080 12GB and 10GB, respectively. Yes, the new cards have more VRAM (memory), but as far as I’m concerned, you’re better off buying a cheaper last-gen card that can still run games at high or max settings.
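
To put those generational jumps in one place, here’s a quick back-of-the-envelope sketch in Python that tallies the launch-price deltas. The last-gen prices ($1,499, $799 and $699) are the ones implied by the comparisons above, not figures pulled from anywhere else.

```python
# Launch MSRPs cited (or implied) in this article, in US dollars.
launch_prices = {
    "RTX 4090": 1599, "RTX 3090": 1499,
    "RTX 4080 16GB": 1199, "RTX 3080 12GB": 799,
    "RTX 4080 12GB": 899, "RTX 3080 10GB": 699,
}

# Each new card paired with its closest last-gen equivalent.
pairs = [
    ("RTX 4090", "RTX 3090"),
    ("RTX 4080 16GB", "RTX 3080 12GB"),
    ("RTX 4080 12GB", "RTX 3080 10GB"),
]

for new, old in pairs:
    delta = launch_prices[new] - launch_prices[old]
    print(f"{new}: ${launch_prices[new]:,}, a ${delta} premium over the {old}")

# RTX 4090: $1,599, a $100 premium over the RTX 3090
# RTX 4080 16GB: $1,199, a $400 premium over the RTX 3080 12GB
# RTX 4080 12GB: $899, a $200 premium over the RTX 3080 10GB
```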

The GeForce RTX 4080 12GB is borderline insulting 

I have to single out the 12GB version of the RTX 4080 since it’s the one most people (myself included) have the biggest issue with. Nvidia is marketing the RTX 4080 as a single card with two variants, which might make you think they’re identical except for the 4GB difference in memory. But there are other differences that make the 12GB card seem, spec-wise, more like what the RTX 4070 should be.

Based on the specs Nvidia provided, the 12GB RTX 4080 has 7,680 CUDA cores compared to the 9,728 cores found in the 16GB model. This means the $899 Lovelace card won’t perform as well as the $1,199 version. In addition, the 12GB variant has a narrower memory bus and a lower power draw. The latter isn’t a bad thing, since the card will draw less power, but a narrower bus means less memory bandwidth, so the card can’t move as much data to and from its memory as the 16GB model.
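
To make the bus-width point concrete, here’s a rough sketch of the theoretical peak memory bandwidth for both variants. The bus widths come from Nvidia’s published specs, while the effective memory speeds (22.4 Gbps and 21 Gbps) are the commonly reported GDDR6X figures rather than anything stated above, so treat the exact numbers as an approximation.

```python
# Theoretical peak memory bandwidth:
#   bandwidth (GB/s) = (bus width in bits / 8) * effective memory speed (Gbps)

def memory_bandwidth_gbs(bus_width_bits: int, mem_speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and memory speed."""
    return (bus_width_bits / 8) * mem_speed_gbps

# (bus width in bits, commonly reported effective GDDR6X speed in Gbps)
cards = {
    "RTX 4080 16GB": (256, 22.4),
    "RTX 4080 12GB": (192, 21.0),
}

for name, (bus, speed) in cards.items():
    print(f"{name}: {memory_bandwidth_gbs(bus, speed):.1f} GB/s")

# RTX 4080 16GB: 716.8 GB/s
# RTX 4080 12GB: 504.0 GB/s
```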


The RTX 4080 12GB is less powerful than the 16GB model. It might as well be the RTX 4070, spec-wise. (Image credit: Nvidia)

Though I can somewhat applaud Nvidia for releasing a Lovelace card that costs less than $1,000, it’s hard to ignore the fact that the RTX 4080 12GB is a significantly different GPU from the 16GB edition. I worry that unsuspecting buyers will opt for the cheaper card, thinking they’re only giving up memory.

I’ll just say it bluntly: do not buy the RTX 4080 12GB GPU if you’re in the market to upgrade.

Ready to skip the Nvidia GeForce RTX 40-series

I wasn’t exactly sold on the idea of the RTX 40-series when rumors first began percolating. At the time, Ampere cards were notoriously difficult to find. As such, I questioned the need for the new Lovelace GPUs. Now that Nvidia has officially unveiled its latest graphics cards, I’m even less convinced.

If I had a GTX 10-series or RTX 20-series GPU, the upgrade would be worth it, even at such steep prices. But given that the RTX 3080 Ti currently residing in my gaming rig delivers 4K 60 fps experiences, I don’t see why I’d need a Lovelace card. Even as someone who wants the latest tech, it’s not worth it.

If you’re in the same position as me, it might be best to skip the RTX 40-series line and wait for the next Nvidia GPU generation, or perhaps see if AMD finally releases a GPU Nvidia can truly be scared of.

Tony Polanco
Computing Writer

Tony is a computing writer at Tom’s Guide covering laptops, tablets, Windows, and iOS. During his off-hours, Tony enjoys reading comic books, playing video games, reading speculative fiction novels, and spending too much time on X/Twitter. His non-nerdy pursuits involve attending Hard Rock/Heavy Metal concerts and going to NYC bars with friends and colleagues. His work has appeared in publications such as Laptop Mag, PC Mag, and various independent gaming sites.

  • KevBacon
    Forget the 30 series, for me personally it's not even worth upgrading from my Strix 2080Ti. I haven't seen any solid numbers yet but even with another leap in performance I think you'd have to be just wanting even higher frames in 4K to need the upgrade. Now I'm only playing in 1440p but with a 240Hz monitor and my 2080Ti can easily handle anything I throw at it. I'm not even upset about the price as a lot of people are. I paid almost $1,500 a few years ago for my 2080Ti. If you want anything other than those hideous stock cards you're going to be paying a decent amount. If these new cards do have the rumored performance then I think the price is a fair jump. I think the number of people that actually need the jump in performance is small. Apparently they're pushing the next gen DLSS and Ray Tracing improvements. I don't have the need for DLSS, Ray Tracing is nice but for me the number of games I've played that support it is so small. So overall I think the number of people crying about the price is much larger than the number of people who would actually buy the card for the improvements regardless of price.
  • Valkyr09
    Why are you trying to get it now? It's way overpriced. I'm waiting for 2023. The prices will drop unless we have world war 3
  • kato128
    Tbh I'm still rocking a 1080ti and I'm considering holding off yet another generation. Really wanted a 3080 but the heavens weep at the price. Guess we'll see what AMD and Intel have on offer but my bet is the 1080ti will soldier on
  • Matk2662
    I knew I'd be skipping when I bought the 3090 almost 2 years ago. I did the same when I bought the 1080ti. I skipped the 2080ti.
    The 4090 has very few real-world uses over the 3090. High-end GPUs are now in the world of diminishing returns. Humans have limited perception, and as such, many of the insane laboratory figures we are being told about cannot be perceived by humans. For example, very high frame rates are totally invisible to the human eye. My 3090 is actually on about the fringes of human limits. The 4090 goes way beyond them.
    Let's be totally honest and say that the vast majority of those upgrading will only be doing so for the bragging rights. Less than 4% game at 4K on PC. HDR use is virtually nonexistent. HDR 400 and 600 don't count. It's awful.
    The 3080 is overkill for gaming at less than 4K. Yet many will upgrade just to keep up with the Joneses and post inane photos of their new toy.
    The 4090 doesn't actually do anything new apart from DLSS 3. A cynic may say that Nvidia even rigged the deck with that one.
  • Folly_Polymath
    Why is this an article? Why did some idiot get paid to write this? There have been editorials decrying new tech and generations of CPUs/GPUs for 20+ years, and the rationale is always, "I don't need this right now." Cool moron, we don't need to know that you don't need new tech you haven't bought. Just go back and look at all the responses to solid state storage 15 years ago, when the same morons would comment "Nah, I'll stick with my Raptor." I have a 3080 and a Samsung Odyssey and the card struggles with ray tracing at that resolution, so I'm the customer for an upgrade. I'm not writing an article "Why you NEED a 40 series GPU."
  • phxrider
    I'm skipping it at least until I see what AMD comes out with when they release the Radeon RX7000 series.