[1]
NVIDIA GeForce RTX 5090 FE review: the new king of graphics cards for creatives
It's finally here. After much anticipation, the NVIDIA GeForce RTX 5090 FE, the first NVIDIA GeForce card of the full-on AI era, has arrived. Promising a paradigm shift in graphics processing, the 5090 FE (the FE stands for Founders Edition) is the flagship of an upcoming line of 50-series graphics cards from NVIDIA, with more power, more efficiency and more AI-ready features to take creative pros' work further than ever - at least, that's the pitch. But will it vault right to the top of the list of best graphics cards on the market, or have we been waiting in vain? I have had one for a couple of weeks to find out.

The NVIDIA GeForce RTX 5090 FE is a graphics card, which means it's essentially a big blocky block of Tech Stuff. It's incredibly solid and feels dense and sturdy, as it should. But there are notable differences from the previous generation of cards. Slightly longer than the 4070 Ti Super that previously occupied its slot in my tower PC, it's also slightly thinner, including the port panel, meaning there's now a small gap in the back of the tower where there used to be none. I should be able to cover that by reattaching one of the back-panel bars, though. The power socket is very cleverly inserted at an angle to minimise the protrusion of the power cable for users with more compact towers than my admittedly generous-sized one. Switch it on, and the GEFORCE RTX logo on the side lights up, which is of course a vital detail for the aesthetically minded among us. I really like it, as it's a touch of flash without being too garish.

The back of this sleek Founders Edition has four ports: three DisplayPort and one HDMI. You'll be entirely unsurprised to find the DPs each support up to 4K 480Hz or 8K 165Hz with DSC, while the HDMI supports up to 4K 480Hz or 8K 120Hz with DSC, Gaming VRR, and HDR.

The big new feature in the 50-series cards is the all-new Blackwell architecture, which succeeds the Ada Lovelace of the 40-series and comes with more AI features than before. Pure core counts are up dramatically across the board, as you can see in the specs sheet above, but the memory increase may be smaller than anticipated: this flagship card comes with 32GB. Why not more? As we found out during NVIDIA's Editors Day at CES 2025, it may be because instead of just going for more power at any cost, the increased AI-fication has led to vastly improved efficiency in how the card distributes and uses its power. So with a card that draws a maximum wattage about 25% higher than the previous flagship, NVIDIA claims up to 70% higher performance for creatives (and in one specific case of 4-2-2 camera video rendering, a staggering 11 times more efficient processing). All this for the same price the 4090 set you back last year.

We're getting closer to the Performance bit... NVIDIA's proprietary CUDA cores are central to the card's graphics prowess; in fact, for many in the 3D arena, the number of CUDA cores on a card was long the primary criterion for choosing a GPU. In addition to those, we have the increasingly important Tensor Cores. Both are processing units, and broadly speaking, the more you have, the faster things run.
Where CUDA cores run general graphics and 3D rendering well, they are not nearly as efficient at neural processing, AI and "deep" processing tasks as Tensor cores. The 5090 FE boasts 3,352 AI TOPS of Tensor performance, more than double the 4090's 1,321, while you'll find 33% more CUDA cores here (21,760) than in the 4090 (16,384). And it's those Tensor cores that are at the heart of the big innovations here, including DLSS 4, Neural rendering, Neural shaders, and much more.

If you're buying the 5090 FE card, you expect absolute top-level performance across the board, and I'm happy to report that it duly delivers exactly that. Name the GPU-intensive task, and this thing produces numbers I've literally never seen before. The biggest gains are found in applications such as DaVinci Resolve and the Flux image generator. The DaVinci Resolve overall score is at least 30% above anything I've seen from any 40-series card, setting a new, well, benchmarking benchmark across the board. I also used 4-2-2 camera video assets to run a concurrent render of 9 video streams, which it completed faster in real time than my 4070 Ti Super managed a single render. I estimated about a 9.8x increase in speed, thanks to the new Blackwell architecture, which can clearly run render tasks like this much faster than the Ada Lovelace-equipped 40-series. Video editors, the world lies at your feet now.

The overall Photoshop score is perhaps a little less dramatically higher than previous gens, but it's in Flux where we see staggering differences. Although Flux is much more memory-intensive software than Stable Diffusion, which we usually use for our AI image-gen benchmarks, the 5090 rattled through it, generating images faster in Flux than I've seen anything do in the much lighter Stable Diffusion 1.5 test. Finally, traditional Geekbench 6 OpenCL testing gave me a mind-melting 376,919 points: 70% more than my 4070 Ti Super, 40% faster than most 4090 tests, and indeed higher than pretty much anything ever run through Geekbench. And this card is apparently not even optimised for that software yet.

3D processing is also much, much faster than before, thanks to the cohort of innovations on board here. Software support is obviously still being rolled out, but by the end of the year, almost every notable piece of 3D or video software should have full support for the Blackwell suite of features. As such, some of our usual benchmark tests aren't yet compatible with the 50-series cards, but we will add them to this review as and when we can run them. And although we're not a gaming site, I am a gamer (as many of you will be), and using this card in a game like Cyberpunk 2077, with the graphics turned up to ultra in 4K resolution and DLSS 4 switched on, I consistently averaged over 200 frames per second without the card seemingly breaking a sweat. What a time to be alive. Also, extra kudos to NVIDIA for releasing it with a remarkably stable and bug-free driver on the first go. Other companies, take note.

At almost $2,000, the NVIDIA GeForce RTX 5090 FE is as far removed from a casual card as you can get. This is a card for Serious Users with Serious Requirements for their Serious Work (and play), and the price reflects that.
With DLSS 4 support, neural rendering, RTX Mega Geometry and a host of other innovations on board, this is the new top choice for 3D modellers, game developers and graphic designers, and the vastly increased video-editing and rendering efficiency (especially if you're using the content-creator-friendly 4-2-2 cameras that are getting more popular every week) will make this the go-to card for video editors too. Photo editors will also see a big uptick, but they might want to stick around for the incoming 5080 or 5070 cards, as they probably won't need the absolute top-level card in this lineup.
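For readers who want to check the arithmetic behind those spec claims, here is a minimal Python sketch using only figures quoted in this review; the helper function is my own illustration:

```python
# Sanity-check of the gen-on-gen spec claims quoted in this review.
# All inputs are figures cited in the text; nothing here is newly measured.

def pct_increase(new: float, old: float) -> float:
    """Percentage increase of `new` over `old`."""
    return (new / old - 1) * 100

cuda_5090, cuda_4090 = 21_760, 16_384   # CUDA core counts
tops_5090, tops_4090 = 3_352, 1_321     # AI TOPS (Tensor throughput)

print(f"CUDA cores: +{pct_increase(cuda_5090, cuda_4090):.1f}%")  # ~32.8%, i.e. about a third more
print(f"AI TOPS:    +{pct_increase(tops_5090, tops_4090):.1f}%")  # ~153.7%, i.e. more than 2.5x
```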
[2]
Nvidia GeForce RTX 5090: the supercar of graphics cards
The Nvidia GeForce RTX 5090 is a difficult GPU to approach as a professional reviewer because it is the rare consumer product that is so powerful, and so good at what it does, that you have to really examine whether it is actually a useful product for people to buy. Right out of the gate, let me just lay it out for you: depending on the workload, this GPU can get you up to 50% better performance versus the GeForce RTX 4090, and that's not even factoring in multi-frame generation when it comes to gaming, though on average the performance is still a respectable improvement of roughly 21% overall. Simply put, whatever you're looking to use it for, whether gaming, creative work, or AI research and development, this is the best graphics card for the job if all you care about is pure performance.

Things get a bit more complicated if you want to bring energy efficiency into the equation. But if we're being honest, if you're considering buying the Nvidia RTX 5090, you don't care about energy efficiency. This simply isn't that kind of card, and so as much as I want to make energy efficiency an issue in this review, I really can't. It's not intended to be efficient, and those who want this card do not care about how much energy this thing is pulling down -- in fact, for many, the enormous TDP on this card is part of its appeal.

Likewise, I can't really argue too much with the card's price, which comes in at $1,999 / £1,939 / AU$4,039 for the Founders Edition, and which will likely be much higher for AIB partner cards (and that's before the inevitable scalping begins). I could rage, rage against the inflation of the price of premium GPUs all I want, but honestly, Nvidia wouldn't charge this much for this card if there wasn't a line out the door and around the block full of enthusiasts who are more than willing to pay that kind of money for this thing on day one.

Do they get their money's worth? For the most part, yes, especially if they're not gamers but creative professionals or AI researchers. If you're in the latter camp, you're going to be very excited about this card. If you're a gamer, you'll still get impressive gen-on-gen performance improvements over the celebrated RTX 4090, and the Nvidia RTX 5090 is really the first consumer graphics card I've tested that can deliver consistent, high-framerate 8K gameplay even before factoring in Multi-Frame Generation. That marks the RTX 5090 as something of an inflection point, much like the Nvidia RTX 2080 was back in 2018 with its first-of-its-kind hardware ray tracing.

Is it worth it, though? That, ultimately, is up to the enthusiast buyer who is looking to invest in this card. At this point, you probably already know whether or not you want it, and many will likely be reading this review to validate decisions that have already been made. In that, rest easy. Even without the bells and whistles of DLSS 4, this card is a hearty upgrade over the RTX 4090, and considering that the actual price of the RTX 4090 has hovered around $2,000 for the better part of two years despite its $1,599 MSRP, if the RTX 5090 sticks close to its launch price, it's well worth the investment. If it gets scalped to hell and sells for much more than that, you'll need to consider your purchase much more carefully to make sure you're getting the most for your money.
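As a rough way to frame that value question, here's a back-of-the-envelope Python sketch using the figures quoted in this review; the method (performance per dollar) and variable names are my own illustration, not TechRadar's methodology:

```python
# Back-of-the-envelope performance-per-dollar, using numbers quoted in this
# review: ~21% average uplift, $1,599 / $1,999 MSRPs, and a ~$2,000 street
# price for the RTX 4090. Illustrative only.

msrp_4090, msrp_5090, street_4090 = 1_599, 1_999, 2_000
uplift = 1.21   # RTX 5090 performance relative to the RTX 4090

value_vs_msrp = (uplift / msrp_5090) / (1.0 / msrp_4090)
value_vs_street = (uplift / msrp_5090) / (1.0 / street_4090)

print(f"vs 4090 at MSRP:   {value_vs_msrp:.2f}x perf-per-dollar")    # ~0.97x, a slight regression
print(f"vs 4090 at street: {value_vs_street:.2f}x perf-per-dollar")  # ~1.21x, a clear win
```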
Make sure to check out our where to buy an RTX 5090 guide to help you find stock when it goes on sale.

The Nvidia GeForce RTX 5090 goes on sale on January 30, 2025, starting at $1,999 / £1,939 / AU$4,039 for the Nvidia Founders Edition and select AIB partner cards. Overclocked (OC) and other similarly tweaked cards and designs will obviously run higher. It's worth noting that the RTX 5090 is 25% more expensive than the $1,599 launch price of the RTX 4090, but in reality, we can expect the RTX 5090 to sell for much higher than its MSRP in the months ahead, so we're really looking at an asking price closer to the $2,499.99 MSRP of the Turing-era Nvidia Titan RTX (if you're lucky). Of course, if you're in the market for the Nvidia RTX 5090, you're probably not squabbling too much about the price of the card. You're already expecting to pay the premium, especially the first-adopter premium, that comes with this release. That said, this is still a ridiculously expensive graphics card for anyone other than an AI startup with VC backing, so it's worth asking yourself before you confirm that purchase whether this card is truly the right card for your system and setup.

There are a lot of new architectural changes in the Nvidia RTX 50 series GPUs that are worth diving into, especially the move to a transformer AI model for its upscaling, but let's start with the new specs for the RTX 5090. First and foremost, the flagship Blackwell GPU is the first consumer graphics card to feature next-gen GDDR7 video memory, which is substantially faster than GDDR6 and GDDR6X (a roughly 33% increase in Gbps over the RTX 4090). Add in the much wider 512-bit memory interface and you have a total memory bandwidth of 1,790GB/s. This, even more than the increased VRAM pool of 32GB (versus 24GB for the RTX 4090), makes this GPU the first really capable 8K graphics card on the market. 8K textures have an enormous footprint in memory, so moving them through the rendering pipeline to generate playable framerates isn't really possible with anything less than what this card has. Yes, you can, maybe, get playable 8K gaming with some RTX 40 or AMD Radeon RX 7000 series cards if you use aggressive upscaling, but you won't really be getting 8K visuals that'll be worth the effort. In reality, the RTX 5090 is what you want if you want to play at 8K, but good luck finding an 8K monitor at this point. Those are still years away from really going mainstream (though there are a growing number of 8K TVs). If you're settling in at 4K, though, you're in for a treat, since all that bandwidth means faster 4K texture processing, so you can get very fast native 4K gaming with this card without having to fall back on upscaling tech to get you to 60fps or higher.

The clock speeds on the RTX 5090 are slightly lower, which is probably just as well, because the other major top-line specs for the RTX 5090 are its gargantuan TDP of 575W and its PCIe 5.0 x16 interface. The TDP represents a thermal challenge that, according to Nvidia, required major reengineering of the PCB inside the card, which I'll get to in a bit. The PCIe 5.0 x16 interface, meanwhile, is the first of its kind in a consumer GPU, though you can expect AMD and Intel to quickly follow suit. This matters because a number of newer motherboards have PCIe 5.0 lanes ready to go, but most people have been using those for PCIe 5.0 m.2 SSDs. If your motherboard has 20 PCIe 5.0 lanes, the RTX 5090 will take up 16 of those, leaving just four for your SSD.
If you have one PCIe 5.0 x4 SSD, you should be fine, but I've seen motherboard configurations with two or three PCIe 5.0 x4 m.2 slots, so if you've got one of those and you've loaded them up with PCIe 5.0 SSDs, you're likely to see those SSDs drop down to the slower PCIe 4.0 speeds. I don't think it'll be that big of a deal, but it's worth considering if you've invested a lot into your SSD storage.

As for the other specs, they're more or less similar to what you'd find in the RTX 4090, just more of it. The new Blackwell GB202 GPU in the RTX 5090 is built on TSMC's 4N process, the same node family as the RTX 4090's AD102 GPU. The SM design is the same, so 128 CUDA cores, one ray tracing core, and four tensor cores per SM. At 170 SMs, you've got 21,760 CUDA cores, 170 RT cores, and 680 Tensor cores for the RTX 5090, compared to the RTX 4090's 128 SMs (so 16,384 CUDA, 128 RT, and 512 Tensor cores).

So there's a significant change to this generation of Nvidia Founders Edition RTX flagship cards in terms of design, and it's not insubstantial. Holding the RTX 5090 Founders Edition in your hand, you'll immediately notice two things: first, you can comfortably hold it in one hand thanks to it being a dual-slot card rather than a triple-slot one, and second, it's significantly lighter than the RTX 4090. A big part of this is how Nvidia designed the PCB inside the card. Traditionally, graphics cards have been built with a single PCB that extends from the inner edge of the PC case, down through the PCIe slot, and far enough back to accommodate all of the modules needed for the card. On top of this PCB, you'll have a heatsink with piping from the GPU die itself through a couple of dozen aluminum fins to dissipate heat, with some kind of fan or blower system to push or pull cooler air through the heated fins to carry away the heat from the GPU. The problem with this setup is that if you have a monolithic PCB, you can only really extend the heatsinks and fans off of the PCB to help cool it, since a fan blowing air directly into a plastic wall doesn't do much to help move hot air out of the graphics card.

Nvidia has a genuinely novel innovation on this front, and that's ditching the monolithic PCB that's been a mainstay of graphics cards for 30 years. Instead, the RTX 5090 (and presumably subsequent RTX 50-series GPUs to come) splits the PCB into three parts: the video output interface at the 'front' of the card facing out from the case, the PCIe interface segment of the card, and the main body of the PCB that houses the GPU itself as well as the VRAM modules and other necessary electronics. This segmented design allows a gap in the front of the card below the fan, so rather than a fan blowing air into an obstruction, it can fully pass over the fins of the GPU's heatsink, substantially improving the thermals. As a result, Nvidia was able to shrink the width of the card considerably, moving from a 2.4-inch width to a 1.9-inch width, a roughly 20% reduction on paper. In the hand, it feels substantially smaller than its predecessor, and it's definitely a card that won't completely overwhelm your PC case the way the RTX 4090 does.

That said, the obscene power consumption required by this card means that the 8-pin adapter included in the RTX 5090 package is a comical 4-to-1 dongle that pretty much no PSU in anyone's PC case can really accommodate. Most modular PSUs give you three PCIe 8-pin power connectors at most, so let's just be honest about this setup.
You're going to need a new ATX 3.0 PSU with at least 1000W to run this card (its officially recommended PSU is 950W, but just round up, you're going to need it), so make sure you factor that into your budget if you pick this card up.

Otherwise, the look and feel of the card isn't that different from previous generations, except that the front plate of the GPU where the RTX 5090 branding would have gone is now missing, replaced by a finned shroud to let air pass through. The RTX 5090 stamp is instead printed on the center panel, similar to how it was done on the Nvidia GeForce RTX 3070 Founders Edition. As a final touch, the white back-lit GeForce RTX logo and the X strips on the front of the card, when powered, add a nice RGB-lite touch that doesn't look too gaudy, though dedicated RGB fans might find it rather plain.

So how does the Nvidia GeForce RTX 5090 stack up against its predecessor, as well as the best 4K graphics cards on the market more broadly? Very damn well, it turns out, managing to improve performance over the RTX 4090 in some workloads by 50% or more, while leaving everything else pretty much in the dust. Though when looked at from 30,000 feet, the overall performance gains are respectable gen-on-gen but aren't the kind of earth-shattering gains the RTX 4090 made over the Nvidia GeForce RTX 3090.

Starting with synthetic workloads, the RTX 5090 scores anywhere from 48.6% faster to about 6.7% slower than the RTX 4090 in various 3DMark tests, depending on the workload. The only poor performance for the RTX 5090 was in 3DMark Night Raid, a test where both cards so completely overwhelm the workload that the difference here could be down to CPU bottlenecking or other issues that aren't easily identifiable. On every other 3DMark test, though, the RTX 5090 scores 5.6% better or higher, more often than not by 20-35%. In the most recently released test, Steel Nomad, the RTX 5090 is nearly 50% faster than the RTX 4090. On the compute side of things, the RTX 5090 is up to 34.3% faster in the Geekbench 6 OpenCL compute test and 53.9% faster in Vulkan, making it an absolute monster for AI researchers to leverage.

On the creative side, the RTX 5090 is substantially faster in 3D rendering, scoring between 35% and 49.3% faster in my Blender Benchmark 4.30 tests. There's very little difference between the two cards when it comes to video editing, though, as they essentially tie in PugetBench for Creators' Adobe Premiere test and in Handbrake 1.7 4K-to-1080p encoding. The latter two results might be down to CPU bottlenecking, as even the RTX 4090 pushes right up against the performance ceiling set by the CPU in a lot of cases.

When it comes to gaming, the RTX 5090 is substantially faster than the RTX 4090, especially at 4K. In non-upscaled 1440p gaming, you're looking at a roughly 18% better average frame rate and a 22.6% better minimum/1% framerate for the RTX 5090. With DLSS 3 upscaling (but no frame generation), you're looking at 23.3% better average and 23% better minimum/1% framerates overall with the RTX 5090 vs the RTX 4090. With ray tracing turned on without upscaling, you're getting 26.3% better average framerates and about 23% better minimum/1% framerates, and with upscaling set to balanced (again, no frame generation), you're looking at about 14% better average fps and about 13% better minimum/1% fps for the RTX 5090 against the RTX 4090. At 4K, however, the faster memory and wider memory bus really make a difference.
Without upscaling and with ray tracing turned off, you're getting upwards of 200 fps at 4K on average for the RTX 5090, compared to the RTX 4090's 154 fps, a nearly 30% increase. The average minimum/1% fps for the RTX 5090 is about 28% faster than the RTX 4090, as well. With DLSS 3 set to balanced, you're looking at a roughly 22% better average framerate overall compared to the RTX 4090, with an 18% better minimum/1% framerate on average as well. With ray tracing and no upscaling, the difference is even more pronounced, with the RTX 5090 getting just over 34% faster average framerates compared to the RTX 4090 (with a more modest 7% faster average minimum/1% fps). Turn on balanced DLSS 3 with full ray tracing and you're looking at about 22% faster average fps overall for the RTX 5090, but an incredible 66.2% jump in average minimum/1% fps compared to the RTX 4090 at 4K.

Again, none of this even factors in single frame generation, which can already substantially increase framerates in some games (though with the introduction of some input latency). Once Multi-Frame Generation rolls out at launch, you can expect to see these framerates for the RTX 5090 run substantially higher. Pair that with Nvidia Reflex 2 to help mitigate the input latency issues frame generation can introduce, and the playable performance of the RTX 5090 will only get better with time -- and it's starting from a substantial lead right out of the gate.

In the end, the overall baseline performance of the RTX 5090 comes in about 21% better than the RTX 4090, which is what you're really looking for when it comes to a gen-on-gen improvement. That said, you have to ask whether the performance improvement you do get is worth the enormous increase in power consumption. That 575W TDP isn't a joke. I maxed out at 556W of power draw at 100% utilization, and I hit 100% fairly often in my testing and while gaming. The dual flow-through fan design does a great job of cooling the GPU, but at the expense of turning the card into a space heater. That 575W of heat needs to go somewhere, and that somewhere is inside your PC case. Make sure you have adequate airflow to vent all that hot air, otherwise everything in your case is going to slowly cook.

As far as performance-per-price goes, this card does slightly better than the RTX 4090 on value for the money, but that's never been a buying factor for this kind of card anyway. You want this card for its performance, plain and simple, and in that regard, it's the best there is.

I spent about a week and a half testing the Nvidia GeForce RTX 5090, both running synthetic tests and using it in my day-to-day PC for both work and gaming. I used my updated testing suite, which uses industry-standard benchmark tools like 3DMark, Geekbench, PugetBench for Creators, and various built-in gaming benchmarks. I used the same testbench setup listed to the right for the purposes of testing this card, as well as all of the other cards I tested for comparison purposes. I've tested and retested dozens of graphics cards for the 20+ graphics card reviews I've written for TechRadar over the last few years, so I know the ins and outs of these PC components. That's why you can trust my review process to help you make the right buying decision for your next GPU, whether it's the RTX 5090 or any of the other graphics cards I review.
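A note on how an "overall" figure like that 21% is typically produced: reviewers aggregate per-test ratios, and the geometric mean is the standard tool because it doesn't let one outlier game dominate. A minimal sketch, with placeholder ratios standing in for real per-test results:

```python
import math

# Geometric mean of per-test performance ratios (RTX 5090 fps / RTX 4090 fps).
# The ratios below are illustrative placeholders, not TechRadar's raw data.
ratios = [1.30, 1.22, 1.34, 1.18, 1.07]

geo_mean = math.prod(ratios) ** (1 / len(ratios))
print(f"Overall uplift: +{(geo_mean - 1) * 100:.1f}%")  # ~22% for these inputs
```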
[3]
Nvidia GeForce RTX 5090 review: Brutally fast, but DLSS 4 is the game changer
The wait is finally over. The long-awaited GeForce RTX 5090 lands on store shelves in January -- and friends, the flagship graphics card for Nvidia's new "Blackwell" architecture is an absolute monster. It should be for $2,000, of course. While the RTX 5090's leap in raw gaming performance isn't anywhere near as massive as the 4090's was over its predecessor, it blows the pants off any GPU we've ever seen before, with no notable flaws in its technical configuration.

But while raw gaming performance is welcome, I suspect that the GeForce RTX 50-series will live or die on the back of DLSS 4, a new generation of Nvidia's vaunted AI-powered performance-boosting technologies. A lot of Blackwell's improvements were focused on Nvidia's AI tensor cores, and the architecture was designed hand-in-hand with DLSS 4. And hot damn, friends. Based on our early playtime, DLSS 4's new Multi Frame Generation AI technology feels like black magic, utterly changing the way games feel and respond to your inputs. It's amazing, full stop. Check out our embedded video review below for an in-depth analysis of every benchmark we ran and plenty of additional experiential information. This written review will focus on key things would-be RTX 5090 buyers need to know before slapping down $2,000 for the most badass graphics card ever built.

When we covered (and then analyzed!) the specifications for Nvidia's initial GeForce RTX 50-series lineup, one thing jumped out: The GeForce RTX 5090 was the clear crown jewel, designed with virtually no technical flaws. That bears out in our testing. With 33 percent more CUDA cores than the RTX 4090, new RT and tensor AI cores, and more raw power being pumped through its digital veins, there was never any doubt the 5090 would whup on its predecessor in gaming. (Much more on that in the next section.) Its ginormous 32GB of bleeding-edge GDDR7 memory, built on a wide 512-bit bus, will be able to tackle any gaming task you throw at it, regardless of resolution.

But the RTX 5090 is more than just a gaming behemoth. We live in an era where GPUs do serious work on AI tasks now, not just gaming. Nvidia optimized its Blackwell architecture to excel at AI workloads, while the RTX 5090's unrivaled memory configuration can hold much larger AI models than any prior consumer GPU. The results in Procyon's AI Text Generation benchmark are nothing short of sparkling. In the "worst case" scenario, with the Phi 3.5 large language model, the RTX 5090 is about 19 percent faster than the 4090; in the best case, Meta's Llama 3.1, performance jumped 32 percent. AI researchers and engineers will be scrambling to pick this up.

The GeForce RTX 5090 is also a content creation powerhouse. It houses an additional media encoding engine, bringing its total up to three, while the massive memory pool helps with the creation and editing of complex projects. We were only able to run a couple of Adobe benchmarks from Puget Systems' excellent PugetBench for this initial review. All modern high-end GPUs offer roughly the same performance in Photoshop (no surprise there), but the RTX 5090 is 8 percent faster than the 4090 in Premiere Pro. Expect that margin to jump with more GPU-centric creation software, like DaVinci Resolve, or if your workloads utilize ray tracing or can tap into Nvidia's excellent DLSS technologies.
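That 512-bit GDDR7 configuration is easy to translate into bandwidth. Assuming the widely reported 28Gbps per-pin data rate for the RTX 5090's GDDR7 (a figure not stated in this article), the arithmetic works out like this:

```python
# Peak memory bandwidth = bus width (bits) / 8 bits-per-byte x per-pin rate (Gbps).
# The 28 Gbps GDDR7 data rate is an assumption based on widely reported specs.

bus_width_bits = 512
data_rate_gbps = 28  # effective, per pin

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # 1792 GB/s, matching the ~1.79 TB/s cited elsewhere in this roundup
```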
There's a reason we're talking about this first: The GeForce RTX 5090 is much more than just a gaming card, like an amplified version of the 4090 before it. People who use their PCs for real work -- and make real money doing it -- will be clamoring for this monster GeForce GPU. The RTX 4090 has sold for closer to $2,500 than its $1,600 suggested price for years now, and I expect the demand will be even stronger for the titanic 5090 with its fast, massive memory pool and AI optimizations. This will sell like hotcakes -- yes, even at $1,999, which I suspect will look like an absolute steal a few months from now. But if you are able to snag one, you'll have your hands on by far the fastest gaming GPU of all time.

Onto the fun stuff. The GeForce RTX 4090 stood unopposed as the ultimate gaming GPU from the moment it launched. No longer. The new Blackwell generation uses the same underlying TSMC 4N process technology as the RTX 40-series, so Nvidia couldn't squeeze easy improvements there. Instead, the company overhauled the RTX 5090's instruction pipeline, endowed it with 33 percent more CUDA cores, and pushed it to a staggering 575W TGP, up from the 4090's 450W. Blackwell also introduced a new generation of RT and AI cores. Add it all up and the RTX 5090 is an unparalleled gaming beast -- though the effects hit different depending on whether or not you're using RTX features like ray tracing and DLSS.

Our gaming benchmark suite tests titles utilizing a variety of different game engines, to try to get a well-rounded view of performance. We've decided to focus on 4K gaming performance given this $2,000 graphics card's might. In games that don't use ray tracing or DLSS, relying on simple brute-force graphics rendering, the RTX 5090 isn't much more than a mild generational performance upgrade. It runs an average of 27 percent faster in those games -- but the splits swing wildly depending on the game: Cyberpunk 2077 is 50 percent faster, Shadow of the Tomb Raider is 32 percent faster, and Rainbow Six Siege is 28 percent faster, but Assassin's Creed Valhalla and Call of Duty: Black Ops 6 only pick up 15 and 12 percent more performance, respectively.

Performance results are less of a yo-yo once you flip on ray tracing, DLSS upscaling (not Frame Generation), or some mix of the two. Black Myth: Wukong, Cyberpunk 2077, and Returnal all run ~30 percent faster on the RTX 5090 versus the 4090, while F1 24 leapt up 40 percent. Nvidia invested heavy engineering work into its ray tracing and tensor cores this generation, and it shows. You should not pick up the RTX 5090 if you plan on ignoring ray tracing and DLSS in games (unless you plan on using it for work, of course).

But sweet holy moly, you shouldn't ignore DLSS 4 anyway. Much like DLSS, DLSS 2, and DLSS 3 before it, the new DLSS 4 generation is an absolute game-changer. Nvidia's boundary-pushing AI tech continues to look better, run faster, and now feel smoother. It's insane. Nvidia made two monumental changes to DLSS to coincide with the RTX 50-series release. First, all DLSS games will be switching to a new "Transformer" model from the older "Convolutional Neural Network" behind the scenes, on all RTX GPUs going back to the 20-series. More crucially for the RTX 5090 (and future 50-series offerings), DLSS 4 adds a new Multi Frame Generation technology, building upon the success of DLSS 3 Frame Gen.
While DLSS 3 uses tensor cores to insert a single AI-generated frame between GPU-rendered frames, supercharging performance, MFG inserts up to three AI frames between each GPU-rendered frame (which itself may only be rendered at quarter resolution, then upscaled to fit your screen with DLSS Super Resolution). It's AI all the way down, just like we predicted.

As someone who is sensitive to latency, I was skeptical going in. But friends, DLSS 4's Multi Frame Generation feels fantastic in our limited playtesting. We only pulled benchmark numbers for Cyberpunk 2077, using the RT Overdrive preset and 1.7x DLSS scaling. Flipping on MFG increases the frame rate over the 5090 with DLSS 3 by a whopping 91 percent, hitting a blistering 249 frames per second. Without any sort of Frame Generation, the game runs at 71fps; enabling DLSS 4 MFG lets it run an absolutely staggering 251 percent faster. Insanity. And it feels smoother than silk with Nvidia Reflex turned on.

Online forums have been embroiled in debate since DLSS 4's announcement: Do the AI frames really count as a frame rate increase, since you're only affecting the input on a quarter of the frames, or is it more like an advanced motion-smoothing technology? I lean towards the latter, but either way: DLSS 4 MFG makes games look and feel so much better, and that's the important part. PCWorld video guru Adam Patrick Murray, who ran our benchmarks, normally sits on a couch during gaming sessions, using a small form-factor PC connected to a television. He never bothered turning on DLSS 3 Frame Gen; he felt the occasional visual glitches hurt more than the extra performance helped in that scenario. But he's a big DLSS 4 MFG believer. "You can absolutely feel the difference now," he told me, and he turned it on wherever possible during playtesting. PCWorld contributor Will Smith, who is working on a deeper dive into DLSS 4, delivers even stronger praise: He reports that turning on DLSS 4 makes Star Wars Outlaws, a fun game prone to performance concerns, feel just as good as the legendary Doom 2016, which many gamers consider the paragon of fast-action shooters. "It's like a whole new game," he said. Cyberpunk 2077 also looked and felt smoother than ever, though to be fair, that game already handled really well.

Regardless of whether you consider AI frames to boost frame rates or smooth out motion, the end result is a masterpiece in action. Odd visual glitches seem to happen much less often now -- though pumping out ray-traced Cyberpunk scenes at 249 frames per second gives your eyes much less time to notice them, as well. I was concerned about added latency, but Will and Adam report that everything feels snappy and even better than usual, though they counsel that you'll want native in-game frame rates to hit close to 50 to 60fps before turning on MFG, or the latency can start to feel a little weird. That shouldn't be a problem with the RTX 5090. And that's with the standard version of Nvidia's latency-reducing Reflex technology, which is needed to counter the latency introduced by adding AI frames that don't respond to user input. The newer, more complex and performant Reflex 2 isn't required for DLSS 4, just the original version. That let Nvidia roll out a truly killer capability for DLSS 4: the ability to force DLSS 3 games to run DLSS 4 MFG instead, using an override in the Nvidia app, rather than requiring developers to go back and update older titles.
Because of this, you'll be able to use 75 games and apps with DLSS 4 the day your new RTX 5090 shows up. That's a huge improvement over the usual long, multi-month (or multi-year) rollout for new AI rendering technologies. Bottom line: DLSS 4 is a stunning upgrade you must play around with to fully appreciate its benefits. It's literally a game-changer, once again -- though we'll have to see if it feels this sublime on lower-end Nvidia cards like the more affordable RTX 5070.

Okay, so you're sold on the RTX 5090! Not so fast. You may need to adjust the rest of your PC build to accommodate it. Fortunately, the meticulously designed Nvidia RTX 5090 Founders Edition has been engineered down to a slim, svelte two-slot solution that can easily slip into any PC (including Adam's SFF rig). But adding all the extra hardware and then cranking up the juice means you may need a new power supply. The GeForce RTX 5090 pushes the power supply requirement goalposts to 1,000W, up from the 4090's 850W mandate. If you don't want to use the included 12VHPWR adapter cables, consider picking up a PSU that ships with those included -- though Nvidia has listened to past feedback and overhauled its fugly adapter to include a woven-braided design with longer cords. Hallelujah.

Despite its small stature, the RTX 5090 Founders Edition ran quietly enough in our testing, and didn't heat up our rooms nearly as much as feared. Nvidia nailed the design on this one. The GPU hit 84 degrees Celsius in our testing, which is a bit on the warm side, but well within spec tolerances. Bigger 3- and 4-slot custom 5090s will no doubt tame temperatures in exchange for their larger size and higher sticker prices.

If the RTX 4090 was over your budget, this will be too. The GeForce RTX 5090 Founders Edition starts at $1,999, but I expect it and the MSRP-priced custom cards we see at launch to disappear quickly, driving prices up. Go stand in line at midnight at Best Buy or prepare to battle masses of scalpers and AI researchers online if you want the 5090 on day one. Again, the 4090 is still going for $2,800 to $3,200 online right now because of its immense usefulness to AI researchers and developers -- and the RTX 5090 obliterates it in that field.

This is a weird review to have to wrap into a neat conclusion. In a vacuum, the RTX 5090 delivers around a 30 percent average boost in gaming performance over the RTX 4090. That's a solid generational improvement, but one that history shows has usually been delivered at the same price point as the older, slower outgoing hardware. Nvidia asking for an extra $500 on top seems garish and overblown from that perspective. But the RTX 5090 isn't like past generations. We're in the AI era now, and AI professionals will no doubt sell their firstborn to get ahold of its awesomely powerful GPU, massive memory pool, and ferociously fast memory bandwidth -- three crucial hardware considerations for the field. Given that, I expect prices for this monstrous graphics card to rise as rapidly as Godzilla does from Japan's seas.

If you demand the very best of bleeding-edge gaming hardware, price be damned, you'll drool over the GeForce RTX 5090. It's built to handle anything and everything with aplomb, churning out ray-traced frames at a frenetic pace and then cranking smoothness way past 11 with DLSS 4's magic. With over 75 games and apps scheduled to support DLSS 4 on the day of the RTX 5090's launch, you'll have plenty of opportunities to put your fancy new hardware to work.
While I wouldn't recommend upgrading to this over the RTX 4090 for gaming (unless you're giddy to try DLSS 4), it's a definite upgrade option for the RTX 3090 and anything older. The 4090 was 55 to 83 percent faster than the 3090 in games, and the 5090 is about 30 percent faster than that, with gobs more memory. At the end of the day, nobody needs a $2,000 graphics card to play games. But if you want one and don't mind the sticker price, this is easily the most powerful, capable graphics card ever released. The GeForce RTX 5090 is a performance monster supercharged by DLSS 4's see-it-to-believe-it magic.
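To make the Multi Frame Generation bookkeeping in this review concrete, here's an idealized Python sketch of how render rate, displayed rate, and input sampling relate; real numbers come in lower because frame generation itself costs GPU time (PCWorld measured 249fps where this ideal model predicts 284):

```python
# Idealized DLSS Frame Generation arithmetic: with n AI frames inserted per
# rendered frame, the displayed rate is (n + 1) x the render rate, while user
# input is still only sampled once per *rendered* frame. Generation overhead,
# ignored here, trims the multiplier in practice.

def frame_gen(render_fps: float, ai_frames: int) -> None:
    displayed = render_fps * (ai_frames + 1)
    print(f"{ai_frames + 1}x mode: {displayed:.0f} fps displayed, "
          f"input sampled every {1000 / render_fps:.1f} ms")

frame_gen(71, 0)  # PCWorld's Cyberpunk 2077 baseline, no frame generation
frame_gen(71, 1)  # DLSS 3-style single frame generation
frame_gen(71, 3)  # DLSS 4 MFG: ~284 fps ideal vs the 249 fps measured above
```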
[4]
Nvidia GeForce RTX 5090 Founders Edition review: Blackwell commences its reign with a few stumbles
The Nvidia GeForce RTX 5090 Founders Edition has arrived -- or at least, the reviews have arrived. It's the fastest GPU we've ever tested, most of the time, and we expect things will continue to improve as drivers mature in the coming weeks. When it lands on retail shelves, the RTX 5090 will undoubtedly reign as one of the best graphics cards around for the next several years. The card itself -- as well as AIB (add-in board) partner cards using the RTX 5090 GPU -- won't go on sale until January 30. Once it does, good luck acquiring one. It's an extreme GPU with a $1,999 price tag, though there will certainly be some well-funded gamers looking to upgrade. It also features new AI-centric features, including native FP4 support, which will very likely generate a lot of interest outside of the gaming realm. With 32GB of VRAM and 3.4 PetaFLOPS of FP4 compute, it should easily eclipse any other consumer-centric GPU in AI workloads.

Review in Progress... It's been an extremely busy month, so this review is currently a work in progress, and our current score is a tentative 4.5 out of 5, subject to adjustment in the next week or so as we fill in more blanks. There are tests that we wanted to run that failed, and several of the games in our new test suite also showed anomalous behavior. We've also revamped our test suite and our test PC, wiping the slate clean and requiring new benchmarks for every graphics card in our GPU benchmarks hierarchy, and while we have reviewed the Intel Arc B580 and Arc B570 and tested some comparable offerings, there are a lot of GPUs that we still need to retest. It takes a lot of time, even without any driver oddities, and we need to do some additional work.

The Nvidia Blackwell RTX 50-series GPUs also bring some new technologies, which require separate testing. Chief among these (for gamers) is the new DLSS 4 with Multi Frame Generation (MFG). That requires new benchmarking methods, and more importantly, we need to spend time with the various DLSS 4-enabled games to get a better idea of how they look and feel. We already know from experience that DLSS 3 frame generation isn't a magic bullet that makes everything faster and better. It adds latency, and the experience also depends on the GPU, game, settings, and monitor you're using. With MFG potentially tripling the number of AI-generated frames (DLSS 4 can generate 1, 2, or 3 per rendered frame, depending on the setting you select), things become even more confusing. MFG running at 240 fps, for example, would mean user input only gets sampled at 60 fps, so while MFG can make games smoother, they might also feel laggy. We'll be testing some of the early DLSS 4 examples and updating our review in the coming days.

Here are the specifications for the RTX 5090 and its predecessors -- the top Nvidia GPUs of the past several generations. The raw specs alone give a hint at the performance potential of the RTX 5090. It has 33% more Streaming Multiprocessors (SMs) than the previous-generation RTX 4090, just over double the SMs of the 3090, and 2.5 times as many SMs as the RTX 2080 Ti that kicked off the ray tracing and AI GPU brouhaha. Just as important, it has 33% more VRAM than the 4090, and the GDDR7 runs 33% faster than the GDDR6X memory used on the 4090, yielding a 78% increase in memory bandwidth. The rated boost clocks on the RTX 5090 have dropped compared to the 4090, but Nvidia's boost clocks have always been rather conservative. Depending on the game and settings used, in some cases the real-world clocks were even higher than before.
Except, that's mostly at 1080p and 1440p, where CPU bottlenecks are definitely a factor and the 5090 wasn't hitting anywhere close to its maximum 575W of power use. Typical clocks ranged from 2.5 GHz to 2.85 GHz in our testing.

But it's not just performance and specs that have increased. The RTX 5090 has an official base MSRP of $1,999 -- $400 more than the RTX 4090's base MSRP. That's probably thanks to the demand that Nvidia saw for the 4090, with cards frequently going for over $2,000 during the past two years. We suspect much of that was thanks to businesses buying the cards for AI use and research (not to mention people reportedly trying to smuggle 4090 cards into China). Those same factors will undoubtedly affect the RTX 5090. Things shouldn't be as bad as the cryptomining shortages of the RTX 30-series era, but we would be shocked if the 5090 isn't difficult to buy at anywhere close to $2,000 in the coming months. Nvidia's top GPUs have traditionally been hard to acquire in the first month or two after launch, and that pattern will no doubt continue -- and perhaps be even worse than the 4090 launch.

We've already noted that testing is ongoing and that we haven't been able to run all the benchmarks that we'd like. But really, what's the competition to the RTX 5090? Most people will want to see how much faster it is than the RTX 4090, plus a few other high-end / extreme offerings. It's not like someone who is looking at an RTX 4070, RX 7700 XT, or Arc B580 will instead decide to spend 4-8 times as much on a 5090. AMD's RX 7900 XTX didn't really compete with the 4090, and it certainly won't beat the 5090 -- at least, not at 4K. And you really should be running a 4K or higher resolution display if you're thinking about using a 5090 for gaming. We're still running 7900 XTX benchmarks on the new test PC, but we'll have complete results in the next day or so. We also want to add the 3090 to show what a two-generation upgrade will deliver, for those using the top Ampere RTX 30-series GPUs.

Other than that? We don't really see anything that will keep up with the 5090 arriving any time soon. And that's just for gaming. For AI workloads that can use the new FP4 number format, Nvidia claims the 5090 can be up to three times as fast as the 4090. It's set to dominate the GPU landscape until the inevitable RTX 6090 (or whatever) arrives in a couple of years -- or perhaps Nvidia will do an RTX 5090 Ti or Titan Blackwell this generation.
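The reason FP4 matters so much for AI buyers is memory footprint: halving the bits per weight roughly halves the VRAM a model's weights occupy. A rough Python sketch (the model size is illustrative, and real deployments add overhead for activations and KV caches):

```python
# Approximate VRAM needed for model weights alone at different precisions.
# Illustrative only; real usage adds activations, KV cache, and framework overhead.

def weights_gb(params_billions: float, bits_per_weight: int) -> float:
    # params (billions) x bytes per weight = GB (1e9 params x bits/8 bytes each)
    return params_billions * bits_per_weight / 8

for bits in (16, 8, 4):
    print(f"FP{bits}: a 30B-parameter model needs ~{weights_gb(30, bits):.0f} GB for weights")
# At FP4 (~15 GB) a 30B model fits easily in the 5090's 32GB; at FP16 (~60 GB) it cannot.
```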
[5]
Nvidia GeForce RTX 5090 Founders Edition Review - IGN
Every couple of years, Nvidia launches an extremely expensive, extremely powerful graphics card that brings PC gaming into a new generation. That is what the Nvidia GeForce RTX 5090 ultimately is, but the way it delivers next-generation performance is unconventional, to say the least. Because in a lot of games, the performance uplift over the RTX 4090 isn't quite as steep as you'd expect - at least when DLSS Frame Generation is taken out of consideration. With the next generation of Nvidia's DLSS for both upscaling and frame generation, however, we get leaps in image quality and performance that feel even greater than what we see with a typical graphics generation.

How much of an upgrade the Nvidia RTX 5090 is going to be for you, then, is ultimately going to depend on the games you play, the resolution you play those games at, and whether you're OK with an AI algorithm generating extra frames. For a lot of people playing games on anything less than a 4K monitor with a 240Hz refresh rate, this upgrade is simply not going to make a lot of sense. But if you do have a high-end display, these AI-generated frames are going to feel like a taste of the future.

The Nvidia GeForce RTX 5090 is built on Blackwell, Nvidia's high-end architecture that's already powering the data centers and supercomputers behind many of the most popular AI models. That should give you an idea of what the RTX 5090 is especially good at, but Nvidia didn't neglect the, well, non-AI parts of the card. With the 5090, Nvidia found a way to shove more Streaming Multiprocessors (SMs) into the same number of GPCs (Graphics Processing Clusters), which means more CUDA cores - 21,760, up from 16,384 in the RTX 4090. That makes for a 33% uplift in shader cores over the previous generation, and is where the bulk of the raw gaming performance comes from. Each SM also has four Tensor Cores and one RT Core, just like its predecessor. That means you get 680 Tensor Cores and 170 RT Cores, compared to 512 and 128, respectively, for the RTX 4090. The 5th-generation Tensor Cores are tailor-made to boost AI performance, with this generation adding support for FP4 operations, which should make AI workloads less dependent on VRAM.

All of this silicon is coupled with 32GB of GDDR7 VRAM, or video memory. This is a generational shift from the GDDR6X memory in the RTX 4090, and should be faster and more power-efficient than the previous generation. But because the RTX 5090 requires a staggering 575W of power, a huge increase over the already power-hungry 4090, power efficiency isn't exactly Nvidia's main goal with this graphics card.

Because the new Tensor Cores are more efficient, Nvidia shifted the entire DLSS algorithm to run on a Transformer Neural Network (TNN), rather than a Convolutional Neural Network (CNN). This shift won't necessarily improve your framerate when you enable DLSS, but Nvidia claims it will improve image quality and mitigate issues like ghosting and other unwanted artifacts. Nvidia did more than just make an under-the-hood change to the way DLSS works, though. Team Green also introduces Multi-Frame Generation, which takes the Frame Gen tech introduced with the RTX 4090, makes it more efficient and smooth, and allows it to generate multiple frames off of each rendered image. This drastically improves frame rate, but should probably only be enabled if you're already getting a decent frame rate, just like the last-generation version.
The Nvidia GeForce RTX 5090 requires 575W of power, which is much more than the 450W of the RTX 4090. More power inherently means more heat, which means increasingly powerful cooling solutions are needed. Looking back at the RTX 4090 and even the RTX 3090, the Founders Editions were giant, triple-slot graphics cards that took up a ton of room and straight up wouldn't fit in some PC cases. Before I saw the RTX 5090, I was expecting something even bigger and more unwieldy. However, somehow, it's smaller. Nvidia was able to make a 575W graphics card fit in a dual-slot chassis with a dual-fan configuration.

And it works. Throughout my time with the RTX 5090, which included both my standard testing suite and playing games with DLSS 4 enabled to test Multi-Frame Generation, the temperature maxed out at around 86°C, even while its power consumption peaked at 578W. That's a high temperature, to be sure, and higher than the RTX 4090's 80°C, but it's not high enough to throttle, and that's all that matters. Nvidia was able to do this by shrinking the PCB (printed circuit board) down to a little square and placing it in the middle of the graphics card. The two fans are placed on either side of the PCB, with a heatsink that runs through the width of the card. These fans take air in through the bottom of the card and shoot it straight through the top and out through the exhaust fans in your PC case. In fact, this graphics card doesn't even have exhaust vents under the output ports at the rear of the card, unlike previous-generation designs.

It's immediately apparent that the RTX 5090 Founders Edition follows a similar design language to the last couple of generations, though. The center of the card has the same silver 'X' design as the RTX 4090, with a gunmetal-gray chassis surrounding the black heatsinks. On the outer edge of the graphics card, you get a 'GeForce RTX' logo that lights up with white LEDs, too. Next to that logo, you'll find the power connector. While it looks very similar to the 12VHPWR connector of the last generation, it's actually a new 12V-2x6 power connector. The difference is minor, but it's supposedly more efficient than the last-generation connector. Maybe Nvidia will be able to avoid controversy around its power connectors melting this generation - only time will tell. Nvidia does include a 12V-2x6 adapter in the box, which takes four 8-pin PCIe power connectors in order to provide the required 575W of power. What's nice is that the connector on the graphics card itself is now angled, facing the back of the card, which should make connecting the cable much easier. The power connector seems more secure this time around, too.

The hidden benefit of this design is its ability to be slotted into smaller PC builds. You don't need a giant case to run the RTX 5090, like you did with the RTX 4090 and 3090. However, it's very likely that third-party designs from the likes of Asus and MSI are going to be much larger than Nvidia's Founders Edition.

When Nvidia revealed the RTX 5090, it claimed that it could boost performance by as much as 8x. The actual number isn't that high, but the RTX 5090 can deliver extremely high frame rates in the most demanding games - just not exactly through traditional rendering. Because while the RTX 5090 does deliver a decent increase in raw rasterization performance, the real next-generation benefit is its ability to generate extra frames to increase your frame rate.
DLSS 4 introduces 'Multi-Frame Generation', a next-gen version of the Frame Generation introduced with DLSS 3 and the RTX 4090. But it's more than just the same method producing more frames. The secret behind it is a new AI Management Processor, or AMP, in the RTX 5090 - along with the other RTX 5000 graphics cards. The AMP allows the graphics card to essentially assign work to different parts of the GPU, something that was traditionally handled by your CPU. But because it's physically located on the GPU, it's able to do this much more efficiently. According to Nvidia, the AMP and the 5th-generation Tensor Cores allowed it to create a new frame generation model that's 40% faster than the original while requiring 30% less memory. This new model only needs to run once on each rendered frame, and can then create 3 AI frames.

Something like this would naturally introduce latency, but Nvidia found a way around that, too. The AMP runs a Flip Metering algorithm, which paces out the frames in order to reduce input lag. Nvidia claims this is why Multi-Frame Generation won't work on RTX 4000 graphics cards: the last-generation frame generation relied on the CPU for frame pacing, which would introduce much more latency than the new model, which runs entirely on the GPU itself.

To be clear, this isn't a magic button for good performance. Just like the previous generation, you only really want to enable this if you're already getting a passable frame rate. If you're not already getting around 60 fps with Frame Gen disabled, turning it on can introduce significant latency problems. That's why it pairs best with DLSS upscaling also enabled, in order to maximise your performance.

When the RTX 5090 hits store shelves on January 30, DLSS 4 will work in a wide array of PC games that already support DLSS 3 Frame Generation. However, while working on this review, I only had access to two games with this technology enabled, and both were on beta builds - Cyberpunk 2077 and Star Wars Outlaws. And I was surprised how well it worked. In Cyberpunk 2077 at 4K, on the Ray Tracing Overdrive preset with DLSS on Performance mode, the RTX 5090 gets 94 fps. That's not bad for a game with full ray tracing. When I turned on DLSS 2x frame gen - the same as is supported by the RTX 4090 - that framerate increased to 162 fps, nearly double plain ol' DLSS. However, when I cranked frame generation up to 4x - that's 3 AI frames per rendered frame - that number went all the way up to 286 fps, more than my 4K display can actually render. It's a similar story with Star Wars Outlaws. Playing at 4K with all the settings cranked up to max, I was able to get up to around 300 fps with DLSS 4 enabled - and that's up from about 120 fps without frame generation.

When I saw these framerates, I was straight-up expecting to see artifacts and weird spikes of lag. However, I only really saw one broken texture in Star Wars Outlaws, and it's something I wouldn't have noticed if I wasn't actively looking for problems. It's hard to believe, but Multi-Frame Generation actually does work - you just need an extremely high-end 4K display to benefit from it, at least with the RTX 5090. It's easy to write off this performance as 'Fake Frames', and you wouldn't necessarily be wrong, especially because you need good baseline performance to make it a good experience. But it is going to be genuinely useful for anyone with a high-refresh, high-resolution display.
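Those Cyberpunk 2077 numbers also show that the '2x' and '4x' modes don't scale perfectly, because generating frames consumes GPU time of its own. A quick Python check of the effective multipliers from the figures above:

```python
# Effective DLSS frame-generation scaling, from the Cyberpunk 2077 figures
# reported above (4K, Ray Tracing Overdrive, DLSS Performance).

base, fg_2x, fg_4x = 94, 162, 286  # fps

print(f"2x mode: {fg_2x / base:.2f}x actual vs 2.0x ideal")  # ~1.72x
print(f"4x mode: {fg_4x / base:.2f}x actual vs 4.0x ideal")  # ~3.04x
```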
It's also important to keep in mind that I was only able to test it in a handful of games. Nvidia claims that 75 games will support DLSS 4 when the RTX 5090 hits store shelves on January 30, and there's a decent possibility that it won't work flawlessly in at least one of those games. For the time being, though, it looks like it works extremely well.

The Nvidia GeForce RTX 5090 is an incredibly powerful graphics card, but testing this thing was a journey. In 3DMark, the RTX 5090 proved itself to provide a generational improvement over the RTX 4090 in terms of raw performance. However, things get a lot more complicated when I test actual games. In the vast majority of games, the RTX 5090 is bottlenecked by my CPU, even at 4K - and I paired it with the Ryzen 7 9800X3D, the fastest gaming processor on the market right now. For most people who already have a high-end graphics card, upgrading to this $1,999 GPU isn't going to make a world of difference - the games just aren't there yet. This is a graphics card you buy to set yourself up for future PC games, like The Witcher 4.

I want to note that I did not enable DLSS 4 for any of these comparative benchmarks, and everything was tested on the public drivers available at the time. That means all non-5090 Nvidia cards were tested on driver version 566.36, and all AMD cards were tested on AMD Adrenalin 24.12.1. All games were tested on their latest public builds, too.

In 3DMark, the RTX 5090 is up to 42% faster than the RTX 4090. In the Speed Way benchmark, the RTX 5090 scores 14,399 points to the RTX 4090's 10,130, making for a 42% performance uplift. Similarly, in Port Royal, the ray tracing test, the RTX 5090 scores 36,946 points to the 4090's 25,997 points, which is also a 42% performance leap. What's more impressive is how far Team Green has come since the RTX 3090. That graphics card got 5,619 points in Speed Way and 13,738 points in Port Royal, meaning anyone who skipped the last generation can get a 2.5x performance jump. That's just 3DMark, though, and not necessarily reflective of real-world gaming performance.

In Call of Duty: Black Ops 6, however, we start to see the big issue with the RTX 5090 in today's games - a severe CPU bottleneck. At 4K Extreme settings, with DLSS set to Performance, the RTX 5090 gets 161 fps, compared to 146 fps from the RTX 4090. That's just a 10% performance difference, and definitely not what I'd call a 'next-generation' performance increase. However, looking back at the RTX 3090, the 4-year-old graphics card gets 91 fps, which means a nearly 2x performance increase.

What's wild is that the RTX 5090 even shows signs of a CPU bottleneck in Cyberpunk 2077, one of the most demanding PC games on the market right now. At 4K, with the Ray Tracing Ultra preset and DLSS set to Performance, the RTX 5090 gets 125 fps, compared to 112 fps from the RTX 4090 with the same settings. Similarly to Black Ops 6, this is just a 10% performance increase. That scaling just gets worse at lower resolutions, too, with the RTX 5090 getting 153 fps at 1440p and 156 fps at 1080p.

I test Metro Exodus: Enhanced Edition with DLSS disabled, because DLSS is the only upscaling option in that game and I need an apples-to-apples comparison with AMD cards. This makes it one of the most demanding tests in my suite, and gives the RTX 5090 a chance to stretch its legs a bit. At 4K with the Extreme preset, the RTX 5090 gets 95 fps, compared to 76 fps from the RTX 4090 and 44 fps from the Radeon RX 7900 XTX.
But even without the upscaling, the RTX 5090 only gets a 25% improvement over the RTX 4090, even if it more than doubles the 39 fps from the RTX 3090.

Red Dead Redemption 2 is getting up there in years, but it's still a gorgeous game. At 4K with every setting maxed out, and DLSS set to performance mode, the RTX 5090 gets 167 fps, compared to 151 fps from the RTX 4090 and 92 fps from the RTX 3090. That means in this game, which admittedly doesn't even use ray tracing, the RTX 5090 gets a modest 11% performance uplift over the RTX 4090.

Total War: Warhammer 3 is an interesting test these days, because it doesn't support ray tracing or upscaling, and gives a clear picture of raw rasterization performance. The RTX 5090 impresses here, delivering 147 fps to the RTX 4090's 107 fps. That's a 37% performance uplift and close to the potential performance difference demonstrated in 3DMark. However, it's still a far cry from the 67% performance difference enjoyed by the RTX 4090 over the 3090.

Assassin's Creed Mirage is a weird one. For some reason, when I first benchmarked this game, the RTX 5090 was getting terrible performance. Its game clock was limited to 772MHz, and it was giving me around 50 fps at 4K. I was able to get around that problem, but even when it was resolved the 5090 only got 172 fps, which is lower than the RTX 4090 at 183 fps. What's worse is that the framerate was extremely spiky, with microstutters. That's bad, obviously, but it's very likely that this is a driver bug, and as such should be treated as an outlier.

Black Myth: Wukong, like Cyberpunk 2077, is an extremely demanding game that will push any GPU to its limits. The RTX 5090, however, averaged 104 fps at 4K, with the Cinematic preset and DLSS set to 40%. The RTX 4090, with identical settings, got 84 fps. That's a 24% uplift in favor of the RTX 5090. In Forza Horizon 5, the RTX 5090 averaged 216 fps, compared to 210 fps from the RTX 4090, which is essentially within the margin of error. This is an aging game, to be sure, but the CPU bottleneck means there's essentially no difference between these two cards at this resolution.

Nvidia really wants us to believe that Moore's Law is dead and that GPUs delivering a giant uplift over their previous-generation counterparts are going to grow rarer over time. I don't know if that's true - I'm not an engineer - but regardless, in most games, the RTX 5090 doesn't exactly deliver next-generation performance over the RTX 4090 - at least not to the extent that the latter card thoroughly trounced the 3090 back in 2022. That's not to say the RTX 5090 is a bad graphics card. No matter how you slice it, the RTX 5090 is now the fastest graphics card on the consumer market, and that's not nothing. The problem is that a lot of games can't really take advantage of the extra power offered by the Blackwell GPU. That's something that will absolutely change over time, but it also means there's little reason for someone with an RTX 4090 to upgrade to the new hotness.

Instead, the Nvidia GeForce RTX 5090 is betting its existence on the future of AI-powered gaming. DLSS 4 uses AI to greatly increase frame rates, which is definitely a sight to behold. This graphics card is therefore best for gamers who want to be on the cutting edge, and are willing to bet $1,999 (at least) on an AI gaming future. For everyone else, the RTX 4090 is going to be more than powerful enough for the next few years.
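For reference, every uplift percentage quoted in the benchmarks above reduces to the same simple ratio. A quick sketch of the arithmetic (my own, using figures from the review):

```python
# Generational uplift as quoted in GPU reviews: (new / old - 1) * 100.

def uplift_pct(new_score: float, old_score: float) -> float:
    return (new_score / old_score - 1) * 100

print(f"Speed Way:   {uplift_pct(14_399, 10_130):.0f}%")  # ~42%
print(f"Port Royal:  {uplift_pct(36_946, 25_997):.0f}%")  # ~42%
print(f"Black Ops 6: {uplift_pct(161, 146):.0f}%")        # ~10%, the CPU bottleneck in action
```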
[6]
NVIDIA GeForce RTX 5090 review: Pure AI excess for $2,000
It's the fastest video card I've ever seen, but it's not meant for mere mortals. A $2,000 video card for consumers shouldn't exist. The GeForce RTX 5090, like the $1,599 RTX 4090 before it, is more a flex by NVIDIA than anything truly meaningful for most gamers. NVIDIA CEO Jensen Huang said as much when he revealed the GPU at CES 2025, suggesting that it'll be for hardcore players who have $10,000 rigs. Personally, I don't know anyone who actually fits that bill, not unless you count parasocial relationships with streamers. (My own setup doesn't even cross $5,000.) But we all know why NVIDIA is hyping up the unattainable RTX 5090: It lets the company show off benchmarks that AMD can't touch, once again cementing itself as the supreme leader of the high-end video card market.

It's not just about gaming, either. The RTX 5090 is also being positioned as an AI workhorse, since it's powered by NVIDIA's new Blackwell architecture, which leans on the company's Tensor Cores for artificial intelligence work more than ever. Realistically, though, the $549 RTX 5070 is the GPU more gamers will actually be able to buy.

I'll admit, I went into this review with a mixture of excitement and disgust. It's astonishing that NVIDIA was able to stuff 92.2 billion transistors and 21,760 CUDA cores into the RTX 5090, and I couldn't wait to see how it performed. Still, I find it genuinely sad that NVIDIA keeps pushing the bar higher for GPU prices, in the process making the gaming world even more unequal. A $2,000 graphics card, in this economy?! But after hours of benchmarking and playtime, I realized the RTX 5090 wasn't much of a threat to gaming accessibility. Wealthy PC gamers have always overspent for graphics performance -- I've seen people (unwisely) pay thousands more than consumer GPU prices just to get extra VRAM from NVIDIA's Quadro cards. But the rise of PC handhelds like the Steam Deck, which are a direct offshoot of the Nintendo Switch's success, is a clear sign that convenience matters more than raw power to mainstream players today. I don't think many Switch 2 buyers are saving up for an RTX 5090. For the few who can afford it, though, NVIDIA's new flagship sure is a treat.

In many ways, the RTX 5000 GPU family is the convergence of NVIDIA's decades-long GPU expertise and its newfound role powering the AI hype train. Sure, they'll run games faster than before, but what makes them unique is their ability to tap into "neural rendering" AI for even better performance. It's at the heart of DLSS 4, the company's latest AI upscaling technology, which can now generate up to three frames for every one that's actually rendered by the RTX 5090. That's how NVIDIA can claim this GPU is twice as fast as the RTX 4090, or that the RTX 5070 matches the speed of the 4090. Does it really matter if these frames are "fake" if you can't tell, and they lead to smoother gameplay?

Before I dive further into the AI side of things, though, let's take a closer look at the RTX 5090. Once again, it features 21,760 CUDA cores, up from 16,384 cores on the 4090, as well as 32GB of GDDR7 VRAM instead of the 4090's 24GB of GDDR6X. (I thought I was future-proofing my desktop when I equipped it with 32GB of RAM years ago, but now that video cards have caught up I'm almost convinced to go up to 64GB.) The 5090 also sports 5th-gen Tensor cores delivering 3,352 AI TOPS of performance, while the 4090 managed 1,321 AI TOPS with last-gen Tensor hardware.
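Putting those spec deltas side by side, a quick sketch of the generational arithmetic (my own illustration, using the figures above):

```python
# Gen-over-gen deltas from the specs quoted above (RTX 5090 vs. RTX 4090).
specs = {
    "CUDA cores": (21_760, 16_384),
    "VRAM (GB)":  (32, 24),
    "AI TOPS":    (3_352, 1_321),
}
for name, (rtx5090, rtx4090) in specs.items():
    delta = (rtx5090 / rtx4090 - 1) * 100
    print(f"{name:>10}: {rtx5090:>6} vs {rtx4090:>6}  (+{delta:.0f}%)")
# CUDA cores and VRAM grow ~33%; AI TOPS grow ~154% - the Blackwell story in one loop.
```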
I tested the RTX 5090 Founders Edition GPU (provided by NVIDIA), which is dramatically slimmer than its 4090 counterpart. The 5090 has a sleek two-slot case that can actually fit in small form factor systems. The three-slot 4090, meanwhile, was so massive it felt like it was going to tear my PCIe slot out of my motherboard. NVIDIA also added another cooling fan this time around, instead of just relying on a vapor chamber and a single fan. The 5090's main PCB sits in the center of the card, and it's connected to other PCB modules at the PCIe slot and rear ports (three DisplayPort 2.1b and an HDMI 2.1b connection).

While multi-frame generation is the defining feature for the RTX 50 cards, there are several other DLSS 4 features that should help games look dramatically better. Best of all, those capabilities are also trickling down to earlier RTX GPUs. RTX 40 cards will be more efficient with their single-frame generation, while RTX 30 and 20 cards will also see an upgrade from the AI transformer models used for Ray Reconstruction (leading to more stable ray tracing), Super Resolution (higher quality textures) and Deep Learning Anti-Aliasing (DLAA). These transformer models should also fix some rendering artifacts present in earlier versions of DLSS. At NVIDIA's Editor's Day earlier this month, the company showed off how the updated version of Ray Reconstruction made a chainlink fence in Alan Wake 2 appear completely sharp and clear. An earlier version of the feature made the same fence look muddy, almost as if it were out of focus. In Horizon Forbidden West, the new version of Super Resolution revealed more detail in the texture of Aloy's bag. DLSS 4 will be supported in 75 games and apps at launch, including Indiana Jones and the Great Circle and Cyberpunk 2077, according to NVIDIA. For titles that haven't yet been updated with new DLSS menu options, you'll also be able to force support for the latest features in the NVIDIA app.

I could almost hear my motherboard breathe a sigh of relief when I unplugged the RTX 4090 and swapped in the slimmer 5090. Installation was a cinch, though I still needed to plug in four PSU connectors to satisfy its demand for 575 watts of power and a 1,000W PSU. If you're lucky enough to have a new PSU with a 600W PCIe Gen 5 cable, that will also work (and avoid tons of cable clutter). I tested the RTX 5090 on my home rig powered by an AMD Ryzen 9 7900X and 32GB of RAM, alongside a 1,000W Corsair PSU. I also used Alienware's 32-inch QD-OLED 4K 240Hz monitor to get the most out of the 5090, and honestly, you wouldn't want to run this GPU on anything less.

Once I started benchmarking, it didn't take long for the RTX 5090 to impress me. In the 3DMark Steel Nomad test, which is a demanding DX12 demo, it scored 14,239 points, well above the 9,250 points I saw on the RTX 4090. Similarly, the 5090 hit 15,416 points in the 3DMark Speed Way benchmark, compared to the 4090's 10,600 points. These are notable generation-over-generation gains without the use of frame generation or any DLSS sorcery -- it's just the raw power you see with more CUDA and RT cores.

Once I started gaming and let DLSS 4 do its magic, my jaw just about hit the floor. But I suppose that's just a natural response to seeing a PC hit 250fps on average in Cyberpunk 2077 while playing in 4K with maxed-out ray tracing overdrive settings and 4x frame generation. In comparison, the 4090 hit 135fps with the same settings and single frame generation.
Now I know most of those frames aren't technically real, but it's also the first time I've seen any game fill out the Alienware monitor's 4K 240Hz refresh rate. And most importantly, Cyberpunk simply looked amazing as I rode my motorcycle down rain-slicked city streets and soaked in the reflections and realistic lighting from robust ray tracing. Like Cypher in The Matrix (far from the best role model, I know), after suffering through years of low 4K framerates, I couldn't help but feel like "ignorance is bliss" when it comes to frame generation. I didn't see any artifacts or stuttering. There wasn't anything that took away from my experience of playing Cyberpunk. And the game genuinely looked better than I'd ever seen it before.

And if you're the sort of person who could never live with "fake frames," the RTX 5090 is also the only card I've seen that can get close to 60fps in Cyberpunk natively in 4K with maxed-out graphics and no DLSS. I hit 54fps on average in my testing, whereas the 4090 chugged along at 42fps in native 4K. You could also compromise a bit and turn on 2x or 3x frame generation to get a solid fps boost, if the idea of 4x frame generation just makes you feel dirty.

And if you can't tell, I quickly got over any fake-frame trepidation. When I used the NVIDIA app to turn on 4x frame generation in Dragon Age: The Veilguard, I once again saw an average framerate of around 240fps in 4K with maxed-out graphics. I've already spent over 25 hours in the game, but running through a few missions at that framerate still felt revelatory. Combat sequences were clearer and easier to follow, possibly thanks to better Ray Reconstruction and Super Resolution, and I could also make out even more detail in my character's ornate costumes. On the 4090, I typically saw around 120fps with standard frame generation.

The 5090's DLSS 4 performance makes me eager to see how the cheaper RTX 5070 and 5070 Ti cards perform. If a $550 card can actually get close to what I saw on the $1,599 4090, even if it's relying on massive amounts of frame generation, that's still a major accomplishment. It would also be great news for anyone who invested in a 4K 120Hz screen, which is tough to fill with other mid-range GPUs.

Outside of gaming, the RTX 5090 also managed to convert a minute-long 4K clip into 1080p using the NVENC H.264 encoder in just 23 seconds. That's the fastest conversion I've seen yet. In comparison, the RTX 4090 took 28 seconds. Add up those seconds on a much larger project, and the 5090 could potentially save you hours of repeated rendering time. Naturally, it also posted the fastest Blender benchmark score we've ever seen, reaching 14,903 points. The RTX 4090, the previous leader in our benchmarks, hit 12,335 points.

Throughout benchmarks and lengthy gaming sessions, the RTX 5090 typically reached around 70 degrees Celsius with audible, but not annoying, fan noise. The card also quickly cooled down to idle temperatures between 34C and 39C when it wasn't under load. Aiming to push the limits of NVIDIA's cooling setup, I also ran several stress test sessions in 3DMark, which involve looping a benchmark 20 times. It never crashed, and achieved over 97 percent frame-rate stability in most of the tests. There was just one Steel Nomad session where it scored 95.9 percent and failed 3DMark's 97 percent threshold. That could easily be due to early driver issues, but it's still worth noting.
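For context on the encoder test above: a 4K-to-1080p transcode like the one described can be reproduced with ffmpeg's NVENC path. A minimal sketch, assuming an ffmpeg build with NVENC support; the filenames are placeholders, and this is my own illustration rather than Engadget's exact test:

```python
# Minimal sketch: 4K -> 1080p transcode on the GPU via ffmpeg's NVENC H.264 encoder.
# Assumes an ffmpeg build with NVENC enabled; 'input_4k.mp4' is a placeholder name.
import subprocess
import time

cmd = [
    "ffmpeg", "-y",
    "-i", "input_4k.mp4",
    "-vf", "scale=1920:1080",   # downscale to 1080p
    "-c:v", "h264_nvenc",       # hardware H.264 encoder on the GPU
    "output_1080p.mp4",
]

start = time.perf_counter()
subprocess.run(cmd, check=True)
print(f"Transcode took {time.perf_counter() - start:.1f} s")
```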
The only time I really got the RTX 5090 cooking was during an exploration of the Speed Way benchmark, where I could move the camera around the ray-traced scene and look at different objects and characters. The card hit 79C almost immediately and stayed there until I quit the demo. During that session, as well as typical gaming, the 5090 drew between 500W and 550W of power.

On top of DLSS, NVIDIA is also planning to tap into its RTX cards to power AI NPCs in games like PUBG and ZooPunk. Based on what I saw at NVIDIA's Editor's Day, though, I'm more worried than excited. The company's ACE technology can let NPCs generate text and voices and even hold conversational voice chats, but every example I saw was robotic and disturbing. The AI Ally in PUBG makes a lot of sense on paper -- who wouldn't want a computer companion that could help you fight and find ammo? But in the demo I saw, it wasn't much of a conversationalist, it couldn't find weapons when asked and it also took way too long to hop into a vehicle during a dangerous firefight. As I wrote last week, "I'm personally tired of being sold on AI fantasies, when we know the key to great writing and performances is to give human talent the time and resources to refine their craft." And on a certain level, I think I'll always feel like the director Hayao Miyazaki, who described an early example of an AI CG creature as "an affront to life itself."

NVIDIA's Neural Shaders are an attempt to bring AI right into texture shaders, something the company says wasn't possible on previous GPUs. These can be implemented in a variety of ways: RTX Neural Materials, for example, can use AI to render complex materials like silk and porcelain, which often have nuanced and reflective textures. RTX Neural Texture Compression, on the other hand, can store complex textures while using up to a seventh of the VRAM required by typical block compression. For ray tracing, there's RTX Neural Radiance Cache, which is trained on live gameplay to help simulate path-traced indirect lighting. Much like NVIDIA's early ray tracing demos, it's unclear how long it'll take for us to see these features in actual games. But from the glimpses so far, NVIDIA is clearly thinking of new ways to deploy its AI Tensor Cores. RTX Neural Faces, for example, uses a variety of methods to make faces seem more realistic, and less like plastic 3D models. There's also RTX Mega Geometry, which can help developers render up to "100x more ray traced triangles," according to NVIDIA. Demos show it being used to construct a large building as well as an enormous dragon.

The $2,000 GeForce RTX 5090 is not meant for mere mortals, that much is clear. But it points to an interesting new direction for NVIDIA, one where AI features can seemingly lead to exponential performance gains. While I hate that it's pushing GPU prices to new heights, there's no denying that NVIDIA has crafted an absolute beast. But, like most people, I'm more excited to see how the $549 RTX 5070 fares. Sure, it's also going to lean into frame generation, but at least you won't have to spend $2,000 to make the most of your 4K monitor.
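To put the claimed texture-compression saving above in rough context: BC7, a typical block-compression format, stores one byte per texel. A sketch of the arithmetic (my own illustration; real savings will vary by texture):

```python
# Rough VRAM arithmetic for a 4096x4096 texture (illustrative, not NVIDIA's numbers).
texels = 4096 * 4096
bc7_bytes = texels * 1                # BC7 block compression: 1 byte per texel
bc7_with_mips = bc7_bytes * 4 / 3     # a full mip chain adds roughly one third
neural_bytes = bc7_with_mips / 7      # NVIDIA's claimed up-to-7x saving

print(f"BC7 + mips: {bc7_with_mips / 2**20:.1f} MiB, neural: {neural_bytes / 2**20:.1f} MiB")
# ~21.3 MiB vs ~3.0 MiB for a single 4K texture layer.
```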
[7]
First Tests: Nvidia's GeForce RTX 5090 vs. 4-Star Powerhouse RTX 4090
I have been a technology journalist for 30-plus years and have covered just about every kind of computer gear -- from the 386SX to 64-core processors -- in my long tenure as an editor, a writer, and an advice columnist. For almost a quarter-century, I worked on the seminal, gigantic Computer Shopper magazine (and later, its digital counterpart), aka the phone book for PC buyers, and the nemesis of every postal delivery person. I was Computer Shopper's editor in chief for its final nine years, after which much of its digital content was folded into PCMag.com. I also served, briefly, as the editor in chief of the well-known hard-core tech site Tom's Hardware.

The cards don't go on sale for another week, but samples of the much-anticipated Nvidia GeForce RTX 5090 are in the wild and in the hands of reviewers around the world. (See our initial unboxing of the card from a few days ago.) And we're all banging away furiously at these cards with our special benchmarking hammers: new and familiar synthetic test programs, GPU-accelerated productivity applications, and a mix of AAA games to push the limits of Nvidia's new silicon. Testing the new flagship RTX 5090 is a mix of old challenges (how well does it do on classic rasterization?) as well as new ones: How well do the AI features assist in accelerating new games? How about for the hands-on AI tasks that people might do today?

Here at PCMag, we're in the midst of recalibrating and rebuilding our database of graphics card tests. We're doing that in light of the new GeForce RTX 50-series generation of Nvidia cards, with AMD Radeon RX 9000-series RDNA 4 cards soon in the offing and Intel making some real progress at the lower end of the market with its Arc line. That means lots of retesting of old cards with new tests, and experimenting with new tools and processes. Take, for one thing, DLSS. Nvidia's upscaling and frame-rate-boosting tech has taken different forms over its relatively short life. It started as an upscaler (rendering frames at a lower resolution, then enhancing them to appear higher-res), moved into frame generation (the short explanation: inserting AI-generated frames between classically rasterized ones), and is now poised to be a somewhat different animal still in some PC games, inserting multiple AI-generated frames between "real" ones to rocket-boost frame rates. (Our senior analyst will be taking a quick first look at the new DLSS 4 in a companion piece to follow here shortly.)

AI assistance and other side tweaks are complicating the GPU market, to be sure. Meanwhile, though, there's a more familiar question: How well does the RTX 5090 flagship card of the moment do for the basic stuff? With our new test suite, we pitted the GeForce RTX 5090 Founders Edition--Nvidia's own version of its flagship GPU--against its former flagship card, the GeForce RTX 4090. We haven't gathered enough data yet for a full, editor-scored review of the RTX 5090--we need more time with the card, as well as to test a critical mass of older cards in our new fashion. (We received ours less than 72 hours ago.) Nvidia hosted an Editor's Day earlier this month at CES 2025 that unveiled some of the technologies behind the RTX 50 series, and that briefing made clear the direction that future testing would need to take. So it's off to the benchmarking races. Below is a rundown of some of the numbers we've crunched so far.
(Our new testbed PC comprises a top-of-the-line 16-core AMD Ryzen 9 9950X CPU cooled by a 360mm Cooler Master cooler, a Gigabyte X870E Aorus Master motherboard, 32GB of Crucial DDR5 clocked at the AMD EXPO 6000 setting, two PCI Express 4.0 SSDs from Crucial, and a 1,500-watt Corsair power supply.) Lots of raw charts follow. The TLDR, though? This card is looking like a beast by any measure. It even cows the power-monger RTX 4090, which we'll compare it to throughout.

Synthetic Graphics Tests: A Festival of 3DMark

First, let's take a look at some industry-standard graphics benches we ran. While UL's various 3DMark subtests are mostly only useful in relation to other 3DMark scores, they're also a good "reality check" on the hardware at hand, and a key grounding point for subsequent testing. For those unfamiliar with the suite, here's a rundown of what each test specializes in... Here's a peek at the RTX 4090 versus the 5090 on the 3DMark suite... We won't calculate every percentage, but it's clear that at least on these synthetic tests, the RTX 5090 looks to be quite a step above the RTX 4090, especially in the classic 4K Steel Nomad and Time Spy Extreme trials. The DLSS Feature test, set for DLSS 3, also shows a good-size leap.

A Few GPU-Accelerated Faves: Adobe's Creative Suite, Blender, and More

Next up: Some key apps that benefit (to a greater or lesser degree) from GPU acceleration. Adobe Photoshop and Premiere Pro are tested using a utility from workstation maker Puget Systems, dubbed PugetBench for Creators. PugetBench runs each program through a litany of typical creative tasks: applying image filters, picture manipulation, performing file renders, and the like. The modeling program Blender, meanwhile, now has a nifty benchmark utility that lets you run three 3D-model renders (dubbed Monster, Junk Shop, and Classroom) on the CPU or GPU. (Here, we chose the GPU option, of course.) And V-Ray 6 lets you run renders in Chaos Group's engine used by many 3D modeling programs. It runs both RTX and CUDA versions of the test in turn. Photoshop didn't see a boost from the RTX 5090, at least in this workload. As we test more cards and see the variances, we may well drop this benchmark from our suite. Premiere Pro, however, saw a healthy boost. V-Ray, meanwhile, is mostly interesting for the RTX results, which show a boost; the CUDA trial doesn't run well on the "Blackwell"-based RTX 50 series yet. (Nvidia says this is a known issue.)

One Peek at AI Performance

The field of local AI processing is as wide as the sea is deep, and testing that kind of workload is almost as new as the RTX 50 series itself. We're working up another AI test or two as we write this--of course, locally run AI modeling and training could well be the biggest draw of this card--but for now, we have a measure from our friends at UL of local AI text-generation performance with four of the most common LLM models of the moment: Phi, Mistral, and two flavors of Llama. Time to first token, and average token rate per second, are the typical measures for tasks like this. The overall UL Procyon score is generated by the software and mostly useful only in relation to other equivalent scores. We'll need more context for more cards, but we suspect we won't soon be seeing anything higher than the scores afforded here by the RTX 5090 among consumer-grade GPUs.
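For readers unfamiliar with those LLM measures: time to first token captures responsiveness, while tokens per second captures throughput. A minimal sketch of how such numbers are typically derived (my own illustration, not UL Procyon's internals; `generate_stream` is a hypothetical stand-in for any model that yields tokens):

```python
# Illustrative timing of LLM text generation (not UL Procyon's internals).
# 'generate_stream' is a hypothetical stand-in: any callable yielding tokens one by one.
import time

def measure(generate_stream, prompt: str):
    start = time.perf_counter()
    first_token_at = None
    n_tokens = 0
    for _token in generate_stream(prompt):      # assumes at least one token is produced
        if first_token_at is None:
            first_token_at = time.perf_counter() - start   # time to first token (TTFT)
        n_tokens += 1
    total = time.perf_counter() - start
    return first_token_at, n_tokens / total                # TTFT, tokens per second
```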
We're working toward adding a local-model image-generation benchmark to the suite (at the moment, it may only be useful for comparing Nvidia cards to Nvidia cards, and high-end cards at that), so more on that soon.

A Helping of AAA Games

Next up: Five of the AAA games we use for testing. Now, the RTX 4090 and RTX 5090, despite the lust of most gamers for one, are really not gaming cards, first and foremost. Some well-heeled shoppers will buy them for that, to be sure. But the RTX 5090's $1,999 starting price is certainly going to give pause to many a shopper, who will look further down Nvidia's stack when the lower-end RTX 50 cards debut. Several games here (Returnal, Call of Duty: Modern Warfare III) we tested strictly with "classic" FPS measures at our three common resolutions of 1080p, 1440p, and 4K. We also tested a couple (F1 2024, Cyberpunk 2077) with a version of DLSS on, and DLSS plus frame generation. (A third DLSS title, Black Myth Wukong, which runs as a stand-alone benchmark, we ran with DLSS on and with DLSS plus frame generation on. There's no DLSS-off function with this test.) We've outlined the base settings we used with each game; we'll get into more detail in the final review of this card.

Note: Here, we did not factor in DLSS 4, supported in Cyberpunk 2077, quite yet. Yes, that is one of the prime new features touted by Nvidia with the launch of the RTX 50 series. DLSS 4, which counts Multi-Frame Generation (MFG) among its key traits, is showing up in Cyberpunk 2077 as one of about 75 games that will support this new version of DLSS at launch. It's a big part of Nvidia's big claims about the RTX 50 series (especially the much ballyhooed "$549 RTX 5070 that will perform like an RTX 4090!" in Nvidia's CES 2025 keynote, which is largely down to specific DLSS 4 scenarios). And DLSS 4 will be a big factor in future titles, we are sure. But DLSS 4 support will be lean enough from the start that it's not the primary focus of our testing here on day one. (Again, see our companion story for more on DLSS 4, coming up shortly.)

As for what we saw? The numbers speak for themselves. Some of these games at lower resolutions are clearly limited even by the tip-top Ryzen 9 9950X we used, but at the higher resolutions, you will see some very healthy bumps depending on the game. We're also reporting here "1% lows," a popular measure that averages the slowest 1% of frames in a given benchmark run (a quick sketch of the calculation appears at the end of this piece). This reflects relative smoothness and the prevalence of stutter; the closer the "1% low" to the overall average frame rate, the smoother the experience. This metric is not always entirely dependent on the GPU but can be down to the game and the CPU, too; we'll be looking at how this metric shakes out with subsequent cards and if the relative gulfs are consistent from card to card in a given game. (Note the wide swings with Call of Duty, say, versus Returnal. Also note that we had some technical issues collecting 1% lows from the RTX 4090 for Cyberpunk 2077. We're working to fix that.)

How About a Couple of Legacy Games?

Finally, we've revised our bank of what we call "legacy" games (that is, older titles) to ones that aren't quite as antique as the golden oldies (Bioshock Infinite, Tomb Raider, Sleeping Dogs) that we were using until recently. These two, Shadow of the Tomb Raider (2018) and Total War: Three Kingdoms (2019), are classics that we'll keep around to see how a card deals with older games that may not be optimized right out of the gate.
Note: Total War is especially sensitive to CPU power and cores, so the Ryzen 9 in our testbed will give it an extra boost here and also show some interesting results... Some solid increases across the board here suggest that with at least these two titles (admittedly, a tiny sample size), the RTX 5090 may not see too much of a cliff with older games. Nvidia tends to be good this way out of the gate.

More Testing and Numbers to Come...

This is just a tease of the RTX 5090's capabilities versus the former flagship RTX 4090, the card most folks eyeing the RTX 5090 are likely to weigh it against. We haven't gotten in hand--yet--a sample of the RTX 5080 (which will launch alongside the RTX 5090, on Jan. 30), and the announced GeForce RTX 5070 Ti and RTX 5070 are coming a little later. A fuller assessment of the RTX 5090 may require the RTX 5080 in hand, or at least a few more cards down the stack, like the RTX 4080 or AMD's closest competitor, the Radeon RX 7900 XTX. But we'll be testing it and a host of other key competing cards all through the winter--possibly into the spring!--to see how the new graphics landscape in 2025 shakes out. One thing's clear, though: The path to consumer-card royalty is looking like it runs straight through the RTX 4090 to the RTX 5090.
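As promised above, a footnote on the "1% lows" metric: it's conventionally the average frame rate over the slowest 1% of frames in a run. A minimal sketch of the calculation (my own illustration):

```python
# '1% low' = average frame rate over the slowest 1% of frames in a benchmark run.
def one_percent_low(frame_times_ms: list[float]) -> float:
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames (longest times) first
    n = max(1, len(worst) // 100)                  # the worst 1% of the run
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                   # convert back to fps

# The closer this number is to the overall average fps, the smoother the experience.
```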
[8]
Nvidia RTX 5090: 3 reasons to buy and 2 reasons to skip
The Nvidia GeForce RTX 50-series cards have arrived, and the Nvidia GeForce RTX 5090 is the biggest and baddest of the lot. These cards were the subject of leaks and rumors for months ahead of their debut at CES 2025, and now that we've been formally introduced, it's clear that these cards will continue the company's tradition of launching big, powerful and power-hungry GPUs. One other tradition that continues is sky-high pricing on the high-end cards. The top-of-the-line RTX 5090 is slated to retail for $1,999 when it hits store shelves on January 30, which is $400 more than the MSRP of the RTX 4090 and the highest launch price of any GeForce GPU to date.

Is it worth the money? That's a question lots of us in the tech press are trying to answer, and luckily here at Tom's Guide we've acquired our own RTX 5090 and have had a chance to play games and run some tests on it. So if you're on the fence about whether to drop $2k on the new Nvidia GeForce RTX 5090, good news: I've got some data and details that will help you make the call. To show you what I mean, here are a few good reasons to buy, and a couple to skip, the Nvidia GeForce RTX 5090. If you're not sure whether the juice is worth the squeeze, let me walk you through the big reasons why the Nvidia RTX 5090 has so many PC enthusiasts counting their pennies to try and afford one.

It's probably obvious, but the number one reason most of us are thinking about spending $2,000 on an Nvidia GeForce RTX 5090 is because it's the most powerful consumer-grade GPU on the market right now. That means if you're building a gaming PC, this is the cream of the crop when it comes to graphics cards. We tested Nvidia's RTX 5090 desktop GPU earlier this month, and the results are clear: this beast chews through the best PC games with ease. As you can see from our results, the Nvidia GeForce RTX 5090 delivers serious performance improvements in all of the games we use for our benchmarking tests. When we compared the results of the Nvidia GeForce RTX 5090 vs Nvidia GeForce RTX 4090 in those games, we saw average frames per second (FPS) jump 20-30% in basically every test. So if you value raw performance in games, the Nvidia GeForce RTX 5090 is an expensive way to guarantee you're on the cutting edge of PC gaming for at least a year or two. And honestly, you could probably use this GPU for the next decade without ever having to worry about upgrading.

Nvidia's Deep Learning Super Sampling (DLSS) tech helps your PC run games that support it (most of the best Steam games do) better by rendering frames at a lower resolution, then using machine learning technology to upscale them to your target resolution. This effectively allows you to play games at 1440p or 4K with demands on your PC that are closer to what you'd see playing at 1080p, so you usually get better framerates without losing much in terms of graphical fidelity. I know because I've been using DLSS 3 for years now to better enjoy games like Cyberpunk 2077 and Star Wars Outlaws, and it makes a noticeable difference. Now Nvidia is launching DLSS 4, and while some of its features will be supported on older Nvidia GPUs, it will also have features that are only available on the Nvidia GeForce RTX 50-series cards. Most notably, you get a significant upgrade with DLSS 4 to multi-frame generation, which improves upon DLSS 3's existing frame generation feature.
It's pretty technical, but the simple explanation is that if you turn frame generation on in games that support it, your Nvidia graphics card will try to give you more FPS by intelligently looking at two frames while playing and generating a third that can go between them, then inserting it seamlessly while you play. The upshot is that your GPU is effectively using AI to generate more frames per second than the game is actually giving you, leading to a higher framerate. And with DLSS 4's multi-frame generation feature, it will be capable of generating even more frames in supported games. According to Nvidia, "DLSS Multi Frame Generation generates up to three additional frames per traditionally rendered frame", which could mean that games which support it will run even faster on your gaming PC. However, one caveat with frame generation: it can sometimes make games feel a little more sluggish or slow to respond, especially fast-paced action games, because you can experience more input lag due to the artificially generated frames. I've never personally noticed it, but I'm not particularly sensitive to input lag or stutter.

AI has been the buzzword du jour in the tech industry for a few years now, and in the laptop and PC space there's been a lot of hay made about the value of being able to run AI applications locally on your PC. That's why we've seen Microsoft put oodles of money and effort into launching its new tier of Copilot+ PCs, which have unique features that are only available if your PC has an NPU (Neural Processing Unit) capable of 40+ TOPS (trillion operations per second). There's a lot of marketing being done to sell you on these PCs that ties into the ability to run AI apps like ChatGPT locally, and while you can run your own AI chatbot locally on Windows (or Mac), the best AI powerhouse isn't your NPU -- it's your graphics card. The Nvidia GeForce RTX 4090 was a powerhouse for running AI locally, so the RTX 5090 should be even more monstrous when it comes to generating images and video. And Nvidia is going even harder on AI in 2025 and beyond, so I expect the company will continue to improve its existing local AI features like Chat with RTX, which lets you run a GPT-style large language model (LLM) locally against your own files and data, so you can ask it questions about your files and PC without worrying about it needing to connect to the Internet.

Even though it's clearly the new top dog in the GPU market, there are still good reasons to skip the Nvidia GeForce RTX 5090 for the moment. Here are two very good ones I think you should consider before pulling the trigger on a purchase. Nvidia has priced the GeForce RTX 5090 at a cool $1,999, which is enough to buy you one of the best gaming PCs on the market and still have money left over for accessories, software and lunch. Now, I'm not going to say the company is doing this just to make money on performance-obsessed PC gamers, but you should know that the 5090 is $400 more at launch than the GeForce RTX 4090, which was priced around $1,599. And frankly, I think you could buy yourself an RTX 4090 (or even an RTX 4080) tomorrow, put it in your PC and still have a great gaming rig that would run all the best Steam games at 1080p/60FPS or better for years to come. So as your finger hovers over the buy button on that 5090, take a second to think about whether you really need that feeling of owning the best of the best.
Is it worth the extra $400-$1,000 you're spending over an older 40-series card, or even one of the less powerful 50-series GPUs? The RTX 5080 is a thousand bucks cheaper, for example, while the Nvidia RTX 5070 actually feels semi-affordable at $549. At CES 2025, Nvidia chief Jensen Huang famously claimed that an RTX 5070 would deliver the same performance as an RTX 4090, and while there are major caveats to that claim (chief among them being that multi-frame generation is required to achieve that performance), it should give you some comfort that you don't need to spend two grand to get an Nvidia 50-series card that delivers excellent gaming performance.

A few years ago, the Nvidia GeForce RTX 4090 was criticized for being a massive card that ate up power, to the point that it had a high (for the time) 450-watt power draw rating. Worse, some users reported seeing power spikes higher than 450W during extended or heavy usage, which could cause problems if the PC power supply wasn't big enough to handle the drain. This led Nvidia to recommend that 4090 owners use at least an 850W power supply in their PC. Unsurprisingly, the RTX 5090 is even hungrier for power. Nvidia has rated it for 575W, and recommends you have a PSU (power supply unit) capable of kicking out at least 1000W in your PC if you want to use the 5090 safely. If you haven't built a PC in a few years, trust me: that's a lotta watts. Most of us don't have that kind of power supply in our PC, so chances are you'll have to upgrade your power supply, if not build a whole new rig, to support your new RTX 5090.

If you've made it this far, you should have a pretty good sense of the ups and downs of investing $2,000 in Nvidia's new top-of-the-line GPU. Frankly, it's too rich for my blood. Back in the day I was excited to see what the latest and greatest PC components could do for my rig, and I loved booting up Crysis to see how well my new GPU made it look. But those days are behind me, and I can't pretend I think it's a good idea to spend a month's rent (Bay Area, bay-bee) on a new graphics card, or more if you need to build a whole new PC to handle it. And frankly, after seeing such great results gaming on an Nvidia GeForce RTX 4070 Super, which had no trouble running 2024's latest PC games at 4K/60 FPS with DLSS enabled, I don't see any reason to pay through the nose for a few dozen more frames. Nvidia knows that, and it's not targeting me with these cards. The company is going hard on marketing the 5090 to AI enthusiasts and hardcore PC gamers with thick wallets, letting the rest of us content ourselves with looking forward to a 5070 upgrade some day. And honestly, I'm psyched about it. Because as Nvidia pushes the cutting edge of its consumer-grade GPUs forward, I can hang back a generation or so and try to pick up a nice discounted 40-series card later this year. I expect those cards will still be viable for years to come, so don't feel pressured to race out and buy a new 50-series RTX GPU as soon as they hit store shelves later this month -- as our testing shows, the 5090 is an incredible graphics card, but there are lots of other options out there that are way easier on your wallet.
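As a footnote to the DLSS explanation above: each DLSS quality mode renders internally at a fixed fraction of the output resolution before upscaling. The scale factors below are the commonly documented per-axis values (worth verifying per game), so treat this as a sketch rather than a specification:

```python
# Commonly documented DLSS render-scale factors (per axis); verify for each title.
DLSS_SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))
# -> (1920, 1080): 4K output at roughly the rendering cost of 1080p, as described above.
```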
[9]
NVIDIA GeForce RTX 5090 Founders Edition Review
As one generation ends, a new one begins. NVIDIA's GeForce RTX 50 Series, powered by its new Blackwell architecture, has arrived - and in flagship GeForce RTX 5090 form, it's something to behold. 92.2 billion transistors on a 750mm² die built using a custom TSMC 4N process. 32GB of GDDR7 memory on a 512-bit interface with an overall bandwidth of 1,792 GB/sec. An impressive 21,760 CUDA Cores with the latest generation of Tensor and RT Cores capable of rendering incredible real-time path-traced visuals as well as pushing 3,352 AI TOPS of performance - which is more than double the GeForce RTX 4090.

The GeForce RTX 5090 also sees the arrival of DLSS 4, which brings significant improvements to all GeForce RTX gamers (including those rocking a GeForce RTX 2060) thanks to the new 'Transformer' model that can make DLSS 'Performance' mode in 4K look better than DLSS 3's 'Quality' mode. Multi Frame Generation, which upgrades the previous Frame Generation with a new VRAM- and performance-optimized model, has a new 4X option exclusive to the RTX 50 Series that pushes performance or smoothness into the 300 FPS realm or even up to 400 FPS in 4K. Blackwell fundamentally changes the idea of a GeForce GPU by integrating AI into all aspects of its design and makeup like never before, from cutting-edge ray tracing upgrades to Neural Shaders and Neural Rendering, stuff that will pave the way for a new era of visuals and AI-enhanced gaming.

The GeForce RTX 5090 is undoubtedly the most powerful and advanced gaming GPU ever built; it's a gaming beast. The GeForce RTX 4090 is still up there as a 4K powerhouse, but this is something else. Looking at raw performance numbers without DLSS or Frame Generation - covering a range of games, from competitive shooters to ray-traced cinematic adventures - the gen-on-gen uplift across all the games we tested is 27%. Now, you could focus on this number, add that the GeForce RTX 5090 uses about 28% more power than the RTX 4090, and conclude that it's a pretty straightforward improvement with a price increase to match. However, that 27% number doesn't tell the whole story - not even close to it. There are games where the GeForce RTX 5090 doesn't even get close to 100% GPU usage in 4K - pointing to it being too powerful for its own good. In Warhammer 40K: Space Marine 2, GPU usage was around 60%. With DLSS 4's new 'Transformer' model for Super Resolution and Ray Reconstruction and RTX Mega Geometry rendering more detailed scenes and improving performance, Remedy's Alan Wake 2 with its new 'Ultra' setting for its Full Ray Tracing or Path Tracing mode looks infinitely better running at 100 FPS in 4K with DLSS and no Frame Generation than it does running natively at 42 FPS.

You could say that the GeForce RTX 5090 is ahead of its time. It offers exceptional raw performance, DLSS 4 levels up AI Super Resolution to the point where it doesn't make any sense not to turn it on, and Neural Shaders bring a new level of cinematic realism that would be impossible in 2025 without AI. Yes, the GeForce RTX 5090 is absolutely a luxury item, the most advanced gaming hardware on the planet - with a price tag to match. $1,999 is an eye-watering number for a single component that goes into a PC, and consequently, it sits outside the realm of consideration for most gamers. However, for those seriously interested in picking one up - you won't be disappointed. Below is a summary of NVIDIA's GeForce RTX 50 Series and RTX Blackwell architecture, applicable to all models.
NVIDIA describes 'Neural Rendering,' which includes all previous versions of DLSS and the brand-new DLSS 4, as the 'next era for computer graphics.' They're not alone; the Lead System Architect for the PlayStation 5 Pro console, Mark Cerny, recently said that ray tracing is the future of games and that AI will play an integral role in making that happen. DOOM: The Dark Ages developer id Software shared a similar sentiment, adding that the arrival of DLSS was an 'inflection point' for PC game visuals and performance, on par with the arrival of dedicated GPUs and programmable shaders.

With the arrival of the Blackwell generation and the GeForce RTX 50 Series, AI is now being used to accelerate programmable shaders with the brand-new RTX Neural Shaders. Yes, these are actual neural networks that use live game data and the power of Tensor Cores to do everything from compressing textures and rendering lifelike materials with a level of detail impossible to match using traditional rendering methods, to partially tracing rays and then inferring "an infinite amount of rays and bounces for a more accurate representation of indirect lighting in the game scene." RTX Mega Geometry is incredible in its own right; it essentially increases a scene's geometry detail and complexity (triangles or polygons) by up to 100x. 100 times the detail is hard to wrap your head around, but the added benefit in a game like Alan Wake 2 is a dramatic improvement to the performance of the game's Full Ray Tracing or Path Tracing mode. With DLSS 4 and RTX Neural Shaders, NVIDIA's GeForce RTX 50 Series and RTX Blackwell architecture (which includes the same AI optimizations as data center Blackwell) is the turning point - the moment when game development and rendering a scene or frame includes AI enhancements, big or small.

DLSS 4 is also much more than simply adding the new Multi Frame Generation technology to its list of features. DLSS 3's version of Frame Generation has evolved with DLSS 4, powered by Blackwell hardware and software, and an innovative use of AI to generate frames 40% faster while using 30% less VRAM. Switching to a new model also means that Frame Generation and Multi Frame Generation could come to GeForce RTX 30 and RTX 40 Series owners. However, with the 5th generation of Tensor Cores in the GeForce RTX 50 Series delivering 2.5X more AI performance, NVIDIA's latest GPUs can execute five complex AI models - covering Super Resolution, Ray Reconstruction, and Multi Frame Generation - in a couple of milliseconds. Part of the reason it happens so quickly is the addition of hardware Flip Metering, which shifts frame pacing to the Blackwell display engine - the result is frame rates of up to 4K 240 FPS and higher without stuttering issues. With up to 15 of every 16 pixels generated by AI, the result is up to 8X the performance when compared to native rendering or rasterized performance.

DLSS Super Resolution and Ray Reconstruction are also switching to a new 'Transformer' model, with over double the parameters and four times the compute requirement. This is one of the most exciting aspects of the GeForce RTX 50 Series, as it pushes DLSS into a new realm of image quality and performance. The best part is that it will work on all GeForce RTX GPUs; however, there will be a slight performance hit compared to running it on an RTX 50 Series GPU.
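That 'up to 15 of every 16 pixels generated by AI' figure follows directly from stacking DLSS Performance-mode upscaling with 4X Multi Frame Generation. A quick sketch of the arithmetic, using only the ratios stated above:

```python
# Where '15 of every 16 pixels' comes from: stack the two DLSS 4 stages.
upscale_fraction = 0.5 * 0.5      # Performance mode: half resolution per axis = 1/4 of pixels
rendered_frame_share = 1 / 4      # 4X Multi Frame Generation: 1 rendered frame in every 4 shown

natively_rendered = upscale_fraction * rendered_frame_share   # 1/16 of presented pixels
print(f"AI-generated share of pixels: {1 - natively_rendered:.4f}")   # 0.9375 = 15/16
```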
Even better, DLSS 4 is being integrated into the NVIDIA App with a new 'DLSS Override' feature that allows the latest tech to be experienced without waiting for a patch or game update. DLSS 4 is built to be backward compatible, and on day one, 75 games and apps will be supported. It doesn't stop there, as the new AI Management Processor (AMP) allows AI models to share the GPU with graphics workloads. As a result, expect to see digital humans in games alongside AI assistants like NVIDIA's Project G-Assist become more prevalent in the coming years.

Here's a look at the specs for the flagship GeForce RTX 50 Series GPUs, the GeForce RTX 5090 and GeForce RTX 5080, compared to the previous Ada generation. You might notice that the custom TSMC 4N process node has remained the same for the new generation. At a glance, the Blackwell generation looks like more of everything, with more power required to drive it all. After the Ada Lovelace generation's exceptional power efficiency, Blackwell feels like a step back, especially with the 575W power rating of the GeForce RTX 5090. The good news is that the addition of new Max-Q technologies and power management means that the RTX 5090 rarely hits that 575W ceiling, with average power usage being much lower when gaming. With 33% more CUDA Cores and VRAM capacity than the GeForce RTX 4090, which includes a shift to cutting-edge high-speed GDDR7 and more Tensor Cores and RT Cores, the 'more' still looks impressive. Of course, the Blackwell architecture significantly changes the SM for Neural Rendering and RTX Neural Shaders, alongside native FP4 support that delivers double the AI performance compared to the GeForce RTX 4090.

Like the GeForce RTX 4090 before it, the GeForce RTX 5090 is tailor-made for 4K gaming - as you'll see in the benchmark results below, the RTX 5090 is wasted at 1440p. The good news is that Blackwell also sees NVIDIA upgrade its display engine with the arrival of DisplayPort 2.1, which can handle 4K 12-bit HDR at 480Hz and up to 8K 12-bit HDR at 165Hz. With DLSS 4, you'll be able to hit these numbers too. And finally, for creators, Blackwell adds 4:2:2 chroma-sampled video encoding and decoding. The ninth-generation NVENC encoder also improves AV1 and HEVC quality, and the GeForce RTX 5090 supports up to three encoders and two decoders to deliver a 50% gen-over-gen improvement in speed compared to the GeForce RTX 4090. For creators and editors, the GeForce RTX 5090 is a game changer.

The GeForce RTX 5090 Founders Edition is beautiful. It evolves the impressive and unique GPU designs from previous generations into a cutting-edge and sleek GPU that is also compact and SFF-Ready. Somehow, NVIDIA managed to get the massive GeForce RTX 5090 specs, memory, and power into a two-slot form factor. That 'somehow' involved creating a custom, small PCB - an impressive engineering feat in its own right - and placing it in a new GPU with a Double Flow Through layout, which means you can see through the fin stacks behind each fan. This new design means that the Founders Edition model is very different from other GeForce RTX 5090 models from NVIDIA's partners, which all arrive with more traditional PCBs and cooling structures. Smaller, thinner, and more powerful - it's hard not to be impressed. The design includes a 3D Vapor Chamber to dissipate heat from the high-density PCB (which houses the GPU and VRAM) to heat pipes connected to two heatsinks on each side of the GPU. A third heatsink is also situated underneath the PCB.
From there, the dual axial fans direct air through the fin stacks on either side of the GPU, cooling it down. According to NVIDIA, the design can handle up to 600W with a noise output of around 30dBA - which is not silent but relatively quiet in a closed case. Across several benchmarks and gaming sessions over multiple days, there were moments when we heard the fans kick into high gear, but overall, it's an incredible achievement. Granted, it does run warmer than the GeForce RTX 4090 Founders Edition, but that was an almost quad-slot beast of a unit. NVIDIA is set to use this Double Flow Through layout for all GeForce RTX 50 Series Founders Edition models, and in slim GeForce RTX 5090 form, it's easily one of the best-looking and most impressive GPU designs we've ever seen.

PC gaming covers a wide range of genres and styles, from indie games with simple 2D graphics to massive 3D worlds lit by cutting-edge real-time ray tracing technology, and with that, the needs and requirements of each gamer vary. High refresh rates and latency reduction become more important than flashy visuals or playing at the highest resolution possible for those who live and breathe fast-paced competitive games. For those who want to live in a cinematic world and become a key player in an expansive narrative, ray tracing and high-fidelity visuals are a stepping stone toward immersion. Our chosen benchmarks cover various games, engines, APIs, and technologies. For the GeForce RTX 5090, all tests are run at 4K and 1440p and include results for performance-boosting Super Resolution technologies like NVIDIA DLSS 4 - including Frame Generation and the new Multi Frame Generation. In many ways, DLSS numbers are more important in 2025 than native rendering - a title with ray tracing isn't meant to be played without Super Resolution. Also, DLSS technologies like Ray Reconstruction and the new RTX Mega Geometry dramatically improve visual fidelity and detail compared to native rendering. However, our benchmark results are still sorted using 'raw performance' or native rendering. Here's the breakdown of games, settings, and what's being tested.

The GeForce RTX 4090 came into its own when gaming in 4K; the same can be said for the GeForce RTX 5090. However, it's so powerful that we saw it do the GPU equivalent of sitting back, relaxing, and taking it easy - hovering at around 50-70% GPU usage. The two most significant examples include DOOM Eternal (which we didn't realize had a frame-rate cap), which only saw a 15% performance improvement compared to the GeForce RTX 4090, and Warhammer 40K: Space Marine 2, which only saw a 13% uplift. This is why the GeForce RTX 5090's overall 4K performance looks good but not mind-blowing: not 'Oh boy! 50% faster than the GeForce RTX 4090!' good. Looking at raw performance, most games that actually use the RTX 5090's GPU horsepower run 30-35% faster than on the RTX 4090. We saw the most significant gen-over-gen gains in F1 24 with in-race ray tracing, which is 38% faster, and Cyberpunk 2077 using the Ray Tracing Ultra preset, which is 36% faster. This improvement can also be seen in Path Tracing performance, which we'll cover in a bit. Either way, it's still impressive, especially when you factor in that the overall 4K gaming performance on the GeForce RTX 5090 is 73% faster than the GeForce RTX 4080 SUPER and a whopping 89% faster than the Radeon RX 7900 XTX - which includes games that hit their performance ceiling.
Interestingly, with DLSS 4's new 'Transformer' model available for several games in our benchmark results, the Quality mode is something you'd enable in 4K. However, the Catch-22 is that rendering at 1440p before upscaling means the performance uplift over the GeForce RTX 4090 in 4K drops to 21%. At face value, it's not an impressive number for a $2,000 GPU - however, it means that the GeForce RTX 5090, like my Barbarian in Diablo 4, is overpowered for the current crop of PC games.

The GeForce RTX 5090 is the fastest 1440p gaming GPU money can buy. It hits an impressive 413 FPS in Counter-Strike 2 using a stress test map meant to put a GPU in its place. It is the only GPU that can hit 100+ FPS in Call of Duty: Black Ops 6 natively, with the detail settings cranked. Marvel Rivals hits 220 FPS with the DLSS Quality preset. So, yes, you could make a case for the GeForce RTX 5090 as a competitive gaming card. However, high performance in competitive games is relatively easy to achieve with most modern GPUs. Regarding 1440p gaming, on average, the GeForce RTX 5090 is only 11% faster than the RTX 4090 and 35% faster than the GeForce RTX 4080 SUPER. The reason for this is simple: performance is limited either by the game engine or the CPU. Check out the individual 1440p benchmark results below to see this in action, keeping in mind that in games where the RTX 5090 isn't much faster than the RTX 4090, it's hardly using any power and isn't breaking a sweat.

3DMark offers a suite of synthetic benchmarks built to test GPUs in various scenarios. 3DMark Steel Nomad is a cutting-edge DirectX 12 benchmark with newer, modern rendering techniques designed to push GPUs to their limit. The 'Light' version tests at 1440p, while the main Steel Nomad benchmark tests pure native 4K rendering. Port Royal is a benchmark focusing exclusively on real-time ray tracing for lighting effects like reflections, shadows, and more. The 3DMark results paint a different picture from the overall game averages, showcasing a more significant performance uplift. For example, the Port Royal ray-tracing score is 43% higher than the GeForce RTX 4090's, while the Steel Nomad score is 54% higher. These results aren't something to dismiss, as heavy ray-tracing workloads and modern DirectX 12 and Unreal Engine 5 games are where you'll see the most prominent performance increases.

DLSS 4 and Multi Frame Generation are impressive bits of technology, thanks mainly to the overall improvements to performance and latency on the Frame Generation side and the new 'Transformer' model for Super Resolution and Ray Reconstruction. Looking at the DLSS Super Resolution 4K results, you're not only seeing performance faster than native rendering but also noticeably better image quality. Here, the native or rasterized performance should only be seen as a data point and not a real-world scenario. With 100+ FPS in most games, it also serves as an excellent foundation for Frame Generation to dramatically improve the smoothness and perceived performance (with the help of NVIDIA Reflex and the new AI model). All of the above figures use the DLSS 'Quality' preset, which is stunning in 4K, and with Multi Frame Generation, Cyberpunk 2077, Dragon Age: The Veilguard, Hogwarts Legacy, and Marvel Rivals hit insane levels of performance. So much so that with the GeForce RTX 5090, you're better off using the Frame Generation 3X preset at most.
The good news is that Frame Generation latency increases only minimally, no matter the multiplier, and in all-out testing, the responsiveness felt smooth. As for the image quality, it's hard to notice issues when gaming; however, you can pick up on artifacts when spinning the camera around quickly.

Path Tracing, or Full Ray Tracing, arrived with the GeForce RTX 40 Series and DLSS 3 and is leveling up with the GeForce RTX 50 Series and DLSS 4. It is the realm of high-end GPUs like the GeForce RTX 5090, a glimpse at what games might look like on the PlayStation 7 - we'd be surprised if the PS6 hits the level of fidelity we're seeing here. For those who love being immersed in a digital world, it's hard to describe playing a game like Cyberpunk 2077, Alan Wake 2, or Indiana Jones and the Great Circle with Full Ray Tracing. Screenshots and videos can show you, but the effect is akin to being there. It's not realism for the sake of realism; it's realistic lighting for immersion and believability. Currently, the only GPUs on the market capable of delivering a great Path Tracing experience are all GeForce RTX cards, and it's a definite selling point. Path Tracing is where we also see the most significant gen-on-gen gains for the GeForce RTX 5090, roughly 36% faster than the GeForce RTX 4090. The GeForce RTX 5090 can run Indiana Jones and the Great Circle with Full Ray Tracing natively at 73 FPS, which jumps up to 153 FPS with DLSS Super Resolution and no Frame Generation. Of course, Multi Frame Generation is a game changer in its own right - Cyberpunk 2077 maxes out our 4K 240Hz testbench display. Likewise, Alan Wake 2 looks stunning with DLSS 4 and RTX Neural Shaders.

Even for a flagship GPU, the 575W power rating of the GeForce RTX 5090 can be hard to swallow. On the flip side, you could see it as a throwback to getting the best performance by linking up two GPUs and connecting them to a massive 1000W PSU. Yes, you'll need at least a 1000W PSU for the GeForce RTX 5090, but the good news is that when we went through all the capture data of our 4K gaming benchmarks, the average power usage was 470W. And NVIDIA's idle power usage is still impressive with Blackwell, with the GeForce RTX 5090 sitting at around 20W. That said, fire up a game with Full Ray Tracing or Path Tracing in 4K, and you'll see the RTX 5090 pull 500-550W pretty consistently. As for the new Founders Edition design, it's a definite winner, with temperatures hovering at around 72 degrees under load (this can increase depending on the workload) and idle temperatures sitting at a GPU-chilling 30 degrees with both fans stopped.

The transition has begun, but the GeForce RTX 5090 solidifies a shift to Neural Rendering for performance and image fidelity. The raw performance capabilities of a gaming GPU will always be vital because you can't have one without the other - but after spending an entire week with the GeForce RTX 5090, it's safe to say that DLSS 4 is not only a selling point but a set of AI models and features that improve PC gaming as a whole. When looking closely at gaming workloads across native rendering, DLSS Super Resolution, ray tracing, and even Path Tracing with Frame Generation that pushes the RTX 5090 - the results are impressive and, more importantly, jaw-dropping. Looking at every conceivable metric, from raw performance to AI-generated frames, it's the most powerful GPU on the planet right now - and in Founders Edition form, it looks like something from the future. Which, weirdly, it is.
With the GeForce RTX 5090, we're starting to see 4K gaming performance hit a wall. So, for those focusing on a single result that is only 20% faster than the previous generation, or an average uplift below 30%, there's more to the story - much more - some of which, hopefully, we've covered in this review. We fully expect the GeForce RTX 5090 to only improve throughout 2025 as more games integrate the latest DLSS 4 updates and Neural Shaders arrive in more titles. Still, 75 games supported on day one makes DLSS 4 a real reason to pick up a new GeForce RTX 50 Series GPU. Of course, the RTX 5090's content creation side is next-level, too, from its video encoding capabilities to AI-powered tools for streamers to running AI models and rendering complex 3D scenes. And we didn't even touch on the RTX 5090's 2X AI performance, mainly because we ran out of time after running 20 individual benchmarks on a single game like Cyberpunk 2077. It'll be interesting to see how the rest of the Blackwell lineup stacks up, but either way, the GeForce RTX 5090 is an impressive start to a new generation. 4K 240 FPS with Full Ray Tracing. Maybe it's time to start looking for an 8K display.
[10]
Nvidia RTX 5090
Nvidia's flagship breaks all boundaries - and your wallet... Nvidia's Blackwell platform is, unsurprisingly, packed with exciting new technologies and ways of doing things. We're looking at the Nvidia RTX 5090 graphics card this time, built around a chip with the extremely catchy name GB202. We're not going to go through everything with a fine-tooth comb, because that would make this review 10 pages long. Nvidia has chosen to focus more on AI, as hardware is already down to such small process sizes that it is problematic to push things further; at these levels, diminishing returns are a serious problem. Ideally, I'd like to go through the whole thing because it's technologically extremely impressive, but for anyone other than the ultra-hardcore tech geeks it's also just nonsense talk - impressive nonetheless. Let's start with the design. Nvidia has kept its heavily industrial, function-led design, which I love, and has now moved to a dual blow-through layout: a PCB with the chip in the centre and symmetrically arranged cooling pipes on either side, each with its own fan that draws air through the card. It seems like a more logical way to do it, and it's a little hard to understand the many intermediate steps between this solution and the old design. It also means that the card is now a traditional 2-slot design, which has helped the size enormously. Unfortunately, it's still over 30 centimetres long, which is still too much in my opinion. Nvidia has even claimed that this Founders Edition can fit into small SFF enclosures that typically manage 33-35 centimetres, but I wouldn't try that myself. In addition, the power connector does not extend perpendicularly from the card, but is angled and thus protrudes significantly less than before. A new power adapter has been designed with much more flexible cables that are not braided or extremely stiff, which also helps. Speaking of adapters, I do recommend a dedicated cord if you have one. The attentive reader will realise that there are now not three, but four connections for 8-pin power cables. The good news is that with the right connectors you can actually make do with three, though you lose some performance - typically 10-15%. The bad news is that you need all four to get full value for money, resulting in 575.8 Watts being drawn through the system. Yikes. It's a good thing modern CPUs have become more energy efficient, because that's a lot of power. The GPU runs slightly above the promised 2407 MHz, and the same goes for the RAM speed, which officially stands at 2209 MHz. But back to the internals. We're now up to 32 GB of GDDR7 VRAM in the beast. That's insane. A 4nm TSMC production platform, over 92 billion transistors, and PCIe 5.0 connectivity all come into play, and we hit a VRAM bandwidth of 1.79 TB/s. On top of that, you get 21,760 Shading Units (aka Stream Processors - they're the ones doing most of the work, roughly speaking), which now support Neural Shading and thus generative AI down to this level. Theoretically, this allows real-time rendering of, among other things, faces with far more convincing light, shadows, and reflections than before, and the realism of the graphics should increase significantly if game developers can allocate resources to utilise it.
There are a lot of other innards, but for most people the most relevant will be that there are 170 Ray-Tracing cores and 680 Tensor cores of the latest generation, with these responsible for everything related to upscaling and Deep Learning. The outputs include one HDMI 2.1b and three DisplayPort 2.1b. It's a bit strange that the HDMI port isn't 2.2 when you're launching something with a DisplayPort standard so new that I can't immediately find anything else that has it, perhaps because the standard isn't technically finished yet. There's also support for 4K/480 Hz or 8K/165 Hz with some compression. It'll probably bite me in the arse, but 4K at 240 Hz should be plenty for even the most demanding gamer, although there will probably be esports players who want more of everything. If we had to highlight one thing in particular - apart from the fact that those who use GeForce graphics cards professionally will probably want one because the encoding and decoding have been upgraded as dramatically as the processing power - it would be DLSS 4. Nvidia has moved to a completely new way of upscaling, the so-called Transformer model, which in practice means much less ghosting and far fewer artefacts. In fact, I would go as far as to say that DLSS 4 is a big step forward. Then there's the more controversial part, namely Multi Frame Generation. Where we used to get one artificial image between two "real" ones, we now get up to three. This has led many to criticise the "fake" images, but what is perhaps forgotten is that the latest generation of graphics cards allows much more of the work to be done on the graphics card rather than on the CPU. More of the graphics in modern games will also be generated directly on the card instead of being an existing data set to be calculated from. This requires a lot of computing power, but the delays are minimised with the Reflex 2 system. However, we have to get used to a completely different way of thinking about graphics generation, especially with the many light and shadow requirements, and the fact that the graphics card thinks more on its own. We will still show benchmarks with native results, without any voodoo switched on. Before we get to that, we need to get one thing straight: the price. A top-of-the-range model like the one we've borrowed here doesn't come cheap. It's £1,939/$1,999, and the cards you can buy in stores that aren't made by Nvidia will probably cost more. It's a scary and insane price that also clearly indicates that this card is not intended for mainstream consumers, which is ironic as it's aimed at those who game in 4K on fast monitors with 200+ Hz refresh rates. Things get a little more fun with the RTX 5080 card, which "only" costs £979/$999, and the two even cheaper RTX 5070 cards (around £539/$549). They may not have quite the same 4K capability, but you'll get all the goodness of the RTX 50 Series at a price where many more can suddenly join in, relatively speaking. But a price increase of a few hundred pounds over the RTX 4090 can't be excused by inflation or anything else; it's just pure madness. Then again, buyers of Nvidia's XX90 cards have never been like the rest of us either. And now for the fun part: benchmarks. We tried to get them as right as possible with the drivers that were available, on an X870E platform, with both drives and graphics card running over PCIe 5.0.
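If you want a feel for the arithmetic behind those extra frames, here's a minimal sketch (our own idealized math, not an Nvidia formula - it treats frame generation as free, which it isn't):

```python
# Idealized Frame Generation arithmetic: for every rendered frame, the GPU
# inserts N AI-generated frames. Real results land lower because the AI
# models cost some render time themselves (the 32 -> 56 FPS result in
# Black Myth: Wukong below, rather than a clean 2x, shows that overhead).
def displayed_fps(rendered_fps, generated_per_rendered):
    return rendered_fps * (1 + generated_per_rendered)

base = 32.0  # e.g. native 4K, everything maxed
for n, label in [(1, "2X"), (2, "3X"), (3, "4X / Multi Frame Generation")]:
    print(f"{label}: up to ~{displayed_fps(base, n):.0f} FPS displayed")
```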
The heat output was quite surprising: it idles at 48 degrees Celsius and maxes out at 72 degrees Celsius, but most of the time it was holding at 63 degrees Celsius. It should also be said that the test bench is designed for maximum airflow, but it's impressive regardless. The noise hit 41.5 dB, but is extremely low frequency. No coil whine, no fan blade noise or, worst of all, air turbulence. So it's a job well done. The performance - which with DLSS 4 is double that of the RTX 4090 - is still quite impressive in raw processing power. It's double the performance of the RTX 4080 Super and in some tests even more. There are also new and even wilder Ray-Tracing modes for several games, but as we were unable to collect data on these with older cards, they have not been included. Here we see a massive improvement in most areas, with Port Royal showing a 50% improvement over the RTX 4090. V-Ray 6 is a rendering programme with a solid benchmark tool and a dedicated mode for several types of rendering on both CPU and GPU. The score is three times what an RTX 4080 Super can do, but far from the world record set by a rig running 15 RTX 4090s. Yes, it's for professional use. Black Myth: Wukong has a nice benchmark tool that really shows its worth: 4K, everything on maximum, gives 32 FPS - 56 with Frame Generation. We'll be using this benchmark for a long time. And then for gaming! Total War still doesn't use any assistive technology and is tough on the CPU and GPU, but still shows a 60% improvement over the RTX 4090. Both Assassin's Creed and Far Cry 6 are unfortunately showing their age; it still takes a lot to pull them off in 4K, but beyond that, they don't tell us much. Despite DLSS 3 making a significant difference here, it's the fact that it can run at almost 100 FPS that you should take note of - this is a test that often crashes graphics cards, and even expensive models rarely pull over 50 FPS. Here was a really successful implementation of DLSS 4: no artefacts, strange glitches, ghosting, or anything else - it was virtually impossible to tell the difference between native rendering and Frame Generation, and the same went for Multi Frame Generation. So, here comes the big question: should you buy one? That's a bit complicated. DLSS 4 is a bit of a game changer. If you play around with quality settings, you may already be familiar with Ray Reconstruction and DLAA, but the transition to the Transformer model when upscaling, i.e. using DLSS, is very visible. If you're reasonably happy with your RTX 40 Series card, it may not make much sense to switch, especially as rumour has it (and it is just a rumour) that RTX 40 Series cards will see up to a 10% performance improvement just from this new model. Conversely, Multi Frame Generation is only available on the RTX 50 Series, and it seems to work impressively well, especially combined with a card powerful enough to pull everything in 4K native - which it should be, for the price. Unfortunately, the price is probably my biggest problem. If they had raised the price only a bit, or kept it around the same level, I would have been happy, but an increase of almost 20% is too much, even if it's much more powerful than its predecessor - you expect that from the absolute flagship model. On the other hand, if you have an older card, such as the RTX 3090, then yes, you should probably upgrade if you want to play in 4K.
More and more of the games we look at here at Gamereactor are poorly optimised and incredibly poorly adapted to the PC, so a lot of horsepower must be available all the time. However, Nvidia should be commended for listening to the criticism and making a better power connection and a smaller card. But four 8-pin connections? It's still completely unreal to look at.
[11]
We tested Nvidia's RTX 5090 desktop GPU -- gaming performance gains are HUGE
If you've been reading my stuff over the past few weeks, you'll know I'm pretty pumped about Nvidia's new GeForce RTX 5090 graphics card. From having my mind blown playing the likes of Black Myth: Wukong and Black State, to seeing Cyberpunk 2077 run at over 200 frames per second in 4K with every ray and path tracing feature turned on, the hype around this card is big. And now we've got one to test for ourselves, and it was a pretty revealing experience. One thing is abundantly clear: it's not just about raw performance anymore. Don't get me wrong -- the 5090 is an absolute beast. The increases in total cores and the wattage going to the GPU will drive a lot of the improvement you're about to see. And the massive jump in trillions of AI operations per second (TOPS), alongside the vast increase in total memory bandwidth, will fuel some of the future tech I've been talking about, like RTX Mega Geometry. Nvidia is jumping fully onto the DLSS train (a feature that over 80% of Nvidia gamers use on the regular) and stopping by a lot of RTX neural stations to drive a lot of that truly next-gen gaming performance. Let's get into it. For full transparency, this is the PC we built for our testing. Our master builder Matthew Murray will have a piece about his building experience real soon! So, I know the first thing everybody wants to see is how rendering improves without any AI trickery. Nvidia talked so much about "DLSS" this and "neural" that, it started to give a lot of our readers pause about what the actual improvements in raw horsepower would be. Well, the answer is in line with the smaller bars on the left side of Nvidia's graphs -- roughly a 20-25% increase in frame rates. And if we put it through 3DMark's benchmarks, you can also see where some of the emphasis lies -- improvements in rendering and ray tracing are clear. Now the ultimate question I know a lot of you are asking is simple: are the improvements enough to warrant spending $2,000 on a new GPU? In one word, no. If you're already on an RTX 4090, that's still pretty future-proof for the next few years -- packing enough under the hood to run pretty much all of the oncoming AAA titles at impressive framerates. Plus, DLSS 4 features are indeed coming to it (everything but multi frame generation), so you're good for a while to come. Quick disclaimer to start this: DLSS 4 is not quite here yet -- we're waiting on patches for these games to arrive. It looks as if these will all start arriving on January 30, starting with Cyberpunk 2077. While we wait for DLSS 4 and multi-frame gen patches in all our favorite games, though, we've got a good idea of how the improved AI accelerators and 20% increase in Tensor cores over the 4090 boost frame rates in the widely available DLSS 3 and 3.5. Turns out it's a lot: an average of around 60% smoother frame rates if you go up to the DLSS Performance modes, and around 25% for the Quality option. With DLSS 3, you're just getting one additional generated frame. But with DLSS 4's multi-frame generation, you'll see it generate an additional three frames with an unnoticeable impact on latency. As you've seen in my own testing of Cyberpunk 2077 with this turned on, that means frame rates can spike to over 250 fps. Once this feature becomes available, we'll revisit this testing across different games. What surprised me most when looking at these numbers is the reminder that this is just the beginning.
There is plenty of new neural rendering technology that developers are yet to jump on, which could see games looking even more incredible and running much faster. Of course, we can't base a judgment on future promises, which is why I've opted to tell you the numbers now and hold off on a full review until the DLSS 4 patches drop. However, I can say one thing for sure: if you already have an RTX 40-series GPU, the upgrade here isn't necessarily going to be needed. Of course, if you're someone who wants the best of the best and has the money to do so, knock yourself out. But a lot of these DLSS 4 features are coming to those older cards, and as you can see, every AAA game still absolutely sings on them. One thing is clear: the RTX 5090 is Nvidia's statement about the future of gaming, and about AI as the path to truly next-gen visual quality and frame rates. Whether you agree with this or not is up to you, but I'm certainly on board.
[12]
3 things all Nvidia GeForce RTX 5090 reviews are saying about the graphics card
It appears that the Nvidia GeForce RTX 5090 is living up to the hype. The Nvidia GeForce RTX 5090 is finally here. Everyone in the tech space has at least heard something about Nvidia's most powerful graphics card yet. The company stole the show earlier this year at CES, the biggest technology conference in the world, when it announced the GeForce RTX 5090 GPU. Now, the reviews are coming out from outlets that have had a chance to give the Nvidia GeForce RTX 5090 a spin. While we at Mashable haven't tested the GPU ourselves, our sibling publication PC Mag did! And we pored through reviews from multiple other outlets as well, finding that each had plenty to say about the Nvidia GeForce RTX 5090. To make things easier, here are the points that stayed consistent across all the reviews. Gamers who are looking for the best of the best, look no further than the Nvidia GeForce RTX 5090. The most powerful of Nvidia's new Blackwell graphics cards is a "beast." Review after review has brought up benchmarks showcasing that when it comes to gaming, the RTX 5090 blows its predecessors and other graphics cards away. However, it appears that for many consumers, the RTX 5090 is overkill. Most consumers just won't notice a difference unless they have a really high-end 4K gaming monitor with a 240Hz refresh rate. And even then, the current generation of AAA games on the market isn't utilizing the RTX 5090 to its full potential. "The games just aren't there yet," IGN said about users looking to upgrade their gaming PCs. If you're looking to future-proof your gaming PC, though, the slate of next-generation games will likely take advantage of at least some of what the Nvidia GeForce RTX 5090 has to offer. The RTX 5090's AI-generated multi-frame feature has all the reviewers talking. Nvidia has a suite of tools called deep learning super sampling, or DLSS, which uses AI to improve image quality and boost frame rate. This isn't entirely new from Nvidia. However, the latest version in the RTX 5090, DLSS 4 with multi-frame generation, apparently takes it to a whole new level: Nvidia is now able to insert 3 AI-generated frames for every one "real" frame in order to increase a game's FPS, in turn making it much smoother and more realistic looking. According to the reviews, it works extremely well. The one drawback here is, again, that a user would need a high-end 4K monitor to really see the difference. Those are two consistent threads in every Nvidia GeForce RTX 5090 review; the third is price. The Nvidia GeForce RTX 5090 is expensive at $1,999 for the Founders Edition. That's hundreds of dollars more than its predecessor. In fact, consumers can easily buy an entire gaming computer right now for around half the price of the graphics card alone. Plus, that doesn't include any markups from third-party retailers and sellers. And that last bit is very relevant, because regardless of that nearly $2,000 price point, this thing is going to be hard to find. Between gamers and its AI processing use cases, the Nvidia GeForce RTX 5090 is going to be in high demand. So, if you really want one, get ready to shell out a premium for it.
[13]
Nvidia says the RTX 5070 is as fast as the RTX 4090 -- let's look at the specs
Nvidia made some stark claims with the announcement of its RTX 50-series GPUs. It promised double the performance with the best graphics card Nvidia has ever released, the RTX 5090, and it said outright that the more modest RTX 5070 could equal the performance of the last-generation king, the RTX 4090. That seems dubious, even with the hype surrounding the new hardware. We can't officially confirm or deny such claims until the embargo lifts on reviews for these new cards, but for now, let's consider how the RTX 5070 lines up with the 4090, to give us an idea of what to expect.

Pricing and availability

The Nvidia RTX 4090 debuted in October 2022 with an at-the-time eye-watering price tag of $1,600. It's remained at around that level ever since, with some shortages spiking the price over $2,000 at times, depending on the version. As of late January 2025, however, the card is sold out almost everywhere, with only overpriced versions around $2,500 still available new, and second-hand versions selling for around $1,600. The RTX 5070 is set to go on sale some time in February -- though more likely toward the end of the month than the start. Nvidia's suggested retail price for the card is $550. That's $50 less than the RTX 4070 Super debuted at, but depending on stock and early interest, prices for the card may be higher in the short term after release.

Specs

                     Nvidia RTX 5070                     Nvidia RTX 4090
CUDA Cores           6,144                               16,384
RT Cores             Unknown quantity, 4th generation    128, 3rd generation
Tensor Cores         Unknown quantity, 5th generation    512, 4th generation
Maximum clock        2.51GHz                             2.5GHz
Memory size          12GB GDDR7                          24GB GDDR6X
Memory bus           192-bit                             384-bit
Memory speed         28Gbps                              21Gbps
Memory bandwidth     672GBps                             1,008GBps
TBP                  250W                                450W

On paper at least, it seems almost comical that Nvidia claims the RTX 5070 can match the RTX 4090's performance. It has a third less memory bandwidth, half the physical memory, and barely a third of the CUDA core count with comparable boost clock speeds. That's because Nvidia's claims are largely based around the enhanced support for AI-driven upscaling using the latest generation of Tensor cores. Nvidia's new RTX 50-series supports multi frame generation, which can construct up to three AI-generated frames around a single GPU-rendered frame. The RTX 4090, on the other hand, is restricted to just a single AI frame as part of its support for DLSS 3. That does give its fps a big boost in compatible games, but potentially leaves the last-generation kingpin behind in raw numbers -- even if there are concerns over the viability of such high numbers of AI-constructed frames. Taking off our sceptic's hat for a second, though, it's fair to say that the RTX 5070 uses substantially less power than the RTX 4090, and therefore outputs far less heat. That will make it far better suited to small form-factor gaming PCs -- especially if its new architecture, process node, and indeed DLSS 4 support can help it close the performance gap with the 4090.

Performance

Until we can test the RTX 5070 ourselves, we can't say for sure just how good this next-generation card is or how it stacks up against the RTX 4090. We can use Nvidia's slides and claims to give us a rough ballpark, but with the DLSS-heavy marketing, we should be aware of the upscaling involved to reach some of these numbers.
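The memory bandwidth rows in that table follow directly from bus width and per-pin speed, so they're easy to sanity-check with the standard formula (a quick illustrative sketch):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
def bandwidth_gbs(bus_bits, pin_speed_gbps):
    """Return peak bandwidth in GB/s."""
    return bus_bits / 8 * pin_speed_gbps

cards = {
    "RTX 5070 (192-bit @ 28 Gbps GDDR7)": (192, 28.0),
    "RTX 4090 (384-bit @ 21 Gbps GDDR6X)": (384, 21.0),
    "RTX 5090 (512-bit @ 28 Gbps GDDR7)": (512, 28.0),
}
for name, (bus, speed) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, speed):.0f} GB/s")
# -> 672, 1008, and 1792 GB/s, matching the spec-sheet numbers above
# (and the ~78% gap between the 5090 and 4090 quoted elsewhere).
```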
Nvidia's graphs do tell us more than they might initially appear to, though. While the big claims of double the performance of the RTX 4070 would put the RTX 5070 in the realm of the RTX 4090, we should instead focus on the left two results on the above graph. In Horizon Forbidden West and Resident Evil, where DLSS 3 and no DLSS were used, respectively, we see much more modest performance improvements from one generation to the next. In reality, the RTX 5070 may be less than 20% faster than the RTX 4070. If that's the case, it would take a huge uplift from DLSS 4 and multi-frame generation to even approach 4090 performance, let alone match it. We'll have to wait for real-world testing to settle this performance debate, but at first glance, it seems very unlikely the 5070 can match the 4090 in most games. Even then, it'd be with heavy multi-frame generation, which can introduce latency issues and visual artifacts that are unlikely to be to everyone's tastes.

Wait and see

We only have a month or so to wait to see just how good the RTX 5070 really is. It's likely to be faster than the RTX 4070, and the new DLSS features make it far more capable in select scenarios. It's not likely to measure up to the RTX 4090, though. If you're using that last-generation king of a card already, beware the FOMO of a new generation. You're likely still sitting on one of the fastest graphics cards in the world, and it will do for quite some time.
[14]
MSI GeForce RTX 5090 SUPRIM SOC Review - Supreme Performance
The GeForce RTX 5090 is here, ushering in the RTX Blackwell generation of GPUs for gamers, creators, and AI enthusiasts. As far as flagship gaming performance goes, it can't be beat - which goes without saying, because its predecessor, the GeForce RTX 4090, is still an absolute killer card for 4K gaming. As seen in our full review of the GeForce RTX 5090 Founders Edition model, the gen-on-gen raw performance uplift is decent but not earth-shattering, with the true star of the show being the arrival of DLSS 4, the new 'Transformer' model for Super Resolution, and NVIDIA's new Multi Frame Generation technology that delivers 4K 240 Hz path tracing in games like Cyberpunk 2077 and Alan Wake 2. Incredible stuff, and proof that AI-enhanced gaming is the future. In this review, we'll go into more detail on how DLSS 4's various technologies feel, because gaming is all about immersion, entertainment, and losing yourself in intimate or grand digital worlds. One of the most impressive features of the GeForce RTX 5090 Founders Edition is its sleek, cutting-edge two-slot design, with a specialized custom PCB and a new way to cool a GPU. The MSI GeForce RTX 5090 32G SUPRIM SOC, reviewed here, is more traditional in the sense that it's over 3.5 slots thick, long, heavy, and remarkably robust. It's also stylish and modern, with MSI revamping the SUPRIM look for the GeForce RTX 50 Series. MSI's SUPRIM line-up represents the company's flagship offerings, sporting impressive cooling with a significant out-of-the-box overclock to push the RTX 5090's remarkable performance even higher. Is the SUPRIM faster than the Founders Edition model? Yes, and you can feel it in demanding games and workloads. It ships with a 600W 'Gaming' mode, drawing more power than the 575W reference design. Upping the power limit to 600W doesn't mean it will use all that juice when gaming, as the flagship RTX Blackwell GPU is still relatively efficient in its average power usage. Without a doubt, the MSI GeForce RTX 5090 32G SUPRIM SOC is quieter and runs around 10 degrees cooler than the Founders Edition model - even when using the OC 'Gaming' mode. The chunky 3.5+ slot GPU is still king when it comes to keeping temperatures in check, and it opens the door to additional OC tuning for even more performance. MSI's new flagship SUPRIM GPU doesn't disappoint. Below is a summary of NVIDIA's GeForce RTX 50 Series and RTX Blackwell architecture, applicable to all models. NVIDIA describes 'Neural Rendering,' which includes all previous versions of DLSS and the brand-new DLSS 4, as the 'next era for computer graphics.' They're not alone; the Lead System Architect for the PlayStation 5 Pro console, Mark Cerny, recently said that ray-tracing is the future of games and that AI will play an integral role in making that happen. DOOM: The Dark Ages developer id Software shared a similar sentiment, adding that the arrival of DLSS was an 'inflection point' for PC game visuals and performance, on par with the arrival of dedicated GPUs and programmable shaders. With the arrival of the Blackwell generation and the GeForce RTX 50 Series, AI is now being used to accelerate programmable shaders with the brand-new RTX Neural Shaders.
Yes, these are actual neural networks that use live game data and the power of Tensor Cores to do everything from compressing textures to rendering lifelike materials with a level of detail impossible to match using traditional rendering methods, and even to partially trace rays and then infer "an infinite amount of rays and bounces for a more accurate representation of indirect lighting in the game scene." RTX Mega Geometry is incredible in its own right; it essentially increases a scene's geometry detail and complexity (triangles or polygons) by up to 100x. One hundred times the detail is hard to wrap your head around - but the added benefit in a game like Alan Wake 2 is dramatically improved performance in the game's Full Ray Tracing or Path Tracing mode. With DLSS 4 and RTX Neural Shaders, NVIDIA's GeForce RTX 50 Series and RTX Blackwell architecture (which includes the same AI optimizations as data center Blackwell) can be viewed as the turning point for PC gaming - the moment when AI becomes integral to everything from designing a game to programming and then finally rendering it on a 4K display to play. DLSS 4 includes more goodies than NVIDIA's highly touted new Multi Frame Generation technology, but let's start there. DLSS 3's version of Frame Generation has evolved with DLSS 4, powered by Blackwell hardware and software and an innovative use of AI to generate frames 40% faster while using 30% less VRAM. Switching to a new model also means that Frame Generation and Multi Frame Generation could soon come to GeForce RTX 20, 30, and 40 Series owners. DLSS 4 benefits all GeForce RTX gamers. However, with the 5th generation of Tensor Cores in the GeForce RTX 50 Series delivering 2.5X more AI performance, NVIDIA's latest GPUs can execute five complex AI models - covering Super Resolution, Ray Reconstruction, and Multi Frame Generation - in a couple of milliseconds. Part of the reason it happens so quickly is the addition of hardware Flip Metering, which shifts frame pacing to the Blackwell display engine - the result is frame rates of up to 4K 240 FPS and higher without stuttering issues. With up to 15 of every 16 pixels generated by AI, the result is up to 8X the performance compared to native rendering or rasterized performance. DLSS Super Resolution and Ray Reconstruction are also switching to a new 'Transformer' model, with over double the parameters and four times the compute requirement. This is one of the most exciting aspects of the GeForce RTX 50 Series, as it pushes DLSS into a new realm of image quality and performance. The best part is that it will work on all GeForce RTX GPUs; however, there will be a performance hit compared to running it on an RTX 50 Series GPU. Already available in games, DLSS 4's Transformer model is another DLSS 2.0-like moment for the technology, and the results speak for themselves. Even better, DLSS 4 is being integrated into the NVIDIA App with a new 'DLSS Override' feature that allows users to experience the latest tech without waiting for a patch or game update. DLSS 4 is built to be backward compatible, with 75 games and apps supported. It doesn't stop there, as the new AI Management Processor (AMP) allows AI models to share the GPU with graphics workloads. As a result, expect digital humans in games, alongside AI assistants like NVIDIA's Project G-Assist, to become more prevalent in the coming years.
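The '15 of every 16 pixels' figure falls out of two simple ratios; here's a sketch of that arithmetic (our reconstruction, assuming the DLSS Performance preset and the 4X frame generation setting):

```python
# Where "15 of every 16 pixels generated by AI" comes from:
#  - DLSS Performance upscaling renders at half resolution per axis,
#    so only 1/4 of each displayed frame's pixels are rasterized.
#  - 4X Multi Frame Generation displays 4 frames for every 1 rendered,
#    so only 1/4 of displayed frames are rasterized at all.
upscale_rendered_fraction = 0.5 * 0.5   # 50% per axis -> 25% of pixels
frames_rendered_fraction = 1 / 4        # 1 rendered frame in every 4 shown

rendered_pixels = upscale_rendered_fraction * frames_rendered_fraction
print(f"Rasterized share of pixels: {rendered_pixels:.4f} (1/{round(1 / rendered_pixels)})")
print(f"AI-generated share of pixels: {1 - rendered_pixels:.4f} (15/16)")
```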
This filters down to the creator side, with AI assistants for streamers, who will also benefit from the GeForce RTX 50 Series' expanded creator features. RTX Blackwell introduces 4:2:2 chroma-sampled video encoding and decoding. The ninth-generation NVENC encoder also improves AV1 and HEVC quality. The flagship GeForce RTX 5090 supports up to three encoders and two decoders to deliver a 50% gen-over-gen improvement in speed compared to the GeForce RTX 4090. For creators and editors, RTX Blackwell is a game changer. With the new low-voltage, cutting-edge GDDR7 memory, memory bandwidth and speed see a dramatic improvement. Here's a look at the specs for the flagship GeForce RTX 50 Series GPUs, the GeForce RTX 5090 and GeForce RTX 5080, compared to the previous Ada generation. One of the reasons the Blackwell generation is more power-hungry than Ada comes down to NVIDIA sticking with the same custom TSMC 4N process, which means we're not seeing the sort of dramatic efficiency gains that we saw with the GeForce RTX 40 Series. However, there are several fundamental changes to the GPU design, from the SM layout to AI being integrated into all aspects to the latest generation of RT Cores, which add a bunch of advanced ray-tracing tech that will take a little while to filter down to the games we play. Compared to the GeForce RTX 4090, the new flagship has 33% more CUDA Cores, Tensor Cores, RT Cores, and even VRAM capacity. Of course, it's not simply 33% more of the same, as the new Tensor Cores include FP4 support that can deliver over double the AI performance. Also, the 32GB of GDDR7 memory on a 512-bit bus has a bandwidth of 1792 GB/sec - a whopping 77% more than the GeForce RTX 4090. In terms of hardware, the GeForce RTX 5090 is simply the most advanced GPU we've ever seen - 92.2 billion transistors on a 750mm² die, which is remarkable, to say the least. As you can see in the specs for the MSI GeForce RTX 5090 32G SUPRIM SOC, you're looking at a souped-up GeForce RTX 5090 with an out-of-the-box Boost Clock speed of 2565 MHz - 158 MHz higher than the reference design and Founders Edition model. This can be pushed to 2580 MHz with MSI Center and even higher with MSI Afterburner. As an OC model, the SUPRIM increases the power rating to 600W - however, as you'll see in the summary of the GPU's thermal performance, the average usage when gaming sits well below this. Still, you'll need at least a 1000W PSU for this beast of a graphics card. It's a small and subtle change, but we love how MSI has made the SUPRIM logo and name a key part of the physical design. Gone are the MSI branding and iconic 'MSI Dragon' logo; in their place, you've simply got the SUPRIM name and the polygonal logo that now sits on the backplate, middle fan, and corner. This helps separate the SUPRIM model from the rest of MSI's line-up, further bolstered by a brushed metal look and stylish angles that make it look as premium as its notably higher-than-MSRP price point. The lighting here is also minimal and stylish, and by default it is a cool white color to complement the 'diamond-cut' inspired look of the new SUPRIM design. Of course, MSI has also improved the cooling and thermal design with its latest flagship, which the company calls 'Hyper Frozr Thermal,' a fancy way of naming various components. The MSI GeForce RTX 5090 32G SUPRIM SOC sports three of the company's new STORMFORCE fans, which are optimized for airflow and minimal noise.
It also sports an advanced built-in vapor chamber for both the GPU and VRAM, square-shaped core pipes, and heatsink fins shaped and contoured to ensure that only cool air comes in and hot air goes out. There are many other premium touches, including a thickened copper layer inside the PCB to improve heat dissipation, high-quality thermal padding, a Dual BIOS mode, a bundled GPU stand (sporting one of the best designs we've seen), a color-coded 16-pin power connector so you can be sure it's plugged in correctly, and several power management and delivery improvements for long-term reliability. And it all works flawlessly; the MSI GeForce RTX 5090 32G SUPRIM SOC runs quiet and cool even when overclocked. PC gaming covers a wide range of genres and styles, from indie games with simple 2D graphics to massive 3D worlds lit by cutting-edge real-time ray tracing technology, and with that, the needs and requirements of each gamer vary. For those who live and breathe fast-paced competitive games, high refresh rates and latency reduction are more important than flashy visuals or playing at the highest resolution possible. For those who want to live in a cinematic world and become a key player in an expansive narrative, ray-tracing and high-fidelity visuals are a stepping stone toward immersion. Our chosen benchmarks cover various games, engines, APIs, and technologies. For the GeForce RTX 5090, all tests are run at 4K and 1440p and include results for performance-boosting Super Resolution technologies like NVIDIA DLSS 4 - including Frame Generation and the new Multi Frame Generation. In many ways, DLSS numbers are more important in 2025 than native rendering - a title with ray tracing isn't meant to be played without Super Resolution. Also, DLSS technologies like Ray Reconstruction and the new RTX Mega Geometry dramatically improve visual fidelity and detail compared to native rendering. However, our benchmark results are still sorted by 'raw performance' or native rendering. Here's the breakdown of games, settings, and what's being tested. Like the GeForce RTX 4090 before it, the new GeForce RTX 5090 is, first and foremost, a 4K gaming GPU. Looking at native or rasterized performance, the MSI GeForce RTX 5090 32G SUPRIM SOC is 30% faster than the GeForce RTX 4090 and 3-5% faster than the GeForce RTX 5090 Founders Edition model. The 1% low figures are also higher than the Founders card's. A 30% improvement in raw performance is impressive; however, even at 4K, the GeForce RTX 5090 can be underutilized and run into a few bottlenecks that hold it back from its true potential. A prime example of this can be seen in Warhammer 40K: Space Marine 2, where the MSI GeForce RTX 5090 32G SUPRIM SOC is only 16% faster than the GeForce RTX 4090 while barely breaking a sweat. Things get interesting when looking at titles like F1 24 with full in-race ray tracing features enabled; here, the MSI GeForce RTX 5090 SUPRIM is 39% faster than the GeForce RTX 4090. Cyberpunk 2077 with the 'Ultra' ray tracing preset is 41% faster, and the intense real-time strategy action of Total War: Warhammer III also runs 41% faster than on the GeForce RTX 4090. There is a catch, though: the performance gap narrows when gaming with the DLSS 'Quality' preset in some games, on account of the native rendering resolution dropping to 1440p.
With the new Transformer model (which we enabled in a few of the games tested), DLSS 4 is a game changer for visual fidelity, and with Ray Reconstruction using the new model, native or rasterized performance doesn't make much sense for games with RT - or even for those without it. Looking at the overall performance across all 14 games tested, the 4K 163 FPS average with the DLSS Quality preset for the MSI GeForce RTX 5090 SUPRIM is remarkable. And this doesn't include Frame Generation, which we'll get to in a bit. 1440p gaming with the MSI GeForce RTX 5090 SUPRIM can be impressive. Counter-Strike 2, tested with a super-intensive smoke and explosion-filled stress test map, hits an average of 418 FPS. The Unreal Engine 5-powered smash hit Marvel Rivals hits an impressive 222 FPS with DLSS. However, it's mostly a story of diminishing returns, as the MSI GeForce RTX 5090 SUPRIM shifts into low gear at this resolution. You are looking at significantly less GPU usage for an average 12% performance improvement over the GeForce RTX 4090. Still, the GeForce RTX 5090 is the only card that can run Black Myth: Wukong at 100+ FPS at 1440p using the game's Very High quality preset, so there's that. In many games, you can only boost performance at 1440p on a GPU like the GeForce RTX 5090 or RTX 4090 by using DLSS in conjunction with Frame Generation. Multi Frame Generation at 1440p in games like Cyberpunk 2077 delivers incredible smoothness and responsiveness, and those benchmarks can be found below. 3DMark offers a suite of synthetic benchmarks built to test GPUs in various scenarios. 3DMark Steel Nomad is a cutting-edge DirectX 12 benchmark with newer, modern rendering techniques designed to push GPUs to their limit. The 'Light' version tests at 1440p, while the main Steel Nomad benchmark tests pure native 4K rendering. Port Royal is a benchmark focusing exclusively on real-time ray tracing for lighting effects like reflections, shadows, and more. As we saw with the Founders Edition model, the MSI GeForce RTX 5090 SUPRIM's synthetic 3DMark benchmark results paint a very different picture than actual in-game results. This is strange, because 3DMark results usually align with real-world performance for GeForce RTX graphics cards. Here, we see the MSI GeForce RTX 5090 SUPRIM deliver a Steel Nomad score that is 56% higher than the GeForce RTX 4090's. Likewise, the ray-tracing Port Royal score is 46% higher. There are titles with heavy ray-tracing, like Cyberpunk 2077, where the MSI GeForce RTX 5090 SUPRIM is 40% faster than the GeForce RTX 4090. Also, as Steel Nomad is still relatively new and designed around modern GPU features, it could indicate what we'll see as more games are released that take advantage of RTX Blackwell's brand-new rendering technologies. DLSS 4 and Multi Frame Generation are impressive bits of technology, thanks mainly to the overall improvements to performance and latency on the Frame Generation side and the new 'Transformer' model for Super Resolution and Ray Reconstruction. We used the DLSS 'Quality' mode preset for these benchmarks, which often delivers better-than-native image quality. Focusing on the 4K results without Frame Generation, you're looking at excellent performance across all of the games benchmarked - Cyberpunk 2077, Dragon Age: The Veilguard, Hogwarts Legacy, and Marvel Rivals. What Frame Generation does is primarily increase the smoothness of how a game plays.
Multi Frame Generation allows you to select how many AI-generated frames you want it to create. Here, we've tested 2X (the current standard) and the new 4X, which generates three additional frames. With the 4X preset, Cyberpunk 2077's smoothness with ray-tracing sees a 4.7X increase in performance and frame rate, while the rest of the games see a 3.8X improvement. Regarding how it looks, yes, 2X does look better than 4X, but the image quality difference - in motion - is pretty minor, as is the increase in latency. The overall latency of Multi Frame Generation is excellent, maintaining the responsiveness of how it feels to play each of these titles at 100 FPS. Frame Generation is all about maxing out your display's refresh rate, so we tested on a 4K 240 Hz display, where you could drop the multiplier to 3X and still max it out. With fast movement, you can notice issues with the image, from ghosting to some general fuzziness - but for the most part, it's smooth sailing. Frame Generation's smoothness makes it a worthwhile technology - especially when it comes to Path Tracing or Full Ray Tracing. Path Tracing, or Full Ray Tracing, arrived with the GeForce RTX 40 Series and DLSS 3 and is leveling up with the GeForce RTX 50 Series and DLSS 4. It's only possible thanks to AI technologies like DLSS Super Resolution, Ray Reconstruction, and RTX Neural Shader technology like RTX Mega Geometry. It's designed specifically for these technologies, and we're only including native or rasterized performance to highlight just how intensive it is, even on a GPU as powerful as the GeForce RTX 5090. In fact, beyond the massive drop in performance, these games also look notably worse without DLSS 4. Path Tracing is where we see the most significant performance gains for the MSI GeForce RTX 5090 SUPRIM, where, on average, you're looking at 40% faster ray tracing, with that number climbing to 46% when looking specifically at Alan Wake 2. Interestingly, the build we used for Alan Wake 2 was a special preview branch with DLSS 4 features, including the new Transformer model, Multi Frame Generation, and even RTX Mega Geometry on the rendering side. The result is mind-blowing, with the DLSS Performance preset looking as good as the previous DLSS Quality preset with the older AI model. Frame Generation and Multi Frame Generation are excellent additions, too, as you get the responsiveness of a 60+ FPS experience with the smoothness of a 200+ FPS game. Seeing is believing when it comes to both Path Tracing and DLSS 4; it's a glimpse at the future of gaming, and we can't wait to see how it looks in some of 2025's biggest releases like DOOM: The Dark Ages. As that will be running on id Tech, the same engine used in Indiana Jones and the Great Circle, we're expecting big things. Indiana Jones is the only Path Tracing game on the GeForce RTX 5090 that doesn't need Frame Generation. When the GeForce RTX 4090 launched with a power rating of 450W, it felt like a lot, so the GeForce RTX 5090's 575W power rating is definitely on the extreme side. And with the MSI GeForce RTX 5090 32G SUPRIM SOC pushing that to 600W, you need a 1000W PSU to game with it. That said, the average 4K gaming power usage with the GPU was 490W, so it won't run at 600W the moment you fire up any game. In fact, with RTX Blackwell's Max-Q technologies, power management is impressive in providing what's required at any given moment. The MSI GeForce RTX 5090 32G SUPRIM SOC's thermal performance is exceptional, especially when you factor in the additional power draw.
During our tests, the temperature stayed below 65 degrees even when overclocked, and the fans were either silent or whisper-quiet the entire time. Compared to the Founders Edition model, this is a significant improvement (10 degrees cooler with quieter fans) and opens the door to additional tweaking and tuning. The MSI GeForce RTX 5090 32G SUPRIM SOC is a beast; it's the most powerful gaming GPU we've reviewed, tested, and played games with. With some additional modest overclocking using MSI Afterburner, we were able to see it perform 5-6% faster than the Founders Edition model. Compared to the GeForce RTX 4090, you're looking at 32% faster performance on average with some OC action, with that figure climbing to 43% in games like Cyberpunk 2077 and Alan Wake 2 with Path Tracing or Full Ray Tracing. These generational numbers are impressive; however, the 600W power rating and $2399.99 price point make it more of a luxury item or halo product. The real star of the show, however, is the arrival of DLSS 4 and RTX Neural Shaders. The latter will start showing up in more and more titles as time goes on, but from what we've seen in Remedy's Alan Wake 2 and the Half-Life 2 RTX demo we went hands-on with at CES, game visuals are seriously going to level up with the GeForce RTX 50 Series. The best part is that, outside of Multi Frame Generation, which currently relies on the new RTX Blackwell architecture for smooth performance, all of these features will work across every RTX GPU - from the GeForce RTX 2060 to the GeForce RTX 4090. In fact, with AI becoming an integral part of the GeForce RTX 5090 and RTX 50 Series experience, from gaming to content creation, the industry (led by NVIDIA) has moved on from 'how many frames can it render natively.' Seeing is believing, and 4K 240 FPS with Full Ray Tracing is mind-blowing when you go hands-on. The flagship RTX Blackwell GPU is even more impressive in MSI GeForce RTX 5090 32G SUPRIM SOC form, and we can't wait to see how 2025's biggest releases use all of its advanced features and premium cooling. In the meantime, we'll use it to spend quality time in Cyberpunk 2077's stunning Night City.
[15]
Nvidia GeForce RTX 5090 FE review
There is an alternative 2025 where you get the Nvidia RTX 5090 of your dreams. That's a timeline where Nvidia has busted Apple's grip on TSMC's most advanced process nodes, managed to negotiate an unprecedented deal on silicon production, and worked some magic to deliver the same sort of generational rendering performance increases we've become used to since the RTX prefix was born. And it's a 2025 where Nvidia hasn't slapped a $400 price hike on the most powerful of its new RTX Blackwell graphics cards. But in this timeline, the RTX 5090 is an ultra enthusiast graphics card that is begging us to be more realistic. Which, I will freely admit, sounds kinda odd for what has always been an OTT card. But, in the real world, a GB202 GPU running on a more advanced, smaller process node, with far more CUDA cores, would have cost a whole lot more than the $1,999 the green team is asking for this new card. And it would still maybe only get you another 10-20% higher performance for the money -- I mean, how much different is TSMC's 3 nm node from its 4 nm ones? The RTX 5090 is a new kind of graphics card, however, in terms of ethos if not in silicon. It's a GPU designed for a new future of AI processing, and I don't just mean it's really good at generating pictures of astronauts riding horses above the surface of the moon: AI processing is built into its core design, and that's how you get a gaming performance boost that is almost unprecedented in modern PC graphics, even when the core at its heart hasn't changed that much. The new RTX Blackwell GPU is... fine. Okay, that's a bit mean; the GB202 chip inside the RTX 5090 is better than fine, it's the most powerful graphics core you can jam into a gaming PC. I'm maybe just finding it a little tough not to think of it as an RTX 4090 Ti or Ada Titan. Apart from hooking up the Tensor Cores to the shaders, via a new Microsoft API, and a new flip metering doohickey in the display engine, it largely feels like Ada on steroids. The software suite backing it up, however, is a frickin' marvel. Multi Frame Generation is giving me ultra smooth gaming performance, and will continue to do so in an impressively large number of games from day one. The nexus point between hardware and software is where the RTX 5090 thrives. When everything's running like it should, I'm being treated to an unparalleled level of both image fidelity and frame rates. It's when you look at the stark contrast between a game such as Cyberpunk 2077 running at 4K native in the peak RT Overdrive settings, and then with the DLSS and 4x Multi Frame Gen bells and whistles enabled, that it becomes hard to argue with Nvidia's focus on AI modeling over what it is now rather disdainfully calling brute force rendering. Sure, the 30% gen-on-gen 4K rendering performance increase looks kinda disappointing when we've been treated to a 50% bump from Turing to Ampere and then a frankly ludicrous 80% hike from Ampere to Ada. And, if Nvidia had purely been relying on DLSS upscaling alone to gild its gaming numbers, I'd have been looking at the vanguard of the RTX 50-series with a wrinkled nose and a raised eyebrow at its $2K sticker price.
But the actual gaming performance I'm seeing out of this card in the MFG test builds -- and with the DLSS Override functionality on live, retail versions of games -- is kinda making me a convert to this new AI world in which we live. I'm sitting a little easier with the idea of 15 out of 16 pixels in my games getting generated by AI algorithms when I'm playing Alan Wake 2 at max 4K settings just north of 180 fps, Cyberpunk 2077's Overdrive settings at 215 fps, and Dragon Age: The Veilguard at more than 300 fps. Call it frame smoothing, fake frames, whatever -- it works from a gaming experience perspective. And it's not some laggy mess full of weird graphical artifacts mangled together in order to hit those ludicrous frame rates, either. Admittedly, there are times where you can notice a glitch caused by either Frame Gen or the new DLSS Transformer model, but nothing so game- or immersion-breaking that I've wanted to disable either feature and flip back to gaming at 30 fps or not run at top settings. There are also absolutely cases where the DLSS version looks better than native res, and times where those extra 'fake frames' are as convincing as any other to the naked eye. Honestly, you're going to have to be really looking for problems from what I've seen so far. And I've been watching side-by-side videos while they export, where you can literally watch them move one frame at a time; the frame gen options stand up incredibly well even under such close scrutiny. If that noggin-boggling performance were only available in the few games I've tested it with at launch then, again, I would be more parsimonious with my praise. But Nvidia is promising the Nvidia App is going to be offering the DLSS Override feature for 75 games and apps to turn standard Frame Gen over to the new frame multiplier. And you still don't need to log in to the app to be able to flip the Multi Frame Generation switch. And, rando PlayStation port aside, most of the games you're going to want to play over the next 12 months -- especially the most feature-rich and demanding ones -- will more than likely include Nvidia's full DLSS feature-set. Unless other deals take precedence... ahem... Starfield. I will say the switch to the transformer model for DLSS hasn't been the game-changer I was expecting from the demos I witnessed at CES, but it's at the very least often better than the standard convolutional neural network in terms of image quality. It's just that it will add some oddities of its own to the mix and doesn't completely rid us of Ray Reconstruction's ghosting. Don't get me wrong, more base-level rendering grunt would always be welcome, but to get to these sorts of fps numbers with pure rendering power alone is going to take a lot of process node shrinks, more transistors than there are stars in the sky, and a long, long time. Oh, and probably cost a ton of cash, too. Though even a little more raster power would push those AI-augmented numbers up even further, and that's something which will certainly be in my mind as I put the rest of the RTX 50-series through its paces. I, for one, am a little concerned about the RTX 5070 despite those claims of RTX 4090 performance for $549. The RTX 5090, though, is as good as it gets right now, and is going to be as good as the RTX Blackwell generation gets... until Nvidia decides it wants to use the full GB202 chip. Yields on TSMC's mature 4N node are surely pretty good this far down the line, eh? And, oh is it ever pretty.
With all the comic girth of the RTX 3090 and RTX 4090, they are just stupid-looking cards. I'm always taken aback whenever I pull one out of its box to stick in a PC. Being able to come back to the dual-slot comfort zone is testament to the over-engineering Nvidia has done with the Founders Edition, even if both the RTX 5090 cards I've tried have been some of the squealiest, coil-whiniest GPUs I've tested in recent history. But your mileage and your PSU may vary; they certainly don't sound great with the test rig's Seasonic power supply. Despite being somewhat of a loss-leader for Nvidia, this RTX 5090 Founders Edition is also likely to be as cheap as an RTX 5090 retails for over the next year. With every other AIB version sure to be bigger, and most of them more expensive, the Founders Edition is the card you should covet. And the one you will be disappointed about when you almost inevitably miss out on what will surely be slim inventory numbers. The GPU at its heart might not be super exciting, but the potential of all the neural rendering gubbins Nvidia is laying the groundwork for with this generation could change that given time. Right now, however, it feels more like an extension of Ada, with the outstanding AI-augmented performance really symptomatic of where we're at in time. Still, when it comes to the raw gaming experience of using this svelte new RTX 5090 graphics card, it's literally impossible to beat with any other hardware on the planet. As a layman, not a huge amount seems to have changed from the Ada architecture through to the GB202 Blackwell GPU. As I've said, on the surface it feels very much like an extension of the Ada Lovelace design, though that is potentially because Blackwell is sitting on the same custom TSMC 4N node, so in terms of core counts and physical transistor space there isn't a lot of literal wiggle room for Nvidia. There are 21% more transistors in the GB202 versus the AD102, and a commensurate 21% increase in die size. Compare that with the move from the RTX 3090 to the RTX 4090: with the switch from Samsung's 8nm node to this same 4N process, Ada's top chip gave us 170% more transistors, but a 3% die shrink. There are still 128 CUDA cores per streaming multiprocessor (SM), so the 170 SMs of the GB202 deliver 21,760 shaders. Though in a genuine change from Ada, each of those can be configured to handle both integer and floating point calculations. Gone are the dedicated FP32 units of old. Interestingly, though, this isn't the full top-tier Blackwell GPU. The RTX 5090 has lopped off one full graphics processing cluster, leaving around 2,800 CUDA cores on the cutting room floor. I guess that leaves room for a Super, Ti, or an RTX Blackwell Titan down the line if Nvidia deems it necessary. You are getting the full complement of L2 cache, however, with near 100 MB available to the GPU. But then you are also seeing 32 GB of fast GDDR7 memory, too, on a proper 512-bit memory bus. That means you're getting a ton more memory bandwidth -- 78% more than the RTX 4090 could offer. There are deeper, arguably more fundamental changes that Nvidia has made with this generation, however. Those programmable shaders have finally been given direct access to the Tensor Cores, and that allows for what the green team is calling Neural Shaders.
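Those shader counts, and the 'around 2,800 cores on the cutting room floor', all reduce to simple multiplication (a sketch that assumes the widely reported 192-SM count for the full GB202 die):

```python
# CUDA core arithmetic for the RTX 5090's GB202 (Blackwell keeps 128 cores per SM).
CORES_PER_SM = 128
rtx_5090_sms = 170
full_gb202_sms = 192  # reported SM count of the full die -- treat as an assumption

enabled = rtx_5090_sms * CORES_PER_SM
full = full_gb202_sms * CORES_PER_SM
print(f"RTX 5090 shaders: {enabled}")              # 21,760
print(f"Full GB202 shaders: {full}")               # 24,576
print(f"Disabled on the 5090: {full - enabled}")   # 2,816 -- the '~2,800' above
```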
Previously the Tensor Cores could only be accessed using CUDA, but in collaboration with Microsoft, Nvidia has helped create the new Cooperative Vectors API, which allows any shader -- whether pixel or ray tracing -- to access the matrix calculating cores in both DX12 and Vulkan. This is going to allow developers to bring a bunch of interesting new AI-powered features directly into their games. And it means AI is deeply embedded into the rendering pipeline. Which is why we do have a new slice of silicon in the Blackwell chips to help with this additional potential workload. The AI Management Processor, or AMP, is there to help schedule both generative AI and AI-augmented game graphics, ensuring they can all be processed concurrently in good order. It's that Cooperative Vectors API which will allow for features such as neural texture compression, which is touted to deliver 7x savings on VRAM usage -- ostensibly part of Nvidia's dedicated push to ensure 8 GB video cards still have a place in the future. But it also paves the way for RTX Neural Radiance Cache (to enhance lighting via inferred global illumination), and RTX Neural Materials, RTX Neural Skin, and RTX Neural Faces, which all promise to leverage the power of AI models to get us ever closer to photorealism. Or at least closer to the sort of image quality you'll see in offline-rendered films and TV. The new 4th Gen RT Cores aren't to be left out, and come with a couple of new units dedicated to improving ray tracing. Part of that push is something called Mega Geometry, which massively increases the amount of geometry possible within a scene. It reminds me a whole lot of when tessellation was first introduced -- the moment you turn off the textures and get down to the mesh layer in the Zorah demo, which showcases the tech, you're suddenly hit by what an unfeasible level of geometry is possible in a real-time scene. This feature has largely been designed for devs on Unreal Engine 5 utilising Nanite, and allows them to ray trace their geometry at full fidelity. Nvidia has set so much store by Mega Geometry that it has designed the new RT Cores specifically for it. The final hardware piece of the RTX Blackwell puzzle to be dropped into the new GPU is Flip Metering. The new enhanced display engine has twice the pixel processing capability, and has been designed to take the load away from the CPU when it comes to ordering frames up for the display. The Flip Metering feature is there to enable Multi Frame Generation to function smoothly -- displaying all those extra frames in between the rendered ones in good order is vital to stop it feeling "lumpy". That's not my phrase, that's a technical term from Nvidia's Mr. DLSS, Brian Catanzaro, and he should know. In terms of the feature set, DLSS itself has also had a potentially big upgrade. Previously DLSS used a convolutional neural network (CNN) as its base model, an image-focused architecture that made sense for a task as image-focused as upscaling. But it's no longer the cutting edge of AI, so DLSS 4 has switched over to the transformer architecture you will be familiar with if you've used ChatGPT -- the GPT bit stands for generative pre-trained transformer. It's more efficient than a CNN, and that has allowed Nvidia to be more computationally demanding with DLSS 4 -- though I've not really seen much in the way of a performance difference between the two forms in action.
Primarily it seems the transformer model was brought in to help Ray Reconstruction rid itself of the smearing and ghosting it suffers from, though it's also there for upscaling. Nvidia, however, is currently calling that a beta. Given my up-and-down experience with the transformer model in my testing, I can now understand why. It does feel very much like a v1.0, with some strange artifacts introduced for all the ones it helps remove. I've saved the best new feature for last: Multi Frame Generation. I was already impressed with the original version of the feature introduced with the RTX 40-series, but it has been hugely upgraded for the RTX 50-series and is arguably the thing which will impress people the most while we wait for those neural shading features to actually get used in a released game. It's also the thing which will really sell the RTX 50-series. We are still talking essentially about interpolation, no matter how much Jen-Hsun wants to talk about his GPU seeing four frames into the future. The GPU will render two frames and then squeeze up to three extra frames in between. Using a set of new AI models, it no longer needs dedicated optical flow hardware (potentially good news for RTX 30-series gamers), and is able to perform the frame generation function 40% faster and with a 30% reduction in its VRAM footprint. That flip metering system now means the GPU's display engine queues up each frame, pacing them evenly, so you get a smooth final experience. The 5th Gen Tensor Cores have more horsepower to deal with the load, and the AMP gets involved, too, to keep all the necessary AI processing around DLSS, Frame Generation, and whatever else they get up to in the pipeline running smoothly. The raw performance of the RTX 5090 is relatively impressive. As I've mentioned earlier, I'm seeing around a 30% improvement in 4K gaming frame rates over the RTX 4090, which isn't bad gen-on-gen. We have been spoiled by the RTX 30- and 40-series cards, however, and that does make this bump seem a little less exciting. The main increase is all at that top 4K resolution, because below that the beefy GB202 GPU does start to get bottlenecked by the processor. And that's despite us rocking the AMD Ryzen 7 9800X3D in our test rig -- I've tossed the RTX 5090 into my own rig with a Ryzen 9 7950X in it, and the performance certainly drops. And in games where the CPU is regularly the bottleneck, even at 4K, the performance delta between the top Ada and Blackwell GPUs is negligible. In Homeworld 3 the 4K performance increase is just under 9%; even worse, at 1080p the RTX 5090 actually takes a retrograde step and drops 7% in comparison. Where the GPU is the star, however, the extra 4K frame rates are matched by the overall increase in power usage. This thing will drain your PSU, and I measured the card pulling down nearly 640 W at peak during our extended Metro Exodus benchmark. The commensurate performance increase does, however, follow, so the performance per watt at 4K remains the same as the RTX 4090's. But yes, it does start to fall down when you drop to 1440p and certainly 1080p. If you were hoping to smash 500 fps at 1080p with this card, we might have to have a little chat. It will still draw a ton of power at the lower resolutions, too, which means its performance per watt metrics drop by 15%.
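To make that efficiency point concrete, here's a rough, illustrative calculation (not from the review) of relative performance per watt. The power ratio uses the two cards' board-power figures; the uplift percentages are placeholders in the spirit of the numbers reported across resolutions:

```python
# Toy relative-efficiency calc: how fps-per-watt compares with the RTX 4090
# when the fps uplift shrinks at lower resolutions. Uplift values are
# illustrative placeholders, not measured figures.

POWER_RATIO = 575 / 450            # RTX 5090 vs RTX 4090 board power (~1.28x)

def relative_perf_per_watt(fps_uplift: float) -> float:
    """fps_uplift = 1.30 means 30% higher frame rate than the RTX 4090."""
    return fps_uplift / POWER_RATIO

for res, uplift in [("4K", 1.30), ("1440p", 1.22), ("1080p", 1.15)]:
    rel = relative_perf_per_watt(uplift)
    print(f"{res:>5}: {rel:.2f}x the RTX 4090's fps per watt")
# ~1.02x at 4K (parity, as found in testing), sliding below 1.0x as the
# CPU bottleneck caps the fps gain while the power draw stays high.
```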
You might say that's not such a biggy considering you'll be looking to play your games at 4K with such a beast of a GPU, but this is a graphics card built for DLSS, and as such, if you hit 4K DLSS Quality settings you're actually rendering at 1440p. That 30% 4K uplift figure is then kinda moot unless you're sticking to native rendering alone. Which you absolutely shouldn't do, because Multi Frame Generation is a game-changer in the most literal sense. The performance difference going from native, or even DLSS Quality, to Multi Frame Generation is stark. With Alan Wake 2 now hitting 183 fps, with a 102 fps 1% low, it's a glorious gaming experience. Everything in the graphics settings can be pushed to maximum and it'll still fly. More importantly, the latency is only marginally higher than with just DLSS settings -- the work Nvidia has done to pull that down with Multi Frame Generation is a marvel. As is the Flip Metering frame pacing. This is what allows the frames to come out in a smooth cadence, and makes it feel like you're really getting that high-end performance. Cyberpunk 2077 exhibits the same huge increase in performance, and is even more responsive than Alan Wake 2, with just 43 ms latency when I've got 4x Multi Frame Generation on the go. And even though Dragon Age: The Veilguard is pretty performant at 4K native, I'll happily take a 289% increase in perceived frame rate, especially when the actual PC latency on that game barely moves the needle. It's 28 ms at 4K native and 32 ms with DLSS Quality and 4x MFG. Another benefit of the DLSS and MFG combo is that it pulls down the power and thermal excesses of the RTX 5090. I've noticed around a 50 W drop in power consumption with MFG in action, and that means the temps go down and the GPU clock speed goes up. Still, the overall combination of high power, high performance, and a new, thinner chassis means that the GPU temperature is noticeably higher than on the RTX 4090 Founders Edition. Running through our 4K native Metro Exodus torture test, the RTX 5090 Founders Edition averages 71 °C, with the occasional 77 °C peak. That's a fair chunk higher than the top Ada, though obviously that's with a far thinner chassis. I'd take that extra little bit of heat for the pleasure of its smaller footprint. What I will say, however, is that I did experience a lot of coil whine on our PC Gamer test rig. So much so that Nvidia shipped me a second card to test whether there was an issue with my original GPU. Having now tested in my home rig, with a 1600 W EVGA PSU, it seems like the issue arose from how the Seasonic Prime TX 1600 W works with the RTX 5090; in my PC the card doesn't have the same constantly pitching whine I experienced on our test rig. The RTX 5090 being a beastly GPU, I've also taken note of what it can offer creatives as well as gamers. Obviously, with Nvidia's AI leanings, the thing can smash through a generative AI workload, as highlighted by the way it blows past the RTX 4090 in the UL Procyon image benchmark. Though the AI index score from the PugetBench for DaVinci Resolve test shows that it's not all AI plain sailing for the RTX 5090. GenAI is one thing, but DaVinci Resolve's use of its neural smarts highlights only a 2.5% increase over the big Ada GPU. Blender, though, matches the Procyon test, offering over a 43% increase in raw rendering grunt. I'm confident the extra memory bandwidth and more VRAM are helping out here. When is a game frame a real frame?
This is the question you might find yourself asking when you hear talk of 15 out of 16 pixels being generated by AI in a modern game. With only a small amount of traditional rendering actually making it onto your display, what counts as a true frame? I mean, it's all just ones and zeros in the end. So, does it really matter? For all that you might wish to talk about Multi Frame Generation as fake frames and just frame-smoothing rather than boosting performance, the end result is essentially the same: more frames output onto your screen every second. I do understand that if we could use a GPU's pure rendering chops to hit the same frame rates it would look better, but my experience of the Blackwell-only feature is that oftentimes it's really hard to see any difference. Nvidia suggests that it would take too long, and be too expensive, to create a GPU capable of delivering the performance MFG achieves, and certainly it would be impossible on this production node without somehow making GPU chiplets a thing. It would be a tall order even just to match the performance increase the RTX 4090 offered over the RTX 3090 in straight rendering. But that's the thing: the RTX 4090's 80% performance bump is recent memory, living rent-free in the minds of gamers. Not that that sort of increase is, or necessarily should be, expected, but it shows it's not completely beyond the realms of possibility. It's just that TSMC's 2N process isn't even being used by Apple this year, and I don't think anyone would wait another year or so for a new Nvidia series of GPUs. Though just think what a die shrink and another couple of years' maturity for DLSS, Multi Frame Gen, and neural rendering in general might mean for the RTX 60-series. AMD, be afraid, be very afraid. Or, y'know, make multiple GPU compute chiplets a thing in a consumer graphics card. Simple things, obvs. Still, if the input latency had been an issue then MFG would have been a total non-starter, and the RTX Blackwell generation of graphics cards would have felt a lot less significant with its rendering performance increase alone. At least at launch. The future-gazing features look exciting, but it's far too early to tell just how impactful they're going to be until developers start delivering the games that utilise the full suite of neural shading features. Without MFG, it would certainly have been a lot tougher for Nvidia to slap a $2,000 price tag onto the RTX 5090 and get away with it. With it, Nvidia can legitimately claim to deliver performance twice that of an RTX 4090; a sole 30% 4K performance bump wouldn't have been enough to justify a 25% increase in pricing. What I will say in Nvidia's defence on this is that the RTX 4090 has been retailing for around the $2,000 mark for most of its existence, so the real-world price delta is a lot smaller. At least compared to the RTX 5090's MSRP. How many actual retail cards we'll see selling for this $1,999 MSRP, and for how long, is tough to say. It's entirely likely the RTX 5090's effective selling price may end up closer to the $2,500 or even $3,000 mark once the AIBs are in sole charge of sales as the Founders Edition stock runs dry. I can see why Nvidia went with the RTX 5090 first as the proponent of Multi Frame Generation. The top-end card is going to benefit far more from the feature than cards lower down the stack, which have less upfront rendering power to call on.
Sure, Nvidia claims the RTX 5070 can hit RTX 4090 performance with MFG, but I'm going to want to see that in a few more games before I can get on board with the claims. The issue with frame generation has always been that you need a pretty high level of performance to start with, or it ends up being too laggy and essentially a bit of a mess. The most demanding games may still be a struggle for the RTX 5070 even with MFG, but I guess we'll find out soon enough come February's launch. Until then, I'll just have to sit back and bask in the glorious performance Nvidia's AI chops are bringing in alongside the RTX 5090.
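As a footnote to the numbers above: the oft-quoted "15 out of 16 pixels" figure falls out of simple arithmetic, assuming it combines DLSS Performance-mode upscaling (a quarter of the output pixels rendered) with 4x Multi Frame Generation (one rendered frame in four). A minimal sketch:

```python
# Back-of-the-envelope check of the "15 of 16 pixels" claim, assuming it
# combines DLSS Performance upscaling with 4x Multi Frame Generation.

output_pixels = 3840 * 2160      # 4K frame delivered to the display
rendered_pixels = 1920 * 1080    # DLSS Performance internal render resolution

pixel_share = rendered_pixels / output_pixels   # 0.25 of pixels per rendered frame
frame_share = 1 / 4                             # 4x MFG: one rendered frame in four

traditional_share = pixel_share * frame_share
print(f"Traditionally rendered share: {traditional_share} "
      f"(1 in {int(1 / traditional_share)} pixels)")
# -> 0.0625, i.e. 1 in 16; the other 15 are AI-upscaled or AI-generated
```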
[16]
The Nvidia RTX 5090 Generates So Many Frames, It Scares Me
Nvidia's multi frame gen on the GeForce RTX 5090 graphics card can produce framerates above 300 FPS, though that doesn't mean you absolutely need a $2,000 GPU. The $2,000 Nvidia RTX 5090 is so powerful that it made me wonder if there were such a thing as too many frames when it came to PC gaming. The thought didn't occur to me until I played Dragon Age: The Veilguard with Nvidia's touted Multi Frame Generation as part of its DLSS 4 update. During a hectic enemy encounter, it went above 360 FPS on the highest graphical settings and even higher when the action died down. It's partly a factor of the new Blackwell architecture, but the graphics card is only a vehicle. The rest is AI and "fake frames." More to the point, it's a process called Multi Frame Generation. With the launch of the latest update to Nvidia's AI upscaler, DLSS 4, the new Nvidia cards can essentially generate up to three frames in between two rendered frames. If the card is already powerful, then multi-frame gen is like putting a cherry on top of the most expensive, decadent gold leaf sundae. What's the difference between playing a single-player roleplaying game at 120 FPS versus 360 FPS? After playing multiple big-name titles with the new DLSS 4 capabilities, I can't tell whether there's any reason you need those framerates other than the primordial joy of watching a number go up. This isn't a full review of Nvidia's new GPU. I can't compare the RTX 5090 Founders Edition to the $1,600 RTX 4090 because I don't have access to one. I can compare it to the RTX 4080 Super and RTX 4070 Ti Super, solid cards with asking prices well below the 5090's. With any high-end card, you always need to rely on software. That's true for the 5090 as well. If you want to play Cyberpunk 2077 at 4K with max settings and ray tracing enabled without DLSS, you won't hit 60 FPS. With DLSS on balanced settings, you'll see over 100 FPS. When you apply frame gen, you get a kick out of the high number, but as with anything generated with the help of AI, there will inevitably be some unintended side effects. The Nvidia GeForce RTX 5090 Founders Edition is set to launch on Jan. 30. The Nvidia GeForce RTX 5090 specs are only slightly less ludicrous than its price tag. The card is based on the new Blackwell architecture, and the two-fan Founders Edition config packs 32 GB of VRAM and 21,760 CUDA cores with a 2.01 GHz base clock and a 2.41 GHz boost clock. The card comes with 4th-gen ray tracing cores with a claimed 318 TFLOPS of performance. All that's well and good, but Nvidia has focused most of its attention on the 5th-gen Tensor cores and the promised 3,352 TOPS of AI performance. AI is the name of the game here. It would be difficult to justify the price increase without the Blackwell architecture and multi frame gen capabilities. Like previous versions of frame gen first released with DLSS 3, this inserts a generated frame between two rendered frames. Multi Frame Gen is better than before, according to Nvidia. It should be more efficient and 40% faster. What helps is an AI Management Processor on the GPU itself that assigns these various AI tasks. The Blackwell cards use so-called "flip metering" when generating frames. This slots them in such a way as to reduce latency. Even then, Nvidia wants you to rely on RTX Reflex 2 to reduce latency further. If this is all starting to sound like a lot to add on top of the traditional horsepower of a new graphics processor, that's because it is.
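A minimal sketch of the interpolation idea described above: for each pair of rendered frames, up to three generated frames are slotted in between, with flip metering pacing them evenly. The timings here are illustrative, not measured:

```python
# Minimal sketch of 4x Multi Frame Generation as interpolation: between
# consecutive rendered frames, three generated frames are slotted in, and
# flip metering paces the whole sequence evenly.

def mfg_timeline(base_fps: float, factor: int, n_rendered: int = 3):
    """Yield (time_ms, kind) for the displayed frame sequence."""
    render_interval = 1000.0 / base_fps   # gap between rendered frames
    step = render_interval / factor       # even pacing via flip metering
    for i in range(n_rendered - 1):
        t0 = i * render_interval
        yield (t0, "rendered")
        for k in range(1, factor):        # factor-1 generated frames
            yield (t0 + k * step, "generated")
    yield ((n_rendered - 1) * render_interval, "rendered")

for t, kind in mfg_timeline(base_fps=60, factor=4):
    print(f"{t:7.2f} ms  {kind}")
# 60 fps base with 4x -> a frame every ~4.17 ms, i.e. ~240 fps displayed.
# Note: a real pipeline must hold back the newest rendered frame until the
# in-between frames exist, which is where interpolation's latency cost lives.
```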
Some games won't be quick on the draw to update their UI for Multi Frame Gen or any new DLSS 4 features. The Nvidia app includes a DLSS Override feature that lets you force a game to upgrade to the new upscaler and frame gen. Either way, you can choose 4x, 3x, or 2x frame gen. Older RTX cards will receive some upgrades with DLSS 4. Still, while the RTX 40-series will have access to the transformer model DLSS and some upgrades to 2x frame generation, Nvidia said the Blackwell architecture is necessary for Multi Frame Generation. I tested Nvidia's new flagship in the prebuilt Origin PC Neuron 3500X, which included an Intel Core Ultra 285K CPU and 32 GB of DDR5 RAM. The PC packs a 1000 W 80 Plus Gold PSU, the minimum Nvidia recommends for the 5090. The desktop tower has an MSRP of nearly $3,400. A 1000 W PSU is nominal for most high-end PCs, though previously that was for the sake of headroom and future-proofing. Considering the RTX 5090 is $2,000, every other component you need to make the most of the GPU's capabilities makes an excessive PC feel even more excessive. There's an unnerving feeling when you jump into a game and watch that frame counter skyrocket. There's a sense that this can't be real, and you immediately look for problems. The thing is, I couldn't spot practically any visual discrepancies. I analyzed the foliage as closely as possible in Dragon Age: The Veilguard but couldn't find any noticeable issues with sharpness or textures. I took it into other demanding games, like Hogwarts Legacy and Cyberpunk 2077. I never thought I'd ever witness Cyberpunk push 200 FPS, but life tends to throw its curveballs. Multi Frame Gen also meant I could push path tracing to ultra and still maintain such a high FPS that anybody looking on would assume I'm faking it (and to a point, I am). While cruising the streets of Night City on a bike, I didn't encounter any hints of hitching. However, I noticed some of the lights on Jackie's bike oddly flickering while driving around at high speed. The flickering occurred at 2x, 3x, and 4x frame gen, though it was worse when relying on more generated frames. You can see the issue for yourself in the video above. I won't say I wasn't distracted by the odd UI flickering, though the game was perfectly playable anyway. I also didn't notice any floatiness in the controls or artifacts in the visuals, but that doesn't mean there weren't other issues I simply couldn't make out myself. With Reflex 2, I didn't notice any responsiveness issues either. Without frame gen turned on, Cyberpunk was running at about 100 FPS with DLSS upscaling on balanced mode. Without DLSS, it would run at sub-60 FPS. The game is perfectly playable on an RTX 5090 without frame gen. The excess frames don't necessarily add to the experience, and those "fake frames" could distract some players more than they aid. I was struck by how normal games looked while running all these generated frames. In Dragon Age: The Veilguard, I noticed one cutscene seemed to pass too fast, but the issue didn't repeat afterward. Playing Alan Wake II, I can't possibly get beyond 60 FPS with path tracing on Ultra and every other setting turned up to 11, even with the 5090 and DLSS on Balanced. When enabling 4x multi frame gen, it jumps past 190 FPS. There were odd issues with pop-in, but that's a known issue for the game running at maximum settings on PC. I tried it and found that the difference between 70 FPS without frame gen and 250 FPS with 4x frame gen didn't disrupt gameplay.
But does it enhance gameplay? I normally test games at 4K on an AOC U27G3X monitor, which only goes up to a 120 Hz refresh rate. That's perfectly fine for most GPUs, and it would be fine for the 5090 if it weren't for multi-frame generation. Without the 2x, 3x, or 4x frame generation, I could only hope to hit around 100 FPS with medium DLSS upscaling and no path tracing in the most demanding titles. If I wanted to make the most of 360 FPS, I would need a really, really expensive monitor like the LG UltraGear that can go up to a 480 Hz refresh rate, but then you may sacrifice the sacred 4K resolution. This last CES was stuffed with 240 Hz 4K OLED monitors from every OEM under the sun. You could grab a beautiful, high-end curved monitor that does 4K and 240 Hz for well over $1,000, but you'll struggle to find one that can make sense of all those frames. LG's lauded bendable 5K2K monitor will only hit 165 Hz if you want the higher resolution. So, plenty of monitors can do 240 Hz, but few do more. That's because there's a point where framerate doesn't make a material difference. Even a professional FPS esports competitor will compete on a 240 Hz monitor. You won't notice a major difference between 120 FPS and 240 FPS if you game casually. I took the 5090 for a spin in Marvel Rivals. It's the kind of game that doesn't demand too much from a PC, and it will hit over 300 FPS with multi frame gen. I'm not a capable enough gamer to know whether the generated frames help or hinder me. It's not going to make my aim better, in either case. Nvidia CEO Jensen Huang made it clear in a recent Q&A with press and analysts that the company priced the RTX 5090 at $2,000 knowing your average gamer can't afford it. In the CEO's words, this is the GPU for the people who want "the best," and they don't care what they spend to get "the best." Having the best also means you would need the best CPU, the highest-rated power supply, and one of the most expensive monitors you can buy. It's a card made for people for whom money is no object. At that point, what is the point of a review? You already know the 5090 is a step above the 4090. It had better be, because it costs $400 more than Nvidia's previous flagship. You still need a game to run at a stable 60 FPS to make frame generation useful, which is why I'm more curious about the RTX 5070 and 5070 Ti for the sake of gamers who can't spend the equivalent of their monthly rent or mortgage payment on a single graphics card. Multi Frame Gen will be a better bargain on cheaper cards. That's why I'm more excited for the still-unannounced RTX 5060. It will all depend on how good a base frame rate the card can offer with fully rendered frames. Still, budget gamers are used to making compromises. Those who pay $2,000 for a graphics card demand none.
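One footnote to the refresh-rate point above: any frames beyond what the panel can refresh never reach your eyes. A toy calculation, using the monitor figures mentioned in the article:

```python
# Toy check of how many generated frames a monitor can actually display:
# anything beyond the refresh rate is simply discarded.

def shown_fps(output_fps: float, refresh_hz: float) -> float:
    return min(output_fps, refresh_hz)

OUTPUT = 360  # fps the RTX 5090 produces with multi frame gen

for hz in (120, 240, 480):
    shown = shown_fps(OUTPUT, hz)
    wasted = OUTPUT - shown
    print(f"{hz:>3} Hz panel: {shown:.0f} fps shown, {wasted:.0f} fps discarded")
# On a 120 Hz monitor, two thirds of a 360 fps output never hits the screen.
```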
[17]
Nvidia RTX 5090 review: fast, but not nearly twice as fast
Nvidia GeForce RTX 5090 MSRP $1,999.00 "Nvidia is, once again, leaving its mark on the flagship throne with the RTX 5090." Pros: unrivaled 4K gaming performance; innovative, attractive Founders Edition design; DisplayPort 2.1 and 4:2:2 encoding; 32GB of memory for AI workloads; DLSS 4 is a treat... Cons: ...when it works properly; insanely expensive; power requirements are off the charts. The RTX 5090 is a hard GPU to review. By the numbers, it's undoubtedly the best graphics card you can buy. That's what happens when you're the only one in town making this class of GPU, and as it stands now, Nvidia is. If you want the best of the best and don't mind spending $2,000 to get it, you don't need to read the rest of this review -- though I'd certainly appreciate it if you did. No, the RTX 5090 is about everything else that RTX 50-series GPUs represent. It delivers that flagship gaming performance, but it also ushers in an entirely new architecture, DLSS 4, and the era of neural rendering. And on those points, the dissection of the RTX 5090 is far more nuanced. Nvidia RTX 5090 specs The RTX 5090 is angled toward PC gamers who want the best of the best -- regardless of the price -- but it's also the first taste we've gotten of Nvidia's new Blackwell architecture in desktops. The big change is neural rendering. With RTX 50-series GPUs, Nvidia is introducing neural shaders -- in collaboration with Microsoft's DirectX -- though we won't see the fruits of that labor play out for quite some time. For immediate satisfaction, Nvidia has DLSS 4. This feature is coming to all RTX graphics cards, replacing the convolutional neural network (CNN) that DLSS previously used with a new transformer model. Nvidia says this leads to a quality boost across the board. For the RTX 5090, the more important addition is DLSS Multi-Frame Generation, which promises up to 4X frame generation in 75 games on day one. DLSS 4 is coming to all RTX graphics cards, but DLSS Multi-Frame Generation is exclusive to RTX 50-series GPUs, including the RTX 5090.

RTX 5090 vs RTX 4090:
Architecture: Blackwell / Ada Lovelace
Process node: TSMC N4 / TSMC N4
CUDA cores: 21,760 / 16,384
Ray tracing cores: 170 (4th-gen) / 144 (3rd-gen)
Tensor cores: 680 (5th-gen) / 576 (4th-gen)
Base clock speed: 2017MHz / 2235MHz
Boost clock speed: 2407MHz / 2520MHz
VRAM: 32GB GDDR7 / 24GB GDDR6X
Memory speed: 28Gbps / 21Gbps
Bus width: 512-bit / 384-bit
TDP: 575W / 450W
List price: $1,999 / $1,599

Although it might seem like Nvidia could just flip a switch and enable DLSS Multi-Frame Generation on all of its GPUs, that's not exactly the case. Nvidia says that with 4X frame generation and Ray Reconstruction enabled, there are five AI models running on your GPU for each rendered frame. To manage all of that, the RTX 5090 includes an AI management processor, or AMP, which handles scheduling of these different workloads across the ray tracing, Tensor, and CUDA cores. Outside of AI hardware, the RTX 5090 brings 32GB of GDDR7 memory. Nvidia bumped up the capacity from 24GB on the RTX 4090, though that doesn't have a ton of applications in games. The extra memory here really helps AI workloads, where training large models can easily saturate 32GB of memory.
The bigger boost is GDDR7, which is twice as efficient as GDDR6 while providing twice the data rate. Nvidia also redesigned its ray tracing and Tensor cores for Blackwell, both of which it says are built for the new Mega Geometry feature. The bigger standout for me is the media encoding engine, however. Nvidia now supports 4:2:2 video encoding, along with DisplayPort 2.1 output. Those are some significant upgrades over the RTX 4090, regardless of what the benchmarks say. 4K gaming performance Twice as fast as the RTX 4090? Not quite. Based on my results, the RTX 5090 is about 30% faster than the RTX 4090 when the new DLSS Multi-Frame Generation feature isn't brought into the mix. And it's a feature you might want to leave out of the mix in some titles, as I'll dig into later in this review. That sounds like a solid generational jump, but I went back to my RTX 4090 review for a sanity check. It's not nearly as big as what we've seen previously. With the RTX 4090, Nvidia provided over an 80% generational improvement, which is massive. Here, it's actually more of a lateral move. The RTX 5090 is 30% faster than the RTX 4090, but it's also 25% more expensive, at least at list price. That said, good luck finding an RTX 4090 in stock at $2,000, much less at list price. The RTX 5090 may not be the generational improvement I expected, but the reality for buyers is still that it's the best option for flagship performance. The average is brought down by a handful of games where the RTX 5090 doesn't show a huge increase. In Assassin's Creed Mirage, for example, there's about a 17% uplift. Similarly, in Forza Motorsport, the improvement shrinks to just 14%. Those aren't exactly the margins I was hoping for when Nvidia announced a new flagship GPU, and especially one that comes in at a significantly higher price. Make no mistake; there are still big wins. As you can see above, I measured a massive 54% improvement in Cyberpunk 2077, which is really impressive. In the previous generation, the RTX 4090 was the only GPU that could run this game at 4K Ultra without upscaling and still achieve 60 frames per second (fps). Now, the RTX 5090 is comfortably reaching into the triple digits. This is the kind of improvement I expected to see across the board. Cyberpunk 2077 isn't a one-off, thankfully. Although the improvements aren't quite as large across the board, I saw similarly impressive uplifts in Horizon Zero Dawn Remastered, Returnal, and Dying Light 2. The improvement may not be above 80% like we saw in the previous generation, but there's still a clear improvement. If you want the best of the best, Nvidia is claiming that throne with the RTX 5090. It's just the expectations that are important. Despite some big wins, I suspect most games will look like Black Myth: Wukong, Red Dead Redemption 2, and Call of Duty Modern Warfare 2. You're getting a nice chunk of extra performance, no doubt, but that lift doesn't fundamentally change the gameplay experience in quite the same way that the RTX 4090 did. Looking over my 4K data, it became clear that the RTX 5090 establishes something of a new normal. The RTX 4090 had an outsized generational improvement, as Nvidia continued to navigate the waters of how it wanted to market its flagships moving forward. The RTX 5090 is disappointing by comparison, and I'm not sure there's much reason for RTX 4090 owners to run out and buy Nvidia's latest. But for those who want the best, it's hard arguing with the numbers the RTX 5090 puts up.
It's easy to argue, however, with Nvidia's misleading claims. We're nowhere near twice the performance of an RTX 4090, and the company confirmed to me that it's seeing a 30% average uplift internally, as well. That's the kind of improvement I'd expect to see out of an 80-class card, but it looks like the death of Moore's Law has to hit everyone at some point. 1440p gaming performance Even at 1440p, it's very easy to run into a CPU bottleneck with the RTX 5090. You can see that just from looking at the averages above; the RTX 5090 shrinks down to just a 22% lead over the RTX 4090. All of my data here is fresh, and was run with a top-of-the-line Ryzen 9 9950X. In short, if you plan to use the RTX 5090 at 1440p, you're giving up a serious chunk of its performance potential, and you're probably better off with a used RTX 4090. Forza Motorsport and especially Red Dead Redemption 2 show the problem here. The RTX 5090 is still able to squeeze out a win across games at 1440p, but the margins are much thinner. That's not a critique of the graphics card, but it is the reality of trying to run this monstrous GPU at any resolution below 4K. There are still some solid wins for Nvidia's latest, particularly in games that scale well on your CPU. Cyberpunk 2077 is once again a standout victory, but you can see similarly large improvements in Dying Light 2 and Returnal. One game that's worth zooming in on is Black Myth: Wukong. This is the only game in my test suite that I run with upscaling enabled by default, and it shows what can happen when forcing upscaling on at a lower resolution. The RTX 5090 is providing a 20% improvement, but as you continue to push down the internal resolution, that lead will continue to shrink. Regardless, the RTX 5090 really isn't built for 1440p. You can use it at this resolution, but you're giving up a chunk of what the RTX 5090 is truly capable of. 1080p gaming performance The idea of using an RTX 5090 at 1080p is a little silly, but I still ran the card through all of the games I tested at this resolution. Here, the CPU bottleneck becomes more extreme, pushing the RTX 5090 down to just a 15% lead over the RTX 4090. You could see that as disappointing, but frankly, I see this resolution as unrealistic for a $2,000 graphics card. However, looking at 1080p data is still valuable, at least at a high level. It's important to remember that DLSS Super Resolution renders the game at a lower internal resolution, so the advantage of the RTX 5090 slips a bit with DLSS upscaling turned on. The RTX 5090 can easily make up that gap with DLSS Multi-Frame Generation -- and even push much further -- but these results are a good reminder of bottlenecks you can run into when using flagship hardware with upscaling. Ray tracing Nvidia dominates when it comes to ray tracing, so it's no surprise that the RTX 5090 enjoys a top slot among the games I tested. However, the improvements aren't as large as I expected. Nvidia has "solved," for lack of a better word, real-time ray tracing. Games that aren't pushing full-on path tracing are seeing less of an improvement, largely due to the fact that lighter forms of ray tracing are fair game for GPUs as weak as the Intel Arc B580. Dying Light 2 is a good example of this dynamic. When this game was released a few years back, it was one of the most demanding titles you could play on PC. But even at 4K with the highest graphics preset and no help from upscaling, the RTX 5090 makes Dying Light 2 look like child's play with a comfortable 90 fps average.
In Returnal, the situation is even more extreme. This is one of the lighter ray tracing games out there, and sure enough, the RTX 5090 crosses triple digits without breaking a sweat, even at 4K. Things get interesting when looking at those more demanding ray tracing games, though. Cyberpunk 2077, once again, serves as a mile marker for the RTX 5090. It's the first GPU to get close to 60 fps at 4K with the RT Ultra preset, which is quite the achievement. Of course, it's possible to push the RT Overdrive preset, as well -- more on that in the next section -- but looking at raw performance, Nvidia is pushing to new heights. The next frontier is path tracing, and for that, I used Black Myth: Wukong. The RTX 5090 provides a great experience, even at the Cinematic graphics preset in the game. But games like Black Myth -- along with Alan Wake 2 and Cyberpunk 2077 -- that have a path tracing mode still need to resort to upscaling, introducing the CPU more into the mix and limiting the performance uplift. Maybe in the next few generations we'll see a native 60 fps in this title from an Nvidia flagship. There really isn't much to talk about when it comes to ray tracing on the RTX 5090, and that's exactly how Nvidia wants it. In the vast majority of games, you're looking at ray-traced performance that comfortably clears 60 fps at native 4K and can easily climb into the triple digits. Ray tracing still forces some upscaling wizardry in titles like Black Myth: Wukong, but for the most part, you can flip on ray tracing without a second thought. That's the way it should be. A closer look at DLSS 4 The chart above is the story Nvidia wants to tell about DLSS 4. Nvidia didn't make this chart, nor did it tell me to make it, but there's a clear narrative that emerges from the data here. Even factoring in PC latency, which is the main issue with frame generation technology, DLSS 4 is doing some magical things. You're going from an unplayable frame rate to something that can fully saturate a 4K, 240Hz monitor like the MSI MPG 321URX. And you're doing so with around half the average PC latency of native rendering. The devil is in the details here, however, and Nvidia has a few little devils to contend with. Cyberpunk 2077 - DLSS 4 Gameplay Here's a different side of the story. Above, you can see a short section of gameplay -- I'm going for five stars here -- in Cyberpunk 2077 with the RT Overdrive preset. I'm using DLAA for a little boost to image quality, and I'm using the 4X frame generation mode. Given that I'm playing on a 4K display with a 138Hz refresh rate, these seem like ideal settings for my setup. Watch the video, and you tell me if it looks like an ideal experience. I can point out a lot of problems here, picking out single frames with various visual artifacts between each swipe of the mouse, but you don't need to pixel peep to see the issue. There's an unnatural motion blur over everything, and the edges of objects are mere suggestions rather than being locked in place. You don't need a trained eye to see that this is a bad experience. You don't need a point of comparison, even. You can watch this video in a vacuum and see that DLSS 4 has some clear limitations. That's not a damning critique of DLSS 4. It's a wonderful tool, but you need to use it correctly. Like any frame generation tech, your experience will rapidly deteriorate when you feed the frame generation algorithm a low base frame rate like I did in Cyberpunk 2077.
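A toy model (not Nvidia's figures) of why the base frame rate matters so much: input is only sampled on rendered frames, so MFG can multiply the displayed fps without improving responsiveness at all:

```python
# Toy model: multi frame generation raises displayed fps, but the game only
# reacts to input on rendered frames, so feel stays tied to the base rate.

def mfg_feel(base_fps: float, factor: int):
    displayed = base_fps * factor            # ignoring generation overhead
    input_interval_ms = 1000.0 / base_fps    # input lands once per rendered frame
    return displayed, input_interval_ms

for base in (30, 60):
    displayed, interval = mfg_feel(base, factor=4)
    print(f"base {base} fps -> {displayed:.0f} fps shown, "
          f"input sampled every {interval:.1f} ms")
# base 30 fps -> 120 fps shown, but input every 33.3 ms (feels sluggish)
# base 60 fps -> 240 fps shown, input every 16.7 ms (feels responsive)
```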
Nvidia wants you to use Super Resolution to get to a playable base frame rate of near 60 fps, and then click on Multi-Frame Generation to saturate a high refresh rate display. Using Multi-Frame Generation alone, especially if you're hovering around 30 fps, will give you a bad experience. Marvel Rivals - DLSS 4 Gameplay Cyberpunk 2077 shows the worst of what DLSS 4 has to offer, but Marvel Rivals shows the best. This is one of several games that use Nvidia's new DLSS Override feature, allowing you to add up to 4X Multi-Frame Generation to games with DLSS Frame Generation through the Nvidia app. Not only is the base frame rate high enough here -- well over 60 fps, even with DLAA turned on -- but you also have a third-person camera. There are some minor artifacts, but nothing that ruins the experience and nothing you'd even notice during gameplay. Alan Wake 2 - DLSS 4 Gameplay Similarly, the artifacting isn't nearly as bad in Alan Wake 2 as it is in Cyberpunk 2077. Here, once again, I'm starting with a base frame rate of around 30 fps and using Multi-Frame Generation to make up the difference. There are some artifacts, and I'd recommend using a combination of Super Resolution and Frame Generation instead. But the experience is at least better compared to Cyberpunk 2077 due to the camera angle. You don't want to just crank DLSS 4 to 4X mode and call it a day. It needs to be fed a base frame rate of ideally 60 fps. Although the latency doesn't significantly increase up to three generated frames -- something that Nvidia should be applauded for on its own -- the number of visual artifacts does. Realistically, I suspect DLSS 4 will more often run in 2X or 3X mode alongside Super Resolution. That, in a lot of games, will provide a much better experience than relying on Multi-Frame Generation alone. Over the past few generations, Nvidia has increasingly relied on DLSS to market its graphics cards, and that same playbook is at work here. It's just not the same selling point that it once was. Super Resolution is still pulling a lot of the weight, and even a single generated frame is enough to saturate most gaming monitors, even as refresh rates climb. There's still a use for 4X Multi-Frame Generation, and with the right circumstances, it works extremely well. But when it comes time to spend $2,000 on a graphics card, I would seriously consider how much DLSS Multi-Frame Generation is offering over a $7 utility like Lossless Scaling. For my money, it isn't providing much of an advantage. This is where you need to carefully consider your setup. You want to be using Multi-Frame Generation alongside Super Resolution in those prestige games like Cyberpunk 2077 and Alan Wake 2, and if you don't have a monitor capable of that high a refresh rate, that second or third generated frame goes to waste. Unlike DLSS 3, Multi-Frame Generation isn't a feature that just works on its own; it needs to work as part of the rest of your gaming rig. Great -- for those in the market Nvidia's CEO hit the nail on the head when defending the price of the RTX 5090: "When someone would like to have the best, they just go for the best." If there's one thing I can say with absolute certainty, especially considering the lack of flagship competition from AMD, it's that the RTX 5090 is the best. It doesn't matter if it's $1,500, $2,000, or $2,500 -- Nvidia's CEO is right when he says that the appetite for this type of product doesn't factor in price nearly as much as more inexpensive options.
The question isn't if the RTX 5090 is the best; it is. The question is if you need the best, and there's a bit more discussion there. The generational improvements are here, but they don't touch what we saw with the RTX 4090. DLSS 4 is incredible, but it falls apart when it's not fed the right information. And 32GB of GDDR7 memory is welcome, but it's only delivering a benefit in AI workloads, not in games. If you're sitting on an RTX 4090, there's not much reason to upgrade here. There's a performance boost, but the real value lies in DLSS 4, and that's something that's very easy to get around without spending $2,000. The RTX 5090 really shines for everyone else. Maybe you had to skip the RTX 40-series due to poor availability, or maybe the RTX 2080 Ti you have just isn't providing the grunt that it used to. In those situations, the RTX 5090 is great. But if you're in the market to spend $2,000 on a graphics card, you probably don't need me to convince you.
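A quick value check using the review's own figures -- roughly 30% faster than the RTX 4090, for a 25% higher list price:

```python
# Perf-per-dollar sanity check from the review's measured 4K uplift and
# the two cards' list prices.

perf_uplift = 1.30                    # ~30% faster at 4K, per the review
price_5090, price_4090 = 1999, 1599   # list prices in USD

price_ratio = price_5090 / price_4090          # ~1.25x
perf_per_dollar = perf_uplift / price_ratio
print(f"Price ratio: {price_ratio:.2f}x")
print(f"Perf per dollar vs RTX 4090: {perf_per_dollar:.2f}x")
# ~1.04x: at MSRP, raw perf-per-dollar barely moves without DLSS 4, which
# is why the review calls the uplift "more of a lateral move".
```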
[18]
Nvidia GeForce RTX 5090 (Quick Look) - A Next-Generation Graphics Card
With the 50 Series of Nvidia graphics cards now in the public eye, we've got our hands on the flagship model, which is looking to take video game graphics and performance to the next level. "Hello everyone and welcome to another exciting Gamereactor Quick Look because, like some other outlets across the globe, we are among the first to go hands-on with the RTX 5090 that was recently unveiled at CES by NVIDIA after much speculation and leaks, but it's finally here." "The 5090 is of course the most expensive and the most powerful card that NVIDIA produces in their new series, and we are also going to do follow-up reviews of some of the subsequent smaller, less expensive cards in that particular series. We are starting out with the big boy though, and it comes in this very awesome packaging with the engraved GeForce RTX 5090 name on the back here." "It seems like it is made with recyclable cardboard. It's funny now that NVIDIA's GPUs are so power-heavy that they themselves constitute quite the climate risk, but be that as it may, it's really cool to have it here and it's lovely that when the card costs as much as it does, it does come in quite the elaborate packaging." "It's nice that NVIDIA thought that far about the unboxing process. So I've removed two small cardboard inlays here so that I can just lift the lid off, as it were, and here you see the card itself. It is dual-slot, it is quite heavy, but particularly in terms of its overall width, I think it was much smaller than I thought it was going to be." "I thought we were heading into a truly massive sort of physical dimension-wise GPU generation from NVIDIA, but that does not seem to be the case. You can also just quietly see the angled power connector here, which by the way draws 575 watts through a quad 8-pin power adapter." "But you can also see one thing that I've been saying about Founders Edition cards for a while, which is that, beyond sort of their raw capability, which of course means the most, they're just such beautiful items." "They are attractive. You want one, even if you don't really understand what it is that they do. And that is both this sort of dual cooling aesthetic that NVIDIA has been going on with for a couple of generations now, but it's overall just fit and finish." "A lot of third-party manufacturers will slap a bunch of plastic on there in flared fins and make it more gamer-y, but this is very stylish, I think, and NVIDIA does incredibly well here. So beyond just skin-deep stuff, this obviously uses the new Blackwell architecture, which actually uses PCIe 5.0, which is great." "It is more of everything, essentially. So more cores, more memory, more AI, a new cooling design, DLSS 4 with frame generation, Reflex 2, all of those things. Some of the stuff is on the hardware side, which we'll test first, and some of the stuff is on the software side, which will take a little bit more fiddling to get right, and we will update you, of course, accordingly." "So inside here, we have 21,760 shading units that support AI, 680 Tensor Cores alongside 170 Ray Tracing Cores, and all of those are a new generation, which is great. It has 32 gigs of GDDR7 memory operating at, I think, 1.79 terabytes per second, which is really cool." "This card, by the way, has roughly 10 times the computing power of a PS5. That is quite significant.
Of course, it will vary how you can utilize that in real-world scenarios, be those professional or gaming-oriented, but again, we will update you accordingly when we have some actual findings to show you." "Now, the price is more than 20% higher than what the RTX 4090 was introduced at, landing at 2,329 euros total. That is insane for a graphics card. We are talking about this particular card perhaps costing several times more than all of the rest of the components in a pretty decent machine, so that is something to take into account." "But when you swap this into the top slot of your motherboard, and you start using those new generation cores, and you utilize the 32 gigs of GDDR7 memory, and you use DLSS 4, what kind of results can this give us? Well, that is something that we are going to have to explore in a full written review, which will hopefully land at the same time you are watching this video."
[19]
Nvidia RTX 5090 8K performance has blown me away already - and it's mainly thanks to Multi-Frame Generation
As soon as Nvidia announced the RTX 5090 at CES 2025, I couldn't wait to get hold of the premium GPU and install it in our 8K test rig - and that's exactly what I've done. Since the unveiling, Nvidia has talked up how powerful the 5090 is, claiming that it offers around twice the performance of the RTX 4090, its predecessor, in certain games. If you've read any of my recent 8K gaming features, you'll know how impressive the 4090 was (and to be fair, still is), and with help from DLSS 3 and Frame Generation, it was the first GPU that made gaming at 8K a real possibility. Our full Nvidia GeForce RTX 5090 review dives into the overall performance (and more) of the new GPU, so here we'll look at how it performs when gaming at the incredibly demanding resolution of 7,680 x 4,320 - and (spoiler alert) it's incredibly impressive and could be the start of 8K gaming finally becoming mainstream. Of course, selling at $1,999 / £1,939 / AU$4,039 for the Nvidia Founders Edition I've tested here means this GPU is far from being a mainstream card, and is aimed at enthusiasts and even professionals, but many of the features that the RTX 5090 comes with that make 8K gaming possible, especially DLSS 4 and Multi-Frame Generation, will be coming to other RTX 50-series cards, including the much more affordable RTX 5070 Ti and RTX 5070 GPUs. For this article, I installed the Nvidia GeForce RTX 5090 into our existing 8K test rig, directly swapping out the RTX 4090 so I could compare the two GPUs running games at 8K fairly. In the coming month, I aim to build a new 8K setup that will take full advantage of the RTX 5090 (it's the first GPU to use PCIe 5.0, for example), but this should give us a good head-to-head comparison between the two cards. Because I wanted to test out the difference DLSS 4 and Multi-Frame Generation make at 8K, I've stuck to three games that will support those new technologies at launch (or close to it) with the RTX 5090: Cyberpunk 2077, Hogwarts Legacy and Star Wars Outlaws. Since its rather disastrous launch back in 2020, Cyberpunk 2077 has turned into one of the best games I've ever played, as well as a graphical showcase, with developer CD Projekt Red updating the game over the years to take advantage of new graphical effects and technology - including support for DLSS 4 and Multi-Frame Generation. Running the game on the RTX 4090 with graphics set to RT Overdrive - essentially the highest possible graphics setting, which makes liberal use of advanced ray tracing for realistic lighting and reflections - and DLSS set to 'Ultra Performance', I got 39.57fps (frames per second) on average. This setting renders the game at a lower resolution, then uses DLSS to upscale it via AI to 8K, and while around 40fps is certainly impressive considering the graphical splendor on show, as always I'm looking to get as close to 60fps as possible, as all 8K-capable displays max out at 60Hz for the moment. Turning on Frame Generation, which adds an AI-generated frame in between 'real' rendered frames, gave me a decent boost to 53.19fps, showing the RTX 4090, with a bit of tweaking, is still a supremely capable GPU. With the RTX 5090 FE installed and DLSS 4 set to 'Auto', I got 54.56fps - a big leap over the 39.57fps the RTX 4090 managed with the same settings. While you might be disappointed that it didn't hit 60fps, remember this is at 8K - an extremely demanding resolution. I also hadn't turned on the RTX 5090's Multi-Frame Generation feature, and this is where things get very fun.
Doing so boosted the frame rate to an incredible 86.48fps, and there was a noticeable improvement in image quality between the 5090 and 4090 thanks to the new Transformer-based AI upscaling tech and improved Frame Generation techniques. This meant that some of the visual oddities DLSS and Frame Generation can sometimes cause were essentially eliminated. But that's not all - as the name 'Multi-Frame Generation' suggests, this feature isn't just capable of generating one frame between rendered frames, but up to three frames. So, turning up Multi-Frame Generation to its '3X' setting, which is two generated frames per rendered frame, the RTX 5090 FE hit an incredible 121.05fps. Best of all, I couldn't distinguish between real and generated frames while playing. Previous versions of Frame Generation could sometimes introduce a bit of blurring. That appears to be fixed here. Even though the RTX 5090 FE was now putting out 8K content at a frame rate that we just can't see at the moment (due to the 8K@60Hz limitation of current 8K TVs), I cranked up Multi-Frame Generation to 4X, so it was now generating three frames for every one rendered frame, allowing the RTX 5090 FE to hit 148.89fps, and without any noticeable reduction in image quality. That's one of the best-looking games in the world right now with the maximum graphics settings outputting at 7,680 x 4,320 (with upscaling) and managing 148.89fps. That's a remarkable achievement, and something that's extremely exciting for the future of PC gaming. I've not played Hogwarts Legacy, but it is one of the first games to support DLSS 4 and Multi-Frame Generation, and it's another graphically ambitious game with open-world elements and support for some of the latest graphical effects. First, I loaded it up with the RTX 4090 installed, and with DLSS set to Ultra Performance but Frame Generation left off, the RTX 4090 hit 77fps on average. That's pretty great, and the grounds around the iconic Hogwarts school (as well as the school itself) looked excellent - with interiors looking especially impressive thanks to shadow and lighting effects. Turning on Frame Generation and setting DLSS to Auto, I hit 81.90fps, again highlighting just how good the RTX 4090 remains. Still using DLSS, but set to Quality, which upscales from a higher initial resolution to preserve graphical fidelity as much as possible, the RTX 5090 hit 92fps - while the RTX 4090 managed 52fps with the same settings (but with the older version of Frame Generation). Turning on Multi-Frame Generation at 4X boosted frame rates to an impressive 156fps. Changing DLSS to Auto with the RTX 5090, I managed to get 150fps without Multi-Frame Generation. Turning it back on at 4X got the 5090 hitting a frankly ridiculous 235fps. Star Wars Outlaws is another game I've yet to sit down and play, mainly due to a mix of Star Wars fatigue and a well-publicized rocky launch. It's also one of the first games to support RTX 50-series features, so I loaded it up and got testing. With DLSS set to 'Performance', the RTX 4090 hit 63fps with Frame Generation on, while the RTX 5090 with the same settings hit 79fps. While Star Wars Outlaws didn't offer the same level of graphical options as the other games I tried, it's a good-looking game, and hitting around 80fps at 8K is, again, very, very impressive. One thing I noticed, however, is that there were quite a few graphical artifacts around characters and objects on both cards.
I was playing using an early beta version of Star Wars Outlaws so I could test the new features before the RTX 5090 launched, so these glitches will hopefully be ironed out quickly, but I found that turning off Ray Reconstruction eliminated the problem - though at a slight cost to performance. I had high hopes for the Nvidia RTX 5090's 8K performance based on the hype we'd been hearing from Team Green, especially when it came to hardware improvements over the RTX 4090, such as more, faster memory (32GB of GDDR7 with 1,790GB/s of bandwidth, versus 24GB of GDDR6X at 1,010GB/s), and it's safe to say the Nvidia RTX 5090 surpassed those hopes - by a big margin. Seeing Cyberpunk 2077 hit 148fps at 8K with graphical settings set to max was an incredible moment that really did make me feel like I was witnessing the future of gaming. And, while the hardware side of the RTX 5090 was extremely impressive, what I learned from this was just how much potential Multi-Frame Generation has. In this early showing it boosted frame rates beyond what I had thought possible - and it also addressed many of the issues I've previously had with Frame Generation. When the original Frame Generation feature launched, I was a bit disappointed. Using it in Cyberpunk 2077 led to a blurry experience where the hit to graphical fidelity was too much to justify the better performance. Since then, Frame Generation has improved, but I did notice in Hogwarts Legacy that there were still some graphical issues when using Frame Generation, especially when it came to running up and down stairs, and rendering backgrounds when partially obscured by fast-moving objects in the foreground. With Multi-Frame Generation, I was extremely happy to see these issues seemingly fixed. Cyberpunk 2077 looked crisp and clear, and those graphical artifacts in Hogwarts Legacy did not appear. By boosting performance so much while minimizing the impact on image quality so well, it looks like Nvidia has got its hands on the holy grail of gaming. I should also point out that during my time pushing the RTX 5090 FE to the extremes with 8K gaming, it remained impressively quiet. Sure, there's fan noise, as is to be expected when using a powerful gaming rig, but I can safely say that there's none of the distracting extreme noise that some rumors were suggesting. The fact that we'll be seeing Multi-Frame Generation come to more and more games, and more affordable GPUs from Nvidia make use of this feature, is extremely exciting - especially if Nvidia continues to improve and refine Multi-Frame Generation as it has done with Frame Generation. The future of PC gaming at 8K is extremely exciting.
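To put those 8K numbers in perspective, here's a small sketch of the pixel counts involved. The per-axis DLSS scale factors used below (Quality about 2/3, Ultra Performance about 1/3) are the commonly cited values for DLSS, not figures from this article:

```python
# Scale of the 8K rendering task, plus the internal resolutions DLSS
# upscales from (per-axis scale factors are commonly cited values).

W8K, H8K = 7680, 4320
W4K, H4K = 3840, 2160

print(f"8K pixels: {W8K * H8K:,} ({W8K * H8K / (W4K * H4K):.0f}x a 4K frame)")

for mode, scale in [("Quality", 2 / 3), ("Ultra Performance", 1 / 3)]:
    w, h = round(W8K * scale), round(H8K * scale)
    print(f"DLSS {mode}: renders ~{w}x{h}, upscales to {W8K}x{H8K}")
# 8K is four times the pixels of 4K, which is why even an RTX 5090 leans on
# upscaling and Multi-Frame Generation to get to 60fps and well beyond.
```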
[20]
Nvidia unveils GeForce RTX 5090, the most powerful graphics card to date with 32% more power and unmatched performance
Nvidia GeForce RTX 5090 is here, setting a new benchmark for graphics cards. With a 32% power increase over its predecessor, the RTX 5090 promises unmatched performance, making it a game-changer for gamers and creators.

Nvidia has launched its most powerful graphics card to date, the GeForce RTX 5090, which promises a significant upgrade in performance compared to its predecessors. The new card, built on Nvidia's Blackwell architecture, is designed to take gaming and professional graphics to the next level. With a 32% increase in power, the GeForce RTX 5090 is ready to handle even the most demanding applications with ease.

The GeForce RTX 5090 comes equipped with 21,760 CUDA cores, a substantial jump from the RTX 4090's 16,384 cores. This boost in processing power means the RTX 5090 can complete tasks faster, making it an excellent choice for gamers and professionals who rely on high-performance graphics. Whether you're playing graphically intense video games or running demanding AI workloads, this card is designed to deliver exceptional results. With 32GB of GDDR7 VRAM, the RTX 5090 also provides more memory than its predecessors, ensuring smooth performance in high-resolution gaming, 3D rendering, and other memory-heavy tasks.

One of the standout features of the GeForce RTX 5090 is its support for Nvidia's Deep Learning Super Sampling (DLSS) 4. This advanced technology uses AI to generate higher-quality frames, enhancing visual fidelity while maintaining high frame rates. The RTX 5090 takes DLSS a step further by introducing a feature called "Multi-Frame Generation", which further boosts frame rates for smoother gaming experiences. In tests with popular titles like Cyberpunk 2077 at 4K resolution, the RTX 5090 delivered an impressive 286 frames per second, showing how the card can handle even the most graphically demanding games without a hitch.

Despite the added power, the GeForce RTX 5090 features a more compact and efficient design. It comes in a dual-slot form factor with an advanced cooling system to keep temperatures in check, even under heavy workloads. During extensive testing, the card maintained temperatures under 86°C, making it reliable for extended use. The redesigned angled power connector and 12V-2x6 adapter make the power setup cleaner and easier to manage, improving the overall user experience.

The Nvidia GeForce RTX 5090 will be available for purchase starting January 30, 2025. Priced at $1,999, it positions itself as a high-end option for those looking to get the best performance out of their system. While the price point may be steep, the GeForce RTX 5090 offers unparalleled performance, making it an appealing option for gamers, content creators, and professionals who need top-tier graphics.

What is the GeForce RTX 5090? The GeForce RTX 5090 is Nvidia's latest and most powerful graphics card, designed for gamers and professionals and offering unmatched performance.

How does the RTX 5090 improve gaming performance? The RTX 5090 significantly boosts gaming performance with advanced ray tracing, AI-powered technologies, and high frame rates, delivering an immersive experience.
[21]
Review: Palit GeForce RTX 5090 GameRock
Introduction and Analysis

Palit has returned with a powerful offering: the Palit GeForce RTX 5090 GameRock. At its heart lies the Blackwell GB202 graphics processor with 170 of its 192 Streaming Multiprocessors enabled, which translates to a staggering 21,760 shader cores, paired with 32GB of new GDDR7 graphics memory running at 28 Gbps. Alongside the raw power, NVIDIA has integrated advanced features such as DLSS 4 and Multi Frame Generation (MFG), promising smoother visuals and more lifelike gaming experiences. This flagship graphics card for hardcore gamers is packed with the latest technologies, but the question on everyone's mind remains: is it worth the steep price of $1999?

NVIDIA is set to release its latest GPUs - the RTX 5090, 5080, and RTX 5070 (Ti) - between late January and early February. The GeForce RTX 5090 tops the line, designed for users who need high computational power for intensive tasks. The naming tradition of NVIDIA GPUs honours great scientists, mathematicians, and innovators; in this case, "Blackwell" pays tribute to David Blackwell, a pioneering African-American mathematician and statistician. His groundbreaking work in game theory, probability, and information theory earned him a place in the National Academy of Sciences, and now his legacy lives on through this advanced GPU.

NVIDIA GeForce RTX 50 GPU specs (some preliminary and subject to change):

| Specification | RTX 5090 | RTX 5080 | RTX 5070 Ti | RTX 5070 | RTX 5060 Ti | RTX 5060 |
| --- | --- | --- | --- | --- | --- | --- |
| GPU (Blackwell) | GB202-300 | GB203-400 | GB203-300-A1 | GB205-300-A1 | GB206 | GB206 |
| GPU SMs | 170 (192 full) | 84 (84 full) | 70 (84 full) | 50 (50 full) | - | - |
| GPU cores | 21,760 | 10,752 | 8,960 | 6,144 | - | - |
| Clock speeds (base/boost, MHz) | 2010/2410 | 2300/2620 | 2300/2450 | 2160/2510 | - | - |
| L2 cache | 98MB | 65MB | - | - | - | - |
| Memory capacity | 32 GB GDDR7 | 16 GB GDDR7 | 16 GB GDDR7 | 12 GB GDDR7 | 16 GB GDDR7 | 8 GB GDDR7 |
| Memory bus | 512-bit | 256-bit | 256-bit | 192-bit | 128-bit | 128-bit |
| Memory speed | 28 Gbps | 30 Gbps | 28 Gbps | 28 Gbps | - | - |
| Bandwidth | 1,792 GB/s | 1,024 GB/s | 896 GB/s | 672 GB/s | - | - |
| Total Board Power (TBP) | 575W | 360W | 300W | 250W | - | - |
| Power interface | 1x 12V-2x6 (16-pin) | 1x 12V-2x6 (16-pin) | 1x 12V-2x6 (16-pin) | 1x 12VHPWR (16-pin) | 1x 12VHPWR (16-pin) | 1x 12VHPWR (16-pin) |
| Launch date (2025) | January 30 | January 30 | February | February | - | - |
| Price | $1999 | $999 | $749 | $549 | - | - |

Built on the Blackwell GB202 GPU, the RTX 5090 sports a larger die than its predecessor. It integrates the GB202-300-A1 core with 170 active Streaming Multiprocessors, giving it 21,760 shader cores. While this is somewhat lower than the 24,576 cores available in the full GB202 die, the reduction may lead to better thermal performance and greater overclocking potential - and it could mean that NVIDIA has an even faster card planned for a later stage. The RTX 50 series thus introduces a flagship GPU with a staggering 92 billion transistors, and it comes with GDDR7 memory delivering up to 1.8 TB/s of memory bandwidth.
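The bandwidth row in the table above follows directly from the memory bus width and per-pin data rate. As a quick sanity check, here's the standard peak-bandwidth calculation (a generic formula sketch in Python, not code from Palit or NVIDIA):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * data rate
# in Gbps per pin. Inputs below come from the spec table above.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(512, 28))  # RTX 5090: 1792.0 GB/s, matching the table
print(peak_bandwidth_gb_s(256, 28))  # RTX 5070 Ti: 896.0 GB/s, matching the table
```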
Graphics card: GeForce RTX 5090
NVIDIA CUDA Cores: 21,760
Shader Cores: Blackwell
Tensor Cores (AI): 5th Generation, 3,352 AI TOPS
Ray Tracing Cores: 4th Generation, 318 TFLOPS
Boost Clock: 2.41 GHz
Base Clock: 2.01 GHz
Standard Memory Config: 32 GB GDDR7
Memory Interface Width: 512-bit
Display Support: 4K at 480 Hz or 8K at 120 Hz with DSC

Memory technology also sees a leap forward. The RTX 5090 features an impressive 32 GB of GDDR7 VRAM on a 512-bit interface, running at 28 Gbps, for a total bandwidth of up to 1,792 GB/s. A larger L2 cache and new memory compression techniques enhance data flow and efficiency. These improvements support high-resolution gaming, real-time rendering, complex simulations, and more, promising smoother performance across applications from 4K gaming to professional design.

The GeForce RTX 5090 has a Total Board Power rating of 575 watts. While this represents the maximum draw under full load, typical power consumption during gaming is expected to be lower - something we'll check in this review. The Founders Edition model will feature at least a dual-slot cooler design to manage the intense thermal output effectively.

Palit RTX 5090 GameRock

This flagship line integrates innovative features such as the Chameleon Panel, which offers dynamic color changes. The panel is engineered to support ARGB Sync Evo, ensuring effortless synchronization with other system components. The card is built on NVIDIA's Blackwell architecture, which promises significant improvements in graphics processing and artificial intelligence capabilities, and it is designed to deliver high efficiency under demanding workloads while prioritizing thermal stability. Multiple cooling technologies - TurboFan 4.0, an Air Deflector system, and Composite Heat Pipes - together keep the thermal output within safe limits even under continuous heavy use. Palit delivers the four-slot-wide GameRock edition at stock clocks, but the cooling performance is excellent, making this a quiet and very handsome product.
[22]
MSI GeForce RTX 5090 SUPRIM SOC 32 GB GPU Review - Enter The Realm of AI-Charged Gaming
It's been two years since NVIDIA introduced its Ada Lovelace GPUs, kicking things off with the RTX 4090 and finishing up the initial lineup with the SUPER family. At CES, the company unveiled its new RTX 50 "Blackwell" family, which features a brand-new architecture and several changes such as new cores, AI accelerators, new memory standards, and the latest video/display capabilities. Today, NVIDIA is releasing the latest and fastest card within its RTX 50 portfolio, the GeForce RTX 5090.

The GeForce RTX 5090 is a top-of-the-line graphics card, designed for enthusiast gamers and prosumers, and it carries a gargantuan price point of $1999 US. That makes the RTX 5090 25 percent more expensive than the RTX 4090, which had an MSRP of $1599, so let's find out if the new features are enough to justify the massive price bump. Today, we will be trying out MSI's GeForce RTX 5090 SUPRIM SOC, a custom variant that is designed with a ton of cooling potential and retails at a premium.

With Blackwell, NVIDIA is going full-on into the AI segment with loads of optimizations and AI-specific accelerators. The Blackwell GPU does many of the traditional things we would expect from a GPU, but simultaneously breaks the barrier when it comes to untraditional GPU operations. These AI technologies are some of the main building blocks of the Blackwell GPU, but there's more within the graphics core itself, which we will talk about in detail, so let's get started.
[23]
Is it worth spending $2,000 on the Nvidia RTX 5090?
Nvidia has launched the RTX 5090, claiming it to be the fastest consumer GPU available, with a price set at $2,000. This release is the first instance of Nvidia creating a $1,000 price gap between its 80- and 90-class cards, making the RTX 5090 significantly more expensive than the RTX 5080, which is priced at $999.

For gamers, the RTX 5090 may seem exorbitant, as $2,000 could typically buy a complete high-performance 4K gaming setup. However, a few scenarios may justify the investment. It offers a remarkable upgrade for those seeking superior performance over the RTX 4090, which retails for around $1,600 to $2,500. And while the $549 RTX 5070 is claimed to offer performance comparable to the RTX 4090, serious gamers who already own an RTX 4090 may feel compelled to upgrade to the RTX 5090 for its advanced capabilities. The RTX 5090 is noted for its significant performance improvements over the RTX 4090, which itself had a substantial advantage over the RTX 3090, and reviews indicate that creators have noted the RTX 5090's exceptional speed, solidifying its appeal to enthusiasts willing to pay for top-tier performance.

A substantial factor for some consumers is the VRAM. The RTX 5090 features 32GB of VRAM, twice that of the RTX 5080, which may be crucial for users who can't afford to run into VRAM limitations. Nvidia has faced criticism for the VRAM capacities of its other cards, particularly as the RTX 5080 retains the same 16GB as the previous-gen RTX 4080 Super and as the RTX 5070 Ti.

The RTX 5090 is also marketed as a powerful option for professionals involved in AI computing. Nvidia has placed substantial emphasis on AI capabilities, highlighting the increased AI TOPS of each card during its CES presentation. Professionals seeking a high-performing AI accelerator may find the RTX 5090 appealing thanks to its improved specifications over the RTX 4090, which currently sells for over $2,000.

However, the RTX 5000 series has received criticism regarding its pricing and advertised performance gains. Nvidia claims the RTX 5070 can perform at RTX 4090 levels, but this assertion rests heavily on the effectiveness of DLSS 4 and Multi Frame Generation, features that may not deliver the anticipated user experience. There are also concerns that the RTX 5000 series' pricing may not hold a competitive edge over older or forthcoming GPUs, particularly with AMD's RDNA 4 architecture expected to launch soon, potentially offering better alternatives at similar price points. Some believe it is possible to secure a last-generation GPU for a perfectly adequate gaming experience, especially as RTX 4000-series prices drop. Furthermore, many features of the RTX 5000 series, including new DLSS capabilities, are expected to be backported to the RTX 4000 and even RTX 2000 series. This availability of advanced features on older models might dissuade consumers from investing in the expensive RTX 5000 series unless they require its unique performance enhancements.

Statistical data also shows that the majority of gamers do not prioritize 4K gaming, with only 4.21% using 4K as their primary resolution. Most gamers are still using 1080p screens, suggesting a limited market for high-end 4K gaming performance.

So, is it worth $2,000?

For general consumers: Maybe. The Nvidia RTX 5090 offers exceptional performance but at a high price. For most consumers, there are other GPUs available that provide good value without such a steep price tag.
For gamers: Maybe. If you're gaming at 4K or aiming for the best possible performance, the RTX 5090 could be worth the investment. However, for most gamers, especially those playing at lower resolutions like 1080p, the RTX 5090 may be overkill.

For professionals (AI/computing): Yes. If you're working with AI or need top-tier computing performance, the RTX 5090 is a strong choice, offering significant upgrades over the RTX 4090, particularly with its improved AI capabilities and 32GB of VRAM.

For enthusiasts: Yes. For those seeking cutting-edge technology and future-proofing their setup, the RTX 5090 offers immense power and performance improvements that will keep your system ahead of the curve.

For budget-conscious buyers: No. If you're looking for good performance without breaking the bank, there are more affordable options, like the RTX 5070 or RTX 4090, that offer excellent performance at a lower price.

For tech analysts: Maybe. The RTX 5090 brings strong performance improvements, but the price may not be justified for everyone, particularly considering the effectiveness of DLSS and other features that could be available on older models.

For those upgrading from an RTX 4090: Maybe. If you're already using the RTX 4090, the RTX 5090 may not provide a huge leap unless you need the advanced features and VRAM for very demanding tasks like high-end AI work or 4K gaming at ultra settings.
[24]
I Tested DLSS 4 in Cyberpunk 2077 on the GeForce RTX 5090, and Here's How It Went
At this point, most PC-gaming enthusiasts are familiar with some flavor of Nvidia's DLSS technology and its ability to boost gaming frame rates. But DLSS just entered new territory with its fourth generation. Nvidia introduced DLSS 4 alongside its upcoming GeForce RTX 50-series graphics cards, and we've now been able to put it to the test first-hand in Cyberpunk 2077. Here's a brief overview of what DLSS 4 is, its benefits, who can use it, what negative impacts it may have, and how it performed in my early testing with Cyberpunk 2077 on an RTX 5090.

What Is DLSS 4?

From one perspective, DLSS 4 is closely related to Nvidia's current DLSS 3 technology. Both focus on generating additional frames beyond those rendered by the GPU's primary cores, driven by the additional AI Tensor cores inside the GPU, to boost frame rates. The key difference is that DLSS 4 introduces a new Transformer AI model (more on that in a bit) for creating these frames that, Nvidia claims, is more accurate, with better image quality. DLSS 4 also aims to generate a greater number of artificial frames per original rendered frame than DLSS 3 could.

The way this works is relatively simple in concept; TVs and computer software have done it for decades. Ever go into a store and see TVs advertising a super-high refresh rate for sports content, whether with a sign promising 240 images every second or a retail display touting how smooth the picture looks? Then you've seen some version of this technology in use before, though having evaluated various forms of artificial frame generation, I will say Nvidia's approach looks better than any standalone TV's.

How does it work? First, the graphics card creates a couple of frames the old-fashioned way. Imagine a baseball player swinging a bat at a ball. In the first frame that the GPU renders, the player may be holding the bat vertically, ready to swing; in the next frame, the bat could be horizontal and colliding with the baseball. In this case, you have missed all the motion between the bat being raised and hitting the ball, and you can't get that back. However, frame generation attempts to fill these gaps by looking at the two frames and creating artificial frames that reflect the visual states between them. In the example above, it might add a frame with the bat positioned horizontally, but further back and not yet hitting the ball. This frame is essentially an interpolated image, so it is not 100% accurate, but it will probably look decent enough as it flashes by in a split second; in that time, you aren't likely to notice any slight graphical errors it produces, provided they aren't grievous. While the three GPU makers (AMD, Intel, Nvidia) have differing methods of achieving the effect, this is how frame generation works in general. It's also an accurate reflection of how frame generation works in Nvidia's last-generation DLSS 3, and things don't change much for DLSS 4.
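To make the interpolation idea concrete, here's a toy sketch in Python. This is only the naive per-pixel blend implied by the simplest reading of the concept; real DLSS frame generation relies on motion vectors and a trained AI model (a CNN in DLSS 3, a Transformer in DLSS 4), so treat this as an illustration, not Nvidia's algorithm:

```python
import numpy as np

# Toy frame interpolation: blend two rendered frames to synthesize one
# in-between frame. Real frame generation is far more sophisticated; this
# just illustrates "creating a frame between two frames".

def interpolate_frame(frame_a: np.ndarray, frame_b: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Return a frame at time fraction t between frame_a (t=0) and frame_b (t=1)."""
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

# Two tiny stand-in grayscale "frames":
a = np.zeros((2, 2), dtype=np.uint8)       # dark frame
b = np.full((2, 2), 200, dtype=np.uint8)   # bright frame
print(interpolate_frame(a, b))             # midpoint frame, pixel values of 100
```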
The difference: DLSS 4 generates even more frames by looking at the differences between that first artificially created frame and the two original frames it is based on (the one before it and the one after it), creating additional artificial frames between them. Technically, this method presents no limit to how many artificial frames you could make this way, but for now Nvidia has it capped at three artificial frames for every original frame, the 4x setting. (According to the company during its Editors Day held earlier this month at CES, it didn't go beyond one artificial frame with earlier DLSS frame generation because the image-quality trade-offs were too great.)

Who Can Use DLSS 4, and Why Not to Use It

Technically, no one at this writing can use DLSS 4, as the GeForce RTX 50 series hasn't shipped yet. Only the GeForce RTX 50 series has been stated to have official support for DLSS 4, starting with the Nvidia GeForce RTX 5090 and GeForce RTX 5080, launching January 30. I've seen speculation that Nvidia could extend DLSS 4 support to the RTX 40 series, but the company has not confirmed this yet. The only other known cards that will support it in the near future, besides the RTX 5090 and RTX 5080, are the Nvidia GeForce RTX 5070 Ti and the Nvidia GeForce RTX 5070, due to release in February. Mobile variants of all these GPUs will launch in March and support DLSS 4, too.

However, there are valid reasons why you may not want to use DLSS 4 at all (or any flavor of DLSS, for that matter), even if you can. All versions of DLSS, just like all versions of AMD FSR and Intel XeSS, present trade-offs. When used without frame generation, all of these tools work by rendering frames at a lower resolution and then upscaling the created image. That's effective at increasing the frame rate but negatively affects image quality. With frame generation, you take a more significant hit to image quality than with standard DLSS, FSR, or XeSS, and you also introduce additional latency. You may also see "1% lows" - the average of the 1% lowest frames during a test - that are substantially lower than your average frame rate, which can feel jarring or stuttery in games.

Also, using frame generation can, counterintuitively, lower your frame rate instead of raising it. Here's why: frame generation increases the frame rate when the graphics card has excess processing power, thermal headroom, and literal power available to make it work, which is likely when the graphics card is bottlenecked by either the game engine or your CPU. When that isn't the case and the GPU itself is the bottleneck, performance can actually go down, as we observed when we tested the Nvidia GeForce RTX 4060.

Due to these issues, I generally avoid using DLSS, FSR, or XeSS. In my free time, I primarily run games at 4K, and I only use DLSS when my system struggles to stay close enough to 60fps for my liking. I go to it as a last resort, as it's a better option than lowering the resolution, but that's it. I view frame generation similarly, but if you are trying to get higher frame rates in a competitive shooter, for instance, it might be advantageous to use it, so long as you aren't going past your monitor's refresh rate. If your monitor can only display, say, 120 or 240 frames per second, there's no reason to push your graphics card to feed it more frames per second than that.

Test-Driving DLSS 4 and Cyberpunk 2077

With DLSS 4, you get a few more options than you had previously with DLSS 3. First is the AI model for frame generation.
This option may not be present in every game, but in Cyberpunk 2077 you can manually select either the older Convolutional Neural Network (CNN) model used by DLSS 3 or the newer Transformer model debuting with DLSS 4. The Transformer model is supposed to provide better image quality, but as you'll see in the chart below, it also leads to a slightly lower frame rate overall.

You also get options for setting a multiplier for how many frames you want to generate. Cyberpunk 2077 defaulted to x2 for us with both the Convolutional and Transformer models, but you can run either model with a multiplier of x3 or x4, too. The higher the multiplier, the faster the frame rate, but you also have a greater chance of graphical defects, more significant 1% low deltas, and longer latency.

Nvidia attempts to get ahead of the issue on the latency front by forcing its RTX Reflex technology on whenever frame generation is used. This is supposed to help reduce latency, and you have no reason not to use it, except when you want an even comparison with non-Nvidia graphics cards. But the fact that it gets locked on shows that either Nvidia or the game's developers deemed it essential for frame generation. The game likely needs an update here, however, as Nvidia also announced RTX Reflex 2, which Cyberpunk 2077 supports and which is supposed to be better - yet I could only activate it with frame generation off.

As you can see from the chart, the Nvidia GeForce RTX 5090 ran Cyberpunk 2077 like a champ with or without DLSS on. This game is extremely demanding, and this is the first time I've seen a graphics card maintain a frame rate of more than 60fps at 4K with the Ray-Tracing Ultra graphics preset. That it takes a $2,000 GPU released years after the game first launched to do this is quite astonishing in itself - and this isn't even the highest graphics preset.

As you can see from the numbers, the multiplier does more or less what you'd expect. From 130fps at 1080p with DLSS 2 (upscaling only), you double your frame rate to 261fps with DLSS 3 frame generation enabled. You get right around triple that 130fps score with DLSS 4 frame generation set to x3, and nearly four times that 130fps baseline with frame generation set to x4. It doesn't scale quite this neatly across the board, due to the overhead on the GPU from running DLSS and frame generation, and the CPU becomes a bottleneck at various points. But it's relatively linear.

Now, you must decide whether you want to use these features. As mentioned before, there are some issues related to DLSS; while problems with Nvidia's FrameView tool prevented me from measuring 1% lows or latency, these drawbacks are well known, if not always clearly defined. You'll also find some unquestionably advantageous places to use it, particularly if you have a high-refresh-rate monitor or struggle to run a game smoothly. The key to using DLSS to your advantage is recognizing when that is, and not using it when your GPU already produces high enough frame rates.
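The near-linear scaling described above is easy to sanity-check against the article's own 1080p numbers, with the ~130fps DLSS 2 result as the baseline (a rough back-of-the-envelope check in Python; real results fall slightly short of ideal because of frame-generation overhead and CPU bottlenecks):

```python
# Rough check of the multiplier scaling reported above: ideal displayed fps
# is simply baseline * multiplier. Measured values trail the ideal slightly
# due to frame-generation overhead and CPU limits.

baseline_fps = 130  # 1080p with DLSS 2 upscaling only, per the article

for multiplier in (2, 3, 4):
    print(f"x{multiplier}: ideal {baseline_fps * multiplier}fps")
# x2: ideal 260fps (article measured 261fps)
# x3: ideal 390fps (article: "right around triple")
# x4: ideal 520fps (article: "nearly four times")
```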
NVIDIA launches the GeForce RTX 5090, a high-end graphics card with significant performance improvements and new AI features, marking a leap in GPU technology for gaming and creative professionals.
NVIDIA has launched its latest flagship graphics card, the GeForce RTX 5090, marking a significant leap in GPU technology. This new addition to the RTX 50-series lineup brings substantial improvements in performance and introduces advanced AI capabilities, catering to both gamers and creative professionals [1][2][3].
The RTX 5090 boasts impressive specifications, including 21,760 CUDA cores, 32GB of GDDR7 memory, and a 512-bit memory interface [1][2]. These enhancements result in a performance increase of up to 50% compared to its predecessor, the RTX 4090, in certain workloads [2]. The card's raw gaming performance shows an average improvement of 27% over the previous generation, with some games seeing gains of up to 50% [3].
One of the most significant advancements in the RTX 5090 is the introduction of DLSS 4 with Multi Frame Generation (MFG) technology. This AI-powered feature can generate multiple frames for each rendered image, potentially doubling the number of AI-generated frames compared to previous versions [3][4]. The new Blackwell architecture optimizes AI workloads, making the RTX 5090 particularly attractive for AI researchers and engineers [2].
Despite its increased power draw of 575W, NVIDIA has managed to design the RTX 5090 Founders Edition as a dual-slot card with an innovative cooling solution. The card features a compact PCB with a heatsink running through its width and two strategically placed fans for efficient heat dissipation [5].
The NVIDIA GeForce RTX 5090 Founders Edition is priced at $1,999, with availability starting January 30, 2025 [2][3]. This represents a $400 increase over the RTX 4090's launch price, reflecting the card's enhanced capabilities and potential demand from both gaming and AI sectors [4].
The RTX 5090 is positioned as the undisputed leader in the high-end GPU market, outperforming its predecessors and competitors in most benchmarks [3][4]. Its advanced AI features and substantial VRAM make it particularly appealing for content creators and AI researchers, potentially driving strong demand despite the high price point [2][4].
The NVIDIA GeForce RTX 5090 represents a significant advancement in GPU technology, offering unparalleled performance for gaming and creative workloads. While its high price may limit its appeal to enthusiasts and professionals, the card's innovative features, particularly in AI and DLSS technology, position it as a glimpse into the future of graphics processing [1][2][3][4][5].