2 Sources
[1]
The RTX 5060 is cheap, but can it game? We played 5 titles to find out
Nvidia's GeForce RTX 5060 is set to launch on May 19 at prices starting from $299, and the lofty promises of 1080p gaming at super high frame rates are plentiful. But understandably, there's one big concern: 8GB of video memory. Video memory (or VRAM) is crucial for storing in-game textures and shaders for the GPU to quickly access and process to produce the game you see on-screen. And with recent AAA games like Indiana Jones and the Great Circle demanding a baseline of 12GB of VRAM, gamers are nervous that this card just won't have the horsepower to keep up.

So we have one question to answer: the RTX 5060 is cheap, but can it game? Or is the slightly pricier RTX 5060 Ti the actual base model? We played 5 titles at 1080p to find out. Spoiler alert: when tested under Nvidia's scenarios, the games live up to (and in some cases slightly exceed) the company's performance claims. Of course, this is a very early look at what the card can do, and Nvidia has been careful to make sure we share information about games that have been extremely well-optimized for Nvidia GPUs. The games we can talk about are Avowed, Cyberpunk 2077, Doom: The Dark Ages, Hogwarts Legacy and Marvel Rivals. Provided you stay within these fertile grounds (and let's be honest, a lot of PC games play well with Team Green), you're going to have a great time at 1080p, with DLSS 4 as the backbone of it all.

DLSS 4 unlocked unfathomable frame rates on the RTX 5060 Ti at 1440p, and it does something similar here. It may not be done through pure rendering (or rasterization if you're nasty), but as we've noticed time and time again, the new transformer model used for frame generation absolutely slaps - all while handling the likes of Doom at the Ultra Nightmare preset at 1080p. We saw frame rates during gameplay hold steady at over 200 FPS with multi-frame generation 4x turned on (where one frame is rendered and the next three are filled in with AI). Dying at 230 FPS may be annoying, but it's also a smooth way to go. And after all the AI trickery, the texture pool size is just over 1.5GB. Granted, this is an isolated example, but it is proof that the neural rendering work does compress the demands on VRAM. We didn't see any clipping or stuttering at all!

Meanwhile, Cyberpunk 2077 doesn't show an accurate frame rate when using the game's own benchmarking tool. When we asked Nvidia, our rep confirmed that this is because the benchmark tool was unable to recognize the frame generation - probably a symptom of the pre-release driver being used for the 5060. The game reported an average of 34.97 FPS, which is a far cry from the claimed 149 FPS and very different from what we saw by eyeballing the gameplay. One quick run of FrameView, however, and we see that triple-digit number.

A frustration I've run into a few times with RTX 50-series cards comes down to the DLSS Override settings in the Nvidia app. Ever opened a game that is said to support all the DLSS 4 options, but you don't see them show up in the game settings? For those, you're supposed to head to the Nvidia app and manually override them. The experience sounds good on paper, but it can be a gauntlet of trial and error - the kind of tweaking that gamers who go for this card (the cheaper one meant to lower the barrier of entry) may not know how to do.
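That 1.5GB texture pool figure came from the game's own stats; if you want to sanity-check VRAM pressure on your own 8GB card, one rough approach is to poll nvidia-smi while you play. Here's a minimal sketch in Python - the nvidia-smi tool and its query flags ship with Nvidia's drivers, but the once-a-second sampling loop is just our illustration, not anything the reviewers used:

```python
import subprocess
import time

def vram_used_mib() -> tuple[int, int]:
    """Query used/total VRAM in MiB via nvidia-smi (bundled with Nvidia drivers)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First line corresponds to GPU 0, e.g. "1234, 8192"
    used, total = out.strip().splitlines()[0].split(", ")
    return int(used), int(total)

if __name__ == "__main__":
    # Sample once a second while a game runs; watch for the used figure
    # creeping toward an 8GB card's 8192 MiB ceiling.
    while True:
        used, total = vram_used_mib()
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(1)
```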
With many years of PC building experience between us, we didn't find it tricky. But if this could be simplified - or if people were given the option to automate DLSS Override activation rather than doing it game-by-game - that would be significant.

That said, the end results were impressive. Avowed when overridden (Epic preset, DLSS set to Quality with multi-frame gen 4x) is a crispy experience with extremely minimal ghosting that most of you won't notice, running at the promised 190 FPS at 1080p. Even better, when we did the same with Hogwarts Legacy (Ultra preset), Nvidia's claim of 156 FPS was routinely beaten at 1080p - though it does take a lot of settings tweaking to get there, as ray tracing options are controlled independently of the global quality settings for finer tuning. Like the Nvidia app, it's great for PC players who know what they're doing, but there's a learning curve.

Once you get past that curve, this is admittedly only a limited glimpse, but from that glimpse there are reasons to be optimistic. Just like we've seen across the rest of Nvidia's 50-series family, the AI trickery that makes these sky-high frame rates possible is impressive - giving you buttery smoothness without any overtly noticeable ghosting around objects or a huge impact on in-game latency. But the 8GB of video memory is still a big question that we've yet to answer properly in our full testing. So stick with Tom's Guide to see just how much (or how little) we can actually push the RTX 5060.
[2]
I tested the RTX 5060 - is 8GB of VRAM really enough in 2025?
It's almost here: Nvidia's RTX 5060 has been a hotly discussed upcoming GPU, with arguments on both sides about pricing, performance, and of course the much-maligned 8GB of video memory. Well, I've got my grubby little mitts on one of these new GPUs ahead of its official launch on May 19, and Nvidia was kind enough to let me take it home and slap it into a build. Granted, a lot of details are still embargoed (look out for our full review here at TechRadar in the near future), but Team Green did give me limited permission to talk about some early test results with a pre-agreed pool of games, so here's the scoop.

Before I discuss the game performance, I'll start by laying out the testing parameters. At $299 (other regional prices TBC), the RTX 5060 doesn't represent a generational price increase over the venerable RTX 4060 - which was itself cheaper than the previous RTX 3060. As such, I wanted to put it in an appropriate system; the rig I used for testing features an AMD Ryzen 5 3600, 16GB of Crucial Ballistix DDR4 RAM, and a midrange X570 motherboard from ASRock. Nothing fancy, in other words. The xx60 GPUs from Nvidia have always sat at the budget-to-midrange section of the GPU scale; perhaps never quite true budget cards, but still aiming to offer strong value for money with a sensible price tag for gamers who can't afford to splash thousands on a high-end card. It's worth noting that there's no Founders Edition model of the RTX 5060, so the exact card I'm using here is the Asus Dual OC RTX 5060 8GB.

Next up, the actual game settings. I wanted to really put this card through its paces with a broad variety of tests, but Nvidia was quite insistent about the use of its shiny new AI features. As such, DLSS 4's resolution upscaling is set to the 'Quality' preset at 1080p output, and Multi Frame Generation (MFG) is used for all of the tests - yes, I have thoughts on this, and you can read them further down. All of the following games were tested at their respective maximum in-game graphical presets at 1080p.

The first game on the testing rig was Doom: The Dark Ages. While its predecessor Doom Eternal held a fairly positive reputation for running well on lower-end hardware, this techno-medieval romp is a little more demanding. Still, at the 'Ultra Nightmare' graphics preset with DLSS 4 and MFG enabled, I was getting crisp framerates of more than 220fps, with lows never dropping below 200. The Dark Ages absolutely slaps, by the way. It looks fantastic in motion, too. DLSS has come a long way since its public release more than six years ago, and old complaints of visual tearing and glitching simply don't apply here. The frame generation works great too; even in fast-paced arena battles against the hordes of Hell, I wasn't able to register any visual issues, and with the graphics cranked up to maximum, everything looked very pretty (when it wasn't covered in blood and viscera, anyway).

The same was true of Marvel Rivals, which saw framerates comfortably sitting around the 250-260fps mark, with some dips (though never below 200) during particularly intense firefights. Again, the game looked great, with no readily discernible visual issues. I did experience two individual crashes across about two hours of playtime, but we can reasonably put that down to early driver instability; this is technically an unreleased GPU running on beta drivers, after all. All looking very good so far, then?
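For context on what that 'Quality' preset actually means: DLSS renders the game internally at a lower resolution, then upscales to your output resolution. A quick sketch using the commonly cited per-axis render-scale factors for DLSS's presets (the ratios are the well-known defaults; the helper function itself is just illustrative):

```python
# Commonly cited per-axis render-scale factors for DLSS presets.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales to output."""
    scale = DLSS_SCALE[preset]
    return round(out_w * scale), round(out_h * scale)

# At 1080p output on the Quality preset, the card renders roughly 1280x720
# and upscales the rest - a big part of why an 8GB card can post these numbers.
print(internal_resolution(1920, 1080, "Quality"))  # (1280, 720)
```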
After all, the majority of PC gamers with new Nvidia GPUs are using DLSS or frame-gen in some capacity. It's free framerate; why wouldn't you? But there are some important caveats surrounding upscaling and frame generation, and those caveats did rear their heads in my testing.

Running the built-in benchmark for Cyberpunk 2077 - in its ridiculously demanding RT Overdrive preset, no less - I got a perfectly good framerate average of 121fps. But upon actually jumping into the game, I hit occasional stuttering during high-action sequences like car chases and intense gunfights. It was never unplayable, to be clear. Visual fidelity and clarity remained very good with the DLSS Quality preset, with no obvious tearing or artifacting. We're talking about very brief, very occasional drops into the 30-40fps range, quite likely caused by a combination of the heavy VRAM demands of Cyberpunk's RT Overdrive mode (on an 8GB card) and the key drawback of frame generation: it's only ever as effective as the base rendering rate.

See, when MFG runs at maximum capacity (exclusively on RTX 50-series GPUs), it offers up to 4x frame-gen. In other words, for every individual frame rendered directly on the GPU, you get three AI-generated frames ('fake frames', as some detractors online have dubbed them) inserted between the rendered frames to boost your end framerate. The problem is that the result is directly tied to the original rendered framerate. If you've already got a healthy fps above 60, great: you're getting an extra 180 frames per second for free. But if your framerate dips bottom out in the single digits, no amount of MFG will produce enough extra frames to salvage the game's playability. Thankfully, Cyberpunk never quite got that bad in my 1080p testing, but it makes me wonder how effective the RTX 5060 will really be when it comes to 1440p gaming.

There's one other major caveat to consider here, and that's developer support for DLSS and MFG. Game devs need to build support for Nvidia's nifty performance-boosting features into their games, and while Doom: TDA, Marvel Rivals, and Cyberpunk all include that support, the last two games I tested don't feature integrated MFG functionality. Fortunately, Nvidia has a workaround: DLSS Override, which lets users force-activate both resolution upscaling and frame generation at the driver level, effectively bypassing the need for dev implementation. Less fortunately, the results of this tool are... mixed, to say the least.

Hogwarts Legacy, with MFG enabled via the Override setting in the Nvidia App, held a pretty consistent 170-180fps. Visual fidelity was solid, with only a very small amount of tearing here and there, mostly around bright particle effects. It was certainly playable; I'd say these extremely minor graphical issues were worth the massively boosted framerate. With DLSS Quality and the Ultra graphics preset, it still looked great at 1080p.

Avowed, on the other hand, did not seem to enjoy being overridden nearly as much. While MFG and DLSS 4 granted it an excellent 189fps average, the bright and colorful scenery of the Living Lands was rife with small but noticeable blurring and artifacting, especially in busy environments like the port city of Paradis. This was a case where I felt DLSS Override wasn't really worth it; it'll vary a lot from game to game, of course, and the fact that Override exists and functions at all is still hugely impressive to me.
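To make that frame-gen math concrete, here's a toy model. The 4x factor and the one-rendered-to-three-generated ratio come straight from the explanation above; the function itself is just our illustration:

```python
def mfg_output_fps(base_fps: float, factor: int = 4) -> float:
    """Effective framerate with multi-frame generation.

    With 4x MFG, each rendered frame is followed by three AI-generated
    frames, so output = base * 4. Generated frames can't rescue a bad
    base rate: they only multiply whatever the GPU actually renders.
    """
    return base_fps * factor

print(mfg_output_fps(60))  # 240.0: a 60fps base gains 180 'free' frames
print(mfg_output_fps(8))   # 32.0: a single-digit base stays unplayable
```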
But Avowed had a little too much visual jank for my liking with Override's frame-gen running, to the point where I think I'd rather just drop the graphical settings down a notch or two and play with a lower framerate but more consistent visuals. Still, at the asking price, the RTX 5060 is a definite improvement over the RTX 4060, thanks in no small part to the introduction of MFG - and while that will only reliably help your performance in supported games, more and more devs are now working to include MFG functionality in their titles. As for that 8GB question... honestly, I still don't think it's enough, especially considering that 1440p is becoming a more popular resolution for PC gamers. If we don't get a desktop RTX 5050, this card can be considered the bottom rung of Nvidia's Blackwell desktop GPU stack, so it's not entirely unreasonable for it to remain a 1080p-focused card, but this is 2025: it's time we step up our VRAM game a bit, folks.
Early tests of Nvidia's RTX 5060 show strong 1080p gaming performance with DLSS 4 and Multi-Frame Generation, but questions remain about whether its 8GB of VRAM will hold up in future games.
Nvidia is set to launch its latest budget-friendly graphics card, the GeForce RTX 5060, on May 19, starting at $299. Early tests reveal impressive performance at 1080p resolution, but concerns linger about its 8GB of video memory (VRAM) [1].
The RTX 5060's performance is significantly boosted by Nvidia's DLSS 4 (Deep Learning Super Sampling) technology and Multi-Frame Generation (MFG). These AI-powered features allow the card to achieve remarkably high frame rates in supported games:

- Doom: The Dark Ages: over 200 FPS at the Ultra Nightmare preset [1][2]
- Marvel Rivals: around 250-260 FPS, with dips never below 200 [2]
- Cyberpunk 2077: a 121 FPS benchmark average in RT Overdrive mode [2]
- Hogwarts Legacy: a consistent 170-180 FPS via DLSS Override [2]
- Avowed: roughly 189-190 FPS via DLSS Override [1][2]
Despite its impressive performance, the RTX 5060's 8GB of VRAM has raised concerns among gamers and tech enthusiasts. With recent AAA games like Indiana Jones and the Great Circle requiring a minimum of 12GB VRAM, questions arise about the card's future-proofing [1].
While the RTX 5060 shows promise, there are some important considerations:
DLSS Override Settings: Users may need to manually override DLSS settings in the Nvidia app for optimal performance, which could be challenging for less experienced users [1].
Frame Generation Limitations: Multi-Frame Generation's effectiveness is tied to the base rendering rate. In demanding scenarios, such as Cyberpunk 2077's RT Overdrive mode, occasional stutters were observed [2].
Developer Support: Not all games support DLSS and MFG, which may limit the card's performance boost in certain titles [2].
As games continue to evolve and demand more resources, the long-term viability of the RTX 5060's 8GB VRAM remains uncertain. While Nvidia's AI technologies currently help mitigate VRAM limitations, future game requirements may pose challenges for the card.
The RTX 5060 appears to be a strong contender in the budget to mid-range GPU market, offering impressive 1080p gaming performance. However, potential buyers should consider their long-term gaming needs and the possibility of upgrading sooner rather than later if they plan to play cutting-edge titles at higher resolutions or with extensive ray-tracing features.
Summarized by
Navi