Difference Between NVIDIA RTX and GTX Series | TechRadar Guide


 

So you're going to buy a new Nvidia graphics card. Good for you! The only question is which series you should be looking at: the RTX or the GTX? Nvidia is about as consistent as Microsoft when it comes to numbering its products. Perhaps even less consistent, given that the 16 series of the GeForce lineup was actually released a year after the 20 series. So confusion was bound to ensue. That's why in today's blog we'll be covering all you need to know about both RTX and GTX graphics cards: naming, performance, and value. So without any further ado, let's begin.


 

Basics of Graphics Cards

Let's start from the ground up and work our way towards RTX and GTX, so that we can understand their similarities before moving on to their differences. All of Nvidia's gaming-oriented GPUs belong to the GeForce brand, which was established way back in 1999 with the original GeForce 256. Over the years, Nvidia has released lots and lots of different GPUs, culminating in the two latest lineups: the GeForce 20 series that launched in 2018 and the GeForce 16 series that launched in 2019.

The GeForce 20 series is made up of only RTX graphics cards, while the GeForce 16 series is made up of only GTX graphics cards. The RTX name was brand new at this point, while the GTX moniker had been used in previous generations. So what do these letters mean anyway? Are they an acronym for something?

Nope, they don't mean anything. They're literally just there for branding, because they sound cool and kind of catchy; that's all there is to it. Just like the i3 in Intel Core i3 doesn't mean anything either; it just sounds better than a plain number. Nvidia has been using similar two or three letter designations for a while now to convey a sense of quality, as in what kind of performance you can expect from which GPU.

For example, they've used the GT, GTS, and GTX designations, just to name a few, but only GTX and RTX have survived to the present day. That's really all there is to it. Nvidia wanted to convey the sense of a huge generational leap in their new line of graphics cards, and to do so they introduced the new RTX moniker. But then why go back to GTX the year after, and why was the GeForce 16 series released after the 20 series?

 



Difference Between GeForce 16 and GeForce 20

First things first, we need to talk about architecture. Both the 20 series RTX GPUs and the 16 series GTX GPUs utilize the same Turing GPU microarchitecture. The RTX graphics cards released in 2018 were the first-ever graphics cards to feature the new architecture, and naturally the 16 series GPUs that came out a year later also featured it. Before Turing, gaming-oriented NVIDIA GPUs had used the Pascal microarchitecture for a very long time, so Nvidia wanted to heavily highlight this new architecture and its capabilities.

That is why they decided to release the more powerful GPUs first. Thus the 20 series lineup was made up entirely of upper mid-range and high-end GPUs. These were the graphics cards that could properly showcase the advanced features that the Turing-based GPUs brought to the table. And to really drive home the idea that this was supposedly a monumental step forward for gaming, they slapped on a new brand name: RTX.

But they had to release the lower mid-range and budget entries sooner or later; after all, not everyone can afford graphics cards that cost several hundred dollars or more. Here's the thing: even though these lower-end GPUs were still based on the same architecture, they simply didn't have the advanced features that Nvidia had been shoving in everyone's faces through marketing. So Nvidia made an effort to make it easier for consumers to distinguish between the GPUs with the advanced features and the GPUs without them.

They kept the GTX name and called them the 16 series, just for good measure. That's why, even though both the 16 series GTX and the 20 series RTX belong to the same generation of GPUs, they have different designations. Nvidia wanted to highlight that the RTX GPUs were not merely more powerful than the GTX GPUs, but that they also had awesome new features the GTX GPUs didn't. We think this raised just as many questions as it answered, but it is what it is. So what are these advanced new features?

Well, it's normal for more powerful GPUs to have more cores than their less powerful cousins. The RTX GPUs, however, not only have more cores, but some of those cores are two special new kinds: RT cores and tensor cores. If you take these special cores out of the equation, there really isn't that much of a difference between the 20 series and the 16 series. Sure, the 20 series GPUs are still more powerful, simply because they have more transistors, more regular cores, better memory, and so on. But that difference is akin to the difference between the low-end and high-end last-gen 10 series GPUs: just pure raw power.




RT Cores

So what makes these new cores so special? Let's start with the RT cores. These are the cores responsible for bringing to life the most aggressively marketed capability of the RTX GPUs: real-time ray tracing. What ray tracing does is trace the paths of virtual rays of light, with the goal of realistically simulating the way light interacts with the environment. Ray tracing in and of itself isn't a new technology; it's been used in animated films for a long time now. But this was the first time that a GPU could handle it in real-time.
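To give a rough idea of what "tracing a ray" actually involves, here is a toy sketch in Python of the classic ray-sphere intersection test, the basic building block of any ray tracer. This is our own illustration of the underlying math, not how RT cores work internally:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Solve |origin + t*direction - center|^2 = radius^2 for t
    to find where (if anywhere) a ray hits a sphere."""
    oc = [o - c for o, c in zip(origin, center)]   # vector from sphere center to ray origin
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None                                # the ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / (2 * a)
    return t if t > 0 else None                    # distance to the nearest hit in front of us

# One ray fired from the camera straight down the z-axis at a sphere 5 units away
hit = ray_hits_sphere(origin=(0, 0, 0), direction=(0, 0, -1),
                      center=(0, 0, -5), radius=1.0)
print(hit)  # 4.0 -> the ray strikes the sphere's surface 4 units from the camera
```

A real renderer repeats tests like this for millions of rays and then bounces each ray around the scene to gather lighting information.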

In games, the use of real-time ray tracing allows for much more realistic lighting and reflections. But here's the thing: we cannot overstate how taxing it is on hardware to calculate the trajectory of each individual ray of light and how it interacts with the environment.
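Some quick back-of-the-envelope math (our own numbers, purely for illustration) shows why. Even a single ray per pixel at 1080p and 60fps already means over a hundred million intersection tests like the one above every second, before you add any bounces, shadow rays, or reflections:

```python
# Rough count of primary rays needed per second at 1080p / 60 fps
width, height, fps = 1920, 1080, 60
rays_per_pixel = 1                          # real games cast several rays per pixel
rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second:,} rays/second")   # 124,416,000 rays/second
```

That is the workload the dedicated RT cores exist to accelerate.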

This is where the special RT cores come into play. Ray tracing is technically possible on all GPUs without RT cores, insofar as you can enable it in a graphics menu, but the performance will be nothing short of awful. Even a last-gen flagship like the GTX 1080 Ti can't handle it in a manner that leaves the game even remotely playable. Even GPUs with RT cores take a performance hit when real-time ray tracing is enabled.

This begs the question: is ray tracing even worth it? First of all, as of now, there are only 20 or so games out there that support real-time ray tracing. It's certainly a more respectable number than it was back when the RTX GPUs first came out and people had to pay a hefty premium for a feature that no game supported yet.

This number is bound to grow, with both the upcoming PS5 and Xbox Series X consoles set to feature real-time ray tracing support. But even then, we highly suggest watching some comparison videos to see for yourself what games look like with real-time ray tracing enabled and disabled. The difference is undeniably there, and once you've spotted it you'll definitely appreciate it. But since it's all about shadows and light, it's easy to miss at first glance. Whether or not it makes up for the performance hit is up to you.

For example, the game Control runs at a stable 60fps on an RTX 2070 and i7-9700K setup, but only manages to scrape by at 30fps once real-time ray tracing is turned on. The framerate drop isn't this drastic in every game, but it is always noticeable. So our stance on real-time ray tracing is this: it is most definitely not a gimmick. It's an important technological advancement that will greatly enhance our gaming experience in the years to come. But at the moment the hardware simply isn't powerful enough to handle it, and developers still aren't making full use of it.
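The drop is easier to appreciate in frame times than in frames per second, since the GPU's budget per frame is what actually doubles. A quick sanity check of the Control numbers above:

```python
# Convert the quoted framerates into per-frame time budgets
def frame_time_ms(fps):
    return 1000.0 / fps                 # milliseconds available per frame

for fps in (60, 30):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 30 fps -> 33.3 ms per frame: ray tracing doubles the cost of every frame
```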

NVIDIA was understandably very eager to push this feature to the forefront of the marketing campaign for the RTX GPUs, but we shouldn't forget that this is the first generation of GPUs to have it. Each new generation will handle it better, and in a couple of years this feature will be immensely impactful. But at the moment, it's a feature that many gamers will happily turn off, which greatly reduces the cost-effectiveness of the RTX graphics cards.



Tensor Cores

We're guessing most of you knew about real-time ray tracing even if you didn't know anything about RT cores. But the same does not go for tensor cores, simply because NVIDIA doesn't market them as aggressively. Tensor cores are cores used specifically to accelerate deep learning workloads. They were first used in Nvidia's Volta GPUs, but since those aren't gaming GPUs, most people have never heard of them.

Deep learning has many applications, but for us gamers it's important because it's used for a brand new anti-aliasing method: deep learning super sampling, or DLSS for short. DLSS works by using deep learning models to generate detail and upscale images to a higher resolution, which makes the image sharper and reduces aliasing. These deep learning models are trained on NVIDIA's supercomputers and then executed on your GPU's tensor cores.
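As a loose illustration of the upscaling half of that idea, here is a deliberately crude stand-in in Python. The real DLSS model is a trained neural network that NVIDIA has not published, so we substitute the simplest possible upscaler, nearest-neighbor, just to show what "turning fewer rendered pixels into more displayed pixels" means:

```python
import numpy as np

# Stand-in for a small rendered low-resolution frame: 2x2 grayscale pixels
low_res = np.array([[0.1, 0.9],
                    [0.6, 0.3]])

# Naive upscaling: copy each pixel into a 2x2 block (nearest neighbor).
# DLSS replaces this crude copy with a neural network that infers
# plausible extra detail instead of just duplicating pixels.
high_res = low_res.repeat(2, axis=0).repeat(2, axis=1)
print(high_res.shape)  # (4, 4) -> four times as many pixels to display
```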

Overall, DLSS makes for a crisper image, and it's less taxing on the hardware than other anti-aliasing methods. More importantly, it's also been shown to significantly improve performance when ray tracing is turned on, which is a big deal, as you can imagine. We're not sure exactly how and why, but somehow DLSS simply brings out the best in ray tracing, and we think that's wonderful. Sadly, the number of games that support DLSS is small. More games support real-time ray tracing than DLSS, so you do the math.
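The performance gain is easiest to see in raw pixel counts. If a game renders internally at 1440p and DLSS upscales the result to 4K (a representative example; the actual internal resolution varies from game to game), the GPU only has to shade around 44% as many pixels per frame:

```python
# Why rendering low and upscaling saves work: compare pixel counts
pixels_1440p = 2560 * 1440    # 3,686,400 pixels actually rendered
pixels_4k    = 3840 * 2160    # 8,294,400 pixels displayed after upscaling
print(f"{pixels_1440p / pixels_4k:.0%} of the native-4K shading workload")  # 44%
```

The cost of ray tracing also scales with the rendered resolution, which is likely part of why DLSS and ray tracing pair so well.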


Conclusion

The 20 series RTX and 16 series GTX split we got with the current-gen GPUs is confusing, but there is at least a reason behind it, which doesn't always happen in the tech industry. Even though all of these GPUs are based on the Turing microarchitecture, the RTX GPUs have the advanced RT and tensor cores that the GTX GPUs simply lack. Our guess is that the folks at Nvidia thought that if they had called a card like the GTX 1650 the "RTX 2050" instead, potential buyers might have believed it supported the aggressively marketed real-time ray tracing.

All in all, ray tracing is a cool feature, but given how new and unrefined it still is, it's not really worth it for most gamers. From a performance-per-dollar standpoint, RTX graphics cards are only worth it if you don't plan on enabling ray tracing, and even then you're sadly paying a premium for a feature you won't be using. In any case, we hope you found this blog helpful; let us know in the comments. We're especially keen to hear from people who already use RTX graphics cards and are still reading this blog for some reason.

What has the experience been like for you? Until next time, may your games be fun and your losses few.
