#the real moral of the story is that we will probably never get cheap gpus again and it’s all Crypto Bros fault
crowcryptid · 8 months
how do you know which gpus are better? it's okay if you don't want to answer
So when it comes to gpus, ignore the number at the start. Think of it like the generation of card, not related to the performance. For example: a 3080 vs a 4060. You would expect the 4060 to be better; it's not. The number at the end is what matters. A 4060 is the lowest end card in the 40 series. A 3080 is mid-high range in the 30 series. The same applies to AMD's cards. The 6800 xt > 7700 xt despite the 7700 being brand new. The 6800 is a gen older, but it is a higher tier card, since 800 > 700.
For both brands, a ti, xt, or xtx at the end just means it's a more upgraded version of that card. Ex: 4070 ti > 4070. This is kind of oversimplifying cause nvidia pulled some bullshit with the 4070 ti, but I'm not gonna get into that, it's not important lol. If you buy nvidia ur getting ripped off no matter what, it's just how it is. I'll get into that later. (Despite my harshness toward nvidia, I have no bias toward either. I've owned both. I have an nvidia card right now. Both have issues. Both also do what you want them to do.)
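If it helps to see the naming thing written out, here's a little toy sketch in Python. It's totally unofficial (I made up the function names myself), it just splits a model number into generation + tier + suffix and compares by tier first, the way I described above:

```python
# Toy sketch of the "generation vs tier" idea, nothing official.
# It splits an Nvidia-style model number like "3080" or "4070 ti"
# into (generation, tier, suffix) and compares by tier first.

def parse_card(name: str):
    """'4070 ti' -> (40, 70, 'ti')"""
    parts = name.lower().split()
    digits = parts[0]
    suffix = parts[1] if len(parts) > 1 else ""
    return int(digits[:2]), int(digits[2:]), suffix

def roughly_better(a: str, b: str) -> str:
    """Compare two cards: tier first, then 'ti', then generation."""
    ga, ta, sa = parse_card(a)
    gb, tb, sb = parse_card(b)
    if ta != tb:                    # tier beats generation
        return a if ta > tb else b
    if sa != sb:                    # 'ti' is the upgraded version of the same number
        return a if sa == "ti" else b
    return a if ga > gb else b      # only then does the newer gen win

print(roughly_better("3080", "4060"))     # -> 3080, even though it's a gen older
print(roughly_better("4070 ti", "4070"))  # -> 4070 ti
```

Same idea works for AMD, you'd just treat the last three digits as the tier and xt/xtx as the suffix.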
The best way to decide what you want is to watch benchmark vids. Just ignore the marketing bs and look at real performance. They'll typically compare multiple cards at 1080p, 1440p (2k), and 4k running different games. This gives you a decent idea of what to expect.
Once you decide on a card, I also recommend you watch a vid that compares every model of that card. Never buy the founders edition cards unless you're getting one dirt cheap or somethin. Wait for the third party versions. Some will have better thermals, some will be quieter, some will be larger, etc. Also consider the aesthetic differences, if that matters to you.
Pick whichever card fits your budget and matches the games you play. Also consider if you want to keep the same card for a long time or if you don’t mind selling your card and upgrading every 2-3 years.
If you want to keep it a while, get the better card, just don’t expect to run things at the highest settings 5 years from now. If you don’t care about running things on ultra settings and you just want a consistent experience on mid-high settings, get whichever card matches your monitor and sell it when the performance isn’t to your standard anymore.
Now: if you are torn between AMD and Nvidia, here’s how to decide.
Are you just going to game and don't have an insane monitor? AKA you don't NEED over 60 fps with raytracing, or over 60 fps in 4K, or 240 hz, whatever. You're not an esports player, and who the hell needs anything over 144? Then AMD will save you money. You should probably go AMD. UNLESS you play one specific game and AMD cards are known to not handle that one specific game well. The only example I can think of here is minecraft shaders; the performance on AMD isn't as good. You may have heard horror stories about AMD drivers. I never experienced that*. Ultimately it's up to you to decide if you want to risk buying a card and then needing to return it later cause you had issues. (*besides my mistaken assumption that a driver update killed my card a few months back. Nope, the card just died. It just happened to die a day after updating it. Not sarcasm btw, I did roll back to old ass drivers to check and the card was 100% dead, even when placed in a new system. Just bad timing.)
Oh and in case it's not obvious, new AMD cards do have raytracing, they're just nowhere near as good at it as Nvidia.
Do you want to game and, say, do some 3D rendering? Do you want the best of the best, are you an esports player who wants 300 fps, do you want to stream or create content, or do you value raytracing? Then go Nvidia. AMD is not trying to be the best of the best; Nvidia is, and they succeeded. And because they know they're stronger and have better performance for things like rendering, they charge insane amounts. Nvidia has the rep of being the safe option, but overpriced, like Apple lol.
Though to be fair, both sides have gotten greedy and the prices have become bad for both. Nvidia is just the worse of the two.
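If you want all of that boiled down to something mechanical, here's a rough sketch of the decision logic. The function and its inputs are just my own framing of the advice above, not anything scientific:

```python
# Rough sketch of the AMD-vs-Nvidia heuristic above.
# The inputs are just my framing of the advice, not a benchmark.

def pick_brand(just_gaming: bool,
               needs_raytracing: bool = False,
               renders_streams_or_creates: bool = False,
               esports_300fps: bool = False,
               amd_struggles_with_your_game: bool = False) -> str:
    if renders_streams_or_creates or esports_300fps or needs_raytracing:
        return "Nvidia"   # paying extra for the 'best of the best' stuff
    if just_gaming and not amd_struggles_with_your_game:
        return "AMD"      # same games, saves you a couple hundred bucks
    return "either, go watch benchmarks for the games you actually play"

print(pick_brand(just_gaming=True))                         # -> AMD
print(pick_brand(just_gaming=True, needs_raytracing=True))  # -> Nvidia
```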
TLDR:
The number at the beginning of a gpu name is just the generation of card. The number at the end is the "tier" of the card. Ex: 3080 = 30 series, mid-high tier. 4060 = 40 series, lowest tier. For Nvidia, the tiers are 60, 70, 80, 90: low, mid, mid-high, highest. Anything with a "ti" at the end just means it's an upgrade over the plain numbered version.
AMD follows a similar scheme, but it's a single digit for the generation. Ex: 6800 xt = 6000 series, mid-high tier; xt stands for "extreme," and xtx is a step above xt (same concept as nvidia's ti). AMD's tiers are 600, 700, 800, 900, 950: lowest, mid, mid-high, high, highest.
For the general gamer, AMD is going to be better 99% of the time unless, as I said, you want something specific like raytracing. Saves you a couple hundred bucks as well.
Nvidia is the jack of all trades card that can do anything and do it better than AMD can. If you do more than just game, like make videos, do 3D modeling, etc, then go with Nvidia. It’ll cost ya though.
You didn't ask about CPUs, but the story is the same for AMD vs Intel. AMD is the general cpu that will work for most people. Intel is better for work related things and typically costs more. Though right now is a rare moment where going Intel could be cheaper if you pair it with an older motherboard. I'm not going to keep talking, that's not what you asked about, I'll shut up now lol