10 years since AMD took the graphics crown. What does AMD need to do better going into 2024+?

Xtasy26

Poll 10 Years since AMD took the graphics crown? What AMD needs to do better going into 2024+? (14 votes)

Maintain or improve lead on Rasterization. 7%
Improve Ray Tracing performance. 29%
Improve FSR 3 with better image quality, Frame Generation and use of AI/ML and other features. 43%
Better price in the product stack from top to bottom. 21%

It was in 2013 that AMD released the R9 290X, the card that took the graphics crown from NVIDIA. It beat the original Titan at half the price, which was amazing. Over the past 7 years AMD hasn't been competitive in the high end, though it has done well in the mid-range, with the RX 580 holding up against the GTX 1060 6GB, the RX 5700 XT being competitive with the RTX 2070, and the 7700 XT and 7800 XT doing an excellent job against the RTX 4070 and RTX 4060. But AMD really hasn't beaten NVIDIA since the Hawaii GPUs released back in 2013. The RX 6900 XT was more of a draw: it beat the RTX 3090 in rasterization but got trashed with ray tracing enabled.

Ironically, my last AMD GPU was the R9 390X (essentially a slightly better R9 290X with 8 GB of memory).

Apparently it can still play some games in 2023 at 1080p, depending on the settings, which is insane.

I switched to NVIDIA with a GTX 1060 6GB in 2017 and then got an RTX 3090 in 2021 because I wanted to game at 4K with ray tracing. So I have been away from AMD (other than CPUs) for over half a decade now.

But I would like to see more competition in the high end. I was hoping that Intel would inject some competition, but the ARC A770 is a JOKE.

The latest discrete GPU market data shows AMD with 17% market share in Q2 2023, with Intel at 2% and NVIDIA at 80%, which is insane. That is basically where AMD was back in Q2 2015, when they had the mediocre R9 Fury series. In other words, they are the same as they were back in 2015! No progress in 8 years.

So, what does AMD need to do to get back to at least 33% market share, which is roughly where they were through most of the 2000s and early 2010s? I think the highest they had was around 38% back in 2014, after the R9 290X and the boom in Bitcoin mining. They haven't reached that share in 10 years now!
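For a rough sense of scale, the share arithmetic looks like this. The shipment totals below are made up for illustration; only the 17% and 33% figures come from the numbers above:

```python
# Illustrative market-share arithmetic. The quarterly shipment totals here are
# hypothetical; only the 17% / 33% share figures mirror the post above.

def share(units: int, total: int) -> float:
    """Market share as a percentage of total shipments."""
    return 100.0 * units / total

total_dgpu = 10_000_000     # hypothetical quarterly dGPU shipments
amd_units = 1_700_000       # a 17% share of that total
target_units = 3_300_000    # what a 33% share of the same total would require

print(share(amd_units, total_dgpu))        # 17.0
print(share(target_units, total_dgpu))     # 33.0
print(round(target_units / amd_units, 2))  # 1.94
```

In other words, holding the market size fixed, getting from 17% back to 33% means nearly doubling unit volume.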

In my opinion AMD is killing it in rasterization, but they really need to improve ray tracing performance, which goes hand in hand with having features like FSR 3 with frame generation and excellent picture quality. They need to use ML/AI to improve image quality like NVIDIA does with DLSS 2/3.5. Yes, they get close in picture quality at 4K with ray tracing and FSR 3, but below that NVIDIA has a significant advantage. I am not saying not to get AMD; I will recommend AMD GPUs like the RX 7800 XT, which I think is a better buy, just not necessarily in the high end.
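To make the frame-generation point concrete, here is a toy sketch (my own illustration, not AMD's or NVIDIA's algorithm). It synthesizes an in-between frame by naively blending two rendered frames; real frame generation in FSR 3 and DLSS 3 instead warps pixels along motion vectors and optical flow, which is exactly where the ML/AI image-quality gap shows up:

```python
# Naive frame interpolation: blend two frames pixel-by-pixel. Real frame
# generation (FSR 3 / DLSS 3) uses motion vectors and optical flow rather
# than plain averaging, which avoids ghosting on moving objects.

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend two frames (lists of pixel rows) at time t in [0, 1]."""
    return [
        [(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

# Two tiny 2x2 grayscale "frames" (values exactly representable in binary):
f0 = [[0.0, 0.25], [0.5, 0.75]]
f1 = [[0.25, 0.5], [0.75, 1.0]]

mid = interpolate_frame(f0, f1)
print(mid)  # [[0.125, 0.375], [0.625, 0.875]]
```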

So, what do you guys think?


#1 pyro1245
Member since 2003 • 9525 Posts

They need to improve the machine learning features. Rasterization is great, but there are limits to what devs can reasonably accomplish in a given power budget. I want to see them match nvidia in this area.

Seems like they could improve their drivers too. I grabbed a used 6800 xt recently because I didn't want to jump into nvidia's 4xxx cards late into the generation with the prices still as stupid as they are. Been pretty happy with it, aside from some system stability issues (used card, sample size of one, so who knows).


#2  Edited By Xtasy26
Member since 2008 • 5600 Posts

@pyro1245 said:

They need to improve the machine learning features. Rasterization is great, but there are limits to what devs can reasonably accomplish in a given power budget. I want to see them match nvidia in this area.

Seems like they could improve their drivers too. I grabbed a used 6800 xt recently because I didn't want to jump into nvidia's 4xxx cards late into the generation with the prices still as stupid as they are. Been pretty happy with it, aside from some system stability issues (used card, sample size of one, so who knows).

The 6800 XT is a great choice! Great bang for the buck in the mid-range.


#3  Edited By Bikouchu35
Member since 2009 • 8344 Posts

I don't have faith in them retaking the crown like the 7970 last did. They can still make competitive products at competitive prices, though. I like it for what it is, and I can undervolt them to mitigate power draw. Intel did alright for their first try; a random Chinese company tried to make a card that only performed at GTX 780 levels, which goes to show how hard it is to make a GPU.


#4 ionusX
Member since 2009 • 25780 Posts

what they need to do is make "good enough" cards, and i do mean that. focus on removing 8gb vram from the stack, 10gb+ at least with a good memory bus width, at a good price, especially that 400 dollar price tag the 6700xt sits at now. if you put something like the 7700XT with 16gb at 400 dollars, whooo, nvidia's bacon would be greased something fierce.

trying to fight nvidia for the 99th percentile is no longer relevant, thats a pointless horse to chase. nvidia will simply do it and theres no catching them, but those are also the only 4000 series cards that have really sold well. 4060's pop up mainly because of prebuilts, or because of a lack of available 3060's still on the shelf wherever someone is geographically situated. in between there is a gulf that consumers have avoided because none of it is appealing.

the 8800gt back in the day sold like crazy, so much so that they basically released the same card twice (9800gt) and THAT sold like crazy too. amd needs that: a product that sits in the center of the stack that everyone is willing to pony up for, "good enough" for most things, with the vram for 2k gaming and the performance needed to play games without a fuss at medium and high settings at 60fps. if you can hit that mark dead on at a reasonable price, you will sell and sell and sell, and AICs will roll in doubloons. we are in a pc gaming boom, and AMD needs to capitalize now before the bust.

if you do that, there is no 5090 nvidia can make that will counter it. yes, crypto bros are rich and ai bros too, but they are an elite few. the bulk of consumers has always been the mid and low end, and if you can make that segment profitable and successful, there really is no catching you.


#5 Marfoo
Member since 2004 • 6006 Posts

AMD is shifting to an AI-first GPU business model. RDNA is going to be retired in favor of UDNA, their combined architecture for gaming and AI workloads. Being a GPU company solely for graphics simply is not a winning business strategy anymore. When AI capability drives such high prices, they're leaving money on the table by not being able to charge in the ballpark of what Nvidia does for their chips.

AMD's next GPU, the 9070 XT, which will be unveiled at CES this coming week, will be their attempt to sell as many GPUs as they can to start clawing back market share and mindshare. It's going to be a high bang-for-your-buck card and an attempt to reach feature parity with Nvidia. Expect better RT and FSR 4 to be unveiled.

The real return to form, though, is what comes after the 9070 XT. We're talking big chiplet GPUs that go after Nvidia in both gaming and AI.


#6 GirlUSoCrazy
Member since 2015 • 4355 Posts

They have the best x86-64 APU solution, so for integrated computing devices they are a good fit. They have good power consumption/performance/price, so they get a lot of mainstream appeal in gaming devices and laptops.

Maybe they could get into the server farms like Nvidia, but they still fill an important market segment.

It would be good if they could provide some higher-performance parts for the enthusiast crowd, because Nvidia seems to be pushing people to higher-cost GPUs if they want a decent amount of RAM, nerfing their mid/low-end parts with those limits. They could also be giving better memory bandwidth on those parts, but they have things set up to upsell people aggressively.
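The bandwidth point comes down to simple arithmetic: peak memory bandwidth is the bus width in bits divided by 8, times the effective data rate in Gbps. A quick sketch, using commonly quoted specs (worth double-checking against the official spec sheets):

```python
# Peak GDDR memory bandwidth: bus width (bits) / 8 * effective data rate (Gbps).
# A narrower bus directly caps bandwidth, no matter how fast the memory chips are.

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a GDDR memory interface."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbps(128, 18.0))  # 288.0 GB/s (e.g. a 128-bit card at 18 Gbps)
print(bandwidth_gbps(256, 16.0))  # 512.0 GB/s (e.g. a 256-bit card at 16 Gbps)
```

This is why a mid-range card on a 128-bit bus can end up with far less bandwidth than an older card on a 256-bit bus, even with faster memory.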


#7 Marfoo
Member since 2004 • 6006 Posts

@girlusocrazy said:

They have the best x86-64 APU solution, so for integrated computing devices they are a good fit. They have good power consumption/performance/price, so they get a lot of mainstream appeal in gaming devices and laptops.

Maybe they could get into the server farms like Nvidia, but they still fill an important market segment.

It would be good if they could provide some higher performance parts for the enthusiast crowd, because nvidia seems to be pushing people to higher cost GPUs if they want a decent amount of RAM, nerfing their mid/low end parts because of the limits. They could also be giving some better bandwidth on those parts, but they have things set up to try to upsell people aggressively.

Planned obsolescence and aggressive upselling indeed. It wasn't always this way. When AMD is weak, Nvidia prices things the way they know they can get away with. Unfortunately, I don't think AMD competing will bring Nvidia down to match AMD; it'll just bring AMD up to match Nvidia this time around.


#8 mrbojangles25
Member since 2005 • 60943 Posts

@Marfoo said:

AMD is shifting into an AI GPU first business model. RDNA is going to be retired in favor of UDNA, their combined architecture for gaming and AI workloads. Being a GPU company solely for graphics simply is not a winning business strategy anymore. When AI capability drives such high prices, they're leaving money on the table not being able to charge in the ballpark of what Nvidia does for their chips.

AMD's next GPU, the 9070 XT, which will be unveiled at CES this coming week, will be their attempt to sell as many GPUs as they can to start clawing back market share and mindshare. It's going to be a high bang-for-your-buck card and an attempt to reach feature parity with Nvidia. Expect better RT and FSR 4 to be unveiled.

The real return to form though is what comes after the 9070XT. We're talking big chiplet GPUs that go after Nvidia both in gaming and AI

Yeah, I am in the market for a new video card, waiting to see what is more worth my money: an AMD 9000-series or an Nvidia 5000-series card (aiming for a 5080 or the AMD equivalent).

AMD has an uphill battle; I'm not particularly loyal to one brand or another, but I've had absolutely zero issues and nothing but good performance from Nvidia cards.


#9 Marfoo
Member since 2004 • 6006 Posts

@mrbojangles25 said:
@Marfoo said:

AMD is shifting into an AI GPU first business model. RDNA is going to be retired in favor of UDNA, their combined architecture for gaming and AI workloads. Being a GPU company solely for graphics simply is not a winning business strategy anymore. When AI capability drives such high prices, they're leaving money on the table not being able to charge in the ballpark of what Nvidia does for their chips.

AMD's next GPU, the 9070 XT, which will be unveiled at CES this coming week, will be their attempt to sell as many GPUs as they can to start clawing back market share and mindshare. It's going to be a high bang-for-your-buck card and an attempt to reach feature parity with Nvidia. Expect better RT and FSR 4 to be unveiled.

The real return to form though is what comes after the 9070XT. We're talking big chiplet GPUs that go after Nvidia both in gaming and AI

Yeah I am in the market for a new video card, waiting to see what is more worth my money: AMD 9000 series or an Nvidia 5000 series (aiming for a 5080 and AMD equivelent).

AMD has an uphill battle; not particularly loyal to one brand or another, but I've had absolutely zero issues and nothing but good performance from Nvidia cards.

I've used them both over the decades. I seem to have avoided AMD during their bad-driver phase. I have a 7900 XTX now, but before it the last Radeon I had was the 7870. My XTX has been rock solid; no complaints with the software at all.

The 9070 will be the first card where they're serious about providing an RT uplift and getting an AI/ML upscaler in, but like I said, don't expect a real flagship until what comes after the 9070.

The 9070 XT is probably going to land in 4080 territory overall for about $500 with 16 GB of VRAM.


#10 Nirgal
Member since 2019 • 1985 Posts

They need to be able to compete in AI upscaling and be more power efficient.

I want to buy AMD graphics, but I game on laptops. For me power efficiency is super important, and they are behind.
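Power efficiency here is just performance per watt. A tiny sketch with made-up numbers (these are hypothetical placeholders, not measured results for any real GPU):

```python
# Efficiency comparison sketch: frames per second per watt of board power.
# All figures below are invented for illustration, not real benchmarks.

def fps_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Performance-per-watt metric commonly used for laptop GPU comparisons."""
    return avg_fps / board_power_w

laptop_gpu_a = fps_per_watt(90.0, 100.0)   # 0.9 fps/W
laptop_gpu_b = fps_per_watt(100.0, 140.0)  # ~0.71 fps/W

# The lower-power part wins on efficiency even though its raw FPS is lower,
# which matters most in a thermally constrained laptop:
print(laptop_gpu_a > laptop_gpu_b)  # True
```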


#11  Edited By Marfoo
Member since 2004 • 6006 Posts

@nirgal: RDNA 3.5 in APUs is showing great efficiency gains over RDNA 3. Let's hope that larger RDNA 4 GPUs for laptops show similar gains in efficiency.


#12 blaznwiipspman1
Member since 2007 • 16927 Posts

Nothing, because people are sheep. They bought iPhones even though they were complete trash from the start, and bought them in such numbers at such ridiculous prices that it allowed Apple to invest massive amounts of R&D into actually creating something better than the competition.

It's a similar case with Nvidia. You might say that AMD was able to compete with Intel, but that's the exception, not the norm. Let's just say Intel is filled with dumb fucks, and shareholders who were only too happy to bleed the company dry with ridiculous dividend payments instead of letting the company really invest in R&D. Now Intel has made so many mistakes they are a train wreck, even getting beaten by ARM. The once-mighty Intel is on its knees because of shareholder greed. Nvidia has no such problem with their shareholders, and they invest orders of magnitude more than AMD does in GPUs. At the end of the day, though, Nvidia still offers trash value, and that's why I'd always go with AMD GPUs regardless.


#13  Edited By GirlUSoCrazy
Member since 2015 • 4355 Posts
@Marfoo said:

Planned obsolescence and aggressive upsell indeed. It didn't always used to be this way. When AMD is weak, Nvidia prices things the way they know they can get away with it. Unfortunately, I don't think AMD competing will bring Nvidia down to match AMD, it'll just bring AMD up to match Nvidia this time around.

It has to at this point. Nvidia is just building bigger, more power-hungry monsters. They will keep adding new power connectors to their cards and maybe come out with a 4-slot-wide GPU that requires a bigger case and power supply. If you look at the biggest die sizes, it's Nvidia. They are brute-forcing the problem and making more expensive products as a result, and people will buy them. High-end GPUs used to be considered expensive at $400-500, but Nvidia has gone way past that. The mad AI race has companies buying up all the expensive Nvidia parts.