Fury X Gaming GPU Was the World's Most Powerful Computer Chip of 2015!

#52 Xtasy26
Member since 2008 • 5593 Posts

@desprado said:

@Xtasy26: It is not about playable settings; it's about benchmarks and performance. The Fury X is only competitive at 4K against reference cards. You are so stupid and such an idiot fanboy that you are only selecting one website's benchmark to back your false claim. Moreover, you are far from reality, kid, and that is the reason normal PC users do not like AMD.

This was posted by you, right?

http://techavenue.wix.com/techavenue#!xfx-fury-x-review-final/c6bd

and see this post with a proper benchmark.

http://forums.anandtech.com/showthread.php?t=2457410&highlight=

Grow up, kid. This will not help AMD at all; it will further reduce AMD's dGPU market share if idiots like you do not learn their lesson.

First of all, it has caught up to the 980 Ti at 1440p. Yes, I did reference TechAvenue, but I also referenced Hardware.fr and other sites. And what do you mean by "proper benchmark"? More like a "useless" benchmark. While it has some good information, it doesn't show Crossfire vs. SLI benchmarks of the Fury X against 980 Ti SLI or Titan X SLI at 4K. You are too stupid to realize that at the high resolutions and AA settings they are using, the FPS drops below 30 on both the 980 Ti and the Fury X at 4K. Why do you think I have been posting CF Fury X vs. 980 Ti/Titan X SLI benchmarks, dummy? Because a single GPU isn't enough at 4K. A single card may be fine in some older games, like BF3, but it's going to be piss poor at 4K in newer games, especially with the high and advanced AA settings they are using.

#53 Xtasy26
Member since 2008 • 5593 Posts

@scatteh316 said:

Dear fucking lord... OP is an idiot... MINIMUM FRAME RATES are the single most important factor in game performance, and they're higher on NVIDIA cards.

Where did I talk about minimum frame rates in my OP? I was talking about raw compute power. And it depends on the minimum frame rates: dropping from 60 FPS to 50 FPS is not going to be as bad as dropping from 55 FPS to 45 FPS. It all depends on how low the minimum frame rates go.
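
The size of a dip like that is easier to judge in frame times than in FPS. Here is a minimal Python sketch using the hypothetical 60→50 and 55→45 FPS numbers from the post above (illustration only, not measured data):

```python
# Rough sketch: frame-time cost of a minimum-FPS dip.
# The FPS pairs are the hypothetical examples from the post, not benchmark data.

def frame_time_ms(fps: float) -> float:
    """Average time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

for avg_fps, min_fps in [(60, 50), (55, 45)]:
    extra = frame_time_ms(min_fps) - frame_time_ms(avg_fps)
    print(f"{avg_fps} -> {min_fps} FPS: each frame takes ~{extra:.1f} ms longer at the minimum")

# 60 -> 50 FPS: each frame takes ~3.3 ms longer at the minimum
# 55 -> 45 FPS: each frame takes ~4.0 ms longer at the minimum
```

Under those assumed numbers, the 55→45 dip costs slightly more per frame than the 60→50 dip, which is the point being made about how low the minimums go.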

#54  Edited By Xtasy26
Member since 2008 • 5593 Posts

@m3dude1 said:

@Xtasy26: It doesn't fucking matter; it's completely transparent to the end user. All that matters is end performance. Bus width by itself, in a vacuum, is completely unimportant and a fucking stupid thing to compare. But hey, saying something is 8x better than the competition suits your retarded AMD bias.

Oh, and there's no need to compare a theoretical 7000 MHz HBM because it will likely never exist. You don't even know WTF you're talking about. One of the benefits of moving to HBM is that it drastically lowers the need to continually chase higher-clocked memory chips, which are a massive drain on power. GPUs have already hit the power ceiling, so saving 30 to 40 watts by using low-clocked, wide-bus HBM is 30 to 40 more watts available for pixel processing.

Well, no duh it doesn't matter to the end user. And no, I didn't say it was 8X better than the competition; I said it has an 8X larger bus (don't put words in my mouth, because you clearly lack the reading comprehension of a two-year-old). Where exactly did I imply that it was 8X better? Only in your deluded mind.

And no, I never said anything about having 7000 MHz HBM. What are you smoking? I asked what the consequence would be of running a 500 MHz, 4096-bit bus with GDDR5 instead of using HBM.
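
To put the bus-width argument in numbers: peak memory bandwidth is roughly bus width times effective transfer rate, so an 8X wider bus at a much lower clock does not mean 8X the bandwidth. Here is a minimal Python sketch, assuming the commonly cited specs (Fury X HBM: 4096-bit at 500 MHz, double data rate; 980 Ti GDDR5: 384-bit at an effective ~7 GT/s) rather than anything quoted in this thread:

```python
# Peak memory bandwidth ≈ (bus width in bytes) × (effective transfer rate).
# The specs below are commonly cited figures, used here as assumptions for illustration.

def bandwidth_gbs(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8) * effective transfers per second in GT/s."""
    return (bus_width_bits / 8) * effective_rate_gtps

fury_x_hbm = bandwidth_gbs(4096, 1.0)   # 500 MHz HBM, double data rate -> ~1 GT/s effective
gtx_980_ti = bandwidth_gbs(384, 7.0)    # 1753 MHz GDDR5, quad-pumped -> ~7 GT/s effective

print(f"Fury X HBM  : {fury_x_hbm:.0f} GB/s")  # ~512 GB/s
print(f"980 Ti GDDR5: {gtx_980_ti:.0f} GB/s")  # ~336 GB/s
```

Under those assumptions, the 4096-bit HBM works out to roughly 1.5X the 980 Ti's bandwidth, not 8X, which is the distinction both sides are arguing over.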

#55 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@Xtasy26 said:
@m3dude1 said:

@Xtasy26: It doesn't fucking matter; it's completely transparent to the end user. All that matters is end performance. Bus width by itself, in a vacuum, is completely unimportant and a fucking stupid thing to compare. But hey, saying something is 8x better than the competition suits your retarded AMD bias.

Oh, and there's no need to compare a theoretical 7000 MHz HBM because it will likely never exist. You don't even know WTF you're talking about. One of the benefits of moving to HBM is that it drastically lowers the need to continually chase higher-clocked memory chips, which are a massive drain on power. GPUs have already hit the power ceiling, so saving 30 to 40 watts by using low-clocked, wide-bus HBM is 30 to 40 more watts available for pixel processing.

Well, no duh it doesn't matter to the end user. And no, I didn't say it was 8X better than the competition; I said it has an 8X larger bus (don't put words in my mouth, because you clearly lack the reading comprehension of a two-year-old). Where exactly did I imply that it was 8X better? Only in your deluded mind.

And no, I never said anything about having 7000 MHz HBM. What are you smoking? I asked what the consequence would be of running a 500 MHz, 4096-bit bus with GDDR5 instead of using HBM.

#56 neatfeatguy
Member since 2005 • 4415 Posts

@Xtasy26 said:
@neatfeatguy said:

Really? You're posting the same post in the hardware forum, and now you're spouting the same old crappy, cherry-picked images from 6+ months ago to try to back your crappy post?

Really? Can't you see this was about compute power? I showed benchmarks of this chip's performance because someone asked for them. If you don't like my "crappy" post, then post a counter-argument instead of whining.

@desprado said:

@m3dude1: I have read much worse on AMD Roy's Twitter, and that is why he closed and deleted his account: too many blind, idiotic AMD fanboys.

He is just being a typical AMD fanboy, far removed from reality.

Heck, I could play your game too: you are a typical NVIDIA fanboy. At least post a counter-argument.

@Jag85 said:

What's wrong with PassMark?

It's one of the most garbage benchmarks for actual performance. Fire Strike at least bears "some" resemblance to "real life" benchmarks. Even then, I think using Fire Strike results when buying graphics cards is very silly.

I'm still waiting on the new benchmarks you're supposed to provide that actually use the new Crimson drivers. Your 6+ month old benchmarks are pointless. Both sides have improved drivers. Both sides have games that work better on their own cards. Stop cherry-picking images.

Show us maxed-out 4K benchmarks with current drivers. No medium settings. No posts about how NVIDIA "stutters" backed by videos or blog posts from 6+ months back. Do not post the image of someone posting on a YouTube blog claiming the "magic" number for OC'ing the Fury X is 1200/600 and that you gain a shit ton of performance; provide actual, current (current meaning post-Crimson-driver-release) proof. OC that Fury X of yours and benchmark it for us. Stop wasting time in here trying to defend your side until you do this.

I personally have no qualms about which is better, Fury X or 980 Ti/Titan X. I just want to see current proof instead of you throwing out claims and posting data from 6+ months back, where both sides are using old drivers or the games are tested on medium settings. Why would a site bench high-end enthusiast cards on medium settings? I've been searching for Fury X benchmark results since the Crimson release, and all I come across are general data points indicating that, at most, some games have gained 3-4% performance. Overall, nothing has changed performance-wise since the driver update prior to Crimson's release.

#57 GarGx1
Member since 2011 • 10934 Posts

Did we need another thread about the Fury X that's full of misinformation and cherry picking? How much do you get paid?


#58 Heil68
Member since 2004 • 60817 Posts

@GarGx1 said:

Did we need another thread about the Fury X that's full of misinformation and cherry picking? How much do you get paid?

Ron crushed the thread in short order. The chip will be a flop and everyone should stay away from it.

#59 GarGx1
Member since 2011 • 10934 Posts
@Heil68 said:
@GarGx1 said:

Did we need another thread about the Fury X that's full of misinformation and cherry picking? How much do you get paid?

Ron crushed the thread in short order. The chip will be a flop and everyone should stay away from it.

I was going to post some benchmarks from my Zotac 980 Ti AMP Extreme, but I didn't want to make the TC cry.

#60  Edited By 04dcarraher
Member since 2004 • 23857 Posts

@Xtasy26 said:

Again with your delusions. Why are you using older drivers from 6 months ago? Even then, on average it's still beating Titan X SLI in both scaling and average frame times.

Lol, you're the one with delusions, ignoring facts and claiming half-truths. Lol, claiming Crossfire is faster based on a single synthetic benchmark that does not reflect real-world results... Both Crossfire and SLI trade blows, and both have issues.

Every time your claims are picked apart, you move on to something else with more one-sided crap.

#61 Heil68
Member since 2004 • 60817 Posts

@GarGx1: That would have been good. :D

#62 ronvalencia
Member since 2008 • 29612 Posts

@Heil68 said:
@ronvalencia said:

@Xtasy26:

FYI, my 980 Ti factory overclocks to ~1.317 GHz / 6.8 TFLOPS, hence it's not limited to 5.6 TFLOPS at ~1075 MHz. Neither GPU can properly sustain 4K.

I was going to buy this GPU in my next build; not now. I'll go a different route.

Thanks, Ron, you saved me some money.

It may take the 14 nm or 16 nm flagship GPU generation.
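
As a rough sanity check on the TFLOPS figures quoted above, peak FP32 throughput is usually estimated as 2 FLOPs (one fused multiply-add) per shader per clock. Below is a minimal Python sketch; the shader counts (2816 for the 980 Ti, 4096 for the Fury X) and the reference clocks are assumptions taken from publicly listed specs, not numbers from this thread, with the 1.317 GHz figure being the factory overclock mentioned above:

```python
# Back-of-the-envelope peak FP32 throughput: 2 FLOPs per shader per clock (FMA).
# Shader counts and reference clocks are assumed from publicly listed specs.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical single-precision TFLOPS: 2 * shaders * clock_ghz / 1000."""
    return 2 * shaders * clock_ghz / 1000

print(f"980 Ti @ 1.000 GHz: {peak_tflops(2816, 1.000):.1f} TFLOPS")  # ~5.6
print(f"980 Ti @ 1.317 GHz: {peak_tflops(2816, 1.317):.1f} TFLOPS")  # ~7.4
print(f"Fury X @ 1.050 GHz: {peak_tflops(4096, 1.050):.1f} TFLOPS")  # ~8.6
```

These are theoretical peaks only; as the thread itself keeps pointing out, they say nothing about sustained game performance at 4K.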

@Xtasy26 said:
@sSubZerOo said:

... Seriously, I had to check the date this was posted, because we have had reviews of this card for months, and it was widely seen as a disappointment in offering anything groundbreaking. Furthermore, those are cherry-picked benches, not to mention a Crossfire/SLI comparison, a comparison that is literally only going to concern the very few people who need an SLI setup costing $1200+. Meanwhile, the 980 Ti overclocked handily beats the Fury in every real-world test, much of which has to do with the fact that the Fury is a poor overclocker. The Fury X is by no means a bad card, but stop trying to make it out that it stomps the competition. It doesn't; it lags behind the 980 Ti most of the time. I would have strongly considered one myself if it weren't for the poor overclocking compared to NVIDIA's current 900 series cards.

I could give you average benchmarks across a whole host of games, not just certain games. Even then, it has pulled away from the 980 Ti at 4K since its launch and has gotten better at 1440p.

As AMD's drivers get better and better, expect it to pull further away. The same thing happened with the 780 Ti vs. the R9 290X: the 290X was behind at launch, and now it's ahead of the 780 Ti on average. The only difference is that the Fury X did it in 4-5 months.

NVIDIA's Kepler GPUs are in legacy support status, i.e. an aging GPU design.