Fury X Gaming GPU was 2015's World's Most Powerful Computer Chip!

#1 Xtasy26
Member since 2008 • 5593 Posts

Titan X: 6.14 Teraflops

980 Ti: 5.6 Teraflops

Fury X: 8.6 Teraflops

What's more, AMD did it with a chip that's smaller than the Titan X/980 Ti.

Titan X/980 Ti chip size: 601 mm2

Fury X: 596 mm2.

They also packed more transistors into the chip despite it being smaller.

Titan X: 8 Billion Transistors

Fury X: 8.9 Billion Transistors.

That means AMD packed in 900 Million MORE transistors than the 980 Ti/Titan X. They also included the World's First HBM memory and the World's First 4096-bit bus, all in a chip that's smaller than the Titan X!

They did all that on a shoestring budget. Impressive.

I guess the term "Fast and Furyous" does apply to the Fury X. ;)

#PokerFuryXFace
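Those teraflop figures fall straight out of shader count times clock speed. A quick back-of-the-envelope sketch (the shader counts and reference clocks below are the commonly quoted figures, my assumption, not stated in this thread):

```python
# Peak FP32 throughput = shaders * 2 FLOPs per clock (fused multiply-add) * clock.
# Shader counts and reference clocks are the commonly quoted figures (assumed).
def sp_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak single-precision TFLOPS."""
    return shaders * 2 * clock_ghz / 1000.0

print(round(sp_tflops(3072, 1.000), 2))  # Titan X -> 6.14
print(round(sp_tflops(2816, 1.000), 2))  # 980 Ti  -> 5.63
print(round(sp_tflops(4096, 1.050), 2))  # Fury X  -> 8.6
```

The 980 Ti works out to 5.63 at reference clocks, close to the 5.6 quoted above; factory overclocks push it higher.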

#3 Big_Red_Button
Member since 2005 • 6094 Posts

@magicalclick said:

Sounds great. How do the game benchmarks compare? It's hard to tell the difference from specs alone.

This.

#4 Flyincloud1116
Member since 2014 • 6418 Posts

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

#5  Edited By scatteh316
Member since 2004 • 10273 Posts

Nope... not even close...

#6  Edited By lostrib
Member since 2009 • 49999 Posts

I look forward to when this gets bumped in a month

#7 Howmakewood
Member since 2015 • 7832 Posts

@flyincloud1116 said:

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

Why do you peasants even bother?

#8 Flyincloud1116
Member since 2014 • 6418 Posts

@howmakewood said:
@flyincloud1116 said:

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

Why do you peasants even bother?

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

#9 kingtito
Member since 2003 • 11775 Posts

@flyincloud1116 said:
@howmakewood said:
@flyincloud1116 said:

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

Why do you peasants even bother?

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

Stop your crying and run back to your console. You really had no reason to post here.

@Xtasy26 why are you such a fanboy for AMD video cards? Are you a paid shill or something?

#10 Flyincloud1116
Member since 2014 • 6418 Posts

@kingtito said:
@flyincloud1116 said:
@howmakewood said:
@flyincloud1116 said:

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

Why do you peasants even bother?

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

Stop your crying and run back to your console. You really had no reason to post here.

@Xtasy26 why are you such a fanboy for AMD video cards? Are you a paid shill or something?

I'm in the process of building a new rig finally, so does that count oh mighty Hermit Lem.

#11 Howmakewood
Member since 2015 • 7832 Posts

@flyincloud1116 said:
@howmakewood said:
@flyincloud1116 said:

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

Why do you peasants even bother?

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

Unlike you, I am a true gamer who plays games on the best platform available, which for most games is the PC.

Minimum wage? I thought that was for the peasants complaining about how expensive PC gaming is.

#12 ReadingRainbow4
Member since 2012 • 18733 Posts

4gb ram.

#13 kingtito
Member since 2003 • 11775 Posts

@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:
@howmakewood said:
@flyincloud1116 said:

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

Why do you peasants even bother?

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

Stop your crying and run back to your console. You really had no reason to post here.

@Xtasy26 why are you such a fanboy for AMD video cards? Are you a paid shill or something?

I'm in the process of building a new rig finally, so does that count oh mighty Hermit Lem.

What are you putting in it? I just upgraded from my 2600K last week.

#14 Flyincloud1116
Member since 2014 • 6418 Posts

@howmakewood said:
@flyincloud1116 said:
@howmakewood said:
@flyincloud1116 said:

If this produces the desired numbers, then all Hermits are again thrust into the perpetual state of being last generation. Oh, the horror.

Why do you peasants even bother?

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

Unlike you, I am a true gamer who plays games on the best platform available, which for most games is the PC.

Minimum wage? I thought that was for the peasants complaining about how expensive PC gaming is.

I guess it was over your head, but I digress. A true gamer because you game on the best platform? Surely you jest. Does the degree to which one games on PC matter for being considered a True Gamer? I mean, does the PC have to be mid, high, or ultra-high end (meaning the latest of the latest, no matter what, even if it means upgrading several times a year)? So I assume you have a couple of Titans, and if this has the best benchmarks then you are moving on to this new GPU.

#15 NyaDC
Member since 2014 • 8006 Posts

@ReadingRainbow4 said:

4gb ram.

That's the only negative I see with this card. It may be HBM, but 4GB is a deal breaker for me going into the future, regardless of the card's compute power.

#16 Flyincloud1116
Member since 2014 • 6418 Posts

@kingtito said:
@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:
@howmakewood said:

Why do you peasants even bother?

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

Stop your crying and run back to your console. You really had no reason to post here.

@Xtasy26 why are you such a fanboy for AMD video cards? Are you a paid shill or something?

I'm in the process of building a new rig finally, so does that count oh mighty Hermit Lem.

What are you putting in it? I just upgraded from my 2600K last week.

I haven't made up my mind yet. I don't think that's in my budget, but we shall see. i7 right?

#17 cyanblues
Member since 2004 • 312 Posts

Okay, the last Nvidia card I had was a GeForce 256. Other than that I've always had AMD cards; I'm sporting a 290X now. I've always liked AMD cards more, but even I'm not stupid enough to think the Fury X is better than the 980 Ti, at least in most games, especially DirectX 11 games. Overall, if someone wants the best right now, they're better off with the 980 Ti.

Now, if we're talking about async compute, that may be different. We still don't know, since there isn't an official, fully DirectX 12 game released yet.

#18 kingtito
Member since 2003 • 11775 Posts

@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:

Because we are peasants?

We will all be peasants if this holds up under benchmark testing. How do you like your $15 minimum wage?

Stop your crying and run back to your console. You really had no reason to post here.

@Xtasy26 why are you such a fanboy for AMD video cards? Are you a paid shill or something?

I'm in the process of building a new rig finally, so does that count oh mighty Hermit Lem.

What are you putting in it? I just upgraded from my 2600K last week.

I haven't made up my mind yet. I don't think that's in my budget, but we shall see. i7 right?

Yeah, it was an i7. I just upgraded to the i7 5930K with an Asus X99 Deluxe motherboard, 16GB of G.Skill memory, a new case because my liquid CPU radiator wouldn't fit in my old one, a new 850 SSD, and a 1200W PSU. It's more of a brand-new computer than an upgrade, I guess. I figure I won't need to upgrade for a few years. I will most likely be getting a 980 Ti to replace my 780 Ti until the new Pascal chips come out later this year.

#19 Flyincloud1116
Member since 2014 • 6418 Posts

@kingtito said:
@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:
@kingtito said:

Stop your crying and run back to your console. You really had no reason to post here.

@Xtasy26 why are you such a fanboy for AMD video cards? Are you a paid shill or something?

I'm in the process of building a new rig finally, so does that count oh mighty Hermit Lem.

What are you putting in it? I just upgraded from my 2600K last week.

I haven't made up my mind yet. I don't think that's in my budget, but we shall see. i7 right?

Yeah, it was an i7. I just upgraded to the i7 5930K with an Asus X99 Deluxe motherboard, 16GB of G.Skill memory, a new case because my liquid CPU radiator wouldn't fit in my old one, a new 850 SSD, and a 1200W PSU. It's more of a brand-new computer than an upgrade, I guess. I figure I won't need to upgrade for a few years. I will most likely be getting a 980 Ti to replace my 780 Ti until the new Pascal chips come out later this year.

Should I wait on Pascal? That would also mean I could get an i7 a little cheaper. Also, what is the best GPU to get? I have been out of the PC game for a few years.

#20 Xtasy26
Member since 2008 • 5593 Posts

@magicalclick said:

Sounds great. How do the game benchmarks compare? It's hard to tell the difference from specs alone.

It beats the Titan X in some games. And that's with summer drivers and at a significantly lower price.

Expect things to get better with the newer drivers.

@Xtasy26 why are you such a fanboy for AMD video cards? Are you a paid shill or something?

No, just pointing out interesting facts. I will give kudos to AMD for creating the World's First HBM GPU and the computer chip with the World's First 4096-bit bus. HBM and 4096-bit+ buses were revolutionary in that regard, as they will be the basis for next-gen GPUs over the next 10 years. I even expect the next-gen consoles to use HBM. They have been working on HBM for the past 7 years. So, in that regard, the Fury X was "special", as it was the first of its kind.

@scatteh316 said:

Nope... not even close...

Explain?

@ReadingRainbow4 said:

4gb ram.

@nyadc said:
@ReadingRainbow4 said:

4gb ram.

That's the only negative I see with this card. It may be HBM, but 4GB is a deal breaker for me going into the future, regardless of the card's compute power.

Ah, for that AMD has an answer. According to this video, HBM's sheer speed will actually help circumvent the problem because of how much faster it is than old-school GDDR5 RAM. As explained by AMD's Chief Gaming Scientist Richard Huddy, HBM can do over half a terabyte per second of memory bandwidth. As he put it, it "exceeds the capability of 8GB or 12GB of memory and the reason for that is that there is so much bandwidth inside HBM that if you have system memory we can swap memory around inside the machine, swap between HBM and system memory and keep the working set in the 4 Gigabytes and it never gets in the way of the GPU."

He goes on to say: "What happens is you effectively get rid of the problems of frame buffer size, and the extraordinary result that comes on the back of that is when you benchmark a Fiji chip... when you start to wind the resolution up higher and higher, you would think that our 4GB would become the limit, that the headroom would become a bottleneck, but far from it: as you wind the resolution up we get better and better, we start beating a Titan X, and indeed we consistently beat it if you go to a high enough resolution. So HBM is actually the future of memory."
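Huddy's "over half a terabyte per second" figure checks out from the bus width alone. A minimal sketch of the arithmetic, assuming HBM1's commonly quoted 500 MHz memory clock at double data rate (my assumption, not stated in the thread):

```python
# HBM1 bandwidth from first principles (assumed: 500 MHz memory clock, DDR).
bus_width_bits = 4096
pin_rate_gbit = 0.500 * 2                             # DDR -> 1 Gbit/s per pin
bandwidth_gb_s = bus_width_bits * pin_rate_gbit / 8   # bits -> bytes

print(bandwidth_gb_s)  # -> 512.0 GB/s, just over half a terabyte per second
```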

[Embedded video: Richard Huddy discussing HBM]

^^ That was a nice video if you are a geek, as he goes on to explain that it would have been nearly impossible to create a 4096-bit bus with GDDR5 RAM at the stated power consumption. He also goes into the Async Shaders that AMD's chip supports at the hardware level (unlike Nvidia), which will be important for DX12.

AMD has also assigned two engineers to do VRAM optimization, as stated by AnandTech.

So far I haven't run into any memory issues, even at 4K (I use 4K VSR).

#21 dxmcat
Member since 2007 • 3385 Posts

inb4 next gen consoles melt inside your entertainment center.

Also, Maxwell is over a year old. I'm sure Pascal will poop on it once again.

#22  Edited By 04dcarraher
Member since 2004 • 23857 Posts

Again with the obsession..... The cherry-picked benches and medium settings at 4K....... really..... The major downfall of Fury is the 4GB: when games need more than 4GB, it has to rely on system RAM, which makes the bandwidth of HBM mean squat, causing lower minimum fps and frame-timing issues.

#23  Edited By cyanblues
Member since 2004 • 312 Posts

CrossFire scaling has been working better than SLI for the last 2 years or so, when it does work.

On the Rage3D forum I visit, a lot of people complain about CrossFire and SLI not working properly in games, and about waiting weeks to months for drivers before either Nvidia or AMD gets them out.

In my opinion, using those benchmarks as proof is pointless.

Here are some single-card benchmarks from AnandTech, where Fury loses most of the time: http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/13 and some from TechReport: http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/6

#24 kingtito
Member since 2003 • 11775 Posts

@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:

I'm in the process of building a new rig finally, so does that count oh mighty Hermit Lem.

What are you putting in it? I just upgraded from my 2600K last week.

I haven't made up my mind yet. I don't think that's in my budget, but we shall see. i7 right?

Yeah, it was an i7. I just upgraded to the i7 5930K with an Asus X99 Deluxe motherboard, 16GB of G.Skill memory, a new case because my liquid CPU radiator wouldn't fit in my old one, a new 850 SSD, and a 1200W PSU. It's more of a brand-new computer than an upgrade, I guess. I figure I won't need to upgrade for a few years. I will most likely be getting a 980 Ti to replace my 780 Ti until the new Pascal chips come out later this year.

Should I wait on Pascal? That would also mean I could get an i7 a little cheaper. Also, what is the best GPU to get? I have been out of the PC game for a few years.

It all depends on your budget. If you can wait till Pascal comes out, I'd wait; it's supposed to be a significant upgrade. I'm going to upgrade now and then get one of the Pascal GPUs when they release the Ti version.

When it comes to picking the CPU and MB, I always go with the most I can afford. If I have to save a little I will so I'm not feeling the need to upgrade every 6 months.

Right now I think the best GPU is the 980 Ti, despite what the TC in this thread is saying. Nvidia drivers are usually more mature and come out at a faster rate than AMD's. Lots of games are also made with Nvidia in mind. You really can't go wrong with the Fury X or 980 Ti, but if you're not going to go with either of those, then I'd go with Nvidia for a mid-tier GPU.

#26  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Xtasy26:

FYI, my 980 Ti factory overclocks to ~1.317 GHz (6.8 TFLOPS), hence it's not limited to 5.6 TFLOPS at ~1075 MHz. Neither GPU can properly sustain 4K.

#27  Edited By deactivated-59d151f079814
Member since 2003 • 47239 Posts

.... Seriously, I had to check when this was posted, because we have had reviews of this card for months, and it was widely seen as a disappointment in offering anything groundbreaking. Furthermore, those are cherry-picked benches, not to mention a CrossFire/SLI comparison, which is only going to concern the very few people who need an SLI setup running $1200+. Meanwhile, the 980 Ti overclocked handily beats the Fury in every real-world test, much of which has to do with the fact that the Fury is a poor overclocker. The Fury X is by no means a bad card, but stop trying to make it out to stomp the competition. It doesn't; it lags behind the 980 Ti most of the time. I would have strongly considered one myself if it weren't for the poor overclocking compared to Nvidia's current 900 series.

#28  Edited By Jag85
Member since 2005 • 20637 Posts

@Xtasy26 said:

Titan X: 6.14 Teraflops

980 Ti: 5.6 Teraflops

Fury X: 8.6 Teraflops

What's more, AMD did it with a chip that's smaller than the Titan X/980 Ti.

Titan X/980 Ti chip size: 601 mm2

Fury X: 596 mm2.

They also packed more transistors into the chip despite it being smaller.

Titan X: 8 Billion Transistors

Fury X: 8.9 Billion Transistors.

That means AMD packed in 900 Million MORE transistors than the 980 Ti/Titan X. They also included the World's First HBM memory and the World's First 4096-bit bus, all in a chip that's smaller than the Titan X!

They did all that on a shoestring budget. Impressive.

I guess the term "Fast and Furyous" does apply to the Fury X. ;)

#PokerFuryXFace

And yet, despite all that power, AMD is still losing to Nvidia in benchmark performance tests...

...Nvidia does what AMDon't.

#29  Edited By clyde46
Member since 2005 • 49061 Posts

The AMD shill is at it again.

#30 Xtasy26
Member since 2008 • 5593 Posts

@04dcarraher said:

Again with the obsession..... The cherry-picked benches and medium settings at 4K....... really..... The major downfall of Fury is the 4GB: when games need more than 4GB, it has to rely on system RAM, which makes the bandwidth of HBM mean squat, causing lower minimum fps and frame-timing issues.

Again with your delusions. Why are you using drivers from 6 months ago? Even then, on average it's still beating the Titan X SLI in both scaling and average frame times.

#31 Xtasy26
Member since 2008 • 5593 Posts

@clyde46 said:

The AMD shill is at it again.

Why don't you like facts?

@Jag85 said:
@Xtasy26 said:

Titan X: 6.14 Teraflops

980 Ti: 5.6 Teraflops

Fury X: 8.6 Teraflops

What's more, AMD did it with a chip that's smaller than the Titan X/980 Ti.

Titan X/980 Ti chip size: 601 mm2

Fury X: 596 mm2.

They also packed more transistors into the chip despite it being smaller.

Titan X: 8 Billion Transistors

Fury X: 8.9 Billion Transistors.

That means AMD packed in 900 Million MORE transistors than the 980 Ti/Titan X. They also included the World's First HBM memory and the World's First 4096-bit bus, all in a chip that's smaller than the Titan X!

They did all that on a shoestring budget. Impressive.

I guess the term "Fast and Furyous" does apply to the Fury X. ;)

#PokerFuryXFace

And yet, despite all that power, AMD is still losing to Nvidia in benchmark performance tests...

...Nvidia does what AMDon't.

Too bad I don't play PassMark.

#32 Desprado
Member since 2015 • 60 Posts

@Xtasy26: You are an idiot and stupid.

#33  Edited By NyaDC
Member since 2014 • 8006 Posts

@Jag85 said:

And yet, despite all that power, AMD is still losing to Nvidia in benchmark performance tests...

...Nvidia does what AMDon't.

Passmark is literally one of the worst benchmarks you could have cited as evidence for anything; it's absolutely worthless. I hate to say this, but anyone who EVER posts Passmark as some form of argument piece should never be listened to again in a PC discussion, and I mean NEVER...

#34 Xtasy26
Member since 2008 • 5593 Posts

@ronvalencia said:

@Xtasy26:

FYI, my 980 Ti factory overclocks to ~1.317 GHz (6.8 TFLOPS), hence it's not limited to 5.6 TFLOPS at ~1075 MHz. Neither GPU can properly sustain 4K.

That's a nice overclock. I'd really like to see what my OC can do now that HBM has been unlocked.

@magicalclick said:

@Xtasy26: The score is decent. I want to see how this compares with DX12 games like Fable Legends in the future. And I am still waiting on tiled resources to work with this kind of lower-capacity RAM. I think we need a Star Citizen benchmark too; the game seems to generate the planet in real time and thus is not limited by textures. The minimum fps is still questionable, though.

I would like to see more DX12 benchmarks too. Also, with CrossFire in DX12 the cards will share video memory, which means having 4GB of HBM will be even less of a limiting factor in future games if you do CrossFire (which I expect will be essential if you want to game at 4K on the Fury X or 980 Ti in future titles).

http://arstechnica.com/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/

#35  Edited By Heil68
Member since 2004 • 60817 Posts

@ronvalencia said:

@Xtasy26:

FYI, my 980 Ti factory overclocks to ~1.317 GHz (6.8 TFLOPS), hence it's not limited to 5.6 TFLOPS at ~1075 MHz. Neither GPU can properly sustain 4K.

I was going to buy this GPU in my next build. Not now; I'll go a different route.

Thanks, Ron. You saved me some money.

#36  Edited By neatfeatguy
Member since 2005 • 4415 Posts

Really? You're posting the same post in the hardware forum, and now you're spouting the same old crappy, cherry-picked images from 6+ months ago to try to back your crappy post?

Can someone please delete his stupid threads? They're multiplying in other categories on this forum.

#37 Desprado
Member since 2015 • 60 Posts

@neatfeatguy said:

Really? You're posting the same post in the hardware forum, and now you're spouting the same old crappy, cherry-picked images from 6+ months ago to try to back your crappy post?

Can someone please delete his stupid threads? They're multiplying in other forum locations on this board.

The funny thing is that he has never shown a proper custom-OC GTX 980 Ti vs the Fury X. He also edited the image so that the Fury X looks faster. It only shows the failure of AMD's management and its community, and how disconnected they are from each other and from the normal PC user.

This is a proper benchmark of the reference GTX 980 Ti vs the Fury X with the latest drivers.

#38  Edited By Jag85
Member since 2005 • 20637 Posts

@nyadc: What's wrong with PassMark?

#39 Xtasy26
Member since 2008 • 5593 Posts

@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:

I'm in the process of building a new rig finally, so does that count oh mighty Hermit Lem.

What are you putting in it? I just upgraded from my 2600K last week.

I haven't made up my mind yet. I don't think that's in my budget, but we shall see. i7 right?

Yeah, it was an i7. I just upgraded to the i7 5930K with an Asus X99 Deluxe motherboard, 16GB of G.Skill memory, a new case because my liquid CPU radiator wouldn't fit in my old one, a new 850 SSD, and a 1200W PSU. It's more of a brand-new computer than an upgrade, I guess. I figure I won't need to upgrade for a few years. I will most likely be getting a 980 Ti to replace my 780 Ti until the new Pascal chips come out later this year.

Should I wait on Pascal? That would also mean I could get an i7 a little cheaper. Also, what is the best GPU to get? I have been out of the PC game for a few years.

If you want to game at high resolutions and want something future-proof, it would be insane not to get the Fury X now that the drivers have improved. The Fury X has 176 GB/s more memory bandwidth than the 980 Ti and has a 4096-bit bus, 8X that of the 980 Ti.

Remember, their drivers are getting better and better, so expect performance to increase as drivers improve at 1440p+ resolutions.

@desprado And? Those benches don't disprove anything I have stated all along. By the way, what kind of idiot would want to play certain games at 4K maxed out below 30 fps? Maybe in certain games. Why do you think I have been selecting CF vs SLI benchmarks at 4K? Or are you too stupid to realize that?

#40 Xtasy26
Member since 2008 • 5593 Posts

@sSubZerOo said:

.... Seriously, I had to check when this was posted, because we have had reviews of this card for months, and it was widely seen as a disappointment in offering anything groundbreaking. Furthermore, those are cherry-picked benches, not to mention a CrossFire/SLI comparison, which is only going to concern the very few people who need an SLI setup running $1200+. Meanwhile, the 980 Ti overclocked handily beats the Fury in every real-world test, much of which has to do with the fact that the Fury is a poor overclocker. The Fury X is by no means a bad card, but stop trying to make it out to stomp the competition. It doesn't; it lags behind the 980 Ti most of the time. I would have strongly considered one myself if it weren't for the poor overclocking compared to Nvidia's current 900 series.

I could give you average benchmarks for a whole host of games, not just certain games. Even then, it has pulled away from the 980 Ti at 4K since its launch and has gotten better at 1440p.

As AMD's drivers get better and better, expect it to pull further ahead. The same thing happened with the 780 Ti vs the R9 290X: it was behind when it launched, and the R9 290X is now ahead of the 780 Ti on average. The only difference is that the Fury X did it in 4-5 months.

#41  Edited By Desprado
Member since 2015 • 60 Posts

@Xtasy26: It is not about playable settings; it is about benchmarks and performance. The Fury X is only competitive at 4K against the reference card. You are such a stupid, idiotic fanboy that you are only selecting one website's benchmarks to back your false claim. Moreover, you are far from reality, kid, and that is the reason normal PC users do not like AMD.

This was posted by you, right?

http://techavenue.wix.com/techavenue#!xfx-fury-x-review-final/c6bd

And see this post with a proper benchmark:

http://forums.anandtech.com/showthread.php?t=2457410&highlight=

Grow up, kid. This will not help AMD at all; it will further reduce AMD's dGPU market share if idiots like you do not learn their lesson.

#42  Edited By m3dude1
Member since 2007 • 2334 Posts

@Xtasy26 said:
@flyincloud1116 said:
@kingtito said:
@flyincloud1116 said:

I haven't made up my mind yet. I don't think that's in my budget, but we shall see. i7 right?

Yeah, it was an i7. I just upgraded to the i7 5930K with an Asus X99 Deluxe motherboard, 16GB of G.Skill memory, a new case because my liquid CPU radiator wouldn't fit in my old one, a new 850 SSD, and a 1200W PSU. It's more of a brand-new computer than an upgrade, I guess. I figure I won't need to upgrade for a few years. I will most likely be getting a 980 Ti to replace my 780 Ti until the new Pascal chips come out later this year.

Should I wait on Pascal? That would also mean I could get an i7 a little cheaper. Also, what is the best GPU to get? I have been out of the PC game for a few years.

If you want to game at high resolutions and want something future-proof, it would be insane not to get the Fury X now that the drivers have improved. The Fury X has 176 GB/s more memory bandwidth than the 980 Ti and has a 4096-bit bus, 8X that of the 980 Ti.

Remember, their drivers are getting better and better, so expect performance to increase as drivers improve at 1440p+ resolutions.

@desprado And? Those benches don't contradict anything that I have stated all along. By the way, what kind of an idiot would want to play games at 4K maxed out below 30 FPS? Maybe in certain games. Why do you think I have been selecting CF vs. SLI benchmarks at 4K? Or are you too stupid to realize that?

one of the dumbest things ive ever read

#43  Edited By Desprado
Member since 2015 • 60 Posts

@m3dude1: I have read much worse on AMD Roy's Twitter; that is why he closed and deleted his account, due to the many blind, idiotic AMD fanboys.

He is just being a typical AMD fanboy, far removed from reality.

#44  Edited By NyaDC
Member since 2014 • 8006 Posts

@Jag85 said:

@nyadc: What's wrong with PassMark?

It provides zero valuable information of any kind, and its performance metrics are wildly off relative to how hardware performs in real-world applications.

#45  Edited By m3dude1
Member since 2007 • 2334 Posts

@desprado said:

@m3dude1: I have read much worse on AMD Roy's Twitter; that is why he closed and deleted his account, due to the many blind, idiotic AMD fanboys.

He is just being a typical AMD fanboy, far removed from reality.

I meant that specific line about the 4096-bit bus. I've never seen a comparison at that level of stupidity in GPU discussions. For once nyadc is right: PassMark is useless. He's ultimately wrong though, as usual. The Fury X is not a better product than the 980 Ti. It seems fundamentally broken at a hardware level somehow, as there's a serious lack of scaling above a 390X.

#46  Edited By Jag85
Member since 2005 • 20637 Posts

@nyadc said:

It provides zero valuable information of any kind, and its performance metrics are wildly off relative to how hardware performs in real-world applications.

Interesting. It seems to be very popular for some reason. What alternative benchmark software would be better for benchmarking graphics cards?

By the way, I wasn't being serious. Not a big fan of either AMD or Nvidia. Still deciding on which one to go with for my next GPU upgrade.

#47 NyaDC
Member since 2014 • 8006 Posts

@Jag85 said:
@nyadc said:

It provides zero valuable information of any kind, and its performance metrics are wildly off relative to how hardware performs in real-world applications.

Interesting. It seems to be very popular for some reason. What alternative benchmark software would be better for benchmarking graphics cards?

By the way, I wasn't being serious. Not a big fan of either AMD or Nvidia. Still deciding on which one to go with for my next GPU upgrade.

I'd mostly stick to in-game benchmarking, but 3DMark and Unigine can provide some valuable information.

#48 Xtasy26
Member since 2008 • 5593 Posts

@neatfeatguy said:

Really? You're posting the same post in the hardware forum, and now you're spouting the same old crappy, cherry-picked images from 6+ months ago to try to back your crappy post?

Really? Can't you see this was about compute power? I showed benchmarks to demonstrate this chip's performance because someone asked for it. If you don't like my "crappy" post, then post a counter-argument instead of whining.

@desprado said:

@m3dude1: I have read much worse on AMD Roy's Twitter; that is why he closed and deleted his account, due to the many blind, idiotic AMD fanboys.

He is just being a typical AMD fanboy, far removed from reality.

Heck, I could play your game too: you are a typical Nvidia fanboy. At least post a counter-argument.

@Jag85 said:

What's wrong with PassMark?

It's one of the worst benchmarks for gauging actual performance. Fire Strike at least bears "some" resemblance to "real-life" benchmarks. Even then, I think using Fire Strike scores when buying a graphics card is very silly.

#49 Xtasy26
Member since 2008 • 5593 Posts

@m3dude1 said:
@desprado said:

@m3dude1: I have read much worse on AMD Roy's Twitter; that is why he closed and deleted his account, due to the many blind, idiotic AMD fanboys.

He is just being a typical AMD fanboy, far removed from reality.

I meant that specific line about the 4096-bit bus. I've never seen a comparison at that level of stupidity in GPU discussions.

Then explain the significance of HBM's 4096-bit bus running at 500 MHz versus, say, GDDR5's much higher clock speed on a narrower bus.
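For what it's worth, the numbers behind this argument are easy to check: theoretical peak memory bandwidth is just bus width multiplied by per-pin data rate, so neither figure means anything in isolation. A quick sketch in Python, using the commonly published launch specs (HBM1 at 500 MHz DDR, i.e. 1 Gbps per pin on a 4096-bit bus; the 980 Ti's GDDR5 at 7 Gbps per pin on a 384-bit bus):

```python
# Theoretical peak memory bandwidth = (bus width in bytes) x (data rate per pin).
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s for a memory interface."""
    return bus_width_bits / 8 * gbps_per_pin

fury_x = peak_bandwidth_gbs(4096, 1.0)     # HBM1: 500 MHz DDR -> 1 Gbps/pin
gtx_980_ti = peak_bandwidth_gbs(384, 7.0)  # GDDR5: 7 Gbps/pin

print(fury_x)               # 512.0 GB/s
print(gtx_980_ti)           # 336.0 GB/s
print(fury_x - gtx_980_ti)  # 176.0 GB/s -- the figure quoted in this thread
print(fury_x / gtx_980_ti)  # ~1.52x, despite a ~10.7x wider bus
```

So the 4096-bit bus does translate into 176 GB/s of extra peak bandwidth, but only about 1.5x overall, because GDDR5 clocks each pin seven times faster; quoting bus width alone overstates the gap.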

#50 scatteh316
Member since 2004 • 10273 Posts

Dear fucking lord... OP is an idiot. MINIMUM FRAME RATES are the single most important factor in game performance, and they're higher on Nvidia cards.