I have joined the Dark Side and Fury X now dominates the 980 Ti!

#1  Edited By Xtasy26
Member since 2008 • 5593 Posts

This so-called AMD fanboy did the unthinkable. He bought an Intel processor! The Core i7 6700K! ;)

I am so sorry, AMD, but Zen apparently got delayed to late Q4 2016, so you might actually be able to buy it in available quantities with proper motherboard support in Q1 2017. 2017 is too late. I want to be able to play the latest games out now maxed out, not one year later. See, I am not a fanboy; I just buy the best of what's within my budget, like most people in the world do. But I did pick up this:

Why the Fury X over the 980 Ti? Well, I got it over $50 cheaper (actually more than that). But later on, as I found out, with newer drivers it's actually dominating the 980 Ti at 4K, and Fury X CrossFire beats 980 Ti SLI. According to the review below:

Now, with voltage unlocking in the new release of Sapphire TriXX, the Fury X can be pushed to 1200/600, making it faster than an OC'd 980 Ti.
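For a sense of what that memory bump means, here's a back-of-envelope sketch (assuming HBM1's 4096-bit bus and double data rate; these are rough theoretical numbers, not from the review):

# Rough HBM bandwidth estimate for Fury X (HBM1: 4096-bit bus, double data rate)
def hbm_bandwidth_gb_s(mem_clock_mhz, bus_width_bits=4096):
    # bits per second = bus width * clock * 2 (DDR); divide by 8 for bytes, 1e9 for GB
    return bus_width_bits * mem_clock_mhz * 1e6 * 2 / 8 / 1e9

print(hbm_bandwidth_gb_s(500))  # stock 500 MHz  -> ~512 GB/s
print(hbm_bandwidth_gb_s(600))  # 600 MHz unlock -> ~614 GB/s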

And at 1440P it has now caught up to the 980 Ti and is beating it at 4K on average.

Not to mention that it is now cheaper than the 980 Ti.

Also, in terms of image quality the R9 series is a lot better than the Titan/Ti series:

As stated in the article:

"Over the past 10 years most tech sites has shied away from doing image quality comparisons. So, we thought it would be good idea to do image quality comparison between the current R9 series graphics card and the nVidia's Titan series graphics card from which these images were taken. The results show noticeable difference in picture quality in Battlefield 4.

As one can see, if you look closely at the side of the tank, there is greater detail and better color on the AMD card versus the nVidia card. The picture overall looks more vibrant and detailed on the AMD card versus the nVidia card."

"If one looks at this picture one can see if you look at the building floors where there is black color, black is black on the AMD card while on the nVidia card it is more grayish. Also, if one looks at the gloves on the AMD card, the colors are much more detailed and vibrant, where on the nVidia card the colors look bland."

Source

So, I guess you can say it was a good buy. I especially liked the image quality comparison (which is rare these days in a review). I thought I was seeing things when I went from nVidia to AMD with respect to image quality; now I know I was right. The site also has extensive benchmarks of games running at 4K VSR, which I actually use, so that was very beneficial.
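If you want to sanity-check a screenshot comparison like that yourself instead of eyeballing compressed images, here's a minimal sketch (the file names are hypothetical, and it assumes two same-size captures of the same frame):

# Per-pixel comparison of two screenshots with Pillow + NumPy
from PIL import Image
import numpy as np

amd = np.asarray(Image.open("bf4_amd.png").convert("RGB"), dtype=np.int16)
nv = np.asarray(Image.open("bf4_nvidia.png").convert("RGB"), dtype=np.int16)

diff = np.abs(amd - nv)
print("mean per-channel difference:", diff.mean())  # 0 means the captures are identical
Image.fromarray(diff.clip(0, 255).astype(np.uint8)).save("bf4_diff.png")  # shows where they differ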

#2 NyaDC
Member since 2014 • 8006 Posts

As great as the Fury X is I'm not buying one, my 290X's are still substantially more powerful, I'll wait for the next line of cards.

#3 Heil68
Member since 2004 • 60819 Posts

@nyadc said:

As great as the Fury X is I'm not buying one, my 290X's are still substantially more powerful, I'll wait for the next line of cards.

Me too. PASS.

#4 Xtasy26
Member since 2008 • 5593 Posts

@nyadc said:

As great as the Fury X is I'm not buying one, my 290X's are still substantially more powerful, I'll wait for the next line of cards.

You have 290Xs in CrossFire, so that's understandable. Enjoy the image quality; I didn't realize there was such a difference.

Looking at the fact that Fury X CrossFire is beating Titan X SLI, I too want to go CrossFire! Imagine the savings!

#5 EZs
Member since 2005 • 1573 Posts

AMD cards run hotter than Nvidia though.

#6 remiks00
Member since 2006 • 4249 Posts

EZ Bugatti's.

#7 jhonMalcovich
Member since 2010 • 7090 Posts

I will wait for the new gen GPUs like Pascal that will obliterate both Fury and Ti

#8  Edited By commander
Member since 2010 • 16217 Posts

No more AMD for me, I got screwed by drivers too many times, not to mention their CrossFire profiles are always considerably late. For some reason, Nvidia always manages to deliver on day one of a new game's release. It's more expensive though, that's for sure.

#9  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Xtasy26:

I have both an R9-290X 1040MHz OC edition and a 980 Ti OC edition. I don't have the power supply (or the will, lazy) to CrossFire my R9-290X and R9-290.

If you have Fury-X, then that's good for you.

#10 aroxx_ab
Member since 2005 • 13236 Posts

You should have got the PS4 instead, all powered by your beloved AMD :D

#11 scatteh316
Member since 2004 • 10273 Posts

Fury only beats it in 2 of the benchmarks you posted...... complete fail...

#12 Xtasy26
Member since 2008 • 5593 Posts

@EZs said:

AMD cards run hotter than Nvidia though.

Please tell me what you are smoking. In the review, the Fury X gets an idle temperature of 22 degrees Celsius. Under load it gets 49 degrees Celsius.

@aroxx_ab said:

You should have got the PS4 instead, all powered by your beloved AMD :D

Hell no! I am part of the #PCMasterRace not the #DownGradeRace #Can'tdoasteady1080P or #Can'tdoa1080P race. ;)

@scatteh316 said:

Fury only beats it in 2 of the benchmarks you posted...... complete fail...

You must be blind. At 1440P on average it's matching the 980 Ti. At 4K it's beating the 980 Ti. In CrossFire it's COMPLETELY dominating 980 Ti SLI in EVERY SINGLE game benched by TweakTown, including Battlefield 4, Metro Last Light, GRID Autosport, BioShock Infinite and Shadow of Mordor. So much for "only two games." Heck, Fury X CrossFire is beating Titan X SLI in 4 out of 5 games and drawing in one.

Can't believe it's beating the Titan X in SLI. :O And that's on pre-Crimson drivers.

#13 Xtasy26
Member since 2008 • 5593 Posts

@ronvalencia said:

@Xtasy26:

If you have Fury-X, then that's good for you.

Thanks! I am glad I waited for the price to come down and the performance to improve. I am extremely satisfied with the purchase now that I know it's beating 980 Ti/Titan X SLI, especially in CrossFire. I am thinking about getting another one once the price drops, given the CrossFire benches.

I like what they said in the Summary.

  • The performance and features of the Crimson drivers should have been there when the Fury X launched.
  • Being $50 cheaper than the 980 Ti is where it should have been from the beginning.

As I stated when the Fury X launched, it's a great card. The price should have been $599 back in June.

#14 HalcyonScarlet
Member since 2011 • 13838 Posts

Graphics cards aside, Intel CPUs are pretty amazing. I have an i5 and I'm very happy with it. It handles everything I throw at it.

#15  Edited By Sollet
Member since 2003 • 8288 Posts

@jhonMalcovich said:

I will wait for the new gen GPUs like Pascal that will obliterate both Fury and Ti

Same here.

#16  Edited By Snugenz
Member since 2006 • 13388 Posts

The 4GB of VRAM on the Fury is a bit of a downer; recent games have gotten silly with their VRAM usage and my 780's 3GB is constantly maxed, so my next upgrade will be at least 6GB+.

Dat 6700k though...

#17  Edited By Howmakewood
Member since 2015 • 7834 Posts

My next upgrade will be a completely new build; time to let my trusty i7 2600K go and say goodbye to DDR3. I do have a 980 Ti though, but I'm planning on getting next-gen Pascal.

Nice to see AMD doing some work and not banging on the old stuff. More competition is always good for the consumer!

#18  Edited By GhoX
Member since 2006 • 6267 Posts

Fury SHOULD be cheaper and stronger. It is newer tech.

The next move though is nvidia's with their new lineup in 2016. By then I may actually need to upgrade depending on how VR develops.

#19 scatteh316
Member since 2004 • 10273 Posts

@Xtasy26 said:
@EZs said:

AMD cards run hotter than Nvidia though.

Please tell me what you are smoking. In the review, the Fury X gets an idle temperature of 22 degrees Celsius. Under load it gets 49 degrees Celsius.

@aroxx_ab said:

You should have got the PS4 instead, all powered by your beloved AMD :D

Hell no! I am part of the #PCMasterRace not the #DownGradeRace #Can'tdoasteady1080P or #Can'tdoa1080P race. ;)

@scatteh316 said:

Fury only beats it in 2 of the benchmarks you posted...... complete fail...

You must be blind. At 1440P on average it's matching the 980 Ti. At 4K it's beating the 980 Ti. In CrossFire it's COMPLETELY dominating 980 Ti SLI in EVERY SINGLE game benched by TweakTown, including Battlefield 4, Metro Last Light, GRID Autosport, BioShock Infinite and Shadow of Mordor. So much for "only two games." Heck, Fury X CrossFire is beating Titan X SLI in 4 out of 5 games and drawing in one.

Can't believe it's beating the Titan X in SLI. :O And that's on pre-Crimson drivers.

I'm not blind, I just have common sense.

Average frame rates mean sweet **** all..... Minimum frame rates are the most important and the 980 Ti is the better card... heck, look at the difference in minimum FPS in Shadow of Mordor... 35fps vs 67fps...... That's nearly double the bottom-end frame rate and yet you think Fury turns in the better performance in that game?

Hahahahahahahahahaha

#20 deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

Comparing reference cards isn't exactly a good way to test; the 980 Ti also overclocks a lot better.

#21  Edited By Xtasy26
Member since 2008 • 5593 Posts

@scatteh316 said:
@Xtasy26 said:
@EZs said:

AMD cards run hotter than Nvidia though.

Please tell me what you are smoking. In the review, the Fury X gets an idle temperature of 22 degrees Celsius. Under load it gets 49 degrees Celsius.

@aroxx_ab said:

You should have got the PS4 instead, all powered by your beloved AMD :D

Hell no! I am part of the #PCMasterRace not the #DownGradeRace #Can'tdoasteady1080P or #Can'tdoa1080P race. ;)

@scatteh316 said:

Fury only beats it in 2 of the benchmarks you posted...... complete fail...

You must be blind. At 1440P on average it's matching the 980 Ti. At 4K it's beating the 980 Ti. In CrossFire it's COMPLETELY dominating 980 Ti SLI in EVERY SINGLE game benched by TweakTown, including Battlefield 4, Metro Last Light, GRID Autosport, BioShock Infinite and Shadow of Mordor. So much for "only two games." Heck, Fury X CrossFire is beating Titan X SLI in 4 out of 5 games and drawing in one.

Can't believe it's beating the Titan X in SLI. :O And that's on pre-Crimson drivers.

I'm not blind, I just have common sense.

Average frame rates mean sweet **** all..... Minimum frame rates are the most important and the 980 Ti is the better card... heck, look at the difference in minimum FPS in Shadow of Mordor... 35fps vs 67fps...... That's nearly double the bottom-end frame rate and yet you think Fury turns in the better performance in that game?

Hahahahahahahahahaha

Well, in Metro Last Light it's 10 FPS for the Fury X and 7 for the 980 Ti. I haven't seen many issues with minimum frame rates. But remember that they got better with the new Crimson drivers; those benches are on pre-Crimson drivers that are several months old. Also, AMD hired the guy who originally discovered the frame pacing issues with AMD cards, so expect things to get better. Also, with respect to frame times, Fury X CrossFire is doing better than Titan X SLI.
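Since this back-and-forth keeps mixing averages and minimums, here's a toy sketch of why the two can tell different stories (the frame times are made up, not taken from any review):

# Average FPS can look fine while a handful of slow frames wrecks the minimum
frame_times_ms = [16.7] * 95 + [50.0] * 5  # mostly smooth, five big spikes

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)
print(f"average: {avg_fps:.1f} fps, worst frame: {min_fps:.1f} fps")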

@hoosier7 said:

Comparing reference cards isn't exactly a good way to test, the 980ti also overclocks a lot better.

With the release of the new Sapphire TriXX you can now unlock the memory, which means it now overclocks as well as the 980 Ti, if not better, and gets better performance.

#22 Juub1990
Member since 2013 • 12622 Posts
@Xtasy26 said:

With the release of the new Sapphire TriXX you can now unlock the memory, which means it now overclocks as well as the 980 Ti, if not better, and gets better performance.

We know that's a lie.

#23 BassMan
Member since 2002 • 18736 Posts

He is a Radeon fanboy. Leave him be. I hope he got a good deal on that Fury X, otherwise he should have bought a 980 Ti.

#24 Yams1980
Member since 2006 • 2866 Posts

The quality doesn't look better. It just has darker contrast on the AMD screenshots, which makes you think it looks better. You see the same when they compare PS4, Xbox One and PC games; the colors vary a bit, but some color adjustments would make them look identical. Many of these screenshots are likely Photoshopped to wash them out more by whoever is trying to make their case.

I'd like some real unedited shots; those were carefully edited, rescaled and modified to make the Nvidia ones look worse and the AMD ones look better. In the first shot you can clearly see they smudged the vehicle texture in the screenshot with Photoshop.

#25 Xtasy26
Member since 2008 • 5593 Posts

@Juub1990 said:
@Xtasy26 said:

With the release of the new Sapphire TriXX you can now unlock the memory, which means it now overclocks as well as the 980 Ti, if not better, and gets better performance.

We know that's a lie.

You clearly don't know anything about the Sapphire TriXX software. People are now getting memory overclocks, which, if you didn't know, you couldn't do when the Fury X launched. People are now getting much better overclocks, as indicated by the posts below:

@BassMan said:

He is a Radeon fanboy. Leave him be. I hope he got a good deal on that Fury X, otherwise he should have bought a 980 Ti.

Stating facts makes me a fanboy now? Did you look at the benches? Oh yeah, you keep ignoring them. And yes, I got it a lot cheaper on a deal that I couldn't pass up.

@Yams1980 said:

The quality doesn't look better. It just has darker contrast on the AMD screenshots, which makes you think it looks better. You see the same when they compare PS4, Xbox One and PC games; the colors vary a bit, but some color adjustments would make them look identical. Many of these screenshots are likely Photoshopped to wash them out more by whoever is trying to make their case.

I'd like some real unedited shots; those were carefully edited, rescaled and modified to make the Nvidia ones look worse and the AMD ones look better. In the first shot you can clearly see they smudged the vehicle texture in the screenshot with Photoshop.

You might be blind. LOL. These were side-by-side comparisons so we could see a direct comparison. Hardly what you would call "modified", and your assertion that they used Photoshop to make nVidia look worse is even more ludicrous.

#26  Edited By Ten_Pints
Member since 2014 • 4072 Posts

I'm still waiting for graphics cards to come down in price. I've been looking at some 4GB cards from AMD, and a 2GB model is 40% cheaper than the 4GB model; I don't really understand why the RAM has such a premium on it when nothing else is different.

Nvidia cards, while nice, are too expensive and don't support the DX12 feature set, or did that change?

#27 Xtasy26
Member since 2008 • 5593 Posts

@ten_pints said:

I'm still waiting for graphics cards to come down in price. I've been looking at some 4GB cards from AMD, and a 2GB model is 40% cheaper than the 4GB model; I don't really understand why the RAM has such a premium on it when nothing else is different.

Nvidia cards, while nice, are too expensive and don't support the DX12 feature set, or did that change?

That's how new technology works. New technology will always be expensive when it comes out, and after more products adopt the technology the prices come down. GDDR5 has been used in new graphics cards since 2008. Now that it's old-school technology and has been mass marketed, the prices have come down.

Whereas with HBM, there are only 3 products in the world right now that use HBM: the Fury X, Fury and Nano.

Think of it like this: electric cars like the Tesla Model S are very expensive right now. When newer Tesla models come out and they reach the mass market, expect the price of Tesla cars to come down (and also as more car manufacturers start to produce electric cars). In the graphics world, when nVidia releases their Pascal GPUs, expect the price of HBM to come down.

#28 organic_machine
Member since 2004 • 10143 Posts

The Fury rocks. I have one and I have a shitty A10 CPU (it's okaaay, but I'm waiting for the Zen CPU; this is good enough to last me in the meantime), and I wreck games.

ESPECIALLY with that Windows 10 AMD patch. I literally jumped 20 FPS in the Witcher after that patch.

#29 Xtasy26
Member since 2008 • 5593 Posts

This will probably be my last and final upgrade in a long, long time as I don't have much time to game anymore. My upgrade history:

2002 - GeForce 4 MX 420 64MB DDR. New Pentium 4 Build.

2006 - XFX GeForce 7600 GT XXX edition.

2008 - Sapphire ATI Radeon HD 4870. New Build.

2011- XFX Radeon HD 6950 Bios Flashed to HD 6970.

2015 - Radeon Fury X + Core i7 6700K. New Build.

As you can see over the past 13 years, I don't go more than 4 years without a Graphics Card or new PC build upgrade. I will probably break the 4 year cycle and won't upgrade in the next 4 years. Anyone want to venture a guess on how long my system will last, I mean will be able to play new games maxed out at least at 1080P with some amounts of FSAA like FXAA? I mean most games today are console ports and console are struggling to push 1080P on current games, imagine games coming out in 2019, they will struggle even more and my Fury X + Core i7 6700K vastly outperforms a PS4 and Xbox One. Also, how often do you guys upgrade?

#30  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

lol no it's not, especially with their terrible optimization and games designed around Nvidia hardware.

Team green is always the much safer bet for single GPUs.

edit: Lol @ the image comparison, that's some lemming type fuckery you're attempting, so sad.

#31 Phazevariance
Member since 2003 • 12356 Posts

I'll hold off until the Nvidia 1080 comes out. Until then my old-school GTX 680 will have to hold its own.

#32 Xtasy26
Member since 2008 • 5593 Posts

@ReadingRainbow4 said:

lol no it's not, especially with their terrible optimization and games designed around Nvidia hardware.

Team green is always the much safer bet for single GPUs.

edit: Lol @ the image comparison, that's some lemming type fuckery you're attempting, so sad.

The new benches with updated drivers say otherwise, at least at 1440P+ resolutions. You must have **totally** ignored the benches.

As for the image comparison, what's wrong with it? It shows a pretty good comparison. You must be new to the whole GPU image comparison thing, because it was something major tech sites used to do 10-12 years ago. AMD/ATI always had better image quality, and with these new image comparisons it looks like things haven't changed.

#33 deactivated-5a8875b6c648f
Member since 2015 • 954 Posts

I'll just wait for pascal.

#34  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Xtasy26 said:

This will probably be my last and final upgrade in a long, long time as I don't have much time to game anymore. My upgrade history:

2002 - GeForce 4 MX 420 64MB DDR. New Pentium 4 Build.

2006 - XFX GeForce 7600 GT XXX edition.

2008 - Sapphire ATI Radeon HD 4870. New Build.

2011- XFX Radeon HD 6950 Bios Flashed to HD 6970.

2015 - Radeon Fury X + Core i7 6700K. New Build.

As you can see over the past 13 years, I don't go more than 4 years without a Graphics Card or new PC build upgrade. I will probably break the 4 year cycle and won't upgrade in the next 4 years. Anyone want to venture a guess on how long my system will last, I mean will be able to play new games maxed out at least at 1080P with some amounts of FSAA like FXAA? I mean most games today are console ports and console are struggling to push 1080P on current games, imagine games coming out in 2019, they will struggle even more and my Fury X + Core i7 6700K vastly outperforms a PS4 and Xbox One. Also, how often do you guys upgrade?

For desktops

GeForce4 Ti 4200 VIVO (video-in and video-out feature).

GeForce FX 5950.

GeForce 8600 GTS (dead).

Radeon HD 3870.

Radeon HD 4870.

Radeon HD 5770. For DIY external GPU for my DELL laptop equipped with ExpressCard slot.

Radeon HD 6950.

Radeon 7950-900Mhz and 7970 1Ghz .

Radeon R9-290 and R9-290X, my R9-290 OC was re-flashed to R9-390X i.e. 10 Mhz higher GPU. LOL.

GeForce GTX 980 Ti OC.

Plan for 14/16 nm era GPU.

-------------------------

For laptop

Radeon 9600M.

GeForce 8600M GT (dead). GeForce 9600M GT (dead, unstable with power management shift issues, thanks ASUS). Both ASUS laptops.

Radeon HD 4650M. Sony laptop.

Radeon HD 5770M. Dell laptop.

Radeon HD 8870M. OC past R9-M370X levels. Samsung laptop with touch screen.

Radeon HD 8570M for 13 inch Ultrabook. Samsung laptop with touch screen.

Plan for 14/16 nm era GPU.

#35 hiphops_savior
Member since 2007 • 8535 Posts

@nyadc: Fair enough, I wouldn't upgrade either if I was in your shoes. Fury X is pretty damn awesome as a graphics card, though.

#36 Wizard
Member since 2015 • 940 Posts

1. Comparing water cooled to reference air cooled.

2. Those are all much older games. Frostbite I know is AMD optimized, not sure about the rest. What about the Witcher 3, Fallout 4, or Syndicate? How well do they stack up?

3. Overclocking - Have you seen firestrike? Nvidia dominated.

4. Pascal is early next year...

Congratulations on the new purchase! Fury X is great, it pushed HBM to the market and it clearly shines at 4k.

#37 Xtasy26
Member since 2008 • 5593 Posts

@wizard said:

1. Comparing water cooled to reference air cooled.

2. Those are all much older games. Frostbite I know is AMD optimized, not sure about the rest. What about the Witcher 3, Fallout 4, or Syndicate? How well do they stack up?

3. Overclocking - Have you seen firestrike? Nvidia dominated.

4. Pascal is early next year...

Congratulations on the new purchase! Fury X is great, it pushed HBM to the market and it clearly shines at 4k.

1. I don't see the problem with that. It was nVidia's choice not to include a water-cooled 980 Ti as standard.

2. I would love to see new benchmarks of those with the new Crimson drivers even some Crossfire/SLI benchmarks.

3. Fury X now can reach up to 1200/600 with memory overclocking unlocked. In any case, I don't really care much about synthetic benchmarks as they don't reflect real world gaming (I am in agreement with the reviewer in the video who talked about it.)

4. Looking forward to it. I would be interested to know how nVidia handles the transition to HBM. AMD has been at it for the last 5+ years working closely with memory makers. I think it was a pretty amazing achievement since it's a radical departure from old school GDDR memory.

And thanks! I really like gaming at 4K even though it's via VSR. Speaking of VSR, the site that posted the video review of the Fury X with the new Crimson drivers has 4K VSR and 1440P VSR benchmarks for those of you who don't have a 4K monitor. I haven't found VSR benchmarks anywhere else on the internet, so that was a nice surprise. Anyway, here are the benchmarks:

I can tell you that there is a significant difference in picture quality at 4K even though the monitor may not be a native 4K monitor.
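For anyone unfamiliar with VSR, it's essentially supersampling: the card renders the frame at a higher resolution and then scales it down to the native panel. A minimal sketch of the pixel math with a plain 2x2 box filter (AMD's actual scaler and filtering aren't documented here; this just shows the idea):

# VSR in a nutshell: render at 4K, average down to a 1080p panel
import numpy as np

render = np.random.rand(2160, 3840, 3)  # stand-in for a 4K rendered frame
native_h, native_w = 1080, 1920
print("rendered pixels per displayed pixel:", (2160 * 3840) / (native_h * native_w))  # 4.0

# average each 2x2 block of rendered pixels into one native pixel
downscaled = render.reshape(native_h, 2, native_w, 2, 3).mean(axis=(1, 3))
print(downscaled.shape)  # (1080, 1920, 3)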

#38  Edited By Doozie78
Member since 2014 • 1123 Posts

Sweet buys on the PC parts, that's sure to be a beastly gaming rig!

The main problem I have with CrossFire is that it rarely works the way you want it to. I have many games on Steam and only a couple of them actually support CrossFire, and even then the performance gain is generally pretty shit. If it does run, then you're almost guaranteed to have some other weird graphical issue. I'm running two R9 290s and most of the time it's not enabled because the games just don't support it properly. CrossFire is overrated.

#39  Edited By Xtasy26
Member since 2008 • 5593 Posts

@ronvalencia said:
@Xtasy26 said:

This will probably be my last and final upgrade in a long, long time as I don't have much time to game anymore. My upgrade history:

2002 - GeForce 4 MX 420 64MB DDR. New Pentium 4 Build.

2006 - XFX GeForce 7600 GT XXX edition.

2008 - Sapphire ATI Radeon HD 4870. New Build.

2011- XFX Radeon HD 6950 Bios Flashed to HD 6970.

2015 - Radeon Fury X + Core i7 6700K. New Build.

As you can see over the past 13 years, I don't go more than 4 years without a Graphics Card or new PC build upgrade. I will probably break the 4 year cycle and won't upgrade in the next 4 years. Anyone want to venture a guess on how long my system will last, I mean will be able to play new games maxed out at least at 1080P with some amounts of FSAA like FXAA? I mean most games today are console ports and console are struggling to push 1080P on current games, imagine games coming out in 2019, they will struggle even more and my Fury X + Core i7 6700K vastly outperforms a PS4 and Xbox One. Also, how often do you guys upgrade?

For desktops

GeForce4 Ti 4200 VIVO (video-in and video-out feature).

GeForce FX 5950.

GeForce 8600 GTS (dead).

Radeon HD 3870.

Radeon HD 4870.

Radeon HD 5770. For DIY external GPU for my DELL laptop equipped with ExpressCard slot.

Radeon HD 6950.

Radeon 7950-900Mhz and 7970 1Ghz .

Radeon R9-290 and R9-290X, my R9-290 OC was re-flashed to R9-390X i.e. 10 Mhz higher GPU. LOL.

GeForce GTX 980 Ti OC.

Plan for 14/16 nm era GPU.

-------------------------

For laptop

Radeon 9600M.

GeForce 8600M GT (dead). GeForce 9600M GT (dead, unstable with power management shift issues, thanks ASUS). Both ASUS laptops.

Radeon HD 4650M. Sony laptop.

Radeon HD 5770M. Dell laptop.

Radeon HD 8870M. OC past R9-M370X levels. Samsung laptop with touch screen.

Radeon HD 8570M for 13 inch Ultrabook. Samsung laptop with touch screen.

Plan for 14/16 nm era GPU.

Wow! Looks like you upgrade every generation! That's WAY more GPU upgrades than I ever did, even if you include the late '90s when I was gaming on an nVidia RIVA 128.

My other laptop GPUs would include:

GeForce Go 7600 (dead). Thank you nVidia for screwing me over and making me lose my 17" Gaming Laptop for your defective die-packaging.

Mobility Radeon HD 3650 - My replacement 17" Gaming Laptop, still works today (so far) even when I overclock it for gaming.

I am pretty shocked that you had two dead nVidia laptop GPUs. Did you ever get any reimbursement or replacement under warranty? nVidia seems to have a high rate of defective laptop GPUs. I got $0 out of it because it was past warranty when it died. :( I am still pretty pissed about it, as it's not like replacing a dead graphics card in a desktop; if your laptop GPU dies, your entire laptop goes. Hence I lost a lot of money, since it was a 17" Gaming Laptop.

#40  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Xtasy26 said:
@ronvalencia said:
@Xtasy26 said:

This will probably be my last and final upgrade in a long, long time as I don't have much time to game anymore. My upgrade history:

2002 - GeForce 4 MX 420 64MB DDR. New Pentium 4 Build.

2006 - XFX GeForce 7600 GT XXX edition.

2008 - Sapphire ATI Radeon HD 4870. New Build.

2011- XFX Radeon HD 6950 Bios Flashed to HD 6970.

2015 - Radeon Fury X + Core i7 6700K. New Build.

As you can see over the past 13 years, I don't go more than 4 years without a Graphics Card or new PC build upgrade. I will probably break the 4 year cycle and won't upgrade in the next 4 years. Anyone want to venture a guess on how long my system will last, I mean will be able to play new games maxed out at least at 1080P with some amounts of FSAA like FXAA? I mean most games today are console ports and console are struggling to push 1080P on current games, imagine games coming out in 2019, they will struggle even more and my Fury X + Core i7 6700K vastly outperforms a PS4 and Xbox One. Also, how often do you guys upgrade?

For desktops

GeForce4 Ti 4200 VIVO (video-in and video-out feature).

GeForce FX 5950.

GeForce 8600 GTS (dead).

Radeon HD 3870.

Radeon HD 4870.

Radeon HD 5770. For DIY external GPU for my DELL laptop equipped with ExpressCard slot.

Radeon HD 6950.

Radeon 7950-900Mhz and 7970 1Ghz .

Radeon R9-290 and R9-290X, my R9-290 OC was re-flashed to R9-390X i.e. 10 Mhz higher GPU. LOL.

GeForce GTX 980 Ti OC.

Plan for 14/16 nm era GPU.

-------------------------

For laptop

Radeon 9600M.

GeForce 8600M GT (dead). GeForce 9600M GT (dead, unstable with power management shift issues, thanks ASUS). Both ASUS laptops.

Radeon HD 4650M. Sony laptop.

Radeon HD 5770M. Dell laptop.

Radeon HD 8870M. OC past R9-M370X levels. Samsung laptop with touch screen.

Radeon HD 8570M for 13 inch Ultrabook. Samsung laptop with touch screen.

Plan for 14/16 nm era GPU.

Wow! Looks like you upgrade every generation! That's WAY more GPU upgrades than I ever did, even if you include the late '90s when I was gaming on an nVidia RIVA 128.

My other laptop GPUs would include:

GeForce Go 7600 (dead). Thank you nVidia for screwing me over and making me lose my 17" Gaming Laptop for your defective die-packaging.

Mobility Radeon HD 3650 - My replacement 17" Gaming Laptop, still works today (so far) even when I overclock it for gaming.

I am pretty shocked that you had two dead nVidia laptop GPUs. Did you ever get any reimbursement or replacement under warranty? nVidia seems to have a high rate of defective laptop GPUs. I got $0 out of it because it was past warranty when it died. :( I am still pretty pissed about it, as it's not like replacing a dead graphics card in a desktop; if your laptop GPU dies, your entire laptop goes. Hence I lost a lot of money, since it was a 17" Gaming Laptop.

Asus G1S 15.4 inch laptop: the 8600M GT GDDR3 256MB + motherboard was replaced with a 9500M GS DDR2 512MB + Asus G1SN motherboard. The model number went up with slower memory, i.e. it's a con job; DDR2 is cheaper/slower than GDDR3. Used for home. It overheated and cracked the case, and there were problems with the USB ports, which get very hot. The G1S failed in the 18th month of a 24 month warranty.

Work 14 inch laptop: the 9650M DDR2 laptop's motherboard was replaced and that still didn't fix the stability issues with Windows 7 32-bit/64-bit. ASUS said to stick with Vista 32-bit.

I switched to a Sony laptop with a Radeon HD 4650M GDDR3 512MB, used for both work (within a VM) and home. It took me a while to select another Nvidia GPU, i.e. it has issues with multi-monitor and the control panel sometimes crashes. It's running Windows 7 64-bit; I plan to update it to Windows 10 and give it to a family relative.

My next GPU would have the FreeSync standard and hopefully HDMI-HDR (HDMI 2.0a). Nvidia G-Sync is dead, and HDMI 2.0 has been superseded by HDMI-HDR. I plan to purchase a corresponding Ultra HD TV with HDMI-HDR.

#41  Edited By Xtasy26
Member since 2008 • 5593 Posts

@Snugenz said:

The 4GB of VRAM on the Fury is a bit of a downer; recent games have gotten silly with their VRAM usage and my 780's 3GB is constantly maxed, so my next upgrade will be at least 6GB+.

Dat 6700k though...

The 4GB of HBM VRAM has little to no impact on gaming, as explained by AMD's Chief Scientist Richard Huddy, because HBM can do over half a terabyte per second of memory bandwidth. As he put it:

"Effectively get rid of frame buffer size” He said it “exceeds the capability of 8GB or 12GB of memory and the reason for that is that there is so much bandwidth inside HBM that if you have system memory we can swap memory around inside the machine, swap between HBM and system memory and keep the working set in the 4 Gigabytes and it never get’s in the way of the GPU. “

He goes on to say: "What happens is you effectively get rid of the problems of frame buffer size, and the extraordinary result that comes on the back of that when you benchmark a Fiji chip is that, as you start to wind the resolution up higher and higher, you would think that our 4GB would come to the limit of its headroom and become a bottleneck, but far from that, as you wind the resolution up we actually get better and better; we start beating a Titan X, and indeed we consistently beat it if you go to a high enough resolution, so HBM is actually the future of memory."

Full Video Below:

Loading Video...

So, you should have no problems with 4GB of HBM as long as you have a decent amount of system memory. I have 16GB of DDR4, so no problems.

Source
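As a sanity check on the swapping argument, here's a rough back-of-envelope sketch (the PCIe and HBM figures are theoretical peaks, and how much data actually has to move per frame is pure guesswork):

# Rough time to move data between system RAM and the card vs. within HBM (figures approximate)
PCIE3_X16_GB_S = 15.8        # theoretical PCIe 3.0 x16 throughput
HBM1_GB_S = 512.0            # Fury X local memory bandwidth
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

for mb in (64, 256, 1024):
    over_pcie_ms = mb / 1024 / PCIE3_X16_GB_S * 1000
    within_hbm_ms = mb / 1024 / HBM1_GB_S * 1000
    print(f"{mb:5d} MB: {over_pcie_ms:6.2f} ms over PCIe, {within_hbm_ms:5.2f} ms in HBM "
          f"(frame budget {FRAME_BUDGET_MS:.1f} ms)")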