gpuguru's forum posts


#1 gpuguru
Member since 2016 • 30 Posts

I don't know why people are harping on Mantle. It's been handed over to the Khronos Group to be implemented in Vulkan, and I think that was a smart move by AMD. I hope to see Vulkan-based games coming out in the next couple of years; Valve has even talked about using Vulkan. It also helped push Microsoft to come up with DX12 much faster. For anyone interested in what is going on with Vulkan, here is what the Khronos Group stated last month:

"Vulkan Working Group Update - December 18th 2015
We have some good news and some bad news. The year-end target release date for Vulkan will not be met. However, we are in the home stretch and the release of Vulkan 1.0 is imminent!"

When it comes to AMD, their execution has been poor, especially over the last year and a half. Why did it take them until June 2015 to release a refreshed Hawaii core with the R9 300 series? AMD should have been prepping it for release in fall 2014; that's inexcusable. I could understand the Fury being released in the summer given the nature of the HBM technology they used and how difficult it may have been to secure enough supply. Hopefully that changes starting this year with Polaris, and this year will mark the first time AMD's new Radeon Technologies Group releases a set of cards, headed by a guy who is an actual engineer.

Going back to Gameworks, that's a far more legitimate argument against nVidia. Don't get me wrong, I actually like the features of Gameworks. I think over the last 10-12 years it's been all about performance and not enough about effects in games. What's the point of having all that power if you don't get to see actual improvements in the visual fidelity of games, instead of just playing them at higher resolutions and frame rates? This is especially true now that games are being made for the lowest common denominator, the Xbox One and PS4, which are far less powerful than the PC hardware we have today. That's a far cry from 10 years ago when the Xbox 360 and PS3 were released, as those GPUs were actually similar to high-end GPUs from ATI and nVidia. You could actually benefit from getting a higher-end GPU, as the bar was set much higher; if you wanted to even play games at 1600x1200, you needed a decent GPU. Right now, even a four-year-old HD 7970 can play many new games at 1080p with reasonably high in-game settings.

The problem with Gameworks is that AMD can't view the .DLL files that are executed to generate the special effects in those games. nVidia strictly forbids AMD from viewing the code, so there is no way for AMD to optimize their drivers for those effects. Whereas with TressFX it's exactly the opposite: nVidia can view the code and optimize their drivers for it. When Tomb Raider launched with TressFX, it initially ran badly on nVidia hardware; now, with proper optimization, it runs just as well on nVidia cards. There are also cases where Gameworks features are not implemented efficiently. Take Call of Duty Ghosts: the hair on the dog was more tessellated than it needed to be, which was a piss-poor way to implement Hairworks, and it crippled performance not only on AMD's hardware but on nVidia's older hardware too. Compare that with TressFX, where Lara Croft's hair was done much more efficiently and ran great on both vendors' hardware. So it's clear who is championing open technology to the benefit of PC gamers and who is not. By the way, below is an interview where AMD's Gaming Scientist talks about how Gameworks works in games and why they can't optimize for it on AMD hardware:

[Embedded video]
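To give a rough sense of why over-tessellation is so punishing, here's a back-of-the-envelope sketch. The patch counts and tessellation factors below are invented for illustration; the key fact is that a tessellation factor of n produces on the order of n² triangles per patch, so cranking the factor up 8x multiplies the geometry load roughly 64x:

```python
# Back-of-the-envelope only: patch counts and factors are invented numbers,
# not measurements from any game. For a triangle patch, a tessellation
# factor of n yields on the order of n^2 triangles.

def approx_triangles(patches: int, tess_factor: int) -> int:
    """Order-of-magnitude triangle count for a set of tessellated patches."""
    return patches * tess_factor ** 2

sane = approx_triangles(patches=10_000, tess_factor=8)
overkill = approx_triangles(patches=10_000, tess_factor=64)
print(f"factor 8:  ~{sane:,} triangles")      # ~640,000
print(f"factor 64: ~{overkill:,} triangles")  # ~40,960,000 -- 64x the work
```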


#2  Edited By gpuguru
Member since 2016 • 30 Posts

@Coseniath said:

@gpuguru said:

I personally like the Fury X, I just think it's overpriced like the 980 Ti, at least when it launched. When they dropped the price to $570, then it was a good buy. Too bad all the PowerColor ones got sold out.

Nice!

@Coseniath said:

@klunt_bumskrint: Are you by any chance defying the divine statement of AMD CTO Joe Macri, who said: "This is an overclocker's dream."?

Shame on you! :P

@gpuguru said:

Yeah, that was definitely misleading. To be fair, nVidia has had their fair share of "gaffes". Anyone remember the wooden screws on the Fermi board that nVidia showed off as an actual card when it wasn't, back when Fermi was behind schedule? Or the recent showing of two "Pascal" GPUs on the PX 2 that were clearly a pair of 980Ms fabbed nearly a year ago!

Hello and welcome to Gamespot forums!

We could list plenty of "gaffes", fails, and misleading information from both companies (like nVidia failing to mention the GTX 970's segmented VRAM (3.5GB @ 196GB/s + 0.5GB @ 28GB/s), or the R9 290X GHz fiasco).

But here we are talking about the Fury X and, obviously, its competitor for the "world's most powerful computer" next to Tianhe-2 (lol), the GTX 980 Ti.

There is no need to defend either company, or to blame the other for things from even five years ago.

Thanks for the welcome! I probably won't be posting much unless it's on topics related to GPUs.

Yes, you were right about nVidia's "gaffe" with the 3.5GB; I forgot about that one.

I do believe we shouldn't give companies a free pass on their products. And I am not talking about the "gaffes"; I am talking about judging products from an unbiased perspective. If you give a company a free pass, they will keep putting out subpar products while new or existing competitors come out with better ones, and the company getting the "free pass" will go extinct as people naturally gravitate towards the best products anyway. For example, AMD learned their lesson with the R9 290X, where the stock cooler wasn't up to snuff and the card ran hot, while the Fury ran much cooler because they launched with a 3rd-party cooler. The Fury X took it a step further by adding liquid cooling as standard, so now it's not only significantly cooler but also significantly quieter than the 980 Ti.


#3  Edited By gpuguru
Member since 2016 • 30 Posts

@04dcarraher said:

You can't go by YouTube videos because of the lack of data: drivers, the setup being used, bias, etc.

While yes, drivers do improve performance, they don't fix the underlying cause of the issue, which is lack of memory. Even with the new Crimson drivers, people still see issues at 4GB+.

Shadow of Mordor has seen performance loss with the latest drivers. At 4K the Fury X saw around 8 fps lower minimums but roughly the same average fps. So the question is whether they countered the frame pacing by limiting minimum frames; instead of getting a near-40 fps low, you're now getting 30 fps.

How much longer is 4GB fine for 1440p? We are already seeing games allocating beyond 3GB at 1080p, and a few games allocating 4GB or more at 1440p.

Using COD's "fake" VRAM requirements as an example of VRAM being overestimated in general is wrong.

I am not going to go by old videos and benchmarks to make judgements. Which would you rather use: old data and old videos that show stuttering, or newly improved drivers that reduce stuttering and newer videos that show no more stuttering? Many review sites mentioned that the drivers were not up to snuff at launch and that AMD told them they were still working on driver optimization, and I believe them. I don't see anything to suggest that the stuttering or the drivers have gotten worse.

Your theory about countering the frame pacing could definitely be a possibility. Only AMD knows what they did to remove stuttering in Shadow of Mordor; that's the only game where I have seen stuttering at 4K on the Fury X. Regardless, it was a good thing.

As for VRAM, 4GB at 1440p is more than fine for this year, and I would argue even next year. 4K, on the other hand, could be an issue with 4GB, or even the 6GB of the 980 Ti, this year and next. Then again, I really don't think the 980 Ti or the Fury X can handle every game at 4K with silky smooth framerates; you really need 2x Fury X or 2x 980 Ti to get 4K at 60 FPS. The problem with multi-GPU configurations is that not all games are supported; as you rightly pointed out in the other thread, the HD 5970 was losing to the HD 7870 in Arkham City because of the game's lack of support for dual-GPU cards like the HD 5970. That means your second GPU could be sitting idle and you would not get the most out of your money. I have never really been a fan of multi-GPU configurations.

As for COD, I would argue it was an outlier, but the point was that just because a game uses more VRAM doesn't mean it's unplayable; I think it comes down to how the memory is managed. That's why even in some games that take more than 3GB, cards like the HD 7970 or the 780 Ti are still playable. Also, with respect to frame pacing, spikes lasting 0.001 seconds won't really be noticeable; you are exaggerating the issue. That's why, even in games like GTA V where the Fury X spikes for a fraction of a second, none of the reviewers complained about noticeable stuttering that made it unplayable, as pointed out in my earlier post. Techspot, Anandtech, Tom's Hardware: none of them complained about GTA V, and the YouTube videos confirm that.
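Since frame pacing keeps coming up, here's a minimal sketch of the two metrics reviewers lean on: average FPS and 99th-percentile frame time. The frame times are invented for illustration; the point is that a single brief spike barely moves the average, which is exactly why reviewers plot percentiles to catch it at all:

```python
# Toy frame-time log in milliseconds: a steady ~16.7 ms (60 FPS) run with a
# single brief spike. The numbers are invented for illustration.
frame_times_ms = [16.7] * 99 + [33.4]  # one frame takes twice as long

n = len(frame_times_ms)
avg_ms = sum(frame_times_ms) / n
avg_fps = 1000 / avg_ms

# 99th-percentile frame time: the value all but the slowest 1% of frames stay under.
p99_ms = sorted(frame_times_ms)[min(int(0.99 * n), n - 1)]

print(f"average: {avg_fps:.1f} FPS ({avg_ms:.2f} ms)")  # ~59.3 FPS -- barely moves
print(f"99th-percentile frame time: {p99_ms:.1f} ms")   # the 33.4 ms spike shows up here
```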

I mostly view the Fury X as AMD building a testbed: implementing 4GB of HBM on an interposer with the GPU to gain the experience and technical know-how to develop future GPUs using HBM. They have done this before when moving to new nodes, such as the 9600 Pro, which used the 0.13-micron process whereas the 9700 Pro was on 0.15 micron. They also did it with the HD 4770, which they used as a testbed for 40nm while the HD 4850 and HD 4870 were still on 55nm. I think this was a smart move. If you look now, they already have Polaris GPUs taped out and were demonstrating them at CES. I expect them to be released earlier than Pascal.

The problem I have with the Fury X is that they launched it at the same price as the 980 Ti with drivers that weren't up to snuff. They should have launched it at the price it has now, with the driver performance improvements that show it's actually ahead of the stock 980 Ti at 4K. And the Fury should have been priced at $500 like it is now; it would have been a much more attractive option. And whose idea was it to include a custom liquid cooler as standard? It added an extra $50-$80 in cost. Not to mention it affected availability, as many people weren't able to buy one; they only fixed that in Q4 2015, and by then many people had already bought the 980 Ti. They should have offered a closed-loop cooler as an option through AIB partners, just like nVidia's AIB partners have done with 980 Ti hybrids.

These are the kinds of business decisions that are killing AMD despite them having stellar products.


#4  Edited By gpuguru
Member since 2016 • 30 Posts

@neogeo419 I have a very simple rule when buying GPUs: get the best-performing single GPU you can within your budget. Since you have a 'firm' $540, the R9 Nitro Fury is right up your alley.

Trust me. The R9 Fury is going to be a major improvement over your HD 7950.


#5 gpuguru
Member since 2016 • 30 Posts

I am going to fix my reverse flux capacitor.


#6  Edited By gpuguru
Member since 2016 • 30 Posts

@04dcarraher said:

That graph was done in August; there are other graphs of GTA 5 with up-to-beta Crimson drivers still showing the same frame-pacing issues, but only when VRAM use is 4GB+.

YouTube is not a correct way to see the issue. Also, with some games at 1440p, 4GB still isn't enough at max settings. You really need to aim for a 980 Ti instead of a Fury; you will regret it when games continue to use more than 4GB.

It's not unusual to have some frame-pacing issues, and AMD has done a lot to improve frame pacing, as shown by newer videos.

YouTube is a great way to see improvements. Let me break this down for you. If you look at Shadow of Mordor at 4K Ultra back in June, below, you can see the stuttering:

[Embedded video]

Even the guy mentions he is looking forward to the issue being solved. Now compare that with the newer drivers:

[Embedded video]

As you can see the performance is vastly improved.

And at 1440p, 4GB is plenty. I am not spending another $100 on a 980 Ti when I can get a Fury X cheaper, or save another $50 and get an R9 Fury.

Also, you have to remember some games show more VRAM usage but still work fine. I believe it was Call of Duty Ghosts/Advanced Warfare that showed something like 6GB of usage on a Titan X, but when you ran it on a 4GB card it showed less memory usage and still ran fine.
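That "shrinking usage" pattern is consistent with an engine treating VRAM like a cache: allocate generously on a big card, evict when the budget is smaller. Here's a toy sketch of the idea; it's my own simplification, not any game's actual code:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache: fills whatever VRAM budget it is given,
    evicting least-recently-used textures when over budget."""

    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB
        self.used_mb = 0

    def request(self, name: str, size_mb: int):
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)
            self.used_mb -= evicted_mb
        self.resident[name] = size_mb
        self.used_mb += size_mb

# The same streaming workload "uses" 6GB on a roomy card but fits in 4GB
# once eviction kicks in -- reported usage alone says little about playability.
for budget in (12_000, 4_000):
    cache = TextureCache(budget)
    for i in range(60):
        cache.request(f"texture_{i}", 100)  # 6GB worth of unique textures
    print(f"{budget} MB budget -> {cache.used_mb} MB resident")
```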

Also, other reviewers are showing that the Fury X runs GTA V fine at 4K.

As they stated:

"No matter how you slice it, the Fury X handles GTA V in 4K quite nicely. The 99th-percentile results track with the FPS results, which is what happens when the frame time plots are generally nice and flat." And those were done with older drivers.

I have also looked at videos on YouTube and haven't seen any frame-pacing issues.


#7 gpuguru
Member since 2016 • 30 Posts

@04dcarraher said:

lol, YouTube is not proof. It's funny that you can't find all the benches showing the frame-pacing issues when VRAM use is 4GB+.

As you can see, the 290X does not spike like the Fury X.

Well, thank you for the graph. But what drivers were they using for the Fury X? Is this from the early benchmarks from when the Fury launched? If so, those were quite immature drivers; plenty of reviews mentioned that the Fury X drivers were not up to snuff when it launched. Other users running the Fury X with newer drivers didn't have any stuttering issues. You can also check YouTube for footage with the newer drivers.

I like using YouTube because it shows actual gameplay video with newer drivers, where stuttering would be noticeable. As I mentioned in my earlier comments, there was stuttering at 4K on Ultra with the Fury X on YouTube, and with newer drivers you no longer see that. You can check YouTube yourself.

And that graph is from GTA V, not Shadow of Mordor. Besides, I am thinking about getting the Acer XG270HU 27" 1ms 144Hz FreeSync monitor to go with the Fury, so this would not be an issue because I'd be playing at 1440p.

Frankly, the current crop of graphics cards, including the 980 Ti, is not suitable for 4K gaming: you can't max games out without getting hiccups in gameplay, certainly not at 60 FPS. You could tone down the settings to get better frame rates, but who wants to tone down settings on a $650 graphics card?

Next gen, with Pascal and Polaris on 14nm, will be much better at 4K.


#8 gpuguru
Member since 2016 • 30 Posts

nVidia is raking in all the $ash!


#9 gpuguru
Member since 2016 • 30 Posts

Looking forward to it. Even though I have the Uncharted games, the last Tomb Raider game was great. I've always been a Tomb Raider fan.


#10  Edited By gpuguru
Member since 2016 • 30 Posts

@04dcarraher said:

You're clearly an alt account; a short YouTube video is not proof.

I don't know why you keep accusing me of being an alt account. The only reason I am in this thread is because I was interested in the R9 Fury, which then led me to the interesting topic of the R9 Fury X as the "World's Most Powerful GPU" in the other thread. THESE WERE THE ONLY TWO FURY TOPICS ON THE FRONT PAGE OF THE PC, AV AND HARDWARE SECTION. I posted in this thread first before I ended up in the other one. Maybe you should do a little bit of thinking before you spout stuff. And yes, you can look at other videos on YouTube where you can see the issue back in June in short videos and compare them with the new ones.

And where is your evidence for the claim that other 4GB cards don't experience the same issue? If you are going to make a claim, at least back it up.

And no, I don't have an R9 Fury, but I am interested in acquiring one, so this card does interest me. From my research, there isn't this so-called stuttering problem with the newer updates (at least nothing visible). You can go to Tom's Hardware and see for yourself.
