Nvidia are desperate and playing dirty

This topic is locked from further discussion.


#1  Edited By Gue1
Member since 2004 • 12171 Posts

With DX12, all of a sudden AMD cards are superior to Nvidia's, and now Nvidia is going around pressuring people who do benchmarks to turn off certain settings because it's not fair... And on top of that, Intel is now fully on board with AMD's FreeSync while support for G-Sync is nowhere to be seen. <- http://www.maximumpc.com/intel-pledges-support-for-freesync-where-does-that-leave-g-sync/

-

Oxide Developer on Nvidia's request to turn off certain settings:

“There is no war of words between us and Nvidia. Nvidia made some incorrect statements, and at this point they will not dispute our position if you ask their PR. That is, they are not disputing anything in our blog. I believe the initial confusion was because Nvidia PR was putting pressure on us to disable certain settings in the benchmark, when we refused, I think they took it a little too personally.”

“Personally, I think one could just as easily make the claim that we were biased toward Nvidia as the only ‘vendor’ specific code is for Nvidia where we had to shutdown Async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn’t really have Async Compute so I don’t know why their driver was trying to expose that. The only other thing that is different between them is that Nvidia does fall into Tier 2 class binding hardware instead of Tier 3 like AMD which requires a little bit more CPU overhead in D3D12, but I don’t think it ended up being very significant. This isn’t a vendor specific path, as it’s responding to capabilities the driver reports.

http://www.dsogaming.com/news/oxide-developer-nvidia-was-putting-pressure-on-us-to-disable-certain-settings-in-the-benchmark/
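For anyone curious what the "vendor specific path" Oxide describes might look like in practice, here is a rough, hypothetical D3D12/DXGI sketch (the structure and names below, like PickRenderPath, are my own illustration, not Oxide's code): read the adapter's vendor ID and the capabilities the driver reports, then decide whether to take the async compute path and which binding tier to assume.

```cpp
// Hypothetical sketch (not Oxide's actual code): pick a rendering path from the
// DXGI vendor ID plus the capabilities the D3D12 driver reports.
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>

constexpr UINT kVendorIdNvidia = 0x10DE;

struct RenderPathConfig {
    bool useAsyncCompute;  // submit some compute work on a separate compute queue
    bool hasTier3Binding;  // resource binding tier 3 (a bit less CPU overhead)
};

RenderPathConfig PickRenderPath(IDXGIAdapter1* adapter, ID3D12Device* device)
{
    RenderPathConfig cfg = {};

    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    // D3D12 always lets an app create a compute queue; whether that work truly
    // overlaps with graphics depends on the hardware and driver. A vendor ID
    // check like this is what "vendor specific path" means in the quote above:
    // turn the async path off where it hurts instead of helps.
    cfg.useAsyncCompute = (desc.VendorId != kVendorIdNvidia);

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts)))) {
        // The "Tier 2 vs Tier 3 class binding hardware" part of the quote.
        cfg.hasTier3Binding =
            (opts.ResourceBindingTier >= D3D12_RESOURCE_BINDING_TIER_3);
    }
    return cfg;
}
```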

-


#2 FireEmblem_Man
Member since 2004 • 20389 Posts

They're not very desperate when they own 80% of the graphics card market. But their damage control is funny.


#3 Wickerman777
Member since 2013 • 2164 Posts

They're desperate? Don't they have something like 80% market share?


#4 MlauTheDaft
Member since 2011 • 5189 Posts

FreeSync is not really AMD's; it's part of the DisplayPort spec.


#5  Edited By ronvalencia
Member since 2008 • 29612 Posts

@MlauTheDaft said:

FreeSync is not really AMD's; it's part of the DisplayPort spec.

FreeSync was designed by AMD and given to VESA.


#6  Edited By MlauTheDaft
Member since 2011 • 5189 Posts

@ronvalencia: True that. Guess I was approaching spin there, and I apologize. Good on AMD, in fact.

In regard to Nvidia and DX12, I heard the AA thing was a matter of a driver bug, and I'd like to see more benchmarks than one game that was previously based on Mantle. I mean, either NV has dun goofed really badly or there's some "wait and see" to be done.


#7 Bravo632
Member since 2015 • 207 Posts

I might be upgrading my graphics card sometime soon; any recommendations? I want something very good and easy with driver updates. I also hate tweaking, overclocking, etc.


#8  Edited By ronvalencia
Member since 2008 • 29612 Posts
@Gue1 said:

With DX12, all of a sudden AMD cards are superior to Nvidia's, and now Nvidia is going around pressuring people who do benchmarks to turn off certain settings because it's not fair... And on top of that, Intel is now fully on board with AMD's FreeSync while support for G-Sync is nowhere to be seen. <- http://www.maximumpc.com/intel-pledges-support-for-freesync-where-does-that-leave-g-sync/

-

Oxide Developer on Nvidia's request to turn off certain settings:

“There is no war of words between us and Nvidia. Nvidia made some incorrect statements, and at this point they will not dispute our position if you ask their PR. That is, they are not disputing anything in our blog. I believe the initial confusion was because Nvidia PR was putting pressure on us to disable certain settings in the benchmark, when we refused, I think they took it a little too personally.”

“Personally, I think one could just as easily make the claim that we were biased toward Nvidia as the only ‘vendor’ specific code is for Nvidia where we had to shutdown Async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn’t really have Async Compute so I don’t know why their driver was trying to expose that. The only other thing that is different between them is that Nvidia does fall into Tier 2 class binding hardware instead of Tier 3 like AMD which requires a little bit more CPU overhead in D3D12, but I don’t think it ended up being very significant. This isn’t a vendor specific path, as it’s responding to capabilities the driver reports.

http://www.dsogaming.com/news/oxide-developer-nvidia-was-putting-pressure-on-us-to-disable-certain-settings-in-the-benchmark/

-

NVIDIA is just ticking the box for async compute without any real practical performance behind it.

It's the same bullshit tick-box support as the GeForce 7800/7900 series GPUs and their 32-bit FP shader math.

As for DirectX 12's ROVs (part of feature level 12_1), here is recent testing done by Christophe Riccio:

It seems that shader invocation ordering is proportionally a lot more expensive on GM204 than S.I. or HSW.

— Christophe Riccio (@g_truc) March 26, 2015

https://twitter.com/g_truc/status/581224843556843521

AMD has already enabled the ROVs feature in its OpenGL driver.
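For reference, this is roughly how an application would ask the driver whether those 12_1 features are exposed at all; a minimal sketch of my own, and having the box ticked says nothing about how fast the feature actually runs, which is Riccio's point above.

```cpp
// Minimal sketch: query whether the feature level 12_1 items mentioned above
// (ROVs, conservative rasterization) are exposed by the D3D12 driver.
// Illustrative only; exposure says nothing about performance.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

void ReportFeatureLevel12_1Caps(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &opts, sizeof(opts))))
        return;

    std::printf("ROVs supported: %s\n", opts.ROVsSupported ? "yes" : "no");
    std::printf("Conservative rasterization tier: %d\n",
                static_cast<int>(opts.ConservativeRasterizationTier));
}
```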

@bravo632 said:

I might be upgrading my graphics card sometime soon; any recommendations? I want something very good and easy with driver updates. I also hate tweaking, overclocking, etc.

1. Set your budget.

2. Find the best GPU for your budget.


#9  Edited By Idontremember
Member since 2003 • 965 Posts

Nvidia isn't desperate. They are pissed that Ashes runs like crap since it's an AMD-backed game (of course, when Nvidia does the same thing, they don't mind).
The only problem is that it's the first representative DX12 benchmark of a real game.

As for async compute, I'm sure they aren't that worried.


#10 NyaDC
Member since 2014 • 8006 Posts

Nvidia isn't desperate; they're just money-grubbing, monopoly-wanting assholes.


#11 ReadingRainbow4
Member since 2012 • 18733 Posts

I just want some TVs to come out that have built-in G-Sync; I don't understand why that has to be such a damn problem for them.


#12 NyaDC
Member since 2014 • 8006 Posts

@ReadingRainbow4 said:

I just want some TVs to come out that have built-in G-Sync; I don't understand why that has to be such a damn problem for them.

@nyadc said:

Nvidia isn't desperate; they're just money-grubbing, monopoly-wanting assholes.

^


#13  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

@nyadc said:
@ReadingRainbow4 said:

I just want some TVs to come out that have built-in G-Sync; I don't understand why that has to be such a damn problem for them.

@nyadc said:

Nvidia isn't desperate; they're just money-grubbing, monopoly-wanting assholes.

^

Well, it could be FreeSync as well; as long as the results are nearly identical, I don't really care what brand it is.


#14 NyaDC
Member since 2014 • 8006 Posts

@ReadingRainbow4 said:
@nyadc said:
@ReadingRainbow4 said:

I just want some TVs to come out that have built-in G-Sync; I don't understand why that has to be such a damn problem for them.

@nyadc said:

Nvidia isn't desperate; they're just money-grubbing, monopoly-wanting assholes.

^

Well, it could be FreeSync as well; as long as the results are nearly identical, I don't really care what brand it is.

They're just an anti-consumer company, man; that's really all there is to it.

My monitor has FreeSync; the Asus version of essentially the same monitor has G-Sync and costs $100 more because of it. They're a horrible company that creates great things.


#15 clyde46
Member since 2005 • 49061 Posts

Why would you want a TV with G/Free-sync?


#16  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

@clyde46 said:

Why would you want a TV with G/Free-sync?

Because why not? I enjoy having more real estate and generally lie back when I game. There shouldn't be an issue adding it to televisions; hell, it might even help with console games and the screen tearing that's sometimes prevalent on them as well.

It's just pretty cool tech that I feel should be more mainstream. There's no reason not to do it.


#17 clyde46
Member since 2005 • 49061 Posts

@ReadingRainbow4 said:
@clyde46 said:

Why would you want a TV with G/Free-sync?

Because why not? I enjoy having more real estate and generally lie back when I game. There shouldn't be an issue adding it to televisions; hell, it might even help with console games and the screen tearing that's sometimes prevalent on them as well.

It's just pretty cool tech that I feel should be more mainstream. There's no reason not to do it.

You'd have better luck buying better hardware than G/Free-sync ever coming to a TV near you.


#18 ReadingRainbow4
Member since 2012 • 18733 Posts

@clyde46 said:
@ReadingRainbow4 said:
@clyde46 said:

Why would you want a TV with G/Free-sync?

Because why not? I enjoy having more real estate and generally lie back when I game. There shouldn't be an issue adding it to televisions; hell, it might even help with console games and the screen tearing that's sometimes prevalent on them as well.

It's just pretty cool tech that I feel should be more mainstream. There's no reason not to do it.

You'd have better luck buying better hardware than G/Free-sync ever coming to a TV near you.

And that's exactly what should change. There are a lot of people who also use their televisions for PC usage, m8.


#19 Ten_Pints
Member since 2014 • 4072 Posts

@clyde46 said:
@ReadingRainbow4 said:
@clyde46 said:

Why would you want a TV with G/Free-sync?

Because why not? I enjoy having more real estate and generally lie back when I game. There shouldn't be an issue adding it to televisions; hell, it might even help with console games and the screen tearing that's sometimes prevalent on them as well.

It's just pretty cool tech that I feel should be more mainstream. There's no reason not to do it.

You'd have better luck buying better hardware than G/Free-sync ever coming to a TV near you.

Shame you don't really get 40" monitors, at least not with FreeSync.


#20  Edited By ShadowDeathX
Member since 2006 • 11699 Posts

@ten_pints said:
@clyde46 said:
@ReadingRainbow4 said:
@clyde46 said:

Why would you want a TV with G/Free-sync?

Because why not? I enjoy having more real estate and generally lie back when I game. There shouldn't be an issue adding it to televisions; hell, it might even help with console games and the screen tearing that's sometimes prevalent on them as well.

It's just pretty cool tech that I feel should be more mainstream. There's no reason not to do it.

You'd have better luck buying better hardware than G/Free-sync ever coming to a TV near you.

Shame you don't really get 40" monitors, at least not with FreeSync.

The Wasabi Mango UHD420 is a 42-inch 4K 10-bit monitor with FreeSync support.


#21 ReadingRainbow4
Member since 2012 • 18733 Posts

@ShadowDeathX said:
@ten_pints said:
@clyde46 said:
@ReadingRainbow4 said:
@clyde46 said:

Why would you want a TV with G/Free-sync?

Because why not? I enjoy having more real estate and generally lie back when I game. There shouldn't be an issue adding it to televisions; hell, it might even help with console games and the screen tearing that's sometimes prevalent on them as well.

It's just pretty cool tech that I feel should be more mainstream. There's no reason not to do it.

You'd have better luck buying better hardware than G/Free-sync ever coming to a TV near you.

Shame you don't really get 40" monitors, at least not with FreeSync.

The Wasabi Mango UHD420 is a 42-inch 4K 10-bit monitor with FreeSync support.

That's actually slick-looking, but I think I'll hold off a bit. I appreciate how it resembles a television more than a monitor.


#22 HalcyonScarlet
Member since 2011 • 13838 Posts

Nvidia need to look competitive, but I doubt they're desperate.


#23 GarGx1
Member since 2011 • 10934 Posts

It's a little early to be condemning or praising anyone for DX12 performance; all we have is one benchmark, from an alpha game being built on AMD hardware. My suggestion would be to wait for more before making judgements. There have always been games that run better on one manufacturer's hardware when compared to the other's, and DX12 will not change that.

As for Nvidia's marketing department, that's a scummy move for anyone to pull.

I'm not sure what support is needed for G-Sync/FreeSync. All you need to do, with a monitor that supports either, is switch off V-Sync, and screen tearing is a thing of the past.


#24  Edited By deactivated-5b0367b217732
Member since 2014 • 1697 Posts

@bravo632 said:

I might be upgrading my graphics card sometime soon; any recommendations? I want something very good and easy with driver updates. I also hate tweaking, overclocking, etc.

Depends on your budget and the resolution you play at (I only play at 1080p).

I'm getting an R9 380 4GB next week; it should cost me a little over 200 €. If you don't like to tweak, get one of the factory-overclocked ones. It will gain you a couple more frames, at least.

If you wanna go over 300, maybe an R9 390.


#25  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Gue1:

https://www.youtube.com/watch?v=zsjmLNZtvxk

Apparently, the video released by Nvidia was sped up to make it look like the game could run at 60 FPS. If you listen to the thugs talking in the background, you can hear their voices going at chipmunk speed. They just used the music to cover it up.

From http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/1200#post_24356995

The full reply from Oxide.

Oxide:

Wow, there are lots of posts here, so I'll only respond to the last one. The interest in this subject is higher then we thought. The primary evolution of the benchmark is for our own internal testing, so it's pretty important that it be representative of the gameplay. To keep things clean, I'm not going to make very many comments on the concept of bias and fairness, as it can completely go down a rat hole.

Certainly I could see how one might see that we are working closer with one hardware vendor then the other, but the numbers don't really bare that out. Since we've started, I think we've had about 3 site visits from NVidia, 3 from AMD, and 2 from Intel ( and 0 from Microsoft, but they never come visit anyone ;(). Nvidia was actually a far more active collaborator over the summer then AMD was, If you judged from email traffic and code-checkins, you'd draw the conclusion we were working closer with Nvidia rather than AMD As you've pointed out, there does exist a marketing agreement between Stardock (our publisher) for Ashes with AMD. But this is typical of almost every major PC game I've ever worked on (Civ 5 had a marketing agreement with NVidia, for example). Without getting into the specifics, I believe the primary goal of AMD is to promote D3D12 titles as they have also lined up a few other D3D12 games.

If you use this metric, however, given Nvidia's promotions with Unreal (and integration with Gameworks) you'd have to say that every Unreal game is biased, not to mention virtually every game that's commonly used as a benchmark since most of them have a promotion agreement with someone. Certainly, one might argue that Unreal being an engine with many titles should give it particular weight, and I wouldn't disagree. However, Ashes is not the only game being developed with Nitrous. It is also being used in several additional titles right now, the only announced one being the Star Control reboot. (Which I am super excited about! But that's a completely other topic ).

Personally, I think one could just as easily make the claim that we were biased toward Nvidia as the only 'vendor' specific code is for Nvidia where we had to shutdown async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn't really have Async Compute so I don't know why their driver was trying to expose that. The only other thing that is different between them is that Nvidia does fall into Tier 2 class binding hardware instead of Tier 3 like AMD which requires a little bit more CPU overhead in D3D12, but I don't think it ended up being very significant. This isn't a vendor specific path, as it's responding to capabilities the driver reports.

From our perspective, one of the surprising things about the results is just how good Nvidia's DX11 perf is. But that's a very recent development, with huge CPU perf improvements over the last month. Still, DX12 CPU overhead is still far far better on Nvidia, and we haven't even tuned it as much as DX11. The other surprise is that of the min frame times having the 290X beat out the 980 Ti (as reported on Ars Techinica). Unlike DX11, minimum frame times are mostly an application controlled feature so I was expecting it to be close to identical. This would appear to be GPU side variance, rather then software variance. We'll have to dig into this one.

I suspect that one thing that is helping AMD on GPU performance is D3D12 exposes Async Compute, which D3D11 did not. Ashes uses a modest amount of it, which gave us a noticeable perf improvement. It was mostly opportunistic where we just took a few compute tasks we were already doing and made them asynchronous, Ashes really isn't a poster-child for advanced GCN features.

Our use of Async Compute, however, pales with comparisons to some of the things which the console guys are starting to do. Most of those haven't made their way to the PC yet, but I've heard of developers getting 30% GPU performance by using Async Compute. Too early to tell, of course, but it could end being pretty disruptive in a year or so as these GCN built and optimized engines start coming to the PC. I don't think Unreal titles will show this very much though, so likely we'll have to wait to see. Has anyone profiled Ark yet?

In the end, I think everyone has to give AMD alot of credit for not objecting to our collaborative effort with Nvidia even though the game had a marketing deal with them. They never once complained about it, and it certainly would have been within their right to do so. (Complain, anyway, we would have still done it, )

--

P.S. There is no war of words between us and Nvidia. Nvidia made some incorrect statements, and at this point they will not dispute our position if you ask their PR. That is, they are not disputing anything in our blog. I believe the initial confusion was because Nvidia PR was putting pressure on us to disable certain settings in the benchmark, when we refused, I think they took it a little too personally.
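To make the async compute part of that concrete, here is a bare-bones, generic D3D12 sketch (my own illustration of standard D3D12 usage, not the Nitrous engine's code): the engine creates a second command queue of type COMPUTE, submits compute command lists to it alongside the graphics queue, and synchronizes with a fence.

```cpp
// Generic D3D12 async compute sketch (my own illustration, not Nitrous/Ashes
// code): a dedicated compute queue runs alongside the direct (graphics) queue,
// with a fence so the graphics work waits only for the compute result it uses.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

struct AsyncComputeQueues {
    ComPtr<ID3D12CommandQueue> direct;
    ComPtr<ID3D12CommandQueue> compute;
    ComPtr<ID3D12Fence>        fence;
    UINT64                     fenceValue = 0;
};

bool CreateQueues(ID3D12Device* device, AsyncComputeQueues& q)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    if (FAILED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&q.direct))))
        return false;

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // the "async" queue
    if (FAILED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&q.compute))))
        return false;

    return SUCCEEDED(device->CreateFence(0, D3D12_FENCE_FLAG_NONE,
                                         IID_PPV_ARGS(&q.fence)));
}

void SubmitFrame(AsyncComputeQueues& q,
                 ID3D12CommandList* computeWork,
                 ID3D12CommandList* graphicsWork)
{
    // Kick off the compute work; it can overlap with other GPU work in flight.
    q.compute->ExecuteCommandLists(1, &computeWork);
    q.compute->Signal(q.fence.Get(), ++q.fenceValue);

    // The graphics queue waits on the GPU, not the CPU, for that result.
    q.direct->Wait(q.fence.Get(), q.fenceValue);
    q.direct->ExecuteCommandLists(1, &graphicsWork);
}
```

Whether that overlap actually buys anything on a given GPU is exactly what the argument above is about.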

AMD's reply on this issue from https://www.reddit.com/r/AdvancedMicroDevices/comments/3iwn74/kollock_oxide_games_made_a_post_discussing_dx12/cul9auq

Oxide effectively summarized my thoughts on the matter. NVIDIA claims "full support" for DX12, but conveniently ignores that Maxwell is utterly incapable of performing asynchronous compute without heavy reliance on slow context switching.

GCN has supported async shading since its inception, and it did so because we hoped and expected that gaming would lean into these workloads heavily. Mantle, Vulkan and DX12 all do. The consoles do (with gusto). PC games are chock full of compute-driven effects.

If memory serves, GCN has higher FLOPS/mm2 than any other architecture, and GCN is once again showing its prowess when utilized with common-sense workloads that are appropriate for the design of the architecture.

Basically, sites like AnandTech misled everyone when they stated that Maxwell supported async compute.

From http://forums.anandtech.com/showpost.php?p=37649656&postcount=246

VR pipeline latency

AMD: 10 ms

NV: 25 ms


#26 remiks00
Member since 2006 • 4249 Posts

@ronvalencia: Great post, ron. It really shows how shady Nvidia can be. I can actually hear the chipmunk voices in the video. smfh...


#28  Edited By lamprey263
Member since 2006 • 45482 Posts

Not surprised. AMD has off and on stated that they think Nvidia is using their GameWorks developer support program to custom-tailor game code to specifically utilize Nvidia proprietary code and hardware while sabotaging the performance of anybody running it on an AMD system. And there have been issues with these GameWorks titles; however, they sometimes appear to affect everyone equally, regardless of whether they're using Nvidia or AMD hardware.

They're also not above appearing petty. I recall that after it was decided that all the next-gen systems would use AMD technology, Nvidia seemed rather bitter about it in their press statements. It goes without saying that a sufficiently pimped-out PC runs circles around console tech, but they didn't waste any time reminding people about it again and again and again. Can't blame MS for not wanting to partner with them anymore after the Xbox fiasco; they're the whole reason MS couldn't drop the original Xbox's price, because they wouldn't negotiate cheaper GPUs. They also made MS pay handsomely for Xbox backwards compatibility on the Xbox 360. Sony also forked over a fortune to Nvidia for their part in the jointly developed RSX "Reality Synthesizer" GPU used in the PS3.


#29  Edited By ronvalencia
Member since 2008 • 29612 Posts

@lamprey263 said:

Not surprised. AMD has off and on stated that they think Nvidia is using their GameWorks developer support program to custom-tailor game code to specifically utilize Nvidia proprietary code and hardware while sabotaging the performance of anybody running it on an AMD system. And there have been issues with these GameWorks titles; however, they sometimes appear to affect everyone equally, regardless of whether they're using Nvidia or AMD hardware.

They're also not above appearing petty. I recall that after it was decided that all the next-gen systems would use AMD technology, Nvidia seemed rather bitter about it in their press statements. It goes without saying that a sufficiently pimped-out PC runs circles around console tech, but they didn't waste any time reminding people about it again and again and again. Can't blame MS for not wanting to partner with them anymore after the Xbox fiasco; they're the whole reason MS couldn't drop the original Xbox's price, because they wouldn't negotiate cheaper GPUs. They also made MS pay handsomely for Xbox backwards compatibility on the Xbox 360. Sony also forked over a fortune to Nvidia for their part in the jointly developed RSX "Reality Synthesizer" GPU used in the PS3.

The difference is that Ashes of the Singularity's source code is equally available to Intel, AMD, NVIDIA and Microsoft.

In general, games with NVIDIA GameWorks restrict source code access for non-NVIDIA vendors, e.g. Intel and AMD.

As for The Witcher 3 PC's tessellation issue, some AMD GCN owners have forgotten the driver's tessellation level override setting from the Crysis 2 NVIDIA-edition days. This wouldn't be a big issue if the newbies remembered that AMD driver setting from Crysis 2 NVIDIA edition.

As a result of the backlash, Nvidia changed its policy to allow game developers access to the source code as of mid April of last year. However according to AMD, this did not alter the situation, as game developers engaged in the Nvidia GameWorks program were still not allowed to work with AMD to optimize the code for their hardware. Something which Nvidia initially denied but later Nvidia’s Tom Petersen and Rev Lebaradian admitted. Witcher 3 developers, CD Projekt Red, reaffirmed this again two days ago.

Read more: http://wccftech.com/nvidia-responds-...#ixzz3jNVQBe3n

"We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekendwhich substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible," read a Nvidia statement.

"Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved."

"In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.”

- See more at: http://www.gamewatcher.com/news/2013....yBaPEoQO.dpuf


#30 jhcho2
Member since 2004 • 5103 Posts

@Gue1 said:

With DX12, all of a sudden AMD cards are superior to Nvidia's, and now Nvidia is going around pressuring people who do benchmarks to turn off certain settings because it's not fair... And on top of that, Intel is now fully on board with AMD's FreeSync while support for G-Sync is nowhere to be seen. <- http://www.maximumpc.com/intel-pledges-support-for-freesync-where-does-that-leave-g-sync/

A non-issue. When the next GTX whatever comes out, it'll be superior to AMD cards anyway, for 6 months or so.


#31  Edited By ronvalencia
Member since 2008 • 29612 Posts

@jhcho2 said:
@Gue1 said:

With DX12, all of a sudden AMD cards are superior to Nvidia's, and now Nvidia is going around pressuring people who do benchmarks to turn off certain settings because it's not fair... And on top of that, Intel is now fully on board with AMD's FreeSync while support for G-Sync is nowhere to be seen. <- http://www.maximumpc.com/intel-pledges-support-for-freesync-where-does-that-leave-g-sync/

A non-issue. When the next GTX whatever comes out, it'll be superior to AMD cards anyway, for 6 months or so.

NVIDIA doesn't have a monopoly on the fab process tech shift.

The problem with the "superior to AMD cards anyway, for 6 months or so" statement is that the 2013 R9 290X comes back to bite the newer Maxwell v2 GPUs.

Like Kepler, Maxwell v2 has built-in design obsolescence, which is good for NVIDIA's shareholder value.


#32 isturbo1984
Member since 2015 • 660 Posts

This is where the gamer in me stops giving a ****. Video game publishers are one thing... graphics card manufacturers are another.


#33  Edited By GhoX
Member since 2006 • 6267 Posts

Nvidia is not desperate, but certainly... over-protective. I realise it's necessary to be commercial, but I don't like how overly commercial they are getting.

Ultimately though, I still find brand loyalty worthless. When I upgrade, I will go with whichever card is practically superior, regardless of how many dirty tricks were involved in weakening its opponents. I went with AMD during Nvidia's Fermi fail, and then went with Nvidia when their flagships were simply superior and more practical. By the time I upgrade again, AMD may still have the upper hand, but it's quite likely that Nvidia will have regained its superiority by then.


#34 superbuuman
Member since 2010 • 6400 Posts

As long as the results translate to games... c'mon, R9 Nano. :P


#35  Edited By ronvalencia
Member since 2008 • 29612 Posts

@GhoX said:

Nvidia is not desperate, but certainly... over-protective. I realise it's necessary to be commercial, but I don't like how overly commercial they are getting.

Ultimately though, I still find brand loyalty worthless. When I upgrade, I will go with whichever card is practically superior, regardless of how many dirty tricks were involved in weakening its opponents. I went with AMD during Nvidia's Fermi fail, and then went with Nvidia when their flagships were simply superior and more practical. By the time I upgrade again, AMD may still have the upper hand, but it's quite likely that Nvidia will have regained its superiority by then.

Your "it's quite likely that nvidia will have regained its superiority by then" assertion is an example of brand loyalty. LOL.


#36 LegatoSkyheart
Member since 2009 • 29733 Posts

Herp. What an amazing hiccup from Nvidia.

Caught with their pants down, and they ask everyone else to compare without pants. How funny.

(I'm rocking a GTX 970.)