Hermits, did you guys know this all along? (Nvidia J'accuse!)

Whistle_Blower

Poll Hermits, did you guys know this all along? (Nvidia J'accuse!) (78 votes)

AMD 63%
NVIDIA 37%

POLL QUESTION: WHO IS MORE CUSTOMER FRIENDLY/ORIENTED?

At what point does it go from "it's just business" to "okay, enough is enough"?

I'm going to wait for Pascal, but I think my next rig will be with AMD. I don't support assholes.


#51  Edited By lundy86_4
Member since 2003 • 62038 Posts

I can't be arsed watching the video, as I'm sick and lazy. Still, I don't base my decision on the company behind the product, but rather the product itself. nVidia are typically more expensive, but I've just had a better experience. I remember constant issues with my 4870 in certain games, and the driver fixes were few and far between.


#52  Edited By NyaDC
Member since 2014 • 8006 Posts

@Juub1990 said:
@nyadc said:

Frame pacing has been fixed on AMD Crossfire solutions for over two years now. Not that it was a huge problem to begin with, but it's been rectified. Not only that, they have achieved considerably lower frame times than Nvidia SLI configurations; they have surpassed Nvidia in this regard.

The 290X released 4 months after the 780 and demolished it; 4 months is hardly a long time for a competitor's answer to a more powerful product. Two weeks after this, Nvidia released the 780 Ti based on GK110. A minor overclock to the 290X, or a factory OC'd model, matched the 780 Ti in performance; it wasn't something substantial, it was there to counter the 290X. From that point it was 10 months until the GTX 980 was released, and it was only about a 12-15% increase in performance over the 780 Ti; then 8 months later the 980 Ti was released, followed by the Fury X a few weeks later.

All Nvidia did was release a redundancy for the 780 Ti when they released the 980. AMD skipped that and instead just rolled with the 290X all the way up until the Fury X, which for all intents and purposes makes more sense; they didn't release a redundant flagship card.

Have you read about the micro stuttering in Crossfire? It was a pretty huge problem. The frame time variance before the frame-pacing drivers was bad. Like really bad. I know, I actually had a pair of 7950s and ended up returning one because it was horrible. Yes, it has been rectified, but it took ages. AMD wasn't even acknowledging the issue before the tests came out on major tech sites.

The R9 290X came out 5 months after the 780 and hardly demolished it. How can you claim the R9 290X demolished the 780 but not claim the 780 Ti demolishes the R9 290X? That's bias at its finest. Also, why are you ignoring the fact that non-reference R9 290Xs weren't available until way after launch? The stock models were bad: loud as hell, with that so-called "uber mode" that made them sound like a plane was about to take off. I know, I attended a private AMD showing in Montreal before the public release.

http://www.alphr.com/graphics-cards/32438/amd-radeon-r9-290x-vs-nvidia-geforce-gtx-780-review

It's an incredibly close-fought battle. Performance and pricing is neck and neck, and it would only take the slightest price cut for either card to take the upper hand.

At the time of writing, Nvidia's GeForce GTX 780 is a little cheaper than its AMD-branded rival, and it also runs cooler and is more power-efficient. If all-out performance is the primary concern, however, there can only be one winner: AMD's Radeon R9 290X edges ahead where it matters, delivering smoother frame rates at higher resolutions – right now, that's the one we'd buy.

The R9 290X gained the upper hand when NVIDIA pretty much left GK110 and prior GPUs in the dust. That is one of the many anti-consumer acts they've committed, but it took over a year for that to happen. The 780 was closer to the R9 290X than the R9 290X was to the 780 Ti. Factor in overclocking and it got even worse: GK110 overclocked like a dream, whereas Hawaii was a step down from Tahiti in the OC department. The R9 290X is closer to the 780 Ti at higher resolutions, supposedly because of its 512-bit bus, but at anything below 4K the 780 Ti pulls ahead. Not by a huge margin, mind you (like 5%), but still ahead. Overclock both and that goes to 10%.

Anyway, my point is simply that most of the time in the past 3 years, NVIDIA dominated the market with AMD playing catch up. You can't seriously call people who buy NVIDIA cards idiots when it is a fact they had superior products over the past 3 years. From the release of the original Titan all the way to the release of the Fury X, NVIDIA always had the best GPU(s), with only late answers from AMD that were quickly sent back to second place by prompt showings from NVIDIA. From Feb 2013 to Nov 2013 it was the Titan/GTX 780. From Nov 2013 to Sep 2014 it was the Titan/Black/780 Ti. From Sep 2014 to today it was the 980/Ti/Titan X. When did AMD last have the most powerful GPU on the market? It's even worse when you consider the fact that NVIDIA also had 1st, 2nd, 3rd and at times even 4th place on lock. AMD's responses are too scarce and not overwhelming enough. They need to step the hell up.

Even now, look at how the 980Ti trounces the Fury X when you factor overclocking.

http://www.maximumpc.com/gtx-980-ti-vs-fury-x-overclocking-showdown/#page-2

http://www.babeltechreviews.com/the-xfx-fury-x-vs-the-gtx-980-ti-showdown-redux-no-3/

Let me address the 290X and the 780/780 Ti first and foremost: the 780 was trounced by the 290X, and while the 780 Ti took the upper hand and beats the 290X, it's by a much smaller margin than the one by which the 290X beat the 780.

Look at the numbers; you don't even need the numbers, just look at the graphs, the colors, and how staggered the performance is.

This is the 290X vs. the 780.

This is the 290X vs. the 780 Ti.

Numbers don't lie: the results for the 780 are staggered all over the place, whereas against the 780 Ti the 290X is nipping at its heels in essentially everything. There's no bias here, you're just up Nvidia's ass; we're talking roughly a +15% average difference for the 780 Ti vs. the 290X and something like a +35% average difference for the 290X vs. the 780.

I had HD 4890 Crossfire, then HD 6870 Crossfire, and am now running R9 290X Crossfire. Frame pacing was an issue, but it was severely overblown and very dependent on the application. It was never problematic enough that I actually gave it a thought while playing games; for the most part it went completely unnoticed. Micro-stutter has always been blown out of proportion for multi-GPU configurations, Nvidia and ATI alike.

In terms of the reference 290Xs, who gives a shit if they get loud? It's enthusiast PC gaming, you should be wearing a headset anyway, and it all gets drowned out. I hate the noise complaint about any card; I find it completely ridiculous, like owning a muscle car and bitching about it being too loud... Really?

Both sides play catch-up back and forth. I don't know what this delusional mindset is that you hold where you think that Nvidia just completely obliterated AMD in performance for three years. They didn't, and they haven't; it's back and forth, back and forth.

AMD is about to best Nvidia again with Polaris, and guess what Nvidia is going to do? Release something more powerful to best that; that's just the way these things go. Also, the Fury X was designed to compete with the 980, and it beats it, not the 980 Ti; Nvidia just dropped the 980 Ti right before the Fury X, and now AMD has to answer to that.


#53  Edited By ronvalencia
Member since 2008 • 29612 Posts

@napo_sp said:

AMD? oh please noobs.... it was ATI vs 3dfx vs SiS vs Matrox.

Oh really?

Competing chipsets against NVIDIA's TNT2

3dfx Voodoo3

Matrox G400

ATI Rage 128

S3 Graphics Savage4


#54  Edited By ronvalencia
Member since 2008 • 29612 Posts

@lundy86_4 said:

I can't be arsed watching the video, as I'm sick and lazy. Still, I don't base my decision on the company behind the product, but rather the product itself. nVidia are typically more expensive, but I've just had a better experience. I remember constant issues with my 4870 in certain games, and the driver fixes were few and far between.

The 4870 has nothing to do with GCN, i.e. it's VLIW vs SIMD. The last VLIW stream processor enabled GPU from NVIDIA was the GeForce FX.

Both AMD and NVIDIA have concluded that VLIW designs are harder to optimise.


#55  Edited By Juub1990
Member since 2013 • 12622 Posts
@nyadc said:

Let me address the 290X and the 780/780 Ti first and foremost: the 780 was trounced by the 290X, and while the 780 Ti took the upper hand and beats the 290X, it's by a much smaller margin than the one by which the 290X beat the 780.

Look at the numbers; you don't even need the numbers, just look at the graphs, the colors, and how staggered the performance is.

This is the 290X vs. the 780.

This is the 290X vs. the 780 Ti.

Numbers don't lie: the results for the 780 are staggered all over the place, whereas against the 780 Ti the 290X is nipping at its heels in essentially everything. There's no bias here, you're just up Nvidia's ass; we're talking roughly a +15% average difference for the 780 Ti vs. the 290X and something like a +35% average difference for the 290X vs. the 780.

I had HD 4890 Crossfire, then HD 6870 Crossfire, and am now running R9 290X Crossfire. Frame pacing was an issue, but it was severely overblown and very dependent on the application. It was never problematic enough that I actually gave it a thought while playing games; for the most part it went completely unnoticed. Micro-stutter has always been blown out of proportion for multi-GPU configurations, Nvidia and ATI alike.

In terms of the reference 290Xs, who gives a shit if they get loud? It's enthusiast PC gaming, you should be wearing a headset anyway, and it all gets drowned out. I hate the noise complaint about any card; I find it completely ridiculous, like owning a muscle car and bitching about it being too loud... Really?

Both sides play catch-up back and forth. I don't know what this delusional mindset is that you hold where you think that Nvidia just completely obliterated AMD in performance for three years. They didn't, and they haven't; it's back and forth, back and forth.

AMD is about to best Nvidia again with Polaris, and guess what Nvidia is going to do? Release something more powerful to best that; that's just the way these things go. Also, the Fury X was designed to compete with the 980, and it beats it, not the 980 Ti; Nvidia just dropped the 980 Ti right before the Fury X, and now AMD has to answer to that.

No need for that. Techpowerup has us covered.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/27.html

9% difference between the 780 Ti and R9 290X (100/92). 7% difference between the R9 290X and 780 (92/86). As I said, the difference was about the same at the time of their release. When you factor in overclocking, the 780 Ti pulls further ahead and the 780 closes in on the R9 290X. GK110 overclocks better than Hawaii. If you say the R9 290X came out and slaughtered the 780, then the 780 Ti absolutely demolished the R9 290X.
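For clarity, those margins come straight from the relative-performance figures quoted in this post (780 Ti = 100, R9 290X = 92, GTX 780 = 86); a quick sketch of the arithmetic in Python:

```python
# Relative-performance summary scores as quoted above (780 Ti = 100, 290X = 92, 780 = 86).
scores = {"GTX 780 Ti": 100, "R9 290X": 92, "GTX 780": 86}

def lead(a, b):
    """Percentage by which card a leads card b."""
    return (scores[a] / scores[b] - 1) * 100

print(f"780 Ti over 290X: {lead('GTX 780 Ti', 'R9 290X'):.1f}%")  # ~8.7%
print(f"290X over 780:    {lead('R9 290X', 'GTX 780'):.1f}%")     # ~7.0%
```

Rounded, those are the 9% and 7% figures above; the point being made is that the two gaps were of similar size at launch.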

If you look at more recent comparisons, the 780 Ti and R9 290X are closer.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/29.html

My point was that, as a consumer-grade GPU, the 780 was the best one for 5 months. When AMD finally arrived, the R9 290X was better, but only marginally. Considering there were only reference models with poor overclocking capabilities, while the 780 had aftermarket models with high overclocking ability, the R9 290X wasn't really a better choice at the time. Throw the 780 Ti in the mix and it's not even fair anymore.

Both sides play catch-up back and forth. I don't know what this delusional mindset is that you hold where you think that Nvidia just completely obliterated AMD in performance for three years. They didn't, and they haven't; it's back and forth, back and forth.

We both know that's a lie considering NVIDIA hasn't played catch up in about 3 years.

Your anecdotes about micro stuttering are irrelevant. The frame time variance was a huge problem at the time. I experienced it firsthand with my pair of 7950s. Some games were downright unplayable.
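For readers unfamiliar with the metric being argued about here: "frame time variance" refers to the spread of individual frame render times rather than the average FPS. Below is a minimal sketch of that kind of calculation in Python, using made-up frame-time samples rather than data from any of the reviews linked in this thread.

```python
# Minimal sketch (not any review site's actual tooling) of how frame-time
# variance / "micro-stutter" is typically quantified: take per-frame render
# times in milliseconds and look at percentiles and frame-to-frame deltas.
# The sample numbers below are made up for illustration.

def frame_time_stats(frame_times_ms):
    """Return average, 99th-percentile frame time, and mean frame-to-frame delta."""
    n = len(frame_times_ms)
    avg = sum(frame_times_ms) / n
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(0.99 * n))]
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    return avg, p99, mean_delta

# Hypothetical captures: smooth pacing vs. alternating short/long frames
smooth  = [16.7, 16.9, 16.5, 16.8, 16.6, 16.7, 16.9, 16.6]
stutter = [8.0, 25.0, 9.0, 24.0, 8.5, 25.5, 9.0, 24.5]

for name, capture in (("smooth", smooth), ("stutter", stutter)):
    avg, p99, mean_delta = frame_time_stats(capture)
    print(f"{name}: avg={avg:.1f} ms, 99th%={p99:.1f} ms, mean delta={mean_delta:.1f} ms")
```

Both hypothetical captures average roughly 60 FPS, but the second one swings wildly from frame to frame, which is why an FPS counter alone could hide the problem that the later frame-pacing drivers addressed.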

I also don't game wearing a headset so noise is a huge issue for me.

I honestly have no dog in this fight but you blatantly denying the truth is a bit ridiculous. How can you claim they are playing catch up back and forth when AMD hasn't had the most powerful GPU in three years? 780 was the best for 5 months. R9 290X was the best for about a week. 780 Ti was the best for 10 months. 980 was the best for 9 months until it was overtaken by the 980 Ti which is currently the fastest consumer-grade GPU on the market. I ask again, when was the last time the best GPU was an AMD? That alone justifies purchasing NVIDIA and those buying them aren't idiots, they're simply looking for the best performance available.


#56 Wasdie  Moderator
Member since 2003 • 53622 Posts

PhysX is basically GPGPU coding done in CUDA. Nobody else bothered to make a PhysX-like engine on OpenCL. That's not Nvidia's fault; there was just no demand. PhysX really can't add much to a game. Its actual physics calculations are pretty basic. It's only good for mass physics calculations, nothing really deep.
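To illustrate what "mass physics calculations" means in practice, here is a small sketch of that kind of bulk, data-parallel update. It is written in plain NumPy rather than CUDA or OpenCL, every number in it is made up, and it is not PhysX code; it only shows the shape of the workload that PhysX offloads to the GPU.

```python
# Illustration only: many simple particles stepped in bulk each frame.
# Real PhysX runs this sort of thing as CUDA kernels on the GPU; NumPy is
# used here just to show the data-parallel shape of the work.
import numpy as np

def step_particles(pos, vel, dt=1.0 / 60.0, gravity=(0.0, -9.81, 0.0)):
    """Advance all particles one frame with basic Euler integration and a ground plane."""
    vel = vel + np.asarray(gravity) * dt   # same simple update applied to every particle
    pos = pos + vel * dt
    below = pos[:, 1] < 0.0                # crude ground collision: clamp and bounce with damping
    pos[below, 1] = 0.0
    vel[below, 1] *= -0.5
    return pos, vel

rng = np.random.default_rng(0)
positions = rng.uniform(0.0, 10.0, size=(100_000, 3))
velocities = np.zeros_like(positions)
for _ in range(120):                       # simulate two seconds at 60 FPS
    positions, velocities = step_particles(positions, velocities)
print(positions.shape, float(positions[:, 1].min()))
```

The point is that the same simple per-particle update is applied to every element at once, which is exactly the kind of work that maps well onto GPU kernels, whether written in CUDA (as PhysX is) or OpenCL.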


#57 deactivated-5d68555a05c4b
Member since 2015 • 1024 Posts

AMD is a better company, yes, but I've had nothing but shit luck with their cards, or more particularly with their software not working, so I stick with Nvidia, as I can actually use their cards reliably.


#58 MuD3
Member since 2011 • 2192 Posts

I wouldn't call them dicks for doing what most businesses do...


#60  Edited By Jereb31
Member since 2015 • 2025 Posts

@Juub1990:

Yikes, SLI hasn't been better than Crossfire since.... oh man, since back around the 4870 X2 (did SLI even exist back then?); Crossfire in my opinion has been the better product since its inception. Very few games have had micro stuttering issues that I have experienced. The only one I can think of was Crysis 2. I was using a pair of 6970s for the past four-ish years and they rocked my face off.

As far as drivers on AMD being bad, man, look no further than the Nvidia reddit page for how dreadful their updates can be. Remember back in the day when Nvidia cards would destroy themselves from overheating!!! It was hilarious and terrible at the same time. How about when Nvidia recently released drivers that made their cards worse (until a fix a couple of weeks later)? Nvidia have had pretty dodgy drivers for a while, just enough grunt in their recent cards for people to ignore it. Not even a lot of grunt: 5-10% more for up to 25-50% more in price on occasion.

Edit: Yes, apparently SLI was around then, and Crossfire has been since around 2004.

Also, this is a good reference for seeing performance:

http://www.tomshardware.com/reviews/gpu-hierarchy,4388.html


#61 Juub1990
Member since 2013 • 12622 Posts

@jereb31 said:

@Juub1990:

Yikes, SLI hasn't been better than Crossfire since.... oh man, since back around the 4870 X2 (did SLI even exist back then?); Crossfire in my opinion has been the better product since its inception. Very few games have had micro stuttering issues that I have experienced. The only one I can think of was Crysis 2. I was using a pair of 6970s for the past four-ish years and they rocked my face off.

As far as drivers on AMD being bad, man, look no further than the Nvidia reddit page for how dreadful their updates can be. Remember back in the day when Nvidia cards would destroy themselves from overheating!!! It was hilarious and terrible at the same time. How about when Nvidia recently released drivers that made their cards worse (until a fix a couple of weeks later)? Nvidia have had pretty dodgy drivers for a while, just enough grunt in their recent cards for people to ignore it. Not even a lot of grunt: 5-10% more for up to 25-50% more in price on occasion.

Edit: Yes, apparently SLI was around then, and Crossfire has been since around 2004.

Also, this is a good reference for seeing performance:

http://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

SLI was considered better than Crossfire prior to August/September 2013. Crossfire has better scaling, and now the micro stuttering issues are much less pronounced (in Tahiti and onwards, at least).

AMD doesn't have bad drivers any more. That was like 10 years ago. They simply release their drivers more slowly than NVIDIA most of the time, but if anything their drivers have been a lot better than NVIDIA's lately. Their cards tend to get better over time. A 7970 is definitely better than a 680 now, and it started out worse.

Still, my point was that in the past three years, if you wanted the best card on the market, you had to go NVIDIA. It's still somewhat the case, but the Fury X isn't a lot worse than the 980 Ti.


#62 HalcyonScarlet
Member since 2011 • 13838 Posts

Did I know? Yes. Are they dicks? Sure.

But it's not my business. I'm not invested like that. When I'm in the market for a product, I go for the one that meets the criteria and has great after-sales support.

Any time AMD are ready to step up in CPUs and GPUs, I'll start buying their stuff.


#63 lundy86_4
Member since 2003 • 62038 Posts

@ronvalencia: What on Earth are you talking about?


#64  Edited By Juub1990
Member since 2013 • 12622 Posts

@lundy86_4 said:

@ronvalencia: What on Earth are you talking about?

He's trying to show he's knowledgeable about tech, as always. Basically he's saying that the architecture you had issues with at the time is different from the one AMD uses now.


#65 leandrro
Member since 2007 • 1644 Posts

@nyadc said:

AMD has always been more consumer friendly and oriented, they have better price/performance ratios and all of their technologies are open source.

I had a 7870 that in theory is faster than my current GTX 660.

I replaced it with the 660 and found that it runs cooler and uses less energy than the 7870,

so an overclocked 660 is still faster and cooler than my old 7870, and I see the same happening with all comparable AMD and Nvidia cards.

It is quite clear that AMD is pushing their cards beyond their limits to claim better price/performance.

I also know that AMD cards like the 7870 have a built-in video encoder/decoder, so they should have lossless capture like ShadowPlay,

but AMD is such a shitty company that it is capable of producing a graphics card but incapable of making the software to run it, and still today you have to lose 10-20 fps on AMD cards to capture video.

Their consumer support in drivers, game settings, and everything related to software is pathetic, not to mention their driver overhead on lower-end CPUs.

It's great to have two companies competing, but AMD couldn't be more incompetent on the software and consumer-support side.


#66  Edited By BassMan
Member since 2002 • 18741 Posts

@nyadc said:
@Juub1990 said:

@nyadc: I disagree with that. If you wanted the best product on the market for the past 3 years or so, you had to go the NVIDIA route.

Yes, NVIDIA are greedy money-grubbers, but I expect nothing else from a multinational. The poster makes it seem like AMD are the good guys when they simply have inferior products in general. They took like 6 months more than NVIDIA to release the R9 series only to be beaten by the 780 Ti like a week later. AMD needs to release better products faster, and NVIDIA won't be able to bully them if they can capture a bigger market share.

I'm not saying you're new to all of this, but you kind of come off like it. Graphics card releases are a ballet: AMD releases the most powerful card, then Nvidia releases one to best them, and vice versa; this is how it has always been. The only thing Nvidia has done better than AMD over the last three years is create more power-efficient cards, and that just went out the window with Polaris, which at 14nm is offering roughly double the performance per watt of what Nvidia currently has available...

The 780 Ti is just a more cost-efficient GK110, in other words Titan architecture, released 9 months after the Titan. The 290X was beaten "like a week later" because GK110 already existed; they just redesigned the card to make it more consumer friendly and affordable. Also, the margin by which the 780 Ti bested the 290X is really nothing to brag about.

What about the comparisons to what AMD currently has available? AMD would look a lot worse than Nvidia. I love how they try to make Nvidia look bad when they are talking about cards that are not yet available. If you want to do comparisons, then wait for the Pascal details. Pretty shady marketing tactics for a company that is supposed to be the good guy and not a dick. Let's not forget about their cheap-shot 4GB campaign after the whole 970 split VRAM reveal.


#67  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@BassMan:

Or how about AMD lying about Mantle being open.... or about them promoting their tech in specific games.

Fact is, both companies do the same things.... AMD isn't so innocent as some make them out to be.


#68  Edited By ronvalencia
Member since 2008 • 29612 Posts

@lundy86_4 said:

@ronvalencia: What on Earth are you talking about?

Have you checked the recent NVIDIA driver forums? NVIDIA has had more driver download withdrawals/driver regressions than AMD.

While playing SWTOR (an MMORPG), my NVIDIA 980 Ti (1) suffered BSODs/crashes to desktop/unrecoverable black windows during alt-tab switches, while my R9-390X (2) did not. Both GPUs were running the same scenes and the same mission start to finish.

I rolled the driver back to the first driver NVIDIA released after Windows 10's launch.

1. MSI's default factory overclock. I have the Windows Event log to prove my NVIDIA 980 Ti suffered BSODs. It has nearly the same stability issues as my last NVIDIA CUDA GPU, i.e. the GeForce 9650M GT. Running SWTOR at 3K resolution.

2. R9-390X "mod" firmware update applied to my MSI R9-290X with factory 1040MHz OC. Running SWTOR at 5760x1920 resolution. Driver is AMD Radeon Software Crimson Edition 15.12.


#69  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:

@BassMan:

Or how about AMD lying about Mantle being open.... or about them promoting their tech in specific games.

Fact is, both companies do the same things.... AMD isn't so innocent as some make them out to be.

Vulkan ~= Mantle for non-AMD GPUs. Valve was able to create their own Vulkan driver for Intel IGPs, i.e. access to Intel GT2/GT3 IGP datasheets is very good.

AMD's shader source code tech is open to other GPU vendors, i.e. Intel and NVIDIA.

NVIDIA has equal access to Tomb Raider 2013's source code.

NVIDIA has equal access to Ashes of Singularity's source code.

NVIDIA lacks transparency with their GameWorks.

Outside of Lintel and Wintel, operating system developers (e.g. AmigaOS 4.x PowerPC) have open access to AMD Radeon's datasheets, while they have zero for NVIDIA.

Try again.


#70  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@ronvalencia said:
@04dcarraher said:

@BassMan:

Or how about AMD lying about Mantle being open.... or about them promoting their tech in specific games.

Fact is, both companies do the same things.... AMD isn't so innocent as some make them out to be.

Vulkan ~= Mantle for non-AMD GPUs. Try again.

lol making excuses..... AMD's Mantle was proprietary.... Them using Mantle as a base for Vulkan has no bearing on what AMD did before


#71  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:
@ronvalencia said:
@04dcarraher said:

@BassMan:

Or how about AMD lying about Mantle being open.... or about them promoting their tech in specific games.

Fact is, both companies do the same things.... AMD isn't so innocent as some make them out to be.

Vulkan ~= Mantle for non-AMD GPUs. Try again.

lol making excuses..... AMD's Mantle was proprietary.... Them using Mantle as a base for Vulkan has no bearing on what AMD did before

LOL Blind fool. Mantle was given to Khronos, hence Vulkan was born.

Khronos modified the Mantle API spec to be MSFT and GPU agnostic.

Read the basics from https://en.wikipedia.org/wiki/Vulkan_(API)

Khronos is making sure legal due diligence has been applied.


#72 Gaming-Planet
Member since 2008 • 21107 Posts

I've always known this. I don't even use Nvidia tech because it is poorly implemented, but I do use their cards.

It comes down to price/performance.


#73 OmegaTau
Member since 2007 • 908 Posts

Even though I have been using Nvidia for years, I always preferred AMD; I was just afraid that games wouldn't run so well, so I would stick with the evil empire of PC gaming.

But I hope AMD and game developers (it's really up to them, because they are greedy as well but we always give them a pass) put a stop to Nvidia's control.


#74 dxmcat
Member since 2007 • 3385 Posts

ATI vs Nvidia

McDonald's vs Five Guys.

:D


#75 jj-josh
Member since 2014 • 266 Posts

@Juub1990 said:
@nyadc said:

I'm not saying you're new to all of this, but you kind of come off like it. Graphics card releases are a ballet: AMD releases the most powerful card, then Nvidia releases one to best them, and vice versa; this is how it has always been. The only thing Nvidia has done better than AMD over the last three years is create more power-efficient cards, and that just went out the window with Polaris, which at 14nm is offering roughly double the performance per watt of what Nvidia currently has available...

The 780 Ti is just a more cost-efficient GK110, in other words Titan architecture, released 9 months after the Titan. The 290X was beaten "like a week later" because GK110 already existed; they just redesigned the card to make it more consumer friendly and affordable. Also, the margin by which the 780 Ti bested the 290X is really nothing to brag about.

I've been playing on PC since the late 90's so no, I'm not new to any of this. AMD used to be much more competitive back then. The 680 was faster than the 7970 for a while and the 670 was also faster than the 7950. Only with the release of the so-called "wonder drivers" did AMD manage to best NVIDIA's GK104. There was also the whole issue of Crossfire being much worse than SLI due to the horrible micro stuttering issues. The frame times were terrible for Crossfire, as tested and proven by multiple independent sources. SLI was a lot more smooth and stable. AMD somewhat fixed that in, what, August 2013? With the frame-pacing drivers, I believe.

In February 2013 NVIDIA came out with the GTX Titan, the infamous 1,000$ GPU. Yes, it was crazy expensive, but it was around 25-30% faster than any single GPU at the time and, despite the price, the best card on the market. Then NVIDIA released the GTX 780, which was a cut-down version of the Titan if memory serves right. It was the second-best card on the market after the Titan. For a time the list went like this:

1. GTX Titan

2. GTX 780

3. GTX 680/HD 7970

The GTX 780 was released in May 2013. AMD didn't have an answer for that until the release of the R9 290 in November 2013. For over 5 months NVIDIA dominated the high-end market completely with no response from AMD. What was a gamer seeking the ultimate gaming GPU to do? Purchase a 7990 with its terribad micro stuttering, high power draw and heat output? Not only that, but initially when Hawaii came out there were just reference models available, and these cards were notorious for being loud (there were even YouTube videos spoofing them as jet engines), power hungry and extremely hot. Worst of all? They were only marginally faster than the 780 and overclocked significantly worse. Just when you thought NVIDIA had finally met their match, what happened? They released the 780 Ti, which was rumored to have been ready for a while; NVIDIA were simply waiting for AMD to release Hawaii. The Titan was still sitting at the top. The 780 Ti claimed second place and could at times rival or even surpass a Titan with proper overclocking. The R9 290X was third with horrible cooling, horrible noise, horrible power draw and horrible overclocking.

In between we had an unremarkable card like Tonga (R9 285), which was worse than the R9 280X in pretty much all aspects. The R9 280X was merely a rebranded 7970 GHz Edition too. Fast forward a bit later: NVIDIA released the Titan Black. The landscape looked like this:

1. Titan Black

2. GTX Titan

3. GTX 780 Ti

4. R9 290X

NVIDIA got greedy and released the infamous Titan Z at 3000$. AMD responded with the R9 295X2 at 1500$. Both were multi-GPU cards at ridiculous prices, but at least the R9 295X2 could beat the Titan Z in many scenarios and was half the price. In September 2014, NVIDIA released the 970 at 349$ MSRP. It absolutely demolished the mid-range market, as it was by far the most affordable, powerful and efficient card at that price point. Its overclocking was decent and it drew half the power of the Hawaii cards. The 980 came out at the same time.

Again, AMD lay dormant for months and NVIDIA had captured the high-end and mid-range segments of the market. AMD's response was merely to drop the prices of Hawaii, which, while cheaper, were still inferior products. The market looked like this:

1. GTX Titan Z

2. GTX 980

3. GTX Titan

4. GTX 780 Ti

5. GTX 970/R9 290X

As a preemptive measure, NVIDIA released the 980 Ti to solidify its position and the Titan X came out too.

1. GTX Titan X

2. GTX 980 Ti

3. GTX 980

4. GTX Titan

5. GTX 780 Ti

6. GTX 970/R9 290X

Once again AMD had no answer for several months. We had to wait 10 months for AMD to release the Fury, which was comparable to a 980. For almost a year NVIDIA dominated the market top to bottom. It showed, as AMD was constantly losing market share quarter over quarter. Even still, the overclocking capabilities of the Fury/X are reportedly non-existent; the 980 Ti still kills them and is widely regarded as the best consumer-grade GPU on the market today. Factor in things like more frequent releases, faster and better driver support, better power efficiency, better overclocking and better cooling, and you can't seriously tell me AMD has been competing with NVIDIA.

I'm currently playing with a pair of 7850s after having sold my 970. If AMD releases better products then I'll gladly buy them. You suggesting that NVIDIA consumers are a detriment to the market is hilarious and ignorant. For the past three years, if you wanted the best card on the market, you had to go the NVIDIA route. I'm not gonna support AMD merely because they need it. If they want my money, then they release better cards. I couldn't care less about their situation or NVIDIA's.

I honestly believe that MOST people who purchase Nvidia GPUs are idiots, complete and utter idiots who get sucked into all of this by viral marketing and equally idiotic word of mouth from other ignorant people... Not to mention I don't consider most of them real PC gamers, because what kind of PC gamer would support a company that is destroying it...

I mean seriously? You're smarter than that man.

very informative brother


#76  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@ronvalencia said:

LOL Blind fool. Mantle was given to Khronos, hence Vulkan was born.

Khronos modified the Mantle API spec to be MSFT and GPU agnostic.

Read the basics from https://en.wikipedia.org/wiki/Vulkan_(API)

Khronos is making sure legal due diligence has been applied.

I'm the fool? lol, how about you actually read what is posted....... I'm not even talking about Vulkan, it's about Mantle before it was shut down by AMD and given to Khronos. Fact is, Mantle was proprietary and only the GCN architecture could use it.


#77 Jereb31
Member since 2015 • 2025 Posts

@Juub1990:

I don't think he is lying on the catch-up part. Honestly, the 295X2 was a monster that beat Nvidia's arse all over the place when it came out. Dual GPU on one card, but that doesn't really matter.


#78 Phazevariance
Member since 2003 • 12356 Posts

I want to see what Pascal offers, but damn, their prices are so high, and in Canada it's like 40% higher than in the US, making it cost a firstborn child. I've only got one firstborn. Four years later they want another firstborn. So that's when I have to get another secret family started so I can trade them in for the latest video cards. It literally takes at least a year to get a firstborn made.

Maybe this year they will drop the cost to a second-born only.


#79 Juub1990
Member since 2013 • 12622 Posts

@jereb31: Yeah, if you were willing to spend 1500$. Either way, I'm talking single GPU.


#80  Edited By ronvalencia
Member since 2008 • 29612 Posts

http://www.hardocp.com/news/2016/01/14/nvidiarsquos_drive_px_2_prototype_allegedly_powered_by_maxwell_pascal#.Vpl8DI9OKUk

For CES 2016, NVIDIA had a dual Maxwell prototype instead of Pascal.

Delayed on TSMC (16 nm FinFET)?

http://www.overclock3d.net/articles/gpu_displays/nvidia_pascal_mia_at_ces_reportedly_in_trouble/1

Nvidia's Pascal MIA (Missing In Action) at CES, reportedly in trouble

AMD showed 14 nm FinFET Polaris GCN at CES 2016. 14 nm FinFET is offered by either GlobalFoundries or Samsung.

----------------------------------------------------

From http://www.dsogaming.com/news/the-witcher-3-developer-its-up-to-nvidia-to-let-amd-users-enjoy-physx-hair-and-fur-effects/

The contract between NVIDIA and CDPR stops AMD from being involved with The Witcher 3's PC development.

John Kloetzli is a graphics programmer at Firaxis.

Bart Wronski is a graphics programmer at… Ubisoft Montreal!

Timothy Lottes is ex-NVIDIA and now works at Epic (he is the author of TXAA).

Michal Drobot also works at Ubisoft Montreal.

Johan works at EA DICE.

From https://forums.geforce.com/default/topic/806331/nvidia-intentionally-cripples-the-older-generation-kepler-video-cards-through-nvidia-experience-/

before driver/geforce experience update

after gpu lost on

directx 9 simple -18 percent performance

directx 10 -23 percent performance

directx 11 -8 percent performance

a whole average 16 percent performance have been lost just so you can show your customers how much better your new generation is

STOP IT NVIDIA STOP CRIPPLING YOUR OLDER GENERATIONS STOP



#81  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:
@ronvalencia said:

LOL Blind fool. Mantle was given to Khronos, hence Vulkan was born.

Khronos modified the Mantle API spec to be MSFT and GPU agnostic.

Read the basics from https://en.wikipedia.org/wiki/Vulkan_(API)

Khronos is making sure legal due diligence has been applied.

I'm the fool? lol, how about you actually read what is posted....... I'm not even talking about Vulkan, it's about Mantle before it was shut down by AMD and given to Khronos. Fact is, Mantle was proprietary and only the GCN architecture could use it.

The FACT is Mantle was given to Khronos and they modified it to fit non-AMD GPUs and non-MS Windows.

As of today

The FACT is NVIDIA PhysX doesn't have the same open status as Mantle-to-Vulkan.

The FACT is NVIDIA Gameworks doesn't have the same open status as GPUOpen

The FACT is NVIDIA Gsync doesn't have the same open standards as FreeSync.


#82 04dcarraher
Member since 2004 • 23858 Posts

@ronvalencia said:
@04dcarraher said:
@ronvalencia said:

LOL Blind fool. Mantle was given to Khronos, hence Vulkan was born.

Khronos modified the Mantle API spec to be MSFT and GPU agnostic.

Read the basics from https://en.wikipedia.org/wiki/Vulkan_(API)

Khronos is making sure legal due diligence has been applied.

I'm the fool? lol, how about you actually read what is posted....... I'm not even talking about Vulkan, it's about Mantle before it was shut down by AMD and given to Khronos. Fact is, Mantle was proprietary and only the GCN architecture could use it.

The FACT is Mantle was given to Khronos and they modified it to fit non-AMD GPUs and non-MS Windows.

Fact is, you're ignoring the point about what AMD was doing when it was their baby.


#83 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

Absolutely nothing wrong with what Nvidia is doing.


#84  Edited By Jereb31
Member since 2015 • 2025 Posts

@Juub1990:

Yeah, it was hella expensive, but then the performance on it was unmatched, same with the 4870 X2, 5970, 6990 and 7990. Nvidia didn't match those except with the 680 or 690 (can't remember the dual card they had). Purely performance-wise, Nvidia has not had the lead for a solid 3 years. The 7990 was in front in there somewhere. Doesn't matter if it was a dual GPU, it was still a single card.


#85 Whistle_Blower
Member since 2015 • 291 Posts

I do appreciate Nvidia and I do understand their importance to technological development, but what I'm not a fan of is how they constantly release new products, their prices, and how entitled they feel to having people fork out the cash for them.


#86 DaVillain  Moderator
Member since 2014 • 58705 Posts

@jj-josh said:
@Juub1990 said:
@nyadc said:

I'm not saying you're new to all of this, but you kind of come off like it. Graphics card releases are a ballet: AMD releases the most powerful card, then Nvidia releases one to best them, and vice versa; this is how it has always been. The only thing Nvidia has done better than AMD over the last three years is create more power-efficient cards, and that just went out the window with Polaris, which at 14nm is offering roughly double the performance per watt of what Nvidia currently has available...

The 780 Ti is just a more cost-efficient GK110, in other words Titan architecture, released 9 months after the Titan. The 290X was beaten "like a week later" because GK110 already existed; they just redesigned the card to make it more consumer friendly and affordable. Also, the margin by which the 780 Ti bested the 290X is really nothing to brag about.

I've been playing on PC since the late 90's so no, I'm not new to any of this. AMD used to be much more competitive back then. The 680 was faster than the 7970 for a while and the 670 was also faster than the 7950. Only with the release of the so-called "wonder drivers" did AMD manage to best NVIDIA's GK104. There was also the whole issue of Crossfire being much worse than SLI due to the horrible micro stuttering issues. The frame times were terrible for Crossfire, as tested and proven by multiple independent sources. SLI was a lot more smooth and stable. AMD somewhat fixed that in, what, August 2013? With the frame-pacing drivers, I believe.

In February 2013 NVIDIA came out with the GTX Titan, the infamous 1,000$ GPU. Yes, it was crazy expensive, but it was around 25-30% faster than any single GPU at the time and, despite the price, the best card on the market. Then NVIDIA released the GTX 780, which was a cut-down version of the Titan if memory serves right. It was the second-best card on the market after the Titan. For a time the list went like this:

1. GTX Titan

2. GTX 780

3. GTX 680/HD 7970

The GTX 780 was released in May 2013. AMD didn't have an answer for that until the release of the R9 290 in November 2013. For over 5 months NVIDIA dominated the high-end market completely with no response from AMD. What was a gamer seeking the ultimate gaming GPU to do? Purchase a 7990 with its terribad micro stuttering, high power draw and heat output? Not only that, but initially when Hawaii came out there were just reference models available, and these cards were notorious for being loud (there were even YouTube videos spoofing them as jet engines), power hungry and extremely hot. Worst of all? They were only marginally faster than the 780 and overclocked significantly worse. Just when you thought NVIDIA had finally met their match, what happened? They released the 780 Ti, which was rumored to have been ready for a while; NVIDIA were simply waiting for AMD to release Hawaii. The Titan was still sitting at the top. The 780 Ti claimed second place and could at times rival or even surpass a Titan with proper overclocking. The R9 290X was third with horrible cooling, horrible noise, horrible power draw and horrible overclocking.

In between we had an unremarkable card like Tonga (R9 285), which was worse than the R9 280X in pretty much all aspects. The R9 280X was merely a rebranded 7970 GHz Edition too. Fast forward a bit later: NVIDIA released the Titan Black. The landscape looked like this:

1. Titan Black

2. GTX Titan

3. GTX 780 Ti

4. R9 290X

NVIDIA got greedy and released the infamous Titan Z at 3000$. AMD responded with the R9 295X2 at 1500$. Both were multi-GPU cards at ridiculous prices, but at least the R9 295X2 could beat the Titan Z in many scenarios and was half the price. In September 2014, NVIDIA released the 970 at 349$ MSRP. It absolutely demolished the mid-range market, as it was by far the most affordable, powerful and efficient card at that price point. Its overclocking was decent and it drew half the power of the Hawaii cards. The 980 came out at the same time.

Again, AMD lay dormant for months and NVIDIA had captured the high-end and mid-range segments of the market. AMD's response was merely to drop the prices of Hawaii, which, while cheaper, were still inferior products. The market looked like this:

1. GTX Titan Z

2. GTX 980

3. GTX Titan

4. GTX 780 Ti

5. GTX 970/R9 290X

As a preemptive measure, NVIDIA released the 980 Ti to solidify its position and the Titan X came out too.

1. GTX Titan X

2. GTX 980 Ti

3. GTX 980

4. GTX Titan

5. GTX 780 Ti

6. GTX 970/R9 290X

Once again AMD had no answer for several months. We had to wait 10 months for AMD to release the Fury, which was comparable to a 980. For almost a year NVIDIA dominated the market top to bottom. It showed, as AMD was constantly losing market share quarter over quarter. Even still, the overclocking capabilities of the Fury/X are reportedly non-existent; the 980 Ti still kills them and is widely regarded as the best consumer-grade GPU on the market today. Factor in things like more frequent releases, faster and better driver support, better power efficiency, better overclocking and better cooling, and you can't seriously tell me AMD has been competing with NVIDIA.

I'm currently playing with a pair of 7850s after having sold my 970. If AMD releases better products then I'll gladly buy them. You suggesting that NVIDIA consumers are a detriment to the market is hilarious and ignorant. For the past three years, if you wanted the best card on the market, you had to go the NVIDIA route. I'm not gonna support AMD merely because they need it. If they want my money, then they release better cards. I couldn't care less about their situation or NVIDIA's.

I honestly believe that MOST people who purchase Nvidia GPUs are idiots, complete and utter idiots who get sucked into all of this by viral marketing and equally idiotic word of mouth from other ignorant people... Not to mention I don't consider most of them real PC gamers, because what kind of PC gamer would support a company that is destroying it...

I mean seriously? You're smarter than that man.

very informative brother

Indeed very informative and well said too.


#87 blueinheaven
Member since 2008 • 5567 Posts

Hilarious thread, full of fanboys on one side or the other. Whenever I need a new GPU I do a lot of research and buy the best, most reliable, reasonably priced card around at the time. It might be AMD; more often than not it has been Nvidia.

My last purchase was a GTX 970. It's an amazing card for the price and AMD has nothing to match it; they would need equal power at a much lower price given the many driver issues I've had with their cards in the past.

Anyone who doesn't buy the BEST card within their budget when they need a new GPU is a complete moron. Anyone who suggests that just buying an Nvidia card makes you a moron (as some halfwit did earlier) can safely be ignored as a complete idiot.


#88  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@jereb31 said:

@Juub1990:

Yeah, it was hella expensive, but then the performance on it was unmatched, same with the 4870 X2, 5970, 6990 and 7990. Nvidia didn't match those except with the 680 or 690 (can't remember the dual card they had). Purely performance-wise, Nvidia has not had the lead for a solid 3 years. The 7990 was in front in there somewhere. Doesn't matter if it was a dual GPU, it was still a single card.

However, AMD's support for their dual-GPU cards dwindled after each new series, and those GPUs suffered the same effects as Crossfire setups, making their potential performance lackluster due to frame-timing issues and games lacking multi-GPU support.


#89  Edited By Juub1990
Member since 2013 • 12622 Posts

@jereb31 said:

@Juub1990:

Yeah, it was hella expensive, but then the performance on it was unmatched, same with the 4870 X2, 5970, 6990 and 7990. Nvidia didn't match those except with the 680 or 690 (can't remember the dual card they had). Purely performance-wise, Nvidia has not had the lead for a solid 3 years. The 7990 was in front in there somewhere. Doesn't matter if it was a dual GPU, it was still a single card.

Or you could have bought two 780 Tis for less than the price of a single R9 295X2. Its 1500$ price was ludicrous considering two R9 290Xs performed slightly better and were a lot cheaper. I was also talking strictly single GPU.

@blueinheaven said:

Hilarious thread, full of fanboys on one side or the other. Whenever I need a new GPU I do a lot of research and buy the best, most reliable, reasonably priced card around at the time. It might be AMD; more often than not it has been Nvidia.

My last purchase was a GTX 970. It's an amazing card for the price and AMD has nothing to match it; they would need equal power at a much lower price given the many driver issues I've had with their cards in the past.

Anyone who doesn't buy the BEST card within their budget when they need a new GPU is a complete moron. Anyone who suggests that just buying an Nvidia card makes you a moron (as some halfwit did earlier) can safely be ignored as a complete idiot.

Pretty much this. You look at your budget and you buy the best GPU available. My budget has always been more or less 400$, and NVIDIA has always had a better solution available sooner at that price point for me.


#90 Jereb31
Member since 2015 • 2025 Posts

@04dcarraher:

The frame rate variance was equal between Nvidia and AMD except on AMD's 7000 series, which is where they had all the issues. All in all, I have found AMD a better buy than Nvidia for the last 12 years. They've always had a leg up late in the generations.


#91  Edited By Jereb31
Member since 2015 • 2025 Posts

@Juub1990:

Or you could buy one now for 699-ish.

http://m.newegg.com/Product/index?itemnumber=N82E16814202108

Which is better than the 980 Ti in performance and clearly the better buy on price/performance, or on price or performance standalone. But hey, if it's a dual GPU it doesn't count, right?

http://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

Nothing from Nvidia, even now, comes close to this thing in performance unless you want to shell out for a Titan Z, which ironically enough is still $1500. So you could get two 295X2s and have change. THAT is what AMD is good for: not price gouging like a monster, and reducing the prices of previous-gen cards.

Edit: Better make that $630 for the 295X2.

http://m.newegg.com/Product/index?itemnumber=N82E16814202108

This does in fact mean that AMD has had the crown since April 2014.


#92  Edited By Juub1990
Member since 2013 • 12622 Posts

@jereb31: It's two GPUs... might as well bring SLI 980s, SLI 780 Tis and SLI 980 Tis into the mix. Know why it beats a 980 Ti? Because there are two GPUs. Not exactly a fair comparison. Otherwise we also have to count the Titan Z, which was 3000$ at launch.


#93 Jereb31
Member since 2015 • 2025 Posts

@Juub1990: Yeah man, two GPUs on a single card. Count Crossfire or SLI or anything else as much as you need to; fact is, it is the fastest thing out there, and you can still Crossfire it for a lot less cash than two 980 Tis and beat the pants off the competition.


#94 Juub1990
Member since 2013 • 12622 Posts
@jereb31 said:

@Juub1990: Yeah man, two GPUs on a single card. Count Crossfire or SLI or anything else as much as you need to; fact is, it is the fastest thing out there, and you can still Crossfire it for a lot less cash than two 980 Tis and beat the pants off the competition.

An overclocked 980 Ti will actually come very close to it.

https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/7.html

http://www.tweaktown.com/tweakipedia/100/3440x1440-benchmarked-amd-radeon-r9-fury-295x2-more/index.html

These are not OC'd, by the way. Anyway, feel free to include multi-GPU setups if you want. We'd then have to include the Titan Z and any dual-GPU configuration under the sun.


#95  Edited By ShadowDeathX
Member since 2006 • 11699 Posts

I use both AMD and Nvidia, and this is what I can tell you:

  1. Micro-stutter and frame pacing issues did occur with AMD cards before 2013. This has of course since been improved upon. I was using 7970 Trifire during this time; I would know, since I was affected.

  2. Multi-GPU frame pacing and scalability is now better on AMD cards than on their Nvidia counterparts, especially with XDMA Crossfire. Nvidia needs to get rid of SLI bridges with Pascal.

  3. With that being said, multi-GPU support from both AMD and Nvidia has been shit. Working profiles for newer games take forever to be released.
  4. AMD is more open source and "consumer" friendly. Anyone who argues against this is silly.

  5. AMD's Windows drivers are not crap/shit/etc. like Nvidia fanboys make them out to be.

  6. Linux, on the other hand, is a work in progress. Nvidia wins there, no argument.

  7. Nvidia is usually faster to implement driver tricks or performance "optimizations" for many games. It usually takes AMD a bit longer in comparison.

  8. Nvidia's driver overhead is smaller than AMD's. In addition, Nvidia's drivers are more multi-threading friendly than AMD's.

  9. Nvidia's suite of programs is superior to what AMD offers. ShadowPlay and Geforce Experience are way ahead of what AMD does.

And as some people aren't aware, Nvidia was better prepared for the demise of the 20nm node for GPUs. Nvidia backed off early and developed Maxwell on the 28nm node. AMD kept going forward and had multiple GPU designs on 20nm. When that didn't work out, AMD had to quickly redesign Hawaii, and later Tonga and Fiji on 28nm. They ditched their other GPUs and "rehashed" their GCN 1.0 offerings.

And I wouldn't consider Mantle proprietary. The API was still being developed, so of course AMD would beta test it on their own hardware.


#96 ronvalencia
Member since 2008 • 29612 Posts

http://techfrag.com/2016/01/12/nvidia-lied-about-the-pascal-gpu-at-ces-that-their-ceo-showed-off/

Nvidia Lied About the Pascal GPU at CES That Their CEO Showed Off!

http://www.tweaktown.com/news/49531/nvidia-used-gtx-980-mxm-modules-during-pascal-tease-ces-2016/index.html

NVIDIA didn't have Pascal on stage at CES 2016, instead they used GTX 980 MXM modules


#97 Jereb31
Member since 2015 • 2025 Posts

@Juub1990:

Well, it's only fair to include it; saying AMD hasn't made a competing product in 3 years was not quite right.


#98 Jereb31
Member since 2015 • 2025 Posts

@ronvalencia:

Huh, that's pretty cagey of them to be doing that. Glad they were caught out, I suppose.


#99 Jereb31
Member since 2015 • 2025 Posts

Just for reference, Nvidia had inferior frame time variance (micro stutter) after the AMD 7000 series.

http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-5.html

And still does, from what I'm aware.


#100  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@jereb31 said:

@04dcarraher:

The frame rate variance was equal between Nvidia and AMD except on AMD's 7000 series, which is where they had all the issues. All in all, I have found AMD a better buy than Nvidia for the last 12 years. They've always had a leg up late in the generations.

Actually, no: the 5000 and 6000 series also experienced frame pacing issues; the 6000 series saw improvements after the 13.8 driver, which started to address frame timing issues in DX11. That you found AMD to be a better buy than Nvidia for the last 12 years and suggest they've always had a leg up is hilarious. AMD hasn't had a leg up on Nvidia since 2012. AMD has been rehashing the same architecture and previous GPUs to fill spots in new GPU lines, i.e. the 7970 as the 280X or the 290X as the 390X...