AMD: "Nvidia are full of sh*t and asshurt"

This topic is locked from further discussion.


#201 Cranler
Member since 2005 • 8809 Posts
[QUOTE="tormentos"][QUOTE="Cranler"]

"In fact, based on our testing, Radeon cards seem to handle PhysX slightly better than their GeForce counterparts. When running Borderlands 2 at 1920x1200, the HD 7970 only took a 15% performance hit after enabling PhysX (dropping from 72fps to 61fps), whereas the GTX 680 fell 19% from 74fps to 60fps." http://www.techspot.com/review/577-borderlands-2-performance/page7.html You are a soulless troll who argues just for the sake of arguing. I proved that HSA only works on HSA-capable hardware, and now I've proved that Radeon cards can actually run PhysX via a mod, and the results were not only better than on the GTX 680, the quality was the same...tormentos

Yet in this benchmark the GTX 680 averages 69 fps while the 7970 averages 33 fps. http://www.pcper.com/reviews/Graphics-Cards/Borderlands-2-PhysX-Performance-and-PhysX-Comparison-GTX-680-and-HD-7970/GPU-

This site recommends disabling PhysX on AMD GPUs. http://www.hardocp.com/article/2012/10/01/borderlands_2_gameplay_performance_iq_review/3

Different sites, different results; mine says otherwise. There's a link there, read it..

Here's another bench showing the 680 winning by a large margin. So that's 3 to 1. http://www.pcgameshardware.de/Borderlands-2-PC-234034/Tests/Borderlands-2-Physx-Test-1026921/
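For what it's worth, the percentage hits TechSpot quotes are internally consistent; a quick sketch (using only the fps figures quoted above):

```python
# Sanity-check the quoted PhysX performance hits:
# HD 7970: 72 -> 61 fps, GTX 680: 74 -> 60 fps (TechSpot figures).

def percent_hit(before_fps: float, after_fps: float) -> float:
    """Percentage frame-rate drop after enabling an effect."""
    return (before_fps - after_fps) / before_fps * 100

print(f"HD 7970: {percent_hit(72, 61):.0f}% hit")  # 15%
print(f"GTX 680: {percent_hit(74, 60):.0f}% hit")  # 19%
```

Both benchmarks' headline percentages check out; the disagreement between sites is about absolute fps, not the arithmetic.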

#202 Messiahbolical-
Member since 2009 • 5670 Posts
Meh, doesn't take away from the fact that Nvidia are walking over AMD in the graphics card wars.clyde46
What's the graphics card war? The war for the best high-end cards? Well, that's irrelevant unless you're going to pay hundreds upon hundreds of dollars for the best cards. ATI gives you the most bang for your buck, and that's all that matters to me. I paid a mere $85 (Newegg) for my Sapphire Radeon HD 6770 about a year and a half ago and it still pumps out enough power to run most big-name games with all settings on High (some on Ultra) at smooth framerates. I'm playing BioShock Infinite @ 1080p with a lot of settings on Ultra, anti-aliasing turned on, texture filtering at Ultra, dynamic lighting/light beams turned all the way up, and a few unimportant settings turned down a bit. I don't understand the need to spend $600+ on a video card (or two) when cards at half the price or less can do the job just fine for at least a few years. I've found that if you compare prices of an Nvidia and an ATI card with very comparable capabilities, the Nvidia GPU is usually at least 15-30% more expensive than the ATI. Why? Kind of like how AMD CPUs give you more bang for your buck than Intel CPUs. Seems like you're paying for the brand name.

#203 Cranler
Member since 2005 • 8809 Posts
[QUOTE="clyde46"]Meh, doesn't take away from the fact that Nvidia are walking over AMD in the graphics card wars.Messiahbolical-
What's the graphics card war? The war for the best high-end cards? Well, that's irrelevant unless you're going to pay hundreds upon hundreds of dollars for the best cards. ATI gives you the most bang for your buck, and that's all that matters to me. I paid a mere $85 (Newegg) for my Sapphire Radeon HD 6770 about a year and a half ago and it still pumps out enough power to run most big-name games with all settings on High (some on Ultra) at smooth framerates. I'm playing BioShock Infinite @ 1080p with a lot of settings on Ultra, anti-aliasing turned on, texture filtering at Ultra, dynamic lighting/light beams turned all the way up, and a few unimportant settings turned down a bit. I don't understand the need to spend $600+ on a video card (or two) when cards at half the price or less can do the job just fine for at least a few years. I've found that if you compare prices of an Nvidia and an ATI card with very comparable capabilities, the Nvidia GPU is usually at least 15-30% more expensive than the ATI. Why? Kind of like how AMD CPUs give you more bang for your buck than Intel CPUs. Seems like you're paying for the brand name.

You pay for better drivers and more features, like adaptive vsync and PhysX.

#204 Messiahbolical-
Member since 2009 • 5670 Posts

You pay for better drivers and more features, like adaptive vsync and PhysX.Cranler
So, in other words, basically nothing.


#205 tormentos
Member since 2003 • 33798 Posts
[QUOTE="Cranler"][QUOTE="tormentos"][QUOTE="Cranler"]

Yet in this benchmark the GTX 680 averages 69 fps while the 7970 averages 33 fps. http://www.pcper.com/reviews/Graphics-Cards/Borderlands-2-PhysX-Performance-and-PhysX-Comparison-GTX-680-and-HD-7970/GPU-

This site recommends disabling PhysX on AMD GPUs. http://www.hardocp.com/article/2012/10/01/borderlands_2_gameplay_performance_iq_review/3

Different sites, different results; mine says otherwise. There's a link there, read it..

Here's another bench showing the 680 winning by a large margin. So that's 3 to 1. http://www.pcgameshardware.de/Borderlands-2-PC-234034/Tests/Borderlands-2-Physx-Test-1026921/

Did you even look at what you posted? On that second link the 7970 actually hits 100 FPS while the GTX 680 doesn't; the 7970 averages 75 FPS and the GTX 680 averages 55 FPS.. My god..

#206 Cranler
Member since 2005 • 8809 Posts

[QUOTE="Cranler"][QUOTE="tormentos"] Different sites, different results; mine says otherwise. There's a link there, read it..tormentos
Here's another bench showing the 680 winning by a large margin. So that's 3 to 1. http://www.pcgameshardware.de/Borderlands-2-PC-234034/Tests/Borderlands-2-Physx-Test-1026921/

Did you even look at what you posted? On that second link the 7970 actually hits 100 FPS while the GTX 680 doesn't; the 7970 averages 75 FPS and the GTX 680 averages 55 FPS.. My god..

Nice job looking at the details. PhysX is off for the 7970 bench. I thought you would have clued in on that when I said that HardOCP recommends disabling PhysX on AMD GPUs. :roll:


#207 Cranler
Member since 2005 • 8809 Posts

[QUOTE="Cranler"]You pay for better drivers and more features, like adaptive vsync and PhysX.Messiahbolical-

So, in other words, basically nothing.

Not sure if serious. I tried AMD once and it was awful. All kinds of driver issues that simply don't exist with Nvidia. Adaptive vsync is the best driver innovation in years. You get what you pay for.

#208 clyde46
Member since 2005 • 49061 Posts
[QUOTE="clyde46"]Meh, doesn't take away from the fact that Nvidia are walking over AMD in the graphics card wars.Messiahbolical-
What's the graphics card war? The war for the best high-end cards? Well, that's irrelevant unless you're going to pay hundreds upon hundreds of dollars for the best cards. ATI gives you the most bang for your buck, and that's all that matters to me. I paid a mere $85 (Newegg) for my Sapphire Radeon HD 6770 about a year and a half ago and it still pumps out enough power to run most big-name games with all settings on High (some on Ultra) at smooth framerates. I'm playing BioShock Infinite @ 1080p with a lot of settings on Ultra, anti-aliasing turned on, texture filtering at Ultra, dynamic lighting/light beams turned all the way up, and a few unimportant settings turned down a bit. I don't understand the need to spend $600+ on a video card (or two) when cards at half the price or less can do the job just fine for at least a few years. I've found that if you compare prices of an Nvidia and an ATI card with very comparable capabilities, the Nvidia GPU is usually at least 15-30% more expensive than the ATI. Why? Kind of like how AMD CPUs give you more bang for your buck than Intel CPUs. Seems like you're paying for the brand name.

And once again we come back to the old crutch: bang for buck. For you, AMD was the better option because you didn't have much money. I, however, am different from you: I want the best and I'm prepared to pay for it. That's why I went Intel and Nvidia. Why settle for a hamburger when you could have roast duck?

#209 faizan_faizan
Member since 2009 • 7869 Posts

[QUOTE="faizan_faizan"][QUOTE="ShadowriverUB"] No, but that doesn't mean you can't do something similar on any other GPU; heck, if not for corporate issues Nvidia could make CUDA work on any other GPU. Anyway, I think OpenCL is better as it's open and it will work on any GPU (if the drivers support it), so developers don't need to care about what hardware the GPU hasShadowriverUB
There are things PhysX does that other physics engines cannot, like fluids (http://www.youtube.com/watch?v=zgWur7HaIks) or millions of particles.

Not having done something because they weren't programmed to doesn't mean it can't be recreated on other platforms


That was not an appropriate answer.
If the engine were capable of doing so, we'd be seeing games that do it, but there aren't any.


#210 EvanTheGamer
Member since 2009 • 1550 Posts

[QUOTE="Messiahbolical-"]

[QUOTE="Cranler"]You pay for better drivers and more features, like adaptive vsync and PhysX.Cranler

So, in other words, basically nothing.

Not sure if serious. I tried AMD once and it was awful. All kinds of driver issues that simply don't exist with Nvidia. Adaptive vsync is the best driver innovation in years. You get what you pay for.

PC gaming requires patience to debug crap like that. You should have been persistent, because AMD's hardware is superior in this case.


#211 clyde46
Member since 2005 • 49061 Posts

[QUOTE="Cranler"][QUOTE="Messiahbolical-"] So in other words basically nothing.

EvanTheGamer

Not sure if serious. I tried AMD once and it was awful. All kinds of driver issues that simply don't exist with Nvidia. Adaptive vsync is the best driver innovation in years. You get what you pay for.

PC gaming requires patience to debug crap like that. You should have been persistent, because AMD's hardware is superior in this case.

AMD? Superior?

#212 04dcarraher
Member since 2004 • 23858 Posts

[QUOTE="Cranler"][QUOTE="Messiahbolical-"] Not sure if serious. I tried AMD once and it was awful. All kinds of driver issues that simply don't exist with Nvidia. Adaptive vsync is the best driver innovation in years. You get what you pay for. EvanTheGamer

PC gaming requires patience to debug crap like that. You should have been persistent, because AMD's hardware is superior in this case.

AMD? Superior?

AMD isn't superior; they just provide a better price-to-performance ratio when they're behind. Remember AMD's 7900 series prices at launch? They were expensive, and stayed that way until Nvidia released the GTX 600 series. Have they gotten better over the years on the GPU front? Yes they have, but when it comes to drivers they're still behind Nvidia. Also, it took AMD nearly a year to get their drivers straightened out to where the 7970 could actually beat Nvidia's GTX 670. On the CPU front AMD hasn't been superior for a long time; their per-core performance hasn't increased much since 2009, and all they've been doing is increasing clock rates and adding more cores. And the PS4 is only getting a mid-tier GPU and a low-clocked, laptop-based 8-core CPU that's slower than AMD's 3 GHz desktop quad cores from 2009.


#213 Jag85
Member since 2005 • 20709 Posts
[QUOTE="Eddie-Vedder"]

That's pretty much what most of SW said too lol. SW ahead of the curve.

SaltyMeatballs
AMDoes what Nvidon't.

lol

#214 deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

[QUOTE="Chozofication"]

[QUOTE="tormentos"] Yes, and Nvidia also sells GPUs for cell phones, tablets and integrated graphics, so that's not that big of a win either way.. But how does that change the fact that their GPUs are overpriced? And overhyped? I remember when the original Xbox's specs were announced it was introduced as a 300-million-polygon console; then the GPU was downgraded clock-wise and the end result was 125 million... Nvidia always oversells; the Xbox, the PS3 and now the Titan are a testament to that.tormentos

Actually the original Xbox had the absolute top-of-the-line GPU from Nvidia at the time. The only "downgrade" was a single-digit clock-speed reduction; it might have been less than 5 MHz, I can't remember.

The Xbox was the only console that could ever have such a GPU, though, because it was so huge...

It was 50 MHz, which at the time was a lot, since the originally announced speed was 300 MHz; the Xbox was first introduced as a 300-million-polygon console, and after the downgrade the new figure was 125 million.

Just checked: the Xbox GPU was a derivative of the GeForce 3 series, the highest-clocked of which was 250 MHz. So it was a 17 MHz clock downgrade; the Xbox GPU ran at 233. I knew all that, I just didn't know what the highest-clocked GeForce 3 was.

And I don't care to know about those BS numbers, wherever you came up with them (out of your ass, I suspect); you should know that the polygon-numbers game is always more than tenfold what the console can actually do. Your own math fails: a one-sixth decrease in clock speed cuts the polygon figure by more than half?
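The scaling objection can be made concrete. Assuming polygon throughput scaled linearly with GPU clock (a simplification, and using the disputed in-thread numbers rather than verified specs), a 300 MHz to 233 MHz cut would predict a figure nowhere near as low as 125 million:

```python
# If polygon throughput scaled linearly with clock speed, what would the
# shipped 233 MHz clock predict from a 300 MHz / 300M-polygon announcement?
# (All numbers are the ones claimed in this thread, not verified specs.)

announced_clock_mhz = 300
shipped_clock_mhz = 233
announced_polys_m = 300  # millions of polygons per second, as announced

predicted_polys_m = announced_polys_m * shipped_clock_mhz / announced_clock_mhz
print(f"Linear prediction: {predicted_polys_m:.0f}M polygons")  # 233M
print("Figure claimed in-thread: 125M polygons")
```

So the clock reduction alone cannot explain a 300M-to-125M drop, which is the point being argued.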


#215 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="Messiahbolical-"]

[QUOTE="Cranler"]You pay for better drivers and more features, like adaptive vsync and PhysX.Cranler

So, in other words, basically nothing.

Not sure if serious. I tried AMD once and it was awful. All kinds of driver issues that simply don't exist with Nvidia. Adaptive vsync is the best driver innovation in years. You get what you pay for.

NVIDIA has its own issues. Should one start re-posting issues from NVIDIA.com's forums?

#216 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="Messiahbolical-"][QUOTE="clyde46"]Meh, doesn't take away from the fact that Nvidia are walking over AMD in the graphics card wars.Cranler
What's the graphics card war? The war for the best high-end cards? Well, that's irrelevant unless you're going to pay hundreds upon hundreds of dollars for the best cards. ATI gives you the most bang for your buck, and that's all that matters to me. I paid a mere $85 (Newegg) for my Sapphire Radeon HD 6770 about a year and a half ago and it still pumps out enough power to run most big-name games with all settings on High (some on Ultra) at smooth framerates. I'm playing BioShock Infinite @ 1080p with a lot of settings on Ultra, anti-aliasing turned on, texture filtering at Ultra, dynamic lighting/light beams turned all the way up, and a few unimportant settings turned down a bit. I don't understand the need to spend $600+ on a video card (or two) when cards at half the price or less can do the job just fine for at least a few years. I've found that if you compare prices of an Nvidia and an ATI card with very comparable capabilities, the Nvidia GPU is usually at least 15-30% more expensive than the ATI. Why? Kind of like how AMD CPUs give you more bang for your buck than Intel CPUs. Seems like you're paying for the brand name.

You pay for better drivers and more features, like adaptive vsync and PhysX.

Better drivers are not a statistical fact.

#217 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="EvanTheGamer"]

[QUOTE="Cranler"]

PC gaming requires patience to debug crap like that. You should have been persistent, because AMD's hardware is superior in this case.

04dcarraher

AMD? Superior?

AMD isn't superior; they just provide a better price-to-performance ratio when they're behind. Remember AMD's 7900 series prices at launch? They were expensive, and stayed that way until Nvidia released the GTX 600 series. Have they gotten better over the years on the GPU front? Yes they have, but when it comes to drivers they're still behind Nvidia. Also, it took AMD nearly a year to get their drivers straightened out to where the 7970 could actually beat Nvidia's GTX 670. On the CPU front AMD hasn't been superior for a long time; their per-core performance hasn't increased much since 2009, and all they've been doing is increasing clock rates and adding more cores. And the PS4 is only getting a mid-tier GPU and a low-clocked, laptop-based 8-core CPU that's slower than AMD's 3 GHz desktop quad cores from 2009.

The problem for Intel is the price war coming from ARM.

AMD Jaguar's die size rivals ARM's Cortex-A15 die size.

[Image: 2013 CPU core die-size comparison]

Building King Tiger tanks doesn't win the war.

Intel also has a stake in making sure x86 wins the war, hence their support for Sony's PS4 GPU physics.


#218 Cranler
Member since 2005 • 8809 Posts
[QUOTE="Cranler"][QUOTE="Messiahbolical-"] So, in other words, basically nothing.ronvalencia
Not sure if serious. I tried AMD once and it was awful. All kinds of driver issues that simply don't exist with Nvidia. Adaptive vsync is the best driver innovation in years. You get what you pay for.

NVIDIA has its own issues. Should one start re-posting issues from NVIDIA.com's forums?

Never said Nvidia had no issues, just far fewer issues in my experience. For example, right off the bat after I got my 5870 there were loading-time issues with AMD in Bad Company 2. Never had an issue like that with Nvidia.

#219 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="Cranler"] Not sure if serious. I tried AMD once and it was awful. All kinds of driver issues that simply don't exist with Nvidia. Adaptive vsync is the best driver innovation in years. You get what you pay for. Cranler
NVIDIA has its own issues. Should one start re-posting issues from NVIDIA.com's forums?

Never said Nvidia had no issues, just far fewer issues in my experience. For example, right off the bat after I got my 5870 there were loading-time issues with AMD in Bad Company 2. Never had an issue like that with Nvidia.

Read http://www.evga.com/forums/tm.aspx?m=302859&mpage=1 for the Bad Company 2 loading issue on NVIDIA's GF100.


#220 clyde46
Member since 2005 • 49061 Posts

[QUOTE="Cranler"][QUOTE="ronvalencia"] NVIDIA has its own issues. Should one start re-posting issues from NVIDIA.com's forums?ronvalencia

Never said Nvidia had no issues, just far fewer issues in my experience. For example, right off the bat after I got my 5870 there were loading-time issues with AMD in Bad Company 2. Never had an issue like that with Nvidia.

Read http://www.evga.com/forums/tm.aspx?m=302859&mpage=1 for the Bad Company 2 loading issue for NVIDIA.

Hardly Nvidia's fault if DICE messed something up.

#221 Cranler
Member since 2005 • 8809 Posts
[QUOTE="Cranler"][QUOTE="Messiahbolical-"] What's the graphics card war? The war for the best high-end cards? Well, that's irrelevant unless you're going to pay hundreds upon hundreds of dollars for the best cards. ATI gives you the most bang for your buck, and that's all that matters to me. I paid a mere $85 (Newegg) for my Sapphire Radeon HD 6770 about a year and a half ago and it still pumps out enough power to run most big-name games with all settings on High (some on Ultra) at smooth framerates. I'm playing BioShock Infinite @ 1080p with a lot of settings on Ultra, anti-aliasing turned on, texture filtering at Ultra, dynamic lighting/light beams turned all the way up, and a few unimportant settings turned down a bit. I don't understand the need to spend $600+ on a video card (or two) when cards at half the price or less can do the job just fine for at least a few years. I've found that if you compare prices of an Nvidia and an ATI card with very comparable capabilities, the Nvidia GPU is usually at least 15-30% more expensive than the ATI. Why? Kind of like how AMD CPUs give you more bang for your buck than Intel CPUs. Seems like you're paying for the brand name.ronvalencia
You pay for better drivers and more features, like adaptive vsync and PhysX.

Better drivers are not a statistical fact.

Updating drivers is faster and simpler, and it's easier to set custom game profiles. What cool features do AMD's drivers have, like Nvidia's adaptive vsync?
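For readers who haven't used it: the idea behind adaptive vsync can be sketched in a few lines (a simplified model for illustration only, not Nvidia's actual driver logic):

```python
# Adaptive vsync, simplified: wait for vblank only when the frame was
# rendered within the refresh budget; otherwise skip the wait (allowing
# tearing) instead of letting full vsync halve the frame rate.

def adaptive_vsync_wait(frame_time_ms: float, refresh_hz: float = 60.0) -> bool:
    """True if the buffer swap should wait for the next vblank."""
    return frame_time_ms <= 1000.0 / refresh_hz

print(adaptive_vsync_wait(12.0))  # True: under the ~16.7 ms budget, so sync
print(adaptive_vsync_wait(20.0))  # False: over budget, tear rather than stall
```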

#222 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="ronvalencia"]

[QUOTE="Cranler"] Never said Nvidia had no issues, just far fewer issues in my experience. For example, right off the bat after I got my 5870 there were loading-time issues with AMD in Bad Company 2. Never had an issue like that with Nvidia. clyde46

Read http://www.evga.com/forums/tm.aspx?m=302859&mpage=1 for the Bad Company 2 loading issue for NVIDIA.

Hardly Nvidia's fault if DICE messed something up.

Does your comment address Cranler's assertion?

#223 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="Cranler"]You pay for better drivers and more features, like adaptive vsync and PhysX.Cranler
Better drivers are not a statistical fact.

Updating drivers is faster and simpler, and it's easier to set custom game profiles. What cool features do AMD's drivers have, like Nvidia's adaptive vsync?

Still no statistical backing. What cool features do NVIDIA's drivers have, like AMD's proper GPGPU support or DirectX 11.1 feature level 11_1?

PS: GK110 still supports only feature level 11_0 under DirectX 11.1.


#224 ChubbyGuy40
Member since 2007 • 26442 Posts

Still no statistical backing

ronvalencia

We can create a statistical measurement from SW users and use that as the basis for Nvidomination over AMDoesntwork.


#225 Cranler
Member since 2005 • 8809 Posts

[QUOTE="Cranler"][QUOTE="ronvalencia"] Better drivers are not a statistical fact.ronvalencia

Updating drivers is faster and simpler, and it's easier to set custom game profiles. What cool features do AMD's drivers have, like Nvidia's adaptive vsync?

Still no statistical backing. What cool features do NVIDIA's drivers have, like AMD's proper GPGPU support or DirectX 11.1 feature level 11_1?

PS: GK110 still supports only feature level 11_0 under DirectX 11.1.

From Nvidia:

"So basically, we do support 11.1 features with the 11_0 feature level through the DirectX 11.1 API. We do not support feature level 11_1. This is a bit confusing, due to Microsoft naming. So we do support 11.1 from a feature level for gaming-related features."

In layman's terms, please describe how GPGPU support would give the 7970 an advantage over the GTX 680, for example.

Oh, and look at all the crap you have to go through to create a game profile with Catalyst, what a joke. http://www.tweakguides.com/ATICAT_9.html


#226 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Cranler"] Updating drivers is faster and simpler, easier to set custom game profiles. What cool features does Amd drivers have like Nvidia's adaptive vsync? Cranler

Still no statistical backing. What cool features does NVIDIA drivers have like AMD proper GpGPU support or DirectX11.1 Level 11.1?

PS; GK110 still supports DirectX11.1 Level 11.0.

From Nvidia

So basically, we do support 11.1 features with 11_0 feature level through the DirectX 11.1 API. We do not support feature level 11_1. This is a bit confusing, due to Microsoft naming. So we do support 11.1 from a feature level for gaming related features."

In laymans terms please describe how gpgpu support would give the 7970 an advantage over the gtx 680 for example.

Oh and look at all the crap you have to go through to create a game profile with catalyst, what a joke. http://www.tweakguides.com/ATICAT_9.html

Typical NVIDIA PR spin, NVIDIA doesn't support UAVs(random read/write) for all shader types. I'm sure domain/vertex/hull shaders for UAVs are game related APIs.

A simple GpGPU example is the practical double precision floating point performance not just a tick box support.

PS4's GPU support is beyond DirectX11.1 Level 11.1+ features support e.g. shader interrupt instructions. Note that this issue also affects all PCs with DX.


#227 Cranler
Member since 2005 • 8809 Posts
[QUOTE="ronvalencia"]

[QUOTE="Cranler"]

[QUOTE="ronvalencia"]

Still no statistical backing. What cool features do NVIDIA's drivers have, like AMD's proper GPGPU support or DirectX 11.1 feature level 11_1?

PS: GK110 still supports only feature level 11_0 under DirectX 11.1.

From Nvidia:

"So basically, we do support 11.1 features with the 11_0 feature level through the DirectX 11.1 API. We do not support feature level 11_1. This is a bit confusing, due to Microsoft naming. So we do support 11.1 from a feature level for gaming-related features."

In layman's terms, please describe how GPGPU support would give the 7970 an advantage over the GTX 680, for example.

Oh, and look at all the crap you have to go through to create a game profile with Catalyst, what a joke. http://www.tweakguides.com/ATICAT_9.html

Typical NVIDIA PR spin: NVIDIA doesn't support UAVs (random read/write) for all shader types. I'm sure domain/vertex/hull shader UAVs are game-related APIs.

A simple GPGPU example is practical double-precision floating-point performance, not just tick-box support.

I asked for layman's terms: what are the in-game advantages? Where will an Nvidia GPU owner like me be at a disadvantage? You keep ignoring the biggest problem with AMD, which is the individual game-profile setup. So easy with Nvidia's drivers.

#228 Ravenlore_basic
Member since 2003 • 4319 Posts

Yeah, that's exactly what they said :roll: And Nvidia is a much bigger and way more profitable company than AMD, so why there would be any butthurt is beyond me. Unlike AMD, Nvidia is able to be picky with contracts; that's why they didn't do any console builds. They don't need to, and they ask for way more than AMD. For AMD these console deals have been their saving grace, and even with that Nvidia is still by far the more profitable company.MrYaotubo

Go back a few years, prior to Apple launching the iPhone. BlackBerry and Nokia were the leaders; to say that Apple would topple them would have been laughed at by all the BlackBerry and Nokia fans. But what happened?!!

I think the same thing will happen here. AMD is making the right moves while Nvidia continues to behave as if people need them. Their prices are high and not many developers even push their code to make it perform better on Nvidia.


#229 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Cranler"] Updating drivers is faster and simpler, and it's easier to set custom game profiles. What cool features do AMD's drivers have, like Nvidia's adaptive vsync? Cranler

Still no statistical backing. What cool features do NVIDIA's drivers have, like AMD's proper GPGPU support or DirectX 11.1 feature level 11_1?

PS: GK110 still supports only feature level 11_0 under DirectX 11.1.

From Nvidia:

"So basically, we do support 11.1 features with the 11_0 feature level through the DirectX 11.1 API. We do not support feature level 11_1. This is a bit confusing, due to Microsoft naming. So we do support 11.1 from a feature level for gaming-related features."

In layman's terms, please describe how GPGPU support would give the 7970 an advantage over the GTX 680, for example.

Oh, and look at all the crap you have to go through to create a game profile with Catalyst, what a joke. http://www.tweakguides.com/ATICAT_9.html

That's pre-12.1 profile creation. Read http://www.hardocp.com/article/2011/12/19/amd_catalyst_121_preview_profiles_performance/#.UVjV2LO4bwM for 12.x profile creation.


As for DirectX 11.1 feature level 11_1 support, refer to the GeForce 700 series (not the mobile 710M/720M, i.e. the renamed 630M or renamed Kepler 1.0).

On DX11.1 feature level 11_1 and GeForce 600 issues, refer to http://forum.beyond3d.com/showthread.php?p=1681417


#230 Ravenlore_basic
Member since 2003 • 4319 Posts

[QUOTE="HaloinventedFPS"]

I don't like either of them, especially not Nvidia since they keep Physx & CUDA Nvidia only, which screws over a heap of PC gamers

but Nvidia were right, PS4 isn't a high end PC for $400

cfisher2833

 

Just wondering, but where do people keep getting the idea that the PS4 will be $400? Sony isn't in a position to take a huge loss for each console sold, and considering the WiiU (which we all know is a weak as hell system) retails for $350, I just don't get where people are getting this figure. I could maybe see the  low end version of the 720 retailing for $400 if it is indeed weaker than the PS4, but the idea that the PS4 will be $400 just seems like wishful thinking. I would at least expect the low end version to retail for $500 and the higher end one to retail for $600. 

Did you guys get this figure from Pachter or something!? :lol:

The CPU in the APU is just right for it to do its job. It is not meant to bottleneck or hold back the GPU.

The GPU can be accessed at a lower level than GPUs on PCs, so for a fair comparison you would need to compare it to a higher-tier PC GPU.

This system will also give developers more direct access to the shader pipeline than they had on the PS3 or through DirectX itself. "This is access you're not used to getting on the PC, and as a result you can do a lot more cool things and have a lot more access to the power of the system," Norden said. A low-level API will also let coders talk directly with the hardware in a way that's "much lower-level than DirectX and OpenGL." http://arstechnica.com/gaming/2013/03/sony-dives-deep-into-the-ps4s-hardware-power-controller-features-at-gdc/

So even though PCs have the brute force to outpower the PS4, the PS4's capabilities come from its dedication to games and its ability to use low-level programming. The whole system is more capable than an equal-spec "on paper" comparison would suggest. But we will wait and see the games to compare.
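Norden's "lower-level than DirectX and OpenGL" point mostly comes down to per-call driver overhead. Here is a toy Python sketch of that idea; it is purely illustrative, no real graphics API is involved, and every name in it is made up. A high-level API re-validates state on every draw call, while a console-style low-level API trusts the application and just records command packets:

```python
# Toy illustration only -- no real graphics API; all names are made up.
# High-level path: the driver validates state on every single draw call.
# Low-level path: the application is trusted; just record the packet.

def draw_high_level(commands, state, mesh):
    checks = 0
    for key in ("shader", "blend_mode", "depth_test"):
        if key not in state:                 # per-call validation overhead
            raise ValueError(f"missing state: {key}")
        checks += 1
    commands.append(("draw", mesh))
    return checks

def draw_low_level(commands, mesh):
    commands.append(("draw", mesh))          # no validation, just record
    return 0

state = {"shader": "pbr", "blend_mode": "opaque", "depth_test": True}
hi_cmds, lo_cmds = [], []
hi_checks = sum(draw_high_level(hi_cmds, state, m) for m in range(1000))
lo_checks = sum(draw_low_level(lo_cmds, m) for m in range(1000))
print(hi_checks, lo_checks)  # 3000 0 -- same GPU work, less CPU overhead
assert hi_cmds == lo_cmds
```

The GPU ends up with an identical command list either way; the saving is CPU time per draw call, which is why a fixed console spec can skip the checks a PC driver has to make.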


#231 tormentos
Member since 2003 • 33798 Posts

[QUOTE="tormentos"][QUOTE="Cranler"] Heres another bench showing the 680 winning by a large margin. So thats 3 to 1. http://www.pcgameshardware.de/Borderlands-2-PC-234034/Tests/Borderlands-2-Physx-Test-1026921/Cranler

Did you even look at what you posted? Because on that second link the 7970 actually hits 100 FPS while the 680 GTX doesn't; the 7970 averages 75 FPS and the 680 GTX averages 55 FPS.. My god..

Nice job looking at the details. PhysX is off for the 7970 bench. I thought you would have clued in on that when I said that HardOCP recommends disabling PhysX on AMD GPUs. :roll:

The physics run on the CPU when you do that... :lol: I guess you did not know that; unless you use a mod, an AMD GPU will not run PhysX...
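For what it's worth, the percentage gaps being thrown around are easy to derive. A minimal Python sketch using the 75 vs 55 fps averages quoted above, plus the 72 → 61 fps "PhysX hit" example cited earlier in the thread:

```python
# How the percentage gaps quoted in the thread are derived.

def percent_faster(a_fps, b_fps):
    """How much faster a_fps is than b_fps, relative to b_fps."""
    return (a_fps - b_fps) / b_fps

def percent_hit(before_fps, after_fps):
    """Fraction of performance lost going from before_fps to after_fps."""
    return (before_fps - after_fps) / before_fps

print(f"{percent_faster(75, 55):.0%}")  # 36% -- 7970 over 680 in that bench
print(f"{percent_hit(72, 61):.0%}")     # 15% -- cost of enabling PhysX
```

Note the two numbers use different baselines (the slower card vs the pre-PhysX framerate), which is part of why different sites' percentages look inconsistent.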

#232 tormentos
Member since 2003 • 33798 Posts
[QUOTE="Messiahbolical-"][QUOTE="clyde46"]Meh, does doesn't take away the fact that Nvidia are walking over AMD in the graphics card wars.clyde46
What's the graphics card war? War for the best high-end cards? Well that's irrelevant unless you're going to pay hundreds upon hundreds of dollars for the best cards. ATI gives you the most bang for your bucks, and that's all that matters to me. I payed a mere $85(Newegg) for my Sapphire Radeon HD 6770 like a year and a half ago and it still pumps out enough power to run most brand games with all settings on high(some on Ultra) at smooth framerates. I'm playing Bioshock Infinite @ 1080p with a lot of settings on Ultra, Anti-Aliasing turned on, texture filtering at ultra, dynamic lighting/light beams turned all the way up, and a few unimportant settings turned down a bit. I don't understand the need to spend a $600+ on a video card(or 2) when cards half or less of the price can do the job just fine for at least a few years. I've found that if your compare prices of an Nvidia and ATI card of very comparable capabilities, the Nvidia GPU is usually at least 15-30% more expensive than the ATI. Why? Kind of like how AMD CPUs give you more bang for your buck than Intel CPUs. Seems like you're paying for the brand name.

And once again we come back to the old crutch, bang for buck. For you, AMD was the better option as you didn't have much money. I however am different from you: I want the best and I'm prepared to pay for it. That is why I went Intel and Nvidia. Why settle for a hamburger when you could have roast duck?

The 7990 is better than the 690 GTX and costs $100 less. So it's more like paying $100 for chuck steak when T-bone costs $80...

#233 clyde46
Member since 2005 • 49061 Posts
[QUOTE="tormentos"][QUOTE="clyde46"][QUOTE="Messiahbolical-"] What's the graphics card war? War for the best high-end cards? Well that's irrelevant unless you're going to pay hundreds upon hundreds of dollars for the best cards. ATI gives you the most bang for your bucks, and that's all that matters to me. I payed a mere $85(Newegg) for my Sapphire Radeon HD 6770 like a year and a half ago and it still pumps out enough power to run most brand games with all settings on high(some on Ultra) at smooth framerates. I'm playing Bioshock Infinite @ 1080p with a lot of settings on Ultra, Anti-Aliasing turned on, texture filtering at ultra, dynamic lighting/light beams turned all the way up, and a few unimportant settings turned down a bit. I don't understand the need to spend a $600+ on a video card(or 2) when cards half or less of the price can do the job just fine for at least a few years. I've found that if your compare prices of an Nvidia and ATI card of very comparable capabilities, the Nvidia GPU is usually at least 15-30% more expensive than the ATI. Why? Kind of like how AMD CPUs give you more bang for your buck than Intel CPUs. Seems like you're paying for the brand name.

And once again we come back to the old crutch, bang for buck. For you, AMD was the better option as you didn't have much money. I however are different from you and I want the best and I'm prepared to pay for it. That is why I went Intel and Nvidia. Why settle for a hamburger when you could have roast duck?

The 7990 is better than the 690GTX and cost $100 dollars less. So is more like paying $100 for chucksteak when Tbone cost $80...

It also runs a lot hotter, requires three 8-pin PCI-E connectors and a triple-slot cooler, plus it's heavily reliant on driver updates for games to get the best out of it.

#234 04dcarraher
Member since 2004 • 23858 Posts
[QUOTE="clyde46"][QUOTE="tormentos"][QUOTE="clyde46"] And once again we come back to the old crutch, bang for buck. For you, AMD was the better option as you didn't have much money. I however are different from you and I want the best and I'm prepared to pay for it. That is why I went Intel and Nvidia. Why settle for a hamburger when you could have roast duck?

The 7990 is better than the 690GTX and cost $100 dollars less. So is more like paying $100 for chucksteak when Tbone cost $80...

It also runs a lot hotter, requires 3 8 pin PCI-E connectors and has a triple slot cooler, plus its heavily reliant on driver updates for games to get the best of it.

Also AMD tends to drop dual gpu driver support very quickly

#235 tormentos
Member since 2003 • 33798 Posts
Just checked, the Xbox GPU was a derivative of the GeForce 3 series, the highest clocked of which was 250 MHz. So it was a 17 MHz clock downgrade. The Xbox GPU ran at 233. I knew all that, I just didn't know what the highest clocked GeForce 3 was.

And I don't care to know about those BS numbers wherever you came up with them, out your ass I suspect; you should know that the polygon numbers game is always more than 10-fold of what the console can actually do. Your own math fails: 1/6 of a decrease in clock speed equals more than 200% fewer polygons?

Chozofication
I like debating with idiots like you; it makes the ownage even better.

""Xbox general manager, J Allard, took time out of his busy schedule to talk to IGN in a short Q&A about the recent Xbox downgrades. Allard states that the main reason behind making DVD playback require a remote control purchase was to help solidify the perception of the Xbox as a focused gaming console. Here's Allard's response to the NV2A spec change from 300 MHZ to 250 MHZ: J Allard: The honest truth is that the goal that we always had for the system was 3x the graphical and computational performance of PlayStation 2. Initially, we thought that a 600MHz CPU and a 300MHz GPU was about right -- and that was in March. Now that Nvidia's got the NV20 in production, and we've got NV20 cards working with the operating system out in dev kits, and we've got games up and running on NV20s, we learned a bit more about the production and the manufacturing, and we decided that the 250MHz combined with the 733MHz is really the right balance. We'll still hit the 3x, but we guessed bad with the 300Mhz. Basically, forget the numbers, and know that the Xbox will give you 3x more juice than the PS2."" http://news.teamxbox.com/xbox/475/Allard-on-Downgrade/

Hey idiot, look at the date of the link: January 2001, before the Xbox was released. The original clock speed was 300MHz and was downgraded to 250MHz; that was J Allard in an interview on IGN back then about the downgrade in specs. In fact it's laughable that J Allard claimed the Xbox could achieve 3 times the in-game performance of the PS2.. :lol:

I actually read that interview back then, and how they tried to say they did a survey and that people wanted DVD playback with the remote, so they locked it out and made it available only by paying, when in real life they did that to avoid paying license fees to the DVD group, so you basically had to pay for the DVD license. :lol:

The Xbox GPU ran at 250MHz, not 233.
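A quick sanity check on the clock arithmetic being argued over: cutting the planned 300 MHz NV2A to the shipped 250 MHz is a linear reduction of about one sixth in clock-bound throughput (fill rate, triangle setup), all else being equal, nowhere near "200% fewer polygons". A minimal sketch:

```python
# Clock-bound GPU throughput scales roughly linearly with core clock.
planned_mhz, shipped_mhz = 300.0, 250.0
reduction = (planned_mhz - shipped_mhz) / planned_mhz
print(f"{reduction:.1%}")  # 16.7% -- about 1/6, not "200% fewer polygons"
```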

#236 tormentos
Member since 2003 • 33798 Posts
[QUOTE="clyde46"] It also runs a lot hotter, requires 3 8 pin PCI-E connectors and has a triple slot cooler, plus its heavily reliant on driver updates for games to get the best of it.

And? Remember you are made of money so that would not bother you..

#237 tormentos
Member since 2003 • 33798 Posts
[QUOTE="04dcarraher"] Also AMD tends to drop dual gpu driver support very quickly

This conversation is for people who can afford high-end GPUs, not for 560 Ti peasants, so get lost. :lol:

#238 tormentos
Member since 2003 • 33798 Posts
[QUOTE="clyde46"] Hardly the fault of Nvidia if Dice messed something up.

Really? So was it their fault when their cards could not do HDR+AA while AMD's could, and coders had to make their own patches?..

#239 clyde46
Member since 2005 • 49061 Posts
[QUOTE="tormentos"][QUOTE="clyde46"] It also runs a lot hotter, requires 3 8 pin PCI-E connectors and has a triple slot cooler, plus its heavily reliant on driver updates for games to get the best of it.

And? Remember you are made of money so that would not bother you..

It doesn't, but I know the 690 is the better card when comparing dual-GPU cards.

#240 04dcarraher
Member since 2004 • 23858 Posts
[QUOTE="tormentos"][QUOTE="04dcarraher"] Also AMD tends to drop dual gpu driver support very quickly

This conversation is for people who can afford high end GPU not for 560TI peasants so get loss.:lol:

That excludes you..... :roll: In 2011 a GTX 560 Ti was medium to high end, and I have no real reason to upgrade at the moment since I have no need to....

#241 cfisher2833
Member since 2011 • 2150 Posts

[QUOTE="tormentos"][QUOTE="04dcarraher"] Also AMD tends to drop dual gpu driver support very quickly 04dcarraher
This conversation is for people who can afford high end GPU not for 560TI peasants so get loss.:lol:

That excludes you..... :roll: in 2011 a GTX 560ti was medium to high end and I have no real reason to upgrade at the moment since I have no need too....

 

Not to mention, the 560 Ti overclocks like a beast. I am personally waiting for the 760 Ti. There's no reason right now to upgrade, as my 560 Ti gets me a constant 60fps in the vast majority of games.


#242 tormentos
Member since 2003 • 33798 Posts
[QUOTE="04dcarraher"] That excludes you..... :roll: in 2011 a GTX 560ti was medium to high end and I have no real reason to upgrade at the moment since I have no need too....

It's 2013; your GPU is old and out of date, so this argument is only for people like clyde46 who can afford nothing but the best.. :lol:

#243 04dcarraher
Member since 2004 • 23858 Posts

[QUOTE="04dcarraher"][QUOTE="tormentos"] This conversation is for people who can afford high end GPU not for 560TI peasants so get loss.:lol:cfisher2833

That excludes you..... :roll: in 2011 a GTX 560ti was medium to high end and I have no real reason to upgrade at the moment since I have no need too....

 

Not to mention, the 560ti overclocks like a beast. I am personally waiting until the 760ti. There's no reason right now to upgrade as my 560ti gets me a constant 60fps for the vast majority of games. 

I have mine at 900MHz.

#244 04dcarraher
Member since 2004 • 23858 Posts
[QUOTE="tormentos"][QUOTE="04dcarraher"] That excludes you..... :roll: in 2011 a GTX 560ti was medium to high end and I have no real reason to upgrade at the moment since I have no need too....

Is 2013 your GPU is old and out of date,so this argument is only for people like clyde46 who can afford nothing but the best..:lol:

Yet I can play all new games on high or max settings. You fail.

#245 jhcho2
Member since 2004 • 5103 Posts

Nvidia is butthurt that AMD secured the deal for next gen consoles.

AMD was panicking and had to secure a deal for next gen or go bankrupt.

Oh yeah, that part about the APU and GPU integration not being possible from Nvidia... that's horse sh1t. Nvidia's RSX also had integration capabilities with the Cell for the PS3. There is no moral high ground here. Both parties are talking out of their a$$es.


#246 cfisher2833
Member since 2011 • 2150 Posts

[QUOTE="cfisher2833"]

[QUOTE="04dcarraher"] That excludes you..... :roll: in 2011 a GTX 560ti was medium to high end and I have no real reason to upgrade at the moment since I have no need too.... 04dcarraher

 

Not to mention, the 560ti overclocks like a beast. I am personally waiting until the 760ti. There's no reason right now to upgrade as my 560ti gets me a constant 60fps for the vast majority of games. 

have mine at 900mhz

 

This is what I have mine set at. Runs really stable.

 

(screenshot: 922201284958pm.png)


#247 tormentos
Member since 2003 • 33798 Posts
[QUOTE="04dcarraher"][QUOTE="tormentos"][QUOTE="04dcarraher"] That excludes you..... :roll: in 2011 a GTX 560ti was medium to high end and I have no real reason to upgrade at the moment since I have no need too....

Is 2013 your GPU is old and out of date,so this argument is only for people like clyde46 who can afford nothing but the best..:lol:

yet I can play all new games on high or max settings you fail.

At what, 1680x1050?... Remember, 1080p 30fps isn't good; that is Killzone PS4 level.. :lol:

#248 clyde46
Member since 2005 • 49061 Posts
[QUOTE="tormentos"][QUOTE="04dcarraher"][QUOTE="tormentos"] Is 2013 your GPU is old and out of date,so this argument is only for people like clyde46 who can afford nothing but the best..:lol:

yet I can play all new games on high or max settings you fail.

At what 1680X1050... Remember 1080p 30fps isn't good that is Killzone PS4 level..:lol:

I could play most things back in 2010-2011 at 1080p at 60 FPS with my 460. The 560 Ti has no problems playing modern games at good settings with good FPS.

#249 tormentos
Member since 2003 • 33798 Posts

Nvidia is butthurt that AMD secured the deal for next gen consoles.

AMD was panicking and had to secure a deal for next gen or go bankrupt.

Oh yeah, that part about the APU and GPU integration not being possible from Nvidia....that's horse sh1t. Nvidia's RSX also had integration capabilities with the Cell for the ps3. There is no moral high ground here. Both parties are talking out of their a$$es

jhcho2
The RSX and Cell were not on the same die like the PS4's GPU and CPU; they shared a direct link, which was different but served more or less the same purpose. And it wasn't thanks to Nvidia, it was thanks to Sony and IBM.

#250 tormentos
Member since 2003 • 33798 Posts
[QUOTE="clyde46"][QUOTE="tormentos"][QUOTE="04dcarraher"] yet I can play all new games on high or max settings you fail.

At what 1680X1050... Remember 1080p 30fps isn't good that is Killzone PS4 level..:lol:

I got play most things back in 2010-2011 at 1080p at 60FPS with my 460. The 560Ti has no problems playing modern games at good settings with good FPS.

But that is not the point. The point is that you hype the best; you are not allowed to use anything but the best. Complaining about Killzone on PS4 for being 1080p 30FPS, when the 560 Ti isn't in a much better position and isn't hitting 60FPS either, is a joke when so many hermits complain about how important that is...

By the way, does the 690 GTX even hit 60 FPS on Crysis 3 at ultra at just 1080p? http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-6.html

All that power, and the $1,000 GPU can't reach 60FPS at 1080p? Wow, Carmack was right; so much wasted power.