This topic is locked from further discussion.
[QUOTE="clyde46"][QUOTE="tormentos"] At what, 1680x1050? Remember, 1080p at 30fps isn't good; that is Killzone PS4 level..:lol:tormentosI got to play most things back in 2010-2011 at 1080p at 60FPS with my 460. The 560Ti has no problems playing modern games at good settings with good FPS. But that is not the point. The point is you hype the best, you are not allowed to use anything but the best, so complaining about Killzone PS4 for being 1080p 30FPS, when the 560Ti isn't in a much better position and isn't hitting 60FPS, is a joke when so many hermits complain about how important it is... By the way, does the 690 GTX even hit 60FPS on Crysis 3 at ultra at just 1080p? http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-6.html All that power and the $1,000 GPU can't reach 60FPS at 1080p? Wow, Carmack was right, so much wasted power.
That benchmark isn't showing much difference between single GPUs and dual GPUs, which means that the multi-GPU scaling was not yet fixed when that was done, AKA the game was not patched or updated with the drivers... After the 313 drivers, GTX 690 users can max Crysis 3 with 4xAA at 1080p and get more than 60 fps average.
From Nvidia[QUOTE="Cranler"][QUOTE="ronvalencia"]
Still no statistical backing. What cool features do NVIDIA drivers have, like AMD's proper GPGPU support or DirectX 11.1 feature level 11_1?
PS: GK110 still supports DirectX 11.1 at feature level 11_0.
ronvalencia
"So basically, we do support 11.1 features with the 11_0 feature level through the DirectX 11.1 API. We do not support feature level 11_1. This is a bit confusing, due to Microsoft naming. So we do support 11.1 from a feature level for gaming related features."
In layman's terms, please describe how GPGPU support would give the 7970 an advantage over the GTX 680, for example.
Oh and look at all the crap you have to go through to create a game profile with catalyst, what a joke. http://www.tweakguides.com/ATICAT_9.html
That's pre-12.1 profile creation. Read http://www.hardocp.com/article/2011/12/19/amd_catalyst_121_preview_profiles_performance/#.UVjV2LO4bwM for 12.x's profile creation.
As for DirectX 11.1 feature level 11_1 support, refer to the GeForce 7x0 series (not the mobile 710M/720M, i.e. a renamed 630M or renamed Kepler 1.0).
On DX11.1 feature level 11_1 and GeForce 600 issues, refer to http://forum.beyond3d.com/showthread.php?p=1681417
I'm still waiting for a list of advantages of AMD's DX 11.1 support in layman's terms. Better framerate, better IQ, etc.
Took long enough for AMD to implement app profiles. Nvidia has had the feature for years and years. And according to this article, AMD drivers still leave a lot to be desired.
With that said, while AMD has done a great job implementing the functionality of custom application profiles, the interface could use some further work. The whole implementation still feels like it's been shoehorned into AMD's existing 3D Application Settings panel; AMD doesn't sufficiently separate the concept of global and custom profile settings, as you use the same control panel to make changes to both types of settings. It's possible (and likely) that you'll accidentally set your global settings at least once when trying to save a custom profile.
Furthermore, whereas NVIDIA uses application detection to pre-populate a list of profiles, AMD has no such detection. In order to create a profile you need to first select your settings in the 3D Application Settings panel and then save those settings to a new profile, a process that involves hunting down the executable of the game.
[QUOTE="Cranler"]Nice job looking at the details. PhysX is off for the 7970 bench. I thought you would have clued in on that when I said that HardOCP recommends disabling PhysX on AMD GPUs. :roll: The physics are run on the CPU when you do that...:lol: I guess you did not know that; unless you use a mod, an AMD GPU will not run PhysX... Was the techspot bench you linked using this mod you speak of? If not, then obviously they benched a part of the game that isn't PhysX heavy. Let's see some benches with this mod. Face it, when 3 out of 4 benches show Nvidia winning, who would you think is right?[QUOTE="tormentos"] Did you even look at what you posted? Because in that second link the 7970 actually hits 100FPS while the 680 GTX doesn't; the 7970 averages 75FPS, the 680 GTX averages 55FPS.. My god..tormentos
Nvidia is butthurt that AMD secured the deal for next gen consoles.
AMD was panicking and had to secure a deal for next gen or go bankrupt.
Oh yeah, that part about the APU and GPU integration not being possible from Nvidia....that's horse sh1t. Nvidia's RSX also had integration capabilities with the Cell for the ps3. There is no moral high ground here. Both parties are talking out of their a$$es
jhcho2
Based on CELL's 115 mm^2 @ 45 nm process tech, a 28 nm version of CELL would have a chip size of 71 mm^2.
With 28nm process tech, 8-core Jaguar's chip size is 41.6 mm^2.
They haven't made a single chip out of CELL and RSX; meanwhile Wii, Wii U and Xbox 360 have single-package chip solutions.
You are forgetting that Sony wants an APU solution at a certain price, and AMD already has the IP blocks for PS4, i.e. a scaled AMD Temash SoC with 8 Jaguar cores, 18 CUs and GDDR5.
Other option: AMD Kaveri APU with 4 to 6 Steamroller cores, 8 CUs (IGP), GDDR5, and a Radeon HD 7770 (dGPU) with 10 CUs.
AMD Temash already includes a southbridge.
---------------------------------------------------------
Semi-custom AMD Jaguar with 8 cores with Radeon HD 7850
AMD Jaguar core: 3.1 mm^2 + 2.1 mm^2 L2 cache = 5.2 mm^2 x 8 cores = 41.6 mm^2 (28nm process tech)
AMD Radeon HD 7850 (16 CU GpGPU) = 212 mm^2 (28nm process tech)
Total : 253.6 mm^2, ~150 watts.
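The die-area arithmetic above can be sanity-checked in a few lines. This is only a sketch of the poster's own back-of-the-envelope method: the CELL shrink uses linear feature-size scaling (115 × 28/45), even though real die shrinks tend closer to the square of the node ratio, so treat the outputs as rough estimates, not official figures.

```python
# Rough die-area arithmetic from the post above (a sketch, not official figures).

JAGUAR_CORE_MM2 = 3.1   # one Jaguar core
JAGUAR_L2_MM2 = 2.1     # that core's share of L2 cache
HD7850_MM2 = 212.0      # Radeon HD 7850 die (16 CUs)

def scale_linear(area_mm2, node_from_nm, node_to_nm):
    """Linear node scaling, as the post uses (real shrinks are closer to quadratic)."""
    return area_mm2 * node_to_nm / node_from_nm

cell_28nm = scale_linear(115.0, 45, 28)            # ~71.6 mm^2
jaguar_8c = (JAGUAR_CORE_MM2 + JAGUAR_L2_MM2) * 8  # 5.2 mm^2 x 8 = 41.6 mm^2
total = jaguar_8c + HD7850_MM2                     # 253.6 mm^2

print(f"CELL @28nm (linear est.): {cell_28nm:.1f} mm^2")
print(f"8x Jaguar:                {jaguar_8c:.1f} mm^2")
print(f"CPU + GPU total:          {total:.1f} mm^2")
```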
---------------------------------------------------------
NVIDIA GeForce GTX 650 Ti Boost (GK106) has a die size of 214 mm^2 (28nm process tech). AMD wins for this section.
ARM Cortex A15's performance is on par with Intel Atom, which AMD Jaguar murders in performance, i.e. PS4 is not a tablet/netbook and needs a cost-effective performance CPU design. AMD wins for this section.
At the same process tech, IBM PPE's chip size is larger than AMD Jaguar's, and that's the in-order PPE from IBM. AMD Jaguar is an out-of-order CPU solution.
Should I continue and get Wii U's PowerPC G3 kitbash solution and compare it to AMD Jaguar? Hint: you could fit 6 AMD Jaguar cores in the space of Wii U's three PowerPC cores.
I wonder who has the "horse sh1t".
---------------------------------------------------------
On transistor-packing skills
At 28nm process tech, a 32-core CELL would have a chip size of about 284 mm^2, and it would only deliver 4 times the existing CELL.
~230 GFLOPs x 4 = 920 GFLOPS.
Based on CELL's 115 mm^2 @ 45 nm process tech, a 28 nm version of CELL would have a chip size of 71 mm^2. Density of ~3.34 million transistors per mm^2.
With 45 nm process tech, IBM POWER7 has a chip size of 567 mm^2 (packing 1.2 billion transistors). Density of ~2.12 million transistors per mm^2.
With 28 nm process tech, IBM POWER7 would have a chip size of 352.8 mm^2 (packing 1.2 billion transistors). Density of ~3.40 million transistors per mm^2.
IBM is consistently inferior in transistors packed per mm^2.
For comparisons
With 28nm process tech, AMD Radeon HD 7870 has a chip size of 212 mm^2 (packing 2.8 billion transistors) and delivers 2560 GFLOPS. Density of ~13.2 million transistors per mm^2.
With 28nm process tech, AMD Radeon HD 7970 GE has a chip size of 352 mm^2 (packing 4.313 billion transistors) and delivers 4300 GFLOPS. Density of ~12.25 million transistors per mm^2. Notice AMD improves transistors per mm^2 within the same generation.
With 28nm process tech, NVIDIA GeForce GTX 680 has a chip size of 294 mm^2 (packing 3.54 billion transistors) and delivers 3090 GFLOPS. Density of ~12.04 million transistors per mm^2.
With 22nm process tech, Intel Ivy Bridge has a chip size of 160 mm^2 (packing 1.4 billion transistors). Density of ~8.75 million transistors per mm^2.
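A quick check of the density figures above (note the unit that falls out is millions, not billions, of transistors per mm^2). Transistor counts and die sizes are taken as quoted in the post:

```python
# Density = transistors / die area, for the chips quoted above.

chips = {
    # name: (transistor count, die area in mm^2)
    "IBM POWER7 @45nm":      (1.20e9, 567.0),
    "IBM POWER7 @28nm est.": (1.20e9, 352.8),
    "Radeon HD 7870":        (2.80e9, 212.0),
    "Radeon HD 7970 GE":     (4.313e9, 352.0),
    "GeForce GTX 680":       (3.54e9, 294.0),
    "Intel Ivy Bridge":      (1.40e9, 160.0),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # millions of transistors per mm^2
    print(f"{name:24s} {density:6.2f} M transistors/mm^2")
```

The printed values match the post's ~2.12, ~3.40, ~13.2, ~12.25, ~12.04 and ~8.75 figures.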
There's a reason why the IBM CELL-based add-on card was pushed out of the PC market.
Please outline a competitive APU product against PS4's AMD APU.
[QUOTE="tormentos"][QUOTE="SKaREO"] The GTX690 is unrivaled by any other video card available today. AMD makes budget hardware with bad driver support. Thanks for coming out.tionmedonhttp://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html Such fanboys.. The 7990 actually beats the 690 GTX in several games... and is $100 cheaper.. Not only that, two 7970s in CrossFire also beat the 690 and they cost $200 less than the 690 GTX.. http://www.newegg.com/Product/Product.aspx?Item=N82E16814131483&Tpk=7990%20amd&IsVirtualParent=1 http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=100006662&isNodeId=1&Description=690gtx&x=0&y=0 People like you are greatly misinformed, and no, I am not an AMD fanboy like Ron. ..... You cannot CrossFire the 7990; that would be six 8-pin connectors plus at least a 1200W PSU........ The 690 GTX is two 8-pin and runs with a 650W PSU...
Reference 7990 (Malta) uses two 8-pin PCI-E power connectors, i.e. four 8-pin PCI-E power connectors for two 7990s.
http://www.xbitlabs.com/picture/?src=/images/news/2013-03/amd_radeon_hd_7990_malta_tahiti.jpg
AMD waited for the lower-power XT2 for the 7990 Malta.
But that is not the point. The point is you hype the best, you are not allowed to use anything but the best, so complaining about Killzone PS4 for being 1080p 30FPS, when the 560Ti isn't in a much better position and isn't hitting 60FPS, is a joke when so many hermits complain about how important it is... By the way, does the 690 GTX even hit 60FPS on Crysis 3 at ultra at just 1080p? http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-6.html All that power and the $1,000 GPU can't reach 60FPS at 1080p? Wow, Carmack was right, so much wasted power.[QUOTE="tormentos"][QUOTE="clyde46"] I got to play most things back in 2010-2011 at 1080p at 60FPS with my 460. The 560Ti has no problems playing modern games at good settings with good FPS.04dcarraher
That benchmark isn't showing much difference between single GPUs and dual GPUs, which means that the multi-GPU scaling was not yet fixed when that was done, AKA the game was not patched or updated with the drivers... After the 313 drivers, GTX 690 users can max Crysis 3 with 4xAA at 1080p and get more than 60 fps average.
Nvidia 314.07 beta (314.09 beta for GeForce GTX Titan) http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-3.html Argument destroyed: they used the 314.07 drivers, and .09 for the GeForce Titan.. My point still stands. 1080p at lower than 60 FPS is what you people flamed Killzone PS4 for.. Funny thing is that nowhere does AMD dispute Nvidia's claims.
AMD have been playing the catch-up game for years when it comes to CPUs, and now GPUs as well. The top dogs in both fields are Nvidia and Intel, with AMD always playing catch-up to whatever these two are making.
It puzzles me why Nvidia has been kicking AMD's butt in the computer market. AMD's designs are much, much more efficient than Nvidia's. You get more bang for your buck with AMD. Nvidia's cards are big and dumb. They lack any elegance. Look at Titan. Does 4.5 tflops but is in the neighborhood of $1000. Then you've got AMD's Radeon 7970 Ghz edition at just a little less in tflops but half the price! No way is Titan twice as good. PC gamers consistently choosing to pay so much more for only marginal gains in performance makes no sense to me.
AMD are just as bad; when the Radeon 7000 series released, and the 5000 before that, the prices were just as much as Nvidia's. Only once Nvidia release their competing cards do AMD drop their prices, and not because they want to, but because they have no choice. It puzzles me why Nvidia has been kicking AMD's butt in the computer market. AMD's designs are much, much more efficient than Nvidia's. You get more bang for your buck with AMD. Nvidia's cards are big and dumb. They lack any elegance. Look at Titan. Does 4.5 tflops but is in the neighborhood of $1000. Then you've got AMD's Radeon 7970 Ghz edition at just a little less in tflops but half the price! No way is Titan twice as good. PC gamers consistently choosing to pay so much more for only marginal gains in performance makes no sense to me.
Wickerman777
AMD designs being more efficient is also untrue. Both Nvidia and AMD have hits and misses. The 5800 Ultra: hot, hungry and underpowered. The Radeon 2900XT: the same. And more up to date from Nvidia, the 480. With this release of cards the Green team use less power and run cooler.
The Titan pricing is a rare situation. Usually AMD and Nvidia go back and forth with the GPU crown, but AMD still hasn't released a card that beats the 680. Why drop the price of the 680 when it's better than the 7970? It puzzles me why Nvidia has been kicking AMD's butt in the computer market. AMD's designs are much, much more efficient than Nvidia's. You get more bang for your buck with AMD. Nvidia's cards are big and dumb. They lack any elegance. Look at Titan. Does 4.5 tflops but is in the neighborhood of $1000. Then you've got AMD's Radeon 7970 Ghz edition at just a little less in tflops but half the price! No way is Titan twice as good. PC gamers consistently choosing to pay so much more for only marginal gains in performance makes no sense to me.
Wickerman777
[QUOTE="Wickerman777"]The Titan pricing is a rare situation. Usually AMD and Nvidia go back and forth with the GPU crown, but AMD still hasn't released a card that beats the 680. Why drop the price of the 680 when it's better than the 7970? It puzzles me why Nvidia has been kicking AMD's butt in the computer market. AMD's designs are much, much more efficient than Nvidia's. You get more bang for your buck with AMD. Nvidia's cards are big and dumb. They lack any elegance. Look at Titan. Does 4.5 tflops but is in the neighborhood of $1000. Then you've got AMD's Radeon 7970 Ghz edition at just a little less in tflops but half the price! No way is Titan twice as good. PC gamers consistently choosing to pay so much more for only marginal gains in performance makes no sense to me.
Cranler
Meh, we could go back and forth all day. And not everyone agrees with that:
http://www.maximumpc.com/article/features/amd_radeon_hd_7970_vs_nvidia_geforce_gtx_680_take_two
I still think AMD makes the better value cards, provides more bang for the buck. That's why all 3 console companies chose them.
[QUOTE="Cranler"][QUOTE="Wickerman777"]
It puzzles me why Nvidia has been kicking AMD's butt in the computer market. AMD's designs are much, much more efficient than Nvidia's. You get more bang for your buck with AMD. Nvidia's cards are big and dumb. They lack any elegance. Look at Titan. Does 4.5 tflops but is in the neighborhood of $1000. Then you've got AMD's Radeon 7970 Ghz edition at just a little less in tflops but half the price! No way is Titan twice as good. PC gamers consistently choosing to pay so much more for only marginal gains in performance makes no sense to me.
The Titan pricing is a rare situation. Usually AMD and Nvidia go back and forth with the GPU crown, but AMD still hasn't released a card that beats the 680. Why drop the price of the 680 when it's better than the 7970?
Meh, we could go back and forth all day. And not everyone agrees with that:
http://www.maximumpc.com/article/features/amd_radeon_hd_7970_vs_nvidia_geforce_gtx_680_take_two
I still think AMD makes the better value cards, provides more bang for the buck. That's why all 3 console companies chose them.
Bang for the buck on console isn't the same as bang for the buck on PC. I prefer the Nvidia experience on PC. I gladly pay an extra $50 for the better drivers you get with Nvidia.[QUOTE="clyde46"] Why would I buy another PSU just to run a 7990 when a 690 runs on a lesser PSU? tormentosWhy would I buy a $1,000 GPU to play Crysis 3 at 1080p at less than 60FPS? Other lesser cards run the game, and no matter what graphics quality you play at, it is still the same game.. Crysis 3 is the same game on 360, PS3, and on a 690 GTX.
:lol::lol::lol:
:lol::lol::lol:jhonMalcovichSo you mean to tell me that Crysis 3 on PC has a different storyline, different missions and everything? No? Yes? Crysis 3 on all three is the same; regardless of the graphics difference it is the same game with the same missions, and it sucked just as bad on PC as it did on consoles according to reviewers. Having ultra pretty graphics did not save it..
[QUOTE="clyde46"] Implying I would buy a GTX690 just to play Crysis 3. You aren't very bright, are you.tormentosLike the Crysis example doesn't apply to every other game on both consoles and PC; you are even less bright than me.
No it doesn't apply to every game, as every game is different. You are twisting this to try and fit your argument. I buy the best; two years ago that was the GTX580. Why would I want to spend less on a GPU that I would have to upgrade a year down the line? My 580 still plays games at max at 1080p. When the time comes that my 580 does not play all the games I want at max at 1080p, then I will upgrade again.
Your last comment doesn't even make much sense. You are just mad that AMD have failed to produce a card that can beat the 690. You keep harping on about the Titan being $1000, yet it's not even aimed at your section of the market. Just look at the time before Nvidia released the 600 series: AMD had the price of the 7970 at around £400, which is where the GTX 680 sits now. Only after Nvidia launched the GTX 680 did AMD drop the price of the 7970, because they know they can't compete on a performance level, so they beat Nvidia on prices. True, a 7970 is cheaper than a 680 and in some games comes within spitting distance of a 680, but at the end of the day, I am the master of my money and I choose what I buy. And currently, I see nothing in AMD's lineup that suits me.
[QUOTE="jhonMalcovich"]:lol::lol::lol:tormentosSo you mean to tell me that Crysis 3 on PC has a different storyline, different missions and everything? No? Yes? Crysis 3 on all three is the same; regardless of the graphics difference it is the same game with the same missions, and it sucked just as bad on PC as it did on consoles according to reviewers. Having ultra pretty graphics did not save it..
I haven't played Crysis 3 nor even the original Crysis but I played Crysis 2 on Xbox 360 and liked it. It wasn't the best shooter I've played on the console but it was good.
I agree with some of your points and some others I don't. I think the thing about the graphics chip in PS3 being more expensive despite being weaker than the AMD chip in 360 was a good point. Just goes back to what I was saying before: Nvidia is expensive. That's a good thing for them cuz they make more money than AMD does but how/why they are able to whip AMD's butt in sales to such an extent on PCs remains a mystery to me. Their stuff is quite close in power and often AMD is cheaper.
[QUOTE="tormentos"][QUOTE="04dcarraher"] That excludes you..... :roll: in 2011 a GTX 560ti was medium to high end and I have no real reason to upgrade at the moment since I have no need to.... 04dcarraherIt's 2013; your GPU is old and out of date, so this argument is only for people like clyde46 who can afford nothing but the best..:lol: Yet I can play all new games on high or max settings; you fail.
meanwhile tormentos is stuck at 1280x720 low settings and 30fps:P
Like the Crysis example doesn't apply to every other game on both consoles and PC; you are even less bright than me.[QUOTE="tormentos"][QUOTE="clyde46"] Implying I would buy a GTX690 just to play Crysis 3. You aren't very bright, are you.clyde46
No it doesn't apply to every game, as every game is different. You are twisting this to try and fit your argument. I buy the best; two years ago that was the GTX580. Why would I want to spend less on a GPU that I would have to upgrade a year down the line? My 580 still plays games at max at 1080p. When the time comes that my 580 does not play all the games I want at max at 1080p, then I will upgrade again.
Your last comment doesn't even make much sense. You are just mad that AMD have failed to produce a card that can beat the 690. You keep harping on about the Titan being $1000, yet it's not even aimed at your section of the market. Just look at the time before Nvidia released the 600 series: AMD had the price of the 7970 at around £400, which is where the GTX 680 sits now. Only after Nvidia launched the GTX 680 did AMD drop the price of the 7970, because they know they can't compete on a performance level, so they beat Nvidia on prices. True, a 7970 is cheaper than a 680 and in some games comes within spitting distance of a 680, but at the end of the day, I am the master of my money and I choose what I buy. And currently, I see nothing in AMD's lineup that suits me.
No, all the games are the same, unless you prove that just because you have a powerful GPU the game is a completely different game. Regardless of the graphics the game is the same, which is my point, versus your lame argument of why you would buy a more expensive PSU, when most PSUs out there will not work with the 690 GTX either.. Hell, your 580 barely plays Crysis 3 on very high at anything higher than 1680x1050; in fact at that resolution it plays it at 33 FPS, which is considered what to you, a slow frame rate? You and many other hermits.. http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,7.html I guess 1080p 30 FPS is OK when your card can't handle the heat, but the same on console is not good somehow.. First, I am not mad; second, I am not an AMD ass kisser like Ron; and third, the 7990 does beat the 690 GTX in several games. http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html No, the only thing you are a master of is being a sucker and falling for those high prices that don't match the performance they should deliver for their price. Hell, two 7950s in CrossFire actually deliver the same performance the 690 GTX does on Crysis 3, but cost $400 less. PC gamers are idiots..meanwhile tormentos is stuck at 1280x720 low settings and 30fps:PMK-ProfessorReally? Last time I heard, Killzone PS4 was 30FPS at 1080p, not 720p. Maybe you are going by BF4 rumors, in which case I will have to ask you what makes you think the 560Ti will run BF4 maxed out at 1080p..:lol: It can't even max out Crysis 3 at 1080p; not even the 690 GTX can at 60FPS..
[QUOTE="clyde46"][QUOTE="tormentos"] Like Crysis example doesn't apply to every other game on both consoles and PC you are even less bright than me.tormentos
No it doesn't apply to every game, as every game is different. You are twisting this to try and fit your argument. I buy the best; two years ago that was the GTX580. Why would I want to spend less on a GPU that I would have to upgrade a year down the line? My 580 still plays games at max at 1080p. When the time comes that my 580 does not play all the games I want at max at 1080p, then I will upgrade again.
Your last comment doesn't even make much sense. You are just mad that AMD have failed to produce a card that can beat the 690. You keep harping on about the Titan being $1000, yet it's not even aimed at your section of the market. Just look at the time before Nvidia released the 600 series: AMD had the price of the 7970 at around £400, which is where the GTX 680 sits now. Only after Nvidia launched the GTX 680 did AMD drop the price of the 7970, because they know they can't compete on a performance level, so they beat Nvidia on prices. True, a 7970 is cheaper than a 680 and in some games comes within spitting distance of a 680, but at the end of the day, I am the master of my money and I choose what I buy. And currently, I see nothing in AMD's lineup that suits me.
No, all the games are the same, unless you prove that just because you have a powerful GPU the game is a completely different game. Regardless of the graphics the game is the same, which is my point, versus your lame argument of why you would buy a more expensive PSU, when most PSUs out there will not work with the 690 GTX either.. Hell, your 580 barely plays Crysis 3 on very high at anything higher than 1680x1050; in fact at that resolution it plays it at 33 FPS, which is considered what to you, a slow frame rate? You and many other hermits.. http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,7.html I guess 1080p 30 FPS is OK when your card can't handle the heat, but the same on console is not good somehow.. First, I am not mad; second, I am not an AMD ass kisser like Ron; and third, the 7990 does beat the 690 GTX in several games. http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html No, the only thing you are a master of is being a sucker and falling for those high prices that don't match the performance they should deliver for their price. Hell, two 7950s in CrossFire actually deliver the same performance the 690 GTX does on Crysis 3, but cost $400 less. PC gamers are idiots..
Ok, broken English aside, you are failing to provide a convincing argument. Yes, my 580 struggles with Crysis 3, but I haven't played Crysis 3; I do not care for it, as I was never really interested in it. You keep banging on about Crysis 3, thinking that all PC gamers love to play it. Next up we have that PSU argument. For someone who claims they know about hardware, you aren't very clued in on PSUs. Most PSUs do not come with three 8-pin PCI-E connectors as standard. My current PSU, which is an AX750 from Corsair, does not, but it does come with two 8-pin PCI-E connectors, which is perfect for the GTX690; so out of the box the 690 has greater compatibility with most gamers' rigs. Next we have performance.
Yes, I will concede that the 7990 can perform better in certain games, but this is heavily reliant on driver support, and that is not always forthcoming. Also, when comparing SLI vs Crossfire we find that Crossfire suffers more from micro-stuttering and latency issues. And finally we come to the power and heat. The GTX690, according to the article you linked, uses vastly less power both at idle and under load when compared to the dual-GPU offerings from AMD.
Lastly, you claim that the 7950's in Crossfire produce better performance in Crysis 3, yet you do not take into account resolution, or the micro-stuttering which is present on all AMD dual-card setups, and they will produce more heat in the process. Then you call PC gamers idiots? You call PC gamers idiots because they like to choose what they buy? Are you going to stand outside of a BMW dealership and pounce on people walking in, screaming "Don't buy a BMW when you could get this Toyota for $5000 less"?
[QUOTE="Cranler"] Bang for the buck on console isn't the same as bang for the buck on PC. I prefer the Nvidia experience on PC. I gladly pay an extra $50 for the better drivers you get with Nvidia.tormentosHow is it not? The 7970's cheapest model is like $379; the cheapest 680 GTX is like $459, and both basically perform the same. The RSX in the PS3 was a piece of crap compared to the Xenos, and yet it cost Sony more than what MS paid for the Xenos. Better drivers is a lame excuse to try to justify Nvidia's lame overpriced crap; MS dropped them because of that, Sony dropped them because of that, and I don't think any console maker will ever work with them again.. Nvidia, like Intel, sells way overpriced; the only difference is at least Intel has the upper hand on CPUs, while Nvidia doesn't really.
A GPU is nothing without the drivers. New drivers are constantly in the works and lots of money is spent on them.
Maybe this will shed some light on why people are willing to spend more on Nvidia: http://hardforum.com/showthread.php?t=1683315
[QUOTE="clyde46"]No, all the games are the same, unless you prove that just because you have a powerful GPU the game is a completely different game. Regardless of the graphics the game is the same, which is my point, versus your lame argument of why you would buy a more expensive PSU, when most PSUs out there will not work with the 690 GTX either.. Hell, your 580 barely plays Crysis 3 on very high at anything higher than 1680x1050; in fact at that resolution it plays it at 33 FPS, which is considered what to you, a slow frame rate? You and many other hermits.. http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,7.html I guess 1080p 30 FPS is OK when your card can't handle the heat, but the same on console is not good somehow.. First, I am not mad; second, I am not an AMD ass kisser like Ron; and third, the 7990 does beat the 690 GTX in several games. http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html No, the only thing you are a master of is being a sucker and falling for those high prices that don't match the performance they should deliver for their price. Hell, two 7950s in CrossFire actually deliver the same performance the 690 GTX does on Crysis 3, but cost $400 less. PC gamers are idiots..[QUOTE="tormentos"]
No it doesn't apply to every game, as every game is different. You are twisting this to try and fit your argument. I buy the best; two years ago that was the GTX580. Why would I want to spend less on a GPU that I would have to upgrade a year down the line? My 580 still plays games at max at 1080p. When the time comes that my 580 does not play all the games I want at max at 1080p, then I will upgrade again.
Your last comment doesn't even make much sense. You are just mad that AMD have failed to produce a card that can beat the 690. You keep harping on about the Titan being $1000, yet it's not even aimed at your section of the market. Just look at the time before Nvidia released the 600 series: AMD had the price of the 7970 at around £400, which is where the GTX 680 sits now. Only after Nvidia launched the GTX 680 did AMD drop the price of the 7970, because they know they can't compete on a performance level, so they beat Nvidia on prices. True, a 7970 is cheaper than a 680 and in some games comes within spitting distance of a 680, but at the end of the day, I am the master of my money and I choose what I buy. And currently, I see nothing in AMD's lineup that suits me.
tormentos
you need to read this:
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GeForce-GTX-Titan-GeForce-GTX-690-Radeon-HD-7990-HD-7970-CrossFi
And understand why all those reviews are flawed and why review sites are moving away from that method.
And if you can't be bothered reading:
The results shouldn't surprise you, and it won't surprise AMD any more either - if released today, the HD 7990 would not perform well in our tests. AMD has told me that they are working on an option to meter frames in the way that NVIDIA is doing it, but offering users the options to enable or disable it, but we are months away from that fix. Until then, any dual-GPU Radeon HD 7000-series cards are going to show these problems represented as runts and dropped frames. We have many more pages of results to go over for the HD 7950/7870/7850/etc and those will be published in the coming days - but the story will look very similar as you'll find.
In all honesty, when AMD told me they were planning this card release I recommended they hold off until its driver fix is in place - myself and other reviewers are going to be hitting them hard on these issues until then, and any dual-GPU option with the Radeon name is going to struggle to live up to any placed expectations. PCper
[QUOTE="MK-Professor"]meanwhile tormentos is stuck at 1280x720 low settings and 30fps:PtormentosReally? Last time I heard, Killzone PS4 was 30FPS at 1080p, not 720p. Maybe you are going by BF4 rumors, in which case I will have to ask you what makes you think the 560Ti will run BF4 maxed out at 1080p..:lol: It can't even max out Crysis 3 at 1080p; not even the 690 GTX can at 60FPS..
You keep acting as if you own a PS4 and have access to those games...:?
Dude stop i know what drivers are i am not a damn newbie to GPU,in fact one of my first GPU's in the late 90's was a rage,after that most of the GPU i have own had bee from Nvidia,including their horrible 5000 series,down to the last one i bough like several years ago the 240GT.. I owned 2 Radeons as well and i did not had many issues with drivers on either cards.Gpu is nothing without the drivers. New drivers are constantly in the works and lots of money is spent on them.
Maybe this will shed some light on why people are willing to spend more on Nvidia http://hardforum.com/showthread.php?t=1683315
Cranler
[QUOTE="tormentos"][QUOTE="clyde46"] No, all the games are the same, unless you prove that just because you have a powerful GPU the game is a completely different game. Regardless of the graphics the game is the same, which is my point vs your lame argument of why you would buy a more expensive PSU, when most PSUs out there will not work with the 690GTX either.. Hell, your 580 barely plays Crysis 3 on very high at anything higher than 1680x1050; in fact at that resolution it plays it at 33 FPS, which is considered what to you, a slow frame rate? You and many other hermits.. http://www.guru3d.com/articles_pages/crysis_3_graphics_performance_review_benchmark,7.html I guess 1080p 30 FPS is ok when your card can't handle the heat, but the same on console is not good somehow.. First, i am not mad; second, i am not an AMD ass kisser like Ron; and 3rd, the 7990 does beat the 690GTX in several games. http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html No, the only master you have is for being a sucker and falling for those high prices that don't equate to the performance they should deliver for their cost. Hell, 2 7950s in CrossFire actually deliver the same performance the 690GTX does in Crysis 3 but do so costing $400 less; PC gamers are idiots..darksusperia
you need to read this:
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GeForce-GTX-Titan-GeForce-GTX-690-Radeon-HD-7990-HD-7970-CrossFi
And understand why all those reviews are flawed and why review sites are moving away from that method.
And if you can't be bothered reading:
The results shouldn't surprise you, and it won't surprise AMD any more either - if released today, the HD 7990 would not perform well in our tests. AMD has told me that they are working on an option to meter frames in the way that NVIDIA is doing it, but offering users the options to enable or disable it, but we are months away from that fix. Until then, any dual-GPU Radeon HD 7000-series cards are going to show these problems represented as runts and dropped frames. We have many more pages of results to go over for the HD 7950/7870/7850/etc and those will be published in the coming days - but the story will look very similar as you'll find.
In all honesty, when AMD told me they were planning this card release I recommended they hold off until its driver fix is in place - myself and other reviewers are going to be hitting them hard on these issues until then, and any dual-GPU option with the Radeon name is going to struggle to live up to any placed expectations. PCper
Yes, it shows they are moving away when several of those tests were done some weeks ago... Now Tomshardware's tests are flawed..:lol:[QUOTE="darksusperia"][QUOTE="tormentos"]
you need to read this:
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GeForce-GTX-Titan-GeForce-GTX-690-Radeon-HD-7990-HD-7970-CrossFi
And understand why all those reviews are flawed and why review sites are moving away from that method.
And if you can't be bothered reading:
[QUOTE="PCper"]The results shouldn't surprise you, and it won't surprise AMD any more either - if released today, the HD 7990 would not perform well in our tests. AMD has told me that they are working on an option to meter frames in the way that NVIDIA is doing it, but offering users the options to enable or disable it, but we are months away from that fix. Until then, any dual-GPU Radeon HD 7000-series cards are going to show these problems represented as runts and dropped frames. We have many more pages of results to go over for the HD 7950/7870/7850/etc and those will be published in the coming days - but the story will look very similar as you'll find.
In all honesty, when AMD told me they were planning this card release I recommended they hold off until its driver fix is in place - myself and other reviewers are going to be hitting them hard on these issues until then, and any dual-GPU option with the Radeon name is going to struggle to live up to any placed expectations. tormentos
Yes, it shows they are moving away when several of those tests were done some weeks ago... Now Tomshardware's tests are flawed..:lol: did you not read? those tests were done by pcper on march 15th. its taken them this long due to the amount of testing they have done. if you cared to read, instead of being a smartass, you'd see that there are issues with crossfire setups at the driver level, which AMD knows about and is fixing. Fraps and other "framerate average" based testing is flawed and doesn't show the whole picture. Everything you linked is flawed and therefore wrong. It substantiates the claims of the drivers being better on Nvidia's side, producing smoother gameplay, and AMD knows it.[QUOTE="cfisher2833"]You keep acting as if you own a PS4 and have access to those games...:?tormentosNot having a PS4 hasn't stopped Hermits from talking sh** about the PS4..Just saying.. :lol: you think stating facts and most probable outcomes is bashing? compared to your hyping, that would seem like bashing :lol:
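For anyone wondering what review sites moved to instead of FRAPS-style averages: frame-time analysis, often reported as percentile frame times. A rough sketch with made-up numbers (not measurements from any card discussed here) shows why a high percentile exposes stutter that the median hides:

```python
def percentile_frame_time(frame_times_ms, pct):
    """Frame time (ms) at the given percentile, nearest-rank method:
    the smallest value such that pct% of frames finished at or below it."""
    ordered = sorted(frame_times_ms)
    rank = max(1, int(round(pct / 100.0 * len(ordered))))
    return ordered[rank - 1]

# Hypothetical capture: mostly steady ~16.7 ms frames with a few 50 ms spikes.
capture = [16.7] * 95 + [50.0] * 5

print(percentile_frame_time(capture, 50))   # 16.7 -- the median looks fine
print(percentile_frame_time(capture, 99))   # 50.0 -- the spikes show up here
```

The point is the same one the post makes against average-based testing: two cards with equal average FPS can have very different 99th-percentile frame times, and only the latter tracks how smooth the game actually feels.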