AMD: ''Nvidia are full of sh*t and asshurt''

This topic is locked from further discussion.


#301 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="tormentos"][QUOTE="Cranler"] Bang for the buck on console isn't the same as bang for the buck on PC. I prefer the Nvidia experience on PC. I gladly pay an extra $50 for the better drivers you get with Nvidia.Cranler

How is it not? The cheapest 7970 model is like $379, the cheapest 680 GTX is like $459, and both basically perform the same. The RSX in the PS3 was a piece of crap compared to Xenos, and yet it cost Sony more than what MS paid for Xenos. "Better drivers" is a lame excuse to try to justify Nvidia's overpriced crap. MS dropped them because of that, Sony dropped them because of that, and I don't think any console maker will ever work with them again. Nvidia, like Intel, sells way overpriced; the only difference is that at least Intel has the upper hand on CPUs, while Nvidia doesn't really.

A GPU is nothing without its drivers. New drivers are constantly in the works, and a lot of money is spent on them.

Maybe this will shed some light on why people are willing to spend more on Nvidia: http://hardforum.com/showthread.php?t=1683315

This poll indicates otherwise http://www.overclock.net/t/1351071/crossfire-7970-or-sli-680s


#303 Wickerman777
Member since 2013 • 2164 Posts

Regarding GPUs in general:

 

nvidia-ps4.jpg

 

What trips me out about that chart is how radically things have changed since the beginning of this gen. The GPUs in the Xbox and Xbox 360 were both ahead of the PC market at the times of their launches. But this time all the consoles are gonna be way behind PC right from the start. Heck, it's looking like the Nextbox's GPU won't even beat the one in the PS4, let alone computers. My, how things change.


#304 HaloPimp978
Member since 2005 • 7329 Posts

This thread is just getting silly :?

cfisher2833


#305 tionmedon
Member since 2006 • 468 Posts

[QUOTE="tionmedon"][QUOTE="tormentos"] http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html Such fanboys. The 7990 actually beats the 690 GTX in several games, and it's $100 cheaper. Not only that, two 7970s in CrossFire also beat the 690, and they cost $200 less than the 690 GTX. http://www.newegg.com/Product/Product.aspx?Item=N82E16814131483&Tpk=7990%20amd&IsVirtualParent=1 http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=100006662&isNodeId=1&Description=690gtx&x=0&y=0 People like you are greatly misinformed, and no, I am not an AMD fanboy like Ron.ronvalencia

You can't CrossFire the 7990; that would be six 8-pin connectors plus at least a 1200 W PSU. The 690 GTX is two 8-pin connectors with a 650 W PSU.

Reference 7990 (Malta) uses two 8-pin PCI-E power connectors i.e. four 8-pin PCI-E power connectors for two 7990.

http://www.xbitlabs.com/picture/?src=/images/news/2013-03/amd_radeon_hd_7990_malta_tahiti.jpg

AMD waited for lower power XT2 for 7990 Malta.

Newegg has a PowerColor AX 7990 6GB; it's three 8-pin power connectors with a minimum 850 W PSU. You lose.

#306 tionmedon
Member since 2006 • 468 Posts
[QUOTE="clyde46"][QUOTE="tormentos"][QUOTE="tionmedon"] .....u can not cf the 7990 that would b 6-8pin plus a least a 1200psu........690gtx is 2 -8pin and with a 650 psu...

And that is one of the lamest counterpoints ever: you are complaining about buying a beefier power supply for a GPU that costs $1,000. That, my friend, is silly...

Absolutely. Why would I buy another PSU just to run a 7990 when a 690 runs on a lesser one?


#307 topgunmv
Member since 2003 • 10880 Posts
[QUOTE="ronvalencia"]

[QUOTE="tionmedon"] .....u can not cf the 7990 that would b 6-8pin plus a least a 1200psu........690gtx is 2 -8pin and with a 650 psu...tionmedon

Reference 7990 (Malta) uses two 8-pin PCI-E power connectors i.e. four 8-pin PCI-E power connectors for two 7990.

http://www.xbitlabs.com/picture/?src=/images/news/2013-03/amd_radeon_hd_7990_malta_tahiti.jpg

AMD waited for lower power XT2 for 7990 Malta.

newegg has a powercolor ax 7990 6g it`s 3-8pin power with min 850 psu............... u lose.....

That thing has been out since last year... aka it's not a reference design.

#308 tionmedon
Member since 2006 • 468 Posts
[QUOTE="MK-Professor"]meanwhile tormentos is stuck at 1280x720 low settings and 30fps:Ptormentos
Really? Last time I heard, Killzone on PS4 was 30 FPS at 1080p, not 720p. Maybe you are going by BF4 rumors, in which case I'll have to ask what makes you think the 560 Ti will run BF4 maxed out at 1080p. :lol: It can't even max out Crysis 3 at 1080p; not even the 690 GTX can at 60 FPS.

My 690 GTX does 51 fps @ 2560x1600 and 78 fps @ 1920x1200.

#309 Cranler
Member since 2005 • 8809 Posts
[QUOTE="ronvalencia"]

[QUOTE="Cranler"]

How is not.? The 7970 cheapest model is like $379 the cheapest 680GTX is like $459 and both basically perform the same. The RSX on PS3 was a piece of crap compare to the Xenos,and yet it cost sony more than what MS pay for the Xenos. Better driver is a lame excuse to try to justify Nvidia lame ass over priced crap,MS drop them because of that,Sony drop them because of that and i don't think any console maker will ever work with them again.. Nvidia like Intel sell way over priced,the only difference is at least intel has the upper hand on CPU while Nvidia doesn't really.tormentos

Gpu is nothing without the drivers. New drivers are constantly in the works and lots of money is spent on them.

Maybe this will shed some light on why people are willing spend more on Nvidia http://hardforum.com/showthread.php?t=1683315

This poll indicates otherwise http://www.overclock.net/t/1351071/crossfire-7970-or-sli-680s

Lol, your poll compares 2 cards with 45 people surveyed. Mine compared drivers for all cards with 200 respondents. If AMD is so much better, then why are people still willing to spend more on Nvidia?

#310 Cranler
Member since 2005 • 8809 Posts
[QUOTE="tormentos"][QUOTE="Cranler"]

Gpu is nothing without the drivers. New drivers are constantly in the works and lots of money is spent on them.

Maybe this will shed some light on why people are willing spend more on Nvidia http://hardforum.com/showthread.php?t=1683315

Dude, stop. I know what drivers are; I am not a damn newbie to GPUs. In fact, one of my first GPUs in the late '90s was a Rage. After that, most of the GPUs I have owned have been from Nvidia, including their horrible 5000 series, down to the last one I bought several years ago, the 240 GT. I owned two Radeons as well, and I did not have many driver issues with either card.

Looks like you don't do much research before buying. It was all over the web how bad the 5000 series was.

#311 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="tionmedon"] .....u can not cf the 7990 that would b 6-8pin plus a least a 1200psu........690gtx is 2 -8pin and with a 650 psu...tionmedon

Reference 7990 (Malta) uses two 8-pin PCI-E power connectors i.e. four 8-pin PCI-E power connectors for two 7990.

http://www.xbitlabs.com/picture/?src=/images/news/2013-03/amd_radeon_hd_7990_malta_tahiti.jpg

AMD waited for lower power XT2 for 7990 Malta.

newegg has a powercolor ax 7990 6g it`s 3-8pin power with min 850 psu............... u lose.....

The PowerColor AX 7990 is not a "Malta" 7990 reference build; the reference design has two 8-pin PCI-E power connectors and dual-slot cooling. I even provided a screenshot of the "Malta" 7990 reference build.

u lose.


#312 SamiRDuran
Member since 2005 • 2758 Posts

Regarding GPUs in general:

 

nvidia-ps4.jpg

 

What trips me out about that chart is how radically things have changed since the beginning of this gen. The GPUs in the Xbox and Xbox 360 were both ahead of the PC market at the times of their launches. But this time all the consoles are gonna be way behind PC right from the start. Heck, it's looking like the Nextbox's GPU won't even beat the one in the PS4, let alone computers. My, how things change.

Wickerman777

No, they were not ahead of the PC market at the start of this gen. The graph compares the Xbox 360 GPU to a GeForce 7800, but more powerful GPUs were available at that time, such as the GeForce 7900 and ATI X1950. Nvidia graphs are always weird and inaccurate.


#313 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Cranler"]

Gpu is nothing without the drivers. New drivers are constantly in the works and lots of money is spent on them.

Maybe this will shed some light on why people are willing spend more on Nvidia http://hardforum.com/showthread.php?t=1683315

Cranler

This poll indicates otherwise http://www.overclock.net/t/1351071/crossfire-7970-or-sli-680s

Lol your poll compares 2 cards with 45 people surveyed. Mine compared drivers for all cards with 200 respondents. If amd is so much better then why are people still willing to spend more on Nvidia?

My link specifies 7970 CF vs 680 SLI and it did NOT include Cypress or Cayman based CF.


#314 ronvalencia
Member since 2008 • 29612 Posts

Regarding GPUs in general:

nvidia-ps4.jpg

What trips me out about that chart is how radically things have changed since the beginning of this gen. The GPUs in the Xbox and Xbox 360 were both ahead of the PC market at the times of their launches. But this time all the consoles are gonna be way behind PC right from the start. Heck, it's looking like the Nextbox's GPU won't even beat the one in the PS4, let alone computers. My, how things change.

Wickerman777

Before the 8800 GTX (185 watts, two PCI-E power connectors), the Radeon X1950 XTX had a 125-watt TDP and one PCI-E power connector. http://www.game-debate.com/hardware/index.php?gid=1054&graphics=Radeon%20X1950%20XTX

The next-gen consoles' TDP targets didn't scale with 2012/2013 flagship PC GPUs, i.e. ~200-watt parts.

Radeon X1950 XTX's single PCI-E power connector.

card_power.jpg


#315 Cranler
Member since 2005 • 8809 Posts
[QUOTE="ronvalencia"]

[QUOTE="Cranler"][QUOTE="ronvalencia"]

This poll indicates otherwise http://www.overclock.net/t/1351071/crossfire-7970-or-sli-680s

Lol your poll compares 2 cards with 45 people surveyed. Mine compared drivers for all cards with 200 respondents. If amd is so much better then why are people still willing to spend more on Nvidia?

My link specifies 7970 CF vs 680 SLI and it did NOT include Cypress or Cayman based CF.

Am I supposed to know what Cypress and Cayman are? The performance difference between the two cards is minute, and Nvidia has better drivers, plain and simple. AMD just recently started offering app profiles, and they're not nearly as well implemented as Nvidia's, I might add. By the time AMD gets adaptive vsync, OLEDs will be the standard and it won't even be a worthwhile feature at that point.

#316 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Cranler"] Lol your poll compares 2 cards with 45 people surveyed. Mine compared drivers for all cards with 200 respondents. If amd is so much better then why are people still willing to spend more on Nvidia?Cranler

My link specifies 7970 CF vs 680 SLI and it did NOT include Cypress or Cayman based CF.

Am I supposed to know what Cypress and Cayman are? The performance difference between the two cards is minute, and Nvidia has better drivers, plain and simple. AMD just recently started offering app profiles, and they're not nearly as well implemented as Nvidia's, I might add. By the time AMD gets adaptive vsync, OLEDs will be the standard and it won't even be a worthwhile feature at that point.

Consoles have been using adaptive vsync for some time.
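As a rough illustration of what adaptive vsync does (my own sketch with simplified logic, not any console's or driver's actual implementation): vsync stays on while frames fit inside the refresh budget, and is dropped when a frame misses it, trading a little tearing for less judder.

```python
# Toy adaptive-vsync policy: enable vsync only for frames that fit
# inside the display's refresh interval.

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms budget per frame

def vsync_enabled(frame_time_ms):
    """Vsync on when the frame fit inside the refresh budget, off otherwise."""
    return frame_time_ms <= REFRESH_INTERVAL_MS

frame_times = [12.0, 15.5, 22.0, 30.1, 16.0]
for ft in frame_times:
    state = "vsync ON " if vsync_enabled(ft) else "vsync OFF"
    print(f"{ft:5.1f} ms -> {state}")
```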

-------

AMD Cypress = Radeon HD 5850/5870 which uses VLIW5 stream processor design.

AMD Cayman = Radeon HD 6950/6970 which uses VLIW4 stream processor design.

AMD Tahiti = Radeon HD 7870 XT/7950/7970/8950-OEM/8970-OEM which uses SIMD stream processor design.


------

VLIW, http://en.wikipedia.org/wiki/Very_long_instruction_word

SIMD, http://en.wikipedia.org/wiki/SIMD

-----
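The practical difference between those VLIW designs and the later scalar/SIMD design can be sketched with a toy model (my own illustration; `vliw5_cycles` and the packing rule are assumptions for teaching, not AMD's actual scheduler): a VLIW5 bundle only fills all five slots when the shader offers enough mutually independent operations, so dependent scalar chains waste most of the hardware.

```python
# Toy model of why VLIW5 throughput depends on instruction-level
# parallelism (ILP). Each VLIW5 bundle issues in one cycle and can hold
# up to 5 ops, but only ops that are independent of each other.

def vliw5_cycles(independent_runs):
    """independent_runs: list of run lengths of mutually independent ops.
    Each run packs into ceil(run / 5) bundles, one bundle per cycle."""
    return sum(-(-run // 5) for run in independent_runs)

def vliw5_utilization(independent_runs):
    """Fraction of the 5 slots per cycle that carry real work."""
    total_ops = sum(independent_runs)
    return total_ops / (5 * vliw5_cycles(independent_runs))

# Lots of ILP: 10 independent ops pack into 2 full bundles (100% slots used).
print(vliw5_cycles([10]), vliw5_utilization([10]))
# Dependent chain: 10 ops that each wait on the previous one take
# 10 bundles, wasting 4 of 5 slots every cycle (20% utilization).
print(vliw5_cycles([1] * 10), vliw5_utilization([1] * 10))
```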


#317 darksusperia
Member since 2004 • 6945 Posts

[QUOTE="Cranler"][QUOTE="ronvalencia"]

My link specifies 7970 CF vs 680 SLI and it did NOT include Cypress or Cayman based CF.

ronvalencia

Am I supposed to know what Cypress and Caymen are? The performance difference between the 2 cards is minute and Nvidia has better drivers plain and simple. Amd just recently started offering app profiles and not nearly as well implemented as Nvidias I might add. By the time Amd gets adaptive vsync oleds will be the standard and it wont even be a worthwhile feature at that point.

AMD Cypress = Radeon HD 5850/5870 which uses VLIW5 stream processor design.

AMD Cayman = Radeon HD 6950/6970 which uses VLIW4 stream processor design.

AMD Tahiti = Radeon HD 7870 XT/7950/7970/8950-OEM/8970-OEM which uses SIMD stream processor design.

VLIW, http://en.wikipedia.org/wiki/Very_long_instruction_word

SIMD, http://en.wikipedia.org/wiki/SIMD

Doesn't change the fact that CrossFire drivers have issues, and it will be months before AMD releases the fix.

#318 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="Cranler"] Am I supposed to know what Cypress and Caymen are? The performance difference between the 2 cards is minute and Nvidia has better drivers plain and simple. Amd just recently started offering app profiles and not nearly as well implemented as Nvidias I might add. By the time Amd gets adaptive vsync oleds will be the standard and it wont even be a worthwhile feature at that point. darksusperia

AMD Cypress = Radeon HD 5850/5870 which uses VLIW5 stream processor design.

AMD Cayman = Radeon HD 6950/6970 which uses VLIW4 stream processor design.

AMD Tahiti = Radeon HD 7870 XT/7950/7970/8950-OEM/8970-OEM which uses SIMD stream processor design.

VLIW, http://en.wikipedia.org/wiki/Very_long_instruction_word

SIMD, http://en.wikipedia.org/wiki/SIMD

Doesn't change the fact that CrossFire drivers have issues, and it will be months before AMD releases the fix.

The majority of recent AA/AAA titles have AMD "Gaming Evolved" backing, with Battlefield 4 being the latest addition to the list.

For Radeon HD cards on PC, there's the third-party RadeonPro tool, which enables adaptive vsync. http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-11.html

Aside from a few dropped frames and a handful of spikes when the test changes scenes, our dual-Tahiti card enjoys much smoother sailing. In fact, the end result is often better than what you'd see from a single graphics card, with virtually no micro-stuttering left.

[images: frame-time detail charts, SLI adaptive vsync vs. CrossFire]


#319 darksusperia
Member since 2004 • 6945 Posts
[QUOTE="ronvalencia"]

[QUOTE="darksusperia"] I'm not talking about vsync; CrossFire has problems. http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-GeForce-GTX-Titan-GeForce-GTX-690-Radeon-HD-7990-HD-7970-CrossFi

[QUOTE="PCper"] The results shouldn't surprise you, and it won't surprise AMD any more either - if released today, the HD 7990 would not perform well in our tests. AMD has told me that they are working on an option to meter frames in the way that NVIDIA is doing it, but offering users the option to enable or disable it, but we are months away from that fix. Until then, any dual-GPU Radeon HD 7000-series cards are going to show these problems, represented as runts and dropped frames. We have many more pages of results to go over for the HD 7950/7870/7850/etc., and those will be published in the coming days - but the story will look very similar, as you'll find. In all honesty, when AMD told me they were planning this card release I recommended they hold off until the driver fix is in place - myself and other reviewers are going to be hitting them hard on these issues until then, and any dual-GPU option with the Radeon name is going to struggle to live up to any placed expectations.

[QUOTE="PCper"] Even though we only have NVIDIA results for 5760x1080 due to the extreme amount of dropped frames on the HD 7990 / HD 7970s, comparing these two options is interesting.

[QUOTE="PCper"] There are two takeaways from this first page of results. First, the AMD Radeon HD 7990 or HD 7970s in CrossFire are not going to compare well to the GTX Titan or GTX 690 in many cases because of the runt and dropped frame issues we have detailed.

http://www.anandtech.com/show/6862/fcat-the-evolution-of-frame-interval-benchmarking-part-1
http://techreport.com/review/24553/inside-the-second-with-nvidia-frame-capture-tools/6
http://www.overclockersclub.com/reviews/frame_capture_and__analysis_tools/

I'm not saying AMD is bad in general. I'm saying they have issues at the driver level with CrossFire, which has been proven over the last year of beta-testing this method (which PCPerspective was part of), and AMD have said they are implementing something similar to Nvidia's frame metering in the coming months.
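The runt/dropped-frame distinction PCPer describes can be sketched as a toy classifier (my own illustration with an assumed threshold value; the real FCAT pipeline analyzes captured video of per-frame colored overlay bars rather than a list of numbers):

```python
# A "runt" reaches the screen but occupies so little of the refresh
# that it adds no visible smoothness; a dropped frame never reaches the
# screen at all. FRAPS-style counters credit both as full frames.

RUNT_THRESHOLD_MS = 2.0  # illustrative cutoff, not PCPer's exact value

def classify_frames(display_times_ms):
    """display_times_ms: how long each rendered frame was on screen.
    Returns (delivered, runts, dropped) counts."""
    delivered = runts = dropped = 0
    for t in display_times_ms:
        if t == 0.0:
            dropped += 1          # never scanned out
        elif t < RUNT_THRESHOLD_MS:
            runts += 1            # scanned out, but too briefly to matter
        else:
            delivered += 1
    return delivered, runts, dropped

# Alternating full/runt frames: a naive FPS counter would report nearly
# double the frame rate the player actually perceives.
times = [31.0, 1.2, 30.5, 0.9, 31.4, 0.0]
print(classify_frames(times))  # (3, 2, 1)
```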


#320 AndersK
Member since 2005 • 396 Posts

I nearly spat out my coffee reading that the GPUs in the PS3 and Xbox were more powerful at launch than high-end PC cards. When will people realize that the stuff inside consoles is basically made by the exact same people who make regular computer hardware? It wouldn't make sense from a business standpoint.

I have never visited a forum with so much wrong information as GameSpot, and specifically System Wars. I mean, where do people come up with this shit?

Are people delusional? It's like claiming your stock Honda Civic beats a Ferrari at the lights. No, saying it enough times does not make it true.


#321 MK-Professor
Member since 2009 • 4218 Posts

[QUOTE="MK-Professor"]meanwhile tormentos is stuck at 1280x720 low settings and 30fps:Ptormentos
Really? Last time I heard, Killzone on PS4 was 30 FPS at 1080p, not 720p. Maybe you are going by BF4 rumors, in which case I'll have to ask what makes you think the 560 Ti will run BF4 maxed out at 1080p. :lol: It can't even max out Crysis 3 at 1080p; not even the 690 GTX can at 60 FPS.

My bad, I was under the assumption that you play games on the PS3 (or 360). I didn't know that you had a time machine. :|


#322 AzatiS
Member since 2004 • 14969 Posts
Butthurt Nvidia...

#323 Wickerman777
Member since 2013 • 2164 Posts

I nearly spat out my coffee reading that the GPUs in the PS3 and Xbox were more powerful at launch than high-end PC cards. When will people realize that the stuff inside consoles is basically made by the exact same people who make regular computer hardware? It wouldn't make sense from a business standpoint.

I have never visited a forum with so much wrong information as GameSpot, and specifically System Wars. I mean, where do people come up with this shit?

Are people delusional? It's like claiming your stock Honda Civic beats a Ferrari at the lights. No, saying it enough times does not make it true.

AndersK

 

It's what the graph says, and what the article said where I saw the graph. I don't friggin' remember what PC GPUs were like 8 years ago, lol. If the info is inaccurate, take it up with them. And if something as slight and meaningless as that caused you to "nearly spit out your coffee," then LMAO.


#324 AndersK
Member since 2005 • 396 Posts

[QUOTE="AndersK"]

I nearly spat out my coffee reading that the GPUs in the PS3 and Xbox were more powerful at launch than high-end PC cards. When will people realize that the stuff inside consoles is basically made by the exact same people who make regular computer hardware? It wouldn't make sense from a business standpoint.

I have never visited a forum with so much wrong information as GameSpot, and specifically System Wars. I mean, where do people come up with this shit?

Are people delusional? It's like claiming your stock Honda Civic beats a Ferrari at the lights. No, saying it enough times does not make it true.

Wickerman777

 

It's what the graph says, and what the article said where I saw the graph. I don't friggin' remember what PC GPUs were like 8 years ago, lol. If the info is inaccurate, take it up with them. And if something as slight and meaningless as that caused you to "nearly spit out your coffee," then LMAO.

The consoles had some really good-looking games when they came out, such as Gears of War, even compared to the PC games available at the time. That does not by any stretch mean the hardware was better, or ahead of its time. It simply means a talented developer chose the Xbox and not the PC.

.. And about the coffee: this is System Wars :p


#325 PC_Otter
Member since 2010 • 1623 Posts

[QUOTE="Wickerman777"]

[QUOTE="AndersK"]

I nearly spat out my coffee reading that the GPUs in the PS3 and Xbox were more powerful at launch than high-end PC cards. When will people realize that the stuff inside consoles is basically made by the exact same people who make regular computer hardware? It wouldn't make sense from a business standpoint.

I have never visited a forum with so much wrong information as GameSpot, and specifically System Wars. I mean, where do people come up with this shit?

Are people delusional? It's like claiming your stock Honda Civic beats a Ferrari at the lights. No, saying it enough times does not make it true.

AndersK

 

It's what the graph says, and what the article said where I saw the graph. I don't friggin' remember what PC GPUs were like 8 years ago, lol. If the info is inaccurate, take it up with them. And if something as slight and meaningless as that caused you to "nearly spit out your coffee," then LMAO.

The consoles had some really good-looking games when they came out, such as Gears of War, even compared to the PC games available at the time. That does not by any stretch mean the hardware was better, or ahead of its time. It simply means a talented developer chose the Xbox and not the PC.

.. And about the coffee: this is System Wars :p

The best PC GPUs of 8 years ago were in some ways as good as or better than what ended up in the 360 and PS3. For example, the Radeon X800 XT and X850, both clocked just over 500 MHz, have the same texture fillrate as Xenos (each has 16 TMUs), and the old Radeons had 16 ROPs, not 8 like Xenos. They also had more memory bandwidth. Yes, they lacked eDRAM, and versus a unified-shader Xenos they falter in overall performance, but to simply wave them off as completely obsolete when Xenos came out is a mistake. The X850 XT could actually come close to the X1800, which came out in October 2005, right before the 360 released. The X1800 XT is as fast as the Nvidia 7800 GTX, of which the PS3's RSX is a gimped version, and it was released a whole year earlier! The first X1900s, released in January 2006, would tear Xenos to shreds if coded close to the metal the way Xenos is in the 360.
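The fillrate claim above is just TMU count times core clock; a quick sketch using the approximate numbers quoted in the post (clocks rounded, my own helper name):

```python
# Peak texture fillrate = TMUs x core clock.
def texture_fillrate_gtexel(tmus, clock_mhz):
    return tmus * clock_mhz / 1000.0  # GTexels/s

print(texture_fillrate_gtexel(16, 520))  # X850 XT-class: ~8.3 GTexels/s
print(texture_fillrate_gtexel(16, 500))  # Xenos: ~8.0 GTexels/s
```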

#326 ronvalencia
Member since 2008 • 29612 Posts

I'm not saying AMD is bad in general. I'm saying they have issues at the driver level with CrossFire, which has been proven over the last year of beta-testing this method (which PCPerspective was part of), and AMD have said they are implementing something similar to Nvidia's frame metering in the coming months.

darksusperia

For the micro-stutter issues, they haven't used the third-party RadeonPro tool.

http://www.radeonpro.info/2013/03/crossfire-microstutter-afterburner-vs-radeon-pro/

Before RadeonPro

borderlands2afterburner.png

After RadeonPro

borderlands2radeonpro.png


The consoles don't have to worry about DX's issues or managing two GPUs.

From http://www.tomshardware.com/reviews/geforce-gtx-titan-performance-review,3442-3.html

For a single-GPU setup, the 7970 GE has better frame-latency results than the Titan and GTX 680.

bf3-2560-latency.png

From http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-6.html

1920-high-ver.png

The latency issue can be a double-edged sword.
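A small sketch of why those latency charts carry information an average-FPS number hides (illustrative frame times, not measured data): two runs with the same average frame time can have very different high-percentile frame times, and it's the long frames the player feels as stutter.

```python
# Average frame time vs. a high percentile: the percentile exposes the
# alternating long frames that the average completely smooths over.

def avg(xs):
    return sum(xs) / len(xs)

def p95(xs):
    """95th-percentile frame time (simple nearest-rank estimate)."""
    s = sorted(xs)
    return s[min(len(s) - 1, int(0.95 * len(s)))]

smooth  = [16.7] * 100         # steady ~60 fps
stutter = [8.0, 25.4] * 50     # same ~16.7 ms average, alternating

print(avg(smooth), avg(stutter))   # both ~16.7 ms
print(p95(smooth), p95(stutter))   # 16.7 vs 25.4 ms
```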