I have a uni meeting to attend today, but I'll quickly address your points, Thinker.
First of all, how is it a "ridiculous claim" to say that the 640 GTS is better than the 2900 XT? I have proven it, so why don't you disprove it or just accept it to be true.
Thinker
It's a ridiculous claim because it's not true.
You haven't proven anything, even now, a few months after the R600's launch.
These benches were done by www.techreport.com on the launch day of the HD 2900 XT. Take note that the review was done with the beta launch drivers.
![](http://www.techreport.com/r.x/radeon-hd-2900xt/stalker.gif)
The 8800 GTS wins here, though the CrossFire scaling is worth noting.
![](http://www.techreport.com/r.x/radeon-hd-2900xt/supcom-2560.gif)
The HD2900 XT wins here.
However, it can be seen that NVIDIA was having SLI driver issues. Tech Report uses a Windows Vista test suite, and at the time NVIDIA had various problems optimising SLI for Vista.
![](http://www.techreport.com/r.x/radeon-hd-2900xt/bf2142.gif)
The 8800 GTS wins narrowly here.
The alpha driver makes a noticeable difference in performance here, but it isn't enough to take the performance crown.
CrossFire looks strong however.
![](http://www.techreport.com/r.x/radeon-hd-2900xt/hl2-2560.gif)
AMD takes a good lead here.
![](http://www.techreport.com/r.x/radeon-hd-2900xt/oblivion-aa.gif)
With beta drivers, the HD2900 XT is unable to beat the 8800 GTS. However, the alpha driver set pushes the HD2900 XT ahead of the 8800 GTS.
![](http://www.techreport.com/r.x/radeon-hd-2900xt/r6-vegas.gif)
AMD snatches the win from the 8800 GTS, and trails the 8800 GTX by a small margin.
-
The main reason people on this forum decided that the HD 2900 XT was better than the 8800 GTS (performance either "see-sawed" between the two cards or was neck and neck) was that AMD also bundled a very generous extra with the HD 2900 XT: Valve's Black Box, which saved the consumer a further $50. AMD also offered a better solution for media enthusiasts who may have had their PC connected to an HD TV by providing a DVI-HDMI adapter (without the need for cabling to a sound processor, as the HD 2900 XT output its own sound).
You missed conversations regarding the HD 2900 XT vs 8800 GTS because you weren't around at the time, Thinker.
I "never" said that the old 640GTS is better than the 8800GT.There is a new 112 stream processor 640MB GTS for $270 at newegg and that has the possibility of being better but my card certainly isnt.And i also say that the 8800GT aint no GTX like people say that it offers GTX level performance.
Thinker_145
:|
That is flame bait, IMO.
No one expects an 8800 GT to beat the 8800 GTX, especially when AA/HDR or the resolution is increased.
You think that a 3870 with a 22" monitor will provide a better gaming experience than an 8800 GT on a 19". Well, you are just saying that a 22" monitor is better than a 19", and everybody knows that gaming on a bigger screen is better. But you talk about bang for buck; consider this: "The bigger the screen, the more powerful your PC should be and the more upgrading you will have to do." That's a fact. I am not saying that everybody should play on small screens, but there is a reason why people play on small screens.
Thinker
What?
You just proved my point: playing on a 22" WS with an HD3870 is better than playing on a 19" with an 8800 GT.
Again, I will say that a 3870 with a 22" >>> an 8800 GT with a 19" as far as gaming experience is concerned. OK, did you notice that the guy was going with the Q6600? Will you not agree that an E8400 and 8800 GT with a 22" is far better than both the other options? ;)
Thinker
...except that the E8400 isn't out yet. Especially not in Australia.
But yes, depending on your game preferences, a good Wolfdale CPU will provide better performance in most of today's titles. However, software (games) is going multithreaded, so games such as Alan Wake, those developed on the UE3 engine, or those on the updated version of Source will perform better on a quad core.
And again, did you not notice how much money that guy was pumping in? Then why, simply why, would you advise somebody to buy a card which is at least 15% weaker? Maybe 15% doesn't matter to you, but it is important. And yes, I agree that the new GTS is worth $50 over a GT, unlike many other people.
Thinker
His rig wasn't balanced at all. He had a 19" LCD with a 620W PSU and an 8800 GT.
Buying a 22" WS LCD with a lower-wattage PSU and an HD3870 would provide a better experience.
The 3870 has largely the same price/performance as the 8800 GT, but I never argued this. And do you realise that the 8800 GTS 320MB had the best price/performance back in the day? Did that mean it was the best card and nobody should buy the other cards? Look at the 320 GTS now: it has started to struggle with some games. It cannot play Crysis smoothly at high settings, whereas all the other 8800 cards can (excepting the 256MB GT). This is why an extra reserve of performance in your card matters, just like in the case of the 3870 and 8800 GT.
Thinker
The 320MB GTS is limited in performance in Crysis because of its memory buffer, which many people were warned about before purchasing the card.
Also, it seems that all you care about is Crysis (after making a topic complaining about how Crysis sucks -_-). There are still plenty of new games where the 320MB GTS provides good performance.
And BTW, the 20x15 performance charts are irrelevant. Nobody plays at that res with a single card.
Thinker
It's a benchmark, Thinker; it's meant to show the relative performance of various graphics cards. Further, because of the E6550 CPU, TPU upped the resolution (and applied 4xAA/16xAF) to bypass the CPU bottleneck.
Also, the logic goes: "If this card can play game X at 20x15 comfortably, it should handle 16x10 easily."
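To put rough numbers on that logic, here is a quick back-of-the-envelope pixel count, assuming "20x15" means 2048x1536 and "16x10" means 1680x1050 (the usual 22" WS resolution); both are assumptions on my part, not figures from the charts:

```python
# Rough pixel-count comparison (resolutions assumed, see note above).
hi_res = 2048 * 1536   # ~3.15 million pixels at "20x15"
ws_res = 1680 * 1050   # ~1.76 million pixels at "16x10"
print(hi_res / ws_res) # ~1.8x fewer pixels to render at 16x10
```

So a card with playable frame rates at the higher resolution has a healthy margin left at the lower one.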
And my sig is not meant to be accusing ATI of anything. It is just meant to tell people not to care about 3DMark, because it doesn't matter and really is a poor reflection of real-world performance, even if it sometimes shows an accurate difference between the performance of two cards.
Thinker
Your sig was accusing ATi, because you specifically stated that ATi optimises for 3DMark06.
If you wanted to get your point across (like you are saying now), then you would've simply stated that 3DMark06 isn't accurate.
Finally, 3DMark06 is still very accurate. The cards attain scores relative to one another that reflect their real-world performance. Look at the benchmarks in my original post again.