Recently, there's been a sharp pro-NVIDIA trend on this forum (mainly thanks to Thinker_145--cheers, man!), which has caused many to have warped perceptions of the performance of ATi Radeon graphics cards, specifically the HD2000 and HD3800 series.
Thinker_145 has made some ridiculous claims throughout his time posting here on the PC Hardware forum: the 2900 XT being inferior to the 8800 GTS 640MB, CRTs obsoleting LCDs, questioning why people build their own PCs, and claiming that the 8800 GTS 640MB is better than the 8800 GT (512MB). But perhaps his most recent inflammatory claim is that the HD3870 is somehow not worth the price given the performance it provides.
His argument is that the HD3870 performs extremely poorly in games, particularly when AA and AF are applied. This simply is not the case, and I'm here today to disprove that idea.
His habit of flame baiting (whether it's deliberate flame bait or just confusion is still unknown) has attracted the attention of the regulars on this board, such as LordEC911, Indestructible2, wklzip and myself (as well as countless others who have challenged him time and time again). Why should you care, I hear you ask? Well, the sad thing is that some people (newbies to hardware) actually believe Thinker_145 and end up making poor decisions when purchasing their hardware. In recent memory, Thinker_145 recommended an 8800 GT to an OP who then had to settle for a 19" LCD (1280x1024) because of a tight budget. Clearly, had he gone with a cheaper alternative such as the HD3870 paired with a 22" WS LCD, he would have had a much better gaming experience, IMO.
You may have seen Thinker_145's recent signature:
Nvidia the way it's meant to be played. ATI the way 3dmark is meant to be played. - Thinker_145
Which is funny because, back in the GeForce FX vs Radeon 9000 days, it was NVIDIA who was caught aggressively "optimising" its drivers, especially for 3DMark03 (as well as games). I quote Wikipedia's page on the GeForce FX:
Questionable tactics
NVIDIA's GeForce FX era was one of great controversy for the company. The competition had soundly beaten them on the technological front and the only way to get the FX chips competitive with the Radeon R300 chips was to optimize the drivers to the extreme.
NVIDIA historically has been known for their impressive OpenGL driver performance and quality, and the FX series certainly maintained this. However, with regard to image quality in both Direct3D and OpenGL, they aggressively began various questionable optimization techniques not seen before. They started with filtering optimizations by changing how trilinear filtering operated on game textures, reducing its accuracy, and thus visual quality. Anisotropic filtering also saw dramatic tweaks to limit its use on as many textures as possible to save memory bandwidth and fillrate. Tweaks to these types of texture filtering can often be spotted in games from a shimmering phenomenon that occurs with floor textures as the player moves through the environment (often signifying poor transitions between mip-maps). Changing the driver settings to "High Quality" can alleviate this occurrence at the cost of performance.
NVIDIA also began to clandestinely replace pixel shader code in software with hand-coded optimized versions with lower accuracy, through detecting what program was being run. These "tweaks" were especially noticed in benchmark software from Futuremark. In 3DMark03 it was found that NVIDIA had gone to extremes to limit the complexity of the scenes through driver shader changeouts and aggressive hacks that prevented parts of the scene from even rendering at all. This artificially boosted the scores the FX series received. Side by side analysis of screenshots in games and 3DMark03 showed noticeable differences between what a Radeon 9800/9700 displayed and what the FX series was doing. NVIDIA also publicly attacked the usefulness of these programs and the techniques used within them in order to undermine their influence upon consumers. It should however be noted that ATI also created a software profile for 3DMark03. In fact, this is also a frequent occurrence with other software, such as games, in order to work around bugs and performance quirks. With regards to 3DMark, Futuremark began updates to their software and screening driver releases for these optimizations.
Both NVIDIA and ATI have optimized drivers for tests like this historically. However, NVIDIA went to a new extreme with the FX series. Both companies optimize their drivers for specific applications even today (2007), but a tight rein and watch is kept on the results of these optimizations by a now more educated and aware user community.
(Source: Wikipedia's GeForce FX article)
Anyway, the point of this thread isn't to take cheap shots at Thinker_145, but to inform and dispel some of the bad rep that the HD3870 is receiving.
I'll be using TechPowerUp's review of the Sapphire HD3870 for benchmark purposes, while borrowing architecture info from Beyond3D, AnandTech, PC Perspective and The Tech Report.
Let's go!
The RV670 GPU
The ATi Radeon HD3870 is powered by the RV670 GPU design:
As many of you may have noticed, this is the same underlying design used by AMD/ATi in the R600 GPU which featured as the HD2900 XT. However, AMD have done some tweaks to the R600 design:
- Shrunk the die (an optical shrink) from 80nm to 55nm at TSMC.
- Reduced the memory bus width from a 512-bit external bus (1024-bit internal ring bus) to a 256-bit external bus (512-bit internal ring bus).
- Made various tweaks to the memory controller and setup engine to keep per-clock performance at R600 levels or higher.
- Provided full hardware decode of HD content (AMD's UVD).
- Added DirectX 10.1 support.
Further, AMD has included support for CrossFireX and PCI-E 2.0. Most games and applications won't make use of these features yet, but they're good to have nonetheless and add some headroom for the future.
So all in all, we can consider RV670 a die shrink of R600 with a few tweaked features--much like G92 is to G80.
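To put the narrower bus into perspective, here's a quick back-of-the-envelope bandwidth sketch (peak bandwidth = bus width in bytes x effective memory clock). The memory clocks are the commonly quoted reference speeds, so treat this as a rough estimate rather than gospel; retail cards ship at all sorts of clocks.

# Rough peak memory bandwidth from bus width and effective memory clock.
# Clocks below are the commonly quoted reference speeds (an assumption, not measured data).
def peak_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    # bytes per transfer * transfers per second -> GB/s
    return (bus_width_bits / 8.0) * effective_clock_mhz * 1e6 / 1e9

cards = [
    ("HD2900 XT", 512, 1650),  # 512-bit external bus, ~1650MHz effective GDDR3
    ("HD3870",    256, 2250),  # 256-bit external bus, ~2250MHz effective GDDR4
]

for name, bus_bits, clock_mhz in cards:
    print("%-10s ~%.0f GB/s" % (name, peak_bandwidth_gbs(bus_bits, clock_mhz)))

That works out to roughly a third of the peak bandwidth gone on the HD3870, which makes the benchmark results below all the more interesting.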
Test Bench and Results!
Techpowerup's test bench:
CPU: Core 2 Duo E6550
Motherboard: Gigabyte P35C-DS3R
RAM: 2 x 1GB A.DATA DDR2-1066
HDD: Western Digital Raptor 74GB
OS: Windows XP w/ SP2
Driver Suite: NVIDIA 169.04/ATi Catalyst 7.11
The CPU looks to be on the slow side, but since I'm only going to use high-resolution benchmarks with 4xAA and 16xAF, it won't pose much of a problem. Using a motherboard without PCI-E 2.0 does seem a little odd, though.
I'd also like to note that the focal point of this review is comparing the 8800 GT to the HD3870 in terms of price/performance. I know you've probably seen a lot of comparisons of these two cards, but it doesn't hurt to take another look at two of the most popular cards on the market (or at least on this forum).
Anyways, here are the results:
I'd like to note a few things here:
- This is the DX9 version of the game set to max in game settings.
- Notice how the HD3870 keeps up with the HD2900 XT, despite having roughly a third less memory bandwidth (see the bandwidth sketch earlier)?
- Take note of how the HD3870 closes the gap with the 8800 GT as the resolution increases (there's a quick pixel-count sketch just below showing how fast the workload grows).
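For reference, here's that pixel-count sketch. Nothing fancy; it just shows how quickly the per-frame workload grows with resolution, which is why bandwidth and fill rate matter more as you scale up.

# Pixels per frame at the tested resolutions (1280x1024 included as a baseline).
resolutions = [(1280, 1024), (1600, 1200), (2048, 1536)]
base = 1280 * 1024.0

for w, h in resolutions:
    pixels = w * h
    print("%dx%d: %.2f million pixels per frame (%.1fx the 1280x1024 workload)"
          % (w, h, pixels / 1e6, pixels / base))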
This game was largely supported by NVIDIA (particularly through the "TWIMTBP" program), and it shows. The ATi cards are beaten quite soundly by NVIDIA's parts.
The game was set to "High" (obviously "Very High" was not possible as TPU uses XP).
The HD3870 manages to stay competitive with the 8800 GT at 1600x1200; however, it faces a large drop in performance when going to a larger resolution. Oddly enough, the HD2900 XT suffers the same problem.
It could be a driver problem as the Catalyst driver was 7.11.
Anyway, NVIDIA wins even in price/performance here, at least when you crank the resolution up.
:lol: R580 and G71 are beating R600 in Far Cry.
Most probably a driver issue.
The HD3870 does very well here. It manages to stay very competitive with the 8800 GT, especially as the resolution increases.
ATi's historically weak OpenGL performance seems to be taking a step in the right direction. The HD3870 does very well here, matching the HD2900 XT and keeping very close to the 8800 GT.
The 8800 GTX still shows its dominance throughout the benchmarks though.
The ball is in NVIDIA's court now, and it's an OpenGL title too!
I think we can conclude that there's a driver issue with NVIDIA. That, or Quake 4 really likes the R600 architecture.
Performance is all over the place for all the cards.
The HD2900 XT is getting beaten by the 7900 GTX, which in turn is getting beaten by an HD3870, which is getting beaten by an X1900 XTX.
Driver problem, no?
STALKER's performance patch has greatly improved performance across all cards.
The trend of the HD3870 closing the gap with the 8800 GT continues on.
The HD3870 does very well at 1600x1200, but it trips up at 2048x1536--the HD3850 beats the HD3870!
Odd indeed...
AA can't be applied due to the nature of Unreal Engine 3 (its rendering approach doesn't play nicely with MSAA under DirectX 9).
The benchmark continues the trend set by the other benchmarks--HD3870 closes the gap on the 8800 GT as the resolution increases.
Thinker_145's claim that ATi deceptively optimises their drivers for 3DMark06 is unfounded. The HD3870 is behind the 8800 GT, and the results here echo what we found in the in-game benchmarks.
Here's a nice conclusion chart:
Final Words
There we have it.
Obviously, this set of benchmarks has its limits due to outdated drivers (the latest for AMD are Catalyst 8.1 + Hotfix), but for the most part, I think they're still quite accurate.
So what's better: the 8800 GT or the HD3870? Well, that's up to you. Do you think the performance of the 8800 GT warrants the extra cost over the HD3870? Do you have an nForce-powered motherboard and are you willing to try SLi? Do you have a CrossFire-capable motherboard? You need to ask yourself these questions, and take into account all the factors of your PC, before making a firm decision. Either way, you can't lose. If you go with the 8800 GT, you get higher performance but at a higher price. If you go with the HD3870, you still get a performance product, with a little extra left in your wallet.
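If you want to put rough numbers on that trade-off, a simple FPS-per-dollar sketch does the job. To be clear, the prices and frame rates below are made-up placeholders for illustration only, not figures from this review; plug in current street prices and the benchmark numbers for the games and resolution you actually play at.

# Hypothetical price/performance comparison. All numbers are placeholders, not data from this review.
def fps_per_dollar(avg_fps, price_usd):
    return avg_fps / price_usd

cards = [
    ("8800 GT 512MB", 60.0, 249.0),  # (assumed average FPS, assumed street price in USD)
    ("HD3870 512MB",  54.0, 219.0),
]

for name, fps, price in cards:
    print("%-15s %.3f FPS per dollar" % (name, fps_per_dollar(fps, price)))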
One thing we can't deny is AMD's complete lack of a halo high-end product, and maybe this is why they get pegged with negative opinions. This should change with the HD3870 X2, though, which uses a CrossFire subsystem to achieve enthusiast-class high-end performance. NVIDIA is preparing a counter-attack that uses two G92 chips in an SLi subsystem packaged as a single card, too. BTW, the HD3870 X2 launches on January the 23rd, so look out for hardware reviews if you're interested.
Anyway, I thank anyone who took the time to read this, and I hope you can take something positive from it.
Ciao!