MrWizard8800's comments

Edited By MrWizard8800

OK, well, I've got a couple of updates. We have a brand spankin' new DX10-code-path game: Lost Planet! Nice, huh? Well, not really. DX10 is an infant and is acting like one; the driver teams at both NV and AMD (ATI) have been waiting for publishers to come out with a working beta so they can test the thing. However! In Lost Planet the 2900 XT gives the 8800 GTS 640MB a run for its money. Check it out: http://www.firingsquad.com/hardware/lost_planet_demo_directx_10_performance/page5.asp

@windmill007 It all depends on what games you play and how you want to play them, but you're quite right, the Radeon 9000 series was a good time in ATI's history, and I'm sure it will be enough to play games as graphically intensive as Crysis. But I want to play it at my monitor's native res along with some AA, AF, and (if possible) some physics, all of which are things the 9800 isn't capable of.

@LightofHeroes I might have agreed with you a couple of years ago, but then the 6800 Ultra came out: the NV45, on TSMC's 130nm process, clocked at 450 MHz. When that card first came out, I had no idea how Nvidia had done it... the answer was, they hadn't. And then the 8800... I know someone who went through three (3) of the 6800 Ultras. Thank god EVGA was so willing to replace them. No overclocking, no volt modding or soft modding, nothing, completely stock, and we went through three of them. The fourth one seems to be holding out. And in light of recent events, which you may have even been a victim of, it shocks me that you would say NV's driver team is better than ATI's. The G80, coupled with Longhorn (Windows Vista's codename), was a HORRIBLE experience for the first SEVERAL weeks. I was BSODing every twenty minutes; after mutilating the BIOS and changing all sorts of things around in the name of stability, I could get it to run for a few hours at a time. I'm thankful SQL Server 2005 and other Windows development tools are so reliable; otherwise I could have lost hours and hours' worth of (highly paid) work. I suspect someone with a similar rig might not have been so lucky. It took far too long for Nvidia to come out with a stable driver for Vista. The end of 2006 will forever be remembered as a dark time in NV history.

Edited By MrWizard8800

@lysluk and RZappa31 I don't think this review gives an accurate representation of what this card can and cannot handle. It certainly shows what it can't handle, but... that's not enough. Don't get me wrong, my friend Jamie plays WoW at ~15 fps (6600 LE, T.T) and that's fine for him. A good review tests everything, different frame rates for different people; I personally play with this thing never leaving the 50s, and some people can't accept anything other than a constant 60. It's also important to remember that when you crank up the AA, your card effectively has to render at a higher resolution (putting more stress on the HD 2900 XT's already overstressed texture mapping units (TMUs)): it takes a few million extra samples, discards the oversized image, and then puts together an image at your resolution from those samples. So with ATI's new 24x custom-filter AA (CFAA), even at 1280x1024 you can expect the FPS to drop into the 30s. (The 512-bit memory bus is amazing, and is one reason this card can handle such high AA at all, but I think the texture mappers are holding it back.)
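To put some rough numbers on that, here's a quick back-of-the-envelope sketch. It assumes a naive supersampling model where every extra AA sample costs a full extra color sample (real MSAA/CFAA hardware is smarter than that), so treat it purely as an illustration of why the sample count, and with it the TMU and bandwidth load, balloons at high AA levels:

```python
# Back-of-the-envelope sample-count math for the supersampling-style model
# described above. Illustration only -- real MSAA/CFAA hardware does not
# shade every sub-sample -- but it shows how high AA multiplies per-frame work.

def samples_per_frame(width, height, aa_factor):
    """Total color samples produced for one frame under naive supersampling."""
    return width * height * aa_factor

def relative_cost(width, height, aa_factor):
    """Cost relative to rendering the same frame with no AA."""
    return samples_per_frame(width, height, aa_factor) / samples_per_frame(width, height, 1)

if __name__ == "__main__":
    res = (1280, 1024)  # typical 19" LCD native resolution
    for aa in (1, 4, 8, 24):
        total = samples_per_frame(*res, aa)
        print(f"{res[0]}x{res[1]} @ {aa}x AA: {total:,} samples "
              f"({relative_cost(*res, aa):.0f}x the no-AA work)")
```

Even in this crude model, 24x AA means roughly 24x the samples per frame, which is why the frame rate falls off a cliff long before the memory bus runs out of steam.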

Edited By MrWizard8800

I don't at all like how this review was done. Who games at 2048x1536? I've been in the PC business for three years, and I don't know of a single CRT that supports that resolution. On the LCD side, I suspect most of us are on 19" and 20" monitors, meaning we hang around in the 1280x1024 to 1600x1200 range. For the people who have the big bucks to spend on monitors, perhaps you own a 24" panel, in which case your native is 1920x1200, but testing at 2048x1536 is ridiculous, because so few of us game at that resolution. Furthermore, the sampling required for the AA causes the card to artificially boost the rendering resolution before display, which is what is murdering those frame rates. Which brings up another point: who games at 30 FPS? I can tell you right now, if you're willing to put $350 down on this card, you're not willing to play at 30 FPS.

If you want a better review, here are a few:
My personal favorite, FiringSquad's: http://www.firingsquad.com/hardware/amd_ati_radeon_hd_2900_xt_performance_preview/
Anand's: http://www.anandtech.com/video/showdoc.aspx?i=2988 and http://www.anandtech.com/video/showdoc.aspx?i=2990
Legit Reviews': http://www.legitreviews.com/article/503/1/
Xbit didn't review the card itself, but rather the architecture behind it: http://www.xbitlabs.com/articles/video/display/r600-architecture.html

This card is designed to compete with the 8800 GTS 640MB, which it will eventually do successfully. The biggest problem right now for the HD 2900 is the drivers; they're poor, to say the least. Typically, however, ATI driver support has been fabulous, unlike Nvidia's. And this review doesn't mention it, but ATI brings a new technology that can do a whole lot to improve performance if developers decide to use it: the tessellation engine.
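For anyone wondering what a tessellation engine actually does: the hardware takes coarse geometry and subdivides it on the GPU, so games get much denser meshes without the CPU having to submit all those extra triangles. Here's a toy sketch of the idea; it's just recursive midpoint subdivision, not anything resembling the R600's actual tessellator:

```python
# A toy illustration of tessellation: split a coarse triangle into smaller
# ones so surface detail can be added on the GPU. Not the R600's real
# tessellation unit, just the basic concept.

def subdivide(tri, levels):
    """Split a triangle into 4 via edge midpoints, repeated `levels` times."""
    if levels == 0:
        return [tri]
    a, b, c = tri
    mid = lambda p, q: tuple((pi + qi) / 2 for pi, qi in zip(p, q))
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    out = []
    for child in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        out.extend(subdivide(child, levels - 1))
    return out

if __name__ == "__main__":
    base = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
    for lvl in range(4):
        print(f"tessellation level {lvl}: {len(subdivide(base, lvl))} triangles")
```

The triangle count grows 4x per level in this sketch, which is the whole point: the developer ships simple geometry and the GPU fills in the detail, so it only pays off if developers actually target the feature.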