Poeticinsomniac's comments


Edited By Poeticinsomniac

I always have to take game performance reviews like this with a grain of salt, you know, since the results may be financially motivated. But when the review overlooks the basic functionality of the hardware being tested, it's just laughable. On top of which, I'm so incredibly sick of seeing 64-bit hardware reviewed on 32-bit operating systems, which is part of what's laughable here: a 32-bit operating system can address at most 4 GB in total, and after device reservations 32-bit Windows typically leaves only about 3-3.5 GB of it usable. It doesn't matter how much memory the motherboard supports; you can have 16 gigs installed, and 32-bit XP and 32-bit Vista will still only make use of around 3 gigs of it.

Aside from that, who really installs RAM in odd amounts? AMD socket 939 motherboards, for example, require you to have 1, 2, or 4 sticks installed. For 3 gigs you would need two 1 GB sticks and two 512 MB sticks, and I am reasonably sure that would limit you to single-channel mode, which is not only pointless but brain-damaged. Yes, you can argue that the third gig of memory can be used for AGP texture swap space, but that was really only beneficial back when 128 MB was "high-capacity" graphics card memory.

Another difference between 32-bit and 64-bit when running AMD chips: AMD, amazingly, performs much better running 64-bit software on a 64-bit operating system. By better I mean a 15-20% performance increase running 64-bit. But apply the same test to Intel chips and the opposite seems to be true. Just the same, ATI generally performs better in DX10 games than NVIDIA, and more so when running DX10 in 64-bit Vista. My HD2900XT and socket 939 Toledo-core 4400 X2 clocked to 3.6 GHz can still play Crysis in DX10 on Ultra High settings with 8xAA at 1280x1024, running 64-bit Vista with 4 gigs of DDR at 630 MHz. But then I also notice that Crysis sucks up about 2 gigs of memory in 64-bit Vista. I seem to have forgotten my specific point as I get more and more aggravated at the sheer idiocy exhibited in this review, so I just recommend it be ignored.
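The address-space arithmetic behind that point can be sketched in a few lines. This is just an illustration of the limit, not a measurement; the 0.75 GiB figure for device reservations (PCI/AGP apertures, the graphics card's memory window) is an assumed example, since the exact amount varies by motherboard and GPU.

```python
# A flat 32-bit pointer can address 2**32 bytes = 4 GiB total.
# 32-bit Windows carves device apertures out of that same 4 GiB,
# which is why ~3-3.5 GiB of installed RAM is typically usable.

GIB = 1024 ** 3  # bytes in one gibibyte

def addressable_gib(pointer_bits: int) -> float:
    """Total GiB a flat pointer of the given width can address."""
    return (2 ** pointer_bits) / GIB

total_32bit = addressable_gib(32)          # 4.0 GiB hard ceiling
device_reservation = 0.75                  # assumed example, varies by board
usable_ram = total_32bit - device_reservation

print(f"32-bit ceiling: {total_32bit:.0f} GiB, "
      f"usable RAM with {device_reservation} GiB reserved: {usable_ram:.2f} GiB")
```

So even 16 GB of installed memory changes nothing under a 32-bit OS: the ceiling is the pointer width, not the motherboard.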


Edited By Poeticinsomniac

Why do so many people have a problem understanding the simple fact that the HD2900XT is ATI's MID-RANGE DX10 card? The HD2900 has a price tag of $380 and comes with the Half-Life 2 Orange Box voucher, which is something like $50-80 of software. The 8800GTX is something like $500, and the GTX Ultra is $700+. The GTS and GTX cards also have an additional 128-256 MB of memory. The HD2900 was meant to compete with the 640 MB GTS card, yet in numerous tests it comes in close to the GTX or surpasses it in performance. There are almost no DX10 games available to accurately compare performance between the cards, but in terms of price the HD2900 is better, as well as a hell of a lot more innovative than the 8800. It isn't the slaughter over NVIDIA most were expecting, but it's impressive considering it's only their mid-range card and it competes with NVIDIA's high-end hardware for half the price.

SP33doh saying that the 8800 is the best DX9 card out is stupid. Plain and simple, stupid. It would be pretty sad if the next-gen card couldn't outperform the previous generation of hardware. With the X1950XT costing around $200 now, it is the best DX9 card available; mine can still run every DX9 game out on ultra settings at 1600x1200, and most at 2048x1536. It's not just the video card: the rest of your system has to have good hardware as well.

Why the hell should AMD/ATI rush hardware that isn't needed? With no DX10 games around there isn't any need there, and considering the quad cores are being optimized for 64-bit Vista (they're working with Microsoft to do so), why should they rush those when everyone running a 64-bit rig is still using XP or 32-bit Vista? There isn't going to be any more 32-bit software or operating-system support come 2008, and since NVIDIA and Intel are focusing on the 32-bit interface, I suspect they're going to be screwed once that happens.
So why must every fanboy crawl out of the woodwork to boast about spending $1500-2000 every six months on the latest and greatest video cards and CPUs, to use software on a five-year-old operating system while running their monitor at 1280x1024? Up the resolution, get a 64-bit OS to go with your 64-bit hardware, and then run some tests and see who fares best. BTW, until NVIDIA came out with the 8800 line, they were getting their teeth kicked in ever since ATI came out with the X850, just like AMD was destroying Intel from the release of socket 939 until Conroe came out. But Intel and NVIDIA are just pumping out new shiny crap that no one needs, because buyers don't even use the software it was intended for, and because they know that once the software upgrade is made they'll once again be screwed, much like the people spending $1500 on GTX Ultras to play Warcraft or Half-Life 2 at 1280x1024 so they can say they get 300 FPS. Bravo.
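The price argument above can be made concrete with the post's own 2007 figures. This is only a back-of-the-envelope sketch: the voucher is valued at the midpoint of the quoted $50-80 range, and the Ultra's "$700+" is taken as $700; none of these are verified prices.

```python
# Effective street price once bundled software is counted, using the
# figures quoted in the post (assumptions noted above).

VOUCHER_VALUE = (50 + 80) / 2  # midpoint of the quoted $50-80 Orange Box range

effective_price = {
    "HD2900XT": 380 - VOUCHER_VALUE,  # voucher offsets the sticker price
    "8800GTX": 500,                   # no bundle counted
    "8800 Ultra": 700,                # post says "$700+"
}

for card, price in sorted(effective_price.items(), key=lambda kv: kv[1]):
    print(f"{card}: ${price:.0f} effective")
```

On those numbers the HD2900XT lands around $315 effective, a little under half the Ultra's quoted floor, which is the whole mid-range-versus-high-end point being made.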


Edited By Poeticinsomniac

OK, first: unless you have a Conroe CPU, Intel sucks, and has for years. Deal with it. I upgraded to a dual core in May, back when the AMD dual-core 4400 X2 and above were still $500-1100. I opted for an Opteron 165 with a Toledo core (same core as the FX-60) and found the right stepping digging through chips on eBay: $287 for the CPU, $199 for my DFI LP NF4 SLI-DR Expert, reusing my existing OCZ GE VX PC4000 RAM (4x512 MB). Added an X1800XTX 512 MB GDDR3 ATI card for $199.

Ended up with:
Memory divider 1:1 = 320 HTT, CPU voltage @ 1.425 V, RAM voltage @ 3.6 V, 2880 MHz on air, memory timings 2-2-2-8.
Memory divider 166/200 = 365 HTT, CPU voltage @ 1.5 V, RAM voltage @ 3.9 V, 3285 MHz on air, memory timings 2.5-2-2-5.

CPU temp doesn't get above 37 C under load and maintains 27-28 C at idle; memory bandwidth is between 8900-9500 MB/s. End result: for about $700 I ended up with a completely new system that outperformed the FX-60 by about 500 MHz, for $300-400 less than if I had just bought an FX-60. It outperforms every other CPU except the Conroes and has better memory performance than AM2 with DDR2. Against any other Intel chip, especially anything in the P4 line, it performs at least 40% better. The single ATI X1800XTX has better performance than the majority of the NVIDIA cards in SLI mode.

I also have an Opteron 146 with the same RAM, motherboard, etc., clocked at 3.2 GHz on air; that chip cost $90 off eBay. I've also gotten a mobile Barton 2400+ up to 3047 MHz (265x11.5) on a DFI LP Rev B NF2 with the same OCZ RAM (with the aid of an OCZ RAM booster), plus five other mobile Barton chips and a multiplier-locked Athlon XP 3200+ on various boards (Epox 8RDA3+, Gigabyte GA700N 400 NF2, DFI Ultra Infinity, an Abit, and several Asus boards as well, all NF2 chipsets), clocked between 2.65 and 2.98 GHz on every one of them. Each chip had only a hefty heatsink, usually a Thermaltake Tornado or Big Typhoon, and none of the Barton chips required more than 1.875 V on the CPU.
All of those chips outperformed the P4, with substantially better memory performance than the Intel boards with DDR2. From all I've seen, Intel finally did some work and made a good chip with Conroe, but I'll be waiting until January when the 65 nm quad-core AMD chips come out; four cores on a single die will, I suspect, be a bit more impressive than two dual cores in a single package. For now I tend to think of the current AM2 chips the same way I think of the socket 754 chips: simply something to bridge the gap between the real products.
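The overclock numbers quoted above all follow from core clock = reference clock (HTT) x multiplier. The multipliers aren't stated in the posts, so they are inferred here: 9x reproduces the Opteron 165 figures, and 11.5x is given explicitly for the Barton (265x11.5). The memory-divider line is a first-order approximation of how a 166/200 divider keeps the DDR below the HTT clock.

```python
# Core clock = HTT reference clock x CPU multiplier.
# Multipliers below are inferred from the quoted results, not stated in the post.

def core_clock_mhz(htt_mhz: float, multiplier: float) -> float:
    return htt_mhz * multiplier

# Opteron 165 runs quoted above (9x multiplier assumed):
assert core_clock_mhz(320, 9) == 2880     # 1:1 divider run
assert core_clock_mhz(365, 9) == 3285     # 166/200 divider run

# Mobile Barton 2400+ run, multiplier given as 265x11.5:
assert core_clock_mhz(265, 11.5) == 3047.5  # post rounds to 3047 MHz

# First-order sketch of the memory divider: RAM base clock scales down
# by the divider ratio, letting the CPU push a higher HTT while the DDR
# stays within spec. (Real 939 boards derive this from the CPU clock,
# so treat this as an approximation.)
def mem_clock_mhz(htt_mhz: float, divider: float) -> float:
    return htt_mhz * divider

print(round(mem_clock_mhz(365, 166 / 200)))  # ~303 MHz base clock
```

That ratio is why the 166/200 run could hit 365 HTT: the CPU gains roughly 400 MHz while the RAM base clock drops back near where the 1:1 run had it.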