2,304,000 pixels > 2,073,600 pixels
Your games will be rendered at a higher resolution than a PS3 at 1080p, but what the game looks like depends on, well, the game. A game like Quake III won't look better than Resistance: Fall of Man even at a higher resolution.
[QUOTE="Baselerd"]Some monitors have a really nice resolution but will only run it at something terrible like 60Hz.
60 Hz is the standard for LCD monitors. It's far from terrible.
[QUOTE="Baselerd"]You're right, it's not terrible, but it's not good. 75Hz is the minimum I would want, though. It's much easier on the eyes.
You are incorrect: LCDs don't actually refresh at 60Hz, they refresh much higher than that. That's the number for the GPU; check on the wiki. You can quote me on this, though the higher refresh rate may help with input lag...
If you say so, but....
The refresh rate is really a function of the response time, I think. I had an 8ms response time monitor about a month ago, and with my 7900GT I could only use 60Hz (1280x1024). Now I have a 3ms response time monitor and I can run it at either 60Hz or 75Hz (both at 1280x1024).
LCDs don't exactly "refresh" at all. CRTs have to keep pushing the image to the screen or the image fades. Computer monitors do this a minimum of 60 times per second. LCDs on the other hand have physical pixels on a screen that are lit up to a specific color, but the image is persistent. How fast these pixels can change from one color to another is the "response time", but it's not exactly the same as the refresh on a CRT.
-Byshop
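To put rough numbers on that distinction, here's a quick Python sketch. The refresh rates and response times below are illustrative values, not measurements from any particular monitor:

[code]
# Illustrative only: compares the frame interval implied by a refresh
# rate against a panel's pixel response time (values are assumed).

def frame_interval_ms(refresh_hz: float) -> float:
    """Time between frames sent to the monitor, in milliseconds."""
    return 1000.0 / refresh_hz

for refresh_hz in (60, 75):
    interval = frame_interval_ms(refresh_hz)
    for response_ms in (8.0, 3.0):
        # A pixel transition slower than the frame interval smears across
        # frames; a faster one completes within a single frame.
        verdict = "fits within" if response_ms <= interval else "spills past"
        print(f"{refresh_hz}Hz -> {interval:.1f}ms/frame; "
              f"{response_ms}ms response {verdict} one frame")
[/code]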
Two always = more, but Two != Double
What I mean is, for example, say an 8800 GTX was getting 100 FPS in Crysis; adding another card to bring it to SLI won't give it 200 FPS, more along the lines of 25-50 more FPS. Worth it price-wise? No. Do you get more performance? Yes.
K_r_a_u_s_e_r
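To make that concrete, here's a small Python sketch using the numbers from the post above. The 25-50% scaling range is the poster's rough estimate, not a benchmark:

[code]
# Sketch of the SLI scaling point above: a second card adds a fraction
# of the first card's performance, not 100%. Scaling values are the
# poster's estimate, not measured results.

def sli_fps(single_card_fps: float, scaling: float) -> float:
    """FPS with two cards, given the fraction the second card adds."""
    return single_card_fps * (1.0 + scaling)

single = 100.0  # the hypothetical 8800 GTX Crysis figure from the post
for scaling in (0.25, 0.50):
    print(f"{scaling:.0%} scaling: {single:.0f} FPS -> "
          f"{sli_fps(single, scaling):.0f} FPS (not {2 * single:.0f} FPS)")
[/code]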
The only justifiable reason to spend that much is if you coupled it with a 30-inch or larger monitor, but even then you'd run up a high bill. For all intents and purposes, one GTX is more than enough for 1920x1200.
[QUOTE="gregdpw"]But if I've got the money to spend, should I get it? I want the best.
I wouldn't. Had I known what I know now, I would have gotten one GTX, and then when that wasn't cutting it anymore I would have already saved up $600 towards the latest card. SLI makes sense on lower cards, not on the 8800GTX.
[QUOTE="gregdpw"]But two cards will always be better than one.
Well, that is not always the case. Two cards introduce a level of complexity into your system (SLI), some games don't like SLI, and SLI drivers are sometimes flaky, ESPECIALLY the 8800GTX SLI drivers. Also, at lower resolutions, the performance increase with the second GTX may not even be noticeable. You can go from 140fps to 180fps and see the increase running a benchmark program, but playing a game you won't be able to tell the difference.
Even at 2560x1600, for example, Doom 3 runs a timedemo at 90fps with a single card and 140fps with dual cards. Big deal! When playing, you can't tell the difference; it's exactly the same! If you have covered all your other bases (processor, memory, hard drives, keyboard, mouse, and a really nice monitor) and still have money left over, then go for it. If I had to choose, I would much rather have a 30" LCD and a single 8800GTX than a 24" LCD and dual GTXs. :D
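Here's the same point as a quick sketch: once both setups render faster than the monitor refreshes, the extra frames are never actually displayed. The 60Hz figure is just the typical LCD refresh rate discussed earlier in the thread:

[code]
# Frames rendered above the monitor's refresh rate are never shown,
# so 90fps and 140fps look identical on a 60Hz panel. The timedemo
# numbers are the Doom 3 figures from the post above.

REFRESH_HZ = 60  # typical LCD refresh rate, per the earlier discussion

def displayed_fps(rendered_fps: float, refresh_hz: int = REFRESH_HZ) -> float:
    """The monitor can only show up to refresh_hz frames per second."""
    return min(rendered_fps, refresh_hz)

for fps in (90, 140):
    print(f"rendered {fps}fps -> displayed {displayed_fps(fps):.0f}fps")
[/code]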
[QUOTE="gregdpw"]well i am looking into 2 gtx's 4 gigs and a 1000 watt psuZBoater
It doesn't get much better than that. I would definitely recommend the Dell 30" at 2560x1600 instead of the 24" you are planning. That setup screams BIG LCD!!!! :D
Yeah, let's just say the 24" is akin to a marathon runner doing a practice jog, and the 30" is the full-blown marathon.
[QUOTE="gregdpw"]well i am looking into 2 gtx's 4 gigs and a 1000 watt psu
There is absolutely no reason whatsoever to get a PSU that big. You will be running less efficiently than with a lower-wattage PSU.
I'm running a 790i mobo with two 9800 GTXs in SLI, 5 fans (counting the CPU fan), and 2 HDDs in RAID, and I have yet to break 400W. You will save a ton of money and not notice a single difference if you get a 600W PSU or so. Running closer to peak also means your PSU will be more efficient, so you may be burning less electricity as well.
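For a rough feel of why, here's a Python sketch with an assumed efficiency curve. The shape (weak at light load, near-flat past about 40%) is typical of PSUs of this era, but the exact numbers are made up for illustration:

[code]
# Oversized PSUs spend more time at very light load, where efficiency
# drops off. The curve below is an assumed, simplified shape; real
# numbers come from a specific PSU's test report.

# (load fraction, efficiency) control points -- assumptions, not data
CURVE = [(0.05, 0.70), (0.10, 0.78), (0.20, 0.84), (0.50, 0.87), (1.00, 0.85)]

def efficiency(load: float) -> float:
    """Linearly interpolate the assumed efficiency curve."""
    for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
        if load <= x1:
            return y0 + (load - x0) / (x1 - x0) * (y1 - y0)
    return CURVE[-1][1]

for draw in (150.0, 400.0):            # idle-ish and gaming-ish draws
    for psu in (600.0, 1000.0):
        load = draw / psu
        eff = efficiency(load)
        print(f"{draw:.0f}W draw on a {psu:.0f}W PSU ({load:.0%} load): "
              f"~{eff:.0%} efficient, ~{draw / eff:.0f}W from the wall")
[/code]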
Also, to address your original question: 1920x1200 is indeed better than 1080p. 1080p is 1920x1080, so you are getting 120 extra pixels of height. That works out to an extra 230,400 pixels per image.
1920x1200 = 2,304,000 pixels = 16:10 aspect ratio
1920x1080 = 2,073,600 pixels = 16:9 aspect ratio
Really, they are virtually identical. Those 230,400 pixels are not going to be noticed very much once you get that high of a resolution anyways. It's just 10% less height on the image.
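For anyone who wants to verify the numbers, the math checks out (this is pure arithmetic, nothing assumed):

[code]
# Quick check of the pixel math above.
from math import gcd

for width, height in ((1920, 1200), (1920, 1080)):
    d = gcd(width, height)
    # 1920x1200 reduces to 8:5 (conventionally written 16:10);
    # 1920x1080 reduces to 16:9.
    print(f"{width}x{height} = {width * height:,} pixels "
          f"({width // d}:{height // d})")

extra = 1920 * 1200 - 1920 * 1080
print(f"difference: {extra:,} pixels, or {extra / (1920 * 1200):.0%} of 1920x1200")
[/code]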
[QUOTE="gregdpw"]well i am looking into 2 gtx's 4 gigs and a 1000 watt psuBgrngod
There is absolutely no reason what-so-ever to get a PSU that big. You will be running less efficiently then a lower wattage PSU.
Your pretty late there look when that message was posted[QUOTE="Bgrngod"][QUOTE="gregdpw"]well i am looking into 2 gtx's 4 gigs and a 1000 watt psumastershake575
There is absolutely no reason what-so-ever to get a PSU that big. You will be running less efficiently then a lower wattage PSU.
Your pretty late there look when that message was postedHoly hell. Why would someone bump a 1 year old thread? Freakinfrackityfrick.
[QUOTE="Baselerd"]You're right, it's not terrible, but it's not good. 75Hz is the minimum I would want, though. It's much easier on the eyes.
LCDs don't flicker no matter what the refresh rate, so it makes no difference on the eyes.