Monitor Size With A Gaming Computer

This topic is locked from further discussion.

#1 FlyingArmbar
Member since 2009 • 1545 Posts

I see people saying that some of the higher-end cards are a waste of power if your monitor is too small (mine is 1680x1050, a decent size).

In the past, big monitors were a liability when it came to gaming; they would slow down performance. Have modern video cards been designed so that their power blossoms at higher resolutions, making it so that monitor size (and therefore higher resolution) no longer has a great effect on performance? Does a smaller monitor with a lower resolution bottleneck a GPU?

Thanks.

#2 Masterdj1992
Member since 2007 • 977 Posts
It doesn't bottleneck; some cards are just TOO powerful for the monitor, so it's overkill. It's really a matter of the GPU being able to handle more, so why not use more?
#3 GTR12
Member since 2006 • 13490 Posts

Higher resolutions mostly stress the RAM available. Two or three years ago, the best you could get was 512MB (and those cards were expensive), and 512MB is just not enough to run high-end resolutions and still max out games.

The 4890 and the GTX 295 both have enough RAM to supply the huge resolutions.

Video cards now also have the power to easily utilise all of that RAM and the full width of the bus.

1680x1050 is a respectable resolution these days; higher end would be 1920x1200 and up.

I have always wondered what the chip on GPUs could be compared to. Could it be equivalent to a Pentium 1?
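The "512MB isn't enough at high resolutions" point can be sketched with a rough back-of-the-envelope calculation. This assumes 32-bit colour and three screen-sized buffers (front, back, depth); real VRAM usage also includes textures, geometry, and driver overhead, so these are lower bounds, not actual usage figures.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Estimate MB for front + back + depth buffers at 32 bits per pixel each."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

# Bare buffer cost grows linearly with pixel count; textures and AA
# multiply it further, which is where 512MB cards ran out of headroom.
for w, h in [(1280, 1024), (1680, 1050), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB just for bare buffers")
```

The buffers alone stay well under 512MB; it is the textures and anti-aliasing samples scaled to those resolutions that push past it.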

#4 PunishedOne
Member since 2003 • 6045 Posts

Let's look at a thing most people don't look into when purchasing monitors: PPI (pixels per inch).

If this number is higher than your eyes can resolve at normal viewing distance, you can waste money buying a 20" monitor with a native resolution of 1680x1050.

1680x1050 is 1,764,000 pixels in total.

The diagonal is sqrt(1680² + 1050²) ≈ 1,981 pixels across 20 inches, so the PPI is about 99.

That works out to roughly 9,800 pixels in every square inch of the screen. Try to imagine about 9,800 little squares packed into one square inch. Can't see all of them clearly, can you?

There is also a scaling problem: when content doesn't match the panel's native resolution, you don't see the complete quality that the true image has.

For a good example of this, see here: http://i16.photobucket.com/albums/b30/nicolasb/1366_scaling.png

The true quality of the image is the 59" showing 1080p at its native 1920x1080.

Displaying 1080p content on a 20" monitor with a native resolution of 1680x1050 is similar to the 1366x768 panel downscaling 1080p. Notice that the text seems to have a fuzz around it, like it's missing data.
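The PPI arithmetic above is simple enough to check directly: linear PPI is the diagonal pixel count divided by the diagonal size in inches. A minimal sketch (the monitor sizes chosen here are just illustrative):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Linear pixels per inch: diagonal pixel count / diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'20" 1680x1050: {ppi(1680, 1050, 20):.0f} PPI')
print(f'24" 1920x1200: {ppi(1920, 1200, 24):.0f} PPI')
```

Note that a bigger panel at the same resolution has a *lower* PPI, which is why size alone tells you little.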

#5 k0r3aN_pR1d3
Member since 2005 • 2148 Posts

[QUOTE="Amith12"]I have always wondered what the chip on GPUs could be compared to. Could it be equivalent to a Pentium 1?[/QUOTE]
The chip on the GPU being equivalent to a Pentium 1? Umm, nope. I think that with parallel processing, GPUs can easily be equal to a modern-day CPU, just slightly toned down.
#6 NamelessPlayer
Member since 2004 • 7729 Posts
It's not the size of the monitor that's of concern; it's the resolution. (A 30" PC monitor will generally have a resolution of 2560x1600, and I don't know of a single HDTV, at a MUCH larger screen size, that goes higher than 1920x1080.)

As for GPUs being a waste of power at certain resolutions, some might decide to buy an "overkill" graphics card just to have something that will provide better performance for longer, so that they won't have to upgrade as often. (I mean, I only play at 1280x1024 most of the time because my old CRT monitor can't drive 1600x1200 at anything higher than 60 Hz, but I'm still itching to replace my 8800 GT because Crysis on Very High still has a rather low, choppy average framerate.)
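The "resolution, not size" point comes down to pixel counts: the GPU shades pixels, so the workload scales with the resolution being rendered, regardless of the physical panel size. A quick comparison of the resolutions mentioned in this thread:

```python
# Total pixels per frame at each resolution mentioned above.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
    "2560x1600": 2560 * 1600,
}

base = resolutions["1280x1024"]
for name, px in resolutions.items():
    # Relative per-frame shading load versus the 1280x1024 baseline.
    print(f"{name}: {px:,} pixels ({px / base:.2f}x the 1280x1024 load)")
```

A 2560x1600 panel pushes over three times the pixels of 1280x1024, which is roughly why a card that is overkill at the latter can be fully loaded at the former.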
#7 Gog
Member since 2002 • 16376 Posts

[QUOTE="Amith12"]I have always wondered what the chip on GPUs could be compared to. Could it be equivalent to a Pentium 1?[/QUOTE]

Are you serious? Modern GPUs are more complex than CPUs. A GTX 280 has twice as many transistors as a Core i7 chip.

#8 GTR12
Member since 2006 • 13490 Posts

[QUOTE="Amith12"]I have always wondered what the chip on GPUs could be compared to. Could it be equivalent to a Pentium 1?[/QUOTE]

[QUOTE="Gog"]Are you serious? Modern GPUs are more complex than CPUs. A GTX 280 has twice as many transistors as a Core i7 chip.[/QUOTE]

No, I understand that GPUs are far more powerful, and I thought I saw somewhere that one had more transistors than the i7; you just confirmed this (I thought I was dreaming, lol). But I know that GPUs are sort of "streamlined" (they only do basic maths, or something like that), so I was wondering how the rest of the chip fared against modern-day CPUs.

#9 TerroRizing
Member since 2007 • 3210 Posts

People talking about a powerful GPU being a waste on a 1680x1050 monitor are talking BS. Not every game needs the same amount of power at the same resolution. It's all about what games you want to play and how long you want to go before upgrading.