***News*** Nvidia Fermi GTX300 series performance preview!!! ---REPORTED FAKE---

This topic is locked from further discussion.

Avatar image for opamando
opamando

1268

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#51 opamando
Member since 2007 • 1268 Posts
[QUOTE="XtcmeLS"]Let them be fast; let's just hope their image quality reaches ATI's standards. Not a fanboy, but ATI does hold the IQ crown.Daytona_178
WTF? It's been proven that there is no difference in picture quality... you must be confusing it with better monitors.

I always hear fanboys make this statement. It just depends on whether they are ATI or Nvidia fanboys as to who supposedly has the better image quality. I have two friends who swear Nvidia looks tons better than ATI, and I have an ATI fanboy friend who will swear on his mom's grave that ATI is better. Not one person has offered me any proof that either one ACTUALLY looks better.

#52 AzNs3nSaT1On
Member since 2005 • 921 Posts
What we should realize is that even if these charts are real, by the time these 300 series cards come out, ATI will have released the 5890 or 5990 (dare I say the 6000 series?). This is definitely a really interesting race.

#53 LeadnSteel
Member since 2009 • 371 Posts

I don't even want to know how much these cards are going to cost.


#54 hartsickdiscipl
Member since 2003 • 14787 Posts

[QUOTE="Daytona_178"][QUOTE="XtcmeLS"]Let them be fast; let's just hope their image quality reaches ATI's standards. Not a fanboy, but ATI does hold the IQ crown.opamando
WTF? It's been proven that there is no difference in picture quality... you must be confusing it with better monitors.

I always hear fanboys make this statement. It just depends on whether they are ATI or Nvidia fanboys as to who supposedly has the better image quality. I have two friends who swear Nvidia looks tons better than ATI, and I have an ATI fanboy friend who will swear on his mom's grave that ATI is better. Not one person has offered me any proof that either one ACTUALLY looks better.

Having been a longtime Nvidia user who just switched to an ATI card, I feel I have some room to speak on this issue. Toward the end of my run with Nvidia cards (I had every GPU from the GeForce FX through a G92-based 8800GTS 512), I noticed that as new drivers offering "faster performance" came out, my graphics would look ever so slightly "sloppier" (for lack of a better word). Edges looked softer; my AA and AF settings seemed to have a diminishing effect on games that I'd played for months or years prior. Take this for what you will: the games didn't look as good as they used to. I think Nvidia's driver releases are designed to "streamline" the rendering of some games on certain hardware.

Since I've gotten my 5870, games are back to looking "right," in fact better than ever. 4xAA on this card looks sharper than 4x ever looked on my GeForce cards. I don't have any screenshots to prove it (and I don't even know if a still-frame comparison would prove the point; it may only be evident in motion), but the ATI card definitely offers better IQ.


#55 wklzip
Member since 2005 • 13925 Posts

[QUOTE="smc91352"][QUOTE="killab2oo5"] The 5750 and 5770 are ATi's lower-end DX11 cards. polarwrath11

Meh; that's still pretty mid-range due to the price tag. The 5670 would be lower end.

I'd say

5670 - low range

5750 - low/medium range

5770 - medium range

HD5500 (and lower) - low end

HD5600-5700 - mid range

HD5800-5900 - high end

#56 Daytona_178
Member since 2005 • 14962 Posts
[QUOTE="Daytona_178"][QUOTE="XtcmeLS"]Let them be fast; let's just hope their image quality reaches ATI's standards. Not a fanboy, but ATI does hold the IQ crown.opamando
WTF? It's been proven that there is no difference in picture quality... you must be confusing it with better monitors.

I always hear fanboys make this statement. It just depends on whether they are ATI or Nvidia fanboys as to who supposedly has the better image quality. I have two friends who swear Nvidia looks tons better than ATI, and I have an ATI fanboy friend who will swear on his mom's grave that ATI is better. Not one person has offered me any proof that either one ACTUALLY looks better.

Exactly! It's funny how random people notice a difference, but experienced reviewers on websites such as bit-tech never seem to notice it... hmmmm

#57 XtcmeLS
Member since 2009 • 106 Posts
I'm sorry, I'm no fanboy, and like others in this thread have said, as an owner of both cards (my last rig had an 8800GT): ATI is far superior in AA image quality. If you ever upgrade that 8800GT to a new ATI card, then post back and tell me YOUR real-world experience; that would be great. Hmmm, K?

#58 Daytona_178
Member since 2005 • 14962 Posts

I'm sorry, I'm no fanboy, and like others in this thread have said, as an owner of both cards (my last rig had an 8800GT): ATI is far superior in AA image quality. If you ever upgrade that 8800GT to a new ATI card, then post back and tell me YOUR real-world experience; that would be great. Hmmm, K?XtcmeLS

OK, so I typed "nvidia vs ati picture quality".

#edit: looks like they were not being compared to 3* ATI cards, so it's not completely fair... hmmm


#59 XtcmeLS
Member since 2009 • 106 Posts
Are you kidding me, the X1000 series? It was the 4000 series and up that took the IQ crown; do some more research, young Padawan.

#60 Daytona_178
Member since 2005 • 14962 Posts
Are you kidding me, the X1000 series? It was the 4000 series and up that took the IQ crown; do some more research, young Padawan.XtcmeLS
Yeah, just noticed that... edited! The search continues!

#61 XtcmeLS
Member since 2009 • 106 Posts
Here is a very up-to-date conclusion: http://enthusiast.hardocp.com/article/2008/07/20/amds_ati_radeon_hd_4800_series_custom_filtering_aa/7 (this links to the final page, but the whole article is good!)

#62 codezer0
Member since 2004 • 15898 Posts
Exactly! It's funny how random people notice a difference, but experienced reviewers on websites such as bit-tech never seem to notice it... hmmmmDaytona_178
There were differences in the past, particularly when both sides were playing different tricks to try to boost performance and/or inflate 3DMark scores for publicity. Remember the "Crouching Tiger, Hidden Dragon" episode with 3DMark 01? Or the "Quack3" debacle with Radeon 8500 cards? Nowadays, both seem to understand that minor tweaks here and there to speed up rendering are alright, so long as the image quality stays where it needs to be, thankfully.

If anything, I still have a few gripes regarding the software on both sides. On the Nvidia side:

- I'm not personally too fond of the new control panel layout; I preferred it when it stuck with the more 'Windows-compliant' model in older drivers.

- The uninstaller is rather quirky in that the primary options don't let me just uninstall the video drivers. The primary radio buttons in my case are: uninstall all except display drivers; uninstall all, including display drivers; uninstall ONLY these drivers. I know I only need to pick the last one, but it seems weird to me that the interface is built so that you keep the display driver at (nearly) all times.

- I don't like how the number for the "LAN connection" in Windows keeps incrementing every time I need to (re)install or update chipset drivers, though that seems to be more a Windows thing, because I've seen it ever since the Windows 98 era.

- Finally, it now seems I have to install the system tools just to get access to things that were already included in the standard driver years before, such as temperature monitoring for the GPUs. I can understand needing them for temp monitoring on my motherboard, but come on... at least cut me some slack on the GPU(s). After all, what if I didn't have an Nvidia chipset board? How then would I know if my graphics card was anywhere near unsafe temperatures?

On the ATi side, though, what might have started as peeves has grown flat-out irritating:

- I really fail to see why I need BOTH the hotkey poller AND help services running and set to automatic at all times just to use the CCC, or more importantly, why there need to be multiple instances of both in the task manager at all times, plus the launcher, just for the CCC to open in the first place. That just seems like bad coding or program design to me, when the only "helper" service in Nvidia's drivers can be turned off and flat-out disabled without ill effect on launching or operating the control panel (save for not being able to open the help file, but that's a small price to pay anyway). This was especially a problem when they first made it mandatory: when I was still running ATi on my first box, I only had 1GB of RAM, and these processes together took up nearly a quarter of my available RAM on startup. Needless to say, I was NOT happy in the least, and I let ATi have it through their support line. The memory footprints have gone down since, from what I've observed, but their propensity for redundant instances of the "required" (and completely unnecessary) services and launcher is something I will never understand.

- As well as I can remember, for a good year or so ATi made it really difficult to turn off Catalyst AI and all its driver-level optimizations: the very same thing they were (at the time of the X800 launch) accusing NVIDIA of using to unfair advantage. Ironic, because ATi's optimizations at the time not only could NOT be turned off through their control panel, but apparently kept the X800 from applying proper anisotropic and trilinear filtering, while Nvidia's driver had a very simple checkbox to "enable AF optimizations."

- Two words: forced obsolescence.

Even when the cards were still functional, ATi has in almost every case "legacy'd out" the card I needed a driver for, which meant I was SOL if the problems I needed the driver to fix hadn't been addressed. Most recently, the X1600 I had put in a desktop I'd since given to my roommate was legacy'd out by ATi. When it died later, I felt their move was just an attempt to force me to buy a new, supported card. All I wanted was to get an NVIDIA card to put in there instead, but alas, there were no NVIDIA AGP cards to be had, and we needed one right then and there. My roomie then surprised me by buying an AGP HD3650 (1GB) card. But we got home, installed it physically, installed the driver, and found the driver wouldn't recognize the card. Why? Because ATi's stock driver won't recognize AGP versions of the cards... while they're still making and selling them. :| And it was needlessly (and given the circumstance, unbelievably) difficult to find the "AGP hotfix" version. I ran that installer, and everything appeared to be fine, but then I found that it too didn't want to install the driver. Why? Because their installer decided for me that it wouldn't, since it was "an unsigned driver." :| *facepalm* Meanwhile, on the other side of the fence, the 6600GT in my personal oldie not only still works, but the newest driver out of Nvidia will still recognize and work with it.

#63 XtcmeLS
Member since 2009 • 106 Posts
I agree; I have stated in other posts that I am not too happy with ATI's CCC or its drivers. But for the games I play, ATI has really come a long way with their AA and other filters. I've been running Dragon Age lately, and went over to a buddy's house to see his rig run Dragon Age (he has a 295 series card), and I can tell you this (oh, and we run the same monitors): Dragon Age on my PC looks almost 30% better. He didn't believe me, so he came over, and he was like, DUDE, it's incredible to see the differences. But yes, I do agree ATI drivers have always been plagued with problems... and I HATE the CCC. But I have had decent luck lately running the latest drivers on my Vista 64-bit system. And I am really looking forward to Nvidia's next release. If it is anything like the charts, one is going into my system; I can sell my 4870X2 no problem (they are hard to find, and a LOT of peeps are looking for them to run CrossFire).

#64 wklzip
Member since 2005 • 13925 Posts

I have an update which will disappoint some Nvidia enthusiasts :(

edit:

Fake Fermi benchmarks circulating

Everyone, including NVIDIA, is on the edge of their seats waiting for the new monster graphics circuit Fermi. We know for a fact that Fermi will be bringing more than 3 billion transistors to the table, and we know that the architecture is very good at GPGPU applications, but we can also conclude that Fermi, or GF100 if you will, will be very fast for gaming. Over the past few days, benchmarks using two Fermi graphics cards of the GeForce GTX 300 series have started circulating.

They don't match our own information and after talking to NVIDIA we can only conclude that they are very much fake. Someone simply took one of NVIDIA's older roadmaps and edited it to present what looks like the two cards GeForce GTX 380 and GTX 360, in competition with Radeon HD 5970 and HD 5870.

NVIDIA, who usually doesn't comment on product names, was reluctant to discuss the identity of the cards, but the benchmarks are not real and are based on templates from old NVIDIA presentations. We won't be linking to any of the possible sources, but will of course return when and if more reliable information surfaces.

Alas, we will have to wait just a little bit longer to see what Fermi has to offer.

nordichardware


#65 XtcmeLS
Member since 2009 • 106 Posts
Ouch! Oh well, Nvidia might have talked because either (a) these numbers are faster than the real cards, or (b) they do not represent the cards' true power... Who knows.

#66 Daytona_178
Member since 2005 • 14962 Posts

LOL, fair enough!


#67 redrezo
Member since 2009 • 256 Posts

Is the GTX 380 a single chip? That would be very impressive.

Unfortunately, it will probably cost about $600+, which I'm just not willing to pay. Maybe if the GTX 360 was in the $300 range...

Swiftstrike5

Well, hopefully with the 5970 and 5870 out, having Fermi come in will drop all video card prices. I don't particularly care for Nvidia and their underhanded tactics, but the market needs them; otherwise the marginalized PC gaming market will stagnate for lack of their involvement, and GPU prices will remain high under an ATI monopoly.


#68 X360PS3AMD05
Member since 2005 • 36320 Posts
Always assume fake.

#69 SLIisaownsystem
Member since 2009 • 964 Posts

http://www.hexus.net/content/item.php?item=21284&page=4

:roll:

scoots9

Under 60 frames is below my standard.

This.

Plus, yes, it's been proven time and time again that ATI does have great image quality. Going onto a non-PC subject: on the Xbox 360, with its ATI graphics, the games do look sharper than the NV-based PS3 games. Go read reviews comparing games on said consoles; even you low-end scoffers, go prove it :)

jamesfffan

Yeah, and for some reason the Xbox can't even show pics or videos in full 1080p.


#70 mangasm
Member since 2009 • 30 Posts
Dude, these cards are gunna cost more than my whole computer, monitors and all.

#71 smc91352
Member since 2009 • 7786 Posts
Dude, these cards are gunna cost more than my whole computer, monitors and all.mangasm
I don't know your specs (and I'm not saying you're wrong) but I think I remember you said you had a Corsair psu and a HAF. I think you might be exaggerating.

#72 mangasm
Member since 2009 • 30 Posts
[QUOTE="mangasm"]Dude, these cards are gunna cost more than my whole computer, monitors and all.smc91352
I don't know your specs (and I'm not saying you're wrong) but I think I remember you said you had a Corsair psu and a HAF. I think you might be exaggerating.

Well... the three most expensive components in my rig:

HAF 932 = $206

HX-620 = $160

ATI HD4890 = $236

Most expensive Nvidia card ATM = 295 @ around $600-700. I can imagine the 300 series being upwards of $1000 at release.

#73 smc91352
Member since 2009 • 7786 Posts
[QUOTE="mangasm"]Most expensive Nvidia card ATM = 295 @ around $600-700 I can imagine the 300 series being upwards of $1000 at release.

:o In the US they're at least $500, and the most expensive one I've seen (except for the Mars edition) is $850. I think at most they'll adopt that price, but I don't know. We'll just have to wait and see.

#74 mangasm
Member since 2009 • 30 Posts
Yeah, sadly US prices tend not to be reflected in ours; maybe I should try buying from the US?

#75 smc91352
Member since 2009 • 7786 Posts
Yeah, sadly US prices tend not to reflect on our prices, maybe i should try buying from the US? mangasm
maybe. What's the conversion rate?

#76 mangasm
Member since 2009 • 30 Posts
[QUOTE="mangasm"]Yeah, sadly US prices tend not to reflect on our prices, maybe i should try buying from the US? smc91352
maybe. What's the conversion rate?

AUS dollar is worth about 91 US cents ATM.

#77 smc91352
Member since 2009 • 7786 Posts
[QUOTE="smc91352"][QUOTE="mangasm"]Yeah, sadly US prices tend not to reflect on our prices, maybe i should try buying from the US? mangasm
maybe. What's the conversion rate?

AUS dollar is worth about 91 US cents ATM.

So the cards are not more expensive here than there. This one is $530 US; at a rate of US$0.91 per AU$1, that works out to 530 / 0.91 = $582.42 AUS.
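To sanity-check that conversion: since each Australian dollar buys only about US$0.91, you divide the US price by the rate to get the AU price. A quick sketch, using the rate and price quoted in the posts above:

```python
# Convert a USD price to AUD, given that 1 AUD buys 0.91 USD.
USD_PER_AUD = 0.91  # exchange rate quoted above


def usd_to_aud(usd_price: float) -> float:
    # Divide by the rate: the weaker the AUD, the more of them you need.
    return usd_price / USD_PER_AUD


print(round(usd_to_aud(530), 2))  # 582.42
```

Multiplying by 0.91 instead would go the other way (AUD to USD) and understate the price.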

#78 mangasm
Member since 2009 • 30 Posts
True, but most 295s here are $650ish. Anyway, it's not as if I NEED the best hardware. :)

#79 ssaadd123
Member since 2009 • 278 Posts

Who wants 2 5870's for $400?:D


#80 smc91352
Member since 2009 • 7786 Posts

Who wants 2 5870's for $400?:D

ssaadd123
Each, or both for $400? If it's the latter, I'll take 'em and sell my 5850. :D

#81 JigglyWiggly_
Member since 2009 • 24625 Posts

Who wants 2 5870's for $400?:D

ssaadd123
Both for $400? I will gladly consider buying! :D

#82 wklzip
Member since 2005 • 13925 Posts
Ouch! Oh well, Nvidia might have talked because either (a) these numbers are faster than the real cards, or (b) they do not represent the cards' true power... Who knows.XtcmeLS
Probably the GTX 360's numbers really belong to the GTX 380, and the GTX 380's to a GTX 395; that could be closer to the real figures.

#83 Iantheone
Member since 2007 • 8242 Posts

[QUOTE="scoots9"]

http://www.hexus.net/content/item.php?item=21284&page=4

:roll:

SLIisaownsystem

Under 60 frames is below my standard.

This.

Plus, yes, it's been proven time and time again that ATI does have great image quality. Going onto a non-PC subject: on the Xbox 360, with its ATI graphics, the games do look sharper than the NV-based PS3 games. Go read reviews comparing games on said consoles; even you low-end scoffers, go prove it :)

jamesfffan

Yeah, and for some reason the Xbox can't even show pics or videos in full 1080p.

Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.


#84 SLIisaownsystem
Member since 2009 • 964 Posts

[QUOTE="SLIisaownsystem"]

[QUOTE="scoots9"]

Under 60 frames is below my standard.

[QUOTE="jamesfffan"]

This.

Plus, yes, it's been proven time and time again that ATI does have great image quality. Going onto a non-PC subject: on the Xbox 360, with its ATI graphics, the games do look sharper than the NV-based PS3 games. Go read reviews comparing games on said consoles; even you low-end scoffers, go prove it :)

Iantheone

Yeah, and for some reason the Xbox can't even show pics or videos in full 1080p.

Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.

It can't. Ask the question in System Wars and we'll see who's right.


#85 smc91352
Member since 2009 • 7786 Posts
SLIisaownsystem
That was a fast response. Do you have some sort of notification for when someone quotes you? How did you respond so damn fast?

#86 Iantheone
Member since 2007 • 8242 Posts

[QUOTE="Iantheone"]

[QUOTE="SLIisaownsystem"]

Yeah, and for some reason the Xbox can't even show pics or videos in full 1080p.

SLIisaownsystem

Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.

It can't. Ask the question in System Wars and we'll see who's right.

A 5-second Google search: http://www.gamespot.com/news/6157933.html. I also notice that you ignored what I said about Nvidia :P

#87 wklzip
Member since 2005 • 13925 Posts

Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.

Iantheone

The GTX 280 has no problem doing that :P


#88 Iantheone
Member since 2007 • 8242 Posts

[QUOTE="Iantheone"]Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.

wklzip

The GTX 280 has no problem doing that :P

A GTX 280 can only go up to 2560x1600 =/

EDIT: Whoo 1000 posts!


#90 wklzip
Member since 2005 • 13925 Posts

[QUOTE="wklzip"]

[QUOTE="Iantheone"]Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.

Iantheone

The GTX 280 has no problem doing that :P

A GTX 280 can only go up to 2560x1600 =/

EDIT: Whoo 1000 posts!

:lol: Well, you are mistaken then. If the GTX 280 could ONLY do 2560x1600, every GTX 280 owner would have to get a 30" monitor to play games and couldn't game on lower-res monitors at all. It can do 2560x1600 as well as 1920x1080, 1920x1200, 1680x1050, and any other custom resolution you pick in the Nvidia control panel. :) No current GPU should have any problem running at any of those resolutions, even the crappy HD4350.

#91 smc91352
Member since 2009 • 7786 Posts
Whoo 1000 posts!Iantheone
Yay! When you get to 10,000 posts you can make a thread to celebrate in OT. I wonder if I'll make it without getting banned. :x Those mods :x

#92 JigglyWiggly_
Member since 2009 • 24625 Posts
[QUOTE="Iantheone"]Whoo 1000 posts!smc91352
Yay! When you get to 10,000 posts you can make a thread to celebrate in OT. I wonder if I'll make it without getting banned. :x Those mods :x

You think you have it bad :P

#93 smc91352
Member since 2009 • 7786 Posts
You think, you have it bad :PJigglyWiggly_
You get modded :o I couldn't imagine what they mod you for.

#94 SLIisaownsystem
Member since 2009 • 964 Posts

[QUOTE="Iantheone"]Whoo 1000 posts!smc91352
Yay! When you get to 10,000 posts you can make a thread to celebrate in OT. I wonder if I'll make it without getting banned. :x Those mods :x

It's not the mods; it's the people who like the report button.


#95 Daytona_178
Member since 2005 • 14962 Posts
[QUOTE="magicalclick"][QUOTE="SLIisaownsystem"]

[QUOTE="Iantheone"] Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.

It can't. Ask the question in System Wars and we'll see who's right.

Well, no need to go to SW; I am doing 1080p on my Xbox 360 to my 1080p monitor. It certainly can do 1080p, though I don't think RE5 is 1080p; still, that game looks amazing on my 1080p monitor. And the glorified real-time cut-scenes, man... (If you have the old model, you need to buy the HDMI cable that connects to the big plug.)

Anyway, as of now, I would say ATI is better this round. The GeForce 8800 GT era was definitely better for Nvidia than ATI, and I was a big fan of it. Back when I had the Xbox, I bought the first-ever DX9 card from ATI, and it didn't perform as well as I expected, so I am always a bit skeptical of ATI. But the X360 runs really smoothly and my new HD5870 is really good. So, IMO, ATI wins this round; no need to panic or get defensive or anything.

Back on topic: I think ATI is winning because it is more motivated. When Nvidia asks "why do DX11 when there are no games," that's clearly a lack of motivation, and they don't seem to push the standard as much as ATI does. When a new DX comes out, the graphics cards should usually be out around the same time, so game developers have the hardware to work with; but Nvidia has been slow on this aspect since DX10. I don't even get the "why do DX11" claim. Coming from a hardware manufacturer, who should be pushing DX? Duh, it is them! It is supposed to be hardware first, then DX to standardize it, then game developers get the standardized API to work on that hardware. Hardware actually comes before DX. I know this probably has to do with their crazy marketing department, but rest assured, marketing indeed plays a big part in hardware development; there are very, very few dev departments that are untouchable by other departments. Dev departments are usually the research monkeys following orders.

LOL, Xbox 360 games are barely ever 1080p... think more like 1280x720 upscaled to 1080p.

#96 costyssj4
Member since 2005 • 473 Posts

Yes, it can. It doesn't do it automatically; you have to set it to 1080p manually. Also, look at the resolution of the benchmarks. Let's see any Nvidia card do that.

Ignorance is bliss!

1080p has been available on PC for both ATI and Nvidia GPUs for a long time, even on the GeForce4 Titanium, which is from 2002 if I am not mistaken.

NVIDIA GeForce4 TI 4200

AGP 4x

128 MB

DDR SDRAM

256-bit

Max. Screen Resolution: 2048 x 1536 (1080p is 1920x1080, so 2048x1536 is higher)

Not to mention new cards such as the ATI 5870 or higher, which can run at 5760x2400

http://www.youtube.com/watch?v=mzGtxlaPQqY

or if you add 4 cards you can run the resolution up to 12270x5160

http://www.youtube.com/watch?v=N6Vf8R_gOec

or maybe more, if you do the math and compare it to one card.

I said 12270x5160 because I saw it running at that resolution, so I'm certain it's possible.

Corrected. Thanks :)
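For anyone who wants to "do the math" on those resolutions, total pixel counts make the comparison concrete. A quick sketch (the labels are mine; the width x height figures are the ones quoted in the post above):

```python
# Total pixel counts for the resolutions discussed in this thread.
resolutions = {
    "1080p": (1920, 1080),
    "GeForce4 Ti 4200 max": (2048, 1536),
    "HD 5870 Eyefinity": (5760, 2400),
    "four-card Eyefinity": (12270, 5160),
}

for name, (w, h) in resolutions.items():
    # One megapixel = 1,000,000 pixels.
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} MP")
```

By that count, the four-card Eyefinity setup pushes roughly 30 times the pixels of a single 1080p screen.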


#97 smc91352
Member since 2009 • 7786 Posts
It's not the mods; it's the people who like the report button.SLIisaownsystem
Who says I was talking about the people-mods? I was referring to the modification. :P And you're right, I guess; but that's for a different discussion.

#99 04dcarraher
Member since 2004 • 23858 Posts

[QUOTE="magicalclick"][QUOTE="SLIisaownsystem"]

It can't. Ask the question in System Wars and we'll see who's right.

Daytona_178

Well, no need to go to SW; I am doing 1080p on my Xbox 360 to my 1080p monitor. It certainly can do 1080p, though I don't think RE5 is 1080p; still, that game looks amazing on my 1080p monitor. And the glorified real-time cut-scenes, man... (If you have the old model, you need to buy the HDMI cable that connects to the big plug.) Anyway, as of now, I would say ATI is better this round. The GeForce 8800 GT era was definitely better for Nvidia than ATI, and I was a big fan of it. Back when I had the Xbox, I bought the first-ever DX9 card from ATI, and it didn't perform as well as I expected, so I am always a bit skeptical of ATI. But the X360 runs really smoothly and my new HD5870 is really good. So, IMO, ATI wins this round; no need to panic or get defensive or anything. Back on topic: I think ATI is winning because it is more motivated. When Nvidia asks "why do DX11 when there are no games," that's clearly a lack of motivation, and they don't seem to push the standard as much as ATI does. When a new DX comes out, the graphics cards should usually be out around the same time, so game developers have the hardware to work with; but Nvidia has been slow on this aspect since DX10. I don't even get the "why do DX11" claim. Coming from a hardware manufacturer, who should be pushing DX? Duh, it is them! It is supposed to be hardware first, then DX to standardize it, then game developers get the standardized API to work on that hardware. Hardware actually comes before DX. I know this probably has to do with their crazy marketing department, but rest assured, marketing indeed plays a big part in hardware development; there are very, very few dev departments that are untouchable by other departments. Dev departments are usually the research monkeys following orders.

LOL, Xbox 360 games are barely ever 1080p... think more like 1280x720 upscaled to 1080p.

They're sub-720p most of the time, especially the demanding games; a lot of the demanding "AAA titles" run at 1024x600. :lol: Most 360 games (compared to native PC graphics) look blurry, like low-to-medium-setting eyesores.


#100 Luminouslight
Member since 2007 • 6397 Posts
[QUOTE="smc91352"][QUOTE="Iantheone"]Whoo 1000 posts!JigglyWiggly_
Yay! When you get to 10,000 posts you can make a thread to celebrate in OT. I wonder if I'll make it without getting banned. :x Those mods :x

You think, you have it bad :P

Indeed.