Just came across this video. Not sure if it's fake or not.
Thoughts?
http://www.youtube.com/watch?v=0dxpZE_H-wI
I'll actually take numbers into consideration when a third party does their own testing and the numbers don't come straight out of Nvidia's mouth.
Which these do.
It's the Fermi architecture vs. the GT200 architecture. It's real (as far as we know). It was from Tom's Hardware or one of those sites.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/27892-nvidia-s-geforce-gf100-under-microscope.html The NDA was lifted; this article has a few benchies, alongside the plethora of other sites giving out information on Fermi at the moment. With unreleased clock speeds, unfinalized hardware, and early drivers, it is pretty impressive, to say the least.
o0squishy0o
So how are the figures stacking up against the ATi benchmarks so far? Of course I am not expecting it to be better, but is it looking promising that Nvidia will have a more powerful card than ATi?
Well, from a raw power standpoint, Fermi is supposed to reach almost double the fps of a single GTX 285. It is also going to offer much better tessellation performance than the 5 series; actually, from the benchies and leaked information coming out, it quite blows it out of the water (according to the information we have now). Also, the way Fermi handles AA (32x) supposedly shows only an 8% drop in performance compared to 4x AA, for example.
Performance all around isn't lost on such dramatic levels from cranking up features. The architecture of the card is completely new and different from that of the 8800->2xx series and the 3870->4870->5870; from the way it looks now, Nvidia is going to have a revolutionary card. Price-point-wise, we're just going to have to wait.
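Just to put those percentages in perspective, here's some back-of-the-envelope math. The GTX 285 baseline fps is a made-up round number; only the "almost double a 285" and "~8% drop at 32x AA" figures come from the rumors above:
```python
# Back-of-the-envelope math for the rumored Fermi numbers.
# baseline_285 is purely illustrative, not a real benchmark.
baseline_285 = 30.0                      # hypothetical GTX 285 fps
fermi_4x_aa = 2 * baseline_285           # "almost double a single 285"
aa_drop_32x = 0.08                       # claimed hit going 4x -> 32x AA
fermi_32x_aa = fermi_4x_aa * (1 - aa_drop_32x)

print(f"Fermi @ 4x AA:  ~{fermi_4x_aa:.0f} fps")
print(f"Fermi @ 32x AA: ~{fermi_32x_aa:.0f} fps ({aa_drop_32x:.0%} drop)")
```
If the claim holds, even the heaviest AA setting would cost only a few frames, which is why people are calling the scaling impressive.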
I can't imagine why anyone would set up a video and not take a picture of the actual card itself; for all we know it could be an HD 5970 in there. I am, and will remain, very skeptical of all news about Fermi's ownage, or whether the cards exist at all, since Nvidia showed off a fake Fermi and lied about the whole deal: when in fact two GTX 280s were running in SLI, they said it was Fermi in their tech demo. Pardon me, I hope Fermi turns out even better than what many folks say, but Nvidia has treated us consumers as stupid chips in their foul play and constantly lies straight to our faces. If Fermi turns out to be great, I'm buying one either way if the price is right, but I still don't like the ugly way they do business.
hartsickdiscipl
So we needed the 5800 series to be able to max out Crysis at resolutions that are realistic for everyday gamers... and there's no other game that's even as demanding as Crysis... so what is the demand for Fermi?
I'm sorry to sound like a negative Nancy; I guess the only market need for the GT300/GF100 series is to drive ATI prices down? I just seem to remember there being more games on the market that pushed current hardware to the limit when Nvidia released the 6/7/8/200 series cards. This will be a very interesting time for Nvidia, even if Fermi IS as fast as these numbers show. It's like having a 200mph car that you only drive on a 60mph road.
This isn't Nvidia's or ATI's fault; you should rant at the game/software devs instead. :D Many games take... forever to complete, and the hardware is a giant step ahead right now. Many games are poorly optimized, and those will still put strain on new hardware.
tequilasunriser
Actually, as far as we know it's not real.
Why?
Because first of all, we don't know the configuration of EITHER PC. One could have a Core 2 Duo at 2.26GHz and the other could have a Core i7 975 @ 4.0GHz.
Secondly, it's coming from Nvidia, and we KNOW that they HAVE lied about Fermi.
Finally, until Tom's Hardware, Guru3D, AnandTech, [H]ard|OCP, Hardware Canucks, or a plethora of other places release their OWN benchmarks, and NOT Nvidia's, we don't even know the capabilities of Fermi.
Until then, everyone needs to sit back down in their seat, not on the edge, and wait for REAL numbers.
gangster480
If that is true, then Fermi is not as good as the 5970, because the 5970 has more than twice the power of the GTX 285.
Daytona_178
I think you mean the 5890, and that's because it has 2 GPUs inside of it... nothing special about that.
gangster480
No, it's the 5970. And yeah, it does have 2 GPUs, but I'm just saying everyone has been saying Fermi will be better than the 5970. I just think that we might be a bit disappointed.
[QUOTE="tequilasunriser"]
Its the Fermi architecture vs the GT200 architecture. Its real (as far as we know). It was from Tom's hardware or one of those sites.
kilerchese
Actually, as far as we know it's not real.
Why?
Because first of all, we don't know the configuration of EITHER PC. One could have a Core 2 Duo at 2.26 GHz and the other could have a core i7 975 @ 4.0GHz.
Secondly, it's coming from Nvidia and we KNOW that they HAVE lied about Fermi.
Finally, until Toms Hardware, Guru3D, AnandTech, HardOcp, Hardware Canucks or a plethora of other places release their OWN benchmarks and NOT Nvidia's we don't even know the capabilities of the Fermi.
Until than, everyone needs to sit back down in their seat, and not on the edge, and wait for REAL numbers.
Hence the "as far as we know" bit that I threw in there...
I personally think Nvidia will trump Ati's cards again. However, when you look at the price of the card against how much of a performance gain you get, it is clear that it is not worth it. People always throw all these numbers around, and at the end of the day the difference between Ati's 5800 series and the new GTX card will be 10 fps at best.

At this point, if you have a 5870 you are getting well over 60 fps in just about every game out there, with max settings. People do know that after 60 fps the human eye can't tell the difference, right? I mean, think about it: if your current PC can play games at max settings and get 70-100 fps, why on earth would you need a more expensive card to bring that frame rate up by 10? It just doesn't make sense to me.

I am very sure that this card is going to be at least in the 500-600 dollar range. That's why I think Ati has gotten it right this time. The 5870 is an awesome card that can handle anything out there and is priced at 400 bucks. Expensive, yes, but when the GF100 comes out people won't think it's expensive anymore. I have a GTX 260 and will be switching over to Ati this time around.
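For what it's worth, the frame-time arithmetic behind that 70-100 fps point is simple division; a quick sketch (the fps values are just examples):
```python
# fps -> milliseconds per frame, to show the diminishing returns
# the post is talking about.
def frame_time_ms(fps: float) -> float:
    """Time spent on each frame, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 70, 100, 110):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):4.1f} ms per frame")
# 70 fps is ~14.3 ms per frame and 100 fps is 10.0 ms: the extra
# 30 fps only shaves ~4 ms off each frame.
```
Whether the eye can actually see that difference is exactly what gets argued below.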
I personally think Nvidia will trump Ati's cards again.
digitalsound1
With current APIs that is a possibility, but Fermi is supposedly a ray-tracing powerhouse. If/when games switch to ray tracing, Fermi, because it is optimized for it, would probably take the lead.
People do know that after 60 fps the human eye can't tell the difference right?
digitalsound1
That is an urban legend.
Read this.
If that is true, then Fermi is not as good as the 5970, because the 5970 has more than twice the power of the GTX 285.
gangster480
A linky to whet your appetite (or lack thereof). From the specs it looks like it's going to be pretty powerful, but actual testing in games is where it really counts.
I always thought that was said because most monitors are 60Hz and people thought that was the highest frequency we could see. However, people like having FPS well above sixty to stretch the e-penis and because nobody likes a choppy game. I can tell when a game dips below sixty, and having consistent FPS well above 60 gives smoother gameplay.
Yeah, most LCD monitors are 60Hz, and some of those can support up to 75Hz. Now the LCD TV segment is moving towards 120Hz and 240Hz displays. Supposedly 120Hz will be the bottom line for anyone who wishes to watch movies/TV and play games in 3D.
I'll wait until it is no longer a novelty and is in the mainstream before I plunk down that kind of cash on a display above 60Hz.
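That 3D point is just division, assuming the active-shutter scheme those 120Hz sets use, where left and right frames alternate between the eyes:
```python
# Active-shutter 3D alternates left/right frames, so each eye sees
# half the panel's refresh rate. Assumption: frame-sequential 3D.
def per_eye_hz(panel_hz: float, eyes: int = 2) -> float:
    return panel_hz / eyes

for hz in (60, 120, 240):
    print(f"{hz:>3}Hz panel -> {per_eye_hz(hz):5.1f}Hz per eye in 3D")
# A plain 60Hz panel would leave each eye a flickery 30Hz, which is
# why 120Hz is treated as the floor for 3D viewing.
```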