GTX 260 actually faster than the 4870?!?! Check link

This topic is locked from further discussion.


#51 tequilasunriser
Member since 2004 • 6379 Posts

lol, for the guy who "rofl"ed: look at the threads and see what problems the owners have with the 4870! Heat problems, stupid drivers, power consumption. Everyone (the pro websites) says that nVidia has sharper images, and the other reviews didn't review the actual OC'd GTX (which is $25 more than the stock one). sentenced83

Lol, where the hell did you read that? ATi cards actually tend to have more detailed images. At least that was the case from the 9k series through the HD3k series, and I would only assume the image quality advantage continues today.

I'm not knocking nVidia; they've always tended to create hardware that is great at brute-force performance (except the 4 and 5 series, lol) but could never compete with ATi on the image quality front.

When you get down to it, though, the differences are negligible. You would have to freeze individual frames to see the differences in image quality.


#52 JP_Russell
Member since 2005 • 12893 Posts

[QUOTE="mastershake575"]Wait.... The other parts don't really require that much power, so how is 200-250 wrong? The PCI Express slot with a single 6-pin connector alone is around 150W.....artiedeadat40

Sorry, I was kind of just messing with you, but I heard (I could be wrong here) that the PCI-E slot provides 50W, a 6-pin connector provides 100W, and an 8-pin (this I'm really not sure about) carries like 125W.

I think officially it's

Molex: 37.5W
6-pin: 75W
PCI-E slot: 75W
8-pin: 150W

However, the official power ratings of Molex and 6-pins are inexplicably very low, or at least unreasonably conservative. A Molex on the 12V rail should handle up to 60W, while a 6-pin should be able to handle around 100W or more.
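For anyone tallying up what a card can officially draw, those per-source figures add straight up. A minimal sketch (the wattage table is from this post; the helper function is hypothetical):

```python
# Official max power per source, per the figures quoted above (watts).
OFFICIAL_W = {
    "pcie_slot": 75,
    "6pin": 75,
    "8pin": 150,
    "molex": 37.5,
}

def max_board_power(*sources):
    """Sum the official power budget for a card's power sources.

    `sources` are keys of OFFICIAL_W, e.g. the slot plus two 6-pin plugs.
    (Illustrative helper only; real cards may draw less, and as noted
    above, the connectors themselves can often tolerate somewhat more.)
    """
    return sum(OFFICIAL_W[s] for s in sources)

# A GTX 260 / HD4870-style card: slot + two 6-pin connectors.
print(max_board_power("pcie_slot", "6pin", "6pin"))  # 225
```

So a slot-plus-two-6-pin card is officially budgeted for 225W, which is why the "150W from one 6-pin plus the slot" figure above was an overestimate.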


#53 sentenced83
Member since 2005 • 1529 Posts
Someone asked if we're comparing OC'd cards against stock ones. The answer is yes when the price difference is $10 (after rebate for the GTX).

#54 swehunt
Member since 2008 • 3637 Posts

Someone asked if we're comparing OC'd cards against stock ones. The answer is yes when the price difference is $10 (after rebate for the GTX). sentenced83

I guess you are really right then?

GTX260 FTW = $309.99 after MIR ($40 MIR)

HD4870 (VisionTek with lifetime warranty) = $250 after MIR ($20 MIR)

Is this the $10 difference you are speaking of? :roll:

The price difference is $80 without the MIR!

GTX260 FTW newegg

HD4870 newegg

Edit: Sorry, the cheapest HD4870 is just $250 on Newegg; that makes the difference $80 between the FTW and an un-OC'd HD4870.
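The rebate arithmetic above can be checked with a quick sketch. The pre-rebate list prices are back-calculated from the quoted after-MIR figures, so treat them as assumptions:

```python
# Mail-in-rebate (MIR) comparison using the figures quoted in this post.
# List prices are inferred from the after-MIR numbers (assumptions).
cards = {
    "GTX260 FTW": {"list_price": 349.99, "mir": 40.00},
    "HD4870 VisionTek": {"list_price": 270.00, "mir": 20.00},
}

after_mir = {name: c["list_price"] - c["mir"] for name, c in cards.items()}

diff_before = cards["GTX260 FTW"]["list_price"] - cards["HD4870 VisionTek"]["list_price"]
diff_after = after_mir["GTX260 FTW"] - after_mir["HD4870 VisionTek"]

print(f"Difference before MIR: ${diff_before:.2f}")  # ~$80
print(f"Difference after MIR:  ${diff_after:.2f}")   # ~$60
```

So the gap is about $80 before rebates and about $60 after both rebates; neither figure is anywhere near $10.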


#55 Wesker776
Member since 2005 • 7004 Posts

This OP's posts smell of sweat, sexual frustration and failure. :|

[QUOTE="artiedeadat40"]

[QUOTE="mastershake575"]Wait.... The other parts don't really require that much power, so how is 200-250 wrong? The PCI Express slot with a single 6-pin connector alone is around 150W.....JP_Russell

Sorry, I was kind of just messing with you, but I heard (I could be wrong here) that the PCI-E slot provides 50W, a 6-pin connector provides 100W, and an 8-pin (this I'm really not sure about) carries like 125W.

I think officially it's

Molex: 37.5W
6-pin: 75W
PCI-E slot: 75W
8-pin: 150W

However, the official power ratings of Molex and 6-pins are inexplicably very low, or at least unreasonably conservative. A Molex on the 12V rail should handle up to 60W, while a 6-pin should be able to handle around 100W or more.

I believe the specifications allow "breathing room" for power supplies that do not strictly follow the ATX spec (i.e., PSUs with poor voltage regulation).


#56 sentenced83
Member since 2005 • 1529 Posts
I think it's your problem if you go for the cheapest item around! I was talking about the Sapphire (better quality), but they have dropped prices. If you want, get the HIS; it's $250.

#58 Neme2010
Member since 2008 • 206 Posts
[QUOTE="Neme2010"]

I had a Radeon 9800 Pro. There used to be lots of ppl reporting that the heatsink was too hot to touch. It was. After a year, it fried. Replaced it with an X800XL. Same story: it too was too hot to touch, and it too fried. Got a 4850. Same story again: it was too hot to touch, with the monitoring app reporting temps as high as 95 degrees. That's why I replaced the default cooler with an aftermarket one. I can touch the cooler and it is cold to lukewarm. Not going to let this card fry. I still feel annoyed that those other cards fried just because of inadequate coolers from ATI.

matrixian

I guess you never heard of Ati Tool, Ati Tray Tools or the famous fan fix to raise the fan speed. My 9800Pro and X800XL are still functional.

You miss the point. You shouldn't have to raise fan speeds. These cards were manufacturer-fitted with inadequate cooling solutions.


#59 sentenced83
Member since 2005 • 1529 Posts
Wesker776: WTF do you mean?

#61 swehunt
Member since 2008 • 3637 Posts

I think it's your problem if you go for the cheapest item around! I was talking about the Sapphire (better quality), but they have dropped prices. If you want, get the HIS; it's $250. sentenced83

:lol: I think it's your problem if you actually think the GTX260 is faster...

And as the difference is $80 between the VisionTek (the only ATI card which offers a lifetime warranty) and the OC'd GTX260 card, you fail hard to prove your point.


#62 wklzip
Member since 2005 • 13925 Posts

I think it's your problem if you go for the cheapest item around! I was talking about the Sapphire (better quality), but they have dropped prices. If you want, get the HIS; it's $250. sentenced83

Wait, are you actually recommending HIS over Sapphire when the Sapphire card is cheaper?
Sapphire is known as the best ATI board partner.


#63 matrixian
Member since 2003 • 624 Posts
[QUOTE="matrixian"][QUOTE="Neme2010"]

I had a Radeon 9800 Pro. There used to be lots of ppl reporting that the heatsink was too hot to touch. It was. After a year, it fried. Replaced it with an X800XL. Same story: it too was too hot to touch, and it too fried. Got a 4850. Same story again: it was too hot to touch, with the monitoring app reporting temps as high as 95 degrees. That's why I replaced the default cooler with an aftermarket one. I can touch the cooler and it is cold to lukewarm. Not going to let this card fry. I still feel annoyed that those other cards fried just because of inadequate coolers from ATI.

Neme2010

I guess you never heard of Ati Tool, Ati Tray Tools or the famous fan fix to raise the fan speed. My 9800Pro and X800XL are still functional.

You miss the point. You shouldn't have to raise fan speeds. These cards were manufacturer-fitted with inadequate cooling solutions.

You miss the fact that the default fan speeds are too low. The stock coolers are fine. My HD4850 idles at 47-48C with 35-40% speed (depending on ambient temps), and under full load around 65-68C with 55% speed. Raising the fan speed is free and doesn't void your warranty.


#64 Neme2010
Member since 2008 • 206 Posts
[QUOTE="Neme2010"][QUOTE="matrixian"][QUOTE="Neme2010"]

I had a Radeon 9800 Pro. There used to be lots of ppl reporting that the heatsink was too hot to touch. It was. After a year, it fried. Replaced it with an X800XL. Same story: it too was too hot to touch, and it too fried. Got a 4850. Same story again: it was too hot to touch, with the monitoring app reporting temps as high as 95 degrees. That's why I replaced the default cooler with an aftermarket one. I can touch the cooler and it is cold to lukewarm. Not going to let this card fry. I still feel annoyed that those other cards fried just because of inadequate coolers from ATI.

matrixian

I guess you never heard of Ati Tool, Ati Tray Tools or the famous fan fix to raise the fan speed. My 9800Pro and X800XL are still functional.

You miss the point. You shouldn't have to raise fan speeds. These cards were manufacturer-fitted with inadequate cooling solutions.

You miss the fact that the default fan speeds are too low. The stock coolers are fine. My HD4850 idles at 47-48C with 35-40% speed (depending on ambient temps), and under full load around 65-68C with 55% speed. Raising the fan speed is free and doesn't void your warranty.

That may be true for the 4850, but what about the 9800 Pro, and what about the X800XL? And weren't the 9800 Pro and X800XL cooling fans operating normally anyway? Even you say the fan speeds are too low; that is a fault. So, again, you miss the point that, as supplied out of the box, these cards are not reliable. I still have an old GeForce 2, currently unused, and a GeForce Ti 4800 still in use. Because those cards had heatsinks and fans matched to the heat output of their GPUs, they have been reliable.


#65 matrixian
Member since 2003 • 624 Posts

That may be true for the 4850, but what about the 9800 Pro, and what about the X800XL? And anyway, if you mess about with the BIOS, don't you invalidate your warranty? So, again, you miss the point that, as supplied out of the box, these cards are not reliable. I still have an old GeForce 2, currently unused, and a GeForce Ti 4800 still in use. Because those cards had heatsinks and fans matched to the heat output of their GPUs, they have been reliable.

Neme2010

I used Ati Tool for my 9800 Pro and ATT for my X800XL. Raising the fan speed doesn't change anything in the BIOS. If you want to spend more money and void your warranty, then do as you please.


#66 JP_Russell
Member since 2005 • 12893 Posts

[QUOTE="JP_Russell"][QUOTE="artiedeadat40"]

[QUOTE="mastershake575"]Wait.... The other parts don't really require that much power, so how is 200-250 wrong? The PCI Express slot with a single 6-pin connector alone is around 150W.....Wesker776

Sorry, I was kind of just messing with you, but I heard (I could be wrong here) that the PCI-E slot provides 50W, a 6-pin connector provides 100W, and an 8-pin (this I'm really not sure about) carries like 125W.

I think officially it's

Molex: 37.5W
6-pin: 75W
PCI-E slot: 75W
8-pin: 150W

However, the official power ratings of Molex and 6-pins are inexplicably very low, or at least unreasonably conservative. A Molex on the 12V rail should handle up to 60W, while a 6-pin should be able to handle around 100W or more.

I believe the specifications allow "breathing room" for power supplies that do not strictly follow the ATX spec (i.e., PSUs with poor voltage regulation).

So in other words, if you have a power supply from a good quality manufacturer (Corsair, OCZ, PC Power and Cooling, etc.), which anyone who knows the importance of a quality PSU will buy, you can safely estimate the maximum wattages of its Molex and 6-pins a bit above the official specs.

This site says "Part of the reason may be that pin 2 (listed above as a 12 volt line) may be listed as not connected in the specification. I've never seen a 6 pin PCI Express power cable with pin 2 not connected. They've all had a 12 volt line connected to pin 2."

That may actually relate to what you're saying; some cheap power supplies with 6-pin connectors may not have pin 2 hooked up to 12V, deviating from the official specification.
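A back-of-the-envelope check of why a fully wired 6-pin has headroom well beyond its 75W rating. The ~6A per-pin current here is an illustrative assumption, not a spec value:

```python
# Rough deliverable power for a 6-pin PCIe plug: live 12V pins x amps
# per pin x 12V. The per-pin current is an assumed figure for
# illustration; real terminals vary with contact quality and wire gauge.
VOLTS = 12.0
AMPS_PER_PIN = 6.0  # assumption

def six_pin_capacity_watts(live_12v_pins):
    return live_12v_pins * AMPS_PER_PIN * VOLTS

print(six_pin_capacity_watts(3))  # all three 12V pins wired: 216.0
print(six_pin_capacity_watts(2))  # pin 2 left unconnected: 144.0
```

Either way the connector clears the official 75W figure by a wide margin, which fits the "breathing room" explanation above; losing pin 2 just shrinks that margin.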


#67 sentenced83
Member since 2005 • 1529 Posts
You want a lifetime warranty because you intend never to upgrade again?! I was referring to the Sapphire against the OC'd version (the difference was $20). And it's not a problem, since the benchmarks show that this OC'd version rapes the 4870.

#68 swehunt
Member since 2008 • 3637 Posts

You want a lifetime warranty because you intend never to upgrade again?! I was referring to the Sapphire against the OC'd version (the difference was $20). And it's not a problem, since the benchmarks show that this OC'd version rapes the 4870. sentenced83

Cope with reality...

In the benchmark you've given us, a $309 card overtakes a card for $259; is this amazing? Although the HD4870 is superior to the slower GTX260, someone OC'd the GTX to its limit to compete.

Give us all a break and read a few more benchmarks than this one; if you did, you would stop writing.

And all your conclusions are wrong. I don't care that the card you're referring to is a Sapphire; the VisionTek has better value and, being the only ATI card with a lifetime warranty, I'd get that over a Sapphire any day.

Go justify your new card, but it's not better. I hope you didn't pay $309 for it (overpriced!).


#69 X360PS3AMD05
Member since 2005 • 36320 Posts
New GTX 260 is on the way: http://en.expreview.com/2008/08/21/nvidia-will-offer-a-upgraded-gtx-260-in-mid-september/#more-663

#70 wklzip
Member since 2005 • 13925 Posts

You want a lifetime warranty because you intend never to upgrade again?! sentenced83

You have an eVGA 8800GT that, according to you, never needs upgrading! But you plan to upgrade again (to a GTX260)?

And what is your theory for XFX? :P They provide a double lifetime warranty.

Well anyway, the GTX260 FTW is better than the stock HD4870, but it's also more expensive. If you have the money, then go for it :)


#71 RayvinAzn
Member since 2004 • 12552 Posts

You want a lifetime warranty because you intend never to upgrade again?! sentenced83
Backup rigs are extremely useful to have around for any number of reasons, and it's nice to know that even your old setup from years ago at least has a few components still covered by warranty.

I have to wonder at the type of person who upgrades from an 8800GT to a GTX 260 anyway, especially when there are other components in their system that could definitely use an upgrade instead of the graphics card (maybe save up a bit for a Bloomfield setup in a few months).


#72 sentenced83
Member since 2005 • 1529 Posts

I can't upgrade very often because I'm outside the US; whenever I go there, I order online and pick the stuff up (if I want to buy components from here, they'll be 3x their original price). And would you be kind enough to tell me what kind of person upgrades an 8800 GT to a GTX 260? For the guy who was wondering.

I've read many benchmarks, and sometimes the 4870 performs a little better than the GTX 260 and other times it's the other way around, but I think there's a performance difference even between different brands of the same card.


#73 RayvinAzn
Member since 2004 • 12552 Posts

I can't upgrade very often because I'm outside the US; whenever I go there, I order online and pick the stuff up (if I want to buy components from here, they'll be 3x their original price). And would you be kind enough to tell me what kind of person upgrades an 8800 GT to a GTX 260? For the guy who was wondering.

I've read many benchmarks, and sometimes the 4870 performs a little better than the GTX 260 and other times it's the other way around, but I think there's a performance difference even between different brands of the same card.

sentenced83

I never said you needed an upgrade, I just said that there are components in your system that would benefit from an upgrade more than your graphics card.

And at stock speeds the HD4870 does indeed have a slight edge over the GTX 260, but the really important factor (to me) is that AA and DirectX 10 performance are where the card really shines. That is likely indicative of a superior architecture that will stay competitive for quite a while longer. Take the X1900 versus the 7900 series, for example. They went blow-for-blow back then for the most part (as the GTX 260 and HD4870 do now), but the X1900 series really started to shine at higher resolutions and with more AA enabled. Two years later, my X1900XT whoops even the 7900GTX in modern games; as it turns out, the overkill shader processing power back then ended up well-utilized.


#74 sentenced83
Member since 2005 • 1529 Posts
Don't mess with my head, man. I'm 90% fixed on the GTX, since they're also going to release a new version of the card. I had a 7900 GTO 512 and now the 8800 GT 512, so I don't want to get another 512MB card :(. As for other upgrades, I can't upgrade my CPU because then I'd need to change my motherboard and power supply, which would cost much more than a 3D card (and btw, my 7900 GTO used to rape my friend's X1900 XT, or something like that).

#75 RayvinAzn
Member since 2004 • 12552 Posts

They're going to release a 1GB version of the HD4870, how's that for messing with your head?

And I was inches away from picking up a 7900GTO myself, those were an excellent deal. They got snatched up quickly though, and I missed my chance to get one. I have no doubt it outperformed the X1900XT most of the time when it was new, but over time games have become more shader intensive and favor the X1900 series cards more.


#76 loco145
Member since 2006 • 12226 Posts
Both are good cards. Leave it at that.

#77 sentenced83
Member since 2005 • 1529 Posts
Guess I'm gonna wait and see how the new generation of the GTX 260 performs (in mid-September) :D

#78 sentenced83
Member since 2005 • 1529 Posts

vanilla gtx against vanilla 4870

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/8869-bfg-geforce-gtx-260-896mb-video-card-review-15.html


#79 RayvinAzn
Member since 2004 • 12552 Posts

vanilla gtx against vanilla 4870

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/8869-bfg-geforce-gtx-260-896mb-video-card-review-15.html

sentenced83

Notice the DX10/AA trend? The extra memory on the GTX 260 definitely helps it out at higher resolutions, but the HD4870 handles AA and DirectX 10 better than the GTX 260 most of the time. A 1GB HD4870 should sort that problem out.


#80 sentenced83
Member since 2005 • 1529 Posts

At medium resolutions, even without AA, the GTX 260 is better!

I'm gonna play in DX9 (because DX10 is crap) at 1280x1024, and I'll upgrade the monitor later.


#81 wklzip
Member since 2005 • 13925 Posts

At medium resolutions, even without AA, the GTX 260 is better!

I'm gonna play in DX9 (because DX10 is crap) at 1280x1024, and I'll upgrade the monitor later.

sentenced83

If you really want something better than the HD4870 go for the GTX280.


#82 Wesker776
Member since 2005 • 7004 Posts

At medium resolutions, even without AA, the GTX 260 is better!

I'm gonna play in DX9 (because DX10 is crap) at 1280x1024, and I'll upgrade the monitor later.

sentenced83

:|

The HD4870 manages to outperform the GTX 280 up to 2560x1600, where the GTX 280's large memory buffer helps it pull through.

The HD4870 lands right in between the GTX 260 and GTX 280. This game is quite old, so the G200's large texture capabilities are probably pushing it through here.

The GTX 260, GTX 280 and HD4870 are all very close. But again: GTX 260 < HD4870 < GTX 280.

If this game weren't so heavily Nvidia funded, we'd probably see the HD4870 beating the GTX 280. Crysis is a game that should be taking advantage of complex, special effect shader algorithms (which is the RV770's strength), so it's questionable as to why the game doesn't reflect this in the performance results.


#83 Wesker776
Member since 2005 • 7004 Posts

WTF, GS ate half of my post!

Here's what I'm talking about: future games are heading toward more shader-heavy environments, which is where the strength of R600, RV670 and RV770 (or pretty much any ATI architecture) lies. Crysis should be showing something similar, but I guess Nvidia didn't like that, especially after handing Crytek a pay cheque.

Not too much to say here. The results speak for themselves.

Source:
http://www.techreport.com/articles.x/14990

Anyway, you can get the GTX 260 if you want, but the fact of the matter is that the HD4870 doesn't compete with the GTX 260; it goes after the GTX 280. What these tests don't show is that the HD4870 actually gets a solid lead over the GTX 280 once you enable higher levels of AA.


#84 kodex1717
Member since 2005 • 5925 Posts
I doubt that many benchmarks account for this, but if you rename 'Crysis.exe' to something else on an Nvidia system, you get a nice image quality boost. However, you also take a pretty sizable performance hit. It appears that Nvidia optimized for Crysis to the point of noticeably reducing image quality. Compare that to an ATi system: Crysis runs worse, but it looks a hell of a lot better.

#85 Wesker776
Member since 2005 • 7004 Posts

I doubt that many benchmarks account for this, but if you rename 'Crysis.exe' to something else on an Nvidia system, you get a nice image quality boost. However, you also take a pretty sizable performance hit. It appears that Nvidia optimized for Crysis to the point of noticeably reducing image quality. Compare that to an ATi system: Crysis runs worse, but it looks a hell of a lot better. kodex1717

Do you have a link for that claim?

That's an extremely low thing to do, especially when everyone is so concerned about FPS in Crysis (not many people take into account image quality these days :( ).


#86 JP_Russell
Member since 2005 • 12893 Posts
I concur; link, please.

#87 sentenced83
Member since 2005 • 1529 Posts
The link you provided shows that the 4870 is better, while mine shows otherwise. Which one are we supposed to believe?!

#88 swehunt
Member since 2008 • 3637 Posts

the link you provided shows that the 4870 is better , while mine shows otherwise , which one are we supposed to believe ?!sentenced83

Anandtech perhaps? :roll:

Seriously, isn't it time to let this thread die yet? The HD4870 is able to outperform OC'd GTX260 cards.

An expensive OC'd $309 Nvidia FTW beat out the $259 stock ATI card...

Just be happy with your card; if you like it, it's up to you.

I'm gonna buy an HD4870 because of the performance it brings.