AMD Bulldozer review by Guru3d is up.

This topic is locked from further discussion.


#151 entropyecho
Member since 2005 • 22053 Posts

I cannot begin to express how disappointed I am with Bulldozer. :(


#152 kaitanuvax
Member since 2007 • 3814 Posts

lol does everyone on here have a duo core that waited to upgrade to sandybridge/would-have-been bulldozer?


#153 ravenguard90
Member since 2005 • 3064 Posts

lol does everyone on here have a duo core that waited to upgrade to sandybridge/would-have-been bulldozer?

kaitanuvax

I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(


#154 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="ferret-gamer"][QUOTE="wis3boi"] my friend had an nvidia 5000 series up until a few years ago. That thing was awful beyond words. It could barely run any DX 9 games

wis3boi

I tried running Oblivion on an FX 5200; its failure at rendering it correctly was glorious.

hehe same game he tried using it on :D

Those cards had a difficult time playing Battlefield 1942 and Half-Life 2. I find it funny that you held onto it that long; by the time Oblivion came out I was using a 7800 GT.


#155 Tezcatlipoca666
Member since 2006 • 7241 Posts

[QUOTE="kaitanuvax"]

lol does everyone on here have a duo core that waited to upgrade to sandybridge/would-have-been bulldozer?

ravenguard90

I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(

If you already have a 6 core 4Ghz Phenom II then I really don't see why you would need to upgrade yet.


#156 ravenguard90
Member since 2005 • 3064 Posts

Well, when you have a friend constantly bashing you for your low 3DMark11 Physics score compared to his i5 2500k, the urge to upgrade is ever so slightly higher :P

Plus the fact that a bottleneck is apparent from my processor, since lowering my resolution in the Unigine benchmark does nada to my framerates.
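The quick test described above (drop the resolution and see whether the frame rate moves) can be sketched as a toy check. The function name and all frame-rate numbers below are hypothetical, not taken from any actual benchmark run:

```python
# Toy CPU-bottleneck check: if dropping the resolution barely changes the
# frame rate, the GPU was not the limiting factor, so the CPU (or something
# else upstream) is. All numbers are made up for illustration.

def likely_cpu_bound(fps_high_res: float, fps_low_res: float,
                     tolerance: float = 0.10) -> bool:
    """Return True if lowering the resolution changed fps by < tolerance."""
    return abs(fps_low_res - fps_high_res) / fps_high_res < tolerance

# Hypothetical Unigine-style runs at 1920x1080 vs 1280x720:
print(likely_cpu_bound(48.0, 49.5))   # barely moved -> CPU-bound
print(likely_cpu_bound(48.0, 85.0))   # scaled with resolution -> GPU-bound
```

Seeing essentially flat fps across resolutions, as described in the post, is the classic sign that the processor is the limiter.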


#157 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="ravenguard90"]

[QUOTE="kaitanuvax"]

lol does everyone on here have a duo core that waited to upgrade to sandybridge/would-have-been bulldozer?

Tezcatlipoca666

I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(

If you already have a 6 core 4Ghz Phenom II then I really don't see why you would need to upgrade yet.

I went from one of those (3.9ghz with the 3ghz NB which makes a big difference) to the 2500K (runs at 4.5ghz) and there is a world of difference.

1090t: http://3dmark.com/3dm11/870273

2500K: http://3dmark.com/3dm11/1331694

Those are my 3DMark score differences; now my GPUs are the bottleneck.

PhysicsTest: 28.27 FPS vs 21.18 FPS

CombinedTest: 33.72 FPS vs 26.44 FPS

7 fps on a test like that, going from a hexacore to a quad core, is amazing in my book. It just shows the clock-for-clock difference between Intel and AMD; they are generations apart if a quad core smokes a hexacore on a CPU test like that.
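For scale, the relative gap between those two sub-scores works out as below. This is plain arithmetic on the FPS figures quoted in the post; the helper name is made up:

```python
# Percentage lead of the 2500K over the 1090t in the two 3DMark11 sub-tests
# quoted above (the FPS figures come from the post, nothing else assumed).

def pct_lead(new: float, old: float) -> float:
    """Relative improvement of `new` over `old`, in percent."""
    return (new - old) / old * 100

physics = pct_lead(28.27, 21.18)   # Physics test
combined = pct_lead(33.72, 26.44)  # Combined test
print(f"Physics:  {physics:.1f}% faster")   # Physics:  33.5% faster
print(f"Combined: {combined:.1f}% faster")  # Combined: 27.5% faster
```

So the quoted numbers amount to roughly a 28-34% lead for the quad-core 2500K over the six-core 1090t in those sub-tests.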


#158 Tezcatlipoca666
Member since 2006 • 7241 Posts

[QUOTE="Tezcatlipoca666"]

[QUOTE="ravenguard90"]

I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(

theshadowhunter

If you already have a 6 core 4Ghz Phenom II then I really don't see why you would need to upgrade yet.

I went from one of those (4ghz with the 3ghz NB which makes a big difference) to the 2500K (runs at 4.5ghz) and there is a world of difference.

Yes, there must truly be a WORLD of difference :lol: How did you ever survive?


#159 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"][QUOTE="Tezcatlipoca666"]

If you already have a 6 core 4Ghz Phenom II then I really don't see why you would need to upgrade yet.

Tezcatlipoca666

I went from one of those (4ghz with the 3ghz NB which makes a big difference) to the 2500K (runs at 4.5ghz) and there is a world of difference.

Yes, there must truly be a WORLD of difference :lol: How did you ever survive?

Well, if games ever took advantage of the 6 cores I wouldn't have seen a huge difference between the two, but the fact is that not many games use more than 2-4 cores effectively, and my benchmark shows that even when they do, the 2500K smokes the 1090t.
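The point about games only using 2-4 cores effectively can be sketched with a toy Amdahl's-law model. Every number here (per-core speed, parallel fraction) is made up for illustration and does not come from any benchmark:

```python
# Toy Amdahl's-law sketch: when a game only scales across a few cores,
# per-core speed matters far more than core count. Numbers are invented.

def game_speed(per_core_speed: float, cores: int,
               parallel_frac: float) -> float:
    """Relative throughput under Amdahl's law with `cores` usable cores."""
    serial = 1.0 - parallel_frac
    return per_core_speed / (serial + parallel_frac / cores)

# Hypothetical: a quad core with 1.4x per-core speed vs a hexacore at 1.0x,
# in a game where only ~50% of the work scales across cores.
quad = game_speed(1.4, 4, 0.5)
hexa = game_speed(1.0, 6, 0.5)
print(quad > hexa)  # True: the faster-per-core quad wins
```

Under these assumed numbers, the two extra cores cannot make up for the per-core deficit while half the work stays serial, which is the 2500K-vs-1090t situation in miniature.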

#160 Tezcatlipoca666
Member since 2006 • 7241 Posts

[QUOTE="Tezcatlipoca666"]

[QUOTE="theshadowhunter"] I went from one of those (4ghz with the 3ghz NB which makes a big difference) to the 2500K (runs at 4.5ghz) and there is a world of difference.

theshadowhunter

Yes, there must truly be a WORLD of difference :lol: How did you ever survive?

well if games ever took advantage of the 6 cores I wouldnt have seen a huge difference between the two, but the facts are not much games use more than 2-4 cores that effectively, and my benchmark shows even when they do the 2500K smokes the 1090t.

I would have said the same for a 4 core Phenom II at 4Ghz. While benchmarks and fancy charts make the difference seem large, the truth is that both provide smooth gameplay...

I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.


#161 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"][QUOTE="Tezcatlipoca666"]

Yes, there must truly be a WORLD of difference :lol: How did you ever survive?

Tezcatlipoca666

well if games ever took advantage of the 6 cores I wouldnt have seen a huge difference between the two, but the facts are not much games use more than 2-4 cores that effectively, and my benchmark shows even when they do the 2500K smokes the 1090t.

I would have said the same for a 4 core Phenom II at 4Ghz. While benchmarks and fancy charts make the difference seem large the truth is that both provide smooth gameplay...

I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.

Tbh that isn't all true; I couldn't play games like Metro 2033 smoothly on my setup with the settings on ultra when I had the 1090t, and I can now with my 2500K. So those small FPS differences do make a world of difference in some games, like Metro 2033 as I said. I saw that the Phenom II was a dead-end upgrade path and I had it for over a year, so I sold it and bought the 2500K, which in trade was faster, and I sold my 1090t setup for a good price, so there was nothing stopping me from upgrading.

Plus now I have the option to upgrade to even better CPUs next year with my same motherboard, while the AM3 line is dead. They already have the BIOS for the 22nm CPUs for my motherboard out. It reminds me of when I went from my p965 to the p35 chipset; the p965 was dead and the p35 supported the 45nm CPUs that came out the next year.


#162 kaitanuvax
Member since 2007 • 3814 Posts

While you may have gained a frame or two in Metro 2033, it is hardly a CPU-intensive game but rather a GPU-intensive one.

And tbh I laughed when you posted synthetic benchmark scores after you said there was a world of difference between PII and Sandy Bridge.

That just goes to show that PII is still money in REAL games. And as long as you game and no more, you frankly don't need better. That's why I find this 2500k frenzy so puzzling.


#163 GummiRaccoon
Member since 2003 • 13799 Posts

[QUOTE="Tezcatlipoca666"]

[QUOTE="theshadowhunter"] well if games ever took advantage of the 6 cores I wouldnt have seen a huge difference between the two, but the facts are not much games use more than 2-4 cores that effectively, and my benchmark shows even when they do the 2500K smokes the 1090t.

theshadowhunter

I would have said the same for a 4 core Phenom II at 4Ghz. While benchmarks and fancy charts make the difference seem large the truth is that both provide smooth gameplay...

I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.

tbh that isnt all true, I couldnt play games like metro 2033 smoothly on my setup with the settings on ultra when I had the 1090t, I can now with my 2500K. so those small FPS differences do make a world of difference on some games, like metro 2033 like I said. I saw that the phenom II was a dead end upgrade path and I had it for over a year, so I sold it, and bought the 2500K, which in trade was faster, and I sold my 1090t setup for a good price, so there was nothing stopping me from upgrading.

plus now I have the option to upgrade to even better CPU's next year with my same motherboard, while the AM3 line is dead. They already have the bios for the 22nm cpus for my motherboard out. I reminds me when I went from my p965 to the p35 chipset, the p965 was dead and the p35 supported the 45nm cpu's that came out the next year.

>Most graphically intensive game out there

>Claims upgrading CPU, which wasn't the bottleneck improved performance

>2011

>ISHYGDDT


#164 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"]

[QUOTE="Tezcatlipoca666"]

I would have said the same for a 4 core Phenom II at 4Ghz. While benchmarks and fancy charts make the difference seem large the truth is that both provide smooth gameplay...

I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.

GummiRaccoon

tbh that isnt all true, I couldnt play games like metro 2033 smoothly on my setup with the settings on ultra when I had the 1090t, I can now with my 2500K. so those small FPS differences do make a world of difference on some games, like metro 2033 like I said. I saw that the phenom II was a dead end upgrade path and I had it for over a year, so I sold it, and bought the 2500K, which in trade was faster, and I sold my 1090t setup for a good price, so there was nothing stopping me from upgrading.

plus now I have the option to upgrade to even better CPU's next year with my same motherboard, while the AM3 line is dead. They already have the bios for the 22nm cpus for my motherboard out. I reminds me when I went from my p965 to the p35 chipset, the p965 was dead and the p35 supported the 45nm cpu's that came out the next year.

>Most graphically intensive game out there

>Claims upgrading CPU, which wasn't the bottleneck improved performance

>2011

>ISHYGDDT

there are many reviews on the internet that show that the P II bottlenecks a crossfire setup.. (Unless at 25x16 res)

#165 kaitanuvax
Member since 2007 • 3814 Posts

there are many reviews on the internet that show that the P II bottlenecks a crossfire setup.. (Unless at 25x16 res)

theshadowhunter

Even if such reviews exist, just ask yourself: what frames is a "crossfire" setup getting? 100 fps on a PII as opposed to 125 on a Sandy Bridge? Are you really going to use that as a basis for the superiority of Sandy Bridge? A more likely argument would be, say, 20 fps vs 30 fps.


#166 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"] there are many reviews on the internet that show that the P II bottlenecks a crossfire setup.. (Unless at 25x16 res)

kaitanuvax

Even if such reviews exists, just ask yourself: what frames are is a "crossfire" setup getting? 100 fps on a PII as opposed to 125 on a Sandy Bridge? Are you really going to use that as a basis for the superiority of Sandy Bridge? A more likely argument would be say 20 fps vs 30 fps.

My point is, everyone talks about future-proofing here. Next-gen GPUs will be more powerful than my crossfire setup (I REALLY HOPE SO, so I can sell them), and if you buy a high-end GPU setup you probably plan on keeping it for a while, and plan on upgrading it too. If you stick a lousy CPU with that you will see a bottleneck, and more of one when more demanding apps/games come out; right now it's 100+ fps, but by then it will be lower than that, I assure you. My point is that a PII is a bottleneck for them, and if it bottlenecks these it will bottleneck future GPUs worse (i.e. 6xx or 7xxx series, or even a 6990 or 6970 crossfire, or a 570/580 SLI setup). Heck, a 2600K (not overclocked) bottlenecks an SLI 580 setup.


#167 kaitanuvax
Member since 2007 • 3814 Posts

my point is, everyone talks about future proofing here, next gen GPU's will be more powerful than my croosfire setup ( I REALLY HOPE SO, so I can sell them) and if you buy a high end gpu setup you probably plan on keeping it for a while, and plan on upgrading it too. if you stick a lousy cpu with that you will see a bottleneck, and more of one when more demanding apps/games come out and right now its 100+ fps, but by then it will be lower than that I assure you. my point is a P II is a bottleneck for them, and if it bottlenecks that it will bottleneck future GPU's worse. (IE 6xx, 7xxx series or even a 6990 or 6970 crossfire, or a 570/580 SLI setup) heck a 2600K (not overclocked) bottlenecks a sli 580 setup.

theshadowhunter

Sigh... I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU fails to perform WELL in a game - you should not be talking if you are getting 50-60+ FPS on it.

What you should instead focus your attention on is the progression of games utilizing more cores. They have JUST begun to make use of four cores, and it will be quite a long time (think next-gen console release) before mainstream quads (such as the PII) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 fps instead of 70 fps because you spent $400 on a new motherboard and CPU just so it can look pretty next to your recently bought GPU.

/rant


#168 Tezcatlipoca666
Member since 2006 • 7241 Posts

[QUOTE="theshadowhunter"]

my point is, everyone talks about future proofing here, next gen GPU's will be more powerful than my croosfire setup ( I REALLY HOPE SO, so I can sell them) and if you buy a high end gpu setup you probably plan on keeping it for a while, and plan on upgrading it too. if you stick a lousy cpu with that you will see a bottleneck, and more of one when more demanding apps/games come out and right now its 100+ fps, but by then it will be lower than that I assure you. my point is a P II is a bottleneck for them, and if it bottlenecks that it will bottleneck future GPU's worse. (IE 6xx, 7xxx series or even a 6990 or 6970 crossfire, or a 570/580 SLI setup) heck a 2600K (not overclocked) bottlenecks a sli 580 setup.

kaitanuvax

Sigh..I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU fails to perform WELL on a game - you should not be talking if you are getting 50-60+ FPS on it.

What you should instead focus your attention on is the progression of games ultilizing more cores. They have JUST begun to make use of four cores, and it will quite a long time (think next gen consoles release) before mainstream quads (such as the PII) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 fps instead of 70 fps because you spent $400 on a new motherboard and cpu just so it can look pretty next to your recently bought GPU.

/rant

Gotta second this...

It's fine to do an i5 2500k build now. If you have the cash then why not? But to suggest that a Phenom II is not good enough and needs to be upgraded when games run just fine on it is... well, strange.


#169 hartsickdiscipl
Member since 2003 • 14787 Posts

[QUOTE="GummiRaccoon"]

[QUOTE="Gambler_3"]Well from the time I have been following PC hardware, the 2900XT is still hands down the biggest flop ever. Everyone was "sure" that it is going to be significantly faster than an 8800GTX and it turned out to be weaker than the 8800GTS.

It had a serious performance issue with anti-aliasing and it never got fixed by any driver updates. And it wasnt until the 5870 that AMD really got back in the high end market.

Gambler_3

There is no bigger flop than the FX 5800 launch.

Ya have heard alot about that but wasnt a PC hardware follower back then so cant really have an opinion myself.

FX 5800 was definitely a bigger flop than this. It was turrrrible.


#170 hartsickdiscipl
Member since 2003 • 14787 Posts

I really don't understand why people with 3ghz+ Core 2 Quads and Phenom II X4's in PCs used primarily for gaming with single-gpu setups would be thinking about upgrading to anything currently out. These processors can run anything smoothly and probably will for the next 1-2 years. I'm definitely waiting for some signs of choppy gameplay before I even think about upgrading. The next generation of CPUs that come out will probably justify an upgrade.


#171 wis3boi
Member since 2005 • 32507 Posts

[QUOTE="kaitanuvax"]

[QUOTE="theshadowhunter"]

my point is, everyone talks about future proofing here, next gen GPU's will be more powerful than my croosfire setup ( I REALLY HOPE SO, so I can sell them) and if you buy a high end gpu setup you probably plan on keeping it for a while, and plan on upgrading it too. if you stick a lousy cpu with that you will see a bottleneck, and more of one when more demanding apps/games come out and right now its 100+ fps, but by then it will be lower than that I assure you. my point is a P II is a bottleneck for them, and if it bottlenecks that it will bottleneck future GPU's worse. (IE 6xx, 7xxx series or even a 6990 or 6970 crossfire, or a 570/580 SLI setup) heck a 2600K (not overclocked) bottlenecks a sli 580 setup.

Tezcatlipoca666

Sigh..I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU fails to perform WELL on a game - you should not be talking if you are getting 50-60+ FPS on it.

What you should instead focus your attention on is the progression of games ultilizing more cores. They have JUST begun to make use of four cores, and it will quite a long time (think next gen consoles release) before mainstream quads (such as the PII) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 fps instead of 70 fps because you spent $400 on a new motherboard and cpu just so it can look pretty next to your recently bought GPU.

/rant

Gotta second this...

It's fine to build an i5 2500k build now. If you have the cash then why not? But to suggest that a Phenom II is not good enough and needs to be upgraded when games run just fine on it is.. well strange.

Pretty much... I run an X6 1100T at 4ghz with my 570; it eats Metro, Crysis, BF3 alive

#172 C_Rule
Member since 2008 • 9816 Posts

[QUOTE="theshadowhunter"]

my point is, everyone talks about future proofing here, next gen GPU's will be more powerful than my croosfire setup ( I REALLY HOPE SO, so I can sell them) and if you buy a high end gpu setup you probably plan on keeping it for a while, and plan on upgrading it too. if you stick a lousy cpu with that you will see a bottleneck, and more of one when more demanding apps/games come out and right now its 100+ fps, but by then it will be lower than that I assure you. my point is a P II is a bottleneck for them, and if it bottlenecks that it will bottleneck future GPU's worse. (IE 6xx, 7xxx series or even a 6990 or 6970 crossfire, or a 570/580 SLI setup) heck a 2600K (not overclocked) bottlenecks a sli 580 setup.

kaitanuvax

Sigh..I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU fails to perform WELL on a game - you should not be talking if you are getting 50-60+ FPS on it.

What you should instead focus your attention on is the progression of games ultilizing more cores. They have JUST begun to make use of four cores, and it will quite a long time (think next gen consoles release) before mainstream quads (such as the PII) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 fps instead of 70 fps because you spent $400 on a new motherboard and cpu just so it can look pretty next to your recently bought GPU.

/rant

Even if your CPU is limiting your GPU performance to 200 fps out of a possible 300, it's still technically (and by the definition of the word) a bottleneck.
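C_Rule's definition boils down to a min() model: the slowest stage caps the output, and that stage is "the bottleneck" by definition even when the capped number is already huge. A minimal sketch, with illustrative numbers only:

```python
# Minimal bottleneck model: the frame rate you actually see is capped by the
# slowest stage in the pipeline. Whichever component sets the min() is "the
# bottleneck", even at already-huge frame rates. Numbers are illustrative.

def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Frame rate delivered when each component alone could hit its cap."""
    return min(cpu_fps_cap, gpu_fps_cap)

print(effective_fps(200.0, 300.0))  # CPU caps a 300-fps-capable GPU at 200.0
```

This is exactly the 200-out-of-300 example from the post: the CPU is technically the bottleneck, whether or not that matters in practice.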

#173 kaitanuvax
Member since 2007 • 3814 Posts

Even if your CPU is limiting your GPU performance to 200 fps out of a possible 300, it's still technically (and by the definition of the word) a bottleneck.

C_Rule

Yes, that's why I said "intended emphasis" and not "meaning".


#174 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"]

my point is, everyone talks about future proofing here, next gen GPU's will be more powerful than my croosfire setup ( I REALLY HOPE SO, so I can sell them) and if you buy a high end gpu setup you probably plan on keeping it for a while, and plan on upgrading it too. if you stick a lousy cpu with that you will see a bottleneck, and more of one when more demanding apps/games come out and right now its 100+ fps, but by then it will be lower than that I assure you. my point is a P II is a bottleneck for them, and if it bottlenecks that it will bottleneck future GPU's worse. (IE 6xx, 7xxx series or even a 6990 or 6970 crossfire, or a 570/580 SLI setup) heck a 2600K (not overclocked) bottlenecks a sli 580 setup.

kaitanuvax

Sigh..I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU fails to perform WELL on a game - you should not be talking if you are getting 50-60+ FPS on it.

What you should instead focus your attention on is the progression of games ultilizing more cores. They have JUST begun to make use of four cores, and it will quite a long time (think next gen consoles release) before mainstream quads (such as the PII) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 fps instead of 70 fps because you spent $400 on a new motherboard and cpu just so it can look pretty next to your recently bought GPU.

/rant

My point is that people who use DUAL CARD setups, NOT SINGLE CARD setups, and FUTURE HIGH-END GPUs shouldn't use a Phenom II. Did you read my post, or did you just see the word bottleneck and rant off that? No offense, but it looks like that. More cores mean nothing (AMD just proved this); it's clock-for-clock performance that matters, and in that Intel owns AMD.


#175 mitu123
Member since 2006 • 155290 Posts

[QUOTE="Gambler_3"]

[QUOTE="GummiRaccoon"]

There is no bigger flop than the FX 5800 launch.

hartsickdiscipl

Ya have heard alot about that but wasnt a PC hardware follower back then so cant really have an opinion myself.

FX 5800 was definitely a bigger flop than this. It was turrrrible.

I used to own one, then again back then I had no experience with PC hardware...and then got a different gpu afterwards.:lol:


#176 kaitanuvax
Member since 2007 • 3814 Posts

my point is people that use DUAL CARD setups NOT SINGLE CARD setups, and FUTURE HIGHEND GPU's shouldnt use a phenom II. did you read my post or did you just see the word bottleneck and rant off that? no offense but it looks like that. more cores means nothing (AMD just proved this) its clock for clock performance that matters, and in that intel owns AMD.

theshadowhunter

No, I read your post, it's you that didn't read or comprehend mine.

A dual card setup means 60+ FPS in the majority of cases, which can be achieved with a PII or any better CPU. Yes, this is when the CPU becomes the "bottleneck". But are you really going to say not to get a PII in this case? When it clearly DOMINATES the game, even though it may not clearly be the best? Why do you think a PII isn't good enough? Just because it's old?

Future high-end GPUs? No, that's where I told you to focus your attention instead on the progression of games utilizing more and more cores. I'll repeat myself, because you seem to have a lot of steam in you and didn't carefully read over my post - games are JUST starting to utilize quad cores. Thus mainstream quads such as the PII will NOT be obsolete for a long time to come, which is probably when the next-gen consoles roll around.

Just because it was a rant doesn't mean it was a pointless post that I just spat out spontaneously.


#177 millerlight89
Member since 2007 • 18658 Posts
I was really hoping for more than this. I was rooting for AMD :(

#178 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"]

my point is people that use DUAL CARD setups NOT SINGLE CARD setups, and FUTURE HIGHEND GPU's shouldnt use a phenom II. did you read my post or did you just see the word bottleneck and rant off that? no offense but it looks like that. more cores means nothing (AMD just proved this) its clock for clock performance that matters, and in that intel owns AMD.

kaitanuvax

No, I read your post, it's you that didn't read or comprehend mine.

Dual card set up means 60+ FPS in the majority of cases. Which can be achieved through a PII or any better CPU. Yes this is when the CPU becomes the "bottleneck". But are you really going to say not to get a PII in this case? When it clearly DOMINATES the game, even though it may not clearly be the best? Why do you think a PII isn't good enough? Just because it's old?

Future high end GPUs? No, that's where I told you to focus your attention instead to the progression of games ultilizing more and more cores. I'll repeat myself just because you seem like you have alot of steam in you to not carefully read over my post - games are JSUT starting to ultilize quad cores. Thus mainstream quads such as the PII will NOT be obsolete for a long time to come, which is probably when hte next gen consoles roll around.

Just because it was a rant doesn't mean it was a pointless post that I just spat spontateously.

The PIIs are obsolete: something else exists that's better than them; heck, clock for clock the C2Qs are faster, and all 3 generations of the i7/i5s are faster clock for clock. Right now, unless you are on an extreme budget, you shouldn't buy one brand new; and if you were on that budget, you wouldn't own a dual card setup. You would have to be a blind fanboy to buy a new PII setup for a dual card setup. PIIs were built to go against the C2Qs because the PIs failed at this. By the time games use quad cores (or more) better, the Phenom IIs will look like the Phenom Is look now (NO one wants one), and this is even worse for Bulldozer.

And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.


#179 kaitanuvax
Member since 2007 • 3814 Posts

Sorry I stopped reading at "PIIs are obsolete". In reference to your post in another topic, I think you are more a Sandy Bridge fanboy than the AMD fanboy you profess yourself to be.


#180 theshadowhunter
Member since 2004 • 2956 Posts

Sorry I stopped reading at "PIIs are obsolete". In reference to your post in another topic, I think you are more a Sandy Bridge fanboy than the AMD fanboy you profess yourself to be.

kaitanuvax

Obsolete: "cause (a product or idea) to be or become obsolete by replacing it with something new: 'we're obsoleting last year's designs'." By that definition PIIs are obsolete: they are outperformed clock for clock by all i7s, and Bulldozer replaced them. I have owned AMD since the K5 days. Conroe made me change my opinion a bit, though; I saw AMD's reaction and it was hard not to get an Intel setup. So I sold my Opteron 148 for an E6400 (the biggest upgrade difference I have ever seen in my history of upgrading), then I bought the Xeon E8400 equivalent because all AMD had out was the Phenom I, and then I bought a 1090t to try out AMD again because I thought more cores = better. I won't make that mistake again. I don't see myself buying another AMD rig until they can top what I currently own clock for clock. Clock for clock is what AMD was great at before (Athlon XPs, Athlon 64s, etc.); then Conroe came out and they lost that.


#181 hartsickdiscipl
Member since 2003 • 14787 Posts

[QUOTE="kaitanuvax"]

[QUOTE="theshadowhunter"]

My point is that people who use DUAL CARD setups, NOT SINGLE CARD setups, and future high-end GPU's shouldn't use a Phenom II. Did you read my post, or did you just see the word bottleneck and rant off it? No offense, but it looks like that. More cores mean nothing (AMD just proved this); it's clock-for-clock performance that matters, and there Intel owns AMD.

theshadowhunter

No, I read your post; it's you who didn't read or comprehend mine.

Dual card setups mean 60+ FPS in the majority of cases, which can be achieved with a P II or any better CPU. Yes, this is when the CPU becomes the "bottleneck". But are you really going to say not to get a P II in this case, when it clearly DOMINATES the game even if it isn't clearly the best? Why do you think a P II isn't good enough? Just because it's old?

Future high-end GPUs? No, that's where I told you to focus your attention instead on the progression of games utilizing more and more cores. I'll repeat myself, since you seem to have a lot of steam in you and didn't carefully read my post: games are JUST starting to utilize quad cores. Thus mainstream quads such as the P II will NOT be obsolete for a long time to come, probably not until the next-gen consoles roll around.

Just because it was a rant doesn't mean it was a pointless post that I just spat out spontaneously.

The P II's are obsolete; something better exists, heck, clock for clock the C2Q's are faster, and all three generations of the i5/i7's are faster clock for clock. Right now, unless you are on an extreme budget, you shouldn't buy one brand new, and on that kind of budget you wouldn't own a dual card setup anyway. You would have to be a blind fanboy to buy a new P II setup for a dual card rig. P II's were built to go against the C2Q's because the P I's failed at that. By the time games use quad cores (or more) better, the Phenom II's will look like the Phenom I's look now: NO one wants one, and it's even worse for Bulldozer.

And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.

There are really only 2 generations of i5/i7. The i7's made for LGA 1156 weren't really a new generation; clock for clock they're basically the same as the original LGA 1366 Nehalem i7's.

The term "obsolete" usually indicates that a product no longer serves its practical purpose. That can't be said of the Phenom II. In fact, any Phenom II X4 over around 3GHz can run any game well and won't be a significant bottleneck to any single GPU in the VAST majority of games. They are still going strong. The fact that you can buy a 955 or 965 for $115 to $140 and a supporting motherboard for dirt cheap tells me the platform is still a very viable option for people on limited budgets. The i3 Sandy Bridge dual-cores are equal to or faster than these chips at stock in most games, but we know that the X4 pulls away in games that are heavily threaded. They will ultimately be better suited to handling the games of the next 2-3 years because of this.
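The bottleneck back-and-forth in this thread reduces to a simple model: delivered frame rate is capped by whichever of the CPU or GPU is slower, so adding a second card only helps up to the CPU's cap. A minimal sketch with made-up numbers (illustrative, not benchmarks):

```python
def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Delivered frame rate is limited by the slower component."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps, for illustration only
cpu_cap = 90.0      # frames/sec the CPU can prepare
one_card = 60.0     # frames/sec a single GPU can render
two_cards = 110.0   # idealized SLI/CrossFire scaling

print(effective_fps(cpu_cap, one_card))   # GPU-bound
print(effective_fps(cpu_cap, two_cards))  # now CPU-bound, but still well above 60
```

Under these numbers a "bottlenecked" CPU still delivers 90 fps with two cards, which is the crux of the disagreement: technically a bottleneck, practically more than playable.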


#182 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"]

[QUOTE="kaitanuvax"]

No, I read your post; it's you who didn't read or comprehend mine.

Dual card setups mean 60+ FPS in the majority of cases, which can be achieved with a P II or any better CPU. Yes, this is when the CPU becomes the "bottleneck". But are you really going to say not to get a P II in this case, when it clearly DOMINATES the game even if it isn't clearly the best? Why do you think a P II isn't good enough? Just because it's old?

Future high-end GPUs? No, that's where I told you to focus your attention instead on the progression of games utilizing more and more cores. I'll repeat myself, since you seem to have a lot of steam in you and didn't carefully read my post: games are JUST starting to utilize quad cores. Thus mainstream quads such as the P II will NOT be obsolete for a long time to come, probably not until the next-gen consoles roll around.

Just because it was a rant doesn't mean it was a pointless post that I just spat out spontaneously.

hartsickdiscipl

The P II's are obsolete; something better exists, heck, clock for clock the C2Q's are faster, and all three generations of the i5/i7's are faster clock for clock. Right now, unless you are on an extreme budget, you shouldn't buy one brand new, and on that kind of budget you wouldn't own a dual card setup anyway. You would have to be a blind fanboy to buy a new P II setup for a dual card rig. P II's were built to go against the C2Q's because the P I's failed at that. By the time games use quad cores (or more) better, the Phenom II's will look like the Phenom I's look now: NO one wants one, and it's even worse for Bulldozer.

And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.

There are really only 2 generations of i5/i7. The i7's made for LGA 1156 weren't really a new generation; clock for clock they're basically the same as the original LGA 1366 Nehalem i7's.

The term "obsolete" usually indicates that a product no longer serves its practical purpose. That can't be said of the Phenom II. In fact, any Phenom II X4 over around 3GHz can run any game well and won't be a significant bottleneck to any single GPU in the VAST majority of games. They are still going strong. The fact that you can buy a 955 or 965 for $115 to $140 and a supporting motherboard for dirt cheap tells me the platform is still a very viable option for people on limited budgets. The i3 Sandy Bridge dual-cores are equal to or faster than these chips at stock in most games, but we know that the X4 pulls away in games that are heavily threaded. They will ultimately be better suited to handling the games of the next 2-3 years because of this.

For the high-end gamer, though, you will have a dual card setup. I highly doubt many people will be using P II's in 2-3 years (they came out in December 2008), but then again some people still have first/second-gen Conroes (my friend has one and plans on getting SB for Skyrim). Like I said, by the time the software catches up these CPU's will be irrelevant, just like the Phenom I is right now; let's take a poll of how many people still have one of those. I foresee the same fate for Bulldozer, but I could be wrong.

Bulldozer isn't a cheap CPU, and it's around the same price as SB, so there are no cost savings.

I don't know how this turned into a discussion about P II's when the topic is Bulldozer; my beef is with that, not P II's. Because I said Bulldozer/P II's bottleneck a decent SLI/CrossFire setup, I got this.... It doesn't matter how much of a bottleneck it is; a bottleneck is a bottleneck.

This is truly pathetic:

Like I said, I owned a P II; it wasn't pathetic, but something quite a bit better was out, and I usually upgrade when there is (if it's in my budget). That's why I got the 2500K.


#183 hartsickdiscipl
Member since 2003 • 14787 Posts

I don't know how this turned into a discussion about P II's when the topic is Bulldozer; my beef is with that, not P II's. Because I said Bulldozer/P II's bottleneck a decent SLI/CrossFire setup, I got this.... It doesn't matter how much of a bottleneck it is; a bottleneck is a bottleneck.

This is truly pathetic:

Like I said, I owned a P II; it wasn't pathetic, but something quite a bit better was out, and I usually upgrade when there is (if it's in my budget). That's why I got the 2500K.

theshadowhunter

Yes, Bulldozer is quite pathetic. You'll get no disagreement from me there.

This has turned into a general CPU upgrade discussion, as it was destined to. I would be willing to wager that in 2 years a 3.5GHz Phenom II X4 will be able to run 80% or more of the games on the market smoothly. The element people seem to be missing is that the hardware requirements for PC games aren't really moving much right now, especially CPU demands. Most games on the market right now will still run well on a 3GHz+ dual core, and completely smoothly on a 2.6 to 3GHz C2Q or P II X4.

The next generation of consoles probably won't be out for at least 18-24 months, and we won't see a huge increase in PC game system requirements between now and then. There will be a couple of titles a year that stress hardware, but that's about it. The quad-core CPUs that run games well now will probably run them well in 2 years.
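The "games are just starting to utilize quad cores" thread running through these posts can be framed with Amdahl's law: extra cores only pay off in proportion to the fraction of a game's work that actually runs in parallel. A sketch with assumed parallel fractions (illustrative values, not measurements of any real game):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup on `cores` cores when only
    `parallel_fraction` of the work can be split across them."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A lightly threaded game barely rewards a quad core...
print(f"{amdahl_speedup(0.3, 4):.2f}x")  # ~1.29x
# ...while a heavily threaded one gets much closer to the 4x ceiling
print(f"{amdahl_speedup(0.9, 4):.2f}x")  # ~3.08x
```

This is why "more cores = better" only holds once the software side catches up, which is exactly the disagreement about how long a quad like the Phenom II X4 stays useful.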


#184 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"]

I don't know how this turned into a discussion about P II's when the topic is Bulldozer; my beef is with that, not P II's. Because I said Bulldozer/P II's bottleneck a decent SLI/CrossFire setup, I got this.... It doesn't matter how much of a bottleneck it is; a bottleneck is a bottleneck.

This is truly pathetic:

Like I said, I owned a P II; it wasn't pathetic, but something quite a bit better was out, and I usually upgrade when there is (if it's in my budget). That's why I got the 2500K.

hartsickdiscipl

Yes, Bulldozer is quite pathetic. You'll get no disagreement from me there.

This has turned into a general CPU upgrade discussion, as it was destined to. I would be willing to wager that in 2 years a 3.5GHz Phenom II X4 will be able to run 80% or more of the games on the market smoothly. The element people seem to be missing is that the hardware requirements for PC games aren't really moving much right now, especially CPU demands. Most games on the market right now will still run well on a 3GHz+ dual core, and completely smoothly on a 2.6 to 3GHz C2Q or P II X4.

The next generation of consoles probably won't be out for at least 18-24 months, and we won't see a huge increase in PC game system requirements between now and then. There will be a couple of titles a year that stress hardware, but that's about it. The quad-core CPUs that run games well now will probably run them well in 2 years.

lol, the sad thing is, my Opteron 170 (2.65GHz) still plays most games, and it is paired with an 8800 GTX. How sad is that? Tech from 2006 still plays modern games? Consoles did this to us.... It wasn't always like this, though. That won't stop me from upgrading; I always like upgrading, it's a hobby.


#186 kaitanuvax
Member since 2007 • 3814 Posts

lol, the sad thing is, my Opteron 170 (2.65GHz) still plays most games, and it is paired with an 8800 GTX. How sad is that? Tech from 2006 still plays modern games? Consoles did this to us.... It wasn't always like this, though. That won't stop me from upgrading; I always like upgrading, it's a hobby.

theshadowhunter

Whatever floats your boat, just don't go saying things are obsolete or will be in the near future. It's a blessing in disguise anyway; it makes developers optimize their games more and more until they suck DX9 dry. More money saved for people who aren't addicted to upgrading, like me.


#187 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"]

lol, the sad thing is, my Opteron 170 (2.65GHz) still plays most games, and it is paired with an 8800 GTX. How sad is that? Tech from 2006 still plays modern games? Consoles did this to us.... It wasn't always like this, though. That won't stop me from upgrading; I always like upgrading, it's a hobby.

kaitanuvax

Whatever floats your boat, just don't go saying things are obsolete or will be in the near future. It's a blessing in disguise anyway; it makes developers optimize their games more and more until they suck DX9 dry. More money saved for people who aren't addicted to upgrading, like me.

That's the difference, then; I am a hardware enthusiast who always wants new stuff. I upgrade multiple times a year, but I understand not everyone is like that. It's my hobby and it's what I put my money into. I usually sell off hardware before it loses too much value, and I also run a computer business, so it's easy to find someone willing to buy used stuff at a discounted price (that's how I sold my 1090T setup). As long as someone is willing to buy my older components, I'll do it.


#188 wis3boi
Member since 2005 • 32507 Posts

[QUOTE="kaitanuvax"]

[QUOTE="theshadowhunter"]

lol, the sad thing is, my Opteron 170 (2.65GHz) still plays most games, and it is paired with an 8800 GTX. How sad is that? Tech from 2006 still plays modern games? Consoles did this to us.... It wasn't always like this, though. That won't stop me from upgrading; I always like upgrading, it's a hobby.

theshadowhunter

Whatever floats your boat, just don't go saying things are obsolete or will be in the near future. It's a blessing in disguise anyway; it makes developers optimize their games more and more until they suck DX9 dry. More money saved for people who aren't addicted to upgrading, like me.

That's the difference, then; I am a hardware enthusiast who always wants new stuff. I upgrade multiple times a year, but I understand not everyone is like that. It's my hobby and it's what I put my money into. I usually sell off hardware before it loses too much value, and I also run a computer business, so it's easy to find someone willing to buy used stuff at a discounted price (that's how I sold my 1090T setup). As long as someone is willing to buy my older components, I'll do it.

Well, most people don't do it that hardcore. I like having beefy hardware, but I'll never use multiple GPUs in my gaming PCs, nor will I buy top-of-the-line parts right when they drop. My 570 SSC and my 1100T at 4GHz are far more than any gamer needs to eat through just about every game at 1050p/1080p.

#189 tequilasunriser
Member since 2004 • 6379 Posts
It looks like I'll be grabbing an i5-2500k or 2600k. Finally, I can move on. :PElann2008
Seriously? Why on earth would you upgrade your 3.8GHz BE965 to an i5 2500? I can understand if you had a Phenom I or an E6800 or something... There is no need to make that upgrade lol. Why, Elann, why!?!?!

#190 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="theshadowhunter"]

[QUOTE="kaitanuvax"]

Whatever floats your boat, just don't go saying things are obsolete or will be in the near future. It's a blessing in disguise anyway; it makes developers optimize their games more and more until they suck DX9 dry. More money saved for people who aren't addicted to upgrading, like me.

wis3boi

That's the difference, then; I am a hardware enthusiast who always wants new stuff. I upgrade multiple times a year, but I understand not everyone is like that. It's my hobby and it's what I put my money into. I usually sell off hardware before it loses too much value, and I also run a computer business, so it's easy to find someone willing to buy used stuff at a discounted price (that's how I sold my 1090T setup). As long as someone is willing to buy my older components, I'll do it.

Well, most people don't do it that hardcore. I like having beefy hardware, but I'll never use multiple GPUs in my gaming PCs, nor will I buy top-of-the-line parts right when they drop. My 570 SSC and my 1100T at 4GHz are far more than any gamer needs to eat through just about every game at 1050p/1080p.

I usually don't buy dual card setups myself, but I wanted something more powerful than my 5870, and I was looking at getting a 6970 (would have been dumb) or the GTX 580 (even dumber because of the cost and performance difference). I found my 2nd card unused on overclock.net; it was a local deal and I got it for $150, way cheaper than the GTX 580 route I was considering, and it performs better too.

I am just an enthusiast, like I said. Not everyone is, I understand that, but I am not the only one who does this; overclock.net is full of people who do the same thing.

I am crazy, I know, but I plan on getting IB next year and selling my 2500K; that's why I didn't get the 2600K, because I was planning on replacing it next year. Call me crazy, but I can assure you I won't be the only one doing this, and when DX12 comes out (it had better be next year) I'll upgrade to that too. Or if a single GPU comes out that outperforms my current 5870 setup, I will get that, just so I can have a single GPU setup again; I hate having two cards...


#191 kaitanuvax
Member since 2007 • 3814 Posts

[QUOTE="Elann2008"]It looks like I'll be grabbing an i5-2500k or 2600k. Finally, I can move on. :Ptequilasunriser
Seriously? Why on earth would you upgrade your 3.8GHz BE965 to an i5 2500? I can understand if you had a Phenom I or an E6800 or something... There is no need to make that upgrade lol. Why, Elann, why!?!?!

See: hardware addicts


#192 theshadowhunter
Member since 2004 • 2956 Posts

[QUOTE="tequilasunriser"][QUOTE="Elann2008"]It looks like I'll be grabbing an i5-2500k or 2600k. Finally, I can move on. :Pkaitanuvax

Seriously? Why on earth would you upgrade your 3.8GHz BE965 to an i5 2500? I can understand if you had a Phenom I or an E6800 or something... There is no need to make that upgrade lol. Why, Elann, why!?!?!

See: hardware addicts

Now who's not being open-minded and accepting? It's a hobby, just like cars: some people are content driving older cars (some from the 80s-90s), while others want newer ones even though there is nothing wrong with their last one.

Except this hobby is cheaper than that, way cheaper, and makes more sense IMHO.


#193 davaniius
Member since 2007 • 498 Posts

[QUOTE="wis3boi"][QUOTE="theshadowhunter"] That's the difference, then; I am a hardware enthusiast who always wants new stuff. I upgrade multiple times a year, but I understand not everyone is like that. It's my hobby and it's what I put my money into. I usually sell off hardware before it loses too much value, and I also run a computer business, so it's easy to find someone willing to buy used stuff at a discounted price (that's how I sold my 1090T setup). As long as someone is willing to buy my older components, I'll do it.

theshadowhunter

Well, most people don't do it that hardcore. I like having beefy hardware, but I'll never use multiple GPUs in my gaming PCs, nor will I buy top-of-the-line parts right when they drop. My 570 SSC and my 1100T at 4GHz are far more than any gamer needs to eat through just about every game at 1050p/1080p.

I usually don't buy dual card setups myself, but I wanted something more powerful than my 5870, and I was looking at getting a 6970 (would have been dumb) or the GTX 580 (even dumber because of the cost and performance difference). I found my 2nd card unused on overclock.net; it was a local deal and I got it for $150, way cheaper than the GTX 580 route I was considering, and it performs better too.

I am just an enthusiast, like I said. Not everyone is, I understand that, but I am not the only one who does this; overclock.net is full of people who do the same thing.

I am crazy, I know, but I plan on getting IB next year and selling my 2500K; that's why I didn't get the 2600K, because I was planning on replacing it next year. Call me crazy, but I can assure you I won't be the only one doing this, and when DX12 comes out (it had better be next year) I'll upgrade to that too. Or if a single GPU comes out that outperforms my current 5870 setup, I will get that, just so I can have a single GPU setup again; I hate having two cards...

You're not crazy at all for the way you look at it. I agree with you.

I have a family member whose hobby is jewelry making, and they spend more on equipment and raw materials than we do on PC hardware, and we are not talking about gold either. They also say that they are crazy, but I always tell them, "It's your hobby, do whatever you want."


#195 AlphaJC
Member since 2010 • 712 Posts
5 years of development and they still can't beat Intel's mid-range proc. *sigh* Oh AMD, what happened?

#196 Mr_BillGates
Member since 2005 • 3211 Posts
[QUOTE="Mr_BillGates"]

The flop is near Fermi-level.

04dcarraher
How was Fermi a failure?

Came late like Bulldozer, NetBurst-like power consumption like Bulldozer, disappointing performance given the extra time like Bulldozer, runs hot like Bulldozer, and flops like Bulldozer.

#197 GummiRaccoon
Member since 2003 • 13799 Posts

[QUOTE="04dcarraher"][QUOTE="Mr_BillGates"]

The flop is near Fermi-level.

Mr_BillGates

How was Fermi a failure?

Came late like Bulldozer, NetBurst-like power consumption like Bulldozer, disappointing performance given the extra time like Bulldozer, runs hot like Bulldozer, and flops like Bulldozer.

You must live in some fantasy land where the Nvidia 4xx and 5xx graphics cards are bad.


#198 mitu123
Member since 2006 • 155290 Posts

[QUOTE="04dcarraher"][QUOTE="Mr_BillGates"]

The flop is near Fermi-level.

Mr_BillGates

How was Fermi a failure?

Came late like Bulldozer, NetBurst-like power consumption like Bulldozer, disappointing performance given the extra time like Bulldozer, runs hot like Bulldozer, and flops like Bulldozer.

Yet my 460s are serving me well.


#199 kaitanuvax
Member since 2007 • 3814 Posts

I think by Fermi he's referring to GF100 (GTX 465, 470, 480), not GF104 (GTX 460 768MB/1GB).


#200 Elann2008
Member since 2007 • 33028 Posts

[QUOTE="04dcarraher"]Also isn't Ivy delayed until 2013?mitu123

Last time I heard they would be out in Spring 2012, unless new reports confirm they are really delayed into 2013.

That sounds about the right time for me to upgrade, then. I'll wait for Ivy Bridge.