I cannot begin to express how disappointed I am with Bulldozer. :(
lol does everyone on here have a duo core that waited to upgrade to sandybridge/would-have-been bulldozer?
kaitanuvax
I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(
[QUOTE="ferret-gamer"][QUOTE="wis3boi"] My friend had an Nvidia 5000 series up until a few years ago. That thing was awful beyond words. It could barely run any DX9 games. I tried running Oblivion on an FX 5200; its failure at rendering it correctly was glorious. Hehe, same game he tried using it on :D
wis3boi
Those cards had a difficult time playing Battlefield 1942 and Half-Life 2. I find it funny that you held onto it that long; by the time Oblivion came out I was using a 7800 GT.
[QUOTE="kaitanuvax"]
lol does everyone on here have a duo core that waited to upgrade to sandybridge/would-have-been bulldozer?
ravenguard90
I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(
If you already have a six-core 4GHz Phenom II then I really don't see why you would need to upgrade yet.
Well, when you have a friend constantly bashing you for your low 3DMark11 Physics score compared to his i5 2500k, the urge to upgrade is ever so slightly higher :P
Plus the fact that a bottleneck is apparent from my processor, since lowering my resolution in the Unigine benchmark does nada to my framerates.
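The logic behind that resolution test can be sketched with a toy frame-time model (all per-frame costs below are made-up numbers, purely for illustration): the frame rate is set by whichever of the CPU or GPU takes longer per frame, so lowering the resolution only shrinks the GPU's cost and changes nothing once the CPU is the slower side.

```python
# Toy model of the "lower the resolution" CPU-bottleneck test.
# Per-frame costs are hypothetical, for illustration only.

def fps(cpu_ms, gpu_ms):
    """The slower of the two per-frame costs caps the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 16.0                        # fixed CPU cost: game logic, draw calls
GPU_MS = {"1920x1200": 15.0,         # GPU cost scales with resolution...
          "1280x800": 8.0}           # ...roughly halved at the lower res

for res, gpu_ms in GPU_MS.items():
    print(res, round(fps(CPU_MS, gpu_ms), 1))
# Both resolutions land on 62.5 FPS: the 16 ms CPU floor never moves,
# which is exactly the "does nada to my framerates" symptom.
```

If dropping the resolution had raised the frame rate, the GPU would have been the limiting side instead.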
[QUOTE="ravenguard90"]
[QUOTE="kaitanuvax"]
lol does everyone on here have a duo core that waited to upgrade to sandybridge/would-have-been bulldozer?
Tezcatlipoca666
I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(
If you already have a six-core 4GHz Phenom II then I really don't see why you would need to upgrade yet.
I went from one of those (3.9GHz with the 3GHz NB, which makes a big difference) to the 2500K (runs at 4.5GHz) and there is a world of difference.
1090T: http://3dmark.com/3dm11/870273
2500K: http://3dmark.com/3dm11/1331694
Those are my 3DMark score differences; now my GPUs are the bottleneck.
PhysicsTest: 28.27 FPS vs 21.18 FPS
CombinedTest: 33.72 FPS vs 26.44 FPS
A 7 FPS gap on a test like that, going from a hexa-core to a quad-core, is amazing in my book. It just shows the clock-for-clock difference between Intel and AMD: they are generations apart if a quad-core smokes a hexa-core on a CPU test like that.
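Those two Physics numbers can be turned into a rough per-core, per-clock comparison. Treating the posted specs as given (six cores near 4GHz on the 1090T, four cores at 4.5GHz on the 2500K), assuming the higher score belongs to the 2500K as the post implies, and ignoring turbo, NB/memory clocks, and threading efficiency, the back-of-the-envelope arithmetic looks like this:

```python
# Back-of-the-envelope comparison from the 3DMark11 Physics scores above.
# Ignores turbo, NB/memory clocks, and scheduling -- a rough sketch only.

fps_2500k, cores_2500k, ghz_2500k = 28.27, 4, 4.5   # i5 2500K result
fps_1090t, cores_1090t, ghz_1090t = 21.18, 6, 4.0   # Phenom II X6 result

speedup = fps_2500k / fps_1090t                      # overall gap
percore_2500k = fps_2500k / (cores_2500k * ghz_2500k)
percore_1090t = fps_1090t / (cores_1090t * ghz_1090t)

print(f"overall: {speedup:.2f}x")                          # ~1.33x overall
print(f"per core-GHz: {percore_2500k / percore_1090t:.2f}x")  # ~1.78x
```

In other words, the quad wins despite having a third fewer cores and only about 12% more clock, which is the "generations apart, clock for clock" point being made.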
[QUOTE="Tezcatlipoca666"][QUOTE="ravenguard90"]
I was waiting on BD to replace my processor. As you can see, I already had the motherboard ready since my other one kicked the bucket :(
theshadowhunter
If you already have a six-core 4GHz Phenom II then I really don't see why you would need to upgrade yet.
I went from one of those (4GHz with the 3GHz NB, which makes a big difference) to the 2500K (runs at 4.5GHz) and there is a world of difference.
Yes, there must truly be a WORLD of difference :lol: How did you ever survive?
[QUOTE="theshadowhunter"][QUOTE="Tezcatlipoca666"]
If you already have a six-core 4GHz Phenom II then I really don't see why you would need to upgrade yet.
Tezcatlipoca666
Yes, there must truly be a WORLD of difference :lol: How did you ever survive?
Well, if games ever took advantage of the six cores I wouldn't have seen a huge difference between the two, but the fact is not many games use more than 2-4 cores that effectively, and my benchmark shows that even when they do, the 2500K smokes the 1090T.
[QUOTE="Tezcatlipoca666"][QUOTE="theshadowhunter"] I went from one of those (4GHz with the 3GHz NB, which makes a big difference) to the 2500K (runs at 4.5GHz) and there is a world of difference.
theshadowhunter
Yes, there must truly be a WORLD of difference :lol: How did you ever survive?
Well, if games ever took advantage of the six cores I wouldn't have seen a huge difference between the two, but the fact is not many games use more than 2-4 cores that effectively, and my benchmark shows that even when they do, the 2500K smokes the 1090T.
I would have said the same for a 4-core Phenom II at 4GHz. While benchmarks and fancy charts make the difference seem large, the truth is that both provide smooth gameplay...
I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.
Well, if games ever took advantage of the six cores I wouldn't have seen a huge difference between the two, but the fact is not many games use more than 2-4 cores that effectively, and my benchmark shows that even when they do, the 2500K smokes the 1090T.
[QUOTE="theshadowhunter"][QUOTE="Tezcatlipoca666"]
Yes, there must truly be a WORLD of difference :lol: How did you ever survive?
Tezcatlipoca666
I would have said the same for a 4-core Phenom II at 4GHz. While benchmarks and fancy charts make the difference seem large, the truth is that both provide smooth gameplay...
I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.
To be honest that isn't all true. I couldn't play games like Metro 2033 smoothly with the settings on ultra when I had the 1090T; I can now with my 2500K. So those small FPS differences do make a world of difference in some games, like Metro 2033. I saw that the Phenom II was a dead-end upgrade path and I had it for over a year, so I sold it and bought the 2500K, which in trade was faster, and I sold my 1090T setup for a good price, so there was nothing stopping me from upgrading. Plus now I have the option to upgrade to even better CPUs next year on the same motherboard, while the AM3 line is dead. They already have the BIOS for the 22nm CPUs for my motherboard out. It reminds me of when I went from my P965 to the P35 chipset: the P965 was dead and the P35 supported the 45nm CPUs that came out the next year.
While you may have gained a frame or two in Metro 2033, it is hardly a CPU-intensive game; it's GPU-bound.
And tbh I laughed when you posted synthetic benchmark scores after you said there was a world of difference between PII and Sandy Bridge.
That just goes to show that PII is still money in REAL games. And as long as you game and no more, you frankly don't need better. That's why I find this 2500k frenzy so puzzling.
[QUOTE="Tezcatlipoca666"]
[QUOTE="theshadowhunter"] Well, if games ever took advantage of the six cores I wouldn't have seen a huge difference between the two, but the fact is not many games use more than 2-4 cores that effectively, and my benchmark shows that even when they do, the 2500K smokes the 1090T.
theshadowhunter
I would have said the same for a 4-core Phenom II at 4GHz. While benchmarks and fancy charts make the difference seem large, the truth is that both provide smooth gameplay...
I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.
To be honest that isn't all true. I couldn't play games like Metro 2033 smoothly with the settings on ultra when I had the 1090T; I can now with my 2500K. So those small FPS differences do make a world of difference in some games, like Metro 2033. I saw that the Phenom II was a dead-end upgrade path and I had it for over a year, so I sold it and bought the 2500K, which in trade was faster, and I sold my 1090T setup for a good price, so there was nothing stopping me from upgrading. Plus now I have the option to upgrade to even better CPUs next year on the same motherboard, while the AM3 line is dead. They already have the BIOS for the 22nm CPUs for my motherboard out. It reminds me of when I went from my P965 to the P35 chipset: the P965 was dead and the P35 supported the 45nm CPUs that came out the next year.
>Most graphically intensive game out there
>Claims upgrading CPU, which wasn't the bottleneck improved performance
>2011
>ISHYGDDT
To be honest that isn't all true. I couldn't play games like Metro 2033 smoothly with the settings on ultra when I had the 1090T; I can now with my 2500K. So those small FPS differences do make a world of difference in some games, like Metro 2033. I saw that the Phenom II was a dead-end upgrade path and I had it for over a year, so I sold it and bought the 2500K, which in trade was faster, and I sold my 1090T setup for a good price, so there was nothing stopping me from upgrading.
[QUOTE="theshadowhunter"]
[QUOTE="Tezcatlipoca666"]
I would have said the same for a 4-core Phenom II at 4GHz. While benchmarks and fancy charts make the difference seem large, the truth is that both provide smooth gameplay...
I'm not denying the fact that the 2500k is a lot faster but rather questioning the necessity of such an upgrade when a 4Ghz Phenom will already play pretty much everything smoothly.
GummiRaccoon
Plus now I have the option to upgrade to even better CPUs next year on the same motherboard, while the AM3 line is dead. They already have the BIOS for the 22nm CPUs for my motherboard out. It reminds me of when I went from my P965 to the P35 chipset: the P965 was dead and the P35 supported the 45nm CPUs that came out the next year.
>Most graphically intensive game out there
>Claims upgrading CPU, which wasn't the bottleneck improved performance
>2011
>ISHYGDDT
There are many reviews on the internet that show that the Phenom II bottlenecks a CrossFire setup (unless at 2560x1600).
theshadowhunter
Even if such reviews exist, just ask yourself: what frame rates is a CrossFire setup getting? 100 FPS on a Phenom II as opposed to 125 on a Sandy Bridge? Are you really going to use that as a basis for the superiority of Sandy Bridge? A more convincing argument would be, say, 20 FPS vs 30 FPS.
[QUOTE="theshadowhunter"] There are many reviews on the internet that show that the Phenom II bottlenecks a CrossFire setup (unless at 2560x1600).
kaitanuvax
Even if such reviews exist, just ask yourself: what frame rates is a CrossFire setup getting? 100 FPS on a Phenom II as opposed to 125 on a Sandy Bridge? Are you really going to use that as a basis for the superiority of Sandy Bridge? A more convincing argument would be, say, 20 FPS vs 30 FPS.
My point is, everyone talks about future-proofing here. Next-gen GPUs will be more powerful than my CrossFire setup (I REALLY HOPE SO, so I can sell them), and if you buy a high-end GPU setup you probably plan on keeping it for a while, and plan on upgrading it too. If you stick a lousy CPU with that you will see a bottleneck, and more of one when more demanding apps/games come out. Right now it's 100+ FPS, but by then it will be lower than that, I assure you. My point is a Phenom II is a bottleneck for them, and if it bottlenecks these cards it will bottleneck future GPUs worse (i.e. the 6xx/7xxx series, or even a 6990, a 6970 CrossFire, or a 570/580 SLI setup). Heck, a 2600K (not overclocked) bottlenecks an SLI 580 setup.
theshadowhunter
Sigh... I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU failed to perform WELL in a game; you should not be complaining if you are getting 50-60+ FPS on it.
What you should instead focus your attention on is the progression of games utilizing more cores. They have JUST begun to make use of four cores, and it will be quite a long time (think next-gen console release) before mainstream quads (such as the Phenom II) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 FPS instead of 70 FPS because you spent $400 on a new motherboard and CPU just so it can look pretty next to your recently bought GPU.
/rant
[QUOTE="theshadowhunter"]
My point is, everyone talks about future-proofing here. Next-gen GPUs will be more powerful than my CrossFire setup (I REALLY HOPE SO, so I can sell them), and if you buy a high-end GPU setup you probably plan on keeping it for a while, and plan on upgrading it too. If you stick a lousy CPU with that you will see a bottleneck, and more of one when more demanding apps/games come out. Right now it's 100+ FPS, but by then it will be lower than that, I assure you. My point is a Phenom II is a bottleneck for them, and if it bottlenecks these cards it will bottleneck future GPUs worse (i.e. the 6xx/7xxx series, or even a 6990, a 6970 CrossFire, or a 570/580 SLI setup). Heck, a 2600K (not overclocked) bottlenecks an SLI 580 setup.
kaitanuvax
Sigh... I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU failed to perform WELL in a game; you should not be complaining if you are getting 50-60+ FPS on it.
What you should instead focus your attention on is the progression of games utilizing more cores. They have JUST begun to make use of four cores, and it will be quite a long time (think next-gen console release) before mainstream quads (such as the Phenom II) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 FPS instead of 70 FPS because you spent $400 on a new motherboard and CPU just so it can look pretty next to your recently bought GPU.
/rant
Gotta second this...
It's fine to build an i5 2500K rig now. If you have the cash then why not? But to suggest that a Phenom II is not good enough and needs to be upgraded when games run just fine on it is... well, strange.
[QUOTE="GummiRaccoon"]
[QUOTE="Gambler_3"]Well from the time I have been following PC hardware, the 2900XT is still hands down the biggest flop ever. Everyone was "sure" that it is going to be significantly faster than an 8800GTX and it turned out to be weaker than the 8800GTS.
It had a serious performance issue with anti-aliasing that never got fixed by any driver update, and it wasn't until the 5870 that AMD really got back into the high-end market.
Gambler_3
There is no bigger flop than the FX 5800 launch.
Yeah, I have heard a lot about that, but I wasn't a PC hardware follower back then so I can't really have an opinion myself.
FX 5800 was definitely a bigger flop than this. It was turrrrible.
I really don't understand why people with 3ghz+ Core 2 Quads and Phenom II X4's in PCs used primarily for gaming with single-gpu setups would be thinking about upgrading to anything currently out. These processors can run anything smoothly and probably will for the next 1-2 years. I'm definitely waiting for some signs of choppy gameplay before I even think about upgrading. The next generation of CPUs that come out will probably justify an upgrade.
[QUOTE="kaitanuvax"]
[QUOTE="theshadowhunter"]
My point is, everyone talks about future-proofing here. Next-gen GPUs will be more powerful than my CrossFire setup (I REALLY HOPE SO, so I can sell them), and if you buy a high-end GPU setup you probably plan on keeping it for a while, and plan on upgrading it too. If you stick a lousy CPU with that you will see a bottleneck, and more of one when more demanding apps/games come out. Right now it's 100+ FPS, but by then it will be lower than that, I assure you. My point is a Phenom II is a bottleneck for them, and if it bottlenecks these cards it will bottleneck future GPUs worse (i.e. the 6xx/7xxx series, or even a 6990, a 6970 CrossFire, or a 570/580 SLI setup). Heck, a 2600K (not overclocked) bottlenecks an SLI 580 setup.
Tezcatlipoca666
Sigh... I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU failed to perform WELL in a game; you should not be complaining if you are getting 50-60+ FPS on it.
What you should instead focus your attention on is the progression of games utilizing more cores. They have JUST begun to make use of four cores, and it will be quite a long time (think next-gen console release) before mainstream quads (such as the Phenom II) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 FPS instead of 70 FPS because you spent $400 on a new motherboard and CPU just so it can look pretty next to your recently bought GPU.
/rant
Gotta second this...
It's fine to build an i5 2500K rig now. If you have the cash then why not? But to suggest that a Phenom II is not good enough and needs to be upgraded when games run just fine on it is... well, strange.
Pretty much... I run an X6 1100T at 4GHz with my 570, and it eats Metro, Crysis, and BF3 alive.
[QUOTE="theshadowhunter"]
My point is, everyone talks about future-proofing here. Next-gen GPUs will be more powerful than my CrossFire setup (I REALLY HOPE SO, so I can sell them), and if you buy a high-end GPU setup you probably plan on keeping it for a while, and plan on upgrading it too. If you stick a lousy CPU with that you will see a bottleneck, and more of one when more demanding apps/games come out. Right now it's 100+ FPS, but by then it will be lower than that, I assure you. My point is a Phenom II is a bottleneck for them, and if it bottlenecks these cards it will bottleneck future GPUs worse (i.e. the 6xx/7xxx series, or even a 6990, a 6970 CrossFire, or a 570/580 SLI setup). Heck, a 2600K (not overclocked) bottlenecks an SLI 580 setup.
kaitanuvax
Sigh... I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU failed to perform WELL in a game; you should not be complaining if you are getting 50-60+ FPS on it.
What you should instead focus your attention on is the progression of games utilizing more cores. They have JUST begun to make use of four cores, and it will be quite a long time (think next-gen console release) before mainstream quads (such as the Phenom II) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 FPS instead of 70 FPS because you spent $400 on a new motherboard and CPU just so it can look pretty next to your recently bought GPU.
/rant
Even if your CPU is limiting your GPU performance to 200 FPS out of a possible 300, it's still technically (and by the definition of the word) a bottleneck.
C_Rule
Yes, that's why I said "intended emphasis" and not "meaning".
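Both sides of this definition argument fit in a couple of lines of code. In the literal sense C_Rule uses, the slowest stage always sets the ceiling; kaitanuvax's point is just that a 200 FPS ceiling is academic while a 20 FPS ceiling actually hurts. A minimal sketch using the hypothetical numbers from the thread:

```python
# The literal definition: a pipeline runs at the rate of its slowest stage,
# no matter how fast that still is. Numbers are the hypothetical ones above.

def effective_fps(cpu_cap, gpu_cap):
    """Delivered frame rate is the minimum of the two stage limits."""
    return min(cpu_cap, gpu_cap)

print(effective_fps(200, 300))  # 200 -- technically a bottleneck, harmless
print(effective_fps(20, 30))    # 20  -- the kind that actually matters
```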
[QUOTE="theshadowhunter"]
My point is, everyone talks about future-proofing here. Next-gen GPUs will be more powerful than my CrossFire setup (I REALLY HOPE SO, so I can sell them), and if you buy a high-end GPU setup you probably plan on keeping it for a while, and plan on upgrading it too. If you stick a lousy CPU with that you will see a bottleneck, and more of one when more demanding apps/games come out. Right now it's 100+ FPS, but by then it will be lower than that, I assure you. My point is a Phenom II is a bottleneck for them, and if it bottlenecks these cards it will bottleneck future GPUs worse (i.e. the 6xx/7xxx series, or even a 6990, a 6970 CrossFire, or a 570/580 SLI setup). Heck, a 2600K (not overclocked) bottlenecks an SLI 580 setup.
kaitanuvax
Sigh... I think the word "bottleneck" has lost its intended emphasis. No, the CPU will not be your "bottleneck", it will be your "weakest link". A "bottleneck" would be if that CPU failed to perform WELL in a game; you should not be complaining if you are getting 50-60+ FPS on it.
What you should instead focus your attention on is the progression of games utilizing more cores. They have JUST begun to make use of four cores, and it will be quite a long time (think next-gen console release) before mainstream quads (such as the Phenom II) start to struggle in games. THAT is when you get a CPU upgrade, not when you want to flex your e-peen by getting 75 FPS instead of 70 FPS because you spent $400 on a new motherboard and CPU just so it can look pretty next to your recently bought GPU.
/rant
My point is people that use DUAL CARD setups, NOT single card setups, and FUTURE high-end GPUs shouldn't use a Phenom II. Did you read my post, or did you just see the word bottleneck and rant off that? No offense, but it looks like that. More cores mean nothing (AMD just proved this); it's clock-for-clock performance that matters, and in that Intel owns AMD.
Yeah, I have heard a lot about that, but I wasn't a PC hardware follower back then so I can't really have an opinion myself.
[QUOTE="Gambler_3"]
[QUOTE="GummiRaccoon"]
There is no bigger flop than the FX 5800 launch.
hartsickdiscipl
FX 5800 was definitely a bigger flop than this. It was turrrrible.
I used to own one; then again, back then I had no experience with PC hardware... and I got a different GPU afterwards. :lol:
My point is people that use DUAL CARD setups, NOT single card setups, and FUTURE high-end GPUs shouldn't use a Phenom II. Did you read my post, or did you just see the word bottleneck and rant off that? No offense, but it looks like that. More cores mean nothing (AMD just proved this); it's clock-for-clock performance that matters, and in that Intel owns AMD.
theshadowhunter
No, I read your post, it's you that didn't read or comprehend mine.
A dual-card setup means 60+ FPS in the majority of cases, which can be achieved with a Phenom II or any better CPU. Yes, this is when the CPU becomes the "bottleneck". But are you really going to say not to get a Phenom II in this case, when it clearly DOMINATES the game, even though it may not be the best? Why do you think a Phenom II isn't good enough? Just because it's old?
Future high-end GPUs? No; that's where I told you to instead focus your attention on the progression of games utilizing more and more cores. I'll repeat myself, since you seem to have too much steam in you to carefully read over my post: games are JUST starting to utilize quad cores. Thus mainstream quads such as the Phenom II will NOT be obsolete for a long time to come, which is probably when the next-gen consoles roll around.
Just because it was a rant doesn't mean it was a pointless post that I just spat out spontaneously.
[QUOTE="theshadowhunter"]
My point is people that use DUAL CARD setups, NOT single card setups, and FUTURE high-end GPUs shouldn't use a Phenom II. Did you read my post, or did you just see the word bottleneck and rant off that? No offense, but it looks like that. More cores mean nothing (AMD just proved this); it's clock-for-clock performance that matters, and in that Intel owns AMD.
kaitanuvax
No, I read your post, it's you that didn't read or comprehend mine.
A dual-card setup means 60+ FPS in the majority of cases, which can be achieved with a Phenom II or any better CPU. Yes, this is when the CPU becomes the "bottleneck". But are you really going to say not to get a Phenom II in this case, when it clearly DOMINATES the game, even though it may not be the best? Why do you think a Phenom II isn't good enough? Just because it's old?
Future high-end GPUs? No; that's where I told you to instead focus your attention on the progression of games utilizing more and more cores. I'll repeat myself, since you seem to have too much steam in you to carefully read over my post: games are JUST starting to utilize quad cores. Thus mainstream quads such as the Phenom II will NOT be obsolete for a long time to come, which is probably when the next-gen consoles roll around.
Just because it was a rant doesn't mean it was a pointless post that I just spat out spontaneously.
The Phenom IIs are obsolete: something better than them exists. Heck, clock for clock the C2Qs are faster, and all three generations of the i7/i5s are faster clock for clock. Right now, unless you are on an extreme budget, you shouldn't buy one brand new, and on that budget you wouldn't own a dual-card setup anyway. You would have to be a blind fanboy to buy a new Phenom II setup for a dual-card setup. Phenom IIs were built to go against the C2Qs because the Phenom Is failed at this. By the time games use quad cores (or more) better, the Phenom IIs will look like the Phenom Is look now: NO one wants one. And this is even worse for Bulldozer.
And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.
Sorry I stopped reading at "PIIs are obsolete". In reference to your post in another topic, I think you are more a Sandy Bridge fanboy than the AMD fanboy you profess yourself to be.
Obsolete: "cause (a product or idea) to be or become obsolete by replacing it with something new". By that definition Phenom IIs are obsolete: they are outperformed clock for clock by all the i7s, and Bulldozer replaced them. I have owned AMD since the K5 days. Conroe made me change my opinion a bit, though; I saw AMD's reaction and it was hard not to get an Intel setup. So I sold my Opteron 148 for an E6400 (the biggest upgrade difference I have ever seen in my history of upgrading), then I bought the Xeon equivalent of the E8400 because all AMD had out was the Phenom I, and then I bought a 1090T to try AMD again because I thought more cores = better. I won't make that mistake again. I don't see myself buying another AMD rig until they can top what I currently own clock for clock. Clock-for-clock performance is what AMD was great at before (Athlon XPs, Athlon 64s, etc.); then Conroe came out and they lost that.
Sorry I stopped reading at "PIIs are obsolete". In reference to your post in another topic, I think you are more a Sandy Bridge fanboy than the AMD fanboy you profess yourself to be.
kaitanuvax
[QUOTE="kaitanuvax"]
[QUOTE="theshadowhunter"]
My point is people that use DUAL CARD setups, NOT single card setups, and FUTURE high-end GPUs shouldn't use a Phenom II. Did you read my post, or did you just see the word bottleneck and rant off that? No offense, but it looks like that. More cores mean nothing (AMD just proved this); it's clock-for-clock performance that matters, and in that Intel owns AMD.
theshadowhunter
No, I read your post, it's you that didn't read or comprehend mine.
A dual-card setup means 60+ FPS in the majority of cases, which can be achieved with a Phenom II or any better CPU. Yes, this is when the CPU becomes the "bottleneck". But are you really going to say not to get a Phenom II in this case, when it clearly DOMINATES the game, even though it may not be the best? Why do you think a Phenom II isn't good enough? Just because it's old?
Future high-end GPUs? No; that's where I told you to instead focus your attention on the progression of games utilizing more and more cores. I'll repeat myself, since you seem to have too much steam in you to carefully read over my post: games are JUST starting to utilize quad cores. Thus mainstream quads such as the Phenom II will NOT be obsolete for a long time to come, which is probably when the next-gen consoles roll around.
Just because it was a rant doesn't mean it was a pointless post that I just spat out spontaneously.
The Phenom IIs are obsolete: something better than them exists. Heck, clock for clock the C2Qs are faster, and all three generations of the i7/i5s are faster clock for clock. Right now, unless you are on an extreme budget, you shouldn't buy one brand new, and on that budget you wouldn't own a dual-card setup anyway. You would have to be a blind fanboy to buy a new Phenom II setup for a dual-card setup. Phenom IIs were built to go against the C2Qs because the Phenom Is failed at this. By the time games use quad cores (or more) better, the Phenom IIs will look like the Phenom Is look now: NO one wants one. And this is even worse for Bulldozer.
And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.
There are really only 2 generations of i5/i7. The i7's made for LGA 1156 weren't really a new generation. Clock for clock they're basically the same as the original LGA 1366 Nehalem i7's.
The term "obsolete" usually indicates that a product no longer serves its practical purpose. That can't be said of the Phenom II. In fact, any Phenom II X4 over around 3GHz can run any game well and won't be a significant bottleneck to any single GPU in the VAST majority of games. They are still going strong. The fact that you can buy a 955 or 965 for $115 to $140 and a supporting motherboard for dirt cheap tells me that the platform is still a very viable option for people on limited budgets. The i3 Sandy Bridge dual-cores are equal to or faster than these chips at stock in most games, but we know that the X4 pulls away in games that are heavily threaded. They will ultimately be better suited to handling the games of the next 2-3 years because of this.
The Phenom IIs are obsolete: something better than them exists. Heck, clock for clock the C2Qs are faster, and all three generations of the i7/i5s are faster clock for clock. Right now, unless you are on an extreme budget, you shouldn't buy one brand new, and on that budget you wouldn't own a dual-card setup anyway. You would have to be a blind fanboy to buy a new Phenom II setup for a dual-card setup. Phenom IIs were built to go against the C2Qs because the Phenom Is failed at this. By the time games use quad cores (or more) better, the Phenom IIs will look like the Phenom Is look now: NO one wants one. And this is even worse for Bulldozer.
[QUOTE="theshadowhunter"]
[QUOTE="kaitanuvax"]
No, I read your post, it's you that didn't read or comprehend mine.
A dual-card setup means 60+ FPS in the majority of cases, which can be achieved with a Phenom II or any better CPU. Yes, this is when the CPU becomes the "bottleneck". But are you really going to say not to get a Phenom II in this case, when it clearly DOMINATES the game, even though it may not be the best? Why do you think a Phenom II isn't good enough? Just because it's old?
Future high-end GPUs? No; that's where I told you to instead focus your attention on the progression of games utilizing more and more cores. I'll repeat myself, since you seem to have too much steam in you to carefully read over my post: games are JUST starting to utilize quad cores. Thus mainstream quads such as the Phenom II will NOT be obsolete for a long time to come, which is probably when the next-gen consoles roll around.
Just because it was a rant doesn't mean it was a pointless post that I just spat out spontaneously.
hartsickdiscipl
And by the time the software comes, both Bulldozer and Sandy Bridge will be irrelevant.
There are really only 2 generations of i5/i7. The i7's made for LGA 1156 weren't really a new generation. Clock for clock they're basically the same as the original LGA 1366 Nehalem i7's.
The term "obsolete" usually indicates that a product no longer serves its practical purpose. That can't be said of the Phenom II. In fact, any Phenom II X4 over around 3GHz can run any game well and won't be a significant bottleneck to any single GPU in the VAST majority of games. They are still going strong. The fact that you can buy a 955 or 965 for $115 to $140 and a supporting motherboard for dirt cheap tells me that the platform is still a very viable option for people on limited budgets. The i3 Sandy Bridge dual-cores are equal to or faster than these chips at stock in most games, but we know that the X4 pulls away in games that are heavily threaded. They will ultimately be better suited to handling the games of the next 2-3 years because of this.
For the high end gamer, though, you will have a dual card setup. I highly doubt many people will be using PIIs in 2-3 years (they came out in December 2008), but then again some people still have first/second gen Conroes (my friend has one and plans on getting SB for Skyrim). Like I said, by the time the software catches up these CPUs will be irrelevant, just like the Phenom I is right now - let's take a poll of how many people still have one of those. I foresee the same fate for Bulldozer, but I could be wrong. Bulldozer isn't a cheap CPU, and is around the same price as SB, so there are no cost savings.
I don't know how this turned into a discussion about PIIs when it is about Bulldozer. My beef is with that, not PIIs; because I said Bulldozer/PIIs bottleneck a decent SLI/Crossfire setup, I got this... It doesn't matter how much the bottleneck is, a bottleneck is a bottleneck.
This is truly pathetic.
Like I said, I owned a PII. It wasn't pathetic, but something else quite a bit better was out, and I usually upgrade when there is (if it's in my budget), so that's why I got the 2500K.
theshadowhunter
Yes, Bulldozer is quite pathetic. You'll get no disagreement from me there.
This has turned into a general CPU upgrade discussion, as it was destined to. I would be willing to wager that in 2 years a 3.5GHz Phenom II X4 will be able to run 80% or more of the games on the market smoothly. The element that people seem to be missing is that the hardware requirements for PC games aren't really moving much right now... especially when it comes to CPU demands. Most games on the market right now will still run well on a 3GHz+ dual core, and completely smoothly on a 2.6 to 3GHz C2Q or PII X4.
The next generation of consoles probably won't be out for at least 18-24 months. We won't see a huge increase in PC game system requirements between now and then. There will be a couple of titles a year that stress hardware, but that's about it. The quad-core CPUs that run games well now will probably run them well in 2 years.
lol, the sad thing is, my Opteron 170 (2.65GHz) still plays most games, and it is paired with an 8800 GTX. How sad is that? Tech from 2006 still plays modern games? Consoles did this to us... it wasn't always like this, though. That won't stop me from upgrading; I always like upgrading, it's a hobby.
theshadowhunter
Whatever floats your boat, just don't go saying things are obsolete / will be in the near future. It's a blessing in disguise anyways, makes developers optimize their games more and more till they suck DX9 dry. More money saved for people who aren't addicted to upgrading like me.
That's the difference then: I am a hardware enthusiast that always wants new stuff. I upgrade multiple times a year, but I understand not everyone is like that. It's my hobby and it's what I put my money into. I usually sell off hardware before it loses too much value. I also run a computer business, so it's easy to find someone who is willing to buy used stuff at a discounted price (that's how I sold my 1090t setup), so as long as someone is willing to buy my older components I'll do it.
Well, most people don't do it that hardcore. I like having beefy hardware, but I'll never use multiple GPUs in my gaming PCs, nor will I buy the top of the line things right when they drop. My 570 SSC and my 1100T at 4GHz are far more than any gamer needs to eat through just about every game at 1050p/1080p.
I usually don't buy dual card setups myself, but I wanted something more powerful than my 5870, and I was looking at getting a 6970 (would have been dumb) or the GTX 580 (even dumber because of the cost and performance difference). I found my 2nd card unused on overclock.net; it was a local deal and I got it for $150, way cheaper than going the GTX 580 route I was considering, and it performs better too.
I am just an enthusiast, like I said. Not everyone is, I understand that, but I am not the only one that does this. overclock.net is full of people that do the same thing.
I am crazy, I know, but I plan on getting IB next year and selling my 2500K. That's why I didn't get the 2600K, because I was planning on getting something to replace it next year. Call me crazy, but I can assure you I won't be the only one doing this, and when DX12 comes out (it better be next year) I'll upgrade to that too. Or if a single GPU comes out that outperforms my current 5870 setup, I will get that too, just so I can have a single GPU setup again - I hate having two cards...
[QUOTE="Elann2008"]It looks like I'll be grabbing an i5-2500k or 2600k. Finally, I can move on. :P
tequilasunriser
Seriously? Why on earth would you upgrade your 3.8GHz BE965 to an i5 2500? I can understand if you had a Phenom 1 or an E6800 or something... There is no need to make that upgrade, lol. Why, Elann, why!?!?!
See: hardware addicts
kaitanuvax
Now who's not being open minded and accepting? It's a hobby, just like people with cars. Some people are content driving older cars (some with cars from the 80s-90s), while others want newer ones even though there is nothing wrong with their last one. Except this hobby is cheaper than that, way cheaper, and makes more sense IMHO.
You're not crazy at all for the way you look at it. I agree with you.
I have a family member whose hobby is jewelry making, and they spend more on the equipment and raw materials than we do on PC hardware - and we are not talking about using gold, either. They also say that they are crazy, but I always tell them, "It's your hobby, do whatever you want."
[QUOTE="Mr_BillGates"]How was Fermi a failure? Came late like Bulldozer, NetBurst power consumption like Bulldozer, disappointing performance given the extra time like Bulldozer, runs hot like Bulldozer, and flops like Bulldozer. The flop is near Fermi-level.
04dcarraher
You must live in some fantasy land where the nvidia 4xx and 5xx graphics cards are bad.
Yet my 460s are serving me well.