Going to swap the CPU from an E6600 to an i5 2500K in an hour or so. How much improvement am I likely to see?
Let me rephrase. I purchased a new motherboard, CPU and RAM. Everything is compatible. I just haven't put it in yet; I'm doing it when I get home.
[QUOTE="The_Gaming_Baby"]Going to swap the CPU from an E6600 to an i5 2500K in an hour or so. How much improvement am I likely to see?[/QUOTE]
What other parts are you using? If you're on a 1080p monitor using an HD 5770 or less, you're not going to see any improvement at all while gaming. The only correct assumption is that it depends on the other parts you match it with. And just as ravenguard90 said, the motherboard and RAM would also need to be changed; I have a hard time believing you're still rocking an E6600 with a DDR3-capable motherboard. (The socket 775 motherboards that could take DDR3 were released several years after the E6600.)
[QUOTE="ravenguard90"]Considering you're using the word 'swap,' I hope you know they're both different sockets...[/QUOTE]
You can just press down really REALLY hard and it'll fit.
[QUOTE="yellonet"]If that's the only thing you're swapping (including MB and RAM?), it will depend on the application; many games are heavily reliant on the GPU for good frames.[/QUOTE]
Current graphics card is an 8800GT 1GB. I intend on upgrading later this year.
[QUOTE="XaosII"]You tell us. You've already bought it.[/QUOTE]
I said I was going to make the switch in an hour or so.
The E6600 is a dual-core 2.4GHz CPU from 2006 with 4MB of cache and a 1066MHz FSB.
The 2500K is a quad-core 3.3GHz (3.7GHz turbo) CPU from 2011 with 7MB of total cache, and it doesn't use an FSB.
There is a night and day difference between the two.
Not to mention, clock for clock the 2500K is much faster; 1.5GHz on the E6600 is like 1GHz on the i5 Sandy Bridge. For games, however, you won't see as much of a difference.
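Rough numbers on that, assuming one Sandy Bridge core really does about 1.5x the work per clock of the old Conroe core (that 1.5x ratio is just the guess above, not a measured figure); a quick Python back-of-envelope using the core counts and clocks mentioned in this thread:

```python
# Very rough CPU throughput comparison: cores x clock x work-per-clock.
# Assumption: one Sandy Bridge core does ~1.5x the work per clock of a Conroe
# core. That 1.5x is just the guess from the post above, not a measured number.
IPC_RATIO = 1.5  # assumed Sandy Bridge vs Conroe work per clock

e6600 = {"cores": 2, "clock_ghz": 2.4, "perf_per_clock": 1.0}
i5_2500k = {"cores": 4, "clock_ghz": 3.3, "perf_per_clock": IPC_RATIO}

def per_core(cpu):
    """Relative single-thread performance: clock x work-per-clock."""
    return cpu["clock_ghz"] * cpu["perf_per_clock"]

def all_cores(cpu):
    """Relative multithreaded throughput: cores x single-thread performance."""
    return cpu["cores"] * per_core(cpu)

print(f"Single thread: ~{per_core(i5_2500k) / per_core(e6600):.1f}x faster")   # ~2.1x
print(f"All cores:     ~{all_cores(i5_2500k) / all_cores(e6600):.1f}x faster")  # ~4.1x
```

Whether a game ever sees that all-cores number is a different question, which is where the GPU and threading come in.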
[QUOTE="theshadowhunter"]not to mention clock of clock the 2500 is much faster 1.5 ghz of the e6600 is like 1 ghz of the i5 sandy bridge For games however you won't see so much difference it mostly depends if his current CPU was bottlenecking his GPU.the E6600 is a dual core 2.4ghz cpu from 2006 with 4mb of cache, 1066mhz FSB.
the 2500K is a quad core 3.3ghz (with turbo 3.7ghz) cpu from 2011, with 7mb of total cache, and doesnt use FSB.
there is a night and day difference between the two.
evildead6789
[QUOTE="yellonet"]Current graphics card is a 8800gt 1gb. Intend on upgrading later this year.If that's the only think you're swapping (including MB and RAM?) it will depend on the application, many games are heavily reliant on the GPU for good frames.
The_Gaming_Baby
You will not see much improvement, if any.
[QUOTE="GummiRaccoon"]You will not see much improvement, if any.[/QUOTE]
That's not really true. There will be a decent boost in games that like a quad.
[QUOTE="gmaster456"]That's not really true. There will be a decent boost in games that like a quad.[/QUOTE]
Correct. I had an E6600 myself up until this spring. It bottlenecked my 4870, and obviously also my GTX 570. I got a good deal on an AMD X6 1100, and the difference was night and day: framerate increases abound, in every game I played. Now, the Intel 2500 and 2600 are very powerful, better than my AMD X6, so you can bet your increase will be even better.
What rubbish. I had an E6750 and went to a Q8400. There were half a dozen games with a HUGE improvement and I ONLY have a GTS 250. People who still think that dual cores and lower-end DX10 cards like the GTS 250 and 4850 are still "enough for everything" really need to stop kidding themselves.
Going from dual to quad CAN actually give you almost double the performance, shocking I know!
And here we have Core 2 Duo to Sandy Bridge, which is like 3x more power, as he is getting a huge architectural upgrade as well and not just 2 more cores like I got with the Q8400.
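That "CAN" is the key word: how close a dual-to-quad jump gets to 2x depends on how much of the game's frame time actually runs threaded. A rough Python sketch of that idea (Amdahl's law; the parallel fractions below are made-up examples, not measurements of any particular game):

```python
# Amdahl's law: the gain from 2 -> 4 cores depends on what fraction of a
# frame's work actually runs in parallel. Fractions here are illustrative
# guesses, not measured numbers for any real game.
def speedup(parallel_fraction, old_cores, new_cores):
    old_time = (1 - parallel_fraction) + parallel_fraction / old_cores
    new_time = (1 - parallel_fraction) + parallel_fraction / new_cores
    return old_time / new_time

for p in (0.3, 0.6, 0.9):
    print(f"{p:.0%} parallel: {speedup(p, 2, 4):.2f}x going from 2 to 4 cores")
# 30% parallel: 1.10x, 60% parallel: 1.27x, 90% parallel: 1.69x
```

So a game that barely threads sees almost nothing from the extra cores, while a heavily threaded one gets most of the way to double.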
[QUOTE="Gambler_3"]Going from dual to quad CAN actually give you almost double the performance, shocking I know![/QUOTE]
It all depends on the game or program, the differences between the CPUs, and whether the program or game can actually use the extra processing power. Cache size, architecture design and clock frequency can all play a role. Bottlenecking can also occur when a game or program maxes out both cores, causing data-flow issues with the GPU. When I went from an old Athlon X2 to a Phenom II X4, which was a bigger jump than you had, in a lot of cases I didn't see any big difference, because either I had enough processing power or I was GPU bound. For example, Crysis with the same GPUs (dual 8800GTs) from the old 2007 build to the 2010 build: the only real difference was in minimum and average framerate, not really maximum framerate. In other games that pushed the CPU to the limit and/or liked quads, then yes, I saw massive improvement. However, the TC going from an E6600 to an i5 with the same 8800GT will see improvements only in very CPU-demanding games that max out one or two cores or are coded to use quads. He will be limited by his GPU, and in most cases he won't even tell the difference if the games he played ran fine before. It's like playing Left 4 Dead with his old E6600 and getting, say, 100 fps average, while with an i5 he would get 130 fps: he wouldn't see the difference.
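Putting that Left 4 Dead example in frame-time terms (same made-up 100 and 130 fps figures) shows why the gap is hard to notice:

```python
# Frame-time view of the example above (100 vs 130 fps are hypothetical averages).
for fps in (100, 130):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
# 100 fps = 10.0 ms per frame, 130 fps = 7.7 ms per frame: about 2.3 ms apart,
# and both already well inside a 60 Hz monitor's 16.7 ms refresh budget.
```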
[QUOTE="Gambler_3"]What rubbish. I had an E6750 and went to a Q8400. There were half a dozen games with a HUGE improvement and I ONLY have a GTS 250.[/QUOTE]
How can you think that a 2500K and 8800GT isn't lopsided?
It is, it is, don't get me wrong. I just think that the TC made the right decision upgrading the CPU first, as I believe the E6600 is more of an outdated product than the 8800GT. There couldn't have been a point in upgrading the GPU with an E6600, even if it was overclocked. Unless he is playing at a super high resolution with AA, he is bound to get a noticeable performance boost in quite a few games. He says he intends on upgrading the GPU later this year, so I don't see much wrong with what he did.
I recently upgraded from an E6600 to a Q9400 (not an i5-2500K mind you, but it's a quad core at least). I have an HD 5870, and the difference was huge. Games that barely ran at medium now run maxed out without any problem. I guess the E6600 couldn't feed the GPU and run the game together that well. You will see an improvement.
That 8800GT will bottleneck your system quite a lot. Consider not eating for a few weeks and buying a new graphics card as well :)
[QUOTE="GummiRaccoon"]
[QUOTE="Gambler_3"]
What rubbish, I had an E6750 and went to Q8400. There were half a dozen games with a HUGE improvement and I ONLY have a GTS 250. People who still think that dual cores and lower end DX10 cards like the GTS 250 and 4850 are still "enough for everything" really need to stop kidding themselves.
Going from dual to quad CAN actually give you almost double the performance, shocking I know!
And here we have core 2 duo to sandy bridge which is like 3x more power as he is getting a huge architectural upgrade as well and not just 2 more cores like I got with the Q8400.
Gambler_3
How can you think that a 2500k and 8800GT isn't lopsided?
It is it is dont get me wrong. I just think that the TC made the right decision upgrading the CPU first as I believe E6600 is more of an outdated product than 8800GT. There couldnt have been a point in upgrading GPU with an E6600 even if it was overclocked. Unless he is playing on super high resolution with AA, he is bound to get some noticible performance boost in quite a few games.He says he intends on upgrading the GPU later this year so I dont see much wrong then with what he did.
[QUOTE="Gambler_3"]It is, it is, don't get me wrong. I just think that the TC made the right decision upgrading the CPU first, as I believe the E6600 is more of an outdated product than the 8800GT.[/QUOTE]
I'm on Gambler's side with this one. I think TC made the right choice.
[QUOTE="gmaster456"]I'm on Gambler's side with this one. I think TC made the right choice.[/QUOTE]
The positive about getting that 2500K upgrade is that he can move on to a better graphics card whenever he wants and have a great experience, but that wasn't really the question. As an upgrade for performance it wasn't a great one; the graphics card is just an 8800GT, roughly an HD 5670, and that's not a strong card at all. Even paired with a CPU as great as the 2500K, that card won't get to show the boost over an E6600 in more than a handful of games.
So on the question of whether he will see an improvement, the correct answer is: not in many games.
[QUOTE="foggy666"]I recently upgraded from an E6600 to a Q9400 (not an i5-2500K mind you, but it's a quad core at least). I have an HD 5870, and the difference was huge.[/QUOTE]
Your card is MILES faster; his card is the equivalent of an HD 5670, or a bit weaker. O.o
[QUOTE="Gambler_3"]8800GT is quite a bit faster than 5670...[/QUOTE]
Not really. They are actually about equal. The 5670 is a TAD weaker than the 9800GT but a hair stronger than the 8800GT. The newer 6670 is about equal to the 8800GTX/9800GTX/HD 4830.
[QUOTE="Gambler_3"]Since when was there a difference between 8800GT and 9800GT? :?
9800GT = 8800GT >>> 5670. Please don't make me post a dreaded techpowerup chart. :)
Nobody said anything about 6670.[/QUOTE]
In all the techpowerup charts I've seen, the 5670 is only a couple of frames behind the 9800GT. And at 1600p the 5670 is actually faster.
[QUOTE="Gambler_3"]8800GT is quite a bit faster than 5670...[/QUOTE]
Not really. In Crysis at 1920x1200 the 5670 beats the 8800GT by a single fps, while at 1680x1050 with 4x AA the 8800GT beats the 5670 by not even a frame. Another example of the 5670 beating the 8800GT is Far Cry 2 at 1920x1200 with 4x AA, by 4 fps. Then in something like L4D the 8800GT beats it at 1920x1200 by almost 10 frames, but at lower resolutions they are nearly the same. The 8800GT is not quite a bit faster than the 5670; from what I've seen, overall the 8800GT is only slightly faster.
[QUOTE="evildead6789"]true the cards are all rubbishgmaster456And the point of this post was........?[/QUOTE i don't really know , i guess it was to stop a discussion between two rubbish cards that are actually the same in performance, give or take a frame i this or that game, who cares
[QUOTE="Gambler_3"]In all the techpowerup charts ive seen, the 5670 is only a couple frames behind the 9800gt. And at 1600p the 5670 is actually faster.Since when was there a difference between 8800GT and 9800GT? :?
9800GT = 8800GT >>> 5670. Please dont make me post a dreaded techpowerup chart. :)
Nobody said anything about 6670.
gmaster456
This is the most relavent resolution for this sort of a card but if you are not satisfied,
1600p doesnt matter.
The 8800GT is still a huge bottleneck and was arguably the weaker of the two in his old machine.
Incremental upgrades are great, but he just needs to understand it won't be night and day until he picks up a 570, at which point he will fall out of his chair.
When were those charts made? Because I have seen benchmarks done where in 90% of the games tested there was only a 1-3 fps difference, and in other games the two traded blows, at high resolutions. In only one or two of the games tested did the 8800GT have more than a 6 fps gain over the 5670.
[QUOTE="GummiRaccoon"]The 8800GT is still a huge bottleneck and was arguably the weaker of the two in his old machine. Incremental upgrades are great, but he just needs to understand it won't be night and day until he picks up a 570, at which point he will fall out of his chair.[/QUOTE]
I'm well aware of the limitations of my 8800GT. But it made more sense to me to upgrade the CPU, motherboard and RAM before upgrading the GPU.
[QUOTE="Gambler_3"]The tomshardware hierarchy chart also lists the 560 Ti and 5850 in the same league, whereas the 5870 and 6950 are in a different league than the 560 Ti. What a load of fail. :lol:[/QUOTE]
No they don't. The GTX 560 is in the same league as the HD 5850, and the GTX 560 Ti (which is different from a GTX 560) is in the same league as the 5870 and HD 6950, which is correct: http://www.tomshardware.com/reviews/gaming-performance-radeon-geforce,2997-7.html Normally the chart is pretty good, but apparently they made an error with the HD 5670 or the 9800GT, or I must be missing something.
[QUOTE="The_Gaming_Baby"]Going to swap the CPU from an E6600 to an i5 2500K in an hour or so. How much improvement am I likely to see?[/QUOTE]
I went from an E6400 to an i7 950, and the difference is night and day. Having BC2 max out my 6400 at 100% CONSTANTLY was annoying, but now I rarely get to 20% with multiple programs running, including BC2. Multitasking is going to be so much better overall. You will love life.
[QUOTE="Gambler_3"]The tomshardware hierarchy chart also lists the 560 Ti and 5850 in the same league, whereas the 5870 and 6950 are in a different league than the 560 Ti. What a load of fail. :lol:[/QUOTE]
Techpowerup is a load of fail... :roll: Techpowerup has these enormous charts that tell nothing about the drivers and the hardware used to test them; in fact techpowerup charts are really only good for one thing, and that's a rough estimate. Going by their chart to prove a couple of percent here or there is worthless when drivers and differing circumstances may play a bigger role. And 10%, that's very near in my opinion; the HD 5670 and the 8800GT definitely play in the same region.
[QUOTE="04dcarraher"]When were those charts made? Because I have seen benchmarks done where in 90% of the games tested there was only a 1-3 fps difference, and in other games the two traded blows, at high resolutions.[/QUOTE]
It's these huge techpowerup charts. :lol: These charts are both old and new, and each card runs a different driver; no real comparison can be made out of them, only an estimate.