What is the most advanced CPU (processor) there is for playing PC games, and around how much does it cost?
Thanks in advance
[QUOTE="mrhankeydinks"]Actually, the E6400 overclocks pretty damn well and is cheaper than the E6600.[/QUOTE]
Meh, the E6600 is better value. It only costs $40 more, and the extra 2 MB of L2 cache alone is worth it, plus the higher stock speed.
[QUOTE="04dcarraher"]Wait a bit, AMD is about to release their K10 "Core Duo Killer" in a few months.[/QUOTE]
And Intel is about to release their QX6800 "Everything Killer" in a few months.
[QUOTE="Platearmor_6"][QUOTE="04dcarraher"]Wait a bit, AMD is about to release their K10 "Core Duo Killer" in a few months.[/QUOTE]
And Intel is about to release their QX6800 "Everything Killer" in a few months.[/QUOTE]
Lol, I doubt it; they will just add more cache and MHz. If you look at the past, remember that AMD was the best for almost three years until Intel came out with the Core Duo. Before then, when AMD came out with 64-bit, Intel was caught off guard, so they put millions into research on improving their P4s, Pentium Ds, etc., somewhat copying some of AMD's ideas. AMD has always been more innovative. And as far as I can understand it, they are looking at a different approach where the CPU (or CPUs) feeds information directly to and from the RAM, rather than going through a separate memory controller like now. Intel is about four times bigger, so they have more resources, but when AMD's new CPUs come out, the Core Duos and Intel will have their work cut out for them. And then Intel will look into the new techniques and copy them, just like they have done before.
[QUOTE]I can only stress how uninformative and inaccurate this post is.[/QUOTE]
OK, then how do you explain AMD having 64-bit single cores before Intel, hmmmm? And then Intel hustles and, weeks later, comes out with their own based on P4s, hmmm?
"1999: Intel released the instruction set for the IA-64 architecture. First public disclosure of AMD's set of 64-bit extensions to IA-32, called x86-64 (later renamed AMD64)." -[http://en.wikipedia.org/wiki/64_bit]
AMD copied. Sorry, buddy.
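For anyone unsure what "64-bit" actually refers to in this argument, here is a minimal sketch (my own illustration, not from either poster) showing how software can check the native pointer width it was built for; IA-32 builds report 32 bits, x86-64/AMD64 builds report 64:

```python
import platform
import struct

# struct.calcsize("P") gives the size of a native pointer in bytes;
# multiply by 8 for bits. A 32-bit (IA-32) build reports 32, while
# an x86-64/AMD64 build reports 64.
bits = struct.calcsize("P") * 8
print(f"{platform.machine()}: {bits}-bit build")
```

The same distinction is what the Wikipedia quote above is about: IA-64 (Itanium) was a new 64-bit architecture, while x86-64/AMD64 extended the existing 32-bit x86 instruction set.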
OK, read this info/theory:
You can't tell me you haven't asked this question yourself: Why exactly is Intel coming along right now with an integrated memory controller idea? And why is it that Intel now plans to put graphics capability into the CPU? Does AMD innovate while Intel follows? Here's some food for thought.
So, let's get right to it. Why is the integrated memory controller (IMC), a key feature that made AMD's current Athlon 64/Opteron platform so successful, being developed for the next-gen Intel Nehalem platform? When I heard the news, a statement by Intel's Pat Gelsinger from a 2005 IDF popped into my mind, in which the executive said that Intel would think about such a technology when the time was right.
Nehalem's release must be the right time, apparently. But why? Intel had already developed such a technology in the past (see also the reader comments in our first Nehalem article) for its never-released Timna processor (some background on the development and the decision to scrap this chip can be found in our interview with Intel's Mooly Eden). While AMD's success with the integrated controller and customer pressure may have motivated Intel to rethink its IMC strategy, the official explanation is that multi-core has changed the landscape and will retire the concept of the good old FSB.
PR Manager George Alfs told me that engineers typically have certain tools they can use to improve a processor, and an IMC appeared to be the right approach to deal with the quick increase of threads in Nehalem. 45 nm Nehalem processors will be available with at least 8 cores on the high end and, with the return of Hyper-Threading, there will be at least 16 threads in Intel's fastest CPUs. "There is a lot of data going in and out. It makes a whole lot of sense to use an IMC in this architecture," said Alfs.
Intel also said that it will be integrating graphics into the processor. We hadn't really heard about this concept from the blue team until a few days ago. Could this idea be inspired by AMD? Intel's news comes just about a year after AMD announced that it will leverage ATI knowledge to build Fusion, a processor that will offer a graphics core on the low end and possibly a stream-processing core on higher-end versions.
It would be almost foolish to think that Intel had never thought about integrating its graphics technology into a processor. But integrating graphics into the processor goes against the very basic concept Intel's business is built on: to sell as many chips as it can. In today's model, Intel sells CPUs and graphics processors separately, with two profit margins in place. In a future model, Intel may sell only one chip, with a profit margin that is far less than today's. So far there hasn't really been an incentive for Intel to integrate graphics into the CPU. Put Intel's 40% market share in the graphics industry into this equation and you have one convincing reason for the company not to do it.
But market requirements and technologies are changing: "The CPU tends to absorb other components over time," Alfs said. He also mentioned that integrated graphics have always been part of the Nehalem technology, which has been in development for about three years now.
I leave it up to you to decide how much influence AMD's Fusion processor had on Intel. Interestingly, both approaches appear to be very similar, as Alfs said that a graphics-equipped CPU would be positioned as a mainstream solution, while the company expects that there will always be a market for discrete graphics cards. Sounds like a Fusion competitor to me.
The most interesting part of this whole scenario will be timing. If Intel is able to roll out a graphics-equipped Nehalem processor close to the processor's release date - which we expect to be the second half of 2008 - then Intel will have a huge advantage over AMD: Fusion is not expected to be unveiled until 2009/2010.
Is Intel copying AMD and throwing its enormous resources at every good idea AMD comes up with? Or is AMD just a bit more talkative about its ideas, and did the company really leverage the idea of Fusion as a key reason to justify its acquisition of ATI? The answers really depend on your preferences and your point of view.
I can't tell and only certain executive ranks at both companies know for sure. But I doubt these are the real questions anyway. From a consumer perspective, both processor companies are in a highly competitive environment right now, which can only result in much better products. From the perspective of AMD, it really doesn't matter if Intel copies ideas or not: Suing your competitor is really only one side of the story. The green team knows about the capabilities of the blue team and will need to continue to have ideas that differentiate the firm's products - and we now know that it will be a challenge to position the IMC and Fusion, at least on the low end, as a unique feature.
Interestingly, one component is largely left out of the graphics discussion. If Intel and AMD aim to battle for the lion's share of the graphics market, what does that mean for Nvidia? Yes, both AMD and Intel say that there will always be a market for discrete graphics. But will Nvidia be able to survive on the leftovers? We have discussed this topic with Nvidia extensively, and you can read the answers in an interview here on TG Daily on Monday.
Now that Intel has flexed its muscles with the Core 2 Duo, AMD needs to answer with another great idea. We all benefit from the competition. - LB

Actually, THG editors used to own all AMD processors (Apr 02, 2007 13:35): A few years ago (before Core 2/Core 2 Duo), most of the THG editors owned computers powered by AMD processors. It was an amazing ratio. At the time, AMD CPUs gave a much better bang for the buck. I personally still have three computers with AMD processors, including a trusty Shuttle box that has survived almost six years of abuse.