This is supposedly a real-time tech demo for the R700, the architecture the Wii U GPU will reportedly be based on:
http://www.youtube.com/watch?v=0YjXCae4Gu0
What sets this demo apart from most others is that it uses forward-looking rendering techniques like voxels and ray tracing.
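To make "ray tracing" concrete: instead of rasterizing triangles, you shoot a ray through every pixel and test what it hits. Here is a toy C++ sketch of that idea with a single hard-coded sphere; everything in it (the scene, the resolution, the ASCII output) is my own illustration, not anything taken from the demo:

```cpp
// Toy ray tracer: cast one ray per pixel and test it against a sphere.
// All values here are illustrative, not from the R700 demo.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns true if a ray (origin o, normalized direction d) hits a sphere
// of radius r centered at c; t receives the distance to the hit point.
bool hitSphere(Vec3 o, Vec3 d, Vec3 c, double r, double& t) {
    Vec3 oc = sub(o, c);
    double b = 2.0 * dot(oc, d);
    double cc = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * cc;   // quadratic term is 1: d is normalized
    if (disc < 0.0) return false;     // ray misses the sphere
    t = (-b - std::sqrt(disc)) / 2.0;
    return t > 0.0;                   // hit must be in front of the origin
}

int main() {
    // Shoot one ray per pixel of a tiny 8x8 "image" straight down the z axis.
    for (int y = 0; y < 8; ++y) {
        for (int x = 0; x < 8; ++x) {
            Vec3 origin{x - 3.5, y - 3.5, -5.0};
            Vec3 dir{0.0, 0.0, 1.0};
            double t;
            std::putchar(hitSphere(origin, dir, {0, 0, 0}, 3.0, t) ? '#' : '.');
        }
        std::putchar('\n');
    }
}
```

Scale that loop up to millions of rays per frame, with shadows, reflections, and real scene geometry, and you arrive at the kind of rendering the demo shows.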
The fact is that modern game graphics are held back by their APIs and by dated rendering methods that are still stubbornly used instead of newer techniques.
Of course, the hardware plays a role here too.
100-core CPUs already exist in specialized markets, and designs for 1000-core chips are being researched.
In the next few years they should become genuinely affordable, and for workloads that can be parallelized they would far outperform today's 4-to-8-core consumer CPUs.
The reason Intel and AMD still stick to such low core counts is simply that most applications and games don't make use of more than four, or even two, cores.
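And that is the crux: extra cores only help if the software explicitly spreads its work across them. Here is a minimal C++11 sketch of what that looks like; the number-crunching workload is just a placeholder for any parallelizable game task (physics, AI, rendering):

```cpp
// Split one big loop across all available hardware threads.
// The workload (summing squares) is a stand-in for real game work.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 100000000;
    // hardware_concurrency() may return 0 if unknown, so clamp to at least 1.
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> partial(cores, 0.0);
    std::vector<std::thread> workers;

    // Give each core a contiguous slice of the iteration space.
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            std::size_t begin = n * c / cores;
            std::size_t end   = n * (c + 1) / cores;
            double sum = 0.0;
            for (std::size_t i = begin; i < end; ++i)
                sum += static_cast<double>(i) * i;
            partial[c] = sum;   // each thread writes only its own slot
        });
    }
    for (auto& w : workers) w.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("%u threads, total = %.0f\n", cores, total);
}
```

Each thread gets its own slice and its own output slot, so no locking is needed; most game code is far harder to split up than this, which is exactly why developers haven't kept pace with core counts.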
Furthermore, CPUs could replace graphics cards entirely if manufacturers built more graphics functionality into them.
The difference between a CPU and a GPU is not really that big: both execute instructions over data; the CPU is just optimized for general-purpose processing, while the GPU dedicates its hardware to massively parallel graphics work.
With enough graphics-oriented functions added to CPUs, there would be no need for dedicated GPUs.
I believe it's AMD and Nvidia who don't want that to happen, so they can keep selling dedicated GPUs that constantly gain new features which a CPU alone could actually handle.
In fact, the technology for GPU-less graphics rendering already exists and looks very promising; Intel's Larrabee project, for example, was an attempt at exactly this kind of software rendering on many x86 cores.
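At its core, "GPU-less rendering" just means the CPU computes every pixel itself and writes it into a framebuffer. The bare-bones C++ sketch below does exactly that and saves the result as a PPM image; the gradient stands in for a real rasterizer or ray tracer, and the file name and resolution are arbitrary choices of mine:

```cpp
// Minimal software rendering: the CPU fills a framebuffer pixel by
// pixel, with no GPU involved, and writes it out as a binary PPM file.
#include <cstdio>

int main() {
    const int w = 256, h = 256;
    std::FILE* f = std::fopen("frame.ppm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P6\n%d %d\n255\n", w, h);   // PPM header

    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            // The "pixel shader", run on the CPU: a simple two-axis gradient.
            unsigned char rgb[3] = {
                static_cast<unsigned char>(x),   // red varies with x
                static_cast<unsigned char>(y),   // green varies with y
                128                              // constant blue
            };
            std::fwrite(rgb, 1, 3, f);
        }
    }
    std::fclose(f);
}
```

Everything a GPU does for a game ultimately boils down to filling buffers like this one, just with vastly more math per pixel.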
It seems to me we live in very exciting times for technology, yet it's nowhere near as well utilized as it could be.
Also note that this matters not only for graphics but also for things like physics, AI, interactivity, and massive, dynamically changing game worlds, etc.