Unless you have an infinite speed CPU and infinite memory, removing the GPU will never be called progress.
Software rendering does have limits: the hardware it runs on.
[QUOTE="onemic"]So what exactly is ray tracing? From reading about the description on wikipedia, it seems like it's just an accurate representation of light being casted based on light being seen from your eyes.rimnet00
I just looked at the quake 3 ray traced engine. It seems as if realistic looking shadows(ones that grow, move, or shrink in size depending on how far or near a light source is, as well as its general location in relation to the object) and reflections are also formed using ray tracing. Lighting, of course looks much better as well.
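To make that concrete, here's a tiny CPU ray tracer I sketched up (my own toy example, nothing to do with the actual Quake 3 engine code): for every pixel you shoot a ray from the eye into the scene, find what it hits, then shoot a second "shadow ray" from that point toward the light; if the sphere blocks it, the point is in shadow. That's why ray-traced shadows move and stretch correctly "for free", without the shadow-map tricks rasterizers use.
[code]
// Toy CPU ray tracer: a sphere floating over a ground plane, one point light,
// hard shadows via shadow rays. Writes a grayscale ASCII PGM image to stdout.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec a) { return mul(a, 1.0 / std::sqrt(dot(a, a))); }

const Vec SPHERE = {0.0, 0.0, -3.0};   // sphere center
const double RADIUS = 0.8;
const Vec LIGHT = {2.0, 4.0, -1.0};    // point light

// Distance along the ray to the sphere, or -1 if it misses.
static double hitSphere(Vec orig, Vec dir) {
    Vec oc = sub(orig, SPHERE);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - RADIUS * RADIUS;
    double disc = b * b - 4.0 * c;              // dir is unit length, so a == 1
    if (disc < 0.0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 1e-4 ? t : -1.0;
}

// Simple diffuse (Lambert) shading with a shadow ray toward the light.
static double shade(Vec p, Vec n) {
    Vec toLight = norm(sub(LIGHT, p));
    // Offset the shadow-ray origin slightly so we don't re-hit the same surface.
    bool blocked = hitSphere(add(p, mul(n, 1e-3)), toLight) > 0.0;
    double diffuse = blocked ? 0.0 : std::fmax(0.0, dot(n, toLight));
    return 0.1 + 0.9 * diffuse;                 // small ambient term
}

int main() {
    const int W = 256, H = 256;
    std::printf("P2\n%d %d\n255\n", W, H);      // PGM header
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Primary ray from the eye (at the origin) through this pixel.
            Vec dir = norm({(x - W / 2.0) / W, (H / 2.0 - y) / H, -1.0});
            double g = 0.05;                    // background
            double tS = hitSphere({0, 0, 0}, dir);
            double tP = dir.y < 0.0 ? -1.0 / dir.y : -1.0;  // ground plane y = -1
            if (tS > 0.0 && (tP < 0.0 || tS < tP)) {
                Vec p = mul(dir, tS);
                g = shade(p, norm(sub(p, SPHERE)));         // sphere surface
            } else if (tP > 0.0) {
                g = shade(mul(dir, tP), {0.0, 1.0, 0.0});   // ground plane
            }
            std::printf("%d ", (int)(g * 255.0));
        }
        std::printf("\n");
    }
    return 0;
}
[/code]
Compile it with any C++11 compiler (e.g. g++ -O2 raytrace.cpp) and redirect stdout to a .pgm file; you get a shaded sphere with its shadow falling on the floor.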
[QUOTE="rimnet00"][QUOTE="onemic"]So what exactly is ray tracing? From reading about the description on wikipedia, it seems like it's just an accurate representation of light being casted based on light being seen from your eyes.onemic
I just looked at the quake 3 ray traced engine. It seems as if realistic looking shadows(ones that grow, move, or shrink in size depending on how far or near a light source is, as well as its general location in relation to the object) and reflections are also formed using ray tracing. Lighting, of course looks much better as well.
It an interesting thought.
One could also speculate about how far graphics will progress over the next decade or two, and the processing power we will need for both CPU and GPU to catch up. Just imagine playing Doom 15 or so years ago. Would you ever have predicted the quality that DX10 games like Crysis or World in Conflict (and several others) would bring? Probably not. Who knows, in the future we may have whole separate graphics systems, apart from the PC, projecting the game as a hologram in our living room.
Just to say, graphics may keep advancing way beyond the capability of the main CPU's power.
Sorry, don't have the time to read the entire thread but...
GPUs won't really die out, they will just slowly meld into a different form.
All that is needed is something with ridiculously high floating point power.
The real future is close.
I personally can't wait until we move onto raytracing.
Sorry, don't have the time to read the entire thread but...
GPUs won't really die out, they will just slowly meld into a different form.
All that is needed is something with ridiculously high floating point power. The real future is close.
I personally can't wait until we move onto raytracing.LordEC911
Yeah, I first read about ray tracing in 2004 and it seemed so far away. Now we seem to be right on the cusp of it all.
IBM is using 15 Cells to render a complex landscape in real time. I believe the next Cell will render 3D alone. That's possible for a console; I'm not sure a PC can do that, because we'd need a huge x86 CPU to get there.Macolele
If a console will be able to do it, then of course a PC will...
[QUOTE="Macolele"]IBM using 15 Cell to render a complex landscape in realtime. I believe next Cell will render 3d alone. That's possible for console. Im not sure PC can do that because we need a large x86 CPU to reach there.Gangans
If a console will be able to do it then of course a pc will...
The cells ibm use are not ps3 cells. They have 1 spu and 8 epu's (or w/e they're called) and are clocked at 6 or 9 GHz, instead of the ps3's 1:6 @ 2.6GHz proc. The next cell will not come for several years, if not a decade.
Right now you can do a complex landscape with a nvidia tesla maxed out. faster.
Wow, you've like completely latched onto AMD's idea of the general processing unit of the future. You even used the word "fusion" like AMD did.
But then again, AMD doesn't plan to drop the idea of hardware rendering. AMD wants to merge the CPU and GPU onto one die, not into one processing core. So I guess that's where your vision and AMD's differ: AMD's being more realistic and yours just wishful thinking.
I'm not sure how one can move away from hardware rendering, let alone DirectX (DirectX is an API). I really don't like how you mentioned Pixar and the CGI studios. They don't render their animations in real time; it's pre-rendered. If you haven't noticed, CGI is a digital form of stop-motion animation. And yes, Pixar and the like do use hardware rendering to speed up the rendering time.
P.S. I do remember playing Half Life in software rendering. But, gosh, did the game look better with OpenGL hardware rendering.
Why is it that we are so amazed at computer graphics over real life? Crysis footage makes me gasp, but the bluffs around my area bore me. Huh.
gbarules2999
HAHAHA! That's great. I still love to get outdoors as often as I can to swim, bike, run, hike, ski, whatever. The views from the tops of 4000+ foot peaks still put the awe of God in me. Though I can't stop drooling over the latest screenshots of the hottest next-generation games.
Wow, some of you guys are skeptical. In 1987 graphics cards were a rare thing indeed, so what makes you think the humble GPU of 20 years from now will even resemble anything we have today?
We are all speculating, but I'm just saying some of you seem to underestimate the pace of progress. Did you know that by 2012 Intel plans to start mass-producing CPUs on a 22nm process? This may even happen sooner, considering AMD is planning a 45nm chip next year.
PCI-e and DirectX 10 will all be ancient technologies in 10 years' time, let alone 20. Have faith in the future, and stop thinking in the short term.
And with these words I retire from this thread. I'm glad many of you enjoyed what I wrote. Have fun continuing any arguments/discussions you guys.
Now you are just putting words into our mouths. We were disagreeing with the radical viewpoints you held about the future of computer graphics. Instead of admitting you were wrong, you completely edited your main post and essentially rewrote it in its entirety. In fact, you adopted some of the points we mentioned in our critiques and completely changed the entire premise of your post. What is worse, you tried to make us look like fools by making others believe your revision had been your stance all along.
No one said DX10 was going to be around in 10 years. No one said PCI-e was going to be either. We were not thinking short term. We were thinking about the future, instead of arguing for devolving the entire field of computer graphics by removing APIs and GPUs and trying to do software rendering on general-purpose CPUs. Then you edited your entire post to pretend you never said such things, after realizing how wrong it was.
Frankly, it looks like you are surfing through Wikipedia and then coming back and posting things on the board, pretending you actually grasp the concepts. If you haven't already realized, this is a form of plagiarism, and it is looked down upon by any academic.
Sorry if this comes off as harsh, but the way you are going about your replies is rather irritating, to say the least. Like I said in my previous posts, it is nice to see people looking towards the future. However, learn to bat before trying to coach a baseball team.
AMD is already planning an architecture that combines the CPU and GPU in one chip; sounds like your prediction is coming true.hamidious
That's what I was thinking too!
I have been reading up on things like programmable shaders, and basically even DirectX 10, with its Shader Model 4 features, does not even come close to the capabilities of software rendering. Remember playing Half-Life or Quake for the first time, and not even needing a graphics card?
The possibilities in theory were unlimited, yet in reality CPUs were just too slow at the time.
But welcome to 2007: the mass-produced and affordable Intel Core 2 Quad processor (just one example) could do so much with software rendering. Dedicate a whole two cores to graphics computations with smart, highly efficient voxel-based rendering techniques and you could achieve Crysis-like scenes on a CPU at a playable framerate. (OK, I'm exaggerating ;) )
Unlike programmable GPUs, software rendering is unlimited in its possibilities. Pixar and other studios use software rendering to produce their highly detailed 3D films; graphics cards don't have the flexibility for techniques such as ray tracing in real time.
So, here's what I'm getting at: about 20 years from now, GPUs are basically going to become so flexible, so general-purpose with their shader implementations, that they will in fact have all the capabilities of CPUs. Then the logical step is to combine any differences and create a hybrid CPU/GPU design that will become so powerful, so efficient, that specialised graphics processing boards will become obsolete.
Why bother optimizing your game for a particular GPU instruction set like DirectX 10 or OpenGL 2.0, when you can just write all your shaders in software and achieve any effect you wish, on any scale, which would be possible given the power of the CPU you're dealing with?
To further simplify my point: we will only get true-to-life graphics and leave the uncanny valley when CPUs become our graphics and physics processors as well. From the beginning it has been about integration. We are headed towards some sort of processing singularity, when the core features of a whole PC will fit on a single nail-sized chip. I guarantee it, and it's sooner than you think.
And with it will come games that will be so realistic, they will be games no more. Convincing AI will have the chance to flourish when such supercomputing power is in the homes of mainstream consumers. We are headed for a fusion of hardware. When that fusion occurs, developers will only need to worry about one thing, software. They will be limited no more.
What does this mean for the future of specialised games consoles?
Nothing sinister. They will simply become simplified entertainment extensions/appendages of the greater processing hub of the home of 2027 that will be the PC.
Couple all this with new peripherals that will allow user interaction on an entirely new level (direct brain-to-PC interfaces), and we are living in ancient history, people.
Remember the games of 1987? We live in an entirely new era. Imagine what 2027 will look like...
I think we should look ahead. Instead of focusing on DirectX 11 or the GeForce 9800, the GPU makers and CPU makers should experiment, work together, and delve into entirely new and radical technologies. Let's skip this time-consuming, sequential evolution of computer graphics and move by leaps and bounds. To do this, the CPU and GPU manufacturers should pool their resources and technology.
GPUs are starting to share so many aspects of the CPU that it is becoming wasteful; we gamers are ready for union NOW. No separate physics cards, no more graphics cards, just one single processing unit smack bang in the heart of our machines. No need for specialisation, for fancy visual tricks, if the CPU is fast enough.
If nano/quantum computing ever takes off and mainstream PCs hit the terahertz range, specialised hardware will become obsolete and the possibilities endless. I look forward to this day.
~Gangans~
Gangans
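To put the "write all your shaders in software" part above into concrete terms, here's a rough sketch (my own toy code, not from any real engine) of what a software pixel shader boils down to: an ordinary C++ function evaluated for every pixel, with the rows of the framebuffer split across however many cores the machine has. The flexibility he's on about is that this function can do anything the CPU can do, not just what the current shader model allows; the catch, as others have said, is raw speed.
[code]
// A "pixel shader" that is just an ordinary C++ function, evaluated for every
// pixel of a framebuffer, with the rows divided across all available CPU cores.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

const int W = 640, H = 480;

// The shader itself: any computation you like, per pixel. Here: a plasma pattern.
static std::uint8_t pixelShader(int x, int y, double time) {
    double u = x / double(W), v = y / double(H);
    double value = std::sin(10.0 * u + time) + std::sin(12.0 * v + time)
                 + std::sin(10.0 * (u + v) + time);
    return std::uint8_t((value + 3.0) / 6.0 * 255.0);   // map [-3, 3] to [0, 255]
}

// Each worker thread shades one horizontal band of the framebuffer.
static void shadeRows(std::vector<std::uint8_t>& fb, int y0, int y1, double time) {
    for (int y = y0; y < y1; ++y)
        for (int x = 0; x < W; ++x)
            fb[y * W + x] = pixelShader(x, y, time);
}

int main() {
    std::vector<std::uint8_t> framebuffer(W * H);
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;                  // fallback if the count is unknown

    // "Dedicate your cores to graphics": one worker per hardware thread.
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < cores; ++i) {
        int y0 = H * i / cores, y1 = H * (i + 1) / cores;
        workers.emplace_back(shadeRows, std::ref(framebuffer), y0, y1, 0.0);
    }
    for (std::thread& t : workers) t.join();

    // Dump the frame as a binary PGM so it can actually be looked at.
    std::FILE* f = std::fopen("frame.pgm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P5\n%d %d\n255\n", W, H);
    std::fwrite(framebuffer.data(), 1, framebuffer.size(), f);
    std::fclose(f);
    return 0;
}
[/code]
In a real-time loop you'd just call this every frame with a new time value; the point is that nothing here is tied to a shader model or an API, only to how fast the CPU can grind through W*H calls.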
BTW (off topic): how gutted would you have been if Glitchspot had decided to give you the ol' "a post message cannot be blank" error when you hit the submit button... man, I hate it when that happens. The right-click copy option is your friend :)
Now you are just putting words into our mouths. We were disagreeing with the radical viewpoints you held about the future of computer graphics. Instead of admitting you were wrong, you completely edited your main post and essentially rewrote it in its entirety. In fact, you adopted some of the points we mentioned in our critiques and completely changed the entire premise of your post. What is worse, you tried to make us look like fools by making others believe your revision had been your stance all along.
No one said DX10 was going to be around in 10 years. No one said PCI-e was going to be either. We were not thinking short term. We were thinking about the future, instead of arguing for devolving the entire field of computer graphics by removing APIs and GPUs and trying to do software rendering on general-purpose CPUs. Then you edited your entire post to pretend you never said such things, after realizing how wrong it was.
Frankly, it looks like you are surfing through Wikipedia and then coming back and posting things on the board, pretending you actually grasp the concepts. If you haven't already realized, this is a form of plagiarism, and it is looked down upon by any academic.
Sorry if this comes off as harsh, but the way you are going about your replies is rather irritating, to say the least. Like I said in my previous posts, it is nice to see people looking towards the future. However, learn to bat before trying to coach a baseball team.rimnet00
I think everyone should be aware of the bold before you start praising the TC. :( Hate to be the one that says it, but it was a tad unethical :cry:
Haha, I could've saved the original poster a lot of time. Check out the new quantum computer someone made. You could literally put your CPU, GPU and everything else together in one little system. It still works on binary 0s and 1s, but cracking code that would take 10 years would take only a couple of minutes or seconds. The computer is almost unlimited in what you could do with it.
Here's the link; go to the contents and click on "The power of quantum computers":
http://en.wikipedia.org/wiki/Quantum_computer#The_power_of_quantum_computers
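For what it's worth, the speedup isn't unlimited: for plain brute-force key searching, Grover's algorithm gives only a quadratic speedup (roughly the square root of the number of guesses), while the really dramatic "years down to minutes" figures come from Shor's factoring algorithm. Here's a rough back-of-the-envelope comparison (my own numbers, purely illustrative, assuming a made-up rate of one guess per nanosecond on both machines):
[code]
// Back-of-the-envelope: classical brute force vs. Grover's quadratic speedup
// for searching a key space of 2^k keys, assuming (purely hypothetically)
// one guess per nanosecond on both the classical and the quantum machine.
#include <cmath>
#include <cstdio>

int main() {
    const double guessesPerSecond = 1e9;           // hypothetical rate
    for (int k : {40, 56, 64, 80}) {
        double classical = std::pow(2.0, k);        // ~N guesses
        double grover    = std::pow(2.0, k / 2.0);  // ~sqrt(N) iterations
        std::printf("%2d-bit key: classical %.1e s (%.1e years), Grover %.1e s\n",
                    k,
                    classical / guessesPerSecond,
                    classical / guessesPerSecond / 3.15e7,
                    grover / guessesPerSecond);
    }
    return 0;
}
[/code]
Under those made-up numbers an 80-bit key drops from tens of millions of years to about 20 minutes, which is roughly the kind of jump being described.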
I suggest people take a read of this article by Ray Kurzweil:
The Law of Accelerating Returns
http://www.kurzweilai.net/articles/art0134.html?printable=1
As you have probably noticed, over the last few years things seem to have slowed down a bit, in that there have been no huge leaps in processing capabilities and no upward scaling in GHz. From research it is clearly evident that we have exponential growth in computing power, but with many "S" curves along the way (you can see them on the graphs by going to that link). Right now research is being done that will create the future hardware to once again drive the next leap in technology; we are just on the flat part of the curve. This has happened many times in computer technology through the 1900s. See an example of an S curve here: http://en.wikipedia.org/wiki/Diffusion_of_innovations
Our future is based heavily on the progress of advanced nanotechnology and on methods of integrating that technology, or of moving away from conventional methods and relying on new nanotech innovations. Carbon nanotubes, as already mentioned, are likely to play a huge part because of their conductivity, heat dissipation and low energy requirements.
In 20 years computers will probably be far more capable than the human brain, or many brains, and able to render lifelike games. The most important thing now, I believe, is creating good AI and interaction with games. Computer interfaces are important because they add so much to the experience. There are already some really cool developments in VR; it's just not cost effective right now. However, there are better devices only about 5 years away that will enable full visual VR, using either contact lenses to create full or augmented VR (a computer overlay on the real world) or glasses that beam the images right onto our retinas.
Then of course we can just speculate where nano-computing will take us: maybe direct brain-computer interfacing, where we are fully immersed in the virtual world by using nano devices to switch certain parts of the brain on or off to enable full VR with all the senses.
All speculation, but it's fun to talk about, I think.
I believe that the original poster is correct. With advanced nanotech, computers will literally almost disappear; they will be integrated, and the GPU, CPU and other devices will be compacted into something very small.
I encourage you to take a look at Ray Kurzweil's talks. He wrote the books "The Singularity Is Near" and "The Age of Spiritual Machines", and he has been correct on almost every prediction he made in his first book. He is a famous inventor and makes his predictions based on research, not a crystal ball. lol
Type in ray kurzweil
http://www.youtube.com
and check out his talks. I feel some of you will be quite interested, if you're interested in where computer technology is going.
Matt
[QUOTE="Macolele"]IBM using 15 Cell to render a complex landscape in realtime. I believe next Cell will render 3d alone. That's possible for console. Im not sure PC can do that because we need a large x86 CPU to reach there.Gangans
If a console will be able to do it then of course a pc will...
Unfortunately, the x86 CPU never reach theoretic performance itself. We know Intel 80 core can render 3d too. But OS can't handle it. We need a smart OS in the future.
Rumor Intel is going to launch GPU with a teraflop power. Its 10 core with shared cache like a CPU than a traditional GPU. It's kind of CPU do GPU function.
Haha, I could've saved the original poster a lot of time. Check out the new quantum computer someone made. You could literally put your CPU, GPU and everything else together in one little system. It still works on binary 0s and 1s, but cracking code that would take 10 years would take only a couple of minutes or seconds. The computer is almost unlimited in what you could do with it.
Here's the link; go to the contents and click on "The power of quantum computers":
http://en.wikipedia.org/wiki/Quantum_computer#The_power_of_quantum_computers
Fuseking
super computer right here
And with these words I retire from this thread. I'm glad many of you enjoyed what I wrote. Have fun continuing any arguments/discussions you guys.Gangans
Meaning: you guys are too complex, so I'm leaving.
I never heard anyone attacking your concept. In fact, I think that as we go, things will become outdated and everything will move forward as technology does. But assuming that we are just attacking everything is silly. Just because we don't like the entirety of your post doesn't mean you should just run away.
Well, whatever. I do enjoy reading up on this stuff, but this future-guessing doesn't really get us anywhere. That's why we're elaborating. Grow a pair.