I was watching the Final Fantasy movie that came out in 2001, and the graphics look so freaking good, better than the 360 and PS3, and it came out in 2001. If my TV was capable of showing graphics that good back then, I'm sure they could have made a game that looked as good. I was also playing Silent Hill 1 and noticed the cutscenes look really good while the gameplay graphics were crappy, so if they could make the cutscenes look good, why not the gameplay? The same can be said for Terminator 2 and Jurassic Park, which both came out around 1992-1993; if they could make the special effects look that good back then, I'm sure they could have done the same for a console. To me it seems like they're holding back just so you'd have an excuse to buy a new console every 5 or 6 years. Maybe I'm way off, but it does make you think.
jetslalom
*facepalm*
All of the examples you used were done on what's called a "render farm," where the graphics are rendered frame by frame on hardware far more powerful than the consumer hardware of the time.
None of the examples you mentioned could be done in real time when they were made; even Toy Story would give any hardware today a run for its money if you tried to render it in real time.
The CG used for film and movies is rendered on "farms" of high-powered computers. A single computer in the cluster can take a few minutes to render a single frame for complex scenes.
A real-time video game trying to maintain a framerate of 30fps has about 33 milliseconds to render a single frame. Obviously you're not going to achieve the same quality as an offline render.
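(Side note on that 33 ms figure: here's a rough back-of-envelope sketch in Python. The 30fps target comes straight from the post above; the minutes-per-frame number for the offline render is just an illustrative assumption, since actual film render times vary a lot from shot to shot.)

[code]
# Rough frame-budget comparison: offline film rendering vs. a real-time game.
# The offline figure below is an illustrative assumption, not a measured value.

TARGET_FPS = 30
realtime_budget_ms = 1000 / TARGET_FPS      # ~33.3 ms per frame at 30 fps

assumed_offline_minutes_per_frame = 5       # assumption: "a few minutes" per frame
offline_ms_per_frame = assumed_offline_minutes_per_frame * 60 * 1000

print(f"Real-time budget per frame: {realtime_budget_ms:.1f} ms")
print(f"Assumed offline render per frame: {offline_ms_per_frame:,} ms")
print(f"Offline render gets roughly {offline_ms_per_frame / realtime_budget_ms:,.0f}x "
      "more time per frame")
[/code]

With those assumed numbers, the offline render gets on the order of 9,000 times more time per frame than the game does, which is the whole gap in a nutshell.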
Maybe the new Radeons can come close; those things pack a lot of power. What was it, 2 TFLOPS per card?
Teufelhuhn
[QUOTE="Teufelhuhn"]Maybe the new Radeons can come close; those things pack a lot of power. What was it, 2 TFLOPS per card?[/QUOTE]
muscleserge
The gaming industry and the film industry are completely different from each other. Also, technology back then wasn't as great as it is today. Do I really need to explain this? :roll: