Interesting topic: when do you think we'll have Avatar graphics?
Imo the earliest sign of Avatar-quality CGI in real time will be 2020, an Nvidia tech demo running something like six yet-to-be-released $1000 GPUs in SLI.
This topic is locked from further discussion.
[QUOTE="Wasdie"][QUOTE="Storm_Marine"] Minus the motion-control absurdity, that sounds like what PCs have been doing forever. More services, larger hard drives... I don't think that's going to get people excited.[/QUOTE][/QUOTE]
AncientDozer
The success of the Wii and the Kinect makes that not absurd. PCs don't work like consoles either. Different divisions of the PC market respond to demand for different features in different industries. We need better graphics for games, better sound for audio mixing, more powerful processing for tons of other stuff. None of that is how consoles work.
Adding tons of unsustainable horsepower to a console after the Wii has sold over 80 million is kind of pointless from a business perspective. System wars is far too focused on graphics to know why people buy gaming consoles in the first place.
That I can agree with. System wars is in its own little world. I don't know a console generation that hasn't experienced a massive leap in graphics.

Interesting topic: when do you think we'll have Avatar graphics?
Imo the earliest sign of Avatar-quality CGI in real time will be 2020, an Nvidia tech demo running something like six yet-to-be-released $1000 GPUs in SLI.
Mozelleple112
Avatar graphics in a real-time interactive world? A long, long time away.
At least as good as this generation, but with standard 60fps locked.
foxhound_fox
60fps will never be standard.
[QUOTE="AncientDozer"][QUOTE="Wasdie"]That I can agree with. System wars is in its own little world. I don't know a console generation that hasn't experienced a massive leap in graphics.[/QUOTE]The success of the Wii and the Kinect makes that not absurd. PCs don't work like consoles either. Different divisions of the PC market respond to demand for different features in different industries. We need better graphics for games, better sound for audio mixing, more powerful processing for tons of other stuff. None of that is how consoles work.
Adding tons of unsustainable horsepower to a console after the Wii has sold over 80 million is kind of pointless from a business perspective. System wars is far too focused on graphics to know why people buy gaming consoles in the first place.[/QUOTE]
Storm_Marine
We've hit the law of diminishing returns. Double the strength of a GPU, double the price, and only get a 5% real-world performance increase. That's the reality.
[QUOTE="theuncharted34"]60fps will never be standard.[/QUOTE]
foxhound_fox
It is a shame. If I were a publisher, I'd enforce it.
I actually would prefer a 60fps standard as well. It was the standard back in the 2D days, but ever since the 5th gen it's been up to the developer.
id has shown you can have a fantastic-looking game and still run at 60fps.
It's already been confirmed that the next Xbox will have graphics on par with the movie Avatar.
Supabul
No it hasn't, it's been RUMORED.
C'mon Wasdie, doubling the RSX's power would not double the price in 2011.
Storm_Marine
And where does this 5% performance figure come from?
PS3 level with higher framerates and 1080p standard, at least in the beginning, until the newer system is pushed well beyond its capabilities and they start spitting out games at 20fps that dip to 5fps every now and again.
I don't imagine it will be a large boost... just more stable and easier to work with. We'll see games looking slightly better than Uncharted or Crysis, running in 3D at 60fps with full AA, and 1080p standard instead of a seldom-reached goal.
[QUOTE="Mozelleple112"]Avatar graphics in a real-time interactive world? A long, long time away.
Interesting topic: when do you think we'll have Avatar graphics?
Imo the earliest sign of Avatar-quality CGI in real time will be 2020, an Nvidia tech demo running something like six yet-to-be-released $1000 GPUs in SLI.[/QUOTE]
Storm_Marine
This, this, this. Could not have put it better. Why? Because of the following:
As far as I'm aware, we don't have a petabyte disc or a 1000-core processor console yet :P
[QUOTE="theuncharted34"][QUOTE="foxhound_fox"][QUOTE="theuncharted34"]60fps will never be standard.[/QUOTE]It is a shame. If I were a publisher, I'd enforce it.[/QUOTE]
I actually would prefer a 60fps standard as well. It was the standard back in the 2D days, but ever since the 5th gen it's been up to the developer.
id has shown you can have a fantastic looking game and still run at 60fps.[/QUOTE]
It was never a standard in the 2D days. A lot of games back then had framerate issues, though of course a lot of them did not as well; I wouldn't call it a set standard. The only console I've played that had it as a pretty standard feature was the Dreamcast. Of the 50 or so Dreamcast games I've played, I can only think of one that ran poorly, and it was a bad game called NFL Quarterback Club.

60fps isn't going to be standard. Targeting 60fps instead of 30fps means severely downgrading the quality of your graphics, since you literally have half the time available to render each frame. It also puts a huge strain on production, since staying at 60fps requires a lot of diligence and profiling not just from the programmers, but also from the artists and designers. Whether 60fps even looks better than 30fps with nice motion blur is up for debate... a lot of developers have run tests with focus groups, showing them games running at 60fps and at 30fps with motion blur, and ended up with the majority of people picking the latter as appearing smoother! So in the end the only real benefit is less latency for snappier controls, and the importance of that depends on what kind of game you're making.
As for RAGE, you have to keep in mind that this is a game using some very exotic tech that comes with a lot of tradeoffs and drawbacks (huge space on disc, lower texture density, etc.). It also took them six years to get it right and actually make a game! It's not really realistic to point at that and say "oh hey, RAGE does 60fps, so everybody can do 60fps".
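For what it's worth, the "half the time to render each frame" point is plain arithmetic; a minimal sketch (the function name is just for illustration):

```python
# Per-frame time budget at a given target framerate.
def frame_budget_ms(fps):
    """Milliseconds available to render one frame."""
    return 1000.0 / fps

print(frame_budget_ms(30))  # ~33.33 ms per frame
print(frame_budget_ms(60))  # ~16.67 ms per frame: exactly half the budget
```

Doubling the target framerate halves the budget, which is why 60fps usually means cutting graphical features rather than just "optimizing harder".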
Well, as it looks, MS will be going with an AMD Triton APU in their next console, which means don't expect graphics power beyond AMD 5770 levels. That is still nearly 5x more powerful than the 360's GPU. With Sony and the PS4, the only thing that's for sure is that they will be using an upgraded Cell; no word on GPU power. And the Wii U is supposed to have a quad core and an R770-based GPU (AMD 4800 series) with 1.5 GB of memory. Many are hoping that both Sony and MS use at the very least 2 GB of memory with 1 GB of video memory, which would allow true HD resolutions and much better detail quality.

So would the next generation of consoles be able to max BF3? The answer is no; playing the game on a mix of medium and high settings is mostly what I'd say will happen. DICE has stated that you will need a GTX 580 to run BF3 at 1080p with a 60fps average on all high settings with DirectX 11, and that the game will need more GPU power than that to max out, which means the next set of consoles won't even come close.
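A rough sanity check on the "nearly 5x the 360's GPU" claim, using the publicly listed theoretical peak-throughput figures for each chip (spec-sheet numbers only, which overstate real-world differences):

```python
# Theoretical shader throughput, from public spec sheets (approximate).
xenos_gflops = 240     # Xbox 360 GPU (Xenos): ~240 GFLOPS
hd5770_gflops = 1360   # Radeon HD 5770: ~1.36 TFLOPS

# On paper the 5770 is roughly 5-6x the 360's GPU.
print(hd5770_gflops / xenos_gflops)  # ~5.7x
```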
Interesting topic: when do you think we'll have Avatar graphics?
Imo the earliest sign of Avatar-quality CGI in real time will be 2020, an Nvidia tech demo running something like six yet-to-be-released $1000 GPUs in SLI.
Mozelleple112
Considering the thing that rendered Avatar was probably the size of my room & still couldn't do it in real time....probably a LONG time from now.
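To put that gap in numbers: reports at the time put Avatar's offline render times at hours per frame on Weta's render farm. Taking two hours as an assumed round number (an illustration, not an exact figure), the speedup needed to reach real time is staggering:

```python
# How far offline film rendering was from real time.
# The two-hour figure is an assumed round number for illustration.
hours_per_frame = 2
seconds_per_frame_offline = hours_per_frame * 3600
seconds_per_frame_realtime = 1 / 24  # film framerate

speedup_needed = seconds_per_frame_offline / seconds_per_frame_realtime
print(f"~{speedup_needed:,.0f}x")  # ~172,800x faster than the farm
```

And that's the farm itself, not a single consumer GPU, so the real gap from a console is larger still.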
Well, as it looks, MS will be going with an AMD Triton APU in their next console, which means don't expect graphics power beyond AMD 5770 levels. [...] which means the next set of consoles won't even come close.
04dcarraher
It's called fixed hardware, dude. Pretty sure next-gen consoles have a chance at maxing it out even if they're using a weaker GPU. This is assuming they go for a mid/high-end-ish GPU.
I expect Crysis 1 to Battlefield 3, both maxed at 1080p, to be the benchmarks for next gen. I mean, the Wii U is supposed to hit that benchmark, and that is the only next-gen system we have any info on.
And what does it matter if we only get a half step in graphics? Why do we want mini-PCs as consoles? Can't we have stuff to differentiate the consoles next gen? I don't want HD twins again (or triplets).
I dunno, but it seems that it will be a while for consoles. This generation is really starting to show its age. Jumped ship to the PC because of it.
[QUOTE="painguy1"]It's called fixed hardware, dude. Pretty sure next-gen consoles have a chance at maxing it out even if they're using a weaker GPU. This is assuming they go for a mid/high-end-ish GPU.[/QUOTE]
Wrong. Fixed hardware does not make the hardware or a game run better with weaker hardware. You have what you have, and that's it; you have to decide what gets cut or toned down to make it work or fit. If they happen to use a mid/high-end GPU like the 5770/6770, it still can't be faster than itself.

[QUOTE="Mozelleple112"]Avatar graphics in a real-time interactive world? A long long time away.[/QUOTE]
Storm_Marine
Sadly you're right =(
Avatar-like graphics in 480p with 24fps :P
kickass1337
Avatar graphics in 160x90 with 0.1 fps is more realistic.
[QUOTE="04dcarraher"]Wrong. Fixed hardware does not make the hardware or game run better with weaker hardware. You have what you have and that's it; you have to decide what gets cut or toned down to make it work or fit. If they happen to use a mid/high-end GPU like the 5770/6770, it still can't be faster than itself.[/QUOTE]
No I'm not. It's a well-known fact that devs can push more out of the hardware on a fixed platform. On PC the DX/OpenGL API hinders performance. Console devs don't have to deal with that, since they can use low-level access to push more out of the hardware.
I'm not expecting a huge leap again. I imagine most games will have graphics on par with Killzone 3 next gen.
Wrong. Fixed hardware does not make the hardware or game run better with weaker hardware. [...] it still can't be faster than itself.
04dcarraher
Of course it does. Targeting fixed hardware means you can figure out ways to do the same thing more efficiently for that hardware, which in turn lets you do more things. Do you think console developers just sit around all day tweaking settings until they get the game running well?
I'll tell you when the PS4 details are revealed. DX11 is all I want.

In a Battlefield 3 interview the developer said Frostbite 2.0 was an engine for next-gen consoles, so basically Battlefield 3 PC graphics are what I am expecting for next-gen consoles. It is actually really sad, because Battlefield 3 will be similar in graphics quality to Crysis, which came out in 2007. If we are not expecting a new console from Microsoft or Sony until 2013, that means PCs are six years ahead of consoles. Wow. That is basically an entire life span of a console. It is like PCs are PS3s and consoles are PS2s.
So what kind of graphics do you expect next generation? Consoles, of course; PCs don't have silly generations.
M4Ntan
Maxed-out Witcher 2 / Metro 2033 / ArmA 3 graphics. Around that area at least.
edit: Forgot to add BF3 to that list.
Judging by the amount of people that were unimpressed with the Wii U tech demos, I predict a lot of people will be disappointed by the next gen. Diminishing returns are here in full force.
[QUOTE="04dcarraher"]Wrong. Fixed hardware does not make the hardware or game run better with weaker hardware. [...] it still can't be faster than itself.[/QUOTE]
painguy1
No I'm not. It's a well-known fact that devs can push more out of the hardware on a fixed platform. On PC the DX/OpenGL API hinders performance. Console devs don't have to deal with that, since they can use low-level access to push more out of the hardware.
No, it's still not true, lol. The hardware is still the same. Say a console has a 9800GT and a PC has a 9800GT: same GPU, same memory, same abilities. Those devs cannot make that 9800GT faster than what it is... a 9800GT. Modern DX/OpenGL APIs do not hinder the hardware enough to warrant saying "omg it hinders performance!!!". What happens on consoles, once the code is optimized as best they can get it, is that they compromise all over the place to make it work, which is what they do on current consoles all the time because of memory and hardware limits. It's also like saying 256MB of video memory on a console can outdo 512MB on PC because consoles don't have to deal with DirectX... 256MB can only hold 256MB of data at any time, not 512MB or 1024MB, no matter how good the devs are.
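Both sides of this argument fit in one toy model: per-draw-call API/CPU overhead is a fixed cost that a console's thinner software stack can shrink, while the raw GPU work takes the same time on identical silicon either way. All numbers below are invented for illustration:

```python
# Toy frame-time model: CPU-side per-draw-call overhead plus GPU work.
# All figures are made up to illustrate the argument, not measured.
def frame_time_ms(draw_calls, overhead_us_per_call, gpu_work_ms):
    return draw_calls * overhead_us_per_call / 1000 + gpu_work_ms

gpu_work = 20.0  # same GPU doing the same rendering on both platforms

pc      = frame_time_ms(3000, overhead_us_per_call=5.0, gpu_work_ms=gpu_work)
console = frame_time_ms(3000, overhead_us_per_call=0.5, gpu_work_ms=gpu_work)

print(pc)       # 35.0 ms: a thicker API layer adds CPU cost per call
print(console)  # 21.5 ms: low-level access shrinks the overhead,
                # but the 20 ms of GPU work doesn't get any faster
```

So low-level access genuinely buys frame time on fixed hardware (painguy1's point), but it cannot make the GPU itself exceed its own throughput or memory limits (04dcarraher's point).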
[QUOTE="04dcarraher"]Wrong. Fixed hardware does not make the hardware or game run better with weaker hardware. [...][/QUOTE]
Teufelhuhn
Of course it does. Targeting fixed hardware means you can figure out ways to do the same thing more efficiently for that hardware, which in turn lets you do more things. Do you think console developers just sit around all day tweaking settings until they get the game running well?
Pushing the hardware to 100% is different from saying they can make one GPU faster than another even though the two are exactly the same. They cannot make the hardware faster than what it is. There are plenty of examples of PC games pushing GPUs to nearly 100% usage, showing that the game/software is efficient. Yeah, devs do tweak games to get them working correctly; why do you think many games run below HD resolutions, with items or detail in some sections toned down or cut to make them run well? All they do is compromise once the hardware is already at its limits.

Give me 1080p and 60fps. If not 60fps, then a constant 30fps would be nice. Framerate dips can be rather frustrating.
Other than that, nice bumps in graphical fidelity would be good. I'd rather have good performance than a large jump in graphical fidelity, though.