This topic is locked from further discussion.
[QUOTE="painguy1"]
[QUOTE="04dcarraher"] Wrong,, fixed hardware does not make the hardware or game run better with weaker hardware. You have what you have and that's it you have to decide what gets cut or toned down to make it work or fit. If they happen to use a mid-high end gpu like the 5770/6770 gpu's its still cant be faster then itself. 04dcarraher
No, I'm not. It's a well-known fact that devs can push more out of the hardware on a fixed platform. On PC, the DX/OpenGL API hinders performance. Console devs don't have to deal with that, since they can use low-level access to push more out of the hardware.
No, it's still not true, lol; the hardware is still the same. Say you have a console with a 9800GT and a PC with a 9800GT: same GPU, memory, and abilities. Ok? Those devs cannot make that 9800GT faster than what it is... a 9800GT.... Modern DX/OpenGL APIs do not hinder the hardware so much as to warrant saying "omg, it hinders performance!!!". What happens on consoles is that after the code is optimized as best they can get it, they compromise all over the place to get it to work well, which is what they do on current consoles all the time because of memory and hardware limits. It's also like saying 256 MB of video memory on console can outdo 512 MB of video memory on PC because consoles don't have to deal with DirectX... 256 MB can only hold 256 MB of data at any time, not 512 MB or 1024 MB, no matter how good they are...
First off, you need to stop over-analyzing my words. I never once said that having fixed hardware magically makes the hardware faster and gives it a larger memory capacity, so stop twisting my words around. The fact of the matter is that when you have low-level access to the hardware, you can push out much more. The API overhead is a big hindrance whether you like it or not.
The API overhead is not a big hindrance..... if it were, you couldn't run console ports as well or better on GPUs with similar performance. How do you explain, for example, GeForce 6800s or GeForce 7s being able to equal or outpace almost all multiplatform games all the way into 2008? The DirectX or OpenGL overhead, as you call it, does not gimp the hardware as much as you think. DirectX and OpenGL provide invaluable common interfaces to a varied array of graphics cards. The PS3 uses a modified OpenGL and the 360 uses DirectX to address the Xenos (which draws on the 360's CPU resources), so what's that about consoles not using any API that hinders performance? They can push a fixed piece of hardware since it's the only hardware they have to work with, so they can squeeze out as much as they can. However, your claim that the API hinders performance is nonsense, since both consoles use an API. Even though the PC uses a generalized API, it's up to the developers to make the software/game efficient and optimized.
The API overhead is not a big hindrance..... if it were, you couldn't run console ports as well or better on GPUs with similar performance. How do you explain, for example, GeForce 6800s or GeForce 7s being able to outpace almost all multiplatform games all the way into 2008? The DirectX or OpenGL overhead, as you call it, does not gimp the hardware as much as you think. DirectX and OpenGL provide invaluable common interfaces to a varied array of graphics cards. The PS3 uses a modified OpenGL and the 360 uses DirectX to address the Xenos (which draws on the 360's CPU resources), so what's that about consoles not using any API that hinders performance? They can push a fixed piece of hardware since it's the only hardware they have to work with, so they can squeeze out as much as they can. However, your claim that the API hinders performance is nonsense, since both consoles use API coding.
04dcarraher
http://www.flesheatingzipper.com/tech/2011/03/amd-directx-sucks-too-much-overhead-that-is/
Straight from the mouth of ATI/AMD. Do you still want to keep telling me that PC APIs don't cause problems?
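For what it's worth, the overhead being argued about is easy to reason about with a toy cost model. All the numbers below are hypothetical, chosen only to illustrate why per-draw-call cost can dominate a frame; they are not measurements of any real API.

```python
# Toy cost model of per-draw-call API overhead (all numbers hypothetical).
def frame_time_ms(draw_calls, overhead_us_per_call, gpu_work_ms):
    """CPU-side submission cost plus actual GPU work for one frame, in ms."""
    return draw_calls * overhead_us_per_call / 1000.0 + gpu_work_ms

# Same scene: 5000 individual draws through a thick PC API vs. the
# equivalent work batched into 500 cheaper calls on a console-style API.
pc_ms = frame_time_ms(5000, 10.0, 8.0)     # 50 ms submission + 8 ms work
console_ms = frame_time_ms(500, 2.0, 8.0)  # 1 ms submission + 8 ms work
print(pc_ms, console_ms)
```

Under these made-up figures, the identical GPU workload goes from 58 ms to 9 ms per frame purely by cutting submission cost, which is the kind of effect both sides here are arguing about: the GPU itself never got faster.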
In a Battlefield 3 interview, the developer said Frostbite 2.0 was an engine for next-gen consoles, so basically Battlefield 3's PC graphics are what I am expecting for next-gen consoles. It is actually really sad, because Battlefield 3 will be similar in graphics quality to Crysis, which came out in 2007. If we are not expecting a new console from Microsoft or Sony until 2013, then that means PCs are 6 years ahead of consoles. Wow. That is basically an entire lifespan of a console. It is like PCs are PS3s and consoles are PS2s.
So what kind of graphics do you expect next generation? Consoles, of course; PCs don't have silly generations.
M4Ntan
Voxel + raytracing + raster hybrid.
[QUOTE="04dcarraher"]
The API overhead is not a big hindrance..... if it were, you couldn't run console ports as well or better on GPUs with similar performance. How do you explain, for example, GeForce 6800s or GeForce 7s being able to outpace almost all multiplatform games all the way into 2008? The DirectX or OpenGL overhead, as you call it, does not gimp the hardware as much as you think. DirectX and OpenGL provide invaluable common interfaces to a varied array of graphics cards. The PS3 uses a modified OpenGL and the 360 uses DirectX to address the Xenos (which draws on the 360's CPU resources), so what's that about consoles not using any API that hinders performance? They can push a fixed piece of hardware since it's the only hardware they have to work with, so they can squeeze out as much as they can. However, your claim that the API hinders performance is nonsense, since both consoles use API coding.
painguy1
http://www.flesheatingzipper.com/tech/2011/03/amd-directx-sucks-too-much-overhead-that-is/
Straight from the mouth of ATI/AMD. Do you still want to keep telling me that PC APIs don't cause problems?
Lol, they recanted that statement later because it was taken out of context... and was wrong. [QUOTE="04dcarraher"]
The API overhead is not a big hindrance..... if it were, you couldn't run console ports as well or better on GPUs with similar performance. How do you explain, for example, GeForce 6800s or GeForce 7s being able to outpace almost all multiplatform games all the way into 2008? The DirectX or OpenGL overhead, as you call it, does not gimp the hardware as much as you think. DirectX and OpenGL provide invaluable common interfaces to a varied array of graphics cards. The PS3 uses a modified OpenGL and the 360 uses DirectX to address the Xenos (which draws on the 360's CPU resources), so what's that about consoles not using any API that hinders performance? They can push a fixed piece of hardware since it's the only hardware they have to work with, so they can squeeze out as much as they can. However, your claim that the API hinders performance is nonsense, since both consoles use API coding.
painguy1
http://www.flesheatingzipper.com/tech/2011/03/amd-directx-sucks-too-much-overhead-that-is/
Straight from the mouth of ATI/AMD. Do you still want to keep telling me that PC APIs don't cause problems?
Are you claiming it sucks so much that it reduces GPUs like the Radeon HD 6950 to Xbox 360 levels? This POV is absurd, since multi-platform games prove otherwise.
An AMD Radeon HD 4670 can render Devil May Cry on PC far beyond the Xbox 360's 720p rendering results.
Btw, if the Wii U's GPU were a Radeon HD 4670 (320 SPs), it would kill both the PS3 and Xbox 360. A Radeon HD 4670 with GDDR5 would widen the gap, i.e. most PC "fat" GPUs are bandwidth-constrained, e.g. the Radeon HD 5770M (400 SPs at 650 MHz) with GDDR5 is about 40 percent better than the Radeon HD 5730M (400 SPs at 650 MHz).
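The bandwidth point can be sketched with the standard peak-bandwidth formula. The transfer rates and bus width below are assumptions for illustration, not the actual specs of those mobile parts:

```python
# Peak memory bandwidth = effective transfer rate x bus width.
# Figures are hypothetical, chosen only to show why GDDR5 helps a GPU
# whose core is otherwise identical.
def bandwidth_gb_s(effective_mtps, bus_width_bits):
    """effective_mtps: effective transfer rate in megatransfers per second."""
    return effective_mtps * bus_width_bits / 8 / 1000

gddr5 = bandwidth_gb_s(3200, 128)  # 51.2 GB/s
gddr3 = bandwidth_gb_s(1600, 128)  # 25.6 GB/s
print(gddr5, gddr3)
```

With the same shader count and clock, doubling the effective memory rate doubles peak bandwidth, which is why two "400 SP at 650 MHz" parts can differ so much in practice.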
PS: PC games carrying AMD's Gaming Evolved logo are optimized for AMD Radeon HD hardware.
PC games carrying NVIDIA's TWIMTBP logo are optimized for NVIDIA GeForce CUDA hardware.
While AMD is building its own Intel "Larrabee"-type cGPU with AMD's own x86-64 IP (i.e. AMD's Graphics Core Next/Radeon HD 79x0 is the start), AMD has reaffirmed support for DirectX.
AMD's statement reaffirming support for DirectX
There are no technical barriers stopping devs from writing a game for the AMD Accelerated Parallel Processing (APP, formerly known as ATI Stream) platform.
You already have high-performance application devs like CyberLink or MainActor supporting the AMD Accelerated Parallel Processing platform.
Depends. I think the graphics will be close to current CGI, but of course they will still have several disadvantages. I think next gen we'll slowly start seeing less interest in cutting-edge graphics due to games slowly starting to reach "photorealism" (hypothetically speaking).
[QUOTE="M4Ntan"]
In a Battlefield 3 interview, the developer said Frostbite 2.0 was an engine for next-gen consoles, so basically Battlefield 3's PC graphics are what I am expecting for next-gen consoles. It is actually really sad, because Battlefield 3 will be similar in graphics quality to Crysis, which came out in 2007. If we are not expecting a new console from Microsoft or Sony until 2013, then that means PCs are 6 years ahead of consoles. Wow. That is basically an entire lifespan of a console. It is like PCs are PS3s and consoles are PS2s.
So what kind of graphics do you expect next generation? Consoles, of course; PCs don't have silly generations.
ronvalencia
Voxel + raytracing + raster hybrid.
It's a question whether the Xbox 720 and PS4 will already be capable of real-time raytracing, though. That hybrid model does sound likely. Or perhaps they'll just use standard rasterization and heavy tessellation.
[QUOTE="ronvalencia"]
[QUOTE="M4Ntan"]
In a Battlefield 3 interview, the developer said Frostbite 2.0 was an engine for next-gen consoles, so basically Battlefield 3's PC graphics are what I am expecting for next-gen consoles. It is actually really sad, because Battlefield 3 will be similar in graphics quality to Crysis, which came out in 2007. If we are not expecting a new console from Microsoft or Sony until 2013, then that means PCs are 6 years ahead of consoles. Wow. That is basically an entire lifespan of a console. It is like PCs are PS3s and consoles are PS2s.
So what kind of graphics do you expect next generation? Consoles, of course; PCs don't have silly generations.
nameless12345
Voxel + raytracing + raster hybrid.
It's a question whether the Xbox 720 and PS4 will already be capable of real-time raytracing, though. That hybrid model does sound likely. Or perhaps they'll just use standard rasterization and heavy tessellation.
They won't be ready for real-time ray tracing, because fluid framerates would take more power than two GTX 580s.
[QUOTE="painguy1"]
[QUOTE="04dcarraher"]
The API overhead is not a big hindrance..... if it were, you couldn't run console ports as well or better on GPUs with similar performance. How do you explain, for example, GeForce 6800s or GeForce 7s being able to outpace almost all multiplatform games all the way into 2008? The DirectX or OpenGL overhead, as you call it, does not gimp the hardware as much as you think. DirectX and OpenGL provide invaluable common interfaces to a varied array of graphics cards. The PS3 uses a modified OpenGL and the 360 uses DirectX to address the Xenos (which draws on the 360's CPU resources), so what's that about consoles not using any API that hinders performance? They can push a fixed piece of hardware since it's the only hardware they have to work with, so they can squeeze out as much as they can. However, your claim that the API hinders performance is nonsense, since both consoles use API coding.
ronvalencia
http://www.flesheatingzipper.com/tech/2011/03/amd-directx-sucks-too-much-overhead-that-is/
Straight from the mouth of ATI/AMD. Do you still want to keep telling me that PC APIs don't cause problems?
Are you claiming it sucks so much that it reduces GPUs like Radeon HD 6950 to Xbox 360 levels?
No, I'm not. :)
[QUOTE="ronvalencia"]
Voxel + raytracing + raster hybrid.
nameless12345
It's a question whether the Xbox 720 and PS4 will already be capable of real-time raytracing, though. That hybrid model does sound likely. Or perhaps they'll just use standard rasterization and heavy tessellation.
A real-time voxel + raytraced lights (20 bounces) + raster hybrid was already demoed during AMD's Cinema 2.0 showcase at the Radeon HD 4870's release.
[QUOTE="nameless12345"]
[QUOTE="ronvalencia"]
Voxel + raytracing + raster hybrid.
04dcarraher
It's a question whether the Xbox 720 and PS4 will already be capable of real-time raytracing, though. That hybrid model does sound likely. Or perhaps they'll just use standard rasterization and heavy tessellation.
They won't be ready for real-time ray tracing, because fluid framerates would take more power than two GTX 580s.
They are not using voxels to speed up ray-tracing.
[QUOTE="kickass1337"]avatar like graphics in 480p with 24fps :PMozelleple112Avatar graphics in 160x90p with 0.1 fps is more realistic.This. Avatar-like graphics at 480p with 24fps is also not possible for quite sometime.
[QUOTE="04dcarraher"]
[QUOTE="nameless12345"]
It's a question whether the Xbox 720 and PS4 will already be capable of real-time raytracing, though. That hybrid model does sound likely. Or perhaps they'll just use standard rasterization and heavy tessellation.
ronvalencia
They won't be ready for real-time ray tracing, because fluid framerates would take more power than two GTX 580s.
They are not using voxels to speed up ray-tracing.
Oh, so they are going with polygons over voxels? So, like, using a sparse voxel octree format for geometry?
[QUOTE="ronvalencia"]
[QUOTE="04dcarraher"]
They won't be ready for real-time ray tracing, because fluid framerates would take more power than two GTX 580s.
04dcarraher
They are not using voxels to speed up ray-tracing.
Oh, so they are going with polygons over voxels? So, like, using a sparse voxel octree format for geometry?
As in AMD's Cinema 2.0 demos, the vertex stage was skipped, but Cinema 2.0 demos still use AMD's RV770 tessellation hardware...
The next-gen console hardware should be able to render something like this.
Note that AMD is designing its own Larrabee-type x86-64-based graphics processor, i.e. raytracing is on the cards.
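For readers unfamiliar with the "sparse voxel octree" mentioned above, here is a minimal sketch of the idea: space is recursively split into eight octants, and nodes are allocated only where geometry actually exists. This is a toy illustration of the data structure, not any vendor's implementation.

```python
class SVONode:
    """One cube of space; children are allocated only where geometry exists."""
    def __init__(self):
        self.children = [None] * 8  # one slot per octant
        self.filled = False

def insert(node, x, y, z, depth):
    """Mark the leaf voxel containing point (x, y, z) in [0, 1)^3 as filled."""
    if depth == 0:
        node.filled = True
        return
    # Choose the octant the point falls into, then recurse into that
    # half-size cube with the coordinates rescaled back to [0, 1).
    octant = (x >= 0.5) + 2 * (y >= 0.5) + 4 * (z >= 0.5)
    if node.children[octant] is None:
        node.children[octant] = SVONode()
    insert(node.children[octant], x * 2 % 1.0, y * 2 % 1.0, z * 2 % 1.0, depth - 1)

root = SVONode()
insert(root, 0.3, 0.7, 0.1, 2)  # sparse: only 2 nodes allocated, not 8**2
```

The "sparse" part is the whole appeal: empty space costs almost nothing, and rays can skip entire empty octants in one step, which is why the format keeps coming up in raytracing discussions.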
Here's how I see it:
It is still up in the air as far as the power of the consoles is concerned. Sony and MS are pushing motion controls probably to cut costs on the next generation of hardware. At the moment, this is working for MS but not for Sony. If people buy into Kinect, you can bet the system will probably be less impressive spec-wise than if Kinect had flopped. Move isn't selling as well. In this case, Sony HAS to sell the PS4 somehow. Considering that current-gen consoles more or less offer all the "options" one can think of, it is only natural that Sony sells next gen based off outstanding visuals.
People who think next-gen consoles won't be as powerful as current-gen PCs don't seem to understand that they will probably exceed what we see today. They have to in order to sell. I expect EVERYBODY to be floored when the next PlayStation is unveiled.
[QUOTE="04dcarraher"]
[QUOTE="ronvalencia"]
They are not using voxels to speed up ray-tracing.
ronvalencia
Oh, so they are going with polygons over voxels? So, like, using a sparse voxel octree format for geometry?
As in AMD's Cinema 2.0 demos, the vertex stage was skipped, but Cinema 2.0 demos still use AMD's RV770 tessellation hardware...
The next-gen console hardware should be able to render something like this.
That's running on a 4870?
[QUOTE="04dcarraher"]
[QUOTE="ronvalencia"]
They are not using voxels to speed up ray-tracing.
ronvalencia
Oh, so they are going with polygons over voxels? So, like, using a sparse voxel octree format for geometry?
As in AMD's Cinema 2.0 demos, the vertex stage was skipped, but Cinema 2.0 demos still use AMD's RV770 tessellation hardware...
The next-gen console hardware should be able to render something like this.
With the emphasis on "should be able". Today's graphics techniques are surprisingly backwards and simply rely on raw polygon-crunching power (tessellation being just an increase in poly count on finer details), and barely make use of more advanced and/or alternative graphics techniques.
Those do seem believable next-gen graphics though.
[QUOTE="ronvalencia"]
[QUOTE="04dcarraher"]
Oh, so they are going with polygons over voxels? So, like, using a sparse voxel octree format for geometry?
topgunmv
As in AMD's Cinema 2.0 demos, the vertex stage was skipped, but Cinema 2.0 demos still use AMD's RV770 tessellation hardware...
The next-gen console hardware should be able to render something like this.
That's running on a 4870?
Probably a 4870 X2, which is two 4870s on one card. But it's supposedly real-time.
I expect EVERYBODY to be floored when the next PlayStation is unveiled.
Heirren
I expect Yamauchi to finally be able to fully realize his vision of Gran Turismo on the PS4 (the PS3 sadly wasn't sufficient) ;)
[QUOTE="ronvalencia"]
[QUOTE="04dcarraher"]
Oh, so they are going with polygons over voxels? So, like, using a sparse voxel octree format for geometry?
topgunmv
As in AMD's Cinema 2.0 demos, the vertex stage was skipped, but Cinema 2.0 demos still use AMD's RV770 tessellation hardware...
The next-gen console hardware should be able to render something like this.
That's running on a 4870?
It was a 3rd party demo for Radeon HD 4870's release. It was indicated to run on two RV770s.
Hopefully a lot better than this; it would be sad if a 2007 game could compete with the next-gen consoles, at least the PS4 and 720 anyway.
[QUOTE="ronvalencia"]
[QUOTE="04dcarraher"]
Oh, so they are going with polygons over voxels? So, like, using a sparse voxel octree format for geometry?
nameless12345
As in AMD's Cinema 2.0 demos, the vertex stage was skipped, but Cinema 2.0 demos still use AMD's RV770 tessellation hardware...
The next-gen console hardware should be able to render something like this.
With the emphasis on "should be able". Today's graphics techniques are surprisingly backwards and simply rely on raw polygon-crunching power (tessellation being just an increase in poly count on finer details), and barely make use of more advanced and/or alternative graphics techniques.
Those do seem believable next-gen graphics though.
Most devs rely on SDK examples (and graphics-card vendors' ISV support); the best example is NVIDIA's FXAA source-code blog dump, which the rest of the industry then adapted into existing game engines.
Thanks NVIDIA for doing R&D.
[QUOTE="topgunmv"]
[QUOTE="ronvalencia"]
As in AMD's Cinema 2.0 demos, the vertex stage was skipped, but Cinema 2.0 demos still use AMD's RV770 tessellation hardware...
The next-gen console hardware should be able to render something like this.
nameless12345
That's running on a 4870?
It was a 3rd party demo for Radeon HD 4870's release. It was indicated to run on two RV770s.
The problem is that a single 5870 is stronger than two 4870s on paper, and SmallLuxGPU, which benchmarks a GPU's ray-tracing power in thousands of rays per second, rates the 5870 around 8,400 while a GTX 570 is rated above 12,000. Also, I don't think that demo was real-time; it was more or less a pre-rendered cinematic run-through.
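Taking the quoted SmallLuxGPU-style scores at face value, a quick back-of-envelope shows why real-time ray tracing looked out of reach. This generously treats every measured ray as a primary ray, ignoring bounces and shading entirely:

```python
# Upper bound on frame rate if every measured ray were a primary ray.
def max_fps(rays_per_sec, width, height, samples_per_pixel=1):
    """Rays-per-second budget divided by rays needed for one frame."""
    return rays_per_sec / (width * height * samples_per_pixel)

# Scores quoted above are in thousands of rays per second.
gtx570_rays = 12000 * 1000  # ~12 Mrays/s
print(max_fps(gtx570_rays, 1280, 720))  # ~13 fps at 720p, one sample per pixel
```

Even the faster card tops out around 13 fps at 720p before a single bounce, shadow ray, or shading pass, which supports the point that fluid-framerate ray tracing needed far more throughput than the hardware of the day offered.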
Hopefully a lot better than this; it would be sad if a 2007 game could compete with the next-gen consoles, at least the PS4 and 720 anyway.
DJ_Headshot
I don't know why Crysis gets mentioned so much as the "defining" graphics benchmark, tbh. That was true back in 2007, but nowadays we have games that surpass its visual quality. And apart from being horribly optimized, Crysis ran fine on a dual core with two gigs of RAM and an NVIDIA 8800-series card. Even the Wii-U will be able to handle Crysis at a good detail level. In fact, the only reason current consoles didn't get it is their too-small amount of RAM.
[QUOTE="nameless12345"]
[QUOTE="topgunmv"]
That's running on a 4870?
04dcarraher
It was a 3rd party demo for Radeon HD 4870's release. It was indicated to run on two RV770s.
The problem is that a single 5870 is stronger than two 4870s on paper, and SmallLuxGPU, which benchmarks a GPU's ray-tracing power in thousands of rays per second, rates the 5870 around 8,400 while a GTX 570 is rated above 12,000. Also, I don't think that demo was real-time; it was more or less a pre-rendered cinematic run-through.
It was claimed to be real-time, and lux-gpu is not the same program as the one in the demo; e.g., in terms of speed, an open-source Java VM is not the same quality as Sun's Java VM. We don't know what other algorithmic shortcuts were employed in the Cinema 2.0 demos, i.e. that's what gives the said software IP some dollar value.
This is also true for GCC 4.x vs Intel CC v10 vs MS VC8. You get what you pay for.
Ones that will blow me away. I mean, it will cause me to sit down, turn the console on, see the game graphics and literally take liftoff.
[QUOTE="DJ_Headshot"]
Hopefully a lot better than this; it would be sad if a 2007 game could compete with the next-gen consoles, at least the PS4 and 720 anyway.
nameless12345
I don't know why Crysis gets mentioned so much as the "defining" graphics benchmark, tbh. That was true back in 2007, but nowadays we have games that surpass its visual quality. And apart from being horribly optimized, Crysis ran fine on a dual core with two gigs of RAM and an NVIDIA 8800-series card. Even the Wii-U will be able to handle Crysis at a good detail level. In fact, the only reason current consoles didn't get it is their too-small amount of RAM.
Or the fact that the consoles' GPU power is also much weaker than the old GeForce 8800s.
[QUOTE="nameless12345"]
[QUOTE="DJ_Headshot"]
Hopefully a lot better than this; it would be sad if a 2007 game could compete with the next-gen consoles, at least the PS4 and 720 anyway.
04dcarraher
I don't know why Crysis gets mentioned so much as the "defining" graphics benchmark, tbh. That was true back in 2007, but nowadays we have games that surpass its visual quality. And apart from being horribly optimized, Crysis ran fine on a dual core with two gigs of RAM and an NVIDIA 8800-series card. Even the Wii-U will be able to handle Crysis at a good detail level. In fact, the only reason current consoles didn't get it is their too-small amount of RAM.
Or the fact that the consoles' GPU power is also much weaker than the old GeForce 8800s.
Graphics wouldn't be a problem:
http://www.youtube.com/watch?v=2WJG14uLA3k
It would essentially look like the PC version on medium detail with some possible enhancements here and there.
For the next gen of consoles, not much tbh; I expect the next gen to be the one with the smallest visual leap ever.
Arach666
How so? Even the Wii-U is going to be noticeably better than the HD twins. I don't think MS and Sony will go *that* cheap on their next machines. I do expect them to be better than the Wii-U. Perhaps even considerably better.
[QUOTE="Arach666"]For the next gen of consoles not much tbh,I expect the next gen to be the one with the smallest visual leap ever.nameless12345
How so? Even the Wii-U is going to be noticeably better than the HD twins. I don't think MS and Sony will go *that* cheap on their next machines. I do expect them to be better than the Wii-U. Perhaps even considerably better.
The Wii U isn't going to be THAT much more powerful. If anything, it's going to be akin to the PS2 > GC = Xbox gap.
Sony will probably make a powerful console; Microsoft most likely won't, though, because they just don't need to.
[QUOTE="nameless12345"]
[QUOTE="Arach666"]For the next gen of consoles not much tbh,I expect the next gen to be the one with the smallest visual leap ever.theuncharted34
How so? Even the Wii-U is going to be noticeably better than the HD twins. I don't think MS and Sony will go *that* cheap on their next machines. I do expect them to be better than the Wii-U. Perhaps even considerably better.
The Wii U isn't going to be THAT much more powerful. If anything, it's going to be akin to the PS2 > GC = Xbox gap.
Sony will probably make a powerful console; Microsoft most likely won't, though, because they just don't need to.
It all depends on the Wii U's GPU, which is supposed to be an RV770, but no one knows how modified it's going to be. It's either going to be weaker than or equal to the normal 4800s. If the Wii U has a 4850 or even a 4870, it's going to be much better than the PS3 and 360 because of two factors: memory (1.5 GB total vs 512 MB total) and the GPU, which can be as much as 4x faster than the other two. That means much more detail, larger level designs, and higher resolutions; in the end, more options in games. And with MS fixating on AMD's upcoming APU line, it's not going to be breaking any records set by today's medium-to-high-end PCs. However, no one knows what Sony is going to do besides rehashing the Cell and beefing it up some for the PS4.
[QUOTE="Arach666"]For the next gen of consoles not much tbh,I expect the next gen to be the one with the smallest visual leap ever.nameless12345
How so? Even the Wii-U is going to be noticeably better than the HD twins. I don't think MS and Sony will go *that* cheap on their next machines. I do expect them to be better than the Wii-U. Perhaps even considerably better.
The Wii-U is going to be better than the current-gen ones, but most likely not by a generational gap, and by the way MS is going, I seriously doubt they'll go for that as well.
As for Sony, I recall seeing a thread here from someone at Sony stating that the next PS most likely won't be as big a leap as the PS3 was compared to the PS2, and tbh, after the failure Sony had this gen by going all-powerful and losing a huge chunk of their market share, I doubt they'll take the same risks as they did with the PS3.
An even more casual approach to the console market is the future of console gaming, and that doesn't need huge tech leaps.
I personally stand by what I've said: next gen, the leap will be the smallest ever.
Going by this thread, I see that deep in their heart of hearts, console fanboys really want PC graphics.
[QUOTE="theuncharted34"]
[QUOTE="nameless12345"]
How so? Even the Wii-U is going to be noticeably better than the HD twins. I don't think MS and Sony will go *that* cheap on their next machines. I do expect them to be better than the Wii-U. Perhaps even considerably better.
04dcarraher
The Wii U isn't going to be THAT much more powerful. If anything, it's going to be akin to the PS2 > GC = Xbox gap.
Sony will probably make a powerful console; Microsoft most likely won't, though, because they just don't need to.
It all depends on the Wii U's GPU, which is supposed to be an RV770, but no one knows how modified it's going to be. It's either going to be weaker than or equal to the normal 4800s. If the Wii U has a 4850 or even a 4870, it's going to be much better than the PS3 and 360 because of two factors: memory (1.5 GB total vs 512 MB total) and the GPU, which can be as much as 4x faster than the other two. That means much more detail, larger level designs, and higher resolutions; in the end, more options in games.
And with MS fixating on AMD's upcoming APU line, it's not going to be breaking any records set by today's medium-to-high-end PCs. However, no one knows what Sony is going to do besides rehashing the Cell and beefing it up some for the PS4.
I was basing what I said about the Wii U on the demos that have been shown, which are on par with current console games. But I also understand those demos were made very hastily, and on underclocked dev kits. I know the Wii U will be more powerful, but it's not going to be a *huge* difference.
Rehashing the cell? Wow you really hate that CPU, don't you? :P At least enough to disregard the main part of its architecture every chance you get.
Anyways, I'm pretty sure the Cell 2.0 or whatever it's called will be much more efficient.
Graphics definitely help to sell new consoles, but how many companies have complained about rising costs this generation? How many developers went bankrupt this generation because they couldn't make enough money off their games? How many console/PC-only developers went multiplat this generation to help ease the cost? I bet MS/Sony are very close with 3rd-party developers, and get input from them as to what the next generation should bring. Doubling the cost of development is not something I see any company wanting, given that blockbuster games now cost $20-$40 million to produce. The consoles getting more RAM seems to be a guarantee, but CPU/GPU processing power will probably not increase much at all.
Here's how I see it:
It is still up in the air as far as the power of the consoles is concerned. Sony and MS are pushing motion controls probably to cut costs on the next generation of hardware. At the moment, this is working for MS but not for Sony. If people buy into Kinect, you can bet the system will probably be less impressive spec-wise than if Kinect had flopped. Move isn't selling as well. In this case, Sony HAS to sell the PS4 somehow. Considering that current-gen consoles more or less offer all the "options" one can think of, it is only natural that Sony sells next gen based off outstanding visuals.
People who think next-gen consoles won't be as powerful as current-gen PCs don't seem to understand that they will probably exceed what we see today. They have to in order to sell. I expect EVERYBODY to be floored when the next PlayStation is unveiled.
Heirren
[QUOTE="Heirren"]Graphics definitely help to sell new consoles, but how many companies have complained about rising costs this generation? How many developers went bankrupt this generation because they couldn't make enough money off their games? How many console/pc-only developers went multiplat this generation to help ease on the cost? I bet MS/Sony are very close with 3rd party developers, and get input from them as to what the next generation should bring. Doubling the cost of development is not something I see any company wanting given that blockbuster games now cost $20-$40 million to produce. The consoles getting more ram seems to be a guarantee, but CPU/GPU processing power will probably not increase much at all.Here's how I see it:
It is still up in the air as far as the power of the consoles is concerned. Sony and MS are pushing motion controls probably to cut costs on the next generation of hardware. At the moment, this is working for MS but not for Sony. If people buy into Kinect, you can bet the system will probably be less impressive spec-wise than if Kinect had flopped. Move isn't selling as well. In this case, Sony HAS to sell the PS4 somehow. Considering that current-gen consoles more or less offer all the "options" one can think of, it is only natural that Sony sells next gen based off outstanding visuals.
People who think next-gen consoles won't be as powerful as current-gen PCs don't seem to understand that they will probably exceed what we see today. They have to in order to sell. I expect EVERYBODY to be floored when the next PlayStation is unveiled.
Ly_the_Fairy
Tons of small devs on PC make graphically outstanding games for nowhere near $20 million.
Some developers just have trouble managing their finances.
Pushing the hardware to 100% is different from saying they can make this GPU faster than an identical GPU. They cannot make the hardware faster than what it is.
04dcarraher
There are plenty of examples of PC games pushing GPUs to nearly 100% usage, showing that the game/software is efficient.
04dcarraher
Yeah, devs do tweak games to get them to work correctly. Why do you think many games run below HD resolutions, or have items or detail in sections toned down or cut to make them run well? All they do is compromise when the hardware is already at its limits....
04dcarraher
Tons of small devs on PC make graphically outstanding games for nowhere near $20 million.
Some developers just have trouble managing their finances.
topgunmv
Tons of small PC devs?
[QUOTE="theuncharted34"]
[QUOTE="nameless12345"]
How so? Even the Wii-U is going to be noticeably better than the HD twins. I don't think MS and Sony will go *that* cheap on their next machines. I do expect them to be better than the Wii-U. Perhaps even considerably better.
04dcarraher
The Wii U isn't going to be THAT much more powerful. If anything, it's going to be akin to the PS2 > GC = Xbox gap.
Sony will probably make a powerful console; Microsoft most likely won't, though, because they just don't need to.
It all depends on the Wii U's GPU, which is supposed to be an RV770, but no one knows how modified it's going to be. It's either going to be weaker than or equal to the normal 4800s. If the Wii U has a 4850 or even a 4870, it's going to be much better than the PS3 and 360 because of two factors: memory (1.5 GB total vs 512 MB total) and the GPU, which can be as much as 4x faster than the other two. That means much more detail, larger level designs, and higher resolutions; in the end, more options in games. And with MS fixating on AMD's upcoming APU line, it's not going to be breaking any records set by today's medium-to-high-end PCs. However, no one knows what Sony is going to do besides rehashing the Cell and beefing it up some for the PS4.
AMD Fusion has both a CPU and a GPU side, i.e. the Radeon HD 79x0's x86-64 IP and the AMD64 CPU's Radeon HD IP.
On AMD APU products and GFLOPs....
An AMD Trinity APU with 800+ GFLOPs would be at about the same level as a Radeon HD 4850/5750/6750/5850M, which is a lot of compute power for a single CGPU chip solution.
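The GFLOPs figures being thrown around come from the usual peak-throughput formula: shaders x clock x 2 (counting a multiply-add as two FLOPs). The Trinity shader count and clock below are assumptions for illustration, not confirmed specs:

```python
# Peak single-precision throughput: each shader does one multiply-add
# (two FLOPs) per clock. Returns GFLOPs.
def peak_gflops(shader_count, clock_mhz, flops_per_clock=2):
    return shader_count * clock_mhz * flops_per_clock / 1000

hd4850 = peak_gflops(800, 625)   # Radeon HD 4850: 800 SPs at 625 MHz -> 1000
trinity = peak_gflops(384, 800)  # hypothetical Trinity config -> ~614
print(hd4850, trinity)
```

Note this is a theoretical peak; real games never reach it, which is why two parts with similar GFLOPs on paper can perform quite differently.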
It would kill my current Dell Studio XPS 1645 laptop.
AMD Trinity laptop demo