http://www.neogaf.com/forum/showpost.php?p=81785777&postcount=2692
- How can drivers on a console still be unfinished, and the hardware not final, when they've gone into mass production?
- DirectX has been on the platform since the start; it's not buggy or "poor", it just works because of the shared codebase. They also released their mono driver during E3, which is the specially optimised version of DirectX for the platform. So saying they have been late with drivers is flat-out wrong.
- They mention "without optimisation". To me, that means someone is working these numbers out without a real kit and is literally speculating for EDGE's page views in this upcoming next-gen war. There is also more work offloaded to special processors inside the X1 than there is in the PS4.
- So 6 more CUs mean the whole console is 50% faster, do they? It's a very well known fact that extra CUs dramatically decrease the efficiency of multi-threaded tasks and of the shader cores themselves; it's not a linear performance gain, and it depends on many factors. I'm not saying the PS4 GPU doesn't have more CUs, because it does. But the PS4 GPU is also going to have a lot more to work on outside of games compared to the X1: video encoding, video decoding and, as Mark Cerny said, a lot of the audio tasks will be offloaded to the GPU, because the GPU is a parallel processor that isn't affected by GDDR latency in the same way the CPU is. Those extra CUs start to count for less and less without custom architecture to back them up. Oh, and developers have a lot more legwork managing the threading and task handling of the GPU.
- Memory reads are 50% faster? Compared to what? I can tell you for a fact that if it's the CPU doing the memory read, it will be a heck of a lot slower. Even if it's the GPU doing the read, if the developer doesn't implement switching to other tasks while waiting for the GDDR return, it'll still be slower. It depends how deep the OpenGL wrapper goes.
By no means am I saying the PS4 doesn't have more of a GPU, because it does. The thing is, it needs that GPU when you've got a CPU crippled by GDDR latency. Audio processing (not to be confused with the audio encoder in the PS4) will have to be offloaded to the GPU, and a lot of the physics will be handled by the GPU. Those extra CUs keep counting for less and less, and when you've got a CPU you have to think hard about because they've put GDDR in there, you start to see what Albert Penello is saying.
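For what it's worth, the "6 more CUs is not the same as 50% faster" claim above can be sanity-checked with a toy Amdahl-style model. To be clear about assumptions: the 12 vs. 18 CU counts are just the commonly quoted X1/PS4 figures, and the "parallel fraction" values are made up for illustration, not measured from any real game; this is a sketch of why scaling isn't linear, not a statement about how either console actually performs.

[code]
# Toy Amdahl-style model of adding compute units (CUs).
# Illustrative only: 12 vs. 18 CUs are the commonly quoted X1/PS4 figures,
# and the parallel-fraction values are made-up assumptions, not measurements.

def speedup(cus_base: int, cus_more: int, parallel_fraction: float) -> float:
    """Speedup from cus_base to cus_more CUs when only parallel_fraction of
    the frame scales with CU count; the rest is serial, fixed-function,
    or bandwidth-bound work."""
    t_base = (1.0 - parallel_fraction) + parallel_fraction / cus_base
    t_more = (1.0 - parallel_fraction) + parallel_fraction / cus_more
    return t_base / t_more

for frac in (1.0, 0.9, 0.7, 0.5):
    print(f"parallel fraction {frac:.0%}: {speedup(12, 18, frac):.2f}x "
          "(naive expectation: 1.50x)")
[/code]

Pick whatever parallel fraction you like; the only point is that "50% more CUs" and "50% faster console" are different claims, which is exactly what both sides here keep talking past.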
1: Driver updates are constant. What parallel dimension is this person coming from? Graphics cards are shipped and sold with TONS of driver updates afterward! My GeForce GTX 285 STILL gets updated periodically.
2: I would be MUCH more concerned about the XBone if they had NOT been updating its toolchain since E3. That should be a constant work in progress, just like optimizing and fine-tuning the drivers. Knocking them for it by judging the updates against some past frame of reference tells me that he is either fully aware that he is pulling the wool over your eyes or too goddamn ignorant of game development to be making this post in the first place.
3: He just contradicted himself here. Obviously, code needs to be written to call out to specific hardware. He should have caught his own slip on this. He just blurted out a big reason why games need more rework to run on the XBone than they do on the PS4.
4: Ubisoft's The Crew development team already said they have a lot of GPU-compute headroom on the PS4 that they do not have on the Xbox One.
http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4
5: Do a Google search for "clamshelling" and "GDDR5" and see what develops ;)
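Since point 5 just says to google it, here is the back-of-the-envelope arithmetic that search turns up. Caveat: the specific figures below (256-bit bus, 5.5 Gbps per pin, 4 Gbit chips) are the commonly reported launch-PS4 numbers and are used purely to illustrate what clamshell mode is, namely hanging two GDDR5 devices off one 32-bit channel with each device running in x16 mode, which doubles capacity without widening the bus.

[code]
# Back-of-the-envelope GDDR5 clamshell arithmetic. Figures are the commonly
# reported launch-PS4 numbers, used here only for illustration.

bus_width_bits = 256   # total memory interface width
data_rate_gbps = 5.5   # per-pin data rate
device_gbit = 4        # capacity of one GDDR5 device
device_io_bits = 32    # native interface width of one device

# Peak bandwidth depends only on bus width and data rate, not on chip count.
print(f"Peak bandwidth: {bus_width_bits * data_rate_gbps / 8:.0f} GB/s")  # 176 GB/s

# Normal mode: one x32 device per 32-bit channel.
normal_chips = bus_width_bits // device_io_bits
print(f"Normal mode: {normal_chips} chips = {normal_chips * device_gbit // 8} GB")  # 8 chips = 4 GB

# Clamshell mode: two devices share each channel, each running x16,
# doubling capacity while leaving the bus width (and bandwidth) unchanged.
clamshell_chips = normal_chips * 2
print(f"Clamshell mode: {clamshell_chips} chips = {clamshell_chips * device_gbit // 8} GB")  # 16 chips = 8 GB
[/code]

The takeaway is that clamshelling doubles capacity while the headline bandwidth figure stays put; whether it costs anything in latency is a separate argument that nobody in this thread actually quantifies.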
A thread about a post on another forum?
MonsieurX
Better than the gibberish from a typical Tighaman topic, but still bad.
In fact, with all the hubbub a few months ago about how they had to completely bullshit their way through the Kinect demonstrations at their unveiling conference, how can we NOT come to the conclusion that their firmware has been undergoing revision right up to this point? What an utter load of malarkey!
Hey, Microsoft. If everything was fleshed out at E3, like this post claims, then why weren't all games running on XBox One hardware instead of overspecc'd PCs?
More PS4 vs XB1 power threads.
psymon100
We should have known the "Power" Struggle between Lems and Cows would continue after Gen 7.
[QUOTE="freedomfreak"]Never heard of Neogaf.StrongBlackVine
It is like System Wars, except you get banned for trolling.
Sounds boring.
[QUOTE="StrongBlackVine"][QUOTE="freedomfreak"]Never heard of Neogaf.freedomfreak
It is like System Wars, except you get banned for trolling.
Sounds boring.
The majority of useful information that gets posted here is taken from there.
[QUOTE="psymon100"]
More PS4 vs XB1 power threads.
LegatoSkyheart
We should have known the "Power" Struggle between Lems and Cows would continue after Gen 7.
I can't imagine the launch titles will settle the score either. Oh well. At least these threads which discuss the power on paper are always so fresh and interesting. They never ever bore me. OK now hang on I just have to find a pic...
Sounds boring.
[QUOTE="freedomfreak"][QUOTE="StrongBlackVine"]
It is like System Wars, except you get banned for trolling.
StrongBlackVine
The majority of useful information that gets posted here is taken from there.
Ah yes. Classic. Post a link to the article, and then post a link to the thread in Gaf. Very useful.
Damn, TCHBO :cool:RR360DD
Not even close. The only ones being owned are the ones who will be trying to post his mind-numbingly bad points on this and other forums for the next week or so while knowing sweet piss-all about what they are talking about. This time, it's not just bad damage control, it is blatant bullshit.
It's funny: everything they tell you to google about the X1 is all bullshit, and the people saying it don't know what they're talking about, they're idiots, unless it's bad news, then it's fact, lol. But all the stuff you all look up for the PS4, y'all are just totally blind to anything that hinders your argument. Cerny said he suggests using the GPU for compute; y'all don't read that. Cerny said yes, we have audio encoding and decoding hardware, but heavy audio processing is going to be handled by the GPU. Again, y'all just blindly forgot about that or chose not to read it. This place is sooooo hypocritical.
[QUOTE="Shewgenja"]its funny everything they tell you to google and how it pertains to the x1 its all bullshit and they dont know what they are talking about, they are idiots unless its bad then its fact lol but all the stuff you all look up for the ps4 yall just totally blind to something that hinders your argument. Cerny said he suggest to uses the gpu for compute yall dont read that Cerny said yes we have a audio encoding and decoding formats, but said heavy audio processing is going to be handled by the gpu again yall just blindly forgot about that or choose to not ready this place is sooooo hypocritical. That was diarrhea from the fingertips if I've ever read it. What are you on about? So what if 7.1PCM is offloaded to CUs? There's a ton of them there and it's easy to access. Also, there is absolutely no way to substantiate a law of diminishing returns for using them. The GPU and the CPU can address the available RAM simultaneously. We've all been over this a trillion times. Please change the title to this shit thread.
[QUOTE="Tighaman"][QUOTE="Shewgenja"]its funny everything they tell you to google and how it pertains to the x1 its all bullshit and they dont know what they are talking about, they are idiots unless its bad then its fact lol but all the stuff you all look up for the ps4 yall just totally blind to something that hinders your argument. Cerny said he suggest to uses the gpu for compute yall dont read that Cerny said yes we have a audio encoding and decoding formats, but said heavy audio processing is going to be handled by the gpu again yall just blindly forgot about that or choose to not ready this place is sooooo hypocritical. That was diarrhea from the fingertips if I've ever read it. What are you on about? So what if 7.1PCM is offloaded to CUs? There's a ton of them there and it's easy to access. Also, there is absolutely no way to substantiate a law of diminishing returns for using them. The GPU and the CPU can address the available RAM simultaneously. We've all been over this a trillion times. Please change the title to this shit thread. that EDGE statement was bullshit but you was agreeing with that bullshit that aniff test against a.p. bandwidth addition was bullshit but you went it like I said x1 =not true unless bad ps4 =true unless its bad