@tormentos: You are trying way too hard. You're so triggered over the Road to PS5 spec announcement.
@pc_rocks:
You know what your problem is? You have nothing, and since you can't prove the PS5 GPU will dip well below 2.23ghz, you start this campaign to discredit any information showing the contrary.
Quote the so-called question he supposedly evaded.
Because he clearly stated that the worst-case scenario would be a 10% reduction in power, which would be equivalent to a few % drop in frequency. Considering a freaking 1.8TF difference can produce 6fps more in a game at 4k (hell, 3FPS in FF15 between the 5700xt and the 5700), it is very easy to see the drop would do nothing.
Any REAL PC GAMER WOULD KNOW THIS.
But you are not a PC gamer, which is why you ask for proof of gddr being better for the CPU. Fact is, I proved my point: CPUs benefit from faster speeds and higher bandwidth, so if you can prove me wrong, go ahead, I dare you.
But I am sure you will not, or else you would have posted it long ago, which is why you are reduced to downplaying my argument rather than proving me wrong.
So for the 125th time, prove me WRONG, I DARE YOU.
@Guy_Brohski: Sony pushed the clocks on the 36 CUs to get to 10TF. They knew 9TF would look bad on paper. The reality is the 10TF will drop back to 9 when the CPU and GPU hit full capacity. The PS5 will run hot. The PS5 needs a good cooling system to run overclocked every day.
@lundy86_4: X has 52x64 shaders, PS5 has 36x64 shaders. There is a big difference in power. The new debate next gen will be raytracing. The PS5's new physics, lighting and sound will be inferior. Sony messed up so badly. After the PS4, who would have thought they'd release a weaker machine?
@pc_rocks:
You know what your problem is? You have nothing, and since you can't prove the PS5 GPU will dip well below 2.23ghz, you start this campaign to discredit any information showing the contrary.
Quote the so-called question he supposedly evaded.
Because he clearly stated that the worst-case scenario would be a 10% reduction in power, which would be equivalent to a few % drop in frequency. Considering a freaking 1.8TF difference can produce 6fps more in a game at 4k (hell, 3FPS in FF15 between the 5700xt and the 5700), it is very easy to see the drop would do nothing.
Any REAL PC GAMER WOULD KNOW THIS.
But you are not a PC gamer, which is why you ask for proof of gddr being better for the CPU. Fact is, I proved my point: CPUs benefit from faster speeds and higher bandwidth, so if you can prove me wrong, go ahead, I dare you.
But I am sure you will not, or else you would have posted it long ago, which is why you are reduced to downplaying my argument rather than proving me wrong.
So for the 125th time, prove me WRONG, I DARE YOU.
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
Notice "running near" not locked 2230Mhz clock speed.
@tormentos:
That's cool but for the 15th time, where are the sources for GDDR is better than DDR for CPUs?
X has a larger memory bus than the PS5, which gets overlooked in the spec talk. X can potentially have higher frame rates and high-quality textures loading in quicker. It can also feed the CPU quicker.
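For reference, here is a minimal sketch of where the headline bandwidth figures come from, assuming the publicly reported 14 Gbps GDDR6 and the 256-bit (PS5) and 320-bit/192-bit (Series X split pool) bus widths; the figures are the reported ones, not measurements:

```python
# Minimal sketch: peak GDDR6 bandwidth from bus width and per-pin data rate.
# Bus widths and the 14 Gbps rate are the publicly reported figures and are
# treated as assumptions here.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(256, 14.0))  # PS5: 448.0 GB/s
print(peak_bandwidth_gb_s(320, 14.0))  # XSX, 10 GB fast pool: 560.0 GB/s
print(peak_bandwidth_gb_s(192, 14.0))  # XSX, remaining 6 GB: 336.0 GB/s
```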
@pc_rocks:
You know what your problem is? You have nothing, and since you can't prove the PS5 GPU will dip well below 2.23ghz, you start this campaign to discredit any information showing the contrary.
Quote the so-called question he supposedly evaded.
Because he clearly stated that the worst-case scenario would be a 10% reduction in power, which would be equivalent to a few % drop in frequency. Considering a freaking 1.8TF difference can produce 6fps more in a game at 4k (hell, 3FPS in FF15 between the 5700xt and the 5700), it is very easy to see the drop would do nothing.
Any REAL PC GAMER WOULD KNOW THIS.
But you are not a PC gamer, which is why you ask for proof of gddr being better for the CPU. Fact is, I proved my point: CPUs benefit from faster speeds and higher bandwidth, so if you can prove me wrong, go ahead, I dare you.
But I am sure you will not, or else you would have posted it long ago, which is why you are reduced to downplaying my argument rather than proving me wrong.
So for the 125th time, prove me WRONG, I DARE YOU.
Given what we now know about "Ray Tracing" and how it will be utilized next gen, and Sony releasing an inferior machine, it's not hard to imagine Sony will always be playing catch-up to the new developments in the gaming industry. Devs will be able to do things better on the Xbox Series X console with its 16 extra compute units (16x64 shaders). Sony was quiet for months and months because they knew they had to spin the reveal and not look terrible. The fact they had to boost the clocks on a 36 CU GPU says a lot about them. Sony's reveal in 2013 was amazing, they did everything right, but all those people working there are gone now and you're left with corporate heads who messed this up. There's no spinning it: the PS5 is the weaker machine, a 25 percent difference.
You're the kind of guy that would suggest developers make games exclusive to the RTX 2080 ti because the peasants who use lowly GPUs such as a GTX 1060 or even a GTX 1080 are not worth addressing.
That's how twisted your argument has become in your efforts to defend the PS5. It's a vastly inferior piece of hardware, just let it go. The Xbox Series X will be a lot more powerful, learn that.
It's just 18.3% behind in GPU power; how that can be considered VASTLY inferior is beyond logic.
And it's this kind of pathetic argument that is why this thread exists.🤣
The gap the PS5 has with the xbox is the same as the 5700XT has with the 5700, 1.8TF, and at 4k the gap in performance at the same settings is just lol worthy.
The difference between the 5700 vs 5700 XT situation and XSX vs PS5 is that XSX includes 25% extra memory bandwidth over the 5700/5700 XT.
The 5700 XT didn't include 25% extra memory bandwidth to scale with its 9.66 TFLOPS vs the 5700's 7.7 TFLOPS at the same 448 GB/s memory bandwidth. AMD's financial situation beats the optimal technical design.
AMD also didn't include extra L2 cache for the 5700 XT's higher TFLOPS when compared to the 5700.
Your argument equating XSX vs PS5 to another RX 5700 XT vs 5700 is wrong!
XSX is already delivering the Gears 5 built-in benchmark with PC Ultra settings at RTX 2080 level results, which is +25% higher than the RX 5700 XT!
51 / 39 = 1.30 or 30%
Well, I guess it's OK to claim anywhere from 18 to 30%, so can I do the opposite? I think it will be between 18 and 10%.
After all, look at the 5700XT vs 5700: both have a gap of 1.8TF, the same as the PS5 and xbox series X, and in FF15 the gap is only 3FPS more, man. This is great, and since lemmings don't present any evidence of 30%, I don't have to either; it's a win win for me.
XSX includes 25% extra memory bandwidth over RX 5700 XT's 448 GB/s.
XSX includes 25% extra memory controllers over RX 5700 XT's 16 memory channels which could lead to 25% extra GPU L2 cache over RX 5700 XT's 4MB L2 cache e.g. 5MB L2 cache.
RX 5700 XT doesn't include extra memory bandwidth over RX 5700, Cow farts!
Add 25% on RX 5700 XT's 37.8 fps results would land on 47.25 fps which is RTX 2080 level.
RX 5700 XT's results are below my monitor's free sync range, hence this GPU is insufficient for smooth 4K frame rates.
My argument wasn't comparing the 5700XT vs the xbox series X, so again you are arguing something no one was.
My comparison was what a 1.8TF gap MEANS PERFORMANCE WISE on these 2 GPUs.
I don't care if your argument is based on your frying pan; the argument I was having here, and which YOU quoted OPENLY, was about the performance delta between these 2 GPUs with a 1.8TF gap.
Even taking your argument as good, 10FPS is nothing, man. The PS4 commanded bigger gaps than that in several games; in fact it sometimes commanded both frames and resolution as well.
@Guy_Brohski: Sony pushed the clocks on the 36 CUs to get to 10TF. They knew 9TF would look bad on paper. The reality is the 10TF will drop back to 9 when the CPU and GPU hit full capacity. The PS5 will run hot. The PS5 needs a good cooling system to run overclocked every day.
Even at 9.2TF that gap is 30%; that's smaller than the gap this gen in both cases.
WTF does that bolded part even mean?
Prove the PS5 will run hot while the xbox, with 16 more CUs producing heat and a higher clocked CPU, will not.
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
Notice "running near" not locked 2230Mhz clock speed.
or at peak frequency most of the time.
The GPU is not locked at 2.23ghz; the only time the GPU drops is when the CPU uses more power, which can be prevented by developers, as DF confirmed.
"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU.That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.
We know the GPU speed is not locked, but we know it can BE SUSTAINED as long as the CPU is held back a few mhz.
So again you are wasting your time trying to give validity only to what serves you best, as freaking always.
Prove the GPU can't sustain 2.23ghz; quote a developer or sony stating so.
@lundy86_4: X has 52x64 shaders, PS5 has 36x64 shaders. There is a big difference in power. The new debate next gen will be raytracing. The PS5's new physics, lighting and sound will be inferior. Sony messed up so badly. After the PS4, who would have thought they'd release a weaker machine?
There will be a difference, but not much. Again, the xbox can't go crazy with RT simply because it has more CUs, as RT has a nasty impact on performance and bandwidth usage as well.
Inferior sound, for real? lol
You know what is funny here? Another account from 2013 that magically started to post regularly after 2016 when scorpio was announced. Funny, isn't it: where in hell were you when the PS4 was commanding a 40% lead in power?
It's like the new trend, old ass accounts appearing out of nowhere with a super biased xbox opinion. Damn, you lemmings are a joke.😂
@Pedro: They are already putting pressure on the thermals inside the machine with an overclock. The PS5 may break after a couple of years if the cooling system is flawed.
Based on what is your theory? How do you know the OC is big? Let me guess, you are making a comparison vs the 5700XT or another GPU. Haven't you noticed the xbox series X is 1800+ mhz with 52CUs? How many GPUs with that speed have you seen? Hell, the xbox one X clock speed was way lower.
We don't know what the stock clocks of RDNA2 GPUs on PC are because they are not here yet.
Hell, the xbox could be underclocked if RDNA2 comes out with 52CUs at 1900mhz or 2.0ghz.
Given what we now know about "Ray Tracing" and how it will be utilized next gen, and Sony releasing an inferior machine, it's not hard to imagine Sony will always be playing catch-up to the new developments in the gaming industry. Devs will be able to do things better on the Xbox Series X console with its 16 extra compute units (16x64 shaders). Sony was quiet for months and months because they knew they had to spin the reveal and not look terrible. The fact they had to boost the clocks on a 36 CU GPU says a lot about them. Sony's reveal in 2013 was amazing, they did everything right, but all those people working there are gone now and you're left with corporate heads who messed this up. There's no spinning it: the PS5 is the weaker machine, a 25 percent difference.
When MS released a 40% inferior machine with a $100 higher price tag, where were you? You have a 2013 join date, yet I didn't see you here crying about it. Why cry now?
Still waiting on El Tomato to show us Lems who are not happy with SeX
I already did. Do you have any theory on the HUGE gaps the xbox will have over the PS5? We need more theories.
@tormentos:
I think it's you who doesnt read...or understand. Sony is using a 3rd party proprietary ssd. You do know that PC ssd's wont even fit in the ps5....right? You do know that ps5's requirements are very specific, including the actual shape of the drive?
That's why I said that we have no idea who is making the drives...no-one has announced they are supporting it yet.
Lol, I dont need excuses to justify my choice....just reasons. If ms released a console that was less powerful, and had an inferior BC policy with a controller that I didnt appreciate, along with a more expensive game service like psnow that I thought to be inferior, and ms didnt tell me about peripherals working on all consoles, failed to explain how the sound will work or if there will be crossbuy....I guarenfrikkintee you i would get the other console.
As it is, it's the other way around.
We've seen Microsoft's proprietary drives but Sony is sticking with its strategy of allowing users to buy off-the-shelf parts and fit them into the console themselves - so yes, NVMe PC drives will work in PlayStation 5. The only problem is that PC technology is significantly behind PS5. It'll take some time for the newer, PCIe 4.0-based drives with the bandwidth required to match Sony's spec to hit the market.
"We can hook up a drive with only two priority levels, definitely, but our custom I/O unit has to arbitrate the extra priorities - rather than the M.2 drive's flash controller - and so the M.2 drive needs a little extra speed to take care of issues arising from the different approach," says Cerny. "That commercial drive also needs to physically fit inside the bay we created in PS5 for M.2 drives. Unlike internal hard drives, there's unfortunately no standard for the height of an M.2 drive, and some M.2 drives have giant heat sinks - in fact, some of them even have their own fans."
We know it has to fit the bay they made; this is not new. Not all hdds fit inside the PS3 or PS4 either, simply because those took laptop HDDs.
But they allow 3rd party drives off the shelf, not proprietary ones like MS, whose drives were already expensive in 2006 vs regular PC drives. MS will probably try to offset any hardware loss through services and accessories, it is always that way.
Have you owned an xbox one since launch?
@Pedro: That depends on the physical bay, doesn't it? If it's unique to the ps5, then it's proprietary surely? A standard PC ssd won't fit even if it matches the requirements for access speed, and that was what I was getting at.
Dude, not all laptop drives fit inside the PS4 or PS3 bay; you're grasping. What they told you is that the one you buy needs to fit there, nothing more, and has to be as fast as or faster than the PS5 one. They designed the machine to accept 3rd party drives.
Correct but what will be an issue is if you buy a PS5 at launch you won't be able to expand the memory at first. Eventually you will and since it will require such a fast drive, it will probably be more expensive than the XsX proprietary SSD at first. Especially if you want a large SSD.
A few years into the gen this won't matter as an M.2 drive that beats PS5 drive specifications will be much cheaper. However, was all of this even necessary? What is the advantage of going so fast on the SSD so as to make it expensive and incompatible only to gain a few seconds in loading? To me the XsX solution clearly makes more sense. You can always put a faster and larger SSD later if you need to with a mid gen refresh.
Apparently you don't remember how freaking expensive xbox 360 hdds were. Man, a 20GB HDD in 2006 was $100 for the xbox, when on PC and PS3 you could have got a 100GB one for that same price.
When you make proprietary stuff it is always more expensive because it can't be manufactured as easily or sold to open markets. SSDs with speeds like the PS5's will be expensive on PC because, like any new tech, they will cost more. It's like wanting to pay $300 for a series X when you know it is worth the extra price sticker.
Sure it does, because you are a lemming; of course the xbox will make more sense to you. Do you think 52CUs make sense, having an extra 16CUs just to have 18% more power, when in 2013 the PS4 with just 6 extra CUs had 40% more power?
So you see, both played the field differently; sony wanted a super fast ssd because maybe they saw an advantage there.
There is one undeniable fact here that people like you are ignoring: the PS5 will have some incredible graphics. It could be 50% less powerful than the xbox and it would matter little, because graphics have reached a point where even shitty games look great.
Maybe sony saw a bigger advantage in using a faster ssd than in going after more power.
By the way, a 52CU GPU should cost considerably more than a 36CU one, so when it comes to price maybe sony will end up with a cheaper unit.
I don't think putting in a faster SSD will help much if your game wasn't built to take advantage of it; many PS games don't even benefit from having an SSD while others do.
You do enjoy eating shit don't you, i'm just getting tired of feeding it to you. When will that pea brain of yours understand that I don't care about da power, I care about making fun of you cows for the lack of it.
As for your exclusive explanation, why the **** are you telling me, you muppet? I showed in my link that Godawfall is not an exclusive, as stated by the devs of the game; go tell your fellow cow josh.
And now that's done, time for some more mocking lol...
That is the way you see it, but then again you can't even see your issues with lesbians, so all I can say for sure is that you have a twisted vision of reality.
I explained it to you because I know how limited your mind is, and because you claimed ff7 was on xbox when it isn't so.😂
Now dial back your tears.
Xbox fans are unhappy with the new specs?
If you have certain specs and you know the gap is minimal (in fact, compared to previous gens, not even half), would you claim the gap is massive, incredible, and blow up the actual number to almost double or more than double?
Would you do that?
I think they are unhappy; they wanted the xbox to wipe the floor with the PS5.🤷♀️
@pc_rocks:
You know what your problem is? You have nothing, and since you can't prove the PS5 GPU will dip well below 2.23ghz, you start this campaign to discredit any information showing the contrary.
Quote the so-called question he supposedly evaded.
Because he clearly stated that the worst-case scenario would be a 10% reduction in power, which would be equivalent to a few % drop in frequency. Considering a freaking 1.8TF difference can produce 6fps more in a game at 4k (hell, 3FPS in FF15 between the 5700xt and the 5700), it is very easy to see the drop would do nothing.
Any REAL PC GAMER WOULD KNOW THIS.
But you are not a PC gamer, which is why you ask for proof of gddr being better for the CPU. Fact is, I proved my point: CPUs benefit from faster speeds and higher bandwidth, so if you can prove me wrong, go ahead, I dare you.
But I am sure you will not, or else you would have posted it long ago, which is why you are reduced to downplaying my argument rather than proving me wrong.
So for the 125th time, prove me WRONG, I DARE YOU.
Yes, my problem is that after 4 weeks I still don't have answers to my simple questions, only irrelevant drivel.
So, for the 27th time, what are the base clocks (or, as DF calls them, core clocks) for PS5's CPU and GPU? Why did Cerny refuse to answer when asked point blank?
In case you still have difficulty understanding it, what's the lowest the clocks can go on each component to compensate for the full clocks of the other?
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
@tormentos:
That's cool but for the 15th time, where are the sources for GDDR is better than DDR for CPUs?
X has a larger memory bus than the PS5, which gets overlooked in the spec talk. X can potentially have higher frame rates and high-quality textures loading in quicker. It can also feed the CPU quicker.
Irrelevant to my question.
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
Well the CAS latency is nearly identical now between GDDR and DDR and you have an immensely larger amount of bandwidth at your disposal with GDDR so I would assume it's better all around..
@tormentos: You need to admit that you are the one that is unhappy with the PS5 specs. You are so unhappy that you created this thread, pretend that the GPU and CPU max clocks are its base clocks, admitted you don't know the base clocks but continue to claim they are the boost clocks, and have been damage controlling the PS5's weaker specs for a month. No one has complained about the Xbox Series X specs. Right now you are just making the situation worse for yourself. Which I appreciate.
Yes, my problem is that after 4 weeks I still don't have answers to my simple questions, only irrelevant drivel.
So, for the 27th time, what are the base clocks (or, as DF calls them, core clocks) for PS5's CPU and GPU? Why did Cerny refuse to answer when asked point blank?
In case you still have difficulty understanding it, what's the lowest the clocks can go on each component to compensate for the full clocks of the other?
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
Your problem is that you have shit to refute the evidence presented to you.
If DF says that developers are holding back the CPU to sustain 2.23ghz, what is the gap in performance vs the series X?
Quote the so-called question you say DF asked and didn't get a response to.
The lowest it can go is irrelevant, you fool, when a 10% drop in power will translate into a 3 to 5% or even smaller drop in frequency. If you truly were a PC gamer you would know this, just like you would know that latency means shit when higher bandwidth and faster speed offset any problems. As a matter of FACT that has been the case with ddr since version 1: each and every new generation packs faster clocks, higher bandwidth and more latency, yet the benefits have always outweighed any downside.
Now prove to me that developers will choose CPU over GPU on the PS5.
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
Well the CAS latency is nearly identical now between GDDR and DDR and you have an immensely larger amount of bandwidth at your disposal with GDDR so I would assume it's better all around..
Just CAS latency, not overall latency. That's the mistake people make when they try to compare DDR versions and claim that CPUs don't need low latency. Clock speed also affects latency, and absolute latency either reduces or remains the same with each DDR version upgrade, while bandwidth gets a significant boost.
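To make the "clock speed also affects latency" point concrete, here is a minimal sketch converting a CAS figure into nanoseconds; the DDR4 values are common illustrative numbers, not measurements of either console's memory, and GDDR6 access latency involves more than CAS alone:

```python
# Minimal sketch: absolute CAS latency in nanoseconds depends on both the CL
# figure (in cycles) and the clock, so comparing CL numbers alone says little.
# The DDR4 examples are illustrative, not measurements of any console.

def cas_latency_ns(cl_cycles: int, data_rate_mt_s: int) -> float:
    """CL in ns = cycles / command clock, with command clock = data rate / 2."""
    command_clock_mhz = data_rate_mt_s / 2
    return cl_cycles / command_clock_mhz * 1000

print(cas_latency_ns(16, 3200))  # DDR4-3200 CL16 -> 10.0 ns
print(cas_latency_ns(15, 2133))  # DDR4-2133 CL15 -> ~14.1 ns
```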
Your problem is that you have shit to refute the evidence presented to you.
If DF says that developers are holding back the CPU to sustain 2.23ghz, what is the gap in performance vs the series X?
Quote the so-called question you say DF asked and didn't get a response to.
The lowest it can go is irrelevant, you fool, when a 10% drop in power will translate into a 3 to 5% or even smaller drop in frequency. If you truly were a PC gamer you would know this, just like you would know that latency means shit when higher bandwidth and faster speed offset any problems. As a matter of FACT that has been the case with ddr since version 1: each and every new generation packs faster clocks, higher bandwidth and more latency, yet the benefits have always outweighed any downside.
Now prove to me that developers will choose CPU over GPU on the PS5.
Irrelevant to the questions.
So, for the 29th time, what are the base clocks for PS5's CPU and GPU? Why did Cerny refuse to answer when asked point blank?
In case you still have difficulty understanding it, what's the lowest the clocks can go on each component to compensate for the full clocks of the other?
Lastly, for the 21st time, where are the sources for GDDR is better than DDR for CPUs?
EDIT: For the quote you asked:
From 9:58: So, I guess the basic question is, is there a base clock for PS5? Well, Mark Cerny wasn't saying. In fact, he suggested the kind of workload I'm talking about is more abstract.
I've seen zero lems complain about xsx specs. I have seen Cows complain about ps5 specs though....that tells me all I need to know.
Like I asked you earlier....would you prefer it if ps5 was the more powerful console?
'Course you would, which is why you've been absolutely raging since the deep dive.
XSX includes 25% extra memory bandwidth over RX 5700 XT's 448 GB/s.
XSX includes 25% extra memory controllers over RX 5700 XT's 16 memory channels which could lead to 25% extra GPU L2 cache over RX 5700 XT's 4MB L2 cache e.g. 5MB L2 cache.
RX 5700 XT doesn't include extra memory bandwidth over RX 5700, Cow farts!
Add 25% on RX 5700 XT's 37.8 fps results would land on 47.25 fps which is RTX 2080 level.
RX 5700 XT's results are below my monitor's free sync range, hence this GPU is insufficient for smooth 4K frame rates.
My argument wasn't comparing the 5700XT vs the xbox series X, so again you are arguing something no one was.
My comparison was what a 1.8TF gap MEANS PERFORMANCE WISE on these 2 GPUs.
I don't care if your argument is based on your frying pan; the argument I was having here, and which YOU quoted OPENLY, was about the performance delta between these 2 GPUs with a 1.8TF gap.
Even taking your argument as good, 10FPS is nothing, man. The PS4 commanded bigger gaps than that in several games; in fact it sometimes commanded both frames and resolution as well.
This topic is about XSX's power. Your RX 5700 vs 5700 XT performance-results-versus-TFLOPS-gap argument is misleading for XSX's gap against PS5.
You cows made a big deal out of a miserable 0.52 TFLOPS.
@tormentos said:
@ronvalencia said:
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
Notice "running near" not locked 2230Mhz clock speed.
or at peak frequency most of the time.
The GPU is not locked at 2.23ghz; the only time the GPU drops is when the CPU uses more power, which can be prevented by developers, as DF confirmed.
"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU.That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.
We know the GPU speed is not locked, but we know it can BE SUSTAINED as long as the CPU is held back a few mhz.
So again you are wasting your time trying to give validity only to what serves you best, as freaking always.
Prove the GPU can't sustain 2.23ghz; quote a developer or sony stating so.
Notice "most of the time" statement in "peak frequency most of the time" and not it's "all of the time".
You can't claim 2230 Mhz average when your beloved PS5 doesn't exceed 2230 Mhz to counter dips below 2230 Mhz.
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
Well the CAS latency is nearly identical now between GDDR and DDR and you have an immensely larger amount of bandwidth at your disposal with GDDR so I would assume it's better all around..
The argument for GDDR being better for the CPU is useless when the PS5 needs to reduce CPU usage to enable the 2230 Mhz GPU clock.
A CPU processing raster graphics is not optimal.
Irrelevant to the questions.
So, for the 29th time, what are the base clocks for PS5's CPU and GPU? Why did Cerny refuse to answer when asked point blank?
In case you still have difficulty understanding it, what's the lowest the clocks can go on each component to compensate for the full clocks of the other?
Lastly, for the 21st time, where are the sources for GDDR is better than DDR for CPUs?
EDIT: For the quote you asked:
From 9:58: So, I guess the basic question is, is there a base clock for PS5? Well, Mark Cerny wasn't saying. In fact, he suggested the kind of workload I'm talking about is more abstract.
😂😂😂😂
So now you are the new Ronvalencia; you refuse to answer a question to which you KNOW the answer.
If the system sustains 2.23ghz, what is the GPU gap vs the Xbox?
Don't run; you can't give validity only to the parts of DF's reporting that you like.
But what if developers aren't going to optimise specifically to PlayStation 5's power ceiling? I wondered whether there were 'worst case scenario' frequencies that developers could work around - an equivalent to the base clocks PC components have.
"Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing," Mark Cerny counters. "I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle.
That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
This is the part he refers to in the video. As you can see, sony didn't refuse to answer, he didn't ask it in the way he makes it look in the video, and in fact Cerny's reply is correct.
As consoles work now, they run at a constant frequency and power is variable, which is why some games heat up consoles more than others. It's the reason why, when gears of war came out on the 360 in 2006, it killed even more 360s, because up to that point no game had pushed the hardware so hard.
But it's funny, because in the very video you posted he says Mark Cerny told him that a 10% drop in power would translate into a minor drop in frequency, and he goes further and says a 2, 3 or 4% reduction in power would bring an even smaller drop.
Again, frequency is not 1 to 1 with a power drop. If sony drops the power at worst by 10%, you will probably see about a 5% drop in frequency; now take 5% off the 2.23ghz clock and see where you land.
2,230mhz - 5% = ~2,118mhz, in other words about 2.12ghz. The performance drop you will get from this is nothing, 1 frame if anything.
This ^^ is the worst-case scenario and the reason I say you are grasping. Again, any REAL PC gamer who has experience OCing their GPU would tell you that a 5% reduction in frequency means total shit: 1 frame, 2 at most, maybe nothing at all.
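For reference, a minimal sketch of the standard FP32 TFLOPS arithmetic behind these numbers (CUs x 64 lanes x 2 ops per clock x clock speed); the clocks are the announced peaks and the 5% drop is this thread's hypothetical worst case, not a measured value:

```python
# Minimal sketch of the usual FP32 TFLOPS arithmetic:
# TFLOPS = CUs * 64 shader lanes * 2 ops per clock * clock (GHz) / 1000.
# The 5% drop below is the worst case argued in this thread, not a measurement.

def fp32_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(fp32_tflops(36, 2.23))         # PS5 peak: ~10.28 TF
print(fp32_tflops(36, 2.23 * 0.95))  # PS5 with a 5% clock drop: ~9.76 TF
print(fp32_tflops(52, 1.825))        # Series X: ~12.15 TF
```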
He also treats the 2.0ghz speed as valid when it was nothing but testing; in fact there was another test done at 1.8ghz as well.
The Gonzalo leak back in April suggested that PlayStation 5 would feature a Zen 2-based CPU cluster running at 3.2GHz paired with a Navi graphics core running at 1.8GHz.
https://www.eurogamer.net/articles/digitalfoundry-2019-playstation-5-xbox-series-x-spec-leak-analysed
Your final specs are not final until they are locked down, so testing is always done. Even the xbox one upclocked at the last minute, and that was from a confirmed speed, unlike the PS5 where it was a leak of a test. So 2.0ghz wasn't the target speed, it was just a test, just like sony tested memory and one of those tests showed even faster bandwidth than the xbox series X, so how come it is now slower?
So they increased power and decreased bandwidth?
No, it was just a test, and I am sure sony didn't choose that faster bandwidth because the gain it probably saw from OCing the GPU and having even more bandwidth than the xbox would not yield much better results. After all, you can give a 7770 the bandwidth of the 7970 and it would mean shit in the end.
Power doesn't come from bandwidth; it comes from speed and the number of CUs.
The lowest clock the GPU will drop to will probably be around 2.12ghz; that is my estimation based on DF's own claims, and that is the worst case. Dropping power by 5% will probably yield a 2% drop in frequency, which would amount to nothing, and yes, take that into account.
This is the gap 1.8TF delivers in FF15 between the 5700XT and 5700; again, dropping the clock speed by 5% is not even close to a 1.8TF drop in power.
Now again, if the GPU is sustained at 2.23ghz like DF claims, what is the actual GPU gap vs the xbox series X?
@tormentos: You need to admit that you are the one that is unhappy with the PS5 specs. You are so unhappy that you created this thread, pretend that the GPU and CPU max clocks are its base clocks, admitted you don't know the base clocks but continue to claim they are the boost clocks, and have been damage controlling the PS5's weaker specs for a month. No one has complained about the Xbox Series X specs. Right now you are just making the situation worse for yourself. Which I appreciate.
You need to stop damage controlling everything you don't like.
I am not the one increasing a gap artificially to make the xbox look better; I am not the one falsely claiming the PS5 is 9.2TF.
I am not the one claiming 30% or bigger gaps or saying it is massive; it's lemmings, not me.
So who is not happy with the actual gap it has?
You are actually an idiot, and as always you love to spin things and try to smart-ass your way through but fail miserably. They don't have to complain about the series X specs to be unhappy; all they have to do is artificially inflate gaps to show their discontent with the actual gap.
Because let's face it, in 2013 the PS4 had more than double the gap the series X has now, and was $100 less. Now they get an 18% better GPU and non-linear memory which is part faster, part slower; it is not even a big win.
But I don't have to tell you this, you see it all the time here, you just choose to ignore it, just like you have now chosen to call out more people on the lemming side to appear more unbiased, again failing hard.
It's not me who claims to be a developer and plays blind to almost all the stupidity said by lemmings; it is you.
Now if the GPU speed is a sustained 2.23ghz, what is the gap?
If the GPU drops 5% in frequency, how HUGE will the gap be?
It's 2013 all over again; this time, instead of having DX12 to close a 40% gap, they have wishful thinking power to overblow an 18% one.
This topic is about XSX's power. Your RX 5700 vs 5700 XT performance-results-versus-TFLOPS-gap argument is misleading for XSX's gap against PS5.
You cows made a big deal out of a miserable 0.52 TFLOPS.
@tormentos said:
@ronvalencia said:
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
Notice "running near" not locked 2230Mhz clock speed.
or at peak frequency most of the time.
The GPU is not locked at 2.23ghz; the only time the GPU drops is when the CPU uses more power, which can be prevented by developers, as DF confirmed.
"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU.That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.
We know the GPU speed is not locked, but we know it can BE SUSTAINED as long as the CPU is held back a few mhz.
So again you are wasting your time trying to give validity only to what serves you best, as freaking always.
Prove the GPU can't sustain 2.23ghz; quote a developer or sony stating so.
Notice "most of the time" statement in "peak frequency most of the time" and not it's "all of the time".
You can't claim 2230 Mhz average when your beloved PS5 doesn't exceed 2230 Mhz to counter dips below 2230 Mhz.
No, my argument is what a 1.8TF gap means in performance delta, period: the gap produced by actual raw GPU power.
I wasn't comparing the xbox series X vs the 5700XT; I am not YOU.
And this is the problem with you, that you are a total blind hypocrite. When you pretend the 0.52TF gap is nothing by comparing it against the 1.8TF gap on xbox, you are being an intellectually dishonest buffoon, period. There is no word that can describe your hypocrisy when you know the raw number is irrelevant, 0.52 or 1.8 TF; what matters is what that gap is relative to and how much it is in % terms.
I already proved this by showing how the difference in BF5 was 6FPS at 4k max settings on the 5700XT vs the 5700; the gap is 1.8TF but in % it is actually bigger than the xbox series X vs the PS5.
Because from 7.9TF to 9.6TF there is a 21% gap in power.
PS5 vs xbox there is only 18.3%, so in % terms the 1.8TF difference between the 5700XT and the 5700 is actually bigger than xbox vs PS5. Now, my arguments are about power and have nothing to do with bandwidth, nor do they take it into account, because that is another 2 cents; bandwidth doesn't increase power, it allows you to use the maximum power you have. Giving the series X 1,000GB/s would not turn it into a monster, it would just be a GPU with too much bandwidth.
0.52TF was enough to make the PS4 double the xbox in resolution, almost double it in frames, or have both faster frames and higher pixel counts as well, as is the case with doom, which is a pretty optimized game and is one of the biggest gaps the PS4 has over the xbox.
0.52TF = 40% more GPU power.
1.8TF = 18.3% more GPU power; the raw number is irrelevant.
Look at the gap between the 7850 and 7970 for that 1.8TF gap and compare it vs the 5700XT vs 5700, but look also at how the 7970 and 7950, which are much closer in performance, compare: 6FPS, lol, the same gap as from the 5700XT to the 5700.
Your argument is a joke and you are a totally dishonest buffoon who would argue even against himself if it helped the xbox.
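For what it's worth, here is a minimal sketch of the percentage arithmetic being argued over, using the TFLOPS figures quoted in this thread (the numbers are the thread's commonly cited ones, treated as assumptions rather than fresh benchmarks):

```python
# Minimal sketch of the relative-gap arithmetic. TFLOPS figures are the ones
# quoted in the thread, treated here as assumptions rather than measurements.

def gap_pct(faster_tf: float, slower_tf: float) -> float:
    return (faster_tf / slower_tf - 1) * 100

print(gap_pct(1.84, 1.31))    # PS4 vs Xbox One: ~40%
print(gap_pct(12.15, 10.28))  # Series X vs PS5: ~18%
print(gap_pct(9.6, 7.9))      # RX 5700 XT vs RX 5700: ~21.5%
```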
I'm very happy in the knowledge for the next several years I will own the most powerful console.
I'm also happy in the knowledge when it comes to performance I'll have ownage over people who make threads like this one. (there wont be many)
I am happy in the knowledge that if I sit 8 feet from my TV I would not see any difference, according to some expert who made a thread here a few years ago.
@tdkmillsy said:
With all the discussions around how much better 1080p resolution on the PS4 is than 720p/900p resolution on the Xbox One. How much of a benefit it is depends on how far you sit away from your TV. I sit 8.5 feet from a 50" TV so will see very little difference between 720p and 1080p and no difference between 900p and 1080p.
How far do you sit from the TV?
@tdkmillsy said:
That's exactly what I'm saying. Its well known in the TV world 1080p is only beneficial when distance and/or screen size reaches a certain level. I'm suggesting if you play in the living room then there's a good chance you wouldn't notice.
https://www.gamespot.com/forums/system-wars-314159282/how-far-do-you-sit-from-your-tv-is-1080p-really-th-30856016/?page=1
I am also happy in the knowledge that when the comparisons come I will not make such pathetic damage control threads like you did.
If you didn't die at 720p, I will be incredibly fine at 4k.
😂😂😂😂
So now you are the new Ronvalencia; you refuse to answer a question to which you KNOW the answer.
If the system sustains 2.23ghz, what is the GPU gap vs the Xbox?
Don't run; you can't give validity only to the parts of DF's reporting that you like.
But what if developers aren't going to optimise specifically to PlayStation 5's power ceiling? I wondered whether there were 'worst case scenario' frequencies that developers could work around - an equivalent to the base clocks PC components have.
"Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing," Mark Cerny counters. "I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle.
That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
This is the part he refers to in the video. As you can see, sony didn't refuse to answer, he didn't ask it in the way he makes it look in the video, and in fact Cerny's reply is correct.
As consoles work now, they run at a constant frequency and power is variable, which is why some games heat up consoles more than others. It's the reason why, when gears of war came out on the 360 in 2006, it killed even more 360s, because up to that point no game had pushed the hardware so hard.
But it's funny, because in the very video you posted he says Mark Cerny told him that a 10% drop in power would translate into a minor drop in frequency, and he goes further and says a 2, 3 or 4% reduction in power would bring an even smaller drop.
Again, frequency is not 1 to 1 with a power drop. If sony drops the power at worst by 10%, you will probably see about a 5% drop in frequency; now take 5% off the 2.23ghz clock and see where you land.
2,230mhz - 5% = ~2,118mhz, in other words about 2.12ghz. The performance drop you will get from this is nothing, 1 frame if anything.
This ^^ is the worst-case scenario and the reason I say you are grasping. Again, any REAL PC gamer who has experience OCing their GPU would tell you that a 5% reduction in frequency means total shit: 1 frame, 2 at most, maybe nothing at all.
He also treats the 2.0ghz speed as valid when it was nothing but testing; in fact there was another test done at 1.8ghz as well.
The Gonzalo leak back in April suggested that PlayStation 5 would feature a Zen 2-based CPU cluster running at 3.2GHz paired with a Navi graphics core running at 1.8GHz.
https://www.eurogamer.net/articles/digitalfoundry-2019-playstation-5-xbox-series-x-spec-leak-analysed
Your final specs are not final until they are locked down, so testing is always done. Even the xbox one upclocked at the last minute, and that was from a confirmed speed, unlike the PS5 where it was a leak of a test. So 2.0ghz wasn't the target speed, it was just a test, just like sony tested memory and one of those tests showed even faster bandwidth than the xbox series X, so how come it is now slower?
So they increased power and decreased bandwidth?
No, it was just a test, and I am sure sony didn't choose that faster bandwidth because the gain it probably saw from OCing the GPU and having even more bandwidth than the xbox would not yield much better results. After all, you can give a 7770 the bandwidth of the 7970 and it would mean shit in the end.
Power doesn't come from bandwidth; it comes from speed and the number of CUs.
The lowest clock the GPU will drop to will probably be around 2.12ghz; that is my estimation based on DF's own claims, and that is the worst case. Dropping power by 5% will probably yield a 2% drop in frequency, which would amount to nothing, and yes, take that into account.
This is the gap 1.8TF delivers in FF15 between the 5700XT and 5700; again, dropping the clock speed by 5% is not even close to a 1.8TF drop in power.
Now again, if the GPU is sustained at 2.23ghz like DF claims, what is the actual GPU gap vs the xbox series X?
Why would I run? I'm not claiming or debating anything. I'm asking the questions. Anyway, irrelevant to the questions as always.
So, for the 30th time, what are the base clocks for PS5's CPU and GPU? Why did Cerny refuse to answer when asked point blank?
In case you still have difficulty understanding it, what's the lowest the clocks can go on each component to compensate for the full clocks of the other?
Lastly, for the 22nd time, where are the sources for GDDR is better than DDR for CPUs?
I'm very happy in the knowledge for the next several years I will own the most powerful console.
I'm also happy in the knowledge when it comes to performance I'll have ownage over people who make threads like this one. (there wont be many)
I am happy in the knowledge that if I sit 8 feet from my TV I would not see any difference, according to some expert who made a thread here a few years ago.
@tdkmillsy said:
With all the discussions around how much better 1080p resolution on the PS4 is than 720p/900p resolution on the Xbox One. How much of a benefit it is depends on how far you sit away from your TV. I sit 8.5 feet from a 50" TV so will see very little difference between 720p and 1080p and no difference between 900p and 1080p.
How far do you sit from the TV?
@tdkmillsy said:
That's exactly what I'm saying. Its well known in the TV world 1080p is only beneficial when distance and/or screen size reaches a certain level. I'm suggesting if you play in the living room then there's a good chance you wouldn't notice.
https://www.gamespot.com/forums/system-wars-314159282/how-far-do-you-sit-from-your-tv-is-1080p-really-th-30856016/?page=1
I am also happy in the knowledge that when the comparisons come I will not make such pathetic damage control threads like you did.
If you didn't die at 720p, I will be incredibly fine at 4k.
Specifically on resolution, experts suggested you couldn't see the difference when sitting a set distance from the TV. I backed up my claims with expert sites.
I then got shot down with the claim that the fact the impact makes no difference at standard living-room seating distances doesn't matter, because in System Wars the fact that it's better means everything.
Call it damage control all you want; it's only the same thing as you are doing now with the PS5.
The really sad thing is that you pulled up a thread from 6 years ago to try and prove a point that, while it is true (there is no difference seated a certain distance away according to experts, and just on resolution), had already been shown as irrelevant in these forums.
Sad man Sad.
@tormentos: Poor you. Now you don't even know what damage controlling is. Let me help you out with that. Everything you have posted in the past 6 weeks is textbook damage control. There is LITERALLY nothing for me to damage control. There is nothing for any Xbox fanny to damage control because they have the fastest next gen system.
You have yet to give the base clocks of the PS5. You still falsely believe that the base clocks are the boost clocks despite Sony explicitly stating that it's the max and that it's variable. But you are still falsely arguing that it doesn't go lower. Damage control is all you know. It's going to be a very rough generation for you.
The facts are against you and no amount of deflecting, essay writing, damage control, or flashbacks to the 360 or 2013 is going to change that. The PS5 is the weaker of the two and we still don't know how weak it can get because the base clocks have NOT been disclosed.
Actually you are the one who doesn't get it, but I don't expect you to get it since I already exposed how your double standard works and how much of a hypocrite you are, and selective as well.
The difference is simple: I am always here and I always have been. I was here for 360 vs PS3, ps4 vs xbox one and Pro vs XBO X, just like I am now. This has always been my posting style; for you it is damage control, for me it is normal posting.
I have been challenging pathetic notions here for close to 20 years, so there is no damage control; damage control is what I have quoted from many lemmings.
It would be damage control if I was saying there would be no gap, or that the PS5 would magically close the gap using some pathetic api or workaround.
I mostly know what this gap will be, and you know it too, although you pretend to be stupid, act like you don't know, and tell people no one knows how big or small it will be. Yes, we know it will be minimal, but that doesn't work for your narrative.
That's cool. Which part addresses my comment and what are the base clocks?😎
I'm very happy in the knowledge for the next several years I will own the most powerful console.
I'm also happy in the knowledge when it comes to performance I'll have ownage over people who make threads like this one. (there wont be many)
I am happy in the knowledge that if I sit 8 feet from my TV I would not see any difference, according to some expert who made a thread here a few years ago.
@tdkmillsy said:
With all the discussions around how much better 1080p resolution on the PS4 is than 720p/900p resolution on the Xbox One. How much of a benefit it is depends on how far you sit away from your TV. I sit 8.5 feet from a 50" TV so will see very little difference between 720p and 1080p and no difference between 900p and 1080p.
How far do you sit from the TV?
@tdkmillsy said:
That's exactly what I'm saying. Its well known in the TV world 1080p is only beneficial when distance and/or screen size reaches a certain level. I'm suggesting if you play in the living room then there's a good chance you wouldn't notice.
https://www.gamespot.com/forums/system-wars-314159282/how-far-do-you-sit-from-your-tv-is-1080p-really-th-30856016/?page=1
I am also happy in the knowledge that when the comparisons come I will not make such pathetic damage control threads like you did.
If you didn't die at 720p, I will be incredibly fine at 4k.
Specifically on resolution, experts suggested you couldn't see the difference when sitting a set distance from the TV. I backed up my claims with expert sites.
I then got shot down with the claim that the fact the impact makes no difference at standard living-room seating distances doesn't matter, because in System Wars the fact that it's better means everything.
Call it damage control all you want; it's only the same thing as you are doing now with the PS5.
The really sad thing is that you pulled up a thread from 6 years ago to try and prove a point that, while it is true (there is no difference seated a certain distance away according to experts, and just on resolution), had already been shown as irrelevant in these forums.
Sad man Sad.
See @Pedro now this is damage control at its best and you were not there to see it.
Actually you are the one who doesn't get it, but I don't expect you to get it since I already exposed how your double standard works and how much of a hypocrite you are, and selective as well.
The difference is simple: I am always here and I always have been. I was here for 360 vs PS3, ps4 vs xbox one and Pro vs XBO X, just like I am now. This has always been my posting style; for you it is damage control, for me it is normal posting.
I have been challenging pathetic notions here for close to 20 years, so there is no damage control; damage control is what I have quoted from many lemmings.
It would be damage control if I was saying there would be no gap, or that the PS5 would magically close the gap using some pathetic api or workaround.
I mostly know what this gap will be, and you know it too, although you pretend to be stupid, act like you don't know, and tell people no one knows how big or small it will be. Yes, we know it will be minimal, but that doesn't work for your narrative.
That's cool. Which part addresses my comment and what are the base clocks?😎
Probably in Cerny's storage, next to the secrets of time travel.
I'm very happy in the knowledge for the next several years I will own the most powerful console.
I'm also happy in the knowledge when it comes to performance I'll have ownage over people who make threads like this one. (there wont be many)
I am happy in the knowledge that if I sit 8 feet from my TV I would not see any difference, according to some expert who made a thread here a few years ago.
@tdkmillsy said:
With all the discussions around how much better 1080p resolution on the PS4 is than 720p/900p resolution on the Xbox One. How much of a benefit it is depends on how far you sit away from your TV. I sit 8.5 feet from a 50" TV so will see very little difference between 720p and 1080p and no difference between 900p and 1080p.
How far do you sit from the TV?
@tdkmillsy said:
That's exactly what I'm saying. Its well known in the TV world 1080p is only beneficial when distance and/or screen size reaches a certain level. I'm suggesting if you play in the living room then there's a good chance you wouldn't notice.
https://www.gamespot.com/forums/system-wars-314159282/how-far-do-you-sit-from-your-tv-is-1080p-really-th-30856016/?page=1
I am also happy in the knowledge that when the comparisons come I will not make such pathetic damage control threads like you did.
If you didn't die at 720p, I will be incredibly fine at 4k.
Specifically on resolution, experts suggested you couldn't see the difference when sitting a set distance from the TV. I backed up my claims with expert sites.
I then got shot down with the claim that the fact the impact makes no difference at standard living-room seating distances doesn't matter, because in System Wars the fact that it's better means everything.
Call it damage control all you want; it's only the same thing as you are doing now with the PS5.
The really sad thing is that you pulled up a thread from 6 years ago to try and prove a point that, while it is true (there is no difference seated a certain distance away according to experts, and just on resolution), had already been shown as irrelevant in these forums.
Sad man Sad.
See @Pedro now this is damage control at its best and you were not there to see it.
Was it you who said PS5 /SX couldn't possibly have SSD??
Ok, I made this thread because it is far too obvious to me now that lemmings are not happy with the xbox series X being 12TF; they really wanted the xbox to be a minimum of 50% more powerful than the PS5.
This is quite obvious from the incredible bending of specs, trying to make gaps look larger than they will turn out to be.
Now, I just replied to Naviguy using a theory to increase the gap based on Cerny's comparison of GCN vs RDNA2, but that is just the latest in what has become the norm around here.
From Ronvalencia comparing the gap between the PS4 and a 7970, which is 1.9TF, to claim the gap between the xbox and PS5 is big because it is 1.8TF, completely ignoring that the 7970 is actually about double the PS4 in relative terms, to other lemmings like Blackshirt20 claiming the gap is 30% based on nothing but his opinion, an opinion shared by several other lemmings as well.
Others say the gap is a complete PS4, as if that meant anything; when the PS4 launched it was more than 2 xbox 360s more powerful than the xbox one. Gaps aren't measured that way, which even Ronvalencia has used as an argument as well.
Or: the xbox series X will have much more raytracing because it has more CUs and, according to some, RT scales with shader power, completely ignoring that the more the xbox pushes RT the more it will suffer, as the RTX 2080TI has demonstrated, which also packs more power than the xbox series X and still has to drop to 1080p in minecraft.
If anything this post shows that even while having the power advantage, xbox fans simply are not happy with the gap and they want to blow it up as much as they can. Be happy with what you have, a pretty powerful machine that will have superior hardware; don't oversell it, so that when the machine comes out you people don't have to make stupid excuses as to why the gap is not materializing.
Just my point of view.
And yes, I rate this thread 9.2 out of 10.28, just in case.😊
I think most people are happy that the Xbox Series X is 12.15 TF.
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
Well the CAS latency is nearly identical now between GDDR and DDR and you have an immensely larger amount of bandwidth at your disposal with GDDR so I would assume it's better all around..
The argument for GDDR being better for the CPU is useless when the PS5 needs to reduce CPU usage to enable the 2230 Mhz GPU clock.
A CPU processing raster graphics is not optimal.
The argument is not useless. DDR is superior to GDDR for CPU workloads.
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
Well the CAS latency is nearly identical now between GDDR and DDR and you have an immensely larger amount of bandwidth at your disposal with GDDR so I would assume it's better all around..
The argument for GDDR being better for the CPU is useless when the PS5 needs to reduce CPU usage to enable the 2230 Mhz GPU clock.
A CPU processing raster graphics is not optimal.
The argument is not useless. DDR is superior to GDDR for CPU workloads.
Unless you have all of the specifications for timings and relative latencies I'm just not buying it.
Lastly, for the 18th time, where are the sources for GDDR is better than DDR for CPUs?
Well the CAS latency is nearly identical now between GDDR and DDR and you have an immensely larger amount of bandwidth at your disposal with GDDR so I would assume it's better all around..
The argument for GDDR being better for the CPU is useless when the PS5 needs to reduce CPU usage to enable the 2230 Mhz GPU clock.
A CPU processing raster graphics is not optimal.
The argument is not useless. DDR is superior to GDDR for CPU workloads.
Unless you have all of the specifications for timings and relative latencies I'm just not buying it.
I do, as in it's easy to google. But the point is not numbers vs numbers rather which memory type is suited for which workloads. GPUs need higher throughput. CPUs need low latency. Unless you have a memory type that has the best of both worlds, CPU and GPU will use different memory types.