Hasn't Phil Spencer now said twice that it won't be a dramatic change?
@FoxbatAlpha: Well I think anyone who believes the Xb1 was not built to take advantage of DX12 is out of their mind. That being said, even though Phil's statement about the gains is pretty vague, it's also correct. It's not like the Xb1 is gonna be able to do anything higher than 1080p. Take GTA V for example: if DX12 came out and the only thing that got boosted was being able to match the PS4's extra foliage, we wouldn't consider that a dramatic boost. Like Spencer said, they knew what DX12 was doing when they built the Xbox; they knew older cards would get a nice bump from the performance gains of DX12 and you would need less power to get good results. No use in fitting a giant GPU when you are building an API to boost the performance of pretty much all GPUs... it's kind of self-explanatory. It's not like the Xb1 is gonna get DX12 and all of a sudden be able to play 4K games, but if you look at M$'s vision for the Xbox some things speak for themselves. Devs will most likely get more control over the system and be free to figure out new ways to get more out of the Xb1. I read about devs wondering why M$ started the XB1 at DX11 when it's built for DX12, basically wondering why they started with an API that could not take full advantage of the box's full potential.
MS built the Xb1 with cloud computing in mind. We hear cows like tormentos always talking about GPGPU on the PS4, but if the cloud takes off like the Crackdown devs are saying, then no amount of GPGPU on the PS4 can compete with that, essentially meaning the XB1 won't have to rely on GPGPU because it can offload to the cloud. No sacrificing GPU hardware for GPGPU. Resolution may not change, but in-game content such as AI (i.e. Titanfall) and the amount of objects on screen could get a bump, better physics, etc. So my judgement hangs on Crackdown: if the cloud works as it should, then later in the gen, when games become more demanding and the PS4 has to allocate more GPU power for compute, we'll probably see both consoles with matching resolutions, but the difference comes in from the CPU standpoint. Now if the cloud stands true, we'll most likely see better fps from Xbox because the CPU will have less work to do, and the CPU does not bottleneck the system like it does on PS4.
Cows are not really dumb, they are just insecure. They know the PS4 will eventually lose its GPU advantage when GPGPU is thrown into the equation; they know that aside from companies like Google and Apple, MS is the only company who could really make cloud computing a reality; and DX12 is the industry standard, already supported by UE4 and Unity. At this point the extra 180p that the PS4 has to offer is not a factor. It's gonna be just like last gen: games made for PC and Xbox first, then the PS gets the port last.
@tormentos: Umm.....you can't read, that was the first tweet I quoted which also contained a link to the original tweet.
Ok so yea, consoles have had low-level APIs, but there is still work to be done on spreading the work evenly between cores. I guess you forget that Sony was looking to hire someone to work on this same concept not too long ago. Like I said above, the bump in XB1 gains probably won't be as big as some think, but there are other things at play besides just DX12 and graphics. Graphics is one piece of the puzzle; if you can fill your worlds with smarter AI and more stuff, that makes a bigger impact than pretty graphics.
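(Rough illustration of the "spread work across cores" idea, since that is the part low-level APIs like DX12 are supposed to help with. This is a plain-Python toy, not real D3D12 code; the draw-call list, worker count, and record function are all made up for the sketch.)

```python
# Toy sketch only: instead of one thread recording a whole frame's commands,
# the recording is split evenly across several worker threads. Real engines
# do this with per-thread command lists; here a "command list" is just a list.
from concurrent.futures import ThreadPoolExecutor

DRAW_CALLS = list(range(8000))   # pretend these are one frame's draw calls
WORKERS = 6                      # e.g. the CPU cores a game might get

def record_chunk(chunk):
    # stand-in for "record these draws into this thread's command list"
    return [("draw", d) for d in chunk]

def split_evenly(items, n):
    # deal items round-robin so every worker gets a near-equal share
    return [items[i::n] for i in range(n)]

with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    per_thread_lists = list(pool.map(record_chunk, split_evenly(DRAW_CALLS, WORKERS)))

# the per-thread lists would then be submitted together, which is roughly
# what "multithreaded command submission" means in the low-level-API talk
print(sum(len(c) for c in per_thread_lists), "commands recorded across", WORKERS, "threads")
```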
@daious: you're talking about misinformation but didn't give me any information. Enlighten me on how they are different; I showed you how they are the same. Where this site gets mixed up is businesses vs. consumers: you think because they are not selling to you, the consumer, big businesses can't get it either. Stop thinking everything is from that site just because you only wait for negative info. They are the same, just smaller scale on the X1.
Thank you for confirming that your only source is misterxpedia and that you don't have any information outside of that site to support it.
You have zero evidence to support your claim. Zero. Nowhere else does it say that HBM is the same. Nowhere. Don't you find it oddly suspicious that AMD, MS, Hynix, and every other technology site have said nothing about it?
HBM has never been mass-produced before; this year is the first. Hell, development started in 2010, and the partnership and adoption only started at the end of Oct 2013 (after the Xbox One specs were already done). It is not out right now. It wasn't out in 2013. It wasn't even finalized in Oct 2013.
The Xbox One does not have HBM.
@lostrib: Damn homie, you were at 38,990 posts yesterday and now you're at 40,015 today and counting........ WHAT THE **** ARE YOU DOING WITH YOUR LIFE? Do you understand what the word drastic means? Any improvement is a win.
lol he would have to post like 27 times every day for nearly 4 years to get to what he has now.
@slimdogmilionar: Thanks! I wasn't expecting such a detailed point on it. I always like your knowledge on these things! ;)
@daious: I never got that info from misterx. I also don't bookmark or copy and paste info; I read that off of their site when HBM was first being introduced. You might want to start there. I'm done arguing with you; they specifically said their HBM was in consoles, and it's not in the PS4, so who's left?
So I guess I was right. Thank you for confirming it.
I asked you to provide evidence. You have provided none.
As I said before: you have zero evidence to support your claim. Zero. Nowhere else does it say that HBM is the same. Nowhere. Don't you find it oddly suspicious that AMD, MS, Hynix, and every other technology site have said nothing about it?
You would think it would be easy for you to find at least one link stating this. No tech site has ever claimed what you are saying. Even AMD hasn't and MS hasn't.
Come on. HBM wasn't finalized before Xbox One production. It's a fact; no ifs, ands, or buts about it. AMD and Hynix only announced their joint venture and collaboration a year ago. They announced they were working on it in December 2013 (after the consoles had shipped). Production doesn't happen until this year.
Stop making things up.
@daious: Damn, am I supposed to do your work for you? I already know they are the same, but since you can't find it through your usual negative sites you want me to find it for you? Stop it and go read Hynix's HBM info.
Because your source does not exist. Nowhere does it say it. MS didn't say it. AMD didn't say it. Hynix didn't say it. No tech site says it. You have zero evidence to support your claim. There is no evidence to support your claim.
AMD and Hynix announced that they are developing jointly in December 2013. HBM was still in development in 2014.
Then in June 2014 at the Hynix conference they announced that HBM was mostly finalized. Hynix also claimed that 21 design-ins were in progress, with the first expected out within the next year (i.e., 2015).
Stop making things up.
Both the Xbox One GPU and CPU were built before DX12. There is nothing freaking secret about the Xbox One; its GPU was freaking sliced in half just like the PS4's. There is nothing secret inside the Xbox One.
DX12 is MS's attempt to bring console efficiency to PC. That is why it has been demoed every freaking where except on the Xbox One, which is the platform it supposedly comes from. DX12 was demoed on PC and on Surface, which is basically a Windows tablet as well. If DX12 brought any big changes on Xbox One, MS would be parading it by now, like they always do.
When GPGPU is thrown into the equation, buffoon, the Xbox One would still be behind or suffer crippled effects. Like I already told you, if you use 4 CUs for particle effects, with what the hell will you counter that on Xbox One? From where will the Xbox One pull 400+ GFLOPS to compensate for that?
You should really answer me this: if Ubisoft uses 400 GFLOPS for compute in a game, where will they get that on Xbox One without actually hurting the graphics?
You seem to think that the PS4 using 400 GFLOPS for something will not actually yield any results. I am in shock as to how you came to that blind and wrong theory.
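(For anyone wondering where the ~400 GFLOPS figure comes from, here is the back-of-the-envelope GCN math, using the commonly cited CU counts and clocks; treat the numbers as approximate.)

```python
# Rough GCN arithmetic: FLOPS = CUs x 64 lanes x 2 ops/clock (FMA) x clock.
# CU counts and clocks below are the commonly cited figures for each console.
def gflops(cus, mhz, lanes=64, ops_per_clock=2):
    return cus * lanes * ops_per_clock * mhz / 1000.0

print("PS4 (18 CUs @ 800 MHz):      ", gflops(18, 800), "GFLOPS")  # ~1843
print("Xbox One (12 CUs @ 853 MHz): ", gflops(12, 853), "GFLOPS")  # ~1310
print("4 PS4 CUs set aside:         ", gflops(4, 800), "GFLOPS")   # ~410
```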
@daious: Haven't AMD and MS had an R&D deal since 2012? Didn't AMD help with designing the X1? Isn't Hynix memory in the X1? I don't make shit up; go to the Hynix site, they specifically said game console, and the ONLY game console that has Hynix memory is the Xbox One. Beta test for the future.
Again you fail to provide any evidence that the Xbox One has HBM.
Hynix memory is found everywhere. NVIDIA uses it. AMD uses it. Everyone uses it.
I just looked at and read about the Hynix HBM conference in June 2014, where they talked about it being almost completely finalized. They talked about the 20+ HBM devices currently in progress, and about how the first is going to be released in 2015.
Stop making things up.
You have no evidence to support anything that you are saying. Zero evidence from Hynix, MS, and AMD.
ESRAM has been around for a while. It is not the same as HBM. The HBM design wasn't even finalized before the consoles were out. I provided evidence of this.
Stop lying.
He is used to pulling bullshit from his buns wherever he has the chance, and no links, just his word for it..hahahaha...
This is the lamest way to say "I don't have a source other than that misterX media crap"..hahaha
ESRAM is a cheap memory patch to fix the Xbox One's bandwidth deficiencies, and it is too small, like many developers have already confirmed.
I provided evidence that HBM was not even finalized in June 2014. Straight from the source. He claims the Xbox One has HBM. Zero source and zero evidence. No news site, no tech site, no information from AMD, no information from MS, no information from Hynix. No information anywhere.
The fact that he thinks esram is the same as HBM shows his tenuous grasp of technology.
I just hate it when people make things up.
Drives me insane. Why do people do this?
@daious: go to the site...... and I didn't know AMD and NVIDIA were making consoles..... stop it, you're killing me, Smalls.
Again, your reading comprehension is horrendous. AMD and NVIDIA both use Hynix memory. Hynix memory is extremely popular; it's utilized everywhere. I never said NVIDIA is making a console. You might want to go back and read what I said before posting.
Zero source and zero information on your claims. I just read the information from Hynix summit conference on HBM in June 2014. A whole conference on HBM.
You just consistently make up information. The Xbox One uses ESRAM, not HBM.
Stop lying and learn to read.
I have already provided the evidence and links that show that you are completely wrong. You have provided nothing to even remotely suggest that you are right.
I don't understand how fanboys can truly be this delusional. How can something not even finalized until the second half of 2014 and not even in production be in an xboxone?
@daious: tell me something then, D, it tells you right on the site. Patchwork for DDR3? Well, when the Pascal and AMD GPUs come out still using DDR3 for the CPU and HBM for the GPU, will it be patchwork then? GDDR5 is heading out the door, just get with the times. I know where tech is heading; you're the one who seems lost.
Trying to shift the topic.
Jesus Christ, do you even know how to write sentences? That was painful to read. The first sentence was cringeworthy.
I provided you the sources that HBM was not done in 2014 and first products will be out in 2015. I provided evidence that AMD and Hynix joint development was announced in December 2013 and had goals to finish developing the product in 2014.
Where is the simple source that says HBM is in the Xbox One? ESRAM is a different technology than HBM. Embedded memory is nothing new: eDRAM was in the GameCube, Wii, PlayStation 2, and Xbox 360, and the Xbox One uses embedded SRAM.
Microsoft isn't saying it. AMD isn't saying it. Hynix isn't saying it. Technology sites aren't saying it. And I am supposed to believe someone who can barely write a coherent sentence and who can't even muster up a single source? You're wrong. Stop being a fanboy.
Please enlighten me with the tons of sources and statements that say you're right.
I know where his argument comes from. I remember it now: it is the silly notion that ESRAM is stacked memory. I already argued this shit with him; he doesn't know that ESRAM isn't stacked memory but embedded memory, which isn't the same, just like the eDRAM on 360 was embedded.
The same shit happened with hUMA, which they also confused with UMA on the Xbox 360.
UMA = Unified Memory Architecture.
hUMA = heterogeneous Uniform Memory Access.
Hahaha, lemmings are really sad...
I honestly think he doesn't know that the E in edram stands for embedded.
I am just against people who are lying and making things up. People who assume things and make crazy conclusions that go against everything drive me insane.
Sometimes you go overboard and full cow mode but you aren't currently making up things like the other user is doing.
Spreading misinformation only hurts your console and the other fans of that console here.
@daious: just go read the Hynix site, it's the same tech, it's just not stacked. This back and forth thing is lame.
Thank you for continuing to provide zero evidence for your claim and proving I am right.
I already went through the Hynix information. I already linked you the information. You can't provide any information supporting anything you said.
The fact that you think esram is the same as HBM shows how you don't understand the topic at all.
Please proceed to continue replying. I will be happy to point out how wrong you are everytime. I will also be more than happy to point out that you have zero support for your argument. I can do this all day.
@daious: I never said ESRAM was stacked memory, I said it was high bandwidth memory. Small package and high bandwidth is high bandwidth memory.
Thank you for proving that you don't know what Hynix and AMD's HBM is by claiming that having high bandwidth is the same as being HBM. I am glad we cleared this up. I see that your confusion has to do with your reading comprehension and lack of understanding that HBM stands for a type of ram and not a general statement that means fast ram.
HBM isn't a general term to describe fast things. It's a type of technology being co-developed by AMD and Hynix that will see the light of day in 2015.
Christ.
To make it really simple for you to understand, I came up with a hypothetical situation. Let's say Ford has a car called "Super Fast Car (SFC)". If Honda has a fast car, does that mean it is a "Super Fast Car (SFC)"? No, it doesn't, because an "SFC" is a car made by Ford.
I can try to make a simpler comparison if you need me to.
MOOOOO did you not read the tweets from my post earlier or did you just do some selective reading?
Spencer said they knew what DX12 was doing when they built the Xbox One. The link that I posted even said the Xbox One was not using all of its features, and that some would not be usable until DX12.
When GPGPU is thrown into the equation? Well, like I said earlier, it depends on Crackdown and how successfully they manage the cloud; if they pull off some groundbreaking stuff you can expect others to follow. No matter how you slice it, the 400 GFLOPS of GPGPU you boast about for PS4 can't compare with cloud computing. So it basically depends on the direction devs go this gen: if they go cloud computing, then that 400 GFLOPS you're talking about is gonna look like shit compared to that.
Basically, look at a game like Titanfall and how much processing the cloud is doing with no lag at all. You have tons of AI being controlled: when you hop out of your Titan and it's on autopilot, that's handled by the cloud; your Titan's AI system that notifies you of threats, etc. is handled by the cloud; and still none of the lag everyone says the cloud will bring. So imagine how much it would tax the system if the CPU had to handle grunts, spectres, Titan autopilot, calling your Titan to the field, and horde mode, and think how much the cloud is doing there when it's controlling player Titan AI, enemy AI, over a dozen mortar Titans, plus regular Titans. That's a lot of work taken off of the system.
Hopefully devs actually start to use the cloud. If Ubisoft ran a test of PS4 GPU compute vs Azure compute, which do you think would win? It's looking like they will, though: with Epic on board with Cloudgine and the recent announcement of UE4 supporting DX12, I can't see them not using the cloud. If Unreal is supporting DX12 and the cloud is supporting DX12, it's only a matter of time; if you have your game engine and API both supporting it, why not use it?
Lol, nobody said the PS4 won't benefit from GPGPU, but as it stands it's still barely hitting 1080p with stable fps. What will the resolution/fps be if they do have to spare GPU time for compute?
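(Both sides here are really arguing about latency tolerance, so here is a toy way to frame it. Every number and task name below is an assumption for illustration, not a measurement.)

```python
# Toy classifier: work that must finish inside a frame has to stay on the
# console; work that can absorb an internet round trip is at least a
# candidate for server-side ("cloud") compute. All figures are assumptions.
FRAME_MS = 16.7            # one frame at 60 fps
ROUND_TRIP_MS = 60.0       # an assumed internet round trip

tasks = {
    "per-frame rendering / effects": FRAME_MS,   # deadline in ms
    "player hit detection":          FRAME_MS,
    "ambient AI squad planning":     500.0,
    "large-scale destruction sim":   250.0,
}

for name, deadline_ms in tasks.items():
    where = "must stay local" if deadline_ms <= ROUND_TRIP_MS else "offload candidate"
    print(f"{name:32s} -> {where}")
```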
@slimdogmilionar: agree with most of that. This may come as a shock to PS fans, but it doesn't matter who wins this gen. MS is preparing for the real war. How long before Google and Apple join the party...
MS is creating a Windows/Xbox walled garden that it will leverage against iTunes and Google Play. I wouldn't be surprised if they bought Valve in the next few years.
The future is high integration/engineering/design, where the cost of entry starts in the billions. Sony doesn't have the money or the talent to keep up.
@daious: Fam, get the **** out of here, IT'S HYNIX TECH.... and it's high bandwidth memory. Yes, it's not stacked, but it's still HBM; the tech that MS is using for their eSRAM is HBM made by HYNIX. Lil guy, don't come to me trying to sound smart, you're only hurting yourself........ you sound like torm, and that's a complete ass.
Again, you are wrong. Hynix tech is in tons of different products: it's in AMD products, it's in NVIDIA products, it's in tons of different products. Just because ESRAM is "Hynix tech" doesn't mean it's Hynix's HBM RAM.
Jesus, you really can't read and understand basic concepts.
HBM is not a descriptive term that covers all fast/high-bandwidth RAM made by Hynix. Just because something is fast and made by Hynix doesn't mean it's HBM. HBM is a separate piece of technology that Hynix and AMD are developing. Certain Hynix RAM can be high bandwidth but not HBM, because AMD/Hynix named their new type of RAM technology HBM. It's a type of RAM. It isn't eDRAM, it isn't GDDR5, it isn't any RAM design but its own. HBM is a specific type of stacked DRAM; if it isn't that specific type of stacked DRAM then it's not HBM.
HBM is a new design. Just because ESRAM is fast or high bandwidth doesn't mean it's HBM.
I will make my example even dumber for you to understand.
Let's say Ford has a car called "Super Fast Car (SFC)". If Ford has another fast car, like the Mustang, does that mean it is a "Super Fast Car (SFC)"? No, it doesn't, because an "SFC" is a specific type of car made by Ford.
Cheetahs and jaguars are both types of cats. Just because a jaguar is fast doesn't make it a cheetah, because they are completely different species. ESRAM can be fast, but it isn't HBM, because HBM is a specific type of technology.
Some people just can't read.
Still waiting on your evidence. Will it ever come? Nope, because you are making things up. You are reading HBM literally as "high bandwidth memory", but HBM is its own type of technology. It isn't a measurement of speed; it's a type of memory.
Jesus, I can't stand people who make up things on these forums.
@daious: man, if you don't stop with these wack-ass analogies I'm going to slap you myself. EcoBoost is the same tech whether it's in a Ford Fusion or the new GT40; it's still the same tech by the same people. ESRAM is the same tech as HBM, just not stacked, made and designed by the same people, which is AMD and Hynix. Stop making yourself look foolish: first it was the Honda and Ford analogy, and when that sounded dumb you tried a Ford and Ford analogy and it got plain ridiculous lol.
Sorry I had to dumb it down for you to understand.
You claimed ESRAM was HBM. You provided zero evidence of this. You are making assumptions with no supporting evidence except that they share the same manufacturer.
Certain Hynix RAM can be high bandwidth but not HBM, because AMD/Hynix named their new type of RAM technology HBM. It's a type of RAM. It isn't eDRAM, it isn't GDDR5, it isn't any RAM design but its own. HBM is a specific type of stacked DRAM; if it isn't that specific type of stacked DRAM then it's not HBM.
You claimed that the Xbox One had HBM, which is false. I provided evidence that it doesn't. You have not provided anything.
Stop making things up.
Hynix makes tons of RAM products: several types like DRAM, SDRAM, PSRAM, DDR, DDR2, DDR3, GDDR3, GDDR5, embedded RAM, and now HBM. They don't just make ESRAM and HBM. Did you not realize they make dozens of different memory products? Christ. Your claim that ESRAM is HBM is wrong. Just because it's the same manufacturer doesn't mean it's the same type of RAM. ESRAM is embedded; HBM is not. HBM is the name for a specific type of RAM that they are developing. All you do is assume and provide zero evidence.
Come on, give me one shred of evidence that points to you being correct. A shred. There has to be some; AMD, Hynix, or MS would have to have announced it. Oh wait, there is none, because you are making it up. You can't even consider that your assumptions may be incorrect. You keep on stating lies without supporting them.
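(Since the two keep getting conflated, here is the side-by-side in plain terms. The figures are the commonly published first-generation numbers and should be read as approximate.)

```python
# Side-by-side of the two memory types being argued about in this thread.
# Figures are approximate, commonly published values.
parts = {
    "Xbox One ESRAM": {
        "construction": "embedded SRAM on the same die as the APU",
        "capacity":     "32 MB",
        "peak b/w":     "~109 GB/s each way (~204 GB/s combined, best case)",
        "stacked DRAM": "no",
    },
    "HBM (JESD235, 1st gen)": {
        "construction": "stacked DRAM dies next to the GPU on an interposer",
        "capacity":     "1 GB per stack",
        "peak b/w":     "~128 GB/s per stack",
        "stacked DRAM": "yes",
    },
}

for name, spec in parts.items():
    print(name)
    for key, value in spec.items():
        print(f"  {key:12}: {value}")
```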
He has warned that Sony can adopt Mantle to compete. This is true; however, I wonder how much heat the PS4 can take if its GPU is pushed hard.
For the fun of discussing the other side of the coin: let's say DX12 was actually going to do a whole lot. What do lems make of this?
Sounds as if these 'improvements' seem to be communal.
Nope.
@daious: same 1024-bit width, low power consumption, same voltage, made by the same people, one of which helped design the arch in the Xbox One and the other supplied the memory. How is that not proof enough? You are basically saying that the HBM that NVIDIA uses is not HBM because they call it 3D memory instead of HBM. Christ has nothing to do with your slow thinking, lol, so stop putting him in the topic. Show me where AMD and Hynix trademarked or copyrighted the term HBM. Then we will end the discussion.
One more item of interest: why would Epic build DX12 compatibility into UE4? If there was no improvement, why go through the effort on the Xbox One? If the Xbox One did not fully support DX12 in hardware, why implement it?
Still no source. Still no evidence.
You said there was a source. Where is it?
Must be hard being so wrong.
Lol! I love how you asked for a source when you have provided none. Zero sources.
"Show me AMD and Hynix trademarked the term HBM" here is AMD and Hynix filling of HBM DRAM - here and here specifically this filing and standard here JESD235. It is not a trademark but a standard of what HBM ram is.
Dear god, did you seriously just find out that HBM is its own type of memory? HBM DRAM is HBM DRAM. ESRAM is ESRAM. ESRAM is not HBM DRAM. HBM is a completely different type.
Come on, where are your sources? There have to be tons, because you are so confident. If ESRAM is the same thing as HBM, there must be statements from all those companies. Where are the statements that ESRAM complies with JEDEC's HBM standard?
Meanwhile, you just found out that Hynix makes dozens of different types of ram. LOL
Thanks for the laughs.
Knowing what DX12 was doing doesn't mean the hardware is 100% DX12 compatible, which is what he was asked.
GCN is not 100% DX12 compatible, Maxwell is, and the Xbox One is GCN. Stop your sorry-ass damage control; Phil didn't state anywhere "yeah, we are 100% DX12 compatible", that is just what you are reading into it.
First of all, from where in the fu**ing hell do you get that you can stream 400 GFLOPS of power from the damn crappy cloud?
The processes the cloud handles are things which don't need constant refreshing. Anything that needs high bandwidth can't be run on the cloud, period, nor can anything that needs to be constantly refreshed; that was already shot down. So trying to imply that anything the PS4 does with compute will be compensated for by the cloud is a joke, not even close. And requiring online means that with any problem on your end or theirs, you don't play anything.
On PS4 it all just works, because it is inside every console; nothing external is needed.
Titanfall is 792p with shit-ass graphics and frame drops down to around 35 fps; basically you are killing your own argument by using that game. Titanfall is living proof that the cloud means shit when it comes to graphics.
Please, Resistance had no lag and was 20 vs 20 back in 2006..lol, you don't need a cloud for that, just some dedicated servers.
I am sure they would win. The problem is that you can't stream cloud power to help your games graphically; it was already stated that the cloud will not make up for graphics deficiencies. Like I already told you, a server from MS could have 4 TF, but the problem is you can't stream that to the Xbox One because the online connection doesn't allow it, and GPUs handle graphics in GB/s, not MB/s.
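(The GB/s-vs-MB/s point, in numbers. The connection speed is an assumption; the GDDR5 figure is the commonly quoted PS4 spec.)

```python
# Local GPU memory bandwidth vs. what even a fast internet link can deliver.
gpu_memory_bw_gb_s = 176.0               # PS4 GDDR5, commonly quoted ~176 GB/s
link_mbps = 100.0                        # assume a fast 100 Mb/s connection
link_gb_s = link_mbps / 8 / 1000         # megabits/s -> gigabytes/s

print(f"local GDDR5:   {gpu_memory_bw_gb_s} GB/s")
print(f"100 Mb/s link: {link_gb_s} GB/s")
print(f"ratio:         ~{gpu_memory_bw_gb_s / link_gb_s:,.0f}x")
```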
Unreal supports all platforms, get the fu** out of here; it is a damn game engine which works even on tablets. You are telling me nothing by saying they support DX12.
That's funny, because in Saints Row it is the Xbox One that is behind in frames and effects... but but but the weak CPU... lol
@daious: it's on the site, why do I have to copy and paste? They can't trademark or copyright HBM because it's a term everyone uses; they don't own that term, hence anyone can say it. I just gave you facts about the power consumption, the voltage, and the width of the bus; it's all the same. The only thing you have shown is that they don't own the phrase HBM, thanks for your help lol.
HBM DRAM is a standard and a specific type of RAM, you idiot. It's a certain type of DRAM. I just showed you the documents. I just showed you the JEDEC filing number and standard for HBM. Heck, HBM DRAM has its own Wikipedia article.
Again, you can't find a source to prove that you are right. You can't link a source. You can't provide any evidence. I gave you links. Meanwhile, you can't even give me one.
"Look at their website". I went through their entire conference and statements on the HBM conference. Link to their statements that esram is HBM on their website. Just give me the link. How hard is it if you're telling the truth.
I provided you with tons of sources that prove your wrong. You can't even provide a source that you are right.
That's what MS's eSRAM is, lol, HBM DRAM. You're the one trying to say it's only AMD and Hynix, when these are the same people who made the eSRAM; it's the same tech. Duck season, rabbit season, duck season, rabbit season, rabbit season, duck season........ gets shot in the face lol