And cows and lems keep fighting while sheep and herms laugh :roll:
SchnabbleTab
Sheep don't exist, and hermits are an afterthought.
PS4 has more Bandwidth to its System RAM than the XBone does to the ESRAM. The only way the XBone can move more than 88 GB/s is with simultaneous read/write from both ESRAM and main memory. So, please kindly take a seat while the grown ups talk.
[QUOTE="Shewgenja"][QUOTE="Mr720fan"]
Trying way too hard :lol: guess it won't be good for the higher-latency PS4 GDDR5 model. DEALWITHIT, cow apologist.
ronvalencia
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
:lol: right here you have the education of a misinforming mongrel cow. EPIC, RON FOR PRESIDENT but but it cant be done because of ESRAM hahahahahahaha :lol:
[QUOTE="ronvalencia"]
[QUOTE="Shewgenja"] PS4 has more Bandwidth to its System RAM than the XBone does to the ESRAM. The only way the XBone can move more than 88 GB/s is with simultaneous read/write from both ESRAM and main memory. So, please kindly take a seat while the grown ups talk.Mr720fan
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
:lol: right here you have the education of a misinforming mongrel cow. EPIC, RON FOR PRESIDENT but but it cant be done because of ESRAM hahahahahahaha :lol:
While a GDDR5 module's latency can be similar to a DDR3 module's (I have the docs to prove it), the memory controller is another story.
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3
Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.
Intel attacked the memory interface for GDDR.
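For anyone who wants to sanity-check the raw bandwidth numbers being thrown around in this thread, here is a minimal sketch. The bus widths and transfer rates below are the figures commonly reported for these parts at the time and are used as assumptions for illustration, not vendor-confirmed specs.
[code]
#include <cstdio>

// Peak theoretical bandwidth of a memory interface:
//   GB/s = (bus width in bits / 8) * giga-transfers per second
// The figures below are the widely reported ones for these parts and are
// assumptions for illustration, not official specs.
static double peak_gb_per_s(int bus_width_bits, double gtransfers_per_s) {
    return (bus_width_bits / 8.0) * gtransfers_per_s;
}

int main() {
    std::printf("PS4 256-bit GDDR5 @ 5.5 GT/s  : %6.1f GB/s\n", peak_gb_per_s(256, 5.5));
    std::printf("X1  256-bit DDR3  @ 2.133 GT/s: %6.1f GB/s\n", peak_gb_per_s(256, 2.133));
    // A 384-bit GDDR5 flagship card at 6 GT/s, i.e. more bandwidth from a
    // wider bus rather than from a higher clock.
    std::printf("384-bit GDDR5     @ 6.0 GT/s  : %6.1f GB/s\n", peak_gb_per_s(384, 6.0));
    return 0;
}
[/code]
Whatever you make of the ESRAM argument, the ~176 GB/s and ~68 GB/s figures fall straight out of bus width times transfer rate; the ESRAM and any simultaneous read/write are extra paths on top of the DDR3 number.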
[QUOTE="ronvalencia"]
[QUOTE="Shewgenja"] PS4 has more Bandwidth to its System RAM than the XBone does to the ESRAM. The only way the XBone can move more than 88 GB/s is with simultaneous read/write from both ESRAM and main memory. So, please kindly take a seat while the grown ups talk.Mr720fan
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
:lol: right here you have the education of a misinforming mongrel cow. EPIC, RON FOR PRESIDENT but but it cant be done because of ESRAM hahahahahahaha :lol:
The ESRAM doesn't hold a lot of information, which means that while you will be able to get quick transfers like on PS4, you're not really going to see it, as most of the data is going to be going through the DDR3 and only a silly little amount will actually be quick. And then when you take into account that the Xbox One has access to 5GB of its DDR3 and the PS4 has access to 7GB of its GDDR5, it makes the difference in transfer speed even more noticeable in games. At the start of the new gen I think you won't see much difference, as both are playing games at roughly the same level, but later down the line the Xbox will struggle with less RAM accessible for gaming and slower RAM, compared to the PS4's more RAM for games and quicker RAM.
[QUOTE="Mr720fan"][QUOTE="ronvalencia"]
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
ziggyww
:lol: right here you have the education of a misinforming mongrel cow. EPIC, RON FOR PRESIDENT but but it cant be done because of ESRAM hahahahahahaha :lol:
The ESRAM doesn't hold a lot of information, which means that while you will be able to get quick transfers like on PS4, you're not really going to see it, as most of the data is going to be going through the DDR3 and only a silly little amount will actually be quick. And then when you take into account that the Xbox One has access to 5GB of its DDR3 and the PS4 has access to 7GB of its GDDR5, it makes the difference in transfer speed even more noticeable in games. At the start of the new gen I think you won't see much difference, as both are playing games at roughly the same level, but later down the line the Xbox will struggle with less RAM accessible for gaming and slower RAM, compared to the PS4's more RAM for games and quicker RAM.
Dat damage control... why don't you just watch the video, cow... and I always said that the difference will be like it was this generation. Maybe multiplats run smoother on PS4, but overall not that big of a deal. Making it out like the PS4 is going to have such superior games is hogwash; cows are just too stupid to get this. Trying to justify a purchase, I guess.
[QUOTE="Mr720fan"]
[QUOTE="ronvalencia"]
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
ronvalencia
:lol: right here you have the education of a misinforming mongrel cow. EPIC, RON FOR PRESIDENT but but it cant be done because of ESRAM hahahahahahaha :lol:
While a GDDR5 module's latency can be similar to a DDR3 module's (I have the docs to prove it), the memory controller is another story.
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3
Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.
Intel attacked the memory interface for GDDR.
dude i love you.. keep up the good work, thanks for educating us all!!! you must be a developer or something.
[QUOTE="ronvalencia"]
[QUOTE="granddogg"]
http://venturebeat.com/2013/06/26/microsoft-reveals-how-tiled-resources-enables-detailed-graphics-on-xbox-one-and-windows-8-1/
Mr720fan
Sounds like AMD GCN's Partially Resident Textures (PRT) hardware feature being exposed for DirectX. Microsoft haven't exposed AMD GCN's PRT for PC DirectX.
AMD's PRT hardware in GCN would work nicely on X1's ESRAM (very low latency, zero refresh overheads).
so in your estimation what kind of value if any does this have to microsoft and do you see SONY using this as well with OPEN GL?
With AMD PRT, you can overcommit your VRAM and still perform well, i.e. this is dependent on texture visibility/recyclability and look-ahead pre-fetch engines/processes. It's a puzzle to be solved.
http://www.amd.com/us/products/technologies/gcn/Pages/gcn-architecture.aspx
PRT can utilize absolutely enormous texture files, up to 32 terabytes large, with minimal performance impact. PRT accomplishes this by streaming small bits of these massive textures into the GPU as needed, giving compatible games a virtually endless supply of unique texture data it can apply to the game world. The GCN Architecture in 28nm AMD Radeon products is the first GPU design to feature a hardware implementation of this technology. Partially Resident Textures (PRT) enables future games to utilize ultra-high resolution textures with the same performance as today's small and often repetitive textures.
PS: Blu-ray is not enough for 32-terabyte textures. Btw, AMD also gave hints (i.e. "future games") about going beyond DX11.1 in their GCN presentations, e.g. DX11.2.
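To make the PRT idea a bit more concrete, here is a rough sketch of the software side of the technique: the renderer asks for tiles of a huge virtual texture, and only the tiles it touched recently stay resident in a small pool. Everything in it (the tile size, the pool size, the LRU policy, all the names) is illustrative, not AMD's or Microsoft's actual implementation.
[code]
#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>

// Illustrative PRT-style residency manager: a huge virtual texture is split
// into fixed-size tiles, and only the tiles that were recently requested are
// kept resident in a small pool (modelled here with a plain LRU).
struct TileKey {
    uint32_t mip, x, y;
    bool operator==(const TileKey& o) const { return mip == o.mip && x == o.x && y == o.y; }
};
struct TileKeyHash {
    size_t operator()(const TileKey& k) const {
        uint64_t h = (uint64_t(k.mip) << 40) ^ (uint64_t(k.x) << 20) ^ k.y;
        return size_t(h);
    }
};

class TilePool {
public:
    explicit TilePool(size_t maxResidentTiles) : capacity_(maxResidentTiles) {}

    // Called for every tile the renderer touches this frame.
    // Returns true if the tile was already resident (a "hit").
    bool request(const TileKey& key) {
        auto it = lookup_.find(key);
        if (it != lookup_.end()) {            // hit: refresh LRU position
            lru_.splice(lru_.begin(), lru_, it->second);
            return true;
        }
        if (lru_.size() == capacity_) {       // miss: evict least-recently-used tile
            lookup_.erase(lru_.back());
            lru_.pop_back();
        }
        lru_.push_front(key);                 // stream the tile in (I/O omitted)
        lookup_[key] = lru_.begin();
        return false;
    }

private:
    size_t capacity_;
    std::list<TileKey> lru_;
    std::unordered_map<TileKey, std::list<TileKey>::iterator, TileKeyHash> lookup_;
};

int main() {
    TilePool pool(4);                         // pretend only 4 tiles fit in fast memory
    TileKey frame[] = {{0,0,0},{0,1,0},{0,0,1},{0,1,1},{0,0,0},{0,2,2}};
    for (const TileKey& t : frame)
        std::printf("tile (%u,%u,%u): %s\n", t.mip, t.x, t.y,
                    pool.request(t) ? "resident" : "streamed in");
    return 0;
}
[/code]
On real hardware the pool would be VRAM/ESRAM pages mapped through the GPU's page tables and the "streamed in" branch would kick off an asynchronous read from disk, but the bookkeeping looks essentially like this.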
[QUOTE="Mr720fan"]
[QUOTE="ronvalencia"]
Sounds like AMD GCN's Partially Resident Textures (PRT) hardware feature being exposed for DirectX. Microsoft haven't exposed AMD GCN's PRT for PC DirectX.
AMD's PRT hardware in GCN would work nicely on X1's ESRAM (very low latency, zero refresh overheads).
ronvalencia
so in your estimation what kind of value if any does this have to microsoft and do you see SONY using this as well with OPEN GL?
With AMD PRT, you can overcommit your VRAM and still perform well, i.e. this is dependent on texture visibility/recyclability and look-ahead pre-fetch engines/processes. It's a puzzle to be solved.
http://www.amd.com/us/products/technologies/gcn/Pages/gcn-architecture.aspx
PRT can utilize absolutely enormous texture files, up to 32 terabytes large, with minimal performance impact. PRT accomplishes this by streaming small bits of these massive textures into the GPU as needed, giving compatible games a virtually endless supply of unique texture data it can apply to the game world. The GCN Architecture in 28nm AMD Radeon products is the first GPU design to feature a hardware implementation of this technology. Partially Resident Textures (PRT) enables future games to utilize ultra-high resolution textures with the same performance as today's small and often repetitive textures.
PS: Blu-ray is not enough for 32-terabyte textures. Btw, AMD also gave hints about going beyond DX11.1 in their GCN presentations, e.g. DX11.2.
Thanks a lot, you are extremely knowledgeable.
PS4 has more Bandwidth to its System RAM than the XBone does to the ESRAM. The only way the XBone can move more than 88 GB/s is with simultaneous read/write from both ESRAM and main memory. So, please kindly take a seat while the grown ups talk.
[QUOTE="Shewgenja"][QUOTE="Mr720fan"]
Trying way too hard :lol: guess it won't be good for the higher-latency PS4 GDDR5 model. DEALWITHIT, cow apologist.
ronvalencia
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
This is a non-sequitur due to the XBox not having more RAM than the PS4. The XBone has less RAM available to games due to the OS layout as well.
[QUOTE="ronvalencia"][QUOTE="Shewgenja"] PS4 has more Bandwidth to its System RAM than the XBone does to the ESRAM. The only way the XBone can move more than 88 GB/s is with simultaneous read/write from both ESRAM and main memory. So, please kindly take a seat while the grown ups talk.Shewgenja
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
This is a non-sequitur due to the XBox not having more RAM than the PS4. The XBone has less RAM available to games due to the OS layout as well.
Yeah, because most games will go over 5 gigs :lol: dude, you've been owned, get off that shit payroll of Sony's.
[QUOTE="Mr720fan"][QUOTE="ronvalencia"]
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
ziggyww
:lol: right here you have the education of a misinforming mongrel cow. EPIC, RON FOR PRESIDENT but but it cant be done because of ESRAM hahahahahahaha :lol:
The ESRAM doesn't hold a lot of information, which means that while you will be able to get quick transfers like on PS4, you're not really going to see it, as most of the data is going to be going through the DDR3 and only a silly little amount will actually be quick. And then when you take into account that the Xbox One has access to 5GB of its DDR3 and the PS4 has access to 7GB of its GDDR5, it makes the difference in transfer speed even more noticeable in games. At the start of the new gen I think you won't see much difference, as both are playing games at roughly the same level, but later down the line the Xbox will struggle with less RAM accessible for gaming and slower RAM, compared to the PS4's more RAM for games and quicker RAM.
You are not factoring in AMD's PRT hardware feature, e.g. 32-terabyte textures with minimal performance impact. You are thinking of last-gen game engines, i.e. AMD's PRT hardware is designed for future game engines and large PC 3D applications.
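As a rough back-of-envelope for why the total size of a PRT texture barely matters: what has to stay resident is tied to how many texels the screen can actually sample in a frame, not to the size of the virtual texture. The numbers below (1080p, uncompressed RGBA8, a 3x fudge factor for mips/borders/prefetch) are illustrative assumptions only.
[code]
#include <cstdio>

// Rough arithmetic behind "32 TB virtual textures with minimal impact":
// the resident working set is roughly one texel per screen pixel, times a
// fudge factor for mips, tile borders and prefetch, independent of how big
// the virtual texture actually is. All inputs are illustrative assumptions.
int main() {
    const double screen_pixels   = 1920.0 * 1080.0; // 1080p render target
    const double bytes_per_texel = 4.0;             // uncompressed RGBA8
    const double overdraw_factor = 3.0;             // mips, borders, prefetch margin
    const double resident_bytes  = screen_pixels * bytes_per_texel * overdraw_factor;

    const double virtual_bytes   = 32.0 * 1024.0 * 1024.0 * 1024.0 * 1024.0; // 32 TB address space

    std::printf("resident working set: ~%.1f MB\n", resident_bytes / (1024.0 * 1024.0));
    std::printf("virtual texture size: %.0f TB (never fully loaded)\n",
                virtual_bytes / (1024.0 * 1024.0 * 1024.0 * 1024.0));
    return 0;
}
[/code]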
hhhmmm
After months of research and countless hours of hard work I came to this conclusion
Lems think Xbox > Playstation while Cows believe Playstation > Xbox, and no matter how many hardware comparisons, legit information or any type of proof you'll gather/show on these boards, things will never change...
This is a non-sequitur due to the XBox not having more RAM than the PS4. The XBone has less RAM available to games due to the OS layout as well.
[QUOTE="Shewgenja"][QUOTE="ronvalencia"]
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx
While GDDR5 is faster, you run into diminishing returns with GDDR5. With this in mind, note why AMD included 384-bit GDDR5 in their flagship GPUs, i.e. AMD's attempt to overcome the diminishing returns with brute force.
Mr720fan
Yeah, because most games will go over 5 gigs :lol: dude, you've been owned, get off that shit payroll of Sony's.
To get the throughput outlined in this graph, you are actually comparing 5GBs of (G)DDR3 to what the GDDR5 does with 2.5 effectively. So, yeah. I reiterate what I said before. You should probably sit down and let the adults talk.
Hm. I'm not as technically savvy as many users here, but in all honesty, I thought consoles and PCs were already doing this (to the same or lesser extent). Isn't this just basically swapping out textures depending upon the field of view?
Says the guy that posts on every pro MS topic just to add some useless drivel about how the PS4 is better. Yeah... lems sure are desperate.
Lems are so desperate lately, what gives?
Davekeeh
Cows are having a rough week, just look up and read shew all over the place and trying so hard :lol: ron for prez after this
Mr720fan
Sad indeed.
[QUOTE="Mr720fan"][QUOTE="ronvalencia"] This is a non-sequitur due to the XBox not having more RAM than the PS4. The XBone has less RAM available to games due to the OS layout as well.Shewgenja
Yeah, because most games will go over 5 gigs :lol: dude, you've been owned, get off that shit payroll of Sony's.
To get the throughput outlined in this graph, you are actually comparing 5GBs of (G)DDR3 to what the GDDR5 does with 2.5 effectively. So, yeah. I reiterate what I said before. You should probably sit down and let the adults talk.
Lame attempts by you getting even lamer. I am sure it hurts you that this will be used more on the X1 than the PS4, being that it is built into 11.2. You were educated thoroughly by ron in this thread and are trying to save face. dx < opengl, deal with it cow, and don't quit your day job :lol:
Says the guy that posts on every pro MS topic just to add some useless drivel about how the PS4 is better. Yeah....lems sure are desperate.[QUOTE="always_explicit"][QUOTE="Davekeeh"]
Lems are so desperate lately, what gives?
Mr720fan
:lol:
.
here is a good example of Xbox One's secret sauce in play, that powerful 1.1 teraflop GPU with CLOUD POWERRR
Microsoft has unveiled DirectX 11.2, an updated version of its existing graphics technology that introduces "a host of new features to improve performance" in games and graphics apps.
DirectX 11.2's headline feature is the addition of 'Tiled Resources', an advanced graphics technology that lets developers pull hi-res assets into a scene dynamically without overloading the graphics card. Essentially, the tech ensures that textures don't appear blurred or fuzzy when viewed close up.
Microsoft says that Tiled Resources will let developers "make games with unprecedented amounts of detail".
DirectX 11.2 also reduces latency for DirectX apps, allowing for "faster UI response".
The technology will be compatible with Xbox One and devices running Windows 8.1.
[QUOTE="Mr720fan"]
[QUOTE="ronvalencia"]
Sounds like AMD GCN's Partially Resident Textures (PRT) hardware feature being exposed for DirectX. Microsoft haven't exposed AMD GCN's PRT for PC DirectX.
AMD's PRT hardware in GCN would work nicely on X1's ESRAM (very low latency, zero refresh overheads).
ronvalencia
so in your estimation what kind of value if any does this have to microsoft and do you see SONY using this as well with OPEN GL?
With AMD PRT, you can overcommit your VRAM and still perform well, i.e. this is dependent on texture visibility/recyclability and look-ahead pre-fetch engines/processes. It's a puzzle to be solved.
http://www.amd.com/us/products/technologies/gcn/Pages/gcn-architecture.aspx
PRT can utilize absolutely enormous texture files, up to 32 terabytes large, with minimal performance impact. PRT accomplishes this by streaming small bits of these massive textures into the GPU as needed, giving compatible games a virtually endless supply of unique texture data it can apply to the game world. The GCN Architecture in 28nm AMD Radeon products is the first GPU design to feature a hardware implementation of this technology. Partially Resident Textures (PRT) enables future games to utilize ultra-high resolution textures with the same performance as today's small and often repetitive textures.
PS: Blu-ray is not enough for 32-terabyte textures. Btw, AMD also gave hints (i.e. "future games") about going beyond DX11.1 in their GCN presentations, e.g. DX11.2.
Damn it is like shock and awe up in this bitch lems getting bombed :lol:
You didn't see sh1t but games running on PC's. If you're too young to remember E3 '05 I'll tell ya this... Almost nothing looked the same at launch as it did at E3 '05.
both consoles will be similar. devs said it, we saw it at e3. now plz rats and roaches stop the fighting. u idiots r giving me a headache
AD216
Solution: Wait and see
You didn't see sh1t but games running on PC's. If you're too young to remember E3'05 I'll tell ya this... Almost nothing looked the same at launch as it did at E3 '05.[QUOTE="AD216"]
both consoles will be similar. devs said it, we saw it at e3. now plz rats and roaches stop the fighting. u idiots r giving me a headache
Douevenlift_bro
Solution: Wait and see
lmao i'm 27 bruh and i'm pretty sure ur like 13. with that said, i've been saying wait and see for well over a month. from everything i've been seeing lately tho the x1 is starting to build a lot of hype. not that it matters to me because i'll be getting both.[QUOTE="ronvalencia"]
[QUOTE="Mr720fan"]
so in your estimation what kind of value if any does this have to microsoft and do you see SONY using this as well with OPEN GL?
no-scope-AK47
With AMD PRT, you can overcommit your VRAM and still perform well, i.e. this is dependent on texture visibility/recyclability and look-ahead pre-fetch engines/processes. It's a puzzle to be solved.
http://www.amd.com/us/products/technologies/gcn/Pages/gcn-architecture.aspx
PRT can utilize absolutely enormous texture files, up to 32 terabytes large, with minimal performance impact. PRT accomplishes this by streaming small bits of these massive textures into the GPU as needed, giving compatible games a virtually endless supply of unique texture data it can apply to the game world. The GCN Architecture in 28nm AMD Radeon products is the first GPU design to feature a hardware implementation of this technology. Partially Resident Textures (PRT) enables future games to utilize ultra-high resolution textures with the same performance as today's small and often repetitive textures.
PS: Blu-ray is not enough for 32-terabyte textures. Btw, AMD also gave hints (i.e. "future games") about going beyond DX11.1 in their GCN presentations, e.g. DX11.2.
Damn it is like shock and awe up in this bitch lems getting bombed :lol:
The GPU in the XBox One is using PRT technology...
[QUOTE="no-scope-AK47"] Damn it is like shock and awe up in this bitch lems getting bombed :lol:
The GPU in the XBox One is using PRT technology... Both consoles can do the same thing so no advantages to either given but the PS4 still is 50% more powerful
[QUOTE="ronvalencia"]
With AMD PRT, you can overcommit your VRAM and still perform well, i.e. this is dependent on texture visibility/recyclability and look-ahead pre-fetch engines/processes. It's a puzzle to be solved.
http://www.amd.com/us/products/technologies/gcn/Pages/gcn-architecture.aspx
PRT can utilize absolutely enormous texture files, up to 32 terabytes large, with minimal performance impact. PRT accomplishes this by streaming small bits of these massive textures into the GPU as needed, giving compatible games a virtually endless supply of unique texture data it can apply to the game world. The GCN Architecture in 28nm AMD Radeon products is the first GPU design to feature a hardware implementation of this technology. Partially Resident Textures (PRT) enables future games to utilize ultra-high resolution textures with the same performance as today's small and often repetitive textures.
PS: Blu-ray is not enough for 32-terabyte textures. Btw, AMD also gave hints (i.e. "future games") about going beyond DX11.1 in their GCN presentations, e.g. DX11.2.
UltimateviL
It looks like MS is addressing some latency issues.
Microsoft has unveiled DirectX 11.2, an updated version of its existing graphics technology that introduces "a host of new features to improve performance" in games and graphics apps.
DirectX 11.2's headline feature is the addition of 'Tiled Resources', an advanced graphics technology that lets developers pull hi-res assets into a scene dynamically without overloading the graphics card. Essentially, the tech ensures that textures don't appear blurred or fuzzy when viewed close up.
Microsoft says that Tiled Resources will let developers "make games with unprecedented amounts of detail".
DirectX 11.2 also reduces latency for DirectX apps, allowing for "faster UI response".
The technology will be compatible with Xbox One and devices running Windows 8.1.
Mr720fan
[QUOTE="Far_RockNYC"]
[QUOTE="Mr720fan"]
Cows are having a rough week, just look up and read shew all over the place and trying so hard :lol: ron for prez after this
Mr720fan
Sad indeed.
Hard day again for cows, and shew getting schooled by ron is delicious.
I treat PS4 or X1 like different Radeon HD GCN SKUs. If I had a "cow" mindset, I would diss the owners of 7770s with my 7970+7950 CrossFire, but I wouldn't do that.
You didn't see sh1t but games running on PC's. If you're too young to remember E3 '05 I'll tell ya this... Almost nothing looked the same at launch as it did at E3 '05.
[QUOTE="AD216"]
both consoles will be similar. devs said it, we saw it at e3. now plz rats and roaches stop the fighting. u idiots r giving me a headache
Douevenlift_bro
Solution: Wait and see
And yes, there were some games that were - categorically, without a shadow of a doubt - running on Xbox One hardware. It'll come as little surprise to learn that first party software was more likely to be showcased running on the new console, with Turn 10's Forza Motorsport 5 the most high profile title we saw that was visibly operating on the actual unit.
http://www.eurogamer.net/articles/digitalfoundry-hands-on-with-xbox-one
http://www.geek.com/games/sony-iimprove-directx-11-for-the-ps4-blu-ray-1544364/
stereointegrity
AMD has its own improvements. Refer to http://www.slideshare.net/zlatan4177/gpgpu-algorithms-in-games
Your table doesn't show X1's SRAM and the effective gain from the Just-In-Time LZ/JPEG compression/decompression hardware.
And still 100$ cheaper. Go home xbox, you're drunk.
voicereason
Intel's low-latency eDRAM (50 GB/s per direction) setup
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3
Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.
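Here is a toy model of why a small fast pool plus on-the-fly compression can be claimed to "behave like" a much fatter GDDR interface, in the spirit of the Intel quote above. The hit rate and compression ratio are made-up illustrative inputs, not measured Xbox One or Crystalwell figures.
[code]
#include <cstdio>

// Back-of-envelope model of "effective" bandwidth when a small fast pool
// (ESRAM/eDRAM cache) absorbs part of the traffic and compression shrinks
// what still has to cross the main-memory bus. All inputs are illustrative
// assumptions, not measured console figures.
static double effective_gb_per_s(double dram_gb_s,      // main memory bandwidth
                                 double cache_gb_s,     // ESRAM/eDRAM bandwidth
                                 double cache_hit_rate, // fraction of accesses served by the fast pool
                                 double compression)    // avg compression ratio on DRAM traffic (>= 1)
{
    // Misses go to DRAM, but compressed, so each byte of useful data costs
    // only 1/compression bytes of bus traffic.
    double dram_effective = dram_gb_s * compression;
    // Time to move 1 GB of useful data, split between the two paths.
    double time_per_gb = cache_hit_rate / cache_gb_s
                       + (1.0 - cache_hit_rate) / dram_effective;
    return 1.0 / time_per_gb;
}

int main() {
    // Example: 68 GB/s DRAM, 100 GB/s fast pool, 60% of accesses hit the
    // fast pool, 1.5:1 average compression on the remaining traffic.
    std::printf("effective: %.1f GB/s\n",
                effective_gb_per_s(68.0, 100.0, 0.60, 1.5));
    return 0;
}
[/code]
Push the hit rate toward zero and you are left with plain DRAM bandwidth, which is why the size of the fast pool and how well the working set fits in it matter more than the headline GB/s.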
Your table doesn't show X1's SRAM and the effective gain from the Just-In-Time LZ/JPEG compression/decompression hardware.
[QUOTE="voicereason"] And still 100$ cheaper. Go home xbox, you're drunk.
ronvalencia
Intel's low-latency eDRAM (50 GB/s per direction) setup
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3
Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.
Hate to say it dude, but you're making the fatal mistake of saying something clever to someone stupid. This guy's all over the boards spouting the same misinformation as all the other cows... difference being he isn't as good at it.
This technology has been available to OpenGL for a long time. John Carmack said that the PRT technology was great, but this functionality was not exposed in DX 11.1 so they could not really use it. Now it is exposed in DX 11.2 and Xbox 1, but it is also available for PS4. Ron, even if we assume that the JIT JPEG texture encode/decode and the ESRAM mean that the effective bandwidth is the same on the Xbox 1 and the PS4, there are still two issues. 1) It will take more developer optimisation to maximise, as it is a more complicated system. 2) The PS4 still has 50% more shader and 100% more fill-rate performance, as well as enhanced compute capability through the extra ACE units. Further, due to the enhancements in the compute pipeline it is possible for the GPU to perform compute and graphics at the same time, meaning some load can be shifted from the CPU onto the GPU for workloads that suit the compute architecture. This will only really be used by 1st party devs I expect, but it does enable them to use more advanced lighting methods or more interactive physics.
Microsoft has unveiled DirectX 11.2, an updated version of its existing graphics technology that introduces "a host of new features to improve performance" in games and graphics apps.
DirectX 11.2's headline feature is the addition of 'Tiled Resources', an advanced graphics technology that lets developers pull hi-res assets into a scene dynamically without overloading the graphics card. Essentially, the tech ensures that textures don't appear blurred or fuzzy when viewed close up.
Microsoft says that Tiled Resources will let developers "make games with unprecedented amounts of detail".
DirectX 11.2 also reduces latency for DirectX apps, allowing for "faster UI response".
The technology will be compatible with Xbox One and devices running Windows 8.1.
Mr720fan