what's this, "DX12 will have a substantial effect on XB1"?

#251  Edited By Daious
Member since 2013 • 2315 Posts

@Tighaman said:

@daious said:

@Tighaman said:

@daious: it's on the site, why do I have to copy and paste? They can't trademark or copyright HBM because it's a term everyone uses; they don't own that term, hence anyone can say it. I just gave you facts about the power consumption, the voltage, and the width of the bus. It's all the same. The only thing you have shown is that they don't own the phrase HBM. Thanks for your help lol

HBM DRAM is a standard and a specific type of RAM, you idiot. It's a specific type of DRAM. I just showed you the documents. I just showed you the JEDEC filing number and standard for HBM. Heck, HBM DRAM has its own Wikipedia article.

Again, you can't find a source to prove that you are right. You can't link a source. You can't provide any evidence. I gave you links. Meanwhile, you can't even give me one.

"Look at their website". I went through their entire conference and statements on the HBM conference. Link to their statements on their website that esram is HBM. Just give me the link.

I provided you with tons of sources that prove you're wrong. You can't even provide a source that says you are right.

That's what MS eSRAM is lol, HBM DRAM. You're the one trying to say it's only AMD and Hynix, when these are the same people who made the eSRAM; it's the same tech. Duck season, rabbit season, duck season, rabbit season, rabbit season, duck season........gets shot in the face lol

Still no source? Still love being wrong?

Come on man. Where is your source?

You assume things and you are so dead wrong.

HBM DRAM is a standard. I gave you the evidence.

LOL, I found some slides from NVIDIA talking about the HBM standard. Oh god, you must love being wrong. They even cite the JESD235 standard on what HBM DRAM is (proving you wrong yet again). There is even a Wikipedia article proving you are wrong.

I have Hynix, Hynix slides, the Hynix HBM conference, the JESD235 standard (irrefutable evidence), Wikipedia, and NVIDIA stating that you are wrong and that HBM DRAM is its own type. Are you seriously going against what JEDEC said? Because you can't. You literally can't.

If you don't provide me the link to HYNIX stating that esram is HBM DRAM in the next reply, then you are admitting that you are completely wrong. You will be admitting that you can't provide any evidence and cannot support your argument.

Your only argument is that Hynix is making esram. Guess what: Hynix makes dozens of different types of RAM. I gave you just a small list of the dozens of types they make. You know what Hynix has stated? They have stated what HBM is. Hynix has stated that HBM DRAM is its own type of RAM. Hynix has stated that it has officially been adopted by JEDEC as its own type of RAM. I have shown you proof from AMD, NVIDIA, Hynix, and JEDEC that you are wrong. Your turn. Where are your JEDEC statements? Where are your Hynix statements?

JEDEC evidence is irrefutable. They set the standards for the entire memory industry. The Joint Electron Device Engineering Council is what sets the standard for RAM types. No one else does. You are wrong beyond any doubt, courtesy of them.

#252  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

Benchmark testing software company Futuremark has revealed it will add a test to its 3DMark product that will demo the graphical improvements in DirectX 12, which will ship as part of Windows 10.

The new API Overhead feature test will be the same demo that Microsoft used to show off the difference between DirectX 11 and DirectX 12 as part of its Windows 10 gaming presentation earlier this week. Futuremark states:

Games make thousands of draw calls per frame, but each one creates performance-limiting overhead for the CPU. APIs with less overhead can handle more draw calls and produce richer visuals. Microsoft's demo shows DirectX 11 and DirectX 12 running side by side on the same hardware. As the number of draw calls increases, DirectX 12 is able to handle greater complexity and detail while achieving higher frame rates than DirectX 11. We're thrilled that Microsoft chose our new API Overhead feature test to show the benefits of DirectX 12. And soon you will be able to test your own PC and see the difference yourself.
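To put the quoted draw-call point in rough numbers, here is a minimal sketch. The per-call overhead figures below are illustrative assumptions, not measurements from Microsoft's demo; the point is only that halving or quartering per-call CPU cost multiplies how many calls fit in a frame:

```python
# Back-of-envelope draw-call budget at 60 fps.
# Overhead numbers are assumed for illustration, not taken from the article.
FRAME_BUDGET_S = 1.0 / 60      # ~16.7 ms of CPU time per frame
OVERHEAD_HIGH_S = 40e-6        # assume ~40 us of CPU work per call (thick API)
OVERHEAD_LOW_S = 5e-6          # assume ~5 us per call (thin, DX12-style API)

def max_draw_calls(frame_budget_s: float, per_call_s: float) -> int:
    """Upper bound on draw calls if the CPU spent the whole frame submitting them."""
    return int(frame_budget_s / per_call_s)

print(max_draw_calls(FRAME_BUDGET_S, OVERHEAD_HIGH_S))  # 416
print(max_draw_calls(FRAME_BUDGET_S, OVERHEAD_LOW_S))   # 3333
```

Under those assumed costs, the thinner API submits roughly eight times as many calls in the same frame budget, which is the effect the API Overhead test is built to measure.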

There's no word on exactly when the new test will be added to 3DMark. As we have previously reported, DirectX 12 will be exclusive to Windows 10 and will not be made available to Windows 7 and 8.1 users. More information about DirectX 12 is expected to be revealed in March as part of the annual Game Developers Conference.

http://www.windowscentral.com/upcoming-version-3dmark-will-include-benchmark-test-windows-10-and-directx-12?utm_source=wpc&utm_medium=twitter

#253  Edited By slimdogmilionar
Member since 2014 • 1345 Posts

@tormentos: whatever man. You really have no idea what you are talking about. GPGPU is there to free up the CPU, and if you think the GPU in the PS4 has more compute power than Azure, all I can say to you is ignorance is bliss.

Yeah, the fact that you actually believe a console named after the company's flagship API won't be built to take advantage of it just shows your ignorance. That's like saying Apple would not build an iPhone around iOS. Everyone knows the Xb1 has more customization under the hood than the PS4. Are you really that much of an idiot to think that with DX12 and the Xb1 being developed at the same time, and with the lead DX engineer working on the Xbox, it would not support DX12? Come on, that's just stupid. It's been said over and over that the Xb1 has DX12 support; you even have developers questioning why M$ even launched the Xb1 with DX11. Did the 360 not launch with future DX tech inside?

I understand the problem of losing connection to the cloud, but as the numbers have shown, XB1 players are constantly online, more so than PS4 players. I honestly don't think anyone who bought an Xbox would buy it not to play online. So cloud connection probably won't be a problem for most X1 players.

Yeah, talk about Titanfall's res, but those frame drops you're referring to rarely happen. It's just a small flaw for you to try to strengthen your argument, but I don't see you complaining about the framerate drops in the majority of PS games. I guess frame drops are only bad on Xbox; when they happen on PS4, cows are like "Sony gave us inconsistent frames to enhance gameplay."

Yeah, Resistance 2 had 20 vs 20 in 2006, but it had nowhere near the amount of stuff Titanfall has going on right now.

I'm not talking about streaming graphics; I'm talking about extra CPU power so the Xb1 doesn't have to use GPGPU. Get that through your skull.

Yeah, Unreal supports all platforms, but like I said, DX12 is going to be the industry standard: PC, Xbox, and rumors of Google's Nexus 9 with Tegra K1 all supporting DX (DX11 confirmed). Apple is leaving OpenGL for Apple Metal, basically leaving OpenGL to just the PS4, though some games may still use it. This would make sense seeing as how Epic and Unity are both on board with DX. And I'll put this out there one more time: EPIC IS ON BOARD WITH THE CLOUD, SO IS NVIDIA, AND SO IS HAVOK.

Oh yeah, SR4. I think it's unanimous that that game was not well optimized, but if you wanna throw stones we can throw the RE remake in there, where the XB1 has better fps.

#254  Edited By 04dcarraher
Member since 2004 • 23857 Posts

@slimdogmilionar said:

I understand the problem of losing connection to the cloud, but as the numbers have shown, XB1 players are constantly online, more so than PS4 players. I honestly don't think anyone who bought an Xbox would buy it not to play online. So cloud connection probably won't be a problem for most X1 players.


All cloud processing will be on the server side of things, aka multiplayer or server-based co-op. You cannot have cloud-based computing handle every X1 player individually; that would take too much bandwidth, time, and money to implement.

#255 GrenadeLauncher
Member since 2004 • 6843 Posts

@b4x said:

The guy gave no facts in that video.

He dismissed people's so-called facts. Rightfully so. How do you get "won't help X1" from "massive change"?

He made NO factual claims... None.

I don't know who the guy is... He may be a fanboy. That video is not fanboy at all. He's 100% right. We don't know what the X1 and DX12 will bring.

To say one way or the other is just BS... Hot air.

You can sit in here and try to lie about it... to a person that reads all the info about the Xbox One. Not your best choice though.

No publisher has ever said it wouldn't help the Xbox One. Well, one: "Naughty Dog".

Their impartial view is 100% concrete. :D

Most of the technical articles you are referring to are based around that Phil Tweet also.

How can you sit in here and pretend you know what it is going to do based off of that tweet, or anything that has been said up to this point?

I can't.

Did I say that? Nope. I said he talked shit. Looking at your garbage, he excelled himself.

And yes, he's a fanboy. Just like you. I'm surprised you can read too: we said it would make minimal difference. Faster porting, mostly. It already has the API benefits PC CPUs don't enjoy.

Couldn't find your 1.2 million prediction for November, B4X. Fancy linking?

#256 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@GrenadeLauncher said:

Did I say that? Nope. I said he talked shit. Looking at your garbage, he excelled himself.

And yes, he's a fanboy. Just like you. I'm surprised you can read too: we said it would make minimal difference. Faster porting, mostly. It already has the API benefits PC CPUs don't enjoy.

Couldn't find your 1.2 million prediction for November, B4X. Fancy linking?

That's not really true; I've seen Krelian and Tormentos saying it won't do anything, while DX12 has some new features not being used by the Xbox One at the moment.

#257 GrenadeLauncher
Member since 2004 • 6843 Posts

@slimdogmilionar said:

Cows are not really dumb, they are just insecure. They know the PS4 will eventually lose its GPU advantage when GPGPU is thrown into the equation.

You mean that thing the PS4 can also do? Keep trying to hype up da clawd too, it might be relevant for CPU tasks in some serious function in ten years. I suppose you have to cover your bases when DX12 falls through.

Keep damage controlling for the Shitbox. Dat advantage is never going away. :)

@FastRobby said:

@GrenadeLauncher said:

Did I say that? Nope. I said he talked shit. Looking at your garbage, he excelled himself.

And yes, he's a fanboy. Just like you. I'm surprised you can read too: we said it would make minimal difference. Faster porting, mostly. It already has the API benefits PC CPUs don't enjoy.

Couldn't find your 1.2 million prediction for November, B4X. Fancy linking?

That's not really true; I've seen Krelian and Tormentos saying it won't do anything, while DX12 has some new features not being used by the Xbox One at the moment.

The same way lems say "PS4 has no games." Exaggeration (well, it is for cows, lems genuinely believe the shit they spout).

#258  Edited By deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@GrenadeLauncher said:

@FastRobby said:

@GrenadeLauncher said:

Did I say that? Nope. I said he talked shit. Looking at your garbage, he excelled himself.

And yes, he's a fanboy. Just like you. I'm surprised you can read too: we said it would make minimal difference. Faster porting, mostly. It already has the API benefits PC CPUs don't enjoy.

Couldn't find your 1.2 million prediction for November, B4X. Fancy linking?

That's not really true; I've seen Krelian and Tormentos saying it won't do anything, while DX12 has some new features not being used by the Xbox One at the moment.

The same way lems say "PS4 has no games." Exaggeration (well, it is for cows, lems genuinely believe the shit they spout).

Sure, "PS4 has no games" is an exaggeration, but believe me, when Tormentos posts 50 replies saying the same thing, defending it with copy/paste text and links, throwing away hours of his life, it's not really exaggeration anymore.

#259  Edited By ronvalencia
Member since 2008 • 29612 Posts
@daious said:

@Tighaman said:

@daious: I never said esram was stacked memory. I said it was high-bandwidth memory in a small package, and high bandwidth is high-bandwidth memory

Thank you for proving that you don't know what Hynix and AMD's HBM is by claiming that having high bandwidth is the same as being HBM. I am glad we cleared this up. I see that your confusion comes from your reading comprehension and from not understanding that HBM is the name of a type of RAM, not a general statement meaning fast RAM.

HBM isn't a general term to describe fast things. It's a technology being co-developed by AMD and Hynix that will see the light of day in 2015.

Christ.

To make it really simple for you to understand, I came up with a hypothetical situation. Let's say Ford has a car called the "Super Fast Car (SFC)". If Honda has a fast car, does that mean it is a "Super Fast Car (SFC)"? No, it doesn't, because an "SFC" is a car made by Ford.

I can try to make a simpler comparison if you need me to.

HBM replaces JEDEC's GDDR5. AMD chairs the GDDR working group in JEDEC.

@Tighaman said:

@daious said:

@Tighaman said:

@daious: it's on the site, why do I have to copy and paste? They can't trademark or copyright HBM because it's a term everyone uses; they don't own that term, hence anyone can say it. I just gave you facts about the power consumption, the voltage, and the width of the bus. It's all the same. The only thing you have shown is that they don't own the phrase HBM. Thanks for your help lol

HBM DRAM is a standard and a specific type of RAM, you idiot. It's a specific type of DRAM. I just showed you the documents. I just showed you the JEDEC filing number and standard for HBM. Heck, HBM DRAM has its own Wikipedia article.

Again, you can't find a source to prove that you are right. You can't link a source. You can't provide any evidence. I gave you links. Meanwhile, you can't even give me one.

"Look at their website". I went through their entire conference and statements on the HBM conference. Link to their statements on their website that esram is HBM. Just give me the link.

I provided you with tons of sources that prove you're wrong. You can't even provide a source that says you are right.

That's what MS eSRAM is lol, HBM DRAM. You're the one trying to say it's only AMD and Hynix, when these are the same people who made the eSRAM; it's the same tech. Duck season, rabbit season, duck season, rabbit season, rabbit season, duck season........gets shot in the face lol

Xbox One's ESRAM is not HBM, i.e. HBM is based on DRAM, not SRAM. HBM is stacked DRAM.

GDDR5 is based on DRAM, not SRAM.

SRAM = http://en.wikipedia.org/wiki/Static_random-access_memory Doesn't need a periodic refresh function.

DRAM = http://en.wikipedia.org/wiki/Dynamic_random-access_memory Needs a periodic refresh function.

The AMD Embedded Radeon™ E8860 GPU has 2GB of on-package GDDR5. Microsoft selected the ESRAM solution, while AMD selected an embedded GDDR5 solution for its own products.

HBM is the next evolution after GDDR5.

If AMD were building its embedded solution with its own money, it would not build the Xbox One. Microsoft is addicted to small pools of high-speed RAM.

The ideal console with the X1's silicon budget would be two R7-260 GPUs with 2GB of GDDR5 on a 256-bit bus and 8GB of DDR3 on a 128-bit bus. Such a box would blow away Sony's PS4.

Two R7-260 GCN GPUs = 24 CUs. At 850MHz, that would result in 2.6 TFLOPS of GCN compute.
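The 2.6 TFLOPS figure can be checked with the standard GCN peak-throughput formula (64 shader lanes per CU, 2 FLOPs per fused multiply-add per clock); a quick sketch, with the widely reported PS4 and Xbox One GPU configurations included for comparison:

```python
# Peak single-precision throughput for a GCN GPU:
# CUs x 64 lanes per CU x 2 FLOPs per FMA x clock in GHz = GFLOPS.
def gcn_peak_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(gcn_peak_tflops(24, 0.850))  # ~2.61 TFLOPS, matching the 2.6 figure above
print(gcn_peak_tflops(18, 0.800))  # PS4: 18 CUs at 800 MHz -> ~1.84 TFLOPS
print(gcn_peak_tflops(12, 0.853))  # Xbox One: 12 CUs at 853 MHz -> ~1.31 TFLOPS
```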

Microsoft is just being stupid.

#260 ronvalencia
Member since 2008 • 29612 Posts

@ttboy said:

Benchmark testing software company Futuremark has revealed it will add a test to its 3DMark product that will demo the graphical improvements in DirectX 12, which will ship as part of Windows 10.

The new API Overhead feature test will be the same demo that Microsoft used to show off the difference between DirectX 11 and DirectX 12 as part of its Windows 10 gaming presentation earlier this week. Futuremark states:

Games make thousands of draw calls per frame, but each one creates performance-limiting overhead for the CPU. APIs with less overhead can handle more draw calls and produce richer visuals. Microsoft's demo shows DirectX 11 and DirectX 12 running side by side on the same hardware. As the number of draw calls increases, DirectX 12 is able to handle greater complexity and detail while achieving higher frame rates than DirectX 11. We're thrilled that Microsoft chose our new API Overhead feature test to show the benefits of DirectX 12. And soon you will be able to test your own PC and see the difference yourself.

There's no word on exactly when the new test will be added to 3DMark. As we have previously reported, DirectX 12 will be exclusive to Windows 10 and will not be made available to Windows 7 and 8.1 users. More information about DirectX 12 is expected to be revealed in March as part of the annual Game Developers Conference.

http://www.windowscentral.com/upcoming-version-3dmark-will-include-benchmark-test-windows-10-and-directx-12?utm_source=wpc&utm_medium=twitter

Windows 7, 8.0 and 8.1 users will get a free upgrade to Windows 10.

#261  Edited By tormentos
Member since 2003 • 33793 Posts

@Tighaman said:

@daious: Fam, get the **** out of here, IT'S HYNIX TECH... and it's high-bandwidth memory. Yes, it's not stacked, but it's still HBM. The tech MS is using for their eSRAM is HBM made by HYNIX, lil guy. Don't come to me trying to sound smart, you're only hurting yourself... you sound like torm, and that's a complete ass.

ESRAM isn't stacked memory, you blind biased MORON. It isn't; I already argued that with you. Stacked memory isn't here yet. HBM is stacked memory, ESRAM is embedded memory; it's not the same or fu**ing close..

No it's not..

High-Bandwidth Memory, otherwise known as HBM, is a form of stacked DRAM designed to sit on the same package as a processor.

According to the slides, Hynix's first-gen implementation stacks four DRAM dies on top of a single base layer.The dies are linked by vertical channels called through-silicon vias. By my count, there are 256 of those per slice, each one capable of transmitting at 1Gbps. That gives the four-way KGSD, or Known Good Stacked Die, a staggering 128GB/s of total bandwidth. For perspective, consider that the memory interface on the GeForce GTX 750 Ti tops out at just 86GB/s.

http://techreport.com/news/27129/hynix-slides-tease-vertically-stacked-memory-with-256gb-s-of-bandwidth
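The 128GB/s figure in the quoted passage follows directly from the slide numbers it cites (256 through-silicon vias per slice, 1Gbps each, four slices per stack); a quick sanity check:

```python
# HBM gen-1 bandwidth from the Hynix figures quoted above:
# 256 TSV data channels per DRAM slice, each at 1 Gbit/s, 4 slices per stack.
channels_per_slice = 256
gbit_per_channel = 1.0  # Gbit/s per channel
slices = 4

per_slice_gbytes = channels_per_slice * gbit_per_channel / 8  # 32 GB/s per slice
stack_gbytes = per_slice_gbytes * slices                      # 128 GB/s per stack
print(stack_gbytes)  # 128.0 -- vs the 86 GB/s quoted for a GTX 750 Ti
```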

Stacked memory is memory stacked vertically on top of one another, linked by vertical channels.

Do you see any stacked memory here? Do you?

If it isn't stacked, it isn't HBM, period...

@Tighaman said:

@daious: man, if you don't stop with these wack-ass analogies I'm going to slap you myself. EcoBoost is the same tech whether it's in a Ford Fusion or the new GT40; it's still the same tech by the same people. Esram is the same tech as HBM, just not stacked, made and designed by the same people, which is AMD and Hynix. Stop making yourself look foolish. First it was a Honda and Ford analogy, and when that sounded dumb you tried a Ford and Ford analogy, and it got plain ridiculous lol

It's not, and you are a moron. HBM is stacked memory.

#262 GrenadeLauncher
Member since 2004 • 6843 Posts

@FastRobby said:

Sure, "PS4 has no games" is an exaggeration, but believe me, when Tormentos posts 50 replies saying the same thing, defending it with copy/paste text and links, throwing away hours of his life, it's not really exaggeration anymore.

That's his issue. He does understate what DX12 will do, though. Better than overstating it immensely and being disappointed.

#263  Edited By tormentos
Member since 2003 • 33793 Posts

@slimdogmilionar said:

@tormentos: whatever man. You really have no idea what you are talking about. GPGPU is there to free up the CPU, and if you think the GPU in the PS4 has more compute power than Azure, all I can say to you is ignorance is bliss.

Yeah, the fact that you actually believe a console named after the company's flagship API won't be built to take advantage of it just shows your ignorance. That's like saying Apple would not build an iPhone around iOS. Everyone knows the Xb1 has more customization under the hood than the PS4. Are you really that much of an idiot to think that with DX12 and the Xb1 being developed at the same time, and with the lead DX engineer working on the Xbox, it would not support DX12? Come on, that's just stupid. It's been said over and over that the Xb1 has DX12 support; you even have developers questioning why M$ even launched the Xb1 with DX11. Did the 360 not launch with future DX tech inside?

I understand the problem of losing connection to the cloud, but as the numbers have shown, XB1 players are constantly online, more so than PS4 players. I honestly don't think anyone who bought an Xbox would buy it not to play online. So cloud connection probably won't be a problem for most X1 players.

Yeah, talk about Titanfall's res, but those frame drops you're referring to rarely happen. It's just a small flaw for you to try to strengthen your argument, but I don't see you complaining about the framerate drops in the majority of PS games. I guess frame drops are only bad on Xbox; when they happen on PS4, cows are like "Sony gave us inconsistent frames to enhance gameplay."

Yeah, Resistance 2 had 20 vs 20 in 2006, but it had nowhere near the amount of stuff Titanfall has going on right now.

I'm not talking about streaming graphics; I'm talking about extra CPU power so the Xb1 doesn't have to use GPGPU. Get that through your skull.

Yeah, Unreal supports all platforms, but like I said, DX12 is going to be the industry standard: PC, Xbox, and rumors of Google's Nexus 9 with Tegra K1 all supporting DX (DX11 confirmed). Apple is leaving OpenGL for Apple Metal, basically leaving OpenGL to just the PS4, though some games may still use it. This would make sense seeing as how Epic and Unity are both on board with DX. And I'll put this out there one more time: EPIC IS ON BOARD WITH THE CLOUD, SO IS NVIDIA, AND SO IS HAVOK.

Oh yeah, SR4. I think it's unanimous that that game was not well optimized, but if you wanna throw stones we can throw the RE remake in there, where the XB1 has better fps.

No, GPGPU is about running on the GPU certain CPU processes that are less efficient on the CPU, and that will run better on a GPU with a fraction of a CPU's power. There is GPGPU on PC too, and you don't need to offload anything from the CPU on PC; PCs have strong enough CPUs. Hell, most games on PC are GPU bound, not CPU bound.

Please stop with the whole naming-the-console-after-DirectX thing. DX has existed for years, and the Xbox 360 launched in a similar way with DX9. When DX10 came, the Xbox 360 supported some of its features and missed several others because it lacked the hardware, just like the Xbox One. And I am sure MS knew what DX10 was doing when they built the Xbox 360.

Consoles are built in time frames with what is available when their specs are closed. For the Xbox One that was 2011, which is why GCN is the GPU in it and not something much newer. Like the 360, the One will miss several DX12 features, and nowhere did Phil confirm full compatibility with DX12; he basically sidestepped the question with a weak response.

Really, the Xbox 360 came out in November 2005; DX10 came in November 2006, and the 360 wasn't fully compatible with it even though it released just one year later. DX12 hits in late 2015 and the Xbox One launched in 2013, so the gap between the console and the API will be even bigger than the one between the Xbox 360 and DX10.

The Xbox One is GCN. GCN is not fully compatible with DX12; Maxwell is, and sadly the Xbox One doesn't have a GPU related to it.

The Xbox One didn't launch with DX11; it is DX11.X, a leaner, more efficient version with all the CPU overhead cut off, just like the Xbox 360 had before it. What DX12 brings to the table will benefit PC; the Xbox One already had those gains, and some things it will never get.

Rarely happen, my ass. When you have multiple Titans in one place the frame rate drops like a stone to 35FPS, and it's 792p, not even 900p, for Christ's sake. It is the biggest proof that the cloud will do nothing for graphics.

You don't get it: 20 vs 20 on big maps, lag free and smooth as butter. Gears of War could not even properly do 4 vs 4 in 2006 without lag, host-advantage issues and crap like that.

Warhawk: 16 vs 16 in 2007, huge stages with flying ships, tanks, jeeps, jet packs, again smooth as butter. Resistance 2: 30 vs 30, 60 players online, 8-player co-op at a time when the max was 4.

All you need for that is dedicated servers, not a cloud..

Google's Android runs on a Linux kernel, not Windows.

Hahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

Google launches Android 5.0 "Lollipop" with full ARMv8-A 64 bit and OpenGL 3.1 + AEP support on the Nexus 9 tablet powered by the NVIDIA® Tegra® K1 64 bit mobile processor.

At GDC in March, Khronos announced OpenGL ES 3.1 which brought a host of new graphics features to mobile 3D developers. This was a great step but NVIDIA knew that developers wanted more. Google answered the call with the Android Extension Pack (announced at Google IO in June) and NVIDIA is the first vendor to support AEP; AEP is available on the Nexus 9 and will be on SHIELD tablet when it updates to Android 5.0.

AEP adds full-scale OpenGL functionality and extensions including geometry shaders and tessellation. With OpenGL ES 3.1 and AEP, mobile developers can get desktop level rendering and implement advanced rendering methods such as global illumination and some techniques from AZDO. For those wanting to profile or debug 3D games and apps, be sure to check out NVIDIA's free Tegra Graphics Debugger.

https://developer.nvidia.com/content/android-lollipop-and-nexus-9-launch-officially-unveiling-nvidia-tegra-k1-64-bit

There is no need for DX12 on Android; it runs a Linux kernel and OpenGL.

Keep quoting misterxmedia-style crap and you will continue to look like a complete buffoon...

The PS4 doesn't run on OpenGL; it has its own API called libgnm. What the PS4 has is OpenGL 4.4+ functionality, which means all the features OpenGL 4.4 has, the PS4 has too.

No, it was unanimous around you fanboys. Funny how when the PS4 drops frames it's "oh, the weak CPU", but when the Xbox One does it, it's bad optimization..hahahaha

#266 Krelian-co
Member since 2006 • 13274 Posts

@xboxiphoneps3 said:

@Tighaman: uh no, the ESRAM pool is not stacked memory. It's simply a pool of ESRAM, which happens to be fast memory, but there is only 32 MB of it.

LOLOLOLOL

The moron is arguing esram is stacked memory? Well, I wouldn't be surprised; after all, all he does is make up bullshit and spread false information.

#267 Krelian-co
Member since 2006 • 13274 Posts

@Tighaman said:

@daious said:

@Tighaman said:

@daious: it's on the site, why do I have to copy and paste? They can't trademark or copyright HBM because it's a term everyone uses; they don't own that term, hence anyone can say it. I just gave you facts about the power consumption, the voltage, and the width of the bus. It's all the same. The only thing you have shown is that they don't own the phrase HBM. Thanks for your help lol

HBM DRAM is a standard and a specific type of RAM, you idiot. It's a specific type of DRAM. I just showed you the documents. I just showed you the JEDEC filing number and standard for HBM. Heck, HBM DRAM has its own Wikipedia article.

Again, you can't find a source to prove that you are right. You can't link a source. You can't provide any evidence. I gave you links. Meanwhile, you can't even give me one.

"Look at their website". I went through their entire conference and statements on the HBM conference. Link to their statements on their website that esram is HBM. Just give me the link.

I provided you with tons of sources that prove you're wrong. You can't even provide a source that says you are right.

That's what MS eSRAM is lol, HBM DRAM. You're the one trying to say it's only AMD and Hynix, when these are the same people who made the eSRAM; it's the same tech. Duck season, rabbit season, duck season, rabbit season, rabbit season, duck season........gets shot in the face lol

Your stupidity has reached levels so high I feel bad for you.

#269 04dcarraher
Member since 2004 • 23857 Posts

@xboxiphoneps3 said:

ESRAM is not HBM or, in other terms, stacked memory. It is simply just ESRAM, a pool of ESRAM which happens to be quite fast memory, but there is only 32MB of it. It is not a stacked memory module. Tighaman, you have no idea what you're saying, and really all you do is spew nonsense; no one takes you seriously anyway.

lol, if it were stacked it would be two stacked 16MB dies or four stacked 8MB memory dies, which would be a waste. If they were using HBM-based memory they would have gone with a hell of a lot more memory per die, totaling at least 1GB to 4GB.

#270  Edited By deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@GrenadeLauncher said:

@FastRobby said:

Sure, "PS4 has no games" is an exaggeration, but believe me, when Tormentos posts 50 replies saying the same thing, defending it with copy/paste text and links, throwing away hours of his life, it's not really exaggeration anymore.

That's his issue. He does understate what DX12 will do, though. Better than overstating it immensely and being disappointed.

He doesn't know anything, and neither do we. Are we developers? Are we using DX12? Because as far as I know, DX12 is still under embargo, and only given to some chosen developers who can't say anything about the new stuff. The only thing we know is some developers saying some things about the performance of it, that's all.

#271  Edited By tormentos
Member since 2003 • 33793 Posts

@FastRobby said:

He doesn't know anything, and neither do we. Are we developers? Are we using DX12? Because as far as I know, DX12 is still under embargo, and only given to some chosen developers who can't say anything about the new stuff. The only thing we know is some developers saying some things about the performance of it, that's all.

I know this for a fact:

DX12 = console API gains brought to PC, plus some new hardware features the XBO GPU doesn't support.

Mantle does the same, has been here for some time now, and AMD isn't even a software company.

The Xbox One and PS4, like the PS3 and Xbox 360, have lower CPU overhead and have been able to do more draw calls than PC for years.

Consoles have had lower CPU overhead than PC for endless years; even AMD knew it and spoke of it.

AMD blamed MS for the overhead in 2011; AMD even implied that there would be no more DX.

But Mantle is out, has been for some time, and does exactly the same thing. So yeah, developers know what it will do, just like CD Projekt talked about it not helping the Xbox One in the way you lemmings claim.

I don't need to be a developer; all I need is to not be a blind buffoon like you lemmings are.
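The draw-call claim in the list above can be illustrated with a toy model: treat every draw call as a fixed chunk of CPU-side API overhead plus its real work, and count how many calls fit in a 60 fps frame budget. All the microsecond numbers here are made up for illustration; they are not measured DX11/DX12/Mantle timings:

```python
# Toy model of per-draw-call CPU overhead. The per-call costs are
# invented for illustration, not real DX11/DX12/Mantle measurements.

def draws_per_frame(budget_ms, overhead_us, work_us):
    """How many draw calls fit in a frame, given per-call CPU cost."""
    per_call_us = overhead_us + work_us
    return int(budget_ms * 1000 // per_call_us)

FRAME_MS = 16.6   # ~60 fps frame budget
WORK_US = 2.0     # assumed non-API work per draw call

thick_api = draws_per_frame(FRAME_MS, overhead_us=50.0, work_us=WORK_US)
thin_api = draws_per_frame(FRAME_MS, overhead_us=5.0, work_us=WORK_US)

# Shrinking the fixed API overhead is the whole pitch of the console
# APIs, Mantle, and DX12: same frame budget, far more draw calls.
assert thin_api > thick_api
print(thick_api, thin_api)
```

And if the overhead term already started small on console, as the list argues it did, cutting it again buys almost nothing, which is exactly the disagreement in this thread.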


#272 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

I'm not reading the post above me; I just know he quoted me. Let me guess: he knows best, he knows everything, I'm an idiot and stupid.

That probably sums it up about right


#273 Krelian-co
Member since 2006 • 13274 Posts

@FastRobby said:

I'm not reading the post above me, I just know he quoted me. Let me guess, he knows best, he knows everything, I'm an idiot and stupid.

Sums it up quite about right probably

Yeah, you are an idiot and stupid, you are right about that; glad to see you are finally admitting it.


#275 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

NYCrookedSmile @JCrookedSmile

@digitalsyrup @Xone_br33 @XboxOneThuth @Nahkapukki @XBGameON older and new games can be upped to 1080p60 / 4K. Keep it as that ;)

Btw he seems to work at Microsoft Research: @MSFTresearch Employee and Game Concept art designer at ACC.


#276 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

http://gamingbolt.com/evolve-dev-about-ps4xbox-one-parity-console-wars-being-tedious

“Parity” seems to be one of the more problematic issues with today’s current gen consoles. When Turtle Rock Studios said it would be going with graphical parity for the Xbox One and PS4 versions, what influenced the decision especially when the PS4 is more powerful than the Xbox One?

GamingBolt’s Kurtis Simpson recently spoke to producer Chloe Skew who said that, “Well…I could find a lot of people that would disagree with you (laughter). But we want it to be an equivalent experience…console wars are so tedious. It’s like if you like that controller better, if you feel like that’s stronger than do what you want to do but we want to provide an equal experience for everyone.”

Skew further said that it’s the core experience that matters. “I think that getting into adapting things for different systems, you know PC is always going to win.

“But you know we do have parity between all three and I think that’s really helpful. You know it kind of silences that ‘Oh such and such is better’.” As for the Twitter wars about which is better, Skew said, “Yeah, nobody needs that. It’s better to talk about which characters you like more.”

Read more at http://gamingbolt.com/evolve-dev-about-ps4xbox-one-parity-console-wars-being-tedious#2YgblycBc4Du72lm.99


#277 B4X
Member since 2014 • 5660 Posts

What I wouldn't give for Microsoft to send out a BIOS flash for the GPU in the Xbox One with DX12 instructions on die that they have hidden....

To destroy the people that think they know either way what DX12 and X1 are going to do. PLEASE HAPPEN!!! For the ownage, dear lord... for the ownage.

An entire fan base basing an entire defense around a vague tweet from Phil Spencer... an entire fan base basing its debates around secret sauce, acting as if they wrote the code.. :D

As Brad Wardell said... beware the crow. If this shit comes to fruition... whether DX12 is a huge increase and they have been hiding it the whole time, or it does jack shit and the naysayers are right in their assumptions.

The ownage is going to be monumental... Then you will realize why people like me don't say shit about stuff like this.

Hell, I don't think Gamespot has ever even justified this DX12 debate with an article. Not even Eddie. They know it's thin ice and there is zero solid evidence to support any of the armchair claims.

The CBOAT/MisterX believers of the world just might be riding on a Titanic, sipping the kool-aid, never seeing the iceberg coming right at them.

Megaton ownage would be an understatement. Shit-talking know-it-alls will never live it down... There will be plenty of people to bump your old threads to never let you forget. It could become an actual holiday of remembrance in the gaming community.

Every time you make a claim about something, anything, all anyone is going to have to say is "You remember that DX12 debate," and you're owned. You have no credibility. You went to that well.

Keep them shovels digging, boys, on either side of this debate. The ownage that is fixing to hit either way will be glorious. There's no win in supporting either side of this debate. You know why: we don't know what it's going to do.

If DX12 does nothing for the X1 = you lose.

If DX12 increases the performance = you lose.

I would love to see CBOAT backing fanboys on Gaf eat a massive crow cock though..

The neutral guys are just watching this debate from the fences licking their lips, itching to pounce. I know you can feel it.

Their eyes are on you. Like a fat guy at a Buffet.

Carry on.


#278  Edited By Daious
Member since 2013 • 2315 Posts

@Tighaman I also had a lovely conversation with one of the RAM specialists at Hynix

I posted an excerpt from the conversation.


#279 delta3074
Member since 2007 • 20003 Posts

@Krelian-co said:

So, microsoft *jazzhands* look it is "teh cloud" before release, lems go crazy, but teh secret sauce, uploading dem graphix to teh cloud, end up being nothing more than multiplayer servers, now microsoft wants people to forget and try to hype their new word *jazzhands* look it is teh directx12! lems go crazy, spread misinformation, ignorance and lies yet again.

Lems, lems never change.

What the **** are you talking about? They already used the cloud on XB1 for more than just multiplayer servers; it handles the AI for Titanfall, for starters.

Learn what you are actually talking about before you speak, sunshine, because as far as the developers are concerned the cloud can be used for far more than just multiplayer servers, and it already has been in certain cases.

You really do talk an utter load of bollocks, sunshine.

Besides, it's no worse than you Cow idiots last gen with 'teh Cell makes the PS3 a lot more powerful than the 360.'

You guys were talking about the secret unlocked power of 'teh Cell' for the first 3 years of the 7th gen.

And it turned out the PS3 was only 'marginally' more powerful than the 360, lol.

Lems and Cows, NEITHER of them ever change.


#280 The_Last_Ride
Member since 2004 • 76371 Posts

@tymeservesfate: DX12 won't have any effect....

Link


#281  Edited By tormentos
Member since 2003 • 33793 Posts

@kuu2 said:

Still don't know why anyone bothers to respond to Tomato?

The guy is the biggest phony on SW.

Actually that would be blackace, followed by several of you.

@ttboy said:

NYCrookedSmile‏@JCrookedSmile

@digitalsyrup@Xone_br33@XboxOneThuth@Nahkapukki@XBGameON older and new games can be upped to 1080p60 / 4K. Keep it as that ;)

Btw he seems to work at

Microsoft Research

: @MSFTresearch Employee and Game Concept art designer at ACC.

He is a douchebag and a fake; look at his tweets and you will soon see that. But since your desperation is so big, I guess you will believe any shit anyone says on the net..lol

@ttboy said:

http://gamingbolt.com/evolve-dev-about-ps4xbox-one-parity-console-wars-being-tedious

“Parity” seems to be one of the more problematic issues with today’s current gen consoles. When Turtle Rock Studios said it would be going with graphical parity for the Xbox One and PS4 versions, what influenced the decision especially when the PS4 is more powerful than the Xbox One?

GamingBolt’s Kurtis Simpson recently spoke to producer Chloe Skew who said that, “Well…I could find a lot of people that would disagree with you (laughter). But we want it to be an equivalent experience…console wars are so tedious. It’s like if you like that controller better, if you feel like that’s stronger than do what you want to do but we want to provide an equal experience for everyone.”

Skew further said that it’s the core experience that matters. “I think that getting into adapting things for different systems, you know PC is always going to win.

But you know we do have parity between all three and I think that’s really helpful. You know it kind of silences that ‘Oh such and such is better’.” As for the Twitter wars about which is better, Skew said, “Yeah, nobody needs that. It’s better to talk about which characters you like more.”

Read more at http://gamingbolt.com/evolve-dev-about-ps4xbox-one-parity-console-wars-being-tedious#2YgblycBc4Du72lm.99

There you have it, the XBO as powerful as an R290X, parity achieved...hahaha

@b4x said:

What I wouldn't give for Microsoft to send out a bios flash for the GPU in the Xbox One with Dx12 instructions on die that they have hid....

To destroy the people that think they know either way what Dx12 and X1 is going to do. PLEASE HAPPEN!!! for the ownage dear lord... for the ownage.

For an entire fan base to base an entire defense around a vague tweet from Phil Spencer... For entire fan base basing their debates around secret sauce. acting as if they wrote the code.. :D

As Brad Wardell said... Beware the Crow. If this shit comes to fruition... that Dx12 is a huge increase or does jack shit in any way.. and they have been hiding it the whole time or the naysayers are right in their assumptions.

The ownage is going to be monumental... Then you will realize why people like me don't say shit about stuff like this.

Hell I don't think Gamespot has ever even Justified this Dx12 debate with an article. Not even Eddie. They know it's thin Ice and there is zero solid evidence to support any of the armchair claims.

The CBOAT/ MisterX believers of the world just might be riding on a Titanic. Sipping the kool-aid never seeing the Iceberg coming right at them.

Megaton Ownage would be an understatement. Shit talking, know it all's will never live it down... There will be plenty of people to bump your old threads to never let you forget. It could become an actual holiday of remembrance in the gaming community.

Every time you make a claim about something / anything... all anyone is going to have to say.. " You remember that DX12 debate." and You're owned. You have no credibility. You went to that well.

Keep them shovels digging boys for either side of this debate. The ownage that is fixing hit either way will be glorious. There's no win in supporting either side of this debate. You know why.. We don't know what it's going to do.

If DX12 does nothing for the X1 = you lose.

If DX12 increases the performance = you lose.

I would love to see CBOAT backing fanboys on Gaf eat a massive crow cock though..

The neutral guys are just watching this debate from the fences licking their lips, itching to pounce. I know you can feel it.

Their eyes are on you. Like a fat guy at a Buffet.

Carry on.

Oh yes we do. It is lemmings who pretend it will do things not done before. DX12 is MS's way of bringing console gains in CPU overhead and other tricks to PC; even MS itself has stated this, and multiple sites state it as well.

What DX12 will do is make porting games between PC and Xbox One a little easier, which wasn't hard in the first place, because most games were built with DX and the Xbox consoles use it as well; the console version is just more streamlined and efficient, which is why MS is bringing it to PC.

DX12 is a direct response to Mantle, which has already been out for some time and will be more than a year old when DX12 arrives.


#282  Edited By lglz1337
Member since 2013 • 4959 Posts

Lemmings' hopes and dreams really made me almost crash my car!

Spencer is more genuine than Greenberg, that much is obvious


#283 tormentos
Member since 2003 • 33793 Posts

@delta3074 said:

what the **** are you talking about? they already used the cloud on XB1 for more than just multiplayer servers, it handles the AI for titanfall for starters/

Learn what you are actually talking about before you speak sunshine because as far as thew Developers are concerned the cloud can be used for far more than just multiplayer servers and it already has been in certain cases.

You really do talk an utter load of bollocks sunshine.

Besides, it's no worse than you Cow idiots last gen with 'teh cell makes the Ps3 a lot more powerful than the 360'

You guys where talking about the Secret unlocked power of 'teh cell' for the first 3 years of the 7th gen

And it turned out the Ps3 was only 'marginally' more powerful than the 360,lol.,

Lema and Cows , NEITHER of them ever change.

But it didn't improve the Xbox One's graphics, which is the point always argued here, because the cloud can't do that.

You know that bold part actually confirms Cell was a beast. How can the PS3 turn out only marginally more powerful than the Xbox 360 when it had an inferior GPU?

In that case it wasn't secret sauce, it was a great CPU which could aid the GPU in ways the Xbox 360's CPU could not help Xenos. Doing AA on the Xbox 360's CPU would have been a waste of resources, because 4XAA was supposed to be free on Xbox 360, and it turned out not even 2XAA was entirely free.

The things Cell did to aid the GPU were quite new. The problem is that the Xbox One doesn't have a Cell-like CPU; it has the same Jaguar as the PS4 and a considerably weaker GPU, with a memory structure that is cumbersome and problematic on top of that.

@daious said:

@Tighaman I also had a lovely conversation with one of the RAM specialists at Hynix

I posted an excerpt from the conversation.

Hahahaha, now that is what I call being owned. Let's see if the little man has the guts to admit being wrong...


#284 GrenadeLauncher
Member since 2004 • 6843 Posts

@FastRobby said:

@GrenadeLauncher said:

@FastRobby said:

Sure PS4 has no games is exaggeration, but believe me when Tormentos posts 50 reply's saying the same thing, defending it with copy/paste text, links, throwing away hours of his life, it's not really exaggeration anymore.

That's his issue. He does understate what DX12 will do though. Better than overstating it immensely and being disappointed.

He doesn't know anything, and neither do we. Are we developers? Are we using DX12? Because as far as I know, DX12 is still under embargo, and only given to some chosen developers who can't say anything about the new stuff. The only thing we know is some developers saying some things about the performance of it, that's all.

We sure aren't. All we can do is rely on what the experts say, and the experts are currently on the line of thinking that I'm arguing.


#285  Edited By B4X
Member since 2014 • 5660 Posts

@tormentos said:

@b4x said:

What I wouldn't give for Microsoft to send out a bios flash for the GPU in the Xbox One with Dx12 instructions on die that they have hid....

To destroy the people that think they know either way what Dx12 and X1 is going to do. PLEASE HAPPEN!!! for the ownage dear lord... for the ownage.

For an entire fan base to base an entire defense around a vague tweet from Phil Spencer... For entire fan base basing their debates around secret sauce. acting as if they wrote the code.. :D

As Brad Wardell said... Beware the Crow. If this shit comes to fruition... that Dx12 is a huge increase or does jack shit in any way.. and they have been hiding it the whole time or the naysayers are right in their assumptions.

The ownage is going to be monumental... Then you will realize why people like me don't say shit about stuff like this.

Hell I don't think Gamespot has ever even Justified this Dx12 debate with an article. Not even Eddie. They know it's thin Ice and there is zero solid evidence to support any of the armchair claims.

The CBOAT/ MisterX believers of the world just might be riding on a Titanic. Sipping the kool-aid never seeing the Iceberg coming right at them.

Megaton Ownage would be an understatement. Shit talking, know it all's will never live it down... There will be plenty of people to bump your old threads to never let you forget. It could become an actual holiday of remembrance in the gaming community.

Every time you make a claim about something / anything... all anyone is going to have to say.. " You remember that DX12 debate." and You're owned. You have no credibility. You went to that well.

Keep them shovels digging boys for either side of this debate. The ownage that is fixing hit either way will be glorious. There's no win in supporting either side of this debate. You know why.. We don't know what it's going to do.

If DX12 does nothing for the X1 = you lose.

If DX12 increases the performance = you lose.

I would love to see CBOAT backing fanboys on Gaf eat a massive crow cock though..

The neutral guys are just watching this debate from the fences licking their lips, itching to pounce. I know you can feel it.

Their eyes are on you. Like a fat guy at a Buffet.

Carry on.

Oh yes we do it is lemming who pretend it will do things not done before,DX12 is MS way of bringing console gains in CPU over head and other tricks to PC,even MS it self have stated this multiple sites state this as well.

DX12 what will do is ease a little porting games between PC and xbox one,which wasn't hard in the first place because most games were build with DX and the xbox consoles use it as well,the one on consoles is juts more streamline and efficient,which is why MS is bringing it to PC.

DX12 is a direct response to Mantle which is out already for some time and will be more than a year old when DX12 arrives.

Show me one article from a respected site or developer that does not lean on the Phil Spencer tweet as credence. Show me one game on the X1 built on a 100% DX12 engine.

CryEngine, Unreal 4, and Unity are the ONLY engines with announced DX12 support at this time... There are no released game engines built on DX12 yet.

Show me Microsoft stating it multiple times, as you said, without referring to the vague Phil Spencer tweets.

They have been working on DX12 for 5 years at Microsoft. "We knew what was in DX12 when we built the Xbox One" - Phil Spencer. How long has AMD been working on Mantle? No one is trying to say Mantle and DX aren't trying to achieve the same goals. I have no clue when AMD started the Mantle program.

It's all MisterX/CBOAT garbage right now. To say different is a lie.

I read about this shit every day.... I 100% know what these guys base their arguments around.... It's all BS till Microsoft states CLEARLY one way or the other.

These guys don't know their ass from a hole in the ground when it comes to DX12 and X1... only the inner circle does, and they ain't talking in a clear language.


#286  Edited By tormentos
Member since 2003 • 33793 Posts

@b4x said:

Show me one article from a respected site or developer.... that does not refer to the Phil Spencer tweet as credence. Show me one game on the X1 built on a 100% DX12 engine.

Crytec, Unreal4, Unity are the ONLY supported Dx12 engines at this time... There are no game engines made on DX12 yet. That are released.

Show me Microsoft stating it multiple times, as you said..... without referring to the Phil Spencer vague tweet.

They have been working on DX12 for 5 Years at Microsoft. "We knew what was in DX12 when we built the Xbox One" Phil Spencer. How long has AMD been working on Mantle. No one is trying to say mantel and DX aren't trying to achieve the same goals. I have no clue when AMD started the mantle program.

It's all MisterX/Cboat garbage right now. To say different, is a lie.

I read about this shit everyday.... I 100% know what these guys base their arguments around.... It's all BS till Microsoft states CLEARLY one way or the other.

These guys don't know their ass from a hole in the ground when it comes to DX12... only the inter circle does and they ain't talking.

All games on Xbox One are already built around what DX12 offers, dude, because what DX12 brings to PC is something consoles have had for years. And that is the problem you people have: you think it is something new, and it is not.

We implemented it on Xbox 360 and had a whole lot of ideas on how to make that more efficient [and with] a cleaner API, so we took that opportunity with Xbox One and with our customised command processor we've created extensions on top of D3D which fit very nicely into the D3D model and this is something that we'd like to integrate back into mainline 3D on the PC too - this small, very low-level, very efficient object-orientated submission of your draw [and state] commands.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

We’re continually innovating in areas of performance, functionality and debug and performance tooling for Xbox One. We’re also working with our ISV and IHV partners on future efforts, including bringing the lightweight runtime and tooling capabilities of the Xbox One Direct3D implementation to Windows, and identifying the next generation of advanced 3D graphics technologies.

http://blogs.windows.com/buildingapps/2013/10/14/raising-the-bar-with-direct3d/

This ^^ was from October 2013, when DX12 wasn't even being talked about..

Microsoft has confirmed the existence of DirectX 12. As expected, it will be unveiled at GDC 2014 on March 20. It would appear that all the usual suspects are already on board — the use of their logos on the DirectX 12 website strongly suggests that AMD, Nvidia, Intel, and Qualcomm are all preparing DX12 hardware. Considering DX10 was released in 2006, and DX11 was released in 2009, DX12 is incredibly overdue. All signs point to DirectX 12 being a direct competitor to AMD’s recently released Mantle API, and in all likelihood will stymie AMD’s attempt to gain a critical mass of game developers.

http://1clicknews.com/microsoft-confirms-long-overdue-directx-12-will-be-unveiled-at-gdc/

This was in March 2014. Mantle was already implemented in BF4 by 2013, and DX12 would probably not be implemented until 2016. It is impossible that Mantle was out and DX12 wasn't if they really had been working on DX12 for 5 years; that is a joke, and basically another of MS's lies to try to make it seem like DX12 wasn't just a direct response to Mantle.

It doesn't take a rocket scientist to know what DX12 is and what it will do. It has been on Xbox One since day 1, aside from the new hardware features, which the Xbox One doesn't support because it lacks the hardware..

DX12 doesn't have 5 years in the making, and it is a hell of a lot late because it basically started in 2013, when they saw AMD was coming with Mantle. In fact, Mantle beat DX12 to the market, which is a joke, since AMD is not a software company at heart and MS is.

Even if we go by MS's DX10-to-DX11 time frame, development of DX12 would have started in late 2012, and by that time the Xbox One hardware was already locked, much like what happened with DX10 and the Xbox 360.

The Xbox One has a GPU that is not fully compatible with DX12, dude; that has been stated by MS itself. They should stop freaking contradicting themselves.

In fact, in that quote you have there, nowhere does Phil say "we are 100% DX12 compatible with the Xbox One," which is what he was asked. Instead of a yes-or-no answer or something direct, he chose to say "we knew what was in DX12 when we built the Xbox." Of course they did; the Xbox One has almost all the gains of DX12 except the ones that require hardware.

You people are confused because you don't know what DX12 is, even though MS can't stop screaming it..

Let me ask you this: if the Xbox One already had DX12 features, confirmed by MS itself, why hasn't DX12 been demoed on Xbox One?

The Xbox One has working DX12 features now, but somehow DX12 hasn't been demoed on Xbox One; it has been demoed on PC, or on a Surface 3, which is just another portable PC.

It is because running DX11 vs DX12 on PC would show some difference, because you are bringing something new; on Xbox One, since those gains have existed since day 1, they would not show up in any way.

That is the reason DX12 doesn't get shown off on Xbox One, even though the features are working right now, while on PC they are not, because it is not out.

We know that DX 12 will have a fairly strong impact on PC games when it launches late next year but will it have the same technical impact on the Xbox One? Interestingly, Engel revealed that, “The Xbox One already has an API which is similar to DirectX 12. So Microsoft implemented a driver that is similar to DirectX 12 already on the Xbox One. That freed up a lot of CPU time.

http://gamingbolt.com/dx12-to-allow-better-use-of-cpu-cores-xbox-one-already-has-similar-api-for-freeing-up-cpu#0KidsKcqzrZWqxgI.99

It is you people who don't want to read what is presented to you. The Xbox One has had those gains since day 1; it has them because DX11 on Xbox One wasn't the same as on PC, just like the 360 also had those gains.


#287  Edited By B4X
Member since 2014 • 5660 Posts

@tormentos said:

@b4x said:

Show me one article from a respected site or developer.... that does not refer to the Phil Spencer tweet as credence. Show me one game on the X1 built on a 100% DX12 engine.

Crytec, Unreal4, Unity are the ONLY supported Dx12 engines at this time... There are no game engines made on DX12 yet. That are released.

Show me Microsoft stating it multiple times, as you said..... without referring to the Phil Spencer vague tweet.

They have been working on DX12 for 5 Years at Microsoft. "We knew what was in DX12 when we built the Xbox One" Phil Spencer. How long has AMD been working on Mantle. No one is trying to say mantel and DX aren't trying to achieve the same goals. I have no clue when AMD started the mantle program.

It's all MisterX/Cboat garbage right now. To say different, is a lie.

I read about this shit everyday.... I 100% know what these guys base their arguments around.... It's all BS till Microsoft states CLEARLY one way or the other.

These guys don't know their ass from a hole in the ground when it comes to DX12... only the inter circle does and they ain't talking.

All games on xbox one are build around DX12 dude,because DX12 what bring to PC is something consoles have for years,and that is the problem you people have you think is something new is not.

We implemented it on Xbox 360 and had a whole lot of ideas on how to make that more efficient [and with] a cleaner API, so we took that opportunity with Xbox One and with our customised command processor we've created extensions on top of D3D which fit very nicely into the D3D model and this is something that we'd like to integrate back into mainline 3D on the PC too - this small, very low-level, very efficient object-orientated submission of your draw [and state] commands.

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

We’re continually innovating in areas of performance, functionality and debug and performance tooling for Xbox One. We’re also working with our ISV and IHV partners on future efforts, including bringing the lightweight runtime and tooling capabilities of the Xbox One Direct3D implementation to Windows, and identifying the next generation of advanced 3D graphics technologies.

http://blogs.windows.com/buildingapps/2013/10/14/raising-the-bar-with-direct3d/

This ^^ was on October 2013 when DX12 wasn't even talk about..

Microsoft has confirmed the existence of DirectX 12. As expected, it will be unveiled at GDC 2014 on March 20. It would appear that all the usual suspects are already on board — the use of their logos on the DirectX 12 website strongly suggests that AMD, Nvidia, Intel, and Qualcomm are all preparing DX12 hardware. Considering DX10 was released in 2006, and DX11 was released in 2009, DX12 is incredibly overdue. All signs point to DirectX 12 being a direct competitor to AMD’s recently released Mantle API, and in all likelihood will stymie AMD’s attempt to gain a critical mass of game developers.

http://1clicknews.com/microsoft-confirms-long-overdue-directx-12-will-be-unveiled-at-gdc/

This was on march 2014,Mantle was already implemented in BF4 by 2013,DX12 would not probably be implemented until 2016,is impossible that mantle was out and DX12 wasn't if they really had been working on DX12 for 5 years,that is a joke,and basically another of MS lies to try to make seen like Mantle wasn't just a direct response to Mantle.

It doesn't take a rocket scientist to know what DX12 is and what will do,it has been on xbox one since day 1,aside from the new hardware features which the xbox one doesn't support because it lack it..

DX12 doesn't have 5 years in the making,and is a hell of allot late because it started basically in 2013,because they saw AMD was coming with Mantle,in fact Mantle beat DX12 to the market which is a joke since AMD is not a software company at heart MS is.

Even if we go by MS Dx10 to DX11 time frame,development of DX12 would have it starting in late 2012 by that time the xbox one hardware was on lock already,much like it happen with DX10 and the xbox 360.

The xbox one has a GPU not fully compatible with DX12 dude it is NOT stated by MS it self,they should stop freaking contradicting them self.

In fact that quote you have there of Phil in no place say we are 100% dx12 compatible with the xbox one which is what he was asked,instead of a yes or no answer or something direct he chose to say we know what DX12 when we build the xbox,off course they did the xbox one has almost all the gains on Dx12 but the ones that require hardware.

You people are confuse because you don't know what DX12 is,it even that MS can stop screaming it..

Let me ask you this: if the Xbox One already had DX12 features, confirmed by MS itself, why hasn't DX12 been demoed on the Xbox One?

The Xbox One has working DX12 features now, but somehow DX12 hasn't been demoed on the Xbox One; it has been demoed on PC, or on the Surface 3, which is just another portable PC.

It is because running DX11 versus DX12 on PC shows a difference, since you are bringing something new; on the Xbox One, where those gains have existed since day 1, they would not show up in any way.

That is the reason DX12 doesn't get shown off on the Xbox One even though the features are working right now, while on PC they are not, because it isn't out yet.

We know that DX12 will have a fairly strong impact on PC games when it launches late next year, but will it have the same technical impact on the Xbox One? Interestingly, Engel revealed that, “The Xbox One already has an API which is similar to DirectX 12. So Microsoft implemented a driver that is similar to DirectX 12 already on the Xbox One. That freed up a lot of CPU time.”

http://gamingbolt.com/dx12-to-allow-better-use-of-cpu-cores-xbox-one-already-has-similar-api-for-freeing-up-cpu#0KidsKcqzrZWqxgI.99

It is you people who don't want to read what is presented to you. The Xbox One has had those gains since day 1, because DX11 on the Xbox One wasn't the same as on PC, just like the 360 also had those gains.

Read that........... That wall of vagueness ... with YOUR spin on it..... did I actually see a 360 reference there?

You're the guy that fell for the Nov. NPD CBOAT 7-10k BS.. and ran with it like it was truth facts.

There is nothing there that clearly states what the X1 and DX12 will do.... nothing dude.

No... there is no DX12 game-engine game on the market... That's bullshit... I don't give a **** what the API can do or if DX12 instructions are in the X1's API.... If the engine isn't written specifically for DX12..... it doesn't matter. Guess what, dude: there are zero games on the market that are built on a DX12 engine. None.

You think last-gen port ENGINES have DX12 features... Is that the BS you're trying to feed me?

All I'm saying..... The Crow could be Mighty... and you're stepping right up to the table, fork in hand.

There is Zero evidence to support what DX12 and X1 will really do. None. The inner circle are talking in riddles.

To say anything other than that is 100% BS. When you get REAL facts.... let me know.

#288  Edited By 04dcarraher
Member since 2004 • 23857 Posts

el tormentos is at it some more

The XB1 is currently using a customized DX11 as its base API... they still can't overcome DX11's inefficiency of using a single core for the majority of the rendering pipeline. Suggesting the X1 is already using DX12, or a DX12-like API, is really dumb when we know for a fact that DX11 has flaws that limit a processor's ability to feed GPUs. The fact is that DX12 adds efficiency, allowing more to be done with less resources and letting all cores feed the GPU much better. The new tools and features of DX12 will help with the development of games, greatly improve the PC's communication with GPUs, and in turn free up more CPU cycles on the X1.
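The single-core bottleneck above is the key point. As a rough illustration, here is a toy model in Python (not real Direct3D code; `record_commands`, `submit_dx11_style`, and `submit_dx12_style` are made-up names): the DX12-style path lets several threads record their own command lists in parallel and then submits them in a fixed order, while the DX11-style path records everything on one thread.

```python
# Toy model of DX12-style multithreaded command-list recording vs a
# DX11-style single-threaded path. Not real Direct3D; names are made up.
from concurrent.futures import ThreadPoolExecutor

def record_commands(draws):
    # A "command list" here is just the draw commands it recorded.
    return [("draw", d) for d in draws]

def submit_dx11_style(draws):
    # One thread records the whole frame's command stream.
    return record_commands(draws)

def submit_dx12_style(draws, workers=4):
    # Each worker records its own command list in parallel; the lists
    # are then pushed to the queue in a fixed, deterministic order
    # (the ExecuteCommandLists idea).
    chunks = [draws[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        command_lists = list(pool.map(record_commands, chunks))
    queue = []
    for cl in command_lists:
        queue.extend(cl)
    return queue

frame = list(range(1000))
# Same total work reaches the GPU either way; only the CPU-side
# recording is spread across cores in the DX12-style path.
assert sorted(submit_dx11_style(frame)) == sorted(submit_dx12_style(frame))
```

The win isn't fewer commands; it's that recording them no longer serializes on one core.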

Engel said:

“I like it. It’s great and it’s a fantastic opportunity to raise the bar again. It works with the same piece of hardware, so it’s the same CPU and the same GPU, and certainly we have much more CPU time to spend."

“The workload on the CPU decreases substantially, because you can utilize more of the cores of the CPU. In this way you are less likely to be CPU limited. One of the cool features of modern games is that we have physics, and they have been traditionally implemented on the CPU and as a game developer you have to go back and ask ‘do I have to spend 40% of my CPU time’ on rendering or ‘can I reduce this so that I can use it for physics’ and this is one of the things that DirectX 12 allows you. This makes sure that the developers can get more out of the existing hardware.”

Needless to say, the X1 will see improvements on the CPU front, which will allow a smoother experience in games.
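The 40% figure in Engel's example is just frame-budget arithmetic. A quick sketch (the 40% comes from the quote; the 30 fps target and the "after" fraction are assumptions for illustration):

```python
# Frame-budget arithmetic for the quote above. The 40% comes from the
# quote; the 30 fps target and the "after" fraction are assumed.
FRAME_MS = 1000.0 / 30.0  # ~33.3 ms of CPU time per frame at 30 fps

def freed_for_physics(render_before, render_after):
    """CPU milliseconds per frame freed when render submission gets cheaper."""
    return FRAME_MS * (render_before - render_after)

gained = freed_for_physics(0.40, 0.20)  # rendering drops from 40% to 20%
print(round(gained, 1))  # ~6.7 ms/frame now available for physics
```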

#289  Edited By tormentos
Member since 2003 • 33793 Posts

@b4x said:

Read that........... That wall of vagueness ... with YOUR spin on it..... did I actually see a 360 reference there?

You're the guy that fell for the Nov. NPD CBOAT 7-10k BS.. and ran with it like it was truth facts.

There is nothing there that clearly states what the X1 and DX12 will do.... nothing dude.

No... there is no DX12 game-engine game on the market... That's bullshit... I don't give a **** what the API can do or if DX12 instructions are in the X1's API.... If the engine isn't written specifically for DX12..... it doesn't matter. Guess what, dude: there are zero games on the market that are built on a DX12 engine. None.

You think last-gen port ENGINES have DX12 features... Is that the BS you're trying to feed me?

All I'm saying..... The Crow could be Mighty... and you're stepping right up to the table, fork in hand.

There is Zero evidence to support what DX12 and X1 will really do. None. The inner circle are talking in riddles.

To say anything other than that is 100% BS. When you get REAL facts.... let me know.

Yes, that is a confirmation of something that has existed for years on consoles, buffoon, even on the 360.

What CBOAT said about NPD has nothing to do with this. You are the guy who claimed the Xbox One would surpass the PS4 this holiday in the US, so please: at least what I quoted came from someone who used to be credible and was banned over it. YOU, on the other hand, claimed that yourself..lol

There is no DX12 game engine on PC; on the Xbox One it has been there since day 1.

See, that bold part is a classic example of this...

And the fun part is how you pretend that you don't argue in favor of DX12 and its crap. Congratulations, you just contradicted yourself with your own actions..

@b4x said:

The ownage is going to be monumental... Then you will realize why people like me don't say shit about stuff like this.

@b4x said:

No... there is no DX12 game-engine game on the market... That's bullshit... I don't give a **** what the API can do or if DX12 instructions are in the X1's API.... If the engine isn't written specifically for DX12..... it doesn't matter. Guess what, dude: there are zero games on the market that are built on a DX12 engine. None.

See, you are saying something; you are arguing pro-DX12, so basically you are on the soon-to-be-owned list, as you yourself claimed..lol

That bullshit about how you don't talk about this stuff lasted 2 seconds when I quoted you, and it was enough to make you spark a whole fight and go into your usual meltdown mode...hahahahaa

Let's see: consoles have been doing 10,000 to 20,000 draw calls per frame while PC was doing 4,000 to 5,000, and anything beyond 6K or 7K would run into performance problems. Now PC with Mantle can do 100,000 draw calls, and soon with DX12 on PC as well. On the Xbox One and PS4, don't expect that much, because their CPUs aren't as fast or as strong as the ones on PC.
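Those draw-call numbers are really just per-call CPU overhead divided into the frame budget. A back-of-the-envelope sketch (the per-call costs are made-up orders of magnitude, not measurements from any real driver):

```python
# Back-of-the-envelope draw-call budgets. Per-call CPU costs are
# illustrative orders of magnitude, not benchmarks.
def max_draw_calls(frame_ms, cost_per_call_ms):
    """How many draw calls fit in one frame's CPU budget."""
    return int(frame_ms / cost_per_call_ms)

frame_ms = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps

heavy_api = 0.003   # ms per call: thick DX11-style validation path (assumed)
thin_api = 0.0001   # ms per call: thin Mantle/DX12-style path (assumed)

print(max_draw_calls(frame_ms, heavy_api))  # a few thousand calls
print(max_draw_calls(frame_ms, thin_api))   # hundreds of thousands
```

Shrinking the per-call cost is where the order-of-magnitude jump in the quoted numbers comes from.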

You think DX12 is a complete reinvention of the wheel, new code never done before. No, dude, it is code optimized to run in a much more efficient way on PC; on consoles it is already efficient.

Confetti CEO and co-founder Wolfgang Engel gave his thoughts on the same. The name may sound unfamiliar, but Engel worked as the lead graphics programmer for Rockstar's RAGE group, implementing graphics for titles like Grand Theft Auto 4, Red Dead Redemption, Midnight Club Los Angeles and many more.

Engel has since gone on to form Confetti, a middleware company that specializes in developing solutions for global illumination, Sky dome and Post FX system, and has consulted and worked on several other big name releases besides aiding studios in better understanding and implementing graphical techniques.

Interestingly, Engel revealed that, “The Xbox One already has an API which is similar to DirectX 12. So Microsoft implemented a driver that is similar to DirectX 12 already on the Xbox One. That freed up a lot of CPU time.”

Don't be BLIND B4X.

Meanwhile you have a programmer who worked at Rockstar on games as big as GTA4, who went on to co-found a middleware company that also supports the Xbox One, telling you that the Xbox One already has an API similar to DX12. That is because the gains DX12 brings to PC are mostly taken from consoles, you blind, biased fanboy.

More crow for your plate.

There is evidence of what DX12 will do, idiot; it is even being demoed by MS itself..hahahaha

#290 slimdogmilionar
Member since 2014 • 1345 Posts

@tormentos:

Qualcomm makes the SoCs for Windows Phones and the majority of Android phones. The Nexus 9 has a Tegra K1, which already has DX12 support. If Apple is using Metal and Qualcomm makes DX12 SoCs for Windows Phones, then that means the SoCs in Android phones will support DX12. It's already been confirmed that the new Nexus supports DX. So devs will have to develop games for OpenGL, DX12, and Apple Metal. Now, games will come to Apple first just because devs know that's where the money is, leaving a toss-up for second between DX and OpenGL. If Qualcomm already has an SoC that supports DX, and that same SoC is being used in Android phones, why would devs add more work for themselves porting the game to OpenGL?

This better explains it: http://www.developer-tech.com/news/2014/sep/22/metal-apple-may-destroy-opengl-and-boost-directx-12/

Oh yea LMAO at whoever said eSRAM and HBM are the same.

#291 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

So people are hyping up DX12 as some sort of MMX thing?

#292  Edited By B4X
Member since 2014 • 5660 Posts

@tormentos:

Keep talking to yourself... John Carmack says hi. If an engine is not built around an API... it doesn't matter.. The DX12 instructions were added to the X1 API to ease development of engines not built on the API itself. Refer to the Metro developer interview. You're full of shit if you say otherwise.

Why wouldn't I be pro something that improves a gaming experience?

Only a douchebag wouldn't....

So now GTA is a DX12 engine.....

The guy gave his THOUGHTS / opinions, or was it truth facts? Just like I gave my thoughts/opinions about holiday sales, which I was only off by 400k on in the end.. when the PS4 came into the holidays with a 1.3 million lead in the US. That's closer than your blind predictions all year that the PS4 would crush the X1 in holiday sales... Is it not?

You need serious help man. You are what is wrong with gaming... how could you possibly even be Against something that benefits gamers?

You know why... you're a fanboy... Not a gamer. This is your life... think about that.

You are consumed by it.

I'll admit I'm a huge Microsoft supporter... because I'm a gamer, and Steam has more games on PC, and Xbox has games too that I can't play elsewhere.. You will deprive yourself of games... for this petty crusade that consumes your very soul.

I bash the PS4 / Sony, whose consoles I have all owned and loved for 20 years, from the first day I played Ridge Racer on the PS1... because of people like you.. You make me sick.. The whole lot of you.

I do so with facts......... I don't shit where I eat.

#293  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Heirren said:

So people are hyping up DX12 as some sort of MMX thing?

An example,

DirectX 12 improves performance with descriptor heaps and tables, i.e. it allows resources to be dynamically indexed in shaders, providing additional flexibility and unlocking new rendering techniques. For example, a deferred rendering engine usually encodes a material or object identifier of some kind into the intermediate g-buffer. With Direct3D 11, these engines must be careful to avoid using too many materials, since including too many in one g-buffer can significantly slow down the final render pass. With dynamically indexed resources, a scene with a thousand materials can be finalized as quickly as one with only ten.

The scene complexity can increase without significantly impacting performance.
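The g-buffer example above can be modeled in a few lines. A toy Python sketch (no real D3D12 calls; the `materials` list stands in for a descriptor table, and both function names are made up): the dynamically indexed final pass touches each pixel once no matter how many materials exist, while the bind-per-material alternative re-walks the buffer once per material.

```python
# Toy model of dynamically indexed materials in a deferred final pass.
# No real D3D12 API; the materials list stands in for a descriptor table.

def final_pass_dynamic_index(gbuffer_ids, materials):
    # One pass: each pixel fetches its material via the index stored in
    # the g-buffer, like a shader indexing into a descriptor table.
    # Work is O(pixels), independent of the material count.
    return [materials[i]["albedo"] for i in gbuffer_ids]

def final_pass_per_material(gbuffer_ids, materials):
    # DX11-style alternative: rebind and re-walk once per material.
    # Work is O(pixels * materials), so it degrades as materials grow.
    out = [None] * len(gbuffer_ids)
    for mat_id, mat in enumerate(materials):
        for px, i in enumerate(gbuffer_ids):
            if i == mat_id:
                out[px] = mat["albedo"]
    return out

gbuffer = [i % 50 for i in range(10_000)]              # material ID per pixel
materials = [{"albedo": (m, m, m)} for m in range(50)] # 50 materials
assert final_pass_dynamic_index(gbuffer, materials) == \
       final_pass_per_material(gbuffer, materials)
```

Both produce the same image; only the cost model differs, which is the "a thousand materials as quickly as ten" claim.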

MMX, by contrast, is a 64-bit integer SIMD extension that aliases the x87 registers.

#294  Edited By iambatman7986
Member since 2013 • 4649 Posts

Does anyone actually think that DX12 is going to make the hardware in the X1 better? It'll make it easier to program for, but it is still a weaker console than its competition, which shows in most multiplat games.

Microsoft can't even agree as to whether it'll make a big difference.

#295 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

@ronvalencia said:

@Heirren said:

So people are hyping up DX12 as some sort of MMX thing?

An example,

DirectX 12 improves performance with descriptor heaps and tables, i.e. it allows resources to be dynamically indexed in shaders, providing additional flexibility and unlocking new rendering techniques. For example, a deferred rendering engine usually encodes a material or object identifier of some kind into the intermediate g-buffer. With Direct3D 11, these engines must be careful to avoid using too many materials, since including too many in one g-buffer can significantly slow down the final render pass. With dynamically indexed resources, a scene with a thousand materials can be finalized as quickly as one with only ten.

The scene complexity can increase without significantly impacting performance.

MMX, by contrast, is a 64-bit integer SIMD extension that aliases the x87 registers.

So if that is the case, then how are people downplaying this? I don't get all the terminology you use, but I understand the general logic. I remember when the 3DFX Voodoo arrived and all these cards were on the horizon. Basically, it was the time when graphics cards were becoming a thing. I had a relative getting ready to build a PC--which I did shortly after--but then I also remember installing MDK on a Pentium 133MHz computer and thinking to myself "wow, this looks really good," or much better than anything else that would run on the thing at the time.

#296 GrenadeLauncher
Member since 2004 • 6843 Posts

Jesus, lemmings, stop lying to yourselves. It isn't healthy.

#297 FoxbatAlpha
Member since 2009 • 10669 Posts

@GrenadeLauncher: I know it isn't healthy. I'm actually sick right now. Must......press............on.

#298 BlbecekBobecek
Member since 2006 • 2949 Posts

@ProtossX said:

dx12 = perfect gateway for pc ---> console

This is not how things work. It's been console ---> PC for years now. DX12 will probably make it easier (read: cheaper) to port stuff from the Xbone to PC, but that's about it.

#299  Edited By ronvalencia
Member since 2008 • 29612 Posts
@Heirren said:

@ronvalencia said:

@Heirren said:

So people are hyping up DX12 as some sort of MMX thing?

An example,

DirectX 12 improves performance with descriptor heaps and tables, i.e. it allows resources to be dynamically indexed in shaders, providing additional flexibility and unlocking new rendering techniques. For example, a deferred rendering engine usually encodes a material or object identifier of some kind into the intermediate g-buffer. With Direct3D 11, these engines must be careful to avoid using too many materials, since including too many in one g-buffer can significantly slow down the final render pass. With dynamically indexed resources, a scene with a thousand materials can be finalized as quickly as one with only ten.

The scene complexity can increase without significantly impacting performance.

MMX, by contrast, is a 64-bit integer SIMD extension that aliases the x87 registers.

So if that is the case, then how are people downplaying this? I don't get all the terminology you use, but I understand the general logic. I remember when the 3DFX Voodoo arrived and all these cards were on the horizon. Basically, it was the time when graphics cards were becoming a thing. I had a relative getting ready to build a PC--which I did shortly after--but then I also remember installing MDK on a Pentium 133MHz computer and thinking to myself "wow, this looks really good," or much better than anything else that would run on the thing at the time.

My first classic Pentium PC was a Pentium 150 MHz (overclocked to 166 MHz by changing the bus speed from 60 MHz to 66 MHz), and I remember MDK; I played it on that PC. The Pentium 150 MHz was cheaper than the Pentium 166 MHz, and the PC was loaded with Windows 95.

My Pentium 150 MHz PC was traded for my Commodore Amiga 3000 (68030 25 MHz CPU, 68882 FPU at 25 MHz overclocked to 50 MHz with a hack), since the person who bought my Amiga 3000 was going to use it for video production. My Amiga 3000 to Pentium PC transition was my turning point from artwork/multimedia to programming.

Anyway, DirectX 12 won't overcome the X1's small ESRAM or its 12-CU GCN issues, but it gives additional flexibility for more complex rendering techniques, i.e. material/texture handling would be quicker. This is one of the areas where DirectX 12 will improve over DirectX 11.X/11.2.

From http://www.slideshare.net/DevCentralAMD/direct3d-and-the-future-of-graphics-apis-amd-at-gdc14

Btw, the current Xbox One doesn't run the DirectX 12 software ecosystem, i.e. its DirectX 11.X is a half-baked solution.

#300 eNT1TY
Member since 2005 • 1319 Posts

I feel the pain of those pinning all their hopes on their underperforming XB1. It's like rooting for an underdog team that can never quite elevate itself from the status quo. The fan will rationalize past poor performance with sheer apologism while living on a prayer that next year will be the year everything turns around like it was destined to. My condolences to you; may you find the fortitude to "just wait" for another year. Hearing lems mooing is mildly entertaining this gen, so please keep on living that tortured existence :)