The PS4 Hardware Advantage

This topic is locked from further discussion.


#151 tormentos
Member since 2003 • 33793 Posts

I don't aggregate any numbers. The fact is Microsoft is using this new tech, and it will be implemented in the newer Intel chips. AnandTech said it themselves: the fast on-chip RAM can counter the GDDR5 speed advantage. They just don't know for sure, because they don't know how it's implemented.

And the cloud is not some hollow promise (this isn't Sony with their Emotion Engine and the Cell); they have the number of servers to pull this off. The only thing you will need is an ultra-fast internet connection, and that will become more common in the following years.

It seems you keep forgetting this is Microsoft; they have been making game operating systems for 20 years. DirectX is their invention. Sony was still making basic 3D games while you could already play 3D-accelerated games like Half-Life, Unreal and System Shock 2. Sony has been lagging behind all these years. I mean, they had to make a system that was nearly twice the price of the X360 back in 2006, and it was released a year later.

On paper the PS3 was a lot stronger than the X360, and we all know how that turned out. It's like you're reviewing unified shaders six months before the X360's release. If you think Sony is just going to surpass them out of nowhere, you've got another thing coming. The cloud could very well be the solution to console lifecycles and the ultimate solution to piracy.

However, Sony has its advantages: their first-party support is better than Microsoft's (imo), and owning a game on a PS4 will not be the same as a cloud-powered game on the Xbox One. But the hardware advantage in the PS4 smells a lot like history repeating itself.

And you're forgetting about the Kinect. Maybe I'm dreaming, but Microsoft is the only one bringing us one step closer to Star Trek's holodeck. Whether it's going to be successful is another question, but it's the reason the Xbox One costs more than the PS4. Without the Kinect it would be like 2005-2006 all over again.

evildead6789

Misleading joke..

First of all, the 7870 at 150 GB/s is not bandwidth starved, and neither is the 7850, which means the PS4 isn't either.

The Xbox One can have 250 GB/s of bandwidth, but it doesn't have a damn GPU powerful enough to saturate that bandwidth. The Xbox One has a 12 CU, 768 SP GPU; the closest GPU out there that Ronvalencia thinks fits this criteria is the FirePro W5000, which has the exact same number of CUs and SPs (it doesn't mean it is the same, by the way). The FirePro has 102 GB/s of bandwidth, and I am sure it is not bandwidth limited; it's just 1.3 TF.

So putting more bandwidth on something that doesn't have the power to fill it means sh**. Yeah, it just means the GPU will not be bandwidth starved. By the way, the eDRAM on the Xbox 360 has 256 GB/s while the PS3 has barely 23 GB/s, so maybe now you know that those fixed pieces of hardware are not the be-all and end-all of hardware; they are quick fixes rather than enhancements.

So even if somehow, by a miracle, ESRAM can counter GDDR5 byte for byte (which I know it won't), that doesn't make up for the 6 extra CUs and 384 extra SPs the PS4 has over the Xbox One. More CUs and more SPs mean more performance; it's very simple, and it's a line you can follow on GCN from the very weakest 7xxx to the strongest.

Cloud is a hoax. Look at the damn Titanfall: it looks like sh**, it's not impressive by any means, and it uses the damn cloud. Want to bet who will have the better-looking version between PC and Xbox One? Now don't hide behind the fact that the PC has a stronger GPU, because we all know the power of the cloud as sold by MS is infinite, right?

40 times the Xbox 360's power, right?

230 GFLOPS x 40 = 9,200 GFLOPS; in other words, 9.2 TF of performance from the cloud that MS will deliver over a dirty, stinking 50 Mb internet connection.

By MS's math, the Xbox One with the cloud will have more performance than a Titan GPU... :lol:
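(A quick back-of-envelope check of that math, as a minimal Python sketch; the 230 GFLOPS baseline and the 40x multiplier are the marketing figures quoted above, and the ~4.5 TF Titan number is its published single-precision peak.)

```python
# Sanity-check the "40x the Xbox 360" cloud claim.
XBOX360_GFLOPS = 230      # peak figure cited above
CLOUD_MULTIPLIER = 40     # Microsoft's claimed factor
TITAN_GFLOPS = 4500       # GTX Titan single-precision peak, approx.

cloud_gflops = XBOX360_GFLOPS * CLOUD_MULTIPLIER
print(f"Claimed cloud compute: {cloud_gflops} GFLOPS ({cloud_gflops / 1000:.1f} TF)")
print(f"That is {cloud_gflops / TITAN_GFLOPS:.1f}x a GTX Titan, "
      f"supposedly delivered over a consumer internet link.")
```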

 

You are an idiot. The PS3 and Xbox 360 have similar power, with the PS3 edging it a bit: the 360 has a stronger GPU, the PS3 a stronger CPU, and both units' specs were BS. 1 TF Xbox 360, 2 TF PS3; both were fake, and it was MS who started it, not Sony.

And thanking MS for 3D-accelerated games is a joke. MS doesn't sell GPUs and didn't invent them, idiot.

 


#152 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

X1's ESRAM is combined with JIT(just in time) LZ/JPEG compression/decompression hardware i.e. it's not just ESRAM.

JIT(just in time) LZ/JPEG compression/decompression hardware changes memory storage and memory bandwidth factors.

You can't do the standard DDR3 vs GDDR5 comparisons with non-standard hardware.

tormentos

JIT compression will do sh**; the PS4 also supports compression and decompression at the hardware level..

JIT LZ/JPEG compression/decompression does reasonably well with data transfers/data storage. Any cell-phone network engineer would know this. Your fixation on the PS4 is a LOL. All GCNs have DX9/DX10 (e.g. BC4, BC5/3DC) and DX11 (e.g. BC6H, BC7) level texture compression/decompression hardware, but they don't have LZ/JPEG. Of the X1's four move engines, all GCNs have two of them.
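(For scale, the block-compressed texture formats mentioned there have fixed per-block sizes, so the savings are easy to quantify; a sketch using the standard BC block sizes, not console-specific numbers.)

```python
# Standard block-compression sizes: each 4x4 texel block packs to a
# fixed byte count (BC1: 8 bytes, BC4: 8, BC5/BC6H/BC7: 16).
BYTES_PER_BLOCK = {"BC1": 8, "BC4": 8, "BC5": 16, "BC6H": 16, "BC7": 16}

def texture_bytes(width, height, fmt=None):
    if fmt is None:                        # raw RGBA8: 4 bytes per texel
        return width * height * 4
    blocks = (width // 4) * (height // 4)  # 4x4 texel blocks
    return blocks * BYTES_PER_BLOCK[fmt]

raw = texture_bytes(2048, 2048)
for fmt in ("BC1", "BC7"):
    packed = texture_bytes(2048, 2048, fmt)
    print(f"2048x2048 {fmt}: {packed / 2**20:.0f} MiB "
          f"vs {raw / 2**20:.0f} MiB raw ({raw // packed}:1)")
```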


#153 btk2k2
Member since 2003 • 440 Posts
No, I didn't. I did NOT equate the 7770 with X1's GPU; this is why I used the W5000 or "7830" prototype instead.

Did you miss my statement on why the 7770 would leave a giant hole in a 5 billion transistor budget?

Did you miss my statement on why the 7770 would be an unsuited design, given the current GCN ratio of 1 TMU per 16 stream processors?

I'm well aware of 1.6 billion transistors for the eDRAM.

Unlike you, I haven't equated an 8-core Jaguar's transistor count with an 8-core AMD Piledriver's.

Unlike you, I didn't equate the 7770 with X1's GPU; their primitives per cycle are NOT the same.

ronvalencia
Chances are the X1 GPU is based on Bonaire, not Cape Verde. Bonaire is 2 billion transistors, plus 1.6 billion for ESRAM, plus the CPU, fixed-function hardware and any extra transistors to improve yield. All GCN hardware has 1 TMU for every 16 SPs: there are 64 SPs in a CU (compute unit) and 4 TMUs for every CU. This is fixed across all of the GCN hardware. The 7790 has 14 CUs and 56 TMUs, the 7770 has 10 CUs and 40 TMUs, the 7970 has 32 CUs and 128 TMUs, etc. It is fixed, so all of it has the same TMU to CU ratio. Again, primitives per cycle is not the important factor; it is primitives per second that matters, and that depends on hardware units * clock speed. The 7770 runs at 1,000 MHz, which is 25% faster than the X1's 800 MHz, but the X1 has 20% more shader and texture hardware than the 7770, so the difference is roughly a wash.
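(The fixed ratios described above are easy to check numerically; a small Python sketch using only the figures quoted in this post. The 12 CU / 800 MHz X1 numbers are the leaked specs under discussion, not confirmed at this point.)

```python
# GCN building blocks: 64 stream processors and 4 TMUs per compute unit.
SP_PER_CU, TMU_PER_CU = 64, 4

GPUS = {"7770": (10, 1000), "7790": (14, 1000),
        "7970": (32, 925), "X1 (leaked)": (12, 800)}

for name, (cus, mhz) in GPUS.items():
    print(f"{name}: {cus} CUs -> {cus * SP_PER_CU} SPs, "
          f"{cus * TMU_PER_CU} TMUs @ {mhz} MHz")

# Width-vs-clock trade: shader throughput scales with units * clock,
# so 12 CUs @ 800 MHz lands close to 10 CUs @ 1000 MHz.
print("X1 vs 7770, CU * MHz:", 12 * 800, "vs", 10 * 1000)  # 9600 vs 10000
```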

#154 tormentos
Member since 2003 • 33793 Posts

 

No, I didn't. I did NOT equate the 7770 with X1's GPU; this is why I used the W5000 or "7830" prototype instead.

Did you miss my statement on why the 7770 would leave a giant hole in a 5 billion transistor budget?

Did you miss my statement on why the 7770 would be an unsuited design, given the current GCN ratio of 1 TMU per 16 stream processors?

I'm well aware of 1.6 billion transistors for the eDRAM.

Unlike you, I haven't equated an 8-core Jaguar's transistor count with an 8-core AMD Piledriver's.

Unlike you, I didn't equate the 7770 with X1's GPU; their primitives per cycle are NOT the same.

 

ronvalencia

 

Yes, as soon as the 1.2 TF figure came out of the bag.

It's not about leaving a hole. You have tried, falsely, to imply that there is a stronger GPU in the Xbox One based on transistor counts. The same thing happened with the 7790 argument, which you now refuse to admit fits the profile after you yourself brought it up as a possible GPU because it was stronger than the 7770 in your eyes. Problem is, the 7790 has 14 CUs and more SPs than the 768 that were leaked and confirmed, and it also runs at 1 GHz. Wait, weren't you arguing pro-Xbox One a few days ago, saying MS hadn't revealed the clock speed, and then you bring up 1 GHz as the example instead of 800 MHz?

Dude, you have been looking for secret sauce to tag onto the Xbox One for months. Now it's JIT compression and decompression, the latest in your arsenal of stupid things, or your latest argument about how power is nothing without manpower, trying to imply that MS developers are ahead because the 360 GPU is AMD.. :lol:


#155 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]No, I didn't.I did NOT equate 7770 with X1's GPU i.e. this is why used W5000 or "7830" prototype instead.


Did you missed my statement on why 7770 would leave a giant hole with 5 billion transistors?

Did you missed my statement on why 7770 would unsuited design due to the current GCN TMU to 16 stream processor ratio.

I'm well aware of 1.6 billion for eDRAM.

Unlike you, I haven't equated 8 core Jaguars' transistors count with 8 core AMD Piledrivers.

Unlike you, I didn't equate 7770 with X1's GPU primitives per cycles are NOT the same as 7770.

btk2k2

Chances are the X1 GPU is based on Bonaire, not Cape Verde. Bonaire is 2 billion transistors, plus 1.6 billion for ESRAM, plus the CPU, fixed-function hardware and any extra transistors to improve yield. All GCN hardware has 1 TMU for every 16 SPs: there are 64 SPs in a CU (compute unit) and 4 TMUs for every CU. This is fixed across all of the GCN hardware. The 7790 has 14 CUs and 56 TMUs, the 7770 has 10 CUs and 40 TMUs, the 7970 has 32 CUs and 128 TMUs, etc. It is fixed, so all of it has the same TMU to CU ratio. Again, primitives per cycle is not the important factor; it is primitives per second that matters, and that depends on hardware units * clock speed. The 7770 runs at 1,000 MHz, which is 25% faster than the X1's 800 MHz, but the X1 has 20% more shader and texture hardware than the 7770, so the difference is roughly a wash.

If primitives per cycle were not an important factor, it would be a waste of time for AMD to upgrade it for Bonaire, Pitcairn and Tahiti.

AMD Bonaire would be a candidate, but X1's 256-bit memory controllers with their related L2 caches would be different, i.e. closer to Pitcairn's L2 cache count/access ports but with DDR3. This assumes AMD/MS didn't change the L2 cache and MCH relationship and that VGLeaks' 256-bit memory controllers for the X1's GCN are correct.

If VGLeaks' info is true, the 7770 wouldn't be a candidate, since that GCN can only do 1 primitive per cycle while X1's GCN can do 2 primitives per cycle, i.e. matching Bonaire, Pitcairn and Tahiti.


#156 tormentos
Member since 2003 • 33793 Posts

 

JIT LZ/JPEG compression/decompression does reasonably well with data transfers/data storage. Any cell-phone network engineer would know this. Your fixation on the PS4 is a LOL. All GCNs have DX9/DX10 (e.g. BC4, BC5/3DC) and DX11 (e.g. BC6H, BC7) level texture compression/decompression hardware, but they don't have LZ/JPEG. Of the X1's four move engines, all GCNs have two of them.

ronvalencia

 

The PS4 supports zlib compression, dude, stop. At the hardware level it has a unit for compression and decompression that is not tied to the GPU. It's nothing, Ron.

Funny, GCN has 2 ACEs and the PS4 has 8 ACEs, so is the PS4 the compute king?


#157 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

JIT LZ/JPEG compression/decompression does reasonably well with data transfers/data storage. Any cell-phone network engineer would know this. Your fixation on the PS4 is a LOL. All GCNs have DX9/DX10 (e.g. BC4, BC5/3DC) and DX11 (e.g. BC6H, BC7) level texture compression/decompression hardware, but they don't have LZ/JPEG. Of the X1's four move engines, all GCNs have two of them.

tormentos

The PS4 supports zlib compression, dude, stop. At the hardware level it has a unit for compression and decompression that is not tied to the GPU. It's nothing, Ron.

Funny, GCN has 2 ACEs and the PS4 has 8 ACEs, so is the PS4 the compute king?

From http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=3

The PS4 only has "zlib decompression", and the context is helping with Blu-ray.

To further help the Blu-ray along, the system also has a unit to support zlib decompression -- so developers can confidently compress all of their game data and know the system will decode it on the fly. "As a minimum, our vision is that our games are zlib compressed on media," said Cerny.
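(The "decode on the fly" idea Cerny describes is plain zlib inflation of pre-compressed game data; a minimal Python illustration of the concept, with a made-up payload, obviously not console code.)

```python
import zlib

# Pretend this is a game asset compressed at mastering time.
asset = b"texture/mesh/audio data " * 100
compressed = zlib.compress(asset, level=9)

# At load time the dedicated unit would inflate it transparently.
restored = zlib.decompress(compressed)
assert restored == asset
print(f"{len(asset)} bytes -> {len(compressed)} bytes on disc "
      f"({len(compressed) / len(asset):.1%} of original)")
```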

You're getting sloppy with the details.

X1's move engines come with JIT compression/decompression hardware.

[Image: move_engine1.jpg]

X1's version goes beyond the blu-ray booster.

Intel Core i7 can create a larger ACE bucket size than AMD Jaguar. The amount of ACEs doesn't change the wavefront processing, i.e. the 7970 has a higher TDP for additional active CUs. The 7970 has a higher clock speed for caches, local data storage, memory controllers, internal crossbar, register storage, global data storage, etc.

PS4's APU is not even used for AMD FirePro S9000 Server Graphics (Tahiti XT2).


-------------------------------

PS: CELL's SPUs $uck as a GPU, i.e. too slow for bilinearly interpolated texture lookups.


#158 gamevet77
Member since 2013 • 555 Posts

[QUOTE="PinkiePirate"]

[QUOTE="evildead6789"]

Who cares

1. As for the difference between the Xbox One and PS4

- Memory bandwidth is far from everything; the Xbox One has the fast on-chip RAM, which will counter the GDDR5 speed advantage.

- Microsoft has a lot more experience when it comes to operating systems (especially game operating systems; Windows 95 was, after all, also a game operating system).

People learn history so as not to make the same mistakes. If you look at the history of Sony and MS, you'll realize that Microsoft is better at choosing hardware and making OSes; Sony is just an electronics manufacturer.

2. As for the difference between the PS4 and the PC

- You can put 4 Titans in your PC, which is like 10-12 times the power of the PS4's GPU.

- The high-end Intel CPUs are like three times the power of the PS4's CPU with half the cores, and they will release 8-core CPUs soon.

- DDR4 is just around the corner, but memory bandwidth is far from everything, hence the small differences in framerates between similar systems with different RAM speeds.

- PCIe 3.0 and PCIe 2.0 have so much bandwidth that there's hardly a difference between them when you install a Titan (which is 3 times the power of the PS4 GPU).

You're comparing a Ferrari with a rocket. It doesn't matter that the Ferrari has all these enhancements; the rocket flies because of raw power, straight into space, where it gets a top speed of 20,000 mph. Who cares that the Ferrari can do 200 mph instead of 180 because of better tires and suspension? The rocket doesn't need it; it flies.

All the console tweaks didn't make the consoles run the 2007 game Crysis, and the X360 and PS3 released with top-end hardware 2 years and 1 year earlier, respectively. This time they release with mediocre hardware, like the PS2 did back in 2000. The PS2 may have been very successful, but there's a reason a number of games didn't release on the PS2: graphics kings like Half-Life 2, Far Cry, Doom 3, F.E.A.R., TES III.

This gen was one of its kind, but devs will develop more for the PC this time. Piracy can easily be countered now by making multiplayer/co-op-oriented games, and the hardware is there to make some truly next-gen games.

evildead6789

 

There is so much misinformation in this post.

Memory and bandwidth aren't the only things that set these two machines apart. And no, the eSRAM in the Xbox One doesn't "counter" the PS4's GDDR5. You don't just aggregate the numbers. That's not how it works.

So many people, including you, are overlooking a lot of things in the PS4's hardware. That's why I made this topic. It seems you didn't even read the first post. The PS4 has an embedded ARM CPU where the OS and all of its processes reside. There are also dedicated units for audio and video, so you can support audio chat and video streaming at the same time without needing to dedicate any resources to them. And since you have a dedicated bus for the GPU, the L2 cache can be used for both graphics processing and async compute. This drastically reduces any overhead caused by running compute on the GPU. 

Also, the PS4 has 32 ROPs vs. the Xbox One's 16. That's a 100% difference in fillrate. 
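(The fillrate point follows directly from ROP count times clock; a quick sketch, assuming both GPUs run at the 800 MHz being reported at the time. Shipping clocks could differ.)

```python
# Peak pixel fillrate = ROPs * core clock.
def fillrate_gpix(rops, mhz):
    return rops * mhz / 1000.0   # gigapixels per second

ps4 = fillrate_gpix(32, 800)     # 25.6 Gpix/s
xb1 = fillrate_gpix(16, 800)     # 12.8 Gpix/s
print(f"PS4: {ps4} Gpix/s vs X1: {xb1} Gpix/s "
      f"-> {ps4 / xb1:.0f}x, i.e. a 100% advantage")
```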

I don't aggregate any numbers. The fact is Microsoft is using this new tech, and it will be implemented in the newer Intel chips. AnandTech said it themselves: the fast on-chip RAM can counter the GDDR5 speed advantage. They just don't know for sure, because they don't know how it's implemented.

And the cloud is not some hollow promise (this isn't Sony with their Emotion Engine and the Cell); they have the number of servers to pull this off. The only thing you will need is an ultra-fast internet connection, and that will become more common in the following years.

It seems you keep forgetting this is Microsoft; they have been making game operating systems for 20 years. DirectX is their invention. Sony was still making basic 3D games while you could already play 3D-accelerated games like Half-Life, Unreal and System Shock 2. Sony has been lagging behind all these years. I mean, they had to make a system that was nearly twice the price of the X360 back in 2006, and it was released a year later.

On paper the PS3 was a lot stronger than the X360, and we all know how that turned out. It's like you're reviewing unified shaders six months before the X360's release. If you think Sony is just going to surpass them out of nowhere, you've got another thing coming. The cloud could very well be the solution to console lifecycles and the ultimate solution to piracy.

However, Sony has its advantages: their first-party support is better than Microsoft's (imo), and owning a game on a PS4 will not be the same as a cloud-powered game on the Xbox One. But the hardware advantage in the PS4 smells a lot like history repeating itself.

And you're forgetting about the Kinect. Maybe I'm dreaming, but Microsoft is the only one bringing us one step closer to Star Trek's holodeck. Whether it's going to be successful is another question, but it's the reason the Xbox One costs more than the PS4. Without the Kinect it would be like 2005-2006 all over again.

That being said, the PS3 is more powerful than the Xbox 360. Look at games like Uncharted and you can see the difference. It's up to the developers to utilize that power, so we will see. I don't want to pay for or use that stupid Kinect device, so the X1 turns me off. I've been primarily a 360 owner, and the Xbox One turns my stomach.

#159 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="CrownKingArthur"]

[QUOTE="ShoulderOfOrion"]

[Image: Quantum_Screenshot_2.jpg]

Evo_nine

dude

that looks so amazing. I believe in the PS4 hardware advantage now.

what PS4 game is this?

WOW!!!! what PS4 game is this??

PS4 day 1!!

It's CG for an Xbox One game, not real in-game footage or gameplay.

#160 gamevet77
Member since 2013 • 555 Posts

The cloud? LOL. Let's get real.

There is still a huge part of the population that does not have internet. When you are talking about cloud gaming, you need high-speed internet to utilize it.

An even smaller part of the population has high-speed internet. Crappy DSL won't work with cloud gaming. I am guessing broadband with at least 5 Mb/s down.

When all that is figured out, you are taking a big portion of consumers out of the equation. When a publisher sees the numbers, they are not going to give a crap about cloud gaming. Until the majority of homes have high-speed internet, it's just wishful thinking.

Publishers want money. When you spend tons of it on cloud gaming and have a small number of users who can utilize it, that is a recipe for disaster.


#161 tormentos
Member since 2003 • 33793 Posts

 

AMD Bonaire would be a candidate, but X1's 256-bit memory controllers with their related L2 caches would be different, i.e. closer to Pitcairn's L2 cache count/access ports but with DDR3. This assumes AMD/MS didn't change the L2 cache and MCH relationship and that VGLeaks' 256-bit memory controllers for the X1's GCN are correct.

If VGLeaks' info is true, the 7770 wouldn't be a candidate, since that GCN can only do 1 primitive per cycle while X1's GCN can do 2 primitives per cycle, i.e. matching Bonaire, Pitcairn and Tahiti.

ronvalencia

 

Memory controllers can be custom made for consoles.


#162 faizan_faizan
Member since 2009 • 7869 Posts
[QUOTE="Evo_nine"]

[QUOTE="CrownKingArthur"]

dude

that looks so amazing. I believe in the PS4 hardware advantage now.

what PS4 game is this?

xboxiphoneps3

WOW!!!! what PS4 game is this??

PS4 day 1!!

It's CG for an Xbox One game, not real in-game footage or gameplay.

If you think that that is CG, then you have never seen CG.

#163 btk2k2
Member since 2003 • 440 Posts
AMD Bonaire would be a candidate, but X1's 256-bit memory controllers with their related L2 caches would be different, i.e. closer to Pitcairn's L2 cache count/access ports but with DDR3. This assumes AMD/MS didn't change the L2 cache and MCH relationship and that VGLeaks' 256-bit memory controllers for the X1's GCN are correct.

If VGLeaks' info is true, the 7770 wouldn't be a candidate, since that GCN can only do 1 primitive per cycle while X1's GCN can do 2 primitives per cycle, i.e. matching Bonaire, Pitcairn and Tahiti.

ronvalencia
It is no surprise that the X1 GPU is custom. It looks like a 256-bit Bonaire with 12 CUs. The thing is, when you compare the 7770 to the 7790, the 7790 has 40% more shader power, 34% more memory bandwidth and 100% more prims/clock, but the average performance improvement over the 7770 at 1080p is only 28%. The extra prim/clock does not really make much difference, so using the 7770 as a baseline for the X1 seems perfectly valid, since it is practically in the same performance envelope. We are not talking exact figures here.

The PS4 has these hardware advantages over the X1:

- 50% more CUs and TMUs

- 100% more ROPs

- 3% / 72% / 257% more bandwidth (ESRAM + main RAM / ESRAM only / main RAM only)

It also has 8 ACE units, which the base Bonaire/Cape Verde GPUs do not have. There is no getting around the fact that the PS4 will run multiplatform games better. There is no getting around the fact that Sony's first-party developers have more headroom to work with and will produce better-looking games.
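(The percentages in that list can be reproduced from the raw specs; a sketch using the commonly cited figures: 176 GB/s GDDR5 for the PS4, 68.3 GB/s DDR3 plus 102 GB/s ESRAM for the X1, the same numbers implied above.)

```python
# PS4 vs X1 spec ratios, from the figures cited in this thread.
ps4 = {"CUs": 18, "TMUs": 72, "ROPs": 32, "BW": 176.0}   # GDDR5
x1 = {"CUs": 12, "TMUs": 48, "ROPs": 16}
x1_main, x1_esram = 68.3, 102.0                          # DDR3 + ESRAM (VGLeaks)

for key in ("CUs", "TMUs", "ROPs"):
    print(f"{key}: {ps4[key]} vs {x1[key]} (+{ps4[key] / x1[key] - 1:.0%})")

for label, bw in [("ESRAM + main", x1_main + x1_esram),
                  ("ESRAM only", x1_esram), ("main only", x1_main)]:
    print(f"BW vs {label}: {ps4['BW'] / bw:.2f}x")
# CUs/TMUs: +50%, ROPs: +100%; BW: 1.03x / 1.73x / 2.58x
# (the "257%" figure above is the last ratio written as a percentage)
```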

#164 tormentos
Member since 2003 • 33793 Posts

 

 

 

[Image: move_engine1.jpg]

X1's version goes beyond the blu-ray booster.

 

 

Intel Core i7 can create a larger ACE bucket size than AMD Jaguar.

 

ronvalencia

 

Ron, I just did a search on JIT compression on Xbox One, and guess what is the first thing that appears, and basically the only one talking about this sh**?

 

http://www.google.com.pr/#sclient=psy-ab&q=+jit+compression+decompression+on+xbox+one&oq=+jit+compression+decompression+on+xbox+one&gs_l=hp.12...9309.18200.4.20042.27.24.3.0.0.14.790.6862.0j13j4j2j1j2j2.24.0...0.0...1c.1.17.psy-ab.Yd0nAFhgIAY&pbx=1&bav=on.2,or.r_qf.&bvm=bv.48293060,d.eWU&fp=cfebeeabd3c3c0fa&biw=864&bih=412

 

Your blog... hahahaha, another of your baseless secret sauce theories.. :lol:


#165 tormentos
Member since 2003 • 33793 Posts

[QUOTE="ronvalencia"]AMD Bonaire would a candidate but X1's 256bit memory controllers with it's related L2 caches would be different i.e. it would closer to Pitcairn's L2 cache count/access ports but with DDR3. This is assuming AMD/MS didn't change L2 cache and MCH relationship and VGLeak's 256bit memory controllers for the X1's GCN is correct.

If VGLeak's info is true, 7770 wouldn't be candidate since this GCN can only do 1 primitive per cycle while X1's GCN can do 2 primitives per cycle i.e. matching Bonaire, Pitcairn and Tahiti.

btk2k2

It is no surprise that the X1 GPU is custom. It looks like a 256-bit Bonaire with 12 CUs. The thing is, when you compare the 7770 to the 7790, the 7790 has 40% more shader power, 34% more memory bandwidth and 100% more prims/clock, but the average performance improvement over the 7770 at 1080p is only 28%. The extra prim/clock does not really make much difference, so using the 7770 as a baseline for the X1 seems perfectly valid, since it is practically in the same performance envelope. We are not talking exact figures here.

The PS4 has these hardware advantages over the X1:

- 50% more CUs and TMUs

- 100% more ROPs

- 3% / 72% / 257% more bandwidth (ESRAM + main RAM / ESRAM only / main RAM only)

It also has 8 ACE units, which the base Bonaire/Cape Verde GPUs do not have. There is no getting around the fact that the PS4 will run multiplatform games better. There is no getting around the fact that Sony's first-party developers have more headroom to work with and will produce better-looking games.

 

Basically spot on..

Regardless of what it has inside, the Xbox One has 1.2 TF, which falls in line with 7770 performance. These are GCN GPUs, and from the weakest to the strongest there is a line: more flops mean more performance, whether it's the 7770 vs the 7790, the 7850 vs the 7870, or the 7950 vs the 7970. It's a line you can follow, and the one with more flops performs better.
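(That "line" is just the GCN peak-FLOPS formula, SPs x 2 ops (FMA) x clock; a sketch with the numbers under discussion. The 800 MHz X1 clock is the leaked figure, not official at this point.)

```python
# Peak single-precision GFLOPS for a GCN part: CUs * 64 SPs * 2 (FMA) * clock.
def gcn_gflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1000.0

print(f"X1 (leaked):  {gcn_gflops(12, 800):7.1f} GFLOPS")   # 1228.8 (~1.2 TF)
print(f"PS4:          {gcn_gflops(18, 800):7.1f} GFLOPS")   # 1843.2 (~1.8 TF)
print(f"7770 (1 GHz): {gcn_gflops(10, 1000):7.1f} GFLOPS")  # 1280.0 (~1.3 TF)
```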

 


#166 gamebreakerz__
Member since 2010 • 5120 Posts

The cloud? LOL. Let's get real.

There is still a huge part of the population that does not have internet. When you are talking about cloud gaming, you need high-speed internet to utilize it.

An even smaller part of the population has high-speed internet. Crappy DSL won't work with cloud gaming. I am guessing broadband with at least 5 Mb/s down.

When all that is figured out, you are taking a big portion of consumers out of the equation. When a publisher sees the numbers, they are not going to give a crap about cloud gaming. Until the majority of homes have high-speed internet, it's just wishful thinking.

Publishers want money. When you spend tons of it on cloud gaming and have a small number of users who can utilize it, that is a recipe for disaster.

gamevet77
Not the right angle to criticize cloud gaming. The right angle is that the cloud needs to render a game via an internet connection, which means there will be built-in lag even for single-player games. If you've played online shooters on consoles, then you know how annoying lag can be; now imagine that in single-player games.

#167 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[Image: move_engine1.jpg]

X1's version goes beyond the blu-ray booster.

Intel Core i7 can create a larger ACE bucket size than AMD Jaguar.

tormentos

Ron, I just did a search on JIT compression on Xbox One, and guess what is the first thing that appears, and basically the only one talking about this sh**?

http://www.google.com.pr/#sclient=psy-ab&q=+jit+compression+decompression+on+xbox+one&oq=+jit+compression+decompression+on+xbox+one&gs_l=hp.12...9309.18200.4.20042.27.24.3.0.0.14.790.6862.0j13j4j2j1j2j2.24.0...0.0...1c.1.17.psy-ab.Yd0nAFhgIAY&pbx=1&bav=on.2,or.r_qf.&bvm=bv.48293060,d.eWU&fp=cfebeeabd3c3c0fa&biw=864&bih=412

Your blog... hahahaha, another of your baseless secret sauce theories.. :lol:

Remove Xbox One from the search. Your PS4 obsession is LOL. Your attention to detail is a joke. Hardware LZ decompression/compression goes beyond X1 vs PS4; AMD doesn't have a monopoly on this technology.

I don't have to use X1-based arguments.


#168 tormentos
Member since 2003 • 33793 Posts

 

Remove Xbox One from the search. Your PS4 obsession is LOL. Your attention to detail is a joke. Hardware LZ decompression/compression goes beyond X1 vs PS4; AMD doesn't have a monopoly on this technology.

I don't have to use X1-based arguments.

ronvalencia

 

 

No, your Xbox One obsession is out of this world, so much so that the only person hyping JIT compression as the Xbox One's savior is you.. :lol:

Secret sauce.. :lol:


#169 commander
Member since 2010 • 16217 Posts

[QUOTE="evildead6789"]

I don't aggregate any numbers. The fact is Microsoft is using this new tech, and it will be implemented in the newer Intel chips. AnandTech said it themselves: the fast on-chip RAM can counter the GDDR5 speed advantage. They just don't know for sure, because they don't know how it's implemented.

And the cloud is not some hollow promise (this isn't Sony with their Emotion Engine and the Cell); they have the number of servers to pull this off. The only thing you will need is an ultra-fast internet connection, and that will become more common in the following years.

It seems you keep forgetting this is Microsoft; they have been making game operating systems for 20 years. DirectX is their invention. Sony was still making basic 3D games while you could already play 3D-accelerated games like Half-Life, Unreal and System Shock 2. Sony has been lagging behind all these years. I mean, they had to make a system that was nearly twice the price of the X360 back in 2006, and it was released a year later.

On paper the PS3 was a lot stronger than the X360, and we all know how that turned out. It's like you're reviewing unified shaders six months before the X360's release. If you think Sony is just going to surpass them out of nowhere, you've got another thing coming. The cloud could very well be the solution to console lifecycles and the ultimate solution to piracy.

However, Sony has its advantages: their first-party support is better than Microsoft's (imo), and owning a game on a PS4 will not be the same as a cloud-powered game on the Xbox One. But the hardware advantage in the PS4 smells a lot like history repeating itself.

And you're forgetting about the Kinect. Maybe I'm dreaming, but Microsoft is the only one bringing us one step closer to Star Trek's holodeck. Whether it's going to be successful is another question, but it's the reason the Xbox One costs more than the PS4. Without the Kinect it would be like 2005-2006 all over again.

tormentos

Misleading joke..

First of all, the 7870 at 150 GB/s is not bandwidth starved, and neither is the 7850, which means the PS4 isn't either.

The Xbox One can have 250 GB/s of bandwidth, but it doesn't have a damn GPU powerful enough to saturate that bandwidth. The Xbox One has a 12 CU, 768 SP GPU; the closest GPU out there that Ronvalencia thinks fits this criteria is the FirePro W5000, which has the exact same number of CUs and SPs (it doesn't mean it is the same, by the way). The FirePro has 102 GB/s of bandwidth, and I am sure it is not bandwidth limited; it's just 1.3 TF.

So putting more bandwidth on something that doesn't have the power to fill it means sh**. Yeah, it just means the GPU will not be bandwidth starved. By the way, the eDRAM on the Xbox 360 has 256 GB/s while the PS3 has barely 23 GB/s, so maybe now you know that those fixed pieces of hardware are not the be-all and end-all of hardware; they are quick fixes rather than enhancements.

So even if somehow, by a miracle, ESRAM can counter GDDR5 byte for byte (which I know it won't), that doesn't make up for the 6 extra CUs and 384 extra SPs the PS4 has over the Xbox One. More CUs and more SPs mean more performance; it's very simple, and it's a line you can follow on GCN from the very weakest 7xxx to the strongest.

Cloud is a hoax. Look at the damn Titanfall: it looks like sh**, it's not impressive by any means, and it uses the damn cloud. Want to bet who will have the better-looking version between PC and Xbox One? Now don't hide behind the fact that the PC has a stronger GPU, because we all know the power of the cloud as sold by MS is infinite, right?

40 times the Xbox 360's power, right?

230 GFLOPS x 40 = 9,200 GFLOPS; in other words, 9.2 TF of performance from the cloud that MS will deliver over a dirty, stinking 50 Mb internet connection.

By MS's math, the Xbox One with the cloud will have more performance than a Titan GPU... :lol:

You are an idiot. The PS3 and Xbox 360 have similar power, with the PS3 edging it a bit: the 360 has a stronger GPU, the PS3 a stronger CPU, and both units' specs were BS. 1 TF Xbox 360, 2 TF PS3; both were fake, and it was MS who started it, not Sony.

And thanking MS for 3D-accelerated games is a joke. MS doesn't sell GPUs and didn't invent them, idiot.

 

Half your argument is built on one thing: that the cloud is a hoax?

MS, the leading corporation when it comes to operating systems and NETWORK operating systems, has set up 300,000 servers just for the fun of it? I don't think you're a name in leading tech or can predict the future. I base myself on Microsoft's history, and they're simply more reliable than Sony (I can't remember Microsoft making hollow promises :D).

Titanfall isn't the best-looking game, but Metal Gear Solid and Dead Rising 3 look pretty kick-ass to me, without the cloud..

Sure, the PS3 and X360 have similar power, but that's not how Sony sold the PS3, is it? Either your memory sucks big time or you're too young to remember, but the PS3 launched at $500 while the X360 launched at $300 (a year sooner), and Sony promised a much stronger system. The Xbox may have had less storage, that is true, but across all models the PS3 carried the beefier price, with hollow promises.

But you're right, MS didn't invent 3D-accelerated gaming. They did, however, buy the company whose 3D API became Direct3D, which made 3D-accelerated gaming with a GPU possible. Microsoft implemented Direct3D (and also support for 3dfx Glide and OpenGL) and support for 3D-accelerated GPUs in their Windows 95 OS.

That's how 3D games like Quake, Half-Life and Unreal were born. (Direct3D also works without a 3D card; it was called 'software rendering'. That's why people without 3D acceleration on their GPU could still play 3D games like Unreal.)

Those games are the roots of the 3D gaming that you know today. MS may not have invented Direct3D or 3D-accelerated video cards, but they made the kind of 3D gaming you know today possible when they released Windows 95. All the other platforms were nowhere near the PC. (I suppose some things always stay the same.)

I know what you're thinking: the Xbox One isn't a PC. No, but PC gaming happens on Windows, and Windows is Microsoft. The Xbox One runs on a custom MS Windows, and Windows can have upgrades, whether it's cloud or extra hardware. I can't predict the future, but I think if MS really wants to, they can make the Xbox One as strong as it needs to be.

Well, I suppose I just demolished everything you said, Sony fanboy.

 

 


#170 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]AMD Bonaire would a candidate but X1's 256bit memory controllers with it's related L2 caches would be different i.e. it would closer to Pitcairn's L2 cache count/access ports but with DDR3. This is assuming AMD/MS didn't change L2 cache and MCH relationship and VGLeak's 256bit memory controllers for the X1's GCN is correct.

If VGLeak's info is true, 7770 wouldn't be candidate since this GCN can only do 1 primitive per cycle while X1's GCN can do 2 primitives per cycle i.e. matching Bonaire, Pitcairn and Tahiti.

btk2k2

It is no surprise that the X1 GPU is custom. It looks like a 256-bit Bonaire with 12 CUs. The thing is, when you compare the 7770 to the 7790, the 7790 has 40% more shader power, 34% more memory bandwidth and 100% more prims/clock, but the average performance improvement over the 7770 at 1080p is only 28%. The extra prim/clock does not really make much difference, so using the 7770 as a baseline for the X1 seems perfectly valid, since it is practically in the same performance envelope. We are not talking exact figures here.

The PS4 has these hardware advantages over the X1:

- 50% more CUs and TMUs

- 100% more ROPs

- 3% / 72% / 257% more bandwidth (ESRAM + main RAM / ESRAM only / main RAM only)

It also has 8 ACE units, which the base Bonaire/Cape Verde GPUs do not have. There is no getting around the fact that the PS4 will run multiplatform games better. There is no getting around the fact that Sony's first-party developers have more headroom to work with and will produce better-looking games.

Higher primitives per cycle will be important for Crysis 2, e.g. NVIDIA exploited AMD Cayman's weakness and AMD lost the benchmark race. With Crysis 2, NVIDIA made sure that the game would be rendering a near-solid wall of triangles. Any hermit would know this "incident".

Have you factored in the refresh overheads for a DRAM-type storage solution?

With DRAM, a read operation is destructive and a refresh process is required. SRAM doesn't have DRAM's refresh overheads.

I'll post another 768 stream processor GCN, but with lower clock speed and memory specs, i.e. the AMD FirePro W5000 SKU.

[Image: FirePro_W5000_GPUZ.jpg]

Notice the FirePro W5000's 102 GB/s video memory bandwidth almost matches VGLeaks' X1 ESRAM memory bandwidth. It is estimated that DRAM has up to 5 percent refresh overhead.

Some gaming benchmarks for AMD FirePro W5000.

http://www.tomshardware.co.uk/workstation-graphics-card-gaming,review-32643-9.html

7850 = 45 fps.

W5000 = 33 fps.

[Image: Crysis2DX11-1080p.png]

With Crysis 2, the ROPs issue is minor, i.e. a 10 CU scale-down from the 7870 GE's results roughly matches the 7770 GE's results.

The 7870 GE's 52.3 fps / 20 CUs = 2.615 fps per CU; x 10 CUs = 26.15 fps, which roughly matches the 7770's 25.9 fps result. My scale-down theory matches the actual results.

The FirePro W5000's 12 CUs (825 MHz) scale down from the Radeon HD 7850's 16 CUs (860 MHz).

The 7850's 45.3 fps / 16 CUs (860 MHz) = 2.831 fps per CU; x 8 CUs = 22.65 fps, which roughly matches the 7750's 21.5 fps result. The 7750 is clocked at 800 MHz. Again, my scale-down theory matches the actual results.

If we use the 7850 and 7750 as the two points for the "line of best fit", the FirePro W5000 falls into the expected slot for a scaled 12 CUs @ ~800 MHz.
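(The scale-down estimate above is linear in CU count, with an optional clock correction; a small Python sketch reproducing those numbers. The fps figures are the Tom's Hardware Crysis 2 results quoted here.)

```python
# Linear CU scaling: fps_est = fps * (cus_to / cus_from) * (clk_to / clk_from).
def scale_fps(fps, cus_from, cus_to, clk_from=1.0, clk_to=1.0):
    return fps * (cus_to / cus_from) * (clk_to / clk_from)

# 7870 GE (20 CUs, 52.3 fps) scaled down to 10 CUs ~ the 7770's 25.9 fps.
print(f"{scale_fps(52.3, 20, 10):.2f} fps estimated (7770 measured: 25.9)")

# 7850 (16 CUs, 45.3 fps) scaled down to 8 CUs ~ the 7750's 21.5 fps.
print(f"{scale_fps(45.3, 16, 8):.2f} fps estimated (7750 measured: 21.5)")
```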

------------

I'm using the following statements for estimating the gap between two points.

http://www.videogamer.com/news/xbox_one_and_ps4_have_no_advantage_over_the_other_says_redlynx.html

Speaking to VideoGamer.com at E3, Ilvessuo said: " Obviously we have been developing this game for a while and you can see the comparisons. I would say if you know how to use the platform they are both very powerful. I don't see a benefit over the other with any of the consoles."

http://www.videogamer.com/xboxone/metal_gear_solid_5_the_phantom_pain/news/ps4_and_xbox_one_power_difference_is_minimal_says_kojima.html

"The difference is small, and I don't really need to worry about it," he said, suggesting versions for Xbox One and PS4 won't be dramatically different.

------------

The amount of ACEs doesn't change wavefront processing power.


#171 ProjectPat187
Member since 2005 • 2178 Posts
I'm getting both consoles, so it doesn't matter to me.

#172 Alucard_Prime
Member since 2008 • 10107 Posts

I hope those numbers help with the online aspect, because I recently tried out The Last of Us on PS3 (which is an incredible game, by the way), and when we tried to go online and play we were greeted with different error messages (error codes without any descriptions), and we eventually gave up. I tried to play a few matches online with that new F2P Tekken game and couldn't find a single match. I hope this is not the kind of online the PS4 will have, because I plan on buying both consoles but I play online a lot.


#173 ProjectPat187
Member since 2005 • 2178 Posts

[QUOTE="MonsieurX"][QUOTE="lundy86_4"]

Proven? Shit would be weird.

Hell, that being said, hardware doesn't mean shit, if you don't use it.

Davekeeh

It's an Alienware


Alienware is for the elite

Custom build is for the poor

Alienware is for the clueless noobs who don't know any better.

#174 KillzoneSnake
Member since 2012 • 2761 Posts

The difference is very high. I can't wait to see how Sony exclusives dominate in graphics and how multiplats perform better on PS4 than on X1. Only 4-5 months to go.


#175 ronvalencia
Member since 2008 • 29612 Posts

The difference is very high. I can't wait to see how Sony exclusives dominate in graphics and how multiplats perform better on PS4 than on X1. Only 4-5 months to go.

KillzoneSnake
Effective difference is low. The PS4 doesn't sport a 7970-level GPU, which is what it would take to make a large difference.

#176 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

Remove Xbox One from the search. Your PS4 obsession is LOL. Your attention to detail is a joke. Hardware LZ decompression/compression goes beyond X1 vs PS4; AMD doesn't have a monopoly on this technology.

I don't have to use X1-based arguments.

tormentos

No, your Xbox One obsession is out of this world, so much so that the only person hyping JIT compression as the Xbox One's savior is you.. :lol:

Secret sauce.. :lol:

You can't even get the PS4's zlib details right. :roll:

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

I'm game for fanboy wars.


#177 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

 

 

 

[Image: move_engine1.jpg]

X1's version goes beyond the blu-ray booster.

 

 

Intel Core i7 can create a larger ACE bucket size than AMD Jaguar.

 

tormentos

 

Ron, I just did a search on JIT compression on Xbox One, and guess what is the first thing that appears, and basically the only one talking about this sh**?

 

http://www.google.com.pr/#sclient=psy-ab&q=+jit+compression+decompression+on+xbox+one&oq=+jit+compression+decompression+on+xbox+one&gs_l=hp.12...9309.18200.4.20042.27.24.3.0.0.14.790.6862.0j13j4j2j1j2j2.24.0...0.0...1c.1.17.psy-ab.Yd0nAFhgIAY&pbx=1&bav=on.2,or.r_qf.&bvm=bv.48293060,d.eWU&fp=cfebeeabd3c3c0fa&biw=864&bih=412

 

Your blog... hahahaha, another of your baseless secret sauce theories.. :lol:

You're too dumb for an applied-science type of debate.

#178 DaBrainz
Member since 2007 • 7959 Posts
Sony put the pee-you into APU

#179 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="btk2k2"][QUOTE="ronvalencia"]AMD Bonaire would a candidate but X1's 256bit memory controllers with it's related L2 caches would be different i.e. it would closer to Pitcairn's L2 cache count/access ports but with DDR3. This is assuming AMD/MS didn't change L2 cache and MCH relationship and VGLeak's 256bit memory controllers for the X1's GCN is correct.

If VGLeak's info is true, 7770 wouldn't be candidate since this GCN can only do 1 primitive per cycle while X1's GCN can do 2 primitives per cycle i.e. matching Bonaire, Pitcairn and Tahiti.

tormentos

It is no surprise that the X1 GPU is custom. It looks like a 256-bit Bonaire with 12 CUs. The thing is, when you compare the 7770 to the 7790, the 7790 has 40% more shader power, 34% more memory bandwidth and 100% more prims/clock, but the average performance improvement over the 7770 at 1080p is only 28%. The extra prim/clock does not really make much difference, so using the 7770 as a baseline for the X1 seems perfectly valid, since it is practically in the same performance envelope. We are not talking exact figures here.

The PS4 has these hardware advantages over the X1:

- 50% more CUs and TMUs

- 100% more ROPs

- 3% / 72% / 257% more bandwidth (ESRAM + main RAM / ESRAM only / main RAM only)

It also has 8 ACE units, which the base Bonaire/Cape Verde GPUs do not have. There is no getting around the fact that the PS4 will run multiplatform games better. There is no getting around the fact that Sony's first-party developers have more headroom to work with and will produce better-looking games.

Basically spot on..

Regardless of what it has inside, the Xbox One has 1.2 TF, which falls in line with 7770 performance. These are GCN GPUs, and from the weakest to the strongest there is a line: more flops mean more performance, whether it's the 7770 vs the 7790, the 7850 vs the 7870, or the 7950 vs the 7970. It's a line you can follow, and the one with more flops performs better.

Did you miss the entire episode when NVIDIA dropped the tessellation brick on AMD Cayman's Crysis 2 results?


#180 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"] 1) The primitives / cycle is a meaningless statistic because the clockspeed is different between the X1 GPU and the 7770. The important factor is the primitives/second and based on what we know this will be roughly the same. 12CUs + 48TMUs @ 800Mhz is practically equal to 10CUs + 40TMUs @ 1,000Mhz. 2) All GCN GPUs have 4TMUs per CU. A CU contains 64 stream processors. This is a fixed ratio for all GCN hardware and applies to the PS4 as well. 3) I also see a transistor discrepancy. The ESRAM is around 1.6Billion transistors, the GPU is around 1.5Billion transistors and I cannot see the CPU + fixed function stuff equalling around 1.9billion so there is defiantly something else but it could just be extra transistors to help make sure that as many of the dies have enough functional parts. Not even intel make an ESRAM that is larger than 16MB and they are the best in the business at it. 4) The 7770 has a 128bit memory controller just like the X1. 5) In the best case scenario the X1 has more bandwidth than a 7770 but in the worst case it has worse. The thing is this requires the developer to manage the memory location of assets to get the best out of the X1. On the PS4 this is not required making it easier to get closer to the theoretical peak. Even if the ESRAM means that the bandwidth difference is totally nullified (which I doubt very much) the X1 GPU still has less GPU hardware to process the scene so it is still behind. I would estimate the X1 GPU to be equal to the 7770 + 10%. I reckon that the difference between the consoles in multiplats will be the PS4 running at 'Very High' settings and the X1 running at 'High' settings. As the base architecture is so similar it will never run worse on the PS4. As for 1st parties I imagine the X1 will get up to 'Very High' but the PS4 to get up to 'Ultra'.btk2k2
The 7790 has 2 primitives per cycle and the 7770 has 1 primitive per cycle.
[QUOTE="ronvalencia"]No, I didn't.I did NOT equate 7770 with X1's GPU i.e. this is why used W5000 or "7830" prototype instead.


Did you missed my statement on why 7770 would leave a giant hole with 5 billion transistors?

Did you missed my statement on why 7770 would unsuited design due to the current GCN TMU to 16 stream processor ratio.

I'm well aware of 1.6 billion for eDRAM.

Unlike you, I haven't equated 8 core Jaguars' transistors count with 8 core AMD Piledrivers.

Unlike you, I didn't equate 7770 with X1's GPU primitives per cycles are NOT the same as 7770.

btk2k2

Chances are the X1 GPU is based on Bonaire, not Cape Verde. Bonaire is 2 billion transistors, plus 1.6 billion for ESRAM, plus the CPU, fixed-function hardware and any extra transistors to improve yield. All GCN hardware has 1 TMU for every 16 SPs: there are 64 SPs in a CU (compute unit) and 4 TMUs for every CU. This is fixed across all of the GCN hardware. The 7790 has 14 CUs and 56 TMUs, the 7770 has 10 CUs and 40 TMUs, the 7970 has 32 CUs and 128 TMUs, etc. It is fixed, so all of it has the same TMU to CU ratio. Again, primitives per cycle is not the important factor; it is primitives per second that matters, and that depends on hardware units * clock speed. The 7770 runs at 1,000 MHz, which is 25% faster than the X1's 800 MHz, but the X1 has 20% more shader and texture hardware than the 7770, so the difference is roughly a wash.

Higher primitives per cycle is one of the important factors when battling against NVIDIA Fermi.


#181 tormentos
Member since 2003 • 33793 Posts

 

Half your argument is built on one thing: that the cloud is a hoax?

MS, the leading corporation when it comes to operating systems and NETWORK operating systems, has set up 300,000 servers just for the fun of it? I don't think you're a name in leading tech or can predict the future. I base myself on Microsoft's history, and they're simply more reliable than Sony (I can't remember Microsoft making hollow promises :D).

Titanfall isn't the best-looking game, but Metal Gear Solid and Dead Rising 3 look pretty kick-ass to me, without the cloud..

Sure, the PS3 and X360 have similar power, but that's not how Sony sold the PS3, is it? Either your memory sucks big time or you're too young to remember, but the PS3 launched at $500 while the X360 launched at $300 (a year sooner), and Sony promised a much stronger system. The Xbox may have had less storage, that is true, but across all models the PS3 carried the beefier price, with hollow promises.

But you're right, MS didn't invent 3D-accelerated gaming. They did, however, buy the company whose 3D API became Direct3D, which made 3D-accelerated gaming with a GPU possible. Microsoft implemented Direct3D (and also support for 3dfx Glide and OpenGL) and support for 3D-accelerated GPUs in their Windows 95 OS.

That's how 3D games like Quake, Half-Life and Unreal were born. (Direct3D also works without a 3D card; it was called 'software rendering'. That's why people without 3D acceleration on their GPU could still play 3D games like Unreal.)

Those games are the roots of the 3D gaming that you know today. MS may not have invented Direct3D or 3D-accelerated video cards, but they made the kind of 3D gaming you know today possible when they released Windows 95. All the other platforms were nowhere near the PC. (I suppose some things always stay the same.)

I know what you're thinking: the Xbox One isn't a PC. No, but PC gaming happens on Windows, and Windows is Microsoft. The Xbox One runs on a custom MS Windows, and Windows can have upgrades, whether it's cloud or extra hardware. I can't predict the future, but I think if MS really wants to, they can make the Xbox One as strong as it needs to be.

Well, I suppose I just demolished everything you said, Sony fanboy.

evildead6789

 

The cloud is a hoax; it is only real to people who don't know sh** about how graphics work. :lol:

And you don't remember MS making hollow promises?

How about "all games will be 720p minimum with 4XAA"? How about "we will reach 1 billion gamers with the Xbox 360"? :lol:

Dead Rising was running at 15 fps.. :lol: And MGS will look as good, if not better, on PS4. The fact that games that don't use the cloud actually look better than the ones that do says it all.. :lol:

Really, should I post Major Nelson's bullsh** screen showing how the 360 was more powerful? Maybe you were too young to remember..

Buying a company that already made something is not the same as inventing it, and DirectX is one of the reasons PC gaming is held back. MS's APIs suck; they are good just for keeping legacy support with older hardware, nothing more, nothing less.

And you did not even address my points; basically you are a blind lemming.


#182 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="evildead6789"]

Half your argument is built on one thing: that the cloud is a hoax?

MS, the leading corporation when it comes to operating systems and NETWORK operating systems, has set up 300,000 servers just for the fun of it? I don't think you're a name in leading tech or can predict the future. I base myself on Microsoft's history, and they're simply more reliable than Sony (I can't remember Microsoft making hollow promises :D).

Titanfall isn't the best-looking game, but Metal Gear Solid and Dead Rising 3 look pretty kick-ass to me, without the cloud..

Sure, the PS3 and X360 have similar power, but that's not how Sony sold the PS3, is it? Either your memory sucks big time or you're too young to remember, but the PS3 launched at $500 while the X360 launched at $300 (a year sooner), and Sony promised a much stronger system. The Xbox may have had less storage, that is true, but across all models the PS3 carried the beefier price, with hollow promises.

But you're right, MS didn't invent 3D-accelerated gaming. They did, however, buy the company whose 3D API became Direct3D, which made 3D-accelerated gaming with a GPU possible. Microsoft implemented Direct3D (and also support for 3dfx Glide and OpenGL) and support for 3D-accelerated GPUs in their Windows 95 OS.

That's how 3D games like Quake, Half-Life and Unreal were born. (Direct3D also works without a 3D card; it was called 'software rendering'. That's why people without 3D acceleration on their GPU could still play 3D games like Unreal.)

Those games are the roots of the 3D gaming that you know today. MS may not have invented Direct3D or 3D-accelerated video cards, but they made the kind of 3D gaming you know today possible when they released Windows 95. All the other platforms were nowhere near the PC. (I suppose some things always stay the same.)

I know what you're thinking: the Xbox One isn't a PC. No, but PC gaming happens on Windows, and Windows is Microsoft. The Xbox One runs on a custom MS Windows, and Windows can have upgrades, whether it's cloud or extra hardware. I can't predict the future, but I think if MS really wants to, they can make the Xbox One as strong as it needs to be.

Well, I suppose I just demolished everything you said, Sony fanboy.

tormentos

The cloud is a hoax; it is only real to people who don't know sh** about how graphics work. :lol:

And you don't remember MS making hollow promises?

How about "all games will be 720p minimum with 4XAA"? How about "we will reach 1 billion gamers with the Xbox 360"? :lol:

Dead Rising was running at 15 fps.. :lol: And MGS will look as good, if not better, on PS4. The fact that games that don't use the cloud actually look better than the ones that do says it all.. :lol:

Really, should I post Major Nelson's bullsh** screen showing how the 360 was more powerful? Maybe you were too young to remember..

Buying a company that already made something is not the same as inventing it, and DirectX is one of the reasons PC gaming is held back. MS's APIs suck; they are good just for keeping legacy support with older hardware, nothing more, nothing less.

And you did not even address my points; basically you are a blind lemming.

720p with 4xAA works fine with forward rendering; it breaks down with deferred rendering.
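The framebuffer arithmetic behind that, as a rough sketch (the buffer counts and 4-byte formats below are illustrative assumptions, not any shipped engine's exact layout):

```python
# Framebuffer sizes vs the Xbox 360's 10 MB of eDRAM.
def buffer_mb(width, height, bytes_per_pixel, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

EDRAM_MB = 10

# Forward rendering, 720p, 4xMSAA: one 32-bit colour target + 32-bit depth.
forward = buffer_mb(1280, 720, 4, samples=4) * 2    # ~28.1 MB
print(f"forward 720p 4xAA: {forward:.1f} MB")       # fits via predicated tiling

# Deferred rendering, 720p, no MSAA: say four G-buffer targets + depth.
deferred = buffer_mb(1280, 720, 4) * 5              # ~17.6 MB
print(f"deferred 720p 1xAA: {deferred:.1f} MB")     # already over 10 MB
```

A fat G-buffer overflows the 10 MB before MSAA even enters the picture, which is why deferred titles dropped the 4xAA target.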


#183 tormentos
Member since 2003 • 33793 Posts

 

Higher primitives per cycle will be important for Crysis 2, e.g. NVIDIA exploited AMD Cayman's weakness and AMD lost the benchmark race. With Crysis 2, NVIDIA made sure the game would be rendering a near-solid wall of triangles. Any hermit would know this "incident".

 

Have you factored in the refresh overheads of a DRAM-type storage solution?

With DRAM, a read operation is destructive and a refresh cycle is required afterwards. SRAM doesn't have DRAM's refresh overheads.

 

I'll post another 768-stream-processor GCN part, but with lower clock speed and memory specs, i.e. the AMD FirePro W5000 SKU.

 

 

Notice the FirePro W5000's 102 GB/s video memory bandwidth almost matches VGLeaks' X1 eDRAM memory bandwidth. It is estimated that DRAM has up to 5 percent refresh overhead.
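Taking that figure at face value, a quick sketch of the effective rate (the 5 percent overhead is the estimate above, not a measured number):

```python
# Effective DRAM bandwidth once refresh steals up to ~5% of cycles.
# SRAM has no refresh, so its peak rate is its effective rate.
peak_gbps = 102.0        # FirePro W5000's GDDR5 bandwidth, per the spec above
refresh_overhead = 0.05  # "up to 5 percent" estimate
effective_gbps = peak_gbps * (1 - refresh_overhead)
print(f"{effective_gbps:.1f} GB/s effective vs {peak_gbps:.0f} GB/s peak")  # ~96.9
```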

 

Some gaming benchmarks for AMD FirePro W5000.

http://www.tomshardware.co.uk/workstation-graphics-card-gaming,review-32643-9.html

7850 = 45.3 fps.

W5000 = 33 fps.

 

With Crysis 2, the ROP issue is minor, i.e. a 10-CU scale-down from the 7870 GE's results roughly matches the 7770 GE's results.

 

7870 GE: 52.3 fps / 20 CUs = 2.615 fps per CU; × 10 CUs = 26.15 fps, which roughly matches the 7770's 25.9 fps result. My scale-down theory matches the actual results.

 

The FirePro W5000's 12 CUs (825 MHz) scale down from the Radeon HD 7850's 16 CUs (860 MHz).

7850: 45.3 fps / 16 CUs (860 MHz) = 2.831 fps per CU; × 8 CUs = 22.65 fps, which roughly matches the 7750's 21.5 fps result (the 7750 is clocked at 800 MHz). Again, my scale-down theory matches the actual results.

 

 

If we use the 7850 and 7750 as the two points for the "line of best fit", the FirePro W5000 falls into the expected slot for 12 CUs @ ~800 MHz.
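The same scale-down arithmetic as a sketch (it assumes fps scales linearly with CU count at similar clocks, which only holds for shader-bound workloads like this benchmark):

```python
# Linear CU scale-down estimates vs measured Crysis 2 results.
def scaled_fps(measured_fps, cus_from, cus_to):
    """Estimate fps for a cus_to-CU part from a cus_from-CU part's result."""
    return measured_fps / cus_from * cus_to

print(scaled_fps(52.3, 20, 10))  # 26.15 -> 7770 GE measured 25.9
print(scaled_fps(45.3, 16, 8))   # 22.65 -> 7750 measured 21.5
print(scaled_fps(45.3, 16, 12))  # 33.98 -> W5000 measured 33
```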

 

 

------------

I'm using the following statements for estimating the gap between two points.

 

http://www.videogamer.com/news/xbox_one_and_ps4_have_no_advantage_over_the_other_says_redlynx.html

Speaking to VideoGamer.com at E3, Ilvessuo said: " Obviously we have been developing this game for a while and you can see the comparisons. I would say if you know how to use the platform they are both very powerful. I don't see a benefit over the other with any of the consoles."

 

 

http://www.videogamer.com/xboxone/metal_gear_solid_5_the_phantom_pain/news/ps4_and_xbox_one_power_difference_is_minimal_says_kojima.html

"The difference is small, and I don't really need to worry about it," he said, suggesting versions for Xbox One and PS4 won't be dramatically different.

 

------------

The number of ACEs doesn't change wavefront processing power.

ronvalencia

 

So you keep assuming that it is a FirePro, which is a damn workstation GPU?

 

Ron, how many transistors do you think the 8-core Jaguar plus the other hardware not related to the GPU or ESRAM takes?


#184 tormentos
Member since 2003 • 33793 Posts

720p with 4xAA works fine with forward rendering; it breaks down with deferred rendering.

ronvalencia

 

Yeah? Is that why most 360 games are sub-HD and don't have 4xAA?


#185 tormentos
Member since 2003 • 33793 Posts

The effective difference is low. The PS4 doesn't sport a 7970-level GPU, so there's no large difference to be made.

ronvalencia

 

 

 

Sony gave the PS4 50% more raw shader performance, plain and simple (768 SPs @ 800MHz vs. 1152 SPs @ 800MHz). Unlike last generation, you don't need to be some sort of Jedi to extract the PS4's potential here. The Xbox One and PS4 architectures are quite similar, Sony just has more hardware under the hood. We'll have to wait and see how this hardware delta gets exposed in games over time, but the gap is definitely there.

 

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/5

 

That is not what the people at AnandTech think.
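That 50% figure is straight peak-FLOPS arithmetic, sketched below with the clocks from the AnandTech quote (final retail clocks could differ):

```python
# Peak single-precision throughput on GCN: 2 FLOPs (one FMA) per SP per clock.
def peak_tflops(stream_processors, clock_ghz):
    return stream_processors * 2 * clock_ghz / 1000.0

xbox_one = peak_tflops(768, 0.8)    # ~1.23 TFLOPS
ps4 = peak_tflops(1152, 0.8)        # ~1.84 TFLOPS
print(f"PS4 / Xbox One = {ps4 / xbox_one:.2f}x")  # 1.50, i.e. the 50% gap
```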


#186 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

Higher primitives per cycle will be important for Crysis 2, e.g. NVIDIA exploited AMD Cayman's weakness and AMD lost the benchmark race. With Crysis 2, NVIDIA made sure the game would be rendering a near-solid wall of triangles. Any hermit would know this "incident".

Have you factored in the refresh overheads of a DRAM-type storage solution?

With DRAM, a read operation is destructive and a refresh cycle is required afterwards. SRAM doesn't have DRAM's refresh overheads.

I'll post another 768-stream-processor GCN part, but with lower clock speed and memory specs, i.e. the AMD FirePro W5000 SKU.

Notice the FirePro W5000's 102 GB/s video memory bandwidth almost matches VGLeaks' X1 eDRAM memory bandwidth. It is estimated that DRAM has up to 5 percent refresh overhead.

Some gaming benchmarks for AMD FirePro W5000.

http://www.tomshardware.co.uk/workstation-graphics-card-gaming,review-32643-9.html

7850 = 45.3 fps.

W5000 = 33 fps.

With Crysis 2, the ROP issue is minor, i.e. a 10-CU scale-down from the 7870 GE's results roughly matches the 7770 GE's results.

7870 GE: 52.3 fps / 20 CUs = 2.615 fps per CU; × 10 CUs = 26.15 fps, which roughly matches the 7770's 25.9 fps result. My scale-down theory matches the actual results.

The FirePro W5000's 12 CUs (825 MHz) scale down from the Radeon HD 7850's 16 CUs (860 MHz).

7850: 45.3 fps / 16 CUs (860 MHz) = 2.831 fps per CU; × 8 CUs = 22.65 fps, which roughly matches the 7750's 21.5 fps result (the 7750 is clocked at 800 MHz). Again, my scale-down theory matches the actual results.

If we use the 7850 and 7750 as the two points for the "line of best fit", the FirePro W5000 falls into the expected slot for 12 CUs @ ~800 MHz.

------------

I'm using the following statements for estimating the gap between two points.

http://www.videogamer.com/news/xbox_one_and_ps4_have_no_advantage_over_the_other_says_redlynx.html

Speaking to VideoGamer.com at E3, Ilvessuo said: " Obviously we have been developing this game for a while and you can see the comparisons. I would say if you know how to use the platform they are both very powerful. I don't see a benefit over the other with any of the consoles."

http://www.videogamer.com/xboxone/metal_gear_solid_5_the_phantom_pain/news/ps4_and_xbox_one_power_difference_is_minimal_says_kojima.html

"The difference is small, and I don't really need to worry about it," he said, suggesting versions for Xbox One and PS4 won't be dramatically different.

------------

The number of ACEs doesn't change wavefront processing power.

tormentos

So you keep assuming that it is a FirePro, which is a damn workstation GPU?

Ron, how many transistors do you think the 8-core Jaguar plus the other hardware not related to the GPU or ESRAM takes?

I'm not going to buy the f**k-ing 7830 prototype or a 7850 re-flashed to W5000 (which disables 2 CUs and down-clocks it to 800MHz). If you can't see past AMD's marketing BS, that's your problem.

I have already estimated the 8-core Jaguar, i.e. it's closer to AMD Bobcat and far from AMD Piledriver. The audio DSP details can be gathered from the best cheap low-power audio DSP chip on the market (my guess is MS would be licensing one, and it's not CELL or a PPE), i.e. it's nowhere near an AMD Ontario APU. The LZ and JPEG hardware can be sized from known FPGA implementations.


#187 PinkiePirate
Member since 2012 • 1973 Posts

32MB is a bit small for 1080p in a deferred renderer scenario.

4x MSAA in 1080p would be a challenge on the Xbox One.
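A sketch of why (the five-target, 32-bit G-buffer layout below is a typical assumption, not any specific engine's):

```python
# Deferred G-buffer footprint at 1080p vs the Xbox One's 32 MB of eSRAM.
def mb(width, height, bytes_per_pixel, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

ESRAM_MB = 32
gbuffer = mb(1920, 1080, 4) * 5            # ~39.6 MB: over budget at 1xAA
gbuffer_4xmsaa = mb(1920, 1080, 4, 4) * 5  # ~158 MB with 4x MSAA
print(f"{gbuffer:.1f} MB / {gbuffer_4xmsaa:.0f} MB vs {ESRAM_MB} MB of eSRAM")
```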


#188 tormentos
Member since 2003 • 33793 Posts

 

I'm not going to buy the f**k-ing 7830 prototype or a 7850 re-flashed to W5000 (which disables 2 CUs and down-clocks it to 800MHz). If you can't see past AMD's marketing BS, that's your problem.

 

I have already estimated the 8-core Jaguar, i.e. it's closer to AMD Bobcat and far from AMD Piledriver. The audio DSP details can be gathered from the best cheap low-power audio DSP chip on the market (my guess is MS would be licensing one, and it's not CELL or a PPE), i.e. it's nowhere near an AMD Ontario APU. The LZ and JPEG hardware can be sized from known FPGA implementations.

ronvalencia

 

You completely evaded my question: how many transistors do you think the 8-core Jaguar plus the other on-die components not related to the GPU or ESRAM take?

A number, please.


#189 tormentos
Member since 2003 • 33793 Posts

32MB is a bit small for 1080p in a deferred renderer scenario.

4x MSAA in 1080p would be a challenge on the Xbox One.

PinkiePirate

 

There will be countless other scenarios where the Xbox One will have problems, thanks to MS building it as a media box first.


#190 mems_1224
Member since 2004 • 56919 Posts
If the power of a system is such a big deal then why even bother with consoles?

#191 tormentos
Member since 2003 • 33793 Posts

If the power of a system is such a big deal then why even bother with consoles? mems_1224

 

Because you get the power cheaper, plus warranty support for years.


#192 mems_1224
Member since 2004 • 56919 Posts

[QUOTE="mems_1224"]If the power of a system is such a big deal then why even bother with consoles? tormentos

 

Because you get the power cheaper, plus warranty support for years.

So?

#193 tormentos
Member since 2003 • 33793 Posts

[QUOTE="tormentos"]

[QUOTE="mems_1224"]If the power of a system is such a big deal then why even bother with consoles? mems_1224

 

Because you get the power cheaper, plus warranty support for years.

So?

 

You don't get that on PC..

 

Try playing Crysis 3 on a 7800GTX or an X1800 from 2005..


#194 mems_1224
Member since 2004 • 56919 Posts

[QUOTE="mems_1224"][QUOTE="tormentos"]

 

Because you get the power cheaper, plus warranty support for years.

tormentos

So?

 

You don't get that on PC..

 

Try playing Crysis 3 on a 7800GTX or an X1800 from 2005..

So? It's pointless to have a dick-measuring contest over consoles that will look mostly the same.

#195 faizan_faizan
Member since 2009 • 7869 Posts

[QUOTE="ronvalencia"]

720p with 4xAA works fine with forward rendering; it breaks down with deferred rendering.

tormentos

 

Yeah? Is that why most 360 games are sub-HD and don't have 4xAA?

GTA IV, RDR and others were 640p on PS3 and 720p on X360.

#196 tormentos
Member since 2003 • 33793 Posts

GTA IV, RDR and others were 640p on PS3 and 720p on X360.

faizan_faizan

 

The 360 had more RAM available for games.

Most 360 games are sub-HD and don't have 4xAA either, which is what MS stated would be the standard.


#197 Krelian-co
Member since 2006 • 13274 Posts

[QUOTE="mems_1224"][QUOTE="tormentos"]

 

Because you get the power cheaper, plus warranty support for years.

tormentos

So?

 

You don't get that on PC..

 

Try and play Crysis 3 on a 7800GTX or a X1800 from 2005..

And now try to play it in full HD with all the eye candy on consoles.

You get what you pay for. I'd rather pay more for more quality, but it looks like people enjoy settling for less.


#198 tormentos
Member since 2003 • 33793 Posts

So? It's pointless to have a dick-measuring contest over consoles that will look mostly the same. mems_1224

 

Wow, you have some strange ways to use measurements.. :lol:


#199 tormentos
Member since 2003 • 33793 Posts

 

And now try to play it in full HD with all the eye candy on consoles.

You get what you pay for. I'd rather pay more for more quality, but it looks like people enjoy settling for less.

Krelian-co

 

But the point was about support, not who looked best: the 7800GTX doesn't run games like Crysis 3 at all.

 

Yeah, I can't run it with all the eye candy, but at least I can play it and it's the same game, and I didn't have to sell my car to get a new PC. Crysis 3 doesn't run with all the eye candy on an average PC either, and since the argument you pull is "all the eye candy", nothing but the best counts: a Titan, 690 or 7990, pick one..


#200 mems_1224
Member since 2004 • 56919 Posts

[QUOTE="mems_1224"] So? It's pointless to have a dick-measuring contest over consoles that will look mostly the same. tormentos

 

Wow you have some very rare ways to use measurements..:lol:

It's a pretty common expression, bro.