The biggest lie in console gaming history

#101  Edited By ronvalencia
Member since 2008 • 29612 Posts

@asylumni said:
@ronvalencia said:

@asylumni:

The ARM Cortex-A8 can host its own operating system and supports the basic supervisor/user instruction modes (important for Unix-type operating systems); the SPE can't even do that. The SPE ISA supports only user-mode instructions, just like Motorola's 56000 DSP. Furthermore, SPEs are treated separately from the PPE CPU.

Btw, the ARM Cortex-A8 has an advanced branch prediction unit with >95% accuracy.

A GpGPU can run general serial/scalar user code with less than optimal performance, i.e. it must be treated like a DSP with very wide MIMD extensions (aka wavefronts or warps).

The major differences between DSPs and GpGPUs come down to the GPU's extra raster hardware and very large register sets.

MIMD = a group of multiple SIMDs being issued as a single instruction. MIMD is better than a super-wide 512-bit AVXv3 SIMD.

With the SPE's 128-bit SIMD, the programmer only worries about a single SIMD payload per instruction issue.
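As a rough sketch of the issue-width point being argued above (lane counts assume 32-bit floats; clocks, dual-issue, and memory behavior are ignored, so treat the numbers as illustrative only):

```python
# Illustrative comparison of SIMD/MIMD issue widths: a 128-bit SIMD unit
# packs 4 floats per instruction issue, a 512-bit unit packs 16, and a
# 64-lane GPU wavefront issues 64 at once.

def issues_needed(num_floats, lanes):
    """Instruction issues needed to cover num_floats, lanes at a time."""
    return -(-num_floats // lanes)  # ceiling division

N = 1_000_000
for name, lanes in [("128-bit SIMD (SPE-style)", 4),
                    ("512-bit SIMD (AVX-512-style)", 16),
                    ("64-lane wavefront (GCN-style)", 64)]:
    print(f"{name}: {issues_needed(N, lanes):,} issues for {N:,} floats")
```

Fewer issues per workload is the whole appeal of the wide wavefront model; the trade-off is that all lanes in an issue execute in lockstep.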

AMD GCN has lightweight out-of-order MIMD processing.

Of course, Cell was created at a time when Intel and AMD were fitting 2 cores into their CPU's. IBM, Sony and Toshiba wanted to jump ahead of the curve and fit 9. This required paring down the bulk of the cores and getting rid of many of the modern luxuries. That doesn't make it any less of a CPU.

I would say the major difference between DSPs and GPUs is that DSPs are a category covering a range of chips that are largely specialized in functionality, with different ones covering a range of tasks, while GPUs are specifically designed to process graphics workloads with more programmability. Needing instructions in a certain order doesn't make a GPU a DSP. In fact, the modern programmability makes it closer to a CPU, but it still isn't one.

The problem with the "Intel and AMD were fitting 2 cores into their CPUs" argument is that the desktop PC is a modular system, and there was even an IBM Cell-based PCI-E card for the PC in both the CE market (SPEs only, used with an x86 host CPU) and the HPC market.

Both ATI Xenos (unified shader instructions, mem-port writes that bypass ROP writes) and NVIDIA CUDA advanced the GPU towards non-graphics compute workloads.

NVIDIA K1 (Kepler IGP) SoCs are appearing in GE (General Electric)'s radar signal-processing solutions.

#102 slimdogmilionar
Member since 2014 • 1345 Posts

@tormentos said:
@slimdogmilionar said:

Yeah, there is something stopping Sony from using cloud computing: the fact that they don't have a cloud infrastructure that could offer what Azure can. It's not like they can afford it either ATM; Azure is probably worth more than Sony as a whole. MS spent half of Sony's worth on cloud computing in 2013 and has steadily been building data centers all over the world. Right now Sony just can't afford to compete with MS in the cloud department. Azure is a money maker for MS; over 50% of the Fortune 500 companies make use of Microsoft's cloud infrastructure.

Gaikai is not enough to support every PS owner. If they want this tech they have to pay one of two ways:

1. Rent servers from a company that has a distributed cloud (costly, and would most likely raise PS+ prices)

or

2. Start investing in their own cloud infrastructure (extremely costly, and takes a long time)

They are using a cloud right now that is 100% working, streaming PS3 games, and I am sure features of PS+ like Share Play also take advantage of the cloud; that's the reason you don't need 2 copies of Far Cry.

You don't get it. Sony doesn't have to do anything; they are kicking MS's ass even with DX12 and the cloud in play, so they are in no hurry to do anything.

MS's cloud is not just for the Xbox One, nor was it put in place for the Xbox One; it is for MS's horde of services on PC and many other things as well.

1. They don't need that; they have an on-the-spot hardware advantage that is going nowhere, and the cloud doesn't make the Xbox One more powerful.

2. Again, they don't need to; they haven't even lowered the price of the unit yet, while MS has dropped $150 from its original price and still gets outsold monthly.

I never said Sony did not have a cloud system; I was saying that they do not have a cloud infrastructure capable of doing what Azure can.

You are exactly right: Sony doesn't have to do anything, and they aren't doing anything right now.

And yes, MS's cloud is not just for the XB1, but what does that have to do with anything? Currently they have one of, if not the, biggest cloud infrastructures available today.

The PS4 has decent hardware, but the fact remains that the hardware in the PS4 could not pull off what the XB1 can with Crackdown. Fact. You will never get a game with that type of destruction on a PS4. Never.

And eventually they will need to; some of the most reputable gaming companies are investing in the cloud. With the XB1 and PC both having access to Azure, if developers start to utilize the cloud for games the PS4 will be left in the dark, and Sony would have no choice but to acquire a worldwide cloud infrastructure to support their customers.

Just admit you were wrong. You said the cloud would not do anything for the XB1, and you were wrong. You said that it would require insanely fast internet, and it does not; in fact it will be optimized to work with the slowest speeds that broadband ISPs offer.

Fact is, there is no way to spin this: Sony cannot replicate it, and by the time they can afford to even think about cloud computing on this level it will be too late and MS will be far, far ahead.

#103 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

@Heil68:

Sony did the same shit... Flower, Limbo, Heavy Rain, Journey, Heavy Rain 2, etc.

#104 LadyBlue
Member since 2012 • 4943 Posts

Milo comes to mind.

#105 GhoX
Member since 2006 • 6267 Posts

"Latest technology".

#107 Heil68
Member since 2004 • 60831 Posts

@Heirren said:

@Heil68:

Sony did the same shit... Flower, Limbo, Heavy Rain, Journey, Heavy Rain 2, etc.

Not compared to the Kinect frenzy of shit games after 2009 versus core titles. Not even close. Journey was Sony's 3rd GOTY of last gen, too.

#108 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

@Heil68:

You're right. And here we are now with two casual boxes and one core Wii U game console.

#109 L0ngshot
Member since 2014 • 516 Posts

@Cloud_imperium said:

Yeah, The Witcher 3 runs like a slideshow on PS4. But cows love to brag about 1080p, so that's all that matters. Doesn't matter if they had to change the game's visuals from the original build or compromise fps. What matters is 1080p. 20 fps is cinematic anyway (even lower than cinematic, but you get my point).

Haha, so true. They will argue 1080p vs 900p versus Xbox and call it a victory, but when someone says MGSV will look much better on PC at 4K resolution, everyone says resolution doesn't matter. What total hypocrites.

#110 HalcyonScarlet
Member since 2011 • 13838 Posts

If this is bait, it's pretty good. Cows losing their shit. Lol.

#111  Edited By PAL360
Member since 2007 • 30574 Posts

It's hard to decide between the Emotion Engine's CGI-quality graphics and how the Cell would make the 360 look like an Xbox 1.5.

Sony was a lot more humble with the PlayStation 4.

#112 GarGx1
Member since 2011 • 10934 Posts

@hernandezzzz: I guess it's just an excuse for the OP to post butt-hurt GIFs and claim ownage over Sony fans; in other words, it's a SW troll thread.

The Xbox fans have been having a hard time since the start of the 8th gen. Now that MS have had a good E3 and some good showings from Gamescom, they can stick their heads above the parapet and return fire, for a while anyway.

#113  Edited By ronvalencia
Member since 2008 • 29612 Posts

@slimdogmilionar said:
@tormentos said:
@slimdogmilionar said:

Yeah, there is something stopping Sony from using cloud computing: the fact that they don't have a cloud infrastructure that could offer what Azure can. It's not like they can afford it either ATM; Azure is probably worth more than Sony as a whole. MS spent half of Sony's worth on cloud computing in 2013 and has steadily been building data centers all over the world. Right now Sony just can't afford to compete with MS in the cloud department. Azure is a money maker for MS; over 50% of the Fortune 500 companies make use of Microsoft's cloud infrastructure.

Gaikai is not enough to support every PS owner. If they want this tech they have to pay one of two ways:

1. Rent servers from a company that has a distributed cloud (costly, and would most likely raise PS+ prices)

or

2. Start investing in their own cloud infrastructure (extremely costly, and takes a long time)

They are using a cloud right now that is 100% working, streaming PS3 games, and I am sure features of PS+ like Share Play also take advantage of the cloud; that's the reason you don't need 2 copies of Far Cry.

You don't get it. Sony doesn't have to do anything; they are kicking MS's ass even with DX12 and the cloud in play, so they are in no hurry to do anything.

MS's cloud is not just for the Xbox One, nor was it put in place for the Xbox One; it is for MS's horde of services on PC and many other things as well.

1. They don't need that; they have an on-the-spot hardware advantage that is going nowhere, and the cloud doesn't make the Xbox One more powerful.

2. Again, they don't need to; they haven't even lowered the price of the unit yet, while MS has dropped $150 from its original price and still gets outsold monthly.

I never said Sony did not have a cloud system; I was saying that they do not have a cloud infrastructure capable of doing what Azure can.

You are exactly right: Sony doesn't have to do anything, and they aren't doing anything right now.

And yes, MS's cloud is not just for the XB1, but what does that have to do with anything? Currently they have one of, if not the, biggest cloud infrastructures available today.

The PS4 has decent hardware, but the fact remains that the hardware in the PS4 could not pull off what the XB1 can with Crackdown. Fact. You will never get a game with that type of destruction on a PS4. Never.

And eventually they will need to; some of the most reputable gaming companies are investing in the cloud. With the XB1 and PC both having access to Azure, if developers start to utilize the cloud for games the PS4 will be left in the dark, and Sony would have no choice but to acquire a worldwide cloud infrastructure to support their customers.

Just admit you were wrong. You said the cloud would not do anything for the XB1, and you were wrong. You said that it would require insanely fast internet, and it does not; in fact it will be optimized to work with the slowest speeds that broadband ISPs offer.

Fact is, there is no way to spin this: Sony cannot replicate it, and by the time they can afford to even think about cloud computing on this level it will be too late and MS will be far, far ahead.

[Two embedded videos]

#114 asylumni
Member since 2003 • 3304 Posts

@ronvalencia said:
@asylumni said:

Of course, Cell was created at a time when Intel and AMD were fitting 2 cores into their CPU's. IBM, Sony and Toshiba wanted to jump ahead of the curve and fit 9. This required paring down the bulk of the cores and getting rid of many of the modern luxuries. That doesn't make it any less of a CPU.

I would say the major difference between DSPs and GPUs is that DSPs are a category covering a range of chips that are largely specialized in functionality, with different ones covering a range of tasks, while GPUs are specifically designed to process graphics workloads with more programmability. Needing instructions in a certain order doesn't make a GPU a DSP. In fact, the modern programmability makes it closer to a CPU, but it still isn't one.

The problem with the "Intel and AMD were fitting 2 cores into their CPUs" argument is that the desktop PC is a modular system, and there was even an IBM Cell-based PCI-E card for the PC in both the CE market (SPEs only, used with an x86 host CPU) and the HPC market.

Both ATI Xenos (unified shader instructions, mem-port writes that bypass ROP writes) and NVIDIA CUDA advanced the GPU towards non-graphics compute workloads.

NVIDIA K1 (Kepler IGP) SoCs are appearing in GE (General Electric)'s radar signal-processing solutions.

Why is that a problem? What does that have to do with anything? There are motherboards with sockets for a second Intel or AMD CPU as well. Would this make them any less of a CPU? And yes, I was aware of modern GPUs being more programmable and stated as much. They still, however, have specific, dedicated graphics hardware, which keeps them GPUs. The Cell doesn't. The PS3 OS also runs on the Cell chip. You can use a CPU as a GPU; that doesn't mean it is one.

#115  Edited By foxhound_fox
Member since 2005 • 98532 Posts

I came into this thread expecting either some sort of reasoned topic or some stupid anti-MS/Nintendo bullshit.

Was pleasantly surprised to see it was neither; however, it's still useless trolling.

Despite the PS4 being a complete joke and massive waste of time and money for anyone who owns more than one gaming platform, it's still the best platform for someone looking for a single gaming platform. Multiplats drive the console industry these days, and for those who don't game on PC, they are really the only reason to own a console these days that isn't made by Nintendo.

#116 Cloud_imperium
Member since 2013 • 15146 Posts

@l0ngshot said:
@Cloud_imperium said:

Yeah, The Witcher 3 runs like a slideshow on PS4. But cows love to brag about 1080p, so that's all that matters. Doesn't matter if they had to change the game's visuals from the original build or compromise fps. What matters is 1080p. 20 fps is cinematic anyway (even lower than cinematic, but you get my point).

Haha, so true. They will argue 1080p vs 900p versus Xbox and call it a victory, but when someone says MGSV will look much better on PC at 4K resolution, everyone says resolution doesn't matter. What total hypocrites.

Also don't forget: "Since only 0.000001% of hermits will be able to play it in 4K, it doesn't count. Checkmate, PC mustard race."

#117 ReadingRainbow4
Member since 2012 • 18733 Posts

@ladyblue said:

Milo comes to mind.

[Embedded video]

#118  Edited By ronvalencia
Member since 2008 • 29612 Posts

@asylumni said:
@ronvalencia said:
@asylumni said:

Of course, Cell was created at a time when Intel and AMD were fitting 2 cores into their CPU's. IBM, Sony and Toshiba wanted to jump ahead of the curve and fit 9. This required paring down the bulk of the cores and getting rid of many of the modern luxuries. That doesn't make it any less of a CPU.

I would say the major difference between DSPs and GPUs is that DSPs are a category covering a range of chips that are largely specialized in functionality, with different ones covering a range of tasks, while GPUs are specifically designed to process graphics workloads with more programmability. Needing instructions in a certain order doesn't make a GPU a DSP. In fact, the modern programmability makes it closer to a CPU, but it still isn't one.

The problem with the "Intel and AMD were fitting 2 cores into their CPUs" argument is that the desktop PC is a modular system, and there was even an IBM Cell-based PCI-E card for the PC in both the CE market (SPEs only, used with an x86 host CPU) and the HPC market.

Both ATI Xenos (unified shader instructions, mem-port writes that bypass ROP writes) and NVIDIA CUDA advanced the GPU towards non-graphics compute workloads.

NVIDIA K1 (Kepler IGP) SoCs are appearing in GE (General Electric)'s radar signal-processing solutions.

Why is that a problem? What does that have to do with anything? There are motherboards with sockets for a second Intel or AMD CPU as well. Would this make them any less of a CPU? And yes, I was aware of modern GPUs being more programmable and stated as much. They still, however, have specific, dedicated graphics hardware, which keeps them GPUs. The Cell doesn't. The PS3 OS also runs on the Cell chip. You can use a CPU as a GPU; that doesn't mean it is one.

You still can't see it? The SPU is not a CPU, just as the GPU is not a CPU. Your artificial separation between SPU and GPU means nothing when both boxes deal with mostly raster graphics workloads.

Ultimately, the Cell's long-term problem has been GPUs themselves. As you say, the Cell sucks as a general-purpose CPU. No problem; that wasn't really its design. However, as a stream processor it can't keep up with the new GPUs. That wasn't an issue when it was designed (back in the pre-NVIDIA-8800 days), but now it gets out-stream-processed by GPUs.

#119 commander
Member since 2010 • 16217 Posts

@foxhound_fox said:

I came into this thread expecting either some sort of reasoned topic or some stupid anti-MS/Nintendo bullshit.

Was pleasantly surprised to see it was neither; however, it's still useless trolling.

Despite the PS4 being a complete joke and massive waste of time and money for anyone who owns more than one gaming platform, it's still the best platform for someone looking for a single gaming platform. Multiplats drive the console industry these days, and for those who don't game on PC, they are really the only reason to own a console these days that isn't made by Nintendo.

If you like slideshows

#120  Edited By asylumni
Member since 2003 • 3304 Posts

@ronvalencia said:
@asylumni said:

Why is that a problem? What does that have to do with anything? There are motherboards with sockets for a second Intel or AMD CPU as well. Would this make them any less of a CPU? And yes, I was aware of modern GPUs being more programmable and stated as much. They still, however, have specific, dedicated graphics hardware, which keeps them GPUs. The Cell doesn't. The PS3 OS also runs on the Cell chip. You can use a CPU as a GPU; that doesn't mean it is one.

You still can't see it? The SPU is not a CPU, just as the GPU is not a CPU. Your artificial separation between SPU and GPU means nothing when both boxes deal with mostly raster graphics workloads.

Ultimately, the Cell's long-term problem has been GPUs themselves. As you say, the Cell sucks as a general-purpose CPU. No problem; that wasn't really its design. However, as a stream processor it can't keep up with the new GPUs. That wasn't an issue when it was designed (back in the pre-NVIDIA-8800 days), but now it gets out-stream-processed by GPUs.

The thing is, the Cell is a CPU, and the SPU is part of the Cell. The Cell was never meant to be a GPU and thus was not designed to compete with a GPU. It was made to put as many cores as cost-effectively possible onto a single die. Therefore, if a feature wasn't critical, it was removed. The SPUs had the PPU to run the OS, so they didn't need to. This saved quite a few transistors, as did other cut features. They may not stand on their own well as CPUs, but that's still the root of their design. It gave the Cell a significantly higher computational ability than the (then) current offerings from both Intel and AMD. The downside was the difficulty of programming it. The biggest thing that limited the useful life of the Cell was the advancement of the manufacturing process. The Cell was created on a 90nm process; we're now down to 14nm (IBM recently had a successful test fab at 7nm). As the processes shrank, it became more economically feasible to fit more full x86 cores on each die, approaching the power of the Cell but with the ease of use of the modern x86 architecture.
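The process-shrink point above can be put in rough numbers. Idealized transistor density scales with the square of the feature-size ratio; real nodes deviate from this, so treat these as ballpark figures only:

```python
# Idealized density gain from a process shrink: density scales roughly
# with the square of the feature-size ratio (real nodes deviate).

def density_gain(old_nm, new_nm):
    """Approximate transistor-density multiplier going old_nm -> new_nm."""
    return (old_nm / new_nm) ** 2

print(f"90nm -> 45nm: ~{density_gain(90, 45):.0f}x")  # ~4x
print(f"90nm -> 14nm: ~{density_gain(90, 14):.0f}x")  # ~41x
print(f"90nm ->  7nm: ~{density_gain(90, 7):.0f}x")   # ~165x
```

Even at this idealized rate, a 90nm-to-14nm shrink leaves room for dozens of conventional cores in the area the original Cell occupied, which is the economics behind the argument.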

#121  Edited By ronvalencia
Member since 2008 • 29612 Posts

@asylumni:

That's not the whole story.

AMD's Jaguar used a similar high-density library IP (secret sauce) from ATI, and combined with re-design optimizations they were able to compress an AMD K8-class CPU into nearly the size of an ARM A15 core.

The Cell's SPE has GPU-influenced scatter/gather instructions for its local memory, i.e. there was an attempt to bring it closer to a GPU. The SPE's ISA was based on AltiVec, which is a multimedia SIMD instruction set. The problem for the SPEs is that they were tasked with patching the PS3's aging GPU design.
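For readers unfamiliar with the term, "scatter/gather" just means vector loads and stores at non-contiguous addresses. A minimal model, with plain Python lists standing in for SIMD registers and the local store (the data is made up):

```python
# Gather reads from non-contiguous slots into a packed vector;
# scatter writes a packed vector back out to non-contiguous slots.

def gather(store, indices):
    """Pack store[i] for each i in indices into one vector."""
    return [store[i] for i in indices]

def scatter(store, indices, vector):
    """Unpack vector back into store at the given indices."""
    for i, v in zip(indices, vector):
        store[i] = v

local_store = [10.0, 20.0, 30.0, 40.0, 50.0]
vec = gather(local_store, [4, 0, 2])        # packed: [50.0, 10.0, 30.0]
scatter(local_store, [1, 3], [99.0, 77.0])  # store: [10, 99, 30, 77, 50]
```

Hardware gather/gather support matters because graphics workloads constantly index into textures and vertex tables rather than walking memory linearly.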

IBM's SPE array is like a blast from the past: the PS2's EE concepts updated for 2006, while the GPU has evolved from simpler SIMD array designs, e.g. the N64's GPU solution was MIPS-based with 8 integer SIMD arrays plus some fixed-function hardware.

Again, the general market did not see the artificial separation between the Cell's SPE and the GpGPU. The majority workload type is raster graphics, not HPC math problems, i.e. Sony's priorities were wrong. Anyway, GpGPU blasted IBM's Cell out of the HPC market.

Intel's AVXv3 512-bit is already approaching the GPU's MIMD instruction issue width.

The old RISC iron vendors' larger-register-set argument was used against them in the form of the thousands of GPU registers from ATI/AMD and NVIDIA. Intel has the last laugh against the PowerPC camp (IBM, Freescale/Motorola, Apple).

For the PS4, Sony did not repeat the same mistakes and went with the bang-per-buck semi-custom GPU king, AMD's ATI Radeon group.

#122 jg4xchamp
Member since 2006 • 64057 Posts

@Desmonic said:
@mems_1224 said:

Meh, Sony has a long history of lying to consumers. Why would the ps4 be any different

Right, because "MOST POWERFUL CONSOLE OF ALL TIME" after 95% of the games play better on your rival's console is super legit.

What about the part where Sony pitched a Kickstarter game on stage and acted like they had no hand in its development, when the very fucking next day it was like "lol sike, of course we are, we're taking advantage of suckers"?

#123  Edited By Desmonic  Moderator
Member since 2007 • 19990 Posts

@jg4xchamp said:
@Desmonic said:
@mems_1224 said:

Meh, Sony has a long history of lying to consumers. Why would the ps4 be any different

Right, because "MOST POWERFUL CONSOLE OF ALL TIME" after 95% of the games play better on your rival's console is super legit.

What about the part where Sony pitched a Kickstarter game on stage and acted like they had no hand in its development, when the very fucking next day it was like "lol sike, of course we are, we're taking advantage of suckers"?

Oh, they're paying for the port of the game? Soooo evilz :v

Granted, that whole Kickstarter ordeal, from start to finish, could/should have been done differently. We've known for a while that they're just paying for the port. Of course, paying for the port only happens if the game happens, which no doubt got a massive boost from being at E3, as scummy as it was.

In any case: MOST POWERFUL CONSOLE OF ALL TIME!! Super legit. :P

Also, do notice I didn't say Sony doesn't lie or do no wrong (I've recently called them out on their BS PS+ price increase, for example). They do. Just like MS (outside the console business, MS is massively guilty of lying about/manipulating a lot of stuff). Just like Ninty surely has at some point (I don't follow their PR/news as much, though).

#124  Edited By Mr_Huggles_dog
Member since 2014 • 7805 Posts

@l0ngshot said:
@Cloud_imperium said:

Yeah, The Witcher 3 runs like a slideshow on PS4. But cows love to brag about 1080p, so that's all that matters. Doesn't matter if they had to change the game's visuals from the original build or compromise fps. What matters is 1080p. 20 fps is cinematic anyway (even lower than cinematic, but you get my point).

Haha, so true. They will argue 1080p vs 900p versus Xbox and call it a victory, but when someone says MGSV will look much better on PC at 4K resolution, everyone says resolution doesn't matter. What total hypocrites.

I find that line of thinking quite arrogant.

I'm a so-called cow... and I wouldn't buy a shit game like something from the Witcher series for more than $10.

Resolution, GFX, and fps would affect my opinion.

#125  Edited By L0ngshot
Member since 2014 • 516 Posts

@mr_huggles_dog said:
@l0ngshot said:
@Cloud_imperium said:

Yeah, The Witcher 3 runs like a slideshow on PS4. But cows love to brag about 1080p, so that's all that matters. Doesn't matter if they had to change the game's visuals from the original build or compromise fps. What matters is 1080p. 20 fps is cinematic anyway (even lower than cinematic, but you get my point).

Haha, so true. They will argue 1080p vs 900p versus Xbox and call it a victory, but when someone says MGSV will look much better on PC at 4K resolution, everyone says resolution doesn't matter. What total hypocrites.

I find that line of thinking quite arrogant.

I'm a so-called cow... and I wouldn't buy a shit game like something from the Witcher series for more than $10.

Resolution, GFX, and fps would affect my opinion.

I enjoyed the game regardless of what it scored, and I couldn't care less about your opinion.

You cows use the latest Batman game to claim dominance for the PS4? Very well, add ignorance to the list too.

#126 jg4xchamp
Member since 2006 • 64057 Posts

@Desmonic said:
@jg4xchamp said:

What about the part where Sony pitched a Kickstarter game on stage and acted like they had no hand in its development, when the very fucking next day it was like "lol sike, of course we are, we're taking advantage of suckers"?

Oh, they're paying for the port of the game? Soooo evilz :v

Granted, that whole Kickstarter ordeal, from start to finish, could/should have been done differently. We've known for a while that they're just paying for the port. Of course, paying for the port only happens if the game happens, which no doubt got a massive boost from being at E3, as scummy as it was.

In any case: MOST POWERFUL CONSOLE OF ALL TIME!! Super legit. :P

Also, do notice I didn't say Sony doesn't lie or do no wrong (I've recently called them out on their BS PS+ price increase, for example). They do. Just like MS (outside the console business, MS is massively guilty of lying about/manipulating a lot of stuff). Just like Ninty surely has at some point (I don't follow their PR/news as much, though).

Nintendo's biggest lie is the same one Vince McMahon is using: "We're not out of touch at all." Lol, yeah, okay.

#127  Edited By GrenadeLauncher
Member since 2004 • 6843 Posts

MOST POWERFUL CONSOLE EVARRRRRRRR

For one game's multiplayer mode, a year out. Another crap thread by the OP.

RIP DX12; you served your purpose, but it's all about da clawd now, baby.

#128 tormentos
Member since 2003 • 33793 Posts

@slimdogmilionar said:

I never said Sony did not have a cloud system; I was saying that they do not have a cloud infrastructure capable of doing what Azure can.

You are exactly right: Sony doesn't have to do anything, and they aren't doing anything right now.

And yes, MS's cloud is not just for the XB1, but what does that have to do with anything? Currently they have one of, if not the, biggest cloud infrastructures available today.

The PS4 has decent hardware, but the fact remains that the hardware in the PS4 could not pull off what the XB1 can with Crackdown. Fact. You will never get a game with that type of destruction on a PS4. Never.

And eventually they will need to; some of the most reputable gaming companies are investing in the cloud. With the XB1 and PC both having access to Azure, if developers start to utilize the cloud for games the PS4 will be left in the dark, and Sony would have no choice but to acquire a worldwide cloud infrastructure to support their customers.

Just admit you were wrong. You said the cloud would not do anything for the XB1, and you were wrong. You said that it would require insanely fast internet, and it does not; in fact it will be optimized to work with the slowest speeds that broadband ISPs offer.

Fact is, there is no way to spin this: Sony cannot replicate it, and by the time they can afford to even think about cloud computing on this level it will be too late and MS will be far, far ahead.

They don't have to do anything; the cloud will change nothing.

Oh please, the Xbox 360 and PS3 pulled it off last gen; Mercenaries had totally destructible environments, and the Xbox One game looks nicer because it is a new generation. The PS4 can use GPU compute to do physics, and I have to remind you again that GPUs are stronger and better suited to physics than CPUs thanks to their parallel nature. Hell, on PS4 they can use compute for it and have the results back in the same frame, before the frame is actually ready to be rendered, which the cloud will not do, because of speed and latency.
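The same-frame argument above is just arithmetic: a frame budget is 1000/fps milliseconds, and a cloud round trip has to beat that budget to land in the frame that asked for it. The RTT values below are illustrative, not measurements:

```python
# Frame budget vs. network round-trip time: a cloud-computed result
# can only affect the requesting frame if RTT < frame budget.

def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

budget = frame_budget_ms(60)  # ~16.7 ms at 60 fps
for rtt_ms in (30, 60, 100):  # plausible broadband round trips
    verdict = "fits" if rtt_ms < budget else "misses the frame"
    print(f"RTT {rtt_ms:3d} ms vs {budget:.1f} ms budget: {verdict}")
```

This is why cloud offload only suits work that tolerates several frames of delay (background destruction, AI planning) rather than anything the renderer needs back immediately.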

Anything that need constant MS refresh can't be on the cloud.

So yeah it is totally doable on PS4,you don't need on GPU equal power of a CPU to do even better physics on a GPU a fraction of the GPU will run better those physics than a CPU would.

Yeah, it has been two years now and the cloud has been there from launch; nothing happened, and so far no third party has bitten. In fact, the only third-party game using it is Titanfall, which was sealed before both consoles came out, and that is the same reason it is exclusive. After that, no one is in a hurry to use the cloud.

Again, doing better physics will not change the fact that the Xbox One will be inferior resolution-wise and framerate-wise all gen long.

Dude, it would do nothing. Cloud physics is not the same as what was claimed by you and many other lemmings, who claimed it would raise the Xbox One's graphics and power. The cloud doesn't make the Xbox One more powerful; it offloads some physics, and a constant online connection must be there. Just hope lag doesn't ruin it, because the same glitches and problems you get in online multiplayer games you will get in Crackdown 3. If those physics miss the mark for even one second, it means a building going through the floor, your character going through walls, buildings not going down when you destroy them, or crumbling long after you shoot them.

Sony has GPU compute and can work at a much finer grain than on Xbox One, so they don't need a cloud. Again, I don't see developers falling for the cloud, even less now that MS continues to get its ass whooped by Sony.
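The "results back in the same frame" point is simple arithmetic: a 60 fps frame is ~16.7 ms and a 30 fps frame ~33.3 ms, while a home-to-datacenter round trip is typically tens of milliseconds. A quick sketch (the 60 ms round-trip figure below is an assumed, illustrative number, not a measurement):

```python
# Frame budgets vs. an assumed cloud round-trip time (illustrative numbers).
FRAME_MS_60FPS = 1000.0 / 60.0   # ~16.7 ms per frame at 60 fps
FRAME_MS_30FPS = 1000.0 / 30.0   # ~33.3 ms per frame at 30 fps
ASSUMED_RTT_MS = 60.0            # hypothetical home-to-datacenter round trip

def frames_of_lag(rtt_ms, frame_ms):
    """Whole frames that elapse before an offloaded result can come back."""
    return int(rtt_ms // frame_ms) + (rtt_ms % frame_ms > 0)

lag_60 = frames_of_lag(ASSUMED_RTT_MS, FRAME_MS_60FPS)  # 4 frames behind at 60 fps
lag_30 = frames_of_lag(ASSUMED_RTT_MS, FRAME_MS_30FPS)  # 2 frames behind at 30 fps
```

Under that assumption, cloud results arrive frames late, which is why offloaded work tends to be latency-tolerant simulation rather than per-frame rendering.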

@Cloud_imperium said:

Also don't forget "Since only 0.000001% hermits will be able to play it in 4k so it doesn't count. Checkmate PC mustard race".

No, it's not that it doesn't count; it's that the majority of hermits are peasants inside the master race. You people act as if every hermit has a PC capable of running 4K at 60 fps, which is a joke: when you make fun of 900p games on PS4, you don't realize that endless millions more on PC run games not only at 900p but also at 1366x768, and the majority are at 1080p.

So basically PC is in line with consoles in the 1080p era, since most gamers are at 1080p or lower.

Again, what is the most played game on Steam? Dota 2. Yeah, you don't need a Titan with an i7 Black Edition for that.

@ronvalencia said:

@asylumni:

That's not the whole story.

AMD Jaguar used a similar high-density library IP (secret sauce) from ATI, and combined with redesign optimizations they were able to compress an AMD K8-class CPU into nearly an ARM A15 core size.

CELL's SPE has GPU-influenced scatter/gather instructions for its local memory, i.e. there was an attempt to bring it closer to a GPU. The SPE's ISA was based on AltiVec, which is a multimedia SIMD instruction set. The problem for the SPEs is that they were tasked with patching PS3's aging GPU design.

IBM's SPE array is like a blast from the past: PS2's EE concepts updated for 2006, while the GPU has evolved from simpler SIMD array designs, e.g. N64's GPU solution was MIPS-based with 8 integer SIMD arrays plus some fixed-function hardware.

Again, the general market did not see the artificial separation between CELL's SPE and GpGPU. The majority of the workload is raster graphics, not HPC math problems, i.e. Sony's priorities were wrong. Anyway, GpGPU blasted IBM's CELL out of the HPC market.

Intel's AVXv3 512-bit is already approaching GPUs' MIMD instruction issue width.

The old RISC iron vendors' larger-register-set argument was used against them in the form of GPUs' thousands of registers from ATI/AMD and NVIDIA. Intel has the last laugh against the PowerPC camp (IBM, Freescale/Motorola, Apple).

For PS4, Sony did not repeat the same mistakes and went for the bang-per-buck semi-custom GPU king, i.e. AMD's ATI Radeon group.

SPEs run at 3.2 GHz; find me the GPU that runs at that speed, then you have a point.

SPEs were not just tasked with that; many games on PS3 use the SPEs differently, and in exclusives things like post-processing, physics and many others rested on the SPEs. Not only that, audio was also done on the SPEs, since they were perfectly suited for those jobs.

The fact is Cell was a CPU hybrid, not a GPU, so comparing it to a GPU is a joke, and the only reason you do it is because, CPU-wise for gaming, Cell kicked the living crap out of many CPUs out there back in the day.

Again, GPUs don't run at 3.2 GHz; they are much, much slower. That is a CPU speed, not a GPU one.
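The clock-speed back-and-forth misses that peak throughput is clock × SIMD lanes × ops per lane per cycle. Plugging in commonly cited peak figures (treat these as rough, illustrative numbers, not benchmarks) shows why a sub-1 GHz GPU still wins on paper:

```python
# Peak single-precision throughput: clock * lanes * flops per lane per cycle.
def peak_gflops(clock_ghz, lanes, flops_per_lane_per_cycle=2):
    # 2 ops/lane/cycle assumes a fused multiply-add counts as two FLOPs.
    return clock_ghz * lanes * flops_per_lane_per_cycle

# Cell: six game-available SPEs, each a 4-wide SIMD unit at 3.2 GHz.
cell_spes = peak_gflops(3.2, 6 * 4)   # 153.6 GFLOPS peak
# PS4 GPU: 1152 shader lanes at only 0.8 GHz.
ps4_gpu = peak_gflops(0.8, 1152)      # 1843.2 GFLOPS (~1.84 TFLOPS) peak
```

Despite a 4x clock deficit, the wide GPU comes out roughly 12x ahead on paper, which is the parallelism point both sides are circling.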

#129 tormentos
Member since 2003 • 33793 Posts

@ronvalencia said:

@asylumni:

The point was Tormented claimed CELL is a CPU, which is true for the PPE, but for the SPEs I referred to the primary source (i.e. the product designer), which states "DSP like", NOT "CPU like".

Performance is another issue.

AMD Jaguar CPUs are good enough to take on PS3's CELL.

IBM CELL's FLOPS numbers are worth less in effective work than AMD Jaguar's.

Tormented's separation between CELL's SPE (DSP-like) and GpGPU is artificial, i.e. the important part is the work being done, regardless of the processor type.

The market generally didn't see the difference, and the results speak for themselves: IBM CELL is dead, while NVIDIA CUDA is still alive and kicking.

You know that test used only 5 SPEs, right? Versus the Xbox One's 6 and the PS4's 6. Using 6 SPEs, Cell would have beaten that Jaguar; not bad for a 2005 PPC CPU, right?

GPU cores don't run at 3.2 GHz; GPU cores are much, much slower, so DSPs are more like CPUs than GPUs when it comes to how they run.

Unless you can show me that the average GPU on the market runs at 3.2 GHz.

980 Ti: 1000-1075 MHz

Cell SPE: 3.2 GHz.

Jaguar cores are good enough for some things and not exactly better in others; they are more efficient since it is a newer architecture.

That is because Sony decided to kill Cell and not waste more money making it; it was better to do a box with a smaller CPU and GPU combo for profit's sake, not because Cell could not have evolved into something better.

Your hate for Sony is a joke.

@commander said:

the ps4 can't even do an overclock on their cpu that they need badly lol

Sure it can; the PS4 is not melting from running at that speed now. But why would they need to is the bigger question.

They are still kicking MS's ass at 1.6 GHz... lol

#130 Heil68
Member since 2004 • 60831 Posts

@Heirren said:

@Heil68:

You're right. And here we are now with two casual boxes and one core wiiu game console.

Wii U? Not for me. I've bought one game last year and none so far this year.

The PS4, on the other hand, is the corest of core. With 350+ games to play, there is something for everyone.

#131 LJS9502_basic
Member since 2003 • 180196 Posts

@foxhound_fox said:

I came into this thread expecting some sort of either reasoned topic, or some stupid anti-MS/Nintendo bullshit.

Was pleasantly surprised to see it was neither, however, still useless trolling.

Despite the PS4 being a complete joke and massive waste of time and money for anyone who owns more than one gaming platform, it's still the best platform for someone looking for a single gaming platform. Multiplats drive the console industry these days, and for those who don't game on PC, they are really the only reason to own a console these days that isn't made by Nintendo.

Yeah, I don't understand why multiplats are treated as non-existent in SW when deciding whether or not to pick up a console. SW is strange....

#132 Cloud_imperium
Member since 2013 • 15146 Posts

@tormentos said:
@Cloud_imperium said:

Also don't forget "Since only 0.000001% hermits will be able to play it in 4k so it doesn't count. Checkmate PC mustard race".

No, it's not that it doesn't count; it's that the majority of hermits are peasants inside the master race. You people act as if every hermit has a PC capable of running 4K at 60 fps, which is a joke: when you make fun of 900p games on PS4, you don't realize that endless millions more on PC run games not only at 900p but also at 1366x768, and the majority are at 1080p.

So basically PC is in line with consoles in the 1080p era, since most gamers are at 1080p or lower.

Again, what is the most played game on Steam? Dota 2. Yeah, you don't need a Titan with an i7 Black Edition for that.

I love how you get so emotional. Dude, just stop. No one is bashing your precious "godstation". And by your logic, console gamers are even bigger peasants, because most of those PS3 owners don't have enough money to switch to a PS4. And even if we take Steam surveys seriously, there are still more PCs more powerful than a PS4 than there are PS4s sold.

-Patiently awaits the wall of excuses and moaning like little bitches.

#133  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

The difference is relatively minor, and Jaguar CPUs do minimal GPU patching.

Jaguar is clocked at 1.6 GHz (PS4) or 1.75 GHz (XBO) while the PPE and SPUs are clocked at 3.2 GHz, i.e. Jaguar has superior effective performance per watt (it's a tablet CPU) and superior effective performance per clock. It's too bad Sony did not configure the PS4 with the first-gen PS3's TDP envelope, i.e. 200 watts; e.g. Jaguar at 2.4 GHz and GCN at 925 MHz would have been better.

It's even better with AVXv2-enabled Intel Haswell/Skylake and AMD Excavator.

@tormentos:

GPU cores don't run at 3.2ghz GPU cores are slower much much slower so DSP are more like CPU than GPU when it come to how they run than GPU.

You are forgetting Blue Gene's IBM PowerPC 440 (with a dual-FPU option) at 700 MHz in very large core counts.

The old G8x CUDA GPUs are clocked above 1.1 GHz, e.g. the 8600 GT's shaders run at 1.19 GHz and the 8600 GTS's at 1.45 GHz.

The fact is the SPE is "DSP like", and anything else is just a waste of clock cycles.

@tormentos:

Fact is Cell was a CPU hybrid not a GPU so comparing it to a GPU is a joke and they only reason you do it is because CPU wise for gaming Cell kicked the living crap out of many CPU out there back in the day.

It's too bad the general market didn't recognize the artificial separation between SPE and DX10 GpGPU.

For Geforce 7/RSX, the CPU or SPE handles the workload in the first 3 stages.

For Geforce 8 with DX10, the GpGPU handles the workload in the first 3 stages.

Note that GPU-SO = Stream Out, which is similar to Xenos' memexport. Xenos' version would interoperate with its DX9L vertex shaders.

------------------------

From forum.beyond3d.com/showthread.php?t=57736&page=5

"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"

1) Two ppu/vmx units

There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.

2) Vertex culling

You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.

3) Vertex texture sampling

You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.

4) Shader patching

Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.

5) Branching

You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.

6) Shader inputs

You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.

7) MSAA alternatives

Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.

8) Post processing

360 is unified architecture meaning post process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post process to spu as possible.

9) Load balancing

360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.

10) Half floats

You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.

11) Shader array indexing

You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.
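For context on item 2, "vertex culling on SPU" essentially means rejecting triangles in software before RSX ever sees them. A rough sketch of the screen-space backface test involved (illustrative Python, not actual SPU code; real engines also cull off-screen and degenerate triangles, and do this with SIMD over batches of vertices):

```python
# Backface culling in screen space: with a counter-clockwise front-face
# convention, a triangle with non-positive signed area faces away (or is
# degenerate) and can be dropped before it is sent to the GPU.
def signed_area(a, b, c):
    """Twice the signed area of triangle abc, points given as (x, y)."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def cull_backfaces(triangles):
    """Keep only counter-clockwise (front-facing) triangles."""
    return [t for t in triangles if signed_area(*t) > 0]

tris = [
    ((0, 0), (1, 0), (0, 1)),   # CCW -> front-facing, kept
    ((0, 0), (0, 1), (1, 0)),   # CW  -> back-facing, culled
]
kept = cull_backfaces(tris)
```

Doing this on the SPUs shrinks the vertex stream the weaker GPU has to process, which is the whole point of the list above.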

Refer to http://forum.beyond3d.com/showthread.php?p=552774

Read Jawed's post

For example, texture fetches in RSX will always be painfully slow in comparison - but how slow depends on the format of the textures.

Also, control flow operations in RSX will be out of bounds because they are impractically slow - whereas in Xenos they'll be the bread and butter of good code because there'll be no performance penalty.

Dependent texture fetches in Xenos (I presume that's what the third point means) will work without interrupting shader code - again, RSX simply can't do this; dependent texturing blocks one ALU per pipe.

To fix Geforce 7's issues, PC owners just upgraded to a DX10-class GPU. NVIDIA's G80 was released ahead of the PS3.

Note that the SPEs are being used as a GPU to patch the aging RSX design's issues.

http://www.gpucomputing.net/sites/default/files/papers/1098/CEC_2008.pdf

Xbox 360's GpGPU example runs on ATI Xenos. It also shows the tight coupling between GPU and CPU, which is the lead-up to AMD's APU model. A later Xbox 360 revision combines the CPU and GPU into a single chip.

Your love for Sony is a joke.

#134 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

@Heil68 said:

Wii U? Not for me. I've bought one game last year and none so far this year.

The PS4, on the other hand, is the corest of core. With 350+ games to play, there is something for everyone.

Okay, if you phrase it like that, then sure. Too bad the core audience of the game industry is now casual. 99 percent of the games released are casual titles. A few sports games and fighting games, and that's about it. Gameplay is on Wii U.

#135 freedomfreak
Member since 2004 • 52551 Posts

Every company has said some dumb shit.

Wish I could relive Sony's E3 2006. And I was hyped as hell.

#136 Heil68
Member since 2004 • 60831 Posts

@Heirren said:

Okay, if you phrase it like that, then sure. Too bad the core audience of the game industry is now casual. 99 percent of the games released are casual titles. A few sports games and fighting games, and that's about it. Gameplay is on Wii U.

Not too bad for me. I'm buried in PS4 games atm.

#137  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@asylumni said:

The Cell was never meant to be a GPU and thus, was not designed to compete with a GPU.

False. When it was being designed, the Cell was supposed to be their all-in-one, all-purpose processor that did all the CPU and GPU work without a dedicated GPU. When they started testing, the SPEs could not keep up with then-current dedicated GPUs. That was when Sony went to Nvidia for a dedicated GPU, since the Cell was not up to snuff to handle it all.

#138 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

@Heil68 said:

Not too bad for me. I'm buried in PS4 games atm.

Not arguing that. Personally, games with no depth just don't cut it for me. Sony pushing this streaming nonsense is also disheartening. It is really proof that people don't care about quality. I'm in the minority here, for sure.

PES 2015 is pretty amazing though - the best sports sim of all time. Looking forward to Uncharted. The Last Guardian looks pretty sweet. MGS5. Titles are really what the console is bought for, I suppose. I think the popularity of last-gen systems has hurt this gen somewhat.

#139 clone01
Member since 2003 • 29844 Posts

Huh?

#140 slimdogmilionar
Member since 2014 • 1345 Posts

@tormentos said:

They don't have to do anything; the cloud will change nothing.

Anything that needs constant millisecond-level refresh can't be on the cloud.

So yeah, it is totally doable on PS4; a fraction of the GPU will run those physics better than a CPU would.

Sony has GPU compute and can work at a much finer grain than on Xbox One, so they don't need a cloud.

LOL, "the cloud will change nothing." Really?

Again, LMAO at the idea that the PS4 could handle what's going on in Crackdown with just the hardware in the box. If it could, that would essentially mean the PS4 is more powerful than a Titan, because the first demo MS released ran on a Titan and the destruction was still too much for the card. And latency should not be a problem, since the game is being optimized to run on a 2-4 Mbps connection.
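The 2-4 Mbps figure can be sanity-checked with a budget calculation: a link that slow carries only a few kilobytes per frame, so a cloud simulation has to ship compact results (e.g. positions and orientations for a bounded set of debris), not raw scene data. The per-object size below is an assumption made up for illustration:

```python
# How many rigid-body updates fit in one frame of a 2-4 Mbps stream?
BITS_PER_BYTE = 8
BYTES_PER_OBJECT = 28          # assumed: position (12 B) + quaternion (16 B)

def objects_per_frame(link_mbps, fps=30):
    """Whole per-object updates that fit in one frame of the given link."""
    bytes_per_frame = link_mbps * 1_000_000 / BITS_PER_BYTE / fps
    return int(bytes_per_frame // BYTES_PER_OBJECT)

low = objects_per_frame(2)     # updates per 30 fps frame on a 2 Mbps link
high = objects_per_frame(4)    # updates per 30 fps frame on a 4 Mbps link
```

A few hundred compact updates per frame is plausible on that budget, which is consistent with streaming simulation *results* rather than the simulation itself.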

"Anything that needs constant refreshing can't be done on the cloud." Really? So how do you explain my cloud-controlled auto-pilot Titan engaging enemies and following my orders when I'm playing Titanfall? How do you explain the AI on both teams running and shooting during battle? If anything that requires constant refreshing can't be done in the cloud, how do you explain those things?

LOL, you really do think it's doable on the PS4 GPU. Again, even on high-end PCs with less destruction, frame rates drop significantly. The PS4 can barely maintain 1080p/60 fps and you think it can handle the amount of destruction Crackdown has to offer.

It has been two years, and what we have so far is just a taste of what the cloud has to offer.

LOL, who cares about resolution when I have the ability to completely destroy a whole city during a firefight? "No third party has bit," you mean; well, why would they? Cloudgine already has it covered: Unreal, Havok, and DX12 already support Cloudgine's middleware for cloud computing. No third party has bitten yet because the Cloudgine middleware has not been released yet; Crackdown is the first game to take advantage of it.

How is physics in the cloud not what I claimed, lol. And FYI, I don't know of these glitches and problems you speak of; I can't remember the last time XBL was down for me. Oh yeah, it was Christmas, with the whole Lizard Squad thing. Glitches and downtime sound like you are too used to playing on PSN. Find me one person who claims to have any lag at all in Titanfall; even if there were lag, you have the option of switching to another data center. We have options, you know; Azure has 99.997% uptime. Physics are a part of graphics too, just so you know.

Again, the PS4 could never pull this off no matter how much denial you muster up. The PS4 does not have what it takes to run this; hell, even the destruction in BF4 causes frame drops on both consoles, and you want to say the PS4 could handle 100x that. So basically, in your eyes, the PS4 is powerful enough for an open-world game where you can enter every building, destroy a whole city, and host online multiplayer games at the same time.

LOL, so now Sony does not need a cloud; I smell butthurt. The fact is the PS4 can't run this level of destruction, period. Shu even openly said at launch that "he does not understand what MS is doing with the cloud". That lets me know Sony is far behind in this tech while companies like MS and Nvidia are already making breakthroughs in this area.

Here is some more info for you that you won't like.

My contention going into GDC 2015: The Microsoft XBox One is a cloud computing platform in the form of a console while the Sony PS4 is a console reaching into the cloud gaming space.

My contention leaving GDC 2015: Not only is the Microsoft Xbox One way ahead of the Sony PS4 in terms of cloud gaming, the Azure cloud platform is flat out, incredible....

While Sony purchased Gaikai for its cloud gaming platform, enthusiasts, like my older brother and myself, thought the purchase was made to apply endless virtual resources to a single game to create MAG-type scenarios wherein hundreds to thousands of players could connect and game simultaneously. That was the wish.

Unfortunately, that didn't and hasn't happened yet.

GDC 2015 made one thing abundantly clear for me: while the Playstation 4 is excellent at streaming gaming content, the XBox One is the go-to cloud console platform on the block....

Walking around GDC 2015, you got the distinct feeling that the Azure gaming experience Titanfall was only the start. Like Sony, Microsoft brought physical XBox One consoles to GDC 2015, yet unlike Sony, the company brought Azure-powered gaming experiences via the XBox One. The games aside, what really stands out about the XBox One is the potential it can fulfill with the infrastructure provided by the Azure platform.

When Respawn Entertainment was asked why they chose the Azure platform to host their game, they responded that Azure was/is the only true cloud gaming platform in the world that could marshal more than 100,000 servers with 300,000 dedicated virtual cores to support the virtual machine resources, and that the company could simply throw up more servers instantly as the need arose. Walking around GDC 2015, it seemed evident that the XBox One was an extension of the Azure cloud platform - a cloud-based gaming system taking the form of a traditional console. If you don't believe it, here are some facts about the current state of Microsoft Azure:

  • More than 20% of Azure virtual machines run on Linux
  • 80% of fortune 500 companies currently operate on the Azure platform
  • Windows Server now holds about 75 percent of the market share for x86 server operating systems
  • SQL Server, running off the Azure platform, now ranks as the #1 most-used database in the world.
  • As of March 2015, roughly 10 trillion objects are stored on the Azure platform. 10 trillion.
  • The Azure platform is growing by roughly 1,000 customers daily.
  • The Azure platform currently contains over one million cloud servers operating within their managed data centers
  • The Azure platform services roughly 5.5 billion Bing search queries per month
  • The Microsoft Windows Azure Active Directory has handled 400 billion identity authentications since its launch. The figure equates to roughly 15 billion authentications per week
  • Since its launch, the Microsoft Azure Active Directory operates out of 27 data centers worldwide and amazingly, has consistently delivered 99.997 percent uptime
  • The figure is spotty, but conservative estimates figure there are between 100,000 - 300,000 websites currently hosted on the Azure platform

These stats should say one thing and one thing only - the Azure cloud platform is very impressive. When you think about the XBox One in terms of the Azure platform, with the backing of stats like these, it becomes clear that Microsoft has built out a cloud network in which the XBox One is simply a hardware tool designed to maximize the GPU and gaming power of that cloud platform.

But here is the thing: even though 1 GB of cloud storage isn't cutting it, I don't particularly blame Sony for the lack of back-end data center experience. The Playstation platform, for all intents and purposes, is and has historically been a console gaming platform. Every iteration of the platform has been geared towards console gaming; games were packaged on a disc and, by the end of the PS2, eventually allowed for server-connected gaming. The Playstation platform has historically been a console gaming platform, whereas the Microsoft platform has historically been a computing platform that happened to enter the gaming arena roughly 26 years (1975 - 2001) after its founding.

For this reason, I can't be mad at the current state of cloud gaming at Sony, as they never had to, until roughly 2012, invest in, understand, and maximize data center tech to build better gaming consoles and experiences. And for the same reason, I have no reservations in saying the XBox One platform is the cloud gaming console of the future. With the backing of Microsoft Azure, the XBox One platform is way ahead of anything the PS format can do in the cloud.

http://www.informit.com/blogs/blog.aspx?uk=GDC-2015-The-Current-State-of-Cloud-Gaming

MS has over 1 million servers, with 300,000 dedicated to the Xbox One alone; not even PC and Xbox 360 can access this portion of the cloud.

That means Azure 1 mil >>>>>>>>>>Sony 50,000 (Gaikai), and Thunderhead (XB1 servers) 300,000>>>>>>>>>>>Sony 50,000 (Gaikai). Not to mention the Gaikai physical servers aren't even Sony's property; they actually have to use a third party's data center for physical servers.

Frostbite destruction 3.0 exclusive to XB1 and PC, aka Microsoft. LOL

Avatar image for speedfog
speedfog

4966

Forum Posts

0

Wiki Points

0

Followers

Reviews: 18

User Lists: 0

#141 speedfog
Member since 2009 • 4966 Posts

The thread alone shows how many lies Sony produced.

I can't remember the biggest one, but the "Watch Dogs only on PS4" one was funny as hell.

Avatar image for kratosyoloswag
KratosYOLOSwag

1827

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#142 KratosYOLOSwag
Member since 2013 • 1827 Posts

Lems are still in denial about the PS4 being the most powerful console of all time, lulz. One game so far uses the cloud, and it doesn't have a release date in sight or testing in people's homes. Glad to see Microsoft making such good use of the cloud this gen.

Avatar image for darkangel115
darkangel115

4562

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#143 darkangel115
Member since 2013 • 4562 Posts

The biggest lie was that the PS3 would do 4D and 120FPS

Avatar image for tormentos
tormentos

33793

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#144  Edited By tormentos
Member since 2003 • 33793 Posts

@slimdogmilionar said:

LOL The cloud will change nothing. Really?

Again, LMAO at the fact that you think the PS4 could handle what's going on in Crackdown with just the hardware in the box. If it could, that would essentially mean the PS4 is more powerful than a Titan, because the first demo MS released was run on a Titan and the destruction was still too much for the card. And latency should not be a problem, since the game is being optimized to run on a 2 - 4mbps connection.

Anything that needs constant refreshing can't be done on the cloud. Really? So how do you explain my cloud-controlled auto-pilot Titan engaging enemies and following my orders when I'm playing Titanfall? How do you explain the AI on both teams running and shooting during battle? If anything that requires constant refreshing can't be done in the cloud, how do you explain those things?

LOL, you really do think it's doable on the PS4 GPU. Again, even on high-end PCs with less destruction, frame rates drop significantly. The PS4 can barely maintain 1080p/60 fps, and you think it can handle the amount of destruction Crackdown has to offer.

It has been two years and what we have so far is just a taste of what the cloud has to offer.

LOL, who cares about resolution when I have the ability to completely destroy a whole city during a firefight. No third party has bit, you mean? Well, why would they? Cloudgine already has it covered; you have Unreal, Havok, and DX12 already supporting Cloudgine's middleware for cloud computing. Lol, no third party has bit yet because the Cloudgine middleware has not been released yet; Crackdown is the first game to take advantage of it.

How is physics in the cloud not what I claimed, lol. And FYI, I don't know of these glitches and problems you speak of; I can't remember the last time XBL was down for me. Oh yeah, it was Christmas, with the whole Lizard Squad thing. Lol, glitches and downtime sound like you are too used to playing on PSN. LOL, find me one person who claims to have any lag at all on Titanfall; even if there were lag, you have the option of switching to another data center. We have options, you know, lol; Azure has 99.997% uptime. Physics are a part of graphics also, just so you know..

Again, the PS4 could never pull this off no matter how much denial you muster up; the PS4 does not have what it takes to run this. Hell, even the destruction in BF4 causes frame drops on both consoles, and you want to say the PS4 could handle 100x that. So basically, in your eyes, the PS4 is powerful enough for an open-world game where you can enter every building, destroy a whole city, and host online multiplayer games at the same time.

LOL, so now Sony does not need a cloud; I smell butthurt. The fact is the PS4 can't run this level of destruction, period. Shu even openly said at launch "he does not understand what MS is doing with the cloud". This lets me know that Sony is far behind in this tech while companies like MS and Nvidia are already making breakthroughs in this area.

Here is some more info for you that you won't like.

MS has over 1 million servers, with 300,000 dedicated to the Xbox One alone; not even PC and Xbox 360 can access this portion of the cloud.

That means Azure 1 mil >>>>>>>>>>Sony 50,000 (Gaikai), and Thunderhead (XB1 servers) 300,000>>>>>>>>>>>Sony 50,000 (Gaikai). Not to mention the Gaikai physical servers aren't even Sony's property; they actually have to use a third party's data center for physical servers.

Frostbite destruction 3.0 exclusive to XB1 and PC, aka Microsoft. LOL

Loading Video...

The issue here is that you have pitiful knowledge about this subject. Destructibility is not tied to a damn cloud; what MS is doing is basically using an external PC to offload some physics for the Xbox One, nothing GPU compute can't do. The PS4 GPU can do physics in a fraction of the time it would take on a CPU; thanks to their highly parallel nature, GPUs, while not smart, are sheer brute force, which comes in handy for things like physics. That is why running PhysX on a GPU yields better results than on a CPU.
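The "brute force" point about GPU physics is just data parallelism: every object gets the same math, independently. A toy sketch of that idea (mine, not from the post; NumPy stands in for the wide SIMD/GPU execution being argued about):

```python
import numpy as np

# Toy illustration of why physics maps well onto parallel hardware:
# every particle gets the same arithmetic, applied independently,
# so one "instruction" updates the whole array at once.
GRAVITY = np.array([0.0, -9.81, 0.0])

def step(pos, vel, dt=1.0 / 60.0):
    """One Euler integration step for all N particles at once (N x 3 arrays)."""
    vel = vel + GRAVITY * dt   # same operation applied across every particle
    pos = pos + vel * dt
    return pos, vel

pos = np.zeros((100_000, 3))   # 100k particles, updated in a single call
vel = np.zeros((100_000, 3))
pos, vel = step(pos, vel)
```

A per-object CPU loop does the same work serially; the batched form is what PhysX-on-GPU exploits.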

This ^^ was in 2009, six years ago. You can see how destructibility happens in that video, and not only does the object get destroyed by things hitting it, the one behind it gets destroyed as well by things going through it.

So go elsewhere and pretend you need a cloud for that...lol

It's like physics didn't exist before the cloud..hahahahaaaa

Latency, packet loss, everything will ruin it, everything. And I don't think the cloud is streaming power equivalent to a damn Titan over a freaking 2Mbps connection; that doesn't make sense whatsoever. Do you know how much bandwidth a Titan GPU has? If anything, MS is streaming a video of those physics already calculated, considering how one of the patents posted here described an H264 stream.

That bold part there confirms to me that you totally lack the knowledge to discuss this. Again, stop saying stupid things; telling a Titan to do something doesn't require constant refreshing, for God's sake. I gave squad teams orders in SOCOM 2 more than a decade ago. What I am talking about is physics, so that the results of the calculations are inside the frame before the frame is ready to render.

Or using the cloud to do complex GPU tasks, which require not only very fast bandwidth but also millisecond response times. A screen is refreshed 30 or 60 times per second; at 60 FPS, a frame is rendered 60 times in one second. Your internet connection would be fried; it can't do it. Why in hell do you think the Xbox One has ESRAM, as a luxury?

It is there because the DDR3 inside the Xbox One doesn't have the bandwidth to cope with the data both the GPU and CPU use. So don't tell me MS will stream power equivalent to a Titan GPU over a 2 or 4 Mbps connection; it is physically impossible, period.
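For scale on that bandwidth argument, here is a quick back-of-envelope (my numbers, not from the thread; 288 GB/s is the commonly quoted memory bandwidth of the original GTX Titan):

```python
# How much data actually fits in one frame's worth of a 4 Mbps link,
# versus the internal memory bandwidth of a GTX Titan (~288 GB/s).
link_mbps = 4                 # optimistic end of the 2-4 Mbps target
fps = 60

bytes_per_frame = link_mbps * 1_000_000 / 8 / fps
print(f"{bytes_per_frame / 1024:.1f} KiB per frame")      # ~8.1 KiB

titan_bytes_per_s = 288e9     # commonly quoted GTX Titan memory bandwidth
ratio = titan_bytes_per_s / (link_mbps * 1_000_000 / 8)
print(f"GPU memory bus ~{ratio:,.0f}x the link")          # ~576,000x
```

Whatever crosses the wire each frame is a few kilobytes, so only compact results (positions, state, or encoded video) can be streamed, not raw GPU-scale data.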

Funny thing is, the Xbox One can barely maintain 900p/60 FPS, so you are not doing better. I tell you what the cloud will be: this gen's joke. It will be used in a few games from MS IPs to do some physics and some AI. Sadly for you, that doesn't increase the Xbox One's power. In fact, the whole 2 to 4 Mbps optimization tells me the calculations are done on the server side, and what MS will probably stream you is a movie, or the results of it, which is no better than having baked lighting.

Please give me a link from Sony stating that the PS4 can't run a game like that. The only reason the PS4 can't run Crackdown is that it is an MS exclusive..hahahahaa

Good luck waiting for developers to bite on the cloud. Two years have passed already and most games are cloudless; hell, most don't even use it for dedicated servers...lol

Loading Video...

Again, 2009. The only reason you don't see this often is that developers don't want to go through all the trouble, and not all games would work with fully destructible worlds as far as story goes.

So Yeah the PS4 can do that crap.

NO. Physics is what determines how objects behave in games; it is not in any way tied to graphics. You are, as of this moment, certified as a joke on this subject. You can have the best-looking game ever with stupid physics or no physics at all. Physics is what tells in-game objects how to behave relative to the real world; some engines are more accurate than others because, in the end, these are video games.

All that info serves for nothing. All of the Fortune 500 can run on MS's cloud; it says nothing and in no way helps the Xbox One.

Funny thing is, if MS has 300k servers for the Xbox One and Crackdown had 500,000 users at the same time, how would that work?

That is fewer than two users per server, but wait, what about other games? How many Xbox Ones can one server handle? Doing a controlled demo for the public is not the same as releasing the full thing, something MS has consistently failed at: they claimed Kinect worked a certain way, only for it to come out later that the presentation was faked. Hell, don't look too far; look at the Forza 5 demo at E3 2013 versus the retail product.

Really? Where is the link for that? EA uses its own servers for its games, and no, Titanfall is not an EA game; it is a Respawn game published by EA.

All EA games run on EA servers, including all those made with the Frostbite engine.

@Cloud_imperium said:

I love how emotional you get. Dude, just stop. No one is bashing your precious "godstation". And by your logic, console gamers are even bigger peasants, because most of those PS3 owners don't have enough money to switch to a PS4. And even if we take Steam surveys seriously, there are still more PCs more powerful than the PS4 than there are PS4s sold.

-Patiently awaiting the wall of excuses and moaning like little bitches.

Really? Based on what do you say that?

You know consoles have a higher adoption rate than PC, right?

Which means nothing, since GPUs stronger than the PS4's have been available since 2009-2010, yet few on PC own them, and the majority still own something weaker than the PS4. That is a fact.

@ronvalencia said:

@tormentos:

The difference is relatively minor, and Jaguar CPUs need minimal GPU patching.

Jaguar is clocked at 1.6GHz (PS4) or 1.75GHz (XBO) while the PPE and SPUs are clocked at 3.2GHz, i.e. Jaguar has superior effective performance per watt (it's a tablet CPU) and superior effective performance per clock. It's too bad Sony did not configure the PS4 with the 1st-gen PS3's TDP envelope, i.e. 200 watts; e.g. Jaguar at 2.4GHz and GCN at 925MHz would have been better.

It's even better with AVXv2 enabled Intel Haswell/Skylake and AMD Excavator.

You are forgetting Blue Gene's IBM PowerPC 440 (with dual FPU option) at 700Mhz in very large core count.

The old G8x CUDA GPUs are clocked >1.1 Ghz e.g. 8600 GT has 1.19 Ghz while 8600 GTS has 1.45 Ghz

The fact is the SPE is "DSP like" and anything else is just a waste of clock cycles.

It's too bad the general market didn't recognize the artificial separation between SPE and DX10 GpGPU.

For Geforce 7/RSX, CPU or SPE handles the workload on the first 3 stages.

For Geforce 8 with DX10, GpGPU handles the workload on the first 3 stages.

Note that GPU-SO = Stream Out, which is similar to Xenos' mem-export. The Xenos version would interpolate with its DX9L vertex shaders.

------------------------

From forum.beyond3d.com/showthread.php?t=57736&page=5

Refer to http://forum.beyond3d.com/showthread.php?p=552774

Read Jawed's post

To fix the Geforce 7's issues, PC owners just upgraded to a DX10-class GPU. NVIDIA's G80 was released ahead of the PS3.

Note that the SPEs are being used as a GPU to patch the aging RSX design's issues.

http://www.gpucomputing.net/sites/default/files/papers/1098/CEC_2008.pdf

Xbox 360 GpGPU example on ATI Xenos. It also shows the tight coupling between GPU and CPU, which is the lead-up to AMD's APU model. A later Xbox 360 revision combines the CPU and GPU into a single chip.

Your love for Sony is a joke.

OK, you should stop now. I am not claiming Cell could not be used for GPU tasks; in fact, I have been one of those highlighting that fact.

But the fact is it is a CPU, not a GPU. Your excuse for why it runs at 3.2GHz is pathetic, to say the least; you argue with people knowing you are wrong and using shitty arguments. FIND ME A COMMERCIAL GPU RUNNING AT 3.2GHZ AND YOU HAVE A POINT.

CPUs are clocked much faster than GPUs; the clock speed alone is a big hint. Cell's SPEs were great for parallel processing, just like CUDA cores or stream processors. That is all.

By the way, Cell was finished well before the Geforce 7 series came out. In fact, the Geforce 7 was only put in place because Sony wanted to use two Cells instead of a CPU and a GPU, and since Cell wasn't that powerful compared to a GPU, they ended up with that Geforce 7.

Your hate for Sony is the joke here, and it's proven. Not only do you have a quite clear agenda about everything Sony, you downplay Cell by pitting it against a damn GPU, and you defend the crappy Xbox One hardware more than any lemming. Post by post, you actually defend the Xbox One more than several lemmings combined..lol

But but but you are a hermit right...lol

@04dcarraher said:

False. When it was being designed, Cell was supposed to be their all-in-one, all-purpose processor that did all CPU and GPU work without a dedicated GPU. When they started testing, the SPEs could not keep up with the standardized dedicated GPUs of the time. That is when Sony went to Nvidia for a dedicated GPU, since Cell was not up to snuff to handle it all.

Cell is a hybrid but is not a GPU per se. How many GPUs run at 3.2GHz?

Cell uses DSP-like processors, which are suited to parallel jobs just like GPUs are, which is why it can do some GPU tasks; but Cell itself wasn't as powerful as a standalone GPU, nor is it exactly one. It is something else.

@darkangel115 said:

The biggest lie was that the PS3 would do 4D and 120FPS

Yeah, because MS didn't lie when they claimed all 360 games would be 720p with 4xAA minimum. If MS could not even achieve that, I feel less bad about Sony; after all, 4D at 120FPS is something much more difficult.

Avatar image for darkangel115
darkangel115

4562

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#145 darkangel115
Member since 2013 • 4562 Posts

@tormentos said:
Yeah, because MS didn't lie when they claimed all 360 games would be 720p with 4xAA minimum. If MS could not even achieve that, I feel less bad about Sony; after all, 4D at 120FPS is something much more difficult.

He asked for the biggest lie. Just like Sony said 1080p/60FPS would be a phrase we hear a lot this gen; hey, how many games achieve that? Still, missing out on a few pixels or half the framerate is a much smaller lie than promising 120FPS and "4D", whatever the hell they meant by that lol

Avatar image for 04dcarraher
04dcarraher

23858

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#146  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@tormentos said:

Cell is a hybrid but is not a GPU per se. How many GPUs run at 3.2GHz?

Cell uses DSP-like processors, which are suited to parallel jobs just like GPUs are, which is why it can do some GPU tasks; but Cell itself wasn't as powerful as a standalone GPU, nor is it exactly one. It is something else.

The Cell was originally designed to be an all-in-one, all-purpose specialized processor handling all tasks, able to do parallel calculations as well.

Grasping for straws with "how many GPUs run at 3.2GHz?"....lol. Clock rate means nothing when you're comparing against architectures with hundreds to thousands of processors designed for parallel tasks that are leagues faster. You have 6 SPEs at 3.2GHz that can handle around 200 GFLOPS in parallel on one task, versus, say, an 8800GTX with 128 shader processors at 1.35GHz handling over 500 GFLOPS.

The Cell's SPEs handle both scalar and vector data (doing multiple operations simultaneously with a single instruction), are in-order processors, and have no out-of-order capabilities. They have no cache but use a local store. SPEs are limited in what they can do and have to be told what to do all the time, or their processing power is wasted.
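Those GFLOPS figures come straight from a simple formula: execution units x clock x FLOPs per cycle. A sketch using the commonly quoted theoretical numbers (my figures, not measured ones):

```python
# Peak single-precision throughput: units x clock (GHz) x FLOPs per cycle.
# These are the commonly quoted theoretical numbers, not measured ones.
def peak_gflops(units, clock_ghz, flops_per_cycle):
    return units * clock_ghz * flops_per_cycle

# 6 game-usable SPEs at 3.2 GHz, 4-wide single-precision FMA = 8 FLOPs/cycle
cell_spes = peak_gflops(6, 3.2, 8)      # ~153.6 GFLOPS
# GeForce 8800 GTX: 128 stream processors at 1.35 GHz, MAD+MUL = 3 FLOPs/cycle
g80 = peak_gflops(128, 1.35, 3)         # ~518.4 GFLOPS
```

Which is roughly the "200 vs 500" comparison above, and why clock rate alone says so little: the GPU's advantage is unit count, not frequency.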

Avatar image for sailor232
sailor232

6880

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#147 sailor232
Member since 2003 • 6880 Posts

I think its Sony keeping quiet for a whole week about accounts being hacked before releasing news to the public is a pretty bad thing, one would think they would understand what has happened to their own network pretty much straight away and then start to alert people to change passwords, contact banks etc. For them to take a whole week to tell people the truth is pretty shocking.

Other than that, the whole 2005 Sony E3 was shocking, and the fact they havent changed how they market games on pipe dreams since then, then there's other companies taking a page from Sony's book and doing the same thing, sigh.

Recently I'd say the winner is Nvidia, with Batman: AK and the GeForce Experience video, plus gimping older-model graphics cards on purpose just so people are forced to upgrade.

Avatar image for slimdogmilionar
slimdogmilionar

1345

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 5

#148 slimdogmilionar
Member since 2014 • 1345 Posts

@tormentos said:

The fact is you have pitiful knowledge about this subject. Destructibility is not tied to a damn cloud; what MS is doing is basically using an external PC to offload some physics for the Xbox One, nothing GPU compute can't do. The PS4's GPU can do physics at a fraction of what it would take on a CPU, thanks to its highly parallel nature. GPUs, while not smart, are sheer brute force, which comes in handy for things like physics, which is why running PhysX on a GPU yields better results than on a CPU.

This ^^ was from 2009, six years ago. You can see how destructibility happens in that video, and not only does the object being hit get destroyed, the one behind it gets destroyed as well by debris going through it.

So go elsewhere and pretend you need a cloud for that...lol

It's like physics didn't exist before the cloud..hahahahaaaa

Latency, lost packets, everything will ruin it, everything. And I don't think the cloud is streaming power equivalent to a damn Titan over a freaking 2Mb connection; that doesn't make any freaking sense whatsoever. Do you know how much bandwidth a Titan GPU has? So if anything, MS is streaming a video of those physics already calculated, considering how one of the patents described an H264 stream, one of the last ones posted here.

That bold part confirms to me that you totally lack the knowledge to discuss this. Again, stop saying stupid things. You telling a Titan to do something doesn't require constant refreshing, for god's sake; I told squad teams what to do in SOCOM 2 more than a decade ago. What I am talking about is physics, where the results of the calculations have to be inside the frame before the frame is ready to render.

Or using the cloud to do complex GPU tasks, which require not only very fast bandwidth but also millisecond response times. A screen refreshes 30 or 60 times per second, meaning at 60 FPS a new frame is drawn every 16.6 ms. Your internet connection would be fried; it can't do it. Why in hell do you think the Xbox One has ESRAM as a luxury?

It is there because the DDR3 inside the Xbox One doesn't have the bandwidth to cope with the data both the GPU and CPU use. So don't tell me MS will stream power equivalent to a Titan GPU over a 2 or 4 Mb connection; it is physically impossible, period.
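A quick back-of-envelope sketch of that timing argument (the 60 ms round trip is an assumed illustrative figure, not a measurement):

```python
# Can a remote server's physics results arrive within one frame's budget?

def frame_budget_ms(fps):
    """Milliseconds available per frame at a given refresh rate."""
    return 1000.0 / fps

round_trip_ms = 60.0                 # assumed consumer internet round trip

budget_60 = frame_budget_ms(60)      # ~16.7 ms per frame at 60 FPS
budget_30 = frame_budget_ms(30)      # ~33.3 ms per frame at 30 FPS

print(round_trip_ms <= budget_60)    # False: the reply misses the frame
print(round_trip_ms <= budget_30)    # False: misses a 30 FPS frame too
```

This is why cloud offloading tends to target work that tolerates multi-frame latency (background destruction, AI planning) rather than anything that must land inside the current frame.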

Funny thing is, the Xbox One can barely maintain 900p 60 FPS, so you are not doing better. I'll tell you what the cloud will be: this gen's joke. It will be used in a few games from MS IPs to do some physics and some AI. Sadly for you, that doesn't increase the Xbox One's power. In fact, the whole 2 to 4 Mb optimization tells me the calculations are done on the server side, and what MS will probably stream you is a movie, or the results of them, which is no better than having baked lighting.

Please give me a link from Sony stating that the PS4 can't run a game like that. The only reason the PS4 can't run Crackdown is because it is an MS exclusive..hahahahaa

Good luck waiting for developers to bite into the cloud. Two years have passed already and most games are cloudless; hell, most don't even use it for dedicated game servers...lol

Again, 2009. The only reason you don't see this often is because developers don't want to go through all the trouble, and not all games would work with fully destructible worlds as far as story goes.

So Yeah the PS4 can do that crap.

NO. Physics is what determines how objects behave in games; it is not in any way related to anything graphical. You are as of this moment certified as a joke on this subject. You can have the best-looking game ever and have stupid physics or no physics at all. Physics is what tells the objects in a game how they should behave relative to the real world; some games are more accurate than others because, in the end, these are video games.

All that info serves for nothing. All of the Fortune 500 can run on MS's cloud; it says nothing and in no way helps the Xbox One.

Funny thing is, if MS has 300k servers for the Xbox One and Crackdown had 500,000 users at the same time, how would that work?

So that's roughly 2 per server, but wait, what about other games? How many Xbox Ones can run per server? Doing a controlled demo for the public is not the same as releasing the full thing, something MS has consistently failed at. Look at Kinect: they claimed it worked a certain way, only for it to come out later that the presentation was faked. Hell, don't look too far; look at the Forza 5 demo at E3 2013 vs. the retail product.

Really? Where is the link to that? EA uses its own servers for its games. And no, Titanfall is not an EA game; it is a Respawn game published by EA.

All EA games run on EA servers, including everything built on the Frostbite engine.

All I hear is blah, blah, blah, butthurt, goalpost moving, and denial.

No matter how badly you want it to, this will never happen on PS4 unless Sony pays big bucks to Google or MS. PS4 can't run Crackdown because they have no cloud powah. BF4 is already too much for the consoles, even with the PS4 using async compute.

Like I said before, the PS4 (and XB1) both lag during the destruction bits in BF4, and you really think that with this much added destruction the PS4 would be able to handle it? GTFO. This just shows how butthurt you are that you were wrong. Since the demo was shown, people have been going crazy about what MS has accomplished and about how good Azure is. Everyone thought it was a joke because they thought it could not be done. The only people talking bad about this are butthurt fanboys.

NO Physics is what determine how object behave in games is not in any way related to anything graphically,you are as of this moment certified as a joke on this subject<=======this statement right here is sad. It makes me feel bad for correcting you, because I feel like you didn't receive a proper education.

phys·ics (fĭz′ĭks) n.

1. (used with a sing. verb) The science of matter and energy and of the interactions between the two, grouped in traditional fields such as acoustics, optics, mechanics, thermodynamics, and electromagnetism, as well as in modern extensions including atomic and nuclear physics, cryogenics, solid-state physics, particle physics, and plasma physics.

2. (used with a pl. verb) Physical properties, interactions, processes, or laws: the physics of supersonic flight.

3. (used with a sing. verb) Archaic. The study of the natural or material world and phenomena; natural philosophy.

Physical properties of an object include things like its density, volume, and texture. Physical properties are a part of physics, duh. Do you think this whole moving-physics-to-the-GPU thing is for fun? NO, it's about getting the GPU to handle all the things that have to do with graphics.

LOL, I don't have to post a link saying this is not possible on the PS4; it's common knowledge. Sony does not have an infrastructure to compete with Azure, and they won't be building one anytime soon. Just that portion of MS is still worth more than Sony. You can't spend what you don't have.

Funny thing is if MS has 300k servers for the xbox one and crackdown had 500,000 users at the same time how will that work.?<<<<<=======And again you prove your ignorance; you are reaching for anything at this point. LMAO. This is old, and anyone who claims to know as much as you do about system architectures should know it.

QUESTION: How does a company with only two or three servers use them to run the operating system and applications that their workers use on a daily basis? How do those two or three servers actually handle the day-to-day processing for, say, 40-50 PCs, when the OS and applications are installed only on the servers, not on the 40-50 PCs? How does that work? LMAO. How does Sony supply PS Now to millions of people with only 50k servers? Ask @ronvalencia, he'll explain it to you LOLolol
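The multiplexing idea behind that question can be put in rough numbers. A toy capacity model, where every figure (16 cores, 10% average load per client) is an assumption for illustration, not a real Azure or PS Now spec:

```python
# A server doesn't dedicate itself to one client; it time-shares many
# sessions, each of which only needs a fraction of a core on average.

def clients_supported(server_cores, avg_core_fraction_per_client):
    """Approximate concurrent clients one server can carry."""
    return round(server_cores / avg_core_fraction_per_client)

# A hypothetical 16-core server where each thin client averages 10% of a core:
print(clients_supported(16, 0.1))   # 160 clients per server
```

The catch, of course, is that the model only holds while the per-client work really is light; sessions demanding a full GPU's worth of compute can't be oversubscribed this way.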

You really do just get all of your info from Google.

Basically, at this point you are looking for any way to discredit the cloud, but you can't. Provide me with a link that says MS's cloud is nothing more than streaming. If it were simply streaming, people would not be making a big deal out of this; the reason they are is that they know this is all happening in real time, and they thought this kind of technology was something of the future, not the now.

I'm not waiting for developers to bite into the cloud. Epic already has, so has Havok, and so has Nvidia. Nvidia being involved should already tell you that MS and Cloudgine are on to something.

Funny thing is, the Xbox One can barely maintain 900p 60 FPS, so you are not doing better. I'll tell you what the cloud will be: this gen's joke. It will be used in a few games from MS IPs to do some physics and some AI. Sadly for you, that doesn't increase the Xbox One's power. In fact, the whole 2 to 4 Mb optimization tells me the calculations are done on the server side, and what MS will probably stream you is a movie, or the results of them, which is no better than having baked lighting.

How can you mix a movie and a game? If MS is sending me back a clip of a movie that shows the destruction, what happens when someone steps under a falling building, or is inside a building being blown up or brought down? Think before you type. That would mean the cloud would have to know exactly what I was going to do next, but how? I'm playing in real time, but the movie is predetermined.......it just doesn't work, no matter how you try to deny it.

Oh, and nice try hiding behind PC PhysX. LOL, that was from a 2009 DESKTOP GPU that is still probably more powerful than the X1 and PS4, and it's from Nvidia; consoles use AMD. You're still reaching, bro. Remember, these consoles were already being designed and gimped by 2009.

The fact is you were wrong; everything you say about the cloud is wrong. You say it is just sending video signals, but it's not. This is cloud computing, not cloud streaming aka PS Now. Destruction sequences are not scripted; it is all happening in real time, just as everything does in Titanfall.

PS4 CAN'T DO CRACKDOWN-LEVEL DESTRUCTION.

Avatar image for babyjoker1221
babyjoker1221

1313

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#149  Edited By babyjoker1221
Member since 2015 • 1313 Posts

Seeing tormentos getting rekt over and over and over again never gets old. The poor little guy just keeps on trying though. I give him a C for effort.

Avatar image for mr_huggles_dog
Mr_Huggles_dog

7805

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#150 Mr_Huggles_dog
Member since 2014 • 7805 Posts

@l0ngshot said:
@mr_huggles_dog said:
@l0ngshot said:
@Cloud_imperium said:

Yeah, Witcher 3 runs like a slideshow on PS4. But cows love to brag about 1080p, so that's all that matters. Doesn't matter if they had to change the game's visuals from the original build or compromise fps. What matters is 1080p. 20 fps is cinematic anyway (even lower than cinematic, but you get my point).

haha so true. They will argue 1080p vs. 900p against Xbox and call it a victory, but when someone said MGSV will look much better on PC at 4K resolution, everyone said resolution doesn't matter. What total hypocrites.

I find that line of thinking quite arrogant.

I'm a so-called cow...and I wouldn't buy a shit game like something from the Witcher series for more than $10.

Resolution, GFX, and fps would affect my opinion.

I enjoyed the game regardless of what it scored and I couldn't care less about your opinion.

You cows use the latest Batman game to claim dominance for the PS4? Very well, add ignorance to the list too.

I didn't say you should care about my opinion.

I fear you're a lost cause.

I don't use Batman to claim dominance....I use it as an example of AAA-quality games being completely broken on PC.

VERY WELL!!!!!!