Sonysexual1's forum posts

Sonysexual1 • 811 Forum Posts • 0 Wiki Points • 2 Followers • Reviews: 0 • User Lists: 0

#1 Sonysexual1
Member since 2013 • 811 Posts

[QUOTE="Sonysexual1"]

[QUOTE="mems_1224"] prove it mems_1224

http://www.amazon.com/gp/bestsellers/2013/videogames/ref=zg_bs_tab_t_bsar

That's not proof. Try again.

http://www.vgchartz.com/article/251266/ps4-vs-xbox-one-pre-order-totals-to-august-24th-2013/


#2 Sonysexual1
Member since 2013 • 811 Posts

[QUOTE="Sonysexual1"]

[QUOTE="mems_1224"]lol Sony cant even win at home mems_1224

Judging from pre-order numbers from the US, the same could be said about Microsoft.

prove it

http://www.amazon.com/gp/bestsellers/2013/videogames/ref=zg_bs_tab_t_bsar


#3 Sonysexual1
Member since 2013 • 811 Posts

lol Sony can't even win at home mems_1224

Judging by pre-order numbers in the US, the same could be said about Microsoft.


#5 Sonysexual1
Member since 2013 • 811 Posts

[QUOTE="Sonysexual1"]

[QUOTE="p3anut"]

So where are the links for the X1?

p3anut

Microsoft hasn't released them yet.

"Silent" probably means slightly less noisy than the Xbox 360.

So why is ButDuuude claiming the PS4 is gonna be quieter?

Microsoft is hiding. If their cooling system and operating temperatures were better than the PS4's, shouldn't they have released that info by now?


#6 Sonysexual1
Member since 2013 • 811 Posts

[QUOTE="ButDuuude"]

 

 [QUOTE="sukraj"]

 

wheres the proof.

p3anut

Hard to believe people didn't know this:

http://n4g.com/news/1309262/mark-cerny-they-know-how-to-design-ps4-so-it-wont-overheat

Not only will the PS4 use less power, but it will run cooler too:

http://www.nowgamer.com/news/2015280/ps4_has_less_risk_of_failing_consoles.html

So where are the links for the X1?

Microsoft hasn't released them yet.

"Silent" probably means slightly less noisy than the Xbox 360.


#7 Sonysexual1
Member since 2013 • 811 Posts

[QUOTE="Wickerman777"]

 

PS4's audio chips aren't on the APU because Sony's engineers had common sense. Who cares about the audio? Seriously, how many debates about audio capabilities have ya seen around here? The APU is the most important part of the console. Ya should cram it with as many CPU and GPU resources as possible. That's what Sony did. MS, meanwhile, puts stuff like audio chips on there and runs out of room for ROPs and CUs. Stupid! But obviously they knew what they were doing; they had to know that putting crap like audio chips on the APU was going to mean less of the important stuff, like CUs, on the APU. But apparently they didn't care. They could have put the audio chips somewhere else but chose not to. This console is weak on purpose. If a weak console is what they wanted, why are they trying to talk themselves out of it now?

tormentos

 

It actually surprises me to see MS fit the audio block inside the APU rather than put it outside. With all the room the Xbox One has inside on that board, it should have been easy enough to fit the audio chip elsewhere and have more room for the GPU.

It was a lack of direction that hurt MS, and it's the reason Mattrick was kicked out. I really believed the third-console curse only happened to those who won two generations in a row, but it has struck MS big time; after they sold so many 360s, they dropped the ball.

Well, I think the audio chip is for Kinect voice recognition, not for gaming. Kinect is going to have to be running in the background 24/7, keeping track of 50+ commands in many different languages. The audio block will be used for decoding verbal commands on the fly (Microsoft stressed OS smoothness).

It's just so strange that Microsoft would invest in sound rather than graphical computing power. Kinect is the only explanation.


#8 Sonysexual1
Member since 2013 • 811 Posts

[QUOTE="Wickerman777"]So, here are couple of points about some of the individual parts for people to consider:

18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.
We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.
We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.
We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.

Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.

I still I believe that we get little credit for the fact that, as a SW company, the people designing our system are some of the smartest graphics engineers around they understand how to architect and balance a system for graphics performance. Each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes.

Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible then I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.

Thanks again for letting my participate. Hope this gives people more background on my claims.

http://www.neogaf.com/forum/showpost.php?p=80951633&postcount=195

 

What especially strikes me as absurd about that is his first claim about more GPU cores. The way he puts it, one could be left with the impression that fewer graphics cores > more graphics cores. Ughh, what?!!! If that's the case, I guess if you're building a gaming PC you'd be better off with a Radeon 7770 than a Radeon 7970, since a 7770 has just 10 cores to the 7970's 32, lol.

btk2k2

"18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU."
This is misleading. 50% more CUs does not equal 50% more performance because there are other factors to take into account, but it has nothing to do with the CPU, provided the CPU is quick enough, which they should be.

"Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall."
This is BS. It is a 6% clock speed bump; of course each CU is running 6% faster, but so is the whole GPU. The way he has written it suggests the 6% increase is cumulative for each CU, which is bogus and very misleading.

"We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."
There will be very few cases where you can stream data from both the DDR3 and the ESRAM. The ESRAM will not be in use at all times because the DDR3 feeding it is much slower. It will help with bandwidth, and I am sure it can peak as high as the PS4, but on average sustained throughput the PS4 will come out ahead; I just do not know by how much.

"We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles."
The PS4 CPU has not had its clock speed revealed as far as I am aware, though 1.6GHz does seem the most likely. The PS4 also has audio chips, just not on the APU.

"Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU."
This is misdirection. Yes, it will help the CPU read GPU-generated data, but the whole point of GPGPU is that the GPU does the processing, and in this regard the X1 is behind the PS4. It also does not answer the question of whether the CPU can read/write directly to the GPU cache.

I do not disagree that the X1 is well balanced; it is just that the PS4 is also well balanced at a higher tier of performance. The X1 is about as good as they could have made it with their initial design goals, available silicon and power budget, so it is not a shit box by any means. The issue is that the PS4 had different design goals, which meant they did not have to sacrifice APU space to fit in the ESRAM, which enabled them to have a more powerful GPU.

You should add that the genius Cerny added tons of ACEs (asynchronous compute engines) and modified the architecture to support fine-grained compute, so the GPU can do GPGPU work without taking a hit to the graphics of the games.

The Xbox One, on the other hand, uses an off-the-shelf GPU, so developers wanting to use GPGPU compute will have to sacrifice graphical fidelity.

Cerny is a genius. Who does Microsoft have that's on the same level?
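
For anyone who wants to sanity-check the numbers being thrown around in this thread, here is a rough back-of-the-envelope sketch in Python. It assumes the publicly reported GPU clocks (853MHz for the Xbox One, 800MHz for the PS4) and the standard GCN figures of 64 ALUs per CU at 2 FLOPs per cycle; the bandwidth values are the ones quoted above. It's an illustration of the arithmetic, not an official spec sheet.

```python
# Back-of-the-envelope GPU and memory-bandwidth comparison.
# Assumptions: 64 ALUs per GCN CU, 2 FLOPs per ALU per cycle (FMA),
# 853 MHz Xbox One GPU clock, 800 MHz PS4 GPU clock.

ALUS_PER_CU = 64
FLOPS_PER_CYCLE = 2  # a fused multiply-add counts as two operations

def gpu_gflops(cus, clock_mhz):
    """Theoretical peak compute for a GCN-style GPU, in GFLOPS."""
    return cus * ALUS_PER_CU * FLOPS_PER_CYCLE * clock_mhz / 1000.0

xbox_one = gpu_gflops(cus=12, clock_mhz=853)   # ~1310 GFLOPS
ps4      = gpu_gflops(cus=18, clock_mhz=800)   # ~1843 GFLOPS

print(f"Xbox One peak: {xbox_one:.0f} GFLOPS")
print(f"PS4 peak:      {ps4:.0f} GFLOPS")
# ~41% on paper, not 50%: the 6% clock bump narrows the raw CU-count gap a little.
print(f"PS4 advantage on paper: {ps4 / xbox_one - 1:.0%}")

# Memory bandwidth, peak-on-paper, as quoted in the thread (GB/s):
PS4_GDDR5 = 176
X1_DDR3, X1_ESRAM = 68, 204
print(f"X1 combined peak: {X1_DDR3 + X1_ESRAM} GB/s vs PS4 {PS4_GDDR5} GB/s")
# Note: the 272 GB/s figure only applies when the 32 MB of ESRAM and the DDR3
# are being read/written simultaneously; sustained throughput depends on how
# often that actually happens, which is btk2k2's point above.
```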


#9 Sonysexual1
Member since 2013 • 811 Posts

[QUOTE="tormentos"]

[QUOTE="StormyJoe"]

 

You just keep falling into that "PS4 games will look better" argument, don't you? I am going to have so much fun at your expense this next gen pointing out how the games look the same; but the XB1 versions end up being better because they are more feature-rich. You might want to start thinking about an alt. screen name now.

StormyJoe

 

This is so funny: on one side you admit the hardware is stronger, and on the other you refuse to see that the stronger hardware will give better results.

Let's see: all the secret sauce has died, the final specs are in, the Xbox One is weak, and your hopes and dreams are shattered. I think I will have more laughs with you; even a missing patch of grass will be enough to make fun of you, you know, since so many lemmings hyped more grass in Xbox 360 games. :lol:

How will the Xbox One version be more feature-rich? The PS4 supports party chat across the whole system, not just in games, unless you're talking about the ability to watch TV while I play a game, which I don't care for or like in any way.

As devs start to take advantage of the cloud for multiplayer, you will start to see gimped PS4 versions.

MMORPGs have been using server-side calculations... *Ahem* I mean, cloud computing... for a while now. Here's how they look:

[image: the-warlords-man-city.jpg]
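
To make the point concrete, here is a minimal hypothetical sketch in Python (all names are made up for illustration; this is not any real game's netcode) of what "offloading work to the server" looks like: the server can own latency-tolerant simulation state, but the client still has to render every frame locally, which is why server-side compute does not add graphical horsepower.

```python
import time

# Hypothetical illustration of server-offloaded simulation vs. local rendering.
# The "cloud" can update world state that tolerates latency (AI, economy,
# low-rate physics ticks), but per-frame rendering stays on the console.

class FakeCloud:
    """Stands in for a remote server; in reality this would be a network call
    with tens of milliseconds of round-trip latency."""
    def update_world(self, world_state, dt):
        # Latency-tolerant work: move NPCs, resolve combat, etc.
        world_state["npc_x"] += world_state["npc_speed"] * dt
        return world_state

def render_frame(world_state):
    # Latency-critical work: must finish locally in ~16 ms for 60 fps.
    # No amount of server compute helps here; the local GPU draws every pixel.
    print(f"drawing NPC at x={world_state['npc_x']:.2f}")

cloud = FakeCloud()
world = {"npc_x": 0.0, "npc_speed": 2.0}

for frame in range(3):
    world = cloud.update_world(world, dt=1 / 60)  # may arrive late; game interpolates
    render_frame(world)                            # always runs on the local hardware
    time.sleep(1 / 60)
```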


#10 Sonysexual1
Member since 2013 • 811 Posts
Looks worse than the PS4 UI.