Xbox One could be more powerful than you think

This topic is locked from further discussion.


#51 BattlefieldFan3
Member since 2012 • 361 Posts

@Tighaman said:

It's funny that everything has to be 1080p now, but the most stable 1080p 60fps is on the X1 with Forza, not Killzone. Killzone hits 60fps only in multiplayer, and only because of a lack of AI, and even then it's not stable. DF mentioned FORZA is SMOOTH AS BUTTER at 1080p 60fps while KILLZONE is UNSTABLE at 60fps, and both are 1st-party devs. I don't see what cows are seeing. The X1 gave you great games at every resolution, and 3rd-party devs will soon follow.

Killer Instinct 720p great game

Ryse 900p THE GRAPHIC KING AT THE MOMENT

FORZA 1080p 60fps stable with GREAT cloud AI

Y'all can keep telling yourselves those lies about the hardware, but the truth is in the software.

How do you know those games couldn't run on the PS4 with higher graphical settings?

Forza 5 doesn't look much better than last-gen racing games.


#52  Edited By MonsieurX
Member since 2008 • 39858 Posts

@BattlefieldFan3 said:


How do you know if those games can't be run on the PS4 with higher graphical settings?

Forza 5 doesn't look much better than last-gen racing games.

Because PS4 can't run X1 games.


#53 BattlefieldFan3
Member since 2012 • 361 Posts

@MonsieurX said:


Because PS4 can't run X1 games.

Assuming that the games had a PS4 port.


#54  Edited By MonsieurX
Member since 2008 • 39858 Posts

@BattlefieldFan3 said:


Assuming that the games had a PS4 port.

It can't run them, get over it.


#55  Edited By ReadingRainbow4
Member since 2012 • 18733 Posts

It just seems weak to me. Maybe a bit more powerful than the Wii U.

I guess you guys will have to "just wait" and see what its true power level is. Ryse of Vaseline and Forza Google Earth aren't really the most impressive first showings, I'd agree.


#56  Edited By Tighaman
Member since 2006 • 1038 Posts

@BattlefieldFan3: Because KILLZONE IS NOT A POWERHOUSE. Resolution is the ONLY thing it has going for it, that's it. This came right from the devs' mouths: since we didn't have ANY AI in MULTIPLAYER we could achieve 60fps, and it's still UNSTABLE, only achievable because of the lack of AI. Forza 5 does look better than Forza 4 no matter how you spin it, better everything with a STABLE 1080p at 60fps. WHAT's so hard to understand?


#57 ReadingRainbow4
Member since 2012 • 18733 Posts

@Tighaman said:

@BattlefieldFan3: Because KILLZONE IS NOT A POWER HOUSE [...] WHATs so hard to understand?

Are you mad?


#59 thegroveman
Member since 2012 • 123 Posts

Judging any console by the quality of its launch titles is foolish; these fanboy wars are beyond laughable.

Go back and play Resistance, Kameo, Oblivion, or any launch-window title from the 7th-gen consoles and tell me the quality of the graphics in any way reflected the future capability of the console.


#60 BattlefieldFan3
Member since 2012 • 361 Posts

@Tighaman said:

@BattlefieldFan3: Because KILLZONE IS NOT A POWER HOUSE [...] WHATs so hard to understand?

Graphical grunt isn't determined by AI alone.

Forza 5 is a racing game with last-gen graphics. The PS4 could probably run the game at 1080p and 60 FPS if there were a PS4 port.


#61 ReadingRainbow4
Member since 2012 • 18733 Posts

yup, he mad.


#62 BattlefieldFan3
Member since 2012 • 361 Posts

The point that many xboners are missing is that a game being exclusive doesn't mean it couldn't be done on another platform.


#63  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

@ronvalencia said:

LOL, the SPU ISA is roughly a subset of AltiVec. You're the fu**ing moron.

Also, the PC's IBM CELL PCI-Express add-on card is a full 8-SPU version. Again, you're the fu**ing moron.

My supplied link's context was against that company's i7 vs CELL test. LOL. The programmer of the x264 encoder software has labelled your company's i7 vs CELL test as BS. IBM CELL's PC commercial adventure was a failure.

PS3 SPE's IEEE-754 support is $hit, i.e. not comparable to Intel SSEx.

A programmer can use half of a processor's capability, LOL. Can you Fu*k off? It's clear you don't know what you are talking about.

As for my Sony hate..

I still own a Sony Vaio VGN-FW45 laptop (it has a Radeon HD 4650M with GDDR3) and it still has pretty good audio circuits, i.e. it's still better than my old DELL Studio XPS 1645. My Sony Vaio VGN-FW45 is still operational as my audio editing machine.

A Sony laptop owner should be able to identify Sony's desktop wireless app.

My Sony Vaio VGN-FW45 laptop has an external Radeon HD 5770 eGPU solution.

I also own a Sony 46-inch 1080p HDTV.

With my Sony laptop, I spent more $$ on a Sony device than your combined PS3 and PS4 hardware. F**k off, you don't know me.

My Intel Core™ i7 3635QM would smash the PS3's CELL processor, i.e. four sets of 256-bit wide Intel AVX SIMDs + 16 GT2 SIMD processors (Intel HD 4000) and Intel QuickSync.

Intel HD 4000 rivals the PS3 (combined 6 SPEs + RSX) and the Xbox 360 in gaming results.

You can claim anything you like, you fu**ing troll; fact is, what I posted wasn't a moron on a forum talking about Cell as if Cell had cores...lol

It was a company that did a test, and yeah, Cell beat an i7, period.

It's not the first time; Cell used to own CPUs in Folding@home as well. You had to bring GPUs in to actually beat it, and I remember how you used to bring Nvidia GPUs to compete with Cell..lol

The statement was from the x264 software programmer. lol. http://en.wikipedia.org/wiki/X264

Your company's encoding test for the PC was slower than the x264 software version at the SAME settings. LOL.

Winfast's quad-SPU PC add-on card was enhanced with H.264 decoder and encoder blocks. LOL. If the mighty SPU was so good with H.264, it wouldn't need the H.264 decoder and encoder blocks. Stupid cow.

Intel QuickSync is a hardware encoder block for H.264. LOL. Your company's comparison can F**k off.

CELL is a dead-end architecture and even Sony knows it, i.e. the PS4 says hi.


#64  Edited By indigenous_euphoria
Member since 2013 • 255 Posts

PS4 isn't more powerful than X1 though 0.o....yeah I said it!


#65 Tighaman
Member since 2006 • 1038 Posts

@BattlefieldFan3: Man, that's a stupid excuse. You can say the same thing about Killzone: it's not doing anything new either, it's just Killzone with color and a higher resolution, that's it. And it's an FPS, which means fewer polys needed, and it's still not 1080p 60fps in SP, only multiplayer, and had it needed AI in multiplayer it wouldn't have achieved that. And it's not stable, so how are you gonna make that claim?


#66  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Gue1 said:

XB1 has weaker CPU, weaker GPU and its media features take a percentage of those resources too. XB1 is not only weaker but MUCH weaker than the PS4.

A web browser wouldn't show the CPU's true computational capabilities, and this is why we have 3DMark's physics test.

-------

For lems:

PS4's GCN is slightly stronger than X1's GCN, i.e. a prototype 7850 with 12 CUs wasn't able to match the retail 7850 with 16 CUs, and both have the same memory bandwidth. PS4's GCN is slightly stronger than the retail 7850 at 860 MHz.


#67 PinkieWinkie
Member since 2006 • 1454 Posts

@Tighaman: Are you seriously comparing Forza 5 to Killzone SF? Forza isn't even comparable, man. The X1 cannot run KZ:SF at 1080p and the same fps.


#68  Edited By Tighaman
Member since 2006 • 1038 Posts

@PinkieWinkie: Are you really trying not to? These are two flagship titles, both take a hell of a lot to run, both by first-party devs to showcase their consoles. Are you trying to tell me it takes more resources to run Killzone than Forza? Are you serious? You're just talking out your ass and you really need to stop it. KILLZONE IS NOT RUNNING AT 60fps IN SP; ONLY WITHOUT ANY AI WERE THEY ABLE TO ACHIEVE 60fps IN MP, THAT'S THE ONLY REASON. SO YES, THE X1 WOULD EASILY RUN IT, AND BETTER, BECAUSE THEY WOULD USE THE CLOUD FOR AI JUST LIKE FORZA, HENCE 60fps IN SP AND MP.


#69 silversix_
Member since 2010 • 26347 Posts

It's actually weaker than we all think. The system wasn't made for gaming, so it's not important anyway.


#70 silversix_
Member since 2010 • 26347 Posts

@ronvalencia said:
@tormentos said:


@ronvalencia said:

Not against Intel's Quick Sync, i.e. hardware extensions that were specifically designed for video encoding.

-------------

From the x264 software developer on IBM CELL vs Intel Core i7 920: http://forum.doom9.org/showpost.php?p=1454286&postcount=2

Question: While it was working, however, it worked great thanks to the power of the Cell and proved to be faster than a Core i7 920.

Answer: No, it wasn't. It was way, way, way slower than x264 on a Core i7 with similar settings. Sure, if you put x264 on slow settings and it on fast settings, it was faster -- but that's hardly a surprise.

The Cell is a pretty slow CPU. It takes roughly 2.5 cores (out of 8) to do realtime 1080p H.264 decoding with a highly optimized decoder. A fast i7 can do that with about ~0.4 cores (out of 4 or 6).

You are a fu**ing moron; did you even read what you quoted?

First of all, Cell only had 7 SPEs; the 8th one was disabled for redundancy.

Second, Cell doesn't have cores, you idiot; it has 1 PPE and 7 SPEs, and they are not actual cores nor do they work exactly like cores.

And third, this is the first time I've heard someone claiming 2.5 cores, as if you could divide an SPE down to half of one; you either use one or you don't.

Also, nice quoting people on a forum; your hate for Sony is so great that it makes you look stupid. Once again, a company did a test and Cell >>> i7.....Cry all you want..lol


You two should marry each other and invite me to the marriage.
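Setting the flame war aside, the decoder figures quoted in that exchange can be turned into a rough per-core comparison. This is only a sketch that takes the x264 developer's numbers at face value: about 2.5 of the Cell's 8 SPUs versus about 0.4 of an i7 core for one realtime 1080p H.264 decode.

```python
# Streams-per-core implied by the quoted decode costs.
cell_spus_per_stream = 2.5   # quoted: ~2.5 of 8 SPUs per 1080p stream
i7_cores_per_stream = 0.4    # quoted: ~0.4 of an i7 core per stream

streams_per_spu = 1 / cell_spus_per_stream   # 0.4 streams per SPU
streams_per_core = 1 / i7_cores_per_stream   # 2.5 streams per i7 core
gap = streams_per_core / streams_per_spu     # per-core advantage

print(f"one i7 core ~= {gap:.2f} Cell SPUs for this decoder")
```

On those numbers, a single i7 core does the work of roughly six SPUs for this particular decoder, which is the gap the quote is pointing at.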


#71  Edited By PinkieWinkie
Member since 2006 • 1454 Posts

@Tighaman: Forza more resources than KZ... I lol'd. Son, you are delusional.


#72 RimacBugatti
Member since 2013 • 1632 Posts

Xbone won't be able to keep up with this gen. It's already struggling with games like Forza 5. And games like Project Cars? No way in hell. We will be stuck playing Angry Birds. Xbone is toast! Face it!


#73 GrenadeLauncher
Member since 2004 • 6843 Posts

Just give it up already.


#74 FoxbatAlpha
Member since 2009 • 10669 Posts

@indigenous_euphoria: agreed.


#75 Serioussamik
Member since 2010 • 773 Posts

@leandrro said:

@FreedomFreeLife said:

Yeah, when the Xbox 360 came out we saw Gears of War, while the PS3 had games like Lair and Haze.

The PS3 is maybe 5% more powerful than the Xbox 360, yet the Xbox 360 had better-looking games than the PS3 (until Uncharted came out).

So right now we see lower res on the Xbox One, but maybe that's because they haven't developed for it well yet?

So maybe the Xbox One is this time like the PlayStation 3: slow start, shitty ports, but in time we'll see games that run 1080p and 60fps without problems.

Yes, the PS4 is more powerful than the Xbox One, but this doesn't mean that the Xbox One can't do 1080p games.

The X360 has a 10% better graphics card than the PS3; after so many years, all X360 games still look 10% better.

The PS4 has an 80% faster graphics card than the X1; in 2020, PS4 games will still look 80% better.

LOL!!! Post that again for me next year...


#76 blackace
Member since 2002 • 23576 Posts

@FreedomFreeLife: Well, if that overlapping stack on the AMD chips turns out to be true, the ball game will be over. There is supposed to be some major software upgrade going to the XB DevKits this month as well. Not entirely sure what that is supposed to do. I just remember a developer saying they were waiting patiently for it. In any case, 2014 is going to be fun.


#77  Edited By Tighaman
Member since 2006 • 1038 Posts

@blackace: This pic is going WILD over the forums, lol. I wish someone would FULLY explain it to me. It looked stacked, but who knows; there are too many COWS and MS HATERS on this board to hear beyond all the mooing. These are some things I read that were clouded out by COWS:

3 Xbox Ones in the cloud to 1 Xbox inside the house. The Ethernet is connected to the SOC. WHY?

over 272GB/s total bandwidth, maybe more, after reading about the flash NAND and it being used efficiently

the Yukon diagram had two APUs

a Japanese dev said it had two APUs

Yukon and Durango are the same thing

the MAIN SOC?

I could go on and on, and this is not from fanboys or bloggers; every word I said is from someone THAT HAS SEEN INSIDE OR IS DEVELOPING WITH THE XBOX ONE


#78 blackace
Member since 2002 • 23576 Posts

@Tighaman said:

@blackace: this pic is going WILD over the forums lol I wish someone would FULLY explain it to me it looked stack but who know but its too many COWS and MS HATERS on this board to hear beyond all the mooooooooin these are somethings I read that was clouded by COWS


The Apocalypse is coming. Everything is rumors right now, but there are developers and people inside Microsoft talking. There's some gag order and a certain court case with AMD, defectors, and NVidia going on that's holding the beast chained in its cage. All should be revealed at E3.


#79 tormentos
Member since 2003 • 33798 Posts

@blackace said:

@FreedomFreeLife: Well, if that overlapping stack on AMD chips turns out to be true, the ball game will be over. There is suppose to be some major software upgrade going to XB DevKits this month as well. Not enitrely sure what that is suppose to do. I just remember a developer saying they were waiting patiently for it. In any case, 2014 is going to be fun.

Hahahahaahaaaaaaaaaaaaaaaaaaaaaaa................

Le Secrete sauce strike back...hahahahaaaaaaaaaaa

@Tighaman said:

@blackace: this pic is going WILD over the forums lol I wish someone would FULLY explain it to me it looked stack but who know but its too many COWS and MS HATERS on this board to hear beyond all the mooooooooin these are somethings I read that was clouded by COWS


My god, I never read so much garbage and misinformation in my life; you people are still going with the damn secret sauce.

It's nothing, and the cloud is sh**, period, sh**. You can't pass data fast enough over a damn internet connection to mean anything. When you get a 100Gb internet line, then call me; that is when you can at least do something. Graphics cards work with GBs of data a second, not MBs a second, period. You can have a 100MB connection and that cloud will help you for sh**.

What you can do is very limited, and it's already proven: Titanfall uses the cloud and looks like utter sh**. 3 Xbox Ones of power from the cloud, my ass. Some of you don't have the slightest idea how impossible MS's claims are, and I am not even bringing up latency issues.

The Xbox One has 140 to 150 GB/s of bandwidth; that is all you will get, and it is the practical number. The rest is theory, which means sh**, and MS themselves confirmed this to DF, saying 140 to 150GB/s is what can be achieved in reality.

See you next year when all this crap fails to materialize and you and blackace look like 2 morons running on secret-sauce crap. The Xbox One is weak, move on and admit it already, because even MS already admits it.
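The bandwidth argument here is ultimately a units comparison, and it can be sketched in a few lines. The figures are assumptions for illustration: a 100 Mbit/s home connection versus the ~176 GB/s theoretical GDDR5 bandwidth commonly cited for the PS4.

```python
# Compare a fast home internet link against local GPU memory bandwidth.
net_bits_per_s = 100e6                # assumed 100 Mbit/s connection
net_bytes_per_s = net_bits_per_s / 8  # = 12.5 MB/s of payload at best

gddr5_bytes_per_s = 176e9             # ~176 GB/s theoretical GDDR5 figure

ratio = gddr5_bytes_per_s / net_bytes_per_s
print(f"local memory is roughly {ratio:,.0f}x faster than the link")
```

Even ignoring latency entirely, the link moves data about four orders of magnitude slower than local memory, which is the core of the objection to offloading rendering work to the cloud.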


#81 tormentos
Member since 2003 • 33798 Posts

@Tighaman said:

@tormentos: See, this is what I'm talking about. How is this misinformation when it comes from people who worked on and developed the damn console, but the shit that comes out of your mouth isn't? Where are your CREDS? That's right, you have none. If you read beyond these forums like I tell you to do, my son, lol, you would understand that there are companies in Germany right now doing GPU farms, so don't tell me it can't be done when it's being done already. Why are you always trying to debunk shyt with no knowledge of the situation? In my life they say don't speak on it if you never experienced it; that's being a f**k boy in my book.

Dude, the Xbox One was broken apart; there are no 2 APUs and there is no secret sauce. The GPU has 14 CUs with 2 disabled for yields, as stated by MS itself. Do you believe that if MS had 2 GPUs they would risk overheating by raising the GPU and CPU speeds?

It's common sense, which you basically lack, period. There is no secret sauce; there is nothing. The damn drivers, all they will do is fix some minor optimization problems, and that's it.

I also read beyond these forums, but I don't read Mister X crap like you. The fact that you even name the damn cloud says it all; it is pure bullsh**. It means dedicated servers for games, nothing more. GPUs work with massive information movement, and the internet can't deliver it, period; they are not even close. Even a 2000-era GPU had way faster bandwidth than the best connection available now.

The Xbox One is made in China, not Germany, and it's a cheap piece of sh** with power equivalent to that of a 7770, nothing more, nothing less. There is no hidden GPU or anything, and you will look stupid down the road when the PS4 keeps owning the Xbox One.

But if you like being made fun of, be my guest...lol


#82  Edited By blackace
Member since 2002 • 23576 Posts

Did Tormentos say something? LMAO!!

http://weknowgifs.com/wp-content/uploads/2013/08/didnt-read-lol-gif-6.gif


#83 Wickerman777
Member since 2013 • 2164 Posts

Xbox One has 12 GPU compute units, PS4 has 18, a 50% difference. There's no getting around that. It's a bigger gap than there was between the X360 and PS3. I expected both consoles to have 2.5 TFLOPS of graphics performance, so even the PS4 is disappointing to me. But it blows my mind how weak the GPU in the Xbox One is considering what's possible today. Had someone told me ahead of time that the new Xbox would be only 1.3 TFLOPS, I would have laughed and called them nuts. And despite being so much weaker, it costs MS more to make than the PS4 does Sony. It's a big-time design failure if ever there was one. Because of it I'll be picking Sony for the first time since the PS1. MS has left me with little choice.
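The compute-unit counts map onto the familiar teraflop figures with simple arithmetic. As a sketch, assuming the widely reported clocks (800 MHz for the PS4, 853 MHz for the Xbox One) and GCN's 64 shader lanes per CU at 2 FLOPs (one fused multiply-add) per lane per cycle:

```python
# Peak GCN throughput: CUs x 64 lanes x 2 FLOPs (FMA) x clock (GHz) -> TFLOPS.
def gcn_tflops(compute_units: int, clock_ghz: float) -> float:
    return compute_units * 64 * 2 * clock_ghz / 1000.0

ps4 = gcn_tflops(18, 0.800)  # ~1.84 TFLOPS
xb1 = gcn_tflops(12, 0.853)  # ~1.31 TFLOPS
print(f"PS4 {ps4:.2f} TF, XB1 {xb1:.2f} TF, ratio {ps4 / xb1:.2f}x")
```

That reproduces the ~1.84 versus ~1.3 TFLOPS numbers in circulation: roughly a 1.4x gap in peak shader throughput, before any memory-system differences.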


#84 tormentos
Member since 2003 • 33798 Posts

@blackace said:

Did Tormentos says something? LMAO!!

Dat secret Sauce...But but I am a manticore...lol

@Wickerman777 said:

Xbox One has 12 GPU compute units, PS4 has 18, a 50% difference. [...] MS has left me with little choice.

Long time no see bro..


#85 AmazonTreeBoa
Member since 2011 • 16745 Posts

@MonsieurX said:


It can't run them, get over it.

Are you trolling or really that stupid? I got to ask.


#86 Shielder7
Member since 2006 • 5191 Posts

@Wasdie said:

You're all over the place with this post; it's hard to know exactly what you're even saying.

It doesn't matter what point you're trying to make; the fact is the Xbox One is weaker than the PS4 in every single area. Since both machines have practically the same architecture, there is no "hidden power" to be found. The Xbox One can easily do 1080p games if devs are willing to cut down the graphics in other areas to get 1080p at a playable framerate.

What happened in past generations does not apply to this gen. This gen we have 2 computers whose internals are built in the same way by the same company to nearly the same spec. The PlayStation 4 has a more efficient memory architecture, faster RAM, a significantly more powerful GPU, and no Kinect requirement draining system resources.

It's that simple this gen.

What??? ............. What about the super overpowered 32MB of eSRAM and the supreme powerz of the cloud?


#87  Edited By Jacobistheman
Member since 2007 • 3975 Posts
@Wickerman777 said:

Xbox One has 12 GPU compute units, PS4 has 18, a 50% difference. There's no getting around that. It's a bigger gap than there is between X360 and PS3. I expected both consoles to have 2.5 tflops of graphics performance so even PS4 is disappointing to me. But it blows my mind how weak the GPU in Xbox One is considering what's possible today. Had someone told me ahead of time that the new Xbox would be only 1.3 tflops I would have laughed and called them nuts. And despite being so much weaker it costs them more to make it than PS4 does Sony. It's a bigtime design failure if ever there was one. Because of it I'll be picking Sony for the first time since PS1. MS has left me with little choice.

The flops...

The PS3 had like 2 times the TFLOPS of the Xbox 360, and the PS2 had 4.2 times that of the original Xbox. Yet the PS3 and 360 looked almost identical, and the original Xbox was clearly the more powerful console graphics-wise.


#88 blackace
Member since 2002 • 23576 Posts

LMAO!! E3 is going to be so much fun this year.

http://mrwgifs.com/wp-content/uploads/2013/05/Denzel-Washington-Boom-Gif.gif


#89 Wickerman777
Member since 2013 • 2164 Posts

@Jacobistheman said:
@Wickerman777 said:

Xbox One has 12 GPU compute units, PS4 has 18, a 50% difference. [...] MS has left me with little choice.

The flops...

The PS3 had roughly 2 times the TFLOPS of the Xbox 360, and the PS2 had 4.2 times that of the original Xbox. Yet the PS3 and 360 looked almost identical, and the original Xbox was clearly the more powerful console graphics-wise.

Umm, wildly different architectures. Cell has a bunch of theoretical flops, but they rarely show up in real-world performance. PS4 and X1 use the same AMD GCN architecture, so the numbers are directly comparable. Basically you've got an AMD 7790 in the X1 and a 7870 in the PS4 (with 2 compute units disabled in each for yields).

#90  Edited By Tighaman
Member since 2006 • 1038 Posts

@tormentos: man, you can keep telling yourself it has a Bonaire 7770 in it, but I say it has a custom Bonaire 260X in it at 853 MHz. I think my specs are closer than yours, and as you can see they can upclock it at will, up to 1.95 TF at 1 GHz, though they might want to stay a little lower for cooling. These are my actual thoughts on what's in the Xbox, and I don't care what you say about that opinion, because everyone has their own.

But what I was saying in the last topic WAS FROM PEOPLE ON THE INSIDE, NOT FROM ME, SO DON'T KILL THE MESSENGER. 2 APUs: that's not from me. 274+ GB/s bandwidth: not from me. THERE'S NOT ONE X-RAY OF THE XBOX SOC, NOT ONE, NO CUT-THROUGH of the chip like they did with the PS4. Why?

EVEN THE iFixit TEARDOWN PHOTOS, all of them, are Photoshopped, with transistors removed from the CHIP. Go look, then please come back to me: why is it clear what the PS4 has, but not the X1? Why?

All I want is answers; I don't want to hear bullshit. People can tell me JFK got hit three times with one magic bullet all day, but I know what I see and read, so TORM, answer the questions please.

#91 cainetao11
Member since 2006 • 38078 Posts

@FreedomFreeLife said:

Yeah, when the Xbox 360 came out we saw Gears of War, while the PS3 had Liar and Haze.

The PS3 is maybe 5% more powerful than the Xbox 360, yet the Xbox 360 had better-looking games than the PS3 (until Uncharted came out on the PS3).

So right now we see lower res on the Xbox One, but maybe that's because developers haven't gotten good at it yet?

So maybe the Xbox One is like the PlayStation 3 this time: slow start, shitty ports, but in time we'll see games that run at 1080p and 60fps without problems.

Yes, the PS4 is more powerful than the Xbox One, but that doesn't mean the Xbox One can't do 1080p games.

I never played LIAR on the PS3. I know a few liars here when it comes to the PS3, but never played the game. On topic: with its own eyes............. the Xbox One could try to take over the world!!

#92  Edited By blackace
Member since 2002 • 23576 Posts

Microsoft is hiding something. It will be revealed in the near future.

People keep asking, "Why would Microsoft use this GPU and have a weaker system, yet draw more power than the PS4?"

Then think back. Xbox: more power. Xbox 360: more powerful than what people thought. XB1................? LOL!!!

BOOM!!!

#93 tormentos
Member since 2003 • 33798 Posts

@Tighaman said:

@tormentos: man, you can keep telling yourself it has a Bonaire 7770 in it, but I say it has a custom Bonaire 260X in it at 853 MHz. I think my specs are closer than yours, and as you can see they can upclock it at will, up to 1.95 TF at 1 GHz, though they might want to stay a little lower for cooling. These are my actual thoughts on what's in the Xbox, and I don't care what you say about that opinion, because everyone has their own.

But what I was saying in the last topic WAS FROM PEOPLE ON THE INSIDE, NOT FROM ME, SO DON'T KILL THE MESSENGER. 2 APUs: that's not from me. 274+ GB/s bandwidth: not from me. THERE'S NOT ONE X-RAY OF THE XBOX SOC, NOT ONE, NO CUT-THROUGH of the chip like they did with the PS4. Why?

EVEN THE iFixit TEARDOWN PHOTOS, all of them, are Photoshopped, with transistors removed from the CHIP. Go look, then please come back to me: why is it clear what the PS4 has, but not the X1? Why?

All I want is answers; I don't want to hear bullshit. People can tell me JFK got hit three times with one magic bullet all day, but I know what I see and read, so TORM, answer the questions please.

First of all, Bonaire is not the 7770; that is Cape Verde. Bonaire is the 7790.

Second, the R7 260X is a rebadged 7790 overclocked to 1.1 GHz. As you may have noticed, the Xbox One doesn't run at 1,100 MHz; it runs at 853. What does that mean?

Even if by some small chance that were true, it would mean the Xbox One is running 247 MHz slower than the R7 260X while having 2 fewer CUs active, which means the same sh**: the Xbox One is underpowered and crap even if it had an R7 260X, because the R7 260X is just an overclocked, rebadged 7790.

Now let's see this somehow-secret performance that you've seen in the R7 260X over the normal Bonaire 7790...

Oh god, the R7 260X does almost 2 frames more in BF4 than the normal Bonaire 7790. Holy cow, now that is true power. Wow, I just can't imagine such a huge gain from 100 MHz more... wow, incredible...

You know what I find funny: even at 1.97 TF and clocked 240 MHz faster than the 1.76 TF 7850, it's still the 7850 that's faster... lol...

Bonaire = the broken link in GCN: the only GCN chip with more flops than another GCN chip that is still unable to beat the one with fewer flops. Those 16 ROPs must really be hurting that R7 260X.

The R7 260X runs at 1.1 GHz, so even if MS by some miracle upclocked to 1 GHz, it would still be 100 MHz underclocked vs. the R7 260X. And second, where the fu** do you get that 12 CUs at 1 GHz will give you 1.95 TF?

The Xbox One has 12 CUs, not 14 like the R7 260X. See, this is why you should not talk about crap you don't know; you just look stupid, and the more you try the more stupid you look.

There are not 2 APUs. You and Mister X are morons, you more so for believing him, since he does it to troll sad people like you into thinking the Xbox One has something hidden when it doesn't...

The R7 260X is a rebadged Radeon HD 7790 that has been overclocked, with cards running at 1.1GHz opposed to 1GHz. Jumping up in speed we have the R9 270X which is a rebadged Radeon HD 7870 (more about these two in a second). Finally, the R9 280X which we'll eventually retest looks to be a direct copy of the Radeon HD 7970 GHz Edition.

http://www.techspot.com/review/722-radeon-r9-270x-r7-260x/

Class dismissed...
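The CU arithmetic both posters keep arguing about can be checked directly. For GCN parts, theoretical single-precision throughput is CUs × 64 shaders per CU × 2 FLOPs per clock (multiply-add) × clock. A minimal sketch, using the clocks discussed above:

```python
# Theoretical single-precision throughput for a GCN GPU:
# TFLOPS = CUs x 64 shaders/CU x 2 FLOPs/clock (multiply-add) x clock (MHz)
def gcn_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1e6

print(gcn_tflops(12, 853))   # Xbox One: 12 CUs @ 853 MHz  -> ~1.31 TFLOPS
print(gcn_tflops(18, 800))   # PS4:      18 CUs @ 800 MHz  -> ~1.84 TFLOPS
print(gcn_tflops(14, 1100))  # R7 260X:  14 CUs @ 1.1 GHz  -> ~1.97 TFLOPS
print(gcn_tflops(12, 1000))  # 12 CUs @ 1 GHz -> ~1.54 TFLOPS, not 1.95
```

The last line is the point of the post above: 12 CUs at 1 GHz tops out around 1.54 TF; 1.97 TF requires the R7 260X's 14 CUs at 1.1 GHz.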

#94 Jacobistheman
Member since 2007 • 3975 Posts

@Wickerman777 said:

@Jacobistheman said:
@Wickerman777 said:

Xbox One has 12 GPU compute units, PS4 has 18: 50% more. There's no getting around that. It's a bigger gap than there was between the X360 and PS3. I expected both consoles to have 2.5 TFLOPS of graphics performance, so even the PS4 is disappointing to me. But it blows my mind how weak the GPU in the Xbox One is considering what's possible today. Had someone told me ahead of time that the new Xbox would be only 1.3 TFLOPS, I would have laughed and called them nuts. And despite being so much weaker, it costs MS more to make than the PS4 costs Sony. It's a big-time design failure if ever there was one. Because of it I'll be picking Sony for the first time since the PS1. MS has left me with little choice.

The flops...

The PS3 had roughly 2 times the TFLOPS of the Xbox 360, and the PS2 had 4.2 times that of the original Xbox. Yet the PS3 and 360 looked almost identical, and the original Xbox was clearly the more powerful console graphics-wise.

Umm, wildly different architectures. Cell has a bunch of theoretical flops, but they rarely show up in real-world performance. PS4 and X1 use the same AMD GCN architecture, so the numbers are directly comparable. Basically you've got an AMD 7790 in the X1 and a 7870 in the PS4 (with 2 compute units disabled in each for yields).

You go on a rant about flops and then agree that theoretical flops rarely show up in real-world performance?

Even in two systems with similar architectures, flops don't make a huge difference unless the GPU waiting on floating-point work is the bottleneck (which it isn't in most cases).

Sure, the PS4's GPU is more powerful than the Xbox One's, but there is no way it has 50% more graphics capability just because 50% more shaders give it 50% more TFLOPS.

#95  Edited By Gaming-Planet
Member since 2008 • 21107 Posts

You can do native 1080p games at the cost of visuals. The scaler in the Xbox One is slightly better than the PS4's, but it also makes games look somewhat dark. That's OK; some games get away with upscaled 1080p. Optimizing can work wonders, but you can't push past a limit once you've hit it.

#96 tormentos
Member since 2003 • 33798 Posts
@Jacobistheman said:
@Wickerman777 said:

Xbox One has 12 GPU compute units, PS4 has 18: 50% more. There's no getting around that. It's a bigger gap than there was between the X360 and PS3. I expected both consoles to have 2.5 TFLOPS of graphics performance, so even the PS4 is disappointing to me. But it blows my mind how weak the GPU in the Xbox One is considering what's possible today. Had someone told me ahead of time that the new Xbox would be only 1.3 TFLOPS, I would have laughed and called them nuts. And despite being so much weaker, it costs MS more to make than the PS4 costs Sony. It's a big-time design failure if ever there was one. Because of it I'll be picking Sony for the first time since the PS1. MS has left me with little choice.

The flops...

The PS3 had roughly 2 times the TFLOPS of the Xbox 360, and the PS2 had 4.2 times that of the original Xbox. Yet the PS3 and 360 looked almost identical, and the original Xbox was clearly the more powerful console graphics-wise.

That's not true; they both lied about flops, MS and Sony, and it was MS who started it. MS claimed the Xbox 360 was 1 TF when in real life it wasn't even close, then Sony came along and said 2 TF, which again wasn't even close.

The real flops are a little higher overall on the PS3 because of Cell: Cell had way more flops than the Xenon CPU, but the Xenos GPU had more flops than the RSX.

One system is around 230 GFLOPS overall and the other around 250 GFLOPS.
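Those ballpark figures can be roughly reconstructed from the published clocks and ALU counts. Treat this as a sketch only: the unit counts are the commonly cited ones, and the totals shift depending on what you choose to include:

```python
# Rough programmable single-precision FLOPS for the last-gen chips,
# counting multiply-add as 2 FLOPs/clock. Unit counts are the commonly
# published ones; exact totals vary with what you include.

# PS3 Cell @ 3.2 GHz: 1 PPE + 7 active SPEs, each with 4-wide SIMD
cell_gflops = (1 + 7) * 4 * 2 * 3.2    # ~204.8 GFLOPS (games only see 6 SPEs)

# Xbox 360 Xenos @ 500 MHz: 48 unified ALUs, each vec4 + scalar
xenos_gflops = 48 * (4 + 1) * 2 * 0.5  # 240.0 GFLOPS

print(cell_gflops, xenos_gflops)
```

The marketing "1 TF" and "2 TF" numbers folded in fixed-function GPU work, which is why they dwarf these programmable-shader totals.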

@blackace said:

Microsoft is hiding something. It will be revealed in the near future.

When people keep saying, "Why would Microsoft use this GPU, have a weaker system, but use more power then the PS4?"

Then think back. XBox more power. XBox 360 More powerful then what people thought. XB1................? LOL!!!

BOOM!!!

Lol... cover yourself, your inner lemming is showing, my manticore friend... hahahaha

#97  Edited By tormentos
Member since 2003 • 33798 Posts
@Gaming-Planet said:

You can do native 1080p games at the cost of visuals. The scaler on Xbox One is slightly better than the PS4, but also makes the game look sort of dark. It's ok for some games get away with 1080p upscaled. Optimizing can work wonders but you can't push the limit if you hit one.

The scaler in the Xbox One is the same as the one in the PS4; they both have the same GCN hardware with the same scaler. The PS4 simply isn't using its scaler, because 99% of its games are native 1080p.

@Jacobistheman said:

You go on a rant about flops and then agree that theoretical flops rarely show up in real-world performance?

Even in two systems with similar architectures, flops don't make a huge difference unless the GPU waiting on floating-point work is the bottleneck (which it isn't in most cases).

Sure, the PS4's GPU is more powerful than the Xbox One's, but there is no way it has 50% more graphics capability just because 50% more shaders give it 50% more TFLOPS.

Not even on PC does 50% more power = 50% better graphics; it mostly means more frames, usually with more effects on, which makes the game look a little better.

The 7850 can run at the same settings the 7950 can; the 7950 is just faster, that is all. One does it at 20 fps, the other at 30 or 35. In some cases, like the Xbox One's, to actually keep frame-rate parity you have to sacrifice resolution, because if you match a stronger card's visuals you get the visuals but not the frames, and the game is unplayable.
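The resolution trade-off described here is easy to put numbers on: per-frame shading work scales roughly with pixel count, a simplification that ignores geometry, post-processing, and other fixed costs:

```python
# Pixels per frame at the resolutions argued over in this thread.
# Shading cost scales roughly with pixel count (a simplification that
# ignores geometry, post-processing, and other fixed per-frame costs).
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    ratio = pixels["1080p"] / count
    print(f"{name}: {count:,} pixels; 1080p is {ratio:.2f}x the work")
```

So dropping from 1080p to 900p cuts per-pixel work by about 1.44x, and to 720p by 2.25x, which is how a weaker GPU buys back frame rate.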

#99  Edited By ronvalencia
Member since 2008 • 29612 Posts

@silversix_ said:

@ronvalencia said:
@tormentos said:


@ronvalencia said:

Not against Intel's Quick Sync, i.e. hardware extensions that were specifically designed for video encoding.

-------------

From x264 software developer on IBM CELL vs Intel Core i7 920, http://forum.doom9.org/showpost.php?p=1454286&postcount=2

Question: While it was working, however, it worked great thanks to the power of the Cell, and proved to be faster than a Core i7 920.

Answer: No, it wasn't. It was way, way, way slower than x264 on a core i7 with similar settings. Sure, if you put x264 on slow settings and it on fast settings, it was faster -- but that's hardly a surprise.

The Cell is a pretty slow CPU. It takes roughly 2.5 cores (out of 8 ) to do realtime 1080p H.264 decoding with a highly optimized decoder. A fast i7 can do that with about ~0.4 cores (out of 4 or 6).

You are a fu**ing moron. Did you even read what you quoted?

First of all, Cell only had 7 SPEs, not 8; the 8th one was disabled for redundancy.

Second, Cell doesn't have cores, you idiot; it has 1 PPE and 7 SPEs, which are not actual cores and don't work exactly like cores.

And third, this is the first time I've heard someone claim 2.5 cores, as if you could divide an SPE down to half of one; you either use one or you don't.

Also, nice quoting people on a forum. Your hate for Sony is so great that it makes you look stupid. Once again, a company did tests and Cell >>> i7... Cry all you want... lol

LOL, the SPU ISA is roughly a subset of Altivec. You're the fu**ing moron.

Also, the PC's IBM CELL PCI-Express add-on card is a full 8-SPU version. Again, you're the fu**ing moron.

My supplied link's context was against that company's i7 vs. CELL test... LOL. The programmer of the x264 encoder software has labelled your company's i7 vs. CELL test as BS. IBM CELL's PC commercial adventure was a failure.

The PS3 SPE's IEEE-754 support is $hit, i.e. not comparable to Intel SSEx.

A programmer can use half of a processor's capability, LOL. Can you fu*k off? It's clear you don't know what you are talking about.

As for my Sony hate..

I still own a Sony Vaio VGN-FW45 laptop (with a Radeon HD 4650M GDDR3) that still has pretty good audio circuits, i.e. it's still better than my old Dell Studio XPS 1645. My Sony Vaio VGN-FW45 is still operational as my audio-editing machine.

A Sony laptop owner should be able to identify Sony's desktop wireless app.

My Sony Vaio VGN-FW45 laptop has an external Radeon HD 5770 eGPU solution.

I also own a Sony 46 inch 1080p HDTV.

With my Sony laptop, I spent more $$ on a Sony device than your combined PS3 and PS4 hardware. F**k off, you don't know me.

My Intel Core i7-3635QM would smash the PS3's CELL processor, i.e. four sets of 256-bit-wide Intel AVX SIMDs + 16 GT2 SIMD processors (Intel HD 4000) + Intel QuickSync.

The Intel HD 4000 rivals the PS3 (combined 6 SPEs + RSX) and the Xbox 360 in gaming results.

You two should marry each other and invite me to the marriage.

He is a f**king noob. My background was with 68K and early PowerPC, and I have wasted time and money on this $hit. It took Apple (a few years before 2006) and Sony (a few years before 2013) to realize it.

The PowerPC ISA has its own design issues, e.g. performance issues with function calls (i.e. code optimized for stack-based architectures such as X86), transfers between GPR and SIMD registers (on PowerPC such transfers route via the FSB, while X86 has direct transfers between GPR and SIMD registers), and code density relative to CISC ISAs (which make better use of smaller caches). An example of this problem was the Doom 3 MacOS PowerPC version.

During the AMD K8 era, Intel X86 had issues with SIMD compared to Altivec, i.e. a 64-bit SIMD hardware implementation, while AMD K8 had at least 128-bit FADD SSE1 hardware (one of the reasons AMD K8 beat the Intel Pentium 4). Things changed when Intel introduced 128-bit SSE hardware (for both FADD and FMUL) with Intel Core 2.

In general, AMD K8 matched the IBM PowerPC 970, i.e. in the same Apple Adobe Photoshop test. The PowerPC camp lost its media-processing edge when AMD assimilated ATI, i.e. AMD leapfrogged its competitors (e.g. IBM PowerPC/SPU (Nov 2006 for the PS3), Intel X86, etc.) via ATI's assimilation (starting July 2006).

Intel started to get serious about 3D when it began embedding GPUs (which contain programmable SIMD arrays) with its CPUs, i.e. starting from the Clarkdale and Arrandale processors. Intel currently has the third-fastest GPU designs in the world and has left IBM for dead.

Ultimately, IBM's roadmap didn't match Intel's roadmap, which was important for Apple.

On the ARM side, SoCs combine CPUs and GPUs (most of them DX9-class GPUs). Almost everybody is doing their own "CELL" solution. Intel's GT2 and GT3 GPUs are class-leading in the mobile/tablet space (against ARM-based GPUs). Where's PowerPC in both market segments (i.e. ARM's and X86's target markets)?

My smartphone can do H.264 real-time video encoding, LOL.

#100  Edited By ronvalencia
Member since 2008 • 29612 Posts
@tormentos said:
@Jacobistheman said:
@Wickerman777 said:

Xbox One has 12 GPU compute units, PS4 has 18: 50% more. There's no getting around that. It's a bigger gap than there was between the X360 and PS3. I expected both consoles to have 2.5 TFLOPS of graphics performance, so even the PS4 is disappointing to me. But it blows my mind how weak the GPU in the Xbox One is considering what's possible today. Had someone told me ahead of time that the new Xbox would be only 1.3 TFLOPS, I would have laughed and called them nuts. And despite being so much weaker, it costs MS more to make than the PS4 costs Sony. It's a big-time design failure if ever there was one. Because of it I'll be picking Sony for the first time since the PS1. MS has left me with little choice.

The flops...

The PS3 had roughly 2 times the TFLOPS of the Xbox 360, and the PS2 had 4.2 times that of the original Xbox. Yet the PS3 and 360 looked almost identical, and the original Xbox was clearly the more powerful console graphics-wise.

That's no true,they both lie about flops MS and Sony,it was MS who started,MS claimed that the xbox 360 was 1TF when it real life it wasn't even close to that,then sony came and say 2TF which again wasn't even close again.

The real flops of the unit is a little higher over all on PS3 because of Cell,Cell had way more flops than the Xenon CPU,but the Xenos GPU had more flops than the RSX.

One has like 230Gflops and the other like 250Gflops.

With the Xbox 360 and PS3 you can't compare CPU to CPU and GPU to GPU in a 1-to-1 relationship, since the Xenos GPU is more flexible than the RSX GPU.

MS's 1 TFLOPS (Xbox 360) and Sony/NVIDIA's 2 TFLOPS (PS3) numbers include the GPUs' fixed-function units and don't factor in integer workloads. In other words, they are meaningless.

---------------

On R7 260X vs. 7790: AMD just made factory-overclocked editions the reference standard and stuck on a new model number.

As part of AMD's CU design, the R7 260X's TMUs are another limiting factor.

The TMUs' load/store function relates to memory bandwidth; the ROPs are not the only units in the GPU that consume large amounts of memory bandwidth.

For 1080p, the differing results between the 7950 BE's 32 ROPs (850 MHz base) and the 7850's 32 ROPs (860 MHz) show that 16 ROPs would not be the limiting factor.

To work out whether a game is mostly CU-limited, use BF3 benchmarks to estimate the frame rate and match it against a real Radeon HD SKU:

Radeon HD 7870 GE = 59 fps / 20 CUs at ~1 GHz (1000 MHz) = 2.95 × 14 = 41.3 estimated fps.

The real 7790 1GB gets 36.5 fps.

The real R7 260X 2GB (1100 MHz) gets 38.5 fps.

Based on CU count, BF3 can be used to estimate a theoretical Radeon HD SKU's performance. Better to use BF4, since BF3 has been superseded.

It's interesting to note the 7950's 32 ROPs (800 MHz) vs. the 7850's 32 ROPs (860 MHz): the 7950 sustains a higher frame rate via its increased CU count. In other words, 16 ROPs is hardly the limiting factor at 1080p.
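The estimate above is plain linear scaling of frame rate with CU throughput (CUs × clock). A minimal sketch of that reasoning, using the BF3 figures quoted, where the hypothetical 14-CU part stands in for an Xbox-One-class GPU:

```python
# Estimate frame rate by scaling linearly with CU count x clock,
# assuming the workload is almost entirely CU-limited (a simplification).
def estimate_fps(ref_fps, ref_cus, ref_mhz, cus, mhz):
    return ref_fps * (cus * mhz) / (ref_cus * ref_mhz)

# Reference point: Radeon HD 7870 GE, 59 fps with 20 CUs @ 1000 MHz.
# A hypothetical 14-CU part at the same clock:
print(round(estimate_fps(59, 20, 1000, 14, 1000), 1))  # 41.3
```

Real cards land below the estimate (7790: 36.5 fps, R7 260X: 38.5 fps), which supports the post's point that bandwidth, TMUs, and ROPs matter too, not CU count alone.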