Xbox One could be more powerful than you think

This topic is locked from further discussion.

#101  Edited By Wickerman777
Member since 2013 • 2164 Posts

Because both of these consoles use x64 AMD APUs, this generation's specs are probably the easiest to understand ever. It couldn't be simpler: the primary spec difference between them is the number of CUs in the GPUs (and both use the same type of CU), and the PS4 has 6 more of them than the X1 does. All ya gotta do is look at AMD's 7000-series GPUs for PC to see how much the CU count matters. More CUs = more processing power. The second biggest difference between them is the memory type, and Sony wins on that front as well.
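To put rough numbers on that (a back-of-the-envelope sketch, assuming the commonly reported configurations of 18 CUs at 800MHz for the PS4 and 12 CUs at 853MHz for the X1; each GCN CU has 64 shader lanes doing 2 floating-point ops per cycle via fused multiply-add):

```python
# Back-of-the-envelope GCN throughput: GFLOPS = CUs * 64 lanes * 2 ops * clock
def gcn_gflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1000.0

ps4 = gcn_gflops(18, 800)   # ~1843 GFLOPS
x1 = gcn_gflops(12, 853)    # ~1310 GFLOPS
print(round(ps4), round(x1), round(ps4 / x1, 2))
```

On those assumed figures the PS4 has roughly 1.4x the X1's raw shader throughput, which is where the oft-quoted "50% more powerful" comes from.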

Some have mentioned that games on PS4 don't look 50% better than games on X1 and then ask, "So how can it be 50% more powerful?" The reason is that with modern GPUs, extra power tends to go toward higher framerate and/or resolution rather than extra art. 1080p may not look 50% better than 720p to you, and 60 fps may not look 50% better than 30 fps, but technically the gap is even bigger than 50%: 60 fps is 100% more frames than 30 fps, and 1080p is 125% more pixels than 720p. Yet you'll find plenty of people claiming not to see even a 50% difference between them, or maybe no difference at all, depending on the observer. But regardless of what a given person claims to be able to see, it doesn't change the fact that, on a technical level, a piece of hardware running a given game at 1080p is pushing 2.25 times as many pixels as another piece of hardware running the same game at 720p.
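The pixel arithmetic behind that claim is simple to check:

```python
# Pixels per frame at each resolution
p1080 = 1920 * 1080   # 2,073,600 pixels
p720 = 1280 * 720     # 921,600 pixels
print(p1080 / p720)   # 2.25: 1080p pushes 125% more pixels than 720p
```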

#102  Edited By tormentos
Member since 2003 • 33798 Posts

@ronvalencia said:

With Xbox 360 and PS3, you can't compare CPU to CPU and GPU to GPU in a 1-to-1 relationship, since the Xenos GPU is more flexible than the RSX GPU.

MS's 1 TFLOPS (Xbox 360) and Sony/NVIDIA's 2 TFLOPS (PS3) numbers include the GPUs' fixed-function units, and they don't factor in integer workloads. In other words, they are meaningless.

---------------

On R7-260X vs 7790, AMD just took the factory-overclocked editions, made them the reference standard and stuck on a new model number.

As part of AMD's CU design, the R7-260X's TMUs are another limiting factor.

The TMUs' load/store functions relate to memory bandwidth. ROPs are not the only units in the GPU that consume a large amount of memory bandwidth.

For 1080p, the differing results between the 7950 BE's 32 ROPs (850MHz base) and the 7850's 32 ROPs (860MHz) show that 16 ROPs would not be the limiting factor.

To work out whether a game is mostly CU-limited, use your BF3 benchmarks to estimate the frame rate and match it against real Radeon HD SKUs:

Radeon HD 7870 GE = 59 fps / 20 CUs at ~1GHz (1000MHz) = 2.95 fps per CU; 2.95 x 14 CUs = 41.3 estimated fps.

The real 7790 1GB gets 36.5 fps.

The real R7-260X 2GB (1100MHz) gets 38.5 fps.

Based on CU count, BF3 can be used to estimate a theoretical Radeon HD SKU's performance. Better to use BF4, though, since BF3 has been superseded.

It's interesting to note that the 7950's 32 ROPs (800MHz) vs the 7850's 32 ROPs (860MHz) continue to support a higher frame rate via the increased CU count; in other words, 16 ROPs is hardly the limiting factor at 1080p.
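For reference, the quoted CU-scaling estimate (a deliberately rough linear model that ignores clock, bandwidth, ROP and TMU differences) works out as:

```python
# Linear CU-scaling estimate from the quoted BF3 figures
fps_7870ge = 59.0             # measured: Radeon HD 7870 GE, 20 CUs at ~1000MHz
fps_per_cu = fps_7870ge / 20  # 2.95 fps per CU
est_14cu = fps_per_cu * 14    # estimate for a 14-CU part
print(round(est_14cu, 1))     # 41.3 (real 7790: 36.5 fps; real R7-260X: 38.5 fps)
```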

What?

You do know that Xenon, like Cell, is PPC-based, right? Cell smoked the Xbox 360 CPU, and Xenos smoked the RSX. GPU fixed-function units mean little, and the real FLOP count was way lower. The rest is totally irrelevant to what I was saying..

Wait, so your excuse for the crappy 7790 performance was that the 7790 didn't have the bandwidth, and now your excuse is that it doesn't have the CUs?

Wait, aren't you the same guy who claimed that a 7850 at 900MHz = the PS4? How can that be, when the 7850, even running at 900MHz, is still limited by having 2 fewer CUs?

So your latest theory contradicts your first theory?

Hahahahaaaaaaaaaaaaaaaaa

#103 silversix_
Member since 2010 • 26347 Posts

@ronvalencia said:

@silversix_ said:

@ronvalencia said:
@tormentos said:


@ronvalencia said:

Not against Intel's Quick Sync, i.e. hardware extensions specifically designed for video encoding.

-------------

From x264 software developer on IBM CELL vs Intel Core i7 920, http://forum.doom9.org/showpost.php?p=1454286&postcount=2

Question: While it was working, however, it worked great thanks to the power of the Cell, and proved to be faster than a Core i7 920.

Answer: No, it wasn't. It was way, way, way slower than x264 on a Core i7 with similar settings. Sure, if you put x264 on slow settings and it on fast settings, it was faster -- but that's hardly a surprise.

The Cell is a pretty slow CPU. It takes roughly 2.5 cores (out of 8) to do realtime 1080p H.264 decoding with a highly optimized decoder. A fast i7 can do that with about ~0.4 cores (out of 4 or 6).

You are a fu**ing moron. Did you even read what you quoted?

First of all, Cell only had 7 SPEs, not 8; the 8th one was disabled for redundancy.

Second, Cell doesn't have cores, you idiot; it has 1 PPE and 7 SPEs, which are not actual cores and don't work exactly like cores.

And third, this is the first time I've heard someone claim 2.5 cores, as if you could divide an SPE down to half of one; you either use one or you don't.

Also, nice quoting people on a forum. Your hate for Sony is so great that it makes you look stupid. Once again, a company did a test and Cell >>> an i7..... Cry all you want.. lol

LOL, the SPU ISA is roughly a subset of AltiVec. You're the fu**ing moron.

Also, the PC's IBM CELL PCI-Express add-on card is a full 8-SPU version. Again, you're the fu**ing moron.

My supplied link's context was against that company's i7 vs CELL test. LOL. The programmer of the x264 encoder software has labelled your company's i7 vs CELL test as BS. IBM CELL's commercial PC adventure was a failure.

The PS3 SPE's IEEE-754 support is $hit, i.e. not comparable to Intel SSEx.

A programmer can use half of a processor's capability, LOL. Can you fu*k off? It's clear you don't know what you are talking about.

As for my Sony hate..

I still own a Sony Vaio VGN-FW45 laptop (it has a Radeon HD 4650M with GDDR3) that still has pretty good audio circuits, i.e. still better than my old DELL Studio XPS 1645. My Sony Vaio VGN-FW45 is still operational as my audio editing machine.

A Sony laptop owner should be able to identify Sony's desktop wireless app.

My Sony Vaio VGN-FW45 laptop has an external Radeon HD 5770 eGPU solution.

I also own a Sony 46-inch 1080p HDTV.

With my Sony laptop, I spent more $$ on a Sony device than your PS3 and PS4 hardware combined. F**k off, you don't know me.

My Intel Core™ i7-3635QM would smash the PS3's CELL processor, i.e. four sets of 256-bit-wide Intel AVX SIMD units + 16 GT2 SIMD processors (Intel HD 4000) and Intel QuickSync.

The Intel HD 4000 rivals the PS3 (combined 6 SPEs + RSX) and the Xbox 360 in gaming results.

You two should marry each other and invite me to the marriage.

He is a f**king noob. My background was with 68K and early PowerPC, and I have wasted time and money on this $hit. It took Apple (a few years before 2006) and Sony (a few years before 2013) to realize it.

The PowerPC ISA has its own design issues, e.g. performance issues with function calls (software is often optimized for the stack-based conventions of architectures such as x86), transfers between GPR and SIMD registers (such transfers involve routing via the FSB, while x86 has direct transfers between GPR and SIMD registers), and code density relative to a CISC ISA (which makes better use of smaller cache storage). An example of this problem was the Doom 3 MacOS PowerPC version.

During the AMD K8 era, Intel x86 had issues with SIMD compared with AltiVec, i.e. a 64-bit SIMD hardware implementation, while AMD K8 had at least 128-bit FADD SSE1 hardware (one of the reasons why AMD K8 beat the Intel Pentium 4). Things changed when Intel introduced 128-bit SSE hardware (for both FADD and FMUL) with the Intel Core 2.

In general, AMD K8 matched the IBM PowerPC 970, i.e. using the same Apple Adobe Photoshop test. The PowerPC camp lost its media-processing edge when AMD assimilated ATI, i.e. AMD leapfrogged its competitors (e.g. IBM PowerPC/SPU (Nov 2006 for PS3), Intel x86, etc.) via ATI's assimilation (starting from July 2006).

Intel started to get serious about 3D when it began embedding GPUs (which contain programmable SIMD arrays) with its CPUs, i.e. starting from the Clarkdale and Arrandale processors. Intel currently has the 3rd-fastest GPU designs in the world and has left IBM for dead.

Ultimately, IBM's roadmap didn't match Intel's roadmap, which was important for Apple.

On the ARM side, SoCs combine CPUs and GPUs (most of them DX9-class GPUs). Almost everybody is doing their own "CELL" solutions. Intel's GT2 and GT3 GPUs are class-leading in the mobile/tablet space (against ARM-based GPUs). Where's PowerPC in both market segments (i.e. ARM's and x86's target markets)?

My smartphone can do H.264 real-time video encoding, LOL.

Don't you want to rub his back with your superior knowledge? You teach him, he teaches you, and we have a strong marriage for life.

#104 stereointegrity
Member since 2007 • 12151 Posts

@blackace said:

@FreedomFreeLife: Well, if that overlapping stack on AMD chips turns out to be true, the ball game will be over. There is supposed to be some major software upgrade going to the XB devkits this month as well. Not entirely sure what that is supposed to do. I just remember a developer saying they were waiting patiently for it. In any case, 2014 is going to be fun.

are u really using MisterXMedia....wtf is wrong with lems

#105  Edited By blackace
Member since 2002 • 23576 Posts

@stereointegrity said:

@blackace said:

@FreedomFreeLife: Well, if that overlapping stack on AMD chips turns out to be true, the ball game will be over. There is supposed to be some major software upgrade going to the XB devkits this month as well. Not entirely sure what that is supposed to do. I just remember a developer saying they were waiting patiently for it. In any case, 2014 is going to be fun.

are u really using MisterXMedia....wtf is wrong with lems

Not a lem. The info is actually coming from developers, not MisterXMedia.

What is wrong with Cows? They seem to panic every time something positive is posted about the XB1. lmao!!

Should I believe developers working on the XB1 or idiots like Tormentos who haven't a clue?

#106  Edited By silversix_
Member since 2010 • 26347 Posts

How can anyone even still dream about the X1 being more powerful than we think? I mean, just look at CoD; it's like the direct answer to your dreams... a last-gen title can't even run above 720p. What will happen when there's a real next-gen engine? Games will run at 640p on the X1. It's just sad how worthless the system is.

#107 blackace
Member since 2002 • 23576 Posts

@silversix_ said:

How can anyone even still dream about the X1 being more powerful than we think? I mean, just look at CoD; it's like the direct answer to your dreams... a last-gen title can't even run above 720p. What will happen when there's a real next-gen engine? Games will run at 640p on the X1. It's just sad how worthless the system is.

Mark Rubin already said they had CoD: Ghosts running at 1080p on the XB1. They just didn't have the time to optimize and configure it for a higher framerate, which makes sense since they got the new devkits at the beginning of September. M$ pretty much threw a monkey wrench into developers' work when they released an updated OS with the new devkit. That's really the only reason it's 720p. I'll bet the next CoD game isn't 720p upscaled.

It's just sad how worthless your biased comments are.

#108 silversix_
Member since 2010 • 26347 Posts

@blackace: if they had it running at 1080p at anything over 15fps, they would've released the game in 1080p. The X1 can't handle last-gen engines with static worlds at 1080p... BF4 is another example, but the CoD one is just funny: the system can't handle the 2nd-ugliest next-gen title at 1080p.

#110  Edited By blackace
Member since 2002 • 23576 Posts

@silversix_ said:

@blackace: if they had it running at 1080p at anything over 15fps, they would've released the game in 1080p. The X1 can't handle last-gen engines with static worlds at 1080p... BF4 is another example, but the CoD one is just funny: the system can't handle the 2nd-ugliest next-gen title at 1080p.

It can handle all that stuff and more at 1080p. lol!! You obviously didn't read my whole post, or you are just ignorant of what I said. BF4 has issues on every platform, even the PC. lol!! BF4 is still being patched again and again as I type this. It was a disaster on every platform. That's the developers' fault. There are already 1080p games on the XB1, and more will be announced as E3 gets closer.

Oohh.. did Tormentos say something? Was it just more garbage BS as usual? Yes, trolls love to do that.

#111 Wickerman777
Member since 2013 • 2164 Posts

@tormentos said:

@blackace said:

Not a lem. The info is actually coming from developers, not MisterXMedia.

What is wrong with Cows? They seem to panic every time something positive is posted about the XB1. lmao!!

Should I believe developers working on the XB1 or idiots like Tormentos who haven't a clue?

Really? So which developer is that? Where is the link?

Panic? The only morons in a panic are lemmings. It's sad you can't just let it go: the Xbox One is weaker. I wonder if you'll change your argument again to "all that matters is games" when all the stuff you're claiming fails to materialize.

@silversix_ said:

How can anyone even still dream about the X1 being more powerful than we think? I mean, just look at CoD; it's like the direct answer to your dreams... a last-gen title can't even run above 720p. What will happen when there's a real next-gen engine? Games will run at 640p on the X1. It's just sad how worthless the system is.

Oh, there's some mysterious driver that will unlock the Xbox One's dGPU and turn the CPU into a 4.0GHz i7...

Yeah, it's amazing to me that so many people still won't accept it. In addition to the specs, there are games out now that show the difference: Battlefield 4 is 1080p on PS4, 900p on X1; CoD: Ghosts is 1080p on PS4, 720p on X1. None of it is at all surprising to anyone who bothered to look at the specs. Wishing something isn't true doesn't stop it from being true.
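Taking those reported resolutions at face value, the per-frame pixel gaps are easy to quantify:

```python
# Per-frame pixel ratios for the resolution gaps cited above
def pixels(width, height):
    return width * height

print(pixels(1920, 1080) / pixels(1600, 900))  # 1.44 (1080p vs 900p)
print(pixels(1920, 1080) / pixels(1280, 720))  # 2.25 (1080p vs 720p)
```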

#112  Edited By stereointegrity
Member since 2007 • 12151 Posts

@blackace said:

@stereointegrity said:

@blackace said:

@FreedomFreeLife: Well, if that overlapping stack on AMD chips turns out to be true, the ball game will be over. There is supposed to be some major software upgrade going to the XB devkits this month as well. Not entirely sure what that is supposed to do. I just remember a developer saying they were waiting patiently for it. In any case, 2014 is going to be fun.

are u really using MisterXMedia....wtf is wrong with lems

Not a lem. The info is actually coming from developers, not MisterXMedia.

What is wrong with Cows? They seem to panic every time something positive is posted about the XB1. lmao!!

Should I believe developers working on the XB1 or idiots like Tormentos who haven't a clue?

which developers have said anything about this? cause it would be all over every gaming news website....

u got this from misterx, stop the denial

#113 silversix_
Member since 2010 • 26347 Posts

@blackace said:

@silversix_ said:

@blackace: if they had it running at 1080p at anything over 15fps, they would've released the game in 1080p. The X1 can't handle last-gen engines with static worlds at 1080p... BF4 is another example, but the CoD one is just funny: the system can't handle the 2nd-ugliest next-gen title at 1080p.

It can handle all that stuff and more at 1080p. lol!! You obviously didn't read my whole post, or you are just ignorant of what I said. BF4 has issues on every platform, even the PC. lol!! BF4 is still being patched again and again as I type this. It was a disaster on every platform. That's the developers' fault. There are already 1080p games on the XB1, and more will be announced as E3 gets closer.

Oohh.. did Tormentos say something? Was it just more garbage BS as usual? Yes, trolls love to do that.

It can handle 1080p so well that even simplistic titles with last-gen visuals, such as KI, run at 720p.

#114 Jakandsigz
Member since 2013 • 6341 Posts

This reminds me of the PS3 comparisons, where people said the PS3 was too hard to develop for, could not run games at an even 30 frames per second, and a lot of its games were below 720p, upscaled.

I am sure the Xbox One can run games higher, just like the PS3 eventually could and did. Hopefully it takes a much shorter time for the Xbox One to run games better, as it would be quite an issue if this problem remained by the halfway mark of 2014.

#115 Jankarcop
Member since 2011 • 11058 Posts

xboner zero shitonme runs slower than a toaster

#116  Edited By deactivated-603db33572396
Member since 2007 • 361 Posts

@Jankarcop said:

xboner zero shitonme runs slower than a toaster

you smell like shit that was stuffed into a toaster and set to crispy

#117  Edited By Gue1
Member since 2004 • 12171 Posts

@Jakandsigz said:

This reminds me of the PS3 comparisons, where people said the PS3 was too hard to develop for, could not run games at an even 30 frames per second, and a lot of its games were below 720p, upscaled.

I am sure the Xbox One can run games higher, just like the PS3 eventually could and did. Hopefully it takes a much shorter time for the Xbox One to run games better, as it would be quite an issue if this problem remained by the halfway mark of 2014.

The PS3 had a much more powerful CPU that could even render graphics, but it was an exotic component. The Xbone has a weaker CPU, a weaker GPU and a worse RAM setup....

Unless you're computer-illiterate or a complete idiot, this is something you should already know.

#118 deactivated-5c79c3cfce222
Member since 2009 • 4715 Posts

There is no secret sauce.
Firmware updates will make things easier, quicker and more convenient for developers, which is important, but the weaker hardware will remain a constant for all eternity.
Deal with it.

#119  Edited By remiks00
Member since 2006 • 4249 Posts

@McStrongfast said:

There is no secret sauce.

Firmware updates will make things easier, quicker and more convenient for developers, which is important, but the weaker hardware will remain a constant for all eternity.

Deal with it.

#120  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:

@ronvalencia said:

With Xbox 360 and PS3, you can't compare CPU to CPU and GPU to GPU in a 1-to-1 relationship, since the Xenos GPU is more flexible than the RSX GPU.

MS's 1 TFLOPS (Xbox 360) and Sony/NVIDIA's 2 TFLOPS (PS3) numbers include the GPUs' fixed-function units, and they don't factor in integer workloads. In other words, they are meaningless.

---------------

On R7-260X vs 7790, AMD just took the factory-overclocked editions, made them the reference standard and stuck on a new model number.

As part of AMD's CU design, the R7-260X's TMUs are another limiting factor.

The TMUs' load/store functions relate to memory bandwidth. ROPs are not the only units in the GPU that consume a large amount of memory bandwidth.

For 1080p, the differing results between the 7950 BE's 32 ROPs (850MHz base) and the 7850's 32 ROPs (860MHz) show that 16 ROPs would not be the limiting factor.

To work out whether a game is mostly CU-limited, use your BF3 benchmarks to estimate the frame rate and match it against real Radeon HD SKUs:

Radeon HD 7870 GE = 59 fps / 20 CUs at ~1GHz (1000MHz) = 2.95 fps per CU; 2.95 x 14 CUs = 41.3 estimated fps.

The real 7790 1GB gets 36.5 fps.

The real R7-260X 2GB (1100MHz) gets 38.5 fps.

Based on CU count, BF3 can be used to estimate a theoretical Radeon HD SKU's performance. Better to use BF4, though, since BF3 has been superseded.

It's interesting to note that the 7950's 32 ROPs (800MHz) vs the 7850's 32 ROPs (860MHz) continue to support a higher frame rate via the increased CU count; in other words, 16 ROPs is hardly the limiting factor at 1080p.

What?

You do know that Xenon, like Cell, is PPC-based, right? Cell smoked the Xbox 360 CPU, and Xenos smoked the RSX. GPU fixed-function units mean little, and the real FLOP count was way lower. The rest is totally irrelevant to what I was saying..

Wait, so your excuse for the crappy 7790 performance was that the 7790 didn't have the bandwidth, and now your excuse is that it doesn't have the CUs?

Wait, aren't you the same guy who claimed that a 7850 at 900MHz = the PS4? How can that be, when the 7850, even running at 900MHz, is still limited by having 2 fewer CUs?

So your latest theory contradicts your first theory?

Hahahahaaaaaaaaaaaaaaaaa

LOL, I already know Xenon and CELL are based on PowerPC and AltiVec. For certain asymmetric floating-point workloads, CELL will beat Xenon's 3 kit-bashed VMX units.

Btw, FLOPS only deal with floating-point workloads, not integer ones.

As for 7790 vs 7850, I have already shown you BF3's CU-based scaling vs frame rate.

At 1080p, the GeForce GTX 660 Ti's 24 ROPs weren't a limiting factor against the 32-ROP-equipped Radeon HD 7850/7870 GE/7870 XT/7950.

As for the 7850 at 900MHz, pure FLOPS numbers don't include TMU factors. The TMUs' load/store operations require memory access (which involves the texture cache and external memory).

PS4: 72 TMUs x 800MHz = 57600

7850 at 900MHz: 64 TMUs x 900MHz = 57600.

900MHz also speeds up the entire GPU, i.e. tessellation, cache, local data storage, ROPs, dispatchers, etc.

For the TMUs' load/store operations, the R7-260X still has inferior memory bandwidth compared to the 7850. AMD gimped the R7-260X (1.97 TFLOPS) to sit under the R9-270 (2.368 TFLOPS), i.e. 104 GB/s vs 179 GB/s.
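The TMU arithmetic in that post reduces to texel fill rate (TMU count x core clock); a quick check of the quoted figures:

```python
# Texel fill rate in Mtexels/s = TMU count * core clock in MHz
def mtexels_per_s(tmus, clock_mhz):
    return tmus * clock_mhz

ps4_gpu = mtexels_per_s(72, 800)     # 57600
hd7850_900 = mtexels_per_s(64, 900)  # 57600
print(ps4_gpu, hd7850_900, ps4_gpu == hd7850_900)
```

On these figures the two parts have identical texture throughput on paper, which is the point being argued.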

#121  Edited By ronvalencia
Member since 2008 • 29612 Posts
@Gue1 said:

@Jakandsigz said:

This reminds me of the PS3 comparisons, where people said the PS3 was too hard to develop for, could not run games at an even 30 frames per second, and a lot of its games were below 720p, upscaled.

I am sure the Xbox One can run games higher, just like the PS3 eventually could and did. Hopefully it takes a much shorter time for the Xbox One to run games better, as it would be quite an issue if this problem remained by the halfway mark of 2014.

The PS3 had a much more powerful CPU that could even render graphics, but it was an exotic component. The Xbone has a weaker CPU, a weaker GPU and a worse RAM setup....

Unless you're computer-illiterate or a complete idiot, this is something you should already know.

With the SwiftShader software renderer, I can also render raster graphics on my Intel Sandy Bridge/Ivy Bridge CPU, e.g. Crysis.

It's just not very efficient compared to low-end Intel/ATI/NVIDIA DX10 GPUs.

#122  Edited By GrenadeLauncher
Member since 2004 • 6843 Posts

Spakace ruined by misterxmedia. I told you not to believe that conman.

#123 iwasgood2u
Member since 2009 • 831 Posts

The lemmings are forgetting that comparing the PS3 and the 360 was like comparing an apple to an orange, because they had completely different CPUs and RAM. Comparing the PS4 and the Xboner is like comparing a fresh apple (PS4) to a rotten apple (Xboner). There doesn't really seem to be much hope left for it except the cloud. Sorry, but the cloud relies heavily on internet connections, so a drop in connection can interrupt progress; if you're in the middle of battling a boss, the game may end up resetting. Privacy is also a big concern with the cloud. I don't see how the cloud can make a console any faster. It only enhances the experience, such as storing save games and training the AI, and it hasn't been proven great. Look at Siri on the iPhone: that girl is not very useful. It's been over a year now and no one seems to use it much. Pretty much that's what you will get out of it.

#124 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

OMG, most of you go around saying it's easier to make games on the PS4 when the Xbox One is using DirectX 11.2.

The PS4 uses OpenGL. I think. Or at least something with a lot in common with OpenGL, due to Sony.

DirectX 11 is the better API.

MS makes DirectX.

#125 deactivated-5c79c3cfce222
Member since 2009 • 4715 Posts

@acp_45 said:

OMG, most of you go around saying it's easier to make games on the PS4 when the Xbox One is using DirectX 11.2.

The PS4 uses OpenGL. I think. Or at least something with a lot in common with OpenGL, due to Sony.

DirectX 11 is the better API.

MS makes DirectX.

I just did a quick "opengl vs directx" google and this happened.

http://www.extremetech.com/gaming/133824-valve-opengl-is-faster-than-directx-even-on-windows
http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX

The whole Valve bit (Source performing better in OpenGL, poo-pooing Windows, going Linux, etc.) is actually quite interesting. Valve can make things happen.

#126 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

haha, 2012???

DirectX 11.2 hadn't even been announced back then.

Think again, my friend.

#127 blackace
Member since 2002 • 23576 Posts

@GrenadeLauncher said:

Spakace ruined by misterxmedia. I told you not to believe that conman.

It's all just rumors, and the devs and M$ insiders are anonymous. MisterXMedia isn't providing the info; he is getting it from devs and inside contacts who work with M$ and on the XB1. Like I said before, time will tell the truth. If Microsoft comes out and says none of this is true, then that will be the end of that. Microsoft isn't talking, though.

#128 GrenadeLauncher
Member since 2004 • 6843 Posts
@blackace said:

@GrenadeLauncher said:

Spakace ruined by misterxmedia. I told you not to believe that conman.

It's all just rumors, and the devs and M$ insiders are anonymous. MisterXMedia isn't providing the info; he is getting it from devs and inside contacts who work with M$ and on the XB1. Like I said before, time will tell the truth. If Microsoft comes out and says none of this is true, then that will be the end of that. Microsoft isn't talking, though.

misterpratmedia's "insiders" are the voices in his head that tell him to burn things.

It got even sadder when he impersonated Penello and insisted that it was the real deal. Then again I guess being humiliated by Albie twice isn't enough for him.