DX12 will not solve the XBone's 1080p problems

#201 GrenadeLauncher
Member since 2004 • 6843 Posts

@StormyJoe said:

@GrenadeLauncher said:

@StormyJoe said:

Everyone "in the know", including the "insiders" at NeoGAF that you cows cling to like messiahs, have said that with time the XB1 will be able to consistently hit 1080p/60fps. Now that it is starting to happen, you cows chant "that's an anomaly" because it ruins your "PS4 is teh Godbox" belief.

It's kind of fun to watch.

And it's always with cross-gen games and games a year off. Keep clinging to the belief that one day the Shitbox will be a halfway decent console. It's kind of fun to watch.

The XBox One is a great console - so is the PS4. That's the difference between you and me - I may like the XBox One over the PS4, but I don't bash the PS4 (in fact, I plan on getting one). I can admit the competition has merit. You cannot - because you are a moo-cow fanboy.

So, I could give you hundreds of links to back up what I said - and you would not believe any of them because it goes against your fanboy ideals. Kind of sad...

Oh, one more thing: RE remake 1080p on PS4 & XBox One.

How can I be a moo-cow fanboy when I like Nintendo consoles as well? Shit, I'd even give the Ouya a look before the Xbone. At least that thing has games.

Oh Jesus, Joey, are you actually using a straight port of a 12 year old Gamecube game to try and back up your flailing point? Nurse! We need to increase Joey's medication!

#202 Shewgenja
Member since 2009 • 21456 Posts

Personally, I just think that as the generation unfolds, as it has with every gen before it, game design will push the hardware further and further. In that sense, the XBone will plateau dramatically. The PS4 is easier to develop for, it is the strongest technologically, and it is the most ubiquitous commercially.

For me, at least, it has little to do with proving or disproving any of the consoles as a "godbox" and more to do with what one can do to actually justify the XBone in a day and age when one console has literally all the advantages. On top of that, when the XBox was the king of the land, they shrank their development resources and even let go of their most venerated first party studio.

So there's an extra layer on top of the power debate, and that is the developer debate: who will be willing to draw a line at what they can do for the sake of parity on a system that is struggling commercially? It's becoming an exercise in futility extending the olive branch of impartiality to people who defend the XBone for any reason. As the days pass, that reason diminishes.

#203  Edited By SoftwareGeek
Member since 2014 • 573 Posts

@Gue1 said:

@softwaregeek said:

Alright. Let me explain something. Just because a game runs in 1080p doesn't mean the graphics are better than a game that doesn't. That's the most absurd argument I've ever heard. I'm running games on my ASUS ROG in 1080p that don't look nearly as good as some of the XB1 games that are coming out. They don't look as good as titanfall and it's not running 1080p. The whole 1080p argument is like saying a song sucks because it was recorded in the 1960's when the audio fidelity wasn't as good as it is today. That's just flat stupid and it's wrong. I'd rather have a great artist recording at 44.1k 16bit (cd quality) than a not-so-good artists recording at 192k 24bit (way beyond cd quality). The hardware is such a small part of it. What it all comes down to is how good the end product is. The end product is games. Good games are enjoyable no matter what the resolution. So get over it ps4 fanboys. If we wanna play numbers, you're system sucks compared to my ROG. You also have no cloud. Does that mean the ps4 sucks? Nope. Why? The ps4 is going to have good games to play just like the pc and just like the XB1.

@FastRobby said:

Shooting for parity doesn't mean holding back PS4... As if PS4 version could have been 1440p... You are reading things that aren't being said. The reason I put that piece of eSRAM in it, was because that was the question they asked BEFORE the DX12 question, to make sure you understood that they are talking about the Xbox One. Normal people could have seen this, but a cow with Sony goggles...

This kind of behavior has a name: Denial.

You're the only one in denial. Your beloved PS4 has no cloud and is therefore a second-rate, half-assed online console. Enjoy your laggy online play, host advantage, long matchmaking waits, and lame-ass online gameplay. XB1 owners in the meantime will have none of that garbage and will enjoy an 8-billion-dollar cloud. Sony doesn't have the technical know-how to build such a cloud, which is why you bought an upgraded PS3 hardware device branded as the PS4. The future of gaming is online. MS is on the ball with this future while Sony sells you some hyperbole about resolution and frame rate. You're eating their garbage right from their hand. The truth is the graphics are gonna be about the same but the online play won't be. This is where the X1 completely outclasses your supposedly superior GPU. The PS3 was supposed to have a huge graphics advantage last gen and we saw where that ended up. It's gonna be about the same this gen, except PS4 owners will have a third-class online experience while XB1 owners enjoy the best online gaming experience there is. hahahaha! gawd, your online experience is gonna continue to be PATHETIC!

#204 ccagracing
Member since 2006 • 845 Posts
@tormentos said:

@FastRobby said:

Shooting for parity doesn't mean holding back PS4... As if PS4 version could have been 1440p... You are reading things that aren't being said. The reason I put that piece of eSRAM in it, was because that was the question they asked BEFORE the DX12 question, to make sure you understood that they are talking about the Xbox One. Normal people could have seen this, but a cow with Sony goggles...

@GrenadeLauncher I don't see any reason why a Xbox gamer SHOULD have XBLG, and if he doesn't can't be interested in EA Access... You shouldn't make such dumb generalizations. And paying up? If you really think PS Now prices have better value than EA Access, then there is no point in arguing anymore.

If you have a Hyundai Accent and I have a Hyundai Genesis and I decide to stay side by side with you, it isn't because our cars are on par.

No matter what, the PS4 has extra resources, and they don't need to be spent on higher resolution; the PS4 version can be 1080p/60FPS like the Xbox One version but have better image quality: low settings on Xbox One, mid to high on PS4.

And I want you to remember that developers call their games 60FPS when they are a far cry from it. Sniper Elite is called 1080p/60FPS when it really isn't, and BF4 is called 60FPS but isn't either on Xbox One; it's 10 FPS slower on average vs the PS4.

So while they claim that 1080p/60FPS should be the same as on PS4, that doesn't mean it will be, and we know how several games on Xbox One have failed on that front. The ones that do hold 60FPS, like Wolfenstein, have dynamic resolution on Xbox One and drop to hell when the action gets intense, or, like MGS5, are 60FPS but 720p while the PS4 version is 1080p. How you ignore so many examples of this is beyond logic, but yeah, I'm the fanboy..lol

While there's no doubt the PS4 has a more powerful GPU, is it fair to say that both the PS4 and XB1 will be unable to handle BF Hardline or BF5 at 1080p/60fps, given that neither could do it with BF4?

#205 ccagracing
Member since 2006 • 845 Posts

@softwaregeek said:

@Gue1 said:

@softwaregeek said:

Alright. Let me explain something. Just because a game runs in 1080p doesn't mean the graphics are better than a game that doesn't. That's the most absurd argument I've ever heard. I'm running games on my ASUS ROG in 1080p that don't look nearly as good as some of the XB1 games that are coming out. They don't look as good as titanfall and it's not running 1080p. The whole 1080p argument is like saying a song sucks because it was recorded in the 1960's when the audio fidelity wasn't as good as it is today. That's just flat stupid and it's wrong. I'd rather have a great artist recording at 44.1k 16bit (cd quality) than a not-so-good artists recording at 192k 24bit (way beyond cd quality). The hardware is such a small part of it. What it all comes down to is how good the end product is. The end product is games. Good games are enjoyable no matter what the resolution. So get over it ps4 fanboys. If we wanna play numbers, you're system sucks compared to my ROG. You also have no cloud. Does that mean the ps4 sucks? Nope. Why? The ps4 is going to have good games to play just like the pc and just like the XB1.

@FastRobby said:

Shooting for parity doesn't mean holding back PS4... As if PS4 version could have been 1440p... You are reading things that aren't being said. The reason I put that piece of eSRAM in it, was because that was the question they asked BEFORE the DX12 question, to make sure you understood that they are talking about the Xbox One. Normal people could have seen this, but a cow with Sony goggles...

This kind of behavior has a name: Denial.

you're the only one in denial. Your beloved PS4 has no cloud and is therefore a second-rate, half-assed online console. Enjoy your laggy online play, host advantage, longtime match-making waits, and lame ass online game play. XB1 owners in the mean time will have none of that garbage and will enjoy an 8 billion dollar cloud. Sony doesn't have the technical know how to build such a cloud which is why you bought an upgraded ps3 hardware device branded as the ps4. The future of gaming is online. MS is on the ball with this future while sony sells you some hyperbole about resolution and frame rate. You're eating their garbage right from their hand. The truth is the graphics are gonna be about the same but the online play won't be. This is where the X1 completely outclasses your supposedly superior gpu. The ps3 was supposed to have a huge graphics advantage last gen and we saw where that ended up. Gonna be about the same this gen. Except ps3 owners will have a 3rd class online experience while xb1 owners enjoy the best online gaming experience there is. hahahaha! gawd, your online experience is gonna continue to be PATHETIC!

I'm an XB1 owner and that's a load of rubbish you have just typed; it makes you no better than a blind fanboy. Does the Xbox $8 billion cloud make FIFA or BF4 better on XB1 than on PS4? Where is MS's back catalogue of titles available to rent online? Ignore the cost of PS Now; they have a choice to pay, we do not.

#206  Edited By SoftwareGeek
Member since 2014 • 573 Posts

@ccagracing said:

@softwaregeek said:

@Gue1 said:

@softwaregeek said:

Alright. Let me explain something. Just because a game runs in 1080p doesn't mean the graphics are better than a game that doesn't. That's the most absurd argument I've ever heard. I'm running games on my ASUS ROG in 1080p that don't look nearly as good as some of the XB1 games that are coming out. They don't look as good as titanfall and it's not running 1080p. The whole 1080p argument is like saying a song sucks because it was recorded in the 1960's when the audio fidelity wasn't as good as it is today. That's just flat stupid and it's wrong. I'd rather have a great artist recording at 44.1k 16bit (cd quality) than a not-so-good artists recording at 192k 24bit (way beyond cd quality). The hardware is such a small part of it. What it all comes down to is how good the end product is. The end product is games. Good games are enjoyable no matter what the resolution. So get over it ps4 fanboys. If we wanna play numbers, you're system sucks compared to my ROG. You also have no cloud. Does that mean the ps4 sucks? Nope. Why? The ps4 is going to have good games to play just like the pc and just like the XB1.

@FastRobby said:

Shooting for parity doesn't mean holding back PS4... As if PS4 version could have been 1440p... You are reading things that aren't being said. The reason I put that piece of eSRAM in it, was because that was the question they asked BEFORE the DX12 question, to make sure you understood that they are talking about the Xbox One. Normal people could have seen this, but a cow with Sony goggles...

This kind of behavior has a name: Denial.

you're the only one in denial. Your beloved PS4 has no cloud and is therefore a second-rate, half-assed online console. Enjoy your laggy online play, host advantage, longtime match-making waits, and lame ass online game play. XB1 owners in the mean time will have none of that garbage and will enjoy an 8 billion dollar cloud. Sony doesn't have the technical know how to build such a cloud which is why you bought an upgraded ps3 hardware device branded as the ps4. The future of gaming is online. MS is on the ball with this future while sony sells you some hyperbole about resolution and frame rate. You're eating their garbage right from their hand. The truth is the graphics are gonna be about the same but the online play won't be. This is where the X1 completely outclasses your supposedly superior gpu. The ps3 was supposed to have a huge graphics advantage last gen and we saw where that ended up. Gonna be about the same this gen. Except ps3 owners will have a 3rd class online experience while xb1 owners enjoy the best online gaming experience there is. hahahaha! gawd, your online experience is gonna continue to be PATHETIC!

Im an XB1 owner and that's a load of rubbish you have just typed and makes you no better than a blind fanboy. Does the Xbox $8 billion dollar cloud make Fifa or BF4 better on XB1 than PS4? Where is MS back catalogue of titles available to rent online? Ignore cost of PS Now, they have a choice to pay, we do not.

You get what you pay for. The truth is, the cloud won't make every game better, but those tailor-made for it will be an experience that PS4 owners will not get. Those are coming, slowly but surely. Like any new tech, it takes a while to master. And the Live online experience does make most of the games better. I own a PS3/360/ROG/XB1. I will eventually buy a PS4. But I know what I'm going to get with that. And from my experience with the PS3, XBL smokes their online service.

#207  Edited By SoftwareGeek
Member since 2014 • 573 Posts

dat dodge hellcat though. gotta get one of those soon.

#208  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:


@evildead6789 said:

dude, that's why flops don't mean anything when you're talking about game consoles. Those numbers are right , wether you like it or not. The rsx maybe based of a 7800 gtx, it isn't a 7800 gtx.

If the 256 bit bus doesn't mean anything on the x1 gpu, why does the gts 250, 9800 gtx and 8800 gtx have a 256 bit bus. They're only half the power of a hd 7790 and have ddr3 as well. If i would follow your reasoning, nvidia would have given these cards a 256 bit bus for no reason. A gtx 660ti has a 192 bit bus, if the bit bus was just like a highway, this card would be severly bottlenecked, and this in a very competive market.

Numerous weaker cards have 256 bit busses , run at slower memory and clock speeds than the 660 ti, also in a competive market.

A bit bus isn't like a highway, Go to computer school and then start arguing with me, now you just making a fool of yourself. And again, the 256 bit is also needed to communicate with the esram.

You don't understand how bitrates work and all you do is try to get the pie of your face lol., all because you want your ps4 to be that much stronger than a xboxone.

Sadly, it isn't , it has an edge, but nothing like you make it out to be. Stop crying and telling lies

No, the numbers weren't right; they were based on non-programmable crap, which was useless because it could not be used.

The 660 Ti is a great example of how you are wrong: the 660 Ti has a 192-bit bus and 144 GB/s bandwidth, yet it completely smokes the 7850, which has a 256-bit bus and 153 GB/s. The bus will not increase GPU performance unless your GPU is heavily handicapped by the bus, which in this case it isn't; the full 7790, which has 1.79 TF, has a 128-bit bus. The Xbox One has less power, so yeah, 128-bit would have been fine; a 256-bit bus will not change anything.

The bus is where communication happens, idiot, and it is like a highway: if you have a 2-lane track and 500 cars to pass over that track, you will probably be bottlenecked, compared to having those same 500 cars on a 4-lane track.

In the case of the Xbox One, it has 3 cars and a 4-lane track.

Go cry elsewhere, lemming, you are making a fool of yourself inventing crap, but but but the PS3 is 2 TF..lol


Where do you get your info on non-programmable crap? This has nothing to do with it; floating point performance is floating point performance. You can stand on your head, lol, it isn't going to change the numbers.

You still don't get how bits and bytes work. The bus is not just a road to pass through, lol. A GTX 275 has a 448-bit bus and is nowhere near the HD 7790 or the GTX 660 Ti in performance. A GTX 280 has a 512-bit bus. They put a 512-bit "highway" on a GTX 280 that has GDDR3 RAM and is about 1/3 of the performance of a GTX 660 Ti. Don't you think the 660 Ti would be bottlenecked, lol?

You're just hilarious, how you try to sell your wannabe knowledge. You're arguing with people who know what they're talking about, and you just keep selling your BS like it's hotcakes. Sadly no one is buying it; you just keep on making a fool out of yourself.

Here, let me show you how you calculate memory bandwidth.

I will use Nvidia's GTX 580 as an example, with a memory clock of 1002 MHz and a memory bus width of 384 bits.

A memory clock of 1002 MHz means a data rate of 4008 MT/s (its GDDR5 video memory is quad-pumped). Each of these transfers is 384 bits wide. This means the bandwidth of the memory is 384 x 4008 = 1,539,072 million bits per second. See, the memory bus width, in this case 384 bits, multiplies with the memory speed, so it isn't just a highway, lol; it has a direct influence on your memory performance. The wider the bus, the bigger the number the base memory speed is multiplied by, and the higher the performance.

To get from bits to bytes we divide by 8, leading to 1,539,072 / 8 = 192,384 million bytes per second, or 192.4 GB/s. This final outcome is the real memory bandwidth figure. So yes, the bus width has a direct influence on the performance of the card,

SINCE IT HAS HIGHER DATA THROUGHPUT.

So yes, an HD 7790 with a 256-bit bus would be faster than one with a 128-bit bus, especially when you see that much weaker GPUs have higher memory bandwidth and that performance is lost when you lower the memory bandwidth.

Now computer class is over, I'll send you the bill.
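
(For anyone who wants to sanity-check that arithmetic, here is a minimal Python sketch of the same peak-bandwidth calculation. The GTX 580 figures are the ones quoted above; the helper function name and the 128-bit vs 256-bit comparison runs are just illustrative, not from the thread.)

```python
# Minimal sketch of the peak-bandwidth arithmetic above; the helper function
# and the 128/256-bit comparison runs are illustrative, not from the thread.

def memory_bandwidth_gbs(memory_clock_mhz, pump_factor, bus_width_bits):
    """Peak theoretical memory bandwidth in GB/s.

    effective rate (MT/s) = memory clock * pump factor (GDDR5 is quad-pumped)
    megabits per second   = effective rate * bus width in bits
    GB/s                  = megabits per second / 8 / 1000
    """
    effective_mts = memory_clock_mhz * pump_factor    # e.g. 1002 * 4 = 4008
    mbits_per_s = effective_mts * bus_width_bits      # e.g. 4008 * 384 = 1,539,072
    return mbits_per_s / 8 / 1000                     # e.g. ~192.4 GB/s

# GTX 580: 1002 MHz GDDR5 on a 384-bit bus, as quoted above
print(round(memory_bandwidth_gbs(1002, 4, 384), 1))   # 192.4
# Same memory clock on hypothetical 128-bit and 256-bit buses,
# showing that peak bandwidth scales linearly with bus width
print(round(memory_bandwidth_gbs(1002, 4, 128), 1))   # 64.1
print(round(memory_bandwidth_gbs(1002, 4, 256), 1))   # 128.3
```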

#210  Edited By commander
Member since 2010 • 16217 Posts

@FastRobby said:

@tormentos said:

@FastRobby said:

Shooting for parity doesn't mean holding back PS4... As if PS4 version could have been 1440p... You are reading things that aren't being said. The reason I put that piece of eSRAM in it, was because that was the question they asked BEFORE the DX12 question, to make sure you understood that they are talking about the Xbox One. Normal people could have seen this, but a cow with Sony goggles...

@GrenadeLauncher I don't see any reason why a Xbox gamer SHOULD have XBLG, and if he doesn't can't be interested in EA Access... You shouldn't make such dumb generalizations. And paying up? If you really think PS Now prices have better value than EA Access, then there is no point in arguing anymore.

If you have a Hyundai Accent and I have a Hyundai Genesis and I decide to stay side by side with you, it isn't because our cars are on par.

No matter what, the PS4 has extra resources, and they don't need to be spent on higher resolution; the PS4 version can be 1080p/60FPS like the Xbox One version but have better image quality: low settings on Xbox One, mid to high on PS4.

And I want you to remember that developers call their games 60FPS when they are a far cry from it. Sniper Elite is called 1080p/60FPS when it really isn't, and BF4 is called 60FPS but isn't either on Xbox One; it's 10 FPS slower on average vs the PS4.

So while they claim that 1080p/60FPS should be the same as on PS4, that doesn't mean it will be, and we know how several games on Xbox One have failed on that front. The ones that do hold 60FPS, like Wolfenstein, have dynamic resolution on Xbox One and drop to hell when the action gets intense, or, like MGS5, are 60FPS but 720p while the PS4 version is 1080p. How you ignore so many examples of this is beyond logic, but yeah, I'm the fanboy..lol

So first you say they are holding the PS4 back for parity, and then you prove that there isn't any parity. Classic self-ownage from Tormentos.

That guy doesn't have a clue what he's talking about

#211  Edited By StormyJoe
Member since 2011 • 7806 Posts

@GrenadeLauncher said:

@StormyJoe said:

@GrenadeLauncher said:

@StormyJoe said:

Everyone "in the know", including the "insiders" at NeoGAF that you cows cling to like messiahs, have said that with time the XB1 will be able to consistently hit 1080p/60fps. Now that it is starting to happen, you cows chant "that's an anomaly" because it ruins your "PS4 is teh Godbox" belief.

It's kind of fun to watch.

And it's always with cross-gen games and games a year off. Keep clinging to the belief that one day the Shitbox will be a halfway decent console. It's kind of fun to watch.

The XBox One is a great console - so is the PS4. That's the difference between you and me - I may like the XBox One over the PS4, but I don't bash the PS4 (in fact, I plan on getting one). I can admit the competition has merit. You cannot - because you are a moo-cow fanboy.

So, I could give you hundreds of links to back up what I said - and you would not believe any of them because it goes against your fanboy ideals. Kind of sad...

Oh, one more thing: RE remake 1080p on PS4 & XBox One.

How can I be a moo-cow fanboy when I like Nintendo consoles as well? Shit, I'd even give the Ouya a look before the Xbone. At least that thing has games.

Oh Jesus, Joey, are you actually using a straight port of a 12 year old Gamecube game to try and back up your flailing point? Nurse! We need to increase Joey's medication!

It's not a port, it's a remake.

You are like the frog that slowly boils alive because the temperature is raised gradually. I have given three examples of upcoming games that have parity in resolution and frame rate; yet you still doubt.

#212  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@softwaregeek said:

Alright. Let me explain something. Just because a game runs in 1080p doesn't mean the graphics are better than a game that doesn't. That's the most absurd argument I've ever heard. I'm running games on my ASUS ROG in 1080p that don't look nearly as good as some of the XB1 games that are coming out. They don't look as good as titanfall and it's not running 1080p. The whole 1080p argument is like saying a song sucks because it was recorded in the 1960's when the audio fidelity wasn't as good as it is today. That's just flat stupid and it's wrong. I'd rather have a great artist recording at 44.1k 16bit (cd quality) than a not-so-good artists recording at 192k 24bit (way beyond cd quality). The hardware is such a small part of it. What it all comes down to is how good the end product is. The end product is games. Good games are enjoyable no matter what the resolution. So get over it ps4 fanboys. If we wanna play numbers, you're system sucks compared to my ROG. You also have no cloud. Does that mean the ps4 sucks? Nope. Why? The ps4 is going to have good games to play just like the pc and just like the XB1.

Resolution is just resolution: sharpness and clarity on your screen. 1080p is the standard nowadays, and for people gaming on big HDTVs, like 45 inches and up, a game that isn't native 1080p is not gonna look that sharp on a big HDTV. The smaller the TV, the better; you get more PPI if you go smaller. I can't imagine people playing CoD Ghosts on their Xbox One on a 50+ inch TV; it must be blurry like a 360 game.

Of course I know what you're saying; it's all about quality per pixel projected on screen.
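
(A quick back-of-the-envelope sketch of that PPI point, in Python. The helper function is mine, and the screen sizes and resolutions are just illustrative examples, not figures from the thread.)

```python
import math

# Illustrative helper: pixels per inch for a given resolution and panel diagonal.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

# The same 1080p image gets less dense as the panel grows...
for size in (24, 45, 55):
    print(f'1080p on a {size}" screen: {ppi(1920, 1080, size):.0f} PPI')   # 92, 49, 40
# ...and a 720p source stretched over a 55" panel has even fewer source pixels per inch.
print(f'720p source on a 55" screen: {ppi(1280, 720, 55):.0f} PPI')        # 27
```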

#213 Bishop1310
Member since 2007 • 1274 Posts

@Shewgenja said:

Personally, I just think as the generation unfolds, as has with every gen before it that game design will push the hardware further and further. In that sense, the XBone will plateau dramatically. The PS4 is easier to develop for, it is the strongest technologically and it is the most ubiquitous commercially.

For me, at least, it has little to do with proving or disproving any of the consoles as a "godbox" and more to do with what one can do to actually justify the XBone in a day and age when one console has literally all the advantages. On top of that, when the XBox was the king of the land, they shrank their development resources and even let go of their most venerated first party studio.

So, there's an extra layer on top of the power debate and that is the developer debate and who will be willing to draw a line at what they can do for the sake of parity on a system that is struggling commercially. It's becoming an exercise in futility extending the branch of impartiality on people who defend the XBone for any reason. As the days pass, that reason diminishes.

yeahhhhhhhhhhhhhhhhh no.

The fact that the PS4 is the easier system to develop for would mean that it would "plateau" sooner, as developers struggle to find optimizations on the X1.

One thing none of you fanboys understand is that these systems are already being "pushed". That is, the developers are using 100% of the system resources available to them through the engine they are using.

Everything is relevant when it comes to these consoles; it's not just about which one can push more triangles or pixels, or which can compile and render C++ in OpenGL or DirectX. A lot of it has to do with how the developer takes advantage of the machine. Right off the bat the X1 is at a disadvantage in terms of multiplat games, as developers usually don't have time to optimize games for both systems to their fullest, and the PS4 is easier at this point in time.

You know what's crazy? I owned both the PS4 and X1 at the same time. If you asked me which console was more powerful, what would I say? PS4, no doubt about it. Now try and guess which system I sold simply because there was NOTHING worth playing on it.

You res fags need to get over it; games will launch on both systems regardless of the sales gap.

#214  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@evildead6789 said:

@tormentos said:

@evildead6789 said:

Yeah because they didn't have the right esram tools at that time lol

witcher devs are just being lazy

both consoles are weak sauce

my gpu is a gtx 760

They have all the same tool all developers have,Sniper Elite is 1080p The witcher 3 comes next year and isn't,it not the tools is the weak xbox one.

I doubt you have one,and that 760GTX can't be consider a 5 year old PC.

I get it you love your xbox but you hide on PC when it loss so lemming like..lol

Just because i have a newer card doesn't mean older cards matched that performance. The hd 5970 runs circles around the ps4's gpu, and that's a 5 year old card, that you can pick up for a lousy 150 bucks.

yeah sure , I love my xbox, for what, old games lol. I don't even have a fake next gen console. I still have an x360 but also a ps3 but i have those for years.

The point is , you love your ps4 so much and keep on htting on the x1 but they're both a rip off with dinosaur hardware when you look at performance. The only thing that's recent about these consoles is the power consumption but apart from that, a 5 year old pc runs circles around it.

next gen console lmao, whining about 100p difference at 30fps because one dev is too lcheap to use the proper tools. What a victory lol,

enjoy your lagfests lol

How does an HD 5970 run circles around an HD 7850? The Terascale engine in the old GPUs gets handily beaten by the new GPU architecture (GCN) in the 7xxx series; GCN cards are so much better than the past architectures, tessellation performance is tons better on an HD 7xxx-class GPU, and GCN is much more efficient than a 5970.

Show me a 5-year-old computer that can run Tomb Raider with TressFX on at 1080p and run it like a PS4. A GTX 580 is better than an HD 5970, and a 580 is only about 20% faster than the 7850; this guy got his facts completely wrong. Ignorance is bliss, I guess.

Compute performance, tessellation performance, and more are much, much better on the GCN architecture than on the Terascale architectures in AMD GPUs before GCN came out, not to mention the HSA design that boosts overall system performance, alongside other GPU architecture tweaks done by Sony themselves. The PS4 is no powerhouse of a machine, but it will quite easily beat a 5-year-old rig in performance.

And lagfest? Consoles usually have consistent performance and FPS.

#215  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@Wickerman777 said:

@evildead6789 said:

@Wickerman777 said:

@evildead6789 said:

both consoles are weak sauce

my gpu is a gtx 760

While I agree they aren't what they should have been (Especially X1) I never expected their specs to compete with high-end PC gpus. That's impossible given the cost and power requirements of graphics cards like that. But what they both could, and should, have done is aimed at a $500 price point with 24 functioning compute units in the GPUs and 12 Jaguar CPU cores instead of 8. Those would have been true 1080p machines and I totally believe people would have been willing to pay the extra $100 for them. But there's no time machine to correct things so we've got what we've got. :(

gtx 760 is not a high end gpu lol. I had a 7870 xt before that and then i bought a 750 ti because i wanted to make a httpc. The 7870XT even runs circles around the ps4's gpu. The 750ti is about the same as what's in a ps4, but an old phenom II quad beats the crap out of the ps4's cpu lol.

I bought the gtx 760 because i can sli and for upcoming vr games. The gtx 780 is a high end gpu, or maybe the gtx 770.

The x1 is really not that much weaker as the ps4, everybody's whining about the secret sauce, but the fact is that it isn't secret sauce but it still give an boost to the x1's gpu which is basically a 7790, that's just one tier below the 7850 and that's what's in a ps4.

The 1080p problems is because of the reservation for kinect , which is now gone and the proper tools for that esram. Maybe the ps4 may still have a bit of an edge, but it won't be a difference vs 1080p and 900p, the witcher's comment is just because they're frustated , they have to put the extra work in the X1 version, while the ps4 will be a lot easier to dev for, since it's a lot more like a pc. They also have a pc version so they just going to use one version and they won't use the esram.

Because that's what's basically is the difference between the 7790 and 7850, 1080p vs 900p, the esram can close the gap, but if they won't use it, nobody can do something about that.

I wasn't talking about your GPU, didn't even pay attention to it. I was talking about PC GPUs in general, specifically the high-end ones. And the X1 is not equivalent to a 7790. Architecturally it's similar, but the PC version is clocked in a way that gives it several hundred extra gigaflops of performance; the X1 is only 1.3 TF. And I disagree with your 1080p vs 900p figure. More like 900p vs 720p. The PS4 has been running quite a few 1080p games on old engines. Once next-gen gets into full gear and real games start coming out for it regularly, that figure is going to be lower much of the time. A 1080p console it's not.

You have no point here. You say the PS4 is running old game engines at 1080p, which is actually great, because these cross-platform and cross-gen titles still being made are holding back the XB1 and PS4 until they drop the 360/PS3 from production; most of these cross-gen and cross-platform games are still using older rendering techniques, not fully exploiting the GCN architecture.

Hence, look at Assassin's Creed Unity, which is only for PS4/XB1/PC: it is a next-gen game, it looks next-gen, and they are exploiting the modern feature sets of modern GPUs like GCN and Kepler, with no lowest common denominator to account for.

Console optimization is also real, even on the x86 platform that is the PS4 and Xbox One. This is how Sony is optimizing the PS4:

http://gamingbolt.com/how-sony-are-pushing-for-60fps-on-ps4-razor-gpucpu-profiler-and-linker-optimizations-detailed#mWl07S9UDoBfmrex.99

Games on PS4/XB1 will only continue to look noticeably better as developers start to exploit the modern GPU feature sets and architecture we have today in GCN: things like PRT, efficient tessellation, the compute performance that comes with GCN, and much more. Actually, the PS4 is a 1080p console, and it'll stay 1080p; it has more than enough bandwidth, ROPs, and everything else for steady 1080p. The PS4 GPU has a big advantage over the Xbox One in compute, which the Xbox One can never match without sacrificing graphical fidelity by dedicating CUs to try to keep up with the PS4's compute prowess. The PS4 can dedicate 4 CUs to physics and compute alone and still have 14 CUs left for graphical rendering. The Xbox One has no choice but to also dedicate CUs to compute to try to match the PS4, so if the Xbox One also used 4 CUs for compute in a game, it would only have 8 CUs left for graphical rendering, not much at all, as opposed to the 14 CUs still left for rendering on the PS4, 2 more than the total count of the Xbox One GPU (a rough sketch of this CU math follows at the end of this post).

1080p is here to stay, and as more days pass, more and more PS4 games will steadily be 1080p, until 5-6 years down the road, when brand-new GPU feature sets and software come out that leave this era of GPUs behind, like leaving DX11/12/OpenGL 4.4 behind when down the road there will be DX13 and so forth, plus brand-new GPU architectures. THEN developers will lower resolution to try to get more quality per pixel, but by then this console generation will already be over and the next-gen consoles will almost be out (PS5 / Xbox One Two).

Old game engines and older rendering techniques are holding back the XB1/PS4 from full utilization and efficiency of both systems.
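
(A rough Python sketch of the CU-budget arithmetic above. The CU counts and clocks are the commonly cited figures for the two consoles, PS4: 18 CUs at 800 MHz and XB1: 12 CUs at 853 MHz, and the helper names are mine; real GPUs schedule graphics and compute work far more dynamically than this, so treat it as back-of-the-envelope only.)

```python
# Back-of-the-envelope only: CU counts/clocks are the commonly cited figures,
# and the helper functions are illustrative, not anyone's official tooling.

def peak_tflops(cus, clock_mhz, shaders_per_cu=64, ops_per_clock=2):
    """Peak single-precision TFLOPS = CUs * shaders/CU * 2 FLOPs (FMA) * clock."""
    return cus * shaders_per_cu * ops_per_clock * clock_mhz * 1e6 / 1e12

def cus_left_for_rendering(total_cus, reserved_for_compute):
    """CUs left for graphics after reserving some for GPGPU work."""
    return total_cus - reserved_for_compute

print(f"PS4 peak: {peak_tflops(18, 800):.2f} TF, XB1 peak: {peak_tflops(12, 853):.2f} TF")
# Reserve 4 CUs on each console for compute, as the post hypothesizes:
print("CUs left for rendering -> PS4:", cus_left_for_rendering(18, 4),
      "| XB1:", cus_left_for_rendering(12, 4))   # PS4: 14 | XB1: 8
```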

#216 SoftwareGeek
Member since 2014 • 573 Posts

@xboxiphoneps3 said:

@softwaregeek said:

Alright. Let me explain something. Just because a game runs in 1080p doesn't mean the graphics are better than a game that doesn't. That's the most absurd argument I've ever heard. I'm running games on my ASUS ROG in 1080p that don't look nearly as good as some of the XB1 games that are coming out. They don't look as good as titanfall and it's not running 1080p. The whole 1080p argument is like saying a song sucks because it was recorded in the 1960's when the audio fidelity wasn't as good as it is today. That's just flat stupid and it's wrong. I'd rather have a great artist recording at 44.1k 16bit (cd quality) than a not-so-good artists recording at 192k 24bit (way beyond cd quality). The hardware is such a small part of it. What it all comes down to is how good the end product is. The end product is games. Good games are enjoyable no matter what the resolution. So get over it ps4 fanboys. If we wanna play numbers, you're system sucks compared to my ROG. You also have no cloud. Does that mean the ps4 sucks? Nope. Why? The ps4 is going to have good games to play just like the pc and just like the XB1.

resolution is just resolution, sharpness and clarity on your screen, 1080p is the standard now a days and people who are gaming on big HDTV's like 45 inches and up and are running a game that isnt native 1080p its not gonna look that sharp on a big HDTV, the smaller the TV, the better, more PPI if you go smaller, i cant imagine people playing Cod Ghosts on there xbox one on a 50 + inch tv, it must be blurry like a 360 game.

of course i know what your saying, its all about quality per pixel projected on screen

Nope. It looks great on my 55-inch 1080p 120Hz TV. Better than Dragon Age: Origins (5 years old now?) running at 1080p on my ROG HDMI'd into my TV. Blurry is running the PS4 at 60fps, because the sharpness takes a hit to get dat framerate.

#217 Shewgenja
Member since 2009 • 21456 Posts

@Bishop1310 said:

yeahhhhhhhhhhhhhhhhh no.

the fact that the PS4 is the easier system to develop for would mean that it would "plateau" sooner, as developers struggle to find optimization on the X1.

One thing none of you fan boys understand is that these systems are already being "pushed".. That is the developers are using 100% of the systems resources they have available to them, through the engine they are using.

Everything is relevant when it comes to these consoles, it's not just about which one can push more triangles or pixles, or which can compile and render c++ in openGL or DirectX... A lot of it has to do with how the developer takes advantage of the machine. Right off the bat the X1 is at a disadvantage in terms of multiplat games, as developers usually don't have time to optimize games for both systems to their fullest and PS4 is easier at this point in time.

You know whats crazy? I owned both the PS4 and X1 at the same time, if you asked me which console was more powerful what would I say? PS4, no doubt about it. Now try and guess which system I sold simply because there was NOTHING worth playing on it.

You res fags need to get over it, games will launch on both systems regardless of the sales gap.

1st of all, hate speech is one of the few things not tolerated on this forum. Using such terms is childish and really does nothing to raise the dialogue on any subject.

2nd of all, you assume a lot about me. You assume even more about a system using DDR3 as graphics memory. Having one system as a known sum while assuming there is a hidden ESRAM boogeyman in the other is the height of naivety. GPU compute is going to be leaned on more and more by game developers throughout this generation as a means of optimization, and that is the Achilles heel of the XBone architecture.

3rd, you're a lunatic if you want anyone to believe you sold a PS4 for not having games but somehow kept the XBone. It simply has fewer games already, and that gap is only going to widen as this gen progresses. The industry gave the XBone the benefit of the doubt years before the systems actually launched. That's the only reason you can come on here and talk that good shit while waving the fanboy label around.

You're willing to give the XBone so much benefit of the doubt that you are straight up in Bizarro World now. If any other platform maker had just stopped making first-party titles two years before the next console came out, their fanboys could be parading the same whack-as-**** bullshit you are spewing. As for the XBone having a higher ceiling than the PS4 because of its architecture, you're actually worse than PlayStation 3 fanboys ever were with that nonsense. At least they could point to titles like Uncharted and show us where Cell/PS3 was living up to it. You, on the other hand, are championing a losing cause when N-O-T-H-I-N-G being shown on the XBone is even in the same category or galaxy as Uncharted 4 or DriveClub.

#218 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@softwaregeek said:

@xboxiphoneps3 said:

@softwaregeek said:

Alright. Let me explain something. Just because a game runs in 1080p doesn't mean the graphics are better than a game that doesn't. That's the most absurd argument I've ever heard. I'm running games on my ASUS ROG in 1080p that don't look nearly as good as some of the XB1 games that are coming out. They don't look as good as titanfall and it's not running 1080p. The whole 1080p argument is like saying a song sucks because it was recorded in the 1960's when the audio fidelity wasn't as good as it is today. That's just flat stupid and it's wrong. I'd rather have a great artist recording at 44.1k 16bit (cd quality) than a not-so-good artists recording at 192k 24bit (way beyond cd quality). The hardware is such a small part of it. What it all comes down to is how good the end product is. The end product is games. Good games are enjoyable no matter what the resolution. So get over it ps4 fanboys. If we wanna play numbers, you're system sucks compared to my ROG. You also have no cloud. Does that mean the ps4 sucks? Nope. Why? The ps4 is going to have good games to play just like the pc and just like the XB1.

resolution is just resolution, sharpness and clarity on your screen, 1080p is the standard now a days and people who are gaming on big HDTV's like 45 inches and up and are running a game that isnt native 1080p its not gonna look that sharp on a big HDTV, the smaller the TV, the better, more PPI if you go smaller, i cant imagine people playing Cod Ghosts on there xbox one on a 50 + inch tv, it must be blurry like a 360 game.

of course i know what your saying, its all about quality per pixel projected on screen

nope. looks great on my 55-inch 1080p 120hz tv. better than dragon age origins (5 years old now?) running at 1080p on my ROG hdmi'd into my TV. Blurry is running ps4 at 60fps because the sharpness takes a hit to get dat framerate.

What are you saying? Blurry? Dragon Age: Origins is an old game; no doubt Ghosts has more quality per pixel. But ew, 720p resolution on a 55-inch HDTV? That reminds me of how blurry games looked when I ran 360 games on my big HDTV; at 720p on a 55-inch TV the PPI is quite low. If Ghosts ran at 1080p on your One, the native resolution of your beautiful HDTV, it'd look a lot better, no doubt about it.

Of course Ghosts looks better than a 5-year-old game; we're talking about resolution here and the sharpness and clarity of what is projected on your television.

#219 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@Shewgenja said:

@Bishop1310 said:

yeahhhhhhhhhhhhhhhhh no.

the fact that the PS4 is the easier system to develop for would mean that it would "plateau" sooner, as developers struggle to find optimization on the X1.

One thing none of you fan boys understand is that these systems are already being "pushed".. That is the developers are using 100% of the systems resources they have available to them, through the engine they are using.

Everything is relevant when it comes to these consoles, it's not just about which one can push more triangles or pixles, or which can compile and render c++ in openGL or DirectX... A lot of it has to do with how the developer takes advantage of the machine. Right off the bat the X1 is at a disadvantage in terms of multiplat games, as developers usually don't have time to optimize games for both systems to their fullest and PS4 is easier at this point in time.

You know whats crazy? I owned both the PS4 and X1 at the same time, if you asked me which console was more powerful what would I say? PS4, no doubt about it. Now try and guess which system I sold simply because there was NOTHING worth playing on it.

You res fags need to get over it, games will launch on both systems regardless of the sales gap.

1st of all, hate speech is one of the few things not tolerated on this forum. Using such terms is childish and really does nothing to raise the dialogue on any subject.

2nd of all, you assume a lot about me. You assume even more about a system using DDR3 as graphics memory. Having one system as a known sum while assuming there is a hidden ESRam boogeyman in the other is the height of naivety. GPU Compute is going to be leaned on more and more by game developers throughout this generation as a means of optimization and that is the achilles heel of the XBone architecture.

3rd. You're a lunatic if you want anyone to believe you sold a PS4 for not having games but somehow kept the XBone? It simply has fewer games already. That gap is only going to widen as this gen progresses. The industry gave XBone the benefit of the doubt years before the systems actually launched. That's the only reason you can come on here and talk that good shit while waving the fanboy label around.

You're willing to give the XBone so much benefit of the doubt that you are straight up in BizzaroWorld now. If any other platform maker had just stopped making first party titles two years before the next console came out, their fanboys could be parading the same whack as **** bullshit you are spewing. As far as XBone having a higher threshhold than the PS4 because of its architecture, you're actually worse than Playstation 3 fanboys ever were with that nonsense. At least they could point to titles like Uncharted and show us where Cell/PS3 was living up. You, on the other hand, are championing a losing cause when N-O-T-H-I-N-G being shown on the XBone is even in the same category or galaxy as Uncharted 4 or DriveClub.

The Xbox One's true Achilles heel is compute; compute is going to be huge in future games down the road. The PS4 is a lot better at compute than the Xbox One, no doubt about it, simply due to having more CUs, custom Sony tweaks to the GPU that allow even better compute performance, and more ACEs.

#220  Edited By commander
Member since 2010 • 16217 Posts

@xboxiphoneps3 said:

@evildead6789 said:

@tormentos said:

@evildead6789 said:

Yeah because they didn't have the right esram tools at that time lol

witcher devs are just being lazy

both consoles are weak sauce

my gpu is a gtx 760

They have all the same tool all developers have,Sniper Elite is 1080p The witcher 3 comes next year and isn't,it not the tools is the weak xbox one.

I doubt you have one,and that 760GTX can't be consider a 5 year old PC.

I get it you love your xbox but you hide on PC when it loss so lemming like..lol

Just because i have a newer card doesn't mean older cards matched that performance. The hd 5970 runs circles around the ps4's gpu, and that's a 5 year old card, that you can pick up for a lousy 150 bucks.

yeah sure , I love my xbox, for what, old games lol. I don't even have a fake next gen console. I still have an x360 but also a ps3 but i have those for years.

The point is , you love your ps4 so much and keep on htting on the x1 but they're both a rip off with dinosaur hardware when you look at performance. The only thing that's recent about these consoles is the power consumption but apart from that, a 5 year old pc runs circles around it.

next gen console lmao, whining about 100p difference at 30fps because one dev is too lcheap to use the proper tools. What a victory lol,

enjoy your lagfests lol

how does a HD5970 run circles around a HD 7850? The terascale engine in the old GPUs get handily beat by the new GPU architecture (GCN)in the 7xxx series GCN cards is so much better then the past architectures, tesselation performance is tons better on a HD 7xxx class gpu, GCN is much more efficient then a 5970

show me a 5 year old computer that can run Tomb raider with TressFX on at 1080p and run it like a PS4 , a GTX 580 is better then a HD5970, and a 580 is only 20% faster about then the 7850, this guy got his facts completely wrong, ignorance is bliss i guess.

Compute performance, tessellation performance, and others is much much better on GCN architecture then Terascale architectures in past AMD Gpus before GCN came out, not to mention the HSA design that comes along and boosts overall system performance, alongside with other GPU architecture tweaks done by Sony themselves to boost system performance. Ps4 is no powerhouse of a machine , but it will quite easily beat a 5 year old rig easily in performance

and lagfest? consoles usually have consistent performance and FPS,

It's not because these older cards don't have some of the new features (which are not a must; it's still DX11) that they're not stronger. Sure, the newer cards are a bit more power efficient, that may be true, but the rest of what you're saying you just pulled out of your ass, or you're misinformed, or you're just not smart enough to read benchmark charts.

You have some nerve coming in here and spreading your BS. The GTX 580 isn't faster than the HD 5970; the 5970 is faster if the proper drivers are used. In terms of horsepower the HD 5970 is simply stronger.

Whatever you're saying about architecture is all good and nice, but it isn't better than the raw horsepower of the HD 5970. GCN may be optimized for DX11 features and may beat or match the HD 5970 in some titles, especially titles that don't have proper dual-GPU support, but overall the HD 5970 is the stronger card.

The tweaks that Sony has done may be all well and good, but with the proper drivers and a nice CPU from even 5 years ago, the HD 5970 runs circles around a PS4.

Lagfest? I don't know what planet you live on, but consoles are lagfests; the framerate drops beneath 30 a lot. Maybe not right now, because there basically aren't any next-gen games yet, and the ones that are released are stripped of resolution, texturing, shadowing or lighting to maintain a steady FPS, otherwise everyone would be complaining.

Now get outta here and take your BS with you.

#221 Shewgenja
Member since 2009 • 21456 Posts

@xboxiphoneps3 said:

The Xbox Ones true achilles heel is compute, compute is going to be huge in all games future games down the road, PS4 is alot better at compute then the Xbox One no doubt about it simply due to just having more CU's and custom Sony tweaks to the GPU to allow even better compute performance and more ACE's

Not to mention the ability for the CPU and GPU to read from the same memory addresses.

#223 donalbane
Member since 2003 • 16383 Posts

@Shewgenja said:

Personally, I just think as the generation unfolds, as has with every gen before it that game design will push the hardware further and further. In that sense, the XBone will plateau dramatically. The PS4 is easier to develop for, it is the strongest technologically and it is the most ubiquitous commercially.

For me, at least, it has little to do with proving or disproving any of the consoles as a "godbox" and more to do with what one can do to actually justify the XBone in a day and age when one console has literally all the advantages. On top of that, when the XBox was the king of the land, they shrank their development resources and even let go of their most venerated first party studio.

So, there's an extra layer on top of the power debate and that is the developer debate and who will be willing to draw a line at what they can do for the sake of parity on a system that is struggling commercially. It's becoming an exercise in futility extending the branch of impartiality on people who defend the XBone for any reason. As the days pass, that reason diminishes.

Actually, the Witcher devs are on record saying that the Xbox One version was easier to develop for due to the Windows-based OS... I believe they said something about the video encoding or something along those lines. At any rate, it will look better on PS4 despite the fact that that version created some minor hurdles for the team.

Still, the PC version is clearly the way to go with this game if your machine can handle it.

#224  Edited By SoftwareGeek
Member since 2014 • 573 Posts

"Personally, I just think as the generation unfolds, as has with every gen before it that game design will push the hardware further and further. In that sense, the XBone will plateau dramatically. The PS4 is easier to develop for, it is the strongest technologically..."

As MS develops new development tools, it may not always be the case that the PS4 is easier to develop for. The PS4 has slightly better hardware, but that doesn't necessarily mean it is the strongest technologically; it means that as a standalone unit it is stronger. I see where both consoles could plateau. However, the XB1 has an advantage here because incredibly complex calculations can be offloaded to the cloud, calculations that could easily overwhelm either console. After taking both consoles' strengths and weaknesses into consideration, I find it more likely that the PS4 will plateau before the XB1. Buuuut... that is speculation. It's speculation, however, that's based upon my 15+ years' experience as a software engineer. I could be wrong. Probably not though. At the end of the day, it doesn't matter much because both systems will have killer games. Halo. The Last of Us. Etc.

"... when the XBox was the king of the land, they shrank their development resources and even let go of their most venerated first party studio. "

MS still has world-class developers. Last I checked, they still had the fewest bugs per SLOC (source line of code).

"So, there's an extra layer on top of the power debate and that is the developer debate and who will be willing to draw a line at what they can do for the sake of parity ..."

Parity will come about because production issues will demand it. Third-party companies won't have the time to optimize the graphics for each system, so they will all wind up looking similar. Some people call this dumbing down the graphics; I guess that's one way to look at it. Another angle is meeting deadlines and sticking to your iteration goals. Development goals sadly don't always include optimizing the code for each system. But there are also other reasons parity will be reached, and they are more important in the eyes of corporate management: branding, marketing, and consistency. Very important for third-party developers. Pretty much every software developer who wants a console understands this. The general public, however, does not. So these facts alone can be a tough pill for some of you to swallow.

"...on a system that is struggling commercially."

Selling 5 million units is hardly struggling. The XB1 is doing fine. The PS4 is doing fantastically at about 9 million units, I think?

"As the days pass, that reason diminishes."

True of any system.

Avatar image for StormyJoe
StormyJoe

7806

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#225 StormyJoe
Member since 2011 • 7806 Posts

@Shewgenja said:

@xboxiphoneps3 said:

The Xbox Ones true achilles heel is compute, compute is going to be huge in all games future games down the road, PS4 is alot better at compute then the Xbox One no doubt about it simply due to just having more CU's and custom Sony tweaks to the GPU to allow even better compute performance and more ACE's

Not to mention the ability for the CPU and GPU to read from the same memory addresses.

That's only an issue for the ESRAM...

Avatar image for slimdogmilionar
slimdogmilionar

1345

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 5

#226  Edited By slimdogmilionar
Member since 2014 • 1345 Posts

@softwaregeek said:

"Personally, I just think as the generation unfolds, as has with every gen before it that game design will push the hardware further and further. In that sense, the XBone will plateau dramatically. The PS4 is easier to develop for, it is the strongest technologically..."

As MS develops new development tools, it may not always be the case that the PS4 is easier to develop for. The ps4 has a little better hardware, but that doesn't necessarily mean it is the strongest technologically. It means as a stand alone unit it is stronger. I see where both consoles could plateau. However, the xb1 has an advantage here because incredibly complex calculations can be offloaded to the cloud. Calculations that could easily overwhelm either console. After taking both consoles strengths and weaknesses into consideration, I find it more likely that PS4 will plateau out before the xb1. Buuuut....that is speculation. It's speculation however that's based upon my 15+ years experience as a software engineer. I could be wrong. Probably not though. At the end of the day, it doesn't matter much because both systems will have killer games. Halo. The Last of Us. etc.

"... when the XBox was the king of the land, they shrank their development resources and even let go of their most venerated first party studio. "

MS still has world class developers. Last I checked, they still had the fewest bugs per sloc.

"So, there's an extra layer on top of the power debate and that is the developer debate and who will be willing to draw a line at what they can do for the sake of parity ..."

Parity will come about because production issues will demand it. 3rd party companies won't have the time to optimize the graphics for each system. They will all wind up looking similar. Some people call this dumbing down the graphics. I guess that's one way to look at it. Another angle is meeting the deadlines and sticking to your iteration goals. Development goals sadly don't always include optimizing the code for each system. But there's also other reasons parity will be reached and they are more important in the eyes of corporate management. Branding, Marketing, and consistency. Very important for 3rd party developers. Pretty much every software developer that wants a console understands this. The common public however does not. So these facts alone can be a tough pill for some of you to swallow.

"...on a system that is struggling commercially."

Selling 5 million units is hardly struggling. The xb1 is doing fine. The ps4 is doing fantastic at about 9 million units i think?

"As the days pass, that reason diminishes."

True of any system.

Source?

Avatar image for b4x
B4X

5660

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#227  Edited By B4X
Member since 2014 • 5660 Posts

The Graphics King Machine needs no solving. :p

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#228 commander
Member since 2010 • 16217 Posts

@xboxiphoneps3 said:

@Wickerman777 said:

@evildead6789 said:

@Wickerman777 said:

@evildead6789 said:

both consoles are weak sauce

my gpu is a gtx 760

While I agree they aren't what they should have been (Especially X1) I never expected their specs to compete with high-end PC gpus. That's impossible given the cost and power requirements of graphics cards like that. But what they both could, and should, have done is aimed at a $500 price point with 24 functioning compute units in the GPUs and 12 Jaguar CPU cores instead of 8. Those would have been true 1080p machines and I totally believe people would have been willing to pay the extra $100 for them. But there's no time machine to correct things so we've got what we've got. :(

gtx 760 is not a high end gpu lol. I had a 7870 xt before that and then i bought a 750 ti because i wanted to make a httpc. The 7870XT even runs circles around the ps4's gpu. The 750ti is about the same as what's in a ps4, but an old phenom II quad beats the crap out of the ps4's cpu lol.

I bought the gtx 760 because i can sli and for upcoming vr games. The gtx 780 is a high end gpu, or maybe the gtx 770.

The x1 is really not that much weaker as the ps4, everybody's whining about the secret sauce, but the fact is that it isn't secret sauce but it still give an boost to the x1's gpu which is basically a 7790, that's just one tier below the 7850 and that's what's in a ps4.

The 1080p problems is because of the reservation for kinect , which is now gone and the proper tools for that esram. Maybe the ps4 may still have a bit of an edge, but it won't be a difference vs 1080p and 900p, the witcher's comment is just because they're frustated , they have to put the extra work in the X1 version, while the ps4 will be a lot easier to dev for, since it's a lot more like a pc. They also have a pc version so they just going to use one version and they won't use the esram.

Because that's what's basically is the difference between the 7790 and 7850, 1080p vs 900p, the esram can close the gap, but if they won't use it, nobody can do something about that.

I wasn't talking about your GPU, didn't even pay attention to it. Was talking about PC GPUs in general, specifically the high-end ones. And X1 is not equivalent to a 7790. Architecturally it's similar but the PC version is clocked in a way that gives it several hundred extra terraflops of performance. X1 is only 1.3. And I disagree with your 1080p vs 900p figure. More like 900p vs 720p. PS4 has been running quite a few 1080p games on old engines. Once next-gen gets into full gear and real games start coming out for it regularly that figure is going to be lower much of the time. A 1080p console it's not.

you have no point here, you say the PS4 is running old game engines at 1080p, which is actually great because these cross platform titles and cross gen titles still being made is actually holding back the XB1 and PS4 until they drop 360/ps3 from production, most of these cross gen games and cross plat games are using older rendering techniques still, not fully exploiting GCN architecture,

Hence look at Assasins creed unity that is only for PS4/xb1/ and PC, it is a next gen game, it looks next gen, they are exploiting the modern feature sets of modern GPUs today like GCN and Kepler etc with no lowest common denominator to account for.

Console optimization is also real, even on the x86 platform that is the PS4 and Xbox One, this is how Sony is optimizing the PS4

http://gamingbolt.com/how-sony-are-pushing-for-60fps-on-ps4-razor-gpucpu-profiler-and-linker-optimizations-detailed#mWl07S9UDoBfmrex.99

Games on PS4/xb1 will only continue to look noticeably better as developers start to exploit modern gpu feature sets we have today and modern gpu architecture in GCN, things like PRT, efficient tessellation performance, compute performance that comes from GCN, and much more. Actually the PS4 is a 1080p console, and itll stay 1080p, it has more then enough bandwidth and ROPS and everything else for steady 1080p , PS4 gpu has a big advantage on the Xbox one in compute, which Xbox One can never match with PS4 or itll have to suffer graphical fidelity due to dedicating CU's to try and compete with PS4's compute prowess. PS4 can dedicate 4 CU's for physics and compute alone and still have 14 CU's left for graphical rendering, Xbox One has no choice but to try and match PS4 in compute by it to also dedicating CU's to compute, and so lets say if Xbox One also used 4 CU's for compute in a game like the PS4 to match its compute, itll only have 8 CU's left for graphical rendering, not much at all, as opposed to the 14 cu's still left for graphical rendering on PS4, 2 more then the total count on the Xbox One GPU.

1080p is here to stay, and as more days past by, more and more ps4 games will steadily be 1080p, until 5-6 years down the road when brand new next gen gpu feature sets and software comes out that makes this era of GPUs behind , like leaving behind DX11/12/OpenGL 4.4 when down the road theyre will be DX13 and so fourth and brand new gpu architectures, THEN developers will lower resolution to try and get more quality per pixel, but by then, this console generation is already over and the next gen consoles will almost be out by then (ps5/xbox one two)

old game engines and older rendering techniques is holding back the XB1/PS4 for full utilization and efficiency of both systems,

you really have no clue how computer hardware works do you.

Dx 12 can only do so much.
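For reference, the CU arithmetic in the quoted post above can be checked with a few lines of Python. This is a minimal sketch of theoretical throughput only; the 64 shaders per GCN CU, 2 FLOPs per shader per clock, and the 800 MHz / 853 MHz clocks are commonly cited figures I'm assuming here, not numbers stated in this thread, and real engines don't necessarily reserve whole CUs this way.

```python
# Minimal sketch of the "reserve 4 CUs for compute" arithmetic.
# Assumed figures (not stated in this thread): 64 shaders per GCN CU,
# 2 FLOPs per shader per clock (FMA), PS4 at 800 MHz with 18 CUs,
# Xbox One at 853 MHz with 12 CUs.

def gcn_tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    """Theoretical single-precision TFLOPS for a GCN GPU."""
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000.0

for name, cus, clock_ghz in [("PS4", 18, 0.800), ("Xbox One", 12, 0.853)]:
    total = gcn_tflops(cus, clock_ghz)
    graphics_only = gcn_tflops(cus - 4, clock_ghz)  # 4 CUs set aside for compute
    print(f"{name}: {total:.2f} TF total, {graphics_only:.2f} TF left for graphics")
```

On those assumptions the totals come out around 1.84 TF and 1.31 TF, which matches the figures thrown around elsewhere in the thread.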

Avatar image for deactivated-5ba16896d1cc2
deactivated-5ba16896d1cc2

2504

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#229  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@evildead6789 said:


you really have no clue how computer hardware works do you.

Dx 12 can only do so much.

I'm not talking about computer hardware or DX12; I know DX12 won't do much really for the XB1, it already has something similar, low-level access to its hardware, "metal". I'm simply stating that the GCN architectures of both consoles are not being fully exploited yet, and that cross-gen games that come out on PS4/XB1/360/PS3 hold back full utilization of all the feature sets and hardware power of these new consoles. It is exclusives that make the hardware shine and show "next gen prowess and graphics"; once PS3/360 get dropped from development, you see instant benefits, a la AC Unity.

I don't claim to know everything about computer hardware, but I actually do have some knowledge of it, and what I have posted is nothing but truth. What are you disputing? I know DX12 won't do much at all for the XB1, maybe a litttttle less CPU overhead and whatnot, but not much at all.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#230 commander
Member since 2010 • 16217 Posts

@xboxiphoneps3 said:

@evildead6789 said:


you really have no clue how computer hardware works do you.

Dx 12 can only do so much.

im not talking about computer hardware or DX12, i know DX12 wont do much really for the XB1, it already has something similar, low level access to its hardware, "metal" im simply stating that the GCN architectures of both consoles are not being fully fully exploited yet, that cross gen games that come out on PS4/xb1/360/ps3 holds back from full full utilization of all feature sets and hardware power of these new consoles, it is in exclusives that make the hardware shine and show "next gen prowess and graphics" that once ps3/360 get dropped from development, you see instant benefits ala AC Unity

i dont claim to know the world about computer hardware, but i actually do have some knowledge on it, and what i have posted is nothing but truth, what are you disputing, i know DX12 wont do much at all for the XB1, maybe litttttle less CPU overhead and whatnot, but not much at all

You really have no clue about computer hardware. DX12 may be good for some things, especially for very fast RAM like the ESRAM, but it ain't going to do any wonders; it will just close the gap a bit with the PS4.

There will be more optimization of course, but you can only do so much. Don't expect the PS4 or X1 to suddenly come with GTX 680 performance.

Avatar image for deactivated-5ba16896d1cc2
deactivated-5ba16896d1cc2

2504

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#231  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@evildead6789 said:


you really have no clue about computer hardware. Dx12 maybe good for some things and especially for very fast ram like the esram, but that ain't going to do any wonders, i will just close the gap a bit with the ps4

There will be more optimization of course, but you can only do so much. Don't expect the ps4 or x1 to suddenly come with gtx 680 performance.

An HD 5970 does not offer more consistent performance than a GTX 580; you are pretty much owning yourself with what you are saying. Don't even let me get started on dual-GPU issues and the horrible microstuttering it can have, especially on older AMD cards.

Consoles offer more consistent performance than a PC: locked settings, fixed FPS. How do you not know this? And you are claiming you know about computer hardware lol?

Nobody is claiming the PS4 will match GTX 680 performance; software doesn't add hardware resources. But with a closed-box machine you can squeeze all the juice out of the hardware, as opposed to PC where there are countless configurations. That is my point.

Show me where PS4 games drop more than 2 fps below 30 fps? You're grasping at straws.

www.hardocp.com/article/2010/11/14/geforce_gtx_580_vs_radeon_hd_5970_2gb_performance/1#.u-fqj8vd_qb

Look at those minimums, that frame rate jumping all over on the 5970; the GTX 580 provides more consistent performance.

^ self-owned ;)

Your logic is flawed. It doesn't matter how much "raw horsepower" you may have: does a 6870 (TeraScale architecture) that has 2 teraflops of single-precision power beat an HD 7850 (GCN) that has less than 2 teraflops of single-precision power? No, it doesn't; the HD 7850 actually beats it. Your logic is completely flawed and wrong, and sir, you don't know much about computer hardware at all judging by what you are speaking and claiming. Sorry man. GCN is a noticeably better GPU architecture than TeraScale and it's more efficient. Google is your friend.

Avatar image for Kinthalis
Kinthalis

5503

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#232  Edited By Kinthalis
Member since 2002 • 5503 Posts

DX12 is really about PC gaming. It's not going to do anything for Xbone, except make it easier for devs to tap into that hardware without necessarily bogging down in the bowels of esoteric low-level code.

Problem with that is that most devs unwilling to go down "to the metal" are probably running their games on prefab engines like Unreal 4 anyway. So DX12 really means very little for most console developers.

On PC, DX12 and Mantle and OpenGL + extensions mean the CPU will be a lot better utilized, and won't waste cycles handling a ton of draw calls for each frame. CPUs on even modest gaming PCs are 3-4 times more powerful than what's in these consoles - that means a LOT more CPU time to go around for devs to utilize.
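Kinthalis's draw-call point can be illustrated with a toy model. The draw-call count and per-call CPU costs below are purely hypothetical numbers chosen for illustration, not measured DX11 or DX12 figures; the sketch only shows why cutting per-call submission overhead frees up frame time.

```python
# Toy model: CPU time spent per frame just submitting draw calls.
# The draw-call count and per-call costs are hypothetical, for illustration only.

FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7 ms per frame at 60 fps

def submit_cost_ms(draw_calls, cost_us_per_call):
    """Milliseconds of CPU time spent submitting draw calls each frame."""
    return draw_calls * cost_us_per_call / 1000.0

draw_calls = 2000
for api, cost_us in [("high-overhead API", 10.0), ("low-overhead API", 1.0)]:
    ms = submit_cost_ms(draw_calls, cost_us)
    print(f"{api}: {ms:.1f} ms of the {FRAME_BUDGET_MS:.1f} ms frame budget")
```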

Avatar image for deactivated-5ba16896d1cc2
deactivated-5ba16896d1cc2

2504

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#233  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@Kinthalis said:

DX12 is really about PC gaming. It's not going to do anything for Xbone, except make it easier for devs to tap into that hardware without necessarily bogging down in the bowels of esoteric low-level code.

Problem with that is that most devs unwilling to go down "to the metal" are probably running their games on prefab engines like Unreal 4 anyway. So DX12 really means very little for most console developers.

On PC, DX12 and Mantle and OpenGL + extensions mean the CPU will be a lot better utilized, and won't waste cycles handling draw calls for each frame. CPUs on even modest gaming PCs are 3-4 times more powerful than what's in these consoles - that means a LOT more CPU time to go around for devs to utilize.

Exactly, it's about the PC. The Xbox One already has low-level access, "metal" access, it's a console. DX12 is really meant for PC, as the big gains will be made there.

Avatar image for clr84651
clr84651

5643

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#234  Edited By clr84651
Member since 2010 • 5643 Posts
@chikenfriedrice said:

Is the Witcher 3 confirmed 1080P on the PS4? ( real question )

Yes it is. And it's not the first game nor will it be the last game that's 1080p on PS4 & not on X1.

DX12 can't make up for the hardware advantage, because it's software.

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#235  Edited By commander
Member since 2010 • 16217 Posts

@xboxiphoneps3 said:

@evildead6789 said:

@xboxiphoneps3 said:

@evildead6789 said:

you really have no clue how computer hardware works do you.

Dx 12 can only do so much.

im not talking about computer hardware or DX12, i know DX12 wont do much really for the XB1, it already has something similar, low level access to its hardware, "metal" im simply stating that the GCN architectures of both consoles are not being fully fully exploited yet, that cross gen games that come out on PS4/xb1/360/ps3 holds back from full full utilization of all feature sets and hardware power of these new consoles, it is in exclusives that make the hardware shine and show "next gen prowess and graphics" that once ps3/360 get dropped from development, you see instant benefits ala AC Unity

i dont claim to know the world about computer hardware, but i actually do have some knowledge on it, and what i have posted is nothing but truth, what are you disputing, i know DX12 wont do much at all for the XB1, maybe litttttle less CPU overhead and whatnot, but not much at all

you really have no clue about computer hardware. Dx12 maybe good for some things and especially for very fast ram like the esram, but that ain't going to do any wonders, i will just close the gap a bit with the ps4

There will be more optimization of course, but you can only do so much. Don't expect the ps4 or x1 to suddenly come with gtx 680 performance.

a HD5970 does not offer more consistent performance then a GTX 580, you are pretty much owning your self with what you are saying. dont even let me speak on about Dual gpu issues and horrible microstuttering it can have, espcially older AMD cards

Consoles offer more consistent performance then a PC, locked settings, fixed FPS, how dont you know this? and you are claiminig you know about computer hardware lol?

nobody is claiming ps4 will match gtx 680 performance, software doesnt add hardware resources, but a closed box machine , you can squeeze all the juice out of a closed box machine as opposed to PC where there is countless configurations. that is my point,

Show me where PS4 games drop below 30 fps more then 2 fps? your grasping at straws

www.hardocp.com/article/2010/11/14/geforce_gtx_580_vs_radeon_hd_5970_2gb_performance/1#.u-fqj8vd_qb

look at those minimums, that frame rate jumping all over on the 5970, gtx 580 provides a more consistent performance

^ self-owned ;)

your logic is flawed, it doesnt matter how much "raw horsepower" you may have, does a 6870 (Terascale architecture) that has 2 teraflops of single precision power beat a HD7850 (GCN)that has less then 2 teraflops of single precision power? no it doesnt, the hd7850 actually beats it, your logic is completely flawed and wrong and sir you dont know much about computer hardware at all by just what you are speaking and claiming. sorry man. GCN is a noticeably better GPU architecture then Terascale and its more efficient, Google is your friend.

This is totally beside the point. A GTX 580 may be more consistent now because AMD is too lazy with driver support, because the GTX 580 has more memory, and because the HD 5970 is a dual GPU. The point is that the HD 5970 released 5 years ago and was way stronger than the 7850 that is now in a PS4; it is also stronger than a GTX 580. Heck, even if you want a single GPU, the GTX 480 matched the 7850 and was also released five years ago.

And you can put four GTX 480s in a PC.

The point is that this was five years ago, and these next-gen consoles released last year. In the last five years GPU hardware has advanced a lot more than CPU hardware, but you still get this kind of hardware in these consoles, not to mention the very weak CPUs.

Of course, there always has to be a guy like you who thinks he knows better and starts commenting with bs. And yes, I know something about computer hardware, and apparently a lot more than you if I look at what you're saying here. I worked 15 years in the industry and this has been a hobby of mine since the 1990s.

You have to learn how to read too, because I never said the PS4 dropped below 30 fps, but it will happen soon; when new games are released, they will ask for more hardware and the PS4 simply won't keep up. They already have to make sacrifices now and this gen has just begun..

You still don't get how computer hardware works, do you, and you want to lecture me lol, it's kind of funny. You're comparing an HD 6870 with a 7850. The 7850 is simply stronger for reasons other than tflops alone lol; you're just a guy who reads something about tflops and thinks that is how you compare hardware. Well, do some research. You say Google is my friend; I think you don't know how to use it.

The 7850 simply has more raw horsepower than the HD 6870: it's 28nm vs 40nm, and it has higher memory bandwidth lol. But maybe you just made a typo, because the 6870 was never mentioned in this discussion before lol. If you meant the 5970, you're still wrong; the 5970 simply has more raw horsepower whether you like it or not.

http://www.hwcompare.com/12050/radeon-hd-5970-vs-radeon-hd-7850/

You say my logic is flawed? Get outta here, you silly kid.

Avatar image for general_solo76
General_Solo76

578

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#236 General_Solo76
Member since 2013 • 578 Posts

For crying out loud, please stop with the massive quote posts. It takes like 5 damn minutes to scroll through one page of this crap on my phone!

Avatar image for tormentos
tormentos

33798

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#237  Edited By tormentos
Member since 2003 • 33798 Posts

@evildead6789 said:

Where do you get you info on non programmable crap. This has nothing to with it, floating point performance is floating point performance, you can stand on your head lol, it isn't going to change the numbers.

You still don't get how bits and bytes work. The bus is not just a way to pass through lol. A gtx 275 has a 448 bit bus and is nowhere near the hd 7790 in performance or the gtx 660 ti. A gtx 280 has a 512 bit bus. They just put a 512 bit highway on a gtx 280 , that has ddr3 ram and is like 1/3 of the performance of a gtx 660ti. Don't you think the 660ti would be bottlenecked lol

You're just hilarious how you try to sell your wannabe knowledge. You're arguing with people who know what they're talking about, and you just keep selling your bs like it's hotcakes. Sadly no one is buying them , you just keep on making a fool out of yourself.

Here let me show you how you calculate memory bandwith:

I will use the gtx 580 of nvidia as an example. With a memory clock of 1002 mhz and a memory bus width of 384 bit.

A memory clock of 1002MHz means a datarate of 4008MHz (all videomemory since ddr3 is quad pumped) Each of these transactions is 384 bits. This means that the bandwidth of the memory is 384 x 4008 = 1,539,072 million bits per second. See, the memory bus , in this case 384 bit , multiplies with the memory speed, so that means it isn't just a highway lol, it has direct influence on your memory performance. The higher the bit bus, the higher the standard memory speed multiplies with, The higher the performance.

To get from bits to bytes we divide the number by 8 leading to 1,539,072 / 8 = 192,384 million bytes per second or 192.4 GB/s. This final outcome is the real memory performance figure. So yes the bus width has a direct influence on the performance of the card

SINCE IT HAS HIGHER DATA THROUGHPUT

So yes, a hd 7790 with a 256 bit bus will be faster than one with a 128 bit bus, especially when you see that much weaker gpu's have higher memory bandwith and that performance is lost when you lower the the memory bandwith

Now computer class is over , I'll send you the bill.

That first part in bold there is complete and utter bullsh**, a gibberish mess of epic proportions.

Stop trying to talk about sh** you don't understand. The fact that you claim the PS3 has 2TF says it all; you are a sad lemming who doesn't know s** about what he is talking about. A 1.28TF GPU doesn't need a freaking 256-bit bus, period; the 7770 doesn't have it and the more powerful 7790 doesn't either.

Again, the 660Ti smokes the 7850 yet the 7850 has higher bandwidth and a wider bus as well. Secret sauce debunked..lol

@FastRobby said:

So first you say they are holding PS4 back for parity, and then you proof that there isn't any parity. Classic self ownage from Tormentos

It's not my fault that you are too dumb to understand it. Developers looking for parity hold the PS4 back to achieve it; the part about there being no parity refers to developers not holding back and using both units to the max.

There is no parity because the PS4 is stronger,unless developers hold the PS4 back.

@evildead6789 said:

you really have no clue about computer hardware. Dx12 maybe good for some things and especially for very fast ram like the esram, but that ain't going to do any wonders, i will just close the gap a bit with the ps4

There will be more optimization of course, but you can only do so much. Don't expect the ps4 or x1 to suddenly come with gtx 680 performance.

And the bulsh** keeps coming..Secret Sauce FTW...lol
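For what it's worth, the bandwidth formula being argued over in the quoted text earlier in this post is easy to reproduce: bus width in bits, divided by 8 to get bytes, times the effective data rate. A minimal sketch, assuming the commonly listed GTX 580 reference memory specs:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate.
# GTX 580 reference specs assumed: 384-bit bus, 1002 MHz GDDR5 quad-pumped
# to an effective 4008 MT/s.

def mem_bandwidth_gbs(bus_width_bits, data_rate_mtps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_mtps / 1000.0

print(f"GTX 580: {mem_bandwidth_gbs(384, 4008):.1f} GB/s")  # ~192.4 GB/s
```

Note this only gives peak bandwidth; by itself it says nothing about how a narrower bus trades off against shader throughput or fill rate, which is what the rest of the argument is about.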

Avatar image for deactivated-5ba16896d1cc2
deactivated-5ba16896d1cc2

2504

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#238  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@evildead6789: your logic is that the more tflops it has, the better it performs, that raw horsepower beats everything else. The HD 5970 doesn't have enough "raw horsepower" to overcome the efficiency gap between GCN and TeraScale. An HD 5970 at the time was expensive as ****. Actually, on paper the HD 6870 does have more "raw horsepower", as you like to call it, by your own logic:

1120 shader processors for the 6870 (2.0 teraflops) versus 1024 shader processors for the HD 7850 (1.76 teraflops), around the same clock speeds, with the HD 7850 having about 20 GB/s more bandwidth. So why is it that the HD 7850 beats the 6870 if the 6870 has more "raw horsepower"?

Thanks for proving my point that a 28nm GCN GPU outperforms an older 5970 that isn't built on 28nm.

You're trying to rip on me for comparing two GPUs while you were doing the same exact thing with your comparison. No shit, you can't just compare TFLOPS, because a tflop number from the TeraScale GPU architecture and a tflop number from a GCN GPU have two different meanings. A 2.0 tflop TeraScale GPU and a 2.0 tflop GCN GPU are not the same performance even if both are rated at 2.0 tflops. GCN has it beat in tons of ways.

Good luck with compute and tessellation and other features on your TeraScale GPUs; GCN runs laps around TeraScale GPUs in quite a few categories, big examples being tessellation and compute performance. Sorry buddy, but you didn't own anything and you honestly are speaking nonsense.

GCN is a much more efficient GPU architecture than the TeraScale GPUs; it doesn't matter how much "raw horsepower" you may have, GCN is simply more efficient.
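The "2.0 TF vs 1.76 TF" figures above come straight from shader count x 2 FLOPs per clock x clock speed. A small sketch, assuming the reference clocks of 900 MHz for the HD 6870 and 860 MHz for the HD 7850 (those clocks are not stated in the thread):

```python
# Theoretical single-precision throughput from shader count and clock.
# Reference clocks assumed: HD 6870 at 900 MHz, HD 7850 at 860 MHz.

def theoretical_tflops(shaders, clock_ghz, flops_per_clock=2):
    """Paper TFLOPS; says nothing about architectural efficiency."""
    return shaders * flops_per_clock * clock_ghz / 1000.0

print(f"HD 6870 (TeraScale): {theoretical_tflops(1120, 0.900):.2f} TF")
print(f"HD 7850 (GCN):       {theoretical_tflops(1024, 0.860):.2f} TF")
```

The older card wins on paper, which is exactly the point being made: a TeraScale teraflop and a GCN teraflop are not worth the same in real games.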

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#239  Edited By commander
Member since 2010 • 16217 Posts

@tormentos said:


That first part in bold there is complete and utter bullsh**, a gibberish mess of epic proportions.

Stop trying to talk about sh** you don't understand. The fact that you claim the PS3 has 2TF says it all; you are a sad lemming who doesn't know s** about what he is talking about. A 1.28TF GPU doesn't need a freaking 256-bit bus, period; the 7770 doesn't have it and the more powerful 7790 doesn't either.

Again, the 660Ti smokes the 7850 yet the 7850 has higher bandwidth and a wider bus as well. Secret sauce debunked..lol


@evildead6789 said:

you really have no clue about computer hardware. Dx12 maybe good for some things and especially for very fast ram like the esram, but that ain't going to do any wonders, i will just close the gap a bit with the ps4

There will be more optimization of course, but you can only do so much. Don't expect the ps4 or x1 to suddenly come with gtx 680 performance.

And the bulsh** keeps coming..Secret Sauce FTW...lol

The PS3's number of tflops has nothing to do with the discussion of the memory bandwidth. The numbers were given out by Sony, and the PS3 was bought to build supercomputer clusters; 8 PS3s outperformed 100 Intel Xeon CPUs (source: wiki).

Whether it was programmable for games or not doesn't matter, because that's exactly my point: tflops are not an absolute measurement of gaming performance.

Your comment about the bandwidth is again hilarious, simply because you don't understand the subject lol. The 660 Ti is better than the 7850 because it has a higher clock speed, more shader cores and double the texture mapping units; the smaller bit bus is countered by the higher memory speeds (6000 MHz on the 660 Ti vs 4800 on the 7850).

The 7850's memory is still about 7 percent faster though, but the texture fill rate is twice as fast on the 660 Ti, and that's why the 660 Ti is faster. If it had a 256-bit bus, it would have even higher memory bandwidth and the gap would be even bigger.

Is it really that difficult? I mean, it's just basic mathematics, multiply and compare lol. Anyway, no point in discussing this with you because you're not listening. It's perfectly understandable though; after you made such a fool of yourself, I wouldn't want to admit it either.

But I wouldn't keep coming back and making it worse lol.
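The two figures in this post - the 7850's roughly 7 percent memory bandwidth edge and the 660 Ti's roughly double texture fill rate - can be checked from the commonly listed reference specs (an assumption on my part; factory-overclocked boards differ):

```python
# Bandwidth = bus width / 8 * effective data rate; texture fill rate = TMUs * core clock.
# Reference specs assumed: GTX 660 Ti 192-bit @ 6008 MT/s, 112 TMUs @ 915 MHz;
# HD 7850 256-bit @ 4800 MT/s, 64 TMUs @ 860 MHz.

def bandwidth_gbs(bus_bits, data_rate_mtps):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_mtps / 1000.0

def texture_fill_gtps(tmus, core_clock_ghz):
    """Peak texture fill rate in gigatexels per second."""
    return tmus * core_clock_ghz

bw_660ti, bw_7850 = bandwidth_gbs(192, 6008), bandwidth_gbs(256, 4800)
fill_660ti, fill_7850 = texture_fill_gtps(112, 0.915), texture_fill_gtps(64, 0.860)

print(f"Memory bandwidth: 660 Ti {bw_660ti:.1f} GB/s vs 7850 {bw_7850:.1f} GB/s "
      f"({bw_7850 / bw_660ti - 1:.0%} in the 7850's favour)")
print(f"Texture fill rate: 660 Ti {fill_660ti:.1f} GT/s vs 7850 {fill_7850:.1f} GT/s "
      f"({fill_660ti / fill_7850:.2f}x in the 660 Ti's favour)")
```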

Avatar image for commander
commander

16217

Forum Posts

0

Wiki Points

0

Followers

Reviews: 11

User Lists: 0

#240  Edited By commander
Member since 2010 • 16217 Posts

@xboxiphoneps3 said:

@evildead6789: your logic is the more tflops it has, the better it performs, that raw horsepower beats everything else, the Hd5970 doesnt have enough "raw horsepower" to overcome that efficiency gap that there is between GCN and Terascale. A Hd 5970 at the time was expensive as ****. Actually on paper, the HD6870 does have more "raw horsepower " as you like to call it and in your logic

1120 shader processors for the 6870 (2.0 teraflops) verses 1024 shader processors for the hd7850 (1.76 teraflops), around same clock speeds with the Hd7850 having about 20gb/s more bandwidth, so why is it that the Hd7850 beats the 6870 if the 6870 has more "raw horsepower" ?

Thanks for proving my point that a 28nm GCN gpu outperforms a older 5970 that isnt built on 28nm.

Your trying to rip on me for comparing two gpus, while you were doing the same exact thing with your comparison. No shit you cant just compare TFLOPS because tflops number from Terascale gpu architecture and tflop number from a GCN gpu have two different meanings. A 2.0 tflop Terascale GPU and a 2.0 tflop GCN gpu are not the same performance even if both tflop numbers are clocked at 2.0 tflops. The GCN has it beat in tons of ways.

Goodluck with compute and tessellation and other features on your terascale GPU 's , GCN runs laps around Terascale gpus in quite a good amount of catorgories big examples being tessellation and compute performance. Sorry buddy but you didnt own nothing and you honestly are speaking nonsense

GCN is a much more efficient gpu architecture then the Terascale gpus, doesnt matter how much "raw horsepower" you may have GCN is simply more efficient

I'm not basing myself on tflops, I never said that. By raw horsepower I mean benchmarks; an HD 5970 is basically two underclocked HD 5870s on one board. The 5870 comes very close to the 7850 in performance. That's why I gave you the example of the GTX 480, which was also released 5 years ago and which is a single GPU. I'm not using the 5870 as a reference because it's a bit slower than the HD 7850, although overclocking it would match the 7850's performance.

The efficiency gap you're talking about is just the driver support for the dual GPU on the 5970. AMD is not very good when it comes to driver support for older cards, especially not in dual setups.

The 6870 cannot even match the 5870 in performance, so why are you comparing the 6870 with a 7850? I never used the term tflops in this discussion, it is you.

You're always referring to optimizations the newer 7xxx cards got, but the older cards had this too with the never settle drivers. Nvidia releases new drivers as well. As bad as AMD is at releasing drivers on time or for multi-GPU setups, the 5xxx series is still supported. If you look at any site that compares cards from different generations you will see the exact same picture even today.

The 7850 is comparable to the GTX 480; the 5870 is a bit slower but nothing that an overclock can't handle; the 5970 is way above it.

Here, maybe a reference so you would stop spreading your bs all the time:

http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html

Avatar image for deactivated-5ba16896d1cc2
deactivated-5ba16896d1cc2

2504

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#241  Edited By deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

@evildead6789 said:


I'm not basing myself on tflops, I never said that. By raw horsepower I mean benchmarks; an HD 5970 is basically two underclocked HD 5870s on one board. The 5870 comes very close to the 7850 in performance. That's why I gave you the example of the GTX 480, which was also released 5 years ago and which is a single GPU. I'm not using the 5870 as a reference because it's a bit slower than the HD 7850, although overclocking it would match the 7850's performance.

The efficiency gap you're talking about is just the driver support for the dual GPU on the 5970. AMD is not very good when it comes to driver support for older cards, especially not in dual setups.

The 6870 cannot even match the 5870 in performance, so why are you comparing the 6870 with a 7850? I never used the term tflops in this discussion, it is you.

You're always referring to optimizations the newer 7xxx cards got, but the older cards had this too with the never settle drivers. Nvidia releases new drivers as well. As bad as AMD is at releasing drivers on time or for multi-GPU setups, the 5xxx series is still supported. If you look at any site that compares cards from different generations you will see the exact same picture even today.

The 7850 is comparable to the GTX 480; the 5870 is a bit slower but nothing that an overclock can't handle; the 5970 is way above it.

Here, maybe a reference so you would stop spreading your bs all the time:

http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html

7850 has a nice performance edge over the hd 5870 in Directx11 games, it has superior geometry performance and compute, with future games especially are going to utilize compute even more.

How many people actually bought an HD 5970? Not that many at all; it was an expensive-ass card at the time and it generated heat like a factory.

Dude, you are completely forgetting about feature sets that are on GCN and aren't available on the older 5970: cough cough, PRT, superior tessellation and geometry performance, superior compute performance alongside the PS4's custom GPU tweaks, better AF performance.

You are now seeing developers fully exploit the GCN architecture, and Kepler as well (games like AC Unity, The Order, and a bunch of other games), since PS4/XB1 are the common denominator now (both are GCN), and they are also ready for the next step as well.

You're also forgetting about AMD Mantle support, which on PC is only for GCN GPUs: low-level access.

derp

Also the issues that come with the older dual-GPU solutions: microstuttering, etc. The paper-TFLOPS numbers we keep throwing around are sketched out below.
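For anyone keeping score on the TFLOPS figures quoted above, here is a minimal sketch of where those paper numbers come from. It is not taken from either poster; it just uses the commonly published shader counts and reference clocks for these cards, and the usual assumption of 2 FLOPs per shader per clock. The point is that the paper figure is pure arithmetic and says nothing about how efficiently Terascale versus GCN turns it into frames.

# Theoretical single-precision TFLOPS = shaders x 2 FLOPs/clock x clock.
# Shader counts and reference clocks below are the commonly published specs
# for these cards (an assumption, not taken from this thread).
cards = {
    "HD 5870 (Terascale 2)": (1600, 850),
    "HD 5970 (Terascale 2, dual GPU)": (3200, 725),
    "HD 6870 (Terascale 2)": (1120, 900),
    "HD 7850 (GCN)": (1024, 860),
}

def theoretical_tflops(shaders, clock_mhz):
    # Each shader can retire one fused multiply-add (2 FLOPs) per clock.
    return shaders * 2 * clock_mhz / 1_000_000

for name, (shaders, mhz) in cards.items():
    print(f"{name}: {theoretical_tflops(shaders, mhz):.2f} TFLOPS on paper")

Run that and the 6870 "wins" on paper against the 7850 (roughly 2.0 vs 1.76), yet it loses real benchmarks, which is exactly why both sides end up arguing about efficiency rather than the raw number.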

Avatar image for scatteh316
scatteh316

10273

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#242  Edited By scatteh316
Member since 2004 • 10273 Posts

This is real power boys and girls....... Check out my memory bandwidth too....

Consolites talking about GPU performance... Too funny....

Avatar image for CrownKingArthur
CrownKingArthur

5262

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#243 CrownKingArthur
Member since 2013 • 5262 Posts

@scatteh316: r9 290? very nice. triple fan?
what res and fps do you like to play at?

Avatar image for scatteh316
scatteh316

10273

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#244 scatteh316
Member since 2004 • 10273 Posts

@CrownKingArthur said:

@scatteh316: r9 290? very nice. triple fan?

what res and fps do you like to play at?

It's water cooled under a full-cover block..... that's actually turned down slightly while I try and get the air bubble out of the system, as I've just rebuilt it.

It runs a custom BIOS that allows up to 1000W of power draw and ups the voltage limit to 2V!! It normally runs at 1.375GHz at 1.45V.

My res is 2560x1440, I run with V-Sync, and it manages pretty much everything at a locked 60fps with no dips.
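For a rough sense of scale against the 1080p argument this thread is about, the pixel-count arithmetic below (plain math, nothing assumed beyond the standard resolutions) shows why a locked 60fps at 2560x1440 is a noticeably heavier load than 1080p, let alone the 900p some console titles render at.

# Pixel counts relative to 1080p (straightforward arithmetic).
resolutions = {"900p": (1600, 900), "1080p": (1920, 1080), "1440p": (2560, 1440)}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

That works out to roughly 1.78x the pixels of 1080p at 1440p, and about 0.69x at 900p.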

Avatar image for CrownKingArthur
CrownKingArthur

5262

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#245 CrownKingArthur
Member since 2013 • 5262 Posts

@scatteh316: that sounds like a very serious system. yo man, respect.
i knew those r9 290's were good, but hot dang - that's pretty baws. so, compared to the same card at stock, what are you seeing in terms of (ballpark) percentage performance increase?

Avatar image for GrenadeLauncher
GrenadeLauncher

6843

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#246 GrenadeLauncher
Member since 2004 • 6843 Posts

@StormyJoe said:

It's not a port, it's a remake.

You are like the frog that boils alive because the temperature is raised slowly. I have given three examples of upcoming games that have parity in resolution and frame rate, yet you still doubt.

Joey displays his ignorance again. H2A, Oddworld New 'n' Tasty and Ratchet and Clank PS4 are remakes. TLOU and Halo 1/3/4 are remasters. This is a straight port, with an upscaling filter slapped on and pan and scan for the widescreen. It's so lazy it hurts.

Sniper Elite 3 had "parity" as well. We all know what happened there. It takes more than paper performance to make the Shitbone look good.

Avatar image for scatteh316
scatteh316

10273

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#249  Edited By scatteh316
Member since 2004 • 10273 Posts

@CrownKingArthur:

It's a good 40% overclock and it's faster than a stock 7990 at this clock speed.. So yeah, it's fast....
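For a ballpark on that figure: assuming the R9 290's 947 MHz reference clock (an assumption; the card's actual stock clock isn't stated in the thread), 1.375 GHz works out to roughly a 45% core overclock, so "a good 40%" is in the right neighbourhood. A one-liner to check:

# Overclock percentage, assuming the R9 290 reference clock of 947 MHz
# (an assumption; the card's actual stock clock isn't given in the thread).
stock_mhz, oc_mhz = 947, 1375
print(f"+{(oc_mhz / stock_mhz - 1) * 100:.0f}% over stock")  # roughly +45%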

Avatar image for GrenadeLauncher
GrenadeLauncher

6843

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#250 GrenadeLauncher
Member since 2004 • 6843 Posts

@sts106mat said:

Sniper Elite? Lol, the game looks like shit, and it's not made by a developer known for their graphical prowess. It's also cross-gen. Before you start, I played it on PS4.

I reckon Batman: Arkham Knight will be the multiplat to compare, since it's not being developed for last-gen consoles.

The point is, Simon, Sniper Elite 3 was also "at parity." I remember lemmings crowing about that. Result? Worst on the Bone.