You may want to hold off on those 970s, ladies and jellyspoons.
http://wccftech.com/nvidia-geforce-gtx-980-ti-gtx-titan-x-coming/
i7s may be 4 years old, but an i7 from 4 years ago is slower than an i5 from 2-3 years ago. Add to that the fact that i7s tend to be $300+.
The GTX 670 performance-wise outclasses the console GPUs.
No, the X1 and PS4 only have 5 GB or less of memory to use. And the system/games store data the same way PCs do, so you are looking at 2-3 GB for game assets and 2-3 GB of VRAM in typical usage.
These consoles are still the bottleneck for games because of the lack of processing power... but they are still a massive leap over the 360/PS3.
But the problem is the id Tech 5 engine; that said, we've also seen too many overestimated requirements for games this year.
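For illustration only, here is a minimal sketch of that kind of split on a unified memory pool; the total, the OS reserve, and the 50/50 division are the thread's rough figures and assumptions, not measured values.

```python
# Rough budgeting sketch, not official numbers: a unified 8 GB pool minus an assumed OS
# reserve, with the remainder divided between CPU-side game data and graphics data.
TOTAL_RAM_GB = 8.0
OS_RESERVE_GB = 3.0            # later posts quote roughly 3.0-3.5 GB reserved for the OS
usable_gb = TOTAL_RAM_GB - OS_RESERVE_GB

graphics_share = 0.5           # hypothetical even split between graphics data and game data
graphics_gb = usable_gb * graphics_share
game_data_gb = usable_gb - graphics_gb

print(f"usable: {usable_gb:.1f} GB -> game data ~{game_data_gb:.1f} GB, graphics ~{graphics_gb:.1f} GB")
```

With those assumptions the output lands at roughly 2.5 GB each, which is where the "2-3 GB for assets, 2-3 GB for VRAM" estimate above comes from.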
Wait, 2-3 GB for game assets and 2-3 GB of VRAM in a console? That makes no sense.
Consoles are not a bottleneck for games because of lack of processing power. What bottleneck? There is no CPU bottleneck (oh dear, devs actually have to use more than one fucking core; it's 2014 and the fact that most PC games still push the majority of processing to one core is pathetic), no RAM bottleneck, and a GPU bottleneck at this point means nothing, as you can just apply downscaling algorithms to assets, reduce the LODs, reduce the quality of shadows and other rendering assets, cut down the number of dynamic lights (which are rarely important for gameplay), and fit even a game as ridiculous as Star Citizen on a PS4/Xbox One if you're willing to sacrifice the fidelity. It's a non-issue.
The PS4/Xbox One are not bottlenecks right now.
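As a purely illustrative aside (not any engine's real settings), the post above is describing the usual scalability knobs; a minimal sketch of what those presets look like, with made-up names and values:

```python
# Hypothetical quality presets covering the knobs listed above: LOD distance, shadow
# quality, and dynamic light count. Values are invented for illustration.
QUALITY_PRESETS = {
    "pc_ultra": {"lod_distance_scale": 1.0, "shadow_map_res": 4096, "max_dynamic_lights": 64},
    "pc_high":  {"lod_distance_scale": 0.8, "shadow_map_res": 2048, "max_dynamic_lights": 32},
    "console":  {"lod_distance_scale": 0.5, "shadow_map_res": 1024, "max_dynamic_lights": 12},
}

def apply_preset(render_settings, name):
    """Swap rendering quality without touching gameplay systems."""
    render_settings.update(QUALITY_PRESETS[name])
    return render_settings

print(apply_preset({}, "console"))
```

The point being made is that every knob here lives on the rendering side, which is why scaling down for a weaker GPU doesn't constrain game design the way a RAM or CPU shortfall would.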
As you just said, an i7 from 4 years ago is slower than an i5 from 2-3 years ago; actually, it's slower than last year's i5. Since they didn't specify a model, we can assume an i7 from the Nehalem generation, such as the i7 920, as the recommendation. I don't see how that is a high requirement for a recommended spec at all.
Res settings weren't shown in the video.
They were in the description.....
Anything could be in the description.
Are you being difficult on purpose or are you normally this stupid?
How am I supposed to know if he's being honest? If everyone went around being honest then everyone would have tons of enemies.
Speaking of VRAM, here's an excerpt from IGN's Mordor review. Sounds a little crazy.
On the PC side Mordor also compares to the Batman games, in that it’s of good quality. There are even some enhanced graphics settings, including an ultra-high texture setting that requires a full 6GB of video memory. My only issue with it is some awkward menu controls, but most of those are customizable and those that aren’t aren’t too inconvenient to get used to.
It makes perfect sense... the game "system" (excluding OS and features) allocates 2-3 GB and then allocates 2-3 GB for VRAM. What's so hard to understand?
Are you serious? The Jaguar architecture, let alone the 1.6 GHz clock rate, is somewhat of a bottleneck now and will be even more so later on, and then they only have 6 cores available, which actually process less data than a 5-year-old mid-range quad core...
Yes, the PS4 and X1 are bottlenecks because of their CPU and GPU limits, just like the 360/PS3 were from 2007 onward. These consoles stepped out into the world 2-3 years behind the hardware curve, and it shows in sub-1080p resolutions and frame rates that aren't stable even with the resources available.
This is absolutely right on - you cannot directly compare the low-power laptop grade hardware of the current consoles to full power desktop CPUs and discrete graphics cards. Well, you could, but you would sound dumb and uninformed. All that extra available VRAM doesn't mean a damn thing if the rest of the hardware isn't powerful enough to process the data fast enough. This is why we see things like the "definitive" edition of Sleeping Dogs running at 30fps on the new consoles when that game could be maxed out with the same textures and better effects at 60 fps on mid-range PC GPUs from 2012.
You vastly overestimate how much CPU power is needed for games. It's such a vast overestimate I question how much you really know about game programming and what it takes to make things run. I have a 3770k. If a game uses 15% of my CPU I would be impressed. Most sit at 25% of a single core. We're talking fractions of my CPU are being used.
The lower framerates and resolutions are a result of the GPU, as the PS4/Xbox One's GPUs are significantly weaker than what PCs have. However, graphics don't limit games; you can downscale graphics easily. What limits games more is RAM and CPU bottlenecks, neither of which the PS4/Xbox One has.
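For anyone who wants to sanity-check the utilization claim on their own machine, here is a small sketch that samples per-core CPU load while a game is running. It assumes the third-party psutil package is installed; nothing here comes from the thread itself.

```python
# Sample per-core CPU utilization once per second and print an overall average,
# so you can see how much of a quad-core chip a given game actually touches.
import time
import psutil

def sample_cpu(seconds=30, interval=1.0):
    end = time.time() + seconds
    while time.time() < end:
        per_core = psutil.cpu_percent(interval=interval, percpu=True)  # blocks for `interval`
        total = sum(per_core) / len(per_core)
        print(f"total {total:5.1f}%  per-core {per_core}")

if __name__ == "__main__":
    sample_cpu()
```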
Wait, dumb and uninformed? Do you realize that even the most processor-intensive games barely crack 25% total CPU usage on any modern CPU? Obviously not. Also, 30fps and resolution have little to do with CPU power, and as I said before, you can downscale assets and rendering; those things don't limit game design. Fewer polygons on a model and less detail in the distance isn't the same as physically not being able to implement a mechanic because you ran out of RAM.
I probably have a more powerful PC than 99% of people here and game almost exclusively on my PC yet I don't have my head so far up my ass that I throw out general knowledge just to bash the game consoles and make me feel better for owning that PC.
There are also two other trends you'll see with PC games. Trend 1: games will require a minimum of 4-core processors. Why? The PS4/Xbox One's CPUs are powerful enough for games, but their single-core performance is not all that great. This forces developers to utilize multiple cores. Both consoles went for a low-power, low-heat solution by going with more weak cores rather than fewer powerful cores (it saves on heat and thus allows for a smaller form factor and longer lifespan). This is why we see i5s and i7s as the required and recommended specs. Now this won't always hold true, especially for more mature engines that were around last generation (CryEngine, Unreal 3, Frostbite), but expect newer engines to almost always require a minimum of 4 cores.
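A toy sketch of that multi-core point, not any engine's actual code: independent per-frame systems handed to a worker pool instead of running back-to-back on one thread. (Real engines use native job systems; Python's GIL limits true CPU parallelism, so this only shows the shape of the idea.)

```python
# Dispatch independent per-frame systems to a pool of workers. The function names and
# the "systems" themselves are invented for illustration.
from concurrent.futures import ThreadPoolExecutor

def update_ai(dt):        return f"AI updated ({dt:.4f}s)"
def update_physics(dt):   return f"physics stepped ({dt:.4f}s)"
def update_animation(dt): return f"animation sampled ({dt:.4f}s)"

def run_frame(pool, dt):
    # systems with no data dependencies on each other can run on separate cores
    futures = [pool.submit(system, dt) for system in (update_ai, update_physics, update_animation)]
    return [f.result() for f in futures]

with ThreadPoolExecutor(max_workers=4) as pool:
    print(run_frame(pool, 1 / 60))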
There is another trend you'll see. Until DX12, DX9 and DX11 have a major CPU bottleneck that really hurts draw calls as well as a few other graphics-processing tasks. Consoles can do draw calls way faster than a comparable PC due to the low-level API and code basically being able to call the hardware directly instead of having to be interpreted through a software API running on the CPU. This is why we see specs recommended in the i7 range yet only requiring i5s. Once DX12 hits and games start adopting it, the larger CPU bottlenecks should go away on the PC, allowing even weaker CPUs to be used. It's a win-win.
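Conceptually (no real graphics API here, and this is not how any specific driver works), the draw-call concern is about per-submission CPU overhead; a minimal sketch of the usual mitigation, grouping objects by material so the renderer submits one call per batch:

```python
# Group objects that share a material so each batch can go out in one submission,
# cutting the number of per-call trips through the API. All names are made up.
from collections import defaultdict

def build_batches(objects):
    """objects: iterable of (mesh_id, material_id) pairs -> {material_id: [mesh_ids]}."""
    batches = defaultdict(list)
    for mesh_id, material_id in objects:
        batches[material_id].append(mesh_id)
    return batches

scene = [("rock_01", "stone"), ("rock_02", "stone"), ("orc_01", "skin"), ("orc_02", "skin")]
for material, meshes in build_batches(scene).items():
    print(f"submit 1 draw call: material={material}, instances={len(meshes)}")
```

Lower-overhead APIs attack the same cost from the other side by making each submission cheaper on the CPU.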
And yet the textures in the game still look like complete ass. Final nail in idTech5's coffin? I think so.
Damn! Well said!
@Wasdie: It's actually less than 6 GB of usable RAM for the Xbone and PS4.
The PS4's OS uses 3 GB and the Xbone's uses 3.5 GB; that leaves the PS4 with 5 GB and the Xbone with 4.5 GB.
All of the specs are fine except for one... 4GB of VRAM. I bought the 780ti just a few months ago and THAT doesn't even have 4GB of VRAM. Am I supposed to run out and purchase another $600/700 780ti just to meet that requirement? Lame.
No because adding a second card doesn't double your VRAM.
Ok, so I was wrong there, but even 4.5 GB of RAM is enough. Even the most graphically intensive games on PC don't use much over 4 GB of total RAM. Also, since the CPUs of the PS4/Xbox One aren't terrible and reading data from hard drives and disc drives is faster this generation, streaming technologies are going to be used more often. Stuff like tiled resources is going to be very useful in the coming years, even on PCs.
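To make the streaming idea concrete, here is a toy, distance-based sketch: keep assets near the player resident, evict the farthest ones once a memory budget is exceeded. Every name, position, and threshold is invented for illustration; real engines stream at much finer granularity (texture tiles, mip levels).

```python
# Toy on-demand streaming: load what is in range, evict the farthest out-of-range
# assets until the resident set fits the budget again.
def update_streaming(player_pos, world_assets, resident, budget_mb, radius=100.0):
    """world_assets: name -> (position, size_mb); resident: name -> size_mb (mutated in place)."""
    # load anything that just came into range
    for name, (pos, size_mb) in world_assets.items():
        if name not in resident and abs(pos - player_pos) <= radius:
            resident[name] = size_mb          # a real engine would read it from disk here
    # evict the farthest out-of-range assets until we fit the budget
    by_distance = sorted(resident, key=lambda n: abs(world_assets[n][0] - player_pos), reverse=True)
    for name in by_distance:
        if sum(resident.values()) <= budget_mb:
            break
        if abs(world_assets[name][0] - player_pos) > radius:
            del resident[name]
    return resident

world = {"castle": (10.0, 800), "forest": (150.0, 600), "cave": (400.0, 700)}
print(update_streaming(player_pos=20.0, world_assets=world, resident={}, budget_mb=1500))
```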
Keep in mind the consoles have to share that RAM between system memory and video memory. So most likely consoles will be stuck with around 2-3 GB of VRAM.
And they sometimes may be bottlenecked enough that they can't even handle such high textures.
Just like with CPU power, it doesn't take much RAM for the system side of games. Most memory for games goes to VRAM; graphics are by far the largest consumer of system resources. I would say the majority of RAM dedicated to the game is going to be used to store graphics data. As I said, they can get around those bottlenecks now with streaming tech. There is no reason to load assets you're not using into memory; stream that stuff in on demand.
On the PC side of things, a 4 year old quad-core processor is doing just fine. There are a few exceptions but they are always a problem with programming. Planetside 2 comes to mind. Even with an overclocked 3770k I get framerate dips because of the horrid CPU utilization of that game. That's partly due to the nature of networking (lots of CPU time wasted literally waiting for network operations) but mostly due to how the game is programmed around a single core. They've made it a bit better, but the game does not really utilize multiple cores well at all.
Then why did the PS4 and Xbox One not use the highest texture setting for Watch Dogs?
If they don't need to allocate much RAM to system memory, why couldn't they handle the 3 GB of VRAM for textures?
You've been getting too obvious lately.
Not sure what you're getting at here.
So much for the "Best versions of multiplatform games are on PC" thought process.
1. Can you show me how you came to the conclusion that this game will not look and run better on high-end PCs? How does it differ from nearly all other multiplats which all look and run better on PC? Link? Citation?
2. Even if one game did finally end up better on consoles, you really think it ends a thought process concerning 99% of other games?
That's your opinion. I don't see PS4 gamers constantly complaining about their games being crappy ports.
I for one find any game I've played on my PS4 to look great... run great... sound great... all that. While I spent $400 on a PS4 and will get a good-looking game that runs great... you're going to need a $1000 PC to run this game.
An i7... a GPU with 4GB of VRAM? Yeah... and we both know if that's the case it's gonna need something beefier than a GTX 670 to really get good frames and look like it's supposed to.
You are underestimating the lack of processing power. You are also wrong, because plenty of games do use more than 15% or 25% of your CPU power, hence the ones using more than a single thread at 80-100%. Plenty of prime examples of the CPU bottlenecking the GPU in games prove that fact. Take a Phenom II X4 to BF3 or 4, limit it to two cores, and watch the FPS get cut in half.
Now: "According to developer DICE, BF4 already uses up to 95 percent of available CPU power on next-gen consoles."
"According to Frostbite technical director Johan Andersson, the game uses 90 to 95 percent of the available CPU power on the PS4 and Xbox One. You can check out an in-depth Q&A at AMD with him and other developers. While the next-gen consoles are clearly more powerful than the previous-generation hardware, Sony and Microsoft have decided to focus more on the GPU than the CPU. Both the PS4 and Xbox One have an 8-core CPU clocked at around 1.6 GHz, which doesn't sound like a lot. Furthermore, developers only have access to 6 CPU cores; the last two are reserved for the OS. While it's no surprise that a game as complex as Battlefield 4 uses almost all of the available CPU power, we're surprised that developers are already almost hitting the limit."
Also, all the FPS issues in games I'm talking about are from the lack of CPU processing power; not all FPS issues are from a lack of GPU power. Sudden changes in direction and lots of multiplayer action are prime examples of the CPU not being able to keep up with the data flow for the GPU. Like you said, graphics can be tweaked to fit the hardware, yet we still see FPS drops and unstable averages because of the CPU, not the GPU.
lol, ISS used all 4.5 GB on the PS4, and Killzone also used 3 GB for VRAM and 1.5 GB for the other game assets. Open-world games use more memory to cache game data, hence the lack of the best textures for Watch Dogs. These new consoles store data just like PCs do... and they have a shared memory pool, having to split the usage within less than 5 GB.
I know that. I was just reminding Wasdie that a good chunk of the available RAM has to go towards system memory.
So most likely the consoles won't use the best textures in Evil Within if the PC version really requires 4GB of vRAM.
Just like how Shadow of Mordor will probably use medium textures on consoles rather than high or ultra.
The PS4 uses 3 GB for the system, 500 MB that can go either way, and 4.5 GB for video; that's more than double what most 7870s and 660 Tis have, and 1.5 GB more than many 7950s.
This is one scenario I pictured a long time ago and hermits like you refuse to admit it: GPUs with 2 GB of RAM will suddenly run into trouble when games start demanding more. Suddenly GPUs used to running ultra at 1080p won't anymore; memory will be a limitation. And before you say anything, memory can be just as big a hit to performance as a lack of power.
And all of a sudden, cards like the 660 Ti and R9 270, which ran most games on ultra-quality textures at 1080p, suddenly can't...
Medium requires 2 GB, high 3 or more, ultra requires 6 GB... lol.
I called it ages ago...
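As a back-of-the-envelope aside on why texture memory balloons with resolution: the sketch below assumes a block-compressed format at roughly 1 byte per texel plus a full mip chain. Real games use many formats and texture counts, so these numbers are illustrative, not a match for any particular title's tiers.

```python
# Estimate the VRAM footprint of a single texture at a few resolutions. Doubling the
# resolution roughly quadruples the footprint, which is how an "ultra" texture pack can
# demand several times the VRAM of the medium set.
def texture_mb(res, bytes_per_texel=1.0, with_mips=True):
    base_bytes = res * res * bytes_per_texel
    return base_bytes * (4 / 3 if with_mips else 1) / (1024 ** 2)  # mip chain adds ~1/3

for res in (1024, 2048, 4096):
    print(f"{res}x{res}: ~{texture_mb(res):.1f} MB per texture")
```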
Nah, the PS4 uses 3.5 GB for the OS; the rest is split between system memory and video memory.
So the PS4 will most likely use medium textures if those settings are to be believed.
If the PS4 couldn't do ultra textures for Watch Dogs then it probably won't handle high textures for Shadow of Mordor.
If it's anything like Titanfall, the textures will look like crap even at the highest setting and there will be almost no visible difference between high and ultra.
Yes, 2 GB of VRAM won't be enough to max newer games at 1080p, but so far it seems that a GTX 680, for example, is bottlenecked by its power rather than a lack of VRAM, and that card is much faster than the PS4's GPU. So despite having more available memory, the PS4 lacks the power to run upcoming games at the equivalent of max settings on PC; it can't even run some existing games at max settings. I'm not saying there will never be cases where the PS4 could have an advantage over a PC with 2 GB of VRAM.
I wonder if it's like Ryse where the VRAM is just to run the game at 4K because 4GB is a hell of a lot for a game that looks like this and is pretty linear.
No wonder the PC is having problems getting games like GTA5, Bloodborne, and other major titles. It's just too weak and unstable compared to the PS4.
Hey, at least the new Roller Coaster Tycoon is coming to PC!!! Only... the graphics are now cartoony LOL (which isn't bad, just saying) LOL
No, it's not 3.5 GB just for the OS; that also includes system features, as already confirmed by the Infamous developer.
Oh, the problem with the PS4 is not RAM, it's power; the power needed wasn't there. On the 7870 there is more power, but RAM is a limiting factor. I called this ages ago and hermits denied it... hahah.
I even posted that screenshot showing how the 2 GB 7850 outdid the same card with 1 GB of RAM in Skyrim when you installed the HD textures; RAM was crippling the 1 GB model vs the 2 GB one on a card with the exact same power.
Shoulda waited for the 970, bruh! Considering how long ago the 700 series launched, we could easily expect the next generation in the 3rd quarter of this year. And it was totally worth the wait. I was going to upgrade to a 760 this year, but when I heard the rumors of the 800 series I literally waited months, and now I will be getting a card that is like 2 times better for a similar price (the 970, with 4 GB too).
What the ****?
Anyway, like I said in the other thread, there's no way in hell SoM needs a 3GB card to run at high settings. Not unless the PS4 version's running at 720p, or medium settings at best.
Face it, PC gaming sucks balls this gen compared to the PS4.
It has no games; all the big games are dangling in front of the watering mouths of PC gamers.
Sims, Coaster Tycoon, WoW, sims, tycoon, WoW.
You're trying too hard
Still I got a point.
No, you don't. Hence why you are trying too hard