Assassin's Creed PC requirements.lol


#51 RyviusARC
Member since 2011 • 5708 Posts

@tormentos said:

http://bbs2.ruliweb.daum.net/gaia/do...icleId=1545598

Assassin's Creed Unity's PC system requirements were revealed by Intra Games, Ubisoft's Korean distributor.

- OS: Windows® 7 SP1, Windows® 8/8.1 (64-bit only)

- CPU:

Minimum - Intel Core® i5-2500K @ 3.3 GHz or AMD FX-8350 @ 4.0 GHz or above

Recommended - Intel Core® i7-3770 @ 3.4 GHz or AMD FX-8350 @ 4.0 GHz or above

- RAM:

Minimum - 6GB or above

Recommended - 8GB or above

- GPU:

Minimum - NVIDIA GeForce® GTX 680 or AMD Radeon HD 7970 or above

Graphic Memory minimum - 2GB

Recommended - NVIDIA GeForce® GTX 780 or AMD Radeon R9 290X or above

Graphic Memory Recommended - 3 GB

- Sound Card: DX9.0c-compatible

- HDD: 50 GB or more of free space

- Multiplayer: 256 kbps Upload bandwidth or Higher

680GTX with FX8350 minimum.................lol

95% of PCs can't play this game..lol What say you, System Wars: does Ubi suck at optimizing on PC, or does the game really require those resources?

Do you really trust PC requirements?

The Evil Within said you needed 4GB of vRAM at 1080p yet the game barely uses 1.5GB of vRAM.

Shadow of Mordor said that you needed 6GB of vRAM at 1080p for Ultra textures yet the game used 3.4GB of vRAM for me at 1440p with Ultra textures.

The GTX 680 is more than twice the power of the PS4 so it should have no problems running the game at higher settings than consoles.

If the lowest settings are similar to the PS4 then anything with 2GB of vRAM that has the power of a GTX 570 or higher should run the game fine.
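For what it's worth, VRAM claims like these are easy to check yourself instead of trusting a spec sheet. A minimal sketch, assuming an NVIDIA card with the `nvidia-smi` tool on the PATH (the parsing helper is just for illustration):

```python
import subprocess

def parse_memory_used(csv_output: str) -> list:
    """Parse per-GPU 'memory.used' values (MiB) from nvidia-smi CSV lines like '1536 MiB'."""
    return [int(line.split()[0]) for line in csv_output.strip().splitlines()]

def query_vram_used() -> list:
    """Sample current VRAM usage per GPU; run this while the game is running."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_memory_used(out)
```

Polling `query_vram_used()` every few seconds during a play session gives you the real peak usage, which is where numbers like "barely 1.5GB" come from.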


#52  Edited By RyviusARC
Member since 2011 • 5708 Posts

@timster20 said:

@04dcarraher said:

After the string of games that have been coming out this year with inflated and false requirements, recommending i7's and AMD FX 8's then saying you *need* 3-4gb vram for high/max settings when in fact you don't need it. People shouldn't get their panties in a bunch, as well as console fanboys using these "requirements" as bait which is hilarious.... As for the latest blunder in pc requirements The Evil Within requiring 4gb of vram when in fact only uses 1.6GB of VRAM at 1080p with all its bells and whistles enabled.

Evil Within's vram reqs may have been exaggerated but even a 980 can't maintain 60 fps at 1080p.

That is just a limitation of the game engine.

At 30fps even weaker cards like the GTX 570 run the game fine at 1080p maxed.


#53  Edited By timster20
Member since 2014 • 399 Posts

@RyviusARC said:


That is just a limitation of the game engine.

At 30fps even weaker cards like the GTX 570 run the game fine at 1080p.

Do you have footage with framerate counter and without the black bars?


#54  Edited By RyviusARC
Member since 2011 • 5708 Posts

@timster20 said:


Do you have footage with framerate counter and without the black bars?

There are plenty of people who chimed in about their performance. And it seems that having a more powerful card does not help much in The Evil Within.

I play at 2560x1440.


#55 kipsta77
Member since 2012 • 1119 Posts

Bought AC4 day 1 on steam, maxed out without an issue, same thing for watch dogs.


#56 ShepardCommandr
Member since 2013 • 4939 Posts

I have 0 interest in ubisoft's trash games.


#57 timster20
Member since 2014 • 399 Posts

@RyviusARC said:


There are plenty of people who chimed in about their performance. And it seems that having a more powerful card does not help much in The Evil Within.

I play at 2560x1440.

Compare fps in these 2 vids to see that gpu does indeed make a difference.

https://www.youtube.com/watch?v=qvMbOaE-t28

https://www.youtube.com/watch?v=ht476_X5v4E


#58 KarateeeChop
Member since 2010 • 4666 Posts

what's the point of upgrading your pc every year just to play the newest games when you can play everything in maxed out settings by default on the Xbox One for the next 10 years?


#59 GamingElite021
Member since 2014 • 89 Posts

It means horribly optimized. I mean, such ridiculous requirements for this crap???


#60  Edited By lostrib
Member since 2009 • 49999 Posts

@KarateeeChop: not sure if serious...


#61  Edited By GamingElite021
Member since 2014 • 89 Posts

@KarateeeChop said:

what's the point of upgrading your pc every year just to play the newest games when you can play everything in maxed out settings by default on the Xbox One for the next 10 years?

0/10

try harder


#62  Edited By Snugenz
Member since 2006 • 13388 Posts

Ubisoft makes bad PC ports!!! More incredible revelations at 11.


#64  Edited By RyviusARC
Member since 2011 • 5708 Posts

@timster20 said:


Compare fps in these 2 vids to see that gpu does indeed make a difference.

https://www.youtube.com/watch?v=qvMbOaE-t28

https://www.youtube.com/watch?v=ht476_X5v4E

I am not saying there is no difference.

What I meant was that they both experience big drops in the same areas.

Also one video removed the black bars which does drop performance by a little while the other only plays in black bars.

When both are using black bars the performance between the GTX 770 and 980 is really close, which shouldn't be the case.

The GTX 770 was getting around 45-52fps (when he removed the black bars it was around 41fps) while the GTX 980 (black bars always on) was getting around 56-64fps (when not staring at the sky...).

Both had drops that were a little lower in some areas.

The GTX 980 is much more powerful than a 770 so the frame rate difference should be much larger.
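That intuition can be put into rough numbers. A back-of-the-envelope sketch; the ~1.6x throughput ratio for a GTX 980 over a GTX 770 is an assumption for illustration, not a benchmark:

```python
def expected_fps(baseline_fps: float, throughput_ratio: float) -> float:
    """Frame rate a faster card should reach if the game were purely GPU-bound."""
    return baseline_fps * throughput_ratio

gtx770_fps = 48.0                          # midpoint of the ~45-52 fps range above
gtx980_fps = 60.0                          # midpoint of the ~56-64 fps range above
predicted = expected_fps(gtx770_fps, 1.6)  # ~77 fps if scaling were ideal (ratio assumed)
shortfall = predicted - gtx980_fps         # gap between ideal scaling and observed fps
```

A shortfall that large relative to ideal GPU scaling is exactly what points at an engine or CPU bottleneck rather than the graphics card.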


#65  Edited By naz99
Member since 2002 • 2941 Posts

News at 10:

This just in! PC specs advance!

The world is shocked.

Also, it's pretty obvious those are not the real minimum specs; most of us would rather wait for official news than believe some random Korean website.


#66 GhoX
Member since 2006 • 6267 Posts

I don't trust this source of yours. The link doesn't work, and the site is completely unheard of.


#67  Edited By clyde46
Member since 2005 • 49061 Posts

@aroxx_ab said:

@DarthRamms said:

@kinectthedots said:

System requirements for PS4 and xbone to play the Unity....Buy the game, DONE.

900p and 30fps no tnx

With these requirements many hermits will have the same experience or worse lol

But I'll only be able to play this at 1080p max settings, and that is all that matters.


#68 RyviusARC
Member since 2011 • 5708 Posts

@clyde46 said:

@aroxx_ab said:

@DarthRamms said:

@kinectthedots said:

System requirements for PS4 and xbone to play the Unity....Buy the game, DONE.

900p and 30fps no tnx

With these requirements many hermits will have same experience or worse lol

But I'll only be able to play this at 1080p max settings, and that is all that matters.

I will be playing at 1440p max settings at 60fps.

Consoles are just too weak and outdated.


#69 clyde46
Member since 2005 • 49061 Posts

@RyviusARC said:

@clyde46 said:

@aroxx_ab said:

@DarthRamms said:

@kinectthedots said:

System requirements for PS4 and xbone to play the Unity....Buy the game, DONE.

900p and 30fps no tnx

With these requirements many hermits will have same experience or worse lol

But I'll only be able to play this at 1080p max settings, and that is all that matters.

I will be playing at 1440p max settings at 60fps.

Consoles are just too weak and outdated.

I find it funny that consolites love to use the argument of "bu bu bu only 10% of whatever can play it at X settings". Do they expect me to care that some schmuck somewhere in the world is still running Win95? As long as my PC can run things better than my PS4, then it's all gravy.


#70 WolfgarTheQuiet
Member since 2010 • 483 Posts

@bldgirsh: I've seen that vid. Look, let's put it this way: that test at 30 looks choppier than most last-gen console games at 30. Look, I don't want to argue because I don't care about this stuff that much. Yes, most FPS games benefit from 60, especially fast-paced online shooters, but some games just don't need it.


#71 millerlight89
Member since 2007 • 18658 Posts

This will be a rough gen for PC gaming. This will cause poor sales on PC as most don't have the hardware to keep up. Seems the only reason to upgrade anymore is to play console ports that are unoptimized. I don't blame Ubi tho. More devs will follow.


#72  Edited By dommeus
Member since 2004 • 9433 Posts

lol...so I can't run this game?


#73 ghostwarrior786
Member since 2005 • 5811 Posts

And the reason why consoles are so popular becomes evident lololol mustard race get owned


#74 mariokart64fan
Member since 2003 • 20828 Posts

Only the rich and greedy will even be able to afford PC gaming lol, which is why I'll stick to consoles. Also, I don't have to deal with that clunky control method lol


#75  Edited By timster20
Member since 2014 • 399 Posts

@RyviusARC said:


I am not saying there is no difference.

What I meant was that they both experience big drops in the same areas.

Also one video removed the black bars which does drop performance by a little while the other only plays in black bars.

When both are using black bars the performance between the GTX 770 and 980 is really close, which shouldn't be the case.

The GTX 770 was getting around 45-52fps (when he removed the black bars it was around 41fps) while the GTX 980 (black bars always on) was getting around 56-64fps (when not staring at the sky...).

Both had drops that were a little lower in some areas.

The GTX 980 is much more powerful than a 770 so the frame rate difference should be much larger.

Here's a vid with the 970 with no black bars, and it gets 10-15 fps more than the 770, which is in line with 1080p review benchmarks.

https://www.youtube.com/watch?v=8V8-S6B8shg


#76 REKThard
Member since 2014 • 479 Posts

It's Ubi.
Besides, all PC reqs these days are exaggerated anyway.


#77 naz99
Member since 2002 • 2941 Posts

@millerlight89: yeah, that unoptimized console port Elite Dangerous just runs awful on 3 screens at 6000x1080 @ 60 fps. Truly a lesser console experience.

You really hit the nail on the head. *sarcasm*


#78  Edited By naz99
Member since 2002 • 2941 Posts

@mariokart64fan:

You mean that Xbox 360 pad I use on my PC, the steering wheel I use for my racing games, the joystick I use for my space and flight sims, and the keyboard and mouse I use for my first-person shooters?

Yep, having a choice of control methods is truly a clunky experience. *sarcasm again*

As usual, more weak arguments from the consolites.


#79  Edited By DEadliNE-Zero0
Member since 2014 • 6607 Posts

The same AMD CPU can run the game at minimum AND recommended, but i need a different Intel CPU?

And the i5-2500K is minimum, yet it destroys the shitty chip in consoles?

And 680, which is far above the console's cards, is minimum, yet recommended only jumps to a 780?

This game should be running at 900p-720p, medium settings on consoles while having a crap frame rate. Sure, Ubisoft.

Just like I needed 3GB of VRAM for ultra textures on Watch Dogs and Shadow of Mordor.

Just like I needed 4GB of VRAM to play TEW at 1080p. Get bent.

@tormentos said:

95% of PCs can't play this game..lol What say you, System Wars: does Ubi suck at optimizing on PC, or does the game really require those resources?

If you honestly believe for 1 second a mid range GPU/CPU like a 760/i5-2500k can't play this game at 1080p, ultra settings, you're gonna be disappointed.


#80  Edited By Ten_Pints
Member since 2014 • 4072 Posts

@deadline-zero0 said:

The same AMD CPU can run the game at minimum AND recommended, but i need a different Intel CPU?


I just hope I can play it on my HD7850 1GB card. I don't plan on upgrading until AMD comes out with their <28NM cards.


#81 DEadliNE-Zero0
Member since 2014 • 6607 Posts

@ten_pints said:


I just hope I can play it on my HD7850 1GB card. I don't plan on upgrading until AMD comes out with their <28NM cards.

That card's pretty weak, to be fair. It's a 1GB model.

If you have a good chip and a compatible mobo, you can probably just get an HD 7950. I don't know much about AMD, but that one seems to be closest to the 760, albeit about 10% slower.


#82 mikhail
Member since 2003 • 2697 Posts

@kinectthedots said:

Hermits are having to update and change out parts almost game by game now it seems

Ah, the classic console peasant argument..."LMFAO hermits buy $10,000 gaming PCs and have to spend $500 four times a year just to play games at the same settings as PS4! Stupid virgin hermits living in their moms basement, LOL!"

Seriously...go back and read what you just wrote a couple of times, then come back and tell me that you were high or something when you posted it. I'll believe that.


#83  Edited By RyviusARC
Member since 2011 • 5708 Posts

@timster20 said:


Here's a vid with the 970 with no black bars, and it gets 10-15 fps more than the 770, which is in line with 1080p review benchmarks.

https://www.youtube.com/watch?v=8V8-S6B8shg

10-15 fps more is laughable when the GTX 970 is almost a 780ti on drivers that are not mature.

Actually the GTX 970 performs better than a 780ti in this game I think.

The game seems to favor Maxwell GPUs.


#84 deactivated-57ad0e5285d73
Member since 2009 • 21398 Posts

What I say is: when are all these fucking rehashes going to stop? It's as if there are no ideas anymore. X to Y genre. No new ones. Here is the fifth yearly TPS.


#85  Edited By Ben-Buja
Member since 2011 • 2809 Posts

@timster20 said:


Evil Within's vram reqs may have been exaggerated but even a 980 can't maintain 60 fps at 1080p.

But it can handle 30 fps @ 4K. The game is simply not optimised to run at a stable 60 fps, and we can thank the consoles for that.

Besides, the console version can't maintain 30 fps; it regularly dips into the low 20s, while a 750ti runs it at a more stable 30 fps.

@millerlight89 said:

This will be a rough gen for pc gaming. This will cause poor sales for pc as most dnt have the hardware to keep up. Seems the only reason to upgrade anymore is to play console ports that are unoptimized. I don't blame ubi tho. More devs will follow.

And not for console gaming? lol More and more games start to drop below 1080p to maintain "stable" 30 fps and you're stuck with these shit settings. At least on PC you have the option to upgrade to something that will handle the games far better.

So far a 750ti has proven to handle most games better than the PS4s GPU, I don't think that will change anytime soon.


#86 gpuking
Member since 2004 • 3914 Posts

So how many hermits can run this at 1080p/60 or 4k at max settings? Shit I bet 99% of you don't even meet the minimum requirement, but you folks keep dumping the cash upgrading tho. On a side note the game doesn't even look that hot. The programmers at ubi need to get a reality check.


#87 clyde46
Member since 2005 • 49061 Posts

@gpuking said:

So how many hermits can run this at 1080p/60 or 4k at max settings? Shit I bet 99% of you don't even meet the minimum requirement, but you folks keep dumping the cash upgrading tho. On a side note the game doesn't even look that hot. The programmers at ubi need to get a reality check.

I can and thats all that matters.


#88  Edited By Ben-Buja
Member since 2011 • 2809 Posts

@gpuking said:

So how many hermits can run this at 1080p/60 or 4k at max settings? Shit I bet 99% of you don't even meet the minimum requirement, but you folks keep dumping the cash upgrading tho. On a side note the game doesn't even look that hot. The programmers at ubi need to get a reality check.

How many consoles gamers will play this at 1080 and 30 fps?

0

zilch

nada

lol


#89 rogelio22
Member since 2006 • 2477 Posts

I just received my 2 970s on Monday... and they work great with all my games on my new 4K TV. But I'm still hesitant to get this on PC after what happened with AC3 and Watch Dogs (stutterfest), still to this day even on the lowest settings and resolution! So I'll probably just be settling for 900p/30fps on PS4 lol :p


#90  Edited By rogerjak
Member since 2004 • 14950 Posts

@wolverine4262 said:

The fx 8350 is pretty damn cheap, but it being min and recommended is confusing.

Anyway, my 970 is ready.

I came here to say the same.
AMD FX-8350 for minimum and recommended? I call bullshit with a side of coolwhip


#91  Edited By legendofsense
Member since 2013 • 320 Posts

@KarateeeChop said:

what's the point of upgrading your pc every year just to play the newest games when you can play everything in maxed out settings by default on the Xbox One for the next 10 years?

you just went full retard brother.


#92  Edited By uninspiredcup
Member since 2013 • 62847 Posts

I believe in Ubisoft. The Assassin's Creed series has proven to be the greatest franchise humanity has ever witnessed. Will the evil Templars win? This could be the game (or possibly the next one) to finally lay the cards down.


#93 AutoPilotOn
Member since 2010 • 8655 Posts

Well my 4 year old 2500k is finally making it to min requirements huh?


#94 gpuking
Member since 2004 • 3914 Posts

@Ben-Buja

Actually, if it weren't for the parity, at least the PS4 could. Maybe not at max settings, but all High would still be decent.


#95 LustForSoul
Member since 2011 • 6404 Posts

That's a high recommendation, hopefully meaning something nice and optimized. (no)


#96 PurpleMan5000
Member since 2011 • 10531 Posts

@mr_huggles_dog said:

And to think...I got in an argument with someone here who was saying the best versions of multiplats are on PC.

Yet...this type of thing happens non stop.

This is the whole flaw that the herms don't seem to accept. Sure...your games run at better quality....but the majority of PC gamers don't see that quality.....b/c of the non universal system.

So the herms don't get butthurt by my post: I already know the 2 sides: Buy the console version > get lower quality - Buy the PC version > get higher quality...but at the cost of an arm and a leg.

So in this case no one wins.

It would be a whole lot cheaper for me to just upgrade my pc to the recommended specs than it would be to purchase a PS4. I think I will wait a year or so to upgrade, though.


#97 tormentos
Member since 2003 • 33793 Posts

@GoldenElementXL said:

Bad optimization or not I will max it out at 1440p. It's good to be able to build a rig with whatever specs I want instead of letting the devs decide what resolution and frame rate I will be playing at.

Yeah, Uncharted 4 would look incredible in 1440p on PC..oh wait..

Stop your damage control, I wasn't attacking PC, which you don't care about except for hiding purposes..lol

@DarthRamms said:

ok whatever, considering 10 million people own GTX 680-class GPUs that can outperform the PS4, and that's only one class from Nvidia.

This isn't a PS4 vs PC thread bro.

@zeeshanhaider said:

And lowest graphics settings along with framerate drops to 20 FPS, with low draw distance, no AA/AF, screen tearing, input lag, texture pop in.

Draw distance isn't a problem this gen like it was last gen; the PS4 and Xbox One have memory to spare. I would worry more about 1GB cards than about the PS4.

Also, I think you described the Xbox One version..hahaha

@04dcarraher said:

After the string of games that have been coming out this year with inflated and false requirements, recommending i7's and AMD FX 8's then saying you *need* 3-4gb vram for high/max settings when in fact you don't need it. People shouldn't get their panties in a bunch, as well as console fanboys using these "requirements" as bait which is hilarious.... As for the latest blunder in pc requirements The Evil Within requiring 4gb of vram when in fact only uses 1.6GB of VRAM at 1080p with all its bells and whistles enabled.

We already discussed this: your GPU can't be assigned more RAM than it has, your tests were flawed, and DF actually confirmed Mordor used a bit over 5GB in their test. Not quite 6, but 5GB is a lot more than most GPUs out there have.

If your GPU has 2GB, yeah, that is what it will use; if it has more, it will use more. Requirements adjust on PC. Now, before you come with your huge damage control like always: I do think Ubi sucks at optimizing, so it may be a problem with them, and I don't think this game, from how it looks, requires those kinds of specs, especially when the game is made for the PS4 and XBO, which have weak CPUs compared to stronger PC ones.

@timster20 said:

Evil Within's vram reqs may have been exaggerated but even a 980 can't maintain 60 fps at 1080p.

Blame the developers; the game is highly unoptimized. COD: Ghosts was the same last year, it had problems even on PC when the game's requirements aren't that high.

@Jankarcop said:

According to the Nvidia CEO and the Steam survey, more PCs than the number of PS4s sold will.

This game is factually best on PC as it looks and runs better on it.

Yeah, I'm sure he will say any crap, especially when no console uses their overpriced GPUs..

Fact is, according to Steam, the huge majority will not; grouping all PCs under one powerful one is a joke.


#98 tormentos
Member since 2003 • 33793 Posts

@RyviusARC said:

Do you really trust PC requirements?

The Evil Within said you needed 4GB of vRAM at 1080p yet the game barely uses 1.5GB of vRAM.

Shadow of Mordor said that you needed 6GB of vRAM at 1080p for Ultra textures yet the game used 3.4GB of vRAM for me at 1440p with Ultra textures.

The GTX 680 is more than twice the power of the PS4 so it should have no problems running the game at higher settings than consoles.

If the lowest settings are similar to the PS4 then anything with 2GB of vRAM that has the power of a GTX 570 or higher should run the game fine.

So without further ado, we present a selection of comparisons of the game's opening scenes, captured at medium, high and ultra texture settings with all other settings ramped up as high as they go. Monolith recommends a 6GB GPU for the highest possible quality level - and we found that at both 1080p and 2560x1440 resolutions, the game's art ate up between 5.4 to 5.6GB of onboard GDDR5. Meanwhile, the high setting utilises 2.8GB to 3GB, while medium is designed for the majority of gaming GPUs out there, occupying around 1.8GB of video RAM.

http://www.eurogamer.net/articles/digitalfoundry-2014-eyes-on-with-pc-shadow-of-mordors-6gb-textures

So who is lying, you or DF?


#100 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos said:

@GoldenElementXL said:

Bad optimization or not I will max it out at 1440p. It's good to be able to build a rig with whatever specs I want instead of letting the devs decide what resolution and frame rate I will be playing at.

Yeah, Uncharted 4 would look incredible in 1440p on PC..oh wait..

Stop your damage control, I wasn't attacking PC, which you don't care about except for hiding purposes..lol

Good thing I also own a PS4. And you're right, Uncharted will look great at whatever resolution and frame rate Naughty Dog decides is best for us.

And what on earth am I damage controlling? Do you know what "damage control" means? And "hiding purposes"? You're taking the System Wars game to a dangerously insane level.