PC's VRAM issues rear their ugly head once again in Jedi Survivor. Worst performance to date?


#1  Edited By above_average
Member since 2021 • 1956 Posts

Too many people have fallen back on a basic power = performance metric to predict what their chosen PC or console will achieve this gen, assuming the move to current-generation development wouldn't mean anything.

Clearly, that mentality is flawed and may have led them to place bets on systems that will fail to produce the results they expected.

Michael Thomas (NX Gamer), being a senior advanced engineer himself, predicted this outcome for future games on PC, and it appears he was correct (his breakdown is below).

Is this the worst PC performance we've seen so far due to VRAM issues?

  • Jedi Survivor uses UE4, much like Hogwarts Legacy and Redfall (the latter of which MS devs are trying to get above 30fps)
  • SPECS: RTX 4090, AMD Ryzen 9 5900X, 32GB RAM, running at 1440p
  • The 4090 was getting 18GB of VRAM allocated in one scene
  • Average performance with this setup was between 30 and 45fps; it cannot maintain 60fps at 1440p
  • The game is extremely CPU limited
  • AMD GPUs perform better than Nvidia GPUs due to lower CPU overhead
  • FSR 2.0 does not help performance
  • 12GB of VRAM isn't enough for 4K gaming (a rough budget sketch follows this list)
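For context, here's a back-of-the-envelope sketch of where a number like that can come from. Every figure below (texture count, formats, buffer sizes) is an assumption for illustration, not data pulled from the game:

```python
# Rough, illustrative VRAM budget at 4K. All sizes are assumptions,
# not measurements from Jedi Survivor.

def texture_mb(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    """Approximate size of a mipmapped, block-compressed texture in MB."""
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

def render_target_mb(width, height, bytes_per_pixel=8, count=6):
    """Approximate size of a set of full-screen render targets in MB."""
    return width * height * bytes_per_pixel * count / (1024 ** 2)

# Hypothetical scene: a few hundred unique 4K textures resident at once.
textures = 450 * texture_mb(4096, 4096)   # ~450 unique 4K BC7-style textures
targets = render_target_mb(3840, 2160)    # 4K render targets
misc = 2048                               # meshes, shaders, driver overhead (guess)

print(f"textures       ~ {textures:,.0f} MB")
print(f"render targets ~ {targets:,.0f} MB")
print(f"other          ~ {misc:,.0f} MB")
print(f"total          ~ {(textures + targets + misc) / 1024:.1f} GB")
```

With those made-up numbers you already land around 12GB, which is the point: the budget fills up fast once texture quality is pushed.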

All the traditional theories and conventional calculations used to determine what performance a game should get based on GPU power (TFLOPs) alone are being thrown into disarray.

It's going to be really interesting to see what resolution and performance targets the PS5 and XSX are able to hit, seeing that even a 4090 was dropping to 33fps at 1440p.

EXTRA:

NXG's comments on this VRAM issue on PC. NOTE: He concluded that PC GPUs should have 32GB of VRAM to resolve these problems, but I wasn't able to pinpoint the specific video where he mentioned this.

Starting @17:26

@18:20

PC: RX 6800 16GB / Zen 3 5600X 6c/12t @ 4.8GHz / 32GB DDR4 / 512GB PCIe 4.0 SSD (6GB/s)

"The demands from console to PC are going to significantly raise over the course of this generation, and only this year are we starting to get into games that are really starting to use the underline architecture and infrastructure that these demands put on the PC space. As such, It also has a high demand on memory allocation"

Starting @9:01

"The main impact I believe is 2 fold...the PS5 has a dedicated I/O management block, DMA and decompression hardware. That means it can shift data; specifically textures but also animations, sound effects ext. without any CPU cycles being needed and get it on screen...

As this data management, decompression, animation streaming, path finding all of these impact the CPU,which (PC) has no such dedicated hardware and the API and additional overhead. Now in addition, this is where the memory ramp up on the PC. It has to pull all the required data into system ram from the harddrive and copy this back and forth between the VRAM and system ram pools...

And it means the PC demands much larger heaps than console"
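To make the copy chain he's describing concrete, here's a toy model of the two paths. All the bandwidth numbers are placeholder guesses, not benchmarks:

```python
# Toy model of the streaming path NXG describes. Bandwidth figures are
# round illustrative guesses, not measurements.

def stream_time_seconds(asset_gb, path):
    """Sum the time each hop in the path takes for an asset of asset_gb GB."""
    return sum(asset_gb / gb_per_s for gb_per_s in path)

# PC path: SSD read -> CPU decompression -> PCIe copy from system RAM to VRAM.
pc_path = [5.0,   # NVMe SSD read, GB/s
           3.0,   # CPU-side decompression throughput, GB/s (guess)
           12.0]  # PCIe copy into VRAM, GB/s

# Console-style path: SSD read feeding dedicated decompression hardware,
# landing directly in the unified memory pool.
console_path = [5.0,   # SSD read, GB/s
                8.0]   # hardware decompressor output, GB/s (guess)

asset = 2.0  # GB of texture data for a scene
print(f"PC path:      {stream_time_seconds(asset, pc_path):.2f} s, "
      f"plus a staging copy held in system RAM")
print(f"Console path: {stream_time_seconds(asset, console_path):.2f} s, "
      f"no extra copy in a separate RAM pool")
```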

🫣 I'll wait for @davillain to pop up again and tell everyone everything is fine to cope with his 8GB 3070 Ti


#2  Edited By SecretPolice
Member since 2007 • 45753 Posts

The Pee Salty Seas will be very turbulent and stormy this time of year. lol :P

But there is good news...

https://www.gamespot.com/articles/tlou-gets-massive-25gb-patch-on-pc-with-lots-of-fixes-and-improvements/1100-6513631/

TLOU Gets Massive 25GB Patch On PC With Lots Of Fixes And Improvements

Version 1.0.4.0 should help PC players who are experiencing performance problems.


#3 Pedro  Online
Member since 2002 • 74088 Posts

Another vomiting of ignorance and subscribing to secret sauce. How can someone be this stupid? 😂


#4 hardwenzen
Member since 2005 • 42366 Posts

This is simply pathetic at this point. No wonder the UK doesn't want them to acquire Activision... they wouldn't even be able to run their games.


#5  Edited By above_average
Member since 2021 • 1956 Posts
@Pedro said:

Another vomiting of ignorance and subscribing to secret sauce. How can someone be this stupid? 😂

You should really take a time-out.

Shitposting, insults, and never being on topic have a limit, and you cross that line way too often.


#6 SecretPolice
Member since 2007 • 45753 Posts

^^^^

lol :P


#7 Pedro  Online
Member since 2002 • 74088 Posts

@above_average said:

You should really take a time-out.

Shitposting, insults, and never being on topic have a limit, and you cross that line way too often.

You should take a break from reading into nonsense. 😎


#8 deactivated-6717e99227ada
Member since 2022 • 3866 Posts

PS5 has just too many tools!


#9  Edited By above_average
Member since 2021 • 1956 Posts
@Pedro said:
@above_average said:

You should really take a time-out.

Shitposting, insults, and never being on topic have a limit, and you cross that line way too often.

You should take a break from reading into nonsense. 😎

You haven't even had time to review the information.

Stop shit posting and GTFO the thread if you can't post on topic. You don't contribute anything to the board with your silly troll behavior nor does your childish antics come off as cute.

Go make a thread on your own and post all the stupid one liners you can think of in that thread.


#10 R4gn4r0k
Member since 2004 • 49184 Posts

I wonder why games truly need more than 16GB of RAM and 12GB of VRAM, besides being unoptimized?


#11 Miquella
Member since 2022 • 1160 Posts

PC gamers are getting fed up with one shitty port after another.


#12 Pedro  Online
Member since 2002 • 74088 Posts

@above_average said:

You haven't even had time to review the information.

Stop shitposting and GTFO of the thread if you can't post on topic. You don't contribute anything to the board with your silly troll behavior, nor do your childish antics come off as cute.

Go make a thread of your own and post all the stupid one-liners you can think of there.

You are peddling nonsense and falsehoods based on speculation. Then you try to pass them off as fact or proof, but they are not. Use some critical thinking when evaluating your sources instead of regurgitating them and coming to even wilder conclusions.


#13 Pedro  Online
Member since 2002 • 74088 Posts

@R4gn4r0k said:

I wonder why games truly need more than 16GB of RAM and 12GB of VRAM, besides being unoptimized?

To add to that, the amount of RAM available to games on current consoles is roughly 10-13GB combined.


#14 above_average
Member since 2021 • 1956 Posts

@Pedro said:
@above_average said:

You haven't even had time to review the information.

Stop shitposting and GTFO of the thread if you can't post on topic. You don't contribute anything to the board with your silly troll behavior, nor do your childish antics come off as cute.

Go make a thread of your own and post all the stupid one-liners you can think of there.

You are peddling nonsense and falsehoods based on speculation. Then you try to pass them off as fact or proof, but they are not. Use some critical thinking when evaluating your sources instead of regurgitating them and coming to even wilder conclusions.

You can't determine whether something is false when you yourself can't break down or explain ANYTHING that's going on here with these games, other than your simple-minded, brain-dead response of "duh, it's jus da bad portz"!

It's OK to say you don't believe the conclusions of an individual who has far more experience with hardware, engineering, and software development than you ever will.

But don't bring your overinflated opinion here acting as if you, with zero understanding of how any of this works, can debunk someone who has actually worked in the field for over 20 years.


#15  Edited By above_average
Member since 2021 • 1956 Posts

@Pedro: Furthermore, my adding NXG's take on the VRAM issue to the thread is just some additional information for anyone to take or leave.

The primary meat of the thread is the first video, which simply reveals, as a matter of fact, severe VRAM issues on PC.

No one has to get bent out of shape and act like a complete idiot because of the additional content that was added to the topic.


#16  Edited By R4gn4r0k
Member since 2004 • 49184 Posts
@Pedro said:
@R4gn4r0k said:

I wonder why games truly need more than 16GB of RAM and 12GB of VRAM, besides being unoptimized?

To add to that, the amount of RAM available to games on current consoles is roughly 10-13GB combined.

Yup, it's ridiculous.

OK, so 16GB has been fine in the PC space for years; now 32GB is recommended for some very demanding games, but let's be honest here, for some very unoptimized games for the most part.

My GTX 1080 Ti had 11GB of VRAM and it was fine for so many years. Now my 3080 Ti is an upgrade with 12GB of VRAM, yet every year the VRAM requirements seem to double. 24GB of VRAM on a 3090 seemed like overkill a couple of years ago.

What's clear to me is that they have this unoptimized software just draining resources, not using them correctly or efficiently enough, and a "let's just roll with it" attitude.

Take The Last of Us, a recent example of a bad PC port: it had the CPU handle texture streaming, and that central unit was constantly being taxed for no good reason.

Poor showing.


#17  Edited By above_average
Member since 2021 • 1956 Posts
@miquella said:

PC gamers are getting fed up with one shitty port after another.

To be fair, the console versions aren't running flawlessly either.

I was disappointed to hear in IGN's review that the game on PS5 couldn't hold 60fps in performance mode.

In light of how it's performing on PCs several times more powerful, I'd like to see what resolution target the consoles are able to land on.


#18 deactivated-6717e99227ada
Member since 2022 • 3866 Posts

Definitely a bad port; it seems like we're going back in time, with little effort being put into secondary versions.

If, as a software engineer, your solution is to brute-force through hardware, that in itself is the definition of a bad port.


#19 above_average
Member since 2021 • 1956 Posts

@kathaariancode said:

Definitely a bad port; it seems like we're going back in time, with little effort being put into secondary versions.

If, as a software engineer, your solution is to brute-force through hardware, that in itself is the definition of a bad port.

Here's the problem with the lazy "bad port" response. A bad port of what???

None of the versions run flawlessly. So what is the so-called lead platform that every other version is supposed to be a "port" of???

Don't say PS5 because it seems to run worse than the Xbox Series X version. So, do you think XSX is the lead platform?


#20 Mesome713
Member since 2019 • 7272 Posts

That's just a PC issue; PC always sucks. Some things will never change.


#21  Edited By above_average
Member since 2021 • 1956 Posts
@mesome713 said:

That's just a PC issue; PC always sucks. Some things will never change.

People here convinced themselves that the new generation of consoles wouldn't mean anything for the development of games going forward and bought PC hardware in accordance with that belief.

Now that developers have started to build around current-gen console architecture, the resident hardware tech experts are baffled and confused.

The straight-line power = performance metric everyone defaulted to (DF are also guilty of this) to estimate console capability relative to PC hardware has constantly been shifted up. (DF originally estimated the PS5 was only on par with a 2060 Super, lol.)


#22 Fedor
Member since 2015 • 11830 Posts

Obvious garbage optimization. Won't get better until people stop pre-ordering and buying shit ports day 1.


#23 deactivated-6717e99227ada
Member since 2022 • 3866 Posts

@above_average: Read "version" instead of "port" if you like. But PC does look like it's going back in time.

Unless you are trying to tell me that it's impossible for modern PCs to run this game well, a game that fails to run well on said hardware is by definition badly developed.


#24  Edited By above_average
Member since 2021 • 1956 Posts
@kathaariancode said:

@above_average: Read "version" instead of "port" if you like. But PC does look like it's going back in time.

Unless you are trying to tell me that it's impossible for modern PCs to run this game well, a game that fails to run well on said hardware is by definition badly developed.

I'm not saying that. I don't have the answers, and I'm not going to pretend I do. I will, however, listen to people who are experienced in software development and can offer answers to the possible situations we are looking at.

..................................................................................

NXG:

"The main impact I believe is 2 fold...the PS5 has a dedicated I/O management block, DMA and decompression hardware. That means it can shift data; specifically textures but also animations, sound effects ext. without any CPU cycles being needed and get it on screen...

As this data management, decompression, animation streaming, path finding all of these impact the CPU,which (PC) has no such dedicated hardware and the API and additional overhead. Now in addition, this is where the memory ramp up on the PC. It has to pull all the required data into system ram from the harddrive and copy this back and forth between the VRAM and system ram pools...

And it means the PC demands much larger heaps than console"

......................................................................................

Now, everyone can default to the "lazy development" response if they choose. But given the results of almost all recent games, that type of response isn't going to resolve any of the issues we're seeing here.

To my understanding, both the PS5 and XSX have custom hardware to help better manage data, which standard PCs do not.

Instead of stubbornly saying power = performance like many have done, it may be time to concede that if games are designed around current-gen consoles, and their architecture lets developers effectively use 16GB of RAM or more thanks to superior data management and stress taken off the CPU, that would explain how consoles are able to perform several levels above a similar PC counterpart.


#25 PCLover1980
Member since 2022 • 1753 Posts

It's a garbage port, plain and simple.


#26 Pedro  Online
Member since 2002 • 74088 Posts

@above_average said:
@kathaariancode said:

@above_average: Read "version" instead of "port" if you like. But PC does look like it's going back in time.

Unless you are trying to tell me that it's impossible for modern PCs to run this game well, a game that fails to run well on said hardware is by definition badly developed.

I'm not saying that. I don't have the answers, and I'm not going to pretend I do. I will, however, listen to people who are experienced in software development and can offer answers to the possible situations we are looking at.

..................................................................................

NXG:

"The main impact I believe is 2 fold...the PS5 has a dedicated I/O management block, DMA and decompression hardware. That means it can shift data; specifically textures but also animations, sound effects ext. without any CPU cycles being needed and get it on screen...

As this data management, decompression, animation streaming, path finding all of these impact the CPU,which (PC) has no such dedicated hardware and the API and additional overhead. Now in addition, this is where the memory ramp up on the PC. It has to pull all the required data into system ram from the harddrive and copy this back and forth between the VRAM and system ram pools...

And it means the PC demands much larger heaps than console"

......................................................................................

Now, everyone can default to the "lazy development" response if they choose. But given the results of almost all recent games, that type of response isn't going to resolve any of the issues we're seeing here.

To my understanding, both the PS5 and XSX have custom hardware to help better manage data, which standard PCs do not.

Instead of stubbornly saying power = performance like many have done, it may be time to concede that if games are designed around current-gen consoles, and their architecture lets developers effectively use 16GB of RAM or more thanks to superior data management and stress taken off the CPU, that would explain how consoles are able to perform several levels above a similar PC counterpart.

I know this would be useless but I am going to make a concerted effort.

The first note is that data management is software, not hardware. Secondly, the claim that a console performs several levels above a similarly specced PC is false.

Consoles use unified memory solutions, while PCs (outside of Apple's M1/M2) have separate memory pools. The advantage of unified memory is that there is no need for data to be copied between pools. The claim that

"All of this data management, decompression, animation streaming, and pathfinding impacts the CPU, which on PC has no such dedicated hardware and also carries API and additional overhead" is also false.

When streaming data on PC without direct access from storage to VRAM, system RAM is used as the buffer between the two resources. This does not equate to a VRAM issue. It simply means that the method of transferring data between pools is different, and on PC a buffer in system memory is required. Note that system memory is faster than an SSD. Games on PC have the ability to allocate more VRAM if it is available. This is not an uncommon practice. If the 4090 is using 18GB of VRAM, it is just bulk allocating because it has the capacity. If the game actually required that much memory to run, it would not be able to run on any console.

Software that runs poorly on stronger hardware is unoptimized software. All of the things you are talking about are software related, not hardware. Software utilizes hardware. Poor utilization of hardware equates to poor performance. It is that simple.
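For what it's worth, that kind of bulk allocation is usually just a budget heuristic along these lines (a minimal sketch with invented numbers, not the game's actual code):

```python
# Sketch of "allocate more because it's there": a streaming pool sized from
# whatever VRAM is free, so a big card reports a huge allocation without
# actually needing it. All budgets are invented for illustration.

def texture_pool_budget_gb(total_vram_gb, reserved_gb=2.0,
                           min_pool_gb=4.0, fraction=0.8):
    """Size the streaming pool as a fraction of free VRAM, never below a floor."""
    free = max(total_vram_gb - reserved_gb, 0.0)
    return max(min_pool_gb, free * fraction)

for card, vram in [("8GB card", 8), ("12GB card", 12), ("24GB card", 24)]:
    pool = texture_pool_budget_gb(vram)
    print(f"{card}: streaming pool ~ {pool:.1f} GB "
          f"(allocation scales with capacity, not with what the scene needs)")
```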


#27 deactivated-6717e99227ada
Member since 2022 • 3866 Posts

@above_average: If I were seeing anything that goes beyond what I've seen before, then I would agree. But we're not.

Take ray tracing: Nvidia's hardware solution makes it impossible to deliver the same results on AMD hardware. You can't really blame developers for that, but if Cyberpunk were forced to run at 3fps just to run the "Nvidia version" of Cyberpunk, that would be 100% a bad port/version, because the job of a software developer is to find software solutions for whichever hardware limitations they are working with. For instance, using SSR offers immense gains while delivering excellent results.

I can give you a very basic personal story. I wanted to implement parallax into a project I was working on. Like any good developer, my immediate solution was to search the web for some already created and tested code, but upon implementation the game became a stuttering mess. Nothing really worked, so I had to understand the issue, study it, and develop the code myself, and now it runs flawlessly. Ultimately the issue is always on the hardware side, because if you had unlimited power you could brute-force almost anything, but the realistic solution must come from the software side.
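For anyone curious, the core of basic parallax mapping is only a few lines of offset math. This is a simplified CPU-side sketch of the standard formula, not his actual shader code; the height scale and sample values are arbitrary:

```python
# Basic parallax-mapping offset as a plain function. In a real game this
# lives in the fragment shader; here it just illustrates the math.

def parallax_offset_uv(uv, view_dir, height, height_scale=0.05):
    """Shift texture coordinates along the view direction based on sampled height."""
    ux, uy = uv
    vx, vy, vz = view_dir          # tangent-space view vector, vz > 0
    offset_x = vx / vz * height * height_scale
    offset_y = vy / vz * height * height_scale
    return (ux - offset_x, uy - offset_y)

# Grazing angles (small vz) produce large offsets, which is exactly where
# naive implementations start to swim or get expensive.
print(parallax_offset_uv((0.5, 0.5), (0.3, 0.1, 0.9), height=0.8))
print(parallax_offset_uv((0.5, 0.5), (0.7, 0.1, 0.2), height=0.8))
```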

If versions are being made to take advantage of certain hardware specifications, then solutions must be found to circumvent those limitations elsewhere. I mean, it's not unheard of; it's pretty standard, actually. You cut what you need to cut to make sure a version runs well; every system has its strengths and weaknesses.

And btw, I hate the "lazy developer" accusation. Developers for the most part are overworked to the bone and are not given the necessary time and conditions to deliver a completely satisfactory job. It's not their call to say when a game is done.


#28  Edited By Zaryia
Member since 2016 • 21607 Posts

PC versions eventually get fixed, almost always.

Console versions remain at 30fps on medium, forever.

@mesome713 said:

That's just a PC issue; PC always sucks. Some things will never change.

By "always sucks" you mean "always wins SW."


#29  Edited By PC_Rocks
Member since 2018 • 8611 Posts

Can someone tell me what "senior advanced engineer" means? Also, LOL NXCow.


#30 NfamousLegend
Member since 2016 • 1037 Posts

Guys, all games will get patched; it's just the way it is now. STOP being surprised by shitty performance from games on day one: publishers have deadlines, developers are lazy, etc. Also, anyone buying a new graphics card with less than 16GB of VRAM and expecting smooth performance at max settings and resolution will be disappointed.


#31 DaVillain  Moderator
Member since 2014 • 58798 Posts

The day-one patch is said to fix the problem when it releases tomorrow for PC. Once I've played it myself, I'll report back on how the game fares on my PC hardware.


#33 Juub1990
Member since 2013 • 12622 Posts

Is this why it's taking 15 seconds to load a bunch of blurry textures here?


Dat decompression at work.

And TLOU has been massively improved with the latest patch.

  • Optimized CPU and GPU use throughout the game
  • Improved texture fidelity and resolution on in-game Low and Medium settings
  • Improved graphical fidelity on the in-game Low graphics preset, particularly water surfaces no longer appear black
  • Fixed a crash that may occur during shader building

OP fails again.


#34 DaVillain  Moderator
Member since 2014 • 58798 Posts
@above_average said:

🫣 I'll wait for @davillain to pop up again and tell everyone everything is fine to cope with his 8GB 3070 Ti

Why do you feel the need to lie to yourself?


#35  Edited By IvanGrozny
Member since 2015 • 1939 Posts

PCs should just use console VRAM. Problem solved 🤣🤣🤣.

Oh wait, consoles use the same VRAM modules as PCs. The only difference is the OS. But if we install the PlayStation/Xbox OS on a PC, then we won't have a PC anymore. You'll just have a console 🙄


#36 NoodleFighter
Member since 2011 • 11900 Posts

@R4gn4r0k: @Juub1990: I noticed another common factor that almost all these games have in common: they're AMD-sponsored titles. I wouldn't be surprised if AMD encourages developers not to optimize their VRAM usage, because it allows them to make Nvidia look bad, which appears to be working. I see everyone on YouTube saying they're switching to AMD or are glad they picked RDNA 2/3 instead of Ampere/Ada, all because of VRAM.


#37 above_average
Member since 2021 • 1956 Posts

@davillain said:
@above_average said:

🫣 I'll wait for @davillain to pop up again and tell everyone everything is fine to cope with his 8GB 3070 Ti

Why do you feel the need to lie to yourself?

I'll wait for your report on your rig.


#38 Juub1990
Member since 2013 • 12622 Posts

@above_average: How about you tell us where Mikami went with those 150 employees?


#39 Gifford38
Member since 2020 • 7917 Posts

I just think it's an AMD and Nvidia thing. Both are different in some ways, and one is more optimized for than the other. If it's an AMD-sponsored game, it runs poorer on Nvidia GPUs until they fix it.

It's kind of like console gaming. The Series X is a bit different from the PS5, and games run a bit differently on both. You can't just take PS5 code and run it on Xbox.


#40 DaVillain  Moderator
Member since 2014 • 58798 Posts

@NoodleFighter said:

@R4gn4r0k: @Juub1990: I noticed another common factor that almost all these games have in common: they're AMD-sponsored titles. I wouldn't be surprised if AMD encourages developers not to optimize their VRAM usage, because it allows them to make Nvidia look bad, which appears to be working. I see everyone on YouTube saying they're switching to AMD or are glad they picked RDNA 2/3 instead of Ampere/Ada, all because of VRAM.

I also noticed that with the AMD-sponsored games I'm playing. That's really dirty, but I get it: it's more evident than ever that we need more VRAM in our GPUs and, to be frank, Nvidia is the problem. Ever since TLOU on PC really sparked the demand for more VRAM, it's been clear Nvidia needs to stop skimping on VRAM, and I can see why people are upset with the 4070; 12GB of VRAM is nice, but it could have been 14GB or more, as people are asking.

If Nvidia keeps skimping on VRAM when they release the 50XX series, I might switch to AMD. I can live without RT, even though I really like the tech.


#41 Bond007uk
Member since 2002 • 1722 Posts

@mesome713 said:

That's just a PC issue; PC always sucks. Some things will never change.

Yeah sure they do. Running your beloved Switch games at 8x the resolution and 4x the frame rate 😂


#42  Edited By NoodleFighter
Member since 2011 • 11900 Posts

@davillain: There is some truth to Nvidia skimping on VRAM, but games like Jedi Survivor look the same as their predecessor, so there is clearly some half-assing going on. Nvidia has superior image upscaling quality and performance, superior RT, superior driver support, and rasterization performance that is at least on par with its AMD counterparts. More VRAM and being cheaper is all AMD has had going for it for the past decade, and even then, AMD cards aren't cheap enough in the top tiers to effectively undercut Nvidia's best. This reminds me of AMD CPUs pre-Ryzen and post-Phenom: they couldn't compete on more efficient architecture and innovation, so they overcompensated by cramming in more cores.

AMD isn't trying to innovate first; they simply wait for Nvidia to release something groundbreaking and then try to copy it. The last groundbreaking thing I remember AMD trying was the creation of the Mantle API. Some of the PC techtuber videos I've been watching recently question whether AMD even cares about the PC gaming market anymore, since they notice AMD is fumbling the bag on undercutting Nvidia and Intel, who are new to the scene and are already doing some things better than them.

The RTX 4070 is already flopping in sales and getting price drops, so I think Nvidia may not try to be as greedy with the RTX 50 series.


#43 Bond007uk
Member since 2002 • 1722 Posts

It's a shit version. Well, it is at the moment, at least. It'll be fixed, but to be honest, if they couldn't be bothered to get it working on mainstream-tier PC graphics hardware, they shouldn't have released it.

I mean, 18GB of VRAM? The current-gen consoles only have 16GB of RAM total!


#44 Fedor
Member since 2015 • 11830 Posts

OP BTFO


#45  Edited By 04dcarraher
Member since 2004 • 23859 Posts
@davillain said:
@NoodleFighter said:

@R4gn4r0k: @Juub1990: I noticed another common factor that almost all these games have in common: they're AMD-sponsored titles. I wouldn't be surprised if AMD encourages developers not to optimize their VRAM usage, because it allows them to make Nvidia look bad, which appears to be working. I see everyone on YouTube saying they're switching to AMD or are glad they picked RDNA 2/3 instead of Ampere/Ada, all because of VRAM.

I also noticed that with the AMD-sponsored games I'm playing. That's really dirty, but I get it: it's more evident than ever that we need more VRAM in our GPUs and, to be frank, Nvidia is the problem. Ever since TLOU on PC really sparked the demand for more VRAM, it's been clear Nvidia needs to stop skimping on VRAM, and I can see why people are upset with the 4070; 12GB of VRAM is nice, but it could have been 14GB or more, as people are asking.

If Nvidia keeps skimping on VRAM when they release the 50XX series, I might switch to AMD. I can live without RT, even though I really like the tech.

VRAM isn't the sole issue... there is a slew of reasons why it has suddenly become a problem lately.

Firstly, it's developers releasing buggy and poorly optimized games. Many of the problems come from using UE4 without properly accounting for the CPU limitations of the engine, not compiling shaders efficiently, and translating DX12 features on top of an engine with a DX11 base.

Another problem is that many developers don't appear to be creating texture resolution/size settings tied to the output resolution. Even in a well-optimized game like Spider-Man, 1080p low vs 4K low settings showed only a 300MB difference (6.2GB vs 6.5GB), and even on ultra settings the usage difference was the same 300-400MB between 1080p and 4K. Even TLOU Part 1 only sees a 400-500MB difference between 1080p and 4K at the same settings.
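That 300-400MB gap is roughly what you'd expect if only the render targets scale with resolution while the texture set stays the same. A rough back-of-the-envelope, with the buffer count and formats assumed:

```python
# Why 1080p vs 4K at the same texture setting only moves VRAM by a few
# hundred MB: full-screen buffers scale with output resolution, textures
# don't. The per-pixel byte counts below are illustrative assumptions.

def render_targets_mb(width, height, targets_bytes_per_pixel=(16, 8, 8, 4, 4)):
    """Sum a handful of full-screen buffers (G-buffer, HDR color, depth, post)."""
    pixels = width * height
    return sum(pixels * bpp for bpp in targets_bytes_per_pixel) / (1024 ** 2)

mb_1080p = render_targets_mb(1920, 1080)
mb_4k = render_targets_mb(3840, 2160)
print(f"1080p targets ~ {mb_1080p:.0f} MB")
print(f"4K targets    ~ {mb_4k:.0f} MB")
print(f"delta         ~ {mb_4k - mb_1080p:.0f} MB (textures stay the same size)")
```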

The next point is that a good chunk of why games are having issues on Nvidia GPUs lately is that those games are AMD-sponsored titles, which means RDNA architectures are going to get preferential treatment, and AMD will push its biggest advantage over Nvidia by artificially inflating VRAM usage. There are multiple examples, from Far Cry 6's high-resolution texture pack requirements to AMD playing the VRAM card against Nvidia, telling potential graphics card buyers that they should consider AMD over Nvidia... right after the RTX 4070 launch.

With this game in particular, I've seen an example with the pre-release version of a 7900 XTX running the game better than a 4090 by a whopping 12fps.

Another thing people need to consider, especially concerning the RTX 3070 with 8GB, is that it was never meant to be a 4K gaming GPU, nor to push RT to the wall; it's a GPU released almost three years ago, meant mainly for 1080p/1440p gaming.

Now, Nvidia skimping out on VRAM with the 4070 series, and originally trying to pass the 4070 Ti off as an "RTX 4080 12GB", was shady.

Side note: the latest TLOU Part 1 patch has fixed the VRAM/performance issue on the RTX 3070 series.


#46 simple-facts
Member since 2021 • 2592 Posts

That PS5 secret sauce is some stuff 🤣🤣


#47  Edited By 04dcarraher
Member since 2004 • 23859 Posts
@Bond007uk said:

It's a shit version. Well, it is at the moment, at least. It'll be fixed, but to be honest, if they couldn't be bothered to get it working on mainstream-tier PC graphics hardware, they shouldn't have released it.

I mean, 18GB of VRAM? The current-gen consoles only have 16GB of RAM total!

Not only do they have just 16GB of RAM, but it's all shared between the OS, features, game cache, and GPU. The XSX typically only uses its 10GB faster memory pool for VRAM anyway, and while PS5 usage is a bit more flexible, its VRAM usage is similar to the XSX's.
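Putting rough numbers on that split (the OS reservation is an approximate public figure; the rest is an illustrative sketch):

```python
# Rough budget of a 16GB shared-memory console, to show how much is
# realistically left for "VRAM"-style data. The OS reservation is an
# approximate figure; the split of the remainder is an assumption.

TOTAL_GB = 16.0
OS_RESERVE_GB = 2.5          # approximate system reservation
GPU_OPTIMAL_POOL_GB = 10.0   # XSX's faster pool, typically used for GPU data

available_to_game = TOTAL_GB - OS_RESERVE_GB
cpu_side_data = available_to_game - GPU_OPTIMAL_POOL_GB

print(f"available to the game:  {available_to_game:.1f} GB")
print(f"of which GPU-friendly:  {GPU_OPTIMAL_POOL_GB:.1f} GB")
print(f"left for CPU-side data: {cpu_side_data:.1f} GB")
```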


#48  Edited By BassMan
Member since 2002 • 18757 Posts

Settle down; this is nothing more than a lack of optimization and time for polish. They should have spent more time...

Patches will fix the issues. Hopefully the day-one patch is effective and gamers don't get screwed if they buy at release. I'm not a fan of relying on day-one/post-release patches, or of reviewing games on non-final code either. You don't see the movie industry releasing movies into theaters that haven't finished their post-production, editing, and final cut. So why does the game industry keep releasing games that are not finished? It is fucking dumb. Shit management.


#49 Juub1990
Member since 2013 • 12622 Posts

So, apparently, it's been fixed? Performance isn't great and we've yet to see the open areas, but it's much better now.


Well, the game isn't officially out until tomorrow so let's wait a bit.


#50 with_teeth26
Member since 2007 • 11660 Posts

@Juub1990 said:

So, apparently, it's been fixed? Performance isn't great and we've yet to see the open areas, but it's much better now.

Well, the game isn't officially out until tomorrow so let's wait a bit.

First I've seen of the post-patch performance; that's encouraging at least. Big shame it doesn't support DLSS.