GamingPCGod's forum posts

Avatar image for gamingpcgod
GamingPCGod

132

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#1 GamingPCGod
Member since 2015 • 132 Posts

@zaryia said:
@ronvalencia said:
@zaryia said:

oh wow but @ronvalencia told me XBOX1X will run Destiny2 at 60 fps and pc can't!

Bullshit. D2 at 4K 60 fps on high settings needs at least a GTX 1080 OC level GPU. At 4K, there's a higher chance of the X1X being GPU bound.

So PC can do destiny2 at 60 fps

XBOX ONE X will not do destiny 2 at 60 fps?

No, because the XB1X isn't capable of doing it, either because its GPU is overhyped or because it's bottlenecked by its CPU. But seeing as it uses AMD architecture, it's probably both.

#2 Edited By GamingPCGod

@drlostrib said:

@gamingpcgod: because alt

Ok, I understand. My low post count does make me more suspect. Anyway, since you ask, here's my GeForce setup.

Blocked out my username because it's my real name (which was a bad idea in hindsight).

Anyway, why would I make an alt just to trash consoles that I supposedly own?!

#3 GamingPCGod

@drlostrib said:

@gamingpcgod: I meant you

Why would you say that

#4 Edited By GamingPCGod

@gamecubepad said:

@gamingpcgod:

Dude, a stock RX 480 8GB is getting almost double the frames of the GTX 1060 3GB in RoTR at 4K. X1X's GPU is superior to the RX 480 8GB by upwards of 20%. 3GB is not sufficient for 4K in RoTR, Forza, and the majority of modern games.

I have the GTX 1060 6GB and RX 480 8GB. They both fail to match X1X by a substantial amount in Forza Apex. In essence, X1X has the missing "RX 490" tier GPU. The same performance slot the 980ti stock occupies.

---

I'm all for 1080p/Ultra or 1080p/60fps modes on X1X. That we can agree on.

I've just concluded that you can't read. Have a good day.

P.S. You can't run Forza Apex better than an XB1X? I can run Forza Apex on max graphics at 70-90 fps (depending on map and weather type) with my 1060, and at 40-60 fps at 4K on max graphics; and I have the MOBILE variant. And that's with my NVIDIA Control Panel settings at:

  • Anti-aliasing FXAA: On
  • Anti-aliasing gamma correction: On
  • Anti-aliasing mode: overriding the application's 8x MSAA with 8x SSAA
  • Texture filtering: High quality
  • Maximum pre-rendered frames: 4
  • Power management: Prefer maximum performance
  • Anisotropic filtering: 16x

Either you're lying (which I suspect you are) or something is seriously wrong with your rig.

Don't bother responding.

#5 Edited By GamingPCGod

@gamecubepad:

1. Supersampling is essentially the same exact thing as running 4K in DSR or VSR mode on a 1080p screen. Your "looks FAR better than 4k" claim is bullshit. It's the same thing. I showed you with Ryse.

OMG, I don't know how many times I have to explain this. SSAA is an anti-aliasing method. Anti-aliasing works to reduce jaggedness on edges; it does not raise the display resolution itself. Crank up the AA (regardless of whether it's MSAA, FXAA, or SSAA) but keep the resolution at 480p, and the picture will still be incredibly pixelated; HOWEVER, there will be no jagged edges and the game will still retain its graphical fidelity.


Unlike most AA settings, which only produce a crisper image of the EDGES of 3D models, (some) SS settings produce a crisper, more detailed image of the edges AND the models themselves.

Display resolution IS NOT anti-aliasing. It doesn't matter if you run your game at 4K; the fact of the matter is that if you turn all your AA settings off, then you will have jagged 3d models with the staircase effect.

HOWEVER, AA is becoming less and less necessary as graphics modeling improves. Artists are progressively adding more and more vertices to their models, making traditional AA like MSAA less needed. That's why BF4 only offers MSAA 4x (it has high-quality, smooth 3D models), while games like CS:GO need MSAA 8x, because their models don't have as many vertices.

Which brings me to the point about RYSE that I probably didn't clarify earlier: if a game already has a lot of vertices, then it needs to rely on ANY AA setting less. This means increasing the SS past a certain extent won't give you a much better picture.

Point is, display resolution isn't an AA setting, and is incapable of improving the quality and smoothness of the 3D models THEMSELVES.
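The distinction above can be sketched in a few lines. This is a hypothetical toy example (not from any game or driver): it treats an image as a plain list of grayscale rows and shows only the downsampling step that lets SSAA soften edges.

```python
# Minimal sketch of what SSAA does on the way down: render at 2x the target
# resolution, then box-filter each 2x2 block into one output pixel.
# Pixels straddling an edge become a blend of the covering samples,
# which is what softens the staircase effect.

def downsample_2x(hi_res):
    """Average each 2x2 block of a grayscale image (list of rows)."""
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white diagonal edge rendered at 4x4...
hi = [[0, 0, 0, 1],
      [0, 0, 1, 1],
      [0, 1, 1, 1],
      [1, 1, 1, 1]]
# ...downsamples to a 2x2 image with blended intermediate values.
print(downsample_2x(hi))  # [[0.0, 0.75], [0.75, 1.0]]
```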

Regardless of all of this, SSAA is only ONE way of improving graphical fidelity, and though, to its credit, the XB1X will be introducing it to consoles for the first time, it still lags behind in rendering speed and texture detail.

2. Digital Foundry with confirmation from MS. Article 1. Article 2. Showed you twice where supersampling for 1080p users is a system-level feature of the X1X. lol.

I KNOW! I CONCEDED THAT EARLIER! But obviously you can't read, so I'll repost my reply again.

2. OK, that's actually fantastic, but what level will the supersampling be? 120% (x1.44 the pixels)? 150% (x2.25)? 200% (x4)? 283% (x8)? We need more detail than "it will simply have supersampling", because I can tell you that if it's anything lower than x2.25 (a 150% resolution scale), then it won't make much of a difference. Plus, if it's incapable of running games at 60 fps at that scale, is it really worth it? Time will tell.
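For reference, the pixel multipliers in that paragraph come from squaring the linear scale (1.5 x 1.5 = 2.25, 2 x 2 = 4, and so on). A tiny illustrative helper (the function name is made up for this sketch):

```python
# A linear resolution scale of s multiplies the pixel count by s**2,
# so a 150% scale renders 2.25x the pixels of the output resolution.

def render_resolution(out_w, out_h, scale_pct):
    """Return (render_w, render_h, pixel_multiplier) for a given
    resolution-scale percentage."""
    s = scale_pct / 100.0
    return round(out_w * s), round(out_h * s), s * s

# 150% scale on a 1080p output renders internally at 2880x1620.
print(render_resolution(1920, 1080, 150))  # (2880, 1620, 2.25)
```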

I then moved on to state how res scaling is only one of MANY ways to improve graphical fidelity. But again, you completely ignored that.

Already showed you where PS4 games used 3GB of 4.5GB available for VRAM. Sony increased this to 5.5GB for games, so VRAM usage will jump to ~4GB. X1X adds 3.5GB RAM for games, putting it into the 6-7GB VRAM range while still easily accommodating 1.5-3GB System RAM usage.

I see what you mean. It's the term "VRAM", which is typically used for dedicated graphics cards (not shared RAM like in consoles), that threw me off.

Still, that doesn't mean it will beat cards with less memory. You also have to take processing speed into account, which is more important. As I've repeatedly said, VRAM matters less than architecture and speed; plenty of cards outperform cards with more VRAM: the GTX 1060 6GB vs. the RX 580, the GTX 1060 3GB vs. the GTX 1050 Ti, the GTX 1080 Ti vs. the Vega Frontier Edition. All of the former cards have less VRAM, by anywhere from 1 GB to 5 GB, yet all of them trump the latter thanks to superior quality. And seeing as consoles use AMD (which is notorious for bad architecture), that 6-7 GB of VRAM will more likely perform like 3-5 GB of an NVIDIA equivalent. It also doesn't help that its memory runs at 6.8 Gbps, which is lower than the GTX 1060 3GB variant (9 Gbps).

Its large amount of GDDR5 RAM won't compensate for the inferior AMD architecture or inferior memory speed, the latter of which is FAR more important when it comes to rendering graphics. It's why the GTX 1080 Ti, with 5 GB less VRAM, trumps the Vega Frontier, and why the GTX 1060, with 2 GB less VRAM, trumps the RX 580. If you have all this room to store data but you're not processing it quickly enough to deliver smooth gameplay, then the amount of room you have is irrelevant. Then again, it seems console players are fine with 30 fps with frequent drops into the 20s.
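A side note on the memory-speed figures in the last two paragraphs: those numbers are per-pin data rates, and total memory bandwidth also depends on bus width. A quick sketch of the standard formula, using the GTX 1060's widely published specs (8 Gbps GDDR5 on a 192-bit bus) as the illustrative example:

```python
# Memory bandwidth formula:
#   bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8

def mem_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Total memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# e.g. a GTX 1060 (8 Gbps GDDR5 on a 192-bit bus):
print(mem_bandwidth_gbs(8, 192))  # 192.0 GB/s
```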

It's the same thing with CPUs. Console lemmings love raving about how many cores they have. "Ooooh, look, I have 8 cores." Yet their 8 cores are being outdone by a 4-core i3. Why? Because Intel's superior architecture and faster speeds >>>>> AMD's shitty architecture. Why do you think they have so many cores? To compensate.

THAT is why I've been saying it will probably sit at GTX 1060 3GB level. Yeah, it has twice the VRAM, but it has inferior architecture and its memory is 2.2 Gbps slower.

#6 Edited By GamingPCGod

@gamecubepad said:

@gamingpcgod:

Keep on spinning. Your whole premise is debunked.

You think a GTX 1060 3GB can match the X1X GPU. lol. It's half the framerate of the GTX 1060 6GB in RoTR. Full-on VRAM limited. X1X is faster than the GTX 1060 6GB and RX 480 8GB.

How do you debunk a premise when you completely ignore it??????

Let me present my conclusion in one more final, concise fashion, since you clearly have reading comprehension deficiencies.

The 4K, max-settings PC version is basically what you'd get if you combined the 4K and Enriched Visuals modes on Xbox One X. That's why those FPS numbers are so low.

#7 Edited By GamingPCGod

@gamecubepad said:

@gamingpcgod:

The X1X GPU is more powerful than a GTX 1060 6GB or the RX 480 8GB(or the 390/390X for that matter).

The 1060 3GB version gets half the frames of the 1060 6GB model and RX 480 8GB. It's VRAM bottlenecked at 4K on RoTR.

The paltry 3GB VRAM and lower mem bandwidth run into the wall hard. X1X is designed to use 6-7GB VRAM.

Are you tired of being wrong yet?

So did you just completely ignore everything I just said? I'm going to give you the benefit of the doubt and assume you didn't get my reply rather than assuming you're a blind idiot fanboy who's grasping at straws, so here it goes again.

So many things wrong with your post.

First off, those benchmark numbers are for the game not only at 4K but also on max settings, which typically means also going into the NVIDIA control panel, turning everything up to its highest, and preferring image quality over performance. The "Enriched Visuals" mode will probably be equivalent to the PC "High". The 4K visual mode, on the other hand, renders the game at only slightly higher settings than the Xbox One, but at 4K and 30 fps. The third setting basically gives you XBO graphics at 1080p but at twice the fps (60).

Why is this important? Because your comparison is utterly moronic with this information in mind. You're not comparing apples to apples, or even apples to oranges; you're comparing an apple to a grape, because the GTX 1060 3GB has to do FAR MORE when rendering the game at 4K. It's not just outputting a 4K image like the Xbox One X; it's also rendering Xbox One X "Enriched Visuals" at slightly higher settings WHILE running at 4K, which is basically like combining those two modes. THAT'S why its fps is so low, because I know for a FACT that a GTX 1070 is more than slightly better than an XB1X GPU. A better comparison would be to configure the settings to somewhere between medium and high at 4K, and THEN compare it to an XB1X. I'd bet it would get similar frames, if not more.

#8 Edited By GamingPCGod

@gamecubepad said:
@gamingpcgod said:

And anyways, the quality of the GPU is more important than how much VRam it has anyways. That's why a 1060 3gb can get you 20 more frames than a 1050ti.

GTX 1060 3GB is destroyed at 4K by X1X...

X1X is running RoTR at native 4k/30fps with essentially very high settings. More than double what the 1060 3GB is capable of.

So many things wrong with your post.

First off, those benchmark numbers are for the game not only at 4K but also on max settings, which typically means also going into the NVIDIA control panel, turning everything up to its highest, and preferring image quality over performance. The "Enriched Visuals" mode will probably be equivalent to the PC "High". The 4K visual mode, on the other hand, renders the game at only slightly higher settings than the Xbox One, but at 4K and 30 fps. The third setting basically gives you XBO graphics at 1080p but at twice the fps (60).

Why is this important? Because your comparison is utterly moronic with this information in mind. You're not comparing apples to apples, or even apples to oranges; you're comparing an apple to a grape, because the GTX 1060 3GB has to do FAR MORE when rendering the game at 4K. It's not just outputting a 4K image like the Xbox One X; it's also rendering Xbox One X "Enriched Visuals" at slightly higher settings WHILE running at 4K, which is basically like combining those two modes. THAT'S why its fps is so low, because I know for a FACT that a GTX 1070 is more than slightly better than an XB1X GPU. A better comparison would be to configure the settings to somewhere between medium and high at 4K, and THEN compare it to an XB1X. I'd bet it would get similar frames, if not more.

#9 GamingPCGod

@gamecubepad: Maybe if you went back and read through the thread, you'd see that I acknowledged that the XB1X has said capabilities, though it's unknown at what level.

Nonetheless, the problem still remains that these systems care more about display resolution than graphical fidelity and fps.

#10 GamingPCGod

@quadknight said:

? No.