DF: Xbox One X at Gamescom - Enhanced Games Previewed!


#301  Edited By QuadKnight
Member since 2015 • 12916 Posts

@ronvalencia said:
@quadknight said:
@ronvalencia said:
@quadknight said:

? Wrongvalencia REKT again.

And this clown has the courage to say the X1X is on par with a GTX 1070 when it can't even beat the RX580 lol.

https://www.neowin.net/news/according-to-a-producer-at-respawn-entertainment-the-xbox-one-x-renders-titanfall-2-at-6k

? More meaningless charts and graphs and he's still wrong lol.

You're the only idiot on the planet that thinks the X1X will outperform a PC with a GTX 1070. All the benchmarks disagree with you and you still go on and on. This is why you have zero credibility and are a joke on SW. If you were on NeoGAF you'd be banned for being so stupid.

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20

http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

Deal with it.

Ark is an unoptimized piece of shit game on PC and even with that saying "kind of equivalent" isn't even close to being a confirmation.

You're a clown and you need to brush up on your English comprehension.

Every benchmark on the internet proves you wrong; you're the one that needs to "deal with it". The fact that you're quoting shitty devs like the Ark devs proves you've already lost the argument.


#302 ronvalencia
Member since 2008 • 29612 Posts

@quadknight said:
@ronvalencia said:
@quadknight said:
@ronvalencia said:
@quadknight said:

? Wrongvalencia REKT again.

And this clown has the courage to say the X1X is on par with a GTX 1070 when it can't even beat the RX580 lol.

https://www.neowin.net/news/according-to-a-producer-at-respawn-entertainment-the-xbox-one-x-renders-titanfall-2-at-6k

? More meaningless charts and graphs and he's still wrong lol.

You're the only idiot on the planet that thinks the X1X will outperform a PC with a GTX 1070. All the benchmarks disagree with you and you still go on and on. This is why you have zero credibility and are a joke on SW. If you were on NeoGAF you'd be banned for being so stupid.

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20

http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

Deal with it.

Ark is an optimized piece of shit game on PC and even with that saying "kind of equivalent" isn't even close to being a confirmation.

You're a clown and you need to brush up on your English comprehension.

Every benchmark on the internet proves you wrong, you're the one that needs to "deal with it". The fact you're quoting shitty devs like the Ark devs already proves you've already lost the argument.

The developer didn't state "If you think about it, it's kind of equivalent to a GTX 1060". LOL.

Keep crying.


#303 QuadKnight
Member since 2015 • 12916 Posts

@ronvalencia said:
@quadknight said:
@ronvalencia said:
@quadknight said:
@ronvalencia said:

https://www.neowin.net/news/according-to-a-producer-at-respawn-entertainment-the-xbox-one-x-renders-titanfall-2-at-6k

? More meaningless charts and graphs and he's still wrong lol.

You're the only idiot on the planet that thinks the X1X will outperform a PC with a GTX 1070. All the benchmarks disagree with you and you still go on and on. This is why you have zero credibility and are a joke on SW. If you were on NeoGAF you'd be banned for being so stupid.

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20

http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

Deal with it.

Ark is an optimized piece of shit game on PC and even with that saying "kind of equivalent" isn't even close to being a confirmation.

You're a clown and you need to brush up on your English comprehension.

Every benchmark on the internet proves you wrong, you're the one that needs to "deal with it". The fact you're quoting shitty devs like the Ark devs already proves you've already lost the argument.

The developer didn't state "If you think about it, it's kind of equivalent to a GTX 1060". LOL.

Keep crying.

WTF? What argument are you even trying to make now? How did posting this shit just help your argument?

Are you sure you haven't short-circuited again, Wrongvalencia? WTH, dude.


#304  Edited By ronvalencia
Member since 2008 • 29612 Posts

@quadknight said:
@ronvalencia said:
@quadknight said:
@ronvalencia said:

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20

http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

Deal with it.

Ark is an optimized piece of shit game on PC and even with that saying "kind of equivalent" isn't even close to being a confirmation.

You're a clown and you need to brush up on your English comprehension.

Every benchmark on the internet proves you wrong, you're the one that needs to "deal with it". The fact you're quoting shitty devs like the Ark devs already proves you've already lost the argument.

The developer didn't state "If you think about it, it's kind of equivalent to a GTX 1060". LOL.

Keep crying.

WTF? What argument are you even trying to make now? How did posting this shit just help your argument?

Are you sure you haven't short-circuited again Wrongvalencia? ? WTH dude.

1. Your "even with that saying "kind of equivalent" isn't even close to being a confirmation." argument is flawed since one can't say X1X has actual GTX 1070 GPU stupid coward.

2. The developer did NOT state "If you think about it, it's kind of equivalent to a GTX 1060".

Deal with it. Hint: it's the wet track's alpha effects which is mostly ROPS (with it's cache) based processing.
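
For anyone wondering why "wet track alpha effects" would lean on the ROPS at all: transparent surfaces are alpha-blended, and every blended fragment is a read-modify-write on the framebuffer, so the traffic scales with overdraw. Below is a minimal back-of-the-envelope sketch in Python; the RGBA8 target and the overdraw figure are illustrative assumptions, not measurements of Forza.

```python
# Rough estimate of framebuffer traffic generated by alpha blending alone.
# Each blended fragment reads the destination pixel, blends, and writes it
# back (read-modify-write). Layer count and pixel format are assumptions.

BYTES_PER_PIXEL = 4     # assuming an RGBA8 render target
OVERDRAW_LAYERS = 4     # assumed average transparent overdraw (spray, rain, smoke)
FPS = 60

def blend_traffic_gb_s(width, height, layers=OVERDRAW_LAYERS, fps=FPS):
    """Approximate GB/s of framebuffer traffic from blending alone."""
    bytes_per_frame = width * height * layers * BYTES_PER_PIXEL * 2  # read + write
    return bytes_per_frame * fps / 1e9

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{blend_traffic_gb_s(w, h):.1f} GB/s of blend traffic")
```

The claim being argued above is that a render cache sitting next to the ROPS keeps much of that read-modify-write traffic on-chip; the sketch only quantifies the traffic, it doesn't verify the hardware claim.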


#305  Edited By QuadKnight
Member since 2015 • 12916 Posts

@ronvalencia said:
@quadknight said:
@ronvalencia said:
@quadknight said:

Ark is an optimized piece of shit game on PC and even with that saying "kind of equivalent" isn't even close to being a confirmation.

You're a clown and you need to brush up on your English comprehension.

Every benchmark on the internet proves you wrong, you're the one that needs to "deal with it". The fact you're quoting shitty devs like the Ark devs already proves you've already lost the argument.

The developer didn't state "If you think about it, it's kind of equivalent to a GTX 1060". LOL.

Keep crying.

WTF? What argument are you even trying to make now? How did posting this shit just help your argument?

Are you sure you haven't short-circuited again Wrongvalencia? ? WTH dude.

1. Your "even with that saying "kind of equivalent" isn't even close to being a confirmation." argument is flawed since one can't say X1X has actual GTX 1070 GPU stupid coward.

2. The developer did NOT state "If you think about it, it's kind of equivalent to a GTX 1060".

Deal with it. Hint: it's the wet track's alpha effects which is mostly ROPS (with it's cache) based processing.

English, spambot... are you capable of replying in simple English, Wrongvalencia?

You sound like a malfunctioning robot and your posts are devoid of any logic or reasoning.

Your shitty reply still doesn't explain why you're posting pics of TLOU. You're so stupid you can't even make posts that have any coherence.


#306 ronvalencia
Member since 2008 • 29612 Posts

@quadknight said:
@ronvalencia said:
@quadknight said:
@ronvalencia said:
@quadknight said:

Ark is an optimized piece of shit game on PC and even with that saying "kind of equivalent" isn't even close to being a confirmation.

You're a clown and you need to brush up on your English comprehension.

Every benchmark on the internet proves you wrong, you're the one that needs to "deal with it". The fact you're quoting shitty devs like the Ark devs already proves you've already lost the argument.

The developer didn't state "If you think about it, it's kind of equivalent to a GTX 1060". LOL.

Keep crying.

WTF? What argument are you even trying to make now? How did posting this shit just help your argument?

Are you sure you haven't short-circuited again Wrongvalencia? ? WTH dude.

1. Your "even with that saying "kind of equivalent" isn't even close to being a confirmation." argument is flawed since one can't say X1X has actual GTX 1070 GPU stupid coward.

2. The developer did NOT state "If you think about it, it's kind of equivalent to a GTX 1060".

Deal with it. Hint: it's the wet track's alpha effects which is mostly ROPS (with it's cache) based processing.

English spambot ....are you capable of replying in simple English Wrongvalencia?

You sound like a malfunctioning robot and your posts are devoid of any logic or reasoning.

Your shitty reply still doesn't explain why you're posting pics of TLOU. You're so stupid you can't even make posts that have any coherence.

1. Your "even with that saying "kind of equivalent" isn't even close to being a confirmation" argument is flawed since any person can't say X1X has an actual GTX 1070 GPU stupid coward.

2. The developer did NOT state "If you think about it, it's kind of equivalent to a GTX 1060".

Hint: it's the wet track's alpha effects, which are mostly ROPS-based processing (with the ROPS' cache). Deal with it.


#307 scatteh316
Member since 2004 • 10273 Posts

Performance is where everyone expected it to be......... except if your name's Ron.... in which case performance is worse than what you've spent the last 6 months hyping and owning yourself.


#308 PinkAnimal
Member since 2017 • 2380 Posts

@quadknight: "English spambot ....are you capable of replying in simple English Wrongvalencia?"

Apparently he can't...


#309  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

Performance is where everyone expected it to be......... except if your names Ron.... in which case performance is worse then what you've spent the last 6 months hyping and owning yourself.

Too bad for you

Deal with it.

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20


http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

Deal with it.


#310  Edited By ronvalencia
Member since 2008 • 29612 Posts

@pinkanimal said:

@quadknight: "English spambot ....are you capable of replying in simple English Wrongvalencia?"

Apparently he can't...

I'm correct on X1X having a ROPS/RBE upgrade, based on a degraded GTX 1080 Ti at 6.5 TFLOPS, the Forzatech wet track results, and ARC Survival.

I'm correct on X1X being superior to the R9-390X and RX-480 OC, based on the Forzatech wet track and ARC Survival.

I'm correct on ROTR being native 4K, based on my R9-390X results. Devs already claim this.

I'm correct on WOT being native 4K, based on my R9-390X results. Devs already claim this.

I'm correct on Killer Instinct being native 4K, based on somebody's R9-290X results (reaching a 55 fps average).

I'm correct on Path of Exile being native 4K. Devs already claim this.

I'm correct on Gears of War 4 Campaign mode being native 4K, based on R9-390X to GeForce 980 Ti results. Devs already claim this.

I'm correct on Shadow of War being native 4K, based on Shadow of Mordor R9-390X to GeForce 980 Ti results.


#311 PinkAnimal
Member since 2017 • 2380 Posts

@ronvalencia: "I'm correct"

You know that spamming "I'm correct" a million times doesn't make you correct, spambot? I can see how a spambot could have that misconception, since spamming is all you know, but you're still wrong in pretty much all the crap you spew here, wrongvalencia. You can't even read a post properly to respond with something that makes sense.


#312  Edited By ronvalencia
Member since 2008 • 29612 Posts

@pinkanimal said:

@ronvalencia: "I'm correct"

You know that spamming "I'm correct" a million times doesn't make you be correct, spambot? I can see how a spambot could have that misconception since spamming is all you know but you're still wrong in pretty much all the crap you spew here, wrongvalencia. You can't even read a post properly to respond with something that makes sense.

You're still wrong, coward.


#313 PinkAnimal
Member since 2017 • 2380 Posts

@ronvalencia: You're still spamming nonsense, spambot.


#314 ronvalencia
Member since 2008 • 29612 Posts

@pinkanimal said:

@ronvalencia: You're still spamming nonsense, spambot.

You're still wrong, cowspam.


#315 SOedipus
Member since 2006 • 15072 Posts

omg you guys.


#316 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:
@scatteh316 said:

Performance is where everyone expected it to be......... except if your names Ron.... in which case performance is worse then what you've spent the last 6 months hyping and owning yourself.

Too bad for you

Deal with it.

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20


http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

Deal with it.

" it's kind of"

Now you deal with it.......


#317 tormentos
Member since 2003 • 33793 Posts
@ronvalencia said:

1. Your "even with that saying "kind of equivalent" isn't even close to being a confirmation." argument is flawed since one can't say X1X has actual GTX 1070 GPU stupid coward.

This is the main problem with you: you simply look for quotes from developers who praise the Xbox brand and use them as some kind of irrefutable truth.

And you have been doing this for a LONG, LONG time. This is not just an Xbox One X thing; you used developers to defend the Xbox One as well.

@ronvalencia said:

@remaGloohcSdlO:

The stories are the same.

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games#SBX8QyXmrlJEyBW1.99

Bolcato stated that, “It was clearly a bit more complicated to extract the maximum power from the Xbox One when you’re trying to do that. I think eSRAM is easy to use. The only problem is…Part of the problem is that it’s just a little bit too small to output 1080p within that size. It’s such a small size within there that we can’t do everything in 1080p with that little buffer of super-fast RAM.

“It means you have to do it in chunks or using tricks, tiling it and so on...

...

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.

Yeah, I mean that’s probably why, well at least on paper, it’s a bit more powerful. But I think the Xbox One is gonna catch up.But definitely there’s this eSRAM. PS4 has 8GB and it’s almost as fast as eSRAM [bandwidth wise] but at the same time you can go a little bit further with it, because you don’t have this slower memory. That’s also why you don’t have that many games running in 1080p, because you have to make it smaller, for what you can fit into the eSRAM with the Xbox One.”

This was one of my favorites. Remember how you used to quote Rebellion on a daily basis?

I do. You did it to damage control the Xbox One not hitting 1080p. I remember how you quoted that part about Rebellion being worried six months prior to that article but not anymore, because the new SDK and ESRAM would save the day.

Then Sniper Elite 3 got released and you literally vanished from the forum; you stopped posting. I remember well that when the comparison was shown, while the Xbox One reached 1080p the gap in frame rate was considerable, and in reality ESRAM was just one of the many problems the Xbox One had.

You were wrong, Bolcato was wrong, Brad Wardell LIED about double performance, and Phil Spencer also lied. Think about that next time you quote the same shit from a developer 100 times, especially if the game in question is a shitty mess that requires a GTX 1080 for 1080p like Ark Survival does.
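
The 32 MB figure in the Rebellion quote above is easy to sanity-check: a typical deferred G-buffer at 1080p simply does not fit, which is why the quote talks about tiling and "doing it in chunks". A minimal sketch, using an assumed, generic G-buffer layout rather than Rebellion's actual one:

```python
# Sanity check of the "32 MB ESRAM is too small for 1080p" quote above.
# The G-buffer layout below is an assumed, typical deferred-rendering setup,
# not the layout actually used by Sniper Elite 3.

ESRAM_BYTES = 32 * 1024 * 1024

RENDER_TARGETS = [            # (name, bytes per pixel)
    ("albedo RGBA8",         4),
    ("normals RGBA8",        4),
    ("material RGBA8",       4),
    ("HDR lighting RGBA16F", 8),
    ("depth/stencil D24S8",  4),
]

def footprint_bytes(width, height):
    return sum(bpp for _, bpp in RENDER_TARGETS) * width * height

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    size = footprint_bytes(w, h)
    verdict = "fits" if size <= ESRAM_BYTES else "does NOT fit"
    print(f"{name}: ~{size / 2**20:.1f} MB of render targets -> {verdict} in 32 MB ESRAM")
```

With this layout, 720p needs roughly 21 MB and fits, while 1080p needs roughly 47 MB and has to be tiled or partially spilled to DDR3, which matches the workaround the developer describes.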


#318 AdobeArtist  Moderator
Member since 2006 • 25184 Posts

** Courtesy Mod bump to clear out the recent forum invasion... carry on :) **


#319 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

$500 30 FPS box, cannot stop laughing. What a fucking joke.

Love how the usual suspect talks out his arse and is always proven wrong. What a fucking clown.


#320 SecretPolice
Member since 2007 • 45675 Posts

:P


#321 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

ronvalencia, a person who is always wrong and laughed at.

No one to fight in his corner, what a sad pathetic individual.


#323 tdkmillsy
Member since 2003 • 6617 Posts

@tormentos: What is the problem with using developers exactly?

When they say something negative, you're the first to point it out.

Developer quotes must stand better than other sources such as forums, personal bios and guessing.


#324  Edited By tormentos
Member since 2003 • 33793 Posts

@tdkmillsy said:

@tormentos: What is the problem with using developers exactly?

When they say something negative your the first to point them out

Developer quotes must stand better than other sources such as forums, personal bios and guessing.

The problem was simple: the developers speaking for MS were lying.

Brad Wardell in particular claimed 2X performance gains on Xbox One from the use of DX12; he openly claimed the Xbox One was the biggest beneficiary when it simply wasn't true.

Now another developer, this time for a game that even on PC performs like shit, looks average, and requires a GTX 1080 just to max out at 1080p, is the one singing praises and comparing the Xbox One X to a GTX 1070.

Not when they lie for a company to hype a product, no they don't.


#325 ronvalencia
Member since 2008 • 29612 Posts

@Random_Matt said:

ronvalencia a person who is always wrong and laughed at.

No one to fight in his corner, what a sad pathetic individual.

I'm correct on X1X having a ROPS/RBE upgrade, based on a degraded GTX 1080 Ti at 6.5 TFLOPS, the Forzatech wet track results, and ARC Survival.

I'm correct on X1X being superior to the R9-390X and RX-480 OC, based on the Forzatech wet track and ARC Survival.

I'm correct on ROTR being native 4K, based on my R9-390X results. Devs already claim this.

I'm correct on WOT being native 4K, based on my R9-390X results. Devs already claim this.

I'm correct on Killer Instinct being native 4K, based on somebody's R9-290X results (reaching a 55 fps average).

I'm correct on Path of Exile being native 4K. Devs already claim this.

I'm correct on Gears of War 4 Campaign mode being native 4K, based on R9-390X to GeForce 980 Ti results. Devs already claim this.

I'm correct on Shadow of War being native 4K, based on Shadow of Mordor R9-390X to GeForce 980 Ti results.

You're the sad pathetic individual.


#326 ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:
@ronvalencia said:
@scatteh316 said:

Performance is where everyone expected it to be......... except if your names Ron.... in which case performance is worse then what you've spent the last 6 months hyping and owning yourself.

Too bad for you

Deal with it.

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20


http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

Deal with it.

" it's kind of"

Now you deal with it.......

ARC Dev didn't state "If you think about it, it's kind of equivalent to a GTX 1060".

Now you deal with it.......


#327 QuadKnight
Member since 2015 • 12916 Posts

@Random_Matt said:

$500 30 FPS box, cannot stop laughing. What a fucking joke.

Love how the usual suspect talks out his arse and always proven wrong, what a fucking clown.

Indeed. The dude is a complete joke.


#328  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@tdkmillsy said:

@tormentos: What is the problem with using developers exactly?

When they say something negative your the first to point them out

Developer quotes must stand better than other sources such as forums, personal bios and guessing.

The problem was simple the developers speaking for MS were lying.

Brad Wardell in special claimed 2X performance gains on xbox one by the use of DX12,he openly claim the xbox one was the biggest beneficiary when it simply wasn't true.

Now another developers this time for a game that even on PC performs like shit look average and require a 1080GTX just to max out at 1080p,is the one making praises and comparing the xbox one X to a 1070GTX.

Not when they lie for a company to hype a product no they don't.

Brad Wardell has nothing to do with X1X, and his compute-heavy CPU RTS games are NOT on X1X, which is different from the ARC Survival early access build that is running on actual XBO and X1X hardware.

Games like Alien Isolation need a new API programming model since your POS4 has weak-sauce CPUs.


#329  Edited By tormentos
Member since 2003 • 33793 Posts

@ronvalencia said:

Brad Wardell has nothing to do with X1X and his compute heavy CPU RTS games are NOT on X1X which is different from ARC Survival early access build which is running on actual XBO and X1X hardware.

Games like Alien Isolation needs new API programming model since your POS4 has weak sauce CPUs.

Keep playing stupid; you're doing a terrific job.

I don't care whether Brad Wardell has anything to do with the Xbox One X or not, and THAT IS NOT THE POINT, YOU FOOL.

The point is DEVELOPERS LIE FOR MS.

You defended Brad Wardell and he was wrong; in fact, still today you try to damage control what he claimed by saying it was about his game, when his game is not on PS4 or Xbox One.

Ark's developer is a shitty developer whose game needs a GTX 1080 just to hit 1080p. By the Ark Survival developer's own logic, the Xbox One X is equivalent to a GTX 1080, because that is what you need for Ark at 1080p Epic settings, and even then you will get between 35 and 40 FPS.

This is a badly optimized game; it was using a single core; it simply is a mess.

Performance-wise this game is a mess, so yeah, I take that GTX 1070 comment with a truckload of salt. You just take anything that serves you best and ignore anything that doesn't, as PROVEN by me already.

Developers do lie to hype consoles, and Brad Wardell was a perfect example of that.

You of all people calling the PS4 a POS4? The PS4 has spent 3+ years spanking your lovely Xbox One, by gaps bigger than 100%, walking all over it not only power-wise but sales-wise and games-wise as well, and you dare call the PS4 a POS4. Hahahahaa

But but but you are not a lemming right? Hahahahahahaaa

The real shit console here is the weakbox one, which is still here, still getting kicked, and will still be here after the Xbox One X lands. In fact, I bet my account vs yours that the XBO S will be the best-selling Xbox One model this holiday, which means, yeah, most Xbox fans will get an inferior experience. The Xbox One X launching doesn't mean the Xbox One S can somehow now run at native 4K with ultra settings; anyone with an Xbox One still gets a shitty experience, and anyone buying one new still gets the worst experience.

PS: That shitty CPU is also in the Xbox One X... lol

But but but Ryzen...lol


#330 SecretPolice
Member since 2007 • 45675 Posts

@ronvalencia said:
@Random_Matt said:

ronvalencia a person who is always wrong and laughed at.

No one to fight in his corner, what a sad pathetic individual.

I'm correct on X1X having ROPS/RBE upgrade based from degraded GTX 1080 Ti at 6.5 TFLOPS and Forzatech wet track results and ARC Survival .

I'm correct on X1X being superior to R9-390X and RX-480 OC based on Forzatech wet track and ARC Survival .

I'm correct on ROTR being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on WOT being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on Killer Instinct being native 4K based on somebody's R9-290X results (reaching 55 fps average) .

I'm correct on Path of Exilebeing native 4K. Devs already claims this.

I'm correct on Gears of War 4 Campaign mode being native 4K based on R9-390X to GeForce 980 TI results. Devs already claims this.

I'm correct on Shadow of War being native 4K based on Shadow of Mordor R9-390X to GeForce 980 TI results.

You're the sad pathetic individual.

Captain Ron, sizzling them bovine all day every day...

Good stuff. lol :P


#331 tdkmillsy
Member since 2003 • 6617 Posts

@tormentos said:
@tdkmillsy said:

@tormentos: What is the problem with using developers exactly?

When they say something negative your the first to point them out

Developer quotes must stand better than other sources such as forums, personal bios and guessing.

The problem was simple the developers speaking for MS were lying.

Brad Wardell in special claimed 2X performance gains on xbox one by the use of DX12,he openly claim the xbox one was the biggest beneficiary when it simply wasn't true.

Now another developers this time for a game that even on PC performs like shit look average and require a 1080GTX just to max out at 1080p,is the one making praises and comparing the xbox one X to a 1070GTX.

Not when they lie for a company to hype a product no they don't.

This is F*cked up and you know it.

Everyone can make mistakes, have their own ideas, and outright make stuff up.

Just because one developer says something that obviously isn't true (or lacks context) doesn't mean other developers are making stuff up, and to say it only applies to MS is just WRONG.

Everyone that has got something right has got something wrong at some point. Does this make them liars?

Until you have the full facts, you have to combine what the professionals are saying (you know, those that actually have access to or knowledge of the X1X), and in summary they are saying the X1X is the best-performing console to work on, but the CPU could be a challenge for some games. This is now being proven correct by the limited benchmarks and comparisons we are getting through.


#332  Edited By pelvist
Member since 2010 • 9001 Posts

@xhawk27 said:
@tormentos said:
  • World of Tanks native 4K/30...

lol


1070GTX 4k max settings over 100FPS...lol

BUt but but Forza wet track...lol

Butthurt again, Tomato. LOL

You don't have to let us all know when you're butthurt. Better off keeping that to yourself, especially here on SW. So the XboneX can't do WoT at more than 4K/30fps; no need to let that trigger you. You should thank tormentos for relaying these facts, it could save you some money.


#333 tormentos
Member since 2003 • 33793 Posts

@tdkmillsy said:

This is F*cked up and you know it.

Everyone can make mistakes have their own ideas and out right make stuff up.

Just because one developer says something that obviously isn't true (or lacks context) doesn't mean other developers are making stuff up and to say it just applies to MS is just WRONG.

Everyone that has got something right has got something wrong at some point. Does this make them liars.

Until you have full facts you have to combine what the professionals are saying (you know those that actually have access or know about the X1X) and in summary they are saying X1X is the best performing console out to work on, but the CPU issue could be a challenge for some games. This is now being proved correct from the limited benchmarks and comparisons we are getting through.

NO, it is not.

And I can certainly remind you of how much you people fought me over the so-called DX12 improvements that were oversold from the very first day.

None of you admit I was right, and I was: DX12 was basically MS's Xbox development tools ported to PC, so the real beneficiary was the PC, not the Xbox One, let alone the Xbox One being the biggest beneficiary, and it simply didn't double performance. I told you people many times and you simply would not listen.

One? How about Bolcato from the Sniper Elite team, claiming the ESRAM problems were solved by a new SDK and that the Xbox One would hit 1080p more often? In fact, as the generation progressed, it was the contrary: fewer and fewer games were 1080p on Xbox One.

There is this thing called hype; whether you want to believe it or not is your problem, and we all know you lemmings fell for DX12 and the cloud as well, so falling for this one is not really hard. The GPU inside the Xbox One X is a 6TF Polaris, the same GPU inside the RX 580 with some Vega improvements; it is not an actual Vega GPU, and it has a 2.3GHz CPU behind it. The Destiny developer already claimed that they could have gone for 60FPS like on PC, but the world and AI would need to be smaller, which clearly indicates a CPU problem, which I also stated would happen.

Ark Survival is not even a well-optimized game; it requires a GTX 1080 just to do 1080p over 30FPS. Come on, man, a GTX 1080? For real?

I'll wait until actual comparisons can be made. Another thing is that Xbox One X development kits have more power than the retail units, and it's not just RAM like other dev kits; no, that dev kit has 44 CUs vs the XBO X's 40, so anything running off a dev kit should be taken with a grain of salt as well. And before you try to downplay my argument...

MS did use dev kits that even had Nvidia GPUs in them; we know this as a fact.
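
On the 44 CU dev kit vs 40 CU retail point above: for a GCN-style GPU the peak FP32 figure follows directly from CU count and clock, so the gap is easy to put a number on. A minimal sketch; 40 CUs at 1172 MHz are the widely reported retail X1X figures, and the dev kit is assumed here to run its 44 CUs at the same clock, which may not be accurate:

```python
# Peak FP32 throughput for a GCN-style GPU:
#   TFLOPS = CUs * 64 shader lanes * 2 ops per clock (FMA) * clock in GHz / 1000
# 40 CUs / 1172 MHz are the reported retail X1X figures; the 44 CU dev kit
# is assumed to run at the same clock (an assumption, not a spec).

def gcn_tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz / 1e6

retail = gcn_tflops(40, 1172)   # ~6.0 TFLOPS
devkit = gcn_tflops(44, 1172)   # ~6.6 TFLOPS if the clock matches
print(f"retail: {retail:.2f} TFLOPS, 44-CU dev kit: {devkit:.2f} TFLOPS "
      f"(+{(devkit / retail - 1) * 100:.0f}% shader throughput)")
```

So the dev kit's extra CUs are worth roughly 10% of shader throughput at matched clocks, which is about the size of the grain of salt being asked for.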


#334  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@tdkmillsy said:

This is F*cked up and you know it.

Everyone can make mistakes have their own ideas and out right make stuff up.

Just because one developer says something that obviously isn't true (or lacks context) doesn't mean other developers are making stuff up and to say it just applies to MS is just WRONG.

Everyone that has got something right has got something wrong at some point. Does this make them liars.

Until you have full facts you have to combine what the professionals are saying (you know those that actually have access or know about the X1X) and in summary they are saying X1X is the best performing console out to work on, but the CPU issue could be a challenge for some games. This is now being proved correct from the limited benchmarks and comparisons we are getting through.

NO is not..

And i can certainly remind you of how much you people fight me over DX12 so call improvements that were oversold since the very first day.

Non of you admit i was right,and i was basically DX12 was MS xbox development tools ported to PC,so basically the real beneficiary was PC not the xbox one let alone the xbox one been the biggest beneficiary and it simply didn't double performance i told you people many times and you simply would not listen.

One? How about Bolcato from Sniper elite team? Claiming ESRAM problems were solve by a new sdk,and that the xbox one would hit 1080p more,in fact as the generation process it was the contrary less and less games were 1080p on xbox one.

XBO has multiple problems.

For example

The 32 MB ESRAM imposes resolution limits, as shown by the green shading in the chart I posted.

The same example showed the new split rendering API, which lets the programmer use both the ESRAM and DDR3 memory pools to fake a single rendering surface, e.g. DDR3 takes the smaller spill-over workload from the 32 MB ESRAM.

This workaround doesn't fix CU-bound issues, which remained unresolved until X1X.

@tormentos said:

There is this thing call hype if you want to believe it or not that is your problem and we all know you lemmings fell for DX12 and the cloud as well,so falling for this one is not really hard,the GPU inside the xbox one is a 6TF Polaris is the same GPU inside the RX580 with some vega improvements,is not an actual Vega GPU and has a 2.3ghz CPU behind it,destiny developer claimed already that they could have go for 60FPS like on PC but the world and AI would need to be smaller which clearly indicate a CPU problem,which i also stated would happen.

Ark Survival is not even a well optimized game,it requires a 1080gtx to just do 1080p over 30FPS come one man a 1080gtx? For Real?

Ill wait until actual comparisons can be made,another thing is that xbox one X development kids have more power than the actual units is not just ram like other dev kits no that dev kit has 44CU vs the XBO X 40 so anything running of a devkit should be taken with a grain of salt as well and before you try to even downplay my argument...

Polaris 10 doesn't have any Render Back End (RBE) improvements; Polaris 10's main focus is compute shader improvements. Polaris DCC improvements occur at the memory controller level. Polaris 10 has a 2MB L2 cache for geometry and the TMUs (with the CUs). For the PS4 Pro, Sony asked AMD to place Vega NCUs inside a Polaris-based GPU.

In addition to the Polaris improvements, X1X has conservative occlusion (for the rasterizer) and a 2 MB render cache (for the RBEs) as improvements.

The improvement missing from Polaris is X1X's graphics pipeline work; there are 60 graphics pipeline improvements in X1X. The Forzatech wet track with heavy alpha effects shows X1X's RBE improvements.

A compute shader's primary read and write units are the TMUs, which are connected to the 2MB L2 cache.

A pixel shader's primary read and write units are the ROPS, which are connected to a tiny cache and the memory controller, which is a bottleneck. X1X's ROPS have a 2MB render cache, which matches the TMUs' 2MB L2 cache size. NVIDIA Paxwell's ROPS have the advantage of direct access to a 2MB/3MB L2 cache.

Understand the reason AMD/Sony push for compute shaders and async compute optimizations, i.e. it's mostly about accessing the 2MB L2 cache.

@tormentos said:

MS did use dev kits that even had Nvidia GPU on them,we know this as fact.

XBO's Forza Motorsport 6 has 3D person NPCs. MS is only one Forza release late, which is not a problem with nearly annual releases.

On ARK Survival's non-4K target: the ARC devs were focusing on "quality pixels" instead of resolution. Any GPU can be burdened with over-committed, bloated workloads, and higher-grade GPUs handle them better than lesser-grade GPUs.

ARK is not based on an unknown game engine, i.e. it's Unreal Engine 4, and to make it worse for you, it's the NVIDIA GameWorks branch of Unreal Engine 4. NVIDIA GameWorks has a very high probability of bogging down GPUs.
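
For reference on the cards being compared back and forth in this exchange, the sketch below just tabulates commonly published reference specs (shader count, boost clock, ROPs, memory bandwidth) and derives theoretical FP32 throughput and pixel fill rate from them. These are spec-sheet peaks, treated here as approximate, and no single column predicts real game performance:

```python
# Commonly cited reference specs for the GPUs argued about in this thread.
# Shader counts, clocks, ROP counts and bandwidth are public spec-sheet
# figures; actual game results depend on far more than any of these peaks.

GPUS = {
    # name:        (shaders, boost MHz, ROPs, memory GB/s)
    "Xbox One X":  (2560, 1172, 32, 326),
    "RX 480":      (2304, 1266, 32, 256),
    "R9 390X":     (2816, 1050, 64, 384),
    "GTX 1070":    (1920, 1683, 64, 256),
}

print(f"{'GPU':<11} {'TFLOPS':>7} {'Gpix/s':>7} {'GB/s':>6}")
for name, (shaders, mhz, rops, bw) in GPUS.items():
    tflops = shaders * 2 * mhz / 1e6   # FP32, 2 ops per clock (FMA)
    fill = rops * mhz / 1e3            # theoretical pixel fill rate
    print(f"{name:<11} {tflops:>7.1f} {fill:>7.1f} {bw:>6}")
```

Read alongside the post above: the X1X's peak FLOPS land between the RX 480 and the GTX 1070 while its memory bandwidth is the highest of the group, which is part of why the same console can plausibly look RX 480-class in one workload and closer to a GTX 1070 in another.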


#335  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

Brad Wardell has nothing to do with X1X and his compute heavy CPU RTS games are NOT on X1X which is different from ARC Survival early access build which is running on actual XBO and X1X hardware.

Games like Alien Isolation needs new API programming model since your POS4 has weak sauce CPUs.

Keep playing stupid you're doing a terrific job.

I don't care if Brad Wardell has to do with the xbox one X or not and THAT IS NOT THE POINT YOU FOOL.

The point is DEVELOPERS LIE FOR MS.

You defended Brad Wardell and he was wrong,in fact still today you try to damage control what he claimed by saying it was about his game,when his game is not on PS4 or xbox one.

Ark developers is a shitty developer which needs a 1080gtx just to hit 1080p,by Ark Survival developer the xbox one X is equivalent to a 1080gtx because that is what you need for Ark at 1080p epic settings,and you will get between 40 and 35 FPS.

This is a badly optimized game it was using a single core it simply is a mess.

Performance wise this game is a mess,so yeah i take that 1070gtx comment with a truck of salt,you just take anything that serve you best and ignore anything that doesn't PROVEN by me already.

Developers do lie to hype consoles and Brad Wardell was a perfect example of that..

You of all persons calling the PS4 a POS4? The PS4 has 3+ years spanking your lovely xbox one,by gaps bigger than 100% and walking all over it not only power wise but sales wise and game wise as well,and you dare call the PS4 POS4..Hahahahaa

But but but you are not a lemming right? Hahahahahahaaa

The real shitt console here is the weakbox one which still here still getting kicked and still be here after the xbox one lands,in fact i bet my account vs yours that the XBO S will be the best selling model of the xbox one this holiday which mean yeah,most xbox fans will get an inferior experience just because the xbox one X launch doesn't make that the xbox one S somehow now can run at 4k native with ultra settings no,any one with an xbox one still get a shitty experience,and any one buying one new still gets the worst experience.

PS. That shitty CPU is also on xbox one X...lol

But but but Ryzen...lol

Did you know the ARK dev has recently confirmed the X1X version will be running the PC's medium settings at a 1080p 60 fps target?

https://www.gamespot.com/articles/ark-dev-talks-xbox-one-x-and-says-sony-wont-allow-/1100-6452662/

On the subject of the Xbox One X's horsepower, Stieglitz said Ark can run at the equivalent of "Medium" or "High" settings on PC. It can run at 1080p/60fps (Medium) or 1440p/30fps (High), and it sounds like developer Studio Wildcard may offer an option to switch between them.

For GTX 1070, medium settings 1080p resolution with 60 fps target starts at https://youtu.be/nIbiUd3l4PQ?t=20


http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ram/index.html

As for the comparisons between the PC and Xbox One X, he said: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499".

ARK Dev has confirmed "If you think about it, it's kind of equivalent to a GTX 1070" for 1080p 60 fps medium settings.


#336 DragonfireXZ95
Member since 2005 • 26716 Posts

Lol, this back and forth is pretty entertaining.


#337 deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

@ronvalencia said:

I'm correct on X1X having ROPS/RBE upgrade based from degraded GTX 1080 Ti at 6.5 TFLOPS and Forzatech wet track results and ARC Survival .

I'm correct on X1X being superior to R9-390X and RX-480 OC based on Forzatech wet track and ARC Survival .

I'm correct on ROTR being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on WOT being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on Killer Instinct being native 4K based on somebody's R9-290X results (reaching 55 fps average) .

I'm correct on Path of Exilebeing native 4K. Devs already claims this.

I'm correct on Gears of War 4 Campaign mode being native 4K based on R9-390X to GeForce 980 TI results. Devs already claims this.

I'm correct on Shadow of War being native 4K based on Shadow of Mordor R9-390X to GeForce 980 TI results.

You're the sad pathetic individual.

Ok, first off, it's Ark, not Arc. You can't claim to be right when you're spelling the game title wrong.

Second, who argued with you against World of Tanks, Killer Instinct, or Path of Exile being 4K? It seems like you are just making up W's in order to look good.

Third, stop saying "Forza wet track." You're turning that into a buzzword. Forza, while beautiful, should not be used as some sort of performance benchmark. Forza 5 and 6 ran at 1080p/60fps on the OG Xbox One hardware. That is not typical of the hardware, as we all know. So let's not make Forza 7 some sort of huge technological feat. We already know it's not running PC "max settings" like you and your friends hyped. It's time you hold yourself accountable.



#338 Jebus213
Member since 2010 • 10056 Posts

@goldenelementxl said:
@ronvalencia said:

I'm correct on X1X having ROPS/RBE upgrade based from degraded GTX 1080 Ti at 6.5 TFLOPS and Forzatech wet track results and ARC Survival .

I'm correct on X1X being superior to R9-390X and RX-480 OC based on Forzatech wet track and ARC Survival .

I'm correct on ROTR being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on WOT being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on Killer Instinct being native 4K based on somebody's R9-290X results (reaching 55 fps average) .

I'm correct on Path of Exilebeing native 4K. Devs already claims this.

I'm correct on Gears of War 4 Campaign mode being native 4K based on R9-390X to GeForce 980 TI results. Devs already claims this.

I'm correct on Shadow of War being native 4K based on Shadow of Mordor R9-390X to GeForce 980 TI results.

You're the sad pathetic individual.

Ok, first off it's Ark not Arc. You can't claim to be right when you're spelling the game title wrong.

Second, who argued with you against World of Tanks, Killer Instinct or Path of Exile being 4K. It seems like you are just making up W's in order to look good.

Third, stop saying "Forza wet track." You're turning that into a buzzword. Forza, while beautiful, should not be used as some sort of performance benchmark. Forza 5 and 6 ran at 1080p/60fps on the OG Xbox One hardware. That is not typical of the hardware as we all know. So lets not make Forza 7 some sort of huge technological feat. We already know it's not running PC "max settings" like you and your friends hyped. It's time you hold your self accountable.

Can't you just ignore the guy already? All he does is spam.


#339 PopGotcha
Member since 2016 • 716 Posts

@DragonfireXZ95 said:

Lol, this back and forth is pretty entertaining.

Is it though? It's just constant spamming of the same charts. I do love how heated they all get, though.


#340  Edited By ronvalencia
Member since 2008 • 29612 Posts

@goldenelementxl said:
@ronvalencia said:

I'm correct on X1X having ROPS/RBE upgrade based from degraded GTX 1080 Ti at 6.5 TFLOPS and Forzatech wet track results and ARC Survival .

I'm correct on X1X being superior to R9-390X and RX-480 OC based on Forzatech wet track and ARC Survival .

I'm correct on ROTR being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on WOT being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on Killer Instinct being native 4K based on somebody's R9-290X results (reaching 55 fps average) .

I'm correct on Path of Exilebeing native 4K. Devs already claims this.

I'm correct on Gears of War 4 Campaign mode being native 4K based on R9-390X to GeForce 980 TI results. Devs already claims this.

I'm correct on Shadow of War being native 4K based on Shadow of Mordor R9-390X to GeForce 980 TI results.

You're the sad pathetic individual.

Ok, first off it's Ark not Arc. You can't claim to be right when you're spelling the game title wrong.

Second, who argued with you against World of Tanks, Killer Instinct or Path of Exile being 4K. It seems like you are just making up W's in order to look good.

Third, stop saying "Forza wet track." You're turning that into a buzzword. Forza, while beautiful, should not be used as some sort of performance benchmark. Forza 5 and 6 ran at 1080p/60fps on the OG Xbox One hardware. That is not typical of the hardware as we all know. So lets not make Forza 7 some sort of huge technological feat. We already know it's not running PC "max settings" like you and your friends hyped. It's time you hold your self accountable.

1. It doesn't materially change the argument.

2. Is that a go-ahead to re-post my benchmark graphs for the R9-290X/R9-390X from 2016? You only joined in late 2016.

It's not geometry-complex, and the R9-290X can handle it just fine. The programmers for Killer Instinct would know AMD's optimization path towards the L2 cache with compute/TMU and reduce the bottlenecks from the ROPS path (which lacks the 1MB L2 cache access).

The R9-290X has a 1MB L2 cache for geometry and the CUs (via TMUs).

I claimed Killer Instinct would be 4K 60 fps on Scorpio.

Do you want me to address your "It seems like you are just making up W's in order to look good" argument for ROTR, WOT, Path of Exile, Gears of War 4 Campaign, Shadow of Mordor (for Shadow of War)?

3. Too bad for you, the outcome for FM7 is very similar to X1X's earlier Forzatech FM6 demo results relative to a stock GTX 1070 (roughly a 6 TFLOPS class card). DF claimed X1X is a GTX 1070/Fury X class card while the GTX 1070 was dropping below 60 fps. I'm not surprised by the GTX 1080 since it's the next card up from the GTX 1070.

Besides the non-reference 1070s, NVIDIA doesn't have a reference SKU between 6.x TFLOPS and 9.x TFLOPS.


#341 DragonfireXZ95
Member since 2005 • 26716 Posts

@popgotcha said:
@DragonfireXZ95 said:

Lol, this back and forth is pretty entertaining.

Is it though? It's just constant spamming of the same charts. I do love how heated they all get but

That's why it's so funny. I started just skimming over posts after a while. Not sure how they could possibly waste so much time on something so trivial. Lol


#342 PinkAnimal
Member since 2017 • 2380 Posts

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your email's spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads.


#343 ronvalencia
Member since 2008 • 29612 Posts

@pinkanimal said:

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your emails spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads

The only nonsense is your blind support for PS4.


#344 DragonfireXZ95
Member since 2005 • 26716 Posts

@ronvalencia said:
@pinkanimal said:

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your emails spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads

The only nonsense is your blind support for PS4.

Lol, taking one single line, cutting off half of the sentence from his post, and trying to use it as ownage against him.

C'mon, dude. He was saying there was only 1 game that was confirmed native 4k. Was he lying at the time? I highly doubt it.


#345  Edited By deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@pinkanimal said:

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your emails spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads

And yet you keep doing it over and over again. I wonder who the idiot is, then.


#346 ronvalencia
Member since 2008 • 29612 Posts

@Jebus213 said:
@goldenelementxl said:
@ronvalencia said:

I'm correct on X1X having ROPS/RBE upgrade based from degraded GTX 1080 Ti at 6.5 TFLOPS and Forzatech wet track results and ARC Survival .

I'm correct on X1X being superior to R9-390X and RX-480 OC based on Forzatech wet track and ARC Survival .

I'm correct on ROTR being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on WOT being native 4K based on my R9-390X results. Devs already claims this.

I'm correct on Killer Instinct being native 4K based on somebody's R9-290X results (reaching 55 fps average) .

I'm correct on Path of Exilebeing native 4K. Devs already claims this.

I'm correct on Gears of War 4 Campaign mode being native 4K based on R9-390X to GeForce 980 TI results. Devs already claims this.

I'm correct on Shadow of War being native 4K based on Shadow of Mordor R9-390X to GeForce 980 TI results.

You're the sad pathetic individual.

Ok, first off it's Ark not Arc. You can't claim to be right when you're spelling the game title wrong.

Second, who argued with you against World of Tanks, Killer Instinct or Path of Exile being 4K. It seems like you are just making up W's in order to look good.

Third, stop saying "Forza wet track." You're turning that into a buzzword. Forza, while beautiful, should not be used as some sort of performance benchmark. Forza 5 and 6 ran at 1080p/60fps on the OG Xbox One hardware. That is not typical of the hardware as we all know. So lets not make Forza 7 some sort of huge technological feat. We already know it's not running PC "max settings" like you and your friends hyped. It's time you hold your self accountable.

Can't you just ignore the guy already? All he does is spam.

Your last post is spam that doesn't address the TC's post.


#347  Edited By ronvalencia
Member since 2008 • 29612 Posts

@DragonfireXZ95 said:
@ronvalencia said:
@pinkanimal said:

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your emails spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads

The only nonsense is your blind support for PS4.

Lol taking one single line and cutting off half of the sentence from his post and trying to use it as ownage against him.

C'mon, dude. He was saying there was only 1 game that was confirmed native 4k. Was he lying at the time? I highly doubt it.

BS (for pinkanimal): WOT was confirmed native 4K by the developer, and there's a 4K video feed for it. It's not my problem that cowabanga couldn't follow DF's DIY instructions on pixel counting. He is already wrong, since the devs claimed Skyrim SE is 4K for both X1X and PS4 Pro, unless cowabanga claims X1X's Skyrim SE is inferior to PS4 Pro's Skyrim SE.

Also, DF couldn't find a non-2160p frame in X1X's Assassin's Creed Origins, i.e. it could be that dynamic resolution didn't kick in during the video feed.

Killer Instinct is an easy 4K 60 fps game for the existing R9-390X. DF has even leaked X1X benchmarks for several games with 4K results.
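
Since DF-style pixel counting keeps being invoked as the arbiter here, the idea behind it is simple enough to sketch: a frame upscaled from a lower native resolution shows fewer stair-steps along a hard, near-diagonal edge than it has display pixels, and the ratio of steps to span estimates the native width. A minimal illustration with made-up numbers, not taken from any actual capture:

```python
# Rough sketch of the "pixel counting" technique referenced above.
# Count the stair-steps on a hard edge over a known horizontal span of the
# displayed frame; the step-to-span ratio estimates the native render width.
# The example numbers below are invented for illustration only.

def estimate_native_width(display_width: int, span_pixels: int, steps_counted: int) -> float:
    """Estimate native horizontal resolution from edge stair-steps."""
    return display_width * steps_counted / span_pixels

# e.g. a 3840-wide frame where a 400-pixel span of an edge shows 300 steps
print(estimate_native_width(3840, 400, 300))   # -> 2880.0, i.e. not native 4K
```

In practice DF samples many edges across many frames, since anti-aliasing, dynamic resolution, and reconstruction techniques all blur the count; a single measurement like the one above proves very little either way.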


#348 PinkAnimal
Member since 2017 • 2380 Posts

@DragonfireXZ95 said:
@ronvalencia said:
@pinkanimal said:

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your emails spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads

The only nonsense is your blind support for PS4.

Lol taking one single line and cutting off half of the sentence from his post and trying to use it as ownage against him.

C'mon, dude. He was saying there was only 1 game that was confirmed native 4k. Was he lying at the time? I highly doubt it.

Indeed. At the time I posted that, only Forza was confirmed, but ronvalencia gets a kick out of lying and spamming, I guess.


#349 PinkAnimal
Member since 2017 • 2380 Posts

@FastRobby said:
@pinkanimal said:

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your emails spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads

And yet you keep doing it over, and over again. I wonder who the idiot is then

It's sort of amusing to point out his nonsense and see him go into fits of spamming


#350 PinkAnimal
Member since 2017 • 2380 Posts

@ronvalencia said:
@DragonfireXZ95 said:
@ronvalencia said:
@pinkanimal said:

@DragonfireXZ95: engaging with ronvalencia is like trying to debate with your emails spam folder. I've never seen a "human being" that can store so much nonsense inside his brain. It's literally GBs of useless charts, information and spam. It won't be long before he starts pasting Viagra ads

The only nonsense is your blind support for PS4.

Lol taking one single line and cutting off half of the sentence from his post and trying to use it as ownage against him.

C'mon, dude. He was saying there was only 1 game that was confirmed native 4k. Was he lying at the time? I highly doubt it.

BS(for pinkanimal), WOT was confirmed native 4K by the developer and there's 4K video feed for it. It's my not problem cowabanga can't follow DF's DIY instructions on pixel counting. He is already wrong since devs claimed Skyrim SE is 4K for both X1X and PS4 Pro, unless cowabanga claims X1X's Skyrim SE is inferior to PS4 Pro's Skyrim SE.

Also, DF couldn't find non-2160p from X1X's Assassin's Creed Origins i.e. it could be dynamic resolution didn't kick in during the video feed.

It wasn't confirmed by a third party. "Confirmed by developers" coming from you is BS, since your poor grasp of the English language makes you interpret developers stating their aim as a flat-out confirmation. Besides, you have hyped games as native 4K before when they were "confirmed" by developers, only to be proven wrong, like with F1 2017. That's why independent confirmation is important, before you hype something ahead of time and look like a fool.