@GoldenElementXL: you got OWNED BRUH
This topic is locked from further discussion.
@GoldenElementXL: Technically speaking, when you say what's your fps, you refer to the "average" fps, which in @GarGx1's case is true: the average is easily 60. You're right about firefights, yeah it will drop to 50, but as I said the "average" is what matters. Even in Digital Foundry analyses they talk about average, because that's how it should be.
About the max out thing, well he said this:
Temporal AA is set at Supersampling, that's as high as it gets.
Shadow Quality is at high, that's as high as it gets
Object detail set to 100% would make as much difference as DoF, none whatsoever.
You're clutching at straws and you know it. bu.. bu what about a fire fight, do you honestly think I'm going to lose over 20 frames per second in a fire fight?
I don't play it maxed out I do cut it back because I like to keep my FPS well over 60.
So it was maxed out.
He plays over 60, he can easily play at average 60 if he wants to.
Yes I am a PC gamer. I am also tired of people claiming they can max every game out at 60fps. And people are just supposed to accept those claims with no question. The problem is I have a better PC than all of these folks and can out bench them all. I know the performance they are getting and know when they are full of shit. Now "steady 60 fps" can mean anything in the low 50s? "Max settings" can include things turned down? Would we call a PS4 game that drops in the low 50s a "locked 60fps" performance? No it would be "LOL Consoles." But every scrub here with a 970 and a 2500K can run everything maxed at 1440p just fine. GTFO.
I love PC gaming but here at System Wars Hermits are getting out of hand.
Thank you, sir, for pointing out that hypocrisy and providing some excellent entertainment in this thread. It's really sad and embarrassing to see how some people squirm and twist instead of just saying "Yeah, I'm not really getting 60 FPS and it's not maxed out either."
The problem is I'm not a blind PC fanboy so I don't add to the delusion. For 1/3 of GarGx's video the frame counter is sub-60 fps, and all we see in the video is the settings screen, a bunch of sky and some car close-ups. 30 seconds of his 1:33 video is sub-60 and I'm supposed to sit here and accept that he wants to call that a steady 60fps. He uses the word "steady" to describe frame rates from 52fps to 88fps. A drop of 4-6 frames occurs every time he fires the weapon in the video and we are supposed to call that steady. Object detail was set to 75% and the word "maxed" is being used to describe it. Sorry but no thanks. I'm not going to accept that and let it go.
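The steady-versus-average distinction being argued here is easy to make concrete. A minimal sketch, using made-up fps samples chosen to resemble the 52-88 range described above (illustrative numbers only, not actual measurements from the video):

```python
# Hypothetical fps readings, loosely modeled on the 52-88 range in dispute.
samples = [88, 75, 62, 60, 58, 55, 52, 70, 66, 64]

average = sum(samples) / len(samples)                      # 65.0 fps
minimum = min(samples)                                     # 52 fps floor
share_below_60 = sum(1 for s in samples if s < 60) / len(samples)

print(f"average: {average:.1f} fps")            # clears 60 comfortably...
print(f"minimum: {minimum} fps")                # ...while the floor is 52
print(f"share of samples under 60: {share_below_60:.0%}")  # 30%
```

An average comfortably above 60 can coexist with a 52 fps floor and roughly a third of the run under 60, which is exactly the disagreement in this thread: "averaging 60" and "steady 60" are different claims.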
lowered object detail is just the beginning of how far below maxed he is
Well I'm going to play on PS4 because of Eroica, Getyeryayas and Animal Mother. Maybe double dip when it goes on sale in the future and play with us. If it's like Destiny in game structure, we'll be playing for awhile.
Steady 60fps were his words, not mine. His Object detail and Shadow Quality were not max. He used the word max to describe near max settings. He plays over 60 but also plays under 60. So technically speaking, he isn't playing a steady 60 or Max settings.
I guess we can make a new classification. GarGx is averaging 60fps and averaging max settings at 1440p. I will compromise there, but no more. The word "steady" is not to be used to describe his frame rate and the word "max" must have the word "averaging" in front when describing settings. Anything else is factually false.
@Heil68: Looking forward to a special podcast about The Division :D
You should join us! We talked a little about it last time, but if I recall nobody was in the beta at that time. I downloaded last night so I'm going to try it out today.
Definitely want to, I'm a little bit busy right now and trying to catch up with your podcasts.
Everyone can make time for an hour on Thursday nights! Let's do it! :D
Next Thursday I'm in! Since this Thursday I might not be home, PM me the details.
Nobody is running this maxed out at any res over 1080p at 60 fps. I don't know why anybody is still arguing.
I maintain around 50 or so fps most of the time at 1440p and SLI reportedly doesn't work. A properly overclocked 980 Ti should achieve 60fps at 1440p. That's about 20% faster than what I get and I've seen some monster 980 Ti's beating my 980 by almost 50%.
I also maintain 60fps at max settings at 1920x1440 which is above 1080p.
I never claimed I was, stop putting words in my mouth. I said that once drivers hit, Crossfire is supported and the game is further optimized, I no doubt will be, especially if it scales like every other Ubisoft game, which is 1.9x to 2x...
With one card, no game drivers, and further optimizations to be done I'm averaging about 35 FPS at 1440p MAXED, it's in beta... It's not a complicated math equation to figure out what the results will be once the former things are applied...
I say there is no fairytale hardware out there hitting a steady 60 fps at 1440p maxed. You say you are going to "destroy" me. That doesn't seem like I'm putting words in your mouth. And your video literally shows what I said above.
Don't take my posts out of context you scaly little snake... Let's look at what was actually said...
Those saying they are playing the game maxed at 1440p and 60fps are full of shit.
That is all.
Not really, I get about 35 - 40 FPS and my Crossfire isn't even functioning, there's no in game support for it yet, I'll no doubt be over 60 when the support is added.
@nyadc: Check the NeoGaf forum or YouTube if you don't believe me. Maybe I will post some pics too. I'm getting mid 40s-50s with shadow quality and AA turned down at 1440p with a 5960X at 4.2 and 2 Titan Blacks. (I know SLI isn't working right). Over at Gaf, people with a 980 Ti and X99 CPUs are hitting very similar frames to me. Using the game's recommended settings I get almost the same performance. There is no fairytale hardware out there hitting a steady 60 at 1440p maxed. It doesn't exist and don't let anyone tell you otherwise. There is no evidence anywhere that supports that.
Yeah, give me about 30 minutes, I'm going to destroy you.
I never claimed I was getting 60 FPS currently or that anyone else is, but like an idiot you're taking performance figures from an unfinished game with no drivers, no current multi-gpu support and further optimizations to be done as some kind of benchmark for the capabilities of hardware.
"There is no fairytale hardware out there hitting a steady 60 at 1440p maxed. It doesn't exist and don't let anyone tell you otherwise. There is no evidence anywhere that supports that."
Like I said, not currently, but that's because there are a bunch of equations that can't be factored in at the moment. The hardware capable of 1440p 60+fps at maximum settings exists, I own it, @GarGx1 owns it, we own this "fairytale" hardware currently that you're saying doesn't exist... What are you trying to imply with any of this? Who gives a flying **** if it can't be run at 60 FPS right now? The game isn't done, it's in beta, the game doesn't have any fucking drivers, the game has no multi-gpu support currently and it's not fully optimized yet...
What are you trying to accomplish here, because frankly you're coming off as a moron...
what did you need 30 minutes for?
a highly overclocked 980ti gets 40 to 55 fps at 1440p maxed
@nyadc i love how you just assume that the game will have perfect xfire scaling and that performance in general will improve from now to march
There are people in this thread claiming to get a STEADY 60fps RIGHT NOW on this unfinished beta, with no drivers and no current multi-GPU support, at 1440p at max settings. That was the point of my post. Why are you trying to prove a hypothetical? That is pointless. I didn't take aim at you until you said you would "destroy me" when I made a factual statement that your video further proved. Are there members at NeoGaf claiming to max this beta out at a STEADY 60 fps? Nope. So why does it happen here? Hermits have had free rein for long enough. They got away with this stuff because the majority of this board is console gamers who can't prove them wrong. Well, I've had a top 3 PC over on the PC forum for a couple of years now. I've been on the FireStrike Hall of Fame. I know when a PC gamer is full of shit because it just so happens that I am a PC enthusiast myself.
It's a new day Herms. Max is going to mean Max and Steady is gonna mean Steady. No more potatoes maxing every game at 60fps bullshit.
Oh, and what happens when you can't max it after release either?
@GoldenElementXL Lies like that happen everywhere. I'm a member of another enthusiast website and I'm always surprised when some members claim they max The Witcher 3 at 1080p and maintain 60fps with a 970. Mine didn't come close to that. Foliage Visible Range absolutely killed my fps, but once I turned that down to High/Medium and switched Hairworks off, 60fps was most definitely doable, though we were no longer talking max settings any more.
Used to be the same when I had a 670 and some guys were claiming frame rates 30% faster than my OC'd Asus DirectCU II with a reference OC'd model. I damn near returned it until I realized they were all full of shit.
Edit: Also, I wouldn't call your PC top 3 any more. In most games I'll beat you. You'll beat me in benchmarks mostly because of your ridiculous Physics/CPU score thanks to your 5960X. Unless of course you OC'd your Titan Blacks.
@m3dude1: And you pulled that from where? SLI doesn't work for that game and 40-55 is what I get with two 980's, but only one is actually doing anything.
A single 980 Ti should demolish that.
you dont get 40 to 55 fps at 1440p maxed on your 980
Lol you wanna bet? You don't even own one so piss off fanboy.
A steady 60 FPS would imply that one needs a framerate threshold much higher than 60 to never go below it, likely an average of 80 FPS. When people talk about GPU performance and benchmarks they generally post their average, so that's likely what he meant.
I got about 50 FPS average in Far Cry 4 @ 1440p on Ultra before Crossfire support was introduced; my average is about 90 now after the fact... Why are you trying to base things on where this game sits in a completely unfinished state and use that as some kind of hermit-bashing propaganda? You sound dumb because of it... Look at Clyde's computer: he has two 980 Ti's, and once drivers, SLI and final optimizations are done, his average frame rate will likely be over 100 FPS...
What is your motivation here? Like, what are you trying to accomplish? It seems like you're trying to make a V8 car drag race with 3 coil wires undone so it's only firing on 5 cylinders, and then using whatever time that car runs as the peak of its capability... It's shortsighted and completely idiotic...
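The projection being made here is simple arithmetic, sketched below with the Far Cry 4 figures quoted in the post. Note that the ~1.9x-2x scaling is the poster's claim about Ubisoft engines, not an established fact, and whether The Division will scale the same way is exactly what's in dispute:

```python
# Back-of-the-envelope multi-GPU projection using the numbers quoted above.
single_card_avg = 50.0   # fps, Far Cry 4 @ 1440p Ultra, one card
crossfire_avg = 90.0     # fps, same setup after Crossfire support landed

observed_scaling = crossfire_avg / single_card_avg   # 1.8x observed here

# Applying that observed factor to The Division beta figure quoted earlier:
division_single = 35.0                               # fps, 1440p maxed, one card
projected = division_single * observed_scaling       # 63 fps

print(f"observed scaling: {observed_scaling:.1f}x")
print(f"projected Crossfire average: {projected:.0f} fps")
```

The observed Far Cry 4 factor is 1.8x, slightly under the claimed 1.9x-2x, and the projection only holds if the final game scales similarly; that assumption is the whole argument.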
Essentially every Ubisoft game scales with Crossfire at 1.9x to 2x, it's an educated assumption based upon their existing game engines...
So why is yours scaling less than 1.9x-2x then?
yeah i do wanna bet actually. upload a youtube video. i dont care about your screens which prove nothing
@nyadc hows that AC Syndicate 1.9x scaling? oh wait
"about 50 FPS average"
"about 90 now after the fact"
I also said "essentially"; across the board that is what you can typically expect. The Crossfire support for their games is very well fleshed out, I don't know about SLI.
Lol, when presented with evidence you're just gonna ignore it? It's like when I thoroughly owned you in the 7850 vs PS4 thread and you kept moving the goalposts. Settings are maxed. Frame rate mostly hovers around 50. Lowest dip I had was around 43.
Then you lost the bet. Look at the post above yours.
waiting on that video. make sure its outside in the city and try engaging some hostiles. not hard to cherry pick screens, especially when theres no way to even prove you didnt lower the settings for them
@nyadc how about your b4 and after AC syndicate crossfire numbers?
I know you, even if I post a video you're gonna find an excuse. I had posted a video of TW3 running better on my 7850 than on PS4 and you still came up with lame excuses.
I don't play Assassin's Creed, the last one I bought was III for PC and Unity because it was $10 for Xbox One.
nah you just know you cant produce a video because you are full of shit
this isnt even fully maxed, its just the ultra preset and the numbers speak for themselves. heres another one
http://www.pcgameshardware.de/The-Division-Spiel-37399/Specials/Beta-Benchmark-Test-1184726/
again, not fully maxed and the 980 is the asus model which runs an ingame boost clock of just over 1400 mhz. its too easy to embarrass hermits. they are dumb as hell.
@m3dude1: Even I will come out and say that benchmark doesn't represent the real world in any way. First off, SLI doesn't work, so how they got those gains is suspect. Second, there is plenty of video and visual evidence here in this thread that proves performance isn't that bad. A 980 Ti at a 48fps average? Now that certainly wouldn't be a steady 60fps.
That bench is bogus.
its not bogus, the single card numbers are extremely inline with general performance. they dont go find some high performing and barren street and claim thats their overall performance is all. heres the scene they use for testing.
both websites are extremely legit and have been around a long time.
YouTube is uploading my video, which will be live in 20 minutes. I even did as you say and took a stroll outside and got into a firefight (couldn't find a lot, the world is freakin' barren). Shadowplay drops my fps by a good 3-5 as well, yet I still maintained 40-55 with a single OC'd Superclocked 980. Add at minimum 3fps and at maximum 5-6fps to whatever I recorded. Even without adding these numbers, I still stay at 40-55. With them, I'm more in the 50-60 range.
Edit: Also, that bench is from almost a month ago. A new set of drivers(361.91) came out last week or two weeks prior. Can't remember.
@m3dude1 Here you go. Pwned.
Not outright calling you a liar, but I kind of am, why is the source footage here 1080p and not 1440p? I'm fully aware that you could manipulate what you're doing here to make it look like it's at 1440p by simply setting it to 2560x1440 in your options but not actually applying it.
All you would have to do is have the game set to 1920x1080, go into the options menu, change it to 2560x1440 and then navigate back to the main option selection screen without actually applying that resolution which is exactly where your video starts.
The 980 is not very much faster than a 290X, in most instances they're actually within just a few frames of each other, yet you expect me to believe you're getting 30 FPS more at the same settings, also factoring in the Shadowplay FPS cut? Even with it being overclocked that's just not something I buy, I'm calling bullshit on this...
I think you're trying to pull a fast one on us...
its most likely just SLI(i would hope even a hermit wouldnt be so pathetic to do what you just described). i honestly doubt he would even know how to disable sli if he wanted to. but yeah, amd stomps nvidia in this game. your stock 290x is faster than a stock 980.
the frames hes claiming to get are higher than an overclocked titan x
Oh no, I think it's exactly what I said, this is Destructoid's analysis of this build of the game at 1080p... Does any of this sound or look familiar to the above?
sad even by his standards if true
Recorded with Shadowplay, which I believe supports 1080p max. Worst it gets is dips in the mid 30's in really intense moments, but otherwise it stays around 40-55. 55 is mostly when nothing is going on. As GoldenElementXL pointed out, SLI doesn't work and I disabled it anyway. With or without it makes no difference and my second GPU has a usage of 0-5%. A friend I was playing with told me he was getting 60fps at Ultra on his OC'd R9 390X at 1080p, so an average of around 48 is entirely doable on an OC'd 980 at 1440p.
I got proof. If you guys ain't happy it's not really my problem. Now you're all making excuses and using benchmarks from over a month ago.
@m3dude1 You've been owned. Why bother asking me for proof if you were going to find excuses anyway? I already pointed out you would find lame reasons to discredit my video even if I uploaded it. I showed you screenshots and you kept arguing, I posted a video and you keep on arguing. You could sit at my PC and try it yourself and you'd still argue. Also lol @ me not knowing how to disable SLI. I know computer hardware far better than your stupid ass ever will.
That's not proof of anything, you did exactly what I said, and there's great credence to believe so as people with the same hardware are reporting roughly the same performance figures at 1080p that you're trying to purport to us at 1440p...
I'm using benchmarks from a month ago? There's no excuses, you thought you were slick and got caught in a lie... Destructoid's numbers essentially identically mirror what you're telling us here with essentially identical hardware but you want us to believe that you're hitting those same numbers at a 77% higher resolution?
This is from yesterday by the way...
http://www.destructoid.com/the-division-s-open-beta-is-fantastic-on-pc-343227.phtml
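The "77% higher resolution" figure quoted above is a pixel-count comparison between 2560x1440 and 1920x1080; a quick check (it comes out to about 77.8%, so the post rounds down):

```python
# Pixel-count check for the "77% higher resolution" claim.
pixels_1440p = 2560 * 1440    # 3,686,400 pixels
pixels_1080p = 1920 * 1080    # 2,073,600 pixels

ratio = pixels_1440p / pixels_1080p   # 16/9, about 1.778

print(f"{ratio:.3f}x the pixels ({ratio - 1:.1%} more)")
```

Same aspect ratio, 4/3 more pixels along each axis, so (4/3)^2 = 16/9 ≈ 1.78x the total pixels to render.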