Poll Are you a member of the GTX 900 club? (146 votes)
Not gonna lie, SW. The new nvidia gpus are pretty sweet. As much as I love console, these GPUs are in a league of their own. Do you have one? Plan on getting one? What is your setup?
Waiting on the next line to upgrade my 670
same
my 670 still handles most things well
All that bitching about my PC and playing games in 900p and you only have a GTX 670.
I don't know you anymore, rib.
We used to be close.
but i play games (like Far Cry 3) at 1080p
WHATEVER!!!
We had something special....but knowing I have a better GPU than you.....it....it just makes things so emotional.
Went green for the first time and got a single 970 in January. It's a fantastic card already and will be even better once DX12 comes out, with its combined-memory capabilities.
The GTX 970 really is a nice upgrade from my Radeon HD 6770
It's a nice card; shame about that memory partitioning though. That shattered a lot of people's relationship with Nvidia and greatly tarnished their image.
That's what you get for buying the cheaper card. All the *70 series have been gimped in some way.
My GTX 560Ti laughs at the 900 series.
I'm sorry, I used to have that card; it doesn't cut it for anything post-2013.
rofl
My GPU upgrade will likely be an R9 290X, or I'll see what AMD releases later in the year.
The 200 series is a waste of money at this point, especially with the 300 series about to appear and DX12 around the corner. Getting anything but a DX12-ready GPU is not a good plan.
nah, I don't want to be ripped off. Nvidia sucks; they should refund everyone who got a 970.
AMD 300 series will pwn it
I have not had one issue with the 970. Also, quite a few people overestimated the 970's abilities to begin with, and running excessive resolutions and/or settings added to the segmented-memory blaze. The preview benches show the 970 only 60% behind the 390X at 4K on average, and below 1600p the 390X is only 30% faster on average. But the 390X is rumored to be a $700 GPU. So to say in general that the 300 series will pwn the 970 is flat-out wrong, since chances are the 370 and/or 380 will be in the same ballpark.
Should have seen /g/ when the Titan X/390X benchmarks got leaked a few days ago. The AMD shills were putting in overtime.
nah i dont want to be ripped off. Nvidia sucks, they should refund everyone who got a 970.
amd 300 series will pwn it
I have not had one issue with the 970,. Also quite a few of the people over estimated the 970's abilities to begin with and them using over excessive resolutions and or settings added to the segmented memory blaze. saying the 300's will pwn is abit too fanboyish since the preview benchs show 970 only 60% behind 390x at 4k on avearge, at 1600p its only 30% faster. But 390x is rumored to be a $700 gpu.
Should of seen /g/ when the Titan X/390X benchmarks got leaked a few days ago. The AMD shills were putting in over time.
I bet, but I wonder if anyone pointed out that the 1st-gen HBM 4GB limitation will hurt the 390X in 4K once you pass that 4GB vs the Titan X, and possibly allow the rumored 6GB Nvidia card to catch up.
The 390X is supposed to have 8GB of VRAM now. It's supposed to be able to smack a Titan X around at $700 in 4K benchmarks. It's supposed to use less power and produce less heat. It was also "supposed" to be announced this month....
This is the issue though. Originally we all thought it was only gimped on CUDA cores, and at that, who would buy the 980? But then we found out everything else was false: its memory bandwidth was a lie, its ROP count was a lie, its L2 cache was a lie. This info makes it very different from the 980. Personally, my dilemma was two 970's vs one 980; based on the original specs they seemed so close. Now that people are learning the true differences, it should be taken care of. I say refund, or a pay-the-difference upgrade to the 980 or Titan X.
Personally, I will still vote green team, mainly because 3ds Max is pretty much reliant on Nvidia, but also because in gaming I have had much better driver support. My history is SLI GT 7900's, SLI 8800 GTS 512, Crossfire 5870, GTX 680, GTX 970 SLI (sold both to upgrade) and now SLI GTX 980's. (I never had problems with SLI; as far as drivers go, AMD was the worst. I couldn't play Skyrim, as Crossfire would make the sky flash seizures of death, and disabling it to a single card would cause an instant crash on launch. I literally had to remove the second card to get it to work.) I have gone back to Nvidia since, and even with the 970 blunder I will still stick with them. Again, mainly for 3ds Max support and SLI drivers.
Well, the most recent leaked benchmarks state it's only 4GB, not 8GB. They also show the 390X peaking at 289W in Metro LL. We will have to wait and see, but according to the specs and facts about 1st-gen HBM the limit is 4GB, so they would have to bridge in 4GB of GDDR5 to make it an 8GB card.
Nvidia don't owe you shit, brah. This whole "Ramgate" is just a means of trying to get money out of Nvidia by a few butthurt owners and AMD fanboys. The card still works, doesn't it? As it was found out, it's when you start pushing serious resolutions that you get problems. The 970 was never designed for 4K...
Must be different from the ones I saw; they were from Chiphell. 293W in LL. Let me go see what power draw the Titan X has in that game. What res?
My gpu upgrade likely to be a r9 290x, or see what amd release later in the year.
The 200 series is a waste of money at this point especially when 300 series will appear and DX12 being around the corner. To get anything but a DX12 ready gpu is not a good plan.
Agreed. Even if my preference is Nvidia, I would rather see you get a 300-series AMD card. The 200 series is old; now, if you get one really cheap then I understand, but I would not pay full retail when the 300 series is right around the corner.
Even games at 1080p are beginning to have extreme graphics-memory requirements to enable or push certain features. The way things are going, the GTX 970 will be a 1080p bottleneck within a year, and a 1440p bottleneck sooner. That's not OK, and no one is blowing anything out of proportion.
And yet I didn't mention a single thing about 4K... Read, brah!!! My issue was with 3ds Max and future gaming performance. As of TODAY, there is an issue with 3ds Max only being able to store 3.5GB of scenery. Nvidia did lie about the specs; that is flat-out false advertising. They could have saved me a TON OF HASSLE by just saying it up front, as I would have gone the 980 route.
256W for the Titan X. Res was 1600p.
I see.
http://wccftech.com/amd-radeon-r9-390x-directx-12-tier-3-implementation-wce-water-cooled-edition-inbound/
The 390X is going to ship with an 8GB variant.
yep, I'll be waiting for the new AMD cards
Bruh, I've been working in Max for about 8 years now, everything from game assets to high-res film assets. I have no idea what you could possibly be doing in Max that's killing your card; heavy scenes typically eat up RAM, not your GPU lol. If you're doing taxing work in Max you should be on a Quadro anyway, not a gaming card.
I'll admit I switched to Modo about a year ago, as it's a far more sophisticated modeler with a much cleaner interface, but I highly doubt much has changed on the Max end in the past year.
I think I'm going to wait for the 490X. Currently running two 290X's, and I don't see the need to move to what will likely be a 30-40% increase over one of my cards when my two will likely exceed it by 60-70%.
This could also potentially mean that we might be seeing both 4GB HBM and 8GB HBM variants.
8GB will most likely come later, when 2nd-gen HBM comes of age.
Seems quite strange that they would brag about having an 8GB variant that won't be available at release.
Ever heard of iray? It requires everything to be stuffed into the VRAM of the GPU. And unless you're a 100% pro, a Quadro is way too expensive. All my teachers even used GeForce, and many on Polycount say GeForce too, as Quadro isn't worth the asking price. Also, with a Quadro I can't game, so I'm looking for a best-of-both solution.
If it requires double precision then get an original Titan.
Yes, of course; I believe it's been in Max since 2010. And I was working out scenes for big-budget films on a 2GB Quadro 4000 lol.
Anyways, that does suck for you, but with so many better rendering solutions out there it isn't really that big of an issue, is it? And if you must use it, why go with a gaming card? A budget gaming card, no less...
Edit: Ok, just saw your edit. Well yeah, you're not going to get the best of both worlds with a Quadro; they suck for gaming. Also, don't take the guys at Polycount as scripture. Most of them are only working on games, in which case a GeForce would be the way to go, and while there are plenty of brilliant artists on the site, they're not going to give you the best hardware advice.
I have a secret: I have the Nvidia dash suit, which allows me to overclock the shit out of it, bringing performance up to 2006 domination. LOL.
Lol, remember when Hermits thought a GTX 670/680 could last 5-6 years lollolol
And yet both cards you mentioned would still be rolling fine.
I just upgraded from my old GTX 460. It was still pumping out what I needed 5 years in. I wanted to drop down to a mini-ITX build for simplicity in my game room/living room, so I picked up a 960 ITX card.
They can. Just not everything will be at Ultra.
But wouldn't you be a little upset about Nvidia's lies? I'm much happier, as I can fit a full 4GB into a 980. As a gamer and a student of 3D, the 970 and 980 seemed very good. I originally got a second 970 for gaming, but as 3ds Max doesn't rely on SLI, only one would have done it. Then I found out any scene bigger than 3.5GB gives me an error, and at the time I didn't understand why. I ended up solving my problem by selling them and buying 980 cards while I still worked at Best Buy on discount. I made out in the end. I would love to buy a Titan X for its 12GB of RAM, but I'm past the return policy, so I have just accepted this and will live with it. Now I'm getting what I paid for, as well as max performance in games.
As someone on a budget, I relied on the specs Nvidia gave. Had I an actual job in the industry rather than just starting out, and could afford a Quadro and then a separate gaming machine, I wouldn't be ticked, but I'm still a fresh-out-of-college student trying to build a portfolio. But while we're on the subject, do you know any good tutorials on learning to build trees? This is something I have been spending lots of time on, but I keep ending up with results like Forza 5. I have looked hard at the trees in Inquisition and see roughly how they're put together, but my attempts haven't turned out quite as good. You seem to be knowledgeable about 3ds Max, and I would appreciate any suggestions you may have.
My 4GB MSI OC GTX 770 is doing fine with Far Cry 4. So, not yet. I normally alternate between AMD and Nvidia. I broke that rule with the GTX 770 (from a GTX 560 Ti). My next card will be AMD. Gotta get back to my old habits.
Haha, yeah, I just saw the Max comment and jumped in, but you're right. Nvidia did pull some BS, I'm not defending that. I have a 970 and for my current needs it's good enough. I'm pleased with what I got for what I spent, but that doesn't excuse Nvidia.
Also didn't know you were still a student, so yes I fully understand that man. Rough times on the budget lol.
Trees. Yes, always a bitch haha. Of course there is SpeedTree which is used heavily in games and vfx. From what I understand, it's the industry standard. Never used it myself, but it looks beast.
In Max though? Sorry, but you're either going to have to model them from scratch, or dick around with the AEC extended trees, convert them to editable poly, then get in there and refine them yourself.
Last time I modeled a tree, I created the high res in Zbrush. Using Zspheres in conjunction with Fibermesh, you can quickly develop a range of different looks. It's really a great workflow for getting quick original pieces. Of course, the bitch of it is that you have to retopologize everything if you want to render it externally. And that takes some time. Best way to do it is to simply retopologize the actual wood part and then create a few different leaf cards with opacity, diff, spec, norm and scatter them about.
I know it's not the answer you're looking for man, but trees are just tough. Without the aid of something like SpeedTree, you just have to put in the work. It sucks.
Are you shooting to specialize in environmental art? Seems like a good time to get in, I'm seeing more and more job listings for environment artists all the time.
Edit: As for the Zbrush workflow I mentioned (if that's an option for you), you could just duplicate your mesh in the subtool stack and run Zremesh on it. It's not a perfect retopo, but it gets the job done and is fine for static meshes. Then you can reproject and bake normals in Zbrush, or just export both the high and low to Xnormal or Max and do your baking there. The only issue is that with a computer generated topology, your UV map may not be as uniform as you'd like. It's not an issue if you're texturing in a 3D paint program like Mari or Mudbox, but god help you if you're texturing in Photoshop.
Edit2: Also we're wrecking this thread lol. Feel free to PM me about 3d stuff any time though broseph.
My 970 is perfect for me since my monitor's native res is 1920x1200 and I don't use dual monitor setups or have a higher-than-60 refresh rate monitor. It absolutely was a massive upgrade over a 670. But if I ever want to go beyond 1080-1200p I'm restricted by the 3.5GB bullshit.
@inb4uall: I wouldn't count on it. Next year we will probably see more Maxwell cards, since Nvidia went all the way to develop the GM200 architecture and doesn't plan to use it only for the Titan and a couple of Quadro cards.
The next big GPUs (1080??) and the smaller ones will probably use either 22nm or 20nm GM200 (Maxwell), but with a 256-bit bus width and less VRAM.