Are you a member of the GTX 900 club?

SolidGame_basic

Poll Are you a member of the GTX 900 club? (146 votes)

GTX 980 5%
GTX 970 40%
GTX 960 3%
No, but I plan on getting it. 13%
Will not be getting it 38%

Not gonna lie, SW. The new Nvidia GPUs are pretty sweet. As much as I love consoles, these GPUs are in a league of their own. Do you have one? Plan on getting one? What's your setup?


#51 gameofthering
Member since 2004 • 11286 Posts

Waiting until Batman: Arkham Knight before I upgrade my PC.


#52 Mr_Huggles_dog
Member since 2014 • 7805 Posts

@lostrib said:

@MonsieurX said:

Waiting on the next line to upgrade my 670

same

my 670 still handles most things well

All that bitching about my PC and playing games in 900p and you only have a GTX 670.

I don't know you anymore, rib.

We used to be close.


#53 lostrib
Member since 2009 • 49999 Posts

@mr_huggles_dog said:

@lostrib said:

@MonsieurX said:

Waiting on the next line to upgrade my 670

same

my 670 still handles most things well

All that bitching about my PC and playing games in 900p and you only have a GTX 670.

I don't know you anymore, rib.

We used to be close.

But I play games (like Far Cry 3) at 1080p.


#54  Edited By Mr_Huggles_dog
Member since 2014 • 7805 Posts

@lostrib said:

@mr_huggles_dog said:

All that bitching about my PC and playing games in 900p and you only have a GTX 670.

I don't know you anymore, rib.

We used to be close.

But I play games (like Far Cry 3) at 1080p.

WHATEVER!!!

We had something special... but knowing I have a better GPU than you... it... it just makes things so emotional.


#55 lostrib
Member since 2009 • 49999 Posts

@mr_huggles_dog said:

@lostrib said:

@mr_huggles_dog said:

All that bitching about my PC and playing games in 900p and you only have a GTX 670.

I don't know you anymore, rib.

We used to be close.

But I play games (like Far Cry 3) at 1080p.

WHATEVER!!!

We had something special... but knowing I have a better GPU than you... it... it just makes things so emotional.


#56 Zethrickk382
Member since 2013 • 480 Posts

@osirisx3: "Nvidia sucks"

Couldn't agree more.


#57 NyaDC
Member since 2014 • 8006 Posts

Nope, two 290X's in my rig.


#59 LegatoSkyheart
Member since 2009 • 29733 Posts

The GTX 970 really is a nice upgrade from my Radeon HD 6770.


#60 NyaDC
Member since 2014 • 8006 Posts

@LegatoSkyheart said:

The GTX 970 really is a nice upgrade from my Radeon HD 6770.

It's a nice card. Shame about that memory partitioning, though; that shattered a lot of people's relationship with Nvidia and greatly tarnished their image.


#61 whalefish82
Member since 2013 • 511 Posts

Went green for the first time and got a single 970 in January. It's a fantastic card already and will be even better once DX12 comes out, with its combined memory capabilities.


#62 clyde46
Member since 2005 • 49061 Posts

@nyadc said:

@LegatoSkyheart said:

The GTX 970 really is a nice upgrade from my Radeon HD 6770.

It's a nice card. Shame about that memory partitioning, though; that shattered a lot of people's relationship with Nvidia and greatly tarnished their image.

That's what you get for buying the cheaper card. All the *70 series have been gimped in some way.


#63  Edited By ConanTheStoner
Member since 2011 • 23838 Posts

3.5 gb mister rice here.


#64 parkurtommo
Member since 2009 • 28295 Posts

@FoxbatAlpha said:

My GTX 560Ti laughs at the 900 series.

I'm sorry, I used to have that card; it doesn't cut it for anything post-2013.


#65 SolidGame_basic  Online
Member since 2003 • 47717 Posts

@parkurtommo said:

@FoxbatAlpha said:

My GTX 560Ti laughs at the 900 series.

I'm sorry, I used to have that card; it doesn't cut it for anything post-2013.

rofl


#66 04dcarraher
Member since 2004 • 23859 Posts

@thehig1 said:

My GPU upgrade is likely to be an R9 290X, or I'll see what AMD release later in the year.

The 200 series is a waste of money at this point, especially with the 300 series about to appear and DX12 around the corner. Getting anything but a DX12-ready GPU is not a good plan.


#67  Edited By 04dcarraher
Member since 2004 • 23859 Posts

@osirisx3 said:

Nah, I don't want to be ripped off. Nvidia sucks; they should refund everyone who got a 970.

AMD 300 series will pwn it.

I have not had one issue with the 970. Also, quite a few people overestimated the 970's abilities to begin with, and their use of excessive resolutions and/or settings added to the segmented-memory blaze. The preview benchmarks show the 970 only 60% behind the 390X at 4K on average; below 1600p it's only 30% faster on average. But the 390X is rumored to be a $700 GPU. So to say in general that the 300 series will pwn the 970 is flat-out wrong, since chances are the 370 and/or 380 will be in the same ballpark.


#68 clyde46
Member since 2005 • 49061 Posts

@04dcarraher said:

@osirisx3 said:

Nah, I don't want to be ripped off. Nvidia sucks; they should refund everyone who got a 970.

AMD 300 series will pwn it.

I have not had one issue with the 970. Also, quite a few people overestimated the 970's abilities to begin with, and their use of excessive resolutions and/or settings added to the segmented-memory blaze. Saying the 300s will pwn is a bit too fanboyish, since the preview benchmarks show the 970 only 60% behind the 390X at 4K on average; at 1600p it's only 30% faster. But the 390X is rumored to be a $700 GPU.

Should have seen /g/ when the Titan X/390X benchmarks got leaked a few days ago. The AMD shills were putting in overtime.


#69 Stevo_the_gamer  Moderator  Online
Member since 2004 • 50237 Posts

I have an MSI Gaming GTX 970. It's pretty sweet.


#70 04dcarraher
Member since 2004 • 23859 Posts

@clyde46 said:

@04dcarraher said:

@osirisx3 said:

Nah, I don't want to be ripped off. Nvidia sucks; they should refund everyone who got a 970.

AMD 300 series will pwn it.

I have not had one issue with the 970. Also, quite a few people overestimated the 970's abilities to begin with, and their use of excessive resolutions and/or settings added to the segmented-memory blaze. Saying the 300s will pwn is a bit too fanboyish, since the preview benchmarks show the 970 only 60% behind the 390X at 4K on average; at 1600p it's only 30% faster. But the 390X is rumored to be a $700 GPU.

Should have seen /g/ when the Titan X/390X benchmarks got leaked a few days ago. The AMD shills were putting in overtime.

I bet, but I wonder if anyone pointed out that the 1st-gen HBM 4GB limitation will hurt the 390X at 4K once you get past that 4GB versus the Titan X, and possibly allow the rumored 6GB Nvidia card to catch up.


#71 clyde46
Member since 2005 • 49061 Posts

@04dcarraher said:

@clyde46 said:

@04dcarraher said:

@osirisx3 said:

Nah, I don't want to be ripped off. Nvidia sucks; they should refund everyone who got a 970.

AMD 300 series will pwn it.

I have not had one issue with the 970. Also, quite a few people overestimated the 970's abilities to begin with, and their use of excessive resolutions and/or settings added to the segmented-memory blaze. Saying the 300s will pwn is a bit too fanboyish, since the preview benchmarks show the 970 only 60% behind the 390X at 4K on average; at 1600p it's only 30% faster. But the 390X is rumored to be a $700 GPU.

Should have seen /g/ when the Titan X/390X benchmarks got leaked a few days ago. The AMD shills were putting in overtime.

I bet, but I wonder if anyone pointed out that the 1st-gen HBM 4GB limitation will hurt the 390X at 4K once you get past that 4GB versus the Titan X, and possibly allow the rumored 6GB Nvidia card to catch up.

The 390X is supposed to have 8GB of VRAM now. It's supposed to be able to smack a Titan X around at $700 in 4K benchmarks. It's supposed to use less power and produce less heat. It was also "supposed" to be announced this month...


#72  Edited By deactivated-57d307c5efcda
Member since 2009 • 1302 Posts

@clyde46 said:

@nyadc said:

@LegatoSkyheart said:

The GTX 970 really is a nice upgrade from my Radeon HD 6770.

It's a nice card. Shame about that memory partitioning, though; that shattered a lot of people's relationship with Nvidia and greatly tarnished their image.

That's what you get for buying the cheaper card. All the *70 series have been gimped in some way.

This is the issue, though. Originally we all thought it was only gimped on CUDA cores, and at that, who would buy the 980? But then everything else turned out to be false: its memory bandwidth was a lie, its ROP count was a lie, its L2 cache was a lie. This info makes it very different from the 980. Personally, my dilemma was two 970s vs. one 980; based on the original specs they seemed so close. So now that people are learning the true differences, it should be taken care of. I say a refund, or a pay-the-difference upgrade to the 980 or Titan X.

Personally, I will still vote green team, mainly because 3ds Max is pretty much reliant on Nvidia, and in gaming I have had much better driver support. My history is SLI GT 7900s, SLI 8800 GTS 512, CrossFire 5870s, GTX 680, GTX 970 SLI (sold both to upgrade), and now SLI GTX 980s. (I never had problems with SLI. As far as drivers go, AMD was the worst: I couldn't play Skyrim, as CrossFire would make the sky flash seizures of death, and disabling it to a single card would cause an instant crash on launch. I had to literally remove the second card to get it to work.) I have gone back to Nvidia since, and even with the 970 blunder I will still stick with them, again mainly for 3ds Max support and SLI drivers.


#73  Edited By 04dcarraher
Member since 2004 • 23859 Posts
@clyde46 said:

@04dcarraher said:

@clyde46 said:

@04dcarraher said:

@osirisx3 said:

Nah, I don't want to be ripped off. Nvidia sucks; they should refund everyone who got a 970.

AMD 300 series will pwn it.

I have not had one issue with the 970. Also, quite a few people overestimated the 970's abilities to begin with, and their use of excessive resolutions and/or settings added to the segmented-memory blaze. Saying the 300s will pwn is a bit too fanboyish, since the preview benchmarks show the 970 only 60% behind the 390X at 4K on average; at 1600p it's only 30% faster. But the 390X is rumored to be a $700 GPU.

Should have seen /g/ when the Titan X/390X benchmarks got leaked a few days ago. The AMD shills were putting in overtime.

I bet, but I wonder if anyone pointed out that the 1st-gen HBM 4GB limitation will hurt the 390X at 4K once you get past that 4GB versus the Titan X, and possibly allow the rumored 6GB Nvidia card to catch up.

The 390X is supposed to have 8GB of VRAM now. It's supposed to be able to smack a Titan X around at $700 in 4K benchmarks. It's supposed to use less power and produce less heat. It was also "supposed" to be announced this month...

Well, the most recent leaked benchmarks state it's only 4GB, not 8GB. They also show the 390X peaking at 289W with Metro LL. We will have to wait and see, but according to the specs and facts about 1st-gen HBM, the limit is 4GB, so they would have to bridge in 4GB of GDDR5 to make it an 8GB card.


#74 clyde46
Member since 2005 • 49061 Posts

@ryangcnx-2 said:

@clyde46 said:

@nyadc said:

@LegatoSkyheart said:

The GTX 970 really is a nice upgrade from my Radeon HD 6770.

It's a nice card. Shame about that memory partitioning, though; that shattered a lot of people's relationship with Nvidia and greatly tarnished their image.

That's what you get for buying the cheaper card. All the *70 series have been gimped in some way.

This is the issue, though. Originally we all thought it was only gimped on CUDA cores, and at that, who would buy the 980? But then everything else turned out to be false: its memory bandwidth was a lie, its ROP count was a lie, its L2 cache was a lie. This info makes it very different from the 980. Personally, my dilemma was two 970s vs. one 980; based on the original specs they seemed so close. So now that people are learning the true differences, it should be taken care of. I say a refund, or a pay-the-difference upgrade to the 980 or Titan X.

Personally, I will still vote green team, mainly because 3ds Max is pretty much reliant on Nvidia, and in gaming I have had much better driver support. My history is SLI GT 7900s, SLI 8800 GTS 512, CrossFire 5870s, GTX 680, GTX 970 SLI (sold both to upgrade), and now SLI GTX 980s. As far as drivers go, AMD was the worst: I couldn't play Skyrim, as CrossFire would make the sky flash seizures of death, and disabling it to a single card would cause an instant crash on launch. I had to literally remove the second card to get it to work. I have gone back to Nvidia since, and even with the 970 blunder I will still stick with them, again mainly for 3ds Max support and SLI drivers.

Nvidia don't owe you shit, brah. This whole "RAMgate" is just a means of trying to get money out of Nvidia by a few butthurt owners and AMD fanboys. The card still works. As was found out, it's when you start pushing serious resolutions that you get problems. The 970 was never designed for 4K...


#75 clyde46
Member since 2005 • 49061 Posts

@04dcarraher said:
@clyde46 said:

@04dcarraher said:

@clyde46 said:

@04dcarraher said:

@osirisx3 said:

Nah, I don't want to be ripped off. Nvidia sucks; they should refund everyone who got a 970.

AMD 300 series will pwn it.

I have not had one issue with the 970. Also, quite a few people overestimated the 970's abilities to begin with, and their use of excessive resolutions and/or settings added to the segmented-memory blaze. Saying the 300s will pwn is a bit too fanboyish, since the preview benchmarks show the 970 only 60% behind the 390X at 4K on average; at 1600p it's only 30% faster. But the 390X is rumored to be a $700 GPU.

Should have seen /g/ when the Titan X/390X benchmarks got leaked a few days ago. The AMD shills were putting in overtime.

I bet, but I wonder if anyone pointed out that the 1st-gen HBM 4GB limitation will hurt the 390X at 4K once you get past that 4GB versus the Titan X, and possibly allow the rumored 6GB Nvidia card to catch up.

The 390X is supposed to have 8GB of VRAM now. It's supposed to be able to smack a Titan X around at $700 in 4K benchmarks. It's supposed to use less power and produce less heat. It was also "supposed" to be announced this month...

Well, the most recent leaked benchmarks state it's only 4GB, not 8GB. They also show the 390X peaking at 289W with Metro LL.

Must be different from the ones I saw. They were from Chiphell: 293W in LL. Let me go see what power draw the Titan X has in that game. What res?


#76 deactivated-57d307c5efcda
Member since 2009 • 1302 Posts

@04dcarraher said:

@thehig1 said:

My gpu upgrade likely to be a r9 290x, or see what amd release later in the year.

The 200 series is a waste of money at this point especially when 300 series will appear and DX12 being around the corner. To get anything but a DX12 ready gpu is not a good plan.

Agreed, even if my preference is Nvidia, I would rather see you get a 300 series AMD card. The 200 series is old, now if you get one really cheap then I understand, but I would not pay full retail when the 300 series is right around the corner.


#77 NyaDC
Member since 2014 • 8006 Posts

@clyde46 said:

Even games at 1080p are beginning to have extreme graphics memory requirements to enable or push certain features. The way things are going, the GTX 970 will be a 1080p bottleneck within a year, and a 1440p bottleneck sooner. That's not OK, and no one is blowing anything out of proportion.


#78 deactivated-57d307c5efcda
Member since 2009 • 1302 Posts

@clyde46 said:

@ryangcnx-2 said:

@clyde46 said:

@nyadc said:

@LegatoSkyheart said:

The GTX 970 really is a nice upgrade from my Radeon HD 6770.

It's a nice card. Shame about that memory partitioning, though; that shattered a lot of people's relationship with Nvidia and greatly tarnished their image.

That's what you get for buying the cheaper card. All the *70 series have been gimped in some way.

This is the issue, though. Originally we all thought it was only gimped on CUDA cores, and at that, who would buy the 980? But then everything else turned out to be false: its memory bandwidth was a lie, its ROP count was a lie, its L2 cache was a lie. This info makes it very different from the 980. Personally, my dilemma was two 970s vs. one 980; based on the original specs they seemed so close. So now that people are learning the true differences, it should be taken care of. I say a refund, or a pay-the-difference upgrade to the 980 or Titan X.

Personally, I will still vote green team, mainly because 3ds Max is pretty much reliant on Nvidia, and in gaming I have had much better driver support. My history is SLI GT 7900s, SLI 8800 GTS 512, CrossFire 5870s, GTX 680, GTX 970 SLI (sold both to upgrade), and now SLI GTX 980s. As far as drivers go, AMD was the worst: I couldn't play Skyrim, as CrossFire would make the sky flash seizures of death, and disabling it to a single card would cause an instant crash on launch. I had to literally remove the second card to get it to work. I have gone back to Nvidia since, and even with the 970 blunder I will still stick with them, again mainly for 3ds Max support and SLI drivers.

Nvidia don't owe you shit, brah. This whole "RAMgate" is just a means of trying to get money out of Nvidia by a few butthurt owners and AMD fanboys. The card still works. As was found out, it's when you start pushing serious resolutions that you get problems. The 970 was never designed for 4K...

And yet I didn't mention a single thing about 4K... read, brah!!! My issue was with 3ds Max and future gaming performance. As of TODAY, there is an issue with 3ds Max only being able to store 3.5GB of scenery. Nvidia did lie about the specs; that is flat-out false advertising. They could have saved me a TON OF HASSLE by just saying it up front, as I would have gone the 980 route.


#79 04dcarraher
Member since 2004 • 23859 Posts
@clyde46 said:

Must be different from the ones I saw. They were from Chiphell: 293W in LL. Let me go see what power draw the Titan X has in that game. What res?

256W for the Titan X. Res was 1600p.


#80 clyde46
Member since 2005 • 49061 Posts

@04dcarraher said:
@clyde46 said:

Must be different from the ones I saw. They were from Chiphell: 293W in LL. Let me go see what power draw the Titan X has in that game. What res?

256W for the Titan X. Res was 1600p.

I see.

http://wccftech.com/amd-radeon-r9-390x-directx-12-tier-3-implementation-wce-water-cooled-edition-inbound/

The 390X is going to ship with an 8GB variant.


#81 thehig1
Member since 2014 • 7556 Posts

@ryangcnx-2 said:

@04dcarraher said:

@thehig1 said:

My gpu upgrade likely to be a r9 290x, or see what amd release later in the year.

The 200 series is a waste of money at this point especially when 300 series will appear and DX12 being around the corner. To get anything but a DX12 ready gpu is not a good plan.

Agreed, even if my preference is Nvidia, I would rather see you get a 300 series AMD card. The 200 series is old, now if you get one really cheap then I understand, but I would not pay full retail when the 300 series is right around the corner.

Yep, I'll be waiting for the new AMD cards.


#82 ConanTheStoner
Member since 2011 • 23838 Posts

@ryangcnx-2 said:

And yet I didn't mention a single thing about 4K... read, brah!!! My issue was with 3ds Max and future gaming performance. As of TODAY, there is an issue with 3ds Max only being able to store 3.5GB of scenery. Nvidia did lie about the specs; that is flat-out false advertising. They could have saved me a TON OF HASSLE by just saying it up front, as I would have gone the 980 route.

Bruh, I've been working in Max for about 8 years now, everything from game assets to high-res film assets. I have no idea what you could possibly be doing in Max that's killing your card; heavy scenes typically eat up RAM, not your GPU, lol. If you're doing taxing work in Max you should be on a Quadro anyway, not a gaming card.

I'll admit I switched to Modo about a year ago, as it's a far more sophisticated modeler with a much cleaner interface, but I highly doubt much has changed on the Max end in the past year.


#83 NyaDC
Member since 2014 • 8006 Posts

@clyde46 said:

@04dcarraher said:
@clyde46 said:

Must be different from the ones I saw. They were from Chiphell: 293W in LL. Let me go see what power draw the Titan X has in that game. What res?

256W for the Titan X. Res was 1600p.

I see.

http://wccftech.com/amd-radeon-r9-390x-directx-12-tier-3-implementation-wce-water-cooled-edition-inbound/

The 390X is going to ship with an 8GB variant.

I think I'm going to wait for the 490X. Currently running two 290Xs, I don't see the need to move to what will likely be a 30-40% increase over one of my cards when my two will likely exceed it by 60-70%.


#84  Edited By 04dcarraher
Member since 2004 • 23859 Posts
@clyde46 said:

@04dcarraher said:
@clyde46 said:

Must be different from the ones I saw. They were from Chiphell: 293W in LL. Let me go see what power draw the Titan X has in that game. What res?

256W for the Titan X. Res was 1600p.

I see.

http://wccftech.com/amd-radeon-r9-390x-directx-12-tier-3-implementation-wce-water-cooled-edition-inbound/

The 390X is going to ship with an 8GB variant.

This could also potentially mean that we might be seeing both 4GB HBM and 8GB HBM variants.

8GB will most likely come later, when 2nd-gen HBM comes of age.


#85 clyde46
Member since 2005 • 49061 Posts

@04dcarraher said:
@clyde46 said:

@04dcarraher said:
@clyde46 said:

Must be different from the ones I saw. They were from Chiphell: 293W in LL. Let me go see what power draw the Titan X has in that game. What res?

256W for the Titan X. Res was 1600p.

I see.

http://wccftech.com/amd-radeon-r9-390x-directx-12-tier-3-implementation-wce-water-cooled-edition-inbound/

The 390X is going to ship with an 8GB variant.

This could also potentially mean that we might be seeing both 4GB HBM and 8GB HBM variants.

8GB will most likely come later, when 2nd-gen HBM comes of age.

Seems quite strange that they would brag about an 8GB variant that won't be available at release.


#86  Edited By deactivated-57d307c5efcda
Member since 2009 • 1302 Posts

@ConanTheStoner said:

@ryangcnx-2 said:

And yet I didn't mention a single thing about 4K... read, brah!!! My issue was with 3ds Max and future gaming performance. As of TODAY, there is an issue with 3ds Max only being able to store 3.5GB of scenery. Nvidia did lie about the specs; that is flat-out false advertising. They could have saved me a TON OF HASSLE by just saying it up front, as I would have gone the 980 route.

Bruh, I've been working in Max for about 8 years now, everything from game assets to high-res film assets. I have no idea what you could possibly be doing in Max that's killing your card; heavy scenes typically eat up RAM, not your GPU, lol. If you're doing taxing work in Max you should be on a Quadro anyway, not a gaming card.

I'll admit I switched to Modo about a year ago, as it's a far more sophisticated modeler with a much cleaner interface, but I highly doubt much has changed on the Max end in the past year.

Ever heard of Iray? It requires everything to be stuffed into the VRAM of the GPU. And unless you're 100% pro, a Quadro is way too expensive. All my teachers even used GeForce. Even many on Polycount say GeForce, as a Quadro isn't worth the asking price. Also, with a Quadro I can't game, so I'm looking for a solution that does both.


#87 scoots9
Member since 2006 • 3505 Posts

GTX 770, with no plan to upgrade to anything. The rest of my computer is 5 years old, so my next GPU purchase will be a few years down the line for a totally new rig.


#88 clyde46
Member since 2005 • 49061 Posts

@ryangcnx-2 said:

@ConanTheStoner said:

@ryangcnx-2 said:

And yet I didn't mention a single thing about 4K... read, brah!!! My issue was with 3ds Max and future gaming performance. As of TODAY, there is an issue with 3ds Max only being able to store 3.5GB of scenery. Nvidia did lie about the specs; that is flat-out false advertising. They could have saved me a TON OF HASSLE by just saying it up front, as I would have gone the 980 route.

Bruh, I've been working in Max for about 8 years now, everything from game assets to high-res film assets. I have no idea what you could possibly be doing in Max that's killing your card; heavy scenes typically eat up RAM, not your GPU, lol. If you're doing taxing work in Max you should be on a Quadro anyway, not a gaming card.

I'll admit I switched to Modo about a year ago, as it's a far more sophisticated modeler with a much cleaner interface, but I highly doubt much has changed on the Max end in the past year.

Ever heard of Iray? It requires everything to be stuffed into the VRAM of the GPU. And unless you're 100% pro, a Quadro is way too expensive. All my teachers even used GeForce. Even many on Polycount say GeForce, as a Quadro isn't worth the asking price. Also, with a Quadro I can't game, so I'm looking for a solution that does both.

If it requires double precision then get an original Titan.


#89  Edited By ConanTheStoner
Member since 2011 • 23838 Posts

@ryangcnx-2 said:

Ever heard of Iray? It requires everything to be stuffed into the VRAM of the GPU.

Yes, of course; I believe it's been in Max since 2010. And I was working out scenes for big-budget films on a 2GB Quadro 4000, lol.

Anyway, that does suck for you, but with so many better rendering solutions out there it isn't really that big of an issue, is it? And if you must use it, why go with a gaming card? A budget gaming card, no less...

Edit: OK, just saw your edit. Well, yeah, you're not going to get the best of both worlds with a Quadro; they suck for gaming. Also, don't take the guys at Polycount as scripture. Most of them are only working on games, for which a GeForce would be the way to go, and while there are plenty of brilliant artists on the site, they're not going to give you the best hardware advice.


#90 GioVela2010
Member since 2008 • 5566 Posts

Lol, remember when Hermits thought a GTX 670/680 could last 5-6 years, lollolol.


#91 deactivated-5f768591970d3
Member since 2004 • 1255 Posts

Two Asus GTX 970s in SLI.


#92 FoxbatAlpha
Member since 2009 • 10669 Posts

@parkurtommo said:

@FoxbatAlpha said:

My GTX 560Ti laughs at the 900 series.

I'm sorry, I used to have that card; it doesn't cut it for anything post-2013.

I have a secret: I have the Nvidia dash suit, which allows me to overclock the shit out of it, bringing performance up to 2006 domination. LOL.


#93 jedikevin2
Member since 2004 • 5263 Posts
@GioVela2010 said:

Lol, remember when Hermits thought a GTX 670/680 could last 5-6 years, lollolol.

And yet both cards you mentioned would still be rolling along fine.

I just upgraded from my old GTX 460. It was still pumping out what I needed five years in. I wanted to drop down to a mini-ITX build for simplicity in my game room/living room setup, so I picked up a 960 ITX card.


#94 NewWellofGabe
Member since 2015 • 113 Posts

@GioVela2010 said:

Lol remember when Hermits thought a GTX 670/680 could last 5-6 years lollolol

They can. Just not everything will be at Ultra.

Avatar image for deactivated-57d307c5efcda
deactivated-57d307c5efcda

1302

Forum Posts

0

Wiki Points

0

Followers

Reviews: 3

User Lists: 0

#95 deactivated-57d307c5efcda
Member since 2009 • 1302 Posts

@ConanTheStoner said:

@ryangcnx-2 said:

Ever heard of iRay? It requires everything to be stuffed into the Vram of the GPU.

Yes of course, I believe it's been in Max since 2010. And I was working out scenes for big budget films on a 2gb Quadro 4000 lol.

Anyways, that does suck for you, but with so many better rendering solutions out there it isn't really that big of an issue is it? And if you must use it, why go with a gaming card? A budget gaming card no less...

Edit: Ok just saw your edit. Well yeah you're not going to get the best of both worlds with a Quadro, they suck for gaming. Also, don't take the guys at Polycount as scripture. Most of them are only working on games in which a geforce would be the way to go, and while there are plenty of brilliant artists on the site, they're not going to give you the best hardware advice.

But wouldn't you be a little upset about Nvidia's lies? I'm much happier now that I can fit a full 4GB into a 980. As a gamer and a student of 3D, the 970 and 980 seemed very good. I originally got a second 970 for gaming, but since 3ds Max doesn't rely on SLI, only one would have done it. Then I found out that any scene bigger than 3.5GB gives me an error, and at the time I didn't understand why. I ended up solving my problem by selling them and buying 980 cards while I still worked at Best Buy on discount. I made out in the end. I would love to buy a Titan X for its 12GB of RAM, but I'm out of the return policy, so I've accepted this and will live with it. Now I'm getting what I paid for, as well as max performance in games.

As someone on a budget, I relied on the specs Nvidia gave. Had I an actual job in the industry rather than just starting out, and could afford both a Quadro and a gaming machine, I wouldn't be ticked, but I'm still a fresh-out-of-college student trying to build a portfolio. While we're on the subject, do you know any good tutorials on learning to build trees? It's something I have spent a lot of time on, but I keep ending up with results like Forza 5. I have looked hard at the trees in Inquisition and can kind of see how they're put together, but my attempts haven't turned out as well. You seem knowledgeable about 3ds Max, and I'd appreciate any suggestions you may have.
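The 3.5GB wall described above comes down to simple arithmetic: a GPU renderer like iray has to keep the whole scene resident in VRAM, so once a scene crosses the 970's fast segment it spills into the slow half-gigabyte or errors out. A minimal sketch of that budget check (the segment sizes come from Nvidia's own disclosure about the 970; the scene sizes below are made-up placeholders, not measurements):

```python
# Rough VRAM budget check for a GPU renderer that must keep the whole
# scene resident (e.g. iray).  Segment sizes per Nvidia's GTX 970
# disclosure: 3.5 GiB fast + 0.5 GiB slow.  Scene figures below are
# hypothetical placeholders.

GIB = 1024 ** 3
FAST_SEGMENT = 3.5 * GIB   # full-speed portion of the 970's 4 GiB
FULL_CARD    = 4.0 * GIB

def fits(scene_bytes):
    """Classify a scene against the 970's segmented memory."""
    if scene_bytes <= FAST_SEGMENT:
        return "fits in fast segment"
    if scene_bytes <= FULL_CARD:
        return "spills into slow 0.5 GiB segment"
    return "does not fit on the card"

# Hypothetical scenes: geometry + textures + buffers, in bytes.
print(fits(3.0 * GIB))   # fits in fast segment
print(fits(3.8 * GIB))   # spills into slow 0.5 GiB segment
print(fits(5.0 * GIB))   # does not fit on the card
```

Whether a renderer slows down or hard-errors in the "spill" zone depends on the renderer; the post above suggests iray simply errors once the scene exceeds the fast segment.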

Avatar image for jun_aka_pekto
jun_aka_pekto

25255

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#96 jun_aka_pekto
Member since 2010 • 25255 Posts

My 4GB MSI OC GTX 770 is doing fine with Far Cry 4. So, not yet. I normally alternate between AMD and Nvidia. I broke that rule with the GTX 770 (from a GTX 560 Ti). My next card will be AMD. Gotta get back to my old habits.

Avatar image for ConanTheStoner
ConanTheStoner

23838

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#97  Edited By ConanTheStoner
Member since 2011 • 23838 Posts

@ryangcnx-2 said:

But wouldn't you be a little upset about Nvidia's lies? I'm much happier now that I can fit a full 4GB into a 980. As a gamer and a student of 3D, the 970 and 980 seemed very good. I originally got a second 970 for gaming, but since 3ds Max doesn't rely on SLI, only one would have done it. Then I found out that any scene bigger than 3.5GB gives me an error, and at the time I didn't understand why. I ended up solving my problem by selling them and buying 980 cards while I still worked at Best Buy on discount. I made out in the end. I would love to buy a Titan X for its 12GB of RAM, but I'm out of the return policy, so I've accepted this and will live with it. Now I'm getting what I paid for, as well as max performance in games.

As someone on a budget, I relied on the specs Nvidia gave. Had I an actual job in the industry rather than just starting out, and could afford both a Quadro and a gaming machine, I wouldn't be ticked, but I'm still a fresh-out-of-college student trying to build a portfolio. While we're on the subject, do you know any good tutorials on learning to build trees? It's something I have spent a lot of time on, but I keep ending up with results like Forza 5. I have looked hard at the trees in Inquisition and can kind of see how they're put together, but my attempts haven't turned out as well. You seem knowledgeable about 3ds Max, and I'd appreciate any suggestions you may have.

Haha, yeah, I just saw the Max comment and jumped in, but you're right. Nvidia did pull some BS, I'm not defending that. I have a 970 and for my current needs it's good enough. I'm pleased with what I got for what I spent, but that doesn't excuse Nvidia.

Also didn't know you were still a student, so yes I fully understand that man. Rough times on the budget lol.

Trees. Yes, always a bitch haha. Of course there is SpeedTree which is used heavily in games and vfx. From what I understand, it's the industry standard. Never used it myself, but it looks beast.

In Max though? Sorry, but you're either going to have to model them from scratch, or dick around with the AEC extended trees, convert them to editable poly, then get in there and refine them yourself.

Last time I modeled a tree, I created the high res in Zbrush. Using Zspheres in conjunction with Fibermesh, you can quickly develop a range of different looks. It's really a great workflow for getting quick original pieces. Of course, the bitch of it is that you have to retopologize everything if you want to render it externally. And that takes some time. Best way to do it is to simply retopologize the actual wood part and then create a few different leaf cards with opacity, diff, spec, norm and scatter them about.

I know it's not the answer you're looking for man, but trees are just tough. Without the aid of something like SpeedTree, you just have to put in the work. It sucks.

Are you shooting to specialize in environmental art? Seems like a good time to get in, I'm seeing more and more job listings for environment artists all the time.

Edit: As for the Zbrush workflow I mentioned (if that's an option for you), you could just duplicate your mesh in the subtool stack and run Zremesh on it. It's not a perfect retopo, but it gets the job done and is fine for static meshes. Then you can reproject and bake normals in Zbrush, or just export both the high and low to Xnormal or Max and do your baking there. The only issue is that with a computer generated topology, your UV map may not be as uniform as you'd like. It's not an issue if you're texturing in a 3D paint program like Mari or Mudbox, but god help you if you're texturing in Photoshop.

Edit2: Also we're wrecking this thread lol. Feel free to PM me about 3d stuff any time though broseph.

Avatar image for princeofshapeir
princeofshapeir

16652

Forum Posts

0

Wiki Points

0

Followers

Reviews: 8

User Lists: 0

#98 princeofshapeir
Member since 2006 • 16652 Posts

My 970 is perfect for me since my monitor's native res is 1920x1200 and I don't use dual monitor setups or have a higher-than-60 refresh rate monitor. It absolutely was a massive upgrade over a 670. But if I ever want to go beyond 1080-1200p I'm restricted by the 3.5GB bullshit.
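The resolution ceiling mentioned above is easy to put numbers on: render-target memory scales linearly with pixel count, and 3840x2160 has roughly 3.6x the pixels of 1920x1200. A back-of-envelope sketch (the render-target count of 5 is an illustrative assumption, not a measured figure for any particular game):

```python
# Back-of-envelope: how render-target memory grows with resolution.
# Each 32-bit RGBA buffer costs width * height * 4 bytes; a deferred
# renderer may keep several such targets plus depth.  The target
# count of 5 here is an illustrative assumption.

def target_mib(width, height, targets=5, bytes_per_pixel=4):
    """Total render-target memory in MiB for a given resolution."""
    return width * height * bytes_per_pixel * targets / 2**20

print(round(target_mib(1920, 1200), 1))  # 43.9
print(round(target_mib(3840, 2160), 1))  # 158.2
```

Render targets alone are only tens of megabytes; it's the textures, shadow maps, and MSAA buffers scaling alongside them that push total usage toward the 970's segmented 3.5GB boundary at higher resolutions.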

Avatar image for GioVela2010
GioVela2010

5566

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#99 GioVela2010
Member since 2008 • 5566 Posts

Lol, loud GPUs.

No thanks Titan, no thanks 390x

Avatar image for metalesback
MetalEsback

30

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#100 MetalEsback
Member since 2013 • 30 Posts

@inb4uall: I wouldn't count on it. Next year we will probably see more Maxwell cards, since Nvidia went all the way to develop the GM200 architecture and doesn't intend to use it only for the Titan and a couple of Quadro cards.

The next big GPUs (1080?) and the smaller ones will probably use either 22nm or 20nm GM200 (Maxwell), but with a 256-bit bus width and less VRAM.