AMD not looking so good....

#1  Edited By zaku101
Member since 2005 • 4641 Posts

Nvidia currently controls 82% of the GPU market, which leaves only 18% for AMD. I'm wondering if they're going to tank completely... I haven't really found a reason to pick AMD over Nvidia: Nvidia has better drivers, and their cards run much cooler.

Link

Link 2

#2 GeryGo  Moderator
Member since 2006 • 12810 Posts

*Move to Hardware forum*

@zaku101 said:

Nvidia currently controls 82% of the GPU market, which leaves only 18% for AMD. I'm wondering if they're going to tank completely... I haven't really found a reason to pick AMD over Nvidia: Nvidia has better drivers, and their cards run much cooler.

Link

How about the price/performance ratio? Instead of buying a GTX 970 for $330 you can buy an R9 290 for $250. The performance difference? 5-15%, give or take.

How about the R9 390? With 8GB of VRAM, it's the best GPU to go for if you're on a tight budget and still want a good 4K experience. It's either that or straight to a 980 Ti, which still won't be as good as 2x 390s.

I'm not an AMD fanboy, but I do appreciate price/performance.
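The price/performance argument above is simple arithmetic. A quick sketch in Python, assuming the prices quoted in the post and taking the midpoint of the claimed 5-15% performance gap (forum figures, not measured benchmarks):

```python
# Back-of-the-envelope price/performance check using the numbers quoted
# above. Prices in USD; "perf" is relative performance with the GTX 970
# normalized to 1.0 and the R9 290 taken at the midpoint of the quoted
# 5-15% deficit (an assumption, not a benchmark result).
cards = {
    "GTX 970": {"price": 330, "perf": 1.00},
    "R9 290":  {"price": 250, "perf": 0.90},
}

def perf_per_dollar(card):
    """Relative performance delivered per dollar spent."""
    return card["perf"] / card["price"]

for name, card in cards.items():
    print(f"{name}: {perf_per_dollar(card) * 100:.3f} perf per $100")

# Under these assumptions the R9 290 comes out roughly 19% ahead:
ratio = perf_per_dollar(cards["R9 290"]) / perf_per_dollar(cards["GTX 970"])
print(f"R9 290 perf-per-dollar advantage: {(ratio - 1) * 100:.0f}%")
```

Swap in current street prices and a benchmark-derived perf figure and the same comparison works for any pair of cards.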

#3 zaku101
Member since 2005 • 4641 Posts

@PredatorRules: Keep in mind the slightly higher cost comes with a much lower load temperature and power consumption, which means a longer lifespan and better OC headroom. The 970 runs 31°C cooler under load than the R9 290.


#4 urbangamez
Member since 2010 • 3511 Posts

@zaku101: Here are a few AMD cards worth purchasing: the R9 390 and R9 380, two great performance cards, and the R7 370, a mainstream one.

#5  Edited By GeryGo  Moderator
Member since 2006 • 12810 Posts

@zaku101 said:

@PredatorRules: Keep in mind the slightly higher cost comes with a much lower load temperature and power consumption, which means a longer lifespan and better OC headroom. The 970 runs 31°C cooler under load than the R9 290.

That's with the stock AMD fans, my friend. It does not get to 90°C; my 290 gets to 78°C at most.

#6  Edited By zaku101
Member since 2005 • 4641 Posts

@urbangamez: I only feel that the R9 380 might be a better value when compared to Nvidia. But then again, you can find the GTX 960 for $175 these days...

#7 urbangamez
Member since 2010 • 3511 Posts

@zaku101 said:

@urbangamez: I only feel that the R9 380 would be a better value when compared to Nvidia.

The R7 370 beats the 750 Ti and is a great choice if you're upgrading from, say, a 460 or 560 Ti. I should add that the 4GB R7 370 is a better option than the 2GB model, and the cards' full potential shows up when paired with Intel CPUs. AMD cards also perform much better in DX12 than DX11, so the future there is bright if they can stay alive.

The R9 280 is still around in limited numbers and is a great card.

#8 Coseniath
Member since 2004 • 3183 Posts

@zaku101: 18% of the global market is not a small number.

There was a year when ATI and Nvidia together had around 15%... (3DFX ftw!)

AMD's problem is not their GPU department, which has always stayed close to Nvidia in performance.

Their big problem is that they have been behind in CPU performance for 5-6 years in a row (they kept pace until Nehalem/Sandy Bridge happened)...

That's a lot of money...

Since Maxwell appeared they have lost around half of their market share (they were at 38% at the time). Notice the difference between 2014 Q2 and 2014 Q3...

Maxwell provided great overclockability and very low temps.

As @PredatorRules said, almost no one buys reference cards, so the temps will be even lower, allowing Asus, MSI, Gigabyte, etc. to make cards with huge factory overclocks.

That makes Maxwell a lot faster in practice (some GPUs have factory o/c beyond 20%!), and leads most people to choose Nvidia...

#9  Edited By BassMan
Member since 2002 • 18747 Posts

AMD is just behind, and they are missing the proprietary tech and features that Nvidia has. I wish there were no proprietary shit, for the sake of fair competition, but it is there and it does sway people, along with the better efficiency, lower temps, and driver support on the Nvidia side.

#10  Edited By kitty  Moderator
Member since 2006 • 115479 Posts

@urbangamez said:
@zaku101 said:

@urbangamez: I only feel that the R9 380 would be a better value when compared to Nvidia.

The R7 370 beats the 750 Ti and is a great choice if you're upgrading from, say, a 460 or 560 Ti. I should add that the 4GB R7 370 is a better option than the 2GB model, and the cards' full potential shows up when paired with Intel CPUs. AMD cards also perform much better in DX12 than DX11, so the future there is bright if they can stay alive.

The R9 280 is still around in limited numbers and is a great card.

AMD is missing a 270X in the 300 series. A 370 is a 7850, which is what the PS4 uses, or close to it. A 270X would be the better option over a 750 Ti or a 370.

It's even the same price as most 370s:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202050&cm_re=r9_270x-_-14-202-050-_-Product

#11 urbangamez
Member since 2010 • 3511 Posts

@kitty said:
@urbangamez said:
@zaku101 said:

@urbangamez: I only feel that the R9 380 would be a better value when compared to Nvidia.

The R7 370 beats the 750 Ti and is a great choice if you're upgrading from, say, a 460 or 560 Ti. I should add that the 4GB R7 370 is a better option than the 2GB model, and the cards' full potential shows up when paired with Intel CPUs. AMD cards also perform much better in DX12 than DX11, so the future there is bright if they can stay alive.

The R9 280 is still around in limited numbers and is a great card.

AMD is missing a 270X in the 300 series. A 370 is a 7850, which is what the PS4 uses, or close to it. A 270X would be the better option over a 750 Ti or a 370.

It's even the same price as most 370s:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202050&cm_re=r9_270x-_-14-202-050-_-Product

True. The 270X would be a better option.

#12 thehig1
Member since 2014 • 7556 Posts

@PredatorRules said:

*Move to Hardware forum*

@zaku101 said:

Nvidia currently controls 82% of the GPU market, which leaves only 18% for AMD. I'm wondering if they're going to tank completely... I haven't really found a reason to pick AMD over Nvidia: Nvidia has better drivers, and their cards run much cooler.

Link

How about the price/performance ratio? Instead of buying a GTX 970 for $330 you can buy an R9 290 for $250. The performance difference? 5-15%, give or take.

How about the R9 390? With 8GB of VRAM, it's the best GPU to go for if you're on a tight budget and still want a good 4K experience. It's either that or straight to a 980 Ti, which still won't be as good as 2x 390s.

I'm not an AMD fanboy, but I do appreciate price/performance.

Gotta agree with this. I went with an Nvidia 970 in the end because I found a great deal on it.

However, the 290 and 390 cards are excellent.

#13 Alucrd2009
Member since 2007 • 788 Posts

That's just sad. Just imagine if AMD died, what we'd suffer: CPUs for like $2K and Nvidia GPUs for $1K... that would be a disaster!

#14 MK-Professor
Member since 2009 • 4218 Posts

@zaku101 said:

Nvidia currently controls 82% of the GPU market, that only leaves 18% for AMD, I am wondering if they're going to totally tank... I haven't really found a reason to pick AMD over Nvidia. Nvidia has better drivers and their cards run much cooler.

I think the "better drivers" myth needs to die. Have you owned an AMD card since 2008 or so? Let me put it this way: my last three GPUs were from AMD. If there were any driver problems (or any problems at all), why would I have ended up with the 390 over the 970?

As for the "runs much cooler" myth: I have an R9 390 (OC'd to 1100/1730) and it runs at around 66-70°C in games. (Stock coolers are just crappy, the same way Nvidia's stock coolers are crap.)

#15  Edited By 04dcarraher
Member since 2004 • 23858 Posts

@MK-Professor said:
@zaku101 said:

Nvidia currently controls 82% of the GPU market, that only leaves 18% for AMD, I am wondering if they're going to totally tank... I haven't really found a reason to pick AMD over Nvidia. Nvidia has better drivers and their cards run much cooler.

I think the "better drivers" myth needs to die. Have you owned an AMD card since 2008 or so? Let me put it this way: my last three GPUs were from AMD. If there were any driver problems (or any problems at all), why would I have ended up with the 390 over the 970?

As for the "runs much cooler" myth: I have an R9 390 (OC'd to 1100/1730) and it runs at around 66-70°C in games. (Stock coolers are just crappy, the same way Nvidia's stock coolers are crap.)

Since 2013 AMD has stepped up their drivers, but even today AMD tends to fall a bit behind on updates for new games. Their DX11 driver base is also awful until they update/fix it for each new game. Project Cars is a prime example: AMD's drivers in Win 7/8 reverted to single-threaded behavior in DX11, while in the Win 10 beta AMD GPUs saw massive improvements, since Win 10/DX12 forces proper multithreaded usage. AMD slacked on DX11 because of their focus on Mantle, and now on DX12 and Vulkan.

Now that is a BS statement... All AMD GPUs since GCN have used more power and produced more heat than what Nvidia has offered in the same time frame and performance bracket.

The 390X is a reworked/refreshed Hawaii chip; it uses 30W less than the 290X.

Even with a stock cooler the 980 Ti peaks at 84°C while the MSI Twin Frozr 390X runs at around 76°C, and almost all third-party coolers will keep the 980 Ti well below 70°C.

#16  Edited By MK-Professor
Member since 2009 • 4218 Posts

@04dcarraher said:
@MK-Professor said:
@zaku101 said:

Nvidia currently controls 82% of the GPU market, that only leaves 18% for AMD, I am wondering if they're going to totally tank... I haven't really found a reason to pick AMD over Nvidia. Nvidia has better drivers and their cards run much cooler.

I think the "better drivers" myth needs to die. Have you owned an AMD card since 2008 or so? Let me put it this way: my last three GPUs were from AMD. If there were any driver problems (or any problems at all), why would I have ended up with the 390 over the 970?

As for the "runs much cooler" myth: I have an R9 390 (OC'd to 1100/1730) and it runs at around 66-70°C in games. (Stock coolers are just crappy, the same way Nvidia's stock coolers are crap.)

Since 2013 AMD has stepped up their drivers, but even today AMD tends to fall a bit behind on updates for new games. Their DX11 driver base is also awful until they update/fix it for each new game. Project Cars is a prime example: AMD's drivers in Win 7/8 reverted to single-threaded behavior in DX11, while in the Win 10 beta AMD GPUs saw massive improvements, since Win 10/DX12 forces proper multithreaded usage. AMD slacked on DX11 because of their focus on Mantle, and now on DX12 and Vulkan.

Now that is a BS statement... All AMD GPUs since GCN have used more power and produced more heat than what Nvidia has offered in the same time frame and performance bracket.

The 390X is a reworked/refreshed Hawaii chip; it uses 30W less than the 290X.

Even with a stock cooler the 980 Ti peaks at 84°C while the MSI Twin Frozr 390X runs at around 76°C, and almost all third-party coolers will keep the 980 Ti well below 70°C.

Project Cars is the exception, not the norm (Nvidia probably paid the developers to nerf performance in that game). AMD's drivers are as good as Nvidia's, and that is a fact. Also, the massive improvements in DX12 are a good thing for AMD, because they will make GPUs like the 390 the equal of the 980 Ti instead of the 970.

He said "run much cooler"; a 6°C difference is not much cooler, it is almost the same.

#17  Edited By Coseniath
Member since 2004 • 3183 Posts
@MK-Professor said:

Also, the massive improvements in DX12 are a good thing for AMD, because they will make GPUs like the 390 the equal of the 980 Ti instead of the 970.

May I ask where did you read this?

@MK-Professor said:
He said "run much cooler"; a 6°C difference is not much cooler, it is almost the same.

Your R9 390 with a 10% o/c (1100MHz) runs at 66-70°C while gaming.

My GTX 970 with a 20% o/c (1405MHz) runs at 56-60°C while gaming.

Define what "much cooler" means to you...
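The o/c percentages being traded in this thread are just the overclock relative to the stock clock. A minimal sketch, assuming reference clocks of 1000MHz for the R9 390 core and roughly 1178MHz for the GTX 970 boost (typical stock values, stated here as assumptions):

```python
def oc_percent(oc_mhz: float, stock_mhz: float) -> float:
    """Overclock expressed as a percentage above the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# R9 390 at 1100 MHz against an assumed 1000 MHz stock core clock
print(round(oc_percent(1100, 1000)))  # -> 10

# GTX 970 at 1405 MHz against an assumed ~1178 MHz reference boost clock
print(round(oc_percent(1405, 1178)))  # -> 19
```

The same formula explains the larger percentages quoted later in the thread for heavily overclocked 970s.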

#18  Edited By Wickerman777
Member since 2013 • 2164 Posts

I read an article saying that an early DX12 test proved very favorable to AMD and not so much to Nvidia. Nvidia was unquestionably the DX11 champ, but maybe AMD is gonna wear the DX12 belt.

http://arstechnica.com/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/

#19  Edited By Wickerman777
Member since 2013 • 2164 Posts

@Coseniath said:
@MK-Professor said:

Also, the massive improvements in DX12 are a good thing for AMD, because they will make GPUs like the 390 the equal of the 980 Ti instead of the 970.

May I ask where did you read this?

@MK-Professor said:
He said "run much cooler"; a 6°C difference is not much cooler, it is almost the same.

Your R9 390 with a 10% o/c (1100MHz) runs at 66-70°C while gaming.

My GTX 970 with a 20% o/c (1405MHz) runs at 56-60°C while gaming.

Define what "much cooler" means to you...

He was talking about the 390, but in that article I just linked, a 290X was equaling the 980 Ti when running DX12. :)

#20  Edited By MK-Professor
Member since 2009 • 4218 Posts

@Coseniath said:
@MK-Professor said:

Also, the massive improvements in DX12 are a good thing for AMD, because they will make GPUs like the 390 the equal of the 980 Ti instead of the 970.

May I ask where did you read this?

@MK-Professor said:
He said "run much cooler"; a 6°C difference is not much cooler, it is almost the same.

Your R9 390 with a 10% o/c (1100MHz) runs at 66-70°C while gaming.

My GTX 970 with a 20% o/c (1405MHz) runs at 56-60°C while gaming.

Define what "much cooler" means to you...

So many DX12 benchmarks are floating around lately, so it is not unlikely that AMD GPUs will see a noticeable performance bump over Nvidia GPUs in DX12 games.

Reviews show a much smaller temperature difference:

http://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/34.html

http://www.techpowerup.com/reviews/Colorful/iGame_GTX_970/31.html

http://www.techpowerup.com/reviews/EVGA/GTX_970_SC_ACX_Cooler/29.html

Many Nvidia users pretend that AMD GPUs run at 90°C with the fan at 100%. In reality the differences are very small, so small that they're not even worth discussing, except by the Nvidia fanboys who make them seem like a big deal.

#21 intotheminx
Member since 2014 • 2608 Posts

Personally, I believe AMD is going to focus on APUs in the future. The strides being made there are great; two years from now we may not need a discrete GPU for 1080p gaming.

#22 04dcarraher
Member since 2004 • 23858 Posts

@MK-Professor said:

Project Cars is the exception, not the norm (Nvidia probably paid the developers to nerf performance in that game). AMD's drivers are as good as Nvidia's, and that is a fact. Also, the massive improvements in DX12 are a good thing for AMD, because they will make GPUs like the 390 the equal of the 980 Ti instead of the 970.

He said "run much cooler"; a 6°C difference is not much cooler, it is almost the same.

Not really. Look at any new game that isn't promoted by AMD or directly by Nvidia: it takes them time to update and fix the major issues. Nvidia's drivers tend to work better out of the box with new games.

#23 deactivated-59d151f079814
Member since 2003 • 47239 Posts

I am more worried about AMD's CPU market, personally. On the GPU side they are still competitive enough in most brackets to make it a heads-or-tails decision. Their CPU side is so far behind it is not even funny. Their motherboards still come with things like legacy PCI slots, for crying out loud.

#24  Edited By Coseniath
Member since 2004 • 3183 Posts

@MK-Professor, @Wickerman777: Well, this is just a single benchmark of a single game that isn't even finished...

And these are not my words; these are AMD's. AMD has a different opinion:

We talked with NVIDIA and AMD about the benchmark and both noted that this is alpha software and we took that as they felt it might not be an accurate measurement of DX12 at this point in time.

@MK-Professor: You linked me two GTX 970s with 15% and 20% o/c against an R9 390 with a 1%(!!!) o/c, all with similar temps (67, 73, and 71°C; the one at 67°C has a 20% o/c at 1418MHz lol). Is that an apples-to-apples comparison?...

@sSubZerOo said:

I am more worried about AMD's CPU market, personally. On the GPU side they are still competitive enough in most brackets to make it a heads-or-tails decision. Their CPU side is so far behind it is not even funny. Their motherboards still come with things like legacy PCI slots, for crying out loud.

+1. On GPUs they are trading blows with Nvidia, or are at least very close.

In CPUs... :(

I still remember that their mobos have PCIe 2.0 as standard...

#25  Edited By Xtasy26
Member since 2008 • 5594 Posts

A couple of things. Those stats only go up to Q2 2015. AMD didn't launch their refreshed 300 series until around mid-June, the Fury X didn't launch until June 24th, and the Fury didn't launch until July, so I wouldn't expect market share to change much in Q2.

Having said that, AMD has execution problems. There was a fascinating article posted on a site/blog by a former AMD GPU architect criticizing the current state of AMD's graphics division. He basically stated that after many talented graphics guys were let go in 2011, including Mr. Killebrew (the inventor of Eyefinity), many more people left AMD on their own. ATI had assembled a whole bunch of talented graphics guys, and they were going "toe-to-toe" with nVidia up until 2011, as he put it. After they were let go, you saw AMD start to miss "market cycles", as the inventor of Eyefinity pointed out, and in this business that is a lose-lose situation. For example, they missed the fall market cycle last year and didn't release a single GPU earlier this year. Only in June did they release the refreshed R9 300 series: nine months after the release of the GTX 970/980. For the first time in over a decade, AMD/ATI went without releasing a high-end single GPU in 2014.

He stated the following:

"They've lost a substantial part of the Orlando design team to Apple (about a dozen people I hear). In our business we all know the difference between success and failure is a few percent. Lose key leadership and you've probably lost the critical few percent. Make a graphics chip a bit too power hungry, a bit too expensive, a couple of features substandard, and even more importantly miss market cycles and you start the downward spiral."

I mean, why did it take AMD so long to release the 300 series? If AMD had released it last fall or earlier this year, graphics card sales would have been more evenly split and AMD wouldn't have lost as much market share. It looks like the loss of so much talent finally caught up with AMD. Since it takes 2-3 years to build a GPU from the ground up, the talent AMD lost back in 2011/2012 didn't hurt them at the time; only in 2014 did you start to see the effects. The people who were let go, and those who left after seeing their co-workers let go, affected AMD in a profound way. These guys had been in the trenches fighting nVidia for nearly a decade; they knew the graphics industry very well. Also, not having enough Fury cards is probably hurting AMD right now. There aren't any regular Fury cards from XFX, for example. I suspect AMD doesn't have enough supply of the Fury chip (I read that it's fewer than 30,000 units), so only a select few partners are getting it. Why else would they delay the release of the Fury Nano, unless they didn't expect it to be a high-volume chip and might not have enough supply of it? So they are building up inventory so they can have a hard launch. Anyway, that's my theory.

There is such a thing as going from good to worse, and it looks like the last CEO made things worse. At first I welcomed the change of bringing in someone from the outside who could provide a "fresh perspective" to make AMD better, but that CEO turned out to be extremely short-sighted and let go very talented people from the one division that was profitable, the graphics division, and now we are seeing the results of those poor decisions. He helped AMD make some money, like in the fall of 2013, but that's about it. ATI had, for the most part, released great chips with great execution since the early 2000s, with the exception of maybe the HD 2900 XT, although even that chip wasn't nearly as bad as the GeForce FX 5800 Ultra. It's just sad to see AMD's management team tear up a great company like ATI.

Anyway, I just hope AMD gains back some market share, and thus money, as they desperately need it for the rest of the year.

Here's a link to the article about what the inventor of Eyefinity said about the current state of AMD graphics.

Link.

#26 Xtasy26
Member since 2008 • 5594 Posts

@sSubZerOo said:

I am more worried about AMD's CPU market, personally. On the GPU side they are still competitive enough in most brackets to make it a heads-or-tails decision. Their CPU side is so far behind it is not even funny. Their motherboards still come with things like legacy PCI slots, for crying out loud.

You are right. The 390X even beats the GTX 980 in some games, or ties it, and it costs $70 less. Even better, you could get the 8GB R9 290X, which is over $100 cheaper and gets the same results when overclocked to R9 390X specs. The Fury is beating the 980 by a nearly double-digit percentage on average, according to AnandTech. My main gripe is that their refreshed GPUs and the Fury series should have come out WAY earlier; hence AMD's market share loss. I have never seen nVidia hold 80% of the graphics card market before. I am willing to bet my life savings that if the old ATI guys were still around, this would never have happened.

#27 MK-Professor
Member since 2009 • 4218 Posts

@Coseniath said:

@MK-Professor: You linked me two GTX 970s with 15% and 20% o/c against an R9 390 with a 1%(!!!) o/c, all with similar temps (67, 73, and 71°C; the one at 67°C has a 20% o/c at 1418MHz lol). Is that an apples-to-apples comparison?...

LOL? The R9 390 has like a 10% OC. Did you notice the graph that said LOAD + OC? The overclocking section gives the actual clocks.

#28 Coseniath
Member since 2004 • 3183 Posts
@MK-Professor said:
@Coseniath said:

@MK-Professor: You linked me two GTX 970s with 15% and 20% o/c against an R9 390 with a 1%(!!!) o/c, all with similar temps (67, 73, and 71°C; the one at 67°C has a 20% o/c at 1418MHz lol). Is that an apples-to-apples comparison?...

LOL? The R9 390 has like a 10% OC. Did you notice the graph that said LOAD + OC? The overclocking section gives the actual clocks.

Well, I guess you don't even read what you're linking. If we count the reviewer's manual o/c, then the two GTX 970s reach 26% and nearly 30% o/c at 74°C and 69°C, against the R9 390's 9% at 71°C.

Isn't that a big difference?

#29 MK-Professor
Member since 2009 • 4218 Posts

@Coseniath said:
@MK-Professor said:
@Coseniath said:

@MK-Professor: You linked me two GTX 970s with 15% and 20% o/c against an R9 390 with a 1%(!!!) o/c, all with similar temps (67, 73, and 71°C; the one at 67°C has a 20% o/c at 1418MHz lol). Is that an apples-to-apples comparison?...

LOL? The R9 390 has like a 10% OC. Did you notice the graph that said LOAD + OC? The overclocking section gives the actual clocks.

Well, I guess you don't even read what you're linking. If we count the reviewer's manual o/c, then the two GTX 970s reach 26% and nearly 30% o/c at 74°C and 69°C, against the R9 390's 9% at 71°C.

Isn't that a big difference?

No, it is not a big difference, and this is also down to the silicon lottery: 1100MHz on the 390 is considered a poor OC, and I have seen many do 1200MHz with very good temps.

https://www.youtube.com/watch?v=k9cKZiJw6Pk

#30  Edited By Coseniath
Member since 2004 • 3183 Posts

@MK-Professor: I've seen even bigger GTX 970 o/c's (over 1500MHz, some reaching nearly 1600MHz), but that's not the point...

The silicon lottery, or ASIC lottery, isn't our subject of disagreement here.

Temps are. It doesn't matter whether a GPU reaches a 100% o/c; it's about what temp it runs at while doing it.

#31  Edited By MK-Professor
Member since 2009 • 4218 Posts

@Coseniath said:

@MK-Professor: I've seen even bigger GTX 970 o/c's (over 1500MHz, some reaching nearly 1600MHz), but that's not the point...

The silicon lottery, or ASIC lottery, isn't our subject of disagreement here.

Temps are. It doesn't matter whether a GPU reaches a 100% o/c; it's about what temp it runs at while doing it.

And the temperature difference is very small; in fact, it is not even worth discussing. Does a 4°C or 8°C difference really hurt your gaming experience?

#32 Xtasy26
Member since 2008 • 5594 Posts

@Coseniath said:

@zaku101: 18% of the global market is not a small number.

There was a year when ATI and Nvidia together had around 15%... (3DFX ftw!)

May I know which year that happened? I have been following the GPU market since the 90's. Never have I heard of ATI and nVidia together having around 15% graphics market share. ATI used to dominate the discrete graphics market. While 3DFX had the better performance like with the Voodoo and the Voodoo 2 but they were not really ideal for the mainstream PC desktop market simply because you needed to have a 2D graphics card with your Voodoo 2 graphics card. That added extra cost for OEM's. 3DFX tried to have a mainstream graphics card with 2D graphics integrated to take more OEM design wins with the 3DFX Voodoo Banshee but it was largely poorly received as the performance wasn't that good. Mostly enthusiast and gamers brought Voodoo cards back in the early days of 3D Graphics wars, leaving ATI with catering to the mainstream market. ATI's Rage series dominated the desktop market as they had the cheaper 2D/3D graphics card. nVidia started to make in-roads in the Riva 128 back in 1998 (which was the first desktop graphics card I used to play 3D accelerated games). As nVidia got better and better with the TNT, TNT2, GeForce and GeForce 2 and the popular GeForce 2 MX which was very popular with OEM's (which some credit helped with the demise of 3DFX), nVidia then started to take market share from ATI, which was the beginning of nVidia's dominance in the graphics card market. ATI briefly had more market share back in 2003 - 2005 after the very successful launch of ATI 9700 Pro, but nVidia started to take back market share exactly around Q3 2005 which falls exactly 10 years ago. ATI/AMD was still doing descent by going "toe-to-toe" with nVidia as former AMD graphics architect Carrell Killbrew, the inventor of Eyefinity put it up to 2011. Even up to 1H of 2014 they had over 35+ market share.

It all fell apart after the 970/980 launch, when AMD had no answer until June of this year, which is too late IMO.

#33  Edited By Coseniath
Member since 2004 • 3183 Posts
@Xtasy26 said:

May I know which year that happened?

It was estimated that 3Dfx accounted for 80-85% of the 3D accelerator market during the heyday of Voodoo’s reign. (around 1997-1998)

That leaves 15-20% for other companies, right?

Btw you should read the whole article from part 1. It was an awesome article!

ps: I also had a Voodoo2 12MB (I still have it lol). No one was playing 3D games on their 2D GPU lol. And obviously, no one would buy another 3D card (Nvidia Riva / ATI Rage) and then add a Voodoo on top of it...

ps2: The Riva 128 was released in 1997, not sometime in 1998.

ps3: ATI had more market share from Q3 2004 to Q2 2005, not from 2003 to 2005...

@MK-Professor said:

And the temperature difference is very small; in fact, it is not even worth discussing. Does a 4°C or 8°C difference really hurt your gaming experience?

Well, it seems you keep avoiding the fact that the lower temps allow the heavy o/c that usually makes Maxwell cards faster, both factory and manual. And that's something that does affect the gaming experience.

#34 Xtasy26
Member since 2008 • 5594 Posts

@Coseniath said:
@Xtasy26 said:

May I know which year that happened?

It was estimated that 3Dfx accounted for 80-85% of the 3D accelerator market during the heyday of Voodoo’s reign. (around 1997-1998)

That leaves 15-20% for other companies, right?

Btw you should read the whole article from part 1. It was an awesome article!

ps: I also had a Voodoo2 12MB. (I still have it lol). No one was playing 3D games on their 2D GPU lol. And obviously, no one would buy another 3D card (Nvidia Riva / ATI Rage) and then add a Voodoo on top of it...

ps2: Riva 128 was released in 1997. Not somewhere in 1998.

ps3: ATI had more market share from Q3/2004 to Q2/2005 and not from 2003 to 2005...

@MK-Professor said:

and the temp difference is very small; in fact it is not even worth discussing. Does it really hurt your gaming experience if there is a 4C or 8C difference?

Well, it seems you keep avoiding the fact that the lower temps allow the heavy o/c that usually makes Maxwell cards faster, both factory and manual. And that's something that affects the gaming experience.

Thanks for the link. I am always interested in the early days of the 3D graphics wars. Those days were very crucial to what we have today; whoever won those wars would occupy the graphics card market for generations, and that turned out to be nVidia and ATI (now AMD's graphics division).

You are right about the rest. With respect to the Riva 128, I didn't say it was released in 1998; I said nVidia started to "make in-roads in 1998" with the Riva 128, especially with major OEMs like Dell and Gateway carrying it. That was the first card I used to game on. I became a loyal nVidia follower and user for 10 years and only switched to AMD in 2008 because of the price/performance of the HD 4800 series and the obscene prices nVidia was charging for the GTX 200 series. $650 for a GTX 280? Really, nVidia? When the HD 4870, for little more than half its price, was getting close to it in performance and even beating it at times.

#35 Coseniath
Member since 2004 • 3183 Posts

@Xtasy26: Yeah $650 for GTX280 was awful.

Thanks to competition and a die shrink some months later, I bought the better-performing MSI GTX275 Twin Frozr (the first one with this tech!) for just $200... :D

#36  Edited By Xtasy26
Member since 2008 • 5594 Posts

@Coseniath By the way, you should keep your Voodoo 2. I came across an STB Black Magic Voodoo 2 board 7 - 8 years ago in a PC that was about to be disposed of. My regret was not asking if I could keep the card. :(

I read through the entire article; very informative. What I found interesting was how nVidia managed to grow their revenue from $300 million to nearly $5 billion over the past 15 years, almost a factor of 15X.

ATI grew their revenue from around $1 billion in the year 2000 to close to $2.5 billion in 2006. After AMD took over ATI, the total revenue from their graphics chips stagnated in the $1.5 billion - $2.5 billion range and may even have regressed.

Had ATI stayed independent, I think they would have generated even more revenue, probably in the $3 - $4 billion range, because I don't think ATI would have sold their mobile chip business, and that would have been a big revenue generator: they had a strong foothold in the cell phone market and would probably be powering many smartphones today. It shows how incompetent AMD's management has been, not just the last CEO but going back 7 - 8 years. They managed to damage a great enterprise.

#37  Edited By Coseniath
Member since 2004 • 3183 Posts

@Xtasy26: I have kept it for 17 years, I am going to keep it for more. :D

I believe AMD made a terrible mistake by rejecting the merger with Nvidia back then.

The deal fell apart only because they didn't want Nvidia CEO Jen-Hsun Huang to be the CEO of the combined AMD-Nvidia company, which proved to be a fatal mistake.

Most of AMD's CEOs since then made decisions that brought AMD near bankruptcy... So why in the hell would someone reject an offer better than theirs just to install their own CEOs, who were awful...

Nvidia has grown 2.5x since then, and AMD is 7 times smaller now...

ATI as an independent company would have been better off, and Intel might have bought them...

AMD-Nvidia vs Intel-ATI, Jen-Hsun Huang vs Paul Otellini, would have been a great battle.

#38 Xtasy26
Member since 2008 • 5594 Posts

@Coseniath They did make a terrible mistake. AMD's CEO back then should have just swallowed his pride and let Jen-Hsun take over the CEO role. The thing is, that CEO later became the CEO of GlobalFoundries, which AMD spun off and sold, which means he wouldn't have remained CEO of AMD for long anyway.

With respect to Intel buying ATI, I don't think they would have done so, because Intel was designing its own gaming GPU around the time of the ATI acquisition, called Larrabee. It turned out to be a failure, as it never saw commercial release due to performance and driver issues.

But I do agree that Jen-Hsun's AMD/nVidia vs Intel would have been a great battle.

#39 Xtasy26
Member since 2008 • 5594 Posts

One thing I will say about the new CEO is that she is getting rid of management people instead of engineers, unlike the last CEO, who let go of many engineers, including the inventor of Eyefinity (facepalm). AMD has management issues, so management should be held accountable, not the other way around.

#40 Coseniath
Member since 2004 • 3183 Posts

@Xtasy26: Actually, the biggest mistake of the previous AMD CEOs, I believe, was abandoning the mobile segment and, in 2008, selling AMD's Imageon division (the makers of Adreno) to Qualcomm.

But the new one is failing a looooooooooot in the marketing segment.

Under her decisions AMD lost half of its market share and half of its value as a company.

And all this in only a year...

If Lisa Su continues for one more year, I believe she will put the last nails in AMD's coffin.

AMD needs a new CEO with passion and vision for AMD. Not someone like this; she seems not to know what she is doing or which company she is CEO of...

I believe you and I would make better decisions...

#41  Edited By Yams1980
Member since 2006 • 2866 Posts

My last card I got from AMD was a 5870: a power hog and a very hot-running card that I never liked using.

The things that kept me buying Nvidia are better drivers, faster cards, and the most important... power usage. I use my PC several hours a day sometimes, and having a video card that uses 50-100 watts less power means by the end of the year I've probably saved over 100-200 dollars in power bills, while enjoying a very fast card. On the opposite side, you can buy an AMD card that is slightly cheaper but is slower and will cost you much more in electricity.

I laugh at people when they show how an AMD card is maybe 100 dollars cheaper; after a short while an Nvidia card will have paid back that difference and more in power savings alone.

Exactly the same goes for comparisons between Intel and AMD: the higher cost of Intel is worth it in the long run from the power savings.

#42 Xtasy26
Member since 2008 • 5594 Posts

@Coseniath But is it really fair to blame Lisa Su for AMD's current problems? It hasn't even been a year since she became CEO. She was thrown into the fire when AMD fired/forced out their last CEO, Rory Read, out of the blue, just before they announced the horrendous Q3 2014 results. You do know that it takes 2 - 4 years to design a new GPU from the ground up, so any changes she makes don't happen overnight. And, as I said, she did get rid of 3 senior management people earlier this year.

With respect to the current market share loss in GPUs: the guy who was running their GPU division had been with ATI going back to 1998. I think she trusted his judgment, and he clearly didn't deliver, hence his being fired/forced out and the guy who worked on the original 9700 Pro, recently back from Apple, being made head of the division. She did turn around the business at Freescale Semiconductor, which, like AMD, was losing money. And I totally agree with you about the decision to sell their mobile division. That was laughable. They could have been powering the world's most popular smartphones.

I also think their current loss of market share lies heavily on the last CEO, Rory Read. He let go of many talented AMD GPU engineers, like the inventor of Eyefinity. And once that happened, it opened the floodgates for other talented engineers who had been around since the ATI days to leave. As the inventor of Eyefinity put it:

"AMD's losses of top-rate graphics talent is appalling.....In our business we all know the difference between success and failure is a few percent.Lose key leadership and you've probably lost the critical few percent."

I think AMD's current predicament is a result of losing key leadership, which meant losing that "critical few percent". Put it this way: had those ATI guys been around, I can pretty much guarantee that AMD wouldn't have lost this much market share. The ATI guys wouldn't have tolerated it.

As for us running the company: it may sound ridiculous, but sometimes people who actually buy the hardware, like you and me, may know more about the market and what people want than people in management who live in a bubble and lose sight of reality.

I think that's what happened with 3DFX and that ultimately caused their demise.

I am frankly very worried about AMD, because a lot of the things I saw with 3DFX 15 years ago I am now seeing with AMD: piss-poor management, coming late to market, and not including important features that people wanted. (For example, 3DFX didn't offer full 32-bit color support until the Voodoo 5 5500, even though games had started supporting 32-bit color 2 years earlier: Forsaken, Sin, Half-Life and Unreal all offered 32-bit color support, and more titles released in 1999, like Quake 3, Unreal Tournament and System Shock 2, supported it too.)

Let's just hope AMD doesn't end up by December of this year like 3DFX did 15 years ago, in December of 2000.

#43  Edited By Xtasy26
Member since 2008 • 5594 Posts

@Yams1980 said:

My last card I got from AMD was a 5870: a power hog and a very hot-running card that I never liked using.

The things that kept me buying Nvidia are better drivers, faster cards, and the most important... power usage. I use my PC several hours a day sometimes, and having a video card that uses 50-100 watts less power means by the end of the year I've probably saved over 100-200 dollars in power bills, while enjoying a very fast card. On the opposite side, you can buy an AMD card that is slightly cheaper but is slower and will cost you much more in electricity.

I laugh at people when they show how an AMD card is maybe 100 dollars cheaper; after a short while an Nvidia card will have paid back that difference and more in power savings alone.

Exactly the same goes for comparisons between Intel and AMD: the higher cost of Intel is worth it in the long run from the power savings.

50 - 100 watts less power will save you $100 - $200? Please tell me you are joking. It's been shown time and time again that AMD will save you money in the long run; it may take 10 years or more for the power savings to reach the price difference you pay going with nVidia/Intel. Besides, at idle the power difference is sometimes the same, or a 1W - 2W difference, depending on the GPU/CPU.

(embedded video)

I am not calling you stupid, BTW. I am just stating that you are misinformed.

As for your AMD GPU running hot:

I didn't realize 52 Celsius was so "hot" as opposed to the 83 Celsius of the 980 Ti and 780 Ti, and the 84 Celsius of the Titan X.

#44 Yams1980
Member since 2006 • 2866 Posts

@Xtasy26:

@Xtasy26 said:
50 - 100 watts less power will save you $100 - $200? Please tell me you are joking. It's been shown time and time again that AMD will save you money in the long run; it may take 10 years or more for the power savings to reach the price difference you pay going with nVidia/Intel. Besides, at idle the power difference is sometimes the same, or a 1W - 2W difference, depending on the GPU/CPU.

(embedded video)

I am not calling you stupid, BTW. I am just stating that you are misinformed.

As for your AMD GPU running hot:

I didn't realize 52 Celsius was so "hot" as opposed to the 83 Celsius of the 980 Ti and 780 Ti, and the 84 Celsius of the Titan X.

OK, I over-guessed the power savings, but it's still very significant... Say you use a video card that uses 100 watts less than another GPU and game for around 8 hrs a day; after a year that's a savings of over 50 dollars. I'd have to check the hydro bill because I'm not completely sure what I pay, but last time I checked it was a bit over 16 cents per kilowatt-hour (Canadian) during the daytime.

Assume you add in another 100 watts saved by using an Intel CPU, and that's more than another 50 dollars a year, since Intel uses less at idle than AMD, and even less under load. I think the AMD FX 9590 uses over 220 watts, almost 3 times the TDP of my 4770K... and my 4770K out-benchmarks it in games, and in some other benchmarks is equal to it.

Add that up and it's over 100 dollars easily, and that's only 8 hrs a day of PC usage. I rarely turn my PC off, so my PC is on for double that time; I probably have it under load 5-10 hrs a day and under lighter load the rest, depending on whether I'm encoding a video or gaming.

I owned an 8-core Bulldozer AMD CPU and have put power meters on it before; it uses a lot more power than my Intel CPU. It's not used anymore, so I'm not worried about power usage from that.

I usually use the same GPU/CPU for a couple of years. I used my GTX 680 from 2012 to 2014. I got a GTX 970 at the end of 2014 and will use it for at least 2 years; that's 100 dollars in energy savings easily. As for my CPU, it's a 4770K which I've had for a couple of years and will keep for a few more years at least... again saving money in the long run, while being fast and easily worth the bit more money I paid upfront.

Here's a calculator if you don't believe me. And the cost is probably more now, since they cranked up the rates back in May 2015. Hydro is very expensive in Canada ever since they introduced the "smart meters" over the last 10 years or so; they used that as a trick to raise rates and force people to use power at night as the only way to save money.

http://michaelbluejay.com/electricity/cost.html

Sorry for the long response; I just had to make the point that saving energy saves a lot of money and it's worth it. And you wouldn't take 10 years to see the power savings; it would take less than 2 years.
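The arithmetic both posters are arguing over is easy to check. A minimal sketch, assuming hypothetical figures (a 100 W draw difference, 8 hours of load per day, and the quoted ~16 cents/kWh rate — none of these are measured numbers):

```python
def annual_power_cost_delta(watts_saved, hours_per_day, rate_per_kwh, days=365):
    """Yearly dollar savings from drawing `watts_saved` fewer watts under load."""
    kwh_saved = watts_saved / 1000 * hours_per_day * days  # watts -> kWh per year
    return kwh_saved * rate_per_kwh

# 100 W less, 8 h/day of load, 16 cents/kWh:
print(round(annual_power_cost_delta(100, 8, 0.16), 2))  # → 46.72
```

So at these (generous) assumptions the saving is roughly $50/year, which matches the "over 50 dollars" figure above; at fewer load hours or a smaller wattage gap it shrinks proportionally.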

#45 Coseniath
Member since 2004 • 3183 Posts

@Xtasy26: When you appoint a new CEO because you believe the previous one failed to meet expectations and caused damage to the company, you simply expect the new CEO to fix those problems and make the company recover.

Previous CEOs having made mistakes doesn't justify the new CEO making mistakes too, because AMD might not be able to recover from many more.

And she has made mistakes. Mistakes that are sometimes suspicious. Mistakes that even you or I wouldn't make.

And don't forget that a CEO is judged by the numbers they deliver, and Lisa Su has delivered the worst numbers AMD has ever seen.

Making better marketing or strategic moves doesn't take 4 years...

If she couldn't do the job, then she shouldn't have taken it. This is not a charity organisation.

When the new CEO fails to meet those expectations, what do you expect people to say?

So may I ask, are you happy with AMD's state with Lisa Su as CEO?

I undoubtedly am not...

#46 insane_metalist
Member since 2006 • 7797 Posts

Nvidia has better drivers? Maybe for the first year, and then they stop caring about previous-series card owners as soon as a new series comes out.

#47 TrooperManaic
Member since 2004 • 3863 Posts

@insane_metalist: THANK YOU. That's what I have been saying for years.

#48 Xtasy26
Member since 2008 • 5594 Posts

@Yams1980 said:


OK, I over-guessed the power savings, but it's still very significant... Say you use a video card that uses 100 watts less than another GPU and game for around 8 hrs a day; after a year that's a savings of over 50 dollars. I'd have to check the hydro bill because I'm not completely sure what I pay, but last time I checked it was a bit over 16 cents per kilowatt-hour (Canadian) during the daytime.

Assume you add in another 100 watts saved by using an Intel CPU, and that's more than another 50 dollars a year, since Intel uses less at idle than AMD, and even less under load. I think the AMD FX 9590 uses over 220 watts, almost 3 times the TDP of my 4770K... and my 4770K out-benchmarks it in games, and in some other benchmarks is equal to it.

Add that up and it's over 100 dollars easily, and that's only 8 hrs a day of PC usage. I rarely turn my PC off, so my PC is on for double that time; I probably have it under load 5-10 hrs a day and under lighter load the rest, depending on whether I'm encoding a video or gaming.

I owned an 8-core Bulldozer AMD CPU and have put power meters on it before; it uses a lot more power than my Intel CPU. It's not used anymore, so I'm not worried about power usage from that.

I usually use the same GPU/CPU for a couple of years. I used my GTX 680 from 2012 to 2014. I got a GTX 970 at the end of 2014 and will use it for at least 2 years; that's 100 dollars in energy savings easily. As for my CPU, it's a 4770K which I've had for a couple of years and will keep for a few more years at least... again saving money in the long run, while being fast and easily worth the bit more money I paid upfront.

Here's a calculator if you don't believe me. And the cost is probably more now, since they cranked up the rates back in May 2015. Hydro is very expensive in Canada ever since they introduced the "smart meters" over the last 10 years or so; they used that as a trick to raise rates and force people to use power at night as the only way to save money.

http://michaelbluejay.com/electricity/cost.html

Sorry for the long response; I just had to make the point that saving energy saves a lot of money and it's worth it. And you wouldn't take 10 years to see the power savings; it would take less than 2 years.

You game for 8 hours a day? Not to mention you are comparing full load; in games you don't always hit full load, it's often less. Your numbers are WAY exaggerated. No one games 8 hours a day at full load 365 days a year. Where are your calculations? JayzTwoCents provided actual numbers, and it comes down to a couple of dollars of difference, so it could take 10 - 20 years to catch up to the difference in price. And he was using California rates, which are even higher.

Secondly, why in the world would you compare against the FX 9590? That's ridiculous. You should compare the FX 8350, or better the 8370, which is a 95 W CPU. When you compare the performance the FX 8350 gives against CPUs that cost as much as $1,000, like the 3970X Extreme Edition, AMD looks even better: the 3970X costs over $1,100 while the FX 8350 costs about $160 on Newegg. The gap becomes even larger, more like 20 - 30 years to recover the cost.

Finally, at idle the wattage of the two companies' GPUs is nearly identical, which further undermines your argument.
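The "years to recover the cost" argument can be sketched the same way as the savings math, under hypothetical assumptions (a $100 price premium, a 50 W load delta, 2 hours of gaming load per day, 16 cents/kWh — illustrative numbers, not measurements):

```python
def payback_years(price_delta, watts_delta, load_hours_per_day, rate_per_kwh):
    """Years of use before a pricier, lower-power part recoups its premium."""
    kwh_per_year = watts_delta / 1000 * load_hours_per_day * 365  # kWh saved/year
    annual_savings = kwh_per_year * rate_per_kwh                  # dollars/year
    return price_delta / annual_savings

# $100 premium, 50 W delta, 2 h of gaming load per day, 16 cents/kWh:
print(round(payback_years(100, 50, 2, 0.16), 1))  # → 17.1
```

With modest daily load hours the payback runs well past a decade, which is the point being made here; Yams1980's sub-2-year figure requires near-continuous full load and a large wattage gap.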

#49  Edited By Xtasy26
Member since 2008 • 5594 Posts

@Coseniath said:

@Xtasy26: When you appoint a new CEO because you believe the previous one failed to meet expectations and caused damage to the company, you simply expect the new CEO to fix those problems and make the company recover.

Previous CEOs having made mistakes doesn't justify the new CEO making mistakes too, because AMD might not be able to recover from many more.

And she has made mistakes. Mistakes that are sometimes suspicious. Mistakes that even you or I wouldn't make.

And don't forget that a CEO is judged by the numbers they deliver, and Lisa Su has delivered the worst numbers AMD has ever seen.

Making better marketing or strategic moves doesn't take 4 years...

If she couldn't do the job, then she shouldn't have taken it. This is not a charity organisation.

When the new CEO fails to meet those expectations, what do you expect people to say?

So may I ask, are you happy with AMD's state with Lisa Su as CEO?

I undoubtedly am not...

I am not happy with AMD's current state. Especially as a PC gamer: gaming is far more dependent on GPUs than CPUs, and AMD's recent 18% market share is what really troubles me (hence the thread and the posting of the video, whose maker has similar concerns). I am getting ***flashbacks*** of what happened to 3DFX exactly 15 years ago.

I don't fully blame the current CEO, as in the semiconductor industry it takes 2 - 3 years to see results. I am glad she got rid of certain management, because let's face it, AMD's management has displayed its incompetence on many occasions. She also closed down SeaMicro, which was bought by their last CEO. That purchase was again his fault, as it cost $300 million and was wasted; SeaMicro brought in little to no revenue, and shutting it down instead of letting it continue was another thing she fixed.

What really pi$$ed me off was learning that the last CEO let go of many talented AMD GPU engineers, which then opened the floodgates for many ATI engineers to leave. ATI had built a "world renowned" group of GPU engineers, as the inventor of AMD's Eyefinity put it, only to have the last CEO help disintegrate it. The former AMD GPU guru quoted in the description of the video, who stated that "When you don't show up at the fight, you lose by default", is a prime example of the kind of person AMD desperately needed, as he clearly understood the importance of execution. Had people like him stayed around, I can pretty much guarantee that AMD wouldn't be at 18%. Only recently I also found out that one of the lead GPU engineers for the HD 4800 series (my first ATI card) left AMD last year to go work for Apple, after working for AMD and ATI going back to 1999. This is a prime example of what I disliked under the last CEO's leadership. At least Dirk Meyer, the CEO prior to Rory Read, didn't cut the GPU guys, with the exception of their cellphone division, which was sold. The gaming GPU group remained intact no matter how badly their CPU division was doing.

AMD's market share loss started last fall, when she had barely come into the CEO position, and I don't think she could change things in 2 - 3 quarters. I think she did the right thing by creating the Radeon Group, to make the division more agile and focused on regaining market leadership, with the guy who helped design the 9700 Pro at its head. His return from Apple is one positive with respect to talent.

What are the decisions she made that you think were wrong?

#50  Edited By Coseniath
Member since 2004 • 3183 Posts

@Xtasy26: Well, since I am not on AMD's board, I can't tell you about all the decisions she made.

But a lot of the decisions made for the summer GPU launches were awful (this includes the first decisions Lisa Su needed to take before then, too).

Fiji was a chip with great potential.

They should have allowed Sapphire, ASUS, etc. to make air coolers for the Fury X as well. This could have lowered the cost of the Fury X, which also means more sales.

The R9 Nano, in my opinion, wasn't worth the R&D. They could have allowed partners to create Fury or Fury X cards with lower clocks on SFF PCBs.

R9 390 and R9 390X: this one is obvious. 8GB of VRAM? Really? The 99% of people who will never CrossFire would love 4GB and $50 in their pockets. They could simply have allowed AIB partners, again, to add 8GB on some models if they wanted.

And she could also have lowered the price of the Fury to make it even more attractive.

All of these decisions were made, or could still have been changed, within the 8 months of Lisa Su's authority (October 8th until the summer launches).

I believe those decisions would have saved money (millions of dollars, actually) and added a few points of global market share, giving AMD more cash.

And I see them as obvious changes that even you or I would have made. So, after seeing Steven Elop's decisions at Nokia, I find her moves suspicious.

It's even more suspicious if you look at AMD's market cap and share price. In October, when Lisa Su became CEO of AMD, an AMD share was worth $3.28.

Today it is worth $1.85.

Deja vu with Nokia?

ps: This is why I said that AMD is not a charity organisation. People didn't hire Lisa Su to make AMD almost twice as small. AMD is not a school where you tell her "it's okay, you tried". Trying or not doesn't matter when you are the CEO of a big company like AMD. She might (a lot of mights here :P) have tried, but it didn't work. That makes her a failure.

ps2: I believe that creating the Radeon Group was the only great thing she did. But if we believe those failed decisions weren't made on purpose, then at least she realised she should let people more capable than her make the decisions for the GPU department. That's a start...

ps3: New architecture designs take 3 years; choosing a cooler, the amount of VRAM, or a price does not...