click this shizz
it will have 4,096 GCN cores and use HBM memory that is 9 times faster than GDDR5.
the ps4 has 1,152 cores at a slow 800MHz with slow GDDR5 memory that is shared with the CPU.
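If anyone wants to sanity-check how big the gap actually is, here's a quick back-of-the-envelope in Python. The 380X's clock speed isn't in the rumor, so the 1.0 GHz figure below is just a placeholder assumption:

```python
# Rough FP32 throughput comparison. GCN does 2 FLOPs per core per clock
# (fused multiply-add). The 380X's clock is NOT in the leak, so the
# 1.0 GHz below is a pure placeholder, not a spec.
def tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000.0

ps4 = tflops(1152, 0.8)      # ~1.84 TFLOPS, matches Sony's own figure
r9_380x = tflops(4096, 1.0)  # ~8.19 TFLOPS at the assumed clock

print(f"PS4:     {ps4:.2f} TFLOPS")
print(f"R9 380X: {r9_380x:.2f} TFLOPS ({r9_380x / ps4:.1f}x the PS4)")
```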
Every high-end chip destroys consoles.
Why does every GPU thread here have to state that? It's just baiting. Every high-end card released since the PS4/Xbox One launched has been stronger than them.
What this GPU does is destroy the R9 290X, and it will be significantly faster than the GTX 980. Rumored pricing = GTX 980 price range.
I want to see the r9 370x for my cousin's build.
it destory's the ps4? it removes all the stories from all the games?
sounds like a match made in heaven for me.
but seriously, i read the article - that hardware means business. i'm very curious about the benches and the price. i'm potentially buy-curious.
lol @ OP being a dumb hermit, so he doesn't even understand the context of HBM's comparison to GDDR5. this is also a rumor, although AMD will obviously have to launch a new GPU sometime soon because Nvidia slaughters them at the high end currently. can't see anything beyond a 30ish% increase over a 290 given it's on the same process and AMD's architecture is so power hungry already.
btw OP, the last thing AMD GPUs need at the high end is more bandwidth. also, PC GPUs are already far ahead of the PS4; too bad that has no impact on the software side, and that's where the PC falls flat.
Why do some hermits have such a boner for PS4?
That's a good question.
Console hardware stays the same, PC hardware improves year after year. It's not even worth mentioning that the newest stuff is going to be more powerful than 2-year-old hardware.
That said, that card sounds like a beast. I can't wait to see some benchmarks.
I only want to know its power draw... if it performs just a tad better than Maxwell but with a really high power draw, then bleh! :P
That 300W power consumption, hah, will need some heavy cooling. Nvidia green way ftw :P
Will be fun to see it go up against Titan 2 :D
Why do some hermits have such a boner for PS4?
Mainly 3 reasons:
Number 1: The PS4 is the only really relevant console.
Number 2: The PS4 is the strongest console and has some games that are on par with or exceed what is available on PC (Killzone, Infamous Second Son, Driveclub, and soon The Order 1886 and after that Uncharted 4).
Number 3: PlayStation fans don't hide behind PC and don't suck up to the wannabe mustards, because they have no reason to, unlike Nintendo fans (since the Wii) and Xbox fans (since the Xbone).
unless AMD has some serious architectural changes in store, no way is this chip going to beat big Maxwell. Nvidia has so much more power and temperature headroom.
While this card will beat the GTX 980, it's the R9 390X, not the R9 380X, that will be the competitor for the new Nvidia Titan (full Maxwell).
There are rumored benchmarks that have been floating around from Chiphell, but I'm not sure I trust them yet.
Hmmm, a GPU that came out a year after the PS4 and will cost about as much as a PS4 will be more powerful?????
At the end of the day that R9 380X won't be able to play Uncharted 4.
But the funny thing is that PC hardware was leaps ahead of the console hardware nearly two years before the consoles' release. The 380X will most likely be between 30-40% faster than the 290X, with a max of 512GB/s memory bandwidth. What stands out is that all the leaks and promotions for HBM state it's 9x faster than GDDR5, but fail to point out that that figure is per module and per stack.
GDDR5 is 28GB/s per chip, and 1st-gen HBM can do 128GB/s per stack. With 1st-gen HBM I know they're able to do two stacks, which allows 256GB/s, and that would give the 9x claim. With a four-stack HBM setup it's 512GB/s. But in reality HBM isn't going to be a game changer for a while, until GPUs get strong enough to make use of the bandwidth. Even the 290X with its 512-bit bus and 320GB/s of bandwidth does not outright beat GPUs like the GTX 970 with only a 256-bit bus and 224GB/s.
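To make the per-stack arithmetic explicit, here's the same math as a small sketch (just the numbers from the paragraph above):

```python
# Sketch of where the "9x GDDR5" marketing figure comes from, using the
# per-chip / per-stack numbers above (all values in GB/s).
GDDR5_PER_CHIP = 28    # one GDDR5 chip
HBM1_PER_STACK = 128   # one 1st-gen HBM stack

two_stacks = 2 * HBM1_PER_STACK    # 256 GB/s
four_stacks = 4 * HBM1_PER_STACK   # 512 GB/s, the rumored 380X total

print(two_stacks / GDDR5_PER_CHIP)  # ~9.1 -> the "9x faster" claim
print(four_stacks)                  # 512
```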
Also, even at 20nm, with 4,096 cores its power consumption will most likely be near 290 levels, knowing AMD.
It depends on the game, e.g. Far Cry 4's enhanced god rays cause a sizable performance drop on the R9-290X, while the R9-285's performance remained similar to the volumetric fog god rays setting, which is similar to Maxwell's behaviour.
R9-380X includes improvements from R9-285.
R9-380X (64 CU) seems to be a near straight scaling from R9-285 (a 32 CU chip with 28 CU working), e.g. the 7870 was a near straight scaling from the 7770, i.e. 2X tessellation units, 2X CU, 2X ROPS.
R9-290X was a rough 4X scaling from the 7770, i.e. 4X tessellation units, near 4X CU, 4X ROPS, 4X ACE.
R9-285 almost matched R9-290X in Far Cry 4, hence I can understand why AMD didn't release an R9-285X (full 32 CU version).
http://www.hardocp.com/article/2015/01/12/far_cry_4_graphics_features_performance_review/2#.VL9PMk_9mpo
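For reference, here are the CU ratios behind that scaling argument as a quick sketch (the shipping-card configs are well known, but the 380X's 64 CU count is still just the rumor):

```python
# CU counts behind the "near straight scaling" argument. The shipping-card
# configs are well known; the 380X's 64 CU is still only the rumor.
cu_counts = {
    "7770": 10,      # Cape Verde
    "7870": 20,      # Pitcairn
    "R9-290X": 44,   # Hawaii
    "R9-285": 28,    # Tonga as shipped (32 CU chip, 28 enabled)
    "R9-380X": 64,   # rumored config
}

print(cu_counts["7870"] / cu_counts["7770"])       # 2.0 -> straight 2X
print(cu_counts["R9-290X"] / cu_counts["7770"])    # 4.4 -> rough 4X
print(cu_counts["R9-380X"] / cu_counts["R9-285"])  # ~2.3 vs the cut-down 285
```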
I say unoptimized drivers and/or just bad coding in the game, because with a 512-bit bus, 320GB/s, and a stronger GPU core, the 290X would normally outperform the 285. However, when the game came out the 7970/280X actually did as well as, if not a bit better than, the GTX 780.
R9-285 has an improved front end (including the tessellation units) with a larger cache-to-ALU-count ratio. Aggressive driver JIT optimisations hide architectural issues.
The 7970 has a larger cache-to-ALU-count ratio than Kepler.
Maxwell has a larger cache-to-ALU-count ratio than Kepler.
Hawaii's performance degradation is similar to the other older GCNs, with R9-285 being the exception.
I'm waiting for a proper 4K resolution single GPU card.
http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/7#.VL9WLU_9mpo
You see, the R9 285 (Tonga) received some tessellation performance improvements in that new iteration of GCN. You can refer back to this table where we outlined all the improvements in each iteration of GCN. You will notice that improved tessellation is one of the upgrades with Tonga. Since Godrays are tessellation based, this newer architecture does very well with it compared to any other AMD GPU. Even R9 290X cannot touch how well R9 285 is doing tessellation now. The R9 285 also uses a new color compression scheme which also improves performance.
your bait is even worse than your mom
Oooh your mom jokes. Almost as bad as his bait. Now that's irony.
you just slipped in your own shit
@legendofsense: It's horrible cuz you're salty.
You win. I can't beat that first-grade logic.
@gpuking: after how massively downgraded the gameplay footage was from that E3 teaser, I'm not putting any stock in this photo being representative of anything until the game is released and I see exactly what ND has achieved.
The change in time of day, lighting conditions and wetness could be the main influence. The model is exactly the same, as are the textures and general shading quality. Just wait till you see a twilight scene in the actual game before you pass it off. Have faith my friend.