Tighaman's forum posts

Avatar image for Tighaman
Tighaman

1038

Forum Posts

0

Wiki Points

3

Followers

Reviews: 0

User Lists: 0

#1 Tighaman
Member since 2006 • 1038 Posts

How many people actually played KZ:SF? They only showed demos to certain people. Fallon played it, but it wasn't on the floor at E3 or Comic-Con, and they're just showing another demo at Gamescom. Meanwhile, everything MS said is coming at launch is being played by people in the showrooms at high fidelity, 1080p 60fps. 2.5 GB of GDDR5 is not going to be enough if you've got a game that needs 2 GB of VRAM and the GPU has to do CPU tasks because GDDR5 is used for the whole system. That GPU is gonna take a major hit.
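The memory-budget worry above comes down to simple subtraction. A minimal sketch, using the post's own (unconfirmed) figures of a 2.5 GB game budget and a 2 GB VRAM footprint:

```python
def cpu_side_leftover_gb(game_budget_gb: float, vram_use_gb: float) -> float:
    """What remains of a game's unified-memory budget after VRAM use.
    Both figures are the post's claims, not confirmed PS4 specs."""
    return game_budget_gb - vram_use_gb

# The post's scenario: a 2.5 GB budget and a "2 GB VRAM game"
# leaves 0.5 GB for CPU-side data.
leftover = cpu_side_leftover_gb(2.5, 2.0)
print(leftover)  # 0.5
```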


#2 Tighaman
Member since 2006 • 1038 Posts

It's easy to see what's happening with the PS4 and the Xbox One. With the PS4 they customized the GPU to do some CPU tasks, so they knew GDDR5 was going to have CPU problems, and it is a problem: the GPU is locked at 2 GB of VRAM max, so you're making that GPU work very hard to carry all the GPU load and some of the CPU load with 2.5 GB of RAM left. So they might see some bad results in games that are VRAM-hungry. With the Xbox One they chose the cloud to do the same thing instead, but efficiency is going to make Microsoft win: the Windows OS and DX 11.1/11.2 are the heart of gaming right now, and that's what's in the Xbox One.


#3 Tighaman
Member since 2006 • 1038 Posts

[QUOTE="blackace"][QUOTE="StormyJoe"]

Exactly where does that article say that it is a rumor?

xboxiphoneps3

It's considered a rumor since the information came from a developer. Unless Sony themselves confirm whether it's true, it's just a rumor. There is only one source saying this right now.

You fail to realize that the PS4 dev kit doesn't have any more than 8 gigs of RAM; the APU is 256-bit and can't address any more than 8 GB of RAM. And they're 100% not using a DDR3/GDDR5 RAM combo on the dev kits, as that would make absolutely no sense at all and wouldn't be developing the game to the PS4's specs and architecture.

I'm going to tell you again: it was DDR3 for the CPU and GDDR5 for the GPU in the dev kits. I promise, take my word for it. It's the reason they customized the GPU to do some CPU tasks, and that's where the problem is going to lie.



#5 Tighaman
Member since 2006 • 1038 Posts

The Xbox One uses a hypervisor, so devs can use up to 6.5 GB of RAM, and 6.5 > 5.5. Also, Killzone used 4.5 GB for a demo, so they were most likely using all of the VRAM that's capable of being used; that's close to maxing out. :o
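The comparison above is just arithmetic on claimed RAM budgets. A sketch using the post's (unconfirmed) numbers:

```python
XB1_DEV_BUDGET_GB = 6.5   # claimed Xbox One game budget under the hypervisor
PS4_DEV_BUDGET_GB = 5.5   # claimed PS4 developer budget
KILLZONE_DEMO_GB = 4.5    # claimed Killzone: Shadow Fall demo usage

def headroom_gb(budget_gb: float, used_gb: float) -> float:
    """RAM left over after a game's reported usage."""
    return budget_gb - used_gb

assert XB1_DEV_BUDGET_GB > PS4_DEV_BUDGET_GB  # the "6.5 > 5.5" claim
print(headroom_gb(PS4_DEV_BUDGET_GB, KILLZONE_DEMO_GB))  # 1.0
```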


#6 Tighaman
Member since 2006 • 1038 Posts

Sony is about to sweet-talk them fanboy panties right off, and they're not going to call after.


#7 Tighaman
Member since 2006 • 1038 Posts

[QUOTE="deeliman"][QUOTE="RedentSC"] Any post you make is automatically invalid... just saying

CallOfDutyRulez

He is the definition of a fakeboy.

- Insiders confirm the 5 billion transistors are for the GPU ONLY.

2x AMD 7970 = 5 billion transistors.

>768 ALUs for the GPU

Nvidia GTX 680 = 1536 ALUs
AMD 7970 = 2048 ALUs
Nvidia's architecture is more efficient, so its 1536 ALUs can beat the 7970's 2048 ALUs.

So to beat Titan and its 2880 ALUs, a GCN 1.0 part from AMD would need 3840 ALUs, about 1.3 times Titan's ALU count.

AMD recently created a GCN 2.0 GPU with 2880 ALUs and claimed it would work more efficiently than NVIDIA's Titan.

More specifically, GCN 2.0 will have a 30-40% more efficient design, meaning that with GCN 2.0, the ONE's GPU would only need 2880 ALUs.

Weren't you the same guy arguing with me about what they had in the PS4 dev kits? Didn't I say it wasn't gonna work because the dev kits had DDR3 for the CPU?
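The quoted efficiency argument reduces to one multiplication: how many GCN 1.0 ALUs would match Titan's 2880 given the claimed 30-40% efficiency deficit. A sketch of the quoted post's own figures (none of which are confirmed specs):

```python
def alus_needed(target_alus: int, efficiency_deficit: float) -> float:
    """ALUs a less efficient architecture needs to match a more efficient one
    that has `target_alus`, given its fractional efficiency deficit."""
    return target_alus * (1 + efficiency_deficit)

# The quoted figures: 2880 (Titan) with a ~33% deficit gives 3840 GCN 1.0 ALUs
# (which the post rounds to "1.3 times").
print(alus_needed(2880, 1 / 3))  # 3840.0
```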


#8 Tighaman
Member since 2006 • 1038 Posts

[QUOTE="Tighaman"]

[QUOTE="CallOfDutyRulez"]

Making up more bullshit? :lol:

DF confirmed The Division was running on the PS4. :cool:

And post some proof that the PS4 devkit has a DDr3 RAM for the CPU.

CallOfDutyRulez

Yes, it was running on the PS4 dev kit, but if you read it, it was mostly at 10 fps. And again, reading is fun: if you read more about the dev kit, you will find out that it was DDR3 for the CPU and GDDR5 for the GPU.

http://www.eurogamer.net/articles/digitalfoundry-tech-analysis-tom-clancys-the-division

"The Division's frame-rate is a solid 30fps based on what's shown so far, with only hints of dips to 28fps during grenade explosions."

And if you're talking about 10 FPS, maybe you should look at the ONE's DR3, which has last-gen graphics and runs at 10 FPS.

There is no mention of DDR3 for the CPU in the PS4 devkit... You're just making sh*t up, like always.

My bad, I was reading about when they were beginning to port from the PC to the PS4. My bad, I accept the ownage. But again, just a demo on a dev kit is not the same; we have seen games on the Xbox running on actual hardware, NO KITS, at high fidelity. Why not the PS4?


#9 Tighaman
Member since 2006 • 1038 Posts

[QUOTE="Tighaman"]

What open-world game, Infamous? Pop-ins everywhere, even in the cutscenes, at this moment. Ass. Creed? Running on the PC that's sitting on the far left side behind the curtains at E3? And that was a dev kit. But what you don't understand is that the PS4 dev kit had DDR3 inside for the CPU and GDDR5 for the GPU; it's not going to convert over well to an all-GDDR5 console. I think I made plenty of sense, by the way.

xboxiphoneps3

Nope, on actual PS4 dev kits, and it's pretty obvious why they didn't show XB1 multiplat gameplay footage... Nonsense; many developers praised the 8 gigs of GDDR5 in the PS4, and the PS4 has gotten tons more praise from devs than the Xboner.

You really need to listen to every word people say; I play on words. Like I said, GDDR5 will be a great benefit to the PS4's GPU but bad for the CPU. They didn't show footage because, just like EA/MS, they have agreements and contracts. The PS4 had DDR3 and GDDR5 in the dev kit.


#10 Tighaman
Member since 2006 • 1038 Posts

[QUOTE="tormentos"]

[QUOTE="ronvalencia"]

X1's cooling solution rivals the Blu-ray drive's size.


Wouldn't the Xbox One be closer to an HD 7790, except at a lower clock speed?

Cooling solution for Radeon HD 7850 (~130 watts TDP)

[Image: HIS Radeon HD 7850 IceQ X TurboX cooler]

For Crysis 2 DX11, X1's GPU would be closer to the W5000 than to the 7770, i.e. due to the improved 2-primitives-per-cycle rate, higher register storage (scales with CUs), higher L1 cache (scales with CUs), greater LDS (scales with CUs), greater L2 cache (scales with memory controllers), and higher effective memory bandwidth (DDR3 68 GB/s + 32 MB ESRAM 192 GB/s + JIT LZ compression/decompression).

The 7770 only has 72 GB/s of memory bandwidth.

ronvalencia

See, this is why I make so much fun of you.

Links to where it says that the card you just posted there is a 5 billion transistor APU with an 8-core CPU and a 1.6 billion transistor ESRAM.

No, it will not; that is what you want to assume. The W5000 is not 1.13 TF, it's 1.3 TF, and has a higher than 800 MHz clock too.

I want a link to where it says the 7850 has a big APU like the Xbox One, so that your totally wrong comparison even starts to make sense.

There's only a minor effective difference between 800 MHz and 850 MHz (the clock required for 1.3 TFLOPS); you're talking about 1 to 2 FPS.

The difference between 800 MHz and 850 MHz is even smaller than the difference between the X1 and the PS4.

My 12-CU 31 FPS estimate is lower than the W5000's 33 FPS.
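The 800 MHz vs 850 MHz point follows from the standard GCN peak-throughput formula (64 ALUs per CU, 2 FLOPs per ALU per cycle). A sketch:

```python
def gcn_peak_tflops(cus: int, clock_mhz: float) -> float:
    """Peak single-precision TFLOPS for a GCN GPU:
    CUs x 64 ALUs x 2 FLOPs/cycle x clock."""
    return cus * 64 * 2 * clock_mhz / 1e6

print(gcn_peak_tflops(12, 800))  # 1.2288 -- the 12-CU figure at 800 MHz
print(gcn_peak_tflops(12, 853))  # ~1.31  -- roughly the "850 MHz for 1.3 TFLOPS" point
```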

------------

"The power consumption of SRAM varies widely depending on how frequently it is accessed; it can be as power-hungry as dynamic RAM, when used at high frequencies".