Wickerman777's forum posts

Avatar image for Wickerman777
Wickerman777

2164

Forum Posts

0

Wiki Points

1

Followers

Reviews: 0

User Lists: 0

#1  Edited By Wickerman777
Member since 2013 • 2164 Posts

@lostrib said:

@Riverwolf007 said:

i wish you dudes were not such freaks about resolution.

it uses shitloads of system resources and you get nothing out of it.

now because of the ignorance of the unwashed masses they focus on resolution instead of important effects like draw distance, lighting, particles, water effects, etc, etc, etc...

nice job derp squad.

uh it makes the image not look like a blurry mess

Lol, like 1080p or even 720p is a "blurry mess".

#2  Edited By Wickerman777
Member since 2013 • 2164 Posts

I didn't read the thread but your thread title sure is true ... of some sites anyway. I used to read Kotaku quite a bit but eventually grew tired of all the far-left propaganda articles there that have nothing to do with gaming.

#3  Edited By Wickerman777
Member since 2013 • 2164 Posts

More like UE 3.5 since the weakness of the "next-gen" consoles forced Epic to scale it back.

#4 Wickerman777
Member since 2013 • 2164 Posts

Speaking of feminism ...

http://www.debunker.com/patriarchy.html

... the facts fly in the face of what many people have been led to believe.

#5 Wickerman777
Member since 2013 • 2164 Posts
@ragnaris said:

@Wickerman777:

The main memory bandwidth has to satisfy both the CPU and the GPU. Additionally, there is serious memory contention in shared-memory systems, including AMD's line of APUs, which lowers main memory bandwidth significantly from its theoretical peak.

Whatever. Then make the bus 384-bit. It still wouldn't be a $200 difference. A $100 difference, yes, but not $200. We're only talking about 600 extra GFLOPS here.

#6  Edited By Wickerman777
Member since 2013 • 2164 Posts

@ragnaris said:

@Wickerman777:

It'd probably be $200 more to jump to a 24-CU GPU.

For a 24-CU GPU to function optimally it'd require a memory bandwidth increase on both systems.

Yields would go down, as would chips per wafer, so price would go up.

It'd also increase the TDP (heat output under load), so they'd either have to downclock the APU, put the CPU and GPU on separate chips, or use a more robust cooling system.

Increased power consumption would mean a higher-wattage power supply.

I don't believe the cost would be $200 higher. All 24 CUs running at 800 MHz would give you is roughly the power of a 7870 running in a PC, which has 20 CUs but at a higher clock. Currently the PS4 is running on par with a 7850 (and the X1 is on par with a 7770). There certainly isn't a $200 difference between a 7870 and a 7850. The 7870's memory bus is also 256-bit. It would mean more wattage, but not too much more.
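The CU-versus-clock comparison can be checked directly. A sketch using the public GCN arithmetic (64 shaders per CU, 2 floating-point ops per shader per cycle); the clocks are the commonly cited ones, not figures from this thread:

```python
def gcn_tflops(cus, clock_mhz):
    """Peak single-precision TFLOPS for a GCN GPU: CUs * 64 shaders * 2 ops/cycle * clock."""
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(gcn_tflops(18, 800))   # PS4's 18 CUs at 800 MHz: ~1.84 TFLOPS
print(gcn_tflops(24, 800))   # hypothetical 24 CUs at the same clock: ~2.46 TFLOPS
print(gcn_tflops(20, 1000))  # Radeon HD 7870, 20 CUs at 1 GHz: ~2.56 TFLOPS
```

So a 24-CU part at console clocks lands just shy of a stock 7870, which is the comparison the post is making.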

#7 Wickerman777
Member since 2013 • 2164 Posts

I define it the way Epic Games does. They say last-gen consoles were basically 250 GFLOPS machines. PS4 is 1.8 TFLOPS and X1 is 1.3 TFLOPS. So PS4 is 7X as powerful as last gen and X1 is 5X.
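Taken at face value, those figures give the multipliers in the post. A quick check, using Epic's ~250 GFLOPS last-gen number as quoted:

```python
LAST_GEN_TFLOPS = 0.25  # Epic's ~250 GFLOPS figure for last-gen consoles

print(round(1.8 / LAST_GEN_TFLOPS, 1))  # PS4: 7.2x, i.e. "7X" rounded down
print(round(1.3 / LAST_GEN_TFLOPS, 1))  # X1: 5.2x, i.e. "5X" rounded down
```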

#8 Wickerman777
Member since 2013 • 2164 Posts

@emgesp said:

@Wickerman777 said:

@miiiiv said:

So basically the leaps are getting smaller and smaller over the generations from a raw performance perspective, and the PS2 was the biggest leap over its predecessor. Also, the TDP has increased each generation except this one, where it actually decreased instead. We are never going to see a ~50x performance increase again unless some revolutionary new processor technology comes along.

Well, no. 50X ain't gonna happen, that'd be insane. But considering that there was an 8-year gap between this gen and last gen, I expected a 10X processing improvement. Instead it's 7X for PS4 and 5X for X1.

PS4 is over 9x more powerful than the PS3. The PS3's GPU tops out at 192 GFLOPS. PS4 is 1.84 TFLOPS.

I knew somebody was going to cheat like that, which is why I said in another reply it should be compared to the X360 instead of the PS3, because the PS3 also draws graphics from the CPU. And when compared to the X360 it's roughly 7X as powerful, not 9X, and certainly not 10X.
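The 9x-versus-7x disagreement comes down entirely to the baseline. A sketch using the PS3 GPU figure quoted above; the X360's ~240 GFLOPS Xenos number is an assumption here, since it isn't given in the thread:

```python
PS4_TFLOPS = 1.84
RSX_TFLOPS = 0.192    # PS3 GPU figure quoted in the post above
XENOS_TFLOPS = 0.24   # commonly cited X360 GPU figure (assumed, not from the thread)

print(round(PS4_TFLOPS / RSX_TFLOPS, 1))    # ~9.6x against the PS3's GPU alone
print(round(PS4_TFLOPS / XENOS_TFLOPS, 1))  # ~7.7x against the X360, the "roughly 7X"
```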

#9  Edited By Wickerman777
Member since 2013 • 2164 Posts

@edwardecl said:

PS1 [MIPS 33.8MHz] -> PS2 [MIPS 294MHz] (>8.6x with new instructions)

PS2 [MIPS 294MHz, 10.4M transistors] -> PS3 [PPC 7 SPU 3.2GHz, 234M transistors] (a bit of a hard one to compare directly, but at least 10x the clock speed and 22.5x the transistor count, so it falls somewhere between the two most probably)

PS3 [PPC 6 SPU 3.2GHz, 234M transistors] -> PS4 [x64 Jaguar 8-core 1.6GHz] (again another hard one: on clock speed alone it's slower, but it's a totally different architecture with separate cores. Assuming the Xbox One and PS4 Jaguar cores are identical, MS claims 7.1 billion transistors, and from the pictures less than half are for the CPU, probably more like 1/3, so that is ~2.5 billion transistors, so up to ~10x)

so...

CPU

PS1 -> PS2 = (~9x)

PS2 -> PS3 = (10-23x)

PS3 -> PS4 = (~5-10x)

RAM

PS1 [3MB] -> PS2 [36MB] (12x)

PS2 [36MB] -> PS3 [512MB] (14.2x)

PS3 [512MB] -> PS4 [8GB] (16x)

Bus Speed

PS1 [132MB/s] -> PS2 [3.2GB/s] (24.8x)

PS2 [3.2GB/s] -> PS3 [16.8GB/s - 22.4GB/s] (5.25x - 7x) The GPU and CPU access RAM at different speeds.

PS3 [16.8 GB/s - 22.4GB/s] -> PS4 [176GB/s] (7.8 - 10.4x)

Make of that what you will.
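The RAM multipliers in the breakdown above check out arithmetically. A quick sketch over the numbers exactly as quoted:

```python
# Total RAM per console, in MB, as given in the quoted breakdown
ram_mb = {"PS1": 3, "PS2": 36, "PS3": 512, "PS4": 8192}

gens = ["PS1", "PS2", "PS3", "PS4"]
for prev, cur in zip(gens, gens[1:]):
    print(f"{prev} -> {cur}: {ram_mb[cur] / ram_mb[prev]:.1f}x")
# PS1 -> PS2: 12.0x, PS2 -> PS3: 14.2x, PS3 -> PS4: 16.0x
```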

You left out the GPU, the most important component. And the jump with that ain't what it should have been. And comparing to the PS3 GPU ain't a fair comparison because the CPU in that system is also responsible for graphics. Because of that the X360 is a good system to compare the PS4 to, and the jump there is only 7X in the GPU department. Not terrible by any stretch, but the generation was 8 freakin' years long and therefore it should have been 10X.

#10  Edited By Wickerman777
Member since 2013 • 2164 Posts

Going with the same kind of architecture they have now, these consoles should have had 24 compute units instead of 12 (X1) and 18 (PS4), and 12 Jaguar CPU cores instead of 8 (and the X1 needs GDDR5 instead of DDR3). Doing that would have probably raised the price by $100, but I think people would have been very happy with them that way.