WiiU (ZombiU), Witcher 2 (360) and next-gen systems Xbox 720 (FF15 Agni)

This topic is locked from further discussion.


#301 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"]

[QUOTE="ronvalencia"]

Old game engines, and they are not shader intensive. GeForce 7 and RSX can't even do HDR FP + MSAA.

On the GeForce 7 family, a texture fetch operation causes cascading ALU stalls, i.e. stall "FP32 Shader Unit 1" and you stall the rest of the ALU pipeline.

[Image: GeForce 7 pixel shader pipeline block diagram]

ronvalencia

Again, your point being? HDR + MSAA has been done on PS3

Note the HDR FP, not integer-based HDR.
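(For anyone following the HDR FP + MSAA point: here is a minimal sketch, assuming a Direct3D 9 setup, of how an engine would ask the API whether a floating-point FP16 render target can also be multisampled -- the combination GeForce 7 class parts report as unsupported, which is the limitation being argued about. The helper name is made up; the interface and call are the real D3D9 ones.)

#include <d3d9.h>

// Hypothetical helper: can this adapter multisample an FP16 (HDR) render target?
bool SupportsFp16Msaa(IDirect3D9* d3d, UINT adapter, D3DMULTISAMPLE_TYPE msaa)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        adapter,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,   // 16-bit floating point RGBA, i.e. "HDR FP"
        TRUE,                   // windowed
        msaa,                   // e.g. D3DMULTISAMPLE_4_SAMPLES
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}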

The only thing that I've noted is that you're a desperate fool who does nothing but post other people's comments and technical breakdowns to give the impression that you know what you're talking about, when in reality that's far from the truth.


#302 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"]

Again, your point being? HDR + MSAA has been done on PS3

mrfrosty151986

Note the HDR FP, not integer-based HDR.

The only thing that I've noted is that you're a desperate fool who does nothing but post other people's comments and technical breakdowns to give the impression that you know what you're talking about, when in reality that's far from the truth.

That's all you can do? You're the desperate fool and you've got nothing.

I already knew about GeForce 7's design issues when I got my GeForce 8600M GT GDDR3, i.e. from NVIDIA's G7x vs G8x white paper.

PS: I can program in C++ with COM objects, i.e. DirectX.
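(Since "C++ with COM objects, i.e. DirectX" keeps coming up, a minimal sketch of what that looks like in practice, assuming the Direct3D 9 API -- the interface, the creation call and Release() are real; the surrounding program is just illustrative scaffolding.)

#include <d3d9.h>

int main()
{
    // Direct3DCreate9 returns a COM interface pointer (IDirect3D9).
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    // Calls go through the COM interface, e.g. enumerating display adapters.
    UINT adapterCount = d3d->GetAdapterCount();

    // COM objects are reference counted: Release() when finished with them.
    d3d->Release();
    return adapterCount > 0 ? 0 : 1;
}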


#303 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="mrfrosty151986"] Meens absolutely nothing...... Closed box system always go for the most efficient results so simply saying that RSX is bad at this is pointless... And were talking about a 7800GTX not RSX...jackass :|mrfrosty151986

You fool, PS3's NVIDIA RSX is part of Geforce 7 family. Geforce 7 and NVIDIA RSX doesn't have G80's Gigathreads technology.

You can't even read NVIDIA's whitepaper on G80 vs G70.

Close box system doesn't change the fact that RSX's ALUs stalls during texture fetch operations. CPU's Load/Store operation = GPU's texture fetch operation. Before opening your mouth learn some computer science 101.

You have no ideaof how things works if you think that...

That's rich, since PS3 has yet to beat my old ASUS G1S laptop. Regardless on close or open system, the ALU needs to fetch data and can't be avoided.

My Geforce 8600M GT GDDR3 has about half of theoretical GFLOPs of Geforce 7800 GTX yet 8600M GT's results is better than 7800 GTX in CryEngine 3(Crysis 2) and Unreal Engine 3 based engines.

Inferior shader branching from 7900 GTX which matches NVIDIA RSX's shader branching comments.

[Benchmark chart images]
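(A very loose CPU-side analogy of the stall-vs-latency-hiding point above -- a sketch only, since GPUs hide texture-fetch latency by switching between many pixel threads rather than anything like this, but the dependent-load problem is the same idea.)

#include <vector>
#include <cstdint>

// Dependent loads: each step needs the previous load's result, so execution
// waits on memory -- analogous to an ALU stalled behind a texture fetch.
uint32_t ChaseChain(const std::vector<uint32_t>& next, uint32_t start, int steps)
{
    uint32_t i = start;
    for (int s = 0; s < steps; ++s)
        i = next[i];          // cannot issue the next load until this one returns
    return i;
}

// Independent loads: iterations don't depend on each other, so the hardware can
// keep many loads in flight -- analogous to a G80-style scheduler swapping in
// other threads while a fetch is outstanding.
uint64_t SumAll(const std::vector<uint32_t>& data)
{
    uint64_t sum = 0;
    for (uint32_t v : data)
        sum += v;
    return sum;
}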


#304 kickass1337
Member since 2011 • 149 Posts
The thing is, consoles were never ahead of PCs. When the 360 launched, PCs could play the same games at way better settings. One year later the 8800 GTX came out alongside the PS3 launch, and it still crushes consoles in performance.

#305 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

Crysis 2 on Geforce 7800GTX = crap frame rates http://www.youtube.com/watch?v=klgVCk178OI

Btw, Sony PS3 has CELL to fix NVIDIA RSX's issues.

Crysis 2 on Radeon X1950 Pro runs like Xbox 360 version http://www.youtube.com/watch?v=jHWPGmf_A_0

RyviusARC

I think you are missing the whole point of the argument in which the troll LoosingENDs thinks the next gen console will be much more powerful than any PC out today.

He thinks this because of the Xbox 360.

He doesn't seem to understand that the Xbox 360's gpu was basically a 7600gt with unified shaders in terms of performance.

If there were no unified shader tech then it would be much weaker than it is.

There is nothing like unified shader tech coming next gen so it won't have that advantage.

Luckily for me, my 2004 PC was smashing the consoles early in the gen, and by the time shader-intensive games were coming along you could get an 8800 GT for under 200 USD, which was around 3 times the power of the consoles.

My point was, you can't generalise PCs based on a GeForce 7 type GPU.

Radeon X19x0 was able to keep up with the Xbox 360 in current Unreal Engine 3 and CryEngine 3 based games.

With the Xbox 720 and PS4 generation, there is no unified-shader-style design shift this time. Current GPUs already have SMT-type technology.


#306 RyviusARC
Member since 2011 • 5708 Posts

That's all you can do? You're the desperate fool and you've got nothing.

I already knew about GeForce 7's design issues when I got my GeForce 8600M GT GDDR3, i.e. from NVIDIA's G7x vs G8x white paper.

PS: I can program in C++ with COM objects, i.e. DirectX.

ronvalencia

I always found C++ to be rather inefficient since it is basically taking something that is already written and making it do something it wasn't specially designed to do.

Kind of ironic when I think about your posting style.


#307 Inconsistancy
Member since 2004 • 8094 Posts

[QUOTE="Inconsistancy"][QUOTE="mrfrosty151986"][QUOTE="loosingENDS"]

And yet it can't run anything even miles close to what Xenos runs, and is garbage in comparison

More things that can't actually be fully used due to last-gen fixed shader design and PC bottlenecks are not really that helpful

mrfrosty151986

A 7800GTX runs Quake 4, Doom 3 and Half-Life 2 much, much better than Xenos ever could.

"Stop comparing the way a console utilises GPU's to the way PC's do...you just make yourself look stupid."

I wonder who said that.... Hmmm.

I wonder who can't read.... Hmmm


I don't know if you're a troll, or if you're just 'that' stupid.

First you compared the performance of the 7800gtx, saying it'd crush the Xenos, to which I replied "Mirror's Edge" (as an example where it (the 7800gtx) falls on its face compared to Xenos)

Then you reply: "Stop comparing the way a console utilises GPU's to the way PC's do...you just make yourself look stupid."

Followed by: "A 7800GTX runs Quake 4, Doom 3 and Half-Life 2 much, much better than Xenos ever could."

Directly comparing a desktop GPU to the Xenos, again. Now 'I' can't read? You seem to not even know what the words 'you' type out actually mean in the first place.


#308 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

That's all you can do? You're the desperate fool and you've got nothing.

I already knew about GeForce 7's design issues when I got my GeForce 8600M GT GDDR3, i.e. from NVIDIA's G7x vs G8x white paper.

PS: I can program in C++ with COM objects, i.e. DirectX.

RyviusARC

I always found C++ to be rather inefficient since it is basically taking something that is already written and making it do something it wasn't specially designed to do.

Kind of ironic when I think about your posting style.

I use C++ for speed and C# for rapid development, i.e. it depends on the task at hand.


#309 RyviusARC
Member since 2011 • 5708 Posts

My point was, you can't generalise PCs based on a GeForce 7 type GPU. Radeon X19x0 was able to keep up with the Xbox 360 in current Unreal Engine 3 and CryEngine 3 based games.ronvalencia

I will try to lay off the insults, but I do know you are a big supporter of AMD..... a little too much.

If you could remain more neutral you might do better.

The truth is both Nvidia and AMD/ATI had their ups and downs.

I currently own an Nvidia card not because I think Nvidia are better but because they had an extra feature that I wanted.

I owned an ATI 9700pro back in 2002 and it kicked ass for quite some time, but I do not have brand loyalty because it's a petty thing ignorant people succumb to.

And don't bother trying to deny your loyalty to the AMD/ATI brand, I have seen your posts and know well enough.

.

And back to the topic, yes, I do agree with you that the X19x0 fares better in shader-intensive games.


#310 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]My point was, you can't generalise a PC with Geforce 7 type GPU. Radeon X19x0 was able keep up against Xbox 360 with current Unreal Engine 3 and CryEngine 3 base games.RyviusARC

I will try to lay off the insults, but I do know you are a big supporter of AMD..... a little too much.

If you could remain more neutral you might do better.

The truth is both Nvidia and AMD/ATI had their ups and downs.

I currently own an Nvidia card not because I think Nvidia are better but because they had an extra feature that I wanted.

I owned an ATI 9700pro back in 2002 and it kicked ass for quite some time, but I do not have brand loyalty because it's a petty thing ignorant people succumb to.

And don't bother trying to deny your loyalty to the AMD/ATI brand, I have seen your posts and know well enough.

.

And back to the topic, yes, I do agree with you that the X19x0 fares better in shader-intensive games.

My recent NVIDIA hardware

ASUS G1S laptop with GeForce 8600M GT GDDR3 256MB (dead, NVIDIA bumpgate; the entire G1S was rendered useless).

ASUS G1Sn laptop with Geforce 9500M GS GDDR2 512MB (free replacement for ASUS G1S from ASUS).

ASUS N80VN (14 inch) with Geforce 9650M GT GDDR2 1GB (it's a small laptop with NVIDIA GPU).

NVIDIA Geforce 8600 GT GDDR3 (my CUDA test GPU, DIY ViDocK test GPU).


On the GeForce 7 subject, NVIDIA has their own whitepaper on G8x vs G7x, i.e. NVIDIA talks about why a G7x owner should upgrade to G8x, e.g. they bashed their own G7x. https://aulavirtual.uji.es/pluginfile.php/774612/mod_resource/content/0/GPU/GeForce_8800_GPU_Architecture_Technical_Brief_1_.pdf


#311 RyviusARC
Member since 2011 • 5708 Posts

My recent NVIDIA hardware

ASUS G1S laptop with GeForce 8600M GT GDDR3 256MB (dead, NVIDIA bumpgate; the entire G1S was rendered useless).

ASUS G1Sn laptop with Geforce 9500M GS GDDR2 512MB (free replacement for ASUS G1S from ASUS).

ASUS N80VN (14 inch) with Geforce 9650M GT GDDR2 1GB (it's a small laptop with NVIDIA GPU).

NVIDIA Geforce 8600 GT GDDR3 (my CUDA test GPU, DIY ViDocK test GPU).


On the GeForce 7 subject, NVIDIA has their own whitepaper on G8x vs G7x, i.e. NVIDIA talks about why a G7x owner should upgrade to G8x, e.g. they bashed their own G7x. https://aulavirtual.uji.es/pluginfile.php/774612/mod_resource/content/0/GPU/GeForce_8800_GPU_Architecture_Technical_Brief_1_.pdf

ronvalencia

Yes I see that you own Nvidia hardware at some point but I don't see how the last line really matters.

Like I said both Nvidia and AMD/ATI had their ups and downs.

At one time I favored 3DFX more than any other GPU provider.

But I don't pick and choose based on brand.

And I know for a fact that you tend to shy away from saying much good about Nvidia and spend a lengthy amount of time praising AMD/ATI.


#312 mrfrosty151986
Member since 2012 • 533 Posts

RSX can't do this, RSX can't do that..... and yet I've not seen a single person comment on how poor Xenos's performance becomes when using predicated tiling.
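(For anyone wondering where predicated tiling comes from, a rough back-of-the-envelope sketch -- the 10 MB eDRAM figure is the published one, the rest is simple arithmetic, so treat it as an approximation rather than exact renderer behaviour.)

#include <cmath>
#include <cstdio>

int main()
{
    const double kEdramMB = 10.0;              // Xenos embedded framebuffer (eDRAM)
    const int width = 1280, height = 720;      // 720p render target
    const int bytesPerSample = 4 /*colour*/ + 4 /*depth+stencil*/;

    for (int samples = 1; samples <= 4; samples *= 2)   // no MSAA, 2x, 4x
    {
        double mb = double(width) * height * samples * bytesPerSample
                    / (1024.0 * 1024.0);
        int tiles = (int)std::ceil(mb / kEdramMB);      // tiles needed to fit in eDRAM
        std::printf("%dx MSAA: %.1f MB -> %d tile(s)\n", samples, mb, tiles);
    }
    return 0;
}
// Roughly: 1x ~7.0 MB -> 1 tile, 2x ~14.1 MB -> 2 tiles, 4x ~28.1 MB -> 3 tiles.
// Every extra tile means resubmitting (or binning) the geometry for that tile,
// which is where the extra vertex cost being argued about comes from.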


#313 ronvalencia
Member since 2008 • 29612 Posts

RSX can't do this, RSX can't do that..... and yet I've not seen a single person comment on how poor Xenos's performance becomes when using predicated tiling.

mrfrosty151986
It doesn't result in the Xbox 360 coming last in multi-platform games.

#314 ZombieKiller7
Member since 2011 • 6463 Posts
WiiU seems to be on-par with current gen. Or will be after they optimize the games better for the hardware.

#315 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

My recent NVIDIA hardware

ASUS G1S laptop with GeForce 8600M GT GDDR3 256MB (dead, NVIDIA bumpgate; the entire G1S was rendered useless).

ASUS G1Sn laptop with Geforce 9500M GS GDDR2 512MB (free replacement for ASUS G1S from ASUS).

ASUS N80VN (14 inch) with Geforce 9650M GT GDDR2 1GB (it's a small laptop with NVIDIA GPU).

NVIDIA Geforce 8600 GT GDDR3 (my CUDA test GPU, DIY ViDocK test GPU).


On the GeForce 7 subject, NVIDIA has their own whitepaper on G8x vs G7x, i.e. NVIDIA talks about why a G7x owner should upgrade to G8x, e.g. they bashed their own G7x. https://aulavirtual.uji.es/pluginfile.php/774612/mod_resource/content/0/GPU/GeForce_8800_GPU_Architecture_Technical_Brief_1_.pdf

RyviusARC

Yes I see that you own Nvidia hardware at some point but I don't see how the last line really matters.

Like I said both Nvidia and AMD/ATI had their ups and downs.

At one time I favored 3DFX more than any other GPU provider.

But I don't pick and choose based on brand.

And I know for a fact that you tend to shy away from saying much good about Nvidia and spend a lengthy amount of time praising AMD/ATI.

The issue is not about praising. What is there to praise about GeForce 7 with current workloads?

David Shippy from IBM praised Xbox 360's GPU. http://www.gamasutra.com/php-bin/news_index.php?story=21820

But Gamasutra also asked him about the relative power of the two systems -- since he worked so intimately on them, does he have an opinion on which was the more powerful?

"I'm going to have to answer with an 'it depends,'" laughs Shippy, after a pause. "Again, they're completely different models. So in the PS3, you've got this Cell chip which has massive parallel processing power, the PowerPC core, multiple SPU cores it's got a GPU that is, in the model here, processing more in the Cell chip and less in the GPU. So that's one processing paradigm -- a heterogeneous paradigm."

"With the Xbox 360, you've got more of a traditional multi-core system, and you've got three PowerPC cores, each of them having dual threads -- so you've got six threads running there, at least in the CPU. Six threads in Xbox 360, and eight or nine threads in the PS3 -- but then you've got to factor in the GPU," Shippy explains. "The GPU is highly sophisticated in the Xbox 360."

Note that the computing paradigm and the praise for the Xbox 360's GPU can also be applied to NVIDIA GeForce 8.


#316 mrfrosty151986
Member since 2012 • 533 Posts
[QUOTE="mrfrosty151986"]

RSX can't do this, RSX can't do that..... and yet I've not seen a single person comment on how poor Xenos's performance becomes when using predicated tiling.

ronvalencia
It doesn't result in the Xbox 360 coming last in multi-platform games.

There have been a few where the PS3 has had extra stuff like MLAA added...... and considering most multiplats are made on the 360, why would it?

#317 super600  Moderator
Member since 2007 • 33160 Posts

WiiU seems to be on-par with current gen. Or will be after they optimize the games better for the hardware.ZombieKiller7

It's a bit stronger than current gen consoles.


#318 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"]

RSX can't do this, RSX can't do that..... and yet I've not seen a single person comment on how poor Xenos's performance becomes when using predicated tiling.

mrfrosty151986
It doesn't result in the Xbox 360 coming last in multi-platform games.

There have been a few where the PS3 has had extra stuff like MLAA added...... and considering most multiplats are made on the 360, why would it?

PS3 has IBM CELL to fix/patch NVIDIA RSX's design issues.

#320 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="STOPSAMMING"][QUOTE="ronvalencia"][QUOTE="RyviusARC"]

Yes I see that you own Nvidia hardware at some point but I don't see how the last line really matters.

Like I said both Nvidia and AMD/ATI had their ups and downs.

At one time I favored 3DFX more than any other GPU provider.

But I don't pick and choose based on brand.

And I know for a fact that you tend to shy away from saying much good about Nvidia and spend a lengthy amount of time praising AMD/ATI.

The issue is not about praising. What is there to praise about GeForce 7 with current workloads?

mr. wannabe john carmack likes to spit a lot of words he doesn't understand.

That's all you can do?

#322 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="STOPSAMMING"][QUOTE="ronvalencia"][QUOTE="STOPSAMMING"] mr.wannabe john carmack likes to spit alot of words he doesnt understand.

That's all you can do?

my bad i forgot miss wannabe john carmack.

You have forgotten more than that.

#323 mrfrosty151986
Member since 2012 • 533 Posts
[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"][QUOTE="ronvalencia"] It doesn't render Xbox 360 coming last in multi-platform games.

There's been a few that PS3 has had extra stuff like MLAA added...... and considering most multiplats are made on 360 why would it?

PS3 has IBM CELL to fix/patch NVIDIA RSX's design issues.

RSX doesn't have any design issues, the only thing it only really falls short of when compared to Xenos is vertex processing but Xenos drops down to RSX levels when using predicted tiling anyway so it's a moot point.

#324 STOPSAMMING
Member since 2012 • 109 Posts
[QUOTE="ronvalencia"][QUOTE="STOPSAMMING"][QUOTE="ronvalencia"] That's all you can do?

my bad i forgot miss wannabe john carmack.

You have forgotten more than that.

I forgot u were 15 also.

#325 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"] There's been a few that PS3 has had extra stuff like MLAA added...... and considering most multiplats are made on 360 why would it?mrfrosty151986
PS3 has IBM CELL to fix/patch NVIDIA RSX's design issues.

RSX doesn't have any design issues, the only thing it only really falls short of when compared to Xenos is vertex processing but Xenos drops down to RSX levels when using predicted tiling anyway so it's a moot point.

It has design issues with complex shader workloads and full 32bit compute. NVIDIA RSX has issues with hardware culling, 3DC+ texture compression support.

Xbox 360's predicted tiling issue doesn't directly affect compute characteristics.


#326 percuvius2
Member since 2004 • 1982 Posts

[QUOTE="04dcarraher"][QUOTE="lostrib"]

He seems to be a bit delusional

loosingENDS

That's an understatement. Compared to full medium settings, the 360 version is missing texture quality, uses 2D sprites, has lower draw distances, lower LOD, missing shadows, and 10x the loading screens. The 360 version isn't even close to full medium settings on PC.

Actually it runs above PC medium and, according to Digital Foundry, has vastly better lighting on top

And AA that PC does not have in the tests

HAHAHA, I notice you always run from my posts where I post what the PC version of TW2 really looks like. Digital 720p Foundry don't even know how to run a PC above 720p; they're a disgusting joke. I'll just look at the actual TW2 running on my own PC. I'll post some more OWNAGE of the 360 consolized version below. All pics are taken from my PC running in Ultra mode on a Radeon 7870. All you have are UGLY pics from some third party for your 360 defense. Go take any screenshot from your 360 and not one will compete with mine.

WHAT TW2 REALLY LOOKS LIKE:

[Four screenshots of The Witcher 2 running at Ultra settings on the poster's PC]

Keep ignoring the FACT that THIS is the FULL version of TW2 while you play a little low-detail, low-res, sprite-infested, low-framerate, screen-tearing, long-load-times, faded-colour-palette consolized version. In fact you really haven't even played TW2; you're just a wannabe pretender. If Digital Foundry want a lesson in how to build a proper gaming PC they can contact me through GameSpot.


#327 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="STOPSAMMING"][QUOTE="ronvalencia"][QUOTE="STOPSAMMING"] my bad i forgot miss wannabe john carmack.

You have forgotten more than that.

I forgot u were 15 also.

LOL, I guess I had a time machine when purchasing my Amiga 500 back in 1989.

#328 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"][QUOTE="ronvalencia"] PS3 has IBM CELL to fix/patch NVIDIA RSX's design issues.ronvalencia

RSX doesn't have any design issues; the only thing it really falls short on compared to Xenos is vertex processing, but Xenos drops down to RSX levels when using predicated tiling anyway, so it's a moot point.

It has design issues with complex shader workloads and full 32-bit compute. NVIDIA RSX has issues with hardware culling and 3DC+ texture compression support.

Makes no difference in a closed box system and they're not major problems anyway.


#329 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="mrfrosty151986"] RSX doesn't have any design issues, the only thing it only really falls short of when compared to Xenos is vertex processing but Xenos drops down to RSX levels when using predicted tiling anyway so it's a moot point.mrfrosty151986

It has design issues with complex shader workloads and full 32bit compute. NVIDIA RSX has issues with hardware culling, 3DC+ texture compression support.

Makes no difference in a closed box system.

Close box system doesn't change the hardware characteristics. PS3 has CELL to make up for the lost compute performance.


#330 mrfrosty151986
Member since 2012 • 533 Posts
[QUOTE="mrfrosty151986"][QUOTE="ronvalencia"]

It has design issues with complex shader workloads and full 32-bit compute. NVIDIA RSX has issues with hardware culling and 3DC+ texture compression support.

ronvalencia
Makes no difference in a closed box system.

Closed box system doesn't change the hardware characteristics.

Developers just code around the problems, and looking at exclusive games the PS3 is doing just fine.

#331 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"] Makes no difference in a closed box system.mrfrosty151986
Closed box system doesn't change the hardware characteristics.

Developers just code around the problems, and looking at exclusive games the PS3 is doing just fine.

The workaround involves IBM's CELL processor.


#332 mrfrosty151986
Member since 2012 • 533 Posts
[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"][QUOTE="ronvalencia"] Close box system change the hardware characteristics.

Developers just code round the problems and looking at exclusive games PS3 is doing just fine.

The workaround involves the CELL processor.

Not really, on real world pixel shader through put RSX and Xenos are about equal with Xenos having the upper hand when it comes to vertex loads. Cell adds the icing on the cake which is why PS3 exclusive games have looked much better then 360 exclusives now for the last couple of years.

#333 R4gn4r0k
Member since 2004 • 49125 Posts

ANOTHER loosingENDS witcher 2 thread. Are you fvcking kidding me ?


#334 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"] Developers just code round the problems and looking at exclusive games PS3 is doing just fine.mrfrosty151986
The workaround involves the CELL processor.

Not really, on real world pixel shader through put RSX and Xenos are about equal with Xenos having the upper hand when it comes to vertex loads. Cell adds the icing on the cake which is why PS3 exclusive games have looked much better then 360 exclusives now for the last couple of years.

On Battlefield 3, DICE use SPEs for Deferred Rendering while Xbox 360 has a GPU powerful enough to match RSX+SPEs.

Exclusive vs exclusive comparsion doesn't negate artwork subjectivity.


#335 percuvius2
Member since 2004 • 1982 Posts

[QUOTE="STOPSAMMING"][QUOTE="ronvalencia"] You have forgotten more than that. ronvalencia
I forgot u were 15 also.

LOL, I guess I had a time machine when purchasing my Amiga 500 back in 1989.

I had an Amiga 1000, 500 and 2000. I still miss Moria!


#336 AznbkdX
Member since 2012 • 4284 Posts

[QUOTE="ronvalencia"]My point was, you can't generalise a PC with Geforce 7 type GPU. Radeon X19x0 was able keep up against Xbox 360 with current Unreal Engine 3 and CryEngine 3 base games.RyviusARC

I will try to lay off the insults, but I do know you are a big supporter of AMD..... a little too much.

If you could remain more neutral you might do better.

The truth is both Nvidia and AMD/ATI had their ups and downs.

I currently own an Nvidia card not because I think Nvidia are better but because they had an extra feature that I wanted.

I owned an ATI 9700pro back in 2002 and it kicked ass for quite some time, but I do not have brand loyalty because it's a petty thing ignorant people succumb to.

And don't bother trying to deny your loyalty to the AMD/ATI brand, I have seen your posts and know well enough.

.

And back to the topic, yes, I do agree with you that the X19x0 fares better in shader-intensive games.

You have to admit though, it's not easy to argue with him still. :P

I am not going to say I am an expert at this stuff, but I do know some of the things being talked about in this argument, and I see a lot of ignorance and no facts when going against the ATI/AMD guy. The bias for AMD is obvious, and even if there's a ton of fact checking on posts with pictures, forum posts, etc., at least there is some evidence, no matter how off the wall.

I don't mind the posting style. It's a hell of a lot better than just saying "you're wrong" with only a bit of evidence, which is what I've seen at times on here.


#337 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"][QUOTE="ronvalencia"] The workaround involves the CELL processor.ronvalencia

Not really; on real-world pixel shader throughput RSX and Xenos are about equal, with Xenos having the upper hand when it comes to vertex loads. Cell adds the icing on the cake, which is why PS3 exclusive games have looked much better than 360 exclusives for the last couple of years.

On Battlefield 3, DICE use SPEs for Deferred Rendering while Xbox 360 has a GPU powerful enough to match RSX+SPEs.

Exclusive vs exclusive comparison is subjective and doesn't negate artwork subjectivity.

Oh dear god, deferred rendering is not something that requires a monster GPU; it's just a different way of doing things than the old forward rendering and does not require a set amount of GPU power. And why would you put the main renderer on Cell and not the GPU (RSX)? You need to read DICE's papers because you're wrong, my friend. SPEs were used for lighting and such though.

#338 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="STOPSAMMING"] I forgot u were 15 also.percuvius2

LOL, I guess I had a time machine when purchasing my Amiga 500 back in 1989.

I had an Amiga 1000, 500 and 2000. I still miss Moria!

I had an Amiga 500 (1989) and a 3000 (1992). I used the Amiga 3000 for video work, i.e. PowerPoint didn't exist at that time.

#339 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="mrfrosty151986"] Not really, on real world pixel shader through put RSX and Xenos are about equal with Xenos having the upper hand when it comes to vertex loads. Cell adds the icing on the cake which is why PS3 exclusive games have looked much better then 360 exclusives now for the last couple of years.mrfrosty151986

On Battlefield 3, DICE use SPEs for Deferred Rendering while Xbox 360 has a GPU powerful enough to match RSX+SPEs.

Exclusive vs exclusive comparsion is subjective and doesn't negate artwork subjectivity.

Oh dear god, deferred rendering is not something that requires a monster GPU, it's just a difference way of doing thing then the old forward rendering and does not require a set amount of GPU power. And why would you put the main render-er on Cell and not a GPU ( RSX ) You need to read DICE's papers because you're wrong my friend. SPE's were used for lighting and such though.

You are making a lot of assumptions in regards to "why would you put the main render-er on Cell and not a GPU". You are putting words that doesn't reflect my POV.

If you read DICE's papers, DICE has stated Xenos's ALUs are powerfull enough (i.e. Xbox 360 doesn't have DirectCompute or SPEs) and GpGPU capable.


#340 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"][QUOTE="ronvalencia"]

On Battlefield 3, DICE use SPEs for Deferred Rendering while Xbox 360 has a GPU powerful enough to match RSX+SPEs.

Exclusive vs exclusive comparison is subjective and doesn't negate artwork subjectivity.

ronvalencia

Oh dear god, deferred rendering is not something that requires a monster GPU; it's just a different way of doing things than the old forward rendering and does not require a set amount of GPU power. And why would you put the main renderer on Cell and not the GPU (RSX)? You need to read DICE's papers because you're wrong, my friend. SPEs were used for lighting and such though.

You are making a lot of assumptions in regards to "why would you put the main renderer on Cell and not a GPU". You are putting words in my mouth that don't reflect my POV.

If you read DICE's papers, DICE has stated Xenos's ALUs are powerful and GPGPU capable.

The answer is simple: you would never put the main renderer on a CPU when you have a dedicated GPU to do it, so no, DICE did not use Cell to get deferred rendering working :|

And if Xenos comes up against a high vertex load, its pixel shading ability drops and you get a performance drop; no such problem on PS3.

Then you have vertex/pixel shader, memory bandwidth, fill rate and other performance drops from using tiling on the eDRAM.


#341 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="mrfrosty151986"] Oh dear god, deferred rendering is not something that requires a monster GPU, it's just a difference way of doing thing then the old forward rendering and does not require a set amount of GPU power. And why would you put the main render-er on Cell and not a GPU ( RSX ) You need to read DICE's papers because you're wrong my friend. SPE's were used for lighting and such though.mrfrosty151986

You are making a lot of assumptions in regards to "why would you put the main render-er on Cell and not a GPU". You are putting words that doesn't reflect my POV.

The answer is simple, you would never put the main render-er on a CPU when you have a dedicated GPU to do it, so no DICE did not use Cell to get deferred rendering working :|

Deferred rendering for lights (DRL) was running on 5 to 6 SPEs and has HDR FP16 output.

DirectCompute based titled deferred rendering for lights served as the basis for SPEs version.

RSX still handles SSAO, shadows, particles, geometry and 'etc'.
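(For readers who haven't seen the technique being argued about: a heavily simplified, single-threaded C++ sketch of the tile-based light culling idea behind tiled deferred shading -- split the screen into tiles, bin lights into the tiles they touch, then shade each pixel against only its tile's short light list. The names and the screen-space light model here are illustrative, not DICE's actual SPU or DirectCompute code.)

#include <algorithm>
#include <cstdint>
#include <vector>

struct Light { float x, y, radius; };          // screen-space light bounds (illustrative)

constexpr int TILE = 16;                       // tile size in pixels

// Returns, for each TILE x TILE screen tile, the indices of lights touching it.
std::vector<std::vector<uint32_t>> CullLightsPerTile(
    int width, int height, const std::vector<Light>& lights)
{
    const int tilesX = (width  + TILE - 1) / TILE;
    const int tilesY = (height + TILE - 1) / TILE;
    std::vector<std::vector<uint32_t>> tileLights(tilesX * tilesY);

    for (uint32_t i = 0; i < lights.size(); ++i)
    {
        const Light& L = lights[i];
        // Conservative tile range covered by the light's screen-space extent.
        int x0 = std::max(0,          (int)((L.x - L.radius) / TILE));
        int x1 = std::min(tilesX - 1, (int)((L.x + L.radius) / TILE));
        int y0 = std::max(0,          (int)((L.y - L.radius) / TILE));
        int y1 = std::min(tilesY - 1, (int)((L.y + L.radius) / TILE));

        for (int ty = y0; ty <= y1; ++ty)
            for (int tx = x0; tx <= x1; ++tx)
                tileLights[ty * tilesX + tx].push_back(i);   // bin light into tile
    }
    // The shading pass then loops over each tile's short list instead of every light.
    return tileLights;
}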


#342 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"][QUOTE="ronvalencia"]

You are making a lot of assumptions in regards to "why would you put the main renderer on Cell and not a GPU". You are putting words in my mouth that don't reflect my POV.

ronvalencia

The answer is simple: you would never put the main renderer on a CPU when you have a dedicated GPU to do it, so no, DICE did not use Cell to get deferred rendering working :|

Deferred rendering for lights was running on 5 to 6 SPEs. DirectCompute based tiled deferred rendering for lights served as the basis for the SPE version.

So there you have it then, you've just proved yourself wrong. The lighting was done on Cell and not the deferred rendering.

2 completely different things


#343 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"] The answer is simple, you would never put the main render-er on a CPU when you have a dedicated GPU to do it, so no DICE did not use Cell to get deferred rendering working :|mrfrosty151986

Deferred rendering for lights was running on 5 to 6 SPEs. DirectCompute based titled deferred rendering for lights served as the basis for SPEs version.

So there you have it then, you've just proved yourself wrong. The lighting was done on Cell and not the deferred rendering.

2 completely different things

No, I didn't specfically state render stage.

#344 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"]

[QUOTE="ronvalencia"] Deferred rendering for lights was running on 5 to 6 SPEs. DirectCompute based titled deferred rendering for lights served as the basis for SPEs version.ronvalencia

So there you have it then, you've just proved yourself wrong. The lighting was done on Cell and not the deferred rendering.

2 completely different things

No, I didn't specfically state render stage.

No but you never said lighting either did you? If you thought it was lighting then I assume someone with your knowledge would of just said so instead of using the term " DICE use SPEs for Deferred Rendering" That to me gives the impression that you thought that the only way DICE got deferred rendering running on PS3 was because of Cell, which is wrong.

Or have you yet again copy and pasted that sentance from yet another article?


#345 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"][QUOTE="mrfrosty151986"] So there you have it then, you've just proved yourself wrong. The lighting was done on Cell and not the deferred rendering.

2 completely different things

mrfrosty151986

No, I didn't specifically state the render stage.

No, but you never said lighting either, did you? If you thought it was lighting, then I assume someone with your knowledge would have just said so instead of using the term "DICE use SPEs for Deferred Rendering". That to me gives the impression that you thought the only way DICE got deferred rendering running on PS3 was because of Cell, which is wrong.

Or have you yet again copy and pasted that sentence from yet another article?

It would be stupid to think that the entire rendering pipeline would run on the SPEs; e.g. RSX still handles SSAO, shadows, particles, geometry, etc. Sony didn't design the PS3 as a 100 percent CELL solution for a reason.

It still supports the view that CELL patches up RSX. Battlefield 3's results support David Shippy's view of the Xbox 360 being roughly equal with the PS3.


#346 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"]

[QUOTE="ronvalencia"] No, I didn't specfically state render stage. ronvalencia

No but you never said lighting either did you? If you thought it was lighting then I assume someone with your knowledge would of just said so instead of using the term " DICE use SPEs for Deferred Rendering" That to me gives the impression that you thought that the only way DICE got deferred rendering running on PS3 was because of Cell, which is wrong.

Or have you yet again copy and pasted that sentance from yet another article?

It would stupid to think that the entire rendering stages to be running on SPEs e.g. RSX still handles SSAO, shadows, particles, geometry and 'etc'.

It still supports the view that the CELL still patches RSX.

Cell does assist RSX but not as much as you think, RSX is still a very capable GPU as exclusive games have shown and there's even been developers over Beyond3D say that it's also very competent.

#347 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="mrfrosty151986"] No but you never said lighting either did you? If you thought it was lighting then I assume someone with your knowledge would of just said so instead of using the term " DICE use SPEs for Deferred Rendering" That to me gives the impression that you thought that the only way DICE got deferred rendering running on PS3 was because of Cell, which is wrong.

Or have you yet again copy and pasted that sentance from yet another article?

mrfrosty151986

It would stupid to think that the entire rendering stages to be running on SPEs e.g. RSX still handles SSAO, shadows, particles, geometry and 'etc'.

It still supports the view that the CELL still patches RSX.

Cell does assist RSX but not as much as you think, RSX is still a very capable GPU as exclusive games have shown and there's even been developers over Beyond3D say that it's also very competent.

Again, exclusive games doesn't negate artwork subjectivity. RSX's competency is debatable, overall PS3 ~= Xbox 360.

On deferred shading, Sony has a white paper on CELL's 5 SPEs ~= Geforce 7800 GTX.

What happens if PS3 has Radeon X1950 (with 128bit, 8 ROPs) instead of Geforce 7800 (with 128bit, 8 ROPs)?


#348 mrfrosty151986
Member since 2012 • 533 Posts

[QUOTE="mrfrosty151986"][QUOTE="ronvalencia"]

It would be stupid to think that the entire rendering pipeline would run on the SPEs; e.g. RSX still handles SSAO, shadows, particles, geometry, etc.

It still supports the view that CELL patches up RSX.

ronvalencia

Cell does assist RSX, but not as much as you think. RSX is still a very capable GPU, as exclusive games have shown, and there have even been developers over at Beyond3D saying that it's also very competent.

Again, exclusive games don't negate artwork subjectivity.

On deferred shading, Sony has a white paper where 5 of CELL's SPEs ~= GeForce 7800 GTX.

What happens if the PS3 had a Radeon X1950 (with 128-bit bus, 8 ROPs) instead of the GeForce 7800 (with 128-bit bus, 8 ROPs)?

Not a lot, as at this point in their life cycle the systems are too bandwidth starved for more GPU power to make that much of a difference. And stop dismissing exclusive games; you never judge a game by ports, and technically PS3 exclusive games are, in my eyes, a step above the 360 ones.

#349 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="mrfrosty151986"] Cell does assist RSX but not as much as you think, RSX is still a very capable GPU as exclusive games have shown and there's even been developers over Beyond3D say that it's also very competent.mrfrosty151986

Again, exclusive games doesn't negate artwork subjectivity.

On deferred shading, Sony has a white paper on CELL's 5 SPEs ~= Geforce 7800 GTX.

What happens if PS3 has Radeon X1950 (with 128bit, 8 ROPs) instead of Geforce 7800 (with 128bit, 8 ROPs)?

Not a lot as at this point in there life cycle the systems are currently too bandwidth starved for more GPU power to make that much of a difference. And stop dismissing exclusive games, you never ever judge a game by ports and technically PS3 exclusive games are, in my eyes a step above 360 ones.

On industry standard computer benchmark practises, compute benchmarks are judged by negating the artwork subjectivity. They must run the same datasets.


In general, no one questions Geforce 8800 being superior to the consoles.


#350 mrfrosty151986
Member since 2012 • 533 Posts
[QUOTE="mrfrosty151986"][QUOTE="ronvalencia"]

Again, exclusive games don't negate artwork subjectivity.

On deferred shading, Sony has a white paper where 5 of CELL's SPEs ~= GeForce 7800 GTX.

What happens if the PS3 had a Radeon X1950 (with 128-bit bus, 8 ROPs) instead of the GeForce 7800 (with 128-bit bus, 8 ROPs)?

ronvalencia
Not a lot, as at this point in their life cycle the systems are too bandwidth starved for more GPU power to make that much of a difference. And stop dismissing exclusive games; you never judge a game by ports, and technically PS3 exclusive games are, in my eyes, a step above the 360 ones.

On industry-standard computer benchmark practices, compute benchmarks are judged by negating artwork subjectivity.

Dude, these are consoles, and consoles have been and always will be judged on exclusive games to show their technical ability.