Next time research first, spout nonsense later.
Materia_Ashes
Next time know what my post implies (research).
Next time know what my post implies (research).
Trmpt
So please do enlighten me as to what you're implying... In your scenario, PC gaming dies out and, as a result, dedicated GPUs die out as well. Intel would be happy, and the movie industry wouldn't give a damn because their render farms are still being upgraded with the new Xeons... Gaming, however, suffers because now everything will have to be done on the CPU instead of the GPU. How is this a good thing?
[QUOTE="Trmpt"]
Next time know what my post implies (research).
Materia_Ashes
So please do enlighten me as to what you're implying... In your scenario, PC gaming dies out and, as a result, dedicated GPUs die out as well. Intel would be happy, and the movie industry wouldn't give a damn because their render farms are still being upgraded with the new Xeons... Gaming, however, suffers because now everything will have to be done on the CPU instead of the GPU. How is this a good thing?
There you go.
And no, this would not be a good thing.
I guess you could look at how much the CGI in movies has improved over the years to get an idea of how much game graphics would improve.
Trmpt
How is the GPU's application in pre-rendered video at all applicable to real-time graphics in games?
Take pixel shader 4.0: why would there be any reason to develop PS 4.0 between console generations? You may argue the R&D would be done for the next generation of consoles anyway. Well, what about pixel shader 5.0? PS 5.0 was developed based on what they learned from PS 4.0, and it will be ready for the next generation of consoles after being developed and tested on PC.
If it wasn't for the PC there wouldn't be a pixel shader 5.0 ready for the next generation of consoles, and it's the same for any other technology. You cannot replicate that without the technology being used in games and improved on, which just doesn't happen in either films or fixed-hardware consoles.
[QUOTE="Trmpt"]
I guess you could look at how much the CGI in movies has improved over the years to get an idea of how much game graphics would improve.
AnnoyedDragon
How is the GPU's application in pre-rendered video at all applicable to real-time graphics in games?
Take pixel shader 4.0: why would there be any reason to develop PS 4.0 between console generations? You may argue the R&D would be done for the next generation of consoles anyway. Well, what about pixel shader 5.0? PS 5.0 was developed based on what they learned from PS 4.0, and it will be ready for the next generation of consoles after being developed and tested on PC.
If it wasn't for the PC there wouldn't be a pixel shader 5.0 ready for the next generation of consoles, and it's the same for any other technology. You cannot replicate that without the technology being used in games and improved on, which just doesn't happen in either films or fixed-hardware consoles.
You bring up a good point. I mean, how can they advance what hasn't been created? I think they would make its equivalent eventually, but it would definitely take longer, which is why I hope this never happens.
There you go.
And no, this would not be a good thing.
Trmpt
So you were trying to reinforce the original point that all gaming would suffer if the PC gaming industry died out? FYI, pre-rendered graphics have already achieved photorealism. You can check it out here and here.
[QUOTE="savagetwinkie"] i doubt the architectures are really that different if anything they probably have a couple of numbers they play with in a hardware description languange, to increase vertex setups because normally most of what they are doing is modeling with thouse, changing a few numbers around could radically change the transistor count, and even if they didn't do it for PC"s someone would have for a console, i mean, they didn't put custom grahpics cards in till ps3, xbox, gamecube, before that they all had their own custom made graphics cards. What i will agree with is, as budgets skyrocket the standardizations of PC gaming probably lowered the costs for dev's when they can use opengl, directx type of calls. And with the new unified architectures, they probably don't even need to change that much around between a dev card and end user card, dev cards can use all those cuda engines as vertex setups, and even the new nvidia 300 series i beleive is all the same architecture, with up to 1.5gb aimed at end users and 6gb aimed at devs.VandalvideoYou really haven't tried buying a workstation GPU before have you? The way they process information is astronomically different from that of a regular GPU. They simply aren't designed to do the kinds of processing you need for games. This isn't a matter of raw power, because on the face of it workstation GPUs are drastically overpowered. The problems lie in the architecture and how they handle processing. They simply are inept at rendering video games. I personally bought a workstation GPU a couple years ago. (Paid a good 1,300 bucks for it) It was absolutely worthless for playing video games. My 5700le outperformed this thing which was roughly equivalent to your modern day 7900gs. They simply cannot be used to play video games. It is a law of deminishing returns. They are too expensive to produce, they don't handle video games, and any attempt to put them into a console would only exacerbate costs and produce extremely shoddy games. those architectures are going to be more similar then you think, they just add more of what you need for a work station, why bother putting many shaders in if you don't need to really render it in realtime? They are actually based off the same architectures, roughly 95% the same, kinda like humans and monkeys, very close dna but radically different working results.
[QUOTE="Trmpt"]
There you go.
And no, this would not be a good thing.
Materia_Ashes
So you were trying to reinforce the original point that all gaming would suffer if the PC gaming industry died out? FYI, pre-rendered graphics have already achieved photorealism. You can check it out here and here.
No, my point was that CGI would still improve through Hollywood. That goes along with what you said as well, but it still wasn't the point I was trying to make.
And that isn't photorealism.
[QUOTE="Trmpt"]
That has been said before; I'll make my point again: if PC gaming dropped off the face of the earth, it wouldn't mean the end of graphics advancement at all. Why? The movie industry pours millions into CGI; they want advancement just as much as the video game industry.
This is a gaming GPU, this is a workstation GPU; they are not the same.
Workstation GPUs are bad for gaming; you cannot simply take the R&D for workstation GPUs and match development as if there were demand for gaming GPUs.
Rofl Dragon, that ignorant post of his just made my day.
[QUOTE="AnnoyedDragon"][QUOTE="Trmpt"]
That has been said before; I'll make my point again: if PC gaming dropped off the face of the earth, it wouldn't mean the end of graphics advancement at all. Why? The movie industry pours millions into CGI; they want advancement just as much as the video game industry.
TheBigBadGRIM
This is a gaming GPU, this is a workstation GPU; they are not the same.
Workstation GPUs are bad for gaming; you cannot simply take the R&D for workstation GPUs and match development as if there were demand for gaming GPUs.
Rofl Dragon, that ignorant post of his just made my day.
lol... yeah, I know, right? Why would the movie industry want to keep improving CGI? I mean, there really is no demand to do so whatsoever, so I have no clue where that ignorance came from. /sarcasm
Rofl Dragon, that ignorant post of his just made my day.
[QUOTE="TheBigBadGRIM"][QUOTE="AnnoyedDragon"]
This is a gaming GPU, this is a workstation GPU; they are not the same.
Workstation GPUs are bad for gaming; you cannot simply take the R&D for workstation GPUs and match development as if there were demand for gaming GPUs.
Trmpt
lol... yeah, I know, right? Why would the movie industry want to keep improving CGI? I mean, there really is no demand to do so whatsoever, so I have no clue where that ignorance came from. /sarcasm
You're just making it worse. You do realise there are massive differences between what movies require of CGI and advancements in video game graphics?
[QUOTE="Trmpt"][QUOTE="TheBigBadGRIM"]
Rofl Dragon, that ignorant post of his just made my day.
Danm_999
lol... yeah, I know, right? Why would the movie industry want to keep improving CGI? I mean, there really is no demand to do so whatsoever, so I have no clue where that ignorance came from. /sarcasm
You're just making it worse. You do realise there are massive differences between what movies require of CGI and advancements in video game graphics?
It is improvement that would happen with or without the video game industry, end of story.
You're just making it worse. You do realise there are massive differences between what movies require of CGI and advancements in video game graphics?
[QUOTE="Danm_999"][QUOTE="Trmpt"]
lol... yeah, I know, right? Why would the movie industry want to keep improving CGI? I mean, there really is no demand to do so whatsoever, so I have no clue where that ignorance came from. /sarcasm
Trmpt
It is improvement that would happen with or without the video game industry, end of story.
But it wouldn't happen in the gaming industry, end of story.
Fewer exclusives that are designed to push PC hardware compared to last gen.
[QUOTE="dc337"]
[QUOTE="Danm_999"]
In what sense can you consider the health of PC gaming anything other than excellent and expansionary?
Danm_999
[QUOTE="Trmpt"]
I guess you could look at how much the CGI in movies has improved over the years to get an idea of how much game graphics would improve.
AnnoyedDragon
How is the GPU's application in pre-rendered video at all applicable to real-time graphics in games?
If it wasn't for the PC there wouldn't be a pixel shader 5.0 ready for the next generation of consoles, and it's the same for any other technology. You cannot replicate that without the technology being used in games and improved on, which just doesn't happen in either films or fixed-hardware consoles.
There will always be a push to make games more realistic; you don't need a PC gaming market for that. The same engineers that work at Nvidia can work for Microsoft.
The most you can argue is that graphics tech will slow without the same push from the PC gaming market, which I would agree with. However, I don't think it matters that much at this point. We're reaching a tech plateau with gaming graphics; differences between GPU generations are less significant in terms of tech. If they froze the current tech level for 5 or even 10 years I couldn't care less.
dc337
I agree yet disagree. The only reason we've hit a plateau is less of a push from developers, due to a multiplatform focus and rising costs. It isn't a matter of us having achieved graphics that are good enough.
The tech itself has changed significantly in the past, and even the change from DX10 to DX11 is still quite a jump.
The leap from what was out before to Crysis was huge. It's still huge, since nothing has touched it tech-wise.
All you've managed to portray (using extremely spurious reasons which I don't agree with) is that it's changing. He asked in what sense PC gaming could be considered anything other than excellent and expansionary. I would say that describing it as changing doesn't meet those terms. Why? Change is not innately negative.
[QUOTE="Danm_999"]
[QUOTE="dc337"] Fewer exclusives that are designed to push pc hardware compared to last gen.
Less diversity in exclusives; too many are either RTS or MMORPG.
Traditional PC developers going multiplatform or console-only.
The sports genre is pretty much dead on the PC.
I wouldn't say that PC gaming is dying as a whole, but a lot of the recent growth has been in casual sales and MMORPG subscriptions.
dc337
Some genres are clearly not as healthy as they used to be, and there was a time when PC gaming was expanding in all areas.
dc337
I'm sorry, no there wasn't. Please don't mythologise the past in order to criticise the present.
Rofl. Ok, I'll play ball here.
As a whole it has expanded in revenue, but a lot of that growth is isolated in a few genres like MMORPG and casual. Take away that growth and you have an industry in decline.
dc337
I'll pretend growth in the PC industry only comes from MMORPGs and casual games (???). So then, why do we ignore these things? If I ignored what I considered to be casual in the console market (the Wii), the console industry would have bottomed out.
The fact is, if you count genre representation among all systems, the PC has exclusives in more of them than any other platform. Even the 'contested' genres of FPS, racing, and sports are all doing extremely well on the PC; off the top of my head, this year alone we've had Arma 2, Out of the Park Baseball, Worldwide Soccer Manager 2009, Football Manager Live, Burnout Paradise (non-exclusive), Call of Juarez (non-exclusive), The Last Remnant (non-exclusive).
[QUOTE="Vandalvideo"][QUOTE="Trmpt"]That has been said before; I'll make my point again, if PC gaming dropped off the face of the earth it wouldnt mean the end of graphics advancement at all. Why? The movie industry pours millions into cgi, they want advancement just as much as the video game industry.Trmpt
I've said this before and I'll say it again: the workstation GPUs championed by the movie industry are not at all conducive to video games. They are too expensive and the technology isn't great for gaming. Yes, there would be a huge blowback on console graphics if the PC industry sank.
Maybe I have to make myself clear: if every type of video game ceased to exist, the movie industry would still want advancement.
And thank you for proving my point.
I think even with PC gaming around, the console industry is slowing progress to a crawl; the Wii showed the way and the rest will follow. The biggest downside to the death of PC gaming would be the lack of new talent in the industry. Right now it's a river; kill PC gaming and the new talent influx will resemble a tiny stream.
[QUOTE="Trmpt"]
[QUOTE="Danm_999"] You're just making it worse. You do realise there are massive differences between what movies require of CGI, and advancements in video game graphics?washd123
It is improvment that would happen with or without the video game industry/end of story
but it wouldnt happen in the gaming industry end of story.
Exactly!