Nvidia Conference (Flex, Shadowplay, and more!)

This topic is locked from further discussion.

#1 cfisher2833

So Nvidia is currently holding a conference where they're demoing some new technologies. Everything they've shown is pretty nice, but the one that really caught my eye is FleX, an upgraded form of PhysX that emphasizes physical interaction between various objects. Here's a video demonstrating the potential:
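For anyone curious what's under the hood of unified particle solvers like FleX, the usual core is position-based dynamics: predict positions from velocities, then iteratively project constraints. Here's a toy 2D sketch of that idea (my own illustration for this thread, not FleX code; FleX is a GPU solver):

```python
# Toy position-based dynamics step: particles fall under gravity,
# then a distance constraint pulls neighboring particles back to
# their rest spacing (a simple "rope"). Illustrative only.

GRAVITY = -9.8
DT = 1.0 / 60.0

def pbd_step(positions, velocities, rest_length, iterations=4):
    # 1) Predict new positions from velocities + gravity.
    predicted = []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += GRAVITY * DT
        predicted.append([x + vx * DT, y + vy * DT])

    # 2) Iteratively project the distance constraint between
    #    consecutive particles.
    for _ in range(iterations):
        for i in range(len(predicted) - 1):
            (x1, y1), (x2, y2) = predicted[i], predicted[i + 1]
            dx, dy = x2 - x1, y2 - y1
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = 0.5 * (dist - rest_length) / dist
            predicted[i][0] += corr * dx
            predicted[i][1] += corr * dy
            predicted[i + 1][0] -= corr * dx
            predicted[i + 1][1] -= corr * dy

    # 3) Derive new velocities from the position change.
    new_vel = [((px - x) / DT, (py - y) / DT)
               for (px, py), (x, y) in zip(predicted, positions)]
    return [tuple(p) for p in predicted], new_vel

pos, vel = pbd_step([(0.0, 2.0), (1.0, 2.0)],
                    [(0.0, 0.0), (0.0, 0.0)], rest_length=1.0)
print(pos)   # both particles fall a little; spacing stays ~1.0
```

The real thing runs thousands of these constraint projections per frame on the GPU, which is why everything (fluids, cloth, rigid debris) can collide with everything else.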

Anything else you've seen that's interesting?

Now they're announcing something called G-Sync, which requires a compatible monitor. The GPU controls the refresh rate of the monitor... something like that.

btw, if you wanna see Linus' livestream from the event, here's the link: http://www.twitch.tv/linustech

#2 Kinthalis

So yesterday they announced some new technology tools for developers, allowing them to easily implement volumetric fire effects, PhysX-based simulations of liquids and particles, and, perhaps most importantly, FULL GLOBAL ILLUMINATION on Nvidia GPUs.

Sweet mother of god.

They also mentioned some improvements coming with their PhysX 3.x API, such as full multi-threading.

Today they introduced Shadowplay (beta coming Oct 28th), which can buffer up to 20 minutes of video, streams directly to Twitch, lets you record as much video as you want at 1080p/60fps/up to 50 Mbps, and supports video overlay of your webcam or another video source. All for no FPS hit, since the built-in video encoder on Nvidia GPUs handles everything.
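The "buffer up to 20 minutes" part is essentially a rolling window: keep the most recent N seconds of encoded frames and silently drop the oldest as new ones arrive. A toy model of the idea (my own sketch; the real buffer holds encoded video, not Python objects):

```python
from collections import deque

# Toy "shadow" recording buffer: a deque with maxlen keeps only
# the most recent N seconds of frames, discarding the oldest
# automatically as new frames are appended.

FPS = 60

class ShadowBuffer:
    def __init__(self, seconds):
        self.frames = deque(maxlen=seconds * FPS)

    def capture(self, frame):
        # appending to a full deque drops the oldest frame
        self.frames.append(frame)

    def save_clip(self):
        # "save the last N seconds" = snapshot the current buffer
        return list(self.frames)

buf = ShadowBuffer(seconds=2)        # tiny window for the demo
for i in range(300):                 # 5 seconds of frames at 60 fps
    buf.capture(i)

clip = buf.save_clip()
print(len(clip), clip[0], clip[-1])  # 120 180 299
```

Memory stays constant no matter how long you play, which is why it can run in the background the whole session.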

And finally, Nvidia has basically solved the issues with V-sync and screen tearing!

G-Sync is a three-part solution to the problem: it eliminates the need for V-sync (which introduces latency and stuttering) AND removes the screen tearing that V-sync would normally solve. The three parts are a Kepler GPU, your GPU drivers, and, the key part, a chip embedded in the monitor that allows the GPU to drive the monitor's timing. They announced several partners, including Asus, who will be offering panels from 1080p all the way to 4K.
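The latency argument is easy to see with a little arithmetic. With a fixed 60 Hz scanout, a frame that finishes mid-interval has to wait for the next refresh; if the display refreshes when the frame is ready, that wait disappears. A toy comparison (numbers are hypothetical, not measurements):

```python
import math

# Fixed-refresh (V-sync) vs on-demand (G-Sync-style) display times.
REFRESH = 1000.0 / 60.0          # 60 Hz scanout interval, in ms

def vsync_display_time(finish_ms):
    # frame is shown at the next fixed scanout boundary
    return math.ceil(finish_ms / REFRESH) * REFRESH

def gsync_display_time(finish_ms):
    # display refreshes when the frame is done
    # (real panels have min/max refresh limits, ignored here)
    return finish_ms

for t in [22.0, 40.0, 55.0]:     # hypothetical frame finish times
    wait = vsync_display_time(t) - gsync_display_time(t)
    print(f"frame done at {t:5.1f} ms -> v-sync adds {wait:4.1f} ms")
```

On top of the added wait, a frame that just misses the boundary is held a whole extra interval, which is where V-sync's stutter comes from.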

Carmack was also on hand, and I believe he and people from DICE are holding a conference after the Nvidia show. So far he has spoken about the huge difference in smoothness and image quality that G-Sync brings to the table.

My guess is that G-Sync is also going to be critical for Oculus Rift support. The GPU driving the timing of the display will probably be a great help in reducing latency, something Carmack has spoken of extensively in the past as being the bane of virtual reality.

#3 gameofthering

I'm interested in Shadowplay.

#4 Kinthalis

So this whole thing totally blew my mind. Nvidia is doing good work, and I went from thinking I would trade in my 680s for an AMD R9 290X to thinking that I'll pick up one of these new monitors instead (apparently you can even mod your existing monitor to support this; they have directions online, but only one model is currently supported).

Again, it looks like PC gaming is THE way to experience games. NOT consoles.

Graphics features that won't exist on consoles for MANY games, and now the smoothest, best quality experience with the least latency can ONLY be had on the PC. Maybe next, next gen, consoles :P

The Shadowplay beta is coming just in time for BF4 too. Awesome sauce.

#5 Daious

They announced the 780 Ti. It doesn't make a lot of sense: the 780 is just a gimped version of the Titan, and the 780 Ti is just a less gimped version of the Titan. They should have just dropped the Titan's price instead of this bollocks.

#6 Netherscourge

Sounds kind of gimmicky to me.

I think AMD's Mantle stuff was more promising.

But, I'm not really sold on anything right now.

#7 clyde46

Any new GPUs?

#8 Kinthalis

@Netherscourge said:

Sounds kind of gimmicky to me.

I think AMD's Mantle stuff was more promising.

But, I'm not really sold on anything right now.

What sounds gimmicky? G-Sync?

Some quotes:

Tim Sweeney, creator of Epic’s industry-dominating Unreal Engine, called G-SYNC “the biggest leap forward in gaming monitors since we went from standard definition to high-def.” He added, “If you care about gaming, G-SYNC is going to make a huge difference in the experience.” The legendary John Carmack, architect of id Software’s engine, was similarly excited, saying “Once you play on a G-SYNC capable monitor, you’ll never go back.” Coming from a pioneer of the gaming industry, who’s also a bonafide rocket scientist, that’s high praise indeed.

It really looks like something that's going to make a big difference in how games are experienced on a panel.

#9 Kinthalis

@clyde46 said:

Any new GPUs?

Not really. They announced the GTX 780 Ti. But it's likely a less gimped Titan.

#10 NameIess_One

Mantle, PhysX, all of these things sound very pretty on paper, or as PR statements, until you remember they're exclusive to a single card vendor, which prevents the developers from building the game around them, or making them a crucial part of the game's inner workings.

Ultimately, they all end up as nothing more than gimmicks.

#11 clyde46

@Kinthalis said:

@clyde46 said:

Any new GPUs?

Not really. They announced the GTX 780 Ti. But it's likely a less gimped Titan.

Price?

#12 NameIess_One

@Kinthalis said:

@Netherscourge said:

Sounds kind of gimmicky to me.

I think AMD's Mantle stuff was more promising.

But, I'm not really sold on anything right now.

What sounds gimmicky? G-Sync?

Some quotes:

Tim Sweeney, creator of Epic’s industry-dominating Unreal Engine, called G-SYNC “the biggest leap forward in gaming monitors since we went from standard definition to high-def.” He added, “If you care about gaming, G-SYNC is going to make a huge difference in the experience.” The legendary John Carmack, architect of id Software’s engine, was similarly excited, saying “Once you play on a G-SYNC capable monitor, you’ll never go back.” Coming from a pioneer of the gaming industry, who’s also a bonafide rocket scientist, that’s high praise indeed.

It really looks like something that's going to make a big difference in how games are experienced on a panel.

The problem is, the solution is hardware-based. A monitor chip that's only useful with a single card brand; I'm not sure that'll be successful enough, or have enough pull, to become a standard.

Would be a very nice feature to have if it was software based like Adaptive VSync, though.

#13 Mozuckint

G-Sync did it for me.

V-sync and other such methods have been plaguing games for years. Nvidia going so far as to offer a DIY module for what is essentially resurrecting CRT quality from the dead was enough for me to stay on team green.

Everything else is a bonus.

#14 Cyberdot

Nvidia is awesome.

I refuse to buy AMD for a reason.

#15 Kinthalis

@NameIess_One said:

Mantle, PhysX, all of these things sound very pretty on paper, or as PR statements, until you remember they're exclusive to a single card vendor, which prevents the developers from building the game around them, or making them a crucial part of the game's inner workings.

Ultimately, they all end up as nothing more than gimmicks.

They are all partnering up with publishers and devs though.

It's not just PR. These things ARE currently running in games and WILL be featured in future games.

#16 Kinthalis

@NameIess_One said:

@Kinthalis said:

@Netherscourge said:

Sounds kind of gimmicky to me.

I think AMD's Mantle stuff was more promising.

But, I'm not really sold on anything right now.

What sounds gimmicky? G-Sync?

Some quotes:

Tim Sweeney, creator of Epic’s industry-dominating Unreal Engine, called G-SYNC “the biggest leap forward in gaming monitors since we went from standard definition to high-def.” He added, “If you care about gaming, G-SYNC is going to make a huge difference in the experience.” The legendary John Carmack, architect of id Software’s engine, was similarly excited, saying “Once you play on a G-SYNC capable monitor, you’ll never go back.” Coming from a pioneer of the gaming industry, who’s also a bonafide rocket scientist, that’s high praise indeed.

It really looks like something that's going to make a big difference in how games are experienced on a panel.

The problem is, the solution is hardware-based. A monitor chip that's only useful with a single card brand; I'm not sure that'll be successful enough, or have enough pull, to become a standard.

Would be a very nice feature to have if it was software based like Adaptive VSync, though.

The whole point of the presentation was that the issue could NOT be resolved by software alone. This is a problem inherent in current monitor technology. This aims to CHANGE that technology, probably the biggest change since HD in how displays work.

I don't know if it will be widely adopted; I sure hope so. But personally, I'm going to pick this up.

#17 NameIess_One

@Kinthalis said:

@NameIess_One said:

Mantle, PhysX, all of these things sound very pretty on paper, or as PR statements, until you remember they're exclusive to a single card vendor, which prevents the developers from building the game around them, or making them a crucial part of the game's inner workings.

Ultimately, they all end up as nothing more than gimmicks.

They are all partnering up with publishers and devs though.

It's not just PR. These things ARE currently running in games and WILL be featured in future games.

True, but they'll never be featured in a truly meaningful manner. For example, you won't see a game with its entire physics engine based on Nvidia PhysX, since that would cut off all the AMD owners.

#18 cfisher2833

@NameIess_One said:

@Kinthalis said:

@NameIess_One said:

Mantle, PhysX, all of these things sound very pretty on paper, or as PR statements, until you remember they're exclusive to a single card vendor, which prevents the developers from building the game around them, or making them a crucial part of the game's inner workings.

Ultimately, they all end up as nothing more than gimmicks.

They are all partnering up with publishers and devs though.

It's not just PR. These things ARE currently running in games and WILL be featured in future games.

True, but they'll never be featured in a truly meaningful manner. For example, you won't see a game with its entire physics engine based on Nvidia PhysX, since that would cut off all the AMD owners.

I don't know. Now that the PhysX API is becoming more multi-core friendly, it might perform better on systems with AMD GPUs. We'll have to see, though. It certainly would suck if that amazing technology went to waste.
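The multi-core angle is basically data parallelism over the particle or body set: split the array into chunks and integrate each chunk on its own worker. A rough sketch of the pattern (illustrative only; real engines use native threads, and in pure Python the GIL limits the actual speedup):

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of splitting a physics update across worker threads, the
# way a multi-threaded CPU solver can use all cores. Each chunk of
# particles (x, y, vx, vy) is integrated independently.

GRAVITY = -9.8
DT = 1.0 / 60.0

def integrate_chunk(chunk):
    # semi-implicit Euler on one slice of the particle array
    return [(x + vx * DT,
             y + (vy + GRAVITY * DT) * DT,
             vx,
             vy + GRAVITY * DT)
            for (x, y, vx, vy) in chunk]

def step_parallel(particles, workers=4):
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(integrate_chunk, chunks)
    return [p for chunk in results for p in chunk]

particles = [(float(i), 10.0, 0.0, 0.0) for i in range(1000)]
particles = step_parallel(particles)
print(particles[0])   # the first particle has started falling
```

Since each chunk only reads and writes its own slice, there's no locking in the hot loop, which is what makes this kind of solver scale across cores.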

#19 NameIess_One

@cfisher2833 said:

@NameIess_One said:

@Kinthalis said:

@NameIess_One said:

Mantle, PhysX, all of these things sound very pretty on paper, or as PR statements, until you remember they're exclusive to a single card vendor, which prevents the developers from building the game around them, or making them a crucial part of the game's inner workings.

Ultimately, they all end up as nothing more than gimmicks.

They are all partnering up with publishers and devs though.

It's not just PR. These things ARE currently running in games and WILL be featured in future games.

True, but they'll never be featured in a truly meaningful manner. For example, you won't see a game with its entire physics engine based on Nvidia PhysX, since that would cut off all the AMD owners.

I don't know. Now that the PhysX API is becoming more multi-core friendly, it might perform better on systems with AMD GPUs. We'll have to see, though. It certainly would suck if that amazing technology went to waste.

Well, it's been going to waste for years, and I really doubt Nvidia had a sudden change of heart.

There'll be a catch... there's always a catch.

#20 Sushiglutton

Skimmed through some three-hour-long conference yesterday. Those water balloons were freaking awesome :)!

#21 lostrib

So shadowplay isn't dead then?

#22 Kinthalis

@lostrib said:

So shadowplay isn't dead then?

It's rumored to be what powers the SteamOS streaming. So it must also work for AMD cards. Or will work at some point.

#23 clyde46

I think I'll be getting that 780Ti

#24 NameIess_One

@clyde46 said:

I think I'll be getting that 780Ti

Really curious about the price. Since there's a good chance 780 Ti will be trading blows with R9 290X, we may see a price war.

May get one for myself as well, but will wait for a bit, see how things play out.

#25 cfisher2833

@NameIess_One said:

@clyde46 said:

I think I'll be getting that 780Ti

Really curious about the price. Since there's a good chance 780 Ti will be trading blows with R9 290X, we may see a price war.

May get one for myself as well, but will wait for a bit, see how things play out.

What I think they'll do is have the 780 Ti replace the 780 as the most expensive GeForce GPU (not counting the Titan) and lower the 780 to around the price of a 770.

#26 clyde46

@NameIess_One said:

@clyde46 said:

I think I'll be getting that 780Ti

Really curious about the price. Since there's a good chance 780 Ti will be trading blows with R9 290X, we may see a price war.

May get one for myself as well, but will wait for a bit, see how things play out.

I hope so.

#27 NameIess_One

@cfisher2833 said:

@NameIess_One said:

@clyde46 said:

I think I'll be getting that 780Ti

Really curious about the price. Since there's a good chance 780 Ti will be trading blows with R9 290X, we may see a price war.

May get one for myself as well, but will wait for a bit, see how things play out.

What I think they'll do is have the 780 Ti replace the 780 as the most expensive GeForce GPU (not counting the Titan) and lower the 780 to around the price of a 770.

That'd be a pretty bold move; I doubt they'd be willing to go that far.

But if it happened, I'd gladly pick up a 780, preferably the MSI Lightning model, and OC the crap out of it!

#28 snugenz_irl

@cfisher2833 said:

@NameIess_One said:

@clyde46 said:

I think I'll be getting that 780Ti

Really curious about the price. Since there's a good chance 780 Ti will be trading blows with R9 290X, we may see a price war.

May get one for myself as well, but will wait for a bit, see how things play out.

What I think they'll do is have the 780 Ti replace the 780 as the most expensive GeForce GPU (not counting the Titan) and lower the 780 to around the price of a 770.

That would be ideal, and I'll probably pick up another 780 if that does happen.