PCI Express 3.0 - Need someone to explain this to me..

This topic is locked from further discussion.


#1 xsubtownerx
Member since 2007 • 10705 Posts

Hi guys,

I've been out of the loop for about a year now. What the **** does PCI Express 3.0 bring to the table? How does it work? And WHAT DOES IT DO?

I noticed that the new GTX680 supports this, and that some mobos have released with this new design (gen 3), but I'm confused as to what this all means.

is this just another gimmick of some kind?


#2 configme
Member since 2004 • 786 Posts

It comes after 2....2.1 actually.

(double the bandwidth)


#3 GummiRaccoon
Member since 2003 • 13799 Posts

It doesn't matter for right now. It's just them updating the spec before they fall behind.


#4 Blistrax
Member since 2008 • 1071 Posts

xsubtownerx wrote: "What does PCI Express 3.0 bring to the table? How does it work? ... is this just another gimmick of some kind?"
No gimmick. The motherboard people have to stay ahead of everybody else, if you think about it. It wouldn't make sense for the GPU people to design an interface that didn't exist on any mobo yet. Right now only the HD 7970 and the GTX 680 use it, and they aren't any faster in games because of it. As GPUs get faster they'll eventually need PCIE 3.0's headroom, but right now it's overkill. Anandtech tested a GPU compute workload that showed a slight improvement under 3.0, but nothing for graphics.

As for how it works: if I fully understood it I'd be designing these things for big bucks instead of typing here. But the short version is they raised the signaling rate from 5 GT/s to 8 GT/s and swapped the old 8b/10b encoding for a much more efficient 128b/130b scheme. Together those roughly double the usable bandwidth per lane, while staying backwards compatible with older cards and boards.
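A quick back-of-the-envelope sketch of why the encoding change matters, using the published PCIe 2.0/3.0 line rates (the function name here is just for illustration):

```python
def lane_bandwidth_mbps(transfer_rate_gtps, payload_bits, total_bits):
    """Effective per-lane bandwidth in MB/s.

    transfer_rate_gtps: raw line rate in gigatransfers per second
    payload_bits/total_bits: line-code ratio (8b/10b or 128b/130b)
    """
    usable_bits_per_second = transfer_rate_gtps * 1e9 * payload_bits / total_bits
    return usable_bits_per_second / 8 / 1e6  # bits/s -> MB/s

gen2 = lane_bandwidth_mbps(5.0, 8, 10)     # PCIe 2.0: 5 GT/s, 8b/10b
gen3 = lane_bandwidth_mbps(8.0, 128, 130)  # PCIe 3.0: 8 GT/s, 128b/130b
print(f"Gen2: {gen2:.0f} MB/s per lane")   # 500 MB/s
print(f"Gen3: {gen3:.0f} MB/s per lane")   # ~985 MB/s
print(f"Speed-up: {gen3 / gen2:.2f}x")     # ~1.97x
```

So the raw clock only went up 1.6x, but dropping the 20% encoding overhead to under 2% gets the total to roughly 2x per lane.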

#5 Kinthalis
Member since 2002 • 5503 Posts

Basically 3.0 offers double the bandwidth of 2.0 while remaining backwards compatible.

Good for the future, especially in SLI/X-Fire setups, where many of us are stuck at x8/x8 configurations.

I doubt we'll see anything saturate even the current bus for a while though, so it really is just a future-proofing thing.
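To put numbers on the x8/x8 point, here's a rough sketch (per-lane figures derived from the spec'd line rates and encodings) showing that an x8 Gen3 link carries about as much as a full x16 Gen2 slot:

```python
# Effective per-lane bandwidth in MB/s (line rate minus encoding overhead)
PER_LANE_MBPS = {
    "2.0": 5000 * 8 / 10 / 8,     # 5 GT/s, 8b/10b    -> 500 MB/s
    "3.0": 8000 * 128 / 130 / 8,  # 8 GT/s, 128b/130b -> ~985 MB/s
}

def link_bandwidth_mbps(gen, lanes):
    """Total one-direction bandwidth for a link of the given width."""
    return PER_LANE_MBPS[gen] * lanes

# A card dropped to x8 in a dual-GPU setup on a Gen3 board still gets
# roughly what a full x16 Gen2 slot provides:
print(f"x16 Gen2: {link_bandwidth_mbps('2.0', 16):.0f} MB/s")  # 8000 MB/s
print(f"x8  Gen3: {link_bandwidth_mbps('3.0', 8):.0f} MB/s")   # ~7877 MB/s
```

So on Gen3 boards the x8/x8 penalty mostly stops mattering, even before anything saturates the bus.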


#6 04dcarraher
Member since 2004 • 23858 Posts

Kinthalis wrote: "Basically 3.0 offers double the bandwidth of 2.0 while remaining backwards compatible... good for the future, especially in SLI/X-Fire setups, where many of us are stuck at x8/x8 configurations."
Heck, even a GTX 580 MARS II on PCI-E 2.0 at x4 only loses a few fps; at x8 and above there's no saturation, not even close.