Will computers keep improving at the same rate as in the past?

This topic is locked from further discussion.


#1 slipknot0129
Member since 2008 • 5832 Posts

Will computers keep improving at the same rate as in the past?

These last few years I've had it in my mind that CPUs and GPUs are close to about as much performance as you could get, and that they couldn't improve much more, due to the limits of silicon and whatnot. Am I wrong?


#2 Obiwan_1O
Member since 2003 • 286 Posts

http://en.wikipedia.org/wiki/Moore's_law


#3 ydnarrewop
Member since 2004 • 2293 Posts

Cool, according to Moore's Law there should be a slowing around 2013, when the doubling of transistor counts starts happening every three years instead of every two. I believe nothing is infinite, and there will be a slowdown that will inevitably stall at some point. However, this could be rectified by new tech altogether.
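The gap between a two-year and a three-year doubling period compounds quickly. A minimal sketch of the arithmetic (the baseline transistor count of ~2.6 billion is an assumed 2011-era figure for illustration, not something from this thread):

```python
def transistors(start, years, doubling_period):
    """Project a transistor count after `years`, assuming one
    doubling every `doubling_period` years (Moore's law)."""
    return start * 2 ** (years / doubling_period)

start = 2.6e9  # assumed baseline, roughly a high-end CPU of the time
for years in (2, 6, 12):
    fast = transistors(start, years, 2)  # doubling every 2 years
    slow = transistors(start, years, 3)  # slowed to every 3 years
    print(f"after {years:2d} years: {fast:.2e} vs {slow:.2e}")
```

After 12 years the two-year curve is a factor of 16 ahead of the three-year curve, which is why even a modest change in the doubling period matters so much.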


#4 LordRork
Member since 2004 • 2692 Posts

With current technology, you're limited by the size of the atom - Moore's law is just as much about the number of transistors in the same area as anything else.

The switch to more "GPU-like" processing is likely to be the next big jump for computers (which we're starting to see with APUs). Concepts like organic/genetic computing are a long way off, so exploiting existing mass market parallel processing (i.e. cheaper and well researched) is going to be the way forward for the time being.


#5 spittis
Member since 2005 • 1875 Posts

Most likely. And imagine if there were a breakthrough in optical computing, just as an example.