

Are Optical Computers as PCs on the horizon?


kyrathaba:
Today's silicon-based computers are limited by how quickly they can move data around and push it through logic gates.  That being the case, a primary factor limiting the speed of current computers is how fast electrons move inside the silicon chips, which is roughly half the speed of light in a vacuum.1  David Levy says that the computing technology of the future will be impossible without a change in the way computers are made (materials, size, etc.).2  New conducting materials developed at NASA's Marshall Space Flight Center can act as computer logic gates that perform 1,000 times faster than today's fastest silicon chips, and are smaller.  And researchers at the University of Rochester believe their work could lead to a computer more than a billion times faster than the fastest supercomputer at the beginning of the 21st century.  Now that, friends and neighbors, is fast :D
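To put that electron-speed figure in perspective, here's a quick back-of-the-envelope sketch (the 3 GHz clock rate is my own assumed number, not from the post): at half the speed of light, a signal can only cross a few centimeters of chip per clock cycle.

```python
# Back-of-the-envelope: how far a signal moving at ~0.5c
# travels during one clock cycle of an assumed 3 GHz CPU.
C_VACUUM = 299_792_458         # speed of light in a vacuum, m/s
signal_speed = 0.5 * C_VACUUM  # ~half c, per the post's footnote
clock_hz = 3e9                 # assumed clock rate, 3 GHz
cycle_time = 1 / clock_hz      # seconds per cycle
distance_m = signal_speed * cycle_time
print(f"{distance_m * 100:.1f} cm per cycle")  # 5.0 cm per cycle
```

So even at electron speeds, a signal can't make it much farther than across the motherboard and back in one cycle, which is one reason chip size and clock speed are tied together.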

Leonard Adleman, who coauthored the RSA encryption algorithm, introduced the idea of using DNA to solve complex mathematical problems.  And in 1997, researchers at the University of Rochester actually developed DNA logic gates.  I won't even get into a discussion of quantum computing. 

Here's my question: I want to know how powerful (memory capacity, processing speed) you guys think that PCs will be in 2050, and in 3000 A.D.  And, secondly, what everyday applications will such powerful computers, using A.I., be put to in those two years, respectively?



1. I.e., electron speed in our CPUs is approximately 93,000 miles per second.
2. Robots Unlimited: Life In A Virtual Age, copyright 2005

f0dder:
Using DNA is not without problems: it doesn't have an infinite lifetime, but will decay. It's also a lot slower than our current silicon-based computers, though it has the advantage of being massively parallel. At least that was my understanding when I heard about it a couple of years ago... so the idea was that you could use a "DNA computing device" for some specialized tasks, rather than having a "DNA computer"... same goes for "quantum computer", really.

The idea of optical computers is interesting, but I have a hunch that there'll need to be some conversion to/from electricity in various parts, and that might end up being the bottleneck.

2stepsback:
Here's my question: I want to know how powerful (memory capacity, processing speed) you guys think that PCs will be in 2050, and in 3000 A.D.  And, secondly, what everyday applications will such powerful computers, using A.I., be put to in those two years, respectively?-kyrathaba (April 25, 2007, 06:25 PM)
--- End quote ---

Speaking robots will be the next big thing, if you ask me. [Edit:] I didn't read the second reference at first, so it kind of looks silly  :-[

Interested people can look at

1) http://rchi.raskincenter.org/index.php?title=Jef_Raskin
Jef Raskin
see the "MicroOptical" photo

2) http://cs.nyu.edu/~jhan/ftirtouch/
and http://www.ted.com/index.php/talks/view/id/65?gclid=CJGX2ruY4IsCFQu2bgodsE4Hcg
Jeff Han

Jef Raskin and Jeff Han.
What's with these Jefs?

kyrathaba:
There has been a several thousand-fold increase in the power of the PC since the first ones came out with 64 KB of RAM and no hard drive.  In the book by David Levy (referenced in my original post) that I'm reading, the author posits that by 2020 the PC will have the computing power of a human brain, and by 2030 it will have the computing power of an entire village of human minds. 
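As a sanity check on that growth figure, here's a quick sketch; the 2 GiB figure for a then-current PC is my own assumption, not from the thread:

```python
import math

# Rough check of the RAM-growth claim: a 64 KiB early PC versus
# an assumed 2 GiB machine of the mid-2000s.
old_ram = 64 * 1024    # 64 KiB in bytes
new_ram = 2 * 1024**3  # assumed 2 GiB in bytes
growth = new_ram / old_ram
doublings = math.log2(growth)
print(f"{growth:,.0f}x growth, {doublings:.0f} doublings")  # 32,768x growth, 15 doublings
```

So for RAM at least, the increase is closer to tens of thousands of times: fifteen doublings in roughly a quarter century.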

Do you think such estimates are wildly optimistic or not?  There are some very intelligent people, such as Ray Kurzweil, who believe we're approaching a technological singularity, a point at which our technological know-how reaches a critical mass, so to speak, and all sorts of things that now seem like science fiction will become realities.  For example, David Levy thinks that within the next few decades computing power and the sophistication of A.I. will advance so far that robots will be on a par with human beings in terms of intelligence.  He even thinks (and here I begin to arch an eyebrow) that there will come a point where robots will intermarry with humans, reproduce, have legal rights, etc.  My kneejerk reaction to such predictions is "yeah, right", but who knows?

gjehle:
i'm lazy, i stick to moore ;-)

i don't see optical or biological computers in the _consumer_ market any time soon.
even tho you hear that researchers have "slowed down" or even "stopped" (aka stored) light, the technology is nowhere near anything you could use.
it's still huge lab benches of high precision equipment in a controlled environment.

having an optical logic gate is only a very very very small step.
if you can't efficiently store the information for a prolonged time (think: registers, cache, ram, peripheral storage) you're nowhere near anything you could call 'computer'.

and yes, the "signal has to be converted to electricity at some point" bottleneck will be there.
there's no way you'll snap your fingers and have optical-everything just like that ;-)

i think the silicon based integrated circuit technology will go on for a bit longer, maybe not silicon based (some other mineral) but same concept.
as you might have already noticed for some time now, development took the direction of (massive) parallelism rather than higher clock speeds.
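worth noting that parallelism has its own ceiling, which amdahl's law captures; a minimal sketch with made-up illustrative numbers (none of these are from the thread):

```python
# Amdahl's law: the speedup from n cores when a fraction p of the
# work can run in parallel. The serial part (1 - p) caps the gain,
# which is why more cores complement, not replace, clock speed.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Illustrative: a program that is 90% parallelizable.
for cores in (2, 4, 8, 64):
    print(f"{cores:3d} cores -> {amdahl_speedup(0.9, cores):.2f}x")
```

even at 90% parallelizable, 64 cores get you less than a 10x speedup, so the serial fraction dominates quickly.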

one speed-limiting factor with the way logic gates work is capacitance. every transistor has a capacitance, every bus has a capacitance, _everything_ has a capacitance. what you need to do in order to "do" something is basically "fill it up", and that's what you call "drive".
by decreasing the capacitance (making the transistor smaller) you get [ higher speed, less power consumption, more elements per inch, ... ].
actually, the little nanometer value (structural width) that defines the 'technology' (as in: 65nm process, etc.) is really the width of the poly-silicon stripe that is the 'gate' of the transistor.
making this smaller, you automatically decrease the capacitance of said transistor.
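the standard CMOS dynamic-power relation behind that point is P ≈ a·C·V²·f; a hedged sketch with made-up numbers (not from any real process):

```python
# Dynamic switching power of CMOS logic: P = a * C * V^2 * f,
# where C is the switched capacitance described above, V the supply
# voltage, f the clock frequency, and a the fraction of gates
# switching per cycle. All numbers below are purely illustrative.
def dynamic_power(c_farads: float, v_volts: float, f_hz: float,
                  activity: float = 0.1) -> float:
    return activity * c_farads * v_volts**2 * f_hz

# Halving the switched capacitance halves the power at the same V and f:
p_old = dynamic_power(1e-9, 1.2, 3e9)    # 1 nF total switched capacitance
p_new = dynamic_power(0.5e-9, 1.2, 3e9)  # same chip, half the capacitance
print(p_new / p_old)  # 0.5
```

since power scales linearly with C (and gate delay also shrinks with it), every process shrink pays off twice, which is why that nanometer number matters so much.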

well, silicon based computers aren't dead yet ;-)
