I have recently been web-researching quantum computing.
Will we see these in our lifetimes (ever)? The error-correction issue, for example, seems intractable to me.
-
I vote: Hype.
...but hope I'm wrong.
Randy
Simucal: Why do people sign their posts like a personal letter? lol.. We can see who it is at the bottom of the post ;)
Randy Stegbauer: Well... because it seems more personal.
-
Quantum computing is a tool; right now it's just too raw to have any useful application, but who knows.
-
It's already here, just with very limited uses so far.
-
Nobody will ever need more than 640KB of RAM...
... okay fine. Maybe he didn't say that but you get the idea. :)
-
Just looking at the results from one website, I'd say it's not that impossible:
http://arstechnica.com/journals/science.ars/2008/03/28/encoding-more-than-one-bit-in-a-photon
http://arstechnica.com/journals/science.ars/2008/10/28/scalable-quantum-computing-in-the-next-5-years
http://arstechnica.com/news.ars/post/20080729-finding-lost-qubits.html
http://arstechnica.com/news.ars/post/20080509-new-quantum-dot-logic-gates-a-step-towards-quantum-computers.html
http://arstechnica.com/news.ars/post/20080626-three-dimensional-qubits-on-the-way.html
http://arstechnica.com/news.ars/post/20080527-molecular-magnets-in-soap-bubbles-could-lead-to-quantum-ram.html
For a more technical overview of why it's not as hard as it used to be, there's a four-part series on self-correcting quantum computers:
http://scienceblogs.com/pontiff/2008/08/selfcorrecting_quantum_compute.php
-
Quantum computing isn't much past the "idea" stage. Sure, they can multiply two 2-bit integers, but it takes a dozen grad students a week to set up for the run, and another week to validate the results.
Long-term it's probably got a lot of potential, though it may never be stable enough for use outside of a highly controlled lab-based "supercomputer" environment.
At this point I'd classify it more as Physics than Computer Science. In a way, it's as if Charles Babbage got his hands on one of Michael Faraday's papers and started thinking about maybe, possibly, someday, being able to use electromagnetism as a basis for calculation.
There's been a fair amount written about Quantum Computing over the last couple of years in Scientific American, much of it by the primary researchers themselves: http://www.sciam.com
-
Error correction and loss of coherence are the big problems in quantum computing, as I understand it. Lots of smart people are hard at work on solving these problems, but last I read, it was looking like error-correction requirements might grow exponentially in the number of qubits, which really detracts from the "we'll solve NP problems in an instant!" attraction of quantum computation.
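To see why error correction is both essential and expensive, here's a toy classical Monte Carlo of the simplest quantum error-correcting idea, the 3-qubit bit-flip repetition code. This is a sketch under heavy simplifying assumptions (independent bit-flip errors only, perfect syndrome measurement, no phase errors), not a model of a real device; the point is just that redundancy suppresses the logical error rate from p to roughly 3p^2 when p is small, at the cost of extra qubits.

```python
import random

def logical_error_rate(p, trials=100_000, seed=1):
    """Estimate the logical bit-flip rate for the 3-qubit repetition code.

    Each of the 3 physical qubits flips independently with probability p;
    majority-vote decoding fails when 2 or more qubits flip.
    Analytic value: 3*p**2 - 2*p**3.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:  # majority corrupted -> uncorrectable logical error
            failures += 1
    return failures / trials

p = 0.05
print(f"physical error rate: {p}")
# analytic logical rate: 3*(0.05)**2 - 2*(0.05)**3 = 0.00725
print(f"logical error rate:  {logical_error_rate(p):.4f}")
```

Tripling the qubit count buys roughly a p-to-3p^2 improvement here, and real codes must also handle phase errors and faulty measurements, which is where the overhead estimates the post worries about come from.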