
Now I think you don't understand quantum computing as well as you claim. The standard model of quantum computing uses complex numbers, but it is well known that you can do everything with just real numbers instead; see Exercises 10.4 and 10.5 of Arora-Barak [1]. The key insight of quantum computing is not that complex numbers are required, but that the invariant preserved across the computation is the 2-norm of the state vector (as opposed to the 1-norm, as in the classical stochastic model of randomized computation).
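
For concreteness, here's a minimal sketch of the standard real-number encoding (Python/NumPy; the helper names are mine, and this is an illustration of the usual trick, not necessarily the exact construction in the exercise): stack the real and imaginary parts of the state, and replace each complex unitary U with the real orthogonal matrix [[Re U, -Im U], [Im U, Re U]].

    import numpy as np

    def realify_state(psi):
        # psi in C^d  ->  [Re(psi); Im(psi)] in R^(2d)
        return np.concatenate([psi.real, psi.imag])

    def realify_unitary(U):
        # complex unitary  ->  real orthogonal matrix on the doubled space
        return np.block([[U.real, -U.imag],
                         [U.imag,  U.real]])

    # Example: a phase gate after a Hadamard on one qubit.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    S = np.diag([1, 1j])
    U = S @ H

    psi = np.array([1, 0], dtype=complex)  # |0>
    out_complex = U @ psi
    out_real = realify_unitary(U) @ realify_state(psi)

    # Measurement probabilities agree, since |amp|^2 = Re^2 + Im^2,
    # and the real encoding preserves the 2-norm.
    probs_complex = np.abs(out_complex) ** 2
    probs_real = out_real[:2] ** 2 + out_real[2:] ** 2
    assert np.allclose(probs_complex, probs_real)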

Further, the standard quantum model of computation is probabilistic in the sense that you "compute" something if your program outputs the right answer with probability at least 2/3. But 2/3 is not special; you just need some probability bounded away from 1/2, in the sense that it can't creep closer and closer to 1/2 as the input size grows. Repeating the computation and taking a majority vote then drives the error probability down exponentially fast, by a Chernoff bound.
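
A toy sketch of that amplification (Python; a biased coin stands in for one run of the quantum algorithm, and the names are made up):

    import random
    from collections import Counter

    def run_once(correct_answer=1, p_correct=2/3):
        # One execution: returns the right answer with probability 2/3.
        if random.random() < p_correct:
            return correct_answer
        return 1 - correct_answer

    def amplified(k=101):
        # Independent repetition + majority vote (k odd, so no ties).
        votes = Counter(run_once() for _ in range(k))
        return votes.most_common(1)[0][0]

    # The majority's error probability falls exponentially in k, so any
    # constant success probability > 1/2 would have worked in place of 2/3.
    trials = 10_000
    errors = sum(amplified() != 1 for _ in range(trials))
    print(f"empirical error rate with 101 repetitions: {errors / trials:.4f}")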

So it's certainly plausible that you could take advantage of this to reduce the precision enough to get to the size bound Lipton mentioned, especially if, as implied, you had the might of a hundred Google engineers working on it. And the guy is so freaking smart and experienced in theoretical computer science that chances are he thought of all this and considered it not interesting enough to spell out for the people who will say "well actually."

[1]: http://www.cs.princeton.edu/theory/complexity/


