Computational Complexity


This work is licensed under a Creative Commons License.

Friday, January 23, 2004

STOC and the NSF

A couple of quick notes.

The list of accepted papers for the upcoming STOC conference has been posted. The most intriguing looking paper in complexity is Multi-Linear Formulas for Permanent and Determinant are of Super-Polynomial Size by Ran Raz (also mentioned earlier by Scott Aaronson).

Congress has finally passed the FY 2004 US budget: a 5% increase for the NSF, with 4.8% for research and related activities.

9:54 AM

The Defense, Part I

You've taken your classes, passed your preliminary/qualifying exams, done your research and written your thesis. What stands between you and the Ph.D.? The thesis defense.

After all this buildup, the defense in the States is rather anticlimactic. The student gives an extended talk on his thesis research, the thesis committee peppers the student with questions, and then the committee deliberates and decides whether to pass the student.

The defense is mostly for show; the student almost always passes. If the student didn't deserve to pass, the fault lies not with the student but with the advisor for letting the process get this far. A dirty little secret: we often make the deliberation longer than needed just to add a little drama.

I have served on defense committees in Denmark and Portugal that follow this same basic plan. But not all countries do the same.

I remember visiting the University of Karlsruhe (Germany) when a parade broke out. I asked about the parade and my host said someone just got their Ph.D. I have also heard of some countries where the advisor has to defend the thesis. Glad I don't teach there.

I bring this up because next week I sit on my third committee at the University of Amsterdam. The Dutch do their Ph.D. defenses the way a defense ought to be done. What happens at a Dutch defense? I'll let you know next week.

8:50 AM

Thursday, January 22, 2004

John Lewis

[From Chris Fuchs]

Dear friends in the quantum information and foundations communities,

Many of you may not know it, but the concept of a generalized quantum measurement or positive-operator-valued measure was introduced in

E. B. Davies and J. T. Lewis, "An Operational Approach to Quantum Probability," Communications in Mathematical Physics 17, 239-260 (1970).

Last night, John Lewis passed away here in Dublin, his home of 28 years, from complications due to a recent surgery. He was a good man, honest and upright, and left us a deep legacy. He will be missed.

8:05 AM

Wednesday, January 21, 2004

The Da Vinci Code

On my vacation I read Dan Brown's The Da Vinci Code, a very popular book I received recently as a gift. Warning: Minor spoilers follow.

I always enjoy a novel with an academic protagonist but the Da Vinci Code reads like a bad conspiracy theory using the roles of the professor and other experts to give the theory some weight. But I bring up this book in this weblog because it spends considerable time on various cryptographic schemes.

I don't blame the author for not using modern cryptography, but the methods described would have been laughable 50 or 100 years ago. Imagine using the password 1123581321 to guard the biggest secret in the history of religion. It only gets worse: backwards writing, simple anagrams, substitution ciphers, riddles. I suppose these make for fun puzzles for the reader, but they do not make for a safe secret.
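To see just how little protection those schemes offer, here is a hypothetical sketch (the sample sentence and function names are mine, and the letter-frequency table is only approximate): a few lines of Python that break a simple shift cipher by trying all 26 shifts and keeping the one whose letter distribution best matches English.

```python
# Breaking a Caesar (shift) cipher by chi-squared letter-frequency scoring.
# The frequency table is a rough approximation of English; the example
# message is illustrative, not from the book.

from string import ascii_lowercase

ENGLISH_FREQ = {  # approximate percentage frequency of each letter in English
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
    's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
    'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
    'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.2, 'x': 0.2,
    'q': 0.1, 'z': 0.1,
}

def shift(text, k):
    """Shift every letter k places through the alphabet, leaving the rest alone."""
    return ''.join(
        ascii_lowercase[(ascii_lowercase.index(c) + k) % 26]
        if c in ascii_lowercase else c
        for c in text.lower())

def chi_squared(text):
    """Lower is more English-like: chi-squared distance to English frequencies."""
    letters = [c for c in text if c in ascii_lowercase]
    n = len(letters)
    return sum(
        (letters.count(c) - ENGLISH_FREQ[c] / 100 * n) ** 2
        / (ENGLISH_FREQ[c] / 100 * n)
        for c in ascii_lowercase)

def crack(ciphertext):
    """Try all 26 shifts; return the candidate plaintext that looks most English."""
    return min((shift(ciphertext, k) for k in range(26)), key=chi_squared)

secret = "imagine using a simple shift cipher to guard the biggest secret in the history of religion"
print(crack(shift(secret, 7)))  # recovers the original sentence
```

No key search, no cleverness: on any message of a sentence or more, frequency analysis alone undoes the cipher, which is why such schemes have been considered broken for centuries.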

The book describes one intriguing device supposedly invented by Leonardo Da Vinci called a cryptex: a small cylinder with a combination lock that will destroy its written contents if broken open. However, the book calls it a rudimentary form of public-key cryptography, which only tells me Dan Brown has no idea what that term means.

For a far better novel dealing with cryptography and the related paranoia, check out Neal Stephenson's Cryptonomicon.

8:56 AM

Monday, January 19, 2004

I'm Back

Thanks to Scott Aaronson for covering for me last week. If you enjoyed last week's posts, check out more of his writings. Maybe the weblog bug has bitten him and he will start his own blog someday.

I feel the need to remark on Scott's advice post and comments, particularly the following paragraph (having just come back from a week of skiing and no research).

So then, how do you do original research? By throwing your entire life into it. Many researchers play piano, go to clubs, sail, etc., but if they're any good they probably think about research while they're doing these things. I know grad students who never suffer the indignity of working late into the night. They go surfing with friends every weekend and are constantly away on road trips. At this rate, they'll enjoy life more than I will but won't be successful researchers.

Your success in academics, as in any professional endeavor, depends in part on how much effort you put into it, and the returns are far more than linear. But by no means are a social life and a productive research career incompatible. Most academics eventually find a life partner and many of us have children. We have many non-academic hobbies and activities even as graduate students. The trick is to find the right balance between your academic and non-academic activities, a difficult task but far from impossible. I truly admire the massive body of work of Paul Erdős, but I would never trade my life for the one he led.

And now a message for Warren, the college freshman with a potential interest in graduate school. Take some computer science classes and lots of math classes, particularly probability, algebra and logic. But most important of all, don't worry about research now. Enjoy your college days, get involved in lots of activities, have an active social life. You'll have plenty of time for research in graduate school.

8:13 PM

Scaring Away The Scientists Of Tomorrow (last post of guest blogger Scott Aaronson)

Sir Lance-lot has returned, and tomorrow will reclaim his fortress from this barbarian invader. He writes: "Thanks for blogging for me, though I hope you haven't scared away potential future researchers."

Let me state clearly what I think. The greatest perk of being a scientist is never having to doubt the value of what you do. If someone who fed starving Ethiopians, or rescued baby seals from oil spills, asked me how I justify spending my time proving complexity theorems, I might have difficulty answering; eventually I'd mumble something about basic science (along with art and music) embodying the highest aspirations of civilized humankind since the age of Democritus, and therefore being worthy of pursuit even in the face of palpable suffering. But if some regular schmo -- a sportswriter, or consultant, or homeopathist -- demanded that I justify what I do, I'd laugh in his or her face.

Other benefits of a research career include the freedom more or less to choose your hours, the satisfaction of being "the person who discovered such-and-such", the opportunity to inspire students, and copious expenses-paid trips to conferences around the world. I won't dwell on the downsides of being a scientist, both out of deference to Lance, and because the downsides are obvious to anyone familiar with cultural stereotypes.

The point I want to make is that for me, both the benefits and the downsides are irrelevant, because I can't even imagine not doing science. Having once tasted it, I couldn't go cold turkey any more than a heroin addict. What if someone solved one of my open problems, or emailed me with questions about a paper I wrote? Would I ignore that person, just as though BQP/qpoly and NISZK had never been part of my life? I mean, obviously I'd be happier were I a self-assured ignoramus who majored in marketing and mingled on the beach -- but then I wouldn't be I; I'd be a different person.

In summary, then, you should pursue a research career if and only if science to you is life's kth greatest pleasure for some k=O(1). Thank you for reading.

[Addendum: Here O(1) is intended in the physicist's sense, not the computer scientist's asymptotic sense. You only live once.]

1:12 AM

Sunday, January 18, 2004

Algorithmic Cooling on Mars II: Mars (by guest blogger Scott Aaronson)

OK, now Mars. I'm sure you've all read about the dramatic successes of the Spirit rover, which incidentally raise two computer science questions:

  1. Can a lander be programmed to scout a safe, interesting landing site during its 6-minute descent phase? (Sending pictures to Earth takes too long; the round-trip time for radio signals is about 20 minutes.) As far as I know, Spirit took photos only to gauge its speed relative to the surface, not to scout landing sites.
  2. Can (and should) the Internet be extended beyond Earth's atmosphere? During the periods when Spirit is not in Earth's line of sight, two existing Mars orbiters are pressed into service as relays -- so in some sense a Martian communications network already exists. Will denial-of-service attacks and Viagra offers soon plague the solar system?

I'm sure you've also all read about the Bush administration's new vision for space exploration, which includes a manned Mars mission at an unspecified future date. Despite my no-politics mandate, Lance has often discussed science funding in this blog, so I will too. The usual rule is that sending humans somewhere (the Moon, Mars, low-Earth orbit) costs 100 to 1000 times as much as sending robots to the same place. Part of the reason is that, letting ε be the probability of a catastrophic failure, the cost of a mission grows without bound as ε approaches 0. Unmanned Mars landers have done well with ε around 2/3. For manned missions, by contrast, any estimated ε above (say) 1/1000 is unacceptable (although ε will always be higher in practice, as we were recently reminded).

But is human spaceflight worth the costs? Lest this post become too polemical, I'll skip the usual arguments and their rebuttals (if you don't know them, read What's New by Bob Park), and end with a question for readers. If you were the President's science adviser, would you suggest gutting the Shuttle, the ISS, and all work towards a moon base or manned Mars mission, and diverting the funds toward basic science? If so, a followup: suppose the NSF budget for theoretical computer science were quintupled tomorrow. What would be the best way to spend the money?

11:17 PM

Algorithmic Cooling on Mars I: Algorithmic Cooling (by guest blogger Scott Aaronson)

Sorry I haven't posted for a while -- QIP has left me with nary an hour to spare. Today Leonard Schulman gave a talk whose title was announced as "Physical Limits of Heat-Bath Algorithmic Cooling on Mars." No, the talk didn't actually have anything to do with Mars; Leonard just wanted to show us his Windows wallpaper (a Mars photo), and suggest that Mars, being even colder than Waterloo, might be an even better site for quantum computing experiments.

Nevertheless, the title provides an excuse to discuss two things on my mind: Leonard's talk and Mars. I'll start with Leonard's talk. Liquid NMR quantum computing has the advantage that hundreds or thousands of quantum gates can be applied before decoherence sets in, and the disadvantage that the qubits are difficult to initialize to the standard "all-0" state. Instead, the starting state is exponentially close to the maximally mixed state. This means that in the final measurement outcome, the ratio of signal to noise decreases exponentially in the number of qubits -- so exponentially many repetitions are needed to extract a signal, negating any quantum speedup.

But is this fundamental? A few years ago Schulman and Vazirani introduced algorithmic cooling, a technique that starts with a hot, high-entropy state, then uses data compression to cool down a few qubits, at the cost of making the rest of the qubits even hotter. (As long as we're limited to reversible unitary gates, the total entropy of the system must remain constant.) The cold ("Waterloo/Mars") qubits can then be used as a quantum computer's standard initial state. The trouble is that too much chaff is needed for too little wheat: with current NMR error rates, extracting a few dozen cold qubits could take 10^8 or 10^12 starting qubits.

A natural alternative, proposed by Boykin, Mor, Roychowdhury, Vatan, and Vrijen, is to let the qubits evolve nonunitarily; that is, interact with the environment. In physics jargon, this is called "coupling the qubits to a heat bath," even though the goal is to cool the qubits. Amazingly, it turns out that by using classical tricks (for example, mapping the basis state |a,b,c> to |a+c,b+c,MAJ(a,b,c)>, where addition is mod 2 and MAJ denotes the majority function), the qubits can be made even colder than the environment to which they're coupled. This raises a question: are there any limits to such cooling? Schulman, jointly with collaborators whom I can't recall right now (one of them is Tal Mor), has given an affirmative answer. Suppose each qubit initially has bias ε (that is, is in the mixed state (1/2+ε)|0><0| + (1/2-ε)|1><1|). Then the heat-bath method can't increase the probability (that is, |amplitude|^2) of any basis state above 2^(-n)·exp(ε^2 n), where n is the number of qubits. This bound is essentially tight: if ε > 2^(4-n), then the initial state can be cooled significantly. Unfortunately, the algorithm that achieves this cooling requires order 1/ε^2 steps, which is exponential assuming ε is exponentially small. Furthermore, this exponential blowup seems to be unavoidable (Schulman didn't give a rigorous lower bound, but said it would be easy to obtain).

To my mind, the cooling result raises a fascinating question for complexity theory. Imagine that each generation of human beings, just as it plants trees and stores wine in cellars, starts cooling quantum states -- so that future generations, millions of years hence, could use those states to perform whatever quantum computations they wanted. "Vintage" quantum states would then be highly prized possessions (served chilled, of course). In this situation, would we be using exponential time (the time needed to cool the states), or polynomial time (the time between specifying the input and measuring the output)?

Part II of "Algorithmic Cooling on Mars" will be about Mars.

12:59 PM