Computational Complexity

 

Creative Commons License
This work is licensed under a Creative Commons License.


Thursday, April 29, 2004

 
Karp Symposium

Posted by Lance

[A report from weblog correspondent Bill Gasarch. Link to Allender's talk added 5/7]

On Wednesday April 28 there was a SYMPOSIUM HONORING DR. RICHARD M. KARP at Drexel University in Philadelphia.

They were honoring him for winning the BEN FRANKLIN MEDAL IN COMPUTER AND COGNITIVE SCIENCE. (There are Ben Franklin Medals for Physics, Chemistry, Life Sciences, Earth Science, Computer and Cognitive Science, and Engineering.)

There were three talks:

ERIC ALLENDER: The Audacity of Computational Complexity.

This talk described the basics of complexity theory and mostly focused on reductions. A nice contrast that it made:

  1. In the year 2004 we have good reason to think that many problems (e.g., SAT, 3-COL) are hard; factoring is the exception that remains hard to classify.
  2. In the year 1970 most problems (including SAT and 3-COL) were hard to classify.
The talk also pointed out some common objections to computational complexity (e.g., "How can you call an n^100000 algorithm feasible?") and answered them nicely (e.g., "We want to show problems are hard, and showing a problem is not in P does that."). The talk both began and ended on the topic of CHECKERS and GO being computationally hard problems.

AVI WIGDERSON: The Power and Weakness of Randomness (When you are short on time).

This talk showed several examples of problems where randomness helps (hence randomized algorithms are powerful) but also indicated why there is reason to think that one can always replace a randomized algorithm with a deterministic polynomial-time algorithm (hence randomization adds no power). The examples involved sampling, routing in networks, and mazes.

RICHARD KARP: Even Approximate Solutions Can Be Hard to Compute.

This talk was about certain problems that can be approximated and certain ones that (it seems) cannot be. A nice contrast was among variants of TSP, ranging from those that can be approximated very well, to those that can be approximated somewhat, to those that cannot be approximated at all. He also brought in randomized rounding as a technique for approximation. The talk ended with the PCP theorem (presented informally) and how it can be used to show lower bounds for approximation.
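One of the simplest illustrations of randomness in approximation (a cousin of the randomized rounding Karp discussed, not his actual example): a uniformly random assignment satisfies each 3-literal clause with probability 7/8, and so satisfies 7/8 of the clauses in expectation. A minimal Python sketch, using a made-up formula:

```python
import random

def random_assignment_maxsat(clauses, num_vars, trials=2000, seed=1):
    """Best number of satisfied clauses over `trials` uniform random assignments."""
    rng = random.Random(seed)
    best = 0
    for _ in range(trials):
        # assign[i] is the truth value of variable i (index 0 is unused)
        assign = [rng.random() < 0.5 for _ in range(num_vars + 1)]
        sat = sum(
            any(assign[lit] if lit > 0 else not assign[-lit] for lit in clause)
            for clause in clauses
        )
        best = max(best, sat)
    return best

# A made-up 3-CNF formula: a positive int is a variable, negative its negation.
clauses = [[1, 2, 3], [-1, 2, -3], [1, -2, 3], [-1, -2, -3]]
print(random_assignment_maxsat(clauses, num_vars=3))
```

Hastad's PCP-based lower bound shows this trivial 7/8 is actually optimal for MAX-3SAT unless P=NP, which is exactly the style of hardness-of-approximation result the talk built toward.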

OVERALL:
The talks were all well presented and quite understandable. The point of the talks was to expose our area to people outside of theory, and perhaps even outside of computer science. As such, the theorists in the audience did not learn much that was new; still, it is interesting to see someone else's perspective on material you are familiar with.

9:56 AM

Wednesday, April 28, 2004

 
Conferences versus Journals

Posted by Lance

A reader asks why Gafni and Borowsky did not publish their paper in a journal and become eligible for the Gödel Prize. I wish this were an isolated incident, but it reflects a sad state of affairs in computer science, and theoretical computer science in particular. Too many papers in our field, including many great ones, never get submitted to refereed journals. In an extreme case, Steve Cook received the Turing Award mostly for a STOC paper.

In most cases, conferences in computer science are more selective than journals. Your reputation in theoretical computer science is measured more by the conferences your papers appear in than by the journals. In other fields, like mathematics, physics and biology, journals have a much greater reputation and most papers do appear in refereed form. I believe the reason is historical: computer science started as a quickly changing field and journals could not keep up with the rapidly emerging ideas.

Conference program committees cannot and do not produce full referee reports on conference submissions. Proofs are not verified. Papers are not proofread carefully for mistakes, nor given suggestions for improving the presentation. Computer science suffers by not having the permanency and stamp of approval of a journal publication on many of its best papers. The founders of the Gödel Prize put in the journal requirement to encourage potential award-winning papers to go through the full refereeing process.

Many papers in our field do appear in journals, and some researchers are extremely diligent in making sure all of their work appears in refereed form. Also, I know of no computer scientist who purposely avoids sending papers to journals. But when we have a system that does not value journal publications, a computer scientist pressed for time often will not make the effort to take a paper past its conference version.

8:53 AM

Monday, April 26, 2004

 
Is Disney World NP-complete?

Posted by Lance

The Unofficial Guide to Walt Disney World 2004 gives a lesson on complexity by describing the optimal tour of the Magic Kingdom as a traveling salesman problem. Some excerpts:
As we add more attractions to our list, the number of possible touring plans grows rapidly...The 21 attractions in the Magic Kingdom One-Day Touring Plan for Adults has a staggering 51,090,942,171,709,440,000 possible touring plans...roughly six times more than the estimated number of grains of sand in the whole world...Fortunately, scientists have been hard at work on similar problems for many years...finding good ways to visit many places with minimum effort is such a common problem that it has its own nickname: the traveling salesman problem.
The book goes on to describe the computer program they use to approximate the optimal tour. Read more here (which I found by searching within the book for "traveling salesman" on the Amazon site). You'll need to be a registered user of Amazon.com to read it.
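The "staggering" figure quoted above is exactly 21!, the number of orderings of 21 attractions. A quick Python sketch, using made-up walking times, of both that count and the brute-force search that becomes hopeless as the number of attractions grows:

```python
from itertools import permutations
from math import factorial

# The guide's number is exactly the count of orderings of 21 attractions.
print(factorial(21))  # 51090942171709440000

# Brute-force TSP on a toy instance: walking times (minutes) between
# 5 hypothetical attractions (the numbers are invented for illustration).
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def shortest_tour(dist):
    """Try every ordering starting from attraction 0; O(n!) time."""
    n = len(dist)
    best = None
    for perm in permutations(range(1, n)):
        route = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if best is None or cost < best[0]:
            best = (cost, route)
    return best

print(shortest_tour(dist))
```

At 5 attractions this checks 24 routes instantly; at 21 it would check 21! of them, which is the whole point of the guide's aside and of the approximation algorithms their program actually uses.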

9:17 AM

Sunday, April 25, 2004

 
Gödel Prize

Posted by Lance

From the PODC (distributed computing) mailing list via Harry Buhrman. Usually the winners are kept secret until the ICALP or STOC conference but the PODC mailing list has already broken the news.
It has been recently announced that this year's winners of the Gödel Prize are

As we all know, the result was also published simultaneously in STOC 1993 by Eli Gafni and Liz Borowsky, but the Gödel Prize is awarded only to journal articles.

Congratulations to the winners!

Note that for the second time, the Gödel Prize honors a core PODC topic (in 1997, Joe Halpern and Yoram Moses won the prize). This is a sign both of the scientific quality of the PODC community and of the respect it commands in the theoretical CS world at large.

In case you are counting, that's Complexity 5, PODC 2.

6:53 AM

Friday, April 23, 2004

 
Theory Girl

Posted by Lance

From Bill Gasarch: There are some more novelty songs about theory (aside from THE LONGEST PATH) from the Washington CSE Band. The best one is THEORY GIRL.

2:53 PM

Thursday, April 22, 2004

 
A Few Short Announcements

Posted by Lance

Alan Kay will receive the 2003 Turing Award. It can't always be a theorist.

Registration is open for the 2004 Conference on Computational Complexity. The final schedule will be posted soon. Also keep in mind STOC 2004 right here in Chicago.

The list of accepted papers for ICALP is up.

Finally, next Wednesday the 28th in Philadelphia, Drexel is hosting a symposium on computational complexity honoring Richard Karp.

10:36 AM

Wednesday, April 21, 2004

 
Are There #P Functions Equivalent to SAT?

Posted by Lance

Help me solve this problem, write the paper with me, get an Erdős number of 3, and it won't cost you a cent.

We can have #P functions that are hard for the polynomial-time hierarchy (Toda) or very easy, but can a #P function capture exactly the power of NP?

Conjecture: There exists an f in #P such that P^f = P^SAT.

There is some precedent: counting the number of graph isomorphisms is equivalent to deciding graph isomorphism.
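The easy direction of that precedent is immediate: any counter yields a decider, since G ≅ H iff the number of isomorphisms is positive. A brute-force Python sketch on toy graphs (illustration only; real graph-isomorphism algorithms are far more clever, and the function name is mine):

```python
from itertools import permutations

def count_isomorphisms(edges_g, edges_h, n):
    """Count bijections on {0..n-1} mapping graph G onto graph H (brute force)."""
    g = {frozenset(e) for e in edges_g}
    h = {frozenset(e) for e in edges_h}
    return sum(
        1
        for p in permutations(range(n))
        if {frozenset((p[u], p[v])) for u, v in g} == h
    )

# Deciding isomorphism reduces trivially to counting: G ≅ H iff the count > 0.
g = [(0, 1), (1, 2), (2, 3), (3, 0)]   # a 4-cycle
h = [(0, 2), (2, 1), (1, 3), (3, 0)]   # the same 4-cycle, relabeled
print(count_isomorphisms(g, h, 4))     # 8: one per symmetry of the square
```

The interesting direction of the equivalence, that a GI decider lets you compute the count in polynomial time, is the nontrivial part of the cited result.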

The conjecture is true if NP=UP, if NP=PP or if graph isomorphism is NP-complete. I don't believe any of these. Does the conjecture follow from some believable assumption, or perhaps from no assumption at all? We don't even know whether there is a relativized world where the conjecture fails.

Even the following weaker conjecture is open: There exists an f in #P such that NP ⊆ P^f ⊆ PH.

A good solution to these conjectures might help us settle the checkability of SAT.

8:51 AM

Monday, April 19, 2004

 
Asian Food for Thought?

Posted by Lance

Many years ago, an Israeli graduate student made the rounds and gave talks at several US universities. When he arrived in Chicago, he asked me if Americans only eat Chinese food. I told him he hadn't seen a random sample of Americans and took him out for some good Chicago ribs. Afterwards he told me he preferred the Chinese food.

At a logic conference at Notre Dame, I ate dinner with a small group at one of the few Chinese restaurants in South Bend. Surprisingly no other mathematicians were eating in the restaurant. Just as we noticed this, the waiters started putting tables together and about five minutes later in walk about 20 logicians for dinner.

Why do mathematicians and computer scientists eat so much Asian food? Not just Chinese but Japanese, Thai, Korean, Vietnamese, Indonesian, Ethiopian (not Asian but close enough) and of course Indian (northern and southern). Not that I don't enjoy Asian food but what's wrong with a good hamburger?

8:40 AM

Tuesday, April 13, 2004

 
Favorite Theorems: Primality

Posted by Lance

March Edition

Primality is a problem hanging onto a cliff above P with its grip continuing to loosen each day. - Paraphrased from a talk given by Juris Hartmanis in 1986.

It took sixteen more years but the primality problem did fall.

PRIMES is in P by Manindra Agrawal, Neeraj Kayal and Nitin Saxena.

This paper gave the first deterministic algorithm, provably running in polynomial time, for determining whether a number n, given in binary, is prime. The theoretical importance cannot be overstated. But why do I consider the paper a complexity result instead of just an algorithmic result?

Manindra Agrawal already had a strong reputation as a complexity theorist. The proof involves a derandomization technique for a probabilistic algorithm for primality. But more importantly, primality has a long history in complexity.

Primality is in co-NP almost by definition. In 1975, Vaughan Pratt showed that PRIMES is in NP. In 1977, Solovay and Strassen showed that PRIMES is in co-RP, and testing primality became the standard example of a probabilistic algorithm. In 1987, Adleman and Huang, building on work of Goldwasser and Kilian, showed that PRIMES is in RP and thus in ZPP. In 1992, Fellows and Koblitz showed that PRIMES is in UP ∩ co-UP. Finally, in 2002, came AKS, putting PRIMES in P.
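For flavor, here is a sketch of the style of randomized test that made primality the standard example of a probabilistic algorithm. (This is the later Miller-Rabin variant, not Solovay and Strassen's original Jacobi-symbol test; the code is mine, not from any of the cited papers.)

```python
import random

def miller_rabin(n, rounds=20, seed=0):
    """Randomized primality test. Primes are always accepted; each round
    catches a composite with probability >= 3/4, so composites survive
    all rounds only with vanishing probability (the co-RP flavor)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1          # write n - 1 = d * 2^s with d odd
    rng = random.Random(seed)
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # a is a witness that n is composite
    return True

print([p for p in range(2, 40) if miller_rabin(p)])
print(miller_rabin(2**61 - 1))        # a Mersenne prime: True
```

AKS removed the randomness entirely, at the cost (in the original paper) of a larger polynomial running time.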

A runner-up in this area is the division problem recently shown to be in logarithmic space and below.

10:28 AM

Sunday, April 11, 2004

 
The Cost of Textbooks

Posted by Lance

The University of Chicago Bookstore has asked for textbook requests for the fall quarter by the middle of next month, instead of during the summer as in past years. The reasoning: a burgeoning used-textbook market. If the bookstore knows which books faculty will use in the fall, it can offer students higher prices when buying back used books at the end of the spring quarter.

This is just one indication of the problem of rising textbook costs. CALPIRG has a recent extensive report on this topic. Textbook costs add to already spiraling increases in tuition and other college expenses.

In addition, I have heard more griping than usual from students in my class about buying the textbook, even though the book, Homer and Selman's Computability and Complexity Theory, lists new for $50, below even the average used price mentioned in the CALPIRG report.

What should I do as a faculty member? Should professors strive to reuse the same textbook each year so students can buy and sell used copies to keep their costs down? That can lead to courses getting stale very fast.

Or should I forgo textbooks completely and rely on less organized material freely available on the internet? I already do this for graduate courses, where strong up-to-date textbooks simply do not exist.

11:47 AM

Tuesday, April 06, 2004

 
The View of a Science Writer

Posted by Lance

A friend of mine from college became a science writer for various newspapers and magazines. Once he told me about his two biggest complaints about scientists.
  1. Scientists want everyone who works on a project to be named in an article.
  2. Scientists want every detail in an article to be complete and correct.
You might initially take the side of the scientists. But the science writer writes not for the scientists but for the general public.

Put yourself in the position of the reader. The reader doesn't want to read through a long list of names they won't remember anyway. The average reader also just wants an overview of the research and its importance. If removing some technical caveats and slightly oversimplifying the research achieves a better level of understanding for the reader, so be it.

Remember, the next time you read a science article in the popular press or get interviewed for such an article, that the goal of the article is not to pass a serious referee review but to give the general public some glimpse into an important research area.

9:01 PM

Monday, April 05, 2004

 
Blum Complexity Measures

Posted by Lance

The Blum speed-up theorem states that there exists a computable language L such that if L is computable in time t(n) then L is computable in time log(t(n)). The log function can be replaced by any arbitrarily slowly growing computable function. Instead of time one can use space or any other measure Φ that fulfills these properties:
  1. Φ(M,x) is finite if and only if M(x) halts, and
  2. There is a computable procedure that, given (M,x,r), can decide whether Φ(M,x)=r.
These are known as the Blum axioms, and measures that fulfill them are known as Blum complexity measures. They were developed by Manuel Blum in the late 1960s.

The Borodin-Trakhtenbrot Gap Theorem states that given any computable function g(n) (e.g., g(n)=2^n), there exists a function t(n) such that every language computable in time g(t(n)) is also computable in time t(n), i.e., there is a gap between these time classes. Once again the theorem holds for any Blum complexity measure.

We don't see much of the Blum complexity measures these days for a few reasons.

  1. The only truly interesting Blum measures are time and space.
  2. The functions and languages that one gets out of the speed-up, gap and related theorems are usually quite large and artificial.
  3. Many measures that we are interested in today, like the number of random coins used by a probabilistic Turing machine, do not fulfill the Blum axioms.
In 1991 I saw Manuel Blum give a talk discussing a new complexity measure, something about mind changes, that did not fulfill his axioms. So we had a Blum complexity measure that was not a Blum complexity measure, and, as Douglas Adams would say, Manuel Blum "promptly vanishes in a puff of logic." [Just kidding; we like Manuel]

8:06 AM

Friday, April 02, 2004

 
More News from Dagstuhl

Posted by Lance

Another Guest Post from Dieter van Melkebeek

Thursday morning, Shuki Bruck gave the first talk at the workshop that dealt with actual Boolean circuits. He pointed out that cyclic circuits can be combinational and may allow us to realize Boolean functions with fewer gates and/or less delay. Consider the following circuit with inputs x1, x2, x3, and outputs f1, f2, f3, f4:


 |-----------------------------------|
 |                                   |
 |    x1       x2     -x1       x3   |
 |    |        |       |        |    |
 |    |        |       |        |    |
 |    v        v       v        v    |
 |                                   |
 |-> \/ ----> /\ ----> \/ ----> /\ --|

      |        |        |        |
      v        v        v        v

      f1       f2       f3       f4
Although the circuit is topologically cyclic, the outputs are well-defined and depend only on the inputs. (Look at the cases x1=0 and x1=1 separately.) A careful analysis shows that every acyclic circuit that outputs f1, f2, f3, and f4 needs at least 5 nonunary gates. Thus, circuits with feedback save a factor of 4/5 in the number of gates needed to compute these functions. (As usual, we do not count negations.) Shuki presented a sequence of Boolean functions for which the reduction in the number of nonunary gates asymptotically reaches 1/2 if we only allow gates of fanin at most 2. He raised the question of how significant the reduction can be if we allow larger fanin.
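The case analysis suggested above can be checked mechanically: iterate the feedback loop from both possible initial values of the feedback wire and confirm that every input converges to a single output vector. A Python sketch of that check (reading \/ as OR and /\ as AND, per the diagram):

```python
from itertools import product

def eval_cyclic(x1, x2, x3):
    """Evaluate the cyclic circuit above: the feedback wire carries f4 back
    into the first OR gate. Start from both guesses for the feedback value;
    the circuit is combinational iff both converge to the same outputs."""
    results = set()
    for feedback in (0, 1):
        for _ in range(4):            # plenty of iterations to settle
            f1 = x1 | feedback        # \/ : OR of x1 and the feedback
            f2 = x2 & f1              # /\ : AND
            f3 = (1 - x1) | f2        # \/ : OR with -x1
            f4 = x3 & f3              # /\ : AND; f4 feeds back to f1
            feedback = f4
        results.add((f1, f2, f3, f4))
    return results

# Well-defined on every input: both initial guesses give one output vector.
for x1, x2, x3 in product((0, 1), repeat=3):
    assert len(eval_cyclic(x1, x2, x3)) == 1, (x1, x2, x3)
print("combinational: outputs depend only on the inputs")
```

The two-case argument in the text is visible in the iteration: with x1=1 the loop settles after one pass, and with x1=0 it settles after two.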

Thomas Thierauf presented an NC2 algorithm for unique perfect matching. A perfect matching in a graph is a collection of disjoint edges that covers all vertices. It has been known for some time how to decide the existence of a perfect matching and how to construct one in randomized NC2:

  1. Assign random weights from a small range of integers to the edges of the graph such that with high probability there is at most one minimum weight perfect matching. If we are in the situation with a unique minimum weight matching M, we can decide whether a given edge belongs to M by evaluating two determinants of matrices with integer entries that are exponential in the weights. Since the weights are small, we can do the latter in NC2.
  2. Run the NC2 algorithm on all edges in parallel and verify that the result is a perfect matching M.
It is open whether perfect matchings can be constructed deterministically in NC.

To decide whether a graph G has a unique perfect matching, Thomas first runs step 2 above (with unit weights). If that test fails, the algorithm rejects since G either has no perfect matching or has more than one. If the test is passed, the algorithm additionally verifies that G has no perfect matching M' other than M. Such an M' exists iff G contains a cycle that alternates between edges from M and edges in G-M. The latter can be cast as a reachability problem in a graph that is roughly a concatenation of directed copies of M and G-M. Since directed graph reachability can be computed in NC2 and the input to the reachability problem can be computed in NC2 by step 2 above, the additional test runs in NC2, as does the entire algorithm.
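A sequential (non-NC) Python sketch of the uniqueness test described above: encode "G has an M-alternating cycle" as cycle detection in a digraph with an arc u → mate(v) for every non-matching edge {u,v}, so that a directed cycle corresponds exactly to an alternating cycle in G. The function names are mine, not from the talk:

```python
def has_alternating_cycle(edges, matching):
    """True iff the graph has a cycle alternating between matching and
    non-matching edges, i.e. iff the given perfect matching is not unique."""
    mate = {}
    for u, v in matching:
        mate[u], mate[v] = v, u
    match_set = {frozenset(e) for e in matching}
    succ = {v: [] for v in mate}
    # Arc u -> mate(v) means: leave u on non-matching edge {u,v},
    # then follow the matching edge {v, mate(v)}.
    for u, v in edges:
        if frozenset((u, v)) not in match_set:
            succ[u].append(mate[v])
            succ[v].append(mate[u])
    # Standard DFS cycle detection (0 = white, 1 = grey, 2 = black).
    color = {v: 0 for v in succ}
    def dfs(v):
        color[v] = 1
        for w in succ[v]:
            if color[w] == 1 or (color[w] == 0 and dfs(w)):
                return True
        color[v] = 2
        return False
    return any(color[v] == 0 and dfs(v) for v in succ)

# A 4-cycle has two perfect matchings, so either one admits an
# alternating cycle; a single-edge graph's matching is unique.
c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(has_alternating_cycle(c4, [(0, 1), (2, 3)]))   # True: not unique
print(has_alternating_cycle([(0, 1)], [(0, 1)]))     # False: unique
```

The NC2 version replaces this DFS with parallel directed reachability, exactly as the paragraph above describes.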

On Friday, Oded Lachish discussed the current records on unrestricted circuit lower bounds for explicit functions in n Boolean variables. For circuits that can use any binary gate, the record dates back to 1984 and stands at 3n. For circuits that can use any binary gate except parity and its negation, the record has recently been improved from 4n - O(1) to 5n - o(n). Both records use the technique of gate elimination, and Oded conjectured that the 3n result can be improved along the lines of the recent 5n - o(n) result.

The workshop ended at noon on Friday. One statistic: among the 33 talks, 3 were blackboard only, 5 used handwritten slides, 1 used printed slides, and 24 were computer presentations.

Finally, I have one suggestion for those readers who have attended a Dagstuhl seminar in the past. In response to changes in financial support, the Dagstuhl office is requesting information about research publications that grew out of or have otherwise been significantly influenced by a Dagstuhl seminar. If you are an author of such a publication, please send the information to office@dagstuhl.de. Let's try to keep the wonderful tradition of Dagstuhl alive!

2:39 PM
