Computational Complexity

 


Saturday, November 23, 2002

 
FOCS and Visas

The FOCS Conference, the major fall theory conference, was held last week in Vancouver and sounded like a complete success. According to PC Chair Bernard Chazelle there were 320 registrants--quite a healthy number for this conference. Most encouraging were the large number of students attending and the number of strong student papers, both indicating a solid future for theoretical computer science.

The 320 does not count another 50 "registrants" from Nigeria. They registered with fake credit card numbers in order to obtain letters from the conference organizers to help them get visas, in this case to Canada. Whether they got the visas is unclear; they, of course, never showed up at the conference.

The temptation to help those from Africa is strong, especially since that continent is woefully underrepresented in computer science. However, we must, as a community, be vigilant against those who use our conferences as a way to get around standard immigration laws. Whether or not we agree with those laws, if abuses of this nature continue it becomes harder to bring in legitimate scientists, a problem I discussed in my last post.

7:52 PM

Friday, November 22, 2002

 
Coming to America

A recent Washington Post editorial brings up an important issue.

Andris Ambainis was supposed to spend the fall at MSRI in Berkeley but instead is enjoying his native Latvia. When Manindra Agrawal came to Boston last month to talk about his primality algorithm, he was supposed to bring along his student co-authors. Instead he came alone.

Worries about terrorism have made the US government more cautious about issuing visas, and this has slowed down the visa process tremendously. Visa problems have always been a thorn in the side for academics, but this fall seems particularly bad.

I understand the need to be careful, but when science is hindered by politics nobody wins.

9:58 AM

Wednesday, November 20, 2002

 
Where was Cook's Talk?

In 1971, Steve Cook gave the conference presentation showing that SAT is NP-complete. It did not immediately stir up much excitement, but it is, in retrospect, the single most important conference talk in the history of theoretical computer science. So when and where was this talk?

Steve Cook's paper The Complexity of Theorem-Proving Procedures appeared at the Third Annual ACM Symposium on Theory of Computing (STOC), held May 3-5, 1971 in Shaker Heights, Ohio, a suburb of Cleveland.

Funda Ergun, a professor at Case Western Reserve, just purchased a house in Shaker Heights and wondered where exactly the conference took place. We got the answer from Bill Rounds, who was one of the local organizers of that conference.

It was (at, I think, Stouffer's hotel) at the intersection of Warrensville Center Road and Chagrin Boulevard, in the Van Aken Center district. The hotel is now gone.

Here is a MapQuest map of that location.

Someday we will organize a P versus NP workshop in that area and make a pilgrimage to this site.

8:00 AM

Tuesday, November 19, 2002

 
Foundations of Complexity
Lesson 8: Efficient Computation

Previous Lesson | Next Lesson

In past lessons, we have studied the computable functions. Computable functions can take an arbitrary amount of time: What good is a program that will eventually give the correct answer but might not finish before the universe collapses?

Somehow we want to limit the amount of time or memory that a computer can use. Just giving a fixed bound does not work well: as technology improves and computers get faster and better, we expect to solve larger and larger problems in a reasonable amount of time. Hartmanis and Stearns, in their seminal paper on computational complexity, turned this around and came up with the right idea: consider time and memory as functions of the size of the input.

The time a machine M takes on input x is the number of computation steps it performs before halting when started on input x. I am being very informal about what a "step" is; in a later lesson we will give formal definitions of machines and steps, but for now think of a step as the execution of one instruction.

The memory, or space as we theorists call it, is the number of bits of storage used by M on a given input.
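As a quick, informal illustration (a toy sketch of my own, not the formal machine model of a later lesson), here is a Python program that counts "steps" while deciding whether a string is a palindrome. The convention that one loop iteration is one step is my assumption for the sketch.

```python
# A toy sketch (assumption: one loop iteration = one "step"), not the
# formal machine model of a later lesson. It decides whether a string
# is a palindrome while counting steps; the only extra storage is two
# indices, so the space used is tiny compared to the input itself.

def palindrome_steps(x: str) -> tuple[bool, int]:
    steps = 0
    i, j = 0, len(x) - 1
    while i < j:
        steps += 1                 # count one comparison as one step
        if x[i] != x[j]:
            return False, steps
        i, j = i + 1, j - 1
    return True, steps

# The step count grows as a function of the input size (about n/2 on
# inputs of length n), exactly the Hartmanis-Stearns point of view.
for n in (4, 8, 16):
    print(n, palindrome_steps("a" * n))
```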

Edmonds gave an algorithm for the matching problem that runs in polynomial time: the number of steps used by the algorithm on a graph of size n is at most n^k for some constant k. He suggested that polynomial time captures efficient computation.
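Edmonds' algorithm for general graphs is too involved to sketch here, but to make "polynomial time" concrete, here is a hedged stand-in: the classic augmenting-path algorithm for the simpler bipartite special case. The function names and the graph encoding are my own choices for illustration.

```python
# A sketch of polynomial-time matching in the bipartite special case
# (not Edmonds' general-graph blossom algorithm). With n left vertices
# and m edges it runs in O(n * m) time -- polynomial in the input size,
# which is the point of this lesson.

def bipartite_matching(adj: list[list[int]], n_right: int) -> int:
    match_right = [-1] * n_right   # match_right[v] = left vertex matched to v

    def try_augment(u: int, visited: list[bool]) -> bool:
        for v in adj[u]:
            if not visited[v]:
                visited[v] = True
                # v is free, or its current partner can be rematched elsewhere
                if match_right[v] == -1 or try_augment(match_right[v], visited):
                    match_right[v] = u
                    return True
        return False

    size = 0
    for u in range(len(adj)):
        if try_augment(u, [False] * n_right):
            size += 1
    return size

# Three left vertices, three right vertices; a perfect matching exists.
print(bipartite_matching([[0, 1], [0], [1, 2]], 3))  # prints 3
```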

We now define our first complexity class, P, as the set of all languages L for which there exists a machine that determines whether x is in L and halts in time polynomial in the length of x. P has many nice properties: a polynomial-time algorithm that uses a polynomial-time subroutine still runs in polynomial time, since the subroutine is called at most polynomially many times on inputs of at most polynomial length, and polynomials are closed under composition and products. Also, P is robust; that is, the class is the same no matter how you formally define your machines.
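To make the definition concrete, here is a minimal sketch (my own illustrative example) of a language in P: the set of triples (graph, s, t) such that t is reachable from s. Breadth-first search decides membership in time linear, hence certainly polynomial, in the input length.

```python
from collections import deque

# A minimal sketch of a language in P: s-t reachability in a directed
# graph given as an adjacency list. BFS visits each vertex and edge at
# most once, so the running time is polynomial (in fact linear) in the
# size of the input.

def reachable(adj: dict[int, list[int]], s: int, t: int) -> bool:
    seen = {s}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return True
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return False

print(reachable({0: [1], 1: [2], 2: []}, 0, 2))  # prints True
```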

In these lessons, we will treat P as the class consisting of efficiently computable problems. More classes will come.

9:48 AM
