[general info]
[lecture notes] [Midterm and Project]
Lecturer: Luca Trevisan, luca@eecs, 679 Soda Hall, Tel. 642 8006
Classes are Tuesday-Thursday, 2-3:30pm, 405 Soda
Office hours: Wednesdays, 2-3pm, or by appointment
References: the main reference for the course will be the lecture notes. New lecture notes will be distributed after each lecture. In the meantime, you can also refer to older notes.
Two very good new textbooks are coming out soon, and preliminary versions are freely available on the web.
It is also good to have a copy of
About this course: Computational complexity theory looks at the computational resources (time, memory, communication, ...) needed to solve computational problems that we care about, and it is especially concerned with the distinction between "tractable" problems, which we can solve with a reasonable amount of resources, and "intractable" problems, which are beyond the power of existing, or conceivable, computers. It also looks at the trade-offs and relationships between different "modes" of computation (what if we use randomness, what if we are happy with approximate, rather than exact, solutions, what if we are happy with a program that works only for most possible inputs, rather than being universally correct, and so on).
This course will roughly be divided into two parts: we will start with "basic" and "classical" material about time, space, P versus NP, the polynomial hierarchy, and so on, including moderately modern and advanced material, such as the power of randomized algorithms, the complexity of counting problems, and the average-case complexity of problems. In the second part, we will focus on more research-oriented material, to be chosen among: (i) PCP and hardness of approximation; (ii) lower bounds for proofs and circuits; and (iii) derandomization and average-case complexity.
This course has at least two goals. One is to demonstrate the surprising connections between computational problems that can be discovered by thinking abstractly about computations: this includes relations between learning theory and average-case complexity, the Nisan-Wigderson approach to turning intractability results into algorithms, the connection, exploited in PCP theory, between efficiency of proof-checking and complexity of approximation, and so on. The other goal is to use complexity theory as an "excuse" to learn about several tools of broad applicability in computer science. Depending on how far we get, we will see enough Fourier analysis and learning theory to know how to learn decision trees with membership queries, enough graph theory to build constant-degree expander graphs from scratch and to understand why spectral partitioning algorithms work, and enough algorithmic coding theory to know how to decode Reed-Solomon codes.
For reasons that are only partially understood, a disproportionate number of the most beautiful results in complexity theory in the 80s and 90s were found by Berkeley graduate students. Hopefully you will take upon yourselves the mission of continuing this tradition before the current decade ends.
Note: the homeworks are for your enjoyment only. You don't have to solve them and they are not meant to be turned in.
NO CLASS March 18
Tentative plan: (updated 3/11, 4/14)
Lectures 1-6: time complexity
P, NP, randomized algorithms, circuits, polynomial hierarchy, approximate counting
Lectures 7-14: space complexity
expanders, Reingold's algorithm
Lectures 15-22: PCP
Dinur's proof of the PCP theorem and applications to approximability
Lectures 23-24: Hardness of random 3SAT
Levin's theory of average-case complexity; proof complexity lower bounds
Lectures 25-: Pseudorandomness, Natural Proofs, and Learning
Parity lower bounds, one-way permutations, Goldreich-Levin, PRGs, PRFs, applications to encryption and authentication, Natural Proofs, Goldreich-Levin as a learning algorithm, learning decision trees and AC0 circuits with queries
Midterm due April 15
Information about projects