Location: Gates 4B lobby.
Mailing list: thseminar@cs.stanford.edu. Email Qiqi to sign up.
Contact: Qiqi Yan (qiqiyan@stanford...)
Food: Wendy usually chooses the food at her discretion, but the speaker may suggest a preference to her.
This meeting is the place where people sign up for talks.
All you want to know, and more, about the set of all knight's tours on a 3xn chessboard.
A fertile area of recent research has established concrete polynomial
lower bounds on the time needed to solve hard computational problems
on restricted computational models. Among these problems are Boolean
Satisfiability, Vertex Cover, Hamilton Path, MOD6-SAT, and
Majority-of-Majority-SAT, to name a few. The proofs of such lower
bounds all follow a certain proof-by-contradiction strategy.
I will present a high-level survey of some recent work on this topic,
to give you a feel for the techniques involved. I'll also briefly
discuss a strategy for generating new lower bound proofs by computer.
Abstract: In the single-sink buy-at-bulk problem we wish to route flow along a graph from a set of demand nodes to a root node, where the cost of routing x total flow along an edge is proportional to f(x) for some concave, non-decreasing function f satisfying f(0) = 0. Generally, solutions are heavily tailored to the specific cost function, but we present a simple, deterministic, combinatorial algorithm that takes a set of demands and constructs a single tree T such that, for every such f, the cost of T under f is a 47.45-approximation of the optimal cost for that f. This is within a factor of 2.33 of the best approximation ratio currently achievable when the tree can be optimized for a specific function. Trees achieving simultaneous O(1)-approximations for all concave functions were previously not known to exist, regardless of computation time.
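To make the cost model concrete, here is a minimal Python sketch with hypothetical names; it evaluates only the cost of routing all demands to the root along a fixed tree for a given concave f, and is not the approximation algorithm from the talk.

    import math

    def tree_routing_cost(parent, length, demand, f):
        """Cost of routing every demand to the root along a fixed tree.

        parent[v] -- parent of node v (the root does not appear as a key)
        length[v] -- length of the edge (v, parent[v])
        demand[v] -- demand originating at node v
        f         -- concave, non-decreasing cost function with f(0) = 0

        The flow on edge (v, parent[v]) is the total demand in v's subtree,
        and that edge contributes length[v] * f(flow) to the cost.
        """
        children = {}
        for v, p in parent.items():
            children.setdefault(p, []).append(v)

        def subtree_demand(v):
            return demand.get(v, 0) + sum(subtree_demand(c) for c in children.get(v, []))

        return sum(length[v] * f(subtree_demand(v)) for v in parent)

    # Example: a path r - a - b with unit edge lengths, unit demands, f = sqrt.
    print(tree_routing_cost(parent={'a': 'r', 'b': 'a'},
                            length={'a': 1.0, 'b': 1.0},
                            demand={'a': 1.0, 'b': 1.0},
                            f=math.sqrt))   # sqrt(2) + sqrt(1)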
Abstract: Shapley and Shubik in 1971 initiated the study of matching markets from the market equilibrium point of view, and showed that the set of equilibria, which is always non-empty, has a number of nice properties. Motivated by applications in advertising markets, we study matching markets with a strict budget constraint for every buyer. While an equilibrium may not exist once budgets are introduced, we give a polynomial time algorithm that determines whether an equilibrium exists and computes one if it does.
The talk is based on joint work with Xiaotie Deng and Arpita Ghosh.
We analyze Combinatorial Auctions with subadditive bidders. The basic problem is to sell m items to n bidders, with the bidders obtaining disjoint subsets of the items. Each bidder has a valuation for each subset of the items. We additionally assume that this valuation is subadditive, i.e., it satisfies $v(S \cup T) \le v(S) + v(T)$ for any two sets S and T.
The mechanism we analyze restricts bidders to expressing their preferences using a much simpler model. Each bidder specifies one value per item, and the value of a set is then inferred by adding the values of its individual items. In technical language, it forces a bidder to express a subadditive preference using an additive valuation. Since the bidder's valuation is not necessarily additive, dominant strategies (expressing the true preference) are out of the question, and hence the Price of Anarchy framework is employed to analyze the performance of the mechanism.
The performance metric for the mechanism is social welfare, the sum of the bidders' values for their allocated sets. We show that for this mechanism a pure Nash equilibrium approximates the optimal social welfare within a factor of 2, and there exist simple instances where this worst-case ratio is realized. For more general but sometimes more natural equilibrium concepts, such as Bayes-Nash, mixed Nash, correlated, and coarse-correlated equilibria, we establish a bound of $2\ln m$. Though we do not provide a lower bound establishing the tightness of this second bound, we do present an instance with a Bayes-Nash equilibrium that approximates the social welfare by a factor strictly worse than 2. This establishes a clear gap between the worst-case approximation obtainable in a pure Nash equilibrium and that obtainable in a Bayes-Nash equilibrium.
This is joint work with Tim Roughgarden. It will appear in SODA 2011.
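As an illustration of the item-bidding mechanism described above, here is a minimal Python sketch with hypothetical names: each item goes to a highest per-item bidder, and welfare is evaluated under the bidders' true (possibly subadditive) valuations. Payments and tie-breaking are simplified and are my own assumptions, not details from the talk.

    def item_bidding_allocation(bids):
        """bids[i][j] = bidder i's declared additive bid for item j.
        Each item is awarded to a bidder with the highest bid on it
        (ties broken toward the lowest index)."""
        n, m = len(bids), len(bids[0])
        allocation = {i: set() for i in range(n)}
        for j in range(m):
            winner = max(range(n), key=lambda i: bids[i][j])
            allocation[winner].add(j)
        return allocation

    def social_welfare(allocation, valuations):
        """valuations[i] maps a frozenset of items to bidder i's true value,
        assumed subadditive: v(S u T) <= v(S) + v(T)."""
        return sum(valuations[i](frozenset(items))
                   for i, items in allocation.items())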
Abstract: We consider the problem of finding the worst-case joint distribution of n random variables with given marginals, where the worst-case distribution is defined as the joint distribution that maximizes the expected value of an objective function. We focus on bounding the correlation gap, defined as the approximation ratio that the independent (product) distribution achieves for this problem. We demonstrate that, in addition to stochastic optimization, this problem appears in the context of many deterministic optimization applications, where the correlation gap corresponds to the approximation ratio achieved by simple approximation algorithms based on independent selection. The current results on the approximation ratios achieved are largely based on the submodularity of the function involved. In this talk, we explore a characterization of non-submodular solvable cases using the concept of cross-monotone cost-sharing schemes from game theory. We give "near-tight" bounds on the correlation gap and show that our characterization captures the closeness to submodularity for this problem.
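A tiny worked example of the correlation gap, with parameters of my own choosing: two Bernoulli(1/2) marginals and the submodular objective f(x1, x2) = max(x1, x2). A one-parameter search over all joint distributions with these marginals gives a gap of 4/3, below the e/(e-1) bound known for monotone submodular functions.

    import numpy as np

    p = 0.5                              # common Bernoulli marginal
    f = lambda x1, x2: max(x1, x2)       # submodular objective on {0,1}^2

    def expected_f(q11, q10, q01, q00):
        return q11 * f(1, 1) + q10 * f(1, 0) + q01 * f(0, 1) + q00 * f(0, 0)

    # Independent (product) distribution with the given marginals.
    e_indep = expected_f(p * p, p * (1 - p), (1 - p) * p, (1 - p) ** 2)

    # All joints with these marginals form a one-parameter family indexed
    # by t = P(x1 = 1, x2 = 1); search it for the worst (maximizing) case.
    e_worst = max(expected_f(t, p - t, p - t, 1 - 2 * p + t)
                  for t in np.linspace(max(0.0, 2 * p - 1), p, 1001))

    print(e_worst / e_indep)             # correlation gap = 4/3 here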
Abstract: The L1 distance, also known as the Manhattan or taxicab distance,
between two vectors x, y is sum_i |x_i-y_i|. Approximating this
distance is a fundamental primitive on massive databases, with
applications to clustering, nearest neighbor search, network
monitoring, regression, and sampling. We give the first 1-pass
streaming algorithm for this problem in the turnstile model which
simultaneously achieves near-optimal space and update time.
Joint work with David Woodruff (IBM Almaden).
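This is not the algorithm of the talk, but a minimal sketch of the classical approach it improves upon: Indyk's 1-stable (Cauchy) linear sketch for L1. Because the sketch is linear, turnstile updates to x or y simply add to the sketch; the sketch size and names below are illustrative choices.

    import numpy as np

    def cauchy_sketch(vec, A):
        """Linear sketch A @ vec, where A has i.i.d. standard Cauchy entries.
        Linearity handles turnstile updates: sketch(x + delta) = sketch(x) + sketch(delta)."""
        return A @ vec

    def estimate_l1(sketch_x, sketch_y):
        """By 1-stability, each coordinate of (sketch_x - sketch_y) is distributed
        as ||x - y||_1 times a standard Cauchy variable, whose absolute value has
        median 1, so the median of the absolute differences estimates the L1 distance."""
        return float(np.median(np.abs(sketch_x - sketch_y)))

    rng = np.random.default_rng(0)
    n, k = 1000, 400                        # dimension and sketch size
    A = rng.standard_cauchy(size=(k, n))    # shared random projection
    x, y = rng.normal(size=n), rng.normal(size=n)
    print(estimate_l1(cauchy_sketch(x, A), cauchy_sketch(y, A)),
          np.sum(np.abs(x - y)))            # estimate vs. exact L1 distance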
Abstract: Finding the length of the longest increasing subsequence (LIS) is a classic algorithmic problem. A small example should explain the problem. In the array 1 9 4 10 6 7, the LIS has length 4 and is 1 4 6 7.
Let n denote the size of the input array. Simple textbook solutions achieve an O(n log n) running time, using dynamic programming. What can a sublinear time algorithm say about the LIS? For any constant delta > 0, we construct a polylogarithmic time randomized algorithm that estimates the length of the LIS within an additive error of (delta n). Previously, the best known polylogarithmic time algorithms could only achieve an additive n/2 approximation.
Why is this problem challenging? The LIS length is the output of a dynamic program, which means that unless we solve many (read: linearly many) subproblems, we cannot get the exact LIS length. We are looking to get extremely accurate (read: delta n) approximations in *polylogarithmic time*. The algorithm we construct attempts to follow the progress of a (top-down) dynamic program. In polylogarithmic time, it zeroes in on the "important" subproblems to solve. In essence, it is a sublinear-time divide-and-conquer algorithm. This requires some new sublinear-algorithm ideas, which we hope will have applications for approximating other dynamic programs.
Joint work with Michael Saks. (Appeared in FOCS 2010)
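For reference, a minimal sketch of the textbook O(n log n) exact computation mentioned above (patience sorting with binary search), run on the example array from the abstract; this is of course not the sublinear-time estimator presented in the talk.

    from bisect import bisect_left

    def lis_length(a):
        """Exact LIS length in O(n log n): tails[k] holds the smallest possible
        last element of a strictly increasing subsequence of length k + 1."""
        tails = []
        for v in a:
            i = bisect_left(tails, v)
            if i == len(tails):
                tails.append(v)      # v extends the longest subsequence found so far
            else:
                tails[i] = v         # v is a smaller tail for length i + 1
        return len(tails)

    print(lis_length([1, 9, 4, 10, 6, 7]))   # 4, e.g. the subsequence 1 4 6 7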
A result from the 1950s, Blackwell's Approachability Theorem, showed what is essentially a generalization of von Neumann's minimax theorem to "multi-dimensional payoffs". More recently, there has been a lot of work on constructing what are known as "no-regret algorithms", which provide a strong guarantee when optimizing over an arbitrary sequence of convex functions. I'll show that these two results, the existence of no-regret algorithms and Blackwell approachability, are essentially equivalent in a very strong sense. I'll also describe a nice application of the result, namely constructing efficient forecasters for calibration.
Joint work with Peter Bartlett and Elad Hazan.
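As background for the no-regret side of the equivalence, here is a minimal sketch of one standard no-regret algorithm, online projected gradient descent over a Euclidean ball. The step sizes, feasible set, and example losses are my own illustrative choices; this is not the approachability reduction from the talk.

    import numpy as np

    def online_gradient_descent(grads, dim, radius=1.0):
        """Play points x_1, x_2, ... in the ball of the given radius.
        grads yields a (sub)gradient oracle g_t per round; with step size
        ~ 1/sqrt(t), regret against any fixed point in the ball is O(sqrt(T))."""
        x = np.zeros(dim)
        for t, g in enumerate(grads, start=1):
            yield x.copy()                     # commit to x_t before seeing the loss
            x = x - (1.0 / np.sqrt(t)) * g(x)  # gradient step on the revealed loss
            norm = np.linalg.norm(x)
            if norm > radius:                  # project back onto the ball
                x *= radius / norm

    # Example: convex losses f_t(x) = |<a_t, x> - b_t| revealed one per round.
    rng = np.random.default_rng(0)
    data = [(rng.normal(size=3), rng.normal()) for _ in range(100)]
    grads = (lambda x, a=a, b=b: np.sign(a @ x - b) * a for a, b in data)
    plays = list(online_gradient_descent(grads, dim=3))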