Location: Gates 4B lobby.
Mailing list: thseminar@cs.stanford.edu. Email Qiqi to sign up.
Contact: Qiqi Yan (qiqiyan@stanford...)
We revisit the problem of designing the profit-maximizing single-item auction, solved by Myerson in his seminal paper for the case in which bidder valuations are independently distributed. We focus on general joint distributions, seeking the optimal deterministic incentive compatible auction. We give a geometric characterization of the optimal auction through a duality theorem, resulting in an efficient algorithm for finding the optimal deterministic auction in the two-bidder case and an inapproximability result for three or more bidders.
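For background (standard material, not part of the new results): in the independent case, with valuation c.d.f. F_i and density f_i for bidder i, Myerson's optimal auction awards the item to the bidder with the highest nonnegative virtual value. In the regular case, where each \varphi_i is increasing, this takes the form

\[
  \varphi_i(v_i) \;=\; v_i \;-\; \frac{1 - F_i(v_i)}{f_i(v_i)},
  \qquad
  \text{winner} \in \arg\max_i \bigl\{ \varphi_i(v_i) \;:\; \varphi_i(v_i) \ge 0 \bigr\},
\]

with the item left unsold if every virtual value is negative. The question addressed in the talk is what replaces this recipe for general joint distributions.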
Joint work with Christos Papadimitriou.
Inspired by our work on output bidding [1], we propose a new sponsored search bidding language that provides advertisers with the following two options:
1. Advertisers can specify "targets", e.g., advertiser x wants his ad to be displayed only if y's ad is displayed.
2. Advertisers can specify "conflicts", e.g., advertiser x does not want his ad to be displayed if y's ad is displayed.
First, I will motivate these two options with some examples. Then, I will present the challenges in determining the auction winners and designing a truthful pricing mechanism.
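To make the winner-determination challenge concrete, here is a minimal brute-force sketch (the single-slate model, function names, and example numbers are illustrative assumptions, not the mechanism from the talk):

from itertools import combinations

# Toy model of the "targets"/"conflicts" bidding language: each
# advertiser has a bid, a set of targets (ads that must also be shown
# if this ad is shown), and a set of conflicts (ads that must not be
# shown alongside it). Illustrative only.

def feasible(shown, targets, conflicts):
    """Check that a set of displayed ads respects all constraints."""
    for ad in shown:
        if not targets.get(ad, set()) <= shown:   # a target is missing
            return False
        if conflicts.get(ad, set()) & shown:      # a conflict is shown
            return False
    return True

def best_slate(bids, targets, conflicts, slots):
    """Brute-force winner determination: the bid-maximizing feasible
    slate of at most `slots` ads. Exponential in the number of ads,
    which already hints at the computational challenge."""
    best, best_value = frozenset(), 0
    for k in range(1, slots + 1):
        for slate in map(frozenset, combinations(bids, k)):
            value = sum(bids[a] for a in slate)
            if value > best_value and feasible(slate, targets, conflicts):
                best, best_value = slate, value
    return best, best_value

# Example: x targets y, z conflicts with y.
bids = {"x": 5, "y": 3, "z": 4}
targets = {"x": {"y"}}
conflicts = {"z": {"y"}}
print(best_slate(bids, targets, conflicts, slots=2))  # ({'x', 'y'}, 8)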
This talk describes work in progress, and any feedback would be greatly appreciated.
(joint work with Hector Garcia-Molina)
[1] Panagiotis Papadimitriou, Hector Garcia-Molina, Ali Dasdan, and Santanu Kolay. Output URL Bidding. In: 37th International Conference on Very Large Data Bases (VLDB), Seattle, WA, USA, Aug 29 - Sep 3, 2011.
Affine and two-source extractors and dispersers (also known as bipartite Ramsey graphs) are fundamental objects studied in the context of derandomization. We show how to construct two-source extractors and dispersers for arbitrarily small min-entropy rate in a black-box manner from affine extractors with sufficiently good parameters. We present two black-box constructions. Both constructions work for min-entropy rate less than half. One of them can potentially reach arbitrarily small min-entropy rate, provided the affine extractor used to construct it outputs a relatively large number of bits on affine sources of min-entropy rate half and has sufficiently small error. Our analysis relies on the study of approximate duality, a concept related to the polynomial Freiman-Ruzsa conjecture (PFR) from additive combinatorics.
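For readers less familiar with the terminology, the standard definitions (background, not specific to this work) are:

% Min-entropy and min-entropy rate of a source X over {0,1}^n:
\[
  H_\infty(X) \;=\; \min_{x} \log_2 \frac{1}{\Pr[X = x]},
  \qquad
  \mathrm{rate}(X) \;=\; \frac{H_\infty(X)}{n}.
\]
% A two-source extractor Ext for min-entropy k with error eps:
% for all independent X, Y over {0,1}^n with min-entropy at least k,
\[
  \Delta\bigl(\mathrm{Ext}(X, Y),\, U_m\bigr) \;\le\; \varepsilon,
\]
% where \Delta is statistical distance and U_m is uniform on {0,1}^m.
% A disperser relaxes "eps-close to uniform" to "the output hits almost
% every value"; an affine extractor is the analogous one-source object
% for sources uniform on affine subspaces of F_2^n.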
For more details see: http://eccc.hpi-web.de/report/2010/144/
Joint work with Eli Ben-Sasson.
To appear in STOC 2011
Motivated by spatial encryption, we study which partially-ordered sets (and "hierarchies", a variant of partially-ordered sets) can be embedded in the poset of affine subspaces of an n-dimensional space. In particular, we will show that if all but n << N dimensions of an N-dimensional space go unused, then the space can be compressed to an O(n)-dimensional space.
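To fix terminology (a standard definition, included for convenience; the talk's "hierarchies" variant is not formalized here), an embedding of a poset P into a poset Q is an injective map that preserves and reflects order:

\[
  \phi : P \hookrightarrow Q,
  \qquad
  x \le_P y \;\Longleftrightarrow\; \phi(x) \le_Q \phi(y),
\]

where here Q is the collection of affine subspaces of an n-dimensional space, ordered by inclusion.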
The convexity assumptions required for the Arrow-Debreu theorem are reasonable and realistic for preferences; however, they are highly problematic for production because they rule out economies of scale. We take a complexity-theoretic look at economies with non-convex production. It is known that in such markets equilibrium prices may not exist; we show that it is an intractable problem to achieve Pareto efficiency, the fundamental objective achieved by equilibrium prices. The same is true for core efficiency or any one of an array of concepts of stability, with the degree of intractability ranging from F\Delta_2^P-completeness to PSPACE-hardness. We also identify a novel phenomenon that we call complexity equilibrium, in which agents quiesce not because there is no way for any one or group of them to improve their situation, but because discovering the changes necessary for (individual or group) improvement is intractable. In fact, we exhibit a somewhat natural distribution of economies that has an average-case hard complexity equilibrium.
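For reference, the efficiency notion in question is the standard one: an outcome x is Pareto efficient if no feasible alternative makes every agent at least as well off and some agent strictly better off, i.e.,

\[
  \neg\,\exists\, y \ \text{feasible}: \quad
  u_i(y) \ge u_i(x)\ \text{for all } i
  \quad\text{and}\quad
  u_j(y) > u_j(x)\ \text{for some } j.
\]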
We consider a market model where identical copies of a good are sold through a sequence of second price auctions. Each agent in the market has an unknown independent private valuation which determines the distribution of the reward she obtains from the good; for example, in sponsored search settings, advertisers may initially be unsure of the value of a click. Though the induced dynamic game is complex, we simplify the analysis using an approximation methodology known as *mean field equilibrium* (MFE), where agents optimize only with respect to long run average estimates of the distribution of other players' bids. We show a remarkable fact: in a mean field equilibrium, the agent has an optimal strategy where she bids truthfully according to a *conjoint valuation*. The conjoint valuation is the sum of her current expected valuation and an overbid amount that is exactly the expected marginal benefit of one additional observation about her true private valuation. Under mild conditions on the model, we show that an MFE exists, and that it is a good approximation to a rational agent's behavior as the number of agents increases.
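Schematically (the symbols h_t and \Gamma here are our shorthand, not the paper's notation), an agent with observation history h_t bids truthfully with respect to the conjoint valuation

\[
  b_t \;=\;
  \underbrace{\mathbb{E}\bigl[\, v \mid h_t \,\bigr]}_{\text{current expected valuation}}
  \;+\;
  \underbrace{\Gamma(h_t)}_{\text{expected marginal benefit of one more observation}} .
\]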
Joint work with Ramesh Johari and Mukund Sundararajan
We investigate search engines' mechanisms for allocating impressions generated by different search terms. In practice, the number of search terms is so large that an advertiser cannot possibly communicate to the search engine all the search terms he wishes to bid on. For example, a travel agency is interested in all search terms pertaining to flights, including "flight to Boston", "ticket to SFO", "cheap airfare", etc. Therefore, the search engine introduces broad match keywords as a bidding language that allows an advertiser to submit a bid for multiple GSP auctions (search terms) at once. However, with broad match keywords, the GSP auctions for different search terms are no longer independent, i.e., an advertiser's bid in one auction may depend on his bid in another auction.
We propose the broad match mechanism as a model that captures this aspect of the multi-keyword sponsored search mechanism. We identify two properties of broad match keywords, namely expressiveness and homogeneity, that characterize the performance of the broad match mechanism. Our price of anarchy (PoA) analysis allows us to explore trade-offs between the two properties.
In short, we investigate a key aspect of the sponsored search mechanism that is orthogonal to the GSP auction, and our analysis reveals interesting mechanism design trade-offs.
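As a toy illustration of how a single broad match bid enters many per-query auctions, consider the sketch below (the matching rule, single slot, second-price payment, and all names are simplifying assumptions, not the mechanism analyzed in the talk):

def matches(keyword, query):
    """Assumed matching rule: every word of the keyword occurs in the query."""
    return set(keyword.split()) <= set(query.split())

def run_auctions(queries, bids):
    """bids: {advertiser: (broad_match_keyword, bid)}. Each query runs a
    single-slot second-price auction among the matching bids."""
    results = {}
    for q in queries:
        entrants = sorted(((bid, adv) for adv, (kw, bid) in bids.items()
                           if matches(kw, q)), reverse=True)
        if entrants:
            winner = entrants[0][1]
            price = entrants[1][0] if len(entrants) > 1 else 0
            results[q] = (winner, price)
    return results

queries = ["cheap flight to boston", "flight ticket", "hotel in boston"]
bids = {"travel_agency": ("flight", 4),
        "airline": ("flight ticket", 5),
        "hotel_chain": ("boston", 3)}
print(run_auctions(queries, bids))
# One bid per advertiser participates in several auctions at once, so
# the per-query auctions are coupled through the shared bids.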
We discuss the problem of efficiently computing exact and approximate shortest paths in graphs, with the main focus on shortest path query processing. Strategies for answering shortest path queries may involve pre-computed data structures (often called distance oracles) in order to improve the query time. Designing a shortest path query processing method raises questions such as: How can these data structures be computed efficiently? What amount of storage is necessary? How much improvement of the query time is possible? How good is the approximation quality of the query result? What are the trade-offs between pre-computation time, storage, query time, and approximation quality?

We survey the state of the art of distance oracles for planar graphs, with a focus on our recent results (joint work with Ken-ichi Kawarabayashi, Philip N. Klein, and Shay Mozes) on exact and approximate distance oracles for planar graphs:

- For any space parameter S in [n lg lg n, n^2], there is an exact distance oracle with space O(S) and query time ~O(n/sqrt(S)).
- For any eps > 0, there is a (1 + eps)-approximate distance oracle with space O(n) and query time O(eps^{-2} lg^2 n).
- For unweighted planar graphs, for any eps > 0, there is a (1 + eps)-approximate distance oracle with space O(n eps^{-1} lg lg n lg* n) and query time O(lg lg lg n + lg* n lg^2(1/eps) + eps^{-1}).
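To make the space/query-time trade-off tangible, here is the trivial extreme point of the curve: precompute all-pairs distances for space O(n^2) and query time O(1). The results above show how far below n^2 space one can go in planar graphs while keeping queries fast. The adjacency-list representation and Dijkstra preprocessing below are illustrative choices.

import heapq

class NaiveDistanceOracle:
    """The trivial exact distance oracle: O(n^2) space, O(1) queries.
    Preprocessing runs Dijkstra from every vertex; the oracles in the
    talk exploit planarity to get far below n^2 space."""

    def __init__(self, graph):
        # graph: {u: [(v, weight), ...]} with nonnegative edge weights
        self.dist = {u: self._dijkstra(graph, u) for u in graph}

    @staticmethod
    def _dijkstra(graph, source):
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue  # stale heap entry
            for v, w in graph[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(heap, (d + w, v))
        return dist

    def query(self, u, v):
        return self.dist[u].get(v, float("inf"))

# Example: a 4-cycle with unit weights; the distance from 0 to 2 is 2.
g = {0: [(1, 1), (3, 1)], 1: [(0, 1), (2, 1)],
     2: [(1, 1), (3, 1)], 3: [(2, 1), (0, 1)]}
print(NaiveDistanceOracle(g).query(0, 2))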