Seminars in May 2015

  • [FYI] KAIST Theory Day (May 31, 2015)

    FYI: KAIST Theory Day, organized by Jinwoo Shin and Eric Vigoda.

    KAIST Theory Day
    2015/5/31 Sun 10AM-5PM (Building N1, room 102, KAIST)
    9:30 – Coffee and refreshments
    10:00 – Daniel Stefankovic (Rochester):  Spin models: connections between complexity, phase transition, belief propagation, and induced matrix norms
    11:00 – Yitong Yin (Nanjing):  Phase transition of hypergraph matchings
    12:00 – Lunch break
    1:30 – Euiwoong Lee (CMU):  Hardness of Graph Pricing through Generalized Max-Dicut
    2:30 – Sewoong Oh (UIUC):  Data processing inequality for differential privacy and applications
    3:30 – Coffee break
    4:00 – Martin Ziegler (TU Darmstadt):  Algebraic Complexity Theory and Quantum Logic

    Speaker: Daniel Stefankovic

    Title: Spin models: connections between complexity, phase transition, belief propagation, and induced matrix norms

    What are the typical configurations of a spin model (for example, the Ising model or the Potts model) on a random regular graph? We show that the answer to this question is characterized by tree recursions (belief propagation) and p→q induced operator matrix norms.

    Understanding the typical configurations allows us to show hardness of approximating the partition function for certain multispin models (for example, the Potts model) on d-regular graphs in the non-uniqueness regime (that is, when the Gibbs measure on the infinite d-regular tree is not unique). Joint work with Andreas Galanis and Eric Vigoda.

    Speaker: Yitong Yin

    Title: Phase transition of hypergraph matchings

    We study the problem of approximately counting weighted matchings in hypergraphs of bounded maximum edge size and maximum degree. The problem is expressed as evaluating the partition function, the weighted sum over all matchings of a hypergraph, where each matching M is assigned a weight λ^|M| in terms of a fixed activity parameter λ. This model unifies two important problems in approximate counting arising from statistical physics: counting independent sets (the hardcore model) and counting matchings (the monomer-dimer model).
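
    The partition function above can be made concrete with a tiny brute-force sketch (a hypothetical illustration, not from the talk): enumerate all subsets of hyperedges, keep those whose edges are pairwise disjoint (the matchings), and sum λ^|M|.

```python
from itertools import combinations

def partition_function(edges, lam):
    # Z(lam) = sum over all matchings M of lam^|M|, where a matching is a
    # set of pairwise vertex-disjoint hyperedges. Exponential time; only
    # meant for tiny example instances.
    edges = [frozenset(e) for e in edges]
    total = 0.0
    for r in range(len(edges) + 1):
        for subset in combinations(edges, r):
            # keep only subsets whose edges share no vertex
            if all(a.isdisjoint(b) for a, b in combinations(subset, 2)):
                total += lam ** r
    return total

# Made-up 3-uniform hypergraph: the matchings are {}, the three single
# edges, and the disjoint pair {(0,1,2), (3,4,5)}, so Z(1) = 5.
print(partition_function([(0, 1, 2), (2, 3, 4), (3, 4, 5)], 1.0))  # 5.0
```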

    We show that for hypergraph matchings the critical activity λc = d^d / (k(d-1)^(d+1)) is the uniqueness threshold for the Gibbs measure on the (d+1)-uniform (k+1)-regular hyper-tree, and that when λ < λc, hypergraphs of maximum edge size at most d+1 and maximum degree at most k+1 exhibit strong spatial mixing. As a consequence, there is an FPTAS for counting matchings in hypergraphs of bounded maximum edge size and bounded maximum degree as long as the uniqueness condition is satisfied.
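
    Reading the threshold as λc = d^d / (k(d-1)^(d+1)) (reconstructed from the abstract's garbled notation), a one-line helper shows how it scales; for k = 1 this expression coincides with the classic hardcore uniqueness threshold on the infinite (d+1)-regular tree.

```python
def critical_activity(d, k):
    # lambda_c = d^d / (k * (d-1)^(d+1)); requires d >= 2.
    return d ** d / (k * (d - 1) ** (d + 1))

print(critical_activity(3, 1))  # 27/16 = 1.6875
print(critical_activity(3, 2))  # half of the above, since lambda_c ~ 1/k
```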

    As for inapproximability, it can easily be shown that there is no FPRAS for the problem when λ > 2λc, unless NP = RP. This leaves an intriguing gap between λc and 2λc. A key step toward a tight inapproximability result is the local weak convergence from a sequence of finite graphs to the extremal measures at the uniqueness threshold on the infinite tree. For hypergraph matchings, we discover that such local weak convergence does not exist for any sequence of finite hypergraphs, which indicates that approximate counting on hypergraphs (or general CSPs) contains much richer structure yet to be understood.

    Speaker: Euiwoong Lee

    Title: Hardness of Graph Pricing through Generalized Max-Dicut

    The Graph Pricing problem is among the fundamental problems whose approximability is not well-understood. While there is a simple combinatorial ¼-approximation algorithm, the best hardness result remains at ½ assuming the Unique Games Conjecture (UGC). We show that it is NP-hard to approximate within a factor better than ¼ under the UGC, so that the simple combinatorial algorithm might be the best possible.

    This work is based on an effort to view the Graph Pricing problem as a Constraint Satisfaction Problem (CSP) simpler than the standard, complicated formulation. We propose a problem called Generalized Max-Dicut(T), which has domain size T + 1 for every T ≥ 1. Generalized Max-Dicut(1) is the well-known Max-Dicut problem. Besides its connection to Graph Pricing, the hardness of Generalized Max-Dicut is interesting in its own right: for most arity-two CSPs studied in the literature, SDP-based algorithms perform better than LP-based or combinatorial algorithms, whereas for this arity-two CSP a simple combinatorial algorithm does best.
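
    For the T = 1 case (plain Max-Dicut), the objective can be shown with a brute-force sketch on a made-up digraph (an illustration, not the paper's algorithm): pick a vertex set S maximizing the number of arcs leaving S.

```python
from itertools import product

def max_dicut(n, arcs):
    # Brute-force Max-Dicut: over all S subset of V, count arcs (u, v)
    # with u in S and v outside S; return the maximum. Exponential in n.
    best = 0
    for in_s in product((False, True), repeat=n):
        cut = sum(1 for (u, v) in arcs if in_s[u] and not in_s[v])
        best = max(best, cut)
    return best

# Directed triangle: any choice of S cuts at most one arc.
print(max_dicut(3, [(0, 1), (1, 2), (2, 0)]))  # 1
```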

    Speaker: Sewoong Oh

    Title: Data processing inequality for differential privacy and applications

    We provide a hypothesis testing interpretation of differential privacy and derive natural forward and reverse data processing inequalities. These inequalities are very useful in deriving tight impossibility results, as demonstrated by two applications: composing multiple queries and multi-party computation. The impossibility results hold for arbitrary parameter settings (privacy levels, number of queries, etc.) and for both standard and approximate differential privacy settings. Further, these impossibility results cannot be improved upon.

    Speaker: Martin Ziegler

    Title: Algebraic Complexity Theory and Quantum Logic

    Algebraic models of computation (like register machines) perform one operation per step regardless of the values processed. They underlie algorithms for sorting and in computational geometry. As opposed to bit models, algebraic methods make it possible to establish non-trivial lower bounds:

    We recall Morgenstern’s elegant proof that the Fast Fourier Transform is optimal; we then proceed to structural complexity theory and prove the quantum satisfiability problem complete for NP_R^0, a well-known discrete complexity class between NP and PSPACE. (Joint work with Christian Herrmann.)

  • Pinyan Lu, Optimal Competitive Auctions

    Optimal Competitive Auctions
    Pinyan Lu (陆品燕)
    Theory Group, Microsoft Research Asia, Beijing, China
    2015/5/19 Tue 4PM-5PM
    We study the design of truthful auctions for selling identical items in unlimited supply (e.g., digital goods) to n unit-demand buyers. This classic problem stands out from the profit-maximizing auction design literature as it requires no probabilistic assumptions on buyers’ valuations and employs the framework of competitive analysis. Our objective is to optimize the worst-case performance of an auction, measured by the ratio between a given benchmark and the revenue generated by the auction.
    We establish a necessary and sufficient condition that characterizes competitive ratios for all monotone benchmarks. The characterization identifies the worst-case distribution over instances and reveals intrinsic relations between competitive ratios and benchmarks in competitive analysis. With this characterization at hand, we derive optimal competitive auctions for two natural benchmarks.
    The most well-studied benchmark, F^2, measures the envy-free optimal revenue where at least two buyers win. Goldberg et al. showed a sequence of lower bounds on the competitive ratio, one for each number of buyers n, and conjectured that all of these bounds are tight. We show that optimal competitive auctions match these bounds. Thus, we confirm the conjecture and settle a central open problem in the design of digital goods auctions. As another application, we examine an economically meaningful benchmark that measures the optimal revenue across all limited-supply Vickrey auctions. We identify the optimal competitive ratio to be (n/(n-1))^(n-1) - 1 for each number of buyers n, approaching e - 1 as n tends to infinity.
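    Reading the stated ratio as (n/(n-1))^(n-1) - 1 (reconstructed from the garbled expression, consistent with the e - 1 limit), a quick numeric sketch shows the convergence:

```python
import math

def ratio(n):
    # Competitive ratio for n buyers, read as (n/(n-1))^(n-1) - 1.
    return (n / (n - 1)) ** (n - 1) - 1

# The ratio increases with n and approaches e - 1 from below.
for n in (2, 10, 100, 10000):
    print(n, ratio(n))
print("e - 1 =", math.e - 1)
```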

    Joint work with Ning Chen and Nick Gravin.

  • Jinwoo Kim (김진우), Stable Matching in Large Economies

    Stable Matching in Large Economies
    Jinwoo Kim
    Department of Economics, Seoul National University, Seoul, Korea
    2015/5/12 Tue 4PM-5PM
    Complementarities of preferences have been known to jeopardize the stability of two-sided matching, and they are a pervasive feature of many markets. We revisit the stability issue with such preferences in a large market.
    Workers have preferences over firms while firms have preferences over distributions of workers and may exhibit complementarity. We demonstrate that if each firm’s choice changes continuously as the set of available workers changes, then there exists a stable matching even with complementarity. Building on this result, we show that there exists an approximately stable matching in any large finite economy. We extend our framework to accommodate indifferences in firms’ preferences, construct a stable mechanism that is strategy-proof and equitable for workers perceived as indifferent by firms, and apply the analysis to probabilistic and time-share matching models with a finite number of firms and workers.