Speaker: Denis Pankratov, University of Chicago

Title: Exact Communication Complexity via Information Complexity

Abstract:

The disjointness function is DISJ(X,Y) = NOT [OR_{i=1}^{n} (X_i AND Y_i)], where X, Y \in {0, 1}^n. Disjointness is one of the most studied functions in communication complexity. Let R(DISJ) denote the randomized communication complexity of disjointness. The original proof that R(DISJ) = \Omega(n) (due to Kalyanasundaram and Schnitger, 1992) and its simplification (due to Razborov, 1992) were combinatorial in nature. The first information-theoretic proof of this lower bound is due to Bar-Yossef et al. (2004), who used the notion of information complexity to give a yet simpler proof of the \Omega(n) lower bound.
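As a concrete illustration of the definition above (a minimal sketch; the function name and 0/1-vector representation are my own, not from the talk), DISJ can be evaluated as:

```python
def disj(x, y):
    """Disjointness: returns 1 iff the sets whose characteristic
    0/1 vectors are x and y share no element, i.e.
    NOT [OR_i (x_i AND y_i)]."""
    assert len(x) == len(y)
    return int(not any(xi and yi for xi, yi in zip(x, y)))

# x = {0, 2} and y = {1, 3} as characteristic vectors: disjoint
print(disj([1, 0, 1, 0], [0, 1, 0, 1]))  # 1
# both sets contain index 2: intersecting
print(disj([1, 0, 1, 0], [0, 0, 1, 1]))  # 0
```

In the communication setting, of course, X is held by Alice and Y by Bob, and the question is how many bits they must exchange to compute this value.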

In this talk, I shall demonstrate how to use the notion of information complexity to derive an exact (up to lower order terms) bound on the randomized communication complexity of disjointness:

R(DISJ) = C_{DISJ} \cdot n ± o(n), where C_{DISJ} = 0.4827...

Information complexity has the direct sum property, which can be applied to the disjointness function. Thus most of the work in deriving the above result is in computing the exact expression for the information complexity of the 2-bit AND function. There are several ways in which this task can be accomplished. My collaborators (Braverman, Garg, and Weinstein) and I described one such approach in the paper titled ``From Information to Exact Communication'' (STOC 2013). In this talk I shall take an alternative approach that ties information complexity to the theory of partial differential equations.
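Schematically (an informal rendering of the two steps described above, with notation of my own choosing), the reduction from disjointness to the 2-bit AND function reads:

```latex
% Direct sum: the information complexity of DISJ on n coordinates
% decomposes into n copies of the 2-bit AND function.
\mathrm{IC}(\mathrm{DISJ}_n) \;=\; n \cdot \mathrm{IC}(\mathrm{AND}) \;\pm\; o(n),
% so that the exact constant in the communication bound is
C_{\mathrm{DISJ}} \;=\; \mathrm{IC}(\mathrm{AND}) \;=\; 0.4827\ldots
```

The remaining work is then to pin down IC(AND) exactly, which is where the PDE-based approach of the talk enters.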

This talk is based on joint work with Mark Braverman, Ankit Garg, and Omri Weinstein.