Speaker: Gillat Kol, Institute for Advanced Study
Title: Exponential Separation of Information and Communication
In profoundly influential works, Shannon, Fano and Huffman showed that if Alice wants to send a message X to Bob, it suffices for her to send roughly H(X) bits (in expectation), where H denotes Shannon's entropy function. In other words, every message can be compressed to its information content. Can we prove similar results in the interactive setting, where Alice and Bob engage in a communication protocol? The standard way to formalize this question is to ask whether the communication complexity of a communication task is always (roughly) bounded by its information complexity. Prior to our work, no separation between communication and information complexity was known.
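As an illustrative sketch (not part of the talk), the one-way compression fact can be checked numerically: for a small distribution, an optimal Huffman prefix code achieves expected codeword length within one bit of the entropy H(X), and exactly H(X) when the probabilities are powers of two.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution {symbol: prob}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    """Codeword lengths {symbol: length} of an optimal Huffman prefix code."""
    # Heap entries: (probability, tiebreak counter, {symbol: current depth}).
    heap = [(q, i, {s: 0}) for i, (s, q) in enumerate(p.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least likely subtrees, deepening each by 1.
        q1, _, d1 = heapq.heappop(heap)
        q2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (q1 + q2, counter, merged))
        counter += 1
    return heap[0][2]

# Dyadic distribution: Huffman's expected length meets H(X) exactly.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
H = entropy(p)
L = sum(p[s] * length for s, length in huffman_lengths(p).items())
print(H, L)  # both 1.75 bits
```

The interactive question the talk addresses is whether an analogous compression (protocol communication down to protocol information) is always possible.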
We answer the above question in the negative: We show an exponential gap between communication complexity and information complexity, by giving an explicit example of a Boolean function with information complexity at most O(k) and distributional communication complexity at least 2^k. This shows that a communication protocol cannot always be compressed to its internal information. By a result of Braverman, this gap is the largest possible. By a result of Braverman and Rao, our example shows a gap between communication complexity and amortized communication complexity, implying that strong direct sum does not hold for the distributional communication complexity of Boolean functions, answering a long-standing open problem.