INTERACTIVE CHANNEL CAPACITY
Date: Tuesday, November 05, 2013
Time: 4:15 PM to 5:15 PM (all times are in the Eastern Time Zone)
Refreshments: 3:45 PM
Hosts: Dana Moshkovitz and Costis Daskalakis
Contact: Holly A Jones, firstname.lastname@example.org
Relevant URL: http://toc.csail.mit.edu/node/395
Speaker URL: None
Abstract: In a profoundly influential 1948 paper, Claude Shannon defined the entropy function H and showed that the capacity of a binary symmetric channel with noise rate (bit-flip rate) eps is 1 - H(eps). This means that one can reliably communicate n bits by sending roughly n / (1 - H(eps)) bits over this channel. The extensive study of interactive communication protocols over the last few decades raises the related question of the capacity of a noisy channel when it is used interactively. We define the interactive channel capacity as the minimal ratio between the communication required to compute a function (over a noiseless channel) and the communication required to compute the same function over the eps-noisy channel. We show that the interactive channel capacity is roughly 1 - Theta(sqrt(H(eps))). Our result gives the first separation between interactive and non-interactive channel capacity.

Joint work with Ran Raz.
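To make the two capacity formulas concrete, here is a small numerical sketch (not part of the talk) comparing the classical one-way capacity 1 - H(eps) with the interactive bound 1 - Theta(sqrt(H(eps))). The constant hidden inside the Theta is not specified by the result, so the value c = 1 used below is an arbitrary illustrative assumption, chosen only to show the asymptotic shape.

```python
import math

def binary_entropy(eps: float) -> float:
    """Shannon's binary entropy H(eps) in bits; H(0) = H(1) = 0 by convention."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def one_way_capacity(eps: float) -> float:
    """Classical capacity of the binary symmetric channel: 1 - H(eps)."""
    return 1.0 - binary_entropy(eps)

def interactive_capacity_sketch(eps: float, c: float = 1.0) -> float:
    """Illustrative stand-in for 1 - Theta(sqrt(H(eps))).

    The constant c inside the Theta is NOT given by the theorem; c = 1 is
    an arbitrary choice made only to visualize the asymptotic behavior.
    """
    return 1.0 - c * math.sqrt(binary_entropy(eps))

for eps in (0.001, 0.01, 0.05, 0.1):
    print(f"eps = {eps:5.3f}: 1 - H(eps) = {one_way_capacity(eps):.4f}, "
          f"1 - sqrt(H(eps)) = {interactive_capacity_sketch(eps):.4f}")
```

As eps tends to 0, H(eps) tends to 0, so sqrt(H(eps)) is much larger than H(eps); the interactive bound therefore approaches capacity 1 noticeably more slowly than the classical one, which is the quantitative content of the separation.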