Speaker: Gillat Kol

Date: Tuesday, November 05, 2013

Time: 4:15 PM to 5:15 PM (Note: all times are in the Eastern Time Zone)

Refreshments: 3:45 PM

Public: Yes

Location: 32-G882

Host: Dana Moshkovitz and Costis Daskalakis

Contact: Holly A Jones

Abstract: In a profoundly influential 1948 paper, Claude Shannon defined the entropy function H and showed that the capacity of a binary symmetric channel with noise rate (bit-flip rate) eps is 1-H(eps). This means that one can reliably communicate n bits by sending roughly n / (1-H(eps)) bits over this channel. The extensive study of interactive communication protocols in the last few decades gives rise to the related question of finding the capacity of a noisy channel when it is used interactively. We define interactive channel capacity as the minimal ratio between the communication required to compute a function over a noise-free channel and the communication required to compute the same function over the eps-noisy channel. We show that the interactive channel capacity is roughly 1-Theta( sqrt(H(eps)) ). Since sqrt(H(eps)) is much larger than H(eps) for small eps, our result gives the first separation between interactive and non-interactive channel capacity. Joint work with Ran Raz.
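The quantities the abstract builds on can be computed directly. Below is a small illustrative sketch (Python; the function names are my own, not from the talk) of Shannon's binary entropy H, the capacity 1-H(eps) of the binary symmetric channel, and the resulting one-way cost n / (1-H(eps)) of reliably sending n bits:

```python
import math

def binary_entropy(p):
    """Shannon's binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # limit value: 0*log(0) is taken to be 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Shannon capacity 1 - H(eps) of the binary symmetric channel with flip rate eps."""
    return 1.0 - binary_entropy(eps)

def one_way_cost(n, eps):
    """Approximate number of channel uses to reliably send n bits: n / (1 - H(eps))."""
    return n / bsc_capacity(eps)

# For eps = 0.01, the capacity is about 0.92, so sending 1000 bits
# costs roughly 1088 uses of the noisy channel.
for eps in (0.001, 0.01, 0.1):
    print(f"eps={eps}: H(eps)={binary_entropy(eps):.4f}, "
          f"capacity={bsc_capacity(eps):.4f}, cost(1000)={one_way_cost(1000, eps):.0f}")
```

The Kol-Raz result says that in the interactive setting the achievable rate degrades to 1-Theta(sqrt(H(eps))), which for small eps is strictly below the one-way capacity computed above (the hidden Theta constants are not reproduced here).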

See other events that are part of the Theory of Computation Colloquium - 2013.

Created by Holly A Jones on Wednesday, October 30, 2013 at 10:18 AM.