Speaker: Paul Valiant, Brown University

Date: Tuesday, November 12, 2013

Time: 4:15 PM to 5:15 PM (all times are in the Eastern Time Zone)

Refreshments: 3:45 PM

Public: Yes

Location: G449

Hosts: Constantinos Daskalakis and Dana Moshkovitz

Contact: Holly A Jones

The amount of data being routinely processed by computers is exploding, and this puts a new demand on the algorithms community: do our algorithms make efficient use of their data? Many familiar problems can be reexamined in this light to yield significant and practical improvements. A small asymptotic improvement, say from needing terabytes of data to merely gigabytes, effectively cuts costs by a factor of a thousand in a data-driven world.

In this talk we consider a fundamental hypothesis testing problem: given data drawn from a probability distribution, was it drawn from a known distribution P, or from a distribution far from P? Our analysis reveals several distinct features that previous algorithms failed to take advantage of, and culminates by showing that these are essentially all the features an algorithm could take advantage of. Namely, we introduce the notion of an "instance-by-instance optimal" algorithm, and show that our algorithm, when testing against a distribution P, performs within constant factors of the best possible performance of any algorithm customized for P.
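To make the testing setup concrete, here is a minimal sketch of identity testing with a classical chi-squared-style statistic. This is an illustration of the problem, not the instance-optimal algorithm from the talk, and the distributions, sample size, and numbers below are assumptions chosen for the example.

```python
import random

def identity_test_statistic(samples, p):
    """Chi-squared-style statistic for testing whether samples come from p.

    Uses sum_i ((N_i - n*p_i)^2 - N_i) / (n*p_i), which stays small when the
    samples really come from p, and grows linearly in n when they come from a
    distribution far from p.
    """
    n = len(samples)
    counts = [0] * len(p)
    for s in samples:
        counts[s] += 1
    return sum(((c - n * q) ** 2 - c) / (n * q) for c, q in zip(counts, p))

# Illustrative setup (all values are assumptions, not from the talk):
rng = random.Random(0)
k = 10
P = [1.0 / k] * k                                # the known distribution P
Q = [1.0 / k + (0.06 if i < k // 2 else -0.06)   # a distribution at total
     for i in range(k)]                          # variation distance 0.3 from P
n = 5000

def draw(dist):
    return rng.choices(range(k), weights=dist, k=n)

stat_null = identity_test_statistic(draw(P), P)  # small: samples match P
stat_far = identity_test_statistic(draw(Q), P)   # large: samples are far from P
```

A tester accepts when the statistic falls below a threshold; the algorithmic question, and the subject of the talk, is how few samples suffice for a given P and distance parameter.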

Our analysis introduces a new way of using linear programming duality to automatically analyze a broad class of inequalities that generalizes Cauchy-Schwarz, Hölder's inequality, and the monotonicity of Lp norms.
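For reference, the three named inequalities, stated here in finite-sum form for nonnegative sequences:

```latex
% Cauchy-Schwarz:
\sum_i a_i b_i \;\le\; \Big(\sum_i a_i^2\Big)^{1/2} \Big(\sum_i b_i^2\Big)^{1/2}

% Hölder's inequality, for 1/p + 1/q = 1 with p, q > 1:
\sum_i a_i b_i \;\le\; \Big(\sum_i a_i^p\Big)^{1/p} \Big(\sum_i b_i^q\Big)^{1/q}

% Monotonicity of Lp norms (counting measure), for 0 < p \le q:
\Big(\sum_i |x_i|^q\Big)^{1/q} \;\le\; \Big(\sum_i |x_i|^p\Big)^{1/p}
```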

Joint work with Gregory Valiant.

See other events that are part of the Theory of Computation Colloquium - 2013.

Created by Holly A Jones Email at Monday, November 04, 2013 at 3:51 PM.