Thesis defense: Adrian Vladu, "Shortest Paths, Markov Chains, Matrix Scaling and Beyond: Improved Algorithms Through the Lens of Continuous Optimization"

Speaker: Adrian Vladu

Date: Thursday, May 11, 2017

Time: 3:00 PM to 4:00 PM

Public: Yes

Location: 32-G575

Event Type:

Room Description:

Host: Aleksander Madry, MIT

Contact: Aleksander Madry, madry@csail.mit.edu

Relevant URL:

Speaker URL: None

Speaker Photo: None

Reminders to: theory-seminars@csail.mit.edu, seminars@csail.mit.edu

Reminder Subject: TALK: Thesis defense: Adrian Vladu, "Shortest Paths, Markov Chains, Matrix Scaling and Beyond: Improved Algorithms Through the Lens of Continuous Optimization"

In this thesis, we build connections between classic methods from convex optimization and the modern toolkit from the fast Laplacian solver literature, in order to make progress on a number of fundamental algorithmic problems:

- We develop a faster algorithm for the unit capacity minimum cost flow problem, which encompasses the shortest path with negative weights and minimum cost bipartite perfect matching problems. In the case of sparse graphs, this provides the first running time improvement for these problems in over 25 years.

- We initiate the study of solving linear systems involving directed Laplacian matrices, and devise an almost-linear time algorithm for this task. This primitive enables us to also obtain almost-linear time algorithms for computing a host of quantities associated with Markov chains, such as stationary distributions, personalized PageRank vectors, hitting times, and escape probabilities (the first sketch after this list illustrates the stationary-distribution reduction). This significantly improves over the previous state of the art, which was based on simulating random walks or on fast matrix multiplication.

- We develop faster algorithms for scaling and balancing nonnegative matrices, two fundamental problems in scientific computing, significantly improving over the best previously known running times. In particular, if the optimal scalings/balancings have polynomially bounded condition numbers, our algorithms run in nearly-linear time (the second sketch after this list shows the classical Sinkhorn baseline for matrix scaling).
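
For the Markov chain result above, the connection between directed Laplacians and stationary distributions can be illustrated on a small dense example: for a row-stochastic transition matrix P, the stationary distribution pi satisfies pi^T P = pi^T, i.e. (I - P^T) pi = 0, and I - P^T is a directed Laplacian up to a diagonal rescaling. The numpy sketch below solves this directly with a dense solver on a hard-coded 3-state chain; it only illustrates the reduction, not the almost-linear time solver from the thesis.

```python
import numpy as np

# Row-stochastic transition matrix of a small, illustrative 3-state chain.
P = np.array([[0.5, 0.5, 0.0],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
n = P.shape[0]

# The stationary distribution pi satisfies pi^T P = pi^T, i.e. (I - P^T) pi = 0.
# I - P^T is a directed Laplacian up to a diagonal rescaling; it is singular,
# so one equation is replaced by the normalization sum(pi) = 1.
A = np.eye(n) - P.T
A[-1, :] = 1.0
b = np.zeros(n)
b[-1] = 1.0

pi = np.linalg.solve(A, b)
print(pi)            # stationary distribution
print(pi @ P - pi)   # ~0: fixed-point check
```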
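For the matrix scaling result, a natural point of reference is the classical Sinkhorn-Knopp iteration, which alternately normalizes row and column sums. The sketch below is this classical baseline, not the algorithm from the thesis; the function name, the input matrix, and the iteration count are illustrative choices.

```python
import numpy as np

def sinkhorn_scale(A, iters=500):
    """Classical Sinkhorn-Knopp iteration: find positive vectors x, y so that
    diag(x) @ A @ diag(y) is (approximately) doubly stochastic.
    Illustrative baseline only; assumes A is entrywise positive."""
    x = np.ones(A.shape[0])
    y = np.ones(A.shape[1])
    for _ in range(iters):
        x = 1.0 / (A @ y)      # make all row sums equal to 1
        y = 1.0 / (A.T @ x)    # make all column sums equal to 1
    return x, y

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x, y = sinkhorn_scale(A)
S = np.diag(x) @ A @ np.diag(y)
print(S.sum(axis=1), S.sum(axis=0))   # both close to all-ones
```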

Beyond that, we leverage and extend tools from convex geometry to design an algorithm for online pricing with nearly optimal regret. We also use convex optimization to shed new light on the approximate Carathéodory problem, for which we give a deterministic nearly-linear time algorithm, as well as matching lower bounds.
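
As a point of reference for the approximate Carathéodory problem (given a point u in the convex hull of points v_1, ..., v_n, find a sparse convex combination close to u), the textbook Frank-Wolfe greedy method already yields an eps-approximation in the l2 norm using O(1/eps^2) vertices. The sketch below is this classical baseline, not the deterministic algorithm from the thesis; the function name, V, u, and the iteration count T are illustrative.

```python
import numpy as np

def approx_caratheodory_l2(V, u, T=200):
    """Frank-Wolfe sketch for approximate Caratheodory in the l2 norm:
    build a convex combination of at most T columns of V approximating u
    (u is assumed to lie in the convex hull of V's columns)."""
    d, n = V.shape
    weights = np.zeros(n)
    x = V[:, 0].copy()
    weights[0] = 1.0
    for t in range(1, T):
        grad = x - u                    # gradient of 0.5 * ||x - u||^2
        i = int(np.argmin(V.T @ grad))  # vertex minimizing the linear model
        gamma = 2.0 / (t + 2)           # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * V[:, i]
        weights *= (1 - gamma)
        weights[i] += gamma
    return weights, x

# Toy usage: u is a convex combination of the columns of V.
V = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
u = V @ np.array([0.2, 0.5, 0.3])
w, x = approx_caratheodory_l2(V, u)
print(np.linalg.norm(x - u))   # small l2 error
print(np.count_nonzero(w))     # number of vertices used (sparsity)
```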

Thesis Supervisors: Jonathan Kelner, Aleksander Mądry

Thesis Committee: Jonathan Kelner, Aleksander Mądry, Ankur Moitra

Research Areas:

Impact Areas:

This event is not part of a series.

Created by Aleksander Madry on Monday, May 08, 2017 at 11:52 AM.