Kyriakos Axiotis: Thesis Defense: Flows, Submodularity, Sparsity, and Beyond: Continuous Optimization Insights into Discrete Problems

Speaker: Kyriakos Axiotis

Date: Monday, August 29, 2022

Time: 3:00 PM to 4:30 PM Note: all times are in the Eastern Time Zone

Public: Yes

Location: G449, Kiva / https://mit.zoom.us/j/95652580611

Event Type:

Room Description:

Host: Aleksander Madry (Chair), Costis Daskalakis, and Piotr Indyk

Contact: Deborah Goodwin, dlehto@csail.mit.edu

Relevant URL:

Speaker URL: None

Speaker Photo:
Kyriakos Axiotis

Reminders to: seminars@csail.mit.edu

Reminder Subject: TALK: Kyriakos Axiotis: Thesis Defense: Flows, Submodularity, Sparsity, and Beyond: Continuous Optimization Insights into Discrete Problems

Abstract: We will discuss a number of new connections between discrete and continuous optimization.

The first part will present faster second-order convex optimization algorithms for classical graph algorithmic problems. In particular, we will show that the running time of interior point methods is closely connected to spectral connectivity notions in the underlying graph, such as electrical conductance and effective resistance. We will explore these connections along two orthogonal directions: making manual interventions to the graph to improve connectivity, and keeping track of connectivity so as to make faster updates. These ideas lead to the first runtime improvement for the minimum cost flow problem in more than 10 years, as well as faster algorithms for problems like negative-weight shortest path and minimum cost perfect matching.
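To make the spectral connectivity notion concrete, here is a minimal sketch of effective resistance computed via the graph Laplacian pseudoinverse. This is illustrative background only, not the thesis's method: the algorithms in the talk maintain such quantities dynamically inside an interior point method rather than recomputing them from scratch, and all function names here are hypothetical.

```python
import numpy as np

def laplacian(n, edges):
    """Build the weighted graph Laplacian from (u, v, weight) triples."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

def effective_resistance(n, edges, u, v):
    """R_eff(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), treating each
    edge as a resistor of resistance 1/weight."""
    Lp = np.linalg.pinv(laplacian(n, edges))
    chi = np.zeros(n)
    chi[u], chi[v] = 1.0, -1.0
    return chi @ Lp @ chi

# Triangle with unit-weight edges: between any two nodes there are two
# parallel paths of resistance 1 and 2, so R_eff = (1*2)/(1+2) = 2/3.
triangle = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0)]
print(effective_resistance(3, triangle, 0, 1))
```

Edges with low effective resistance are "well connected" in the electrical sense; this is the kind of structural information the interventions described above exploit.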

In the second part, we will discuss efficient optimization algorithms for problems relevant to machine learning that have some discrete element, such as sparse or low-rank structure. We introduce a new technique, called adaptive regularization, which eliminates the sparsity performance degradation caused by l2 projections onto structured non-convex domains, like the set of sparse vectors or low-rank matrices. This improves the sparsity guarantees of one of the best-known sparse optimization algorithms, Iterative Hard Thresholding (IHT).
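For context, here is a minimal sketch of the basic IHT scheme the talk builds on: projected gradient descent for sparse linear regression, where the projection keeps only the k largest-magnitude coordinates. The step size, iteration count, and problem instance below are illustrative assumptions; the adaptive regularization technique from the talk modifies this baseline and is not shown here.

```python
import numpy as np

def hard_threshold(x, k):
    """Project onto the set of k-sparse vectors: keep the k
    largest-magnitude entries, zero out the rest (assumes 1 <= k < len(x))."""
    out = x.copy()
    out[np.argsort(np.abs(x))[:-k]] = 0.0
    return out

def iht(A, b, k, iters=200):
    """Basic IHT for min ||Ax - b||^2 s.t. ||x||_0 <= k:
    a gradient step followed by the (non-convex) l0 projection."""
    eta = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x - eta * A.T @ (A @ x - b), k)
    return x

# Recover a 2-sparse signal from noiseless random measurements
# (a well-conditioned instance, chosen for illustration).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20)) / np.sqrt(50)
x_true = np.zeros(20)
x_true[[3, 7]] = [2.0, -1.5]
x_hat = iht(A, A @ x_true, k=2)
print(np.nonzero(x_hat)[0])
```

The l0 projection in `hard_threshold` is exactly the kind of non-convex projection whose performance degradation the adaptive regularization technique is designed to eliminate.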

Research Areas:

Impact Areas:

This event is not part of a series.

Created by Deborah Goodwin Email at Friday, August 05, 2022 at 10:53 AM.