Program Optimization for Machine Learning

Speaker: Alex Aiken, Stanford University

Date: Monday, May 11, 2020

Time: 2:00 PM to 3:00 PM (all times are in the Eastern Time Zone)

Public: Yes

Location: https://mit.zoom.us/j/536883569 (password: 008943)

Event Type: Seminar

Room Description: https://mit.zoom.us/j/536883569 (password: 008943)

Host: Julian Shun, MIT CSAIL

Contact: Julian Shun, jshun@mit.edu

Relevant URL: http://fast-code.csail.mit.edu/

Speaker URL: http://theory.stanford.edu/~aiken/

Reminders to: seminars@csail.mit.edu, fast-code-seminar@lists.csail.mit.edu, pl@csail.mit.edu, commit@csail.mit.edu, mitml@mit.edu

Reminder Subject: TALK: Program Optimization for Machine Learning

Abstract: Training deep neural networks (DNNs) can be expensive and slow, consuming enormous numbers of compute-hours on parallel machines. This talk will present results on using novel search procedures over programs to reduce training time. In particular, instead of greedily applying program-improving transformations to compute a single improved program, we search a space of programs, considering many possible candidates guided by a global cost function. The application of search-based optimization to two separate problems will be discussed: improving the partitioning and distribution of training data, and reducing the execution time of the DNN computation graph. Both methods speed up training by up to a factor of 3 over current state-of-the-art systems.
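For intuition only, the sketch below illustrates the general idea of cost-guided search over candidate programs described in the abstract: rather than greedily committing to each improving rewrite, many candidates are kept and explored under a single global cost model, including candidates that are temporarily worse. This is a minimal illustration, not the speaker's system; the rewrite rules, cost model, and all names are hypothetical placeholders.

import heapq

def cost_guided_search(initial_program, rewrites, cost, budget=1000, slack=1.05):
    # Illustrative sketch (not the speaker's system).
    # rewrites: list of functions, each mapping a program to a list of
    #           rewritten (semantically equivalent) candidate programs.
    # cost:     global cost model estimating execution time of a whole program.
    # slack:    keep candidates up to slack * best cost seen so far, so that
    #           locally worse programs can still lead to better ones.
    best = initial_program
    best_cost = cost(initial_program)
    frontier = [(best_cost, 0, initial_program)]   # (cost, tiebreak, program)
    seen = {hash(initial_program)}                 # programs assumed hashable
    tiebreak = 1
    while frontier and budget > 0:
        current_cost, _, program = heapq.heappop(frontier)
        budget -= 1
        if current_cost < best_cost:
            best, best_cost = program, current_cost
        for rule in rewrites:
            for candidate in rule(program):
                candidate_cost = cost(candidate)
                if hash(candidate) not in seen and candidate_cost <= slack * best_cost:
                    seen.add(hash(candidate))
                    heapq.heappush(frontier, (candidate_cost, tiebreak, candidate))
                    tiebreak += 1
    return best

A beam search or sampling-based explorer could replace the priority queue; the key point is that acceptance of a candidate is governed by one global cost estimate for the whole program rather than by local, greedy improvement.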

Bio: Alex Aiken is the Alcatel-Lucent Professor of Computer Science at Stanford. Alex received his Bachelor's degree in Computer Science and Music from Bowling Green State University in 1983 and his Ph.D. from Cornell University in 1988. Alex was a Research Staff Member at the IBM Almaden Research Center (1988-1993) and a Professor in the EECS department at UC Berkeley (1993-2003) before joining the Stanford faculty in 2003. His research interests are in areas related to programming languages. He is an ACM Fellow, a recipient of ACM SIGPLAN's Programming Languages Achievement Award and Phi Beta Kappa's Teaching Award, and a former chair of the Stanford Computer Science Department.

Research Areas:
AI & Machine Learning, Programming Languages & Software, Systems & Networking

Impact Areas:
Big Data

See other events that are part of the Fast Code Seminar 2019.

Created by Julian J. Shun on Wednesday, May 06, 2020 at 12:10 AM.