Communication-avoiding algorithms for linear algebra, machine learning and beyond

Speaker: James Demmel, UC Berkeley

Date: Monday, August 10, 2020

Time: 2:00 PM to 3:00 PM. Note: all times are in the Eastern Time Zone.

Public: Yes

Location: https://mit.zoom.us/meeting/register/tJUrdOqopj8uHdO4gUyVMnfglOFEqIye_Je0 (Registration required)

Event Type: Seminar

Room Description: https://mit.zoom.us/meeting/register/tJUrdOqopj8uHdO4gUyVMnfglOFEqIye_Je0 (Registration required)

Host: Julian Shun, MIT CSAIL

Contact: Julian Shun, jshun@mit.edu, lindalynch@csail.mit.edu

Relevant URL: http://fast-code.csail.mit.edu/

Speaker URL: http://people.eecs.berkeley.edu/~demmel/

Speaker Photo:
None

Reminders to: seminars@csail.mit.edu, fast-code-seminar@lists.csail.mit.edu, pl@csail.mit.edu, commit@lists.csail.mit.edu, toc-students@csail.mit.edu, toc@csail.mit.edu, toc-postdocs@csail.mit.edu, toc-faculty@csail.mit.edu

Reminder Subject: TALK: Communication-avoiding algorithms for linear algebra, machine learning and beyond

Abstract: Algorithms have two costs: arithmetic and communication, i.e., moving data between levels of a memory hierarchy or between processors over a network. Communication costs (measured in time or energy per operation) already greatly exceed arithmetic costs, and the gap is growing over time following technological trends. Thus our goal is to design algorithms that minimize communication. We present new algorithms that communicate asymptotically less than their classical counterparts, for a variety of linear algebra and machine learning problems, demonstrating large speedups on a variety of architectures. Some of these algorithms attain provable lower bounds on communication. We describe generalizations of these bounds, and optimal algorithms, to arbitrary code that can be expressed as nested loops accessing arrays, such as convolutional neural nets, and to account for arrays having different precisions.
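The idea of trading redundant arithmetic organization for less data movement is often introduced via blocked (tiled) matrix multiplication, the textbook communication-avoiding algorithm. The sketch below is illustrative only and is not taken from the talk: with a fast memory (cache) of size M, the blocked loop order moves O(n^3 / sqrt(M)) words between slow and fast memory, versus O(n^3) for the naive triple loop, matching the known lower bound for matrix multiply. The block size `b` here is a hypothetical tuning parameter chosen so that three b-by-b tiles fit in fast memory.

```python
def matmul_naive(A, B):
    """Naive triple loop: each element of A and B may be re-fetched
    from slow memory on every one of the O(n^3) inner iterations."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C


def matmul_blocked(A, B, b):
    """Blocked (tiled) multiply: loop over b-by-b tiles so that one
    tile each of A, B, and C can stay resident in fast memory while
    all b^3 multiply-adds for that tile combination are performed."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, b):            # tile row of C
        for jj in range(0, n, b):        # tile column of C
            for kk in range(0, n, b):    # tile of the shared dimension
                # C[ii:ii+b, jj:jj+b] += A[ii:ii+b, kk:kk+b] * B[kk:kk+b, jj:jj+b]
                for i in range(ii, min(ii + b, n)):
                    for j in range(jj, min(jj + b, n)):
                        s = C[i][j]
                        for k in range(kk, min(kk + b, n)):
                            s += A[i][k] * B[k][j]
                        C[i][j] = s
    return C
```

Both versions perform the same arithmetic and produce the same result; only the order of the loops, and hence the memory traffic, differs.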

Bio: James Demmel is the Dr. Richard Carl Dehmel Distinguished Professor of Computer Science and Mathematics at the University of California at Berkeley, and former Chair of the EECS Department. His research is in numerical linear algebra, high performance computing, and communication-avoiding algorithms. He is known for his work on the widely used LAPACK and ScaLAPACK linear algebra libraries. He is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences; a Fellow of the AAAS, ACM, AMS, IEEE, and SIAM; and winner of the IPDPS Charles Babbage Award, the IEEE Computer Society Sidney Fernbach Award, the ACM Paris Kanellakis Award, and numerous best paper prizes.

Research Areas:
Algorithms & Theory, AI & Machine Learning, Systems & Networking

Impact Areas:
Big Data

See other events that are part of the Fast Code 2020 - 2021 seminar series.

Created by Julian J. Shun on Monday, August 03, 2020 at 3:17 PM.