David Patterson: Domain Specific Architectures for Deep Neural Networks: Three Generations of Tensor Processing Units (TPUs)

Speaker: David Patterson, Univ. of California, Berkeley

Date: Wednesday, October 16, 2019

Time: 4:30 PM to 5:30 PM. Note: all times are in the Eastern Time Zone.

Public: Yes

Location: 32-123

Event Type: Seminar

Room Description: Kirsch Auditorium

Host: Charles Leiserson

Contact: Lauralyn M. Smith, lauralyn@csail.mit.edu

Relevant URL:

Speaker URL: None

Reminders to: seminars@csail.mit.edu

Abstract:
The recent success of deep neural networks (DNNs) has inspired a resurgence in domain-specific architectures (DSAs) to run them, partly as a result of the deceleration of microprocessor performance improvement due to the end of Moore’s Law. DNNs have two phases: training, which constructs accurate models, and inference, which serves those models. Google’s first-generation Tensor Processing Unit (TPUv1) offered a 50X improvement in performance per watt over conventional architectures for inference. We naturally asked whether a successor could do the same for training. This talk reviews TPUv1 and explores how Google built the first production DSA supercomputer for the much harder problem of training, which was deployed in 2017. Google’s TPUv2/TPUv3 supercomputers with up to 1024 chips train production DNNs at close to perfect linear speedup, with 10X-40X higher floating-point operations per watt than general-purpose supercomputers running the high-performance computing benchmark Linpack.
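The near-linear speedup across chips that the abstract cites is the payoff of synchronous data-parallel training: each chip holds a replica of the model, processes its shard of the batch, and gradients are averaged with an all-reduce before every update. Below is a minimal sketch of that pattern written in JAX (which targets TPUs); the toy linear model, batch sizes, and names are illustrative assumptions, not code from the talk.

import jax
import jax.numpy as jnp

def predict(params, x):
    # Toy linear model standing in for a real DNN.
    w, b = params
    return x @ w + b

def loss_fn(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)

def train_step(params, x, y):
    # Each device computes gradients on its shard of the batch, then an
    # all-reduce (pmean) averages them so every replica applies the same update.
    grads = jax.grad(loss_fn)(params, x, y)
    grads = jax.lax.pmean(grads, axis_name="devices")
    return jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)

# pmap replicates train_step across all local accelerator cores.
p_train_step = jax.pmap(train_step, axis_name="devices")

n_dev = jax.local_device_count()
w, b = jnp.zeros((8, 1)), jnp.zeros((1,))
params = jax.tree_util.tree_map(lambda a: jnp.stack([a] * n_dev), (w, b))

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (n_dev, 32, 8))  # leading axis = one shard per device
y = jnp.ones((n_dev, 32, 1))
params = p_train_step(params, x, y)         # one synchronous training step

Because the only cross-chip communication per step is the gradient all-reduce, adding chips mostly adds compute, which is why throughput can scale nearly linearly on a pod of this kind.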

Bio:
David Patterson is a Berkeley CS professor emeritus, a Google distinguished engineer, and Vice-Chair of the RISC-V Foundation. He received his BA, MS, and PhD degrees from UCLA. His Reduced Instruction Set Computer (RISC), Redundant Array of Inexpensive Disks (RAID), and Network of Workstations (NOW) projects helped lead to multibillion-dollar industries. This work led to 40 awards for research, teaching, and service, plus many papers and seven books. The best-known book is ‘Computer Architecture: A Quantitative Approach,’ and the newest is ‘The RISC-V Reader: An Open Architecture Atlas.’ In 2018 he and John Hennessy shared the ACM A.M. Turing Award.

Research Areas:

Impact Areas:

This talk is part of the Dertouzos Distinguished Lecture Series.

Created by Lauralyn M. Smith on Tuesday, August 13, 2019 at 2:02 PM.