Differentiable Programming for Machine Learning
Date: Thursday, May 02, 2019
Time: 4:00 PM to 5:00 PM
Event Type: Seminar
Room Description: 32-D463 (Star)
Host: Michael Carbin
Contact: Nathan Higgins, 617-515-5976, email@example.com
Speaker URL: None
TALK: Differentiable Programming for Machine Learning
The recent successes of machine learning are due in part to the invention of machine learning methods (especially for deep learning), to the collection of datasets for tackling problems in many fields, and to the availability of powerful hardware, including CPUs, GPUs, and custom-designed ASICs. Software systems, too, are central to this progress.
This talk suggests that it is instructive and fruitful to think of these software systems from a programming-language perspective. A system such as TensorFlow owes its power and generality to its programmability. There, machine learning models are assembled from primitive operations by function composition and other simple, familiar constructs, with added support for automatic differentiation, which is uncommon in mainstream programming languages but prominent in current machine learning methods.
The design and the principles of the corresponding programming languages remain active research areas. This talk presents some recent results on the semantics of these languages, focusing on a tiny but expressive language for differentiable programming. It also includes an introduction to JAX, a new system for composable transformations of numerical Python programs, including automatic differentiation and compilation for hardware accelerators.
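The composable-transformations idea mentioned above can be illustrated with a small sketch in JAX. The toy model and loss below are illustrative assumptions, not from the talk; only `jax.grad` and `jax.jit` are the transformations the abstract refers to.

```python
# A minimal sketch of composable transformations in JAX.
# The model, loss, and data here are hypothetical examples.
import jax
import jax.numpy as jnp

# A model assembled from primitive operations by ordinary function composition.
def predict(w, x):
    return jnp.tanh(jnp.dot(x, w))

def loss(w, x, y):
    return jnp.mean((predict(w, x) - y) ** 2)

# jax.grad transforms the loss function into its gradient with respect to w;
# jax.jit compiles the composed result for a hardware accelerator (or CPU).
grad_loss = jax.jit(jax.grad(loss))

w = jnp.zeros(3)
x = jnp.ones((4, 3))
y = jnp.zeros(4)
g = grad_loss(w, x, y)  # gradient array with the same shape as w
```

Because `grad` and `jit` are themselves ordinary functions on functions, they compose freely, which is the "composable transformations of numerical Python programs" the talk describes.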
The talk is based on joint work with many people, in particular Gordon Plotkin and the many contributors to the systems described.
AI & Machine Learning, Computer Architecture, Programming Languages & Software
Created by Nathan Higgins on Monday, April 22, 2019 at 1:14 PM.