RISE Seminar 4/5/18: Matt Johnson, Roy Frostig, and Chris Leary (Google Brain): Compiling machine learning programs via high-level tracing
April 5, 2018
Date: Thursday, April 5th, 12-1pm
Location: Wozniak Lounge (430 Soda Hall)
We’ll describe JAX, a domain-specific tracing JIT compiler for generating high-performance accelerator code from pure Python and Numpy machine learning programs. JAX uses the XLA compiler infrastructure to generate optimized code for the program subroutines that are most favorable for acceleration, and these optimized subroutines can be called and orchestrated by arbitrary Python. Because the system is fully compatible with Autograd, it allows forward- and reverse-mode automatic differentiation of Python functions to arbitrary order. We show that by combining JAX with Autograd and Numpy we get an easily programmable and highly performant ML system that targets CPUs, GPUs, and TPUs, capable of scaling to multi-core Cloud TPUs.
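As a hedged sketch of the programming model described above (assuming the public `jax.jit`, `jax.grad`, and `jax.numpy` API; the function names `predict` and `loss` are illustrative, not from the talk), a Numpy-style Python function can be differentiated and then traced and compiled via XLA:

```python
import jax
import jax.numpy as jnp

# An ordinary pure-Python, Numpy-style model function.
def predict(w, x):
    return jnp.tanh(jnp.dot(x, w))

# A scalar loss, written with the same Numpy-like operations.
def loss(w, x, y):
    return jnp.mean((predict(w, x) - y) ** 2)

# grad performs reverse-mode automatic differentiation (w.r.t. the
# first argument by default); jit traces the result and compiles it
# with XLA for CPU, GPU, or TPU execution.
grad_loss = jax.jit(jax.grad(loss))

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([0.5, -0.5])
g = grad_loss(w, x, y)  # gradient with the same shape as w
```

The compiled `grad_loss` can then be called from arbitrary Python, with the surrounding program orchestrating the optimized subroutines.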
But JAX isn’t really for building just one ML system. Instead, because both JAX and Autograd are pure Python userspace libraries, they provide flexible tools for building new systems and doing machine learning systems research. To that end, this talk will cover not only the ML researcher’s view of JAX but also some highlights of Autograd and XLA, and how JAX can compose them into new systems.
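Because the transformations are ordinary Python higher-order functions, they compose directly; a minimal sketch (again assuming the `jax.grad`/`jax.jit` API) of the arbitrary-order differentiation mentioned above:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x

# Transformations compose like any Python functions:
df = jax.grad(f)            # first derivative: x*cos(x) + sin(x)
d2f = jax.grad(jax.grad(f)) # second derivative: 2*cos(x) - x*sin(x)
fast_d2f = jax.jit(d2f)     # and the composition can itself be compiled

val = fast_d2f(1.0)
```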
This is joint work between Matt Johnson, Roy Frostig, Chris Leary, Dougal Maclaurin, Jamie Townsend, and Jonathan Ragan-Kelley.
Short paper on JAX: http://www.sysml.cc/doc/
Autograd’s homepage: https://github.com/
XLA’s landing page: https://www.tensorflow.
Chris Leary is a compiler engineer at Google and tech lead of XLA. Matt Johnson, Roy Frostig, and Dougal Maclaurin (creator of Autograd) are machine learning researchers at Google Brain, and Jamie Townsend is their trusty intern. Jonathan Ragan-Kelley is a professor at Berkeley working on systems, compilers, and programming languages.