The rapid evolution of machine learning over the past decade has brought exciting challenges in systems building. Large datasets and the demand for ever more compute have shifted computation from CPUs to GPUs and on to ML-specific accelerators such as TPUs. Over the same period, research has moved from simple feed-forward models to much deeper convolutional networks, to sequential models like LSTMs, and beyond deep learning to reinforcement learning and evolutionary approaches. This talk will cover some of the ideas in TensorFlow that address the need for performance and scale, along with flexible programming models that let researchers explore new ideas.