Google TensorFlow Developer Summit


Google held its second TensorFlow (TF) Developer Summit yesterday at the Computer History Museum in Mountain View. This second summit was quite different from the first one. While the first was a small conference with a limited audience, this one felt like a “mini-Google I/O”, with many presentations, demos, and a big party!

I was expecting Google to announce new APIs for Keras and/or a complete AutoML toolkit (in particular, to make non-data-scientists more comfortable developing models with TF), as well as improvements to the TensorFlow platform, particularly in distributed and production environments.

While there were many announcements about the TF platform, and both the breadth and depth of those announcements were quite impressive, the big surprises were TF for the Web and JavaScript, and Swift for TF! Can you imagine that?

With so many investments in TensorFlow, it remains to be seen whether other ML frameworks such as Caffe2 or PyTorch will be widely adopted, or whether they will only be used for a limited number of use cases or by a limited number of companies.

And with so much investment from Google in TF, it might seem that TF could be the next moonshot for Google after Android. If that is the case, the key business question is: what will be Google’s monetization mechanisms for the TF platform?

Following are some selected videos on the TF platform, followed by TF for JavaScript and Swift for TF.

Eager Execution

“TensorFlow’s eager execution is an imperative programming environment that evaluates operations immediately, without an extra graph-building step. Operations return concrete values instead of constructing a computational graph to run later. This makes it easy to get started with TensorFlow, debug models, and reduce boilerplate code.”
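A minimal sketch of what this means in practice (eager execution is the default in TF 2.x; in TF 1.x it had to be enabled explicitly):

```python
import tensorflow as tf

# With eager execution on, there is no separate graph-building step:
# operations run immediately and return concrete values.
assert tf.executing_eagerly()

x = tf.constant([[2.0, 0.0],
                 [0.0, 2.0]])
y = tf.matmul(x, x)  # evaluated right away, no session needed

print(y.numpy())  # a concrete 2x2 result, inspectable like any array
```

Because values are concrete, you can debug with ordinary `print` statements and a standard Python debugger instead of inspecting a deferred graph.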

Input Pipelines (tf.data)

“The tf.data API enables you to build complex input pipelines from simple, reusable pieces. For example, the pipeline for an image model might aggregate data from files in a distributed file system, apply random perturbations to each image, and merge randomly selected images into a batch for training. The pipeline for a text model might involve extracting symbols from raw text data, converting them to embedding identifiers with a lookup table, and batching together sequences of different lengths.”
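A toy pipeline in the spirit of that description, using in-memory data and a stand-in transformation rather than real files or image perturbations:

```python
import tensorflow as tf

# source -> map (per-element transform) -> shuffle -> batch:
# each stage is a simple, reusable piece.
dataset = (
    tf.data.Dataset.from_tensor_slices([1, 2, 3, 4, 5, 6])
    .map(lambda x: x * 2)   # toy stand-in for a per-element transform
    .shuffle(buffer_size=6) # randomize element order
    .batch(2)               # merge elements into batches for training
)

for batch in dataset:
    print(batch.numpy())    # three batches of two elements each
```

In a real image pipeline, `from_tensor_slices` would be replaced by a file-based source and the `map` would decode and augment each image.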

TensorFlow Hub

“TensorFlow Hub is a library to foster the publication, discovery, and consumption of reusable parts of machine learning models. A module is a self-contained piece of a TensorFlow graph, along with its weights and assets, that can be reused across different tasks in a process known as transfer learning.

Modules contain variables that have been pre-trained for a task using a large dataset. By reusing a module on a related task, you can:

  • Train a model with a smaller dataset
  • Improve generalization
  • Speed up training”
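The transfer-learning pattern that Hub modules support can be sketched with plain Keras, using a hypothetical tiny “pre-trained” base in place of a real Hub module:

```python
import tensorflow as tf

# Hypothetical stand-in for a pre-trained module: a small feature extractor.
base = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
])
base.trainable = False  # freeze the "pre-trained" weights

# Only the new task-specific head is trained, so a smaller dataset suffices.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```

With TensorFlow Hub, `base` would instead come from a published module pre-trained on a large dataset, and only the head's handful of parameters would be fit to the new task.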

Debugging TF with TensorBoard plugins

“TensorFlow Debugger, an interactive web GUI for controlling the execution of TensorFlow models, setting breakpoints, stepping through graph nodes, watching tensors flow in real-time, and pinpointing problems down to the tiniest NaN. This tool now comes included with TensorBoard via its open plugin API.”
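The GUI debugger is what the talk demonstrates; for the specific case of hunting NaNs, TF also offers a programmatic check, `tf.debugging.check_numerics` (a related utility, not the TensorBoard plugin itself):

```python
import tensorflow as tf

good = tf.constant([1.0, 2.0])
bad = tf.constant([1.0, float("nan")])

# Passes the tensor through unchanged when all values are finite...
checked = tf.debugging.check_numerics(good, message="found a bad value")

# ...and raises InvalidArgumentError as soon as a NaN or Inf appears.
try:
    tf.debugging.check_numerics(bad, message="found a bad value")
except tf.errors.InvalidArgumentError as e:
    print("caught:", e.message)
```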

Distributed TF

“Distributed TensorFlow enables you to create a cluster of TensorFlow servers, and to distribute a computation graph across that cluster.”
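A cluster is described by a `ClusterSpec` that maps job names to task addresses (the hosts below are hypothetical); on a single machine, a distribution strategy such as `MirroredStrategy` plays the analogous role across local devices:

```python
import tensorflow as tf

# A cluster spec maps job names ("worker", "ps") to task addresses.
# The hostnames here are placeholders, not real servers.
cluster = tf.train.ClusterSpec({
    "worker": ["worker0.example.com:2222", "worker1.example.com:2222"],
    "ps": ["ps0.example.com:2222"],
})
print(cluster.jobs)  # the job names defined in the cluster

# Locally, MirroredStrategy replicates a computation across the
# available devices instead of remote TensorFlow servers.
strategy = tf.distribute.MirroredStrategy()
print("replicas in sync:", strategy.num_replicas_in_sync)
```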

Building TF pipelines with TFX

TF for JavaScript

TensorFlow.js is a WebGL-accelerated, browser-based JavaScript library for training and deploying TF models.

Swift for TF

Note: The picture above is of two jellyfish from the Monterey Bay Aquarium.

Copyright © 2005-2018 by Serge-Paul Carrasco. All rights reserved.
Contact Us: asvinsider at gmail dot com.