Microsoft Deep Learning Image Classifier


At its 15th annual Research Faculty Summit, Microsoft demonstrated its latest deep learning image classifier, named Project Adam. In the demo, Adam ran on a smartphone and identified the exact breed of a dog.

Project Adam leverages a dataset of 14 million Web images spanning 22,000 categories. Microsoft claims that “using 30 times fewer machines than other systems, that data was used to train a neural network (likely based on convolutional neural nets) made up of more than two billion connections. This scalable infrastructure is twice as accurate in its object recognition, and 50 times faster than other systems.”

While the present trend is to run similar machine learning algorithms on GPUs, Adam uses plain old CPUs on Microsoft’s Azure cloud. To that end, Adam leverages new research in asynchronous multicore algorithms – namely a new variant of stochastic gradient descent (SGD) called Hogwild! – that allows processors to run independently while accessing shared memory, with the possibility of overwriting each other’s contributions.
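To make the idea concrete, here is a minimal sketch of Hogwild!-style lock-free SGD: several threads update a shared parameter with no locking, so their read-modify-write steps can race, yet the optimization still converges. The toy regression problem (fitting w to y = 3x) and all function names are illustrative assumptions, not Project Adam’s actual code.

```python
import threading
import random

def hogwild_sgd(n_threads=4, steps=2000, lr=0.01, seed=0):
    """Lock-free SGD sketch: threads share `w` with no synchronization."""
    w = [0.0]  # shared parameter, mutated by every thread without a lock

    def worker(tid):
        rng = random.Random(seed + tid)
        for _ in range(steps):
            x = rng.uniform(-1.0, 1.0)
            y = 3.0 * x                      # toy target: y = 3x
            grad = 2.0 * (w[0] * x - y) * x  # gradient of squared error
            w[0] -= lr * grad                # racy read-modify-write (Hogwild!)

    threads = [threading.Thread(target=worker, args=(t,)) for t in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return w[0]

print(hogwild_sgd())  # converges near 3.0 despite unsynchronized updates
```

The key design point, per the Hogwild! paper, is that when updates are sparse enough, occasional overwrites inject only bounded noise, so dropping locks buys near-linear multicore speedup without destroying convergence.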

Project Adam reference:

Microsoft Project Adam

Hogwild references:

Professor Wright: Optimization in Learning and Data Analysis
Hogwild!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent

Update: Professor LeCun has posted a few comments on his Google+ page about this classifier.

Note: The picture above is from Project Adam.

Copyright © 2005-2014 by Serge-Paul Carrasco. All rights reserved.
Contact Us: asvinsider at gmail dot com.