Zachary Chase Lipton

Zachary Chase Lipton is a PhD student in the Computer Science department at UCSD. He researches machine learning theory and applications, and is a contributing editor to KDnuggets.

California · http://www.zacklipton.com · 10 posts

Imaginary big data problems

Five years ago, social media was the one thing everyone had to talk about. Do you have a social media strategy? Well, you'd better, because the competition does! Does that mean anything at all? Maybe not, but do you want to take that risk? To be fair, of course online media is... Read more

A rigorous & readable review on RNNs

This post introduces a new critical review of recurrent neural networks for sequence learning. Twelve nights back, while up late preparing pretty pictures for the review, I figured I should share my Google art with the world. After all, RNNs are poorly understood by most... Read more

Looking back at "Finding structure in time"

Keeping up with the breakneck pace of research in computer science can be daunting. Even in my comfortable position as a graduate researcher, with no students to advise and no current teaching responsibilities, there are more interesting papers published each month than I could reasonably read. For an engineer with full-time responsibilities... Read more

Demystifying LSTM neural networks

This article provides a basic introduction to Long Short-Term Memory (LSTM) neural networks. For a more thorough treatment of RNNs, see the full 33-page review hosted on arXiv. Given its wide applicability to real-world tasks, deep learning has attracted the attention of a broad audience of technologists, investors, and spectators.... Read more
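
For readers who want a concrete picture before clicking through, here is a minimal sketch of a single LSTM step in NumPy, following the standard gate equations. This is not the article's code; the weight names (W_f, W_i, W_c, W_o) and the concatenated-input convention are illustrative assumptions.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x, h_prev, c_prev, W_f, W_i, W_c, W_o, b_f, b_i, b_c, b_o):
        # Concatenate the previous hidden state with the current input.
        z = np.concatenate([h_prev, x])
        f = sigmoid(W_f.dot(z) + b_f)         # forget gate
        i = sigmoid(W_i.dot(z) + b_i)         # input gate
        c_tilde = np.tanh(W_c.dot(z) + b_c)   # candidate cell state
        c = f * c_prev + i * c_tilde          # new cell state
        o = sigmoid(W_o.dot(z) + b_o)         # output gate
        h = o * np.tanh(c)                    # new hidden state
        return h, c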

No more local minima?

Local minima have long been considered to plague neural networks. This idea has been propagated by reputable researchers. Thousands of papers express this view. Textbooks and lecture notes repeat it. My previous blog post on recurrent nets repeated this claim. A recent paper, "Identifying and attacking the saddle point problem in high-dimensional... Read more
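
As a toy illustration of the distinction the paper draws (my example, not the paper's): f(x, y) = x^2 - y^2 has zero gradient at the origin, yet the origin is a saddle point rather than a local minimum. The mixed signs of the Hessian's eigenvalues give it away:

    import numpy as np

    # Hessian of f(x, y) = x^2 - y^2 at the origin.
    H = np.array([[2.0,  0.0],
                  [0.0, -2.0]])
    print(np.linalg.eigvalsh(H))  # [-2.  2.]: one negative, one positive => saddle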

Outstanding deep learning tutorial

I recently came across a fantastic tutorial on deep learning by Alec Radford, the head of research at indico Data Solutions. The video is 52 minutes long and clearly covers fundamental Theano concepts as well as Theano implementations of logistic regression, multilayer perceptrons, and convolutional neural networks. The source code is available... Read more
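
To give a flavor of what such a Theano implementation looks like, here is a minimal sketch of logistic regression with a symbolically derived gradient step. It is my own sketch, not the tutorial's code; the learning rate and feature dimension are arbitrary.

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.matrix('x')                        # inputs, one example per row
    y = T.vector('y')                        # binary labels
    w = theano.shared(np.zeros(5), name='w') # weights for 5 features
    b = theano.shared(0.0, name='b')         # bias

    p = T.nnet.sigmoid(T.dot(x, w) + b)      # predicted probabilities
    loss = T.nnet.binary_crossentropy(p, y).mean()
    gw, gb = T.grad(loss, [w, b])            # symbolic gradients

    # Compile a training step that applies one gradient update per call.
    train = theano.function([x, y], loss,
                            updates=[(w, w - 0.1 * gw), (b, b - 0.1 * gb)])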

Expressing math in code without hair loss

Machine learning researchers and data scientists regularly use many different programming languages. The most common are Python, Matlab, R, and C++. Another popular option, Octave, offers a language similar to Matlab's but is open source and free to use. Its drawbacks are penalties in performance and library availability. Another language quickly entering the... Read more
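
As a quick taste of the topic (my own example, not from the article), consider how closely NumPy lets Python track the math: the ordinary least squares solution w = (X^T X)^(-1) X^T y becomes a near-literal transcription.

    import numpy as np

    X = np.random.randn(100, 3)   # design matrix
    y = np.random.randn(100)      # targets
    # Solve (X^T X) w = X^T y rather than forming an explicit inverse.
    w = np.linalg.solve(X.T.dot(X), X.T.dot(y))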

Idiot-proof validation

Big data is a tortured term. (See my article from February for more on unforgivable abuses of data science jargon.) Most articles on the topic seem to fall into one of two clusters: 1) Preposterous posts that vaguely allude to magic, clouds, and giant data monsters that scale multiple data centers and... Read more

The economics of virtualization

Over the last several years, the term "cloud" has become ubiquitous. It's even come to refer to activities that far predate its coinage. Email is in the cloud. Your voice mail is stored in the cloud. Perhaps your company offers or licenses collaborative productivity tools that live in the cloud. Of course,... Read more