Linear algebra cheat sheet for Deep Learning
Beginner’s guide to commonly used operations. During Jeremy Howard’s excellent deep learning course, I realized I was a little rusty on the prerequisites, and my fuzziness was impacting my ability to understand concepts like backpropagation. I decided to put together a few wiki pages on these... Read more
A Beginner’s Guide To Understanding Convolutional Neural Networks Part 2
Introduction Link to Part 1 In this post, we’ll go into a lot more of the specifics of ConvNets. Disclaimer: Now, I do realize that some of these topics are quite complex and could fill whole posts by themselves. In an effort to remain... Read more
Automated analysis of High‐content Microscopy data with Deep Learning
Introduction Advances in automated image acquisition and analysis, coupled with the availability of reagents for genome‐scale perturbation, have enabled systematic analyses of cellular and subcellular phenotypes (Mattiazzi Usaj et al, 2016). One powerful application of microscopy‐based assays involves assessment of changes in the subcellular... Read more
A Beginner’s Guide To Understanding Convolutional Neural Networks
Introduction Convolutional neural networks. Sounds like a weird combination of biology and math with a little CS sprinkled in, but these networks have been some of the most influential innovations in the field of computer vision. 2012 was the first year that neural nets grew to... Read more
Deciphering the Neural Language Model
Recently, I have been working on the Neural Networks for Machine Learning course offered by Coursera and taught by Geoffrey Hinton. Overall, it is a nice course and provides an introduction to some of the modern topics in deep learning. However, there are instances where the student... Read more
Handwritten digits recognition using Tensorflow with Python
The progress in technology over the last 10 years is unbelievable. Every corner of the world is using the latest technologies to improve existing products, while also conducting immense research into inventing products that make the world a better place to live.... Read more
Random-Walk Bayesian Deep Networks: Dealing with Non-Stationary Data
Thomas originally posted this article at http://twiecki.github.io  Most problems solved by Deep Learning are stationary. A cat is always a cat. The rules of Go have remained stable for 2,500 years, and will likely stay that way. However, what if the world around you is changing? This... Read more
TensorFlow Clusters: Questions and Code
One way to think about TensorFlow is as a framework for distributed computing. I’ve suggested that TensorFlow is a distributed virtual machine. As such, it offers a lot of flexibility. TensorFlow also suggests some conventions that make writing programs for distributed computation tractable. When is there... Read more
In this interview, Jonathan Schwarz of Google DeepMind shares insight on Deep Learning projects. He offers tips and advice for those interested in DL, and explains how DL projects relate to other data-driven projects. He comments on effective team size, software, frameworks, common mistakes, resources for learning,... Read more
On word embeddings – Part 2: Approximating the Softmax
Table of contents: Softmax-based Approaches (Hierarchical Softmax, Differentiated Softmax, CNN-Softmax); Sampling-based Approaches (Importance Sampling, Adaptive Importance Sampling, Target Sampling, Noise Contrastive Estimation, Negative Sampling); Self-Normalisation (Infrequent Normalisation); Other Approaches; Which Approach to Choose?; Conclusion. This is the second post in a series on word embeddings and... Read more