Random-Walk Bayesian Deep Networks: Dealing with Non-Stationary Data
Thomas originally posted this article at http://twiecki.github.io. Most problems solved by Deep Learning are stationary. A cat is always a cat. The rules of Go have remained stable for 2,500 years, and will likely stay that way. However, what if the world around you is changing? This is common, for... Read more
TensorFlow Clusters: Questions and Code
One way to think about TensorFlow is as a framework for distributed computing. I’ve suggested that TensorFlow is a distributed virtual machine. As such, it offers a lot of flexibility. TensorFlow also suggests some conventions that make writing programs for distributed computation tractable. When is there a cluster? A... Read more
In this interview, Jonathan Schwarz of Google DeepMind shares insight on Deep Learning projects. He offers tips and advice for those interested in DL, and explains how DL projects relate to other data-driven projects. He comments on effective team size, software, frameworks, common mistakes, resources for learning, and more, all under 30... Read more
On word embeddings – Part 2: Approximating the Softmax
Table of contents: Softmax-based Approaches Hierarchical Softmax Differentiated Softmax CNN-Softmax Sampling-based Approaches Importance Sampling Adaptive Importance Sampling Target Sampling Noise Contrastive Estimation Negative Sampling Self-Normalisation Infrequent Normalisation Other Approaches Which Approach to Choose? Conclusion This is the second post in a series on word embeddings and representation learning. In... Read more
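To give a flavour of the contrast the post draws between the full softmax and sampling-based approaches, here is a minimal sketch in NumPy. The vocabulary size, logit vector, and choice of k are all illustrative assumptions, not values from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a vocabulary of V words with a logit (score) vector z.
V = 100_000
z = rng.normal(size=V)
target = 42  # index of the "true" next word (arbitrary for illustration)

# Full softmax: the log-probability of the target requires normalising
# over all V scores, which is the expensive part for large vocabularies.
full_log_prob = z[target] - np.log(np.exp(z).sum())

# Negative-sampling flavour (as popularised by word2vec): skip the
# normalisation and instead contrast the target against k randomly
# drawn "negative" words, turning the problem into binary classification.
k = 5
negatives = rng.integers(0, V, size=k)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
ns_loss = -np.log(sigmoid(z[target])) - np.log(sigmoid(-z[negatives])).sum()
```

The full softmax costs O(V) per training example, while the sampled loss costs O(k); that gap is what motivates the approximations the post surveys.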
TensorFlow as a Distributed Virtual Machine
TensorFlow has a flexible API, automatic differentiation, and the ability to run on GPUs. But the thing that’s really neat about TensorFlow is that it gives you a fairly general way to easily program across multiple computers. TensorFlow’s distributed runtime, the big bottom box in this figure... Read more
How NOT to program the TensorFlow Graph
Using TensorFlow from Python is like using Python to program another computer. Some Python statements build your TensorFlow program, some Python statements execute that program, and of course some Python statements aren’t involved with TensorFlow at all. Being thoughtful about the graphs you construct can help you avoid confusion... Read more
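The build-versus-execute split the teaser describes can be illustrated with a toy deferred-computation sketch in plain Python (this is an analogy, not TensorFlow's actual API): some statements only record operations into a graph, and nothing is computed until the graph is explicitly run.

```python
# Toy illustration of the "build, then run" split: constructing nodes
# records operations; nothing is computed until run() is called.

class Node:
    def __init__(self, fn, inputs):
        self.fn = fn
        self.inputs = inputs

    def run(self):
        # Execution phase: recursively evaluate inputs, then apply fn.
        return self.fn(*(n.run() for n in self.inputs))

def constant(value):
    return Node(lambda: value, [])

def add(a, b):
    return Node(lambda x, y: x + y, [a, b])

# "Graph construction" phase: these Python statements only build the graph.
c = add(constant(2), constant(3))

# "Execution" phase: only now does the addition actually happen.
print(c.run())  # → 5
```

Confusion arises when code mixes the two phases without realising it, e.g. expecting `c` above to already hold `5` right after construction.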
Deep Learning as the apotheosis of Test-Driven Development
Even if you aren’t interested in data science, Deep Learning is an interesting programming paradigm; you can see it as “doing test-driven development with a ludicrously large number of tests, an IDE that writes most of the code, and a forgiving client.” No wonder everybody’s pouring so much money... Read more
Transfer Learning – Machine Learning’s Next Frontier
Table of contents: What is Transfer Learning? Why Transfer Learning Now? A Definition of Transfer Learning Transfer Learning Scenarios Applications of Transfer Learning Learning from simulations Adapting to new domains Transferring knowledge across languages Transfer Learning Methods Using pre-trained CNN features Learning domain-invariant representations Making representations more similar Confusing... Read more
Faster deep learning with GPUs and Theano
Originally posted by Manojit Nandi, Data Scientist at STEALTHbits Technologies, on the Domino data science blog. Domino recently added support for GPU instances. To celebrate this release, I will show you how to: Configure the Python library Theano to use the GPU for computation. Build and train neural networks... Read more
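The GPU configuration step the post walks through boils down to setting Theano's flags, either per invocation or in a `.theanorc` file. A minimal sketch using the legacy flags from that era (`my_script.py` is a placeholder script name):

```shell
# Run a Theano script on the GPU for a single invocation.
THEANO_FLAGS='device=gpu,floatX=float32' python my_script.py
```

Equivalently, the same settings can be made persistent in `~/.theanorc` under the `[global]` section with `device = gpu` and `floatX = float32`.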
Cognitive Machine Learning (1): Learning to Explain
Above is an image of the Zaamenkomst panel: one of the best remaining exemplars of rock art from the San people of Southern Africa. As soon as you see it, you are inevitably herded, like the eland in the scene, through a series of thoughts. Does it have a meaning?  Why are the eland running?... Read more