Faster deep learning with GPUs and Theano
Originally posted by Manojit Nandi, Data Scientist at STEALTHbits Technologies, on the Domino data science blog. Domino recently added support for GPU instances. To celebrate this release, I will show you how to: configure the Python library Theano to use the GPU for computation, and build and train neural networks... Read more
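As a rough illustration of the kind of GPU setup the post walks through, here is a minimal sketch of checking whether Theano is actually running on the GPU; it assumes a CUDA-capable device and a working Theano install, and is not the post's exact code.

```python
# Minimal sketch: verify Theano is using the GPU (illustrative, not the post's code).
# Run with e.g.: THEANO_FLAGS=device=cuda,floatX=float32 python gpu_check.py
# (older Theano backends use device=gpu instead of device=cuda)
import numpy as np
import theano
import theano.tensor as T

x = T.matrix('x')                                  # symbolic input matrix
f = theano.function([x], T.exp(x))                 # compiled elementwise exp
print(f.maker.fgraph.toposort())                   # GPU ops appear here if the GPU is active
print(f(np.random.rand(1000, 1000).astype('float32')).sum())
```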
ftfy (fixes text for you) 4.4 and 5.0
ftfy is Luminoso’s open-source Unicode-fixing library for Python. Luminoso’s biggest open-source project is ConceptNet, but we also use this blog to provide updates on our other open-source projects. And among these projects, ftfy is certainly the most widely used. It solves a problem a lot of people have with... Read more
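For a sense of what ftfy does, here is a minimal usage sketch; the broken strings are illustrative examples of mojibake, not taken from the post.

```python
# Minimal sketch of ftfy in action; the mojibake strings below are illustrative.
import ftfy

print(ftfy.fix_text("schÃ¶n"))       # -> "schön"
print(ftfy.fix_text("doesnâ€™t"))    # -> "doesn’t"
```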
On word embeddings – Part 1
Table of contents: A brief history of word embeddings; Word embedding models; A note on language modelling; Classic neural language model; C&W model; Word2Vec; CBOW; Skip-gram. Word embeddings learned in an unsupervised manner have been exceptionally successful in many NLP tasks and are frequently seen as something akin to a silver bullet.... Read more
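As a quick, hedged sketch of training skip-gram embeddings of the kind the post discusses (not the post's own code), the following uses gensim; the toy corpus and hyperparameters are illustrative, and the vector_size argument assumes gensim 4.x (older versions use size).

```python
# Illustrative skip-gram training with gensim (assumes gensim >= 4.0).
from gensim.models import Word2Vec

sentences = [["the", "cat", "sat", "on", "the", "mat"],
             ["the", "dog", "sat", "on", "the", "rug"]]
model = Word2Vec(sentences, vector_size=50, window=2, sg=1, min_count=1)  # sg=1 -> skip-gram
print(model.wv.most_similar("cat", topn=3))
```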
Dropout with Theano
Almost everyone working with Deep Learning has heard at least a little about Dropout. Albeit a simple concept (introduced a couple of years ago) that sounds like a fairly obvious approach to model averaging, resulting in a more generalized and regularized Neural Net, still, when you actually get into... Read more
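A minimal sketch of the dropout idea in Theano, assuming the common inverted-dropout formulation; names such as p_drop are illustrative, and this is not the post's exact code.

```python
# Illustrative inverted dropout in Theano (not the post's exact code).
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

srng = RandomStreams(seed=42)

def dropout(layer_output, p_drop=0.5):
    # Sample a 0/1 mask and rescale so the expected activation stays the same.
    mask = srng.binomial(size=layer_output.shape, n=1, p=1.0 - p_drop,
                         dtype=theano.config.floatX)
    return layer_output * mask / (1.0 - p_drop)

x = T.matrix('x')
apply_dropout = theano.function([x], dropout(x))
```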
Background – How many cats does it take to identify a Cat? In this article, I cover the 12 types of AI problems, i.e. I address the question: in which scenarios should you use Artificial Intelligence (AI)? We cover this space in the Enterprise AI course. Some background: Recently, I conducted... Read more
Learning the Monty Hall problem
As Wikipedia gives it: Suppose you’re on a game show, and you’re given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what’s behind the doors, opens another door, say No.... Read more
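As a quick sanity check of the setup quoted above (not necessarily the approach the post itself takes), a Monte Carlo simulation shows that switching wins roughly two thirds of the time:

```python
# Quick Monte Carlo sketch of the Monty Hall problem (illustrative only).
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a goat door that is neither the contestant's pick nor the car.
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # ~0.33
print("switch:", play(switch=True))    # ~0.67
```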
(Over)Simplifying Buenos Aires
This is a very rough sketch of the city of Buenos Aires: As the sketch shows, it’s a big blob of homes (VIVIENDAs), with an office-ridden downtown to the East (OFICINAS) and a handful of satellite areas. The sketch, of course, lies. Here’s a map that’s slightly less of... Read more
How the Multinomial Logistic Regression Model Works
In the pool of supervised classification algorithms, the logistic regression model is usually the first algorithm to play with. This classification algorithm is itself split into different categories, based purely on the number of target classes. If the logistic regression model is used for addressing the binary classification kind... Read more
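As a hedged illustration of the multinomial (softmax) variant the post describes, here is a short scikit-learn sketch; the iris data simply stands in for any multi-class target, and the multi_class argument may be deprecated in newer scikit-learn releases.

```python
# Illustrative multinomial logistic regression with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(multi_class="multinomial", solver="lbfgs", max_iter=200)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy across the three iris classes
```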
How the Logistic Regression Model Works in Machine Learning
In this article, we are going to learn how the logistic regression model works in machine learning. The logistic regression model is one member of the supervised classification algorithm family. The building-block concepts of logistic regression can be helpful in deep learning when building neural networks. Logistic... Read more
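To make the building-block idea concrete, here is a minimal sigmoid sketch; the weights, bias, and input are purely illustrative numbers, not values from the post.

```python
# Illustrative sigmoid unit, the core building block of logistic regression.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([0.5, -1.2])        # illustrative learned weights
b = 0.1                          # illustrative bias term
x = np.array([2.0, 0.5])         # one feature vector
p = sigmoid(np.dot(w, x) + b)    # predicted probability of the positive class
print(p)
```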
ConceptNet 5.5 and conceptnet.io
ConceptNet is a large, multilingual knowledge graph about what words mean. This is background knowledge that’s very important in NLP and machine learning, and it remains relevant in a time when the typical thing to do is to shove a terabyte or so of text through a neural net.... Read more
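For a taste of what the knowledge graph contains, here is a minimal sketch of querying the public API at api.conceptnet.io with the requests library; the concept /c/en/knowledge is just an example term.

```python
# Illustrative query against the ConceptNet 5.5 web API.
import requests

obj = requests.get("http://api.conceptnet.io/c/en/knowledge").json()
for edge in obj["edges"][:5]:
    # Each edge links a start concept to an end concept via a relation.
    print(edge["start"]["label"], "-", edge["rel"]["label"], "->", edge["end"]["label"])
```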