(Over)Simplifying Buenos Aires
This is a very rough sketch of the city of Buenos Aires: As the sketch shows, it’s a big blob of homes (VIVIENDAs), with an office-ridden downtown to the East (OFICINAS) and a handful of satellite areas. The sketch, of course, lies. Here’s a map that’s slightly less of... Read more
How the Multinomial Logistic Regression Model Works
In the pool of supervised classification algorithms, the logistic regression model is usually the first algorithm to play with. This classification algorithm is further divided into different categories, based purely on the number of target classes. If the logistic regression model is used for addressing the binary classification kind... Read more
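The multi-class case the teaser alludes to rests on the softmax function, which generalizes the logistic function to more than two target classes. A minimal sketch (not the article's own code) with hypothetical scores for three classes:

```python
import numpy as np

def softmax(z):
    """Convert raw class scores into probabilities that sum to 1."""
    e = np.exp(z - np.max(z))  # shift by the max for numerical stability
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # hypothetical scores for 3 target classes
probs = softmax(scores)             # highest score -> highest probability
```

The predicted class is simply the index of the largest probability.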
ConceptNet 5.5 and conceptnet.io
ConceptNet is a large, multilingual knowledge graph about what words mean. This is background knowledge that’s very important in NLP and machine learning, and it remains relevant in a time when the typical thing to do is to shove a terabyte or so of text through a neural net.... Read more
How the Logistic Regression Model Works in Machine Learning
In this article, we are going to learn how the logistic regression model works in machine learning. The logistic regression model is one member of the supervised classification algorithm family. The building-block concepts of logistic regression can also be helpful in deep learning when building neural networks. Logistic... Read more
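The building block in question is the logistic (sigmoid) function, which maps any real-valued score to a probability. A minimal sketch under that assumption:

```python
import numpy as np

def sigmoid(z):
    """Squash a real-valued score into the (0, 1) range."""
    return 1.0 / (1.0 + np.exp(-z))

# A score of 0 maps to probability 0.5 -- the decision boundary
# between the two classes in binary logistic regression.
p = sigmoid(0.0)
```

Large positive scores approach probability 1, large negative scores approach 0.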
When the bootstrap doesn’t work
The bootstrap always works, except sometimes. By ‘works’ here, I mean in the weakest senses: that the large-sample bootstrap variance correctly estimates the variance of the statistic, or that the large-sample percentile bootstrap intervals have their nominal coverage. I don’t mean the stronger sense that someone like Peter Hall might... Read more
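The weaker sense of "works" can be checked directly for a case where the bootstrap does work: the sample mean, whose variance is known to be roughly s²/n. A sketch with hypothetical simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)   # hypothetical sample
B = 2000                   # number of bootstrap resamples

# Resample with replacement and recompute the statistic (here, the mean).
boot_means = np.array([
    rng.choice(x, size=x.size, replace=True).mean() for _ in range(B)
])
boot_var = boot_means.var()  # bootstrap estimate of Var(mean)

# Theory says Var(mean) is about s^2 / n, so the two should agree closely.
theory_var = x.var(ddof=1) / x.size
```

For statistics like the sample maximum, this agreement breaks down — which is the kind of failure the post is about.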
HACKER’S GUIDE TO NEURAL NETWORKS, #2
Chapter 2: Machine Learning In the last chapter we were concerned with real-valued circuits that computed possibly complex expressions of their inputs (the forward pass), and we could also compute the gradients of these expressions on the original inputs (backward pass). In this chapter we will see how useful... Read more
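The forward/backward idea from the previous chapter can be sketched for the simplest possible circuit, a single multiply gate — a minimal illustration, not the guide's own code:

```python
# Forward pass: compute the circuit's output for given inputs.
def forward(x, y):
    return x * y

# Backward pass: gradients of x*y with respect to each input.
# d(xy)/dx = y and d(xy)/dy = x.
def backward(x, y):
    return y, x

out = forward(-2.0, 3.0)      # the forward pass
dx, dy = backward(-2.0, 3.0)  # the backward pass
```

Nudging x in the direction of dx increases the output, which is the lever machine learning pulls to fit parameters.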
TensorFlow and Queues
There are many ways to implement queue data structures, and TensorFlow has some of its own. FIFO Queue with a list: In Python, a list can implement a first-in first-out (FIFO) queue, with slightly awkward syntax: >>> my_list = [] >>> my_list.insert(0, 'a') >>> my_list.insert(0, 'b') >>> my_list.insert(0, 'c')... Read more
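The teaser's snippet is cut off before the dequeue side; presumably the oldest item is popped from the end of the list, which is what makes it first-in first-out. A minimal sketch of that completion:

```python
# FIFO with a plain Python list: insert new items at the front,
# pop the oldest item from the end.
my_list = []
my_list.insert(0, 'a')
my_list.insert(0, 'b')
my_list.insert(0, 'c')

first_out = my_list.pop()  # 'a' came in first, so it leaves first
```

The awkwardness (and the O(n) cost of insert(0, ...)) is why collections.deque is the usual choice for queues in Python.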
Algorithms are Black Boxes, That is Why We Need Explainable AI
Artificial Intelligence offers a lot of advantages for organisations: it makes them better and more efficient, improves customer service with conversational AI, and reduces a wide variety of risks across industries. Although we are only at the beginning of the AI revolution that is upon us, we can already see... Read more
Deep Learning, IoT Sensor Data…and Bats!
At the very center of the Internet of Things excitement is the sensor. Not just one sensor, mind you, but a sensor that would normally just send a data stream to who knows where, and that would now have access to the information from another sensor measuring something completely different. Now... Read more
Generating data with random Gaussian noise
I recently needed to generate some data for xx, with some added Gaussian noise. This comes in handy when you want to generate data with an underlying regularity that you want to discover, for example when testing different machine learning algorithms. What I wanted to get is a mechanism... Read more
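The mechanism described can be sketched with NumPy: an underlying regularity plus additive Gaussian noise. The quadratic trend below is a hypothetical stand-in, since the teaser does not show the post's actual target function:

```python
import numpy as np

rng = np.random.default_rng(42)

x = np.linspace(0, 10, 50)
y_true = x ** 2  # hypothetical underlying regularity to be discovered

# Additive Gaussian noise: zero mean, standard deviation 5.
noise = rng.normal(loc=0.0, scale=5.0, size=x.shape)
y = y_true + noise
```

Feeding (x, y) to different regression algorithms then tests how well each recovers y_true from the noisy observations.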