Stirling Numbers, Including Negative Arguments
Stirling numbers are something like binomial coefficients. They come in two varieties, imaginatively called the first kind and second kind. Unfortunately, it is the second kind that is simpler to describe and that comes up more often in applications, so we’ll start there. Stirling numbers of the second kind... Read more
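For readers who want to compute a few values before clicking through, here is a minimal sketch (not taken from the article) of the standard recurrence for Stirling numbers of the second kind, S(n, k) = k·S(n−1, k) + S(n−1, k−1); the function name stirling2 is my own.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n: int, k: int) -> int:
    """Stirling number of the second kind: ways to partition an n-element set
    into k non-empty blocks, via S(n, k) = k*S(n-1, k) + S(n-1, k-1)."""
    if n == 0 and k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

print(stirling2(4, 2))  # 7: seven ways to split a 4-element set into 2 non-empty blocks
```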
Why NLP is a Great First AI Solution for Businesses
The world of the C-suite isn’t quite ready to embrace artificial intelligence just yet. That’s according to an oft-cited Accenture study from late 2016, which found that less than half of the 1,700 business leaders interviewed in a global sample would feel comfortable trusting the advice of AI “in... Read more
A Different Use of Time Series to Identify Seasonal Customers
I had previously written about creatively leveraging your data using segmentation to learn about a customer base. The article is here. In that article I mentioned utilizing any data that might be relevant, and identifying customers with seasonal usage patterns was one of the ideas that sounded interesting. ... Read more
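As a rough illustration of the idea, and not the method from the article, one way to flag seasonal customers is to decompose each customer's monthly usage and measure how much of the variation the seasonal component explains. The helper below and the 0.6 threshold are assumptions made for this sketch.

```python
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

def seasonality_strength(usage: pd.Series, period: int = 12) -> float:
    """Rough 'strength of seasonality' in [0, 1]: 1 - Var(resid) / Var(seasonal + resid)."""
    decomp = seasonal_decompose(usage, model="additive", period=period)
    resid = decomp.resid.dropna()
    seasonal_plus_resid = (decomp.seasonal + decomp.resid).dropna()
    return max(0.0, 1.0 - resid.var() / seasonal_plus_resid.var())

# usage_by_customer: dict of customer_id -> monthly usage Series (assumed to exist)
# seasonal_ids = [cid for cid, s in usage_by_customer.items()
#                 if len(s) >= 24 and seasonality_strength(s) > 0.6]
```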
Fixed Points of Logistic Function
Here’s an interesting problem that came out of a logistic regression application. The input variable was between 0 and 1, and someone asked when and where the logistic transformation f(x) = 1/(1 + exp(a + bx)) has a fixed point, i.e. f(x) = x. So given logistic regression parameters a and b, when does the logistic curve... Read more
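A quick numerical check is easy before reading on: f maps [0, 1] into (0, 1), so g(x) = f(x) − x is positive at 0 and negative at 1, and a root finder can always bracket a fixed point there. A minimal SciPy sketch follows; the example values of a and b are arbitrary, and brentq returns just one root, while the post asks more generally when and where fixed points occur.

```python
import math
from scipy.optimize import brentq

def logistic_fixed_point(a: float, b: float) -> float:
    """Return a fixed point of f(x) = 1 / (1 + exp(a + b*x)) on [0, 1]."""
    f = lambda x: 1.0 / (1.0 + math.exp(a + b * x))
    # f(0) > 0 and f(1) < 1, so f(x) - x changes sign on [0, 1].
    return brentq(lambda x: f(x) - x, 0.0, 1.0)

print(logistic_fixed_point(a=1.0, b=-3.0))  # one fixed point in (0, 1)
```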
Missing the Point About Microservices – It’s About Testing and Deploying Independently
Ok, so I have to first preface this whole blog post with a few things: I really struggle with the term microservices. I can’t put my finger on exactly why. Maybe because the term is hopelessly ill-defined, maybe because it’s gotten picked up by the hype train. Whatever. But I... Read more
Relative Error in the Central Limit Theorem
If you average a large number of independent versions of the same random variable, the central limit theorem says the average will be approximately normal. That is, the absolute error in approximating the density of the average by the density of a normal random variable will be small. (Terms and conditions apply.... Read more
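For reference, this is the standard statement the excerpt alludes to: for i.i.d. variables with mean μ and finite variance σ², the standardized average converges in distribution to a standard normal.

```latex
\[
  \frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}} \;\xrightarrow{\;d\;}\; N(0, 1),
  \qquad \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i .
\]
```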
Why are Convnets Often Better Than the Rest? Part I
Introduction: In this series, I will explore convolutional neural networks in comparison to standard neural networks. To begin with, the former is an evolution of the latter. As this evolution unfolds, it is fascinating to see how particular design differences have such a great impact on performance and overall... Read more
Introduction to Machine Learning for Non-Developers
About Machine Learning: We all know that machine learning is about handling data, but it can also be seen as the art of finding order in data by exploring the information it contains. Some background on predictive models: There are several types of predictive models. These models usually have several... Read more
Quantifying Uncertainty with Bayesian Statistics
Whenever we’re working with data, there is necessarily uncertainty in our results. Firstly, we can’t collect all the possible data, so instead we randomly sample from a population. Accordingly, there is a natural variance and uncertainty in any data we collect. There is also uncertainty from missing data, systematic... Read more
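As a small, hypothetical illustration of the theme (not the article's example): with a uniform Beta(1, 1) prior and k successes in n trials, the posterior for an unknown rate is Beta(1 + k, 1 + n − k), and its spread quantifies the uncertainty left after seeing the data.

```python
from scipy.stats import beta

k, n = 27, 200                      # observed successes and trials (made-up numbers)
posterior = beta(1 + k, 1 + n - k)  # Beta posterior over the unknown rate

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```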
Reel Reviews: Neural Networks for Sentiment Analysis
This is a joint article by Kannan Sankaran and Win Suen. The Problem: Over the past few years, there has been burgeoning interest in neural networks from data science and engineering communities. The advent of ever larger datasets, efficient commodity hardware, and powerful open source libraries... Read more