New Approaches Apply Deep Learning to Recommender Systems
Oliver Gindele is Head of Machine Learning at Datatonic. At ODSC Europe 2018, he spoke about how to apply deep learning techniques to recommender systems. He discussed how data scientists can implement some of these novel models in the TensorFlow framework, starting from a collaborative filtering approach and extending... Read more
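To make that starting point concrete, here is a minimal matrix-factorization sketch of collaborative filtering in plain Python. The ratings, latent dimension, and hyperparameters are illustrative placeholders, not taken from Gindele's talk; the deep-learning models he describes extend this same idea in TensorFlow.

```python
import random

# Illustrative ratings: (user, item) -> rating. Not from the talk.
ratings = {(0, 0): 5.0, (0, 1): 3.0, (1, 0): 4.0, (2, 1): 1.0}
n_users, n_items, k = 3, 2, 2   # k = number of latent factors

random.seed(0)
U = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_users)]
V = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_items)]

def predict(user, item):
    # Dot product of the user and item latent-factor vectors.
    return sum(U[user][f] * V[item][f] for f in range(k))

lr, reg = 0.05, 0.01            # learning rate and L2 regularization
for _ in range(500):
    for (u, i), r in ratings.items():
        err = r - predict(u, i)
        for f in range(k):
            u_f = U[u][f]       # cache so both updates use the old value
            U[u][f] += lr * (err * V[i][f] - reg * u_f)
            V[i][f] += lr * (err * u_f - reg * V[i][f])
```

After training, `predict` reproduces the observed ratings closely; a neural recommender replaces the plain dot product with learned embedding layers and deeper interactions.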
Layer-wise Relevance Propagation Means More Interpretable Deep Learning
Wojciech Samek is head of machine learning at the Fraunhofer Heinrich Hertz Institute. At ODSC Europe 2018, he spoke about an active area of research in deep learning: interpretability. Samek opened his lecture with the following preface on the rising importance of interpretability in deep learning models: “In the last number of... Read more
At NVIDIA, Deep Learning Gets Deeper
Alison Lowndes and her team at NVIDIA are finding exciting new ways to handle the demands deep learning places on machines — and even more exciting ways to use the new technology. At ODSC London 2018, Lowndes gave a talk on how advances in graphics processing... Read more
How to Play Fantasy Sports Strategically (and Win)
Daily Fantasy Sports is a multibillion-dollar industry with millions of annual users. Imperial College Business School’s Martin Haugh created a framework to best those users by modeling their behavior and constructing a team accordingly. Haugh presented his research on how to play fantasy sports strategically... Read more
Mail Processing with Deep Learning: A Case Study
Businesses increasingly delegate simple, boring, and repetitive tasks to artificial intelligence. In a case study, Alexandre Hubert — lead data scientist of software company Dataiku’s U.K. operations — worked on a team of three to automate mail processing with deep learning. At ODSC Europe 2018, Hubert detailed how his team... Read more
Understanding the 3 Primary Types of Gradient Descent
Gradient descent is the most commonly used optimization method in machine learning and deep learning algorithms. It’s used to train a machine learning model, typically by minimizing a convex loss function. Through an iterative process, gradient descent refines a set of parameters using partial derivatives, or... Read more
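To illustrate the distinction behind the article's title, here is a toy sketch (not from the article) of the three variants — batch, stochastic, and mini-batch gradient descent — fitting a one-parameter least-squares model; the data and hyperparameters are made up for illustration.

```python
# Toy data for y = w * x with true w = 2.0 (illustrative only).
data = [(x, 2.0 * x) for x in range(1, 6)]

def grad(w, batch):
    # Partial derivative of mean squared error with respect to w.
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def train(batches_fn, epochs=100, lr=0.01):
    w = 0.0
    for _ in range(epochs):
        for batch in batches_fn():
            w -= lr * grad(w, batch)
    return w

batch_gd      = train(lambda: [data])               # all points per step
stochastic_gd = train(lambda: [[p] for p in data])  # one point per step
minibatch_gd  = train(lambda: [data[:2], data[2:]]) # small chunks per step
```

All three converge to w ≈ 2.0 here; they differ in how much data each update sees, trading gradient accuracy against update frequency and cost.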
Unsupervised Learning: Evaluating Clusters
K-means clustering is a partitioning approach to unsupervised statistical learning, unlike agglomerative approaches such as hierarchical clustering. A partitioning approach starts with all data points and tries to divide them into a fixed number of clusters. K-means is applied to a set of quantitative variables. We fix... Read more
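As a minimal sketch of the alternating assignment/update loop behind k-means — with within-cluster sum of squares as one common evaluation metric — here is a toy 1-D example; the data points and k are illustrative, not from the article.

```python
import random

# Illustrative 1-D data with two obvious groups.
points = [1.0, 1.2, 0.8, 8.0, 8.2, 7.8]
k = 2

random.seed(1)
centers = random.sample(points, k)   # initialize centers from the data
for _ in range(10):
    # Assignment step: each point joins its nearest center's cluster.
    clusters = [[] for _ in range(k)]
    for p in points:
        nearest = min(range(k), key=lambda c: (p - centers[c]) ** 2)
        clusters[nearest].append(p)
    # Update step: each center moves to its cluster's mean.
    centers = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]

# Within-cluster sum of squares: a standard way to evaluate the clusters
# (lower is tighter; comparing it across k underlies the "elbow" method).
wcss = sum((p - centers[c]) ** 2
           for c, cluster in enumerate(clusters) for p in cluster)
```

Here the centers converge to roughly 1.0 and 8.0, the two group means.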
Alexandru Agachi of Empiric Capital on “Handling Missing Data in Python/Pandas” at ODSC Europe 2018
Key Takeaways: It’s important to describe missing data and the challenges it poses. The field’s confusing terminology needs clarifying, as it adds further complexity. You should take the time to review methods for handling missing data, and learn how to apply robust multiple imputation... Read more
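As a baseline to contrast with the multiple-imputation methods Agachi covers, here is a sketch of simple single mean imputation in plain Python; the column values are made up, and in pandas the equivalent one-liner is `df["col"].fillna(df["col"].mean())`.

```python
from statistics import mean

# Illustrative column; None marks a missing value.
column = [2.0, None, 4.0, None, 6.0]

# Single mean imputation: replace every missing value with the mean of
# the observed values. Simple, but it understates the variance — which
# is what multiple imputation is designed to fix.
observed = [v for v in column if v is not None]
fill = mean(observed)
imputed = [fill if v is None else v for v in column]
```

Multiple imputation instead draws several plausible fill-in values per gap and pools the results, preserving the uncertainty that a single fixed fill value hides.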
Thomas Wiecki of Quantopian on ‘Minding the Gap’ Between Statistics and Machine Learning at ODSC Europe 2018
Key Takeaways: It’s important for data scientists to understand the so-called “gap” between statistics and machine learning, and how there actually is a lot of commonality between the two; it’s just a matter of how you look at things. PyMC3 is a very useful probabilistic programming framework for Python... Read more
Active Learning: Your Model’s New Personal Trainer
First, some facts. Fact: active learning is not just another name for reinforcement learning; active learning is not a model; and no, active learning is not deep learning. What active learning is and why it may be an important component of your next machine learning project was the subject... Read more
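To make the "what active learning is" part concrete, here is a minimal sketch of uncertainty sampling, one common active-learning query strategy: the model picks the unlabeled point it is least sure about and asks a human to label it. The stand-in classifier and pool of unlabeled points below are hypothetical, not from the talk.

```python
import math

def predict_proba(x):
    # Stand-in for any probabilistic classifier's P(class = 1 | x);
    # here, a fixed logistic curve for illustration.
    return 1.0 / (1.0 + math.exp(-x))

unlabeled = [-3.0, -0.1, 2.5, 0.4]   # hypothetical unlabeled pool

def certainty_gap(x):
    # Distance from the 0.5 decision boundary; smaller = less certain.
    return abs(predict_proba(x) - 0.5)

# Query the point the model is least sure about, then send it to a
# human annotator; the labeled point is added to the training set.
query = min(unlabeled, key=certainty_gap)
```

Here the query is -0.1, the point closest to the decision boundary — labeling it teaches the model more than labeling a point it already classifies confidently.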