Word2Vec – the world of word vectors
Have you ever wondered how a chatbot can learn the meaning of words in a text? Does that sound interesting? In this blog post we will describe a very powerful method, Word2Vec, which maps words to numbers (vectors) in order to capture and distinguish their meanings. We will briefly describe how Word2Vec... Read more
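To make the teaser's core idea concrete: once words are mapped to vectors, "similarity of meaning" becomes simple geometry. This is a minimal sketch using tiny hand-picked toy vectors (a trained Word2Vec model would learn hundreds of dimensions from a corpus; these values are purely illustrative):

```python
import numpy as np

# Toy 3-dimensional "embeddings" — hand-picked for illustration only,
# not learned the way Word2Vec would learn them from text.
vectors = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "dog": np.array([0.8, 0.9, 0.2]),
    "car": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: close to 1.0 for vectors pointing the same way,
    # close to 0.0 for unrelated directions.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["cat"], vectors["dog"]))  # high: related meanings
print(cosine(vectors["cat"], vectors["car"]))  # lower: unrelated meanings
```

The same cosine comparison is what libraries such as gensim perform under the hood when you ask a trained model for a word's nearest neighbours.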
Word embeddings in 2017: Trends and future directions
Table of contents: Subword-level embeddings; OOV handling; Evaluation; Multi-sense embeddings; Beyond words as points; Phrases and multi-word expressions; Bias; Temporal dimension; Lack of theoretical understanding; Task and domain-specific embeddings; Embeddings for multiple languages; Embeddings based on other contexts. The word2vec method based on skip-gram with negative... Read more
See first-hand how you can bring an NLP application to life. Last week I introduced No Jitter readers to Natural Language Processing (NLP) and Facebook’s free NLP service, wit.ai. I wrote about intents and entities, and how they work together to convert human language into actionable... Read more
This post is the first of a two-part series in which we apply NLP techniques to analyze articles about big data, data science, and AI. If you are tired of the hassles of web scraping, then this post might be just for you. I occasionally web scrape news... Read more
Multi-Task Learning Objectives for Natural Language Processing
In a previous blog post, I discussed how multi-task learning (MTL) can be used to improve the performance of a model by leveraging a related task. Multi-task learning consists of two main components: a) The architecture used for learning and b) the auxiliary task(s) that are trained... Read more
Enhancing Customer Experience with Natural Language Processing
Processing language into actionable components is the future of communication. If you talk to a man in a language he understands, that goes to his head. If you talk to him in his language, that goes to his heart. — Nelson Mandela I would venture to... Read more
Word Vectors and SAT Analogies
This reminded me of SAT analogy questions, which disappeared from the SAT in 2005, but looked like this: PALTRY : SIGNIFICANCE :: A. redundant : discussion B. austere : landscape C. opulent : wealth D. oblique : familiarity E. banal : originality The king/queen example is not difficult, and... Read more
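The king/queen example the teaser alludes to is usually written as vector("king") - vector("man") + vector("woman") ≈ vector("queen"). A minimal sketch of that arithmetic, using hand-picked two-dimensional toy vectors (axes chosen here as rough "royalty" and "gender" directions for illustration; real embeddings are learned, not designed):

```python
import numpy as np

# Toy vectors along two illustrative axes: [royalty, gender].
v = {
    "king":  np.array([0.95,  0.90]),
    "queen": np.array([0.95, -0.90]),
    "man":   np.array([0.10,  0.90]),
    "woman": np.array([0.10, -0.90]),
    "car":   np.array([0.05,  0.10]),  # an unrelated distractor word
}

# Shift the gender component while keeping the royalty component.
target = v["king"] - v["man"] + v["woman"]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Nearest word to the analogy result, excluding the three input words.
best = max((w for w in v if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(v[w], target))
print(best)  # → queen
```

The harder SAT-style analogies (PALTRY : SIGNIFICANCE :: ...) stress this same offset arithmetic with much subtler semantic relations, which is what makes them an interesting test for word vectors.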
Practical Naive Bayes — Classification of Amazon Reviews
If you search around the internet looking for applying Naive Bayes classification on text, you’ll find a ton of articles that talk about the intuition behind the algorithm, maybe some slides from a lecture about the math and some notation behind it, and a bunch of articles I’m... Read more
In my last post, I did some natural language processing and sentiment analysis for Jane Austen’s most well-known novel, Pride and Prejudice. It was just so much fun that I wanted to extend some of that work and compare across her body of writing. I decided... Read more
Seven Python Kernels from Kaggle You Need to See Right Now
The ability to post and share kernels is probably my favorite thing about Kaggle. Learning from other users’ kernels has often provided inspiration for a number of my own projects. I also appreciate the attention to detail and descriptions provided by some users in their code... Read more