Deciphering the Neural Language Model
Recently, I have been working on the Neural Networks for Machine Learning course offered by Coursera and taught by Geoffrey Hinton. Overall, it is a nice course and provides an introduction to some of the modern topics in deep learning. However, there are instances where the student... Read more
Why the most influential business AIs will look like spellcheckers (and a toy example of how to build one)
Forget voice-controlled assistants. At work, AIs will turn everybody into functional cyborgs through squiggly red lines under everything you type. Let's look at a toy example I just built (mostly to play with deep learning along the way). As a data set, I chose Patrick Martinchek's... Read more
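The post's own model isn't reproduced in this excerpt, but the "squiggly line" idea reduces to flagging any token a language model finds unlikely in context. A minimal sketch under toy assumptions (the bigram model and the threshold below are illustrative, not the post's method):

```python
from collections import Counter

# Toy corpus standing in for real training text; the post trains on
# Patrick Martinchek's data set instead.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count unigrams and bigrams to estimate P(word | previous word).
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def underline_unlikely(tokens, threshold=0.1):
    """Pair each token with a flag: True means 'draw a squiggly red line'."""
    result = [(tokens[0], False)]  # no context for the first token
    for prev, word in zip(tokens, tokens[1:]):
        p = bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0
        result.append((word, p < threshold))
    return result

print(underline_unlikely("the mat sat on the dog".split()))
# [('the', False), ('mat', False), ('sat', True), ('on', False), ('the', False), ('dog', False)]
```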
A survey of cross-lingual embedding models
In past blog posts, we discussed different models, objective functions, and hyperparameter choices that allow us to learn accurate word embeddings. However, these models are generally restricted to capturing representations of words in the language they were trained on. The availability of resources, training data, and... Read more
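One family of methods such surveys cover learns a linear map between two monolingual embedding spaces from a seed translation dictionary, in the spirit of Mikolov et al. (2013). A toy numpy sketch of that idea (the three-dimensional vectors and word pairs are made up for illustration; real systems use pre-trained embeddings and thousands of pairs):

```python
import numpy as np

# Hypothetical seed dictionary of translation pairs with toy 3-d embeddings.
src = np.array([[0.9, 0.1, 0.0],   # "dog"
                [0.8, 0.2, 0.1],   # "cat"
                [0.0, 0.9, 0.3]])  # "house"
tgt = np.array([[0.1, 0.9, 0.0],   # "Hund"
                [0.2, 0.8, 0.1],   # "Katze"
                [0.9, 0.0, 0.3]])  # "Haus"

# Learn a linear map W minimising ||src @ W - tgt||^2 (least squares).
W, *_ = np.linalg.lstsq(src, tgt, rcond=None)

# Project a source-language vector into the target space and find its
# nearest neighbour among target embeddings by cosine similarity.
query = src[0] @ W
sims = tgt @ query / (np.linalg.norm(tgt, axis=1) * np.linalg.norm(query))
print("nearest target index:", int(np.argmax(sims)))  # expect 0 ("Hund")
```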
You Must Allow Me To Tell You How Ardently I Admire and Love Natural Language Processing
It is a truth universally acknowledged that sentiment analysis is super fun, and Pride and Prejudice is probably my very favorite book in all of literature, so let’s do some Jane Austen natural language processing. Project Gutenberg makes e-texts available for many, many books, including Pride... Read more
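The post's own tooling isn't shown in this excerpt; as a quick Python counterpart, NLTK's VADER analyzer can score individual sentences. A minimal sketch, with two famous lines as illustrative inputs (a full analysis would load the Project Gutenberg e-text):

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time lexicon download

sia = SentimentIntensityAnalyzer()
sentences = [
    "It is a truth universally acknowledged, that a single man in "
    "possession of a good fortune, must be in want of a wife.",
    "You are the last man in the world whom I could ever be "
    "prevailed on to marry.",
]
for s in sentences:
    # 'compound' aggregates positive and negative cues into [-1, 1].
    print(round(sia.polarity_scores(s)["compound"], 3), "|", s[:50])
```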
NLP with NLTK – Part 2
Hello all, and welcome to the second post in the series – NLP with NLTK. The first post in the series can be found here, in case you have missed it. In this article we will talk about basic NLP concepts and use NLTK to implement them. Contents: Corpus... Read more
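As a preview of the sort of basics such an article covers, here is a short, self-contained NLTK sketch (the sample sentence is made up): tokenise, drop stop words, and count frequencies.

```python
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

nltk.download("punkt")      # tokenizer models
nltk.download("stopwords")  # stop-word lists

text = "NLTK makes basic NLP tasks like tokenisation quick to try out."
tokens = word_tokenize(text.lower())

# Drop stop words and punctuation, then count what remains.
words = [t for t in tokens
         if t.isalpha() and t not in stopwords.words("english")]
print(nltk.FreqDist(words).most_common(3))
```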
On word embeddings – Part 2: Approximating the Softmax
Table of contents: Softmax-based Approaches (Hierarchical Softmax, Differentiated Softmax, CNN-Softmax); Sampling-based Approaches (Importance Sampling, Adaptive Importance Sampling, Target Sampling, Noise Contrastive Estimation, Negative Sampling, Self-Normalisation, Infrequent Normalisation); Other Approaches; Which Approach to Choose?; Conclusion. This is the second post in a series on word embeddings and... Read more
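To make one of these approaches concrete: negative sampling replaces the full softmax with one binary logistic loss for the observed context word plus k losses for sampled noise words. A toy numpy sketch of that loss, with random vectors standing in for learned embeddings:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def negative_sampling_loss(center, context, negatives):
    """Skip-gram negative-sampling loss:
    -log sigma(v_ctx . v_c) - sum_i log sigma(-v_neg_i . v_c)."""
    loss = -np.log(sigmoid(context @ center))
    loss -= sum(np.log(sigmoid(-neg @ center)) for neg in negatives)
    return loss

rng = np.random.default_rng(0)
d, k = 50, 5                         # embedding size, number of noise samples
center = rng.normal(size=d)          # input vector of the centre word
context = rng.normal(size=d)         # output vector of the true context word
negatives = rng.normal(size=(k, d))  # in practice: drawn from unigram^(3/4)
print(negative_sampling_loss(center, context, negatives))
```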
ftfy (fixes text for you) 4.4 and 5.0
ftfy is Luminoso’s open-source Unicode-fixing library for Python. Luminoso’s biggest open-source project is ConceptNet, but we also use this blog to provide updates on our other open-source projects. And among these projects, ftfy is certainly the most widely used. It solves a problem a lot of... Read more
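Its core API is a single call, ftfy.fix_text; a minimal example of the kind of mojibake it repairs (UTF-8 text that was decoded as Windows-1252 somewhere upstream):

```python
import ftfy

broken = "The Mona Lisa doesnâ€™t have eyebrows."
print(ftfy.fix_text(broken))
# The Mona Lisa doesn't have eyebrows.  (with a proper curly apostrophe)
```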
On word embeddings – Part 1
Table of contents: A brief history of word embeddings; Word embedding models; A note on language modelling; Classic neural language model; C&W model; Word2Vec (CBOW, Skip-gram). Word embeddings learned in an unsupervised fashion have been exceptionally successful in many NLP tasks and are frequently seen as something akin to... Read more
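The post derives these models rather than training them; for a hands-on counterpart, gensim implements both CBOW and skip-gram. A toy sketch (the corpus below is far too small for meaningful vectors; sg=1 selects skip-gram, sg=0 would give CBOW):

```python
from gensim.models import Word2Vec

# Tiny made-up corpus; real embeddings need millions of tokens.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["dogs", "and", "cats", "are", "pets"],
]

model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)
print(model.wv.most_similar("cat", topn=2))  # noisy at this corpus size
```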
Implementing a CNN for Text Classification in Tensorflow
The full code is available on GitHub. In this post we will implement a model similar to Yoon Kim's Convolutional Neural Networks for Sentence Classification. The model presented in the paper achieves good classification performance across a range of text classification tasks (like Sentiment Analysis) and... Read more
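The post builds the network in TensorFlow directly; a compact Keras sketch of the same overall shape (embedding, parallel convolutions over several filter sizes, max-over-time pooling, dropout, softmax) might look like this, with illustrative hyperparameters rather than the post's exact ones:

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, seq_len, embed_dim = 10000, 56, 128  # illustrative sizes

inputs = layers.Input(shape=(seq_len,), dtype="int32")
x = layers.Embedding(vocab_size, embed_dim)(inputs)

# Parallel convolutions with filter sizes 3, 4, 5, as in Kim (2014),
# each followed by max-over-time pooling.
pooled = []
for size in (3, 4, 5):
    conv = layers.Conv1D(100, size, activation="relu")(x)
    pooled.append(layers.GlobalMaxPooling1D()(conv))

x = layers.Concatenate()(pooled)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(2, activation="softmax")(x)  # binary sentiment

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```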
Attention and Memory in Deep Learning and NLP
A recent trend in Deep Learning is Attention Mechanisms. In an interview, Ilya Sutskever, now the research director of OpenAI, mentioned that Attention Mechanisms are one of the most exciting advancements, and that they are here to stay. That sounds exciting. But what are Attention Mechanisms?... Read more
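Attention comes in several variants; the simplest to write down is dot-product attention, which weights each memory entry by the softmax of its score against a query. A minimal numpy sketch (random vectors as stand-ins; this is one common formulation, not necessarily the one the post focuses on):

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention: softmax(K q / sqrt(d)) . V."""
    scores = keys @ query / np.sqrt(query.shape[-1])
    weights = np.exp(scores - scores.max())  # stable softmax
    weights /= weights.sum()
    return weights @ values, weights

rng = np.random.default_rng(0)
d, n = 8, 5                      # dimensionality, number of memory slots
query = rng.normal(size=d)       # e.g. current decoder state
keys = rng.normal(size=(n, d))   # e.g. encoder hidden states
values = keys                    # keys and values often coincide in NLP

context, weights = attention(query, keys, values)
print("attention weights:", np.round(weights, 3))  # sums to 1
```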