Search results for "natural language processing"

137 results found
The Most Exciting Natural Language Processing Research of 2019 So Far
The data revolution isn’t just about numbers: researchers are teaching machines how to process natural language as data. The evolving capacity of machines to interpret human language, whether written or spoken, opens new possibilities for interaction between computers and people. Below, we have highlighted some of the...
The Promise of Retrofitting: Building Better Models for Natural Language Processing
Editor’s note: Catherine is a speaker for the upcoming ODSC East 2019 this April 30-May 3! Be sure to check out her talk, “Adding Context and Cognition to Modern NLP Techniques.” OpenAI’s Andrej Karpathy famously said, “I don’t have to actually experience crashing my car into a wall a few...
Word Embedding and Natural Language Processing
Editor’s note: Check out Mayank’s talk at ODSC East 2019 this April 30 to May 3 in Boston, “Let’s Embed Everything!” Researchers who work with the nuts and bolts of deep neural networks (or even shallow neural networks, like skip-gram) know that the success of these methods can be attributed in no small...
Introduction to Clinical Natural Language Processing
Andrew is a speaker for ODSC East 2019! Be sure to check out his talk, “Introduction to Natural Language Processing in Healthcare,” this May in Boston! Doctors have always written clinical notes about their patients — originally, the notes were on paper and were locked away in a cabinet. Fortunately for...
An Introduction to Natural Language Processing (NLP)
Everything we express (either verbally or in writing) carries huge amounts of information. The topic we choose, our tone, our selection of words — everything adds some type of information that can be interpreted, and from which value can be extracted. In theory, we can understand and even predict human behavior...
Why Word Vectors Make Sense in Natural Language Processing
If you’re up to date with progress in natural language processing research, you’ve probably heard of the word vectors produced by word2vec. Word2vec is a neural network configuration that ingests sentences to learn word embeddings: vectors of continuous numbers representing individual words. The neural network accepts a word, which is first mapped to a one-hot vector...
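The one-hot step described in that teaser can be sketched in a few lines of NumPy. The vocabulary, dimensions, and embedding values below are made up for illustration; in real word2vec the embedding matrix is learned from a corpus rather than drawn at random.

```python
import numpy as np

# Hypothetical toy vocabulary; real word2vec vocabularies come from a corpus.
vocab = ["king", "queen", "man", "woman"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

embedding_dim = 3
rng = np.random.default_rng(0)
# The embedding matrix is what word2vec learns during training;
# here it is random purely for illustration.
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def one_hot(word):
    """Map a word to its one-hot vector over the vocabulary."""
    v = np.zeros(len(vocab))
    v[word_to_idx[word]] = 1.0
    return v

# Multiplying a one-hot vector by the embedding matrix simply
# selects that word's row: its word vector.
vec = one_hot("queen") @ embeddings
assert np.allclose(vec, embeddings[word_to_idx["queen"]])
```

The assertion makes the point of the teaser concrete: the "input layer" of word2vec is just a table lookup expressed as a matrix product.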
An Idiot’s Guide to Word2vec Natural Language Processing
Word2vec is arguably the most famous face of the neural network revolution in natural language processing. Word2vec provides direct access to vector representations of words, which can help achieve decent performance across a variety of tasks machines have historically struggled with. For a quick examination of how word vectors work, check...
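What "direct access to vector representations" buys you is that word similarity becomes simple geometry, typically cosine similarity between vectors. The vectors below are toy values standing in for learned word2vec embeddings, chosen so related words point in similar directions:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy (made-up) vectors standing in for learned word embeddings.
cat = np.array([0.90, 0.80, 0.10])
dog = np.array([0.85, 0.75, 0.20])
car = np.array([0.10, 0.20, 0.95])

sim_cat_dog = cosine_similarity(cat, dog)
sim_cat_car = cosine_similarity(cat, car)
# With learned embeddings, semantically related words score higher.
assert sim_cat_dog > sim_cat_car
```

The same comparison underlies nearest-neighbor queries on real embeddings, where "most similar word" means "vector with the highest cosine similarity."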
Tracking the Progress in Natural Language Processing
This post introduces a resource to track the progress and state-of-the-art across many tasks in NLP. Go directly to the document tracking the progress in NLP. Research in machine learning and in natural language processing (NLP) is moving so fast these days, it is hard to keep up. This is...