

Vector Space Word Representations – Rani Nelken ODSC Boston 2015
Conferences | Modeling | NLP/Text Analytics | ODSC East 2015 | Speaker Slides — posted by Open Data Science, December 10, 2014

NLP has traditionally mapped words to discrete elements without underlying structure. Recent research replaces these models with vector-based representations, efficiently learned using neural networks. The resulting embeddings not only improve performance on a variety of tasks, but also show surprising algebraic structure. I will give a gentle introduction to these exciting developments.
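The "surprising algebraic structure" the abstract alludes to is the well-known analogy property of word embeddings: vector arithmetic such as vec("king") − vec("man") + vec("woman") lands near vec("queen"). A minimal sketch of that operation, using tiny hand-crafted toy vectors purely for illustration (real embeddings are learned from large corpora and have hundreds of dimensions):

```python
import numpy as np

# Toy 4-dimensional "embeddings", hand-crafted only to illustrate the
# arithmetic; they are NOT learned vectors from any real model.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.1, 0.9, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
    "apple": np.array([0.0, 0.0, 0.0, 1.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Return the word closest to vec(a) - vec(b) + vec(c),
    excluding the query words themselves."""
    target = vectors[a] - vectors[b] + vectors[c]
    candidates = [w for w in vectors if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(analogy("king", "man", "woman"))  # -> queen
```

With learned embeddings (e.g. word2vec's), libraries such as gensim expose the same operation via `KeyedVectors.most_similar(positive=..., negative=...)`.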
Rani Nelken is Director of Research at Outbrain, where he leads a research team focusing on the advanced algorithms behind the company’s recommendation technologies. Prior to that he was a research fellow at Harvard University and worked at IBM Research and several startups. He received his PhD in CS from the Technion in 2001.