
Vector Space Word Representations – Rani Nelken ODSC Boston 2015

NLP has traditionally mapped words to discrete elements without underlying structure. Recent research replaces these models with vector-based representations, efficiently learned using neural networks. The resulting embeddings not only improve performance on a variety of tasks, but also show surprising algebraic structure. I will give a gentle introduction to these exciting developments.
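The "algebraic structure" of these embeddings refers to regularities such as vector("king") − vector("man") + vector("woman") landing near vector("queen"). As a rough illustration of that idea (not code from the talk), the sketch below uses the gensim library and a small pretrained GloVe model to perform this kind of vector arithmetic; the specific model name and library choice are assumptions made for this example.

```python
# Minimal sketch of word-vector arithmetic, assuming the gensim library is
# installed and its downloader can fetch a small pretrained GloVe model.
import gensim.downloader as api

# Load 50-dimensional GloVe vectors (model name is illustrative; any
# pretrained word embedding exposing most_similar would work similarly).
vectors = api.load("glove-wiki-gigaword-50")

# The classic analogy: king - man + woman ≈ queen.
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # e.g. [('queen', 0.85...)]
```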

Rani Nelken is Director of Research at Outbrain, where he leads a research team focused on the advanced algorithms behind the company’s recommendation technologies. Prior to that, he was a research fellow at Harvard University and worked at IBM Research and at several startups. He received his PhD in Computer Science from the Technion in 2001.
