
Vector Space Word Representations – Rani Nelken ODSC Boston 2015

NLP has traditionally mapped words to discrete elements without underlying structure. Recent research replaces these models with vector-based representations, efficiently learned using neural networks. The resulting embeddings not only improve performance on a variety of tasks, but also show surprising algebraic structure. I will give a gentle introduction to these exciting developments.
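The "algebraic structure" mentioned above is the now-familiar analogy arithmetic of word embeddings (e.g. vector("king") - vector("man") + vector("woman") lands near vector("queen")). As a minimal illustrative sketch, not taken from the talk, here is how one might probe that structure with gensim; the specific pretrained embedding name is an assumption, and any trained word vectors would behave analogously.

# Illustrative sketch: analogy arithmetic on pretrained word vectors.
# The embedding set named below is an assumption for the example.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")  # returns a KeyedVectors object

# king - man + woman ~= queen: the vector offset encoding "royalty"
# composes with the offset encoding gender.
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # e.g. [('queen', ...)]; exact scores depend on the embeddings used

The same most_similar call works with any gensim KeyedVectors, whether loaded from a downloaded model or trained locally with Word2Vec.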

Rani Nelken is Director of Research at Outbrain, where he leads a research team focusing on the advanced algorithms behind the company's recommendation technologies. Prior to that, he was a research fellow at Harvard University and worked at IBM Research and several startups. He received his PhD in Computer Science from the Technion in 2001.
