NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
I like this paper and want to see it in the conference. It provides a valuable new perspective on the theoretical properties of embedding spaces and how they relate to word distributions. I read the low-scoring review as reflecting a lack of interest on the reviewer's part rather than a problem with the quality of the contribution. There has been a substantial literature on theoretical approaches to vector embedding models since Turney and Pantel (2010), and I believe this work is a solid and valuable addition. R3's concern about cosine versus Euclidean distance is real and should be addressed in some fashion, but it is not a game changer. Personally, I would speculate that most vectors are of similar length, apart from a few extremely frequent words, so cosine and Euclidean distance usually do not differ much.
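As a rough justification for that speculation (my own note, not a claim from the paper): when two vectors have equal norm $r$, squared Euclidean distance is a monotone function of cosine similarity, so nearest-neighbor rankings under the two measures coincide exactly in the equal-norm case and approximately when norms are merely similar:
$$
\|u - v\|^2 \;=\; \|u\|^2 + \|v\|^2 - 2\,u\cdot v \;=\; 2r^2\bigl(1 - \cos(u, v)\bigr) \quad \text{if } \|u\| = \|v\| = r.
$$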