New ask Hacker News story: Orthogonal Representations in NLP
Orthogonal Representations in NLP
2 by soeboy | 0 comments on Hacker News.
Let's say I have a word whose "sentiment" I want to retrieve, say "cow." A cow is an animal, so in a given model (e.g. word2vec) the representations of "animal" and "cow" will obviously be correlated. I was wondering: how can I extract the orthogonal component of "cow" that is not driven by "animal" and capture its sentiment?
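One way to interpret this is as vector rejection: project the "cow" vector onto the "animal" direction and subtract that projection. A minimal sketch with numpy follows; the random vectors are placeholders standing in for embeddings you would pull from a pretrained model (e.g. `model["cow"]` in gensim), and the dimensionality of 300 is just an assumption.

```python
# Sketch of vector rejection: remove the component of "cow" that lies along "animal".
# Assumes v_cow and v_animal come from some embedding model (e.g. word2vec);
# the random vectors below are placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(0)
v_cow = rng.normal(size=300)     # placeholder for model["cow"]
v_animal = rng.normal(size=300)  # placeholder for model["animal"]

# Project v_cow onto v_animal, then subtract that projection.
proj = (v_cow @ v_animal) / (v_animal @ v_animal) * v_animal
v_cow_orth = v_cow - proj

# v_cow_orth is now (numerically) orthogonal to v_animal: dot product ~ 0.
print(np.dot(v_cow_orth, v_animal))
```

Whether the leftover component actually carries "sentiment" is a separate question; this only guarantees orthogonality to the single "animal" direction.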