Word2Vec concept

These days, word2vec is a very hot topic and technique.
My lab works on text mining, and at a lab meeting I presented a paper that uses word2vec to train a CNN. My professor asked me to prepare some slides explaining word2vec, so I'll share my study of word2vec with you guys, haha.

Basically, word2vec is a new representation for words, and the special thing is that its semantic performance outperforms other word representations.

For example,  vector(king) – vector(man) + vector(woman) = vector(queen)
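This famous analogy can be checked with plain vector arithmetic plus cosine similarity. Below is a minimal sketch using hand-made toy vectors (real word2vec vectors are learned from a large corpus and typically have 100 to 300 dimensions, so the numbers here are purely illustrative):

```python
import numpy as np

# Toy word vectors, hand-made for illustration only.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "apple": np.array([0.2, 0.5, 0.1]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# vector(king) - vector(man) + vector(woman) ...
target = vectors["king"] - vectors["man"] + vectors["woman"]

# ... should land nearest to vector(queen); exclude the query words themselves.
best = max(
    (w for w in vectors if w not in {"king", "man", "woman"}),
    key=lambda w: cosine(target, vectors[w]),
)
print(best)  # queen
```

With trained embeddings (e.g. a gensim `Word2Vec` model) the same lookup is done over the whole vocabulary, which is why the analogy result is so striking.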

Wow, so amazing! So how does word2vec calculate word representations?
Umm… great question!

I made some slides for my lab presentation, and I'm sharing them with you guys!
By the way, the slides only cover the algorithm concept of word2vec. They don't cover advanced parts of word2vec such as negative sampling for optimizing the neural network.
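To give a taste of that algorithm concept: in the skip-gram architecture (one of word2vec's two architectures, alongside CBOW), each center word is trained to predict the words within a small context window. A minimal sketch of how a sentence becomes (center, context) training pairs, with an assumed window size of 2:

```python
# Turn a sentence into skip-gram training pairs:
# each center word paired with every word within `window` positions.
sentence = "the quick brown fox jumps".split()
window = 2  # assumed context window size

pairs = []
for i, center in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:  # skip the center word itself
            pairs.append((center, sentence[j]))

print(pairs[:4])
# [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown')]
```

The neural network then learns vectors such that a center word's vector scores its true context words highly, and it is exactly the cost of normalizing those scores over the full vocabulary that tricks like negative sampling are designed to avoid.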
