Converting the word2vec model from black box to white box
From this section onwards, we will build an understanding of each component of the word2vec model and of how the model works. In short, we are converting the black box of word2vec into a white box.
We will focus on the following topics to understand the word2vec model:
- Distributional similarity-based representation
- Understanding the components of the word2vec model
- Understanding the logic of the word2vec model
- Understanding the algorithms and math behind the word2vec model
- Some facts about the word2vec model
- Application of the word2vec model
Let's begin!
Distributional similarity-based representation
This is quite an old but powerful idea in NLP. The notion of distributional similarity is that you can capture much of a word's meaning by considering the contexts in which that word appears, because the meaning of the word is closely tied to those contexts. There is a very...