Working with CBOW embeddings
In this recipe, we will implement the CBOW (continuous bag-of-words) method of word2vec. It is very similar to the Skip-Gram method, except that we predict a single target word from a surrounding window of context words.
Getting ready
In the previous example, we treated each combination of window and target as a group of paired inputs and outputs, but with CBOW we will add the embeddings of the surrounding window words together to form a single embedding from which to predict the target word:

Figure 5: A depiction of how the CBOW embedding data is created out of a window on an example sentence (window size = 1 on each side)
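The windowing itself can be sketched in plain Python. The function name generate_cbow_batch and the example sentence below are hypothetical, chosen only to mirror the window size of 1 shown in the figure; a real batch generator would typically also map words to vocabulary indices and draw sentences at random:

def generate_cbow_batch(sentence, window_size=1):
    """Return (context_words, target_word) pairs from a tokenized sentence.

    Only positions with a full window on both sides are used, so every
    context contains exactly 2 * window_size words.
    """
    pairs = []
    for i in range(window_size, len(sentence) - window_size):
        context = (sentence[i - window_size:i] +
                   sentence[i + 1:i + 1 + window_size])
        pairs.append((context, sentence[i]))
    return pairs

# Window size = 1 on each side, as in the figure above
sentence = ['the', 'cat', 'in', 'the', 'hat']
for context, target in generate_cbow_batch(sentence, window_size=1):
    print(context, '->', target)
# ['the', 'in'] -> cat
# ['cat', 'the'] -> in
# ['in', 'hat'] -> the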
Most of the code will stay the same, except we will need to change how we create the embeddings and how we generate the data from the sentences.
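As a rough sketch of the embedding change, assuming the TensorFlow 1.x style used in the Skip-Gram recipe (the vocabulary, embedding, and window sizes below are placeholders), the context word indices are looked up in the embedding matrix and summed into one vector per training example:

import tensorflow as tf

vocabulary_size = 10000   # placeholder values for illustration
embedding_size = 200
window_size = 3           # context words on each side of the target

embeddings = tf.Variable(
    tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))

# Each row of x_inputs holds the 2 * window_size context word indices
x_inputs = tf.placeholder(tf.int32, shape=[None, 2 * window_size])

# Look up one embedding per context word, then sum across the window
# to produce a single (batch_size, embedding_size) tensor
context_embeddings = tf.nn.embedding_lookup(embeddings, x_inputs)
embed = tf.reduce_sum(context_embeddings, axis=1)

Summing (rather than concatenating) the context vectors keeps the combined embedding at the same dimensionality as a single Skip-Gram embedding, which is why most of the downstream model and loss code can stay the same.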
To make the code easier...