Word representations
Let's look at some advanced strategies for handling text data and extracting meaningful features, or word embeddings, from it. These embeddings can then be fed into other machine learning (ML) systems for more advanced tasks such as classification, summarization, and translation. We can transfer the learned representation of words to another model; alternatively, if we have a large amount of training data, the word embeddings can be learned jointly with the final task, as the sketch below illustrates.
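To make the distinction between transferring embeddings and learning them jointly concrete, here is a minimal PyTorch sketch; the library choice and the random matrix standing in for real pretrained vectors are our assumptions, not something the text prescribes:

```python
import torch
import torch.nn as nn

# Stand-in for real pretrained vectors: 10,000 words x 100 dimensions.
pretrained_vectors = torch.randn(10_000, 100)

# Transfer: reuse the learned representations in a new model, frozen.
frozen_emb = nn.Embedding.from_pretrained(pretrained_vectors, freeze=True)

# Joint learning: with enough task-specific training data, keep the
# embeddings trainable so they adapt along with the final task.
tuned_emb = nn.Embedding.from_pretrained(pretrained_vectors, freeze=False)

token_ids = torch.tensor([[1, 42, 7]])   # a toy batch of token indices
print(frozen_emb(token_ids).shape)       # torch.Size([1, 3, 100])
```

Freezing is the safe default when the downstream dataset is small; unfreezing trades that safety for task-specific fine-tuning.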
Word2vec model
This model was created by Google in 2013 and is a predictive, deep learning-based model that computes and generates high-quality, distributed, and continuous dense vector representations of words, which capture contextual and semantic similarity. Essentially, these are unsupervised models that can take in massive textual corpora, create a vocabulary of possible words, and generate dense word embeddings for each word in the vector space representing that vocabulary. Usually, you can specify the size of the word embedding vectors; the total number of vectors is essentially the size of the vocabulary.
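As a short sketch of this in practice, the snippet below trains a tiny Word2vec model with gensim (our choice of library, which the text does not mandate) on a toy corpus, showing that you choose the embedding size while the number of vectors follows from the vocabulary:

```python
from gensim.models import Word2Vec

# A toy tokenized corpus; in practice this would be a massive one.
corpus = [
    ["the", "sky", "is", "blue"],
    ["the", "sun", "is", "bright"],
    ["the", "sun", "in", "the", "sky", "is", "bright"],
]

# vector_size sets the dimensionality of each word embedding; the
# model builds its vocabulary from the corpus automatically.
model = Word2Vec(sentences=corpus, vector_size=50, window=3,
                 min_count=1, workers=1, seed=42)

print(len(model.wv))           # vocabulary size == number of vectors
print(model.wv["sky"].shape)   # (50,): one dense vector per word
print(model.wv.most_similar("sun", topn=2))
```

On a corpus this small the similarity scores are meaningless; they become useful only with the massive corpora the model is designed for.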