Word vectors
Word vectors are useful building blocks in many applications. They capture and encode the semantic relationships between words by mapping each word to a dense vector of numbers, a representation that is well-suited for training deep learning models. In this chapter, we will take a detailed look at the various approaches to building such semantically useful word embeddings.
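To make this concrete, here is a minimal sketch of the idea using tiny hand-written four-dimensional vectors. The words, dimensions, and values here are illustrative assumptions, not the output of any real model; learned embeddings typically have hundreds of dimensions:

```python
import numpy as np

# Toy embeddings with made-up values, purely for illustration.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.3]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.1, 0.6]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; higher means more similar."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Semantically related words end up with similar vectors...
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.99
# ...while unrelated words do not.
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.35
```

Because similarity becomes a simple geometric computation, vectors like these can be fed directly into downstream models.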
The classical approach
Traditionally, word representations were built with approaches such as the bag-of-words model, which treats individual words as independent of one another. Under this assumption, each word is often given a one-hot representation, a sparse vector indicating only the presence or absence of that word in the vocabulary, and these vectors are combined to produce a vector representation of a sentence or document. However, such a representation is seldom useful in real-world applications, where word meanings change based on the words surrounding them. For example, the word bank refers to very different things in river bank and bank account, yet a context-independent representation assigns it the same vector in both cases, as the sketch below illustrates.
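The following sketch contrasts one-hot and bag-of-words vectors over a small corpus; the sentences and vocabulary are made up for the example:

```python
# Two sentences in which "bank" has different meanings.
sentences = ["the bank approved the loan", "she sat on the river bank"]

# Build a vocabulary and assign each word a fixed index.
vocab = sorted({word for sentence in sentences for word in sentence.split()})
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot(word):
    """A vector that is 1 at the word's index and 0 everywhere else."""
    vec = [0] * len(vocab)
    vec[word_to_index[word]] = 1
    return vec

def bag_of_words(sentence):
    """Per-word counts for a sentence; word order and context are lost."""
    vec = [0] * len(vocab)
    for word in sentence.split():
        vec[word_to_index[word]] += 1
    return vec

# "bank" receives the same one-hot vector in both sentences, even though
# its meaning differs, because the representation ignores surrounding context.
print(one_hot("bank"))
print(bag_of_words(sentences[0]))
```

Note that every one-hot vector is equally distant from every other, so these representations carry no notion of similarity at all; overcoming this limitation is exactly what the embedding methods discussed in this chapter set out to do.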