Chapter 4. Semantic Embedding Using Shallow Models
In this chapter, we will discuss the motivation for capturing semantic relationships between words and the approaches used to identify them. Along the way, we will obtain vector representations for individual words, which will let us build vector representations at the sentence and document level.
We will cover the following topics in this chapter:
- Word embeddings, to represent words as vectors, trained by a simple shallow neural network
- Continuous Bag of Words (CBOW) embeddings, to predict a target word from its surrounding context words, using a similar shallow neural network
- Sentence embeddings, through averaging the Word2vec vectors of the words in a sentence
- Document embeddings, through averaging word vectors across the entire document
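To preview the averaging idea behind the last two bullets, here is a minimal sketch of building a sentence embedding by averaging word vectors. The word-to-vector table below is a toy, randomly initialized lookup standing in for a trained Word2vec model; the function name `sentence_embedding` is our own.

```python
import numpy as np

# Toy word-vector table; in practice these vectors would come from a
# trained Word2vec model rather than random initialization.
rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]
dim = 8
word_vectors = {w: rng.normal(size=dim) for w in vocab}

def sentence_embedding(tokens, word_vectors, dim=8):
    """Average the vectors of known tokens; zero vector if none are known."""
    vecs = [word_vectors[t] for t in tokens if t in word_vectors]
    if not vecs:
        return np.zeros(dim)
    return np.mean(vecs, axis=0)

emb = sentence_embedding(["the", "cat", "sat"], word_vectors)
print(emb.shape)  # (8,)
```

The same averaging step extends to document embeddings by pooling over all words in the document instead of a single sentence; we return to both techniques later in the chapter.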