Word embeddings are dense vector representations of words in a continuous vector space.
Word embeddings capture the syntactic and semantic meaning of a word, which helps the network understand words better.
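As a minimal sketch, an embedding layer is just a lookup table: each word index selects one row of a matrix, and similar words should end up with similar rows. The toy vocabulary and the random vectors below are made-up placeholders, not learned values; a real model learns the matrix from data.

```python
import numpy as np

# Hypothetical toy vocabulary; in practice this comes from the training corpus.
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding_dim = 4

# Embedding matrix: one row per word. Random here; a trained model learns it.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word):
    """Look up the vector for a word (the core of any embedding layer)."""
    return embeddings[vocab[word]]

def cosine_similarity(u, v):
    """After training, semantically similar words get high cosine similarity."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                                     # a 4-dimensional vector
print(cosine_similarity(embed("king"), embed("queen")))  # a scalar in [-1, 1]
```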
In the CBOW model, the network's goal is to predict a target word given its surrounding context words, whereas in the skip-gram model the goal is to predict the surrounding context words given a target word.
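One way to see the difference in practice is gensim's Word2Vec, where the sg flag switches between the two objectives. This is a hedged sketch assuming gensim >= 4.0; the corpus and hyperparameters are illustrative only.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus; any list of tokenized sentences works.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
]

# sg=0 -> CBOW: predict the target word from its context window.
cbow = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)

# sg=1 -> skip-gram: predict the context words from the target word.
skipgram = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(cbow.wv["king"].shape)                     # (50,)
print(skipgram.wv.most_similar("king", topn=2))  # nearest words by cosine similarity
```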
Since this is a rapidly evolving field, new improvements and advances are published every day, so be prepared to answer questions about the different variants of BERT, ELMo, XLNet, and so on.
The paragraph vector learns a vector representation of an entire paragraph, thereby capturing the topic of the paragraph.
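Paragraph vectors are implemented, for example, in gensim's Doc2Vec. Below is a hedged sketch (gensim >= 4.0 assumed; the two-paragraph corpus and the hyperparameters are illustrative, not recommended settings).

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Each paragraph gets a tag; the model learns one vector per tag.
paragraphs = [
    TaggedDocument(words=["deep", "learning", "uses", "neural", "networks"], tags=["doc0"]),
    TaggedDocument(words=["embeddings", "map", "words", "to", "vectors"], tags=["doc1"]),
]

model = Doc2Vec(paragraphs, vector_size=50, min_count=1, epochs=40)

# Vector learned for a training paragraph ...
print(model.dv["doc0"].shape)          # (50,)

# ... and a vector inferred for an unseen paragraph.
new_vec = model.infer_vector(["neural", "networks", "learn", "embeddings"])
print(new_vec.shape)                   # (50,)
```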

