GloVe Word Embeddings

GloVe is a word embedding model that constructs word vectors from global co-occurrence statistics. Unlike Word2Vec, which relies on local context windows, GloVe captures statistical relationships between words across the entire corpus by factorizing a word-word co-occurrence matrix. This approach enables GloVe to produce high-quality word representations that encode both semantic and syntactic relationships. This article will introduce the principles and training methods of GloVe.
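To make the idea concrete, here is a minimal sketch of the two ingredients the summary mentions: counting global co-occurrences over a window, and GloVe's weighting function that damps very frequent pairs. The toy corpus, window size, and `x_max`/`alpha` values are illustrative assumptions, not taken from the article.

```python
from collections import Counter

def cooccurrence_counts(tokens, window=2):
    """Count symmetric word co-occurrences within a fixed window."""
    counts = Counter()
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            counts[(tokens[i], tokens[j])] += 1
            counts[(tokens[j], tokens[i])] += 1
    return counts

def glove_weight(x, x_max=100, alpha=0.75):
    """GloVe's weighting f(X_ij): grows with count, capped at 1."""
    return (x / x_max) ** alpha if x < x_max else 1.0

# Toy corpus (hypothetical example data)
corpus = "the cat sat on the mat".split()
counts = cooccurrence_counts(corpus)
print(counts[("the", "cat")])  # 1
print(glove_weight(100))       # 1.0
```

In the full model, each co-occurrence count `X_ij` contributes a weighted least-squares term, so rare pairs influence the factorization less than common ones.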

Word2Vec Word Embedding Model

Word2Vec is a model for learning word embeddings; it maps words to vectors that capture their semantics using a shallow neural network. Word2Vec offers two training methods, CBOW and Skip-gram, and improves efficiency through techniques such as Negative Sampling and Subsampling. This article will introduce the basic principles and training methods of Word2Vec.
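As a rough illustration of the Skip-gram setup mentioned above, the sketch below generates (center, context) training pairs from a sentence. The sentence and window size are hypothetical examples; a real implementation would feed these pairs to the embedding network.

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in Skip-gram."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

# Hypothetical example sentence
sentence = "the quick brown fox".split()
print(skipgram_pairs(sentence)[:3])
# [('the', 'quick'), ('the', 'brown'), ('quick', 'the')]
```

CBOW inverts this: it predicts the center word from the surrounding context words rather than the reverse.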