Illustrated Word2Vec: Understanding Word Embeddings
Word embeddings represent a word as a numerical vector, unlike the integer IDs produced by Tokenization. These vectors carry semantic information that IDs do not: words used in similar contexts end up with similar vectors. This article illustrates Word2Vec, a method for learning word embeddings. It is part of a series that also includes illustrations of Tokenization, the Transformer, GPT2, and BERT; if you want to learn about Tokenization, please see …
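To make the contrast concrete, here is a minimal sketch (not from the article) that trains a tiny Word2Vec model with the gensim library and compares a word's arbitrary integer index with its learned embedding vector. The toy corpus and all parameter values are illustrative assumptions, not the article's setup.

```python
# Minimal sketch: tokenizer-style IDs vs. Word2Vec embedding vectors.
# Assumes the gensim library; corpus and hyperparameters are toy choices.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["man", "walks", "in", "the", "city"],
    ["woman", "walks", "in", "the", "city"],
]

# Train a small Word2Vec model; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, seed=42)

# An ID is just an arbitrary integer index into the vocabulary...
print(model.wv.key_to_index["king"])   # e.g. 3

# ...whereas the embedding is a dense vector learned from context.
print(model.wv["king"].shape)          # (50,)

# Words that appear in similar contexts get similar vectors.
print(model.wv.most_similar("king", topn=3))
```

On this tiny corpus the similarities are noisy, but with enough text the same mechanism is what places "king" near "queen" in the embedding space while their IDs remain meaningless numbers.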