Interpreting Character Relationships in Yanxi Palace with Word2Vec

Reading difficulty: ★★☆☆☆ | Skill requirements: machine learning, Python, tokenization, data visualization | Word count: 1,500 | Reading time: 6 minutes. This article draws on the recently popular TV series “Yanxi Palace” to analyze character relationships from a data perspective. By collecting relevant novels, scripts, character introductions, etc., … Read more
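
For readers who want to try the approach themselves, here is a minimal sketch of this kind of pipeline using the jieba tokenizer and gensim's Word2Vec; the corpus file name and the character name queried are placeholders, not the article's actual data.

```python
# Minimal sketch: train Word2Vec on a tokenized Chinese corpus, then ask
# which words sit closest to a character's name. The file name and the
# queried name are placeholders.
import jieba
from gensim.models import Word2Vec

# Tokenize the collected novels/scripts line by line with jieba.
with open("yanxi_corpus.txt", encoding="utf-8") as f:
    sentences = [jieba.lcut(line.strip()) for line in f if line.strip()]

# Train a small skip-gram model on the tokenized sentences.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=5, sg=1)

# Words most similar to Wei Yingluo, the drama's protagonist.
print(model.wv.most_similar("魏璎珞", topn=10))
```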

In-Depth Understanding of Word2Vec

Deep Learning | Author: louwill | From: Deep Learning Notes. Language models are one of the core concepts in natural language processing, and Word2Vec is a neural-network-based language model as well as a method of word representation. Word2Vec comes in two structures, the skip-gram model and the continuous bag-of-words (CBOW) model, both of which essentially perform dimensionality reduction on … Read more
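
As a concrete illustration of the two structures (a minimal sketch, not code from the article), gensim exposes the skip-gram/CBOW choice through a single `sg` flag:

```python
# Skip-gram predicts context words from the center word; CBOW predicts
# the center word from its context. Both yield low-dimensional vectors.
from gensim.models import Word2Vec

sentences = [["language", "models", "learn", "word", "vectors"],
             ["word2vec", "has", "two", "training", "structures"]]

skipgram = Word2Vec(sentences, vector_size=50, min_count=1, sg=1)  # sg=1: skip-gram
cbow = Word2Vec(sentences, vector_size=50, min_count=1, sg=0)      # sg=0: CBOW (default)

print(skipgram.wv["word"].shape, cbow.wv["word"].shape)  # (50,) (50,)
```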

What To Do When Word2Vec Lacks Words?

Editor: Yi Zhen. Source: https://www.zhihu.com/question/329708785 (shared for academic exchange only; the article will be deleted in case of infringement). The author found an interesting question on Zhihu: what to do when Word2Vec lacks … Read more
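
The teaser cuts off before any answers, but one widely used remedy for out-of-vocabulary words is a subword model such as FastText, which composes a vector from character n-grams. A minimal gensim sketch (an illustration of that remedy, not necessarily the answer given in the Zhihu thread):

```python
# FastText builds word vectors from character n-grams, so it can return
# a vector even for a word that never appeared during training.
from gensim.models import FastText

sentences = [["word2vec", "cannot", "embed", "unseen", "words"],
             ["fasttext", "composes", "vectors", "from", "subwords"]]

model = FastText(sentences, vector_size=50, min_count=1)

print("unseenword" in model.wv.key_to_index)  # False: not in the vocabulary
print(model.wv["unseenword"].shape)           # (50,): a vector is still produced
```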

Resources for Learning and Understanding Word2Vec

Source: AI Study Society. I was interviewed recently and, since I still don't fully understand how word embeddings work, I have been looking through a lot of related materials to grasp the concept better. My understanding is still limited, so I won't overestimate myself by writing my own article (even if I did, it would just … Read more

Word2Vec, Node2Vec, Graph2Vec, X2Vec: Theory of Vector Embeddings

[Introduction] Embedding representation learning is a current research hotspot. From Word2Vec to Node2Vec to Graph2Vec, a large number of X2Vec algorithms have emerged. But how can we construct a theory of vector embeddings to guide algorithm design? Recently Martin Grohe, a computer science professor at RWTH Aachen University and an ACM Fellow, gave a report … Read more

Understanding Huffman Tree Generation in Word2Vec

Deep learning has achieved great success in natural language processing (NLP), and the distributed representation of words is one of its crucial underlying techniques. To understand distributed representation deeply, one must delve into word2vec. Today, let's explore how the Huffman tree is generated in the word2vec code. This is a very important data structure in word2vec, used … Read more
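
For intuition before diving into the source, here is a generic heap-based Huffman construction over word frequencies (a simplified Python sketch, not the actual word2vec C code):

```python
# Repeatedly merge the two lowest-frequency nodes, as word2vec does so
# that frequent words end up with shorter Huffman codes.
import heapq
import itertools

def build_huffman(freqs):
    tiebreak = itertools.count()  # keeps tuple comparison away from the nodes
    heap = [(f, next(tiebreak), w) for w, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
    return heap[0][2]  # root of the tree as nested (left, right) tuples

print(build_huffman({"the": 50, "cat": 20, "sat": 20, "mat": 10}))
```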

Understanding Word2Vec: A Deep Dive into Neural Networks

Since Tomas Mikolov of Google proposed Word2Vec in “Efficient Estimation of Word Representations in Vector Space”, it has become a fundamental component of deep learning for natural language processing. The basic idea of Word2Vec is to represent each word in natural language as a short, dense vector of a fixed, shared dimension. As for what … Read more
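
The fixed-dimension idea is easy to see with any pretrained embedding. A minimal sketch using gensim's downloader (the model name is a publicly hosted gensim-data set; it contains GloVe rather than original Word2Vec vectors, but the point about shared dimensionality and vector arithmetic is the same):

```python
# Every word maps to a dense vector of one shared dimension; similarity
# and analogy queries are plain vector arithmetic on those embeddings.
import gensim.downloader as api

wv = api.load("glove-wiki-gigaword-50")  # 50-dimensional pretrained vectors

print(wv["king"].shape)  # (50,): the same dimensionality for every word
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```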

Understanding Word2vec: The Essence of Word Vectors

A summary of Word2vec reference materials. Let me briefly describe my deep dive into Word2vec: I first read Mikolov's two original Word2vec papers, but found myself still confused afterwards, mainly because the papers omit too much theoretical background and derivation detail. I then revisited Bengio's 2003 JMLR paper and … Read more

The Secrets of Word2Vec: Part 3 of the Word Embedding Series

Excerpted from Sebastian Ruder's blog. Author: Sebastian Ruder. Translated by Machine Heart; contributor: Terrence L. This article is Part 3 of the word embedding series, introducing the popular Global Vectors (GloVe) embedding model. To read Part 2, see: Technical | Word Embedding Series Part 2: Comparing Several Methods of Approximate Softmax in Language … Read more
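
For reference, GloVe fits word vectors to the logarithm of co-occurrence counts with a weighted least-squares objective; the standard form from the original GloVe paper (Pennington et al., 2014) is:

```latex
% V: vocabulary size; X_{ij}: co-occurrence count of words i and j;
% w_i, \tilde{w}_j: word and context vectors; b_i, \tilde{b}_j: biases;
% f: a weighting function that caps the influence of very frequent
% pairs (the paper uses x_{\max} = 100 and \alpha = 3/4).
J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
\qquad
f(x) = \begin{cases} (x / x_{\max})^{\alpha} & x < x_{\max} \\ 1 & \text{otherwise} \end{cases}
```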

Understanding Character Relationships in ‘Story of Yanxi Palace’ Using Word2Vec

Source: Wujie Community Mixlab. Editor: An Ke. [PanChuang AI Introduction]: Recently everyone's feeds have been flooded with the popular Qing-dynasty drama “Story of Yanxi Palace”. The male lead, Emperor Qianlong, is often called a “big pig's hoof” by viewers because he falls in love with every woman he meets. As simple … Read more