Concepedia

Publication | Closed Access

Semantic Compositionality through Recursive Matrix-Vector Spaces

Citations: 1.3K | References: 36 | Year: 2012

TLDR

Single‑word vector space models have successfully captured lexical information, yet they fail to represent the compositional meaning of longer phrases, limiting deeper language understanding. The authors propose a recursive neural network that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length. Their model assigns each parse‑tree node a vector for the constituent’s inherent meaning and a matrix for how it transforms neighboring words or phrases, allowing it to learn operators in propositional logic and natural language. The approach attains state‑of‑the‑art results on three tasks: predicting fine‑grained sentiment distributions of adverb‑adjective pairs, classifying sentiment labels of movie reviews, and classifying semantic relationships such as cause‑effect or topic‑message between nouns via their syntactic paths.

Abstract

Single-word vector space models have been very successful at learning lexical information. However, they cannot capture the compositional meaning of longer phrases, preventing a deeper understanding of language. We introduce a recursive neural network (RNN) model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length. Our model assigns a vector and a matrix to every node in a parse tree: the vector captures the inherent meaning of the constituent, while the matrix captures how it changes the meaning of neighboring words or phrases. This matrix-vector RNN can learn the meaning of operators in propositional logic and natural language. The model obtains state-of-the-art performance on three different experiments: predicting fine-grained sentiment distributions of adverb-adjective pairs; classifying sentiment labels of movie reviews; and classifying semantic relationships such as cause-effect or topic-message between nouns using the syntactic path between them.
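The composition step the abstract describes can be sketched in a few lines. The version below is a minimal illustration, not the authors' released implementation: the weight names (`W`, `W_M`), the dimensionality, and the `tanh` nonlinearity are assumptions for the sketch. Each constituent is a (vector, matrix) pair; each child's vector is first transformed by its sibling's matrix before the two are mixed into the parent vector, and the parent matrix is a learned linear combination of the child matrices.

```python
import numpy as np

n = 4                       # embedding dimensionality (assumed for the sketch)
rng = np.random.default_rng(0)

def compose(a, A, b, B, W, W_M):
    """Combine children (a, A) and (b, B) into a parent (p, P).

    p = tanh(W [B a; A b])  -- each child's vector is transformed by its
                               sibling's matrix, then mixed by W
    P = W_M [A; B]          -- parent matrix from stacked child matrices
    """
    p = np.tanh(W @ np.concatenate([B @ a, A @ b]))
    P = W_M @ np.vstack([A, B])
    return p, P

# Random parameters and leaf representations, for demonstration only.
W = rng.standard_normal((n, 2 * n))
W_M = rng.standard_normal((n, 2 * n))
a, b = rng.standard_normal(n), rng.standard_normal(n)
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))

p, P = compose(a, A, b, B, W, W_M)
print(p.shape, P.shape)     # parent vector is (n,), parent matrix is (n, n)
```

Because every parse-tree node yields the same (vector, matrix) shape as its children, this step can be applied recursively bottom-up over any binary parse tree, which is what lets the model handle phrases of arbitrary syntactic type and length.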

