Publication | Open Access
Knowledge graph completion via complex tensor factorization
207 Citations · 30 References · Year: 2017
Artificial Intelligence · Engineering · Machine Learning · Complex Embeddings · Semantic Web · Link Prediction · Statistical Relational Learning · Square Matrices Factorization · Knowledge Graph Embeddings · Information Retrieval · Data Science · Data Mining · Knowledge Graph Completion · Knowledge Discovery · Computer Science · Knowledge Graphs · Semantic Network · Graph Theory · Matrix Factorization · Business · Semantic Graph
Knowledge graph completion in statistical relational learning seeks to automatically understand large labeled directed graphs and predict missing edges, while state-of-the-art embedding models trade off modeling expressiveness against time and space complexity. This study reconciles expressiveness and complexity by using complex-valued embeddings and investigating their connection to unitary diagonalization. Theoretical analysis shows that every real square matrix is the real part of some unitarily diagonalizable matrix, so a simple Hermitian dot-product scoring function suffices; the resulting embeddings are linear in both space and time, outperform alternative methods on standard link-prediction benchmarks, and open avenues for other square-matrix factorization applications.
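The matrix result referenced above can be stated compactly (notation here is a paraphrase of the paper's, not a verbatim quotation): for any real square relation/adjacency matrix \(X\), there exist a unitary matrix \(E\) (whose rows serve as complex entity embeddings) and a diagonal complex matrix \(W\) (the relation embedding) such that

\[
X \;=\; \operatorname{Re}\!\left( E\, W\, \bar{E}^{\top} \right),
\qquad \text{i.e.} \qquad
X_{so} \;=\; \operatorname{Re}\!\left( \sum_{k} w_{k}\, e_{sk}\, \bar{e}_{ok} \right),
\]

so each entry of \(X\) is recovered as the real part of a Hermitian dot product between the subject embedding \(e_s\) and the conjugated object embedding \(\bar{e}_o\), weighted by \(w\).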
In statistical relational learning, knowledge graph completion deals with automatically understanding the structure of large knowledge graphs (labeled directed graphs) and predicting missing relationships (labeled edges). State-of-the-art embedding models propose different trade-offs between modeling expressiveness and time and space complexity. We reconcile expressiveness and complexity through the use of complex-valued embeddings and explore the link between such embeddings and unitary diagonalization. We corroborate our approach theoretically, showing that all real square matrices (and thus all possible relation/adjacency matrices) are the real part of some unitarily diagonalizable matrix. This result opens the door to many other applications of square-matrix factorization. Our approach based on complex embeddings is arguably simple, as it only involves a Hermitian dot product, the complex counterpart of the standard dot product between real vectors, whereas other methods resort to increasingly complicated composition functions to increase their expressiveness. The proposed complex embeddings scale to large datasets, remaining linear in both space and time, while consistently outperforming alternative approaches on standard link prediction benchmarks.
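The Hermitian dot-product score described in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation; the function name `complex_score` and the toy embedding values are my own:

```python
import numpy as np

def complex_score(e_s, w_r, e_o):
    """Score a (subject, relation, object) triple as the real part of the
    Hermitian dot product Re(sum_k w_rk * e_sk * conj(e_ok)), where e_s,
    w_r, e_o are complex embedding vectors of equal dimension."""
    return float(np.real(np.sum(w_r * e_s * np.conj(e_o))))

# Toy example with dimension-2 complex embeddings (illustrative values only).
rng = np.random.default_rng(0)
e_s = rng.normal(size=2) + 1j * rng.normal(size=2)
w_r = rng.normal(size=2) + 1j * rng.normal(size=2)
e_o = rng.normal(size=2) + 1j * rng.normal(size=2)
print(complex_score(e_s, w_r, e_o))
```

Note that conjugating only the object embedding makes the score asymmetric in general: `complex_score(e_s, w_r, e_o)` and `complex_score(e_o, w_r, e_s)` differ whenever the relation embedding `w_r` has a nonzero imaginary part, which is what lets a single dot-product model represent antisymmetric relations.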