Publication | Closed Access
Approximate tensor decomposition within a tensor-relational algebraic framework
Citations: 15
References: 10
Year: 2011
Venue: unknown
Topics: Mathematical Programming, Relational Queries, Computational Science, Engineering, Representation Theory, Data Science, Matrix Factorization, Very Large Database, Computer Engineering, Tensor Decomposition, Tensor Decomposition Operations, Multilinear Subspace Learning, Computer Science, Parallel Computing, Relational Algebraic Framework, Approximate Tensor Decomposition, Low-rank Approximation, Query Optimization
In this paper, we first introduce a tensor-based relational data model and define algebraic operations on this model. We note that, while in traditional relational algebraic systems the join operation tends to be the costliest operation of all, in the tensor-relational framework presented here, tensor decomposition becomes the computationally costliest operation. Therefore, we consider optimization of tensor decomposition operations within a relational algebraic framework. This leads to a highly efficient, effective, and easy-to-parallelize join-by-decomposition approach and a corresponding KL-divergence based optimization strategy. Experimental results provide evidence that minimizing KL-divergence within the proposed join-by-decomposition helps approximate the conventional join-then-decompose scheme well, without the associated time and space costs.
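The contrast between join-then-decompose and join-by-decomposition can be sketched in a toy setting. The example below is illustrative only and is not the paper's algorithm: relation names (`R`, `S`), shapes, and the rank are hypothetical, and it uses a Frobenius-norm SVD truncation rather than the KL-divergence strategy the paper proposes. Two relations sharing an attribute Z are joined into a 3-way tensor; decomposing the small inputs first avoids materializing the large joined tensor before decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical relations encoded as matrices: R over (X, Z), S over (Z, Y).
# Their join on the shared attribute Z is the 3-way tensor
# T[x, z, y] = R[x, z] * S[z, y].
R = rng.random((4, 5))
S = rng.random((5, 3))

# Join-then-decompose: materialize the full joined tensor first.
T = R[:, :, None] * S[None, :, :]          # shape (4, 5, 3)

# Join-by-decomposition: approximate the (smaller) inputs first,
# then join the low-rank approximations instead of the raw inputs.
r = 2
Ur, sr, Vr = np.linalg.svd(R, full_matrices=False)
Us, ss, Vs = np.linalg.svd(S, full_matrices=False)
R_hat = (Ur[:, :r] * sr[:r]) @ Vr[:r, :]   # rank-r approximation of R
S_hat = (Us[:, :r] * ss[:r]) @ Vs[:r, :]   # rank-r approximation of S
T_hat = R_hat[:, :, None] * S_hat[None, :, :]

err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
print(f"relative error of join-by-decomposition sketch: {err:.3f}")
```

The point of the sketch is that the decomposition work happens on the input relations, whose combined size is far smaller than the joined tensor; the paper's contribution is choosing the per-relation decompositions (via a KL-divergence criterion) so that the assembled result stays close to the join-then-decompose baseline.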