DisTenC: A Distributed Algorithm for Scalable Tensor Completion on Spark (Conference Paper)

abstract

  • © 2018 IEEE. How can we efficiently recover missing values in very large-scale, multi-dimensional real-world datasets, even when auxiliary information regularizes certain modes? Tensor completion is a useful tool for recovering a low-rank tensor that best approximates partially observed data and then predicting the unobserved data from this low-rank tensor; it has been successfully applied to location-based recommender systems, link prediction, targeted advertising, social media search, and event detection. Due to the curse of dimensionality, existing tensor completion algorithms that integrate auxiliary information do not scale to tensors with billions of elements. In this paper, we propose DisTenC, a new distributed large-scale tensor completion algorithm implemented on Spark. Our key insights are to (i) efficiently handle trace-based regularization terms; (ii) update factor matrices with caching; and (iii) optimize the update of the new tensor via residuals. In this way, we tackle the high computational costs of traditional approaches and minimize intermediate data, leading to order-of-magnitude improvements in tensor completion. Experimental results demonstrate that DisTenC can handle tensors 10–1000× larger than existing methods with a much faster convergence rate, scales near-linearly with the number of machines, and achieves an average accuracy improvement of up to 23.5% in applications.
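
  The abstract's second and third insights (cached factor-matrix updates and residual-based updates) can be illustrated at toy scale. The following is a minimal, hypothetical Scala/Spark sketch, not the authors' implementation: it performs one gradient step on the mode-1 factor of a rank-R CP model, computed from residuals over cached observed entries with broadcast factor matrices. The object name TensorCompletionSketch, the toy data, the learning rate, and the gradient-step formulation are all illustrative assumptions; DisTenC itself uses an ALS-style scheme with trace-based regularization, which this sketch omits.

  import org.apache.spark.sql.SparkSession

  // Hypothetical sketch (not DisTenC): one gradient step on the mode-1
  // factor A of a rank-R CP model, from residuals over observed entries.
  object TensorCompletionSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder
        .appName("cp-completion-sketch").master("local[*]").getOrCreate()
      val sc = spark.sparkContext
      val (dimI, rank, step) = (3, 2, 0.1)

      // Observed entries ((i, j, k), value) of a sparse 3 x 3 x 3 tensor,
      // cached so every sweep reuses them without recomputation, echoing
      // the paper's "update factor matrices with caching" idea.
      val observed = sc.parallelize(Seq(
        ((0, 0, 0), 1.0), ((1, 1, 1), 2.0), ((2, 2, 2), 3.0), ((0, 1, 2), 0.5)
      )).cache()

      // Current factor matrices, broadcast read-only to all workers.
      val rng = new scala.util.Random(7)
      def randMat(n: Int) = Array.fill(n, rank)(rng.nextDouble())
      val bA = sc.broadcast(randMat(dimI))
      val bB = sc.broadcast(randMat(3))
      val bC = sc.broadcast(randMat(3))

      // Residual at each observed entry: r = x - <A(i,:), B(j,:), C(k,:)>.
      // Only per-row gradient contributions are shuffled, never a dense
      // reconstructed tensor, echoing the residual-based update idea.
      val rowGrads = observed.map { case ((i, j, k), x) =>
        val pred = (0 until rank)
          .map(r => bA.value(i)(r) * bB.value(j)(r) * bC.value(k)(r)).sum
        val resid = x - pred
        // Descent direction for row A(i,:) is resid * (B(j,:) * C(k,:)),
        // with the constant factor of 2 folded into the step size.
        (i, Array.tabulate(rank)(r => resid * bB.value(j)(r) * bC.value(k)(r)))
      }.reduceByKey((u, v) => Array.tabulate(rank)(r => u(r) + v(r)))
       .collectAsMap()

      // One descent step on the squared reconstruction error over A.
      val aNew = bA.value.zipWithIndex.map { case (row, i) =>
        rowGrads.get(i)
          .map(g => Array.tabulate(rank)(r => row(r) + step * g(r)))
          .getOrElse(row)
      }
      aNew.foreach(row => println(row.mkString(", ")))
      spark.stop()
    }
  }

  In a full ALS-style method this step would be repeated per mode until convergence; broadcasting the small factor matrices while keeping the large observed-entry RDD cached and partitioned is what keeps the per-iteration shuffle proportional to the number of observations rather than the full tensor size.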

name of conference

  • 2018 IEEE 34th International Conference on Data Engineering (ICDE)

published proceedings

  • 2018 IEEE 34th International Conference on Data Engineering (ICDE)

author list (cited authors)

  • Ge, H., Zhang, K., Alfifi, M., Hu, X., & Caverlee, J.

citation count

  • 10

complete list of authors

  • Ge, Hancheng; Zhang, Kai; Alfifi, Majid; Hu, Xia; Caverlee, James

publication date

  • April 2018