References
[1] Mika, Sebastian, et al. "Kernel PCA and de-noising in feature spaces." Advances in Neural Information Processing Systems 11 (1998).
[2] Hoffmann, Heiko. "Kernel PCA for novelty detection." Pattern Recognition 40.3 (2007): 863-874.
[3] Lee, Daniel D., and H. Sebastian Seung. "Learning the parts of objects by non-negative matrix factorization." Nature 401.6755 (1999): 788-791.
[4] Zhang, Sheng, et al. "Learning from incomplete ratings using non-negative matrix factorization." Proceedings of the 2006 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics, 2006.
[5] Lee, Hyekyoung, and Seungjin Choi. "Group nonnegative matrix factorization for EEG classification." Artificial Intelligence and Statistics. PMLR, 2009.
[6] Takeuchi, Koh, et al. "Non-negative multiple matrix factorization." Twenty-Third International Joint Conference on Artificial Intelligence. 2013.
[7] Nickel, Maximilian, Volker Tresp, and Hans-Peter Kriegel. "A three-way model for collective learning on multi-relational data." ICML 2011.
[8] Bezdan, T., and N. Bačanin Džakula. International Scientific Conference on Information Technology and Data Related Research. 2019.
[9] Liu, Y., et al. "Tensor Computation for Data Analysis." 2022.
[10] Ghalamkari, Kazu, Mahito Sugiyama, and Yoshinobu Kawahara. "Many-body approximation for non-negative tensors." Advances in Neural Information Processing Systems 36 (2024).
[11] Wang, Wenqi, et al. "Wide compression: Tensor ring nets." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2018.
[12] Zheng, Yu-Bang, et al. "Fully-connected tensor network decomposition and its application to higher-order tensor completion." Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 35. No. 12. 2021.
[13] Cichocki, Andrzej, et al. "Tensor networks for dimensionality reduction and large-scale optimization: Part 1 low-rank tensor decompositions." Foundations and Trends® in Machine Learning 9.4-5 (2016): 249-429.