Abstract
Organs, cells and intracellular microstructures dealt with in medical image analysis are volumetric data, whose sampled values are expressed as three-way arrays. For the quantitative discrimination of multiway forms from the viewpoint of principal component analysis (PCA)-based pattern recognition, distance metrics for subspaces of multiway data arrays are desired. This paper aims to extend PCA-based pattern recognition methodologies from vector spaces to multilinear data. First, we extend the canonical angle between linear subspaces used in vector-based pattern recognition to the canonical angle between multilinear subspaces for tensor-based pattern recognition. Furthermore, using transportation between Stiefel manifolds, we introduce a new metric for a collection of linear subspaces. Then, we extend this transportation between Stiefel manifolds in vector spaces to transportation between Stiefel manifolds in multilinear spaces for the discrimination of multiway array data.
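For reference, in the vector-space setting [1–3] the canonical (principal) angles between two subspaces with orthonormal bases \(\varvec{U}\in \mathbf{R}^{n\times p}\) and \(\varvec{V}\in \mathbf{R}^{n\times q}\), \(p\le q\) (notation introduced here for illustration), are given by the singular values of \(\varvec{U}^\top \varvec{V}\); a minimal sketch of this baseline definition, which the paper extends to multilinear subspaces, is
\[ \cos \theta _i = \sigma _i\bigl(\varvec{U}^\top \varvec{V}\bigr), \quad i=1,\dots ,p, \qquad 0\le \theta _1\le \cdots \le \theta _p\le \tfrac{\pi }{2}, \]
and a subspace distance built from these angles is, for example, the geodesic (Grassmann) distance \(d = \bigl(\sum _{i=1}^{p}\theta _i^2\bigr)^{1/2}\).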
References
Jordan, C.: Essai sur la géométrie à \(n\) dimensions. Bull. Soc. Math. France 3, 103–174 (1875)
Afriat, S.N.: Orthogonal and oblique projectors and the characterisation of pairs of vector spaces. Math. Proc. Cambridge Philos. Soc. 53, 800–816 (1957)
Knyazev, A.V., Argentati, M.E.: Principal angles between subspaces in an A-based scalar product: algorithms and perturbation estimates. SIAM J. Sci. Comput. 23, 2009–2041 (2002)
Cock, K.D., Moor, B.D.: Subspace angles between ARMA models. Syst. Control Lett. 46, 265–270 (2002)
Villani, C.: Optimal Transport: Old and New. Grundlehren der mathematischen Wissenschaften. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-540-71050-9
Stiefel, E.: Richtungsfelder und Fernparallelismus in n-dimensionalen Mannigfaltigkeiten. Comment. Math. Helv. 8, 305–353 (1935)
Hansen, P.-C.: Discrete Inverse Problems: Insight and Algorithms. SIAM, Philadelphia (2010)
Hansen, P.-C., Nagy, J.G., O’Leary, D.P.: Deblurring Images: Matrices, Spectra, and Filtering. SIAM, Philadelphia (2006)
Chung, J., Knepper, S., Nagy, J.: Large-scale inverse problems in imaging. In: Scherzer, O. (ed.) Handbook of Mathematical Methods in Imaging, pp. 43–86. Springer, New York (2011). https://doi.org/10.1007/978-0-387-92920-0_2
Iijima, T.: Pattern Recognition, Corona-sha (1974). (in Japanese)
Watanabe, S.: Pattern Recognition: Human and Mechanical. Wiley, Hoboken (1985)
Oja, E.: Subspace Methods of Pattern Recognition. Research Studies Press, Baldock (1983)
Otsu, N.: Mathematical Studies on Feature Extraction in Pattern Recognition, Researches of The Electrotechnical Laboratory, 818 (1981). (in Japanese)
Grenander, U., Miller, M.: Pattern Theory: From Representation to Inference. OUP, Oxford (2007)
Malcev, A.: Foundations of Linear Algebra. Gostekhizdat (1948), in Russian. English translation: W. H. Freeman and Company, New York (1963)
Cichocki, A., Zdunek, R., Phan, A.-H., Amari, S.: Nonnegative Matrix and Tensor Factorizations: Applications to Exploratory Multi-way Data Analysis and Blind Source Separation. Wiley, Hoboken (2009)
Itskov, M.: Tensor Algebra and Tensor Analysis for Engineers. ME. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-16342-0
Mørup, M.: Applications of tensor (multiway array) factorizations and decompositions in data mining. Wiley Interdisc. Rev.: Data Min. Knowl. Disc. 1, 24–40 (2011)
Kolda, T.G., Bader, B.W.: Tensor decompositions and applications. SIAM Rev. 51, 455–500 (2009)
Itoh, H., Imiya, A., Sakai, T.: Approximation of N-way principal component analysis for organ data. In: Chen, C.-S., Lu, J., Ma, K.-K. (eds.) ACCV 2016. LNCS, vol. 10118, pp. 16–31. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-54526-4_2
Edelman, A., Arias, T.A., Smith, S.T.: The geometry of algorithms with orthogonality constraints. SIAM J. Matrix Anal. Appl. 20, 303–353 (1998)
Turaga, P., Veeraraghavan, A., Chellappa, R.: Statistical analysis on Stiefel and Grassmann manifolds with applications in computer vision. In: IEEE CVPR, pp. 1–8 (2008)
Kroonenberg, P.M.: Applied Multiway Data Analysis. Wiley, Hoboken (2008)
Marron, J.S., Alonso, A.M.: Overview of object oriented data analysis. Biometrical J. 56, 732–753 (2014)
Ferrer, M., Valveny, E., Serratosa, F., Riesen, K., Bunke, H.: Generalized median graph computation by means of graph embedding in vector spaces. Pattern Recognit. 43, 1642–1655 (2010)
Nye, T.M.W.: Principal component analysis in the space of phylogenetic trees. Ann. Stat. 39, 2716–2739 (2011)
Fletcher, P.T., Lu, C., Pizer, S.M., Joshi, S.: Principal geodesic analysis for the study of nonlinear statistics of shape. IEEE Trans. Med. Imaging 23, 995–1005 (2004)
Wong, Y.-C.: Differential geometry of Grassmann manifolds. Proc. Nat. Acad. Sci. 57, 589–594 (1967)
Absil, P.-A., Mahony, R., Sepulchre, R.: Riemannian geometry of Grassmann manifolds with a view on algorithmic computation. Acta Applicandae Math. 80, 199–220 (2004)
Hamm, J., Lee, D.D.: Grassmann discriminant analysis: a unifying view on subspace-based learning. In: Proceedings of the International Conference on Machine Learning, pp. 376–383 (2008)
Zhang, Z., Zha, H.: Principal manifolds and nonlinear dimension reduction via local tangent space alignment. SIAM J. Sci. Comput. 26, 313–338 (2005)
Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)
Andreopoulos, A., Tsotsos, J.K.: Efficient and generalizable statistical models of shape and appearance for analysis of cardiac MRI. Med. Image Anal. 12, 335–357 (2008)
Appendix
Image SVD (imageSVD) [8, 9] for the image array \(\varvec{X}\in \mathbf{R}^{m\times n}\) establishes the decomposition
where \(\varvec{U}\varvec{X}\varvec{V}^\top \) has low rank and \(\varvec{E}\) is the residual error. This decomposition is performed by minimising \(|\varvec{E}|_{\mathrm {F}}^2\) subject to the conditions
for \(k\le m\) and \(l\le n\). Eigenmatrices of \(\varvec{X}\varvec{X}^\top \) and \(\varvec{X}^\top \varvec{X}\) derive matrices \(\varvec{U}\) and \(\varvec{V}\), respectively.
For a collection of \(m\times n\) matrices \(\{\varvec{X}_i\}_{i=1}^N\), where \(N\gg \max (m, n)\), we assume that
The matrix PCA derives a pair of matrices \(\varvec{U}\) and \(\varvec{V}\) by minimising the criterion
with the constraints \(\varvec{U}^\top \varvec{U}=\varvec{I}_{m}\) and \(\varvec{V}^\top \varvec{V}=\varvec{I}_{n}\). A pair of orthogonal matrices \(\varvec{U}\) and \(\varvec{V}\) are eigenmatrices of
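A minimal sketch of the presumable forms, written here as assumptions modelled on the single-matrix case above, is that the collection is centred,
\[ \frac{1}{N}\sum _{i=1}^{N}\varvec{X}_i = \varvec{O}, \]
that the criterion is the mean reconstruction error
\[ J(\varvec{U},\varvec{V}) = \frac{1}{N}\sum _{i=1}^{N}\bigl|\varvec{X}_i - \varvec{U}\bigl(\varvec{U}^\top \varvec{X}_i\varvec{V}\bigr)\varvec{V}^\top \bigr|_{\mathrm {F}}^{2}, \]
and that \(\varvec{U}\) and \(\varvec{V}\) are the eigenmatrices of
\[ \frac{1}{N}\sum _{i=1}^{N}\varvec{X}_i\varvec{X}_i^\top \quad \text{and}\quad \frac{1}{N}\sum _{i=1}^{N}\varvec{X}_i^\top \varvec{X}_i, \]
respectively; dimension reduction is obtained by truncating \(\varvec{U}\) and \(\varvec{V}\) to their leading columns.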
For a pair of \(k\times m \times n\) three-way arrays \(\varvec{F}=((f_{\alpha \beta \gamma }))\) and \(\varvec{G}=(( g_{\alpha \beta \gamma }))\), the Euclidean distance \(d_E\) and the transportation distance \(d_T\) between intensities are
with respect to
for \( k_{\alpha \beta \gamma }^{\alpha '\beta '\gamma '} = |f_{\alpha \beta \gamma }- g_{\alpha '\beta '\gamma '}|^2\).
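As a sketch in the standard form of the discrete transportation problem [5] (the coupling coefficients \(c_{\alpha \beta \gamma }^{\alpha '\beta '\gamma '}\) are introduced here for illustration), these two distances read
\[ d_E(\varvec{F},\varvec{G}) = \Bigl(\sum _{\alpha ,\beta ,\gamma }|f_{\alpha \beta \gamma }-g_{\alpha \beta \gamma }|^2\Bigr)^{1/2}, \qquad d_T(\varvec{F},\varvec{G}) = \min _{c}\sum _{\alpha ,\beta ,\gamma }\sum _{\alpha ',\beta ',\gamma '} c_{\alpha \beta \gamma }^{\alpha '\beta '\gamma '}\, k_{\alpha \beta \gamma }^{\alpha '\beta '\gamma '}, \]
the minimisation being taken with respect to the mass-balance constraints
\[ \sum _{\alpha ',\beta ',\gamma '} c_{\alpha \beta \gamma }^{\alpha '\beta '\gamma '} = f_{\alpha \beta \gamma }, \qquad \sum _{\alpha ,\beta ,\gamma } c_{\alpha \beta \gamma }^{\alpha '\beta '\gamma '} = g_{\alpha '\beta '\gamma '}, \qquad c_{\alpha \beta \gamma }^{\alpha '\beta '\gamma '}\ge 0. \]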
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Itoh, H., Imiya, A. (2018). Discrimination of Volumetric Shapes Using Orthogonal Tensor Decomposition. In: Reuter, M., Wachinger, C., Lombaert, H., Paniagua, B., Lüthi, M., Egger, B. (eds) Shape in Medical Imaging. ShapeMI 2018. Lecture Notes in Computer Science(), vol 11167. Springer, Cham. https://doi.org/10.1007/978-3-030-04747-4_26
DOI: https://doi.org/10.1007/978-3-030-04747-4_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04746-7
Online ISBN: 978-3-030-04747-4