Figure - available from: Mathematical Problems in Engineering
Sample images of the QMUL dataset.

Source publication
Article
Full-text available
The region covariance descriptor (RCD), which is known as a symmetric positive definite (SPD) matrix, is commonly used in image representation. As SPD manifolds have a non-Euclidean geometry, Euclidean machine learning methods are not directly applicable to them. In this work, an improved covariance descriptor called the hybrid region covariance de...

Similar publications

Article
Full-text available
Generalized Zero-Shot Learning (GZSL) holds significant research importance as it enables the classification of samples from both seen and unseen classes. A prevailing approach for GZSL is learning transferable representations that can generalize well to both seen and unseen classes during testing. This approach encompasses two key concepts: discri...

Citations

... Although HSIC has found many applications, it does not appear to have been applied directly to SPD manifolds. This study is an extension of our previously published work [44]. In [44], we proposed a method named HSIC subspace learning (HSIC-SL) based on global HSIC maximization. ...
... This study is an extension of our previously published work [44]. In [44], we proposed a method named HSIC subspace learning (HSIC-SL) based on global HSIC maximization. Compared with HSIC-SL, here we use HSIC as a regularization term to build a novel kernel framework on SPD manifolds. ...
... where C_N = I_N − (1/N)Γ_N Γ_N^T is the centering matrix, and K_X and K_Y are the kernel matrices of X and Y, respectively. We have summarized the relevant theory of HSIC in [44]. For a detailed derivation, please refer to our previously published work. ...
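The centered kernel-trace form of HSIC quoted above can be sketched as follows; the Gaussian kernel choice, function names, and bandwidth are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    # Empirical (biased) HSIC estimate: tr(K_X C K_Y C) / (N-1)^2,
    # with centering matrix C = I - (1/N) * ones
    N = X.shape[0]
    C = np.eye(N) - np.ones((N, N)) / N
    K_x = rbf_kernel(X, sigma)
    K_y = rbf_kernel(Y, sigma)
    return np.trace(K_x @ C @ K_y @ C) / (N - 1) ** 2
```

A dependent pair of samples yields a noticeably larger HSIC value than an independent pair, which is the property the regularization term exploits.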
Article
Full-text available
Image recognition tasks involve an increasingly high amount of symmetric positive definite (SPD) matrices data. SPD manifolds exhibit nonlinear geometry, and Euclidean machine learning methods cannot be directly applied to SPD manifolds. The kernel trick of SPD manifolds is based on the concept of projecting data onto a reproducing kernel Hilbert space. Unfortunately, existing kernel methods do not consider the connection of SPD matrices and linear projections. Thus, a framework that uses the correlation between SPD matrices and projections to model the kernel map is proposed herein. To realize this, this paper formulates a Hilbert–Schmidt independence criterion (HSIC) regularization framework based on the kernel trick, where HSIC is usually used to express the interconnectedness of two datasets. The proposed framework allows us to extend the existing kernel methods to new HSIC regularization kernel methods. Additionally, this paper proposes an algorithm called HSIC regularized graph discriminant analysis (HRGDA) for SPD manifolds based on the HSIC regularization framework. The proposed HSIC regularization framework and HRGDA are highly accurate and valid based on experimental results on several classification tasks.
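The kernel trick on SPD manifolds described in this abstract maps SPD matrices into a reproducing kernel Hilbert space. A common way to do this, shown here purely as a hedged sketch (the log-Euclidean Gaussian kernel is one standard choice, not necessarily the paper's), is to compare matrix logarithms under the Frobenius norm:

```python
import numpy as np

def spd_log(S):
    # Matrix logarithm of an SPD matrix via its eigendecomposition
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def log_euclidean_kernel(S1, S2, sigma=1.0):
    # Gaussian kernel on the SPD manifold using the
    # log-Euclidean distance ||log(S1) - log(S2)||_F
    d = np.linalg.norm(spd_log(S1) - spd_log(S2), "fro")
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))
```

This kernel is positive definite on the SPD manifold, which is what lets Euclidean kernel methods such as the graph discriminant analysis above operate on region covariance descriptors.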