Fig 1 - uploaded by Junde Wu
Feature Centroid Contrast Learning. Our contrastive loss module operates directly on the features output by the backbone, which increases interpretability, as shown on the right. Unlike other contrastive learning methods, we use only one data augmentation and one backbone. The norm module applies l2 normalization, which keeps the centroid updates stable.
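The caption's point about l2 normalization stabilizing the centroid can be illustrated with a minimal sketch. The momentum (EMA) update rule below is our assumption, not stated in the source; the source only says that l2 normalization keeps centroid changes stable:

```python
import numpy as np

def update_centroid(centroid, feature, momentum=0.9):
    """Move a class centroid toward a new feature, then re-normalize.

    The momentum update is an assumed rule for illustration; the key
    point from the caption is the l2 normalization, which keeps the
    centroid on the unit sphere so its step size stays bounded.
    """
    feature = feature / np.linalg.norm(feature)        # l2-normalize the feature
    centroid = momentum * centroid + (1 - momentum) * feature
    return centroid / np.linalg.norm(centroid)         # l2-normalize the centroid

# Usage: a unit-norm centroid nudged toward a new feature stays unit-norm.
c = np.array([1.0, 0.0])
f = np.array([0.0, 1.0])
c_new = update_centroid(c, f)
```

Because both the feature and the updated centroid are re-normalized, consecutive centroids differ by a bounded amount regardless of the raw feature magnitudes.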

Source publication
Preprint
Deep-learning-based medical imaging classification models usually suffer from the domain shift problem, where classification performance drops when training data and real-world data differ in imaging equipment manufacturer, image acquisition protocol, patient population, etc. We propose Feature Centroid Contrast Learning (FCCL), which can impr...

Contexts in source publication

Context 1
... propose Feature Centroid Contrast Learning (FCCL), as shown in Fig. 1. When training on the source domain, the model is constrained by a contrastive loss (Fig. 1, bottom branch) in addition to the regular cross-entropy loss (Fig. 1, top branch). ...
Context 2
... propose Feature Centroid Contrast Learning (FCCL), as shown in Fig. 1. When training on the source domain, the model is constrained by a contrastive loss (Fig. 1, bottom branch) in addition to the regular cross-entropy loss (Fig. 1, top branch). The contrastive loss is computed between the features generated by the backbone of the classifier network and the class centroids of these features. ...
Context 3
... propose Feature Centroid Contrast Learning (FCCL), as shown in Fig. 1. When training on the source domain, the model is constrained by a contrastive loss (Fig. 1, bottom branch) in addition to the regular cross-entropy loss (Fig. 1, top branch). The contrastive loss is computed between the features generated by the backbone of the classifier network and the class centroids of these features. Intuitively, this loss pulls each sample toward the centroid of its own class and pushes it away from the centroids of all other classes. See Fig. 1 on the ...
Context 4
... cross-entropy loss (Fig. 1, top branch). The contrastive loss is computed between the features generated by the backbone of the classifier network and the class centroids of these features. Intuitively, this loss pulls each sample toward the centroid of its own class and pushes it away from the centroids of all other classes. See Fig. 1, right, for a visualization of how the features of each instance move with the centroids. FCCL constrains the distribution of the features of different classes through feature-centroid contrast. The contrastive loss ...
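The pull/push behavior described in these contexts can be sketched as a cross-entropy over feature-centroid cosine similarities. The softmax form and the temperature parameter are our assumptions for illustration; the source describes only the pull-toward-own-centroid, push-from-other-centroids intuition:

```python
import numpy as np

def feature_centroid_contrastive_loss(features, labels, centroids, tau=0.1):
    """Cross-entropy over feature-centroid cosine similarities.

    Pulls each feature toward its own class centroid and pushes it away
    from the other centroids. The softmax/temperature formulation is an
    assumed loss form, not taken from the source.
    """
    # l2-normalize so dot products are cosine similarities
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    centroids = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    logits = features @ centroids.T / tau            # shape (N, num_classes)
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # negative log-probability of each sample's own class centroid
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Minimizing this loss increases similarity to the correct centroid (the numerator) while decreasing similarity to all other centroids (the denominator), matching the pull/push intuition; the per-sample contrastive loss would then be added to the regular cross-entropy loss on the classifier head.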

Similar publications

Article
Medical imaging classification plays a vital role in identifying and diagnosing diseases, which is very helpful to doctors. Conventional methods classify based on shape, color, and/or texture, but most small problematic areas are not visible in medical images, which leads to less efficient classification and a poor ability to identify disea...