Attending to Discriminative Certainty for Domain Adaptation

Vinod Kumar Kurmi* , Shanu Kumar* , Vinay P. Namboodiri

[* indicates equal contributions]

Delta Lab

Indian Institute of Technology Kanpur

[Paper] [ArXiv] [Code] [Supplementary] [Poster]

Abstract

In this paper, we aim to solve unsupervised domain adaptation of classifiers, where label information is available for the source domain but not for the target domain. While various methods, including adversarial discriminator based approaches, have been proposed for this problem, most of them adapt the entire image. However, an image contains regions that can be adapted better than others; for instance, the foreground object may be similar in nature across domains. To identify such regions, we propose methods that consider probabilistic certainty estimates of various regions and focus specifically on these during classification for adaptation. We observe that simply by incorporating the probabilistic certainty of the discriminator while training the classifier, we obtain state-of-the-art results on various datasets compared against all recent methods. We provide a thorough empirical analysis of the method through ablation studies, statistical significance tests, and visualization of the attention maps and t-SNE embeddings. These evaluations convincingly demonstrate the effectiveness of the proposed approach.
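To give a flavour of the idea, the sketch below shows one plausible way to turn a domain discriminator's per-region outputs into certainty-based attention over a feature map. This is a minimal NumPy illustration under assumed shapes and names (`discriminator_certainty`, `certainty_attended_features`, a patch-level discriminator producing an (H, W) probability map), not the paper's exact formulation; please refer to the paper and official code for the actual method.

```python
import numpy as np

def discriminator_certainty(p):
    """Map domain-discriminator probabilities p in (0, 1) to certainty in [0, 1].

    Certainty is taken here as one minus the normalized Bernoulli entropy:
    p near 0 or 1 (discriminator is sure of the domain) gives certainty near 1,
    p near 0.5 (discriminator is confused) gives certainty near 0.
    """
    eps = 1e-8
    entropy = -(p * np.log(p + eps) + (1.0 - p) * np.log(1.0 - p + eps))
    return 1.0 - entropy / np.log(2.0)

def certainty_attended_features(features, disc_probs):
    """Re-weight feature-map regions by discriminator certainty.

    features:   (C, H, W) feature map from the encoder (hypothetical shape).
    disc_probs: (H, W) per-region domain probabilities from a patch
                discriminator (hypothetical interface).
    """
    attn = discriminator_certainty(disc_probs)
    attn = attn / (attn.sum() + 1e-8)        # normalise into an attention map
    return features * attn[None, :, :]        # broadcast over channels
```

The design choice being illustrated: regions where the discriminator is highly certain of the domain are exactly the regions that still carry domain-specific signal, so directing the classifier's attention by such a certainty map lets training concentrate adaptation effort where it matters.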

t-SNE Plots

Visualization

Code Coming Soon!

V. K. Kurmi, S. Kumar, V. P. Namboodiri.
Attending to Discriminative Certainty for Domain Adaptation.
IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 2019.

BibTex

@InProceedings{Kurmi_2019_CVPR,
  author    = {Kumar Kurmi, Vinod and Kumar, Shanu and Namboodiri, Vinay P.},
  title     = {Attending to Discriminative Certainty for Domain Adaptation},
  booktitle = {IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2019}
}

Acknowledgement

We thank the members of Delta Lab for their support throughout this research.