Publication
CODS-COMAD 2021
Conference paper
Deep Domain Adaptation under Label Scarcity
Abstract
The goal of Domain Adaptation (DA) is to leverage labeled examples from a source domain to infer an accurate model for a target domain where labels are unavailable or scarce at best. Recently, there has been a surge in adversarial-learning-based deep-net approaches to the DA problem, a prominent example being the DANN approach [9]. These methods require a large number of labeled source examples to infer a good model for the target domain, and their performance degrades as labels become scarce. In this paper, we study the behavior of such approaches (especially DANN) under scarce-label scenarios. Further, we propose an architecture, namely TRAVERS, that amalgamates TRAnsductive learning principles with adVERSarial learning so as to cushion the performance of these approaches under label scarcity. Experimental results (on both text and images) show a significant boost in the performance of TRAVERS over approaches such as DANN under scarce-label scenarios.
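The DANN approach referenced above trains domain-invariant features via a gradient reversal layer (GRL): the layer acts as the identity on the forward pass, but negates (and scales) the gradient flowing from the domain classifier back into the feature extractor. A minimal NumPy sketch of just that backward-pass behavior is below; the function name `grad_reverse` and the example gradient values are illustrative, and `lam` stands in for DANN's adaptation trade-off hyperparameter:

```python
import numpy as np

def grad_reverse(grad, lam=1.0):
    # Gradient Reversal Layer (GRL), backward pass only:
    # forward is the identity; backward multiplies the incoming
    # gradient by -lam, so the feature extractor is pushed to
    # *confuse* the domain classifier rather than help it.
    return -lam * grad

# Toy backward pass: the domain classifier sends gradient g back
# toward the shared feature extractor.
g = np.array([0.5, -0.2, 0.1])
g_to_features = grad_reverse(g, lam=0.3)
print(g_to_features)  # sign flipped and scaled: [-0.15, 0.06, -0.03]
```

In a full DANN-style model this sits between the feature extractor and the domain classifier, so a single backward pass simultaneously minimizes the label-prediction loss and maximizes the domain-classification loss.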