NeurIPS 2021
Conference paper

VisDA-2021 Competition: Universal Domain Adaptation to Improve Performance on Out-of-Distribution Data

Abstract

Progress in machine learning is typically measured by training and testing a model on samples drawn from the same distribution, i.e., the same domain. This overestimates future accuracy on out-of-distribution data. The Visual Domain Adaptation (VisDA) 2021 competition tests models’ ability to adapt to novel test distributions and to handle distributional shift. We set up unsupervised domain adaptation challenges for image classifiers and evaluate adaptation to novel viewpoints, backgrounds, styles, and degradations in image quality. Our challenge draws on large-scale publicly available datasets but constructs the evaluation across domains, rather than relying on traditional in-domain benchmarking. Furthermore, we focus on the difficult “universal” setting where, in addition to input distribution drift, methods may encounter missing and/or novel classes in the test set. In this paper, we describe the datasets and evaluation metrics and highlight similarities across top-performing methods that might point to promising future directions in universal domain adaptation research. We hope that the competition will encourage further improvements in machine learning methods’ ability to handle realistic data in many deployment scenarios. See http://ai.bu.edu/visda-2021/
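To make the “universal” setting concrete, the sketch below illustrates how the source and target label sets may only partially overlap: some source classes are missing from the test set, and novel target-only classes must be rejected as “unknown.” The class names, splits, and the simple per-sample accuracy used here are illustrative assumptions for exposition, not the official VisDA-2021 categories or evaluation metric.

```python
# Illustrative sketch of the universal domain adaptation label-set setting.
# Class names, splits, and the scoring rule are hypothetical, not the
# official VisDA-2021 classes or metric.

SOURCE_CLASSES = {"aircraft", "bicycle", "bus", "car", "horse"}   # labeled source domain
TARGET_CLASSES = {"bicycle", "bus", "car", "truck", "train"}      # unlabeled target domain

shared = SOURCE_CLASSES & TARGET_CLASSES          # classes the model can label correctly
source_private = SOURCE_CLASSES - TARGET_CLASSES  # "missing" classes: absent from the test set
target_private = TARGET_CLASSES - SOURCE_CLASSES  # "novel" classes: must be flagged as unknown

UNKNOWN = "unknown"

def toy_accuracy(predictions, ground_truth):
    """Toy per-sample scoring: shared-class samples must receive the correct
    label, while target-private (novel) samples must be rejected as unknown."""
    correct = 0
    for pred, label in zip(predictions, ground_truth):
        if label in shared:
            correct += pred == label
        else:  # novel class in the target domain
            correct += pred == UNKNOWN
    return correct / len(ground_truth)

# Example: one shared-class hit, one novel sample correctly rejected,
# one novel sample wrongly assigned a known class.
preds  = ["car", UNKNOWN, "bus"]
labels = ["car", "truck", "train"]
print(toy_accuracy(preds, labels))  # 0.666...
```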