Publication
IGARSS 2024
Conference paper

TOWARDS EFFICIENT SATELLITE DATA REPRESENTATION LEARNING WITH CONSISTENCY LOSS

Abstract

Foundation models are often pretrained on large datasets and have proven valuable in improving the efficiency of model training for language and visual processing tasks. An increasing amount of research has focused on foundation models for satellite data, which would benefit a wide range of applications such as climate impact modelling. As large quantities of unlabeled satellite data are collected daily, self-supervised learning methods such as image inpainting are key to pretraining these models. Reducing the amount of data required for pretraining by improving learning efficiency could make the implementation of such models more feasible. This research proposes the use of an augmentation-based consistency loss to improve pretraining efficiency while enhancing downstream performance. Two variations of the proposed approach are evaluated by finetuning pretrained models on flood segmentation and multilabel land cover downstream tasks. Findings show that incorporating a consistency loss can enhance downstream performance, although the degree of improvement depends on the downstream task. It is further demonstrated that the downstream improvements can be achieved even with reduced pretraining data.
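The general idea of an augmentation-based consistency loss can be illustrated with a minimal sketch: embeddings of two augmented views of the same satellite image are encouraged to agree, and this term is added to the self-supervised (e.g. inpainting) reconstruction loss. The function names, the mean-squared form of the consistency term, and the weighting factor `lam` are illustrative assumptions, not the paper's exact formulation.

```python
def consistency_loss(z_a, z_b):
    """Mean-squared distance between two embedding vectors,
    one per augmented view of the same input (assumed form)."""
    assert len(z_a) == len(z_b)
    return sum((a - b) ** 2 for a, b in zip(z_a, z_b)) / len(z_a)

def total_loss(reconstruction_loss, z_a, z_b, lam=0.1):
    """Inpainting reconstruction loss plus a weighted consistency
    term; `lam` is a hypothetical trade-off hyperparameter."""
    return reconstruction_loss + lam * consistency_loss(z_a, z_b)

# Example: embeddings of two augmented views of one image.
z_view1 = [0.2, 0.4, 0.6]
z_view2 = [0.2, 0.5, 0.6]
loss = total_loss(1.0, z_view1, z_view2)
```

In practice the two views would come from random augmentations (crops, flips, spectral jitter) of the same scene, and the encoder would be trained to minimize both terms jointly.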
