Abstract
Foundation models, generalist artificial intelligence models pre-trained on large unlabeled datasets through self-supervision, have recently risen to prominence in the Earth and atmospheric sciences. While significant attention has been paid to architectures and techniques for pre-training and evaluating these models, less emphasis has been placed on tooling for fine-tuning, which is crucial for allowing researchers and practitioners to adapt foundation models to relevant downstream applications. Here, we present TerraTorch (https://github.com/IBM/terratorch), an open-source library built on PyTorch Lightning and the TorchGeo open-source domain library for geospatial data, designed to streamline the process of fine-tuning geospatial foundation models for different downstream tasks. The library provides easy integration of available pre-trained geospatial foundation models (e.g., Prithvi [1], SatMAE [2], and Scale-MAE [3]), other backbones available through the timm or SMP packages, and fine-tuned models such as IBM's granite-geospatial-biomass. It provides flexible trainers for image segmentation, classification, and pixel-wise regression tasks, while allowing developers to create their own decoders for these and other tasks. For users who prefer to interact at a higher level of abstraction, fine-tuning tasks can be launched through flexible configuration files. The TerraTorch repository includes example configuration files for fine-tuning models for flood mapping, multi-temporal crop segmentation, and land use/land cover classification; following these examples makes it easy to create fine-tuning configuration files for other downstream tasks. Additionally, to facilitate experimentation and benchmarking, TerraTorch supports automation of experiments with configurable hyperparameter optimization and integration with GEO-Bench [4].

[1] Jakubik et al., Foundation Models for Generalist Geospatial Artificial Intelligence, arXiv:2310.18660
[2] Cong et al., SatMAE: Pre-training Transformers for Temporal and Multi-Spectral Satellite Imagery, NeurIPS 2023
[3] Reed et al., Scale-MAE: A Scale-Aware Masked Autoencoder for Multiscale Geospatial Representation Learning, ICCV 2023
[4] Lacoste et al., GEO-Bench: Toward Foundation Models for Earth Monitoring, NeurIPS 2023
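To illustrate the configuration-driven workflow the abstract describes, the sketch below shows what a Lightning-CLI-style fine-tuning config for a segmentation task might look like. This is a minimal sketch, not an actual TerraTorch example: the specific class, backbone, and argument names (e.g., `SemanticSegmentationTask`, `prithvi_vit_100`, `FCNDecoder`) are illustrative assumptions based on the library's PyTorch Lightning foundation; the example configuration files in the TerraTorch repository are the authoritative reference.

```yaml
# Hypothetical fine-tuning config sketch. Section layout follows PyTorch
# Lightning CLI conventions; concrete names below are assumptions, not
# verified TerraTorch identifiers.
trainer:
  max_epochs: 50
  accelerator: auto
model:
  class_path: terratorch.tasks.SemanticSegmentationTask  # assumed task class
  init_args:
    model_args:
      backbone: prithvi_vit_100   # assumed pre-trained backbone identifier
      decoder: FCNDecoder         # assumed decoder; custom decoders also possible
      num_classes: 2              # e.g., flood / no-flood for flood mapping
    loss: ce                      # cross-entropy loss for segmentation
data:
  class_path: my_project.FloodDataModule  # placeholder: dataset-specific datamodule
optimizer:
  class_path: torch.optim.AdamW
  init_args:
    lr: 1.0e-4
```

Since TerraTorch builds on PyTorch Lightning's CLI machinery, a run would then be launched by pointing the library's command-line entry point at a config file of this shape.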