This week, we presented our initial work on neural embedding compression and image retrieval based on Earth observation foundation models at the IEEE IGARSS conference in Athens. There, we demonstrated how embeddings can be compressed with neural compression, reducing transfer latency and storage substantially at a minimal loss of data utility. We also discussed content-based image retrieval for Earth observation images using foundation-model-generated embeddings stored in vector databases, demonstrating a mean average precision of 97.4% on the BigEarthNet benchmark classes at high retrieval speeds.
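To make the idea concrete, here is a minimal sketch of such a pipeline under illustrative assumptions: a 768-dimensional embedding, a small untrained bottleneck encoder standing in for a trained neural compressor, and FAISS standing in for the vector database. None of these choices are taken from our released implementation; a real neural compressor would be trained, for example with a rate-distortion objective.

```python
# Minimal sketch, under illustrative assumptions (not our released code):
# (1) compress foundation-model embeddings with a small learned encoder, and
# (2) retrieve similar Earth observation images from a vector index.
import numpy as np
import faiss
import torch
import torch.nn as nn

d_in, d_code = 768, 32          # assumed original / compressed embedding sizes
n_corpus, k = 10_000, 10        # assumed archive size and number of neighbours

# (1) Neural-compression stand-in: a bottleneck encoder (untrained here).
encoder = nn.Sequential(nn.Linear(d_in, 256), nn.ReLU(), nn.Linear(256, d_code))

# Random placeholder vectors in place of real foundation-model embeddings.
rng = np.random.default_rng(0)
corpus = torch.from_numpy(rng.standard_normal((n_corpus, d_in)).astype("float32"))
query = torch.from_numpy(rng.standard_normal((1, d_in)).astype("float32"))

with torch.no_grad():
    corpus_codes = encoder(corpus).numpy()
    query_code = encoder(query).numpy()

# (2) Content-based retrieval: cosine-similarity search over the compressed codes.
faiss.normalize_L2(corpus_codes)
faiss.normalize_L2(query_code)

index = faiss.IndexFlatIP(d_code)   # exact inner-product index (vector DB stand-in)
index.add(corpus_codes)
scores, ids = index.search(query_code, k)
print("top-%d archive ids:" % k, ids[0])
```

In this sketch the search happens entirely over the compressed codes, which is what allows both the storage footprint and the transfer latency to shrink while retrieval quality degrades only slightly.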
Overall, we expect the exchange of embeddings from foundation models to pick up in the next few years, hopefully democratizing access to Earth observation data, including for places with limited bandwidth.