Publication
NeurIPS 2024
Workshop paper

Vector Quantization with Sorting Transformation

Abstract

Vector quantization is a compression technique for vector data. It creates a collection of codewords to represent the entire vector space, and each vector is then represented by its nearest-neighbor codeword, with the distance between them being the compression error. To improve this nearest-neighbor representation, we propose applying a sorting transformation to the vector data so that the members within each vector are sorted. It can be shown that, among all permutation transformations, the sorting transformation minimizes the L2 distance between vectors and maximizes similarity measures such as cosine similarity and Pearson correlation. Through experimental validation, we show that sorting-transformation-based vector quantization substantially reduces compression errors and improves nearest-neighbor retrieval performance.

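To make the idea concrete, here is a minimal illustrative sketch, not the paper's implementation: it assumes synthetic Gaussian vectors, a k-means codebook built with scipy.cluster.vq, and mean nearest-codeword L2 distance as the compression error, and it compares that error with and without the sorting transformation.

# Illustrative sketch only (not the authors' code): vector quantization
# with and without the sorting transformation, using a k-means codebook.
import numpy as np
from scipy.cluster.vq import kmeans, vq

rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 16))      # toy vector data (assumed synthetic)

def mean_vq_error(data, n_codewords=256):
    # Mean L2 distance from each vector to its nearest codeword.
    codebook, _ = kmeans(data, n_codewords)
    _, dists = vq(data, codebook)          # nearest-codeword assignment
    return dists.mean()

# Sorting transformation: sort the members within each vector.
X_sorted = np.sort(X, axis=1)

print("plain VQ error :", mean_vq_error(X))
print("sorted VQ error:", mean_vq_error(X_sorted))

With the same codebook size, the quantization error reported for the sorted vectors can then be compared against the unsorted baseline.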