xCloudServing: Automated and Optimized ML Serving across Clouds
Gosia Lazuka, Andreea Simona Anghel, et al.
CLOUD 2023
We study the general framework of warm-started hyperparameter optimization (HPO), where we have some source datasets (tasks) on which we have already performed HPO, and we wish to leverage the results of these HPO runs to warm-start HPO on an unseen target dataset and perform few-shot HPO. Various meta-learning schemes have been proposed for this problem over the last decade (and more). In this paper, we theoretically analyse the optimality gap of the hyperparameters obtained via such warm-started few-shot HPO, and provide novel results for multiple existing meta-learning schemes. We show how these results allow us to identify situations where certain schemes have an advantage over others.
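To make the setup in the abstract concrete, the following is a minimal sketch of warm-started few-shot HPO: the best hyperparameter configurations found on the source tasks are re-evaluated on the target task under a small evaluation budget, and the incumbent is returned. All names here (warm_started_few_shot_hpo, evaluate, source_best_configs, budget) are illustrative assumptions, not from the paper, and the sketch does not reflect any particular meta-learning scheme analysed in it.

```python
from typing import Any, Callable, Dict, Optional, Sequence, Tuple

def warm_started_few_shot_hpo(
    source_best_configs: Sequence[Dict[str, Any]],
    evaluate: Callable[[Dict[str, Any]], float],
    budget: int,
) -> Tuple[Optional[Dict[str, Any]], float]:
    """Evaluate at most `budget` candidate configurations on the target task,
    drawn from the best configurations found on the source tasks, and return
    the incumbent (configuration with the lowest target validation loss)."""
    best_config: Optional[Dict[str, Any]] = None
    best_loss = float("inf")
    for config in source_best_configs[:budget]:
        loss = evaluate(config)  # target-task validation loss for this config
        if loss < best_loss:
            best_config, best_loss = config, loss
    return best_config, best_loss
```

The optimality gap studied in the paper is, informally, the difference between the loss of the incumbent returned by such a warm-started procedure and the loss of the best hyperparameters for the target task.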
Parikshit Ram, Tim Klinger, et al.
IJCAI 2023
Parikshit Ram
ICLR 2025
S. Ilker Birbil, Donato Maragno, et al.
AAAI 2023