Publication
ICASSP 2023
Tutorial

Parameter-Efficient Learning for Speech and Language Processing: Adapters, Prompts, and Reprogramming

Abstract

With the rising interest in using frozen pre-trained models for diverse downstream applications, how to design a training framework that is both performant and parameter-efficient remains an open problem. As several recently proposed techniques share similar design principles, we aim to provide an in-depth summary and a taxonomy of the differences among parameter-efficient learning modules. The presented topic is emerging as an essential pathway for the research community to design foundation models.
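To make the idea of a parameter-efficient learning module concrete, below is a minimal sketch of one widely used variant, a bottleneck adapter trained on top of a frozen backbone. It assumes PyTorch; the module name, dimensions, and the toy backbone are illustrative choices, not the tutorial's reference implementation.

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual add."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Only the small down/up projections carry trainable parameters.
        return x + self.up(self.act(self.down(x)))


# Usage sketch: freeze a pre-trained backbone and train only the adapter weights.
backbone = nn.TransformerEncoderLayer(d_model=768, nhead=12, batch_first=True)
for p in backbone.parameters():
    p.requires_grad = False  # backbone stays fixed during downstream training

adapter = Adapter(hidden_dim=768)
optimizer = torch.optim.AdamW(adapter.parameters(), lr=1e-3)

hidden = torch.randn(2, 10, 768)      # (batch, sequence, hidden)
out = adapter(backbone(hidden))       # frozen backbone, trainable adapter
```

Prompt-based and reprogramming-based modules follow the same principle: a small set of trainable parameters (learned input tokens or input transformations) steers a frozen model, differing mainly in where those parameters are inserted.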