Publication
ICASSP 2023
Tutorial
Parameter-Efficient Learning for Speech and Language Processing: Adapters, Prompts, and Reprogramming
Abstract
With rising interest in using frozen pre-trained models for diverse downstream applications, how to design a training framework that is both performance-effective and parameter-efficient remains an open topic. As many recently proposed techniques share similar design principles, we aim to provide an in-depth summary and draw a taxonomy of the differences among parameter-efficient learning modules. The presented topic is emerging as an essential pathway for designing foundation models for the research community.
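To make the adapter family named in the title concrete, below is a minimal sketch of a residual bottleneck adapter in PyTorch, in the style of Houlsby et al. (2019). It is not code from the tutorial; the backbone layer, dimensions, and bottleneck size are illustrative assumptions. The key idea it demonstrates is that the pre-trained weights stay frozen while only a small number of adapter parameters are trained.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Residual bottleneck adapter: down-project, nonlinearity,
    up-project, then add the input back (skip connection)."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()
        # Zero-init the up-projection so the adapter starts as an
        # identity map and the frozen model's behavior is preserved.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

# Hypothetical frozen backbone layer; in practice this would be one
# layer of a pre-trained speech or language model.
backbone = nn.TransformerEncoderLayer(d_model=768, nhead=12, batch_first=True)
for p in backbone.parameters():
    p.requires_grad = False  # freeze the pre-trained weights

adapter = BottleneckAdapter(hidden_dim=768)
x = torch.randn(2, 16, 768)        # (batch, sequence, hidden)
out = adapter(backbone(x))         # adapt the frozen layer's output

# Only the adapter's ~100K parameters are trainable, versus the
# millions in the frozen backbone.
print(sum(p.numel() for p in adapter.parameters() if p.requires_grad))
```

Prompt tuning and model reprogramming follow the same principle with a different insertion point: instead of inserting modules between frozen layers, they learn a small set of input-side parameters (soft prompt tokens or an input transformation) while the backbone remains untouched.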