Publication
ICLR 2022
Workshop paper

Regression Transformer: Concurrent Conditional Generation and Regression by Blending Numerical and Textual Tokens

Abstract

We report the Regression Transformer (RT), a method that abstracts regression as a conditional sequence modeling problem. The RT casts continuous properties as sequences of numerical tokens and encodes them jointly with conventional tokens. This yields a dichotomous model that can seamlessly transition between solving regression tasks and conditional generation tasks, governed solely by the mask location. We propose several extensions to the XLNet objective and adopt an alternating training scheme to concurrently optimize property prediction and conditional text generation with a self-consistency loss. Our experiments on both chemical and protein languages demonstrate that the performance of traditional regression models can be surpassed despite training with a cross-entropy loss. Importantly, priming the same model with continuous properties yields a highly competitive conditional generative model that outperforms specialized approaches in a constrained property optimization benchmark. In sum, the Regression Transformer opens the door for "Swiss army knife" models that excel at both regression and conditional generation. This finds application particularly in property-driven, local exploration of the chemical or protein space.
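The core idea of casting a continuous property as a sequence of numerical tokens can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper name is hypothetical, and it assumes each digit is turned into a token tagged with its decimal place so the vocabulary stays finite while preserving magnitude information.

```python
def tokenize_number(value: str) -> list[str]:
    """Sketch of a numerical tokenization: split a decimal string into
    per-digit tokens tagged with their decimal place, e.g.
    '3.25' -> ['_3_0', '_2_-1', '_5_-2'].
    (Illustrative helper; not the authors' exact tokenizer.)"""
    # Position of the decimal point; if absent, it sits after the last digit.
    point = value.index(".") if "." in value else len(value)
    digits = value.replace(".", "")
    # Decimal place is 0 for units, 1 for tens, -1 for tenths, and so on.
    return [f"_{d}_{point - 1 - i}" for i, d in enumerate(digits)]

print(tokenize_number("3.25"))
print(tokenize_number("120"))
```

Such tokens can then be interleaved with the ordinary chemical or protein tokens, so masking the numerical span poses a regression task while masking the text span poses a conditional generation task.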

Date

30 Apr 2022