
Idiap Research Institute
σ-GPTs: A New Approach to Autoregressive Models
Type of publication: Conference paper
Citation: Pannatier_ECMLPKDD_2024
Publication status: Accepted
Booktitle: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases
Year: 2024
Month: September
Abstract: Autoregressive models, such as the GPT family, use a fixed order, usually left-to-right, to generate sequences. However, this is not a necessity. In this paper, we challenge this assumption and show that by simply adding a positional encoding for the output, this order can be modulated on-the-fly per sample, which offers key advantageous properties. It allows for sampling of and conditioning on arbitrary subsets of tokens, and it also allows for dynamically sampling multiple tokens in one shot according to a rejection strategy, leading to a sub-linear number of model evaluations. We evaluate our method across various domains, including language modeling, path-solving, and aircraft vertical rate prediction, decreasing the number of steps required for generation by an order of magnitude.
Keywords: Autoregressive models, Permutations, Rejection Sampling, transformers
Projects: MALAT
Authors: Pannatier, Arnaud
Courdier, Evann
Fleuret, François
Attachments
  • Pannatier_ECMLPKDD_2024.pdf
Notes
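A minimal sketch of the mechanism described in the abstract: each token is embedded together with two positional encodings, one for the position the token occupies and one for the position whose token is to be predicted next, so the generation order can be shuffled independently for every sample. This is an illustrative reconstruction, not the authors' released code; the names (sinusoidal_pe, DoublePositionalEmbedding, shuffled_batch) and the exact encoding choice are assumptions.

```python
# Illustrative sketch (not the paper's implementation) of training an
# autoregressive transformer on per-sample shuffled orders by pairing each
# input token with the positional encoding of the position to predict next.

import math
import torch
import torch.nn as nn


def sinusoidal_pe(max_len: int, dim: int) -> torch.Tensor:
    """Standard sinusoidal positional-encoding table, shape (max_len, dim)."""
    pos = torch.arange(max_len).unsqueeze(1).float()
    div = torch.exp(torch.arange(0, dim, 2).float() * (-math.log(10000.0) / dim))
    pe = torch.zeros(max_len, dim)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe


class DoublePositionalEmbedding(nn.Module):
    """Token embedding + encoding of the token's own position + encoding of
    the output position (the position whose token will be predicted)."""

    def __init__(self, vocab_size: int, dim: int, max_len: int):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, dim)
        self.register_buffer("pe", sinusoidal_pe(max_len, dim))

    def forward(self, tokens, in_pos, out_pos):
        # tokens, in_pos, out_pos: (batch, seq_len) integer tensors.
        return self.tok(tokens) + self.pe[in_pos] + self.pe[out_pos]


def shuffled_batch(tokens: torch.Tensor):
    """Permute each sequence independently and build (inputs, positions,
    targets) so the model predicts the token at the next position in the
    permuted order, conditioned on the tokens seen so far in that order."""
    b, n = tokens.shape
    perm = torch.argsort(torch.rand(b, n), dim=1)      # random order per sample
    shuffled = torch.gather(tokens, 1, perm)           # tokens in permuted order
    in_pos = perm                                      # where each input token sits
    out_pos = torch.roll(perm, shifts=-1, dims=1)      # position to predict next
    targets = torch.roll(shuffled, shifts=-1, dims=1)  # token at that position
    # Drop the last step, whose "next position" wraps around to the start.
    return shuffled[:, :-1], in_pos[:, :-1], out_pos[:, :-1], targets[:, :-1]
```

The embeddings produced this way would feed a standard causal decoder trained with cross-entropy against `targets`. Because the output position is an explicit input rather than being implied by left-to-right order, any subset of positions can be conditioned on or queried at inference time; the rejection strategy mentioned in the abstract then drafts several positions per forward pass and keeps only consistent tokens, which is what reduces the number of model evaluations.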