
Article
Title Transformers for Generalized Fast Shower Simulation
Author(s) Raikwar, Piyush (CERN) ; Cardoso, Renato (CERN) ; Chernyavskaya, Nadezda (CERN) ; Jaruskova, Kristina (CERN) ; Pokorski, Witold (CERN) ; Salamani, Dalila (CERN) ; Srivatsa, Mudhakar (IBM Watson Res. Ctr.) ; Tsolaki, Kalliopi (CERN) ; Vallecorsa, Sofia (CERN) ; Zaborowska, Anna (CERN)
Publication 2024
Number of pages 8
In: EPJ Web Conf. 295 (2024) 09039
In: 26th International Conference on Computing in High Energy & Nuclear Physics (CHEP 2023), Norfolk, Virginia, USA, 8-12 May 2023, pp. 09039
DOI 10.1051/epjconf/202429509039
Subject category Computing and Computers
Abstract Recently, transformer-based foundation models have proven to be a generalized architecture applicable to various data modalities, ranging from text to audio and even a combination of multiple modalities. Transformers by design should accurately model the non-trivial structure of particle showers thanks to the absence of strong inductive bias, better modeling of long-range dependencies, and interpolation and extrapolation capabilities. In this paper, we explore a transformer-based generative model for detector-agnostic fast shower simulation, where the goal is to generate synthetic particle showers, i.e., the energy depositions in the calorimeter. When trained with an adequate amount and variety of showers, these models should learn better representations compared to other deep learning models, and hence should quickly adapt to new detectors. In this work, we present a prototype of a transformer-based generative model for fast shower simulation and explore certain aspects of the transformer architecture, such as input data representation, sequence formation, and the learning mechanism for our unconventional shower data.
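
The paper itself details the actual model, voxelization, and training objective; purely as a hedged illustration of the sequence-formation idea the abstract mentions, the following PyTorch sketch treats a shower's per-voxel energy deposits as a token sequence fed to a transformer encoder. All names, layer sizes, and the voxel count here are hypothetical placeholders, not the authors' configuration.

import torch
import torch.nn as nn

class ShowerTransformer(nn.Module):
    """Hypothetical sketch: a calorimeter shower as a sequence of voxel tokens."""
    def __init__(self, n_voxels=450, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.energy_proj = nn.Linear(1, d_model)        # embed each voxel's energy deposit
        self.pos_emb = nn.Embedding(n_voxels, d_model)  # learned embedding of the voxel index
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)               # per-voxel energy output

    def forward(self, energies):                        # energies: (batch, n_voxels)
        tokens = self.energy_proj(energies.unsqueeze(-1))
        pos = torch.arange(energies.size(1), device=energies.device)
        h = self.encoder(tokens + self.pos_emb(pos))    # self-attention across all voxels
        return self.head(h).squeeze(-1)                 # (batch, n_voxels)

model = ShowerTransformer()
toy_showers = torch.rand(8, 450)                        # 8 toy showers, 450 voxels each
print(model(toy_showers).shape)                         # torch.Size([8, 450])

The flattened voxel sequence with learned positional embeddings is one plausible reading of "sequence formation" for calorimeter data; the absence of a strong inductive bias means the attention layers, rather than a fixed grid convolution, learn the shower's spatial correlations.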
Copyright/License CC-BY-4.0

Corresponding record in: INSPIRE


Record created 2024-12-12, last modified 2024-12-12


Full text: PDF