Karlsruhe 2024 – scientific programme
T: Fachverband Teilchenphysik
T 119: Data, AI, Computing 8 (foundational & transformer models)
T 119.1: Talk
Friday, March 8, 2024, 09:00–09:15, Geb. 30.33: MTI
Finetuning Foundation Models for Joint Analysis Optimization — •Matthias Vigl, Nicole Hartman, and Lukas Heinrich — TUM
Most searches at the LHC employ an analysis pipeline consisting of several discrete components, each optimized individually and later combined to provide the features used to discriminate SM background from potential signal. These are typically high-level features constructed from particle four-momenta. However, combining individually optimized tasks does not guarantee optimal performance on the final analysis objective. In this study, we show how an analysis benefits from adopting an end-to-end ML optimization approach. Specifically, we investigate the impact of jointly optimizing particle identification and signal-versus-background discrimination, using the transformer-based ParT architecture [arXiv:2202.03772] as a foundation model, and demonstrate the effectiveness of finetuning on multijet final states with CMS open data [DOI:10.7483/OPENDATA.CMS.JGJX.MS7Q].
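The joint-optimization idea described above can be illustrated with a minimal sketch: instead of training the particle-identification stage and the signal-versus-background stage separately, both losses are combined into one objective that is minimized end to end. The function names and the weight `lam` below are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true class.
    # probs: (n_samples, n_classes) array of predicted probabilities;
    # labels: (n_samples,) array of integer class indices.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def joint_loss(id_probs, id_labels, sig_probs, sig_labels, lam=1.0):
    # End-to-end objective: particle-ID loss plus a weighted
    # signal-vs-background loss. The weight `lam` is a hypothetical
    # hyperparameter balancing the two tasks.
    return (cross_entropy(id_probs, id_labels)
            + lam * cross_entropy(sig_probs, sig_labels))

# Illustrative call: maximally uncertain two-class predictions for
# both tasks give a joint loss of 2 * ln(2).
id_probs = np.array([[0.5, 0.5]])
sig_probs = np.array([[0.5, 0.5]])
loss = joint_loss(id_probs, np.array([0]), sig_probs, np.array([1]))
```

In practice both terms would be differentiated through a shared pretrained backbone, so that finetuning on the downstream discrimination task can also reshape the particle-identification features.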
Keywords: Machine Learning