T: Particle Physics Division (Fachverband Teilchenphysik)
T 21: Experimental Methods 1 (Computing, Machine Learning, Statistics)
T 21.1: Talk
Monday, 27 March 2017, 16:45–17:00, JUR 253
Design and Execution of make-like Distributed Analyses — •Robert Fischer, Ralf Florian von Cube, Martin Erdmann, Benjamin Fischer, and Marcel Rieger — III. Physikalisches Institut A, RWTH Aachen
In particle physics, workflow management systems are primarily used as tailored solutions in dedicated areas such as Monte Carlo production. However, physicists performing data analyses are usually required to steer their individual workflows manually, which is time-consuming and often leads to undocumented relations between particular workloads. We present a generic analysis design pattern that copes with the sophisticated demands of end-to-end HEP analyses. The approach represents a paradigm shift from executing parts of the analysis to defining the analysis. Our tools allow arbitrary workloads and the dependencies between them to be specified in a lightweight and scalable structure. Further features are multi-user support, automated dependency resolution and error handling, central scheduling, and status visualization. The WLCG infrastructure is supported, including CREAM-CE, DCAP, SRM, and GSIFTP. Owing to the open structure, additional computing resources, such as local computing clusters or Dropbox storage, can easily be added. Computing jobs execute their payload, which may be any executable or script, in a dedicated software environment. Software packages are installed as required, and input data is retrieved on demand. We develop and test our system alongside ttbb and ttH cross section measurements. The clear interfaces and dependencies between individual workloads enable a make-like execution.
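The following is a minimal sketch of what such a make-like definition of workloads and their dependencies could look like. The abstract does not name a specific framework, so this example uses the open-source Python package luigi purely for illustration; the task names, parameters, and file names are hypothetical and not taken from the authors' analysis.

import luigi

class SelectEvents(luigi.Task):
    """Hypothetical workload: run an event selection on one dataset."""
    dataset = luigi.Parameter()

    def output(self):
        # the presence of this output decides whether the task needs to run
        return luigi.LocalTarget("selected_{}.txt".format(self.dataset))

    def run(self):
        # payload: in practice this could invoke any executable or script
        with self.output().open("w") as f:
            f.write("selected events for {}\n".format(self.dataset))

class MeasureCrossSection(luigi.Task):
    """Hypothetical workload that depends on the selection step."""
    dataset = luigi.Parameter(default="ttH")

    def requires(self):
        # dependency declaration: the scheduler resolves and runs it first
        return SelectEvents(dataset=self.dataset)

    def output(self):
        return luigi.LocalTarget("xsec_{}.txt".format(self.dataset))

    def run(self):
        with self.input().open("r") as fin, self.output().open("w") as fout:
            fout.write("cross section derived from {}\n".format(self.input().path))

if __name__ == "__main__":
    # make-like execution: only workloads with missing outputs are (re)produced
    luigi.build([MeasureCrossSection(dataset="ttH")], local_scheduler=True)

Running the script twice illustrates the make-like behaviour: the second invocation finds both output targets already present and schedules no work, analogous to make skipping up-to-date targets.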