Hamburg 2016 – scientific programme
T: Fachverband Teilchenphysik (Particle Physics Division)
T 97: Grid-Computing
T 97.9: Talk
Thursday, March 3, 2016, 18:45–19:00, VMP8 SR 05
Experience of Google’s latest Deep Learning library, TensorFlow, in a large-scale WLCG cluster — •Gen Kawamura, Joshua Wyatt Smith, and Arnulf Quadt — II. Physikalisches Institut, Georg-August-Universität Göttingen
Researchers at the Google Brain team released their second-generation Deep Learning library, TensorFlow, as an open-source package under the Apache 2.0 license in November 2015. Google had already deployed the first-generation library, DistBelief, in systems such as Google Search, its advertising systems, speech recognition, Google Images, Google Maps, Street View, Google Translate and many other products. In addition, many researchers in high energy physics have recently started to explore and apply Deep Learning algorithms in their own research and analyses. We present a first use-case scenario of TensorFlow for building Deep Learning models from high-dimensional inputs, such as physics analysis data, on a large-scale WLCG computing cluster. TensorFlow expresses computations as a dataflow graph and can execute them on a wide variety of hardware platforms and systems, including many CPU architectures, GPUs and smartphone platforms. A single library that can distribute the computations needed to build a model across these platforms would significantly simplify the use of Deep Learning algorithms in high energy physics. We deploy TensorFlow in Docker container environments and present its first use in our grid system.
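To illustrate the dataflow-graph model mentioned in the abstract, the sketch below builds a small feed-forward network and evaluates it in a session. It is not code from the talk: the layer sizes, variable names and the random input batch are purely illustrative assumptions, and it uses the graph-and-session API of the original TensorFlow 0.x release (later versions renamed some calls, e.g. the variable initializer).

```python
import numpy as np
import tensorflow as tf

# Placeholder for a batch of high-dimensional inputs; the feature count
# of 30 is an illustrative stand-in for physics analysis variables.
x = tf.placeholder(tf.float32, shape=[None, 30])

# One hidden layer. These calls only add nodes to the dataflow graph;
# nothing is computed until the graph is executed in a session.
W = tf.Variable(tf.truncated_normal([30, 64], stddev=0.1))
b = tf.Variable(tf.zeros([64]))
hidden = tf.nn.relu(tf.matmul(x, W) + b)

# Output layer producing a single score per event.
W_out = tf.Variable(tf.truncated_normal([64, 1], stddev=0.1))
b_out = tf.Variable(tf.zeros([1]))
score = tf.nn.sigmoid(tf.matmul(hidden, W_out) + b_out)

with tf.Session() as sess:
    # TF 0.x initializer; later releases use tf.global_variables_initializer().
    sess.run(tf.initialize_all_variables())
    # Feed a dummy batch; in practice this would be analysis data.
    batch = np.random.rand(128, 30).astype("float32")
    print(sess.run(score, feed_dict={x: batch}).shape)  # (128, 1)
```

Because the graph is defined separately from its execution, the same model definition can be placed on CPUs, GPUs or other supported devices, which is the portability property the abstract highlights.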