Berlin 2014 – scientific programme
Q: Fachverband Quantenoptik und Photonik (Quantum Optics and Photonics Division)
Q 64: Quantum information: Concepts and methods V
Q 64.3: Talk
Friday, March 21, 2014, 14:30–14:45, Kinosaal
Systematic errors in current quantum state tomography tools — Christian Schwemmer1,2, Lukas Knips1,2, Daniel Richart1,2, •Tobias Moroder3, Matthias Kleinmann3,4, Otfried Gühne3, and Harald Weinfurter1,2 — 1Max-Planck-Institut für Quantenoptik, Garching — 2Department für Physik, Ludwig-Maximilians-Universität München — 3Theoretische Quantenoptik, Universität Siegen — 4Departamento de Matemática, Belo Horizonte
In this work we investigate the systematic errors of tools commonly employed in quantum state tomography to estimate the full density operator or key figures of merit, such as the entanglement or the fidelity with respect to the intended target state. We show that techniques like maximum likelihood or free least squares, used in nearly all experiments within the last decade, suffer from a rather large systematic error for current experimental sample sizes, which leads to strong deviations of the estimated fidelity or to wrong conclusions about the presence of entanglement. These errors are not caused by a mismatch between the real experimental setup and the associated model, but are inherent to the analysis tools themselves; in statistics this effect is called bias. To avoid it, we demonstrate a linear evaluation of the data which does not suffer from this effect, show how even non-linear quantities like entanglement measures can easily be accessed, and finally equip the method with directly computable confidence intervals that do not rely on large-sample properties.
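The origin of the bias can be illustrated with a minimal single-qubit simulation. The sketch below is not the authors' analysis code: it uses the eigenvalue-truncation projection of Smolin, Gambetta and Smith (PRL 108, 070502) as a simple stand-in for constrained maximum likelihood, and compares it to plain linear inversion for a slightly mixed target state near |+⟩. The target state, shot count, and trial count are illustrative choices, not parameters from the talk.

```python
import numpy as np

# Pauli matrices and identity
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(seed=1)

def sample_pauli_means(rho, shots):
    """Simulate 'shots' projective measurements of X, Y, Z on rho
    and return the observed mean values."""
    means = []
    for P in (X, Y, Z):
        p_plus = (1 + np.trace(rho @ P).real) / 2   # Born rule, +1 outcome
        n_plus = rng.binomial(shots, p_plus)
        means.append(2 * n_plus / shots - 1)
    return means

def linear_inversion(means):
    """Unconstrained (linear) estimate; may be unphysical."""
    mx, my, mz = means
    return (I2 + mx * X + my * Y + mz * Z) / 2

def project_to_physical(rho):
    """Closest physical state (Smolin-Gambetta-Smith): truncate negative
    eigenvalues and redistribute the deficit among the remaining ones.
    Used here as a simple stand-in for constrained maximum likelihood."""
    vals, vecs = np.linalg.eigh(rho)
    vals, vecs = vals[::-1], vecs[:, ::-1]          # descending order
    out, acc = np.zeros(len(vals)), 0.0
    for i in range(len(vals) - 1, -1, -1):
        if vals[i] + acc / (i + 1) < 0:
            acc += vals[i]                           # drop this eigenvalue
        else:
            out[:i + 1] = vals[:i + 1] + acc / (i + 1)
            break
    return (vecs * out) @ vecs.conj().T

# Slightly mixed target near |+>; true fidelity with |+> is 0.975
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_true = 0.95 * np.outer(plus, plus.conj()) + 0.05 * I2 / 2
f_true = (plus.conj() @ rho_true @ plus).real

shots, trials = 100, 20000
f_lin, f_con = [], []
for _ in range(trials):
    rho_lin = linear_inversion(sample_pauli_means(rho_true, shots))
    rho_con = project_to_physical(rho_lin)
    f_lin.append((plus.conj() @ rho_lin @ plus).real)
    f_con.append((plus.conj() @ rho_con @ plus).real)

print(f"true fidelity        : {f_true:.4f}")
print(f"linear inversion     : {np.mean(f_lin):.4f}  (mean over trials, unbiased)")
print(f"physicality-enforced : {np.mean(f_con):.4f}  (systematically low: bias)")
```

Because the fidelity with a pure target is linear in the density operator, the linear-inversion estimate averages to the true value, while enforcing physicality shrinks the estimate on exactly the same data. This is the kind of estimator-induced systematic deviation, independent of any setup-model mismatch, that the abstract refers to.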