Publication Details

ARTICLE
A benchmark for comparison of cell tracking algorithms
Authors: Maška, M.; Ulman, V.; Svoboda, D.; Matula, P.; Matula, P.; Ederra Ochoa, Cristina; Urbiola García, Ainhoa; España, T.; Venkatesan, S.; Balak, D.M.; Karas, P.; Bolcková, T.; Streitová, M.; Carthel, C.; Coraluppi, S.; Harder, N.; Rohr, K.; Magnusson, K. E.; Jaldén, J.; Blau, H. M.; Dzyubachyk, O.; Křížek, P.; Hagen, G. M.; Pastor-Escudero, D.; Jiménez-Carretero, D.; Ledesma-Carbayo, M. J.; Muñoz Barrutia, María Arrate; Meijering, E.; Kozubek, M.; Ortiz de Solórzano Aurusa, Carlos
Journal title: BIOINFORMATICS
ISSN: 1367-4803
Volume: 30
Issue: 11
Pages: 1609-1617
Publication date: 2014
Abstract:
MOTIVATION: Automatic tracking of cells in multidimensional time-lapse fluorescence microscopy is an important task in many biomedical applications. A novel framework for objective evaluation of cell tracking algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2013 Cell Tracking Challenge. In this article, we present the logistics, datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark.

RESULTS: The main contributions of the challenge include the creation of a comprehensive video dataset repository and the definition of objective measures for comparison and ranking of the algorithms. With this benchmark, six algorithms covering a variety of segmentation and tracking paradigms have been compared and ranked based on their performance on both synthetic and real datasets. Given the diversity of the datasets, we do not declare a single winner of the challenge. Instead, we present and discuss the results for each individual dataset separately.

AVAILABILITY AND IMPLEMENTATION: The challenge Web site (http://www.codesolorzano.com/celltrackingchallenge) provides access to the training and competition datasets, along with the ground truth of the training videos. It also provides access to Windows and Linux executable files of the evaluation software and most of the algorithms that competed in the challenge.