Description
One frequent task performed on high-resolution 3D time-lapse microscopy images is the reconstruction of cell lineage trees. Constructing such lineage trees is computationally expensive and labor-intensive: traditionally, it involves following individual cells across all time points, annotating their positions, and linking the annotations into complete trajectories using a 2D interface. Despite advances in automated cell tracking, human intervention remains important, yet tedious, for both correcting and guiding automated algorithms. We propose to combine eye-tracking-based 3D cell tracking in Virtual Reality with an incremental deep learning approach to accelerate not only the creation of ground truth but also proofreading. We discuss the current state and structure of the project and the challenges of using natural user interfaces for cell tracking tasks, especially in crowded environments. We detail our planned investigations into speed and usability compared to conventional tracking methods, and discuss how including the deep learning model's uncertainty estimates along the cell trajectories can guide the user during training and proofreading.
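To make the idea concrete, the minimal sketch below illustrates one possible way gaze samples recorded in VR could be linked to detected cell centroids to form a candidate trajectory, and how per-link model uncertainty could be used to flag time points for proofreading. All names, data structures, and thresholds here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: gaze samples -> candidate cell track, plus
# uncertainty-based flagging of links that need human proofreading.
import numpy as np

def gaze_to_trajectory(gaze_by_t, centroids_by_t, max_dist=5.0):
    """Link each time point's gaze sample to the nearest detected cell centroid.

    gaze_by_t:      dict {t: (x, y, z)} gaze position in volume coordinates
    centroids_by_t: dict {t: ndarray of shape (n_cells, 3)} detected centroids
    Returns a dict {t: centroid index or None} forming one candidate track.
    """
    track = {}
    for t, gaze in gaze_by_t.items():
        cents = centroids_by_t.get(t)
        if cents is None or len(cents) == 0:
            track[t] = None
            continue
        dists = np.linalg.norm(cents - np.asarray(gaze), axis=1)
        nearest = int(np.argmin(dists))
        # Reject links where the gaze is far from any cell (e.g. during saccades).
        track[t] = nearest if dists[nearest] <= max_dist else None
    return track

def flag_uncertain_links(track, link_uncertainty, threshold=0.3):
    """Return time points whose model uncertainty exceeds a review threshold."""
    return [t for t, idx in track.items()
            if idx is not None and link_uncertainty.get(t, 0.0) > threshold]
```

In such a scheme, the flagged time points could be highlighted along the rendered trajectory in VR so the user's attention is drawn to the links the model is least certain about.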
| Authors | Samuel Pantze*, Ulrik Günther, Matthew McGinity |
| --- | --- |
| Keywords | Systems Biology, Eye Tracking, Virtual Reality, Microscopy, Cell Tracking, Volume Rendering |