
A flexible and robust approach for segmenting cell nuclei from 2D microscopy images using supervised learning and template matching.

journal contribution
posted on 2013-05-01, 00:00, authored by Cheng Chen, Wei Wang, John A. Ozolek, Gustavo Rohde

We describe a new supervised learning-based template matching approach for segmenting cell nuclei from microscopy images. The method uses examples selected by a user to build a statistical model that captures the texture and shape variations of the nuclear structures in a given dataset to be segmented. Segmentation of subsequent, unlabeled images is then performed by finding the model instance that best matches (in the normalized cross-correlation sense) each local neighborhood in the input image. We demonstrate the application of our method to segmenting nuclei from a variety of imaging modalities and quantitatively compare our results to several other methods. Quantitative results using both simulated and real image data show that, while certain methods may work well for certain imaging modalities, our software obtains high accuracy across all of the imaging modalities studied. The results also demonstrate that, relative to several existing methods, the proposed template-based method is more robust: it better handles variations in illumination, variations in texture across imaging modalities, and cluttered nuclei, while producing smoother and more accurate segmentation borders.
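As an illustration of the core matching step described above, the following is a minimal sketch of normalized cross-correlation (NCC) template matching: given a set of nuclear templates (for example, instances sampled from a statistical model built from user-labeled examples), slide each template over the image and keep the template and position with the highest NCC score. All function and variable names here are hypothetical and the sketch is not the authors' released software; it only demonstrates the NCC matching criterion named in the abstract.

```python
import numpy as np


def ncc(patch: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between two equal-sized 2D arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0


def best_template_match(image: np.ndarray, templates: list, stride: int = 4):
    """Slide each template over the image and return the highest-NCC match.

    Returns (score, template_index, (row, col)) for the best local fit.
    """
    best = (-1.0, -1, (0, 0))
    for k, tmpl in enumerate(templates):
        th, tw = tmpl.shape
        for r in range(0, image.shape[0] - th + 1, stride):
            for c in range(0, image.shape[1] - tw + 1, stride):
                score = ncc(image[r:r + th, c:c + tw], tmpl)
                if score > best[0]:
                    best = (score, k, (r, c))
    return best


if __name__ == "__main__":
    # Synthetic example: a Gaussian blob stands in for a nucleus template.
    yy, xx = np.mgrid[0:32, 0:32]
    blob = np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / 60.0)
    image = np.zeros((128, 128))
    image[40:72, 60:92] += blob  # place one "nucleus" in the image
    image += 0.05 * np.random.default_rng(0).standard_normal(image.shape)

    score, idx, (r, c) = best_template_match(image, [blob])
    print(f"best NCC {score:.3f} for template {idx} at ({r}, {c})")
```

In the full method, the template set would come from the learned statistical model of nuclear shape and texture rather than a single fixed patch, and matched regions would then define the segmentation borders.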

History

Publisher Statement

This is the accepted version of the article, which has been published in final form at dx.doi.org/10.1002/cyto.a.22280.

Date

2013-05-01