Abstract
Auditory hair cells, along the whole length of the cochlea, are routinely visualized using light microscopy techniques. It is common, therefore, to collect more data than is practical to analyze manually. There are currently no widely accepted tools for unsupervised, unbiased, and comprehensive analysis of cells in an entire cochlea. This represents a stark gap between image-based data and other tests of cochlear function. To close this gap, we present a machine learning-based hair cell analysis toolbox for the analysis of whole cochleae imaged with confocal microscopy. The software presented here automates common image analysis tasks such as counting hair cells, determining their best frequency, and quantifying single-cell immunofluorescence intensities along the entire cochlear coil. We hope these automated tools will remove a considerable barrier in cochlear image analysis, allowing for more informative and less selective data analysis practices.