Author:
Hollandi Réka, Diósdi Ákos, Hollandi Gábor, Moshkov Nikita, Horváth Péter
Abstract
AnnotatorJ combines single-cell identification with deep learning and manual annotation. The quality of cellular analysis depends on accurate and reliable detection and segmentation of cells, so that subsequent analysis steps, e.g. expression measurements, can be carried out precisely and without bias. Deep learning has recently become a popular way of segmenting cells, performing substantially better than conventional methods. However, such deep learning applications must be trained on large amounts of annotated data to meet the highest expectations. High-quality annotations are unfortunately expensive, as they require field experts to create them, and often cannot be shared outside the lab due to medical regulations. We propose AnnotatorJ, an ImageJ plugin for the semi-automatic annotation of cells (or, more generally, objects of interest) on (not only) microscopy images in 2D that helps find the true contour of individual objects by applying U-Net-based pre-segmentation. The manual labour of hand-annotating cells can be significantly accelerated with our tool. It thus enables users to create datasets that could increase the accuracy of state-of-the-art solutions, deep learning or otherwise, when used as training data.
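The pre-segmentation assist described in the abstract can be pictured as: a trained model (e.g. a U-Net) outputs a per-pixel probability map, which is thresholded into a mask whose boundary is offered to the annotator as a contour suggestion to accept or refine. The sketch below is a minimal, hypothetical illustration of that idea in plain NumPy; the function and parameter names are ours, not AnnotatorJ's actual plugin API, and a real model would replace the toy probability map.

```python
import numpy as np

def suggest_contour(prob_map, threshold=0.5):
    """Turn a model's per-pixel probability map into a binary mask and a
    rough contour suggestion (foreground pixels on the mask boundary).

    Illustrative only: AnnotatorJ's real U-Net integration and ROI
    handling live inside ImageJ and differ from this sketch.
    """
    mask = prob_map >= threshold
    # A pixel belongs to the contour if it is foreground but at least one
    # of its 4-connected neighbours is background (padding treats pixels
    # beyond the image edge as background).
    padded = np.pad(mask, 1, mode="constant", constant_values=False)
    interior = (
        padded[:-2, 1:-1] & padded[2:, 1:-1] &   # up, down neighbours
        padded[1:-1, :-2] & padded[1:-1, 2:]     # left, right neighbours
    )
    contour = mask & ~interior
    return mask, contour

# Toy 5x5 "probability map" with a confident 3x3 blob in the centre.
prob = np.zeros((5, 5))
prob[1:4, 1:4] = 0.9
mask, contour = suggest_contour(prob)
# The 3x3 blob yields 9 mask pixels; its single interior pixel is dropped,
# leaving the 8 ring pixels as the suggested contour.
```

In an annotation tool, the user would then correct this suggested contour by hand where the model is wrong, which is where the claimed speed-up over fully manual annotation comes from.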
Publisher
Cold Spring Harbor Laboratory
Cited by 2 articles.