Abstract
Purpose
This work presents a systematic comparison of popular shape and appearance models. Two statistical and four deep-learning-based shape and appearance models are compared and evaluated in terms of their expressiveness, characterized by generalization ability and specificity, as well as further properties such as input data format, interpretability, and latent space distribution and dimension.
Methods
Classical statistical shape models and their locality-based extension are considered alongside autoencoders, variational autoencoders, diffeomorphic autoencoders, and generative adversarial networks. The approaches are evaluated in terms of generalization ability, specificity, and likeness as a function of the amount of training data. Furthermore, various latent space metrics are presented to capture further major characteristics of the models.
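To make the two core evaluation criteria concrete, the following is a minimal NumPy sketch of how generalization ability and specificity can be computed for a PCA-based statistical shape model. It assumes pre-aligned landmark vectors; the function names (fit_ssm, generalization, specificity) and the Euclidean distance used here are illustrative assumptions, not the paper's implementation.

import numpy as np

def fit_ssm(shapes, n_modes):
    # Fit a PCA-based statistical shape model to aligned landmark
    # vectors of shape (n_samples, n_points * dim).
    mean = shapes.mean(axis=0)
    _, s, vt = np.linalg.svd(shapes - mean, full_matrices=False)
    modes = vt[:n_modes]                      # principal modes of variation
    std = s[:n_modes] / np.sqrt(len(shapes))  # per-mode standard deviations
    return mean, modes, std

def generalization(model, test_shapes):
    # Mean reconstruction error on held-out shapes; lower values mean
    # the model generalizes better to unseen shapes.
    mean, modes, _ = model
    errs = [np.linalg.norm(mean + modes.T @ (modes @ (s - mean)) - s)
            for s in test_shapes]
    return float(np.mean(errs))

def specificity(model, train_shapes, n_samples=1000, seed=0):
    # Mean distance from randomly sampled model instances to their
    # nearest training shape; lower values mean samples stay realistic.
    rng = np.random.default_rng(seed)
    mean, modes, std = model
    dists = []
    for _ in range(n_samples):
        sample = mean + modes.T @ rng.normal(0.0, std)
        dists.append(min(np.linalg.norm(sample - t) for t in train_shapes))
    return float(np.mean(dists))

The deep generative models can be scored analogously, with the decoder (and an encoding or latent optimization step) taking the place of the PCA projection.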
Results
The experiments show that locality-based statistical shape models yield the best generalization ability for both 2D and 3D shape modeling. However, the deep learning approaches show strongly improved specificity. In the case of simultaneous shape and appearance modeling, the neural networks are able to generate more realistic and diverse appearances. A major drawback of the deep learning models, however, is their impaired interpretability and the ambiguity of their latent spaces.
Conclusions
For applications that do not require particularly good specificity, shape modeling can be reliably established with locality-based statistical shape models, especially for 3D shapes. For appearance modeling, however, deep learning approaches are more worthwhile.
Funder
Deutsches Forschungszentrum für Künstliche Intelligenz GmbH (DFKI)
Publisher
Springer Science and Business Media LLC
Subject
Health Informatics, Radiology, Nuclear Medicine and Imaging, General Medicine, Surgery, Computer Graphics and Computer-Aided Design, Computer Science Applications, Computer Vision and Pattern Recognition, Biomedical Engineering
Cited by
5 articles.