Abstract
Motivation: A fundamental step in many analyses of high-dimensional data is dimension reduction. Two basic approaches are the introduction of new, synthetic coordinates and the selection of extant features. Advantages of the latter include interpretability, simplicity, transferability and modularity. A common criterion for unsupervised feature selection is variance or dynamic range. However, in practice it can occur that high-variance features are noisy, that important features have low variance, or that variances are simply not comparable across features because they are measured on unrelated numeric scales or in different physical units. Moreover, users may want to include measures of signal-to-noise ratio and non-redundancy in feature selection.

Results: Here, we introduce the RNR algorithm, which selects features based on (i) the reproducibility of their signal across replicates and (ii) their non-redundancy, measured by linear independence. It takes as input a typically large set of features measured on a collection of objects with two or more replicates per object. It returns an ordered list of features i1, i2, …, ik, where feature i1 is the one with the highest reproducibility across replicates, i2 is the one with the highest reproducibility across replicates after projecting out the dimension spanned by i1, and so on. Applications to microscopy-based imaging of cells and proteomics experiments highlight the benefits of the approach.

Availability: The RNR method is implemented in the R package FeatSeekR and is available via Bioconductor (Huber et al., 2015) under the GPL-3 open source license.

Contact: tuemay.capraz@embl.de
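The R sketch below illustrates the greedy selection scheme described above. It is a conceptual illustration under stated assumptions, not the FeatSeekR implementation: it assumes exactly two replicates stored as object-by-feature matrices, scores reproducibility by the Pearson correlation of each feature between the two replicates, and removes redundancy by projecting the remaining features onto the orthogonal complement of those already selected. All names introduced here (rnr_sketch, X1, X2, k) are hypothetical.

## Conceptual sketch (not the FeatSeekR implementation) of the greedy
## reproducibility / non-redundancy selection described in the abstract.
## Assumptions: exactly two replicates, given as object-by-feature matrices
## X1 and X2; reproducibility of a feature = Pearson correlation between its
## two replicate measurements; redundancy is removed by projecting all
## remaining features onto the orthogonal complement of the selected ones.

rnr_sketch <- function(X1, X2, k) {
  stopifnot(dim(X1) == dim(X2))
  p <- ncol(X1)
  selected <- integer(0)
  R1 <- X1; R2 <- X2   # residual matrices, updated after each selection

  ## project the columns of R onto the orthogonal complement of column j
  proj_out <- function(R, j) {
    v <- R[, j]
    if (sum(v^2) < .Machine$double.eps) return(R)
    R - outer(v, as.vector(crossprod(v, R)) / sum(v^2))
  }

  for (step in seq_len(k)) {
    ## score each remaining feature by its correlation across replicates
    scores <- vapply(seq_len(p), function(j) {
      if (j %in% selected) return(-Inf)
      suppressWarnings(cor(R1[, j], R2[, j]))
    }, numeric(1))
    scores[is.na(scores)] <- -Inf

    best <- which.max(scores)
    selected <- c(selected, best)

    ## remove the dimension spanned by the chosen feature (Gram-Schmidt step)
    R1 <- proj_out(R1, best)
    R2 <- proj_out(R2, best)
  }
  selected
}

## Example on simulated data: 50 objects, 10 features, 2 noisy replicates
set.seed(1)
signal <- matrix(rnorm(50 * 10), 50, 10)
X1 <- signal + 0.5 * matrix(rnorm(50 * 10), 50, 10)
X2 <- signal + 0.5 * matrix(rnorm(50 * 10), 50, 10)
rnr_sketch(X1, X2, k = 3)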
Publisher
Cold Spring Harbor Laboratory