Modeling and Unsupervised Unmixing Based on Spectral Variability for Hyperspectral Oceanic Remote Sensing Data with Adjacency Effects

Authors:

Yannick Deville 1; Salah-Eddine Brezini 1,2; Fatima Zohra Benhalouche 1,2,3; Moussa Sofiane Karoui 1,2,3; Mireille Guillaume 4; Xavier Lenot 5; Bruno Lafrance 5; Malik Chami 6; Sylvain Jay 4; Audrey Minghelli 7,8; Xavier Briottet 9; Véronique Serfaty 10

Affiliations:

1. Université de Toulouse, UPS-CNRS-OMP-CNES, IRAP, 31400 Toulouse, France

2. Université des Sciences et de la Technologie d’Oran-Mohamed Boudiaf, LSI, Bir El Djir, Oran 31000, Algeria

3. Algerian Space Agency (ASAL), Centre des Techniques Spatiales (CTS), Arzew 31200, Algeria

4. Aix Marseille University, CNRS, Centrale Marseille, Institut Fresnel, 13013 Marseille, France

5. CS GROUP, 31506 Toulouse CEDEX 05, France

6. Université Côte d’Azur, Observatoire de la Côte d’Azur, CNRS, Sorbonne Université (UFR 918), Laboratoire Lagrange, CS 34229, 06304 Nice CEDEX 4, France

7. Laboratoire d’Informatique et Système (LIS), Université de Toulon, CNRS UMR 7020, 83041 Toulon, France

8. Laboratoire d’Informatique et Système (LIS), Aix Marseille Université, 13288 Marseille, France

9. Université de Toulouse, ONERA/DOTA, 31055 Toulouse CEDEX 4, France

10. DGA/AID, 75509 Paris CEDEX 15, France

Abstract

In a previous paper, we introduced (i) a specific hyperspectral mixing model for the sea bottom, based on a detailed physical analysis that includes the adjacency effect, and (ii) an associated unmixing method that is supervised (i.e., not blind) in the sense that it requires a prior estimation of various parameters of the mixing model, which is constraining. Here, we proceed much further by first showing analytically that the above model can be seen as a specific member of the general class of mixing models involving spectral variability. We then process such data with the IP-NMF unsupervised (i.e., blind) unmixing method that we proposed in previous works to handle spectral variability. Such variability especially occurs when the sea depth varies significantly over the considered scene. We show that IP-NMF then yields significantly better estimates of the pure spectra than a classical method from the literature that was not designed to handle such variability. We present test results obtained with realistic synthetic data. These tests address several reference water depths, up to 7.5 m, and clear or standard water. For instance, they show that when the reference depth is set to 7.5 m and the water is clear, the proposed approach is able to distinguish the various classes of pure materials when the water depth varies by up to ±0.2 m around this reference depth, over all pixels of the analyzed scene or over a “subscene”: the overall scene may first be segmented so as to obtain smaller depth variations over each subscene. The proposed approach is therefore effective and can be used as a building block for the subpixel classification of the sea bottom in shallow water.
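The mixing-with-spectral-variability idea summarized in the abstract can be illustrated with a minimal numerical sketch. The Python snippet below is not the authors' IP-NMF code; it only simulates a scene in which every pixel sees its own slightly perturbed pure-material spectra (a hypothetical ±5% per-pixel scaling, loosely mimicking depth-dependent attenuation, with made-up scene dimensions) and then runs a plain NMF baseline from scikit-learn, of the kind that is not designed to handle such variability.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_pixels, n_bands, n_endmembers = 500, 100, 3   # hypothetical scene dimensions

# Reference (pure-material) spectra, nonnegative.
E_ref = rng.uniform(0.1, 1.0, size=(n_endmembers, n_bands))

# Per-pixel abundances on the simplex (nonnegative, summing to one).
A = rng.dirichlet(np.ones(n_endmembers), size=n_pixels)

# Spectral variability: each pixel gets its own perturbed endmember set
# (hypothetical +/-5% scaling standing in for a varying water depth).
scales = 1.0 + 0.05 * rng.standard_normal((n_pixels, n_endmembers, 1))
E_pix = np.clip(scales * E_ref[None, :, :], 1e-6, None)   # (pixels, endmembers, bands)

# Observed spectra: X[p] = A[p] @ E_pix[p] for every pixel p.
X = np.einsum('pm,pmb->pb', A, E_pix)

# Plain NMF baseline: one fixed spectrum per material, no variability handling.
model = NMF(n_components=n_endmembers, init='nndsvda', max_iter=500, random_state=0)
A_hat = model.fit_transform(X)   # estimated abundances (up to a scale factor)
E_hat = model.components_        # single estimated spectrum per material
print("Relative reconstruction error:",
      np.linalg.norm(X - A_hat @ E_hat) / np.linalg.norm(X))

An unmixing method designed for spectral variability, such as the IP-NMF approach discussed above, instead allows one spectrum per pure material and per pixel, which is what keeps the pure-material classes separable when the water depth varies across the scene or subscene.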

Funder

French Defense Agency

Publisher

MDPI AG

Subject

General Earth and Planetary Sciences

Cited by 3 articles.

1. Informed NMF-Based Unmixing Method Addressing Spectral Variability for Marine Mucilage Mapping using Hyperspectral Prisma Data;IGARSS 2024 - 2024 IEEE International Geoscience and Remote Sensing Symposium;2024-07-07

2. Active Hyperspectral-Based Projects at the Algerian Space Agency;2024 IEEE Mediterranean and Middle-East Geoscience and Remote Sensing Symposium (M2GARSS);2024-04-15

3. Improving Geological Remote Sensing Interpretation via Optimal Transport-Based Point–Surface Data Fusion;Remote Sensing;2023-12-22
