LIRRN: Location-Independent Relative Radiometric Normalization of Bitemporal Remote-Sensing Images

Publication: Contribution to journal › Article › Research › Peer review

Authors

  • Armin Moghimi
  • Vahid Sadeghi
  • Amin Mohsenifar
  • Turgay Celik
  • Ali Mohammadzadeh

External organisations

  • K.N. Toosi University of Technology
  • University of Tabriz
  • University of the Witwatersrand
  • University of Agder

Details

Original language: English
Article number: 2272
Number of pages: 18
Journal: Sensors
Volume: 24
Issue number: 7
Publication status: Published - 2 Apr 2024

Abstract

Relative radiometric normalization (RRN) is a critical pre-processing step that enables accurate comparison of multitemporal remote-sensing (RS) images in applications such as unsupervised change detection. Although existing RRN methods generally produce promising results, their effectiveness depends on specific conditions, especially when the image pair covers land cover/land use (LULC) at different locations. These methods often overlook such complexities, potentially biasing RRN results, mainly because they rely on spatially aligned pseudo-invariant features (PIFs) for modeling. To address this, we introduce a location-independent RRN (LIRRN) method that can automatically identify non-spatially matched PIFs based on brightness characteristics. Additionally, as a fast and coregistration-free model, LIRRN complements keypoint-based RRN for more accurate results in applications where coregistration is crucial. The LIRRN process starts by segmenting the reference and subject images into dark, gray, and bright zones using the multi-Otsu threshold technique. PIFs are then efficiently extracted from each zone using nearest-distance-based image content matching without any spatial constraints. These PIFs are used to fit a linear model that calibrates the subject image on a band-by-band basis. The performance evaluation involved tests on five registered/unregistered bitemporal satellite image pairs, comparing LIRRN with three conventional methods: histogram matching (HM), blockwise KAZE, and keypoint-based RRN. Experimental results consistently demonstrated LIRRN’s superior performance, particularly on unregistered datasets. LIRRN also executed faster than the blockwise KAZE and keypoint-based approaches, with run times comparable to those of HM, when estimating normalization coefficients. Combining the LIRRN and keypoint-based RRN models yielded even more accurate and reliable results, albeit with a slight increase in computational time. To support further investigation and development of LIRRN, its code and some sample datasets are available at the link in the Data Availability Statement.
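The pipeline outlined in the abstract (multi-Otsu zone segmentation, zone-wise PIF matching without spatial constraints, and per-band linear calibration) can be illustrated with a short Python sketch. This is not the authors' reference implementation (that code is linked via the paper's Data Availability Statement); the use of scikit-image's threshold_multiotsu, the quantile-based pairing as a stand-in for "nearest-distance-based image content matching", and the function names segment_zones, extract_pifs, and normalize_band are all assumptions made for illustration.

# Minimal sketch of the LIRRN idea as described in the abstract, NOT the
# authors' reference implementation. Quantile-based pairing within each
# brightness zone is one plausible reading of "nearest-distance-based
# image content matching"; it needs no coregistration.
import numpy as np
from skimage.filters import threshold_multiotsu

def segment_zones(band, classes=3):
    """Label each pixel as dark (0), gray (1), or bright (2) via multi-Otsu."""
    thresholds = threshold_multiotsu(band, classes=classes)
    return np.digitize(band, bins=thresholds)

def extract_pifs(ref_band, sub_band, n_per_zone=500):
    """Pair pixels of similar brightness zone-by-zone, ignoring location."""
    ref_zones, sub_zones = segment_zones(ref_band), segment_zones(sub_band)
    ref_pifs, sub_pifs = [], []
    for z in range(3):
        r = ref_band[ref_zones == z].ravel()
        s = sub_band[sub_zones == z].ravel()
        if r.size == 0 or s.size == 0:
            continue
        # Matched quantiles serve as location-independent PIF pairs.
        q = np.linspace(0.0, 1.0, min(n_per_zone, r.size, s.size))
        ref_pifs.append(np.quantile(r, q))
        sub_pifs.append(np.quantile(s, q))
    return np.concatenate(ref_pifs), np.concatenate(sub_pifs)

def normalize_band(ref_band, sub_band):
    """Fit a gain/offset on the PIF pairs and apply it to the subject band."""
    ref_p, sub_p = extract_pifs(ref_band, sub_band)
    gain, offset = np.polyfit(sub_p, ref_p, 1)
    return gain * sub_band + offset

# Band-by-band normalization of an (H, W, B) subject image against a reference:
# normalized = np.dstack([normalize_band(ref[..., b], sub[..., b])
#                         for b in range(sub.shape[-1])])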

Cite

LIRRN: Location-Independent Relative Radiometric Normalization of Bitemporal Remote-Sensing Images. / Moghimi, Armin; Sadeghi, Vahid; Mohsenifar, Amin et al.
In: Sensors, Vol. 24, No. 7, 2272, 02.04.2024.

Moghimi A, Sadeghi V, Mohsenifar A, Celik T, Mohammadzadeh A. LIRRN: Location-Independent Relative Radiometric Normalization of Bitemporal Remote-Sensing Images. Sensors. 2024 Apr 2;24(7):2272. doi: 10.3390/s24072272
Moghimi, Armin ; Sadeghi, Vahid ; Mohsenifar, Amin et al. / LIRRN : Location-Independent Relative Radiometric Normalization of Bitemporal Remote-Sensing Images. In: Sensors. 2024 ; Vol. 24, No. 7.
@article{b041502aecf746a3a9f8f1f0f20448fb,
title = "LIRRN: Location-Independent Relative Radiometric Normalization of Bitemporal Remote-Sensing Images",
keywords = "bitemporal multispectral images, change detection, location-independent RRN, pseudo-invariant features (PIFs), relative radiometric normalization (RRN), remote sensing (RS)",
author = "Armin Moghimi and Vahid Sadeghi and Amin Mohsenifar and Turgay Celik and Ali Mohammadzadeh",
year = "2024",
month = apr,
day = "2",
doi = "10.3390/s24072272",
language = "English",
volume = "24",
journal = "Sensors",
issn = "1424-3210",
publisher = "Multidisciplinary Digital Publishing Institute",
number = "7",

}

TY - JOUR

T1 - LIRRN

T2 - Location-Independent Relative Radiometric Normalization of Bitemporal Remote-Sensing Images

AU - Moghimi, Armin

AU - Sadeghi, Vahid

AU - Mohsenifar, Amin

AU - Celik, Turgay

AU - Mohammadzadeh, Ali

PY - 2024/4/2

Y1 - 2024/4/2

KW - bitemporal multispectral images

KW - change detection

KW - location-independent RRN

KW - pseudo-invariant features (PIFs)

KW - relative radiometric normalization (RRN)

KW - remote sensing (RS)

UR - http://www.scopus.com/inward/record.url?scp=85190267397&partnerID=8YFLogxK

U2 - 10.3390/s24072272

DO - 10.3390/s24072272

M3 - Article

C2 - 38610483

AN - SCOPUS:85190267397

VL - 24

JO - Sensors

JF - Sensors

SN - 1424-3210

IS - 7

M1 - 2272

ER -