Author:
Rohit Salgotra, Amanjot Kaur Lamba, Dhruv Talwar, Dhairya Gulati, Amir H. Gandomi
Abstract
This paper proposes a novel multi-hybrid algorithm named DHPN, built from the best-known properties of the dwarf mongoose algorithm (DMA), honey badger algorithm (HBA), prairie dog optimizer (PDO), cuckoo search (CS), grey wolf optimizer (GWO), and naked mole-rat algorithm (NMRA). It follows an iterative division for extensive exploration and incorporates major parametric enhancements for improved exploitation. To counter local optima problems, a stagnation phase using CS and GWO is added. Six new inertia weight operators have been analyzed to adapt the algorithmic parameters, and the best combination of these parameters has been found. An analysis of the suitability of DHPN under population variations and higher dimensions has also been performed. For performance evaluation, the CEC 2005 and CEC 2019 benchmark data sets have been used. A comparison has been performed with adaptive differential evolution with optional external archive (JADE), self-adaptive DE (SaDE), success-history-based DE (SHADE), LSHADE-SPACMA, extended GWO (GWO-E), jDE100, and others. The DHPN algorithm is also used to solve the image fusion problem for four fusion quality metrics, namely the edge-based similarity index ($$Q^{AB/F}$$), sum of correlation difference (SCD), structural similarity index measure (SSIM), and artifact measure ($$N^{AB/F}$$). The average values $$Q^{AB/F} = 0.765508$$, $$SCD = 1.63185$$, $$SSIM = 0.726317$$, and $$N^{AB/F} = 0.006617$$ show that DHPN obtains the best combination of results with respect to existing algorithms such as DCH, CBF, GTF, JSR, and others. Experimental results and the statistical Wilcoxon's and Friedman's tests show that the proposed DHPN algorithm performs significantly better than the other algorithms under test.
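As an illustration of the significance testing mentioned in the abstract, the following is a minimal sketch (not the authors' code) of how pairwise Wilcoxon signed-rank tests and a Friedman test could be run with SciPy; the per-function fitness arrays and algorithm names are hypothetical placeholders, not results from the paper.

```python
# Hedged sketch: Wilcoxon signed-rank and Friedman tests over
# per-benchmark-function fitness values; data below are placeholders.
import numpy as np
from scipy.stats import wilcoxon, friedmanchisquare

rng = np.random.default_rng(0)
n_funcs = 30  # e.g. number of benchmark functions

# Hypothetical mean-best-fitness values per benchmark function.
dhpn = rng.random(n_funcs)
jade = dhpn + rng.normal(0.05, 0.02, n_funcs)
shade = dhpn + rng.normal(0.03, 0.02, n_funcs)

# Pairwise Wilcoxon signed-rank test: DHPN vs. each competitor.
for name, other in [("JADE", jade), ("SHADE", shade)]:
    stat, p = wilcoxon(dhpn, other)
    print(f"DHPN vs {name}: W={stat:.3f}, p={p:.4f}")

# Friedman test over all algorithms jointly (requires >= 3 samples).
stat, p = friedmanchisquare(dhpn, jade, shade)
print(f"Friedman: chi2={stat:.3f}, p={p:.4f}")
```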
Publisher
Springer Science and Business Media LLC