Abstract
The use of a wireless sensor network (WSN) to gather security information about terrorism patterns in war zones offers a wide range of advantages: it reduces personnel fatalities, minimizes attendant personnel and maintenance costs, and improves operational efficiency by deploying sophisticated machinery, which is more resilient than humans at the front line, for autonomous surveillance. This research therefore aims to develop an intelligent wireless sensing system (IWSS) for autonomous surveillance, firearm detection, and defense at the front line through the deployment of intelligent wireless sensor nodes. The prototype of the autonomous defense and surveillance system comprises several sensors and intelligent cameras, all integrated with an ARM Cortex-A53 processor for data collection and image processing using the Support Vector Machine (SVM), Histogram of Oriented Gradients (HOG), and Eye Aspect Ratio (EAR) computer vision algorithms. The extracted surveillance video clips and imagery were used to train, detect, and classify objects with the YOLOv3 model, which is based on a deep convolutional neural network (DCNN). The detection accuracy obtained for a human in possession of a weapon is 100%, with a processing time of 0.875 seconds. In addition, the multi-agent sensing prototype for the autonomous surveillance system is implemented and simulated on a spanning-tree network testbed model, yielding average detection accuracies of 94.85%, 95.10%, 96.58%, 93.57%, 95.26%, and 97.17%, respectively.
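As a minimal illustration of the HOG-plus-SVM stage mentioned above (not the authors' exact pipeline), the sketch below uses OpenCV's pre-trained HOG person detector, which pairs HOG descriptors with a linear SVM, on a single surveillance frame; the file names and the confidence threshold are illustrative assumptions.

```python
# Hedged sketch: HOG descriptors with OpenCV's pre-trained linear SVM person
# detector, as might run on an ARM Cortex-A53 board for frame-level screening.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("frame.jpg")           # a captured surveillance frame (hypothetical path)
frame = cv2.resize(frame, (640, 480))     # smaller input keeps inference time low on embedded hardware

# Sliding-window HOG + SVM detection; returns bounding boxes and SVM confidence weights
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), padding=(8, 8), scale=1.05)

for (x, y, w, h), score in zip(boxes, weights):
    if score > 0.5:                       # illustrative confidence threshold
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("detections.jpg", frame)      # annotated output (hypothetical path)
```

In a full system such as the one described here, frames flagged by this lightweight detector would typically be passed on to a heavier model (e.g., YOLOv3) for weapon classification.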