Affiliation:
1. School of Computer Science and Engineering, VIT University, Vellore 632014, India
Abstract
More than 60 percent of the global surface is covered by clouds, which play a vital role in the hydrological cycle, climate change, and radiation budgets by modulating shortwave and longwave radiation. Weather forecast reports are critical to areas such as air and sea transport, energy, agriculture, and the environment. Artificial-intelligence-powered systems are well positioned to replace the current practice, in which expert observers determine cloud types manually. Convolutional neural networks (CNNs) are starting to be used to identify the cloud types associated with meteorological phenomena. This study uses the publicly available Cirrus Cumulus Stratus Nimbus (CCSN) dataset, which consists of a total of 2543 ground-based cloud images. We propose a model called Cloud-MobiNet for the classification of ground-based clouds. The model is an abridged convolutional neural network based on MobileNet. The architecture of Cloud-MobiNet is divided into two blocks, namely the MobileNet building block and the support MobileNet block (SM block). The MobileNet building block consists of the weights of the depthwise and pointwise separable convolutions of the MobileNet model. The SM block is made up of three dense network layers for feature extraction. This makes Cloud-MobiNet lightweight enough to be implemented on a smartphone. An overall accuracy of 97.45% was obtained on the CCSN dataset for cloud-type classification. Cloud-MobiNet promises to be a significant model in the short term, since automated ground-based cloud classification is anticipated to become a preferred means of cloud observation, not only in meteorological analysis and forecasting but also in the aeronautical and aviation industries.
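The depthwise separable convolution at the heart of MobileNet, which the abstract credits for the model's small footprint, factorizes a standard convolution into a per-channel spatial filter (depthwise step) followed by a 1×1 channel-mixing convolution (pointwise step). A minimal numpy sketch of that factorization is below; the function and argument names are illustrative and are not taken from the paper's code:

```python
import numpy as np

def depthwise_separable_conv(x, depth_k, point_w):
    """Depthwise separable convolution (valid padding, stride 1).

    x:       input feature map, shape (H, W, C_in)
    depth_k: one k x k spatial filter per input channel, shape (k, k, C_in)
    point_w: 1x1 pointwise channel-mixing weights, shape (C_in, C_out)
    """
    H, W, C_in = x.shape
    k = depth_k.shape[0]
    Ho, Wo = H - k + 1, W - k + 1

    # Depthwise step: filter each input channel independently.
    depth_out = np.zeros((Ho, Wo, C_in))
    for c in range(C_in):
        for i in range(Ho):
            for j in range(Wo):
                depth_out[i, j, c] = np.sum(
                    x[i:i + k, j:j + k, c] * depth_k[:, :, c]
                )

    # Pointwise step: a 1x1 convolution mixes channels at each location,
    # which is just a matrix product along the channel axis.
    return depth_out @ point_w
```

The parameter count drops from k·k·C_in·C_out for a standard convolution to k·k·C_in + C_in·C_out for the factorized form, which is what makes MobileNet-style models practical on smartphones.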
Funder
Vellore Institute of Technology
Subject
Atmospheric Science, Environmental Science (miscellaneous)
Cited by
8 articles.