Abstract
This paper proposes HiDilated, a hierarchical multi-label text classification model based on multi-scale gated dilated convolution, to address insufficient feature extraction from long text. The core of the model is a three-layer one-dimensional dilated convolutional structure with a gating mechanism. By exponentially expanding the network's receptive field, it effectively captures long-distance dependencies between words and extracts deeper semantic information, improving the model's understanding of complex textual structure and semantics. The model further integrates multi-scale gated dilated convolutions, multi-head self-attention, and Bi-GRU at different positions in the feature extraction layer, and a multi-granularity fusion module is designed to extract both local key information and long-distance semantic information from the text. Moreover, to account for the imbalanced distribution of hierarchically structured labels, a focal balanced loss is designed as the model's loss function; it assigns weights to samples according to their classification difficulty, so that training focuses on deeper, harder-to-classify labels. Experimental results show that the proposed model achieves higher classification accuracy than baseline models and that each improved module contributes to its performance. These findings confirm the effectiveness and practicality of the HiDilated model.
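As a rough illustration of the loss described above (not the authors' implementation, whose exact formula is not given in the abstract), a focal-style multi-label loss with per-label balance weights might be sketched as follows; the function name `focal_balanced_loss` and the choice of sigmoid-probability inputs are assumptions:

```python
import numpy as np

def focal_balanced_loss(probs, targets, alpha=1.0, gamma=2.0):
    """Sketch of a focal balanced loss for multi-label classification.

    probs   -- sigmoid output probabilities, shape (batch, num_labels)
    targets -- binary label matrix, same shape
    alpha   -- per-label balance weights (scalar or array), e.g. derived
               from inverse label frequency (assumed; not specified in paper)
    gamma   -- focusing parameter: larger gamma down-weights easy samples
    """
    # p_t is the probability assigned to the true outcome of each label
    p_t = np.where(targets == 1, probs, 1.0 - probs)
    # (1 - p_t)^gamma shrinks the loss of well-classified (easy) samples,
    # so training gradient concentrates on hard, deep-hierarchy labels
    loss = -alpha * (1.0 - p_t) ** gamma * np.log(np.clip(p_t, 1e-12, None))
    return loss.mean()
```

With `gamma = 0` and `alpha = 1` this reduces to plain binary cross-entropy; increasing `gamma` makes a confidently correct prediction contribute far less loss than an uncertain one, which matches the stated goal of focusing on harder-to-classify labels.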