Abstract
Convolutional neural networks (CNNs) are frequently used in image classification. However, obtaining sufficient labelled data for training is difficult because labelling data is costly. Learning from a limited number of samples is challenging: the learned model tends to overfit because a few training samples yield a biased estimate of the data distribution and the learning algorithm is correspondingly weakened. This paper proposes a dynamic distribution calibration method for few-shot learning. First, the base-class and new-class samples are normalized to eliminate the effect of differing feature magnitudes, and a pre-trained feature extraction model extracts feature vectors for both the base classes and the new classes. For each new-class sample, the distribution statistics of adjacent and remote base classes in the embedding space are dynamically selected by a threshold method. Similar classes usually have similar feature distributions, such as similar means and variances, so the mean and variance of a Gaussian distribution can be transferred between similar classes; in this paper, the feature distribution is assumed to be Gaussian. The distribution statistics of each new-class sample are then calibrated with a corrected hyperparameter, based on the statistics of the adjacent and remote base classes in the embedding space. Finally, features drawn from the calibrated distributions augment the sample set of the new class. In experiments on the benchmark datasets miniImagenet and CUB, the proposed dynamic distribution calibration method achieves accuracy gains of up to 4% on several few-shot classification tasks and delivers superior performance.
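The calibration step described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name `calibrate`, the neighbour count `k`, the variance correction `alpha`, and the sample count `n_samples` are all hypothetical, and a simple nearest-mean selection stands in for the paper's threshold-based choice of adjacent and remote base classes.

```python
# Hypothetical sketch of Gaussian distribution calibration for few-shot learning.
# Statistics (mean, variance) are borrowed from base classes whose class means
# lie closest to the new-class sample in the embedding space.
import numpy as np

def calibrate(query, base_means, base_vars, k=2, alpha=0.2, n_samples=5):
    """Transfer mean/variance statistics from the k base classes nearest to the
    new-class feature `query`, then draw augmented features from the result."""
    dists = np.linalg.norm(base_means - query, axis=1)  # distance to each base-class mean
    nearest = np.argsort(dists)[:k]                     # indices of adjacent base classes
    # Calibrated Gaussian: average the query with the nearby base-class means,
    # and borrow their variances plus a small correction term alpha.
    mean = (base_means[nearest].sum(axis=0) + query) / (k + 1)
    var = base_vars[nearest].mean(axis=0) + alpha
    # Augment the new class by sampling from the calibrated distribution.
    return np.random.normal(mean, np.sqrt(var), size=(n_samples, query.shape[0]))

# Toy usage: 3 base classes in a 4-dimensional embedding space.
rng = np.random.default_rng(0)
base_means = rng.normal(size=(3, 4))
base_vars = np.abs(rng.normal(size=(3, 4)))
query = rng.normal(size=4)
aug = calibrate(query, base_means, base_vars)
print(aug.shape)  # (5, 4): five synthetic features for the new class
```

The augmented features `aug` would then be appended to the new-class support set before training the classifier.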