
A Self-Distillation Assisted ResNet-KL Image Classification Network

EasyChair Preprint no. 10663

6 pages · Date: August 3, 2023


Traditional ResNet models suffer from large model size and high computational cost. In this study, we propose a self-distillation assisted ResNet-KL image classification method to address the low accuracy and efficiency of existing approaches to image classification tasks. First, we introduce depthwise separable convolutions into the ResNet network and improve its classification performance by redesigning the activation function, replacing the traditional ReLU with T-ReLU. Second, we strengthen the model's perception of features at different scales by using multi-scale convolutions to fuse the residual layers with attention mechanism layers. To reduce the model's parameter count, we combine feature distillation with logit distillation, optimize the model layer by layer through self-distillation, and apply pruning repeatedly to shrink its size. Finally, to assess the efficacy of our method, we conduct experimental evaluations on the public datasets CIFAR-10, CIFAR-100, and STL-10. The results show that the improved ResNet-KL network achieves accuracy improvements of 1.65%, 2.72%, and 0.36% over traditional ResNet models on these datasets, respectively. Our method obtains better classification performance with the same computational resources, making it promising for applications such as object classification.
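The abstract pairs feature distillation with logit distillation; the "KL" in ResNet-KL suggests the logit term is a temperature-softened KL divergence between teacher and student outputs, blended with the usual hard-label cross-entropy. A minimal NumPy sketch of that standard distillation loss follows; the temperature `T`, weight `alpha`, and function names are illustrative assumptions, not values or identifiers from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; larger T softens the distribution.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend soft-target KL loss with hard-label cross-entropy.

    T and alpha are illustrative hyperparameters, not taken from the paper.
    """
    p_t = softmax(teacher_logits, T)                      # soft teacher targets
    log_p_s = np.log(softmax(student_logits, T) + 1e-12)  # student log-probs
    # KL(p_t || p_s), scaled by T^2 to keep gradient magnitudes comparable
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - log_p_s), axis=-1).mean() * T**2
    # Standard cross-entropy against the ground-truth labels (T = 1)
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * kl + (1 - alpha) * ce
```

In layer-wise self-distillation, the "teacher" logits would come from a deeper stage of the same network rather than a separate model, but the loss shape is the same.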

Keyphrases: image classification, pruning, ResNet, self-distillation

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:

@booklet{EasyChair:10663,
  author = {Yuanyuan Wang and Haiyang Tian and Yu Shen},
  title = {A Self-Distillation Assisted ResNet-KL Image Classification Network},
  howpublished = {EasyChair Preprint no. 10663},
  year = {EasyChair, 2023}}