Subjective Feedback-based Neural Network Pruning for Speech Enhancement

EasyChair Preprint no. 2056

5 pages · Date: November 30, 2019

Abstract

Speech enhancement based on neural networks outperforms conventional algorithms. However, such networks often contain redundant parameters, which demand unnecessary computation and power consumption. This work aimed to prune large networks by removing extra neurons and connections while maintaining speech enhancement performance. Iterative network pruning combined with retraining was employed to compress the network based on the weight magnitudes of neurons and connections. The pruning method was evaluated on a deep denoising autoencoder trained to enhance speech perception under nonstationary noise interference. Word correct rate served as the subjective intelligibility feedback for evaluating how well listeners understood noisy speech enhanced by the sparse network. Results showed that iterative pruning combined with retraining could remove 50% of the parameters without significantly degrading speech enhancement performance, outperforming two baseline conditions: direct network pruning with retraining and iterative pruning without retraining. Finally, an optimized pruning method that repeats iterative pruning and retraining in a greedy manner was proposed, yielding a maximum pruning ratio of 80%.
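The magnitude-based iterative pruning described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the `retrain_fn` callback and the linear step schedule are assumptions, and a real implementation would fine-tune only the surviving weights between pruning steps.

```python
import numpy as np

def magnitude_prune(weights, prune_ratio):
    """Zero out the given fraction of weights with the smallest magnitudes.

    Returns the pruned weight array and a boolean mask of surviving weights.
    """
    flat = np.abs(weights).ravel()
    k = int(prune_ratio * flat.size)
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

def iterative_prune(weights, target_ratio, steps, retrain_fn):
    """Raise the pruning ratio stepwise, retraining between steps.

    `retrain_fn(weights, mask)` is a hypothetical callback that fine-tunes
    the surviving (unmasked) weights and returns the updated array.
    """
    pruned = weights.copy()
    for step in range(1, steps + 1):
        ratio = target_ratio * step / steps  # assumed linear schedule
        pruned, mask = magnitude_prune(pruned, ratio)
        pruned = retrain_fn(pruned, mask)
    return pruned
```

For example, pruning a random weight matrix to 50% sparsity in five steps (with a no-op stand-in for retraining) leaves half of the entries at zero while the largest-magnitude weights survive.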

Keyphrases: network pruning, neural network, speech enhancement, subjective feedback

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:2056,
  author       = {Fuqiang Ye and Yu Tsao and Fei Chen},
  title        = {Subjective Feedback-based Neural Network Pruning for Speech Enhancement},
  howpublished = {EasyChair Preprint no. 2056},
  year         = {EasyChair, 2019}}