
SlimBERT - BERT Compression with Neural Slimming

I studied how to compress the BERT model using structured pruning. I proposed a neural slimming technique to assess the importance of each neuron, and designed a cost function and pruning strategy to remove neurons that contribute little or nothing to the prediction. After fine-tuning on a downstream task, the model learns a more compact structure; the resulting model is named SlimBERT. My thesis is available here.

Methods

To estimate the contribution of each neuron, we introduce an importance factor α, which is a learnable parameter. A slim layer is a group of independent importance factors that are optimized individually. Whenever we want to prune a certain layer, we attach a slim layer to it.
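As a minimal sketch (assuming PyTorch; the class name and the all-ones initialization are illustrative, not the thesis code), a slim layer can be written as a vector of learnable factors that scales the activations of the layer it is attached to:

```python
import torch
import torch.nn as nn

class SlimLayer(nn.Module):
    """One learnable importance factor per neuron of the attached layer."""

    def __init__(self, num_neurons: int):
        super().__init__()
        # Start every factor at 1 so the attached layer is initially unchanged.
        self.alpha = nn.Parameter(torch.ones(num_neurons))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (..., num_neurons); each neuron's activation is scaled
        # by its own importance factor.
        return x * self.alpha
```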

Slim Layer And Loss Function

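The exact cost function is given in the thesis; a plausible minimal form, assuming an L1 sparsity penalty on the importance factors with an illustrative weight lambda_sparsity, looks like this:

```python
import torch

def slimming_loss(task_loss: torch.Tensor, slim_layers, lambda_sparsity: float = 1e-4):
    # An L1 penalty over all importance factors pushes unneeded alphas toward
    # zero, so the corresponding neurons can be pruned after fine-tuning.
    sparsity = sum(layer.alpha.abs().sum() for layer in slim_layers)
    return task_loss + lambda_sparsity * sparsity
```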

Because the slim layer is flexible, we can easily apply it to any part of the model that we want to prune. For BERT, we use it on all the layers, including the embedding layer, the multi-head self-attention layers, and the fully connected layers. A sketch of this attachment is given below.
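The following sketch shows one way slim layers might be attached to a Hugging Face BERT encoder and how near-zero factors could be read off afterwards. The model name, the choice of the FFN intermediate output, and the zero threshold are illustrative assumptions; SlimLayer and slimming_loss refer to the sketches above.

```python
import torch
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")

slim_layers = []
for block in model.encoder.layer:
    slim = SlimLayer(block.intermediate.dense.out_features)
    slim_layers.append(slim)
    # Route the FFN intermediate output through its slim layer.
    block.intermediate.dense = torch.nn.Sequential(block.intermediate.dense, slim)

# ... fine-tune on the downstream task, minimizing slimming_loss(...) ...

# Neurons whose importance factor ends up at zero or below contribute nothing
# to the prediction and can be removed structurally.
for i, slim in enumerate(slim_layers):
    keep = slim.alpha.detach() > 0.0
    print(f"layer {i}: keeping {int(keep.sum())}/{keep.numel()} FFN neurons")
```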

SlimBERT Architecture


Results

We tested our method on 7 GLUE tasks and, using only 10% of the original parameters, recovered 94% of the original accuracy. The method also reduced run-time memory and increased inference speed. Compared with knowledge distillation and other structured pruning methods, the proposed approach achieved better performance under different metrics at the same compression ratio. Moreover, it also improved the interpretability of BERT: by analyzing the neurons with a significant contribution, we can observe that BERT utilizes different components and subnetworks depending on the task.

Performance on GLUE Tasks


Performance of SlimBERT vs Unstructured Pruning


Percentage of Remaining Neurons by Layers


Percentage of Removed Neurons in Each Sublayer


Important Attention Heads in SlimBERT

Green: high importance; Pink: low importance.
