(R) LayerMerge: Neural network depth compression by layer pruning and merging (ICML 2024)

Paper: https://arxiv.org/abs/2406.12837

Code: https://github.com/snu-mllab/LayerMerge

In short: LayerMerge reduces the depth of CNNs and diffusion models by pruning and merging convolution and activation layers.

Qualitative example of LayerMerge

Overview: LayerMerge is a new method to make convolutional neural networks more efficient without sacrificing performance. Traditional methods to reduce network depth typically follow one of two approaches:

1. Pruning convolution layers: Removes parameters aggressively, at the risk of discarding important information.

2. Pruning activation layers and merging convolutions: Eliminates redundant activation layers and merges the resulting consecutive convolution layers, which enlarges the kernel size and can negate the speed gain (see the sketch after this list).
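To make the kernel-growth point concrete, here is a minimal PyTorch sketch, not the authors' code (`merge_convs` is a hypothetical helper): once the activation between two stride-1 convolutions is removed, the pair collapses into a single convolution whose kernel is the full convolution of the two original kernels, of size k1 + k2 - 1.

```python
import torch
import torch.nn.functional as F

def merge_convs(conv1: torch.nn.Conv2d, conv2: torch.nn.Conv2d) -> torch.nn.Conv2d:
    """Merge conv2(conv1(x)) into one Conv2d; assumes stride 1, no groups."""
    k1, k2 = conv1.kernel_size[0], conv2.kernel_size[0]
    w1 = conv1.weight  # (mid, in, k1, k1)
    w2 = conv2.weight  # (out, mid, k2, k2)
    # Composite kernel = full convolution of the two weight tensors: treat w2
    # as a batch of images, use flipped/transposed w1 as the filter, and pad
    # by k1-1 to get the full (k1 + k2 - 1)-sized output.
    w_merged = F.conv2d(w2, w1.permute(1, 0, 2, 3).flip(2, 3), padding=k1 - 1)
    # Composite bias = conv2 applied to conv1's constant bias, plus conv2's bias.
    b1 = conv1.bias if conv1.bias is not None else torch.zeros(w1.shape[0])
    b2 = conv2.bias if conv2.bias is not None else torch.zeros(w2.shape[0])
    b_merged = w2.sum(dim=(2, 3)) @ b1 + b2
    merged = torch.nn.Conv2d(
        conv1.in_channels, conv2.out_channels,
        kernel_size=k1 + k2 - 1,
        padding=conv1.padding[0] + conv2.padding[0],
    )
    merged.weight.data = w_merged.detach()
    merged.bias.data = b_merged.detach()
    return merged

# Sanity check: the merged layer matches the two-layer composition.
# (conv1 uses bias=False here: with zero padding, an inner-conv bias would
# leak into the border region and break exact equality there.)
c1 = torch.nn.Conv2d(3, 8, 3, padding=1, bias=False)
c2 = torch.nn.Conv2d(8, 16, 3, padding=1)
x = torch.randn(1, 3, 32, 32)
assert torch.allclose(merge_convs(c1, c2)(x), c2(c1(x)), atol=1e-4)
```

Note how two 3x3 kernels merge into one 5x5 kernel: fewer layers, but more FLOPs per layer, which is exactly the trade-off LayerMerge's joint selection is designed to balance.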

LayerMerge addresses these issues by jointly pruning convolution layers and activation functions. It selects which layers to remove so as to speed up inference while minimizing performance loss. Since this selection involves an exponential search space, we formulate a surrogate optimization problem and solve it efficiently via dynamic programming.
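A hedged sketch of the dynamic-programming idea, assuming hypothetical lookup tables `cost[i][j]` and `latency[i][j]` that give the accuracy cost and measured latency of replacing layers i+1..j with one merged unit (the paper's actual surrogate objective and tables differ):

```python
import math

def min_cost_depth_compression(cost, latency, num_layers, budget, step=1.0):
    """Knapsack-style DP: pick segment boundaries (kept activations) that
    minimize total cost under a latency budget. Assumes a feasible solution
    exists within the budget. cost/latency are hypothetical tables indexed
    [i][j] for the merged segment covering layers i+1..j."""
    T = int(budget / step)  # discretize the latency budget
    dp = [[math.inf] * (T + 1) for _ in range(num_layers + 1)]
    choice = {}
    for t in range(T + 1):
        dp[0][t] = 0.0  # an empty prefix costs nothing
    for j in range(1, num_layers + 1):
        for t in range(T + 1):
            for i in range(j):  # candidate: merge layers i+1..j into one unit
                dt = int(latency[i][j] / step)
                if dt <= t and dp[i][t - dt] + cost[i][j] < dp[j][t]:
                    dp[j][t] = dp[i][t - dt] + cost[i][j]
                    choice[(j, t)] = i
    # Backtrack the chosen merged segments from the final state.
    segments, j, t = [], num_layers, T
    while j > 0:
        i = choice[(j, t)]
        segments.append((i + 1, j))
        t -= int(latency[i][j] / step)
        j = i
    return dp[num_layers][T], segments[::-1]
```

The DP runs in time polynomial in the number of layers and the discretized budget, sidestepping the exponential enumeration of all layer subsets.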

Our results show that LayerMerge outperforms existing depth-compression methods on tasks such as image classification and image generation.

Demo showing the effectiveness of LayerMerge with MobileNetV2-1.0 on ImageNet and with DDPM on CIFAR-10.

submitted by /u/jusjinuk