How do you calculate the number of parameters in a BatchNormalization layer?

BatchNormalization parameters = number of filters in the previous convolutional layer × 4

Posted by yaohong on Tuesday, November 17, 2020


# Environment:
# OS			macOS Catalina 10.15.6
# python 		3.7
# pip 			20.1.1
# tensorflow	1.14.0
# Keras 		2.1.5

from keras.models import Sequential
from keras.layers import Conv2D, BatchNormalization

model = Sequential()

# Conv2D layer: 96 filters, 11x11 kernel, stride 4, no padding
model.add(
    Conv2D(96,
           kernel_size=(11, 11),
           strides=(4, 4),
           padding="valid",
           input_shape=(224, 224, 3),
           activation="relu")
)  # output shape: 54 x 54 x 96

# BatchNormalization layer
model.add(BatchNormalization())
model.summary()

Output:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 54, 54, 96)        34944     
_________________________________________________________________
batch_normalization_1 (Batch (None, 54, 54, 96)        384       
=================================================================
Total params: 35,328
Trainable params: 35,136
Non-trainable params: 192
_________________________________________________________________
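As an aside, the 34,944 in the conv2d_1 row can be checked by hand: each of the 96 filters has 11 × 11 × 3 weights plus one bias. A minimal sketch of that arithmetic (plain Python, using the configuration from the code above):

    # Parameter count of the Conv2D layer above: weights + biases per filter.
    filters, kh, kw, in_ch = 96, 11, 11, 3
    conv_params = filters * (kh * kw * in_ch + 1)
    print(conv_params)  # 34944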

Question: how is the 384 to the right of batch_normalization_1 in the second row calculated?

Here, parameters = number of filters in the previous convolutional layer × 4 = 96 × 4 = 384.

Why 4?

Because there are four groups of parameters: gamma weights, beta weights, moving_mean (non-trainable), and moving_variance (non-trainable), and each group has 96 values, one per filter. The two non-trainable groups are what the summary reports as the 192 non-trainable parameters (96 × 2).
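To see the four groups directly, you can inspect the BatchNormalization layer of the model built above. The following is a minimal sketch (the variable name bn_layer is just for illustration):

    # Inspect the four weight groups of the BatchNormalization layer.
    bn_layer = model.layers[-1]
    for name, w in zip(["gamma", "beta", "moving_mean", "moving_variance"],
                       bn_layer.get_weights()):
        print(name, w.shape)                          # each is (96,)

    print("BN params:", bn_layer.count_params())      # 96 * 4 = 384
    print("trainable groups:", len(bn_layer.trainable_weights))          # 2: gamma, beta
    print("non-trainable groups:", len(bn_layer.non_trainable_weights))  # 2: moving stats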

REFERENCES: How the number of parameters associated with BatchNormalization layer is 2048?
