crossChannelNormalizationLayer

Channel-wise local response normalization layer

Description

A channel-wise local response (cross-channel) normalization layer normalizes each element of the input using statistics computed over a window of neighboring channels.

Creation

Description

layer = crossChannelNormalizationLayer(windowChannelSize) creates a channel-wise local response normalization layer and sets the WindowChannelSize property.

layer = crossChannelNormalizationLayer(windowChannelSize,Name,Value) creates a channel-wise local response normalization layer, sets the WindowChannelSize property, and sets the optional Alpha, Beta, K, and Name properties using name-value pairs. For example, crossChannelNormalizationLayer(5,'K',1) creates a local response normalization layer for channel-wise normalization with a window size of 5 and a K hyperparameter of 1. You can specify multiple name-value pairs. Enclose each property name in single quotes.

Properties

Cross-Channel Normalization

WindowChannelSize

Size of the channel window, which controls the number of channels used to normalize each element, specified as a positive integer less than or equal to 16.

If WindowChannelSize is even, then the window is asymmetric: the software uses the previous floor((w-1)/2) channels and the following floor(w/2) channels, where w is the window size. For example, if WindowChannelSize is 4, then the layer normalizes each element by its neighbor in the previous channel, the element itself, and its neighbors in the next two channels.

Example: 5
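The window-bound rule above can be sketched in a few lines of Python (an illustrative helper only; channel_window is a hypothetical name, not part of the toolbox):

```python
import math

def channel_window(c, w, num_channels):
    """Return the inclusive channel index range used to normalize channel c,
    for a window of size w: floor((w-1)/2) previous channels and
    floor(w/2) following channels, clipped to the valid range."""
    lo = max(0, c - math.floor((w - 1) / 2))
    hi = min(num_channels - 1, c + math.floor(w / 2))
    return lo, hi

# For WindowChannelSize = 4, channel 5 of 16 is normalized over channels
# 4..7: one previous channel, the channel itself, and the next two channels.
print(channel_window(5, 4, 16))  # (4, 7)
```

Near the edges of the channel dimension, the window is clipped, so fewer channels contribute.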

Alpha

α hyperparameter in the normalization (the multiplier term), specified as a numeric scalar.

Example: 0.0002

Beta

β hyperparameter in the normalization, specified as a numeric scalar. The value of Beta must be greater than or equal to 0.01.

Example: 0.8

K

K hyperparameter in the normalization, specified as a numeric scalar. The value of K must be greater than or equal to 1e-05.

Example: 2.5

Layer

Name

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainnet, trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with the name "".

The CrossChannelNormalizationLayer object stores this property as a character vector.

Data Types: char | string

NumInputs

This property is read-only.

Number of inputs to the layer, returned as 1. This layer accepts a single input only.

Data Types: double

InputNames

This property is read-only.

Input names, returned as {'in'}. This layer accepts a single input only.

Data Types: cell

NumOutputs

This property is read-only.

Number of outputs from the layer, returned as 1. This layer has a single output only.

Data Types: double

OutputNames

This property is read-only.

Output names, returned as {'out'}. This layer has a single output only.

Data Types: cell

Examples

Create a local response normalization layer for channel-wise normalization, where a window of five channels normalizes each element, and the additive constant for the normalizer K is 1.

layer = crossChannelNormalizationLayer(5,'K',1)
layer = 
  CrossChannelNormalizationLayer with properties:

                 Name: ''

   Hyperparameters
    WindowChannelSize: 5
                Alpha: 1.0000e-04
                 Beta: 0.7500
                    K: 1

Include a local response normalization layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    crossChannelNormalizationLayer(3)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  7x1 Layer array with layers:

     1   ''   Image Input                   28x28x1 images with 'zerocenter' normalization
     2   ''   2-D Convolution               20 5x5 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   ReLU                          ReLU
     4   ''   Cross Channel Normalization   cross channel normalization with 3 channels per element
     5   ''   Fully Connected               10 fully connected layer
     6   ''   Softmax                       softmax
     7   ''   Classification Output         crossentropyex

Algorithms

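The layer follows the local response normalization scheme of Krizhevsky et al. [1]: each element x is replaced by x / (K + α·ss/windowChannelSize)^β, where ss is the sum of squares of the elements in the channel window. A minimal NumPy sketch of this computation follows (illustrative only, assuming an H-by-W-by-C input and clipped windows at the channel edges; the function name and defaults are assumptions, not the toolbox implementation):

```python
import numpy as np

def cross_channel_normalize(x, window, alpha=1e-4, beta=0.75, k=2.0):
    """Normalize x (H x W x C) across channels.

    Each element is divided by (k + alpha * ss / window) ** beta, where ss
    is the sum of squares over floor((window-1)/2) previous channels, the
    channel itself, and floor(window/2) following channels (the window is
    asymmetric when `window` is even)."""
    h, w_, c = x.shape
    sq = x ** 2
    out = np.empty_like(x, dtype=float)
    for ch in range(c):
        lo = max(0, ch - (window - 1) // 2)
        hi = min(c, ch + window // 2 + 1)     # exclusive upper bound
        ss = sq[:, :, lo:hi].sum(axis=2)      # sum of squares in the window
        out[:, :, ch] = x[:, :, ch] / (k + alpha * ss / window) ** beta
    return out

x = np.ones((2, 2, 5))
y = cross_channel_normalize(x, window=5)
```

For an all-ones input and a window of 5, an interior channel has ss = 5, so every element there is divided by (2 + 1e-4)^0.75; edge channels see a smaller ss because the window is clipped.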

References

[1] Krizhevsky, A., I. Sutskever, and G. E. Hinton. "ImageNet Classification with Deep Convolutional Neural Networks." Advances in Neural Information Processing Systems. Vol. 25, 2012.

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2016a