setL2Factor

Set L2 regularization factor of layer learnable parameter

Description

layerUpdated = setL2Factor(layer,parameterName,factor) sets the L2 regularization factor of the parameter with the name parameterName in layer to factor.

For built-in layers, you can set the L2 regularization factor directly by using the corresponding property. For example, for a convolution2dLayer layer, the syntax layer = setL2Factor(layer,'Weights',factor) is equivalent to layer.WeightL2Factor = factor.
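
For example, both of the following statements set the factor to 2 for the weights of a convolutional layer. This is a minimal sketch; the layer itself is illustrative.

layer = convolution2dLayer(5,20);

% These two statements are equivalent for a built-in layer.
layer = setL2Factor(layer,'Weights',2);
layer.WeightL2Factor = 2;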

layerUpdated = setL2Factor(layer,parameterPath,factor) sets the L2 regularization factor of the parameter specified by the path parameterPath. Use this syntax when the layer is a networkLayer or when the parameter is in a dlnetwork object in a custom layer.

netUpdated = setL2Factor(net,layerName,parameterName,factor) sets the L2 regularization factor of the parameter with the name parameterName in the layer with name layerName for the specified dlnetwork object.

netUpdated = setL2Factor(net,parameterPath,factor) sets the L2 regularization factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a networkLayer or when the parameter is in a dlnetwork object in a custom layer.

Examples

Set and get the L2 regularization factor of a learnable parameter of a layer.

Create a layer array containing the custom layer sreluLayer, attached to this example as a supporting file. To access this layer, open this example as a live script.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    batchNormalizationLayer
    sreluLayer
    fullyConnectedLayer(10)
    softmaxLayer];

Set the L2 regularization factor of the LeftSlope learnable parameter of the sreluLayer to 2.

layers(4) = setL2Factor(layers(4),"LeftSlope",2);

View the updated L2 regularization factor.

factor = getL2Factor(layers(4),"LeftSlope")
factor = 
2

Set and get the L2 regularization factor of a learnable parameter of a custom nested layer defined using network composition.

Create a residual block layer using the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a live script.

numFilters = 64;
layer = residualBlockLayer(numFilters)
layer = 
  residualBlockLayer with properties:

       Name: ''

   Learnable Parameters
    Network: [1x1 dlnetwork]

   State Parameters
    Network: [1x1 dlnetwork]

Use properties method to see a list of all properties.

View the layers of the nested network.

layer.Network.Layers
ans = 
  7x1 Layer array with layers:

     1   'conv_1'        2-D Convolution       64 3x3 convolutions with stride [1  1] and padding 'same'
     2   'batchnorm_1'   Batch Normalization   Batch normalization
     3   'relu_1'        ReLU                  ReLU
     4   'conv_2'        2-D Convolution       64 3x3 convolutions with stride [1  1] and padding 'same'
     5   'batchnorm_2'   Batch Normalization   Batch normalization
     6   'add'           Addition              Element-wise addition of 2 inputs
     7   'relu_2'        ReLU                  ReLU

Set the L2 regularization factor of the learnable parameter 'Weights' of the layer 'conv_1' to 2 using the setL2Factor function.

factor = 2;
layer = setL2Factor(layer,'Network/conv_1/Weights',factor);

Get the updated L2 regularization factor using the getL2Factor function.

factor = getL2Factor(layer,'Network/conv_1/Weights')
factor = 
2
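
To list the parameter paths available in a custom nested layer, you can inspect the Learnables table of the nested network. A minimal sketch using the layer from this example:

layer.Network.Learnables   % each row pairs a nested layer name with a parameter name

Each row corresponds to a path of the form "Network/layerName/parameterName" that you can pass to setL2Factor or getL2Factor.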

Set and get the L2 regularization factor of a learnable parameter of a dlnetwork object.

Create an empty dlnetwork object. Then specify the layers of the network and add them to the dlnetwork object.

net = dlnetwork;
layers = [
    imageInputLayer([28 28 1],'Normalization','none','Name','in')
    convolution2dLayer(5,20,'Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','sm')];

net = addLayers(net,layers);

Set the L2 regularization factor of the 'Weights' learnable parameter of the convolution layer to 2 using the setL2Factor function.

factor = 2;
net = setL2Factor(net,'conv','Weights',factor);

Get the updated L2 regularization factor using the getL2Factor function.

factor = getL2Factor(net,'conv','Weights')
factor = 
2

Create an array of layers containing an lstmLayer with 100 hidden units and a dropoutLayer with a dropout probability of 0.2.

layers = [lstmLayer(100,OutputMode="sequence",Name="lstm")
    dropoutLayer(0.2,Name="dropout")];

Create a network layer containing these layers.

lstmDropoutLayer = networkLayer(layers,Name="lstmDropout");

Use the network layer to build a network.

layers = [sequenceInputLayer(3)
    lstmDropoutLayer
    lstmDropoutLayer
    fullyConnectedLayer(10)
    softmaxLayer];

Create a dlnetwork object. You can also create a dlnetwork object by training the network using the trainnet function.

net = dlnetwork(layers);

Set the L2 regularization factor of the InputWeights learnable parameter of the LSTM layer in the first network layer to 2 using the setL2Factor function.

factor = 2;
net = setL2Factor(net,"lstmDropout_1/lstm/InputWeights",factor);

Get the updated L2 regularization factor using the getL2Factor function.

factor = getL2Factor(net,"lstmDropout_1/lstm/InputWeights")
factor = 
2
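
Because the layer array uses the same network layer twice, the software makes the layer names unique; the path above refers to the first instance, lstmDropout_1. To check the layer names and parameter paths of a network, you can inspect its Learnables table:

net.Learnables   % the Layer column shows the unique names, such as "lstmDropout_1"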

Set and get the L2 regularization factor of a learnable parameter of a custom nested layer defined using network composition in a dlnetwork object.

Create a dlnetwork object containing the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a live script.

inputSize = [224 224 3];
numFilters = 32;
numClasses = 5;

layers = [
    imageInputLayer(inputSize,'Normalization','none','Name','in')
    convolution2dLayer(7,numFilters,'Stride',2,'Padding','same','Name','conv')
    groupNormalizationLayer('all-channels','Name','gn')
    reluLayer('Name','relu')
    maxPooling2dLayer(3,'Stride',2,'Name','max')
    residualBlockLayer(numFilters,'Name','res1')
    residualBlockLayer(numFilters,'Name','res2')
    residualBlockLayer(2*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res3')
    residualBlockLayer(2*numFilters,'Name','res4')
    residualBlockLayer(4*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res5')
    residualBlockLayer(4*numFilters,'Name','res6')
    globalAveragePooling2dLayer('Name','gap')
    fullyConnectedLayer(numClasses,'Name','fc')
    softmaxLayer('Name','sm')];

dlnet = dlnetwork(layers);

The Learnables property of the dlnetwork object is a table that contains the learnable parameters of the network. The table includes parameters of nested layers in separate rows. View the learnable parameters of the layer "res1".

learnables = dlnet.Learnables;
idx = learnables.Layer == "res1";
learnables(idx,:)
ans=8×3 table
    Layer              Parameter                     Value       
    ______    ____________________________    ___________________

    "res1"    "Network/conv_1/Weights"        {3x3x32x32 dlarray}
    "res1"    "Network/conv_1/Bias"           {1x1x32    dlarray}
    "res1"    "Network/batchnorm_1/Offset"    {1x1x32    dlarray}
    "res1"    "Network/batchnorm_1/Scale"     {1x1x32    dlarray}
    "res1"    "Network/conv_2/Weights"        {3x3x32x32 dlarray}
    "res1"    "Network/conv_2/Bias"           {1x1x32    dlarray}
    "res1"    "Network/batchnorm_2/Offset"    {1x1x32    dlarray}
    "res1"    "Network/batchnorm_2/Scale"     {1x1x32    dlarray}

For the layer "res1", set the L2 regularization factor of the learnable parameter 'Weights' of the layer 'conv_1' to 2 using the setL2Factor function.

factor = 2;
dlnet = setL2Factor(dlnet,'res1/Network/conv_1/Weights',factor);

Get the updated L2 regularization factor using the getL2Factor function.

factor = getL2Factor(dlnet,'res1/Network/conv_1/Weights')
factor = 
2

Input Arguments

Input layer, specified as a scalar Layer object.

Parameter name, specified as a character vector or a string scalar.

L2 regularization factor for the parameter, specified as a nonnegative scalar.

The software multiplies this factor by the global L2 regularization factor to determine the L2 regularization factor for the specified parameter. For example, if factor is 2, then the L2 regularization factor for the specified parameter is twice the global L2 regularization factor. You can specify the global L2 regularization factor using the trainingOptions function.

Example: 2
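
For example, this sketch shows how a per-parameter factor of 2 combines with the global factor; the global value here is illustrative.

% Set the global L2 regularization factor for training.
options = trainingOptions("sgdm",L2Regularization=0.0001);

% With factor = 2, the effective L2 regularization for the specified
% parameter is 2 * 0.0001 = 0.0002.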

Path to parameter in nested layer, specified as a string scalar or a character vector. A nested layer can be a layer within a networkLayer or a custom layer that itself defines a neural network as a learnable parameter. A summary sketch of the path forms follows the lists below.

If the input to setL2Factor is a layer, then:

  • If the nested layer is in a network layer, the parameter path has the form "nestedLayerName/parameterName", where nestedLayerName is the name of the nested layer inside the network layer, and parameterName is the name of the parameter. If there are multiple levels of nested layers, then specify the path using the form "nestedLayerName1/.../nestedLayerNameN/parameterName".

  • If the nested layer is a custom layer that itself defines a neural network as a learnable parameter, the parameter path has the form "propertyName/layerName/parameterName" where propertyName is the name of the property containing a dlnetwork object, layerName is the name of the layer in the dlnetwork object, and parameterName is the name of the parameter. If there are multiple levels of nested layers, then specify the path using the form "propertyName1/layerName1/.../propertyNameN/layerNameN/parameterName".

If the input to setL2Factor is a dlnetwork object and the desired parameter is in a nested layer, then:

  • If the nested layer is in a network layer, the parameter path has the form "networkLayerName/nestedLayerName/parameterName", where networkLayerName is the name of the network layer, nestedLayerName is the name of the nested layer inside the network layer, and parameterName is the name of the parameter. If there are multiple levels of nested layers, then specify the path using the form "networkLayerName1/.../networkLayerNameN/nestedLayerName/parameterName".

  • If the nested layer is a custom layer that itself defines a neural network as a learnable parameter, the parameter path has the form "customLayerName/propertyName/layerName/parameterName", where customLayerName is the name of the custom layer in the input dlnetwork object, propertyName is the name of the property of the layer containing a dlnetwork object, layerName is the name of the layer in the dlnetwork object, and parameterName is the name of the parameter. If there are multiple levels of nested layers, then specify the path using the form "customLayerName1/propertyName1/.../customLayerNameN/propertyNameN/layerName/parameterName".
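
As a compact summary, this sketch shows one call for each path form. It reuses layer, network, and parameter names from the examples on this page, and assumes those variables exist as defined there.

% Layer input; parameter in a layer inside a networkLayer.
lstmDropoutLayer = setL2Factor(lstmDropoutLayer,"lstm/InputWeights",2);

% Layer input; parameter in a dlnetwork inside a custom layer.
layer = setL2Factor(layer,"Network/conv_1/Weights",2);

% dlnetwork input; parameter in a layer inside a networkLayer.
net = setL2Factor(net,"lstmDropout_1/lstm/InputWeights",2);

% dlnetwork input; parameter in a dlnetwork inside a custom layer.
net = setL2Factor(net,"res1/Network/conv_1/Weights",2);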

Data Types: char | string

Neural network, specified as a dlnetwork object.

Layer name, specified as a string scalar or a character vector.

Data Types: char | string

Output Arguments

Updated layer, returned as a Layer object.

Updated network, returned as a dlnetwork object.

Version History

Introduced in R2017b
