How can I train a deep neural network for the approximation of a nonlinear function in 1D?

Given a nonlinear function of the form 'y = f(x)' in 1D, what is the appropriate workflow for approximating this function with a trained deep neural network? A recommended workflow for training deep neural networks on non-image, non-sequence data for regression is described in a related MATLAB Answers thread, which leverages the 'imageInputLayer' to define such data.

Accepted Answer

MathWorks Support Team
MathWorks Support Team on 27 Oct 2020
The 'featureInputLayer' introduced in release R2020b can be leveraged for this task. Given a nonlinear function, the following workflow can be used to approximate it with a deep neural network:
1.) prepare a set of training data (XTrain, YTrain),
2.) define a deep neural network architecture with a 'featureInputLayer' as the input layer and a 'regressionLayer' as the output layer,
3.) define the training options,
4.) train the network on the training data using the 'trainNetwork' function,
5.) verify the network performance on some test data using the 'predict' function,
6.) plot the exact function and the prediction of the trained deep neural network.
This workflow is demonstrated in the MATLAB script below for the nonlinear function 'y = x^3':
 
%% Define function to be approximated
fnc = @(x) x.^3;
%% Define the training data
XTrain = linspace(-1,1,80)';
YTrain = fnc(XTrain);
%% Define a layer architecture
layers = [ ...
    featureInputLayer(1, 'Name', 'myFeatureInputLayer', 'Normalization', 'rescale-symmetric')
    fullyConnectedLayer(8, 'Name', 'myFullyConnectedLayer1')
    tanhLayer('Name', 'myTanhLayer')
    fullyConnectedLayer(1, 'Name', 'myFullyConnectedLayer2')
    regressionLayer('Name', 'myRegressionLayer')
    ];
%% Define options for the training
opts = trainingOptions('adam', ...
    'MaxEpochs', 1000, ...
    'InitialLearnRate', 0.01, ...
    'Shuffle', 'every-epoch', ...
    'Plots', 'training-progress', ...
    'MiniBatchSize', 128, ...
    'Verbose', false);
%% Train the network
[trainedNet, info] = trainNetwork(XTrain, YTrain, layers, opts);
%% Create some sample test data
numRand = 100;                           % number of random test points
XTest = sort(2.*rand(numRand, 1) - 1);   % random test points in [-1, 1]
YTest = predict(trainedNet, XTest);      % network prediction at the test points
%% Compare the expected and the predicted solution of the nonlinear function
plot(XTrain, YTrain, '-sk', XTest, YTest, '-vr');
legend('exact', 'predicted')
grid on;
xlabel('x')
ylabel('f(x) = x^3')
The exact function and the prediction of the trained network on the test data are compared in the resulting figure.
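Step 5 above can also be carried out quantitatively rather than only visually. The following lines are a minimal sketch of such a check; they assume the variables 'fnc', 'trainedNet', 'XTest' and 'YTest' from the script above are still in the workspace, and the error metrics are illustrative choices, not part of the original answer.
 
%% Quantify the prediction error on the test data
YExact = fnc(XTest);               % exact function values at the test points
predErr = YTest - YExact;          % prediction error of the trained network
rmseTest = sqrt(mean(predErr.^2)); % root-mean-square error on the test set
maxAbsErr = max(abs(predErr));     % worst-case absolute error on the test set
fprintf('RMSE: %.4g, max. abs. error: %.4g\n', rmseTest, maxAbsErr);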
Although the workflow above requires R2020b or later, where the 'featureInputLayer' was introduced, it can also be reproduced in earlier MATLAB releases by leveraging the 'imageInputLayer' instead, as shown in the MATLAB Answers thread referenced in the question.
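For illustration, the lines below sketch one possible adaptation of the layer definition for a release before R2020b; the [1 1 1] input size, the reshaping of the data into 1-by-1-by-1-by-N arrays and the layer names are assumptions made for this sketch, not code taken from that thread.
 
%% Alternative layer architecture for releases before R2020b
layersPre2020b = [ ...
    imageInputLayer([1 1 1], 'Name', 'myImageInputLayer', 'Normalization', 'none')
    fullyConnectedLayer(8, 'Name', 'myFullyConnectedLayer1')
    tanhLayer('Name', 'myTanhLayer')
    fullyConnectedLayer(1, 'Name', 'myFullyConnectedLayer2')
    regressionLayer('Name', 'myRegressionLayer')
    ];
%% 'imageInputLayer' expects 4-D input data of size 1-by-1-by-1-by-N
XTrain4D = reshape(XTrain, [1 1 1 numel(XTrain)]);
[trainedNetPre, infoPre] = trainNetwork(XTrain4D, YTrain, layersPre2020b, opts);
%% Test inputs must be reshaped the same way before calling 'predict'
YTestPre = predict(trainedNetPre, reshape(XTest, [1 1 1 numel(XTest)]));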
