Deep Learning Toolbox - Normalising a Cell prior to LSTM Training

Hi All,
I am attempting to normalise my training data to improve the performance of the LSTM network.
The data is collected from 10 ANSYS simulations, so 10 observations, with 5 input states (dataTrain) and 4 output states.
The code partitions the data (90% train, 10% test). I would like to normalise the 5 signals contained in the dataTrain cell array (1x9 cell), where each individual array is a 5x324 double (5 signals: volume, temperature, pressure, CO2, and NO, each with 324 steps).
Step 1.
I have attempted to first compute the maximum value of each signal contained in the 1x9 cell. My code does find a value; however, it seems to find only the value from the last indexed array (the 9th of the 9 arrays in the cell). This does not make sense, as the values should decrease numerically from array 1 to array 9, so my code must be at fault. As an example, the output below shows the max pressure values returned for the individual arrays (not in the numerical order they should be) and the computed maximum pressure from the cell.
% Normalise the signals contained in the dataTrain cell (1x9 cell)
% prior to extracting predictors and targets??
for i = 1:numObservationsTrain % Find the max value of each individual signal in the dataTrain cell
    volume_max      = max(dataTrain{i}(1,:));
    temperature_max = max(dataTrain{i}(2,:));
    pressure_max    = max(dataTrain{i}(3,:)) % no semicolon, so each value is displayed
    CO2_max         = max(dataTrain{i}(4,:));
    NO_max          = max(dataTrain{i}(5,:));
end
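The loop above overwrites each `*_max` variable on every pass, so only the last observation's maxima survive it. If the goal is one global maximum per signal across all nine training arrays, a sketch of one way to get them (untested against the original data; variable names are taken from the code above):

```matlab
% Sketch: concatenate all training arrays column-wise, then take the
% max of each row (signal) across the whole training set.
allTrain  = [dataTrain{:}];         % 5 x (324 * numObservationsTrain)
signalMax = max(allTrain, [], 2);   % 5x1 vector: one max per signal

volume_max      = signalMax(1);
temperature_max = signalMax(2);
pressure_max    = signalMax(3);
CO2_max         = signalMax(4);
NO_max          = signalMax(5);
```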
Step 2.
Then I attempt to index in and normalise each signal (row) in the cell using the computed maximum signal value, but the code flags an error. See the code below.
for i = 1:numObservationsTrain % Normalise each signal in each array in the dataTrain cell with the max values computed above
    dataTrain{i} = dataTrain{i}(1,:) ./ volume_max;
    dataTrain{i} = dataTrain{i}(2,:) ./ temperature_max;
    dataTrain{i} = dataTrain{i}(3,:) ./ pressure_max;
    dataTrain{i} = dataTrain{i}(4,:) ./ CO2_max;
    dataTrain{i} = dataTrain{i}(5,:) ./ NO_max;
end
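The error likely arises because the first assignment replaces the whole 5x324 array in `dataTrain{i}` with only its first row, so the next line can no longer index row 2. A sketch of an in-place version (assuming the five `*_max` values have already been computed across all observations):

```matlab
for i = 1:numObservationsTrain
    % Assign back into each row rather than replacing the whole cell array.
    dataTrain{i}(1,:) = dataTrain{i}(1,:) ./ volume_max;
    dataTrain{i}(2,:) = dataTrain{i}(2,:) ./ temperature_max;
    dataTrain{i}(3,:) = dataTrain{i}(3,:) ./ pressure_max;
    dataTrain{i}(4,:) = dataTrain{i}(4,:) ./ CO2_max;
    dataTrain{i}(5,:) = dataTrain{i}(5,:) ./ NO_max;
end
```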
Please forgive the schoolboy questions; this is the first time I have attempted to train an LSTM network with real-time data. Any suggestions as to how to find the max of and normalise each signal within the 1x9 cell would be great.
Thanks in advance,

Answers (0)

