How can I use a NARX network on a GPU?
Hello.
I want to use a GPU to speed up training of a NARX network, but training stalls at 'Computing Resources: GPU device #1, GeForce GTX 750 Ti'. I waited 3 hours and nothing further was printed.
My network has 3 layers (input layer, hidden layer, output layer), with 15 inputs and 1 output. I decided to divide these inputs and the output for GPU training, but I don't understand how to do it. Please tell me how to divide the training data.
MATLAB version: R2017b
Program:
X = con2seq(input);   % input is loaded from a MAT-file, size 15×30001
T = con2seq(output);  % output is loaded from a MAT-file, size 1×30001
net = narxnet(1:2,1:4,5,'open');
[x,xi,ai,t] = preparets(net,X,{},T);
[net,tr] = train(net,x,t,xi,ai,'useGPU','yes');
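Two things in the posted call are worth checking: the arguments to `train` should be in the order `(net,x,t,xi,ai)`, and the default `narxnet` training function `trainlm` is a Jacobian method, which the documentation says is not supported on the GPU; `trainscg` is the usual GPU-capable choice. Below is a minimal, untested sketch of the documented `nndata2gpu` route, which converts each prepared data set to the GPU explicitly (it assumes `X` and `T` already hold the cell-array data from `con2seq`):

```matlab
% Sketch, assuming X (15xN) and T (1xN) are cell arrays from con2seq.
net = narxnet(1:2,1:4,5,'open');
net.trainFcn = 'trainscg';            % default trainlm does not run on GPU
[x,xi,ai,t] = preparets(net,X,{},T);
net = configure(net,x,t);             % fix network sizes before conversion
xg  = nndata2gpu(x);                  % move each data set to the GPU
tg  = nndata2gpu(t);
xig = nndata2gpu(xi);
aig = nndata2gpu(ai);
[net,tr] = train(net,xg,tg,xig,aig,'showResources','yes');
```

Alternatively, keep the data on the CPU and let `train` move it by passing `'useGPU','yes'` with the arguments in the correct order; either way, `'showResources','yes'` prints which device is actually being used, so you can confirm the GPU is engaged rather than waiting blindly.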