Inspired by the TensorFlow Neural Networks Playground interface readily available online at http://playground.tensorflow.org/, this is a MATLAB implementation of the same interface for applying artificial neural networks to the regression and classification of highly non-linear data. The interface uses the HG1 graphics system in order to be compatible with older versions of MATLAB. A secondary purpose of this project is to provide a vectorized implementation of training artificial neural networks with Stochastic Gradient Descent, both as an educational resource and as a demonstration of the power of MATLAB and matrix operations. Given randomly generated training and test data that fall into two classes conforming to certain shapes or specifications, and given the configuration of a neural network, the framework performs either regression or binary classification of this data and interactively shows the results to the user: a classification or regression map of the data, together with numerical performance measures such as the training and test loss, plotted on a performance curve over each iteration. The architecture of the neural network is highly configurable, so the effect of each change in the architecture can be seen immediately.
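As a rough illustration of the vectorized SGD training described above, here is a simplified sketch of mini-batch training for a small network. This is not the actual NeuralNet2 implementation; the network size, learning rate, and variable names are my own, and it is meant only to show how each training step reduces to matrix operations:

```matlab
% Sketch: mini-batch SGD for a 2-4-1 network with a tanh hidden layer,
% sigmoid output, and cross-entropy loss. Every step is a matrix
% operation over the whole mini-batch -- no per-sample loops.
rng(0);
X = randn(200, 2);                    % 200 samples, 2 features
y = double(sum(X.^2, 2) < 1);         % class 1 inside the unit circle
W1 = 0.5*randn(2, 4); b1 = zeros(1, 4);
W2 = 0.5*randn(4, 1); b2 = 0;
lr = 0.1; batchSize = 20;
for iter = 1:2000
    idx = randi(size(X, 1), batchSize, 1);       % random mini-batch
    Xb = X(idx, :); yb = y(idx);
    % Forward pass (bsxfun keeps this compatible with pre-R2016b MATLAB)
    H = tanh(bsxfun(@plus, Xb*W1, b1));          % hidden activations
    p = 1./(1 + exp(-(H*W2 + b2)));              % output probabilities
    % Backward pass
    d2 = (p - yb)/batchSize;                     % output-layer delta
    d1 = (d2*W2') .* (1 - H.^2);                 % hidden-layer delta (tanh')
    W2 = W2 - lr*(H'*d2);  b2 = b2 - lr*sum(d2);
    W1 = W1 - lr*(Xb'*d1); b1 = b1 - lr*sum(d1, 1);
end
```

Because the forward and backward passes operate on the whole mini-batch at once, the inner loop contains no per-sample iteration, which is the core idea behind the vectorized implementation.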
There are two files that accompany this project:
1. NeuralNetApp.m: The GUI that creates the interface as seen on TensorFlow Neural Networks Playground but is done completely with MATLAB GUI elements and widgets.
2. NeuralNet2.m: The class that performs the neural network training via Stochastic Gradient Descent. This class is used by NeuralNetApp.m.
Raymond Phan (2019). A MATLAB implementation of the TensorFlow Neural Network Playground (https://www.github.com/StackOverflowMATLABchat/NeuralNetPlayground), GitHub. Retrieved .
Thanks for your great port.
I've just tried the TF-NN set and it performs great! Congratulations!
I have a question related to the type of data we may eventually use as a dataset. For example, I have a mixed numeric/text/categorical/boolean dataset, and I'd appreciate it if someone could give me a clue on how to use it. Thank you!
You have done a great job in implementing the TensorFlow MATLAB class. I have already applied the ANN model to a highly nonlinear regression problem and encountered some strange effects which I was not able to get rid of. Almost independently of the settings of the control parameters, the cost function is contaminated by an extreme wobble which I do not observe in other neural network simulators such as, e.g., SNNS. Do you have an explanation for this behaviour?
Thanks in advance for your efforts.
@AndreasHorn - Thank you for your comments. Currently regression has not been implemented yet because the hidden neuron and output neuron activation functions are the same. I have written modifications to the toolbox that allow you to specify the hidden neuron and output neuron activation functions differently so that it's possible to perform regression. I need to update the toolbox and that'll probably take a couple of days... when I have the time to do so. As for the GUI problems, I unfortunately did not design the GUI. My friend Amro did. You can find his contact on our Github page. Good luck!
@RouzbehDavoudi: (1) I didn't do anything special. I implemented vanilla gradient descent and regularization. This is simply the standard implementation of backpropagation with regularization. (2) You can use this guideline to determine how to cite submissions to MATLAB FEX: http://blogs.mathworks.com/community/2010/12/13/citing-file-exchange-submissions/. In summary, something like this could work: Phan, Raymond (2015). A MATLAB implementation of the TensorFlow Neural Network Playground (https://www.mathworks.com/matlabcentral/fileexchange/57610). MATLAB Central File Exchange. Retrieved <insert date here>. Finally, thank you for using my toolbox :)
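For the record, the "vanilla gradient descent and regularization" referred to above boils down to the standard L2-penalized weight update. This is an illustrative snippet only; the variable names are mine and not necessarily those used in NeuralNet2:

```matlab
% One gradient descent step on a weight matrix W with L2 regularization.
% dW is the backpropagated gradient of the data loss with respect to W,
% lr is the learning rate, and lambda is the regularization strength.
W = W - lr*(dW + lambda*W);
% Bias terms are conventionally updated without the penalty term:
b = b - lr*db;
```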
@aaaa This code does not use GPU because that requires the Parallel Computing Toolbox. The goal of this code is to use CORE MATLAB functions that don't require the use of any toolboxes.
I also have another question: how can I cite your work? Did you publish a paper?
Hi, thanks for the toolbox. You did a nice job. However, I'm a little confused. I have also used TensorFlow in Python (Keras library). Your code always shows better performance compared to Python (the R value is about 4% better in your toolbox compared to the Python results) given the same parameters, number of layers, nodes, etc. I'm wondering whether you implemented something beyond standard TensorFlow, or whether I'm making a mistake in my Python code?
this is a great implementation! I'd love to use the tool for a regression problem – and would like to do so without using the GUI.
Is regression already implemented?
In lines 124 ff of NeuralNetApp.m, there seems to be nothing happening depending on switching the hPopProblem popup.
Thanks so much for a hint!
Great work!! I am a newbie in this field, and am planning to use MATLAB and TensorFlow. I realized that your source code does not use the GPU (GTX1070), which is very disappointing. Maybe I lack knowledge here.
@kartikSarin - Please see the Wiki page on our Github repo. It has everything you need. Also, the documentation internal to NeuralNet2 when you type in help NeuralNet2 in MATLAB is very clear on its usage as well as examples on how to use the code. Not sure why you didn't find this out. https://github.com/StackOverflowMATLABchat/NeuralNetPlayground/wiki
@FabioTango - Please see the Wiki page on our Github repo. It has everything you need. Also, the documentation internal to NeuralNet2 when you type in help NeuralNet2 in MATLAB is very clear on its usage as well as examples on how to use the code. Not sure why you didn't find this out. https://github.com/StackOverflowMATLABchat/NeuralNetPlayground/wiki
Just let me know as well if anyone finds some more documentation on this. Thanks.
Where can I find some related documentation about this implementation? (If possible.)
Thanks in advance
Fabio Tango (from Centro Ricerche Fiat)
(1) NeuralNet2 can now do regression. (2) Can modify output layer activation function. (3) Allow specifying a network without hidden layers. (4) More efficient weight update with matrix multiplication. (5) More documentation through help NeuralNet2
Updated snapshot of project
Minor fixes in documentation and code overall
Added a more detailed description of what this project is about.