How to use softmax and a loss function (negative log probability) in classification
Hello.
I want to classify videos.
After computing the Euclidean distances, I want to use softmax and a loss function (negative log probability) for the classification.
Can anyone give me an idea of how to write the code?
clear all
close all
% load the feature matrix (all columns but the last) and the class labels (last column)
data = csvread('outfile.csv');
values = data(:,1:end-1);
labels = data(:,end);
% mean feature vector of each class (labels+1 so splitapply gets positive group numbers 1..5)
avg = splitapply(@(x) mean(x,1), values, labels+1);
mean_class1 = avg(1,:);
mean_class2 = avg(2,:);
mean_class3 = avg(3,:);
mean_class4 = avg(4,:);
mean_class5 = avg(5,:);
% one query feature vector per action
bend_query = values(1,:);
run_query = values(2,:);
walk_query = values(3,:);
skip_query = values(4,:);
wave_query = values(5,:);
% calculate euclidean distance between each query and its own class mean
euclidean_bend = pdist2(mean_class1, bend_query, 'euclidean');
euclidean_run = pdist2(mean_class2, run_query, 'euclidean');
euclidean_walk = pdist2(mean_class3, walk_query, 'euclidean');
euclidean_skip = pdist2(mean_class4, skip_query, 'euclidean');
euclidean_wave = pdist2(mean_class5, wave_query, 'euclidean');
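For a single query, I think the overall idea is something like the rough sketch below, but I am not sure it is right. Here I am assuming the query has to be compared against all five class means (not just its own) and that its true class index is known:
all_means = [mean_class1; mean_class2; mean_class3; mean_class4; mean_class5];
query = bend_query;                          % the query to classify
true_class = 1;                              % assumed: bend is class 1
d = pdist2(all_means, query, 'euclidean');   % distance from the query to every class mean (5x1)
% softmax over the negative distances: a smaller distance gives a higher probability
s = -d;
s = s - max(s);                              % shift for numerical stability
p = exp(s) ./ sum(exp(s));                   % class probabilities (sum to 1)
% negative log probability of the true class
loss = -log(p(true_class));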
Accepted Answer
Shishir Singhal on 7 Apr 2020
For classification, softmax creates a probability score for each category.
Since your predictions and your targets follow different probability distributions, you can use cross-entropy loss to compare them; it is a kind of negative log probability function.
Refer to this documentation for the implementation: https://www.mathworks.com/help/deeplearning/ref/dlarray.crossentropy.html
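A rough sketch of how that could look with your variables (this route needs Deep Learning Toolbox; my assumptions: the distances are taken from one query to all five class means, and that query truly belongs to class 1):
d = pdist2([mean_class1; mean_class2; mean_class3; mean_class4; mean_class5], ...
    bend_query, 'euclidean');          % 5x1: distance from one query to every class mean
target = zeros(5,1);
target(1) = 1;                         % one-hot encoded true class (assumed to be class 1)
scores = dlarray(-d, 'CB');            % negative distance as class score, 'C' = class dimension
probs = softmax(scores);               % probability score for each category
loss = crossentropy(probs, target);    % cross entropy = negative log probability of the true class
Negating the distance is just one simple way to turn "smaller distance" into "larger score" before the softmax; any score that decreases with distance would serve the same purpose.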