How can I visually show that using fixed min and max for histograms is better?
I'm classifying people based on extracted biometric features. I thought it would be best to represent these features as histograms before giving them to the classifier.
I tried the MATLAB function 'hist' and performed classification, with very poor results. I then noticed that fixing the min and max and forcing every histogram between those points gave better results, so I downloaded a script and generated the histograms for all samples on a fixed scale. This gave very good results.
Now the problem is that I don't know how to represent my theory visually. It makes sense that if the histogram of an unknown sample adjusts itself to its own min and max, then it is useless for classification purposes, but if all the histograms are generated on a fixed scale, they have some identification and discriminatory power.
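To make the two schemes concrete, here is a rough sketch of what I mean (the feature vector f, the bin count, and the fixed range below are made-up example values, not the ones from my data):
% Sketch of the two binning schemes, with placeholder values
f = randn(1, 1000);                 % stand-in for one feature vector
nBins = 32;                         % example bin count
% Adaptive range: hist() spreads the bins over min(f)..max(f)
[countsA, centersA] = hist(f, nBins);
% Fixed range: the same bin edges are used for every sample
lo = -4; hi = 4;                    % example fixed min and max
edges = linspace(lo, hi, nBins + 1);
countsF = histc(f, edges);
countsF(end-1) = countsF(end-1) + countsF(end);  % fold values equal to hi into the last bin
countsF(end) = [];
subplot(2,1,1), bar(centersA, countsA), title('adaptive min/max (hist)')
subplot(2,1,2), bar(edges(1:end-1) + diff(edges)/2, countsF), title('fixed min/max')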
I thought it would be as simple as plotting both histograms side by side like that, but the difference is not clear from the plots.
For example, a correct classification using MATLAB's hist function gives the following plot:
For a correct classification, the downloaded script generates the following plot:
Using these two figures I cannot make the point that the second histogram performs significantly better simply because it uses a fixed min and max.
Maybe I am looking at this from the wrong perspective. Any suggestions here would be appreciated.
Edit: code for the script added below:
function h = calculate_hist(Patch, NumOfHist, MinValue, MaxValue)
%% Find patch size
m = length(Patch);
%% Compute bin size from the fixed range
% MaxValue = max(Patch);   % adaptive range, deliberately commented out
% MinValue = min(Patch);
binSize = (MaxValue - MinValue) / NumOfHist;
%% Compute the histogram
h = zeros(1, NumOfHist);
for i = 1:m
    % Bin index of Patch(i) relative to the fixed range
    A = floor((Patch(i) - MinValue) / binSize) + 1;
    if Patch(i) < MaxValue && A >= 1 && A <= NumOfHist
        h(A) = h(A) + 1;
    elseif Patch(i) >= MaxValue
        % Values at or above MaxValue go into the last bin
        h(end) = h(end) + 1;
    end
end
% Normalise so the bin counts sum to 1
h = h / m;
end
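For example, I call it like this (the bin count and the 0-255 range here are placeholder values, not the ones from my experiments):
% Example call with placeholder parameters
h = calculate_hist(Patch, 32, 0, 255);
bar(h)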