Is it possible to find the confidence level of classifier scores?
I have a writer recognition system that returns an NLL (negative log-likelihood) score for a test sample against every trained model. For example, if there are thirteen models to compare the sample against, the NLL output looks like this:
80815.7010223003 85581.6346832070 83606.7426608412 85056.4654552773 87063.0759166968 85087.6025451967 83785.5540344493 81874.6972713527 84314.5367319622 89551.5734673822 81328.8830385541 89110.5651289799 83632.8519473224
Each column is the score for the sample against the corresponding model: column 1 gives the score against model 1, and so on. This test sample was written by writer 1, so for a correct prediction the first column should have the minimum value. The output above gives the desired prediction, since the value in column 1 is the minimum.
Here is the score from another classifier for the same test sample:
13229.1198655118 15825.1707083033 14812.4282777584 14511.5846921635 20356.4908375808 17776.3907333853 15387.8566384908 14271.7361799156 16061.8498237708 19712.8938234106 12761.0017859257 15197.7126488257 15778.5860607059
This classifier incorrectly assigns the sample to class 11 (column 11 has the minimum value), whereas the first classifier classified it correctly.
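To make the setup concrete, here is a minimal sketch (in Python, as an illustration; the original system need not be in Python) of the argmin decision rule described above, using the two score vectors from the question. The function name `predict` is my own choice for illustration.

```python
# Scores from the question: one NLL value per trained model.
scores_a = [80815.7010223003, 85581.6346832070, 83606.7426608412, 85056.4654552773,
            87063.0759166968, 85087.6025451967, 83785.5540344493, 81874.6972713527,
            84314.5367319622, 89551.5734673822, 81328.8830385541, 89110.5651289799,
            83632.8519473224]
scores_b = [13229.1198655118, 15825.1707083033, 14812.4282777584, 14511.5846921635,
            20356.4908375808, 17776.3907333853, 15387.8566384908, 14271.7361799156,
            16061.8498237708, 19712.8938234106, 12761.0017859257, 15197.7126488257,
            15778.5860607059]

def predict(scores):
    # Lower NLL means a better fit, so the predicted model
    # is the (1-based) index of the minimum score.
    return min(range(len(scores)), key=scores.__getitem__) + 1

print(predict(scores_a))  # 1  (correct: the sample was written by writer 1)
print(predict(scores_b))  # 11 (incorrect)
```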
Naturally, I would like to select the correct score. From my reading, one way to do this would be to determine the confidence level of the scores produced by each classifier; the score with the higher confidence would then be the more likely output. I also came across this answer, but it does not appear to solve my problem.
How exactly would I go about determining the confidence level of my scores? Or what would be the best way to select between them?
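One textbook way to turn NLL scores into something confidence-like, *assuming* the scores really are negative log-likelihoods and the classes have equal priors, is a softmax over the negated scores: p_i = exp(-NLL_i) / Σ_j exp(-NLL_j). The sketch below (function name `nll_to_probs` is hypothetical) subtracts the minimum score before exponentiating for numerical stability, since raw values in the tens of thousands would overflow/underflow `exp` directly.

```python
import math

def nll_to_probs(scores):
    # Posterior probabilities under equal priors: softmax of -NLL.
    # Shifting by the minimum keeps exp() in a safe numerical range
    # without changing the resulting probabilities.
    m = min(scores)
    exps = [math.exp(-(s - m)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Small illustrative example (not the question's data):
probs = nll_to_probs([10.0, 12.0, 11.0])
print(probs[0])  # ≈ 0.665, the "confidence" in class 1
```

One caveat: with NLL values as large and widely separated as those in the question (gaps of hundreds or thousands), the softmax will assign essentially probability 1 to the minimum for *both* classifiers, so it may not discriminate between them. A more practical comparison may be a relative margin, e.g. the gap between the smallest and second-smallest score divided by their magnitude, but which measure is appropriate depends on how the NLLs are computed.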