resubLoss
Class: RegressionSVM
Resubstitution loss for support vector machine regression model
Syntax
L = resubLoss(mdl)
L = resubLoss(mdl,Name,Value)
Description
L = resubLoss(mdl) returns the resubstitution loss for the support vector machine (SVM) regression model mdl, using the training data stored in mdl.X and the corresponding response values stored in mdl.Y.
L = resubLoss(mdl,Name,Value) specifies additional options using one or more name-value arguments. For example, you can specify the loss function to use for the resubstitution loss computation.
Input Arguments
mdl — Full, trained SVM regression model
RegressionSVM model

Full, trained SVM regression model, specified as a RegressionSVM model returned by fitrsvm.
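For instance, a minimal sketch of obtaining such a model, using the carsmall sample data set shipped with Statistics and Machine Learning Toolbox (this data set is an illustrative assumption, not the data used in the example later on this page):

load carsmall
% Train an SVM regression model on two predictors
mdl = fitrsvm([Weight,Horsepower],MPG,'Standardize',true);
L = resubLoss(mdl)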
Name-Value Arguments
Specify optional pairs of arguments as
Name1=Value1,...,NameN=ValueN
, where Name
is
the argument name and Value
is the corresponding value.
Name-value arguments must appear after other arguments, but the order of the
pairs does not matter.
Before R2021a, use commas to separate each name and value, and enclose
Name
in quotes.
Example: resubLoss(Mdl,"LossFun","epsiloninsensitive") specifies to use the epsilon-insensitive loss function to compute the resubstitution loss.
LossFun — Loss function
"mse" (default) | "epsiloninsensitive" | function handle

Loss function, specified as "mse", "epsiloninsensitive", or a function handle.
The following table lists the available loss functions. Specify one using its corresponding value.

Value | Loss Function |
---|---|
"mse" | Mean Squared Error |
"epsiloninsensitive" | Epsilon-Insensitive Loss Function |

You can also specify your own function using function handle notation. Suppose that n = size(X,1) is the sample size. Your function must have the signature lossvalue = lossfun(Y,Yfit,W), where:

- The output argument lossvalue is a numeric value.
- You choose the function name (lossfun).
- Y is an n-by-1 numeric vector of observed response values.
- Yfit is an n-by-1 numeric vector of predicted response values, calculated using the corresponding predictor values in X (similar to the output of predict).
- W is an n-by-1 numeric vector of observation weights.

Specify your function using "LossFun",@lossfun.
Example: "LossFun","epsiloninsensitive"
Data Types: char
| string
| function_handle
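As an illustration, a custom loss can also be supplied as an anonymous function handle with the signature described above. The following is a brief sketch, assuming a trained RegressionSVM model named mdl; the weighted mean absolute error shown here is a hypothetical choice, not a built-in option:

% Weighted mean absolute error as a custom loss function (illustrative)
maeFun = @(Y,Yfit,W) sum(W.*abs(Y - Yfit))/sum(W);
L = resubLoss(mdl,'LossFun',maeFun)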
PredictionForMissingValue — Predicted response value to use for observations with missing predictor values
"median" (default) | "mean" | "omitted" | numeric scalar

Since R2023b

Predicted response value to use for observations with missing predictor values, specified as "median", "mean", "omitted", or a numeric scalar.
Value | Description |
---|---|
"median" | resubLoss uses the median of the observed response values in the training data as the predicted response value for observations with missing predictor values. |
"mean" | resubLoss uses the mean of the observed response values in the training data as the predicted response value for observations with missing predictor values. |
"omitted" | resubLoss excludes observations with missing predictor values from the loss computation. |
Numeric scalar | resubLoss uses this value as the predicted response value for observations with missing predictor values. |
If an observation is missing an observed response value or an observation weight, then resubLoss does not use the observation in the loss computation.
Example: PredictionForMissingValue="omitted"
Data Types: single | double | char | string
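As an illustration, the sketch below (assuming a trained RegressionSVM model named mdl and R2023b or later) compares the loss when missing predictor values are replaced with the training median against the loss when those observations are omitted:

% Compare missing-value handling strategies (illustrative sketch)
L_median = resubLoss(mdl,PredictionForMissingValue="median")
L_omitted = resubLoss(mdl,PredictionForMissingValue="omitted")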
Output Arguments
L — Resubstitution loss
scalar value
Resubstitution loss, returned as a scalar value.
The resubstitution loss is the loss calculated between the response training data and the model’s predicted response values based on the input training data.
Resubstitution loss can be an overly optimistic estimate of the predictive error on new data. If the resubstitution loss is high, the model’s predictions are not likely to be very good. However, having a low resubstitution loss does not guarantee good predictions for new data.
To better assess the predictive accuracy of your model, cross-validate the model using crossval.
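For example, a brief sketch of this workflow, assuming a trained RegressionSVM model named mdl:

cvMdl = crossval(mdl);     % 10-fold cross-validated model (RegressionPartitionedSVM)
cvLoss = kfoldLoss(cvMdl)  % cross-validated mean squared error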
Examples
Resubstitution Loss for SVM Regression Model
This example shows how to train an SVM regression model, then calculate the resubstitution loss using mean squared error (MSE) and epsilon-insensitive loss.
This example uses the abalone data from the UCI Machine Learning Repository. Download the data and save it in your current directory with the name 'abalone.data'.

Read the data into a table.

tbl = readtable('abalone.data','Filetype','text', ...
    'ReadVariableNames',false);
rng default % for reproducibility
The sample data contains 4177 observations. All of the predictor variables are continuous except for sex, which is a categorical variable with possible values 'M' (for males), 'F' (for females), and 'I' (for infants). The goal is to predict the number of rings on the abalone, and thereby determine its age, using physical measurements.
Fit an SVM regression model to the data, using a Gaussian kernel function with an automatic kernel scale. Standardize the data.

mdl = fitrsvm(tbl,'Var9','KernelFunction','gaussian', ...
    'KernelScale','auto','Standardize',true);
Calculate the resubstitution loss using mean squared error (MSE).
mse_loss = resubLoss(mdl)
mse_loss = 4.0603
Calculate the epsilon-insensitive loss.
eps_loss = resubLoss(mdl,'LossFun','epsiloninsensitive')
eps_loss = 1.1027
More About
Mean Squared Error
The weighted mean squared error is calculated as follows:

$$\mathrm{mse} = \frac{\sum_{j=1}^{n} w_j \bigl(f(x_j) - y_j\bigr)^2}{\sum_{j=1}^{n} w_j},$$

where:

- n is the number of rows of data.
- x_j is the jth row of data.
- y_j is the true response to x_j.
- f(x_j) is the response prediction of the SVM regression model mdl to x_j.
- w is the vector of weights.

The weights in w are all equal to one by default. You can specify different values for weights using the 'Weights' name-value pair argument. If you specify weights, each value is divided by the sum of all weights, such that the normalized weights add to one.
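As a check, the following sketch (assuming equal observation weights and a trained RegressionSVM model named mdl) reproduces this definition directly and should match resubLoss(mdl) with the default "mse" loss:

yfit = resubPredict(mdl);             % fitted response values for the training data
mse_manual = mean((mdl.Y - yfit).^2)  % weighted MSE with equal (normalized) weights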
Epsilon-Insensitive Loss Function
The epsilon-insensitive loss function ignores errors that are within the distance epsilon (ε) of the function value. It is formally described as:

$$L_{\varepsilon} = \begin{cases} 0 & \text{if } |y - f(x)| \le \varepsilon \\ |y - f(x)| - \varepsilon & \text{otherwise.} \end{cases}$$

The mean epsilon-insensitive loss is calculated as follows:

$$\mathrm{Loss} = \frac{\sum_{j=1}^{n} w_j \, \max\bigl(0,\, |y_j - f(x_j)| - \varepsilon\bigr)}{\sum_{j=1}^{n} w_j}.$$
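Similarly, a hedged sketch of the mean epsilon-insensitive loss, assuming equal weights and a trained RegressionSVM model named mdl whose Epsilon property stores ε:

yfit = resubPredict(mdl);
epsBand = mdl.Epsilon;
eps_manual = mean(max(0, abs(mdl.Y - yfit) - epsBand))  % compare with resubLoss(mdl,'LossFun','epsiloninsensitive')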
References
[1] Nash, W. J., T. L. Sellers, S. R. Talbot, A. J. Cawthorn, and W. B. Ford. "The Population Biology of Abalone (Haliotis species) in Tasmania. I. Blacklip Abalone (H. rubra) from the North Coast and Islands of Bass Strait." Sea Fisheries Division, Technical Report No. 48, 1994.
[2] Waugh, S. "Extending and Benchmarking Cascade-Correlation: Extensions to the Cascade-Correlation Architecture and Benchmarking of Feed-forward Supervised Artificial Neural Networks." University of Tasmania Department of Computer Science thesis, 1995.
[3] Clark, D., Z. Schreter, and A. Adams. "A Quantitative Comparison of Dystal and Backpropagation." Submitted to the Australian Conference on Neural Networks, 1996.
[4] Lichman, M. UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science. http://archive.ics.uci.edu/ml
Extended Capabilities
GPU Arrays
Accelerate code by running on a graphics processing unit (GPU) using Parallel Computing Toolbox™. (since R2023a)
This function fully supports GPU arrays. For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).
Version History
Introduced in R2015b

R2023b: Specify predicted response value to use for observations with missing predictor values
Starting in R2023b, when you predict or compute the loss, some regression models allow you to specify the predicted response value for observations with missing predictor values. Specify the PredictionForMissingValue
name-value argument to use a numeric scalar, the training set median, or the training set mean as the predicted value. When computing the loss, you can also specify to omit observations with missing predictor values.
This table lists the object functions that support the PredictionForMissingValue name-value argument. By default, the functions use the training set median as the predicted response value for observations with missing predictor values.
Model Type | Model Objects | Object Functions |
---|---|---|
Gaussian process regression (GPR) model | RegressionGP, CompactRegressionGP | loss, predict, resubLoss, resubPredict |
 | RegressionPartitionedGP | kfoldLoss, kfoldPredict |
Gaussian kernel regression model | RegressionKernel | loss, predict |
 | RegressionPartitionedKernel | kfoldLoss, kfoldPredict |
Linear regression model | RegressionLinear | loss, predict |
 | RegressionPartitionedLinear | kfoldLoss, kfoldPredict |
Neural network regression model | RegressionNeuralNetwork, CompactRegressionNeuralNetwork | loss, predict, resubLoss, resubPredict |
 | RegressionPartitionedNeuralNetwork | kfoldLoss, kfoldPredict |
Support vector machine (SVM) regression model | RegressionSVM, CompactRegressionSVM | loss, predict, resubLoss, resubPredict |
 | RegressionPartitionedSVM | kfoldLoss, kfoldPredict |
In previous releases, the regression model loss and predict functions listed above used NaN predicted response values for observations with missing predictor values. The software omitted observations with missing predictor values from the resubstitution ("resub") and cross-validation ("kfold") computations for prediction and loss.
R2023a: GPU array support
Starting in R2023a, resubLoss fully supports GPU arrays.
See Also
fitrsvm | RegressionSVM | resubPredict | loss