This is a comprehensive, user-friendly toolbox implementing state-of-the-art Bayesian linear and logistic regression. The toolbox provides highly efficient and numerically stable implementations of ridge, lasso, horseshoe, horseshoe+ and g-prior regression. The lasso, horseshoe and horseshoe+ priors are recommended for data sets in which the number of predictors is greater than the sample size. The toolbox allows predictors to be assigned to logical groupings (potentially overlapping, so that a predictor can belong to multiple groups). This can be used to exploit a priori knowledge about how predictors relate to each other (for example, grouping genetic data into genes and collections of genes such as pathways).
To support analysis of data with outliers, our implementation of Bayesian linear regression provides two heavy-tailed error models: Laplace and Student-t distributed errors. The toolbox is straightforward to use and works directly with MATLAB tables (automatically handling categorical variables) or with standard MATLAB matrices.
The toolbox is very efficient and can be used with high-dimensional data. Please see the scripts in the "examples\" directory for demonstrations of how to use the toolbox, or type "help bayesreg" within MATLAB. An R version of this toolbox is now available on CRAN; to install the R package, type "install.packages("bayesreg")" within R.
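For example, a typical call looks something like this (a minimal sketch on synthetic data; the option names below follow our reading of the help text, so see "help bayesreg" for the definitive interface):

% Synthetic p > n data: 100 predictors, 50 observations
X = randn(50, 100);
y = X(:, 1:5) * ones(5, 1) + randn(50, 1);

% Horseshoe prior with Gaussian errors; swap 'gaussian' for 'laplace'
% or 't' to use the heavy-tailed error models described above, and see
% the 'groups' option for assigning predictors to (overlapping) groups
[beta, beta0, retval] = bayesreg(X, y, 'gaussian', 'horseshoe', ...
    'nsamples', 1e4, 'burnin', 1e3);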
To cite this toolbox:
Makalic, E. & Schmidt, D. F.
High-Dimensional Bayesian Regularised Regression with the BayesReg Package
arXiv:1611.06649 [stat.CO], 2016
The package now handles logistic regression without the need for MEX files, but large speed-ups can be obtained with compiled code, so compiling is recommended. To compile the C++ code, run compile.m from the bayesreg directory within MATLAB; compilation requires Microsoft Visual Studio Professional or the GNU g++ compiler. Alternatively, for convenience, pre-compiled MEX files (MATLAB R2017a) for Windows, Linux and Mac OSX can be downloaded from the following two URLs:
To use these, simply download them and unzip them into the "bayesreg" folder.
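For reference, compiling from within MATLAB amounts to something like the following (assuming the toolbox sits in a folder called "bayesreg"):

cd('bayesreg');   % change into the toolbox directory
compile;          % builds the MEX files using the installed C++ compiler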
Fantastic job. Thanks guys!
Great program. Very useful for my work.
We have just finished and uploaded Version 1.3 of the software. It adds support for MATLAB tables, handles categorical variables appropriately, and includes a prediction function that produces predictions and prediction credible intervals and computes prediction performance statistics. I hope you find it useful.
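A sketch of how the new prediction function can be invoked; the name "br_predict" and its arguments here are our assumption based on the toolbox's "br_" naming convention, so please check the bundled examples for the exact interface:

% Hypothetical call; see the bundled example scripts for the actual interface
[pred, predstats] = br_predict(XTest, beta, beta0, retval, 'ytest', yTest);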
Thanks so much for your answer!
I have used the toolbox in MATLAB and found that the results correspond well with traditional ridge/lasso regularization results when p < n, that is, when the number of predictors is smaller than the number of observations.
I hope the new version comes soon and is even better!
Thanks and best regards,
The fully Bayesian approach used in this tool selects the regularisation parameters automatically by including them in the Bayesian hierarchy and sampling them along with the model parameters. The current version places a half-Cauchy prior on the overall regularisation parameter, in accordance with the suggestions of Polson and others.
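In the ridge case, for instance, the hierarchy takes the standard form from the literature (a sketch in generic notation):

$y \mid \beta, \sigma^2 \sim \mathrm{N}(X\beta, \sigma^2 I_n)$
$\beta_j \mid \tau^2, \sigma^2 \sim \mathrm{N}(0, \tau^2 \sigma^2), \quad j = 1, \ldots, p$
$\tau \sim \mathrm{C}^{+}(0, 1)$

so the global scale $\tau$, which plays the role of an (inverse) regularisation parameter, is sampled along with $\beta$ rather than tuned by cross-validation.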
The "best" posterior regression coefficients, in terms of squared-prediction error, are given by retval.muB. We are just finishing a version which provides a "predict" function to compute predictions onto new data (or the training data, if you want) and calculates prediction performance statistics. It also allows you to predict using the full Bayesian predictive posterior distribution, accepts Matlab tables and handles categorical variables. Hopefully this will be released in the next few days.
Cheers and thanks for your interest,
Hi, thanks for your contribution to the Bayesian regularization problem!
I have a question on this toolbox:
When using ridge/lasso regularization we usually choose the regularization coefficient through cross-validation to obtain the best predictions; is there a similar process in this toolbox? For example, how can we set the values for the error distributions in this toolbox to obtain different prior distributions? I ask because I found that the results from this toolbox differ from the results obtained directly with the ridge/lasso functions in MATLAB.
By the way, how can we obtain the "best" posterior regression coefficients? Can we regard the results in retval.muB as the "best" results?
Thanks in advance for your help!
-Updated the "Cite As" field in the toolbox description
-Added function "br_sparsify()" to sparsify posterior coefficient estimates; three sparsification methods currently available (see "br_example15")
-improved sampling speed for large design matrices
-Display the Widely Applicable Akaike's Information Criterion (WAIC) instead of DIC in summary output
- written a new parallelised C++ implementation of sampling code for logistic regression
- Added option ‘groups’ which allows grouping of variables into potentially overlapping groups
- Tidied up the summary display
- Updated the description to include links to the full version of the toolbox