Monte Carlo simulation of a linear regression model with a lagged dependent variable

I have a linear regression model with a lagged dependent variable:

y_t = beta_0 + beta_1 * y_{t-1} + u_t

The starting value is y_0 = 2, and the true coefficients are beta_0 = 2 and beta_1 = 1. How can I perform a Monte Carlo simulation that estimates the bias of the OLS coefficients?
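A minimal sketch in MATLAB, assuming standard-normal errors u_t ~ N(0,1), a sample length T = 100, and R = 10000 replications (none of these are specified in the question): in each replication, simulate the series from y_0, run OLS of y_t on a constant and y_{t-1}, and compare the average estimates with the true coefficients.

% Monte Carlo estimate of the OLS bias in y_t = beta_0 + beta_1*y_{t-1} + u_t
rng(1);                       % fix the seed for reproducibility
beta0 = 2;  beta1 = 1;        % true coefficients from the question
y0    = 2;                    % given starting value
T     = 100;                  % sample length (assumed)
R     = 10000;                % number of replications (assumed)
bhat  = zeros(R, 2);          % storage for [beta0_hat, beta1_hat]

for r = 1:R
    u    = randn(T, 1);       % u_t ~ N(0,1) (assumed error distribution)
    y    = zeros(T, 1);
    y(1) = beta0 + beta1*y0 + u(1);
    for t = 2:T
        y(t) = beta0 + beta1*y(t-1) + u(t);
    end
    ylag      = [y0; y(1:end-1)];      % lagged dependent variable
    X         = [ones(T, 1), ylag];    % regressors: constant and y_{t-1}
    bhat(r,:) = (X \ y)';              % OLS estimates for this replication
end

bias = mean(bhat) - [beta0, beta1];    % Monte Carlo estimate of the bias
fprintf('Bias of beta0_hat: %+.4f\n', bias(1));
fprintf('Bias of beta1_hat: %+.4f\n', bias(2));

Note that with beta_1 = 1 the process is a random walk with drift, so it is nonstationary; OLS estimates of autoregressive coefficients are known to be biased in small samples, and rerunning the sketch with different values of T shows how the bias behaves as the sample grows.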

