Monte Carlo simulation of a linear regression model with a lagged dependent variable
I have a linear regression model with a lagged dependent variable: y_t = beta_0 + beta_1*y_{t-1} + u_t. The starting value is y_0 = 2, and I know the true coefficients: beta_0 = 2 and beta_1 = 1. How can I perform a Monte Carlo simulation that estimates the bias of the OLS coefficient estimates?
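One way to set this up is sketched below. The question does not specify the error distribution, the sample length, or the number of replications, so the sketch assumes i.i.d. N(0,1) errors, T = 100 observations, and R = 10000 replications. Each replication simulates the process forward from y_0, runs OLS of y_t on a constant and y_{t-1}, and stores the estimates; the Monte Carlo bias is the average estimate minus the true value. Note that with beta_1 = 1 the process has a unit root, so this is exactly the setting where the finite-sample bias of the OLS autoregressive coefficient is of interest.

% Monte Carlo study of OLS bias in y_t = beta0 + beta1*y_{t-1} + u_t
% Assumptions (not given in the question): u_t ~ N(0,1), T = 100, R = 10000.
rng(1);                          % fix the seed for reproducibility
beta0 = 2; beta1 = 1;            % true coefficients from the question
y0    = 2;                       % initial value from the question
T     = 100;                     % sample length (assumed)
R     = 10000;                   % number of Monte Carlo replications (assumed)

bhat = zeros(R, 2);              % storage for [beta0_hat, beta1_hat]
for r = 1:R
    % simulate one path of the process, starting from y0
    u = randn(T, 1);             % i.i.d. N(0,1) errors (assumed)
    y = zeros(T, 1);
    y(1) = beta0 + beta1*y0 + u(1);
    for t = 2:T
        y(t) = beta0 + beta1*y(t-1) + u(t);
    end
    % OLS of y_t on a constant and the lagged value y_{t-1}
    X = [ones(T, 1), [y0; y(1:end-1)]];
    bhat(r, :) = (X \ y)';       % least-squares solution via backslash
end

bias = mean(bhat) - [beta0, beta1];   % Monte Carlo estimate of the bias
fprintf('Bias of beta0_hat: %+.4f\n', bias(1));
fprintf('Bias of beta1_hat: %+.4f\n', bias(2));

The backslash operator solves the least-squares problem directly, so no toolbox functions are needed. Increasing R reduces the Monte Carlo noise in the bias estimate, while rerunning with different values of T shows how the bias changes with sample size.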