Question regarding DSGEVAR
Posted: Mon Sep 21, 2015 3:04 pm
Dear all,
I have two questions regarding the DSGEVAR procedure (specified with the option dsge_var in the estimation command).
1) Where exactly does Dynare store the prior (for the BVAR) generated by the DSGE model? The Bayesian IRFs for the DSGE-VAR that Dynare produces rely on identification based on the DSGE model (which of course makes perfect sense). I would like to compare these IRFs to IRFs from an identification scheme based on sign restrictions; therefore I need the prior so that I can estimate the reduced-form BVAR myself and then apply a different rotation (one based on sign restrictions), along the lines sketched below.
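To make the rotation step concrete, here is a minimal sketch of what I have in mind once I have the reduced-form posterior draws: the standard accept-reject draw of an orthogonal matrix from the Haar measure (Rubio-Ramírez/Waggoner/Zha style). The inputs `B`, `Sigma_u` and the `sign_checks` function are placeholders for my own objects, not anything Dynare returns.

```python
import numpy as np

def draw_sign_identified_irf(B, Sigma_u, sign_checks, horizons=20, max_tries=1000):
    """Accept-reject draw of an orthogonal rotation Q such that the impact
    matrix chol(Sigma_u) @ Q satisfies the supplied sign restrictions.

    B           : (n*p, n) stacked reduced-form VAR coefficient matrices
    Sigma_u     : (n, n) reduced-form innovation covariance
    sign_checks : callable taking the (n, n) impact matrix, returns True/False
    """
    n = Sigma_u.shape[0]
    P = np.linalg.cholesky(Sigma_u)               # lower-triangular factor
    for _ in range(max_tries):
        # QR of a Gaussian matrix gives a draw from the Haar measure
        Q, R = np.linalg.qr(np.random.randn(n, n))
        Q = Q @ np.diag(np.sign(np.diag(R)))      # sign normalisation
        impact = P @ Q
        if sign_checks(impact):
            return compute_irf(B, impact, horizons)
    raise RuntimeError("no rotation satisfied the sign restrictions")

def compute_irf(B, impact, horizons):
    """Iterate the VAR companion form to obtain structural IRFs."""
    n = impact.shape[0]
    p = B.shape[0] // n
    companion = np.zeros((n * p, n * p))
    companion[:n, :] = B.T
    companion[n:, :-n] = np.eye(n * (p - 1))
    irf = np.zeros((horizons, n, n))
    power = np.eye(n * p)
    for h in range(horizons):
        irf[h] = power[:n, :n] @ impact           # response at horizon h
        power = companion @ power
    return irf
```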
2) Does anyone know whether it is possible to use the prior generated for, say, 6 variables in a 12-variable BVAR, for instance by combining it with a Minnesota prior for the 6 variables that do not appear in the DSGE model? The problem is that the DSGE model I am using does not explicitly model some of the variables I would like to include in my BVAR. A sketch of the Minnesota block I have in mind follows.
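Purely to illustrate the idea, this is the kind of Minnesota-style prior moments I would construct for the additional block; the hyperparameters `lambda1`, `lambda2` and the way the two blocks would eventually be stitched together are my own assumptions, not an established procedure.

```python
import numpy as np

def minnesota_prior_moments(n_vars, n_lags, sigma, lambda1=0.2, lambda2=0.5):
    """Prior mean and (diagonal) prior variance of the VAR coefficients,
    equation by equation, under a standard Minnesota prior.

    sigma : (n_vars,) residual standard deviations from univariate AR fits,
            used to scale the cross-variable shrinkage.
    Returns arrays of shape (n_vars, n_vars * n_lags + 1), constant term last.
    """
    k = n_vars * n_lags + 1
    prior_mean = np.zeros((n_vars, k))
    prior_var = np.zeros((n_vars, k))
    for i in range(n_vars):                      # equation i
        prior_mean[i, i] = 1.0                   # random-walk prior on own first lag
        for lag in range(1, n_lags + 1):
            for j in range(n_vars):              # coefficient on variable j at this lag
                col = (lag - 1) * n_vars + j
                if i == j:
                    prior_var[i, col] = (lambda1 / lag) ** 2
                else:
                    prior_var[i, col] = (lambda1 * lambda2 / lag) ** 2 * (sigma[i] / sigma[j]) ** 2
        prior_var[i, -1] = 1e6                   # loose prior on the constant
    return prior_mean, prior_var
```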
Any hint or suggestion is very welcome.
Best
Kay