on the calculation of the rmse on bvar_forecast

PostPosted: Thu Jan 07, 2010 8:35 pm
by JorgeFornero
Hi,
In the file "bvar_forecast.m" it reads in line 145:

Code:
rmse = sqrt(sq_err_cumul / size(sq_err_cumul, 1));


It follows that the denominator will invariably be 1!
Would it not be correct to employ instead

Code:
size(forecast_data.realized_val, 1)
?
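To illustrate the point with a toy example (Python, with made-up numbers; `sq_err_cumul` here stands in for the variable of the same name in "bvar_forecast.m"):

```python
import numpy as np

# Hypothetical illustration of the bug: sq_err_cumul is the cumulated
# squared forecast error per variable (a 1-by-nvars row vector in the
# MATLAB code), so size(sq_err_cumul, 1) is always 1.
horizon = 4                      # number of out-of-sample periods
errors = np.array([[0.5, -0.2],  # forecast errors: one row per period,
                   [0.3,  0.1],  # one column per observed variable
                   [0.1,  0.0],
                   [0.2, -0.1]])

sq_err_cumul = (errors ** 2).sum(axis=0)      # summed squared errors, per variable

rmse_buggy = np.sqrt(sq_err_cumul / 1)        # divides by size(sq_err_cumul, 1) == 1
rmse_fixed = np.sqrt(sq_err_cumul / horizon)  # divides by the number of periods

print(rmse_buggy, rmse_fixed)
```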

Many thanks for this clarification!
Best, Jorge

Re: on the calculation of the rmse on bvar_forecast

PostPosted: Fri Jan 08, 2010 9:36 am
by SébastienVillemot
Sure, you're right.

Thanks for reporting this. It will be fixed in Dynare 4.1.1.

Best

Re: on the calculation of the rmse on bvar_forecast

PostPosted: Thu Jan 14, 2010 5:10 am
by JorgeFornero
Sebastien,
Thank you very much for your quick reply.
I have a second remark though.
In terms of the forecasting ability of BVARs with different lags, my intuition tells me that the farther you go in the forecast horizon, the larger the RMSE you get (as in any model).
In the example you provided, «bvar_and_dsge.mod», I tried different horizons and I always get the same RMSE figures, i.e. in line 30:
Code:
bvar_forecast(forecast = 8, bvar_replic = 2000, nobs = 200) 8;

by changing forecast = 8 to 2, 4, 6, etc.
The problem seems to be in "bvar_forecast.m" line 139 :
Code:
       y = X * posterior.PhiHat

which yields incredibly small values; squaring them (line 142) exacerbates the problem.
As a result, the reported RMSEs are almost identical, so it is almost impossible to tell which model is better.
Any suggestion on this is highly appreciated.
Best, Jorge

Re: on the calculation of the rmse on bvar_forecast

PostPosted: Thu Jan 21, 2010 10:33 am
by SébastienVillemot
Hi,

I have checked the forecast code again and I think it is fine; even though I cannot completely rule out a bug, at least I don't see one.

The point is that the out-of-sample forecast returns very rapidly to the steady state (which is zero here), as do all VAR forecasts, since the forecast is made under the assumption that no shocks occur.

So this is the reason why the line you mention gives very small values: only the first values of the out-of-sample forecast are significantly different from zero.
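A minimal sketch of this decay (Python, not the Dynare code; the companion matrix and starting point are made up), for a stable VAR(1) in deviations from the mean:

```python
import numpy as np

# Unconditional forecast from a stable VAR(1), y_t = A y_{t-1}:
# with no shocks, iterating the forecast just multiplies by A repeatedly,
# so the path decays geometrically to the steady state (zero).
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])      # hypothetical companion matrix, eigenvalues < 1
y = np.array([1.0, 1.0])        # last observed (demeaned) data point

for h in range(1, 9):
    y = A @ y                   # h-step-ahead forecast
    print(h, np.abs(y).max())   # shrinks toward zero at every step
```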

The reason why you get the same RMSE regardless of the number of lags is probably that you have many dates in your out-of-sample forecast. Since the forecast returns to zero within a few periods, whatever the number of lags, you get almost the same mean error.

To see a difference, you probably need to shrink the out-of-sample window, i.e. reduce the distance between the option "nobs" (number of observations) and the size of your dataset: all observations of the dataset after "nobs" are treated as out-of-sample and used in the RMSE calculation.
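A toy numerical illustration of this effect (Python, with invented error sequences): two models whose forecast errors differ only over the first few out-of-sample periods, after which both forecasts have decayed to the steady state and their errors coincide.

```python
import numpy as np

# Made-up forecast errors: model A and model B disagree only in the
# first three periods; afterwards both forecasts sit at zero, so the
# remaining error is the same realized data for both.
common_tail = np.full(97, 0.1)
err_a = np.concatenate(([0.5, 0.3, 0.2], common_tail))
err_b = np.concatenate(([0.4, 0.25, 0.15], common_tail))

def rmse(e):
    return np.sqrt(np.mean(e ** 2))

# Long out-of-sample window: the identical tail dominates the mean,
# so the two RMSEs are nearly indistinguishable.
print(rmse(err_a), rmse(err_b))

# Short window (only a few periods after the estimation sample):
# the early differences are no longer averaged away.
print(rmse(err_a[:5]), rmse(err_b[:5]))
```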

Hope this helps,