Dear all,
My question is about how to calibrate a parameter so that the standard deviation of an endogenous variable matches its empirical counterpart.
More precisely, I am replicating the model of Jermann and Quadrini ("Macroeconomic Effects of Financial Shocks"). They calibrate a parameter so that the standard deviation of equity payouts in the model equals the empirical standard deviation.
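To make the question concrete, here is a minimal sketch (in Python, purely illustrative, not the authors' code) of the kind of moment matching I have in mind; `std_equity_payout` is a hypothetical placeholder standing in for solving the model at a given parameter value and returning the model-implied std:

```python
# Sketch only: calibrate a parameter so the model std of equity payouts
# hits an empirical target. The model itself is replaced by a dummy function.
from scipy.optimize import brentq

TARGET_STD = 0.01  # placeholder for the empirical std of equity payouts


def std_equity_payout(kappa):
    # Hypothetical stand-in: in practice, solve/simulate the model at `kappa`
    # and return the implied std of equity payouts.
    return 0.02 * kappa  # dummy monotone mapping so the example runs


# Find the parameter value at which model std = empirical target.
kappa_star = brentq(lambda k: std_equity_payout(k) - TARGET_STD, 1e-6, 10.0)
print(kappa_star)
```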
In their presentation of the stylized facts, the empirical standard deviation is computed from filtered data (Baxter-King filter). Looking at their GAUSS code, however, it seems that the moment targeted in the calibration is computed from linearly detrended data.
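Just to be explicit about the two empirical treatments I am comparing, here is a small Python sketch on an arbitrary placeholder series (assuming scipy and statsmodels; the series and filter settings are only illustrative):

```python
# Compare the std of a linearly detrended series with the std of the
# Baxter-King band-pass filtered series (standard quarterly settings).
import numpy as np
from scipy.signal import detrend
from statsmodels.tsa.filters.bk_filter import bkfilter

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200)) * 0.01  # placeholder quarterly series

# (1) What the GAUSS code seems to target: std of linearly detrended data.
std_linear = detrend(y, type="linear").std()

# (2) What the stylized-facts table reports: std of BK-filtered data.
std_bk = bkfilter(y, low=6, high=32, K=12).std()

print(std_linear, std_bk)
```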
Also, I am not sure which model-implied standard deviation I should use: the theoretical moment or a simulated moment? Filtered or not?
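By the theoretical-versus-simulated distinction I mean something like the following sketch for a first-order (linearized) solution y_t = A y_{t-1} + B e_t; the A and B matrices here are made up, whereas in practice they would come from the solved model:

```python
# Theoretical (population) std from the discrete Lyapunov equation vs.
# sample std from a long simulation of the same linear law of motion.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = np.array([[0.9, 0.1],
              [0.0, 0.5]])
B = np.array([[0.01],
              [0.02]])

# Theoretical covariance: Sigma = A Sigma A' + B B'.
Sigma = solve_discrete_lyapunov(A, B @ B.T)
std_theoretical = np.sqrt(np.diag(Sigma))

# Simulated moment: simulate a long sample and take the sample std.
rng = np.random.default_rng(0)
T = 100_000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + B @ rng.normal(size=(1,))
std_simulated = y.std(axis=0)

print(std_theoretical, std_simulated)
```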
So my question is whether there are any "rules" about which moments (theoretical and empirical) should be compared.
Let me know if I need to be clearer.
Thank you in advance
Rudy