Estimation of a GARCH extension
Posted: Mon Nov 22, 2010 5:43 am
{using Eviews 6}
Dear Eviews fanatics :eviews6:
I would like to use Eviews to estimate a Realized GARCH model (Hansen, Huang, Shek, 2010). This would involve programming it myself, since the model is relatively new and is not built into Eviews. I did the programming tutorials in "An Introduction to Eviews Programming", but they didn't get me any closer to knowing how to launch my project.
I was wondering if someone could give me a rough guideline on where and how to start, and perhaps point me to an information source on the topic.
About my project:
I'm doing research to verify whether the Realized GARCH model produces better predictions than other GARCH models. The Realized GARCH model allows for leverage effects, as do a few other specifications such as EGARCH or TGARCH. The difference is that the Realized GARCH model makes use of high-frequency data (data per minute/second/millisecond); in my project this is the so-called Realized Kernel, a kernel-weighted estimator of the realized variance (Barndorff-Nielsen et al., 2008). The idea is that high-frequency data incorporates more information than daily data, so using it should result in better predictions.
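To make concrete what the estimation involves, here is a rough sketch of joint quasi-maximum-likelihood estimation of the log-linear Realized GARCH(1,1) from the Hansen, Huang and Shek paper. This is not Eviews code (in Eviews the natural route would be a logl object); it is an illustrative Python mock-up, and all the "true" parameter values and starting values below are made up for the simulation only:

```python
import numpy as np
from scipy.optimize import minimize

def realized_garch_negloglik(params, r, x):
    """Joint negative log-likelihood of the log-linear Realized GARCH(1,1).

    Return equation:     r_t = sqrt(h_t) * z_t,   z_t ~ N(0, 1)
    GARCH equation:  log h_t = omega + beta*log h_{t-1} + gamma*log x_{t-1}
    Measurement eq.: log x_t = xi + phi*log h_t + tau1*z_t
                               + tau2*(z_t^2 - 1) + u_t,  u_t ~ N(0, sigma_u^2)
    """
    omega, beta, gamma, xi, phi, tau1, tau2, log_sigma_u = params
    sigma_u2 = np.exp(2.0 * log_sigma_u)     # parameterize via log to keep it positive
    n = len(r)
    logh = np.empty(n)
    logh[0] = np.log(np.var(r))              # initialize at the sample variance
    for t in range(1, n):
        logh[t] = omega + beta * logh[t - 1] + gamma * np.log(x[t - 1])
    h = np.exp(logh)
    z = r / np.sqrt(h)
    u = np.log(x) - (xi + phi * logh + tau1 * z + tau2 * (z**2 - 1.0))
    ll_r = -0.5 * (np.log(2 * np.pi) + logh + z**2)                       # returns part
    ll_x = -0.5 * (np.log(2 * np.pi) + np.log(sigma_u2) + u**2 / sigma_u2)  # measurement part
    return -(ll_r.sum() + ll_x.sum())

# --- simulate from the model with illustrative parameter values ---
rng = np.random.default_rng(0)
n = 1000
omega, beta, gamma = 0.06, 0.55, 0.41        # persistence beta + gamma*phi < 1
xi, phi, tau1, tau2, sigma_u = -0.18, 1.0, -0.07, 0.07, 0.38
logh_s = np.empty(n); logx_s = np.empty(n); r = np.empty(n)
logh_s[0] = 0.0
for t in range(n):
    if t > 0:
        logh_s[t] = omega + beta * logh_s[t - 1] + gamma * logx_s[t - 1]
    z = rng.standard_normal()
    r[t] = np.exp(0.5 * logh_s[t]) * z
    logx_s[t] = (xi + phi * logh_s[t] + tau1 * z + tau2 * (z * z - 1.0)
                 + sigma_u * rng.standard_normal())
x = np.exp(logx_s)                           # realized measure (e.g. realized kernel)

# --- estimate by numerically minimizing the negative log-likelihood ---
start = np.array([0.0, 0.5, 0.3, 0.0, 1.0, 0.0, 0.0, np.log(0.5)])
res = minimize(realized_garch_negloglik, start, args=(r, x),
               method="Nelder-Mead", options={"maxiter": 20000})
print(res.x)                                 # omega, beta, gamma, xi, phi, tau1, tau2, log(sigma_u)
```

The same two-part likelihood (a standard GARCH return density plus a measurement-equation density for the realized measure) is what one would code up as the @logl specification in an Eviews logl object.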
Thanks in advance! Eviews on!