Forecast Evaluation Criteria #2
Posted: Fri May 07, 2010 2:12 am
EViews can report a number of forecast evaluation criteria, but these are only available as options to the fit and forecast procedures of an equation object. Typically the criteria are used to compare regression-based forecasts with those obtained by other methods, even though those other methods (e.g. naïve forecasts) may themselves be implemented in EViews.

The attached program 'tricks' the software into calculating the evaluation criteria for a forecast produced by any method. It first runs a least-squares regression of the forecast values on themselves, which always yields a (0, 1) parameter vector. The 'independent variable' observations are then replaced by the actual values before the fit procedure is run to obtain the required evaluation. The method relies on the evaluation statistics being symmetric with respect to which series is labelled the 'forecasts' and which the 'actuals'.

The attached code is in the form of a subroutine for pasting into an EViews program, which permits its use on a number of alternative forecasts within a single EViews job. Each result can be distinguished by the series name reported in the evaluation table.

I have verified the subroutine's results on several examples against the approach of trubador elsewhere in this forum. All indices match to many decimal places, with the exception of MAPE. For the moment I am inclined to think this is a bug in EViews, though it is worth noting that MAPE, unlike the other statistics, is not symmetric: it scales each error by the 'actual' value, so swapping the roles of the two series changes the result. I have a full EViews program used for testing, and evidence of the discrepancy, available for distribution via PM.
Gerald
Code:
'~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
' Forecast evaluation for an arbitrary forecast series.
' a = actuals, f = forecasts.
subroutine eval(series a, series f)
series fa = f                ' working copy of the forecasts
equation eqtemp.ls f c fa    ' regressing f on itself gives the (0, 1) parameter vector
fa = a                       ' swap the regressor's observations for the actuals
freeze eqtemp.fit(e) a       ' fitted values now equal the actuals; (e) reports the evaluation
delete fa eqtemp             ' clean up the workfile
endsub
'~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
'~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
smpl fsmplstart fsmplend  ' ~~ the ex post forecast period
series y                  ' ~~ contains the actuals series
series yf                 ' ~~ contains the forecast series
call eval(y, yf)
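As a quick numerical check of the symmetry the trick relies on, here is a short sketch in Python (not EViews, since the point is just arithmetic): squared- and absolute-error statistics are unchanged when the two series swap roles, but MAPE is not, because it divides each error by whichever series plays the 'actual'. The series values below are made up purely for illustration.

```python
import math

actual   = [100.0, 110.0, 95.0, 120.0]   # illustrative 'actuals'
forecast = [ 98.0, 115.0, 90.0, 118.0]   # illustrative 'forecasts'

def rmse(a, f):
    # root mean squared error: symmetric, since (x - y)^2 == (y - x)^2
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, f)) / len(a))

def mae(a, f):
    # mean absolute error: symmetric, since |x - y| == |y - x|
    return sum(abs(x - y) for x, y in zip(a, f)) / len(a)

def mape(a, f):
    # mean absolute percentage error: each error is scaled by the
    # FIRST argument, so swapping the arguments changes the result
    return 100.0 * sum(abs((x - y) / x) for x, y in zip(a, f)) / len(a)

print(rmse(actual, forecast), rmse(forecast, actual))   # identical
print(mae(actual, forecast), mae(forecast, actual))     # identical
print(mape(actual, forecast), mape(forecast, actual))   # differ
```

So any statistic that is symmetric in the two series survives the relabelling the subroutine performs, while MAPE depends on which series is in the denominator.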