Optimization may be unreliable issue
Hi! I'm using the user-defined optimization feature of EViews, and although the code runs well, it sometimes ends with the warning "Optimization may be unreliable (first or second order conditions not met)". What does this really mean, and do you have any suggestions on how to fix it? I've tried different starting values and optimization methods (BFGS, legacy, ...) but still get this warning most of the time. The optimization equation I'm using has many regressors and is a bit complicated, but I wonder if that's the whole story. Thanks!
EViews Matt
- EViews Developer
- Posts: 583
- Joined: Thu Apr 25, 2013 7:48 pm
Re: Optimization may be unreliable issue
Hello,
At the highest level, that error indicates that EViews is unsure whether the results of the optimization really represent a minimum (or maximum) of your function. EViews tests the final parameter values using two classic criteria: (1) is the gradient of your function at that point zero, and (2) is the Hessian of your function at that point positive (or negative) definite? Those are the first- and second-order conditions mentioned in the error message, respectively. There are many reasons one or both of those conditions might be false at the final parameter values:
- The optimization terminated before it could find a minimum (or maximum). Increasing the iteration limit and/or decreasing the convergence threshold may address this.
- There are regions of the parameter space which if used as initial values don't lead to a minimum (or maximum). Restarting with different initial values may address this.
- There are structures in the "landscape" of values produced by your function that the optimization algorithm simply has trouble handling, e.g., saddle points or minima (or maxima) that are neither globally nor locally identified. Such structures could be natural phenomena in your function or the result of some type of misspecification. For example, constrained systems with insufficient constraints frequently fail in this way.
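To make the two conditions concrete, here is a small illustrative sketch (in Python with NumPy, not EViews code, and not how EViews implements its checks internally) that tests a candidate solution numerically: the first-order check asks whether the finite-difference gradient is near zero, and the second-order check asks whether the finite-difference Hessian is positive definite. The `bowl` and `saddle` functions are made-up examples; the saddle shows how a point can pass the first-order test yet fail the second-order one, which is exactly the kind of situation that triggers the warning.

```python
import numpy as np

def num_grad(f, x, h=1e-5):
    """Central-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def num_hess(f, x, h=1e-4):
    """Central-difference approximation of the Hessian of f at x."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

def check_minimum(f, x, tol=1e-4):
    """Return (first_order_ok, second_order_ok) for a candidate minimum x."""
    g = num_grad(f, x)
    H = num_hess(f, x)
    first_order = np.linalg.norm(g) < tol                 # gradient ~ zero?
    second_order = np.all(np.linalg.eigvalsh(H) > 0)      # positive definite?
    return bool(first_order), bool(second_order)

bowl = lambda x: x[0]**2 + x[1]**2    # true minimum at (0, 0)
saddle = lambda x: x[0]**2 - x[1]**2  # saddle point at (0, 0)

print(check_minimum(bowl, np.array([0.0, 0.0])))    # (True, True)
print(check_minimum(saddle, np.array([0.0, 0.0])))  # (True, False)
```

At the saddle point the gradient vanishes, so a gradient-based optimizer may stop there, but the Hessian has a negative eigenvalue, so the second-order condition fails and the point is not a minimum.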
Re: Optimization may be unreliable issue
Thanks Matt, that's very helpful! I'll check again and post the results if I manage to solve it in the end.