Seemingly Unrelated Regressions and robust covariance matrix

For technical questions regarding estimation of single equations, systems, VARs, Factor analysis and State Space Models in EViews. General econometric questions and advice should go in the Econometric Discussions forum.


Bigbrotherjx
Posts: 36
Joined: Wed Feb 10, 2010 4:25 pm

Seemingly Unrelated Regressions and robust covariance matrix

Postby Bigbrotherjx » Wed Feb 10, 2010 4:32 pm

Is it possible to estimate a SUR system with the equivalent of the White or Newey-West covariance matrix? I would like to estimate a SUR system which is robust to heteroscedasticity and serial correlation.

Postby Bigbrotherjx » Wed Feb 10, 2010 4:43 pm

The version is 5.1 btw.

EViews Glenn
EViews Developer
Posts: 2642
Joined: Wed Oct 15, 2008 9:17 am

Postby EViews Glenn » Thu Feb 11, 2010 12:30 pm

You should be able to do this through the system GMM estimator.

Postby Bigbrotherjx » Thu Feb 11, 2010 4:47 pm

I was under the impression that GMM is rather different from SUR conceptually. I don't have any instruments...

Also, I came across the following post:
viewtopic.php?f=4&t=436&p=1403&hilit=seemingly+unrelated#p1403

This appears to be related to what I am looking for... QMS Gareth said it will be implemented in EViews 7?

Postby EViews Glenn » Thu Feb 11, 2010 6:14 pm

I guess I need to be a bit clearer on what you want (and what I mean :)). The "seemingly unrelated" refers to the fact that you have a set of equations with no apparent cross-equation restrictions, but with non-zero off-diagonals in the error covariance.

For the purposes of this discussion there are a couple of ways to proceed:

- You can estimate the specification using a GLS approach which corrects for cross-sectional heteroskedasticity and contemporaneous correlation (but not for general heteroskedasticity and serial correlation). In principle, you could follow this with a robust standard error estimator.

- Alternately, you can estimate using system least squares without the correlation correction and then compute a robust standard error estimator, for example a system HAC estimator.

EViews does not allow you to take the former approach, but does allow you to do the latter using the GMM tools. Note that the equivalence results from treating all of the explanatory variables in your specification as exogenous. Just as TSLS using the original regressors as instruments yields the least squares estimator, so too does GMM with the appropriate orthogonality conditions and weighting matrix (what the dialog terms TSLS weighting) yield the system least squares estimator. Then all you have to do is select the appropriate robust covariance option.
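The equivalence can be checked numerically. A minimal numpy sketch (illustrative textbook formulas, not EViews code) for a single equation, showing that 2SLS with the regressors as their own instruments reproduces OLS:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
y = X @ np.array([1.0, 0.5, -2.0]) + rng.standard_normal(n)

# OLS: b = (X'X)^{-1} X'y
b_ols = np.linalg.solve(X.T @ X, X.T @ y)

# 2SLS with Z = X: b = (X'PX)^{-1} X'Py, where P = Z(Z'Z)^{-1}Z'
Z = X
P = Z @ np.linalg.solve(Z.T @ Z, Z.T)
b_2sls = np.linalg.solve(X.T @ P @ X, X.T @ P @ y)

print(np.allclose(b_ols, b_2sls))  # instruments = regressors gives OLS
```

The same algebra carries over to the system case: with the regressors of each equation used as that equation's instruments, system 2SLS is system least squares.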

Postby Bigbrotherjx » Thu Feb 11, 2010 6:40 pm

QMS Glenn wrote: The seemingly unrelated refers to the fact that you have a set of equations with no apparent cross-equation restrictions, but with non-zero off-diagonals. [...]

Ah, I'd always associated SUR with GLS. I'm guessing that the "Seemingly Unrelated Regressions" option in system estimation takes the GLS approach?

Could you elaborate on "appropriate orthogonality conditions and weighting matrix"? Are my orthogonality conditions basically the equation-by-equation OLS conditions, i.e. E(X'u) = 0? From what I recall, the weighting matrix can be computed iteratively, right?

Postby EViews Glenn » Fri Feb 12, 2010 11:24 am

You are correct in associating SUR with GLS estimation of the system, and also correct that what we label SUR is simply GLS on the system of equations. As a side note, the EViews SUR is a bit more general than the original SUR formulation in that the equations need not be "SU"; that is, we allow for cross-equation coefficient restrictions.
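For readers following along, here is a minimal numpy sketch of the GLS mechanics behind SUR (assumed textbook formulas, not EViews internals): estimate each equation by OLS, form the contemporaneous residual covariance Sigma, then reweight the stacked system with Sigma^{-1} ⊗ I:

```python
import numpy as np

rng = np.random.default_rng(1)
T, M = 100, 2                      # T observations, M equations
X1 = np.column_stack([np.ones(T), rng.standard_normal(T)])
X2 = np.column_stack([np.ones(T), rng.standard_normal(T)])
# contemporaneously correlated disturbances across the two equations
e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=T)
y1 = X1 @ np.array([1.0, 2.0]) + e[:, 0]
y2 = X2 @ np.array([-1.0, 0.5]) + e[:, 1]

# stack: block-diagonal regressor matrix, stacked dependent variable
X = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])

# step 1: system OLS (= equation-by-equation OLS here), residuals -> Sigma
b0 = np.linalg.solve(X.T @ X, X.T @ y)
u = (y - X @ b0).reshape(M, T)
Sigma = u @ u.T / T

# step 2: feasible GLS with weight Sigma^{-1} kron I_T
W = np.kron(np.linalg.inv(Sigma), np.eye(T))
b_sur = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```

This two-step feasible GLS is the "SUR" estimator; iterating steps 1 and 2 to convergence gives the iterated variant.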

As to your question about appropriate orthogonality and weighting, let me elaborate, using a single equation as an illustration...

Suppose you estimate the equation Y = XB + e using GMM with X as your instruments, and compute the weighting matrix assuming that everything is iid, so that E(X'ee'X) = sigma^2 X'X and the weighting matrix is proportional to (X'X)^{-1}. Then the orthogonality condition implied by this specification is E(X'e) = 0, and least squares is the first step of the GMM estimator. You can think of this weighting matrix as the 2SLS weighting matrix, since its use yields the TSLS (and in this case the OLS) estimator. I think we label this the "Identity" weighting matrix in the system dialog box, but I don't like that label, since what we really want is for the conditional error variance to be the identity matrix.

Typically in GMM you would then go on and use the consistent estimates of B to form a new estimate of the weighting matrix, and off you would go.

But you don't have to iterate the weights. Suppose instead that you simply stop after the first estimation of B, and form your coefficient covariance estimates using a robust estimator of the long-run covariance. In this case, you are simply doing OLS with robust standard errors (perhaps White or Newey-West).

So what I'm proposing is that you set up your instrument specification so that estimating system 2SLS yields identical results to estimating system OLS (your statement about the equation-by-equation orthogonality conditions is spot-on). Once you are sure you've got that lined up, use one of the two system GMM estimators using the 2SLS weighting matrix. This will give you the system OLS estimates with robust standard errors.
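As an illustration of "OLS with robust standard errors", here is a single-equation numpy sketch (the White/HC0 form, assumed for concreteness; EViews' exact small-sample adjustments may differ):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
# heteroskedastic errors: variance grows with |x|
e = rng.standard_normal(n) * (1 + np.abs(X[:, 1]))
y = X @ np.array([0.0, 1.0]) + e

b = np.linalg.solve(X.T @ X, X.T @ y)
u = y - X @ b
XtX_inv = np.linalg.inv(X.T @ X)

# classical covariance: sigma^2 (X'X)^{-1}
V_classical = (u @ u / (n - 2)) * XtX_inv
# White/HC0 sandwich: (X'X)^{-1} [sum_t u_t^2 x_t x_t'] (X'X)^{-1}
meat = X.T @ (X * (u**2)[:, None])
V_white = XtX_inv @ meat @ XtX_inv

se_classical = np.sqrt(np.diag(V_classical))
se_white = np.sqrt(np.diag(V_white))
```

The coefficients are plain OLS either way; only the covariance estimator changes, which is exactly the division of labor described above.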

Postby Bigbrotherjx » Fri Feb 12, 2010 4:16 pm

OK, just to confirm, I am basically setting my instruments as the exogenous explanatory variables.

However, I'm still not quite sure about what the manual says here:

"If you select either GMM method, EViews will display a checkbox labeled Identity weighting
matrix in estimation. If selected, EViews will estimate the model using identity
weights, and will use the estimated coefficients and GMM specification you provide to
compute a coefficient covariance matrix that is robust to cross-section heteroskedasticity
(White) or heteroskedasticity and autocorrelation (Newey-West). If this option is not
selected, EViews will use the GMM weights both in estimation, and in computing the coefficient
covariances."


In my case, is this basically saying that if I tick the box, the first step is OLS, and otherwise it isn't? The second step is then the Newey-West HAC covariance matrix.

What if I don't select the option? What does it mean to use "GMM weights both in estimation, and in computing the coefficient covariances"?

Sorry if I seem to be going around in circles here... I'm just not sure where the cross-equation covariance of the disturbances is captured in the GMM procedure.

I also have some options concerning pre-whitening, kernels and bandwidths. I am not familiar with this non-parametric stuff and the manual is rather terse. Could you briefly summarise what the choice boils down to, and provide some recommendations?

Thanks

Postby EViews Glenn » Fri Feb 12, 2010 5:22 pm

If the box isn't checked, then you'll do iterative GMM with the corresponding robust weight matrix. If it is checked, then the weight matrix will be the non-robust form, which gives you OLS in your setting.

The general error structure is captured in the covariance calculation, in essence through the estimation of E(X'ee'X) in the middle of the sandwich variance estimator.
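For concreteness, the HAC version of that middle term is the Bartlett-weighted (Newey-West) long-run covariance of the moments x_t u_t. A numpy sketch, with an arbitrarily chosen bandwidth L = 5 (a hypothetical choice, not a recommendation):

```python
import numpy as np

def newey_west_middle(X, u, L):
    """Bartlett-weighted long-run covariance of g_t = x_t * u_t:
    S = Gamma_0 + sum_{j=1}^{L} w_j (Gamma_j + Gamma_j'), w_j = 1 - j/(L+1).
    """
    g = X * u[:, None]          # moment contributions, one row per t
    n = g.shape[0]
    S = g.T @ g / n             # Gamma_0
    for j in range(1, L + 1):
        w = 1 - j / (L + 1)
        Gamma_j = g[j:].T @ g[:-j] / n
        S += w * (Gamma_j + Gamma_j.T)
    return S

rng = np.random.default_rng(3)
n = 400
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
# AR(1) errors induce serial correlation in the moments
u = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + eps[t]

S = newey_west_middle(X, u, L=5)
# HAC sandwich: n (X'X)^{-1} S (X'X)^{-1}
XtX_inv = np.linalg.inv(X.T @ X)
V_hac = n * XtX_inv @ S @ XtX_inv
```

The Bartlett weights taper the higher-order autocovariances so that S stays positive semi-definite, which is why the kernel and bandwidth choices exist in the dialog.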

Explaining the computation of long-run variances is something that I'm going to have to beg off on as it's quite involved. Something like Hamilton or Hayashi is your best bet.

Postby Bigbrotherjx » Mon Feb 15, 2010 3:08 pm

QMS Glenn wrote: If the box isn't checked, then you'll do iterative GMM with the corresponding robust weight matrix. If it is checked, then the weight matrix will be the non-robust form, which gives you OLS in your setting.


If I tick the box, the weight matrix will initially be the identity matrix, so we are doing equation-by-equation OLS as a first step, then using the robust estimator of the long-run covariance. So it is the second step which captures the cross-equation covariance in the disturbances, right?

Just to double check, you are saying that I SHOULD be ticking the box.

Postby EViews Glenn » Tue Feb 16, 2010 10:18 am

Your original statement is correct. Ticking the box will do 2SLS/OLS as the first stage and then report robust standard errors which capture various correlation structures.

The "should" part is open to debate. You could just as well not check the box and do GMM with weighting matrices that are estimated using the assumed structure. In principle, these should be more efficient estimators, but I'm not going to argue in favor of or against their use in your setting.

The recommendation that you check the box was merely as an answer to your original question about getting 2SLS/OLS estimates with robust standard errors.

Postby Bigbrotherjx » Tue Feb 16, 2010 5:46 pm

QMS Glenn wrote: In principle, these should be more efficient estimators, but I'm not going to argue in favor of or against their use in your setting.


In that case, why would anybody want to tick the box, if not doing so carries all the robustness but is more efficient?

Postby EViews Glenn » Wed Feb 17, 2010 12:18 pm

What you are asking is why people would ever do 2SLS with robust standard errors instead of GMM with optimal weighting matrices. Or even, why would people do OLS instead of optimal GMM. I won't comment on the answers to those questions, but I do know that people have preferred, and continue to prefer, the former to the latter. I will note, however, that just because something is asymptotically optimal in theory doesn't necessarily make it more desirable in finite-sample practice. And the fact that the weighting matrices must be estimated from the data must be taken into account when evaluating robustness and efficiency.

Postby Bigbrotherjx » Mon Mar 15, 2010 4:11 pm

Does the above method still work when there is a lagged dependent variable?

balancesheet
Posts: 1
Joined: Thu Aug 26, 2010 3:46 am

Postby balancesheet » Thu Aug 26, 2010 4:08 am

I have a question related to these posts. I'm running a VAR with 5 variables and 2 lags.

Is there a particular reason EViews can't just estimate via OLS using a HAC estimator? If I run the VAR equation by equation with a HAC estimator, I get the same coefficients but slightly different standard errors than when I run the VAR by HAC-GMM using 2SLS. I have used all the lagged values that are in the VAR as instruments (which is right as far as I know).

I'm not familiar with GMM estimation of more than one equation, so it is a bit confusing to me.

I hope someone can tell me whether the GMM standard errors are more robust than those from the individually estimated equations with HAC.

Cheers.

