Seemingly Unrelated Regressions and robust covariance matrix
Moderators: EViews Gareth, EViews Moderator

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Seemingly Unrelated Regressions and robust covariance matrix
Is it possible to estimate a SUR system with the equivalent of the White or Newey-West covariance matrix? I would like to estimate a SUR system with standard errors that are robust to heteroscedasticity and serial correlation.

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Re: Seemingly Unrelated Regressions and robust covariance matrix
The version is 5.1 btw.

 EViews Developer
 Posts: 2658
 Joined: Wed Oct 15, 2008 9:17 am
Re: Seemingly Unrelated Regressions and robust covariance matrix
You should be able to do this through the system GMM estimator.

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Re: Seemingly Unrelated Regressions and robust covariance matrix
I was under the impression that GMM is rather different to SUR conceptually. I don't have any instruments...
Also, I came across the following post:
viewtopic.php?f=4&t=436&p=1403&hilit=seemingly+unrelated#p1403
This appears to be related to what I am looking for... QMS Gareth said it will be implemented in EViews 7?

 EViews Developer
 Posts: 2658
 Joined: Wed Oct 15, 2008 9:17 am
Re: Seemingly Unrelated Regressions and robust covariance matrix
I guess I need to be a bit clearer on what you want (and what I mean). The "seemingly unrelated" refers to the fact that you have a set of equations with no apparent cross-equation restrictions, but with non-zero off-diagonal elements in the error covariance matrix.
For the purposes of this discussion, there are a couple of ways to proceed:
1. You can estimate the specification using a GLS approach which corrects for cross-sectional heteroskedasticity and contemporaneous correlation (but not for general heteroskedasticity and serial correlation). In principle, you could follow this with a robust standard error estimator.
2. Alternately, you can estimate using system least squares without the correlation correction and then compute a robust standard error estimator, for example a system HAC estimator.
EViews does not allow you to take the former approach, but does allow you to do the latter using the GMM tools. Note that the equivalence results from treating all of the explanatory variables in your specification as exogenous. Just as TSLS using the original regressors as instruments yields the least squares estimator, so too does GMM with the appropriate orthogonality conditions and weighting matrix (what is termed TSLS weighting in the dialog) yield the system least squares estimator. Then all you have to do is select the appropriate robust covariance option.
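For intuition, here is a minimal numpy sketch of the second route for a single equation (illustrative only, not EViews code; the data and bandwidth are made up): plain least squares followed by a Newey-West HAC sandwich covariance.

```python
import numpy as np

# Simulated data for illustration: y = 1 + 0.5*x + e
rng = np.random.default_rng(0)
T = 200
x = rng.normal(size=T)
X = np.column_stack([np.ones(T), x])          # constant + regressor
y = X @ np.array([1.0, 0.5]) + rng.normal(size=T)

b = np.linalg.solve(X.T @ X, X.T @ y)         # OLS coefficients
u = y - X @ b                                 # residuals

# Newey-West long-run covariance of the moments X'u, Bartlett kernel
L = 4                                         # bandwidth (lag truncation), arbitrary here
Xu = X * u[:, None]                           # per-period moment contributions
S = Xu.T @ Xu / T                             # lag-0 (White) term
for l in range(1, L + 1):
    w = 1.0 - l / (L + 1.0)                   # Bartlett weight
    G = Xu[l:].T @ Xu[:-l] / T                # lag-l autocovariance of moments
    S += w * (G + G.T)

XtX_inv = np.linalg.inv(X.T @ X / T)
V = XtX_inv @ S @ XtX_inv / T                 # sandwich covariance estimator
se = np.sqrt(np.diag(V))                      # HAC standard errors
```

The point estimates are untouched; only the middle matrix S of the sandwich changes relative to the textbook OLS covariance.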

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Re: Seemingly Unrelated Regressions and robust covariance matrix
QMS Glenn wrote: I guess I need to be a bit clearer on what you want (and what I mean). The "seemingly unrelated" refers to the fact that you have a set of equations with no apparent cross-equation restrictions, but with non-zero off-diagonal elements in the error covariance matrix.
For the purposes of this discussion, there are a couple of ways to proceed:
1. You can estimate the specification using a GLS approach which corrects for cross-sectional heteroskedasticity and contemporaneous correlation (but not for general heteroskedasticity and serial correlation). In principle, you could follow this with a robust standard error estimator.
2. Alternately, you can estimate using system least squares without the correlation correction and then compute a robust standard error estimator, for example a system HAC estimator.
EViews does not allow you to take the former approach, but does allow you to do the latter using the GMM tools. Note that the equivalence results from treating all of the explanatory variables in your specification as exogenous. Just as TSLS using the original regressors as instruments yields the least squares estimator, so too does GMM with the appropriate orthogonality conditions and weighting matrix (what is termed TSLS weighting in the dialog) yield the system least squares estimator. Then all you have to do is select the appropriate robust covariance option.
Ah, I'd always associated SUR with GLS. I'm guessing that the "Seemingly Unrelated Regressions" option in system estimation is taking the GLS approach?
Could you elaborate on "appropriate orthogonality conditions and weighting matrix"? Are my orthogonality conditions basically the equation-by-equation OLS conditions, i.e. E(X'u) = 0? From what I recall, the weighting matrix can be computed iteratively, right?

 EViews Developer
 Posts: 2658
 Joined: Wed Oct 15, 2008 9:17 am
Re: Seemingly Unrelated Regressions and robust covariance matrix
You are correct in associating SUR with GLS estimation of the system, and also correct that what we label SUR is simply GLS on the system of equations. As a side note, the EViews SUR is a bit more general than the original SUR formulation in that the equations need not be "SU"; that is, we allow for cross-equation coefficient restrictions.
As to your question about appropriate orthogonality and weighting, let me elaborate, using a single equation as an illustration...
Suppose you estimate the equation Y = XB + e using GMM with X as your instruments and compute the weighting matrix assuming that everything is iid, so that E(X'ee'X) = sigma^2 (X'X). Then the orthogonality condition implied by this specification is E(X'e) = 0, and least squares is the first step of the GMM estimator. You can think of this weighting matrix as the 2SLS weighting matrix, since its use yields the TSLS (and in this case the OLS) estimator. I think we label this the "Identity" weighting matrix in the system dialog box, but I don't like that label (since what we really want is for the conditional error variance to be the identity matrix).
Typically in GMM you would then go on and use the consistent estimates of B to form a new estimator of the weighting matrix and off you would go.
But you don't have to iterate the weights. Suppose instead that you simply stop after the first estimation of B, and form your coefficient covariance estimates using a robust estimator of the long-run covariance. In this case, you are simply doing OLS with robust standard errors (perhaps White, or Newey-West).
So what I'm proposing is that you set up your instrument specification so that estimating system 2SLS yields identical results to estimating system OLS (your statement about the equation-by-equation orthogonality conditions is spot-on). Once you are sure you've got that lined up, use one of the two system GMM estimators with the 2SLS weighting matrix. This will give you the system OLS estimates with robust standard errors.
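A small numpy sketch of the equivalence described above (illustrative only, not the EViews internals; the data are simulated): with the regressors themselves as instruments and the 2SLS weighting matrix, one-step GMM reproduces OLS exactly.

```python
import numpy as np

# Simulated single-equation data: y = 2 - x + e
rng = np.random.default_rng(1)
T = 150
X = np.column_stack([np.ones(T), rng.normal(size=T)])
y = X @ np.array([2.0, -1.0]) + rng.normal(size=T)

Z = X                                   # original regressors as instruments
W = np.linalg.inv(Z.T @ Z)              # "2SLS" weighting matrix

# GMM: minimize (Z'(y - Xb))' W (Z'(y - Xb)); the first-order conditions
# give X'Z W Z'X b = X'Z W Z'y
A = X.T @ Z @ W @ Z.T
b_gmm = np.linalg.solve(A @ X, A @ y)

b_ols = np.linalg.solve(X.T @ X, X.T @ y)   # plain OLS for comparison
assert np.allclose(b_gmm, b_ols)            # the two estimators coincide
```

With Z = X the projection collapses and the GMM normal equations reduce to the OLS normal equations, which is the equivalence being exploited here.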

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Re: Seemingly Unrelated Regressions and robust covariance matrix
OK, just to confirm, I am basically setting my instruments as the exogenous explanatory variables.
However, I'm still not quite sure about what the manual says here:
"If you select either GMM method, EViews will display a checkbox labeled Identity weighting matrix in estimation. If selected, EViews will estimate the model using identity weights, and will use the estimated coefficients and GMM specification you provide to compute a coefficient covariance matrix that is robust to cross-section heteroskedasticity (White) or heteroskedasticity and autocorrelation (Newey-West). If this option is not selected, EViews will use the GMM weights both in estimation, and in computing the coefficient covariances."
In my case, is this basically saying that if I tick the box, the first step is OLS, and otherwise it isn't? The second step is then the Newey-West HAC covariance matrix.
What if I don't select the option? What does it mean to use "GMM weights both in estimation, and in computing the coefficient covariances"?
Sorry if I seem to be going around in circles here... I'm just not sure where we are capturing the cross-equation covariance of the disturbances in the GMM procedure.
I also have some options concerning prewhitening, kernels and bandwidths. I am not familiar with this nonparametric stuff, and the manual is rather terse. Could you briefly summarise what the choice boils down to, and provide some recommendations?
Thanks

 EViews Developer
 Posts: 2658
 Joined: Wed Oct 15, 2008 9:17 am
Re: Seemingly Unrelated Regressions and robust covariance matrix
If the box isn't checked, then you'll do iterative GMM with the corresponding robust weight matrix. If it is checked, then the weight matrix will be the non-robust form, which gives you OLS in your setting.
The general error structure is captured in the covariance calculation, in essence through the estimation of E(X'ee'X) in the middle of the variance sandwich estimator.
Explaining the computation of long-run variances is something that I'm going to have to beg off on, as it's quite involved. Something like Hamilton or Hayashi is your best bet.
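To make "captured in the covariance calculation" concrete, here is a hedged numpy sketch (illustrative, not the EViews computation; the data and the common shock are invented): in a stacked two-equation system, the White-style middle matrix is built from the stacked per-period moments, so its off-diagonal blocks pick up the cross-equation error covariance even though the point estimates are equation-by-equation OLS.

```python
import numpy as np

# Two equations with a shared shock, so the errors are contemporaneously correlated
rng = np.random.default_rng(2)
T = 300
x1 = np.column_stack([np.ones(T), rng.normal(size=T)])
x2 = np.column_stack([np.ones(T), rng.normal(size=T)])
common = rng.normal(size=T)                   # induces cross-equation correlation
y1 = x1 @ np.array([1.0, 0.3]) + common + rng.normal(size=T)
y2 = x2 @ np.array([0.5, -0.2]) + common + rng.normal(size=T)

# Equation-by-equation OLS (the "first step" when the box is ticked)
b1 = np.linalg.solve(x1.T @ x1, x1.T @ y1)
b2 = np.linalg.solve(x2.T @ x2, x2.T @ y2)
u1, u2 = y1 - x1 @ b1, y2 - x2 @ b2

# Stacked per-period moment vector g_t = (x1_t * u1_t, x2_t * u2_t)
G = np.hstack([x1 * u1[:, None], x2 * u2[:, None]])
S = G.T @ G / T                               # White-style middle matrix (4x4)
cross = S[:2, 2:]                             # off-diagonal block: cross-equation term
```

The diagonal 2x2 blocks of S are the usual single-equation robust terms; the off-diagonal block `cross` is where the covariance between the two equations' disturbances enters the system standard errors.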

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Re: Seemingly Unrelated Regressions and robust covariance matrix
QMS Glenn wrote: If the box isn't checked, then you'll do iterative GMM with the corresponding robust weight matrix. If it is checked, then the weight matrix will be the non-robust form, which gives you OLS in your setting.
If I tick the box, the weight matrix will be the identity matrix initially, so we are doing equation-by-equation OLS as a first step, then using the robust estimator of the long-run covariance. So it is the second step which captures the cross-equation covariance of the disturbances, right?
Just to double check, you are saying that I SHOULD be ticking the box.

 EViews Developer
 Posts: 2658
 Joined: Wed Oct 15, 2008 9:17 am
Re: Seemingly Unrelated Regressions and robust covariance matrix
Your original statement is correct. Ticking the box will do 2SLS/OLS as the first stage and then report robust standard errors which capture various correlation structures.
The "should" part is open to debate. You could just as well not check the box and do GMM with weighting matrices that are estimated using the assumed structure. In principle, these should be more efficient estimators, but I'm not going to argue in favor of or against their use in your setting.
The recommendation that you check the box was merely as an answer to your original question about getting 2SLS/OLS estimates with robust standard errors.

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Re: Seemingly Unrelated Regressions and robust covariance matrix
QMS Glenn wrote: In principle, these should be more efficient estimators, but I'm not going to argue in favor of or against their use in your setting.
In which case, why would anybody want to tick the box, if not ticking it carries all the robustness but is more efficient?

 EViews Developer
 Posts: 2658
 Joined: Wed Oct 15, 2008 9:17 am
Re: Seemingly Unrelated Regressions and robust covariance matrix
What you are asking is why people would ever do 2SLS with robust standard errors instead of GMM with optimal weighting matrices. Or even, why people would do OLS instead of optimal GMM. I won't comment on the answers to those questions, but I do know that people have preferred, and continue to prefer, the former to the latter. I will note, however, that just because something is asymptotically optimal in theory doesn't necessarily make it more desirable in finite-sample practice. And the fact that the weighting matrices must be estimated from the data must be taken into account when evaluating robustness and efficiency.

 Posts: 36
 Joined: Wed Feb 10, 2010 4:25 pm
Re: Seemingly Unrelated Regressions and robust covariance matrix
Does the above method still work when there is a lagged dependent variable?

 Posts: 1
 Joined: Thu Aug 26, 2010 3:46 am
Re: Seemingly Unrelated Regressions and robust covariance matrix
I have a related question to these posts. I'm running a VAR with 5 variables and 2 lags.
Is there a particular reason EViews can't just estimate via OLS using a HAC estimator? If I run the VAR equation by equation with a HAC estimator, I get the same coefficients but slightly different standard errors than when I run the VAR by HAC GMM using 2SLS. I have used all the lagged values that are in the VAR as instruments (which is right as far as I know).
I'm not familiar with GMM estimating more than one equation, so it is a bit confusing to me.
Hope someone can tell me if the GMM errors are more robust than those from the individually estimated equations with HAC.
Cheers.
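One thing worth noting here (a numpy sketch under standard assumptions, not EViews output; the VAR coefficients are invented): because every equation of a VAR shares the same right-hand-side variables, equation-by-equation OLS and system OLS give identical coefficients, so any differences across tools should come from the covariance options rather than the point estimates.

```python
import numpy as np

# Simulate a small stable 2-variable VAR(1) for illustration
rng = np.random.default_rng(3)
T, k = 250, 2
A = np.array([[0.5, 0.1], [0.0, 0.4]])        # hypothetical VAR coefficient matrix
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A.T + rng.normal(size=k)

X = np.column_stack([np.ones(T - 1), Y[:-1]])  # constant + one lag of all variables
Yt = Y[1:]

# Equation-by-equation OLS, one equation per column of Yt
B_eq = np.column_stack(
    [np.linalg.solve(X.T @ X, X.T @ Yt[:, j]) for j in range(k)]
)
# "System" OLS: the same normal equations solved jointly for all equations
B_sys = np.linalg.solve(X.T @ X, X.T @ Yt)
assert np.allclose(B_eq, B_sys)                # coefficients coincide exactly
```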