I am new to programming in EViews 10, so I am struggling a bit with the following case. I managed to write a script that runs a Monte Carlo simulation in order to "prove" the assumptions about OLS estimators. More specifically, I use a population function with two given coefficients (intercept and slope) and 10 given values of the independent variable (X). With the script below I also create 20 different disturbance-term (u) series, each with mean zero and variance 4.
Based on the above I obtain 20 different sets of fitted values (y), which I then regress on X; from this I get 20 OLS lines and 20 pairs of coefficients.
What I want to know is whether there is a quick way to merge all the coefficient pairs into one series, so I can then calculate their means, variances, etc. Also, can I have a series that stores the residual variances of the OLS equations? Currently I only have 20 different equation objects and 20 different coefficient vectors with two values each.
EDIT:
I managed to find formulas to store the coefficients and the variance of the residuals (posted below).
Now I have another issue. In this experiment, the average residual variance across iterations should be approximately equal to the "true" value, but when I check the stored values in the object "sevec", which holds the residual variance of each run, the mean is always about 1 less than it should be. E.g. in the code below I specify a disturbance term with mean 0 and variance 4, but I get a variance of approximately 3 even if I run more than 100 iterations.
Is there an error in the formula: series u{!i} = 0 + @sqrt(4)*nrnd ?
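(For reference, a back-of-the-envelope check, assuming @stdev uses the sample n-1 formula: with n = 10 observations and 2 estimated coefficients per regression, the residual sum of squares has expectation sigma^2*(n-2), so the stored values should average below 4 no matter how many iterations I run.)

Code: Select all
E[SSR]             = sigma^2 * (n - 2) = 4 * 8  = 32
E[@stdev(resid)^2] = E[SSR / (n - 1)] = 32 / 9 ≈ 3.56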
Code: Select all
scalar iterations = 51
matrix(2,iterations) mycoefs
vector(iterations) sevec
for !i=1 to iterations
	' draw a disturbance series with mean 0 and variance 4
	series u{!i} = 0 + @sqrt(4)*nrnd
	' generate the dependent variable from the population line
	series y{!i} = b1_c + b2_c*x + u{!i}
	' estimate the OLS regression for this replication
	equation eq{!i}.ls y{!i} c x
	coef c{!i} = c
	' store the estimated coefficient pair in column !i
	colplace(mycoefs, eq{!i}.@coefs, !i)
	' store the (sample) variance of the residuals of this run
	sevec(!i) = (@stdev(resid))^2
next
matrix mycoefs = @transpose(mycoefs)
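If the discrepancy is the degrees-of-freedom correction (the residuals come from a regression that estimates 2 coefficients, so dividing the residual sum of squares by n-1 instead of n-2 biases the variance estimate downward), one possible fix is to use the equation's standard error of the regression instead, which is computed with the n-k denominator. A sketch, assuming the @se equation data member and the @columnextract/@mean matrix functions work as described in the EViews object reference:

Code: Select all
' inside the loop, replace the @stdev line with the d.f.-corrected estimator
sevec(!i) = eq{!i}.@se^2

' after the loop (mycoefs has been transposed, so rows = iterations):
vector b1vec = @columnextract(mycoefs, 1)   ' intercept estimates
vector b2vec = @columnextract(mycoefs, 2)   ' slope estimates
scalar b1_mean = @mean(b1vec)
scalar b2_mean = @mean(b2vec)
scalar sevar_mean = @mean(sevec)            ' should now be close to 4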
Thank you in advance!