Search found 83 matches
- Wed Sep 16, 2015 2:44 am
- Forum: Programming
- Topic: Extract k principal components
- Replies: 2
- Views: 3120
Re: Extract k principal components
Thanks, I did get it to work with your advice.
- Wed Sep 16, 2015 1:38 am
- Forum: Estimation
- Topic: 'Observables' field truncates specification
- Replies: 1
- Views: 2460
'Observables' field truncates specification
Hi everyone
I noticed a problem when creating factor scores after running factor extraction. It seems the 'Observables' field truncates the text string randomly if the specification is too long. Is this a known issue? Can you provide a workaround?
I am on v7.2
Regards
- Mon Sep 07, 2015 7:04 am
- Forum: Programming
- Topic: Extract k principal components
- Replies: 2
- Views: 3120
Extract k principal components
Hi everyone, I would like to create all principal components representing 0.95 of the cumulative proportion of variance explained (PCA). This is what I have so far, where CFP01 is the group name of a number of underlying series:
%m = "cfp01"
freeze(table01) {%m}.pcomp(cproport=0.95, out=tab...
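The selection rule being asked about (keep components until cumulative explained variance reaches 0.95) can be sketched in Python/NumPy as an analog of what `pcomp(cproport=...)` does — this is not EViews code, and the data here is synthetic:

```python
import numpy as np

def n_components_for_variance(X, threshold=0.95):
    """Number of principal components needed so the cumulative
    proportion of variance explained reaches `threshold`."""
    Xc = X - X.mean(axis=0)                      # center each series
    # Singular values of the centered data give the PC variances
    s = np.linalg.svd(Xc, compute_uv=False)
    var = s ** 2
    cum = np.cumsum(var) / var.sum()
    # First index where the cumulative proportion crosses the threshold
    return int(np.searchsorted(cum, threshold) + 1)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                   # 200 obs, 10 series
k = n_components_for_variance(X, 0.95)
```

For uncorrelated noise like this, nearly all components are needed; for correlated series, k drops sharply.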
- Tue Jun 24, 2014 6:52 am
- Forum: Programming
- Topic: Check if series is member of group
- Replies: 1
- Views: 2821
Check if series is member of group
Hi everyone. What? What is the quickest check to see whether or not a series is a member of a group? Why? I have a routine that adds series to a group; an inner loop is executed only if the series was not previously added to the group. Where? In EViews 7 (build 7.2). Looping through the group members ...
- Thu May 22, 2014 7:08 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
Okay, I tested with a matrix and it does not make much of a difference. This is what I have now:
'set group
%group01 = "group01"
%group02 = "group01"
!n = {%group01}.@count
!m = {%group02}.@count
!cnt = 1
'loop
for !i = 1 to !n
%n = {%group01}.@seriesname(!i)
dummy.add {%n}
table01...
- Thu May 22, 2014 2:22 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
No, I have 7000 series with varying numbers of observations (i.e. unbalanced). The maximum number of observations would be n = 360. I have no issues using a loop to calculate the correlation factors, but the code executes very slowly:
'set group
%group01 = "group01"
!n = {%group01}.@count
!cnt = 1
'loo...
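The unbalanced case described here — series with different start and end dates, correlations computed on each pair's overlapping sample — is what pairwise deletion does. A Python/pandas sketch of that behaviour (an analog of `.cor(pairwise)`, with synthetic series and hypothetical names):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Three hypothetical series over 360 periods with different sample ranges
df = pd.DataFrame({
    "s1": rng.normal(size=360),
    "s2": rng.normal(size=360),
    "s3": rng.normal(size=360),
})
df.loc[:99, "s2"] = np.nan     # s2 starts 100 periods later
df.loc[300:, "s3"] = np.nan    # s3 ends 60 periods earlier

# corr() drops missing values pairwise by default, so each entry
# uses only the observations where that particular pair overlaps
corr = df.corr(method="pearson")
```

Because s2 and s3 still overlap on periods 100–299, every entry of the matrix is defined; a pair with no overlap at all would come out as NaN.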
- Wed May 21, 2014 9:19 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
Hi Glenn
Can't I generate quadrants of the corr matrix, i.e.:
(1-5000) x (1-5000)
(5001-7000) x (1-5000)
(1-5000) x (5001-7000)
(5001-7000) x (5001-7000)
I am not sure whether covariance analysis makes provisions for that?
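The block idea proposed above is workable in general: standardise the series once, then compute the correlation matrix one block of rows at a time so no intermediate larger than block x n_cols is ever held. A Python/NumPy sketch (an analog, not EViews code; data is synthetic and the block size is illustrative):

```python
import numpy as np

def blockwise_corr(X, block=1000):
    """Correlation matrix of the columns of X, computed block by block
    so only a (block x n_cols) slice is materialised at a time."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise columns
    n_obs, n_cols = X.shape
    out = np.empty((n_cols, n_cols))
    for i in range(0, n_cols, block):
        # One horizontal stripe of the correlation matrix per pass
        out[i:i + block] = Xs[:, i:i + block].T @ Xs / n_obs
    return out

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 7))
C = blockwise_corr(X, block=3)
```

Note this version assumes a balanced sample (no missing values); with unbalanced series each block would need pairwise handling of the overlaps.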
- Wed May 21, 2014 12:45 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
About 7000. I need to get the data into Excel. Is there another way to store and export?
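One alternative to freezing a 7000-column table is to write the matrix straight to a CSV file, which Excel opens directly (modern .xlsx Excel allows 16,384 columns, so 7000 fits; the old .xls format capped at 256). A Python/pandas sketch of that route, with a small synthetic matrix standing in for the real one (file name and series names are hypothetical):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
names = [f"ser{i:04d}" for i in range(5)]   # stands in for ~7000 series
X = rng.normal(size=(50, len(names)))

# Build the labelled correlation matrix and write it out directly,
# bypassing any in-memory table object
corr = pd.DataFrame(np.corrcoef(X, rowvar=False),
                    index=names, columns=names)
corr.to_csv("corr_matrix.csv")   # CSV opens directly in Excel
```

Writing to disk row by row sidesteps the memory cost of a frozen display table entirely.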
- Wed May 21, 2014 12:42 am
- Forum: Data Manipulation
- Topic: Large table freeze - program crash
- Replies: 3
- Views: 4109
Re: Large table freeze - program crash
Here are the specs:
EViews build 7.2
Operating System: Windows 7 (32-bit)
- Tue May 20, 2014 10:09 am
- Forum: Data Manipulation
- Topic: Large table freeze - program crash
- Replies: 3
- Views: 4109
Large table freeze - program crash
Hi there, I have a large table that I would like to freeze (cols > 5000) as a result of a covariance analysis. The program calculates all correlation factors but crashes when I try to freeze the results. I have tried to execute it from code with the same results. Here are my questions: - Is there a table size l...
- Tue May 20, 2014 8:51 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
Nope, still does it. I have:
freeze(corr_table) group01.cor(pairwise) corr
It runs fine until it gets to creating the table, at which point it exits with 'out of memory'. Any suggestions?
- Tue May 20, 2014 7:17 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
p.s. Specifically, I am looking for the syntax for a Pearson correlation over unmatched samples, something along the lines of:
mygroup.corr(...
- Tue May 20, 2014 7:05 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
Hi Gareth
That will work; I forgot that you can uncheck 'common samples'. I can see the calculations being performed. However, something strange happens once it finishes: I get the 'out of memory' error. Perhaps it is better to execute this from code in quiet mode?
- Fri Apr 11, 2014 3:48 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Re: Coding: performance in large databases
Hi Gareth, I have a SQL database to store the series and import into EViews from an Excel extract. To clarify my problem: let us say I have 1,000 time series, each with different start and end dates. I need to compare all series with one another to determine their degree of correlation. If r >= 0.99 ...
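The dedup rule described (if r >= 0.99, treat two series as near-duplicates and keep one) can be sketched in Python/NumPy — an analog under the stated assumptions, with a synthetic balanced sample and hypothetical series names:

```python
import numpy as np

def drop_near_duplicates(X, names, threshold=0.99):
    """Greedy filter: keep a column only if its absolute correlation
    with every previously kept column stays below `threshold`."""
    corr = np.corrcoef(X, rowvar=False)
    keep = []
    for j in range(len(names)):
        if all(abs(corr[j, k]) < threshold for k in keep):
            keep.append(j)
    return [names[j] for j in keep]

rng = np.random.default_rng(4)
a = rng.normal(size=120)
X = np.column_stack([
    a,                                           # original series
    a + rng.normal(scale=1e-3, size=120),        # near-duplicate of a
    rng.normal(size=120),                        # unrelated series
])
survivors = drop_near_duplicates(X, ["a", "a_copy", "b"])
```

The greedy first-wins rule matters: which series survives depends on their order, so sorting by preference (e.g. longest history first) before filtering is worth considering.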
- Thu Apr 10, 2014 8:04 am
- Forum: Programming
- Topic: Coding: performance in large databases
- Replies: 15
- Views: 12339
Coding: performance in large databases
Hi, a quick question regarding performance: I have a workfile with several thousand time series. I have written some code looping through the time series and calculating their cross correlations. Is there a performance impact from very large databases? Is there a maximum number of series not to be exceeded...
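On the performance point: looping over every pair issues a quadratic number of small calculations, whereas one whole-matrix call does the same arithmetic in a single pass. A Python/NumPy sketch of the contrast (an analog, not EViews code; 50 synthetic series stand in for several thousand):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(360, 50))   # 360 obs, 50 series

# Loop version: one corrcoef call per pair -> n*n small calls
n = X.shape[1]
loop_corr = np.empty((n, n))
for i in range(n):
    for j in range(n):
        loop_corr[i, j] = np.corrcoef(X[:, i], X[:, j])[0, 1]

# Vectorised version: a single call over the whole matrix
vec_corr = np.corrcoef(X, rowvar=False)
```

Both produce the same matrix, but the per-pair overhead of the loop dominates as the series count grows into the thousands.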