Hi there,
I was not sure whether to post this here or in the data manipulation section, but anyway.
I am experiencing serious performance issues when creating a long table (93860 rows and 3 columns). I am creating this table because I need to extract 1805 series (52 datapoints each) from a workfile into a "relational" (long) table format for export to an Excel spreadsheet. I am not aware of any predefined EViews function that could do this. As an example, in EViews I can take the series in columns (or rows) like this:
date ser1 ser2
1999 1 2
2000 2 3
2001 3 4
But I need the info to be shown like this:
Date Name Obs
1999 ser1 1
2000 ser1 2
2001 ser1 3
1999 ser2 2
2000 ser2 3
2001 ser2 4
My (not very elegant) code is attached, along with the associated workfile. Basically, what it does is generate small auxiliary tables and then merge everything into one big output table. My idea was then to export that big table to a CSV file that I could read with Excel. I know I could do this with a VBA macro, but that usually crashes, and for many reasons it would be much better to have the whole process controlled and programmed in EViews.
The problem: I measured that it takes 14 seconds to write the first 10000 rows, 37 seconds for the next 10000, 1:03 for the 10000 after that, and then it becomes even slower. Can you tell me please how I could do this in a better way?
Many thanks!
Fede
Large table performance issues
Moderators: EViews Gareth, EViews Jason, EViews Moderator, EViews Matt
Attachments:
- f_reg_db_relational.prg (1.54 KiB), downloaded 219 times
- Sample_workfile.WF1 (1.46 MiB), downloaded 209 times
Re: Large table performance issues
I didn't look at what your .prg did, but tables are not designed to work well at large sizes, particularly if you do not pre-size them.
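(As an aside on pre-sizing: a table can be declared at its full final dimensions in one step instead of being grown row by row. A minimal sketch, assuming an output of 93860 data rows plus a header row; the name "out" is just an example:)

Code:
' Declare the output table at its full size up front
' (93860 data rows plus one header row, 3 columns)
table(93861, 3) out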
The obvious way to do what you want is to stack the data into series rather than a table. In your case you need a double stack:
Code:
' First stack: collect the 3-character prefix of every series name
' and stack the page by those prefixes
%list = @wlookup("*", "series")
%newlist = ""
for !i=1 to @wcount(%list)
	%newlist = %newlist + " " + @left(@word(%list, !i), 3)
next
%newlist = @wunique(%newlist)
pagestack {%newlist} @ ?* *

' Second stack: collect the remainder of each series name
' (everything after the first character) and stack again
%list = @wlookup("*", "series")
%newlist = ""
for !i=1 to @wcount(%list)
	%newlist = %newlist + " " + @mid(@word(%list, !i), 2)
next
%newlist = @wunique(%newlist)
pagestack {%newlist} @ *? *

' Combine the two stack identifiers into a label series,
' rename the stacked data series, and restructure the page
alpha labels = var01 + "_" + var02
rename _ data
pagestruct labels @date(id01)
show data
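(If the end goal is still a CSV file for Excel, the stacked page itself can be saved directly rather than going through a table. A minimal sketch; the file name is just a placeholder:)

Code:
' Save the current (stacked) page as a CSV file for Excel
pagesave stacked_data.csv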
Re: Large table performance issues
You are completely right. Thanks for showing me how easy things can be sometimes!