efficiently storing large amounts of data to server


capuchin
Posts: 31
Joined: Tue Oct 15, 2019 9:06 am

efficiently storing large amounts of data to server

Postby capuchin » Fri Jun 12, 2020 8:27 am

I sometimes store a very large amount of data to a server, about 4000 series with a few years of monthly data each. The way I do this currently is to put the names of all the series into a string and run store {string}, but it takes quite a while for the store command to finish running (20-30 minutes).

Is there a more efficient way to do this, from a processing or bandwidth perspective? It takes long enough that my corporate-controlled computer goes into sleep mode (which I think pauses the store?) if I do not stay to babysit it, which makes the long run time a little more problematic. Would opening the database first give me access to a more efficient store command? Would running this as a program in quiet mode save a significant amount of time? (To be honest, I often find the status bar reassuring during long runs.)

My upload speed is 5 Mbps, which is not spectacular, but I figure that a bunch of numbers can't really add up to that many megabytes.

EViews Gareth
We came, we saw, we estimated
Posts: 13307
Joined: Tue Sep 16, 2008 5:38 pm

Re: efficiently storing large amounts of data to server

Postby EViews Gareth » Fri Jun 12, 2020 9:43 am

Could you provide a few more details - what type of file are you storing to? (EViews database - .edb, or some other kind?)
Follow us on Twitter @IHSEViews

EViews Steve
EViews Developer
Posts: 788
Joined: Tue Sep 16, 2008 3:00 pm
Location: Irvine, CA

Re: efficiently storing large amounts of data to server

Postby EViews Steve » Fri Jun 12, 2020 9:59 am

Without knowing the details yet, I can say that storing to a local database/file first, and then pushing the database/file to the network is going to be significantly faster than pushing each object over the network one by one.

For example, save everything to a local CSV text file, then copy the CSV file to your network location.

Code:

' save the whole workfile to a local CSV file first
wfsave(type=txt) mydata.csv
' then copy the local file to the network share
shell copy mydata.csv x:\network\folder\mydata.csv
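
A similar sketch using a local EViews database instead of a CSV file (the local path, database name, and the %serieslist string holding the series names are placeholders, not something from this thread):

Code:

' create a local .edb and make it the default database (errors if it already exists)
dbcreate c:\temp\localdump
' store all the named series into the default (local) database
store {%serieslist}
' then copy the finished .edb to the network share
shell copy c:\temp\localdump.edb x:\network\folder\localdump.edb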

EViews Gareth
We came, we saw, we estimated
Posts: 13307
Joined: Tue Sep 16, 2008 5:38 pm

Re: efficiently storing large amounts of data to server

Postby EViews Gareth » Fri Jun 12, 2020 1:17 pm

Steve's suggestion isn't going to work if you're updating an existing database, though.
Follow us on Twitter @IHSEViews

capuchin
Posts: 31
Joined: Tue Oct 15, 2019 9:06 am

Re: efficiently storing large amounts of data to server

Postby capuchin » Fri Jun 19, 2020 11:15 am

This is going to a FAME database on a server.

capuchin
Posts: 31
Joined: Tue Oct 15, 2019 9:06 am

Re: efficiently storing large amounts of data to server

Postby capuchin » Mon Jun 22, 2020 8:07 am

Speaking of servers, my server name just got changed to something long and cumbersome. Is there a way to set the server option globally so that I can leave out the "server=..." part of my fetch/store commands? Right now, I write:

fetch(d=MYBANK,t=F,server=MYSERVER) MYSERIES

and I don't really want to change to

fetch(d=MYBANK,t=F,server=MYQWERTYOMGBBQ_#&*!@#SERVER) MYSERIES

EViews Gareth
We came, we saw, we estimated
Posts: 13307
Joined: Tue Sep 16, 2008 5:38 pm

Re: efficiently storing large amounts of data to server

Postby EViews Gareth » Mon Jun 22, 2020 9:00 am

Add it to your database registry
Follow us on Twitter @IHSEViews
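
If I understand the database registry correctly, a registered entry keeps the connection details (database type, server name, credentials) under a short alias, so commands can then refer to the database through that alias alone. A minimal sketch, assuming the FAME connection has been registered under the made-up alias FAMEDB:

Code:

' fetch and store by alias, with no server= option on the command line
fetch FAMEDB::MYSERIES
store FAMEDB::MYSERIES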

