08-29-2016 04:33 PM
Why not just store the files locally and once or twice a day try to copy the files to the network drive?
If it fails, just keep a list of files to be copied and try again later.
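Something like this Python sketch, for example (the paths and the *.txt pattern are my own placeholders, not anything from the original setup; in LabVIEW you'd do the same with Copy and an error-case shift register):

```python
import shutil
from pathlib import Path

# Hypothetical locations -- substitute your own.
LOCAL_DIR = Path(r"C:\DataLogs")
NETWORK_DIR = Path(r"\\server\share\DataLogs")

copied = set()  # paths already mirrored successfully

def sync_once() -> None:
    """Copy any not-yet-mirrored local file; failures are retried on the next pass."""
    for src in LOCAL_DIR.glob("*.txt"):
        if src in copied:
            continue
        try:
            shutil.copy2(src, NETWORK_DIR / src.name)
            copied.add(src)
        except OSError:
            pass  # network drive unreachable -- the file stays pending for next time
```

Run sync_once() from whatever schedules your once-or-twice-a-day pass; anything that failed is simply picked up again on the next run.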
I sort of do that on one of my tests. Once a week I take all the daily data files, combine them into an Excel workbook (using XLR8), and email that spreadsheet to me and my team. The email routine is separate from the main program, so if it fails it does not affect the running data logging.
08-29-2016 04:39 PM
@RTSLVU wrote: Why not just store the files locally and once or twice a day try to copy the files to the network drive?
From the original question, I get that it's desirable to be able to OPEN the file on the remote server.
From that, I would assume the user doesn't want to see day-old data.
Blog for (mostly LabVIEW) programmers: Tips And Tricks
08-29-2016 04:41 PM - edited 08-29-2016 04:46 PM
Hmmm, I see what you are saying: with large files and/or frequent updates, the network bandwidth could get out of hand... So do you have a suggestion for how to diff the files and copy only the data that was written since the last update? I noticed the Get/Set File Position functions, but how would you use those to append only the data that differs between the local and remote files?
There also seem to be several ZIP functions. Compressing takes some processing power on the local machine, but it could at least limit the network bandwidth needed for data mirroring...
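For the ZIP idea, something like this minimal Python sketch (the function name and locations are made up) would compress each chunk before it crosses the wire, trading local CPU for bandwidth:

```python
import zipfile
from pathlib import Path

def compress_chunk(src: Path, dest_dir: Path) -> Path:
    """Zip one local data file so less data has to cross the network."""
    out = dest_dir / (src.name + ".zip")
    with zipfile.ZipFile(out, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.write(src, arcname=src.name)  # store under the original file name
    return out
```

One caveat: the remote copy would then have to be unzipped before anyone can open it, which works against the remote-monitoring goal.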
Steve, you are correct, I would like to mirror the data to the network drive at least once per hour for remote monitoring. Ideally they should be nearly identical at any given time.
08-29-2016 04:54 PM
You don't want to diff the files (assuming you are not allowing changes on the server end).
Just open the file for appending: ask for the file size, then set the position marker to that value before writing.
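In Python terms it might look like this (just a sketch; the name append_to_master is mine, and in LabVIEW you'd use the Get/Set File Position primitives instead):

```python
import os

def append_to_master(master_path: str, new_data: bytes) -> None:
    """Ask for the file size, set the position marker there, then write."""
    with open(master_path, "ab") as f:   # "ab" creates the file if it doesn't exist
        size = f.seek(0, os.SEEK_END)    # current size == end-of-file position
        f.write(new_data)                # the new bytes land at offset `size`
```

Only new_data travels over the network, so the cost per update is the size of the new chunk, not the whole file.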
But you STILL need to keep the data locally, and keep track of what has been put into the master file, and what has not.
FOR EXAMPLE (a code sketch of this loop follows the steps below):
Suppose you pick a chunk size of 10 minutes, or 10 kbytes.
You write your data to local file "My Local File 1.txt".
When the time or size is exceeded, you switch to writing into "My Local File 2.txt"
You tell your copymaster routine that block 1 is available (push its index into a queue).
Your copymaster routine (running separately) gets notified that block 1 is available.
It tries to OPEN/CREATE the network master file.
It finds the current file size, say 10123 bytes.
It sets the file position to 10123.
It reads file block 1 local file, and writes that data to the master file.
Close the master file.
IF SUCCESSFUL
Remove index 1 from the queue.
Delete local block 1 file (optional)
ELSE
Wait 30 sec and try again.
END IF
IF QUEUE IS EMPTY
Sleep for 10 minutes, or watch for file size > 10 k.
ELSE
Repeat with the next block index in the queue.
END IF
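A minimal Python sketch of that copymaster loop, assuming the main logging loop enqueues each finished block file's path (the queue, names, and network path are illustrative, not actual code from this thread; in LabVIEW this would be a separate parallel loop fed by a queue):

```python
import os
import time
from queue import Queue, Empty

MASTER = r"\\server\share\master.txt"  # hypothetical network master file
blocks = Queue()                       # logging loop puts finished block paths here

def copymaster() -> None:
    while True:
        try:
            block_path = blocks.get(timeout=600)    # sleep up to 10 min for a block
        except Empty:
            continue                                # queue empty: go back to waiting
        while True:                                 # keep this block until it succeeds
            try:
                with open(block_path, "rb") as src, open(MASTER, "ab") as dst:
                    dst.seek(0, os.SEEK_END)        # set position marker to current size
                    dst.write(src.read())           # append the whole local block
                os.remove(block_path)               # optional: delete the local block file
                break                               # SUCCESS: move on to the next block
            except OSError:
                time.sleep(30)                      # network down: wait 30 sec, try again
```

The inner retry loop plays the role of the IF SUCCESSFUL / ELSE above: a block is only discarded after the write to the master file has gone through.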
Blog for (mostly LabVIEW) programmers: Tips And Tricks
08-29-2016 06:08 PM - edited 08-29-2016 06:10 PM
If you are using Windows, you can create a batch file to copy the files for you. Then you can schedule a task that runs it daily, weekly, etc. It is not that sophisticated, but it might work for you!
As mentioned by others, you won't want to copy every file as the data piles up. You can have a separate folder called "ToBeTransferred" where you store a copy of your data. Then you can use your batch file to "move" those files to your network location, and the copy in "ToBeTransferred" will disappear.
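The .bat itself can be as simple as `move "C:\ToBeTransferred\*" "\\server\share"` run by Task Scheduler. The same idea as a Python sketch (the folder name follows the post; the share path is made up):

```python
import shutil
from pathlib import Path

STAGING = Path(r"C:\ToBeTransferred")     # folder name from the post
SHARE = Path(r"\\server\share\DataLogs")  # hypothetical network location

def transfer() -> None:
    """Move staged copies to the network share; on success they vanish locally."""
    for f in STAGING.iterdir():
        try:
            shutil.move(str(f), str(SHARE / f.name))
        except OSError:
            pass  # share unreachable: file stays in ToBeTransferred for the next run
```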
08-29-2016 06:32 PM
You can always go with Google Drive, Dropbox, etc. ...