lolofx
2017-02-28T14:29:31Z
Hi,
I am a newbie with file triggers (and I do not speak English very well, sorry).

I have created a task that moves all new files from one folder to another. The task works fine.
I have created a trigger on the source folder so that the task fires when a new file is created.

My problem: when multiple files are created in the source folder at the same time, the task is called many times.
That is unnecessary, because the first call moves all the files, so the later calls have no files to move.
And the task result is "Failed".

Can you help me, please?

Thank you,

Lolofx
thomas
2017-02-28T14:58:35Z
Hi

You are right. The trigger will fire once per file; the same is true for mail triggers. I'm not a fan of this, but that's how it works.
I have added a Stop Job task at the end of the job so that the job effectively fires only once.
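
To make the behaviour concrete, here is a minimal sketch in plain Python (not VisualCron's API; the folder names and the three queued runs are purely illustrative) of why the first run moves everything, why the queued runs then fail, and what the Stop Job task prevents:

```python
import shutil
from pathlib import Path

# Hypothetical folders; adjust to your environment.
SOURCE = Path("source")
DEST = Path("destination")
SOURCE.mkdir(exist_ok=True)
DEST.mkdir(exist_ok=True)

# Simulate three files landing in the source folder at the same time.
for name in ("a.txt", "b.txt", "c.txt"):
    (SOURCE / name).write_text("data")

def move_all_new_files() -> int:
    """Move every file currently in SOURCE to DEST; return the count."""
    moved = 0
    for f in list(SOURCE.iterdir()):
        if f.is_file():
            shutil.move(str(f), str(DEST / f.name))
            moved += 1
    return moved

# The file trigger queues one job run per created file (three here).
for run in (1, 2, 3):
    print(f"run {run}: moved {move_all_new_files()} file(s)")
    # Without a Stop Job task, runs 2 and 3 also execute, find nothing
    # to move, and report "Failed". A Stop Job task at the end of run 1
    # cancels the queued runs; uncomment the break to emulate that:
    # break
```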

[Screenshot attachment: Capture.PNG]
lolofx
2017-02-28T15:49:41Z
Thanks a lot. I tried it and it works perfectly.
Gary_W
2017-02-28T19:32:49Z
I ran into this issue just last week; it would have been nice to know this trick then! However, I realized something that made me rework things to handle VisualCron's method of triggering one file at a time, and I think it's worth mentioning. I had a legacy process that ran at a specified time and operated on a folder, processing all files. I moved it to VisualCron with a file trigger, and after copying multiple files to the folder, the job started once for each file, and each instance tried to operate on all files. Once I cleaned up the mess and realized what had happened (and not knowing Thomas's trick above), I reworked the VisualCron job and wrote a new script to accept and operate on one file at a time.

Now, depending on the settings, VisualCron will fire the file trigger only after the file has been released (after the system is done writing it). This is a good argument for reworking your process to handle one file at a time: you know the file is done being written when VisualCron kicks it off (if that setting is checked). With the Stop Job method calling a program or operation that works on all files, you had better make sure all files have finished being written to the folder, and are all present, before starting. If a large number of files, or a number of large files, were being written to the folder, the called program or copy operation could start when the first one arrives and finish before the rest were done being written, or try to run on a partially written file (of course this depends on what that program or operation does, etc.).
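
One way to guard against that with the batch approach, sketched here in plain Python (a heuristic illustration only, not a VisualCron feature; the settle and timeout values are arbitrary), is to wait until no file's size has changed between two consecutive snapshots before processing:

```python
import time
from pathlib import Path

def wait_until_stable(folder: Path, settle_seconds: float = 2.0,
                      timeout: float = 60.0) -> list[Path]:
    """Return the folder's files once no size has changed for settle_seconds.

    Snapshots (name, size) pairs and waits until two consecutive
    snapshots match. Raises TimeoutError if that never happens.
    """
    deadline = time.monotonic() + timeout
    previous = None
    while time.monotonic() < deadline:
        snapshot = {f.name: f.stat().st_size
                    for f in folder.iterdir() if f.is_file()}
        if snapshot and snapshot == previous:
            return [folder / name for name in snapshot]
        previous = snapshot
        time.sleep(settle_seconds)
    raise TimeoutError(f"files in {folder} never settled")

# Usage: only start the "process everything" step once the folder is quiet.
# files = wait_until_stable(Path("incoming"))
# process_all(files)   # hypothetical batch step
```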

These factors may or may not apply to your situation, but it is something to consider carefully when setting up the job at any rate.

Consider changing your job to copy just the single file that caused the trigger; I do believe it is safer in the long run. By the way, the name of the file that caused the trigger to fire can be found in the variable {TRIGGER(Active,LastTrigger,File.Result.Name)}. After processing, I move the file from our working folder, which is set up in a job variable {JOB(Active,Variable,JobWorkingDir)}, to a "processed" folder, also in a job variable. The trade-off, I suppose, is the higher overhead of processing each file one at a time, but that is not a factor for us, whereas the increased reliability is (my opinion only, no facts to support this).
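
As a rough sketch of that one-file-at-a-time shape in plain Python (the folder names are placeholders for the job variables above, and passing the trigger filename as a command-line argument is just one possible way to wire it up, not a prescription):

```python
import shutil
import sys
from pathlib import Path

# Hypothetical stand-ins for the job variables mentioned above.
WORKING_DIR = Path("working")      # {JOB(Active,Variable,JobWorkingDir)}
PROCESSED_DIR = Path("processed")

def process_one(filename: str) -> None:
    """Process exactly the file that fired the trigger, then archive it."""
    source = WORKING_DIR / filename
    # ... do the real work on `source` here ...
    PROCESSED_DIR.mkdir(exist_ok=True)
    shutil.move(str(source), str(PROCESSED_DIR / filename))

if __name__ == "__main__":
    # VisualCron could pass {TRIGGER(Active,LastTrigger,File.Result.Name)}
    # as the first argument to this script.
    process_one(sys.argv[1])
```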

Hope this info helps; I will be interested to hear other folks' experiences.

Gary
thomas
2017-02-28T21:39:52Z
Great points! I agree; it's probably cleaner to deal with one item at a time. With the mail triggers we have, we check every minute. I then have a wait/sleep task just to make sure everything (attachments, in our case) is moved to, and available on, the file server. Then I do my stuff and have a Stop Job at the end. It's a dirtier (but potentially faster) way of doing it than dealing with one file at a time.

However, sometimes it's nice to be able to run logic on all the files in one go. What if you want to compare files against each other, or remove duplicate records across files, and so on? We have a .NET task that loops through all the files in the folder, merges everything into one giant file, removes duplicates, and generally cleans up the data. Once this is done, we ship it to the database. This should only be done once, of course.
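
A minimal Python sketch of that kind of batch step (our real task is .NET, so this only illustrates the shape of it; treating a "duplicate record" as a duplicate line is an assumption for the example):

```python
from pathlib import Path

def merge_and_dedupe(folder: Path, output: Path) -> int:
    """Merge every file in `folder` into `output`, dropping duplicate lines.

    Keeps the first occurrence of each line; returns how many lines survive.
    """
    seen = set()
    kept = []
    for f in sorted(folder.iterdir()):
        if not f.is_file():
            continue
        for line in f.read_text().splitlines():
            if line not in seen:
                seen.add(line)
                kept.append(line)
    output.write_text("\n".join(kept) + "\n")
    return len(kept)

# Run once per batch, then ship the merged file onward:
# merge_and_dedupe(Path("incoming"), Path("merged.txt"))
# ship_to_database(Path("merged.txt"))   # hypothetical final step
```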

So there is a case to be made for both approaches. Maybe it should be a feature request: keep the behaviour as it is today, but add a checkbox that says "run only once", or something to that effect.
Gary_W
2017-02-28T22:07:02Z
Yes, I like the "run once" option. Or maybe, like the limit setting on the file copy task's result tab, a setting that limits the job to x iterations. The bottom line is to be aware of the limitations, the options, and the possible consequences before deciding on a course of action, to make sure it meets your needs.