I think I got it working for me; maybe not the most beautiful solution, but it should be OK for testing.
I start with a file trigger on the file containing the IDs.
1) Read File: where my Filepath is {TRIGGER(Active,LastTrigger,File.Result.FullPath)}
2) Then I get the first ID with {STRING(Substring|{TASK(<Read File - TaskId>,StdOut)}|0|12)} - since our IDs always have 12 characters - and pass it as a parameter to the FTP call.
3) Write File: in the same folder where the original file is created (so the job will trigger again). As Filepath I use Filename_{DATEFORMAT(yyyyMMdd_hhmmss)}.txt and my Value is: {STRING(Remove|{TASK(<Read File - TaskId>,StdOut)}|0|14)}. This removes the first 14 characters, i.e. the first line: the 12-character ID plus the line break. In the main settings tab the checkbox "Put Job in queue" needs to be enabled.
4) Delete File: deletes the trigger file. Folder: "{TRIGGER(Active,LastTrigger,File.Result.Folder)}", Include file mask: "{TRIGGER(Active,LastTrigger,File.Result.Name)}"
5) 1 Second Wait: just to make sure the new file gets a later timestamp than the previous one, since the job could finish within one second.
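To make the mechanics clearer, here is a hypothetical Python sketch of one iteration of the steps above (this is just an illustration of the logic, not VisualCron itself; the 14-character removal is assumed to be the 12-character ID plus a CRLF line break):

```python
import os
import time

ID_LENGTH = 12          # our IDs are always 12 characters
LINE_LENGTH = 14        # ID plus the CRLF line break

def process_once(path: str) -> str:
    """One pass: return the first ID, write the remainder to a new
    timestamped file in the same folder, delete the original."""
    with open(path, "r", newline="") as f:
        content = f.read()

    current_id = content[:ID_LENGTH]      # step 2: first ID for the FTP call
    remainder = content[LINE_LENGTH:]     # step 3: drop the first line incl. CRLF

    # step 3: write the remainder next to the original so the trigger fires again
    folder = os.path.dirname(path)
    stamp = time.strftime("%Y%m%d_%H%M%S")
    new_path = os.path.join(folder, f"Filename_{stamp}.txt")
    with open(new_path, "w", newline="") as f:
        f.write(remainder)

    os.remove(path)                       # step 4: delete the trigger file
    time.sleep(1)                         # step 5: ensure a later timestamp
    return current_id
```

Each call peels one ID off the top of the file, so repeated triggering works through the whole list.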
Further I created a condition set containing 10 different conditions, which check whether the trigger file contains a digit (0,1,2,3,4,5,6,7,8,9), because our IDs will always have at least one digit in them.
All tasks except no. 4) have that condition set.
So the job keeps triggering itself as long as the file contains a digit; if not, it deletes the last generated file (which should not contain anything except maybe some blanks / line breaks).
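The 10 conditions together amount to a simple stop check; a hypothetical sketch of the equivalent logic (assumed, not VisualCron's actual condition-set syntax):

```python
def should_continue(content: str) -> bool:
    # The job keeps re-triggering while any digit 0-9 remains in the file;
    # a leftover file of only blanks/line breaks stops the loop.
    return any(d in content for d in "0123456789")
```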
Although I will test it for some time, I already see one major problem: if something goes wrong with the creation of the original file, be it just an additional blank somewhere, it can mess up the whole upload task.
Regards,
spiedv