Hi!
I have a situation where VisualCron adds extra characters to the end of text files (.json, .txt, etc.) when I use the Cloud Transfer task.
If I use, for example, Azure Storage Explorer on the server and upload the same files, everything works as expected.
I have tried this with text files with different encodings and the end result is the same.
I would like to add screenshots and test files as attachments, but those components seem to do absolutely nothing, so I'll try to explain the behaviour below.
Working scenario
1. File size on the server is 77 bytes and encoding is UTF-8
2. I upload test file from the server to Azure Blob by using Azure Storage Explorer
3. I download the file on my workstation by using Azure Storage Explorer
4. Downloaded file size is 77 bytes, size on disk is 0 bytes, encoding is UTF-8 and there are no added characters in the file.
Not working scenario (using VisualCron to upload the file to Azure)
1. File size on the server is 77 bytes and encoding is UTF-8
2. I run the Upload File task in VisualCron
3. I download the file on my workstation by using Azure Storage Explorer
4. Downloaded file size is 512 bytes, size on disk is 4 kilobytes, encoding is UTF-8 and there are added characters in the file. The extra characters are displayed in Notepad++ as xB4xB4 (a quick way to dump them is shown below).
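In case it helps anyone reproduce this without Notepad++, here is a small Python sketch I use to inspect the trailing bytes of the downloaded file. The file name "downloaded.json" is just a placeholder for whatever file the Cloud Transfer task uploaded.

# Quick check of the trailing bytes of the downloaded file.
# "downloaded.json" is a placeholder file name.
from pathlib import Path

data = Path("downloaded.json").read_bytes()
print(f"total size: {len(data)} bytes")

# Count how many copies of the last byte are padded onto the end of the file.
pad_byte = data[-1]
pad_len = len(data) - len(data.rstrip(bytes([pad_byte])))
print(f"trailing byte: 0x{pad_byte:02X}, repeated {pad_len} times")

# Show the last 16 bytes as hex, similar to what Notepad++ displays.
print("tail:", data[-16:].hex(" "))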
It seems that the extra characters are added until the file's size and its size on disk match the nearest logical and physical sector sizes dictated by the NTFS parameters of the hard drive where VisualCron runs. In my case they are:
LogicalSectorSize : 512
PhysicalSectorSize : 4096
Unfortunately I can't test this theory on a different volume with different logical and physical sector sizes.
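For what it's worth, the numbers do line up with that theory: rounding 77 bytes up to the next sector boundary gives exactly the sizes I see. A minimal sketch of the arithmetic (the sector sizes are just the values reported for my volume):

# Round a file size up to the next multiple of a sector/cluster size.
def round_up(size: int, block: int) -> int:
    return ((size + block - 1) // block) * block

original = 77            # original file size in bytes
logical_sector = 512     # LogicalSectorSize reported for my volume
physical_sector = 4096   # PhysicalSectorSize reported for my volume

print(round_up(original, logical_sector))   # 512  -> matches the downloaded file size
print(round_up(original, physical_sector))  # 4096 -> matches the "size on disk"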