lal • No customer • Topic Starter
2009-02-10T02:12:30Z
On the server there is a 7000K limit on task output. There is a large file (~160 MB) and, intermittently, when a task tries to read it, the task gets an OutOfMemory exception. Given the output limit, I would think that reading a larger file should not be a problem; is that correct? Many other jobs on this server read files of varying sizes, but this is the only file with which we encounter the problem.
Support
2009-02-10T09:21:24Z
The problem is that we read the whole file first and only then cap it to the output limit. We will change this and provide a new version. Are you running the beta now?
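For illustration only, a minimal Java sketch of the difference between the two approaches (this is not VisualCron's actual code; the constant OUTPUT_LIMIT_BYTES and the method names are hypothetical):

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class OutputCapture {

    // Hypothetical limit mirroring the 7000K output setting described above.
    static final int OUTPUT_LIMIT_BYTES = 7000 * 1024;

    // Problematic pattern: the entire ~160 MB file is loaded into memory
    // before the cap is applied, so the allocation itself can fail.
    static byte[] readThenCap(Path file) throws IOException {
        byte[] all = Files.readAllBytes(file);          // may trigger OutOfMemoryError
        int keep = Math.min(all.length, OUTPUT_LIMIT_BYTES);
        byte[] capped = new byte[keep];
        System.arraycopy(all, 0, capped, 0, keep);
        return capped;
    }

    // Fixed pattern: never read more than the limit, so memory use is
    // bounded by OUTPUT_LIMIT_BYTES regardless of the file size.
    static byte[] readUpToCap(Path file) throws IOException {
        byte[] buffer = new byte[OUTPUT_LIMIT_BYTES];
        try (InputStream in = Files.newInputStream(file)) {
            int total = 0;
            int read;
            while (total < buffer.length
                    && (read = in.read(buffer, total, buffer.length - total)) != -1) {
                total += read;
            }
            byte[] capped = new byte[total];
            System.arraycopy(buffer, 0, capped, 0, total);
            return capped;
        }
    }
}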
Henrik
Support
http://www.visualcron.com 
Please like VisualCron on Facebook!
lal • No customer • Topic Starter
2009-02-10T09:42:16Z
We have the beta version running on a test machine, but no production jobs are running on it. The standard version we are using on the production servers is 4.9.26. It's not urgent; so far that is the only time we've encountered the problem. If the issue is resolved in a future standard version of VC, that will work out fine. Thanks!
Support
2009-02-10T23:22:16Z
This has now been fixed in version 4.9.59. It will probably be released this weekend in the beta forum.
Henrik
Support
http://www.visualcron.com 
Please like VisualCron on Facebook!