On the server there is a 7000K limit on output. There is a large file (~160 MB) that, when a task tries to read it, intermittently throws an OutOfMemory exception. Given the output limit, I would have thought that reading a larger file should not be a problem; is that correct? Many other jobs on this server read files of varying sizes, but this problem occurs only with this one file.
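One thing worth checking (this is an assumption, since the post does not show the task's code): an output limit constrains what is written out, not how much memory a read consumes. If the task loads the whole ~160 MB file into memory at once (e.g. `Files.readAllBytes` on the JVM), the runtime needs one large contiguous allocation, which can fail intermittently depending on heap pressure from other jobs. A minimal sketch of reading in fixed-size chunks instead, so peak heap use stays near the buffer size:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ChunkedRead {
    // Reads a file through a small fixed-size buffer instead of loading
    // it all at once, so memory use is independent of file size.
    static long countBytes(Path path) throws IOException {
        long total = 0;
        byte[] buffer = new byte[64 * 1024]; // 64 KiB working buffer
        try (InputStream in = Files.newInputStream(path)) {
            int n;
            while ((n = in.read(buffer)) != -1) {
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical 1 MB stand-in for the real ~160 MB file.
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[1_000_000]);
        System.out.println(countBytes(tmp)); // prints 1000000
        Files.delete(tmp);
    }
}
```

If the failing task instead slurps the file in one call, that would explain why only this file (the largest one) hits the limit while smaller files read fine.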