I assume there must be a bug in my work environment; this is the strangest thing. My co-worker is able to get things to work fine, so this is an issue isolated to my development environment.
Version: 10.2B
OS: RHEL
The output stream fails to write all of the data. I can get it to do so by using the UNBUFFERED option, however.
There are two examples here.
1. It will not output anything at all if fewer than 1,024 characters are written.
2. If more than 1,024 characters are written, it only outputs the largest amount evenly divisible by 1,024.
Example: if I need to output 12,989 characters of data to a file, the file only gets 12,288 (12 x 1,024).
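The numbers are consistent with a 1,024-byte buffer that only flushes when full: 12,989 characters fill twelve whole buffers (12,288 bytes), and the remaining 701 bytes sit in a partially filled buffer that never reaches disk. A quick sanity check of that arithmetic in ABL (variable names are just for illustration):

```
DEFINE VARIABLE total AS INTEGER NO-UNDO INITIAL 12989.
DEFINE VARIABLE buf   AS INTEGER NO-UNDO INITIAL 1024.

/* whole buffers flushed to disk: TRUNCATE drops the fractional buffer */
DISPLAY TRUNCATE(total / buf, 0) * buf LABEL "Flushed".   /* 12288 */
/* leftover bytes stranded in the partially filled buffer */
DISPLAY total MODULO buf LABEL "Stranded".                /* 701 */
```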
It's a very strange problem indeed, and the fact that the output revolves around the number 1024 makes it all the stranger.
In case you're wondering what methods I'm using, here is a short example:

def var file-ou as char no-undo init "/u/tmp/outstream.csv".
def stream s-out.
output stream s-out to value(file-ou).
put stream s-out unformatted fill("0", 1024) skip.
output close.
Try OUTPUT STREAM s-out CLOSE.
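For anyone finding this thread later: applying that suggestion to the snippet above means naming the stream on the CLOSE, so the whole thing would look something like:

```
def var file-ou as char no-undo init "/u/tmp/outstream.csv".
def stream s-out.
output stream s-out to value(file-ou).
put stream s-out unformatted fill("0", 1024) skip.
/* naming the stream closes s-out itself, flushing its buffer */
output stream s-out close.
```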
Thanks Fernando. I actually tried that right before reading your suggestion.
I've found this article here
http://knowledgebase.progress.com/articles/Article/Buffering-of-output-seems-increased-how-to-control-buffer-size
which states: "unless the output stream is closed, the output buffer must be filled with 1024 bytes of data before the data will be written to disk (flushed)."
After reading that, it clicked that the stream might not be getting closed. That's when I realized my "output close" statement was the problem: without naming a stream, OUTPUT CLOSE applies to the unnamed default output stream, so s-out was never actually closed and its last, partially filled buffer was never flushed.
I kind of feel like an idiot for not realizing this sooner.
There used to be a way to force the flush… `put control 0` or something like that.
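If memory serves, the trick uses PUT CONTROL with the NULL function, which pushes the buffer out without closing the stream. Treat this as a sketch from memory to verify against the docs, not gospel:

```
/* NULL(0) emits zero null characters, so nothing extra lands in the
   file, but (as I recall) the PUT CONTROL forces the buffer to flush */
put stream s-out control null(0).
```

Handy when you need the file contents visible on disk mid-run, e.g. for a tail -f, without closing and reopening the stream.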