Hi,
I have a temp-table that I need to write as a varbinary and extract.
My code is somewhat similar to the following.
DEFINE VARIABLE lResult  AS LOGICAL NO-UNDO.
DEFINE VARIABLE lvBefore AS MEMPTR  NO-UNDO.

lResult = TEMP-TABLE ttTable:WRITE-XML("MEMPTR", lvBefore, YES, ?, ?).
CREATE dbTable.
ASSIGN dbTable.varBinaryFld = lvBefore.
lResult = TEMP-TABLE ttTable:READ-XML("MEMPTR", lvBefore, "EMPTY", ?, ?).
The READ-XML call gives the following two errors:
13036 - Error reading XML from a MEMPTR or LONGCHAR.
13064 - READ-XML encountered an error while parsing the XML document: FATAL ERROR: file 'MEMPTR', line '1', column '1', message 'Invalid document structure'.
The same code worked perfectly well against a test database created with the same dbTable definition. But when the actual dbTable in the live database is used, I get these errors.
Any input is much appreciated.
Thanks,
Beulah
Additional information - this is in a SQL database.
What do you mean, it is in a SQL database? The commands you list are ABL. Do you mean SQL Server accessed through a DataServer?
Yes, it is SQL Server accessed through a DataServer.
Then you might try it against a Progress DB to see if it is a DataServer-specific issue.
You could also try writing to a file to see what you get ... and then reading from the file to see if that works.
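For example, a minimal sketch of that file round trip, assuming the ttTable temp-table from the original post:

DEFINE VARIABLE lOk AS LOGICAL NO-UNDO.

/* Write the temp-table to a file, inspect it, then read it straight back. */
lOk = TEMP-TABLE ttTable:WRITE-XML("FILE", "ttTable.xml", YES, ?, ?).
lOk = TEMP-TABLE ttTable:READ-XML("FILE", "ttTable.xml", "EMPTY", ?, ?).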
Well, in Progress we have used the RAW datatype. As this is not supported by the DataServer, we are using varbinary(max) in MSS, for which I am using WRITE-XML and READ-XML.
Will give it a go, thanks. But the database field will have to be varbinary(max).
Right, I'm just suggesting some things you could try to identify whether the problem is specific to DataServer and varbinary field types.
As I mentioned initially, this worked when it was tried on a small test database. It is only when the same code is merged into the live database that it gives this error. Both the test database (where it is working) and the live database (where it is failing) have exactly the same datatypes on this table - it is a big mystery! I don't know what to try to resolve it.
What version of Progress?
If you skip saving to a database field, can you read from the MEMPTR and convert back?
Have you tried writing to a LONGCHAR and then COPY-LOB to a MEMPTR? (See the sketch after these questions.)
Do you need a MEMPTR at all, or would a LONGCHAR do?
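A rough sketch of that LONGCHAR route, reusing the lvBefore and ttTable names from the original post:

DEFINE VARIABLE lcXml AS LONGCHAR NO-UNDO.

/* Write to a LONGCHAR, then convert to a MEMPTR for the varbinary field. */
lResult = TEMP-TABLE ttTable:WRITE-XML("LONGCHAR", lcXml, YES, ?, ?).
COPY-LOB FROM lcXml TO lvBefore.

/* And the reverse on the way back in. */
COPY-LOB FROM lvBefore TO lcXml.
lResult = TEMP-TABLE ttTable:READ-XML("LONGCHAR", lcXml, "EMPTY", ?, ?).
SET-SIZE(lvBefore) = 0.   /* release the MEMPTR storage when done */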
Hi Julian,
It is OE 10.2B03.
If you skip saving to a database field, can you read from the MEMPTR and convert back? - YES
Have you tried writing to a LONGCHAR and then COPY-LOB to a MEMPTR? - No, I have not tried this. So I guess reading will have to be the same way around too? The datatype has to be varbinary(max), so it cannot be longchar. I hope you remember me talking to you about this last week at EMEA PUG! Cheers, Beulah
Have you tried writing to a LONGCHAR and then COPY-LOB to a MEMPTR? - Tried it, and this has not helped either.
I have tried writing and reading as a file, as suggested by Thomas, and that's working too. Basically, if I skip saving to the database field, it works OK!
Is the test system the same architecture as the live one? Have you looked at big-endian vs little-endian?
What is the size of the data?
Have you pre-allocated the size on the MEMPTR? Try writing to a LONGCHAR, then allocating size + 1, then COPY-LOB to the MEMPTR, as sketched below.
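A sketch of that pre-allocation idea, again using the names from the original post; the OVERLAY phrase is what makes COPY-LOB write into the pre-sized MEMPTR instead of re-sizing it itself:

DEFINE VARIABLE lcXml AS LONGCHAR NO-UNDO.

TEMP-TABLE ttTable:WRITE-XML("LONGCHAR", lcXml, YES, ?, ?).
SET-SIZE(lvBefore) = LENGTH(lcXml, "RAW") + 1.   /* pre-allocate size + 1 bytes */
COPY-LOB FROM lcXml TO lvBefore OVERLAY AT 1.    /* copy into the existing allocation */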
You might check the size of that XML document (MEMPTR) - a quick way to do that is sketched after this message. I get somewhat similar errors when trying to load XML that's only around half a gigabyte; the error numbers match but the message is quite different:
Error reading XML from a MEMPTR or LONGCHAR. (13036)
READ-XML encountered an error while parsing the XML Document: FATAL ERROR: file 'MEMPTR', line '11562517', column '5', message 'An exception occurred! Type: RuntimeException, Message: operator new fails. Possibly running out of memory'. (13064)
There's plenty of memory on my humble 8 GB laptop, but there must be some limitation they hit. This leads me to think that READ-XML does not use a sequential read like a SAX parser but instead tries to load everything into a DOM-like structure.
Anyway, that might be the reason why things work in development/test but not in the real production environment... I guess.
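For the size check, a one-liner against the lvBefore MEMPTR from the original post:

/* How big is the XML we are about to parse? */
MESSAGE "XML size:" GET-SIZE(lvBefore) "bytes" VIEW-AS ALERT-BOX.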
I believe that varbinary has a maximum size of 8,000 bytes.
From SupportLink (gasp!):
In OpenEdge 10.2B, the DataServer for MS SQL Server was enhanced to use the OpenEdge BLOB data type, enabling you to handle data records of up to 1 gigabyte in size.
I'm wondering if the solution would be to COPY-LOB the data to the db field, rather than assigning it ...
If you COPY-LOB from the field to a file, what does the file look like?
I am trying to determine whether the data in the field is corrupted.
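Something along these lines, assuming the field and variable names from the original post:

/* Store via COPY-LOB instead of ASSIGN ... */
COPY-LOB FROM lvBefore TO dbTable.varBinaryFld.

/* ... and dump the stored value to a file for inspection. */
COPY-LOB FROM dbTable.varBinaryFld TO FILE "varbinary-dump.xml".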
Thank you for all your invaluable replies and answers. This was fixed: the READ-XML was pointing at the wrong record, and the test database didn't have those records, which is why it worked OK there.
As I still faced other issues, I have switched to varchar(max) (Julian's initial suggestion).
That seems to have fixed all the issues.
Once again thanks for all your inputs.
Thanks,
Beulah