READ-XML Error

Posted by BeulahA on 15-Nov-2011 17:38

Hi,

I have a temp-table that I need to write as a varbinary and extract.

My code is somewhat similar to the following.

lResult = TEMP-TABLE ttTable:WRITE-XML("MEMPTR", lvBefore, yes, ?, ?).

create dbTable.

assign dbTable.varBinaryFld = lvBefore.

lResult = TEMP-TABLE ttTable:READ-XML ("MEMPTR", lvBefore, "EMPTY", ?, ?) .
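For context, a self-contained sketch of the round trip without the database step looks like this (the temp-table fields here are invented for illustration):

```abl
DEFINE TEMP-TABLE ttTable NO-UNDO
    FIELD iId   AS INTEGER
    FIELD cName AS CHARACTER.

DEFINE VARIABLE lvBefore AS MEMPTR  NO-UNDO.
DEFINE VARIABLE lResult  AS LOGICAL NO-UNDO.

CREATE ttTable.
ASSIGN ttTable.iId   = 1
       ttTable.cName = "test".

/* Serialise the temp-table to XML in a MEMPTR */
lResult = TEMP-TABLE ttTable:WRITE-XML("MEMPTR", lvBefore, YES, ?, ?).

/* ... in the real code, lvBefore is stored in dbTable.varBinaryFld
   and fetched back before the read ... */

/* Re-load the temp-table from the MEMPTR, discarding current rows */
lResult = TEMP-TABLE ttTable:READ-XML("MEMPTR", lvBefore, "EMPTY", ?, ?).

SET-SIZE(lvBefore) = 0.  /* release the MEMPTR */
```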

This READ-XML gives the following two errors

13036 - Error Reading XML from a MEMPTR or LONGCHAR.

13064 - READ-XML encountered an error while parsing the XML document: FATAL ERROR: file 'MEMPTR', line '1', column '1', message 'Invalid document structure'.

The same code worked perfectly alright on a test database created with the same dbTable.  But when the actual dbTable in the live database is used, I get these errors.

Any input is much appreciated.

Thanks,

Beulah

All Replies

Posted by BeulahA on 15-Nov-2011 17:39

Additional Information - this is in SQL database.

Posted by Thomas Mercer-Hursh on 15-Nov-2011 17:43

What do you mean, it is in a SQL database?  The commands you list are ABL.  Do you mean SQL Server accessed through a DataServer?

Posted by BeulahA on 15-Nov-2011 17:44

Yes, it is SQL Server accessed through a DataServer.

Posted by Thomas Mercer-Hursh on 15-Nov-2011 17:52

Then you might try it against a Progress DB to see if it is a DataServer-specific issue.

Posted by Thomas Mercer-Hursh on 15-Nov-2011 17:53

You could also try writing to a file to see what you get ... and then read from the file to see if that works.
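For example, something along these lines (file name assumed):

```abl
DEFINE VARIABLE lOk AS LOGICAL NO-UNDO.

/* Dump the temp-table to a file so the XML can be inspected directly */
lOk = TEMP-TABLE ttTable:WRITE-XML("FILE", "ttTable.xml", YES, ?, ?).

/* Read it back to verify the round trip works outside the database */
lOk = TEMP-TABLE ttTable:READ-XML("FILE", "ttTable.xml", "EMPTY", ?, ?).
```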

Posted by BeulahA on 15-Nov-2011 17:54

Well, in Progress we used the RAW datatype.  As this is not supported by the DataServer, we are using VARBINARY(MAX) in MSS, for which I am using WRITE-XML and READ-XML.

Posted by BeulahA on 15-Nov-2011 17:55

Will give it a go, thanks.  But the database field will have to be varbinary (MAX).

Posted by Thomas Mercer-Hursh on 15-Nov-2011 18:02

Right, I'm just suggesting some things you could try to identify whether the problem is specific to DataServer and varbinary field types.

Posted by BeulahA on 15-Nov-2011 18:05

As I mentioned initially, this worked when it was tried out on a small test database.  It is only when the same code was merged into the live database that it gives this error.  Both the test database (where it works) and the live database (where it fails) have exactly the same datatypes on this table - it is a big mystery!  I don't know what to try to resolve it.

Posted by jmls on 15-Nov-2011 18:25

What version of Progress?

If you skip saving to a database field, can you read from the MEMPTR and convert back?

Have you tried writing to a LONGCHAR and then COPY-LOB to a MEMPTR?

Do you need a MEMPTR at all, or would a LONGCHAR do?
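The LONGCHAR route would be something like this (sketch, names assumed; note COPY-LOB between LONGCHAR and MEMPTR does a codepage conversion):

```abl
DEFINE VARIABLE lcXml AS LONGCHAR NO-UNDO.
DEFINE VARIABLE mpXml AS MEMPTR   NO-UNDO.

/* Write the XML to a LONGCHAR first */
TEMP-TABLE ttTable:WRITE-XML("LONGCHAR", lcXml, YES, ?, ?).

/* Then copy it into a MEMPTR for the binary field */
COPY-LOB FROM lcXml TO mpXml.

/* ... and the reverse on the way back ... */
COPY-LOB FROM mpXml TO lcXml.
TEMP-TABLE ttTable:READ-XML("LONGCHAR", lcXml, "EMPTY", ?, ?).

SET-SIZE(mpXml) = 0.
```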


Posted by BeulahA on 15-Nov-2011 18:30

Hi Julian,

It is OE 10.2B03.

If you skip saving to a database field, can you read from the MEMPTR and convert back? - YES

Have you tried writing to a LONGCHAR and then COPY-LOB to a MEMPTR? -

     No, I have not tried this.  So I guess reading will have to be done the same way around too?

The datatype has to be VARBINARY(MAX), so the database field cannot be a LONGCHAR.

I hope you remember me talking to you about this last week at EMEA PUG!

Cheers,

Beulah

Posted by BeulahA on 15-Nov-2011 18:37

have you tried writing to longchar and then copy-lob to memptr ? - Tried and this has not helped either

Posted by BeulahA on 15-Nov-2011 18:38

Have tried writing and reading as a file, as suggested by Thomas, and that's working too.  Basically, if I skip saving to the database field, it works OK!

Posted by jmls on 15-Nov-2011 18:55

Is the test system the same architecture as the live one? Have you looked at big-endian vs little-endian?

What is the size of the data?

Have you pre-allocated a size on the MEMPTR? Try writing to a LONGCHAR, then allocating size + 1, then COPY-LOB to the MEMPTR.
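In code, roughly (sketch, names assumed; LENGTH with the "RAW" option returns the size in bytes):

```abl
DEFINE VARIABLE lcXml AS LONGCHAR NO-UNDO.
DEFINE VARIABLE mpXml AS MEMPTR   NO-UNDO.

TEMP-TABLE ttTable:WRITE-XML("LONGCHAR", lcXml, YES, ?, ?).

/* Pre-allocate the MEMPTR before copying into it */
SET-SIZE(mpXml) = LENGTH(lcXml, "RAW") + 1.
COPY-LOB FROM lcXml TO mpXml.
```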


Posted by Admin on 16-Nov-2011 02:23

You might check the size of that XML document (MEMPTR)... I get somewhat similar errors when trying to load XML that is only (sic) around half a gigabyte; the error numbers match but the message is quite different:

Error reading XML from a MEMPTR or LONGCHAR. (13036)
READ-XML encountered an error while parsing the XML Document: FATAL ERROR: file 'MEMPTR', line '11562517', column '5', message 'An exception occurred! Type: RuntimeException, Message: operator new fails. Possibly running out of memory'. (13064)

There's plenty of memory on my dumb 8 GB laptop, but there must be some limitation they hit.  This leads me to think that READ-XML is not doing a sequential read like a SAX parser but instead tries to load everything into a DOM-like structure.

Anyway, that might be the reason why things work in development/test but not in the real production environment... I guess.

Posted by ChUIMonster on 16-Nov-2011 07:54

I believe that varbinary has a maximum size of 8,000 bytes.

Posted by ambrosio on 16-Nov-2011 08:05

I am traveling today.

I will reply to e-mails and calls at the end of the day.

--

Marcos A. Ambrósio

Totvs Curitiba

Tel./Fax: (41) 3360-6200

marcos.antonio@totvs.com.br

Posted by jmls on 16-Nov-2011 08:18

from supportlink (gasp!)

  in OpenEdge 10.2B, DataServer for MS SQL Server was enhanced to use the OpenEdge BLOB data type, enabling you to handle data records of up to 1 gigabyte in size. Release 10.2B provides the following enhancements:

  • The ability to migrate an OpenEdge database with BLOB data type to a MS SQL Server database as VARBINARY (MAX) and pull back as BLOB in the schema holder
  • The ability to pull MS SQL Server data types VARBINARY (MAX), IMAGE, or VARBINARY (MAX) FILESTREAM as an OpenEdge BLOB data type into the schema holder
  • The ability to read or write data to or from the MS SQL Server database through the ABL BLOB data type by using the ABL COPY-LOB operation

I'm wondering if the solution would be to copy-lob the data to the db field, rather than assigning it ...
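That is, something along these lines (a sketch, assuming the VARBINARY(MAX) field pulls into the schema holder as a BLOB, per the supportlink note above):

```abl
/* Store via COPY-LOB instead of a straight ASSIGN */
CREATE dbTable.
COPY-LOB FROM lvBefore TO dbTable.varBinaryFld.

/* ... later, pull it back the same way ... */
FIND FIRST dbTable NO-LOCK.
COPY-LOB FROM dbTable.varBinaryFld TO lvBefore.
```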

Posted by jmls on 16-Nov-2011 08:20

if you copy-lob from the field to a file, what does the file look like ?

I'm trying to determine whether the data in the field is corrupted.

Posted by BeulahA on 16-Nov-2011 10:20

Thank you all for your invaluable replies and answers.  This was fixed: the READ-XML was pointing to the wrong record, and the test database didn't have those records, which is why it worked OK there.

As I still faced other issues, I have moved to VARCHAR(MAX) (Julian's initial suggestion).

That seems to have fixed all the issues.

Once again thanks for all your inputs.

Thanks,

Beulah

This thread is closed