write-xml to longchar and stored in a clob results in extra characters

Posted by Admin on 13-Apr-2009 11:39

I am using a clob field in a temp-table to pass data back to the .NET client via the proxy. The field is accessible as a string in the DataSet generated by proxygen. When I access the string I get "extra" characters at the end, outside the XML. This happens whether I just view the string in the debugger or write it to a file and then view the file. Here is the last bit of text showing what is occurring.

...</prol__log02><oid_prol_det>0.0000000000</oid_prol_det></ttProcedureSourceLineDetail></dsData>��

Here is how I have the server implemented.

The temp-table:

define temp-table ttBrowseDefinitionExport no-undo
    field exportXml as clob column-codepage "UTF-8".

The XML is generated from a dataset and assigned to this clob as follows:

define variable xml as longchar no-undo.
fix-codepage(xml) = "UTF-8".

dshData:write-xml("LONGCHAR", xml, false, "UTF-8", ?, false, false).

assign ttBrowseDefinitionExport.exportXml = xml.

On the client side I write the XML to a file. The exportXml field is generated as a string by proxygen; if I view the string in the debugger I see the extra characters, and if I write it to a file I see them there too. When I output to a file on the server I don't see the characters. Does anyone see anything wrong with the above approach?
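
A minimal sketch of that server-side file check, assuming the xml longchar from the snippet above (the file name is only an example):

/* dump the longchar bytes as-is so the server-side output can be inspected */
copy-lob from xml to file "server-check.xml" no-convert.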

Regards,
Pat

All Replies

Posted by Admin on 13-Apr-2009 13:17

In case someone else encounters this: I was unable to get this working with a clob, but changing to a blob and using a memptr in the write-xml call solved it. Of course, you then have to convert the byte[] to a string on the client (using System.Text.Encoding.UTF8.GetString) before writing it to a file.
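
A minimal sketch of that blob/memptr variant, reusing the dshData handle and temp-table name from the original post (the copy-lob and set-size calls are additions here, so adjust them to your own cleanup conventions):

define temp-table ttBrowseDefinitionExport no-undo
    field exportXml as blob.

define variable mXml as memptr no-undo.

/* write the dataset XML into a memptr instead of a longchar */
dshData:write-xml("MEMPTR", mXml, false, "UTF-8", ?, false, false).

/* copy the raw bytes into the blob field, then free the memptr */
copy-lob from mXml to ttBrowseDefinitionExport.exportXml.
set-size(mXml) = 0.

On the .NET side the field then arrives as a byte[], which can be turned back into a string with System.Text.Encoding.UTF8.GetString before writing it to a file.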

--Pat
