Hi All,
OE 11.4 (Windows Server 2012).
I've noticed the following and was wondering if anyone could assist or offer useful ideas.
Firstly, if I set a DATASET output parameter and it contains a decimal field in one of its temp-table fields, the value comes out in exponent notation in the JSON object. If I then accept a dataset back with that same field in exponent notation (READ-JSON, via the adapter), OpenEdge can't convert it back. Apart from converting the field to character, any ideas?
Secondly, using a DATASET-HANDLE as an input parameter works in one instance but not in another. That is, method A accepts one JSON object and method B accepts another. Method A works with its supplied JSON, but that JSON will not work on method B, and the JSON I tried on method B will not work on A at all. Yet if I take both JSON strings and READ-JSON them into a freshly created ("CREATE DATASET") object, they load without issue. Again, any ideas? For now I am converting the inputs to LONGCHAR, as I ran out of time to figure out what is going on (I did check my adapter configuration for both methods; they appear identical other than the endpoint).
We need more information. It would help if you could supply the JSON string you are trying to convert into a dataset.
For case 1, by 'exponent notation' do you mean scientific notation, for example "3.7e-5"? Such a value can only be treated as a character temp-table field; OpenEdge will not convert a string like that into a decimal temp-table field.
For case 2, there is not enough information to give you a good answer/solution to your problem. Please elaborate.
Regards,
Robin
Hi All, sorry for the late reply. Here is more info; I hope someone is still able to help. :)
If I use very large decimals (like Dynamics object numbers), the output winds up in scientific notation. Here is some sample code.
At the bottom is the result of calling it via Firefox (to rule out Postman doing something odd).
Note: only the dataset output parameter is converted to scientific notation; WRITE-JSON to a LONGCHAR (or to a file) does not switch to scientific notation.
If the output dataset is pushed to a procedure expecting a DATASET-HANDLE, it fails, because OpenEdge does not map the E notation back to decimal.
What I found at http://json.org/ is that a JSON number CAN have an exponent (E), which the output produces but the input will not accept. Is this perhaps a bug?
@openapi.openedge.export FILE(type="REST", executionMode="singleton", useReturnValue="false", writeDataSetBeforeImage="false").
/**********************************************************************
* Copyright (C) 2006-2013 by Consultingwerk Ltd. ("CW") - *
* www.consultingwerk.de and other contributors as listed *
* below. All Rights Reserved. *
* *
* Software is distributed on an "AS IS", WITHOUT WARRANTY OF ANY *
* KIND, either express or implied. *
* *
* Contributors: *
* *
**********************************************************************/
/*------------------------------------------------------------------------
File : testScientific
Purpose :
Syntax :
Description :
Author(s) : stevenj
Created : Fri May 22 09:51:07 CAT 2015
Notes :
----------------------------------------------------------------------*/
ROUTINE-LEVEL ON ERROR UNDO, THROW.
USING FW.Rest.* FROM PROPATH .
USING Progress.Lang.* FROM PROPATH .
CLASS FW.Rest.testScientific:
define temp-table breakme no-undo
field bignumberfield as decimal.
define dataset dsbreakme FOR breakme.
define VARIABLE lcbreakme as longchar no-undo.
@openapi.openedge.export(type="REST", useReturnValue="false", writeDataSetBeforeImage="false").
METHOD PUBLIC VOID mbreakme (OUTPUT DATASET FOR dsbreakme, OUTPUT plcbreakme as longchar):
create breakme.
bignumberfield = 10000000.1.
create breakme.
bignumberfield = 100000000.1.
create breakme.
bignumberfield = 1000000000.1.
create breakme.
bignumberfield = 10000000000.1.
create breakme.
bignumberfield = 10000000.111.
create breakme.
bignumberfield = 100000000.111.
create breakme.
bignumberfield = 1000000000.111.
create breakme.
bignumberfield = 10000000000.111.
DATASET dsbreakme:WRITE-JSON("LONGCHAR",plcbreakme).
END METHOD.
END CLASS.
--> Output
{"response":{"dsbreakme":{"dsbreakme":{"breakme":[{"bignumberfield":1.00000001E7},{"bignumberfield":1.000000001E8},{"bignumberfield":1.0000000001E9},{"bignumberfield":1.00000000001E10},{"bignumberfield":1.0000000111E7},{"bignumberfield":1.00000000111E8},{"bignumberfield":1.000000000111E9},{"bignumberfield":1.0000000000111E10}]}},"plcbreakme":"{\"dsbreakme\":{\"breakme\":[{\"bignumberfield\":10000000.1},{\"bignumberfield\":100000000.1},{\"bignumberfield\":1000000000.1},{\"bignumberfield\":10000000000.1},{\"bignumberfield\":10000000.111},{\"bignumberfield\":100000000.111},{\"bignumberfield\":1000000000.111},{\"bignumberfield\":10000000000.111}]}}"}}
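For what it's worth, mainstream JSON parsers do accept the E notation shown in the output above, in line with the json.org number grammar. A quick illustration in Python (used here only as a neutral reference parser, not as part of the ABL service):

```python
import json

# The first E-notation value from the dataset output above
payload = '{"bignumberfield": 1.00000001E7}'

parsed = json.loads(payload)
print(parsed["bignumberfield"])  # 10000000.1
```

So the serialized output is valid JSON; it is only READ-JSON on the input side that rejects the exponent form.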
Hi,
Maybe it's just because I'm stubborn, but what I did was read the LONGCHAR coming through the REST service, parse out anything that looked like a scientific-notation value, convert it, and then READ-JSON the result.
This seems to be working so far. But in my humble opinion, READ-JSON should cater for scientific notation, since the json.org grammar shows that E notation is valid.
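For anyone hitting the same issue, the pre-processing workaround described above can be sketched as follows. This is a minimal illustration in Python rather than ABL, with a hypothetical helper name (`expand_sci_notation`); it naively assumes no string values in the payload happen to look like E-notation numbers, so treat it as a sketch of the idea, not production code:

```python
import re
from decimal import Decimal

# Matches a number in scientific notation, e.g. 1.00000001E7
SCI_NUMBER = re.compile(r'-?\d+(?:\.\d+)?[eE][+-]?\d+')

def expand_sci_notation(json_text):
    """Rewrite E-notation numbers in fixed-point form so a consumer
    that rejects exponents (here, READ-JSON) can parse them."""
    # Decimal preserves the exact value; format(..., 'f') never emits an exponent
    return SCI_NUMBER.sub(
        lambda m: format(Decimal(m.group(0)), 'f'), json_text)

payload = '{"breakme":[{"bignumberfield":1.00000000001E10}]}'
print(expand_sci_notation(payload))
# {"breakme":[{"bignumberfield":10000000000.1}]}
```

Using Decimal (rather than float) for the conversion avoids introducing rounding artifacts into large values like the object numbers above.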
The ABL WRITE-JSON method does not serialize decimal fields using scientific notation. Please log an issue with technical support so that we can investigate this further.
Yes, READ-JSON does not accept JSON Numbers with the E notation. We went with the ABL decimal format. This is something you should also log with technical support.
Thanks for the suggestions, I will log them as support cases as soon as I'm done breaking^H^H^H^H^H^H^H "exploring" the rest of the 11.4 Adapter.