bfx: Field too large for a data item. Try to increase -s. (4

Posted by agent_008_nl on 21-Oct-2015 08:00

hBuffer:FIND-CURRENT(EXCLUSIVE-LOCK).

gives me this error on a certain record while

hBuffer:FIND-BY-ROWID(hBuffer:ROWID, EXCLUSIVE-LOCK).

does not on the same record.

I think there is a vague relation with http://knowledgebase.progress.com/articles/Article/000026121

but this looks like a bug to me.
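For reference, a minimal sketch of the two calls side by side on a dynamic buffer; the table name "customer" and the find criteria are placeholders, not from the original report:

DEFINE VARIABLE hBuffer AS HANDLE NO-UNDO.

CREATE BUFFER hBuffer FOR TABLE "customer".           /* placeholder table */
hBuffer:FIND-FIRST("WHERE custnum = 1", NO-LOCK).     /* position on the problem record */

/* the two alternatives being compared: */
hBuffer:FIND-CURRENT(EXCLUSIVE-LOCK).                     /* raises the -s error on the large record */
hBuffer:FIND-BY-ROWID(hBuffer:ROWID, EXCLUSIVE-LOCK).     /* succeeds on the same record */

DELETE OBJECT hBuffer.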

All Replies

Posted by gus on 21-Oct-2015 08:08

What happens if you try to display the data after the find by rowid?

-gus


Posted by agent_008_nl on 21-Oct-2015 08:14

I will repeat the test tomorrow and let you know. OE 11.5.1 btw.

Posted by Fernando Souza on 21-Oct-2015 09:26

Does the error go away if you increase -s? FIND-CURRENT does more than the other FINDs because it has to check whether the record changed (to support CURRENT-CHANGED), so it uses more stack space.
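For illustration, a hedged sketch of the kind of optimistic-locking check that FIND-CURRENT makes possible; the table name and criteria are placeholders:

DEFINE VARIABLE hBuffer AS HANDLE NO-UNDO.

CREATE BUFFER hBuffer FOR TABLE "customer".           /* placeholder table */
hBuffer:FIND-FIRST("WHERE custnum = 1", NO-LOCK).

/* FIND-CURRENT re-reads and compares the record so that CURRENT-CHANGED
   can report whether another user modified it since it was first read */
hBuffer:FIND-CURRENT(EXCLUSIVE-LOCK).
IF hBuffer:CURRENT-CHANGED THEN
    MESSAGE "Record was changed by another user." VIEW-AS ALERT-BOX.

DELETE OBJECT hBuffer.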

Posted by agent_008_nl on 22-Oct-2015 01:18

When I display the data after the FIND-BY-ROWID it shows the right data. The db is UTF-8; there is a character field with 18077 characters, some of them double-byte, and the record length is 18370 bytes. The error goes away when I set -s to 1000.
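For context, -s is the client stack size startup parameter. A hedged example of raising it (the parameter file name and database name are illustrative):

# client.pf (illustrative)
-db mydb
-s 1000

It can equally be passed directly on the client command line as -s 1000.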

Posted by Fernando Souza on 22-Oct-2015 09:17

So I believe this is working as expected.

Posted by agent_008_nl on 22-Oct-2015 11:28

You believe it works as you interpreted it afterwards, you mean. Maybe your clarification is correct, I have no idea. You, Gus? But anyway, I don't like to change parameters without knowing what is going on and where exactly the error comes from. Time-consuming, but you learn something. These kinds of differences between a plain FIND and the various dynamic finds are not nice to have; the cause of the error (or the statement where it occurred) was time-consuming to trace.

Posted by Fernando Souza on 22-Oct-2015 11:47

When you increase -s and the issue goes away, that is usually an indication that you were just close to exhausting the stack. If you keep increasing it and the error still shows up, or if you run that same statement over and over and it keeps complaining about -s after a while, that is an indication of a bug.
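A hedged sketch of that repeat test; the table name and criteria are placeholders. If the loop eventually hits the -s error even with a larger stack, that would point to a leak rather than simply being near the limit:

DEFINE VARIABLE hBuffer AS HANDLE  NO-UNDO.
DEFINE VARIABLE i       AS INTEGER NO-UNDO.

CREATE BUFFER hBuffer FOR TABLE "customer".           /* placeholder table */
hBuffer:FIND-FIRST("WHERE custnum = 1", NO-LOCK).

DO i = 1 TO 1000:
    /* repeat the same statement; stack use should not grow between iterations */
    hBuffer:FIND-CURRENT(EXCLUSIVE-LOCK).
END.

DELETE OBJECT hBuffer.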

Posted by agent_008_nl on 22-Oct-2015 12:38

It might not be correct to call it a bug, but it is not nice to have. Comments, Gus? Where are you?

This thread is closed