A better approach?

Posted by goo on 21-Sep-2016 04:07

OE11.6.2 win

I have an AppServer procedure that receives a blob file within a temp-table. The blob is COPY-LOBed into a longchar named lcFile.

It runs smoothly as long as the file is small, but now it receives a file with approx. 50,000 rows, and each row has 30-40 columns.

The code logic checks the first row and, based on what kind of file it is, builds a list of the column names in cColumnList. The code is something like this:

do ix = 1 to num-entries(lcFile,chr(10)):
      cString = entry(ix,lcFile,chr(10)).
      if length(cString) le 1 then next.
      if cString begins ',,,,,' then next.
      if cString begins '@' then.......

      :

      :

  case cImportMethod:
        when 'AcBal' or when 'R1Bal' then doImportFile(cString).

        :

        :
        otherwise leave.
      end case.
      if not obOk then return.

end.

:

:

doImportFile......:

  bhAtable:find-unique('where something=something and somethingelse=somethingelse and........') no-error.

  if not bhAtable:available then bhAtable:buffer-create().

:

assign bhAtable:buffer-field('Field...'):buffer-value = .....

:

.

do i = 1 to num-entries(cColumnList):

  bhAtable:buffer-field(entry(i,cColumnList)):buffer-value = something.

  if i = lookup('R1',cColumnList) then bhAtable:buffer-field(..... = something.

end.

end. /* Method*/

Since I am running this on an AppServer (but not async) and it takes more than 20 minutes to run, it stops because of the VPN timeout (I believe)... What would be a better way of approaching this? I have thought about having a process on the server, and adding a process table that the file can be added to. Then I could have a server-side process that runs it and sends a message back to the user....
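Roughly like this, maybe (just a sketch - the ProcessQueue table, its fields and the sequence are invented names):

  /* sketch only - ProcessQueue, its fields and seqProcessQueue are invented */
  procedure queueImportFile:
      define input parameter pcImportMethod as character no-undo.
      define input parameter pmFile         as memptr    no-undo.

      create ProcessQueue.
      assign ProcessQueue.QueueId      = next-value(seqProcessQueue)
             ProcessQueue.ImportMethod = pcImportMethod
             ProcessQueue.QueueStatus  = 'new'
             ProcessQueue.QueuedBy     = userid('dictdb').
      copy-lob from pmFile to ProcessQueue.FileBlob.

      /* the AppServer call returns immediately - a server-side batch
         process picks up 'new' records and runs the actual import */
  end procedure.

The batch side would then go FOR EACH ProcessQueue where QueueStatus = 'new', run the existing import against the blob, and set the status to 'done' (or 'error' together with the messages for the user).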

Any other ideas?

//Geir Otto

All Replies

Posted by smat-consulting on 21-Sep-2016 05:17

I usually do not execute any process that can take more than 1 or 2 seconds in the AppServer agent itself.

Instead I save the input data somewhere convenient and add an entry to the queue, which is worked off by a batch-process.

In your case, you could simply save the file in a directory. The batch process could periodically check whether there's a file in this directory and, when it finds one, work it off. Afterwards it can send an email (or another kind of notification) to the user who initiated the process, including the collected messages about any problems the process encountered.

This approach lets the user continue with their work immediately, and it makes it easier (I find) to program the backend process. Also, there are always processes that need to run periodically without a user initiating them - a batch-process queue makes it easy to schedule such repeating processes - so I always have one anyhow. Adding a new process to it is rather easy...
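A skeleton for such a batch watcher could look something like this (the directory and the worker procedure importOneFile.p are invented names, error handling omitted):

  /* rough skeleton of a directory-polling batch process */
  define variable cFile as character no-undo.
  define variable cPath as character no-undo.
  define variable cAttr as character no-undo.

  repeat:
      input from os-dir('/queue/import').
      repeat:    /* ends automatically at the end of the directory listing */
          import cFile cPath cAttr.
          if index(cAttr, 'F') = 0 then next.   /* plain files only */
          run importOneFile.p (input cPath).    /* notify the user in here */
          os-delete value(cPath).
      end.
      input close.
      pause 30 no-message.                      /* poll every 30 seconds */
  end.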

Posted by goo on 21-Sep-2016 05:36

Thanks, I was thinking of that approach, but instead of writing a file, I would add a process queue as a table, with a blob field. Same kind of thinking. Your approach may be faster to implement, since it needs no dictionary update...

I was also thinking of an async approach... would that be possible? Is it the case that if the client (WebClient) disconnects, the async request will be broken?

Posted by Stefan Drissen on 21-Sep-2016 07:43

do ix = 1 to num-entries(...) re-evaluates the TO expression on every iteration, so it counts the line feeds in your longchar for /every/ row. Calculate it once and use that.

While extracting entry x from a longchar is nice ABL, it is not fast - every ENTRY call scans the longchar from the start. You may want to use the INDEX function to find the next linefeed starting from your current position and SUBSTRING that slice.
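Something along these lines (untested sketch, reusing lcFile and cString from the original post):

  /* walk the longchar with index/substring instead of num-entries/entry -
     each lookup starts where the previous line ended */
  define variable iPos    as integer   no-undo initial 1.
  define variable iLf     as integer   no-undo.
  define variable cString as character no-undo.

  do while iPos le length(lcFile):
      iLf = index(lcFile, chr(10), iPos).
      if iLf = 0 then iLf = length(lcFile) + 1.   /* last line has no lf */
      cString = substring(lcFile, iPos, iLf - iPos).
      iPos = iLf + 1.
      /* ... process cString as before ... */
  end.

That way each line costs a single INDEX plus SUBSTRING from the current position, instead of a scan from the start of the longchar for every ENTRY call.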

Posted by goo on 21-Sep-2016 13:43

Ok, will try that:

ii = num-entries(lcFile,chr(10)).

do i = 1 to ii:

.....

Is ENTRY(...) really that slow? OK, I will test that as well :-)
