Creating duplicate records in heavily used system

Posted by robertsb on 13-Apr-2011 04:14

We have written a new application, running in Progress, which serves up information from a legacy database.  We have an issue on the live system where two transactions try to create the same record at the same time.  We're a bit stumped as to the best way round this.

The idea is that the code logs all requests in a UsageHistory table.  UsageHistory has a unique primary key of RequestDate, RequestTime, RequestUniqueID.

For each request to our application, an entry is created, to which information about the transaction is attached (mode, duration, etc.).

Where there is more than one request per second (i.e. the Date and Time match), we should add a record with the next unique number.  The code is as follows:

    /* Look for an existing entry for this second. */
    FIND LAST UsageHistory
        WHERE UsageHistory.RequestDate EQ gExecutionStartDate
          AND UsageHistory.RequestTime EQ gExecutionStartTime
        NO-LOCK NO-ERROR.

    IF NOT AVAILABLE UsageHistory THEN
    DO:
        /* First request this second: start numbering at 1. */
        CREATE bufUsageHistory.
        ASSIGN bufUsageHistory.RequestDate     = gExecutionStartDate
               bufUsageHistory.RequestTime     = gExecutionStartTime
               bufUsageHistory.RequestUniqueID = 1.
    END.
    ELSE
    DO:
        /* Subsequent request this second: take the next number. */
        CREATE bufUsageHistory.
        ASSIGN bufUsageHistory.RequestDate     = gExecutionStartDate
               bufUsageHistory.RequestTime     = gExecutionStartTime
               bufUsageHistory.RequestUniqueID = UsageHistory.RequestUniqueID + 1.
    END.

This all works fine in our Dev system, but in Live we keep getting errors such as the following:

** bufPortalUsageHistory already exists with RequestDate 13/04/2011 RequestTime 35589 RequestUniqueID 1. (132)

This must be because of the volume of transactions on the live server: two processes try to create the same record at the same time.  We've tried all sorts of code to wrap the above in some sort of transaction, or to repeat on failure, but so far no joy.  We don't really have the OpenEdge expertise to come up with the best solution.
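To illustrate the kind of thing we have been trying, our repeat-on-fail attempts looked roughly like this (a sketch only; the retry limit of 5 is arbitrary, and we may well be misusing the error handling):

    DEFINE VARIABLE iTry AS INTEGER NO-UNDO.

    DO iTry = 1 TO 5 ON ERROR UNDO, NEXT:
        DO TRANSACTION:
            FIND LAST UsageHistory
                WHERE UsageHistory.RequestDate EQ gExecutionStartDate
                  AND UsageHistory.RequestTime EQ gExecutionStartTime
                NO-LOCK NO-ERROR.
            CREATE bufUsageHistory.
            ASSIGN bufUsageHistory.RequestDate     = gExecutionStartDate
                   bufUsageHistory.RequestTime     = gExecutionStartTime
                   bufUsageHistory.RequestUniqueID =
                       (IF AVAILABLE UsageHistory
                        THEN UsageHistory.RequestUniqueID + 1
                        ELSE 1).
            /* Force the unique-index check now: a duplicate raises
               error 132, which the outer block traps and retries. */
            VALIDATE bufUsageHistory.
        END.
        LEAVE. /* No error: the record went in, stop retrying. */
    END.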

Any help would be greatly appreciated.

All Replies

Posted by jankeir on 13-Apr-2011 04:24

The easiest solution is to create a sequence (e.g. seqRequestId) in the database and assign RequestUniqueID to NEXT-VALUE(seqRequestId). That will cause your numbers to continue across date/times (so you won't have 1, 2, 3 for every date/time combination, but 1, 2 for one time, 3, 4 for another time, and so on), but given your information I don't think that would be a problem.
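Something like this (just a sketch, assuming you have added seqRequestId in the Data Dictionary):

    CREATE bufUsageHistory.
    ASSIGN bufUsageHistory.RequestDate     = gExecutionStartDate
           bufUsageHistory.RequestTime     = gExecutionStartTime
           /* NEXT-VALUE hands out numbers atomically, so two
              simultaneous requests can never get the same ID. */
           bufUsageHistory.RequestUniqueID = NEXT-VALUE(seqRequestId).

No FIND LAST is needed any more, so the race disappears entirely.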

Posted by Admin on 13-Apr-2011 06:50

One could ask why you need a unique index on that log table in the first place.

What Progress version? You might want to switch to a timestamp, but even so you might theoretically have more than one event at the same time. If you need the 'exact' order then use the sequence solution proposed by Jan... just make sure the sequence will cycle, i.e. reset itself after an arbitrary value. Most probably you're safe to presume there won't be more than 1000 events per minute.
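For example (a rough sketch using Jan's seqRequestId; the ceiling is arbitrary, and setting an upper limit with Cycle On Limit in the Data Dictionary is the cleaner way to do this):

    /* Reset the sequence once it passes an arbitrary ceiling;
       note this reset itself is not serialized across sessions. */
    IF CURRENT-VALUE(seqRequestId) GE 1000000 THEN
        CURRENT-VALUE(seqRequestId) = 0.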

Anyway, just consider dropping the unique index... you most probably don't need it here.

This thread is closed