Questionable Application Design Recommendations

Posted by Admin on 20-Feb-2009 21:15


My company is in the process of re-architecting our commercial medical billing application, and we paid for a Progress-employed consultant to review our design and provide recommendations. I found his recommendations highly questionable with regard to the OERA principles as Progress has presented them. My boss doesn’t have any experience with this, so I’m hoping that some of the high-profile, extremely knowledgeable people here can provide me with something I can show him so he can make the right decisions on our design – a quick response from John Sadd would be ideal.

Basically, his recommendation was this: we should forget about having a separate data access layer and instead create a single large procedure containing all of the IPs to handle a specific entity. This would be all business logic and database access. We should access the database directly wherever we needed to, rather than run all database access through a single point of touch with the database. Every call to run a procedure would have to run this one large procedure persistently and then call into it.

My problem with this is:

1.) The data access layer is strongly recommended in the OERA design so that schema changes are limited to a single place. That could mean changing a field’s data type, changing where the data for a field comes from, or providing an alternative database option for when a customer’s buy decision hinges on having an MS SQL Server database because they already have a SQL Server DBA and don’t want to add a Progress DBA.

2.) This seems very inefficient from a performance standpoint. Our application uses an ASP.NET UI and cannot maintain a persistent connection to the AppServer broker. As a result we already have a lot of additional overhead in making service calls, because we have to connect to and disconnect from the AppServer broker on each call. Our consultant says that the additional overhead of instantiating a persistent procedure and shutting it down on every call is minimal, but I seriously doubt that. This is an application that can potentially have a very large number of concurrent users running procedures at the same time.
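To make point 1 concrete, here is a minimal sketch (all names are hypothetical, not from our actual application) of what a thin data access procedure looks like – every physical schema reference for the entity lives in one file:

```abl
/* daRevenueCode.p - hypothetical data access procedure.
   All physical schema references for the RevenueCode entity
   live here, so a field type change or a move to an MS SQL
   Server DataServer touches only this one file. */

DEFINE TEMP-TABLE ttRevenueCode NO-UNDO
    FIELD codeId      AS CHARACTER
    FIELD description AS CHARACTER
    FIELD amount      AS DECIMAL.

PROCEDURE fetchRevenueCode:
    DEFINE INPUT  PARAMETER pcCodeId AS CHARACTER NO-UNDO.
    DEFINE OUTPUT PARAMETER TABLE FOR ttRevenueCode.

    EMPTY TEMP-TABLE ttRevenueCode.

    /* The only place in the application that reads this table. */
    FOR EACH RevenueCode WHERE RevenueCode.codeId = pcCodeId NO-LOCK:
        CREATE ttRevenueCode.
        BUFFER-COPY RevenueCode TO ttRevenueCode.
    END.
END PROCEDURE.
```

The business logic layer only ever sees ttRevenueCode, so it never notices where the data physically came from.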

I’m hoping for a lot of feedback on this, but especially from the Progress architects of the OERA. Mike O – you told me that your top people always monitor this list. Please encourage as many of the OERA architects to respond to this – John Sadd would be ideal if you only had one.

Thanks All.


All Replies

Posted by Admin on 20-Feb-2009 22:10

A quick reply from a low-profile Progress developer.

There is no rolled-out OERA implementation, and while there are some basic frameworks out there (including from Progress), you will find that you have to enhance any of those frameworks to get an industrial-strength OERA framework.

If your gateway procedure on the AppServer runs code persistent, then as each agent instantiates and you use each service, it will still be available the next time you make that call. You could also design it so that when each agent starts, each of your major services is run persistent.

At my major client we have not gone to that extent – we run both an AppServer and a WebService – and yet we experience no performance issues. And our case is probably worse, as we decided that a service should be responsible for the things that pertain to that service. So we might have a request that goes to a Forms service but needs info from the Property or Bonds service, and so calls are made to those services. In some cases a single call fetches or saves data to three or four services, and each call goes back through the gateway for security and other reasons.

There is a lot to learn about this and that would be the major issue.

Posted by Thomas Mercer-Hursh on 21-Feb-2009 13:53

Kudos to you for being on the ball!

Let me make a couple of comments here.

First, PSC Professional Services, like any such organization, has people who are better than others and, of course, the reverse. There have been times when individual consultants have been caught making recommendations that would not meet with the approval of the better consultants in the organization.

Second, there sometimes appear to be communication issues ... although different people will point the finger in different directions as to whose fault it might be. One of the classic forms of that communication problem is the customer thinking that the consultant has pointed to AutoEdge as a model for their application. Officially, of course, PSC is very clear that AutoEdge is nothing more than a pedagogic tool, something to illustrate some points, something to provide a basis for discussion. No one involved in architecting it or creating it believes it to be a finished model of a production system. There is no such model anywhere, inside or outside of PS. When called on it, PS consultants will usually say that they never made such a recommendation and that the customer misunderstood: AutoEdge was just supposed to be a place to start thinking and talking. This could easily be true, since there is a natural tendency for people to think that "if PSC did it this way, it must be right". So one can see how this miscommunication might happen, but it must be guarded against.

Third, as noted above, don't even remotely consider AutoEdge to be a model for a production system. Even the people who wrote it recognize things that need to be changed about it and a lot of work that would need to go into making it a real model.

Fourth, in particular, such a model should clearly be built with OO if it is being done today. All there has been so far on that is a couple of whitepapers sketching out ideas. Not only is there a long way to go, but there is some significant diversity of opinion about whether the parts that have been sketched are even in the right direction. See, for example . BTW, John Sadd and I have proposed a joint presentation for Exchange to talk about our different theories of how to represent relational data in OOABL. I hope we still get to give it since I think it would be very thought provoking.

Fifth, I think you are right that you should follow the OERA concept and provide a data access layer. I hope that there has been some kind of miscommunication in the recommendation you have received since it doesn't appear to conform to the layered approach which is the consensus model.

I would suggest you get another opinion and then exercise your own judgment about the alternate recommendations. You might want to check out

Posted by Tim Kuehn on 23-Feb-2009 07:09

Basically, his recommendation was this - that we should forget about having a separate data access layer and instead create a single large procedure that contained all of the IPs to handle a specific entity. This would be all business logic and database access. We should access the database directly wherever we needed to rather than run all database access through a single point of touch with the database. Every call to run a procedure would have to run this one large procedure persistently and then call into it.

Hi Dave - and good for you to ask the hard questions. Here's some thoughts from another consultant:

1) I'd say it's a balance of approaches, because when I did a "DA layer" approach, it worked fine for small updates or changes. However, when I was doing larger updates or changes, it meant hitting the table 2x for each update, which was an inefficient waste of resources and time.

However, if one doesn't mandate a db -> TT -> db approach, one can do a "DA layer lite" where the low-level interfaces to the db tables are done in a restricted set of procedures, and then layer other SPs with increasingly higher levels of BL architecture on top of the "data driver" layers in the application.

2) TMH's recommendation to go with an OO approach is one way of doing this. I've also developed a procedure-managed system where it's possible to layer SPs on each other, which works very well and is consistent with what traditional procedural developers are familiar with.

3) It's not necessary to start a new PP for each AppServer call - if the db connections remain static, then one can set up all the "service" PP or SP modules when the AppServer is started, and then access them as required. While loading PPs may not be that much of an overhead, I've seen cases where it takes hundreds of milliseconds to load a single program, which in a loaded environment is never good.

4) From what I've been able to gather, Mike O's moved on to the Apama division of PSC. I haven't seen him posting here recently.

5) PSC had another round of layoffs recently, and the dust hasn't settled, so it's an open question as to whether or not someone will respond shortly.
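Point 3 can be sketched as an AppServer startup procedure (configured as the agent's startup procedure in the broker properties; all file names here are hypothetical):

```abl
/* asStartup.p - hypothetical AppServer agent startup procedure.
   Each service PP is run persistent once, when the agent starts,
   and registered as a session super procedure, so individual
   client calls pay no per-call instantiation cost. */

DEFINE VARIABLE hService AS HANDLE NO-UNDO.

RUN service/revenueService.p PERSISTENT SET hService.
SESSION:ADD-SUPER-PROCEDURE(hService).

RUN service/claimService.p PERSISTENT SET hService.
SESSION:ADD-SUPER-PROCEDURE(hService).
```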

Finally, if you'd like to hire another consultant for their opinion, I've got lots!

Good luck!

Posted by Admin on 24-Feb-2009 15:16

Hi Miles,

I wouldn’t exactly call you a ‘low profile’ Progress developer – I’ve been following your recommendations for how to work with ProDataSets for quite a while now. I’m not using a common gateway procedure in this instance because the application is browser based, and any changes to the .dlls only require a reinstall on the web server. Our .NET developer definitely prefers having a static API for everything because it makes his work a lot easier in Visual Studio. I went with a common gateway when I was working for Consona because the application was GUI client-server, and changing a .dll resulted in a huge amount of work for our company and all of our customers.

Currently I have a common super procedure that I start at AppServer startup. It contains the common procedures that all of the business entity and data access procedures run. For example, I have a ‘setContext’ procedure that accepts a session ID and restores the proper CLIENT-PRINCIPAL object. This is run by all of the business entity procedures that can be run from the UI.

My major concern is that we’re being advised to put all of the logic for an entity (e.g. Revenue Code) into a single procedure that will always be started persistently; we would then run an IP inside of it and shut it down. Since we’re working in an AppServer environment, that seems hugely inefficient. I’ve always tried to minimize the number of CPU ticks that occur when I work with an AppServer, and this advice ignores that. I understand that putting all of the code for an entity in a single procedure is a good way to organize it, but it also seems bad for performance: a large procedure has a longer load time at startup, on top of the cost of running it persistently and then shutting it down on every call. Am I missing something here?
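Roughly, the setContext IP looks like this (simplified; getStoredPrincipal is a stand-in name for our actual lookup code):

```abl
/* setContext - hypothetical sketch of the IP in the common
   super procedure, run by every business entity procedure
   that is callable from the UI. */
PROCEDURE setContext:
    DEFINE INPUT PARAMETER pcSessionId AS CHARACTER NO-UNDO.

    DEFINE VARIABLE hCP AS HANDLE NO-UNDO.

    /* Fetch the stored CLIENT-PRINCIPAL for this session and
       re-assert it so subsequent security checks see the user. */
    RUN getStoredPrincipal (INPUT pcSessionId, OUTPUT hCP).
    SECURITY-POLICY:SET-CLIENT(hCP).
END PROCEDURE.
```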


Posted by Thomas Mercer-Hursh on 24-Feb-2009 15:49

No, I think it is your consultant who is missing something...

Huge anything is questionable and, if there is a reason for something to be huge, then there is a big motivation to instantiate it once and leave it running. I am right with you in the idea that any new code instantiated in response to an AppServer call should have a minimal footprint so that it opens quickly, does its work, probably using pre-instantiated code, and then goes away quickly.

Posted by Tim Kuehn on 24-Feb-2009 16:03

My major concern is that we’re being advised to put all of the logic for an entity (ex. Revenue Code) into a single procedure that will always be started persistently and then run an IP inside of it and then shut it down.

Uh - yuck. Not only will this structure be a performance killer if the system starts it up and then has to delete it on every AppServer call; monolithic structures like this also become a huge maintenance hassle down the road. And what happens if one entity needs another one - is the code supposed to do the start / use / shutdown there too?

I'd much rather see you go with a set of layered SPs (or objects) that are started up when the AppServer starts and then accessed by the application as needed. All the load time will be taken up front, and then the application can spend its time doing stuff rather than loading and unloading procedures.

Have you asked your consultant how far back his experience goes? Because this recommendation sounds like it's from someone who got addicted to include files and monolithic program blocks somewhere in the past when it was the "best" idea available, and never got past that.

Posted by Admin on 24-Feb-2009 16:08

Hi Dr. Mercer-Hursh,

Your first point is the focus of my concern – the advice that we’re getting seems completely wrong to me. Our consultant told us that we should forget about a data access layer and scatter our database access wherever we need it. It’s already been hard enough for me to sell the idea of a single point of access to the database, because it’s much easier to do a FOR EACH or a FIND when you need some data. It’s always easier to just think about knocking some code out rather than considering how maintainable it will be in the future.

We’re definitely not considering AutoEdge as a model for our development. I’m not sure what our consultant is using to guide him, but he’s recommending that we create a single procedure that must always be run persistently so that the IPs can be run inside it. We’re using ASP.NET, which means we can’t maintain a persistent connection to an AppServer broker. Since we already have the additional overhead of connecting and disconnecting on every call, I can’t imagine how the extra overhead of running a persistent procedure can be beneficial. Our consultant only has GUI client-server development experience, so I doubt his ability to make recommendations anyway.

Regarding the OO constructs, don’t they have the same overhead that running a PP would have? Since I can’t expose methods directly through ProxyGen, I would have to write stub procedures to call them anyway. My thought is that I would create classes for the more complicated business processes, such as running monthly statements: since those have a longer execution time, the instantiation and cleanup would have less of an impact. As far as code organization goes, I could see having a class for every entity in the system, but I’m more concerned with performance.


Posted by Thomas Mercer-Hursh on 24-Feb-2009 16:23

Sounds like someone who has decided that the concept of modular means everything goes in one module!

Posted by Admin on 24-Feb-2009 16:30

Hi Tim,

May I first say thank you in a big way. The huge performance killer is what has been my biggest concern. My problem is that my boss has never had any experience with this, and I’m trying to make sure that we go in the right direction. I assumed that by having a PSC consultant we would get someone who adheres to the stated OERA design principles; this guy says to ignore them. I previously worked for the Encompix division of Consona Corp. The most important upcoming project was to provide the ability to have a SQL Server back-end database. If we had had a data access layer, it would have been a 1-2 month project. Instead, with the database access spread out all over the place, it took closer to a year to accomplish.

I’m actively campaigning to get either you or Dr. TMH to replace the consultant that PSC has provided us with. We’ll see what happens…


Posted by Admin on 24-Feb-2009 16:35

AppServer performance will obviously be the same wherever it is called from – be that a local network or the web. In this case I believe we are discussing a single request; obviously there will be greater web overhead where there are multiple requests.

We run with BE (Business Entity) and DA (Data Access) procedures/layers, and they are currently not run persistent. But if you get the code right, there is no reason why they can't be. The DA procedures should be very small, as they shouldn't contain business logic. The BE has the business logic, but it should have only that, i.e. your framework will have all the logic to retrieve and save the data in your dataset. Our largest BE procedures compile to around 400k, but the access on a local network is unnoticeable.

You can break your entities into smaller but logical groups if you wish; e.g. instead of Revenue Code you might have Contract Revenue, Parts Revenue, etc., and a Revenue object that has all the standard Revenue code. The sub-groups would always call the standard object, and it might be persistent.

Performance has not been an issue for us, but getting used to the new concepts comes much harder. You suddenly realize, for example, that your DA code actually includes business logic. Also, if you decide that all the code for, say, Parties lives in one object and another object needs to fetch or save party details, then it should call the Parties service. The other major thing with datasets is ensuring there are no memory leaks, as these will cause performance issues: your agents grow until they eventually crash.

I had exactly the same fear as you on performance, but that has not been an issue to date. Any performance issue would be developer performance, particularly at the outset. Because the BE and DA are split, greater thought and work are required, though much of that can be reduced through technique.

At this stage we have not seen the benefit of the BE/DA split, as it is all Progress. Having said that, we were told many years ago to separate the UI, and it was a while before we saw the benefit of that.

Posted by Thomas Mercer-Hursh on 24-Feb-2009 16:45

If the consultant is really saying "forget about a DA layer and sprinkle your DA logic around wherever you want it", then not only would I find an alternative consultant, I would also report your concerns up the reporting chain. That is clearly off message for modern thought, no matter which of the several alternative visions you subscribe to.

Which said, both Tim and I are suggesting that it can be a reasonable idea to start up a number of procedures or classes at AppServer startup and then to access the methods/IPs of those classes/procedures through facade procedures, which are what the client runs. That's not quite the way I would like to do things optimally, but it is certainly one approach with current capabilities.
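As a sketch of what I mean by a minimal-footprint facade (names hypothetical), the procedure the client actually runs can be as thin as this:

```abl
/* facadeFetchRevenue.p - hypothetical facade run by the client.
   Minimal footprint: it loads fast, delegates to the service
   code already running persistent on the agent, and goes away. */
DEFINE INPUT  PARAMETER pcCodeId AS CHARACTER NO-UNDO.
DEFINE OUTPUT PARAMETER TABLE-HANDLE phData.

/* fetchRevenueCode lives in a PP that was run persistent at
   AppServer startup and added as a session super procedure,
   so an unqualified RUN finds it. */
RUN fetchRevenueCode (INPUT pcCodeId, OUTPUT TABLE-HANDLE phData).
```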

There are three other conceptualizations that you might want to consider, two of which are actually possible and the third of which still seems to be a dream. As for the dream, one of the things I would like very much to have is a simple, natural way to have a set of running classes which are shared by a pool of AppServers. In that pool, one could have long, short, and medium running classes. The long term would be state-free resources. Short term would be temporary facades. And medium would be a facade which hung around for a while to maintain state. And, of course, one probably wants this pool to be multi-threaded ... dream on.

OK, but since we can't have that, another idea is to think in terms of multiple AppServers, each covering a particular domain. Clients connect to the domain relevant to the request and the domains can connect to each other for things that cross over. Each domain is running a different set of long running classes/procedures which are accessed using facade procedures which last only for the duration of the call. If it is necessary to preserve state across multiple calls, then that is managed via the database.

Third, consider Sonic as an alternative or supplement to AppServer. That allows you to do some of the same kinds of things described above. See for a discussion. The use of messaging may seem to lose you some of the immediacy of an AppServer call, but at the same time it is basically a connection you make at the start of the session and then just use as needed. It is also a natural fit for good modularization of responsibility, e.g. an order gets sent in parallel to the inventory, order processing, and customer services, each for different actions.

Our consultant only has GUI client server development experience so I doubt his ability to make recommendations anyway.

So, it appears that you have the wrong consultant. Either you should find your own elsewhere or at the least you should go back to PSC PS and tell them you have a mismatch and see if they can do better. Frankly, I can understand easily if this experience didn't leave you with a lot of confidence in them.

On the OO questions: yes, there is no magic bullet unless it comes from better encapsulation and re-use. Yes, I would have a DA object for every entity and possibly some more. I don't know if you have followed the long-running debate between John Sadd and me, but he advocates using ProDataSets or temp-tables as "data entities" to pass from the DA layer to the BL layer, while I advocate PDS only for collection objects and the use of business entity objects based on properties to pass from DA to BL. This avoids having the TT or PDS definition in more than one place and encapsulates all logic in one place.
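To illustrate the property-based style (a hypothetical sketch, not a finished design), a business entity object carries its state as properties and keeps its logic in one place:

```abl
/* RevenueCode.cls - hypothetical business entity class.
   State is exposed through properties rather than a TT or PDS
   definition shared between the DA and BL layers. */
CLASS RevenueCode:

    DEFINE PUBLIC PROPERTY CodeId      AS CHARACTER NO-UNDO GET. SET.
    DEFINE PUBLIC PROPERTY Description AS CHARACTER NO-UNDO GET. SET.
    DEFINE PUBLIC PROPERTY Amount      AS DECIMAL   NO-UNDO GET. SET.

    /* Validation logic is encapsulated in the entity itself. */
    METHOD PUBLIC LOGICAL IsValid():
        RETURN CodeId > "" AND Amount >= 0.
    END METHOD.

END CLASS.
```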

Posted by Thomas Mercer-Hursh on 24-Feb-2009 16:57

I assumed that by having a PSC consultant we would get someone that adheres to the stated OERA design principles. This guy says ignore them.

As I said, I don't think this should go unreported, even if you have lost confidence in the organization.

Which said, I have to say that it was perhaps optimistic to think that PSC PS would be a prime source for OERA design principles. I know some people in the organization that seem to have their heads screwed on fairly well, but I have also heard some stories like yours that really make me wonder. Bottom line here, though, is that OERA design is in its infancy. PSC has tried to help with whitepapers and AutoEdge, but the whitepapers don't really get down to the brass tacks needed by production systems and AutoEdge, as interesting as it is, is a long way from being a model for a production system. I.e., one needs to select one's advisers very carefully since no one has been doing any of this for very long. I feel lucky that I got something of a head start on this during the period I was working with Forté because they were very much advocating layered architectures and what amounted to SOA and ESB before those terms had even been invented. So, I've had a while to think about it.

You are bang on about the data layer and making changes. That is one of the big points about this approach ... it isn't that one saves a huge amount of time implementing it up front, but one saves enormously when making changes downstream.

You might also want to check out for some additional ideas. That initiative has only just started, but it is a start.

Posted by Tim Kuehn on 25-Feb-2009 06:47

Sounds like someone who has decided that the concept of modular means everything goes in one module!

That's certainly one way of looking at it.

Posted by Phillip Magnay on 25-Feb-2009 12:41


Could you please drop me a note offline at and provide some details around this issue? I would like to know who the PSC consultant you are referring to is, when the recommendations in question were made, and the specific rationales/justifications that were given to support the recommendations.



This thread is closed