Model-Set-Entity Pattern

Posted by Phillip Magnay on 02-Nov-2009 21:44

Recently I have been having private exchanges with a small number of individuals on several aspects of OO best practices using ABL and I wanted to begin making the content of some of those offline conversations public.  An area which has received a good deal of attention in those exchanges has been object-relational mapping (ORM) and how that may be achieved using OO ABL.  For many years now, the power of the ABL has been centered around managing relational data.  So a central question that has risen since the introduction of OO capabilities into the ABL: how to apply OO principles and best practices in developing data-centric OLTP-style business applications without relinquishing the power of the ABL to manage relational data.  Progress Professional Services (PPS) in North America has been using a foundation framework of components, tools, best practices called Cloudpoint. This foundation framework has its own approach to ORM called the Model-Set-Entity pattern.  I have included a high-level class diagram of Model-Set-Entity below.

ModelSetEntity.JPG

The diagram does not explain everything about Model-Set-Entity nor is it able to depict the whole solution to ORM,  so hopefully there will be a lot of questions and discussions forthcoming.  To the guys who I have been conversing with offline, I would ask you to keep all the detail that we have already shared to yourselves and stay more in the background on this thread until everyone else has caught up to your level. I am very interested in getting questions and opinions from the people who have not yet had the opportunity to discuss this subject.

Phil

All Replies

Posted by AdrianJones on 03-Nov-2009 07:12

The image is not visible to me.

Posted by Phillip Magnay on 03-Nov-2009 07:16

It's just the way that the communities platform manages images.  Just click on the image itself and a clear, full-size image will appear in a popup.

Posted by AdrianJones on 03-Nov-2009 07:28

It works for me in opera, but not Firefox or IE.

Posted by guilmori on 03-Nov-2009 07:42

Great stuff !

Some questions.

1) Why is there a bidirectionnal arrow between OrdersModel and DataAccessController ? Does DataAccessController has knowledge of OrdersModel (meaning EACH different models) ? Or it should be instead an arrow from OrdersModel to DataAccessController, and another arrow from DataAccessController to IBusinessModel ?

2) What is the Refresh method in IEntitySet used for ?

3) In IBusinessEntity, are the Revert() and Update() method used for Accept and Reject changes on the temp-table ?

4) Could you show an example of what kind of steps are required for a consumer to fill an Order model from persistence ? It is not clear to me how the IBusinessModel interface is used for this purpose.

Since the devil is in the detail, here are some more questions about the implementation:

1) How have you implemented your Set classes ? Don't you have a base class for Sets ?

2) How is the performance of filling this model from persistence ? Let's say for example 1000 orders, each with 100 order lines.

3) To create a simple Order/Order Line model, 5 classes are required, right ? This seems quite some work, and this work will be repeated many times for each different model. How is the productivity of creating and working with this pattern ? Is there any class generator in the plans for helping this ?

Posted by guilmori on 03-Nov-2009 07:59

Does a consumer have direct access to Order and OrderLine business entities and sets, or everything must go through the OrderModel ?

Posted by Phillip Magnay on 03-Nov-2009 08:12

Hi Guillaume,

I have a full day day today so I will address those questions which can be answered quickly and I will provide fuller answers to the others after hours.

1) Why is there a bidirectionnal arrow between OrdersModel and DataAccessController ? Does DataAccessController has knowledge of OrdersModel (meaning EACH different models) ? Or it should be instead an arrow from OrdersModel to DataAccessController, and another arrow from DataAccessController to IBusinessModel ?

The diagram should represent it more as a flow than a relationship.

2) What is the Refresh method in IEntitySet used for ?

3) In IBusinessEntity, are the Revert() and Update() method used for Accept and Reject changes on the temp-table ?

Not exactly. But close enough.

4) Could you show an example of what kind of steps are required for a consumer to fill an Order model from persistence ? It is not clear to me how the IBusinessModel interface is used for this purpose.

Later.

Since the devil is in the detail, here are some more questions about the implementation:

1) How have you implemented your Set classes ? Don't you have a base class for Sets ?

Base classes for sets

2) How is the performance of filling this model from persistence ? Let's say for example 1000 orders, each with 100 order lines.

Is your orders scenario using direct db access? Or prodatasets and temp tables?

3) To create a simple Order/Order Line model, 5 classes are required, right ? This seems quite some work, and this work will be repeated many times for each different model. How is the productivity of creating and working with this pattern ? Is there any class generator in the plans for helping this ?

We use UML/MDA to generate these classes.

I will address your questions more thoroughly tonight.

Phil

Posted by Phillip Magnay on 03-Nov-2009 08:14

Direct access. The consumer uses the factory methods in the model to create entity sets.  The consumer then uses the methods on the sets to create/access entities.

Posted by guilmori on 03-Nov-2009 08:27

2) How is the performance of filling this model from persistence ? Let's say for example 1000 orders, each with 100 order lines.

Is your orders scenario using direct db access? Or prodatasets and temp tables?

I meant fill the whole model (dataset and set classes) from the database.

Posted by guilmori on 03-Nov-2009 08:33

Ok, so the DataAccessController:GetModel() returns an OrdersModel with a filled dataset. Then we ask the OrdersModel to create its Entity sets, which is done using the dataset data ?

The consumer then uses the methods on the sets to create/access entities.

The Sets are created empty and they are responsible to fill themself ?

Posted by guilmori on 03-Nov-2009 14:52

In the IBusinessModel definition, is IEntity vs IBusinessEntity a typo, or it's really two different interfaces ?

Posted by Phillip Magnay on 03-Nov-2009 14:55

guilmori wrote:

Ok, so the DataAccessController:GetModel() returns an OrdersModel with a filled dataset. Then we ask the OrdersModel to create its Entity sets, which is done using the dataset data ?

Yes.  The DAController returns a model with a filled dataset.  Also, an existing model instance can be submitted to populate/repopulate its dataset.  Then the consumer (a business task typically implemented as a command class object) will call the a factory method on the model to create the entity set needed for the task.

guilmori wrote:

The consumer then uses the methods on the sets to create/access entities.

The Sets are created empty and they are responsible to fill themself ?

Yes, the sets are empty.  The set is similar to an iterator pattern - it does not encapsulate created entities.  Rather entities are only created when the consumer actually asks for one by calling a factory method to return one.

Posted by Phillip Magnay on 03-Nov-2009 14:59

Thanks for catching that.  It was originally IEntity and then changed to IBusinessEntity.  The old reference must be lingering around in the UML project.  I'll fix.

Posted by Phillip Magnay on 03-Nov-2009 15:02

guilmori wrote:

2) How is the performance of filling this model from persistence ? Let's say for example 1000 orders, each with 100 order lines.

Is your orders scenario using direct db access? Or prodatasets and temp tables?

I meant fill the whole model (dataset and set classes) from the database.

The model encapsulates a dataset so it would just be the time required to populate 2 related temp-tables (Orders & Orderlines) in a prodataset with 1000 orders each with 100 order lines.  Sets are not created during the fill.  They are only created when the consumer asks for them.

Posted by guilmori on 04-Nov-2009 07:13

Do you mean the Entity instances aren't stored anywhere in the model ? They aren't contained by the Sets, only returned to consumer ?

Posted by guilmori on 04-Nov-2009 07:35

Some more questions regarding IBusinessModel.

1) Could you elaborate on what are all the CHARACTER parameters used for ? I hope it's not to send the business entity's name as a CHAR ?

2) Regarding GetEntityValue(...), does this imply some generics functionality (in the ABL ?).

3) Are all the methods in IBusinessModel used by the consumer or some are used by the Data components when re-creating from persistence ?

4) Is it really the job of the BusinessModel to persist itself (Persist() method) ? Or should it be the consumer responsibility to send the BusinessModel instance to the data component for persistence. What if I need to commit 2 BusinessModel in the same transaction ? In other business/domain patterns I saw, one of the primary goal is to make the business components completely oblivious to the data components. Could you comment if your model follow this idea, and if not, why ?

5) I still would like to see some pseudo code from the consumer point of view on how it requests for a model and works with it. A sequence diagram would be great.

Posted by Phillip Magnay on 04-Nov-2009 10:18

guilmori wrote:

Some more questions regarding IBusinessModel.

1) Could you elaborate on what are all the CHARACTER parameters used for ? I hope it's not to send the business entity's name as a CHAR ?

This is seems like a trivial item but it is actually one of the more interesting pieces.  It is the parameter which allows the consumer to ask for a specific entity type. The challenge was achieving some form of type safety while providing something that the model understands. We originally used a progress.lang.class type reference for this parameter which the model then extracted the type information that it could use.  But that approach seemed to place too much burden on the consumer to send in the correct progress.lang.class reference.  So we changed it to a CHAR (progress.lang.class doesn't really offer any addiitonal type safety than CHAR) and included static read-only properties (constants if you like) in the model which the consumer would use to refer to a particular entity type.  Combining these two approaches, ie, constants defined in the model of type progress.lang.class might also be option.

There's probably more ways to do this but keeping simple and direct would be my preference.  I would be interested in any ideas that you may have.

2) Regarding GetEntityValue(...), does this imply some generics functionality (in the ABL ?).

No, not at all. It is just some modeling short-hand.

3) Are all the methods in IBusinessModel used by the consumer or some are used by the Data components when re-creating from persistence ?

No the data components do not use any of the methods in IBusinessModel.  The data components refer to the model under a completely different interface type.

4) Is it really the job of the BusinessModel to persist itself (Persist() method) ? Or should it be the consumer responsibility to send the BusinessModel instance to the data component for persistence. What if I need to commit 2 BusinessModel in the same transaction ? In other business/domain patterns I saw, one of the primary goal is to make the business components completely oblivious to the data components. Could you comment if your model follow this idea, and if not, why ?

Actually, I do believe it is the responsibility of the model to handle persistence.  I don't believe consumers should be directly referring to data components.  I believe consumers should only refer to the model, sets, and entities as per your mention that business components should be completely oblivious to data components. Wrt transactions, I believe that transactions are internal to a given model. I don't believe there is any need to commit 2 models in a single transaction - proper design of the models should eliminate any need for that.

5) I still would like to see some pseudo code from the consumer point of view on how it requests for a model and works with it. A sequence diagram would be great.

It won't be today but I'll post something in due course.

Posted by Phillip Magnay on 04-Nov-2009 10:24

guilmori wrote:

Do you mean the Entity instances aren't stored anywhere in the model ? They aren't contained by the Sets, only returned to consumer ?

Yes, that's right. Entity instances are not stored in the model or in the sets. Entity instances are only created when requested by the consumer via one of the methods on the set.

Posted by Thomas Mercer-Hursh on 04-Nov-2009 11:07

To clarify further, a BE will contain the entity logic, but it serves as a facade for the data which remains in the model.  Access is via the Value methods you see in the interface.  They have also tried using a buffer, as NSRA does, but decided on balance that violated encapsulation and could lead to significant debugging and testing issue since a buffer is navigatable, rather than being an identifier for a particular tuple in the PDS.  One could argue that having value methods on the model and properties on the BE which were the same thing is a violation of Normal Form, but the Value methods on the Model are not really usable by anyone other than the BE because an ID is required to identify the entity in question.

Similarly, Entity Sets don't really contain BEs, but rather identify a query in the Model which designates the set which the ES represents.

In sequential processing of a set of BEs, one would typically instantiate a BE, process it, and then delete it, so that the number of currently instantiated BEs is kept very small.  This is possible, of course, because the data for the BE remains always in the Model.

Posted by Thomas Mercer-Hursh on 04-Nov-2009 11:11

I don't believe there is any need to commit 2 models in a single transaction - proper design of the models should eliminate any need for that.

So, if I am doing something like allocating stock to orders, I am going to have a combined Item and Order Model, not an Item Model and an Order Model?

Posted by Phillip Magnay on 04-Nov-2009 15:21

tamhas wrote:

I don't believe there is any need to commit 2 models in a single transaction - proper design of the models should eliminate any need for that.

So, if I am doing something like allocating stock to orders, I am going to have a combined Item and Order Model, not an Item Model and an Order Model?

Yes, I would take the combination approach over allowing consumers to directly reference DA components in order to submit multiple models in a single transaction.  Using composition to combine models together to form a larger model (which is still compliant with the model interface type) sounds like a much neater tack.  Though the DAController would no doubt need some enhancement to accommodate these larger, more complex models.

Posted by Thomas Mercer-Hursh on 04-Nov-2009 15:41

I have to question this.

First, it seems like it should be the responsibility of the task to define the transaction scope.  Operation 1 might have a scope of a single order and Operation 2 might have a scope of all impacted orders ... how would the Model know.

Second, it seems like it would lead to a combinatorial proliferation of Model types A + B, A + C, A + D, B + C, .... etc.

Posted by Phillip Magnay on 04-Nov-2009 15:56

tamhas wrote:

I have to question this.


Should I expect otherwise?

First, it seems like it should be the responsibility of the task to define the transaction scope.  Operation 1 might have a scope of a single order and Operation 2 might have a scope of all impacted orders ... how would the Model know.

A transaction is initiated whenever a model is submitted to the DA.  A model may contain changes for a single entity or many changes from multiple changed entities.  The business task just requests a model to be persisted and all outstanding changes within a model will be updated to the database in a single transaction whether that transaction encompasses changes to a single record or many.

Second, it seems like it would lead to a combinatorial proliferation of Model types A + B, A + C, A + D, A + E

Sure.  I would expect that such composition would be reserved for cases where the model design doesn't quite stretch to certain use cases.  If it was getting used beyond that then I would suggest that the design of the unitary models may need rethinking.

What would be your answer to your original question?

Posted by Thomas Mercer-Hursh on 04-Nov-2009 16:15

Should I expect otherwise?

I don't know that I like the notion of equating a transaction with a request for persistance.  Business transactions can be lots of things.  Yeah, we are used to managing these mostly with database transactions in an ABL world because it tends to be persisted information that we care about, but not everything is persisted.  Moreover, in an ESB world, we have to step back a bit and look at transactions a bit differently.  That convenient, simple, and absolute reliance on a DB transaction doesn't work with distributed data ... it might be possible, but undesireable.    In particular, I can imagine tasks which would be broken into multiple subtasks  and there might be some outside interactiono between subtasks.  I may want to define a transaction around the subtask so that I know I am complete and ready for the next interaction, but I may not want to persist anything until I get to the end.  Yes, this implies having to start over again from the beginning if something fails before the end, but that seems like a reasonable option.

This proliferation of model vaiations doesn't seem like the OOish thing to do.... and a possible maintenance headache.

What would be your answer to your original question?

Which original question would that be?

Posted by Phillip Magnay on 04-Nov-2009 16:24

I don't know that I like the notion of equating a transaction with a request for persistance.  Business transactions can be lots of things.  Yeah, we are used to managing these mostly with database transactions in an ABL world because it tends to be persisted information that we care about, but not everything is persisted.  Moreover, in an ESB world, we have to step back a bit and look at transactions a bit differently.  That convenient, simple, and absolute reliance on a DB transaction doesn't work with distributed data ... it might be possible, but undesireable.    In particular, I can imagine tasks which would be broken into multiple subtasks  and there might be some outside interactiono between subtasks.  I may want to define a transaction around the subtask so that I know I am complete and ready for the next interaction, but I may not want to persist anything until I get to the end.  Yes, this implies having to start over again from the beginning if something fails before the end, but that seems like a reasonable option.

Sure. But in the context of this pattern, we're dealing with DB transactions.

This proliferation of model vaiations doesn't seem like the OOish thing to do.... and a possible maintenance headache

Variation via composition is a very OO thing to do and would be very maintenance friendly.

Posted by Thomas Mercer-Hursh on 04-Nov-2009 17:14

But in the context of this pattern, we're dealing with DB transactions.

Surely you don't want to exclude ESB data sources!

Variation via composition is a very OO thing to do and would be very maintenance friendly.

Yes, but you don't have objects to compose.  Are you suggesting you will implant a Model within a Model?  Even if you did, this seems like it would be a very different thing than composition involving objects.

And, isn't some thing like Items and Orders for allocation more like aggregation than composition?

Posted by Phillip Magnay on 04-Nov-2009 17:52

tamhas wrote:

But in the context of this pattern, we're dealing with DB transactions.

Surely you don't want to exclude ESB data sources!

It doesn't exclude them. It just won't manage distibuted transactions. And I don't believe that it should be expected to.

Variation via composition is a very OO thing to do and would be very maintenance friendly.

Yes, but you don't have objects to compose.  Are you suggesting you will implant a Model within a Model?  Even if you did, this seems like it would be a very different thing than composition involving objects.

And, isn't some thing like Items and Orders for allocation more like aggregation than composition?

Models are objects. Mulitple uintary models within a larger complex model would form the composition and that sounds perfectly reasonable to me. I wasn't suggesting there would be a composition between Items and Orders.

Posted by Thomas Mercer-Hursh on 04-Nov-2009 18:20

It doesn't exclude them. It just won't manage distibuted transactions. And I don't believe that it should be expected to.

Raising the obvious question of what would manage the distributed transaction?

To be sure, this is a very complext area.  I thought Mike did a great job covering the issues in his Exchange talk, including the recognition that, with distributed data one needed to deal at times with what might call "optimistic transactions", i.e., fire off a change expecting it to work and then have some mechanism to put things back again in the rare case where it doesn't.  One expects it to work, of course, because one has started with reasonably timely information and one is using a reliable delivery mechanism to get the message there.  So, the undo mechanism isn't something that is part of the underlying technology, but rather something that one has to program as a separate task and monitor.  It would seem to me that there are even times when subsystem A is going to need to impact subsystem B, but doesn't necessarily have any data of its own to persist.

It just seems to me that the transaction scope belongs in the task, not coupled to the act of persistence ... but I'll think about it a bit and see if anyone else has 2 cents to throw in.

OK, nesting two or more Models into one larger Models does allow for more re-use than what I thought you were talking about, i.e., a Model with a more complext PDS.  It does make my head hurt a bit though.  It might be OK for sequential processing, but for something more complex I wonder how it would work.

Posted by guilmori on 05-Nov-2009 08:08

If I understand correctly, in this example:

oOrderInst1 = oOrdersModel:GetFirstEntity().

oOrderInst2 = oOrdersModel:GetFirstEntity().

oOrder1 and oOrder2 will be 2 different instances ?

And these instances are still "connected" to the dataset data, so changes made to one instance is automatically reflected into the other instance, right ?

But, even if they are synced, having 2 different instances to represent the same thing smells wrong to me.

When I first saw the pattern, i thought: finally, a pattern where simplicity is a key point in the business entities. But, as more details are revealed, it seems to me that this is drifting away from my first thought. BE being only a facade to the dataset data is a big down for me, as I anticipate the BE to contains many distracting code required for the dataset mapping. For me, BE should be as simple as possible, and contains only business related data AND logic, no infrastructure pollution.

I'm sorry for being quite harsh, this is probably fueled by the fact that the foundation of this pattern is based on temp-table, and my opinion regarding temp-table usage in an OO model is pretty negative... But, I'm still in the very early learning stage of OO modeling, and I still have to see more concrete stuff about this pattern, so my opinion may certainly change.

Posted by guilmori on 05-Nov-2009 08:24

pmagnay a écrit:

guilmori wrote:

Some more questions regarding IBusinessModel.

1) Could you elaborate on what are all the CHARACTER parameters used for ? I hope it's not to send the business entity's name as a CHAR ?

This is seems like a trivial item but it is actually one of the more interesting pieces.  It is the parameter which allows the consumer to ask for a specific entity type. The challenge was achieving some form of type safety while providing something that the model understands. We originally used a progress.lang.class type reference for this parameter which the model then extracted the type information that it could use.  But that approach seemed to place too much burden on the consumer to send in the correct progress.lang.class reference.  So we changed it to a CHAR (progress.lang.class doesn't really offer any addiitonal type safety than CHAR) and included static read-only properties (constants if you like) in the model which the consumer would use to refer to a particular entity type.  Combining these two approaches, ie, constants defined in the model of type progress.lang.class might also be option.

There's probably more ways to do this but keeping simple and direct would be my preference.  I would be interested in any ideas that you may have.

In .Net, I could do the following, all with type safety.

oOrdersModel.GetEntity(typeof(be.Order)).


public IBusinessEntity GetEntity(System.Type beType)

{

   if (beType == typeof(be.Order))

   {

   }

}

Is this possible in ABL ?

However, this does not give any clue to the consumer what are the possible entities returned by the model.

I think this may be a good use case for an Enumeration. Even with your static properties, having a CHAR as a parameter is not clear from the consumer point of view, neither from a UML diagram point of view. Moreover, it doesn't force the consumer to use the static properties.  Using an enumeration here helps the clarity, and reduce risk of error.

I vote for an enhancement to have Enumeration in the language !

Posted by Phillip Magnay on 05-Nov-2009 09:48

guilmori wrote:

If I understand correctly, in this example:

oOrderInst1 = oOrdersModel:GetFirstEntity().

oOrderInst2 = oOrdersModel:GetFirstEntity().

oOrder1 and oOrder2 will be 2 different instances ?

And these instances are still "connected" to the dataset data, so changes made to one instance is automatically reflected into the other instance, right ?

But, even if they are synced, having 2 different instances to represent the same thing smells wrong to me.

They are separate instances referencing the same data.  Changes in one are immediately reflected in the other.  Two separate instances representing the same thing is not ideal (your coding example is not typical) but not unusual either.  As long as the integrity of the data is protected.

When I first saw the pattern, i thought: finally, a pattern where simplicity is a key point in the business entities. But, as more details are revealed, it seems to me that this is drifting away from my first thought. BE being only a facade to the dataset data is a big down for me, as I anticipate the BE to contains many distracting code required for the dataset mapping. For me, BE should be as simple as possible, and contains only business related data AND logic, no infrastructure pollution.

I'm sorry for being quite harsh, this is probably fueled by the fact that the foundation of this pattern is based on temp-table, and my opinion regarding temp-table usage in an OO model is pretty negative... But, I'm still in the very early learning stage of OO modeling, and I still have to see more concrete stuff about this pattern, so my opinion may certainly change.

Yes. Please don't rush to judgment.  You're making some wayward assumptions about the implementation.  You need a code example - I'll see if I can mock something up to show you how can be put together.  You will find that in the BE the "infrastructure pollution" (as you put it) is quite minimal and isolated (one line in each GET and SET of each property). And the larger context here is business applications which work with data that is stored and retrieved in relational databases and the ABL has very efficient features such as queries, temp-tables, and datasets to support interaction with relational data. The intention here is not to replace these powerful features with OO (that makes no sense); it is to use these relational data elements and OO together is such a way that conforms with OO principles and practices, and preserves the power of these relational features.

Posted by Phillip Magnay on 05-Nov-2009 09:59

guilmori wrote:

In .Net, I could do the following, all with type safety.

oOrdersModel.GetEntity(typeof(be.Order)).


public IBusinessEntity GetEntity(System.Type beType)

{

   if (beType == typeof(be.Order))

   {

   }

}

Is this possible in ABL ?

However, this does not give any clue to the consumer what are the possible entities returned by the model.

I think this may be a good use case for an Enumeration. Even with your static properties, having a CHAR as a parameter is not clear from the consumer point of view, neither from a UML diagram point of view. Moreover, it doesn't force the consumer to use the static properties.  Using an enumeration here helps the clarity, and reduce risk of error.

I vote for an enhancement to have Enumeration in the language !

The type-of function in ABL is different (returns a logical, not a type) ) so you could not use it in the same way as you would use typeof in C# or Java.  Using progress.lang.class:GetClass is a little closer but still doesn't provide the type safety that we really want.

Yes. Enums would be a big help here (I vote for it too).  But without enums and without an equivalent to the C# typeof function in tha ABL, we're limited to options that are weakly typed.  I don't like it much either.

Posted by guilmori on 05-Nov-2009 10:58

And the larger context here is business applications which work with data that is stored and retrieved in relational databases and the ABL has very efficient features such as queries, temp-tables, and datasets to support interaction with relational data. The intention here is not to replace these powerful features with OO (that makes no sense); it is to use these relational data elements and OO together is such a way that conforms with OO principles and practices, and preserves the power of these relational features.

I think one basic OO principle is broken here. The data "lives" outside of its corresponding object.

How do you deal with inheritance that extends the data definition of a business entity ?

Why doesn't it make sense to ask for:

  - Creating new instances is as fast if not faster than creating temp-table rows;

  - Efficient and flexible query mechanism over a list of instances (collection).

?

Posted by Thomas Mercer-Hursh on 05-Nov-2009 11:06

Changes in one are immediately reflected in the other.

While true, it doesn't guarantee that the passive one in any change will notice the change.   E.g., the first instance changes A to a new value.  The context using that instance believes A to have that value because it just set it.  Meanwhile, the second instance also changes A to yet another value.  This will show up to the first instance if it asks for the current value, but there is nothing to alert it that a change has occured so the client using it will assume it still has the value it set.

Some of this, I think, is inherent in representing object data in temp-tables.  One has the uncomfortable choice of either cloning the data, in which case the object and the temp-table can become out of sync, or of turning the BE into a facade for the data in the temp-table, in which case we have some not very OO-like properties and the risk of situations like this.  Perhaps they are not serious situations in practice since one will typically process the entities in a set sequentially, but the possibility is still disturbing.

M-S-E is clearly the best thought out of the patterns that use TTs and PDSs in the BL layer, but I have to wonder up front why the foundation is based on what are fundementally collections rather than being founded on BEs themselves.

Posted by Thomas Mercer-Hursh on 05-Nov-2009 11:09

I think one basic OO principle is broken here. The data "lives" outside of its corresponding object.

One can justifiably consider this an instance of the facade pattern.  That, in itself is not the key issue, I don't think ... although, of course, it also isn't what I would do myself.

I'll let Phil answer on inheritance ...

- Efficient and flexible query mechanism over a list of instances (collection).

Still trying to get them to implement LINQ, eh?

Posted by Phillip Magnay on 05-Nov-2009 11:11

I think one basic OO principle is broken here. The data "lives" outside of its corresponding object.

That's not an OO principle I am familiar with. Facade objects are a commonly accepted OO appraoch.

How do you deal with inheritance that extends the data definition of a business entity ?

The specialized sub-class properties can reflect data from the the same temp-table record or from related temp-table records.  The model does the mapping.

Why doesn't it make sense to ask for:

  - Creating new instances is as fast if not faster than creating temp-table rows;

  - Efficient and flexible query mechanism over a list of instances (collection).

?

That's waiting for changes in the language that may or may not come.

Posted by guilmori on 05-Nov-2009 12:18

The specialized sub-class properties can reflect data from the the same temp-table record or from related temp-table records.  The model does the mapping.

Speaking of the temp-tables, are they mostly an exact copy of the database tables, or there is another mapping being done ?

If there is a mapping, for what reasons ?

Why doesn't it make sense to ask for:

  - Creating new instances is as fast if not faster than creating temp-table rows;

  - Efficient and flexible query mechanism over a list of instances (collection).

?

That's waiting for changes in the language that may or may not come.

But, don't you agree that this would make a much simpler model (I mean ditching the temp-tables ) ?

Posted by guilmori on 05-Nov-2009 12:50

What are your thoughts about the other alternative Thomas mentionned ie: "cloning the data" ?

The BusinessModel would play the role of an "Indentity Map". As the consumer asks the model for entities, new instances are created on first request, then re-used on subsequent requests. The model would keep a list of all its BE instances, and use an Indentity pattern to detect if the requested BE is already instantiated.

Posted by Thomas Mercer-Hursh on 05-Nov-2009 14:35

I will let Phil address the mapping question, but about temp-tables ...

There is no reason to ditch something just because it is unfamiliar in traditional OO ... it is just a question of figuring out when, where, and how to use it best.

Traditional 3GL OO uses collections which may or may not have order, but which can generally only be accessed sequentially.  The exception being Map classes which have a key, but only one unlike a temp-table which can have many.  Sequential access seems terribly primative to most ABLers, but the truth of the matter is that one often does want to process all the members of some set sequentially.  One can, of course, do this with a temp-table, but one is using a heavier weight tool than seems necessary.  Something like adding PLO support to work-tables would give us something more like a 3GL collection, but it would only be worth doing if either the memory footprint was substantially lighter and/or the sequential access performance better.  The former seems possible, but then there may be some inherent issues with work-table implementation which would make it not true.  For sure, one would lose the ability to automatically slop to disk for large collections.  I'm dubious about the performance and work-tables might even be worse, since TTs are heavily optimized.

So, one can always use my collection and map classes, act like it was a 3GL, and just forget that there are temp-tables underneath until and if it becomes possible to revise the insides with something lighter.  That would make for a pretty painless transition.

Of course, the second part of the issue is your wanting LINQ like capabilities on these collections.  Whatever the virtue of the idea, I would suggest looking for what it is that you are going to do in the meantime since I wouldn't hold your breath while waiting.  I might just have something for you that helps, though.  Look for forthcoming descriptions of SuperMap.

Posted by Thomas Mercer-Hursh on 05-Nov-2009 14:41

Cloning seems to me to be an unattractive option.  The one reall positive feature it has is that it provides a very natural and simple transaction boundary, i..e., if you don't check the object back in, nothing is committed.  But, it does raise issues about the object and the temp-table being out of sync.  E.g., suppose one clones off an OrderLine object and the client changes the product which changes the product group.  Before it checks the object back in, it needs to finalize the price, but if it is in a context where there is order size discounts by product group, when it asks the temp-table what is the total of all lines in this product group, the temp-table may be out of sync with the line and either include or exclude the line inappropriately.

Posted by Phillip Magnay on 05-Nov-2009 15:10

The BusinessModel would play the role of an "Indentity Map". As the consumer asks the model for entities, new instances are created on first request, then re-used on subsequent requests. The model would keep a list of all its BE instances, and use an Indentity pattern to detect if the requested BE is already instantiated.

Now the Indentity Map idea has potential.  This was something that I didn't think there was a real need for because any potential anomalies appeared to require "curious" coding.. But assuming that "curious" code is a possibility and therefore there is a real need to enforce some form of "identity safety" amongst BE instances, then this sounds like a possible solution.

Posted by guilmori on 05-Nov-2009 18:22

There is no reason to ditch something just because it is unfamiliar in traditional OO ... it is just a question of figuring out when, where, and how to use it best.

Not because it is unfamiliar in OO (in fact, Dataset exists for quite some time in .Net), but more because of the complexity required to have these 2 paradigms working together. Sure this can be done, but at what price ?

In .Net, I don't see why I would integrate Datasets into my models, since working with instances is made so easy and efficient.

Traditional 3GL OO uses collections which may or may not have order, but which can generally only be accessed sequentially.  The exception being Map classes which have a key, but only one unlike a temp-table which can have many. 

I can use LINQ to Objects to read any collection in any order.

Sequential access seems terribly primative to most ABLers

LINQ to Objects seems a lot ABLish to me.

Of course, the second part of the issue is your wanting LINQ like capabilities on these collections.  Whatever the virtue of the idea,

This is like asking why would you use a WHERE or a BY on a for each temp-table statement

I would suggest looking for what it is that you are going to do in the meantime since I wouldn't hold your breath while waiting.  I might just have something for you that helps, though.  Look for forthcoming descriptions of SuperMap.

I sure will !

Posted by Thomas Mercer-Hursh on 05-Nov-2009 18:42

As for LINQ, I propose that we set it aside for the moment since, either we have sufficient similar capabilities in ABL or we don't, but if we don't we still have to figure out how to build applications with what we have.  After all, LINQ is not in every OO 3GL and hasn't existed very long, so there are a whole lot of applications that have been built without it.  Let's get on with figuring out how best to use what we have.

There is certainly an important viewpoint in recognizing that TTs and PDSs are fundamentally expressions of a relational view of the world ... and OO very expressly does not have an OO view of the world.  There is a conflict there.  We now have three patterns which are based around PDSs in the BL layer (technically, NSRA says it isn't, but that is more a case of layer muddiness than something different).  I think this is clearly a strong statement that there are people who feel that PDS is a fundemental part of modern ABL so it would be a mistake to do anything else than make it central to our patterns.

The curious part about this to me is that TTs are all about sets of things and PDS are about multiple sets in relationships, but in both OO thinking and I think in a lot of problem space contexts, the primary focus is the individual instance.  Yes, there is a need for sets of things, but in traditional OO usage these sets are almost all either unordered or, if ordered, still only processed sequentially.  That seems primative to ABLers who flit around the data base with multiple indexes and the like, but bottom line it is enough for a large number of problem space requirements.  Some fancy stuff might go into selecting what goes into a collection, but once it is there, the handling is very simple.

So, we have this curious where on the one hand, the primary focus is a single instance or at most a simple sequential collection but on the other hand the core minimum building block is not only a set, but a set with sophisticated capabilities and potentially multiple sets within a single object.

Posted by guilmori on 05-Nov-2009 19:29

4) Is it really the job of the BusinessModel to persist itself (Persist() method) ? Or should it be the consumer responsibility to send the BusinessModel instance to the data component for persistence. What if I need to commit 2 BusinessModel in the same transaction ? In other business/domain patterns I saw, one of the primary goal is to make the business components completely oblivious to the data components. Could you comment if your model follow this idea, and if not, why ?

Actually, I do believe it is the responsibility of the model to handle persistence.  I don't believe consumers should be directly referring to data components.  I believe consumers should only refer to the model, sets, and entities as per your mention that business components should be completely oblivious to data components.

I'm not sure I understand. If it's the job of the model to persist itself, then it must access the data components. So it has knowledge of the data layer ?

Posted by Thomas Mercer-Hursh on 05-Nov-2009 19:41

Let me help Phil out a little here because I have been pursuing this question in a somewhat different form elsewhere.

A model does not persist itself.  Instead, it is given to the Data Access Controller which then uses the Decorator pattern to add components to it which provide the mechanisms of persistence.  As appropriate, these are then stripped and the Model returned to the business layer.  So, the Model itself has no knowledge of the datasource or database, only the Decorators know that.

Persisting in the focal point for transaction scope.  Thus, if you have a Model with multiple objects, make a change in one, and consider that the transaction scope, then the task using the model would request persistence at that point to preserve that change.  Alternatively, if the task wants to make a series of changes and commit the whole set as a transaction, then it would wait until the end to request persistence.

For the case where there is more than one Model involved, one would create an aggregate Model which contained the two or Models containing the different types of objects.  Requesting persistance of the containing Model would persist all of its contents ... hopefully within the scope of a single transaction.

Posted by randallkharp on 06-Nov-2009 11:01

It's refreshing to see folks with ABL familiarity thoughtfully discussing this topic - even proposing (and using) patterns.

Object/relational mapping has been a discussion topic for many years - although often focused on persisting an object hierarchy in a relational database rather than developing an object hierarchy to manage a relational database. Given a long history of a well-performing 4GL, it often feels somewhat treasonous to give up a straight-forward FOR EACH or FIND for an OO construct.

Undoubtedly, finding the right balance between direct database access and OO constructs is an ongoing pursuit that may have to be judged on a case-by-case basis. Perhaps the real performance killer is the paralysis of analysis. ;-)

FWIW here are a couple more interesting (non-PSC) resources on the topic of object/relational mapping:

http://www.ibm.com/developerworks/library/ws-mapping-to-rdb/

http://www.objectarchitects.de/ObjectArchitects/orpatterns/

Posted by Admin on 15-Nov-2009 15:48

Well, you bring up some interesting points: ORM and the "loosing" of FOR EACH and FIND.

Usually every discussion about OO models turns around the idea of having two incompatible models: the OO model for the application logic and the relational model for data persistency, working togheter by means of a ORM "bridge".

Progress is givin us the posibility of truly joining both models, instead of "bridging them", by means of ProDataSets and Temp-Tables (as we did with NSRA). This join of models gives you the benefits of having an OO structure, error handling, overriding, inheritance, etc. without loosing the ability of doing FOR EACH nor FIND, not over the Database but over Temp-Tables.

The Model-Set-Entity Pattern does not goes that far, it is still an ORM.. I will call it an "advanced ORM", because is giving you more control over the mapping, and is taking some advanteges of the use of ProDataSets and Temp-Tables to be more "efficient" in object managment.

The model is "close" to a pure OO model (as CI model proposed by TMH), but has some "breaks" of OO in the concep of Sets (as far as I can see - not so far by the way !!).

I don't agree with this kind of model, where you loose the ability of doing a FOR EACH or a FIND, for ABL, because I beleave they are "excesively theorical", I really don't think this models are productive enough nor efficient enough (and there are no implementations to prove me wrong!), compared to existing OO models for other languages, to worth the effort of implementing such model.

I think "Progess people" is somehow affraid of changing to an OO developing model because they feel you are loosing the "ABL power" of FOR EACH and FIND, with nothing to compensate this loose. I feel this too when I see this kind of model... and I start working with ABL with version 10.1C allways in OO.. so I think someone that has been working with ABL for longer time must be really worried about this.

Posted by danielStafford on 09-Dec-2009 12:49

I have constructed an application for our business based on the OERA. We now wish to use the GUI for .Net tools / controls and re-write the front end. I am fairly new to OO principals and patterns but have just started the re-write using an MVC pattern. Having read your posts here, I'm very interested.

If, as a user of our (newly written) application, I have a new order in process, switch to a datagrid view of some other set of updatable orders, and open yet another view of customers / orders, will multiple entity sets represent the context for the work being done? Will there be one instance of the OrdersModel?

Do the BE's in your diagram contain the business logic that the OERA implemented on the appServer or mostly a collection of get/set properties?

Posted by Admin on 12-Dec-2009 13:53

Hi Guillame, You seem not to agree to the model proposed and indeed you are giving relevant remarks and are refering to other patterns. Could you give us your image of a proper pattern for Progress and its OO functions.

Sylvestre

Posted by Admin on 12-Dec-2009 13:56

Hi Thomas, you seem to agree with the principles of this framework. Are you designing apps using Progress/Gui .Net and OO principles, and what pattern have you adopted for developing you business applications ?

Could Mike give us his feedback aswell.

Sylvestre

Posted by Thomas Mercer-Hursh on 12-Dec-2009 15:09

For a preliminary read on my reaction to this pattern and others, see http://www.cintegrity.com/content/Patterns-Managing-Relational-Data-OOABL .   There are whitepapers in process which will develop this topic in more detail.

There is a very important watershed in these patterns between those which focus on holding data in a ProDataSet versus those which focus on a more traditional OO model in which the data and behavior of an individual instance is encapsulated in what I call a PABLO, Plain ABL Object.  The PDS has natural appeal because of its power and 4GLness, but I personally find it presents a number of obstacles to following OOP principles, principles which are more easily preserved with PABLOs.

Which said, if you are going to come down in favor of the PDS, the PPS M-S-E pattern has certainly been thought through thoroughly and has some features about it which preserve ... or should I say restore ... some OO-like patterns which are missing in other approaches.

For me, none of this has anything to do with .NET since none of it has to do with any particular form of UI.  This part of the pattern is about what happens on the server.  To me, what happens in the client should, at most, be reflected in a service class on the server which adapts to the needs of the particular client type.  An ABL GUI for .NET client isn't going to use a pattern like this since it has no real data layer, only a facade to the data source on the server.  From the little I know of ABL GUI for .NET clients, there appears to be a role for PDS and TT as local data sources, just as I think there is a role for them in the data layer in my pattern.

Posted by Admin on 13-Dec-2009 02:55

An ABL GUI for .NET client isn't going to use a pattern like this since it has no real data layer, only a facade to the data source on the server.  From the little I know of ABL GUI for .NET clients, there appears to be a role for PDS and TT as local data sources, just as I think there is a role for them in the data layer in my pattern.

Exactly. In order to get ABL data into a .NET Grid  and automatically synchronise Input Fields you'll need a ProBindingSource. The ProBindingSource can talk to a QUERY, a ProDataset or a BUFFER. Not to an object directly (the Alternative would be binding to an Array of .NET objects which is not attractive at all inside the ABL).

So for a GUI for .NET thin-client there needs to be a TEMP-TABLE based representation of the data. The question is where to build that when the backend does encapsulte data completely in classes and forbits all access to the tables directly (well, there might not even be a table as in TMH's PABLO). I doubt that turning OO wrapped data into an XML representation of data just to turn that into TEMP-TABLEs and ProDatasets on the client is the best choice. I rather tend to send ProDatasets between client and AppServer. When all data is wrapped in objects the responsibility for creating TEMP-TABLEs and ProDatasets to send to the ABL based clients is on the Service Interface.

Posted by guilmori on 21-Dec-2009 10:54

I cannot really disagree with this model since no real details have been revealed yet.

However, it all come down to if you are comfortable with using PDS to represent your data in your OO business model. I am not, PDS and OO are conflicting paradigm, where mixed together result in some kind of complex hybrid model, where some OO features and simplicity are lost.

I must confess that I still have more questions than answers regarding an OO architecture in OE.
My opinion is that OE is not yet ready for a real productive OO architecture, for its lack of performance and features.
For a comparison, take a look at the upcoming .NET entity framework 4.0. OE is FAR behind all this, still arguing on the requirement of collections in the language...

I hope somebody proves me wrong.

Thomas Mercer-Hursh upcoming work looks really promising, and I hope this will trigger great interest in the community, so PSC put more emphasis on the OO "server side", instead of the GUI for .NET....

Posted by Thomas Mercer-Hursh on 21-Dec-2009 11:26

I will say, Guillaume, that there are parts of this pattern which correspond very well with OO thinking ... sort of.  But, I agree with you that there are other parts which don't fit with the way I think OO should work.  So, there is a watershed question which people are going to have to ask themselves.  PDS are very appealing tools in the ABL arsenal and there are certainly ways in which they can make life easier.  But, they are also very relational in concept.  Some are going to say "So what, take the best of both worlds!"  And, I don't know that one should reject them out of hand, just on purist grounds.  If one is going to reject them, it should be for one of two reasons:

1. One has something better; or

2. One can identify ways in which this pattern does not conform to OO design principles and one can identify ways in which there is a significant loss of the benefits one would normally get from OO.

Really, of course, one should have both.

Posted by Phillip Magnay on 21-Dec-2009 11:51

Given that via this pattern, the PDS is an internal representation of data that is encapsulated within an object class, I am still having trouble understanding some of the points being made here.

This is not a perfect pattern - none are.  And there are definitely some improvements that can be made (*are* being made, in fact) and the input from this forum has helped identify some of those improvements.

But I am still don't quite understand which particular OO fundamentals and/or principles are supposedly being compromised by this pattern. Perhaps you guys could be a more specific.

Phil

Posted by guilmori on 21-Dec-2009 12:04

Perhaps you could provide more specifics of this model

Posted by Phillip Magnay on 21-Dec-2009 12:13

OK.  That's one response that identifies no specific OO fundamental/principle being broken by the pattern.  Anybody else?

Posted by Thomas Mercer-Hursh on 21-Dec-2009 12:48

I *will* be going into more detail ... on both the good parts and the parts I don't like ... when I get the whitepapers out.  I am finding a need to prepare some background materials for the discussion, so it is taking me longer than I hoped, but I think the background stuff will be useful.

One of the problems here, Phil, is that there isn't really enough info published by which to make a detailed analysis.  You and I have had enough exchange that I feel like I have some understanding of what you are doing, but the fruits of that discussion have not yet been published.

Which said, I think that there are some valid concerns which one can have even without that detail.  Certainly, a PDS is a very relational view of the world.  OO is not a relational view of the world.  In non-ABL systems, the relational and OO issues are resolved at the boundary in the OR mapping layer.  Pushing that relational view all the way into the business layer seems to me to be a valid suspicion that something other than real OO thinking is going on.  As I have said, one can think that purity isn't required or even optimum, but you have to admit that there is a huge background of thinking about why traditional OO design principles are a Good Thing, so there is a good prima facie reason to question.

To forecast a bit more specifically, there are two points which I could bring forward without getting into a lot of detail:

1. Fundemental to the OO paradigm is encapsulating data and behavior together.  Unless a TT is composed of objects, it is inherently just data. If the data populates an object in order to interact with it, then you have an inherent issue in having two copies of the data, one in the TT and one in the object. You address this issue in M-S-E by having a business entiry object which has behavior and which drills through to the TT to get the data.  That is certainly a better solution than some I have seen and it ends up having some nice OOish things about it, but it does mean that you have violated normalization by having the same attribute available on two different objects (although, yes, I recognize one is not very usable because it requires a key which only the BE possesses) and it means that the data and behavior are only simulated as being together.  I'm not going to say that this alone is a fatal flaw by any means ... indeed, I complement you on how good parts of it end up looking like a solid OO pattern, despite the unusual implementation.  I do think you have to recognize, though, that it is a reason for a strong OO traditionalist to get very nervous.

2. I think that one has to recognize that there is an inherent issue about applying generalization and specialization using a TT and especially a PDS.  If entity data in a superclass is in properties, then a subclass can add properties with no problem.  With a TT, one has to start adding child tables and that certainly complicates creating the BE from that base.  Make it a PDS with two levels and the required data structure can quickly be far from trivial.  Add in the behavior, and it is a bigger problem yet.  Add in some delegates which have their own inheritance tree and it can quickly become a nightmare ... and yet the traditional OO view of this same data is simple, elegant, clean, and has all the benefits of separation of concern one would like to see in an OO model.  I know that you think you have addressed this issue by the use of the Decorator pattern, but to me that is far less clearly and solidly structured than solving the problem with generalization and delegation.  I don't doubt that you make it work and, to the extent that you are creating the code with model-to-code technology, it is even probably pretty reliable, but I can't help but think that there must be a better way since it seems to torture the very simplicity and directness which I would expect to get by going with OO.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 12:50

You do understand, don't you Phil, that it is hard for Guillaume to be specific in his criticism because he doesn't have much detail?

That doesn't exclude him thinking that there are things which are not OOish about this ... starting with sticking with the relational data model in the BL.

Posted by Phillip Magnay on 21-Dec-2009 12:59

Thomas,

Again, you use the phrase: "traditional OO design principles".  What are these principles you are alluding to?  Which of these principles are being commpromised by the pattern?

Phil

Posted by Phillip Magnay on 21-Dec-2009 13:04

There is no relational model in the BL.  There is a PDS being used as an internal representation of data within an object.  I believe that point has been stressed a number of times now.

I do not wish to appear dismissive.  But I hear "you're violating OO principles" in this thread and so I respond,  "OK, which ones?"  And no-one points to any specific principle being violated.  It's a simple question.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 13:09

No need to get prickly.  We have different viewpoints.  Other people will have theirs which will differ from ours.

I'm sorry, but it is hard for me to see that a PDS is not a relational model of the data.  Tables of tuples has nothing to do with the way data is structured in OO thinking.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 13:18

One of the background whitepapers I am doing is exactly on what I perceive to be the design principles to which I refer.  I will be sending it too you soon for a pre-release review.

I think my post was reasonably clear on the issue, within the confines of the limited space (d**m I wish they would fix it so the indents weren't so debilitating).

I know that we don't see eye to eye on these issues, Phil.  This is not a place where either one of us can go point to a reference and prove the other one wrong.  All we can really do is to explain our reasoning and let our "public" decide which approach is better or which reasoning they feel is sounder.

E.g., you think Decorator is fine to use abundantly and to use it as a highly flexible substitute for a more rigid structure built on delegation and generalization.  Whereas, I feel that Decorator exists specifically to provide an object with special behavior on a transient basis and feel that the formalization of delegation and generalization is clearer, more analyzable, more predictable, more stable, and more evolvable.  Doesn't mean I wouldn't ever use Decorator in place of adding new subclasses, especially if the behavior was state-dependent, but I think the more formal approach is better for the general case where the data and behavior are properties of the class throughout its lifetime.  We'll see what others think.

Posted by Phillip Magnay on 21-Dec-2009 13:20

Not prickly - just want to get to the bottom of the critiques.

There is no inherently relational aspect to the classes in the BL.  The PDS is simply an internal representation of data within an object.  That internal data representation could be XML, could be anything.  It just happens to be a PDS.

Posted by Phillip Magnay on 21-Dec-2009 13:24

Look forward to reading the white paper.  But in the meantime...

You can't just give me one generally accepted OO design fundamental/principle that is being compromised?

Posted by Admin on 21-Dec-2009 13:30

There is no inherently relational aspect to the classes in the BL.  The PDS is simply an internal representation of data within an object.  That internal data representation could be XML, could be anything.  It just happens to be a PDS.

Internal... And as such it shouldn't be criticized ...

In the environment of the ABL I do strongly vote for a ProDataset (or anything else that is [temp-]table based) for this purpose. There is by far nothing else in the language that comes close to the capabilities for manipulating such data.

Posted by Phillip Magnay on 21-Dec-2009 13:35

tamhas wrote:

E.g., you think Decorator is fine to use abundantly and to use it as a highly flexible substitute for a more rigid structure built on delegation and generalization.  Whereas, I feel that Decorator exists specifically to provide an object with special behavior on a transient basis and feel that the formalization of delegation and generalization is clearer, more analyzable, more predictable, more stable, and more evolvable.  Doesn't mean I wouldn't ever use Decorator in place of adding new subclasses, especially if the behavior was state-dependent, but I think the more formal approach is better for the general case where the data and behavior are properties of the class throughout its lifetime.  We'll see what others think.

One generally accepted OO principle is: "favor composition over inheritance".  This principle was first highlighted by the GoF and in particular at length by Erich Gamma.  It has been stressed time and time again as one of the most important OO principles to follow when building large systems.  Standard inheritance is also supported by the pattern.  But use of the decorator pattern as a more flexible substitute for standard inheritance is following an OO principle that has been generally accepted for many years.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 13:43

Composition <> Decorator.  Composition is a formal structure.  I readily agree that there are many cases in complex objects where the inheritance tree would become unpleasantly complex and fragile if one tried to do everything with inheritance.  That is why I mention delegation.  There is a role in there for other forms of cooperation and association too.  But, all of those are formal structures with a defined relationship over the lifecycle of the object.  Decorator is ad hoc.  Useful when there are state-related changes in behavior, particularly if the state is temporary and likely to revert prior to persistence.

Even if one were going to use Decorator to replace delegation and generalization, the data structure should still reflect that.  If one mushes all the data for all the subtypes into one table and selectively accesses it via Decorator logic, one has not conformed to bundling data and behavior together.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 13:48

Well, internal is in the eye of the beholder too.  Here, the data in this internal structure is accessible via accessor methods by the BE ... that's where it gets its data.  Representing that data in XML is not a realistic option for efficiently selecting one field from one row.  Moreover, the EntitySets are based on a query on the TT.  Going to do that with XML or is that pretty inherently a relational view?

And, frankly, the use of Decorator isn't just a matter of preference and how one sees the pattern, but a matter of necessity, a necessity created by the fact that there is a PDS in there.

Posted by Phillip Magnay on 21-Dec-2009 13:51

I'm afraid you don't understand the decorator pattern and why it is a absolutely an example of composition.  It is definitely not ad hoc - it absolutely represents a well-defined structure.  It follows a long-accepted OO principle and is in fact used by the java.io.* classes.

Posted by Phillip Magnay on 21-Dec-2009 13:55

tamhas wrote:

Well, internal is in the eye of the beholder too.  Here, the data in this internal structure is accessible via accessor methods by the BE ... that's where it gets its data.  Representing that data in XML is not a realistic option for efficiently selecting one field from one row.  Moreover, the EntitySets are based on a query on the TT.  Going to do that with XML or is that pretty inherently a relational view?

Whether there is an issue of efficiency using PDS over XML is immaterial.  The entity sets could be based on a "query" which navigates through the XML.  There is no relational aspect to the objects in the BL as the PDS is entirely an internal representation.

Posted by Phillip Magnay on 21-Dec-2009 13:59

tamhas wrote:

And, frankly, the use of Decorator isn't just a matter of preference and how one sees the pattern, but a matter of necessity, a necessity created by the fact that there is a PDS in there.

This is incorrect.  The decorator is used to encapsulate PDSs.  But standard inheritance could be also be used no problem - it wouldn't be as flexible or as easy to maintain but standard inheritance would be just fine.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 14:24

Phil, I am giving you credit for the way the BEs and ES *look like* what one would expect from an OO point of view.  I have been and will continue to be very clear about that.

You don't have a problem with the PDS because  you feel that it is an internal structure.  Fine.  I also give you credit for having encapsultated it better than at least most other patterns, if not all.

That doesn't stop me from feeling that there is an archetypical relational structure at the core of this pattern, that this is inherently a questionable thing, and that it forces some approaches on complex objects which are other than what I would choose.  We disagree.  Let it be and let others decide as they will.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 14:26

Standard inheritance is possible on the top level in the PDS ... it gets problematic on the lower level.  I.e., flavors of Order, but not flavors of OrderLine and certainly not multiple subtypes of some OrderLine delegate if all of the data is going to be in one Model.

Posted by guilmori on 21-Dec-2009 14:30

What bothers me (again I am making assumption since the details seem top secret) is the complexity of the extra layer of mapping. ie: database pds objects. What I would like is directly mapping database objects. The O/R mapping should happens once when data is requested from the datasource, not being held in the Business layer, doing the mapping as properties/methods are being accessed.

Posted by Phillip Magnay on 21-Dec-2009 14:31

Standard inheritance is not problematic at all at any level of the PDS.  The preference for the decorator is s simply a choice in favor of greater flexibility and ease of maintenance.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 14:34

Again, Phil, we just have to agree to disagree on this one.  You think it is fine to use it all over the place and I don't.  I have my reasons and you have yours.  I am sure we could both come up with some URLs that support our opinion ... OO principles are like that.  Heck, one can even find people who think that public data members are just ducky.

Nowhere in here am I saying "Don't use this pattern because of this characteristic".  On the contrary, I am giving you lots of kudos for having a well thought through pattern, one that stands out in many ways from the other PDS-based patterns.  All I am attempting to do in my analysis is to make people aware of all sides of the matter so that they can make their own choices.  Some of them are going to say "I can live with that given the overall pluses"; others are going to say "I agree with Phil that this is a fine thing to do".  I absolutely understand that it behooves me to come up with something better.  Some people may end up thinking that I have and others won't.  Such is the nature of opinion.

Posted by Phillip Magnay on 21-Dec-2009 14:37

The mapping does not occur in the BL layer.  It takes place in the DA layer.

I'll ignore your "top secret" quip and ask again: what OO fundamental/principle is being compromised?  I don't mean to be dogmatic - I really want to understand where you think there is a compromise to accepted OO principles.

Posted by Phillip Magnay on 21-Dec-2009 15:10

Thomas,

This is not about credit or kudos.  I want to understand the OO fundamentals/principles that you, Guillaume, etc, claim are being compromised.

But no one here wants to indicate a specific fundamental or principle that is at question  These principles have been documented and accepted for many years. Yet I still do not have one that I can use to analyze the pattern and determine where they may be opportunities for improvement.

I have justified the use of a common pattern (the decorator) (and I do not use it "all over the place" as you claim) by putting forward the well-accepted OO principle that it follows: favoring composition over inheritance.  This is a principle that is long-standing and is reinforced by universally recognized OO experts (eg, Erich Gamma).

I have asked for similar justification from others for suggesting that this pattern is not really OO.  But no such justification has been forthcoming - I've only gotten that it's just a difference of opinion.

You say that it's just a difference of opinion between you and me. But it's not.  My opinion is irrelevant here.  It's a difference of opinion between you and many experienced and well-respected authorities on OO principles and practices.  And these are not just a gaggle of URLs found via Google.  These are well-known authors.

Yes, there can be differences of opinion.  But opinions must have some foundation in real experience or they are nothing but leaps of faith.

Until someone indicates which OO principle(s) this pattern is compromising, any suggestions that it not following OO cannot be taken seriously.

Phil

Posted by guilmori on 21-Dec-2009 15:32

Phillip, as I said, I cannot really comment without full knowledge of the model.
Maybe no OO fundamental/principles are being broken from an external point of view, but how much complexity is added in the internals ? This is what I want to see.
Having to maintain a PDS in the BL, even if it is internal, certainly adds complexity.
Is "KISS" an OO principle ? :-)

Posted by Thomas Mercer-Hursh on 21-Dec-2009 15:57

Understand, Guillaume, that the PDS -> object is not something that happens here in quite the way that I think you expect.  A BE object, i.e., a specific instance of one of the levels represented in the PDS, has the logic for that entity and it has properties for its data, but the properties on the BE reference back to accessors on the Model to obtain the actual data in the row of the TT of the PDS which applies to that specific entity.  There is a key in the accessor which is specific to the BE so it only gets its own data, so while technically, one could say this violated normalization, functionally it doesn't since only the BE will have the correct key.

One of the consequences of this is that BEs are very transitory.  I.e., instead of instantiating all the lines for an order at the time of retrieval and putting them in a collection where they get processed in whatever way they are going to get processed, here the data is just in the model and there start out being no BEs.  If one then has a need to navigate the lines in order, for example, then one would request an Entity Set for that purpose.  The Entity Set is a little odd since all it really has is a pointer to a query in the Model for navigating according to the criteria established when the ES is created.  So, one then asks for the first BE in the set, that BE gets created, processing occurs, and than that BE can be destroyed because the data is back in the PDS.  Thus, navigating sequentially through the lines, one might sequentially create, use, and delete each BE and only one would exist at any one time.  Obviously, if one knew one would be making multiple passes, one would defer the deletion.

Also, there is nothing about the pattern which specifies that the PDS corresponds to the data structure in the DB.  That mapping occurs in the DL.  Obviously, if one has control over the schema, it would be better to make them match, but that is not required by the pattern.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 15:59

I think Guillaume's remark is simply a reflection of the fact that there is obviously a great deal of detail about the implementation of this Model which has not yet been publicly exposed.  Without that detail, it is hard for him to analyze.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 16:06

Perhaps, then, you would like to explain how it works.  Let's take Order, which has subtypes InternalOrder and ExternalOrder, OrderLine which has subtypes ProductOrderLine and ServicesOrderLine, and delegate ShippingMethod, which only applies to ProductOrderLine that has subtypes DropShipShipping and DirectShiping.  Lets say that each unique subtype has one property named for the subtype, e.g., IO1, EO1, POL1, SOL1, DSS1, and DS1.  Tell us what the PDS looks like, how many Models there are, and how the BEs are created appropriate in a sequential search of all lines, regardless of type.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 16:26

I understand, Phil, that you feel that your design decisions are justified and that you think that what you have now is highly evolved and full of many virtues.  To some extent, I agree with you, up to a point.  But, that is not the only view.  You are entitled to think that Decorator is fine to be used freely, even for what is really static decomposition.  I disagree and I am not alone.  Your citing a source which you feel supports your position doesn't invalidate mine.  You think the guideline about composition versus inheritance supports your decision here.  I don't because I don't see Decorator as an appropriate instrument for static decomposition.  Dynamic, sure, but not static.

You feel that the presence of the key in the Model accessor methods means that they don't violate normalization.  I think "only sort of".  For me, it is a negative feature of the pattern.  Others may agree with you, agree with me, or just not care.  I just want to expose issues, pro and con, so that people can make informed choices.

We talked about buffers as an alternative to the accessor methods.  On balance, I suppose I think the accessor methods might be the right choice, but I would like people to be aware of the alternative so they can make their own choice - one, the other, or neither.

You think the PDS doesn't count as relational structure because it is encapsulated.  I think "sort of" because it influences the way that a lot of things happen and it appears to have implications for what is possible or at least easily done.

You think having the data in one place and the logic in the other is fine.  I think it is a violation of encapsulation.  Doesn't mean that is a fatal flaw, but it is something that someone might want to recognize.  Likewise with ES and the query in the Model.  I understand perfectly well why you did it and commend you for the rather OO appearance of BEs and ESs from the outside which is achieved, but I still think it is worth noting so that people can include it in their evaluation.

I will go into this a bit more systematically when I get to the whitepapers, but this is a short list for a start of places where I feel there is an issue relative to what I would consider to be good OO practice.  Not everyone would agree and you obviously don't, but that doesn't invalidate my having the opinion unless I am actually wrong about my understanding of how M-S-E works.  I was certainly wrong at several points in our earlier exchange and needed correction, but otherwise it is perfectly valid for me to have my opinion, especially since I think I have some reasonable basis ... not the same as your reasons, but not a question of my simply not having a clue about OO.

Posted by Phillip Magnay on 21-Dec-2009 18:20

I do not "feel" that these design decisions are justified. I have in fact provided specific justifications rooted in established OO design principles. On the other hand, you've made no mention of any specific OO principle in support of your opinions.

Posted by Thomas Mercer-Hursh on 21-Dec-2009 18:48

I disagree.  My perception is that both our viewpoints are founded on our own perceptions of what commonly accepted OO design principles are.  That these differ is hardly surprising since a bit of trolling about on the web will reveal a diversity of opinion on just about every proposed OO design principle, often varying from it being an absolute to being something one tries to do to being a dubious notion, perhaps because some other principle applies and is more important.  There is also a diversity of opinion on what particular programming practices are and are not examples and violations of those principles.  We aren't gonig to change that by anything we say here.

We can, however, do two things.

1. We can be sure that we have a mutual understanding of what your pattern actually does.  I feel pretty good about that personally, but we will see what mistakes I make when I write it up.  It seems pretty clear that others who have not had the benefit of the exchange we had do not yet participate in that understanding.

2. We can be explicit about our principles and values and how they apply in our evaluation of the pattern.  I plan on doing that in my whitepaper and I hope that you will do the same in whatever further documentation of this pattern you publish.  By being clear about our principles and the reasons why we hold them, readers can make judgements about how mainstream they think the idea is, how important it is, and how much it aligns with their own values, thus providing them with guidance as to how they should interpret our evaluations ... and probably providing some education at the same time.

And, I think it is going to be important to apply these standards to evaluating multiple proposed patterns in this same area.  It is not unlikely that none will be seen as perfect, perhaps by anyone's criteria, and it will be a question of picking what seems best.  For some, that will be on the basis of familiarity or comfort.  For a transitional ABL programmer who has already been making extensive use of PDS, that might well be one of the PDS-based patterns.  For someone with a different background, it might be my PABLO idea.

What I would really like is to define a problem and get it illustrated by sample code for each pattern.  I think that would have enormous learning potential, possibly for all of us.  But, I don't suppose that will happen.

Posted by Phillip Magnay on 21-Dec-2009 19:52

Everyone is entitled to their views.  But this is not post-modernist philosophy where everything is relative and subject to an individual's perception.  There are well-defined, well-understood and well-documented OO design principles that have been established for many years.

These principles include:

1. Design to interfaces
Designing to interfaces, rather to implementation, gives great flexibility in allowing the implementation to vary. This generally leads to loosely coupled classes.

2. Favor composition over inheritance
Rather than relying on inheritance to provide for specialized behaviors for similar classes, consider using object composition. This is often referred to as "Encapsulate the concept that varies" principle.

3. Do one thing, and do one thing well (strong cohesion)
Each class/method should strive to do one thing, and to do one thing well. If a class/method is becoming too large, this is generally a sign that it is trying to do too much, and it may be a good time to refactor. Highly cohesive classes are generally easier to unit test and maintain. This is often referred to as the Single Responsibility Principle


4. Minimize class dependencies (loose coupling)
Classes that limit their dependencies are easier to unit test and maintain.


5. Minimize redundancy
Redundancy (both in code and in concepts) makes the system more difficult to maintain.

6. Open Closed Principle (OCP)
A class should be open for extension but closed for modification.
A common indicator that this is being violated is Select/Case statements or nested if/then/else.
Common patterns to address this are strategy, template, decorator, visitor, bridge, and chain of responsibility.


7. Liskov Substitution Principle (LSP)
Subclasses should be substitutable for their base classes.
Common indicators that this is being violated include deep inheritance trees; clients writing to concrete classes rather than to the interface; select/case or "instance of" being used to determine which concrete implementation has been instantiated.
Common patterns to address this are strategy, template, decorator, visitor, observer, and chain of responsibility.

8. Dependency Inversion Principle (DIP)
Depend upon abstractions, not concrete implementations.
Common indicators that this is being violated is direct dependencies between concrete classes.
Common patterns to address this are class factory, abstract class factory, and builder.

9. Interface Segregation Principle (ISP)
Favor specific interfaces over one general purpose interface.
Common indicators that this is being violated are large class interfaces, many different clients that use only a subset of the interface, ripple effects when changing an interface.
Common patterns to address this are adapter, facade, and proxy.

There are some others but these are the most critical.

Yes, there may be some differences of views at the very edges but at the core there is a strong consensus of opinion built from many years of development experience.  People can choose to recognize this reality and use these principles. Or not. Not is OK.  But if not, then a simple recognition that one is not following established principles is necessary. Moreover, if one is going to pass judgment on specific OO designs without reference to these principles, then people ought to be forgiven for thinking such judgment may be lacking in weight and substance.

Posted by Thomas Mercer-Hursh on 22-Dec-2009 11:14

As it happens, we seem to be thinking along some similar lines here since one of the things I am doing at the moment, as background for my whitepapers that parallel my Rotterdam talk, is to create three reference pieces:

  1. OO Vocabulary - a list of terms and a short definition in case people are not familiar with some of the words.  Not an extended discussion, but just enough to nudge one in the right direction;
  2. OO Patterns - a quick review of the GoF patterns to give people an idea of the principles which they embody and a reference when a pattern name is used somewhere; and
  3. OO Design Principles - a discussion of some of the common principles of OO design and why they are important.

This will be a *lot* shorter than a book.

And, I fully concur with your concluding paragraph.  We just differ in some specifics, like our view of the intent and proper use of Decorator, and in how we interpret the principles in regard to M-S-E, e.g., whether or not a business entity which does not own its own data is a good example of these principles.

Posted by Phillip Magnay on 22-Dec-2009 11:32

Why waste your time on such reference material?  There is so much public domain material out there especially on GoF patterns and OO design principles.

You continue to suggest that my view and use of the decorator is somehow improper or inappropriate or questionable. But my view and use of the decorator is completely supported by established OO principles.  Which principle(s) are you using to make your judgments?

Posted by Thomas Mercer-Hursh on 22-Dec-2009 12:08

I am "wasting my time" because I feel that having reference material directly associated with the documents I am producing will be convenient for the reader.  I am also making them compact so that readers don't have to wade through entire books in order to get a basic idea of what is meant.  While one can find a lot by googling or reading Wikipedia, which I reference, one can easily end up with a lot of discussion or material that is distracting from the core idea, e.g., most of the Wikipedia articles are 90% code, which is likely not to be much help to an ABL programmer.

My issue about Decorator isn't so much a question of "this violates principle XYZ", but that my perception of the intended use of the Decorator pattern is different from yours.  My understanding is that it was never intended to be a substitute for Delegation and Generalization, but rather was intended to provide transient, state-dependent behavior to an already existing class.  To me, Delegation and Generalization solve the problem and do so in such a way that the structure is clearly and formally modelled and the implications are apparent to anyone used to publishing those models.

Were you going to respond to my request to illustrate how one uses temp-tables to model the dependancy and generalization example I gave?

Posted by Phillip Magnay on 22-Dec-2009 12:21

tamhas wrote:


My issue about Decorator isn't so much a question of "this violates principle XYZ", but that my perception of the intended use of the Decorator pattern is different from yours.  My understanding is that it was never intended to be a substitute for Delegation and Generalization, but rather was intended to provide transient, state-dependent behavior to an already existing class.  To me, Delegation and Generalization solve the problem and do so in such a way that the structure is clearly and formally modelled and the implications are apparent to anyone used to publishing those models.

OK.  Your understanding of the decorator and it's intended use/role is incomplete.  It has a clear well-defined structure and can be formally modelled as well as any other structure.

Were you going to respond to my request to illustrate how one uses temp-tables to model the dependancy and generalization example I gave?

Not today.

Posted by Thomas Mercer-Hursh on 22-Dec-2009 12:47

Perhaps, in addition to the description of the use of temp-tables, I should post the UML of how I would model that situation and you should post a model of how you would model it with Decorator.

As for the incompleteness of my understanding, let's take http://en.wikipedia.org/wiki/Decorator_pattern which actually refers to Decorator as an alternative to subclassing.  Note, however, that the motivation for using Decorator for this purpose is when one has multiple extensions which occur in unpredictable ways.  This is the classic "crosstab" problem when one tries to model all variations as a single inheritance tree.  If, however, one delegates portions of the responsibility, one eliminate the crosstab problem by separating the concerns each into their own inheritance tree.  I suggest that when it is possible to resolve such problems at compile time, that is preferable to run time resolution.  I have no problem about using Decorator for state-dependent transitional behavior or for complex combinations which simply won't yield to structured decomposition, but I see it as a less rigorous solution if those conditions don't pertain.

Posted by Phillip Magnay on 22-Dec-2009 12:56

tamhas wrote:

...but I see it as a less rigorous solution if those conditions don't pertain.

By what objective criteria is it a less rigorous?

Posted by Thomas Mercer-Hursh on 22-Dec-2009 13:19

Compile time instead of run time.

Formal, clear relationships in the model with all actual and possible relationships clearly laid out.

Like I say, perhaps we should compare your decorator model for the problem I laid out to one using delegation and generalization.

Posted by Phillip Magnay on 22-Dec-2009 13:27

Compile time instead of run time.

This is not correct. The OO compiler is not going to skip type-checking for a decorator implementation or for standard inheritance.

Formal, clear relationships in the model with all actual and possible relationships clearly laid out.

The decorator can be modelled with all formal relatonships clearly laid out.

So hardly less rigorous.

Posted by Thomas Mercer-Hursh on 22-Dec-2009 13:43

I am not referring to a lack of compile time type checking, but to compile time resolution of structure.

I would like to see an example model.  To illustrate the principle, though, it should be the Decorator equivalent of something where there were at least two or three different candidate for delegation, each with at least two flavors.  Should I do a model the way I would do it and post it and they you can show your version for comparison?

Posted by Phillip Magnay on 22-Dec-2009 13:52

tamhas wrote:

I am not referring to a lack of compile time type checking, but to compile time resolution of structure

Could you explain the difference compile time type checking and "compile time resolution of structure"?  What does the latter provide that the former does not?

I would like to see an example model.  To illustrate the principle, though, it should be the Decorator equivalent of something where there were at least two or three different candidate for delegation, each with at least two flavors.  Should I do a model the way I would do it and post it and they you can show your version for comparison?

Post your model.  But I won't be able to model a comparable version straight away.

Posted by Admin on 22-Dec-2009 13:56

Should I do a model the way I would do it and post it and they you can show your version for comparison?

At this time the audience that has been bombed by posts over the last couple of days is certainly more interested in sample code, not just more abstract modells.

Posted by Phillip Magnay on 22-Dec-2009 14:00

OK.  Then that will take much longer.

Posted by Admin on 22-Dec-2009 14:04

pmagnay schrieb:

OK.  Then that will take much longer.

I do certainly understand that - but my primary intention is not to add any more load on any of you guys.

But it will make the difference much clearer for a broader audience, I guess. And potentially create more interest in these topics!

Posted by Thomas Mercer-Hursh on 22-Dec-2009 16:04

By compile time resolution of structure I mean that delegation links are to a specific object type and inheritance connections are resolved.  Thus, the structure is fixed at compile time.  With Decorator, one has no fixed resolution of what combination of Decorations are going to be added at run time.  If there are a large number of options, the number of combinations of options is potentially large.

I don't have a model ready to post, but I will try to work one up.

Posted by Thomas Mercer-Hursh on 22-Dec-2009 16:07

Speak for yourself, Mike, this part of the audience is very interested in what Decorator looks like in UML with a reasonable problem domain.

I agree that sample code for alternate patterns is a great idea.  It would be particularly lovely if we could agree on a common model and each author publish a sample for that model.  However, I am not optimistic about that happening real quickly.

Posted by Thomas Mercer-Hursh on 23-Dec-2009 17:19

this is not post-modernist philosophy where everything is relative and subject to an individual's perception

I was just re-reading this post to double check that I had covered all your points in the whitepapers which I am finishing up at present and this sentence struck me freshly.  Actually, I think that in the end, OO design principles are a lot closer to philosophy than any of us would be entirely happy about.  When I prowl the web looking for background on some of these ideas I am surprised at just how much diversity of opinion there is on what seem to me to be pretty fundemental design principles.  To be sure, it is sometimes easy to write off a particular author as a "nutter', but even the nutters sometime have a point behind their position.  I'm with you in trying to educate people in what I see are sound principles and to encouage people to follow them in their own code and to use them in judging other code and patterns, but for any given principle, you will get one author saying that it is absolute, another one saying it is a nice idea, but often not achievable, another one saying that it is often misapplied and comes close to being an anti-pattern, and yet another deciding that it is irrelevant because it is just an ingredient in some larger pattern or needs breaking down into subpatterns.

People need to make choices about whom they will believe.

Posted by guilmori on 21-Jan-2010 08:19

Phil,

Is there more details or sample code coming in a near future ?

Posted by Thomas Mercer-Hursh on 21-Jan-2010 13:20

I am working on a model using delegation which I will publish in the not too distant future and I am close to publishing a comparison of Decorator to Generalization and Delegation.  Sample code is a little farther out as I need to finsih the whitepapers corresponding to my PUG Challenge talk first.  I should be back to those soon, though, as I am about to release the third of three basic concept papers and then I will be back to this.

BTW, there has been an interesting shift in the implementation of the M-S-E pattern in terms of how the BE gets access to the data in the model.  If Phil doesn't describe it first, I will be covering both the old and the new way in my whitepaper.

Posted by Thomas Mercer-Hursh on 22-Feb-2010 17:31

In this discussion, there has been reference to the use of the Decorator pattern from the Gang of Four book and its use as an alternative for Generalization.  There are really two separate issues here.  One is what the intended use is of the Decorator pattern as expressed in the book and the other is whether or not the usage in Model-Set-Entty which Phil describes by reference to Decorator is or is not the preferred way of dealing with subtypes in the context of M-S-E.   I felt this was a discussion which needed more detail than would work in an on-line forum, so I have written a whitepaper to present my views on Decorator as given by GoF.  This can be found here http://www.cintegrity.com/content/%E2%80%9CGang-Four%E2%80%9D-Decorator-Pattern-What-Its-Appropriate-Use .  The discussion about Model-Set-Entity in general will be forthcoming later.  I am hoping that Phil will write up his own views on Decorator and publish them.  When he does, I will link from mine to his since I think this exchange of viewpoints has educational value.

This thread is closed