Reference Architecture Reactions

Posted by john on 31-Jul-2006 13:07

We've been talking and writing and presenting about the OpenEdge Reference Architecture for a couple of years now. Some partners are actively using (and varying) it as a basis for new or redesigned development; others still seem to see it as farther in the future for them, or difficult to relate to their current situations and needs. For those of you who've seen and read some of the existing materials (see the PSDN topics under Architecture, and Implementing the OERA, as part of the OpenEdge Principles materials here on PSDN), what ideas do you get from them? What issues do they raise? How do you, or how could you, make use of the concepts and possibly the samples? What directions would you like to see the discussion go in? Lots of questions, but let's get started talking about it...

All Replies

Posted by Alon Blich on 03-Aug-2006 05:01

I didn't get to attend any live presentations, but out of interest I read the documentation and whitepapers when they came out and tried to go through the presentations and web events.

Truthfully, I had a hard time with the documentation. Although I remember there was a paper on Distributed Applications and AppServers that was just beautiful.

I see the point in separating physical storage from the internal view, and of course BL from UI, but then there's the simplicity of writing web apps. Maybe if we had the new GUI it would be a more natural setup, maybe?

I'd love to see other people write on the subject (IMO there are some very talented writers in the community) and offer maybe simpler and different ways of presenting it.

Posted by Admin on 03-Aug-2006 14:18

Hi John,

My company and I were early users of your ADM2/SDO/n-tier concept a few years ago (2001).

Our main software, which is ERP-like, is completely built on this technology and is pretty successful at our customers' production sites.

ADM2 technology has been successful with our dev and project teams, and on the customer side as well.

It has been successful with the dev/project teams because it produces nice-looking, practical screens and because it is a very powerful environment for designing database-oriented apps.

It has been successful on the customer side for roughly the same reasons: nice and practical screens, and because their supplier (us) is able to take their change requests without complaining.

You may not see it, but for us it has been a dramatic change in app design, giving us the ability to focus on things other than app design, which is not our job.

We really think that PSC has done a great job in providing us with this environment.

As a comparison, Java and .NET are far from the productivity that ADM2 provides us.

Having said that, we are a little worried about the next dev environment that you are providing.

1) It is at present incomplete: no decent GUI, which AFAIK will only be provided in 2007. This is a really big gap in your new environment.

2) It seems to be going toward the principles of the main current frameworks (Java and .NET), with object orientation, services, presentation (MVC), etc.

This will make development heavier.

ADM2 had the great advantage of implementing n-tier easily, without unneeded hassle. And given WebClient and its automatic update deployment, what else could you ask for...

3) Following from point 2, the only competitive advantage left for PSC will be the ease of use of its data manipulation verbs and principles, which are easier than JDBC.

Our main points of concern are:

- Providing us with a comprehensive dev environment (with an integrated and up-to-date GUI).

- Making developers' work easier, up to the level that was reached with ADM2, with tight integration between the modeling tool and the generated code, just like ADM2 as I said.

That may not sound demanding, but as a daily user of ADM2/V9 I can say it is a very high bar in terms of productivity, maintainability, and stability.

Best regards

Posted by Alon Blich on 03-Aug-2006 15:41

I've been wondering about the new GUI. Are we still on for late 2007, or somewhere in that area?

I know Bruce has moved on. He's the one who presented it at Exchange and was probably very much involved with it, though I don't know if he was an authority on the subject.

I'd love to hear more about what's planned for the new GUI; especially, will it have a resizing scheme? I think it would be a serious mistake not to include one, or at the very least leave room for it in the future.

I've done lots of work with resizable layouts in the past, and of course there are more than a few frameworks around.

It'll be interesting to see ActiveX controls incorporated, or even better a GUI-only framework, even though it's a dying technology. Just a thought.

Posted by Mike Ormerod on 03-Aug-2006 15:49

Hi Alon

Your response raises a couple of questions for me. Firstly, you comment on having a hard time with the documentation; may I ask which documentation you're referring to in particular? Obviously one of our aims is to make the material we're producing as readable as possible.

My second question relates to simplicity. In your opinion, are concepts such as the OERA and the subsequent implementation samples we've produced too complex? One idea we've been considering internally is putting together some simpler OERA-based examples. So I'd be interested to hear if you think there is a pent-up need for such work!

Thanks for the feedback

Mike

Posted by Alon Blich on 03-Aug-2006 17:00

First off, a lot of people I know are not just new to the OERA and DataSets; even AppServers are new to them.

So it may be a lot to take in.

Maybe, for me, the n-tier poster is too abstract. OK, it fits everything, but it's hard to associate what goes where.

Here's what I pieced together thinking about it.

In a rich client setup, it largely divides into client and service: the UI and context management go into the client, and the service is stateless.

And that makes it a very scalable model.

DataSets are mainly used to pass complex data structures back and forth, and there's the proxy and gateway thing.

And to simplify moving data between the internal view and physical storage.

I read the whitepapers a while ago (not the first version) and went through the web events and presentations, but not the 10.1A docs.

They had a lot of detail but I didn't get the big picture.

But then again it was some time ago. I'll try going over the 10.1A docs this week; maybe it'll look different this time.

English isn't my native language, though I don't think the docs ever had a problem in that area. Besides, don't take me too seriously.

Posted by svi on 04-Aug-2006 08:34

For the questions regarding the New User Interface ...

> I've been wondering about the new GUI. Are we still on for late 2007, or somewhere in that area?

Yes, that is still our target!

> I'd love to hear more about what's planned for the new GUI; especially, will it have a resizing scheme? I think it would be a serious mistake not to include one, or at the very least leave room for it in the future.

If you have not already, please check the presentation INNOV-14 from Exchange 2006. In addition, if you could not go to Las Vegas, please plan to attend PTW and INNOV-14, which includes a couple of demos. The session will have additional information and Q&A, of course.

...Resizable layouts:

Yes, that should be possible.

> It'll be interesting to see ActiveX controls incorporated, or even better a GUI-only framework, even though it's a dying technology. Just a thought.

We are planning to include everything you may need in the box (OpenEdge Architect), including controls, possibly with popular third-party controls as well. I cannot say much else now; we are negotiating with vendors.


Posted by Alon Blich on 04-Aug-2006 10:01

I had a quick read through the 10.1A Getting Started: OpenEdge Reference Architecture doc.

It was surprisingly good; I could relate to the ideas right away. Exactly the overview I was looking for!

Thanks

Posted by Alon Blich on 04-Aug-2006 10:09

That's very exciting to hear, all the work being done around UIs!

Posted by npowers on 04-Aug-2006 10:15

There is so much to say about the ideas raised in this thread that I'm not sure where to start! This is a great conversation to keep going and very helpful to what we are thinking about.

In general, the theme seems to be about productivity and any advantages that we can bring in that area. I would agree with Alon that productivity has, in large part, been elusive over the last few years. It is not that we've taken anything away; it is simply that the environmental expectations have become more complex. OERA is certainly not as productive as the old "FOR EACH..." that made us famous. And, in present form, it is not as easy as ADM2. Basically, that's because we have had to sacrifice, at least temporarily, some productivity to gain flexibility. ADM2 made it easy to get across the client/server bridge, but the ADM2 objects do not work with alternative user interfaces and alternative service-based access methods. We needed something more flexible.

ProDataSets will form the backbone of that flexibility, in that they handle complex datasets quite well, regardless of the intended exposure targets and/or the intended storage requirements. If all you want to do is store things in an OpenEdge database that exactly reflects the logical data structure, and expose them through only an OpenEdge user interface, then they are perhaps overkill, particularly if your integration requirements are few. But I would maintain that time and growing flexibility demands will show that having a flexible data manipulation and marshalling object between storage, logic, and logic exposure will prove very valuable. And having a SOA-based architecture will make it easier to adapt to those changing requirements. But, if you don't need that, we'll still support the "FOR EACH..." and ADM constructs.
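To make that concrete, here is a minimal sketch of what I mean by flexibility of exposure. It assumes a sports-style Customer table; the service procedure fetchCustomers.p and the broker parameters are hypothetical, and in a real application the FILL would of course happen inside the service:

/* one dataset definition, usable locally, across an AppServer
   boundary, or serialized for a non-OpenEdge consumer */
DEFINE TEMP-TABLE ttCustomer NO-UNDO LIKE Customer.
DEFINE DATASET dsCustomer FOR ttCustomer.

DEFINE VARIABLE hServer AS HANDLE NO-UNDO.

CREATE SERVER hServer.
hServer:CONNECT("-AppService asbroker1 -H localhost").

/* the service fills and returns the dataset; the caller neither
   knows nor cares how the data was sourced */
RUN fetchCustomers.p ON hServer (OUTPUT DATASET dsCustomer).

/* the very same object can now be handed to an OpenEdge UI, or
   serialized for a non-OpenEdge consumer */
DATASET dsCustomer:WRITE-XML("file", "customers.xml").

hServer:DISCONNECT().
DELETE OBJECT hServer.

The point is that the calling code does not change when the exposure target or the storage location does.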

Back to productivity: There was a time when productivity was expressed entirely in the language. It would be nice to return completely to that, but I'm not sure that today's application environments will allow it. Going forward, we see it as a combination of advanced language features, good templates and examples, and advanced toolsets. I can envision UI and BL layout tools that automatically generate the necessary interface points (including ProDataSets). I can also envision tools that automatically expose business logic (including ProDataSets) whenever you need to create a new user or integration interface.

Obviously, we don't have all those language features and tools. Yet. That's one of the ideas behind OERA and PSDN. We hope to use these vehicles to help fill in the gaps for work not yet completed, and we hope to gain some beneficial experiences from all of you that will make the eventual tools and features that much stronger. In fact, it is working already. We've made some adjustments to the ProDataSet roadmap based on things learned through PSDN and the community.

As to our competitors, I think that Microsoft has a similar vision - provide the basic capabilities in the language and layer it with tools and metadata. But our language is already richer with a more native understanding of data, and we think that we can outgun them on this, even if we can't match their marketing budget. As to Java, the community as a whole still doesn't seem to understand the productivity thing, although there are some that are waking up to it slowly. So keep in mind that we're keeping one eye firmly on the characteristics of what has made us successful for all these years, and one eye firmly on where the market is going.

One last note and I'll let you get on with more important reading. We are still working on the new UI, and it is designed to incorporate many of the ideas I've expressed above. I don't want to commit to a release date in this forum, but we'll get it to you as fast as we can - it is our number one priority for the next full release.

Posted by aswindel on 05-Aug-2006 16:45

This is a very interesting and useful discussion. As already highlighted in the other responses, developing modern Service-Oriented Business Applications (SOBA) is much more complex, and the challenge going forward is to provide the same productivity in tools for building a SOBA as has been possible with traditional architectures.

We would love to hear some ideas from anybody making the transition to SOBA on how you envision tools helping in the future. How do you think differently when building a SOBA, and how could the tools assist with this? What are the typical steps you have to take to build a SOBA, and how would you like to see tools working together to automate this as much as possible?

This is an area we are focusing a lot of research on at Progress, and any input would be very timely and greatly appreciated.

We have laid a very solid foundation for future tools with OpenEdge Architect (based on Eclipse), which was released as part of OpenEdge 10.1A. This was however just the beginning, and the true value will only be realized once we start delivering additional development tools targeted specifically at SOBA and conforming to the guidelines of the OpenEdge Reference Architecture (OERA).

TIA

Anthony

Posted by Simon de Kraa on 08-Aug-2006 01:58

> We would love to hear some ideas from anybody making the transition to SOBA on how you envision tools helping in the future. How do you think differently when building a SOBA, and how could the tools assist with this? What are the typical steps you have to take to build a SOBA, and how would you like to see tools working together to automate this as much as possible?

What about a "workflow designer" to design the "business services"? From this design the necessary client and server code that implements the service will be automatically generated. The implementation must adhere to "web services standards" (WDSL,SOAP,UDDI,WS-*). The design and development will be separated. Not necessary to "code-first".

I am not looking for an ESB solution that is primarily focused on integrating different applications, although I think the business services/process modeling tools from the ESB (integration) world will tend to move towards application development. I think you should work from the business processes towards the implementation, and not the other way around. I think we should look for an application workflow engine that can be used to define the business services (external/interface) but also the more fine-grained components within the actual service implementation itself (internal/implementation).

Posted by Mike Ormerod on 09-Aug-2006 07:58

What about a "workflow designer" to design the

"business services"? From this design the necessary

client and server code that implements the service

will be automatically generated. The implementation

must adhere to "web services standards"

(WDSL,SOAP,UDDI,WS-*). The design and development

will be separated. Not necessary to "code-first".

I am not looking for a ESB solution that is primarily

focused on integrating different applications.

Although I think the "business services/process

modeling tools" from the ESB tools (integration) will

tend to move towards application development - I

think you should work from the business processes

towards the implementation and not the other way

around. I think we should look for an application

workflow engine that can be used to define the

business services (external/interface) but also the

more fine-grained components within the actual

service implementation itself

(internal/implementation).

The concept of looking at applications from a business process point of view, as opposed to starting from the database up, is something that we have been talking about for the past few years, even more so since the introduction of the OERA. So yes, I think we agree: you really need to look at application development differently than you did a few years ago, and obviously this ultimately means you need tools to support that development paradigm. I'm not in engineering, so I can't make tools promises; I'll leave Ant and others to comment on how we may help from a product direction. But I think it's safe to say that you just can't sit down and start coding these days; you need to think about processes, design, and architecture way before you even sit at a keyboard.

Posted by Simon de Kraa on 09-Aug-2006 08:52

> The concept of looking at applications from a business process point of view, as opposed to starting from the database up, is something that we have been talking about for the past few years

True, we have been talking about this for years...

Although it could work for some types of applications, I think creating (business/web) services from the source code implementation just does not work.

The business services should be designed by the "business consultant", not the developer (and the tools should make this possible).

The service implementation should be generated from the business service definition. You shouldn't have to code the business entity/task/workflow layer plus the servicing layer by hand (although there are some UML-based / repository-based initiatives).

Shouldn't PSC look at the Sonic platform (Orchestration Server), just like Microsoft is incorporating Windows Workflow Foundation into the Visual Studio development environment (pulling the programming model/application workflow engine from MS BizTalk, which will remain their main "integration" product)?

Posted by Mike Ormerod on 10-Aug-2006 16:05

> True, we have been talking about this for years...

> Although it could work for some types of applications, I think creating (business/web) services from the source code implementation just does not work.

> The business services should be designed by the "business consultant", not the developer (and the tools should make this possible).

I think the key thing is that the business process should be designed; whether that is by a 'business analyst' or whoever, the critical point is that they have the business/domain knowledge and can express the processes in an understandable way, preferably using some form of standard notation.

> Shouldn't PSC look at the Sonic platform (Orchestration Server), just like Microsoft is incorporating Windows Workflow Foundation into the Visual Studio development environment (pulling the programming model/application workflow engine from MS BizTalk, which will remain their main "integration" product)?

Whether or not Sonic Orchestration Server would be the right approach, your point is well taken that it would be great if such capabilities were in the product. So if we all keep our fingers crossed, who knows what the future will bring!

Posted by Thomas Mercer-Hursh on 14-Aug-2006 12:19

The reference to Sonic is very much on the mark. While the ESB was stimulated by the need for EAI, everything about the ESB is applicable to building any OERA-compliant application set. To a significant degree, the tools wanted here already exist in the Sonic toolset, along with the technologies to implement them. What we need is a licensing model which recognizes the use of ESB tools for individual applications.

Posted by Mike Ormerod on 15-Aug-2006 04:45

As you are already aware, and as mentioned by Salvador in another thread (http://www.psdn.com/library/thread.jspa?threadID=2193&tstart=0), there are plans afoot to review licensing.

Posted by Admin on 16-Aug-2006 06:32

> This is a very interesting and useful discussion. As already highlighted in the other responses, developing modern Service-Oriented Business Applications (SOBA) is much more complex, and the challenge going forward is to provide the same productivity in tools for building a SOBA as has been possible with traditional architectures.

> We would love to hear some ideas from anybody making the transition to SOBA on how you envision tools helping in the future. How do you think differently when building a SOBA, and how could the tools assist with this? What are the typical steps you have to take to build a SOBA, and how would you like to see tools working together to automate this as much as possible?

> ...

> TIA

> Anthony

Hi Anthony,

The typical issues you're facing when writing a tiered application are:

1) code duplication

1a) One way or the other, you will end up hooking up validation rules twice:

- once at the UI-level (for rich user interfaces)

- once more at the domain level, since you can't trust the caller

1b) A data-driven architecture forces you to code entities/datasets/business documents that map to database tables. A lot of the time you're dealing with the same associations (relationships) between entities (tables). There is no real way to handle this consistently.

2) data access is less flexible/efficient in a tiered environment compared to direct database access (from a developer's perspective)

2a) a database (driver) provides a sophisticated query interface (SQL string commands, for instance). A business service (component) provides a typed (parameterized) service, so it's a self-describing service to the consumer. It's genericity (SQL string commands) versus type safety (business services). Most applications want to provide some kind of generic query interface, without the need to predefine dataset schemas, etc.

2b) a data access layer forces you to write an additional layer of code. Very few people succeed in writing an efficient and flexible data access layer. When you write a typed data access layer which exposes parameterized methods for every operation (transaction or query), you will end up extending the data access layer whenever you want to implement a new business service. So both are still tied together at the coding level.

2c) people will be tempted to use the database schema in their business service documents, since renaming fields and tables and aggregating data seems like just extra work. Or they accept partial database query strings in their business service to increase flexibility, but they forget that this ties all tiers to the physical database schema.

3) cache consistency

When you start caching data you will have to manage that cache. What should happen to the cache when the physical database transaction rolls back? Most frameworks forget about this issue, but this is one of the key functionalities of a database manager. Most applications have to simulate this functionality with an oversimplified model.

4) code expansion

You can visualize a problem using UML diagrams. But there is no way to map this model directly to the implementation model, since at that detailed level new classes and new methods will be introduced to support the tiered model. This is one of the pitfalls when starting off with UML design tools. Some tools promise you that they can track requirement changes all the way down to the code by creating a logical UML model and a physical UML model. Well, it might work during the initial design phase, but after a couple of months the system isn't synchronized anymore with the actual code base (even with UML tools that promise round-trip code engineering).

5) any database?

When writing a componentized application, database access gets fragmented, since the code is no longer following a procedural flow but an object-oriented flow. So while your rewritten and tiered 4GL application might perform well against a Progress database, it might not perform so well against Oracle/MS SQL due to the database access characteristics. Other databases are not so fond of nested queries, which are so easy to do in the 4GL:

FOR EACH Customer NO-LOCK:
    FIND FIRST Orders OF Customer NO-LOCK NO-ERROR.
    FIND Country OF Customer NO-LOCK NO-ERROR.
END.

It's nice that people are starting to talk about workflow orchestration, but I think a well-designed application architecture is more important. Microsoft is working on data integration with .NET 3.0 (DLinq), which rang a bell: isn't that what the 4GL has done for ages? On the other hand, they have extended it with XML support: being able to FOR EACH over an XML document!

So I think it's important to look closely at the 4GL and change its direction from verbose to lean and mean. Focus on declaration of features. The compiler can spit out the dirty details.

Yes, sure, an entity designer is nice, but an experienced developer works faster manipulating the entity definition (an XML file, for instance) directly, doesn't he?

Hope this helps,

Theo.

Posted by Mike Ormerod on 16-Aug-2006 06:49

Just so you know, Ant is currently in Australia at the PTW over there (lucky chap), but I'm sure he will have something to say. I think I need to read it a couple of times before I pass comment.

Mike

Posted by Thomas Mercer-Hursh on 16-Aug-2006 14:26

Whoa, that was such a long and content-filled post that it could have been one of mine!

However, I would like to put a bit of a different spin on some of your observations.

First, let me say that the design of an OERA application up front is likely to be more complex than traditional applications but:

1. This doesn't necessarily translate into more complex development, because strong encapsulation, reuse, and up-front clarity all facilitate the development process; and

2. This complexity arises in large part because of the richness of the goal, i.e., an OERA application has a richer set of capabilities, which necessitates a certain amount of additional complexity; but if one were trying to achieve those same capabilities with a more traditional architecture, that would be even more complex and difficult. This is particularly true when one looks beyond the initial implementation to the evolution of the software over the long haul.

As to your specific points:

> 1a) One way or the other, you will end up hooking up validation rules twice:

I don't think this is quite true because they aren't necessarily the same rules. E.g., if there is a rule that a code must be one of the values in a particular table, at the UI this might be manifest in the form of a control which only presents legal values while in the DA layer it might take the form of doing a table lookup. Logically equivalent, but not the same code.
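To sketch what I mean, using the SalesRep table from the sports database (ttOrder here is a hypothetical temp-table in the object form of an order):

/* UI layer: present only legal values, e.g. by building the
   LIST-ITEMS for a combo-box */
DEFINE VARIABLE cReps AS CHARACTER NO-UNDO.
FOR EACH SalesRep NO-LOCK:
    cReps = cReps + (IF cReps = "" THEN "" ELSE ",") + SalesRep.SalesRep.
END.

/* DA layer: the logically equivalent rule, expressed as a lookup */
IF NOT CAN-FIND(FIRST SalesRep
                WHERE SalesRep.SalesRep = ttOrder.SalesRep) THEN
    RETURN ERROR "Invalid sales rep: " + ttOrder.SalesRep.

Same rule, two manifestations, and neither layer needs to know how the other enforces it.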

> 1b) A data-driven architecture forces you to code entities/datasets/business documents that map to database tables.

I'm not sure what your point is here. The object presented to and used in the business logic and UI layers doesn't necessarily correspond to a database table and may not even be sourced from a database table, especially not directly since the immediate source may be XML off the bus. While one would normally expect that it would be most efficient to have a stored form and an object form be similar, there are a number of ways in which they may not be and this can be desirable. E.g., an order might be a single object in the BL and UI layers, but multiple tables in the DB. Or, an object might have contents such as summary fields or descriptions of codes which are not in the stored form. One of the main points of the DA layer is to isolate the form used in the application in general from the specifics of how or where it is stored and to encapsulate that relationship in one place so that one doesn't have multiple unrelated pieces of code doing the same assembly and disassembly.
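As a sketch of that kind of mapping, assuming sports-style Order and OrderLine tables (the extra fields and the names are illustrative):

/* the object form: one logical "order", including fields that
   exist only in the object form, not in storage */
DEFINE TEMP-TABLE ttOrder NO-UNDO LIKE Order
    FIELD CustName   AS CHARACTER   /* description sourced elsewhere */
    FIELD OrderTotal AS DECIMAL.    /* summary, computed during FILL */
DEFINE TEMP-TABLE ttOrderLine NO-UNDO LIKE OrderLine.

DEFINE DATASET dsOrder FOR ttOrder, ttOrderLine
    DATA-RELATION drLines FOR ttOrder, ttOrderLine
        RELATION-FIELDS (OrderNum, OrderNum).

/* the stored form: the mapping to the physical tables lives here,
   and only here, in the DA layer */
DEFINE DATA-SOURCE srcOrder FOR Order.
DEFINE DATA-SOURCE srcLine  FOR OrderLine.

BUFFER ttOrder:ATTACH-DATA-SOURCE(DATA-SOURCE srcOrder:HANDLE).
BUFFER ttOrderLine:ATTACH-DATA-SOURCE(DATA-SOURCE srcLine:HANDLE).

/* an AFTER-ROW-FILL callback (not shown) would populate CustName
   and OrderTotal as each ttOrder row is assembled */
DATASET dsOrder:FILL().

The BL and UI layers see one order object; only this component knows it came from two tables.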

> 2) data access is less flexible/efficient in a tiered environment compared to direct database access (from a developer's perspective)

I don't know that I would agree that this is true. To be sure, in the most trivial case, a simple FIND statement is very direct compared to the corresponding object reference, but in more complex cases the developer can find that there is a pre-existing component which already does the complex work needed for some new usage and no new development is needed.

> 2a) ... Most applications want to provide some kind of generic query interface, without the need to predefine dataset schemas, etc.

I'm not sure of your point here. There is nothing about having a data access layer that prevents one from having generalized query access to the data. In fact, it can even facilitate it, because the generalized finder created for one need can cover the requirements of some other need.
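For instance, a single finder in the DA layer can accept an abstract filter and apply it at FILL time. A sketch, reusing the dsOrder and srcOrder definitions from the previous sketch (FILL-WHERE-STRING is the standard ProDataSet hook for this):

PROCEDURE fetchOrders:
    DEFINE INPUT  PARAMETER pcWhere AS CHARACTER NO-UNDO.
    DEFINE OUTPUT PARAMETER DATASET FOR dsOrder.

    DATASET dsOrder:EMPTY-DATASET().
    /* the caller supplies only a filter, e.g.
       "WHERE Order.SalesRep = 'BBB'"; the physical query
       stays encapsulated here */
    DATA-SOURCE srcOrder:FILL-WHERE-STRING = pcWhere.
    DATASET dsOrder:FILL().
END PROCEDURE.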

> 2b) a data access layer forces you to write an additional layer of code. Very few people succeed in writing an efficient and flexible data access layer.

Well, then, I guess that we will just have to educate them in how to do better, won't we? Seriously though, any time there is a new paradigm, it takes people a while to figure out how to do it right. Some people learn quickly; some don't. I think that good models and some good writing can help people learn to do this properly and then they will gain the advantage. That people do it poorly is not a criticism of the concept unless it is inherently difficult to implement the concept and I don't believe that is the case here.

> 2c) people will be tempted to use the database schema in their business service documents, since renaming fields and tables and aggregating data seems like just extra work.

For my part, having this separation is an enormous relief since new code can use a sensible naming structure without having to worry about legacy names. The equation of the two is made in one place only. To be sure, I would love to get the schema modernized too, but with this isolation I don't have to refactor the whole application to start using a preferred new naming structure.

> 3) cache consistency

> When you start caching data you will have to manage that cache.

Yup! To be sure, there is a requirement, but then there is also the benefit. Just think how many zillion times a busy application does finds into common code tables and how many potential reads one can save by caching those tables which only change once in a blue moon. And, what if the table in question isn't even on the current server? Besides, an optimistic locking strategy tends to imply caching.
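As a trivial sketch of the payoff (the Country code table and its fields are hypothetical, and Theo's cache-invalidation questions still apply):

DEFINE TEMP-TABLE ttCountry NO-UNDO
    FIELD CountryCode AS CHARACTER
    FIELD CountryName AS CHARACTER
    INDEX idxCode IS PRIMARY UNIQUE CountryCode.
DEFINE VARIABLE lCacheLoaded AS LOGICAL NO-UNDO.

FUNCTION getCountryName RETURNS CHARACTER (pcCode AS CHARACTER):
    /* the first call pays for one pass over the table; every later
       call is a memory read instead of a database hit */
    IF NOT lCacheLoaded THEN DO:
        FOR EACH Country NO-LOCK:
            CREATE ttCountry.
            ASSIGN ttCountry.CountryCode = Country.CountryCode
                   ttCountry.CountryName = Country.CountryName.
        END.
        lCacheLoaded = TRUE.
    END.
    FIND ttCountry WHERE ttCountry.CountryCode = pcCode NO-ERROR.
    RETURN (IF AVAILABLE ttCountry THEN ttCountry.CountryName ELSE ?).
END FUNCTION.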

> 4) code expansion

> You can visualize a problem using UML diagrams. But there is no way to map this model directly to the implementation model, ...

Well, except there is ... it just takes some development work. Ultimately, one should be able to go from UML to ABL 100% using MDA ... someone just needs to do the work to build the MDA transforms. That's my goal.

> Well, it might work during the initial design phase, but after a couple of months the system isn't synchronized anymore with the actual code base

I don't think the fault here is the tool.

> 5) any database?

> When writing a componentized application, database access gets fragmented,

Otherwise known as encapsulated?

> So while your rewritten and tiered 4GL application might perform well against a Progress database, it might not perform so well against Oracle/MS SQL due to the database access characteristics.

If anything, OERA is way ahead of traditional architectures in this respect. For starters, all of the database access is tightly encapsulated so that, if there is some issue about a particular type of query, one can go directly to that location and fix it without having to search all over the application for places that might have the same issue. In the most extreme case, one can have different data access components. Certainly, in my design for a data access layer, virtually every DA component will exist in at least two forms with identical interfaces. One form accesses the database directly and the other accesses the bus via XML to obtain the data from a remote service. The rest of the application can't tell which it is. There is no reason that one couldn't elect, for example, to create a third set which used SQL directly against the Oracle or SQL Server database rather than going through a dataserver.
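In OO terms the idea looks something like this sketch (10.1A-style classes; the names are mine, and each type would live in its own .cls file):

/* ICustomerSource.cls */
INTERFACE ICustomerSource:
    METHOD PUBLIC VOID FetchCustomers
        (INPUT pcFilter AS CHARACTER, OUTPUT DATASET-HANDLE phData).
END INTERFACE.

/* DbCustomerSource.cls: fills the dataset straight from the database */
CLASS DbCustomerSource IMPLEMENTS ICustomerSource:
    METHOD PUBLIC VOID FetchCustomers
        (INPUT pcFilter AS CHARACTER, OUTPUT DATASET-HANDLE phData):
        /* attach data-sources and FILL, as in the earlier sketches */
    END METHOD.
END CLASS.

/* BusCustomerSource.cls: same contract, different plumbing */
CLASS BusCustomerSource IMPLEMENTS ICustomerSource:
    METHOD PUBLIC VOID FetchCustomers
        (INPUT pcFilter AS CHARACTER, OUTPUT DATASET-HANDLE phData):
        /* request the data as XML from a service on the bus and
           READ-XML it into the dataset; the caller cannot tell */
    END METHOD.
END CLASS.

The consumer holds an ICustomerSource reference and never knows which implementation it is talking to.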

> It's nice that people are starting to talk about workflow orchestration, but I think a well-designed application architecture is more important.

Well, I for one think that OERA = "well designed application architecture".

Workflow orchestration isn't just a fancy new tool, but a very important new capability. By externalizing the workflow out of the code and into a business process engine, one makes the application potentially far more nimble. Let me tell you about a case that I knew about from the days of the FusionBus, what one might call the original ESB, only we didn't have the term back then. I think the name of the company was TransCanada Pipeline, and they had all these pipes for shipping oil. There were something like 20 different computers which managed individual sections of the pipeline, some really ancient and not very reliable. Before Fusion, scheduling a shipment involved manually connecting with each computer on the route, which might or might not be running at that moment, and trying to piece together a schedule from available blocks. It could easily take days. With Fusion, they were able to automate the whole process, and it would even reboot non-responsive servers and page tech support when something wouldn't come back up. The business process engine would walk down the line, back up when it couldn't make a connection, and get the whole thing done in minutes. And, when something changed, it was merely an adjustment of rules in the BP engine, not a coding change.

> So I think it's important to look closely at the 4GL and change its direction from verbose to lean and mean. Focus on declaration of features. The compiler can spit out the dirty details.

While I wish it were possible to "clean house" in ABL, e.g., moving all the UI stuff out into a class library with the capability of doing overrides, I'm afraid that it isn't practical unless you can figure out how to:

a) automatically convert all existing code to use the new syntax; and

b) figure out how to convince everyone to use the latest release.

That said, I think the new OO features allow one to create one's own discipline and clean code.

Posted by Mike Ormerod on 17-Aug-2006 04:11

> Whoa, that was such a long and content-filled post that it could have been one of mine! ...

What's even more scary is that I find myself pretty much agreeing with what you've just said!

One interesting point in a lot of this is the complexity of the architecture. I've heard it commented a few times lately that the OERA is too complex, or viewed as being just too difficult. Now, on questioning this point further, I tend to find that it isn't the OERA per se that the person sees as complex, but more the issue of n-tier architecture in general.

So this leads me to ask the obvious question: is the OERA really a difficult concept, or is it the more fundamental issue of n-tier architecture in general? And if so, should we be focusing some of our efforts on showing how an n-tier architecture can be designed and implemented, and then in effect stepping up to the more complete OERA approach? Let's not forget, at the end of the day the OERA is not code; it's a reference architecture/design/blueprint/approach.

One final point is that, given the OERA is a reference, we've never said you have to implement it all, in every case. If, in your particular situation, it makes no logical sense to have separate Data Access and Data Source objects, then don't! But if you feel that over time your application requirements may change, and one day you'll need alternative data sources, then maybe taking the full approach is the right thing.

But this is good stuff, please keep it coming.

Posted by Admin on 17-Aug-2006 10:17

Hi Mike, Thomas,

> One interesting point in a lot of this is the complexity of the architecture. I've heard it commented a few times lately that the OERA is too complex, or viewed as being just too difficult. Now, on questioning this point further, I tend to find that it isn't the OERA per se that the person sees as complex, but more the issue of n-tier architecture in general.

This is exactly the point I was trying to make. Regardless of the OERA (or whatever you want to call it at Progress), I have experienced this issue in several projects and in several programming environments. It looks like developers found the transition from procedural CHUI to event-driven GUI easier than stepping into the n-tier architecture. Perhaps the GUI change is easier "to sell", since the other thing is "just an internal change".

A good example is the adoption of the AppServer environment: if everything "is so easy and self-explanatory with a multi-layered architecture", according to Thomas, why have so few 4GL applications been ported to a full AppServer environment? And why do we have Citrix and processor virtualization nowadays?

I think it's very hard to properly design a layered application architecture that fulfills all of your dreams. And that's what most people tend to aim for, since the new application should:

- support GUI

- support web

- support mobile devices

- support electronic B2B-integration

- support multiple database types

- etc.

since that's what the new architecture promises: everything will become easier to connect. Sure, that's true, but we shouldn't forget that there are different requirements and a single component can't solve everything.

The "order" sample has been mentioned. The "order" in a real system is attached to an entire process and consists of the actual order-entity, delivery schema, stock management, credit management, etc. And a web-order is likely to be treated (trusted) differently from an order entered by a salesperson. A typical order in the OERA-examples are simplified to an order and a orderline. Changing the ordered quantity just means storing the new decimal value, while in a real system lots of other things need to be checked. Than there is the simple design question: do you send the original orderline and the new orderline and store the diff or should the business service accept a "cancel 12 items for orderline 12" request, which is a more abstract design.

So, at the high level everything makes sense, but at the detailed level things get more complex. At the abstract level you're talking about the "order entity"; at the implementation level you have to box the order entity into an efficient unit.

Theo.

Posted by Tim Kuehn on 17-Aug-2006 12:11

Part of the problem with implementing an OERA or n-tier application is how hard it is to develop "layers" of BL functionality. Separating the UI from the BL is a concept that's been around for years, and yet there's still no "easy" way to accomplish it.

Separating one BL layer from another BL layer (such as the backend BL from the front-end or "client" BL), which you need for n-tier, isn't easily accomplished either.

Now, with OO support, that separation may be easier to accomplish, but it'll take time for new code to be written, and for people to get their heads around it.

Before OO support came out, I wrote a manager to handle persistent and super procedure life-cycle and scoping. It's been a huge help with writing effective BL layers and has gotten some good feedback, but it's not generally known.

Tim Kuehn

Posted by Thomas Mercer-Hursh on 17-Aug-2006 16:13

Mike, I think this is exactly it. There is nothing about the OERA that makes it harder than any other model for n-tier; the issue is in the complexity of n-tier itself.

I think a lot of this is not so much the inherent complexity ... although there is certainly some of that ... but more that it requires a particular way of seeing things. I remember years ago when I took an OOAD class and most of the class had been developing in an OO language for 2-5 years while I had done almost no development in an OO language at all, but it was really clear by the time the class was over that one other guy and I got "it" and the rest of the class really didn't. Great instructor, too. We certainly saw the same kind of thing when event-driven programming was introduced into ABL. I and a few others made post after post after post on the PEG explaining to people why they were asking the wrong question and trying to do something they shouldn't be doing in an event driven interface. It went on for years ... still happens occasionally.

N-tier is one of those things one gets or one doesn't and, until one gets it, it seems just horridly complex. But, once one gets it, it seems natural and straightforward.

I suppose that I am more religious than you about advocating that people do it right. Take a short cut and you are asking to be bitten. Maybe not today or tomorrow, but eventually. With good patterns and practice, it isn't really harder or more work to do it right. The tough part is in figuring out what is right.

Posted by Thomas Mercer-Hursh on 17-Aug-2006 16:41

> It looks like developers found the transition from procedural CHUI to event-driven GUI easier than stepping into the n-tier architecture.

See above ... I don't know that they did find it all that much easier. In many ways there is a whole lot less to learn to start doing event-driven GUI work, but gosh, people were mucking it up all over the place, and still are. I think there is more to learn about n-tier than core GUI, i.e., without getting into all the use of ActiveX and such. In particular, you really need to think in terms of components ... whether or not you make them into real classes, they need to behave like classes. And, as I commented above, there are a lot of people writing OO code who don't get how OO should really work.

I have to say, by the way, that one of the things I find reassuring in PSC's current development efforts is that the people working on the OO stuff seem to understand it pretty well and are reasonably religious about it ... enough so to irritate some people who are inclined not to be so religious. Myself, I think it is a good thing.

> why have so few 4GL applications been ported to a full AppServer environment?

I think that this is a very important question, but I think the main culprit here is resources, not the difficulty of understanding. Most in-house end-user staffs that I have ever heard about are struggling to keep ahead of the workload, especially since they are often short-staffed, and there is just no time for architectural modernization. It isn't that they don't understand or can't understand how to use an AppServer ... they have never even had time to read the book, and there is no budget for a development system for them to play with. Which said, there is a lot of AppServer use out there, but when one considers the number of people who are still stuck in V6 ChUI, it isn't surprising that everyone isn't there.

> I think it's very hard to properly design a layered application architecture that fulfills all of your dreams. And that's what most people tend to aim for, since the new application should:
>
> - support GUI
> - support web
> - support mobile devices
> - support electronic B2B-integration
> - support multiple database types
> - etc.
>
> since that's what the new architecture promises: everything will become easier to connect. Sure, that's true, but we shouldn't forget that there are different requirements and a single component can't solve everything.

So, tough set of requirements, eh? But, where do those requirements come from? Do they come from a decision to implement OERA? Or do they come from the real world environment? The latter, of course. OERA, ESB, SOA, etc. aren't things that people created because they were cool, but because they were answers to problems experienced in real world requirements. To be sure, achieving all of those requirements isn't trivial with OERA ... but can you imagine how difficult they are if you are starting off with a traditional V6-era monolithic ChUI architecture and trying to do all those things?

For example, take the need to interact with another data source with a non-Progress database. If you have database statements sprinkled hither and yon throughout your code, you potentially have to examine that entire code body to deal with this data source ... ask someone who has implemented the Oracle data server how much fun that is. If your data access is all concentrated into one layer, then there is a very compact set of code you need to deal with. And, if the need only relates to some of the tables, then you only need to look at those and each one you fix covers every use everywhere in the application. Moreover, if you have implemented SOA, maybe you don't have to do anything except implement a new service.

Same question if you decide that you need a new UI. If you move from a Java client to AJAX, for example, all you need to visit is the UI layer and just the actual View component of that UI layer.

It costs you something up front to create this structure, but you earn this back many times over when you need to evolve it ... not to mention the benefits from being able to respond nimbly to changed business conditions.

Let me say that I readily admit that good OO design and good SOA design are not widely distributed talents ... but, you know, neither is any kind of architectural design talent. People manage to turn out working code without a good architect around, but that doesn't mean that it is great code. It merely means that they have managed to bash things into place. Walking into an OERA world, you are aware of the need for the architect because it is unfamiliar territory, but really you could have used that architect all the way along.

Posted by Mike Ormerod on 18-Aug-2006 02:15

> Part of the problem with implementing an OERA or n-tier application is how hard it is to develop "layers" of BL functionality. Separating the UI from the BL is a concept that's been around for years, and yet there's still no "easy" way to accomplish it.

> Separating one BL layer from another BL layer (such as the backend BL from the front-end or "client" BL), which you need for n-tier, isn't easily accomplished either.

Not wishing to put words in your mouth, and at the same time trying to read between the lines: by 'easy' are you saying there is no real tools support to help with this?

> Now, with OO support, that separation may be easier to accomplish, but it'll take time for new code to be written, and for people to get their heads around it.

> Before OO support came out, I wrote a manager to handle persistent and super procedure life-cycle and scoping. It's been a huge help with writing effective BL layers and has gotten some good feedback, but it's not generally known.

Is this something that would be good for code share?

Posted by Mike Ormerod on 18-Aug-2006 02:24

> Mike, I think this is exactly it. There is nothing about the OERA that makes it harder than any other model for n-tier; the issue is in the complexity of n-tier itself.

You mean we got something right! So putting some effort into material that is aimed at reducing this complexity would be a good thing?

> I think a lot of this is not so much the inherent complexity ... although there is certainly some of that ... but more that it requires a particular way of seeing things. I remember years ago when I took an OOAD class and most of the class had been developing in an OO language for 2-5 years while I had done almost no development in an OO language at all, but it was really clear by the time the class was over that one other guy and I got "it" and the rest of the class really didn't. Great instructor, too. We certainly saw the same kind of thing when event-driven programming was introduced into ABL. I and a few others made post after post after post on the PEG explaining to people why they were asking the wrong question and trying to do something they shouldn't be doing in an event-driven interface. It went on for years ... still happens occasionally.

And there's nothing better, when you're running a training course, than when you suddenly see the light bulbs appear above people's heads as they do suddenly get it!

> N-tier is one of those things one gets or one doesn't and, until one gets it, it seems just horridly complex. But, once one gets it, it seems natural and straightforward.

> I suppose that I am more religious than you about advocating that people do it right. Take a short cut and you are asking to be bitten. Maybe not today or tomorrow, but eventually. With good patterns and practice, it isn't really harder or more work to do it right. The tough part is in figuring out what is right.

It's not that I don't 'believe' in following the whole architecture, but I'm also a pragmatist, and I fully appreciate that in certain situations it won't make sense. But these should be the exception rather than the rule. As we know, the Progress community doesn't much care for being told what to do, so our place is to advise and guide, and to hopefully show the benefits of an approach that we feel is of great benefit to everyone. (We'll try, anyway!)

Posted by Tim Kuehn on 18-Aug-2006 08:05

> So putting some effort into material that is aimed at reducing this complexity would be a good thing?

Reducing complexity is always good and it makes things simpler all around.

> As we know, the Progress community doesn't much care for being told what to do, so our place is to advise and guide, and to hopefully show the benefits of an approach that we feel is of great benefit to everyone.

I prefer to be shown that something works by the compelling evidence of actual code, rather than by a preponderance of "just" white papers.

The ideal implementation example would be a series of relatively simple code examples which can be run against a standard sample (sports?) database. Each of these examples would demonstrate a particular concept that I can run, step through with the debugger, etc., and then apply to my work.

If this series of examples can be aggregated into an application, then all the better.

However, it's harder to learn from a "reference application" if the instructional content is spread out over the code base, since one has to reverse engineer the application's business process from someone else's coding style in order to figure out the various concepts one is supposed to learn. This makes it harder than it should be for me to get on the high side of the learning curve. With "short & sweet" examples, one can rearrange the code to one's own personal coding style and so "see" what's what a lot easier.

Posted by Mike Ormerod on 18-Aug-2006 08:44

> ... If this series of examples can be aggregated into an application, then all the better.

> However, it's harder to learn from a "reference application" if the instructional content is spread out over the code base, since one has to reverse engineer the application's business process from someone else's coding style in order to figure out the various concepts one is supposed to learn. This makes it harder than it should be for me to get on the high side of the learning curve. With "short & sweet" examples, one can rearrange the code to one's own personal coding style and so "see" what's what a lot easier.

Not wishing to build ourselves up too much, but hopefully the AutoEdge example will help with this when we post it. One of the elements of AutoEdge, beyond just the code, the designs, and the supporting docs, is something we've termed livedoc. What this allows you to do is, at any point when running the app, click an icon (across multiple UIs) and a browser opens with context-sensitive info about where you are in the example. It highlights code bits, shows where you are within the OERA, contains the design, and then also has links up to the use case being addressed by that particular function. It also has links to the source files.

It is coming, real soon!

Posted by Admin on 18-Aug-2006 09:33

> As we know, the Progress community doesn't much care for being told what to do, so our place is to advise and guide, and to hopefully show the benefits of an approach that we feel is of great benefit to everyone.

> ...

> However, it's harder to learn from a "reference application" if the instructional content is spread out over the code base, since one has to reverse engineer the application's business process from someone else's coding style in order to figure out the various concepts one is supposed to learn.

The problem is that we might end up with another "Sports database", or Sun's (Java) / Microsoft's (.NET) idea of a Petshop reference implementation. These reference implementations are most of the time an oversimplified representation of the real world: maintenance of a contact person or a task list; how complex can that get?

The problem with most architectural guides and pattern descriptions is the level of detail provided. The real issues are most of the time left to the reader, in order to reach a broader audience. And the devil is in the details...

This reminds me of the early days of the 4GL DataSet: we tried and tried, and gave a lot of feedback, but it was very hard to use the ProDataSet the way we had in mind. We already had lots of experience with the .NET DataSet, so we understood some of the pitfalls. Over time the ProDataSet issues have been fixed, but the initial release was insufficient. That makes me wonder whether Progress finds it more important to release features or to release a solution to a problem.

What worries me a bit is the simplicity of the old 4GL compared to the complexity of the modern ABL. In the old days we defined a QUERY statement, defined a BROWSE widget, connected the two, and voila, you had something that displayed browsable data (oversimplified as well, hehe). The runtime would handle the rest. Take out the BROWSE widget and put in an ActiveX control and you're already facing some of the modern complexities of tiering: explicit or manual data binding, while the 4GL is known for its implicit data binding.

In a tiered environment with a potential network barrier, we have to define temp-tables to abstract the database tables, AppServer components which populate the temp-tables, and a UI query against the temp-table, and then bind that to the browse widget. That's more than a couple of lines of code (see the sketch after this list). This illustrates some of the new challenges:

- the screen designer has to think more carefully about the way data is treated to reduce data loads

- the programmer has to manage the amount of data that flows from server tier to client tier (restrict queries, batch data and transfer data from database to consumer tables)
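Just to show how much of this is now the programmer's job rather than the runtime's, here is a rough client-side sketch; fetchCustomers.p on the AppServer and the broker parameters are hypothetical:

/* the temp-table abstracts the database table on the client */
DEFINE TEMP-TABLE ttCustomer NO-UNDO LIKE Customer.
DEFINE QUERY qCust FOR ttCustomer.
DEFINE BROWSE brCust QUERY qCust
    DISPLAY ttCustomer.CustNum ttCustomer.Name
    WITH 10 DOWN.

DEFINE VARIABLE hServer AS HANDLE NO-UNDO.
CREATE SERVER hServer.
hServer:CONNECT("-AppService asbroker1 -H localhost").

/* the programmer, not the runtime, decides how much data crosses
   the wire: the filter restricts what the server sends back */
RUN fetchCustomers.p ON hServer
    (INPUT "WHERE Customer.Country = 'USA'", OUTPUT TABLE ttCustomer).

/* explicit binding: a UI query against the temp-table, not the db */
OPEN QUERY qCust FOR EACH ttCustomer.
ENABLE brCust.
WAIT-FOR WINDOW-CLOSE OF CURRENT-WINDOW.

Compare that with the two or three lines the old connected, single-tier version needed.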

Now I'm all for componentizing, don't get me wrong! And I have done my fair share in Java/C#, so I understand the non-4GL world. And I like the OO extensions in the ABL (since they provide a way to mix new language syntax with old syntax). But effectively, there is still a lot of code that needs to be written. And we still have to deal with typical ABL workarounds...

I think Progress is in the business of making application development easier than its competitors do. This doesn't mean designing a new ADM/ADM2/Dynamics framework to help you with the core ABL language. The goal should be to achieve as much as possible with expressive lines of code ("FIND Customer. DISPLAY Customer." is very feature-rich). So I think step number one is to enumerate all the typical application use cases and design how they should be addressed in as few lines of code as possible.

At Microsoft they try to solve these use cases with the buzzword "software factories": you create a grammar, a designer, and a compiler for a particular problem area of an application and hook this "software factory" up to the IDE. A problem area could be the design of a data source or a data access component. The designer of this data source software factory stores the query definition and aliasing in an XML file, and a data source compiler generates the necessary target code. This is an interesting idea and feels a bit like ADM templates.

Sorry for the number of words in the reply, but hey, it's raining outside.

Theo.

Posted by Tim Kuehn on 18-Aug-2006 10:15

> I think Progress is in the business of making application development easier than its competitors do.

Exactly.

> This doesn't mean designing a new ADM/ADM2/Dynamics framework to help you with the core ABL language.

There can be a use for such frameworks as a way of 'extending' the language's functionality, but the next step should be to incorporate the framework's functionality within the language, so one doesn't need "tools" and such to implement the same thing in code.

> The goal should be to achieve as much as possible with expressive lines of code ("FIND Customer. DISPLAY Customer." is very feature-rich).

Agreed. For instance, I wrote a query manager to make doing all the 'query' stuff easier and consistent, regardless of the type of query being used or the buffer type (static, dynamic, etc.). It makes creating query filter conditions, joins, etc. nearly a "fill in the blanks" macro-template expansion.

The next step PSC should do is incorporate that functionality into the language.

Tools are great, but I think the real productivity gains come from giving Joe Programmer the ability to write a couple of lines which take care of all the repetitive "mundane" stuff - much like how the original "FIND customer/ DISPLAY customer" compared to the 3GLs of its time.

(The query manager's been submitted to the "code share", and I'm hoping it'll make an appearance in a week or so once it clears review. In the meantime it can be found at http://amduus.com/OpenSrc/SrcLib/QueryMgr/)

Posted by Thomas Mercer-Hursh on 18-Aug-2006 11:46

You mean we got something right

That was exactly my reaction when I saw my first OERA drawing ... look, it finally soaked in!

So putting some effort into material that is aimed at reducing this complexity would be a good thing?

I don't know that one can actually reduce the complexity since it is inherent to the problem space. However, I do think you can put effort into helping people to understand the complexity and into understanding how to decompose problems into meaningful components so that they become not so overwhelming. I also think you can put effort into tools to help manage the complexity. One of the things that was so strong about the Forté environment was the combination of a strong orientation toward services and tools which made it easy to shift around the deployment of those services to address performance. I suspect that this could be done today with the right UML based tools.

and there's nothing better when you're running a training course and you suddenly see the light bulbs appear above people's heads as they do suddenly get it!

And nothing worse than when you get to the end and see a bunch of blank stares. Back in my professorial days I remember one especially painful experience when I had given a lecture on primate intelligence and a student came to me afterwards and wanted me to explain an experiment I had described. I tried, but eventually gave up ... the orangutan had understood it better than she did.

As we know, the Progress community doesn't much care for being told what to do, so our place is to advise and guide ...

I fully understand the dilemma, but I also think that, in order to take a leadership role, one also has to take a stand. I.e., to be sure, one has to make the tools support people doing it their own way, but the supporting materials should be clear about saying that X is bad and Y is good.

Of course, this assumes that one has figured out which is which.

Posted by Thomas Mercer-Hursh on 18-Aug-2006 11:51

Tim, I think that one of the problems with small isolated examples ... despite their obvious advantages for clarity ... is that a useful example really needs to reflect real world complexity. Something that is too simple can easily end up being something that falls apart in practice.

Posted by Thomas Mercer-Hursh on 18-Aug-2006 11:56

I hope that one of the aspects of Autodoc which you will consider is presenting it in "enterprise" form. By that I mean that the packaging ... naming conventions, directory structure, etc. ... is something that is appropriate for an enterprise-class, ERP-scale application environment.

Please tell me that my horrible premonition that this example will still be .p, .w, and .i isn't true and that we will actually have a .cls example from PSC.

Posted by Thomas Mercer-Hursh on 18-Aug-2006 11:59

The problem is that we might end up with another "Sports database" or Sun's (Java)/Microsoft's (.Net) ideas of a Petshop reference implementation. These reference implementations are most of the time an oversimplified representation of the real world

Amen!

While Tim is right that it can be much easier to see something in the context of something simple and isolated, we desperately need a foundation example which is a reasonable reflection of a real world scenario. I've spent some time in recent months working on various foundation issues with the idea of publishing discussions and code and I keep finding myself frustrated at all the things not in the Sports2000 database which make it difficult to create real world examples.

I think Progress is in the business of making application development easier than its competitors.

This is certainly the message that is being put out.

This doesn't mean designing a new ADM/ADM2/Dynamics framework to help you with the core ABL language. The goal should be: achieve as much as possible with expressive lines of code ("FIND Customer. DISPLAY Customer" is very feature rich). So I think step number one is to enumerate all the typical application use cases and design how they should be addressed in as few lines of code as possible.

I think I will have to differ with you here. Let me say up front that I think there was a lost opportunity in the move to V7 when so much was added to the language instead of to a framework, but that is water under the bridge (or over it, depending on how hard it is raining!) and, like it or not, there isn't much hope for redesigning the language at this point. Which said, I do think there is a lot of potential in the OO extensions for writing code which at the least tightly encapsulates the complexities and, at best, might even eliminate some of it.

That said, I do think there is a place for frameworks, but the big problem with the historical frameworks is that they are too rigid and specific. They are very much a "do it my way or not at all" construct and the tools that go with them have no utility unless one commits to the whole program. What we need are frameworks and libraries from which people can draw what they need and in which it is easy and straightforward to customize to suit local needs and tastes without creating a barrier to upgrade.

This is what is horridly wrong with the 10.1A implementation of T4BL ... it is very rigidly oriented at a particular structure and provides essentially no options for doing anything else. Since that structure is one I disagree with, it means the tool is useless.

But, I think there is enormous potential for model-driven development techniques here, as long as everything is open and available for customization.

Posted by Tim Kuehn on 18-Aug-2006 12:18

Tim, I think that one of the problems with small isolated examples ... despite their obvious advantages for clarity ... is that a useful example really needs to reflect real world complexity. Something that is too simple can easily end up being something that falls apart in practice.

I agree - but the other side of it is that if it's too complicated, it's too hard to "get" what's being discussed without a significant investment of time and effort. If it's too hard to get through a "large, detailed example of everything but the kitchen sink" demo application, the student won't get too far and the material won't have accomplished its objective.

There need to be simple cases to illustrate the pattern one expects to see for specific functionality. These simple cases then need to be aggregated into larger structures to show how everything interacts and works together.

The "simple" cases provide the details of the low level functionality, while the aggregate structures illustrate how to get the higher-level functionality and how to build 'real world' applications.

Posted by Thomas Mercer-Hursh on 18-Aug-2006 12:29

I think the way that one strikes a balance is to design a reference application that:

1) contains enough complexity in its schema to reflect real world issues; and

2) reflects this complexity in the portions of the application which are implemented.

I.e., one doesn't need to have a reference application that covers the range of an entire ERP, but the pieces that are there should be strong enough that they wouldn't be out of place in a full ERP.

With that baseline in place, one can then focus in on specific issues and one doesn't need to look at everything every time. One can say, "let's look at an example of a set of data access layer components" and focus in on one or two sets of these. One might be very simple; something like a code table. The other might be something more interesting and reflect the kind of complexity one gets with an order or invoice.

And, it needs to be packaged appropriately. None of this putting the example in the main directory stuff. We have good models from the OO world about how to package things ... let's start learning from them.

Posted by Admin on 18-Aug-2006 12:53

This doesn't mean designing a new ADM/ADM2/Dynamics framework to help you with the core ABL language. The goal should be: achieve as much as possible with expressive lines of code ("FIND Customer. DISPLAY Customer" is very feature rich). So I think step number one is to enumerate all the typical application use cases and design how they should be addressed in as few lines of code as possible.

I think I will have to differ with you here. Let me say up front that I think there was a lost opportunity in the move to V7 when so much was added to the language instead of to a framework, but that is water under the bridge (or over it, depending on how hard it is raining!) and, like it or not, there isn't much hope for redesigning the language at this point.

Well, the OO-extension approach shows it's possible to extend the current programming environment with a dedicated and streamlined grammar for a particular problem. The runtime building blocks are there, so it's up to the compiler to translate the grammar into executable code. Why continue defining a temp-table the way we have to right now? You can imagine a new grammar to define a temp-table, a dataset and some validation rules, which would be compiled to a standalone r-code object. You would then reference this definition in your code instead of including it. This is just one example of moving forward, and it makes tooling easier (an entity designer). You can imagine a data-driven grammar for defining data access objects. If you don't want to use the new stuff, use the traditional .p and .w files, similar to .cls files (and Microsoft .Net CLR interoperability).
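Purely to illustrate the idea (none of this syntax exists today), such a definition grammar might look something like:

    /* imaginary syntax, compiled once into its own r-code object */
    DEFINE ENTITY Order
        FIELD OrderNum   AS INTEGER   KEY
        FIELD StatusCode AS CHARACTER
        VALIDATE StatusCode MUST-EXIST-IN Status
    END ENTITY.

    /* ... and referenced, rather than included, elsewhere */
    USING ENTITY Order.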

What we need are frameworks and libraries from which people can draw what they need and in which it is easy and straightforward to customize to suit local needs and tastes without creating a barrier to upgrade.

That's basically the problem with the ABL today: everything gets included in the language instead of creating a simple language with an extensive library (the Java grammar is relatively small, but the Java libraries are overwhelming).

This is what is horridly wrong with the 10.1A implementation of T4BL ... it is very rigidly oriented at a particular structure and provides essentially no options for doing anything else. Since that structure is one I disagree with, it means the tool is useless.

OK, so you want a domain model instead of a data-driven model. I don't think the OO extensions are designed to support real domain classes, since property access is probably too slow for that. There is just one valid ABL model at the moment and that's the temp-table model: data separated from behavior, for the internal object model as well as the external object model...

Theo.

Posted by Simon de Kraa on 18-Aug-2006 13:06

As you are already aware, and as mentioned by Salvador in another thread (http://www.psdn.com/library/thread.jspa?threadID=2193&tstart=0), there are plans afoot to review licensing.

(Sorry to go back a few messages...)

I don't think Salvador is talking about Sonic licensing...

I would love to see a T4BL for building and managing business process flows. This could be (based on) the Sonic Orchestration product.

But did you have a look at the Progress Software Product Price List for Sonic Software lately? Sonic Orchestration Server is positioned as an ESB solution; it is not positioned for application development. But I think this is just a matter of time...

I did some quick tests with Sonic Workbench 7.0 and I am very enthusiastic. Examples please!

Sonic Evaluation Kit:

http://www.psdn.com/library/entry.jspa?externalID=1681&categoryID=89

Upcoming Webinar:

Designing and Deploying SOA Applications on Sonic ESB for the OpenEdge Developer: Thursday, August 31, 2006

http://www.psdn.com/library/entry.jspa?entryID=1579

Posted by Thomas Mercer-Hursh on 18-Aug-2006 13:22

Why continue defining a temp-table the way we have to right now? You can imagine a new grammar to define a temp-table, a dataset and some validation rules, which would be compiled to a standalone r-code object. You would then reference this definition in your code instead of including it.

Do you find something onerous about the language for defining a temp-table? I don't. My problem with temp-tables pre-10.1A was that, for many uses, one didn't have much choice other than the obnoxious practice of defining them in .i files so that identical definitions could be used in all the files that referenced them. But with 10.1A, one defines an object and then references that object everywhere, and the temp-table is fully encapsulated. See the discussion and code on collection classes and singletons on my download page at http://www.cintegrity.com/downloads.html
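A minimal sketch of that kind of encapsulation in 10.1A class syntax (my illustration, not code from that page; all names invented):

    CLASS CustomerSet:
        /* the temp-table is private to the class; callers never see it */
        DEFINE PRIVATE TEMP-TABLE ttCustomer NO-UNDO
            FIELD CustNum  AS INTEGER
            FIELD CustName AS CHARACTER
            INDEX pkCust IS PRIMARY UNIQUE CustNum.

        METHOD PUBLIC VOID Add (piCustNum AS INTEGER, pcName AS CHARACTER):
            CREATE ttCustomer.
            ASSIGN ttCustomer.CustNum  = piCustNum
                   ttCustomer.CustName = pcName.
        END METHOD.

        METHOD PUBLIC CHARACTER NameOf (piCustNum AS INTEGER):
            FIND ttCustomer WHERE ttCustomer.CustNum = piCustNum NO-ERROR.
            RETURN (IF AVAILABLE ttCustomer THEN ttCustomer.CustName ELSE ?).
        END METHOD.
    END CLASS.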

That's basically the problem with the ABL today: everything gets included in the language instead of creating a simple language with an extensive library (the Java grammar is relatively small, but the Java libraries are overwhelming).

In the abstract, I certainly agree with you. I have never seen a more beautiful language than TOOL, the Forté OO4GL with something like 64 keywords plus SQL and everything else in a library. But, that isn't really a choice. Starting with the current ABL, all we can really do is to add to the number of keywords, not subtract from them meaningfully.

OK, so you want a domain model instead of a data-driven model. I don't think the OO extensions are designed to support real domain classes, since property access is probably too slow for that. There is just one valid ABL model at the moment and that's the temp-table model: data separated from behavior, for the internal object model as well as the external object model...

What evidence do you have that accessor methods are slow? Especially, too slow? Myself, I am finding 10.1A to have a very creditable set of OO extensions ... not perfect and complete in the first release, of course, but 10.1B should be going to beta soon and that should help round things out. But, even now I think there is enough there to get real work done.

Posted by Thomas Mercer-Hursh on 18-Aug-2006 13:27

I don't think Salvador is talking about Sonic licensing...

He wasn't ... but I had good interactions with all the right people at Exchange specifically about Sonic ... maybe I should post that whitepaper I wrote ... and across the board I got the message that, not just for Sonic, but relative to all the PSC products there is a very strong interest in arriving at a licensing model that will allow us ABL types to make use of the neat new products they have added to the stable. This is a non-trivial problem since one doesn't want to throw away revenue from the contexts that can easily justify the current price of Sonic based on its value to the problem space, but I felt very good that they were thinking about it.

Posted by Thomas Mercer-Hursh on 18-Aug-2006 13:35

OK, I put up the whitepaper. You will find it at http://www.cintegrity.com/PDF/ESBAsAnApplicationArchitecture.pdf

Posted by Admin on 18-Aug-2006 13:52

What evidence do you have that accessor methods are slow? Especially, too slow? Myself, I am finding 10.1A to have a very creditable set of OO extensions ... not perfect and complete in the first release, of course, but 10.1B should be going to beta soon and that should help round things out. But, even now I think there is enough there to get real work done.

I have a feeling the runtime is not designed to instantiate several hundred or thousand OO class instances to represent entities. I assume that a class instance is like a persistent procedure deep down in the runtime core, so I think the more class instances you have, the slower the runtime will become. I think temp-tables are designed for managing data. But you're right, this is my assumption.

I guess you do have a service facade running on an AppServer and you do want to marshal your domain entities sooner or later via temp-tables to the client as your data transfer object, right?

Theo.

Posted by Tim Kuehn on 18-Aug-2006 13:54

I have a feeling the runtime is not designed to instantiate several hundred or thousand OO class instances to represent entities

Someone on the main PEG list has admitted to instantiating something around 10K persistent procedures as a matter of practice, and said performance was still pretty good in his application.

Posted by Thomas Mercer-Hursh on 18-Aug-2006 14:09

As Tim notes, there has been some testing on the PEG, and there are approaches where one does run into performance issues, but I think there is also indication that there are design approaches which don't point in that direction. In fact, as I recall, it is tens of thousands of temp-tables which tends to produce the problem.

Of course, what we really need is either a true multi-threaded client or at least a highly performant way for multiple sessions to interact and then we can spread the load!

I guess you do have a service facade running on an AppServer and you do want to marshal your domain entities sooner or later via temp-tables to the client as your data transfer object, right?

Actually, at this point I am largely ignoring the UI and thinking mostly in terms of AJAX clients, so I wouldn't be sending out temp-tables in any case. And, if I sent something, it would more likely be a ProDataSet.

Posted by Simon de Kraa on 19-Aug-2006 03:13

OK, I put up the whitepaper. You will find it at http://www.cintegrity.com/PDF/ESBAsAnApplicationArchitecture.pdf

Yes, I couldn't agree more.

I must admit I only heard about Microsoft Windows Workflow Foundation (WWF) a week or so ago, but isn't this exactly the same case?

I guess Microsoft will face the same licensing issues for WWF/BizTalk as PSC will have with "OpenEdge Workflow Foundation"/Sonic ESB?

So how about the licensing model for WWF?

Windows Workflow Foundation

http://msdn.microsoft.com/workflow/

"Windows Workflow Foundation is the programming model, engine and tools for quickly building workflow enabled applications on Windows. It consists of a Microsoft .NET Framework version 3.0 (formerly WinFX) namespace, an in-process workflow engine, and designers for Visual Studio 2005."

Posted by Thomas Mercer-Hursh on 19-Aug-2006 12:51

I haven't looked at WWF, but just from the description it sounds very primitive compared to Sonic Orchestration Server.

Posted by Simon de Kraa on 19-Aug-2006 16:44

I haven't looked at WWF, but just from the description it sounds very primitive compared to Sonic Orchestration Server.

That is not the point. The point is that Microsoft is pulling the workflow engine from Microsoft BizTalk(*) and making it available in Microsoft Visual Studio. And I think Progress should follow the same path. This is the "natural evolvement" (whatever this might be) of the OERA.

Don't know about licensing. Microsoft WWF in Microsoft Visual Studio is probably geared towards application development (single-app) and in Microsoft BizTalk towards application integration (multi-app). Maybe Microsoft WWF in Microsoft BizTalk can better handle heavy load, has more adapters, better monitoring tools, that sort of thing... Don't know, I should probably install the beta in order to find out...

(*) Not exactly true if I understand correctly; in future versions of Microsoft BizTalk the Microsoft WWF will replace the current Microsoft BizTalk orchestration engine.

Posted by Thomas Mercer-Hursh on 19-Aug-2006 18:08

So, if we can get reasonable access to Orchestration Server for development ... and I think it is in the Integration Workbench ... and reasonable pricing for small scale deployment, we should be in great shape with the best of breed product. I think PSC wants to make it happen.

Posted by Admin on 20-Aug-2006 04:24

OK, I put up the whitepaper. You will find it at http://www.cintegrity.com/PDF/ESBAsAnApplicationArchitecture.pdf

From that document:

"...

Where We Are Today

Therefore, I have refocused my business on the problem of transforming legacy applications into fully modern, OERA3 compliant, ESB/SOA enabled architectures. This will involve:

1. Analysis tools to extract information from existing ABL applications;
2. Constructing UML models from this information that completely describe the application;
3. Revision of the UML to better conform to modern architecture; and
4. Generation of the complete new application through the techniques of Model-Driven Architecture (MDA).

As a part of this effort, I will have to create an underlying framework for use by the generated applications and I will have to create the template structures to use in generation.

..."

Wow, and that's where we are today? Perhaps we're getting a bit off topic here, but any idea how complex it will be to describe the entire application with UML diagrams? Sure, static class diagrams are easy to set up, but putting in the dynamics will make it very, very complex. Some UML evangelists even say: "throw away the sequence diagrams when you have the code". Sure, you can reverse engineer sequence diagrams, but these are hardly readable.

And generating the new application through the techniques of MDA: I guess you're very, very optimistic (if you want to achieve that goal between now and the next couple of years). These are the modern CASE tools...

Theo.

Posted by Thomas Mercer-Hursh on 20-Aug-2006 12:30

One isn't limited to "diagrams". In fact, in a recent exchange with Phil Magnay at Progress, in which we were talking about the problem of creating a class diagram covering hundreds of classes, he remarked that in his own work he tends to use the diagram itself for validation up to a point and then throws the diagram away in favor of the tabular form.

In addition to the sorts of UML diagrams with which you are familiar, there is also Object Constraint Language and Action Language. The former is a language in which to describe constraints (duh?) and the latter is for describing algorithms.

I recognize that some folks throw away the model after they have the code, but that is because they are only using the model to create a skeleton and expecting to do the "real" stuff by hand writing code. There are also some people who are insistent on being able to "round trip", again because they are only really comfortable doing detail work in the code itself. My orientation is to be model driven and to do everything in the model or the MDA transforms. That way one can remain platform independent and incorporate changes extremely rapidly and consistently.

Check out Mellor and Balcer, Executable UML: A Foundation for Model-Driven Architecture as one view on really focusing on the model.

Posted by Mike Ormerod on 21-Aug-2006 02:18

Please tell me that my horrible premonition that this example will still be .p, .w, and .i isn't true and that we will actually have a .cls example from PSC.

Looks like dreams/nightmares* (delete as applicable) can come true!

The first version of the example app to be published will indeed be procedural, not class-based. But don't forget this is the first version. Our intention is that this is an ongoing body of work that will be added to with further releases, which will add class-based examples, plus other features. So if there are specific features or areas you'd like to see covered in the example, let us know. (Maybe I should start a thread.)

Posted by Mike Ormerod on 21-Aug-2006 02:32

I think the way that one strikes a balance is to design a reference application that: 1) contains enough complexity in its schema to reflect real world issues; and 2) reflects this complexity in the portions of the application which are implemented.

I.e., one doesn't need to have a reference application that covers the range of an entire ERP, but the pieces that are there should be strong enough that they wouldn't be out of place in a full ERP.

With that baseline in place, one can then focus in on specific issues and one doesn't need to look at everything every time. One can say, "let's look at an example of a set of data access layer components" and focus in on one or two sets of these. One might be very simple; something like a code table. The other might be something more interesting and reflect the kind of complexity one gets with an order or invoice.

And, it needs to be packaged appropriately. None of this putting the example in the main directory stuff. We have good models from the OO world about how to package things ... let's start learning from them.

This is certainly our aim, but we are also at the first step. As the last few postings have commented, it's all about balance. I certainly think it's fair to say that as an example AutoEdge contains more than sports. It is OERA based, it fully utilizes ProDataSets, it has multiple UIs (GUI, HTML, .NET, Pocket PC) against a single set of server code, messaging using both MQ and ESB (with transformation and content-based routing), and web services. In addition to the code, as I've said before, we will also include the specs, the designs, and supporting documentation, plus an online context-sensitive version known as LiveDoc.

The business story created is hopefully one that can be applied across any vertical.

Are the business processes simple? Maybe. Too simple? Let us know!

Bear in mind, this is probably the first real example since sports on-line, and as I say it is a first step. So if you strongly feel there are things missing, tell us. But also, if there are things you think are great, that would be good too.

Posted by Mike Ormerod on 21-Aug-2006 02:38

(Sorry to go back a few messages...)

I don't think Salvador is talking about Sonic licensing...

No, probably not.

I did some quick tests with Sonic Workbench 7.0 and I am very enthusiastic. Examples please!

The example application uses Sonic 7.0, so hopefully this will provide some examples for you.

Posted by Admin on 21-Aug-2006 06:45

then he throws the diagram away in favor of the tabular form.

A diagram helps you to verify the model. And a complex model will remain complex when you throw away the graphical representation.

In addition to the sorts of UML diagrams with which you are familiar, there is also Object Constraint Language and Action Language.

I'm also familiar with OCL: we tried to apply this in 2001.

There are also some people who are insistent on being able to "round trip", again because they are only really comfortable doing detail work in the code itself.

see http://www-128.ibm.com/developerworks/rational/library/3100.html

My orientation is to be model driven and to do everything in the model or the MDA transforms.

So why do you still care about the ABL?

That way one can remain platform independent and incorporate changes extremely rapidly and consistently.

Sure, but I think this approach is not usable outside universities or very specific application aspects. I can see you haven't tried to apply it to the front tier yet! Don't underestimate the UI-to-business-logic glue.

Check out Mellor and Balcer, Executable UML: A Foundation for Model-Driven Architecture as one view on really focusing on the model.

Anyway, in theory it's great stuff, in reality it's (still) too abstract for most developers and proper supporting tools are missing.

Theo.

Posted by Admin on 21-Aug-2006 07:07

This is certainly our aim, but we are also at the first step. As the last few postings have commented, it's all about balance. I certainly think it's fair to say that as an example AutoEdge contains more than sports. It is OERA based, it fully utilizes ProDataSets, it has multiple UIs (GUI, HTML, .NET, Pocket PC) against a single set of server code, messaging using both MQ and ESB (with transformation and content-based routing), and web services.

All these technologies are very nice, but also try to address the highly normalized database schema. That will illustrate how to deal with code/description pairs in a ProDataSet. The order table has, for instance, a status code "S", and you have to display the language-dependent status description "Ready to ship". Of course there are more relationships like this in an order entity. It will be interesting to see whether you're going to model this aspect as an additional status table in the ProDataSet, as an additional virtual field in the order temp-table, or as a separate lookup ProDataSet. And when the solution is anything other than a virtual field in the same table, the UI tier has to know the data-relation between the order status and the status temp-table.
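To make the first of those options concrete, a minimal sketch of the status lookup modeled as an additional table in the ProDataSet (field names invented):

    DEFINE TEMP-TABLE ttOrder NO-UNDO
        FIELD OrderNum   AS INTEGER
        FIELD StatusCode AS CHARACTER.    /* e.g. "S" */

    DEFINE TEMP-TABLE ttStatus NO-UNDO
        FIELD StatusCode AS CHARACTER
        FIELD StatusDesc AS CHARACTER.    /* language dependent, e.g. "Ready to ship" */

    DEFINE DATASET dsOrder FOR ttOrder, ttStatus
        DATA-RELATION drStatus FOR ttOrder, ttStatus
            RELATION-FIELDS (StatusCode, StatusCode).

With this shape the UI tier does indeed have to know the relation in order to find the description for a given order row.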

Entering a valid status code is another aspect: the user should be able to pick a code from a list, which can be represented as a dropdown list. Somehow you "re-use" the relationship information, but now for a different goal: fetch the entire list of legal values, connected to a particular field.

During validation you will have to use the same list.

Theo.

Posted by Mike Ormerod on 21-Aug-2006 07:19

Entering a valid status code is another aspect: the user should be able to pick a code from a list, which can be represented as a dropdown list. Somehow you "re-use" the relationship information, but now for a different goal: fetch the entire list of legal values, connected to a particular field.

During validation you will have to use the same list.

Theo.

One aspect of the work is also, in some instances, to show how the same thing can be achieved in different ways, which you wouldn't necessarily do if you were building the application for "real", but which is a viable option as a teaching aid. So you will find that in the code we may take different approaches. Having said that, there will of course be ways and solutions that we haven't covered. So here's an opportunity: it would be great for people to take what we post and, if they feel there is an alternative way or a 'better way', feel free to change it and let us have it back so we can post it.

By no means are we trying to give the impression that what we post is the only way to build an OERA application; it's just one or more examples of how it could be achieved. I'm sure there will be lots of debate about what we've done, and how wrong we've got it, but we expect that. The cool thing is that there will at least be an example in the community that everyone can look at, comment on, and hopefully contribute to, to improve it.

Posted by Thomas Mercer-Hursh on 21-Aug-2006 12:41

Do you have a prediction for how long we will have to wait for a class-based example? Another year?

Posted by Thomas Mercer-Hursh on 21-Aug-2006 12:44

For a quick description, this certainly sounds like an improvement ... but the proof, of course, is in what is actually done. It sounds like you have a number of technologies in there ... although missing the really big one from 10.1A ... but I hope the business side is complex enough to make it more serious.

Posted by Thomas Mercer-Hursh on 21-Aug-2006 13:20

see http://www-128.ibm.com/developerworks/rational/library/3100.html

Something in particular in that article you wanted to point me at?

So why do you still care about the ABL?

Because I still have to create the framework and I still have to create the transformations and I would much rather do that in a 4GL than in a 3GL.

Sure, but I think this approach is not usable outside universities or very specific application aspects.

I don't expect to apply my work to missile guidance systems. I would think that a large part of the people who need this kind of transformation are people with some flavor of ERP. I don't pretend that there won't be additional work required to step outside of that domain.

Anyway, in theory it's great stuff, in reality it's (still) too abstract for most developers and proper supporting tools are missing.

Well, the developers might have to catch up. This is an area which is moving rapidly. Conclusions which were valid two years ago aren't valid today and those valid today won't be true two years from now.

In particular, recognize that, while 100% generation is the goal, 90% generation with the ability to insert handwritten pieces in the places where they are needed, and to regenerate the entire system when there is a change in the model, is still an enormous gain in productivity, flexibility, and predictability. I'm not expecting to get to 100% next week, but that doesn't keep it from being a goal. The tools may not be there for ABL yet, but that doesn't mean that the generic tools aren't there. E.g., there is nothing language-specific about the MDA tools in Enterprise Architect.

Posted by Thomas Mercer-Hursh on 21-Aug-2006 13:24

Entering a valid status code is another aspect: the user should be able to pick a code from a list, which can be represented as a dropdown list. Somehow you "re-use" the relationship information, but now for a different goal: fetch the entire list of legal values, connected to a particular field.

I think this is one of the kinds of things that can be very nicely encapsulated in an OO model.

Posted by Thomas Mercer-Hursh on 21-Aug-2006 13:26

it would be great for people to take what we post and, if they feel there is an alternative way or a 'better way', feel free to change it and let us have it back so we can post it

Like converting it to .cls?

Posted by Mike Ormerod on 22-Aug-2006 02:00

Do you have a prediction for how long we will have to wait for a class-based example? Another year?

Certainly not another year

Posted by Mike Ormerod on 22-Aug-2006 02:01

it would be great for people to take what we post and, if they feel there is an alternative way or a 'better way', feel free to change it and let us have it back so we can post it

Like converting it to .cls?

Sure, why not

Posted by Phillip Magnay on 22-Aug-2006 03:27

I'll chime in since my name has been mentioned.

One isn't limited to "diagrams". In fact, in a recent exchange with Phil Magnay at Progress, in which we were talking about the problem of creating a class diagram covering hundreds of classes, he remarked that in his own work he tends to use the diagram itself for validation up to a point and then throws the diagram away in favor of the tabular form.

UML diagrams are merely a graphical representation of the underlying model. Diagrams are great for communication amongst team members and stakeholders but are not in themselves completely necessary. For example, forward/reverse engineering functions in most modeling tools do not rely on diagrams but rather on the data in the model.

Diagrams with large numbers of classes/associations become difficult to read and use. Usability and understanding are enhanced if a large model (in a logical view) is represented by several smaller (and probably overlapping) diagrams. UML modeling should help make design/development less complex. If modeling doesn't make things simpler, then you're on the wrong track.

Check out Mellor and Balcer, Executable UML: A Foundation for Model-Driven Architecture as one view on really focusing on the model.

Great reference. Another good one by Mellor is MDA Distilled: Principles of Model-Driven Architecture.

Posted by Thomas Mercer-Hursh on 22-Aug-2006 11:34

Sure, why not

I would take that as a challenge, but it seems silly for me to be doing the same thing you are doing, unless I am going to get it done a whole lot sooner. My confidence level might go up if I were to start seeing some good .cls examples being published.

Posted by john on 22-Aug-2006 13:55

Sorry folks, after starting this whole (wonderful) mess of a thread, I then disappeared to the Asia-Pac user conference (Progress Technology World), which was a very nice success apart from the fact of its being on the other side of the freaking planet. It will take me a while to catch up. I picked this message to reply to first because I must thank first Theo for the many interesting points he raised and then Thomas for this and other thorough replies, most of which I would enthusiastically concur with (so you saved me a lot of typing). Let me try to respond to one or two main points of the thread messages here.

First, at the risk of taking issue with my Vice President, I would question -- for the sake of argument at least -- Niel's suggestion that OERA (or a multi-tier architecture in general) is less productive than the world of FOR EACH Customer. As someone I talked to at the conference admitted, his staff still find it much easier to take a spec for a new business requirement and code it as a top-down procedure. This is understandable, and may seem productive, but the question -- where real productivity is concerned -- is what you've got after you've done this a thousand times or so. As most of us know too well, the result is not terribly productive for maintenance and extensibility. Many procedures access the same data (with all its idiosyncrasies) and repeat (perhaps unwittingly) much of the same logic. So fixing the database idiosyncrasies or tying it to a new data source is greatly complicated by the job of finding and fixing all the references to it. Same for business logic, UI assumptions, etc. etc. So the challenge with a more serious architecture is not productivity per se, but the initial investment needed to get you to the starting point of being truly productive in new ways. We don't want you to code each data access object and each whatever else by hand; we want you to build and use patterns for those things that take care of the layered complexities and let you just plug in the specifics for that data entity (and to do this only once for each data entity to serve the entire application -- that's where the "new" productivity comes in). So the hard part is understanding and producing the templates/patterns/models to fill in without having to solve the whole top-down problem over and over again. This is where we at PSC still have a lot of work to do: to push the right parts of it down into the core product and provide good guidance for the rest (with assistance from many of you, it is to be hoped). The first set of "Implementing the OERA" papers is a part of this. The new AutoEdge material that Mike has referenced is another. And there will be others (and yes, Thomas, they will start to use classes as examples).
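To give a feel for what "plug in the specifics" could mean in practice, here is a minimal sketch (my own illustration, not an official PSC pattern) of a generic data access class to which an entity contributes only its table name and a filter:

    CLASS DataAccess:
        DEFINE PROTECTED VARIABLE mhDbBuffer AS HANDLE NO-UNDO.

        CONSTRUCTOR PUBLIC DataAccess (pcTable AS CHARACTER):
            CREATE BUFFER mhDbBuffer FOR TABLE pcTable.
        END CONSTRUCTOR.

        /* fill a caller-supplied temp-table with the rows matching pcWhere */
        METHOD PUBLIC VOID FetchWhere (pcWhere AS CHARACTER, phTT AS HANDLE):
            DEFINE VARIABLE hQuery AS HANDLE NO-UNDO.
            DEFINE VARIABLE hTTBuf AS HANDLE NO-UNDO.

            hTTBuf = phTT:DEFAULT-BUFFER-HANDLE.
            CREATE QUERY hQuery.
            hQuery:SET-BUFFERS(mhDbBuffer).
            hQuery:QUERY-PREPARE("FOR EACH " + mhDbBuffer:NAME
                                 + " NO-LOCK WHERE " + pcWhere).
            hQuery:QUERY-OPEN().
            DO WHILE hQuery:GET-NEXT():
                hTTBuf:BUFFER-CREATE().
                hTTBuf:BUFFER-COPY(mhDbBuffer).
            END.
            hQuery:QUERY-CLOSE().
            DELETE OBJECT hQuery.
        END METHOD.
    END CLASS.

    /* usage: one line of specifics per entity */
    DEFINE TEMP-TABLE ttCustomer NO-UNDO LIKE Customer.
    DEFINE VARIABLE daCust AS CLASS DataAccess NO-UNDO.

    daCust = NEW DataAccess("Customer").
    daCust:FetchWhere("Customer.CustNum <= 100", TEMP-TABLE ttCustomer:HANDLE).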

Another major sub-thread has been the right approach to describing these parts of the architecture. We do try to avoid "just white papers" (Tim K.) and most of the more substantial papers on the site now (beyond shorter definitional papers) do use and describe code samples. We will continue to focus on real code. As far as the right depth is concerned, I think the topic at hand should determine that. If the paper/example is trying to convey, for example, how to create a Service Interface that lets clients make standard requests of a set of possibly running procedure or class instances on a pool of shared AppServers, then the complexity of the underlying business data and logic is not really the issue. Sports will do fine. But if the topic is how to assemble complex data into a Business Entity and navigate it, then by golly, the data in the sample had better be at least somewhat more complex. And if these are (properly) created as fairly independent and easily digestible chunks, then at least we need to point to where these kinds of 'patterns' fit into the overall architecture story so people can put the pieces together on their own.

So that's a start. Thanks for all the interest, and it would be nice to hear from a few more of the 1200 or so who have seen this thread.

Posted by john on 22-Aug-2006 14:10

N-tier is one of those things one gets or one doesn't and, until one gets it, it seems just horridly complex. But, once one gets it, it seems natural and straightforward.

And this is one of the key issues. If we sit here and think "So what's the big deal?", because we've been dealing with these issues for a long time and therefore think that the issues and the basic solutions are clear, then we're not serving anyone. This thread is helpful for us to better identify why the issues are there and what parts seem unreasonably hard, and then fix them -- ideally in the product itself, otherwise in good examples and explanations that fill in the gaps. I don't think n-tier needs to be "horridly complex" at all, but we clearly have work to do to communicate how to deal with the seeming complexity. Tools like Tim's manager for handling the lifecycle of procedure instances are an example of an important part of filling the gap.

Posted by Tim Kuehn on 22-Aug-2006 14:55

Tools like Tim's manager for handling the lifecycle of procedure instances are an example of an important part of filling the gap.

The procedure manager is a significant and fundamental first step.

There's a lot more that can be built on the procedure manager's functionality - including managing the life cycle of just about every other dynamic widget the ABL supports.

Posted by Thomas Mercer-Hursh on 22-Aug-2006 15:41

First, at the risk of taking issue with my Vice President, I would question -- for the sake of argument at least -- Niel's suggestion that OERA (or a multi-tier architecture in general) is less productive than the world of FOR EACH Customer.

I think there are two important points here that deserve to be highlighted.

One is that, if one is going to compare complexity, one needs to compare equivalent systems. A system characterized by FOR EACH Customer is not equivalent to a system which integrates a dozen different systems, supply chain integration, web integration, etc., etc. The valid comparison is between OERA/SOA/ESB ways of getting to that level of functionality and trying to do it in other ways. If that is the comparison one makes, then the OERA approach is less complex and more manageable than cobbling together something with a million patches. The reason that people become overwhelmed by the complexity of OERA is that they are making the transition to needing those additional capabilities, and that is complex no matter how it is done.

The other is the total cost of ownership issue. Yes, developing an OERA application involves a certain amount of up-front difficulty in design, but one ends up with an application which is nimble, responsive, and easily changed as business needs change. So the change which would have been extremely difficult and expensive in an older monolithic architecture becomes quick and easy. Over the life of the application, one can not only save costs but, more importantly, take advantage of many more opportunities.

Posted by aswindel on 22-Aug-2006 15:56

How does one respond sensibly to this whole thread, which raises so many interesting discussion items? Rest assured, as a key contributor to our future tools strategy, I (amongst others) am reading this with keen interest, so do not take a lack of response as not listening - this is great feedback.

I would like to respond specifically to a few points that have been made however.

On the topic of tools versus new language syntax: personally I would prefer to use purposed wizards / cheat sheets that ask me only the few key questions necessary to auto-generate all the components of my reference architecture, complete with a model as well as code, in order to implement a well defined pattern. This saves me writing code in lots of objects and having to understand the complexities of the architecture layers, and it provides a model that documents what I did, with the ability to modify the model and/or code with full round-trip engineering.

Such tools could support generating any kind of architecture, which avoids being tied to any one architecture, as well as supporting legacy code as much as new code.

Am I on my own with this lazy attitude? I really do not see how we could achieve this through the language itself - but I definitely have visions for how tools could be used to achieve this...

A big question is what should the tools generate? What are these common design patterns we all talk about? How could these tools be configured so that out of the box they satisfy many use cases without being customized?

The plumbing / tooling is IMHO easier than choosing the correct targets - hence the initial work on reference implementations without supporting tools.

Regarding the integration of Sonic tools - this is absolutely a path we are investigating as many of the Sonic tools could provide a very solid foundation for building OpenEdge application layers, in-process workflow, etc. That is one of the benefits of all the tools moving to Eclipse...

BTW - on vacation and travelling so may be slow to respond to replies.

Regards

Ant

Posted by Thomas Mercer-Hursh on 22-Aug-2006 16:13

On the topic of tools versus new language syntax: personally I would prefer to use purposed wizards / cheat sheets that ask me only the few key questions necessary to auto-generate all the components of my reference architecture, complete with a model as well as code, in order to implement a well defined pattern.

Wizards can certainly be useful, but they are only a starting point. They certainly don't provide the detail necessary for robust, enterprise-class applications. And, they certainly shouldn't be a crutch that keeps one from having to understand one's application!

One of the inherent problems with wizards is that they tend to exist in relationship to a very specific model of what is going to be built. Change the model that one wants and the wizard can easily become a useless appendage which goes unused.

To avoid this, I think that one needs to plug into some very generalized modeling and generation technologies. That way, whatever it is that PSC delivers, one could change the models and the templates and end up with what one wanted. That has been the big flaw of the tools like ADM which PSC has delivered in the past. For someone like me, it was just too far off the mark to be useful. As you can probably guess, my recommendation is for UML for the models and MDA for the generation. Big job, to be sure, but worth doing.

Posted by aswindel on 22-Aug-2006 17:55

I agree with your comments. Our thought is for an extensible template engine such as JET 2.0 for code generation. This is very flexible and can be based on an extensible component definition of what actually to generate. Our goal is to make this consumable for an OpenEdge developer, so that whatever technology we use you can extend it with ABL skills - but the type of functionality offered by JET 2.0 is what we are thinking of.

As far as wizards go, some default wizards that can capture simple tag replacements without writing a purposed wizard would be the default, with the ability to write a powerful wizard that can capture whatever details are required by the code generation technology.

All of this can feed from some model, which indeed could be UML based - or otherwise.

Thoughts?

Ant

Posted by Thomas Mercer-Hursh on 22-Aug-2006 19:01

Well, while one can get the job done in multiple ways, personally I think there is a strong argument in favor of UML for modeling and MDA for code generation. This is a far more sophisticated and general approach than just about anything else, and it is standards-based and evolving rapidly.

In fact, you would be doing well just to cross license Enterprise Architect and to fund additional ABL elements for it.

Posted by Admin on 23-Aug-2006 03:27

Sorry folks, after starting this whole (wonderful) mess of a thread,

Here the forum technology is a problem: it's not an indented thread

So the challenge with a more serious architecture is not productivity per se, but the initial investment needed to get you to the starting point of being truly productive in new ways.

Spot on! But any application/architecture "suffers" from maintenance cycles...

We don't want you to code each data access object and each whatever else by hand; we want you to build and use patterns for those things that take care of the layered complexities and let you just plug in the specifics for that data entity (and to do this only once for each data entity to serve the entire application -- that's where the "new" productivity comes in).

Exactly, hide the plumbing. In the old days we tried to hide it with includes, creating a "stripped down and dedicated 4GL language"; nowadays we do it with persistent procedures. The problem with tools is often the fact that they generate a lot of code/includes based on a minor specification. So after the initial go, it's hard to pull the original specification back out.
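For readers who never saw the include trick, it looked something like this (an invented example in that spirit):

    /* getRows.i -- hides the query plumbing; {1} = table, {2} = WHERE clause */
    FOR EACH {1} NO-LOCK WHERE {2}:
        DISPLAY {1}.
    END.

    /* caller: a one-line 'specification' that expands into the boilerplate */
    {getRows.i Customer "Customer.CustNum <= 100"}

Here the specification survives in the caller because the expansion happens at compile time; a tool that generates the expanded code and discards its input does not leave that trail.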

But if the topic is how to assemble complex data into a Business Entity and navigate it, then by golly, the data in the sample had better be at least somewhat more complex.

So when can we expect this topic to be covered?

Posted by Admin on 23-Aug-2006 03:33

I agree with your comments. Our thought is for an extensible template engine such as JET 2.0 for code generation.

That's what I assumed in the OO-thread.

As far as wizards go, some default wizards that can capture simple tag replacements without writing a purposed wizard would be the default, with the ability to write a powerful wizard that can capture whatever details are required by the code generation technology.

As long as the wizard stores its captured data and doesn't just generate the target source, it would be OK. This way the "wizard" isn't "an idiot", since it can interpret its own work...

Theo.

Posted by Phillip Magnay on 23-Aug-2006 05:45

Well, while one can get the job done in multiple ways, personally I think there is a strong argument in favor of UML for modeling and MDA for code generation. This is a far more sophisticated and general approach than just about anything else, and it is standards-based and evolving rapidly.

JET and UML/MDA are not mutually exclusive. Indeed, in the Eclipse world JET is the means to realize flexible code generation through maintainable templates.

In fact, you would be doing well just to cross license Enterprise Architect and to fund additional ABL elements for it.

There is no either/or choice here. T4BL and third-party modeling tools which have OpenEdge-specific extensions should be able to co-exist and interoperate with each other. Enterprise Architect has gained some profile in the OpenEdge community for a number of reasons: 1) it is a first-class UML tool without the first-class price tag; 2) it comes with a very flexible and powerful customization/extension framework which has readily allowed the development and inclusion of OpenEdge-specific extensions; 3) Sparx Systems, the developer of Enterprise Architect, has been very easy to do business with. They have been in our Technology Partner program for over a year now and they have been very supportive of our goal of opening up the benefits of UML/MDA to the OpenEdge developer community.

But that doesn't mean that other third-party UML tools are out of the picture. Indeed, the more UML tools that can also directly support OpenEdge technologies and the OpenEdge developer, the better.

Posted by Thomas Mercer-Hursh on 23-Aug-2006 13:14

The problem with tools is often the fact that they generate a lot of code/includes based on a minor specification. So after the initial go, it's hard to pull the original specification back out.

This is why I think that the specification needs to be in the form of a model, not just a wizard. It is one thing to have a wizard construct the initial form of the model, but the code itself should come from an explicit model, not just some one-time generation step.

And get rid of the includes!

Posted by Thomas Mercer-Hursh on 23-Aug-2006 13:22

JET and UML/MDA are not mutually exclusive. Indeed, in the Eclipse world JET is the means to realize flexible code generation through maintainable templates.

Is this why, when I inquired about customizing a number of templates in order to get lower-case keywords, among other things, I was told that some were modifiable and others were not ... I think because they were JET?

I have only had the slightest look at JET, but my impression is that it has to do with generating a text file or the equivalent. That might work for code generation per se, but a lot of MDA is model transformation. Is JET suitable for that too?

Indeed, the more UML tools that can also directly support OpenEdge technologies and the OpenEdge developer, the better.

While I am all in favor of choice and options, it is unlikely that any third-party product, unless I write one, will provide any notable support for ABL unless PSC has more than an arm's-length relationship with the vendor. "Come one, come all" isn't going to get us ABL-specific support.

Posted by Phillip Magnay on 24-Aug-2006 02:37

Is this why, when I inquired about customizing a number of templates in order to get lower-case keywords, among other things, I was told that some were modifiable and others were not ... I think because they were JET?

In theory, JET can output anything so I'm not sure of the reasons behind this specific instance.

I have only had the slightest look at JET, but my impression is that it has to do with generating a text file or the equivalent. That might work for code generation per se, but a lot of MDA is model transformation. Is JET suitable for that too?

Yes. JET is primarily for outputting text, but this can still be useful for MDA transforms. For example, EA's MDA transforms first output an intermediate textual representation of the transformed model and then import (internally) that textual information to update the model.

While I am all in favor of choice and options, it is unlikely that any third-party product, unless I write one, will provide any notable support for ABL unless PSC has more than an arm's-length relationship with the vendor. "Come one, come all" isn't going to get us ABL-specific support.

I didn't mean to suggest a come-one-come-all approach, merely that other vendors exist with which we may possibly form suitably close relationships in order to achieve similar UML/MDA support for OpenEdge ABL. That is not to suggest that we are currently and actively pursuing any other specific third-party vendors, or that we are somehow backing away from Enterprise Architect ... merely that staying open to other possibilities seems the most reasonable course at this stage. That said, the relationship with Sparx is very good and it is the first such relationship out of the blocks.

Posted by Thomas Mercer-Hursh on 24-Aug-2006 11:03

In theory, JET can output anything so I'm not sure of the reasons behind this specific instance.

Quoting from my summary posting to the PEG in re call W606304129:

Now, this solves only a part of the problem. There are also the templates related to the right-click/New options. These are found in

DLC\oeide\eclipse\plugins\com.openedge.pdt.text_10.1.0.01\runtime\

where it may be 10.1.0.01 or 10.1.0.00 depending on whether you have SP1 installed. If you have the service pack, you will actually have both, but the one in the higher numbered directory seems to be the one used. The .template files in this directory are plain text and may be edited both for lower case keywords and any other preferences you might have for style and content.

Unfortunately, this is only a partial solution since there are also templates in

DLC\oeide\eclipse\plugins\com.openedge.pdt.t4bl_10.1.0.01\templates

with a file extension of .ijet. I have been told that these are non-modifiable. My experiments with modifying them have not produced any results ... i.e., I change them, but what gets inserted remains the same.

Also, there are other points where it is not even apparent that the insertions are coming from a file. E.g., in the insert method wizard, one is given options for the return type and the scope and these are shown in upper case only. Similarly, in New Class, INHERITS and the constructor and destructor are coming in upper case, although the latter two are probably coming from templates.

I have recommended summary execution for the programmer responsible for forcing upper case keywords in this fashion.

That said, the relationship with Sparx is very good and it is the first such relationship out of the blocks.

My point is that it is unlikely that any UML/MDA vendor will invest significant effort on their own initiative to support ABL. That support is either going to have to come in the form of PSC add-ins, as it has to a limited degree with EA, or in the form of PSC incentives.

Posted by Phillip Magnay on 24-Aug-2006 13:01

My point is that it is unlikely that any UML/MDA vendor will invest significant effort on their own initiative to support ABL. That support is either going to have to come in the form of PSC add-ins, as it has to a limited degree with EA, or in the form of PSC incentives.

I'll reiterate what I posted above.
