We use around 25 .pl files to split up our ERP system by module. We have been doing this for quite a while now, but are reconsidering it because we are currently setting up an automated build pipeline.
We are wondering which is better from a performance standpoint.
Will one big .pl file of around 1 GB perform better than 25 smaller ones that vary from a few MB to 120 MB?
Are separate .pl files better for memory usage, or are we better off loading one big one? Are there other considerations we should take into account?
Thanks in advance!
Personally, I would leave the .pl structure behind and use a plain folder structure, combined with the -q parameter for production and a large memory allocation to keep as many files in memory as possible. What used to be a 'big' allocation is not that big anymore on current hardware.
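As a sketch, such a production setup might be driven by a client parameter (.pf) file along these lines; the parameter values here are assumptions for illustration, not recommendations:

```
# production.pf -- hypothetical example
-q           # Quick Request: search the PROPATH only on first reference to a procedure
-mmax 65536  # maximum r-code memory, in KB; size it so most files stay in memory
```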
A good use case for .pl files might be if you write commercial software, since they are easier to deliver. A second use case is when you have a base version and cumulative updates. In both cases file handling is easier.
A shared pl can reduce memory usage in VM, Citrix and Linux environments as far as I understand.
I would go for one big one. Easier to deploy for a system admin and the application can use a much shorter propath.
You might find this article useful. It mentions: "Procedure libraries can be used to group many R-code files into a single file from which an OpenEdge client can read and execute R-code files. Having a single place to look for R-code files eliminates the large directory and open/close issues, since the library can open once and allow for faster searches."
We use three pl files (client, shared and server) and have our modules broken out into packages inside the pl.
[quote user="Pierre Blitzkow"]
I did not know the concept of a PL yet, but I found it very interesting. Doing some tests in my environment, even with the PL first in the PROPATH, if the .r exists in another directory it is found before the PL. Is this expected behavior?
No, .r files are not found before the .pl; the PROPATH is simply searched from beginning to end.
Whatever tool you are using to show where the .r can be found looks like it is simply displaying the entries alphabetically.
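For reference, the search the client performs is a plain first-match walk of the PROPATH. A small Python sketch of that behavior (the entry and file names are invented, and `exists` stands in for the client's actual file/library lookup):

```python
def find_rcode(propath, rcode_name, exists):
    """Return the first PROPATH entry containing rcode_name.

    `exists` maps a PROPATH entry (directory or .pl file) to the
    set of r-code files found there.
    """
    for entry in propath:  # searched strictly front to back
        if rcode_name in exists.get(entry, set()):
            return entry
    return None

# The .pl comes first in the PROPATH, so its copy of order.r wins,
# even though order.r also exists in ./src.
propath = ["erp.pl", "./src", "."]
contents = {
    "erp.pl": {"order.r", "invoice.r"},
    "./src": {"order.r", "custom.r"},
}
print(find_rcode(propath, "order.r", contents))   # erp.pl
print(find_rcode(propath, "custom.r", contents))  # ./src
```

If a tool shows the directory copy first, it is listing locations, not reproducing the runtime search order.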
> On Dec 20, 2017, at 7:38 AM, Richard.Kelters wrote:
> A shared pl can reduce memory usage in VM, Citrix and Linux environments as far as I understand.
> I would go for one big one. Easier to deploy for a system admin and the application can use a much shorter propath.
Correct, a shared pl can save you considerable memory when many users have the same one(s).
However, shared pl’s are memory mapped as a single contiguous chunk of address space. When using 32-bit OpenEdge, a large pl may fail to load because there is not a sufficiently large chunk of free address space.
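The "contiguous chunk" point is ordinary memory mapping: the whole library has to fit into a single free range of the process's address space. A small Python illustration of the mechanism (the 1 MB size is arbitrary; the point is that it is one mapping, not many):

```python
import mmap
import tempfile

with tempfile.NamedTemporaryFile() as f:
    f.truncate(1024 * 1024)  # pretend this is a 1 MB procedure library
    # The entire file is mapped as one contiguous region; in a 32-bit
    # process, a sufficiently large file can make this call fail
    # because no single free address range is big enough.
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        print(len(m))  # 1048576 -- one mapping, one address range
```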
Thanks for all the responses. It seems most people use .pl files as shared libraries in order to save memory and increase speed. We do that as well, and are aware of the advantages of shared .pl files and a shorter PROPATH.
But it's very hard to find information about an optimal size for a PL-file. Perhaps someone has done some research / testing in this area?
We have been using .pl files for over ten years, but always split up per module. We are now wondering if merging them into one big one will be beneficial to performance or not.
Any thoughts on that?
> On Dec 24, 2017, at 9:10 AM, onnodehaan wrote:
> We are now wondering if merging them into one big one will be beneficial to performance or not.
I don’t think anyone has measured this. Aside from the time spent opening and memory-mapping multiple files instead of one, I doubt it makes any difference, since once the code is in memory everything else is mostly the same regardless of the number of .pl files. Except for one thing: checking timestamps on multiple files compared to one. Such checks are fast as long as you don’t have hundreds or thousands of files.
But, as I said, no one has measured it. You could be the first.
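The timestamp check mentioned above boils down to one stat() call per library file. A minimal Python sketch for getting a ballpark comparison of 1 versus 25 files (the file names and counts are made up for illustration, using empty stand-in files):

```python
import os
import tempfile
import time

def time_stat_checks(paths, rounds=1000):
    """Time `rounds` passes of reading every file's modification timestamp."""
    start = time.perf_counter()
    for _ in range(rounds):
        for p in paths:
            os.stat(p).st_mtime
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as d:
    # One big library versus 25 per-module libraries.
    files = []
    for i in range(25):
        p = os.path.join(d, f"module{i}.pl")
        open(p, "wb").close()
        files.append(p)
    one = time_stat_checks(files[:1])
    many = time_stat_checks(files)
    print(f"1 file: {one:.4f}s, 25 files: {many:.4f}s")
```

On a typical filesystem both numbers come out to fractions of a millisecond per pass, which supports the point that the difference only starts to matter at very large file counts.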