Appserver error: Too many open files (errno:24) (8046)?

Posted by MBeynon on 23-May-2016 04:59


Our client's AppServer crashed over the weekend with the above error.

Looking through the knowledgebase for this error, I find an article which advises setting the ulimit value on the operating system (HP-UX):

Attempting to exceed the limit for file descriptors that the process can open.
1.  Increase nofiles for the user process.

ulimit -n somevalue

2.  Restart the AdminServer and brokers to detect new setting.

If still experiencing issues, set ulimit -n somevalue (for example, ulimit -n 90000) in the proadsv script prior to the jvmstart command.

Another article talks about setting the ulimit value on the Openedge admin server:

The process exceeds limit on file descriptors. There should be supporting information in the OS log files.
Add "ulimit -n 100000" into the proadsv script, just before jvmstart.
Then restart AdminServer and all broker processes.

If ulimit -Hn is bigger than ulimit -n, increase the soft limit with "ulimit -n xxx".
If ulimit -n has already reached the ulimit -Hn setting, adjust the kernel.
Refer to the 'man' pages for specific information.
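Taken together, the two articles boil down to: check the current limits, raise the soft limit toward the hard limit, and only touch the kernel if the hard limit itself is the ceiling. A minimal sketch of that check (the 100000 value is the article's example, not a tuned recommendation):

```shell
# Inspect the current per-process open-file limits for this shell.
echo "soft limit: $(ulimit -n)"
echo "hard limit: $(ulimit -Hn)"

# The soft limit can be raised up to the hard limit without kernel changes;
# per the KB advice, a line like the following would go in the proadsv
# script just before jvmstart (100000 is the article's example value):
#   ulimit -n 100000
# If the soft limit already equals the hard limit, the kernel itself must
# be tuned (see the man pages for your platform).
```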

My first question is: what is a file descriptor? Again, the knowledgebase says:

The following solution is applicable when there are clients connecting in self service mode to the Progress database(s), i.e. there are clients running on the same machine as the database broker (DB Broker) and connecting without "-S" parameter to the database(s). Example of a self service connection to a database named sports:
    mpro sports -p main.p

Our client is connecting in "Self Service Mode".

My second question is: how do I fix this? Do I change the OS ulimit, the AdminServer setting, or both?

Looking on the production machine, the ulimit -n (open files) does not seem to be set, but the OS's sys.log file shows:

May 22 12:59:35 syslog: Java: Number of open files: 4096
May 22 12:59:35 syslog: Java: Maximum number of allowed open files: 4096
May 22 12:59:35 syslog: Java: Number of open files: 4096
May 22 13:00:00 above message repeats 3 times
May 22 12:59:35 syslog: Java: Maximum number of allowed open files: 4096
May 22 13:00:00 above message repeats 3 times

Can anyone help?



P.S. Our client is on HP-UX 11.1 and OE10.2A.05

All Replies

Posted by Roy Ellis on 24-May-2016 07:32

Hi Mark,

A file descriptor is a handle to any file, socket, or pipe.  So every file opened, every socket created, and most other I/O will count against your -n (open files) limit.
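The point above can be seen directly in the shell: each open resource occupies a numbered descriptor slot until it is closed. A small illustration (using /etc/hosts purely as a convenient file that exists on most Unix systems):

```shell
# Each open resource occupies a numbered descriptor slot until closed.
# Here we open a file on descriptor 3, read from it, then release it.
exec 3< /etc/hosts      # descriptor 3 now counts against ulimit -n
read -r line <&3        # read the first line through that descriptor
echo "$line"
exec 3<&-               # close it; the slot is free again
```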

It definitely looks like your process file limit is 4096, and the syslog says a Java process is hitting that limit; so it is not your clients or the AppServer agents, but possibly the AppServer broker, which is Java-based.

For an AppServer broker there is a file descriptor for each client connecting to the broker, one for each AppServer agent, another for each log file, and so on, plus more for the connection to the AdminServer and any NameServers.

Now, you say ulimit doesn't appear to be set, but every Unix system has this setting, and increasing it should fix the immediate problem.  But you should also verify you don't have a handle leak.  You can see how many handles a process is using with the "lsof -p PID" command.
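A sketch of counting a process's descriptors with "lsof -p PID" (the PID defaults to the current shell's purely so the example is self-contained; in practice substitute the broker's Java PID, found with something like "ps -ef | grep java"):

```shell
# Count open descriptors for a process. PID defaults to this shell so the
# example runs anywhere; point it at the AppServer broker's Java PID.
PID=$$

if command -v lsof >/dev/null 2>&1; then
    # Each lsof output line after the header is one open descriptor.
    count=$(lsof -p "$PID" | tail -n +2 | wc -l)
elif [ -d "/proc/$PID/fd" ]; then
    # Fallback where /proc exists (Linux; not HP-UX 11.x).
    count=$(ls "/proc/$PID/fd" | wc -l)
else
    count="unknown"
fi
echo "open descriptors for PID $PID: $count"
```

Comparing that count against the 4096 limit in the syslog, and watching whether it climbs steadily, is the quickest way to distinguish a genuinely busy broker from a handle leak.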

I hope this helps, Roy

Posted by ChUIMonster on 24-May-2016 07:48

HPUX also has system limits in the kernel that are frequently at fault for this sort of thing.  NFILES and MAXFILES if I recall.  One is the hard limit per process, the other is the total for the system.

You can also use "glance" on HPUX to dig into the details of things like what file descriptors a process has open and what the kernel settings are.
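For reference, on HP-UX 11i the kernel tunables in question can be queried with kmtune; the parameter names below follow the usual HP-UX lowercase naming and should be verified against the man pages, since the NFILES/MAXFILES above is from memory:

```shell
# Illustrative only; run these on the HP-UX box itself.
#   kmtune -q nfile          # system-wide open-file table size
#   kmtune -q maxfiles       # default per-process soft limit
#   kmtune -q maxfiles_lim   # per-process hard limit
```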

Posted by gus on 24-May-2016 09:22

If the process has 4096 file descriptors in use, that is unusual; something is going wrong and it should be investigated.

Increasing the limit will perhaps alleviate the symptom temporarily, but likely will not cure the disease.

This thread is closed