Differences in memory usage between Windows XP and Windows 2003

Posted by Admin on 23-Jun-2010 08:58

Good Morning.

I have seen a curious case and I would like to know if it is normal or an error in the configuration of my server.

I have two servers for testing:

1. Windows Server 2003 Standard Edition with Service Pack 2
   Intel Xeon CPU, 2.80 GHz

2. Windows XP Professional with Service Pack 3
   Intel Core, 1.83 GHz

I use OpenEdge 10.2A.
I have the same database on both servers.

When I start the database on the first server (Windows Server 2003) and watch the memory usage of the _mprosrv.exe process, I see that it occupies 19,404 K.

When I perform the same operation on the second server (Windows XP), something similar happens: the _mprosrv.exe process it generates occupies 19,328 K (similar to the first server), but after 1 minute the memory of that process drops to 768 K.

Is it normal that Windows Server 2003 consumes that much memory?

Thanks in advance.

All Replies

Posted by gus on 23-Jun-2010 10:20

Yes, it is normal. The two systems manage memory differently because their intended usage is different. On XP, if you look at My Computer/Properties/Advanced/Performance, you will find some settings that give you limited influence over this.

Posted by Admin on 23-Jun-2010 10:36

OK, but then if I start 30 databases (20,000 K × 30 = 600,000 K), will the system occupy those 600,000 K of RAM? I understand that if so, it would slow down the server, since it only has 2 GB of RAM.
Is the only solution to increase the RAM on the server?
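As a quick sanity check of the arithmetic above (a sketch only, in Python; the 20,000 K per-process figure is just the observed Task Manager number, not a fixed per-database cost):

```python
# Poster's figures: ~20,000 K per _mprosrv.exe process, 30 databases,
# on a server with 2 GB of physical RAM.
per_db_kb = 20_000
n_databases = 30
ram_kb = 2 * 1024 * 1024  # 2 GB expressed in K

total_kb = per_db_kb * n_databases
print(total_kb)           # 600000 K, i.e. roughly 0.6 GB
print(total_kb < ram_kb)  # True: fits in 2 GB, before counting -B or the OS
```

As gus explains below, this naive estimate badly understates real usage once the buffer pool (-B) is sized for production.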

Posted by gus on 23-Jun-2010 11:08

This is a much more complicated question. First of all, determining how much memory a process is actually consuming is tricky because:

- it varies over time, depending on the type of process
- we are dealing with virtual memory, and only part of a process has to be resident in physical memory at any given moment. The virtual memory size and the resident set size are often radically different.
- parts of what is in a process's address space are shared with other processes (e.g. shared memory, shared libraries, memory-mapped files, the executable's code, etc.). These things can contribute to the virtual memory sizes of many processes, but there is only one copy present in physical memory.

In addition, how much memory is used is highly variable, depending on configuration and application (database startup parameters, application code, etc.).

The Windows Task Manager does not give you very much information, but it does give a quick summary. The two numbers you gave in your original question are highly misleading, because when you start the database and it uses some memory, it will not actually shrink later. Once it has initialized itself, it stays pretty much the same until you shut it down.

If you want to dig into a lot of details, then you need to get Microsoft's SysInternals suite, which has tools to look at all these things in detail.

For the database server, in most installations the largest consumer of memory is the buffer pool, which is in shared memory. The larger you set -B, the more memory it needs. It is not unusual for a single database to need 1 GB or more of shared memory. With a database block size of 4 kilobytes, setting -B to 100,000 will consume 400 megabytes just for the data buffers.
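That -B arithmetic can be sketched like this (an illustration only; it ignores the per-buffer control structures the broker also allocates, so real usage is somewhat larger):

```python
def buffer_pool_bytes(b_buffers, block_size_kb=4):
    """Rough size of the -B buffer pool: number of database buffers
    times the database block size. Per-buffer overhead is ignored."""
    return b_buffers * block_size_kb * 1024

# The example above: -B 100,000 with 4 KB database blocks
size = buffer_pool_bytes(100_000)
print(size // 1024, "K")  # 400000 K, i.e. roughly 400 MB
```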

So, yes, if you start 30 databases, you will definitely need more than 2 GB of memory. A lot more. Don't forget that the operating system needs memory too, so you don't get the whole 2 GB.


Posted by Admin on 29-Jun-2010 03:38

Thanks, an excellent response.

This thread is closed