I have two programs. The first takes 2 minutes on its first run, but only 1 second on the second run thanks to the buffer cache. The second program takes 3 seconds on its first run, and the second run takes just as long. My question is: why doesn't the buffer cache improve performance in the second case? Are there any rules governing when the buffer cache is applied?
This depends heavily on what your programs do. It would help if you could provide a few more details, such as what the programs do, the startup parameters they use, etc.
I assume you are talking about the database buffer cache, yes?
The purpose of the buffer cache is to reduce disk access by keeping copies of recently accessed database data blocks in memory for a time. If the block you need is already in memory, it doesn't have to be read from disk again, which saves a great deal of time when it happens. Maximizing the chance that a block is in memory when you need it amounts to predicting the future, and sometimes the prediction is wrong. The common approach is the "least recently used" (LRU) algorithm: when the cache is full, the block that has gone unused the longest is evicted to make room. This suggests one likely explanation for your second program: if it reads more distinct blocks than the cache can hold (or does a large sequential scan, which many databases deliberately age out of the cache quickly), each block is evicted before the rerun ever needs it again, so the second run sees no speedup.
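To make the LRU behavior concrete, here is a toy sketch in Python (not any real database's implementation) of a fixed-size buffer cache. It shows why a small working set that fits in the cache gets all hits on a second pass, while a scan larger than the cache gets none:

```python
from collections import OrderedDict

class LRUBufferCache:
    """Toy buffer cache: holds a fixed number of blocks and evicts
    the least recently used block when it is full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()  # block_id -> cached data
        self.hits = 0
        self.misses = 0

    def read(self, block_id):
        if block_id in self.blocks:
            self.hits += 1
            self.blocks.move_to_end(block_id)   # mark as most recently used
        else:
            self.misses += 1
            if len(self.blocks) >= self.capacity:
                self.blocks.popitem(last=False)  # evict least recently used
            self.blocks[block_id] = f"data-{block_id}"  # simulated disk read

# Case 1: working set (50 blocks) fits in the cache (100 blocks).
# The second pass is served entirely from memory.
small = LRUBufferCache(capacity=100)
for _ in range(2):
    for block in range(50):
        small.read(block)
print(small.hits, small.misses)   # 50 hits, 50 misses

# Case 2: scanning 1000 blocks through a 100-block cache. Every block
# is evicted before the rerun reaches it again, so the second pass is
# just as slow as the first.
large = LRUBufferCache(capacity=100)
for _ in range(2):
    for block in range(1000):
        large.read(block)
print(large.hits, large.misses)   # 0 hits, 2000 misses
```

The class name, capacities, and block counts here are made up for illustration; the point is only that cache benefit depends on whether the working set fits, not on the cache being "on" or "off".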