Many people do not realize that memory allocated by a queue is never deallocated until the queue is destroyed or the call chain that created it stops running. This is a problem for queues that are opened at application startup and used throughout its lifetime: every queue permanently retains its maximum size, so the application can end up holding a large amount of memory that is unused or seldom used.
Consider a consumer that occasionally lags behind, causing the queue to grow tremendously. The consumer then catches up and drains the queue in a short period of time. The queue is unlikely to be that large again for quite a while, yet none of the memory is deallocated.
I'd like a primitive that deallocates all of that memory, shrinking the queue down to just its current number of elements. Since the queue won't need that much memory again for a long time, and it will auto-grow as needed, I'd like to recover the memory now rather than wait for the application to be restarted (currently the only time the queue is created).
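To make the idea concrete, here is a minimal sketch (in Python, not any real queue API) of a growable ring-buffer queue with the requested primitive: a `shrink_to_fit` that reallocates the backing buffer down to the current element count. The class and method names are hypothetical.

```python
class ShrinkableQueue:
    """Toy ring-buffer queue that auto-grows but, unlike today's queues,
    can also give its slack memory back on demand."""

    def __init__(self, capacity=4):
        self._buf = [None] * capacity
        self._head = 0          # index of the oldest element
        self._count = 0

    def enqueue(self, item):
        if self._count == len(self._buf):
            self._resize(len(self._buf) * 2)   # auto-grow, as queues do today
        self._buf[(self._head + self._count) % len(self._buf)] = item
        self._count += 1

    def dequeue(self):
        if self._count == 0:
            raise IndexError("queue empty")
        item = self._buf[self._head]
        self._buf[self._head] = None
        self._head = (self._head + 1) % len(self._buf)
        self._count -= 1
        return item

    def shrink_to_fit(self):
        """The requested primitive: release the slack left behind after
        a consumer catch-up burst, keeping only the live elements."""
        self._resize(max(self._count, 1))

    def _resize(self, new_capacity):
        # Reallocate the backing buffer and compact live elements to the front.
        new_buf = [None] * new_capacity
        for i in range(self._count):
            new_buf[i] = self._buf[(self._head + i) % len(self._buf)]
        self._buf = new_buf
        self._head = 0

    @property
    def capacity(self):
        return len(self._buf)
```

After a lag-and-catch-up cycle (say, 100 elements enqueued and 97 drained), the buffer has grown to 128 slots; one `shrink_to_fit` call drops it back to 3, and the queue simply grows again if it ever needs to.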
The alternative is to add code that periodically force-destroys the queue, teach the consumer to gracefully handle the resulting error and open a new reference, and then replicate that change for every queue. That seems messy and puts too much responsibility on the consumer. I'd rather periodically call a 'deallocate queue memory' primitive on the queue reference within the producer, perhaps once every few minutes, just to be sure none of the queues is needlessly holding a large amount of memory.
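For contrast, this is roughly what the destroy-and-recreate workaround forces onto every consumer; all names here are hypothetical, sketched in Python.

```python
class DestroyedQueueError(Exception):
    """Raised when a stale reference to a destroyed queue is used."""

class ManagedQueue:
    """Toy queue reference that can be force-destroyed to free its memory."""

    def __init__(self):
        self._items = []
        self._alive = True

    def enqueue(self, item):
        if not self._alive:
            raise DestroyedQueueError
        self._items.append(item)

    def dequeue(self):
        if not self._alive:
            raise DestroyedQueueError
        return self._items.pop(0) if self._items else None

    def destroy(self):
        # The only point at which memory is actually released today.
        self._alive = False
        self._items = []

# Producer periodically replaces registry["work"] with a fresh queue
# after destroying the old one; consumers hold stale references.
registry = {"work": ManagedQueue()}

def consumer_step():
    """Every consumer must be taught to survive a forced destroy and
    re-obtain a fresh reference -- the burden this idea would remove."""
    try:
        return registry["work"].dequeue()
    except DestroyedQueueError:
        registry["work"] = ManagedQueue()   # gracefully reopen
        return None
```

The error-handling branch has to be duplicated in every consumer of every queue, which is exactly why a single in-place deallocation primitive would be cleaner.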
I realize this will hurt enqueue performance when the queue begins to grow quickly again, but that area is not a bottleneck for my application.
Thanks!