Introduction
I recently ran into a problem where I needed to store a collection of objects in a queue for quick access. This became a problem when I started holding a large number of these objects, each of a decent size, in memory: I was seeing OutOfMemoryExceptions and other odd behavior caused by exhausting my system's memory.
The OverflowQueue<T> class is a direct result of needing a FIFO queue that would not blow my system's memory away when flooded with objects. Using MSMQ as a back-end to the standard .NET generic queue successfully mitigates the memory issue, at an acceptable cost in performance.
That being said, I am sure there are improvements that can be made to this class, and I welcome all feedback!
Using the Code
The sample project included here contains several classes, of which I am only going to discuss OverflowQueue<T>, but feel free to use the other classes, which contain methods to create/modify MSMQ queues, retrieve typed app.config values, and other utility methods.
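If you have not used System.Messaging before, the core of a queue-creation helper usually looks something like the sketch below. This is not the author's helper, just an illustration; the queue path shown is made up, and the sample project derives its own path from the queueName passed to OverflowQueue<T>.

using System.Messaging;

static void EnsureQueueExists(string queuePath)
{
    // Create a private MSMQ queue if it does not already exist.
    // MessageQueue.Exists only works for local queues, which is what
    // the overflow queue is here.
    if (!MessageQueue.Exists(queuePath))
    {
        MessageQueue.Create(queuePath);
    }
}

// e.g. EnsureQueueExists(@".\Private$\OverflowQueueSample");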
The OverflowQueue<T> class itself must be created using a serializable type. This is mandated by the binary formatter that is used to pack and unpack items from MSMQ.
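For example, a type like the following (a made-up message class, not one from the sample project) satisfies that requirement:

using System;

[Serializable]
public class WorkItem
{
    public int Id { get; set; }
    public string Payload { get; set; }
}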
OverflowQueue<T> has one constructor, which takes two parameters:
public OverflowQueue(string queueName, bool useBackgroundMsmqPull)
{...}
The first parameter, queueName, is the name of the MSMQ queue that should be created and used for the overflow from the in-memory queue.
The second parameter, useBackgroundMsmqPull, defines whether the operation of pulling from MSMQ to repopulate the in-memory queue is done on the calling thread (on the call to Dequeue()) or on a background thread that runs every second. For performance reasons, I set this value to true so that the pull from MSMQ happens on a secondary thread and does not hold up my dequeue operation. The pull from MSMQ looks like this:
void PullFromMSMQ()
{
    // Drain MSMQ into the in-memory queue until MSMQ is empty or the
    // in-memory queue is back at its maximum size.
    while (Interlocked.Read(ref currentMSMQSize) > 0
        && Interlocked.Read(ref currentQueueSize) < maxInternalQueueSize)
    {
        Message message = overflowMSMQ.Receive();
        Interlocked.Decrement(ref currentMSMQSize);

        T item = message.Body as T;
        PushToMemoryQueue(item);
    }

    // Once MSMQ is empty, switch back to in-memory-only operation.
    // The count is re-checked inside the lock (double-checked locking)
    // so a concurrent enqueue cannot flip the flag incorrectly.
    if (Interlocked.Read(ref currentMSMQSize) <= 0)
    {
        lock (msmqLock)
        {
            if (Interlocked.Read(ref currentMSMQSize) <= 0)
                pushingToMSMQ = false;
        }
    }
}
So you can see why I use a background thread to pull from MSMQ: my application does a good amount of processing on each dequeue, so it works better for me to have the repopulation happen off the calling thread, but that may not be the case for everyone.
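For reference, a once-per-second background pull can be wired up with a System.Threading.Timer along the lines below. This is only my sketch of that wiring; the field name and timer mechanism used in the actual source may differ.

// Illustrative only: run PullFromMSMQ roughly once per second on a
// thread-pool thread when useBackgroundMsmqPull is true.
private System.Threading.Timer pullTimer;

private void StartBackgroundPull()
{
    pullTimer = new System.Threading.Timer(
        _ => PullFromMSMQ(),      // callback runs on a thread-pool thread
        null,                     // no state object needed
        TimeSpan.FromSeconds(1),  // initial due time
        TimeSpan.FromSeconds(1)); // repeat interval
}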
The rest of the OverflowQueue<T> class is fairly easy to use; the following functions allow you to enqueue or dequeue messages:
public void Enqueue(T item) {...}
public T Dequeue() {...}
Both of these functions will transparently use the overflow MSMQ queue to enqueue or dequeue messages, which you can see if you download the source code.
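Putting it together, typical usage looks something like this (WorkItem is the hypothetical serializable type shown earlier, and the queue name is arbitrary):

// Overflow beyond the in-memory limit spills into an MSMQ queue named
// "workItemOverflow"; passing true enables the once-per-second
// background pull that refills the in-memory queue.
OverflowQueue<WorkItem> queue = new OverflowQueue<WorkItem>("workItemOverflow", true);

queue.Enqueue(new WorkItem { Id = 1, Payload = "first" });
queue.Enqueue(new WorkItem { Id = 2, Payload = "second" });

WorkItem next = queue.Dequeue();   // items come back in FIFO order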
Points of Interest
My goal in creating this class was to build a utility that transparently incorporates the MSMQ back-end for my queue. All functionality dealing with the creation and use of the underlying MSMQ queue is encapsulated in the OverflowQueue<T> class itself.
There are only two prerequisites to using this class:
- The user must have MSMQ installed
- The type <T> in OverflowQueue<T> must be serializable (a runtime check for this is sketched below)
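If you want to fail fast when the second prerequisite is not met, a guard like the following could be added to the constructor (my illustration; the source may or may not include such a check):

// Reject types that are not marked [Serializable] up front, since the
// binary formatter would fail later when overflowing to MSMQ.
if (!typeof(T).IsSerializable)
{
    throw new ArgumentException(
        typeof(T).FullName + " must be serializable to be stored in MSMQ.");
}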
History
This is the first revision of the OverflowQueue<T> class.