I've been doing a lot of multi-threading work recently using the standard Thread class, the Worker Queue, and the new PLINQ (Parallel LINQ). The problem with most of the built-in generic collections (Queue<>, List<>, Dictionary<>, etc.) is that they are not thread safe.
I created a library of thread-safe collections that lets me use the standard generic collection operations (foreach, LINQ, etc.) while remaining thread safe.
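As a quick illustration of the problem (a sketch only, not part of the library, and assuming using System.Threading.Tasks), here is what happens when multiple threads write to a plain Queue<> with no locking; the exact failure varies by timing, but the result is unreliable:
// Sketch only: concurrent writes to a plain Queue<T> with no locking.
// The final Count is often wrong, and the loop can even throw, because
// Queue<T> makes no guarantees under simultaneous writers.
Queue<int> unsafeQueue = new Queue<int>();
Parallel.For(0, 10000, i =>
{
    unsafeQueue.Enqueue(i);   // race: an internal array resize can corrupt state
});
Console.WriteLine(unsafeQueue.Count);   // frequently prints less than 10000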
The classes in this library implement the appropriate collection interfaces (IEnumerable, ICollection, etc.), and each class exposes all of the functions and properties of its original, non-thread-safe counterpart.
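For reference, the TQueue<> shell might look something like the sketch below; the exact interface list in the downloadable library is an assumption on my part, but IEnumerable<T> is the one the foreach support later in this post relies on.
// Sketch of the class shell; check the downloaded source for the exact declaration.
public class TQueue<T> : IEnumerable<T>
{
    // ... the fields, constructors, Enqueue/Dequeue, and the generic
    // GetEnumerator() shown in the rest of this post go here ...

    // non-generic IEnumerable member required by IEnumerable<T>
    System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
    {
        return GetEnumerator();   // forwards to the generic GetEnumerator() below
    }
}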
You can download a copy of the entire library, which includes a thread-safe List<>, Dictionary<>, and Queue<>, here: Thread Safe Generic Collections.
TQueue<> Example
The first thing we need to do is create a container for the TQueue and a lock object to guard it. I generally prefer ReaderWriterLockSlim because it is lightweight and fast.
private readonly Queue<T> m_Queue;                                        // the underlying, non-thread-safe storage
private readonly ReaderWriterLockSlim LockQ = new ReaderWriterLockSlim(); // guards every access to m_Queue
Just like the standard Queue, we have three constructor overloads: one creates an empty Queue, one creates a Queue with a specified initial capacity, and one populates the Queue from an IEnumerable collection.
public TQueue()
{
m_Queue = new Queue<T>();
}
public TQueue(int capacity)
{
m_Queue = new Queue<T>(capacity);
}
public TQueue(IEnumerable<T> collection)
{
m_Queue = new Queue<T>(collection);
}
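Usage mirrors the standard Queue<T> constructors (a quick sketch, assuming the TQueue<> class above):
TQueue<string> empty  = new TQueue<string>();
TQueue<string> sized  = new TQueue<string>(100);
TQueue<string> seeded = new TQueue<string>(new[] { "a", "b", "c" });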
This next function is probably the most important one. GetEnumerator() is what a foreach loop calls to walk the collection. Following Microsoft's example of a thread-safe enumerator, we first take a copy of the underlying Queue and then iterate over that copy. You'll notice the read lock is held only while the copy of the container Queue is made.
public IEnumerator<T> GetEnumerator()
{
Queue<T> localQ;
LockQ.EnterReadLock();
try
{
localQ = new Queue<T>(m_Queue);
}
finally
{
LockQ.ExitReadLock();
}
foreach (T item in localQ)
yield return item;
}
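Because the enumerator walks a private copy, a foreach can run safely while other threads keep writing; it simply won't see items added after the snapshot was taken. A quick sketch:
TQueue<int> queue = new TQueue<int>();

// A writer thread keeps adding items...
Thread writer = new Thread(() =>
{
    for (int i = 0; i < 1000; i++)
        queue.Enqueue(i);
});
writer.Start();

// ...while the reader safely enumerates the snapshot taken by GetEnumerator().
foreach (int item in queue)
    Console.WriteLine(item);   // sees only items present when the copy was made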
A Queue must include Enqueue and Dequeue methods for adding and removing items. Just as in every other function, we use the lock to protect data access.
public void Enqueue(T item)
{
LockQ.EnterWriteLock();
try
{
m_Queue.Enqueue(item);
}
finally
{
LockQ.ExitWriteLock();
}
}
public T Dequeue()
{
LockQ.EnterWriteLock();
try
{
return m_Queue.Dequeue();
}
finally
{
LockQ.ExitWriteLock();
}
}
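As a usage sketch, a producer thread can Enqueue while a consumer Dequeues. Note that, just like the standard Queue<T>, Dequeue on an empty queue throws, so the Count check below (Count is assumed to mirror the standard property) is only safe because there is a single consumer; with multiple consumers, check-then-dequeue becomes a race and a TryDequeue-style method would be a better fit.
TQueue<string> work = new TQueue<string>();

// Producer
new Thread(() =>
{
    for (int i = 0; i < 100; i++)
        work.Enqueue("job " + i);
}).Start();

// Single consumer: safe only because no other thread dequeues.
int received = 0;
while (received < 100)
{
    if (work.Count > 0)
    {
        Console.WriteLine(work.Dequeue());
        received++;
    }
}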
I found that I often need to enqueue multiple items at once, which led to the EnqueueAll functions. You'll notice the second overload takes the thread-safe list (TList).
public void EnqueueAll(IEnumerable<T> ItemsToQueue)
{
LockQ.EnterWriteLock();
try
{
foreach (T item in ItemsToQueue)
m_Queue.Enqueue(item);
}
finally
{
LockQ.ExitWriteLock();
}
}
public void EnqueueAll(TList<T> ItemsToQueue)
{
LockQ.EnterWriteLock();
try
{
foreach (T item in ItemsToQueue)
m_Queue.Enqueue(item);
}
finally
{
LockQ.ExitWriteLock();
}
}
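For example, a batch of results can be pushed under a single write lock instead of locking once per item, which keeps lock contention down (sketch):
TQueue<int> queue = new TQueue<int>();
int[] batch = { 1, 2, 3, 4, 5 };
queue.EnqueueAll(batch);   // one write lock for the whole batch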
And since we have an EnqueueAll, I also found a need to dequeue everything at once. DequeueAll returns a thread-safe list (TList) instead of a standard List.
public TList<T> DequeueAll()
{
LockQ.EnterWriteLock();
try
{
TList<T> returnList = new TList<T>();
while (m_Queue.Count > 0)
returnList.Add(m_Queue.Dequeue());
return returnList;
}
finally
{
LockQ.ExitWriteLock();
}
}
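A consumer can then drain everything that has accumulated under one write lock and process the items outside of it (sketch; Process is a placeholder for your own handler):
TList<int> pending = queue.DequeueAll();   // one write lock, queue is left empty
foreach (int item in pending)
    Process(item);                         // placeholder for your own processing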