So, as I mentioned in the previous post, all this work to build lockless queues is really a waste of time. Why? Well, to answer that question we need something to compare against. The class below derives from Queue<T> and adds the locking necessary to make the queue thread-safe (well, not fully thread-safe, only those two methods are safe, but you get the idea). After a few benchmarks I quickly realized that over 10 million queue/dequeue operations (on two threads) the performance delta was around 2 seconds. Spread those 2 seconds across 10 million operations and the lock + event overhead works out to roughly 0.0002 ms, or about 200 nanoseconds, per queue/dequeue operation. When you compare those 200 nanoseconds with the 100 milliseconds of network latency it took to get the request in the first place, it would be absurd to use a custom ‘Lockless’ queue. Maybe, JUST maybe, you could find a use for something like that if you’re developing games or something; in most cases, however, what you wind up with is a maintenance nightmare and a debugging hell on earth.
Go ahead and build yourself a little test harness using this queue and the one from the previous post and see for yourself (a rough sketch of such a harness follows the class below). The implementation below is superior in several ways: it doesn’t have a polling loop, so the response to an enqueue is potentially faster; it supports any number of producers and consumers, so operations can run in parallel; and you get this behavior essentially for free (or close to it).
Now, this post is not meant to belittle the effort behind the new threading objects in .Net 4.0. This is really more about what you should, and should not, attempt to do on your own. I’ll leave it for you to decide if the .Net 4.0 team wasted their time ;)
class LockingQueue<T> : Queue<T>
{
    // Signaled whenever an item is enqueued so a waiting consumer can wake up.
    ManualResetEvent _mre = new ManualResetEvent(false);

    public LockingQueue(int size)
        : base(size)
    { }

    public bool IsEmpty
    {
        get { return base.Count == 0; }
    }

    // Hides Queue<T>.Enqueue to add locking and signal any waiting consumer.
    public new void Enqueue(T obj)
    {
        lock (this)
        {
            base.Enqueue(obj);
            _mre.Set();
        }
    }

    // Tries to dequeue an item, waiting up to 'timeout' milliseconds for a
    // producer to signal that something was added.
    public bool TryDequeue(int timeout, out T value)
    {
        lock (this)
        {
            if (base.Count > 0)
            {
                value = base.Dequeue();
                return true;
            }
            // Queue is empty; reset the event before releasing the lock so a
            // subsequent Enqueue will signal us.
            _mre.Reset();
        }

        _mre.WaitOne(timeout, false);

        lock (this)
        {
            if (base.Count > 0)
            {
                value = base.Dequeue();
                return true;
            }
        }

        value = default(T);
        return false;
    }
}
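For the curious, here is roughly what that test harness could look like. This is only a sketch of my own, not code from the original benchmark: the QueueBenchmark class name, the thread layout, and the timing output are assumptions. It pushes 10 million items through the LockingQueue<T> above with one producer and one consumer; to compare against the lockless queue from the previous post, swap in that type instead.

using System;
using System.Diagnostics;
using System.Threading;

static class QueueBenchmark
{
    const int Operations = 10000000;

    static void Main()
    {
        var queue = new LockingQueue<int>(1024);
        var timer = Stopwatch.StartNew();

        // Producer: push 10 million items as fast as possible.
        var producer = new Thread(() =>
        {
            for (int i = 0; i < Operations; i++)
                queue.Enqueue(i);
        });

        // Consumer: pull until every item has been seen.
        var consumer = new Thread(() =>
        {
            int received = 0;
            int value;
            while (received < Operations)
            {
                if (queue.TryDequeue(100, out value))
                    received++;
            }
        });

        producer.Start();
        consumer.Start();
        producer.Join();
        consumer.Join();

        timer.Stop();
        Console.WriteLine("{0:n0} ops in {1:n0} ms ({2:n2} ns/op)",
            Operations, timer.ElapsedMilliseconds,
            timer.Elapsed.TotalMilliseconds * 1000000.0 / Operations);
    }
}

The absolute numbers will vary by machine; the point is only the order of magnitude of the per-operation overhead. If you are on .Net 4.0 you could also drop System.Collections.Concurrent.BlockingCollection<T> into the same harness and see how the built-in threading objects fare.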