From time to time I see posts about removing the use of lock() from producer/consumer queues where the number of threads operating is limited. This is an exploration into writing such a queue, the techniques used, and how to create the thread interactions without the use of locks.

Step 1 – Building the first thing that works. Let’s start simple and assume we have a single thread reading from the queue and a single thread writing to the queue. Now we just need to ensure that these two threads don’t write to the same variables. This is actually a very straightforward problem. We will need a small class to keep a forward-only linked list of the items in the queue. Then we will track both the head and tail of the queue items. So here is the first pass, the simplest thing that works:

    class LocklessQueue<T>
    {
        class Item
        {
            // Volatile so the consumer is guaranteed to see the producer's
            // write that links a new item into the list, and the producer
            // is guaranteed to see the consumer invalidating an item.
            public volatile Item Next;
            volatile bool _valid;
            T _value;

            public Item(bool valid, T value)
            {
                _valid = valid;
                _value = value;
                Next = null;
            }

            public bool IsValid { get { return _valid; } }

            public T TakeValue()
            {
                T value = _value;
                _valid = false;
                _value = default(T); // release the reference for the GC
                return value;
            }
        }

        Item _first; // advanced only by the consumer thread
        Item _last;  // advanced only by the producer thread

        public LocklessQueue()
        {
            // Start with a single invalid placeholder so that _first and
            // _last are never null and the two threads never need to
            // write to the same variable.
            _first = _last = new Item(false, default(T));
        }

        // Must be called from the consumer thread, since it advances _first.
        public bool IsEmpty
        {
            get
            {
                while (!_first.IsValid && _first.Next != null)
                    _first = _first.Next;
                return !_first.IsValid;
            }
        }

        public void Enqueue(T value)
        {
            Item i = new Item(true, value);
            _last.Next = i; // publish the fully constructed item
            _last = i;
        }

        public T Dequeue()
        {
            // Skip over items the consumer has already taken.
            while (!_first.IsValid && _first.Next != null)
                _first = _first.Next;

            if (!_first.IsValid)
                throw new InvalidOperationException(); // queue is empty

            return _first.TakeValue();
        }
    }
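
To make the single-reader/single-writer contract concrete, here is a minimal usage sketch (the polling loop and the item count are my own assumptions about usage, not part of the class; assumes using System.Threading):

    // Hypothetical usage: exactly one producer thread, one consumer thread.
    LocklessQueue<int> queue = new LocklessQueue<int>();

    Thread producer = new Thread(delegate()
    {
        for (int i = 0; i < 1000; i++)
            queue.Enqueue(i);
    });

    Thread consumer = new Thread(delegate()
    {
        int received = 0;
        while (received < 1000)
        {
            // Safe only because IsEmpty and Dequeue both run on
            // this one consumer thread.
            if (!queue.IsEmpty)
            {
                queue.Dequeue();
                received++;
            }
        }
    });

    producer.Start();
    consumer.Start();
    producer.Join();
    consumer.Join();

Note the busy-wait in the consumer; the synchronization needed to wait for data without spinning is exactly what part 3 sets out to address.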

Now the question is how can we improve upon this? Well, one problem this suffers from is the creation of an ‘Item’ instance for each object added to the queue. While this might sound like a small problem, it can grow quite large depending upon the frequency of inserts. So the first thing we should look at is reducing the heap thrashing by changing Item to be a collection rather than a single instance. This will be done in part 2 of this post. Part 3 will enhance its usability a little and attempt to address a mild amount of synchronization between the reader/writer so we can wait for data. Lastly, if I get to it, we can explore how this can be expanded to allow n readers and writers, provided we have a fixed value for ‘n’.
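
To preview the array-backed idea (this is only a sketch under my own assumptions; the field names, the block size, and the exact layout are not the part-2 code): each node holds a fixed number of slots, so the producer allocates once per block of items instead of once per item.

    // Sketch only: an array-backed replacement for Item that amortizes
    // allocation cost. Block size of 32 is an arbitrary assumption.
    class ItemBlock<T>
    {
        public const int Size = 32;
        public readonly T[] Values = new T[Size];
        public volatile int Count;           // slots filled by the producer
        public volatile ItemBlock<T> Next;   // next block, once this one is full
    }

The producer writes Values[Count] and then increments the volatile Count to publish the slot; only when Count reaches Size does it allocate a new block and link it via Next.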