Last post Feb 26, 2019 05:08 PM by since3point0
I have a Windows service that processes a "batch" of parallel tasks from a queue. The queue contains hundreds of objects (per Windows service). Due to memory, CPU, and some other factors, I can only process a limited number of tasks at any time (generally about 5-20), based on the customer's environment. The problem I am having trouble with is that when one task completes, I want to pop another task off the queue and run it in parallel. In other words: run a fixed number of parallel tasks, and add a task from the queue whenever one of the running tasks completes, so the fixed set of parallel tasks stays full until the queue is empty. My goal is to keep the code easy to follow, so that other programmers can read it easily.
I have tried quite a few different iterations, but have not arrived at a correct solution yet. I was curious whether anybody had any suggestions.
This is the batch process I currently have running in production, but it is not as efficient as I would like.
private ConcurrentQueue<Mailbox> _mailboxNameQueue = new ConcurrentQueue<Mailbox>();
private CancellationTokenSource _ctsMainServiceCancellationToken = new CancellationTokenSource();

private void ProcessBatch(CancellationToken sharedToken) // token is passed into the function
{
    // Queue could contain hundreds of items
    var results = new ConcurrentBag<object>();
    // Create a new batch by removing items from the queue
    var mailboxList = GetNewMailboxListFromQueue();
    try
    {
        mailboxList.AsParallel().WithCancellation(sharedToken).ForAll(item =>
        {
            var response = ProcessMailbox(_serviceInfo, item, sharedToken);
            results.Add(response);
        });
        // Process results
    }
    catch (OperationCanceledException ex) { /* service is stopping */ }
}
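For what it's worth, one common way to get the "refill a slot as soon as a task finishes" behavior described above is a `SemaphoreSlim`-gated loop rather than a PLINQ batch, which only starts a new batch after the whole previous batch finishes. The sketch below is generic and makes some assumptions: `Mailbox`/`ProcessMailbox` would be the types from the service above, and `maxParallel` would come from the customer's configuration.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class BatchRunner
{
    // Runs items from the queue with at most maxParallel tasks in flight.
    // As soon as one task completes, its slot is released and the next
    // item is dequeued, so the slots stay full until the queue is empty.
    public static async Task RunAsync<T>(
        ConcurrentQueue<T> queue,
        Func<T, CancellationToken, Task> processItem,
        int maxParallel,
        CancellationToken token)
    {
        var slots = new SemaphoreSlim(maxParallel, maxParallel);
        var running = new List<Task>();

        while (queue.TryDequeue(out var item))
        {
            // Waits (asynchronously) until one of the fixed slots frees up.
            await slots.WaitAsync(token);
            var current = item; // capture a copy for the lambda
            running.Add(Task.Run(async () =>
            {
                try { await processItem(current, token); }
                finally { slots.Release(); } // free the slot for the next item
            }, token));
        }

        await Task.WhenAll(running);
    }
}
```

In the service above this might be invoked with something like `await BatchRunner.RunAsync(_mailboxNameQueue, (m, ct) => Task.Run(() => ProcessMailbox(_serviceInfo, m, ct)), maxParallel, sharedToken);`. TPL Dataflow's `ActionBlock<T>` with `ExecutionDataflowBlockOptions.MaxDegreeOfParallelism` is another option that achieves the same slot-refilling behavior.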
So if anybody has any suggestions I would greatly appreciate them.