Introduction
Queues, stacks, deques, and linked lists are a part
of the basic set of programming tools that we have at our disposal.
In reviewing the examples available for stacks, I found that most dealt with simple objects such
as strings and integers, the primary example being the development of a Reverse Polish Notation calculator.
A nice article that discusses that application is available on
CodeProject.
Background
A Stack is a programmatic structure that is most easily represented by a stack of dishes in your kitchen:
when you go for a dish, the first dish removed is normally the last one that you put back. This is also
described as a Last In First Out (LIFO) structure. In .NET C# we get access to a built-in stack structure
by adding the collections namespace to the list of using directives at the start of the program:
using System.Collections;
This will give you access to the Queue, Stack, ArrayList
and other classes. I have already discussed queues in
this article.
The Problem
The problem that I will use to exemplify the use of a stack is a stream of jobs arriving to be processed.
Each job will have a unique name, so that we can keep track of it, and a priority. The selection of a job
to be processed is based on its priority. We are using a simple rule that a high priority job will be selected
ahead of a low priority job. A job selected for processing will be processed until it is complete.
Because the jobs may be arriving at any time we store them in a Queue as they await processing.
We have a few options as to how we will process these jobs.
A.
- Look at the first job in the queue; if it is a high priority job, process it.
- Look at the first job in the queue; if it is a low priority job, look at the next job.
If it is also a low priority job, process the first job.
- Look at the first job in the queue; if it is a low priority job,
look at the next job. If it is a high priority job, push the first job onto a stack and
process the second job.
B.
- Look at the entire queue, put the low priority jobs on the stack, and process the
high priority jobs as you find them.
- When you are out of high priority jobs, go to the stack and remove the low priority jobs
one at a time until they are all processed.
The problem with method A is that if the first job is a low priority job followed by a steady stream of high priority jobs, it may never get processed. Method B has the same problem, and in addition the low priority jobs, if you can ever get to them, are processed in the reverse order of their arrival.
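To make the contrast concrete, here is a minimal sketch of what method B might look like, written as a hypothetical ProcessAllJobsB helper that uses the Job class defined later in this article; the second loop is where the reverse-order behaviour comes from.

// Hypothetical sketch of method B (not the solution developed below).
public void ProcessAllJobsB(Queue jobQueue)
{
    Stack lowPriorityJobs = new Stack();

    // Process high priority jobs as they are found; park the low priority ones.
    while (jobQueue.Count != 0)
    {
        Job job = (Job)jobQueue.Dequeue();
        if (job.jobPriority == "high")
        {
            Console.WriteLine("Processing {0}", job.jobName);
        }
        else
        {
            lowPriorityJobs.Push(job);
        }
    }

    // Out of high priority jobs: drain the stack, newest first,
    // so the low priority jobs come out in reverse order of arrival.
    while (lowPriorityJobs.Count != 0)
    {
        Job job = (Job)lowPriorityJobs.Pop();
        Console.WriteLine("Processing {0}", job.jobName);
    }
}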
The methods and properties that we have to work with, again applying the 80/20 rule, are:
.Count
: the number of items in the stack
.Push(obj)
: put an object on the top of the stack
.Pop()
: remove and return the top item of the stack
.Peek()
: look at the top item of the stack without removing it
Again, a good visualization is the plate dispenser at the salad bar. The dishwasher puts the plates on the stack, and you can only remove the top one. If you peek at it and see it's dirty, you are stuck with it anyway.
We can also do copies and clones of the stack; I would refer you to my previously referenced article, which has an example of how to do this with a queue. The approach would be the same for the stack.
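To see those members in action, here is a minimal stand-alone sketch (plates again, and entirely separate from the job example that follows):

using System;
using System.Collections;

class StackDemo
{
    static void Main()
    {
        Stack plates = new Stack();

        plates.Push("plate 1");              // bottom of the stack
        plates.Push("plate 2");
        plates.Push("plate 3");              // top of the stack

        Console.WriteLine(plates.Count);     // 3
        Console.WriteLine(plates.Peek());    // plate 3 - look at the top, nothing removed
        Console.WriteLine(plates.Pop());     // plate 3 - removed, Last In First Out
        Console.WriteLine(plates.Count);     // 2
    }
}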
The Solution
A Job class is created, and a job has two properties: a jobName and a jobPriority.
class Job
{
    public string jobName;      // unique name used to track the job
    public string jobPriority;  // "high" or "low"

    public Job() { }

    public Job(string name, string priority)
    {
        jobName = name;
        jobPriority = priority;
    }
}
We also create a JobManager
class that will manage the jobs, assigning
them to the processor in accordance with the business rules that we establish.
We are going to use the rules in A. This limits the low priority
jobs to only one in the stack at a time, but it seems a little more fair than the rules in B: if there is a break in the job stream, or a sequence of low
priority jobs, the deferred job will get processed in a reasonable fashion. However, if there is
a long run of high priority jobs, it will be stuck on the stack for a while.
class JobManager
{
    public Queue jobQueue = new Queue();   // jobs waiting to be processed
    public Stack jobStack = new Stack();   // deferred low priority jobs
    ...
}
Now for a further discussion of the processing rules.
We look at the stack first as we process the jobs, and we apply the following rules:
Stack Empty:
- If the first job is a high priority job, process it.
- If the first two jobs are low priority jobs, process the first one.
- If the first job is a low priority job and the second job is a high priority job, push the first
job on the stack and process the second job.
Stack not Empty:
- If the next job in the queue is a high priority job, process it.
- If the next job is a low priority job, pop the job on the stack and process it.
Sidebar
A special note is needed here: when you remove an item from the stack, you need to be aware that it is an object and needs to be cast back to its specific type so that it can be used, e.g.: localJob = (Job)jobStack.Pop();
This takes the object on the top of the stack, casts it to a Job, and assigns it to the variable localJob.
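As an aside, the generic Stack&lt;T&gt; from System.Collections.Generic removes the need for the cast; a minimal sketch (this article otherwise stays with the non-generic collections):

using System.Collections.Generic;

// With a typed stack, Pop() already returns a Job, so no cast is required.
Stack<Job> typedStack = new Stack<Job>();
typedStack.Push(new Job("job1", "low"));
Job localJob = typedStack.Pop();   // no (Job) cast needed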
Back to the problem
Do this processing until the queue is empty.
Job job1 = new Job("job1", "low");
Job job2 = new Job("job2", "low");
Job job3 = new Job("job3", "high");
Job job4 = new Job("job4", "high");
Job job5 = new Job("job5", "low");
Job job6 = new Job("job6", "low");
JobManager jobManager = new JobManager();
jobManager.jobQueue.Enqueue(job1);
jobManager.jobQueue.Enqueue(job2);
jobManager.jobQueue.Enqueue(job3);
jobManager.jobQueue.Enqueue(job4);
jobManager.jobQueue.Enqueue(job5);
jobManager.jobQueue.Enqueue(job6);
jobManager.ProcessJobQueue( jobManager.jobQueue);
Now for the processing in the ProcessJobQueue
method:
We have localJob,
which is the job that has just been removed from the queue, and nextJob,
which is the job that has just been exposed at the front of the queue and is
examined with the .Peek method.
The while loop will let us look at all of the items in the queue.
public void ProcessJobQueue(Queue workingQueue)
{
    Job localJob = new Job();
    Job nextJob = new Job();

    while (workingQueue.Count != 0)
    {
First, the tests for the stack. Before we Peek at the next job, we check that the queue is not empty.
        if (jobStack.Count != 0)
        {
            // Only Peek if the queue still has a job to look at.
            if (workingQueue.Count > 0)
            {
                nextJob = (Job)workingQueue.Peek();
            }

            if (nextJob.jobPriority == "high")
            {
                // A high priority job is next: take it from the queue, leave the stack alone.
                localJob = (Job)workingQueue.Dequeue();
                Console.WriteLine("Processing High Priority Job {0} from the Queue, the stack left alone: ", localJob.jobName);
            }

            if (nextJob.jobPriority == "low")
            {
                // A low priority job is next: process the deferred job from the stack instead.
                localJob = (Job)jobStack.Pop();
                Console.WriteLine("Processing Job {0} POP-ed from stack:", localJob.jobName);
            }
        }
        if (jobStack.Count == 0)
        {
            localJob = (Job)workingQueue.Dequeue();

            // Peek at the job behind the one just removed, if there is one
            // (otherwise nextJob keeps the value from the last Peek).
            if (workingQueue.Count > 0)
            {
                nextJob = (Job)workingQueue.Peek();
            }

            if (localJob.jobPriority == "high")
            {
                Console.WriteLine("Processing High priority Job {0} directly from queue: ", localJob.jobName);
            }

            if ((localJob.jobPriority == "low") && (nextJob.jobPriority == "low"))
            {
                Console.WriteLine("Processing Low priority Job {0} directly from queue with a low priority job behind it:", localJob.jobName);
            }

            if ((localJob.jobPriority == "low") && (nextJob.jobPriority == "high"))
            {
                // Defer the low priority job on the stack and process the high priority job behind it.
                jobStack.Push(localJob);
                Console.WriteLine("Stack Length: {0} ", jobStack.Count.ToString());
                Console.WriteLine("Job PUSH-ed on stack {0}", localJob.jobName);
                localJob = (Job)workingQueue.Dequeue();
                Console.WriteLine("Processing Job {0}, the previous job was pushed onto the stack ", localJob.jobName);
            }
        }
    } // end of the while loop - the queue is now empty
When the queue is finally empty, we check the stack for any job that may remain on it.
    if (jobStack.Count != 0)
    {
        localJob = (Job)jobStack.Pop();
        Console.WriteLine("Processing Job : {0}", localJob.jobName);
    }
}
The Output
Processing Low priority Job job1 directly from queue with a low priority job behind it:
Stack Length: 1
Job PUSH-ed on stack job2
Processing Job job3, the previous job was pushed onto the stack
Processing High Priority Job job4 from the Queue, the stack left alone:
Processing Job job2 POP-ed from stack:
Processing Low priority Job job5 directly from queue with a low priority job behind it:
Processing Low priority Job job6 directly from queue with a low priority job behind it:
Points of Interest
There are two basic processor scheduling algorithms that are used by large systems: "A non-preemptive system, where jobs are executed one at a time to completion. The processor will only start the service of a low priority job if no jobs of a higher priority are scheduled. But once that processor has selected a job, it is committed to serve that job to
completion even if jobs of higher priority arrive during its service. ...
In a preemptive system, several jobs can be in various stages of execution.
At any moment the processor is serving a single job; but upon arrival of a job with
a higher priority, the job in service is interrupted and returned to the queue.
Service of the higher priority job is then started. The interrupted job will be
resumed later when no jobs of a higher priority are present. Preemptive scheduling
gives fast response to urgent jobs at the price of increased complexity and overhead.
" [Operating System Principles by Per Brinch Hansen, Prentice-Hall, 1973, p. 195]
The scheduling method we have used for this example is a variation of the
non-preemptive schedule, the exception being that we put the deferred lower priority job on
the stack as opposed to returning it to the queue.
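In code terms the difference is a single line. A hypothetical sketch of the two choices at the point where a low priority job is deferred:

// Variation used in this article: park the deferred low priority job on the stack.
jobStack.Push(localJob);

// A plain non-preemptive queue discipline would instead send it to the back of the line:
// workingQueue.Enqueue(localJob);   // hypothetical alternative, not used above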
History
No changes to date.