Introduction
A queue is a data structure that works exactly the way a real-life queue works. When you insert an element into this structure, the new element is added at the rear of it.
On the other hand, when you take an element out of it, the element at the front is given to you. That means the elements leave in the order they entered. It is known as a First In First Out (FIFO) structure.
A basic queue structure supports the following operations:
- push(element): The element is added to the rear of the queue.
- front(): This returns the element at the front.
- pop(): The element at the front is returned and deleted from the queue.
- size(): This returns the size of the queue.
- empty(): This checks if the queue is empty.
If you use a doubly linked list to implement a queue, you can do all of these operations in O(1) with the help of a global variable to keep count of the size. There is another variant of queue called the priority queue. In this kind of queue, you can set the priority of every element, i.e. you can fix where the data is held. Most of the operations on a priority queue are O(log n).
Implementation of a queue using stacks:
Yes, a “TwoStackQueue” can be implemented using two Stack data structures, regardless of how the Stack is internally implemented. Of course, there is an associated performance cost over a “native” Queue.
A Stack implements most of the Queue functionality, as you probably noticed. However, a lone Stack cannot iterate through its data, which is why a second stack is required. Think of your data as a liquid that is poured between two glass containers. To build this “TwoStackQueue” you will have to reimplement the Queue’s Enqueue, Dequeue and Front functions.
There are performance choices that you can make here (like with the OneQueueStack), depending on whether you specifically want an optimised Enqueue(), an optimised Dequeue() and Front(), or an overall balanced and globally optimised system.
1. To go with the first choice (fast Enqueue), after every operation (whether Enqueue or Dequeue), you push all data head-first into one of the stacks, so that the tail data is at the top of that stack. Then when new data is Enqueue()d, you can internally Push() it onto that same Stack. If instead you receive a Dequeue() request, you need to move all the data out of that stack tail-first into the other stack, so that the head is at the top of that Stack, at which point you can Pop() it internally and Dequeue() it externally. This makes Dequeue() quite expensive, O(N) in fact. (Cache the head value, so you don’t have to do a full rotation when Front() is called.)
2. If you go with the second choice (fast Dequeue), then by default the data is always sitting in the other stack, with the head value at the top of the stack. When Dequeue() (or Front()) is called, that data is immediately available. However, when Enqueue() is called, all the data must be transferred head-first over into the other stack, then the new data added, then all the data shifted tail-first into the original stack, so that the head value is once more on top of the stack, ready to go. So in this case, as before, there are two different O(N) iterations required, making Enqueue() expensive.
3. It would only make sense to go with one of the two earlier cases if you really need that call to always be optimised. Otherwise, the best overall system performance comes from a balanced case.
Here, you essentially acknowledge that in the long term there will be an equal number of Enqueue and Dequeue calls, and also that we cannot necessarily predict the sequence in which they will arrive. Much like how we might treat an elevator in a two-storey building, we simply leave our TwoStackQueue in the state required by the previous operation. With this choice, if we Enqueue-Dequeue-Enqueue-Dequeue, we do no speculative data shifting in the background.
So the answer is yes, you can implement a Queue with two Stacks, and you need to choose whether you want to guarantee a fast Enqueue (but have a very slow Dequeue), guarantee a fast Dequeue (but have a very slow Enqueue), or accept a slightly slower Dequeue and Enqueue on average but the best balanced overall system performance.
The task is to implement a queue using instances of the stack data structure and operations on them.
A queue can be implemented using two stacks. Let q be the queue to be implemented, and let stack1 and stack2 be the stacks used to implement q. The queue q can be implemented in two ways:
Method 1 (by making the enQueue operation costly): This method makes sure that the oldest entered element is always at the top of stack1, so that the deQueue operation simply pops from stack1. To put the element at the top of stack1, stack2 is used.
enQueue(q, x):
- While stack1 is not empty, push everything from stack1 to stack2.
- Push x to stack1 (assuming the size of the stacks is unlimited).
- Push everything from stack2 back to stack1.
Here the time complexity will be O(n).
deQueue(q):
- If stack1 is empty, then raise an error.
- Pop an item from stack1 and return it.
Here the time complexity will be O(1).
Below is the implementation of the above approach:
// CPP program to implement Queue using
// two stacks with costly enQueue()
#include <bits/stdc++.h>
using namespace std;

struct Queue {
    stack<int> s1, s2;

    // Enqueue an item to the queue
    void enQueue(int x)
    {
        // Move all elements from s1 to s2
        while (!s1.empty()) {
            s2.push(s1.top());
            s1.pop();
        }

        // Push item into s1
        s1.push(x);

        // Push everything back to s1
        while (!s2.empty()) {
            s1.push(s2.top());
            s2.pop();
        }
    }

    // Dequeue an item from the queue
    int deQueue()
    {
        // If the first stack is empty
        if (s1.empty()) {
            cout << "Q is Empty";
            exit(0);
        }

        // Return top of s1
        int x = s1.top();
        s1.pop();
        return x;
    }
};

// Driver code
int main()
{
    Queue q;
    q.enQueue(1);
    q.enQueue(2);
    q.enQueue(3);

    cout << q.deQueue() << '\n';
    cout << q.deQueue() << '\n';
    cout << q.deQueue() << '\n';

    return 0;
}
Output:
- 1
- 2
- 3
Complexity Analysis:
- Time Complexity: Push operation: O(N). In the worst case we have to empty the whole of stack1 into stack2. Pop operation: O(1), the same as the pop operation on a stack.
- Auxiliary Space: O(N), for the stack used to store values.
Method 2 (by making the deQueue operation costly): In this method, in the en-queue operation, the new element is pushed on top of stack1. In the de-queue operation, if stack2 is empty, then all the elements are moved to stack2 and finally the top of stack2 is returned.
enQueue(q, x)
1) Push x to stack1 (assuming size of stacks is unlimited).
Here time complexity will be O(1)
deQueue(q)
1) If both stacks are empty then error.
2) If stack2 is empty
While stack1 is not empty, push everything from stack1 to stack2.
3) Pop the element from stack2 and return it.
Here time complexity will be O(n)
Method 2 is definitely better than Method 1. Method 1 moves all the elements twice in the enQueue operation, whereas Method 2 (in the deQueue operation) moves the elements once, and moves them only if stack2 is empty. So the amortised complexity of the deQueue operation becomes O(1).
Implementation of method 2:
// CPP program to implement Queue using
// two stacks with costly deQueue()
#include <bits/stdc++.h>
using namespace std;

struct Queue {
    stack<int> s1, s2;

    // Enqueue an item to the queue
    void enQueue(int x)
    {
        // Push item into the first stack
        s1.push(x);
    }

    // Dequeue an item from the queue
    int deQueue()
    {
        // If both stacks are empty
        if (s1.empty() && s2.empty()) {
            cout << "Q is empty";
            exit(0);
        }

        // If s2 is empty, move
        // elements from s1
        if (s2.empty()) {
            while (!s1.empty()) {
                s2.push(s1.top());
                s1.pop();
            }
        }

        // Return the top item from s2
        int x = s2.top();
        s2.pop();
        return x;
    }
};

// Driver code
int main()
{
    Queue q;
    q.enQueue(1);
    q.enQueue(2);
    q.enQueue(3);

    cout << q.deQueue() << '\n';
    cout << q.deQueue() << '\n';
    cout << q.deQueue() << '\n';

    return 0;
}
Output:
- 1
- 2
- 3
Complexity Analysis:
- Time Complexity: Push operation: O(1), the same as the push operation on a stack. Pop operation: O(N). In the worst case we have to empty the whole of stack1 into stack2.
- Auxiliary Space: O(N), for the stack used to store values.
A queue can also be implemented using one user stack and the function call stack. Below is a modified Method 2 where recursion (i.e. the function call stack) is used to implement the queue using only one user-defined stack.
- enQueue(x):
- 1) Push x to stack1.
- deQueue():
- 1) If stack1 is empty, then raise an error.
- 2) If stack1 has only one element, then return it.
- 3) Recursively pop everything from stack1, store the last popped item in a variable res, push the other items back to stack1, and return res.
Step 3 makes sure that the last popped item is always returned, and since the recursion stops when there is only one item in stack1 (step 2), we get the last element of stack1 in deQueue(), and all the other items are pushed back in step 3.