Code360 powered by Coding Ninjas X Naukri.com
Last Updated: Mar 27, 2024
Medium

Weighted Job Scheduling


Introduction

A traditional CPU can execute only one job at a time, which keeps the system busy but yields poor throughput. Therefore, various job scheduling algorithms come into the picture to maximize CPU utilization and increase the system’s efficiency.

Parallel execution of multiple jobs is one such technique, accomplished by scheduling the CPU based on the profit or priority assigned to each job.

This article will discuss the variations of the job scheduling problem in data structures and algorithms, followed by a comprehensive discussion of various approaches from brute force to the most optimal.

Weighted Job Scheduling

At first, the problem might seem daunting, but we urge you to read till the end; after that, there is no turning back, and you will be able to solve this problem even half asleep.

Also see: Multiprogramming vs Multitasking and Open Source Operating System.

Problem Statement

In weighted job scheduling, you’re given a list of jobs, each with a start time, a finish time, and a profit. A job’s profit is awarded only if the job is completed within its time window. One more catch here is that only one machine is available for processing.

So here, find the maximum profit you can achieve such that no two scheduled jobs overlap. In this problem, we must focus on maximizing profit rather than on the number of jobs executed. Now that we have the problem statement, let’s figure out how to approach its solution.

Solution Approach

A clear vision of our goals is the driving force behind our efforts to achieve them. Similarly, before we further search for the best solution to this problem, let’s explore what the answer might be.

  • A viable solution to this problem will include a set of jobs, each of which can be completed by the specified deadline without overlapping.
  • The value of a possible solution will be the sum of the profits associated with the jobs in the above set. 

Let’s look at an example to help us understand in a better way:

Let the number of jobs n = 4,
profits (p1, p2, p3, p4) = (100, 10, 15, 27),
start times (s1, s2, s3, s4) = (1, 0, 1, 0), and finish times (f1, f2, f3, f4) = (2, 1, 2, 1).

Solution: The total time available for the execution of jobs is 2 units (equal to the maximum finish time).

Now we have to schedule the jobs to maximize profit. The feasible schedules are listed below.

Schedule 1: job 2, then job 1 (value 110)
Schedule 2: job 4, then job 1 (value 127)
Schedule 3: job 2, then job 3 (value 25)
Schedule 4: job 4, then job 3 (value 42)

Answer: Schedule 2 is optimal. In this case, only jobs 4 and 1 are processed, and the value is 127. Therefore, the processing order is job 4 followed by job 1.
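To sanity-check this answer, one can brute-force the example: enumerate every subset of jobs, discard any subset containing an overlapping pair, and keep the best total profit. Below is a minimal sketch of that check (the class and method names are illustrative, not part of the article’s code):

```java
// Exhaustive sanity check for the worked example: try every subset of jobs,
// keep only subsets with no overlapping pair, and track the best profit.
public class ScheduleEnumDemo {

    static int bestProfit(int[] s, int[] f, int[] p) {
        int n = s.length, best = 0;
        for (int mask = 0; mask < (1 << n); mask++) {
            boolean feasible = true;
            int total = 0;
            for (int i = 0; i < n && feasible; i++) {
                if ((mask & (1 << i)) == 0) continue;
                total += p[i];
                for (int j = i + 1; j < n; j++) {
                    // Jobs i and j overlap unless one finishes before the other starts.
                    if ((mask & (1 << j)) != 0 && s[i] < f[j] && s[j] < f[i]) {
                        feasible = false;
                        break;
                    }
                }
            }
            if (feasible) best = Math.max(best, total);
        }
        return best;
    }

    public static void main(String[] args) {
        // Start, finish, and profit arrays for jobs 1..4 from the example.
        System.out.println(bestProfit(new int[]{1, 0, 1, 0},
                                      new int[]{2, 1, 2, 1},
                                      new int[]{100, 10, 15, 27})); // prints 127
    }
}
```

This exponential check is only practical for tiny inputs, but it confirms that 127 is indeed the best achievable value here.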

Let’s look at a high-level pseudocode sketch of a straightforward selection procedure.

Algorithm JobScheduling(d, J, n)
{
      // J is the set of jobs that can be completed by their deadlines
      // n is the total number of jobs
      J := { 1 };
      for i := 2 to n do
      {
            if (all jobs in J ∪ {i} can be completed by their deadlines)
                  then J := J ∪ { i };
      }
}

Variations of Job Scheduling Problem

Basic Version

You are given a list of n jobs, each with a start and end time. Choose the maximum number of jobs that a single processor can complete, given that it can work on only one job at a time.

Note: The profit associated with each job is the same in this case.

Example: Given four jobs with their start and finish times.

Job1 = { 1, 3 }
Job2 = { 2, 4 }
Job3 = { 3, 5 }
Job4 = { 5, 7 }

Consider yourself a boss whose daily routine includes attending meetings; you want to attend the maximum number of meetings with the above start and end times.


How would you approach this problem?

Reasonably, you would choose the meetings so that the completion time of the current meeting does not clash with the starting time of the next meeting scheduled.

The greedy approach formalizes the same intuition. It suggests choosing the next job so that its completion time is the earliest among the remaining jobs and its start time is greater than or equal to the preceding job’s finish time.

To accomplish this:

  • Sort the jobs by their completion times so that the job with the earliest finish time always comes first.
  • From the sorted array, pick the first job and print it.
  • Select the next job and print it if its start time is greater than or equal to the finish time of the previously selected job.
  • Repeat the previous step over the remaining jobs in the sorted array until all jobs are processed.

C++ Implementation of the above Approach

#include <bits/stdc++.h>
using namespace std;
 
// Program to print a maximum set of jobs that can be processed by a single processor, one at a time.
//  Total number of jobs given - n
//  Array containing the start time - s [ ] 
//  Array containing the  finish time - f [ ] 

void printMaxJobs( int s[ ], int f[ ], int n )
{
    int i, j;
 
    // The first activity always gets selected
    i = 0;
    cout <<" "<< i;
 
    // Consider rest of the activities
    for (j = 1; j < n; j++)
    {
      // If this job has start time greater than or
      // equal to the finish time of previously selected
      // job, then select it
      if (s[j] >= f[i])
      {
          cout <<" " << j;
          i = j;
      }
    }
}
 
// driver program 
int main()
{
    int s[] =  { 10, 12, 15, 25 };
    int f[] =  { 15, 25, 25, 35};
    int n = sizeof(s)/sizeof(s[0]);
    printMaxJobs(s, f, n);
    return 0;
}

Output:

0 2 3

Now let’s continue to the second variation of this problem.

Standard Version 

In this case, the profit associated with different jobs is different. Therefore, it is possible that a schedule with fewer jobs can have more profit than a schedule with more jobs. 

Approach 1 (Brute Force Approach)

  1. We will use recursion to reduce the big problem into several smaller subproblems.
  2. The idea is to create an array ‘Jobs’ of size N, where each entry of Jobs will have three elements: the start time of the job, the end time of the job, and the profit associated with the job.
  3. After that, we’ll sort the Jobs array in increasing order of finish time.
  4. Now, call a maxProfitHelper function that returns the maximum profit obtained by performing the jobs.
  5. The algorithm for maxProfitHelper(Jobs, current) will be as follows (where ‘Jobs’ is the sorted array of jobs and ‘current’ is the index of the current job in the Jobs array):
    • If current == 0: return Jobs[0].profit.
  6. Now, return the maximum of two profits:
    • Maximum profit by excluding the current job.
    • Maximum profit by including the current job.
  7. To calculate the profit by including the current job, find the latest job in the sorted Jobs array before the current one that does not conflict with Jobs[current], using another helper function, nonConflictingJob(). Suppose the index of that job comes out to be i; then recursively call maxProfitHelper(Jobs, i).

This way, the total profit becomes the profit from the recursive call + Jobs[current].profit.

Time Complexity

In the worst case, sorting the Jobs array takes O(NlogN) time. For each job, we have two choices, either to include it or not, which yields 2^N combinations, and at each step we find the non-conflicting job, which takes O(N) time in the worst case. So, the overall time complexity becomes O(NlogN) + O(N * 2^N), which is O(N * 2^N).

Space Complexity

We created a Jobs array of size N, which takes O(N) space in the worst case. Also, since we use recursion, the recursion stack takes another O(N) space. Thus, the final space complexity is O(2*N) or O(N).

Let’s have a look at the Implementation of the Above Approach:

import java.util.Arrays;
import java.util.Comparator;

class job{
    int startTime;
    int endTime;
    int jobProfit;

    public job(int val){
        this.startTime = val;
        this.endTime = val;
        this.jobProfit = val;
    }

}

public class Solution{
    
    // Function to find the latest non conflicting job.
    public static int nonConflictingJob(job[] jobs, int i)
    {
        for (int j = i - 1; j >= 0; j--)
        {
            if (jobs[j].endTime <= jobs[i].startTime)
            {
                return j;
            }
        }

        return -1;
    }

    public static long maxProfitHelper(job[] jobs, int current)
    {
    
        if (current == 0){
            return jobs[0].jobProfit;
        }
    
        // First finding the profit by excluding the current job.
        long excProfit = maxProfitHelper(jobs, current - 1);
    
        // Then, finding the profit by including the current job.
        long incProfit = jobs[current].jobProfit;
    
        // Finding the index of closest non conflicting job with current job.
        int index = nonConflictingJob(jobs, current);
    
        // If the index is not equal to -1, recursively calling for that job.
        if (index != -1){
            incProfit += maxProfitHelper(jobs, index);
        }
    
        return Math.max(incProfit, excProfit);
    }
    

    public static long findMaxProfit(int[] start, int[] end, int[] profit){
    
        int n = start.length;
    
        // Creating jobs array of size N.
        job[] jobs = new job[n];

        for (int i = 0; i < n; i++){   
            job temp =  new job(0);
            temp.startTime = start[i];
            temp.endTime = end[i]; 
            temp.jobProfit = profit[i];
            jobs[i] = temp;
        }

        Arrays.sort(jobs, new Comparator<job>() {
            @Override
            public int compare(job a, job b) {
                // Integer.compare avoids the overflow that subtraction can cause.
                return Integer.compare(a.endTime, b.endTime);
            }
        });

    
        long maxProfit = maxProfitHelper(jobs, n - 1);
    
        return maxProfit;
    }
}

Approach 2 (Recursive Memoisation):

In this approach, we will apply the memoization technique, which is one way of implementing dynamic programming. If we analyze the recursion carefully, we can see that we solve the same subproblems multiple times; these are known as overlapping subproblems.

We will use memoization to eliminate these repeated calculations by storing the answer for each recursive state. The value obtained from a particular recursive call is stored in a 1-D array called the lookUp array, where lookUp[i] stores the maximum profit obtained by performing the jobs up to the ith index.

We will not go into a detailed description of how memoization or dynamic programming works here; for more insight, refer to Dynamic Programming and Algorithms.

Algorithm: 

  • An array named lookUp of size N is created.
  • Recursive Function: maxProfitHelper( Jobs, current, lookUp ), where ‘Jobs’ is the sorted array of jobs, and ‘current’ is the index of the current job of the Jobs array. The initial call of this function will have current = N-1 as an argument.
  • Base condition: if current == 0, return Jobs[0].profit.
  • If the value at lookUp[current] is not -1, use this value; that is, return it.
  • There will be two cases:
    • Exclude the current job: In this case, we will not perform the current job and recursively call for maxProfitHelper( Jobs, current-1, lookUp).
    • Include the current job: In this case, we will find the latest non-conflicting job before the Job[current] using another helper function, nonConflictingJob(). And recursively call for maxProfitHelper(Jobs, inx, lookUp) where inx is the index of non-conflicting job. The total profit, in this case, becomes profit from recursive call + Jobs[current].profit.
    • The answer for this recursive function will be the maximum of the two profits.
    • Update the maximum profit value in lookUp[current] for further use.

Time Complexity

O(N*N) per test case, where N is the number of jobs.

Explanation:

In the worst case, sorting the Jobs array takes O(NlogN) time. There are N states in the lookup table, and for each state we call nonConflictingJob(), which takes O(N) time to find the latest non-conflicting job. So, the overall time complexity becomes O(N*N) + O(NlogN), which is O(N*N).

Space Complexity

O(N) per test case, where N is the number of jobs.

Explanation:

We create a jobs array and a lookUp array of size N, each taking O(N) space in the worst case. In addition, the recursion stack takes O(N) space. Thus, the overall space complexity becomes O(N) + O(N) + O(N) = O(3*N) or O(N).

Java implementation of the above Algorithm

import java.util.Arrays;
import java.util.Comparator;

class job{
    int startTime;
    int endTime;
    int jobProfit;

    public job(int val){
        this.startTime = val;
        this.endTime = val;
        this.jobProfit = val;
    }

}

public class Solution{

    // Function to find the latest non conflicting job.
    public static int nonConflictingJob(job[] jobs, int i){
        for (int j = i - 1; j >= 0; j--){
            if (jobs[j].endTime <= jobs[i].startTime)
            {
                return j;
            }
        }

        return -1;
    }

    public static long maxProfitHelper(job[] jobs, long[] lookUp, int current){

        if (current == 0){
            return jobs[0].jobProfit;
        }

        // If the answer already exists.
        if (lookUp[current] != -1)
        {
            return lookUp[current];
        }

        // First finding the profit by excluding the current job.
        long excProfit = maxProfitHelper(jobs, lookUp, current - 1);

        // Then, finding the profit by including the current job.
        long incProfit = jobs[current].jobProfit;

        // Finding the index of closest non conflicting job to current job.
        int index = nonConflictingJob(jobs, current);

        // If the index is not equal to -1, recursively calling for that job.
        if (index != -1){
            incProfit += maxProfitHelper(jobs, lookUp, index);
        }

        // Storing the answer for further use.
        lookUp[current] = Math.max(incProfit, excProfit);

        return lookUp[current];
    }

    public static long findMaxProfit(int[] start, int[] end, int[] profit){

        int n = start.length;

        // Creating jobs array of size N.
        job[] jobs = new job[n];

        for (int i = 0; i < n; i++){   
            job temp =  new job(0);
            temp.startTime = start[i];
            temp.endTime = end[i]; 
            temp.jobProfit = profit[i];
            jobs[i] = temp;
        }

        Arrays.sort(jobs, new Comparator<job>() {
            @Override
            public int compare(job a, job b) {
                // Integer.compare avoids the overflow that subtraction can cause.
                return Integer.compare(a.endTime, b.endTime);
            }
        });

        // Creating a lookUp array of size N.
        long[] lookUp = new long[n];
        for(int i = 0; i < n; i++){
            lookUp[i] = -1;
        }

        long maxProfit = maxProfitHelper(jobs, lookUp, n - 1);

        return maxProfit;
    }

}

Approach 3 (Bottom-Up DP)

This strategy is built on the observation that the maximum profit attainable up to the ith index is unaffected by the jobs that come after it. As a result, for each index we store the maximum profit obtainable from the jobs up to that index in an array called the DP array. Thus, for each index, there will be two cases:

  • Include the current job: In this case, we iterate over all jobs from i-1 down to 0 and find the latest job that does not conflict with our current job, i. The total profit in this case becomes the DP value of that non-conflicting job + the profit of the ith job.
  • Exclude the current job: In this case, the maximum profit is simply the answer for the first i-1 jobs, DP[i-1].
  • The maximum profit for the ith job becomes the maximum of the two cases.

Algorithm: 

  • Take a DP array of size ‘N’ and initialize all its values with 0.
  • Set DP[0] = Jobs[0].profit.
  • For every index ‘i’ of the Jobs array:
    • Iterate over all indices ‘j’ such that i-1 >= j >= 0; if the job at index j does not conflict with the job at index i, set DP[i] = max(DP[i], Jobs[i].profit + DP[j]) and then break the loop.
    • At the end of the above iteration, set DP[i] = max(DP[i], DP[i-1]). DP[i-1] is the maximum profit when the current job is excluded.
  • Return the value of DP[N-1], that is, the maximum profit considering all jobs.

Time Complexity

In the worst case, sorting the jobs array of size N takes O(NlogN) time. Filling the DP array takes N steps, and filling each DP[i] takes O(N) time to find the non-conflicting job. So, the overall time complexity becomes O(N*N) + O(NlogN) = O(N*N).

Space Complexity

In the worst case, we create a jobs array of size N and a DP array of size N. So, the overall space complexity becomes O(N).

Let’s look at the implementation of the above approach

import java.util.Arrays;
import java.util.Comparator;

class job{
    int startTime;
    int endTime;
    int jobProfit;

    public job(int val){
        this.startTime = val;
        this.endTime = val;
        this.jobProfit = val;
    }

}

public class Solution{

    // Function to find the latest non conflicting job.
    public static int nonConflictingJob(job[] jobs, int i){
        for (int j = i - 1; j >= 0; j--){
            if (jobs[j].endTime <= jobs[i].startTime){
                return j;
            }
        }

        return -1;
    }

    public static long findMaxProfit(int[] start, int[] end, int[] profit){

        int n = start.length;

        // Creating jobs array of size N.
        job[] jobs = new job[n];

        for (int i = 0; i < n; i++){   
            job temp =  new job(0);
            temp.startTime = start[i];
            temp.endTime = end[i]; 
            temp.jobProfit = profit[i];
            jobs[i] = temp;
        }

        Arrays.sort(jobs, new Comparator<job>() {
            @Override
            public int compare(job a, job b) {
                // Integer.compare avoids the overflow that subtraction can cause.
                return Integer.compare(a.endTime, b.endTime);
            }
        });

        // Creating a dp array of size N.
        long[] dp = new long[n];

        dp[0] = jobs[0].jobProfit;

        for (int i = 1; i < n; i++)
        {
            // First calculating the profit by excluding the current job.
            long excProfit = dp[i - 1];

            // Then, calculating the profit by including the current job.
            long incProfit = jobs[i].jobProfit;

            // Finding the index of non conflicting job.
            int index = nonConflictingJob(jobs, i);

            // If the index is not equal to -1, adding the profit of that job.
            if (index != -1){
                incProfit += dp[index];
            }

            // Updating the dp array.
            dp[i] = Math.max(incProfit, excProfit);
        }

        return dp[n - 1];
    }

}

Check out Longest Common Substring

Approach 4 (Using Binary Search)

This method is an optimization of the previous approach. Since the jobs are already sorted in increasing order of finish time, we can use binary search instead of a linear scan to find the latest non-conflicting job before the current job, which makes the solution more efficient.

Time Complexity

In the worst case, we are sorting the Jobs array of size N, which takes O(NlogN) time, then filling the DP table of size N, which takes O(N) time. But now, for filling each DP[i], we are taking O(logN) time to find a non-conflicting job. So, the overall complexity becomes O(NlogN).

Space Complexity

In the worst case, we create a job array of size N and a DP array of size N. So, the space complexity is O(N).

import java.util.Arrays;
import java.util.Comparator;

class job{
    int startTime;
    int endTime;
    int jobProfit;

    public job(int val){
        this.startTime = val;
        this.endTime = val;
        this.jobProfit = val;
    }

}

public class Solution{

    // Function to find the latest non conflicting job.
    public static int nonConflictingJob(job[] jobs, int i)
    {
        int s = 0;
        int e = i - 1;
        int ans = -1;

        while (s <= e)
        {
            int mid = (s + e) / 2;

            if (jobs[mid].endTime <= jobs[i].startTime)
            {
                ans = mid;
                s = mid + 1;
            }
            else
            {
                e = mid - 1;
            }
        }

        return ans;
    }

    public static long findMaxProfit(int[] start, int[] end, int[] profit){

        int n = start.length;

        // Creating jobs array of size N.
        job[] jobs = new job[n];

        for (int i = 0; i < n; i++){   
            job temp =  new job(0);
            temp.startTime = start[i];
            temp.endTime = end[i]; 
            temp.jobProfit = profit[i];
            jobs[i] = temp;
        }

        Arrays.sort(jobs, new Comparator<job>() {
            @Override
            public int compare(job a, job b) {
                // Integer.compare avoids the overflow that subtraction can cause.
                return Integer.compare(a.endTime, b.endTime);
            }
        });

        // Creating a dp array of size N.
        long[] dp = new long[n];

        dp[0] = jobs[0].jobProfit;

        for (int i = 1; i < n; i++)
        {
            // First calculating the profit by excluding the current job.
            long excProfit = dp[i - 1];

            // Then, calculating the profit by including the current job.
            long incProfit = jobs[i].jobProfit;

            // Finding the index of non conflicting job.
            int index = nonConflictingJob(jobs, i);

            // If the index is not equal to -1, adding the profit of that job.
            if (index != -1){
                incProfit += dp[index];
            }

            // Updating the dp array.
            dp[i] = Math.max(incProfit, excProfit);
        }

        return dp[n - 1];
    }
}
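Putting the pieces together, here is a compact, self-contained sketch of the binary-search DP above, run on the worked example from the beginning of the article (the class and method names here are illustrative; only the technique mirrors the code above):

```java
import java.util.Arrays;

// Self-contained sketch of the binary-search DP: sort jobs by finish time,
// then for each job either skip it or add its profit to the best profit of
// the latest non-conflicting job, found via binary search.
public class WeightedJobDemo {

    static long findMaxProfit(int[] start, int[] end, int[] profit) {
        int n = start.length;
        Integer[] idx = new Integer[n];
        for (int i = 0; i < n; i++) idx[i] = i;
        // Sort job indices by finish time, as the DP requires.
        Arrays.sort(idx, (a, b) -> Integer.compare(end[a], end[b]));

        long[] dp = new long[n];
        dp[0] = profit[idx[0]];
        for (int i = 1; i < n; i++) {
            long inc = profit[idx[i]];
            // Binary search for the latest job finishing no later than this start.
            int lo = 0, hi = i - 1, found = -1;
            while (lo <= hi) {
                int mid = (lo + hi) / 2;
                if (end[idx[mid]] <= start[idx[i]]) { found = mid; lo = mid + 1; }
                else hi = mid - 1;
            }
            if (found != -1) inc += dp[found];
            dp[i] = Math.max(inc, dp[i - 1]); // include vs. exclude current job
        }
        return dp[n - 1];
    }

    public static void main(String[] args) {
        // Jobs 1..4 from the worked example: profits 100, 10, 15, 27.
        System.out.println(findMaxProfit(new int[]{1, 0, 1, 0},
                                         new int[]{2, 1, 2, 1},
                                         new int[]{100, 10, 15, 27})); // prints 127
    }
}
```

Sorting an index array keeps the input arrays untouched; the article’s version instead copies the data into job objects, which is equivalent.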

If you have worked through all the approaches, you should attempt the real problem Weighted Job Scheduling, available on Coding Ninjas Studio, which is designed to provide students with loads of practice problems, interview experiences, blogs, and everything they need to crack the technical interviews of their dream company.

Must Read: FCFS Scheduling

Also Read About: PCB in Operating System

Frequently Asked Questions

What is the weighted interval scheduling problem?

Weighted job scheduling is a problem in which you have to find the maximum profit you can make given specific jobs, their start and end times, and the profit earned on completing each job, under the constraint that no two scheduled jobs may overlap.

Which algorithm is most suitable for weighted task scheduling?

Dynamic programming is the most suitable technique for weighted task scheduling, because once profits differ between jobs, a purely greedy choice is no longer guaranteed to be optimal.

Is task scheduling dynamic programming?

Dynamic programming breaks a problem into overlapping subproblems and reuses their solutions. It is a reliable approach for task scheduling, for example real-time scheduling on heterogeneous multiprocessor platforms with task duplication.

What is the time complexity of the job scheduling algorithm using memoization?

O(N*N) is the time complexity of the job scheduling algorithm using memoization, since each of the N states may take O(N) time to find the latest non-conflicting job.

Where is the task scheduling algorithm used?

The CPU uses task scheduling to improve its efficiency: CPU scheduling determines which process owns the CPU for execution while other processes are on hold.

Conclusion

In this article, we addressed the problem of job scheduling when profits are involved. We have observed that the greedy strategy fits the simple version, which holds the same profit for all jobs. It is also known as the activity selection problem. But, in the case of varying profit, four approaches and their respective time and space complexity were discussed. Finally, dynamic programming methodology was used to determine the optimal solution. 

Recommended Problem - K Closest Points To Origin

If you’re a beginner with trouble grasping the dynamic programming concept, don’t worry; you can refer to  Roadmap For Beginners To Master Dynamic Programming to understand concepts thoroughly.
