Welcome to our blog dedicated to understanding Weighted-Job Scheduling, a critical problem in the realm of algorithmic optimization.

Weighted-Job Scheduling involves assigning a set of jobs with varying weights and durations to a limited resource, such as a single machine or processor, while maximizing the total weight or profit. This problem finds applications in diverse domains, including project management, production scheduling, and computational resource allocation.

More formally, weighted job scheduling asks us to schedule a set of jobs on a single machine or processor, where each job has an associated weight or profit, which represents its importance or value, and a duration, which denotes the time required to complete the job.

The objective of weighted job scheduling is to maximize the total weight or profit of the scheduled jobs while adhering to certain constraints, such as the availability of the machine and the order in which the jobs must be completed. In other words, the goal is to allocate the limited resources efficiently to complete the most valuable jobs within the given time constraints.


Problem Statement

In weighted job scheduling, you're given a list of jobs to complete, each with a start time, a finish time, and a profit. A job's profit is awarded only if the job is completed within its deadline. One more catch: only one machine is available for processing.

The task is to find the maximum profit you can achieve such that no two jobs overlap. In this problem, we must focus on maximizing profit rather than on the number of jobs executed. Now that we have the problem statement, let's figure out how to approach the solution.

Solution Approach

A clear vision of our goals is the driving force behind our efforts to achieve them. Similarly, before we search further for the best solution to this problem, let's explore what an answer might look like.

A viable solution to this problem will include a set of jobs, each of which can be completed by the specified deadline without overlapping.

The value of a possible solution will be the sum of the profits associated with the jobs in the above set.

Let's look at an example to help us understand this better:

Let the number of jobs n = 4,
profits (p1, p2, p3, p4) = (100, 10, 15, 27),
start times (s1, s2, s3, s4) = (1, 0, 1, 0), and finish times (f1, f2, f3, f4) = (2, 1, 2, 1).

Solution: The total time available for the execution of jobs is 2 units (equal to the maximum deadline). Refer to the Gantt chart below for a pictorial interpretation.

Now we have to schedule the jobs assigned to us to maximize profit. All of these schedules are listed in the table below.

S.no   Feasible solution   Processing Sequence   Value
1      (J1, J2)            2, 1                  110
2      (J1, J4)            4, 1                  127
3      (J2, J3)            2, 3                  25
4      (J3, J4)            4, 3                  42
5      (J1)                1                     100
6      (J2)                2                     10
7      (J3)                3                     15
8      (J4)                4                     27

Explanation of key terms of this table:

Feasible solution: a set of jobs in which every job can be completed on time.

Processing Sequence: the order in which the jobs in a feasible solution are executed.

Value: the total profit of all the jobs in a feasible solution.

Answer: Solution 2 is optimal. Only jobs 1 and 4 are processed, and the value is 127. The jobs are processed in the order: job 4 first, then job 1.
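The enumeration in the table can be checked mechanically. Below is a minimal, hypothetical Java sketch (the names EnumerateSchedules, bestValue, and feasible are ours, not from the article) that treats the four jobs as unit-time jobs with deadlines, tries every subset, keeps only the feasible ones, and reports the best value:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

class EnumerateSchedules {
    // A subset of unit-time jobs is feasible if, after sorting its
    // deadlines in increasing order, the k-th job (1-based) has deadline >= k.
    static boolean feasible(List<Integer> deadlines) {
        List<Integer> d = new ArrayList<>(deadlines);
        Collections.sort(d);
        for (int k = 0; k < d.size(); k++) {
            if (d.get(k) < k + 1) return false;
        }
        return true;
    }

    // Try every subset (bitmask) and return the best total profit.
    public static int bestValue(int[] profit, int[] deadline) {
        int n = profit.length, best = 0;
        for (int mask = 0; mask < (1 << n); mask++) {
            List<Integer> d = new ArrayList<>();
            int value = 0;
            for (int i = 0; i < n; i++) {
                if ((mask & (1 << i)) != 0) { d.add(deadline[i]); value += profit[i]; }
            }
            if (feasible(d)) best = Math.max(best, value);
        }
        return best;
    }

    public static void main(String[] args) {
        int[] profit = {100, 10, 15, 27};
        int[] deadline = {2, 1, 2, 1};
        System.out.println(bestValue(profit, deadline)); // prints 127
    }
}
```

Running it on the example above reports 127, matching feasible solution 2 in the table.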

Let's look at a high-level pseudocode sketch of this selection process (jobs are assumed to be considered in decreasing order of profit):

Algorithm JobScheduling(d, J, n)
{
    // d holds the deadlines, J is the set of jobs that can be
    // completed by their deadlines, n is the total number of jobs
    J := { 1 };
    for i := 2 to n do
    {
        if (all jobs in J U {i} can be completed by their deadlines)
            then J := J U { i };
    }
}
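As a rough illustration, the pseudocode above can be turned into runnable Java. This is our own sketch, not the article's implementation: it assumes unit-time jobs described as {profit, deadline} pairs, considers them in decreasing order of profit, and keeps a job only if the enlarged set still meets every deadline (the names GreedyBySet, schedule, and allMeetDeadlines are ours):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

class GreedyBySet {
    // Feasibility check for a set of unit-time jobs: after sorting the
    // deadlines ascending, the k-th job (1-based) must have deadline >= k.
    static boolean allMeetDeadlines(List<int[]> set) {
        List<int[]> s = new ArrayList<>(set);
        s.sort(Comparator.comparingInt(j -> j[1]));
        for (int k = 0; k < s.size(); k++) {
            if (s.get(k)[1] < k + 1) return false;
        }
        return true;
    }

    // Jobs given as {profit, deadline}. Mirrors the pseudocode: consider
    // jobs in decreasing profit order, keep job i only if J U {i} is feasible.
    public static int schedule(int[][] jobs) {
        int[][] byProfit = jobs.clone();
        Arrays.sort(byProfit, (a, b) -> b[0] - a[0]);
        List<int[]> J = new ArrayList<>();
        for (int[] job : byProfit) {
            J.add(job);
            if (!allMeetDeadlines(J)) J.remove(J.size() - 1);
        }
        int total = 0;
        for (int[] job : J) total += job[0];
        return total;
    }

    public static void main(String[] args) {
        // Intro example: profits (100, 10, 15, 27), deadlines (2, 1, 2, 1)
        int[][] jobs = {{100, 2}, {10, 1}, {15, 2}, {27, 1}};
        System.out.println(schedule(jobs)); // prints 127
    }
}
```

On the intro example this keeps jobs 1 and 4 for a total profit of 127.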

Variations of Job Scheduling Problem

Basic Version

You are given a list of n jobs, each with a start and end time. Choose the maximum number of jobs a single processor can complete, given that it can work on only one job at a time.

Note: The profit associated with each job is the same in this case.

Example: Given four jobs with their start and finish times.

Suppose you are a boss, and attending meetings is part of your daily routine; you want to attend the maximum number of meetings with the start and end times given above.

How would you approach this problem?

Reasonably, you will choose the meetings so that the completion time of the current meeting does not clash with the start time of the next scheduled meeting.

The greedy approach works the same way. It suggests choosing the next job such that its completion time is the earliest among the remaining jobs and its start time is greater than or equal to the preceding job's finish time.

To accomplish this:

Sort the jobs by their completion time so that the next job always has the earliest completion time.

From the sorted array, pick the first job and print it.

Then select the next job and print it if its start time is greater than or equal to the finish time of the previously selected job.

Repeat the previous step for the remaining jobs in the sorted array until all jobs have been processed.

C++ Implementation of the above Approach

C++

#include <bits/stdc++.h>
using namespace std;

// Program to print a maximum set of jobs that can be processed
// by a single processor, one at a time.
// Total number of jobs given - n
// Array containing the start times - s[]
// Array containing the finish times - f[]
void printMaxJobs(int s[], int f[], int n)
{
    int i, j;

    // The first job always gets selected.
    i = 0;
    cout << " " << i;

    // Consider the rest of the jobs.
    for (j = 1; j < n; j++) {
        // If this job has a start time greater than or equal to the
        // finish time of the previously selected job, select it.
        if (s[j] >= f[i]) {
            cout << " " << j;
            i = j;
        }
    }
}

// Driver program
int main()
{
    int s[] = { 10, 12, 15, 25 };
    int f[] = { 15, 25, 25, 35 };
    int n = sizeof(s) / sizeof(s[0]);
    printMaxJobs(s, f, n);
    return 0;
}

Output:

0 2 3

Now let's continue to the second variation of this problem.

Standard Version

In this case, the profit associated with different jobs is different. Therefore, it is possible that a schedule with fewer jobs can have more profit than a schedule with more jobs.

Approach 1 (Brute Force Approach)

We will use recursion to reduce the big problem into several smaller subproblems.

The idea is to create an array 'Jobs' of size N, where each entry holds three elements: the start time of the job, the end time of the job, and the profit associated with the job.

After that, we'll sort the Jobs array in increasing order of finish time.

Now, call a maxProfitHelper function that returns the maximum profit obtained by performing the jobs.

The algorithm for maxProfitHelper will be as follows:

maxProfitHelper(Jobs, current), where 'Jobs' is the sorted array of jobs and 'current' is the index of the current job in the Jobs array:

If current == 0: return jobs[0].profit

Now, return the maximum of two profits:

Maximum profit by excluding the current job.

Maximum profit by including the current job.

While calculating the profit by including the current job, the idea is to find the latest job in the sorted Jobs array before the current one that does not conflict with Jobs[current], using another helper function, nonConflictingJob(). If the index of that job is i, recursively call maxProfitHelper(Jobs, i).

This way, the total profit becomes the profit from the recursive call + Jobs[current].profit.

Time Complexity

In the worst case, sorting the Jobs array takes O(NlogN) time. For each job, we have two choices, either to include it or not, which leads to 2^N possibilities; at each step, we also find the non-conflicting job, which takes O(N) time in the worst case. So, the overall time complexity becomes O(NlogN) + O(N)*O(2^N), i.e., O(N*(2^N)).

Space Complexity

We created a Jobs array of size N, which takes O(N) space in the worst case. Also, we are using recursion, so the recursion stack will take O(N) space. Thus, the final space complexity is O(2*N) or O(N).

Let's have a look at the implementation of the above approach (in Java):

// Function to find the latest non-conflicting job.
public static int nonConflictingJob(job[] jobs, int i)
{
    for (int j = i - 1; j >= 0; j--) {
        if (jobs[j].endTime <= jobs[i].startTime) {
            return j;
        }
    }
    return -1;
}

public static long maxProfitHelper(job[] jobs, int current)
{
    if (current == 0) {
        return jobs[0].jobProfit;
    }

    // First, find the profit by excluding the current job.
    long excProfit = maxProfitHelper(jobs, current - 1);

    // Then, find the profit by including the current job.
    long incProfit = jobs[current].jobProfit;

    // Find the index of the closest job that does not conflict with the current job.
    int index = nonConflictingJob(jobs, current);

    // If the index is not -1, recursively call for that job.
    if (index != -1) {
        incProfit += maxProfitHelper(jobs, index);
    }

    return Math.max(incProfit, excProfit);
}

public static long findMaxProfit(int[] start, int[] end, int[] profit)
{
    int n = start.length;

    // Create a jobs array of size N.
    job[] jobs = new job[n];
    for (int i = 0; i < n; i++) {
        job temp = new job(0);
        temp.startTime = start[i];
        temp.endTime = end[i];
        temp.jobProfit = profit[i];
        jobs[i] = temp;
    }

    // Sort the jobs in increasing order of finish time.
    Arrays.sort(jobs, new Comparator<job>() {
        @Override
        public int compare(job a, job b) {
            return a.endTime - b.endTime;
        }
    });

    long maxProfit = maxProfitHelper(jobs, n - 1);

    return maxProfit;
}
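The snippet above references a job class and omits the enclosing class; one possible self-contained version, with a small driver, might look like the sketch below. The field names startTime, endTime, and jobProfit match the article's code; the class name RecursiveScheduler, the three-argument job constructor, and the example data are our assumptions:

```java
import java.util.Arrays;
import java.util.Comparator;

class RecursiveScheduler {
    // Minimal job record matching the fields used in the article's code.
    static class job {
        int startTime, endTime, jobProfit;
        job(int s, int e, int p) { startTime = s; endTime = e; jobProfit = p; }
    }

    // Latest job before index i that finishes no later than job i starts.
    static int nonConflictingJob(job[] jobs, int i) {
        for (int j = i - 1; j >= 0; j--) {
            if (jobs[j].endTime <= jobs[i].startTime) return j;
        }
        return -1;
    }

    // Recursively take the better of including or excluding job `current`.
    static long maxProfitHelper(job[] jobs, int current) {
        if (current == 0) return jobs[0].jobProfit;
        long excProfit = maxProfitHelper(jobs, current - 1);
        long incProfit = jobs[current].jobProfit;
        int index = nonConflictingJob(jobs, current);
        if (index != -1) incProfit += maxProfitHelper(jobs, index);
        return Math.max(incProfit, excProfit);
    }

    public static long findMaxProfit(int[] start, int[] end, int[] profit) {
        int n = start.length;
        job[] jobs = new job[n];
        for (int i = 0; i < n; i++) jobs[i] = new job(start[i], end[i], profit[i]);
        // Sort by finish time, as the approach requires.
        Arrays.sort(jobs, (a, b) -> a.endTime - b.endTime);
        return maxProfitHelper(jobs, n - 1);
    }

    public static void main(String[] args) {
        // Jobs (start, end, profit); overlapping jobs cannot both be taken.
        int[] start = {1, 2, 3, 3};
        int[] end = {3, 4, 5, 6};
        int[] profit = {50, 10, 40, 70};
        System.out.println(findMaxProfit(start, end, profit)); // prints 120
    }
}
```

On this sample, the best schedule takes jobs (1, 3, 50) and (3, 6, 70) for a profit of 120.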

Approach 2 (Recursive Memoisation)

In this approach, we will apply the memoization technique, which is one way of implementing dynamic programming. If we analyze carefully, we can infer that we are solving the same subproblems multiple times; these are known as overlapping subproblems.

We will use memoization to eliminate overlapping subproblem calculations by storing the answer for each recursive state: the value obtained from a particular recursive call is stored in a 1-D array called the lookUp array. lookUp[i] stores the maximum profit obtained by performing jobs up to the ith index.

We are not going into a detailed description of how memoization or dynamic programming works here; to get more insights, refer to Dynamic Programming and Algorithms.

Algorithm:

An array named lookUp of size N is created.

Recursive function: maxProfitHelper(Jobs, current, lookUp), where 'Jobs' is the sorted array of jobs and 'current' is the index of the current job in the Jobs array. The initial call of this function will have current = N-1 as an argument.

Base condition: if current == 0, return Jobs[0].profit.

If the value at lookUp[current] is not -1, the answer is already computed; return it.

There will be two cases:

Exclude the current job: In this case, we will not perform the current job and recursively call for maxProfitHelper( Jobs, current-1, lookUp).

Include the current job: In this case, we will find the latest non-conflicting job before the Job[current] using another helper function, nonConflictingJob(). And recursively call for maxProfitHelper(Jobs, inx, lookUp) where inx is the index of non-conflicting job. The total profit, in this case, becomes profit from recursive call + Jobs[current].profit.

The answer for this recursive function will be the maximum of the two profits.

Update the maximum profit value in lookUp[current] for further use.

Time Complexity

O(N*N) per test case, where N is the number of jobs.

Explanation:

In the worst case, we are sorting the Jobs array, which takes O(NlogN) time, then filling the lookup table of size N, which takes O(N) time; at every step, we call nonConflictingJob(), which takes O(N) time to find the latest non-conflicting job. So the overall time complexity becomes O(N*N) + O(NlogN) or O(N*N).

Space Complexity

O(N) per test case, where N is the number of jobs.

Explanation:

We are creating a job array and a lookup array of size N, which take O(N) space in the worst case. In addition, we are using recursion, so the recursion stack will take O(N) space. Thus, the overall complexity becomes O(N) + O(N) + O(N) = O(3*N) or O(N).

// Function to find the latest non-conflicting job.
public static int nonConflictingJob(job[] jobs, int i)
{
    for (int j = i - 1; j >= 0; j--) {
        if (jobs[j].endTime <= jobs[i].startTime) {
            return j;
        }
    }
    return -1;
}

public static long maxProfitHelper(job[] jobs, long[] lookUp, int current)
{
    if (current == 0) {
        return jobs[0].jobProfit;
    }

    // If the answer already exists, reuse it.
    if (lookUp[current] != -1) {
        return lookUp[current];
    }

    // First, find the profit by excluding the current job.
    long excProfit = maxProfitHelper(jobs, lookUp, current - 1);

    // Then, find the profit by including the current job.
    long incProfit = jobs[current].jobProfit;

    // Find the index of the closest job that does not conflict with the current job.
    int index = nonConflictingJob(jobs, current);

    // If the index is not -1, recursively call for that job.
    if (index != -1) {
        incProfit += maxProfitHelper(jobs, lookUp, index);
    }

    // Store the answer for further use.
    lookUp[current] = Math.max(incProfit, excProfit);

    return lookUp[current];
}

public static long findMaxProfit(int[] start, int[] end, int[] profit)
{
    int n = start.length;

    // Create a jobs array of size N.
    job[] jobs = new job[n];
    for (int i = 0; i < n; i++) {
        job temp = new job(0);
        temp.startTime = start[i];
        temp.endTime = end[i];
        temp.jobProfit = profit[i];
        jobs[i] = temp;
    }

    // Sort the jobs in increasing order of finish time.
    Arrays.sort(jobs, new Comparator<job>() {
        @Override
        public int compare(job a, job b) {
            return a.endTime - b.endTime;
        }
    });

    // Create a lookUp array of size N, initialized to -1.
    long[] lookUp = new long[n];
    for (int i = 0; i < n; i++) {
        lookUp[i] = -1;
    }

    long maxProfit = maxProfitHelper(jobs, lookUp, n - 1);

    return maxProfit;
}

Approach 3 (Bottom-Up DP)

This strategy is built on the observation that the maximum profit attainable up to the ith index is unaffected by the jobs that follow it. As a result, for each index, we'll store the maximum profit from the jobs up to that index in an array called the DP array. Thus, for each index, there will be two situations:

Include the current job: In this case, we will iterate over all jobs from i-1 down to 0 and find the latest job that does not conflict with the current job, i. The total profit, in this case, becomes the profit of that non-conflicting job + the profit of the ith job.

Exclude the current job: In this case, the maximum profit is simply the answer for the (i-1)th job, i.e., DP[i-1].

The maximum profit for the ith job becomes the maximum of these two cases.

Algorithm:

Take a DP array of size 'N' and initialize all its values with 0.

Set DP[0] = Jobs[0].profit.

For every index 'i' of the Jobs array:

Iterate over all indices 'j' such that i-1 >= j >= 0; if the job at index j does not conflict with the job at index i, set DP[i] = max(DP[i], Jobs[i].profit + DP[j]) and break the loop.

At the end of the above iteration, DP[i] = max(DP[i], DP[i-1]). DP[i-1] is the maximum profit by excluding the current job.

Return the value of DP[N-1], that is, the maximum profit till the last job.

Time Complexity

In the worst case, we are sorting the jobs array of size N, which takes O(NlogN) time, then filling the DP array of size N, which takes O(N) time; for filling each DP[i], we take O(N) time to find the non-conflicting job. So, the overall time complexity becomes O(N*N) + O(NlogN) = O(N*N).

Space Complexity

In the worst case, we create a job array of size N and a DP array of size N. So, the overall space complexity becomes O(N).

Let's look at the implementation of the above approach:

// Function to find the latest non-conflicting job.
public static int nonConflictingJob(job[] jobs, int i)
{
    for (int j = i - 1; j >= 0; j--) {
        if (jobs[j].endTime <= jobs[i].startTime) {
            return j;
        }
    }
    return -1;
}

public static long findMaxProfit(int[] start, int[] end, int[] profit)
{
    int n = start.length;

    // Create a jobs array of size N.
    job[] jobs = new job[n];
    for (int i = 0; i < n; i++) {
        job temp = new job(0);
        temp.startTime = start[i];
        temp.endTime = end[i];
        temp.jobProfit = profit[i];
        jobs[i] = temp;
    }

    // Sort the jobs in increasing order of finish time.
    Arrays.sort(jobs, new Comparator<job>() {
        @Override
        public int compare(job a, job b) {
            return a.endTime - b.endTime;
        }
    });

    // Create a dp array of size N.
    long[] dp = new long[n];

    dp[0] = jobs[0].jobProfit;

    for (int i = 1; i < n; i++) {
        // First, calculate the profit by excluding the current job.
        long excProfit = dp[i - 1];

        // Then, calculate the profit by including the current job.
        long incProfit = jobs[i].jobProfit;

        // Find the index of the latest non-conflicting job.
        int index = nonConflictingJob(jobs, i);

        // If the index is not -1, add the profit of that job.
        if (index != -1) {
            incProfit += dp[index];
        }

        // Update the dp array.
        dp[i] = Math.max(incProfit, excProfit);
    }

    return dp[n - 1];
}

Approach 4 (Binary Search Optimization)

This method is an optimization of the previous approach. Since the jobs are sorted in increasing order of finish time, using binary search to find the latest non-conflicting job before the current job makes it more efficient.

Time Complexity

In the worst case, we are sorting the Jobs array of size N, which takes O(NlogN) time, then filling the DP table of size N, which takes O(N) time. But now, for filling each DP[i], we are taking O(logN) time to find a non-conflicting job. So, the overall complexity becomes O(NlogN).

Space Complexity

In the worst case, we create a job array of size N and a DP array of size N. So, the space complexity is O(N).

// Function to find the latest non-conflicting job using binary search.
public static int nonConflictingJob(job[] jobs, int i)
{
    int s = 0;
    int e = i - 1;
    int ans = -1;

    while (s <= e) {
        int mid = (s + e) / 2;

        if (jobs[mid].endTime <= jobs[i].startTime) {
            ans = mid;
            s = mid + 1;
        } else {
            e = mid - 1;
        }
    }

    return ans;
}

public static long findMaxProfit(int[] start, int[] end, int[] profit)
{
    int n = start.length;

    // Create a jobs array of size N.
    job[] jobs = new job[n];
    for (int i = 0; i < n; i++) {
        job temp = new job(0);
        temp.startTime = start[i];
        temp.endTime = end[i];
        temp.jobProfit = profit[i];
        jobs[i] = temp;
    }

    // Sort the jobs in increasing order of finish time.
    Arrays.sort(jobs, new Comparator<job>() {
        @Override
        public int compare(job a, job b) {
            return a.endTime - b.endTime;
        }
    });

    // Create a dp array of size N.
    long[] dp = new long[n];

    dp[0] = jobs[0].jobProfit;

    for (int i = 1; i < n; i++) {
        // First, calculate the profit by excluding the current job.
        long excProfit = dp[i - 1];

        // Then, calculate the profit by including the current job.
        long incProfit = jobs[i].jobProfit;

        // Find the index of the latest non-conflicting job.
        int index = nonConflictingJob(jobs, i);

        // If the index is not -1, add the profit of that job.
        if (index != -1) {
            incProfit += dp[index];
        }

        // Update the dp array.
        dp[i] = Math.max(incProfit, excProfit);
    }

    return dp[n - 1];
}
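To see the binary-search version end to end, here is a self-contained variant we put together that stores each job as an int array instead of a job class; the class name BinarySearchScheduler and the example data are our assumptions:

```java
import java.util.Arrays;

class BinarySearchScheduler {
    // Binary search over jobs sorted by end time for the latest job
    // whose end time is <= the start time of job i.
    static int nonConflictingJob(int[][] jobs, int i) {
        int s = 0, e = i - 1, ans = -1;
        while (s <= e) {
            int mid = (s + e) / 2;
            if (jobs[mid][1] <= jobs[i][0]) { ans = mid; s = mid + 1; }
            else e = mid - 1;
        }
        return ans;
    }

    public static long findMaxProfit(int[] start, int[] end, int[] profit) {
        int n = start.length;
        int[][] jobs = new int[n][]; // each entry: {start, end, profit}
        for (int i = 0; i < n; i++) jobs[i] = new int[]{start[i], end[i], profit[i]};
        Arrays.sort(jobs, (a, b) -> a[1] - b[1]); // by finish time

        long[] dp = new long[n];
        dp[0] = jobs[0][2];
        for (int i = 1; i < n; i++) {
            // Include job i: its profit plus the best up to the latest
            // non-conflicting job; exclude job i: dp[i - 1].
            long incProfit = jobs[i][2];
            int index = nonConflictingJob(jobs, i);
            if (index != -1) incProfit += dp[index];
            dp[i] = Math.max(incProfit, dp[i - 1]);
        }
        return dp[n - 1];
    }

    public static void main(String[] args) {
        int[] start = {1, 2, 3, 3};
        int[] end = {3, 4, 5, 6};
        int[] profit = {50, 10, 40, 70};
        System.out.println(findMaxProfit(start, end, profit)); // prints 120
    }
}
```

On the sample jobs (1, 3, 50), (2, 4, 10), (3, 5, 40), (3, 6, 70), it picks the first and last jobs for a profit of 120.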

Once you have worked through all the approaches, you should attempt the real problem, Weighted Job Scheduling, available on Coding Ninjas Studio, which is designed to give students loads of practice problems, interview experiences, blogs, and everything they need to crack the technical interviews of their dream company.

Frequently Asked Questions

What is the time complexity of weighted interval scheduling?

Using dynamic programming with binary search, weighted interval scheduling can be solved in O(N log N) time, where N is the number of jobs; sorting the jobs by their finish times dominates the cost.

What is the weighted interval scheduling problem?

Weighted job scheduling is a problem in which you have to find the maximum profit you can make given a set of jobs, their start and end times, and the profit earned when a job is completed, given that no two jobs can run concurrently.

Which algorithm is most suitable for weighted task scheduling?

Dynamic programming is the most suitable approach for weighted task scheduling, because the problem exhibits optimal substructure and overlapping subproblems.

Is task scheduling dynamic programming?

Task scheduling can be solved using dynamic programming. Dynamic programming is also a reliable approach for real-time task scheduling on a heterogeneous multiprocessor platform with task duplication.

What is the time complexity of the job scheduling algorithm using memoization?

O(N*N) is the time complexity of the job scheduling algorithm using memoization.

Conclusion

In this article, we addressed the problem of job scheduling when profits are involved. We observed that the greedy strategy fits the simple version, in which all jobs carry the same profit; this is also known as the activity selection problem. For the case of varying profits, four approaches and their respective time and space complexities were discussed, and dynamic programming was used to determine the optimal solution.