AWS Lambda bills by memory-seconds (GB-seconds). Vercel's background job system packs multiple customer jobs into a single Lambda invocation to minimize cost. Given a list of jobs, each with a RAM requirement and a priority score, find the combination that maximizes total priority without exceeding the Lambda's RAM limit. This is the 0/1 knapsack problem: NP-hard in general, but solvable in pseudo-polynomial time with dynamic programming.
We rent AWS Lambda instances with a maximum RAM capacity of M MB. We have a queue of background jobs, each requiring a specific amount of RAM and yielding a specific priority score. Pick the subset of jobs to run on a single Lambda that maximizes the total priority score, subject to the total RAM requirement not exceeding M.
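A minimal sketch of the standard 0/1 knapsack DP for this problem. The function name `max_priority` and the `(ram, priority)` tuple encoding of jobs are assumptions for illustration; the judge's actual interface may differ.

```python
def max_priority(jobs, M):
    """Return the maximum total priority achievable within M MB of RAM.

    jobs: list of (ram, priority) pairs; M: RAM capacity in MB.
    dp[r] holds the best total priority using at most r MB.
    """
    dp = [0] * (M + 1)
    for ram, priority in jobs:
        # Iterate capacity downward so each job is taken at most once (0/1, not unbounded).
        for r in range(M, ram - 1, -1):
            dp[r] = max(dp[r], dp[r - ram] + priority)
    return dp[M]
```

For example, with `jobs = [(3, 5), (4, 6), (5, 8)]` and `M = 8`, the best subset is the 3 MB and 5 MB jobs for a total priority of 13. Runtime is O(n·M) time and O(M) space, pseudo-polynomial because it scales with the numeric value of M rather than its bit length.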