AWS needs to patch a fleet of servers distributed across different clusters. An automated script patches servers at a rate of `K` servers per hour, working on one cluster at a time. The patch window closes in `H` hours. If the script finishes a cluster early, it idles for the rest of that hour. We need to find the minimum rate `K` that guarantees all servers are patched before the deadline.
Given an array `clusters`, where `clusters[i]` is the number of servers in cluster `i`, and a deadline of `H` hours, you patch `K` servers per hour from a single cluster. If a cluster has fewer than `K` servers remaining, you patch all of them and idle for the rest of that hour. Return the minimum integer `K` such that all servers are patched within `H` hours.
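Since patching time is monotone in `K` (a faster rate never takes more hours), one standard approach is to binary search on `K`: the hours needed at rate `K` are `sum(ceil(c / K))` over the clusters, and we want the smallest `K` for which that total is at most `H`. A minimal sketch (the function name `min_patch_rate` is my own):

```python
import math

def min_patch_rate(clusters, H):
    """Smallest K (servers/hour) so every cluster is patched within H hours.

    Each cluster c takes ceil(c / K) hours, since the script works on one
    cluster at a time and idles out any partial hour.
    """
    lo, hi = 1, max(clusters)          # K = max(clusters) always suffices
    while lo < hi:
        mid = (lo + hi) // 2
        hours = sum(math.ceil(c / mid) for c in clusters)
        if hours <= H:
            hi = mid                   # mid is fast enough; try a slower rate
        else:
            lo = mid + 1               # too slow; need a higher rate
    return lo
```

For example, `min_patch_rate([3, 6, 7, 11], 8)` returns `4`: at `K = 4` the clusters take `1 + 2 + 2 + 3 = 8` hours, while `K = 3` needs `10`. The search runs in `O(n log max(clusters))` time.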