See the original problem on HackerRank.
Solutions
This problem is an application of the classical running median problem with a complication: the sliding window.
The running median consists in calculating the median of every prefix of the array, from 0 to N. For example:


The typical solution to this problem involves two (differently ordered) heaps that maintain the two halves of the input, arranged in a way that makes it convenient to retrieve the maximum of the former and the minimum of the latter. Consider this input:


Imagine we want to calculate the median of this array. We can see the array as:


If we store the two subarrays, respectively, in a max heap and a min heap, we have a quick way to retrieve the largest element of the left side and the smallest of the right side; thus the median is given by the average of the top elements of the two heaps.
If the size of the array is odd, we will end up having one heap that is longer. The top of that heap is the median.
To do this, we should keep the two heaps balanced, that is: their sizes should never differ by more than one element.
In addition, this approach is good for solving the problem online (in streaming): we don’t really need to read all the input in advance. A priority queue, which uses a heap underneath, is a convenient data structure for this.
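A minimal sketch of the two-heap scheme (my own code, not the article’s) might look like this:

```cpp
#include <functional>
#include <queue>
#include <vector>

// Two-heap running median: `lo` is a max heap holding the smaller half,
// `hi` is a min heap holding the larger half.
class RunningMedian {
    std::priority_queue<int> lo;                                        // max heap
    std::priority_queue<int, std::vector<int>, std::greater<int>> hi;   // min heap
public:
    void add(int x) {
        if (lo.empty() || x <= lo.top()) lo.push(x);
        else hi.push(x);
        // Rebalance so the sizes never differ by more than one
        if (lo.size() > hi.size() + 1) { hi.push(lo.top()); lo.pop(); }
        else if (hi.size() > lo.size() + 1) { lo.push(hi.top()); hi.pop(); }
    }
    // Requires at least one element added
    double median() const {
        if (lo.size() > hi.size()) return lo.top();
        if (hi.size() > lo.size()) return hi.top();
        return (lo.top() + hi.top()) / 2.0;
    }
};
```

Each insertion costs \( O(log(N)) \) and retrieving the median is \( O(1) \).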
This general problem might be seen as a sort of “growing window”: elements get in but they never get out. We might take inspiration from this approach; however, the sliding window problem has something harder to handle: removing elements from the window. Indeed, elements both enter and leave the window:


The solution is a bit detailed because we have to handle the balancing of the data, but the general idea is as simple as it seems.
Here is a solution in Java (extracted from this great article) that provides a priority queue that supports removing any element:


The time complexity of this solution might seem \( O(N \cdot log(W)) \), where \( W \) is the size of the window (d in the problem). However, since remove takes linear time, the correct complexity is \( O(N \cdot (W + log(W))) \), that is \( O(N \cdot W) \). Space complexity is \( O(2 \cdot W) \), that is \( O(W) \), because we use some extra space for managing the window. In other words, if \(W=N\), this solution is quadratic in time and linear in space. Yet, HackerRank accepts this code for all the test samples.
Similar solutions might employ sets or other ordered data structures.
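For instance, a sketch based on std::multiset (my own, not from the article) gets insertion and removal in logarithmic time; for simplicity, the median here is located by advancing an iterator, which still costs \( O(W) \) per step:

```cpp
#include <iterator>
#include <set>
#include <vector>

// Sliding-window median with std::multiset. Insertion and removal are
// O(log W); locating the median by iterator advance is O(W) per query
// (maintaining a "middle" iterator incrementally would avoid that).
std::vector<double> windowMedians(const std::vector<int>& a, size_t d) {
    std::multiset<int> window(a.begin(), a.begin() + d);
    std::vector<double> medians;
    for (size_t i = d; i <= a.size(); ++i) {
        auto it = std::next(window.begin(), (d - 1) / 2); // lower middle
        medians.push_back(d % 2 == 1 ? *it : (*it + *std::next(it)) / 2.0);
        if (i == a.size()) break;
        window.erase(window.find(a[i - d])); // element leaving the window
        window.insert(a[i]);                 // element entering the window
    }
    return medians;
}
```

Note that erase(find(x)) removes a single occurrence, while erase(x) would remove all of them.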
Exploiting a constraint: the frequency table approach
A very interesting solution exploits a constraint of the problem: the maximum value of the input (expenditure[i]) is bounded by a constant, at most \(200\). This information allows us to simplify the computation of the median by literally counting the frequencies of each element.
Consider this example:


Given that the size of the window is 4, we know that the median will be given by the second and third element of the sorted window:


If the window size was odd, suppose d=3, the median would be just the second element of the sorted window (2). This can also be written as (2+2)/2.
Since we know the size of the window, we can calculate the number of elements we must “consume” from the frequency table before getting to those giving the median. Let’s call them i1 and i2:
i1 = floor((d-1)/2.0)
i2 = ceil((d-1)/2.0)
Above, when d=4, we have that i1=1 and i2=2. On the other hand, when d=3 we have that i1=1 and i2=1. Indeed, when d is even, the median is given by the average of the two middle elements; when d is odd, it corresponds just to the (unique) middle element.
i1 and i2 are the upper bounds of the number of frequencies we need to “exceed” before getting to the middle elements.
Consider again the example:


The frequency table of the first d elements (2,3,2,5) is printed here below:

value:     2 3 5
frequency: 2 1 1

We need to consume i1=1 element to find the first middle element. This leads us to the value 2, whose cumulative frequency (2) is the first to exceed i1.


The second middle element is given by consuming i2=2 elements: the cumulative frequency must exceed 2, which happens at the value 3. The median is therefore (2+3)/2 = 2.5.


To iterate this process, we just need to update the frequencies of the window at the end of each iteration.
Here is a solution in C++:


The complexity of this solution is linear in \(N\): \( O(N \cdot D)\), where \( D \) is the maximum size of the domain, that is 200, a constant. The space complexity is constant for the very same reason (\(D\) is a problem constraint).
The attentive reader might notice that the two internal for loops can be rewritten in terms of prefix sums. Indeed, this works as well:
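A possible rewriting with std::partial_sum and std::find_if (again a sketch; names are my own):

```cpp
#include <algorithm>
#include <array>
#include <numeric>
#include <vector>

// Same frequency-table idea, with the linear scans expressed through
// std::partial_sum and std::find_if.
int activityNotificationsPrefix(const std::vector<int>& expenditure, int d) {
    std::array<int, 201> freq{};
    for (int i = 0; i < d; ++i) ++freq[expenditure[i]];

    const int i1 = (d - 1) / 2, i2 = d / 2;
    int notifications = 0;
    std::array<int, 201> cum{};

    for (size_t i = d; i < expenditure.size(); ++i) {
        // cum[v] = number of window elements <= v
        std::partial_sum(freq.begin(), freq.end(), cum.begin());
        const int m1 = std::find_if(cum.begin(), cum.end(),
                                    [&](int c) { return c > i1; }) - cum.begin();
        const int m2 = std::find_if(cum.begin(), cum.end(),
                                    [&](int c) { return c > i2; }) - cum.begin();
        // expenditure[i] >= 2 * median is the same as >= m1 + m2
        if (expenditure[i] >= m1 + m2) ++notifications;
        --freq[expenditure[i - d]];
        ++freq[expenditure[i]];
    }
    return notifications;
}
```

Comparing against m1 + m2 instead of 2 * median keeps everything in integer arithmetic.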


The only problem is that partial_sum would be applied on the whole frequency table, regardless of where find_if stops.
This problem is a good example of a scenario that requires us to find and exploit a constraint.