See the original problem on HackerRank.
Solutions
Sorting
A C++ solution based on sorting and a sliding window made of two increasing pointers:


Basically, we stretch the window as long as the difference between its endpoints is less than or equal to 1. When we find a greater difference, we compute the distance between the pointers (the length of the window) so far and we advance the “tail” (the first endpoint). And so on. We just need to take extra care of the possible case where all the elements are equal: since the window never closes in that case, its length must be checked one last time after the loop ends.


Frequency table
To avoid sorting and to keep performance high as the input grows, an alternative solution takes advantage of the domain: indeed, admissible numbers fall into the range \( [0, 100] \), so we can use a statically sized array which stores all the occurrences. This is basically a way to “compress” the duplicates into a single element.
From the problem, we know that only adjacent elements in the occurrences array (also known as a frequency table) can be picked while maintaining the constraint. So it’s just a matter of calculating the maximum among all the adjacent pairs:


A pattern emerges from the solution: we applied the zip | map | reduce pipeline.
This pattern is easy to express in languages like Python:
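For instance, a sketch in Python (picking_numbers is a hypothetical name; zip pairs adjacent buckets, map turns each pair into its sum, reduce takes the maximum):

```python
from functools import reduce

def picking_numbers(a):
    freq = [0] * 101          # admissible values fall into [0, 100]
    for n in a:
        freq[n] += 1
    # zip adjacent buckets | map each pair to its sum | reduce with max
    return reduce(max, map(sum, zip(freq, freq[1:])))
```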


Just for completeness, in C++ we can rewrite the solution in terms of **inner_product**:


Linq Solution by emanu_p81
static int pickingNumbers(int[] a)
{
    Dictionary<int, int> frequencies = Enumerable.Range(0, a.Max() + 2)
        .ToDictionary(x => x,
                      x => a.GroupBy(y => y)
                            .Where(c => c.Key == x)
                            .Select(f => f.Count())
                            .FirstOrDefault());

    return frequencies
        .Zip(frequencies.Skip(1), (x, y) => (x.Value + y.Value))
        .Max();
}
Another solution by alepez


C++ frequency table (alepez)
This is unreadable, but fits in 147 bytes of C++
#include <iostream>
int main(){int f[100]{};int n;while(std::cin>>n)++f[n];int m=0;
for(int i=1;i!=100;++i)m=std::max(m,f[i-1]+f[i]);std::cout<<m;}
ES6 frequency table + zip + reduce
const zip = (arr, ...arrs) =>
  arr.map((val, i) => arrs.reduce((a, arr) => [...a, arr[i]], [val]));

const pickingNumbers = (a) => {
  let freq = Array(100).fill(0);
  a.forEach(n => ++freq[n]);
  return zip(freq, freq.slice(1))
    .slice(0, -1)
    .reduce((max, p) => Math.max(max, p[0] + p[1]), 0);
}
Generalization
It’s possible to generalize this problem to any absolute difference, not just 1. This change barely impacts the first solution: we just need to change the condition of the if:


On the other hand, generalizing the second solution is much more interesting. When K=1 we have the original problem, and we know that the solution is given by calculating the maximum among the aggregations of all the adjacent pairs. If K increases, we still calculate the maximum of all the aggregations, but over adjacent windows of size K+1 (with K=1, windows are just pairs):
We can do better: we can avoid re-aggregating numbers that we already aggregated in a previous step. One simple way to implement this is by maintaining an accumulator along the way: every time we move the window by one, we subtract the element “leaving” the window and, at the same time, add the element “entering” it.
This is basically another pattern that we can implement in terms of the prefix sum: every element of the prefix sum at position \( i \) is the sum of all the elements from the beginning up to \( i \). Then we can manage a window of size K+2, subtracting its two endpoints along the way:
The number of occurrences of the elements compressed in the frequency table will just be the difference between the two endpoints. So we can still apply the zip | map | reduce pattern just by changing the map:

