338. Counting Bits
Problem Statement
Given an integer n, return an array ans of length n + 1 such that for each i (0 <= i <= n), ans[i] is the number of 1's in the binary representation of i.
Example 1:
Input: n = 2
Output: [0,1,1]
Explanation:
0 --> 0
1 --> 1
2 --> 10
Example 2:
Input: n = 5
Output: [0,1,1,2,1,2]
Explanation:
0 --> 0
1 --> 1
2 --> 10
3 --> 11
4 --> 100
5 --> 101
Constraints:
0 <= n <= 10^5
Follow up:
- It is very easy to come up with a solution with a runtime of O(n log n). Can you do it in linear time O(n) and possibly in a single pass?
- Can you do it without using any built-in function (i.e., like __builtin_popcount in C++)?
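For reference, the O(n log n) baseline the follow-up alludes to counts the bits of each number independently; a minimal sketch (the function name is mine, not from the problem):

```rs
// Naive baseline: for each i, loop over its bits.
// Each inner loop runs O(log i) times, so the total is O(n log n).
fn count_bits_naive(n: i32) -> Vec<i32> {
    (0..=n)
        .map(|mut i| {
            let mut ones = 0;
            while i != 0 {
                ones += i & 1; // add the lowest bit
                i >>= 1;       // drop it
            }
            ones
        })
        .collect()
}
```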
Hints:
- You should make use of what you have produced already.
- Divide the numbers into ranges like [2-3], [4-7], [8-15], and so on, and try to generate each new range from the previous one.
- Or does the odd/even status of the number help you in calculating the number of 1s?
Solution:
```rs
impl Solution {
    pub fn count_bits(n: i32) -> Vec<i32> {
        let mut res = vec![0; n as usize + 1];
        for i in 1..=n {
            let i = i as usize;
            // Alternative recurrence: i & (i - 1) clears the lowest set bit, so
            // res[i] = res[i & (i - 1)] + 1;
            //
            // Here we use the shift form instead: i >> 1 is i with its lowest
            // bit dropped, and (i & 1) adds that bit back.
            res[i] = res[i >> 1] + (i & 1) as i32;
        }
        res
    }
}
```
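The same recurrence outside the LeetCode harness, checked against the two examples above (the standalone function name is mine):

```rs
// Dynamic programming over already-computed counts: clearing the lowest
// set bit of i lands on a smaller index whose answer is known. O(n) total.
fn count_bits(n: i32) -> Vec<i32> {
    let mut res = vec![0i32; n as usize + 1];
    for i in 1..=n as usize {
        res[i] = res[i & (i - 1)] + 1; // one more bit than i with its lowest set bit cleared
    }
    res
}

fn main() {
    assert_eq!(count_bits(2), vec![0, 1, 1]);          // Example 1
    assert_eq!(count_bits(5), vec![0, 1, 1, 2, 1, 2]); // Example 2
}
```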