Why does diff & ~(diff - 1) pick out exactly one bit that is set? Could anybody help me understand this? Thank you very much in advance!
diff - 1 flips the lowest set bit of diff to 0 and turns all the zeros below it into 1s, leaving every higher bit unchanged. ~(diff - 1) then inverts all bits, so the higher bits become the complement of diff's, while the position of the lowest set bit becomes 1 again. ANDing this with diff therefore keeps only the lowest set bit of diff.
For example, a=0011; b=0101; diff = a xor b = 0110
diff - 1 = 0101; ~(diff - 1) = 1010;
diff & ~(diff - 1) = 0110 & 1010 = 0010 (only the lowest set bit of diff survives)
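A minimal sketch of the trick (the function name `lowest_set_bit` is mine, not from the thread). In two's complement, ~(diff - 1) equals -diff, so diff & -diff is the same operation.

```python
def lowest_set_bit(diff):
    # diff - 1 clears the lowest set bit and sets the zeros below it;
    # ~(diff - 1) flips everything back, complementing all higher bits,
    # so ANDing with diff isolates the lowest set bit.
    return diff & ~(diff - 1)

a, b = 0b0011, 0b0101
diff = a ^ b                     # 0b0110
print(bin(lowest_set_bit(diff)))  # -> 0b10, i.e. 0010
print(lowest_set_bit(diff) == diff & -diff)  # -> True (two's-complement identity)
```

This isolated bit is set in exactly one of a and b, which is what lets you split the numbers into two groups in the Single Number III problem.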