# A less efficient way (binary search)

• I know that the best solution uses two pointers, as in the previously shared solution. However, the tag contains "binary search". Am I misunderstanding something, or is binary search simply a less efficient approach for this problem?

Say we fix the first element A[0] and binary-search the remaining n-1 elements for target-A[0]. If no such element exists, try A[1]: fix A[1] and binary-search A[2]~A[n-1]. Continue this process until only the last two elements A[n-2] and A[n-1] remain.

Doesn't this give a time complexity of lg(n-1) + lg(n-2) + ... + lg(1) ~ O(lg(n!)) ~ O(n lg n)? So it is less efficient than the O(n) two-pointer solution. Am I missing something here?

The code also passes OJ.

```cpp
vector<int> twoSum(vector<int> &numbers, int target) {
    if (numbers.empty()) return {};
    for (int i = 0; i < numbers.size() - 1; i++) {
        int start = i + 1, end = numbers.size() - 1, gap = target - numbers[i];
        while (start <= end) {
            int m = start + (end - start) / 2;
            if (numbers[m] == gap) return {i + 1, m + 1};
            else if (numbers[m] > gap) end = m - 1;
            else start = m + 1;
        }
    }
    return {};  // no pair found; without this, control falls off the end of a non-void function
}
```

• You can add an early exit when target - numbers[i] > numbers[numbers.size() - 1]: since the array is sorted, the complement cannot exist, so that binary search can be skipped entirely.

• Yes, this looks like a valid optimization, but the asymptotic time complexity is still O(n lg n).
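For concreteness, here is one way the early-exit tweak above might look on top of the original C++ solution (a sketch; the function name is illustrative):

```cpp
#include <vector>
using namespace std;

// Skip the binary search for numbers[i] whenever the needed complement
// exceeds the largest remaining element: in a sorted array it cannot exist.
vector<int> twoSumEarlyExit(vector<int> &numbers, int target) {
    int n = numbers.size();
    for (int i = 0; i + 1 < n; i++) {
        int gap = target - numbers[i];
        if (gap > numbers[n - 1]) continue;   // early exit: complement too large
        int start = i + 1, end = n - 1;
        while (start <= end) {
            int m = start + (end - start) / 2;
            if (numbers[m] == gap) return {i + 1, m + 1};
            else if (numbers[m] > gap) end = m - 1;
            else start = m + 1;
        }
    }
    return {};
}
```

Note that `continue` (rather than `break`) is the right choice: as i grows, numbers[i] grows and the gap shrinks, so a later index may still have a feasible complement.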

• My idea is to use binary search to find the largest number less than the target, and then use two pointers.

• Sounds great!

• @njucs.cx Integers can be negative, so a number larger than the target might still be part of the answer.
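A sketch of the bound-then-two-pointers idea, with the caveat from the reply above made explicit (the function name is illustrative, and the code assumes all numbers are positive, which is exactly where the idea breaks down for negative inputs):

```cpp
#include <algorithm>
#include <vector>
using namespace std;

// First binary-search for the largest element strictly less than target,
// then run the standard two-pointer scan inside that prefix.
// ASSUMPTION: all numbers are positive; with negatives, an element larger
// than target can still belong to the answer, and this trim is invalid.
vector<int> twoSumBoundThenPointers(vector<int> &numbers, int target) {
    // lower_bound returns the first element >= target; step back one
    // to get the last index whose value is < target.
    int hi = lower_bound(numbers.begin(), numbers.end(), target) - numbers.begin() - 1;
    int lo = 0;
    while (lo < hi) {
        int sum = numbers[lo] + numbers[hi];
        if (sum == target) return {lo + 1, hi + 1};
        else if (sum < target) lo++;
        else hi--;
    }
    return {};
}
```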

• I think it is better to switch which end is fixed on every loop. For an array of 100 elements, the first pass fixes A[0] and binary-searches for the right endpoint in A[1..99]. If it lands at, say, A[50], switch: the second pass fixes A[50] and binary-searches for the left endpoint in A[0..50). Although the worst case is still O(n log n), the average can be log N + log(N/2) + log(N/4) + ...

```java
import java.util.Arrays;

public class Solution {
    public int[] twoSum(int[] numbers, int target) {
        boolean isBegin = true;
        int[] result = new int[2];
        int temp, index;
        int before = 0, end = numbers.length;
        while (before < end) {
            if (isBegin) {
                // fix the left endpoint, binary-search for the right one
                temp = target - numbers[before];
                index = Arrays.binarySearch(numbers, before + 1, end, temp);
                if (index >= 0) {
                    result[0] = before + 1;
                    result[1] = index + 1;
                    return result;
                } else {
                    end = -index - 1 - 1;  // last index whose value is < temp
                    isBegin = false;
                }
            } else {
                // fix the right endpoint, binary-search for the left one
                temp = target - numbers[end];
                index = Arrays.binarySearch(numbers, before, end, temp);
                if (index >= 0) {
                    result[0] = index + 1;
                    result[1] = end + 1;
                    return result;
                } else {
                    before = -index - 1;  // first index whose value is > temp
                    isBegin = true;
                }
            }
        }
        return result;
    }
}
```

(Two fixes over the posted version: `Arrays.binarySearch` can return 0 for a valid hit, so the found check must be `index >= 0`, not `index > 0`; and the first search must start at `before + 1` so an element is never paired with itself.)

• I think worst and average case are both still O(n log n). After the first iteration the sum is already close to the target, so subsequent searches are unlikely to make big jumps; each one may shave off only a single element.

• There is one possible improvement: trim the array first.
Traverse the array to find the first number, say x, such that x + nums[0] > target. Since nums[0] is the smallest element, every number from x onward is useless, so the inner binary searches can be restricted to the trimmed prefix.
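The trimming idea above might be sketched like this (the function name is illustrative; `upper_bound` is used instead of a linear scan to find the cut point):

```cpp
#include <algorithm>
#include <vector>
using namespace std;

// Since nums[0] is the smallest element, any x with nums[0] + x > target
// can never be part of a pair, so the search space is cut to [0, limit).
vector<int> twoSumTrimmed(vector<int> &nums, int target) {
    if (nums.empty()) return {};
    // first index whose value exceeds target - nums[0]
    int limit = upper_bound(nums.begin(), nums.end(), target - nums[0]) - nums.begin();
    for (int i = 0; i + 1 < limit; i++) {
        int lo = i + 1, hi = limit - 1, gap = target - nums[i];
        while (lo <= hi) {
            int m = lo + (hi - lo) / 2;
            if (nums[m] == gap) return {i + 1, m + 1};
            else if (nums[m] > gap) hi = m - 1;
            else lo = m + 1;
        }
    }
    return {};
}
```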

• I think the time complexity of the binary search solution is O(N log N). Here is the Java version. Thanks~

```java
public int[] twoSum(int[] nums, int target) {
    int n = nums.length;
    for (int i = 0; i < n; i++) {
        int j = Arrays.binarySearch(nums, i + 1, n, target - nums[i]);
        if (j >= 0) {
            return new int[]{ i + 1, j + 1 };
        }
    }
    return new int[]{ -1, -1 };
}
```

• "find the largest number less than target"

This is a good idea when the array contains only positive numbers, but it does not work when the array contains negative values. Say the numbers array is {-1, 1} and the target is 0: the answer includes 1, which is not less than the target.
