# Java DFS 14 ms solution: improved version of my previous one. O(N*C*logN) with complexity analysis.

• By modeling this as a DAG problem (please refer to my previous post for the explanation), the optimal solution should be O(|E|+|V|). We know |V| = N, but finding |E| naively takes N^2 comparisons. In this solution, I instead generate the multiples of each value V1 to find all of its edges (V1, V2) such that V2 % V1 == 0. This generation is bounded above by V_max, the maximum value in the given array, so the while loop could run up to V_max/V1 times, which is very large when V_max is large. By binary-searching the sorted array, the unnecessary iterations can be skipped, at a cost of O(log N) per lookup. The while loop therefore ends up running min(V_max/V1, N_v) = C times, where N_v is the number of values greater than V1. As a result, the overall complexity is O(N*C*log N), where
C <= (N + (N-1) + (N-2) + (N-3) + ...)/N
and C <= (V_max/V1 + V_max/V2 + V_max/V3 + V_max/V4 + ...)/N.
Therefore, if N is small, the running time is close to O(N^2), since C is close to N and log N is very small. When N is very large, C should be bounded by V_max/N, which can be treated as a constant, so the running time is close to O(N log N).
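To make the edge-generation idea concrete, here is a minimal, self-contained sketch of the multiple-jumping trick on its own: for a value nums[i] in a sorted array, enumerate its multiples, but binary-search to the ceiling of the current multiple instead of testing every candidate. The class and method names (EdgeSketch, divisibleSuccessors) are hypothetical, not part of the solution below.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class EdgeSketch {
    // Returns the indices j > i (in a sorted array) with nums[j] % nums[i] == 0,
    // by generating multiples of nums[i] and skipping gaps via binary search.
    public static List<Integer> divisibleSuccessors(int[] nums, int i) {
        List<Integer> edges = new ArrayList<>();
        long base = nums[i];
        long next = base * 2;   // first candidate multiple
        int j = i + 1;
        while (j < nums.length && next <= nums[nums.length - 1]) {
            if (nums[j] < next) {
                // jump ahead to the ceiling of the current multiple
                j = Arrays.binarySearch(nums, j + 1, nums.length, (int) next);
                if (j < 0) j = -(j + 1); // insertion point if not found
            } else {
                if (nums[j] % nums[i] == 0) edges.add(j);
                // advance the multiple past the value just inspected
                next = base * (nums[j] / nums[i] + 1);
            }
        }
        return edges;
    }

    public static void main(String[] args) {
        int[] nums = {1, 2, 3, 5, 6, 12, 25};
        // successors of nums[1] = 2 are 6 and 12, at indices 4 and 5
        System.out.println(divisibleSuccessors(nums, 1)); // prints [4, 5]
    }
}
```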

``````
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Solution {
    int[] maxLen; // maxLen[i] > 0 means i was visited; stores the max chain length starting at i
    int[] path;   // path[i] is the next index on the longest chain starting at i
    int len;

    public int helper(int cur, int[] nums) {
        if (cur >= len) return 0;
        if (maxLen[cur] > 0) return maxLen[cur];
        path[cur] = cur;
        int curMax = 0;
        long curVal = (long) nums[cur];
        long i = 2;               // factor for the next multiple
        long next = curVal * i;   // next multiple
        int nid = cur + 1;
        // terminate when the next multiple is out of range or we pass the last element
        while (next <= (long) nums[len - 1] && nid < len) {
            if (nums[nid] < next) {
                // call binary search only when the current value is smaller than
                // the next multiple, and find the ceiling of that multiple
                nid = Arrays.binarySearch(nums, nid + 1, len, (int) next);
                if (nid < 0) nid = -(nid + 1);
            } else {
                // the ceiling value may be a multiple
                if (nums[nid] % nums[cur] == 0) {
                    int temp = helper(nid, nums);
                    if (temp > curMax) {
                        curMax = temp;
                        path[cur] = nid;
                    }
                }
                // increase the factor past the value just inspected
                i = (nums[nid] / nums[cur]) + 1;
                next = curVal * i;
            }
        }
        maxLen[cur] = curMax + 1;
        return maxLen[cur];
    }

    public List<Integer> largestDivisibleSubset(int[] nums) {
        List<Integer> ret = new ArrayList<>();
        if (nums == null || nums.length == 0) return ret;
        len = nums.length;
        if (len == 1) {
            ret.add(nums[0]);
            return ret;
        }
        maxLen = new int[len];
        path = new int[len];
        int max = 0;
        int idx = -1;
        Arrays.sort(nums);
        for (int i = 0; i < len; i++) {
            int temp = helper(i, nums);
            if (temp > max) {
                max = temp;
                idx = i;
            }
        }
        // follow the path array from the best start to rebuild the subset
        while (idx != path[idx]) {
            ret.add(nums[idx]);
            idx = path[idx];
        }
        ret.add(nums[idx]);
        return ret;
    }
}
``````

• If the N numbers are the first N powers of 2, your helper's while-loop runs about N^2/2 times in total, and each iteration may run a binary search, so it might be worse than O(N^2).

• You mean this test case:
[1,2,4,8,16,32,64,128,256,512,1024,2048,4096,8192,16384,32768,65536,131072,262144,524288,1048576,2097152,4194304,8388608,16777216,33554432,67108864,134217728,268435456,536870912,1073741824]

• Yes, like that.

• That is the worst-case scenario. However, the average running time should be O(N^2), so I changed the title. The analysis is similar to QuickSort: the worst pivot choice leads to O(N^2), the best one leads to O(N log N), and on average QuickSort is O(N log N). In this problem, the best case is O(N log N), e.g. [1000, 1001, 1002, 1003, 1004, 1005, ..., 1999].
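One rough, hypothetical way to compare the two cases discussed in this thread is to count how many times the inner while-loop body executes for each starting index: the powers-of-2 array (worst case) versus a dense range like [1000..1999], where every second multiple already exceeds the maximum and the loop exits immediately. The LoopCount class and countIterations method are illustrative names, not part of the original solution.

```java
import java.util.Arrays;

public class LoopCount {
    // Counts total iterations of the multiple-jumping while-loop,
    // summed over every starting index of the sorted array.
    public static long countIterations(int[] nums) {
        long count = 0;
        int len = nums.length;
        for (int cur = 0; cur < len; cur++) {
            long base = nums[cur];
            long next = base * 2;
            int nid = cur + 1;
            while (nid < len && next <= nums[len - 1]) {
                count++;
                if (nums[nid] < next) {
                    nid = Arrays.binarySearch(nums, nid + 1, len, (int) next);
                    if (nid < 0) nid = -(nid + 1);
                } else {
                    next = base * (nums[nid] / nums[cur] + 1);
                }
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int[] powers = new int[30];
        for (int i = 0; i < 30; i++) powers[i] = 1 << i; // worst case: every pair divides
        int[] dense = new int[1000];
        for (int i = 0; i < 1000; i++) dense[i] = 1000 + i; // best case: 2*min > max
        System.out.println("powers of 2: " + countIterations(powers)); // on the order of N^2/2
        System.out.println("dense range: " + countIterations(dense));  // loop never entered
    }
}
```

For the dense range, 2 * nums[cur] already exceeds nums[len-1] for every cur, so the loop body never runs and the O(N log N) sort dominates; for the powers of 2, every element past cur is visited, matching the roughly N^2/2 estimate above.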
