How can the bottom-up algorithm achieve O(n log n) time complexity? IMO, given the following algorithm, it couldn't be reached, since there are three nested while/for loops.
But in theory, the top-down recursive algorithm should be replaceable by an iterative one without adding extra complexity. Is the root cause the stack space used by the recursive version?
    function sort(list)
        // count is the length of the sorted sublists being merged
        for count = 1; count < list.length; count *= 2
            var newHead
            // pick the next two sublists to merge
            for leftStart = 0; leftStart < list.length; leftStart += count * 2
                rightStart = min(leftStart + count, list.length)
                left  = sublist from [leftStart] to [rightStart - 1]
                right = sublist from [rightStart] to [min(rightStart + count, list.length) - 1]
                // merge the two sublists
                newSublist = merge(left, right)
                append newSublist to newHead
            end for
        end for
        return list
    end function

    function merge(left, right)
        var oneHead
        while left != null && right != null
            choose the smaller value and append it to oneHead
        return oneHead
    end function
I am not sure I understand the question correctly.
In general, recursive algorithms have the same time complexity as their iterative counterparts, but use more stack space. (That's not always the case; for example, tail-recursive algorithms can be compiled so that they don't use extra stack space. But Java does not do this optimization, and divide-and-conquer algorithms are not tail-recursive anyway.)
The algorithm you wrote in pseudo-code runs in O(n log n). The outer loop runs log n times, and its body runs in O(n) time: for each value of count, you scan the list only once to create the sublists left and right, each of size at most count, and you merge the n/count pairs of them. That is O(n) in total.
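To make that concrete, here is a minimal runnable Java sketch of a bottom-up (iterative) merge sort on an array (an array rather than the linked list your pseudo-code suggests, for brevity). The class and method names are mine, not from your code. The outer loop doubles the run width log n times, and each pass does O(n) work merging adjacent runs, so the whole thing is O(n log n) with no recursion at all:

```java
import java.util.Arrays;

public class BottomUpMergeSort {
    // Iterative (bottom-up) merge sort: O(n log n) time, O(n) extra space.
    public static void sort(int[] a) {
        int n = a.length;
        int[] buf = new int[n];
        // width is the size of the already-sorted runs being merged;
        // it doubles each pass, so there are ceil(log2 n) passes.
        for (int width = 1; width < n; width *= 2) {
            // Merge each pair of adjacent runs [left, mid) and [mid, right).
            // This inner loop touches every element once: O(n) per pass.
            for (int left = 0; left < n; left += 2 * width) {
                int mid = Math.min(left + width, n);
                int right = Math.min(left + 2 * width, n);
                merge(a, buf, left, mid, right);
            }
            // Copy the merged runs back for the next pass.
            System.arraycopy(buf, 0, a, 0, n);
        }
    }

    // Merge the sorted ranges a[left, mid) and a[mid, right) into buf.
    private static void merge(int[] a, int[] buf, int left, int mid, int right) {
        int i = left, j = mid;
        for (int k = left; k < right; k++) {
            if (i < mid && (j >= right || a[i] <= a[j])) {
                buf[k] = a[i++];
            } else {
                buf[k] = a[j++];
            }
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 5, 6, 0, 3};
        sort(data);
        System.out.println(Arrays.toString(data)); // prints [0, 1, 2, 3, 5, 5, 6, 9]
    }
}
```

Note that the three loops do not multiply into O(n^2 log n): the two inner loops together (the pass over runs plus the merges inside it) only visit each element once, so they jointly cost O(n), not O(n) each.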