For a divide-and-conquer solution of this problem:
def solve(input):
    if input is a digit:
        return [input]
    ans = []
    for x in input:
        if x is an operator:
            left = solve(input[0 ~ x])
            right = solve(input[x+1 ~ len])
            for l in left:
                for r in right:
                    ans += [calc(l, r, x)]
    return ans
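A runnable Python version of this sketch (my own cleanup: operands are assumed to be non-negative integers and the operators `+`, `-`, `*`; `calc` is inlined as a lookup table of lambdas):

```python
def solve(expr):
    # Base case: the whole substring is a single number.
    if expr.isdigit():
        return [int(expr)]
    ops = {'+': lambda a, b: a + b,
           '-': lambda a, b: a - b,
           '*': lambda a, b: a * b}
    ans = []
    # Split at every operator; combine every left result with every right one.
    for i, ch in enumerate(expr):
        if ch in ops:
            for l in solve(expr[:i]):
                for r in solve(expr[i + 1:]):
                    ans.append(ops[ch](l, r))
    return ans

print(sorted(solve("2*3-4*5")))  # one value per parenthesization: [-34, -14, -10, -10, 10]
```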
What is the time complexity of it?
With n being the number of operators, the running time is at least Ω(3^n).
f(n) = f(1)+f(n-1) + f(2)+f(n-2) + ... + f(n-1)+f(1)
     = 2*(f(1) + f(2) + ... + f(n-1))

f(n+1) = 2*(f(1) + ... + f(n))
       = 2*(f(1) + ... + f(n-1)) + 2*f(n)
       = f(n) + 2*f(n)
       = 3*f(n)

=> f(n) = 3*f(n-1) = 3^(n-2)*f(2) = Θ(3^n)

Therefore the number of recursive calls alone is Ω(3^n).
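The recurrence is easy to sanity-check numerically. Taking f(1) = 1 (a lone operand costs one call), it settles into f(n) = 2·3^(n-2) for n ≥ 2, which is Θ(3^n):

```python
def f(n, memo={1: 1}):
    # f(n) = 2 * (f(1) + ... + f(n-1)): cost of an expression with n
    # operands, summing the left and right recursive calls over all splits.
    if n not in memo:
        memo[n] = 2 * sum(f(i) for i in range(1, n))
    return memo[n]

# Matches the closed form 2 * 3^(n-2) for every n >= 2.
for n in range(2, 12):
    assert f(n) == 2 * 3 ** (n - 2)
print([f(n) for n in range(1, 6)])  # [1, 2, 6, 18, 54]
```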
With n being the number of operators, the output size is Cn, the n-th Catalan number, which grows as 4^n / n^(3/2) up to a constant factor.
The solution you showed also recomputes many subexpressions. I think we have to add another factor of n, because that's the number of operations that go into each of the Cn results. So I'm not entirely sure, but I think the runtime is proportional to

n * Cn ≈ 4^n / sqrt(n)
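This estimate can be checked against the closed form Cn = C(2n, n) / (n+1): the ratio of n·Cn to 4^n / sqrt(n) tends to the constant 1/sqrt(pi) ≈ 0.564, confirming they grow at the same rate:

```python
from math import comb, sqrt, pi

def catalan(n):
    # Closed form for the n-th Catalan number.
    return comb(2 * n, n) // (n + 1)

# The ratio n*C(n) / (4^n / sqrt(n)) approaches 1/sqrt(pi) ~ 0.564.
for n in (10, 50, 400):
    print(n, n * catalan(n) / (4 ** n / sqrt(n)))
```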
I think something is missing in the above calculation: after we get the results for f(1) and f(n-1), we still need C(1)*C(n-1) time to combine them into results for f(n) and add them to the output.
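That combining step is exactly the Catalan convolution: pairing every left result with every right result across each split reproduces the identity C(n) = sum over i of C(i)*C(n-1-i), which a quick check confirms:

```python
from math import comb

def catalan(n):
    return comb(2 * n, n) // (n + 1)

# Splitting at each of the n operators pairs C(i) left results with
# C(n-1-i) right results; summed over all splits this is exactly C(n).
for n in range(1, 12):
    assert catalan(n) == sum(catalan(i) * catalan(n - 1 - i) for i in range(n))
```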
What will the complexity be if we introduce a cache to avoid solving the same expression again, like here: https://leetcode.com/discuss/48490/java-recursive-solution-with-memorization?
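A memoized Python sketch of that idea (the cache is keyed on the substring; the names here are my own, not taken from the linked Java post):

```python
def solve(expr, memo=None):
    # memo maps each substring to its list of possible values,
    # so each distinct substring is solved at most once.
    if memo is None:
        memo = {}
    if expr in memo:
        return memo[expr]
    if expr.isdigit():
        memo[expr] = [int(expr)]
        return memo[expr]
    ops = {'+': lambda a, b: a + b,
           '-': lambda a, b: a - b,
           '*': lambda a, b: a * b}
    ans = []
    for i, ch in enumerate(expr):
        if ch in ops:
            for l in solve(expr[:i], memo):
                for r in solve(expr[i + 1:], memo):
                    ans.append(ops[ch](l, r))
    memo[expr] = ans
    return ans
```

With the cache there are only O(n^2) distinct substrings to solve, but the answer lists still hold Catalan-many values, so I'd expect writing out the combined results to remain the dominant cost.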