Python, Simple with Explanation



    Let's calculate the depth of a node in the usual way: max(depth of node.left, depth of node.right) + 1. While we do, note that a path "through" this node uses 1 + (depth of node.left) + (depth of node.right) nodes. We visit every node and remember the largest number of nodes used by any such path. Since the diameter is measured in edges rather than nodes, the desired length is this number minus 1.

    def diameterOfBinaryTree(self, root):
        self.best = 1                                # most nodes on any path seen so far
        def depth(root):
            if not root: return 0
            ansL = depth(root.left)                  # depth of the left subtree, in nodes
            ansR = depth(root.right)                 # depth of the right subtree, in nodes
            # a path "through" this node uses ansL + ansR + 1 nodes
            self.best = max(self.best, ansL + ansR + 1)
            return 1 + max(ansL, ansR)               # the usual depth of this node

        depth(root)
        return self.best - 1                         # diameter counts edges, not nodes
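
    For reference, here is a minimal driver sketch (my addition, not part of the original post). It assumes LeetCode's usual TreeNode definition and that the method above sits on a Solution class, as in the submission template:

    class TreeNode:
        def __init__(self, val=0, left=None, right=None):
            self.val = val
            self.left = left
            self.right = right

    # tree:      1
    #           / \
    #          2   3
    #         / \
    #        4   5
    tree = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3))
    print(Solution().diameterOfBinaryTree(tree))  # 3 -- the path 4-2-1-3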
    


    Great solution!
    Just one doubt: why do you add 1 to and subtract 1 from self.best?

    This seems more readable to me:

    self.best = 0
    def depth(root):
        ...
        self.best = max(self.best, ansL + ansR)
        ...
    ...
    return self.best
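
    Filling in the elided lines, a complete version of this edge-counting variant might look like the sketch below (my own fill-in, not the commenter's exact code). It works because depth returns a node count, and a subtree of depth d hangs exactly d edges below the current node, so ansL + ansR is already the path length in edges and no ±1 adjustment is needed:

    def diameterOfBinaryTree(self, root):
        self.best = 0                      # most edges on any path seen so far
        def depth(root):
            if not root: return 0
            ansL = depth(root.left)
            ansR = depth(root.right)
            # ansL edges reach the deepest left leaf, ansR the deepest right one
            self.best = max(self.best, ansL + ansR)
            return 1 + max(ansL, ansR)

        depth(root)
        return self.best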
    
