Python, Simple with Explanation

  • 8

    Let's calculate the depth of a node in the usual way: max(depth of node.left, depth of node.right) + 1. While we do, a path "through" this node uses 1 + (depth of node.left) + (depth of node.right) nodes. Let's visit each node and remember the highest number of nodes used in some path. The desired length is this number minus 1.

    def diameterOfBinaryTree(self, root):
        self.ans = 1
        def depth(root):
            if not root: return 0
            ansL = depth(root.left)
            ansR = depth(root.right)
            self.ans = max(self.ans, ansL + ansR + 1)
            return 1 + max(ansL, ansR)

        depth(root)
        return self.ans - 1
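
    The solution above can be tried locally with a minimal sketch; the `TreeNode` class here is an assumed stand-in for LeetCode's standard definition:

    ```python
    class TreeNode:
        # Assumed stand-in for LeetCode's standard TreeNode
        def __init__(self, val=0, left=None, right=None):
            self.val = val
            self.left = left
            self.right = right

    class Solution:
        def diameterOfBinaryTree(self, root):
            self.ans = 1  # most nodes seen on any path so far
            def depth(node):
                if not node:
                    return 0
                ansL = depth(node.left)
                ansR = depth(node.right)
                # a path "through" node uses ansL + ansR + 1 nodes
                self.ans = max(self.ans, ansL + ansR + 1)
                return 1 + max(ansL, ansR)
            depth(root)
            return self.ans - 1  # edges = nodes - 1

    # Tree [1,2,3,4,5]: the longest path 4 -> 2 -> 1 -> 3 has 3 edges
    root = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3))
    print(Solution().diameterOfBinaryTree(root))  # 3
    ```

    Note the answer in edges equals the node count minus 1, which is exactly why the code starts at 1 and subtracts 1 at the end.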

  • 2

    Great solution!
    Just one doubt: why do you add and subtract 1?

    This seems to me more readable:

    self.ans = 0
    def depth(root):
        ...
        self.ans = max(self.ans, ansL + ansR)
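
    For completeness, a sketch of this edge-counting variant (the `...` above stands for the unchanged recursion; `TreeNode` is again an assumed stand-in). Since `ansL + ansR` is already the number of edges on a path through the node, no +1/-1 bookkeeping is needed:

    ```python
    class TreeNode:
        # Assumed stand-in for LeetCode's standard TreeNode
        def __init__(self, val=0, left=None, right=None):
            self.val = val
            self.left = left
            self.right = right

    class Solution:
        def diameterOfBinaryTree(self, root):
            self.ans = 0  # longest path in edges seen so far
            def depth(node):
                if not node:
                    return 0
                ansL = depth(node.left)
                ansR = depth(node.right)
                # ansL + ansR is the edge count of a path through node
                self.ans = max(self.ans, ansL + ansR)
                return 1 + max(ansL, ansR)
            depth(root)
            return self.ans

    root = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3))
    print(Solution().diameterOfBinaryTree(root))  # 3
    ```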

