I have written a simple solution to the problem (nothing too fancy) in Python; my code is attached at the end.

According to the OJ, for the input `[1,null,2]` the expected output is `[1,null,2]`, which is correct. My output comes out as `[1,None,2]`, which apparently doesn't match the OJ's. I thought `None` in Python and `null` in the OJ were supposed to be equivalent. What am I doing wrong, and why isn't the OJ parsing my output?

Any help would be appreciated.
```python
# Definition for a binary tree node.
# class TreeNode(object):
#     def __init__(self, x):
#         self.val = x
#         self.left = None
#         self.right = None

class Codec:

    def serialize(self, root):
        """Encodes a tree to a single string.

        :type root: TreeNode
        :rtype: str
        """
        li, que = [], []
        if root:
            que.append(root)
        while len(que) > 0:
            root = que.pop(0)
            if not root:
                li.append("null")
            else:
                li.append(root.val)
                que.append(root.left)
                que.append(root.right)
        return ",".join(map(str, li))

    def deserialize(self, data):
        """Decodes your encoded data to tree.

        :type data: str
        :rtype: TreeNode
        """
        if not data:
            return None
        li = map(lambda v: int(v) if v != "null" else None, data.split(","))
        while li and li[-1] is None:
            li.pop()
        # print li
        idx = 0
        root = TreeNode(li[idx])
        que = [(root, idx)]
        while len(que) > 0:
            curr, idx = que.pop(0)
            # print "hola", curr, idx,
            if not curr:
                # print
                continue
            # print "val", curr.val,
            if 2*idx+1 < len(li):
                curr.left = TreeNode(li[2*idx + 1])
                que.append((curr.left, 2*idx+1))
            if 2*idx+2 < len(li):
                curr.right = TreeNode(li[2*idx + 2])
                que.append((curr.right, 2*idx+2))
            # print "que", que
        return root

# Your Codec object will be instantiated and called as such:
# codec = Codec()
# codec.deserialize(codec.serialize(root))
```
For that example, `root.left` should be `None`, not a node with a `val` attribute of `None`, which is what you're producing.
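To illustrate the point, here is a minimal sketch (not the OP's exact code; the `TreeNode` and `Codec` names follow the usual LeetCode template) of a queue-based `deserialize` that consumes two tokens per non-null node, so a `"null"` token results in a `None` child rather than a `TreeNode` whose `val` is `None`:

```python
from collections import deque

# Assumed TreeNode definition, matching the commented-out LeetCode template.
class TreeNode(object):
    def __init__(self, x):
        self.val = x
        self.left = None
        self.right = None

class Codec(object):
    def serialize(self, root):
        """Level-order traversal; missing children are emitted as "null",
        and trailing nulls are trimmed to match the OJ's compact format."""
        out, que = [], deque([root] if root else [])
        while que:
            node = que.popleft()
            if node is None:
                out.append("null")
            else:
                out.append(str(node.val))
                que.append(node.left)
                que.append(node.right)
        while out and out[-1] == "null":  # trim trailing nulls
            out.pop()
        return ",".join(out)

    def deserialize(self, data):
        """Rebuild the tree: each dequeued node consumes the next two
        tokens as its children; "null" leaves the child as None."""
        if not data:
            return None
        vals = data.split(",")
        root = TreeNode(int(vals[0]))
        que, i = deque([root]), 1
        while que and i < len(vals):
            node = que.popleft()
            if vals[i] != "null":  # left child
                node.left = TreeNode(int(vals[i]))
                que.append(node.left)
            i += 1
            if i < len(vals) and vals[i] != "null":  # right child
                node.right = TreeNode(int(vals[i]))
                que.append(node.right)
            i += 1
        return root
```

With this, `deserialize("1,null,2")` gives a root whose `left` is genuinely `None`, and the round trip reproduces `"1,null,2"` instead of `"1,None,2"`. Note that pairing tokens off a queue (rather than heap-style `2*idx+1` indexing) also matters: the compact format omits descendants of null slots, so array indexing drifts on sparse trees.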
@StefanPochmann Yes, I realised as much. There are a lot of issues here with falsy types alone. Thanks for replying!