```
class Solution(object):
    def deserialize(self, s):
        n = len(s)
        stack = []        # stack of lists, one per nesting level
        num = 0
        sign = 1          # +/- sign for the integer being parsed
        res = []          # current level's elements
        hasNum = False    # whether a parsed number is pending
        singleInt = True  # whether the input is just a single integer
        for i in xrange(n):
            if s[i].isdigit():
                num = num * 10 + int(s[i])
                hasNum = True
            else:
                if hasNum:
                    # flush the pending number into the current level
                    res.append(num * sign)
                    num = 0
                    sign = 1
                    hasNum = False
                if s[i] == "-":
                    sign = -1
                if s[i] == "[":
                    if i != 0 or res:
                        stack.append(res)
                        res = []
                    if i == 0:
                        singleInt = False
                if s[i] == "]" and stack:
                    # close the current level and append it to its parent
                    tmp = stack.pop()
                    tmp.append(res)
                    res = tmp
        if hasNum:
            # flush a trailing number (the input ends with a digit)
            res.append(num * sign)
        nestInt = NestedInteger()
        if singleInt:
            nestInt.setInteger(res[0])
        else:
            nestInt.setInteger(res)
        return nestInt
```

Please note that when creating the NestedInteger object, I always use the .setInteger() method, regardless of whether the result is a single integer or a nested list. The reason is that when I use the .add() method, the output ends up with one more pair of brackets than the expected output. For example, for the test case "[-1]", using .add() gives me [[-1]]. I don't really understand why I am encountering this issue. Any advice would be kindly appreciated.
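To experiment with this outside the judge, here is a minimal local stand-in for NestedInteger. This is only my assumption of how the hidden class behaves (LeetCode provides it but does not show the implementation); the `to_plain` helper is also made up for printing. It reproduces the extra bracket: `.add()` appends its argument as one element of the list, so adding an object that already represents the whole list nests it one level deeper.

```python
# Hypothetical stand-in for LeetCode's hidden NestedInteger class --
# an assumption about its behavior, for local experimentation only.
class NestedInteger(object):
    def __init__(self, value=None):
        # Holds either an int or a list of NestedInteger objects.
        self._value = [] if value is None else value

    def isInteger(self):
        return isinstance(self._value, int)

    def setInteger(self, value):
        self._value = value

    def add(self, elem):
        # Appends elem as ONE element of this object's list.
        self._value.append(elem)

    def getInteger(self):
        return self._value if self.isInteger() else None

    def getList(self):
        return self._value if not self.isInteger() else None


def to_plain(ni):
    # Hypothetical helper: render a NestedInteger as plain Python data.
    if ni.isInteger():
        return ni.getInteger()
    return [to_plain(x) for x in ni.getList()]


# Building "[-1]" element by element with add() gives the right shape:
correct = NestedInteger()
correct.add(NestedInteger(-1))
print(to_plain(correct))   # [-1]

# But add()-ing an object that already represents the whole list
# wraps it once more -- the extra pair of brackets:
wrapped = NestedInteger()
wrapped.add(correct)
print(to_plain(wrapped))   # [[-1]]
```

Under this assumption, `.add()` is meant for appending the individual elements of a level, not the finished level itself, which would explain the [[-1]] output.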