# Why is the output 2147483647 when the input string is "2147483648"? (Python)

• I'm confused by this test case: the input string is "2147483648", but the expected output is 2147483647. Does anybody have any comments on this?

```python
class Solution:
    # @param {string} str
    # @return {integer}
    def myAtoi(self, str):
        if len(str) == 0:
            return 0
        dic = {'0':0, '1':1, '2':2, '3':3, '4':4, '5':5, '6':6, '7':7, '8':8, '9':9}

        result = 0
        if '+' in str or '-' in str:
            flag = 0
        else:
            flag = 1

        # strip leading spaces
        while str[0] == ' ':
            str = str[1:]

        for c in str:
            if c == '+' and result == 0:
                flag += 1
            elif c == '-' and result == 0:
                flag -= 1
            elif c in dic:
                result = result * 10 + dic[c]
            else:
                return result * flag

        return result * flag
```

• Test data is the same for all languages, and `atoi` comes from C/C++, where 2147483648 is out of the range of a 32-bit signed int. The behavior of `atoi` on overflow is undefined, but implementations, and thus the test data, apparently return the largest representable value, just as `strtol` does by definition. Which is 2147483647.
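In other words, the expected behavior is to clamp out-of-range results to the 32-bit limits rather than return them unchanged. Here is a minimal sketch of that clamping in Python (the function name `clamp_atoi` and the overall parsing loop are illustrative, not the poster's code; only the final `max`/`min` clamp is the point):

```python
INT_MAX = 2**31 - 1   # 2147483647
INT_MIN = -2**31      # -2147483648

def clamp_atoi(s):
    """Parse a leading integer like atoi, then clamp to 32-bit signed range."""
    s = s.strip()                      # drop leading/trailing spaces
    if not s:
        return 0
    sign, i = 1, 0
    if s[0] in '+-':                   # optional single sign character
        sign = -1 if s[0] == '-' else 1
        i = 1
    result = 0
    while i < len(s) and s[i].isdigit():
        result = result * 10 + int(s[i])
        i += 1
    result *= sign
    # clamp to the 32-bit signed range, as the test data expects
    return max(INT_MIN, min(INT_MAX, result))

print(clamp_atoi("2147483648"))   # 2147483647 (clamped)
print(clamp_atoi("-2147483649"))  # -2147483648 (clamped)
```

Python integers are arbitrary-precision, so the clamp has to be explicit; in C the overflow would happen inside the parsing loop itself.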
