You claimed when space complexity is defined with a normal Turing machine, it is effectively equal to the amount of extra memory being used. Please cite one such definition.

Sorry, I slipped there. I shouldn't have tried to put it in those terms. Thanks for the correction.

I should have just stuck with my actual point, that people simply mean extra space when they say space. Because like I said, that's what we're interested in in practice, and that's the actual cost **of calling the function**. The input is already prepared, that cost is already paid! The word "space" has been around a lot longer than complexity theory and computers. When you see that word, you shouldn't automatically assume that people are talking about the space complexity you're thinking of. If you do, then you **are** picking a definition you favor. And I believe most people here neither mean that definition when they say space, nor do they understand it that way when they read that word.

Or let me put it this way: You're talking about the space complexity of **problems**, or of **problem+algorithm** pairs. We here are talking about the space taken/allocated by our functions. So it's wrong to add the input size to that, since allocating that space already happened **before** our functions get called. That's not a cost produced by our functions.
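To make that distinction concrete, here's a minimal sketch (the function name and body are my own illustration, not something from this thread) of a function whose *auxiliary* space is O(1) even though its input occupies O(n) space:

```python
def reverse_in_place(arr):
    """Reverse arr using O(1) auxiliary space.

    The list itself occupies O(n) space, but that allocation
    happened before this function was called. The function
    itself only allocates two index variables.
    """
    lo, hi = 0, len(arr) - 1
    while lo < hi:
        # Swap the ends and move the indices inward; no copy
        # of the list is ever made.
        arr[lo], arr[hi] = arr[hi], arr[lo]
        lo += 1
        hi -= 1
    return arr
```

Under the "space = extra space allocated by the function" reading, this is O(1); under a reading that counts the input, it's O(n).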

I myself occasionally point out to people that they shouldn't call their solutions "O(...) time" when that's really just the average, not the worst case, with the reasoning that "O(...) time" is commonly understood as worst case if not specified otherwise. I think that's what really matters, that we understand each other. What things are commonly understood as. You call it a "common misunderstanding" (of the space complexity you're thinking of). I call it a common understanding (that "space" refers to the space allocated by our solutions).

As for the is_palindrome function example, you actually raised a very good point. The answer is: the space complexity of this function is not O(1), but instead O(log n). Justification: if the list has n elements, storing the index variable i alone takes O(log n) bits.
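For reference, the is_palindrome function itself wasn't reproduced in this exchange; here's one plausible version, assuming the usual two-pointer check over a list, which is the kind of function the O(1)-vs-O(log n) argument applies to:

```python
def is_palindrome(lst):
    """Check whether lst reads the same forwards and backwards.

    Uses only two index variables, i and j: conventionally
    described as O(1) auxiliary space, though each index needs
    O(log n) bits to address a list of n elements.
    """
    i, j = 0, len(lst) - 1
    while i < j:
        if lst[i] != lst[j]:
            return False
        i += 1
        j -= 1
    return True
```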

Yes, I know.

What I asked is whether you seriously insist that it's "incorrect" to say that that function takes O(1) space. Because using your definition, and to be consistent, you should.

Practically we call it O(1)

"We"? So even you yourself would call it that? Then where do you draw the line?