Re: Distinction mutable/immutable

Tim Peters (tim@ksr.com)
9 Jun 1994 02:03:55 GMT

klooster@dutiag.twi.tudelft.nl (Marnix Klooster) writes:
>...
>One of the things I don't like about Python (I _love_ almost all
>other aspects!) is the distinction between mutable and immutable
>objects. I have two questions:

>- Why was this distinction introduced? Was it a design decision
> for the language, or to ease its implementation?

Only Guido can answer that, but my guess is the former.

>- How does this distinction work in practice? Does it complicate
> programming, or simplify it?

The only two places it ever seems _visible_ are for tuples and strings.
Tuples are no problem, because if you want a mutable tuple you just use a
list (or if you want an immutable list, use a tuple <wink>).
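
Concretely (just a quick sketch):

t = (1, 2, 3)       # a tuple: immutable
l = [1, 2, 3]       # a list: mutable
l[0] = 99           # fine; l is now [99, 2, 3]
# t[0] = 99         # Python refuses (you get a TypeError)
l2 = list(t)        # want to fiddle with a tuple's contents?  copy it to a list
t2 = tuple(l)       # want to freeze a list?  copy it to a tuple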

Making strings immutable was a little stranger, because most languages
don't make that choice. Even among those that do, it's probably more
common to provide syntactic sugar to hide it; e.g., in Icon, where
strings are also immutable, after

s := "abc"
t := s
t[2] := "-"

s is "abc" and t is "a-c". The last line is actually treated as if
it had said
t := t[1:2] || "-" || t[3:*t+1]

In Python, 't[2] = "-"' is simply an error (it raises a TypeError). But does
that get in the way?
Not often.
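
If you really want the Icon effect, you spell out by hand what Icon does
behind your back (remembering that Python counts from 0 where Icon counts
from 1); something like:

s = "abc"
t = s
# t[1] = "-"                 # the part Python refuses (TypeError)
t = t[:1] + "-" + t[2:]      # build a new string instead
# s is still "abc", and t is now "a-c", same net effect as the Icon version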

It's sometimes easier to reason about code that uses immutable types, and
e.g. for immutable structured types with immutable components "all the
way down" it's possible to compute a time-invariant hash (and Python's
form of associative array, the dictionary, relies on that).
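
That's why, e.g., a tuple works fine as a dictionary key but a list doesn't;
a quick illustration:

d = {}
d[(1, 2)] = "ok"        # immutable all the way down, so hashable
# d[[1, 2]] = "boom"    # a list could change later, so Python refuses
                        # (raises a TypeError)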

>A little justification of my dislike of the distinction:
>- It seems an unnecessary distinction, from a conceptual point of
> view.

Yes, after saying 0 is not 1, all the rest _is_ needless confusion <0.9
grin>!

>- If changing one value can also change another value, either:
> * this should be made explicit (e.g. all objects are immutable,
> and introduce pointers, i.e. reference and dereference), or
> * this should be the default (i.e. all objects are mutable).

I don't think you really believe that. I.e., I don't know of any
language crazy enough to insist that integers are mutable: if in C

i = 1;
j = i;
j += 3;
printf("%d\n", i);

printed 4, people would go insane. Almost everyone would agree that
integers should be immutable. OTOH, almost everyone would agree that
arrays should _not_ be immutable; it's the very essence of an array that
it be incrementally updatable (the more rigid-- and unused --functional
languages notwithstanding).
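
In Python terms (a sketch, with the names made up on the spot):

i = 1
j = i
j = j + 3       # rebinds j to a brand-new int; i is still 1, as it should be

a = [1, 2, 3]
b = a           # a and b name the very same list object
b.append(4)     # so a is now [1, 2, 3, 4] too; that's what arrays are for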

Now you _may_ be one of those rare people who really does believe that
consistency in this distinction (one way or the other) buys enough to
make up for the confusion it creates -- but since you say you love Python
in most other respects, I think you have too much common sense to be one
of those <wink>.

It's a fact of life that almost all languages do make a distinction between
mutable & immutable types (integers and arrays probably being the most
common examples of each), whether they say so up front or not. Granting
that, the real argument is over which types should be mutable and which
not. You rarely hear a passionate argument about that, though, because it's
not hard to live with just about any non-insane set of choices; much more
likely to hear arguments over, e.g., the distinct (but often confused)
issue of whether functions should pass by value or by reference (Python
passes "by object", so at least Guido never gets sucked into _that_
debate <grin>).
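
For the curious, "by object" means the callee gets the very objects the
caller passed in, not copies and not the caller's variables.  Something like:

def f(x, items):
    x = x + 1             # rebinds the local name only; caller's int unchanged
    items.append("new")   # mutates the shared list; the caller sees this

n = 0
stuff = []
f(n, stuff)
# afterward n is still 0, but stuff is ["new"]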

promising-it-won't-get-in-your-way-if-you-don't-think-about-it-too-
much-ly y'rs - tim

Tim Peters tim@ksr.com
not speaking for Kendall Square Research Corp