Re: Overloading function calls

Tim Peters
Sun, 05 Jun 94 03:46:20 -0400

> [tim, describes a scheme not entirely unlike today's for handling
> special-method binop coercions and RHS dispatch, then ...]
> + FBGNTs that work today have a good chance of continuing to work _if_
> the line
>     __rcoerce__ = __coerce__
> is added after their "def __coerce__" method. The same trick is
> appropriate for any FBGNT that needs RHS dispatch, does want coercion,
> and is happy to always promote to a common type.

While that's true, it's true of a scheme other than the one I actually
wrote up <grin/sigh>. I.e., the above is a bald-faced lie!! Don't
believe it, & sorry for the confusion.

I hope that, if a new scheme is adopted, people would recode classes
currently relying on __coerce__ for RHS dispatch, in order to take
advantage of the changes. The poor-but-quick way to achieve backward
compatibility in the proposed scheme is to define __rcoerce__ to mimic
what Python does today:

    def __rcoerce__(y, x):
        p = y.__coerce__(x)
        if p is not None: return p[1], p[0]
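
To see the kludge in action, a minimal sketch (class `X` here is my stand-in
for an old-style class that coerces ints up to itself):

```python
class X:
    def __init__(self, val):
        self.val = val

    def __coerce__(self, other):
        # old-style: promote an int to an X; knows nothing about ordering
        if isinstance(other, int):
            return self, X(other)

    def __rcoerce__(y, x):
        # the compatibility kludge, as above
        p = y.__coerce__(x)
        if p is not None:
            return p[1], p[0]

x = X(3)
left, right = x.__rcoerce__(6)   # mimics the old swap-back: (X(6), x)
assert left.val == 6 and right is x
assert x.__rcoerce__("nope") is None   # fails just like __coerce__ does
```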

E.g., consider 6/x, where x is an instance of class X that defines both
__coerce__ and __div__:

Today: "6" doesn't know how to deal with "x", so Python swaps the
arguments and invokes X.__coerce__(x,6). __coerce__ doesn't know
anything about ordering, and needs to promote 6 to an instance of X,
which I'll denote as X(6). X.__coerce__ then returns (x, X(6)). Python
verifies that the tuple components have the same type. Then, because it
swapped arguments before the call to __coerce__, it swaps them back
again, to get (X(6), x). Finally, Python sees that X(6) has a __div__
method, so invokes X(6).__div__(x), and all is well.
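
That dance can be simulated directly; a sketch (the driver `binop_div` is
my stand-in for the interpreter, class `X` for a user class, and I use
floor division just to keep the arithmetic concrete):

```python
class X:
    def __init__(self, val):
        self.val = val

    def __coerce__(self, other):
        # knows nothing about ordering; just promotes an int to an X
        if isinstance(other, int):
            return self, X(other)

    def __div__(self, other):
        return X(self.val // other.val)

def binop_div(a, b):
    # stand-in for today's "a / b" dispatch described above
    if isinstance(a, int):
        pair = b.__coerce__(a)          # swapped: X.__coerce__(x, 6)
        assert type(pair[0]) is type(pair[1])
        a, b = pair[1], pair[0]         # swap back: (X(6), x)
    return a.__div__(b)                 # X(6).__div__(x)

x = X(3)
assert binop_div(6, x).val == 2         # 6 // 3
```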

Under the compatibility kludge: "6" doesn't know how to deal with "x",
but X.__rcoerce__ is defined so Python invokes X.__rcoerce__(x,6). That
in turn invokes X.__coerce__(x,6) so that it acts exactly like it used
to, and __coerce__ returns (x, X(6)) as before. __rcoerce__ then does
the swapping that Python used to do, and X(6).__div__(x) is called just
as before. The point of testing for "p is not None" is so that the
kludge fails in the same cases and in the same ways the current scheme
fails. If you don't care about that, a slightly simpler compatibility
kludge that _may_ be suitable (it depends on how your class expects to
interact with other classes) is:

    def __rcoerce__(y, x):
        y, x = y.__coerce__(x)
        return x, y

But enough compatibility hacks! It's more interesting to see how life
would work using the proposed scheme from the start. Demo/classes/
implements a Date class (read the comments at the start) supporting
    date + int            # Date int days in the future
    int + date            # ditto
    BUT NOT date + date   # no sensible meaning

    date - int            # Date int days in the past
    date - date           # number of days between dates
    BUT NOT int - date    # no sensible meaning

    date compare date     # guess

and as undocumented features

    int compare date      # don't guess
    date compare int
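
For reference, once RHS dispatch exists the whole table is easy to express;
a sketch using the reflected-method scheme modern Python eventually settled
on (`__radd__` and `NotImplemented`), with the class trimmed to a bare
ordinal day number:

```python
class Date:
    def __init__(self, ordinal):
        self.ordinal = ordinal          # toy Date: just a day number

    def __add__(self, other):           # date + int
        if isinstance(other, int):
            return Date(self.ordinal + other)
        return NotImplemented           # so date + date fails

    __radd__ = __add__                  # int + date: same logic

    def __sub__(self, other):
        if isinstance(other, int):      # date - int
            return Date(self.ordinal - other)
        if isinstance(other, Date):     # date - date: days between
            return self.ordinal - other.ordinal
        return NotImplemented

    # no __rsub__, so "int - date" fails, as desired

    def __eq__(self, other):
        return isinstance(other, Date) and self.ordinal == other.ordinal

    def __lt__(self, other):
        return self.ordinal < other.ordinal

d = Date(100)
assert (d + 5).ordinal == (5 + d).ordinal == 105
assert (d - 5).ordinal == 95
assert Date(107) - d == 7
```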

The tribulations of getting this to work under the current scheme are
documented in the file; e.g.,

+ "int + date" and "date - int" go thru __coerce__, and Python won't
accept a (Date,int) pair *from* __coerce__. So __coerce__ hides int
inputs in instances of a _DisguisedInt class, and part of the work
slobbers into that latter class.

+ But "date + int" *doesn't* go thru __coerce__, so Date.__add__ is
unique in not being able to trust the type of its 2nd argument
(__coerce__ never got a chance to check it), and in dealing with a raw
int instead of a _DisguisedInt (__coerce__ never got a chance to hide it).

Under the proposed scheme, life is easier: Since Date wants RHS dispatch
(for "int + date" and "int compare date"), it still needs to get into the
coercion business. Since it _is_ in that business, it may as well take
advantage of it, putting the gross type checks there. But unlike before,
it needn't disguise int arguments (so no need for the _DisguisedInt
class), and all the binops can rely on knowing their 1st arg is a Date
and their 2nd a Date or an integer:

    def __coerce__(self, other):
        t = type(other)
        if t in _INT_TYPES or \
           t is type(self) and other.__class__ is Date:
            return self, other
        # else signal failure by falling thru (returns None)

    def __rcoerce__(self, other):
        t = type(other)
        if t in _INT_TYPES or \
           t is type(self) and other.__class__ is Date:
            return self, other, 1

Of course __rcoerce__ could invoke __coerce__ to save duplication.
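
A sketch of that delegation (reusing the `_INT_TYPES` idea from the code
above; the trailing 1 in the triple is the "swapped" flag):

```python
_INT_TYPES = (int,)

class Date:
    def __coerce__(self, other):
        if type(other) in _INT_TYPES or isinstance(other, Date):
            return self, other
        # else fall thru: None signals failure

    def __rcoerce__(self, other):
        # same acceptance test as __coerce__, plus the swapped flag
        p = self.__coerce__(other)
        if p is not None:
            return p + (1,)

d = Date()
assert d.__rcoerce__(3) == (d, 3, 1)
assert d.__rcoerce__("x") is None
```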

The changes to the class binop methods are (I hope!) obvious: __add__,
__sub__ and __cmp__ grow a 3rd "swapped=0" argument, weed out the
remaining bad cases ("int - date" and "date + date"), and do the right
thing with the good cases. All that is straightforward.
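
For instance, a hedged sketch of what the new Date.__sub__ might look like
(the class pared down to a bare ordinal day number; under the proposed
scheme the coercions have already guaranteed the 2nd argument is a Date or
an int by the time we get here):

```python
class Date:
    def __init__(self, ordinal):
        self.ordinal = ordinal          # toy Date: just a day number

    def __sub__(self, other, swapped=0):
        if swapped:
            # we were the RHS: "int - date" is one of the bad cases
            raise TypeError("can't subtract a Date from an int")
        if isinstance(other, Date):     # date - date: days between
            return self.ordinal - other.ordinal
        return Date(self.ordinal - other)   # date - int: earlier Date

d = Date(100)
assert d.__sub__(30).ordinal == 70
assert Date(107).__sub__(d) == 7
```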

The code isn't really amazingly simpler, but it is amazingly simpler to
figure out _how_ to write it so that it works.

The most important thing (IMO) is subtler: In reviewing this class, I
discovered that it fails to work nicely with new numeric (semi-numeric,
whatever) data types. That is, if you define a new semi-numeric type SN
that you want to be able to add to Dates, then under the current scheme &
the current Date code

    date + SN

will fail with a "can't add <type 'instance'> to date" TypeError, raised
by Date.__add__. Fixing this isn't trivial, and would require new code
in Date.__add__ that either knows directly about the SN type or explicitly
reproduces Python's coercion logic and redispatches to SN.__add__.

Under the proposed scheme and code, it works by magic:
Date.__coerce__(date, SN) fails, so SN.__rcoerce__ will be invoked. If
SN believes it knows how to add itself to a Date, no problem.
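
The magic is easy to simulate; in this sketch the function `add` stands in
for the interpreter's dispatch under the proposed scheme, and `SN` is a
made-up semi-numeric type that Date knows nothing about:

```python
class Date:
    def __init__(self, ordinal):
        self.ordinal = ordinal

    def __coerce__(self, other):
        # Date only knows about ints and Dates -- SN fails here
        if isinstance(other, (int, Date)):
            return self, other

    def __add__(self, other, swapped=0):
        if isinstance(other, Date):
            raise TypeError("date + date has no sensible meaning")
        return Date(self.ordinal + other)

class SN:
    def __init__(self, days):
        self.days = days

    def __rcoerce__(self, other):
        # SN believes it knows how to add itself to a Date
        if isinstance(other, Date):
            return self, other, 1       # 1 == "arguments were swapped"

    def __add__(self, other, swapped=0):
        return Date(other.ordinal + self.days)

def add(x, y):
    # stand-in for interpreter dispatch: LHS coercion, then RHS rescue
    pair = x.__coerce__(y)
    if pair is not None:
        return pair[0].__add__(pair[1])
    if hasattr(y, '__rcoerce__'):
        triple = y.__rcoerce__(x)
        if triple is not None:
            lhs, rhs, swapped = triple
            return lhs.__add__(rhs, swapped)
    raise TypeError("can't add those")

d = Date(100)
assert add(d, SN(5)).ordinal == 105   # Date fails, SN.__rcoerce__ rescues
assert add(d, 7).ordinal == 107       # plain old date + int still works
```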

More, this will be _generally_ true of numeric types, without explicit
effort: a new type can be defined that knows how to deal with any subset
of the pre-existing numeric types, without needing any changes to the
pre-existing types. This is vital, and is genuinely hard to achieve
under the current scheme (e.g., I blew it in the Dates module, and you'll
find that you can't mix the Rational & Complex types from the
Demo/classes/ directory either).

Marc, would you like to try fleshing out the details for a multi-
method approach instead? The way I picture it:

+ Date's __add__, __sub__, __cmp__ methods, and coercions, go away, and
instead functions external to the class are defined to do the work.

+ Argument "type-checking" becomes a non-issue for the user: 7
<signature, implementor> pairs, corresponding to the only legal
operations, are somehow registered:

    date + int
    int + date
    date - int
    date - date
    date compare date
    int compare date
    date compare int

Then e.g. "date + date" raises a generic "don't know how to do that"
TypeError.

Presumably the user can define as few as 3 distinct implementing
functions (one for each of Date {+,-,cmp}), or as many as 7 (one for
each unique signature). In the latter case explicit tests on type
within each function aren't needed; in the former case they are (and
the functions need to know exactly how fuzzy their arguments can be).

+ Adding a new type that can manipulate old types, without changing the
old types, is just a matter of registering the signatures and supplying
implementing functions.

+ Ditto doing something nuts like "even though I didn't write, and can't
even get at, the source for the relevant classes, I want
    Complex ^ Date
to mean 42" -- just register (Complex, Date, lambda x,y: 42) with the
__xor__ handler.

+ How coercions -- when desired -- work under this approach isn't clear to
me. E.g., I wanted to write the Date class so that

    "5 Jun 1994" - date

would work in the obvious way. It's impossible to do that today
because Python won't accept a string as the LHS of a subtract. Under
the proposed scheme, "it should", and the implementation would be
confined to teaching Date's coerce methods how to convert a string into
a Date.

New signatures involving strings could handle that, along with new
implementing functions, but that would get awfully tedious in a hurry.

I can picture a "coerce handler" with which one registers e.g. a
(string, date, string_to_date_conversion_function) tuple, but the
number of possible schemes for _applying_ such a beast make me dizzy --
kinda like trying to understand the GNU Make chapter on implicit
chaining of Make rules <0.5 grin>.
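
To make the registration idea concrete, a toy sketch (all the names here --
`register`, `dispatch`, the string op codes -- are mine, and a real
multi-method system would also worry about inheritance and ambiguity):

```python
_registry = {}                      # (op, lhs_type, rhs_type) -> function

def register(op, lhs_type, rhs_type, func):
    _registry[op, lhs_type, rhs_type] = func

def dispatch(op, x, y):
    func = _registry.get((op, type(x), type(y)))
    if func is None:                # unregistered signature
        raise TypeError("don't know how to %s %s and %s" %
                        (op, type(x).__name__, type(y).__name__))
    return func(x, y)

class Date:
    def __init__(self, ordinal):
        self.ordinal = ordinal

class Complex:
    pass

# two of the 7 legal Date signatures:
register('+', Date, int, lambda d, i: Date(d.ordinal + i))
register('+', int, Date, lambda i, d: Date(d.ordinal + i))

# ... and a third party can register nonsense without touching either class:
register('^', Complex, Date, lambda x, y: 42)

d = Date(100)
assert dispatch('+', d, 5).ordinal == 105
assert dispatch('+', 5, d).ordinal == 105
assert dispatch('^', Complex(), d) == 42
```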

In all, I can't tell whether I'd like the multi-method approach in
practice. Making type-checking trivially easy, and making it easy to
write implementing functions free of type tests (and "swapped?" tests!),
are clear wins. But moving the implementation of class methods out of
the class body is unattractive (e.g., where does one look to find the
implementation of "typeA - typeB"? today that's a non-question; but once
Steve M has multi-methods, the answer's more likely to be "oh, that was
registered by an exec'ed lambda built from the output I captured from
your screen dump routine" <wink>). Also wonder how to define a
cooperating coercion facility that isn't insanely unpredictable, how it
all fits in with the rest of Python (presumably, if adopted, the approach
shouldn't be limited to special class methods -- right?), how much slower
(or faster?) it would run than the current scheme, how much existing code
it would break, etc. I do think the proposed "Python-like" scheme makes
the usual cases easy, and the hard cases (full-blown numeric types) at
least systematically doable, so I wonder too if there are other clear
wonderful uses.

Anyone have a good enough feel for how multi-methods work out in
other languages to take a stab at some details? I'm out of time for
championing Python crusades for a while, so even if M-M's are great
they're gonna need another whiner to promote them.

"methods"-applied-to-them<wink>-ly y'rs - tim

Tim Peters
not speaking for Kendall Square Research Corp