I won't get pissed; I'll just ignore it. I have good grounds to do
so. We did exactly this in the implementation of ABC (a Python-like
language, Python's predecessor in a sense): sufficiently small
integers were stored directly in the pointer field (we used bit 0 of
the address as a type field, but the idea is the same). We saved some
heap space, and perhaps a little time in ABC programs that manipulated
lots of integers. But I think overall it was a loss, as *every* piece
of code that ever followed a pointer to an object first had to check
whether the pointer really was a pointer. Now this code was all
hidden inside macros, so it still looked clean, but the compiler of
course generated real tests everywhere, so our object code was bigger.
(E.g. every INCREF and DECREF call had to contain this test.) And the
gain was very small, since integer manipulations are simply not where
you spend most of your time in a high level language like this.

BTW note that Python's framework is flexible enough that integers
aren't allocated as individual malloc() blocks -- there's a separate
free list. But every integer object still has a reference count and a
pointer to its type object. This means the INCREF and DECREF macros
can be fast.
--Guido van Rossum, CWI, Amsterdam <mailto:Guido.van.Rossum@cwi.nl>
<http://www.cwi.nl/cwi/people/Guido.van.Rossum.html>