I am using 1.0.3++.
-- What techniques do you use to isolate unexpected heap growth? The
growth may or may not be a memory leak, but I'd like to know how to
determine that without reading the source for every module I use.
-- Is it reasonable to expect that memory footprint can be controlled
in a non-trivial, long-running application written in Python? I know
this is a rather vague question. Perhaps a better one to ask would
be: has anyone successfully applied Python in this way?
-- Are there known leaks or conditions that produce leaks
(notwithstanding the regex one that passed by the other day)?
-- Would a later release provide substantive improvement in the
memory footprint area?
-- Is anyone aware of memory-footprint-related problems with dbm (I use
it rather heavily)?
-- Has anyone created iterator types for sucking the contents out of a
mapping type (especially something like dbm)? The 'keys()' method
doesn't seem to scale well with dbm hash files that are large.
Am I thinking about this wrongly?
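To make the first question concrete, here is a sketch of one technique for localizing heap growth: snapshot the count of live objects by type before and after a suspect operation, then diff the two. This assumes a modern CPython with the gc module (which did not exist in 1.0.3); the function names and the Widget class are illustrative, not a standard API.

```python
# Diff live-object counts by type to localize heap growth.
# Assumes a modern CPython; gc was not available in 1.0.3.
import gc
from collections import Counter

def type_census():
    """Count live objects tracked by the collector, keyed by type name."""
    return Counter(type(o).__name__ for o in gc.get_objects())

def heap_delta(before, after):
    """Types whose live-object count grew (Counter subtraction keeps
    only positive counts)."""
    return dict(after - before)

# Usage: bracket the code you suspect of leaking.
class Widget:        # hypothetical stand-in for a suspect type
    pass

before = type_census()
hoard = [Widget() for _ in range(1000)]   # the "suspect operation"
after = type_census()
growth = heap_delta(before, after)
# growth["Widget"] reports the 1000 new instances; any other types
# that grew unexpectedly are also worth a look.
```

Repeating the census at intervals in a long-running process gives a rough trend line per type, which narrows the search without reading every module's source.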
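On the last question, GNU dbm exposes firstkey()/nextkey() for exactly this kind of one-key-at-a-time traversal (dbm.gnu in modern Python), which avoids materializing the whole key list the way keys() does. A minimal sketch, assuming that interface; the dict fallback branch is an assumption for mappings without it:

```python
# Yield keys one at a time from a dbm-style mapping, avoiding the
# all-at-once key list that keys() builds for large hash files.
def iterkeys(db):
    """Walk keys lazily via firstkey()/nextkey() when available."""
    if hasattr(db, "firstkey"):
        # GNU dbm traversal: nextkey() returns None after the last key.
        key = db.firstkey()
        while key is not None:
            yield key
            key = db.nextkey(key)
    else:
        # Fallback for ordinary mappings (plain dicts iterate lazily).
        for key in db:
            yield key

# Usage with a plain dict standing in for a dbm file:
sample = {"a": 1, "b": 2, "c": 3}
found = sorted(iterkeys(sample))
```

Note that gdbm's traversal order follows the hash layout, not insertion or sort order, so sort the yielded keys yourself if order matters.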
I appreciate any pointers or help.
Cheers,
lef
PS I am really impressed with Python to date. After having messed
with a bunch of other scripting langs, this one wins hands down
on most of the issues that matter to me. Credits to GvR!