It sure isn't running into any syslimits limits... the python process is
only around 1M or so...
-steve
> From guido@cwi.nl Tue Mar 15 00:37 PST 1994
> To: stk (Steve Kirsch)
> Cc: python-list@cwi.nl
> Subject: Re: MemoryError in big dictionaries
> From: Guido.van.Rossum@cwi.nl
> X-Organization: CWI (Centrum voor Wiskunde en Informatica)
> X-Address: P.O. Box 94079, 1090 GB Amsterdam, The Netherlands
> X-Phone: +31 20 5924127 (work), +31 20 6225521 (home), +31 20 5924199 (fax)
> X-Subliminal-Message: have you voted for comp.lang.python yet?
> Date: Tue, 15 Mar 1994 09:37:01 +0100
> Sender: Guido.van.Rossum@cwi.nl
> Content-Type: text
> Content-Length: 1078
> X-Lines: 36
>
> > Consider the following program:
> >
> > import os
> >
> > dict={}
> > def dictset(x):
> >     for key in range(x):
> >         keyc=`key`
> >         dict[keyc]=keyc
> >     os.times()
> >
> > --------------------
> > dictset(10000) works fine.
> > dictset(100000) gives a memory error.
> >
> > I'm running on a 256M system with no other users.
> >
> > Why am I getting a memory error for a dictionary with only 100,000
> > entries?
>
> Oops. Erm. I think this has to do with the following table in
> Objects/mappingobject.c:
>
> static unsigned int primes[] = {
>     3, 7, 13, 31, 61, 127, 251, 509, 1021, 2017, 4093,
>     5987, 9551, 15683, 19609, 31397,
>     0xffffffff /* All bits set -- truncation OK */
> };
>
> The hash table will be extended to the next larger prime in the table.
> This means that you can't create a dictionary with more than 31397
> entries... I would guess you can fix this by adding some more primes
> to it. I doubt I will be able to do much about it for the next couple
> of weeks, so feel free to experiment!
>
> --Guido van Rossum, CWI, Amsterdam <Guido.van.Rossum@cwi.nl>
> URL: <http://www.cwi.nl/cwi/people/Guido.van.Rossum.html>
>