A few others and I are the happy maintainers of a few legacy batch jobs written in Perl: about 30k lines of code, split across maybe 10-15 Perl files. We have a lot of long-term fixes planned to improve how the batch process works, but in the short term we have to keep the lights on, because various other projects depend on the output of these batch jobs.

At the core of the main part of these batch jobs is a hash that is loaded with a bunch of data collected from data files spread across a bunch of directories (a rough sketch of the pattern is at the end of this post). When these jobs were first written, everything fit comfortably in memory, no more than 100 MB or so. Things of course grew over the years, and the hash now grows beyond what the box can handle (8 GB), leaving us with the friendly message from Perl: Out of memory!

This is, of course, a poor design for a batch job, and we have a clear (long-term) roadmap to improve the process. I have two questions, however:

1. What kind of short-term options can we look at, short of throwing more memory at the machine? Are there OS settings that can be tweaked, or Perl runtime/compile flags that can be set?

2. I'd also like to understand why Perl crashes with the "Out of memory!" error, as opposed to using the swap space available on the machine.

For refe...
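To make the problem concrete, here is a rough sketch of the loading pattern. The directory, file extension, and record format below are all hypothetical; the real jobs are more involved, but the shape is the same: one process walks the data directories and accumulates everything into a single in-memory hash.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# One big hash holds every record from every data file.
my %records;

find(sub {
    return unless -f && /\.dat\z/;                 # hypothetical extension
    open my $fh, '<', $_ or die "Can't open $File::Find::name: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($key, $value) = split /\t/, $line, 2;  # hypothetical record format
        $records{$key} = $value;                   # hash grows with the input
    }
    close $fh;
}, '/data/batch-input');                           # hypothetical directory

# ...the rest of the batch job then works off %records entirely in memory.
```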
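On question 2 specifically: one thing we could check is whether a per-process limit, rather than actual exhaustion of RAM plus swap, is what makes the allocation fail. This is a sketch of such a check using the CPAN module BSD::Resource, which is an assumption on my part, not something the jobs currently use:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use BSD::Resource;   # CPAN module, not core; assumed available for this check

# If the soft address-space limit is finite, allocations start failing once
# the process reaches it, no matter how much swap is still free, and Perl
# reports that failure as "Out of memory!".
my ($soft, $hard) = getrlimit(RLIMIT_AS);

for my $pair ([ soft => $soft ], [ hard => $hard ]) {
    my ($name, $limit) = @$pair;
    printf "%s RLIMIT_AS: %s\n", $name,
        $limit == RLIM_INFINITY ? 'unlimited'
                                : sprintf('%.0f MB', $limit / 2**20);
}
```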