From owner-freebsd-questions Sun May 4 15:46:31 1997
Return-Path:
Received: (from root@localhost) by hub.freebsd.org (8.8.5/8.8.5) id PAA21086 for questions-outgoing; Sun, 4 May 1997 15:46:31 -0700 (PDT)
Received: from foo.primenet.com (ip193.sjc.primenet.com [206.165.96.193]) by hub.freebsd.org (8.8.5/8.8.5) with ESMTP id PAA21079 for ; Sun, 4 May 1997 15:46:26 -0700 (PDT)
Received: (from bkogawa@localhost) by foo.primenet.com (8.8.2/8.6.12) id PAA04988; Sun, 4 May 1997 15:42:25 -0700 (PDT)
Date: Sun, 4 May 1997 15:42:25 -0700 (PDT)
Message-Id: <199705042242.PAA04988@foo.primenet.com>
To: shovey@buffnet.net
Subject: Re: Weird perl problem
Newsgroups: localhost.freebsd.questions
References: <>
From: "Bryan K. Ogawa"
Cc: freebsd-questions@FreeBSD.ORG, Steve Howe
X-Newsreader: NN version 6.5.0 #1 (NOV)
Sender: owner-questions@FreeBSD.ORG
X-Loop: FreeBSD.org
Precedence: bulk

In localhost.freebsd.questions you write:

>On Sun, 4 May 1997, Steve Howe wrote:
>> On Sun, 4 May 1997, Steve wrote:
>>
>> i noticed while opening a few Megs of text files with the "joe" editor,
>> my terminal went completely nuts, shell and everything, and i had
>> to exit to the login prompt to regain control. i'm not an expert
>> on BSD RAM usage, but i think something goes nuts when it gets maxed out.
>> i think perl sucks in that file as one long string into memory.
>> i wonder if it would crash if you chopped the file in 2 or 4 ...

>The thing is, it loads the data and does the math, and is done with the
>file. It's when it prints out what it found that it chokes - and at 8192
>bytes.

It sounds like output is getting cut off too soon. Perl buffers output
by default. You can try turning this off with:

    select(FILEHANDLE_TO_UNBUFFER);
    $| = 1;

That way, output gets written as it goes, which may at least help you
pin down where the error occurs.

Questions: How does this script load the file? How is it storing the
data internally? Are you running out of swap space?
As the above poster mentioned, perl can consume all of memory without
much effort. If the program loads more than one line at a time, that's
a real possibility. For example,

    while (<>) {
        chop;
        # do rest of processing here, and store the results
    }

will only hold one line in memory at a time (plus whatever the stored
results take up), while something like

    undef $/;
    $file = <>;
    # process entire file here

will load the whole file into memory (in $file) before doing anything.
This probably isn't your problem, although certain types of analysis can
use up tremendous amounts of memory (depending on various parameters) --
for example, many scripts use associative arrays to mark:

    used files
    visiting hosts

so memory usage will be proportional to the number of files and the
number of visiting hosts.

Hope that helps.

bryan
--
bryan k ogawa   http://www.primenet.com/~bkogawa/