Date: Mon, 3 Apr 2006 13:48:42 +0200
From: "Daniel A." <ldrada@gmail.com>
To: "Olaf Greve" <o.greve@axis.nl>
Cc: freebsd-questions <freebsd-questions@freebsd.org>
Subject: Re: How can I increase the shell's (or specific application's) memory limit?
Message-ID: <5ceb5d550604030448t6ecf22uaef80f13f222c465@mail.gmail.com>
In-Reply-To: <44310A0D.80607@axis.nl>
References: <44310A0D.80607@axis.nl>
On 4/3/06, Olaf Greve <o.greve@axis.nl> wrote:
> Hi,
>
> I've got a question which is probably pretty easy to answer: how can I
> assign more memory to a PHP script running in a shell and/or in a browser?
>
> Some more background info:
> I'm building a PHP script that has to retrieve pretty large sets of data
> from a remote MySQL database, then process it, and store the results in
> a local database.
>
> The issue:
> The script (surprise, surprise) quickly runs out of memory. Now, I have
> already tried to increase the memory limit in php.ini (followed by an
> Apache restart, of course), but even when setting the limit to something
> high like 384MB or so, the script still bails out with a memory limit
> error when retrieving as little as some 50MB of data...
>
> Now, of course I could rewrite my PHP script so that it retrieves
> smaller batches of data, but being a programmer I'm lazy, and I'd rather
> simply assign more memory to the script (actually, it's not only due to
> laziness, but also due to the fact that the script has to aggregate data
> etc., and I'd rather have it do that in one run for a variety of reasons).
>
> It seems that setting the memory limit in php.ini above a value of
> 64MB (or so) no longer has any effect. My assumption, then,
> is that the memory limit is somehow enforced elsewhere (the shell
> perhaps, and/or Apache?).
>
> Can anyone tell me how to adjust this so that I can successfully
> assign, say, 384MB of memory to PHP scripts run both from browsers (i.e.
> through Apache 2.2 and mod_php) and from the command line?
>
> Tnx in advance, and cheers,
> Olafo
>
> _______________________________________________
> freebsd-questions@freebsd.org mailing list
> http://lists.freebsd.org/mailman/listinfo/freebsd-questions
> To unsubscribe, send any mail to "freebsd-questions-unsubscribe@freebsd.org"

Hi Olaf,

Generally, I think it's bad programming practice to retrieve such big
datasets when it is possible to do otherwise. Consider this example: I
needed an app that would write a file of a set size. The most obvious
approach is to build a single random string X bytes long. That, however,
is not feasible, and very slow. The solution is to pick a chunk size that
is neither too small nor too big, so disk I/O and CPU time are balanced
well.

<?php
$num = '';                  // buffer for the current block
$bytes = 419430400;         // 400 megs
$starttime = time();
$blocksize = 1048576;       // 1 meg blocks
$totalblocks = $bytes / $blocksize;
$filename = "400.megs";

for ($i = 0; $i < $totalblocks; $i++) {
    for ($x = 0; $x < $blocksize; $x++) {
        $num .= rand(0, 1);
    }
    $difftime = time() - $starttime;
    $hours = floor($difftime / 3600);        // elapsed whole hours
    $minsecs = gmdate('i:s', $difftime);     // elapsed min:sec
    $y = $i + 1;
    $pct = round(100 / $totalblocks * $y, 2);
    echo "[$hours:$minsecs] Writing block $y, $pct%\n";
    file_put_contents($filename, $num, FILE_APPEND);
    $num = '';              // free the block before the next pass
}
?>

Now, I suggest that you retrieve only a certain number of SQL rows at a
time, process them, and throw them into your local database. This will
make your application a lot faster, by the way.
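[Editor's sketch, not part of Daniel's original reply.] The batching pattern suggested above can be illustrated with a minimal, self-contained PHP sketch. The row values, batch size, and the doubling "aggregation" are purely illustrative; in the real script each batch would come from the remote server via a query such as `SELECT ... LIMIT offset, batchsize`, and the processed rows would be INSERTed into the local database. Here an in-memory array stands in for the remote result set so the sketch runs on its own:

```php
<?php
// Stand-in for the remote MySQL table (illustrative data).
$rows = range(1, 10);
$batchsize = 3;           // rows fetched per round trip (illustrative)
$processed = array();

for ($offset = 0; $offset < count($rows); $offset += $batchsize) {
    // In the real script: fetch "SELECT ... LIMIT $offset, $batchsize"
    // from the remote server instead of slicing an array.
    $batch = array_slice($rows, $offset, $batchsize);
    foreach ($batch as $row) {
        $processed[] = $row * 2;   // stand-in for the real aggregation
    }
    // In the real script: INSERT the processed batch locally here.
    // Only one batch is ever held in memory, so peak usage stays at
    // roughly $batchsize rows instead of the full result set.
}

echo count($processed), "\n";
?>
```

The point is that peak memory is bounded by the batch size, not the total result size, which is why this sidesteps the memory_limit problem entirely.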
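[Editor's note on the original question, not part of Daniel's reply.] Olaf's suspicion that the limit is "enforced elsewhere" is plausible: on FreeBSD each process inherits resource limits (notably the data segment size) from the shell or the login class in /etc/login.conf, and PHP's memory_limit cannot exceed what the kernel will actually grant. A hedged sketch of inspecting and raising the limit from a Bourne-style shell; the 524288 value (512 MB in kbytes) is only an example, and csh users would use `limit datasize` instead:

```shell
# Show all per-process limits for this shell (sizes in kbytes).
ulimit -a

# Show just the data segment limit; "unlimited" or a number in kbytes.
ulimit -d

# Example (not run here): raise the data size for this session before
# launching the script from the command line, e.g.
#   ulimit -d 524288
#   php myscript.php
# For Apache, the limits of the user/login class it runs under apply.
```

If the hard limit from login.conf is lower than what you request, the `ulimit` call itself will fail, which is a quick way to confirm where the cap is coming from.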