Date: Wed, 27 Feb 2008 14:21:53 +0000
From: RW <fbsd06@mlists.homeunix.com>
To: freebsd-questions@freebsd.org
Subject: Re: argument list too long
Message-ID: <20080227142153.73b5a680@gumby.homeunix.com.>
In-Reply-To: <20080227111551.GA2403@kobe.laptop>
References: <20080227100132.G1831@wojtek.tensor.gdynia.pl> <47C52A64.5000701@locolomo.org> <20080227111551.GA2403@kobe.laptop>
On Wed, 27 Feb 2008 13:15:51 +0200
Giorgos Keramidas <keramida@ceid.upatras.gr> wrote:

> It is worth noting, however, that there are usually fairly easy ways
> to work with huge lists of command-line arguments.  Instead of writing
> things like this, for example:
>
>     for file in *.ogg ; do
>         blah "${file}"
>     done

I've seen loops like this suggested as an alternative to

    blah *.ogg

and the two cases are clearly different: in the loop you only ever pass
one argument to "blah", so the limitation is in how much space the shell
will allow for the expansion of *.ogg.  I've not hit this limit with
/bin/sh.  Does anyone know what it is?

> one can easily write:
>
>     find . -name '*.ogg' | \
>         while read file ; do \
>             blah "${file}"
>         done

If blah is interactive, it will try to take its input from the pipe
instead of the terminal.  Is there a way around this?  (I know xargs
can handle it with -o.)
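A minimal sketch of one standard answer to the stdin question: read the
file list on a spare descriptor (fd 3) so the command inside the loop
keeps the terminal's fd 0.  "process_one" here is a hypothetical
stand-in for an interactive program, and the temp file stands in for
find(1) output; getconf(1) reports the ARG_MAX value behind the
"argument list too long" error.

```shell
#!/bin/sh
# Hypothetical stand-in for an interactive command such as "blah".
process_one() {
    printf 'processing %s\n' "$1"
}

# Build a sample file list (in practice: find . -name '*.ogg' > "$tmp").
tmp=$(mktemp) || exit 1
printf '%s\n' a.ogg b.ogg > "$tmp"

# Read the list on fd 3; fd 0 (stdin) stays connected to the terminal,
# so an interactive command run in the loop body behaves normally.
exec 3< "$tmp"
while read -r file <&3; do
    process_one "$file"
done
exec 3<&-
rm -f "$tmp"

# The kernel limit on the combined size of argv + environment; exceeding
# it is what makes exec fail with E2BIG ("argument list too long").
getconf ARG_MAX
```

This avoids the pipe entirely, so it also sidesteps the separate
gotcha that a `find | while` pipeline may run the loop in a subshell.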