Date: Wed, 21 Apr 2004 08:42:53 -0500
From: Eric Anderson <anderson@centtech.com>
To: freebsd-current@freebsd.org
Subject: Directories with 2 million files
Message-ID: <40867A5D.9010600@centtech.com>
First, let me say that I am impressed (but not shocked): FreeBSD
quietly handled my building of a directory with 2055476 files in it.
I'm not sure whether there is a limit on this number, but at least we
know it works up to 2 million. I'm running 5.2.1-RELEASE.
However, several tools choke on that many files, mainly ls and du;
find works just fine. Here's what the directory looks like (from
the parent):
drwxr-xr-x 2 anderson anderson 50919936 Apr 21 08:25 data
and when I cd into that directory, and do an ls:
$ ls -al | wc -l
ls: fts_read: Cannot allocate memory
0
Watching memory usage, it climbs to about 515 MB, runs out of memory
(it can't swap it), and then dies. (I only have 768 MB in this machine.)
du does exactly the same thing.
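That ~515 MB ceiling is suspiciously close to a common default per-process
data segment limit of 512 MB, so the process may be hitting its resource
limit rather than exhausting physical memory plus swap. A quick way to
check (the 512 MB figure is my assumption about the default, not something
verified on this box):

```shell
# Show the current per-process data segment limit (in kilobytes for sh).
# If this prints ~524288 (512 MB), ls is likely dying at its datasize
# limit, not at true out-of-memory.
ulimit -d
```

In csh/tcsh the equivalent is `limit datasize`, and the system-wide
defaults live in /etc/login.conf.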
find, however, works fine (and is very fast):
$ time find . | wc -l
2055476
real 0m3.589s
user 0m2.501s
sys 0m1.073s
I'd work on some patches myself, but I'm not worth much when it comes
to C/C++. If someone has patches or code to try, let me know: I'd be
more than willing to test, and could possibly even give out an account
on the machine.
Eric
--
------------------------------------------------------------------
Eric Anderson Sr. Systems Administrator Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------
