Date:      Sat, 16 Aug 1997 14:29:10 +0200 (CEST)
From:      Mikael Karpberg <karpen@ocean.campus.luth.se>
To:        dg@root.com
Cc:        hackers@FreeBSD.ORG
Subject:   Re: More info on slow "rm" times with 2.2.1+.
Message-ID:  <199708161229.OAA01231@ocean.campus.luth.se>
In-Reply-To: <199708161207.FAA26252@implode.root.com> from David Greenman at "Aug 16, 97 05:07:22 am"

According to David Greenman:
[...]
> > It would be trivial for me to verify any slow or fast times - all
> >I've got to do is make a big directory... seems that ~300 files is
> >enough to run into whatever the problem may be...
> 
>    How many files are in the directory isn't important. What is important is
> the size of the directory. You can have a 20MB directory and yet have only
> 100 files in it. There is code to free up unused space in directories, but
> it only works if the free space is at the end. If the directory is large,
> then it will take a large amount of time to search through it.

And this is why it's slow? Isn't there a command (which could be run daily,
or weekly, or something) that goes through a directory (or many) and
optimizes the space they take?

If there isn't... why? And would it be hard to write?
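For illustration, here is a toy model (in Python, hypothetical names, not actual FreeBSD/UFS code) of the behavior David describes: deleting entries in the middle of a directory leaves holes, space is only reclaimed from the end, and a compaction pass like the proposed tool would squeeze the holes out:

```python
# Toy model of UFS-style directory shrinking: free space is only
# reclaimed from the END of the directory, so deleting entries in
# the middle leaves holes and the directory stays large.
# Illustrative sketch only -- not real filesystem code.

class ToyDirectory:
    def __init__(self):
        self.slots = []           # each slot: a filename, or None if free

    def create(self, name):
        # Reuse a free slot if one exists, else grow the directory.
        for i, s in enumerate(self.slots):
            if s is None:
                self.slots[i] = name
                return
        self.slots.append(name)

    def remove(self, name):
        i = self.slots.index(name)
        self.slots[i] = None      # a hole in the middle: no shrink
        # Unused space is freed only when it sits at the end:
        while self.slots and self.slots[-1] is None:
            self.slots.pop()

    def size(self):
        return len(self.slots)    # stands in for the on-disk size

    def compact(self):
        # What an offline "directory optimizer" would do: squeeze
        # out the holes so the size matches the live entries.
        self.slots = [s for s in self.slots if s is not None]

d = ToyDirectory()
for i in range(1000):
    d.create("file%d" % i)
for i in range(999):              # delete every file but the LAST one
    d.remove("file%d" % i)
print(d.size())                   # still 1000: the holes aren't at the end
d.compact()
print(d.size())                   # 1: after compaction
```

The model also shows why such a tool is not entirely trivial to write safely: compaction moves entries around, so a real implementation would have to do it offline or with the directory locked.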

  /Mikael


