Date: Wed, 17 Dec 2008 12:58:54 -0500
From: Robert Huff <roberthuff@rcn.com>
To: John Almberg <jalmberg@identry.com>
Cc: freebsd-questions@freebsd.org
Subject: Re: How to find files that are eating up disk space
Message-ID: <18761.15838.256303.685029@jerusalem.litteratus.org>
In-Reply-To: <7B241EE7-10A4-4BAA-9ABC-8DA5D4C1048B@identry.com>
References: <283ACBF4-8227-4A24-9E17-80A17CA2A098@identry.com> <7B241EE7-10A4-4BAA-9ABC-8DA5D4C1048B@identry.com>
John Almberg writes:

> > Is there a command line tool that will help me figure out where
> > the problem is?
>
> I should probably have mentioned that what I currently do is run
>
>     du -h -d0 /
>
> and gradually work my way down the tree, until I find the
> directory that is hogging disk space. This works, but is not
> exactly efficient.

	"-d0" limits the search to the indicated directory; i.e.,
what you can see by doing "ls -al /". Not superior to "ls -al /"
and using the Mark I eyeball.
	What (I think) you want is "du -x -h /": infinite depth,
but do not cross filesystem mount points. This is still broken in
that it returns a list whose numbers sit in a fixed-width field
and are visually distinguished only by the trailing unit letter.
Try this:

	du -x /

and run the results through "sort":

	sort -nr

and those results through "head":

	head -n 20

	I have a cron job which does this for /usr and e-mails me
the output every morning. After a few days, weeks at most, I know
what should be on that list ... and what shouldn't and needs
investigating.


				Robert Huff
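
	Chained together, the three steps above read as a single
pipeline (a sketch of the same commands, merely joined with pipes;
substitute whichever filesystem you are hunting on for "/"):

	du -x / | sort -nr | head -n 20

	Without -h, du(1) on FreeBSD reports sizes in 512-byte
blocks by default (or whatever $BLOCKSIZE is set to), so the
numeric reverse sort floats the largest directories to the top.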
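	A minimal sketch of the cron job described above, as a
crontab(5) entry (the 06:00 schedule is an assumption; cron mails
a job's output to the crontab owner by default, which produces the
daily e-mail):

	# assumed schedule: every day at 06:00; cron mails stdout to the owner
	0	6	*	*	*	du -x /usr | sort -nr | head -n 20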