Date: Wed, 3 Jul 2019 16:37:46 +0200
From: "Nagy, Attila" <bra@fsn.hu>
To: Mike Tancsa <mike@sentex.net>, freebsd-fs@FreeBSD.org
Subject: Re: ZFS exhausts kernel memory just by importing zpools
Message-ID: <78882cea-c1aa-0d08-d2e8-7f7ae7131bb6@fsn.hu>
In-Reply-To: <820ceee3-95aa-9925-066d-5d22884ce001@sentex.net>
References: <e542dfd4-9534-1ec7-a269-89c3c20cca1d@fsn.hu> <820ceee3-95aa-9925-066d-5d22884ce001@sentex.net>
On 2019. 07. 02. 18:13, Mike Tancsa wrote:
> On 7/2/2019 10:58 AM, Nagy, Attila wrote:
>> Hi,
>>
>> Running latest stable/12 on amd64 with 64 GiB memory on a machine with
>> 44 4T disks. Each disk has its own zpool on it (because I handle the
>> redundancy between machines and not locally with ZFS).
>>
>> One example zpool holds 2.2 TiB of data (according to df) and has
>> around 75 million files in hashed directories; this is the typical
>> usage on them.
>>
>> When I import these zpools, top says around 50 GiB wired memory (ARC
>> is minimal, files haven't been touched yet), and after I start to use
>> the pools (heavy reads/writes), the free memory quickly disappears
>> (ARC grows) until all memory is gone and the machine starts to kill
>> processes, ending up in a deadlock where nothing helps.
>>
>> If I import the pools one by one, each of them adds around 1-1.5 GiB
>> of wired memory.
>
> Hi,
>
> You mean you have 44 different zpools? 75 million files per pool sounds
> like a lot. I wonder, for testing purposes, if you made 1 or 2 zpools
> with 44 (or 22) different datasets and had 3.3 billion files, would you
> run into the same memory exhaustion?
>
Yes, 44 different pools. I think this is related to how ZFS stores pool
metadata in memory. I don't think this scales with the number of files,
but maybe with the number of stored blocks. Sadly, I can't put the same
amount of data on a machine with a different setup ATM.
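As an aside, while digging into this the ARC part can at least be bounded
with the stock loader tunables on stable/12; this is only a sketch, the
values below are placeholders and nothing here addresses the wired memory
that shows up at import time before the ARC has grown.

In /boot/loader.conf:

    vfs.zfs.arc_max="16G"

and the current ARC footprint can be watched with:

    sysctl kstat.zfs.misc.arcstats.size vfs.zfs.arc_max

Capping the ARC only limits cached data/metadata, so if the per-pool
1-1.5 GiB of wired memory comes from other in-core pool metadata, it
would likely remain even with a small arc_max.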