Date:      Thu, 27 Sep 2018 15:39:20 -0700
From:      Johannes Lundberg <johalun0@gmail.com>
To:        Rebecca Cran <rebecca@bluestop.org>
Cc:        FreeBSD Current <freebsd-current@freebsd.org>
Subject:   Re: 12.0-ALPHA5 - ZFS default ARC max apparently forcing system to run out of memory
Message-ID:  <CAECmPwsRxRBxs=S_hBfshRfiVPZYtqzSkMPG7gLgPTER3nkTww@mail.gmail.com>
In-Reply-To: <f3267619-8706-47d2-7b07-590937bf4df7@bluestop.org>
References:  <f3267619-8706-47d2-7b07-590937bf4df7@bluestop.org>

On Thu, Sep 27, 2018 at 15:03 Rebecca Cran <rebecca@bluestop.org> wrote:

> I'm running 12.0-ALPHA5 on a laptop which has 32GB RAM and 2GB swap.
> I've found it running out of memory when building ports via synth: I
> think I've also seen it when running a buildworld. Johannes on
> FreeBSDDesktop suggested it might be related to ZFS, and setting
> vfs.zfs.arc_max to 8GB *does* appear to have resolved the problem.
>
>
> Shortly after running out of memory (with "swap_pager_getswapspace(32):
> failed" messages), the first few lines of 'top' were:
>
>
> Mem: 4335M Active, 4854M Inact, 7751M Laundry, 9410M Wired, 48K Buf,
> 5332M Free
>
> ARC: 5235M Total, 4169M MFU, 497M MRU, 172K Anon, 97M Header, 471M Other
>
>      3479M Compressed, 5930M Uncompressed, 1.70:1 Ratio
>
> Swap: 2048M Total, 2009M Used, 39M Free, 98% Inuse
>
>
>
> I've not seen this happen before on ZFS systems, so is it a regression
> in 12?


Hi

It's been a few months since I did the same thing, so it's not a recent
issue.
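For reference, a sketch of how to cap the ARC (using the 8GB figure you
mentioned; the value is in bytes and is just an example, not a
recommendation):

```shell
# Persistent cap: add to /boot/loader.conf and reboot.
# 8 GB = 8 * 1024 * 1024 * 1024 = 8589934592 bytes.
vfs.zfs.arc_max="8589934592"

# On reasonably recent FreeBSD the cap can also be changed at runtime:
#   sysctl vfs.zfs.arc_max=8589934592
# Check the current value with:
#   sysctl vfs.zfs.arc_max
```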


>
>
> --
> Rebecca Cran
>
> _______________________________________________
> freebsd-current@freebsd.org mailing list
> https://lists.freebsd.org/mailman/listinfo/freebsd-current
> To unsubscribe, send any mail to "freebsd-current-unsubscribe@freebsd.org"
>


