Date: Sat, 01 Jan 2011 20:23:08 +0100
From: Attila Nagy <bra@fsn.hu>
To: Artem Belevich <fbsdlist@src.cx>
Cc: freebsd-fs@freebsd.org, freebsd-stable@freebsd.org, Martin Matuska <mm@freebsd.org>
Subject: Re: New ZFSv28 patchset for 8-STABLE
Message-ID: <4D1F7F1C.9090106@fsn.hu>
In-Reply-To: <AANLkTimGdnESX-wwD52Fh4wCfS4xZ-839g6Ste5Bwihu@mail.gmail.com>
References: <4D0A09AF.3040005@FreeBSD.org> <4D1F7008.3050506@fsn.hu> <AANLkTimGdnESX-wwD52Fh4wCfS4xZ-839g6Ste5Bwihu@mail.gmail.com>
On 01/01/2011 08:09 PM, Artem Belevich wrote:
> On Sat, Jan 1, 2011 at 10:18 AM, Attila Nagy <bra@fsn.hu> wrote:
>> What I see:
>> - increased CPU load
>> - decreased L2 ARC hit rate, decreased SSD (ad[46]) traffic, and
>>   therefore increased hard disk load (IOPS graph)
> ...
>> Any ideas on what could cause these? I haven't upgraded the pool
>> version and nothing was changed in the pool or in the file system.
>
> The fact that the L2 ARC is full does not mean that it contains the
> right data. The initial L2ARC warm-up happens at a much higher rate
> than the rate at which the L2ARC is updated once it has filled. Even
> the accelerated warm-up took almost a day in your case. For the L2ARC
> to warm up properly you may have to wait quite a bit longer. My guess
> is that it should slowly improve over the next few days, as data goes
> through the L2ARC and the bits that are hit most often take up
> residence there. The larger your data set, the longer it will take
> for the L2ARC to cache the right data.
>
> Do you have similar graphs from the pre-patch system just after a
> reboot? I suspect it may show similarly abysmal L2ARC hit rates
> initially, too.

Sadly no, but I remember seeing the hit rate increase as the cache
grew; that's why I waited one and a half days before writing this
e-mail. Currently the hit rate is at the same level it was right after
the reboot. We'll see in a few days.
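In the meantime, here is a minimal sketch of how one might watch the
hit rate develop rather than eyeball the graphs, assuming the
kstat.zfs.misc.arcstats.l2_hits and .l2_misses counters that ZFS
exposes through sysctl on FreeBSD (the script is illustrative only,
not part of the patchset):

    #!/usr/bin/env python3
    # Illustrative only: sample the L2ARC hit rate over a short window
    # by reading the kstat counters ZFS exposes via sysctl on FreeBSD.
    import subprocess
    import time

    def kstat(name):
        # 'sysctl -n' prints just the value of the requested OID
        out = subprocess.run(["sysctl", "-n", name],
                             capture_output=True, text=True, check=True)
        return int(out.stdout.strip())

    def l2arc_hit_rate(interval=10):
        # Diff two samples so we measure the current hit rate,
        # not the average since boot.
        h0 = kstat("kstat.zfs.misc.arcstats.l2_hits")
        m0 = kstat("kstat.zfs.misc.arcstats.l2_misses")
        time.sleep(interval)
        hits = kstat("kstat.zfs.misc.arcstats.l2_hits") - h0
        misses = kstat("kstat.zfs.misc.arcstats.l2_misses") - m0
        total = hits + misses
        return 100.0 * hits / total if total else 0.0

    if __name__ == "__main__":
        print("L2ARC hit rate over the last 10s: %.1f%%"
              % l2arc_hit_rate())

(The warm-up rate Artem mentions is governed by the
vfs.zfs.l2arc_write_max and vfs.zfs.l2arc_write_boost tunables; the
boost is only applied during the initial warm-up phase.)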