Date: Tue, 29 Sep 2009 10:43:41 +0200
From: Borja Marcos <borjam@sarenet.es>
To: Borja Marcos <borjam@sarenet.es>
Cc: freebsd-stable@freebsd.org
Subject: Re: 8.0RC1, ZFS: deadlock
Message-ID: <6C7DE346-65C5-4130-86B8-56A60A1DAC28@sarenet.es>
In-Reply-To: <089F63A7-574B-4646-97C7-D82B226CD4CF@sarenet.es>
References: <089F63A7-574B-4646-97C7-D82B226CD4CF@sarenet.es>
On Sep 29, 2009, at 10:29 AM, Borja Marcos wrote:

> Hello,
>
> I have observed a deadlock condition when using ZFS. We make heavy
> use of zfs send/zfs receive to keep a replica of a dataset on a
> remote machine, at intervals as short as one minute. Maybe we're
> making a somewhat atypical use of ZFS, but it seems to be a great
> solution for keeping filesystem replicas once this is sorted out.
>
> How to reproduce:
>
> Set up two systems. A dataset with heavy I/O activity is replicated
> from the first system to the second one. I used a dataset containing
> /usr/obj while running a make buildworld.
>
> Replicate the dataset from the first machine to the second one using
> an incremental send:
>
> zfs send -i pool/dataset@Nminus1 pool/dataset@N | ssh destination zfs receive -d pool
>
> When there is read activity on the second system, i.e. something
> reading the replicated dataset while zfs receive is updating it,
> there can be a deadlock. We discovered this while testing a server
> we hope to put into production soon, with 8 GB of RAM. A Bacula
> backup agent was running and ZFS deadlocked.

Sorry, I forgot to explain what was happening on the second system (the
one receiving the incremental snapshots) for the deadlock to occur. It
was simply running an endless loop, copying the contents of /usr/obj to
another dataset, in order to keep the read activity going. That is how
it deadlocked. On the original test system an rsync did the same trick.

Borja
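
For reference, a minimal sketch of the kind of replication loop described
above. The pool, dataset, and host names are placeholders, and the
one-minute interval is only an assumption taken from the description:

    #!/bin/sh
    # Sender side: snapshot the dataset periodically and send the
    # increment to the receiving host. Names are placeholders.
    DS="pool/dataset"
    POOL="pool"
    DEST="destination"

    PREV=$(date +%s)
    zfs snapshot "${DS}@${PREV}"
    # Full send first, to seed the replica on the destination pool.
    zfs send "${DS}@${PREV}" | ssh "${DEST}" zfs receive -d "${POOL}"

    while true; do
        sleep 60
        CUR=$(date +%s)
        zfs snapshot "${DS}@${CUR}"
        # Incremental send, as in the command quoted above.
        zfs send -i "${DS}@${PREV}" "${DS}@${CUR}" | ssh "${DEST}" zfs receive -d "${POOL}"
        zfs destroy "${DS}@${PREV}"
        PREV="${CUR}"
    done

On the receiving machine, the read load was an endless copy loop along
these lines (paths are again placeholders, not the real layout):

    # Receiver side: keep reading the replicated dataset while
    # zfs receive updates it, by copying /usr/obj over and over.
    while true; do
        cp -R /pool/dataset/usr/obj /pool/scratch/objcopy
        rm -rf /pool/scratch/objcopy
    done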