Date:      Wed, 3 Nov 2004 21:00:09 GMT
From:      jim feldman <secmgr@jim-liesl.org>
To:        freebsd-gnats-submit@FreeBSD.org
Subject:   i386/73499: gvinum can't init raid5 set
Message-ID:  <200411032100.iA3L09sc026536@www.freebsd.org>
Resent-Message-ID: <200411032110.iA3LAW19052971@freefall.freebsd.org>


>Number:         73499
>Category:       i386
>Synopsis:       gvinum can't init raid5 set
>Confidential:   no
>Severity:       non-critical
>Priority:       low
>Responsible:    freebsd-i386
>State:          open
>Quarter:        
>Keywords:       
>Date-Required:
>Class:          sw-bug
>Submitter-Id:   current-users
>Arrival-Date:   Wed Nov 03 21:10:32 GMT 2004
>Closed-Date:
>Last-Modified:
>Originator:     jim feldman
>Release:        5.3 RC2
>Organization:
>Environment:
FreeBSD greybrd.xxx.xxx.net 5.3-STABLE FreeBSD 5.3-STABLE #0: Tue Nov  2 03:52:27 MST 2004     root@greybrd.xxx.xxx.net:/usr/obj/usr/src/sys/GREYBRD  i386

>Description:
I had a working RAID-5 set under 5.3-RC1 composed of 4 SCSI drives. I updated the source using cvsup with the RELENG_5_3 tag. After make buildworld && make buildkernel && make installkernel, a reboot to single-user mode, make installworld, and mergemaster, I rebooted.

All the subdisks in the RAID-5 plex showed up as stale. If I use
"gvinum rm -r volname", it deletes what it should, but if I then re-create the set, it re-creates the set and the subdisks are still stale.
>How-To-Repeat:
      Create a RAID-5 plex-based volume with gvinum under RC1. Update to RC2 and watch the plex become corrupt and unfixable.
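For reference, a 4-drive RAID-5 volume like the one described would typically be built from a gvinum config along these lines (the drive device names, volume name, and stripe size below are illustrative examples, not the actual configuration from the affected system):

```
# hypothetical gvinum config -- device names and stripe size are examples only
drive d0 device /dev/da0s1d
drive d1 device /dev/da1s1d
drive d2 device /dev/da2s1d
drive d3 device /dev/da3s1d
volume r5vol
  plex org raid5 512k
    sd drive d0
    sd drive d1
    sd drive d2
    sd drive d3
```

The volume is then created with "gvinum create <configfile>", and "gvinum list" shows the state of each subdisk, which is where the subdisks in question report as stale after the upgrade.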
>Fix:
      
>Release-Note:
>Audit-Trail:
>Unformatted:


