From owner-freebsd-questions@FreeBSD.ORG Wed Feb 25 06:39:34 2004
Date: Wed, 25 Feb 2004 09:39:23 -0500
From: Mike Schroll
To: freebsd-questions@FreeBSD.ORG
Message-id: <403CB39B.5000606@mail.rit.edu>
Subject: gzip GB file limit

I recently backed up my data before blowing away a partition with

    tar -cvf - /docs | gzip > /docs.tgz

and it went through fine, no errors. I've now discovered that the gzip that ships with the FreeBSD 5.x base system, version 1.2.4, has a 2GB or 4GB limit on the size of the file it creates...
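Since the problem is only the size of the single output file, one way to sidestep a per-file limit is to pipe gzip's output through split so that no one piece ever approaches it. This is a sketch, not from the thread: it demonstrates the idea on a throwaway directory with a tiny split size; for a real backup you would point it at /docs and split at something like 1024m.

```shell
#!/bin/sh
# Sketch of the split workaround: no single compressed file is created,
# so gzip 1.2.4's output-size limit is never hit. The $work paths and
# the 512-byte split size are illustrative only.
set -e
work=$(mktemp -d)
mkdir "$work/src" "$work/out"
echo "sample data" > "$work/src/file.txt"

# Compress and split the stream into pieces (use e.g. -b 1024m for real data):
tar -cf - -C "$work" src | gzip | split -b 512 - "$work/docs.tgz.part-"

# Restore: concatenate the pieces, decompress, untar:
cat "$work"/docs.tgz.part-* | gzip -dc | tar -xf - -C "$work/out"
cmp "$work/src/file.txt" "$work/out/src/file.txt" && echo "restore OK"
```

Because split names the pieces in lexical order (part-aa, part-ab, ...), a plain shell glob reassembles them in the right sequence.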
The file created by that command was 4.9GB, and now when I extract it I get:

    gzip: docs.tgz: invalid compressed data--format violated

I tried following http://www.gzip.org/recover.txt:

    gzip -d docs.tgz
    bytes_in  -1331625984
    gzip: docs.tgz: invalid compressed data--format violated

I haven't continued, as that looks like an invalid number, and I'm not even sure those instructions are relevant to my case.

I was able to run

    gzip -cd docs.tgz | tar -xvf -

and recover 5.5GB of uncompressed data (the first 4GB of compressed data?); now I need to figure out a way to get the rest of the file. I've already installed gzip 1.3.5 from the ports collection and have been working with that.

My problem seems unlike many of the others I've seen: my file probably isn't corrupt, it's just too big, and I hope my original data is in there somewhere. If anyone recommends hex editing, could you also recommend a good hex editor, especially one that can handle a multi-gigabyte file without loading it all into RAM?

Ironically, to protect against a problem like this, I had also made a backup of the raw files to DVD+R, but somehow that DVD is unreadable too. I burned it with burnatonce under Windows (as I have many times before), though I turned on some odd options: Relax ISO9660, No Folder Limit, ISO Level 2. Now I get 'incorrect function' when trying to access the disc in Windows, and a Linux box refuses to mount it.

Thanks in advance,

Mike Schroll
Applied Networking System Administration Undergrad
Rochester Institute of Technology
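A note on the bytes_in value quoted above: the gzip trailer records the uncompressed length in a 32-bit field (ISIZE), i.e. modulo 2^32, so byte counters for an archive that expands past 4GB wrap around and can print as negative without the compressed data itself being corrupt. A small sketch of inspecting that field (the temp file is illustrative; substitute docs.tgz to inspect the real archive; od reads the bytes in host order, so this is correct on a little-endian machine):

```shell
#!/bin/sh
# Read gzip's ISIZE trailer: the last 4 bytes of a gzip file hold the
# uncompressed size modulo 2^32, little-endian. Demo uses a throwaway
# file; point tail at docs.tgz for the real archive.
set -e
tmpf=$(mktemp)
printf 'hello world\n' | gzip > "$tmpf"     # 12 bytes uncompressed
isize=$(tail -c 4 "$tmpf" | od -An -tu4 -v | tr -d ' ')
echo "ISIZE = $isize"                       # prints 12 for this demo
```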