From owner-freebsd-questions Fri Mar 28 12:39:42 1997
Return-Path:
Received: (from root@localhost) by freefall.freebsd.org (8.8.5/8.8.5) id MAA02111 for questions-outgoing; Fri, 28 Mar 1997 12:39:42 -0800 (PST)
Received: from goodall.u.washington.edu (durang@goodall.u.washington.edu [140.142.12.163]) by freefall.freebsd.org (8.8.5/8.8.5) with ESMTP id MAA02097 for ; Fri, 28 Mar 1997 12:39:37 -0800 (PST)
Received: from localhost (durang@localhost) by goodall.u.washington.edu (8.8.4+UW96.12/8.8.4+UW97.03) with SMTP id MAA08806 for ; Fri, 28 Mar 1997 12:39:36 -0800
Date: Fri, 28 Mar 1997 12:39:36 -0800 (PST)
From: "K. Marsh"
To: questions@freebsd.org
Subject: How to download an entire website?
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
Sender: owner-questions@freebsd.org
X-Loop: FreeBSD.org
Precedence: bulk

What's the easiest way to download an entire website?

Earlier, the man of the hour, Doug White, suggested I use cvsup. I'm sure it's a good idea, but cvsup depends on an enormous package called Modula-3, and I ran out of swap space trying to compile the beast. I thought of trying ncftp, but I don't think web docs can be had by FTP, can they? Doesn't Netscape have a feature to do this?

University of Washington
Chemical Engineering
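For reference, a site can be mirrored recursively over plain HTTP without building cvsup or Modula-3 at all. The sketch below assumes GNU wget is installed and uses a placeholder URL; neither the tool nor the flags come from the message itself:

```shell
# Sketch: mirror a website recursively with GNU wget (assumed installed).
#   --recursive      follow links within the site
#   --level=5        limit recursion depth to 5 links deep
#   --convert-links  rewrite links so the copy browses locally
#   --no-parent      never ascend above the starting directory
wget --recursive --level=5 --convert-links --no-parent http://www.example.org/
```

The downloaded tree lands under a directory named after the host (here `www.example.org/`), which can then be browsed offline.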