Date:      Sat, 29 Mar 1997 13:28:31 +0800 (WST)
From:      Adrian Chadd <adrian@obiwan.aceonline.com.au>
To:        "K. Marsh" <durang@u.washington.edu>
Cc:        questions@FreeBSD.org
Subject:   Re: How to download an entire website?
Message-ID:  <Pine.BSF.3.95q.970329132641.338B-100000@obiwan.aceonline.com.au>
In-Reply-To: <Pine.A41.3.95b.970328123604.22532A-100000@goodall.u.washington.edu>

Go to "wombat.omen.com.au" .. in /pub/other.linux.stuff is
"geturl-1.3.tar.gz" .. grab that. 

Untar / compile, and read the instructions. It's a URL grabber that can
recurse to n levels, use a proxy server, and do other nice things.
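
Roughly, the build and a sample run would look like this (again a
sketch: the unpacked directory name, the bare "make", and the -r / -l /
http_proxy bits are assumptions based on similar URL grabbers, so trust
geturl's own instructions over this):

    # unpack and build
    gzip -dc geturl-1.3.tar.gz | tar xf -
    cd geturl-1.3
    make

    # mirror www.example.com, following links 5 levels deep,
    # through a proxy at proxy.example.com:3128 (both made up)
    http_proxy=http://proxy.example.com:3128/ \
        geturl -r -l 5 http://www.example.com/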

Cya.

-- 
Adrian Chadd			| UNIX, MS-DOS and Windows ...
<adrian@psinet.net.au>		| (also known as the Good, the bad and the
				|				ugly..)


On Fri, 28 Mar 1997, K. Marsh wrote:

> What's the easiest way to download an entire website?
> 
> Earlier, the man of the hour, Doug White, suggested I use cvsup. I'm sure
> it's a good idea, but cvsup depends on an enormous package called Modula-3,
> and I ran out of swap space trying to compile the beast.
> 
> I thought of trying ncftp, but I don't think web docs can be had by ftp,
> can they?  Doesn't Netscape have a feature to do this?
> 
>      _    _   __  _      _
>     / \  / \ / | / \    / \    University of Washington  ()
>     | | / / / /  |  \   | |      Chemical Engineering    /\
> 
> 
