Date: Mon, 23 May 2005 16:17:39 -0400
From: DerDrache <derdrache@gmail.com>
To: Tony Shadwick <tshadwick@goinet.com>
Cc: Eugene Hercun <eugene.hercun@gmail.com>, freebsd-questions@freebsd.org
Subject: Re: downloading entire directories
Message-ID: <59c289ef05052313175c8d33a5@mail.gmail.com>
In-Reply-To: <20050521224321.A47072@mail.goinet.com>
References: <c04ca341050520141040bd0c27@mail.gmail.com> <20050521133758.T48232@zoraida.natserv.net> <20050521224321.A47072@mail.goinet.com>
On 5/21/05, Tony Shadwick <tshadwick@goinet.com> wrote:
> scp -pr user@remote_host:/path/of/dir/you/want/ /path/you/want/it/stored
>
> Tony
>
> On Sat, 21 May 2005, Francisco Reyes wrote:
>
> > On Fri, 20 May 2005, Tony Shadwick wrote:
> >
> >> There are two ways you could do this. The first is like so:
> >
> > I believe there may be a third way.
> > I have not done it in a while, but some FTP servers allow you to specify a tar
> > file from a directory.
> >
> > To be honest I don't recall the syntax, but it was something like "get
> > dirname.tar", and the FTP server would know to prepare a tar of the entire
> > directory. I don't know which server(s) support(ed) this feature, though.
> >
> > In the long run, something like rsync or unison is a better option, though.
> > I think scp can download multiple files, but I don't know if it
> > recurses.
>
> _______________________________________________
> freebsd-questions@freebsd.org mailing list
> http://lists.freebsd.org/mailman/listinfo/freebsd-questions
> To unsubscribe, send any mail to "freebsd-questions-unsubscribe@freebsd.org"

At the risk of redundantly sounding redundant, I think wget might be up
your alley. It makes this a very simple operation:

wget -r ftp://user:password@host.com/path/to/folder

--
DerDrache
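[For reference, the three approaches discussed in this thread could be sketched roughly as follows. The hostnames and paths are placeholders, not real servers; substitute your own before running any of these.]

```shell
# 1. scp: -r recurses into directories, -p preserves modification
#    times and modes of the copied files.
scp -pr user@remote_host:/path/of/dir /local/dest/

# 2. rsync over ssh: -a is archive mode (recursive, preserves
#    permissions and times), -v is verbose. Re-running the same
#    command transfers only files that have changed since last time.
rsync -av user@remote_host:/path/of/dir /local/dest/

# 3. wget: -r recursively fetches an FTP (or HTTP) directory tree.
#    Note the password appears in your shell history and in the
#    process list while wget runs.
wget -r "ftp://user:password@host.com/path/to/folder"
```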