Date: Fri, 28 Nov 1997 16:32:36 -0500
From: ringlord@bbs.dcoisp.net
To: freebsd-questions@freebsd.org
Subject: Best way to move a large website to another server
Message-ID: <TCPSMTP.17.11.28.-16.32.36.3047923923.7021@bbs.dcoisp.net>
Hello all. I have been given the job of moving the contents of a rather large website, about 300 MB of files, to my server and hosting the site there. Here are a few little problems I am having.

First, I figured I would just make a tar.gz file of the entire directory tree and then transfer the archive over the network via ftp. I was told that this website had "unlimited" disk space, so I thought creating a tar.gz file on their machine would work. After getting an error that read "gzip: stdout: quota limit exceeded, broken pipe", I decided to check what quota they actually had on the account. Typing quota only revealed a "command not found" error. Sheesh.

Anyway, I knew then that the only way I could see would be to use mget * to transfer all the files over via ftp. Here I ran into a small problem which is more of an annoyance than anything else. When I use mget *, ftp seems to know that directories also need to be transferred, and it makes the distinction between directories and files. But ftp will report "local: file not found" if it tries to transfer a file from a directory on the remote machine when the same directory does not exist on the local machine. Is there any way I can tell ftp to just go ahead and create each directory on the local machine when it encounters that directory name on the remote machine? I would like to preserve the whole directory tree, which is what tar would have accomplished.

Is there another way that I am missing that would be much simpler than the method described above? Thanks for any tidbits and info someone could pass along.

Jeremy
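
P.S. For reference, the archive step I attempted on their machine looked roughly like this (the directory names below are just placeholders, not the real paths); it was gzip's output file that ran into the quota:

    cd /home/bigsite                        # placeholder for the site's home directory
    tar cf - htdocs | gzip > htdocs.tar.gz  # gzip's stdout is the archive that hit the quota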