Downloading a folder in FTP via Terminal

Say I'm using FTP through terminal and I want to download my "public_html" directory. However, using "get public_html" nets me a "550 public_html: Not a regular file". Then using "mget *" within "public_html" gives me more errors. Is there ANY way to download a local copy of "public_html" through the Terminal?

eMac 1.25 GHz, Mac OS X (10.4.2)

Posted on Feb 11, 2007 9:36 PM

5 replies

Feb 11, 2007 10:13 PM in response to Philippe Van Lieu

You would probably be more impressed with the "curl" command, but you need to know the name of every file. If it is your own server, scp or rsync is what you need; in that case you transfer the files from the server itself rather than through the website.

This command downloads my entire userfolder from one of my servers to the "Backup" folder on my Mac:

scp -r gnarlodious@otaku.freeshell.org:/arpa/gm/g/gnarlodious ~/Backup/
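
rsync (also mentioned above) works over SSH in much the same way and only copies what has changed, which helps if a transfer gets interrupted. A rough equivalent of the scp line above, using the same example server (substitute your own details), would be:

rsync -avz -e ssh gnarlodious@otaku.freeshell.org:/arpa/gm/g/gnarlodious ~/Backup/

Re-running that same command later only transfers files that are new or changed.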

Feb 11, 2007 10:45 PM in response to Gnarlodious

curl won't work because I don't know everything in the folder. 😟

scp doesn't seem to work; I'm inputting the command as given (substituting my own server details, that is), and it's telling me "lost connection". Any info on that?
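
For reference, adding -v to the scp command makes it print every step of the connection, which usually shows where the "lost connection" happens (the username, server, and path here are just placeholders):

scp -v -r yourname@yourserver.example.com:public_html ~/Backup/

If the server only offers FTP and doesn't run SSH at all, scp can't connect no matter what, which could be what's going on here.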

I suppose at this point I should say that I don't HAVE to use FTP, any command through the Terminal (like scp) would be good enough for me.

eMac 1.25 GHz, Mac OS X (10.4.2)

Feb 12, 2007 4:12 AM in response to Philippe Van Lieu

Is there a reason why it has to be via the Terminal? If you mount the FTP server via "Go > Connect to Server…" in the Finder, you can drag a folder and it will execute all the back-end FTP commands to get all the files. Apps like Fetch also permit drag and drop of multiple files. Perhaps something like Automator could be used if you are trying to script some behaviours?

Feb 12, 2007 5:50 AM in response to Philippe Van Lieu

So if you don't know what's in the folder, then you have no access and you are a hacker. What you want to do is known as "slurping", where someone attempts to download an entire website. Some browsers have this feature built in; iCab might be a better choice for you, since its download feature (called "Grabber") can pull down all the pages linked from a site.
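
If staying in the Terminal is preferred, wget can do that kind of recursive download over FTP; it isn't included with Mac OS X, but it's available through Fink or MacPorts. The login and server below are just placeholders:

wget -r ftp://username:password@ftp.example.com/public_html/

The -r flag makes wget walk the directory listings and fetch everything under that folder (by default it only goes five levels deep; -l raises that).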

The DocumentRoot, which is what you want, is off-limits to FTP because the DirectoryIndex page is all you can access. Subfolders are still accessible if indexing is allowed by the server.

If it really is your own server space, you may not even have shell access; in that case you are stuck with FTP, which is very minimal and hard to use. I just use the Transmit application: http://Panic.com/Transmit/

Feb 12, 2007 7:17 PM in response to Gnarlodious

Well, TECHNICALLY I know what's in the folder I want to download; it's my own folder. However, it's too large for me to know exactly what's in it, so if your suggestion requires me to write up a list of its contents, I couldn't do it from memory, nor would I have the time to go through the entire thing. If that's what you meant.

I have Transmit, but it has proven itself to be a bit wonky at times when it comes to downloading folders. I generally use Fetch 3 (Classic), but it too is iffy when downloading folders. Both tend to cancel a download midway through.


