12 Replies Latest reply: May 6, 2011 2:42 PM by Alberto Ravasio
dkolbo Level 1 (0 points)

I am setting up FTP on a Mac mini.  I would like to give one client a login name and password that takes them to only one folder, and give another client a different login name and password that takes them to another folder.  I don't want client number 1 to see client number 2's folder.  Any help is appreciated.



Mac mini, Mac OS X (10.6.7), Snow Leopard Server
  • Camelot Level 8 (46,675 points)

    Either don't use the built-in FTP, or be prepared for some serious under-the-hood shenanigans.


    You see, the built-in FTP server expects to use the central user directory for all accounts, and expects to give each user access to the same things they'd see if they were logged in via any other method.

    Even worse, since you have to create users in the system directory, each such user will also get a shell login, an email account, a wiki account, an iChat account, and access to every other service running on the machine.


    Now, some of this can be mitigated by setting up a chroot jail for the FTP user and changing the Server Admin setup to restrict access to other services, but it's a PITA.
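
    For the record, the chroot half of that under-the-hood work is one line per user. A sketch, assuming the stock Snow Leopard ftpd (which chroots any user listed in /etc/ftpchroot to their home directory); the account name "client1" is hypothetical:

    ```
    # /etc/ftpchroot -- one username per line; listed users are confined
    # to their home directory when they log in over FTP
    client1
    ```

    You'd typically also take away the interactive shell (e.g. with dscl . -create /Users/client1 UserShell /usr/bin/false) so the account is FTP-only, and then still deal with the per-service restrictions in Server Admin.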


    The alternative is to buy an out-of-the-box file sharing application such as Rumpus or CrushFTP. Both of these support a built-in user directory distinct from the system directory and make it much easier to lock a user to a specific directory. Well worth the $$s IMHO.

  • MrHoffman Level 6 (14,849 points)

    Alternatively, stop looking back to the pre-Internet era and to tools from the dawn of networking (ftp), and instead look toward current tools and mechanisms such as (for instance) web-based file upload or download processing.

  • dkolbo Level 1 (0 points)

    Thanks Camelot.  Saved me a bunch of research time.

  • Sven-Goran Ljungholm1 Level 2 (245 points)

    You can do this another way.

    When you create a user you can restrict which services they have access to. You don't need to give them iCal, Wiki, Mail, etc. In the advanced FTP settings on the server, set users to see their home directory only.

    You can let them log in with FTP and they will each see their own home folder.


    The question is, how do you get to the files that they are sending to the server? Are they the only ones that need to see that file?

  • dkolbo Level 1 (0 points)

    The files in question are primarily created in-house (Illustrator or CAD) for pickup by the client.  At times the client(s) may send files to us.  Our designers will need to access the home folders to dump the files into.  I can set up permissions on a client's home folder for our designers to access.  Thanks.

  • Camelot Level 8 (46,675 points)

    Except it isn't quite that easy, Sven.


    You see, to do what you describe you have to use a negative permissions model - that is, take away permission from the services you don't want that user to have. That's a poor security model since it only applies to the services in use right now. If a new service is added to the system then that user would have access by default - you have to remember to go in and configure that service to block the specific user(s) in question. It's too easy to forget that.


    As for the 'home directory' question, that's a service-wide setting, not a per-user setting, so you don't have the ability to configure specific users in this way while leaving other users with 'normal' access. This may or may not be a problem depending on the number of users you expect to have FTP access.


    So I stand by my earlier post. Use a different application (or, preferably, a different protocol as MrHoffman suggests). Incidentally, both apps I listed support multiple protocols, including ones that have a semblance of security, so it's easier to transition users once you convince them that FTP is a poor choice.

  • Sven-Goran Ljungholm1 Level 2 (245 points)

    This may or may not be a problem depending on the number of users you expect to have FTP access.


    In this case it seems like a very simple need/load. No question that FTP is not secure.


    All ideas above are valid. Question is how much server tweaking do you want to do to save the $40.

  • carlfrompoway Level 1 (0 points)

    OK, I am new to SLS, but couldn't Open Directory (users/groups/protocols/sharepoints) take care of all of this?  I mean, there is so much that can be done in Server Admin and Workgroup Manager that I am surprised to see this thread, in a way.


  • MrHoffman Level 6 (14,849 points)

    Sharepoints are a convenience and don't necessarily convey any extra security, and they can bag you in various ways. 


    There have been cases of protocol-level security exposures in AFP; one was patched a few months back.


    Further, file-based services can have security issues around what can be uploaded (and unrestricted file-type uploads or gifar-style upload attacks can be particularly nasty), and there are some rather nasty attacks that can be deployed here. 


    As for Open Directory, that can be useful for clients that are joined to the domain, but I'm interpreting "client" in the OP's question as a business client and not a "client system", and passing out usernames and passwords implies they're not members of the domain.


    Unrestricted (either intentionally unrestricted, or due to insecurities or configuration errors) file upload services are regularly sought out by attackers, and the servers can then be filled with warez or other nasty stuff.


    And the most common transport protocol is ftp, and ftp is hideous around firewalls.  It's also insecure; the credentials are sent in cleartext.  You might as well post the login credentials directly on your web site.  (What can also happen here is a problem of accountability and of password reuse; the same username and password gets passed around and embedded everywhere with many of these ftp servers, which often effectively means the password is in the wind.)


    Website-based uploads can mean the remote user doesn't necessarily need a login, particularly if they have no way to access, reference, or guess the (renamed) name of the uploaded file after they upload it.  Website-based downloads mean the remote user only needs to be provided with a temporary and mostly-random URL to fetch the file, and the package can then clean the file off automatically.
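
    A minimal sketch of that temporary, mostly-random URL idea; the paths, the example.com hostname, and the filename are all hypothetical stand-ins:

    ```shell
    # Assumptions: paths and filenames below are placeholders, not a real layout.
    WEBROOT="/tmp/demo-webroot"              # stand-in for the real document root
    FILE="/tmp/demo-outbox/report.pdf"       # the file to hand to the client

    mkdir -p "$WEBROOT" "$(dirname "$FILE")"
    touch "$FILE"

    # Generate an unguessable token and publish the file beneath it
    TOKEN=$(openssl rand -hex 16)
    mkdir -p "$WEBROOT/$TOKEN"
    cp "$FILE" "$WEBROOT/$TOKEN/"

    # Mail the client: http://example.com/$TOKEN/report.pdf
    echo "$TOKEN"

    # A cron job can later clean the file off automatically, e.g.:
    # find "$WEBROOT" -mindepth 1 -mtime +7 -delete
    ```

    The 32-hex-character token is effectively unguessable, and removing the token directory revokes the link without touching any accounts.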


    Sure, sftp and ftp do work here.  (And in addition to not exposing credentials as ftp does, sftp is way easier to deal with around firewalls, and you can use individualized and revocable certificates, if you so choose.)  These schemes just (in my experience) turn into a mess, and a directory filled with dreck if somebody's not actively watching it, unfortunately.  And no, I don't prefer to expose SMB/CIFS, AFP, or NFS to the Internet wild-lands, either.

  • dkolbo Level 1 (0 points)



    What web-based file upload / download processing programs might you recommend?  Thanks.

  • MrHoffman Level 6 (14,849 points)

    I'm running Drupal.


    Drupal is a web content management system, and it deals with a whole lot of what's involved when running a web site.  Including file uploads.  It's gargantuan in its capabilities.


    In general, uploads can be managed with a small wad of scripting within an HTML page, and there are examples posted around.  One tutorial shows how the pieces fit together, though the simple example upload script it presents isn't as secure as it should be; another shows how to avoid a directory traversal attack in the script.
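
    The traversal defence in those tutorials boils down to discarding any path components the client supplies. A sketch (the uploaded name here is a hypothetical hostile example):

    ```shell
    # A client-supplied "filename" may smuggle in path components:
    UPLOADED_NAME="../../etc/passwd"

    # Keep only the last path component, then strip anything but safe characters
    SAFE_NAME=$(basename "$UPLOADED_NAME" | tr -cd 'A-Za-z0-9._-')

    # (A real script would also reject empty or dot-only results.)
    echo "$SAFE_NAME"    # -> passwd
    ```

    The same two steps apply whatever the server-side language; the upload lands as a plain name inside the directory you chose, never at a path the client chose.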


    I'd fully expect there are downloads of these file-upload and file-management tools available in PHP, Python, Perl, or one of the other available web-facing languages, but I don't immediately see one.  There are a few commercial products in this space, too.  Seems like good fodder for some scripting and then for an article; something specifically targeted at Mac OS X and Mac OS X Server.  (I'll look at writing that script and that article, though it likely won't be on your schedule here.)

  • Alberto Ravasio Level 4 (3,902 points)

    This is how I've handled a situation similar to yours.


    I recovered an old unused Power Mac G4 and set it up with Ubuntu Linux Server: a simple Apache web server without bells and whistles, plus vsftpd.


    The main Apache web directory is used to store files available to everybody without restriction.

    Directories and/or files that must be visible to one particular customer are given dot-prefixed names made of random characters; as you know, dot-files are invisible unless you know they exist.
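
    That naming convention is easy to script. A sketch; the web root path and example.com hostname are hypothetical stand-ins:

    ```shell
    WEBROOT="/tmp/demo-apache-root"   # stand-in for the real Apache docroot
    mkdir -p "$WEBROOT"

    # A dot-prefixed, random directory name per customer
    CUSTDIR=".$(openssl rand -hex 6)"
    mkdir "$WEBROOT/$CUSTDIR"

    # The personalized link to mail out:
    echo "http://example.com/$CUSTDIR/"
    ```

    Note the directory is only hidden from casual browsing and default directory listings; anyone with the exact URL can still fetch it, so it's obscurity rather than access control.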

    The customer receives an email with the personalized link and accesses their files with their favorite browser.

    Just in case they need to upload some files, they use an FTP client to reach the vsftpd server.

    For this situation there is a single FTP user that has write-only access to the local vsftpd server.

    Yes, like the Drop Box available inside each Mac OS X user's Public folder.

    No one but my company's employees can see what's inside.
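
    For reference, the write-only arrangement comes down to a few vsftpd.conf lines. A sketch, assuming a single hypothetical local account named "dropbox":

    ```
    # /etc/vsftpd.conf (fragment) -- the "dropbox" account is hypothetical
    local_enable=YES
    write_enable=YES          # allow uploads (STOR)
    download_enable=NO        # deny downloads, so clients can't read what's there
    chroot_local_user=YES     # keep the account jailed in its home directory
    userlist_enable=YES
    userlist_deny=NO          # only users in the userlist file may log in
    userlist_file=/etc/vsftpd.userlist
    ```

    With download_enable=NO the server refuses all retrievals, which is what makes it behave like a drop box: files go in, but FTP clients can't list their way to anyone else's uploads or pull them back out.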