Linux Servers & File Permissions

One of the things I miss from running my web server on Windows XP is that I never needed to configure file permissions or grant Apache access to each file on my server - no matter where a file came from, it would pretty much always work in-browser. That's probably a bit less secure, but I value not having to open a terminal on the server and chmod every file that gets uploaded (via FTP) or edited.

When I first set up my Apache server on Ubuntu, I was confused about what the general file permissions should be. Which users and groups? If I allow full access to "everyone," does that mean literally everyone, or just everyone with access to the local file system? Since then I've sort of figured out a solution: setting the web pages to 775, with the owner being my main account, g73net, the group being www-data (Apache's user), and r-x (5) for everyone else. I still find it annoying to have to change this every time I create a file, though, especially when I connect remotely via FTP.
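
Right now I end up running something like this by hand after uploading or editing (these are just my own paths and accounts, not a recommendation):

    # re-assert ownership and the 775 scheme on the web root
    sudo chown -R g73net:www-data /var/www
    sudo chmod -R 775 /var/www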

Is there an easy way to set the permissions automatically, without resorting to umask or to scripts that end up overwriting the wrong files' permissions? :P

Comments

  • While setting up Arch Linux on my new server, this is still irritating me a bit. What should permissions *universally* be set to for web servers - and how can I make it so that files sent there via FTP, etc., are automatically given that permission, *without* removing the executable bit from executable files in the same directory tree (i.e. scripts and such)?

    If this is too hard to answer, what, for example, are the file permissions on the WinBoards server? Are they updated automatically? Well, I'd guess not, since stuff isn't really uploaded directly...

    but I'm curious as to how you'd recommend setting this up.
  • The user your web server runs under needs read and execute for the web directory and read for the files.

    The default umask of 022 should work just fine. Yes, that does mean that everyone can read the files, but that's what you want for a website.

    If you have exceptions to this - i.e. files you don't want everyone to read (those shouldn't be in a web directory anyway) or script files that need to be executable - set their permissions manually.
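
    Concretely, with umask 022 new files come out as 644 and new directories as 755, which is all a web server needs. If existing permissions are already a mess, a one-time cleanup looks roughly like this (assuming /var/www is the web root):

    # directories need read + execute for everyone (execute is what allows traversal)
    find /var/www -type d -exec chmod 755 {} +
    # regular files only need to be world-readable
    find /var/www -type f -exec chmod 644 {} +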
  • Alright, that seems simple enough; however, where I struggle is with files uploaded via FTP to the www directory - how can I make sure their user, group, and permissions get set correctly?

    Also, I'm not sure I understand how to define permissions for a group: if it's really all based on the access given to files (e.g. /cdrom), how would I allow multiple groups to use a file when there are only two slots - owner and group - in the permissions? (e.g. "root root drwxrwxr-x"; I can replace the second "root" with a group, but what if I wanted a user and two groups to have access?)
  • What you need to read up on is ACLs.

    http://www.computerhope.com/unix/usetfacl.htm
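
    For example, to give an extra group access to a directory on top of the normal owner/group bits (the group name here is made up, and the filesystem needs ACL support enabled):

    # grant the (hypothetical) "designers" group rwx on the directory...
    setfacl -m g:designers:rwx /var/www
    # ...and make that the default ACL for anything created inside it
    setfacl -d -m g:designers:rwx /var/www
    # verify
    getfacl /var/www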
  • What I do is set up a virtual host for each user's home directory and map the FTP roots accordingly as well.
  • Josh: that's a brilliant idea, but I'm not precisely sure how to implement it. I know how to make Apache VirtualHosts - I already have a few for different sites/domains via FreeDNS. I don't necessarily need the FTP home directories to be public websites; I just want to be able to add files to, for example, /var/www/g73mc and have their permissions set to 755 or something.
  • Userdirs.
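
    i.e. Apache's mod_userdir, which maps http://yourserver/~username/ to a public_html folder in each user's home directory. On Debian/Ubuntu it's roughly this (a sketch, not tailored to your setup):

    # enable the module and restart Apache; per-user pages then live in ~/public_html
    sudo a2enmod userdir
    sudo /etc/init.d/apache2 restart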
  • I have a bash script. Check out the following:
    #!/bin/bash
    #
    # Creates a new hosting account: a system user, an Apache vhost,
    # a proFTPd site, and a MySQL database, all named after $username.
    #

    # Prompt for the new account's credentials.
    read -p "Username: " username
    read -p "Password: " password

    # Create the system user, hashing the password with crypt() for useradd.
    echo "Creating user account $username..."
    pass=$(perl -e 'print crypt($ARGV[0], "password")' "$password")
    useradd -m -p "$pass" "$username"
    echo "...done!"
    echo " "

    # Create the Apache virtual host and a placeholder index page.
    echo "Creating $username.taldar.in website..."
    echo "<VirtualHost *:80>
    DocumentRoot /filez/$username
    ServerName $username.taldar.in
    </VirtualHost>" > "/etc/apache2/sites-enabled/$username"
    echo "<center>Welcome to $username's website. This is their default page. Please check back later to see if it has been updated.</center>" >> "/filez/$username/index.php"
    echo "...done!"
    echo " "

    # Create the proFTPd site, chrooted to the user's directory.
    echo "Creating $username.taldar.in FTP site..."
    echo "DefaultRoot /filez/$username $username" > "/etc/proftpd/sites-enabled/$username"
    echo "...done!"
    echo " "

    # Create a MySQL database and user with basic privileges on it.
    echo "Creating $username.taldar.in MySQL database..."
    mysql -uroot -pYourPassword -e "CREATE DATABASE $username;"
    mysql -uroot -pYourPassword -e "CREATE USER '$username'@'localhost' IDENTIFIED BY '$password';"
    mysql -uroot -pYourPassword -e "GRANT CREATE,INSERT,DELETE,UPDATE,SELECT ON $username.* TO '$username'@'localhost';"
    echo "...done!"
    echo " "

    # Restart services so the new vhost and FTP config take effect.
    echo "Restarting services..."
    /etc/init.d/apache2 restart
    /etc/init.d/proftpd restart
    echo "...done!"
    echo " "
    

    A couple givens:
    1. Home directory root is set to /filez/. There is a good tutorial here on how to set this up: http://www.cyberciti.biz/faq/howto-chan ... directory/
    2. proFTPd is installed and configured.
    3. My root domain has a wildcard A record configured in DNS; i.e., *.taldar.in points to 216.240.243.41 in my case.
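
    If it helps, that last one is just a single wildcard record; in raw zone-file form it looks something like this (your DNS host's interface may present it differently):

    ; wildcard record pointing every subdomain of taldar.in at the server
    *.taldar.in.    IN    A    216.240.243.41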



    So I keep "add" in my root directory. If I ever want to add a user I invoke the script by typing "bash add".
  • Not bad, but you could set their home directory in the script itself rather than changing the system default (see the sketch at the end of this comment).

    As for FTP, my preference would be to just use SFTP; that way I wouldn't have to install anything extra, and my users would have a little security when it comes to transferring their files.

    Of course, with your way, the choice is theirs whether they want to use SFTP or FTP.
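
    For the home-directory part, I was picturing something like this in the script - just a sketch, swapping in useradd's -d flag instead of a changed system default:

    # create the account with its home directly under /filez, no system-wide change needed
    useradd -m -d "/filez/$username" -p "$pass" "$username"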
  • Well, the thing about the home directories is that I want all of my users' files backed up and redundant. /filez/ is actually a mount of a RAID 1 NAS that gets backed up nightly to the cloud.

    I actively run SFTP and FTP alongside each other. I left out the crypto configuration steps required for proFTPd to do this, but you can find them on the internet.
  • You could just change the mount point of the NAS to /home.
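
    e.g. if the NAS is exported over NFS, the /etc/fstab line would just target /home instead (the address and export path below are made up):

    # hypothetical NFS export - swap in the NAS's real address, export path, and options
    192.168.1.50:/export/filez    /home    nfs    defaults    0    0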
  • Yeah, there are a few things like that I need to do. This was sort of a self-discovery project in the beginning. I plan on going back and redoing the install with some personal tweaks in mind.
  • 'tis indeed a well thought-out solution. Not sure if I'll use exactly that but I'll assuredly consider it, and likely use parts of your script, when I get my server back up. Thanks for the insight.