nginx permission issue serving static file – RedHat 7 SELinux issue

LessQuesar asked:

This is a new installation on which I followed the Passenger + Nginx guide. This will eventually be a Rails site, but for now I am just trying to serve static files, and I am unable to figure out the correct combination of permissions.

I’m trying to serve a robots.txt file; here is the namei output for it:

namei -om /var/www/c3d/current/public/robots.txt
f: /var/www/c3d/current/public/robots.txt
 dr-xr-xr-x root root /
 drwxr-xr-x root root var
 drwxr-xr-x root root www
 drwxr-xr-x c3d  c3d  c3d
 lrwxrwxrwx c3d  c3d  current -> /var/www/c3d/releases/20160512102658
   dr-xr-xr-x root root /
   drwxr-xr-x root root var
   drwxr-xr-x root root www
   drwxr-xr-x c3d  c3d  c3d
   drwxrwxr-x c3d  c3d  releases
   drwxrwxr-x c3d  c3d  20160512102658
 drwxrwxr-x c3d  c3d  public
 -rwxrwxr-x c3d  c3d  robots.txt

Here is my nginx config for the site:

server {
    listen 80;
    server_name 52.xx.xx.xx;

    # Tell Nginx and Passenger where your app's 'public' directory is
    root /var/www/c3d/current/public;
}

In /etc/nginx/nginx.conf I’m setting the user to c3d:

user  c3d c3d;
worker_processes  1;

error_log  /var/log/nginx/error.log;
...

The error.log outputs this:

2016/05/12 08:06:02 [error] 5192#0: *1 open() "/var/www/c3d/current/public/robots.txt" failed
(13: Permission denied), client: 73.135.yy.yy, server: 52.xx.xx.xx, 
request: "GET /robots.txt HTTP/1.1", host: "52.xx.xx.xx"

Update: this looks like an SELinux security issue.
I see this in /var/log/audit/audit.log:

type=AVC msg=audit(1463057724.846:14926): avc:  denied  { read } for  pid=5192 comm="nginx" name="robots.txt" dev="xvda2" ino=444598886 scontext=system_u:system_r:httpd_t:s0 tcontext=unconfined_u:object_r:var_t:s0 tclass=file
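
For reference, the same denials can also be pulled straight from the audit log with ausearch, assuming auditd is running (it ships with the audit package):

ausearch -m avc -c nginx -ts recent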

What is the correct way to fix this issue?

My answer:


Your file has the wrong security context.

$ ls -Z /var/www/c3d/current/public/robots.txt
unconfined_u:object_r:var_t /var/www/c3d/current/public/robots.txt
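
You can check what context the policy actually expects for that path with matchpathcon (from libselinux-utils); on a stock RHEL 7 targeted policy, content under /var/www maps to httpd_sys_content_t:

matchpathcon /var/www/c3d/current/public/robots.txt
# /var/www/c3d/current/public/robots.txt   system_u:object_r:httpd_sys_content_t:s0

The var_t label on your file is what the httpd_t domain (which nginx runs in) is not allowed to read.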

You should restore the correct security context.

restorecon -v /var/www/c3d/current/public/robots.txt
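
With -v you should see the relabel reported, along these lines (the exact contexts may differ slightly on your system):

restorecon reset /var/www/c3d/current/public/robots.txt context unconfined_u:object_r:var_t:s0->unconfined_u:object_r:httpd_sys_content_t:s0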

You might wish to restore the security contexts for all of your web files.

restorecon -r -v /var/www
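
Note that files moved into place with mv keep the context of where they were created, so a deploy process that moves files out of a home or temporary directory can reintroduce the problem; running restorecon after each deploy (or copying instead of moving) avoids that.

If you ever deploy the site outside the default /var/www location, restorecon alone will not help, because the policy has no rule for that path. In that case you would first add a file-context rule with semanage (from policycoreutils-python) and then relabel. A sketch, using a hypothetical /srv/myapp path:

semanage fcontext -a -t httpd_sys_content_t "/srv/myapp(/.*)?"
restorecon -R -v /srv/myapp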

View the full question and answer on Server Fault.

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.