23 Replies · Latest reply: May 13, 2008 7:36 AM by EMBARQ Studio54
EMBARQ Studio54 Level 1 (0 points)

I have an odd issue occurring with our OSX Server (XServe RAID)... We have a directory setup as a LAMP Web Server (Apache2/PHP4). Everything works great except for one thing: downloads.

When I attempt to view or download any file (whether it be a graphic in an HTML doc or a file accessed directly by URL), it will only allow me to view/download a small portion of the file (anywhere from 64k to 500k).

The first place I looked was my php.ini file. This was because this problem first manifested from a PHP app I am using. Everything is set correctly in the ini file, though (I can upload files of any size -- depending on the ini settings, etc.).

Shortly after, I realized that this issue occurs even when trying to access a file directly by entering the URL to the file (i.e., it's not a PHP issue). So, I checked my httpd.conf file and found that nothing was odd there either.
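One way to confirm it's not PHP, and to measure exactly how much of a file arrives, is to compare the size the server advertises against the bytes actually received. A sketch (the URL here is a placeholder for any static file on the affected server):

```shell
# Placeholder URL -- point at any static file on the affected server.
URL="http://your-server.example/somefile.pdf"

# What size does the server claim the file is?
curl -sI "$URL" | grep -i '^Content-Length'

# How many bytes of the body actually arrive?
curl -s -o /dev/null -w 'received %{size_download} bytes\n' "$URL"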

Has anyone else experienced this problem? Is there somewhere else I can check for a "download" setting?

Keep in mind that I'm not a LAMP expert (nor an OS X Server expert). I just need to get this working so that people can view more than just half of a webpage!

Thanks for any help,

P.S. I hope I posted this to the right place.

OSX Server, Mac OS X (10.4.8), 2GHz Xserve RackMac PowerPC G5 (RAID)
  • Wedgetail Level 2 (150 points)
    Have you checked for any bottlenecks on gateway hardware? Switches/Firewalls/Routers? Do you have the OS X firewall enabled? If so, does this problem persist if you turn it off?
  • EMBARQ Studio54 Level 1 (0 points)
    Good questions! I hadn't thought of that... I will check these things and report back.

  • EMBARQ Studio54 Level 1 (0 points)
I am not finding any issues with the firewall settings -- the fact that any data gets through at all suggests the firewall isn't the problem.

    Does anyone else have an idea?

  • Celia Wessen Level 4 (1,155 points)
Are you having any speed issues accessing files on the Xserve that are NOT stored on the RAID? How is the RAID connected to your Xserve?

    BTW, how fast are the uploads to the same location? File size?
  • EMBARQ Studio54 Level 1 (0 points)
    The WebRoot directory is on the RAID, so everything is being accessed from there.

Speed doesn't seem to be an issue in most cases (although page retrieval can hang once in a while). But when I enter a URL to a file, it does seem to take a few moments before the file actually starts to download. Then, after starting, it stops (or times out, or whatever).

    Uploads (to the same location) are relatively quick. I have my php.ini file set to accept up to 55MB files and I haven't had an issue yet. But, that's in PHP.

    I'm not sure I know how to try an upload via http... Is there a way?
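To answer that last question: curl can push a file over plain HTTP, either as a multipart form POST (the same shape a PHP upload script expects) or as a PUT. Both lines below are sketches with placeholder URLs and field names; note that a stock Apache has no handler for PUT, so that one will be rejected unless something is configured to accept it:

```shell
# Multipart form POST -- what a PHP upload form produces.
# 'userfile' is a placeholder field name; match your PHP script's.
curl -F 'userfile=@/path/to/bigfile.zip' http://your-server.example/upload.php

# HTTP PUT -- stock Apache has no handler for this, so expect
# an error status unless one is set up.
curl -T /path/to/bigfile.zip http://your-server.example/bigfile.zip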

  • Celia Wessen Level 4 (1,155 points)
By any chance did you change how many child processes the Apache webserver can spawn?
  • EMBARQ Studio54 Level 1 (0 points)
    Hmmm... I don't think so, but I'm not sure either. How is this done? (So that I can try to refresh my memory.)

  • Celia Wessen Level 4 (1,155 points)
If you use Server Admin, go into the Web > Settings > General tab. You'll see "Simultaneous connections" and other settings that determine how often and under what circumstances Apache will spawn child processes. I think the default was 500/300/1/5/1. You may also want to check out the "Persistent connections" setting.
  • EMBARQ Studio54 Level 1 (0 points)
    Due to some early issues with the Server Admin app, we are not using it... Apache is configured and running on its own.

My httpd.conf settings are as follows:


    Persistent Connections is on and set to:

    MaxKeepAliveRequests 100

    (I don't see a setting for Maximum Persistent Connections, though.)

    I will change all of these to the defaults in Server Admin and see what happens.
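For reference, when Server Admin isn't managing Apache, the settings Celia describes live directly in httpd.conf. A typical Apache 2.0 keep-alive and prefork section looks roughly like this (the values shown are illustrative stock defaults, not a recommendation, and there is no directive literally named "Maximum Persistent Connections" -- the Server Admin field maps onto the keep-alive directives):

```apache
# Persistent (keep-alive) connections
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 15

# Process limits for the prefork MPM
<IfModule prefork.c>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          150
    MaxRequestsPerChild 500
</IfModule>
```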

  • EMBARQ Studio54 Level 1 (0 points)
    All right... I have made a number of adjustments to my httpd.conf settings and have gotten some good results. But, I'm not exactly running at 100% yet.

It seems that files are timing out and actually "freezing" or "crashing" Apache. When this happens, I can no longer access the site -- but only temporarily. I have waited several minutes to find that the site comes back to life, but it takes quite a while.

    *Update: I take that back... I just tried to download a fairly small PDF and it crashed Apache for good -- had to restart.

If this "freeze" happens and I restart Apache, the site comes back to life right away. (But a graceful restart doesn't work, only a full restart.)

Additionally, when I attempt to download files via PHP, I am still timing out at around 64k or so... Very odd.

    Any ideas?


  • Celia Wessen Level 4 (1,155 points)
    Could it be that Apache is hogging too much memory? Does it leave anything in the logs when it crashes?

    When it crashes, does the httpd process disappear or is it hanging? Check using apachectl status.

    Keeping persistent connections ON may actually hog more memory if the time-out is set too long.
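A quick way to check both questions at once -- whether the httpd processes are gone or hanging, and how much memory they hold -- without going through apachectl (a sketch; the column positions assume the standard `ps aux` layout):

```shell
# Print PID, %MEM, resident memory (RSS, in KB), and command name
# for every httpd process. The [h] trick keeps the pattern from
# matching itself; if only the header prints, httpd is gone.
ps aux | awk 'NR == 1 || /[h]ttpd/ {print $2, $4, $6, $11}'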
  • EMBARQ Studio54 Level 1 (0 points)
    I wish I could answer all of your questions, but I can't...

I turned Persistent Connections off and some things do seem to be working better. But I'm still getting many hang-ups.

    Nothing is left in the logs when Apache freezes. The Access Log shows what I was doing, but nothing there (or in the Error Log) says anything is wrong.

    As for "apachectl status"... I get an error when I try that command:

    /usr/sbin/apachectl: line 91: lynx: command not found

My apachectl location is /Library/Apache2/bin, so I'm confused why it's outputting that path.

  • Roger Smith3 ( points)
It's putting out that message because the apachectl you're running isn't the one you think it is. I'd estimate that you have 3 Apache setups on that box: the Apache 1.3.x, which is centered around /usr and /etc; the default Apache 2.0.x install in /opt/apache2; and an Apache 2.0.x install that you have based in /Library/Apache2.

When you execute a command, if the command doesn't start with "/" to specify an absolute path, the shell will try appending the command to each of the directories in your PATH variable until it either finds the command or runs out of possible paths (at which point it gives a "Command not found" error). So if you type in:

    apachectl status

it will find the apachectl program in /usr/sbin instead of the /Library/Apache2/bin/apachectl that you want. You'll either need to modify your PATH variable or use absolute paths with the commands to keep these versions straight.
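The lookup described above is easy to see directly with `command -v`, which prints the path a bare command name resolves to (the /Library/Apache2 path below is this thread's custom install location):

```shell
# Which apachectl does a bare command name resolve to?
command -v apachectl

# Run the intended copy explicitly with an absolute path...
/Library/Apache2/bin/apachectl status

# ...or put the custom install first on PATH for this session,
# so the bare name resolves there from now on.
export PATH="/Library/Apache2/bin:$PATH"
command -v apachectl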

AFAIK, ServerAdmin will only work on the default install (the 1.3.x version). For any of the Apache2 setups, you'll be configuring by hand, probably in /Library/Apache2/conf/httpd.conf.

    FWIW, I was running the /opt/apache2 daemons for a while and they started hanging and crashing. I never did figure out a reason, and had to keep the sites running, so I went back to the 1.3.x httpd.

  • Celia Wessen Level 4 (1,155 points)
    Hmm, a non-Apple Apache install.

    Did you take a look at the system.logs too?