Wow, it's taking an awful lot of work trying to keep this site running right now. The server we're currently running on is a virtual CentOS machine with 256MB RAM, and the constant traffic from both users and robots (crawlers) causes Apache to lock up from time to time. Right now I'm tweaking all the LAMP (Linux, Apache, MySQL, and PHP) knobs I know how to tweak to keep the site more or less running most of the day.
Here's a quick review of the hacks, er, performance optimizations, I've been working on.
Linux commands
Really, I'm doing very little with Linux, other than using tools like top, uptime, ps, and vmstat to see what's happening on the server, i.e., what changes with each tweak I make elsewhere. Other than that, most of the tweaks are Apache tweaks, involving variables like MaxClients, MaxRequestsPerChild, and KeepAliveTimeout.
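For what it's worth, here's the sort of thing I keep running in another window while I make changes. (The ps sort flag shown here is the GNU/Linux one; the exact options vary by system.)

# report memory and swap activity every five seconds
vmstat 5

# show the processes using the most resident memory
ps aux --sort=-rss | head -15

# check the load averages
uptime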
Tweaking MySQL
I've tried to convince MySQL to use less RAM by turning off everything I don't need. For instance, I have a number of settings like this in my /etc/my.cnf file:

skip-bdb
skip-ndbcluster
skip-networking
I've also been looking for slow queries, but those haven't really been much of a problem. I reduced the maximum number of connections as well, though I don't think that had much effect either.
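For reference, those settings look something like this in /etc/my.cnf on the MySQL install I'm running. (On newer MySQL versions the slow-query option is named slow_query_log instead, and the values here are just my guesses for a small box, not recommendations.)

# log any query that takes longer than two seconds
log-slow-queries = /var/log/mysql-slow.log
long_query_time = 2

# the stock default (100) is more than this little server could handle anyway
max_connections = 20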
Tweaking Drupal and PHP
I'm also trying to tweak the knobs on PHP and Drupal, and this has been the biggest learning experience. In Drupal I've turned off and removed every non-essential module (including my favorite admin_menu module), and I've been running the following PHP functions inside my theme from time to time:
<?php
  echo memory_get_usage() . "\n";
  echo memory_get_peak_usage() . "\n";
?>
This has helped show me that from within a page served up by Drupal, the maximum memory use reported is less than 12MB. By default PHP was installed to use a max of 128MB RAM, so I've cranked that way down, and I haven't had any "out of memory" errors recently.
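The knob for that is the memory_limit setting in php.ini. Given the 12MB peak I'm seeing, something like this leaves a little headroom (the 16M here is just my own cushion, not an official number):

; was 128M by default; now each PHP request can use at most 16MB
memory_limit = 16M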
Tweaking Apache
After all of this work, Apache seems to be the largest culprit. Did you know that Apache processes can swell to 20-30MB each when serving up dynamic content? Did you also know that the same 30MB process that served up a dynamic page might later be used to serve up static content -- like little, bitty images? I really had no idea the processes themselves could get that large; I've always been a Java guy, and just used Apache as a proxy to a Java web server.
After many, many changes to both Drupal and Apache, my Apache httpd processes now show up as 16m when looking at them with the top command. I've looked and looked, and I don't see what else to remove, so that's probably as low as I can go there. When you think about it, if you have 256MB RAM available, and each Apache process takes as much as 16MB RAM, you can't have many Apache processes running at one time, can you?
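To put rough numbers on it: the OS, MySQL, and everything else need their share of that 256MB, so Apache might realistically get 160-190MB of it, and at 16MB per process that works out to only ten or twelve processes. (Those are my back-of-the-envelope estimates, not measurements, but they explain the low MaxClients value shown below.)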
(As mentioned earlier, most of my Apache tweaks have been around the MaxClients, MaxRequestsPerChild, and KeepAliveTimeout variables. There have been others, including turning off some internationalization features, but most of what I'm doing now centers around these three variables, and other closely related variables.)
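For the curious, here's the general shape of what that looks like in httpd.conf with the prefork MPM that CentOS uses. The values are examples sized for a 256MB box, not settings I'm recommending:

# keep the process count low enough that Apache fits in RAM
<IfModule prefork.c>
  StartServers          2
  MinSpareServers       2
  MaxSpareServers       4
  MaxClients           10
  MaxRequestsPerChild 500
</IfModule>

# don't let an idle keepalive connection tie up a 16MB process for long
KeepAlive On
KeepAliveTimeout 2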
Really, the only other Apache optimization that I've read about has to do with .htaccess files, specifically disabling their use. Did you know that if you have a directory structure that goes five levels deep, and you have AllowOverride set to All, and someone requests a document in that lowest directory, Apache has to look in that directory, and all four directories above it, to see if there is a .htaccess file in any of them? I knew this, but I never really thought about all the work that was required to serve many of the documents on this site, so I'll be setting AllowOverride None as soon as I can clean up some related things around here.
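The change itself is tiny; the real work is first moving any rules that currently live in .htaccess files (Drupal ships with one) up into the main config. The directory path here is just a typical DocumentRoot, not necessarily yours:

<Directory "/var/www/html">
  # stop Apache from checking every directory for .htaccess files
  AllowOverride None
</Directory>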
Summary
All in all, it's been a heck of a learning experience, and I suppose it's good to be learning these things now. My hope is to grow this site at least 10x, so the more I learn today, the smarter I can be about my needs in the future.