Quote:
Originally Posted by GMelanie
Less than 65% of the total; above that number the OS starts to struggle with opening new connections, swapping, and opening files, so the effect compounds roughly quadratically (x²) after that, where each subsequent request eats more and more until, at 100%, the server is unresponsive.
Not exactly true ... Apache 2.x will try to use up to 95% of your available resources at any given time ... and if your scripts are running fine, this is generally not a problem.
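If you do want to rein Apache in a bit, the first step is seeing which MPM you're running and what your worker caps are set to - lowering MaxRequestWorkers (MaxClients on 2.2) is the usual knob. The config paths below are just my guess, so adjust for your distro (and apachectl may be apache2ctl on Debian/Ubuntu):

Code:
# Which MPM is Apache actually running?
apachectl -V | grep -i mpm
# Find the current worker caps - paths are assumptions, adjust for your setup
grep -Ri 'MaxRequestWorkers\|MaxClients' /etc/apache2 /etc/httpd 2>/dev/null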
The more important thing to monitor is the amount of SWAP that is available ...
Once you run out of SWAP is when your server will start to bog down: connections will time out, it will start to seem unresponsive, and your load averages will begin to escalate quickly.
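A quick way to keep an eye on that - these are standard Linux tools, so they should be on just about any VPS:

Code:
# Memory and swap at a glance (numbers in MB)
free -m
# Active swap devices and how full they are (older systems: swapon -s)
swapon --show
# Watch swap-in/swap-out activity (the si/so columns) every 5 seconds
vmstat 5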
When you see this beginning to happen - before it gets too out of hand - you can log into shell and run "top" to see what exactly is pulling your resources, which will give you a place to start in troubleshooting your issues.
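If you'd rather grab a snapshot than sit in the interactive view, something like this works (the ps flags here are the GNU/Linux versions):

Code:
# One-shot snapshot of top in batch mode, handy for pasting into a ticket
top -b -n 1 | head -n 20
# Ten biggest memory hogs, header included (use --sort=-%cpu for CPU instead)
ps aux --sort=-%mem | head -n 11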
What type of custom script are you running? Is it a scraping type of script? If so, an upgrade to a dedicated server (or a much beefier VPS at least) is probably in your near future, as scraper type scripts are very resource intensive, and depending on how many queries it performs at a time, it will quickly chew through the resources of a basic VPS.
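If it is a scraper, one cheap stopgap before upgrading is to cap how many requests run at once. Just as a rough sketch - urls.txt here is a made-up stand-in for whatever your script is actually fetching:

Code:
# Hypothetical example: cap the scraper at 4 concurrent fetches
# instead of letting it hammer the box all at once
xargs -n 1 -P 4 curl -s -O < urls.txt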