A friend of mine recently asked "how many domains should you run on a shared server?" My first thought: "how long is a piece of string?" The answer depends on many variables. So many variables that there's no useful answer.
We host web sites that are a single page of static HTML with a few graphics. They're "online business cards" that get at most a few visitors a month. How many of those can we put on a shared server? Hundreds, thousands, maybe even tens of thousands! At the other end of the spectrum we have some low-volume sites with complex database-driven components. For these sites, a search engine crawl can put quite a load on a server (see Optimizing Web Crawlers for Shared Hosts for more on this). How many of those can we handle? On a server with multiple CPU cores and a terabyte of storage, we can fit several hundred sites, but if just a handful of those sites are popular we'll start to see significant load. If just one small CMS-based site gets featured on Oprah, it's going to bury a shared server in no time.
So there's the unsatisfying answer: 2 to 20,000. When looking for a shared host, the real question is how proactively the server is managed and what its target load factor is. The load factor is a measure of how busy the server's CPU cores are, and it's best expressed as load per CPU: if you have an eight-core processor, then a raw load average of 4 means the CPUs are running at half capacity. Load factor tends to be lower at night as demand from the west coast drops off and before European demand comes on stream.
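To make that concrete, here's the arithmetic as a quick sketch (Python, purely for illustration):

```python
def load_per_cpu(load_average: float, cores: int) -> float:
    """Normalize a raw load average by the number of CPU cores."""
    return load_average / cores

# The eight-core example from above: a raw load average of 4
# means the CPUs are running at half capacity.
print(load_per_cpu(4.0, 8))  # 0.5
```

On Linux you can get the raw numbers from `uptime` or `/proc/loadavg` and divide by your core count yourself.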
I've heard that some of the big "cheap" hosts run their servers with load per CPU as high as 20. Even without accounting for disk latency, that means a page that would normally be delivered to a user in 0.4 seconds now takes at least 8, and that's the optimistic number! There are lots of studies that show significant abandonment of slow sites. Any site that takes more than two seconds to start delivering a page will start losing a significant number of visitors; eight seconds is deadly. Too few people ask the question "how much is saving a few dollars a month going to cost my business?" The answer might be surprising.
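The estimate behind those numbers is simple. This sketch assumes, optimistically, that response time scales linearly with load per CPU once it passes 1 — disk and memory contention only make it worse:

```python
def degraded_response(baseline_s: float, load_per_cpu: float) -> float:
    """Best-case response time on an oversubscribed server: each
    request waits roughly load_per_cpu times as long for CPU time.
    Ignores disk latency, swapping, and every other bottleneck."""
    return baseline_s * max(load_per_cpu, 1.0)

# A 0.4 second page on a server running at 20x per-CPU load:
print(degraded_response(0.4, 20))  # 8.0
```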
At Abivia, we have performance monitors that raise warnings, complete with a list of the currently running processes, when our load per CPU hits 1.5. Most of the time this is transient load, but if these warnings become frequent it's something to look into a little more deeply. After all, our goal is for the sites that host with us to be successful and thrive, and a slow site isn't going to help with that!
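Our monitor isn't anything exotic; the core check is simple enough to sketch. This is a hypothetical Python version, not our production code — the 1.5 threshold is the one mentioned above, and the `ps` flags are GNU-style:

```python
import os
import subprocess

WARN_THRESHOLD = 1.5  # load per CPU that triggers a warning

def should_warn(load1: float, cores: int,
                threshold: float = WARN_THRESHOLD) -> bool:
    """True when the one-minute load average, normalized per
    CPU core, reaches the warning threshold."""
    return cores > 0 and load1 / cores >= threshold

def check_load() -> None:
    load1, _, _ = os.getloadavg()
    cores = os.cpu_count() or 1
    if should_warn(load1, cores):
        # Attach the current process list so the warning is actionable.
        ps = subprocess.run(["ps", "aux", "--sort=-%cpu"],
                            capture_output=True, text=True).stdout
        print(f"WARNING: load per CPU is {load1 / cores:.2f}\n{ps}")
```

Run something like this from cron every minute and route the output to your alerting system of choice.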
We have another report that comes up whenever a single account has an unusual number of "long running" processes. Sometimes those processes are benign – like the controller for a backup – sometimes they're an indication of growth in demand for the site, and sometimes they're an indication of a problem. In all these cases our primary goal is to ensure that we're providing quality service to all sites on the server. If the problem is extreme, we might resort to the suspension of a "rogue" account, but our goal is to address issues before they get to that point. After all, an "account suspended" page is even worse than taking more than two seconds to serve a page.
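A sketch of that kind of report, again hypothetical — the five-minute cutoff and per-account limit here are made-up illustration values, not our production thresholds:

```python
from collections import Counter

LONG_RUNNING_S = 300   # hypothetical: flag processes older than 5 minutes
MAX_PER_ACCOUNT = 3    # hypothetical: how many is "unusual"

def flag_accounts(procs, min_age=LONG_RUNNING_S, limit=MAX_PER_ACCOUNT):
    """procs: iterable of (account, elapsed_seconds) pairs, e.g.
    scraped from `ps -eo user,etimes`. Returns the accounts with
    more than `limit` long-running processes."""
    counts = Counter(user for user, age in procs if age >= min_age)
    return sorted(user for user, n in counts.items() if n > limit)

# siteA has five old processes, siteB only short-lived ones:
procs = [("siteA", 600)] * 5 + [("siteB", 30)] * 10
print(flag_accounts(procs))  # ['siteA']
```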
In most cases, there's a simple fix to the problem. It could be updating software or seeking technical assistance from a third-party developer. It could be defending against malicious requests, for example a denial of service attempt. It could be defending against other undesirable activity – we had one site that was being scanned end to end by a competitor every night; any new information showed up on the competitor's site the next day. A few blocking rules put an end to that, and load returned to normal.
So the question isn't "how many sites do you put on a server", but "how proactively do you manage your servers". Based on the disaster stories we've seen over the years, a lot of inferior hosting companies won't even understand the question.
Image credit: dee_ used under a CC license.