- I cut the number of tested hosts, but added different types of installations (see the next point).
- I now monitor two types of website installations – “Lite” and “Heavy”. “Lite” is a default WordPress installation with no plugins except one lightweight security plugin, and no caching plugins. The “Heavy” installation is full of content and has lots of plugins, but includes a free caching plugin. It is a real-world website with a bloated front page.
- I moved to the pingdom.com monitoring tool, which does not provide Satisfactory Apdex metrics.
- Speed is now tested every 30 minutes.
You’ve probably seen a lot of hosting recommendations on different websites and blogs promising that this or that host is very good. But what’s usually missing is proof of the hosting performance.
On this page I fill that gap for you by showing the performance contest results of the web hosts I have picked.
See the table of contents below for easy navigation to the contest results and to the Questions and Answers.
Why is continuous performance monitoring a crucial part of choosing a host?
Although many hosting companies advertise a 99.9% or even higher Service Level Agreement (SLA), take it with a grain of salt. You may think it means your website will be up and accessible that much of the time, but this is not exactly so. Some factors may not be covered by the SLA, and there may be unexpected hardware issues, short periods of overload, network issues, hosting support actions, human errors, and so on.
Thus, it’s hard to judge a host’s true reliability and how much uptime you will get in real life until you monitor it.
Some hosting reviews include only one-time performance snapshots. That is obviously not enough to judge a web host’s performance, because performance varies across years and months, and can differ a lot even within a single day. I monitor the hosts constantly and continuously and show you the results on this page.
Someone’s impression that a host is working fine may also be misleading. Without continuous monitoring of hosting performance it’s not easy to distinguish a very good host from a good one, or a good host from a mediocre one.
That’s why I’ve decided to gather real data, both in real time and historically, to compare the performance of websites on different hosts.
What do I mean by hosting performance in my hosting performance contests?
I use server uptime, full page load time, and Apdex (Application Performance Index) as the contest factors.
Below you can find more details about how they are monitored and calculated.
How do I choose which hosts to monitor?
After analyzing real user reviews and the opinions of the professional hosting community, I pick out the best hosts I could find in their market segment.
In general, the hosts I’ve picked out are already very good, and you can go with any of them.
However, nothing speaks better than plain, vivid facts. That’s why I anonymously buy hosting from these hosts and monitor their performance. And I share this information with you.
Why do I monitor only a few hosts?
A monitoring service of this kind (high-frequency full page load time monitoring), as well as buying hosting and domains, requires a budget. That’s why I don’t yet monitor all the hosts I would like to. But I will keep adding more hosts to the monitoring.
Do I inform the hosts that I’m monitoring them?
No, I don’t tell them about it (I do it totally anonymously), because I want it to be a clean experiment.
Hosting is bought the usual way, and not by me personally.
Also, the domain names I use for the test websites are not registered to me.
Besides, the domain names are generic, i.e. nothing like stablehost-test.com.
So it’s not easy to guess which website is the test website.
In short, I keep it all anonymous.
Also, my affiliate accounts with the hosting companies are not connected with the test hosting accounts in any way.
And here’s a standard notice required by hosting companies:
Disclosure: There are some affiliate links on this page. In other words, I get paid if you click on the links and make a purchase. All such links open in a new window/tab; no software/program will be installed on your computer.
Is it really enough to judge a host by this kind of monitoring?
Yes and no.
Yes, because this monitoring shows how a shared hosting account works literally every minute.
And no, because this is just a sample test of sorts. In other words, a hosting company may have many servers and they may perform differently, whereas my monitoring watches just one shared hosting account.
However, this kind of monitoring is 100% objective and gives much more precise data than any personal opinion on how this or that host performs. That’s why this monitoring is very important and can be used when deciding which host to choose.
What kind of websites do I monitor?
On each monitored host I have a similar test website based on WordPress.
Each test website is hosted on the most affordable plan of the particular host.
All test websites are made practically identical so that these comparison monitoring tests run under conditions as equal as possible.
Each website uses the same WordPress theme and contains a number of blog posts, each several hundred words long and with images. The front page displays excerpts from the first 10 posts with featured images.
No caching or other load speed optimization plugins are installed on the test websites, to keep the testing of hosting performance reliable.
What exactly is monitored?
I’m using monitis.com services for monitoring my test websites.
There are two monitors for each website:
- HTTP server response time (Time To First Byte)
- Tests are performed every minute.
- Test results are measured in milliseconds.
- The main purpose of these tests is uptime monitoring.
- Full page load time (how long it takes to fully load a page for a real user)
- Tests are performed every 15 minutes. (Update: since July 2017 the tests are performed every 20 minutes.)
- Test results are measured in seconds.
- The main purpose of these tests is to estimate the real user experience of website loading speed on a particular host.
The tests for each host are performed from different US East and US West locations.
The load that the monitoring system itself puts on each website is equivalent to 144 unique visitors per day (about 4,320 unique visitors per month), distributed evenly.
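For clarity, here is a back-of-the-envelope sketch (my own illustration, not part of the monitoring service) of where the 144 visits per day come from, assuming the 20-minute full page load interval used since July 2017 and the two test locations:

```python
# Rough estimate of the load the full page load monitor itself adds to a test site.
# Assumptions: 20-minute check interval (post-July-2017 value) and two test locations.
# The 1-minute HTTP (TTFB) checks are not counted, since they do not load the full page.

CHECK_INTERVAL_MINUTES = 20  # full page load test interval (assumed value)
LOCATIONS = 2                # US East and US West

visits_per_day = (24 * 60 // CHECK_INTERVAL_MINUTES) * LOCATIONS
visits_per_month = visits_per_day * 30

print(visits_per_day)    # 144 page loads per day
print(visits_per_month)  # 4320 page loads per month
```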
How to read real-time monitoring charts?
When looking at the real-time charts on the page with my recommended hosts, keep in mind that a host is considered down at a particular moment only if both locations show downtime. A failure at a single location may be caused by a network issue between the testing location and the hosting server, i.e. an issue not related to the host itself: while one location shows downtime, the website may load fine from anywhere else.
That’s why I monitor the hosting performance from two locations rather than one. This decreases the chance that the tests give me false positives (a false positive is when a connection error is detected even though the hosting server and my test website are fine).
How are the aggregated (monthly etc) statistics calculated?
HTTP response time is calculated as the sum of the best (fastest) response times across all locations, divided by the number of OK checks during the reporting period. Thus the value can be lower than the average response time of any single location.
Uptime calculation is based on HTTP response time. The downtime share is the number of simultaneous NOKs (downtimes) across all locations divided by the total number of checks (NOKs + OKs); uptime is 100% minus that share.
A check is counted as NOK (downtime) when the HTTP response time is more than 10 seconds at all locations simultaneously.
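To make the NOK and uptime rules concrete, here is a minimal sketch (my own illustration with made-up numbers, not the monitoring service’s code). A check counts as NOK only when every location exceeds the 10-second threshold at the same time, and the aggregated response time is built from the best result of each OK check:

```python
# Minimal illustration of the uptime and aggregated response time rules above.
# Each check is a dict {location: response time in seconds}; None means no response.

DOWN_THRESHOLD = 10.0  # seconds; a single location is considered down above this

def is_nok(check):
    """A check is NOK (downtime) only if ALL locations are down simultaneously."""
    return all(t is None or t > DOWN_THRESHOLD for t in check.values())

def uptime_percent(checks):
    """Uptime = 100% minus the share of simultaneous-NOK checks."""
    noks = sum(1 for c in checks if is_nok(c))
    return 100.0 * (1 - noks / len(checks))

def avg_best_response(checks):
    """Average of the best (fastest) response time across locations, over OK checks."""
    best = [min(t for t in c.values() if t is not None)
            for c in checks if not is_nok(c)]
    return sum(best) / len(best)

checks = [
    {"us_east": 0.40, "us_west": 0.55},
    {"us_east": None, "us_west": 0.60},  # one location failed -> still counted as up
    {"us_east": 12.0, "us_west": None},  # all locations down -> NOK
]
print(uptime_percent(checks))     # ~66.7
print(avg_best_response(checks))  # ~0.5 (average of 0.40 and 0.60)
```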
Full page load time is calculated as the average of the best full page load times from all locations.
Apdex (Application Performance Index) is calculated as the average of the best Apdex from all locations.
Apdex for each test location is calculated by Monitis as “(Satisfied Count + Tolerating Count / 2) / Total Samples”. The levels for Apdex are the following: a satisfactory full page load time is below 2.5 seconds, a tolerating one is between 2.5 and 10 seconds, and a frustrating one is above 10 seconds.
Satisfactory Apdex shows how often (as a percentage of all checks) the site loaded faster than 2.5 seconds. In other words, it can be treated as a measure of how satisfied a user is with the hosting speed.
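And here is a short sketch of the Apdex formula quoted above (again my own illustration with made-up load times, using the 2.5-second and 10-second thresholds):

```python
# Illustration of Apdex = (Satisfied + Tolerating / 2) / Total Samples,
# with the thresholds used on this page: satisfied < 2.5 s, tolerating < 10 s.
# The load times below are made-up example values.

SATISFIED_T = 2.5    # seconds
TOLERATING_T = 10.0  # seconds

def apdex(load_times):
    satisfied = sum(1 for t in load_times if t < SATISFIED_T)
    tolerating = sum(1 for t in load_times if SATISFIED_T <= t < TOLERATING_T)
    return (satisfied + tolerating / 2) / len(load_times)

def satisfactory_apdex(load_times):
    """Percentage of checks where the page loaded in under 2.5 seconds."""
    return 100.0 * sum(1 for t in load_times if t < SATISFIED_T) / len(load_times)

samples = [1.2, 1.8, 2.1, 3.4, 2.2, 11.0, 1.9, 2.0, 4.5]  # seconds
print(round(apdex(samples), 3))               # 0.778 -> (6 satisfied + 2/2) / 9
print(round(satisfactory_apdex(samples), 1))  # 66.7 -> 6 of 9 checks under 2.5 s
```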
How are hosting contest results calculated?
I take into account the following factors (the most important ones in my opinion):
- Uptime
- Satisfactory Apdex (the percentage of time when a test website on a particular host loads faster than 2.5 seconds)
- Average full page load time (i.e. how fast the hosting is in general)
A host is declared the winner if it has the best (i.e. smallest) average full page load time, provided its uptime is above 99.9% and its Satisfactory Apdex is above 99%.
In other words, I pick out the best of the best hosts: their uptime is really high and their performance is continuously superb.
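To illustrate the selection rule, here is a small sketch with hypothetical hosts and made-up numbers (the host names below are placeholders, not hosts I actually monitor):

```python
# Sketch of the contest rule above, with hypothetical hosts and made-up numbers.
# A host qualifies with uptime > 99.9% and Satisfactory Apdex > 99%;
# among qualifiers, the lowest average full page load time wins.

hosts = [
    # (name, uptime %, Satisfactory Apdex %, avg full page load time in seconds)
    ("host-a", 99.95, 99.4, 1.8),
    ("host-b", 99.99, 98.7, 1.5),  # fast, but Satisfactory Apdex is too low
    ("host-c", 99.92, 99.6, 1.6),
]

qualifiers = [h for h in hosts if h[1] > 99.9 and h[2] > 99.0]
winner = min(qualifiers, key=lambda h: h[3])
print(winner[0])  # host-c
```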
The Hosting Performance Contest results
The monthly reports on hosting performance are here.
Also, you can see real-time performance charts of the hosts I monitor as well as historical data on the hosting performance on this page.
And finally, my recommended hosts are here.