Ranking Methodology for Web Hosting and Cloud Service Providers

Why Web Hosting

Web hosting addresses customers who need a website of their own, whether for private or business purposes. In contrast to dedicated hosting products, this service provides shared environments, hence the alternative name shared hosting, in which the websites of multiple customers are hosted on a single server.

In the past, web hosting was the entry point for web-based projects. Low prices attracted customers who were willing to share performance with others and forgo the latest technology. That trade-off has disappeared over the past years: web hosting now provides state-of-the-art technology, good user guidance, and an excellent price-performance ratio.

While inexperienced users benefit from the ease of use provided by features like website builders or 1-click installation for apps like WordPress or Joomla!, experts appreciate the high-performance platforms of modern web hosting packages combined with favorable entry-level prices.

Why Cloud Servers

Cloud computing plays a key role in professional IT departments due to its flexibility, reliability, and efficiency. By using cloud servers, you no longer need to maintain outdated hardware and software; instead, you can use the latest state-of-the-art technology at low cost. However, choosing the right cloud provider is crucial.

Introducing cloud services into your IT stack can be an excellent solution for your business. With managed public clouds, the IT department can focus on application performance while the cloud provider's specialists maintain the overall health of the servers and infrastructure.

This not only saves personnel costs but also spares you from deploying updates to your own infrastructure, and it gives you outstanding scalability. Cloud infrastructure can be set up in seconds and managed directly from your own dashboard, making it a good fit for almost any kind of cloud platform.

Web Hosting Providers - Ranking Methodology

In order to provide consumers with accurate and unbiased reviews, Monitis purchases and measures the performance and uptime of all web hosting services that it ranks. All criteria used to rank shared hosts are data-based. Leading EU- and US-based hosting providers are selected for testing and comparison of website performance and availability.

We track three types of hosting offerings: Linux, Windows, and Managed WordPress. For each offering, a benchmarking website was uploaded to the hosting providers' servers to simulate the end-user experience as accurately as possible.

All web content (images, text, etc.) was hosted on the local server so that performance does not depend on files or objects stored outside the web server. As not all of the providers offer all three types of offerings, the providers listed in each section vary.

Page load speed and uptime are checked and recorded using Monitis monitoring services. The recorded data is compiled and averaged over the selected reporting period (last month, last three months, last year).

The testing is conducted from Monitis' reserved monitoring locations (dedicated servers) in the following regions: US West, US East, Spain, France, the United Kingdom, Germany, and Italy.

WordPress Monitoring

For the monitoring of WordPress websites, we installed the WordPress package available at each hosting provider and published a default WordPress website. Monitoring was then configured against this default WordPress site.

Uptime and Response Time Rankings

The main metrics measured in this ranking are the hosted site’s uptime percentage and response time in milliseconds.

The scoring is based on Monitis data generated via 1-minute-interval testing, averaged over the selected reporting period (last month, last three months, last year). Uptime and response times are tracked from selected locations in the EU and US.

Uptime calculation

Monitis gathers uptime data by pinging the hosted web server at set intervals of 1 minute. The web server is recorded as being up if a response is received. If no response is received, the web server is marked as down during that interval from that particular testing location.

If a particular location is selected:

Uptime for a reporting interval is calculated as the percentage of checks in which the site was up, out of all checks run from the location:

Uptime = (Count of checks where the site was up / Total count of checks) * 100%

If a region is selected:

Uptime for a reporting interval is calculated as the MEDIAN uptime across all locations in the region.

Uptime = The MEDIAN of uptimes across the region’s locations

See the Wikipedia article for the definition of the median: https://en.wikipedia.org/wiki/Median.
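
To make the two cases concrete, here is a minimal Python sketch of both calculations. The function names and the sample data are illustrative only and are not part of Monitis' implementation:

    import statistics

    def location_uptime(checks):
        # checks: list of booleans, one per 1-minute check;
        # True means a response was received (site was up).
        return 100.0 * sum(checks) / len(checks)

    def region_uptime(checks_by_location):
        # Regional uptime is the MEDIAN of the per-location uptimes.
        per_location = [location_uptime(c) for c in checks_by_location.values()]
        return statistics.median(per_location)

    # Example: three locations in a region, five checks each.
    region = {
        "Germany": [True, True, True, True, True],   # 100% uptime
        "France":  [True, True, False, True, True],  # 80% uptime
        "Italy":   [True, True, True, True, False],  # 80% uptime
    }
    print(region_uptime(region))  # 80.0: the median of 100, 80, 80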

Response time calculation

Monitis gathers response data only while the server is up. Response time is tracked as the time spent on: tcp_connect + ssl_handshake + http_request_send + http_request_receive + redirects. The monitoring location sends an HTTP request to the web server and records the duration between when the request was sent and when the response was received back, as an indication of network speed.

If a particular location is selected:

Response time for a reporting interval is calculated as the average response time, excluding checks during which the site was down:

Response Time = Average response time for a location = Sum of response times / Count of checks where the site was up

If a region is selected:

Response time for a reporting interval is calculated as the MEDIAN response time across all locations in the region:

Response Time = The MEDIAN of response times across the region’s locations
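
A hedged sketch of both response time aggregations, in the same illustrative style as the uptime example above (the names and data structures are ours, not Monitis internals):

    import statistics

    def location_response_time(samples):
        # samples: list of (was_up, response_ms) tuples.
        # Checks during which the site was down carry no meaningful
        # duration and are excluded from the average.
        up_times = [ms for was_up, ms in samples if was_up]
        return sum(up_times) / len(up_times)

    def region_response_time(samples_by_location):
        # Regional response time is the MEDIAN of per-location averages.
        per_location = [location_response_time(s)
                        for s in samples_by_location.values()]
        return statistics.median(per_location)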

Performance Ranking

For the performance ranking, Monitis' Full Page Load monitoring tests are used to measure page load time. By tracking the page load time, the test gauges visitors' user experience and satisfaction. All tests are done with real web browsers, so the results closely match the end-user experience. We use multiple instances of the Firefox web browser to load the websites and record the full page load time. Tests are run from selected Monitis locations in the EU and US.

Why is it important?

  • Slow page load times negatively impact a website's Google ranking.
  • Slow page load times turn away visitors. Research has shown that slow page loads (4 seconds or more) increase users' frustration and dissatisfaction with a site.
  • Slow page load times result in lost sales. Research has shown that an 8-second load time (4 seconds above the optimal load time of 4 seconds) can result in a visitor loss of up to 75.75%.

Two main metrics are tracked: Full Page Load and Apdex. Only Full Page Load data is used to determine ranking.

Full Page Load

With Full Page Load monitoring, you can see how long it takes to load a complete HTML page in real browsers, including images, CSS, JavaScript, RSS, Flash, frames/iframes, etc.

If a particular location is selected:

Page load time for a reporting interval is calculated as the average page load time:
Page Load Time = Average page load time for a location = Sum of page load durations / Count of checks where the site was up

If a region is selected:

Page load time for a reporting interval is calculated as the MEDIAN page load time across all locations in the region:
Page Load Time = The MEDIAN of page load times across the region’s locations
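
The aggregation mirrors the response time calculation: a per-location average over successful checks, then a regional MEDIAN. A minimal sketch, again with hypothetical names:

    import statistics

    def location_page_load(durations_ms):
        # durations_ms: full page load times from checks where the site was up.
        return sum(durations_ms) / len(durations_ms)

    def region_page_load(durations_by_location):
        # Regional page load time is the MEDIAN of per-location averages.
        per_location = [location_page_load(d)
                        for d in durations_by_location.values()]
        return statistics.median(per_location)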

The Apdex Score

Apdex (Application Performance Index) is an open standard, developed by an alliance of companies, for reporting and comparing the performance of software applications. Its purpose is to convert measurements into insights about user satisfaction.

The Apdex score (0-1) is calculated from the full page load test duration. The calculation is based on a so-called Target for the test duration: the maximum time the test should take to complete for the user to be satisfied. The Target used by Monitis is 2.5 seconds.

Monitis runs page load tests every 15 minutes. For each test, one of the Apdex counts below is incremented:

  • Satisfied: if the test duration is less than or equal to the Target.
  • Tolerating: if the test duration is greater than the Target and less than or equal to four times the target (4 x Target).
  • Frustrated: if the test duration is greater than 4 x Target.

The Apdex score is then calculated for the chosen time range as:
Apdex = (Satisfied Count + (Tolerating Count / 2)) / Total Tests
See https://en.wikipedia.org/wiki/Apdex#Apdex_method for more info.

If a particular location is selected:

Apdex score for a reporting interval is calculated as per the formula above.

If a region is selected:

Apdex score for a reporting interval is calculated as:

Apdex score = Average Apdex score across all locations in the region = Sum of the region’s locations’ Apdex scores / Number of locations in the region
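
The whole Apdex pipeline can be summarized in a short Python sketch. The 2.5-second Target and the bucket boundaries come from the description above; the function names and example data are illustrative, not Monitis code:

    TARGET_S = 2.5  # Monitis' Target for the test duration, in seconds

    def apdex(durations_s, target=TARGET_S):
        # Bucket each test, then apply the Apdex formula.
        satisfied = sum(1 for d in durations_s if d <= target)
        tolerating = sum(1 for d in durations_s if target < d <= 4 * target)
        # Frustrated tests (duration > 4 x Target) contribute nothing.
        return (satisfied + tolerating / 2) / len(durations_s)

    def region_apdex(durations_by_location):
        # Regional Apdex is the plain average of per-location scores.
        scores = [apdex(d) for d in durations_by_location.values()]
        return sum(scores) / len(scores)

    # Example: 2 satisfied, 1 tolerating, 1 frustrated test.
    print(apdex([1.8, 2.5, 6.0, 11.0]))  # (2 + 1/2) / 4 = 0.625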

Cloud Service Providers - Ranking Methodology

The methodology used for ranking cloud service providers is the same as for web hosting, except that all cloud servers run 64-bit Linux (Ubuntu or Debian) with similar hardware setups and Apache Tomcat 7.0.70.

Benchmarking website

An official benchmarking website created by the European Telecommunications Standards Institute (ETSI) has been used for benchmarking purposes.

This website contains:

  • Images of different sizes
  • Different fonts
  • Special characters
  • iframes
  • CSS
  • JavaScript
  • Non-compressible files to simulate, to some extent, content such as Flash

All resources are hosted on the hosting provider's server; no external resources are used in these tests. A sample of the benchmarking website can be downloaded at: ETSI Kepler Web Reference Page.

Trend View

The arrows in the trend view column indicate the change in ranking for a hosting or cloud service provider, based on changes in its uptime, response time, and full page load.