MobiNEX: The Mobile Network Experience Index, H1 2016

Executive Summary

In response to customers’ growing usage of mobile data and applications, in April 2016 STL Partners developed MobiNEX: The Mobile Network Experience Index, which ranks mobile network operators on key measures of customer experience. To do this, we benchmark mobile operators’ network speed and reliability, allowing individual operators to see, objectively and quantitatively, how they are performing relative to the competition.

Operators are assigned an individual MobiNEX score out of 100 based on their performance across four measures that STL Partners believes to be core drivers of customer app experience: download speed, average latency, error rate and latency consistency (the proportion of app requests that take longer than 500ms to fulfil).
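
To make the scoring mechanics concrete, the short Python sketch below shows one way a composite index of this kind can be computed: each measure is mapped linearly onto a 1–25 sub-score between a ‘worst’ and ‘best’ benchmark, and the four sub-scores are summed to give a score out of 100. The benchmark values used here are illustrative assumptions only; the report’s actual benchmarks are set out in Appendix 1.

```python
# Illustrative sketch of a MobiNEX-style composite score (not the
# report's actual benchmarks - those are defined in Appendix 1).

def scale(value, worst, best, points=25):
    """Map a raw measure linearly onto a 1..points sub-score."""
    if best > worst:   # higher raw value is better (e.g. download speed)
        frac = (value - worst) / (best - worst)
    else:              # lower raw value is better (e.g. latency, errors)
        frac = (worst - value) / (worst - best)
    frac = min(max(frac, 0.0), 1.0)
    return max(1, round(frac * points))   # floor of 1, as in the report

def mobinex(download_mbps, avg_latency_ms, errors_per_10k, pct_over_500ms):
    """Sum four 25-point sub-scores into a score out of 100."""
    return (scale(download_mbps, worst=1, best=20)         # download speed
            + scale(avg_latency_ms, worst=1000, best=200)  # average latency
            + scale(errors_per_10k, worst=300, best=20)    # error rate
            + scale(pct_over_500ms, worst=40, best=5))     # latency consistency

print(mobinex(download_mbps=12.0, avg_latency_ms=280,
              errors_per_10k=60, pct_over_500ms=10))       # -> 78
```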

Our partner Apteligent has provided us with the raw data for three of the four measures, based on billions of requests made from tens of thousands of applications used by hundreds of millions of users in H1 2016. While our April report focused on the top three or four operators in just seven Western markets, this report covers 80 operators drawn from 25 markets across the globe.

The top ten operators were from Japan, France, the UK and Canada:

  • Softbank JP scores highest on the MobiNEX for H1 2016, with high scores across all measures and a total score of 85 out of 100.
  • Close behind are Bouygues FR (80) and Free FR (79), which came first and second respectively in the Q4 2015 rankings. Both achieve high scores for error rate, latency consistency and average latency, but are slightly let down by download speed.
  • The top six is completed by NTT DoCoMo JP (78), Orange FR (75) and au (KDDI) JP (71).
  • Slightly behind are Vodafone UK (65), EE UK (64), SFR FR (63), O2 UK (62) and Rogers CA (62). Except in the case of Rogers, who score similarly on all measures, these operators are let down by substantially worse download speeds.

The bottom ten operators all score a total of 16 or lower out of 100, suggesting a materially worse customer app experience.

  • Trailing the pack with scores of 1 or 2 across all four measures were Etisalat EG (4), Vodafone EG (4), Smart PH (5) and Globe PH (5).
  • Beeline RU (11) and Malaysian operators U Mobile MY (9) and Digi MY (9) also fare poorly, but benefit from slightly higher latency consistency scores. Slightly better overall, but still achieving minimum scores of 1 for download speed and average latency, are Maxis MY (14) and MTN ZA (12).

Overall, the extreme difference between the top and bottom of the table highlights a vast inequality in network customer experience across the planet: customer app experience depends to a large degree on where one lives. However, our analysis shows that while economic prosperity does in general lead to a more advanced mobile experience, as you might expect, it does not guarantee it. Norway, Sweden, Singapore and the US are examples of high-income markets with lower MobiNEX scores than the global picture would suggest. STL Partners will do further analysis to uncover more of the drivers of differentiation between markets and between the players within them.

 

MobiNEX H1 2016 – included markets

MobiNEX H1 2016 – operator scores

 Source: Apteligent, OpenSignal, STL Partners analysis

 

  • About MobiNEX
  • Changes for H1 2016
  • MobiNEX H1 2016: results
  • The winners: top ten operators
  • The losers: bottom ten operators
  • The surprises: operators where you wouldn’t expect them
  • MobiNEX by market
  • MobiNEX H1 2016: segmentation
  • MobiNEX H1 2016: raw data
  • Error rate
  • Latency consistency
  • Download speed
  • Average latency
  • Appendix 1: Methodology and source data
  • Latency, latency consistency and error rate: Apteligent
  • Download speed: OpenSignal
  • Converting raw data into MobiNEX scores
  • Setting the benchmarks
  • Why measure customer experience through app performance?
  • Appendix 2: Country profiles
  • Country profile: Australia
  • Country profile: Brazil
  • Country profile: Canada
  • Country profile: China
  • Country profile: Colombia
  • Country profile: Egypt
  • Country profile: France
  • Country profile: Germany
  • Country profile: Italy
  • Country profile: Japan
  • Country profile: Malaysia
  • Country profile: Mexico
  • Country profile: New Zealand
  • Country profile: Norway
  • Country profile: Philippines
  • Country profile: Russia
  • Country profile: Saudi Arabia
  • Country profile: Singapore
  • Country profile: South Africa
  • Country profile: Spain
  • Country profile: United Arab Emirates
  • Country profile: United Kingdom
  • Country profile: United States
  • Country profile: Vietnam

 

  • Figure 1: MobiNEX scoring breakdown, benchmarks and raw data used
  • Figure 2: MobiNEX H1 2016 – included markets
  • Figure 3: MobiNEX H1 2016 – operator scores breakdown (top half)
  • Figure 4: MobiNEX H1 2016 – operator scores breakdown (bottom half)
  • Figure 5: MobiNEX H1 2016 – average scores by country
  • Figure 6: MobiNEX segmentation dimensions
  • Figure 7: MobiNEX segmentation – network speed vs reliability
  • Figure 8: MobiNEX segmentation – network speed vs reliability – average by market
  • Figure 9: MobiNEX vs GDP per capita – H1 2016
  • Figure 10: MobiNEX vs smartphone penetration – H1 2016
  • Figure 11: Error rate per 10,000 requests, H1 2016 – average by country
  • Figure 12: Error rate per 10,000 requests, H1 2016 (top half)
  • Figure 13: Error rate per 10,000 requests, H1 2016 (bottom half)
  • Figure 14: Requests with total roundtrip latency > 500ms (%), H1 2016 – average by country
  • Figure 15: Requests with total roundtrip latency > 500ms (%), H1 2016 (top half)
  • Figure 16: Requests with total roundtrip latency > 500ms (%), H1 2016 (bottom half)
  • Figure 17: Average weighted download speed (Mbps), H1 2016 – average by country
  • Figure 18: Average weighted download speed (Mbps), H1 2016 (top half)
  • Figure 19: Average weighted download speed (Mbps), H1 2016 (bottom half)
  • Figure 20: Average total roundtrip latency (ms), H1 2016 – average by country
  • Figure 21: Average total roundtrip latency (ms), H1 2016 (top half)
  • Figure 22: Average total roundtrip latency (ms), H1 2016 (bottom half)
  • Figure 23: Benchmarks and raw data used

MobiNEX: The Mobile Network Experience Index, Q4 2015

Executive Summary

In response to customers’ growing usage of mobile data and applications, STL Partners has developed MobiNEX: The Mobile Network Experience Index, which benchmarks mobile operators’ network speed and reliability by measuring the consumer app experience, allowing individual players to see, objectively and quantitatively, how they are performing relative to the competition.

We assign operators an individual MobiNEX score based on their performance across four measures that are core drivers of customer app experience: download speed, average latency, error rate and latency consistency (the percentage of app requests that take longer than 500ms to fulfil). Apteligent has provided us with the raw data for three of the four measures, based on billions of requests made from tens of thousands of applications used by hundreds of millions of users in Q4 2015. We plan to expand the index to cover other operators and to track performance over time with twice-yearly updates.

Encouragingly, MobiNEX scores are positively correlated with customer satisfaction in the UK and the US, suggesting that a better mobile app experience contributes to customer satisfaction.

The top five performers across twenty-seven operators in seven countries in Europe and North America (Canada, France, Germany, Italy, Spain, UK, US) were all from France and the UK, suggesting a high degree of competition in these markets as operators strive to improve relative to their peers:

  • Bouygues Telecom in France scores highest on the MobiNEX for Q4 2015 with consistently high scores across all four measures and a total score of 76 out of 100.
  • It is closely followed by two other French operators. Free, the late entrant to the market, which started operations in 2012, scores 73. Orange, the former national incumbent, is slightly let down by the number of app errors experienced by users but achieves a healthy overall score of 70.
  • The top five is completed by two UK operators: EE (65) and O2 (61), with similar scores to the three French operators on everything except download speed, which was substantially worse.

The bottom five operators have scores suggesting a materially worse customer app experience, and we suggest that management focus on improvements across all four measures to strengthen customer relationships and competitive position. This applies particularly to:

  • E-Plus in Germany (now part of Telefónica’s O2 network but identified separately by Apteligent).
  • Wind in Italy, which is particularly let down by latency consistency and download speed.
  • Telefónica’s Movistar, the Spanish market share leader.
  • Sprint in the US with middle-ranking average latency and latency consistency but, like other US operators, poor scores on error rate and download speed.
  • 3 Italy, principally a result of its low latency consistency score.

Surprisingly, given the extensive deployment of 4G networks there, the US operators perform poorly and are providing an underwhelming customer app experience:

  • The best-performing US operator, T-Mobile, scores only 45 – a full 31 points below Bouygues Telecom and 4 points below the median operator.
  • All the US operators perform very poorly on error rate and, although 74% of app requests in the US were made on LTE in Q4 2015, no US player scores highly on download speed.

MobiNEX scores – Q4 2015

 Source: Apteligent, OpenSignal, STL Partners analysis

MobiNEX vs Customer Satisfaction

Source: ACSI, NCSI-UK, STL Partners

 

  • Introduction
  • Mobile app performance is dependent on more than network speed
  • App performance as a measure of customer experience
  • MobiNEX: The Mobile Network Experience Index
  • Methodology and key terms
  • MobiNEX Q4 2015 Results: Top 5, bottom 5, surprises
  • MobiNEX is correlated with customer satisfaction
  • Segmenting operators by network customer experience
  • Error rate
  • Quantitative analysis
  • Key findings
  • Latency consistency: Requests with latency over 500ms
  • Quantitative analysis
  • Key findings
  • Download speed
  • Quantitative analysis
  • Key findings
  • Average latency
  • Quantitative analysis
  • Key findings
  • Appendix: Source data and methodology
  • STL Partners and Telco 2.0: Change the Game
  • About Apteligent

 

  • MobiNEX scores – Q4 2015
  • MobiNEX vs Customer Satisfaction
  • Figure 1: MobiNEX – scoring methodology
  • Figure 2: MobiNEX scores – Q4 2015
  • Figure 3: Customer Satisfaction vs MobiNEX, 2015
  • Figure 4: MobiNEX operator segmentation – network speed vs network reliability
  • Figure 5: MobiNEX operator segmentation – with total scores
  • Figure 6: Major Western markets – error rate per 10,000 requests
  • Figure 7: Major Western markets – average error rate per 10,000 requests
  • Figure 8: Major Western operators – percentage of requests with total roundtrip latency greater than 500ms
  • Figure 9: Major Western markets – average percentage of requests with total roundtrip latency greater than 500ms
  • Figure 10: Major Western operators – average weighted download speed across 3G and 4G networks (Mbps)
  • Figure 11: Major European markets – average weighted download speed (Mbps)
  • Figure 12: Major Western markets – percentage of requests made on 3G and LTE
  • Figure 13: Download speed vs Percentage of LTE requests
  • Figure 14: Major Western operators – average total roundtrip latency (ms)
  • Figure 15: Major Western markets – average total roundtrip latency (ms)
  • Figure 16: MobiNEX benchmarks

Mobile app latency in Europe: French operators lead; Italian & Spanish lag

Latency as a proxy for customer app experience

Latency is a measure of the time taken for a packet of data to travel from one designated point to another. The complication comes in defining the start and end points. An operator seeking to measure its network latency might measure only the transmission time across its own network.

However, to objectively measure customer app experience, it is better to measure the time it takes from the moment the user takes an action, such as pressing a button on a mobile device, to receiving a response – in effect, a packet arriving back and being processed by the application at the device.

This ‘total roundtrip latency’ is what is measured by our partner, Crittercism, via code embedded within applications themselves, on an aggregated and anonymised basis. Put simply, total roundtrip latency is the best measure of customer experience because it encompasses the total ‘wait time’ for a customer, not just a portion of the multi-stage journey.
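
Crittercism captures this measurement from inside the apps themselves; as a rough stand-alone illustration of the same idea, the Python sketch below times a full request/response cycle from the moment the call is issued to the moment the response body has been read. It is a simple client-side approximation, not the Crittercism SDK.

```python
# Minimal sketch: timing 'total roundtrip latency' for one HTTP request
# from the client's point of view (request issued -> response body read).
# A rough approximation of what an in-app measurement SDK records.
import time
import urllib.request

def roundtrip_ms(url):
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()   # wait until the full response body has arrived
    return (time.perf_counter() - start) * 1000

print(f"{roundtrip_ms('https://example.com'):.0f} ms")
```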

Latency is becoming increasingly important

Broadband speeds tend to attract most attention in the press and in operator advertising, and speed does of course affect download and streaming experiences. But total roundtrip latency has a bigger impact than speed on many digital user experiences. This is because of the way that applications are built.

In modern Web applications, the business logic is parcelled out into independent ‘microservices’, whose responses are re-assembled by the client to produce the overall digital user experience. Each HTTP request is often quite small, and an overall onscreen action is typically composed of a number of requests of varying sizes, so broadband speed is often less of a factor than latency – the time to send and receive each request. See Appendix 2: Why latency is important, for a more detailed explanation of why latency is such an important driver of customer app experience.
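
A back-of-envelope calculation illustrates the point. All the figures below – request count, payload size, number of sequential rounds, latency and downlink speed – are illustrative assumptions, but they are of the right order for modern app traffic.

```python
# Back-of-envelope: for small-request app traffic, roundtrip latency
# dominates over raw bandwidth. All figures are illustrative assumptions.
requests = 20        # HTTP requests behind one onscreen action
size_kb = 5          # typical payload per request
rounds = 4           # sequential rounds (requests within a round in parallel)
latency_ms = 300     # total roundtrip latency per request
speed_mbps = 10      # downlink speed

transfer_ms = requests * size_kb * 8 / (speed_mbps * 1000) * 1000
latency_cost_ms = rounds * latency_ms

print(f"transfer time: {transfer_ms:.0f} ms")       # -> 80 ms
print(f"latency cost:  {latency_cost_ms:.0f} ms")   # -> 1200 ms
```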

The value of using actual application latency data

As we have already explained, STL Partners prefers to use total roundtrip latency as an indicator of customer app experience, as it measures the time a customer waits for a response following an action. STL Partners believes that Crittercism data reflects actual usage in each market because it operates within the apps themselves – in hundreds of thousands of apps that people use from the Apple App Store and Google Play. This is quite a different approach from that of other players, which require users to download a specific app that then ‘pings’ a server and awaits a response. The latter approach has a couple of limitations:

1. Although there have been several million downloads of the OpenSignal and Actual Experience apps, this doesn’t get anywhere near the number of people who have downloaded apps containing the Crittercism measurement code.

2. Because the Crittercism code is embedded within apps, it directly measures the latency experienced by users when using those apps. A dedicated measurement app fails to do this. It could be argued that a dedicated app gives the ‘cleanest’ reading – it isn’t affected by variations in app design, for example. This is true, but STL Partners believes that aggregating the data across apps removes such variation and reveals a representative picture of total roundtrip latency. Crittercism can also provide more granular data. For example, although we haven’t shown it in this report, Crittercism data can show latency performance by application type – e.g. Entertainment, Shopping, and so forth – based on the categorisation of apps used by Google and Apple in their app stores.

A key premise of this analysis is that, because operators’ customer bases are similar within and across markets, the profile of app usage (and therefore latency) is similar from one operator to the next. The latency differences between operators are, therefore, down to the performance of the operator.

Why it isn’t enough to measure average latency

It is often said that averages hide disparities in data, and this is particularly true for latency and for customer experience. This is best illustrated with an example. In Figure 2 we show the distribution of latencies for two operators. Operator A has lots of very fast requests and a long tail of requests with high latencies.

Operator B has far fewer fast requests but a much shorter tail of poor-performing latencies. The chart clearly shows that operator B has a much higher percentage of requests with a satisfactory latency even though its average latency is worse than operator A’s (318ms vs 314ms). Essentially, operator A is let down by its slowest requests – those that prevent an application from completing a task for a customer.

This is why in this report we focus on average latency AND, critically, on the percentage of requests that are deemed ‘unsatisfactory’ from a customer experience perspective.
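
The effect is easy to reproduce with synthetic data. The two distributions below are invented purely to mimic the pattern in Figure 2 – a fast bulk with a long slow tail versus a slower bulk with almost no tail – and show how the operator with the better average can still have far more requests over the 500ms threshold.

```python
# Synthetic illustration of Figure 2: a lower average latency can coexist
# with a worse tail. The two distributions below are made up for the demo.
import random

random.seed(1)
# Operator A: mostly very fast requests, plus a long slow tail.
op_a = ([random.gauss(200, 50) for _ in range(9000)]
        + [random.gauss(1300, 300) for _ in range(1000)])
# Operator B: fewer very fast requests, but hardly any slow ones.
op_b = [random.gauss(318, 80) for _ in range(10000)]

for name, latencies in (("A", op_a), ("B", op_b)):
    avg = sum(latencies) / len(latencies)
    tail = 100 * sum(l > 500 for l in latencies) / len(latencies)
    print(f"Operator {name}: avg {avg:.0f} ms, {tail:.1f}% over 500ms")
# Operator A ends up with the lower (better) average but roughly ten
# times as many requests over 500ms as operator B.
```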

Using latency as a measure of performance for customers

500ms as a key performance cut-off

‘Good’ roundtrip latency is somewhat subjective, and there is evidence that experience declines in a linear fashion as latency increases – people incrementally drop off the site. However, we have picked 500ms (half a second) as the threshold for unsatisfactory performance, as we believe that a delay of more than this is likely to impact mobile users negatively (expectations on the ‘fixed’ internet are higher). User interface research from as far back as 1968 suggests that anything below 100ms is perceived as “instant”, although more recent work on gamers suggests that even lower is usually better, and delay starts to become intrusive after 200-300ms. Google experiments from 2009 suggest that a lasting effect – users continued to see the site as “slow” for several weeks – kicked in above 400ms.

Percentage of app requests with total roundtrip latency above 500ms – markets

Five key markets in Europe: France, Germany, Italy, Spain and the UK.

This first report looks at five key markets in Europe: France, Germany, Italy, Spain and the UK. We explore performance overall for Europe by comparing the relative performance of each country and then dive into the performance of operators within each country.

We intend to publish other reports in this series, looking at performance in other regions – North America, the Middle East and Asia, for example. This first report is intended to provide a ‘taster’ for readers, and STL Partners would welcome feedback on additional insights readers would find useful, such as latency performance by:

  • Operating system – Android vs Apple
  • Specific device – e.g. Samsung S6 vs iPhone 6
  • App category – e.g. shopping, games, etc.
  • Specific countries
  • Historical trends

Based on this feedback, STL Partners and Crittercism will explore whether it is valuable to provide specific total roundtrip latency measurement products.

Contents

  • Latency as a proxy for customer app experience
  • ‘Total roundtrip latency’ is the best measure for customer ‘app experience’
  • Latency is becoming increasingly important
  • STL Partners’ approach
  • Europe: UK, Germany, France, Italy, Spain
  • Quantitative Analysis
  • Key findings
  • UK: EE, O2, Vodafone, 3
  • Quantitative Analysis
  • Key findings
  • Germany: T-Mobile, Vodafone, e-Plus, O2
  • Quantitative Analysis
  • Key findings
  • France: Orange, SFR, Bouygues Télécom, Free
  • Quantitative Analysis
  • Key findings
  • Italy: TIM, Vodafone, Wind, 3
  • Quantitative Analysis
  • Key findings
  • Spain: Movistar, Vodafone, Orange, Yoigo
  • Quantitative Analysis
  • Key findings
  • About STL Partners and Telco 2.0
  • About Crittercism
  • Appendix 1: Defining latency
  • Appendix 2: Why latency is important

 

  • Figure 1: Total roundtrip latency – reflecting a user’s ‘wait time’
  • Figure 2: Why a worse average latency can result in higher customer satisfaction
  • Figure 3: Major European markets – average total roundtrip latency (ms)
  • Figure 4: Major European markets – percentage of requests above 500ms
  • Figure 5: The location of Google and Amazon’s European data centres favours operators in France, UK and Germany
  • Figure 6: European operators – average total roundtrip latency (ms)
  • Figure 7: European operators – percentage of requests with latency over 500ms
  • Figure 8: Customer app experience is likely to be particularly poor at 3 Italy, Movistar (Spain) and Telecom Italia
  • Figure 9: UK Operators – average latency (ms)
  • Figure 10: UK operators – percentage of requests with latency over 500ms
  • Figure 11: German Operators – average latency (ms)
  • Figure 12: German operators – percentage of requests with latency over 500ms
  • Figure 13: French Operators – average latency (ms)
  • Figure 14: French operators – percentage of requests with latency over 500ms
  • Figure 15: Italian Operators – average latency (ms)
  • Figure 16: Italian operators – percentage of requests with latency over 500ms
  • Figure 17: Spanish Operators – average latency (ms)
  • Figure 18: Spanish operators – percentage of requests with latency over 500ms
  • Figure 19: Breakdown of HTTP requests in facebook.com, by type and size