The $300bn COVID digital health dividend

This report introduces a new sizing model for digital healthcare that reflects the impact of the COVID pandemic on the sector, with the goal of identifying the new opportunities and risks facing operators and others investing, or considering investing, in the market. A key finding is that market development has been accelerated four years ahead of its prior trajectory, meaning that players should significantly reassess the urgency and scale of their strategic response.


Why healthcare?

STL Partners has long argued that if telecoms operators want to build new businesses beyond connectivity, they will need 1) clarity on which customer needs to address and 2) long-term commitment to investment and innovation to address them. Adding value further up the value chain requires significant new skills and capabilities, so we believe telecoms operators must be deliberate in their choice of which customers they want to serve, i.e. which verticals, and what they want to do for them. For more detail, see STL Partners’ report How mobile operators can build winning 5G business models.

We believe that healthcare is a vertical that is well suited to telecoms operators’ strategic scope:

  • Healthcare is a consistently growing need in every country in the world
  • It is a big sector that can truly move the needle on telcos’ revenues, accounting for nearly 10% of GDP globally in 2018, up from 8.6% of GDP in 2000 according to WHO data
  • It operates within national economies of scale (even if the technology is global, implementation of that technology requires local knowledge and relationships)
  • The sector has historically been slower than others in its adoption of new technologies, partly due to quality and regulatory demands, factors that telcos are used to dealing with
  • Improving healthcare outcomes is meaningful work that all employees and stakeholders can relate to

Many telcos also believe that healthcare is a vertical with significant opportunity, as demonstrated by operators such as TELUS and Telstra making big investments in building health IT businesses, and by smaller but ongoing efforts from many others. See STL Partners’ report How to crack the healthcare opportunity for profiles of nine telecoms operators’ strategies in the healthcare vertical.

Our research into the telecoms industry’s investment priorities in 2021 shows that the accelerated uptake of digital health solutions throughout the COVID pandemic has only shifted health further up the priority list for operators.

Figure 1: Digital health is among telcos’ top investment priorities in 2021


Source: STL Partners, Telecoms priorities: Ready for the crunch?

However, few operators have put their full effort into driving the transformation of healthcare delivery and outcomes through digital solutions. From our conversations with operators around the world, we believe this is in large part because they are not yet fully convinced that addressing the challenges associated with transforming healthcare – fragmented and complex systems, slow-moving public processes, impact on human lives – will pay off. Are they capable of solving these challenges, and is the business opportunity big enough to justify the risk?

Taking a cautious “wait and see” approach to developing a digital health business, such as launching a couple of trials or PoCs to see if they deliver value, or investing in a digital health start-up or two, may have been viable before the COVID pandemic hit. With the acceleration in digital health adoption, this is no longer the case. Now that COVID has forced healthcare providers and patients to embrace new technologies, the proof points and business cases the industry has been demanding have become much clearer. As a result, the digital health market is now four years ahead of where it was at the beginning of 2020, and operators seeking to build a business in healthcare should commit now, while momentum and appetite for change are strong.


How is COVID changing healthcare delivery?

The first and most significantly affected area of the digital health landscape throughout 2020 was virtual consultations and telehealth: almost overnight, doctors shifted as many appointments as possible on to phone or video calls. For example, in the UK the proportion of doctors’ visits happening over the phone or video rose from around 13% in late 2019 to 48% at the peak of the pandemic in April-June 2020, while US-based virtual consultation provider Teladoc’s total visits tripled between Q2 2019 and Q2 2020, to 2.8mn.

By necessity, regulatory barriers to the adoption of virtual consultations were lowered. Other barriers also broke down, such as insurers or governments not reimbursing or underpaying doctors for virtual appointments, and organisational and cultural barriers among both patients and providers. The knock-on effect has been acceleration across the broader digital health market, in areas such as remote patient monitoring and population-level analytics. (For more on the immediate impact of COVID on digital health, see the STL article How COVID-19 is changing digital health – and what it means for telcos.)

The key question is how much of an impact COVID has had, and whether that impact will last over the long term. This is what we aim to answer in this report and the accompanying global database tool. Specifically, we address:

  • How much has COVID accelerated adoption of digital health applications?
  • What are the cost savings from accelerated uptake of digital health following COVID?
  • Which digital health application areas have been most affected by COVID?
  • Beyond the COVID impact, what is the total potential value of digital health applications for healthcare providers?
  • Which digital health application areas will deliver the biggest cost savings, globally and within specific markets?

To answer these questions we have built a bottom-up forecast model with a focus on the application areas we believe are most relevant to telecoms operators, as illustrated in Figure 2.
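The structure of a bottom-up model of this kind can be sketched in a few lines. Every input below is a hypothetical placeholder for illustration only, not a figure or assumption from the report or its database tool:

```python
# Skeleton of a bottom-up market sizing, per country and application area.
# All numbers are hypothetical placeholders, not figures from the report.

def application_value(population: int,
                      addressable_share: float,
                      adoption_rate: float,
                      annual_saving_per_user: float) -> float:
    """Annual value (cost savings) of one application in one market."""
    return population * addressable_share * adoption_rate * annual_saving_per_user

# e.g. remote patient monitoring in a hypothetical 60m-person market:
value = application_value(
    population=60_000_000,
    addressable_share=0.20,        # share of population with relevant conditions
    adoption_rate=0.15,            # share of those actually using the solution
    annual_saving_per_user=500.0,  # avoided costs per monitored patient, USD
)
print(f"${value / 1e9:.2f}bn")  # $0.90bn
```

Summing such calculations across application areas and countries, with adoption rates that shift over time, yields the kind of global forecast described here.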

Figure 2: Five digital health application areas for telcos


Source: STL Partners

We believe these are the most relevant because their high dependence on connectivity, and their need for significant coordination and engagement with a broad range of local stakeholders to succeed, align well with telecoms operators’ assets. See this STL Partners article for more detail on why these application areas are good entry points for telecoms operators.

NB We chose to omit the Personal health and wellness application area from our bottom-up model. It is a more generic and global application area than the others, dominated by players such as Google/Fitbit and Apple, and with little integration thus far into formal healthcare services. While it is nonetheless of interest to telecoms operators, especially those seeking to build deeper relationships directly with consumers, it is a difficult entry point for operators seeking to build a healthcare business. The global and consumer-focused nature of this application area also means that it is difficult to find reliable local data and quantify its value for healthcare systems.

What are these forecasts for?

Telecoms operators and others should use this forecast analysis to understand the potential value of digital health, including:

  • The size of the digital health opportunity in different markets
  • The market size for new applications across the four areas we modelled (remote patient monitoring, virtual care and telehealth, diagnostics and triage, data and analytics)
  • The relative size of the opportunities across the four application areas in different countries
  • The pace of digital health adoption and market growth in different countries and application areas

In other words, it shows how big the overall digital health market is, how fast it is growing, and which application areas are most valuable and/or growing fastest.

In a follow-up report, we will expand on this analysis to assess how much of this value telecoms operators specifically can capture.


Seven Tough CEO Questions – Telco 2.0 Update


In the process of refreshing our 2016-2017 research agenda, STL Partners has identified seven ‘meta’ themes in our recent research that can be thought of as part of a contemporary checklist for telecoms strategy.

These are some of the questions that we believe boards and executives should be asking themselves as they assess their own strategies and decide what further actions and initiatives to investigate and initiate. In the following brief article, we point our customers to our latest findings in these areas. [NB If you’re not a customer, you’ll be able to see some, but not all of the analysis.]

We do not claim that this is a full and exclusive list, and indeed we’d welcome your input via contact@stlpartners.com to set up a call with our research team or share your thoughts and questions directly.

1. Do you have a compelling vision based on an evolving, competitive digital customer experience?

Any compelling vision must apply across a number of scenarios facing telecoms operators. Our initial scenario analysis was directed at European operators, and we have subsequently also conducted workshops and seminars with operators in other parts of the world, which identified and developed similar relevant groupings in their markets.

Figure 1 – Four Illustrative Telecoms Market Scenarios

We then identified the fundamental problems with telecoms transformations in Problem: Telecoms technology inhibits operator business model change, and subsequently proposed a vision and solution to this in Transforming to the Telco Cloud Service Provider.

Much of our other analysis integrates with these ideas, exploring the themes in more specific domains, such as how to transform, relevant strategies in adjacent and disrupted/disrupting industries, developments in advanced enterprise cloud and ICT, and the future of the network, and we outline some of this analysis in summary below.

A critical foundation stone of any future strategy is how competitive a digital experience your company delivers to its clients. In this regard, we have published the first iteration of MobiNEX: The Mobile Network Customer Experience Index, which looked at 27 operators in seven markets and compared the relative performance of operators’ mobile data networks in terms of how they deliver the customer app-use experience. We are also working on a further global analysis in this domain, which will be published soon.

Another foundation stone for telcos is that becoming a truly digital business is not just about technology, IT, marketing, or even HR for that matter. It is an approach that requires the engagement, re-thinking and adaptation / evolution of the whole business.

Looking at other aspects of operators’ digital competence, and following on from our research into operator agility, we are also now working on research into how well operators are transforming their customer-facing digital activities in marketing and sales, and also into other areas of digital maturity and transformation.

In addition, we continue to frame and update the strategic picture in the context of industry analyses such as Brexit: Telecoms Strategy Implications and US Wireless Market: Early Warning Signs of Change, and identify leading case studies of telco innovation in reports such as this one on Dialog’s surprisingly successful API programme, and this on Telstra’s ambitious healthcare investment programme.

 

  • Seven Questions (and pointers to our answers)
  • 1. Do you have a compelling vision based on an evolving, competitive digital customer experience?
  • 2. NFV/SDN: tools of business transformation or toys of the technology department?
  • 3. IoT/5G/Cloud: the Holy Trinity of Hope – or Hype?
  • 4. Can/does your business work well with others in new ways to deliver?
  • 5. Are you tuned into innovation in Communications, Commerce and Content?
  • 6. Is your Enterprise/SMB strategy keeping pace with the market?
  • 7. Is your network holding you back or taking you forward?
  • What’s next?

 

  • Figure 1: Four Illustrative Telecoms Market Scenarios
  • Figure 2: Cloud business practices – key principles
  • Figure 3: Challenges for Telco Digital Services Partnering
  • Figure 4: Six Healthcare Pain Points Telstra Health Aims to Address

Fast-Tracking Operator Plans to Win in the $5bn Location Insights Market

If you don’t subscribe to our research yet, you can download the free report as part of our sample report series.

Preface

Subscriber location information is a much-heralded asset of the telecoms operator. Operators have generally understood the importance of this asset but have typically struggled to monetize their position. Some operators have used location information to enable third party services whilst others have attempted to address the opportunity more holistically, with mixed success.

This report updates and expands on a previous STL Partners study: “Making Money from Location Insights” (2013). It outlines how to address the potential opportunity around Location Services. It draws on interviews conducted amongst key stakeholders within the emerging ecosystem, supplemented by STL Partners’ research and analysis, with the objective of determining how operators can release the value from their unique position in the location value chain.

This report focuses on what we have defined as Location Insight Services. The report argues that operators should first seek to offer Location Insight Services before evolving to cover Location Based Services. This strategic approach allows operators to better understand their data and to build location services for enterprise customers rather than starting with consumer-orientated location services that require additional capabilities. This approach provides the most upside with the least associated risk, offering the potential for incremental learning.

This report was commissioned and supported by Viavi Solutions (formerly JDSU). The research, analysis and the writing of the report itself were carried out independently by STL Partners. The views and conclusions contained herein are those of STL Partners.

Location Based Services vs. Location Insight Services

In the 2013 report, STL made a clear distinction between different types of location services.

  • Location Based Services (LBS) are geared towards supporting business processes (typically marketing-oriented) that are dependent on the instant availability of real-time or near real-time data about an individually identifiable subscriber. These are provided on the reasonable assumption that knowing an individual’s location enables a company to deliver a service or make an offer that is more relevant, there and then. Typically these services require explicit consent and an interaction with the customer (e.g. push marketing) and therefore require compelling user interfaces and permissions.
  • Additionally, there is an opportunity to derive and deliver Location Insight Services (LIS) from connected consumers’ mobile location data. This opportunity does not necessarily require real-time data and, where insights are aggregated and anonymized, can safeguard individuals’ privacy. The underlying premise is that identification of repetitive patterns in location activity over time not only enables a much deeper understanding of the consumer in terms of behavior and motivation, but also builds a clearer picture of the visitor profile of the location. Additionally, LIS has the potential to provide data that is not available via other routes (e.g. understanding the footfall within a competitor’s store).
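The aggregation-and-anonymization step that distinguishes LIS from LBS can be sketched simply. The event layout and the suppression threshold below are illustrative assumptions, not details from the report:

```python
# Illustrative sketch of turning raw location events into an aggregated,
# anonymized footfall insight. The event format and the suppression
# threshold (k = 10) are hypothetical assumptions, not from the report.

K_ANONYMITY = 10  # suppress any cell/hour bucket with fewer than k visitors

def footfall_report(events):
    """events: iterable of (subscriber_id, cell_id, hour) tuples."""
    # Count distinct subscribers per (cell, hour) bucket
    visitors = {}
    for sub, cell, hour in events:
        visitors.setdefault((cell, hour), set()).add(sub)
    # Drop identifiers entirely; suppress small buckets to protect privacy
    return {bucket: len(subs)
            for bucket, subs in visitors.items()
            if len(subs) >= K_ANONYMITY}

events = [(f"sub{i}", "cell_A", 9) for i in range(25)] + \
         [(f"sub{i}", "cell_B", 9) for i in range(3)]
print(footfall_report(events))  # {('cell_A', 9): 25} - cell_B suppressed
```

Only counts leave the pipeline, no subscriber identifiers, which is why repetitive-pattern insights can be sold without exposing individuals.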

Figure 7:  Mapping the Telco Opportunity Landscape

Source: STL Partners

The framework in Figure 7 has been developed by STL Partners specifically with the mobile operator’s perspective in mind. We have split out operator location opportunities along two dimensions:

  • Real-time vs. Non-real-time data acquisition
  • Individual vs. Aggregated data analysis and action

Choosing the Right Strategy

Where are we now?

Most operators understand the potential value of their location asset and have attempted to monetize their data. Some operators have used location to enable 3rd party services whilst others have attempted to address the opportunity more holistically. Both have achieved mixed success for a number of reasons.

Most operators who are attempting to monetize location data have been drawn towards Location Based Services, namely push-marketing and advertising. Whilst some operators have achieved moderate success here (e.g. O2 Priority Moments), most are acting as enablers for other services. They are therefore addressing a limited part of the value chain and consequently are not realizing significant value from their data. We do not consider those that pursue this strategy to be Location Based Services Providers; rather, they are simply enablers.

Similarly a number of operators are addressing Location Insights, albeit with different approaches. Some are partnering with analytics and insight companies (e.g. Telefonica and GfK), others are developing services mostly on their own (e.g. SingTel’s DataSpark), whilst others are simply launching pilots.

In order to maximize the value that operators can secure through Location Services, we believe that operators need to address the whole Location ‘Stack’, not simply enabling new services or providing raw data. STL believe that the best way to do this is to start with Location Insight Services.

Start with Location Insight Services

When considering how to develop and monetize their location assets, we recommend that operators start with LIS. Whilst many operators are already engaged in LBS (e.g. enabling push-marketing), the majority are not actually providing the service but are simply sharing data and enabling a 3rd party service provider.
Starting with LIS has a number of strategic advantages:

  • It’s a big opportunity in its own right
  • Telcos (should) have a data capture/technology advantage for LIS over OTT players
  • LIS provides an opportunity to build & learn incrementally, proving value
  • Privacy risks are reduced (particularly with aggregated data)
  • LIS does not require 100% coverage of the population, unlike a number of LBS use cases
  • LIS can provide internal benefits and can bolster the Go-to-Market strategy for vertical specific offerings

These advantages are explored in more detail further in this report.

 

  • Location, Location, Location
  • The Importance of Information
  • Location Based Services vs. Location Insight Services
  • Choosing the Right Strategy
  • Where are we now?
  • Start with Location Insight Services
  • Improve your LIS offering, transition towards LBS & position yourself as a Trusted Data Provider
  • Location Insights – Marketplace Overview
  • Where is the Opportunity for Location Insight Services?
  • Which Sectors are most addressable?
  • Sizing the Opportunity
  • Why haven’t forecasts developed as quickly as expected?
  • Location Insights potentially worth $5bn globally by 2020
  • Benchmarks
  • Where does the value come from – the Location Insights ‘Stack’
  • Understanding the Technology Options
  • The Technology Options for Location Data Acquisition
  • Technology Advantages for Telcos
  • The Right Degree of Location Precision
  • Other Advantages of Starting with LIS
  • Incremental Learning
  • Addressing the Privacy Question
  • Market Coverage
  • LIS can provide internal benefits and can bolster the Go-to-Market strategy for vertical specific offerings
  • Expanding Beyond Insights
  • Addressing Location Based Services
  • Becoming a Trusted Data Provider
  • Practical Guidance to Launch Location Services
  • Market Strategy
  • Data Management
  • An agile approach, partnering, orchestration and governance
  • Conclusions
  • Appendices
  • Appendix 1: Location Acquisition Technologies in Detail
  • Appendix 2: Opportunity Sizing Methodology
  • Appendix 3: About STL Partners and Telco 2.0: Change the Game

 

  • Figure 1: Location Insight vs. Location Based Services
  • Figure 2: STL Partners’ Analysis of the value of Global Location Insight Services (by 2020)
  • Figure 3: Analysis of location data acquisition technologies suitability for Location Insight Services
  • Figure 4: The Strategy Beyond Location Insights
  • Figure 5: The Explosion of Smartphones (2007-2014)
  • Figure 6: ‘Non-Smart’ Data Insights Become More Important as More ‘Things’ are Connected
  • Figure 7: Mapping the Telco Opportunity Landscape
  • Figure 8: Four opportunity domains for operators
  • Figure 9: Turkcell’s Smart Map Tool
  • Figure 10: TomTom’s Fusion Engine to Analyze Real-Time Traffic Information
  • Figure 11: Tado’s Proximity Based Thermostat
  • Figure 12: Expanding Beyond LIS
  • Figure 13: Location Insights – Market Taxonomy
  • Figure 14: Telefónica Smart Steps Location Analytics Tool
  • Figure 15: Motionlogic’s Location Analytics Tool
  • Figure 16: The value of Global Location Insight Services by industry and sector (by 2020)
  • Figure 17: The Location Insights ‘Stack’
  • Figure 18: How well do different location data acquisition technologies support Location Insight Services needs?
  • Figure 19: Real-Time vs. Near Real-Time Location Information
  • Figure 20: Deveryware’s Dynamic Permissions Tool
  • Figure 21: Become a Trusted Data Provider
  • Figure 22: Analysis of App/OS based real-time location Technology
  • Figure 23: Analysis of App/OS based data stored on device Technology
  • Figure 24: Analysis of Emergency Services Location Technology
  • Figure 25: Analysis of Granular (building level) network based Technology
  • Figure 26: Analysis of Coarse (cell-level) network based Technology
  • Figure 27: Analysis of Indoor Technologies

How 5G is Disrupting Cloud and Network Strategy Today

5G – cutting through the hype

As with 3G and 4G, the approach of 5G has been heralded by vast quantities of debate and hyperbole. We contemplated reviewing some of the more outlandish statements we’ve seen and heard, but for the sake of brevity and progress we’ll concentrate in this report on the genuine progress that has also occurred.

A stronger definition: a collection of related technologies

Let’s start by defining terms. For us, 5G is a collection of related technologies that will eventually be incorporated in a 3GPP standard replacing the current LTE-A. NGMN, the forum that is meant to coordinate the mobile operators’ requirements vis-à-vis the vendors, recently issued a useful document setting out what technologies they wanted to see in the eventual solution or at least have considered in the standards process.

Incremental progress: ‘4.5G’

For a start, NGMN includes a variety of incremental improvements that promise substantially more capacity. These are things like higher modulation, developing the carrier-aggregation features in LTE-A to share spectrum between cells as well as within them, and improving interference coordination between cells. These are uncontroversial and are very likely to be deployed as incremental upgrades to existing LTE networks long before 5G is rolled out or even finished. This is what some vendors, notably Huawei, refer to as 4.5G.

Better antennas, beamforming, etc.

More excitingly, NGMN envisages some advanced radio features. These include beamforming, in which the shape of the radio beam between a base station and a mobile station is adjusted, taking advantage of the diversity of users in space to re-use the available radio spectrum more intensely, and both multi-user and massive MIMO (Multiple Input/Multiple Output). Massive MIMO simply means using many more antennas – at the moment the latest equipment uses 8 transmitter and 8 receiver antennas (8T*8R), whereas 5G might use 64. Multi-user MIMO uses the variety of antennas to serve more users concurrently, rather than just serving them faster individually. These promise quite dramatic capacity gains, at the cost of more computationally intensive software-defined radio systems and more complex antenna designs. Although they are cutting-edge, it’s worth pointing out that 802.11ac Wave 2 WiFi devices shipping now have these features, and it is likely that the WiFi ecosystem will hold a lead in them for some considerable time.

New spectrum

NGMN also sees evolution towards 5G in terms of spectrum. We can divide this into a conservative and a radical phase – in the first, conservative phase, 5G is expected to start using bands below 6GHz, while in the second, radical phase, the centimetre/millimetre-wave bands up to and above 30GHz are in discussion. These promise vastly more bandwidth, but as usual will demand a higher density of smaller cells and lower transmitter power levels. It’s worth pointing out that it’s still unclear whether 6GHz will make the agenda for this year’s WRC-15 conference, and 60GHz may or may not be taken up in 2019 at WRC-19, so spectrum policy is a critical path for the whole project of 5G.

Full duplex radio – doubling capacity in one stroke

Moving on, we come to some much more radical proposals and exotic technologies. 5G may use the emerging technology of full-duplex radio, which leverages advances in hardware signal processing to get rid of self-interference and make it possible for radio devices to send and receive at the same time on the same frequency, something hitherto thought impossible and a fundamental issue in radio. This area has seen a lot of progress recently and is moving from an academic research project towards industrial status. If it works, it promises to double the capacity provided by all the other technologies together.

A new, flatter network architecture?

A major redesign of the network architecture is being studied. This is highly controversial. A new architecture would likely be much “flatter” with fewer levels of abstraction (such as the encapsulation of Internet traffic in the GTP protocol) or centralised functions. This, however, would be a very radical break with the GSM-inspired practice that worked in 2G, 3G, and in an adapted form in 4G. However, the very demanding latency targets we will discuss in a moment will be very difficult to satisfy with a centralised architecture.

Content-centric networking

Finally, serious consideration is being given to what the NGMN calls information-based networking, better known to the wider community as name-based networking, named-data networking, or content-centric networking, as TCP-Reno inventor Van Jacobson called it when he introduced the concept in a now-classic lecture. The idea here is that the Internet currently works by mapping content to domain names to machines. In content-centric networking, users request some item of content, uniquely identified by a name, and the network finds the nearest source for it, thus keeping traffic localised and facilitating scalable, distributed systems. This would represent a radical break with both GSM-inspired and most Internet practice, and is currently very much a research project. However, code does exist and has even been implemented on OpenFlow, and IETF standardisation is under way.

The mother of all stretch targets

5G is already a term associated with implausibly grand theoretical maxima, like every G before it. However, the NGMN has the advantage that it is a body that serves first of all the interests of the operators (its customers) rather than the vendors. Its expectations are therefore substantially more interesting than some of the vendors’ propaganda material. It has also recently started to reach out to other stakeholders, such as manufacturing companies involved in the Internet of Things.

Reading the NGMN document raises some interesting issues about the definition of 5G. Rather than set targets in an absolute sense, it puts forward parameters for a wide range of different use cases. A common criticism of the 5G project is that it is over-ambitious in trying to serve, for example, low bandwidth ultra-low power M2M monitoring networks and ultra-HD multicast video streaming with the same network. The use cases and performance requirements NGMN has defined are so diverse that they might indeed be served by different radio interfaces within a 5G infrastructure, or even by fully independent radio networks. Whether 5G ends up as “one radio network to rule them all”, an interconnection standard for several radically different systems, or something in between (for example, a radio standard with options, or a common core network and specialised radios) is very much up for debate.

In terms of speed, NGMN is looking for 50Mbps user throughput “everywhere”, with half that speed available uplink. Success is defined here at the 95th percentile, so this means 50Mbps to 95% geographical coverage, 95% of the time. This should support handoff up to 120Km/h. In terms of density, this should support 100 users/square kilometre in rural areas and 400 in suburban areas, with 10 and 20 Gbps/square km capacity respectively. This seems to be intended as the baseline cellular service in the 5G context.

In the urban core, downlink of 300Mbps and uplink of 50Mbps is required, with 100Km/h handoff, and up to 2,500 concurrent users per square kilometre. Note that the density targets are per-operator, so that would be 10,000 concurrent users/sq km when four MNOs are present. Capacity of 750Gbps/sq km downlink and 125Gbps/sq km uplink is required.

An extreme high-density scenario is included as “broadband in a crowd”. This requires the same speeds as the “50Mbps anywhere” scenario, with vastly greater density (150,000 concurrent users/sq km or 30,000 “per stadium”) and commensurately higher capacity. However, the capacity planning assumes that this use case is uplink-heavy – 7.5Tbps/sq km uplink compared to 3.75Tbps downlink. That’s a lot of selfies, even in 4K! The fast handoff requirement, though, is relaxed to support only pedestrian speeds.

There is also a femtocell/WLAN-like scenario for indoor and enterprise networks, which pushes speed and capacity to their limits, with 1Gbps downlink and 500Mbps uplink, 75,000 concurrent users/sq km or 75 users per 1000 square metres of floor space, and no significant mobility. Finally, there is an “ultra-low cost broadband” requirement with 10Mbps symmetrical, 16 concurrent users and 16Mbps/sq km, and 50Km/h handoff. (There are also some niche cases, such as broadcast, in-car, and aeronautical applications, which we propose to gloss over for now.)
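The density and capacity figures in these scenarios are internally consistent: in each case, aggregate capacity per square kilometre is simply concurrent users multiplied by per-user throughput. A quick sketch, using only the NGMN figures quoted above:

```python
# Sanity-check of the NGMN per-square-kilometre targets quoted above:
# aggregate capacity = concurrent users x per-user throughput.
# All input figures are taken from the NGMN requirements as cited in the text.

def capacity_gbps(users_per_sqkm: int, per_user_mbps: int) -> float:
    """Aggregate capacity in Gbps/sq km implied by the user density."""
    return users_per_sqkm * per_user_mbps / 1000

# Urban core: 2,500 users/sq km at 300 Mbps down / 50 Mbps up
assert capacity_gbps(2_500, 300) == 750   # 750 Gbps/sq km downlink, as quoted
assert capacity_gbps(2_500, 50) == 125    # 125 Gbps/sq km uplink, as quoted

# Broadband in a crowd: 150,000 users/sq km, uplink-heavy at 50 Mbps up
assert capacity_gbps(150_000, 50) == 7_500  # 7.5 Tbps/sq km uplink, as quoted

# Density targets are per-operator, so with four MNOs present:
print(4 * 2_500)  # 10,000 concurrent users/sq km across operators
```

The interesting implication is that the targets assume every concurrent user gets the full per-user rate simultaneously, which is what makes them such a stretch.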

Clearly, the solution will have to either be very flexible, or else be a federation of very different networks with dramatically different radio properties. It would, for example, probably be possible to aggregate the 50Mbps everywhere and ultra-low cost solutions – arguably the low-cost option is just the 50Mbps option done on the cheap, with fewer sites and low-band spectrum. The “broadband in a crowd” option might be an alternative operating mode for the “urban core” option, turning off handoff, pulling in more aggregated spectrum, and reallocating downlink and uplink channels or timeslots. But this does begin to look like at least three networks.

Latency: the X factor

Another big stretch, and perhaps the most controversial issue here, is the latency requirement. NGMN draws a clear distinction between what it calls end-to-end latency, aka the familiar round-trip time measurement from the Internet, and user-plane latency, defined thus:

Measures the time it takes to transfer a small data packet from user terminal to the Layer 2 / Layer 3 interface of the 5G system destination node, plus the equivalent time needed to carry the response back.

That is to say, user-plane latency measures how long the 5G network itself, strictly speaking, takes to respond to user requests and to carry packets across it. NGMN points out that the two metrics are equivalent if the target server is located within the 5G network. Both are defined using small packets (and therefore negligible serialisation delay) and assume zero processing delay at the target server. The targets are 10ms end-to-end, 1ms for special use cases requiring very low latency, and 50ms end-to-end for the “ultra-low cost broadband” use case. The low-latency use cases tend to be things like communication between connected cars, which will probably fall under the direct device-to-device (D2D) element of 5G; nevertheless, some vendors seem to think the 1ms figure applies to infrastructure as well as D2D, so the requirement should be read against the 5G user-plane latency metric.

This latency target is arguably the biggest stretch of all, but also perhaps the most valuable.

The lower bound on any measurement of latency is very simple – it is the time it takes to physically reach the target server at the speed of light. Latency is therefore intimately connected with distance. It is also intimately connected with speed – protocols like TCP use it to determine how many bytes they can risk “in flight” before getting an acknowledgement, and hence how much useful throughput can be derived from a given theoretical bandwidth. And with faster data rates, more of the total time it takes to deliver something is taken up by latency rather than transfer.
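The throughput effect can be illustrated with the classic TCP window arithmetic: with a fixed window, throughput is capped at window size divided by round-trip time, whatever the nominal link speed. A sketch under that simplification, using the classic 64KB pre-window-scaling TCP limit:

```python
# Throughput ceiling for a fixed TCP window: window / RTT,
# independent of the link's nominal bandwidth.
def max_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Return the maximum achievable throughput in Mbps."""
    return window_bytes * 8 / (rtt_ms / 1000) / 1e6

print(round(max_throughput_mbps(65_535, 50), 1))  # 10.5 Mbps over a 50ms path
print(round(max_throughput_mbps(65_535, 1), 1))   # 524.3 Mbps over a 1ms path
```

The same gigabit link delivers fifty times more useful throughput at 1ms than at 50ms – which is why latency, not raw bandwidth, is so often the binding constraint.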

And the way we build applications now tends to make latency, and especially the variance in latency known as jitter, more important. In order to handle the scale demanded by the global Internet, it is usually necessary to scale out by breaking up the load across many, many servers. In order to make this work, it is usually also necessary to disaggregate the application itself into numerous, specialised, and independent microservices. (We strongly recommend Mary Poppendieck’s presentation at the link.)

The result of this is that a popular app or Web page might involve calls to dozens or hundreds of different services. Google.com includes 31 HTTP requests these days, and Amazon.com 190. If the variation in latency is not carefully controlled, it becomes statistically more likely than not that a typical user will encounter at least one server’s 99th percentile performance. (eBay tries to identify users getting slow service and serve them a deliberately cut-down version of the site – see slide 17 here.)
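The tail-latency arithmetic behind that claim is simple: if each backend call independently has a 1-in-100 chance of landing in its 99th percentile, the chance of a page avoiding all such slow responses falls off geometrically with the number of calls. A sketch under that independence assumption:

```python
# Probability that at least one of n backend calls hits its
# 99th-percentile latency, assuming the calls are independent.
def p_at_least_one_slow(n_calls: int, tail_quantile: float = 0.99) -> float:
    return 1 - tail_quantile ** n_calls

print(round(p_at_least_one_slow(31), 3))   # 0.268 -> ~27% for Google.com's 31 requests
print(round(p_at_least_one_slow(190), 3))  # 0.852 -> ~85% for Amazon.com's 190
```

At 190 requests the "more likely than not" claim is comfortably met: the typical page view is dominated by the worst percentile of whichever service responds slowest.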

We discuss this in depth in a Telco 2.0 Blog entry here.

Latency: the challenge of distance

It is worth pointing out here that the 5G targets can literally be translated into kilometres. The rule of thumb for speed-of-light delay is 4.9 microseconds per kilometre of fibre with a refractive index of 1.47. 1ms – 1,000 microseconds – therefore equals about 204km in a straight line, assuming no routing delay. A response back is needed too, so divide that distance in half. As a result, to comply with the NGMN 5G requirements, all the network functions required to process a data call must be physically located within about 100km, i.e. 1ms, of the user. And if the end-to-end requirement is taken seriously, the applications or content users want must also be hosted within about 1,000km, i.e. 10ms, of the user. (In practice, serialisation, routing, and processing at the target server will contribute some delay, so the real constraint would be somewhat tighter.)
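The conversion from latency budget to serving radius can be sketched directly from the 4.9µs/km rule of thumb, halving the budget for the return trip (routing and processing delay are ignored here, so real radii would be smaller):

```python
US_PER_KM = 4.9  # one-way fibre delay per km, refractive index ~1.47

def max_radius_km(rtt_budget_ms: float) -> float:
    """Maximum fibre distance to a server for a given round-trip budget."""
    one_way_us = rtt_budget_ms * 1000 / 2  # halve: request out, response back
    return one_way_us / US_PER_KM

print(round(max_radius_km(1)))   # ~102 km for the 1ms user-plane target
print(round(max_radius_km(10)))  # ~1020 km for the 10ms end-to-end target
```

These are the ~100km and ~1,000km contours referred to in the text.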

To achieve this, the architecture of 5G networks will need to change quite dramatically. Centralisation suddenly looks like the enemy, and middleboxes providing video optimisation, deep packet inspection, policy enforcement, and the like will have no place. At the same time, protocol designers will have to think seriously about localising traffic – this is where the content-centric networking concept comes in. Given the number of interested parties in the subject overall, it is likely that there will be a significant period of ‘horse-trading’ over the detail.

It will also need nothing more or less than a CDN and data-centre revolution. Content, apps, or commerce hosted within this 1,000km contour will have a very substantial competitive advantage over sites that do not adapt their hosting strategy to take advantage of lower latency. Telecoms operators, by the same token, will have to radically decentralise their networks to get their systems within the 100km contour. Sites that move closer in still – to the 5ms/500km contour or beyond – will benefit even more. The idea of centralising everything into shared services and global cloud platforms suddenly looks dated. So might the enormous hyperscale data centres one day look like the IT equivalent of sprawling, gas-guzzling suburbia? And will mobile operators become a key actor in the data-centre economy?

  • Executive Summary
  • Introduction
  • 5G – cutting through the hype
  • A stronger definition: a collection of related technologies
  • The mother of all stretch targets
  • Latency: the X factor
  • Latency: the challenge of distance
  • The economic value of snappier networks
  • Only Half The Application Latency Comes from the Network
  • Disrupt the cloud
  • The cloud is the data centre
  • Have the biggest data centres stopped getting bigger?
  • Mobile Edge Computing: moving the servers to the people
  • Conclusions and recommendations
  • Regulatory and political impact: the Opportunity and the Threat
  • Telco-Cloud or Multi-Cloud?
  • 5G vs C-RAN
  • Shaping the 5G backhaul network
  • Gigabit WiFi: the bear may blow first
  • Distributed systems: it’s everyone’s future

 

  • Figure 1: Latency = money in search
  • Figure 2: Latency = money in retailing
  • Figure 3: Latency = money in financial services
  • Figure 4: Networking accounts for 40-60 per cent of Facebook’s load times
  • Figure 5: A data centre module
  • Figure 6: Hyperscale data centre evolution, 1999-2015
  • Figure 7: Hyperscale data centre evolution 2. Power density
  • Figure 8: Only Facebook is pushing on with ever bigger data centres
  • Figure 9: Equinix – satisfied with 40k sq ft
  • Figure 10: ETSI architecture for Mobile Edge Computing

 

Google’s MVNO: What’s Behind it and What are the Implications?

Google’s core business is under pressure

Google, the undisputed leader in online advertising and tech industry icon, has more problems than you might think. The grand narrative is captured in the following chart, showing basic annual financial metrics for Google, Inc. between 2009 and 2014.

Figure 1: Google’s margins have eroded substantially over time

Source: STL Partners, Google 10-K filing

This is essentially the classic problem of commoditisation. The IT industry has been structurally deflationary throughout its existence, which has always posed problems for its biggest successes – how do you maintain profitability in a business where prices only ever fall? Google is growing in terms of volume, but its margins are sliding, and as a result, profitability is growing much more slowly than revenue. Since 2010, the operating margin has shrunk from around 35% to around 25%, a period during which a major competitor emerged (Facebook) and Google initiated a variety of major investments, research projects, and flirted with manufacturing hardware (through the Motorola acquisition).

And it could get worse. In its most recent 10-K filing, Google says: “We anticipate downward pressure on our operating margin in the future.” It cites increasing competition and increased expenditures, while noting that it is becoming more reliant on lower margin products: “The margin on the sale of digital content and apps, advertising revenues from mobile devices and newer advertising formats are generally less than the margin on revenues we generate from advertising on our websites on traditional formats.”

Google remains massively dependent on a commoditising advertising business

Google is very, very dependent on selling advertising for revenue. It does earn some revenue from content, but most of this is generated from the ContentID program, which places adverts on copyrighted material and shares revenue with the rightsholder, and therefore amounts to much the same thing. Over the past two years, Google has actually become more advert-dominated, as Figure 2 shows. Advertising revenues are not only vastly greater than non-advertising revenues, they are growing much faster and increasing as a share of the total. Over-reliance on the fickle and fast-changing advertising market is obviously risky. Also, while ad brokering is considered a high-margin business, Google’s margins are now at the same level as AT&T’s.

Figure 2: Not only is Google overwhelmingly dependent on advertising, advertising revenue is growing faster than non-advertising

Source: STL Partners, Google 10-K

The growth rate of non-advertising revenue at Google has slowed sharply since last year. It is now growing more slowly than either advertising on Google properties, or in the Google affiliate network (see Figure 3).

Figure 3: Google’s new-line businesses are growing slower than the core business

Source: STL Partners, Google 10-K

At the same time, the balance has shifted a little between Google’s own properties (such as Google.com) and its affiliate network. Historically, more and more Google revenue has come from its own inventory and less from placing ads on partner sites. Costs arise from the affiliate network because Google pays out revenue share to the partner sites, known as traffic-acquisition costs or TACs. Own-account ad inventory, however, isn’t free – Google has to create products to place advertising in, and this causes it to incur R&D expenditures.

In a real sense, R&D is the equivalent of TAC for the 60-odd per cent of Google’s business that occurs on its own web sites. Google’s engineering excellence, and perhaps economies of scale, mean that generating ad inventory via product creation might be a better deal than paying out revenue share to hordes of bloggers or app developers, and Figure 4 shows this is indeed the case. R&D makes up a much smaller percentage of revenue from Google properties than TAC does of revenue from the affiliate network.

Figure 4: R&D is a more efficient means of generating ad inventory than affiliate payouts

Source: STL Partners, Google 10-K

Note that although TAC might well be rising, the spike for Q4 2014 is probably a seasonal effect – Q4 is likely to be a quarter when a lot of adverts get clicked across the web.

 

  • Executive Summary
  • Google’s core business is under pressure
  • Google remains massively dependent on a commoditising advertising business
  • Google spends far more on R&D and capex than Apple
  • But while costs soar, Google ad pricing is falling
  • Google also has very high running costs
  • The threats from Facebook and Apple are real
  • Google MVNO: a strategic initiative
  • What do you need to make a mini-carrier?
  • The Google MVNO will launch into a state of price war
  • How low could the Google MVNO’s prices be?
  • Google’s MVNO: The Strategic Rationale
  • Option 1: Ads
  • Option 2: Straightforward carrier business model
  • Option 3: Android-style strategic initiative vs MNOs
  • Option 4: Anti-Apple virus, 2.0
  • Conclusions

 

  • Figure 1: Google’s margins have eroded substantially over time
  • Figure 2: Not only is Google overwhelmingly dependent on advertising, advertising revenue is growing faster than non-advertising
  • Figure 3: Growth in Google’s new-line businesses is now slower than in the core business
  • Figure 4: R&D is a more efficient means of generating ad inventory than affiliate payouts
  • Figure 5: Google spends a lot of money on research
  • Figure 6: Proportionately, Google research spending is even higher
  • Figure 7: Google’s dollar capex is almost identical to vastly bigger Apple’s
  • Figure 8: Google is startlingly capex-intensive compared to Apple, especially for an ad broker versus a global manufacturing titan
  • Figure 9: Google’s ad pricing is declining, and volume growth paused for most of 2014
  • Figure 10: Google is a more expensive company to run than Apple
  • Figure 11: The aircraft hangar Google leases from NASA
  • Figure 12: Facebook is pursuing quality over quantity in ad placement
  • Figure 13: Facebook is gradually closing the gap on Google in digital advertising
  • Figure 14: Despite a huge revenue quarter, Facebook’s Q4 saw a sharp hit to margin
  • Figure 15: Facebook’s margin hit is explained by the rise in R&D spending
  • Figure 16: Apple’s triumph – a terrible Q4 for the Android ecosystem
  • Figure 17: Price disruption in France and in the United States
  • Figure 18: Price disruption in the US – this is only the beginning
  • Figure 19: Defending AT&T and Verizon Wireless’ ARPU comes at a price
  • Figure 20: Modelling the service price of a mini-carrier
  • Figure 21: A high WiFi offload rate could make Google’s pricing aggressive
  • Figure 22: Handset subsidies are alive and well at T-Mobile

 

Key Questions for NextGen Broadband Part 1: The Business Case

Introduction

It’s almost a cliché to talk about “the future of the network” in telecoms. We all know that broadband and network infrastructure is a never-ending continuum that evolves over time – its “future” is continually being invented and reinvented. We also all know that no two networks are identical, and that despite standardisation there are always specific differences, because countries, regulations, user-bases and legacies all vary widely.

But at the same time, the network clearly matters still – perhaps more than it has for the last two decades of rapid growth in telephony and SMS services, which are now dissipating rapidly in value. While there are certainly large swathes of the telecom sector benefiting from content provision, commerce and other “application-layer” activities, it is also true that the bulk of users’ perceived value is in connectivity to the Internet, IPTV and enterprise networks.

The big question is whether CSPs can continue to convert that perceived value from users into actual value for the bottom-line, given the costs and complexities involved in building and running networks. That is the paradox.

While the future will continue to feature a broader set of content/application revenue streams for telcos, it will also need to support not just more and faster data connections, but a set of new challenges and opportunities. Top of the list is support for “Connected Everything” – the so-called Internet of Things, smart homes, connected cars, mobile healthcare and so on. There is a significant chance that many of these will not involve connection via the “public Internet”, and therefore scope for new forms of connectivity proposition to evolve – faster or lower-powered networks, or perhaps even the semi-mythical “QoS”, which, if not paid for directly, could perhaps be integrated into compelling packages and data-service bundles. There is also the potential for “in-network” value to be added through SDN and NFV – for example, via distributed servers close to the edge of the network, “orchestrated” appropriately by the operator. But does this add more value than investing in more web/OTT-style applications and services, de-coupled from the network?

Again, this raises questions about technology, business models – and the practicalities of making it happen.

This plays directly into the concept of the revenue “hunger gap” we have analysed for the past two years – without ever-better (but more efficient) networks, the telecom industry is going to get further squeezed. While service innovation is utterly essential, it also seems to be slow-moving and patchy. The network part of telcos needs to run just to stand still. Consumers will adopt more and faster devices, better cameras and displays, and expect network performance to keep up with their 4K videos and real-time games, without paying more. Depending on the trajectory of regulatory change, we may also see more consolidation among parts of the service provider industry, more quad-play networks, more sharing and wholesale models.

We also see communications networks and applications permeating deeper into society and government. There is a sense among some policymakers that “telecoms is too important to leave up to the telcos”, with initiatives like Smart Cities and public-safety networks often becoming decoupled from the mainstream of service providers. There is an expectation that technology – and by extension, networks – will enable better economies, improved healthcare and education, safer and more efficient transport, mechanisms for combatting crime and climate change, and new industries and jobs, even as old ones become automated and robotised.

Figure 1 – New services are both network-integrated & independent

Source: STL Partners

And all of this generates yet more uncertainty, with yet more questions – some about the innovations needed to support these new visions, but also whether they can be brought to market profitably, given the starting-point we find ourselves at, with fragmented (yet growing) competition, regulatory uncertainty, political interference – and often, internal cultural barriers within the CSPs themselves. Can these be overcome?

A common theme from the section above is “Questions”. This document – and a forthcoming “sequel” – is intended to group, lay out and introduce the most important ones. Most observers just tend to focus on a few areas of uncertainty, but in setting up the next year or so of detailed research, Telco 2.0 wants to fully list and articulate all of the hottest issues. Only once they are collated, can we start to work out the priorities – and inter-dependencies.

Our belief is that all of the detailed questions on “Future Networks” can, in fact, be tied back to one of two broader, overarching themes:

  • What are the business cases and operational needs for future network investment?
  • Which disruptions (technological or other) are expected in the future?

The business case theme is covered in this document. It combines future costs (spectrum, 4G/5G/fibre deployments, network-sharing, virtualisation, BSS/OSS transformation etc.) and revenues (data connectivity, content, network-integrated service offerings, new Telco 2.0-style services and so on). It also encompasses what is essential to make the evolution achievable, in terms of organisational and cultural transformation within telcos.

A separate Telco 2.0 document, to be published in coming weeks, will cover the various forthcoming disruptions. These are expected to include new network technologies that will ultimately coalesce to form 5G mobile and new low-power wireless, as well as FTTx and DOCSIS cable evolution. In addition, virtualisation in both NFV and SDN guises will be hugely transformative.

There is also a growing link between mobile and fixed domains, reflected in quad-play propositions, industry consolidation, and the growth of small-cells and WiFi with fixed-line backhaul. In addition, to support future service innovation, there need to be adequate platforms for both internal and external developers, as well as a meaningful strategy for voice/video which fits with both network and end-user trends. Beyond the technical, additional disruption will be delivered by regulatory change (for example on spectrum and neutrality), and also a reshaped vendor landscape.

The remainder of this report lays out the first five of the Top 10 most important questions for the Future Network. We can’t give definitive analyses, explanations or “answers” in a report of this length – and indeed, many of them are moving targets anyway. But by taking a holistic approach – laying out each question properly, where it comes from and what the “moving parts” are – we help to define the landscape. The objective is to help management teams apply those same filters to their own organisations, understand how costs can be controlled and revenues garnered, see where consolidation and regulatory change might help or hinder, and deal with users’ and governments’ increasing expectations.

The 10 Questions also lay the ground for our new Future Network research stream, forthcoming publications and comment/opinion.

Overview: what is the business case for Future Networks?

As later sections of both this document and the second in the series cover, there are various upcoming technical innovations in the networking pipeline. Numerous advanced radio technologies underpin 4.5G and 5G, there is ongoing work to improve fibre and DSL/cable broadband, virtualisation promises much greater flexibility in carrier infrastructure and service enablement, and so on. But all those advances are predicated on either (ideally) more revenues, or at least reduced costs to deploy and operate. All require economic justification for investment to occur.

This is at the core of the Future Networks dilemma for operators – what is the business case for ongoing investment? How can the executives, boards of directors and investors be assured of returns? We all know about the ongoing shift of business & society online, the moves towards smarter cities and national infrastructure, changes in entertainment and communication preferences and, of course, the Internet of Things – but how much benefit and value might accrue to CSPs? And is that value driven by network investments, or should telecom companies re-focus their investments and recruitment on software, content and the cloud?

This is not a straightforward question. There are many in the industry who assert that “the network is the key differentiator & source of value”, while others counter that it is a commodity and that “the real value is in the services”.

What is clear is that better/faster networks will be needed in any case, to achieve some of the lofty goals that are being suggested for the future. However, it is far from clear how much of the overall value-chain profit can be captured from just owning the basic machinery – recent years have shown a rapid de-coupling of network and service, apart from a few areas.

In the past, networks largely defined the services offered – most notably broadband access, phone calls and SMS, as well as cable TV and IPTV. But with the ubiquitous rise of Internet access and service platforms/gateways, an ever-increasing amount of service “logic” is located on the web, or in the cloud – not enshrined in the network itself. This is an important distinction – some services are abstracted and designed to be accessed from any network, while others are intimately linked to the infrastructure.

Over the last decade, the prevailing shift has been towards network-independent services. In many ways “the web has won”. Potentially this trend may reverse in future, though, as servers and virtualised, distributed cloud capabilities get pushed down into localised network elements. That, however, brings its own new complexities, uncertainties and challenges – it is a brave (or foolhardy) telco CEO who would bet the company on new in-network service offers alone. We will also see API platforms expose network “capabilities” to the web/cloud – for example, W3C is working on standards to allow web developers to gain insights into network congestion, or users’ data-plans.

But currently, the trend is for broadband access and (most) services to be de-coupled. Nonetheless, some operators seem to have been able to make clever pricing, distribution and marketing decisions (supported by local market conditions and/or regulation) to enable bundles to be made desirable.

US operators, for example, have generally fared better than European CSPs, in what should have been comparably-mature markets. But was that due to a faster shift to 4G networks? Or other factors, such as European telecom fragmentation and sub-scale national markets, economic pressures, or perhaps a different legacy base? Did the broad European adoption of pre-paid (and often low-ARPU) mobile subscriptions make it harder to justify investments on the basis of future cashflows – or was it more about the early insistence that 2.6GHz was going to be the main “4G band”, with its limitations later coming back to bite people? It is hard to tease apart the technology issues from the commercial ones.

Similar differences apply in the fixed-broadband world. Why has adoption and typical speed varied so much? Why have some markets preferred cable to DSL? Why are fibre deployments patchy and very nation-specific? Is it about the technology involved – or the economy, topography, government policies, or the shape of the TV/broadcast sector?

Understanding these issues – and, once again, articulating the questions properly – is core to understanding the future for CSPs’ networks. We are in the middle of 4G rollout in most countries, with operators looking at the early requirements for 5G. SDN and NFV are looking important – but their exact purpose, value and timing still remain murky, despite the clear promises. Can fibre rollouts – FTTC or FTTH – still be justified in a world where TV/video spend is shifting away from linear programming and towards online services such as Netflix?

Given all these uncertainties, it may be that network investments get slowed down – or else that consolidation, government subsidy or other top-level initiatives are needed to stimulate them. On the other hand, it could be the case that reduced capex and opex – perhaps through outsourcing, sharing or software-based platforms, or even open-source technology – make the numbers work out well, even for raw connectivity. Certainly, the last few years have seen rising expenditure by end-users on mobile broadband, even if it has also contributed to the erosion of legacy services such as telephony and SMS, by enabling more modern/cheaper rivals. We have also seen a shift to lower-cost network equipment and software suppliers, and an emphasis on “off the shelf” components, or open interfaces, to reduce lock-in and encourage competition.

The following sub-sections each frame a top-level, critical question relating to the business case for Future Networks:

  • Will networks support genuinely new services & enablers/APIs, or just faster/more-granular Internet access?
  • Speed, coverage, performance/QoS… what actually generates network value? And does this derive from customer satisfaction, new use-cases, or other sources?
  • Does quad-play and fixed-mobile convergence win?
  • Consolidation, network-sharing & wholesale: what changes?
  • Telco organisation and culture: what needs to change to support future network investments?

 

  • Executive Summary
  • Introduction
  • Overview: what is the business case for Future Networks?
  • Supporting new services or just faster Internet?
  • Speed, coverage, quality…what is most valuable?
  • Does quad-play & fixed-mobile convergence win?
  • Consolidation, network-sharing & wholesale: what changes?
  • Telco organisation & culture: what changes?
  • Conclusions

 

  • Figure 1 – New services are both network-integrated & independent
  • Figure 2 – Mobile data device & business model evolution
  • Figure 3 – Some new services are directly enabled by network capabilities
  • Figure 4 – Network investments ultimately need to map onto customers’ goals
  • Figure 5 – Customers put a priority on improving indoor/fixed connectivity
  • Figure 6 – Notional “coverage” does not mean enough capacity for all apps
  • Figure 7 – Different operator teams have differing visions of the future
  • Figure 8 – “Software telcos” may emulate IT’s “DevOps” organisational dynamic

 

Connected Car: Key Trends, Players and Battlegrounds

Introduction: Putting the Car in Context

A growing mythology around M2M and the Internet of Things

The ‘Internet of Things’, which is sometimes used interchangeably with ‘machine-to-machine’ communication (M2M), is not a new idea: as a term, it was coined by Kevin Ashton as early as 1999. Although initially focused on industrial applications, such as the use of RFID for tagging items in the supply chain, usage of the term has now evolved to more broadly describe the embedding of sensors, connectivity and (to varying degrees) intelligence into traditionally ‘dumb’ environments. Figure 1 below outlines some of the service areas potentially disrupted, enabled or enhanced by the Internet of Things (IoT):

Figure 1: Selected Internet of Things service areas

Source: STL Partners

To put the IoT in context, one can conceive of the Internet as having experienced three key generations to date. The first generation dates back to the 1970s, which involved ARPANET and the interconnection of various military, government and educational institutions around the United States. The second, beginning in the 1990s, can be thought of as the ‘AOL phase’, with email and web browsing becoming mainstream. Today’s generation is dominated by ‘mobile’ and ‘social’, with the two inextricably linked. The fourth generation will be signified by the arrival of the Internet of Things, in which the majority of internet traffic is generated by ‘things’ rather than humans.

The enormous growth of networks, cheaper connectivity, proliferation of smart devices, more efficient wireless protocols (e.g. ZigBee) and various government incentives/regulations have led many to confidently predict that the fourth generation of the Internet – the Internet of Things – will soon be upon us. Visions include the “Internet of Everything” (Cisco) or a “connected future” with 50 billion connected devices by 2020 (Ericsson). Similarly rapid growth is forecast by the MIT Technology Review, as detailed below:

Figure 2: Representative connected devices forecast, 2010-20

Source: MIT Technology Review

This optimism is reflected in broader market excitement, which has been intensified by such headline-grabbing announcements as Google’s $3.2bn acquisition of Nest Labs (discussed in depth in the Connected Home EB) and Apple’s recently announced Watch. Data extracted from Google Trends (Figure 3) shows that the popularity of ‘Internet of Things’ as a search term has increased fivefold since 2012:

Figure 3: The popularity of ‘Internet of Things’ as a search term on Google since 2004

Source: Google Trends

However, the IoT to date has predominantly been a case study in hype vs. reality. Technologists have argued for more than a decade about when the army of connected devices will arrive, as well as what we should be calling this phenomenon, and with this a mythology has grown around the Internet of Things: widespread disruption was promised, but it has not yet materialised. To many consumers the IoT can sound all too far-fetched: do I really need a refrigerator with a web browser?

Yet for every ‘killer app’ that wasn’t, we are now seeing inroads being made elsewhere. Smart meters are being deployed in large numbers around the world, wearable technology is rapidly increasing in popularity, and many are hailing the connected car as the ‘next big thing’. Looking at the connected car, for example, 2013 saw a dramatic increase in the amount of VC funding it received:

Figure 4: Connected car VC activity, 2010-13

Source: CB Insights Venture Capital Database

The Internet of Things is potentially an important phenomenon for all, but it is of particular relevance to mobile network operators (MNOs) and network equipment providers. Beyond providing cellular connectivity to many of these devices, the theory is that MNOs can expand across the value chain and generate material and sustainable new revenues as their core business continues to decline (for more, see the ‘M2M 2.0: New Approaches Needed’ Executive Briefing).

Nevertheless, the temptation is always to focus on the grandiose but less well-defined opportunities of the future (e.g. smart grids, smart cities) rather than the less expansive but more easily monetised ones of today. It is easy to forget that MNOs have been active to varying degrees in this space for some time: O2 UK, for example, ran a surprisingly large business serving fleet operators over the 9.6Kbps Mobitex data network for much of the 2000s. To put this context on a firmer footing, we will address three initial questions:

  1. Is there a difference between M2M and the Internet of Things?
  2. Which geographies are currently seeing the most traction?
  3. Which verticals are currently seeing the most traction?

These are now addressed in turn…

 

  • Executive Summary
  • Introduction: Putting the Car in Context
  • A growing mythology around M2M and the Internet of Things
  • The Internet of Things: a vision of what M2M can become
  • M2M today: driven by specific geographies and verticals
  • Background: History and Growth Drivers
  • History: from luxury models to mass market deployment
  • Growth drivers: macroeconomics, regulation, technology and the ‘connected consumer’
  • Ecosystem: Services and Value Chain
  • Service areas: data flows vs. consumer value proposition
  • Value chain: increasingly complex with two key battlegrounds
  • Markets: Key Geographies Today
  • Conclusions

 

  • Figure 1: Selected Internet of Things service areas
  • Figure 2: Representative connected devices forecast, 2010-20
  • Figure 3: The popularity of ‘Internet of Things’ as a search term on Google since 2004
  • Figure 4: Connected car VC activity, 2010-13
  • Figure 5: Candidate differences between M2M and the Internet of Things
  • Figure 6: Selected leading MNOs by M2M connections globally
  • Figure 7: M2M market maturity vs. growth by geographic region
  • Figure 8: Global M2M connections by vertical, 2013-20
  • Figure 9: Global passenger car profit by geography, 2007-12
  • Figure 10: A connected car services framework
  • Figure 11: Ericsson’s vision of the connected car’s integration with the IoT
  • Figure 12: The emerging connected car value chain
  • Figure 13: Different sources of in-car connectivity
  • Figure 14: New passenger car sales vs. consumer electronics spending by market
  • Figure 15: Index of digital content spending (aggregate and per capita), 2013
  • Figure 16: OEM embedded modem shipments by region, 2014-20
  • Figure 17: Telco 2.0™ ‘two-sided’ telecoms business model

New Mobile & Digital Transformation Strategies: OnFuture EMEA Executive Brainstorm 2014, Day One (Wednesday 11 June)

New Mobile & Digital Transformation Strategies. Presentations and Voting Slides from the New Mobile & Digital Transformation Strategies stream of the OnFuture EMEA Executive Brainstorm, 11th June 2014, in London.

0845 Event Start: Welcome, Agenda, Introductions & Warm Up

Andrew Collinson, COO & Research Director, STL Partners/Telco 2.0 Initiative (download here)

0900 Managing Disruptive Innovation in the Digital World

Chris Barraclough, MD & Chief Strategist, STL Partners/Telco 2.0 Initiative (download here)

Peter Briscoe, Head of Innovation, Ericsson (download here)

Paolo Campoli, Service Provider CTO, Cisco (download here)

 

In the afternoon there were two parallel streams – Communications Services and In-Home and In-Store Services:

Stream A Workshops: Communications Services – Innovation for the Consumer and the Enterprise

1345 Future Communications: Radical innovation in voice, messaging and data services

Bob Brace, Senior Analyst, STL Partners/Telco 2.0 Initiative (download here)

Rainer Deutschmann, SVP Core Product Innovation, Deutsche Telekom (download here)

Giles Corbett, Head of Libon, Orange (download here)

Dean Elwood, CEO and Founder, Voxygen (Panel Only)

Chris Barraclough, MD & Chief Strategist, STL Partners/Telco 2.0 Initiative (Moderator)

 

1430 Enterprise Mobility: A strategic approach to creating competitive advantage

Bob Brace, Senior Analyst, STL Partners/Telco 2.0 Initiative (download here)

Albane Coeurquetin, Consultant, STL Partners/Telco 2.0 Initiative (download here)

Michael Crossey, Director Product Marketing, Intel (Unavailable)

Alessandro Vigilante, VP Business Development & Strategy, Colt (download here)

Philip Laidler, Director of Consulting, STL Partners/Telco 2.0 Initiative (Moderator)

 

Stream B: In-Home and In-Store Services: the ‘Internet of Things…and of People’

1315 The ‘Internet of Things’ in the Digital Home: Towards a new ecosystem

Matt Jones, Consultant, STL Partners/Telco 2.0 Initiative (download here)

Martin Harriman, Director of Digital Home, Telefonica (download here)

Kevin Petersen, SVP, AT&T Digital Home (download here)

Pilgrim Beart, Founder, AlertMe (download here)

Philip Laidler, Director of Consulting, STL Partners/Telco 2.0 Initiative (Moderator)

 

1430 In-Store Retail: How mobile technology can revive the high street, not kill it

Owen McCabe, Director, Kantar Retail (download here)

Omaid Hiwaizi, Chief Strategy Officer, Geometry Global/WPP (download here)

Graham Cove, Director of Wi-Fi, Everything Everywhere (EE) (download here)

Chris Barraclough, MD & Chief Strategist, STL Partners/Telco 2.0 Initiative (Moderator)

 


Final plenary session:

1615 Mobile Payments: Creating a viable ecosystem that enables true ‘mobile commerce’

Andrew Collinson, STL Partners/Telco 2.0 Initiative (Moderator)

Holger Rambach, VP Products & Innovation, Deutsche Telekom (download here)

David Pringle, Senior Associate, STL Partners/Telco 2.0 Initiative (Panel Only)

Phil Laidler, Director of Consulting, STL Partners/Telco 2.0 Initiative (Panel Only)

 

Voting slides from Day 1: New Mobile & Digital Transformation Strategies

 

Click here for Day 2 presentations – Next Generation Mobile Marketing & Commerce
Click here to go back to the main OnFuture EMEA London page

Facing Up to the Software-Defined Operator

Introduction

At this year’s Mobile World Congress, the GSMA’s eccentric decision to split the event between the Fira Gran Via (the “new Fira”, as everyone refers to it) and the Fira Montjuic (the “old Fira”, as everyone refers to it) was a better one than it looked. If you took the special MWC shuttle bus from the main event over to the developer track at the old Fira, you crossed a culture gap that is widening, not closing. The very fact that the developers were accommodated separately hints at this, but it was the content of the sessions that brought it home. At the main site, it was impressive and forward-thinking to say you had an app, and a big deal to launch a new Web site; at the developer track, presenters would start up a Web service during their own talk to demonstrate their point.

There has always been a cultural rift between the “netheads” and the “bellheads”, of which this is just the latest manifestation. But the content of the main event tended to suggest that this is an increasingly serious problem. Everywhere, we saw evidence that core telecoms infrastructure is becoming software. Major operators are moving towards this now. For example, AT&T used the event to announce that it had signed up Software Defined Networks (SDN) specialists Tail-F and Metaswitch Networks for its next round of upgrades, while Deutsche Telekom’s Terastream architecture is built on it.

This is not just about the overused three letter acronyms like “SDN and NFV” (Network Function Virtualisation – see our whitepaper on the subject here), nor about the duelling standards groups like OpenFlow, OpenDaylight etc., with their tendency to use the word “open” all the more the less open they actually are. It is a deeper transformation that will affect the device, the core network, the radio access network (RAN), the Operations Support Systems (OSS), the data centres, and the ownership structure of the industry. It will change the products we sell, the processes by which we deliver them, and the skills we require.

In the future, operators will be divided into providers of the platform for software-defined network services and consumers of the platform. Platform consumers, which will include MVNOs, operators, enterprises, SMBs, and perhaps even individual power users, will expect a degree of fine-grained control over network resources that amounts to specifying your own mobile network. Rather than trying to make a unitary public network provide all the potential options as network services, we should look at how we can provide the impression of one network per customer, just as virtualisation gives the impression of one computer per user.

To summarise, it is no longer enough to boast that your network can give the customer an API. Future operators should be able to provision a virtual network through the API. AT&T, for example, aims to provide a “user-defined network cloud”.
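What ‘provisioning a virtual network through the API’ might mean in practice can be sketched in a few lines. This is purely illustrative: the `VirtualNetworkSpec` fields and the `provision` call below are hypothetical, not any operator’s real interface.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualNetworkSpec:
    # All fields are hypothetical illustrations of what a tenant might specify.
    tenant: str
    radio_access: list       # e.g. ["3G", "4G"] -- which radio technologies the tenant wants
    max_downlink_mbps: int   # per-subscriber rate cap
    breakout: str            # where traffic exits the network: "local" or "central"
    firewall_rules: list = field(default_factory=list)

def provision(spec: VirtualNetworkSpec) -> dict:
    """Pretend platform call: validate the spec and return a 'network' handle."""
    assert spec.max_downlink_mbps > 0, "rate cap must be positive"
    return {"tenant": spec.tenant, "status": "active",
            "slices": [f"{rat}:{spec.breakout}" for rat in spec.radio_access]}

# A hypothetical MVNO specifying its own mobile network as platform consumer
mvno = VirtualNetworkSpec(tenant="ExampleMVNO", radio_access=["3G", "4G"],
                          max_downlink_mbps=50, breakout="local")
handle = provision(mvno)
```

The point of the sketch is the division of roles: the platform provider runs `provision`, while the platform consumer merely declares what network it wants.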

Elements of the Software-Defined Future

We see five major trends leading towards the overall picture of the ‘software defined operator’ – an operator whose boundaries and structure can be set and controlled through software.

1: Core network functions get deployed further and further forwards

Because core network functions like the Mobile Switching Centre (MSC) and Home Subscriber Server (HSS) can now be implemented in software on commodity hardware, they no longer have to be tied to major vendors’ equipment deployed in centralised facilities. This frees them to migrate towards the edge of the network, providing more efficient use of transmission links and lower latency, and putting more features under the control of the customer.

Network architecture diagrams often show a boundary between “the Internet” and an “other network”. This boundary is called the ‘Gi interface’ in 3G networks (and ‘SGi’ in 4G). Today, the “other network” is usually itself an IP-based network, making this distinction simply that between a carrier’s private network and the Internet core. Moving network functions forwards towards the edge also moves this boundary forwards, making it possible for Internet services like content-delivery networking or applications acceleration to advance closer to the user.

Increasingly, the network edge is a node supporting multiple software applications, some of which will be operated by the carrier, some by third-party services like – say – Akamai, and some by the carrier’s customers.

2: Access network functions get deployed further and further back

A parallel development to the emergence of integrated small cells/servers is the virtualisation and centralisation of functions traditionally found at the edge of the network. One example is so-called Cloud RAN or C-RAN technology in the mobile context, where the radio basebands are implemented as software and deployed as virtual machines running on a server somewhere convenient. This requires high capacity, low latency connectivity from this site to the antennas – typically fibre – and this is now being termed “fronthaul” by analogy to backhaul.

Another example is the virtualised Optical Line Terminal (OLT) some vendors offer in the context of fixed Fibre to the home (FTTH) deployments. In these, the network element that terminates the line from the user’s premises has been converted into software and centralised as a group of virtual machines. Still another would be the increasingly common “virtual Set Top Box (STB)” in cable networks, where the TV functions (electronic programming guide, stop/rewind/restart, time-shifting) associated with the STB are actually provided remotely by the network.

In this case, the degree of virtualisation, centralisation, and multiplexing can be very high, as latency and synchronisation are less of a problem. The functions could actually move all the way out of the operator network, off to a public cloud like Amazon EC2 – this is in fact how Netflix does it.

3: Some business support and applications functions are moving right out of the network entirely

If Netflix can deliver the world’s premier TV/video STB experience out of Amazon EC2, there is surely a strong case to look again at which applications should be delivered on-premises, in the private cloud, or moved into a public cloud. As explained later in this note, the distinctions between on-premises, forward-deployed, private cloud, and public cloud are themselves being eroded. At the strategic level, we anticipate pressure for more outsourcing and more hosted services.

4: Routers and switches are software, too

In the core of the network, the routers that link all this stuff together are also turning into software. This is the domain of true SDN – basically, the effort to substitute relatively smart routers with much cheaper switches whose forwarding rules are generated in software by a much smarter controller node. This is well reported elsewhere, but it is necessary to take note of it. In the mobile context, we also see this in the increasing prevalence of virtualised solutions for the LTE Enhanced Packet Core (EPC), Mobility Management Entity (MME), etc.
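The division of labour SDN proposes – cheap switches that only do table lookups, plus a smarter controller that computes the forwarding rules centrally – can be sketched as a toy model. This is illustrative only; real deployments speak protocols such as OpenFlow rather than passing Python dictionaries.

```python
# Toy SDN model: the controller holds the routing logic; the switch holds none.

def controller_compute_rules(topology):
    """Centralised 'smarts': derive a match->action forwarding table per switch.
    Here we simply map destination prefixes to output ports."""
    rules = {}
    for switch, links in topology.items():
        rules[switch] = {prefix: port for prefix, port in links}
    return rules

def switch_forward(rules_for_switch, dst_prefix):
    """Dumb, cheap data plane: a pure table lookup with a default 'drop' action."""
    return rules_for_switch.get(dst_prefix, "drop")

# Hypothetical one-switch topology: prefix -> output port
topology = {"s1": [("10.0.1.0/24", "port2"), ("10.0.2.0/24", "port3")]}
rules = controller_compute_rules(topology)
out = switch_forward(rules["s1"], "10.0.1.0/24")
```

Swapping the controller logic changes the whole network’s behaviour without touching the switches – which is precisely the economic argument for the approach.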

5: Wherever it is, software increasingly looks like the cloud

Virtualisation – running software in ‘virtual machines’ abstracted from the underlying physical hardware – is a key trend. Even when, as with the network devices, software is running on a dedicated machine, it will increasingly be found running in its own virtual machine. This helps with management and security and, most of all, with resource sharing and scalability. For example, the virtual baseband might have VMs for each of 2G, 3G, and 4G. If the capacity requirements are small, many different sites might share a physical machine; if large, one site might run on several machines.

This has important implications, because it also makes sharing among users easier. Those users could be different functions, or different cell sites, but they could also be customers or other operators. It is no accident that NEC’s first virtualised product, announced at MWC, is a complete MVNO solution. It has never been as easy to provide more of your carrier needs yourself, and it will only get easier.

The following Huawei slide (from their Carrier Business Group CTO, Sanqi Li) gives a good visual overview of a software-defined network.

Figure 1: An architecture overview for a software-defined operator

Source: Huawei

 

  • The Challenges of the Software-Defined Operator
  • Three Vendors and the Software-Defined Operator
  • Ericsson
  • Huawei
  • Cisco Systems
  • The Changing Role of the Vendors
  • Who Benefits?
  • Who Loses?
  • Conclusions
  • Platform provider or platform consumer
  • Define your network sharing strategy
  • Challenge the coding cultural cringe

 

  • Figure 1: An architecture overview for a software-defined operator
  • Figure 2: A catalogue for everything
  • Figure 3: Ericsson shares (part of) the vision
  • Figure 4: Huawei: “DevOps for carriers”
  • Figure 5: Cisco aims to dominate the software-defined “Internet of Everything”

Software Defined People: How it Shapes Strategy (and us)

Introduction: software’s defining influence

Our knowledge, employment opportunities, work itself, healthcare, potential partners, purchases from properties to groceries, and much else can now be delivered or managed via software and mobile apps.

So are we all becoming increasingly ‘Software Defined’? It’s a question stimulated in part by our research on ‘Software Defined Networks (SDN): A Potential Game Changer’ and Enterprise Mobility, by this video from McKinsey and Eric Schmidt, Google’s Executive Chairman, and by a number of observations over the past year, particularly at this and last year’s Mobile World Congress (MWC).

But is software really the key?

The rapid adoption of smartphones and tablets, enabled by ever faster networks, is perhaps the most visible and tangible phenomenon in the market. Less visible but equally significant is the huge growth in ‘big data’ – the use of massive computing power to process types and volumes of data that were previously inaccessible – as well as in ‘small data’ – the increasing use of more personalised datasets.

However, what is now fuelling these trends is that many core life and business tools are now software of one form or another – programmes and ‘apps’ that create economic value, utility, fun or efficiency. Software is now the driving force; the evolving data and hardware are respectively its by-product and its enabler.

Software: your virtual extra hand

In effect, mobile software is the latest great tool in humanity’s evolutionary path. With nearly a quarter of the world’s population using a smartphone, the human race has never had so much computing power by its side in every moment of everyday life. Many feature phones also possess significant processing power, and the extraordinary reach of mobile can now deliver highly innovative solutions like mobile money transfer even in markets with relatively underdeveloped financial service infrastructure.

How we are educated, employed and cared for are all starting to change with the growing power of mobile technologies, and will all change further and with increasing pace in the next phase of the mobile revolution. Knowing how to get the best from this world is now a key life skill.

The way that software is used is changing and will change further. While mobile apps have become a mainstream consumer phenomenon in many markets in the last few years, the application of mobile, personalised technologies is also changing education, health, employment, and the very fabric of our social lives. For example:

  • Back at MWC 2013 we saw the following fascinating video from Ericsson, part of its ‘Networked Society’ vision, on why education has evolved as it has (to mass-produce workers for factories) and what the possibilities are with advanced technology – well worth a few minutes of your time whether you have kids or not.
  • We also saw this education demo video from a Singapore school from Qualcomm, based on the creative use of phones in all aspects of schooling in the WE Learn project.
  • There are now a growing number of eHealth applications (heart rate, blood pressure, stroke and outpatient care), and productivity apps and outreach of CRM applications like Salesforce into the mobile employment context are having an increasingly massive impact.
  • While originally a ‘fixed’ phenomenon, the way we meet and find partners has changed massively in recent years. For example, in the US, 17% of recent marriages and 20% of ‘committed relationships’ started in the $1Bn online dating world – another world which is now increasingly going mobile.

The growing sophistication in human-software interactivity

Horace Dediu pointed out at a previous Brainstorm that the disruptive jumps in mobile handset technology have come from changes in the user interface – most recently in the touch-screen revolution accompanying smartphones and tablets.

And the way in which we interact with the software will continue to evolve, from the touch screens of smartphones, through voice activation, gesture recognition, retina tracking, on-body devices like watches, in-body sensors in the blood and digestive system, and even potentially by monitoring brainwaves, as illustrated in the demonstration from Samsung labs shown in Figure 1.

Figure 1: Software that reads your mind?

Source: Samsung Labs

Clearly, some of these techniques are still at an early stage of development. It is a hard call as to which will be the one to trigger the next major wave of innovation (e.g. see Facebook’s acquisition of Oculus Rift), as there are so many factors that influence the likely take-up of new technologies, from price through user experience to social acceptance.

Exploring and enhancing the senses

Interactive goggles / glasses such as Google Glass have now been around for over a year, and AR applications that overlay information from the virtual world onto images of the real world continue to evolve.

Search is also becoming a visual science – innovations such as Cortexica recognise everyday objects (cereal packets, cars, signs, advertisements, stills from a film, etc.) and return information on how and where you can buy the related items. While it works from a smartphone today, it is easy to imagine a world where you open the kitchen cupboard and tell your glasses what items you want to re-order.

Screens will be in increasing abundance, able to interact with passers-by on the street or with you in your home or car. What will be on these screens could be anything that is on any of your existing screens or more – communication, information, entertainment, advertising – whatever the world can imagine.

Segmented by OS?

But is it really possible to define a person by the software they use? There is certainly an ‘a priori’ segmentation arising from device makers’ positioning:

  • Apple’s brand and design ethos have held consistently strong appeal for upmarket, creative users. In contrast, Blackberry for a long time held a strong appeal in the enterprise segment, albeit significantly weakened in the last few years.
  • It is perhaps slightly harder to label Android users, now the largest group of smartphone users. However, the openness of the software leads to freedom, bringing with it a plurality of applications and widgets, some security issues, and perhaps a greater emphasis on ‘work it out for yourself’.
  • Microsoft, once ubiquitous through its domination of the PC universe, now finds itself a challenger in the world of mobiles and tablets. Despite gradually improving sales and a well-reported OS experience and design, it has yet to find a clear identity, other than perhaps being the domain of those willing to try something different. While Microsoft still has a strong hand in the software world through its evolving Office applications, these are not yet hugely mobile-friendly, and this is creating a niche for new players, such as Evernote and others, that take a more focused ‘mobile first’ approach.

Other segments

From a research perspective, there are many other approaches to thinking about what defines different types of user. For example:

  • In adoption, the Bass Diffusion Model distinguishes segments such as Innovators, Early Adopters, the Mass Market and Laggards;
  • Segments based on attitudes to usage, e.g. Lovers, Haters, Functional Users, Social Users, Cost Conscious, etc.;
  • Approaches to privacy and the use of personal data, e.g. Pragmatic, Passive, Paranoid.
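For illustration, the Bass Diffusion Model mentioned in the list above can be computed directly from its standard closed form. The coefficients below (p for innovation, q for imitation) are arbitrary example values, not estimates for any real market.

```python
import math

def bass_cumulative(t, p=0.03, q=0.38):
    """Cumulative fraction of the eventual market that has adopted by time t.
    p = coefficient of innovation, q = coefficient of imitation (example values)."""
    e = math.exp(-(p + q) * t)
    return (1 - e) / (1 + (q / p) * e)

# Adoption starts slowly (innovators), accelerates (imitators), then saturates.
curve = [round(bass_cumulative(t), 2) for t in range(0, 21, 5)]
```

Plotting `curve` gives the familiar S-shaped adoption trajectory that underpins the Innovators-to-Laggards segmentation.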

It is tempting to hypothesise that there could be meta-segments combining these and other behavioural distinctions (e.g. you might theorise that there would be more ‘haters’ among the ‘laggards’ and the ‘paranoids’ than the ‘innovators’ and ‘pragmatics’), and there may indeed be underlying psychological drivers such as extraversion that drive people to use certain applications (e.g. personal communications) more.

However, other than anecdotal observations, we don’t currently have the data to explore or prove this. This knowledge may of course exist within the research and insight departments of major players and we’d welcome any insight that our partners and readers can contribute (please email contact@telco2.net if so).

Hypothesis: a ‘software fingerprint’?

The collection of apps and software each person uses, and how they use them, could be seen as a software fingerprint – a unique combination of tools showing interests, activities and preferences.

Human beings are complex creatures, and it may be a stretch to say a person could truly be defined by the software they use. However, there is a degree of cause and effect with software. Once you have the ability to use it, it changes what you can achieve. So while the software you use may not totally define you, it will play an increasing role in shaping you, and may ultimately form a distinctive part of your identity.
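One crude way to operationalise this ‘software fingerprint’ idea is to compare two people’s app sets with a set-similarity measure such as the Jaccard index. The app lists below are invented purely for illustration.

```python
def jaccard(apps_a, apps_b):
    """Jaccard similarity: |intersection| / |union| of the two app sets."""
    a, b = set(apps_a), set(apps_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical users and their installed apps
alice = ["maps", "banking", "salesforce", "email"]
bob   = ["maps", "minecraft", "email", "camera"]
similarity = jaccard(alice, bob)   # 2 shared apps out of 6 distinct
```

A score of 1.0 would mean identical fingerprints; near 0.0, almost nothing in common – a first, very rough proxy for shared interests and activities.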

For example, Minecraft is a phenomenally successful and addictive game. If you haven’t seen it, imagine interactive digital Lego (or watch the intro video here). Children and adults all over the world play on it, make YouTube films about their creations, and share knowledge and stories from it as with any game.

To be really good at it, and to add enhanced features, players install ‘mods’ – essentially software upgrades requiring quite sophisticated codes and procedures, and an understanding of numerous file types and locations. So through this one game, ten-year-old kids are developing creative, social and IT skills, as well as exploring and creating new identities for themselves.

Figure 2: Minecraft – building, killing ‘creepers’ and coding by a kid near you


Source: Planetminecraft.com

But who is in charge – you or the software?

There are also two broad schools of thought in advanced IT design. One is that IT should augment human abilities and its application should always be controlled by its users. The other is the idea that IT can assist people by providing recommendations and suggestions that are outside the control of the user. An example of this second approach is Google showing you targeted ads based on your search history.

Being properly aware of this will become increasingly important to individuals’ freedom from unrecognised manipulation. Just as we now know that embarrassing photos on Facebook will be seen by prospective employers, knowing who is pulling your data strings will be increasingly important to controlling one’s own destiny in the future.

Back to the law of the Jungle?

Many of the opportunities and abilities conferred by software seem perhaps trivial or entertaining. But some will ultimately confer advantages on their users over those who do not possess the extra information, gain those extra moments, or learn that extra winning idea. The questions are: which will you use well; and which will you enable others to use? The answer to the first may reflect your personal success, and the second that of your business.

So while it used to be that your genetics, parents, and education most strongly steered your path, now how you take advantage of the increasingly mobile cyber-world will be a key additional competitive asset. It’s increasingly what you use and how you use it (as well as who you know, of course) that will count.

And for businesses, competing in an ever more resource-constrained world, the effective use of software to track and manage activities and assets, and to give insight into underlying trends and ways to improve performance, is an increasingly critical competence. Importantly for telcos and other ICT providers, it’s one that is enabled and enhanced by cloud, big data, and mobile.

The Software as a Service (SaaS) application Salesforce is an excellent case in point. It brings instantaneous data on customers and business operations to managers’ and employees’ fingertips on any device. This can confer huge advantages over businesses without such capabilities.

Figure 3: Salesforce delivers big data and cloud to mobile


Source: Powerbrokersoftware.com

 

  • Executive Summary: the key role of mobile
  • Why aren’t telcos more involved?
  • Revenue Declines + Skills Shortage = Digital Hunger Gap
  • What should businesses do about it?
  • All Businesses
  • Technology Businesses and Enablers
  • Telcos
  • Next steps for STL Partners and Telco 2.0

 

  • Figure 1: Software that reads your mind?
  • Figure 2: Minecraft – building, killing ‘creepers’ and coding by a kid near you
  • Figure 3: Salesforce delivers big data and cloud to mobile
  • Figure 4: The Digital Hunger Gap for Telcos
  • Figure 5: Telcos need Software Skills to deliver a ‘Telco 2.0 Service Provider’ Strategy
  • Figure 6: The GSMA’s Vision 2020

Are Telefonica, AT&T, Ooredoo, SingTel, and Verizon aiming for the right goals?

The importance of setting Telco 2.0 goals…

Communications Service Providers (CSPs) in all markets are now embracing new Telco 2.0 business models in earnest. However, this remains a period of exploration and experimentation, and a clear Telco 2.0 goal has not yet emerged for most players. At the most basic level, senior managers and strategists face a fundamental question:

What is an appropriate Telco 2.0 goal given my organisation’s current performance and market conditions?

This note introduces a framework based on analysis undertaken for the Telco 2.0 Transformation Index and offers some initial thoughts on how to start addressing this question [1] by exploring 5 CSPs in the context of the markets in which they operate and their current business model transformation performances.

Establishing the right Telco 2.0 goal for the organisation is an important first-step for senior management in the telecoms industry because:

  • Setting a Telco 2.0 goal that is unrealistically bold will quickly result in a sense of failure and a loss of morale among employees;
  • Conversely, a lack of ambition will see the organisation squeezed slowly and remorselessly into a smaller and smaller addressable market as a utility pipe provider.

Striking the right balance is critical to avoid these two unattractive outcomes.

…and the shortcomings of traditional frameworks

Senior management teams and strategists within the telecoms industry already have tools and approaches for managing investments and setting corporate goals. So why is a fresh approach needed? Put simply, the telecoms market is being irreversibly disrupted. As we show in the first part of this note, traditional thinking and frameworks offer a view of the ‘as-is’ world – but one which is changing fast, because CSPs’ core communications services are being substituted by alternative offerings from new competitors. The game is changing before our eyes, and managers must think (and act) differently. The framework outlined in summary here, and covered in detail in the Telco 2.0 Transformation Index, is designed to facilitate this fresh thinking.

Traditional strategic frameworks are useful to assess the ‘Telco 1.0’ situation

Understanding CSP groups’ ‘Telco 1.0’ strategic positioning: Ooredoo in a position of strength

Although they lack CSP management teams’ detailed information and deep knowledge of the telecoms industry, investors have the benefit of an impartial view of different CSPs. They generally carry little personal ‘baggage’ and instead take a cold, arm’s-length approach to evaluating companies. Their investment decisions obviously weigh each company’s future profit prospects against its current share price to determine whether a stock is good value. Leaving aside share prices, how might an investor sensibly appraise the ‘traditional’ Telco 1.0 telecoms market?

One classic framework plots competitive position against market attractiveness.  STL Partners has conducted this for 5 CSP groups in different markets as part of the analysis undertaken for the Telco 2.0 Transformation Index (see Figure 1).  According to the data collected, Ooredoo appears to be in the strongest position and, therefore, the most attractive potential investment vehicle.  Telefonica and SingTel appear to be moderately attractive and, surprisingly to many, Verizon and AT&T least attractive.

Figure 1: Strategic positioning framework for 5 CSP groups

Source: STL Partners’ Telco 2.0 Transformation Index, February 2014

Determining a CSP’s Telco 1.0 competitive position: Ooredoo enjoying life in the least competitive markets

As with all analytical tools, the value of the framework in Figure 1 depends on the nature of the data collected and the methodology for converting it into comparable scores. The full data set, methodology, and scoring tables for this and other analyses are available in the Telco 2.0 Transformation Index Benchmarking Report. In this report, we will explore a small part of the data which drives part of the vertical axis scores in Figure 1 – Competitive Position (for simplicity, we exclude Customer Engagement here). In the Index methodology, 7 factors determine ‘Competitive Position’, split into 2 categories:

  • Market competition, a consolidated score driven by:
      • Herfindahl score. A standard economic indicator of competitiveness, reflecting the state of development of the underlying market structure; more consolidated markets are less competitive and score more highly.
      • Mobile revenue growth. The compound annual growth of mobile revenues over a two-year period. Growing markets generally display less competition, as individual players need to fight less hard to achieve growth.
      • Facebook penetration. A proxy for the strength of internet and other ‘OTT’ players in the market.
  • CSP market positioning, driven by:
      • CSP total subscribers. The overall size of the CSP across all its markets.
      • CSP monthly ARPU as % of GDP per capita. The ability of the CSP to provide value to consumers relative to their income – essentially the CSP’s share of consumer wallet.
      • CSP market share. Self-explanatory – the relative share of subscribers.
      • CSP market share gain/loss. The degree to which the CSP is winning or losing subscribers relative to its peers.
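The Herfindahl score referenced above is simply the sum of squared market shares. A minimal sketch, using invented shares rather than Index data:

```python
def herfindahl(shares):
    """Herfindahl index: the sum of squared fractional market shares.

    Shares should sum to roughly 1. Higher values indicate a more
    consolidated - and, in the Index methodology, less competitive - market.
    """
    return sum(s ** 2 for s in shares)

# Invented illustrative shares: a two-player market scores far higher
# than one split evenly between four players.
duopoly = herfindahl([0.6, 0.4])      # 0.52
fragmented = herfindahl([0.25] * 4)   # 0.25
```

A monopoly scores 1.0, and the score falls towards 1/n as a market fragments into n equal players.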

If we look at the first 3 factors – those that drive fundamental market competition – it is clear why Ooredoo scores highly:

  • Its markets are substantially more consolidated than those of the other players (Figure 2). Surprisingly, given the regular accusations that the US market is a duopoly, Verizon and AT&T face the most fragmented and competitive markets. For the fixed market, this may be overstated since the US, for consumer and SME segments at least, is effectively carved up into regional areas where major fixed operators like Verizon and AT&T often do not compete head-to-head.
  • Its markets enjoy the strongest mobile revenue growth at 8.1% per annum between 2010 and 2012, versus 4.6% in Telefonica’s markets (fast in Latin America and negative in Europe), 5% in the US, and an annual decline (-1.7%) for SingTel (Figure 3).
  • Facebook and the other internet players are much weaker in Ooredoo’s Middle Eastern markets than in Asia Pacific and Australia (SingTel), Europe and Latin America (Telefonica) and particularly the US (Verizon and AT&T) – see Figure 4.
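The growth figures above are compound annual rates over the two-year window. As a sketch of the arithmetic (revenue figures invented, not Index data):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two revenue observations."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Invented figures: revenue rising from 100 to ~116.9 over 2010-2012
# (2 years) corresponds to roughly 8.1% growth per annum.
growth = cagr(100.0, 116.9, 2)
```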

 Figure 2: Herfindahl Score – Ooredoo enjoys the least competitive markets


Note: Verizon and AT&T have slightly different scores owing to their different business mixes between fixed and mobile within the US market

Source: STL Partners’ Telco 2.0 Transformation Index, February 2014

Figure 3: Ooredoo enjoying the strongest mobile market growth

Source: STL Partners’ Telco 2.0 Transformation Index, February 2014

Ooredoo also operates in markets that have less competition from new players. For example, social network penetration is 56% in North America where AT&T and Verizon operate, 44% in Europe and South America where Telefonica operates, 58% in Singapore but only 34% in Qatar (Ooredoo’s main market) and 24% in the Middle East on average.

 

  • Identifying an individual CSP’s Telco 1.0 strategy: Telefonica Group in ‘harvest’ mode in most markets – holding prices, sacrificing share, generating cash
  • Frameworks used in the Telco 2.0 Transformation Index help identify evolving goals and strategies for CSPs
  • Traditional frameworks fail to account for new competitors, new services, new business models…
  • …but understanding how well each CSP is transforming to a new business model uncovers the optimum Telco 2.0 goal
  • STL Partners and the Telco 2.0™ Initiative

 

  • Figure 1: Strategic positioning framework for 5 CSP groups
  • Figure 2: Herfindahl Score – Ooredoo enjoys the least competitive markets
  • Figure 3: Ooredoo enjoying the strongest mobile market growth
  • Figure 4: Telefonica in harvest mode – milking companies for cash
  • Figure 5: Telco 2.0 Transformation Index strategic goals framework

Communications Services: What now makes a winning value proposition?

Introduction

This is an extract of two sections of the latest Telco 2.0 Strategy Report The Future Value of Voice and Messaging for members of the premium Telco 2.0 Executive Briefing Service.

The full report:

  • Shows how telcos can slow the decline of voice and messaging revenues and build new communications services to maximise revenues and relevance with both consumer and enterprise customers.
  • Includes detailed forecasts for 9 markets, in which the total decline is forecast between -25% and -46% on a $375bn base between 2012 and 2018, giving telcos an $80bn opportunity to fight for.
  • Shows impacts and implications for other technology players including vendors and partners, and general lessons for competing with disruptive players in all markets.
  • Looks at the impact of so-called OTT competition, market trends and drivers, bundling strategies, operators developing their own Telco-OTT apps, advanced Enterprise Communications services, and the opportunities to exploit new standards such as RCS, WebRTC and VoLTE.

The Transition in User Behaviour

A global change in user behaviour

In November 2012 we published European Mobile: The Future’s not Bright, it’s Brutal. Very soon after publication, we issued an update in the light of results from Vodafone and Telefonica that suggested its predictions were being borne out much faster than we had expected.

Essentially, the macro-economic challenges faced by operators in southern Europe are catalysing the processes of change we identify in the industry more broadly.

This should not be seen as a “Club Med problem”. Vodafone reported a 2.7% drop in service revenue in the Netherlands, driven by customers reducing their out-of-bundle spending. This sensitivity and awareness of how close users are getting to their monthly bundle allowances is probably a good predictor of willingness to adopt new voice and messaging applications, i.e. if a user is regularly using more minutes or texts than are included in their service bundle, they will start to look for free or lower cost alternatives. KPN Mobile has already experienced a “WhatsApp shock” to its messaging revenues. Even in Vodafone Germany, voice revenues were down 6.1% and messaging 3.7%. Although enterprise and wholesale business were strong, prepaid lost enough revenue to leave the company only barely ahead. This suggests that the sizable low-wage segment of the German labour market is under macro-economic stress, and a shock is coming.

The problem is global. At the 2013 Mobile World Congress, for example, the CEO of KT Corp described voice revenues as “collapsing” and stated that, as a result, revenues from its fixed operation had halved in two years. His counterpart at Turk Telekom asserted that “voice is dead”.

The combination of technological and macro-economic challenge results in disruptive, rather than linear change. For example, Spanish subscribers who adopt WhatsApp to substitute expensive operator messaging (and indeed voice) with relatively cheap data because they are struggling financially have no particular reason to return when the recovery eventually arrives.

Price is not the only issue

It is also worth noting that price is not the whole problem. At MWC 2013, the CEO of Viber, an OTT voice and messaging provider, claimed that the app has its highest penetration in Monaco, where over 94% of the population use Viber every day. Monaco is hardly short of money, and it is also a market where the incumbent operator bundles unlimited SMS – though these statistics may slightly stretch the definition of ‘population’, as many French subscribers use Monaco SIM cards. However, once adoption takes off, it is driven by social factors (the dynamics of innovation diffusion) and by competition on features.

Differential psychological and social advantages of communications media

The interaction styles and use cases of the new voice and messaging apps that users have adopted are frequently quite different from those imagined by telecoms operators. Between them, telcos have done little more than add mobility to telephony during the last 100 years. However, because of the Internet and the growth of the smartphone, users now have many more ways to communicate and interact than just calling one another.

SMS (telcos’ only other mass-market ‘hit’ product after voice) and MMS are “fire-and-forget”: messages are independent of each other and transported on a store-and-forward basis. Most IM applications are either conversation-based, with messages organised in threads, or stream-based, with users releasing messages on a broadcast or publish-subscribe basis. They often also have a notion of groups, communities, or topics. In getting used to these and internalising their shortcuts, netiquette, and style, customers are becoming socialised into these applications, which renders the return of telcos as messaging platform leaders with Rich Communications Services (RCS) less and less likely. Figure 1 illustrates some important psychological and social benefits of four different forms of communication.

Figure 1:  Psychological and social advantages of voice, SMS, IM, and Social Media


Source: STL Partners

The different benefits can clearly be seen. Taking voice as an example, a voice call could be a private conversation, a conference call, or even part of a webinar. Typically, voice calls are 1-to-1, single instance, and convey little presence information (an engaged tone or voicemail to others). By their very nature, voice calls are real time and demand a high time commitment, along with attention to the entire conversation. Whilst not as strong as video or face-to-face communication, a voice call can convey high emotion and is, of course, an audio experience.

SMS has very different advantages. The majority of SMS messages sent are private, 1-to-1 conversations, and are not thread-based. They are not real time, carry no presence information, and require low time commitment; because of this, they typically demand minimal attention. While it is possible to use a wide array of emoticons or smileys, these are not the same as voice or pictures. Even though some applications are starting to blur the line with voice memos, today SMS messaging is a visual experience.

Instant messaging, whether enterprise or consumer, offers a richer experience than SMS. It can include presence, it is often thread based, and can include pictures, audio, videos, and real time picture or video sharing. Social takes the communications experience a step further than IM, and many of the applications such as Facebook Messenger, LINE, KakaoTalk, and WhatsApp are exploiting the capabilities of these communications mechanisms to disrupt existing or traditional channels.

Voice calls, whether telephony or ‘OTT’, continue to possess their original benefits. But now, people are learning to use other forms of communication that better fit the psychological and social advantages that they seek in different contexts. We consider these changes to be permanent and ongoing shifts in customer behaviour towards more effective applications, and there will doubtless be more – which is both a threat and an opportunity for telcos and others.

The applicable model of how these shifts transpire is probably a Bass diffusion process, where innovators enter a market early and are followed by imitators as the mass majority. Subsequently, the innovators then migrate to a new technology or service, and the cycle continues.
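A minimal sketch of the cumulative Bass adoption curve, with invented parameter values (p, the coefficient of innovation, drives the early adopters; q, the coefficient of imitation, drives the following majority):

```python
import math

def bass_cumulative(t, p, q):
    """Cumulative adoption F(t) under the Bass diffusion model."""
    decay = math.exp(-(p + q) * t)
    return (1.0 - decay) / (1.0 + (q / p) * decay)

# Invented parameters: a small innovator effect, a strong imitator effect.
curve = [bass_cumulative(t, p=0.03, q=0.38) for t in range(15)]
# F(0) is 0 and F(t) rises in an S-curve towards 1 as imitators pile in.
```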

One of the best predictors of churn is knowing a churner, and it is to be expected that users of WhatsApp, Vine, etc. will take their friends with them. Economic pain will both accelerate the diffusion process and also spread it deeper into the population, as we have seen in South Korea with KakaoTalk.

High-margin segments are more at risk

Generally, all these effects are concentrated and emphasised in the segments that are traditionally unusually profitable, as this is where users stand to gain most from the price arbitrage. A finding from European Mobile: The Future’s not Bright, it’s Brutal and borne out by the research carried out for this report is that prices in Southern Europe were historically high, offering better margins to operators than elsewhere in Europe. Similarly, international and roaming calls are preferentially affected – although international minutes of use continue to grow near their historic average rates, all of this and more accrues to Skype, Google, and others. Roaming, despite regulatory efforts, remains expensive and a target for disruptors. It is telling that Truphone, a subject of our 2008 voice report, has transitioned from being a company that competed with generic mobile voice to being one that targets roaming.

 

  • Consumers: enjoying the fragmentation
  • Enterprises: in search of integration
  • What now makes a winning value proposition?
  • The fall of telephony
  • Talk may be cheap, but time is not
  • The increasing importance of “presence”
  • The competition from Online Service Providers
  • Operators’ responses
  • Free telco & other low-cost voice providers
  • Meeting Enterprise customer needs
  • Re-imagining customer service
  • Telco attempts to meet changing needs
  • Voice Developers – new opportunities
  • Into the Hunger Gap
  • Summary: the changing telephony business model
  • Conclusions
  • STL Partners and the Telco 2.0™ Initiative

 

  • Figure 1:  Psychological and social advantages of voice, SMS, IM, and Social Media
  • Figure 2: Ideal Enterprise mobile call routing scenario
  • Figure 3: Mobile Clients used to bypass high mobile call charges
  • Figure 4: Call Screening Options
  • Figure 5: Mobile device user context and data source
  • Figure 6: Typical business user modalities
  • Figure 7:  OSPs are pursuing platform strategies
  • Figure 8: Subscriber growth of KakaoTalk
  • Figure 9: Average monthly minutes of use by market
  • Figure 10: Key features of Voice and Messaging platforms
  • Figure 11: Average user screen time Facebook vs. WhatsApp  (per month)
  • Figure 12: Disruptive price competition also comes from operators
  • Figure 13: The hunger gap in music

The Future Value of Voice and Messaging

Background – ‘Voice and Messaging 2.0’

This is the latest report in our analysis of developments and strategies in the field of voice and messaging services over the past seven years. In 2007/8 we predicted the current decline in telco-provided services in Voice & Messaging 2.0 “What to learn from – and how to compete with – Internet Communications Services”, further articulated strategic options in Dealing with the ‘Disruptors’: Google, Apple, Facebook, Microsoft/Skype and Amazon in 2011, and more recently published initial forecasts in European Mobile: The Future’s not Bright, it’s Brutal. We have also looked in depth at enterprise communications opportunities, for example in Enterprise Voice 2.0: Ecosystem, Species and Strategies, and at trends in consumer behaviour, for example in The Digital Generation: Introducing the Participation Imperative Framework. For more on these reports and all of our other research on this subject please see here.

The New Report


This report provides an independent and holistic view of the voice and messaging market, looking in detail at trends, drivers and detailed forecasts, the latest developments, and the opportunities for all players involved. The analysis will save valuable time, effort and money by providing more realistic forecasts of future potential, and a fast-track to developing and/or benchmarking a leading-edge strategy and approach in digital communications. It contains:

  • Our independent, external market-level forecasts of voice and messaging in 9 selected markets (US, Canada, France, Germany, Spain, UK, Italy, Singapore, Taiwan).
  • Best practice and leading-edge strategies in the design and delivery of new voice and messaging services (leading to higher customer satisfaction and lower churn).
  • The factors that will drive best and worst case performance.
  • The intentions, strategies, strengths and weaknesses of formerly adjacent players now taking an active role in the V&M market (e.g. Microsoft)
  • Case studies of Enterprise Voice applications including Twilio and Unified Communications solutions such as Microsoft Office 365
  • Case studies of Telco OTT consumer voice and messaging services, such as Telefonica’s TuGo
  • Lessons from case studies of leading-edge new voice and messaging applications globally such as Whatsapp, KakaoTalk and other so-called ‘Over The Top’ (OTT) Players


It comprises an 18-page executive summary, 260 pages and 163 figures – full details below. Prices on application – please email contact@telco2.net or call +44 (0) 207 247 5003.

Benefits of the Report to Telcos, Technology Companies and Partners, and Investors


For a telco, this strategy report:

  • Describes and analyses the strategies that can make the difference between best and worst case performance, worth $80bn (or +/-20% revenues) in the 9 markets we analysed.
  • Externally benchmarks internal revenue forecasts for voice and messaging, leading to more realistic assumptions, targets, decisions, and better alignment of internal (e.g. board) and external (e.g. shareholder) expectations, and thereby potentially saving money and improving contributions.
  • Can help improve decisions on voice and messaging services investments, and provides valuable insight into the design of effective and attractive new services.
  • Enables more informed decisions on partner vs competitor status of non-traditional players in the V&M space with new business models, and thereby produce better / more sustainable future strategies.
  • Evaluates the attractiveness of developing and/or providing partner Unified Communication services in the Enterprise market, and ‘Telco OTT’ services for consumers.
  • Shows how to create a valuable and realistic new role for Voice and Messaging services in its portfolio, and thereby optimise its returns on assets and capabilities


For other players, including technology and Internet companies, and telco technology vendors:

  • The report provides independent market insight on how telcos and other players will be seeking to optimise $ multi-billion revenues from voice and messaging, including new revenue streams in some areas.
  • As a potential partner, the report will provide a fast-track to guide product and business development decisions to meet the needs of telcos (and others).
  • As a potential competitor, the report will save time and improve the quality of competitor insight by giving strategic insights into the objectives and strategies that telcos will be pursuing.


For investors, it will:

  • Improve investment decisions and strategies that return shareholder value, by improving the quality of insight on forecasts and the outlook for telcos and other technology players active in voice and messaging.
  • Save vital time and effort by accelerating decision making and investment decisions.
  • Help them better understand and evaluate the needs, goals and key strategies of key telcos and their partners / competitors


The Future Value of Voice: Report Content Summary

  • Executive Summary. (18 pages outlining the opportunity and key strategic options)
  • Introduction. Disruption and transformation, voice vs. telephony, and scope.
  • The Transition in User Behaviour. Global psychological, social, pricing and segment drivers, and the changing needs of consumer and enterprise markets.
  • What now makes a winning Value Proposition? The fall of telephony, the value of time vs telephony, presence, Online Service Provider (OSP) competition, operators’ responses, free telco offerings, re-imagining customer service, voice developers, the changing telephony business model.
  • Market Trends and other Forecast Drivers. Model and forecast methodology and assumptions, general observations and drivers, ‘Peak Telephony/SMS’, fragmentation, macro-economic issues, competitive and regulatory pressures, handset subsidies.
  • Country-by-Country Analysis. Overview of national markets. Forecast and analysis of: UK, Germany, France, Italy, Spain, Taiwan, Singapore, Canada, US, other markets, summary and conclusions.
  • Technology: Products and Vendors’ Approaches. Unified Communications, Microsoft Office 365, Skype, Cisco, Google, WebRTC, Rich Communications Service (RCS), Broadsoft, Twilio, Tropo, Voxeo, Hypervoice, Calltrunk, operator voice and messaging services, summary and conclusions.
  • Telco Case Studies. Vodafone 360, One Net and RED, Telefonica Digital, Tu Me, Tu Go, Bluvia and AT&T.
  • Summary and Conclusions. Consumer, enterprise, technology and Telco OTT.

Cloud 2.0: Securing Trust to Survive the ‘One-In-Five’ CSP Shake-Out

Summary: The Cloud market is on the verge of the next wave of market penetration, yet it’s likely that only one in five Cloud Service Providers (CSPs) in today’s marketplace will still be around by 2018, as providers fail or are swallowed up by aggressive competitors. So what do CSPs need to do to survive and prosper? (October 2013, Foundation 2.0, Executive Briefing Service, Cloud & Enterprise ICT Stream.)


Introduction: one in five Cloud providers will survive 

The Cloud market is on the verge of the next wave of market penetration, yet it’s likely that only one in five Cloud Service Providers (CSPs) in today’s marketplace will still be around by 2018, as providers fail or are swallowed up by aggressive competitors. So what do CSPs need to do to survive and prosper?

This research was sponsored by Trend Micro but the analysis and recommendations represent STL Partners’ independent view. STL Partners carried out an independent study based on in-depth interviews with 27 senior decision makers representing Cloud Service Providers and enterprises across Europe. These discussions explored, from both perspectives, cloud maturity, the barriers to adoption and how these might be overcome. The findings and observations are detailed in this three-part report, together with practical recommendations on how CSPs can address enterprise security concerns and ensure the sustainability of the cloud model itself.

Part 1: Cloud – coming of age or troubled adolescent?

While the concept of organising computing as a utility dates back to the 1960s, the cloud computing model as we know it today is built on the sub-classifications of Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).

We’ve covered telcos’ role in Cloud Services in depth in our Cloud research stream, and found that hype, hope and uncertainty have been notable features of the early stages of development of the market, with many optimistic forecasts of adoption being somewhat premature.

In terms of the adoption cycle today, our analysis is that Cloud Services are on the brink of ‘the chasm’: well established among early adopters but less well known, trusted and used by the mass-market segment of the enterprise market.

Building trust among new customer segments is the key to bridging this gap. For the industry it is a make or break point in terms of achieving scale. For CSPs, trust will be a key to survival and prosperity in the next phase of the market, enabling them to open up new opportunities and expand the amenable market, as well as to compete to retain and grow their individual market shares.

Many of the obstacles to and inhibitors of cloud adoption stem from customers’ perceptions of product immaturity – “will it be safe and work how we want without too much hassle and commitment?” In this report we examine findings on the general inhibitors and drivers of adoption, and then those related to the main inhibitor, data security, and how they might be addressed.

Overcoming the obstacles

Enterprise decision-makers in the study admitted to being deterred from the cloud by the prospect of migration, with the “enterprise/cloud barrier” perceived as a significant technical hurdle. While CSPs with enterprise-grade propositions have in place the business model, margins and consultative resources to offer customers an assisted journey to the cloud, standard public offerings are provided on a Do-It-Yourself basis.

However, data privacy and security remain the biggest inhibitors to cloud adoption among enterprises, due in no small part to a perceived loss of visibility and control. Recent headline-grabbing events relating to mass surveillance programmes such as PRISM have only served to feed these fears. As will be seen in this report, a lack of consistent industry standards, governance and even terminology heightens the confusion. Internal compliance procedures, often rooted in an outdated “physical” mind-set, fail to reflect today’s technological reality and the nature of potential threats.

According to the UK Department for Business Innovation & Skills, the direct cost of a security breach (any unauthorised access of data, applications, services, networks or devices) is around £65,000 for SMEs and £850,000 for larger enterprises. However, add to this financial penalties for failure to protect customer data, reputational damage, diminished goodwill and lost business, and the consequential losses can be enough to put a company out of business.  It’s little wonder some enterprises still regard cloud as a risk too far.

In reality, CSPs with a heritage in managed services and favourable economies of scale can typically match or better the security provisions of on-premise data centres.  However, as “super enterprises” they present a larger and therefore more attractive target for malicious activity than a single business.  There is simply no room for complacency.

CSPs must shift their view of security from a business inhibitor to a business enabler: crucial to maintaining and expanding the overall cloud market and confidence in the model by winning customer trust.  This requires a fundamental rethink of compliance – both on the part of CSPs and enterprises – from a tick-box exercise to achieve lowest-cost perimeter protection to cost effectively meeting the rigorous demands of today’s information-reliant enterprises.

Cloud services cannot be considered mature until enterprises en masse are prepared to entrust anything more than low-sensitivity data to third party CSPs.  The more customer security breaches that occur, the more trust will be undermined, and the greater the risk of the cloud model imploding altogether.

State of the nation

The journey to the cloud is often presented in the media as a matter of “when” rather than “if”.  However, while several CSPs in our study believed that the cloud model was starting to approach maturity, enterprise participants were more likely to contend that cloud was still at an experimental or “early adopter” stage.

The requirements of certain vertical markets were perceived by some respondents to make cloud a non-starter, for example, broadcasters that need to upload and download multi-terabyte sized media files, or low-latency trading environments in the financial sector.  Similarly, the value of intellectual property was cited by pharmaceutical companies as justifying the retention of data in a private cloud or internal data centre at any cost.

CSPs universally acknowledged that their toughest competitor continues to be enterprises’ own in-house data centres.  IT departments are accustomed to having control over their applications, services, servers, storage, network and security. While notionally, they accept they will have to be less “hands on” in the cloud, a lack of trust persists among many. This reticence was typically seen by CSPs as unwarranted fear and parochialism, yet many are still finding it a challenge to educate prospective customers and correct misconceptions. CSPs suggested that IT professionals may be as likely to voice support for the cloud as turkeys voting for Christmas. However, more enlightened IT functions have embraced the opportunity to evolve their remit to working with their CSP to monitor services against SLAs, enforce compliance requirements and investigate new technologies rather than maintaining the old.

For tentative enterprises, security is still seen as a barrier to, rather than an accelerant of, cloud adoption, and one of the most technically challenging issues for both IT and compliance owners. Enterprises that had advanced their cloud strategy testified that successful adoption relies on effective risk management when evaluating and engaging a cloud partner. Proponents of cloud solutions will need compelling proof points to win over their CISO, security team or compliance officer.  However, due diligence is a lengthy and often convoluted process that should be taken into account by those drawn to the cloud model for the agility it promises.

The majority of CSPs interviewed were relatively dismissive of customer security concerns, making the valid argument that their security provisions were at least equal to, if not better than, those of most enterprise data centres. However, as multiple companies concentrate their data into the hands of a few CSPs, the larger and more attractive those providers become to hackers as an attack target. Nonetheless, CSPs rarely offer any indemnification against hacking (aside from financial compensation for a breach of SLA) and SaaS providers tend to be more obscure than IaaS/PaaS providers in terms of the security of their operations. Further commercial concerns explored in this report relate to migration and punitive contractual lock-in. Enterprises need to feel that they can easily relocate services and data across the cloud boundary, whether back in house or to another provider. This creates the added challenge of being able to provide end-to-end audit continuity for data at rest as well as in transit.

There are currently around 800 cloud service providers (CSPs) in Europe.  Something of a land grab is taking place as organisations whose heritage lies in software, telecoms and managed hosting are launching cloud-enabled services, primarily IaaS and SaaS.

However, “cloudwashing” – a combination of vendor obfuscation and hyperbole – is already slowing down the sales cycles at a time when greater transparency would be likely to lead to more proofs of concept, accelerated uptake and expansion of the overall market.

Turbulence in the macro economy is exacerbating the problem: business creation and destruction are among the most telling indicators of economic vitality.  A landmark report from RSM shows that the net rate of business creation (business births minus deaths) for the G7 countries was just 0.8% on a compound annual basis over the five-year period of the study. The BRICs, by contrast, show a net rate of business creation of 6.2% per annum – approximately eight times the G7 rate.

In parallel, the pace of technology adoption is accelerating. Technologies are considered to have become “mainstream” once they have achieved 25% penetration. As cloud follows this same trajectory, with a rash of telcos, cable operators, data centre specialists and colocation providers entering the market, significant consolidation will be inevitable, since cloud economics are inextricably linked to scale.

Figure 1 – Technology adoption rates

Source: STL Partners

Lastly, customers are adapting and evolving faster than ever, due in no small part to the advent of social media and digital marketing practices, creating a hyper-competitive environment. As a by-product, the rate of business failure is rising. In the 1950s, two-thirds of the Fortune 500 companies failed. Throughout the 1980s, almost nine out of ten of the so-called “Excellent” companies went to the wall, and 98% of firms born out of the “Dot Com” revolution in the late 1990s are not expected to survive.

As a result, STL Partners anticipates that by 2018, a combination of consolidation and natural wastage will leave only 160 CSPs in the marketplace – a survival rate of one in five.
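The arithmetic behind these headline figures is straightforward; a quick sketch (using only the numbers quoted above, from the RSM study and STL Partners' forecast):

```python
# Net business creation rates from the RSM study (compound annual, five-year period)
g7_rate = 0.008    # 0.8% for the G7
bric_rate = 0.062  # 6.2% for the BRICs

ratio = bric_rate / g7_rate
print(f"BRIC/G7 ratio: {ratio:.2f}x")  # 7.75x, i.e. roughly eight times

# STL Partners' consolidation forecast for European cloud service providers
csps_today = 800
csps_2018 = 160
survival_rate = csps_2018 / csps_today
print(f"Survival rate: {survival_rate:.0%}")  # 20%, i.e. one in five
```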

Drivers of cloud adoption

The business benefits of the cloud are well documented, so the main value drivers cited by participants in the study can be briefly summarised as follows:

Figure 2 – Business and IT Drivers of cloud adoption

Report Contents

  • Introduction: one in five Cloud providers will survive
  • Part 1: Cloud – coming of age or troubled adolescent?
  •    Overcoming the obstacles
  •    State of the nation
  •    Drivers of cloud adoption
  •    Inhibitors to cloud adoption
  •       Cloud migration and integration with internal systems
  •       Vendor lock-in and exit strategies
  •       Governance and compliance issues
  •       Supplier credibility and longevity
  •       Testing and assurance
  • Part 2: Cloud security and data privacy challenges
  •    Physical security
  •    Data residency and jurisdiction
  •    Compliance and audit
  •    Encryption
  •    Identity and Access Management
  •    Shared resources and data segregation
  •    Security incident management
  •    Continuity services
  •    Data disposal
  •    Cloud provider assessment
  •    Industry standards and codes of practice
  •    Migration strategy
  •    Customer visibility
  • Part 3: Improving your ‘security posture’
  •    The ethos, tools and know-how needed to win customers’ trust
  •    The Four Levels of Cloud Security
  • Key take-aways for Cloud Services Providers
  • About STL Partners
  • About Trend Micro

Table of Figures

  • Figure 1 – Technology adoption rates
  • Figure 2 – Business and IT Drivers of cloud adoption
  • Figure 3 – Information security breaches 2013
  • Figure 4 – The four levels of Cloud security
  • Figure 5 – A 360 Degree Framework for Cloud Security

Digital Commerce 2.0: New $50bn Disruptive Opportunities for Telcos, Banks and Technology Players

Introduction – Digital Commerce 2.0

Digital commerce is centred on the better use of the vast amounts of data created and captured in the digital world. Businesses want to use this data to make better strategic and operational decisions, and to trade more efficiently and effectively, while consumers want more convenience, better service, greater value and personalised offerings. To address these needs, Internet and technology players, payment networks, banks and telcos are vying to become digital commerce intermediaries and win a share of the tens of billions of dollars that merchants and brands spend finding and serving customers.

Mobile commerce is frequently considered in isolation from other aspects of digital commerce, yet it should be seen as a springboard to a wider digital commerce proposition based on an enduring and trusted relationship with consumers. Moreover, there are major potential benefits to giving individuals direct control over the vast amount of personal data their smartphones are generating.

We have been developing strategies in these fields for a number of years, including our engagement with the World Economic Forum’s (WEF) Rethinking Personal Data project, and ongoing research into user data and privacy, digital money and payments, and digital advertising and marketing.

This report brings all of these themes together and is the first comprehensive strategic playbook on how smartphones and authenticated personal data can be combined to deliver a compelling digital commerce proposition for both merchants and consumers. It will save customers valuable time, effort and money by providing a fast-track to developing and/or benchmarking a leading edge strategy and approach in the fast-evolving new world of digital commerce.

Benefits of the Report to Telcos, Other Players, Investors and Merchants


For telcos, this strategy report:

  • Shows how to evaluate and implement a comprehensive and successful digital commerce strategy worth up to c.$50bn (5% of core revenues in 5 years)
  • Saves time and money by providing a fast-track for decision making and an outline business case
  • Rapidly challenges/validates existing strategy and services against relevant ‘best in class’, including their peers, ‘OTT players’ and other leading edge players
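As a rough sanity check, the headline figure can be reverse-engineered: $50bn at 5% of core revenues implies a core revenue base of around $1 trillion. The following back-of-envelope sketch makes that explicit (the exact revenue base STL Partners used is an assumption inferred from the quoted percentages, not stated in the report):

```python
# Back-of-envelope check on the headline opportunity size.
# The 5%-of-core-revenues figure implies a core revenue base of ~$1tn (assumed, not stated).
opportunity_bn = 50        # c.$50bn opportunity
share_of_core = 0.05       # 5% of core revenues

implied_core_bn = opportunity_bn / share_of_core
print(f"Implied core revenue base: ${implied_core_bn:,.0f}bn")  # $1,000bn, i.e. ~$1tn
```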


For other players including Internet companies, technology vendors, banks and payment networks:

  • The report provides independent market insight on how telcos and other players will be seeking to generate $ multi-billion revenues from digital commerce
  • As a potential partner, the report will provide a fast-track to guide product and business development decisions to meet the needs of telcos (and others) that will need to make commensurate investment in technologies and partnerships to achieve their value creation goals
  • As a potential competitor, the report will save time and improve the quality of competitor insight by giving a detailed and independent picture of the rationale and strategic approach you and your competitors will need to take


For merchants building digital commerce strategies, it will:

  • Help to improve revenue outlook, return on investment and shareholder value by improving the quality of insight to strategic decisions, opportunities and threats lying ahead in digital commerce
  • Save vital time and effort by accelerating internal decision making and speed to market


For investors, it will:

  • Improve investment decisions and strategies for returning shareholder value by improving the quality of insight on the outlook for telcos and other digital commerce players
  • Save vital time and effort by accelerating investment decision making
  • Help them better understand and evaluate the needs, goals and key strategies of key telcos and their partners / competitors

Digital Commerce 2.0: Report Content Summary

  • Executive Summary. (9 pages outlining the opportunity and key strategic options)
  • Strategy. The shape and scope of the opportunities, the convergence of personal data, mobile, digital payments and advertising, and personal cloud. The importance of giving consumers control. and the nature of the opportunity, including Amazon and Vodafone case studies.
  • Strategy. The shape and scope of the opportunities, the convergence of personal data, mobile, digital payments and advertising, and personal cloud. The importance of giving consumers control, and the nature of the opportunity, including Amazon and Vodafone case studies.
  • The Marketplace. Cultural, commercial and regulatory factors, and strategies of the market leading players. Further analysis of Google, Facebook, Apple, eBay and PayPal, telco and financial services market plays.
  • The Value Proposition. How to build attractive customer propositions in mobile commerce and personal cloud. Solutions for banked and unbanked markets, including how to address consumers and merchants.
  • The Internal Value Network. The need for change in organisational structure in telcos and banks, including an analysis of Telefonica and Vodafone case studies.
  • The External Value Network. Where to collaborate, partner and compete in the value chain – working with telcos, retailers, banks and payment networks. Building platforms and relationships with Internet players. Case studies include Weve, Isis, and the Merchant Customer Exchange.
  • Technology. Making appropriate use of personal data in different contexts. Tools for merchants and point-of-sale transactions. Building a flexible, user-friendly digital wallet.
  • Finance. Potential revenue streams from mobile commerce, personal cloud, raw big data, professional services, and internal use.
  • Appendix – the cutting edge. An analysis of fourteen best practice and potentially disruptive plays in various areas of the market.

Telco Opportunities in the ‘New Mobile Web’?

Summary: The transformed mobile web experience, brought about by the adoption of a range of new technologies, is creating a new arena for operators seeking to (re)build their role in the digital marketplace. Operators are potentially well-placed to succeed in this space; they have the requisite assets and capabilities and the desire to grow their digital businesses. This report examines the findings of interviews and a survey conducted amongst key industry players, supplemented by STL Partners’ research and analysis, with the objective of determining the opportunities for operators in the New Mobile Web and the strategies they can implement in order to succeed. (September 2013, Foundation 2.0, Executive Briefing Service.)

Operator Opportunities in the “New Mobile Web”

This report explores new opportunities for telecoms operators (telcos) in Digital, facilitated by the emergence of the “New Mobile Web”. The New Mobile Web is a term we use to describe the transformed mobile Web experience achieved through advances in technology: HTML5, faster, cheaper (4G) connectivity and better mobile devices. This paper argues that the New Mobile Web will lead to a shift away from native (Apple and Android) app ecosystems to browser-based consumption of media and services. This shift will create new opportunities for operators seeking to (re)build their digital presence.

STL Partners has undertaken research in this domain through interviews and surveys with operators and other key players in the market. In this report, we present our findings and analysis, as well as providing recommendations for operators.

The New Mobile Web

The emergence of the New Mobile Web is creating a new arena for operators seeking to (re)build their role in the digital marketplace. Many telecoms operators (telcos) are looking to build big “digital” businesses to offset the forecast decline in their core voice and messaging businesses over the next 5-7 years. Growth in data services and revenues will only partly offset these declines.

In general, despite a lot of effort and noise, telcos have been marginalised from the explosion in mobile Apps and Content, except insofar as it has helped them upgrade customers to smartphones and data-plans. Most notably, there has been a shift in market influence to Google & Apple, and spiralling traffic and signalling loads from easy-to-use interactive apps on smartphones.

Technical developments, including the adoption of HTML5, better mobile devices and faster networks, are transforming the user experience on mobile devices thereby creating a “New Mobile Web”. This New Mobile Web extends beyond “pages”, to content that looks and behaves more like “apps”. By having such “Web-apps” that work across different operating systems and devices – not just phones, but also PCs, TVs and more – the Web may be able to wrest back its role and influence in mobile Apps and Content.

The Key Opportunities for Operators

This new digital arena is in turn creating new opportunities to support others; STL’s research found that respondents felt the key opportunities for operators in the New Mobile Web were around: Monetisation, Discovery, Distribution and Loyalty.

Figure 1 – Operators see the New Mobile Web creating most value around Payments, Monetisation and Loyalty

Telcos can leverage their assets

Telcos have the requisite assets and capabilities to succeed in this area; they are strong candidates for assisting in monetisation, discovery, distribution and loyalty, especially if they can link in their other capabilities such as billing and customer-knowledge.

This report sets out some of the existing activities and assets that operators should seek to exploit and expand in pursuing their ambitions in the New Mobile Web.

Strategic Options for telcos to succeed

Operators that are aiming to become ‘digital players’ need to adopt coherent strategies that exploit and build on their assets and capabilities. This report identifies 5 broad strategic options that operators should look to pursue and it sets out the rationale for each. These strategies are not necessarily mutually exclusive and can be combined to develop clear direction and focus across the organisation.

Seizing the opportunity

Although many operators believe that they urgently need to build strong digital businesses, most are struggling to do so. Telcos are not going to get too many chances to re-engage with customers and carve out a bigger role for themselves in the digital economy. If it fulfils its promise, the New Mobile Web will disrupt the incumbent mobile Apps and Content value networks. This disruption will provide new opportunities for operators.

The operator community needs to participate in shaping the New Mobile Web and its key enabling technologies. Telcos also need to understand the implications of these technologies at a strategic level – not just as something for the Web techies to get excited about.

If telcos are not deeply involved – from board level downwards – they risk being overtaken by events, once again. Continued marginalisation from the digital economy will leave operators facing a grim future of endless cost-cutting, commoditisation and consolidation. This future is not inevitable.

Report Contents

  • Preface
  • Executive Summary
  • Introduction to the New Mobile Web
  • Meeting Operators’ strategic goals
  • Key opportunities in the New Mobile Web
  • Operators have plenty of existing assets and could add more
  • Case Studies
  • Telco Strategies in the New Mobile Web
  • Appendix 1: The New Mobile Web – “Rebalancing” from “Native”

Table of Figures

  • Figure 1: On-line survey respondents
  • Figure 2: Key opportunities in the New Mobile Web.  Enabling…
  • Figure 3: Areas of Value for Operators
  • Figure 4: Telco assets that should be used to address the opportunity
  • Figure 5:  Operator Strategies
  • Figure 6: Drivers of the New Mobile Web
  • Figure 7: Data growth alone will not fill the gap in declining Voice and Messaging Revenue
  • Figure 8: Survey results on operator ambitions
  • Figure 9: Asian and MEA operators are the most ambitious
  • Figure 10: Telcos in native app dominated geographies are more likely to believe that their ambitions could not be met in the current world. However, as stated above, there are notable exceptions…
  • Figure 11: Key opportunities in the New Mobile Web.  Enabling…
  • Figure 12: Operators see the New Mobile Web creating most value around Payments, Monetisation and Loyalty
  • Figure 13: A vast display ecosystem enables Web content providers to indirectly monetise their content
  • Figure 14: Within Digital, operators see most value in Self-care, Mobile Payments and Banking, Video and Music
  • Figure 15: Existing operator assets to build a role in the New Mobile Web
  • Figure 16: iRadio Overview
  • Figure 17: Tapjoy Overview
  • Figure 18: Mozilla Firefox OS Overview
  • Figure 19: Globe Telecom promotion
  • Figure 20: Financial Times Overview
  • Figure 21: AppsFuel Overview
  • Figure 22: Summary of the 5 Broad Strategies
  • Figure 23: Percentage of (US) smartphone and tablet users’ time by application area
  • Figure 24: The industry is beginning to see a “re-birth of the Web”
  • Figure 25: HTML5 seeks to bring the best of both Web and app worlds:
  • Figure 26: Telcos see most HTML5 value in reducing the cost of service & maintenance and improving the time to market.
  • Figure 27: The Industry sees the dominance of existing ecosystems as the biggest barrier to HTML5’s success

Finding the Next Golden Egg: Sourcing Great Telecoms Innovations

The telco innovation problem…

The challenge facing the telecoms industry has been well documented (not least by STL Partners). The solution – the need for telcos to develop a new telecoms ‘business model’ – is also now generally accepted. For some, the new business model may entail eschewing service development and instead focusing on cost efficiency and network performance – the Telco 2.0 Happy Piper.

For many, however, the desire to compete in the ‘services layer’ remains strong. These would-be Telco 2.0 Service Providers must seek to replace the contracting voice and messaging revenue streams with new revenues from new products, services and customers.

How to develop these new products, services and customer relationships is the $1 trillion question for telcos and their partners.

STL Partners has spent much time exploring both the nature of new opportunities and the processes for realising them. The problem for telcos is that they are not natural innovators. Their raison d’être historically has been to build infrastructure and generate returns from services that were only available because they owned and controlled the infrastructure – voice, messaging, and connectivity. The result was very low levels of innovation in telecoms but stable high-margin returns from ‘protected services’.

The Internet has changed the game. Now, voice, messaging and other communications services are available from alternative service providers – the internet giants and start-ups in particular. These new players have innovation in their DNA: they are product and service-oriented; they have sexy brands; they understand the value of customer data and how to exploit it; and, with lower capital expenditures, they can generate returns on investment at much lower margins.

…and one part of the solution addressed in this report

For telcos to develop competitive enabling or end-user services, whether consumer or enterprise, they need to develop the same skills and relationships enjoyed by the new competitors. As we discuss at length in A Practical Guide to Implementing Telco 2.0, and measure in the forthcoming Telco 2.0 Transformation Index, this requires a fundamental business model transformation that encompasses the whole telco: services, organisation structure and processes, partnerships, technology, and the cost and revenue model.

Rather than cover all the elements of the transformation, this report focuses narrowly on the process of developing compelling new propositions and services that deliver what customers want better than existing available solutions. It is based on a simple premise: that innovation and creativity are based on ‘associative thinking’ – the ability to link together ideas and concepts. For example, it was associative thinking in 2006 that led Apple’s iPhone designers to spot how an accelerometer – a device widely used in the transport, construction and medical industries – could be integrated into the iPhone to manage automatic screen rotation and countless applications we now take for granted on mobile.

Two ‘associative thinking’ approaches to identifying Telco 2.0 innovations

1. Existing tried and tested solutions

Rather than start with a blank sheet of paper, one way to innovate is to copy solutions that others have brought to market successfully. This does not necessarily imply a pure ‘me too’ approach, as there is scope, of course, to improve the solutions that others have created. In fact, most innovations are actually an extension of an existing product or service. For example:

  • Apple’s iPhone, with its capacitive screen and integrated content ecosystem, was a massive improvement on previous smartphones but clearly drew on earlier work done by, for example, Nokia with its 9210 Communicator and Ericsson with the R380.
  • Google’s powerful search algorithm and clean user interface contrasted with the clutter of earlier search sites such as AltaVista, but also built on their idea of helping people find things on the web. Interestingly, AltaVista has now made a comeback with a slick, clean interface that looks remarkably similar to Google!

If there is value in taking another firm’s idea and improving it, what are the sources of such concepts for CSPs?

STL Partners sees three main ones:

1. Your local telecoms market.

Scan the offerings of your competitors and, if you spot something that looks attractive or seems to be getting traction in the marketplace, find ways to improve it and launch a better competitive offering yourself. You may remember that in our review of Telefonica and Vodafone we mentioned that Freebees was a copy of O2’s earlier Top-up Surprises. There are two important things here that Vodafone failed to do:

  • Follow fast. The Freebees programme was launched around three years after Top-up Surprises and so Vodafone missed out on being seen as an innovator. Vodafone also missed out on the financial benefits that O2 enjoyed in those intervening years.
  • Improve the original concept. Freebees is fine but fails to materially improve on what was offered by O2 – rewards for customers that top-up their prepay account.

2. The global telecoms market.

Look outside your market to other geographies to see what has worked in other parts of the world and then explore how these solutions might work in your own market. Clearly, you need to make allowance for different local customs and behaviours, industry structures, regulations and so on, but the global nature of (tele)communications means that things that have worked in one market can often be easily adapted to others. STL Partners carries out this global scouting service for clients, looking at what is available from other CSPs, vendors and start-ups, and believes it is a sensible low-risk strategy for many CSPs – see page 17 of this document for more details.

Contents

To access the contents of the report, including…

  • The telco innovation problem…
  • …and one part of the solution addressed in this report
  • Two ‘associative thinking’ approaches to identifying Telco 2.0 innovations
  • 1. Existing tried and tested solutions
  • 2. Customer Goal-led Innovation (CGLI)
  • Case study on identifying Telco 2.0 innovations: The STL Partners scouting service
  • About STL Partners

…and the following table of exhibits…

  • Figure 1: Sources for tried and tested Telco 2.0 solutions
  • Figure 2: The limitations of asking customers what they need when innovating, some examples
  • Figure 3: How Customer Goal-led Innovation focuses on real needs and uncovers innovation opportunities
  • Figure 4: The STL Partners’ Customer Goal-led Innovation process
  • Figure 5: Producing a customer activity map to support a goal statement
  • Figure 6: Customer goal-led innovation – activity analysis table, example
  • Figure 7: Identifying opportunity areas for innovation, example
  • Figure 8: The STL Partners scouting service in a nutshell