Mobile Broadband 2.0: The Top Disruptive Innovations

Summary: Key trends, tactics, and technologies for mobile broadband networks and services that will influence mid-term revenue opportunities, cost structures and competitive threats. Includes consideration of LTE, network sharing, WiFi, next-gen IP (EPC), small cells, CDNs, policy control, business model enablers and more. (March 2012, Executive Briefing Service, Future of the Networks Stream.)

Trends in European data usage


Below is an extract from this 44 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and Future Networks Stream here. Non-members can subscribe here, buy a Single User license for this report online here for £795 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email contact@telco2.net / call +44 (0) 207 247 5003. We’ll also be discussing our findings and more on Facebook at the Silicon Valley (27-28 March) and London (12-13 June) New Digital Economics Brainstorms.


Introduction

Telco 2.0 has previously published a wide variety of documents and blog posts on mobile broadband topics – content delivery networks (CDNs), mobile CDNs, WiFi offloading, Public WiFi, network outsourcing (“‘Under-The-Floor’ (UTF) Players: threat or opportunity? ”) and so forth. Our conferences have featured speakers and panellists discussing operator data-plan pricing strategies, tablets, network policy and numerous other angles. We’ve also featured guest material such as Arete Research’s report LTE: Late, Tempting, and Elusive.

In our recent ‘Under the Floor (UTF) Players’ Briefing we looked at strategies to deal with some of the challenges facing operators as a result of market structure and outsourcing.

Under The Floor (UTF) Players Telco 2.0

This Executive Briefing is intended to complement and extend those efforts, looking specifically at those technical and business trends which are truly “disruptive”, either immediately or in the medium-term future. In essence, the document can be thought of as a checklist for strategists – pointing out key technologies or trends around mobile broadband networks and services that will influence mid-term revenue opportunities and threats. Some of those checklist items are relatively well-known, others more obscure but nonetheless important. What this document doesn’t cover is more straightforward concepts around pricing, customer service, segmentation and so forth – all important to get right, but rarely disruptive in nature.

During 2012, Telco 2.0 will be rolling out a new MBB workshop concept, which will audit operators’ existing technology strategy and planning around mobile data services and infrastructure. This briefing document is a roundup of some of the critical issues we will be advising on, as well as our top-level thinking on the importance of each trend.

It starts by discussing some of the issues which determine the extent of any disruption:

  • Growth in mobile data usage – and whether the much-vaunted “tsunami” of traffic may be slowing down
  • The role of standardisation, and whether it is a facilitator or inhibitor of disruption
  • Whether the most important MBB disruptions are likely to be telco-driven, or will stem from other actors such as device suppliers, IT companies or Internet firms.

The report then drills into a few particular domains where technology is evolving, looking at some of the most interesting and far-reaching trends and innovations. These are split broadly between:

  • Network infrastructure evolution (radio and core)
  • Control and policy functions, and business-model enablers

It is not feasible for us to cover all these areas in huge depth in a briefing paper such as this. Some areas such as CDNs and LTE have already been subject to other Telco 2.0 analysis, and this will be linked to where appropriate. Instead, we have drilled down into certain aspects we feel are especially interesting, particularly where these are outside the mainstream of industry awareness and thinking – and tried to map technical evolution paths onto potential business model opportunities and threats.

This report cannot be truly exhaustive – it doesn’t look at the nitty-gritty of silicon components, or antenna design, for example. It also treads a fine line between technological accuracy and ease-of-understanding for the knowledgeable but business-focused reader. For more detail or clarification on any area, please get in touch with us – email contact@stlpartners.com or call +44 (0) 207 247 5003.

Telco-driven disruption vs. external trends

There are various potential sources of disruption for the mobile broadband marketplace:

  • New technologies and business models implemented by telcos, which increase revenues, decrease costs, improve performance or alter the competitive dynamics between service providers.
  • Third-party developments that can either bolster or undermine the operators’ broadband strategies. This includes both direct MBB innovations (new uses of WiFi, for example) and bleed-over from adjacent marketplaces such as device creation or content/application provision.
  • External, non-technology effects such as changing regulation, economic backdrop or consumer behaviour.

The majority of this report covers “official” telco-centric innovations – LTE networks, new forms of policy control and so on.

External disruptions to monitor

But the most dangerous form of innovation is that from third parties, which can undermine assumptions about the ways mobile broadband can be used, introduce new mechanisms for arbitrage, or otherwise subvert operators’ pricing plans or network controls.

In the voice communications world, there are often regulations in place to protect service providers – such as bans on the use of “SIM boxes” to terminate calls and reduce interconnection payments. But in the data environment, it is far less obvious that such work-arounds can be deemed illegal, or even placed outside the scope of fair-usage conditions. That said, we have already seen some attempts by telcos to manage these effects – such as charging extra for “tethering” on smartphones.

It is not really possible to predict all possible disruptions of this type – such is the nature of innovation. But by describing a few examples, market participants can gauge their level of awareness, as well as gain motivation for ongoing “scanning” of new developments.

Some of the areas being followed by Telco 2.0 include:

  • Connection-sharing. This is where users might link devices together locally, perhaps through WiFi or Bluetooth, and share multiple cellular data connections. This is essentially “multi-tethering” – for example, 3 smartphones discovering each other nearby, perhaps each with a different 3G/4G provider, and pooling their connections together for shared use. From the user’s point of view it could improve effective coverage and maximum/average throughput speed. But from the operators’ view it would break the link between user identity and subscription, and essentially offload traffic from poor-quality networks on to better ones.
  • SoftSIM or SIM-free wireless. Over the last five years, various attempts have been made to decouple mobile data connections from SIM-based authentication. In some ways this is not new – WiFi doesn’t need a SIM, while it’s optional for WiMAX, and CDMA devices have typically been “hard-coded” to just register on a specific operator network. But the GSM/UMTS/LTE world has always relied on subscriber identification through a physical card. At one level, this works very well – SIMs are distributed easily and have enabled a successful prepay ecosystem to evolve. They provide operator control points and the ability to host secure applications on the card itself. However, the need to obtain a physical card restricts business models, especially for transient/temporary use such as a “one day pass”. But the most dangerous potential change is a move to a “soft” SIM, embedded in the device software stack. Companies such as Apple have long dreamed of acting as a virtual network provider, brokering between the user and multiple networks. There is even a patent for encouraging per-call (or perhaps per-data-connection) bidding, with telcos competing head to head on price/quality grounds. Telco 2.0 views this type of least-cost routing as a major potential risk for operators, especially for mobile data – although it could also enable some new business models that have been difficult to achieve in the past.
  • Encryption. Various of the new business models and technology deployment intentions of operators, vendors and standards bodies are predicated on analysing data flows. Deep packet inspection (DPI) is expected to be used to identify applications or traffic types, enabling differential treatment in the network, or different charging models to be employed. Yet this is rendered largely useless (or at least severely limited) when various types of encryption are used. Various content and application types already secure data in this way – content DRM, BlackBerry traffic, corporate VPN connections and so on. But increasingly, we will see major Internet companies such as Apple, Google, Facebook and Microsoft using such techniques, both for their own users’ security and because encryption hides precise indicators of usage from the network operators. If a future Android phone sends all its mobile data back via a VPN tunnel and breaks it out in Mountain View, California, operators will be unable to discern YouTube video from search or VoIP traffic. This is one of the reasons why application-based charging models – one- or two-sided – are difficult to implement.
  • Application evolution speed. One of the largest challenges for operators is the pace of change of mobile applications. The growing penetration of smartphones, appstores and ease of “viral” adoption of new services causes a fundamental problem – applications emerge and evolve on a month-by-month or even week-by-week basis. This is faster than any realistic internal telco processes for developing new pricing plans, or changing network policies. Worse, the nature of “applications” is itself changing, with the advent of HTML5 web-apps, and the ability to “mash up” multiple functions in one app “wrapper”. Is a YouTube video shared and embedded in a Facebook page a “video service”, or “social networking”?
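The encryption point above can be illustrated with a toy flow classifier (the port rules and function names here are invented for illustration, not any vendor’s DPI engine): once traffic is wrapped in a VPN tunnel, every flow collapses into a single opaque category, and application-level distinctions vanish from the network’s view.

```python
# Toy sketch, not real DPI: classify flows by destination port.
# Once traffic is tunnelled through a VPN, all flows present the same
# encrypted envelope, so the classifier can no longer tell them apart.

PORT_RULES = {80: "web", 1935: "video", 5060: "voip"}  # illustrative only

def classify(flow):
    """Return a coarse traffic class for a (dst_port, encrypted) flow."""
    port, encrypted = flow
    if encrypted:
        return "opaque"            # DPI sees only ciphertext
    return PORT_RULES.get(port, "unknown")

plain_flows = [(80, False), (1935, False), (5060, False)]
tunnelled   = [(443, True), (443, True), (443, True)]    # same flows via VPN

print([classify(f) for f in plain_flows])  # ['web', 'video', 'voip']
print([classify(f) for f in tunnelled])    # ['opaque', 'opaque', 'opaque']
```

The same logic applies however sophisticated the inspection: differential charging needs visible differences, and a tunnel removes them.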

It is also important to recognise that certain procedures and technologies used in policy and traffic management will likely have some unanticipated side-effects. Users, devices and applications are likely to respond to controls that limit their actions, while other developments may spontaneously produce “emergent behaviours”. For instance, there is a risk that too-strict data caps might change usage models for smartphones, making users connect to the network only when absolutely necessary. This is likely to be at the same times and places when other users also feel it necessary, with the unfortunate implication that peaks of usage get “spikier” rather than being ironed out.

There is no easy answer to addressing these types of external threat. Operator strategists and planners simply need to keep watch on emerging trends, and perhaps stress-test their assumptions and forecasts with market observers who keep tabs on such developments.

The mobile data explosion… or maybe not?

It is an undisputed fact that mobile data is growing exponentially around the world. Or is it?

A J-curve or an S-curve?

Telco 2.0 certainly thinks that growth in data usage is occurring, but is starting to see signs that the smooth curves that drive so many other decisions might not be so smooth – or so steep – after all. If this proves to be the case, it could be far more disruptive to operators and vendors than any of the individual technologies discussed later in the report. If operator strategists are not at least scenario-planning for lower data growth rates, they may find themselves in a very uncomfortable position in a year’s time.

In its most recent study of mobile operators’ traffic patterns, Ericsson concluded that Q2 2011 data growth was just 8% globally, quarter-on-quarter – a far cry from the 20%+ growth rates seen previously, and leaving a chart that looks distinctly like the beginning of an S-curve rather than a continued “hockey stick”. Given that the 8% includes a sizeable contribution from undoubted high-growth developing markets like China, it suggests that other markets are maturing quickly. (We are rather sceptical of Ericsson’s suggestion of seasonality in the data.) Other data points come from O2 in the UK, which appears to have had essentially zero traffic growth for the past few quarters, and Vodafone, which now cites European data traffic as growing more slowly (19% year-on-year) than its data revenues (21%). Our view is that current global growth is c.60-70% annually – c.40% in mature markets and 100%+ in developing markets.
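To put those quarter-on-quarter figures in annual terms, compounding over four quarters shows how sharp the deceleration is – 8% QoQ is roughly 36% a year, against over 100% for the earlier 20%+ rates:

```python
# Annualise a quarter-on-quarter growth rate by compounding four quarters.
def annualise(qoq):
    return (1 + qoq) ** 4 - 1

print(f"{annualise(0.08):.0%}")  # 8% QoQ  -> ~36% per year
print(f"{annualise(0.20):.0%}")  # 20% QoQ -> ~107% per year
```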

Figure 1 – Trends in European data usage


Now it is possible that various one-off factors are at play here – the shift from unlimited to tiered pricing plans, the stronger enforcement of “fair-use” plans and the removal of particularly egregious heavy users. Certainly, other operators are still reporting strong growth in traffic levels. We may see a resumption of growth, for example if cellular-connected tablets start to be used widely for streaming video.

But we should also consider the potential market disruption, if the picture is less straightforward than the famous exponential charts. Even if the chart looks like a 2-stage S, or a “kinked” exponential, the gap may have implications, like a short recession in the economy. Many of the technical and business model innovations in recent years have been responses to the expected continual upward spiral of demand – either controlling users’ access to network resources, pricing it more highly and with greater granularity, or building out extra capacity at a lower price. Even leaving aside the fact that raw, aggregated “traffic” levels are a poor indicator of cost or congestion, any interruption or slow-down of the growth will invalidate a lot of assumptions and plans.

Our view is that the scary forecasts of “explosions” and “tsunamis” have led virtually all parts of the industry to create solutions to the problem. We can probably list more than 20 approaches, most of them standalone “silos”.

Figure 2 – A plethora of mobile data traffic management solutions


What seems to have happened is that at least 10 of those approaches have worked – caps/tiers, video optimisation, WiFi offload, network densification and optimisation, collaboration with application firms to create “network-friendly” software and so forth. Taken collectively, there is actually a risk that they have worked “too well”, to the extent that some previous forecasts have turned into “self-denying prophecies”.

There is also another common forecasting problem occurring – the assumption that later adopters of a technology will behave like earlier users. In many markets we are now reaching 30-50% smartphone penetration. That means that all the most enthusiastic users are already connected, and we’re left with those that are (largely) ambivalent and probably quite light users of data. That will bring the averages down, even if each individual user is still increasing their consumption over time. But even that assumption may be flawed, as caps have made people concentrate much more on their usage, offloading to WiFi and restricting their data flows. There is also some evidence that the growing number of free WiFi points is reducing laptop use of mobile data, which accounts for 70-80% of the total in some markets, while the much-hyped shift to tablets isn’t driving much extra mobile data, as most are WiFi-only.
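The adopter-mix effect above is easy to see with a toy model (all numbers invented for illustration): average usage can fall as lighter late adopters join, even while every individual user grows their own consumption.

```python
# Toy model, illustrative numbers only: average GB/month across a user base
# as light late adopters dilute the heavy early adopters.
def avg_usage(heavy_users, light_users, heavy_gb, light_gb):
    total = heavy_users * heavy_gb + light_users * light_gb
    return total / (heavy_users + light_users)

# Year 1: 10m enthusiasts averaging 1.0 GB/month.
year1 = avg_usage(10, 0, 1.0, 0.2)
# Year 2: each enthusiast grows 30%, but 10m light users (0.26 GB) join.
year2 = avg_usage(10, 10, 1.3, 0.26)

print(year1, year2)  # average falls from 1.0 to 0.78 GB/month
```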

So has the industry over-reacted to the threat of a “capacity crunch”? What might be the implications?

The problem is that focusing on a single, narrow metric – “GB of data across the network” – ignores some important nuances and finer detail. From an economics standpoint, network costs tend to be driven by two main criteria:

  • Network coverage in terms of area or population
  • Network capacity at the busiest places/times

Coverage is (generally) therefore driven by factors other than data traffic volumes. Many cells have to be built and run anyway, irrespective of whether there’s actually much load – the operators all want to claim good footprints and may be subject to regulatory rollout requirements. Peak capacity in the most popular locations, however, is a different matter. That is where issues such as spectrum availability, cell site locations and the latest high-speed networks become much more important – and hence costs do indeed rise. However, it is far from obvious that the problems at those “busy hours” are always caused by “data hogs” rather than sheer numbers of people each using a small amount of data. (There is also another issue around signalling traffic, discussed later). 
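The “data hogs” question above is really one of concurrency at the busy hour. A quick sketch with invented numbers (not from the report’s data) shows how a modest crowd of light users can generate more peak demand in a single cell than one heavy user:

```python
# Illustrative numbers only: busy-hour demand in one cell.
hog_demand_mbps = 5.0        # one heavy user streaming video
light_demand_mbps = 0.25     # per light user (web browsing, social apps)
light_users_at_peak = 40     # light users sharing the same cell

crowd_demand = light_users_at_peak * light_demand_mbps
print(crowd_demand)                     # 10.0 Mbit/s from the crowd
print(crowd_demand > hog_demand_mbps)   # True - the crowd out-demands the hog
```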

Yes, there is a generally positive correlation between network-wide volume growth and costs, but it is far from perfect, and certainly not a direct causal relationship.

So let’s hypothesise briefly about what might occur if data traffic growth does tail off, at least in mature markets.

  • Delays to LTE rollout – if 3G networks are filling up less quickly than expected, the urgency of 4G deployment is reduced.
  • The focus of policy and pricing for mobile data may switch back to encouraging use rather than discouraging/controlling it. Capacity utilisation may become an important metric, given the high fixed costs and low marginal ones. Expect more loyalty-type schemes, plus various methods to drive more usage in quiet cells or off-peak times.
  • Regulators may start to take different views of traffic management or predicted spectrum requirements.
  • Prices for mobile data might start to fall again, after a period where we have seen them rise. Some operators might be tempted back to unlimited plans, for example if they offer “unlimited off-peak” or similar options.
  • Many of the more complex and commercially-risky approaches to tariffing mobile data might be deprioritised. For example, application-specific pricing involving packet-inspection and filtering might get pushed back down the agenda.
  • In some cases, we may even end up with overcapacity on cellular data networks – not to the degree we saw in fibre in 2001-2004, but there might still be an “overhang” in some places, especially if there are multiple 4G networks.
  • Steady growth of (say) 20-30% peak data per annum should be manageable with the current trends in price/performance improvement. It should be possible to deploy and run networks to meet that demand with reducing unit “production cost”, for example through use of small cells. That may reduce the pressure to fill the “revenue gap” on the infamous scissors-diagram chart.

Overall, it is still a little too early to declare shifting growth patterns for mobile data as a “disruption”. There is a lack of clarity on what is happening, especially in terms of responses to the new controls, pricing and management technologies put recently in place. But operators need to watch extremely closely what is going on – and plan for multiple scenarios.

Specific recommendations will depend on an individual operator’s circumstances – user base, market maturity, spectrum assets, competition and so on. But broadly, we see three scenarios and implications for operators:

  • “All hands on deck!”: Continued strong growth (perhaps with a small “blip”) which maintains the pressure on networks, threatens congestion, and drives the need for additional capacity, spectrum and capex.
    • Operators should continue with current multiple strategies for dealing with data traffic – acquiring new spectrum, upgrading backhaul, exploring massive capacity enhancement with small cells and examining a variety of offload and optimisation techniques. Where possible, they should explore two-sided models for charging and use advanced pricing, policy or segmentation techniques to rein in abusers and reward those customers and applications that are parsimonious with their data use. Vigorous lobbying activities will be needed, for gaining more spectrum, relaxing Net Neutrality rules and perhaps “taxing” content/Internet companies for traffic injected onto networks.
  • “Panic over”: Moderating and patchy growth, which settles to a manageable rate – comparable with the patterns seen in the fixed broadband marketplace.
    • This will mean that operators can “relax” a little, with the respite in explosive growth meaning that the continued capex cycles should be more modest and predictable. Extension of today’s pricing and segmentation strategies should improve margins, with continued innovation in business models able to proceed without rush, and without risking confrontation with Internet/content companies over traffic management techniques. Focus can shift towards monetising customer insight, ensuring that LTE rollouts are strategic rather than tactical, and exploring new content and communications services that exploit the improving capabilities of the network.
  • “Hangover”: Growth flattens off rapidly, leaving operators with unused capacity and threatening brutal price competition between telcos.
    • This scenario could prove painful, reminiscent of early-2000s experience in the fixed-broadband marketplace. Wholesale business models could help generate incremental traffic and revenue, while the emphasis will be on fixed-cost minimisation. Some operators will scale back 4G rollouts until cost and maturity go past the tipping-point for outright replacement of 3G. Restrictive policies on bandwidth use will be lifted, as operators compete to give customers the fastest / most-open access to the Internet on mobile devices. Consolidation – and perhaps bankruptcies – may ensue, as declining data prices coincide with substitution of the core voice and messaging business.

To read the note in full, including the following analysis…

  • Introduction
  • Telco-driven disruption vs. external trends
  • External disruptions to monitor
  • The mobile data explosion… or maybe not?
  • A J-curve or an S-curve?
  • Evolving the mobile network
  • Overview
  • LTE
  • Network sharing, wholesale and outsourcing
  • WiFi
  • Next-gen IP core networks (EPC)
  • Femtocells / small cells / “cloud RANs”
  • HetNets
  • Advanced offload: LIPA, SIPTO & others
  • Peer-to-peer connectivity
  • Self optimising networks (SON)
  • M2M-specific broadband innovations
  • Policy, control & business model enablers
  • The internal politics of mobile broadband & policy
  • Two sided business-model enablement
  • Congestion exposure
  • Mobile video networking and CDNs
  • Controlling signalling traffic
  • Device intelligence
  • Analytics & QoE awareness
  • Conclusions & recommendations
  • Index

…and the following figures…

  • Figure 1 – Trends in European data usage
  • Figure 2 – A plethora of mobile data traffic management solutions
  • Figure 3 – Not all operator WiFi is “offload” – other use cases include “onload”
  • Figure 4 – Internal ‘power tensions’ over managing mobile broadband
  • Figure 5 – How a congestion API could work
  • Figure 6 – Relative Maturity of MBB Management Solutions
  • Figure 7 – Laptops generate traffic volume, smartphones create signalling load
  • Figure 8 – Measuring Quality of Experience
  • Figure 9 – Summary of disruptive network innovations

Members of the Telco 2.0 Executive Briefing Subscription Service and Future Networks Stream can download the full 44 page report in PDF format here. Non-Members, please subscribe here, buy a Single User license for this report online here for £795 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email contact@telco2.net / call +44 (0) 207 247 5003.

Organisations, geographies, people and products referenced: 3GPP, Aero2, Alcatel Lucent, AllJoyn, ALU, Amazon, Amdocs, Android, Apple, AT&T, ATIS, BBC, BlackBerry, Bridgewater, CarrierIQ, China, China Mobile, China Unicom, Clearwire, Conex, DoCoMo, Ericsson, Europe, EverythingEverywhere, Facebook, Femto Forum, FlashLinq, Free, Germany, Google, GSMA, H3G, Huawei, IETF, IMEI, IMSI, InterDigital, iPhones, Kenya, Kindle, Light Radio, LightSquared, Los Angeles, MBNL, Microsoft, Mobily, Netflix, NGMN, Norway, NSN, O2, WiFi, Openet, Qualcomm, Radisys, Russia, Saudi Arabia, SoftBank, Sony, Stoke, Telefonica, Telenor, Time Warner Cable, T-Mobile, UK, US, Verizon, Vita, Vodafone, WhatsApp, Yota, YouTube, ZTE.

Technologies and industry terms referenced: 2G, 3G, 4.5G, 4G, Adaptive bitrate streaming, ANDSF (Access Network Discovery and Selection Function), API, backhaul, Bluetooth, BSS, capacity crunch, capex, caps/tiers, CDMA, CDN, CDNs, Cloud RAN, content delivery networks (CDNs), Continuous Computing, Deep packet inspection (DPI), DPI, DRM, Encryption, Enhanced video, EPC, ePDG (Evolved Packet Data Gateway), Evolved Packet System, Femtocells, GGSN, GPS, GSM, Heterogeneous Network (HetNet), Heterogeneous Networks (HetNets), HLRs, hotspots, HSPA, HSS (Home Subscriber Server), HTML5, HTTP Live Streaming, IFOM (IP Flow Mobility and Seamless Offload), IMS, IPR, IPv4, IPv6, LIPA (Local IP Access), LTE, M2M, M2M network enhancements, metro-cells, MiFi, MIMO (multiple input, multiple output), MME (Mobility Management Entity), mobile CDNs, mobile data, MOSAP, MSISDN, MVNAs (mobile virtual network aggregators), MVNO, Net Neutrality, network outsourcing, Network sharing, Next-generation core networks, NFC, NodeBs, offload, OSS, outsourcing, P2P, Peer-to-peer connectivity, PGW (PDN Gateway), picocells, policy, Policy and Charging Rules Function (PCRF), Pre-cached video, pricing, Proximity networks, Public WiFi, QoE, QoS, RAN optimisation, RCS, remote radio heads, RFID, self-optimising network technology (SON), Self-optimising networks (SON), SGW (Serving Gateway), SIM-free wireless, single RANs, SIPTO (Selective IP Traffic Offload), SMS, SoftSIM, spectrum, super-femtos, Telco 2.0 Happy Pipe, Transparent optimisation, UMTS, ‘Under-The-Floor’ (UTF) Players, video optimisation, VoIP, VoLTE, VPN, White space, WiFi, WiFi Direct, WiFi offloading, WiMAX, WLAN.

The value of “Smart Pipes” to mobile network operators

Preface

Rationale and hypothesis for this report

It is over fourteen years since David Isenberg wrote his seminal paper The Rise of the Stupid Network, in which he outlined the view that telephony networks would increasingly become dumb pipes as intelligent endpoints came to control how and where data was transported. Many of his predictions have come to fruition: cheaper computing technology has put powerful ‘smartphones’ in the hands of millions of people, and powerful new internet players are using data centres to distribute applications and services ‘over the top’ to users over fixed and mobile networks.

The hypothesis behind this piece of research is that endpoints cannot completely control the network. STL Partners believes that the network itself needs to retain intelligence so it can interpret the information it is transporting between the endpoints. Mobile network operators, quite rightly, will not be able to control how the network is used but must retain the ability within the network to facilitate a better experience for the endpoints. The hypothesis being tested in this research is that ‘smart pipes’ are needed to:

  1. Ensure that data is transported efficiently so that capital and operating costs are minimised and the internet and other networks remain cheap methods of distribution.
  2. Improve user experience by matching the performance of the network to the nature of the application or service being used. ‘Best effort’ is fine for asynchronous communication, such as email or text, but unacceptable for voice. A video call or streamed movie requires guaranteed bandwidth, and real-time gaming demands ultra-low latency.
  3. Charge appropriately for use of the network. It is becoming increasingly clear that the Telco 1.0 business model – that of charging the end-user per minute or per Megabyte – is under pressure as new business models for the distribution of content and transportation of data are being developed. Operators will need to be capable of charging different players – end-users, service providers, third-parties (such as advertisers) – on a real-time basis for provision of broadband and guaranteed quality of service (QoS).
  4. Facilitate interactions within the digital economy. Operators can compete and partner with other players, such as the internet companies, in helping businesses and consumers transact over the internet. Networks are no longer confined to communications but are used to identify and market to prospects, complete transactions, make and receive payments and remittances, and care for customers. The knowledge that operators have about their customers coupled with their skills and assets in identity and authentication, payments, device management, customer care etc. mean that ‘the networks’ can be ‘enablers’ in digital transactions between third-parties – helping them to happen more efficiently and effectively.

Overall, smarter networks will benefit network users – upstream service providers and end users – as well as the mobile network operators and their vendors and partners. Operators will also be competing to be smarter than their peers as, by differentiating here, they gain cost, revenue and performance advantages that will ultimately translate into higher shareholder returns.

Sponsorship and editorial independence

This report has kindly been sponsored by Tellabs and is freely available. Tellabs developed the initial concepts, and provided STL Partners with the primary input and scope for the report. Research, analysis and the writing of the report itself were carried out independently by STL Partners. The views and conclusions contained herein are those of STL Partners.

About Tellabs

Tellabs innovations advance the mobile Internet and help our customers succeed. That’s why 43 of the top 50 global communications service providers choose our mobile, optical, business and services solutions. We help them get ahead by adding revenue, reducing expenses and optimizing networks.

Tellabs (Nasdaq: TLAB) is part of the NASDAQ Global Select Market, Ocean Tomo 300® Patent Index, the S&P 500 and several corporate responsibility indexes including the Maplecroft Climate Innovation Index, FTSE4Good and eight FTSE KLD indexes. http://www.tellabs.com

Executive Summary

Mobile operators no longer growth stocks

Mobile network operators are now valued as utility companies in the US and Europe (less so in APAC). Investors are not expecting future growth to be higher than GDP, and so are demanding that money be returned in the form of high dividends.

Two ‘smart pipes’ strategies available to operators

In his seminal book Competitive Strategy, Michael Porter identified three generic strategies for companies – ‘Cost leadership’, ‘Differentiation’ and ‘Focus’. Two of these are viable in the mobile telecommunications industry: Cost leadership, or Happy Pipe in STL Partners parlance, and Differentiation, or Full-service Telco 2.0. No network operator has found a Focus strategy to work, as limiting the customer base to a segment of the market has not yielded sufficient returns on the high capital investment of building a network. Even MVNOs that have pursued this strategy, such as Helio, which targeted Korean nationals in the US, have struggled.

Underpinning the two business strategies are related ‘smart pipe’ approaches – smart network and smart services:

| Porter strategy | Telco 2.0 strategy | Nature of smartness | Characteristics |
|---|---|---|---|
| Cost leadership | Happy Pipe | Smart network | Cost efficiency – minimal network, IT and commercial costs. Simple utility offering. |
| Differentiation | Full-service Telco 2.0 | Smart services | Technical and commercial flexibility: improve customer experience by integrating network capabilities with own and third-party services and charging either end user or service provider (or both). |

Source: STL Partners

It is important to note that, currently at least, a smart network is a prerequisite for smart services.  It would be impossible for an operator to implement a Full-service Telco 2.0 strategy without significant network intelligence.  Full-service Telco 2.0 is, therefore, an addition to a Happy Pipe strategy.

Smart network strategy good, smart services strategy better

In a survey conducted for this report, it was clear that operators are pursuing ‘smart’ strategies, whether at the network level or extending beyond this into smart services, for three reasons:

  • Revenue growth: protecting existing revenue sources and finding new ones.  This is seen as the single most important driver of building more intelligence.
  • Cost savings: reducing capital and operating costs.
  • Performance improvement: providing customers with an improved customer experience.

Assuming that most mobile operators currently have limited smartness in either network or services, our analysis suggests significant upside in financial performance from successfully implementing either a Happy Pipe or Full-service Telco 2.0 strategy.  Most mobile operators generate Cash Returns on Invested Capital of between 5% and 7%.  For the purposes of our analysis, we have assumed a baseline of 5.8%.  The lower capital and operating costs of a Happy Pipe strategy could increase this to 7.4%, and successful implementation of a Full-service Telco 2.0 strategy would increase it to a handsome 13.3%:

| Telco 2.0 strategy | Nature of smartness | Cash Returns on Invested Capital |
|---|---|---|
| As-is – Telco 1.0 | Low – relatively dumb | 5.8% |
| Happy Pipe | Smart network | 7.4% |
| Full-service Telco 2.0 | Smart services | 13.3% |

Source: STL Partners

STL Partners has identified six opportunity areas for mobile operators to exploit with a Full-service Telco 2.0 strategy.  Summarised here, these are outlined in detail in the report:

| Opportunity Type | Approach | Typical Services |
|---|---|---|
| Core Services | Improving revenues and customer loyalty by better design, analytics, and smart use of data in existing services. | Access, Voice and Messaging, Broadband, Standard Wholesale, Generic Enterprise ICT Services (inc. SaaS). |
| Vertical industry solutions (SI) | Delivery of ICT projects and support to vertical enterprise sectors. | Systems Integration (SI), Vertical CEBP solutions, Vertical ICT, Vertical M2M solutions, and Private Cloud. |
| Infrastructure services | Optimising cost and revenue structures by buying and selling core telco ICT asset capacity. | Bitstream ADSL, Unbundled Local Loop, MVNOs, Wholesale Wireless, Network Sharing, Cloud – IaaS. |
| Embedded communications | Enabling wider use of voice, messaging, and data by facilitating access to them and embedding them in new products. | Comes with data, Sender pays delivery, Horizontal M2M Platforms, Voice, Messaging and Data APIs for 3rd Parties. |
| Third-party business enablers | Enabling new telco assets (e.g. customer data) to be leveraged in support of 3rd party business processes. | Telco enabled Identity and Authorisation, Advertising and Marketing, Payments. APIs to non-core services and assets. |
| Own-brand OTT services | Building value through Telco-owned online properties and ‘Over-the-Top’ services. | Online Media, Enterprise Web Services, Own Brand VOIP services. |


Source: STL Partners

Regional approaches to smartness vary

As operators globally experience a slow-down in revenue growth, they are pursuing ways of maintaining margins by reducing costs.  Unsurprisingly, therefore, most operators in North America, Europe and Asia-Pacific appear to be pursuing a Happy Pipe/smart network strategy.  Reductions in capital and operating costs and improvements in network performance are being sought through approaches such as:

  • Physical network sharing – usually involving passive elements such as towers, air-conditioning equipment, generators, technical premises and pylons.
  • Peering data traffic rather than charging (and being charged) for transit.
  • Wi-Fi offload – moving data traffic from the mobile network on to cheaper fixed networks.
  • Distributing content more efficiently through the use of multicast and CDNs.
  • Efficient network configuration and provisioning.
  • Traffic shaping/management via deep-packet inspection (DPI) and policy controls.
  • Network protection – implementing security procedures for abuse/fraud/spam so that network performance is maximised.
  • Device management to ameliorate device impact on the network and improve customer experience.

Vodafone Asia-Pacific is a good example of an operator pursuing these activities aggressively, as an end in themselves rather than as a basis for a Telco 2.0 strategy.  Yota in Russia and LightSquared in the US are similarly content to be Happy Pipers.

In general, Asia-Pacific has the most disparate set of markets and operators.  Markets vary radically in terms of maturity, structure and regulation and operators seem to polarise into extreme Happy Pipers (Vodafone APAC, China Mobile, Bharti) and Full-Service Telco 2.0 players (NTT Docomo, SK Telecom, SingTel, Globe).

In Telefonica, Europe is home to the operator with the most complete Telco 2.0 vision globally.  Telefonica has built and acquired a number of ‘smart services’ which appear to be gaining traction, including O2 Priority Moments, Jajah, Tuenti and Terra.  Recent structural changes at the company, in which Telefonica Digital was created to focus on opportunities in the digital economy, further indicate the company’s focus on Telco 2.0 and smart services.  Europe also appears to be the most collaborative market.  Vodafone, Telefonica, Orange, Telecom Italia and T-Mobile are all working together on a number of Telco 2.0 projects and, in so doing, seek to generate enough scale to attract upstream developers and downstream end-users.

The sheer scale of the two leading mobile operators in the US, AT&T and Verizon, which have over 100 million subscribers each, means that they are taking a different approach to Telco 2.0.  They are collaborating on one or two opportunities, notably ISIS, a near-field communications mobile payments solution offered jointly by AT&T, Verizon and T-Mobile.  However, in the main, there is a high degree of what one interviewee described as ‘Big Bell dogma’ – the view that their company is big enough and powerful enough to take on the OTT players and ‘control’ the experiences of end users in the digital economy.  The US market is more consolidated than Europe (giving the big players more power) but, even so, it seems unlikely that either AT&T or Verizon can keep customers using only their services – the lamented ‘walled garden’ approach.

Implementing a Telco 2.0 strategy is important but challenging

STL Partners explored both how important and how difficult it is to implement the changes required to deliver a Happy Pipe strategy (outlined in the bullets above) and those needed for Full-service Telco 2.0 strategy, via industry interviews with operators and a quantitative survey.  The key findings of this analysis were:

  • Overall, respondents felt that many activities were important as part of a smart strategy.  In our survey, all except two activity areas – Femto/pico underlay and Enhanced switches (vs. routers) – were rated by more than 50% of respondents as either ‘Quite important’ or ‘Very important’ (see chart below).
  • Activities associated with a Full-service Telco 2.0 strategy were rated as particularly important:
  • Making operator assets available via APIs, Differentiated pricing and charging and Personalised and differentiated services were ranked 1, 2 and 3 out of the thirteen activities.
  • Few considered that any of the actions were dangerous and could destroy value, although Physical network sharing and Traffic shaping/DPI were most often cited here.
Smart Networks - important implementation factors to MNOs
Source: STL Partners/Telco 2.0 & Tellabs ‘Smart pipes’ survey, July 2011, n=107

NOTE: Overall ranking was based on a weighted scoring policy of Very important +4, Quite important +3, Not that important +2, Unimportant +1, Dangerous -4.
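As an illustration, the overall ranking can be reproduced from raw response counts using the weighted scoring policy above. The response tallies below are invented for the sketch – the survey's actual per-activity vote counts are not reproduced here – but the weights match those stated in the note.

```python
# Weighted scoring policy from the survey note: Very important +4,
# Quite important +3, Not that important +2, Unimportant +1, Dangerous -4.
WEIGHTS = {
    "Very important": 4,
    "Quite important": 3,
    "Not that important": 2,
    "Unimportant": 1,
    "Dangerous": -4,
}

def weighted_score(votes):
    """Return the weighted score for one activity's response counts."""
    return sum(WEIGHTS[answer] * count for answer, count in votes.items())

# Hypothetical tallies for two activities (the real survey had n=107)
responses = {
    "Operator assets via APIs": {"Very important": 60, "Quite important": 30,
                                 "Not that important": 12, "Unimportant": 3,
                                 "Dangerous": 2},
    "Femto/pico underlay":      {"Very important": 15, "Quite important": 35,
                                 "Not that important": 40, "Unimportant": 12,
                                 "Dangerous": 5},
}

# Rank activities from highest to lowest weighted score
ranking = sorted(responses, key=lambda a: weighted_score(responses[a]),
                 reverse=True)
for activity in ranking:
    print(activity, weighted_score(responses[activity]))
```

With these invented tallies, ‘Operator assets via APIs’ scores 349 against 237 for ‘Femto/pico underlay’, mirroring the pattern of the survey chart in which API exposure ranked first and femto/pico underlay near the bottom.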

Overall, most respondents to the survey and the people we spoke with felt that operators had a better chance of delivering a Happy Pipe strategy, and that only a few Tier 1 operators would succeed with a Full-Service Telco 2.0 strategy.  For both strategies, they were surprisingly sceptical about operators’ ability to implement the necessary changes.  Five reasons were cited as major barriers to success, and loomed particularly large for a Full-Service Telco 2.0 strategy:

  1. Competition from internet players.  Google, Apple, Facebook et al preventing operators from expanding their role in the digital economy.
  2. Difficulty in building a viable ecosystem. Bringing together the required players for such things as near-field communications (NFC) mobile payments and sharing value among them.
  3. Lack of mobile operator skills.  The failure of operators to develop or exploit key skills required for facilitating transactions, such as customer data management and privacy.
  4. Culture.  Being too wedded to existing products, services and business models to alter the direction of the super-tanker.
  5. Organisation structure. Putting in place the people and processes to manage the change.

Looking at the specific activities required to build smartness, it was clear that those required for a Full-service Telco 2.0/smart services strategy are considered the hardest to implement (see chart below):

  • Personalised and differentiated services via use of customer data – content, advertising, etc.
  • Making operator assets available to end users and other service providers – location, presence, ID, payments
  • Differentiated pricing and charging based on customer segment, service, QoS
Smart Networks - how challenging are the changes?
Source: STL Partners/Telco 2.0 & Tellabs ‘Smart pipes’ survey, July 2011, n=100

NOTE: Overall ranking was based on a weighted scoring policy of Very easy +5, Relatively straightforward +4, Manageable +3, Quite difficult +2, Very difficult -2.
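Of the three challenging smart-services activities listed above, ‘differentiated pricing and charging’ can be made concrete with a deliberately simplified sketch. The segments, QoS classes and prices below are invented for illustration, not drawn from the survey or from any operator’s actual tariff.

```python
# Hypothetical rate card for differentiated pricing and charging by
# customer segment and QoS class. All segments and prices are invented.
RATE_CARD = {  # price per GB
    ("consumer", "best_effort"): 2.0,
    ("consumer", "priority"): 4.0,
    ("enterprise", "best_effort"): 3.0,
    ("enterprise", "priority"): 6.0,
}

def charge(segment: str, qos: str, gigabytes: float) -> float:
    """Price a usage record according to the differentiated rate card."""
    return RATE_CARD[(segment, qos)] * gigabytes

print(charge("enterprise", "priority", 10))   # 60.0
print(charge("consumer", "best_effort", 10))  # 20.0
```

The hard part the survey respondents pointed to is not the lookup itself but populating such a rate card per segment and service, and wiring it to real-time policy control in the network.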

Conclusions and recommendations

By comparing the relative importance of specific activities against how easy they are to implement, we were able to classify them into four categories:

| Category | Importance for delivering smart strategy | Relative ease of implementation |
|---|---|---|
| Must get right | High | Easy |
| Strive for new role | High | Difficult |
| Housekeeping | Low | Easy |
| Forget | Low | Difficult |
Rating of factors needed for Telco 2.0 'Smart Pipes' and 'Full Services' Strategies
Source: STL Partners/Telco 2.0 & Tellabs ‘Smart pipes’ survey, July 2011, n=100

Unfortunately, as the chart above shows, no activities fall clearly into the ‘Forget’ category, but there are some clear priorities:

  • A Full-service Telco 2.0 strategy is about striving for a new role in the digital economy and is probably most appropriate for Tier 1 MNOs, since it is going to require substantial scale and investment in new skills such as software and application development and customer data.  It will also require the development of new partnerships and ecosystems and complex commercial arrangements with players from other industries (e.g. banking). 
  • There is a cluster of smart network activities that are individually relatively straightforward to implement and will yield a big bang for the buck if investments are made – the ‘Must get right’ group:
  • More efficient network configuration and provisioning;
  • Strengthen network security to cope with abuse and fraud;
  • Improve device management (and cooperation with handset manufacturers and content players) to reduce the burden that smartphones place on the network;

Although deemed more marginal in our survey, we would include as equally important:

  • Traffic shaping and DPI, which in many cases underpin smart services opportunities such as differentiated pricing based on QoS;
  • Multicast and CDNs, which are proven in the fixed world and likely to be equally beneficial in a video-dominated mobile one.

There is a second cluster of smart network activities which appear to be equally easy (or difficult) to implement but are deemed by respondents to be lower value, and therefore fall into a lower ‘Housekeeping’ category:

  • Wi-Fi offload – we were surprised by this, given the emphasis placed on it by NTT Docomo, China Mobile, AT&T, O2 and others;
  • Peering (vs. transit) and Enhanced switches – these are surely business-as-usual for all MNOs;
  • Femto/Pico underlay – generally felt to be of limited importance by respondents although a few cited its importance in pushing network intelligence to the edge which would enable MNOs to more easily deliver differentiated QoS and more innovative retail and wholesale revenue models;
  • Physical network sharing – again, a surprising result given the keenness of the capital markets on this strategy. 

 

Overall, it appears that mobile network operators need to continue to invest resources in developing smart networks but that a clear prioritisation of efforts is needed given the multitude of ‘moving parts’ required to develop a smart network that will deliver a successful Happy Pipe strategy.

A successful Full-Service Telco 2.0 strategy is likely to be extremely profitable for a mobile network operator and would result in a substantial increase in share price.  But delivering this remains a major challenge and investors are sceptical.  Collaboration, experimentation and investment are important facets of a Telco 2.0 implementation strategy as they drive scale, learning and innovation respectively.  Given the demands of investors for dividend yields, investment is only likely to be available if an operator becomes more efficient, so implementing a Happy Pipe strategy which reduces capital and operating costs is critical.

 

Report Contents

 

  • Executive Summary
  • Mobile network operator challenges
  • The future could still be bright
  • Defining a ‘smart’ network
  • Understanding operator strategies
  • Video: Case study in delivering differentiation and cost leadership
  • The benefits of Smart on CROIC
  • Implementing a ‘smart’ strategy
  • Conclusions and recommendations

Report Figures

 

  • Figure 1: Pressure from all sides for operators
  • Figure 2: Vodafone historical dividend yield – from growth to income
  • Figure 3: Unimpressed capital markets and falling employment levels
  • Figure 4: Porter and Telco 2.0 competitive strategies
  • Figure 5: Defining Differentiation/Telco 2.0
  • Figure 6 – The Six Opportunity Areas – Approach, Typical Services and Examples
  • Figure 7: Defining Cost Leadership/Happy Pipe
  • Figure 8: Defining ‘smartness’
  • Figure 9: Telco 2.0 survey – Defining smartness
  • Figure 10: NTT’s smart content delivery system – a prelude to mobile CDNs?
  • Figure 11: Vodafone India’s ARPU levels are now below $4/month, illustrating the need for a ‘smart network’ approach
  • Figure 12: China Mobile’s WLAN strategy for coverage, capacity and cost control
  • Figure 13: GCash – Globe’s text-based payments service
  • Figure 14: PowerOn – SingTel’s on-demand business services
  • Figure 15: Telefonica’s Full-service Telco 2.0 strategy
  • Figure 16: Vodafone – main messages are about being an efficient data pipe
  • Figure 17: Collaboration with other operators key to smart services strategy
  • Figure 18: Verizon Wireless and Skype offering
  • Figure 19: Content delivery with and without a CDN
  • Figure 20: CDN benefits to consumers are substantial
  • Figure 21: Cash Returns on Invested Capital of different Telco 2.0 opportunity areas
  • Figure 22: The benefits of smart to a MNO are tangible and significant
  • Figure 23: Telco 2.0 Survey – benefits of smart to MNOs
  • Figure 24: Telco 2.0 survey – MNO chances of success with smart strategies
  • Figure 25: Telco 2.0 survey – lots of moving parts required for ‘smartness’
  • Figure 26: Telco 2.0 survey – Differentiation via smart services is particularly challenging
  • Figure 27: Telco 2.0 survey – Implementing changes is challenging
  • Figure 28: Telco 2.0 survey – Prioritising smart implementation activities

 

M2M 2.0: Report and analysis of the event

M2M 2.0: Event Summary Analysis: A summary of the findings of the M2M 2.0 session, 10th November 2011, held in the Guoman Hotel, London

M2M 2.0: Event Summary Analysis Presentation


Part of the New Digital Economics Executive Brainstorm series, the M2M 2.0 session took place at the Guoman Hotel, London on the 10th November and reviewed real-world experience with M2M projects from operators and other actors. Using a widely acclaimed interactive format called ‘Mindshare’, the event brought together specially-invited senior executives from across the communications, energy and technology sectors.

This note summarises some of the high-level findings and includes the verbatim output of the brainstorm.

More information: email contact@stlpartners.com, or phone: +44 (0) 207 247 5003.


Extracted example slide:

M2M 2.0: Event Summary Analysis Presentation

M2M 2.0: New Business Models for Service Enablers (Deutsche Telekom) Presentation

M2M 2.0: Service Enablers – New Business Models Needed,
Presentation by Sven Krey, Head of Sales Development, M2M Competence Centre, Deutsche Telekom.
Business model challenges facing operators in M2M and how DTAG is facing them. Presented at EMEA Brainstorm, November 2011.
M2M connections explode, prices plunge

Download presentation here.

Links here for more on New Digital Economics brainstorms and M2M 2.0 research, or call +44 (0) 207 247 5003.

Example slide from presentation:

M2M Chart from Deutsche Telekom

M2M 2.0: Market, Business Models, and Telcos’ Role(s)

Summary: Our latest report on M2M 2.0 covers: M2M market growth, structure and dynamics; business models; the best role(s) for telcos; and leading thinking from Deutsche Telekom, Vodafone, Telenor, KPN and Swisscom. It describes how ‘Service Enablers’ are key to the telco opportunity in M2M in addition to connectivity. (July 2011, Executive Briefing Service)

M2M Pie Chart Service Enablers July 2011


Below is an extract from this 39 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service here. Non-members can buy a Single User license for this report online here for £595 (+VAT) or subscribe here. For multiple user licenses or other enquiries please email contact@telco2.net or call +44 (0) 207 247 5003.


Background


Our previous M2M 2.0 research includes: M2M 2.0: New Approaches Needed; Aligning M2M with Telco 2.0 Strategies; and M2M / Embedded Market Overview, Healthcare Focus, and Strategic Options. M2M is also a theme of the upcoming New Digital Economics Executive Brainstorms in H2 2011, and there is a thought-provoking video (registration required) by Ericsson on the ‘Social web of things‘ on our Best Practice Live! site.

It’s a Long Way to the Top

The grand vision of 50 Billion connected devices looks a long way distant when contemplating the ‘cottage industry’ that is M2M today.

While there are lots of possibilities for connecting devices usefully, there are numerous challenges to doing it well and growing the market to its full potential:

  • There are many different networks that may be used for M2M – cellular, WiFi, WiMax, fixed, Bluetooth and other radio networks;
  • The needs of existing and potential M2M customers are very diverse;
  • There are many different types of potential M2M connectivity and services providers, from vertical specialists, through fixed and mobile telcos, other network owners, and device makers;
  • There are many diverse M2M devices, some with 30-year life-spans, others with lives measured in months;
  • There is massive growth in intelligent devices that can increasingly choose different networks for different applications.

There are also industry barriers to the take-up of current offerings, such as:

  • The lack of common, global, flexible solutions;
  • Performance and cost issues;
  • A low base of awareness and understanding among users and potential users.

Figure 3 (Extract) – The Key Challenge for M2M Growth is to Create a Broad, Open Market

M2M 2.0 rating of the industry barriers to M2M adoption

Source: Delegate Vote, 11th Telco 2.0 EMEA Brainstorm

More Money is in Service Enablers

It is our view (and that of the attendees at our last M2M brainstorm) that the pure connectivity revenues (to be paid for delivering the data from machine to machine) will become highly commoditised and low margin.

The “growth opportunity” will be in Software Enabling Services (SES), responsible for such activities as device provisioning, update/rollback of device software and firmware, data-warehousing, and some forms of data reduction pushed down into the network. These could be delivered traditionally or as Software-as-a-Service (SaaS).

How much Money, and for Whom?

The complex driving and structural factors lead to a high degree of uncertainty in the industry’s view of the market opportunity. For example, on average, delegates thought that by 2015 service enabler revenues would be worth 78% of connectivity revenues – although this average was formed by a large group that thought the figure would be in the range 20-40% and a small minority that thought it would be much higher (>200%).
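A small arithmetic sketch shows how such a bimodal vote can still average 78%. The vote distribution below is invented to be consistent with the text (most delegates in the 20-40% range, a small minority well above 200%); it is not the actual delegate data.

```python
# Invented vote distribution: 40 delegates expecting service-enabler
# revenues at 30% of connectivity revenues, 10 expecting 270%.
votes = [30] * 40 + [270] * 10   # percent of connectivity revenues

mean = sum(votes) / len(votes)
print(mean)  # 78.0 – a majority view of 20-40% plus a bullish tail
```

The point is that the headline 78% figure conceals a split industry view, so it should be read as a spread of scenarios rather than a consensus forecast.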

What role(s) should Telcos play?

Operators can add value by making their connectivity easier to use and providing more “M2M-friendly” interfaces – often described as managed connectivity. Beyond this, they can look to create and participate in the service enablers market, making it easy for developers and application providers to identify, authenticate, provision and maintain their device fleets; to update and roll back software on the devices; and to deploy processing logic into the “Internet of Things” in order to render the system more robust, distributed and autonomous.

Some operators already have the skills and resources to offer the application development, implementation and service hosting on top of this. Summarised in the report are examples of leading thinking and practice including Vodafone’s Global M2M Platform, Telenor Objects, Deutsche Telekom’s ‘Intelligent Network’, KPN’s and Swisscom’s platforms, plus we have previously reported on Verizon’s Open Development Initiative (ODI) in the US.

Figure 7 (Extract) – Why The Classical Approach to M2M May Fail

M2M 2.0 Why the classical approach may fail July 2011

Source: Telenor Presentation

The industry as a whole has made rapid progress but could do much more to stimulate the embedded mobility market and drive growth through standards, interoperability and portability. The industry’s historical reluctance to open itself up has left it vulnerable to being marginalised. The GSMA’s recent acceptance of over-the-air (OTA) SIM update opens up the promise of more practical ways for an M2M customer to switch operator. It now rests with the industry (or, failing that, the regulatory authorities) to deliver on this promise.

Telco 2.0 Take-Out & Next Steps

M2M is growing up as an industry, becoming more coherent and adopting increasingly similar concepts and vocabulary. However, as the wide variation in voting testifies, there is still considerable divergence in understanding and vision.

The M2M opportunity is potentially significant but does not necessarily belong to cellular networks, particularly if the industry does not work out how to create more common models that allow customers to use M2M in the way they actually need to – flexibly, seamlessly and cheaply.

While there is much energy in the debate on Machine-to-Machine in the operator community, there is widespread recognition that it is still something of a ‘cottage industry’ for operators at present, and a welcome sense of realism in that operators seem to understand that they don’t have all the answers. The core strategic challenge is to find a model that will scale beyond bespoke vertical industry applications.

While there is not yet a straightforward consensus on the relative value of service enablers compared to connectivity, our view remains that telcos need to develop the service enabler model as the connectivity market will be highly commoditised. We will continue to work to support this community, develop the service enabler model, and promote collective industry progress on M2M.

To read the full 39 page report, including analysis of the presentations, voting and delegate analysis from the M2M 2.0 Executive Brainstorms in April 2011, and London in November 2010, and the following charts…

  • Figure 1- T-Mobile’s Forecast of European M2M Markets
  • Figure 2 – Vodafone’s Global M2M Platform
  • Figure 3 – The Key Challenge for M2M Growth is to Create a Broad, Open Market
  • Figure 4 – What is the best service enabler opportunity for telcos?
  • Figure 5 – Will connectivity and generic horizontal service enabler platforms emerge and define the market?
  • Figure 6 – What are the priorities for the industry in developing M2M opportunities?
  • Figure 7 – Why ‘classical’ approaches to M2M may fail
  • Figure 8 – Horizontally layered approach needed
  • Figure 9 – KPN Development Platform
  • Figure 10 – Forecast share of service enabler revenue by type of player
  • Figure 11 – 2015 Global Service Enabler vs. Connectivity Revenues
  • Figure 12 – Issues for the ‘Internet of Things’
  • Figure 13 – Operator opportunities in the ‘Internet of Things’
  • Figure 14 – How would you characterise Ericsson’s vision of the Social Web of Things?
  • Figure 15 – What percentage of connections will be made by cellular mobile networks in 2020?

……Members of the Telco 2.0 Executive Briefing Subscription Service can download the full 39 page report in PDF format here. Non-Members, please see here for how to subscribe, here to buy a single user license for £595, or for multi-user licenses and any other enquiries please email contact@telco2.net or call +44 (0) 207 247 5003.

Organisations, products and industry terms referenced: API, ARPU, Beecham Research, Bluetooth, Bosch, BT / Arqiva, Cincius, connected car, Deutsche Telekom, Embedded Mobile, energy, Enfora, Ericsson, Facebook, GSM, GSMA, healthcare, HLR, HTTP, IMSI, Indesit, intelligent networks, Internet of things, iPhones, Kindle, KPN, Logica, M2M, messaging, m-health, MNC, MVNE, MVNO, Novatel, Objects, Open Development Initiative, Orange, OTA, OTT, platforms, roaming, SaaS, Service Enabler, SIM, smart grid, SMS, Social Web of Things, Software-as-a-Service, spectrum policy, standardization, strategy, Swisscom, Telenor, Telenor Objects, T-Mobile, transport, USIM, Verizon, Vertical, Vertical M2M, Vodafone, WiMAX, Zigbee.

 

M2M 2.0: New Approaches Needed

Summary: the M2M market structure is evolving rapidly and new roles and requirements are emerging that are unaligned with much current telco practice. What must telcos do to avoid missing out and potentially inhibiting the market overall?

Logged-in members can download this 15 Page Analyst Note here in PDF format. Non-members please see here for details of the service and how to join, or call +44 (0) 207 247 5003 / email contact@telco2.net.

Introduction

This note summarises some of our recent work on M2M, and we will cover the developments on this topic at our 2011 Executive Brainstorms, and online at Best Practice Live! on 2-3 Feb 2011.

Context

This note draws on the study Aligning M2M with Telco2.0 Strategies exploring which elements of the M2M ecosystem are appropriate to given operator strategies and drawing on the experience of Linux.  Enterprise 2.0: Machine-to-machine – opening for business is a Telco 2.0 summary and analysis of recent developments, including an advanced case study from Telenor Objects and new research from Intel, Ericsson and SAP. M2M / Embedded Market Overview, Healthcare Focus, and Strategic Options gives an overview of the market, focusing on the cost-crisis needs of the Healthcare Sector, and reviewing strategic options for Telcos and other communications industry players. We’ve also created a new M2M and Embedded category on this research portal.

Machine to Machine: Telcos’ next Goldmine?


Defining Machine-to-Machine

Machine to Machine (M2M) is the term used by the cellular network industry to describe the business of connecting devices other than phones, laptops and similar consumer devices, over cellular networks. Although definitions vary, mobile communication is deemed to be M2M largely by virtue of the type of device connected. In broad terms, communications is deemed to be M2M when it is between an application and a device for which the connectivity is required to deliver the application, but is not considered an application itself (e.g. Amazon Kindle, Point-of-Sale card reader).

M2M communications

  • Connected sensors and actuators
  • Logistics modules (asset tracking)
  • Smartmeters
  • Medical devices
  • Dedicated application consumer devices (E-readers, navigation devices)
  • Remote camcorders / CCTV cameras

Traditional cellular communications

  • Mobile phones
  • Data cards and USB “dongles” for PCs
  • Multi-functional personal “compunicators” (Tablets, MIDs)
The history

Historically, M2M has applied to major projects that integrate software, hardware and connectivity for SCADA (supervisory control and data acquisition), industrial, facilities or specialised logistics applications. The M2M communications payload has also tended to involve limited volumes of control and status data, characterised by primarily upstream narrowband traffic rather than downstream broadband.

For most operators, M2M has been a relatively minor sideline, accommodated alongside the core subscriber business. There has consequently been limited investment by operators in dedicated M2M capabilities, functionality, support or distribution. Some operators with well-developed systems integrator (SI) capabilities have included M2M connectivity as part of wider solutions, but they have done so no differently from any other SI.

The opportunity – Beyond point solutions

For years, M2M has been plagued by the promise of vast potential that seems forever “just around the corner”. A recent spate of bullish forecasts has renewed interest. Furthermore, a number of significant projects (e.g. smart grids) herald significant potential increases in demand. Operators have responded with recent activity geared to better addressing the needs of M2M application providers.

Given the relative size and wide variety of “vertical” industry applications, a point-solution approach has been inevitable, and it has served its purpose well. However, this approach requires considerable expertise to be built up by application developers and providers, who find it expensive to develop and implement point applications across many device types, especially if these are connected by different communication links. The additional cost (or, for those not wishing to incur this cost, inflexibility) of supporting a variety of devices and connections continues through development and implementation into support and maintenance.

Yet this is precisely the kind of situation we would anticipate emerging with the “Internet of Things” implicit in the heady forecasts: applications connecting millions of instances of thousands of different device types, over more than one communication technology – all delivered by developers whose interest and expertise lie elsewhere than in mastering device- or bearer-specific APIs. Just as in computing, the “Internet of Things” will need to support ongoing change in devices and communications in a generic (ideally open) fashion.

The challenge for Telcos – making it happen and securing a share of the value

Network operators’ challenge is to facilitate the growth of the M2M market while still securing a sizeable share of the value generated. The concern is that their role could be marginalised to that of suppliers in a commoditised and very competitive connectivity market.

M2M customers (primarily application providers and developers) have had to work with different operators in each territory and have been presented with varying levels of support infrastructure and interfaces. This has made it extremely complex and expensive to roll out M2M applications across a region such as Europe. This regional requirement applies not only to logistics functions (connecting devices that move across regions), but also to applications for “static” devices being rolled out and supported across a region: a typical situation for many businesses. For example, when Amazon sought to introduce its Kindle across Europe, it was frustrated by the absence of a single commercial and technical platform. Amazon eventually contracted AT&T (an operator with no network presence of its own in Europe, only roaming agreements) to provide the service that European operators could not.

Making it easier to use M2M connectivity

Operators have been working hard to make their connectivity more “M2M-friendly”:

  • Developing “one-SIM” regional (or even global) coverage. This has primarily been achieved through roaming and/or MVNOs. Larger operators have been able to leverage their assets across multiple territories, smaller ones have worked with aggregators and roaming agreements.
  • Providing dedicated M2M managed connectivity resources (people, services and platforms) to customers. Some of these have been developed internally, others have been secured through partnerships with aggregators and service platforms such as Wyless or Jasper Wireless.

These developments represent a real step forward for the industry and a sign of the market “growing up”. However, these initiatives have been focused on making (generally only one type of) connectivity easier to use. Customers looking to support multiple devices over multiple bearer networks still face an array of fragmented technical and commercial challenges and limited flexibility in combining them.

Making it easier to develop, launch and maintain applications… flexibly

The emergence of open, ubiquitous general purpose technologies will make it possible to develop, launch and maintain new applications with dramatically lower needs for capital and lead times. This would potentially be an open-source technical stack, analogous to the LAMP (Linux/Apache/MySQL and one of Perl, PHP, or Python) stack ubiquitous on the Web.

This should lead to an explosion of new M2M application providers. New players, ranging from start-ups to established “un-connected” product brands, will be better able to integrate connectivity into their offers, providing applications and services to final customers and also to each other. Typically, they would collectively source connectivity, hardware, software and services and sell tailored applications to end users – the main variation between them being how far along the chain from raw material to finished product they work.

This new market, like the World Wide Web before it, would be critically dependent on interoperability and interconnection, achieved through open standards. To quote a key Telco 2.0 principle – “The development of new common technical and commercial platforms across Telcos, which create economies of scale required to deliver a ubiquitous solution to upstream customers.”

This note therefore describes the emerging M2M market structure, characterises the business models and value networks within it, speculates about its potential future evolution and draws conclusions as to what operators can do in order to benefit from it.

Structure: M2M as a Value Network

In this analysis we use a model that defines businesses in terms of their place in a value network, one that begins with raw materials (like mineral ores) and ends with the finished product and the customer. The illustrative example is gold: we begin with the goldmine and the miners digging the ore out of the earth, see it refined into semi-manufactured bullion, which is worked into wholesale component products, which the jeweller finally customises according to the customer’s needs.

Figure 1 – Where to play in the Value Network?


Source: Telco 2.0

Each step in the process is linked with a very different type of business; notably, the left hand side of figure 1 tends to involve very large and capital-intensive firms, whose assets are usually very fixed – nothing, after all, is more fixed than a deposit of gold-bearing rocks. As the process proceeds, the minimum efficient scale tends to fall, and the optimal mix of assets changes; at the very far left, they are dominated by land, then by physical plant, then by intellectual property, and finally by human capital and intangible goods like knowledge of the customer.

To read the rest of this analysis, covering…

  • Applying the gold analogy to M2M
  • Managed Connectivity and Service Enablement Layers
  • Roles: Specialisation vs. Scale
  • Where should telcos seek to serve?
  • Future Evolution
  • How must the telco industry meet the challenges?

…and including…

  • Figure 1 – Where to play in the Value Network?
  • Figure 2: Distinguishing between the layers
  • Figure 3 – Mapping the emerging M2M Players
  • Figure 4: Balancing the equation: Telcos as service enablers

Members of the Telco 2.0™ Executive Briefing Subscription Service can download the full 15 page report in PDF format here (when logged-in). Non-Members, please see here for how to subscribe. Please email contact@telco2.net or call +44 (0) 207 247 5003 for further details.

Full Article: Aligning M2M with Telco 2.0 Strategies

Summary: A review of Telenor, Jasper Wireless and KPN’s approaches to M2M, examining how M2M strategy needs to fit with an operator’s future broadband business model strategy. (October 2010)

NB A PDF copy of this article can be downloaded here.

M2M: escaping the cottage industry

The M2M (Machine-to-Machine) market, also known as “Embedded Mobile”, has frequently been touted as a major source of future growth for the industry. Verizon Wireless, for example, has set a target of 400% mobile penetration, implying three embedded devices for each individual subscriber. However, it is widely considered that this market is cursed by potential – success always seems to be five years away. At this Spring’s Telco 2.0 Executive Brainstorm, delegates described it as being “sub-scale” and a “cottage industry”.

Specific technical, operational, and economic issues have driven this initial fragmentation. M2M is characterised by diversity – this is inevitable, as there are thousands of business processes in each of tens of thousands of sub-sectors across the industrial economy. As well as the highly tailored nature of the applications, there is considerable diversity in hardware and software products, and new products will have to coexist with many legacy systems. These many diverse but necessary combinations have provided fertile ground for the separate ‘cottage industries’.

As a result, it is difficult to build scale, despite the large total market size, and the high degree of specialisation in each sub-market acts as a barrier to entry. Volume is critical, as typical ARPUs for embedded devices are only a fraction of those we have come to expect from human subscribers. This also implies that efficiency and project execution are extremely important – there is little margin for error. Finally, with so much specialisation at both the application and device ends of the equation, it is hard to see if and where there is much room for generic functionality in the middle.

Special Technical and Operational Challenges

The technical problems are challenging. M2M applications are frequently safety-critical, operations-critical, or both. This sets a high bar in terms of availability and reliability. They often have to operate in difficult environments. Information security issues will be problematic, and new technologies such as the “Internet of Things”/ubiquitous computing will make new demands in terms of disclosure that contradict efforts to secure the system. An increasingly common requirement is for embedded devices to communicate directly and to self-organise – in the past, M2M systems have typically used a client-server architecture and guaranteed security by isolating their communications networks from the wider world. The security requirements of a peer-to-peer, internetworked M2M system are qualitatively different to those of traditional Supervisory Control and Data Acquisition (SCADA) systems.

One of the reasons for customer interest in self-organising systems is that M2M projects often involve large numbers of endpoints, which may be difficult to access once deployed, and the costs of managing the system can be very high. How are the devices deployed, activated, maintained, and withdrawn? How does the system authenticate them? Can a new device appearing on the scene be automatically detected, authenticated, and connected? A related problem is that devices are commonly integrated in industrial assets that have much longer design lives than typical cellular electronics; computers are typically depreciated over 3 years, but machine tools, vehicles, and other plant may have a design life of 30 years or more.

This implies that the M2M element must be repeatedly upgraded during its lifetime, and if possible, this should happen without a truckroll. (The asset, after all, may be an offshore wind turbine, in which case no-one will be able to visit it without using a helicopter.) This also requires that upgrades can be rolled back in the event they go wrong.
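The update-with-rollback logic described above can be sketched in a few lines. This is a minimal illustration only, using hypothetical names (`Device`, `safe_update`, `health_check`) rather than any real M2M API:

```python
# Hypothetical sketch: remote firmware update with automatic rollback,
# so a failed upgrade never requires a truckroll (or a helicopter).

class UpdateError(Exception):
    pass

class Device:
    def __init__(self, firmware="v1.0"):
        self.firmware = firmware
        self.previous = None   # retained image for rollback
        self.healthy = True

    def apply_image(self, image):
        # Keep the old image so we can roll back over-the-air.
        self.previous = self.firmware
        self.firmware = image["version"]
        self.healthy = image.get("valid", True)

    def health_check(self):
        return self.healthy

    def rollback(self):
        if self.previous is None:
            raise UpdateError("no previous image to roll back to")
        self.firmware = self.previous
        self.previous = None
        self.healthy = True

def safe_update(device, image):
    """Apply an update; if the device fails its post-update health
    check, restore the previous image automatically."""
    device.apply_image(image)
    if not device.health_check():
        device.rollback()
        return False
    return True
```

The design point is simply that the retained previous image and the post-update health check belong in the generic management layer, not in each vertical application.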

The Permanent Legacy Environment

We’ve already noted that there are a great variety of possible device classes and vendors, and that new deployments will have to co-exist with legacy systems. In fact, given the disparity between their upgrade cycles and the design lives of the assets they monitor, it’s more accurate to say that these devices will exist in a permanent legacy environment.

Solution: The Importance of System Assurance

Given the complex needs of M2M applications, just providing GPRS connectivity and modules will not be enough. Neither is there any reason to think operators will be better than anyone else at developing industrial process control or management-information systems. However, look again at the issues we’ve just discussed – they cluster around what might be termed “system assurance”. Whatever the application or the vendor, customers will need to be able to activate, deactivate, identify, authenticate, read out, locate, command, update, and roll back their fleet of embedded devices. It is almost certainly best that device designers decide what interfaces their product will have as extensions to a standard management protocol. This also implies that the common standard will need to include a function to read out what extensions are available on a given device. The similarities with the well-known SNMP (Simple Network Management Protocol) and with USSD are extensive.

These are the problems we need to solve. Are there technology strategies and business models that operators can use to profit by solving them?

We have encountered a number of examples of how operators and others have answered this question.

Three Operators’ Approaches

1. Telenor: Comprehensive Platform

Telenor Objects is a platform for handling the management, systems administration, information assurance, and applications development of large, distributed M2M systems. The core of the product is an open-source software application developed in-house at Telenor. Commercially, Objects is offered as a managed service hosted in Telenor’s data centres, either with or without Telenor data network capacity. This represents concentration on the system assurance problems we discussed above, with a further concern for rapid applications development and direct device-to-device communication.

2. Jasper: Connectivity Broker

Several companies – notably Jasper Wireless, Wyless plc, and Telenor’s own Connexion division – specialise in providing connectivity for M2M applications as a managed service. Various implementations exist, but a typical one is a data-only MVNO with either wholesale or roaming relationships to multiple physical operators. As well as acting as a broker in wholesale data service, they may also provide some specialised BSS-OSS features for M2M work, thus tackling part of the problems given above.

3. KPN: M2M Happy Pipe

KPN (an investor in Jasper Wireless) has recently announced that it intends to deploy a CDMA450 network in the Netherlands exclusively for M2M applications. Although this is a significant CAPEX commitment to the low margin connectivity element of the M2M market, it may be a valid option. Operating at 450MHz, as opposed to 900/1800/1900MHz GSM or 2100MHz UMTS, provides much better building penetration and coverage at the cost of reduced capacity. The majority of M2M applications are low bandwidth, many of them will be placed inside buildings or other radio-absorbing structures, and the low ARPUs imply that cost minimisation will be significantly more important than capacity. Where suitable spectrum is available, and a major launch customer – for example, a smart grid project – exists to justify initial expenditure, such a multi-tenant data network may be an attractive opportunity. However, this assumes that the service-enablement element of the product is provided by someone else – which may be the profitable element.

Finally, Verizon Wireless’s Open Development Initiative, rather than being a product, is a standardisation initiative intended to increase the variety of devices available for M2M implementers by speeding up the process of homologating (the official term) new modules. The intention is to create a catalogue of devices whose characteristics can be trusted and whose control interfaces are published. This is not a lucrative business, but something like it is necessary to facilitate the development of M2M hardware and software.

Horizontal Enablers

These propositions have in common that they each represent a different horizontal element of the total M2M system-of-systems – whether it’s the device-management and applications layer, as in Telenor Objects, a data-only MVNO such as Connexion or Wyless, or a radio network like KPN’s, it’s a feature or capability that is shared between different vertical silos and between multiple customers.

In developing horizontal enabler capabilities, operators need to consider how to both drive the development and growth of what is effectively a new market and ensure that they are adding value and getting paid for it. There is a natural tension between these objectives.

The tension is between providing a compelling opportunity to potential ecosystem partners (and in particular, offering them low cost access to a large potential market) and securing a clear role for providers to extract value (in particular, through differentiation).


Tensions between operators and users

Linux: a case study

To explore operator options, we have looked to the experience of Linux. This is an example of how the demands of a highly diverse user base can be tackled through horizontalisation, modular design, and open source development. Since the 1990s, the operating system has come to include a huge diversity of specialised variants, known as distributions. These consist of elements that are common to all Linux systems – such as one of the various kernels which provide the core operating system functions, the common API, drivers for different hardware devices, and a subset of a wide range of software libraries that provide key utility programs – and modules specific to the distribution, that implement its special features.

For example, Red Hat Enterprise Linux and OpenSUSE are enterprise-optimised distributions, CentOS is frequently used for Asterisk and other VoIP applications, Ubuntu is a consumer distribution (which itself has several specialised variants such as Edubuntu for educational applications), Android is a mobile-optimised distribution, Slackware and Debian exist for hardcore developers, Quagga and Zebra are optimised for use as software-based IP routers, and WindRiver produces ultra-low power systems for embedded use.

In fact, it’s probably easier to illustrate this than it is to describe it. The following diagram illustrates the growing diversity of the Linux family.

The evolution of Linux distributions over time

The reason why this has been a) possible and b) tolerable is the horizontalised, open-source, and modular nature of Linux. It could easily have been far too difficult to do a version for minimal industrial devices, another for desktop PCs, and yet another for supercomputers. Or the effort to do so could have created a morass of highly incompatible subprojects.

In creating a specialised distribution (or ‘distro’), it’s possible to rely on the existing features that span the various distributions and deal with the requirements they have in common. Similarly, a major improvement in one of those core features has a natural source of scale, and will tend to attract community involvement in its maintenance, as all the other distros will rely on it. This structure both supports specialisation and innovation, and helps to scale up support for the features everyone uses.

The Linux kernel – horizontal specialisation in action

To recap, we think that M2M devices may be a little like this – very different, but relying on at least a subset of the features in a common specification. The Linux analogy is especially important given that a lot of them are likely to use some sort of embedded Linux platform. Minimal common features are likely to cluster around:

  • Activation/Deactivation – initial switch-on of a device, provisioning it with connectivity, and eventually switching it off
  • Authentication – checking if this is the device it should be
  • Update/Rollback – updating the software and firmware on a device, and reversing this if it goes wrong
  • Device Discovery – detecting the presence of new devices
  • State Readout – getting the current values for whichever parameters the device is monitoring
  • Location – where is the device?
  • Device Status – is it working?
  • Generic Event Notification parameters – providing for notifications to and from devices that are specified by the user

This list is likely to be extended by device implementers and software developers with device- and application-specific commands and data formats, so there will also need to be a function to get a device’s interfaces and capabilities. Technically, this has considerable commonality with formats like USB, SNMP (Simple Network Management Protocol), SyncML, etc. – it’s possible that these might be implemented as extensions to one of these protocols.
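To make the shape of such a common specification concrete, the feature list above can be sketched as an abstract interface with one illustrative concrete device. All names here are hypothetical; this is a sketch of the idea, not a real standard or product API:

```python
# Hypothetical sketch: the minimal common M2M management interface,
# plus a capabilities() readout for device-specific extensions.
from abc import ABC, abstractmethod

class ManagedDevice(ABC):
    @abstractmethod
    def activate(self): ...                  # switch on, provision connectivity
    @abstractmethod
    def deactivate(self): ...                # eventual switch-off
    @abstractmethod
    def authenticate(self, credential): ...  # is this the device it should be?
    @abstractmethod
    def update(self, image): ...             # software/firmware update
    @abstractmethod
    def rollback(self): ...                  # reverse a failed update
    @abstractmethod
    def read_state(self): ...                # current monitored parameters
    @abstractmethod
    def locate(self): ...                    # where is the device?
    @abstractmethod
    def status(self): ...                    # is it working?

    def capabilities(self):
        """Read out device- or application-specific extensions
        supported beyond the common set."""
        return []

class SmartMeter(ManagedDevice):
    """Illustrative concrete device: a metering endpoint."""
    def __init__(self):
        self.active = False
        self.reading_kwh = 0.0
    def activate(self): self.active = True
    def deactivate(self): self.active = False
    def authenticate(self, credential): return credential == "secret"
    def update(self, image): pass
    def rollback(self): pass
    def read_state(self): return {"kwh": self.reading_kwh}
    def locate(self): return (52.37, 4.90)   # illustrative coordinates
    def status(self): return self.active
    def capabilities(self): return ["tariff-schedule"]  # vendor extension
```

The point of the structure is the same as in SNMP: a small mandatory core that every device implements, plus a discoverable set of extensions that vendors add without breaking generic management tools.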

For our purposes, it’s more interesting to note that these functions have a lot in common with telcos’ own competences with regard to device management, activation, OSS/BSS, and the SIM/USIM. Operators in general, and mobile operators specifically, already have to detect, authenticate, provision, update, and eventually end-of-life a diverse variety of mobile devices. As much of this as possible must happen over-the-air and automatically.

It is worth noting that Telstra recently announced their move into the M2M market. Although they are doing so with Jasper Wireless as a partner, the product (Telstra M2M Wireless Control Centre) is a Web-based self-service application for customers to activate and administer their own devices.

The commercial strategies of Linux vendors

Returning to the IT world, it’s worth asking “how do the
Linux vendors make money?” After all, their product is at least theoretically
free. We see three options.

  • Option 1 – Red Hat, Novell

Both of these major IT companies maintain their own Linux distribution (RHEL and OpenSUSE respectively, two of the most common enterprise distros) and are very significant contributors of code to the core development process. They also develop much application-layer software.

As well as releasing the source code, though, they also offer paid-for, shrink-wrapped versions of the distributions, often including added extras, and custom installations for major enterprise projects.

Typically, a large part of the commercial offering in such a deal consists of the vendor providing technical support, from first line up to systems integration and custom software development, and consulting services over the lifetime of the product. It has been remarked that Linux is only free if you value your own time at zero – this business model externalises the maintenance costs and makes them into a commercial product that supports the “free” element.

  • Option 2 – IBM

Although IBM has long had its own proprietary Unix-like operating system, through the 2000s it has become an ever more significant Linux company – the only enterprise that could claim to be bigger would be Google. Essentially, they use it as just another software option for their IT consulting and managed services operation to sell, with the considerable advantages of no upstream licence costs, very broad compatibility, and maximum scope for custom development. In return, IBM contributes significant resources to Linux, and to other open-source projects, notably OpenOffice.

  • Option 3 – Rackspace

And, of course, one way to make money from Linux is good old-fashioned hosting – they call it the cloud these days. Basically, this option captures any sort of managed-service offering that uses Linux as a core enabler, or even as the product itself.

The big divide between the options, in the end, is the cost of entry and the form it takes. If you aim to tackle Option 1, there is no substitute for very significant investment in technical R&D, at least to the level of Objects. Building up the team, the infrastructure, and significant original technology is the entry ticket. Operators aren’t – with some honourable exceptions – the greatest at internal innovation, so beware.

Telenor: flexibility through integration of multiple strategies

With Objects, Telenor has chosen this daring course. However, they have also hedged their bets between the Red Hat/Novell model and the managed-service model, by integrating elements of options 1 and 3. Objects is technically an open-source software play, and commercially/operationally a hosted service based in their existing data centre infrastructure. Its business model is solidly based on usage-based subscription.

This doesn’t mean, however, that they couldn’t flex to a different model in markets where they don’t have telco infrastructure – offering technical support and consulting to third party implementers of the software would be an option, and so would rolling it into a broader systems-integration/consulting offering. In this way, horizontalisation offers flexibility.

Option 2, of course, demands a significant specialisation in IT, SI, and related trades. This is probably achievable for those operators, like BT and DTAG, who maintain a significant IT services line of business. Otherwise, it would require a major investment and a risky change of focus.

Connectivity: needs a launch customer…

Option 3 – pure-play connectivity – is a commodity business in a sector where ARPU is typically low. However, oil is also a commodity, and nobody thinks that’s not a good business to be in. Two crucial elements for success will be operational excellence – customers will demand high availability, while low ARPU will constrain operators to an obsessive focus on cost – and volume. It will be vital to acquire a major launch customer to get the business off the ground. A smart grid project, for example, would be ideal. Once there, you can sell the remaining capacity to as many other customers as you can drum up.

Existing operators, like KPN, will have the enormous advantage of being able to re-use their existing physical footprint of cell sites, power, and backhaul, by adding a radio network more suited to the demands of cheap coverage, building penetration, and relatively low bandwidth demands, such as CDMA450 or WiMAX at relatively low frequencies.

Conclusion: M2M must fit into a total strategy

In conclusion, the future M2M market tends to map onto other ideas about the future of operators. We identified three key strategies in the Future Broadband Business Models strategy report, and they have significant relevance here.

“Telco 2.0”, with its aim to be a highly agile development platform, is likely to look to the software-led Option 1, and perhaps consider partnering with a suitable player. They might license the Objects brand and make use of the source code, or else come to an arrangement with Telenor to bring the product to their customers as a managed service.

The wholesale-led and cost-focused “Happy Pipe”, and its close cousin, “Government Department”, is likely to take its specialisation in cheap, reliable connectivity into a string of new vertical markets, picking appropriate technology and looking for opportunities in the upcoming spectrum auctions.

“Device Specialists”, with their deep concern for finding the right content, brands, and channels to market, are likely to pick Option 2 – if they have a business like T-Systems or BT Global Services, they’ll integrate it, otherwise they’ll partner with an IT player.

Telco 2.0 Further Reading

If you found this article interesting, you may also be interested in Enterprise 2.0 – Machine-to-Machine Opening for Business, a report of the M2M session at the last Telco 2.0 Executive Brainstorm, and M2M / Embedded Market Overview, Healthcare Focus, and Strategic Options, our Executive Briefing on the M2M market and the healthcare industry vertical.