Microsoft, Affirmed and Metaswitch: What does it mean for telecoms?

What is Microsoft doing, and should telcos be worried?

Over the past two years, Microsoft and its cloud business unit Azure have deepened their involvement in the telecoms vertical. In 2020, this included the acquisition of two leading independent vendors of cloud-native network software, Affirmed Networks and Metaswitch. The move surprised many industry observers, as it represented a marked escalation of Microsoft’s involvement in telco networking.

In addition, in September 2020, Microsoft announced its ‘Azure for Operators’ strategy. This packages up all the elements of Microsoft’s and Azure’s infrastructure and service offerings for the telecoms industry – including those provided by Affirmed and Metaswitch – into a more comprehensive, end-to-end portfolio organised around Microsoft’s concept of a ‘carrier-grade cloud’: a cloud that is truly capable of supporting and delivering the distinct performance and reliability that telcos require from their network functions, as opposed to the mainstream cloud devoted to enterprise IT.

In this report, our discussion of Microsoft’s strategy and partnership offer to telcos is our own interpretation based on our research, including conversations with executives from Microsoft, Affirmed Networks and Metaswitch.

We examine Microsoft’s activities in the telecoms vertical in the light of three central questions:

  • What is Microsoft doing in telecoms, and what are its intentions?
  • How should telcos respond to Microsoft’s moves and those of comparable hyperscale cloud providers? Should they consume the hyperscalers’ telco cloud products, compete against the hyperscalers, or collaborate with them?
  • And what would count as success for telcos in relation to Microsoft and the other hyperscalers? Are there any lessons to be learned from what is happening already?



Microsoft’s telecom timeline

The last couple of years have seen Microsoft and Azure increasing their involvement in telecoms infrastructure and software while building partnerships with telcos around the world. This march into telecoms stepped up a level with Microsoft’s acquisition in 2020 of two independent virtual network function (VNF) vendors with a strong presence in, among other things, the mobile core: Affirmed Networks and Metaswitch. Microsoft was not previously known for its strength in telco network software, particularly in the mobile domain – prompting the question: what exactly was it doing in telecoms?

The graphic below illustrates some of the key milestones in Microsoft’s steady march into telecoms.

Microsoft’s move on telecoms

Microsoft’s five partnership and service models

Microsoft Azure’s key initiatives over the past two years have been to expand its involvement in telecoms, culminating in the acquisition of Affirmed and Metaswitch and the launch of the Azure for Operators portfolio.

As a result of these initiatives, we believe there are five models of partnership and service delivery that Microsoft is now proposing to operators, addressing the opportunities arising from a convergence of network, cloud and compute. Altogether, these five models are:

Five business models for partnerships

  • A classic telco-vendor relationship (e.g. with Affirmed or Metaswitch) – helping telcos to evolve their own cloud-native network functions (CNFs), and cloud infrastructure and operations
  • The delivery and management of VNFs and CNFs as a cloud service, or ‘Network Functions-as-a-Service’ (NFaaS)
  • Enabling operators to pursue a hybrid-cloud operating model supporting the delivery of their own vertical-specific and enterprise applications and services, or Platform-as-a-Service (PaaS)
  • Rolling out Azure edge-cloud data centres into telco and enterprise edge locations to serve as a cloud delivery platform for third-party application developers providing low latency-dependent and high-bandwidth services, or ‘Network-as-a-Cloud Platform’ (NaaCP)
  • Using such Azure edge clouds – in enterprise and neutral facilities alongside telco edge locations – as the platform for full-fledged ‘net compute’ services, whether these are developed collaboratively with operators or not.

Table of Contents

  • Executive Summary
    • Microsoft wants to be a win-win partner
    • What should telcos and others do?
    • Next steps
  • Introduction
    • What is Microsoft doing, and should telcos be worried?
  • What has Microsoft done?
    • Microsoft’s telecom timeline
  • What is Microsoft’s strategy?
    • Microsoft’s five partnership and service models
    • The ‘Azure for Operators’ portfolio completes the set
    • 5G, cloud-native and net compute: Microsoft places itself at the heart of telco industry transformation
    • Cellular connectivity – particularly 5G – is pivotal
  • Telco-hyperscaler business models: What should telcos do?
    • Different hyperscalers have different telco strategies: comparison between Azure, AWS and Google Cloud
    • What should telcos do? Compete, consume or collaborate?
  • Microsoft’s ecosystem partnership model: What counts as success for telcos?
    • More important to grow the ecosystem than share of the value chain
    • Real-world examples: AT&T versus Verizon
  • Conclusion: Telcos should stay in the net compute game – and Microsoft wants to be a partner
  • Appendix 1: Analysis of milestones of Microsoft’s journey into telecoms
  • Appendix 2: Opportunities and risks of different types of telco-hyperscaler partnership
  • Index




Fixed wireless access growth: To 20% of homes by 2025


Fixed wireless access growth forecast

Fixed Wireless Access (FWA) networks use a wireless “last mile” link for the final connection of a broadband service to homes and businesses, rather than a copper, fibre or coaxial cable into the building. Provided mostly by WISPs (Wireless Internet Service Providers) or mobile network operators (MNOs), these services come in a wide range of speeds, prices and technology architectures.

Some FWA services are just a short “drop” from a nearby pole or fibre-fed hub, while others can work over distances of several kilometres or more in rural and remote areas, sometimes with base station sites backhauled by additional wireless links. WISPs can either be independent specialists, or traditional fixed/cable operators extending reach into areas they cannot economically cover with wired broadband.
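As a rough illustration of why link distance matters so much in FWA planning, the standard free-space path loss formula shows how signal attenuation grows with both distance and frequency. The figures below (a 3.5 GHz link, 100 m versus 5 km) are illustrative assumptions, not data from this report:

```python
import math

def free_space_path_loss_db(distance_km: float, frequency_mhz: float) -> float:
    """Free-space path loss (dB) for an idealised line-of-sight wireless link."""
    # Standard FSPL formula with distance in km and frequency in MHz.
    return 20 * math.log10(distance_km) + 20 * math.log10(frequency_mhz) + 32.44

# A short 100 m "drop" vs. a 5 km rural link, both at 3.5 GHz (illustrative).
short_drop = free_space_path_loss_db(0.1, 3500)
rural_link = free_space_path_loss_db(5.0, 3500)
print(f"100 m: {short_drop:.1f} dB, 5 km: {rural_link:.1f} dB")
```

The ~34 dB gap between the two links is one reason longer rural links often need external antennas, higher-gain equipment or lower frequencies, while short drops can use simple self-installed CPE.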

There is a fair amount of definitional vagueness about FWA. The most expansive definitions include cheap mobile hotspots (“Mi-Fi” devices) used in homes, or various types of enterprise IoT gateway, both of which could easily be classified in other market segments. Most service providers don’t give separate breakouts of deployments, while regulators and other industry bodies report patchy and largely inconsistent data.

Our view is that FWA is firstly about providing permanent broadband access to a specific location or premises. Primarily, this is for residential wireless access to the Internet and sometimes typical telco-provided services such as IPTV and voice telephony. In a business context, there may be a mix of wireless Internet access and connectivity to corporate networks such as VPNs, again provided to a specific location or building.

A subset of FWA relates to M2M usage, for instance private networks run by utility companies for controlling grid assets in the field. These are typically not Internet-connected at all, and so don’t fit most observers’ general definition of “broadband access”.

Usually, FWA will be marketed as a specific service and package by some sort of network provider, usually including the terminal equipment (“CPE” – customer premise equipment), rather than allowing the user to “bring their own” device. That said, lower-end (especially 4G) offers may be SIM-only deals intended to be used with generic (and unmanaged) portable hotspots.

There are some examples of private network FWA, such as a large caravan or trailer park with wireless access provided from a central point, and perhaps in future municipal or enterprise cellular networks giving fixed access to particular tenant structures on-site – for instance to hangars at an airport.



FWA today

Today, FWA is used for perhaps 8-9% of broadband connections globally, although this varies significantly by definition, country and region. There are various use cases (see below), but generally FWA is deployed in areas without good fixed broadband options, or by mobile-only operators trying to add an additional fixed revenue stream where they have spare capacity.

Fixed wireless internet access fits specific sectors and uses, rather than the overall market

FWA Use Cases

Source: STL Partners

FWA has traditionally been used in sparsely populated rural areas, where the economics of fixed broadband are untenable, especially in developing markets without existing fibre transport to towns and villages, or even copper in residential areas. Such networks have typically used unlicensed frequency bands, as there is limited interference – and little financial justification for expensive spectrum purchases. In most cases, such deployments use proprietary variants of Wi-Fi, or its ill-fated 2010-era sibling WiMAX.

Increasingly, however, FWA is being used in more urban settings, and in more developed market scenarios – for example during the phase-out of older xDSL broadband, or in places with limited or no competition between fixed-network providers. Some cellular networks primarily intended for mobile broadband (MBB) have been used for fixed usage as well, especially if spare capacity has been available. 4G has already catalysed rapid growth of FWA in numerous markets, such as South Africa, Japan, Sri Lanka, Italy and the Philippines – and 5G is likely to make a further big difference in coming years. These deployments mostly rely on licensed spectrum, typically the national bands owned by major MNOs. In some cases, specific bands are dedicated to FWA, rather than shared with normal mobile broadband. This allows appropriate “dimensioning” of network elements, and clearer cost-accounting for management.

Historically, most FWA has required an external antenna and professional installation on each individual house, although it also gets deployed for multi-dwelling units (MDUs, i.e. apartment blocks) as well as some non-residential premises like shops and schools. More recently, self-installed indoor CPE with varying levels of price and sophistication has helped broaden the market, enabling customers to get terminals at retail stores or delivered direct to their home for immediate use.

Looking forward, the arrival of 5G mass-market equipment and larger swathes of mmWave and new mid-band spectrum – both licensed and unlicensed – is changing the landscape again, with the potential for fibre-rivalling speeds, sometimes at gigabit-grade.
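The Shannon capacity formula gives a feel for why wider mmWave and mid-band channels translate into fibre-rivalling speeds: the upper bound on throughput scales linearly with channel bandwidth. A minimal sketch, where the bandwidth and SNR figures are illustrative assumptions rather than measured values:

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Shannon upper bound on link throughput: C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    # MHz * (bits/s/Hz) gives Mbit/s directly.
    return bandwidth_mhz * math.log2(1 + snr_linear)

# A 20 MHz 4G-style channel vs. a 400 MHz mmWave channel, both at 20 dB SNR.
print(f"4G-style: {shannon_capacity_mbps(20, 20):.0f} Mbps")
print(f"mmWave:   {shannon_capacity_mbps(400, 20):.0f} Mbps")
```

Even before MIMO gains, the twentyfold wider channel alone takes the theoretical ceiling from the low hundreds of Mbps into gigabit-grade territory.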



Table of contents

  • Executive Summary
  • Introduction
    • FWA today
    • Universal broadband as a goal
    • What’s changed in recent years?
    • What’s changed because of the pandemic?
  • The FWA market and use cases
    • Niche or mainstream? National or local?
    • Targeting key applications / user groups
  • FWA technology evolution
    • A broad array of options
    • Wi-Fi, WiMAX and close relatives
    • Using a mobile-primary network for FWA
    • 4G and 5G for WISPs
    • Other FWA options
    • Customer premise equipment: indoor or outdoor?
    • Spectrum implications and options
  • The new FWA value chain
    • Can MNOs use FWA to enter the fixed broadband market?
    • Reinventing the WISPs
    • Other value chain participants
    • Is satellite a rival waiting in the wings?
  • Commercial models and packages
    • Typical pricing and packages
    • Example FWA operators and plans
  • STL’s FWA market forecasts
    • Quantitative market sizing and forecast
    • High level market forecast
  • Conclusions
    • What will 5G deliver – and when and where?
  • Index

Apple Glass: An iPhone moment for 5G?

Augmented reality supports many use cases across industries

Revisiting the themes explored in the AR/VR: Won’t move the 5G needle report STL Partners published in January 2018, this report explores whether augmented reality (AR) could become a catalyst for widespread adoption of 5G, as leading chip supplier Qualcomm and some telcos hope.

It considers how this technology is developing, its relationship with virtual reality (VR), and the implications for telcos trying to find compelling reasons for customers to use low latency 5G networks.

This report draws the following distinction between VR and AR:

  • Virtual reality: use of an enclosed headset for total immersion in a digital 3D environment
  • Augmented reality: superimposition of digital graphics onto images of the real world via a camera viewfinder, a pair of glasses or a screen fixed in the real world.

In other words, AR is used both indoors and outdoors and on a variety of devices. Whereas Wi-Fi/fibre connectivity will be the preferred option in many scenarios, 5G will be required in locations lacking high-speed Wi-Fi coverage. Many AR applications rely on responsive connectivity to interact with the real world: to be compelling, animated images superimposed on the real world need to change in a way that is consistent with changes in the scene and in the viewing angle.

AR can be used to create innovative games, such as the 2016 phenomenon Pokemon Go, and educational and informational tools, such as travel guides that give you information about the monument you are looking at. At live sports events, spectators could use AR software to identify players, see how fast they are running, check their heart rates and call up their career statistics.

Note, an advanced form of AR is sometimes referred to as mixed reality or extended reality (XR). In this case, fully interactive digital 3D objects are superimposed on the real world, effectively mixing virtual objects and people with physical objects and people into a seamless interactive scene. For example, an advanced telepresence service could project a live hologram of the person you are talking to into the same room as you. Note, this could be an avatar representing the person or, where the connectivity allows, an actual 3D video stream of the person.

Widespread usage of AR services will be a hallmark of the Coordination Age, in the sense that they will bring valuable information to people as and when they need it. First responders, for example, could use smart glasses to help work their way through smoke inside a building, while police officers could be immediately fed information about the owner of a car registration plate. Office workers may use smart glasses to live stream a hologram of a colleague from the other side of the world or a 3D model of a new product or building.

In the home, both AR and VR could be used to generate new entertainment experiences, ranging from highly immersive games to live holograms of sports events or music concerts. Some people may even use these services as a form of escapism, virtually inhabiting alternative realities for several hours a day.

Given sufficient time to develop, STL Partners believes mixed-reality services will ultimately become widely adopted in the developed world. They will become a valuable aid to everyday living, providing the user with information about whatever they are looking at, either on a transparent screen on a pair of glasses or through a wireless earpiece. If you had a device that could give you notifications, such as an alert about a fast approaching car or a delay to your train, in your ear or eyeline, why wouldn’t you want to use it?

How different AR applications affect mobile networks

One of the key questions for the telecoms industry is how many of these applications will require very low latency, high-speed connectivity. The transmission of high-definition holographic images from one place to another in real time could place enormous demands on telecoms networks, opening up opportunities for telcos to earn additional revenues by providing dedicated/managed connectivity at a premium price. But many AR applications, such as displaying reviews of the restaurant a consumer is looking at, are unlikely to generate much data traffic. The figure below lists some potential AR use cases and indicates how demanding they will be to support.

Examples of AR use cases and the demands they make on connectivity


Source: STL Partners

Although telcos have always struggled to convince people to pay a premium for premium connectivity, some of the most advanced AR applications may be sufficiently compelling to bring about this kind of behavioural shift, just as people are prepared to pay more for a better seat at the theatre or in a sports stadium. This could be on a pay-as-you-go or a subscription basis.
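One way to picture this segmentation is a simple screen that separates AR use cases likely to run happily over best-effort broadband from those that might justify premium connectivity. The use case names, bandwidth and latency figures, and thresholds below are hypothetical illustrations, not STL Partners data:

```python
from dataclasses import dataclass

@dataclass
class ARUseCase:
    name: str
    downlink_mbps: float   # illustrative assumption
    max_latency_ms: float  # illustrative assumption

# Hypothetical figures for illustration only.
use_cases = [
    ARUseCase("Restaurant reviews overlay", 1, 200),
    ARUseCase("Live sports player stats", 10, 100),
    ARUseCase("Real-time 3D holographic call", 200, 20),
]

def needs_premium_connectivity(uc: ARUseCase) -> bool:
    # Crude screen: high bandwidth or tight latency suggests 5G/edge delivery.
    return uc.downlink_mbps >= 50 or uc.max_latency_ms <= 30

for uc in use_cases:
    tier = "premium" if needs_premium_connectivity(uc) else "best-effort"
    print(f"{uc.name}: {tier}")
```

Under these assumed thresholds, only the holographic call clears the bar for premium connectivity, echoing the report's point that most AR applications generate modest traffic.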



The pioneers of augmented reality

Augmented reality (AR) is essentially a catch-all term for any application that seeks to overlay digital information and images on the real world. Applications of AR range from a simple digital label to a live 3D holographic projection of a person or event.

AR really rose to prominence at the start of the last decade with the launch of smartphone apps, such as Layar, Junaio, and Wikitude, which gave you information about what you were looking at through the smartphone viewfinder. These apps drew on data from the handset’s GPS chip, its compass and, in some cases, image recognition software to try and figure out what was being displayed in the viewfinder. Although they attracted a lot of media attention, these apps were too clunky to break through into the mass-market. However, the underlying concept persists – the reasonably popular Google Lens app enables people to identify a product, plant or animal they are looking at or translate a menu into their own language.

Perhaps the most high profile AR application to date is Niantic’s Pokemon Go, a smartphone game that superimposes cartoon monsters on images of the real world captured by the user’s smartphone camera. Pokemon Go generated $1 billion in revenue globally just seven months after its release in mid 2016, faster than any other mobile game, according to App Annie. It has also shown remarkable staying power. Four years later, in May 2020, Pokemon Go continued to be one of the top 10 grossing games worldwide, according to SensorTower.

In November 2017, Niantic, which has also had another major AR hit with sci-fi game Ingress, raised $200 million to boost its AR efforts. In 2019, it released another AR game based on the Harry Potter franchise.

Niantic is now looking to use its AR expertise to create a new kind of marketing platform. The idea is that brands will be able to post digital adverts and content in real-world locations, essentially creating digital billboards that are viewable to consumers using the Niantic platform. At the online AWE event in May 2020, Niantic executives claimed “AR gamification and location-based context” can help businesses increase their reach, boost user sentiment, and drive foot traffic to bricks-and-mortar stores. Niantic says it is working with major brands, such as AT&T, Simon Malls, Starbucks, McDonald’s, and Samsung, to develop AR marketing that “is non-intrusive, organic, and engaging.”

The sustained success of Pokemon Go has made an impression on the major Internet platforms. By 2018, the immediate focus of both Apple and Google had clearly shifted from VR to AR. Apple CEO Tim Cook has been particularly vocal about the potential of AR. And he continues to sing the praises of the technology in public.

In January 2020, for example, during a visit to Ireland, Cook described augmented reality as the “next big thing.” In an earnings call later that month, Cook added: “When you look at AR today, you would see that there are consumer applications, there are enterprise applications. … it’s going to pervade your life…, because it’s going to go across both business and your whole life. And I think these things will happen in parallel.”

Both Apple and Google have released AR developer tools, helping AR apps to proliferate in both Apple’s App Store and on Google Play.  One of the most popular early use cases for AR is to check how potential new furniture would look inside a living room or a bedroom. Furniture stores and home design companies, such as Ikea, Wayfair and Houzz, have launched their own AR apps using Apple’s ARKit. Once the app is familiar with its surroundings, it allows the user to overlay digital models of furniture anywhere in a room to see how it will fit. The technology can work in outdoor spaces as well.

In a similar vein, there are various AR apps, such as MeasureKit, that allow you to measure any object of your choosing. After the user picks a starting point with a screen tap, a straight line will measure the length until a second tap marks the end. MeasureKit also claims to be able to calculate trajectory distances of moving objects, angle degrees, the square footage of a three-dimensional cube and a person’s height.

Table of contents

  • Executive Summary
    • More mainstream models from late 2022
    • Implications and opportunities for telcos
  • Introduction
  • Progress and Immediate Prospects
    • The pioneers of augmented reality
    • Impact of the pandemic
    • Snap – seeing the world differently
    • Facebook – the keeper of the VR flame
    • Google – the leader in image recognition
    • Apple – patiently playing the long game
    • Microsoft – expensive offerings for the enterprise
    • Amazon – teaming up with telcos to enable AR/VR
    • Market forecasts being revised down
  • Telcos Get Active in AR
    • South Korea’s telcos keep trying
    • The global picture
  • What comes next?
    • Live 3D holograms of events
    • Enhancing live venues with holograms
    • 4K HD – Simple, but effective
  • Technical requirements
    • Extreme image processing
    • An array of sensors and cameras
    • Artificial intelligence plays a role
    • Bandwidth and latency
    • Costs: energy, weight and financial
  • Timelines for Better VR and AR
    • When might mass-market models become available?
    • Implications for telcos
    • Opportunities for telcos
  • Appendix: Societal Challenges
    • AR: Is it acceptable in a public place?
    • VR: health issues
    • VR and AR: moral and ethical challenges
    • AR and VR: What do consumers really want?
  • Index



Telco edge computing: How to partner with hyperscalers

Edge computing is getting real

Hyperscalers such as Amazon, Microsoft and Google are rapidly increasing their presence in the edge computing market by launching dedicated products, establishing partnerships with telcos on 5G edge infrastructure and embedding their platforms into operators’ infrastructure.

Many telecoms operators, who need cloud infrastructure and platform support to run their edge services, have welcomed the partnership opportunity. However, they are yet to develop clear strategies on how to use these partnerships to establish a stronger proposition in the edge market, move up the value chain and play a role beyond hosting infrastructure and delivering connectivity. Operators that miss out on the partnership opportunity, or fail to fully utilise it to develop and differentiate their capabilities and resources, risk being reduced to connectivity providers with a limited role in the edge market, or simply being late to the game.

Edge computing or multi-access edge computing (MEC) enables processing data closer to the end user or device (i.e. the source of data), on physical compute infrastructure that is positioned on the spectrum between the device and the internet or hyperscale cloud.

Telco edge computing is generally defined as distributed compute managed by a telecoms operator. This includes running workloads on customer premises as well as at locations within the operator’s network. Caching and processing data closer to the customer allows both operators and their customers to benefit from reduced backhaul traffic and costs. Depending on where the computing resources reside, edge computing can be broadly divided into:

  • Network edge which includes sites or points of presence (PoPs) owned by a telecoms operator such as base stations, central offices and other aggregation points on the access and/or core network.
  • On-premise edge where the computing resources reside on the customer side, e.g. in an on-site gateway, an on-premises data centre, etc. As a result, customers retain their sensitive data on-premises and enjoy the other flexibility and elasticity benefits brought by edge computing.

Our overview of edge computing definitions, network structure, market opportunities and business models can be found in our previous report, Telco Edge Computing: What’s the operator strategy?

The edge computing opportunity for operators and hyperscalers

Many operators see edge computing as a good opportunity to leverage their existing assets and resources to innovate and move up the value chain. They aim to expand their services and revenue beyond connectivity and enter the platform and application space. By deploying computing resources at the network edge, operators can offer infrastructure-as-a-service and alternative applications and solutions for enterprises. Also, edge computing, as a distributed compute structure and an extension of the cloud, supports the operators’ own journey of virtualising the network and running internal operations more efficiently.

Cloud hyperscalers, especially the biggest three – Amazon Web Services (AWS), Microsoft Azure and Google – are at the forefront of the edge computing market. In recent years, they have made efforts to spread their influence outside of their public clouds and have moved the data acquisition point closer to physical devices. These include efforts to integrate their stacks into IoT devices and network gateways, as well as supporting private and hybrid cloud deployments. Recently, the hyperscalers took another step closer to customers at the edge by launching platforms dedicated to telecoms networks and enabling integration with 5G networks. The latest of these products include Wavelength from AWS, Azure Edge Zones from Microsoft and Anthos for Telecom from Google Cloud. Details on these products are available later in the report.



From competition to coopetition

Both hyperscalers and telcos are among the top contenders to lead the edge market. However, each stakeholder lacks a significant piece of the stack which the other has. This is the cloud platform for operators and the physical locations for hyperscalers. Initially, operators and hyperscalers were seen as competitors racing to enter the market through different approaches. This has resulted in the emergence of new types of stakeholders including independent mini data centre providers such as Vapor IO and EdgeConnex, and platform start-ups such as MobiledgeX and Ori Industries.

However, operators acknowledge that even if they do own the edge clouds, these still need to be supported by hyperscaler clouds to create a distributed cloud. To fuel the edge market and build its momentum, operators will, for the most part, work with the cloud providers. Partnerships between operators and hyperscalers are starting to take place and shape the market, impacting short- and long-term edge computing strategies for operators as well as hyperscalers and other players in the market.

Figure 1: Major telco-hyperscaler edge partnerships


Source: STL Partners analysis

What does it mean for telcos?

Going to market alone is not an attractive option for either operators or hyperscalers at the moment, given the high investment requirement without a guaranteed return. The partnerships between two of the biggest forces in the market will provide the necessary push for the use cases to be developed and enterprise adoption to be accelerated. However, as markets grow and change, so do the stakeholders’ strategies and relationships between them.

Since the emergence of cloud computing and the development of the digital technologies market, operators have faced tough competition from the internet players, including hyperscalers who have managed to remain agile while building a sustained appetite for innovation and market disruption. Edge computing is no exception, and the hyperscalers are moving rapidly to define and own the biggest share of the edge market.

Telcos that fail to develop a strategic approach to the edge could risk losing their share of the growing market as non-telco first movers continue to develop the technology and dictate the market dynamics. This report looks into what telcos should consider regarding their edge strategies and what roles they can play in the market while partnering with hyperscalers in edge computing.

Table of contents

  • Executive Summary
    • Operators’ roles along the edge computing value chain
    • Building a bigger ecosystem and pushing market adoption
    • How partnerships can shape the market
    • What next?
  • Introduction
    • The edge computing opportunity for operators and hyperscalers
    • From competition to coopetition
    • What does it mean for telcos?
  • Overview of the telco-hyperscalers partnerships
    • Explaining the major roles required to enable edge services
    • The hyperscaler-telco edge commercial model
  • Hyperscalers’ edge strategies
    • Overview of hyperscalers’ solutions and activities at the edge
    • Hyperscalers approach to edge sites and infrastructure acquisition
  • Operators’ edge strategies and their roles in the partnerships
    • Examples of operators’ edge computing activities
    • Telcos’ approach to integrating edge platforms
  • Conclusion
    • Infrastructure strategy
    • Platform strategy
    • Verticals and ecosystem building strategy

 



Telco edge computing: What’s the operator strategy?


Edge computing can help telcos to move up the value chain

The edge computing market and the technologies enabling it are rapidly developing and attracting new players, providing new opportunities to enterprises and service providers. Telco operators are eyeing the market and looking to leverage the technology to move up the value chain and generate more revenue from their networks and services. Edge computing also represents an opportunity for telcos to extend their role beyond offering connectivity services and move into the platform and the application space.

However, operators face tough competition from other market players such as cloud providers, who are moving rapidly to define and own the biggest share of the edge market. Industrial solution providers, such as Bosch and Siemens, are similarly investing in their own edge services. Telcos are also dealing with technical and business challenges as they venture into the new market, and are trying to position themselves and identify their strategies accordingly.

Telcos that fail to develop a strategic approach to the edge could risk losing their share of the growing market as non-telco first movers continue to develop the technology and dictate the market dynamics. This report looks into what telcos should consider regarding their edge strategies and what roles they can play in the market.

Following this introduction, we focus on:

  1. Edge terminology and structure, explaining common terms used within the edge computing context, where the edge resides, and the role of edge computing in 5G.
  2. An overview of the edge computing market, describing different types of stakeholders, current telecoms operators’ deployments and plans, competition from hyperscale cloud providers and the current investment and consolidation trends.
  3. Telcos’ challenges in addressing the edge opportunity: the technical, organisational and commercial challenges posed by the market.
  4. Potential use cases and business models for operators, also exploring possible scenarios of how the market is going to develop and operators’ likely positioning.
  5. A set of recommendations for operators that are building their strategy for the edge.


What is edge computing and where exactly is the edge?

Edge computing brings cloud services and capabilities including computing, storage and networking physically closer to the end-user by locating them on more widely distributed compute infrastructure, typically at smaller sites.

One could argue that edge computing has existed for some time – local infrastructure has been used for compute and storage, be it end-devices, gateways or on-premises data centres. However, edge computing, or edge cloud, refers to bringing the flexibility and openness of cloud-native infrastructure to that local infrastructure.

In contrast to hyperscale cloud computing, where all the data is sent to central locations to be processed and stored, edge computing processes data locally, aiming to reduce the time and bandwidth needed to send and receive data between applications and the cloud, which improves the performance of both the network and the applications. This does not mean that edge computing is an alternative to cloud computing. Rather, it is an evolutionary step that complements the current cloud computing infrastructure and offers more flexibility in executing and delivering applications.

Edge computing offers mobile operators several opportunities such as:

  • Differentiating service offerings using edge capabilities
  • Providing new applications and solutions using edge capabilities
  • Enabling customers and partners to leverage the distributed computing network in application development
  • Improving network performance and achieving efficiencies / cost savings

As edge computing technologies and definitions are still evolving, different terms are sometimes used interchangeably or have been associated with a certain type of stakeholder. For example, mobile edge computing is often used within the mobile network context and has evolved into multi-access edge computing (MEC) – adopted by the European Telecommunications Standards Institute (ETSI) – to include fixed and converged network edge computing scenarios. Fog computing is also often compared to edge computing; the former includes running intelligence on the end-device and is more IoT focused.

These are some of the key terms that need to be identified when discussing edge computing:

  • Network edge refers to edge compute locations that are at sites or points of presence (PoPs) owned by a telecoms operator, for example at a central office in the mobile network or at an ISP’s node.
  • Telco edge cloud is mainly defined as distributed compute managed by a telco. This includes running workloads on customer premises equipment (CPE) at customers’ sites as well as at locations within the operator network, such as base stations, central offices and other aggregation points in the access and/or core network. One benefit of caching and processing data closer to the customer is that both the operator and its customers enjoy reduced backhaul traffic and costs.
  • On-premise edge computing refers to computing resources residing at the customer’s site, e.g. in an on-site gateway or an on-premises data centre. Customers thereby retain their sensitive data on-premise while enjoying the flexibility and elasticity benefits brought by edge computing.
  • Edge cloud is used to describe the virtualised infrastructure available at the edge. It creates a distributed version of the cloud, with some of the cloud’s flexibility and scalability at the edge. This flexibility allows it to handle sudden surges in workloads from unplanned activities, unlike static on-premise servers. Figure 1 shows the differences between these terms.

Figure 1: Edge computing types


Source: STL Partners

Network infrastructure and how the edge relates to 5G

Discussions of edge computing strategies and the market are often linked to 5G. Both technologies share the goals of improving performance and throughput and reducing latency for applications such as AR/VR, autonomous vehicles and IoT. 5G improves speed by increasing spectral efficiency, offering the potential of much higher speeds than 4G. Edge computing, on the other hand, reduces latency by allocating compute resources closer to the application, shortening the time required for data processing. Combined, edge and 5G can help to achieve round-trip latency below 10 milliseconds.
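As an illustration of where a sub-10 millisecond figure can come from, the sketch below estimates round-trip propagation delay for a distant data centre versus a nearby edge site. The distances, the fibre speed (~200,000 km/s, roughly two-thirds of the speed of light) and the fixed processing allowance are our own illustrative assumptions, not figures from this report.

```python
# Illustrative latency-budget sketch: round-trip propagation delay for
# different server locations, assuming signals travel through fibre at
# roughly 200,000 km/s (i.e. ~200 km per millisecond).
FIBRE_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float, processing_ms: float = 5.0) -> float:
    """Propagation delay there and back, plus an assumed fixed processing delay."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS + processing_ms

# A distant hyperscale data centre vs. a network-edge site (hypothetical distances)
for label, km in [("central cloud (1,000 km)", 1000), ("edge site (50 km)", 50)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
```

Under these assumptions, moving the workload from 1,000 km away to 50 km away cuts the round trip from ~15 ms to ~5.5 ms, which is why physical proximity, not just radio improvements, matters for the latency targets above.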

While 5G deployment is yet to accelerate and reach ubiquitous coverage, the edge can be utilised in some places to reduce latency where needed. There are two reasons why the edge will be part of 5G:

  • First, it has been included in the 5G standards (3GPP Release 15) to enable ultra-low latency, which cannot be achieved through improvements in the radio interface alone.
  • Second, operators are generally taking a slow and gradual approach to 5G deployment, which means that 5G coverage alone will not provide a big incentive for developers to drive the application market. The edge can be used to fill the network gaps and stimulate the growth of the application market.

The network edge can be used for applications that need coverage (i.e. accessible anywhere) and can be moved across different edge locations to scale capacity up or down as required. Where an operator decides to establish an edge node depends on:

  • Application latency needs. Some applications, such as streaming virtual reality or mission-critical applications, will require locations close enough to their users to enable sub-50 millisecond latency.
  • Current network topology. Based on the operators’ network topology, there will be selected locations that can meet the edge latency requirements for the specific application under consideration in terms of the number of hops and the part of the network it resides in.
  • Virtualisation roadmap. The operator needs to consider its virtualisation roadmap and where data centre facilities are planned to be built to support future network functions.
  • Site and maintenance costs. The economies of scale of cloud computing may diminish as sites proliferate at the edge; for example, there is a significant difference between maintaining one or two large data centres and maintaining hundreds across the country.
  • Site availability. Some operators’ edge compute deployment plans assume the nodes reside in the same facilities as those which host their NFV infrastructure. However, many telcos are still in the process of renovating these locations to turn them into (mini) data centres, so they are not yet ready.
  • Site ownership. Sometimes the preferred edge location is within sites that the operators have limited control over, whether that is in the customer premise or within the network. For example, in the US, the cell towers are owned by tower operators such as Crown Castle, American Tower and SBA Communications.

The potential locations for edge nodes can be mapped across the mobile network in four levels as shown in Figure 2.

Figure 2: Possible locations for edge computing


Source: STL Partners

Table of Contents

  • Executive Summary
    • Recommendations for telco operators at the edge
    • Four key use cases for operators
    • Edge computing players are tackling market fragmentation with strategic partnerships
    • What next?
  • Table of Figures
  • Introduction
  • Definitions of edge computing terms and key components
    • What is edge computing and where exactly is the edge?
    • Network infrastructure and how the edge relates to 5G
  • Market overview and opportunities
    • The value chain and the types of stakeholders
    • Hyperscale cloud provider activities at the edge
    • Telco initiatives, pilots and plans
    • Investment and merger and acquisition trends in edge computing
  • Use cases and business models for telcos
    • Telco edge computing use cases
    • Vertical opportunities
    • Roles and business models for telcos
  • Telcos’ challenges at the edge
  • Scenarios for network edge infrastructure development
  • Recommendations
  • Index


Cloud gaming: What’s the telco play?


Drivers for cloud gaming services

Although many people still think of PlayStation and Xbox when they think about gaming, the console market represents only a third of the global games market. From its arcade and console-based beginnings, the gaming industry has come a long way. Over the past 20 years, one of the most significant market trends has been the growth of casual gamers. Whereas hardcore gamers are passionate about frequent play and will pay more to play premium games, casual gamers play to pass the time. With the rapid adoption of smartphones capable of supporting gaming applications over the past decade, the population of casual/occasional gamers has risen dramatically.

This trend has seen the advent of free-to-play business models for games, further expanding the industry’s reach. In our earlier report, STL estimated that 45% of the population in the U.S. are either casual gamers (between 2 and 5 hours a week) or occasional gamers (up to 2 hours a week). By contrast, we estimated that hardcore gamers (more than 15 hours a week) make up 5% of the U.S. population, while regular players (5 to 15 hours a week) account for a further 15% of the population.

The expansion in the number of players is driving interest in ‘cloud gaming’. Instead of games running on a console or PC, cloud gaming involves streaming games onto a device from remote servers. The actual game is stored and run on a remote server, with the results live-streamed to the player’s device. This has the important advantage of eliminating the need for players to purchase dedicated gaming hardware. Instead, the quality of the internet connection becomes the most important contributor to the gaming experience. While this type of gaming is still in its infancy, and faces a number of challenges, many companies are now entering the cloud gaming fold in an effort to capitalise on the new opportunity.

5G can support cloud gaming traffic growth

Cloud gaming requires not just high bandwidth and low latency, but also a stable connection and consistently low latency (low jitter). In theory, 5G promises to deliver stable ultra-low latency. In practice, an enormous amount of infrastructure investment will be required in order to enable a fully loaded 5G network to perform as well as end-to-end fibre. 5G networks operating in the lower frequency bands would likely buckle under the load if many gamers in a cell each needed a continuous 25Mbps stream. While 5G in millimetre-wave spectrum would have more capacity, it would require small cells and other mechanisms to ensure indoor penetration, given that this spectrum is short range and can be blocked by obstacles such as walls.
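To make the capacity point concrete, the back-of-the-envelope sketch below estimates how many continuous 25Mbps game streams a single cell could sustain. The cell capacity figures and the 80% usable-capacity factor are illustrative assumptions of ours, not measured values from any operator.

```python
# Back-of-the-envelope check: how many concurrent 25 Mbps game streams
# fit in one cell? Capacities below are illustrative round numbers.
STREAM_MBPS = 25

def max_concurrent_streams(cell_capacity_mbps: float, utilisation: float = 0.8) -> int:
    """Streams that fit if only `utilisation` of nominal capacity is usable."""
    return int(cell_capacity_mbps * utilisation // STREAM_MBPS)

# Hypothetical low-band 5G macro cell vs. a mmWave small cell
print(max_concurrent_streams(400))    # low-band macro cell: 12 streams
print(max_concurrent_streams(3000))   # mmWave small cell: 96 streams
```

Even under these generous assumptions, a low-band cell supports only a dozen simultaneous players, which illustrates why lower-frequency 5G alone would struggle with mass-market cloud gaming.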


A complicated ecosystem

As explained in our earlier report, Cloud gaming: New opportunities for telcos?, the cloud gaming ecosystem is beginning to take shape. This is being accelerated by the growing availability of fibre and high-speed broadband, which is now being augmented by 5G and, in some cases, edge data centres. Early movers in cloud gaming are offering a range of services, from gaming rigs, to game development platforms, cloud computing infrastructure, or an amalgamation of these.

One of the main attractions of cloud gaming is the potential hardware savings for gamers. High-end PC gaming can be an extremely expensive hobby: gaming PCs range from £500 for the very cheapest to over £5,000 for the very top end. They also require frequent hardware upgrades in order to meet the increasing processing demands of new gaming titles. With cloud gaming, players can access the latest graphics processing units at a much lower cost.

By some estimates, cloud gaming could deliver a high-end gaming environment at a quarter of the cost of a traditional console-based approach, as it would eliminate the need for retailing, packaging and delivering hardware and software to consumers, while also tapping the economies of scale inherent in the cloud. However, in STL Partners’ view that is a best-case scenario and a 50% reduction in costs is probably more realistic.

STL Partners believes adoption of cloud gaming will be gradual and piecemeal for the next few years, as console gamers work their way through another generation of consoles and casual gamers are reluctant to commit to a monthly subscription. However, from 2022, adoption is likely to grow rapidly as cloud gaming propositions improve.

At this stage, it is not yet clear who will dominate the value chain, if anyone. Will the “hyperscalers” be successful in creating a ‘Netflix’ for games? Google is certainly trying to do this with its Stadia platform, which has yet to gain any real traction, due to both its limited games library and its perceived technological immaturity. The established players in the games industry, such as EA, Microsoft (Xbox) and Sony (PlayStation), have launched cloud gaming offerings, or are, at least, in the process of doing so. Some telcos, such as Deutsche Telekom and Sunrise, are developing their own cloud gaming services, while SK Telecom is partnering with Microsoft.

What telcos can learn from Shadow’s cloud gaming proposition

The rest of this report explores the business models being pursued by cloud gaming providers. Specifically, it looks at cloud gaming company Shadow and how it fits into the wider ecosystem, before evaluating how its distinct approach compares with that of the major players in online entertainment, such as Sony and Google. The second half of the report considers the implications for telcos.

Table of Contents

  • Executive Summary
  • Introduction
  • Cloud gaming: a complicated ecosystem
    • The battle of the business models
    • The economics of cloud gaming and pricing models
    • Content offering will trump price
    • Cloud gaming is well positioned for casual gamers
    • The future cloud gaming landscape
  • 5G and fixed wireless
  • The role of edge computing
  • How and where can telcos add value?
  • Conclusions


Cloud gaming: New opportunities for telcos?

Gaming is following video to the cloud

Cloud gaming services enable consumers to play video games using any device with a screen and an Internet connection – the software and hardware required to play the game are all hosted on remote cloud services. Some reviewers say connectivity and cloud technologies have now advanced to a point where cloud gaming can begin to rival the experience offered by leading consoles, such as Microsoft’s Xbox and Sony’s PlayStation, while delivering greater interactivity and flexibility than gaming that relies on local hardware. Google believes it is now feasible to move gaming completely into the cloud – it has just launched its Stadia cloud gaming service. Although Microsoft is sounding a more cautious note, it is gearing up to launch a rival cloud gaming proposition called xCloud.

This report explores cloud gaming and models the size of the potential market, including the scale of the opportunity for telcos. It also considers the potential ramifications for telecoms networks. If Stadia, xCloud and other cloud gaming services take off, consumer demand for high-bandwidth, low latency connectivity could soar. At the same time, cloud gaming could also provide a key test of the business rationale for edge computing, which involves the deployment of compute power and data storage closer to the end users of digital content and applications. This allows the associated data to be processed, analysed and acted on locally, instead of being transmitted long distances to be processed at central data centres.

This report then goes on to outline the rollout of cloud gaming services by various telcos, including Deutsche Telekom in Germany and Sunrise in Switzerland, while also considering Apple’s strategy in this space. Finally, the conclusions section summarises how telcos around the world should be preparing for mass-market cloud gaming.

This report builds on previous executive briefings published by STL Partners, including:



What is cloud gaming?

Up to now, keen gamers have generally bought a dedicated console, such as a Microsoft Xbox or Sony PlayStation, or a high-end computer, to play technically complex and graphically rich games. They also typically buy a physical copy of the game (a DVD), which they install on their console or in an optical disc drive attached to their PC. Alternatively, some platforms, such as Steam, allow gamers to download games from a marketplace.

Cloud gaming changes that paradigm by running the games on remote hardware in the cloud, with the video and audio then streamed to the consumer’s device, which could be a smartphone, a connected TV, a low-end PC or a tablet. The player would typically connect this device to a dedicated handheld controller, similar to one that they would use with an Xbox or a PlayStation.

There is also a half-way house between full cloud gaming and console gaming. This “lite” form of cloud gaming is sometimes known as “command streaming”. In this case, the game logic and graphics commands are processed in the cloud, but the graphics rendering happens locally on the device. This approach lowers the amount of bandwidth required (sending commands requires less bandwidth than sending video) and is less demanding from a latency perspective (no encoding/decoding of the video stream). But the quality of graphics will be limited to the capabilities of the graphics processing unit on the end-user’s device. For keen players who want to play graphically rich games, command streaming wouldn’t necessarily eliminate the need to buy a console or a powerful PC.

As well as relocating and rejigging the computing permutations, cloud gaming opens up new business models. Rather than buying individual games, for example, the consumer could pay for a Netflix-style subscription service that would enable them to play a wide range of online video games, without having to download them. Alternatively, cloud gaming services could use a pay-as-you-go model, simply charging consumers by the minute or hour.

Today, these cloud gaming subscriptions can be relatively expensive. For example, Shadow, an existing cloud gaming service, charges US$35 a month in the U.S., £32 a month in the U.K. and €40 a month in France and Germany (but there are significant discounts if the subscriber commits to 12 months). Shadow can support graphics resolution of 4K at 60 frames per second and conventional HD at 144 frames per second, which is superior to a typical console specification. It requires an Internet connection of at least 15 Mbps. Shadow is compatible with Windows 7/8/10, macOS, Android, Linux (beta), iOS (beta) and comes with a Windows 10 license, which can be used for other PC applications.

At those prices, Shadow is a niche offering. But Google is now looking to take cloud gaming mainstream by setting subscription charges at around US$10 a month – comparable to a Spotify or Netflix subscription, although the user will have to pay additional fees to buy most games. Google says its new Stadia cloud gaming service is accessible from any device that can run YouTube in HD at 30/60 frames per second (fps), as long as it has a fast enough connection (15–25Mbps). The consumer then uses a dedicated controller that can connect directly to their Wi-Fi, bypassing the device with the screen. All the processing is done in Google’s cloud, which then sends a YouTube video-stream to the device: the URL pinpoints which clip of the gameplay to request and receive.
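A quick way to gauge what that 15–25Mbps requirement means for data consumption is to convert the sustained bitrate into gigabytes per hour. The arithmetic below is our own illustration; only the bitrate range comes from the stated connection requirements.

```python
# Rough data-consumption sketch: what a sustained game stream implies for
# data usage, using the 15-25 Mbps connection range quoted for Stadia.
def gb_per_hour(mbps: float) -> float:
    """Convert a sustained bitrate to gigabytes transferred per hour."""
    return mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> MB/hour -> GB/hour

for mbps in (15, 25):
    monthly_gb = gb_per_hour(mbps) * 30  # assuming an hour's play per day
    print(f"{mbps} Mbps: {gb_per_hour(mbps):.2f} GB/hour, ~{monthly_gb:.0f} GB/month")
```

At the top of the range, a single hour of play consumes over 11 GB, so even modest daily gaming would strain typical capped broadband or mobile data plans.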

In other words, Stadia will treat games as personalised YouTube video clips/web-pages that a player or viewer can interact with in real time. As a result, the gamer can share that stream easily with friends by sending them the URL. With permission from the gamer, the friend could then jump straight into the gameplay using their own device.


Table of contents

  • Executive Summary
  • Introduction
  • What is cloud gaming?
    • Why consumers will embrace cloud gaming
  • Ramifications for telecoms networks
    • Big demands on bandwidth
    • Latency
    • Edge computing
    • The network architecture underpinning Google Stadia
  • How large is the potential market?
    • Modelling the U.S. cloud gaming market
    • New business models
  • Telcos’ cloud gaming activities
    • Microsoft hedges its bets
    • Apple takes a different tack
  • Conclusions
    • Telcos without their own online entertainment offering
    • Telcos with their own online entertainment offering



The Industrial IoT: What’s the telco opportunity?

The Industrial IoT is a confusing world

This report is the final report in a mini-series about the Internet for Things (I4T), which we see as the next stage of evolution from today’s IoT.

The first report, The IoT is dead: Long live the Internet for Things, outlines why today’s IoT infrastructure is insufficient for meeting businesses’ needs. The main problem with today’s IoT is that every company’s data is locked in its own silo, and one company’s solutions are likely deployed on a different platform than their partners’. So companies can optimise their internal operations, but have limited scope to use IoT to optimise operations involving multiple organisations.

The second report, Digital twins: A catalyst of disruption in the Coordination Age, provides an overview of what a digital twin is, and how they can play a role in overcoming the limitations of today’s IoT industry.

This report looks more closely at the state of development of enterprise and industrial IoT and the leading players in today’s IoT industry, which we believe is a crucial driver of the Coordination Age. In the Coordination Age, we believe the crucial socio-economic need in the world – and therefore the biggest business opportunity – is to make better use of our resources, whether that is time, money, or raw materials. Given the number of people employed in and resources going through industrial processes, figuring out what’s needed to make the industrial IoT reach its full potential is a big part of making this possible.

Three types of IoT

IoT applications can be divided into three types. As described by IoT expert Stacey Higginbotham, each group has distinct needs and priorities based on its main purpose:

  1. Consumer IoT: A connected device, with an interactive app, that provides an additional service to the end user compared with an unconnected version of the device. The additional service is enabled by the insights and data gathered from the device. The key priority for consumer devices is low price point and ease of installation, given most users’ lack of technical expertise.
  2. Enterprise IoT: This includes all the devices and sensors that enterprises are connecting to the internet, e.g. enterprise mobility and fleet tracking. Since every device connected to an enterprise network is a potential point of vulnerability, the primary concern of enterprise IoT is security and device management. This is achieved through documentation of devices on enterprise networks, prioritisation of devices and traffic across multiple types of networks, e.g. depending on speed and security requirements, and access rights controls, to track who is sharing data with whom and when.
  3. Industrial IoT: This field is born out of industrial protocols such as SCADA, which do not currently connect to the internet but rather to an internal control and monitoring system for manufacturing equipment. More recently, enterprises have enhanced these systems with a host of devices connected to IP networks through Wi-Fi or other technologies, and linked legacy monitoring systems to gateways that feed operational data into more user-friendly, cloud-based monitoring and analytics solutions. At this point, the lines between Industrial IoT and Enterprise IoT blur. When the cloud-based systems have the ability to control connected equipment, for instance through firmware updates, security to prevent malicious or unintended risks is paramount. The primary goals in IIoT remain to control and monitor, in order to improve operational efficiency and safety, although with rising security needs.

The Internet for Things (I4T) is in large part about bridging the divide between Enterprise and Industrial IoT. The idea is to be able to share highly sensitive industrial information, such as a change in operational status that will affect a supply chain, or a fault in public infrastructure like roads, rail or electricity grid, that will affect surroundings and require repairs. This requires new solutions that can coordinate and track the movement of Industrial IoT data into Enterprise IoT insights and actions.

Understandably, enterprises are wary of opening up any vulnerabilities in their operations through deeper or broader connections, so finding a secure way to bring about the I4T is the primary concern.

The proliferation of IoT platforms

Almost every major player in the ICT world is pitching for a role in both Enterprise and Industrial IoT. Most large-scale manufacturers and telecoms operators are also trying to carve out a role in the IoT industry.

By and large, these players have developed specific IoT solutions linked to their core businesses, and then expanded by developing some kind of “IoT platform” that brings together a broader range of capabilities across the IT stack necessary to provide end-to-end IoT solutions.

The result is a hugely complex industry with many overlapping and competing “platforms”. Because they all do something different, the term “platform” is often unhelpful in understanding what a company provides.

A company’s “IoT platform” might comprise any combination of these four layers of the IoT stack, all of which are key components of an end-to-end solution:

  1. Hardware: This is the IoT device or sensor that is used to collect and transmit data. Larger devices may also have inbuilt compute power enabling them to run local analysis on the data collected, in order to curate which data need to be sent to a central repository or other devices.
  2. Connectivity: This is the means by which data is transmitted, ranging from local connectivity (Bluetooth, Wi-Fi), to low-power wide-area networks over unlicensed spectrum (Sigfox, LoRa), to cellular (NB-IoT, LTE-M, LTE).
  3. IoT service enablement: This is the most nebulous category, because it includes anything that sits as middleware in between connectivity and the end application. The main types of enabling functions are:
    • Cloud compute capacity for storing and analysing data
    • Data management: aggregating, structuring and standardising data from multiple different sources. There are sub-categories within this geared towards specific end applications, such as product or service lifecycle management tools.
    • Device management: device onboarding, monitoring, software updates, and security. Software and security management are often broken out as separate enablement solutions.
    • Connectivity management: orchestrating IoT devices over a variety of networks
    • Data / device visualisation: This includes graphical interfaces for presenting complex data sets and insights, and 3D modelling tools for industrial equipment.
  4. Applications: These leverage tools in the IoT enablement layer to deliver specific insights or trigger actions that deliver a specific outcome to end users, such as predictive maintenance or fleet management. Applications are usually tailored to the specific needs of end users and rarely scale well across multiple industries.

Most “IoT platforms” combine at least two layers across this IoT stack


Source: STL Partners

There are two key reasons why platforms offering end-to-end services have dominated the early development of the IoT industry:

  • Enterprises’ most immediate needs have been to have greater visibility into their own operations and make them more efficient. This means IoT initiatives have been driven primarily by business owners, rather than technology teams, who often don’t have the skills to piece together multiple different components by themselves.
  • Although the IoT as a whole is a big business, each individual component of a solution is relatively small. So companies providing IoT solutions – including telcos – have attempted to capture a larger share of the value chain in order to make it a better business.

Making sense of the confusion

It is a daunting task to work out how to bring IoT into play in any organisation. It requires a thorough re-think of how a business operates, for a start, then tinkering with (or transforming) its core systems and processes, depending on how you approach it.

That’s tricky enough even without the burgeoning market of self-proclaimed “leaders of industrial IoT” and technology players’ “IoT platforms”.

This report does not attempt to answer “what is the best way / platform” for different IoT implementations. There are many other resources available that attempt to offer comparisons to help guide users through the task of picking the right tools for the job.

The objective here is to gain a sense of what is real today, and where the opportunities and gaps are, in order to help telecoms operators and their partners understand how they can help enterprises move beyond the IoT, into the I4T.

 

Table of contents

  • Executive Summary
  • Introduction
    • Three types of IoT
    • The proliferation of IoT platforms
    • Making sense of the confusion
  • The state of the IoT industry
    • In the beginning, there was SCADA
    • Then there were specialised industrial automation systems
    • IoT providers are learning about evolving customer needs
  • Overview of IoT solution providers
    • Generalist scaled IT players
    • The Internet players (Amazon, Google and Microsoft)
    • Large-scale manufacturers
    • Transformation / IoT specialists
    • Big telco vendors
    • Telecoms operators
    • Other connectivity-led players
  • Conclusions and recommendations
    • A buyers’ eye view: Too much choice, not enough agility
    • How telcos can help – and succeed over the long term in IoT

B2B growth: How can telcos win in ICT?

Introduction

The telecom industry’s growth profile over the last few years is a sobering sight. As we have shown in our recent report Which operator growth strategies will remain viable in 2017 and beyond?, yearly revenue growth rates have been clearly slowing down globally since 2009 (see Figure 1). In three major regions (North America, Europe, Middle East) compound annual growth rates have even been behind GDP growth.

 

Figure 1: Telcos’ growth performance is flattening out (Sample of sixty-eight operators)

Source: Company accounts; STL Partners analysis

To break out of this decline, telcos are constantly searching for new sources of revenue, for example by expanding into adjacent digital service areas that largely sit within mass consumer markets (e.g. content, advertising, commerce).

However, in our ongoing conversations with telecoms operators, we increasingly come across the notion that a large part of future growth potential might actually lie in B2B (business-to-business) markets and that this customer segment will have an increasing impact on overall revenue growth.

This report investigates the rationale behind this thinking in detail and tries to answer the following key questions:

  1. What is the current state of telcos’ B2B business?
  2. Where are the telco growth opportunities in the wider enterprise ICT arena?
  3. What makes an enterprise ICT growth strategy difficult for telcos to execute?
  4. What are the pillars of a successful strategy for future B2B growth?

 

  • Executive Summary
  • Introduction
  • Telcos may have different B2B strategies, but suffer similar problems
  • Finding growth opportunities within the wider enterprise ICT arena could help
  • Three complications for revenue growth in enterprise ICT
  • Complication 1: Despite their potential, telcos struggle to marshal their capabilities effectively
  • Complication 2: Telcos are not alone in targeting enterprise ICT for growth
  • Complication 3: Telcos’ core services are being disrupted by OTT players – this time in B2B
  • STL Partners’ recommendations: strategic pillars for future B2B growth
  • Conclusion

 

  • Figure 1: Telcos’ growth performance is flattening out (Sample of sixty-eight operators)
  • Figure 2: Telcos’ B2B businesses vary significantly by scale and performance (selected operators)
  • Figure 3: High-level structure of the telecom industry’s revenue pool (2015) – the consumer segment dominates
  • Figure 4: Orange aims to expand the share of “IT & integration services” in OBS’s revenue mix
  • Figure 5: Global enterprise ICT expenditures are projected to grow 7% p.a.
  • Figure 6: Telcos and Microsoft are moving in opposite directions
  • Figure 7: SD-WAN value chain
  • Figure 8: Within AT&T Business Solutions’ revenue mix, growth in fixed strategic services cannot yet offset the decline in legacy services

The Open Source Telco: Taking Control of Destiny

Preface

This report examines the approaches to open source software – broadly, software for which the source code is freely available for use, subject to certain licensing conditions – of telecoms operators globally. Several factors have come together in recent years to make the role of open source software an important and dynamic area of debate for operators, including:

  • Technological Progress: Advances in core networking technologies, especially network functions virtualisation (NFV) and software-defined networking (SDN), are closely associated with open source software and initiatives, such as OPNFV and OpenDaylight. Many operators are actively participating in these initiatives, as well as trialling their software and, in some cases, moving them into production. This represents a fundamental shift away from the industry’s traditional, proprietary, vendor-procured model.
    • Why are we now seeing more open source activities around core communications technologies?
  • Financial Pressure: However, over-the-top (OTT) disintermediation, regulation and adverse macroeconomic conditions have led to reduced core communications revenues for operators in developed and emerging markets alike. As a result, operators are exploring opportunities to move beyond their core infrastructure business and compete in the more software-centric services layer.
    • How do the Internet players use open source software, and what are the lessons for operators?
  • The Need for Agility: In general, there is recognition within the telecoms industry that operators need to become more ‘agile’ if they are to succeed in the new, rapidly-changing ICT world, and greater use of open source software is seen by many as a key enabler of this transformation.
    • How can the use of open source software increase operator agility?

The answers to these questions, and more, are the topic of this report, which is sponsored by Dialogic and independently produced by STL Partners. The report draws on a series of 21 interviews conducted by STL Partners with senior technologists, strategists and product managers from telecoms operators globally.

Figure 1: Split of Interviewees by Business Area

Source: STL Partners

Introduction

Open source is less optional than it once was – even for Apple and Microsoft

From the audience’s point of view, the most important announcement at Apple’s Worldwide Developer Conference (WWDC) this year was not the new versions of iOS and OS X, or even its Spotify-challenging Apple Music service. Instead, it was the announcement that Apple’s highly popular programming language ‘Swift’ was to be made open source, where open source software is broadly defined as software for which the source code is freely available for use – subject to certain licensing conditions.

On one level, therefore, this represents a clever engagement strategy with developers. Open source software uptake has increased rapidly during the last 15 years, most famously embodied by the Linux operating system (OS), and with this developers have demonstrated a growing preference for open source tools and platforms. Since Apple has generally pushed developers towards proprietary development tools, and away from third-party ones (such as Adobe Flash), this is significant in itself.

An indication of open source’s growth can be found in OS market shares in consumer electronics devices. As Figure 2 below shows, Android (open source) had a 49% share of shipments in 2014; if we include the various other open source OSs in ‘other’, this increases to more than 50%.

Figure 2: Share of consumer electronics shipments* by OS, 2014

Source: Gartner
* Includes smartphones, tablets, laptops and desktop PCs

However, one of the components being open sourced is Swift’s (proprietary) compiler – a program that translates written code into an executable program that a computer system understands. The implication of this is that, in theory, we could even see Swift applications running on non-Apple devices in the future. In other words, Apple believes the risk of Swift being used on Android is outweighed by the reward of engaging with the developer community through open source.

Whilst some technology companies, especially the likes of Facebook, Google and Netflix, are well known for their activities in open source, Apple is a company famous for its proprietary approach to both hardware and software. This, combined with similar activities by Microsoft (who open sourced its .NET framework in 2014), suggest that open source is now less optional than it once was.

Open source is both an old and a new concept for operators

At first glance, open source also now appears to be less optional for telecoms operators, who traditionally procure proprietary software (and hardware) from third-party vendors. Whilst many (but not all) operators have been using open source software for some time – such as Linux and various open source databases (e.g. MySQL) in the IT domain – we have in the last 2-3 years seen a step-change in operator interest in open source across multiple domains. The following quote, taken directly from the interviews, summarises the situation nicely:

“Open source is both an old and a new project for many operators: old in the sense that we have been using Linux, FreeBSD, and others for a number of years; new in the sense that open source is moving out of the IT domain and towards new areas of the industry.” 

AT&T, for example, has been speaking widely about its ‘Domain 2.0’ programme, which aims to transform AT&T’s technical infrastructure to incorporate network functions virtualisation (NFV) and software-defined networking (SDN), to mandate a higher degree of interoperability, and to broaden the range of alternative suppliers available across its core business. By 2020, AT&T hopes to virtualise 75% of its network functions, and it sees open source as accounting for up to 50% of this. AT&T, like many other operators, is also a member of various recently-formed initiatives and foundations around NFV and SDN, such as OPNFV – Figure 3 lists some below.

Figure 3: OPNFV Platinum Members

Source: OPNFV website

However, based on publicly-available information, other operators might appear to have lesser ambitions in this space. As ever, the situation is more complex than it first appears: other operators do have significant ambitions in open source and, despite the headlines NFV and SDN draw, there are many other business areas in which open source is playing (or will play) an important role. Figure 4 below includes three quotes from the interviews which highlight this broad spectrum of opinion:

Figure 4: Different attitudes of operators to open source – selected interview quotes

Source: STL Partners interviews

Key Questions to be Addressed

Many questions therefore need to be addressed concerning operator attitudes to open source software, its adoption by area of business, and more:

  1. What is open source software, what are its major initiatives, and who uses it most widely today?
  2. What are the most important advantages and disadvantages of open source software? 
  3. To what extent are telecoms operators using open source software today? Why, and where?
  4. What are the key barriers to operator adoption of open source software?
  5. Prospects: How will this situation change?

These are now addressed in turn.

  • Preface
  • Executive Summary
  • Introduction
  • Open source is less optional than it once was – even for Apple and Microsoft
  • Open source is both an old and a new concept for operators
  • Key Questions to be Addressed
  • Understanding Open Source Software
  • The Theory: Freely available, licensed source code
  • The Industry: Dominated by key initiatives and contributors
  • Research Findings: Evaluating Open Source
  • Open source has both advantages and disadvantages
  • Debunking Myths: Open source’s performance and security
  • Where are telcos using open source today?
  • Transformation of telcos’ service portfolios is making open source more relevant than ever…
  • … and three key factors determine where operators are using open source software today
  • Open Source Adoption: Business Critical vs. Service Area
  • Barriers to Telco Adoption of Open Source
  • Two ‘external’ barriers by the industry’s nature
  • Three ‘internal’ barriers which can (and must) change
  • Prospects and Recommendations
  • Prospects: An open source evolution, not revolution
  • Open Source, Transformation, and Six Key Recommendations
  • About STL Partners and Telco 2.0
  • About Dialogic

 

  • Figure 1: Split of Interviewees by Business Area
  • Figure 2: Share of consumer electronics shipments* by OS, 2014
  • Figure 3: OPNFV Platinum Members
  • Figure 4: Different attitudes of operators to open source – selected interview quotes
  • Figure 5: The Open IT Ecosystem (incl. key industry bodies)
  • Figure 6: Three Forms of Governance in Open Source Software Projects
  • Figure 7: Three Classes of Open Source Software License
  • Figure 8: Web Server Share of Active Sites by Developer, 2000-2015
  • Figure 9: Leading software companies vs. Red Hat, market capitalisation, Oct. 2015
  • Figure 10: The Key Advantages and Disadvantages of Open Source Software
  • Figure 11: How Google Works – Failing Well
  • Figure 12: Performance gains from an open source activation (OSS) platform
  • Figure 13: Intel Hardware Performance, 2010-13
  • Figure 14: Open source is more likely to be found today in areas which are…
  • Figure 15: Framework mapping current telco uptake of open source software
  • Figure 16: Five key barriers to telco adoption of open source software
  • Figure 17: % of employees with ‘software’ in their LinkedIn job title, Oct. 2015
  • Figure 18: ‘Waterfall’ and ‘Agile’ Software Development Methodologies Compared
  • Figure 19: Four key cultural attributes for successful telco transformation

Microsoft: Pivoting to a Communications-Centric Business

Introduction: From Monopoly to Disruption

For many years, Microsoft was an iconic monopolist, in much the same way as AT&T had been before divestment. Microsoft’s products were ubiquitous and often innovative, and its profitability enormous. It was familiar, yet frequently scorned as the creator of a dreary monoculture with atrocious security properties. Microsoft’s mission statement could not have been simpler: a computer on every desk and in every home. This achieved, though, its critics have often seen it as an organisation in search of an identity, experimenting with mobile, search, maps, hardware and much else without really settling on a new direction.

Turning to the numbers: over the last two years, the once phenomenally high margins have steadily eroded, although revenue is still rising. Since Q3 2013, Microsoft’s revenue has grown an average of 3.5% annually, but the decline in margins meant that profits barely grew, with a CAGR of 0.66%. Telcos will be familiar with this kind of stagnation, though they would be delighted with Microsoft’s 66% gross margins. Note that getting into hardware has given Microsoft a typical hardware vendor’s Christmas spike in revenue.
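The growth figures quoted above follow from the standard compound annual growth rate (CAGR) formula. A minimal sketch, using illustrative index values rather than Microsoft’s actual reported figures:

```python
# CAGR: the constant annual growth rate that takes start_value to end_value over `years`.
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative example (hypothetical values): revenue growing from an index of
# 100 to 107.1 over two years works out at roughly 3.5% per annum.
growth = cagr(100.0, 107.1, 2)
print(f"{growth:.1%}")  # → 3.5%
```

The same calculation applied to profits that are nearly flat over the period yields the sub-1% CAGR the text describes.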

Figure 1:  MS revenue is growing steadily but margin erosion undermines it

Source: Microsoft 10-K, STL Partners

Over the long term, the pattern is clearer, as are the causes. Figure 2 shows Microsoft’s annual revenue and gross margin since the financial year 1995. From 1995 to 2010, gross margins were consistently between 80 and 90 per cent, twice the 45% target HP traditionally defined as “fascinating”. It was good to be king. However, in the financial year 2010, there is a clear inflection point: margins depart from the 80% mark and never return, falling at a 3.45% clip between 2010 and 2015.

The event that triggered this should be no surprise. Microsoft has traditionally been discussed in the same breath as Apple, and Apple’s 2010 was a significant year. It was the first year that Apple began using the A-series processors of its own design, benefiting from the acquisition of PA Semiconductor in 2008. This marked an important strategic shift at Apple from an outsourced, design- and brand-centric business to vertical integration and investment in manufacturing, a strategy associated with Tim Cook’s role as head of the supply chain.

Figure 2: The inflection point in 2010

Source: Microsoft 10-K, STL Partners

The deployment of the A4 chip made possible two major product launches in 2010 – the iPhone 4, which would sell enormously more than any of the previous iPhones, and the iPad, which created an entirely new product category competing directly with the PC. Another Apple product launch that year, which also competed head-on with Microsoft, wasn’t quite as dramatic but was also very significant – the MacBook line began shipping with SSDs rather than hard disks, and the very popular 11” MacBook Air was added as an entry-level option. At the time, the PC industry, and hence Microsoft, was heavily committed to Intel-backed netbooks, and the combination of the iPad and the 11” Air essentially destroyed the netbook as a product category.

The problems started in the consumer market, but the industry was beginning to recognise that innovations now took hold among consumers first and then diffused into the enterprise. Further, the enterprise franchises centred on the Microsoft Business division and what was then termed Server & Tools[1] were both threatened by the increasing adoption of Apple products.

Microsoft had to respond, and it did so with a succession of dramatic initiatives. One was to rethink Windows as a tablet- or phone-optimised operating system, in Windows Phone 7 and Windows 8. Another was to acquire Nokia’s smartphone business, and to diversify into hardware via the Xbox and Surface projects. And yet a third was to embrace the cloud. Figure 3 shows the results.

  • Introduction
  • Executive Summary
  • From Monopoly to Disruption
  • The push into mobile fails…but what about the cloud?
  • Changing Platforms: from Windows to Office
  • The Skype Acquisition: a missed opportunity?
  • Skype for Business and Office 365: the new platform
  • The rise of the consumer cloud
  • Bing may just about be breaking even…but the real story here is consumer cloud
  • Scaling out in the cloud
  • Conclusions: towards a communications-centric Microsoft

 

  • Figure 1: MS revenue is growing steadily but margin erosion undermines it
  • Figure 2: The inflection point in 2010
  • Figure 3: Revenue by product category at Microsoft, last 2 years
  • Figure 4: Cloud and the Enterprise drive profitability at Microsoft
  • Figure 5: Cloud is the driver of growth at Microsoft
  • Figure 6: Internally-developed hardware and cloud services are improving their margins
  • Figure 7: The Nokia Devices & Services business slides into loss
  • Figure 8: In 2011, a unifying API appeared critical for Skype’s future within Microsoft
  • Figure 9: Cloud is now over $8bn a year in revenue
  • Figure 10: Spot the deliberate mistake. No mention of Bing’s profitability or otherwise
  • Figure 11: Bing was a money pit for years, but may have begun to improve
  • Figure 12: The app store and consumer cloud businesses are performing superbly

Google’s Big, Big Data Battle

The challenges to Google’s core business 

Although Google is the world’s leading search engine by some distance, its pre-eminence is more fragile than it first appears. As Google likes to remind anti-trust authorities, its competitors are just a click away. And its primary competitors are some of the most powerful and well-financed companies in the world – Apple, Amazon, Facebook and Microsoft. As these companies, as well as specialist service providers, accumulate more and more data on consumers, Google’s position as the leading broker of online advertising is under threat in several inter-related ways:

  1. Google’s margins are being squeezed, as competition intensifies. Increasingly experienced web users are using specialist search engines, such as Amazon (shopping), Expedia (travel) and moneysupermarket.com (financial services), or going direct to the sites they need, thereby circumventing Google’s search engine and the advertising brokered by Google. This trend is exacerbated by Google’s ongoing lockout from the vast amount of content being generated by Facebook’s social network. As the Internet matures, general-purpose web search may become yesterday’s business.
  2. The rise of the app-based Internet: As consumers increasingly access the Internet via mobile devices, they are making greater use of apps and less use of browsers and, by extension, conventional search engines. Apps are popular on mobile devices because they are designed to take the consumer straight to the content they are looking for, rather than requiring them to navigate around the web using small and fiddly on-screen keyboards. Moreover, Apple, the leading provider of smartphones and tablets to the affluent, is seeking to relegate and, where feasible, remove Google’s apps and services from its ecosystem.
  3. Android forks: Android, an extraordinarily successful ‘Trojan Horse’ for Google’s apps and services, is the market leading operating system for mobile devices, but Google’s control of Android is patchy. Some device makers are integrating their own apps into a forked variant of this open-source platform. Amazon and Nokia are among those who have stripped Google’s search, maps, mail and store apps from their variants of the Android operating system, reducing the data that Google can gather on their customers. At the same time, Samsung, the world’s largest handset vendor, is straining at Google’s Android leash.
  4. Quality dilution: As Google is the world’s dominant search engine, it is the prime target for so-called content farms that produce large volumes of low quality content in an effort to rank highly in Google’s search results and thereby attract traffic and advertising.
  5. Regulatory scrutiny: Despite a February 2014 settlement with the European Commission concerning its search practices, Google remains in the regulatory spotlight. Competition authorities across the world continue to fret about Google’s market power and its ability to influence what people look at on the Internet.

1. Google’s margin squeeze

Price deflation

Google, the company that facilitated massive deflation across advertising, content, e-commerce, and mobile operating systems, is itself suffering from the deflationary environment of the Internet. Although revenue and net income are still growing, margins are shrinking (see Figure 2). Google is still growing because it is adding volume. However, there is strong evidence that its pricing power is being eroded.

Figure 2: Google margins are steadily falling as volumes continue to rise


Source: Google filings

To put this in the context of its Silicon Valley peers, Figure 3 shows the same data for Google, Facebook, and Apple using a trend line covering the 2009 to 2013 period for each company. Note that we have used a log scale to compare three companies of very different size. Apple saw growth in both revenue and operating margins until 2013, when it hit a difficult patch, although a big product launch might fix that at any time. Facebook has grown revenues enormously, but went through a traumatic 2012 as the shift to mobile hit it. While all this drama went on, Google has grown steadily, while seeing its margins eroded.

Figure 3: Google’s operating margins are now below those of Apple and Facebook


Source: SEC filings

What are the factors behind Google’s declining operating margin? We believe the main drivers are:

  • The amount Google can charge per click is falling – buyers get more ads per buck.
  • The cost of acquiring ad inventory is increasing.

Cheaper ads

As Figure 4 shows, Google continues to drive ad volume (paid clicks), but ad rates (cost per click) are falling steadily. The average cost-per-click on Google websites and Google Network Members’ websites decreased approximately 8% from 2012 to 2013.  We think this is primarily due to intensifying competition, particularly from Facebook. However, Google attributes the decline to “various factors, such as the introduction of new products as well as changes in property mix, platform mix and geographical mix, and the general strengthening of the U.S. dollar compared to certain foreign currencies.” The second quarter of 2014 saw paid clicks rise 2% quarter-on-quarter, while the cost per click was flat.
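Since ad revenue is simply paid clicks multiplied by cost-per-click, rising volume can more than offset a falling price per click. A toy illustration of this trade-off – the click-volume growth and index values here are hypothetical, with only the ~8% CPC decline taken from the report:

```python
# Hypothetical index values illustrating revenue = paid clicks × cost-per-click.
clicks_2012, cpc_2012 = 100.0, 1.00   # baseline index (illustrative, not Google's actuals)
clicks_2013 = clicks_2012 * 1.20      # assume paid clicks grew 20%
cpc_2013 = cpc_2012 * 0.92            # cost-per-click down ~8%, as reported

revenue_2012 = clicks_2012 * cpc_2012
revenue_2013 = clicks_2013 * cpc_2013
change = revenue_2013 / revenue_2012 - 1
print(f"{change:+.1%}")  # → +10.4%
```

Under these assumed numbers revenue still grows despite the price decline, which is exactly the pattern the report describes: growth sustained by volume, not pricing power.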

Figure 4: The cost per click is declining in lockstep with rising volume


Source: Google filings

 

  • Introduction
  • Executive Summary
  • The challenges to Google’s core business
  • 1. Google’s margin squeeze
  • 2. The rising importance of mobile apps
  • 3. Android forks
  • 4. Quality dilution
  • 5. Regulatory scrutiny
  • Google’s strategy – get on the front foot
  • Google Now – turning search on its head
  • Reactive search becomes more proactive
  • Voice input
  • Anticipating wearables, connected cars and the Internet of Things
  • Searching inside apps
  • Evaluating Google Now
  • 1. The marketplace
  • 2. Develop compelling service offerings
  • 3. The value network
  • 4. Technology
  • 5. Finance – the high-level business model

 

  • Figure 1: How Google is neutralising threats and pursuing opportunities
  • Figure 2: Google margins are steadily falling as volumes continue to rise
  • Figure 3: Google’s operating margins are now below those of Apple and Facebook
  • Figure 4: The cost per click is declining in lockstep with rising volume
  • Figure 5: Rising distribution costs are driving Google’s TAC upwards
  • Figure 6: Google’s revenues are increasingly coming from in-house sites and apps
  • Figure 7: R&D is the fastest-growing ad-acquisition cost in absolute terms
  • Figure 8: Daily active users of Facebook generating content out of Google’s reach
  • Figure 9: Google is still the most popular destination on the Internet
  • Figure 10: In the U.S., usage of desktop web sites is falling
  • Figure 11: Google’s declining share of mobile search advertising in the U.S.
  • Figure 12: Google’s lead on the mobile web is narrower than on the desktop web
  • Figure 13: Top smartphone apps in the U.S. by average unique monthly users
  • Figure 14: For Google, its removal from the default iOS Maps app is a major blow
  • Figure 15: On Android, Google owns four of the five most used apps in the U.S.
  • Figure 16: The resources Google needs to devote to web spam are rising over time
  • Figure 17: Google, now genuinely global.
  • Figure 18: A gap in the market: Timely proactive recommendations
  • Figure 19: Google’s search engine is becoming proactive
  • Figure 20: The ongoing evolution of Google Search into a proactive, recommendations service
  • Figure 21: The Telco 2.0 Business Model Framework
  • Figure 22: Amazon Local asks you to set preferences
  • Figure 23: Google Now’s cards and the information they use
  • Figure 24: Android dominates the global smartphone market
  • Figure 25: Samsung has about 30% of the global smartphone market
  • Figure 26: Google – not quite the complete Internet company
  • Figure 27: Google’s strategic response

Cisco, Microsoft, Google, AT&T, Telefonica, et al: the disruptive battle for value in communications

Technology: Products and Vendors’ Approaches

There are many vendors and products in the voice/telephony arena. Some started as pure voice products or solutions, like Cisco Call Manager, while others, such as Microsoft Office 365, started as office productivity suites to which voice and presence became a natural extension and, later, a central part of the core product functionality. We have included details on RCS; however, it is not globally available and its functionality is limited compared to some of the other products listed here.

Unified Communications

Unified Communications (UC) is not a standard; there are many different interpretations, but there is a general consensus about what it means – the unification of voice, video, messaging, presence, conferencing, and collaboration into a simple integrated user experience.

UC is an important technology for enterprise customers: it brings mobility and agility to an organisation, improves communication and collaboration, adds a social element, and lowers costs by reducing the need for office space and for multiple disparate communications systems, each with its own management and control systems. UC can also be delivered as a cloud service, known as UCaaS. Leading providers are Microsoft, Google, and Cisco; other players include IBM, 8X8, and a number of smaller vendors, as well as telco equipment manufacturers such as Ericsson. We have covered some of the leading solutions in this report, and there are definite opportunities for telcos to collaborate with these vendors, adding integration with core services such as telephony and mobile data, as well as customer support and billing.

There are several elements an enterprise must consider if a UC solution is to be successful:

  • Fixed voice functions and needs (including PBX) and integration into a UC solution
  • Mobile voice – billing, call routing, integration with fixed and UC solutions
  • Desktop and mobile video calling
  • Collaboration tools (conferencing, video conferencing, desktop integration, desktop sharing etc.)
  • Desktop integration – how does the solution integrate with core productivity tools (Microsoft Office, Google Apps, OpenOffice, etc.)?
  • PC and mobile clients – can a mobile user participate in a video conference and share files?
  • Instant messaging and social integration
  • How the user interacts with the system and how intuitive it is to use – the user experience. This is probably the most important aspect, as a good user experience promotes efficiency and end-user satisfaction

From the user perspective, it would be desirable for the solution to include the basic elements shown in Figure 1.

Figure 1: Basic user needs from Unified Communications

Source: STL Partners

Historically, enterprise communications has been an area where telcos have been a supplier to the enterprise – delivering voice endpoints (E.164 phone numbers and mobile devices), voice termination, and outgoing voice and data services.

Organisational voice communications (i.e. internal calling) has been an area of strength for companies like Cisco, Avaya, Nortel and others that have delivered on-premise solutions which offer sophisticated voice and video services. These have grown over the years to provide Instant Messaging (IM), desktop collaboration tools, and presence capabilities. PC clients often replace fixed phones, adding functionality, and can be used when out of the office. What these systems have lacked is deep integration with desktop office suites such as Microsoft Office, Google Apps, and Lotus Notes. Plug-ins or other tools can be used to integrate presence and voice, but the user experience is usually a compromise as different vendors are involved.

The big software vendors have also been active, with Microsoft and IBM adding video and telephony features, and Google building telephony and conferencing into its growing portfolio. Microsoft also acquired Skype and has delivered on its promise to integrate Skype with Lync. Meanwhile, Google has made a number of acquisitions in the video and voice arena, such as ON2, Global IP Solutions, and GrandCentral. The technology from ON2 allows video to be compressed and sent over an Internet connection, and Google is pushing for it to be integrated into one of the next major disruptors – WebRTC.

Microsoft began including voice capability with its release of Office Communications Server (OCS) in 2007. An OCS user could send instant messages, make a voice call, or place a video call to another OCS user or group of users. Presence was directly integrated with Outlook and a separate product – Office Live Meeting – was used to collaborate. Although OCS included some Private Branch eXchange (PBX) features, few enterprises regarded it as having enough features or capability to replace existing systems from the likes of Cisco. With Office 365, Microsoft stepped up the game, adding a new user interface, enhanced telephony features, integrated collaboration, and multiple methods of deployment using Microsoft’s cloud, on premise, and service provider deployments.

 

  • Technology: Products and Vendors’ Approaches
  • Unified Communications
  • Microsoft Office 365 – building on enterprise software strengths
  • Skype – the popular international behemoth
  • Cisco – the incumbent enterprise giant
  • Google – everything browser-based
  • WebRTC – a major disruptive opportunity
  • Rich Communication Service (RCS) – too little too late?
  • Broadsoft – neat web integration
  • Twilio – integrate voice and SMS into applications
  • Tropo – telephony integration technology leader
  • Voxeo – a pathfinder in integration
  • Hypervoice – make voice a native web object
  • Calltrunk – makes calls searchable
  • Operator Voice and Messaging Services
  • Section Summary
  • Telco Case Studies
  • Vodafone – 360, One Net and RED
  • Telefonica – Digital, Tu Me, Tu Go, BlueVia, Free Wi-Fi
  • AT&T – VoIP, UC, Tropo, Watson
  • Section Summary
  • STL Partners and the Telco 2.0™ Initiative

 

  • Figure 1: Basic user needs from Unified Communications
  • Figure 2: Microsoft Lync 2013 client
  • Figure 3: Microsoft Lync telephony integration options
  • Figure 4: International Telephone and Skype Traffic 2005-2012
  • Figure 5: The Skype effect on international traffic
  • Figure 6: Voice call charging in USA
  • Figure 7: Google Voice call charging in USA
  • Figure 8: Google Voice call charging in Europe
  • Figure 9: Google outbound call rates
  • Figure 10: Calliflower beta support for WebRTC
  • Figure 11: Active individual user base for WebRTC, millions
  • Figure 12: Battery life compared for different services
  • Figure 13: Vodafone One Net Express call routing
  • Figure 14: Vodafone One Net Business Call routing
  • Figure 15: Enterprise is a significant part of Vodafone group revenue
  • Figure 16: Vodafone Red Bundles
  • Figure 17: Telefonica: Market Positioning Map, Q4 2012
  • Figure 18: US market in transition towards greater competition
  • Figure 19: Voice ARPU at AT&T, fixed and mobile
  • Figure 20: Industry Value is Concentrated at the Interfaces
  • Figure 21: Telco 2.0™ ‘two-sided’ telecoms business model

Communications Services: What now makes a winning value proposition?

Introduction

This is an extract of two sections of the latest Telco 2.0 Strategy Report, The Future Value of Voice and Messaging, for members of the premium Telco 2.0 Executive Briefing Service.

The full report:

  • Shows how telcos can slow the decline of voice and messaging revenues and build new communications services to maximise revenues and relevance with both consumer and enterprise customers.
  • Includes detailed forecasts for 9 markets, in which the total decline is forecast at between -25% and -46% on a $375bn base between 2012 and 2018, giving telcos an $80bn opportunity to fight for.
  • Shows impacts and implications for other technology players including vendors and partners, and general lessons for competing with disruptive players in all markets.
  • Looks at the impact of so-called OTT competition, market trends and drivers, bundling strategies, operators developing their own Telco-OTT apps, advanced Enterprise Communications services, and the opportunities to exploit new standards such as RCS, WebRTC and VoLTE.

The Transition in User Behaviour

A global change in user behaviour

In November 2012 we published European Mobile: The Future’s not Bright, it’s Brutal. Very soon after its publication, we issued an update in the light of results from Vodafone and Telefonica suggesting that its predictions were being borne out much faster than we had expected.

Essentially, the macro-economic challenges faced by operators in southern Europe are catalysing the processes of change we identify in the industry more broadly.

This should not be seen as a “Club Med problem”. Vodafone reported a 2.7% drop in service revenue in the Netherlands, driven by customers reducing their out-of-bundle spending. This awareness of how close users are getting to their monthly bundle allowances is probably a good predictor of willingness to adopt new voice and messaging applications: if users regularly exceed the minutes or texts included in their service bundle, they will start to look for free or lower-cost alternatives. KPN Mobile has already experienced a “WhatsApp shock” to its messaging revenues. Even in Vodafone Germany, voice revenues were down 6.1% and messaging 3.7%. Although the enterprise and wholesale businesses were strong, prepaid lost enough revenue to leave the company only barely ahead. This suggests that the sizeable low-wage segment of the German labour market is under macro-economic stress, and a shock is coming.

The problem is global. For example, at the 2013 Mobile World Congress, the CEO of KT Corp described voice revenues as “collapsing” and stated that, as a result, revenues from its fixed operation had halved in two years. His counterpart at Turk Telekom asserted that “voice is dead”.

The combination of technological and macro-economic challenge results in disruptive, rather than linear change. For example, Spanish subscribers who adopt WhatsApp to substitute expensive operator messaging (and indeed voice) with relatively cheap data because they are struggling financially have no particular reason to return when the recovery eventually arrives.

Price is not the only issue

Price is not the whole problem, however. At MWC 2013, the CEO of Viber, an OTT voice and messaging provider, claimed that the app has its highest penetration in Monaco, where over 94% of the population use Viber every day. Monaco is hardly short of money, and it is a market where the incumbent operator bundles unlimited SMS (though these statistics may slightly stretch the definition of ‘population’, as many French subscribers use Monaco SIM cards). Once adoption takes off, it is driven by social factors (the dynamics of innovation diffusion) and by competition on features.

Differential psychological and social advantages of communications media

The interaction styles and use cases of the new voice and messaging apps that users have adopted are frequently quite different from those imagined by telecoms operators. Between them, telcos have done little more than add mobility to telephony during the last 100 years. However, because of the Internet and the growth of the smartphone, users now have many more ways to communicate and interact than just calling one another.

SMS (only telcos’ second mass ‘hit’ product after voice) and MMS are “fire-and-forget” – messages are independent of each other, and transported on a store-and-forward basis. Most IM applications are either conversation-based, with messages organised in threads, or stream-based, with users releasing messages on a broadcast or publish-subscribe basis. They often also have a notion of groups, communities, or topics. In getting used to these and internalising their shortcuts, netiquette, and style, customers are becoming socialised into these applications, which makes the return of telcos as messaging platform leaders with Rich Communication Services (RCS) less and less likely. Figure 1 illustrates some important psychological and social benefits of four different forms of communication.

Figure 1:  Psychological and social advantages of voice, SMS, IM, and Social Media


Source: STL Partners

The different benefits can clearly be seen. Taking voice as an example, a voice call could be a private conversation, a conference call, or even part of a webinar. Typically, voice calls are one-to-one, single-instance, and convey little presence information (an engaged tone or voicemail to others). By their very nature, voice calls are real-time and demand a high time commitment, along with the need to pay attention to the entire conversation. While not as strong as video or face-to-face communication, a voice call can convey high emotion and is, of course, an audio medium.

SMS has very different advantages. The majority of SMS messages sent are private, one-to-one conversations, and are not thread-based. They are not real-time, carry no presence information, and require a low time commitment; because of this, they typically demand minimal attention. And while a wide array of emoticons or smileys is available, these are not the same as voice or pictures. Even though some applications are starting to blur the line with voice memos, today SMS messaging is a visual experience.

Instant messaging, whether enterprise or consumer, offers a richer experience than SMS. It can include presence, it is often thread based, and can include pictures, audio, videos, and real time picture or video sharing. Social takes the communications experience a step further than IM, and many of the applications such as Facebook Messenger, LINE, KakaoTalk, and WhatsApp are exploiting the capabilities of these communications mechanisms to disrupt existing or traditional channels.

Voice calls, whether telephony or ‘OTT’, continue to possess their original benefits. But now, people are learning to use other forms of communication that better fit the psychological and social advantages that they seek in different contexts. We consider these changes to be permanent and ongoing shifts in customer behaviour towards more effective applications, and there will doubtless be more – which is both a threat and an opportunity for telcos and others.

The applicable model of how these shifts transpire is probably a Bass diffusion process, where innovators enter a market early and are followed by imitators as the mass majority. Subsequently, the innovators then migrate to a new technology or service, and the cycle continues.
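The dynamics described above can be illustrated with a minimal numerical sketch of a Bass diffusion process, in which new adopters in each period come from "innovators" (who adopt independently) and "imitators" (who adopt through contact with existing adopters). All parameter values here are illustrative assumptions, not estimates from this report's forecasts:

```python
def bass_adoption(m, p, q, periods):
    """Discrete-time Bass diffusion.

    m: total market size; p: coefficient of innovation;
    q: coefficient of imitation. Returns cumulative adopters per period.
    """
    cumulative = 0.0
    path = []
    for _ in range(periods):
        # Adoption hazard rises with the installed base: p + q * (share adopted),
        # applied to the pool of remaining non-adopters (m - cumulative).
        new_adopters = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new_adopters
        path.append(cumulative)
    return path

# Hypothetical values chosen only to produce the characteristic S-curve:
# adoption starts slowly, accelerates as imitation kicks in, then saturates.
path = bass_adoption(m=1_000_000, p=0.03, q=0.38, periods=20)
```

With q well above p, as here, the imitation term dominates once an early base exists, which is consistent with the observation that adopters "take their friends with them" and that economic pain accelerates the diffusion.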

One of the best predictors of churn is knowing a churner, and it is to be expected that users of WhatsApp, Vine, etc. will take their friends with them. Economic pain will both accelerate the diffusion process and also spread it deeper into the population, as we have seen in South Korea with KakaoTalk.

High-margin segments are more at risk

Generally, all these effects are concentrated and emphasised in the segments that are traditionally unusually profitable, as this is where users stand to gain most from the price arbitrage. A finding from European Mobile: The Future’s not Bright, it’s Brutal, borne out by the research carried out for this report, is that prices in Southern Europe were historically high, offering operators better margins than elsewhere in Europe. Similarly, international and roaming calls are preferentially affected: although international minutes of use continue to grow near their historic average rates, all of this growth and more accrues to Skype, Google, and others. Roaming, despite regulatory efforts, remains expensive and a target for disruptors. It is telling that Truphone, a subject of our 2008 voice report, has transitioned from competing with generic mobile voice to targeting roaming.

 

  • Consumers: enjoying the fragmentation
  • Enterprises: in search of integration
  • What now makes a winning value proposition?
  • The fall of telephony
  • Talk may be cheap, but time is not
  • The increasing importance of “presence”
  • The competition from Online Service Providers
  • Operators’ responses
  • Free telco & other low-cost voice providers
  • Meeting Enterprise customer needs
  • Re-imagining customer service
  • Telco attempts to meet changing needs
  • Voice Developers – new opportunities
  • Into the Hunger Gap
  • Summary: the changing telephony business model
  • Conclusions
  • STL Partners and the Telco 2.0™ Initiative

 

  • Figure 1:  Psychological and social advantages of voice, SMS, IM, and Social Media
  • Figure 2: Ideal Enterprise mobile call routing scenario
  • Figure 3: Mobile Clients used to bypass high mobile call charges
  • Figure 4: Call Screening Options
  • Figure 5: Mobile device user context and data source
  • Figure 6: Typical business user modalities
  • Figure 7:  OSPs are pursuing platform strategies
  • Figure 8: Subscriber growth of KakaoTalk
  • Figure 9: Average monthly minutes of use by market
  • Figure 10: Key features of Voice and Messaging platforms
  • Figure 11: Average user screen time Facebook vs. WhatsApp (per month)
  • Figure 12: Disruptive price competition also comes from operators
  • Figure 13: The hunger gap in music

The Future Value of Voice and Messaging

Background – ‘Voice and Messaging 2.0’

This is the latest report in our analysis of developments and strategies in the field of voice and messaging services over the past seven years. In 2007/8 we predicted the current decline in telco-provided services in Voice & Messaging 2.0 “What to learn from – and how to compete with – Internet Communications Services”, further articulated strategic options in Dealing with the ‘Disruptors’: Google, Apple, Facebook, Microsoft/Skype and Amazon in 2011, and more recently published initial forecasts in European Mobile: The Future’s not Bright, it’s Brutal. We have also looked in depth at enterprise communications opportunities, for example in Enterprise Voice 2.0: Ecosystem, Species and Strategies, and at trends in consumer behaviour, for example in The Digital Generation: Introducing the Participation Imperative Framework. For more on these reports and all of our other research on this subject, please see here.

The New Report


This report provides an independent and holistic view of the voice and messaging market, looking in detail at trends, drivers and detailed forecasts, the latest developments, and the opportunities for all players involved. The analysis will save valuable time, effort and money by providing more realistic forecasts of future potential, and a fast track to developing and/or benchmarking a leading-edge strategy and approach in digital communications. It contains:

  • Our independent, external market-level forecasts of voice and messaging in 9 selected markets (US, Canada, France, Germany, Spain, UK, Italy, Singapore, Taiwan).
  • Best practice and leading-edge strategies in the design and delivery of new voice and messaging services (leading to higher customer satisfaction and lower churn).
  • The factors that will drive best and worst case performance.
  • The intentions, strategies, strengths and weaknesses of formerly adjacent players now taking an active role in the V&M market (e.g. Microsoft).
  • Case studies of Enterprise Voice applications, including Twilio, and Unified Communications solutions such as Microsoft Office 365.
  • Case studies of Telco-OTT consumer voice and messaging services such as Telefonica’s TuGo.
  • Lessons from case studies of leading-edge new voice and messaging applications globally, such as WhatsApp, KakaoTalk and other so-called ‘Over The Top’ (OTT) players.


It comprises an 18-page executive summary, 260 pages and 163 figures – full details below. Prices on application – please email contact@telco2.net or call +44 (0) 207 247 5003.

Benefits of the Report to Telcos, Technology Companies and Partners, and Investors


For a telco, this strategy report:

  • Describes and analyses the strategies that can make the difference between best and worst case performance, worth $80bn (or +/-20% revenues) in the 9 markets we analysed.
  • Externally benchmarks internal revenue forecasts for voice and messaging, leading to more realistic assumptions, targets, decisions, and better alignment of internal (e.g. board) and external (e.g. shareholder) expectations, and thereby potentially saving money and improving contributions.
  • Can help improve decisions on voice and messaging services investments, and provides valuable insight into the design of effective and attractive new services.
  • Enables more informed decisions on the partner-vs-competitor status of non-traditional players in the V&M space with new business models, thereby producing better / more sustainable future strategies.
  • Evaluates the attractiveness of developing and/or providing partner Unified Communication services in the Enterprise market, and ‘Telco OTT’ services for consumers.
  • Shows how to create a valuable and realistic new role for voice and messaging services in its portfolio, and thereby optimise its returns on assets and capabilities.


For other players, including technology and Internet companies and telco technology vendors:

  • The report provides independent market insight on how telcos and other players will be seeking to optimise $ multi-billion revenues from voice and messaging, including new revenue streams in some areas.
  • As a potential partner, the report will provide a fast-track to guide product and business development decisions to meet the needs of telcos (and others).
  • As a potential competitor, the report will save time and improve the quality of competitor insight by giving strategic insights into the objectives and strategies that telcos will be pursuing.


For investors, it will:

  • Improve investment decisions and strategies for returning shareholder value by improving the quality of insight on forecasts and the outlook for telcos and other technology players active in voice and messaging.
  • Save vital time and effort by accelerating decision making and investment decisions.
  • Help them better understand and evaluate the needs, goals and key strategies of telcos and their partners / competitors.


The Future Value of Voice: Report Content Summary

  • Executive Summary. (18 pages outlining the opportunity and key strategic options)
  • Introduction. Disruption and transformation, voice vs. telephony, and scope.
  • The Transition in User Behaviour. Global psychological, social, pricing and segment drivers, and the changing needs of consumer and enterprise markets.
  • What now makes a winning Value Proposition? The fall of telephony, the value of time vs telephony, presence, Online Service Provider (OSP) competition, operators’ responses, free telco offerings, re-imaging customer service, voice developers, the changing telephony business model.
  • Market Trends and other Forecast Drivers. Model and forecast methodology and assumptions, general observations and drivers, ‘Peak Telephony/SMS’, fragmentation, macro-economic issues, competitive and regulatory pressures, handset subsidies.
  • Country-by-Country Analysis. Overview of national markets. Forecast and analysis of: UK, Germany, France, Italy, Spain, Taiwan, Singapore, Canada, US, other markets, summary and conclusions.
  • Technology: Products and Vendors’ Approaches. Unified Communications, Microsoft Office 365, Skype, Cisco, Google, WebRTC, Rich Communication Services (RCS), Broadsoft, Twilio, Tropo, Voxeo, Hypervoice, Calltrunk, operator voice and messaging services, summary and conclusions.
  • Telco Case Studies. Vodafone 360, One Net and RED; Telefonica Digital, Tu Me, Tu Go, BlueVia; and AT&T.
  • Summary and Conclusions. Consumer, enterprise, technology and Telco OTT.