The three telco Metaverse strategies

The Metaverse offers opportunities beyond connectivity for telcos

The Metaverse is the increasingly accepted term for a set of interconnected virtual worlds. One way to think about the Metaverse is to see it as a 3D version of the world wide web in which organizations operate their own virtual 3D worlds, rather than 2D web sites. Represented by avatars, visitors to a virtual world can interact with other users or with avatars controlled by artificial intelligence. The term Metaverse entered the popular consciousness when Facebook renamed itself Meta in October 2021.


The renaming of Facebook sparked a surge of interest in the Metaverse

Source: Google Trends

Whereas the existing Internet is essentially a 2D digital overlay of the world, composed of text, voice, images and video, the Metaverse will provide a 3D digital overlay. This is how Nvidia’s CEO, Jensen Huang, portrayed the Metaverse in a speech in November 2021. As a leading provider of graphics chips, Nvidia is thinking deeply about how to build a business case for the Metaverse, which could drive rapid growth in demand for its products.

For a fully immersive experience, the Metaverse will need to be accessed through virtual reality (VR) headsets, but it could also be explored by moving through 3D environments using a conventional handset, laptop or television. Indeed, it is important to stress that the fortunes of the Metaverse won’t necessarily depend on the fortunes of VR. Hundreds of millions of people already play video games in 3D, interacting with each other, without wearing headsets.

The Metaverse looks set to host both entirely fictional virtual spaces, where people can socialise, play and enjoy entertainment, and simulations of the real world, where people can test new product designs, learn new skills or watch concerts and sports events they can’t attend in person.

The first part of this report considers how the Metaverse could create value and the obstacles that lie in its way. It also outlines the strategies of Improbable, Meta (formerly Facebook), Microsoft and Nvidia – four companies developing many of the key enabling technologies.

The second part explores the Metaverse strategies of telcos. Broadband networks and related telco services are fundamental to the smooth running of digital environments today, and will be the building blocks of the Metaverse. We believe that telcos could play a coordination role that will help prevent the Metaverse from fragmenting into silos that are unable to interoperate with each other.

Our landmark report The Coordination Age: A third age of telecoms explained how reliable and ubiquitous connectivity can enable companies and consumers to use digital technologies to efficiently allocate and source assets and resources. In the case of the Metaverse, telcos can help people and businesses to interact and transact with each other safely and securely in 3D environments.

As it considers the opportunities for telcos, this report draws on the experiences and actions of SKT, Telefónica and Verizon, which are each deploying strategies to help coordinate the development of the Metaverse.

Table of Contents

  • Executive Summary
  • Introduction
  • What is the Metaverse for?
    • The lure of the virtual road
    • Corporate worlds take over from web sites
    • Dominance or democracy?
    • The non-fungible flexibility paradox
    • Facebook pursues metamorphosis
    • Microsoft has most of the pieces
  • What will the Metaverse mean for telcos?
    • Recreating the real world is challenging
    • Traffic implications for telcos
    • Opportunities for telcos
    • SK Telecom – the full stack standard bearer
    • Telefónica looks to play coordination role
    • AT&T and Verizon – connectivity plus edge
  • Conclusions
  • Index

Related Research


Telefónica’s 10 steps to sustainable telecoms

Telefónica’s sustainability: A 20-year journey

Sustainability in the Coordination Age

As part of STL Partners’ research on the opportunities for telecoms operators and the wider industry in the Coordination Age, where the ultimate goal for operators, their customers, and society at large is to make better use of the world’s resources, we have explored in previous research how telcos can integrate sustainability into their activities.

During the course of this research, we have identified Telefónica as one of the most proactive operators in sustainability. Through our interactions with Telefónica’s sustainability team, we have also found the team to be seriously committed, organised and successful in achieving buy-in to their vision from both the executive leadership team and several business units and opcos. This is a highly impressive achievement for such a large operator.

With the support of Telefónica’s sustainability team, through candid interviews with the team and their colleagues across the business, we have created this case study on their experiences in embedding sustainability across the business. We believe this will help other telcos intent on following a similar trajectory to understand how they can embed sustainability into their corporate strategies and day-to-day activities.

Enter your details below to request an extract of the report

How Telefónica got to where it is today

Since the creation of its first sustainability team in 2001, Telefónica has gradually built up its sustainability activities into a company-wide approach with cross-team participation. The first move in this direction came with the creation of the Climate Change Office in 2007, which included senior representatives from Operations, Procurement and Social Responsibility.

Over the last ten years, Telefónica has implemented more than 1,400 energy efficiency projects and has carried out an annual Energy and Climate Change Workshop with more than 30 vendors for 12 consecutive years, exchanging challenges and solutions for reducing energy consumption and carbon emissions.

It has three main climate targets: improving energy efficiency and reducing energy consumption; utilising renewable energy; and reducing its carbon footprint, including its value chain, to achieve net-zero emissions by 2040. Figure 2 outlines Telefónica’s sustainability journey and key inflection points through the years.

Figure 2: Key activities and inflection points in Telefónica’s sustainability journey

Achieving buy-in across the organisation

Embedding sustainability into Telefónica has been a grassroots effort on the part of the small but hardworking Global Sustainability Department (hereafter known as the environmental team in this report) to find the proof points necessary to convince Telefónica’s senior management to build sustainability into the corporate strategy. The team has used a mixture of bottom-up and top-down approaches, with management support at crucial moments, which will be explored later in the report.

Through our many conversations with Telefónica’s environmental team, perseverance stood out as the team’s most important characteristic. When recruiting new employees, the team’s priority is to find people who combine the ability to come up with innovative ideas for meeting sustainability targets with resilience and perseverance.

This determined and visionary approach means that the environmental team works intuitively and pre-empts other departments’ needs. By the time colleagues from other departments approach the environmental team with their requirements for sustainability-related projects (for example the finance team’s interest in launching a Green Bond), the team is already armed with a range of data, materials and resources needed to put together a business case for the activity. As a result of this preparation, the environmental team has been able to quickly support and capitalise on new opportunities as they have arisen, ensuring they can keep the momentum going whenever it builds.

However, the process of embedding sustainability into company strategy has not come without challenges and difficulties. In conversations with STL Partners, the environmental team said that one of the challenges of working with different teams has been picking the right moment to approach them with ideas. Telefónica also stressed the importance of finding strategic alliances and internal champions on other teams. Through strategic, considered and strong relationship building, the environmental team has found internal champions in their Spanish core network operations, finance, procurement, enterprise, and sales teams, who are fully on board with the Telefónica sustainability vision and strategy.

The environmental team is currently working with the marketing team to make its sustainability message and efforts more present in Telefónica’s brands, and cited this as one of its top priorities for 2022. Aside from needing to build stronger relationships and buy-in, part of the challenge is working out with the marketing team how to accurately and effectively market sustainability, without appearing to be ‘greenwashing’.

Another challenge is adapting to the different ways in which other teams operate when implementing sustainability initiatives across the company. For example, the sales team generally works towards quick deadlines with short-term results, so it may be harder to create an aligned dialogue with this team. Building strong insight into the way Telefónica works as an organisation, by working directly within other teams (e.g. helping the sales team to complete RFPs), helps to address this challenge.

By embedding sustainability into the company in these ways, all departments see the benefit and engage with the process. Telefónica told STL Partners that its employees believe in sustainability on a personal level as well as seeing the business benefit and commercial opportunity. Employees are genuinely engaging with sustainability issues themselves and want Telefónica to work towards sustainability goals as a company. As one employee said to us, “you don’t have to work in the environmental team to want to protect the environment”.

Ultimately, this rigorous, patient, committed and collaborative approach to sustainability has enabled the team to achieve broad buy-in across Telefónica’s business units and international opcos. Throughout the report we will explore how it has done this in:

  • Core network operations
  • Finance
  • Enterprise services
  • International opcos.

Table of contents

  • Executive Summary
    • What makes Telefónica different to other telcos?
    • Next steps
  • Table of Figures
  • Telefónica’s 20-year sustainability journey
    • Sustainability in the Coordination Age
    • How Telefónica got to where it is today
    • Achieving buy-in across the organisation
  • Why Telefónica stands out among telcos
    • High level overview of achievements so far
    • How Telefónica compares with other telcos
    • How Telefónica collaborates with its peers
  • Network operations: The first step to embedding sustainability in Telefónica
  • Sustainable financing: A pioneer in telecoms
    • How the first Green Bond came to life
    • Subsequent green and sustainable bonds
    • Challenges and benefits
  • Eco Smart label and consulting services: Expanding from networks to services
    • How the idea came to life
    • Consulting services through Telefónica Tech
    • Eco Smart label in 5G services
    • Sustainability as a core component of digital transformation
  • Implementing sustainability across a global footprint
    • Aligning goals with individual market dynamics
  • Conclusion
    • Ten takeaways from Telefónica’s holistic approach
  • Index


Building the learning telco

Organisational learning is key to telcos’ success in the Coordination Age

Developments in technology and organisational digital transformations have increased the pressure on learning and development (L&D) departments in telcos. L&D departments, many of which were compliance-focused, were tasked with upgrading telcos’ entire skills inventories to ensure that workforces were fit for new ways of working (e.g. AT&T’s “Workforce Reskilling” effort announced in 2016).

What was perhaps under-appreciated initially was that the need for L&D would not go away:

  • Telcos continue to operate in dynamic environments that are inherently unstable (e.g. pandemics, climate crises, new and evolving technologies);
  • Traditional telco revenue streams have remained under pressure, requiring new and innovative thinking to identify opportunities for growth.

The VUCA acronym (first coined in 1987) – standing for volatility, uncertainty, complexity, ambiguity – provides a useful framework to describe the current telco environment.

Enter your details below to request an extract of the report

The telco’s highly VUCA environment


Source: STL Partners

Telcos have made changes to organisation structures in order to accommodate this reality, e.g. “flattening” the organisation and decentralising decision-making to accelerate the pace at which organisations can take action (absorb change and innovate).

Additionally, they are recognising the importance of learning to this process. Workforce skills must remain relevant and collective corporate intelligence must evolve to decide and inform winning strategies.

This type of “organisational learning” requires conscious effort on the part of both the organisation and individual employees. It is not enough to make L&D the sole responsibility of an L&D team or an HR department, and to task them with identifying appropriate content and courses to push out to employees.

Organisations need to foster an environment where learning is encouraged and enabled in pursuit of organisational improvement, customer satisfaction, innovation and growth. After all, it is impossible to improve/do something new without learning in the first instance. Learning tools, processes and practices are required – and barriers to learning should be removed.

Learning barriers can include:

  • L&D teams creating bottlenecks to learning (e.g. restricted course access)
  • The existence of knowledge silos
  • Beliefs that “knowledge is power”
  • A lack of clear goals around using knowledge/new capabilities for improvement (i.e. learning to create behaviour change)
  • No incentives for individuals or teams to engage in learning
  • Uncertainty about processes for capturing and sharing learning
  • Fear of failure inhibiting trials in order to learn something new.

This report considers the key practices associated with organisational learning and identifies lessons from telcos that are progressing towards becoming learning organisations.

Table of contents

  • Executive Summary
  • Introduction
  • The value of organisational learning
  • Enabling organisational learning
    • Types of learning in organisations
  • Organisational learning in practice
    • Learning as an organisational priority
    • Identifying learning purpose
    • Content-based learning
    • Person-led learning (knowledge sharing)
    • Process-led learning
    • Trial, reflection and practice
    • Recognition and rewards for learning
  • Towards learning organisations
    • Findings
    • Evaluation
  • Conclusions
  • Index


Microsoft, Affirmed and Metaswitch: What does it mean for telecoms?

What is Microsoft doing, and should telcos be worried?

Over the past two years, Microsoft and its cloud business unit Azure have intensified and deepened their involvement in the telecoms vertical. In 2020, this included the acquisition of two leading independent vendors of cloud-native network software, Affirmed Networks and Metaswitch. This move surprised many industry observers, as it took Microsoft much deeper into telco networking.

In addition, in September 2020, Microsoft announced its ‘Azure for Operators’ strategy. This packages up all the elements of Microsoft’s and Azure’s infrastructure and service offerings for the telecoms industry – including those provided by Affirmed and Metaswitch – into a more comprehensive, end-to-end portfolio organised around Microsoft’s concept of a ‘carrier-grade cloud’: a cloud that is truly capable of supporting and delivering the distinct performance and reliability that telcos require from their network functions, as opposed to the mainstream cloud devoted to enterprise IT.

In this report, our discussion of Microsoft’s strategy and partnership offer to telcos is our own interpretation based on our research, including conversations with executives from Microsoft, Affirmed Networks and Metaswitch.

We examine Microsoft’s activities in the telecoms vertical in the light of three central questions:

  • What is Microsoft doing in telecoms, and what are its intentions?
  • How should telcos respond to Microsoft’s moves and those of comparable hyperscale cloud providers? Should they consume the hyperscalers’ telco cloud products, compete against the hyperscalers, or collaborate with them?
  • And what would count as success for telcos in relationship to Microsoft and the other hyperscalers? Are there any lessons to be learned from what is happening already?


Microsoft’s telecom timeline

The last couple of years have seen Microsoft and Azure increasing their involvement in telecoms infrastructure and software while building partnerships with telcos around the world. This march into telecoms stepped up a level with Microsoft’s acquisition in 2020 of two independent virtual network function (VNF) vendors with a strong presence in the mobile core, among other things: Affirmed Networks and Metaswitch. Microsoft was not previously known for its strength in telco network software, particularly in the mobile domain – prompting the question: what exactly was it doing in telecoms?

The graphic below illustrates some of the key milestones in Microsoft’s steady march into telecoms.

Microsoft’s move on telecoms

Microsoft’s five partnership and service models

Microsoft Azure’s key initiatives over the past two years have expanded its involvement in telecoms, culminating in Microsoft’s acquisition of Affirmed and Metaswitch and the launch of the Azure for Operators portfolio.

As a result of these initiatives, we believe there are five models of partnership and service delivery that Microsoft is now proposing to operators, addressing the opportunities arising from a convergence of network, cloud and compute. Altogether, these five models are:

Five business models for partnerships

  • A classic telco-vendor relationship (e.g. with Affirmed or Metaswitch) – helping telcos to evolve their own cloud-native network functions (CNFs), and cloud infrastructure and operations
  • The delivery and management of VNFs and CNFs as a cloud service, or ‘Network Functions-as-a-Service’ (NFaaS)
  • Enabling operators to pursue a hybrid-cloud operating model supporting the delivery of their own vertical-specific and enterprise applications and services, or Platform-as-a-Service (PaaS)
  • Rolling out Azure edge-cloud data centres into telco and enterprise edge locations to serve as a cloud delivery platform for third-party application developers providing latency-sensitive and high-bandwidth services, or ‘Network-as-a-Cloud Platform’ (NaaCP)
  • Using such Azure edge clouds – in enterprise and neutral facilities alongside telco edge locations – as the platform for full-fledged ‘net compute’ services, whether these are developed collaboratively with operators or not.

Table of Contents

  • Executive Summary
    • Microsoft wants to be a win-win partner
    • What should telcos and others do?
    • Next steps
  • Introduction
    • What is Microsoft doing, and should telcos be worried?
  • What has Microsoft done?
    • Microsoft’s telecom timeline
  • What is Microsoft’s strategy?
    • Microsoft’s five partnership and service models
    • The ‘Azure for Operators’ portfolio completes the set
    • 5G, cloud-native and net compute: Microsoft places itself at the heart of telco industry transformation
    • Cellular connectivity – particularly 5G – is pivotal
  • Telco-hyperscaler business models: What should telcos do?
    • Different hyperscalers have different telco strategies: comparison between Azure, AWS and Google Cloud
    • What should telcos do? Compete, consume or collaborate?
  • Microsoft’s ecosystem partnership model: What counts as success for telcos?
    • More important to grow the ecosystem than share of the value chain
    • Real-world examples: AT&T versus Verizon
  • Conclusion: Telcos should stay in the net compute game – and Microsoft wants to be a partner
  • Appendix 1: Analysis of milestones of Microsoft’s journey into telecoms
  • Appendix 2: Opportunities and risks of different types of telco-hyperscaler partnership
  • Index


Fixed wireless access growth: To 20% homes by 2025


Fixed wireless access growth forecast

Fixed Wireless Access (FWA) networks use a wireless “last mile” link for the final connection of a broadband service to homes and businesses, rather than a copper, fibre or coaxial cable into the building. Provided mostly by WISPs (Wireless Internet Service Providers) or mobile network operators (MNOs), these services come in a wide range of speeds, prices and technology architectures.

Some FWA services are just a short “drop” from a nearby pole or fibre-fed hub, while others can work over distances of several kilometres or more in rural and remote areas, sometimes with base station sites backhauled by additional wireless links. WISPs can either be independent specialists, or traditional fixed/cable operators extending reach into areas they cannot economically cover with wired broadband.

There is a fair amount of definitional vagueness about FWA. The most expansive definitions include cheap mobile hotspots (“Mi-Fi” devices) used in homes, or various types of enterprise IoT gateway, both of which could easily be classified in other market segments. Most service providers don’t give separate breakouts of deployments, while regulators and other industry bodies report patchy and largely inconsistent data.

Our view is that FWA is firstly about providing permanent broadband access to a specific location or premises. Primarily, this is for residential wireless access to the Internet and sometimes typical telco-provided services such as IPTV and voice telephony. In a business context, there may be a mix of wireless Internet access and connectivity to corporate networks such as VPNs, again provided to a specific location or building.

A subset of FWA relates to M2M usage, for instance private networks run by utility companies for controlling grid assets in the field. These are typically not Internet-connected at all, and so don’t fit most observers’ general definition of “broadband access”.

Usually, FWA will be marketed as a specific service and package by some sort of network provider, typically including the terminal equipment (“CPE” – customer premise equipment), rather than allowing the user to “bring their own” device. That said, lower-end (especially 4G) offers may be SIM-only deals intended to be used with generic (and unmanaged) portable hotspots.

There are some examples of private network FWA, such as a large caravan or trailer park with wireless access provided from a central point, and perhaps in future municipal or enterprise cellular networks giving fixed access to particular tenant structures on-site – for instance to hangars at an airport.


FWA today

Today, FWA is used for perhaps 8-9% of broadband connections globally, although this share varies significantly by definition, country and region. There are various use cases (see below), but generally FWA is deployed in areas without good fixed broadband options, or by mobile-only operators trying to add an additional fixed revenue stream where they have spare capacity.

Fixed wireless internet access fits specific sectors and uses, rather than the overall market


Source: STL Partners

FWA has traditionally been used in sparsely populated rural areas, where the economics of fixed broadband are untenable, especially in developing markets without existing fibre transport to towns and villages, or even copper in residential areas. Such networks have typically used unlicensed frequency bands, as there is limited interference – and little financial justification for expensive spectrum purchases. In most cases, such deployments use proprietary variants of Wi-Fi, or its ill-fated 2010-era sibling WiMAX.

Increasingly, however, FWA is being used in more urban settings and in more developed market scenarios – for example during the phase-out of older xDSL broadband, or in places with limited or no competition between fixed-network providers. Some cellular networks primarily intended for mobile broadband (MBB) have been used for fixed usage as well, especially where spare capacity has been available. 4G has already catalysed rapid growth of FWA in numerous markets, such as South Africa, Japan, Sri Lanka, Italy and the Philippines – and 5G is likely to make a further big difference in coming years. These deployments mostly rely on licensed spectrum, typically the national bands owned by major MNOs. In some cases, specific bands are used for FWA rather than being shared with normal mobile broadband. This allows appropriate “dimensioning” of network elements, and clearer cost-accounting for management.
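To illustrate what such dimensioning involves, here is a minimal back-of-envelope sketch in Python. The sector capacities and the per-home busy-hour demand are illustrative assumptions, not STL Partners figures.

```python
# Rough FWA dimensioning sketch: how many homes can one sector serve
# at a given busy-hour demand? All inputs are illustrative assumptions.

def homes_per_sector(sector_capacity_mbps: float,
                     busy_hour_mbps_per_home: float) -> int:
    """Homes supportable by one sector at the assumed busy-hour load."""
    return int(sector_capacity_mbps / busy_hour_mbps_per_home)

# A 4G sector with ~400Mbps of usable capacity, each home averaging
# 3Mbps in the busy hour:
print(homes_per_sector(400, 3))    # -> 133 homes
# A 5G mmWave sector with ~2Gbps of usable capacity:
print(homes_per_sector(2000, 3))   # -> 666 homes
```

In practice, an operator would also factor in statistical multiplexing, the peak speeds sold and coverage overlap, but this basic capacity-per-home trade-off drives whether a band is reserved for FWA.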

Historically, most FWA has required an external antenna and professional installation on each individual house, although it also gets deployed for multi-dwelling units (MDUs, i.e. apartment blocks) as well as some non-residential premises like shops and schools. More recently, self-installed indoor CPE with varying levels of price and sophistication has helped broaden the market, enabling customers to get terminals at retail stores or delivered direct to their home for immediate use.

Looking forward, the arrival of 5G mass-market equipment and larger swathes of mmWave and new mid-band spectrum – both licensed and unlicensed – is changing the landscape again, with the potential for fibre-rivalling speeds, sometimes at gigabit-grade.


Table of contents

  • Executive Summary
  • Introduction
    • FWA today
    • Universal broadband as a goal
    • What’s changed in recent years?
    • What’s changed because of the pandemic?
  • The FWA market and use cases
    • Niche or mainstream? National or local?
    • Targeting key applications / user groups
  • FWA technology evolution
    • A broad array of options
    • Wi-Fi, WiMAX and close relatives
    • Using a mobile-primary network for FWA
    • 4G and 5G for WISPs
    • Other FWA options
    • Customer premise equipment: indoor or outdoor?
    • Spectrum implications and options
  • The new FWA value chain
    • Can MNOs use FWA to enter the fixed broadband market?
    • Reinventing the WISPs
    • Other value chain participants
    • Is satellite a rival waiting in the wings?
  • Commercial models and packages
    • Typical pricing and packages
    • Example FWA operators and plans
  • STL’s FWA market forecasts
    • Quantitative market sizing and forecast
    • High level market forecast
  • Conclusions
    • What will 5G deliver – and when and where?
  • Index

Apple Glass: An iPhone moment for 5G?

Augmented reality supports many use cases across industries

Revisiting the themes explored in the AR/VR: Won’t move the 5G needle report STL Partners published in January 2018, this report explores whether augmented reality (AR) could become a catalyst for widespread adoption of 5G, as leading chip supplier Qualcomm and some telcos hope.

It considers how this technology is developing, its relationship with virtual reality (VR), and the implications for telcos trying to find compelling reasons for customers to use low latency 5G networks.

This report draws the following distinction between VR and AR:

  • Virtual reality: use of an enclosed headset for total immersion in a digital 3D environment.
  • Augmented reality: superimposition of digital graphics onto images of the real world via a camera viewfinder or a pair of glasses, or onto a screen fixed in the real world.

In other words, AR is used both indoors and outdoors and on a variety of devices. While Wi-Fi/fibre connectivity will be the preferred option in many scenarios, 5G will be required in locations lacking high-speed Wi-Fi coverage. Many AR applications rely on responsive connectivity to enable them to interact with the real world. To be compelling, animated images superimposed on those of the real world need to change in a way that is consistent with changes in the real world and changes in the viewing angle.
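To see why this responsiveness is demanding, the sketch below works through an illustrative latency budget in Python. A “motion-to-photon” delay of roughly 20 milliseconds is a commonly cited rule of thumb for convincing AR, but the individual component timings here are assumptions for illustration only.

```python
# Illustrative AR "motion-to-photon" budget: how much latency is left
# for the network once on-device steps are deducted? All timings are
# assumptions for illustration.

def network_budget_ms(total_ms: float = 20.0,
                      tracking_ms: float = 5.0,
                      render_ms: float = 7.0,
                      display_ms: float = 4.0) -> float:
    """Latency left for any network round trip within the total budget."""
    return total_ms - (tracking_ms + render_ms + display_ms)

print(f"Budget left for the network: {network_budget_ms():.0f} ms")
# -> 4 ms: any cloud-assisted processing must sit very close to the user.
```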

AR can be used to create innovative games, such as the 2016 phenomenon Pokemon Go, and educational and informational tools, such as travel guides that give you information about the monument you are looking at. At live sports events, spectators could use AR software to identify players, see how fast they are running, check their heart rates and call up their career statistics.

Note, an advanced form of AR is sometimes referred to as mixed reality or extended reality (XR). In this case, fully interactive digital 3D objects are superimposed on the real world, effectively mixing virtual objects and people with physical objects and people into a seamless interactive scene. For example, an advanced telepresence service could project a live hologram of the person you are talking to into the same room as you. This could be an avatar representing the person or, where the connectivity allows, a live 3D video stream of the actual person.

Widespread usage of AR services will be a hallmark of the Coordination Age, in the sense that they will bring valuable information to people as and when they need it. First responders, for example, could use smart glasses to help work their way through smoke inside a building, while police officers could be immediately fed information about the owner of a car registration plate. Office workers may use smart glasses to live stream a hologram of a colleague from the other side of the world or a 3D model of a new product or building.

In the home, both AR and VR could be used to generate new entertainment experiences, ranging from highly immersive games to live holograms of sports events or music concerts. Some people may even use these services as a form of escapism, virtually inhabiting alternative realities for several hours a day.

Given sufficient time to develop, STL Partners believes mixed-reality services will ultimately become widely adopted in the developed world. They will become a valuable aid to everyday living, providing the user with information about whatever they are looking at, either on a transparent screen on a pair of glasses or through a wireless earpiece. If you had a device that could give you notifications, such as an alert about a fast approaching car or a delay to your train, in your ear or eyeline, why wouldn’t you want to use it?

How different AR applications affect mobile networks

One of the key questions for the telecoms industry is how many of these applications will require very low latency, high-speed connectivity. The transmission of high-definition holographic images from one place to another in real time could place enormous demands on telecoms networks, opening up opportunities for telcos to earn additional revenues by providing dedicated/managed connectivity at a premium price. But many AR applications, such as displaying reviews of the restaurant a consumer is looking at, are unlikely to generate much data traffic. The figure below lists some potential AR use cases and indicates how demanding they will be to support.

Examples of AR use cases and the demands they make on connectivity


Source: STL Partners

Although telcos have always struggled to convince people to pay a premium for premium connectivity, some of the most advanced AR applications may be sufficiently compelling to bring about this kind of behavioural shift, just as people are prepared to pay more for a better seat at the theatre or in a sports stadium. This could be on a pay-as-you-go or a subscription basis.


The pioneers of augmented reality

Augmented reality (AR) is essentially a catch-all term for any application that seeks to overlay digital information and images on the real world. Applications of AR can range from a simple digital label to a live 3D holographic projection of a person or event.

AR really rose to prominence at the start of the last decade with the launch of smartphone apps, such as Layar, Junaio, and Wikitude, which gave you information about what you were looking at through the smartphone viewfinder. These apps drew on data from the handset’s GPS chip, its compass and, in some cases, image recognition software to try to figure out what was being displayed in the viewfinder. Although they attracted a lot of media attention, these apps were too clunky to break through into the mass market. However, the underlying concept persists – the reasonably popular Google Lens app enables people to identify a product, plant or animal they are looking at or translate a menu into their own language.

Perhaps the most high profile AR application to date is Niantic’s Pokemon Go, a smartphone game that superimposes cartoon monsters on images of the real world captured by the user’s smartphone camera. Pokemon Go generated $1 billion in revenue globally just seven months after its release in mid 2016, faster than any other mobile game, according to App Annie. It has also shown remarkable staying power. Four years later, in May 2020, Pokemon Go continued to be one of the top 10 grossing games worldwide, according to SensorTower.

In November 2017, Niantic, which has also had another major AR hit with sci-fi game Ingress, raised $200 million to boost its AR efforts. In 2019, it released another AR game based on the Harry Potter franchise.

Niantic is now looking to use its AR expertise to create a new kind of marketing platform. The idea is that brands will be able to post digital adverts and content in real-world locations, essentially creating digital billboards that are viewable to consumers using the Niantic platform. At the online AWE event in May 2020, Niantic executives claimed “AR gamification and location-based context” can help businesses increase their reach, boost user sentiment, and drive foot traffic to bricks-and-mortar stores. Niantic says it is working with major brands, such as AT&T, Simon Malls, Starbucks, McDonald’s, and Samsung, to develop AR marketing that “is non-intrusive, organic, and engaging.”

The sustained success of Pokemon Go has made an impression on the major Internet platforms. By 2018, the immediate focus of both Apple and Google had clearly shifted from VR to AR. Apple CEO Tim Cook has been particularly vocal about the potential of AR. And he continues to sing the praises of the technology in public.

In January 2020, for example, during a visit to Ireland, Cook described augmented reality as the “next big thing.” In an earnings call later that month, Cook added: “When you look at AR today, you would see that there are consumer applications, there are enterprise applications. … it’s going to pervade your life…, because it’s going to go across both business and your whole life. And I think these things will happen in parallel.”

Both Apple and Google have released AR developer tools, helping AR apps to proliferate in both Apple’s App Store and on Google Play.  One of the most popular early use cases for AR is to check how potential new furniture would look inside a living room or a bedroom. Furniture stores and home design companies, such as Ikea, Wayfair and Houzz, have launched their own AR apps using Apple’s ARKit. Once the app is familiar with its surroundings, it allows the user to overlay digital models of furniture anywhere in a room to see how it will fit. The technology can work in outdoor spaces as well.

In a similar vein, there are various AR apps, such as MeasureKit, that allow you to measure any object of your choosing. After the user picks a starting point with a screen tap, a straight line will measure the length until a second tap marks the end. MeasureKit also claims to be able to calculate trajectory distances of moving objects, angle degrees, the square footage of a three-dimensional cube and a person’s height.

Table of contents

  • Executive Summary
    • More mainstream models from late 2022
    • Implications and opportunities for telcos
  • Introduction
  • Progress and Immediate Prospects
    • The pioneers of augmented reality
    • Impact of the pandemic
    • Snap – seeing the world differently
    • Facebook – the keeper of the VR flame
    • Google – the leader in image recognition
    • Apple – patiently playing the long game
    • Microsoft – expensive offerings for the enterprise
    • Amazon – teaming up with telcos to enable AR/VR
    • Market forecasts being revised down
  • Telcos Get Active in AR
    • South Korea’s telcos keep trying
    • The global picture
  • What comes next?
    • Live 3D holograms of events
    • Enhancing live venues with holograms
    • 4K HD – Simple, but effective
  • Technical requirements
    • Extreme image processing
    • An array of sensors and cameras
    • Artificial intelligence plays a role
    • Bandwidth and latency
    • Costs: energy, weight and financial
  • Timelines for Better VR and AR
    • When might mass-market models become available?
    • Implications for telcos
    • Opportunities for telcos
  • Appendix: Societal Challenges
    • AR: Is it acceptable in a public place?
    • VR: health issues
    • VR and AR: moral and ethical challenges
    • AR and VR: What do consumers really want?
  • Index


Telco edge computing: How to partner with hyperscalers

Edge computing is getting real

Hyperscalers such as Amazon, Microsoft and Google are rapidly increasing their presence in the edge computing market by launching dedicated products, establishing partnerships with telcos on 5G edge infrastructure and embedding their platforms into operators’ infrastructure.

Many telecoms operators, who need cloud infrastructure and platform support to run their edge services, have welcomed the partnership opportunity. However, they have yet to develop clear strategies on how to use these partnerships to establish a stronger proposition in the edge market, move up the value chain and play a role beyond hosting infrastructure and delivering connectivity. Operators that miss out on the partnership opportunity, or fail to fully utilise it to develop and differentiate their capabilities and resources, risk being reduced to connectivity providers with a limited role in the edge market, or being late to the game.

Edge computing or multi-access edge computing (MEC) enables processing data closer to the end user or device (i.e. the source of data), on physical compute infrastructure that is positioned on the spectrum between the device and the internet or hyperscale cloud.

Telco edge computing is mainly defined as distributed compute managed by a telco operator. This includes running workloads on customer premises as well as at locations within the operator network. Caching and processing data closer to the customer allows both the operators and their customers to benefit from reduced backhaul traffic and costs. Depending on where the computing resources reside, edge computing can be broadly divided into:

  • Network edge which includes sites or points of presence (PoPs) owned by a telecoms operator such as base stations, central offices and other aggregation points on the access and/or core network.
  • On-premise edge where the computing resources reside at the customer side, e.g. in a gateway on-site, an on-premises data centre, etc. As a result, customers retain their sensitive data on-premise and enjoy other flexibility and elasticity benefits brought by edge computing.

Our overview on edge computing definitions, network structure, market opportunities and business models can be found in our previous report Telco Edge Computing: What’s the operator strategy?

The edge computing opportunity for operators and hyperscalers

Many operators are looking at edge computing as a good opportunity to leverage their existing assets and resources to innovate and move up the value chain. They aim to expand their services and revenue beyond connectivity and enter the platform and application space. By deploying computing resources at the network edge, operators can offer infrastructure-as-a-service and alternative applications and solutions for enterprises. Also, edge computing, as a distributed compute structure and an extension of the cloud, supports the operators’ own journey into virtualising the network and running internal operations more efficiently.

Cloud hyperscalers, especially the biggest three – Amazon Web Services (AWS), Microsoft Azure and Google – are at the forefront of the edge computing market. In recent years, they have made efforts to spread their influence outside of their public clouds and have moved the data acquisition point closer to physical devices. These include efforts to integrate their stack into IoT devices and network gateways, as well as to support private and hybrid cloud deployments. Recently, hyperscalers took another step closer to customers at the edge by launching platforms dedicated to telecoms networks and enabling integration with 5G networks. The latest of these products include Wavelength from AWS, Azure Edge Zones from Microsoft and Anthos for Telecom from Google Cloud. Details on these products are provided later in the report.


From competition to coopetition

Both hyperscalers and telcos are among the top contenders to lead the edge market. However, each stakeholder lacks a significant piece of the stack which the other has. This is the cloud platform for operators and the physical locations for hyperscalers. Initially, operators and hyperscalers were seen as competitors racing to enter the market through different approaches. This has resulted in the emergence of new types of stakeholders including independent mini data centre providers such as Vapor IO and EdgeConnex, and platform start-ups such as MobiledgeX and Ori Industries.

However, operators acknowledge that even if they do own the edge clouds, these still need to be supported by hyperscaler clouds to create a distributed cloud. To fuel the edge market and build its momentum, operators will, for the most part, work with the cloud providers. Partnerships between operators and hyperscalers are starting to take place and shape the market, impacting short- and long-term edge computing strategies for operators as well as hyperscalers and other players in the market.

Figure 1: Major telco-hyperscaler edge partnerships


Source: STL Partners analysis

What does it mean for telcos?

Going to market alone is not an attractive option for either operators or hyperscalers at the moment, given the high investment requirement without a guaranteed return. The partnerships between two of the biggest forces in the market will provide the necessary push for the use cases to be developed and enterprise adoption to be accelerated. However, as markets grow and change, so do the stakeholders’ strategies and relationships between them.

Since the emergence of cloud computing and the development of the digital technologies market, operators have faced tough competition from the internet players, including hyperscalers, who have managed to remain agile while building a sustained appetite for innovation and market disruption. Edge computing is no exception, and hyperscalers are moving rapidly to define and own the biggest share of the edge market.

Telcos that fail to develop a strategic approach to the edge could risk losing their share of the growing market as non-telco first movers continue to develop the technology and dictate the market dynamics. This report looks into what telcos should consider regarding their edge strategies and what roles they can play in the market while partnering with hyperscalers in edge computing.

Table of contents

  • Executive Summary
    • Operators’ roles along the edge computing value chain
    • Building a bigger ecosystem and pushing market adoption
    • How partnerships can shape the market
    • What next?
  • Introduction
    • The edge computing opportunity for operators and hyperscalers
    • From competition to coopetition
    • What does it mean for telcos?
  • Overview of the telco-hyperscalers partnerships
    • Explaining the major roles required to enable edge services
    • The hyperscaler-telco edge commercial model
  • Hyperscalers’ edge strategies
    • Overview of hyperscalers’ solutions and activities at the edge
    • Hyperscalers approach to edge sites and infrastructure acquisition
  • Operators’ edge strategies and their roles in the partnerships
    • Examples of operators’ edge computing activities
    • Telcos’ approach to integrating edge platforms
  • Conclusion
    • Infrastructure strategy
    • Platform strategy
    • Verticals and ecosystem building strategy


Telco edge computing: What is the operator strategy?


Edge computing can help telcos to move up the value chain

The edge computing market and the technologies enabling it are rapidly developing and attracting new players, providing new opportunities to enterprises and service providers. Telco operators are eyeing the market and looking to leverage the technology to move up the value chain and generate more revenue from their networks and services. Edge computing also represents an opportunity for telcos to extend their role beyond offering connectivity services and move into the platform and the application space.

However, operators will be faced with tough competition from other market players such as cloud providers, who are moving rapidly to define and own the biggest share of the edge market. Plus, industrial solution providers, such as Bosch and Siemens, are similarly investing in their own edge services. Telcos are also dealing with technical and business challenges as they venture into the new market, trying to position themselves and identify their strategies accordingly.

Telcos that fail to develop a strategic approach to the edge could risk losing their share of the growing market as non-telco first movers continue to develop the technology and dictate the market dynamics. This report looks into what telcos should consider regarding their edge strategies and what roles they can play in the market.

Following this introduction, we focus on:

  1. Edge terminology and structure, explaining common terms used within the edge computing context, where the edge resides, and the role of edge computing in 5G.
  2. An overview of the edge computing market, describing different types of stakeholders, current telecoms operators’ deployments and plans, competition from hyperscale cloud providers and the current investment and consolidation trends.
  3. Telcos’ challenges in addressing the edge opportunity: the technical, organisational and commercial challenges they face in this market.
  4. Potential use cases and business models for operators, also exploring possible scenarios of how the market is going to develop and operators’ likely positioning.
  5. A set of recommendations for operators that are building their strategy for the edge.


What is edge computing and where exactly is the edge?

Edge computing brings cloud services and capabilities including computing, storage and networking physically closer to the end-user by locating them on more widely distributed compute infrastructure, typically at smaller sites.

One could argue that edge computing has existed for some time – local infrastructure has been used for compute and storage, be it end-devices, gateways or on-premises data centres. However, edge computing, or edge cloud, refers to bringing the flexibility and openness of cloud-native infrastructure to that local infrastructure.

In contrast to hyperscale cloud computing, where all the data is sent to central locations to be processed and stored, edge computing processes data locally, aiming to reduce the time and bandwidth needed to send and receive data between applications and the cloud, which improves the performance of both the network and the applications. This does not mean that edge computing is an alternative to cloud computing. Rather, it is an evolutionary step that complements the current cloud computing infrastructure and offers more flexibility in executing and delivering applications.
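As a concrete illustration of the bandwidth saving, consider a video-analytics camera whose feed is either streamed raw to a central cloud or analysed at the edge, with only event metadata sent upstream. The sketch below uses assumed, illustrative figures.

```python
# Illustrative backhaul saving from processing at the edge rather than
# streaming raw data to a central cloud. Figures are assumptions.

def backhaul_saving(raw_stream_mbps: float, metadata_kbps: float) -> float:
    """Fraction of backhaul bandwidth avoided by edge processing."""
    return 1.0 - (metadata_kbps / 1000.0) / raw_stream_mbps

# A 6Mbps HD camera feed reduced to ~20kbps of detection metadata:
print(f"{backhaul_saving(6.0, 20.0):.1%} of backhaul traffic avoided")
# -> 99.7%
```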

Edge computing offers mobile operators several opportunities such as:

  • Differentiating service offerings using edge capabilities
  • Providing new applications and solutions using edge capabilities
  • Enabling customers and partners to leverage the distributed computing network in application development
  • Improving network performance and achieving efficiencies/cost savings

As edge computing technologies and definitions are still evolving, different terms are sometimes used interchangeably or have been associated with a certain type of stakeholder. For example, mobile edge computing is often used within the mobile network context and has evolved into multi-access edge computing (MEC) – adopted by the European Telecommunications Standards Institute (ETSI) – to include fixed and converged network edge computing scenarios. Fog computing is also often compared to edge computing; the former includes running intelligence on the end-device and is more IoT focused.

These are some of the key terms that need to be identified when discussing edge computing:

  • Network edge refers to edge compute locations that are at sites or points of presence (PoPs) owned by a telecoms operator, for example at a central office in the mobile network or at an ISP’s node.
  • Telco edge cloud is mainly defined as distributed compute managed by a telco. This includes running workloads on customer premises equipment (CPE) at customers’ sites as well as at locations within the operator network, such as base stations, central offices and other aggregation points on the access and/or core network. Caching and processing data closer to the customer allows both the operators and their customers to benefit from reduced backhaul traffic and costs.
  • On-premise edge computing refers to the computing resources that are residing at the customer side, e.g. in a gateway on-site, an on-premises data centre, etc. As a result, customers retain their sensitive data on-premise and enjoy other flexibility and elasticity benefits brought by edge computing.
  • Edge cloud is used to describe the virtualised infrastructure available at the edge. It creates a distributed version of the cloud with some flexibility and scalability at the edge. This flexibility gives it the capacity to handle sudden surges in workloads from unplanned activities, unlike static on-premise servers. Figure 1 shows the differences between these terms.

Figure 1: Edge computing types


Source: STL Partners

Network infrastructure and how the edge relates to 5G

Discussions on edge computing strategies and the market are often linked to 5G. Both technologies have the overlapping goals of improving performance and throughput and reducing latency for applications such as AR/VR, autonomous vehicles and IoT. 5G improves speed by increasing spectral efficiency, offering the potential of much higher speeds than 4G. Edge computing, on the other hand, reduces latency by shortening the time required for data processing, allocating resources closer to the application. When combined, edge and 5G can help to achieve round-trip latency below 10 milliseconds.
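The sketch below illustrates this arithmetic with assumed figures: moving the serving compute from a distant cloud region to a nearby edge site shrinks the fibre transport leg and brings the round trip below the 10 millisecond mark. The per-leg timings are illustrative assumptions, not measurements.

```python
# Illustrative round-trip time (RTT) for a 5G user reaching a central
# cloud versus a nearby edge site. Assumptions: ~2ms each way over the
# 5G radio, ~1ms per 100km of fibre (allowing for routing/equipment
# delays), and 2ms of server processing.

def rtt_ms(radio_ms: float, fibre_km: float, processing_ms: float,
           km_per_ms: float = 100.0) -> float:
    """Two radio and fibre legs (there and back) plus server processing."""
    return 2 * (radio_ms + fibre_km / km_per_ms) + processing_ms

print(rtt_ms(radio_ms=2, fibre_km=500, processing_ms=2))  # central cloud: 16.0ms
print(rtt_ms(radio_ms=2, fibre_km=30, processing_ms=2))   # edge site: 6.6ms
```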

While 5G deployment is yet to accelerate and reach ubiquitous coverage, the edge can be utilised in some places to reduce latency where needed. There are two reasons why the edge will be part of 5G:

  • First, it has been included in the 5G standards (3GPP Release 15) to enable ultra-low latency, which will not be achieved through improvements in the radio interface alone.
  • Second, operators are in general taking a slow and gradual approach to 5G deployment which means that 5G coverage alone will not provide a big incentive for developers to drive the application market. Edge can be used to fill the network gaps to stimulate the application market growth.

The network edge can be used for applications that need coverage (i.e. accessible anywhere) and can be moved across different edge locations to scale capacity up or down as required. Where an operator decides to establish an edge node depends on several factors (a simple screening sketch follows this list):

  • Application latency needs. Some applications, such as streaming virtual reality or mission-critical applications, will require locations close enough to their users to enable sub-50 millisecond latency.
  • Current network topology. Based on the operators’ network topology, there will be selected locations that can meet the edge latency requirements for the specific application under consideration in terms of the number of hops and the part of the network it resides in.
  • Virtualisation roadmap. The operator needs to consider its virtualisation roadmap and where data centre facilities are planned to be built to support future network functions.
  • Site and maintenance costs. The cloud computing economies of scale may diminish as the number of sites proliferates at the edge; for example, there is a significant difference between maintaining one or two large data centres and maintaining hundreds across the country.
  • Site availability. Some operators’ edge compute deployment plans assume the nodes reside in the same facilities as those which host their NFV infrastructure. However, many telcos are still in the process of renovating these locations to turn them into (mini) data centres so aren’t yet ready.
  • Site ownership. Sometimes the preferred edge location is within sites that the operators have limited control over, whether that is in the customer premise or within the network. For example, in the US, the cell towers are owned by tower operators such as Crown Castle, American Tower and SBA Communications.

The potential locations for edge nodes can be mapped across the mobile network in four levels as shown in Figure 2.

Figure 2: Possible locations for edge computing


Source: STL Partners

Table of Contents

  • Executive Summary
    • Recommendations for telco operators at the edge
    • Four key use cases for operators
    • Edge computing players are tackling market fragmentation with strategic partnerships
    • What next?
  • Table of Figures
  • Introduction
  • Definitions of edge computing terms and key components
    • What is edge computing and where exactly is the edge?
    • Network infrastructure and how the edge relates to 5G
  • Market overview and opportunities
    • The value chain and the types of stakeholders
    • Hyperscale cloud provider activities at the edge
    • Telco initiatives, pilots and plans
    • Investment and merger and acquisition trends in edge computing
  • Use cases and business models for telcos
    • Telco edge computing use cases
    • Vertical opportunities
    • Roles and business models for telcos
  • Telcos’ challenges at the edge
  • Scenarios for network edge infrastructure development
  • Recommendation
  • Index


Cloud gaming: What is the telco play?


Drivers for cloud gaming services

Although many people still think of PlayStation and Xbox when they think about gaming, the console market represents only a third of the global games market. From its arcade and console-based beginnings, the gaming industry has come a long way. Over the past 20 years, one of the most significant market trends has been the growth of casual gaming. Whereas hardcore gamers are passionate about frequent play and will pay more to play premium games, casual gamers play to pass the time. With the rapid adoption of smartphones capable of supporting gaming applications over the past decade, the population of casual and occasional gamers has risen dramatically.

This trend has seen the advent of free-to-play business models for games, further expanding the industry’s reach. In our earlier report, STL estimated that 45% of the population in the U.S. are either casual gamers (between 2 and 5 hours a week) or occasional gamers (up to 2 hours a week). By contrast, we estimated that hardcore gamers (more than 15 hours a week) make up 5% of the U.S. population, while regular players (5 to 15 hours a week) account for a further 15% of the population.
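
To put those shares in context, the simple arithmetic below applies them to an assumed U.S. population of roughly 330 million; the population figure is our assumption for illustration, not a number from the report.

```python
# Segment shares from the text applied to an assumed population base.
US_POPULATION_M = 330  # assumption for illustration

segments = {
    "hardcore (>15 h/week)":         0.05,
    "regular (5-15 h/week)":         0.15,
    "casual/occasional (<5 h/week)": 0.45,
}

for name, share in segments.items():
    print(f"{name}: ~{share * US_POPULATION_M:.0f} million people")
```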

The expansion in the number of players is driving interest in ‘cloud gaming’. Instead of games running on a console or PC, cloud gaming involves streaming games onto a device from remote servers: the actual game is stored and run on remote compute infrastructure, with the results live-streamed to the player’s device. This has the important advantage of eliminating the need for players to purchase dedicated gaming hardware; instead, the quality of the internet connection becomes the most important contributor to the gaming experience. While this type of gaming is still in its infancy, and faces a number of challenges, many companies are now entering the cloud gaming fold in an effort to capitalise on the new opportunity.
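
Conceptually, a cloud gaming server runs a loop of the following shape: poll the player’s input, advance the game, render a frame remotely, then encode and stream it down. The sketch below uses stub classes in place of a real engine, encoder and network link, so all names are invented for illustration.

```python
# Conceptual cloud gaming loop with stubs standing in for real components.
import time

FRAME_BUDGET_S = 1 / 60  # 60 fps target

class StubGame:
    def step(self, inputs): pass               # game logic runs in the cloud
    def render(self) -> bytes: return b"raw"   # stand-in for a GPU-rendered frame

class StubClient:
    def __init__(self, frames: int): self.frames = frames
    def connected(self) -> bool:
        self.frames -= 1
        return self.frames >= 0
    def poll_input(self) -> dict: return {}    # controller state sent upstream
    def send(self, data: bytes): pass          # compressed video goes downstream

def encode(frame: bytes) -> bytes:
    return frame                               # stand-in for H.264/VP9 encoding

def serve_session(game, client):
    """Server side: run the game remotely and stream results to the player."""
    while client.connected():
        t0 = time.perf_counter()
        game.step(client.poll_input())
        client.send(encode(game.render()))
        # Sleep off the remaining frame budget to hold a steady frame rate
        time.sleep(max(0.0, FRAME_BUDGET_S - (time.perf_counter() - t0)))

serve_session(StubGame(), StubClient(frames=120))  # ~2 seconds at 60 fps
```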

5G can support cloud gaming traffic growth

Cloud gaming requires not just high bandwidth and low latency, but also a stable connection with consistently low latency (i.e. minimal jitter). In theory, 5G promises to deliver stable ultra-low latency. In practice, an enormous amount of infrastructure investment will be required to enable a fully loaded 5G network to perform as well as end-to-end fibre. 5G networks operating in the lower frequency bands would likely buckle under the load if many gamers in a cell each needed a continuous 25 Mbps stream. While 5G in millimetre-wave spectrum would have more capacity, it would require small cells and other mechanisms to ensure indoor penetration, given that the spectrum is short-range and can be blocked by obstacles such as walls.
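
A back-of-envelope calculation illustrates why. The channel width and average spectral efficiency below are our assumptions for a mid-band 5G cell, not measured values:

```python
# Rough capacity check: how many continuous 25 Mbps game streams fit in a cell?
STREAM_MBPS = 25       # per-gamer stream from the text
CHANNEL_MHZ = 40       # assumed mid-band 5G channel width
SPECTRAL_EFF = 4.0     # assumed average spectral efficiency, bit/s/Hz

cell_capacity_mbps = CHANNEL_MHZ * SPECTRAL_EFF   # ~160 Mbps shared by all users
streams = int(cell_capacity_mbps // STREAM_MBPS)
print(f"~{cell_capacity_mbps:.0f} Mbps per cell -> only ~{streams} concurrent gamers")
```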


A complicated ecosystem

As explained in our earlier report, Cloud gaming: New opportunities for telcos?, the cloud gaming ecosystem is beginning to take shape. This is being accelerated by the growing availability of fibre and high-speed broadband, which is now being augmented by 5G and, in some cases, edge data centres. Early movers in cloud gaming offer a range of services, from gaming rigs to game development platforms and cloud computing infrastructure, or an amalgamation of these.

One of the main attractions of cloud gaming is the potential hardware savings for gamers. High-end PC gaming can be an extremely expensive hobby: gaming PCs range from £500 for the very cheapest to over £5,000 at the very top end. They also require frequent hardware upgrades to meet the increasing processing demands of new gaming titles. With cloud gaming, gamers can access the latest graphics processing units at a much lower cost.

By some estimates, cloud gaming could deliver a high-end gaming environment at a quarter of the cost of a traditional console-based approach, as it would eliminate the need for retailing, packaging and delivering hardware and software to consumers, while also tapping the economies of scale inherent in the cloud. However, in STL Partners’ view that is a best-case scenario and a 50% reduction in costs is probably more realistic.

STL Partners believes adoption of cloud gaming will be gradual and piecemeal for the next few years, as console gamers work their way through another generation of consoles and casual gamers are reluctant to commit to a monthly subscription. However, from 2022, adoption is likely to grow rapidly as cloud gaming propositions improve.

At this stage, it is not yet clear who will dominate the value chain, if anyone. Will the “hyperscalers” be successful in creating a ‘Netflix’ for games? Google is certainly trying to do this with its Stadia platform, which has yet to gain any real traction, due to both its limited games library and its perceived technological immaturity. The established players in the games industry, such as EA, Microsoft (Xbox) and Sony (PlayStation), have launched cloud gaming offerings, or are, at least, in the process of doing so. Some telcos, such as Deutsche Telekom and Sunrise, are developing their own cloud gaming services, while SK Telecom is partnering with Microsoft.

What telcos can learn from Shadow’s cloud gaming proposition

The rest of this report explores the business models being pursued by cloud gaming providers. Specifically, it looks at cloud gaming company Shadow and how it fits into the wider ecosystem, before evaluating how its distinct approach compares with that of the major players in online entertainment, such as Sony and Google. The second half of the report considers the implications for telcos.

Table of Contents

  • Executive Summary
  • Introduction
  • Cloud gaming: a complicated ecosystem
    • The battle of the business models
    • The economics of cloud gaming and pricing models
    • Content offering will trump price
    • Cloud gaming is well positioned for casual gamers
    • The future cloud gaming landscape
  • 5G and fixed wireless
  • The role of edge computing
  • How and where can telcos add value?
  • Conclusions


Cloud gaming: New opportunities for telcos?

Gaming is following video to the cloud

Cloud gaming services enable consumers to play video games using any device with a screen and an Internet connection – the software and hardware required to play the game are all hosted on remote cloud services. Some reviewers say connectivity and cloud technologies have now advanced to a point where cloud gaming can begin to rival the experience offered by leading consoles, such as Microsoft’s Xbox and Sony’s PlayStation, while delivering greater interactivity and flexibility than gaming that relies on local hardware. Google believes it is now feasible to move gaming completely into the cloud – it has just launched its Stadia cloud gaming service. Although Microsoft is sounding a more cautious note, it is gearing up to launch a rival cloud gaming proposition called xCloud.

This report explores cloud gaming and models the size of the potential market, including the scale of the opportunity for telcos. It also considers the potential ramifications for telecoms networks. If Stadia, xCloud and other cloud gaming services take off, consumer demand for high-bandwidth, low latency connectivity could soar. At the same time, cloud gaming could also provide a key test of the business rationale for edge computing, which involves the deployment of compute power and data storage closer to the end users of digital content and applications. This allows the associated data to be processed, analysed and acted on locally, instead of being transmitted long distances to be processed at central data centres.

This report then goes on to outline the rollout of cloud gaming services by various telcos, including Deutsche Telekom in Germany and Sunrise in Switzerland, while also considering Apple’s strategy in this space. Finally, the conclusions section summarises how telcos around the world should be preparing for mass-market cloud gaming.

This report builds on previous executive briefings published by STL Partners.


What is cloud gaming?

Up to now, keen gamers have generally bought a dedicated console, such as a Microsoft Xbox or Sony PlayStation, or a high-end computer, to play technically complex and graphically rich games. They also typically buy a physical copy of the game (a DVD), which they install on their console or play via an optical disc drive attached to their PC. Alternatively, some platforms, such as Steam, allow gamers to download games from a marketplace.

Cloud gaming changes that paradigm by running the games on remote hardware in the cloud, with the video and audio then streamed to the consumer’s device, which could be a smartphone, a connected TV, a low-end PC or a tablet. The player would typically connect this device to a dedicated handheld controller, similar to one that they would use with an Xbox or a PlayStation.

There is also a half-way house between full cloud gaming and console gaming. This “lite” form of cloud gaming is sometimes known as “command streaming”. In this case, the game logic and graphics commands are processed in the cloud, but the graphics rendering happens locally on the device. This approach lowers the amount of bandwidth required (sending commands requires less bandwidth than sending video) and is less demanding from a latency perspective (no encoding/decoding of the video stream). But the quality of graphics will be limited by the capabilities of the graphics processing unit on the end-user’s device. For keen players who want to play graphically rich games, command streaming wouldn’t necessarily eliminate the need to buy a console or a powerful PC.
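
The bandwidth difference can be sketched with rough numbers. Both bitrates below are illustrative assumptions, since real figures vary widely by game, resolution and codec:

```python
# Illustrative data usage: full video streaming vs. command streaming.
HOURS = 2.0
for label, mbps in [("full cloud gaming (video stream)", 20.0),   # assumed 1080p60
                    ("command streaming (draw commands)", 3.0)]:  # assumed
    gigabytes = mbps * 3600 * HOURS / 8 / 1000  # Mbit/s -> GB over the session
    print(f"{label}: ~{gigabytes:.1f} GB over {HOURS:.0f} hours")
```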

As well as relocating and rejigging the computing permutations, cloud gaming opens up new business models. Rather than buying individual games, for example, the consumer could pay for a Netflix-style subscription service that would enable them to play a wide range of online video games, without having to download them. Alternatively, cloud gaming services could use a pay-as-you-go model, simply charging consumers by the minute or hour.

Today, these cloud gaming subscriptions can be relatively expensive. For example, Shadow, an existing cloud gaming service, charges US$35 a month in the U.S., £32 a month in the U.K. and €40 a month in France and Germany (with significant discounts if the subscriber commits to 12 months). Shadow can support graphics resolutions of 4K at 60 frames per second and conventional HD at 144 frames per second, which is superior to a typical console specification. It requires an Internet connection of at least 15 Mbps. Shadow is compatible with Windows 7/8/10, macOS, Android, Linux (beta) and iOS (beta), and comes with a Windows 10 license, which can be used for other PC applications.

At those prices, Shadow is a niche offering. But Google is now looking to take cloud gaming mainstream by setting subscription charges at around US$10 a month – comparable to a Spotify or Netflix subscription, although the user will have to pay additional fees to buy most games. Google says its new Stadia cloud gaming service is accessible from any device that can run YouTube in HD at 30/60 frames per second (fps), as long as it has a fast enough connection (15–25Mbps). The consumer then uses a dedicated controller that can connect directly to their Wi-Fi, bypassing the device with the screen. All the processing is done in Google’s cloud, which then sends a YouTube video-stream to the device: the URL pinpoints which clip of the gameplay to request and receive.
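
Stadia’s connection requirement also translates into substantial data usage, which matters for subscribers on capped broadband or mobile plans. The hours-of-play figure below is our assumption for illustration:

```python
# Data usage implied by Stadia's 15-25 Mbps requirement.
HOURS_PER_MONTH = 30  # assumed: roughly an hour a day

for mbps in (15, 25):
    gb_per_hour = mbps * 3600 / 8 / 1000
    print(f"{mbps} Mbps -> {gb_per_hour:.2f} GB/hour, "
          f"~{gb_per_hour * HOURS_PER_MONTH:.0f} GB at {HOURS_PER_MONTH} h/month")
```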

In other words, Stadia will treat games as personalised YouTube video clips/web-pages that a player or viewer can interact with in real time. As a result, the gamer can share that stream easily with friends by sending them the URL. With permission from the gamer, the friend could then jump straight into the gameplay using their own device.


Table of Contents

  • Executive Summary
  • Introduction
  • What is cloud gaming?
    • Why consumers will embrace cloud gaming
  • Ramifications for telecoms networks
    • Big demands on bandwidth
    • Latency
    • Edge computing
    • The network architecture underpinning Google Stadia
  • How large is the potential market?
    • Modelling the U.S. cloud gaming market
    • New business models
  • Telcos’ cloud gaming activities
    • Microsoft hedges its bets
    • Apple takes a different tack
  • Conclusions
    • Telcos without their own online entertainment offering
    • Telcos with their own online entertainment offering


The Industrial IoT: What’s the telco opportunity?

The Industrial IoT is a confusing world

This report is the final report in a mini-series about the Internet for Things (I4T), which we see as the next stage of evolution from today’s IoT.

The first report, The IoT is dead: Long live the Internet for Things, outlines why today’s IoT infrastructure is insufficient for meeting businesses’ needs. The main problem with today’s IoT is that every company’s data is locked in its own silo, and one company’s solutions are likely deployed on a different platform than their partners’. So companies can optimise their internal operations, but have limited scope to use IoT to optimise operations involving multiple organisations.

The second report, Digital twins: A catalyst of disruption in the Coordination Age, provides an overview of what a digital twin is, and how they can play a role in overcoming the limitations of today’s IoT industry.

This report looks more closely at the state of development of enterprise and industrial IoT and the leading players in today’s IoT industry, which we believe is a crucial driver of the Coordination Age. In the Coordination Age, we believe the crucial socio-economic need in the world – and therefore the biggest business opportunity – is to make better use of our resources, whether that is time, money, or raw materials. Given the number of people employed in and resources going through industrial processes, figuring out what’s needed to make the industrial IoT reach its full potential is a big part of making this possible.

Three types of IoT

IoT applications can be divided into three broad types. As described by IoT expert Stacey Higginbotham, each group has distinct needs and priorities based on its main purpose:

  1. Consumer IoT: A connected device, with an interactive app, that provides an additional service to the end user compared with an unconnected version of the device. The additional service is enabled by the insights and data gathered from the device. The key priority for consumer devices is low price point and ease of installation, given most users’ lack of technical expertise.
  2. Enterprise IoT: This includes all the devices and sensors that enterprises are connecting to the internet, e.g. enterprise mobility and fleet tracking. Since every device connected to an enterprise network is a potential point of vulnerability, the primary concern of enterprise IoT is security and device management. This is achieved through documentation of devices on enterprise networks, prioritisation of devices and traffic across multiple types of networks, e.g. depending on speed and security requirements, and access rights controls, to track who is sharing data with whom and when.
  3. Industrial IoT: This field is born out of industrial protocols such as SCADA, which do not currently connect to the internet but rather to an internal control and monitoring system for manufacturing equipment. More recently, enterprises have enhanced these systems with a host of devices connected to IP networks through Wi-Fi or other technologies, and linked legacy monitoring systems to gateways that feed operational data into more user-friendly, cloud-based monitoring and analytics solutions. At this point, the lines between Industrial IoT and Enterprise IoT blur. When the cloud-based systems have the ability to control connected equipment, for instance through firmware updates, security to prevent malicious or unintended risks is paramount. The primary goals in IIoT remain to control and monitor, in order to improve operational efficiency and safety, although with rising security needs.

The Internet for Things (I4T) is in large part about bridging the divide between Enterprise and Industrial IoT. The idea is to be able to share highly sensitive industrial information, such as a change in operational status that will affect a supply chain, or a fault in public infrastructure like roads, rail or electricity grid, that will affect surroundings and require repairs. This requires new solutions that can coordinate and track the movement of Industrial IoT data into Enterprise IoT insights and actions.
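
As a sketch of what such coordination might look like at the message level, the example below publishes an operational-status event that a partner’s enterprise systems could subscribe to. It uses the real paho-mqtt client library, but the broker address, topic scheme and payload fields are invented for illustration; a production deployment would add TLS and authentication.

```python
# Sketch: one factory publishes a status change relevant to its supply chain.
import json
from paho.mqtt import publish  # real library: pip install paho-mqtt

event = {
    "asset_id": "press-line-7",
    "event": "unplanned_stop",                  # status change affecting partners
    "expected_restart": "2021-06-01T14:00:00Z",
}

publish.single(
    topic="acme-manufacturing/press-line-7/status",  # invented topic scheme
    payload=json.dumps(event),
    qos=1,                                           # at-least-once delivery
    hostname="broker.example-i4t.net",               # placeholder shared broker
)
```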

Understandably, enterprises are wary of opening any vulnerabilities in their operations through deeper or broader connections, so finding a secure way to bring about the I4T is the primary concern.


The proliferation of IoT platforms

Almost every major player in the ICT world is pitching for a role in both Enterprise and Industrial IoT. Most large-scale manufacturers and telecoms operators are also trying to carve out a role in the IoT industry.

By and large, these players have developed specific IoT solutions linked to their core businesses, and then expanded by developing some kind of “IoT platform” that brings together a broader range of capabilities across the IT stack necessary to provide end-to-end IoT solutions.

The result is a hugely complex industry with many overlapping and competing “platforms”. Because they all do something different, the term “platform” is often unhelpful in understanding what a company provides.

A company’s “IoT platform” might comprise any combination of these four layers of the IoT stack, all of which are key components of an end-to-end solution (a toy illustration follows the list below):

  1. Hardware: This is the IoT device or sensor that is used to collect and transmit data. Larger devices may also have inbuilt compute power enabling them to run local analysis on the data collected, in order to curate which data need to be sent to a central repository or other devices.
  2. Connectivity: This is the means by which data is transmitted, ranging from local connectivity (Bluetooth, Wi-Fi) to low-power wide-area networks over unlicensed spectrum (Sigfox, LoRa) and cellular (NB-IoT, LTE-M, LTE).
  3. IoT service enablement: This is the most nebulous category, because it includes anything that sits as middleware in between connectivity and the end application. The main types of enabling functions are:
    • Cloud compute capacity for storing and analysing data
    • Data management: aggregating, structuring and standardising data from multiple different sources. There are sub-categories within this geared towards specific end applications, such as product or service lifecycle management tools.
    • Device management: device onboarding, monitoring, software updates, and security. Software and security management are often broken out as separate enablement solutions.
    • Connectivity management: orchestrating IoT devices over a variety of networks
    • Data / device visualisation: This includes graphical interfaces for presenting complex data sets and insights, and 3D modelling tools for industrial equipment.
  4. Applications: These leverage tools in the IoT enablement layer to deliver specific insights or trigger actions that deliver a specific outcome to end users, such as predictive maintenance or fleet management. Applications are usually tailored to the specific needs of end users and rarely scale well across multiple industries.
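
As a toy illustration of how the four layers compose into an end-to-end solution, the sketch below passes a sensor reading up the stack to a simple predictive-maintenance application; all class and method names are invented for the example.

```python
# Toy end-to-end flow across the four layers of the IoT stack described above.
class Device:                        # 1. Hardware
    def read(self) -> float:
        return 71.3                  # e.g. a bearing temperature in Celsius

class Connectivity:                  # 2. Connectivity (NB-IoT, LoRa, Wi-Fi, ...)
    def transmit(self, value: float) -> float:
        return value                 # stand-in for the network hop

class Enablement:                    # 3. Service enablement (middleware)
    def __init__(self):
        self.history: list[float] = []
    def ingest(self, value: float):
        self.history.append(value)   # data management: store and structure

class MaintenanceApp:                # 4. Application
    THRESHOLD = 70.0                 # assumed alert threshold
    def evaluate(self, middleware: Enablement) -> str:
        return "schedule inspection" if middleware.history[-1] > self.THRESHOLD else "ok"

device, network, middleware, app = Device(), Connectivity(), Enablement(), MaintenanceApp()
middleware.ingest(network.transmit(device.read()))
print(app.evaluate(middleware))      # -> "schedule inspection"
```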

Most “IoT platforms” combine at least two layers across this IoT stack


Source: STL Partners

There are two key reasons why platforms offering end-to-end services have dominated the early development of the IoT industry:

  • Enterprises’ most immediate needs have been to have greater visibility into their own operations and make them more efficient. This means IoT initiatives have been driven primarily by business owners, rather than technology teams, who often don’t have the skills to piece together multiple different components by themselves.
  • Although the IoT as a whole is a big business, each individual component of an end-to-end solution is a relatively small business. So companies providing IoT solutions – including telcos – have attempted to capture a larger share of the value chain in order to make it a better business.

Making sense of the confusion

It is a daunting task to work out how to bring IoT into play in any organisation. It requires a thorough re-think of how a business operates, for a start, and then tinkering with (or transforming) its core systems and processes, depending on the approach taken.

That’s tricky enough even without the burgeoning market of self-proclaimed “leaders of industrial IoT” and technology players’ “IoT platforms”.

This report does not attempt to answer “what is the best way / platform” for different IoT implementations. There are many other resources available that attempt to offer comparisons to help guide users through the task of picking the right tools for the job.

The objective here is to gain a sense of what is real today, and where the opportunities and gaps are, in order to help telecoms operators and their partners understand how they can help enterprises move beyond the IoT, into the I4T.

 

Table of Contents

  • Executive Summary
  • Introduction
    • Three types of IoT
    • The proliferation of IoT platforms
    • Making sense of the confusion
  • The state of the IoT industry
    • In the beginning, there was SCADA
    • Then there were specialised industrial automation systems
    • IoT providers are learning about evolving customer needs
  • Overview of IoT solution providers
    • Generalist scaled IT players
    • The Internet players (Amazon, Google and Microsoft)
    • Large-scale manufacturers
    • Transformation / IoT specialists
    • Big telco vendors
    • Telecoms operators
    • Other connectivity-led players
  • Conclusions and recommendations
    • A buyers’ eye view: Too much choice, not enough agility
    • How telcos can help – and succeed over the long term in IoT


B2B growth: How can telcos win in ICT?

Introduction

The telecom industry’s growth profile over the last few years is a sobering sight. As we have shown in our recent report Which operator growth strategies will remain viable in 2017 and beyond?, yearly revenue growth rates have been clearly slowing down globally since 2009 (see Figure 1). In three major regions (North America, Europe, Middle East) compound annual growth rates have even been behind GDP growth.

 

Figure 1: Telcos’ growth performance is flattening out (Sample of sixty-eight operators)

Source: Company accounts; STL Partners analysis

To break out of this decline, telcos are constantly searching for new sources of revenue, for example by expanding into adjacent digital service areas, which largely sit within mass consumer markets (e.g. content, advertising, commerce).

However, in our ongoing conversations with telecoms operators, we increasingly come across the notion that a large part of future growth potential might actually lie in B2B (business-to-business) markets, and that this customer segment will have an increasing impact on overall revenue growth.

This report investigates the rationale behind this thinking in detail and tries to answer the following key questions:

  1. What is the current state of telcos’ B2B business?
  2. Where are the telco growth opportunities in the wider enterprise ICT arena?
  3. What makes an enterprise ICT growth strategy difficult for telcos to execute?
  4. What are the pillars of a successful strategy for future B2B growth?

 

Table of Contents

  • Executive Summary
  • Introduction
  • Telcos may have different B2B strategies, but suffer similar problems
  • Finding growth opportunities within the wider enterprise ICT arena could help
  • Three complications for revenue growth in enterprise ICT
  • Complication 1: Despite their potential, telcos struggle to marshal their capabilities effectively
  • Complication 2: Telcos are not alone in targeting enterprise ICT for growth
  • Complication 3: Telcos’ core services are being disrupted by OTT players – this time in B2B
  • STL Partners’ recommendations: strategic pillars for future B2B growth
  • Conclusion

 

Table of Figures

  • Figure 1: Telcos’ growth performance is flattening out (Sample of sixty-eight operators)
  • Figure 2: Telcos’ B2B businesses vary significantly by scale and performance (selected operators)
  • Figure 3: High-level structure of the telecom industry’s revenue pool (2015) – the consumer segment dominates
  • Figure 4: Orange aims to expand the share of “IT & integration services” in OBS’s revenue mix
  • Figure 5: Global enterprise ICT expenditures are projected to grow 7% p.a.
  • Figure 6: Telcos and Microsoft are moving in opposite directions
  • Figure 7: SD-WAN value chain
  • Figure 8: Within AT&T Business Solutions’ revenue mix, growth in fixed strategic services cannot yet offset the decline in legacy services

The Open Source Telco: Taking Control of Destiny

Preface

This report examines the approaches to open source software – broadly, software for which the source code is freely available for use, subject to certain licensing conditions – of telecoms operators globally. Several factors have come together in recent years to make the role of open source software an important and dynamic area of debate for operators, including:

  • Technological Progress: Advances in core networking technologies, especially network functions virtualisation (NFV) and software-defined networking (SDN), are closely associated with open source software and initiatives, such as OPNFV and OpenDaylight. Many operators are actively participating in these initiatives, as well as trialling their software and, in some cases, moving them into production. This represents a fundamental shift away from the industry’s traditional, proprietary, vendor-procured model.
    • Why are we now seeing more open source activities around core communications technologies?
  • Financial Pressure: However, over-the-top (OTT) disintermediation, regulation and adverse macroeconomic conditions have led to reduced core communications revenues for operators in developed and emerging markets alike. As a result, operators are exploring opportunities to move beyond their core infrastructure business and compete in the more software-centric services layer.
    • How do the Internet players use open source software, and what are the lessons for operators?
  • The Need for Agility: In general, there is recognition within the telecoms industry that operators need to become more ‘agile’ if they are to succeed in the new, rapidly-changing ICT world, and greater use of open source software is seen by many as a key enabler of this transformation.
    • How can the use of open source software increase operator agility?

The answers to these questions, and more, are the topic of this report, which is sponsored by Dialogic and independently produced by STL Partners. The report draws on a series of 21 interviews conducted by STL Partners with senior technologists, strategists and product managers from telecoms operators globally.

Figure 1: Split of Interviewees by Business Area

Source: STL Partners

Introduction

Open source is less optional than it once was – even for Apple and Microsoft

From the audience’s point of view, the most important announcement at Apple’s Worldwide Developer Conference (WWDC) this year was not the new versions of iOS and OS X, or even its Spotify-challenging Apple Music service. Instead, it was the announcement that Apple’s highly popular programming language ‘Swift’ was to be made open source, where open source software is broadly defined as software for which the source code is freely available for use – subject to certain licensing conditions.

On one level, therefore, this represents a clever engagement strategy with developers. Open source software uptake has increased rapidly during the last 15 years, most famously embodied by the Linux operating system (OS), and with this developers have demonstrated a growing preference for open source tools and platforms. Since Apple has generally pushed developers towards proprietary development tools, and away from third-party ones (such as Adobe Flash), this is significant in itself.

An indication of open source’s growth can be found in OS market shares for consumer electronics devices. As Figure 2 below shows, Android (open source) had a 49% share of shipments in 2014; if we include the various other open source OSes in ‘other’, this increases to more than 50%.

Figure 2: Share of consumer electronics shipments* by OS, 2014

Source: Gartner
* Includes smartphones, tablets, laptops and desktop PCs

However, one of the components being open sourced is Swift’s (proprietary) compiler – a program that translates written code into an executable program that a computer system understands. The implication of this is that, in theory, we could even see Swift applications running on non-Apple devices in the future. In other words, Apple believes the risk of Swift being used on Android is outweighed by the reward of engaging with the developer community through open source.

Whilst some technology companies, especially the likes of Facebook, Google and Netflix, are well known for their activities in open source, Apple is a company famous for its proprietary approach to both hardware and software. This, combined with similar activities by Microsoft (which open sourced its .NET framework in 2014), suggests that open source is now less optional than it once was.

Open source is both an old and a new concept for operators

At first glance, open source also appears to now be less optional for telecoms operators, who traditionally procure proprietary software (and hardware) from third-party vendors. Whilst many (but not all) operators have been using open source software for some time, such as Linux and various open source databases in the IT domain (e.g. MySQL), we have in the last 2-3 years seen a step-change in operator interest in open source across multiple domains. The following quote, taken directly from the interviews, summarises the situation nicely:

“Open source is both an old and a new project for many operators: old in the sense that we have been using Linux, FreeBSD, and others for a number of years; new in the sense that open source is moving out of the IT domain and towards new areas of the industry.” 

AT&T, for example, has been speaking widely about its ‘Domain 2.0’ programme. Domain 2.0 aims to transform AT&T’s technical infrastructure to incorporate network functions virtualisation (NFV) and software-defined networking (SDN), to mandate a higher degree of interoperability, and to broaden the range of alternative suppliers available across its core business. By 2020, AT&T hopes to virtualise 75% of its network functions, and it sees open source as accounting for up to 50% of this. AT&T, like many other operators, is also a member of various recently-formed initiatives and foundations around NFV and SDN, such as OPNFV – Figure 3 lists some below.

Figure 3: OPNFV Platinum Members

Source: OPNFV website

However, based on publicly-available information, other operators might appear to have lesser ambitions in this space. As ever, the situation is more complex than it first appears: other operators do have significant ambitions in open source and, despite the headlines NFV and SDN draw, there are many other business areas in which open source is playing (or will play) an important role. Figure 4 below includes three quotes from the interviews which highlight this broad spectrum of opinion:

Figure 4: Different attitudes of operators to open source – selected interview quotes

Source: STL Partners interviews

Key Questions to be Addressed

We therefore have many questions which need to be addressed concerning operator attitudes to open source software, adoption (by area of business), and more:

  1. What is open source software, what are its major initiatives, and who uses it most widely today?
  2. What are the most important advantages and disadvantages of open source software? 
  3. To what extent are telecoms operators using open source software today? Why, and where?
  4. What are the key barriers to operator adoption of open source software?
  5. Prospects: How will this situation change?

These are now addressed in turn.

Table of Contents

  • Preface
  • Executive Summary
  • Introduction
  • Open source is less optional than it once was – even for Apple and Microsoft
  • Open source is both an old and a new concept for operators
  • Key Questions to be Addressed
  • Understanding Open Source Software
  • The Theory: Freely available, licensed source code
  • The Industry: Dominated by key initiatives and contributors
  • Research Findings: Evaluating Open Source
  • Open source has both advantages and disadvantages
  • Debunking Myths: Open source’s performance and security
  • Where are telcos using open source today?
  • Transformation of telcos’ service portfolios is making open source more relevant than ever…
  • … and three key factors determine where operators are using open source software today
  • Open Source Adoption: Business Critical vs. Service Area
  • Barriers to Telco Adoption of Open Source
  • Two ‘external’ barriers by the industry’s nature
  • Three ‘internal’ barriers which can (and must) change
  • Prospects and Recommendations
  • Prospects: An open source evolution, not revolution
  • Open Source, Transformation, and Six Key Recommendations
  • About STL Partners and Telco 2.0
  • About Dialogic

 

Table of Figures

  • Figure 1: Split of Interviewees by Business Area
  • Figure 2: Share of consumer electronics shipments* by OS, 2014
  • Figure 3: OPNFV Platinum Members
  • Figure 4: Different attitudes of operators to open source – selected interview quotes
  • Figure 5: The Open IT Ecosystem (incl. key industry bodies)
  • Figure 6: Three Forms of Governance in Open Source Software Projects
  • Figure 7: Three Classes of Open Source Software License
  • Figure 8: Web Server Share of Active Sites by Developer, 2000-2015
  • Figure 9: Leading software companies vs. Red Hat, market capitalisation, Oct. 2015
  • Figure 10: The Key Advantages and Disadvantages of Open Source Software
  • Figure 11: How Google Works – Failing Well
  • Figure 12: Performance gains from an open source activation (OSS) platform
  • Figure 13: Intel Hardware Performance, 2010-13
  • Figure 14: Open source is more likely to be found today in areas which are…
  • Figure 15: Framework mapping current telco uptake of open source software
  • Figure 16: Five key barriers to telco adoption of open source software
  • Figure 17: % of employees with ‘software’ in their LinkedIn job title, Oct. 2015
  • Figure 18: ‘Waterfall’ and ‘Agile’ Software Development Methodologies Compared
  • Figure 19: Four key cultural attributes for successful telco transformation

Microsoft: Pivoting to a Communications-Centric Business

Introduction: From Monopoly to Disruption

For many years, Microsoft was an iconic monopolist, in much the same way as AT&T had been before divestment. Microsoft’s products were ubiquitous and often innovative, and its profitability enormous. It was familiar, yet frequently scorned as the creator of a dreary monoculture with atrocious security properties. Microsoft’s mission statement could not have been simpler: a computer in every office and in every home. This achieved, though, its critics have often seen it as an organisation in search of an identity, experimenting with mobile, search, maps, hardware and much else without really settling on a new direction.

Turning to the numbers: for the last two years, there has been steady erosion of the once phenomenally high margins, although revenue is still rising steadily. Since Q3 2013, Microsoft’s revenue grew an average of 3.5% annually, but the decline in margins meant that profits barely grew, with a CAGR of 0.66%. Telcos will be familiar with this kind of stagnation, but they would be delighted with Microsoft’s 66% gross margins. Note that getting into hardware has given Microsoft a typical hardware vendor’s Christmas spike in revenue.

Figure 1:  MS revenue is growing steadily but margin erosion undermines it

Source: Microsoft 10-K, STL Partners

Over the long term, the pattern is clearer, as are the causes. Figure 2 shows Microsoft’s annual revenue and gross margin since the financial year 1995. From 1995 to 2010, gross margins were consistently between 80 and 90 per cent, twice the 45% target HP traditionally defined as “fascinating”. It was good to be king. However, in the financial year 2010, there is a clear inflection point: margins depart from the 80% mark and never return, falling at a 3.45% clip between 2010 and 2015.
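
For reference, the growth rates quoted here follow the standard compound annual growth rate (CAGR) formula; the sketch below shows the computation with illustrative round numbers rather than Microsoft’s actual figures.

```python
# CAGR: the annualised growth rate between a start and end value.
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate as a fraction, e.g. 0.035 for 3.5%."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative: revenue compounding while profit stays nearly flat
print(f"revenue CAGR: {cagr(100.0, 107.1, 2):.2%}")  # ~3.5% a year
print(f"profit  CAGR: {cagr(30.0, 30.4, 2):.2%}")    # ~0.7% a year
```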

The event that triggered this should be no surprise. Microsoft has traditionally been discussed in parentheses with Apple, and Apple’s 2010 was a significant one. It was the first year that Apple began using the A-series processors of its own design, benefiting from the acquisition of PA Semiconductor in 2008. This marked an important strategic shift at Apple from the outsourced, design- and brand-centric business to vertical integration and investment in manufacturing, a strategy associated with Tim Cook’s role as head of the supply chain.

Figure 2: The inflection point in 2010

Source: Microsoft 10-K, STL Partners

The deployment of the A4 chip made possible two major product launches in 2010 – the iPhone 4, which would sell enormously more than any of the previous iPhones, and the iPad, which created an entirely new product category competing directly with the PC. Another Apple product launch that year, which also competed head-on with Microsoft, wasn’t quite as dramatic but was also very significant – the MacBook line began shipping with SSDs rather than hard disks, and the very popular 11” MacBook Air was added as an entry-level option. At the time, the PC industry and hence Microsoft was heavily committed to the Intel-backed netbooks, and the combination of the iPad and the 11” Air essentially destroyed the netbook as a product category.

The problems started in the consumer market, but the industry was beginning to recognise that innovations now took hold among consumers first and then diffused into the enterprise. Further, the enterprise franchise, centred on the Microsoft Business division and what was then termed Server & Tools[1], was threatened by the increasing adoption of Apple products.

Microsoft had to respond, and it did so with a succession of dramatic initiatives. One was to rethink Windows as a tablet- or phone-optimised operating system, in Windows Phone 7 and Windows 8. Another was to acquire Nokia’s smartphone business, and to diversify into hardware via the Xbox and Surface projects. And yet a third was to embrace the cloud. Figure 3 shows the results.

Table of Contents

  • Introduction
  • Executive Summary
  • From Monopoly to Disruption
  • The push into mobile fails…but what about the cloud?
  • Changing Platforms: from Windows to Office
  • The Skype Acquisition: a missed opportunity?
  • Skype for Business and Office 365: the new platform
  • The rise of the consumer cloud
  • Bing may just about be breaking even…but the real story here is consumer cloud
  • Scaling out in the cloud
  • Conclusions: towards a communications-centric Microsoft

 

Table of Figures

  • Figure 1: MS revenue is growing steadily but margin erosion undermines it
  • Figure 2: The inflection point in 2010
  • Figure 3: Revenue by product category at Microsoft, last 2 years
  • Figure 4: Cloud and the Enterprise drive profitability at Microsoft
  • Figure 5: Cloud is the driver of growth at Microsoft
  • Figure 6: Internally-developed hardware and cloud services are improving their margins
  • Figure 7: The Nokia Devices & Services business slides into loss
  • Figure 8: In 2011, a unifying API appeared critical for Skype’s future within Microsoft
  • Figure 9: Cloud is now over $8bn a year in revenue
  • Figure 10: Spot the deliberate mistake. No mention of Bing’s profitability or otherwise
  • Figure 11: Bing was a money pit for years, but may have begun to improve
  • Figure 12: The app store and consumer cloud businesses are performing superbly

Google’s Big, Big Data Battle

The challenges to Google’s core business 

Although Google is the world’s leading search engine by some distance, its pre-eminence is more fragile than it first appears. As Google likes to remind anti-trust authorities, its competitors are just a click away. And its primary competitors are some of the most powerful and well-financed companies in the world – Apple, Amazon, Facebook and Microsoft. As these companies, as well as specialist service providers, accumulate more and more data on consumers, Google’s position as the leading broker of online advertising is under threat in several inter-related ways:

  1. Google’s margins are being squeezed, as competition intensifies. Increasingly experienced web users are using specialist search engines, such as Amazon (shopping), Expedia (travel) and moneysupermarket.com (financial services), or going direct to the sites they need, thereby circumventing Google’s search engine and the advertising brokered by Google. This trend is exacerbated by Google’s ongoing lockout from the vast amount of content being generated by Facebook’s social network. As the Internet matures, general-purpose web search may become yesterday’s business.
  2. The rise of the app-based Internet: As consumers increasingly access the Internet via mobile devices, they are making greater use of apps and less use of browsers and, by extension, conventional search engines. Apps are popular on mobile devices because they are designed to take the consumer straight to the content they are looking for, rather than requiring them to navigate around the web using small and fiddly on-screen keyboards. Moreover, Apple, the leading provider of smartphones and tablets to the affluent, is seeking to relegate and, where feasible, remove Google’s apps and services from its ecosystem.
  3. Android forks: Android, an extraordinarily successful ‘Trojan Horse’ for Google’s apps and services, is the market leading operating system for mobile devices, but Google’s control of Android is patchy. Some device makers are integrating their own apps into a forked variant of this open-source platform. Amazon and Nokia are among those who have stripped Google’s search, maps, mail and store apps from their variants of the Android operating system, reducing the data that Google can gather on their customers. At the same time, Samsung, the world’s largest handset vendor, is straining at Google’s Android leash.
  4. Quality dilution: As Google is the world’s dominant search engine, it is the prime target for so-called content farms that produce large volumes of low quality content in an effort to rank highly in Google’s search results and thereby attract traffic and advertising.
  5. Regulatory scrutiny: Despite a February 2014 settlement with the European Commission concerning its search practices, Google remains in the regulatory spotlight. Competition authorities across the world continue to fret about Google’s market power and its ability to influence what people look at on the Internet.

1. Google’s margin squeeze

Price deflation

Google, the company that facilitated massive deflation across advertising, content, e-commerce, and mobile operating systems, is itself suffering from the deflationary environment of the Internet. Although revenue and net income are still growing, margins are shrinking (see Figure 2). Google is still growing because it is adding volume. However, there is strong evidence that its pricing power is being eroded.

Figure 2: Google margins are steadily falling as volumes continue to rise


Source: Google filings

To put this in the context of its Silicon Valley peers, Figure 3 shows the same data for Google, Facebook and Apple, using a trend line covering the 2009 to 2013 period for each company. Note that we have used a log scale to compare three companies of very different sizes. Apple saw growth in both revenue and operating margins until 2013, when it hit a difficult patch, although a big product launch might fix that at any time. Facebook has grown revenues enormously, but went through a traumatic 2012 as the shift to mobile hit it. While all this drama went on, Google has grown steadily, while seeing its margins eroded.

Figure 3: Google’s operating margins are now below those of Apple and Facebook


Source: SEC filings

What are the factors behind Google’s declining operating margin? We believe the main drivers are:

  • The amount Google can charge per click is falling – buyers get more ads per buck.
  • The cost of acquiring ad inventory is increasing.

Cheaper ads

As Figure 4 shows, Google continues to drive ad volume (paid clicks), but ad rates (cost per click) are falling steadily. The average cost-per-click on Google websites and Google Network Members’ websites decreased approximately 8% from 2012 to 2013.  We think this is primarily due to intensifying competition, particularly from Facebook. However, Google attributes the decline to “various factors, such as the introduction of new products as well as changes in property mix, platform mix and geographical mix, and the general strengthening of the U.S. dollar compared to certain foreign currencies.” The second quarter of 2014 saw paid clicks rise 2% quarter-on-quarter, while the cost per click was flat.
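
The mechanics are straightforward: advertising revenue is paid clicks multiplied by cost per click, so volume growth can more than offset price decline. In the sketch below, only the roughly 8% CPC decline comes from the text; the click-volume growth figure is an illustrative assumption.

```python
# Revenue = paid clicks x cost per click: volume growth vs. price decline.
clicks_growth = 0.25   # assumed annual growth in paid clicks
cpc_change = -0.08     # the ~8% cost-per-click decline cited for 2012-13

revenue_change = (1 + clicks_growth) * (1 + cpc_change) - 1
print(f"revenue change: {revenue_change:+.1%}")  # +15.0%: growth despite cheaper ads
```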

Figure 4: The cost per click is declining in lockstep with rising volume


Source: Google filings

 

Table of Contents

  • Introduction
  • Executive Summary
  • The challenges to Google’s core business
  • 1. Google’s margin squeeze
  • 2. The rising importance of mobile apps
  • 3. Android forks
  • 4. Quality dilution
  • 5. Regulatory scrutiny
  • Google’s strategy – get on the front foot
  • Google Now – turning search on its head
  • Reactive search becomes more proactive
  • Voice input
  • Anticipating wearables, connected cars and the Internet of Things
  • Searching inside apps
  • Evaluating Google Now
  • 1. The marketplace
  • 2. Develop compelling service offerings
  • 3. The value network
  • 4. Technology
  • 5. Finance – the high-level business model

 

Table of Figures

  • Figure 1: How Google is neutralising threats and pursuing opportunities
  • Figure 2: Google margins are steadily falling as volumes continue to rise
  • Figure 3: Google’s operating margins are now below those of Apple and Facebook
  • Figure 4: The cost per click is declining in lockstep with rising volume
  • Figure 5: Rising distribution costs are driving Google’s TAC upwards
  • Figure 6: Google’s revenues are increasingly coming from in-house sites and apps
  • Figure 7: R&D is the fastest-growing ad-acquisition cost in absolute terms
  • Figure 8: Daily active users of Facebook generating content out of Google’s reach
  • Figure 9: Google is still the most popular destination on the Internet
  • Figure 10: In the U.S., usage of desktop web sites is falling
  • Figure 11: Google’s declining share of mobile search advertising in the U.S.
  • Figure 12: Google’s lead on the mobile web is narrower than on the desktop web
  • Figure 13: Top smartphone apps in the U.S. by average unique monthly users
  • Figure 14: For Google, its removal from the default iOS Maps app is a major blow
  • Figure 15: On Android, Google owns four of the five most used apps in the U.S.
  • Figure 16: The resources Google needs to devote to web spam are rising over time
  • Figure 17: Google, now genuinely global.
  • Figure 18: A gap in the market: Timely proactive recommendations
  • Figure 19: Google’s search engine is becoming proactive
  • Figure 20: The ongoing evolution of Google Search into a proactive, recommendations service
  • Figure 21: The Telco 2.0 Business Model Framework
  • Figure 22: Amazon Local asks you to set preferences
  • Figure 23: Google Now’s cards and the information they use
  • Figure 24: Android dominates the global smartphone market
  • Figure 25: Samsung has about 30% of the global smartphone market
  • Figure 26: Google – not quite the complete Internet company
  • Figure 27: Google’s strategic response