How telcos can provide a tonic for transport

5G can help revolutionise public transport

With the advent of 5G, STL Partners believes telcos have a broad opportunity to help coordinate better use of the world’s resources and assets, as outlined in the report: The Coordination Age: A third age of telecoms. Reliable and ubiquitous connectivity can enable companies and consumers to use digital technologies to efficiently allocate and source assets and resources.

In urban and suburban transport markets, one precious resource is in short supply – space. Trains can be crowded, roads can be congested and there may be nowhere to park. Following the enormous changes in working patterns in the wake of the pandemic, both individuals and policymakers are reviewing their transport choices.

This report explores how the concept of mobility-as-a-service (MaaS) is evolving, while outlining the challenges facing those companies looking to transform public transport. In particular, it considers how telcos and 5G could support the development and deployment of automated shuttle buses, which are now beginning to appear on the world’s roads. Whereas self-driving cars are taking much longer to develop than their proponents expected, automated shuttle buses look like a more realistic mid-term prospect. Running on relatively short set routes, these vehicles are easier to automate and can be monitored/controlled by dedicated connectivity infrastructure.

This report also examines the role of 5G connectivity in other potentially disruptive transport propositions, such as remotely controlled hire cars, passenger drones and flying cars, which could emerge over the next decade. It builds on previous STL Partners research.

Where is transport headed?

Across the world, transport is in a state of flux. Growing congestion, the pandemic, concerns about air quality and climate change, and the emergence of new technologies are taking the transport sector in new directions. Urban planners have long recognised that having large numbers of half-empty cars crawling around at 20km/hour looking for somewhere to park is not a good use of resources.

Experimentation abounds. Many municipalities are building bike lanes and closing roads to try to encourage people to get out of their cars. In response, sales of electric bikes and scooters are rising fast. The past 10 years have also seen a global boom (followed by a partial bust) in micro-mobility services – shared bikes and scooters. Although they haven’t lived up to the initial hype, these sharing economy services have become a key part of the transport mix in many cities (for more on this, see the STL Partners report: Can telcos help cities combat congestion?).

Indeed, these micro-mobility services may be given a shot in the arm by the difficulties faced by the ride hailing business. In many cities, Uber and Lyft are under intense pressure to improve their driver proposition by giving workers more rights, while complying with more stringent safety regulations. That is driving costs upwards. Uber had hoped to ultimately replace human drivers with self-driving vehicles, but that now looks unlikely to happen in the foreseeable future. Tesla, which has always been bullish about the prospects for autonomous driving, keeps having to push back its timelines.

Tellingly, the Chinese government has pushed back, from 2020 to 2025, its target for more than half of new cars sold to have self-driving capabilities. In a 2020 statement issued by the National Development and Reform Commission and the Ministry of Industry and Information Technology, it blamed technical difficulties, exacerbated by the coronavirus pandemic.

Still, self-driving cars will surely arrive eventually. In July, Alphabet (Google’s parent) reported that its experimental self-driving vehicle unit Waymo continues to grow. “People love the fully autonomous ride hailing service in Phoenix,” Sundar Pichai, CEO of Alphabet and Google, enthused. “Since first launching its services to the public in October 2020, Waymo has safely served tens of thousands of rides without a human driver in the vehicle, and we look forward to many more.”

In response to analyst questions, Pichai added: “We’ve had very good experience by scaling up rides. These are driverless rides and no one is in the car other than the passengers. And people have had a very positive experience overall. …I expect us to scale up more through the course of 2022.”

More broadly, the immediate priority for many governments will be greening their transport systems, given the rising public concern about climate change and extreme weather. The latest report from the Intergovernmental Panel on Climate Change calls for “immediate, rapid and large-scale reductions in greenhouse gas emissions” to stabilise the earth’s climate. This pressure will likely increase the pace at which traditional components of the transport system become all-electric – cars, motorbikes, buses, bikes, scooters and even small aircraft are making the transition from relying on fossil fuel or muscle power to relying on batteries.

The rest of this 45-page report explores how public transport is evolving, and the role that 5G connectivity and telcos can play in enabling the shift.

Table of contents

  • Executive Summary
  • Introduction
  • Where is transport headed?
    • Mobility-as-a-service
    • The role of digitisation and data
    • Rethinking the bus
    • Takeaways
  • How telcos are supporting public transport
    • Deutsche Telekom: Trying to digitise transport
    • Telia: Using 5G to support shuttle buses
    • Takeaways
  • The key challenges
    • A complex and multi-faceted value chain
    • Regulatory caution
    • Building viable business models
    • Takeaways
  • Automakers become service providers
    • Volvo to retrieve driving data in real-time
    • Automakers and tech companies team up
    • Takeaways
  • Taxis and buses take to the air
    • The prognosis for passenger drones
    • Takeaways
  • Conclusions: Strategic implications for telcos

 


Are telcos smart enough to make money work?

Telco consumer financial services propositions

Telcos face a perplexing challenge in consumer markets. On the one hand, telcos’ standing with consumers has improved through the COVID-19 pandemic, and demand for connectivity is strong and continues to grow. On the other hand, most consumers are not spending more money with telcos because operators have yet to create compelling new propositions that they can charge more for. In the broadest sense, telcos need to (and can in our view) create more value for consumers and society more generally.


As discussed in our previous research, we believe the world is now entering a “Coordination Age” in which multiple stakeholders will work together to maximise the potential of the planet’s natural and human resources. New technologies – 5G, analytics, AI, automation, cloud – are making it feasible to coordinate and optimise the allocation of resources in real-time. As providers of connectivity that generates vast amounts of relevant data, telcos can play an important role in enabling this coordination. Although some operators have found it difficult to expand beyond connectivity, the opportunity still exists and may actually be expanding.

In this report, we consider how telcos can support a more efficient allocation of capital by playing in the financial services sector. Financial services (banking) sits in a “sweet spot” for operators: economies of scale are available at a national level, and connected technology can change the industry.

Financial Services in the Telecoms sweet spot


Source: STL Partners

The financial services industry is undergoing major disruption brought about by a combination of digitisation and liberalisation – new legislation, such as the EU’s Payment Services Directive, is making it easier for new players to enter the banking market. And there is more disruption to come with the advent of digital currencies – China and the EU have both indicated that they will launch digital currencies, while the U.S. is mulling going down the same route.

A digital currency is intended to be a digital version of cash that is underpinned directly by the country’s central bank. Rather than owning notes or coins, you would own a deposit directly with the central bank. The idea is that a digital currency, in an increasingly cash-free society, would help ensure financial stability by enabling people to store at least some of their money with a trusted official platform, rather than a company or bank that might go bust. A digital currency could also make it easier to bring unbanked citizens (a significant share of the world’s population) into the financial system, as central banks could issue digital currencies directly to individuals without them needing to have a commercial bank account. Telcos (and other online service providers) could help consumers to hold digital currency directly with a central bank.

Although the financial services industry has already experienced major upheaval, there is much more to come. “There’s no question that digital currencies and the underlying technology have the potential to drive the next wave in financial services,” Dan Schulman, the CEO of PayPal told investors in February 2021. “I think those technologies can help solve some of the fundamental problems of the system. The fact that there’s this huge prevalence and cost of cash, that there’s lack of access for so many parts of the population into the system, that there’s limited liquidity, there’s high friction in commerce and payments.”

In light of this ongoing disruption, this report reviews the efforts of various operators, such as Orange, Telefónica and Turkcell, to expand into consumer financial services, notably the provision of loans and insurance. A close analysis of their various initiatives offers pointers to the success criteria in this market, while also highlighting some of the potential pitfalls to avoid.

Table of contents

  • Executive Summary
  • Introduction
  • Potential business models
    • Who are you serving?
    • What are you doing for the people you serve?
    • M-Pesa – a springboard into an array of services
    • Docomo demonstrates what can be done
    • But the competition is fierce
  • Applying AI to lending and insurance
    • Analysing hundreds of data points
    • Upstart – one of the frontrunners in automated lending
    • Takeaways
  • From payments to financial portal
    • Takeaways
  • Turkcell goes broad and deep
    • Paycell has a foothold
    • Consumer finance takes a hit
    • Regulation moving in the right direction
    • Turkcell’s broader expansion plans
    • Takeaways
  • Telefónica targets quick loans
    • Growing competition
    • Elsewhere in Latin America
    • Takeaways
  • Momentum builds for Orange
    • The cost of Orange Bank
    • Takeaways
  • Conclusions and recommendations
  • Index

This report builds on earlier STL Partners research.


Commerce and connectivity: A match made in heaven?

Rakuten and Reliance: The exceptions or the rule?

Over the past decade, STL Partners has analysed how connectivity, commerce and content have become increasingly interdependent – as both shopping and entertainment go digital, telecoms networks have become key distribution channels for all kinds of consumer businesses. Equally, the growing availability of digital commerce and content are driving demand for connectivity both inside and outside the home.

To date, the top tier of consumer Internet players – Google, Apple, Amazon, Alibaba, Tencent and Facebook – have tended to focus on trying to dominate commerce and content, largely leaving the provision of connectivity to the conventional telecoms sector. But now some major players in the commerce market, such as Rakuten in Japan and Reliance in India, are pushing into connectivity, as well as content.

This report considers whether Rakuten’s and Reliance’s efforts to combine content, commerce and connectivity into a single package are a harbinger of things to come or exceptions that prove the longstanding rule that telecoms is a distinct activity with few synergies with adjacent sectors. The provision of connectivity has generally been regarded as a horizontal enabler for other forms of economic activity, rather than part of a vertically-integrated service stack.

This report also explores the extent to which new technologies, such as cloud-native networks and open radio access networks, and an increase in licence-exempt spectrum, are making it easier for companies in adjacent sectors to provide connectivity. Two chapters cover Google and Amazon’s connectivity strategies respectively, analysing the moves they have made to date and what they may do in future. The final section of this report draws some conclusions and then considers the implications for telcos.

This report builds on earlier STL Partners research.

Mixing commerce and connectivity

Over the past decade, the smartphone has become an everyday shopping tool for billions of people, particularly in Asia. As a result, the smartphone display has become an important piece of real estate for the global players competing for supremacy in the digital commerce market. That real estate can be accessed via a number of avenues – through the handset’s operating system, a web browser, mobile app stores or through the connectivity layer itself.

As Google and Apple exercise a high degree of control over smartphone operating systems, popular web browsers and mobile app stores, other big digital commerce players, such as Amazon, Facebook and Walmart, risk being marginalised. One way to avoid that fate may be to play a bigger role in the provision of wireless connectivity as Reliance Industries is doing in India and Rakuten is doing in Japan.

For telcos, this is potentially a worrisome prospect. By rolling out its own greenfield mobile network, e-commerce and financial services platform, Rakuten has brought disruption and low prices to Japan’s mobile connectivity market, putting pressure on the incumbent operators. There is a clear danger that digital commerce platforms use the provision of mobile connectivity as a loss leader to drive traffic to their other services.

Table of Contents

  • Executive Summary
  • Introduction
  • Mixing connectivity and commerce
    • Why Rakuten became a mobile network operator
    • Will Rakuten succeed in connectivity?
    • Why hasn’t Rakuten Mobile broken through?
    • Borrowing from the Amazon playbook
    • How will the hyperscalers react?
  • New technologies, new opportunities
    • Capacity expansion
    • Unlicensed and shared spectrum
    • Cloud-native networks and Open RAN attract new suppliers
    • Reprogrammable SIM cards
  • Google: Knee deep in connectivity waters
    • Google Fiber and Fi maintain a holding pattern
    • Google ramps up and ramps down public Wi-Fi
    • Google moves closer to (some) telcos
    • Google Cloud targets telcos
    • Big commitment to submarine/long distance infrastructure
    • Key takeaways: Vertical optimisation not integration
  • Amazon: A toe in the water
    • Amazon Sidewalk
    • Amazon and CBRS
    • Amazon’s long distance infrastructure
    • Takeaways: Control over connectivity has its attractions
  • Conclusions and implications for telcos in digital commerce/content
  • Index

Consumer strategy: What should telcos do?

Globally, telcos are pursuing a wide variety of strategies in the consumer market, ranging from broad competition with the major Internet platforms to a narrow focus on delivering connectivity.

Some telcos, such as Orange France, Telefónica Spain, Reliance Jio and Rakuten Mobile, are combining connectivity with an array of services, such as messaging, entertainment, smart home, financial services and digital health propositions. Others, such as Three UK, focus almost entirely on delivering connectivity, while many sit somewhere in between, targeting a single vertical market, in addition to connectivity. AT&T is entertainment-orientated, while Safaricom is financial services-focused.

This report analyses the consumer strategies of the leading telcos in the UK and Brazil – two very different markets. Whereas the UK is a densely populated, English-speaking country, Brazil has a highly dispersed population that speaks Portuguese, making the barriers to entry higher for multinational telecoms and content companies.

By examining these two telecoms markets in detail, this report will consider which of these strategies is working, looking, in particular, at whether a halfway-house approach can be successful, given the economies of scope available to companies, such as Amazon and Google, that offer consumers a broad range of digital services. It also considers whether telcos need to be vertically-integrated in the consumer market to be successful. Or can they rely heavily on partnerships with third-parties? Do they need their own distinctive service layer developed in-house?

In light of the behavioural changes brought about by the pandemic, the report also considers whether telcos should be revamping their consumer propositions so that they are more focused on the provision of ultra-reliable connectivity, so people can be sure to work from home productively. Is residential connectivity really a commodity or can telcos now charge a premium for services that ensure a home office is reliably and securely connected throughout the day?

A future STL Partners report will explore telcos’ new working from home propositions in further detail.

The UK market: Convergence is king

The UK is one of the most developed and competitive telecoms markets in the world. It has a high population density, with 84% of its 66 million people living in urban areas, according to the CIA Factbook. There are almost 272 people for every square kilometre, compared with an average of 103 across Europe. For every 100 people, there are 48 fixed lines and 41 broadband connections, while the vast majority of adults have a mobile phone. GDP per capita (on a purchasing power parity basis) is US$ 48,710, compared with US$ 65,118 in the US (according to the World Bank).

The strength of the state-funded public service broadcaster, the BBC, has made it harder for private sector players to make money in the content market. The BBC delivers a large amount of high-quality, advertising-free content to anyone in the UK who pays the annual licence fee, which is compulsory for households that watch television.

In the UK, the leading telcos have mostly eschewed expansion into the broader digital services market. That reflects the strong position of the leading global Internet platforms in the UK, as well as the quality of free-to-air television, and the highly competitive nature of the UK telecoms market – UK operators have relatively low margins, giving them little leeway to invest in the development of other digital services.

Figure 1 summarises where the five main network operators (and broadband/TV provider Sky) are positioned on a matrix mapping degree of vertical integration against the breadth of the proposition.

Most UK telcos have focused on the provision of connectivity

UK telco B2C strategies

Source: STL Partners

Brazil: Land of new opportunities

Almost as large as the US, Brazil has a population density of just 25 people per square kilometre – about one tenth of the UK’s. Although 87% of Brazil’s 212 million people live in urban areas, according to the CIA Factbook, that means almost 28 million people are spread across the country’s rural communities.

By European standards, Brazil’s fixed-line infrastructure is relatively sparse. For every 100 people, Brazil has 16 fixed lines, 15 fixed broadband connections and 99 mobile connections. Its GDP per capita (on a purchasing power parity basis) is US$ 15,259 – about one third of that in the UK. About 70% of adults had a bank account in 2017, according to the latest World Bank data. However, only 58% of adults were actively using their accounts.

A vast middle-income country, Brazil has a very different telecoms market to that of the UK. In particular, network coverage and quality continue to be important purchasing criteria for consumers in many parts of the country. As a result, Oi, one of the four main network operators, became uncompetitive and entered a bankruptcy restructuring process in 2016. It is now hoping to sell its sub-scale mobile unit for at least 15 billion reais (US$ 2.8 billion) to refocus the company on its fibre network. The other three major telcos, Vivo (part of Telefónica), Claro (part of América Móvil) and TIM Brazil, have made a joint bid to buy its mobile assets.

For this trio, opportunities may be opening up. They could, for example, play a key role in making financial services available across Brazil’s sprawling landmass, much of which is still served by inadequate road and rail infrastructure. If they can help Brazil’s increasingly cash-strapped consumers to save time and money, they will likely prosper. Even before COVID-19 struck, Brazil was struggling with the fall-out from an earlier economic crisis.

At the same time, Brazil’s home entertainment market is in a major state of flux. Demand for pay television, in particular, is falling away, as consumers seek out cheaper Internet-based streaming options.

All of Brazil’s major telcos are building a broad consumer play

Brazil telco consumer market strategy overview

Source: STL Partners

Table of contents

  • Executive Summary
  • Introduction
    • The UK market: Convergence is king
    • BT: Trying to be broad and deep
    • Virgin Media: An aggregation play
    • O2 UK: Changing course again
    • Vodafone: A belated convergence play
    • Three UK: Small and focused
    • Takeaways from the UK market: Triple play gridlock
  • Brazil: Land of new opportunities
    • The Brazilian mobile market
    • The Brazilian fixed-line market
    • The Brazilian pay TV market
    • The travails of Oi
    • Vivo: Playing catch-up in fibre
    • Telefónica’s financial performance
    • América Móvil goes broad in Brazil
    • TIM: Small, but perfectly formed?
    • Takeaways from the Brazilian market: A potentially treacherous transition
  • Index

Fixed wireless access growth: To 20% of homes by 2025


Fixed wireless access growth forecast

Fixed Wireless Access (FWA) networks use a wireless “last mile” link for the final connection of a broadband service to homes and businesses, rather than a copper, fibre or coaxial cable into the building. Provided mostly by WISPs (Wireless Internet Service Providers) or mobile network operators (MNOs), these services come in a wide range of speeds, prices and technology architectures.

Some FWA services are just a short “drop” from a nearby pole or fibre-fed hub, while others can work over distances of several kilometres or more in rural and remote areas, sometimes with base station sites backhauled by additional wireless links. WISPs can either be independent specialists, or traditional fixed/cable operators extending reach into areas they cannot economically cover with wired broadband.

There is a fair amount of definitional vagueness about FWA. The most expansive definitions include cheap mobile hotspots (“Mi-Fi” devices) used in homes, or various types of enterprise IoT gateway, both of which could easily be classified in other market segments. Most service providers don’t give separate breakouts of deployments, while regulators and other industry bodies report patchy and largely inconsistent data.

Our view is that FWA is firstly about providing permanent broadband access to a specific location or premises. Primarily, this is for residential wireless access to the Internet and sometimes typical telco-provided services such as IPTV and voice telephony. In a business context, there may be a mix of wireless Internet access and connectivity to corporate networks such as VPNs, again provided to a specific location or building.

A subset of FWA relates to M2M usage, for instance private networks run by utility companies for controlling grid assets in the field. These are typically not Internet-connected at all, and so don’t fit most observers’ general definition of “broadband access”.

Usually, FWA will be marketed as a specific service and package by some sort of network provider, typically including the terminal equipment (“CPE” – customer premises equipment), rather than allowing the user to “bring their own” device. That said, lower-end (especially 4G) offers may be SIM-only deals intended to be used with generic (and unmanaged) portable hotspots.

There are some examples of private network FWA, such as a large caravan or trailer park with wireless access provided from a central point, and perhaps in future municipal or enterprise cellular networks giving fixed access to particular tenant structures on-site – for instance to hangars at an airport.

FWA today

Today, fixed wireless access (FWA) is used for perhaps 8-9% of broadband connections globally, although the figure varies significantly by definition, country and region. There are various use cases (see below), but generally FWA is deployed in areas without good fixed broadband options, or by mobile-only operators trying to add a fixed revenue stream where they have spare capacity.

Fixed wireless internet access fits specific sectors and uses, rather than the overall market

FWA Use Cases

Source: STL Partners

FWA has traditionally been used in sparsely populated rural areas, where the economics of fixed broadband are untenable, especially in developing markets without existing fibre transport to towns and villages, or even copper in residential areas. Such networks have typically used unlicensed frequency bands, as there is limited interference – and little financial justification for expensive spectrum purchases. In most cases, such deployments use proprietary variants of Wi-Fi, or its ill-fated 2010-era sibling WiMAX.

Increasingly however, FWA is being used in more urban settings, and in more developed market scenarios – for example during the phase-out of older xDSL broadband, or in places with limited or no competition between fixed-network providers. Some cellular networks primarily intended for mobile broadband (MBB) have been used for fixed usage as well, especially if spare capacity has been available. 4G has already catalysed rapid growth of FWA in numerous markets, such as South Africa, Japan, Sri Lanka, Italy and the Philippines – and 5G is likely to make a further big difference in coming years. These deployments mostly rely on licensed spectrum, typically the national bands owned by major MNOs. In some cases, specific bands are dedicated to FWA, rather than shared with normal mobile broadband. This allows appropriate “dimensioning” of network elements, and clearer cost-accounting for management.
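
As a rough, hypothetical illustration of the kind of dimensioning exercise mentioned above, the short Python sketch below estimates how many homes a single FWA sector might serve and how many sites a town might need. Every input figure is an assumption chosen for the example, not data from any operator or from this report.

    # Illustrative FWA sector dimensioning.
    # All inputs are assumptions for the example, not operator data.

    sector_capacity_mbps = 800.0   # assumed average 5G sector throughput (e.g. ~100 MHz of mid-band)
    busy_hour_demand_mbps = 3.0    # assumed average per-home demand in the busy hour

    homes_per_sector = sector_capacity_mbps / busy_hour_demand_mbps

    homes_to_cover = 20_000        # assumed number of homes in the target area
    sectors_per_site = 3           # assumed three-sector sites

    sites_needed = homes_to_cover / (homes_per_sector * sectors_per_site)

    print(f"Homes per sector: ~{homes_per_sector:.0f}")
    print(f"Sites needed for {homes_to_cover:,} homes: ~{sites_needed:.0f}")

Under these assumptions a sector serves roughly 270 homes, so a town of 20,000 homes would need on the order of 25 three-sector sites; halving the assumed per-home busy-hour demand roughly halves the site count, which is why dimensioning assumptions matter so much to FWA economics.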

Historically, most FWA has required an external antenna and professional installation on each individual house, although it also gets deployed for multi-dwelling units (MDUs, i.e. apartment blocks) as well as some non-residential premises like shops and schools. More recently, self-installed indoor CPE with varying levels of price and sophistication has helped broaden the market, enabling customers to get terminals at retail stores or delivered direct to their home for immediate use.

Looking forward, the arrival of 5G mass-market equipment and larger swathes of mmWave and new mid-band spectrum – both licensed and unlicensed – is changing the landscape again, with the potential for fibre-rivalling speeds, sometimes at gigabit-grade.

Table of contents

  • Executive Summary
  • Introduction
    • FWA today
    • Universal broadband as a goal
    • What’s changed in recent years?
    • What’s changed because of the pandemic?
  • The FWA market and use cases
    • Niche or mainstream? National or local?
    • Targeting key applications / user groups
  • FWA technology evolution
    • A broad array of options
    • Wi-Fi, WiMAX and close relatives
    • Using a mobile-primary network for FWA
    • 4G and 5G for WISPs
    • Other FWA options
    • Customer premise equipment: indoor or outdoor?
    • Spectrum implications and options
  • The new FWA value chain
    • Can MNOs use FWA to enter the fixed broadband market?
    • Reinventing the WISPs
    • Other value chain participants
    • Is satellite a rival waiting in the wings?
  • Commercial models and packages
    • Typical pricing and packages
    • Example FWA operators and plans
  • STL’s FWA market forecasts
    • Quantitative market sizing and forecast
    • High level market forecast
  • Conclusions
    • What will 5G deliver – and when and where?
  • Index

Reliance Jio: Learning from India’s problem solver


Introduction

This year marks the 25th anniversary of mobile networks in India. The huge potential of the market has attracted many players (even as recently as 2016, there were 12 mobile operators in India). But most have had their fingers burned by the complexities of this market, as well as intense competition, particularly following the entry of Reliance Jio in September 2016.

In the past four years, Reliance Jio has gone from strength to strength, becoming the leading telco in terms of mobile subscriber numbers in December 2019, dramatically expanding internet access and driving adoption of digital services across the country. It is not an exaggeration to say that Jio played a major role in the digital transformation of India to date.

Evidence of Jio’s impact on the Indian market

Source: STL Partners

Jio leads Indian telecoms

By delivering broad societal progress and value, Jio has been able to overcome many of the regulatory and political challenges that have hindered other new entrants to the Indian telecoms market. Jio is in good standing as regards its future ambitions in the digital environment, which helped it to attract over US$20 billion in investment between April and July 2020 from Facebook, Google and other international investors.

In India, Reliance Jio has trialled elements of a Coordination Age approach, setting out to solve various socio-economic problems by matching supply and demand, while moving up the value chain to unlock further sources of revenue growth.

At the time of Jio’s entry, India was still predominantly a 3G market, with voice calls being the main application. Although there were a multitude of plans on offer and the retail price per minute was among the lowest in the world, mobile communications remained out of reach for many (not helped by high license and spectrum fees that translated into upward pressure on pricing).

Reliance Industries recognised an opportunity to use the advent of 4G technology to build a data-first telecoms player that could support its wider aspirations to develop a globally competitive technology business in India. Accordingly, it obtained a nationwide license to operate a 4G network and encouraged take-up with a promotion that offered customers free voice calls forever.

The existing operators rushed to defend their market positions by dropping their prices, resulting in a price war that destroyed value in the market and has led to consolidation and insolvencies, such that, aside from Jio, only two privately-owned operators remain – with the real possibility that the market will shrink further and become a duopoly.

STL Partners covered the success of Jio’s disruptive market entry strategy in the 2017 report Telco-Driven Disruption: Will AT&T, Axiata, Reliance Jio and Turkcell succeed? This report considers Jio’s strategy in the context of the Coordination Age. It looks at what this has meant for the market and highlights the implications for operators in other developing markets.

Table of Contents

  • Executive Summary
  • Introduction
  • Interventionist government shapes market
    • Mobile market overview
    • The shifting sands of policy
  • Jio overtakes the incumbents
  • The rise of Reliance Jio
    • Leveraging the strength of a conglomerate
    • Restructuring and renewal
  • Major emphasis on partnerships
    • Start-ups
    • Global technology partners
  • Competitor positions
    • Bharti Airtel faring better than Vodafone Idea
    • Competitors’ relationship with the government
  • Conclusions
    • Lessons for telcos in developing markets
  • Index

Apple Glass: An iPhone moment for 5G?

Augmented reality supports many use cases across industries

Revisiting the themes explored in AR/VR: Won’t move the 5G needle, a report STL Partners published in January 2018, this report explores whether augmented reality (AR) could become a catalyst for widespread adoption of 5G, as leading chip supplier Qualcomm and some telcos hope.

It considers how this technology is developing, its relationship with virtual reality (VR), and the implications for telcos trying to find compelling reasons for customers to use low latency 5G networks.

This report draws the following distinction between VR and AR:

  • Virtual reality: use of an enclosed headset for total immersion in a digital 3D environment.
  • Augmented reality: superimposition of digital graphics onto images of the real world via a camera viewfinder, a pair of glasses or a screen fixed in the real world.

In other words, AR is used both indoors and outdoors and on a variety of devices. Whereas Wi-Fi/fibre connectivity will be the preferred connectivity option in many scenarios, 5G will be required in locations lacking high-speed Wi-Fi coverage.  Many AR applications rely on responsive connectivity to enable them to interact with the real world. To be compelling, animated images superimposed on those of the real world need to change in a way that is consistent with changes in the real world and changes in the viewing angle.

AR can be used to create innovative games, such as the 2016 phenomenon Pokemon Go, and educational and informational tools, such as travel guides that give you information about the monument you are looking at. At live sports events, spectators could use AR software to identify players, see how fast they are running, check their heart rates and call up their career statistics.

Note, an advanced form of AR is sometimes referred to as mixed reality or extended reality (XR). In this case, fully interactive digital 3D objects are superimposed on the real world, effectively mixing virtual objects and people with physical objects and people into a seamless interactive scene. For example, an advanced telepresence service could project a live hologram of the person you are talking to into the same room as you. Note, this could be an avatar representing the person or, where the connectivity allows, an actual 3D video stream of the person.

Widespread usage of AR services will be a hallmark of the Coordination Age, in the sense that they will bring valuable information to people as and when they need it. First responders, for example, could use smart glasses to help work their way through smoke inside a building, while police officers could be immediately fed information about the owner of a car registration plate. Office workers may use smart glasses to live stream a hologram of a colleague from the other side of the world or a 3D model of a new product or building.

In the home, both AR and VR could be used to generate new entertainment experiences, ranging from highly immersive games to live holograms of sports events or music concerts. Some people may even use these services as a form of escapism, virtually inhabiting alternative realities for several hours a day.

Given sufficient time to develop, STL Partners believes mixed-reality services will ultimately become widely adopted in the developed world. They will become a valuable aid to everyday living, providing the user with information about whatever they are looking at, either on a transparent screen on a pair of glasses or through a wireless earpiece. If you had a device that could give you notifications, such as an alert about a fast approaching car or a delay to your train, in your ear or eyeline, why wouldn’t you want to use it?

How different AR applications affect mobile networks

One of the key questions for the telecoms industry is how many of these applications will require very low latency, high-speed connectivity. The transmission of high-definition holographic images from one place to another in real time could place enormous demands on telecoms networks, opening up opportunities for telcos to earn additional revenues by providing dedicated/managed connectivity at a premium price. But many AR applications, such as displaying reviews of the restaurant a consumer is looking at, are unlikely to generate much data traffic. The figure below lists some potential AR use cases and indicates how demanding they will be to support.

Examples of AR use cases and the demands they make on connectivity


Source: STL Partners
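
To give a feel for the range of demands involved, the short Python sketch below makes a back-of-envelope comparison between a lightweight informational overlay and an assumed holographic telepresence stream. All of the parameters are illustrative assumptions, not figures taken from the report or from any measured service.

    # Back-of-envelope comparison of downstream bandwidth for two AR scenarios.
    # Every parameter is an illustrative assumption, not a measurement.

    def compressed_stream_mbps(width, height, fps, bits_per_pixel, compression_ratio):
        """Approximate bandwidth of one compressed video stream, in Mbit/s."""
        raw_bps = width * height * fps * bits_per_pixel
        return raw_bps / compression_ratio / 1e6

    # Simple information overlay (e.g. restaurant reviews): small, occasional
    # text/graphics updates, so assume a nominal 0.1 Mbit/s.
    overlay_mbps = 0.1

    # Assumed holographic telepresence: four HD colour-plus-depth feeds at
    # 30 frames per second, 24 bits per pixel, 100:1 compression.
    hologram_mbps = 4 * compressed_stream_mbps(1920, 1080, 30, 24, compression_ratio=100)

    print(f"Simple AR overlay:        ~{overlay_mbps:.1f} Mbit/s")
    print(f"Holographic telepresence: ~{hologram_mbps:.0f} Mbit/s")

On these assumptions the holographic stream needs roughly 60 Mbit/s sustained – some 600 times the overlay – which is why only a subset of AR use cases is likely to justify premium, dedicated connectivity.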

Although telcos have always struggled to convince people to pay a premium for premium connectivity, some of the most advanced AR applications may be sufficiently compelling to bring about this kind of behavioural shift, just as people are prepared to pay more for a better seat at the theatre or in a sports stadium. This could be on a pay-as-you-go or a subscription basis.

The pioneers of augmented reality

Augmented reality (AR) is essentially a catch-all term for any application that seeks to overlay digital information and images on the real world. Applications of AR can range from a simple digital label to a live 3D holographic projection of a person or event.

AR really rose to prominence at the start of the last decade with the launch of smartphone apps, such as Layar, Junaio, and Wikitude, which gave you information about what you were looking at through the smartphone viewfinder. These apps drew on data from the handset’s GPS chip, its compass and, in some cases, image recognition software to try and figure out what was being displayed in the viewfinder. Although they attracted a lot of media attention, these apps were too clunky to break through into the mass-market. However, the underlying concept persists – the reasonably popular Google Lens app enables people to identify a product, plant or animal they are looking at or translate a menu into their own language.

Perhaps the most high profile AR application to date is Niantic’s Pokemon Go, a smartphone game that superimposes cartoon monsters on images of the real world captured by the user’s smartphone camera. Pokemon Go generated $1 billion in revenue globally just seven months after its release in mid 2016, faster than any other mobile game, according to App Annie. It has also shown remarkable staying power. Four years later, in May 2020, Pokemon Go continued to be one of the top 10 grossing games worldwide, according to SensorTower.

In November 2017, Niantic, which has also had another major AR hit with sci-fi game Ingress, raised $200 million to boost its AR efforts. In 2019, it released another AR game based on the Harry Potter franchise.

Niantic is now looking to use its AR expertise to create a new kind of marketing platform. The idea is that brands will be able to post digital adverts and content in real-world locations, essentially creating digital billboards that are viewable to consumers using the Niantic platform. At the online AWE event in May 2020, Niantic executives claimed “AR gamification and location-based context” can help businesses increase their reach, boost user sentiment, and drive foot traffic to bricks-and-mortar stores. Niantic says it is working with major brands, such as AT&T, Simon Malls, Starbucks, McDonald’s, and Samsung, to develop AR marketing that “is non-intrusive, organic, and engaging.”

The sustained success of Pokemon Go has made an impression on the major Internet platforms. By 2018, the immediate focus of both Apple and Google had clearly shifted from VR to AR. Apple CEO Tim Cook has been particularly vocal about the potential of AR. And he continues to sing the praises of the technology in public.

In January 2020, for example, during a visit to Ireland, Cook described augmented reality as the “next big thing.” In an earnings call later that month, Cook added: “When you look at AR today, you would see that there are consumer applications, there are enterprise applications. … it’s going to pervade your life…, because it’s going to go across both business and your whole life. And I think these things will happen in parallel.”

Both Apple and Google have released AR developer tools, helping AR apps to proliferate in both Apple’s App Store and on Google Play.  One of the most popular early use cases for AR is to check how potential new furniture would look inside a living room or a bedroom. Furniture stores and home design companies, such as Ikea, Wayfair and Houzz, have launched their own AR apps using Apple’s ARKit. Once the app is familiar with its surroundings, it allows the user to overlay digital models of furniture anywhere in a room to see how it will fit. The technology can work in outdoor spaces as well.

In a similar vein, there are various AR apps, such as MeasureKit, that allow you to measure any object of your choosing. After the user picks a starting point with a screen tap, a straight line will measure the length until a second tap marks the end. MeasureKit also claims to be able to calculate trajectory distances of moving objects, angle degrees, the square footage of a three-dimensional cube and a person’s height.

Table of contents

  • Executive Summary
    • More mainstream models from late 2022
    • Implications and opportunities for telcos
  • Introduction
  • Progress and Immediate Prospects
    • The pioneers of augmented reality
    • Impact of the pandemic
    • Snap – seeing the world differently
    • Facebook – the keeper of the VR flame
    • Google – the leader in image recognition
    • Apple – patiently playing the long game
    • Microsoft – expensive offerings for the enterprise
    • Amazon – teaming up with telcos to enable AR/VR
    • Market forecasts being revised down
  • Telcos Get Active in AR
    • South Korea’s telcos keep trying
    • The global picture
  • What comes next?
    • Live 3D holograms of events
    • Enhancing live venues with holograms
    • 4K HD – Simple, but effective
  • Technical requirements
    • Extreme image processing
    • An array of sensors and cameras
    • Artificial intelligence plays a role
    • Bandwidth and latency
    • Costs: energy, weight and financial
  • Timelines for Better VR and AR
    • When might mass-market models become available?
    • Implications for telcos
    • Opportunities for telcos
  • Appendix: Societal Challenges
    • AR: Is it acceptable in a public place?
    • VR: health issues
    • VR and AR: moral and ethical challenges
    • AR and VR: What do consumers really want?
  • Index

Telco edge computing: How to partner with hyperscalers

Edge computing is getting real

Hyperscalers such as Amazon, Microsoft and Google are rapidly increasing their presence in the edge computing market by launching dedicated products, establishing partnerships with telcos on 5G edge infrastructure and embedding their platforms into operators’ infrastructure.

Many telecoms operators, which need cloud infrastructure and platform support to run their edge services, have welcomed the partnership opportunity. However, they have yet to develop clear strategies on how to use these partnerships to establish a stronger proposition in the edge market, move up the value chain and play a role beyond hosting infrastructure and delivering connectivity. Operators that miss out on the partnership opportunity, or fail to fully utilise it to develop and differentiate their capabilities and resources, risk being reduced to connectivity providers with a limited role in the edge market, or being late to the game.

Edge computing or multi-access edge computing (MEC) enables processing data closer to the end user or device (i.e. the source of data), on physical compute infrastructure that is positioned on the spectrum between the device and the internet or hyperscale cloud.

Telco edge computing is mainly defined as distributed compute managed by a telecoms operator. This includes running workloads on customer premises as well as at locations within the operator’s network. One of the reasons for caching and processing data closer to the customer is that it allows both operators and their customers to benefit from reduced backhaul traffic and costs. Depending on where the computing resources reside, edge computing can be broadly divided into:

  • Network edge which includes sites or points of presence (PoPs) owned by a telecoms operator such as base stations, central offices and other aggregation points on the access and/or core network.
  • On-premise edge where the computing resources reside at the customer side, e.g. in a gateway on-site, an on-premises data centre, etc. As a result, customers retain their sensitive data on-premise and enjoy other flexibility and elasticity benefits brought by edge computing.
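
To make the latency trade-off behind these placements concrete, the short Python sketch below compares end-to-end response times for the same workload run on-premise, at the network edge and in a distant cloud region. The millisecond figures are illustrative assumptions, not measurements from any real deployment.

    # Illustrative comparison of end-to-end response time by compute placement.
    # Round-trip times (RTT) are assumed values, not measurements.

    assumed_rtt_ms = {
        "on-premise edge": 2,            # device to an on-site gateway/server
        "network edge (MEC)": 10,        # device to a telco aggregation site
        "centralised cloud region": 40,  # device to a distant hyperscale region
    }

    processing_ms = 5  # assumed compute time, taken to be the same everywhere

    for placement, rtt_ms in assumed_rtt_ms.items():
        total_ms = rtt_ms + processing_ms
        print(f"{placement:25s} ~{total_ms} ms per request")

The absolute numbers matter less than the pattern: moving the same workload from a distant region to the network edge or the premises mainly removes round-trip delay (and the associated backhaul traffic), which is the benefit described above.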

Our overview of edge computing definitions, network structure, market opportunities and business models can be found in our previous report, Telco Edge Computing: What’s the operator strategy?

The edge computing opportunity for operators and hyperscalers

Many operators are looking at edge computing as a good opportunity to leverage their existing assets and resources to innovate and move up the value chain. They aim to expand their services and revenue beyond connectivity and enter the platform and application space. By deploying computing resources at the network edge, operators can offer infrastructure-as-a-service and alternative applications and solutions for enterprises. Also, edge computing, as a distributed compute structure and an extension of the cloud, supports the operators’ own journey towards virtualising the network and running internal operations more efficiently.

Cloud hyperscalers, especially the biggest three – Amazon Web Services (AWS), Microsoft Azure and Google – are at the forefront of the edge computing market. In recent years, they have worked to spread their influence beyond their public clouds and have moved the data acquisition point closer to physical devices. These efforts include integrating their stacks into IoT devices and network gateways, as well as supporting private and hybrid cloud deployments. Recently, the hyperscalers took another step closer to customers at the edge by launching platforms dedicated to telecoms networks and enabling integration with 5G networks. The latest of these products include Wavelength from AWS, Azure Edge Zones from Microsoft and Anthos for Telecom from Google Cloud. Details on these products are available later in this report.

From competition to coopetition

Both hyperscalers and telcos are among the top contenders to lead the edge market. However, each stakeholder lacks a significant piece of the stack which the other has. This is the cloud platform for operators and the physical locations for hyperscalers. Initially, operators and hyperscalers were seen as competitors racing to enter the market through different approaches. This has resulted in the emergence of new types of stakeholders including independent mini data centre providers such as Vapor IO and EdgeConnex, and platform start-ups such as MobiledgeX and Ori Industries.

However, operators acknowledge that even if they do own the edge clouds, these still need to be supported by hyperscaler clouds to create a distributed cloud. To fuel the edge market and build its momentum, operators will, for the most part, work with the cloud providers. Partnerships between operators and hyperscalers are starting to take place and shape the market, affecting the short- and long-term edge computing strategies of operators, hyperscalers and other players in the market.

Figure 1: Major telco-hyperscaler edge partnerships

Major telco-hyperscaler partnerships

Source: STL Partners analysis

What does it mean for telcos?

Going to market alone is not an attractive option for either operators or hyperscalers at the moment, given the high investment requirement without a guaranteed return. The partnerships between two of the biggest forces in the market will provide the necessary push for the use cases to be developed and enterprise adoption to be accelerated. However, as markets grow and change, so do the stakeholders’ strategies and relationships between them.

Since the emergence of cloud computing and the development of the digital technologies market, operators have faced tough competition from the internet players, including hyperscalers, which have managed to remain agile while sustaining an appetite for innovation and market disruption. Edge computing is no exception, and the hyperscalers are moving rapidly to define and own the biggest share of the edge market.

Telcos that fail to develop a strategic approach to the edge could risk losing their share of the growing market as non-telco first movers continue to develop the technology and dictate the market dynamics. This report looks into what telcos should consider regarding their edge strategies and what roles they can play in the market while partnering with hyperscalers in edge computing.

Table of contents

  • Executive Summary
    • Operators’ roles along the edge computing value chain
    • Building a bigger ecosystem and pushing market adoption
    • How partnerships can shape the market
    • What next?
  • Introduction
    • The edge computing opportunity for operators and hyperscalers
    • From competition to coopetition
    • What does it mean for telcos?
  • Overview of the telco-hyperscalers partnerships
    • Explaining the major roles required to enable edge services
    • The hyperscaler-telco edge commercial model
  • Hyperscalers’ edge strategies
    • Overview of hyperscalers’ solutions and activities at the edge
    • Hyperscalers’ approach to edge sites and infrastructure acquisition
  • Operators’ edge strategies and their roles in the partnerships
    • Examples of operators’ edge computing activities
    • Telcos’ approach to integrating edge platforms
  • Conclusion
    • Infrastructure strategy
    • Platform strategy
    • Verticals and ecosystem building strategy

 

Telco ecosystems: How to make them work

The ecosystem business framework

The success of large businesses such as Microsoft, Amazon and Google as well as digital disrupters like Airbnb and Uber is attributed to their adoption of platform-enabled ecosystem business frameworks. Microsoft, Amazon and Google know how to make ecosystems work. It is their ecosystem approach that helped them to scale quickly, innovate and unlock value in opportunity areas where businesses that are vertically integrated, or have a linear value chain, would have struggled. Internet-enabled digital opportunity areas tend to be unsuited to the traditional business frameworks. These depend on having the time and the ability to anticipate needs, plan and execute accordingly.

As businesses in the telecommunications sector and beyond try to emulate the success of these companies and their ecosystem approach, it is necessary to clarify what is meant by the term “ecosystem” and how it can provide a framework for organising business.

The word “ecosystem” is borrowed from biology. It refers to a community of organisms – of any number of species – living within a defined physical environment.

A biological ecosystem

The components of a biological ecosystem

Source: STL Partners

A business ecosystem can therefore be thought of as a community of stakeholders (of different types) that exist within a defined business environment. The environment of a business ecosystem can be small or large.  This is also true in biology, where both a tree and a rainforest can equally be considered ecosystem environments.

The number of organisms within a biological community is dynamic. They coexist with others and are interdependent within the community and the environment. Environmental resources (i.e. energy and matter) flow through the system efficiently. This is how the ecosystem works.

Companies that adopt an ecosystem business framework identify a community of stakeholders to help them address an opportunity area, or drive business in that space. They then create a business environment (e.g. platforms, rules) to organise economic activity among those communities.  The environment integrates community activities in a complementary way. This model is consistent with STL Partners’ vision for a Coordination Age, where desired outcomes are delivered to customers by multiple parties acting together.

Characteristics of business ecosystems that work

In the case of Google, it adopted an ecosystem approach to tackle the search opportunity. Its search engine platform provides the environment for an external stakeholder community of businesses to reach consumers as they navigate the internet, based on what consumers are looking for.

  • Google does not directly participate in the business-consumer transaction, but its platform reduces friction for participants (providing a good customer experience) and captures information on the exchange.

While Google leverages a technical platform, this is not a requirement for an ecosystem framework. Nespresso built an ecosystem around its patented coffee pod. It needed to establish a user-base for the pods, so it developed a business environment that included licensing arrangements for coffee machine manufacturers.  In addition, it provided support for high-end homeware retailers to supply these machines to end-users. It also created the online Nespresso Club for coffee aficionados to maintain demand for its product (a previous vertically integrated strategy to address this premium coffee-drinking niche had failed).

Ecosystem relevance for telcos

Telcos are exploring new opportunities for revenue. In many of these opportunities, the needs of the customer are evolving or changeable, budgets are tight, and time-to-market is critical. Planning and executing traditional business frameworks can be difficult under these circumstances, so ecosystem business frameworks are understandably of interest.

Traditional business frameworks require companies to match their internal strengths and capabilities to those required to address an opportunity. An ecosystem framework requires companies to consider where those strengths and capabilities are (i.e. in external stakeholder communities). An ecosystem orchestrator then creates an environment in which the stakeholders contribute their respective value to meet that end. Additional end-user value may also be derived by supporting stakeholder communities whose products and services use, or are used with, the end-product or service of the ecosystem (e.g. the availability of third-party App Store apps adds value for end customers and drives demand for high-end Apple iPhones). It requires “outside-in” strategic thinking that goes beyond the bounds of the company – or even the industry (i.e. who has the assets and capabilities, and who/what will support demand from end-users).

Many companies have rushed to implement ecosystem business frameworks, but have not attained the success of Microsoft, Amazon or Google, or in the telco arena, M-Pesa. Telcos require an understanding of the rationale behind ecosystem business frameworks, what makes them work and how this has played out in other telco ecosystem implementations. As a result, they should be better able to determine whether to leverage this approach more widely.

Table of Contents

  • Executive Summary
  • The ecosystem business framework
  • Why ecosystem business frameworks?
    • Benefits of ecosystem business frameworks
  • Identifying ecosystem business frameworks
  • Telco experience with ecosystem frameworks
    • AT&T Community
    • Deutsche Telekom Qivicon
    • Telecom Infra Project (TIP)
    • GSMA Mobile Connect
    • Android
    • Lessons from telco experience
  • Criteria for successful ecosystem businesses
    • “Destination” status
    • Strong assets and capabilities to share
    • Dynamic strategy
    • Deep end-user knowledge
    • Participant stakeholder experience excellence
    • Continuous innovation
    • Conclusions
  • Next steps
    • Index

Network convergence: How to deliver a seamless experience

Operators need to adapt to the changing connectivity demands post-COVID-19

The global dependency on consistent high-performance connectivity has recently come to the fore as the COVID-19 outbreak has transformed many of the remaining non-digital tasks into online activities.

The typical patterns of networking have broken and a ‘new normal’, albeit possibly a somewhat transitory one, is emerging. The recovery of the global economy will depend on governments, healthcare providers, businesses and their employees robustly communicating and gaining uninhibited access to content and cloud through their service providers – at any time of day, from any location and on any device.

Reliable connectivity is a critical commodity. Network usage patterns have shifted towards the home and remote working. Locations that previously carried light traffic now face heavy demand. Conversely, many business locations no longer need such high capacity. Utilisation is not expected to return to pre-COVID-19 patterns either, as people and businesses adapt to new daily routines – at least for some time.

The strategies with which telcos started the year have of course been disrupted with resources diverted away from strategic objectives to deal with a new mandate – keep the country connected. In the short-term, the focus has shifted to one which is more tactical – ensuring customer satisfaction through a reliable and adaptable service with rapid response to issues. In the long-term, however, the objectives for capacity and coverage remain. Telcos are still required to reach national targets for a minimum connection quality in rural areas, whilst delivering high bandwidth service demands in hotspot locations (although these hotspot locations might now change).

Of course, modern networks are designed with scalability and adaptability in mind – some recent deployments from new disruptors (such as Rakuten) demonstrate the power of virtualisation and automation in that process, particularly when it comes to the radio access network (RAN). In many legacy networks, however, one area which is not able to adapt fast enough is the physical access. Limits on spectrum, coverage (indoors and outdoors) and the speed at which physical infrastructure can be installed or updated become a bottleneck in the adaptation process. New initiatives to meet home working demand through an accelerated fibre rollout are happening, but they tend to come at great cost.

Network convergence is a concept which can provide a quick and convenient way to address this need for improved coverage, speed and reliability in the access network, without the need to install or upgrade last mile infrastructure. By definition, it is the coming-together of multiple network assets, as part of a transformation to one intelligent network which can efficiently provide customers with a single, unified, high-quality experience at any time, in any place.

It has already attracted interest and is finding an initial following. A few telcos have used it to provide better home broadband. Internet content and cloud service providers are interested, as it adds resilience to the mobile user experience, and enterprises are interested in utilising multiple lower cost commodity backhauls – the combination of which benefits from inherent protection against costly network outages.

Network convergence helps create an adaptable and resilient last mile

Most telcos already have the facility to connect with their customers via multiple means, providing mobile, fixed line and public Wi-Fi connectivity to those in their coverage footprint. The strategy has been to convert individual ‘pure’ mobile or fixed customers into households. The expectation is that this increases revenue through bundling and loyalty, while adding some friction to churn – a concept which has been termed ‘convergence’. Although the customer may see one converged telco through brand, billing and customer support, the delivery of a consistent user experience across all modes of network access has been lacking and awkward. In the end, it is customer dissatisfaction which drives churn, so delivering a consistent user experience is important.

Convergence is a term used to mean many different things, from a single bill for all household connectivity, to modernising multiple core networks into a single efficient core. While most telcos have so far been concentrating on increasing operational efficiency, increasing customer loyalty/NPS and decreasing churn through some initial aspects of convergence, some are now looking into network convergence – where multiple access technologies (4G, 5G, Wi-Fi, fixed line) can be used together to deliver a resilient, optimised and consistent network quality and coverage.

Overview of convergence

Source: STL Partners

As an overarching concept, network convergence introduces more flexibility into the access layer. It allows a single converged core network to utilise and aggregate whichever last mile connectivity options are most suited to the environment. Some examples are:

  • Hybrid Access: DSL and 4G macro network used together to provide extra speed and fallback reliability in hybrid fixed/mobile home gateways.
  • Cell Densification: 5G and Wi-Fi small cells jointly providing short range capacity to augment the macro network in dense urban areas.
  • Fixed Wireless Access: using cellular as a fibre alternative in challenging areas.

The ability to combine various network accesses is attractive as an option for improving adaptability, resilience and speed. Strategically, putting such flexibility in place can support future growth and customer retention with the added advantage of improving operational efficiency. Tactically, it enables an ability to quickly adapt resources to short-term changes in demand. COVID-19 has been a clear example of this need.
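To make the hybrid access example above more concrete, the sketch below models a home gateway that bonds DSL and 4G capacity and falls back to whichever link is available. The link speeds and availability flags are illustrative assumptions, not measured figures.

```python
# Illustrative sketch only: a simplified model of hybrid access, where a home
# gateway combines DSL and 4G capacity and falls back to whichever link is up.
# The link speeds below are assumed figures, not measurements.

def available_bandwidth_mbps(dsl_up: bool, lte_up: bool,
                             dsl_mbps: float = 20.0, lte_mbps: float = 40.0) -> float:
    """Return the headline downlink capacity a hybrid gateway could offer."""
    capacity = 0.0
    if dsl_up:
        capacity += dsl_mbps      # fixed line contribution
    if lte_up:
        capacity += lte_mbps      # cellular contribution (shared macro capacity)
    return capacity

# Normal operation: both links bonded.
print(available_bandwidth_mbps(dsl_up=True, lte_up=True))    # 60.0 Mbps
# DSL fault: the gateway degrades gracefully to 4G only.
print(available_bandwidth_mbps(dsl_up=False, lte_up=True))   # 40.0 Mbps
```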

Table of Contents

  • Executive Summary
    • Convergence and network convergence
    • Near-term benefits of network convergence
    • Strategic benefits of network convergence
    • Balancing the benefits of convergence and divergence
    • A three-step plan
  • Introduction
    • The changing environment
    • Network convergence: The adaptable and resilient last mile
    • Anticipated benefits to telcos
    • Challenges and opposing forces
  • The evolution to network convergence
    • Everyone is combining networks
    • Converging telco networks
    • Telco adoption so far
  • Strategy, tactics and hurdles
    • The time is right for adaptability
    • Tactical motivators
    • Increasing the relationship with the customer
    • Modernisation and efficiency – remaining competitive
    • Hurdles from within the telco ecosystem
    • Risk or opportunity? Innovation above-the-core
  • Conclusion
    • A three-step plan
  • Index


Fighting the fakes: How telcos can help

Internet platforms need a frictionless solution to fight the fakes

On the Internet, the old adage that nobody knows you are a dog can still ring true. All of the major Internet platforms, with the partial exception of Apple, are fighting frauds and fakes. That’s generally because these platforms either allow users to remain anonymous or use lax authentication systems that prioritise ease-of-use over rigour. Some people then use the cloak of anonymity in many different ways, such as writing glowing reviews of products they have never used on Amazon (in return for a payment) or enthusiastic reviews of restaurants owned by friends on Tripadvisor. Even the platforms that require users to register financial details are open to abuse. There have been reports of multiple scams on eBay, while regulators have alleged there has been widespread sharing of Uber accounts among drivers in London and other cities.

At the same time, Facebook/WhatsApp, Google/YouTube, Twitter and other social media services are experiencing a deluge of fake news, some of which can be very damaging for society. There has been a mountain of misinformation relating to COVID-19 circulating on social media, such as the notion that if you can hold your breath for 10 seconds, you don’t have the virus. Fake news is alleged to have distorted the outcome of the U.S. presidential election and the Brexit referendum in the U.K.

In essence, the popularity of the major Internet platforms has made them a target for unscrupulous people who want to propagate their world views, promote their products and services, discredit rivals and have ulterior (and potentially criminal) motives for participating in the gig economy.

Although all the leading Internet platforms use tools and reporting mechanisms to combat misuse, they are still beset with problems. In reality, these platforms are walking a tightrope – if they make authentication procedures too cumbersome, they risk losing users to rival platforms, while also incurring additional costs. But if they allow a free-for-all in which anonymity reigns, they risk a major loss of trust in their services.

In STL Partners’ view, the best way to walk this tightrope is to use invisible authentication – the background monitoring of behavioural data to detect suspicious activities. In other words, the Internet platform stays very open and easy to use, but algorithms process the incoming data and learn to detect the patterns that signal potential frauds or fakes. If this idea were taken to an extreme, online interactions and transactions could become completely frictionless. Rather than asking a person to enter a username and password to access a service, they can be identified through the device they are using, their location, their pattern of keystrokes and which features they access once they are logged in. However, the effectiveness of such systems depends heavily on the quality and quantity of the data they are fed.
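As a simple illustration of how invisible authentication might work in practice, the sketch below scores a session from a few background signals and only triggers step-up checks when the score crosses a threshold. The signals, weights and threshold are hypothetical, chosen purely for illustration.

```python
# Illustrative sketch only: a toy "invisible authentication" score that weighs
# background behavioural signals instead of prompting for a password.
# The signals, weights and threshold are hypothetical, not a production model.

def risk_score(known_device: bool, usual_location: bool,
               typing_cadence_deviation: float) -> float:
    """Higher score = more suspicious. Cadence deviation runs from 0.0 (typical) to 1.0 (very unusual)."""
    score = 0.0
    if not known_device:
        score += 0.4
    if not usual_location:
        score += 0.3
    score += 0.3 * min(max(typing_cadence_deviation, 0.0), 1.0)
    return score

session = {"known_device": True, "usual_location": False, "typing_cadence_deviation": 0.8}
score = risk_score(**session)
# Below the threshold the session stays frictionless; above it, step-up checks kick in.
action = "allow silently" if score < 0.5 else "trigger step-up authentication"
print(f"risk={score:.2f} -> {action}")
```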

In come telcos

This report explores how telcos could use their existing systems and data to help the major Internet companies to build better systems to protect the integrity of their platforms.

It also considers the extent to which telcos will need to work together to effectively fight fraud, just as they do to combat telecoms-related fraud and prevent stolen phones from being used across networks. For most use cases, the telcos in each national market will generally need to provide a common gateway through which a third party could check attributes of the user of a specific mobile phone number. As they plot their way out of the current pandemic, governments are increasingly likely to call for such gateways to help them track the spread of COVID-19 and identify people who may have become infected.


Using big data to combat fraud

In the financial services sector, artificial intelligence (AI) is now widely used to help detect potentially fraudulent financial transactions. Learning from real-world examples, neural networks can detect the behavioural patterns associated with fraud and how they are changing over time. They can then create a dynamic set of thresholds that can be used to trigger alarms, which could prompt a bank to decline a transaction.

In a white paper published in 2019, IBM claimed its AI and cognitive solutions are having a major impact on transaction monitoring and payment fraud modelling. In one of several case studies, the paper describes how the National Payment Switch in France (STET) is using behavioural information to reduce fraud losses by US$100 million annually. Owned by a consortium of financial institutions, STET processes more than 30 billion credit and debit card, cross-border, domestic and on-us payments annually.

STET now assesses the fraud risk for every authorisation request in real time. The white paper says IBM’s Safer Payments system generates a risk score, which is then passed to banks, issuers and acquirers, which combine it with customer information to make a decision on whether to clear or decline the transaction. IBM claims the system can process up to 1,200 transactions per second, and can compute a risk score in less than 10 milliseconds. While STET itself doesn’t have any customer data or data from other payment channels, the IBM system looks across all transactions, countrywide, as well as creating “deep behavioural profiles for millions of cards and merchants.”

Telcos, or at least the connectivity they provide, are also helping banks combat fraud. If they think a transaction is suspicious, banks will increasingly send a text message or call a customer’s phone to check whether they have actually initiated the transaction. Now, some telcos, such as O2 in the UK, are making this process more robust by enabling banks to check whether the user’s SIM card has been swapped between devices recently or if any call diverts are active – criminals sometimes pose as a specific customer to request a new SIM. All calls and texts to the number are then routed to the SIM in the fraudster’s control, enabling them to activate codes or authorisations needed for online bank transfers, such as one-time PINs or passwords.
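The sketch below illustrates how a bank might consult such a telco attribute service before sending a one-time PIN. The endpoint, parameters and response fields are hypothetical placeholders, not a real operator or Mobile Connect API.

```python
# Illustrative sketch only: how a bank might query a telco attribute service
# before sending a one-time PIN over SMS. The URL and response fields below
# are hypothetical placeholders.
import requests

def safe_to_send_otp(msisdn: str, api_key: str) -> bool:
    resp = requests.get(
        "https://api.example-telco.com/fraud/v1/sim-status",   # hypothetical endpoint
        params={"msisdn": msisdn},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=5,
    )
    resp.raise_for_status()
    attrs = resp.json()
    # Avoid SMS if the SIM was swapped recently or a call divert is active.
    recently_swapped = attrs.get("days_since_sim_swap", 999) < 7
    divert_active = attrs.get("call_divert_active", False)
    return not (recently_swapped or divert_active)

# if safe_to_send_otp("+447700900123", "demo-key"):
#     send_sms_otp(...)   # otherwise fall back to another channel
```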

As described below, this is one of the use cases supported by Mobile Connect, a specification developed by the GSMA, to enable mobile operators to take a consistent approach to providing third parties with identification, authentication and attribute-sharing services. The idea behind Mobile Connect is that a third party, such as a bank, can access these services regardless of which operator their customer subscribes to.

Adapting telco authentication for Amazon, Uber and Airbnb

Telcos could also provide Internet platforms, such as Amazon, Uber and Airbnb, with identification, authentication and attribute-sharing services that will help to shore up trust in their services. Building on their nascent anti-fraud offerings for the financial services industry, telcos could act as intermediaries, authenticating specific attributes of an individual without actually sharing personal data with the platform.

STL Partners has identified four broad data sets telcos could use to help combat fraud:

  1. Account activity – checking which individual owns which SIM card and that the SIM hasn’t been swapped recently;
  2. Movement patterns – tracking where people are and where they travel frequently to help identify if they are who they say they are;
  3. Contact patterns – establishing which individuals come into contact with each other regularly;
  4. Spending patterns – monitoring how much money an individual spends on telecoms services.

Table of contents

  • Executive Summary
  • Introduction
  • Using big data to combat fraud
    • Account activity
    • Movement patterns
    • Contact patterns
    • Spending patterns
    • Caveats and considerations
  • Limited progress so far
    • Patchy adoption of Mobile Connect
    • Mobile identification in the UK
    • Turkcell employs machine learning
  • Big Internet use cases
    • Amazon – grappling with fake product reviews
  • Facebook and eBay – also need to clamp down
    • Google Maps and Tripadvisor – targets for fake reviews
    • Uber – serious safety concerns
    • Airbnb – balancing the interests of hosts and guests
  • Conclusions
  • Index


Telco edge computing: What’s the operator strategy?


Edge computing can help telcos to move up the value chain

The edge computing market and the technologies enabling it are rapidly developing and attracting new players, providing new opportunities to enterprises and service providers. Telco operators are eyeing the market and looking to leverage the technology to move up the value chain and generate more revenue from their networks and services. Edge computing also represents an opportunity for telcos to extend their role beyond offering connectivity services and move into the platform and the application space.

However, operators will face tough competition from other market players, such as cloud providers, who are moving rapidly to define and own the biggest share of the edge market. Industrial solution providers, such as Bosch and Siemens, are similarly investing in their own edge services. Telcos are also dealing with technical and business challenges as they venture into this new market and try to position themselves and identify their strategies accordingly.

Telcos that fail to develop a strategic approach to the edge could risk losing their share of the growing market as non-telco first movers continue to develop the technology and dictate the market dynamics. This report looks into what telcos should consider regarding their edge strategies and what roles they can play in the market.

Following this introduction, we focus on:

  1. Edge terminology and structure, explaining common terms used within the edge computing context, where the edge resides, and the role of edge computing in 5G.
  2. An overview of the edge computing market, describing different types of stakeholders, current telecoms operators’ deployments and plans, competition from hyperscale cloud providers and the current investment and consolidation trends.
  3. Telcos’ challenges in addressing the edge opportunity: the technical, organisational and commercial challenges they face in this market.
  4. Potential use cases and business models for operators, also exploring possible scenarios of how the market is going to develop and operators’ likely positioning.
  5. A set of recommendations for operators that are building their strategy for the edge.


What is edge computing and where exactly is the edge?

Edge computing brings cloud services and capabilities including computing, storage and networking physically closer to the end-user by locating them on more widely distributed compute infrastructure, typically at smaller sites.

One could argue that edge computing has existed for some time – local infrastructure has been used for compute and storage, be it end-devices, gateways or on-premises data centres. However, edge computing, or edge cloud, refers to bringing the flexibility and openness of cloud-native infrastructure to that local infrastructure.

In contrast to hyperscale cloud computing where all the data is sent to central locations to be processed and stored, edge computing local processing aims to reduce time and save bandwidth needed to send and receive data between the applications and cloud, which improves the performance of the network and the applications. This does not mean that edge computing is an alternative to cloud computing. It is rather an evolutionary step that complements the current cloud computing infrastructure and offers more flexibility in executing and delivering applications.
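A back-of-envelope example illustrates the bandwidth argument. Assuming a site with 200 cameras each producing a 4Mbps video stream, processing the footage at an edge node and forwarding only event metadata would cut backhaul traffic by more than 99%. The figures below are illustrative assumptions only.

```python
# Back-of-envelope sketch (illustrative figures only): how much backhaul an edge
# node can save by processing video locally and forwarding only metadata.
cameras = 200                    # assumed camera count at one site
raw_stream_mbps = 4.0            # assumed bitrate per camera
metadata_kbps = 20.0             # assumed event/metadata stream after local analytics

cloud_backhaul_mbps = cameras * raw_stream_mbps
edge_backhaul_mbps = cameras * metadata_kbps / 1000

print(f"All video to cloud: {cloud_backhaul_mbps:.0f} Mbps of backhaul")
print(f"Edge analytics:     {edge_backhaul_mbps:.0f} Mbps of backhaul")
print(f"Reduction:          {100 * (1 - edge_backhaul_mbps / cloud_backhaul_mbps):.1f}%")
```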

Edge computing offers mobile operators several opportunities such as:

  • Differentiating service offerings using edge capabilities
  • Providing new applications and solutions using edge capabilities
  • Enabling customers and partners to leverage the distributed computing network in application development
  • Improving network performance and achieving efficiencies / cost savings

As edge computing technologies and definitions are still evolving, different terms are sometimes used interchangeably or have been associated with a certain type of stakeholder. For example, mobile edge computing is often used within the mobile network context and has evolved into multi-access edge computing (MEC) – adopted by the European Telecommunications Standards Institute (ETSI) – to include fixed and converged network edge computing scenarios. Fog computing is also often compared to edge computing; the former includes running intelligence on the end-device and is more IoT focused.

These are some of the key terms that need to be identified when discussing edge computing:

  • Network edge refers to edge compute locations that are at sites or points of presence (PoPs) owned by a telecoms operator, for example at a central office in the mobile network or at an ISP’s node.
  • Telco edge cloud is mainly defined as distributed compute managed by a telco. This includes running workloads on customer premises equipment (CPE) at customers’ sites as well as locations within the operator network, such as base stations, central offices and other aggregation points on the access and/or core network. One of the reasons for caching and processing data closer to the customer is that it allows both operators and their customers to benefit from reduced backhaul traffic and costs.
  • On-premise edge computing refers to the computing resources that are residing at the customer side, e.g. in a gateway on-site, an on-premises data centre, etc. As a result, customers retain their sensitive data on-premise and enjoy other flexibility and elasticity benefits brought by edge computing.
  • Edge cloud is used to describe the virtualised infrastructure available at the edge. It creates a distributed version of the cloud with some flexibility and scalability at the edge. This flexibility allows it to have the capacity to handle sudden surges in workloads from unplanned activities, unlike static on-premise servers. Figure 1 shows the differences between these terms.

Figure 1: Edge computing types


Source: STL Partners

Network infrastructure and how the edge relates to 5G

Discussions on edge computing strategies and the market are often linked to 5G. Both technologies have overlapping goals of improving performance and throughput and reducing latency for applications such as AR/VR, autonomous vehicles and IoT. 5G improves speed by increasing spectral efficiency, offering the potential of much higher speeds than 4G. Edge computing, on the other hand, reduces latency by shortening the time required for data processing, allocating resources closer to the application. When combined, edge and 5G can help to achieve round-trip latency below 10 milliseconds.
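A rough latency budget, using illustrative assumptions, shows why the physical placement of compute matters even with a fast 5G air interface.

```python
# Back-of-envelope latency budget (illustrative assumptions) showing why edge
# placement matters for a sub-10 ms round trip, even with a fast 5G air interface.
FIBRE_US_PER_KM = 5.0          # roughly 5 microseconds per km one way in fibre

def round_trip_ms(distance_km: float, radio_ms: float = 4.0, processing_ms: float = 2.0) -> float:
    propagation_ms = 2 * distance_km * FIBRE_US_PER_KM / 1000   # there and back
    return radio_ms + processing_ms + propagation_ms

print(f"Server 1,500 km away: ~{round_trip_ms(1500):.1f} ms")   # ~21 ms, misses the target
print(f"Edge node 50 km away: ~{round_trip_ms(50):.1f} ms")     # ~6.5 ms, within budget
```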

While 5G deployment is yet to accelerate and reach ubiquitous coverage, the edge can be utilised in some places to reduce latency where needed. There are two reasons why the edge will be part of 5G:

  • First, it has been included in the 5G standards (3GPP Release 15) to enable ultra-low latency, which cannot be achieved through improvements in the radio interface alone.
  • Second, operators are in general taking a slow and gradual approach to 5G deployment which means that 5G coverage alone will not provide a big incentive for developers to drive the application market. Edge can be used to fill the network gaps to stimulate the application market growth.

The network edge can be used for applications that need coverage (i.e. accessible anywhere) and can be moved across different edge locations to scale capacity up or down as required. Where an operator decides to establish an edge node depends on:

  • Application latency needs. Some applications, such as streaming virtual reality or mission-critical applications, will require locations close enough to their users to enable sub-50 millisecond latency.
  • Current network topology. Based on the operators’ network topology, there will be selected locations that can meet the edge latency requirements for the specific application under consideration in terms of the number of hops and the part of the network it resides in.
  • Virtualisation roadmap. The operator needs to consider its virtualisation roadmap and where data centre facilities are planned to be built to support the future network.
  • Site and maintenance costs. The cloud computing economies of scale may diminish as the number of sites proliferates at the edge; for example, there is a significant difference between maintaining one or two large data centres and maintaining hundreds across the country.
  • Site availability. Some operators’ edge compute deployment plans assume the nodes reside in the same facilities as those which host their NFV infrastructure. However, many telcos are still in the process of renovating these locations to turn them into (mini) data centres so aren’t yet ready.
  • Site ownership. Sometimes the preferred edge location is within sites that the operators have limited control over, whether that is in the customer premise or within the network. For example, in the US, the cell towers are owned by tower operators such as Crown Castle, American Tower and SBA Communications.

The potential locations for edge nodes can be mapped across the mobile network in four levels as shown in Figure 2.

Figure 2: possible locations for edge computing


Source: STL Partners

Table of Contents

  • Executive Summary
    • Recommendations for telco operators at the edge
    • Four key use cases for operators
    • Edge computing players are tackling market fragmentation with strategic partnerships
    • What next?
  • Table of Figures
  • Introduction
  • Definitions of edge computing terms and key components
    • What is edge computing and where exactly is the edge?
    • Network infrastructure and how the edge relates to 5G
  • Market overview and opportunities
    • The value chain and the types of stakeholders
    • Hyperscale cloud provider activities at the edge
    • Telco initiatives, pilots and plans
    • Investment and merger and acquisition trends in edge computing
  • Use cases and business models for telcos
    • Telco edge computing use cases
    • Vertical opportunities
    • Roles and business models for telcos
  • Telcos’ challenges at the edge
  • Scenarios for network edge infrastructure development
  • Recommendation
  • Index


Cloud gaming: What’s the telco play?


Drivers for cloud gaming services

Although many people still think of PlayStation and Xbox when they think about gaming, the console market represents only a third of the global games market. From its arcade and console-based beginnings, the gaming industry has come a long way. Over the past 20 years, one of the most significant market trends has been growth of casual gamers. Whereas hardcore gamers are passionate about frequent play and will pay more to play premium games, casual gamers play to pass the time. With the rapid adoption of smartphones capable of supporting gaming applications over the past decade, the population of casual/occasional gamers has risen dramatically.

This trend has seen the advent of free-to-play business models for games, further expanding the industry’s reach. In our earlier report, STL estimated that 45% of the population in the U.S. are either casual gamers (between 2 and 5 hours a week) or occasional gamers (up to 2 hours a week). By contrast, we estimated that hardcore gamers (more than 15 hours a week) make up 5% of the U.S. population, while regular players (5 to 15 hours a week) account for a further 15% of the population.

The expansion in the number of players is driving interest in ‘cloud gaming’. Instead of games running on a console or PC, cloud gaming involves streaming games onto a device from remote servers. The actual game is stored and run on a remote server, with the results live-streamed to the player’s device. This has the important advantage of eliminating the need for players to purchase dedicated gaming hardware. Instead, the quality of the internet connection becomes the most important contributor to the gaming experience. While this type of gaming is still in its infancy, and faces a number of challenges, many companies are now entering the cloud gaming fold in an effort to capitalise on the new opportunity.

5G can support cloud gaming traffic growth

Cloud gaming requires not just high bandwidth and low latency, but also a stable connection with consistently low latency (minimal jitter). In theory, 5G promises to deliver stable, ultra-low latency. In practice, an enormous amount of infrastructure investment will be required to enable a fully loaded 5G network to perform as well as end-to-end fibre. 5G networks operating in the lower frequency bands would likely buckle under the load if lots of gamers in a cell needed a continuous 25Mbps stream. While 5G in millimetre-wave spectrum would have more capacity, it would require small cells and other mechanisms to ensure indoor penetration, given the spectrum is short range and could be blocked by obstacles such as walls.
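A rough sizing exercise illustrates the point. Assuming illustrative aggregate cell throughputs, the sketch below shows how few continuous 25Mbps game streams a single cell could sustain.

```python
# Rough, illustrative sizing (assumed figures): how many continuous 25 Mbps game
# streams a single cell could sustain before buckling under the load.
stream_mbps = 25.0
lowband_cell_capacity_mbps = 150.0     # assumed aggregate throughput, low-band 5G cell
midband_cell_capacity_mbps = 1000.0    # assumed aggregate throughput, mid-band 5G cell

print(f"Low-band cell: ~{int(lowband_cell_capacity_mbps // stream_mbps)} concurrent streams")
print(f"Mid-band cell: ~{int(midband_cell_capacity_mbps // stream_mbps)} concurrent streams")
```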


A complicated ecosystem

As explained in our earlier report, Cloud gaming: New opportunities for telcos?, the cloud gaming ecosystem is beginning to take shape. This is being accelerated by the growing availability of fibre and high-speed broadband, which is now being augmented by 5G and, in some cases, edge data centres. Early movers in cloud gaming are offering a range of services, from gaming rigs, to game development platforms, cloud computing infrastructure, or an amalgamation of these.

One of the main attractions of cloud gaming is the potential hardware savings for gamers. High-end PC gaming can be an extremely expensive hobby: gaming PCs range from £500 for the very cheapest to over £5,000 at the very top end. They also require frequent hardware upgrades to meet the increasing processing demands of new gaming titles. With cloud gaming, players can access the latest graphics processing units at a much lower cost.

By some estimates, cloud gaming could deliver a high-end gaming environment at a quarter of the cost of a traditional console-based approach, as it would eliminate the need for retailing, packaging and delivering hardware and software to consumers, while also tapping the economies of scale inherent in the cloud. However, in STL Partners’ view that is a best-case scenario and a 50% reduction in costs is probably more realistic.

STL Partners believes adoption of cloud gaming will be gradual and piecemeal for the next few years, as console gamers work their way through another generation of consoles and casual gamers are reluctant to commit to a monthly subscription. However, from 2022, adoption is likely to grow rapidly as cloud gaming propositions improve.

At this stage, it is not yet clear who will dominate the value chain, if anyone. Will the “hyperscalers” be successful in creating a ‘Netflix’ for games? Google is certainly trying to do this with its Stadia platform, which has yet to gain any real traction, due to both its limited games library and its perceived technological immaturity. The established players in the games industry, such as EA, Microsoft (Xbox) and Sony (PlayStation), have launched cloud gaming offerings, or are, at least, in the process of doing so. Some telcos, such as Deutsche Telekom and Sunrise, are developing their own cloud gaming services, while SK Telecom is partnering with Microsoft.

What telcos can learn from Shadow’s cloud gaming proposition

The rest of this report explores the business models being pursued by cloud gaming providers. Specifically, it looks at cloud gaming company Shadow and how it fits into the wider ecosystem, before evaluating how its distinct approach compares with that of the major players in online entertainment, such as Sony and Google. The second half of the report considers the implications for telcos.

Table of Contents

  • Executive Summary
  • Introduction
  • Cloud gaming: a complicated ecosystem
    • The battle of the business models
    • The economics of cloud gaming and pricing models
    • Content offering will trump price
    • Cloud gaming is well positioned for casual gamers
    • The future cloud gaming landscape
  • 5G and fixed wireless
  • The role of edge computing
  • How and where can telcos add value?
  • Conclusions


Cloud gaming: New opportunities for telcos?

Gaming is following video to the cloud

Cloud gaming services enable consumers to play video games using any device with a screen and an Internet connection – the software and hardware required to play the game are all hosted on remote cloud services. Some reviewers say connectivity and cloud technologies have now advanced to a point where cloud gaming can begin to rival the experience offered by leading consoles, such as Microsoft’s Xbox and Sony’s PlayStation, while delivering greater interactivity and flexibility than gaming that relies on local hardware. Google believes it is now feasible to move gaming completely into the cloud – it has just launched its Stadia cloud gaming service. Although Microsoft is sounding a more cautious note, it is gearing up to launch a rival cloud gaming proposition called xCloud.

This report explores cloud gaming and models the size of the potential market, including the scale of the opportunity for telcos. It also considers the potential ramifications for telecoms networks. If Stadia, xCloud and other cloud gaming services take off, consumer demand for high-bandwidth, low latency connectivity could soar. At the same time, cloud gaming could also provide a key test of the business rationale for edge computing, which involves the deployment of compute power and data storage closer to the end users of digital content and applications. This allows the associated data to be processed, analysed and acted on locally, instead of being transmitted long distances to be processed at central data centres.

This report then goes on to outline the rollout of cloud gaming services by various telcos, including Deutsche Telekom in Germany and Sunrise in Switzerland, while also considering Apple’s strategy in this space. Finally, the conclusions section summarises how telcos around the world should be preparing for mass-market cloud gaming.

This report builds on previous executive briefings published by STL Partners, including:



What is cloud gaming?

Up to now, keen gamers have generally bought a dedicated console, such as a Microsoft Xbox or Sony PlayStation, or a high-end computer, to play technically complex and graphically rich games. They also typically buy a physical copy of the game (a DVD), which they install on their console or in an optical disc drive attached to their PC. Alternatively, some platforms, such as Steam, allow gamers to download games from a marketplace.

Cloud gaming changes that paradigm by running the games on remote hardware in the cloud, with the video and audio then streamed to the consumer’s device, which could be a smartphone, a connected TV, a low-end PC or a tablet. The player would typically connect this device to a dedicated handheld controller, similar to one that they would use with an Xbox or a PlayStation.

There is also a half-way house between full cloud gaming and console gaming. This “lite” form of cloud gaming is sometimes known as “command streaming”. In this case, the game logic and graphics commands are processed in the cloud, but the graphics rendering happens locally on the device. This approach lowers the amount of bandwidth required (sending commands requires less bandwidth than sending video) and is less demanding from a latency perspective (no encoding/ decoding of the video stream). But the quality of graphics will be limited to the capabilities of the graphic processing unit on the end-user’s device. For keen players that want to play graphically rich games, command streaming wouldn’t necessarily eliminate the need to buy a console or a powerful PC.

As well as relocating and rejigging the computing permutations, cloud gaming opens up new business models. Rather than buying individual games, for example, the consumer could pay for a Netflix-style subscription service that would enable them to play a wide range of online video games, without having to download them. Alternatively, cloud gaming services could use a pay-as-you-go model, simply charging consumers by the minute or hour.

Today, these cloud gaming subscriptions can be relatively expensive. For example, Shadow, an existing cloud gaming service, charges US$35 a month in the U.S., £32 a month in the U.K. and €40 a month in France and Germany (with significant discounts if the subscriber commits to 12 months). Shadow can support graphics resolution of 4K at 60 frames per second and conventional HD at 144 frames per second, which is superior to a typical console specification. It requires an Internet connection of at least 15 Mbps. Shadow is compatible with Windows 7/8/10, macOS, Android, Linux (beta) and iOS (beta), and comes with a Windows 10 license, which can be used for other PC applications.

At those prices, Shadow is a niche offering. But Google is now looking to take cloud gaming mainstream by setting subscription charges at around US$10 a month – comparable to a Spotify or Netflix subscription, although the user will have to pay additional fees to buy most games. Google says its new Stadia cloud gaming service is accessible from any device that can run YouTube in HD at 30/60 frames per second (fps), as long as it has a fast enough connection (15–25Mbps). The consumer then uses a dedicated controller that can connect directly to their Wi-Fi, bypassing the device with the screen. All the processing is done in Google’s cloud, which then sends a YouTube video-stream to the device: the URL pinpoints which clip of the gameplay to request and receive.

In other words, Stadia will treat games as personalised YouTube video clips/web-pages that a player or viewer can interact with in real time. As a result, the gamer can share that stream easily with friends by sending them the URL. With permission from the gamer, the friend could then jump straight into the gameplay using their own device.
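A simple piece of arithmetic shows what such streams imply for household data consumption. Assuming illustrative playing times, a 15–25Mbps stream translates into hundreds of gigabytes per month.

```python
# Illustrative data-usage arithmetic (assumed playing times): what a 15-25 Mbps
# Stadia-style stream implies for a household's monthly data consumption.
def monthly_gb(stream_mbps: float, hours_per_day: float, days: int = 30) -> float:
    seconds = hours_per_day * 3600 * days
    return stream_mbps * seconds / 8 / 1000        # Mbit -> MB -> GB

print(f"15 Mbps, 1 h/day: ~{monthly_gb(15, 1):.0f} GB/month")   # ~200 GB
print(f"25 Mbps, 2 h/day: ~{monthly_gb(25, 2):.0f} GB/month")   # ~675 GB
```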


Table of contents

  • Executive Summary
  • Introduction
  • What is cloud gaming?
    • Why consumers will embrace cloud gaming
  • Ramifications for telecoms networks
    • Big demands on bandwidth
    • Latency
    • Edge computing
    • The network architecture underpinning Google Stadia
  • How large is the potential market?
    • Modelling the U.S. cloud gaming market
    • New business models
  • Telcos’ cloud gaming activities
    • Microsoft hedges its bets
    • Apple takes a different tack
  • Conclusions
    • Telcos without their own online entertainment offering
    • Telcos with their own online entertainment offering



The Industrial IoT: What’s the telco opportunity?

The Industrial IoT is a confusing world

This report is the final report in a mini-series about the Internet for Things (I4T), which we see as the next stage of evolution from today’s IoT.

The first report, The IoT is dead: Long live the Internet for Things, outlines why today’s IoT infrastructure is insufficient for meeting businesses’ needs. The main problem with today’s IoT is that every company’s data is locked in its own silo, and one company’s solutions are likely deployed on a different platform than their partners’. So companies can optimise their internal operations, but have limited scope to use IoT to optimise operations involving multiple organisations.

The second report, Digital twins: A catalyst of disruption in the Coordination Age, provides an overview of what digital twins are and how they can play a role in overcoming the limitations of today’s IoT industry.

This report looks more closely at the state of development of enterprise and industrial IoT and the leading players in today’s IoT industry, which we believe is a crucial driver of the Coordination Age. In the Coordination Age, we believe the crucial socio-economic need in the world – and therefore the biggest business opportunity – is to make better use of our resources, whether that is time, money, or raw materials. Given the number of people employed in and resources going through industrial processes, figuring out what’s needed to make the industrial IoT reach its full potential is a big part of making this possible.

Three types of IoT

IoT applications can be divided into three broad types. As described by IoT expert Stacey Higginbotham, each group has distinct needs and priorities based on its main purpose:

  1. Consumer IoT: A connected device, with an interactive app, that provides an additional service to the end user compared with an unconnected version of the device. The additional service is enabled by the insights and data gathered from the device. The key priority for consumer devices is low price point and ease of installation, given most users’ lack of technical expertise.
  2. Enterprise IoT: This includes all the devices and sensors that enterprises are connecting to the internet, e.g. enterprise mobility and fleet tracking. Since every device connected to an enterprise network is a potential point of vulnerability, the primary concern of enterprise IoT is security and device management. This is achieved through documentation of devices on enterprise networks, prioritisation of devices and traffic across multiple types of networks, e.g. depending on speed and security requirements, and access rights controls, to track who is sharing data with whom and when.
  3. Industrial IoT: This field is born out of industrial protocols such as SCADA, which do not currently connect to the internet but rather to an internal control and monitoring system for manufacturing equipment. More recently, enterprises have enhanced these systems with a host of devices connected to IP networks through Wi-Fi or other technologies, and linked legacy monitoring systems to gateways that feed operational data into more user-friendly, cloud-based monitoring and analytics solutions. At this point, the lines between Industrial IoT and Enterprise IoT blur. When the cloud-based systems have the ability to control connected equipment, for instance through firmware updates, security to prevent malicious or unintended risks is paramount. The primary goals in IIoT remain to control and monitor, in order to improve operational efficiency and safety, although with rising security needs.

The Internet for Things (I4T) is in large part about bridging the divide between Enterprise and Industrial IoT. The idea is to be able to share highly sensitive industrial information, such as a change in operational status that will affect a supply chain, or a fault in public infrastructure like roads, rail or electricity grid, that will affect surroundings and require repairs. This requires new solutions that can coordinate and track the movement of Industrial IoT data into Enterprise IoT insights and actions.

Understandably, enterprises are wary of opening up any vulnerabilities in their operations through deeper or broader connections, so finding a secure way to bring about the I4T is the primary concern.

The proliferation of IoT platforms

Almost every major player in the ICT world is pitching for a role in both Enterprise and Industrial IoT. Most large-scale manufacturers and telecoms operators are also trying to carve out a role in the IoT industry.

By and large, these players have developed specific IoT solutions linked to their core businesses, and then expanded by developing some kind of “IoT platform” that brings together a broader range of capabilities across the IT stack necessary to provide end-to-end IoT solutions.

The result is a hugely complex industry with many overlapping and competing “platforms”. Because they all do something different, the term “platform” is often unhelpful in understanding what a company provides.

A company’s “IoT platform” might comprise any combination of these four layers of the IoT stack, all of which are key components of an end-to-end solution:

  1. Hardware: This is the IoT device or sensor that is used to collect and transmit data. Larger devices may also have inbuilt compute power enabling them to run local analysis on the data collected, in order to curate which data need to be sent to a central repository or other devices.
  2. Connectivity: This is the means by which data is transmitted, ranging from local-area connectivity (Bluetooth, Wi-Fi), to low power wide area networks over unlicensed spectrum (Sigfox, LoRa), and cellular (NB-IoT, LTE-M, LTE).
  3. IoT service enablement: This is the most nebulous category, because it includes anything that sits as middleware in between connectivity and the end application. The main types of enabling functions are:
    • Cloud compute capacity for storing and analysing data
    • Data management: aggregating, structuring and standardising data from multiple different sources. There are sub-categories within this geared towards specific end applications, such as product or service lifecycle management tools.
    • Device management: device onboarding, monitoring, software updates, and security. Software and security management are often broken out as separate enablement solutions.
    • Connectivity management: orchestrating IoT devices over a variety of networks
    • Data / device visualisation: This includes graphical interfaces for presenting complex data sets and insights, and 3D modelling tools for industrial equipment.
  4. Applications: These leverage tools in the IoT enablement layer to deliver specific insights or trigger actions that deliver a specific outcome to end users, such as predictive maintenance or fleet management. Applications are usually tailored to the specific needs of end users and rarely scale well across multiple industries.

Most “IoT platforms” combine at least two layers across this IoT stack


Source: STL Partners
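To make the four layers more tangible, the toy sketch below walks a single sensor reading through hardware, connectivity, service enablement and application stand-ins. It uses plain Python placeholders rather than any real platform.

```python
# Illustrative sketch only: a toy walk through the four layers of the IoT stack
# described above, using plain Python stand-ins rather than any real platform.
import json
from dataclasses import dataclass

@dataclass
class SensorReading:                       # 1. Hardware: the device produces a raw reading
    device_id: str
    temperature_c: float

def transmit(reading: SensorReading) -> str:      # 2. Connectivity: serialise and "send" the data
    return json.dumps({"device": reading.device_id, "temp_c": reading.temperature_c})

def enrich(payload: str) -> dict:                 # 3. Service enablement: normalise the record
    record = json.loads(payload)
    record["unit"] = "celsius"
    return record

def predictive_maintenance(record: dict) -> str:  # 4. Application: turn data into an action
    return "schedule inspection" if record["temp_c"] > 80 else "no action"

reading = SensorReading("pump-17", 86.5)
print(predictive_maintenance(enrich(transmit(reading))))   # -> schedule inspection
```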

There are two key reasons why platforms offering end-to-end services have dominated the early development of the IoT industry:

  • Enterprises’ most immediate needs have been to have greater visibility into their own operations and make them more efficient. This means IoT initiatives have been driven primarily by business owners, rather than technology teams, who often don’t have the skills to piece together multiple different components by themselves.
  • Although the IoT as a whole is a big business, each individual component required to bring a solution together is relatively small. So companies providing IoT solutions – including telcos – have attempted to capture a larger share of the value chain in order to make it a better business.

Making sense of the confusion

It is a daunting task to work out how to bring IoT into play in any organisation. It requires a thorough re-think of how a business operates, for a start, then tinkering with (or transforming) its core systems and processes, depending on how you approach it.

That’s tricky enough even without the burgeoning market of self-proclaimed “leaders of industrial IoT” and technology players’ “IoT platforms”.

This report does not attempt to answer “what is the best way / platform” for different IoT implementations. There are many other resources available that attempt to offer comparisons to help guide users through the task of picking the right tools for the job.

The objective here is to gain a sense of what is real today, and where the opportunities and gaps are, in order to help telecoms operators and their partners understand how they can help enterprises move beyond the IoT, into the I4T.

 

Table of contents

  • Executive Summary
  • Introduction
    • Three types of IoT
    • The proliferation of IoT platforms
    • Making sense of the confusion
  • The state of the IoT industry
    • In the beginning, there was SCADA
    • Then there were specialised industrial automation systems
    • IoT providers are learning about evolving customer needs
  • Overview of IoT solution providers
    • Generalist scaled IT players
    • The Internet players (Amazon, Google and Microsoft)
    • Large-scale manufacturers
    • Transformation / IoT specialists
    • Big telco vendors
    • Telecoms operators
    • Other connectivity-led players
  • Conclusions and recommendations
    • A buyers’ eye view: Too much choice, not enough agility
    • How telcos can help – and succeed over the long term in IoT

New age, new control points?

Why control points matter

This executive briefing explores the evolution of control points – products, services or roles that give a company disproportionate power within a particular digital value chain. Historically, such control points have included Microsoft’s Windows operating system and Intel’s processor architecture for personal computers (PCs), Google’s search engine and Apple’s iPhone. In each case, these control points have been a reliable source of revenues and a springboard into other lucrative new markets, such as productivity software (Microsoft), server chips (Intel), display advertising (Google) and app retailing (Apple).

Although technical and regulatory constraints mean that most telcos are unlikely to be able to build out their own control points, there are exceptions, such as the central role of Safaricom’s M-Pesa service in Kenya’s digital economy. In any case, a thorough understanding of where new control points are emerging will help telcos identify what their customers most value in the digital ecosystem. Moreover, if they move early enough to encourage competition and/or appropriate regulatory intervention, telcos could prevent themselves, their partners and their customers from becoming too dependent on particular companies.

The emergence of Microsoft’s operating system as the dominant platform in the PC market left many of its “partners” struggling to eke out a profit from the sale of computer hardware. Looking forward, there is a similar risk that a company that creates a dominant artificial intelligence platform could leave other players in various digital value chains, including telcos, at its beck and call.

This report explores how control points are evolving beyond simple components, such as a piece of software or a microprocessor, to become elaborate vertically-integrated stacks of hardware, software and services that work towards a specific goal, such as developing the best self-driving car on the planet or the most accurate image recognition system in the cloud. It then outlines what telcos and their partners can do to help maintain a balance of power in the Coordination Age, where, crucially, no one really wants to be at the mercy of a “master coordinator”.

The report focuses primarily on the consumer market, but the arguments it makes are also applicable in the enterprise space, where machine learning is being applied to optimise specialist solutions, such as production lines, industrial processes and drug development. In each case, there is a danger that a single company will build an unassailable position in a specific niche, ultimately eliminating the competition on which effective capitalism depends.



Control points evolve and shift

A control point can be defined as a product, service or solution on which every other player in a value chain is heavily dependent. Their reliance on this component means the other players in the value chain generally have to accept the terms and conditions imposed by the entity that owns the control point. A good contemporary example is Apple’s App Store – owners of Apple’s devices depend on the App Store to get access to software they need/want, while app developers depend on the App Store to distribute their software to the 1.4 billion Apple devices in active use. This pivotal position allows Apple to levy a controversial commission of 30% on software and digital content sold through the App Store.

But few control points last forever: the App Store will only continue to be a control point if consumers continue to download a wide range of apps, rather than interacting with online services through a web browser or another software platform, such as a messaging app. Recent history shows that as technology evolves, control points can be sidestepped or marginalised. For example, Microsoft’s Windows operating system and Internet Explorer browser were once regarded as key control points in the personal computing ecosystem, but neither piece of software is still at the heart of most consumers’ online experience.

Similarly, the gateway role of Apple’s App Store looks set to be eroded over time. Towards the end of 2018, Netflix – the App Store’s top grossing app – stopped allowing new customers to sign up and subscribe to the streaming service within the Netflix app for iOS across all global markets, according to a report by TechCrunch. That move is designed to cut out the expensive intermediary – Apple. Citing data compiled by Sensor Tower, the report said Netflix would have paid Apple US$256 million of the US$853 million grossed by the Netflix iOS app in 2018, assuming a 30% commission for Apple (however, after the first year, Apple’s cut on subscription renewals is lowered to 15%).

TechCrunch noted that Netflix is following in the footsteps of Amazon, which has historically restricted movie and TV rentals and purchases to its own website or other “compatible” apps, instead of allowing them to take place through its Prime Video app for iOS or Android. In so doing, Amazon is preventing Apple or Google from taking a slice of its content revenues. Amazon takes the same approach with Kindle e-books, which also aren’t offered in the Kindle mobile app. Spotify has also discontinued the option to pay for its Premium service using Apple’s in-app payment system.

Skating ahead of the puck

As control points evolve and shift, some of today’s Internet giants, notably Alphabet, Amazon and Facebook, are skating where the puck is heading, acquiring the new players that might disrupt their existing control points. In fact, the willingness of today’s Internet platforms to spend big money on small companies suggests they are much more alert to this dynamic than their predecessors were. Facebook’s US$19 billion acquisition of messaging app WhatsApp, which has generated very little in the way of revenues, is perhaps the best example of the perceived value of strategic control points – consumers’ time and attention appear to be gradually shifting from traditional social networks to messaging apps, such as WhatsApp, or hybrid services, such as Instagram, which Facebook also acquired.

In fact, the financial and regulatory leeway Alphabet, Amazon, Facebook and Apple enjoy (granted by long-sighted investors) almost constitutes another control point. Whereas deals by telcos and media companies tend to come under much tougher scrutiny and be restricted by rigorous financial modelling, the Internet giants are generally trusted to buy whoever they like.

The decision by Alphabet, the owner of Google, to establish its “Other Bets” division is another example of how today’s tech giants have learnt from the complacency of their predecessors. Whereas Microsoft failed to anticipate the rise of tablets and smart TVs, weakening its grip on the consumer computing market, Google has zealously explored the potential of new computing platforms, such as connected glasses, self-driving cars and smart speakers.

In essence, the current generation of tech leaders have taken Intel founder Andy Grove’s famous “only the paranoid survive” mantra to heart. Having swept away the old order, they realise their companies could also easily be side-lined by new players with new ways of doing things. Underlining this point, Larry Page, founder of Google, wrote in 2014:Many companies get comfortable doing what they have always done, making only incremental changes. This incrementalism leads to irrelevance over time, especially in technology, where change tends to be revolutionary, not evolutionary. People thought we were crazy when we acquired YouTube and Android and when we launched Chrome, but those efforts have matured into major platforms for digital video and mobile devices and a safer, popular browser.”

Table of contents

  • Executive Summary
  • Introduction
  • What constitutes a control point?
    • Control points evolve and shift
    • New kinds of control points
  • The big data dividend
    • Can incumbents’ big data advantage be overcome?
    • Data has drawbacks – dangers of distraction
    • How does machine learning change the data game?
  • The power of network effects
    • The importance of the ecosystem
    • Cloud computing capacity and capabilities
    • Digital identity and digital payments
  • The value of vertical integration
    • The machine learning super cycle
    • The machine learning cycle in action – image recognition
  • Tesla’s journey towards self-driving vehicles
    • Custom-made computing architecture
    • Training the self-driving software
    • But does Tesla have a sustainable advantage?
  • Regulatory checks and balances
  • Conclusions and recommendations

Elisa Automate: Growing value with sisu

Elisa’s transformation journey

Almost every telco aspires to innovate and become a ‘digital services player’, selling more than just data, voice, messages, and entertainment services, but few have made significant inroads towards this goal.

Yet Elisa, the market leader in Finland, a country with a population of just over 5.5 million people, can stake a claim to having achieved more than most.

The Finnish word ‘sisu’ has no direct English translation. It means a spirit of determination, independence and fortitude, and is considered by some Finns to be the heart of Finnish character.

Elisa and the other Finnish telcos certainly have plenty of sisu. They have resolutely charted their own course and prospered, with Elisa quadrupling its market valuation over the last ten years.

Elisa’s share price has quadrupled since 2009 

Source: Yahoo Finance

The genesis of Elisa Automate

Elisa’s overall strategy was based on a sound but uncommon piece of customer insight: nobody knows what a megabit of data actually is, so it is crazy to price data services based on the volume of data used. Elisa and the other players in the Finnish market therefore moved to unlimited data packages priced by speed (see report: Sense check: Can data growth save telco revenues?).

The consequences of this decision have been that, firstly, Finnish customers use a lot of data and, secondly, Finnish operators have built out coverage so that customers can use that data whenever and wherever they want.

This means that Elisa has to deliver a lot of data across its network.

Elisa’s data traffic has grown massively

Source: Elisa

Elisa has grown its revenues and EBIT too

Source: Elisa

Necessity can be the mother of invention

To manage profitability in a market where usage, and therefore data volume, is effectively unlimited, Elisa had to tie its costs firmly to its revenues. To do so, it elected to keep the ratios of capex and opex to revenue flat. This requires a very clear focus on cost management, and a determination to take every possible step to keep costs down.
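To make that discipline concrete, the sketch below uses purely hypothetical figures (not Elisa’s actual numbers, and hypothetical target ratios) to show how holding the capex/revenue and opex/revenue ratios flat means cost budgets simply track revenue, however fast data volumes grow.

```python
# Minimal sketch of a flat cost-ratio budgeting rule, with hypothetical figures.
# If capex/revenue and opex/revenue are held constant, cost budgets move in
# lockstep with revenue, regardless of how much data traffic grows.

CAPEX_RATIO = 0.12   # hypothetical target capex/revenue ratio
OPEX_RATIO = 0.55    # hypothetical target opex/revenue ratio

def cost_budgets(revenue_m_eur: float) -> dict:
    """Return the capex and opex budgets implied by flat cost ratios."""
    return {
        "capex": revenue_m_eur * CAPEX_RATIO,
        "opex": revenue_m_eur * OPEX_RATIO,
    }

# Hypothetical revenue growing year on year; budgets scale proportionally.
for year, revenue in [(2017, 1500.0), (2018, 1600.0), (2019, 1700.0)]:
    budgets = cost_budgets(revenue)
    print(f"{year}: revenue €{revenue:.0f}M -> "
          f"capex €{budgets['capex']:.0f}M, opex €{budgets['opex']:.0f}M")
```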

Elisa’s capex/revenue ratio is surprisingly low and stable

Source: Elisa

Out of this need came a powerful drive for automation: not to simply cut costs or reduce headcount, but to make the company as efficient as possible.

The result is Elisa Automate, a fully automated Network Operations Centre (NOC), one of three new business concepts that Elisa is now selling to other companies (in this case, to other telcos), alongside Elisa SmartFactory and its video conferencing aggregation service.
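This extract does not describe the inner workings of Elisa Automate, but the generic sketch below illustrates the kind of ‘zero touch’ loop a fully automated NOC implies: incoming alarms are matched against a remediation playbook and fixed without human intervention, with only unrecognised faults escalated to engineers. The alarm types, actions and function names here are hypothetical, not Elisa’s actual implementation.

```python
# Generic illustration of a 'zero touch' NOC automation loop (hypothetical,
# not Elisa's actual implementation): known alarm patterns are remediated
# automatically; anything unrecognised is escalated to a human engineer.

from dataclasses import dataclass

@dataclass
class Alarm:
    cell_id: str
    alarm_type: str

# Hypothetical mapping of alarm types to automated remediation actions.
PLAYBOOK = {
    "sleeping_cell": "restart_cell",
    "config_drift": "reapply_golden_config",
}

def handle_alarm(alarm: Alarm) -> str:
    """Return the action an automated NOC might take for a given alarm."""
    action = PLAYBOOK.get(alarm.alarm_type)
    if action is None:
        return f"escalate {alarm.cell_id} to human engineer"
    return f"{action} on {alarm.cell_id}"

print(handle_alarm(Alarm("cell-042", "sleeping_cell")))   # restart_cell on cell-042
print(handle_alarm(Alarm("cell-077", "fibre_cut")))       # escalate cell-077 to human engineer
```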

Elisa is clearly succeeding, and not just in its financial results. For example, 18% of Finnish business customers say that Elisa is the most innovative IT player in its market, compared to 6% for CGI and 5% for Fujitsu.

STL Partners has long watched Elisa’s progress with a high degree of fascination. Elisa and its Finnish peers are a little like the Galapagos Islands of telecoms evolution, made extraordinary by their distinctive approaches rather than by extreme geographical isolation.

Contents:

  • Introduction
  • Elisa: creating an innovator
  • Building a stable foundation for innovation
  • Making the most of Finland’s advantages
  • The genesis of Elisa Automate
  • The early drivers of automation
  • The move towards ‘zero touch’
  • Augmenting human intelligence
  • Automation supports rapid mobile service revenue growth
  • Commercialising the opportunity
  • The value proposition
  • Customer spotlight: Orange Spain
  • Conclusions

Figures:

  1. Elisa’s share price has quadrupled since 2009
  2. Elisa’s data traffic has grown massively
  3. Elisa has grown its revenues and EBIT too
  4. Elisa’s Capex/Revenue ratio is surprisingly low and stable
  5. Elisa shares data showing network performance improvements through automation