Convergence, coexistence or competition: How will 5G and Wi-Fi 6 interact?

Introduction: Wi-Fi vs cellular

The debate around Wi-Fi and cellular convergence is not new. However, the introduction of the next generation of Wi-Fi and cellular technologies, Wi-Fi 6 and 5G, has reignited it. Further impetus has come from industry bodies, including the Wi-Fi Alliance, IEEE, Wireless Broadband Alliance (WBA), Next Generation Mobile Networks Alliance (NGMN) and 3GPP, which are developing standards to enable convergence between 5G and Wi-Fi.

5G, introduced in 3GPP’s Release 15 in 2018 and deployed internationally by telecoms operators since 2019, is considered a significant upgrade to 4G/LTE. Its improved capabilities, such as increased speed, coverage, reliability and security, promise to enable a host of new use cases across a wide range of industries.

Simultaneously, Wi-Fi has evolved into its sixth generation, with Wi-Fi 6 technology emerging in 2019. This new generation of Wi-Fi can provide speeds around 40% higher than its predecessor’s, as well as improved visibility and transparency for better network control and management. Some of the key enhancements are detailed in the figure below.

Figure 1: There are a number of key differences between next generation Wi-Fi and cellular connectivity


Source: STL Partners

The market context for convergence

Industry bodies have been promoting convergence

The Wireless Broadband Alliance (WBA) and the Next Generation Mobile Networks Alliance (NGMN) produced a joint report in 2021 promoting the future convergence of Wi-Fi and 5G. The report highlights the merits of convergence, noting a number of use cases and verticals that stand to benefit from closer alignment between the two technologies. Further, 3GPP has increasingly included standards in each new release that enable convergence between Wi-Fi and cellular. Release 8 introduced the ‘access network discovery and selection function’ (ANDSF), which allowed user equipment to discover non-3GPP access networks, including Wi-Fi. Release 15 (2018) added optional support for native 5G services delivered over these non-3GPP access networks. Most recently, Release 16 introduced ‘access traffic steering, splitting and switching’ (ATSSS), which lets a device connect to multiple access networks over both 3GPP and non-3GPP access simultaneously and is a key enabler of the resilience model of convergence. Similarly, the IEEE and the Wi-Fi Alliance have been discussing potential pathways to convergence for a number of years, although these bodies are less vocal about future convergence possibilities, likely because Wi-Fi currently dominates the provision of enterprise wireless connectivity.
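To make the ATSSS concept concrete, the three behaviours can be pictured as a policy choice over two access ‘legs’ on the same device: steer a new flow onto one leg, switch it to the other if that leg fails, or split traffic across both. The sketch below is a purely conceptual illustration under our own naming and thresholds (it is not 3GPP-defined code), although the mode names loosely mirror the Release 16 steering modes.

```python
# Conceptual sketch only (not 3GPP code): the three ATSSS behaviours --
# steering (pick one access), switching (move a flow when a leg fails),
# splitting (use both legs) -- for a device with a 5G leg and a Wi-Fi leg.

from dataclasses import dataclass

@dataclass
class AccessLeg:
    name: str          # "5G" or "Wi-Fi"
    available: bool    # is the leg currently usable?
    rtt_ms: float      # measured round-trip time on this leg

def steer(legs: list[AccessLeg], mode: str) -> list[AccessLeg]:
    """Return the leg(s) a new flow should use under a given steering mode."""
    usable = [l for l in legs if l.available]
    if not usable:
        return []
    if mode == "active-standby":       # steering: prefer one leg, fall back if it drops
        preferred = next((l for l in usable if l.name == "Wi-Fi"), usable[0])
        return [preferred]
    if mode == "smallest-delay":       # steering by measured performance
        return [min(usable, key=lambda l: l.rtt_ms)]
    if mode == "load-balancing":       # splitting: send traffic over both legs
        return usable
    raise ValueError(f"unknown mode: {mode}")

legs = [AccessLeg("Wi-Fi", True, 12.0), AccessLeg("5G", True, 18.0)]
print([l.name for l in steer(legs, "smallest-delay")])   # ['Wi-Fi']
legs[0].available = False                                # the Wi-Fi leg fails...
print([l.name for l in steer(legs, "active-standby")])   # ['5G'] -> switching
```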

Spectrum auctions

The possibility of convergence has been further supported in recent years by releases of spectrum in the 6GHz band for unlicensed use in the USA, UK, South Korea and other major markets. Spectrum in the same 6GHz range can also be used to support 5G connectivity, in addition to the existing 5GHz band. With the ability to share the same spectrum, this could in theory promote closer coupling of 5G and Wi-Fi. However, given the similar propagation characteristics of the two technologies in this band, it remains to be seen whether the increasing availability of spectrum will push convergence forward.

There is a disconnect between theory and practice

While standards define what is possible, the purpose of industry bodies is to be future-focused, paving the way for the rest of the ecosystem to follow. What is possible in theory must be supported in practice, and the supply-side ecosystem, including network operators, system integrators (SIs), network equipment providers (NEPs) and hardware manufacturers, has a role to play if convergence is to become more widespread.

Similarly, for devices to access converged networks, they must be equipped with both 5G and Wi-Fi chips. While mobile phones support both connectivity types, the vast majority of connected devices that enterprises deploy are Wi-Fi only. Until 5G chips or modules become more widely available and are used in a greater number of devices, convergence will likely remain confined to specific use cases: for example, use cases that depend on the mobility afforded by being able to ‘switch over’ from Wi-Fi to mobile seamlessly, or highly mission-critical use cases in verticals such as manufacturing that can justify the investment in (private) 5G as a back-up to Wi-Fi. We discuss both of these use cases in more detail in the report. The full ecosystem must ultimately work in concert for convergence to become a realistic possibility for a larger number of enterprises.

 

Table of Contents

  • Executive Summary
    • Convergence is still immature on both the demand and supply sides
    • What do we mean by co-existence, convergence and competition?
  • Preface
  • Introduction
  • The market context for convergence
    • Industry bodies have been promoting convergence
    • Spectrum auctions
    • There is a disconnect between theory and practice
    • There are two key use cases for convergence
  • A future trend towards convergence is still immature
    • Regional differences in the maturity of 5G
    • Inconsistent definitions
    • Who manages convergence?
  • It is still too early to see high levels of demand for convergence from enterprise customers
    • Wi-Fi is the incumbent, 5G must overcome a number of barriers before it can become a genuine partner or alternative
    • Decisions regarding convergence are driven by industry characteristics
    • Supply side players must educate enterprise customers about convergence (if they believe it is beneficial to the enterprise)
  • Conclusion

Related research

Indoor wireless: A new frontier for IoT and 5G

Introduction to Indoor Wireless

A very large part of the usage of mobile devices – and mobile and other wireless networks – is indoors. Estimates vary but perhaps 70-80% of all wireless data is used while fixed or “nomadic”, inside a building. However, the availability and quality of indoor wireless connections (of all types) varies hugely. This impacts users, network operators, businesses and, ultimately, governments and society.

Whether the use-case is watching a YouTube video on a tablet from a sofa, booking an Uber from a phone in a company’s reception, or controlling a moving robot in a factory, the telecoms industry needs to give much more thought to the user-requirements, technologies and obstacles involved. This is becoming ever more critical as sensitive IoT applications emerge, which are dependent on good connectivity – and which don’t have the flexibility of humans. A sensor or piece of machinery cannot move and stand by a window for a better signal – and may well be in parts of a building that are inaccessible to both humans and many radio transmissions.

While mobile operators and other wireless service providers have important roles to play here, they cannot do everything, everywhere. They do not have the resources, and may lack site access. Planning, deploying and maintaining indoor coverage can be costly.

Indeed, the growing importance and complexity is such that a lot of indoor wireless infrastructure is owned by the building or user themselves – which then brings in further considerations for policymakers about spectrum, competition and more. There is a huge upsurge of interest in both improved Wi-Fi and deployments of private cellular networks indoors, as some organisations see connectivity as so strategically important that they wish to control it directly, rather than relying on service providers. Various new classes of service provider are emerging too, focused on particular verticals or use-cases.

In the home, wireless networks are also becoming a battleground for “ecosystem leverage”. Fixed and cable networks want to improve their existing Wi-Fi footprint to give “whole home” coverage worthy of gigabit fibre or cable connections. Cellular providers are hoping to swing some residential customers to mobile-only subscriptions. And technology firms like Google see home Wi-Fi as a pivotal element to anchor other smart-home services.

Large enterprise and “campus” sites like hospitals, chemical plants, airports, hotels and shopping malls each have complex on-site wireless characteristics and requirements. No two are alike – but all are increasingly dependent on wireless connections for employees, visitors and machines. Again, traditional “outdoors” cellular service providers are not always best-placed to deliver this – but often, neither is anyone else. New skills and deployment models are needed, ideally backed with more cost-effective (and future-proofed) technology and tools.

In essence, there is a conflict between “public network service” and “private property” when it comes to wireless connectivity. For the fixed network, there is a well-defined “demarcation point” where a cable enters the building, and ownership and responsibilities switch from telco to building owner or end-user. For wireless, that demarcation is much harder to institutionalise, as signals propagate through walls and windows, often in unpredictable and variable fashion. Some large buildings even have their own local cellular base stations, and dedicated systems to “pipe the signal through the building” (distributed antenna systems, DAS).

Where is indoor coverage required?

There are numerous sub-divisions of “indoors”, each of which brings its own challenges, opportunities and market dynamics:

• Residential properties: houses & apartment blocks
• Enterprise “carpeted offices”, either owned/occupied, or multi-tenant
• Public buildings, where visitors are more numerous than staff (e.g. shopping malls, sports stadia, schools), and which may also have companies as tenants or concessions.
• Inside vehicles (trains, buses, boats, etc.) and across transport networks like metro systems or inside tunnels
• Industrial sites such as factories or oil refineries, which may blend “indoors” with “onsite”

In addition to these broad categories are assorted other niches, plus overlaps between the sectors. There are also other dimensions around scale of building, single-occupant vs. shared tenancy, whether the majority of “users” are humans or IoT devices, and so on.

In a nutshell: indoor wireless is complex, heterogeneous, multi-stakeholder and often expensive to deal with. It is no wonder that most mobile operators – and most regulators – focus on outdoor, wide-area networks both for investment, and for license rules on coverage. It is unreasonable to force a telco to provide coverage that reaches a subterranean, concrete-and-steel bank vault, when their engineers wouldn’t even be allowed access to it.

How much of a problem is indoor coverage?

Anecdotally, many locations have problems with indoor coverage – cellular networks are patchy, Wi-Fi can be cumbersome to access and slow, and GPS satellite location signals don’t work without line-of-sight to several satellites. We have all complained about poor connectivity in our homes or offices, or about needing to stand next to a window. With growing dependency on mobile devices, plus the advent of IoT devices everywhere, for increasingly important applications, good wireless connectivity is becoming more essential.

Yet hard data about indoor wireless coverage is also very patchy. UK regulator Ofcom is one of the few that reports on the availability and usability of cellular signals, and few regulators (Japan’s is another exception) enforce it as part of spectrum licenses. Fairly clearly, it is hard to measure: operators cannot do systematic “drive tests” indoors, while on-device measurements usually cannot determine whether the user is inside or outside without being invasive of their privacy. Most operators and regulators therefore estimate coverage, based on samples plus knowledge of outdoor signal strength and typical building construction practices. The accuracy of these estimates (and of the assumptions behind them) is highly questionable.
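In practice, such estimates amount to a simple link-budget exercise: take the modelled outdoor signal level and subtract an assumed building entry loss for the construction type. A minimal sketch of that logic follows; the loss values and the -110dBm service threshold are illustrative assumptions, not measured figures.

```python
# Minimal sketch of how indoor coverage is typically *estimated* rather than measured:
# outdoor signal minus an assumed building entry loss. All values are illustrative only.

BUILDING_ENTRY_LOSS_DB = {        # assumed typical penetration losses, by construction
    "wood/glass": 5,
    "brick": 12,
    "concrete": 20,
    "low-e glass / insulated": 25,
}

def indoor_signal_dbm(outdoor_dbm: float, construction: str) -> float:
    return outdoor_dbm - BUILDING_ENTRY_LOSS_DB[construction]

def likely_covered(outdoor_dbm: float, construction: str, threshold_dbm: float = -110) -> bool:
    """Crude coverage test: does the estimated indoor level clear a service threshold?"""
    return indoor_signal_dbm(outdoor_dbm, construction) >= threshold_dbm

print(likely_covered(-95, "brick"))                    # True  (estimated -107 dBm indoors)
print(likely_covered(-95, "low-e glass / insulated"))  # False (estimated -120 dBm indoors)
```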

Indoor coverage data is hard to find

Contents:

  • Executive Summary
  • Likely outcomes
  • What telcos need to do
  • Introduction to Indoor Wireless
  • Overview
  • Where is indoor coverage required?
  • How much of a problem is indoor coverage?
  • The key science lesson of indoor coverage
  • The economics of indoor wireless
  • Not just cellular coverage indoors
  • Yet more complications are on the horizon…
  • The role of regulators and policymakers
  • Systems and stakeholders for indoor wireless
  • Technical approaches to indoor wireless
  • Stakeholders for indoor wireless
  • Home networking: is Mesh Wi-Fi the answer?
  • Is outside-in cellular good enough for the home on its own?
  • Home Wi-Fi has complexities and challenges
  • Wi-Fi innovations will perpetuate its dominance
  • Enterprise/public buildings and the rise of private cellular and neutral host models
  • Who pays?
  • Single-operator vs. multi-operator: enabling “neutral hosts”
  • Industrial sites and IoT
  • Conclusions
  • Can technology solve MNO’s “indoor problem”?
  • Recommendations

Figures:

  • Indoor coverage data is hard to find
  • Insulation impacts indoor penetration significantly
  • 3.5GHz 5G might give acceptable indoor coverage
  • Indoor wireless costs and revenues
  • In-Building Wireless face a dynamic backdrop
  • Key indoor wireless architectures
  • Different building types, different stakeholders
  • Whole-home meshes allow Wi-Fi to reach all corners of the building
  • Commercial premises now find good wireless essential
  • Neutral Hosts can offer multi-network coverage to smaller sites than DAS
  • Every industrial sector has unique requirements for wireless

Free-T-Mobile: Disruptive Revolution or a Bridge Too Far?

Free’s Bid for T-Mobile USA 

The future of the US market and its 3rd and 4th operators has been a long-running saga. The market, the world’s richest, remains dominated by the duopoly of AT&T and Verizon Wireless. It was long expected that Softbank’s acquisition of Sprint heralded disruption, but in the event, T-Mobile was simply quicker to the punch.

Since the launch of T-Mobile’s “uncarrier” price-war strategy, we have identified signs of a “Free Mobile-like” disruption event, for example, substantial net-adds for the disruptor, falling ARPUs, a shakeout of MVNOs and minor operators, and increased industry-wide subscriber growth. However, other key indicators like a rapid move towards profitability by the disruptor are not yet in evidence, and rather than industry-wide deflation, we observe divergence, with Verizon Wireless increasing its ARPU, revenues, and margins, while AT&T’s are flat, Sprint’s flat to falling, and T-Mobile’s plunging.

This data is summarised in Figure 1.

Figure 1: Revenue and margins in the US. The duopoly is still very much with us

 

Source: STL Partners, company filings

Compare and contrast Figure 2, which shows the fully developed disruption in France. 

 

Figure 2: Fully-developed disruption. Revenue and margins in France

 

Source: STL Partners, company filings

T-Mobile: the state of play in Q2 2014

When reading Figure 1, you should note that T-Mobile’s Q2 2014 accounts contain a negative expense item of $747m, reflecting a spectrum swap with Verizon Wireless, which flatters its margin. Without it, the operating margin would be 2.99%, about a third of Sprint’s. Poor as this is, it is at least positive territory, after a Q1 in which T-Mobile lost money. It is not quite true to say that T-Mobile only made it to profitability thanks to the one-off spectrum deal; excluding it, the carrier would still have made $215m in operating income in Q2, a $243m swing from the $28m net loss in Q1. This is explained by a $223m narrowing of T-Mobile’s losses on device sales, as shown in Figure 3, and may explain why the earnings release makes no mention of profits, focusing instead on adjusted EBITDA, despite it being a positive quarter.
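For reference, the figures quoted above reconcile as follows; the “reported” Q2 operating income is simply implied by adding back the one-off item, not separately sourced.

```python
# Quick reconciliation of the figures quoted above (all in $m).
spectrum_swap_gain = 747            # negative expense item from the Verizon spectrum swap
underlying_q2_op_income = 215       # Q2 2014 operating income excluding the one-off
q1_net_result = -28                 # Q1 2014 net loss

implied_reported_q2 = underlying_q2_op_income + spectrum_swap_gain
swing_q1_to_q2 = underlying_q2_op_income - q1_net_result

print(implied_reported_q2)          # 962 -- the figure flattered by the spectrum swap
print(swing_q1_to_q2)               # 243 -- the underlying quarter-on-quarter improvement
```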

Figure 3: T-Mobile’s return to underlying profitability – caused by moderating its smartphone bonanza somewhat

Source: STL Partners, company filings

T-Mobile management likes to cite its ABPU (Average Billings per User) metric, which includes the hire-purchase charges on device sales under its quick-upgrade plans, in preference to ARPU. However, as Figure 4 shows, this is less exciting than it sounds. The T-Mobile management story is that as service prices, and hence ARPU, fall in order to bring in net-adds, payments for device sales “decoupled” from service plans will rise and take up the slack. So far, they are only just doing so. Given that T-Mobile is losing money on device pricing, this is no surprise.

 

  • Executive Summary
  • Free’s Bid for T-Mobile USA
  • T-Mobile: the state of play in Q2 2014
  • Free-Mobile: the financials
  • Indicators of a successful LBO
  • Free.fr: a modus operandi for disruption
  • Surprise and audacity
  • Simple products
  • The technical edge
  • Obstacles to the Free modus operandi
  • Spectrum
  • Fixed-mobile synergy
  • Regulation
  • Summary
  • Two strategic options
  • Hypothesis one: change the circumstances via a strategic deal with the cablecos
  • Hypothesis two: 80s retro LBO
  • Problems that bite whichever option is taken
  • The other shareholders
  • Free’s management capacity and experience
  • Conclusion

 

  • Figure 1: Revenue and margins in the US. The duopoly is still very much with us
  • Figure 2: Fully-developed disruption. Revenue and margins in France
  • Figure 3: T-Mobile’s return to underlying profitability – caused by moderating its smartphone bonanza somewhat
  • Figure 4: Postpaid ARPU falling steadily, while ABPU just about keeps up
  • Figure 5: T-Mobile’s supposed “decoupling” of devices from service has extended $3.5bn of credit to its customers, rising at $1bn/quarter
  • Figure 6: Free’s valuation of T-Mobile is at the top end of a rising trend
  • Figure 7: Example LBO
  • Figure 8: Free-T-Mobile in the context of notable leveraged buyouts
  • Figure 9: Free Mobile’s progress towards profitability has been even more impressive than its subscriber growth

 

Connected Home: Telcos vs Google (Nest, Apple, Samsung, +…)

Introduction 

On January 13th 2014, Google announced its acquisition of Nest Labs for $3.2bn in cash consideration. Nest Labs, or ‘Nest’ for short, is a home automation company, founded in 2010 and based in California, which manufactures ‘smart’ thermostats and smoke/carbon monoxide detectors. Prior to this announcement, Google already had an approximately 12% equity stake in Nest following its Series B funding round in 2011.

Google is known as a prolific investor and acquirer of companies: during 2012 and 2013 it spent $17bn on acquisitions alone, which was more than Apple, Microsoft, Facebook and Yahoo combined (at $13bn). Google has even been known to average one acquisition per week for extended periods of time. Nest, however, was not just any acquisition. For one, whilst the details of the acquisition were being ironed out, Nest was separately in the process of raising a new round of investment which implicitly valued it at c. $2bn. Google, therefore, appears to have paid a premium of over 50%.

This analysis can be extended by examining the transaction under three different, but complementary, lights.

Google + Nest: why it’s an interesting and important deal

  • Firstly, looking at Nest’s market capitalisation relative to its established competitors suggests that its long-run growth prospects are seen to be very strong

At the time of the acquisition, estimates placed Nest as selling 100k of its flagship product (the ‘Nest Thermostat’) per month. With each thermostat retailing at c. $250, this put its revenue at approximately $300m per annum. Now, the ratio of Nest’s market capitalisation to revenue, compared with that of two of its established competitors (Lennox and Honeywell), tells an interesting story:

Figure 1: Nest vs. competitors’ market capitalisation to revenue

 

Source: Company accounts, Morgan Stanley

Such a disparity suggests that Nest’s long-run growth prospects, in terms of both revenue and free cash flow, are believed to be substantially higher than the industry average. 
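As a back-of-envelope check on the numbers behind that comparison, the run-rate arithmetic quoted above, and the revenue multiple implied by Google’s $3.2bn price, work out as follows:

```python
# Back-of-envelope arithmetic behind the figures quoted above.
units_per_month = 100_000          # estimated Nest Thermostat sales at the time of the deal
price_per_unit_usd = 250           # approximate retail price

annual_revenue = units_per_month * 12 * price_per_unit_usd
print(f"${annual_revenue / 1e6:.0f}m per annum")          # $300m per annum

deal_value = 3.2e9                                        # Google's purchase price
print(f"{deal_value / annual_revenue:.1f}x revenue")      # ~10.7x revenue implied by the deal
```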
  • Secondly, looking at Google’s own market capitalisation suggests that the capital markets see considerable value in (and synergies from) its acquisition of Nest

Prior to the deal’s announcement, Google’s share price was oscillating around the $560 mark. Following the acquisition, Google’s share price began averaging closer to $580. On the day of the announcement itself, Google’s share price increased from $561 to $574, which, crucially, reflected a $9bn increase in market capitalisation. In other words, the value placed on Google by the capital markets increased by nearly three times the deal’s value. This is shown in Figure 2 below:

Figure 2: Google’s share price pre- and post-Nest acquisition

 

Source: Google Finance

This implies that the capital markets either see Google as being well positioned to add unique value to Nest, Nest as being able to strongly complement Google’s existing activities, or both.
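The share-price arithmetic behind that conclusion is straightforward; the share count below is implied by the quoted figures rather than taken from Google’s filings.

```python
# The share-price move restated as market value. Prices and the $9bn uplift come from the
# text above; the share count is implied by those figures, not sourced separately.
price_before, price_after = 561, 574        # USD, on the day of the announcement
market_cap_increase = 9e9                   # USD, as reported
deal_value = 3.2e9                          # Nest purchase price

implied_shares = market_cap_increase / (price_after - price_before)
print(f"~{implied_shares / 1e6:.0f}m shares implied")            # ~692m shares
print(f"{market_cap_increase / deal_value:.0%} of deal value")   # 281%, i.e. nearly 3x
```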

  • Thirdly, viewing the Nest acquisition in the context of Google’s historic and recent M&A activity shows both its own specific financial significance and the changing face of Google’s acquisitions more generally

At $3.2bn, the acquisition of Nest represents Google’s second largest acquisition of all time. The largest was its purchase of Motorola Mobility in 2011 for $12.5bn, but Google has since reached a deal to sell the majority of its assets (excluding its patent portfolio) to Lenovo for $2.9bn. In other words, Nest is soon to become Google’s largest active, inorganic investment. Google’s ten largest acquisitions, as well as some smaller but important ones, are shown in Figure 3 below:

Figure 3: Selected acquisitions by Google, 2003-14

Source: Various

Beyond its size, the Nest acquisition also continues Google’s recent trend of acquiring companies seemingly less directly related to its core business. For example, it has been investing in artificial intelligence (DeepMind Technologies), robotics (Boston Dynamics, Industrial Perception, Redwood Robotics) and satellite imagery (Skybox Imaging).

Three questions raised by Google’s acquisition of Nest

George Geis, a professor at UCLA, claims that Google develops a series of metrics at an early stage which it later uses to judge whether or not the acquisition has been successful. He further claims that, according to these metrics, Google on average rates two-thirds of its acquisitions as successful. This positive track record, combined with the sheer size of the Nest deal, suggests that the obvious question here is also an important one:

  • What is Nest’s business model? Why did Google spend $3.2bn on Nest?

Nest’s products, the Nest Thermostat and the Nest Protect (smoke/carbon monoxide detector), sit within the relatively young space referred to as the ‘connected home’, which is defined and discussed in more detail here. One natural question following the Nest deal is whether Google’s high-profile involvement and backing of a (leading) company in the connected home space will accelerate its adoption. This suggests the following, more general, question:

  • What does the Nest acquisition mean for the broader connected home market?

Finally, there is a question to be asked around the implications of this deal for Telcos and their partners. Many Telcos are now active in this space, but they are not alone: internet players (e.g. Google and Apple), big technology companies (e.g. Samsung), utilities (e.g. British Gas) and security companies (e.g. ADT) are all increasing their involvement too. With different strategies being adopted by different players, a further question naturally arises:

  • What does the Nest acquisition mean for telcos?

 

  • Executive Summary
  • Introduction
  • Google + Nest: why it’s an interesting and important deal
  • Three questions raised by Google’s acquisition of Nest
  • Understanding Nest and Connected Homes
  • Nest: reinventing everyday objects to make them ‘smart’
  • Nest’s future: more products, more markets
  • A general framework for connected home services
  • Nest’s business model, and how Google plans to get a return on its $3.2bn investment 
  • Domain #1: Revenue from selling Nest devices is of only limited importance to Google
  • Domain #2: Energy demand response is a potentially lucrative opportunity in the connected home
  • Domain #3: Data for advertising is important, but primarily within Google’s broader IoT ambitions
  • Domain #4: Google also sees Nest as partial insurance against IoT-driven disruption
  • Domain #5: Google is pushing into the IoT to enhance its advertising business and explore new monetisation models
  • Implications for Telcos and the Connected Home
  • The connected home is happening now, but customer experience must not be overlooked
  • Telcos can employ a variety of monetisation strategies in the connected home
  • Conclusions

 

  • Figure 1: Nest vs. competitors’ market capitalisation relative to revenue
  • Figure 2: Google’s share price, pre- and post-Nest acquisition
  • Figure 3: Selected acquisitions by Google, 2003-14
  • Figure 4: The Nest Thermostat and Protect
  • Figure 5: Consumer Electronics vs. Electricity Spending by Market
  • Figure 6: A connected home services framework
  • Figure 7: Nest and Google Summary Motivation Matrix
  • Figure 8: Nest hardware revenue and free cash flow forecasts, 2014-23
  • Figure 9: PJM West Wholesale Electricity Prices, 2013
  • Figure 10: Cooling profile during a Rush Hour Rewards episode
  • Figure 11: Nest is attempting to position itself at the centre of the connected home
  • Figure 12: US smartphone market share by operating system (OS), 2005-13
  • Figure 13: Google revenue breakdown, 2013
  • Figure 14: Google – Generic IoT Strategy Map
  • Figure 15: Connected device forecasts, 2010-20
  • Figure 16: Connected home timeline, 1999-Present
  • Figure 17: OnFuture EMEA 2014: The recent surge in interest in the connected home is due to?
  • Figure 18: A spectrum of connected home strategies between B2C and B2B2C (examples)
  • Figure 19: Building, buying or partnering in the connected home (examples)
  • Figure 20: Telco 2.0™ ‘two-sided’ telecoms business model

Disruptive Strategy: ‘Uncarrier’ T-Mobile vs. AT&T, VZW, and Free.fr

Introduction

Ever since the original Softbank bid for Sprint-Nextel, the industry has been awaiting a wave of price disruption in the United States, the world’s biggest and richest mobile market, and one which is still very much dominated by the dynamic duo, Verizon Wireless and AT&T Mobility.

Figure 1: The US, a rich and high-spending market


Source: Onavo, Ofcom, CMT, BNETZA, TIA, KCC, Telco accounts, STL Partners

However, the Sprint-Softbank deal saga delayed any aggressive move by Sprint for some time, and in the meantime T-Mobile USA stole a march, implemented its own very similar ‘uncarrier’ proposition strategy, and achieved a dramatic turnaround of its customer numbers.

As Figure 2 shows, the duopoly marches on, with Verizon in the lead, although the gap with AT&T has closed a little lately. Sprint, meanwhile, looks moribund, while T-Mobile has closed half the gap with the duopolists in an astonishingly short period of time.

Figure 2: The duopolists hold a lead, but a new challenger arises…

Source: STL Partners

Now, a Sprint-T-Mobile merger is seriously on the cards. Again, Softbank CEO Masayoshi Son is on record as promising to launch a price war. But to what extent is a Free Mobile-like disruption event already happening? And what strategies are carriers adopting?

For more STL analysis of the US cellular market, read the original Sprint-Softbank EB, the Telco 2.0 Transformation Index sections on Verizon and AT&T, and our Self-Disruption: How Sprint Blew It EB. Additional coverage of the fixed domain can be found in the Triple-Play in the USA: Infrastructure Pays Off EB and the Telco 2.0 Index sections mentioned above.

The US Market is Changing

In our previous analysis Self-Disruption: How Sprint Blew It, we used the following chart, Figure 3, under the title “…And ARPU is Holding Up”. Updating it with the latest data, it becomes clear that ARPU – and in this case pricing – is no longer holding up so well. Rather than across-the-board deflation, though, we are instead seeing increasingly diverse strategies.

Figure 3: US carriers are pursuing diverse pricing strategies, faced with change


Source: STL Partners

AT&T’s ARPU is being very gradually eroded (it’s come down by $5 since Q1 2011), while Sprint’s plunged sharply with the shutdown of Nextel (see report referenced above for more detail). Since then, AT&T and Sprint have been close to parity, a situation AT&T management surely can’t be satisfied with. T-Mobile USA has slashed prices so much that the “uncarrier” has given up $10 of monthly ARPU since the beginning of 2012. And Verizon Wireless has added almost as much monthly ARPU in the same timeframe.

Each carrier has adopted a different approach in this period:

  • T-Mobile has gone hell-for-leather after net adds at any price.
  • AT&T has tried to compete with T-Mobile’s price slashing by offering more hardware and bigger bundles and matching T-Mobile’s eye-catching initiatives, while trying to hold the line on headline pricing, perhaps hoping to limit the damage and wait for Deutsche Telekom to tire of the spending. For example, AT&T recently increased its device activation fee by $4, citing the increased number of smartphone activations under its early-upgrade plan. This does not appear in service-ARPU or in headline pricing, but it most certainly does contribute to revenue, and even more so, to margin.
  • Verizon Wireless has declined to get involved in the price war, and has concentrated on maintaining its status as a premium brand, selling on coverage, speed, and capacity. As the above chart shows, this effort to achieve network differentiation has met with a considerable degree of success.
  • Sprint, meanwhile, is responding tactically with initiatives like its “Framily” tariff, while sorting out the network, but is mostly just suffering. The sharp drop in mid-2012 is a signature of high-value SMB customers fleeing the shutdown of Nextel, as discussed in Self-Disruption: How Sprint Blew It.

Figure 4: Something went wrong at Sprint in mid-2012


Source: STL Partners, Sprint filings

 

  • Executive Summary
  • Contents
  • Introduction
  • The US Market is Changing
  • Where are the Customers Coming From?
  • Free Mobile: A Warning from History?
  • T-Mobile, the Expensive Disruptor
  • Handset subsidy: it’s not going anywhere
  • Summarising change in the US and French cellular markets
  • Conclusions

 

  • Figure 1: The US, a rich and high-spending market
  • Figure 2: The duopolists hold a lead, but a new challenger arises…
  • Figure 3: US carriers are pursuing diverse pricing strategies, faced with change
  • Figure 4: Something went wrong at Sprint in mid-2012
  • Figure 5: US subscriber net-adds by source
  • Figure 6: The impact of disruption – prices fall across the board
  • Figure 7: Free’s spectacular growth in subscribers – but who was losing out?
  • Figure 8: The main force of Free Mobile’s disruption didn’t fall on the carriers
  • Figure 9: Disruption in France primarily manifested itself in subscriber growth, falling ARPU, and the death of the MVNOs
  • Figure 10: T-Mobile has so far extended $3bn of credit to its smartphone customers
  • Figure 11: T-Mobile’s losses on device sales are large and increasing, driven by smartphone volumes
  • Figure 12: Size and profitability still go together in US mobile – although this conceals a lot of change below the surface
  • Figure 13: Fully-developed disruption, in France
  • Figure 14: Quality beats quantity. Sprint repeatedly outspent VZW on its network

Broadband 2.0: Mobile CDNs and video distribution

Summary: Content Delivery Networks (CDNs) are becoming familiar in the fixed broadband world as a means to improve the experience and reduce the costs of delivering bulky data like online video to end-users. Is there now a compelling need for their mobile equivalents, and if so, should operators partner with existing players or build / buy their own? (August 2011, Executive Briefing Service, Future of the Networks Stream).



Introduction

As is widely documented, mobile networks are witnessing huge growth in the volumes of 3G/4G data traffic, primarily from laptops, smartphones and tablets. While Telco 2.0 is wary of some of the headline shock-statistics about forecast “exponential” growth, or “data tsunamis” driven by ravenous consumption of video applications, there is certainly a fast-growing appetite for use of mobile broadband.

That said, many of the actual problems of congestion today can be pinpointed either to a handful of busy cells at peak hour – or, often, the inability of the network to deal with the signalling load from chatty applications or “aggressive” devices, rather than the “tonnage” of traffic. Another large trend in mobile data is the use of transient, individual-centric flows from specific apps or communications tools such as social networking and messaging.

But “tonnage” is not completely irrelevant. Despite the diversity, there is still an inexorable rise in the use of mobile devices for “big chunks” of data, especially the special class of software commonly known as “content” – typically popular/curated standalone video clips or programmes, or streamed music. Images (especially those in web pages) and application files such as software updates fit into a similar group – sizeable lumps of data downloaded by many individuals across the operator’s network.

This one-to-many nature of most types of bulk content highlights inefficiencies in the way mobile networks operate. The same data chunks are downloaded time and again by users, typically going all the way from the public Internet, through the operator’s core network, eventually to the end user. Everyone loses in this scenario – the content publisher needs huge servers to dish up each download individually. The operator has to deal with transport and backhaul load from repeatedly sending the same content across its network (and IP transit from shipping it in from outside, especially over international links). Finally, the user has to deal with all the unpredictability and performance compromises involved in accessing the traffic across multiple intervening points – and ends up paying extra to support the operator’s heavier cost base.

In the fixed broadband world, many content companies have availed themselves of a group of specialist intermediaries called CDNs (content delivery networks). These firms on-board large volumes of the most important content served across the Internet, before dropping it “locally” as near to the end user as possible – if possible, served up from cached (pre-saved) copies. Often, the CDN operating companies have struck deals with the end-user facing ISPs, which have often been keen to host their servers in-house, as they have been able to reduce their IP interconnection costs and deliver better user experience to their customers.

In the mobile industry, the use of CDNs is much less mature. Until relatively recently, the overall volumes of data didn’t really move the needle from the point of view of content firms, while operators’ radio-centric cost bases were also relatively immune from those issues as well. Optimising the “middle mile” for mobile data transport efficiency seemed far less of a concern than getting networks built out and handsets and apps perfected, or setting up policy and charging systems to parcel up broadband into tiered plans. Arguably, better-flowing data paths and video streams would only load the radio more heavily, just at a time when operators were having to compress video to limit congestion.

This is now changing significantly. With the rise in smartphone usage – and the expectations around tablets – Internet-based CDNs are pushing much more heavily to have their servers placed inside mobile networks. This is leading to a certain amount of introspection among the operators – do they really want to have Internet companies’ infrastructure inside their own networks, or could this be seen more as a Trojan Horse of some sort, simply accelerating the shift of content sales and delivery towards OTT-style models? Might it not be easier for operators to build internal CDN-type functions instead?

Some of the earlier approaches to video traffic management – especially so-called “optimisation” without the content companies’ permission or involvement – are becoming trickier with new video formats and more scrutiny from a Net Neutrality standpoint. But CDNs by definition involve the publishers, so any necessary compression or other processing can potentially be done collaboratively, rather than “transparently” and without their cooperation or willingness.

At the same time, many of the operators’ usual vendors are seeing this transition point as a chance to differentiate their new IP core network offerings, typically combining CDN capability into their routing/switching platforms, often alongside the optimisation functions as well. In common with other recent innovations from network equipment suppliers, there is a dangled promise of Telco 2.0-style revenues that could be derived from “upstream” players. In this case, the potential is somewhat easier to prove, since it would involve direct substitution of the revenues already earned from content companies by the Internet CDN players such as Akamai and Limelight. It also holds out the possibility of setting up a two-sided, content-charging business model that fits reasonably well with rules on Net Neutrality – there are few complaints about existing CDNs except from ultra-purist Neutralists.

On the other hand, telco-owned CDNs have existed in the fixed broadband world for some time, with largely indifferent levels of success and adoption. There needs to be a very good reason for content companies to choose to deal with multiple national telcos, rather than simply take the easy route and choose a single global CDN provider.

So, the big question for telcos around CDNs at the moment is “should I build my own, or should I just permit Akamai and others to continue deploying servers into my network?” Linked to that question is what type of CDN operation an operator might choose to run in-house.

There are four main reasons why a mobile operator might want to build its own CDN:

  • To lower costs of network operation or upgrade, especially in radio network and backhaul, but also through the core and in IP transit.
  • To improve the user experience of video, web or applications, either in terms of data throughput or latency.
  • To derive incremental revenue from content or application providers.
  • For wider strategic or philosophical reasons about “keeping control over the content/apps value chain”

This Analyst Note explores these issues in more detail, first giving some relevant contextual information on how CDNs work, especially in mobile.

What is a CDN?

The traditional model for Internet-based content access is straightforward – the user’s browser requests a piece of data (image, video, file or whatever) from a server, which then sends it back across the network, via a series of “hops” between different network nodes. The content typically crosses the boundaries between multiple service providers’ domains, before finally arriving at the user’s access provider’s network, flowing down over the fixed or mobile “last mile” to their device. In a mobile network, that also typically involves transiting the operator’s core network first, which has a variety of infrastructure (network elements) to control and charge for it.

A Content Delivery Network (CDN) is a system for serving Internet content from servers which are located “closer” to the end user either physically, or in terms of the network topology (number of hops). This can result in faster response times, higher overall performance, and potentially lower costs to all concerned.
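At its core, the mechanism is a cache sitting in front of an origin server: repeat requests are served locally, and each object is fetched across the Internet only once. The sketch below is a deliberately minimal illustration of that principle (our own naming; it ignores TTLs, cache eviction and request routing):

```python
# Minimal illustration of the CDN caching principle: an edge node serves repeat
# requests locally and only fetches each object from the origin once.
# fetch_from_origin stands in for the real transfer across the public Internet.

class EdgeCache:
    def __init__(self, fetch_from_origin):
        self.fetch_from_origin = fetch_from_origin
        self.store = {}                 # object name -> cached bytes
        self.origin_fetches = 0

    def get(self, name: str) -> bytes:
        if name not in self.store:      # cache miss: one trip back to the origin
            self.store[name] = self.fetch_from_origin(name)
            self.origin_fetches += 1
        return self.store[name]         # cache hit: served from the edge

def origin(name: str) -> bytes:
    return f"<content of {name}>".encode()

edge = EdgeCache(origin)
for _ in range(1_000_000):              # a million requests for the same video
    edge.get("popular_video.mp4")
print(edge.origin_fetches)              # 1 -- the upstream links carried it only once
```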

In most cases in the past, CDNs have been run by specialist third-party providers, such as Akamai and Limelight. This document also considers the role of telcos running their own “on-net” CDNs.

CDNs can be thought of as analogous to the distribution of bulky physical goods – it would be inefficient for a manufacturer to ship all products to customers individually from a single huge central warehouse. Instead, it will set up regional logistics centres that can be more responsive – and, if appropriate, tailor the products or packaging to the needs of specific local markets.

As an example, there might be a million requests for a particular video stream from the BBC. Without using a CDN, the BBC would have to provide sufficient server capacity and bandwidth to handle them all. The company’s immediate downstream ISPs would have to carry this traffic to the Internet backbone, the backbone itself would have to carry it, and finally the requesters’ ISPs’ access networks would have to deliver it to the end-points. From a media-industry viewpoint, the source network (in this case the BBC) is generally called the “content network” or “hosting network”; the destination is termed an “eyeball network”.

In a CDN scenario, all the data for the video stream has to be transferred across the Internet just once for each participating network, when it is deployed to the downstream CDN servers and stored. After this point, it is only carried over the user-facing eyeball networks, not any others via the public Internet. This also means that the CDN servers may be located strategically within the eyeball networks, in order to use its resources more efficiently. For example, the eyeball network could place the CDN server on the downstream side of its most expensive link, so as to avoid carrying the video over it multiple times. In a mobile context, CDN servers could be used to avoid pushing large volumes of data through expensive core-network nodes repeatedly.

When the video or other content is loaded into the CDN, other optimisations such as compression or transcoding into other formats can be applied if desired. There may also be various treatments relating to new forms of delivery such as HTTP streaming, where the video is broken up into “chunks” with several different sizes/resolutions. Collectively, these upfront processes are called “ingestion”.
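The chunking part of ingestion can be illustrated very simply: each rendition of the video is cut into fixed-length segments that clients request one at a time. The toy sketch below makes that concrete; real ingestion pipelines also transcode the media and generate manifest/playlist files, which are omitted here.

```python
# Toy illustration of the "chunking" step of ingestion for HTTP streaming:
# a video is split into fixed-duration segments, per rendition, so that clients
# can fetch whichever quality level suits their connection, segment by segment.

def chunk_video(duration_s: int, segment_s: int, renditions: list[str]) -> dict[str, list[str]]:
    n_segments = -(-duration_s // segment_s)          # ceiling division
    return {
        rendition: [f"video_{rendition}_seg{i:04d}.ts" for i in range(n_segments)]
        for rendition in renditions
    }

manifest = chunk_video(duration_s=600, segment_s=10, renditions=["360p", "720p", "1080p"])
print(len(manifest["720p"]))       # 60 segments of ~10s each
print(manifest["720p"][:2])        # ['video_720p_seg0000.ts', 'video_720p_seg0001.ts']
```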

Figure 1 – Content delivery with and without a CDN


Source: STL Partners / Telco 2.0

Value-added CDN services

It is important to recognise that the fixed-centric CDN business has increased massively in richness and competition over time. Although some of the players have very clever architectures and IPR in the form of their algorithms and software techniques, the flexibility of modern IP networks has tended to erode away some of the early advantages and margins. Shipping large volumes of content is now starting to become secondary to the provision of associated value-added functions and capabilities around that data. Additional services include:

  • Analytics and reporting
  • Advert insertion
  • Content ingestion and management
  • Application acceleration
  • Website security management
  • Software delivery
  • Consulting and professional services

It is no coincidence that the market leader, Akamai, now refers to itself as “provider of cloud optimisation services” in its financial statements, rather than a CDN, with its business being driven by “trends in cloud computing, Internet security, mobile connectivity, and the proliferation of online video”. In particular, it has started refocusing away from dealing with “video tonnage”, and towards application acceleration – for example, speeding up the load times of e-commerce sites, which has a measurable impact on abandonment of purchasing visits. Akamai’s total revenues in 2010 were around $1bn, less than half of which came from “media and entertainment” – the traditional “content industries”. Its H1 2011 revenues were relatively disappointing, with growth coming from non-traditional markets such as enterprise and high-tech (eg software update delivery) rather than media.

This is a critically important consideration for operators that are looking to CDNs to provide them with sizeable uplifts in revenue from upstream customers. Telcos – especially in mobile – will need to invest in various additional capabilities as well as the “headline” video traffic management aspects of the system. They will need to optimise for network latency as well as throughput, for example – which will probably not have the cost-saving impacts expected from managing “data tonnage” more effectively.

Although in theory telcos’ other assets should help – for example mapping download analytics to more generalised customer data – this is likely to involve extra complexity with the IT side of the business. There will also be additional efforts around sales and marketing that go significantly beyond most mobile operators’ normal footprint into B2B business areas. There is also a risk that an analysis of bottlenecks for application delivery / acceleration ends up simply pointing the finger of blame at the network’s inadequacies in terms of coverage. Improving delivery speed, cost or latency is only valuable to an upstream customer if there is a reasonable likelihood of the end-user actually having connectivity in the first place.

Figure 2: Value-added CDN capabilities


Source: Alcatel-Lucent

Application acceleration

An increasingly important aspect of CDNs is their move beyond content/media distribution into a much wider area of “acceleration” and “cloud enablement”. As well as delivering large pieces of data efficiently (e.g. video), there is arguably more tangible value in delivering small pieces of data fast.

There are various manifestations of this, but a couple of good examples illustrate the general principles:

  • Many web transactions are abandoned because websites (or apps) seem “slow”. Few people would trust an airline’s e-commerce site, or a bank’s online interface, if they’ve had to wait impatiently for images and page elements to load, perhaps repeatedly hitting “refresh” on their browsers. Abandoned transactions can be directly linked to slow or unreliable response times – typically a function of congestion either at the server or various mid-way points in the connection. CDN-style hosting can accelerate the service measurably, leading to increased customer satisfaction and lower levels of abandonment.
  • Enterprise adoption of cloud computing is becoming exceptionally important, with both cost savings and performance enhancements promised by vendors. Sometimes, such platforms will involve hybrid clouds – a mixture of private (internal) and public (Internet) resources and connectivity. Where corporates are reliant on public Internet connectivity, they may well want to ensure as fast and reliable a service as possible, especially in terms of round-trip latency. Many IT applications are designed to be run on ultra-fast company private networks, with a lot of “hand-shaking” between the user’s PC and the server. This process is very latency-dependent, and as companies also mobilise their applications, the additional overhead time in cellular networks may cause significant problems.

Hosting applications at CDN-type cloud acceleration providers achieves much the same effect as for video – they can bring the application “closer”, with fewer hops between the origin server and the consumer. Additionally, the CDN is well-placed to offer additional value-adds such as firewalling and protection against denial-of-service attacks.
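The value of bringing an application “closer” is easiest to see as a round-trip calculation: for chatty, handshake-heavy applications, total wait time is roughly the number of round trips multiplied by the latency of each one. The figures below are assumptions chosen to show the shape of the effect, not measurements.

```python
# Illustrative only: how round-trip latency dominates "chatty" application performance.
# All numbers are assumptions for the sake of the arithmetic.

def wait_time_ms(round_trips: int, rtt_ms: float) -> float:
    return round_trips * rtt_ms

round_trips = 40            # handshakes/requests needed to complete a transaction page
rtt_far_ms = 150            # origin reached across the public Internet, e.g. over cellular
rtt_near_ms = 60            # same application hosted at an in-network acceleration node

print(wait_time_ms(round_trips, rtt_far_ms))    # 6000 ms -- likely to be abandoned
print(wait_time_ms(round_trips, rtt_near_ms))   # 2400 ms -- same app, fewer/shorter hops
```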

To read the 25-page report in full, including the following additional content…

  • How do CDNs fit with mobile networks?
  • Internet CDNs vs. operator CDNs
  • Why use an operator CDN?
  • Should delivery mean delivery?
  • Lessons from fixed operator CDNs
  • Mobile video: CDNs, offload & optimisation
  • CDNs, optimisation, proxies and DPI
  • The role of OVPs
  • Implementation and planning issues
  • Conclusion & recommendations

… and the following additional charts…

  • Figure 3 – Potential locations for CDN caches and nodes
  • Figure 4 – Distributed on-net CDNs can offer significant data transport savings
  • Figure 5 – The role of OVPs for different types of CDN player
  • Figure 6 – Summary of Risk / Benefits of Centralised vs. Distributed and ‘Off Net’ vs. ‘On-Net’ CDN Strategies


Organisations and products referenced: 3GPP, Acision, Akamai, Alcatel-Lucent, Allot, Amazon Cloudfront, Apple’s Time Capsule, BBC, BrightCove, BT, Bytemobile, Cisco, Ericsson, Flash Networks, Huawei, iCloud, ISPs, iTunes, Juniper, Limelight, Netflix, Nokia Siemens Networks, Ooyala, OpenWave, Ortiva, Skype, smartphone, Stoke, tablets, TiVo, Vantrix, Velocix, Wholesale Content Connect, Yospace, YouTube.

Technologies and industry terms referenced: acceleration, advertising, APIs, backhaul, caching, CDN, cloud, distributed caches, DNS, Evolved Packet Core, eyeball network, femtocell, fixed broadband, GGSNs, HLS, HTTP streaming, ingestion, IP network, IPR, laptops, LIPA, LTE, macro-CDN, micro-CDN, middle mile, mobile, Net Neutrality, offload, optimisation, OTT, OVP, peering proxy, QoE, QoS, RNCs, SIPTO, video, video traffic management, WiFi, wireless.

Apple iCloud/iOS: Killing SMS Softly?

Summary: Our analysis of how Apple’s iCloud, iOS5, and MacOS developments build value and control for Apple’s digital platform, and their consequences for other parts of the digital ecosystem, including the impact of iMessage on text messaging. (June 2011, Executive Briefing Service)

 



Creating effective commercial strategies in the digital ecosystem, including learning from and dealing with major players like Apple and Google, is a key theme of Telco 2.0’s ‘Best Practice Live!’, a free global online event on 28-29 June 2011, as well as of other Telco 2.0 research and analysis.


Introduction

 

Apple provided a glimpse into some of the upcoming new features of its key software platforms, iOS and MacOS, at its Worldwide Developers Conference (WWDC) in June 2011. It also announced its much-anticipated move into providing cloud-based services and away from using the PC as the controlling hub.

iOS and MacOS are Apple’s key software assets – the assets which add soul to Apple’s key money-spinning devices (iPhone, iPad and Mac). iCloud is the first iteration of the missing third leg – the software that ties all the devices together seamlessly. Together, iOS, MacOS and iCloud are both the differentiator for the consumer and the barrier to entry for competitors. They are the soul of the overall Apple platform.

The Apple platform is evolving, and its new features will impact many players in the value chain: namely the various distributors (including mobile operators), aggregators, content creators and, of course, end consumers.

Nearly every main feature launched seems to support our general theory that Apple is squeezing value from the aggregators and distributors and pushing that value towards the device manufacturer (i.e. itself).

Contents

The rest of this webpage covers:

  • iMessage – killing SMS softly? [NB There is additional analysis of this in the full Briefing]
  • iTunes in the Cloud – getting one up on Amazon
  • Notifications – Apple robs Windows Phone and Android advantage


The full Briefing, which contains the complete section on iMessage, also includes the following sections:

  • The impact of iMessage on SMS revenues, and telco defence strategies
  • MacOS Software – Apple shuts out other retailers
  • Newsstand – Appeasing Publishers (to a degree)
  • MobileMe – just ‘making it work’ …and building the moat
  • iCloud and Video Services – holding fire for now
  • Activation – Cutting the PC cord
  • Photo Stream – yes, but why?
  • Data Centre Economics – making a start
  • Conclusions – Lessons from Apple’s Strategy

 

1. iMessage – killing SMS softly?

iMessage, now the primary mechanism for the platform’s SMS and MMS features, has been radically re-engineered, with messages between Apple platform consumers no longer carried on the mobile networks’ SMS and MMS infrastructure. All of this happens transparently to the consumer, who does not need to know whether their recipients are also using Apple devices – the message routing is determined by the Apple platform.
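Conceptually, that routing decision is a registry lookup: if the recipient’s address is known to Apple’s service, deliver over the data channel; otherwise fall back to carrier SMS/MMS. The sketch below illustrates the idea only – it is not Apple’s implementation, and the registry and addresses are hypothetical.

```python
# Conceptual sketch of the routing decision described above -- not Apple's code.
# If the recipient is registered with the platform's messaging service, the message
# travels over the data channel (free to the user); otherwise it falls back to SMS/MMS.

APPLE_REGISTERED = {"+15551234567", "alice@example.com"}   # hypothetical registry lookup

def route_message(recipient: str) -> str:
    if recipient in APPLE_REGISTERED:
        return "iMessage (data channel, blue bubble)"
    return "SMS/MMS via the mobile network (green bubble)"

print(route_message("+15551234567"))   # iMessage (data channel, blue bubble)
print(route_message("+15559876543"))   # SMS/MMS via the mobile network (green bubble)
```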

iMessage is great for consumers, as these on-net messages are free, but dreadful for MNOs, as they will all probably take a hit on messaging revenues. Apple is competing with the MNOs’ core services, and it has even made it easier for consumers to see the value proposition by colouring the bubbles for on-net and off-net messages differently.

Apple has been quite clever in the timing of this feature’s release. Applications such as WhatsApp have already been blamed by some MNOs for declining messaging revenues – in particular KPN, which has recently experienced a very significant impact on revenues. Apple is effectively doing nothing different from them, just improving the consumer experience by making it easier to send and receive off-net messages.

In terms of platform economics, Apple is adding value to the consumer via the device and squeezing value from the mobile network distributors. We believe it is only a matter of time before Apple starts offering voice features. This, together with its video-conferencing application FaceTime, leaves mobile operators staring into a future where they will only be selling data access services.

[NB There’s further analysis of these impacts and defences against them in the full Briefing.]

2. iTunes in the Cloud – getting one up on Amazon

The key value proposition of “iTunes in the Cloud” is that all songs historically purchased through iTunes are available for download wirelessly to any Apple device at no extra cost, over either a WiFi or 3G connection, as long as the consumer remains within their data tier. The user has control over which songs they want to download to which devices, thus avoiding a situation where all the storage on an iPhone or iPad is consumed by a vast collection.

The level of consumer control is such that a consumer can even download a previously purchased album for a specific journey and then remove it after listening to save space. New purchases can be downloaded immediately to all devices, or selectively, as with historical purchases. This feature definitely improves the Apple platform, especially compared to alternative music retailers such as Amazon.

Currently, Apple users can purchase songs or albums from Amazon and they will be automatically added to iTunes on the laptop; then, on synchronization, the songs transfer to the iPhone or iPad. Until now, buying songs through the Amazon store on the PC was as simple as buying through the Apple iTunes store, and Amazon has been slowly gaining market share in music downloads, because it competes on price and often offers songs cheaper than the Apple iTunes store does. With “iTunes in the Cloud”, Amazon may still be able to beat the Apple iTunes Store on price, but its user experience is now deficient by comparison.

We seriously doubt that Apple will allow third-party retailers access to its iTunes in the Cloud service, and argue that Apple is using its platform to improve the position of its retail arm relative to third parties.

 

The other service offered, iTunes Match, also adds incredible value to the platform. Apple has negotiated a deal with the major record labels to offer consumers the opportunity to add tracks from their collections that were not purchased via the Apple store to the iTunes in the Cloud service, for a cost of $25/year. Reputedly, Apple is sharing this revenue 70:30 with the record labels and has paid a huge advance of US$100m-US$150m for the USA rights alone. Apple has set the benchmark price for cloud music licensing and has set the bar so high that it is hard to see new entrants having sufficient funding to gain similar licenses. Even Amazon or Google will be questioning whether they can generate enough money from music to justify the price of the licenses.

At the launch event, Steve Jobs presented the use-case of customers who had ripped their physical CDs. The use-case more discussed in the media is people who have obtained their songs through illegal means, either via P2P networks or friend sharing, and who effectively now have a US$25/annum service which legitimizes not only their past behaviour, but potentially also their future behaviour. The third use-case is people who buy cheaper digital music from other digital retailers, e.g. Amazon, and now have an option to pay an ongoing fee to add the simplicity of the iTunes in the Cloud service. Effectively, the usability advantage of the Apple platform is priced at US$25/annum, which means this use-case only makes sense for heavy ongoing purchasers of music.

Apple didn’t face the same licensing issue from the publishers and has added a very similar service for all books bought from the iBookstore, with the added feature that bookmarks are synchronized and shared across devices. Overall, Apple has built very compelling cloud services for music, books and magazines and erected larger barriers for its competitors. If iMessage shows Apple leveraging interconnection with other networks when it suits it, iTunes and the iBookstore show Apple adding features which not only make interconnect more difficult for other companies, but also firmly close previously open doors.

3. Notifications – Apple robs Windows Phone and Android advantage

 

A notification is the mechanism by which consumers are alerted to events – for instance, an incoming email or SMS. It is the key mechanism by which 3rd-party developers communicate with their users – for instance, in a sports application a notification can alert the user that their football team has scored a goal. Apple has completely revamped its notifications user experience with the addition of a notification centre.

Apple has pushed over 100 billion notifications to iPhones and iPads, which presumably partly accounts for the high consumption of signalling capacity that many mobile operators have been complaining about.

It also shows that Apple is quick to address deficiencies in its platform compared to others. This is a key feature of platform economics; you sometimes have to invest to play catch-up. It also highlights the risks for developers of building solutions which address platform weaknesses – yesterday’s successful application is built into tomorrow’s platform.

Interestingly, an alternative notification application was never approved by Apple for its App Store and instead went into the wilds of only being available on jailbroken iPhones. Apple’s new notification centre bears a striking resemblance to the non-approved one. Another example of this approach is reminders, where a plethora of applications were already being sold in the App Store. Apple has added a feature called Reminders which is part of the initial application load, and which effectively destroys the market for third-party applications. This in some ways looks like a repeat of the Microsoft strategy with Windows and Internet Explorer, which got Microsoft into such trouble with regulators across the globe.


Organisations, company types, areas, people and industry models referenced: Apple, platform, Amazon, Cloud, Google, strategy, Vodafone, WhatsApp, O2, Orange, publishers, Steve Jobs, WWDC, ARPU, Blackberry, Carphone Warehouse, Everything Everywhere, MNO, Prepay, record labels, Telefonica, T-Mobile, Viber.

Technologies and products referenced: iPad, iPhone, PC, Windows, iCloud, iTunes, iMessage, Android, iOS, messaging, MMS, MobileMe, SMS, voice, WiFi, Windows Phone, 3G, Activation, AppStore, Data Centre, NewsStand, Notifications, Photo Stream, Video, BlackBerry Messenger, Facetime, Freebee, Gmail, GSM, HTML5, iBookstore, Internet Explorer, Microsoft Live, P2P, Photostream, RCS-e, Snow Leopard, UltraViolet, VoIP, Windows7.