5G: The spectrum game is changing – but how to play?

Introduction

Why does spectrum matter?

Radio spectrum is a key “raw material” for mobile networks, together with evolution of the transmission technology itself, and the availability of suitable cell-site locations. The more spectrum is made available for telcos, the more capacity there is overall for current and future mobile networks. The ability to provide good coverage is also determined largely by spectrum allocations.

Within the industry, we are accustomed to costly auction processes, as telcos battle for tranches of frequencies to add capacity, or support new generations of technology. In contrast, despite the huge sums telcos pay for spectrum allocations, most people have very little awareness of which bands their phones support, other than perhaps that they can use ‘mobile/cellular’ and WiFi.

Most people, even in the telecoms industry, don’t grasp the significance of particular numbers of MHz or GHz involved (Hz = number of cycles per second, measured in millions or billions). And that is just the tip of the jargon and acronym iceberg – a full discussion of mobile RAN (radio access network) technology involves different sorts of modulation, multiple antennas, propagation metrics, path loss (in decibels, dB) and so forth.

Yet as 5G pulls into view, it is critical to understand the process by which new frequencies will be released by governments, or old ones re-used by the mobile industry. To deliver the much-promised peak speeds and enhanced coverage of 5G, big chunks of frequencies are needed. Yet spectrum has many other uses besides public mobile networks, and battles will be fierce about any reallocations of incumbent users’ rights. The broadcast industry (especially TV), satellite operators, government departments (notably defence), scientific research communities and many other constituencies are involved here. In addition, there are growing demands for more bandwidth for unlicensed usage (as used for WiFi, Bluetooth and other low-power IoT networks such as SigFox).

Multiple big industries – usually referred to by the mobile community as “verticals” – are flexing their own muscles as well. Energy, transport, Internet, manufacturing, public safety and other sectors all see the benefits of wireless connectivity – but don’t necessarily want to involve mobile operators, nor subscribe to their preferred specifications and standards. Many have huge budgets, a deep legacy of systems-building and are hiring mobile specialists.

Lastly, parts of the technology industry are advocates of more nuanced approaches to spectrum management. Rather than dedicate bands to single companies, across whole countries or regions, they would rather develop mechanisms for sharing spectrum – either on a geographic basis, or by allowing some form of “peaceful coexistence” where different users’ radios behave nicely together, instead of creating interference. In theory, this could improve the efficient use of spectrum – but it adds complexity, and perhaps introduces so much extra competition that willingness to invest suffers.

Which bands are made available for 5G, on what timescales, in what type of “chunks”, and the authorisation / licensing schemes involved, all define the potential opportunity for operators in 5G – as well as the risks of disruption, and (for some) how large the window is to fully-monetise 4G investments.

The whole area is a minefield to understand – it brings together the hardest parts of wireless technology to grasp, along with impenetrable legal processes, and labyrinthine politics at national and international levels. And ideally, one would somehow layer on consideration of end-user needs, and economic/social outcomes as well.

Who are the stakeholders for spectrum?

At first sight, it might seem that spectrum allocations for mobile networks ought to be a comparatively simple affair, with governments deciding on tranches of frequencies and an appropriate auction process. MNOs can bid for their desired bands, and then deploy networks (and, perhaps, gripe about the costs afterwards).

The reality is much more complex. A later section describes some of the international bureaucracy involved in defining appropriate bands, which can then be doled out by governments (assuming they don’t decide to act unilaterally). But even before that, it is important to consider which organisations want to get involved in the decision process – and their motivations, whether for 5G or other issues that are closer to their own priorities, which intersect with it.

Governments have a broad set of drivers and priorities to reconcile – technological evolution of the economy as a whole, the desire for a competitive telecoms industry, exports, auction receipts – and the protection of other spectrum user groups such as defence, transport and public safety. Different branches of government and the public administration have differing views, and there may sometimes be tussles between the executive branch and various regulators.

Much the same is true at regional levels, especially in Europe, where there are often disagreements between the European Commission, the European Parliament, the regulators’ groups and 28 different EU nations’ parliaments (plus another 23 non-EU nations).

Even within the telecoms industry there are differences of opinion – some operators see 5G as an urgent strategic priority, which can help differentiation and reduce the costs of existing infrastructure deployments. Others are still in the process of rolling out 4G networks and want to ensure that those investments continue to have relevance. There are variations in how much credence is given to projections of IoT growth – and, even there, whether breathing room is needed for 4G cellular IoT variants such as NB-IoT, which is yet to be deployed even though its putative 5G successor is already being discussed.

The net result is many rounds of research, debate, consultation, disagreement and (eventually) compromise. Yet in many ways, 5G is different from 3G and 4G, especially because many new sectors are directly involved in helping define the use-cases and requirements. In many ways, telecoms is now “too important to be left to the telcos”, and many other voices will therefore need to be heard.

 

Contents

  • Executive Summary
  • Introduction
  • Why does spectrum matter?
  • Who are the stakeholders for spectrum?
  • Spectrum vs. business models
  • Does 5G need spectrum harmonisation as much as 4G?
  • Spectrum authorisation types & processes
  • Licensed, unlicensed and shared spectrum
  • Why is ITU involved, and what is IMT spectrum?
  • Key bands for 5G
  • Overview
  • 5G Phase 1: just more of the same?
  • mmWave beckons – the high bands >6GHz
  • Conclusions

 

  • Figure 1 – 5G spectrum has multiple stakeholders with differing priorities
  • Figure 2 – Multi-band support has improved hugely since early 4G phones
  • Figure 3 – A potential 5G deployment & standardisation timeline
  • Figure 4 – ITU timeline for 5G spectrum harmonisation, 2014-2020
  • Figure 5 – High mmWave frequencies (e.g. 28GHz) don’t go through solid walls
  • Figure 6 – mmWave brings new technology and design challenges

eSIM: How Much Should Operators Worry?

What is eSIM? Or RSP?

There is a lot of confusion around what eSIM actually means. While the “e” is often just assumed to stand for “embedded”, this is only half the story – and one which various people in the industry are trying to change.

In theory the term “eSIM” refers only to the functionality of “remote provisioning”; that is, the ability to download an operator profile to an in-market SIM (and also potentially switch between profiles or delete them). This contrasts with the traditional methods of pre-provisioning specific, fixed profiles into SIMs during manufacture. Most SIMs today have a particular operator’s identity and encryption credentials set at the factory. This is true of both the familiar removable SIM cards used in mobile phones, and the “soldered-in” form used in some M2M devices.

In other words, the original “e” was a poor choice – it was intended to stand for “enhanced”, “electronic” or just imply “new and online” like eCommerce or eGovernment. In fact, the first use in 2011 was for eUICC – the snappier term eSIM only emerged a couple of years later. UICCs (Universal Integrated Circuit Cards) are the smart-card chips themselves, that are used both in SIMs and other applications, for example, bank, transport and access-security cards. Embedded, solderable SIMs have existed for certain M2M uses since 2010.

In an attempt to separate out the “form factor” (removable vs. embedded) aspect from the capability (remote vs. factory provisioned), the term RSP sometimes gets used, standing for Remote SIM Provisioning. This is the title of GSMA’s current standard. But unsurprisingly, the nicer term eSIM is hard to dislodge in observers’ consciousness, so it is likely to stick around. Most now think of eSIMs as having both the remote-provisioning function and an embedded non-removable form-factor. In theory, we might even get remote-provisioning for removable SIMs (the 2014 Apple SIM was a non-standard version of this).

Figure 1: What does eSIM actually mean?


Source: Disruptive Analysis

This picture is further muddied by different sets of GSMA standards for M2M and consumer use-cases at present, where the latter involves some way for the end-user to choose which profiles to download and when to activate them – for example, linking a new cellular tablet to an existing data-plan. This is different to a connected car or an industrial M2M use-case, where the manufacturer designs in the connectivity, and perhaps needs to manage whole “fleets” of eSIMs together. The GSMA M2M version of the standards was first released in 2013, and the first consumer specifications were only released in 2016. Both are being enhanced over time, and there are intentions to develop a converged M2M/consumer specification, probably in H2 2017.
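To make the remote-provisioning idea concrete, here is a minimal sketch of the profile lifecycle described above. It is a toy model: the class and method names below are our own, not GSMA RSP interface names, although the operations (download, enable/switch, delete) mirror those the specifications define.

```python
# Toy model of the eSIM (eUICC) profile lifecycle described above.
# Names are illustrative, not GSMA RSP API names; the real specifications
# define equivalent operations between the eUICC and the operator's
# provisioning servers.

class EUICC:
    """A chip that can hold several operator profiles, one active at a time."""

    def __init__(self):
        self.profiles = {}        # profile_id -> operator name
        self.enabled = None       # at most one enabled profile

    def download_profile(self, profile_id, operator):
        # Remote provisioning: the profile arrives over the air,
        # instead of being fixed in the factory.
        self.profiles[profile_id] = operator

    def enable_profile(self, profile_id):
        # Switching to a new profile implicitly disables the current one.
        self.enabled = profile_id

    def delete_profile(self, profile_id):
        if self.enabled == profile_id:
            self.enabled = None
        del self.profiles[profile_id]

# A consumer-style flow: link a new tablet to an existing plan,
# then later switch to a local operator while travelling.
chip = EUICC()
chip.download_profile("prof-home", "Operator A")
chip.enable_profile("prof-home")
chip.download_profile("prof-travel", "Operator B")
chip.enable_profile("prof-travel")
```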

eSIMs vs. Soft-SIMs / vSIMs

This is another area of confusion – some people confuse eSIMs with the concept of a “soft-SIM” (also called virtual SIMs/vSIMs). These have been discussed for years as a possible option for replacing physical SIM chips entirely, whether remotely provisioned, removable/soldered or not. They use purely software-based security credentials and certificates, which could be based in the “secure zone” of some mobile processors.

However, the mobile industry has strongly pushed back on the Soft-SIM concept and its standardisation, for both security reasons and (implicit) commercial concerns. Despite this we are aware of at least two Asian handset vendors that have recently started using virtual SIMs for roaming applications.

For now, soft-SIMs appear to be far from the standards agenda, although there is definitely renewed interest. They also require a secondary market in “profiles”, which is at a very early stage and not receiving much industry attention at the moment. STL thinks that there is a possibility that we could see a future standardised version of soft-SIMs and the associated value-chain and controls, but it will take a lot of convincing for the telco industry (and especially GSMA) to push for it. It might get another nudge from Apple (which indirectly catalysed the whole eSIM movement with a 2010 patent), but as discussed later that seems improbable in the short term.

Multi-IMSI: How does multi-IMSI work?

It should also be noted that multi-IMSI (International Mobile Subscriber Identity) SIMs are yet another category here. Already used in various niches, these hold several subscriber identities on a single SIM – and hence can present different numbers and network relationships, for example in different geographies. Combined with licences in different countries or multiple MVNO arrangements, this allows various clever business models, all anchored in one central operator’s systems. Multi-local operators such as Truphone exploit this, as does Google in its Fi service, which blends the T-Mobile US and Sprint networks together. It is theoretically possible to combine multi-IMSI functionality with eSIM remote provisioning.
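As a hedged illustration of the mechanism (the selection logic and placeholder identity values below are hypothetical, not any vendor’s implementation), a multi-IMSI SIM typically carries a small applet that picks which stored identity to register with, based on where the device finds itself:

```python
# Illustrative sketch of multi-IMSI selection (not any vendor's actual
# applet logic): a SIM holding several IMSIs picks whichever identity
# gives a "home" relationship in the current country, falling back to
# a default identity (and hence roaming rates) elsewhere.

IMSI_TABLE = {
    "GB": "234xx...",   # identity anchored in a UK licence/MVNO deal
    "US": "310xx...",   # identity anchored with a US partner
}
DEFAULT_IMSI = "234xx..."

def select_imsi(country_code: str) -> str:
    return IMSI_TABLE.get(country_code, DEFAULT_IMSI)

print(select_imsi("US"))  # attaches with the US identity, billed as local
print(select_imsi("FR"))  # no local identity: falls back to default, roams
```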

eSIM use-cases, and what stakeholders hope to gain

There are two sets of use-cases and related stakeholder groups for eSIMs:

  • Devices that already use cellular radios & SIMs today. This group can be sub-divided into:
    • Mobile phones
    • M2M uses (e.g. connected cars and industrial modules)
    • Connected devices such as tablets, PC dongles and portable WiFi hotspots.
  • Devices that do not have cellular connectivity currently; this covers a huge potential range of IoT devices.

Broadly speaking, it is hoped that eSIM will improve the return on investment and/or efficiency of existing cellular devices and services, or help justify and enable the inclusion of cellular connections in new ones. Replacing existing SIMs is (theoretically) made easier by scrutinising and improving existing channels and business processes – while new markets (again theoretically) offer win-win scenarios where there is no threat of disruption to existing business models.

These two stakeholder groups want different benefits from eSIM. Mobile operators want:

  • Lower costs for procuring and distributing SIMs.
  • Increased revenue from adding more cellular devices and related services, which can be done incrementally with an eSIM, e.g. IoT connectivity and management.
  • Better functionality and security compared to competing non-cellular technologies.
  • Limited risk of disintermediation, increased churn or OEMs acting as gatekeepers.

And device manufacturers want:

  • To reduce their “bill of material” (BoM) costs and number of design compromises compared to existing removable SIMs
  • To sell more phones and other connected devices
  • To provide better user experience, especially compared to competing OEMs / ecosystems
  • To create additional revenue streams related to service connectivity
  • To upgrade existing embedded (but non-programmable) soldered SIMs for M2M

The truth, however, is more complex than that – there needs to be clear proof that eSIM improves existing devices’ costs or associated revenues, without introducing extra complexity or risk. And new device categories need to justify the addition of the (expensive, power-consuming) radio itself, as well as choosing SIM vs. eSIM for authentication. In both cases, the needs and benefits for cellular operators and device OEMs (plus their users and channels) must coincide.

There are also many other constituencies involved here: niche service providers of many types, network equipment and software suppliers, IoT specialists, chipset companies, enterprises and their technology suppliers, industry associations, SIM suppliers and so forth. In each case there are both incumbents, and smaller innovators/disruptors trying to find a viable commercial position.

This brings in many “ifs” and “buts” that need to be addressed.

Contents

  • Executive Summary
  • Introduction: What is eSIM? Or RSP?
  • Not a Soft-SIM, or multi-IMSI
  • What do stakeholders hope to gain?
  • A million practical problems
  • So where does eSIM make sense?
  • Phones or just IoT?
  • Forecasts for eSIM
  • Conclusion 

 

  • Figure 1: What does eSIM actually mean?
  • Figure 2: eSIM standardisation & industry initiatives timeline
  • Figure 3: eSIM shipment forecasts, by device category, 2016-2021

Arete Research: Getting to a Billion Smartphones in 2013

This is an extract from a report by Arete Research, a Telco 2.0™ partner specialising in investment analysis. The views in this article are not intended to constitute investment advice from Telco 2.0™ or STL Partners. We are reprinting Arete’s analysis to give our customers some additional insight into how some investors see the telecoms market.

This report can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service using the links below.


‘Growing the Mobile Internet’ and ‘Fostering Vibrant Ecosystems: Lessons from Apple’ are also key session themes at our upcoming ‘New Digital Economics’ Brainstorms (Palo Alto, 4-7 April and London, 11-13 May). Please use the links or email contact@telco2.net or call +44 (0) 207 247 5003 to find out more.


A billion Smartphones by 2013?

 

In August ’05 we [Arete Research] published A Billion Handsets in ’07?, where we argued that the coming of low-cost ($25 BoM) handsets would open a new segment and take the market to unimaginable volumes (consensus at the time saw 6% growth to ~800m units in ’07). By ’07 the global handset market hit 1.2bn units, Samsung passed Motorola in volumes, and Mediatek began supplying many new entrants. Now we see the same pattern being repeated in smartphones: Western vendors staked out the first 350m units, and Apple is on track to be the #1 by value share in 1Q11. We see $80 BoM cost smartphones by YE’11, sparking rapid growth and further dramatic shifts in the mobile device landscape.

The lure of the largest consumer electronics end-market in the world, the emergence of Chinese vendors with global ambitions, the lowering of barriers to entry, and the value attributed to connecting wirelessly to the Internet will drive significant uptake of smartphones and tablets in ’12. It is no longer crazy to talk about “A Billion Smartphones in 2013”.

Table 1: Smartphone and Tablet Shipments by Region

 

             ’10     %     ’11E    %     ’12E    %     ’13E      %     ’14E      %     ’15E      %
Europe       87m     28%   142m    45%   220m    66%   272m      80%   286m      83%   298m      85%
N. America   90m     47%   140m    67%   166m    75%   181m      80%   197m      85%   210m      90%
Asia         94m     16%   130m    19%   230m    30%   301m      37%   377m      43%   463m      49%
MEA          20m     8%    34m     11%   98m     26%   151m      38%   213m      50%   268m      60%
LatAm        26m     13%   38m     18%   85m     34%   177m      67%   203m      74%   219m      78%
Total        316m    21%   484m    28%   800m    41%   1,083m    53%   1,276m    59%   1,458m    65%
ASPs         $324          $304          $215          $182            $166            $155
Tablets      17m           57m           102m          150m            175m            218m

(% = smartphones as a share of the region’s total handset units)

Source: Arete Research estimates

Will This Bring Growth?

In ’10 developed market demand lifted global ASPs after relentless declines. Apple, HTC and RIM took 9% of value share at Nokia’s expense (down 11% from ’08 to ’10). Even with incremental smartphone demand driven by emerging markets from 2H11, it is clear consumers are willing to spend more for a smartphone. This should soften historic ASP declines: we see the overall handset industry rising from $182bn in ’10 to $214bn in ’11 (+18%) and $223bn in ’12 (+4%). Smartphone units should reach 480m in ’11 and 800m in ’12. We estimate smartphones will be on a 1bn annual run rate by 4Q12.

Wireless Logic Semis Could Double. The rise of mass market smartphones will raise the semis content within device BoMs. Apps processors, connectivity and memory (mobile DRAM and flash) will rise in the mix as display prices drop. We think the $9.5bn logic semis market can rise to $18bn+ in ’12. We also see steep price declines in tablets as traditional PC and mobile device ecosystems battle to control new low-cost computing platforms.

Memory Madness. We think combined DRAM and NAND demand will lead mobile device memory sales to nearly triple from ~$9bn in ’10 to $24bn in ’12, offsetting weak PC demand.

Telcos: Dummies No More. Developed markets operators will offer bundled tablets from mid-’11, while low-cost computing will reach emerging markets like India. Smartphones offer a vast upselling opportunity for operator dataplans.

Smartphones: Mix and Margins

We see the same failure of imagination around smartphone volumes that we saw and wrote about in traditional handsets: industry and financial analysts simply cannot break the habit of forecasting mid-single-digit annual growth. Yet it is clear that adding touchscreens and better browsers, as well as rolling out 3G infrastructure in emerging markets, significantly boosts the marginal utility of a mobile device to end consumers.

In 2010 consumers showed a willingness to pay more for devices, because they were worth more to them. Although the apps developed for emerging markets will be different from those well-known in the US and Europe, we have little doubt that the openness of smartphone platforms will allow new usage patterns there as well. The software flexibility and gradual release of more language packs for low-end smartphones only widens the addressable base for these devices. While we appreciate some people think emerging-markets consumers will not want a “good enough” smartphone, we also cannot model 200-300m users jumping from sub-$100 ASPs to over $200; we think a smaller-display touchscreen model with limited onboard memory would find broad mass-market acceptance.

Table 2: Differing Needs

 

           Pop. (m)   Pen.   Fixed-line broadband per 100 people   GDP per capita (current US$)
China      1,332      60%    6.29                                  6,890
India      1,177      58%    0.46                                  3,250
Pakistan   178        58%    0.10                                  2,680
Nigeria    146        55%    0.04                                  2,070
US         310        98%    24.02                                 45,640
UK         61         138%   28.13                                 37,230
Germany    82         128%   27.52                                 36,780

(Pen. = mobile penetration)

Source: World Bank data from ’08 and ’09; ITU

Table 2 shows how the smartphone is poised to provide the primary method of Internet access in markets where there is negligible penetration of fixed-line services. And Table 3 shows how the Middle East and Africa are joining Asia in absorbing the largest share of overall device units over time, even as value share remains concentrated in N. America and Europe. The time is rapidly coming for local brands to take centre stage in these regions. Here we see names like TCL, TianYu, CoolPad, Huawei, ZTE and Micromax as but the tip of the iceberg. This will only increase the pressure on the old group of five traditional handset vendors (Nokia, Motorola, Samsung, LGE and SonyEricsson) that commanded ~80% market share from ’00 to ’07, before the rise of Apple, RIM and HTC altered industry dynamics irrevocably.

Table 3: Majority of Units in Asia & MEA ’09-’12E

 

             ’09      ’10E     yoy    ’11E     yoy    ’12E     yoy
Europe       267m     304m     14%    317m     4%     331m     4%
  (share)    20%      20%             18%             17%
N.Am         180m     192m     7%     209m     9%     220m     5%
  (share)    13%      13%             12%             11%
Asia         526m     596m     13%    675m     13%    760m     13%
  (share)    39%      39%             39%             39%
MEA          202m     243m     20%    306m     26%    369m     21%
  (share)    15%      16%             18%             19%
LatAm        158m     193m     22%    218m     13%    251m     15%
  (share)    12%      11%             13%             13%
Global       1.33bn   1.53bn   15%    1.73bn   13%    1.93bn   13%

(share rows show each region’s proportion of global handset units)

Source: Arete Research estimates

How do we get to an $80 BoM smartphone, which supports a $100 trade price? The bill of materials below shows the key components and what changes in low end smartphones. Bear in mind this is a device with a <3″ display and limited on-board memory; that said, retail prices for 8GB microSD cards are c. $6. We think this will bring smartphones to the mass market in place of featurephones, with emerging markets consumers getting open software platforms that allow customisation via locally relevant applications.

Our estimates by region for smartphone penetration are seen in Table 1 for ’10-’15. We see the industry reaching a 1bn unit run rate by 4Q12, and exceeding this in 2013. Effectively we think that half of overall global volumes will be smartphones in 2013, i.e. users will get a software upgradable device at all but the lowest price points, with the rare exception of niche voice-centric products for developed markets (for example, for older users).

Figure 1: Getting to an $80 BoM Smartphone?

Source: Arete Research estimates
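The figure itself is not reproduced here, so as a purely illustrative stand-in (every line item and dollar value below is our hypothetical assumption, not Arete’s Figure 1 estimate), this is how the component categories mentioned in the text might roll up to an $80 BoM supporting a $100 trade price:

```python
# Purely illustrative cost roll-up for a low-end smartphone BoM.
# These line items and dollar figures are hypothetical stand-ins, NOT
# Arete's Figure 1 estimates; they only show how the categories named
# in the text (display, apps processor, connectivity, memory) might
# sum to roughly an $80 BoM supporting a $100 trade price.

bom = {
    "display (<3in touchscreen)": 15.0,
    "apps processor + baseband":  18.0,
    "connectivity (RF, WiFi/BT)":  8.0,
    "memory (DRAM + NAND)":       12.0,
    "battery, camera, mechanics": 17.0,
    "assembly, test, licensing":  10.0,
}
total = sum(bom.values())
trade_price = 100.0
print(f"BoM total: ${total:.0f}")                            # -> $80
print(f"Implied gross margin: {1 - total/trade_price:.0%}")  # -> 20%
```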

The ’10 surge in developed markets smartphones drove large value share redistribution, while also sustaining global handset ASPs. Apple, RIM and HTC saw their ASPs actually increase in 2H10, while the overall industry ASPs rose 5% yoy in ’10. We believe that in ’11 we will see further modest ASP rises, as half of the incremental smartphone units will still come from N. America and Europe, and supply remains tight in 1H11. As emerging markets start driving growth in ’12, we expect ASPs will drop, but remain above ’09 levels (part of this is influenced by Apple in the mix). We reckon smartphones will constitute 84% of industry value share in ’13 (based on 53% of units), up from 56% in ’10 (from just 21% of units).

This does not account for the additional value that integrated handset vendors, and service providers will capture from a rising smartphone penetration, through content and other services. This dramatic shift in market value towards smartphones is shaking up what had from ’00 to ’07 been a cosy oligopoly in which the top five vendors consistently took ~80% of volume and value.
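As a quick sanity check, the ’10 value-share claim can be reproduced from the report’s own numbers (Table 1 units and ASPs, plus the $182bn ’10 industry value quoted earlier). The implied ’13 industry value shown at the end is our inference from those figures, not a number stated in the report:

```python
# Cross-checking the value-share claims against the report's own figures
# (Table 1 units and ASPs; $182bn '10 industry value from the text).

units_10, asp_10 = 316e6, 324        # '10 smartphone units and ASP
industry_value_10 = 182e9            # total handset industry value, '10

smart_value_10 = units_10 * asp_10   # ~$102bn
print(f"'10 smartphone value share: {smart_value_10/industry_value_10:.0%}")
# -> 56%, matching the figure quoted above

units_13, asp_13 = 1_083e6, 182      # '13E smartphone units and ASP
smart_value_13 = units_13 * asp_13   # ~$197bn
# The 84% claim then implies a total '13 industry value of roughly:
print(f"Implied '13 industry value: ${smart_value_13/0.84/1e9:.0f}bn")
# -> ~$235bn (our inference, not a figure stated in the report)
```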

Figure 2: Migration to Smartphones Sustaining Industry ASPs

Source: Arete Research estimates

The recent announcement by Nokia to end-of-life Symbian and abandon its half of MeeGo threw the OS picture into chaos; we see Android becoming the dominant OS choice by 2012 (see Table 4), though expect a number of branches or variants on the Android kernel, as regional players and Tier One OEMs seek to differentiate smartphone offerings…

To read the Briefing in full, including – in addition to the above – analysis of:

  • Share by Smartphone OS
  • Semis: Logically, More – Doubling the logic chip market
  • Memory Madness – Tripling the memory markets
  • Telcos: Dummies no More – Growth opportunities for telcos with sophisticated tiers
  • Billions and Billions – Putting the numbers in context

…and additional tables and figures…

  • Table 4 – Share of Smartphone OS
  • Table 5 – Total Mobile Memory Markets
  • Table 6 – DRAM
  • Table 7 – NAND
  • Figure 3 – AT&T Post Paid ARPU ($) vs. Integrated Devices
  • Figure 4 – US Wireless Revenue Share
  • Figure 5 – VZW/AT&T Margins
  • Figure 6 – 3Q10 Mobile Service Revenue Growth (yoy)
  • Figure 7 – Consensus Forward P/E Multiple
  • Figure 8 – What Price Mobile Exposure in EU?

Members of the Telco 2.0™ Executive Briefing Subscription Service can download the full 7 page report in PDF format here. Non-Members, please see here for how to subscribe. Please email contact@telco2.net or call +44 (0) 207 247 5003 for further details.

There’s also more on Device Strategies at our AMERICAS, EMEA and APAC Executive Brainstorms and Best Practice Live! virtual events.

Full Article: Devices 2.0: ‘Beyond Smartphones’ – Innovation Strategies for Operators

Summary: managing the role of new device categories in new and existing fixed and mobile business models is a key strategic challenge for operators. This report includes analysis of the practicalities and challenges of creating customised devices, best / worst practice, inserting ‘control points’ in open products, the role of ‘ODMs’, and reviews leading alternative approaches.

NB A PDF Version of this 45 page report can be downloaded here.

Introduction

As part of its recently-published report on Mobile and Fixed Broadband Business Models, Telco 2.0 highlighted four potential strategic scenarios, one of which was for operators to become “device specialists” as a deliberate element of strategy, in either the wireline or wireless domain. This theme was also covered at the April 2010 Telco 2.0 Brainstorm event in London.

Clearly, recent years have displayed accelerating innovation in numerous “end-point” domains – from smartphones, through to machine-to-machine systems and a broad array of new consumer electronics products. Yet there has been only limited effort made in mapping this diversity onto the broader implications for operators and their business prospects. 

Moving on from legacy views

An important aspect of device specialisation for telcos is one of attitude and philosophy. In the past, the power of the network has had primacy – large switching centres were at the heart of the business model, driving telephones – in some cases even supplying them with electrical power via the copper lines as well. Former government monopolies and powerful regulators have further enshrined the doctrines of central control in telecom executives’ minds.

Yet, as has been seen for many years in the computing industry, centralised systems give way to power at the edge of the network, increasingly assisted by a “cloud” of computing resource which is largely independent of the “wiring” needed to connect it. The IT industry has long grasped the significance of client/server technology and, more recently, the power of the web and distributed computing, linked to capable and flexible PCs.

But in the telecom industry, some network-side traditionalists still refer to “terminals” as if Moore’s Law has no relevance to their businesses’ success. But the more progressive (or scared) are realising that the concentration of power “at the telecom edge”, coupled with new device-centred ecosystems (think iPhone + iTunes + AppStore), is changing the dynamics of the industry to one ruled by a perspective starting from the user’s hand back inwards to the core.

With the arrival of many more classes of “connected device” – from e-readers, to smart meters or in-vehicle systems – the role of the device becomes ever more pivotal in determining both the structure of supporting business models and the role of telcos in the value chain. It also has many implications for vendors.

The simplest approach is for operators to source and give away attractive devices in order to differentiate and gain new, or retain existing customers – especially in commoditised access segments like ADSL. At the other end of the spectrum, telcos could pursue a much deeper level of integration with new services to drive new fixed or mobile revenue streams – or create completely unique end-to-end propositions to rival those of 3rd-party device players like Apple, Sony or TiVo.

This Executive Brief examines the device landscape from an operator’s or network vendor’s standpoint. It looks at whether service providers should immerse themselves in creating on-device software and unique user experiences – or even commission the manufacture of custom hardware products or silicon. Alternatively, it considers the potential to “outsource” device smarts to friendlier suppliers like RIM or Thomson/Technicolor, which generally have operators’ success at the centre of their strategies. The alternative may be to surrender yet more value to the likes of Apple, Sony or Sling Media, allowing independent Internet or local services to be monetised without an “angle” for telco services.

Structure of this report

The narrative of this document follows this structure:

  • Introduction
  • The four broadband scenarios for operators, and an introduction to the “device specialist”
  • Developing an initial mechanism for mapping the universe of devices onto operator business models, which generally fit with four modes of communication
  • Considering why devices are such a potential threat if not tackled head-on
  • Providing case studies of previous telco experience of device focus, from a stance of best/worst practice
  • Examining enhancements to existing business models via device focus
  • Analysing examples of new business models enabled by devices
  • Considering the practicalities of device creation and customisation
  • Suggesting a mechanism for studying risk/reward in telcos’ device strategies
  • Recommendations and conclusions

A recap: 4 end-game scenarios

Broadband as the driver

Given the broad diversity of national markets in terms of economic development, regulation, competition and technology adoption, it is difficult to create simplistic categories for the network operators of the future. Clearly, there is a big distance between an open access, city-owned local fibre deployment in Europe, versus a start-up WiMAX provider in Africa, or a cable provider in North America.

Nevertheless, it is worth attempting to set out a few ‘end-game’ scenarios, at least for broadband providers in developed markets for which the ‘end’ might at least be in sight. This is an important consideration, as it sets parameters for what different types of telco and network owner can reasonably expect to do in the realm of device innovation and control.

The four approaches we have explored are:

  1. Telco 2.0 Broadband Player. This is the ultimate manifestation of the centralised Telco model, able to gain synergies from vertical integration as well as able to monetise various partner relationships and ecosystems. It involves some combination of:
    • Enhanced retail model providing well-structured connectivity offerings (e.g. tiered, capped and with other forms of granular pricing), as well as an assortment of customer-facing, value-added services. This may well have a device dimension. We also sometimes call this “Telco 1.0+” – improving current ways of doing business, especially through better up-selling, bundling and price discrimination.
    • Improved variants of ‘bulk wholesale’, providing a rich set of options for virtual operators or other types of service provider (e.g. electricity smart grid)
    • New revenue opportunities from granular or ‘slice and dice’ wholesale, based on two-sided business models for access capacity. This could involve prioritised bandwidth for content providers or mobile network offload, various ‘third-party paid’ data propositions, capabilities to embed broadband ‘behind the scenes’ in new types of device and so on.
    • A diverse set of ‘network capability’ or ‘platform’ value-add services for wholesale and upstream customers, such as authentication and billing APIs, and aggregated customer intelligence for advertisers. Again, there may be a device “angle” here – for example the provision of device-management capabilities to third parties.
    • A provider of open Internet services, consumed on other operators’ networks or devices, via normal Internet connectivity, essentially making the telco a so-called ‘over the top’ Internet application provider itself. This requires a measure of device expertise, in terms of application development and user-experience design.
  2. The Happy Piper. The broadband industry often likes to beat itself up with the threat of becoming a ‘dumb pipe’, threatened by service-layer intelligence and value being abstracted by ‘over the top players’. Telco 2.0 believes that this over-simplifies a complex situation, polarising opinion by using unnecessarily emotive terms. There is nothing wrong with being a pipe provider, as many utility companies and satellite operators know to their considerable profit. There are likely to be various sub-types of Telco that believe they can thrive without hugely complex platforms and multiple retail and wholesale offers, either running “wholesale-only” networks, participating in some form of shared or consortium-based approach, or offering “smart pipe services”.
  3. Government Department. There is an increasing trend towards government intervention in broadband and telecoms. In particular, state-guided, fully-open wholesale broadband is becoming a major theme, especially in the case of fibre deployments. There is also the role of stimulus funds, or the role of the public sector itself in driving demand for ‘pipes’ to enable national infrastructure projects such as electricity smart grids. Some telcos are likely to undergo structural separation of network from service assets, or become sub-contract partners for major projects around national infrastructure, such as electricity smart grids or tele-health.
  4. Device specialist, as covered in the rest of this report. This is where the operator puts its device skills at the core of its strategy – in particular, where the end-points become perhaps the most important functional component of the overall service platform. Most of the evolution of the telco’s service / proposition (and/or cost structure) would not work with generic “vanilla” devices – some form of customisation and control is essential. An analogy here is Apple – its iTunes and AppStore ecosystems and business models would not work with generic handsets. Conversely, Google is much less dependent on Android-powered handsets – it is able to benefit from advertising consumed on any type of device with a browser or its own software clients. 

There are also a few other categories of service provider that could be considered but which are outside the scope of this report. Most obvious is ‘Marginalised and unprofitable’, which clearly is not so much a business model as a route towards acquisition or withdrawal. The other obvious group is ‘Greenfield telco in emerging market’, which is likely to focus on basic retail connectivity offers, although perhaps with some innovative pricing and bundling approaches. (A full analysis of all these scenarios is available in Telco 2.0’s new strategy report on Fixed and Mobile Broadband Business Models).

It should be stressed that these options apply to operators’ broadband access in particular. Taking a wider view of their overall businesses, it is probable that different portfolio areas will reflect these (and other) approaches in various respects. In particular, many Telco 2.0 platform plays will often dovetail with specific device ecosystems – for example, where operators deploy their own mobile AppStores for widgets or Android applications.

 

Figure 1: Potential end-game scenarios for BSPs

Source: Telco 2.0 Initiative

Introducing the device specialist

In many ways, recent trends around telecoms services and especially mobile broadband have been driven as much by end-user device evolution as by network technology, tariffing or operation. Whilst it may be uncomfortable reading for telcos and their equipment vendors, value is moving beyond their immediate grasp. In future, operators will need to accept this – and if appropriate, develop strategies for regaining some measure of influence in that domain.

Smartphones have been around for years, but it has been Apple that has really kick-started the market as a distinct category for active use of broadband access, aided by certain operators which managed to strike exclusive deals to supply it. PCs have clearly driven the broadband market’s growth – but at the expense of a default assumption of “pipe” services. Huawei’s introduction of cheap and simple USB modems helped establish the market for consumer-grade mobile broadband, with well over 50 million “dongles” now shipped. Set-top boxes, ADSL gateways and now femtocells are further helping to redefine fixed broadband propositions, for those broadband providers willing to go beyond basic modems.

Going forward, new classes of device for mobile, nomadic and fixed use promise a mix of new revenue streams – and, potentially, more control over operator business models. In 2010, the advent of the Apple iPad has preceded a stream of “me-too” tablets, with an expectation of strong operator involvement in many of them.

However, not all telcos, either fixed or mobile, can be classified as device specialists. There is a definite art to using hardware or client software as a basis for new and profitable services, with differentiated propositions, new revenue streams and improved user loyalty. There are also complexities with running device management systems, pre-loading software, organising physical sales and supply chains, managing support issues and so on.

Operators can either define and source their own specific device requirements, or sometimes benefit from exclusivity or far-sightedness in recognising attractive products from independent vendors. Various operators’ iPhone exclusives are probably the easiest to highlight, but it is also important to recognise the skills of companies, such as NTT DOCOMO, which defines most of the software stack for its handsets, licensing it out to the device manufacturers.
In the fixed domain, some operators are able to leverage relationships with PC vendors, and in future it seems probable that new categories like smart meters and home entertainment solutions will provide additional opportunities for device-led partnerships.

Consequently, it is fair to say that device specialism can involve a number of different activities for operators:
  • A particularly strong focus on device selection, testing, promotion and support.
  • Development of own-brand devices, either produced bespoke in collaboration with ODMs (detailed later in this document), or through relatively superficial customisation of existing devices.
  • Negotiation of periods of device exclusivity in a given market (e.g. AT&T / iPhone)
  • Definition of the operator’s own in-house OS or device hardware platform, such as the strategies employed by NTT DoCoMo (with its Symbian / Linux variants) or KDDI (modified Qualcomm BREW) in Japan.
  • Provision of detailed specifications and requirements for other vendors’ devices, for example through Orange’s lengthy “Signature” device profiles.
  • Development of the operator’s own UI, applications and services – such as Vodafone’s 360 interface or its previous Live suite.
  • Deployment of device-aware network elements which can optimise end-to-end performance (or manage traffic) differentially by device type or brand.
  • The ability to embed and use “control points” in devices to enable particular business models or usage modes. Clearly, the SIM card is a controller, but it may also be desirable to have more fine-grained mechanisms for policy at an OS level as well. For example, some handset software platforms are designed to allow operators to licence and even “revoke” particular applications, while another emerging group are focused on handset apps used to track data usage and sell upgrades.
  • Development of end-to-end integrated services with devices as core element (similar to Apple or RIM). Much of the value around smartphones has been driven by the link of device-side intelligence to some form of “cloud” feature – RIM’s connection to Microsoft Exchange servers, or Apple iPhone + AppStore / iTunes, for example. Clearly, operators are hoping to emulate this type of distributed device/server symbiosis – perhaps through their own app stores.
  • Lastly, operators may be able to exercise influence on device availability through the enablement of a “device ecosystem” around its services & network. In this case, the telco provides certain platform capabilities, along with testing and certification resources. This enables it to benefit from exclusive devices created by partners, rather than in-house. Verizon’s attempt with its M2M-oriented “Open Device Initiative” is a good example.

 

Clearly, few operators will be in a position to pursue all of these options. However, in Telco 2.0’s view, there remains significant clear water between those which put device-related activities front and centre in their efforts – and those which are more driven by events and end-point evolution from afar.

New business models vs. old

Despite the broad set of options outlined in the previous section, it is important to recognise that operators’ device initiatives can be grouped into two broad categories:

  • Improving existing business models, for example through improving subscriber acquisition, reducing opex, or inducing an uplift in revenues on a like-for-like basis over older or more generic devices.
  • Enabling new business models, for example by selling devices linked to new end-to-end services, enabling the sale of incremental end-user subscriptions, or better facilitating certain new Telco 2.0-style two-sided opportunities (e.g. advertising).

 

 

Although much of the publicity and industry “noise” focuses on the strategic implications of the latter, it is arguably the former, more mundane aspects of device expertise that have the potential to make a bottom-line difference in the near term. While Telco 2.0 also generally prefers to focus on the creation of new revenues and new business model innovation in general, this is one area of the industry where it is also important to consider the inertia of existing services and propositions and the opportunities to reduce opex by optimising the way that devices work with networks. A good example of this is the efficiency and network friendliness of RIM’s Blackberry in comparison with Apple’s iPhone in both data compression technologies and use of signalling.

That said, the initial impetus for deploying the iPhone was mostly around customer acquisition and upselling higher-ARPU plans – but the unexpected success of apps quickly distracted some telcos away from the basics, and more towards their preferred and familiar territory of centralised control.

What are the risks without device focus?

Although many operators bemoan the risks of becoming a “dumb pipe”, few seem to have focused on exactly what is generating that risk. While the “power of the web” and the seeming acceptability of “best effort” communications get cited, it is rare that the finger of blame has pointed directly at the device space.

Over many decades, telecoms engineers and managers have grown up with the idea that devices are properly called “terminals”. Evocative of the 1960s or 1970s, when the most visible computers were “dumb” end-points attached to mainframes, this reflects the historic use of analogue, basic machines like fixed telephones, answering machines or primitive data devices.

Nevertheless, some people in the telecoms industry still stick with this anachronistic phrasing, despite the last twenty or thirty years of ever-smarter devices. The refusal to admit the importance of “the edge” is characteristic of those within telcos and their suppliers that don’t “get” devices, instead remaining convinced that it is possible to control an entire ecosystem from the core outwards.

This flat-earth philosophy is never better articulated than in the continuing mantra of fear about becoming “dumb pipes”. It is pretty clear that there are indeed many alternatives for creating “smart pipes”, but those that succeed tend to be aware that, often, the end-points in customers’ hands or living rooms will be smarter still.

In our view, one of the most important drivers of change – if not the most important – is the fast-improving power of devices to become more flexible, open and intelligent. They are increasingly able to “game” the network in a way that older, closed devices were not. Where necessary, they can work around networks rather than simply through them. And, unlike the “dumb” end-points of the past such as basic phones and fax machines, there is considerable value in many products when they are used “offline”.

The markets tend to agree as well – the capitalisation of Apple alone is now over $200bn, with other major device or component suppliers (Nokia, Qualcomm, Microsoft, Intel, RIM) also disproportionately large.

“Openness” is a double-edged sword. While having a basic platform enables operators to customise and tinker to meet their own requirements, that same level of openness is also available to anyone else who wishes to compete. Some operators have managed the delicate balancing act of retaining the benefits of openness for themselves, but closing it down for end-users to access directly – DoCoMo’s use of Symbian and Linux in the “guts” of its phones is probably the best example.

Openness is also being made even easier to exploit through the continued evolution of the web browser. At the moment, it takes considerable programming skill to harness the power of an iPhone or a Nokia Symbian device – or, especially, a less-accessible device like an Internet TV. As it becomes more and more possible to run services and applications inside the browser, the barriers to entry for competing service providers become lowered still further. Even Ericsson, typically one of the most traditional telephony vendors, has experimented with browser-based VoIP. That said, there are some approaches to the web, such as the OMTP BONDI project, which might yet provide telcos with control points over browser capabilities, for example in terms of permitting/denying their access to underlying device features, such as making phone calls or accessing the phonebook.
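To illustrate the kind of control point such approaches aim at, here is a minimal sketch of the policy idea – a runtime consulting an operator-managed policy before exposing device features to web content. The capability names and policy format are our invention, not BONDI’s actual API or policy schema:

```python
# Minimal sketch of the policy-control idea behind projects like BONDI:
# a browser runtime consults an operator-managed policy before letting
# web content touch device capabilities. Capability names and the policy
# format are illustrative, not BONDI's actual API or policy schema.

POLICY = {
    # (origin, capability) -> decision
    ("https://portal.operator.example", "telephony.dial"): "allow",
    ("https://portal.operator.example", "pim.contacts"):   "allow",
    ("*",                               "telephony.dial"): "deny",
    ("*",                               "pim.contacts"):   "prompt",
}

def check_access(origin: str, capability: str) -> str:
    # Exact origin match first, then the wildcard rule, else deny.
    return POLICY.get((origin, capability),
                      POLICY.get(("*", capability), "deny"))

print(check_access("https://portal.operator.example", "telephony.dial"))  # allow
print(check_access("https://random.example", "pim.contacts"))             # prompt
```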

Compute power: the elephant in the room

There is clear evidence that “intelligence” moves towards the edge of networks, especially when it can be coordinated via the Internet or private IP data connections. This has already been widely seen in the wired domain, with PCs and servers connected through office LANs and home fixed broadband, and is now becoming evident in mobile. There are now several hundred million iPhones, BlackBerries and other smartphones in active data-centric use, as well as over 50m 3G-connected notebooks and netbooks. Home gateways and other devices such as femtocells, gaming consoles and Internet TVs are further examples, with billions more smart edge-points on the horizon with M2M and RFID initiatives.

This is a consequence of scale economies and also Moore’s Law, reflecting processors getting faster and cheaper. This applies not just to the normal “computing” chips used for applications, but also to the semiconductors used for the communications parts of devices. Newer telecom technologies like LTE, WiMAX and VDSL are themselves heavily dependent on advanced signal processing techniques, to squeeze more bits into the available network channels.
Ericsson’s talk of 50 billion connected devices by the end of the decade seems plausible, although Amdocs’ sound-bite of 7 trillion by 2017 seems to have acquired a couple of rogue zeroes. That said, even at the smaller figure, not all will be fully “smart”.

Unsurprisingly, we therefore see a continued focus on this “edge intelligence” as a key battleground – who controls and harnesses that power? Is it device suppliers, telcos, end users, or 3rd-party application providers (so-called “over-the-top players”)? Does it complement “services” in the network? Or drive the need for new ones? Could it, perhaps, make them obsolete entirely?

So what remains unclear is how operators might adopt a device strategy that complements their network capabilities, to strengthen their position within the digital value chain and foster two-sided business models. It is important for operators to be realistic about how much of the “edge” they can realistically control, and under what circumstances. Given that price points of devices are plummeting, few customers will voluntarily choose “locked” or operator-restricted devices if similarly-capable but more flexible alternatives cost much the same. Some devices will always be open – in particular PCs. Others will be more closed, but under the control of their manufacturers rather than the telcos – the iPhone being the prime example.

It is therefore hugely important for operators to look at devices as a way of packaging that intelligence into new, specific and valuable business models and propositions – ideally, ones which are hard to replicate through alternative methods. This might imply design and development of completely exclusive devices, or making existing categories more usable. At the margins, there is also the perennial option for subsidy or financing – although that clearly puts even more pressure on the ongoing business model to have a clear profit stream.

There are so many inter-dependent factors here that it is difficult to examine the whole problem space methodically. How do developments like Android and device management help? Should the focus be on dedicated devices, or continued attempts to control the design, OS or browser of multi-purpose products? What aspects of the process of device creation and supply should be outsourced?

Where’s the horsepower?

The telcos are already very familiar with the impact of traditional PCs on their business models – they are huge consumers of data download and upload, but almost impossible to monetise for extra services, as they are bought separately and are generally seen more as endpoints for standalone applications rather than services. The specific issue of the PC (connected via fixed or mobile broadband) is covered separately, but the bottom line is that it is a case study in the ultimate power of open computing and networks. PCs have also been embedded in other “vertical market” end-points such as retail EPOS machines, bank ATMs and various in-vehicle systems.

The problem is now shifting to a much broader consumer environment, as PC-like computing capability shifts to other device categories, most notably smartphones, but also a whole array of other products in the home or pocket.
It is worth considering an illustration of the shifting power of the “edge”, as it applies to mobile phones.

If we go back five or six years, the average mobile phone had a single main processor “core” in its chipset, probably an ARM7, clocking perhaps 30MHz. Much of this was used for the underlying radio (the “modem”) and telephony functions, with a little “left over” for some very basic applications and UI tools, like Java games.

Today, many of the higher-end handsets have separate applications processors as well as the modem chip. The apps processor is used for the high-level OS and related capabilities, and is the cornerstone of the change being observed. An iPhone has a 600MHz+ chip, and various suppliers of Android phones are using a 1GHz Qualcomm Snapdragon chip. Even midrange featurephones can have 200MHz+ to play with, most of which is actually usable for “cool stuff” rather than the radio.

This is where the danger lies for the telcos, as like PCs, it can shift the bias of the device away from consuming billable services and towards running software. (The situation is actually a bit more complex than just the apps processor, as phones can also have various other chips for signal processing, which can be usable in some circumstances for aspects of general computing. The net effect is the same though – massively more computational power, coupled with more sophisticated and open software).

Now, let’s project forward another five years. The average device (in developed markets at least) will have at least 500MHz, with top-end devices at 2GHz+, especially if they are not phones but tablets, netbooks or similar products. Set top-boxes, screenphones, game consoles and other CPE devices are growing smarter in parallel – especially enabled for browsers which can then act as general-purpose (distributed) computing environments. A new class of low-end devices is emerging as well. How and where operators might be able to control web applications is considered below, as it is somewhat different to the “native applications” seen on smartphones.

For the sake of argument, let’s take an average of 500MHz chips, and multiply by (say) 8 billion endpoints.
That’s 4 Exahertz (EHz, 10¹⁸) of application-capable computing power in people’s hands or home networks, without even considering ordinary PCs and “smart TVs” as well. And much – probably most – of that power will be uncontrolled by the operators, instead being the playground of user- or vendor-installed applications.

Even smart pipes are dumb in comparison

It is tricky to calculate an equivalent figure for “the network”, but consider an approximation of 10 million network nodes (datapoint: there are 3 million cell sites worldwide), at a generous 5GHz each. That means there would be 50 Petahertz (PHz, 10¹⁵) of computing power in the carrier cloud – a figure which already assumes that most operators have thousands of servers in their back-office systems as well as in the production network itself.

In other words, the telcos, collectively, have maybe an 80th of the collective compute power of the edge. It is quite possibly much lower than that, but the calculation is intended as an upper bound.
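For transparency, the back-of-envelope arithmetic behind these edge and network figures can be reproduced in a few lines (the device counts and clock speeds are the assumptions stated above, not measured data):

```python
# Reproducing the back-of-envelope sums above. All inputs are the
# assumptions stated in the text, not measured data.

edge_devices  = 8e9      # assumed application-capable endpoints
edge_clock_hz = 500e6    # assumed average apps-processor clock
edge_total    = edge_devices * edge_clock_hz
print(f"Edge: {edge_total:.0e} Hz")          # 4e+18 Hz = 4 EHz

network_nodes = 10e6     # generous estimate of network nodes
node_clock_hz = 5e9      # generous 5GHz per node
network_total = network_nodes * node_clock_hz
print(f"Network: {network_total:.0e} Hz")    # 5e+16 Hz = 50 PHz

print(f"Edge / network ratio: {edge_total/network_total:.0f}x")  # 80x
```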

Now clearly, this is not quite as bad a deficit as that makes it sound – the network can obviously leverage intelligence in a few big control points in the core such as GGSNs and DPI boxes, as traffic funnels through them. It can exert control and policy over data flows, as well as what is done at the endpoints.

But at the other end of the pipe is the Internet, with Google’s and Amazon’s and countless other companies’ servers and “cloud computing” infrastructures. Trying to calculate the aggregate computing power of the web isn’t easy either, but it is also likely to be in the Exahertz range. Google is thought to have around one million servers on its own, for example, while the overall server population of the planet (including both Internet and enterprise) is thought to be of the order of 50 million, many of which have multiple processor cores.

 

 

Whatever else happens, it seems the pipe will inevitably become relatively “dumber” (i.e. less smart) than the devices at the edge, irrespective of smart Telco 2.0 platforms and 4G/NGN networks. The question is how much of that edge intelligence can be “owned” by the operators themselves.

Controlling device software vs. hardware

The answer is for telcos to attempt to take control of more of this enormous “edge intelligence”, and exploit it for their own in-house services or two-sided strategies.

There are three main strategies for operators wanting to exert influence on edge devices:

  • Provide dedicated, fully-controlled and customised hardware and software end-points which are “locked down” – such as cable set-top boxes, or operator-developed phones in Japan. This is essentially an evolution of the old approach of providing “terminals” that exist solely to act as access points for network-based services. The concept is being reinvented with new Telco-developed consumer electronics products like digital picture frames, but is a struggle to apply to multi-function devices like PCs and smartphones.
  • Provide separate hardware products that sit “at the edge” between the user’s own smart device and the network, such as cable modems, femtocells, or 3G modems for PCs. These can act as hosts for certain new services, and may also exert policy and QoS control on the connection. Arguably the SIM card fits into this category as well.
  • Develop control points, in hardware or software, that live inside otherwise notionally “open” devices. This includes SIM-locks, Telco-customised UI and OS layers, “policy-capable” connection manager software for notebooks, application and widget certification for smartphones, or secured APIs for handset browsers. Normally, the operator will need to be the original device supplier/retailer for these capabilities to be enabled before sale – few users will be happy for their own device to be configured after purchase with extra controls from their service provider.

Case studies and best / worst practice

Going back 30 years, before telecoms deregulation, many telcos were originally device specialists. In many cases, the incumbent government monopolies were also the only source of supply of telephones and various other communications products (“CPE” – customer premises equipment) – often renting them to users rather than selling them. Since then of course, much has changed. Not only have customers been able to buy standards-compliant, certified terminals on the open market, but the rise of personal computing and mobile communications has vastly expanded the range and capability of end-points available.

But while few telcos could benefit today from owning physical manufacturing plants, there is an increasing argument for operators once again to take a stronger role in defining, sourcing and customising end-user hardware in both mobile and fixed domains. As discussed throughout this document, there is a variety of methods that can be adopted – and a wide range of possible depth of focus and investment. Clearly, owning factories is unlikely to be an attractive option – but at the other end of the scale, it is unclear whether merely issuing vague “specifications” or sticking logos on white-labelled goods from China achieves anything meaningful from a business model standpoint.

It is instructive to examine a few case studies of operator involvement in the device marketplace, to better understand where it can add value as a core plank of strategy, rather than simply as a tactical add-on.

NTT DoCoMo

Perhaps the best example of a device-centric operator is NTT DoCoMo in Japan. It would arguably be more accurate to describe it as a technology-centric firm, as it pretty much defines its complete end-to-end system in-house, usually as a front-runner for more general 3GPP systems like WCDMA and LTE, but with subtle local modifications. About 10 years ago, it recognised that handset availability was going to be a pivotal factor in delaying its then-new 3G FOMA services, and committed very significant funds to driving the whole device ecosystem in order to accelerate this.

In fact, DoCoMo has a very significant R&D budget in general, which means that it has been able to develop complete end-to-end platforms like i-Mode, spanning both handset software and back-end infrastructure and services. Although it is known for initiatives like these, as well as its participation in Symbian, Android and LiMo ecosystems, its device expertise goes far beyond handset software. For example, its own in-house research journal covers innovative areas of involvement, such as:

  • Improved video display on handsets
  • Development of its own in-vehicle 3G module for telematics applications
  • Measurement of handset antenna efficiency

In some ways, DoCoMo is in a unique position. It did not have to pay for original 3G spectrum and channelled funds into device and infrastructure development instead. It also operates in an affluent and gadget-centric market that has at times been willing to spend $500-600 on massmarket handsets. It has close ties with a number of Japanese vendors, with whom it spends large amounts on infrastructure and joint R&D. And its early pragmatism with web and software developers (in terms of revenue-share) has largely kept the ecosystem “on-side”, compared with other markets in which a mass of disgruntled application providers have eagerly jumped on off-portal and “open” OS platforms, to the detriment of operators.

In its financial year to March 2009, DoCoMo had a total R&D spend of 100 billion Yen (approximately $1bn). While this is split across both basic research and various initiatives around networks and services, it also has a dedicated “device development” centre. It compares to R&D spending by Vodafone Group in the same period of £280m, or about $450m, while mid-size global mobile group Telenor spent just NOK1.1bn ($180m) in calendar year 2008. For comparison, Apple’s current annualised R&D spend is around $1.6bn per year, and Google’s is $3.2bn – while Nokia’s was over $8bn in 2009 – albeit spread across a much larger number of products, as well as its share in NSN. Even smaller device players such as SonyEricsson spend >$1bn per year.

Although DoCoMo is best known for its handset software involvement – i-Mode, Symbian, LiMo, MOAP and so forth – it also conducts a significant amount of work on more hardcore technology platform development. Between 2005 and 2007, for example, it invested 12.5 billion Yen ($125m) in chipset design for its 3G phones.

It has huge leverage with Japanese handset manufacturers like NEC and Matsushita, as they have limited international reach. This means that DoCoMo is able to enforce adoption of its preferred technology components – such as single integrated chips that it helps design, rather than multiple more expensive processors.

While various operators are now present in handset-OS organisations such as the LiMo Foundation and the Open Handset Alliance (Android), DoCoMo’s profile in device software has been considerably greater in the past. It is a founder member of Symbian, driving development of one of the original three Symbian user interfaces (the other two being Nokia’s S60 and the now-defunct UIQ). DoCoMo now makes royalty revenues, in some instances, from manufacturers’ use of its handset software. It also owns a sizeable stake in browser vendor Access, and has invested in other handset software suppliers like Aplix.

Verizon Open Device Initiative

From the discussion about DoCoMo above, it is clear that for an operator to start creating its own device platform from the bottom up, it will need extremely deep pockets and very close relationships with willing OEMs to use its designs. For individual handsets or a small series of similar devices, it can clearly choose the ODM route, although this risks limiting differentiation to a thin layer of software and a few “off the peg” hardware choices.

Another option is to try to create a fully-fledged hardware ecosystem, putting in place the tools and business frameworks to help innovative manufacturers create a broad set of niche “long tail” devices that conform to a given operator’s specifications. If successful, this enables a given telco to benefit from a set of unique devices that may well come with new business models attached. Clearly, the operator needs to be of sufficient scale to make the volumes worthwhile – and there also needs to be a guarantee of network robustness, channels to market and back-office support.

Verizon’s “Open Device Initiative” is perhaps the highest-profile example of this type of approach, aiming to foster the creation of a wide range of embedded and M2M products. It assists in the certification of modules, and also links in with its partnership with Qualcomm and nPhase in creating an M2M-enabling platform. A critical aspect of its purpose is a huge reduction in certification and testing time for new devices against its network – something which had historically been a time-to-market disaster lasting up to 12 months, clearly unworkable for categories like connected consumer-oriented devices. It has been targeting a 4-week turnaround instead, working with a streamlined process involving multiple labs and testing facilities.

US rival operator AT&T is attempting a similar approach with its M2M partner Jasper Wireless, although Verizon ODI has been more conspicuous to date.

3 / INQ Mobile

Another interesting approach to device creation is that espoused by the Hutchison 3 group. Its parent company, Hutchison Whampoa, set up a separately-branded device subsidiary called INQ Mobile in October 2008. INQ specialises in producing Internet-centric featurephones with tight integration of web services like Skype, Facebook and Twitter on low-cost platforms. Before the launch of INQ, 3 had already produced an earlier product, the Skypephone, but had not sold it to the wider marketplace.

At around $100 price points, INQ’s devices are strongly aimed at prepaid-centric or low-subsidy markets where users want access to a subset of Internet properties without incurring the costs of a full-blown smartphone. The company has worked closely with Qualcomm, especially using its BREW featurephone software stack to enable tight integration between web services and the UI. That said, it is now switching at least part of its attention to Android-based devices in order to create touchscreen-enabled midmarket products.

3/INQ highlights one of the paradoxes of operator involvement in device creation – while it is clearly desirable to have a differentiated, exclusive device, it is also important to have a target market of sufficient scale to justify the upfront investment in its creation. Setting up a vehicle to sell the resulting phones or other products in geographies outside the parent’s main market footprint is a way to grow the overall volumes, without losing the benefits of exclusivity.
In this sense, although the 3 Group clearly benefits from its association with INQ, INQ is not specifically part of the operator’s strategy but that of its ultimate holding company. The separate branding also makes good sense. It is also worth noting that 3 is not wholly beholden to INQ for supply of own-brand devices; the current S2x version of its Skypephone is manufactured by ZTE.

BT Fusion

It is also worthwhile discussing one of the less successful device initiatives attempted by operators in recent years. Between 2003 and 2009, BT developed and sold a fixed-mobile converged service called Fusion, which flipped handsets between an ordinary outdoor cellular connection and a local wireless VoIP service when indoors and connected to a BT broadband line.

Intended to reduce the costs associated with then-expensive mobile calls when in range of “free” landline or VoIP connections, it relied on switching to Bluetooth or WiFi voice when within range of a suitable hotspot. The consumer and small-business version relied on a technology called UMA (Unlicensed Mobile Access), while a corporate version used SIP. The mobile portion of the service used Vodafone’s network on an MVNO basis.

Recognising that it needed widespread international adoption to gain traction and scale, BT did many things that were “right”. In particular, it supported the creation of the FMCA (Fixed-Mobile Convergence Alliance) and engaged directly with many handset vendors and network suppliers, notably Motorola for devices and Alcatel-Lucent for systems integration. It also ran extensive trials and testing, and participated in various standards-setting fora.
The service never gained significant uptake, which was blamed largely on falling prices for mobile calls, undermining the core value proposition. The failure also reflected a very limited handset portfolio, especially as the technology only supported 2G mobile devices at launch – at just the point when many higher-value customers wanted to transition to 3G.

Conversely, lower-end users in the UK generally use prepaid mobile, which did not fit well with BT’s contract-based pricing, oriented around Fusion’s position as an add-on to normal home broadband. In addition, there were significant issues around the user interface, and around the interaction of the UMA technology with other uses of the WiFi radio in which the user did not wish to involve the operator.

BT’s main failure here was its poor focus on what its customers wanted from the devices themselves, as well as on certain other aspects of the service wrapper, such as numbering. It was so focused on the network- and service-centric aspects of Fusion (especially “seamless handover” of voice services) that it ignored many of the reasons that customers actually buy mobile phones – a range of device brands and models, the increasing appeal of 3G, battery life, the latest features like high-resolution cameras and so forth. Towards the end of Fusion’s life, it looked even weaker once the (unsupported) Apple iPhone raised the bar for massmarket adoption of smartphones. It was withdrawn from sale in early 2009.

BT also appears to have over-estimated the addressable market for UMA-enabled phones; a more sober assessment would have revealed that support for the technology was always going to be an afterthought for the OEMs. It also over-relied upon Motorola for lower-end devices, and supported Windows Mobile for its smartphone variants more for reasons of pragmatism than customer demand.

Lastly, BT appears to have underestimated the length of time it would take to get devices from concept, through development and testing, to market. In particular, it takes many years (and a clear economic rationale) for an optional feature to become built into mobile device platforms as standard – and until that occurs, the subset of devices featuring that capability tends to be smaller, more expensive, and often late to market, as OEMs focus their best engineers and project resources on more scalable investments.

Perhaps the main takeaway here is that telcos’ involvement in complex, technology-led device creation is very risky where the main customer benefit is simply cheaper services, in markets where the incumbent providers have scope to cut margins to compete. A corollary lesson is that encouraging device vendors to support new functions that benefit only the operators (and only a small proportion of customers) is tricky, unless the telcos are prepared to guarantee better purchase prices or large volumes. This may well help explain the failure, to date, of other phone-based enhancements such as NFC.

The role of the ODM in telco-centric devices

An important group of players in operators’ device strategies are the ODMs (original design manufacturers). Usually based in parts of Asia such as Taiwan and Korea, these firms specialise in developing customised “white label” hardware to given specifications, which is then re-branded by better-known vendors. ODMs sit rather higher up the value-add hierarchy than CMs (contract manufacturers), which are essentially factory-outsourcing companies with much less design input.

Historically, the ODMs’ main customers were the device “OEMs” (original equipment manufacturers) – including well-known firms like Motorola, SonyEricsson and Palm. Even Nokia contracts out some device development and manufacturing, despite its huge supply-chain effectiveness. Almost all laptops are actually manufactured by ODMs – this supply route is not solely about handsets.

Examples of ODMs include firms like Inventec, Wistron, Arima, Compal and Quanta. Others such as HTC, ZTE and Huawei also design and sell own-brand products (i.e. act as OEMs) as well as manufacturing additional lines for other firms as ODMs.

In a growing number of instances, operators themselves are now contracting directly with ODMs to produce own-brand products for both mobile and fixed marketplaces. This is not especially new in concept – HTC in particular has provided ODM-based Windows Mobile smartphones and PDAs to various operators for many years. The original O2 XDA, T-Mobile MDA and Orange SPV series of smart devices all came via this route.

More recently, the ODM focus has swung firmly behind Android as the preferred platform, although there are still Microsoft-based products in the background. There is also patchy use of ODMs to supply own-branded featurephones, usually for low-end prepaid segments of the market.

One conspicuous trend is that the ODMs favoured by operators have tended to differ from those favoured by the OEMs. MNOs have tended to work with the more experienced and technically-deep ODMs (which often have sizeable own-brand sales as well), perhaps to compensate for their own limitations in areas such as radio and chipset expertise. They also want vendors capable of executing on sophisticated UI and applications requirements. HTC, ZTE and Sagem have made considerable headway in cutting deals with operators, with ZTE in particular able to leverage the growing global footprint associated with its infrastructure sales. Conversely, some of the more “traditional” ODMs from Taiwan, such as Compal and Arima, have struggled to engage with operators to the same degree that they win design and manufacturing outsourcing business from companies like Motorola and Sony Ericsson.

One of the most interesting recent trends is around new device form-factors, such as web tablets, ebook readers and netbooks/smartbooks. Operators are working with ODMs in the hope of deploying such devices as part of new business models and service propositions – either separate from conventional mobile phone service contracts, or as part of more complex integrated three / four screen converged offers. Again, Android is playing an important role here, especially for products that are Internet-centric such as tablets. Not all such devices are cellular-enabled: some, especially where they are intended for use just within the home, will be WiFi-only, connected via home broadband. Android is important here because of its malleability – it is much easier for operators (and their device partners) to create complete, customised user experiences, as the architecture does not have such a fixed “baseline” of user interface components or applications as Windows. It is also cheaper.

It is nevertheless important to note that ODM-based device strategies are often difficult to turn into new business models, and have various practical complexities in execution. Most ODMs base their products on off-the-shelf “reference designs” from chipset suppliers, alongside standard OS’s (hence Android and WinMob) and a fairly thin layer of in-house IPR and design skills. There is often limited differentiation over commodity designs for a given product, except in the case of the few ODMs that have built up strong software expertise over years (notably HTC).

In addition, the “distance” in terms of both value-chain position and geography often makes operator/ODM partnerships difficult to manage. Often, neither has particularly good skill sets in terms of RF design, embedded software development, UI design and ecosystem management. This means that a range of extra consultants and integrators also need to be roped into the projects. While open OS’s like Android provide an off-the-shelf ecosystem to add spice to the offerings, the overall propositions can suffer from a lack of centralised ownership.

It is worth considering that previous operator/ODM collaborations have mostly been successful in two contexts:

  • Early Windows Mobile and Pocket PC devices sold to businesses and later consumers, to compete primarily against Nokia/Symbian and provide support for email, web browsing and a limited set of applications. Since the growth of Apple and BlackBerry, these offerings have looked weak, although ODM Android-based smartphones are restoring the balance somewhat.
  • Low-end commodity handsets, primarily aimed at prepaid customers in markets where phones are sold through operator channels. Typically, these have been aimed at less brand-conscious consumers who might otherwise have bought low-tier Nokia, Samsung or LG handsets.

On the other hand, other operator / ODM tie-ups have been rather less successful. In 2009, a number of operators tried rolling out small handheld MIDs (mobile Internet devices), with lacklustre market impact.

One possibility is that ODMs will start to shift focus away from mobile handsets, and more towards other classes of device such as tablets, femtocells and in-car systems. These are all areas in which there is much less incumbency from the major OEM brands like Apple and Samsung, and where operators may be able to sell completely new classes of device, packaged with services.

Own-brand operator handsets nevertheless remain a “minority sport”: IDC is reported to estimate that they accounted for only 1.4% of units shipped in Western Europe in 2008.

Enhancing existing business models

Returning to one of the points made in the introduction, there are two broad methods by which device expertise can enhance operators’ competitive position and foster the creation and growth of extra revenue:

  • Improving the success and profitability of existing services and business models
  • Underpinning the creation of wholly new offerings and propositions

This section of the document considers the former rationale – extending the breadth and depth of current services and brand. Although much of the recent media emphasis (and perceived “sexiness”) around devices is on the creation of new business models and revenue streams, arguably the main benefits of device specialisation for telcos are more prosaic. Deploying or designing the right hardware can reduce ongoing opex, help delay or minimise the need for incremental network capex, improve customer loyalty and directly generate revenue uplift in existing services.

Clearly, it is not new analysis to assert that mobile operators benefit from having an attractive portfolio of devices, in markets where they sell direct to end-users. Exclusive releases of the Apple iPhone clearly drove customer acquisition for operators such as AT&T and O2. Even four years ago, operators which merely gained preferential access to new colours of the iconic Motorola RAZR saw an uplift in subscriber numbers.

But the impact on ongoing business models goes much further than this, for those telcos that have the resources and skill to delve more deeply into the ramifications of device selection. Some examples include:

  • There is a significant difference between devices in terms of return rates from dissatisfied customers – either because of specific faults (crashing, for example) or poor user experience. This can cause significant losses in terms of the financial costs of device supply/subsidy, along with negative impact on customer loyalty.
  • Less serious than outright returns, it is also important to recognise the difference in performance of devices on an ongoing basis. In 2009, Andorra-based research lab Broadband Testing found huge variations between different smartphones in the basics of “being a phone” – some regularly dropped calls under certain circumstances such as 3G-to-2G transitions, for example. Often, users will wrongly associate dropped calls with flaws in the network rather than the handset – thereby generating a negative perception for the telco.
  • Another important aspect of opex relates to handling support calls, which can easily cost $20 per event – and sometimes much more for complex inquiries needing a technical specialist. This becomes much more of an issue for certain products, such as advanced data-capable devices, where configuration of network settings, email accounts and VoIP services can be hugely problematic. A single extra technical call, per user per year, can wipe out the telco’s gross margin (a worked example follows below). Devices which have setup “wizards” or even just clearer menus can reduce the call-centre burden considerably. Even in the fixed world, home gateways or other products designed to work well “out of the box” are essential to avoid profit-sapping support calls (or worse, “truck rolls”). This can mean something as trivial as colour-coding cables and sockets – or as sophisticated as remote device management and diagnostics.
  • Selection of data devices with specific chipsets and radio components can have a measurable impact on network performance. Certain standards and techniques are only implemented in particular semiconductor suppliers’ products, which can use available capacity more efficiently. Used in sufficiently large numbers, the cumulative effect can result in reduced capex on network upgrades. While few carriers have the leverage to force new chip designs into major handset brands’ platforms, the situation could be very different for 3G dongles and internal modules used in PCs, which tend to be much more generic and less brand-driven. UK start-up Icera Semiconductor has been pursuing this type of engagement strategy with network operators such as Japan’s SoftBank.
  • Device accessories can add value to a service provider’s existing offerings, adding loyalty, encouraging contract renewal, and potentially justifying uplift to higher-tier bundles. For home broadband, the provision of capable gateways with good WiFi can differentiate versus alternative ISPs. For those providing VoIP or IPTV, the addition of cordless handsets or PVRs / media servers can add value. In mobile, the provision of car-kits can improve voice usage and revenues significantly.
  • Operators’ choice of devices can impact significantly on ARPU. There is historical evidence that a good SMS client on a mobile phone will drive greater usage and revenue, for example. In the fixed-broadband world, providing gateways with (good) WiFi instead of simple modems has driven multiple users per household – and thus a need for higher-tier services and greater overall perception of value.

Figure 2: Operators need to consider the effects of basic device performance on customer satisfaction and the network

Source: Broadband Testing
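
To make the economics of the support-call point above concrete, here is a worked sketch. The ARPU and margin figures are purely illustrative assumptions; only the $20 call cost comes from the text:

```python
# Worked example of the support-call point above. All inputs are
# illustrative assumptions except the $20 call cost cited in the text.

monthly_arpu = 5.00        # assumed ARPU for a low-end prepaid user ($)
gross_margin = 0.30        # assumed gross margin on that revenue
support_call_cost = 20.00  # per-event cost cited in the text ($)

annual_margin = monthly_arpu * 12 * gross_margin   # $18 per year
print(f"Annual gross margin per user: ${annual_margin:.2f}")
print(f"Cost of one technical call:   ${support_call_cost:.2f}")
# One $20 call exceeds the $18 annual margin: the account becomes loss-making.
```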

There are also much simpler ways in which devices can bolster current services’ attractiveness: 2010 and 2011 are likely to see an increasing number of new devices being sold or given away by operators in order to retain existing customers using existing services.

In particular, a new class of WiFi-based web tablets is expected to become quite popular among fixed broadband companies looking to avoid churn or downward pricing pressure, as well as (perhaps) acting as a future platform for new services. Although there are numerous technical platforms for tablets, it seems likely that Android will enable a broad array of inexpensive Asian ODMs to produce competent products, especially as they will not need complex integration of voice telephony or similar features. The growing maturity of web browsers and widgets (for example with HTML5), as well as the flexibility of the Android Market, should allow such products to work with most leading-edge web services.

Expect to see plenty of “free” pseudo-iPads given away as inducements to retain customers, or perhaps to upsell them to a higher-tier package. The ability of fixed broadband providers to compete with their mobile peers by providing subsidised devices should not be underestimated. By the same token, mobile operators may choose to give away free or discounted femtocells.

It is also possible for operators’ direct involvement in the device marketplace to lower the costs of existing business models. Various groups of operators have acted in partnership to reduce device prices through collective purchasing and negotiation, as well as enabling larger-scale logistics and supply-chain operations. In Japan, NTT DoCoMo has conducted considerable research on chipset integration, enabling cheaper handset platforms (see the case study above).

Operator home gateways

Probably the most visible and successful area for operator-controlled and branded devices has been the home gateway provided by many ADSL operators, as well as their peers’ offerings of cable modems and set-top boxes. While these are usually produced by companies such as Thomson / Technicolor and 2Wire, many operators undertake very substantial customisation of both hardware and software.

Up to a point, these products have acted as service “hubs”, enabling fixed broadband providers to offer a variety of value-added options such as IPTV, VoIP, remote storage and other service offerings. They normally have WiFi (and sometimes “community” connectivity such as the BT / FON tie-up) and various ports for PCs and other devices. Some incorporate wireless DECT or WiFi phones. Most are remotely manageable and can support software upgrades, as well as some form of interactivity via the customer’s PC. Given that most home broadband contracts last at least a year – and are rarely churned – the cost can be amortised relatively easily into the ongoing service charges.

That is the good side of home gateways. The downside is that they rarely generate incremental revenue streams after the initial installation. Users only infrequently visit operators’ portals, and use the devices’ built-in management software even less often. They respond with indifference to most forms of marketing after the initial sign-up: anecdotally, telephone sales and direct mail have poor response rates.

Nevertheless, these products still form a centrepiece of many broadband providers’ strategies and competitive differentiation:

  • Most obviously, they are needed to support higher broadband speeds, which remains the key differentiator between telcos selling ADSL or cable connectivity. “Upgradeability” to faster speeds is one of the most likely options to drive aftermarket revenue uplift or induce loyalty via “free” improvements whilst maintaining price against a falling market. In some countries, the ability to support fibre as well as copper is an important form of future-proofing. Potentially, the inclusion of femtocell modules also confers extra upgrade potential.
  • If well-designed, they can prompt selection of a higher-end monthly tariff or bundle at the initial sale, especially where the operator has a range of alternative products. For example, Orange sells its low-end plans with a basic wireless router, while its higher-end offerings use its LiveBox to support value-adds like VoIP, UMA and so forth. BT offers a free DECT handset with its top-end bundle.
  • Gateways can reduce operating costs, especially if they have good self-diagnostics and helpdesk software.
  • In some cases, the gateway can stimulate an ecosystem of accessories such as cordless handsets or other add-ons. Orange, once again, uses its LiveBox as a platform for additional “digital home” products such as a networked storage drive, Internet radio and even a smoke detector*. These can either generate additional revenue directly in hardware sales, or by incremental services – or even just greater utilisation of the base offers. In the future, it seems likely that this approach could evolve into a much broader set of services, such as smart-grid electricity monitoring.

(*The Orange France smoke detector service is interesting, in that it comes with two additional options for the user to subscribe to either Orange’s own €2 per month alerting service, or a third-party “upstream” insurance and assistance firm’s more comprehensive offering [Mondial Assistance] at €9 per month)

As such, the gateway is (in the long term) potentially a massive asset to operators wishing to pursue two-sided models. It can act as a control point for network QoS, helping differentiate certain end-user ‘consumption’ devices through physical ports or separate WiFi identities. It can store information or provide built-in applications (for example, web caching). This approach could enable a work-around for Net Neutrality, if two-sided upstream partners’ applications are prioritised not over the Internet connection, but instead by virtue of having some form of local ‘client’ and intelligence in the operator’s broadband box. While this might not work for live TV or real-time gaming, there could certainly be other options that might allow more ‘slice and dice’ revenue to be extracted.

It is also much more feasible (net neutrality laws permitting) to offer differentiated QoS or bandwidth guarantees on fixed broadband, when there is a separate hardware device acting as a “demarcation point”, and able to measure and report on real-world conditions and observed connectivity behaviour. This is critical, as it seems likely that “upstream” providers will demand proof that the network actually delivered on the QoS promises.
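
As a sketch of what such a demarcation point might report – all field names, thresholds and SLA terms here are hypothetical, invented purely for illustration:

```python
# Hypothetical sketch of a gateway acting as a QoS "demarcation point":
# it samples the line periodically and summarises whether delivered
# quality met an assumed SLA. Field names and thresholds are invented
# for illustration, not taken from any real operator system.

from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class QosSample:
    latency_ms: float
    downstream_kbps: float

def sla_report(samples: List[QosSample],
               max_latency_ms: float = 50.0,
               min_downstream_kbps: float = 4000.0) -> dict:
    """Summarise measured conditions against the assumed SLA terms."""
    if not samples:
        return {"samples": 0}
    met = [s for s in samples
           if s.latency_ms <= max_latency_ms
           and s.downstream_kbps >= min_downstream_kbps]
    return {
        "samples": len(samples),
        "sla_compliance_pct": round(100.0 * len(met) / len(samples), 1),
        "avg_latency_ms": mean(s.latency_ms for s in samples),
    }

# Example: evidence that could be reported back to an upstream partner.
report = sla_report([QosSample(32, 5200), QosSample(41, 4800), QosSample(95, 1500)])
print(report)  # {'samples': 3, 'sla_compliance_pct': 66.7, 'avg_latency_ms': 56.0}
```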

The bottom line is that operators intending to leverage in-home services need a fully-functional gateway. It is notable that some operators are now backing away from these towards less-functional and cheaper ADSL modems (for example, Telecom Italia’s Alice service), which may reflect a recognition that added-value sales are much more difficult than initially thought.

It is difficult to monetise PCs beyond “pipes”

Despite our general enthusiasm for innovation in gaining revenues from new “upstream” providers, Telco 2.0 believes that the most important two-sided opportunities will involve devices other than PCs. We also feel it is highly unlikely that operators will be able to sell many incremental “retail” services to PC users beyond connectivity. That said, we can envisage some innovation in pricing models, especially for mobile broadband, in which factors like prepaid, “occasional” nomadicity and offload may play a part. There may also be some bundling – for example of music services, online storage or hosted anti-virus / anti-spam functions. One other area of exception may be cloud computing services for small businesses.

Although the popular image of broadband is people on Facebook, running Skype or BitTorrent, or watching YouTube on a laptop, these services are not likely to support direct ‘slice and dice’ wholesale capacity revenues from the upstream providers. Telco 2.0 believes that in certain cases (e.g. fixed IPTV), Internet or media companies might be prepared to pay an operator extra for improved delivery of content or applications. But there is very little evidence that PC-oriented providers such as YouTube will be prepared to pay “cold hard cash” to broadband providers for supposed “quality of service”. PCs are ideal platforms for alternative approaches – rate adaptation, buffering, or other workarounds. PC users are comparatively tolerant, and are more prone to multi-tasking while downloads occur. However, these companies may still be able to generate advertising revenue-share, telco B2B value-added services (VAS) and API-based revenues in some circumstances – especially via mobile broadband.

That said, for mobile broadband, PCs are really more of a problem than an opportunity, generating upwards of 80% of downstream data traffic for many mobile operators – 99.9% of which goes straight to the Internet, through what is actually quite complex and expensive core network “machinery”. Offloading PC-based mobile traffic to the Internet via WiFi or femtocell is a highly attractive option – even if it means forgoing a small opportunity for uplift. The benefits of increasing the capacity available for smartphones or niche devices, without extra capex on upgrades, far outweigh this downside in most cases.

In the fixed world, the data consumption of PCs may eventually look like a red herring, except for the most egregiously-demanding users. The real pain (and, perhaps, opportunity) in terms of network costs will increasingly come from other devices connected via broadband, especially those capable of showing long-form HD video, like large-screen TVs and digital video recorders. Other non-PC devices connected via fixed broadband include game consoles, tablets, smartphones (via WiFi), femtocells, smart meters, healthcare products and so on.

As the following section describes, PC-based applications are generally too difficult to track or charge for on a granular basis, while other supplementary products and associated applications tend to be easier to monitor and bill – and often have value chains and consumer expectations that are more accepting of paid services.

The characteristics which distinguish PCs from other broadband-connected devices include:

  • High-volume traffic. With a few exceptions that can be dealt with via caps or throttling, most PC users struggle to use more than perhaps 30GB/month today on fixed broadband, and 5GB on mobile. This is likely to scale roughly in parallel with overall network capacity, rather than out-accelerate it. Conversely, long-form professional video content has the potential to use many GB straight away, with a clear roadmap to ever-higher traffic loads as pixel densities increase. Clearly, PCs are today often facilitators in video downloads, but relatively few users can be bothered to hook their computers up to a large screen. In the future, there are likely to be more directly Internet-connected TVs, as well as specialist boxes like the Roku;
  • Multiple / alternative accesses. PCs will increasingly be used with different access networks – perhaps ADSL and WiFi at home, 3G mobile broadband while travelling, and paid WiFi hotspots in specific locations. This makes it much more difficult to monetise any individual pipe, as the user (and content/app provider) has relatively simple methods for arbitrage and ‘least cost routing’;
  • Likelihood of obfuscation. PCs are much more likely to be able to work around network policies and restrictions, as they are ideal platforms for new software and are generally much less controlled by the operator or vendor. Conversely, the software in a TV or health monitoring terminal is likely to be static, and certainly less prone to user experimentation. This means that if the network can identify certain traffic flows to/from a TV today, they are unlikely to have changed significantly in a year’s time. Nobody will install a new open-source P2P application on their Panasonic TV, or a VPN client in their blood-pressure monitor. PC applications, by contrast, will require a continual game of cat-and-mouse to keep on top of. With non-PC devices there is also much less risk of Google, Microsoft or another supplier giving away free encryption / tunnelling / proxying software and hiding all the data from prying DPI eyes;
  • Cost of sale and support. Few Telcos are going to want to continually make hundreds of new sales and marketing calls to the newest ‘flavour of the month’ Web 2.0 companies in the hope of gaining a small amount of wholesale revenue. Conversely, a few ‘big names’ in other areas offer much more scope for solid partnerships – Netflix, Blockbuster, BBC, Xbox Live, Philips healthcare, Ubiquisys femtocells and so on. A handful of consumer electronics manufacturers and other Telcos represents a larger and simpler opportunity than a long tail of PC-oriented web players. Some of the latter’s complexity will be reduced by the emergence of intermediary companies but even with these, operators will almost certainly focus on the big deals;
  • Reverse wholesale threats. The viral adoption and powerful network effects of many PC-based applications mean that operators may be playing with fire if they try to extract wholesale revenues for data capacity. It is very easy for users of a popular site or service (e.g. Facebook) to mobilise against the operator – or even for the service provider to threaten to boycott specific ISPs and suggest that users churn. This is much less likely for individual content-to-person models like TV, where it is easier to assert control from a BSP (broadband service provider) point of view;
  • Consumer behaviour and expectations. Consumers (and content providers) are used to paying more/differently for video viewed on a TV versus on a PC. Similarly, the value chains for other non-PC services are less mature and are probably easier for fixed BSPs to interpose themselves in, especially while developers and manufacturers are still dealing with ‘best efforts’ Internet access. PC-oriented developers are already good at managing variable connection reliability, so tend to have less incentive to pay for improvements. There are some exceptions here, such as applications which are ‘mission critical’ (e.g. hosted Cloud / SaaS software for businesses, or real time healthcare monitoring), but most PC-based applications and their users are remarkably tolerant of poor connectivity. Conversely, streaming HD video, femtocell traffic and smart metering have some fairly critical requirements in terms of network quality and security, which could be monetised by fixed BSPs;
  • Congestion-aware applications. PC applications (and to a degree those on smartphones) are becoming much better at watching network conditions and adapting to congestion. It is much more difficult for a BSP to charge a content or application provider for transport if the provider can instead invest the money in more adaptive and intelligent software, as sketched below. This is much more likely to occur on higher-end open computing devices with easily-updateable software.
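
To illustrate that last point, here is a minimal sketch of the kind of adaptation logic involved; the bitrate ladder and safety factor are invented for illustration:

```python
# Hypothetical sketch of the adaptation logic described above: a client
# picks the highest bitrate that recent measured throughput can sustain,
# rather than relying on (or paying for) network QoS guarantees.

BITRATES_KBPS = [350, 700, 1500, 3000]   # illustrative encoding ladder
SAFETY_FACTOR = 0.8                       # keep headroom below measured rate

def pick_bitrate(measured_throughput_kbps: float) -> int:
    """Return the highest bitrate that fits within the safety margin."""
    usable = measured_throughput_kbps * SAFETY_FACTOR
    candidates = [b for b in BITRATES_KBPS if b <= usable]
    return max(candidates) if candidates else BITRATES_KBPS[0]

# As the network congests, the application quietly downshifts:
for throughput in (4000, 2000, 900, 400):
    print(throughput, "kbps available ->", pick_bitrate(throughput), "kbps stream")
```

The point is that this adaptation loop sits entirely in software the operator does not control – the application routes around congestion rather than paying to avoid it.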

Taken as a whole, Telco 2.0 is doubtful that PCs represent a class of device that can be exploited by operators much, beyond connectivity revenues. In the fixed world, we feel that telcos have other, better, opportunities and more important threats (around video, tablets and new ecosystems like smart grids). In the mobile world, we think operators need to consider the cost of servicing PC-based mobile broadband, rather than the mostly-mythical new revenue streams – and just focus on managing or offloading the traffic with the greatest ease and lowest cost feasible.

PCs are unlikely to disappear – but they should not command an important share of telcos’ limited bandwidth for services innovation.

Devices and new telco business models

The last part of the previous section gave a flavour of how network end-points might contribute to business model innovation, or at least permit the layering-on of incremental services such as the Orange smoke-detector offering. It is notable that, in that case, the new proposition is actually a “two box” service, involving a generic telco-controlled unit (the LiveBox gateway) together with a separate device that actually enables and instantiates the new service (the detector itself).

When it comes to generating new device-based operating and revenue models, telcos have two main choices:

  • Developing services around existing multi-purpose devices (principally PCs or smartphones)
  • Developing services around new and mostly single-application devices (Internet TVs, smart meters, healthcare monitors, in-vehicle systems, sensors and so forth).

The home gateway, discussed above, is a bit of a special category, as it is potentially both a “service end-point” in its own right and the hub for extra gadgets hooked into it through WiFi.

The first option – using multi-function devices – has both advantages and disadvantages. The upside is a large existing user base, established manufacturers and scale economies, and well-understood distribution channels. The downside is the diversity of those marketplaces in terms of fragmented platforms and routes to market, huge competition from alternative developers and service providers, an urgent need to avoid disruption to existing revenue streams and experiences – and the strategic presence of behemoths such as Apple, Google and Nokia.
Smartphones and PCs are separately analysed later in this document, as each group has very different challenges that impinge to only a limited degree on the newer and more fragmented device types.

With new devices there is also a series of important considerations. In theory, many can be deployed in “closed” end-to-end systems with a much greater measure of operator control. Even where they rely on notionally “open” OS’s or other platforms, that openness might be exploited by the telco in terms of, say, user interface and internal programming – but not left fully open for the user to add applications. (This is perfectly normal in the M2M world – many devices, such as barcode scanners and bank ATMs, have Windows or Linux internals, but these are isolated from the user’s intervention.)

However, despite the ability to create completely standalone revenue models, there are still other practical concerns. Certain device types may fit poorly with telcos’ back-office systems, especially old and inflexible billing systems. There will also be huge issues about developing dedicated retail and customer-support channels for niche devices, outside their usual mechanisms for selling mobile services or mass-market broadband and telephony. There may also be challenges dealing with the role of the incumbent brands and their existing partnerships.

Devices map onto 4 communications models

Clearly, the device universe driving telecom services is a broad one – dominated in volume terms by mobile phones and smartphones, as well as driven from a data standpoint by PCs. There are also the numerically smaller, but highly important constituencies of fixed phones, servers and corporate PBXs. But increasingly, the landscape looks more fragmented, with ever more devices becoming network-connected and also open to applications and “smartness”. TVs, tablets, sensors, meters, advertising displays, gaming products and so forth – plus newcomers in diverse areas of machine-to-machine and consumer electronics.

Consequently, it is difficult to develop broad-brush strategies that span this diversity, especially given the parallel divergence of business models and demands on the network. To help clarify the space, we have developed a broad mechanism for classifying devices into different “communications models”. Although the correlation is not perfect, we feel that there is a good-enough mapping between the ways in which devices communicate and the ways in which users or ecosystems might be expected to pay for services.

(Note: P2P here refers to devices that are primarily for person-to-person communications, not peer-to-peer in the context of BitTorrent etc. In essence, these devices are “phones” or variants thereof, although they may also have additional “smart” data capabilities).

[Figure: devices mapped onto four communications models – person-to-person, downloads / streaming, cloud services & control, and upload]

It is worth pointing out that PCs represent a combination of all of these models. They are discussed separately, in another section – although Telco 2.0 feels that they are much more difficult to monetise beyond connectivity for operators.

Person-to-person communication

The majority of devices connected to telcos’ networks today are primarily intended for person-to-person (also sometimes called peer-to-peer) communications: they are phones, used for calling or texting other phones, both mobile and fixed. Because they have been associated with numbers – and specific people, locations or businesses – the business models have always revolved around subscriptions and continuity.

Telco 2.0 believes that there is limited scope for device innovation here beyond additional smartness – and, to a degree, smartphones (like PCs) could also be considered special cases that transcend the categories described here. They are examined below. [Note: this refers to the types of communication application – there are likely to be yet more new ways in which voice and SMS can be used, controlled and monetised, even on basic phones, through back-end APIs in the network].

Yes, there could be niche products specifically intended as “social network devices”, and clearly there is also a heritage of products optimised for email and various forms of instant messaging. But these functions are generally integrated into handsets, either operator-controlled or through third-party platforms such as BlackBerry’s email and messaging.

A recurring theme among fixed operators for the past 20 years has been the videophone. Despite numerous attempts to design, specify or sell them, we have yet to see any rapid uptake, despite the widespread use of webcams on PCs. The most recent attempt has been the advent of “screenphones” optimised for web/widget display, with additional video capture and display capabilities that vendors hope may eventually become more widely used. These too have had limited appeal.

Although handsets clearly represent a huge potential opportunity for telcos’ two-sided aspirations through voice/SMS APIs, smartphone applications and advertising, it seems unlikely that device innovation will result in totally new classes of product here. As such, operators’ person-to-person device strategy is likely to revolve around better control of smartphones’ experience and application suites, along with attempts to bring new massmarket services to featurephones. This is likely to take the form of various new web/widget frameworks such as the Joint Innovation Lab (JIL) platform, run by Vodafone, Verizon, SoftBank and China Mobile.

Other, less likely, handset business models could evolve around new “core” communications modes – although we remain sceptical, for a huge number of reasons, that the 3GPP- and GSMA-backed Rich Communications Suite will succeed in the fashion of SMS. In particular, any new core P2P mode needs very high penetration levels before reaching critical mass for uptake – something hard to achieve given the diversity of device platforms, the routes to market, and the better-than-RCS capabilities already built into products such as the iPhone and BlackBerry. Adding in a lack of clear business case, poor fit with prepay models and weak links to consumer behaviour and psychology (e.g. “coolness”), we feel that “silo” optimised solutions developed by operators, device vendors or third parties are much more likely to succeed than lowest-common-denominator “official” standards.

Downloads and streaming

The most visible – and potentially problematic – category of new connected devices is those intended as media consumption products. This includes TVs, e-book readers, PVRs, Internet radios, advertising displays and so forth. Clearly, some of these have been connected to telco services in some way before (notably via IPTV), but the recent trend of embedding intelligence (and “raw” direct Internet access) is changing the game further. Although it is also quite flexible, we believe the new Apple iPad is best placed within this category.

There are four main problems here:

  • The suppliers of these devices are often strong consumer electronics brands, with limited experience of engaging with operators at all, let alone of permitting them to interfere in hardware or software specification or design. Furthermore, their products generally have significant “offline” usage modes, such as terrestrial TV display, over which operators cannot hope to exert influence. As such, any telco involvement will likely need to be ring-fenced to the new services being supported. This also makes it difficult to conceive of many products which could be profitable if confined solely to sales within an individual operator’s customer base.
  • It is unlikely that many of the more expensive items of display and media consumption technology will be supplied directly by operators, or subsidised by them. This makes it very difficult for operators to get their software/UI load into the supply chain, unless there were generic open-Internet downloads available.
  • These devices – especially those which display high-definition video – can consume huge amounts of network resource. Living-room LCD TVs can pull down 5GB per hour if connected to the Internet for streamed IPTV, which might not even be watched if the viewer leaves the room. In the mobile domain, dedicated TV technologies have gained limited traction, but streaming music and audio can instead soak up large volumes of 3G bandwidth. There is a risk that as display technology evolves (3D, HD etc.), these products may become an even greater threat to network economics than open PCs.
  • For in-home or in-office usage scenarios, the devices will normally be used “behind” the telco access gateway and thus be outside the usual domain of operator influence. This makes it less palatable to consumers to have “control points”, and also raises the issue of responsibility for poor in-home connectivity if they are operator-controlled.

All that said, there are still important reasons for telcos to become more skilled in this category of devices. Firstly, it is important for them to understand the types of traffic that may be generated – and, possibly, learn how to identify it in the network for prioritisation. There could well be options for two-sided models here – for example, prioritisation or optimisation of HD video for display on living-room TVs, for which there may well be revenue streams to share, as well as user expectations that would not embrace “buffering” of streamed data during congested periods.

Moreover, there is a subset of this class of “display” devices which are much more amenable to entirely new business models beyond connectivity. Mobile devices such as the Apple iPad (or operator-controlled equivalents) could be bundled with content and applications. Non-consumer products such as connected advertising displays could benefit from many telco value-adds: imagine a road-side advert that changed to reflect the real-time mix of drivers in the vicinity, calculated via the operator’s network intelligence.

There are also further positives to this group of products that may offset the problems listed above. Generally, they are much less “open” than PCs and smartphones, and tend to have fixed software and application environments. This predictability makes it much less likely that new usage modes will emerge suddenly, or new work-arounds for network controls be implemented. It also makes “illicit” usage far less probable – few people are going to download a new BitTorrent client to their TV, or run Skype on a digital-advertising display.

Cloud services & control

Probably the most interesting class of new devices are those that are expected to form the centrepiece of emerging “cloud services” business models, or which are centrally-controlled in some way. In both cases, while the bulk of data traffic is downstream, there is an important back-channel from the device back to the network. Possible examples here would be smart meters for next-generation electricity grids, personal healthcare terminals, or “locked” tablets used for delivering operator-managed (or at least, operator-mediated) services into the home.

These devices would typically be layered onto existing broadband connections in the home (probably linked in via WiFi), or else could have a separate cellular module for wide-area connectivity. While they may have some form of user interface or screen, it is likely that this will not be “watched” in the same sense as a TV or media tablet, but instead used for specific interactive tasks.

These types of application have somewhat different network requirements from other devices – most typically, they will generate comparatively small volumes of data, but often with extremely high requirements for security and reliability, especially in use cases such as healthcare and energy management. Other devices may be less constrained by network quality – perhaps new appliances for the home, such as “family agenda and noticeboard” tablets.

There are numerous attractions here for operators – while these devices are likely to be used for a variety of tasks, their impact on the network in terms of capacity should generally be light. Conversely, the requirements for security should enable a premium to be charged – probably to the “ecosystem owner” such as a public-sector body or a utility. In some cases, there could well be additional associated revenue streams open to the telco alongside connectivity – both direct from end users, and perhaps also from managing delivery to upstream providers.

There is also a significant likelihood that cloud-based services will be based around long-term, subscription-type billing models, as the devices will likely be in regular and ongoing use, and also probably of minimal functionality when disconnected.

Upload

A number of new device categories are emerging that are “upload-centric” – using the telco network as a basis for gathering data or content, rather than consuming it. Examples include CCTV cameras, networks of sensors (eg for environmental monitoring), or digital cameras that can upload photos directly.

These are highly interesting in terms of new business models for telcos:

  • Firstly, they are almost all incremental to existing connections rather than substitutional – and thus represent a source of entirely new revenue, even if the operators are just supplying connectivity.
  • Secondly, this class of device is likely to involve new, wider ecosystems, often involving parties that have limited experience and skill in managing networks or devices. This provides the opportunity for operators to add significant value in terms of overall management and control. Examples include camera manufacturers, public-sector authorities operating surveillance or measurement networks and so forth. This yields significant opportunity for two-sided revenues for telcos, or perhaps overall “managed service” provision.
  • Thirdly, it is probable that traditional “subscription” models, as seen in normal telephony services, will be unwieldy or a generally poor fit with this class of device. For example, a digital 3G-uploading camera is likely to be used irregularly and is thus unsuited to regular monthly fees. It may also make sense to price such devices on a customised “per photo” basis, rather than per-MB – and it would probably be desirable to bundle a certain allowance into the upfront device purchase price. Clearly, there is value to be gained here by the telco, or by a specialist service provider like Jasper Wireless, in re-working the billing and charging mechanisms, handling separate roaming deals and so forth.

In addition, there is an opportunity to engineer these new business models from the ground up to reflect network usage and load. They are likely to generate fairly predictable traffic – most of it upstream. This may present certain challenges, as most assumptions are for download-centric networks, but the fact that application-specific devices should be “deterministic” should help mitigate those problems from a planning point of view. For example, if an operator knows that it has to support a million CCTV cameras, each uploading an average of 3MB per hour from fixed locations, that is relatively straightforward to add into the capacity planning process – certainly much more so than an extra million smartphones using unknown applications at unknown times, while moving around.
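
To show quite how tractable that planning exercise is, the back-of-envelope sketch below works through the CCTV example. The fleet size and 3MB-per-hour figure come from the paragraph above; the cell count is our own illustrative assumption, and decimal megabytes are assumed throughout.

```python
# Back-of-envelope uplink load for a fleet of fixed CCTV cameras.
# Fleet size and per-camera volume come from the example in the text;
# the cell count is an illustrative assumption.

CAMERAS = 1_000_000      # cameras in the fleet
MB_PER_HOUR = 3          # average upload per camera (decimal MB)
CELLS = 10_000           # assumed number of cells serving them

per_camera_kbps = MB_PER_HOUR * 8 * 1000 / 3600          # MB/h -> kbit/s
aggregate_gbps = per_camera_kbps * CAMERAS / 1e6         # total, Gbit/s
per_cell_mbps = per_camera_kbps * CAMERAS / CELLS / 1e3  # average per cell

print(f"Per camera: {per_camera_kbps:.2f} kbit/s sustained uplink")
print(f"Aggregate:  {aggregate_gbps:.2f} Gbit/s across the fleet")
print(f"Per cell:   {per_cell_mbps:.2f} Mbit/s average over {CELLS:,} cells")
```

Under 7kbit/s per camera, and well under 1Mbit/s of average uplink per cell on these assumptions – a near-constant load that slots straight into a dimensioning model.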

All that said, it remains unclear whether the total number of device sales and aggregate revenues will make this category a truly critical area for telcos. In many cases it is likely to be “nice to have” rather than must-have – and it is certainly not obvious that the current nascent market will be large enough to accommodate every operator in a given market attempting to enter the space simultaneously. For a few operators this area may “move the needle” if a few choice deals are struck (e.g. for national environmental monitoring), but for others it will take many years to matter, if it ever does.

One example of this category of product is the remote smoke-detector offered by Orange in France, which is provided as a value-add to its home broadband offer. This has a variety of service models, including one involving a subscription to another upstream provider of monitoring/alerting functions (Mondial Assistance), for which Orange presumably gains a revenue share.

Operators’ influence on smartphones and featurephones

Perhaps the key telco battleground at present is around smartphones. The growth of the iPhone, the entrenched position of BlackBerry, the emergence of Android and the theoretical numeric advantage of Symbian and Nokia are all important aspects of the landscape. They are encouraging data plan uptake by consumers, catalysing the applications ecosystem and – on the downside – fostering rampant bandwidth utilisation and providing ready platforms for Internet behemoths to drive services loyalty at the expense of the telcos.

In principle smartphones should be excellent platforms for operators launching new services and exploiting alternative business models – advertising, downloadable apps linked to identity or billing services, third-party payments for enhanced connectivity and so forth. Yet up until now, with a few exceptions (notably DoCoMo in Japan), there have been very limited new revenue streams on handsets beyond basic voice, messaging, ringtones and flat (or flattish) data plans. BlackBerry’s BES and BIS services are the only widely-adopted 3rd-party data services sold outside of bundles by a significant number of operators, although operator billing for their own (or others’) appstores holds potential.

This is a general area that Telco 2.0 has covered in various recent research reports, examining the role of Apple, Google, RIM and others. Fixed operators have long known what their mobile peers are now learning – as intelligence increases in the devices at the edge, it becomes far more difficult to control how they are used. And as control ebbs away, it becomes progressively easier for those devices to be used in conjunction with services or software provided by third parties, often competitive or substitutive to the operators’ own-brand offerings.

A full discussion of the smartphone space merits its own strategy report; coverage in this document, which addresses the broader device market, is therefore necessarily brief.

What is less visible is how and where operators can impose themselves in this space from a business model point of view. There is some precedent for operators developing customised versions of smartphone OS software, as well as unique devices (eg Vodafone / LiMo, DoCoMo / Symbian and Linux, or KDDI / Qualcomm BREW). Many have fairly “thin” layers of software to add some branding and favoured applications, over the manufacturer’s underlying OS and UI. Symbian and LiMo have been more accommodating in this regard, compared to Apple and RIM, with Microsoft and Palm somewhere in the middle.

However, in the majority of cases this has not led to sustainable revenue increases or competitive advantage for the operators concerned – not least because there appears to have been a negative correlation with overall usability, especially given links to back-end services like iTunes and the BlackBerry BIS email infrastructure. Where one company has complete control of the “stovepipe”, it is much easier to optimise for complexities such as battery life, manage end-to-end performance criteria such as latency and responsiveness, and be incentivised to ensure that fixing one problem does not lead to unintended consequences elsewhere. In contrast, where operators merely customise a smartphone OS or its applications, they often lack the ability to drill down into the lower levels of the platform where needed.

More recently, Android has seemed to represent a greater opportunity, as its fully open-source architecture enables operators to tinker with the lower layers of the OS if they so desire, although there are endless complexities in creating “good” smartphones outside of telcos’ main competence, such as software integration and device power management. Symbian’s move to openness could also produce a similar result. It is in this segment that operators have the greatest opportunity for business model innovation. We are already seeing moves to operator-controlled application ecosystems, as well as mobile advertising linked to the browser or other functions. That said, early attempts by operators to create own-label social networking services, or “cross-operator” applications, seem to have had limited success.

Further down the chain, it is important not to forget the huge market occupied by smartphones’ less-glamorous featurephone brethren. Especially in prepaid-centric markets where subsidy is rare, the majority of customers use lesser devices from the likes of Nokia’s Series 40 range, or the huge ranges from Samsung and LG. Worse still for operators, many of these devices are bought “vanilla” from separate retail channels over which they have little control.
While it is theoretically possible for service providers to “push” their UIs and applications down to non-customised handsets in the aftermarket, in reality that rarely happens as it has huge potential to cause customer dissatisfaction. More generally, some minimal customisation is provided via the SIM card applications – although over time this may become slightly more sophisticated.

Realistically, the only way that operators can easily control new business models linked to prepaid mobile phone subscribers is through own-brand phones (see ODM section below), or via very simple “per day” or “per month” fixed-fee services like web access or maybe video.

Overall, it could be viewed that operators are continually facing a “one step forward, two steps back” battle for handset application and UI control. For every new Telco-controlled initiative like in-house appstores, customised/locked smartphone OS’s, BONDI-type web security, or managed “policy” engines, there is another new source of “control leakage” – Apple’s device management, Nokia’s Ovi client, or even just open OS’s and third-party appstores enabling easy download of competing (and often better/free) software apps.

Multi-platform user experience

The rest of this document has talked about devices as standalone products, linked to particular services or business models. But it actually seems fair to assume that many users will be using a variety of platforms, in a variety of contexts, acquired through a myriad of channels.

This suggests that operators have some scope to define and own a new space – “multi-platform experience”. The idea is to compete to get as great an aggregate share of attention and familiarity as possible, tied both to end-user service fees and, potentially, to two-sided offerings that benefit from this extra customer insight and access.

For example, users may wish to view their photos, or access their social networks, via digital cameras, mobile phone(s), PC, tablet, TV, in-car system and various other endpoints. They will want to have similar (but not identical) preferences and modes of behaviour. Yet there will likely be one which is the cornerstone of the overall experience, with the others expected to be reflections of it. This will drive ongoing purchasing behaviour of additional devices and services – Apple has understood this well.

Operators need to either start to drive these user experience expectations and preferred interaction patterns – or be prepared to accommodate others’. For example, there now appears to be significant value to many users in ensuring that new technology products are optimised for Facebook. While this may be a blow to the operators’ hopes of dominating a particular service domain, relinquishing it may be a small price to pay for overall importance in the user’s digital lifestyle. A telco providing a tablet with a Grade-A Facebook experience has a portal through which to introduce the user to other in-house services.

Recommendations

For mobile operators

  • The key element of device strategy remains the selection, testing and sale of handsets – along with basic customisation and obtaining exclusivity where possible. Larger operators – especially those which are in post-paid centric markets – have more flexibility in creating or pushing new device classes and supporting new business models.
  • Mobile operators do not have a distinguished past in creating device UIs, with various failed experiments in on-device portals and application stacks. Consider focusing on control points (eg API security) underneath the apps and browser, rather than branding the direct interface to the user.
  • New classes of mobile device (tablets, in-car devices, M2M) are less risky than smartphones, but are unlikely to “move the dial” in terms of revenues for many years. They will also likely require more complex and customised back-end systems to support new business models. Nonetheless, they can prove fruitful for long-term initiatives and partnerships (eg in healthcare or smart metering).
  • Bridge the gap between RAN and device teams within your organisation, to understand the likely radio impacts of new products – especially if they are for data-hungry applications or ones with unusual traffic patterns such as upstream-heavy. Silicon and RF may be complex and “unsexy”, but they can make a huge difference to overall network opex and capex.
  • While Android appeals because of its ODM-friendliness and flexibility, it remains unproven as an engine for new business models and still has uncertain customer appeal. Do not turn your back on existing device partnerships (RIM, Apple, Nokia etc) until this becomes clearer.
  • Yoda in Star Wars had wise advice: “Do. Or do not. There is no try.” Creating devices is expensive, time-consuming and not for the faint-hearted. Uncommitted or under-resourced approaches may end up causing more harm than good. Be prepared to write some large cheques and do it right, first time.
  • If you are serious about investing in fully-customised handsets, consider following 3’s path with INQ and sell them to other non-competing operators around the world, to amortise the costs over greater volumes.
  • Examine the potential for raising revenue or customer satisfaction from device-side utilities rather than principal applications. For example, self-care or account-management apps on a smartphone can be very useful, while well thought-out connection management clients for mobile broadband PCs are a major determinant of customer loyalty.
  • Another promising domain of device specialism lies around creating enhanced experiences for existing successful applications – for example porting Facebook and Twitter, or particular media properties, to custom software loads on handsets. Done well, this also has the potential to form the basis of a two-sided business model. For example, if an operator pitched a “YouTube-optimised” phone, tied in with end-to-end network policy management and customer data exposure, there could be significant advertising revenue-share opportunities.
  • Mobile operators should generally consider enterprise-grade devices (eg tablets, meters, in-vehicle systems) only in conjunction with specialist partners.

  • De-prioritise initiatives around netbooks and laptops with embedded 3G connectivity. They represent huge loads on the network, are difficult to sell, and are extremely hard to monetise beyond “pipe” revenues.

For fixed & cable operators

  • The core recommendation is to continue focusing on (and enhancing) existing home gateway and set-top box products. These should be viewed as platforms for existing and future services – some of which will be directly monetisable (eg IPTV) while others are more about loyalty and reduction of opex (eg self-care and integrated femtocell modules).
  • Consider the use of relatively inexpensive custom devices (eg WiFi tablets) which are locked to usage via your gateway. Potentially, these could be given for free in exchange for a commitment to longer/renewed contracts or higher service tiers – and may also form the basis of future services provided via appstores or widgets.
  • Work collaboratively with innovative consumer electronics suppliers in areas such as Internet-connected TVs and games consoles. These vendors are potentially interested in end-to-end cloud services – including value-added capabilities from the network operators. They may also be amenable to suggestions on how to create “network-friendly” products, and co-market them with the operator.
  • Some operators may have the customer branding strength and physical distribution channels to sell adjunct products such as storage devices, Internet radios, IPTV remote controls and so forth. There may be additional revenue opportunities from services as well – for example, including a Spotify subscription with a set of external speakers. However, do not underestimate the challenges of overall system integration or customer support.
  • Take a leadership role in pursuing digital home opportunities. There is a narrow window of opportunity in which fixed operators have the upper hand here – over time, it is likely that mobile operators and their device vendors will start to gain more traction. For now, WiFi (and maybe powerline) connections are the in-home network of choice, with the WiFi router provided by a fixed/cable operator being at its centre.
  • A pivotal element of success is ensuring that an adequate customer support and device-management system is in place. Otherwise incremental opex costs may more than offset the benefits from incremental revenue streams.

  • Fixed telcos should look to exploit home networking gateways, femtocells and other CPE, before consumer electronics devices like TVs and HiFis adopt too many “smarts” and start to work around the carrier core, perhaps accessing YouTube or Facebook directly from the remote control. At present, it is only open devices with a visible, capable and accessible user interface or browser (e.g. PCs and smartphones) that can exploit the wider Internet. Inclusion of improved Internet connectivity and user control in other classes of device will broaden their ability to circumvent operator-hosted services.

Conclusions

Telcos need to face the inevitable – in most cases, they will not be able to control more than a fraction of the total computing and application power of the device universe, especially in mobile or for “contested” general-purpose devices. Even broadband “device specialists” will need to accept that their role cannot diminish the need for some completely “vanilla” network end-points, such as most PCs.

But that does not mean they should give up trying to exert influence or design their own hardware and software where it makes sense – as well as developing services that compete on equal terms with the web, for those devices beyond their direct reach.

They should also ensure that at least as much consideration is given to optimising devices for their current business models as to hoping they can form the basis of innovative offerings.

Some of the most promising new options include:

  • Single-application “locked” mobile devices, perhaps optimised for gaming or utility metering or navigation or similar functions, which have a lot of potential as true “terminals” and the cornerstone of specific business models, albeit used in parallel with users’ other smart devices.
  • Even notionally-open devices like smartphones and tablets can be controlled, especially through application-layer pinch points. Apple is the pre-eminent exponent of this art, controlling the appstore with an iron fist. This is not easy for operators to emulate, but is a very stark benchmark of the possible outcome. Android can help here, but only for those operators prepared to invest sufficient time and money on getting devices right. Another option is to work with firms like RIM, which tend to have more “controllable” OS’s and which are operator-friendly.
  • It is far easier for the operator to exert its control at the edge with a standalone, wholly-owned and managed device, than via a software agent on a general computing device like a smartphone or notebook PC. However, it is more difficult and expensive to create and distribute a wholly-owned and branded device in the first place. Few people will buy a Vodafone television, or an AT&T camera – partnerships will be key here.
  • Devices which support web applications only (eg tablets) are somewhat different propositions to those which can also support “native” applications. Operators are more likely to find the “security model” for a browser cheaper and easier to manage than a full, deep OS, affording more fine-grained control over what the user can and cannot do. The downside is that browser-resident apps are generally not as flexible or powerful as native apps.
  • On devices with multiple network interfaces (3G, WiFi, Bluetooth, USB etc) a pivotal control layer is the “connection manager”, which directs traffic through different or multiple paths. In many cases, some of those paths will be outside operator control, allowing “leakage” of application data and thus revenue opportunity.
  • Even where aspects of the device itself lie outside Telcos’ spheres of control, there are still many “exposable” network-side capabilities that could be exploited and offered to application providers, if Telcos’ own integrated offerings are too slow or too expensive. Identity, billing, location and call-control can be provided via APIs to add value to third-party services, and customer data could potentially be used to help personalise services, subject to privacy constraints (a sketch of how a third party might consume such an API follows this list). However, carriers need to push hard and fast, before these are disintermediated as well. Google’s clever mapping and location capabilities should be seen as a warning sign that there will be substitutes available that do not rely on the telcos.
  • We may also see ‘comes with data’ products offered by telcos themselves, with their own product teams acting as a sort of internal upstream customer. If Dell or Apple or Sony can sell a product with connectivity bundled into the upfront price, but no ongoing contract, why not the operators themselves?
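
As flagged above, here is a minimal sketch of how a third-party application might consume an operator-exposed location API. The host, path, parameters and response fields are all hypothetical, loosely modelled on the REST style of exposure initiatives such as GSMA OneAPI, rather than any operator’s actual interface.

```python
# Hypothetical sketch: a third party querying an operator-exposed location
# API. Endpoint, parameters and response fields are invented for
# illustration; real exposure programmes (e.g. GSMA OneAPI) differ.
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.example-operator.com/v1"  # hypothetical endpoint
API_KEY = "key-issued-by-the-telco"               # hypothetical credential

def locate_subscriber(msisdn: str) -> dict:
    """Ask the operator for a (consenting) subscriber's rough location."""
    query = urllib.parse.urlencode({"address": msisdn, "apiKey": API_KEY})
    with urllib.request.urlopen(f"{API_BASE}/location?{query}") as resp:
        return json.load(resp)

# A retailer might localise an offer this way, paying the operator per
# lookup – a simple two-sided revenue stream.
position = locate_subscriber("tel:+447700900123")
print(position.get("latitude"), position.get("longitude"))
```

The commercial point is the charging direction: the upstream party, not the subscriber, pays per transaction.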

The other side to device specialists is the potential for them to become buyers rather than sellers of two-sided services. If Operator X has a particularly good UI or application capability, then (if commercial arrangements permit), it could exploit Operator Y’s willingness to offer managed QoS or other capabilities. This is most likely to happen where the two Telcos don’t compete in a given market – or if one is fixed and the other mobile. Our managed offload use case in the recent Broadband report envisages a situation in which a fixed ‘device specialist’ uses a WiFi or femto-enabled gateway to assist a mobile broadband provider in removing traffic from the macro network.

In addition to these, there are numerous device-related “hygiene factors” that can improve operators’ bottom line, through reducing capex/opex costs, or improving customer acquisition and ongoing revenue streams. Improved testing and specification to reduce customer support needs, minimise impact on networks and guarantee good performance are all examples. For example, RIM’s BlackBerry devices are often seen as being particularly network-friendly, as are some 3G modems featuring advanced radio receiver technology.

Overall, the battle for control of the edge is multi-dimensional, and outcomes are highly uncertain, particularly given the economy and wide national variations in areas like device subsidy and brand preference. But Telcos need to focus on winnable battles – and exploit Moore’s Law rather than futilely beat against it.

Figure 3: Both hardware and software/UI provide grounds for telco differentiation


Full Article: Handsets – Demolition Derby

Summary: ‘Hyper-competition’ in the mobile handset market, particularly in ‘smartphones’, will drive growth in 2010, but also emaciate profits for the majority of manufacturers. Predicted winners, losers and other market consequences.

This is a Guest Note from Arete Research, a Telco 2.0™ partner specialising in investment analysis. Arete Members can download a PDF of this Note here.

The views in this article are not intended to constitute investment advice from Telco 2.0™ or STL Partners. We are reprinting Arete’s Analysis to give our customers some additional insight into how some Investors see the Telecoms Market.

Handsets: Demolition Derby


Arete’s last annual look at global handset markets (Handsets: Wipe-Out!, Oct. ’08) predicted every vendor would see margins fall by ~500bps. This happened: overall industry profitability dropped, as did industry sales. Now everyone is revving their engines with vastly improved product portfolios for 2010. Even with 15% unit and sales growth in ’10, we see the industry entering a phase of desperate “hyper-competition.” Smartphone vendors (Apple, RIMM, Palm, HTC) should grab $15bn of the $23bn increase in industry sales.

Longer term, the handset space is evolving into a split between partly commoditised hardware and high margin software and services. Managements face a classic moral hazard problem, incentivised to gain share rather than preserve capital. Each vendor sees 2010 as “their year.” Individually rational strategies are collectively insane: the question is who has deep enough pockets to keep their vehicles in one piece.

Revving the Engines. Every vendor is making huge technology leaps in 2010: high-end devices will have 64/128GB of NAND, 8-12Mpx cameras, OLED nHD capacitive touch displays, and more features than consumers can use. Smartphones should rise 50% to 304m units (while feature phones drop 21% in units). As chipmakers support sub-$200 complete device solutions, we see a race to the bottom in smartphone pricing.

Software Smash-Up. The rush of OEMs into Android will bring differentiation issues (as Symbian faced). Beyond Apple, every software platform faces serious issues, while operators will use “open” platforms to develop their own UIs (360, OPhone, myFaves, etc.). Rising software costs will force some OEMs to adopt a PC-ODM business model, while higher-margin models of RIM and Nokia are most at risk.

Finally, the Asian Invasion. Samsung, HTC and LGE now have 30% ’09E share, with ZTE, Huawei, MTEK customers and PC ODMs all joining the fray. All seek 20%+ growth. Motorola and SonyEricsson are being forced to shrink footprint, and shift risk to ODM partners. Nokia already has an Asian cost base, but lacks new high-end devices outside its emerging markets franchise. Apple looks set to claim 40% of industry profits in ’10, as other OEMs fight a brutal war of attrition, egged on by buoyant demand for fresh products at record low prices.

[Table 1]


Forget Defensive Driving

Our thesis for 2010 is as follows: unit volumes will rebound with 15% growth, with highly competitive pricing to keep volumes flowing. This will be driven by highly attractive devices at previously unimaginably low prices. Industry sales will also rise 15%, by $23bn, but half of the extra sales ($11bn) will be taken by Apple. Industry margins will remain under pressure from pricing and rising BoM costs. Every traditional OEM, smartphone pure-play, and new entrant is following an individually rational strategy: improve portfolios, promise the moon to operators, and price to gain share. Those that fail to secure range planning slots at leading operators will develop other channels to market. Collectively, the industry is entering a period of desperation and dangerous self-belief. There are few incentives to exercise restraint for the likes of Dell (led by ex-Motorola management), Acer (the consistent PC winner at the low-end), Huawei and ZTE (which view devices as complementary to infrastructure offerings) or Samsung (where rising device units help improve utilisation of its memory and display fabs). Motorola and SonyEricsson must promote themselves actively, just to find sustainable business models on 4% share each.

Table 2 shows industry value; adjusted for the impact of Apple, it shows a continuous 4-5% decline in ASPs (though currencies also play a role). The challenge for mainstream OEMs (Nokia, Samsung, LGE, etc.) is to win back customers now exhibiting high loyalty after switching to iPhone or Blackberry. Excluding gains by Apple and RIM, industry sales are on track to fall 13% in ’09. Apple, RIM, Palm and HTC will collectively account for $15bn of our forecast incremental $23bn in industry sales in ’10E.

[Table 2]

Within this base, we see smartphones rising from 162m units in ’08 (13% of the total) to 304m units, or 23% of total ’10E shipments. At the same time, featurephone/mid-range units will drop by 21% in ’09 and 21% again in ’10.

Key Products for 2010

  • Both SonyEricsson and LGE have innovative Android models coming in 1H10, LG with distinctive designs and gesture input, and a new SonyEricsson UI and messaging method.
  • Nokia’s roadmap features slimmer form factors, but a range of capacitive touch models will not come until 2H10. It will update the popular 6300/6700 series with a S40 touch device in 1H10.
  • Samsung has its usual vast array of products, and its plan for 100m touch models in ’10 underlines the extent of its form-factor transition.
  • Motorola’s line-up will focus on operator variants, with a lead device shipping in 2Q10, but a number of operators think Motorola lacks distinctive designs and see little need for Blur.
  • RIM will not change its current form factor approach until 2H10, when it moves to a new software platform to enhance its traditional QWERTY base. It faces commercial challenges around activation and services fees with carrier partners.
  • We expect Apple to reach lower price points and also launch CDMA-based iPhones in ’10.
  • HTC must also reduce its costs to address mid-range prices.
  • Every vendor plans to widen its portfolio with several “hero” models in 2010; if anything the window to hype any single launch is narrowing.

Main Trends

Discussions with a wide range of operators, vendors and chipmakers about 2010 device roadmaps point to an explosion of attractive products – a few trends stand out:

  • Operators are now deeply engaging Chinese vendors. Huawei and ZTE have Android devices coming, while TCL and Taiwanese ODMs offer low-end devices. Chipmakers confirm Android devices will drop under $100 BoM levels by YE10. This will pressure both prices and margins. The value chain is shifting rapidly to more compute-intensive devices, with Qualcomm and others enabling Asian ODMs to be active in new PC segments with smartphone-like features (touch, Adobe Flash, 3G connectivity, etc.) in large-screen form factors, to leverage their LCD base.
  • All devices will become “smartphones.” Samsung and Nokia are opening up APIs for mass market phones. The smartphone tag (vs. dumb ones) will be applied to devices of all sorts, the way we formerly spoke of handsets. By the end of 2010, all devices (except basic pre-paid models) will be customisable with popular applications (e.g., search, social networking, IM, etc.) even if they lack hardware for video content (i.e., memory and codecs) or mapping (GPS chipsets). Open OS devices should rise 50% to 304m units, 23% of the total market.
  • Pure play smartphone vendors (RIMM, HTC, Palm) must transition business models to emulate Apple (i.e., linking devices with services and content). Launching lower-cost versions of popular models (RIMM’s 8520, HTC’s Tattoo, Palm’s Pixi) implicitly recognises how crowded the high-end ($400+) is becoming. This will get worse as Motorola and SonyEricsson seek to re-invent themselves with aspirational models, and Android devices hit mid-range prices in ’10.

Fearless Drivers

We had said before that key purchase criteria (design, features, brand) were reaching parity across OEMs, splitting the market into basic “phones” (voice/camera/radio) and Internet devices. The former has room for two to three scale players: Nokia, Samsung, and a third based on a PC-OEM model using standard offerings (e.g., Qualcomm or MTEK chipsets). LG and ZTE are both seeking this position, from which SonyEricsson and Motorola retreated to focus on Internet devices. This does not mean mobile devices are now commodities, like wheat or steel. The complexity of melding software and hardware in tiny, highly functional packages is not the stuff of commodity markets. But we see a split where a narrow range of standard hardware platforms will accommodate an equally narrow set of software choices. Mediatek is blazing a trail here. Some operators (Vodafone, China Mobile, etc.) aim to follow this model for pre-paid and mid-range featurephones. Preserving software and services value-add for consumers in a market where hardware pricing is fairly transparent is a challenge for all OEMs.

This model is not confined to the low-end: In Wipe Out! we said Motorola (among others) would adopt an HTC/Dell model (integrating standard chipsets/software and cutting R&D). This is happening, with Motorola no longer trying to control its software roadmap, having fully adopted Android. SonyEricsson is following suit, with initial Android devices coming in 1Q10.

Recent management changes make it even more likely SonyEricsson gets absorbed into Sony to integrate with content (as its new marketing campaign presages). Internet devices will become even more fragmented as would-be new entrants arrive in ’10. In addition to Nokia, Apple, RIMM, HTC and Palm, LG and Samsung intend to build a presence in smartphones, as do Huawei, ZTE and PC ODMs. We had expected LGE or Samsung to consider M&A (i.e., buying HTC or Palm) to cement their scale or get a native OS platform. We forecast the shift to Internet devices would bring 27m incremental units from RIM, HTC, and Apple in ’09E. This now looks like it will be 21m units (partly due to weaker HTC sales), a growth of 58% vs. an overall market decline of 6%.

Growth: Steaming Again

After a long string of rises in both units and industry value, the global handset market retreated in ’09. We see risk of a weaker 1H10 mitigated in part by trends in China (3G) and India (competition among new operators). The industry had already scaled up for 10-20%+ growth during the ’05-’08 boom; most vendors have highly outsourced business models and/or partly idle capacity, meaning they could produce additional units relatively quickly. Paradoxically, 15% unit and sales growth will further encourage aggressive efforts to gain share.

Our regional forecasts are in Table 3. Emerging markets are two-thirds of volumes in ’09E and ’10E, and will lead growth – at ever lower price points – as they adopt 3G. Market dynamics vary sharply between highly-subsidised, contract-led markets (i.e., the US, Japan/Korea, and W. Europe) and pre-paid-led emerging markets (China, India, E. Europe, MEA and LatAm). In the former, operators are driving smartphone adoption; while price erosion helps limit subsidy budgets, we see growth in handset market value. As Table 4 shows, mobile data handsets hit 10%+ of EU operator sales, but are not yet driving operators’ sales growth.

[Table 3]

[Table 4]

In emerging markets, the growth in value is led by further volume increases for low-cost handsets (LCHs). In ’05, we saw an inflection point around Low-Cost Handsets: Every Penny Counts (July ’05) and A Billion Handsets in ’07? (Aug. ’05). Since ’05, there were 1.2bn handsets shipped in China and India alone. LCH chipsets now sell for <$5, with only Infineon and Mediatek actively supplying meaningful volumes. The ongoing mix shift to emerging markets and weak sales of mid-range devices in developed markets were behind the 13% decline in industry value in ’09E, excluding Apple’s sales. Of the extra 170m units we see shipping in ’10E, 105m come from emerging markets, with ~50m sold in China and India.

Costs: Relentless Slamming

In Wipe Out!, Arete laid out four areas where costs might rise in ’09 and beyond, as the source of structural pressure on industry margins. None of these costs are easing or receding. First, the chipset market is increasingly concentrating. TI is exiting, ST-Ericsson continues to lose money, Infineon recovered but still lacks scale in 3G, and Mediatek dominates outside the top five OEMs. This leaves Qualcomm in a de facto leadership position in 3G. This structure does not support meaningful cost reduction for OEMs. Intel may seek an entry to disrupt the market (see Qualcomm v Intel, Fight of the Century, Sept. ’09) but this is unlikely to happen until ’11. Memory may be in short supply in ’10, while high-end OLED displays still face shortages. Capacity cuts and losses at smaller component suppliers in ’09 limit how much OEMs can save. Outsourced manufacturers like Foxconn, Compal, Jabil, BYD, and Flextronics have low margins and poor cash flow. OEMs want to transfer more risk to suppliers that have little room to cut further.

Second, feature creep also thwarts cost reduction efforts: packing more into every phone is needed to stimulate demand, but adds cost. There are rising requirements in the mid-range, going from 2Mpx to 3.2/5Mpx camera modules, and adding touch, more memory, and multi-radio chipsets (3G, WiFi, BT, FM, etc.). Samsung already offers a 2Mpx touchscreen 2G phone for <$100 on pre-paid tariffs.

Third, software remains the fastest-rising element of handset costs. In Mobile Software Home Truths (Sept. ’09), we discussed how software was adding costs, and how many OEMs were struggling to realise value from their software investments. Adopting “licence-free” or open source software does not necessarily reduce these costs: it must still be managed within industrial processes. Yet saving licence costs will be the argument used by OEMs forced to limit the number of platforms they support, as Samsung did by recently indicating it would abandon Symbian. We understand WinMo efforts have been largely mothballed at Motorola and SonyEricsson, even as LG is increasing its spend around Microsoft. Costs are also rising for integration of services. Software costs are not falling; vendors are just shifting them from handset bill-of-materials (BoM) to other companies’ R&D budgets.

Finally, marketing costs are also rising. Vendors must provide $10m-50m per market of above-the-line marketing support and in-store promotions, to get operators to feature “hero” products. Services add costs for integration and (often-overlooked) indirect product costs (testing, warranty, logistics, price protection in the channel). SG&A must rise to educate users about new services. OEMs cannot retain or win customers in a mature market without more marketing.

The case for services remains simple and compelling: Nokia’s 33% gross margin on €65 ASPs yields €22 gross profit per device, or €1/month over a two-year lifetime. This is the only way to offset further pressure on device profits. The drive to launch Services is another cost OEMs must bear, with a longer payback than that of 12-18 month design cycles for devices.
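
For clarity, the arithmetic behind that claim, using the figures quoted above (the €22 and €1/month in the text reflect Arete’s rounding):

$$0.33 \times €65 \approx €21.5 \text{ gross profit per device}, \qquad \frac{€21.5}{24\ \text{months}} \approx €0.9 \text{ per month}$$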

Margins: Beyond Fender Benders

When Motorola has lost $4bn since ’07 and SonyEricsson may lose as much as €1bn in ’09, we are no longer talking about minor dents. Gross margins for both are already low (sub-20%). The most notable feature of the past few years was how exposed some vendors were when extensions of hit products (or product families) fell flat. SonyEricsson went from 13% 4Q07 margins to breakeven by 2Q08, and RIM saw group gross margins drop 1000bps. Only Nokia (at 33%), RIM, Apple and HTC have gross margins above 30%. Few OEMs managed to raise gross margins after seeing them decline, though we see SonyEricsson and Motorola seeking to do so by vastly reducing their scope of activities.

Having an Asian low-cost base is a necessary but not sufficient condition of survival. Nokia is already the largest Asian producer, with the industry’s two largest plants (in China and India) giving it the lowest cost structure (i.e., the lowest ASPs, but consistently among the highest margins). Few OEMs other than Nokia make money selling LCHs (i.e., sub-€30). Nokia made ~60% of industry profits in ’08, but will be surpassed in profits in ’09 by Apple, which should make 40% of industry profits in ’10, while Nokia has 25%. It is also worth noting that we forecast margins to fall at nearly every vendor in ’10, though Motorola and SonyEricsson must end large losses, and Nokia will benefit from IPR income within its Devices margin.

[Table 5]


Software: Mutual Destruction?

The mobile industry is rapidly adopting the IT industry’s software as a service (SaaS) model. The handset is becoming a distribution platform for services and content; vendors aim to monetise a “community” of their device users. Yet for all the attention it gets, software is a means to an end, and not part of the product. Beyond RIM and Apple, only Nokia can afford its own smartphone platform R&D (i.e., Symbian), yet we see Nokia itself moving closer to Microsoft. Money alone cannot solve software or services issues; if so, Nokia’s industry-leading €3bn R&D budget would have yielded more success, while Apple would not have grabbed as much profit share with a $1.3bn group-wide R&D budget.

No vendor yet excels at ease-of-use for multiple applications (voice, SMS, music, video, browsing, navigation, etc.). RIM offers best-in-class messaging, but falls short in other use cases. The iPhone’s Web experience allowed it to overcome shortcomings in multi-threading and voice/text. Samsung has few services to accompany its sleek designs or high-spec displays and cameras. Just going to 70-100m touch-screen devices in ’10 will not resolve ease-of-use issues.

A number of vendors risk getting addicted to “free” software platforms where others reap the benefits (e.g., Android). Few OEMs have embraced regular updates of components (media players, browser plug-ins, etc.) to meet changing requirements. This is Apple’s edge (and in theory Microsoft’s, but it has not managed handset software efficiently). The current slowdown will only hasten moves to abstraction of hardware and software, long the case in PCs. What is the point of OEMs having their own “developer programmes” (e.g., MOTODEV, Samsung Mobile Innovation, SonyEricsson Developer World, etc.) if they adopt Android? To escape high software costs, some vendors are adopting a PC-OEM model: sub-20% gross margins, 1-5% R&D/sales, with little control over how services are implemented on devices.

When the Dust Settles…

After turmoil and consolidation in ’06, industry margins were robust in ’07, then plunged in ’08. Yet a hoped-for recovery in ’09 has given heart to a range of weaker players, sealing the industry’s fate.

Even with a resumption of growth, rising costs and hyper-competition look set to put pressure on margins. The precipitous impact of this may not be seen until 2011; for now, managements are not inclined to call it quits, or admit they lack a services or software play. The handset market has hardly gone ex-growth, but its rules and value chain are shifting, as seen in Apple and Google staking their claims.

The market looks to be falling less than the $11bn we forecast for ’09 (“only” $9bn), but it is Apple’s incremental sales that are changing the dynamics most. We are no fans of M&A, but would welcome moves to remove industry capacity. There are few obvious options, beyond HTC and Palm. We also think Samsung and LGE would benefit from deals that might open up their insular corporate cultures. Nokia has shown how difficult it is for an OEM to assemble a portfolio of Services offerings: none are yet best-in-class. Our verdicts on the key questions for vendors are listed in the following table. We see room for two to three scale players in LCHs/feature-phones (Nokia, Samsung and one other following a PC-OEM model). Smartphones will grow even more fragmented and hotly contested. We are not certain whether the others – SonyEricsson, LGE, Motorola, ZTE, HTC, and Japanese vendors – will emerge from 2010 in one piece.

[Table 6]

Richard Kramer, Analyst
Arete Research Services LLP
richard.kramer@arete.net / +44 (0)20 7959 1303

Brett Simpson, Analyst
Arete Research Services LLP
brett.simpson@arete.net / +44 (0)20 7959 1320

 

Regulation AC – The research analyst(s) whose name(s) appear(s) above certify that: all of the views expressed in this report accurately reflect their personal views about the subject company or companies and its or their securities, and that no part of their compensation was, is, or will be, directly or indirectly, related to the specific recommendations or views expressed in this report.

Required Disclosures

For important disclosure information regarding the companies in this report, please call +44 (0)207 959 1300, or send an email to michael.pizzi@arete.net.

Primary Analyst(s) Coverage Group: Alcatel-Lucent, Cisco, Ericsson, HTC, Laird, Motorola, Nokia, Palm, RIM, Starent.

Rating System: Long (L), Positive (+ve), Neutral (N), Negative (-ve), and Short (S) – Analysts recommend stocks as Long or Short for inclusion in Arete Best Ideas, a monthly publication consisting of the firm’s highest conviction recommendations.  Being assigned a Long or Short rating is determined by a stock’s absolute return potential, related investment risks and other factors which may include share liquidity, debt refinancing, estimate risk, economic outlook of principal countries of operation, or other company or industry considerations.  Any stock not assigned a Long or Short rating for inclusion in Arete Best Ideas, may be rated Positive or Negative indicating a directional preference relative to the absolute return potential of the analyst’s coverage group.  Any stock not assigned a Long, Short, Positive or Negative rating is deemed to be Neutral.  A stock’s absolute return potential represents the difference between the current stock price and the target price over a period as defined by the analyst.

Distribution of Ratings – As of 15 October 2009, 10.8% of stocks covered were rated Long, 6.8% Positive, 25.7% Short, 10.8% Negative  and 45.9% deemed Neutral.

Global Research Disclosures – This globally branded report has been prepared by analysts associated with Arete Research Services LLP (“Arete LLP”) and/or Arete Research, LLC (“Arete LLC”), as indicated on the cover page hereof.  This report has been approved for publication and is distributed in the United Kingdom and Europe by Arete LLP (Registered Number: OC303210, Registered Office: Fairfax House, 15 Fulwood Place, London WC1V 6AY), which is authorized and regulated by the UK Financial Services Authority (“FSA”), and in the United States by Arete LLC (3 PO Square, Boston, MA 02109), a wholly owned subsidiary of Arete LLP, registered as a broker-dealer with the Financial Industry Regulatory Authority (“FINRA”).  Additional information is available upon request.  Reports are prepared using sources believed to be wholly reliable and accurate but which cannot be warranted as to accuracy or completeness.  Opinions held are subject to change without prior notice.  No Arete director, employee or representative accepts liability for any loss arising from the use of any advice provided.  Please see www.arete.net for details of any interests held by Arete representatives in securities discussed and for our conflicts of interest policy.

U.S. Disclosures – Arete provides investment research and related services to institutional clients around the world.  Arete receives no compensation from, and purchases no equity securities in, the companies its analysts cover, conducts no investment banking, market-making or proprietary trading, derives no compensation from these activities and will not engage in these activities or receive compensation for these activities in the future.  Arete restricts the distribution of its investment research and related services to approved institutions only.  Analysts associated with Arete LLP are not registered as research analysts with FINRA.  Additionally, these analysts may not be associated persons of Arete LLC and therefore may not be subject to Rule 2711 restrictions on communications with a subject company, public appearances and trading securities held by a research analyst account.

Section 28(e) Safe Harbor – Arete LLC has entered into commission sharing agreements with a number of broker-dealers pursuant to which Arete LLC is involved in “effecting” trades on behalf of its clients by agreeing with the other broker-dealer that Arete LLC will monitor and respond to customer comments concerning the trading process, which is one of the four minimum functions listed by the Securities and Exchange Commission in its latest guidance on client commission practices under Section 28(e).  Arete LLC encourages its clients to contact Anthony W. Graziano, III (+1 617 357 4800 or anthony.graziano@arete.net) with any comments or concerns they may have concerning the trading process.

General Disclosures – This research is not an offer to sell or the solicitation of an offer to buy any security.  It does not constitute a personal recommendation or take into account the particular investment objectives, financial situations, or need of the individual clients.  Clients should consider whether any advice or recommendation in this research is suitable for their particular circumstances and, if appropriate, seek professional advice.  The price and value of the investments referred to in this research and the income from them may fluctuate.  Past performance is not a guide to future performance, future returns are not guaranteed, and a loss of original capital may occur.  Fluctuations in exchange rates could have adverse effects on the value or price of, or income derived from, certain instruments.

© 2009.  All rights reserved.  No part of this report may be reproduced or distributed in any manner without Arete’s written permission.  Arete specifically prohibits the re-distribution of this report and accepts no liability for the actions of third parties in this respect.

Full Article: 7 Strategic Priority Areas for new Telecoms Business Models

This 30+ page article can be downloaded in PDF format here.  The Executive Summary is reproduced below.

Executive Summary

Following the brainstorming sessions in Nice, we have set out below what we consider to be the most important takeaways on high-level telco strategy and each of the seven hot topics in business model innovation covered in dedicated sessions at the event:

Telco Strategy – New Revenue from New Business Models

  • There is almost universal agreement among telco executives that their industry needs to find new sources of revenue.
  • Despite the current gloomy economic climate, 93% of the delegates in Nice agreed that exploring new business models that generate new revenue is just as important in the near term as achieving operational efficiency and retaining customers.
  • Three-quarters of the delegates characterised existing business and technical transformation efforts by their company or industry as either “not very effective” or “very poor”.
  • The delegates voted the top 3 strategic actions for the industry as “Creating new levels of collaboration between service providers”, “Understanding the needs of upstream industries much better” and “Understanding the needs of end users much better”.

Open APIs – Where’s the joined-up commercial strategy?

  • There is a great deal of work being done on APIs by the operator and vendor community, but there is a real risk of this activity being derailed by the emergence of numerous independent “islands” of APIs and developer programmes.
  • It is still early days for the commercial model for APIs, but it is already becoming apparent that a one-size-fits-all solution will be difficult to achieve. It is important for operators to ensure that API platforms (and the associated revenue mechanisms) can service two distinct classes of customer:
  • Broad adoption by thousands, perhaps millions, of developers via automated web interfaces (similar to signing up for Google Adwords or Amazon’s cloud storage & computing services);
  • Large-scale one-off projects and collaborations, which may require custom or bespoke capabilities, such as being linked to subscriber data management systems or “semi-closed” or “private” APIs, for example with governments or major media companies.

Retail Services 2.0 – ‘Supermarket strategy’ not enough

  • The most attractive options around retail services involve turning the operator’s network (and possibly devices) into a platform of “enablers” for third party services and applications. These assets and capabilities may not be easy to deliver, but once in place, should provide a defensible source of value.
  • Whether a telco should also sell “enabled” services at retail depends upon their existing customer relationships, portfolio of existing in-house services and ease of developing retail partnerships.

  • Some applications simply cannot be “sold” through an operator’s retail store, as they will be integral parts of much larger services. Although Amazon can enable the sale of a huge variety of products, delivering fresh food or fuels, for example, would not fit with its logistics business. But suppliers of such goods might still exploit Amazon’s various online commerce enablers.

Devices 2.0 – Still no consistent industry strategy

  • Few fixed or mobile operators have successfully created new types of devices on their own. Few consumers, for example, would view their broadband “box” as a central hub of a home network – despite more than 10 years of discussion of interconnection with consumer electronics, utility meters and home automation.
  • In the mobile space, probably the most important customisation has been the configuration of the telco’s own portal as the default browser home page. If anything, the shift towards smartphones and PC-based mobile broadband has further weakened telcos’ role – the majority of 3G data traffic goes straight to and from the Internet from “vanilla” devices.
  • The future possibly holds some more hope. Delegates were strongly in favour of pushing for telco “control points” in otherwise open devices, which fits well with the heritage of SIM cards (which are expanding in capability) as well as standardisation in areas like the browser and widget frameworks (e.g. OMTP BONDI). Software pre-loaded with PC dongles or embedded 3G modems is another option.
  • In the converged triple/quadplay space, femtocells offer another point of control and service delivery, close to the customer, but delegates viewed the notion of a separate “gateway” product with less enthusiasm. New classes of devices such as mobile Internet devices (MIDs), operator-enabled consumer electronics (Internet TVs, 3G music players, in-car systems etc.) also hold promise, but are seen more as low-risk experiments at this point.

Online Video Distribution – Time to sort out the “Net Neutrality” Issue

  • Those pushing the ‘network neutrality’ issue are (deliberately or otherwise) causing confusion over differential pricing, which creates public relations and regulatory risks for operators that need to be addressed.
  • Operators need to develop a suite of value-added products and services for third-parties sending digital goods over their networks so they can generate incremental revenues that will enable continued network investment.
  • Sending-party-pays models may or may not work – this is an area where more experiments need to be tried. Distributors need to work on disentangling bits that can travel free from those that have to pay their way, rather than letting anyone get a free ride.

Enterprise Services 2.0 – A broader suite of platform services needed

  • Telcos need to learn how to develop, sell and support services which are customised, as well as mass-market “basic” applications and APIs. Ideally, the technical platform will be made up of underlying components (e.g. the API interface “machinery” and the associated back-office support systems) designed to cope with both ‘off the shelf’ and ‘bespoke’ go-to-market models for new services.
  • Especially in the two-sided model, there are very few opportunities to gain millions – or even tens of thousands – of B2B customers buying the same basic “product”. Google has managed it for advertising, while Amazon has large numbers of hosting and “cloud computing” customers – but these are the exceptions.
  • Perhaps the easiest and most universal horizontal markets will be enhancements to voice and messaging capabilities – after all, these are the ubiquitous cross-sector services today.
  • To really exploit unique assets and take friction out of business processes, there is a need to understand specific companies’ (or sectors’) processes in detail – and offer customised or integrated solutions. Despite the lower scale, the aggregated value may be even higher.

Technical Architecture 2.0 – Good Start, but Significant Gaps

  • Operators are in a unique position in that they have a fuller picture of customers than any single website or retailer or service provider. Several have already recognised this, and a number of vendors are offering scalable platforms which claim to be in line with the current EU legislation on data protection.
  • But as well as user profile data, the 2-sided business model requires on-demand response from the network infrastructure. Both the network and IT elements must work together to deliver this, implementing new control & monitoring systems such as Resource & Service Control Systems (RSC) – a hypothetical sketch of such an on-demand interface follows this list.
  • Most new applications are centred around app stores, mash-up environments, XaaS environments, smartphone Web browsers, etc., which do not demand a traditional service delivery platform (SDP). In addition, enabling services are becoming an essential element in operators’ core products.
  • These enabling services need a framework, which is highly flexible, agile and responsive, and integrated with the features defined by the Next Generation Mobile Networks (NGMN) alliance.
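
To make the bulleted point about “on-demand response from the network” concrete, here is a minimal sketch of how an upstream partner might request a temporary bandwidth guarantee from an RSC-style interface. The endpoint, field names and token below are all invented for illustration – this is not any real operator’s API.

```python
# Minimal sketch: an upstream content provider asks a hypothetical
# operator resource-and-service-control (RSC) endpoint for a temporary
# QoS boost on one subscriber's connection. The URL, JSON fields and
# bearer token are invented for illustration only.
import requests

RSC_ENDPOINT = "https://api.example-operator.net/rsc/v1/qos-sessions"  # hypothetical

def request_qos_boost(subscriber_id: str, downlink_kbps: int, duration_s: int) -> str:
    """Ask the network for a short-lived bandwidth guarantee; returns a session id."""
    response = requests.post(
        RSC_ENDPOINT,
        json={
            "subscriber": subscriber_id,
            "guaranteed_downlink_kbps": downlink_kbps,
            "duration_seconds": duration_s,
        },
        headers={"Authorization": "Bearer EXAMPLE-PARTNER-TOKEN"},  # placeholder
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["session_id"]

if __name__ == "__main__":
    print(request_qos_boost("subscriber-123", downlink_kbps=2000, duration_s=600))
```

The point of the sketch is the shape of the transaction – a third party pays the operator for a network behaviour, per session – rather than any particular protocol.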

Telco 2.0 Pilots – How to trial Telco 2.0 business models

  • There is insufficient time to pursue the usual protracted telco timescales for research and deliberation. Moreover, projects with long lead times – such as those involving governments – are typically unsuitable. Some target industries are also experiencing lengthening sales/decision cycles in the recession, which are also not optimal conditions for pilots.
  • Web-based companies are often the most flexible, as are some academic institutions. There may also be a geographic dimension to this – countries with low regulatory burdens, or where it is unusual to have projects stuck for months with lawyers, are attractive for pilots.
  • Working alone may be fastest, but collaborating with other operators is likely to be more effective in demonstrating the validity of the Telco 2.0 concept. 


Full Article: Devices 2.0 – Battle for the Edge; Executive Briefing Special

Introduction

NB A PDF version of this Executive Briefing is available for download.

This special Executive Briefing report summarises the brainstorming output from the Devices 2.0 section of the 6th Telco 2.0 Executive Brainstorm, held on 6-7 May in Nice, France, with over 200 senior participants from across the Telecoms, Media and Technology sectors. See: www.telco2.net/event/may2009.

It forms part of our effort to stimulate a structured, ongoing debate within the context of our ‘Telco 2.0’ business model framework (see www.telco2research.com).

Each section of the Executive Brainstorm involved short stimulus presentations from leading figures in the industry, group brainstorming using our ‘Mindshare’ interactive technology and method, a panel discussion, and a vote on the best industry strategy for moving forward.

There are six other reports in this post-event series, covering the other sections of the event: Retail Services 2.0, Content Distribution 2.0, Enterprise Services 2.0, Piloting 2.0, Technical Architecture 2.0, and APIs 2.0. In addition there is an overall ‘Executive Summary’ report highlighting the overall messages from the event.

Each report contains:

  • Our independent summary of some of the key points from the stimulus presentations
  • An analysis of the brainstorming output, including a large selection of verbatim comments
  • The ‘next steps’ vote by the participants
  • Our conclusions of the key lessons learnt and our suggestions for industry next steps.

The brainstorm method generated many questions in real-time. Some were covered at the event itself; others we have responded to in each report. In addition, we have asked the presenters and other experts to respond to some more specific points. Over the next few weeks we will produce additional ‘Analyst Notes’ with some of these more detailed responses.

NOTE: The presentations referred to in this and other reports, some videos of the presentations themselves, and the whole series of post-event reports are available at the event download site.

Access is for event participants only or for subscribers to our Executive Briefing service. If you would like more details on the latter please contact: andrew.collinson@stlpartners.com.

Background to this report

The growing profusion of end-user devices creates opportunities and threats for operators, OTT players, and handset / CPE / operating-software companies. Increasing amounts of intelligence and capability are to be found in netbooks, smartphones, set-top boxes and routers. What is the consequence of this for Telcos? Are Telcos inevitably going to become disintermediated dumb pipes? Or can operators deploy a device strategy that complements their network capabilities to strengthen their position within the digital value chain?


Brainstorm Topics

Stimulus Presenters and Panellists

  • Anssi Vanjoki, EVP, Nokia
  • Yves Maitre, SVP Devices, Orange Group
  • Alberto Ciarniello, VP Service Innovation, Telecom Italia
  • Rainer Deutschmann, EVP Mobile Internet, T-Mobile International
  • Dean Bubley, CEO, Disruptive Analysis; Associate, Telco 2.0™ Initiative


Facilitator

  • Simon Torrance, CEO, Telco 2.0 Initiative


Analysts

  • Chris Barraclough, Managing Director, Telco 2.0 Initiative
  • Dean Bubley, Senior Associate, Telco 2.0 Initiative
  • Alex Harrowell, Analyst, Telco 2.0 Initiative


Stimulus Presentation Summaries

Devices 2.0 – Battle for the Edge

Dean Bubley, CEO of Disruptive Analysis and a Senior Associate of the Telco 2.0 Initiative, presented on the growing power of devices. Computing power at the edge is growing rapidly; even if individual devices are still slow, they are speeding up faster than PCs, and there are so many of them. Edge power dwarfs that in the network. Control, responsibility, loyalty, attachment – these will move towards the intelligence, as they always have done. Where is the intelligence? At the edge, as the horde of gadgets gets smarter and smarter.

Soon, he argues, devices will be able to tell what the operator is doing – whether certain classes of traffic are being favoured or disfavoured, or what users are paying for roaming or interconnection – and counteract it through ad-hoc radio networking, P2P, protocol spoofing, and encryption.

Beware! The edge and the cloud could gang up on the smart pipe, Bubley argues. In response, operators can try to control the device, control the gateway, or open up the device while maintaining a control point somehow. What is certain is that it’s impossible to control everything.

Balancing the home and mobile environments

Yves Maitre, SVP Devices & Mobile Multimedia, Orange Group spoke on the importance of the home environment, a focus which originates with the French experience of Minitel in the 1980s. Orange hopes to integrate this with its history of mobility. The company is the biggest provider in Europe of VoIP, ADSL, and IPTV; its strategy is based on the Livebox and UMA.

Netbooks are another “first”. The company was also interested in Maemo (www.maemo.org), but more as a test than anything else. The device ecosystem remains incredibly complex – so much closed, proprietary stuff is still out there. The industry needs to embrace a small number of strategic open source platforms.

Yves Maitre, SVP Devices & Mobile Multimedia, Orange: “Certain things have enduring relevance – my money, my business, my friends, my health”

It’s hard to say how far to go with the customer; there are serious privacy and security risks.


Converged Services across Three Screens at DTAG

Rainer Deutschmann, EVP Mobile Internet at T-Mobile International said: Our aim is to provide access to digital assets, anywhere, any time, and on any screen, respecting the following principles: simplicity, freedom of choice, virtualisation, openness.

The walled garden was one of the worst things that ever happened; we took the decision to always be open. Digital music sales are about to pass CD sales; we’re looking at the increasing gap between Moore’s and Gilder’s laws. We want increasingly to get rid of storing anything, anywhere. Hence, he says, T-Mobile created its Connected Life and Work product, which is launching this month in Germany.

This provides an integrated multi-device contacts book, e-mail account, and content store; if you have photos on Flickr and other Web services, you can now consolidate them in one place in a very logical fashion. He also stressed the importance of mobile widgets for key web services – people pay for them, unlike web pages.

OMTP BONDI and Mobile User Experience 2.0

Alberto Ciarniello, VP Service Innovation, Telecom Italia wanted us to think of a time when it was nearly impossible to move data or applications between PCs. But mobile is like that, now! There are lots of opportunities in mobile broadband and mobile applications, he says; there is a sharp drop-off approaching in shipments of basic phones, and for that matter in SMS revenues. And everything will soon have a Web browser.


At the moment, the “what” – the application logic, or what you’re trying to achieve – is easy; the “how” – the practical implementation, or what you express in software – is much harder. A lifestyle based on devices, he says, can turn into one based on applications, and perhaps into one based on data relationships. Hence, he says, TIM has the notion of “user experience 2.0”. Every application should work on every device. This is, or should be, true of Web/WRT applications in particular. They should be consistent – some people already have 3 or 4 devices, he says. The aim of BONDI is to provide consistent and secure access from Web applications to device and network capabilities. This is an example of successful operator-led change; BONDI and the OMTP’s organisation are designed to represent operators.

Why do we need to end fragmentation? So we can have app stores that don’t make you change your shoes. Therefore, the best approach is through the browser. But this means we need a standard for access to the device’s OS from browserspace. He offers the example of a click-to-dial e-commerce application – something which traditionally involved big-telco technology.


Security, of course, is a huge issue; without it, BONDI would be giving Web pages access to low-level functionality! He says we need to delegate this to operators, or other security agents, because otherwise the users will struggle to manage all these issues.
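
As a toy illustration of the “security agent” idea – a trusted party, notionally the operator, maintaining a policy of which web applications may touch which device capabilities – consider the sketch below. Real BONDI policy descriptions are considerably richer; the origins and capability names here are invented.

```python
# Toy illustration of a BONDI-style security decision: a policy table,
# maintained by a trusted agent such as the operator, maps web origins
# to the device capabilities they may use. All names are invented.
ACCESS_POLICY = {
    "https://shop.example.com": {"telephony.dial"},              # click-to-dial allowed
    "https://photos.example.com": {"camera", "filesystem.read"},
}

def is_allowed(origin: str, capability: str) -> bool:
    """Grant access only if the policy lists this capability for this origin."""
    return capability in ACCESS_POLICY.get(origin, set())

assert is_allowed("https://shop.example.com", "telephony.dial")
assert not is_allowed("https://unknown.example.net", "telephony.dial")
```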

BONDI must be open source and free, which would make it the first such thing ever to come from operators. He reminds us that defining the “what” – the application logic – rather than discussing the “how”, is what really matters; everything must “just work”, and porting costs should be zero. Once minimal requirements are satisfied, he says, user experience becomes the ruling factor. So check out TIM’s new dev platform at nextinnovation.it!

How Open do we need to be?

Anssi Vanjoki, EVP New Markets, Nokia, believes in one Internet – no fixed or mobile divide. It’s easy to forecast technology – after all, we make it. But forecasting customers? People? That’s really hard.

In general, he says, future devices will all have good hardware capabilities and native programming support. This has important consequences. According to Nokia R&D’s usability research, 12% of the time an N-Series is switched on is spent making or receiving telephone calls or text messages. The rest is camera, media playback, Web browsing, e-mail, applications.

Most of the network activity involved is over cellular, but all the RF stacks they put in the devices get used. Packet radio of one form or another is now universal. And an increasing share of total device shipments is meaningfully programmable. Devices will all be networked and programmed.

At the top end this will include 500MHz CPU, 64GB RAM, and more sensors – GPS, accelerometer, RFID/NFC, proximity… Networks will be HSPA+ merging into LTE, but it doesn’t really matter much which of those. So the next evolution of the Web will be highly contextual and semantic, based on the information from these sensors. And devices will be servers as well as clients.

Anssi Vanjoki, EVP New Markets, Nokia: “a lot of devices look a lot like N810s – usability must be the basis of the business model.”

Participant Feedback

Introduction

The devices section of the event brought together Nokia and three operator representatives, one of whom was speaking on behalf of the Open Mobile Terminal Platforms industry group (OMTP). The session considered the evolving role of both mobile phones and fixed devices like home gateways. It focused on whether Telcos are able either to control how such devices are used (for example, with 3rd-party Internet applications), or to extract extra value from their embedded capabilities by exposing these to third parties.

Telco 2.0’s view is that the device space is still poorly understood by many of those tasked with developing next generation business models, many of whom come from a staunchly network-centric background. The shift of computing power and capability towards the “edge” has already been seen in the fixed world with the advent of PCs, and is now happening in mobile with products like the iPhone and 3G dongle modems. This means that the collective power of devices in users’ homes and hands outweighs that of the operator-controlled boxes in the network core.

Operators are faced with a stark choice of either relinquishing control of the edge, attempting instead to monetise “smart pipes”, or trying to reinsert themselves into the device space to ensure greater control and pursue new revenue opportunities by prioritising their own customisations and applications. Some Telcos still seem to feel that network intelligence like DPI can outwit Internet applications running between devices and the web. Others feel that they can offer dedicated devices (fixed or mobile) that are optimised for inhouse services rather than the Internet.

The general feedback from the session highlighted the wariness with which people view devices – and to some extent the relative immaturity of device-level control and business models, versus network-resident platforms and APIs.

Feedback: General (verbatim)

  • Fantastic: 3 MNO’s with 3 different views and Nokia throwing in some disruption! [#6]
  • Why is it that the Telco presentations are so individual, when they will learn and start to learn and work across the industry, all they seem to do is want to own it all? [#7]
  • Fragmented session: device 2.0 is not shared at all. One panellist one opinion…. [#37]
  • The operators leave me cold with their lack of vision and benefit for consumers, they seem to believe they can dominate and force their ways on the consumer, this will not last [#36]
  • o    Ref 36: I totally agree. They seem to underestimate the power of the user and their ability to get what they want an not what is thrust upon them [#56]
  • Good to hear different views – at the end not the MNOs or device manufacturers will decide but the customers, who in total will be no nerds but just users [#41]
  • The balance within edge and network located intelligence will be set by customer behaviour [#60]
  • How about changing the name of the event to economy 2.0 where the digital consumer is king. The relevance then is how Telco’s can react. But certainly digital consumers (with their gadgets/devices) are king. [#22]
  • Apple have bypassed future value chain for operators and proven the operators could become dumb pipes. How do operators get back into the value chain? [#53]
  • o    Re 53, the value of the network is the ability to shape experience. The operators need to quickly figure out how they monetise the fact they know who, where, what device and crucially, what access to bandwidth you have at the time you are using a service. If they can’t do that, they become a utility. [#63]
  • re 63, SP’s should avoid being directly involved in any content activities and focus on building a highly dynamic, mass scale transactional networking business, adding experience value to partnered content. [#70]
  • Telcos must open the networks to community… and what about mobile device suppliers? [#25]
  • Will device manufacturer own the value chain, will the operator or will they really work together? [#47]
  • Depends on your view of connectivity. Is it always preferred or even feasible to connect through the internet vs. directly via personal area networks? [#58]
  • Where is the debate on the customer experience and who (and how) owns it between the Telco’s, device suppliers and service/application providers? [#62]


Feedback: T-Mobile/DTAG’s plans

Deutsche Telekom demonstrated its concept of services that run across multiple devices – PC, mobile and TV – intended to help it drive triple-play sales and compete in the social network / personal portal marketplace. While it was a clear demonstration of a multi-device strategy, it was less clear whether it would appeal to users already loyal to services which do not require a tie-in to an access subscription.

  • Interesting plans. [#23]
  • With the different approaches mentioned by DT/TI aren’t we re-inventing the wheel again? The gentleman from the BBC put it quite eloquently. Don’t view it as Build it and they will come, instead listen to your customers and build what they want [#11]
  • The t-mobile service looks interesting, but hasn’t apple already done this? I can get all my content, e-mail etc across my i-phone, apple TV and Mac, seamlessly synced [#16]
  • o    re. 16: but apple doesn’t federate other services (except mail) – it’s largely apple services. this is about aggregating services from other service providers [#33]
  • Why does T-Mobile think it will succeed in the market with yet another UC product? And why do they think users will demand the same interface across all devices? This is not reflective of current practice [#19]
  • o    Agree on 19 – this looked like a walled garden approach despite Rainer saying DT was very opposed to walled gardens. [#29]
  • o    Re 19: Not my view – current practice doesn’t take the customer in account at all – at least not the mass market. You have to be an expert to make reasonable cross devices usage, if you succeed at all [#68]
  • T-Mobile: What’s is the Biz model behind Connected Live and the differentiation regarding specialist like GMail, Flickr, Napster, etc? [#12]
  • How will T-mobile cover the social network of customer outside their customer base? [#13]
  • Very sexy Rainer. Can you tell us more about the business model? Does the product provide pull through for IPTV, mobile and fixed broadband? or do you charge users a subscription (which will kill it before it takes off)? or is there a two-sided play? [#14]
  • Very interesting model from DT and very similar as mobile-me from Apple who does the same since a year back. Any other SP who will do the same? [#27]
  • I have 100Gb + of digital assets/content. will the t-mobile service provide enough storage in the cloud for all of this? [#28]
  • If T-Mobile is open, why do you block Skype and alike? [#31]
  • Three screen strategies from operators like t-mobile and orange are simply enhanced defensive lock in strategies, do they really move the open and 2-way business model forward at all? [#59]


Feedback: Nokia

As well as the operator viewpoint on devices, the vendor angle was clearly expounded by Nokia’s Anssi Vanjoki, who pointed out the increasing capabilities of mobile phones and other products. Although reference was made to operator-related services, it was also clear that Nokia’s view of future business models did not need to rely on Telco platforms. Several commentators expanded on the implications of this.

  • Nokia at least have a view of the future that is a vision people can buy into, not a vision that seeks to control all [#10]
  • All the intelligence in the mobile phones (Nokia like) means no intelligence for all!!!! [#40]
  • If Nokia thinks 15% for voice and messaging – does he think subsidy on high end phones will go? [#8]
  • How does Anssi (Nokia) resolve the need to synch between multiple devices (Tera-play)? [#9]
  • If it’s all in the handset, why did Nokia buy empocket? [#17]
  • Any idea if Ovi Store will accept PyS60 apps as well as WRT and Objective C? [#39]
  • Mobile Web Server is just a techie toy isn’t it? [#42]
  • Nokia’s view: contextual/real-time awareness. [#43]
  • If Nokia is right, what will be about the internet-enabled TV’s that start to spread and will also have (controlled) access –> customers will need device independence! [#54]

Feedback: BONDI/Telecom Italia

Telecom Italia spoke about the OMTP’s BONDI Initiative, which involves working with W3C to develop a new way to run interactive widgets and web applications across phones supporting different OS’s. However, Telco 2.0 believes that this concept (which is alien to a lot of network-centric people) still needs to be explained more clearly and more widely, before the industry understands its potential significance. In theory, the ability for an operator to assist web-based applications with secure access to underlying device capabilities should enable various new business models.

  • BONDI concept is necessary to eliminate the multi-device compatibility issues [#45]
  • Bondi looks something that limits customer freedom. All customer freedom limitations will fault quickly! [#50]
  • Is the vision that Bondi becomes an industry standard or a proprietary solution for TI? [#15]
  • o    Re 15 – BONDI is an industry initiative, not TI only! [#26]
  • Are Apple and Nokia all supporters of BONDI or is this just the Telcos and the open source companies supporting this? [#35]
  • Who owns the application in Bondi? Will the Telco be able to get revenue [#44]

Feedback: Outstanding Issues

What remained unclear from the session was exactly how devices might fit into two-sided business models. How could developers or other “upstream” players benefit from device capabilities? The emphasis seemed to be much more on Telcos using their device input to exert control, rather than monetising openness, and several of the contributors commented on this.

  • History repeats itself. Brings my thoughts to OSI – ODA etc. Loads of time spent with little result [#38]
  • When will the SP community come together to develop a sustainable model for their API’s that works across all networks, thereby getting the benefits of scale, why do they all reinvent the wheel [#24]
  • One item mentioned earlier today was applications were the way to monetize a ‘service’ or website, nothing on the business model was mentioned. Where is the enhanced revenue in each of these approaches? [#30]
  • Where is the 2-sided business model? [#32]
  • What is the open mobile platform of the future? [#34]
  • Rainer: we heard Yves telling us that we need standards for the devices. How can reach a consistent experience around 3-screens (clients etc) and how are you capable to open to 3rd parties? [#49]
  • How to open the Telco infrastructure without open device site? [#51]
  • Don’t we have already a 2-sided business model today between mobile Opcos and device manufacturers in many countries? Opcos buy handsets wholesale and resell them with a rebate (aka handset subsidy) to the end customer. [#52]
  • o    Note 52: how is that 2-sided? Isn’t the revenue being produced only by the customer? [#57]

General Questions

Perhaps reflecting the lower general emphasis on devices within the Telco 2.0 community, the session also threw up a number of more general questions (some of which we’ve tried to answer in the bracketed notes below).

1.     Do we that the trend is the same for devices in developed as in emerging markets? [#18] [Telco 2.0 – devices in emerging markets are slowly becoming more powerful, but probably 3-4 years behind on average. More interestingly, the majority in markets like India are unsubsidised and do not feature operator-specific features, so will be even more difficult to control. Lack of fixed broadband means that the home gateway is less prominent]

2.     If devices become much more powerful how quickly will batteries die? [#20] [Telco 2.0 – battery life improves much more slowly than processors. However, it is often the screen that draws most power, not the processor. Various initiatives like multi-core processors will appear in handsets to help manage power, but it’s still an important issue]

3.     Where will dell and other pc manufacturers play in this conversation? [#66] [Telco 2.0 – yes, up to a point, especially with 3G-enabled laptops and MIDs. However, only a small fraction are likely to be directly Telco-controlled, especially in the enterprise. Most PC users are unlikely to accept operator interference in their choice of apps, although this changes a little where cheap PCs are subsidised]

4.     What is the projected battery life? Especially when up / downloading masses of data [#67]

5.     re 65 where will storage companies play [#69] [Telco 2.0 – There may be a broader role to play in the device space either for home servers (eg Linksys) or flash memory (eg Sandisk) in enabling new services, although this sector is still immature]

6.     What about the role of the SIM and can operators leverage the SIM to restrict the power of device manufacturers with consumers and use it to innovate new service models? [#61] [Telco 2.0 – the SIM card is definitely a core element of operator control and some new services. But not all devices have SIMs, and consumers are unlikely to accept SIM-locked PCs or TVs, especially if they connect via non-mobile access channels, or are unsubsidised. SIMs also have issues of legacy replacement, and are awkward for running applications across multiple carriers]

7.     A lot of focus in this session on mobile devices, what about a truly open set top box and EPG, not tied to an operator service but open for the user to choose services and subscribe as appropriate [#55] [Telco 2.0 – There is huge potential for a standards-based platform for STBs across multiple operators, which would enable diverse business models for video delivery. The BBC’s Project Canvas is an effort in this direction, and the Linux community has developed several technical solutions for the CPE. However, the regulatory issues have been extremely problematic everywhere it has been tried.]

Participants ‘Next Steps’ Vote

Participants were asked which device strategy would offer Telcos the most realistic opportunity to deliver profitable new services and business models in the future?

  • Telco designed and controlled smart devices (e.g. custom smartphones, operator specific digital picture frame)
  • Separate Telco controlled gateway device (e.g. femtocell, set top box) used with open edge device.
  • Open device with Telco control of policy software (e.g. netbook with sim & operator connection software).
  • Forget about controlling devices, we can manage everything in the network.

[Chart: results of the participants’ vote on device strategies]

Lessons learnt & next steps

Unfortunately, Telco strategists still appear to expend more effort on examining infrastructure and centralised application platforms than on the network “edge”. Although some within operator organisations are obviously focused on the users’ hands and homes, there is often little wider recognition of the shifting balance of power – in terms of both influence and computation. The rise of the iPhone and similar devices has helped redress the balance somewhat – but even there, the emphasis has shifted to the more “comfortable” centralised AppStore as something for operators to emulate.

This is understandable. By and large, few fixed or mobile operators have successfully helped create new types of devices on their own. A few broadband providers have used home gateways as new service platforms, or as ways to reduce churn, but even these have tended simply to add functions like IPTV or VoIP. Few consumers would view their broadband “box” as the central hub of a home network – despite 10+ years of discussion of interconnection with consumer electronics, utility meters and home automation. All the talk of Telcos exploiting connectivity to HiFis or “screen fridges” has been hot air.

Alberto Ciarniello, VP Service Innovation, TIM: “Apple shipped 1bn apps at significant average revenue per user. It’s unprecedented. It’s generated a lot of traffic and a lot of stickiness.”

In the mobile space, the power of Nokia, Apple, RIM and others is always set against operators’ desire to customise applications or user experience. Although in developed markets a high percentage of phones are sold through operator channels, the use of embedded operator-specific applications and on-device portals has had only limited commercial benefit. Probably the most important customisation has been the configuration of the Telco’s own portal as the default browser home page. If anything, the shift towards smartphones and PC-based mobile broadband has further weakened Telcos’ role – the majority of 3G data traffic goes straight to and from the Internet from “vanilla” devices.

Anssi Vanjoki, EVP New Markets, Nokia: Our user studies show that 12% of user time on the N-series is telephony or messaging; the rest is Web browsing, camera, media playback, e-mail, and applications.

The future possibly holds some more hope. The audience at the event was strongly in favour of pushing for Telco “control points” in otherwise open devices. This fits well with the heritage of SIM cards (which are expanding in capability) as well as standardisation in areas like the browser and widget frameworks (eg OMTP BONDI). Software pre-loaded with PC dongles or embedded 3G modems is another option. [Telco 2.0 is much more sceptical of the benefits of the RCS client advocated by the GSMA and certain operators]. In the converged triple/quadplay space, femtocells offer another point of control and service delivery, close to the customer – although the notion of a separate “gateway” product was viewed with less enthusiasm at the Nice event. New classes of devices such as MIDs, operator-enabled consumer electronics (Internet TVs, 3G music players, in-car systems etc.) also hold promise, but are seen more as low-risk experiments at this point.

In terms of next steps, the Telco 2.0 team feel that, in the short term (c.12 months), operators should:

  • Aggressively pursue “must have” devices like the iPhone – even if there is a short-term pain point around loss of control. At the moment, customers are still device-centric.
  • Think twice about pushing end-users towards smartphones – instead, look at data plans coupled to higher-end featurephones, especially those with good browsers, touchscreens etc.
  • Assess the business opportunities around OMTP’s BONDI model at a strategic level
  • Revisit the realistic opportunities afforded by next-generation SIM cards for both PCs and phones.
  • Beware of certain device categories which will need new business/charging models to succeed broadly in the marketplace – for example, embedded-3G PCs are an “elegant concept”, but fail to meet the needs of massmarket consumers (or enterprises) at present.


Longer term, additional considerations are more pertinent:

  • Look at exploiting devices used by customers on other Telcos’ networks – there is no reason that operators cannot themselves become successful “over the top” players.
  • Look closely at using femtocells (plus handsets) as a new platform for innovative in-home services.
  • Work closely with utility companies on new smart metering / environmental monitoring applications.
  • Remain wary of new technical standards for devices that promise new opportunities – but require the creation of complete new ecosystems, and which potentially compete with other easier technologies. RCS and NFC are particularly exposed, in Telco 2.0’s view.
  • Expect developers to migrate towards the coolest and most computationally-powerful platforms. This may mean that the API strategy of the operator needs to become more device-centric over time.

Full Article: Nokia’s Strange Services Strategy – Lessons from Apple iPhone and RIM

The profuse proliferation of poorly integrated projects at Nokia suggests either – if we’re being charitable – a deliberate policy of experimenting with many different ideas, or else – if we’re not – the absence of a coherent strategy.

Clearly Nokia is aware of the secular tendency in all information technology fields for value to migrate towards software, and specifically towards applications. Equally clearly, it has the money, scale, and competence to deliver major projects in this field. However, so far it has failed to make services into a meaningful line of business, and even its well-developed software ecosystem hasn’t seen a major hit like the iPhone and its associated app store.

Nokia Services: project proliferator

So far, the Services division in its various incarnations has brought forward Club Nokia, the Nokia Game, Forum Nokia, Symbian Developer Network, WidSets, Nokia Download!, MOSH, Nokia Comes With Music, Nokia Music Store, N-Gage, Ovi, Mail on Ovi, Contacts on Ovi, Ovi Store… it’s a lot of brands for one company, and that’s not even an exhaustive list. Nokia has further acquired Intellisync, Sega.com, Loudeye, Twango, Enpocket, Oz Communications, Gate5, Starfish Software, Navteq and Avvenu since 2005 – an average of just over two services acquisitions a year. Further, despite the decision to integrate all (or most) services into Ovi, there are still five different functional silos inside the Services division.

The great bulk of applications or services available or proposed for mobile devices fall into two categories – social or media. Under social we’re grouping anything that is primarily about communications; under media we’re grouping video, music, games, and content in general. Obviously there is a significant overlap. This is driven by fundamentals; no-one is likely to want to do computationally intensive graphics editing, CAD, or heavy data analysis on a mobile, run a database server on one, or play high-grade full-3D games. Batteries, CPU limitations, and most of all, form factor limitations see to that. And on the other side, communication is a fundamental human need, so there is demand pull as well as constraint push. As we pointed out back in the autumn of 2007, communication, not content, is king.

Aims

In trying to get user adoption of its applications and services, Nokia is pursuing two aims – one is to create products that will help to ship more Nokia devices (and to ship higher-value N- or E-series devices rather than featurephones); the other is a longer-range hope to create a new business in its own right, which will probably be monetised through subscriptions, advertising, or transactions. This latter aim is much further off than the first, and is affected by the operators’ suspicion of any activity that seems to rival their treasured billing relationship. For example, although quick signup and data import are crucial to deploying a social application, Nokia probably wouldn’t get away with automatically enrolling all users in its services – the operators likely wouldn’t wear it.

Historical lessons

There have been several historical examples of similar business models, in which sales of devices are driven by a social network. However, the common factor is that success has always come from facilitating existing social networks rather than trying to create new ones. This is also true of the networks themselves; if new ones emerge, it’s usually as an epiphenomenon of generally reduced friction. Some examples:

  1. Telephony itself: nobody subscribed in order to join the telephone community, they subscribed to talk to the people they wanted to talk to anyway.
  2. GSM: the unique selling point was that the people who might want to talk to you could reach you anywhere, and PSTN interworking was crucial.
  3. RIM’s BlackBerry: early BlackBerries weren’t that impressive as such, but they provided access to the social value of your e-mail workflow and groupware anywhere. Remember, the only really valuable IM user base is the 17 million Lotus Notes Sametime users.
  4. 3’s INQ: the Global Mobile Award-winning handset is really a hardware representation of the user’s virtual presence. Hutchison isn’t interested in trying to make people join Club Hutch or use 3Book; it’s interested in helping its users manage their social networks and charging for the privilege.

So it’s unlikely that trying to recruit users into Nokia-specific communities is at all sensible. Nobody likes vendor lock-in. And, if your product is really good, why restrict it to Nokia hardware users? As far as Web applications go, of course, there’s absolutely no reason why other devices shouldn’t be allowed to play. But this fundamental issue – that no-one organises their life around their friends’ choice of device vendor or mobile operator – would tend to explain why there have been so many service launches, mergers, and shutdowns. Nokia is trying to find the answer by trial and error, but it’s looking in the wrong place. There is some evidence, however, that it is looking more at facilitating other social applications – but this is subject to negotiation with the operators.

The operator relationship – root of the problem

One of the reasons is the conflict with operators mentioned above. Nokia’s efforts to build a Nokia-only community mirror the telco fascination with the billing relationship. Telcos tend to imagine that being a customer of Telco X is enough to constitute a substantial social and emotional link; Nokia is apparently working on the assumption that being a customer of Nokia is sufficient to make you more like other Nokia customers than everyone else. So both parties are trying to “own the customer”, when in fact this is probably pointless, and they are succeeding in spoiling each other’s plans. Although telcos like to imagine they have a unique relationship with their subscribers, they in fact know surprisingly little about them, and carriers tend to be very unpopular with the public. Who wants to have a relationship with the Big Expensive Phone Company anyway? Both parties need to rethink their approach to sociability.

What would a Telco 2.0 take on this look like?

First of all, the operator needs to realise that the subscribers don’t love them for themselves; it was the connectivity they were after all along! Tears! Secondly, Nokia needs to drop the fantasy of recruiting users into a vendor-specific Nokiasphere. It won’t work. Instead, both ought to be looking at how they can contribute to other people’s processes. If Nokia can come up with a better service offering, very well – let it use the telco API suite. In fact, perhaps the model should be flipped: instead of telcos marketing Nokia devices as a bundled add-on to their service, Nokia ought to be marketing its devices (and services) with connectivity and much else bundled into the upfront price, with the telcos getting their share through richer wholesale mechanisms and platform services.

Consider the iPhone. Looking beyond the industrial design and GUI for a moment – I dare you! you can do it! – its key features were integration with iTunes (i.e. with content); a developer platform that offered good APIs and documentation, but also a route to market for developers and an easy way for users to discover, buy, and install their products; and an internal business model that sweetened the deal for the operators by offering them exclusivity and a share of the revenue. Everyone still loves the iPhone, everyone still hates AT&T – but would AT&T ever consider not renewing the contract with Apple? They’re stealing our customers’ hearts! Of course not.

Apple succeeded in improving the following processes for two out of three key customer groups:

  1. Users: Acquiring and managing music and video across multiple devices.
  2. Users: Discovering, installing, and sharing mobile applications
  3. Developers: Deploying and selling mobile applications

And as two-sidedness would suggest, Apple offered the remaining group a share of revenue. The rest is history; the iPhone has become the main driver of growth and profitability at Apple, more than one billion application downloads have been shipped from the App Store, etc.

Conclusions: turn to small business?

So far, however, Nokia’s approach has mirrored the worst aspects of telcos’ attitude to their subscribers; a combination of possessiveness and indifference. They want to own the customer; they don’t know how or why. It might be more defensible if there was any sign that Nokia is serious about making money from services; that, of course, is poison to the operators and is therefore permanently delayed. Similarly, Nokia would like to have the sort of brand loyalty Apple enjoys and to build the sort of integrated user experience Apple specialises in, but it is paranoid about the operators. The result is essentially an Apple strategy, but not quite.

What else could they try? Consider Nokia Life Tools, the package of information services for farmers and small businesses they are building for the developing world. One thing that Nokia’s services strategy has so far lacked is engagement with enterprises; it’s all been about swapping photos and music and status updates. Although Nokia makes great business-class gadgets, and they provide a lot of useful enablers (multiple e-mail boxes, support for different push e-mail systems, VPN clients, screen output, printer support), there’s a hole shaped like work in their services offering. RIM has been much better here, working together with IBM and Salesforce.com to expand the range of enterprise applications they can mobilise.

Life Tools, however, shows a possible opportunity – it’s all right catering to companies who already have complex workflow systems, but who’s serving the ones that don’t have the scale to invest there? None of the vendors are addressing this, and neither are the telcos. It fits a whole succession of Telco 2.0 principles – focus on enterprises, look for areas where there’s a big difference between the value of bits and their quantity, and work hard at improving wholesale.

It’s almost certainly a better idea than trying to be “Apple, but not quite”.

Next Steps for Nokia and telcos

  • It is unlikely that “Nokia users” are a valid community
  • Really successful social hardware facilitates existing social networks
  • Nokia’s problems are significantly explained by their difficult relationship with operators
  • Nokia’s emerging-market Life Tools package might be more of an example than they think
  • A Telco 2.0 approach would emphasise small businesses, offer bundled connectivity, and deal with the operators through better wholesale

Full Article: Device evolution: More power at the edge

The battle for the edge

This document examines the role of “edge” devices that sit at the periphery of a telco’s network – products like mobile phones or broadband gateways that live in the user’s hand or home. Formerly called “terminals”, such devices are now getting “smarter” with the inclusion of ever-better chips and software. In particular, they are capable of absorbing many new functions and applications – and permit the user or operator to install additional software at a later point in time.

In fact, there is fairly incontrovertible evidence that “intelligence” always moves towards the edge of telecom networks, particularly when it can exploit the Internet and IP data connections. This has already been seen in PCs connected to fixed broadband, or in the shift from mainframes to client/server architectures in the enterprise. The trend is now becoming clearer in mobile, with the advent of the iPhone and other smartphones, as well as 3G-connected notebooks. Home networking boxes like set-tops, gaming consoles and gateways are further examples, which also get progressively more powerful.

This is all a consequence of Moore’s Law: as processors get faster and cheaper, there is a tendency for simple massmarket devices to gain more computing capability and take on new roles. Unsurprisingly, we therefore see a continued focus on the “edge” as a key battleground – who controls and harnesses that intelligence? Is it device vendors, operators, end users themselves, or 3rd-party application providers (“over-the-top players”, to use the derogatory slang term)? Is the control at a software, application or hardware level? Can operators deploy a device strategy that complements their network capabilities, to strengthen their position within the digital value chain and foster two-sided business models? Do developments like Android and femtocells help? Should the focus be on dedicated single-application devices, or continued attempts to control the design, OS or browser of multi-purpose products like PCs and smartphones?

Where’s the horsepower?

First, an illustration of the power of the edge.

If we go back five years, the average mobile phone had a single processor, probably an ARM7, clocking perhaps 30MHz. Much of this was used for the underlying radio and telephony functions, with a little “left over” for some basic applications and UI tools, like Java games.

Today, many of the higher-end devices have separate applications processors, and often graphics and other accelerators too. An iPhone has a 600MHz+ chip, and Toshiba recently announced one of the first devices with a 1GHz Qualcomm Snapdragon. Even midrange featurephones can have 200MHz+ to play with, most of which is actually usable for “cool stuff” rather than the radio. Now project forward another five years. The average device (in developed markets at least) will have 500MHz, with top-end devices at 2GHz+, especially if they are not phones but 3G-connected PCs or MIDs. (These numbers are simplified – in the real world there’s lots of complexity because of different sorts of chips like digital signal processors, graphics accelerators or multicore processors.) Set-top boxes, PVRs, game consoles and other CPE devices are growing smarter in parallel.
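
As a quick sanity check, these projections are modest by Moore’s Law standards. Using only the clock speeds quoted above (midrange 200MHz growing to an average 500MHz, top-end roughly 1GHz growing to 2GHz+, over five years), the implied compound annual growth rates work out as follows:

```python
# Implied compound annual growth rates from the clock-speed figures in
# the text: midrange 200 MHz -> average 500 MHz, and top-end ~1 GHz ->
# 2 GHz+, both over a five-year horizon.
def cagr(start_mhz: float, end_mhz: float, years: int) -> float:
    """Compound annual growth rate between two clock speeds."""
    return (end_mhz / start_mhz) ** (1 / years) - 1

print(f"midrange: {cagr(200, 500, 5):.0%} per year")    # ~20% per year
print(f"top-end:  {cagr(1000, 2000, 5):.0%} per year")  # ~15% per year
```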

Now multiply by (say) 8 billion endpoints – mobile handsets, connected PCs, broadband modems, smart consumer electronics and so forth. In developed markets, people may well have 2-4 such devices each. That’s 4 Exahertz (EHz, 10^18 Hz; 1 EHz = 1,000 PHz = 1,000,000 THz = 10^12 MHz) of application-capable computing power in people’s hands or home networks, without even considering ordinary PCs and “smart TVs” as well. And much – probably most – of that power will be uncontrolled by the operators, instead being the playground of user- or vendor-installed applications.

Even smart pipes are dumb in comparison

It’s tricky to calculate an equivalent figure for “the network”, but let’s take an approximation of 10 million network nodes (datapoint: there are 3 million cell sites worldwide), at a generous 5GHz each. That means there would be 50 Petahertz (PHz, 10^15 Hz) in the carrier cloud – in other words, about an 80th of the collective compute power of the edge.
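
The back-of-envelope arithmetic behind these figures, using only the assumptions stated above, can be reproduced as follows:

```python
# Reproduce the text's back-of-envelope sums: aggregate clock cycles at
# the edge versus in the network, under the stated assumptions.
EDGE_ENDPOINTS = 8e9     # handsets, connected PCs, modems, smart CE...
EDGE_AVG_HZ = 500e6      # 500 MHz average per endpoint
NETWORK_NODES = 10e6     # approximate number of network elements
NETWORK_AVG_HZ = 5e9     # a generous 5 GHz per node

edge_total = EDGE_ENDPOINTS * EDGE_AVG_HZ       # 4e18 Hz = 4 EHz
network_total = NETWORK_NODES * NETWORK_AVG_HZ  # 5e16 Hz = 50 PHz

print(f"edge:    {edge_total / 1e18:.0f} EHz")
print(f"network: {network_total / 1e15:.0f} PHz")
print(f"ratio:   {edge_total / network_total:.0f}x")    # ~80x
```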


Now clearly, it’s not quite as bad as that makes it sound – the network can obviously leverage intelligence in a few big control points in the core, like DPI boxes, as traffic funnels through them. But at the other end of the pipe is the Internet, with Google’s, Amazon’s and countless other companies’ servers and “cloud computing” infrastructures. Trying to calculate the aggregate computing power of the web isn’t easy either, but it’s likely to be in the Exahertz range too. Google alone is thought to have 0.5-1.0 million servers, for example.


So one thing is certain – the word “terminal” is obsolete. Whatever else happens, the pipe will inevitably become “dumber” (OK, less smart) than the edge, irrespective of smart Telco 2.0 platforms and 4G/NGN networks.

Now, add in all the cool new “web telco” companies (eComm 2009 was full of them) like BT/Ribbit, Voxeo, Jaduka, IfByPhone, Adhearsion and the Telco 2.0 wings of longtime infrastructure players like Broadsoft and Metaswitch (not to mention Skype and Google Voice), and the legacy carrier network platforms look even further disadvantaged.

Intelligent mobile devices tend to be especially hard to control, because they can typically connect to multiple networks – the operator cellular domain, public or private WiFi, Bluetooth, USB and so forth – which makes it easier for applications to “arbitrage” between them for access, content and services – and price.
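
As a toy sketch of what such device-side arbitrage might look like, consider a connection manager scoring whichever bearers happen to be available on price and speed. The bearer names, speeds and prices below are invented for illustration:

```python
# Toy connection-manager "arbitrage": pick the cheapest available
# bearer that meets an application's bandwidth need. All bearers,
# speeds and prices are invented for illustration.
from dataclasses import dataclass

@dataclass
class Bearer:
    name: str
    downlink_mbps: float
    price_per_mb: float  # what the user pays, in arbitrary units

AVAILABLE = [
    Bearer("operator 3G", downlink_mbps=2.0, price_per_mb=0.10),
    Bearer("home WiFi", downlink_mbps=8.0, price_per_mb=0.00),
    Bearer("Bluetooth PAN", downlink_mbps=0.5, price_per_mb=0.05),
]

def pick_bearer(min_mbps: float) -> Bearer:
    """Cheapest bearer satisfying the bandwidth floor."""
    candidates = [b for b in AVAILABLE if b.downlink_mbps >= min_mbps]
    if not candidates:
        raise RuntimeError("no available bearer meets the requirement")
    return min(candidates, key=lambda b: b.price_per_mb)

print(pick_bearer(1.0).name)  # -> "home WiFi"
```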

Controlling device software vs. hardware

The answer is for telcos to try to take control of more of this enormous “edge intelligence”, and exploit it for their own benefit – whether for inhouse services or two-sided strategies. There are three main strategies for operators wanting to exert influence on edge devices:

  1. Provide dedicated, fully-controlled and customised hardware and software end-points which are “locked down” – such as cable set-top boxes, or operator-developed phones in Japan. This is essentially an evolution of the old approach of providing “terminals” that exist solely to act as access points for network-based services. The concept is being reinvented with new Telco-developed consumer electronics like digital picture frames, but is a struggle for multi-function devices like PCs and smartphones.
  2. Provide separate hardware products that sit “at the edge” between the user’s own smart device and the network, such as cable modems, femtocells, or 3G modems for PCs. These can act as hosts for certain new services, and may also exert policy and QoS control on the connection. Arguably the SIM card fits into this category as well.
  3. Develop control points, in hardware or software, that live inside otherwise notionally “open” devices. This includes Telco-customised UI and OS layers, “policy-capable” connection manager software for notebooks, application certification for smartphones, or secured APIs for handset browsers.

Controlling mobile is even harder than fixed

Fixed operators have long known what their mobile peers are now learning – as intelligence increases in the devices at the edge, it becomes far more difficult to control how they are used. And as control ebbs away, it becomes progressively easier for those devices to be used in conjunction with services or software provided by third parties, often competitive or substitutive to the operators’ own-brand offerings.

But there is a difference between fixed and mobile worlds – fixed broadband operators have been able to employ the second strategy outlined above – pushing out their own fully-controlled edge devices closer to the customer. Smart home gateways, set-top boxes and similar devices are able to sit “in front” of the TV and PC, and can therefore perform a number of valuable roles. IPTV, operator VoIP, online backups and various other “branded” services can exploit the home gateways, in parallel with Internet applications resident on the PC.

Conversely, mobile operators are still finding it extremely hard to control handset software at the OS level. Initiatives like SavaJe have failed, while more recently LiMo is struggling outside Japan. Endless complexities outside of Telcos’ main competence, such as software integration and device power management, are to blame. Meanwhile, other smartphone OS’s from firms like Nokia, Apple, RIM and Microsoft have continually evolved – albeit with huge investments behind them. But most of the “smarts” are not controlled by the operators, most of the time. Further, low-end devices continue to be dominated by closed and embedded “RTOSs” (realtime operating systems), which tend to be incapable of supporting much carrier control either.

In fact, operators are continually facing a “one step forward, two steps back” battle for handset application and UI control. For every new Telco-controlled initiative like branded on-device portals, customised/locked smartphone OS’s, BONDI-type web security, or managed “policy” engines, there is another new source of “control leakage” – Apple’s device management, Nokia’s Ovi client, or even just open OS’s and usable appstores enabling easy download of competing (and often better/free) software apps.

The growing use of mobile broadband computing devices – mostly bought through non-operator channels – makes things worse. Even when sold by Telcos, most end users will not accept onerous operator control-points in their PCs’ application or operating systems, even where those computers are subsidised. There may be 300m+ mobile-connected computers by 2014.

Conclusions

Telcos need to face the inevitable – in most cases, they will not be able to control more than a fraction of the total computing and application power of the network edge, especially in mobile or for “contested” general-purpose devices. But that does not mean they should give up trying to exert influence wherever possible. Single-application “locked” mobile devices, perhaps optimised for gaming or navigation or similar functions have a lot of potential as true “terminals”, albeit used in parallel with users’ other smart devices.

It is far easier for the operator to exert its control at the edge with a wholly-owned and managed device than via a software agent on a general computing device like a smartphone or notebook PC. Femtocells may turn out to be critical application control points for mobile operators in future. Telcos should look to exploit home networking gateways and other CPE with added-value software and services as soon as possible. Otherwise, consumer electronics devices like TVs and HiFis will adopt “smarts” themselves and start to work around the carrier core, perhaps accessing YouTube or Facebook directly from the remote control.

For handsets, controlling smartphone OS’s looks like a lost battle. But certain tactical or upper layers of the stack – browser, UI and connection-manager in particular – are perhaps still winnable. Even where the edge lies outside Telcos’ spheres of control, there are still many network-side capabilities that could be exploited and offered to those that do control the edge intelligence. Telco 2.0 platforms can manage security, QoS, billing, provide context data on location or roaming and so forth. However, carriers need to push hard and fast, before these are disintermediated as well. Google’s clever mapping and location capabilities should be seen as a warning sign that there will be work-arounds for “exposable” network capabilities, if Telcos’ offerings are too slow or too expensive.

Overall, the battle for control of the edge is multi-dimensional, and outcomes are highly uncertain, particularly given the economy and wide national variations in areas like device subsidy and brand preference. But Telcos need to focus on winnable battles – and exploit Moore’s Law rather than beat against it with futility.

We’ll be drilling into this area in much more depth during the Devices panel session at the upcoming Telco 2.0 Brainstorm in Nice in early May 2009.

Full Article: LiMo – The Tortoise picks up Momentum

Mobile Linux foundation LiMo‘s presence at the Mobile World Congress was impressive. DoCoMo demonstrated a series of handsets built on the OS, and LG & Samsung showed a series of reference implementations. But more impressive than the actual and reference handsets were the toolkits launched by Access & Azingo.

We believe that LiMo has an important role to play in the mobile ecosystem, and the platform is compelling enough that, over time, more and more handsets based upon the OS will find their way into consumers’ hands. So why is LiMo different and important?

In a nutshell, it is not owned by anyone and is not being driven forward by any one member. Symbian and Android may also be open-source, but no-one has any serious doubt about who is paying for the majority of the resources – and therefore whose business model they could, consciously or sub-consciously, favour. The LiMo founder members were split evenly between operators (DoCoMo, Vodafone and Orange) and consumer electronics companies (NEC, Panasonic & Samsung). Since then several other operators, handset makers, chip makers and software vendors have joined. The current board contains a representative sample of organisations across the mobile value chain.

LiMo as the Unifying Entity

The current handset OS market reminds us very much of the days when the computing industry shifted from proprietary operating systems to various mutations of Unix. Over time, more and more companies abandoned proprietary extensions and moved them into full open source. Unix was broken down into a core kernel, various drivers, a mass of middleware and a smattering of user interfaces, and value shifted to the applications and services. Today, with open source mature, each company can decide which parts of Unix it wants to put resources into developing further, and which parts it wants to include in its own distribution.

Figure 2: LiMo Architecture

The reason that Unix developed this way is pure economics – it is simply too expensive for most companies to build and maintain their own flavours of operating system. In fact, there are currently only two mainstream companies who can afford to build their own – Microsoft and Apple – and the house of Apple is built upon Unix foundations anyway. Today, we are seeing the same dynamics in the mobile space, and it is only a question of time before more and more companies shift resources away from internal projects and onto open-source ones. LiMo is the perfect home for coordinating this open-source effort – especially if the LiMo Foundation allows the suppliers of code the freedom to develop their own roadmaps according to areas of perceived value and weakness.

LiMo should be really promiscuous to succeed

In June 2008, LiMo merged with the LiPS Foundation – great news. It is pointless and wasteful to have two foundations doing more or less the same thing, one from a silicon viewpoint and the other from an operator viewpoint. Just before Barcelona, LiMo endorsed the OMTP BONDI specification and announced that it expects future LiMo handsets using a web runtime to support it. Again, great news: it is pointless to redo specification work, perhaps with a slightly different angle. Actions like these are critical to the success of LiMo – embracing the work done by others and implementing it in an open-source manner, available to all.

Compelling base for Application Innovation

The real problem with developing mobile applications today is the cost of porting across the wide array of operating systems. LiMo offers the opportunity to reduce this cost radically. This will become critical for the next generation of wirelessly connected devices, whether machine-to-machine, general consumer devices or niche applications serving vertical industries. For the general consumer market, the key is to get handsets into consumers’ hands. DoCoMo has done a great job of driving LiMo-based handsets into the Japanese market; 2009 needs to be the year that European (e.g. Vodafone) or US (e.g. Verizon) operators deploy handsets in other markets.

It is also vital that operators make some of their internal capabilities available for use directly by LiMo handsets, and allow coupling to externally developed applications. These assets are not just the standard network services, but also internal service delivery platform capabilities. This adds to the cost advantage that LiMo will ultimately have over the other handset operating systems. As in the computing world before, value will move over time away from hardware and operating systems towards applications and services. It is no accident that both Nokia and Google are moving into mobile services as a future growth area; operators need an independent operating system to hold back their advance onto traditional operator turf.

In summary:

We feel that as complexity increases in the mobile world, the economics of LiMo will become more favourable. LiMo’s market share will start to increase; the only question is the timeframe. Crucially, LiMo is well placed to get the buy-in of the most important stakeholders – operators. Operators are to mobile devices as content creators were to VHS: how well would the iPhone have done without AT&T?

Plus:

  • following the same path as the evolution of the computing industry
  • broad and growing industry support

Minus:

  • not yet reached critical mass
  • economic incentives for application developers are still vague

Interesting:

  • commoditisation of the hardware and operating system layers – value moving towards applications and services
  • a way for operators to counter the growing strength of Apple, Nokia & Google.

Questions:

  • how can operators add their assets to make the operating system more compelling?
  • how can the barriers of intellectual property ownership be overcome?

Full Article: Verizon’s Volte-Face, Virtue or Vice?

“Verizon Wireless will open up its network to any device that a partner wishes to bring along. What are the business model implications and how should Verizon finesse this into a Telco 2.0 play?”

It’s been all across the tech news and blogosphere: Verizon Wireless has announced that it is moving to a, well, less closed network attachment model. For those whose job isn’t to surf the web, the summary is that, subject to certification testing by Verizon’s labs and an unknown amount of bizdev negotiation, you can attach any device you like to the Verizon Wireless network. If you had to sum up Verizon’s strategy to date, it would be “Execute!”. They’ve simply done a great job of merging Airtouch, GTE and other properties; building out more coverage than the opposition; keeping an adequate level of handset and content innovation; and generally not screwing up.

The key details of the new offer — price, process and terms — remain hidden behind the PR fog. So what’s the unique Telco 2.0 slant on the news? When the market leader switches strategy, it’s not some short-term panic over Apple, Google, WiMax or spectrum auctions. It’s part of the deeper structural shifts in progress. So as we’re in the final assembly stage of our shiny new Broadband Business Models 2.0 report, here’s what’s on our minds about the future of connected devices:

Firstly, disaggregation of the value chain is a long-term inevitability. Regardless of Verizon’s taste in cell phones, there’s always going to be a need to assemble devices, software and content in configurations Verizon doesn’t think of, and sell through channels Verizon can’t access. For example, we’ve concluded content aggregation has strong increasing returns to scale, and as telcos aren’t very good at being media companies, they will exit the portal business — be it for text, video or music.

So in one sense, it’s “what took you so long?”.

Secondly, Verizon can now offer up its assets — billing, retail logistics, care, etc. — to partners, and these assets can be sweated far harder. Verizon takes ideas for handsets and content, develops them together, then markets, retails, supports and bills for them. Along that chain there is always a weak link. By allowing other businesses to go around the weak link, the full value of the other parts of the business can be realised.

As it happens, they’ve got a good network, retail operation, customer care and billing. So handsets and content are probably the bottleneck in delivering value: the CDMA ecosystem is smaller than the GSM one, and many of the handsets available only appeal to the techno-mad Japanese and Koreans. Far from being a “dumb pipe”, we’d anticipate Verizon moving to a broad platform play, offering a suite of services to partners. For example, did you know there are around 30,000 sales tax jurisdictions in the US? Offering phone service is hideously complex due to a maze of federal, state and local regulation. Why fight through these thorns to the sleeping princess yourself when Verizon can rent you a ladder over the hedge?

If you take a Coasian view of the world, you’d simply note that commodity IT, web services and the Internet have lowered the cost of integrating outsiders into the business. Hence the relative value of internal vs. market transactions has changed. “Open” and “closed” aren’t virtues and vices, but merely stages of evolution.

Next up, if you were running Verizon Wireless and looking to make your business more efficient, what can you do? You could set higher targets for your execs and exhort them to do better. However, management targets and incentives (as the UK public sector has discovered) are blunt instruments. You might get cost savings or revenues at the expense of investment and brand quality. So instead you draw inspiration from the structurally separated fixed-line business. Benchmark your retail operation against outsiders. Make them feel the heat of competition purely on the merits of their own sub-part of the business. The Verizon retail business is no longer a monopsonist buyer of Verizon’s network and wholesale assets.

Finally, the most critical factor in Verizon making a success of wholesale network access will be constructing pricing that incentivises the desired behaviour. Offering flat-rate vanilla ISP plans to wholesale partners would be a fatal mistake. Consumers need simplicity; wholesale clients do not. You should not be afraid to confront them with complex wholesale pricing that reflects both the value and the cost of running the network. That means reflecting peak and off-peak times, congested areas where there is less spectrum or fewer attachment rights, and differential pricing depending on where Verizon has an advantage over competitors in terms of speed or coverage. The partners then have to work out how to design and package their product around these parameters, and create the simple retail propositions.
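To make that concrete, here is a minimal sketch, in Python, of what such multi-parameter wholesale pricing might look like. All the rates and parameters are invented for illustration – a thought experiment, not Verizon’s actual rate card.

    def wholesale_price_per_mb(peak, congested_cell, network_advantage):
        """Toy wholesale data tariff reflecting both cost and value.
        All multipliers and the baseline rate are invented figures."""
        price = 0.02  # baseline $/MB; an assumed, illustrative figure
        if peak:
            price *= 3.0    # peak hours consume scarce capacity
        if congested_cell:
            price *= 2.0    # less spectrum or fewer attachment rights
        if network_advantage:
            price *= 1.5    # premium where speed/coverage beats competitors
        return price

    # e.g. a partner serving a congested urban cell at peak time:
    # wholesale_price_per_mb(True, True, False) -> 0.12 $/MB

The design point is that the partner, not the consumer, absorbs this complexity, smoothing these parameters into a simple flat-rate retail proposition and carrying the pricing risk itself.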