Strategy 2.0: Google’s Strategic Identity Crisis

Summary: Google’s shares have made little headway recently despite its dominance in search and advertising, and it faces increasing regulatory threats in this area. It either needs to find new sources of value growth or start paying out dividends, like Microsoft, Apple (or indeed, a telco). Overall, this is resulting in something of a strategic identity crisis. A review of Google’s strategy and implications for Telcos. (March 2012, Executive Briefing Service, Dealing with Disruption Stream).


Below is an extract from this 24 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and the Telco 2.0 Dealing with Disruption Stream here. Non-members can subscribe here, buy a Single User license for this report online here for £595 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email / call +44 (0) 207 247 5003. We’ll also be discussing our findings and more on Google at the Silicon Valley (27-28 March) and London (12-13 June) New Digital Economics Brainstorms.


Executive Summary

Google appears to be suffering from a strategic identity crisis. It is the giant of search advertising but it also now owns a handset maker, fibre projects, an increasingly fragmented mobile operating system, a social network of questionable success, and a driverless car programme (among other things). It has a great reputation for innovation and creativity, but risks losing direction and value by trying to focus on too many strategies and initiatives.

We believe that Google needs to stop trying to copy what Apple and Facebook are doing, de-prioritise its ‘Hail Mary’ hunt for a strategy (e.g. driverless cars), and continue to build new solutions that better serve the customers who are already willing to pay – namely, advertisers.

It is our view that the companies who have created most value in the market have done so by solving a customer problem really well. Apple’s recent success derives from creating a simpler and more beautiful way (platform + products) for people to manage their digital lives. People pay because it’s appealing and it works.

Google initially solved the problem of how people could find relevant information online and then, critically, how to use this to help advertisers get more customers. It does this so well that Google’s $37bn revenues continue to grow at a double-digit pace, and there’s plenty of headroom in the market for now. While the TV strategy may not yet be paying off, it would seem sensible to keep working at it to try to keep extending the reach of Google’s platform.

While Android keeps Google in the mobile game to a degree, and has certainly helped to constrain certain rivals, we think Google should cast a hard eye over its other competing and distracting activities: Motorola, Payments, Google +, Driverless Cars etc. Its management team should look at the size of the opportunity, the strength of the competition, and their ability to execute in each. 

Pruning the projects might also lose Google an adversary or two, and it might also afford some reward to shareholders too. After all, even Apple has recently decided to pay back some cash to investors.

This may be very difficult for Google’s current leadership. Larry Page seems to have the restless instincts of the stereotypical Valley venture capitalist, hunting the latest ideas, and constantly trying to create the next big beautiful thing. The trouble is that this is Google in 2012, not 1995, and it looks to us at least that a degree of ‘sticking to the knitting’ within Google’s huge, profitable and growing search advertising business may be a better bet than the highly speculative (and expensive) ‘Hail Mary’ strategy route. 

This may sound surprising coming from us, the inveterate fans of innovation at Telco 2.0, so we’d like to point out some important differences between the situations that Google and the telcos are in:

  • Google’s core markets are growing, not flat or shrinking, and are at a different life-stage to the telecoms market;
  • Google is global, rather than being confined to any given geography, and there are many opportunities still out there;
  • We are not saying that Google should stop innovating, but we are saying it should focus its innovative energy more clearly on activities that grow the core business.


In January this year, Google achieved a first – it missed the consensus forecast for its quarterly earnings. There is of course no magic in the consensus, which is an average of highly conventionalised guesses from a bunch of City analysts, but it is as good a moment as any to review Google’s strategic position. If you bought Google stock at the beginning, you may not need to read this, as you’re probably very rich (the return since then is of the order of 400%). The entirety of this return, however, is accounted for by the 2004-2007 bull run. On a five-year basis, Google stock is ahead 30%, which sounds pretty impressive (around 5.4% compounded annually), but again, all the growth is accounted for by the last surge upwards over the summer of 2007. The peak was achieved on the 2nd of November, 2007.
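Compounding matters here: a 30% gain over five years works out nearer 5.4% a year than the 6% a naive division (30 ÷ 5) would suggest. A quick sketch, using only the 30% five-year return quoted above:

```python
# Compound annual growth rate (CAGR) implied by a 30% total return over 5 years.
total_return = 0.30  # five-year gain on Google stock, as quoted in the text
years = 5

cagr = (1 + total_return) ** (1 / years) - 1
print(f"Annualised return: {cagr:.1%}")  # about 5.4%
```

The same calculation with the 400% since-IPO return gives roughly 22% a year over the 2004-2012 period, underlining how front-loaded that performance was.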

As Figure 1 shows, Google stock is still down about 9% from the peak and, perhaps more importantly, its path tracks Microsoft’s very closely indeed. Microsoft investors, moreover, get a dividend, whereas Google investors do not.

Figure 1: Google, Microsoft 2.0?

Google, Microsoft 2.0?
Source: Google Finance

Larry Page is reported to have said that Google is no longer a “search company”. He says its model is now:

“invent wild things that will help humanity, get them adopted by users, profit, and then use the corporate structure to keep inventing new things.”

No longer a search company? Take a look at the revenues. Out of Google’s $37.9bn in revenues in 2011, $36bn came from advertising, aka the flip side of Google Search. Despite a whole string of mammoth product launches since 2007, Google’s business is essentially what it was in 2007 – a massive search-based advertising machine.

Google’s Challenges

Our last Google coverage – Android: An Anti-Apple Virus? and the Dealing with the Disruptors Strategy Report – suggested that the search giant was suffering from a lack of direction, although some of this was accounted for by a deliberate policy of experimenting and shutting down failed initiatives.

Since then, Google has launched Google +, closed Google Buzz, and closed Google Wave while releasing it into a second life as an open-source project. It has been involved in major litigation over patents and in regulatory inquiries. It has seen an enormous boom in Android shipments but not necessarily much revenue. It is about to become a major hardware manufacturer by acquiring Motorola. And it has embarked on extensive changes to the core search product and to company-wide UI design.

In this note, we will explore Google’s activities since our last note, summarise key threats to the business and strategies to counter them, and consider if a bearish view of the company is appropriate.

We’ve found it convenient to organise Google’s business into several themed groups as follows:

1: Questionable Victories

A Pyrrhic victory is one so costly that it is indistinguishable from defeat. Although nothing at Google is quite that bad, the company seems to have a knack of creating products that are hugely successful without necessarily generating cash. Android is exhibit A.

The obvious point here is surging, soaring growth – forecasts for Android shipments have repeatedly been made, beaten on the upside, adjusted upwards, and then beaten again. Android has hugely expanded the market for smartphones overall, caused seismic change in the vendor industry, and triggered an intellectual property war. It has found its way into an awe-inspiring variety of devices and device classes.

But questions still hang over how much actual money is involved. During the Q4 results call, a figure for “mobile” revenues of $2.5bn was quoted. This turns out to consist of advertising served to browsers that present a mobile device user-agent string. However, Google lawyer Susan Creighton is on record as saying that 66% of Google mobile web traffic originates from Apple iOS devices. It is hard to see how this can be accounted for as Android revenue.

Further, the much-trailed “fragmentation” began in 2011 with a vengeance. “Forkdroids”, devices using an operating system based on Android but extensively adapted (“forked” from the main development line), appeared in China and elsewhere. Amazon’s Kindle Fire tablet is an example closer to home.

And the intellectual property fights with Oracle, Apple, and others are a constant source of disruption and a potentially sizable leakage of revenue. In so far as Google’s motivation in acquiring Motorola Mobility was to get hold of its patent portfolio, this has already involved very large sums of money. Another counter-strategy is the partnership with Intel and Lenovo to produce x86-based Android devices, which cannot be cheap either and will probably mean even more fragmentation.

This is not the only example, though – think of Google Books, an extremely expensive product which caused a great deal of litigation, eventually got its way (although not all the issues are resolved), and is now an excellent free tool for searching old books but no kind of profit centre. Further, Google’s patented automatic scanning has the unfortunate habit of pulling in marginalia and other stray matter from the original text, which rivals (such as Amazon’s Kindle) avoid.

Similarly, Google has recently been trying to monetise one of its classic products, the Google Maps API that essentially started the Web 2.0 phenomenon, with the result that several heavy users (notably Apple and Foursquare) have migrated to the free OpenStreetMap project and its OpenLayers API.

2: Telco-isation

Like a telco, Google is dependent on one key source of revenue that cross-subsidises the rest of the company – search-based advertising. 

Figure 2: Google’s advertising revenues cascade into all other divisions

Google's Advertising Revenues Cascade

[NB TAC = Traffic Acquisition Cost, CoNR = Cost of Net Revenues]

Having proven to be category killers for search and advertising across the whole of the Internet, the twins (search and ads) are hugely critical for Google and also for millions of web sites, content creators, and application developers. As a result, just like a telco, they are increasingly subject to regulation and political risk.

Google search rankings have always been subject to an arms race between the black art of search-engine optimisation and Google engineers’ efforts to ensure the integrity of their results, but the whole issue has taken a more serious twist with the arrival of a Federal Trade Commission inquiry into Google’s business practices. The potential problems were dramatised by the so-called “white lady from Google” incident at Google Kenya, where Google employees scraped a rival directory website’s customers and cold-called them, misrepresenting the competitor’s services, and further by the $500 million online pharmacy settlement. Similarly, the case of the Spanish camp site that wants to be disassociated from horrific photographs of a disaster demonstrates both that there is a demand for regulation and that, sooner or later, a regulator or legislator will be tempted to supply it.

The decision to stream Google search quality meetings online should be seen in this light, as an effort to cover this political flank.

As well as the FTC, there is also substantial regulatory risk in the EU. The European Commission, in giving permission for the Motorola acquisition, also stated that it would consider further transactions involving Google and Motorola’s intellectual property on a case-by-case basis. To put it another way, after the Motorola deal, the Commission has set up a Google Alert for M&A activity involving Google.

3: Look & Feel Problems

Google is in the process of a far-reaching refresh of its user interfaces, graphic design, and core search product. The new look affects Search, GMail, and Google + so far, but is presumably going to roll out across the entire company. At the same time, they have begun to integrate Google + content into the search results.

This is, unsurprisingly, controversial and has attracted much criticism, so far only from the early adopter crowd. There is a need for real data to evaluate it. However, there are some reasons to think that Search is looking in the wrong place.

Since the major release codenamed Caffeine, completed in 2010, Google Search engineers have been optimising the system for speed and for first-hit relevance, while also indexing rapidly-changing content faster by redesigning the process of “spidering” web sites to work in parallel. Since then, Google Instant has further concentrated on speed to the first result. In the Q4 results, it was suggested that mobile users are less valuable to Google than desktop ones. One reason for this may be that “obvious” search – Wikipedia in the first two hits – is well served by mobile apps. Some users find that Google’s “deep web” search has suffered.

Under “Google and your world”, recommendations drawn from Google + are being injected into search results. This is especially controversial for a mixture of privacy and user-experience reasons. Danny Sullivan’s SearchEngineLand, for example, argues that it harms relevance without adding enough private results to be of value. Further, doubt has been cast on Google’s numbers regarding the new policy of integrating Google accounts into G+ and G+ content into search.

Another, cogent criticism is that it introduces an element of personality that will render regulatory issues more troublesome. When Google’s results were visibly the output of an algorithm, it was easier for Google to claim that they were the work of impartial machines. If they are given agency and associated with individuals, it may be harder to deny that there is an element of editorial judgment and hence the possibility of bias involved.

Social search has been repeatedly mooted since the mid-2000s as the next-big-thing, but it seems hard to implement. Yahoo!, Facebook, and several others have tried and failed.

Figure 3: Google + on Google Trends: fading into the noise?

 Google + on Google Trends: Fading Into the Noise?
Source: Google Trends

It is possible that Google may have a structural weakness in design as opposed to engineering (which is as excellent as ever). This may explain why a succession of design-focused initiatives have failed – Wave and Buzz have been shut down, Google TV hasn’t gained traction (there are fewer than one million active devices), and feedback on the developer APIs is poor.

4: Palpable Project Proliferation

Google’s tendency to launch new products is as intimidating as ever. However, there is a strong argument that its tireless creativity lacks focus, and the hit-rate is worryingly low. Does Google really need two cut-down OSs for ultra-mobile devices? It has both Android and ChromeOS, and if the first was intended for mobile phones and the second for netbooks, you can now buy a netbook-like (but rather more powerful) Asus PC that runs Android. Further, Google supports a third operating system for its own internal purposes – the highly customised version of Linux that powers the Google Platform – and could be said to support a fourth, as it pays the Mozilla Foundation substantial amounts of money under the terms of their distribution agreement and its Boot to Gecko project is essentially a mobile OS. IBM, for comparison, supported four operating systems at its historic peak in the 1980s.

Also, does Google really need to operate an FTTH network, or own a smartphone vendor? The Larry Page quote we opened with tends to suggest that Google’s historical tendency to do experiments is at work, but both Google’s revenue raisers (Ads and YouTube, which from an economic point of view is part of the advertising business) date from the first three years as a public company. The only real hit Google has had for some time is Android, and as we have seen, it’s not clear that it makes serious money.

Google Wallet, for example, was launched with a blaze of publicity, but failed to attract support from either the financial or the telecoms industry, rather like its predecessor Google Checkout. It also failed to gain user adoption, but it has this in common with all NFC-based payments initiatives. Recently, a major security bug was discovered, and key staff have been leaving steadily, including the head of consumer payments. Another shutdown is probably on the cards. 

Meanwhile, a whole range of minor applications have been shuttered.

Another heavily hyped project which does not seem to be gaining traction is the Chromebook, the hardware-as-a-service IT offering aimed at enterprises. This has been criticised on the basis that its $28/seat/month pricing is actually rather high: over a typical 3-year depreciation cycle for IT equipment, it is on a par with Apple laptops, with the restriction that all applications must work in a web browser on netbook-class hardware. Google management has been promoting small contract wins in US school districts. Meanwhile, it is frequently observed that Google’s own PC fleet consists mostly of Apple hardware. If Google won’t use them itself, why should any other enterprise IT shop do so? The streamed Google Search meeting mentioned above featured 2 Lenovo ThinkPads and 13 Apple MacBooks of various models, and zero Chromebooks, while none other than Eric Schmidt used a Mac for his MWC 2012 keynote. Traditionally, Google insisted on “dogfooding” its products by using them internally.

The Google Fibre project in Kansas City, for its part, has been struggling with regulatory problems related to its access to city-owned civil infrastructure. Kansas City’s utility poles have areas reserved for different services, for example telecoms and electrical power. Google was given the concession to string its fibre in the more spacious electrical section – however, this requires high-voltage electricians rather than telecoms installers to do the job, and costs substantially more. Google has been trying to change the terms and use the telecoms section, but (unsurprisingly) local cable and Bell operators are objecting. As with the muni-WLAN projects of the mid-2000s, the abortive attempt to market the Nexus One without the carriers, and Google Voice, Google has had to learn the hard way that telecoms is difficult.

And while all this has been going on, you might wonder where Google Enterprise 2.0 or Google Ads 2.0 are.

5: Google Play – a Collection of Challenges?

Google recently announced its “new ecosystem”, Google Play. This consists of what was historically known as the Android Market, plus Google Books, Google Music, and the web-based elements of Google Wallet (aka Google Checkout). All of these products are more or less challenged. Although the Android Market has been a success in distributing apps to the growing fleets of Android devices, it continues to contain an unusually high percentage of free apps, developer payouts tend to be lower than on its rivals, and it has had repeated problems with malware. Google Books has been an expensive hobby, involving substantial engineering work and litigation, and seems unlikely to be a profit centre. Google Music – as opposed to YouTube – is also no great success, and it is worth asking why both projects continue.

Yet it will be the existing manager of Google Music who takes charge, with the Android Market’s management moving out. There were in fact two heads of the Android Market – Eric Chu for developer relations and David Conway for product management – which is not ideal in itself.

Further, an effort is being made to force app developers to use the ex-Google Checkout system for in-app billing. This obviously reflects an increased concern for monetisation, but it also suggests a degree of “arguing with the customers”.

To read the note in full, including the following additional analysis…

  • On the Other Hand…
  • Strengths of the Core Business
  • “Apple vs. Google”
  • Content acquisition
  • Summary Key Product Review
  • Search & Advertising
  • YouTube and Google TV
  • Communications Products
  • Android
  • Enterprise
  • Developer Products
  • Summary: Google Dashboard
  • Conclusion
  • Recommendations for Operators
  • The Telco 2.0™ Initiative
  • Index

…and the following figures…

  • Figure 1: Google, Microsoft 2.0?
  • Figure 2: Google’s advertising revenues cascade into all other divisions
  • Figure 3: Google + on Google Trends: fading into the noise?
  • Figure 4: Google’s Diverse Advertiser Base
  • Figure 5: Google’s Content Acquisition. 2008-2009, the missing data point
  • Figure 6: Google Product Dashboard

Members of the Telco 2.0 Executive Briefing Subscription Service and the Telco 2.0 Dealing with Disruption Stream can download the full 24 page report in PDF format here. Non-Members, please subscribe here, buy a Single User license for this report online here for £595 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email / call +44 (0) 207 247 5003.

Organisations, geographies, people and products referenced: AdSense, AdWords, Amazon, Android, Apple, Asus, AT&T, Australia, BBVA, Bell Labs, Boot to Gecko, Caffeine, CES, China, Chromebook, ChromeOS, ContentID, David Conway, Eric Chu, Eric Schmidt, European Commission, Facebook, Federal Trade Commission, GMail, Google, Google +, Google Books, Google Buzz, Google Checkout, Google Maps, Google Music, Google Play, Google TV, Google Voice, Google Wave, GSM, IBM, Intel, Kenya, Keyhole Software, Kindle Fire, Larry Page, Lenovo, Linux, MacBooks, Microsoft, Motorola, Mozilla Foundation, Netflix, Nexus, Office 365, OneNet, OpenLayers API, OpenStreetMap, Oracle, Susan Creighton, ThinkPads, VMWare, Vodafone, Western Electric, Wikipedia, Yahoo!, Your World, YouTube, Zynga

Technologies and industry terms referenced: advertisers, API, content acquisition costs, driverless car, Fibre, Forkdroids, M&A, mobile apps, muni-WLAN, NFC, Search, smart TV, spectrum, UI, VoIP, Wallet

Dealing with the ‘Disruptors’: Google, Apple, Facebook, Microsoft/Skype and Amazon (Updated Extract)

Executive Summary (Extract)

This report analyses the strategies behind the success of Amazon, Apple, Facebook, Google and Skype, before going on to consider the key risks they face and how telcos and their partners should deal with these highly-disruptive Internet giants.

As the global economy increasingly goes digital, these five companies are using the Internet to create global brands with much broader followings than those of the traditional telecoms elite, such as Vodafone, AT&T and Nokia. However, the five have markedly different business models that offer important insights into how to create world-beating companies in the digital economy:

  • Amazon: Amazon’s business-to-business Marketplace and Cloud offerings are text-book examples of how to repurpose assets and infrastructure developed to serve consumers to open up new upstream markets. As the digital economy goes mobile, Amazon’s highly-efficient two-sided commerce platform is enabling it to compete effectively with rivals that control the leading smartphone and tablet platforms – Apple and Google.
  • Apple: Apple has demonstrated that, with enough vision and staying power, an individual company can single-handedly build an entire ecosystem. By combining intuitive and very desirable products, with a highly-standardised platform for software developers, Apple has managed to create an overall customer experience that is significantly better than that offered by more open ecosystems. But Apple’s strategy depends heavily on it continuing to produce the very best devices on the market, which will be difficult to sustain over the long-term.
  • Facebook: A compelling example of how to build a business on network effects. It took Facebook four years of hard work to reach a tipping point of 100 million users, but the social networking service has been growing easily and rapidly ever since. Facebook has the potential to attract 1.4 billion users worldwide, but only if it continues to sidestep rising privacy concerns, consumer fatigue or a sudden shift to a more fashionable service.
  • Google: The search giant’s virtuous circle keeps on spinning to great effect – Google develops scores of free, and often-compelling, Internet services, software platforms and apps, which attract consumers and advertisers, enabling it to create yet more free services. But Google’s acquisition of Motorola Mobility risks destabilising the Android ecosystem on which a big chunk of its future growth depends.
  • Skype: Like Facebook and Google, Skype sought users first and revenues second. By creating a low-cost, yet feature-rich, product, Skype has attracted more than 660 million users and created sufficient strategic value to persuade Microsoft to hand over $8.5bn. Skype’s share of telephony traffic is rising inexorably, but Google and Apple may go to great lengths to prevent a Microsoft asset gaining a dominant position in peer-to-peer communications.

The strategic challenge

There is a clear and growing risk that consumers’ fixation on the products and services provided by the five leading disruptors could leave telcos providing commoditised connectivity and struggling to make a respectable return on their massive investment in network infrastructure and spectrum.

In developed countries, telcos’ longstanding cash-cows – mobile voice calls and SMS – are already being undermined by Internet-based alternatives offered by Skype, Google, Facebook and others. Competition from these services could see telcos lose as much as one third of their messaging and voice revenues within five years (see Figure 1) based on projections from our global survey, carried out in September 2011.

Figure 1 – The potential combined impact of the disruptors on telcos’ core services

Impact of Google, Apple, Facebook, Microsoft/Skype, Amazon on telco services

Source: Telco 2.0 online survey, September 2011, 301 respondents

Moreover, most individual telcos lack the scale and the software savvy to compete effectively in other key emerging mobile Internet segments, such as local search, location-based services, digital content, apps distribution/retailing and social-networking.

The challenge for telecoms and media companies is to figure out how to deal with the Internet giants in a strategic manner that both protects their core revenues and enables them to expand into new markets. Realistically, that means a complex, and sometimes nuanced, co-opetition strategy, which we characterise as the “Great Game”.

In Figure 3 below, we’ve mapped the players’ roles and objectives against the markets they operate in, giving an indication of the potential market revenue at stake, and telcos’ generic strategies.

Figure 3- The Great Game – Positions, Roles and Strategies

The Great Game - Telcos, Amazon, Apple, Google, Facebook, Skype/Microsoft

Our in-depth analysis, presented in this report, describes the ‘Great Game’ and the strategies that we recommend telcos and others can adopt in summary and in detail. [END OF FIRST EXTRACT]

Report contents

  • Executive Summary [5 pages – including partial extract above]
  • Key Recommendations for telcos and others [20 pages]
  • Introduction [10 pages – including further extract below]

The report then contains c.50 page sections with detailed analysis of objectives, business model, strategy, and options for co-opetition for:

  • Google
  • Apple
  • Facebook
  • Microsoft/Skype
  • Amazon

Followed by:

  • Conclusions and recommendations [10 pages]
  • Index

The report includes 124 charts and tables.

The rest of this page comprises an extract from the report’s introduction, covering the ‘new world order’, investor views, the impact of disruptors on telcos, and how telcos are currently fighting back (including pricing, RCS and WAC), and further details of the report’s contents. 



The new world order

The onward march of the Internet into daily life, aided and abetted by the phenomenal demand for smartphones since the launch of the first iPhone in 2007, has created a new world order in the telecoms, media and technology (TMT) industry.

Apple, Google and Facebook are making their way to the top of that order, pushing aside some of the world’s biggest telcos, equipment makers and media companies. This trio, together with Amazon and Skype (soon to be a unit of Microsoft), are fundamentally changing consumers’ behaviour and dismantling longstanding TMT value chains, while opening up new markets and building new ecosystems.

Supported by hundreds of thousands of software developers, Apple, Google and Facebook’s platforms are fuelling innovation in consumer and, increasingly, business services on both the fixed and mobile Internet. Amazon has set the benchmark for online retailing and cloud computing services, while Skype is reinventing telephony, using IP technology to provide compelling new functionality and features, as well as low-cost calls.

On their current trajectory, these five companies are set to suck much of the value out of the telecoms services market, substituting relatively expensive and traditional voice and messaging services with low-cost, feature-rich alternatives and leaving telcos simply providing data connectivity. At the same time, Apple, Amazon, Google and Facebook have become major conduits for software applications, games, music and other digital content, rewriting the rules of engagement for the media industry.

In a Telco 2.0 online survey of industry executives conducted in September 2011, respondents said they expect Apple, Google, Facebook and Skype together to have a major impact on telcos’ voice and messaging revenues in the next three to five years. Although these declines will be partially compensated for by rising revenues from mobile data services, the respondents in the survey anticipate that telcos will see a major rise in data carriage costs (see Figure 1 – The potential combined impact of the disruptors on telcos’ core services).

In essence, we consider Amazon, Apple, Facebook, Google and Skype-Microsoft to be the most disruptive players in the TMT ecosystem right now and, to keep this report manageable, we have focused on these five giants. Still, we acknowledge that other companies, such as RIM, Twitter and Baidu, are also shaping consumers’ online behaviour and we will cover these players in more depth in future research.

The Internet is, of course, evolving rapidly and we fully expect new disruptors to emerge, taking advantage of the so-called Social, Local, Mobile (SoLoMo) forces, sweeping through the TMT landscape. At the same time, the big five will surely disrupt each other. Google is increasingly in head-to-head competition with Facebook, as well as Microsoft, in the online advertising market, while squaring up to Apple and Microsoft in the smartphone platform segment. In the digital entertainment space, Amazon and Google are trying to challenge Apple’s supremacy, while also attacking the cloud services market.

Investor trust

Unlike telcos, the disruptors are generally growing quickly and are under little, or no, pressure from shareholders to pay dividends. That means they can accumulate large war chests and reinvest their profits in new staff, R&D, more data centres and acquisitions without any major constraints. Investors’ confidence and trust enables the disruptors to spend money freely, keep innovating and outflank dividend-paying telcos, media companies and telecoms equipment suppliers.

By contrast, investors generally don’t expect telcos to reinvest all their profits in their businesses, as they don’t believe telcos can earn a sufficiently high return on capital. Figure 16 shows the dividend yields of the leading telcos (marked in blue). Of the disruptors, only Microsoft (marked in green) pays a dividend to shareholders.

Figure 16: Investors expect dividends, not growth, from telcos


Source: Google Finance 2/9/2011

The top telcos’ turnover and net income is comparable, or superior, to that of the leading disruptors, but this isn’t reflected in their respective market capitalisations. AT&T’s turnover is approximately four times that of Google and its net income twice as great, yet the two companies’ market capitalisations are similar. Even accounting for their different capital structures, investors clearly expect Google to grow much faster than AT&T and syphon off more of the value in the TMT sector.
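As a quick sanity check on that claim, the implied valuation arithmetic can be sketched from the ratios quoted above (figures are normalised, so only the ratios matter):

```python
# Back-of-envelope reading of the AT&T vs. Google comparison above.
# Values are normalised to Google = 1; only the quoted ratios matter.
google = {"net_income": 1.0, "market_cap": 1.0}
att = {"net_income": 2.0, "market_cap": 1.0}  # ~2x Google's profit, similar market cap

# Price/earnings-style multiple: market capitalisation per unit of profit
multiple_google = google["market_cap"] / google["net_income"]  # 1.0
multiple_att = att["market_cap"] / att["net_income"]           # 0.5

print(multiple_google / multiple_att)  # -> 2.0
```

In other words, on these figures investors are paying roughly twice as much per dollar of Google profit as per dollar of AT&T profit, which is the growth expectation the text describes.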

More broadly, the disparity between the market capitalisations of the leading disruptors and those of the leading telcos suggests that investors expect Apple, Microsoft and Google’s revenues and profits to keep rising, while they believe telcos’ will be stable or go into decline. Figure 17 shows how the market capitalisation of the disruptors (marked in green) compares with that of the most valuable telcos (marked in blue) at the beginning of September 2011.

Figure 17: Investors value the disruptors highly


Source: Google Finance 2/9/2011 (Facebook valued at $66bn based on the IPG sale in August 2011)

Impact of disruptors on telcos

It has taken longer than many commentators expected, but Internet-based messaging and social networking services are finally eroding telcos’ SMS revenues in developed markets. KPN, for example, has admitted that smartphones equipped with data communications apps (and WhatsApp in particular) are impacting the voice and SMS revenues of its consumer wireless business in its home market, the Netherlands (see Figure 18). Reporting its Q2 2011 results, KPN said that changing consumer behaviour cut its Dutch consumer wireless service revenues by 2% year-on-year.

Figure 18: KPN reveals falling SMS usage


Source: KPN Q2 results

In the second quarter, Vodafone also reported a fall in messaging revenue in Spain and southern Africa, while Orange saw its average revenue per user from data and SMS services fall in Poland.

How telcos are fighting back

Big bundles

Carefully-designed bundles are the most common tactic telcos are using to try to protect their voice and messaging business. Most postpaid monthly contracts now come with hundreds of SMS messages and voice minutes, along with a limited volume of data, bundled into the overall tariff package. This mix encourages consumers to keep using the telcos’ voice and SMS services, which they are paying for anyway, rather than having Skype or another VOIP service soak up their precious data allowance.

To further deter usage of VOIP services, KPN and some other telcos are also creating tiered data tariffs offering different throughput speeds. The lower-priced tariffs tend to have slow uplink speeds, making them unsuitable for VOIP (see Figure 19 below). If consumers want to use VOIP, they will need to purchase a higher-priced data tariff, earning the telco back the lost voice revenue.
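To see why uplink speed is such an effective lever, a rough uplink budget for a single VOIP call can be sketched. The codec and packet-overhead figures below are our own illustrative assumptions, not taken from KPN’s tariffs:

```python
# Rough uplink budget for one VOIP call (illustrative assumptions only).
codec_kbps = 12.2            # e.g. a narrowband speech codec (AMR-NB full rate)
packets_per_second = 50      # 20 ms of speech per packet
header_bytes = 40            # RTP (12) + UDP (8) + IPv4 (20) headers per packet
overhead_kbps = packets_per_second * header_bytes * 8 / 1000  # 16.0 kbps

required_uplink_kbps = codec_kbps + overhead_kbps
print(round(required_uplink_kbps, 1))  # -> 28.2 (before any link-layer overhead)
```

On these assumptions a call needs a sustained uplink of roughly 30 kbps, so a tariff tier whose uplink is throttled well below that makes VOIP effectively unusable, which is precisely the deterrent the text describes.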

Figure 19: How KPN is trying to defend its revenues


Source: KPN’s Q2 results presentation

Of course, such tactics can be undermined by competition – if one mobile operator in a market begins offering generous data-only tariffs, consumers may well gravitate towards that operator, forcing the others to adjust their tariff plans.

Moreover, bundling voice, SMS and data will generally only work for contract customers. Prepaid customers, who only want to pay for what they use, are naturally charged for each minute of calls they make and each message they send. These customers, therefore, have a stronger financial incentive to find a free WiFi network and use that to send messages via Facebook or make calls via Skype.

The Rich Communications Suite (RCS)

To fend off the threat posed by Skype, Facebook, Google and Apple’s multimedia communications services, telcos are also trying to improve their own voice and messaging offerings. Overseen by mobile operator trade association the GSMA, the Rich Communications Suite is a set of standards and protocols designed to enable mobile phones to exchange presence information, instant messages, live video footage and files across any mobile network.

In an echo of social networks, the GSMA says RCS will enable consumers to create their own personal community and share content in real time using their mobile device.

From a technical perspective, RCS uses the Session Initiation Protocol (SIP) to manage presence information and relay real-time information to the consumer about which service features they can use with a specific contact. The actual RCS services are carried over an IP-Multimedia Subsystem (IMS), which telcos are using to support a shift to all-IP fixed and mobile networks.
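As an illustration of the presence mechanism, a minimal SIP SUBSCRIBE request for the presence event package might look like the sketch below. All addresses, tags and identifiers are placeholder values, not taken from any RCS deployment:

```python
# Minimal SIP SUBSCRIBE request for the presence event package (RFC 3856).
# All addresses, tags and IDs below are placeholder values for illustration.
subscribe = "\r\n".join([
    "SUBSCRIBE sip:alice@example.com SIP/2.0",
    "Via: SIP/2.0/UDP client.example.com;branch=z9hG4bK776asdhds",
    "From: <sip:bob@example.com>;tag=1928301774",
    "To: <sip:alice@example.com>",
    "Call-ID: a84b4c76e66710",
    "CSeq: 1 SUBSCRIBE",
    "Event: presence",                # the presence event package
    "Accept: application/pidf+xml",   # presence state is returned as a PIDF document
    "Expires: 3600",                  # subscription lifetime in seconds
    "Contact: <sip:bob@client.example.com>",
    "Content-Length: 0",
    "", "",
])
print("Event: presence" in subscribe)  # -> True
```

The network then returns the contact’s presence state in NOTIFY messages for the lifetime of the subscription; at RCS scale, it is exactly this subscribe/notify traffic that creates the signalling-load question raised later in this section.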

Deutsche Telekom, Orange, Telecom Italia, Telefonica and Vodafone have publicly committed to deploy RCS services, indicating that the concept has momentum in Europe, in particular. The GSMA says that interoperable RCS services will initially be launched by these operators in Spain, Germany, France and Italy in late 2011 and 2012. [NB We’ll be discussing RCSe with some of the operators at our EMEA event in London in November 2011.]

In theory, at least, RCS will have some advantages over many of the communications services offered by the disruptors. Firstly, it will be interoperable across networks, so you’ll be able to reach people using different service providers. Secondly, the GSMA says RCS service features will be automatically available on mobile devices from late 2011 without the need to download and install software or create an account (by contrast, Apple’s iMessage service, for example, will only be installed on Apple devices).

But questions remain over whether RCS devices will arrive in commercial quantities fast enough, and whether RCS services will be priced attractively, packaged well and marketed effectively. Moreover, it isn’t yet clear whether IMS will be able to handle the huge signalling load that would arise from widespread usage of RCS.

Internet messaging protocols, such as XMPP, require the data channel to remain active continuously. Tearing down and reconnecting generates lots of signalling traffic, but the alternative – maintaining a packet data session – will quickly drain the device’s battery.
By 2012, Facebook and Skype may be even more entrenched than they are today and their fans may see no need to use telcos’ RCS services.
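The signalling trade-off described above can be made concrete with a toy model. The per-session message count and the messaging rate below are illustrative assumptions only, not measurements:

```python
# Toy model of the always-on vs. reconnect trade-off (illustrative numbers only).
setup_msgs = 8            # assumed signalling messages to (re)establish a data session
messages_per_hour = 30    # assumed chat messages sent/received per hour

# Option 1: tear the session down after every message, reconnect for the next
signalling_per_hour_reconnect = messages_per_hour * setup_msgs   # 240

# Option 2: hold the session open permanently (one setup), at a battery cost
signalling_per_hour_always_on = setup_msgs                       # 8

print(signalling_per_hour_reconnect, signalling_per_hour_always_on)  # -> 240 8
```

Even with these modest assumed figures, reconnecting per message generates an order of magnitude more signalling than an always-on session, which is why the battery-versus-signalling dilemma has no clean answer.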

Competing head-on

Some of the largest mobile operators have tried, and mostly failed, to take on the disruptors at their own game. Vodafone 360, for example, was Vodafone’s much-promoted but ultimately unsuccessful €500 million attempt to insert itself between its customers and social networking and messaging services from the likes of Facebook, Windows Live, Google and Twitter.

As well as aggregating contacts and feeds from several social networks, Vodafone 360 also served as a gateway to the telco’s app and music store. But most Vodafone customers didn’t appear to see the need to have an aggregator sit between them and their Facebook feed. During 2011, the service was stripped back to just the app and music store. In essence, Vodafone 360 didn’t add enough value to what the disruptors were already offering. We understand, from discussions with executives at Vodafone, that the service is now being mothballed.

A small number of large telcos, mostly in emerging markets where smartphones are not yet commonplace, have successfully built up a portfolio of value-added consumer services that go far beyond voice and messaging. One of the best examples is China Mobile, which claims more than 82 million users for its Fetion instant messaging service, for example (see Figure 20 – China Mobile’s Internet Services).

Figure 20 – China Mobile’s Internet Services


Source: China Mobile’s Q2 2011 results

However, it remains to be seen whether China Mobile will be able to continue to attract so many customers for its (mostly paid-for) Internet services once smartphones with full web access go mass-market in China, making it easier for consumers to access third-parties’ services, such as the popular QQ social network.

Some telcos have tried to compete with the disruptors by buying innovative start-ups. A good example is Telefonica’s acquisition of VOIP provider Jajah for US$207 million in January 2010. Telefonica has since used Jajah’s systems and expertise to launch low-cost international calling services in competition with Skype and companies offering calling cards. Telefonica expects Jajah’s products to generate $280 million of revenue in 2011, primarily from low-cost international calls offered by its German and UK mobile businesses, according to a report in the FT.

The Wholesale Applications Community (WAC)

Concerned about their growing dependence on the leading smartphone platforms, such as Android and Apple’s iOS, many of the world’s leading telcos have banded together to form the Wholesale Applications Community (WAC).

WAC’s goal is to create a platform developers can use to create apps that will run across different device operating systems, while tapping the capabilities of telcos’ networks and messaging and billing systems.

At the Mobile World Congress in February 2011, WAC said that China Mobile, MTS, Orange, Smart, Telefónica, Telenor, Verizon and Vodafone are “connected to the WAC platform”, while adding that Samsung and LG will ensure “that all devices produced by the two companies that are capable of supporting the WAC runtime will do so.”

It also announced the availability of the WAC 2.0 specification, which supports HTML5 web applications, while WAC 3.0, which is designed to enable developers to tap network assets, such as in-app billing and user authentication, is scheduled to be available in September 2011.

Ericsson, the leading supplier of mobile networks, is a particularly active supporter of WAC, which also counts Alcatel-Lucent, Huawei, LG Electronics, Qualcomm, Research in Motion, Samsung and ZTE among its members.

In theory, at least, app developers should also throw their weight behind WAC, which promises the so far unrealised dream of “write once, run anywhere.” But, in reality, games developers in particular will probably still want to build specific apps for specific platforms, to give their software a performance and functionality edge over rivals.

Still, the ultimate success or failure of WAC will likely depend on how enthusiastically Apple and Google, in particular, embrace HTML5 and actively support it in their respective smartphone platforms. We discuss this question further in the Apple and Google chapters of this report.

Summarising telcos’ current responses to the disruptors


Telcos, and their close allies in the equipment market, are clearly alert to the threat posed by the major disruptors, but they have yet to develop a comprehensive game plan that will enable them to protect their voice and messaging revenue, while expanding into new markets.

Collective activities, such as RCS and WAC, are certainly necessary and worthwhile, but are not enough. Telcos, and companies across the broader TMT ecosystem, also need to adapt their individual strategies to the rise of Amazon, Apple, Facebook, Google and Skype-Microsoft. This report is designed to help them do that.



RIM: R.I.P. or ‘Reports of my death are greatly exaggerated’?

Summary: RIM’s shares have plummeted in value over the last four months, prompting an eruption of finger-pointing in the media and speculation of its demise or acquisition. In this analysis we examine whether the doom-mongers are right and what RIM’s recovery strategy might be. (July 2011, Executive Briefing Service)

Below is an extract from this 12 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service here. Non-members can buy a Single User license for this report online here for £295 (+VAT) or subscribe here. For multiple user licenses or other enquiries please email or call +44 (0) 207 247 5003.



Background – RIM’s share price disaster

RIM’s shares have plummeted in value over the last four months, prompting an eruption of finger-pointing in the media and speculation of its demise or acquisition. In this analysis we examine whether the doom-mongers are right and what RIM’s recovery strategy might be.

‘Reports of my death are greatly exaggerated’ – US writer Mark Twain, 1907, when he failed to return to New York City as scheduled and The New York Times speculated that he might have been “lost at sea.”

Figure 1 – RIM has obviously underperformed Apple, but incredibly it has also underperformed Nokia.

RIM, Apple, Nokia Share Prices July 2011 Telco 2.0

With its iconic Blackberry devices, RIM led the way in the mobile messaging era – first in corporate and then in consumer markets. But the transition to the mobile web has seen it surpassed by Apple and Google in developed consumer markets. In this respect RIM faces the same challenge as Nokia. And yet, despite facing the same challenge, RIM and Nokia have taken completely different strategic options for their future. When Nokia announced its partnership with Microsoft, it pointedly talked about the creation of the third platform for the mobile web alongside Apple and Google – Nokia effectively discounted RIM from the game.

Previous Telco 2.0 analysis on RIM includes: RIM: how does the BlackBerry fit with Telco 2.0 strategies?; Mobile Software Platforms: Rapid Consolidation Forecast; and Nokia’s Strange Services Strategy – Lessons from Apple iPhone and RIM.

Current Position – on the surface, OK, but…

At first glance, RIM looks in a healthy position and its recent results show that both handset shipments (13.2m vs 11.2m) and revenues (US$4.9bn vs US$4.2bn) were up on the previous year. RIM is making reasonable profits (US$695m) and has a healthy cash position (US$2.9bn). But under the hood, life is not looking as rosy.

Profits: Under Pressure

RIM’s accounts show that its absolute profits are declining, as growth in R&D and S&M costs is exceeding the slowing growth in revenues.

Figure 2 – RIM’s Profits are down against growth in R&D and S&M costs


Of course, rising R&D and S&M costs may ultimately result in new revenues, although at present the effects of this spending are not yet evident in overall performance.

Revenues: Squeezed out of Key Markets

RIM’s revenues are dropping in key markets, particularly the USA, and its growth in revenues is coming from emerging markets.

Figure 3 – RIM’s Changing Market Revenues


Market Share: Declining

RIM’s share of the overall smartphone market is declining.

Figure 4 – RIM’s Declining Worldwide Market Share


Core Product Advantages: Eroded

Core product advantages (e.g. Blackberry Messenger) are being eroded and surpassed as the competition (e.g. Apple iMessage) improves.

New Products: Late

New devices such as the updated Bold 9900 have missed planned release dates.

To read the rest of this report, including…

  • Outlook – a time of transition?
  • QNX & TAT – RIM’s saviours?
  • Playbook – A disappointing start
  • Coming: the Android / Emerging Market Crunch
  • Corporate Strength
  • Telco 2.0 Conclusions & Recommendations – Is there a recovery strategy?


Members of the Telco 2.0 Executive Briefing Subscription Service can download the full 14 page report in PDF format here. Non-Members, please see here for how to subscribe, here to buy a single user license for £295, or for multi-user licenses and any other enquiries please email or call +44 (0) 207 247 5003.

Companies, technologies and products referenced: 7digital, Adobe Flash, Amazon, Android, Apple, Blackberry, BlackberryOS 8, Bold 9900, Carphone Warehouse, Google, Huawei, iMessage, iPad, iPhone, Microsoft, Nokia, Phones4U, Playbook, QNX Software Systems, RIM, The Astonishing Tribe (TAT).



Arete Research: Getting to a Billion Smartphones in 2013

This is an extract from a report by Arete Research, a Telco 2.0™ partner specialising in investment analysis. The views in this article are not intended to constitute investment advice from Telco 2.0™ or STL Partners. We are reprinting Arete’s analysis to give our customers some additional insight into how some investors see the telecoms market.

This report can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service using the links below.


‘Growing the Mobile Internet’ and ‘Fostering Vibrant Ecosystems: Lessons from Apple’ are also key session themes at our upcoming ‘New Digital Economics’ Brainstorms (Palo Alto, 4-7 April and London, 11-13 May). Please use the links or email or call +44 (0) 207 247 5003 to find out more.



A billion Smartphones by 2013?


In August ’05 we [Arete Research] published A Billion Handsets in ’07?, in which we argued that the coming of low-cost ($25 BoM) handsets would open a new segment and take the market to unimaginable volumes (consensus at the time saw 6% growth to ~800m units in ’07). By ’07 the global handset market hit 1.2bn units, Samsung passed Motorola in volumes, and Mediatek began supplying many new entrants. Now we see the same pattern being repeated in smartphones: Western vendors staked out the first 350m units, and Apple is on track to be the #1 by value share in 1Q11. We see $80 BoM cost smartphones by YE’11, sparking rapid growth and further dramatic shifts in the mobile device landscape.

The lure of the largest consumer electronics end-market in the world, the emergence of Chinese vendors with global ambitions, the lowering of barriers to entry, and the value attributed to connecting wirelessly to the Internet will drive significant uptake of smartphones and tablets in ’12. It is no longer crazy to talk about “A Billion Smartphones in 2013”.

Table 1: Smartphone and Tablet Shipments by Region



[Table data not reproduced in this extract: shipment shares (‘% of Units’) by region, including N. America, for ’10-’15E.]

Source: Arete Research estimates

Will This Bring Growth?

In ’10 developed market demand lifted global ASPs after relentless declines. Apple, HTC and RIM took 9% of value share at Nokia’s expense (down 11% from ’08 to ’10). Even with incremental smartphone demand driven by emerging markets from 2H11, it is clear consumers are willing to spend more for a smartphone. This should soften historic ASP declines: we see the overall handset industry rising from $182bn in ’10 to $214bn in ’11 (+18%) and $223bn in ’12 (+4%). Smartphone units should reach 480m in ’11 and 800m in ’12. We estimate smartphones will be on a 1bn annual run rate by 4Q12.
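The growth arithmetic behind these industry figures can be checked directly from the numbers quoted above:

```python
# Checking the handset industry revenue figures quoted above ($bn).
revenue_bn = {2010: 182, 2011: 214, 2012: 223}

growth_11 = (revenue_bn[2011] - revenue_bn[2010]) / revenue_bn[2010]
growth_12 = (revenue_bn[2012] - revenue_bn[2011]) / revenue_bn[2011]
print(round(growth_11 * 100))  # -> 18 (% growth in '11)
print(round(growth_12 * 100))  # -> 4  (% growth in '12)

# A "1bn annual run rate by 4Q12" implies roughly this many units in that quarter:
print(1000 // 4)  # -> 250 (m smartphone units per quarter)
```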

Wireless Logic Semis Could Double. The rise of mass market smartphones will raise the semis content within device BoMs. Apps processors, connectivity and memory (mobile DRAM and flash) will rise in the mix as display prices drop. We think the $9.5bn logic semis market can rise to $18bn+ in ’12. We also see steep price declines in tablets as traditional PC and mobile device ecosystems battle to control new low-cost computing platforms.

Memory Madness. We think combined DRAM and NAND demand will lead mobile device memory sales to nearly triple from ~$9bn in ’10 to $24bn in ’12, offsetting weak PC demand.

Telcos: Dummies No More. Developed-market operators will offer bundled tablets from mid-’11, while low-cost computing will reach emerging markets such as India. Smartphones offer a vast upselling opportunity for operator data plans.

Smartphones: Mix and Margins

We see the same failure of imagination around smartphone volumes that we saw and wrote about in traditional handsets: industry and financial analysts simply cannot break the habit of forecasting mid-single digit annual growth. Yet it is clear that adding touchscreens and better browsers, as well as rolling out 3G infrastructure in emerging markets, significantly boosts the marginal utility of a mobile device to end consumers.

In 2010 consumers showed a willingness to pay more for devices, because they were worth more to them. Although the apps developed for emerging markets will be different from those well-known in the US and Europe, we have little doubt that the openness of smartphone platforms will allow new usage patterns there as well. The software flexibility and gradual release of more language packs for low-end smartphones only widens the addressable base for these devices. While we appreciate that some people think emerging-market consumers will not want a “good enough” smartphone, we also cannot model 200-300m users jumping from sub-$100 ASPs to over $200; we think a touchscreen model with a smaller display and limited onboard memory would find broad mass-market acceptance.

Table 2: Differing Needs




[Table data not reproduced in this extract: fixed-line broadband penetration per 100 people and GDP per capita (current US$) by market.]

Source: Worldbank Data from ’08 and ’09, ITU

Table 2 shows how the smartphone is poised to provide the primary method of Internet access in markets where there is negligible penetration of fixed-line services. And Table 3 shows how the Middle East and Africa are joining Asia in absorbing the largest share of overall device units over time, even as value share remains concentrated in N. America and Europe. The time is rapidly coming for local brands to take centre stage in these regions. Names like TCL, TianYu, CoolPad, Huawei, ZTE and Micromax are but the tip of the iceberg. This will only increase the pressure on the old group of five traditional handset vendors (Nokia, Motorola, Samsung, LGE and SonyEricsson) that commanded ~80% market share from ’00 to ’07, before the rise of Apple, RIM and HTC altered industry dynamics irrevocably.

Table 3: Majority of Units in Asia & MEA ’09-’12E

[Table data not reproduced in this extract.]

Source: Arete Research estimates

How do we get to an $80 BoM smartphone, which supports a $100 trade price? The bill of materials below shows the key components and what changes in low end smartphones. Bear in mind this is a device with a <3″ display and limited on-board memory; that said, retail prices for 8GB microSD cards are c. $6. We think this will bring smartphones to the mass market in place of featurephones, with emerging markets consumers getting open software platforms that allow customisation via locally relevant applications.

Our estimates by region for smartphone penetration are seen in Table 1 for ’10-’15. We see the industry reaching a 1bn unit run rate by 4Q12, and exceeding this in 2013. Effectively we think that half of overall global volumes will be smartphones in 2013, i.e. users will get a software upgradable device at all but the lowest price points, with the rare exception of niche voice-centric products for developed markets (for example, for older users).

Figure 1: Getting to an $80 BoM Smartphone?

Source: Arete Research estimates

The ’10 surge in developed markets smartphones drove large value share redistribution, while also sustaining global handset ASPs. Apple, RIM and HTC saw their ASPs actually increase in 2H10, while the overall industry ASPs rose 5% yoy in ’10. We believe that in ’11 we will see further modest ASP rises, as half of the incremental smartphone units will still come from N. America and Europe, and supply remains tight in 1H11. As emerging markets start driving growth in ’12, we expect ASPs will drop, but remain above ’09 levels (part of this is influenced by Apple in the mix). We reckon smartphones will constitute 84% of industry value share in ’13 (based on 53% of units), up from 56% in ’10 (from just 21% of units).
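One way to read these value/unit shares is as an implied smartphone ASP premium over the blended industry ASP, using only the percentages quoted above:

```python
# Implied smartphone ASP relative to the blended industry ASP:
# (share of industry value) / (share of units)
premium_2010 = 0.56 / 0.21   # 2010: 56% of value on 21% of units
premium_2013 = 0.84 / 0.53   # 2013E: 84% of value on 53% of units
print(round(premium_2010, 2))  # -> 2.67
print(round(premium_2013, 2))  # -> 1.58
```

The premium falling from roughly 2.7x to 1.6x the blended ASP is another way of seeing the mass-market shift: smartphones stop being a premium niche and converge towards the industry average price point.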

This does not account for the additional value that integrated handset vendors, and service providers will capture from a rising smartphone penetration, through content and other services. This dramatic shift in market value towards smartphones is shaking up what had from ’00 to ’07 been a cosy oligopoly in which the top five vendors consistently took ~80% of volume and value.

Figure 2: Migration to Smartphones Sustaining Industry ASPs

Source: Arete Research estimates

Nokia’s recent announcement that it would end-of-life Symbian and abandon its half of MeeGo threw the OS picture into chaos; we see Android becoming the dominant OS choice by 2012 (see Table 4), though we expect a number of branches or variants on the Android kernel, as regional players and Tier One OEMs seek to differentiate smartphone offerings…

To read the Briefing in full, including in addition to the above analysis of:

  • Share by Smartphone OS
  • Semis: Logically, More – Doubling the logic chip market
  • Memory Madness – Tripling the memory markets
  • Telcos: Dummies no More – Growth opportunities for telcos with sophisticated tiers
  • Billions and Billions – Putting the numbers in context

…and additional tables and figures…

  • Table 4 – Share of Smartphone OS
  • Table 5 – Total Mobile Memory Markets
  • Table 6 – DRAM
  • Table 7 – NAND
  • Figure 3 – AT&T Post Paid ARPU ($) vs. Integrated Devices
  • Figure 4 – US Wireless Revenue Share
  • Figure 5 – VZW/AT&T Margins
  • Figure 6 – 3Q10 Mobile Service Revenue Growth (yoy)
  • Figure 7 – Consensus Forward P/E Multiple
  • Figure 8 – What Price Mobile Exposure in EU?

Members of the Telco 2.0™ Executive Briefing Subscription Service can download the full 7 page report in PDF format here. Non-Members, please see here for how to subscribe. Please email or call +44 (0) 207 247 5003 for further details.

There’s also more on Device Strategies at our AMERICAS, EMEA and APAC Executive Brainstorms and Best Practice Live! virtual events.

Full Article: Devices 2.0: ‘Beyond Smartphones’ – Innovation Strategies for Operators

Summary: Managing the role of new device categories in new and existing fixed and mobile business models is a key strategic challenge for operators. This report includes analysis of the practicalities and challenges of creating customised devices, best and worst practice, inserting ‘control points’ into open products, and the role of ‘ODMs’, and reviews leading alternative approaches.

NB A PDF Version of this 45 page report can be downloaded here.


As part of its recently-published report on Mobile and Fixed Broadband Business Models, Telco 2.0 highlighted four potential strategic scenarios, one of which was for operators to become “device specialists” as a deliberate element of strategy, in either wireline or wireless domains. This theme was also covered at the April 2010 Telco 2.0 Brainstorm event in London.

Clearly, recent years have displayed accelerating innovation in numerous “end-point” domains – from smartphones, through to machine-to-machine systems and a broad array of new consumer electronics products. Yet there has been only limited effort made in mapping this diversity onto the broader implications for operators and their business prospects. 

Moving on from legacy views

An important aspect of device specialisation for telcos is one of attitude and philosophy. In the past, the power of the network has had primacy – large switching centres were at the heart of the business model, driving telephones – in some cases even supplying them with electrical power via the copper lines as well. Former government monopolies and powerful regulators have further enshrined the doctrines of central control in telecom executives’ minds.

Yet, as has been seen for many years in the computing industry, centralised systems give way to power at the edge of the network, increasingly assisted by a “cloud” of computing resource which is largely independent of the “wiring” needed to connect it. The IT industry has long grasped the significance of client/server technology and, more recently, the power of the web and distributed computing, linked to capable and flexible PCs.

But in the telecom industry, some network-side traditionalists still refer to “terminals” as if Moore’s Law has no relevance to their businesses’ success. But the more progressive (or scared) are realising that the concentration of power “at the telecom edge”, coupled with new device-centred ecosystems (think iPhone + iTunes + AppStore), is changing the dynamics of the industry to one ruled by a perspective starting from the user’s hand back inwards to the core.

With the arrival of many more classes of “connected device” – from e-readers, to smart meters or in-vehicle systems – the role of the device becomes ever more pivotal in determining both the structure of supporting business models and the role of telcos in the value chain. It also has many implications for vendors.

The simplest approach is for operators to source and give away attractive devices in order to differentiate and gain new, or retain existing customers – especially in commoditised access segments like ADSL. At the other end of the spectrum, telcos could pursue a much deeper level of integration with new services to drive new fixed or mobile revenue streams – or create completely unique end-to-end propositions to rival those of 3rd-party device players like Apple, Sony or TiVo.

This Executive Brief examines the device landscape from an operator’s or network vendor’s standpoint. It looks at whether service providers should immerse themselves in creating on-device software and unique user experiences – or even commission the manufacture of custom hardware products or silicon. Alternatively, it considers the potential to “outsource” device smarts to friendlier suppliers like RIM or Thomson/Technicolor, which generally have operators’ success at the centre of their strategies. The alternative may be to surrender yet more value to the likes of Apple, Sony or Sling Media, allowing independent Internet or local services to be monetised without an “angle” for telco services.

Structure of this report

The narrative of this document follows this structure:

  • Introduction
  • The four broadband scenarios for operators, and an introduction to the “device specialist”
  • An initial mechanism for mapping the universe of devices onto operator business models, which generally fit with four modes of communication
  • Why devices are such a potential threat if not tackled head-on
  • Case studies of previous telco experience of device focus, from a stance of best/worst practice
  • Enhancements to existing business models via device focus
  • Examples of new business models enabled by devices
  • The practicalities of device creation and customisation
  • A mechanism for studying risk/reward in telcos’ device strategies
  • Recommendations and conclusions

A recap: 4 end-game scenarios

Broadband as the driver

Given the broad diversity of national markets in terms of economic development, regulation, competition and technology adoption, it is difficult to create simple categories for the network operators of the future. Clearly, an open-access, city-owned local fibre deployment in Europe, a start-up WiMAX provider in Africa and a cable provider in North America are worlds apart.

Nevertheless, it is worth attempting to set out a few ‘end-game’ scenarios, at least for broadband providers in developed markets for which the ‘end’ might at least be in sight. This is an important consideration, as it sets parameters for what different types of telco and network owner can reasonably expect to do in the realm of device innovation and control.

The four approaches we have explored are:

  1. Telco 2.0 Broadband Player. This is the ultimate manifestation of the centralised Telco model, able to gain synergies from vertical integration as well as able to monetise various partner relationships and ecosystems. It involves some combination of:
    • Enhanced retail model providing well-structured connectivity offerings (e.g. tiered, capped and with other forms of granular pricing), as well as an assortment of customer-facing, value-added services. This may well have a device dimension. We also sometimes call this “Telco 1.0+” – improving current ways of doing business, especially through better up-selling, bundling and price discrimination.
    • Improved variants of ‘bulk wholesale’, providing a rich set of options for virtual operators or other types of service provider (e.g. electricity smart grid)
    • New revenue opportunities from granular or ‘slice and dice’ wholesale, based on two-sided business models for access capacity. This could involve prioritised bandwidth for content providers or mobile network offload, various ‘third-party paid’ data propositions, capabilities to embed broadband ‘behind the scenes’ in new types of device and so on.
    • A diverse set of ‘network capability’ or ‘platform’ value-add services for wholesale and upstream customers, such as authentication and billing APIs, and aggregated customer intelligence for advertisers. Again, there may be a device “angle” here – for example the provision of device-management capabilities to third parties.
    • A provider of open Internet services, consumed on other operators’ networks or devices, via normal Internet connectivity, essentially making the telco a so-called ‘over the top’ Internet application provider itself. This requires a measure of device expertise, in terms of application development and user-experience design.
  2. The Happy Piper. The broadband industry often likes to beat itself up with the threat of becoming a ‘dumb pipe’, threatened by service-layer intelligence and value being abstracted by ‘over the top players’. Telco 2.0 believes that this over-simplifies a complex situation, polarising opinion by using unnecessarily emotive terms. There is nothing wrong with being a pipe provider, as many utility companies and satellite operators know to their considerable profit. There are likely to be various sub-types of Telco that believe they can thrive without hugely complex platforms and multiple retail and wholesale offers, either running “wholesale-only” networks, participating in some form of shared or consortium-based approach, or offering “smart pipe services”.
  3. Government Department. There is an increasing trend towards government intervention in broadband and telecoms. In particular, state-guided, fully-open wholesale broadband is becoming a major theme, especially in the case of fibre deployments, alongside the use of stimulus funds and public-sector demand for ‘pipes’. Some telcos are likely to undergo structural separation of network from service assets, or become sub-contract partners for major national infrastructure projects such as electricity smart grids or tele-health.
  4. Device specialist, as covered in the rest of this report. This is where the operator puts its device skills at the core of its strategy – in particular, where the end-points become perhaps the most important functional component of the overall service platform. Much of the telco’s service proposition (and/or cost structure) would not work with generic “vanilla” devices – some form of customisation and control is essential. An analogy here is Apple – its iTunes and AppStore ecosystems and business models would not work with generic handsets. Conversely, Google is much less dependent on Android-powered handsets – it is able to benefit from advertising consumed on any type of device with a browser or its own software clients.

There are also a few other categories of service provider that could be considered but which are outside the scope of this report. Most obvious is ‘Marginalised and unprofitable’, which clearly is not so much a business model as a route towards acquisition or withdrawal. The other obvious group is ‘Greenfield telco in emerging market’, which is likely to focus on basic retail connectivity offers, although perhaps with some innovative pricing and bundling approaches. (A full analysis of all these scenarios is available in Telco 2.0’s new strategy report on Fixed and Mobile Broadband Business Models).

It should be stressed that these options apply to operators’ broadband access in particular. Taking a wider view of their overall businesses, it is probable that different portfolio areas will reflect these (and other) approaches in various respects. In particular, many Telco 2.0 platform plays will often dovetail with specific device ecosystems – for example, where operators deploy their own mobile AppStores for widgets or Android applications.


Figure 1: Potential end-game scenarios for BSPs

Source: Telco 2.0 Initiative

Introducing the device specialist

In many ways, recent trends around telecoms services and especially mobile broadband have been driven as much by end-user device evolution as by network technology, tariffing or operation. Whilst it may be uncomfortable reading for telcos and their equipment vendors, value is moving beyond their immediate grasp. In future, operators will need to accept this – and if appropriate, develop strategies for regaining some measure of influence in that domain.

Smartphones have been around for years, but it was Apple’s iPhone that really kick-started the market as a distinct category for active use of broadband access, aided by certain operators which managed to strike exclusive supply deals. PCs have clearly driven the broadband market’s growth – but at the expense of a default assumption of “pipe” services. Huawei’s introduction of cheap and simple USB modems helped establish the market for consumer-grade mobile broadband, with well over 50 million “dongles” now shipped. Set-top boxes, ADSL gateways and now femtocells are further helping to redefine fixed broadband propositions, for those broadband providers willing to go beyond basic modems.

Going forward, new classes of device for mobile, nomadic and fixed use promise a mix of new revenue streams – and, potentially, more control over operator business models. In 2010, the advent of the Apple iPad preceded a stream of “me-too” tablets, with an expectation of strong operator involvement in many of them.

However, not all telcos, either fixed or mobile, can be classified as device specialists. There is a definite art to using hardware or client software as a basis for new and profitable services, with differentiated propositions, new revenue streams and improved user loyalty. There are also complexities with running device management systems, pre-loading software, organising physical sales and supply chains, managing support issues and so on.

Operators can either define and source their own specific device requirements, or sometimes benefit from exclusivity or far-sightedness in recognising attractive products from independent vendors. Various operators’ iPhone exclusives are probably the easiest to highlight, but it is also important to recognise the skills of companies such as NTT DoCoMo, which defines most of the software stack for its handsets and licenses it out to the device manufacturers.
In the fixed domain, some operators are able to leverage relationships with PC vendors, and in future it seems probable that new categories like smart meters and home entertainment solutions will provide additional opportunities for device-led partnerships.

Consequently, it is fair to say that device specialism can involve a number of different activities for operators:
  • A particularly strong focus on device selection, testing, promotion and support.
  • Development of own-brand devices, either produced bespoke in collaboration with ODMs (detailed later in this document), or through relatively superficial customisation of existing devices.
  • Negotiation of periods of device exclusivity in a given market (e.g. AT&T / iPhone)
  • Definition of the operator’s own in-house OS or device hardware platform, such as the strategies employed by NTT DoCoMo (with its Symbian / Linux variants) or KDDI (modified Qualcomm BREW) in Japan.
  • Provision of detailed specifications and requirements for other vendors’ devices, for example through Orange’s lengthy “Signature” device profiles.
  • Development of the operator’s own UI, applications and services – such as Vodafone’s 360 interface or its previous Live suite.
  • Deployment of device-aware network elements which can optimise end-to-end performance (or manage traffic) differentially by device type or brand.
  • The ability to embed and use “control points” in devices to enable particular business models or usage modes. Clearly, the SIM card is a controller, but it may also be desirable to have more fine-grained mechanisms for policy at an OS level as well. For example, some handset software platforms are designed to allow operators to license and even “revoke” particular applications, while another emerging group is focused on handset apps used to track data usage and sell upgrades.
  • Development of end-to-end integrated services with devices as core element (similar to Apple or RIM). Much of the value around smartphones has been driven by the link of device-side intelligence to some form of “cloud” feature – RIM’s connection to Microsoft Exchange servers, or Apple iPhone + AppStore / iTunes, for example. Clearly, operators are hoping to emulate this type of distributed device/server symbiosis – perhaps through their own app stores.
  • Lastly, operators may be able to exercise influence on device availability through the enablement of a “device ecosystem” around its services & network. In this case, the telco provides certain platform capabilities, along with testing and certification resources. This enables it to benefit from exclusive devices created by partners, rather than in-house. Verizon’s attempt with its M2M-oriented “Open Device Initiative” is a good example.


Clearly, few operators will be in a position to pursue all of these options. However, in Telco 2.0’s view, there remains significant clear water between those which put device-related activities front and centre in their efforts – and those which are more driven by events and end-point evolution from afar.

New business models vs. old

Despite the broad set of options outlined in the previous section, it is important to recognise that operators’ device initiatives can be grouped into two broad categories:

  • Improving existing business models, for example through improving subscriber acquisition, reducing opex, or inducing an uplift in revenues on a like-for-like basis over older or more generic devices.
  • Enabling new business models, for example by selling devices linked to new end-to-end services, enabling the sale of incremental end-user subscriptions, or better facilitating certain new Telco 2.0-style two-sided opportunities (e.g. advertising).



Although much of the publicity and industry “noise” focuses on the strategic implications of the latter, it is arguably the former, more mundane aspects of device expertise that can make a bottom-line difference in the near term. While Telco 2.0 generally prefers to focus on the creation of new revenues and business model innovation, this is one area of the industry where it is also important to consider the inertia of existing services and propositions, and the opportunities to reduce opex by optimising the way that devices work with networks. A good example is the efficiency and network-friendliness of RIM’s BlackBerry in comparison with Apple’s iPhone, in both data compression technologies and use of signalling.

That said, the initial impetus for deploying the iPhone was mostly around customer acquisition and upselling higher-ARPU plans – but the unexpected success of apps quickly distracted some telcos from the basics, pulling them back towards their preferred and familiar territory of centralised control.

What are the risks without device focus?

Although many operators bemoan the risks of becoming a “dumb pipe”, few seem to have focused on exactly what generates that risk. While the “power of the web” and the seeming acceptability of “best effort” communications get cited, the finger of blame is rarely pointed directly at the device space.

Over many decades, telecoms engineers and managers have grown up with the idea that devices are properly called “terminals”. Evocative of the 1960s or 1970s, when the most visible computers were “dumb” end-points attached to mainframes, this reflects the historic use of analogue, basic machines like fixed telephones, answering machines or primitive data devices.

Nevertheless, some people in the telecoms industry still stick with this anachronistic phrasing, despite the last twenty or thirty years of ever-smarter devices. The refusal to admit the importance of “the edge” is characteristic of those within telcos and their suppliers that don’t “get” devices, instead remaining convinced that it is possible to control an entire ecosystem from the core outwards.

This flat-earth philosophy is never better articulated than in the continuing mantra of fear about becoming “dumb pipes”. It is pretty clear that there are indeed many alternatives for creating “smart pipes”, but those that succeed tend to be aware that, often, the end-points in customers’ hands or living rooms will be smarter still.

In our view, one of the most important drivers of change – if not the most important – is the fast-improving power of devices to become more flexible, open and intelligent. They are increasingly able to “game” the network in a way that older, closed devices were not. Where necessary, they can work around networks rather than simply through them. And, unlike the “dumb” end-points of the past such as basic phones and fax machines, there is considerable value in many products when they are used “offline”.

The markets tend to agree as well – the capitalisation of Apple alone is now over $200bn, with other major device or component suppliers (Nokia, Qualcomm, Microsoft, Intel, RIM) also disproportionately large.

“Openness” is a double-edged sword. While having a basic platform enables operators to customise and tinker to meet their own requirements, that same level of openness is also available to anyone else who wishes to compete. Some operators have managed the delicate balancing act of retaining the benefits of openness for themselves, but closing it down for end-users to access directly – DoCoMo’s use of Symbian and Linux in the “guts” of its phones is probably the best example.

Openness is also being made even easier to exploit through the continued evolution of the web browser. At the moment, it takes considerable programming skill to harness the power of an iPhone or a Nokia Symbian device – or, especially, a less-accessible device like an Internet TV. As it becomes more and more possible to run services and applications inside the browser, the barriers to entry for competing service providers are lowered still further. Even Ericsson, typically one of the most traditional telephony vendors, has experimented with browser-based VoIP. That said, there are some approaches to the web, such as the OMTP BONDI project, which might yet provide telcos with control points over browser capabilities, for example in terms of permitting/denying access to underlying device features, such as making phone calls or accessing the phonebook.

Compute power: the elephant in the room

There is clear evidence that “intelligence” moves towards the edge of networks, especially when it can be coordinated via the Internet or private IP data connections. This has already been widely seen in the wired domain, with PCs and servers connected through office LANs and home fixed broadband, and is now becoming evident in mobile. There are now several hundred million iPhones, BlackBerries and other smartphones in active data-centric use, as well as over 50m 3G-connected notebooks and netbooks. Home gateways and other devices such as femtocells, gaming consoles and Internet TVs are further examples, with billions more smart edge-points on the horizon from M2M and RFID initiatives.

This is a consequence of scale economies and also Moore’s Law, reflecting processors getting faster and cheaper. This applies not just to the normal “computing” chips used for applications, but also to the semiconductors used for the communications parts of devices. Newer telecom technologies like LTE, WiMAX and VDSL are themselves heavily dependent on advanced signal processing techniques, to squeeze more bits into the available network channels.
Ericsson’s talk of 50 billion connected devices by the end of the decade seems plausible, although Amdocs’ sound-bite of 7 trillion by 2017 seems to have acquired a couple of rogue zeroes. That said, even at the smaller figure, not all devices will be fully “smart”.
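The “rogue zeroes” remark can be sanity-checked with a line of arithmetic. A minimal sketch (Python), using only the forecast figures quoted above:

```python
# Sanity check on the two connected-device forecasts quoted in the text.
ericsson_forecast = 50e9   # Ericsson: ~50 billion connected devices
amdocs_forecast = 7e12     # Amdocs: 7 trillion devices by 2017

ratio = amdocs_forecast / ericsson_forecast
print(f"Amdocs / Ericsson = {ratio:.0f}x")  # 140x - roughly two extra zeroes
```

A factor of 140 is close enough to two orders of magnitude to justify the “couple of rogue zeroes” quip.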

Unsurprisingly, we therefore see a continued focus on this “edge intelligence” as a key battleground – who controls and harnesses that power? Is it device suppliers, telcos, end users, or 3rd-party application providers (so-called “over-the-top players”)? Does it complement “services” in the network? Or drive the need for new ones? Could it, perhaps, make them obsolete entirely?

What remains unclear is how operators might adopt a device strategy that complements their network capabilities, to strengthen their position within the digital value chain and foster two-sided business models. It is important for operators to be realistic about how much of the “edge” they can control, and under what circumstances. Given that device price points are plummeting, few customers will voluntarily choose “locked” or operator-restricted devices if similarly-capable but more flexible alternatives cost much the same. Some devices will always be open – in particular PCs. Others will be more closed, but under the control of their manufacturers rather than the telcos – the iPhone being the prime example.

It is therefore hugely important for operators to look at devices as a way of packaging that intelligence into new, specific and valuable business models and propositions – ideally, ones which are hard to replicate through alternative methods. This might imply design and development of completely exclusive devices, or making existing categories more usable. At the margins, there is also the perennial option for subsidy or financing – although that clearly puts even more pressure on the ongoing business model to have a clear profit stream.

There are so many inter-dependent factors here that it is difficult to examine the whole problem space methodically. How do developments like Android and device management help? Should the focus be on dedicated devices, or continued attempts to control the design, OS or browser of multi-purpose products? What aspects of the process of device creation and supply should be outsourced?

Where’s the horsepower?

The telcos are already very familiar with the impact of traditional PCs on their business models – they are huge consumers of data download and upload, but almost impossible to monetise for extra services, as they are bought separately and are generally seen as endpoints for standalone applications rather than for services. The specific issue of the PC (connected via fixed or mobile broadband) is covered separately, but the bottom line is that it is a case study in the ultimate power of open computing and networks. PCs have also been embedded in other “vertical market” end-points such as retail EPOS machines, bank ATMs and various in-vehicle systems.

The problem is now shifting to a much broader consumer environment, as PC-like computing capability shifts to other device categories, most notably smartphones, but also a whole array of other products in the home or pocket.
It is worth considering an illustration of the shifting power of the “edge”, as it applies to mobile phones.

If we go back five or six years, the average mobile phone had a single main processor “core” in its chipset, probably an ARM7, clocking perhaps 30MHz. Much of this was used for the underlying radio (the “modem”) and telephony functions, with a little “left over” for some very basic applications and UI tools, like Java games.

Today, many of the higher-end handsets have separate applications processors as well as the modem chip. The apps processor is used for the high-level OS and related capabilities, and is the cornerstone of the change being observed. An iPhone has a 600MHz+ chip, and various suppliers of Android phones are using a 1GHz Qualcomm Snapdragon chip. Even midrange featurephones can have 200MHz+ to play with, most of which is actually usable for “cool stuff” rather than the radio.

This is where the danger lies for the telcos: as with PCs, it can shift the bias of the device away from consuming billable services and towards running software. (The situation is actually a bit more complex than just the apps processor, as phones can also have various other chips for signal processing, which can be usable in some circumstances for aspects of general computing. The net effect is the same though – massively more computational power, coupled with more sophisticated and open software).

Now, let’s project forward another five years. The average device (in developed markets at least) will have at least 500MHz, with top-end devices at 2GHz+, especially if they are not phones but tablets, netbooks or similar products. Set-top boxes, screenphones, game consoles and other CPE devices are growing smarter in parallel – especially when enabled with browsers which can then act as general-purpose (distributed) computing environments. A new class of low-end devices is emerging as well. How and where operators might be able to control web applications is considered below, as it is somewhat different to the “native applications” seen on smartphones.

For the sake of argument, let’s take an average of 500MHz chips, and multiply by (say) 8 billion endpoints.
That’s 4 Exahertz (EHz, 10^18 Hz) of application-capable computing power in people’s hands or home networks, without even considering ordinary PCs and “smart TVs”. And much – probably most – of that power will be uncontrolled by the operators, instead being the playground of user- or vendor-installed applications.

Even smart pipes are dumb in comparison

It is tricky to calculate an equivalent figure for “the network”, but consider an approximation of 10 million network nodes (datapoint: there are 3 million cell sites worldwide), at a generous 5GHz each. That gives 50 Petahertz (PHz, 10^15 Hz) of computing power in the carrier cloud – and that already assumes most operators have thousands of servers in their back-office systems as well as in the production network itself.

In other words, the telcos, collectively, have maybe an 80th of the collective compute power of the edge. The true figure is quite possibly much lower; the calculation is intended as an upper bound.
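The back-of-envelope arithmetic above can be reproduced directly. A minimal sketch (Python), using the endpoint and node assumptions stated in the text:

```python
# Edge vs network compute power, using the assumptions stated in the text.
edge_endpoints = 8e9      # ~8 billion application-capable endpoints
edge_clock_hz = 500e6     # assumed average of 500MHz per endpoint

network_nodes = 10e6      # ~10 million network nodes (a generous figure)
network_clock_hz = 5e9    # a generous 5GHz per node

edge_total_hz = edge_endpoints * edge_clock_hz        # 4e18 Hz = 4 EHz
network_total_hz = network_nodes * network_clock_hz   # 5e16 Hz = 50 PHz

ratio = edge_total_hz / network_total_hz
print(f"Edge holds {ratio:.0f}x the network's compute power")  # 80x
```

The interesting point is less the exact ratio than its direction: every input here was chosen to flatter the network side, and the edge still dominates by almost two orders of magnitude.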

Now clearly, this is not quite as bad a deficit as that makes it sound – the network can obviously leverage intelligence in a few big control points in the core such as GGSNs and DPI boxes, as traffic funnels through them. It can exert control and policy over data flows, as well as what is done at the endpoints.

But at the other end of the pipe is the Internet, with Google’s, Amazon’s and countless other companies’ servers and “cloud computing” infrastructures. Calculating the aggregate computing power of the web isn’t easy either, but it is also likely to be in the Exahertz range. Google alone is thought to have around one million servers, for example, while the overall server population of the planet (including both Internet and enterprise) is thought to be of the order of 50 million, many of which have multiple processor cores.



Whatever else happens, it seems the pipe will inevitably become relatively “dumber” (i.e. less smart) than the devices at the edge, irrespective of smart Telco 2.0 platforms and 4G/NGN networks. The question is how much of that edge intelligence can be “owned” by the operators themselves.

Controlling device software vs. hardware

The answer is for telcos to attempt to take control of more of this enormous “edge intelligence”, and exploit it for their own benefit – whether for in-house services or two-sided strategies.

There are three main strategies for operators wanting to exert influence on edge devices:

  • Provide dedicated and fully-controlled and customised hardware and software end-points which are “locked down” – such as cable set-top boxes, or operator-developed phones in Japan. This is essentially an evolution of the old approach of providing “terminals” that exist solely to act as access points for network-based services. This concept is being reinvented with new Telco-developed consumer electronic products like digital picture frames, but is a struggle for variants of multi-function devices like PCs and smartphones
  • Provide separate hardware products that sit “at the edge” between the user’s own smart device and the network, such as cable modems, femtocells, or 3G modems for PCs. These can act as hosts for certain new services, and may also exert policy and QoS control on the connection. Arguably the SIM card fits into this category as well
  • Develop control points, in hardware or software, that live inside otherwise notionally “open” devices. This includes SIM-locks, Telco-customised UI and OS layers, “policy-capable” connection manager software for notebooks, application and widget certification for smartphones, or secured APIs for handset browsers. Normally, it will be necessary for the operator to be the original device supplier/retailer for these capabilities to be enabled before sale – few users will be happy for their own device to be configured after purchase with extra controls from their service provider.

Case studies and best / worst practice

Going back 30 years, before telecoms deregulation, many telcos were originally device specialists. In many cases, the incumbent government monopolies were also the only source of supply of telephones and various other communications products (“CPE” – customer premises equipment) – often renting them to users rather than selling them. Since then of course, much has changed. Not only have customers been able to buy standards-compliant, certified terminals on the open market, but the rise of personal computing and mobile communications has vastly expanded the range and capability of end-points available.

But while few telcos could benefit today from owning physical manufacturing plants, there is an increasing argument for operators once again to take a stronger role in defining, sourcing and customising end-user hardware in both mobile and fixed domains. As discussed throughout this document, there is a variety of methods that can be adopted – and also a wide level of depth of focus and investment. Clearly, owning factories is unlikely to be an attractive option – but at the other end of the scale, it is unclear whether merely issuing vague “specifications” or sticking logos on white-labelled goods from China really achieves anything meaningful from a business model standpoint.

It is instructive to examine a few case studies of operator involvement in the device marketplace, to better understand where it can add value as a core plank of strategy, rather than simply as a tactical add-on.


Perhaps the best example of a device-centric operator is NTT DoCoMo in Japan. It would perhaps be more accurate to describe the firm as technology-centric, as it pretty much defines its complete end-to-end system in-house, usually as a front-runner for more general 3GPP systems like WCDMA and LTE, but with subtle local modifications. About 10 years ago, it recognised that handset development risked delaying its then-new 3G FOMA services, and committed very significant funds to driving the whole device ecosystem in order to accelerate it.

In fact, DoCoMo has a very significant R&D budget in general, which means that it has been able to develop complete end-to-end platforms like i-Mode, spanning both handset software and back-end infrastructure and services. Although it is known for initiatives like these, as well as its participation in Symbian, Android and LiMo ecosystems, its device expertise goes far beyond handset software. For example, its own in-house research journal covers innovative areas of involvement, such as:

  • Improved video display on handsets
  • Development of its own in-vehicle 3G module for telematics applications
  • Measurement of handset antenna efficiency

In some ways, DoCoMo is in a unique position. It did not have to pay for original 3G spectrum, and channelled funds into device and infrastructure development instead. It also operates in an affluent and gadget-centric market that has at times been willing to spend $500-600 on mass-market handsets. It has close ties with a number of Japanese vendors, with whom it spends large amounts on infrastructure and joint R&D. And its early pragmatism with web and software developers (in terms of revenue-share) has largely kept the ecosystem “on-side”, compared with other markets in which a mass of disgruntled application providers have eagerly jumped on off-portal and “open” OS platforms, to the detriment of operators.

In its financial year to March 2009, DoCoMo had a total R&D spend of 100 billion Yen (approximately $1bn). While this is split across both basic research and various initiatives around networks and services, it also has a dedicated “device development” centre. This compares with R&D spending by Vodafone Group in the same period of £280m, or about $450m, while mid-size global mobile group Telenor spent just NOK1.1bn ($180m) in calendar year 2008. For comparison, Apple’s current annualised R&D spend is around $1.6bn, and Google’s is $3.2bn – while Nokia’s was over $8bn in 2009, albeit spread across a much larger number of products, as well as its share in NSN. Even smaller device players such as Sony Ericsson spend over $1bn per year.
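The dollar figures quoted above are easier to compare side by side. A minimal sketch that simply sorts them (values in $m, rough annualised conversions as given in the text, not like-for-like fiscal years):

```python
# Approximate annual R&D spend in US$ millions, as quoted in the text.
rd_spend_usd_m = {
    "Telenor (2008)": 180,
    "Vodafone Group (FY to Mar 2009)": 450,
    "NTT DoCoMo (FY to Mar 2009)": 1000,
    "Sony Ericsson": 1000,   # ">$1bn per year"
    "Apple": 1600,
    "Google": 3200,
    "Nokia (2009)": 8000,    # includes its NSN share; far more products
}

for name, usd_m in sorted(rd_spend_usd_m.items(), key=lambda kv: kv[1]):
    print(f"{name:<32} ${usd_m:>5,}m")
```

The ordering makes the point in the text plain: DoCoMo out-spends comparable operators several times over, yet is still dwarfed by the dedicated device and platform players.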

Although DoCoMo is best known for its handset software involvement – i-Mode, Symbian, LiMo, MOAP and so forth – it also conducts a significant amount of work on more hardcore technology platform development. Between 2005 and 2007, for example, it invested 12.5 billion Yen ($125m) in chipset design for its 3G phones.

It has huge leverage with Japanese handset manufacturers like NEC and Matsushita, as they have limited international reach. This means that DoCoMo is able to enforce adoption of its preferred technology components – such as single integrated chips that it helps design, rather than multiple more expensive processors.

While various operators are now present in handset-OS organisations such as the LiMo Foundation and the Open Handset Alliance (Android), DoCoMo’s profile in device software has been considerably greater in the past. It is a founder member of Symbian, and drove development of one of the original three Symbian user interfaces (the other two being Nokia’s S60 and the now-defunct UIQ). DoCoMo now earns royalty revenues, in some instances, from manufacturers’ use of its handset software. It also owns a sizeable stake in browser vendor Access, and has invested in other handset software suppliers such as Aplix.

Verizon Open Device Initiative

From the discussion about DoCoMo above, it is clear that for an operator to start creating its own device platform from the bottom up, it will need extremely deep pockets and very close relationships with willing OEMs to use its designs. For individual handsets or a small series of similar devices, it can clearly choose the ODM route, although this risks limiting differentiation to a thin layer of software and a few “off the peg” hardware choices.

Another option is to try to create a fully-fledged hardware ecosystem, putting in place the tools and business frameworks to help innovative manufacturers create a broad set of niche “long tail” devices that conform to a given operator’s specifications. If successful, this enables a given telco to benefit from a set of unique devices that may well come with new business models attached. Clearly, the operator needs to be of sufficient scale to make the volumes worthwhile – and there also needs to be a guarantee of network robustness, channels to market and back-office support.

Verizon’s “Open Device Initiative” is perhaps the highest-profile example of this type of approach, aiming to foster the creation of a wide range of embedded and M2M products. It assists in the certification of modules, and also links in with its partnership with Qualcomm and nPhase in creating an M2M-enabling platform. A critical aspect of its purpose is a huge reduction in certification and testing time for new devices against its network – something which had historically been a time-to-market disaster lasting up to 12 months, clearly unworkable for categories like connected consumer-oriented devices. It has been targeting a 4-week turnaround instead, working with a streamlined process involving multiple labs and testing facilities.

US rival operator AT&T is attempting a similar approach with its M2M partner Jasper Wireless, although Verizon ODI has been more conspicuous to date.

3 / INQ Mobile

Another interesting approach to device creation is that espoused by the Hutchison 3 group. Its parent company, Hutchison Whampoa, set up a separately-branded device subsidiary called INQ Mobile in October 2008. INQ specialises in producing Internet-centric featurephones with tight integration of web services like Skype, Facebook and Twitter on low-cost platforms. Before the launch of INQ, 3 had already produced an earlier product, the SkypePhone, but had not sold that to the outside marketplace.

Priced at around $100, INQ’s devices are strongly aimed at prepaid-centric or low-subsidy markets where users want access to a subset of Internet properties without incurring the costs of a full-blown smartphone. INQ has worked closely with Qualcomm, in particular using its BREW featurephone software stack to enable tight integration of web services with the UI. That said, the company is now switching at least part of its attention to Android in order to create touchscreen-enabled mid-market devices.

3/INQ highlights one of the paradoxes of operator involvement in device creation – while it is clearly desirable to have a differentiated, exclusive device, it is also important to have a target market of sufficient scale to justify the upfront investment in its creation. Setting up a vehicle to sell the resulting phones or other products in geographies outside the parent’s main market footprint is a way to grow the overall volumes, without losing the benefits of exclusivity.
In this sense, although the 3 Group clearly benefits from its association with INQ, it is not specifically part of the operator’s strategy but that of its ultimate holding company. The separate branding also makes good sense. It is also worth noting that 3 is not wholly beholden to INQ for supply of own-brand devices; its current S2x version of its Skypephone is manufactured by ZTE.

BT Fusion

It is also worth discussing one of the less successful device initiatives attempted by operators in recent years. Between 2003 and 2009, BT developed and sold a fixed-mobile converged service called Fusion, which flipped handsets between an ordinary outdoor cellular connection and a local wireless VoIP service when indoors and connected to a BT broadband line.

Intended to reduce the costs associated with then-expensive mobile calls when in range of “free” landline or VoIP connections, it relied on switching to Bluetooth or WiFi voice within range of a suitable hotspot. The consumer and small-business version relied on a technology called UMA (Unlicensed Mobile Access), while a corporate version used SIP. The mobile portion of the service used Vodafone’s network on an MVNO basis.

Recognising that it needed widespread international adoption to gain traction and scale, BT did many things that were “right”. In particular, it supported the creation of the FMCA (Fixed-Mobile Convergence Alliance) and engaged directly with many handset vendors and network suppliers, notably Motorola for devices and Alcatel-Lucent for systems integration. It also ran extensive trials and testing, and participated in various standards-setting fora.
The service never gained significant uptake, largely blamed on falling prices for mobile calls, which eroded the core value proposition. The handset portfolio was also very limited, especially as the technology only supported 2G devices at launch – at just the point when many higher-value customers wanted to move to 3G.

Conversely, lower-end users in the UK generally use prepaid mobile, which did not fit well with BT’s contract-based pricing, oriented around Fusion’s position as an add-on to normal home broadband. In addition, there were significant issues around the user interface, and around the interaction of the UMA technology with other uses of the WiFi radio in which the user did not wish to involve the operator.

BT’s main failure here was its poor focus on what customers wanted from the devices themselves, as well as on certain other aspects of the service wrapper, such as numbering. It was so focused on the network- and service-centric aspects of Fusion (especially “seamless handover” of voice services) that it ignored many of the reasons customers buy mobile phones – a range of device brands and models, the increasing appeal of 3G, battery life, and the latest features such as high-resolution cameras. Towards the end of Fusion’s life, it looked even weaker once the (unsupported) Apple iPhone raised the bar for mass-market adoption of smartphones. It was withdrawn from sale in early 2009.

BT also mis-judged the addressable market size for UMA-enabled phones, which should have made it realise that support for the technology was always going to be an afterthought for the OEMs. It also over-relied upon Motorola for lower-end devices, and supported Windows Mobile for its smartphone variants more for reasons of pragmatism than customer demand.

Lastly, BT appears to have underestimated the length of time it would take to get devices from concept, through development and testing to market. In particular, it takes many years (and a clear economic rationale) for an optional feature to become built-into mobile device platforms as standard – and until that occurs, the subset of devices featuring that capability tends to be smaller, more expensive, and often late-to-market as OEMs focus their best engineers and project resources on more scalable investments.

Perhaps the main takeaway is that telcos’ involvement in complex, technology-led device creation is very risky where the main customer benefit is simply cheaper service, in markets where incumbent providers have scope to cut margins to compete. A corollary lesson is that encouraging device vendors to support new functions that benefit only the operators (and only a small proportion of customers) is tricky unless the telcos are prepared to guarantee better purchase prices or large volumes. This may also help explain the failure, to date, of other phone-based enhancements such as NFC.

The role of the ODM in telco-centric devices

An important group of players in operators’ device strategies are the ODMs (original design manufacturers). Usually based in Asia, particularly Taiwan and Korea, these firms specialise in developing customised “white label” hardware to given specifications, which is then re-branded by better-known vendors. ODMs sit rather higher up the value-add hierarchy than CMs (contract manufacturers), which are essentially factory-outsourcing companies with much less design input.

Historically, the ODMs’ main customers were the device “OEMs” (original equipment manufacturers) – including well-known firms like Motorola, Sony Ericsson and Palm. Even Nokia contracts out some device development and manufacturing, despite its huge supply-chain effectiveness. Almost all laptops are actually manufactured by ODMs – this supply route is not solely about handsets.

Examples of ODMs include firms like Inventec, Wistron, Arima, Compal and Quanta. Others such as HTC, ZTE and Huawei also design and sell own-brand products (ie act as OEMs) as well as manufacturing additional lines for other firms as ODMs.

In a growing number of instances, operators themselves are now contracting directly with ODMs to produce own-brand products for both mobile and fixed marketplaces. This is not especially new in concept – HTC in particular has provided ODM-based Windows Mobile smartphones and PDAs to various operators for many years. The original O2 XDA, T-Mobile MDA and Orange SPV series of smart devices all came via this route.

More recently, the ODM focus has swung firmly behind Android as the best platform, although there are still Microsoft-based products in the background as well. There are also patchy uses of ODMs to supply own-branded featurephones, usually for low-end prepaid segments of the market.

One conspicuous trend has been that the ODMs favoured by operators tend to differ from those favoured by the OEMs. MNOs have tended to work with the more experienced and technically deep ODMs (which often have sizeable own-brand sales as well), perhaps to compensate for their own limitations in areas such as radio and chipset expertise. They also want vendors capable of executing on sophisticated UI and applications requirements. HTC, ZTE and Sagem have made considerable headway in cutting deals with operators, with ZTE in particular able to leverage the growing global footprint associated with its infrastructure sales. Conversely, some of the more “traditional” Taiwanese ODMs, such as Compal and Arima, have struggled to engage with operators to the same degree that they win outsourced design and manufacturing work from OEMs like Motorola and Sony Ericsson.

One of the most interesting recent trends is around new device form-factors, such as web tablets, ebook readers and netbooks/smartbooks. Operators are working with ODMs in the hope of deploying such devices as part of new business models and service propositions – either separate from conventional mobile phone service contracts, or as part of more complex integrated three / four screen converged offers. Again, Android is playing an important role here, especially for products that are Internet-centric such as tablets. Not all such devices are cellular-enabled: some, especially where they are intended for use just within the home, will be WiFi-only, connected via home broadband. Android is important here because of its malleability – it is much easier for operators (and their device partners) to create complete, customised user experiences, as the architecture does not have such a fixed “baseline” of user interface components or applications as Windows. It is also cheaper.

It is nevertheless important to note that ODM-based device strategies are often difficult to turn into new business models, and have various practical complexities in execution. Most ODMs base their products on off-the-shelf “reference designs” from chipset suppliers, alongside standard OSs (hence Android and Windows Mobile) and a fairly thin layer of in-house IPR and design skills. There is often limited differentiation over commodity designs for a given product, except in the case of the few ODMs that have built up strong software expertise over years (notably HTC).

In addition, the “distance” between the parties – in both value-chain position and geography – often makes operator/ODM partnerships difficult to manage. Often, neither side has particularly good skills in RF design, embedded software development, UI design or ecosystem management, which means a range of extra consultants and integrators must be roped into projects. While open OSs like Android provide an off-the-shelf ecosystem to add spice to the offerings, the overall propositions can suffer from a lack of centralised ownership.

It is worth considering that previous operator/ODM collaborations have mainly been successful in two contexts:

  • Early Windows Mobile and Pocket PC devices sold to businesses and later consumers, to compete primarily against Nokia/Symbian and provide support for email, web browsing and a limited set of applications. Since the growth of Apple and BlackBerry, these offerings have looked weak, although ODM Android-based smartphones are restoring the balance somewhat.
  • Low-end commodity handsets, primarily aimed at prepaid customers in markets where phones are sold through operator channels. Typically, these have been aimed at less brand-conscious consumers who might otherwise have bought low-tier Nokia, Samsung or LG handsets.

On the other hand, other operator / ODM tie-ups have been rather less successful. In 2009, a number of operators tried rolling out small handheld MIDs (mobile Internet devices), with lacklustre market impact.

One possibility is that ODMs will start to shift focus away from mobile handsets, and more towards other classes of device such as tablets, femtocells and in-car systems. These are all areas in which there is much less incumbency from the major OEM brands like Apple and Samsung, and where operators may be able to sell completely new classes of device, packaged with services.

Own-brand operator handsets nevertheless remain a “minority sport”: IDC estimates that they accounted for only 1.4% of units shipped in Western Europe in 2008.

Enhancing existing business models

Returning to one of the points made in the introduction, there are two broad methods by which device expertise can enhance operators’ competitive position and foster the creation and growth of extra revenue:

  • Improving the success and profitability of existing services and business models
  • Underpinning the creation of wholly new offerings and propositions

This section of the document considers the former rationale – extending the breadth and depth of current services and brand. Although much of the recent media emphasis (and perceived “sexiness”) around devices is on the creation of new business models and revenue streams, arguably the main benefits of device specialisation for telcos are more prosaic. Deploying or designing the right hardware can reduce ongoing opex, help delay or minimise the need for incremental network capex, improve customer loyalty and directly generate revenue uplift in existing services.

Clearly, it is not new analysis to assert that mobile operators benefit from having an attractive portfolio of devices, in markets where they sell direct to end-users. Exclusive releases of the Apple iPhone clearly drove customer acquisition for operators such as AT&T and O2. Even four years ago, operators which merely gained preferential access to new colours of the iconic Motorola RAZR saw an uplift in subscriber numbers.

But the impact on ongoing business models goes much further than this, for those telcos that have the resources and skill to delve more deeply into the ramifications of device selection. Some examples include:

  • There is a significant difference between devices in terms of return rates from dissatisfied customers – either because of specific faults (crashing, for example) or poor user experience. This can cause significant losses in terms of the financial costs of device supply/subsidy, along with negative impact on customer loyalty.
  • Less serious than outright returns, it is also important to recognise the difference in performance of devices on an ongoing basis. In 2009, Andorra-based research lab Broadband Testing found huge variations between different smartphones in the basics of “being a phone” – some regularly dropped calls under certain circumstances such as 3G-to-2G transitions, for example. Often, users will wrongly associate dropped calls with flaws in the network rather than the handset – thereby generating a negative perception for the telco.
  • Another important aspect of opex relates to handling support calls, which can easily cost $20 per event – and sometimes much more for complex inquiries needing a technical specialist. This becomes much more of an issue for certain products, such as advanced data-capable products, where configuration of network settings, email accounts and VoIP services can be hugely problematic. A single extra technical call, per user per year, can wipe out the telco’s gross margin. Devices which have setup “wizards” or even just clearer menus can reduce the call-centre burden considerably. Even in the fixed world, home gateways or other products designed to work well “out of the box” are essential to avoid profit-sapping support calls (or worse, “truck rolls”). This can mean something as trivial as colour-coding cables and sockets – or as sophisticated as remote device management and diagnostics.
  • Selection of data devices with specific chipsets and radio components can have a measurable impact on network performance. Certain standards and techniques are only implemented in particular semiconductor suppliers’ products, which can use available capacity more efficiently. Used in sufficiently large numbers, the cumulative effect can result in reduced capex on network upgrades. While few carriers have the leverage to force new chip designs into major handset brands’ platforms, the situation could be very different for 3G dongles and internal modules used in PCs, which tend to be much more generic and less brand-driven. UK start-up Icera Semiconductor has been pursuing this type of engagement strategy with network operators such as Japan’s SoftBank.
  • Device accessories can add value to a service provider’s existing offerings, adding loyalty, encouraging contract renewal, and potentially justifying uplift to higher-tier bundles. For home broadband, the provision of capable gateways with good WiFi can differentiate versus alternative ISPs. For those providing VoIP or IPTV, the addition of cordless handsets or PVRs / media servers can add value. In mobile, the provision of car-kits can improve voice usage and revenues significantly.
  • Operators’ choice of devices can impact significantly on ARPU. There is historical evidence that a good SMS client on a mobile phone will drive greater usage and revenue, for example. In the fixed-broadband world, providing gateways with (good) WiFi instead of simple modems has driven multiple users per household – and thus a need for higher-tier services and greater overall perception of value.

Figure 2: Operators need to consider the effects of basic device performance on customer satisfaction and the network

Source: Broadband Testing
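The call-centre economics described above can be made concrete with a toy model. The $20-per-call figure is from the text; the ARPU and gross-margin inputs are hypothetical assumptions chosen purely for illustration:

```python
# Illustrative support-cost model: one extra technical call per user per
# year versus the telco's gross margin on that user.
# The $20-per-call figure comes from the text; ARPU and margin are
# hypothetical assumptions for illustration only.

def annual_gross_margin(monthly_arpu: float, gross_margin_pct: float) -> float:
    """Gross margin earned from one subscriber over a year."""
    return monthly_arpu * 12 * gross_margin_pct

def margin_after_support(monthly_arpu: float, gross_margin_pct: float,
                         calls_per_year: float, cost_per_call: float) -> float:
    """Gross margin left after deducting support-call costs."""
    return (annual_gross_margin(monthly_arpu, gross_margin_pct)
            - calls_per_year * cost_per_call)

# A low-ARPU user: $10/month ARPU, 15% gross margin (hypothetical).
base = annual_gross_margin(10.0, 0.15)
after_one_call = margin_after_support(10.0, 0.15, 1, 20.0)

print(f"annual gross margin: ${base:.2f}")
print(f"after one $20 support call: ${after_one_call:.2f}")
# For this user, a single extra technical call pushes the margin negative.
```

With these (assumed) inputs, one $20 call turns an $18 annual gross margin into a loss, which is the point the text makes about devices that reduce the call-centre burden.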

There are also much simpler ways in which devices can bolster current services’ attractiveness: 2010 and 2011 are likely to see an increasing number of new devices being sold or given away by operators in order to retain existing customers using existing services.

In particular, a new class of WiFi-based web tablets is expected to become quite popular among fixed broadband companies looking to avoid churn or downward pricing pressure, as well as (perhaps) acting as future platforms for new services. Although there are numerous technical platforms for tablets, it seems likely that Android will enable a broad array of inexpensive Asian ODMs to produce competent products, especially as they will not need complex integration of voice telephony or similar features. The growing maturity of web browsers and widgets (for example with HTML5), as well as the flexibility of the Android Market, should give sufficient flexibility for use of these products with most leading-edge web services.

Expect to see plenty of “free” pseudo-iPads given away as inducements to retain customers, or perhaps to upsell them to a higher-tier package. The ability of fixed broadband providers to compete with their mobile peers by providing subsidised devices should not be underestimated. By the same token, mobile operators may choose to give away free or discounted femtocells.

It is also possible for operators’ direct involvement in the device marketplace to lead to lower costs for existing business models. Various groups of operators have collectively acted in partnership to reduce device prices through collective purchasing and negotiation, as well as enabling larger-scale logistics and supply chain operations. In Japan, NTT DoCoMo has conducted a considerable amount of research on chipset integration, with the result of enabling cheaper handset platforms (see case study below).

Operator home gateways

Probably the most visible and successful area for operator-controlled and branded devices has been the home gateway provided by many ADSL operators, as well as their peers’ offerings of cable modems and set-top boxes. While these are usually produced by companies such as Thomson / Technicolor and 2Wire, many operators undertake very substantial customisation of both hardware and software.

Up to a point, these products have acted as service “hubs”, enabling fixed broadband providers to offer a variety of value-added options such as IPTV, VoIP, remote storage and other service offerings. They normally have WiFi (and, sometimes, “community” connectivity such as the BT / FON tie-up) and various ports for PCs and other devices. Some incorporate wireless DECT or WiFi phones. Most are remotely manageable and can support software upgrades, as well as some form of interactivity via the customer’s PC. Given that most home broadband contracts last at least a year – and are rarely churned – the cost can be defrayed relatively easily into the ongoing service costs. 

That is the good side of home gateways. The downside is that they rarely generate additional incremental revenue streams after the initial installation. Users only infrequently visit operators’ portals, or even less often use the in-built management software for the device. They respond with indifference to most forms of marketing after the initial sign-up: anecdotally, telephone sales and direct mail have poor response rates.

Nevertheless, these products still form a centrepiece of many broadband providers’ strategies and competitive differentiation:

  • Most obviously, they are needed to support higher broadband speeds, which remains the key differentiator between telcos selling ADSL or cable connectivity. “Upgradeability” to faster speeds is one of the most likely options to drive aftermarket revenue uplift or induce loyalty via “free” improvements whilst maintaining price against a falling market. In some countries, the ability to support fibre as well as copper is an important form of future-proofing. Potentially, the inclusion of femtocell modules also confers extra upgrade potential.
  • If well-designed, they can prompt selection of a higher-end monthly tariff or bundle at the initial sale, especially where the operator has a range of alternative products. For example, Orange sells its low-end plans with a basic wireless router, while its higher-end offerings use its LiveBox to support value-adds like VoIP, UMA and so forth. BT offers a free DECT handset with its top-end bundle.
  • Gateways can have the ability to reduce operating costs, especially if they have good self-diagnostics and helpdesk software.
  • In some cases, the gateway can stimulate an ecosystem of accessories such as cordless handsets or other add-ons. Orange, once again, uses its LiveBox as a platform for additional “digital home” products such as a networked storage drive, Internet radio and even a smoke detector*. These can either generate additional revenue directly in hardware sales, or by incremental services – or even just greater utilisation of the base offers. In the future, it seems likely that this approach could evolve into a much broader set of services, such as smart-grid electricity monitoring.


(*The Orange France smoke detector service is interesting, in that it comes with two additional options for the user to subscribe to either Orange’s own €2 per month alerting service, or a third-party “upstream” insurance and assistance firm’s more comprehensive offering [Mondial Assistance] at €9 per month)

In the long term, then, the home gateway is potentially a massive asset to operators wishing to pursue two-sided models. It can act as a control point for network QoS, helping differentiate certain end-user ‘consumption’ devices through physical ports or separate WiFi identities. It can store information or provide built-in applications (for example, web caching). This approach could enable a work-around for Net Neutrality, if two-sided upstream partners’ applications are prioritised not over the Internet connection, but by virtue of having some form of local ‘client’ and intelligence in the operator’s broadband box. While this might not work for live TV or real-time gaming, there could certainly be other options allowing more ‘slice and dice’ revenue to be extracted.

It is also much more feasible (net neutrality laws permitting) to offer differentiated QoS or bandwidth guarantees on fixed broadband, when there is a separate hardware device acting as a “demarcation point”, and able to measure and report on real-world conditions and observed connectivity behaviour. This is critical, as it seems likely that “upstream” providers will demand proof that the network actually delivered on the QoS promises.

The bottom line is that operators intending to leverage in-home services need a fully-functional gateway. It is notable that some operators are now backing away from these towards less-functional and cheaper ADSL modems (for example, Telecom Italia’s Alice service), which may reflect a recognition that added-value sales are much more difficult than initially thought.

It is difficult to monetise PCs beyond “pipes”

Despite our general enthusiasm for innovation in gaining revenues from new “upstream” providers, Telco 2.0 believes that the most important two-sided opportunities will involve devices other than PCs. We also feel it is highly unlikely that operators will be able to sell many incremental “retail” services to PC users beyond connectivity. That said, we can envisage some innovation in pricing models, especially for mobile broadband, where factors like prepaid, “occasional” nomadicity and offload may play a part. There may also be some bundling – for example of music services, online storage or hosted anti-virus / anti-spam functions. One other exception may be cloud computing services for small businesses.

Although the popular image of broadband is people on Facebook, running Skype or BitTorrent, or watching YouTube on a laptop, these services are unlikely to support direct ‘slice and dice’ wholesale capacity revenues from the upstream providers. Telco 2.0 believes that in certain cases (e.g. fixed IPTV), Internet or media companies might be prepared to pay an operator extra for improved delivery of content or applications. But there is very little evidence that PC-oriented providers such as YouTube will be prepared to pay “cold hard cash” to broadband providers for supposed “quality of service”. PCs are ideal platforms for alternative approaches – rate adaptation, buffering, or other workarounds – and PC users are comparatively tolerant, being more prone to multi-tasking while downloads occur. However, these companies may still generate advertising revenue-share, telco B2B value-added services (VAS) and API-based revenues in some circumstances – especially via mobile broadband.

That said, for mobile broadband, PCs are really more of a problem than an opportunity, generating upwards of 80% of downstream data traffic for many mobile operators – 99.9% of which goes straight to the Internet, through what is actually quite complex and expensive core-network “machinery”. Offloading PC-based mobile traffic to the Internet via WiFi or femtocell is a highly attractive option – even if it means forgoing a small opportunity for uplift. In most cases, the benefit of increasing the capacity available for smartphones or niche devices, without extra capex on upgrades, far outweighs this downside.
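The offload arithmetic can be sketched numerically. In this minimal illustration, the 80% PC share of downstream traffic is from the text, while the total traffic volume and the fraction of PC traffic that can be offloaded are hypothetical assumptions:

```python
# Back-of-envelope offload model. The ~80% PC share of downstream
# mobile data traffic is from the text; the total volume and the
# offloadable fraction are hypothetical assumptions.

def core_traffic_after_offload(total_gb: float, pc_share: float,
                               offload_fraction: float) -> float:
    """Traffic still carried by the cellular core after offloading a
    fraction of PC-generated traffic to WiFi/femtocells."""
    pc_gb = total_gb * pc_share
    return total_gb - pc_gb * offload_fraction

total = 1000.0   # GB carried per period (hypothetical network total)
remaining = core_traffic_after_offload(total, 0.80, 0.50)

print(f"core traffic after offload: {remaining:.0f} GB "
      f"({remaining / total:.0%} of original)")
# Offloading half of PC traffic removes 40% of total core-network load.
```

Even a partial offload of PC traffic therefore frees a large share of core capacity for smartphones and niche devices, which is the trade-off the paragraph above describes.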

In the fixed world, the data consumption of PCs may eventually look like a red herring, except for the most egregiously demanding users. The real pain (and, perhaps, opportunity) in terms of network costs will increasingly come from other devices connected via broadband, especially those capable of showing long-form HD video, such as large-screen TVs and digital video recorders. Other non-PC devices connected via fixed broadband include game consoles, tablets, smartphones (via WiFi), femtocells, smart meters, healthcare products and so on.

As the following section describes, PC-based applications are generally too difficult to track or charge for on a granular basis, while other supplementary products and associated applications tend to be easier to monitor and bill – and often have value chains and consumer expectations that are more accepting of paid services.

The characteristics which distinguish PCs from other broadband-connected devices include:

  • High-volume traffic. With a few exceptions that can be dealt with via caps or throttling, most PC users struggle to use more than perhaps 30GB/month today on fixed broadband, and 5GB on mobile. This is likely to scale roughly in parallel with overall network capacity, rather than out-accelerate it. Conversely, long-form professional video content has the potential to use many GB straight away, with a clear roadmap to ever-higher traffic loads as pixel densities increase. Clearly, PCs are today often facilitators in video downloads, but relatively few users can be bothered to hook their computers up to a large screen. In the future, there are likely to be more directly Internet-connected TVs, as well as specialist boxes like the Roku;
  • Multiple / alternative accesses. PCs will increasingly be used with different access networks – perhaps ADSL and WiFi at home, 3G mobile broadband while travelling, and paid WiFi hotspots in specific locations. This makes it much more difficult to monetise any individual pipe, as the user (and content/app provider) has relatively simple methods for arbitrage and ‘least cost routing’;
  • Likelihood of obfuscation. PCs are much more likely to be able to work around network policies and restrictions, as they are ideal platforms for new software and are generally much less controlled by the operator or vendor. Conversely, the software in a TV or health monitoring terminal is likely to be static, and certainly less prone to user experimentation. This means that if the network can identify certain traffic flows to/from a TV today, they are unlikely to have changed significantly in a year’s time. Nobody will install a new open-source P2P application on their Panasonic TV, or a VPN client in their blood-pressure monitor. Conversely, PC applications will require a continued game of cat-and-mouse to stay on top of. There is also much less risk of Google, Microsoft or another supplier giving away free encryption / tunnelling / proxying software and hiding all the data from prying DPI eyes;
  • Cost of sale and support. Few telcos are going to want to continually make hundreds of new sales and marketing calls to the newest 'flavour of the month' Web 2.0 companies in the hope of gaining a small amount of wholesale revenue. Conversely, a few 'big names' in other areas offer much more scope for solid partnerships – Netflix, Blockbuster, BBC, Xbox Live, Philips healthcare, Ubiquisys femtocells and so on. A handful of consumer electronics manufacturers and other telcos represents a larger and simpler opportunity than a long tail of PC-oriented web players. Some of the latter's complexity will be reduced by the emergence of intermediary companies, but even with these, operators will almost certainly focus on the big deals;
  • Reverse wholesale threats. The viral adoption and powerful network effects of many PC-based applications mean that operators may be playing with fire if they try to extract wholesale revenues for data capacity. It is very easy for users of a popular site or service (e.g. Facebook) to mobilise against the operator – or even for the service provider to threaten to boycott specific ISPs and suggest that users churn. This is much less likely for individual content-to-person models like TV, where it is easier to assert control from a BSP point of view;
  • Consumer behaviour and expectations. Consumers (and content providers) are used to paying more/differently for video viewed on a TV versus on a PC. Similarly, the value chains for other non-PC services are less mature and are probably easier for fixed BSPs to interpose themselves in, especially while developers and manufacturers are still dealing with ‘best efforts’ Internet access. PC-oriented developers are already good at managing variable connection reliability, so tend to have less incentive to pay for improvements. There are some exceptions here, such as applications which are ‘mission critical’ (e.g. hosted Cloud / SaaS software for businesses, or real time healthcare monitoring), but most PC-based applications and their users are remarkably tolerant of poor connectivity. Conversely, streaming HD video, femtocell traffic and smart metering have some fairly critical requirements in terms of network quality and security, which could be monetised by fixed BSPs;
  • Congestion-aware applications. PC applications (and to a degree those on smartphones) are becoming much better at watching network conditions and adapting to congestion. It is much more difficult for a BSP to charge a content or application provider for transport if the provider can instead invest the money in more adaptive and intelligent software. This is much more likely to occur on higher-end open computing devices with easily-updateable software.
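
The scale gap behind the first bullet can be made concrete with a rough back-of-envelope calculation. This is only a sketch: the 30GB/month PC figure comes from the bullet above, while the 5GB-per-hour rate for streamed HD video is an assumption in line with the connected-TV discussion elsewhere in this report.

```python
def avg_mbps(gigabytes: float, hours: float) -> float:
    """Average sustained throughput in Mbit/s for a data volume
    spread evenly over a period (decimal units: 1 GB = 8,000 Mbit)."""
    return gigabytes * 8_000 / (hours * 3600)

# A heavy PC user at 30 GB/month (~720 hours) vs a TV streaming HD at 5 GB/hour:
pc_rate = avg_mbps(30, 30 * 24)   # ~0.09 Mbit/s sustained
tv_rate = avg_mbps(5, 1)          # ~11 Mbit/s while streaming
```

The two-orders-of-magnitude difference in sustained rate is why long-form video, not general PC use, drives the traffic roadmap.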

Taken as a whole, Telco 2.0 is doubtful that PCs represent a class of device that operators can monetise much beyond connectivity revenues. In the fixed world, we feel that telcos have other, better opportunities and more important threats (around video, tablets and new ecosystems such as smart grids). In the mobile world, we think operators need to consider the cost of servicing PC-based mobile broadband rather than the mostly-mythical new revenue streams – and simply focus on managing or offloading the traffic as easily and cheaply as possible.

PCs are unlikely to disappear – but they should not command an important share of telcos’ limited bandwidth for services innovation.

Devices and new telco business models

The last part of the previous section gave a flavour of how network end-points might contribute to business model innovation, or at least permit the layering-on of incremental services such as the Orange smoke-detector service. It is notable that, in this case, the new proposition is actually a "two box" service, involving a generic telco-controlled unit (the LiveBox gateway) together with a separate device that actually enables and instantiates the new service (the detector itself).

When it comes to generating new device-based operating and revenue models, telcos have two main choices:

  • Developing services around existing multi-purpose devices (principally PCs or smartphones)
  • Developing services around new and mostly single-application devices (Internet TVs, smart meters, healthcare monitors, in-vehicle systems, sensors and so forth).

The home gateway, discussed above, is a bit of a special category, as it is potentially both a “service end-point” in its own right and the hub for extra gadgets hooked into it through WiFi.

The first option – using multi-function devices – has both advantages and disadvantages. The upside is a large existing user base, established manufacturers and scale economies, and well-understood distribution channels. The downside is the diversity of those marketplaces in terms of fragmented platforms and routes to market, huge competition from alternative developers and service providers, an urgent need to avoid disruption to existing revenue streams and experience – and the strategic presence of behemoths such as Apple, Google and Nokia.

Smartphones and PCs are analysed separately later in this document, as each group has very distinct challenges that impinge to only a limited degree on the newer and more fragmented device types.

With new devices there is also a series of important considerations. In theory, many can be deployed in "closed" end-to-end systems with a much greater measure of operator control. Even where they rely on notionally "open" OSes or other platforms, that openness might be exploited by the telco in terms of, say, user interface and internal programming – but not left fully open for the user to add additional applications. (This is perfectly normal in the M2M world – many devices, such as barcode scanners and bank ATMs, have Windows or Linux internals, but these are isolated from user intervention).

However, despite the ability to create completely standalone revenue models, there are still other practical concerns. Certain device types may fit poorly with telcos' back-office systems, especially old and inflexible billing systems. There will also be huge issues around developing dedicated retail and customer-support channels for niche devices, outside the usual mechanisms for selling mobile services or mass-market broadband and telephony. There may also be challenges in dealing with the role of incumbent brands and their existing partnerships.

Devices map onto 4 communications models

Clearly, the device universe driving telecom services is a broad one – dominated in volume terms by mobile phones and smartphones, as well as driven from a data standpoint by PCs. There are also the numerically smaller, but highly important constituencies of fixed phones, servers and corporate PBXs. But increasingly, the landscape looks more fragmented, with ever more devices becoming network-connected and also open to applications and “smartness”. TVs, tablets, sensors, meters, advertising displays, gaming products and so forth – plus newcomers in diverse areas of machine-to-machine and consumer electronics.

Consequently, it is difficult to develop broad-brush strategies that span this diversity, especially given the parallel divergence of business models and demands on the network. To help clarify the space, we have developed a broad mechanism for classifying devices into different "communications models". Although the correlation is not perfect, we feel that there is a good-enough mapping between the ways in which devices communicate, and the ways in which users or ecosystems might be expected to pay for services.

(Note: P2P here refers to devices that are primarily for person-to-person communications, not peer-to-peer in the context of BitTorrent etc. In essence, these devices are “phones” or variants thereof, although they may also have additional “smart” data capabilities).


It is worth pointing out that PCs represent a combination of all of these models. They are discussed separately, in another section – although Telco 2.0 feels that they are much more difficult to monetise beyond connectivity for operators.

Person-to-person communication

The majority of devices connected to telcos’ networks today are primarily intended for person-to-person (also sometimes called peer-to-peer) communications: they are phones, used for calling or texting other phones, both mobile and fixed. Because they have been associated with numbers – and specific people, locations or businesses – the business models have always revolved around subscriptions and continuity.

Telco 2.0 believes that there is limited scope for device innovation here beyond additional smartness – and to a degree, smartphones (like PCs) also could be considered special cases that transcend the categories described here. They are examined below. [Note: this refers to the types of communication application – there are likely to be yet more new ways in which voice and SMS can be used, controlled and monetised even on basic phones through back-end APIs in the network].

Yes, there could be niche products that evolve specifically as "social network devices", and clearly there is also a heritage of products optimised for email and various forms of instant messaging. But these functions are generally integrated into handsets, either operator-controlled or through third-party platforms such as BlackBerry's email and messaging.

A recurring theme among fixed operators for the past 20 years has been the videophone. Despite numerous attempts to design, specify or sell such devices, we have yet to see any rapid uptake, despite widespread use of webcams on PCs. The most recent attempt has been the advent of "screenphones" optimised for web/widget display, with additional video capture and display capabilities that vendors hope will eventually become more widely used. These too have had limited appeal.

Although handsets clearly represent a huge potential opportunity for telcos' two-sided aspirations through voice/SMS APIs, smartphone applications and advertising, it seems unlikely that device innovation will result in totally new classes of product here. As such, operators' peer-to-peer device strategy will likely revolve around better control of smartphones' experience and application suites, along with attempts to bring on new mass-market services for featurephones. This is likely to take the form of various new web/widget frameworks such as the Joint Innovation Labs (JIL) platform, run by Vodafone, Verizon, SoftBank and China Mobile.

Other less-likely handset business models could evolve around new “core” communications modes – although we remain sceptical that the 3GPP- and GSMA-backed Rich Communications Suite will succeed in the fashion of SMS for a huge number of reasons. In particular, any new core P2P mode needs very high penetration levels to be attained before reaching critical mass for uptake – something hard to achieve given the diversity of device platforms, the routes to market, and the existing better-than-RCS capabilities already built into products such as the iPhone and BlackBerry. Adding in a lack of clear business case, poor fit with prepay models and weak links to consumer behaviour and psychology (eg “coolness”), we feel that “silo” optimised solutions developed by operators, device vendors or third parties are much more likely to succeed than lowest-common-denominator “official” standards.

Downloads and streaming

The most visible – and potentially problematic – category of new connected devices comprises those intended as media consumption products. This includes TVs, e-book readers, PVRs, Internet radios, advertising displays and so forth. Clearly, some of these have been connected to telco services in some way before (notably via IPTV), but the recent trend of embedding intelligence (and "raw" direct Internet access) is changing the game further. Although it is also quite flexible, we believe the new Apple iPad is best represented within this category.

There are four main problems here:

  • The suppliers of these devices are often strong consumer electronic brands, with limited experience of engaging with operators at all, let alone permitting them to interfere in hardware or software specification or design. Furthermore, their products generally have significant “offline” usage modes such as terrestrial TV display, over which operators cannot hope to exert influence at all. As such, any telco involvement will likely need to be ring-fenced to new services supported. This also makes it difficult to conceive of many products which could be profitable if confined solely to sales within an individual operator’s customer base.
  • It is unlikely that many of the more expensive items of display and media consumption technology will be supplied directly by operators, or subsidised by them. This makes it very difficult for operators to get their software/UI load into the supply chain, unless there were generic open-Internet downloads available.
  • These devices – especially those which display high-definition video – can consume huge amounts of network resource. A living-room LCD TV can pull down 5GB per hour when connected to the Internet for streamed IPTV – traffic that might not even be watched if the viewer leaves the room. In the mobile domain, dedicated TV technologies have gained limited traction, but streaming music and audio can instead soak up large volumes of 3G bandwidth. There is a risk that as display technology evolves (3D, HD etc.), these products may become even more of a threat to network economics than open PCs.
  • For in-home or in-office usage scenarios, the devices will normally be used "behind" the telco access gateway, and thus sit outside the usual domain of operator influence. This makes operator "control points" less palatable to consumers, and also raises the question of who is responsible for poor in-home connectivity if the devices are operator-controlled.
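
To illustrate the third bullet, a simple sketch of what the cited 5GB-per-hour HD rate implies for monthly volumes; the 3-hours-per-day viewing figure is a hypothetical assumption, not a number from this report.

```python
GB_PER_HOUR_HD = 5  # streamed HD rate cited in the bullet above

def monthly_gb(hours_per_day: float, days: int = 30) -> float:
    """Monthly traffic for a TV streaming at the cited HD rate."""
    return GB_PER_HOUR_HD * hours_per_day * days

tv_month = monthly_gb(3)  # 450 GB/month for a hypothetical 3 hours/day of viewing
```

Even modest daily viewing generates an order of magnitude more traffic than a typical fixed-broadband PC user.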

All that said, there are still important reasons for telcos to become more skilled in this category of devices. Firstly, it is important for them to understand the types of traffic that may be generated – and, possibly, learn how to identify it in the network for prioritisation. There could well be options for two-sided models here – for example, prioritisation or optimisation of HD video for display on living-room TVs, for which there may well be revenue streams to share, as well as user expectations that would not embrace “buffering” of streamed data during congested periods.

Moreover, there is a subset of this class of “display” devices which are much more amenable to entirely new business models beyond connectivity. Mobile devices such as the Apple iPad (or operator-controlled equivalents) could be bundled with content and applications. Non-consumer products such as connected advertising displays could benefit from many telco value-adds: imagine a road-side advert that changed to reflect the real-time mix of drivers in the vicinity, calculated via the operator’s network intelligence.

There are also further positives to this group of products that may offset the problems listed above. Generally, they are much less “open” than PCs and smartphones, and tend to have fixed software and application environments. This predictability makes it much less likely that new usage modes will emerge suddenly, or new work-arounds for network controls be implemented. It also makes “illicit” usage far less probable – few people are going to download a new BitTorrent client to their TV, or run Skype on a digital-advertising display.

Cloud services & control

Probably the most interesting class of new devices are those that are expected to form the centrepiece of emerging “cloud services” business models, or which are centrally-controlled in some way. In both cases, while the bulk of data traffic is downstream, there is an important back-channel from the device back to the network. Possible examples here would be smart meters for next-generation electricity grids, personal healthcare terminals, or “locked” tablets used for delivering operator-managed (or at least, operator-mediated) services into the home.

These devices would typically be layered onto existing broadband service connections in the home (probably linked in via WiFi), or else could have a separate cellular module for wide-area connectivity. While they may have some form of user interface or screen, it is likely that this will not be “watched” in the same sense as a TV or media tablet, instead used for specific interactive tasks.

These types of application have some different network requirements to other devices – most typically, they will require comparatively small volumes of data, but often with extremely high levels of security and reliability, especially for use cases such as healthcare and energy management. Other devices may be less constrained by network quality – perhaps new appliances for the home, such as “family agenda and noticeboard” tablets.

There are numerous attractions here for operators – while these devices are likely to be used for a variety of tasks, their impact on the network in terms of capacity should generally be light. Conversely, the requirements for security should enable a premium to be charged – probably to the “ecosystem owner” such as a public-sector body or a utility. In some cases, there could well be additional associated revenue streams open to the telco alongside connectivity – both direct from end users, and perhaps also from managing delivery to upstream providers.

There is also a significant likelihood that cloud-based services will be based around long-term, subscription-type billing models, as the devices will likely be in regular and ongoing use, and also probably of minimal functionality when disconnected.


Upload-centric devices

A number of new device categories are emerging that are “upload-centric” – using the telco network as a basis for gathering data or content, rather than consuming it. Examples include CCTV cameras, networks of sensors (eg for environmental monitoring), or digital cameras that can upload photos directly.

These are highly interesting in terms of new business models for telcos:

  • Firstly, they are almost all incremental to existing connections rather than substitutional – and thus represent a source of entirely new revenue, even if the operators are just supplying connectivity.
  • Secondly, this class of device is likely to involve new, wider ecosystems, often involving parties that have limited experience and skill in managing networks or devices – camera manufacturers, public-sector authorities operating surveillance or measurement networks and so forth. This provides the opportunity for operators to add significant value in terms of overall management and control, and yields significant scope for two-sided revenues for telcos, or perhaps overall "managed service" provision.
  • Thirdly, it is probable that traditional “subscription” models, as seen in normal telephony services, will be unwieldy or a generally poor fit with this class of device. For example, a digital 3G-uploading camera is likely to be used irregularly and is thus unsuited to regular monthly fees. It may also make sense to price such devices on a customised “per photo” basis, rather than per-MB – and it would probably be desirable to bundle a certain allowance into the upfront device purchase price. Clearly, there is value to be gained by the telco or a specialist service provider like Jasper Wireless here, re-working the billing and charging mechanisms, handling separate roaming deals and so forth.
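
The "per photo" pricing idea in the third bullet can be sketched as a simple tariff: an allowance bundled into the upfront device price, then a flat per-photo fee. This is a hypothetical illustration – the class name, allowance and fee are invented for the example, not drawn from any real operator offer.

```python
from dataclasses import dataclass

@dataclass
class PerPhotoPlan:
    """Hypothetical 'per photo' tariff: an allowance bundled into the
    upfront device price, then a per-photo fee once it is exhausted."""
    bundled_photos: int
    fee_cents: int  # charged per photo beyond the bundle

    def charge_cents(self, photos_uploaded: int) -> int:
        billable = max(0, photos_uploaded - self.bundled_photos)
        return billable * self.fee_cents

plan = PerPhotoPlan(bundled_photos=500, fee_cents=5)
bill = plan.charge_cents(620)  # 120 billable photos -> 600 cents
```

The point is that the unit of charging is an application event (a photo), not megabytes – which is exactly the re-working of billing mechanics the text describes.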

In addition, there is an opportunity to engineer these new business models from the ground up to reflect network usage and load. They are likely to generate fairly predictable traffic – most of it upstream. This may present certain challenges, as most assumptions are for download-centric networks, but the fact that application-specific devices should be “deterministic” should help assuage those problems from a planning point of view. For example, if an operator knows that it has to support a million CCTV cameras, each uploading an average of 3MB per hour from fixed locations, that is relatively straightforward to add into the capacity planning process – certainly much more so than an extra million smartphones using unknown applications at unknown times, while moving around.
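
The CCTV arithmetic above can be worked through explicitly – a back-of-envelope sketch using the figures in the paragraph (decimal units assumed):

```python
def aggregate_gbps(devices: int, mb_per_hour: float) -> float:
    """Average aggregate upstream load in Gbit/s for a fleet of
    deterministic devices, each uploading a fixed volume per hour."""
    bits_per_hour = devices * mb_per_hour * 8e6  # 1 MB = 8e6 bits (decimal)
    return bits_per_hour / 3600 / 1e9

load = aggregate_gbps(1_000_000, 3)  # ~6.7 Gbit/s, steady and plannable
```

A million cameras at 3MB/hour average under 7 Gbit/s of predictable, fixed-location upstream load – straightforward to dimension for, unlike a million smartphones with unknown applications and mobility.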

All that said, it remains unclear whether the total number of device sales and aggregate revenues make this category a truly critical area for telcos. In many cases it is likely to be "nice to have" rather than must-have – and it is certainly not obvious that the current nascent market is large enough to accommodate every operator in a given market attempting to enter the space simultaneously. For a few operators this area may "move the needle" if a few choice deals are struck (e.g. for national environmental monitoring); for others it will take many years, if it happens at all.

One example of this category of product is the remote smoke-detector offered by Orange in France, which is provided as a value-add to its home broadband offer. This has a variety of service models, including one involving a subscription to another upstream provider of monitoring/alerting functions (Mondial Assistance), for which Orange presumably gains a revenue share.

Operators’ influence on smartphones and featurephones

Perhaps the key telco battleground at present is around smartphones. The growth of the iPhone, the entrenched position of BlackBerry, the emergence of Android and the theoretical numeric advantage of Symbian and Nokia are all important aspects of the landscape. They are encouraging data plan uptake by consumers, catalysing the applications ecosystem and – on the downside – fostering rampant bandwidth utilisation and providing ready platforms for Internet behemoths to drive services loyalty at the expense of the telcos.

In principle, smartphones should be excellent platforms for operators launching new services and exploiting alternative business models – advertising, downloadable apps linked to identity or billing services, third-party payments for enhanced connectivity and so forth. Yet until now, with a few exceptions (notably DoCoMo in Japan), there have been very limited new revenue streams on handsets beyond basic voice, messaging, ringtones and flat (or flattish) data plans. BlackBerry's BES and BIS services are the only widely-adopted third-party data services sold outside of bundles by a significant number of operators, although operator billing for their own (or others') appstores holds potential.

This is a general area that Telco 2.0 has covered in various recent research reports, examining the role of Apple, Google, RIM and others. Fixed operators have long known what their mobile peers are now learning – as intelligence increases in the devices at the edge, it becomes far more difficult to control how they are used. And as control ebbs away, it becomes progressively easier for those devices to be used in conjunction with services or software provided by third parties, often competitive or substitutive to the operators’ own-brand offerings.

A full discussion of the smartphone space merits its own strategy report, and thus coverage in this document on the broader device markets is necessarily summarised.

What is less visible is how and where operators can impose themselves in this space from a business model point of view. There is some precedent for operators developing customised versions of smartphone OS software, as well as unique devices (eg Vodafone / LiMo, DoCoMo / Symbian and Linux, or KDDI / Qualcomm BREW). Many have fairly “thin” layers of software to add some branding and favoured applications, over the manufacturer’s underlying OS and UI. Symbian and LiMo have been more accommodating in this regard, compared to Apple and RIM, with Microsoft and Palm somewhere in the middle.

However, in the majority of cases this has not led to sustainable revenue increases or competitive advantage for the operators concerned – not least because there appears to have been a negative correlation with overall usability, especially given links to back-end services like iTunes and the BlackBerry BIS email infrastructure. Where one company has complete control of the “stovepipe”, it is much easier to optimise for complexities such as battery life, manage end-to-end performance criteria such as latency and responsiveness, and be incentivised to ensure that fixing one problem does not lead to unintended consequences elsewhere. In contrast, where operators merely customise a smartphone OS or its applications, they often lack the ability to drill down into the lower levels of the platform where needed.

More recently, Android has seemed to represent a greater opportunity, as its fully open-source architecture enables operators to tinker with the lower layers of the OS if they so desire, although there are endless complexities in creating “good” smartphones outside of telcos’ main competence, such as software integration and device power management. Symbian’s move to openness could also produce a similar result. It is in this segment that operators have the greatest opportunity for business model innovation. We are already seeing moves to operator-controlled application ecosystems, as well as mobile advertising linked to the browser or other functions. That said, early attempts by operators to create own-label social networking services, or “cross-operator” applications, seem to have had limited success.

Further down the chain, it is important not to forget the huge market occupied by smartphones' less-glamorous featurephone brethren. Especially in prepaid-centric markets where subsidy is rare, the majority of customers use lesser devices from the likes of Nokia's Series 40 range, or the huge ranges from Samsung and LG. Worse still for operators, many of these devices are bought "vanilla" from separate retail channels over which they have little control.

While it is theoretically possible for service providers to "push" their UIs and applications down to non-customised handsets in the aftermarket, in reality that rarely happens, as it has huge potential to cause customer dissatisfaction. More generally, some minimal customisation is provided via SIM card applications – although over time this may become slightly more sophisticated.

Realistically, the only way that operators can easily control new business models linked to prepaid mobile phone subscribers is through own-brand phones (see the ODM section below), or via very simple "per day" or "per month" fixed-fee services such as web access or perhaps video.

Overall, operators appear to be fighting a continual "one step forward, two steps back" battle for handset application and UI control. For every new telco-controlled initiative – in-house appstores, customised/locked smartphone OSes, BONDI-type web security, or managed "policy" engines – there is another new source of "control leakage": Apple's device management, Nokia's Ovi client, or even just open OSes and third-party appstores enabling easy download of competing (and often better or free) software apps.

Multi-platform user experience

The rest of this document has talked about devices as standalone products, linked to particular services or business models. But it actually seems fair to assume that many users will be using a variety of platforms, in a variety of contexts, acquired through a myriad of channels.

This suggests that operators have some scope to define and own a new space – “multi-platform experience”. The idea is to compete to get as great an aggregate share of attention and familiarity as possible, tied to the provision of both end-user service fees and, potentially, two-sided offerings that benefit from this extra customer insight and access.

For example, users may wish to view their photos, or access their social networks, via digital cameras, mobile phone(s), PC, tablet, TV, in-car system and various other endpoints. They will want to have similar (but not identical) preferences and modes of behaviour. Yet there will likely be one which is the cornerstone of the overall experience, with the others expected to be reflections of it. This will drive ongoing purchasing behaviour of additional devices and services – Apple has understood this well.

Operators need either to start driving these user experience expectations and preferred interaction patterns – or be prepared to accommodate others'. For example, there now appears to be significant value for many users in ensuring that new technology products are optimised for Facebook. While this may be a blow to operators' hopes of dominating a particular service domain, relinquishing it may be a small price to pay for overall importance in the user's digital lifestyle. A telco providing a tablet with a Grade-A Facebook experience has a portal for introducing the user to other in-house services.


For mobile operators

  • The key element of device strategy remains the selection, testing and sale of handsets – along with basic customisation and obtaining exclusivity where possible. Larger operators – especially those which are in post-paid centric markets – have more flexibility in creating or pushing new device classes and supporting new business models.
  • Mobile operators do not have a distinguished past in creating device UIs, with various failed experiments in on-device portals and application stacks. Consider focusing on control points (eg API security) underneath the apps and browser, rather than branding the direct interface to the user.
  • New classes of mobile device (tablets, in-car devices, M2M) are less risky than smartphones, but are unlikely to “move the dial” in terms of revenues for many years. They will also likely require more complex and customised back-end systems to support new business models. Nonetheless, they can prove fruitful for long-term initiatives and partnerships (eg in healthcare or smart metering).
  • Bridge the gap between RAN and device teams within your organisation, to understand the likely radio impacts of new products – especially if they are for data-hungry applications or ones with unusual traffic patterns such as upstream-heavy. Silicon and RF may be complex and “unsexy”, but they can make a huge difference to overall network opex and capex.
  • While Android appeals because of its ODM-friendliness and flexibility, it remains unproven as an engine for new business models and still has uncertain customer appeal. Do not turn your back on existing device partnerships (RIM, Apple, Nokia etc) until this becomes clearer.
  • Yoda in Star Wars had wise advice: "Do. Or do not. There is no 'try'." Creating devices is expensive, time-consuming and not for the faint-hearted. Uncommitted or under-resourced approaches may end up causing more harm than good. Be prepared to write some large cheques and do it right first time.
  • If you are serious about investing in fully-customised handsets, consider following 3’s path with INQ and sell them to other non-competing operators around the world, to amortise the costs over greater volumes.
  • Examine the potential for raising revenue or customer satisfaction from device-side utilities rather than principal applications. For example, self-care or account-management apps on a smartphone can be very useful, while well-thought-out connection-management clients for mobile broadband PCs are a major determinant of customer loyalty.
  • Another promising domain of device specialism lies in creating enhanced experiences for existing successful applications – for example porting Facebook and Twitter, or particular media properties, to custom software loads on handsets. Done well, this also has the potential to form the basis of a two-sided business model. For example, if an operator pitched a "YouTube-optimised" phone, tied in with end-to-end network policy management and customer data exposure, there could be significant advertising revenue-share opportunities.
  • Mobile operators should generally consider enterprise-grade devices (eg tablets, meters, in-vehicle systems) only in conjunction with specialist partners.

  • De-prioritise initiatives around netbooks and laptops with embedded 3G connectivity. They represent huge loads on the network, are difficult to sell, and are extremely hard to monetise beyond “pipe” revenues.

For fixed & cable operators

  • The core recommendation is to continue focusing on (and enhancing) existing home gateway and set-top box products. These should be viewed as platforms for existing and future services – some of which will be directly monetisable (eg IPTV) while others are more about loyalty and reduction of opex (eg self-care and integrated femtocell modules).
  • Consider the use of relatively inexpensive custom devices (eg WiFi tablets) which are locked to usage via your gateway. Potentially, these could be given for free in exchange for a commitment to longer/renewed contracts or higher service tiers – and may also form the basis of future services provided via appstores or widgets.
  • Work collaboratively with innovative consumer electronics suppliers in areas such as Internet-connected TVs and games consoles. These vendors are potentially interested in end-to-end cloud services – including value-added capabilities from the network operators. They may also be amenable to suggestions on how to create “network-friendly” products, and co-market them with the operator.
  • Some operators may have the customer branding strength and physical distribution channels to sell adjunct products such as storage devices, Internet radios, IPTV remote controls and so forth. There may be additional revenue opportunities from services as well – for example, including a Spotify subscription with a set of external speakers. However, do not underestimate the challenges of overall system integration or customer support.
  • Take a leadership role in pursuing digital home opportunities. There is a narrow window of opportunity in which fixed operators have the upper hand here – over time, it is likely that mobile operators and their device vendors will start to gain more traction. For now, WiFi (and maybe powerline) connections are the in-home network of choice, with the WiFi router provided by a fixed/cable operator being at its centre.
  • A pivotal element of success is ensuring that an adequate customer support and device-management system is in place. Otherwise incremental opex costs may more than offset the benefits from incremental revenue streams.

  • Fixed telcos should look to exploit home networking gateways, femtocells and other CPE, before consumer electronics devices like TVs and hi-fis adopt too many “smarts” and start to work around the carrier core, perhaps accessing YouTube or Facebook directly from the remote control. At present, it is only open devices with a visible, capable and accessible user interface or browser (e.g. PCs and smartphones) that can exploit the wider Internet. Inclusion of improved Internet connectivity and user control in other classes of device will broaden their ability to circumvent operator-hosted services.


Telcos need to face the inevitable – in most cases, they will not be able to control more than a fraction of the total computing and application power of the device universe, especially in mobile or for “contested” general-purpose devices. Even broadband “device specialists” will need to accept that their role cannot diminish the need for some completely “vanilla” network end-points, such as most PCs.

But that does not mean they should give up trying to exert influence or design their own hardware and software where it makes sense – as well as developing services that compete on equal terms with the web, for those devices beyond their direct reach.

They should also ensure that at least as much consideration is given to optimising devices for their current business models as to hoping they can form the basis of innovative offerings.

Some of the most promising new options include:

  • Single-application “locked” mobile devices, perhaps optimised for gaming or utility metering or navigation or similar functions, which have a lot of potential as true “terminals” and the cornerstone of specific business models, albeit used in parallel with users’ other smart devices.
  • Even notionally-open devices like smartphones and tablets can be controlled, especially through application-layer pinch points. Apple is the pre-eminent exponent of this art, controlling the appstore with an iron fist. This is not easy for operators to emulate, but is a very stark benchmark of the possible outcome. Android can help here, but only for those operators prepared to invest sufficient time and money on getting devices right. Another option is to work with firms like RIM, which tend to have more “controllable” OSs and which are operator-friendly.
  • It is far easier for the operator to exert its control at the edge with a standalone, wholly-owned and managed device, than via a software agent on a general computing device like a smartphone or notebook PC. However, it is more difficult and expensive to create and distribute a wholly-owned and branded device in the first place. Few people will buy a Vodafone television, or an AT&T camera – partnerships will be key here.
  • Devices which support web applications only (eg tablets) are somewhat different propositions to those which can also support “native” applications. Operators are more likely to find the “security model” for a browser cheaper and easier to manage than a full, deep OS, affording more fine-grained control over what the user can and cannot do. The downside is that browser-resident apps are generally not as flexible or powerful as native apps.
  • On devices with multiple network interfaces (3G, WiFi, Bluetooth, USB etc) a pivotal control layer is the “connection manager”, which directs traffic through different or multiple paths. In many cases, some of those paths will be outside operator control, allowing “leakage” of application data and thus revenue opportunity.
  • Even where aspects of the device itself lie outside Telcos’ spheres of control, there are still many “exposable” network-side capabilities that could be exploited and offered to application providers, if Telcos’ own integrated offerings are too slow or too expensive. Identity, billing, location, call-control can be provided via APIs to add value to third-party services, while potentially, customer data could be used to help personalise services, subject to privacy constraints. However, carriers need to push hard and fast, before these are disintermediated as well. Google’s clever mapping and location capabilities should be seen as a warning sign that there will be substitutes available that do not rely on the telcos.
  • We may also see ‘comes with data’ products offered by Telcos themselves, with their own product teams acting as a sort of internal upstream customer. If Dell or Apple or Sony can sell a product with connectivity bundled into the upfront price, but no ongoing contract, why not the operators themselves?

The other side to device specialists is the potential for them to become buyers rather than sellers of two-sided services. If Operator X has a particularly good UI or application capability, then (if commercial arrangements permit), it could exploit Operator Y’s willingness to offer managed QoS or other capabilities. This is most likely to happen where the two Telcos don’t compete in a given market – or if one is fixed and the other mobile. Our managed offload use case in the recent Broadband report envisages a situation in which a fixed ‘device specialist’ uses a WiFi or femto-enabled gateway to assist a mobile broadband provider in removing traffic from the macro network.

In addition to these, there are numerous device-related “hygiene factors” that can improve operators’ bottom line, through reducing capex/opex costs, or improving customer acquisition and ongoing revenue streams. Improved testing and specification to reduce customer support needs, minimise impact on networks and guarantee good performance are all examples. For example, RIM’s BlackBerry devices are often seen as being particularly network-friendly, as are some 3G modems featuring advanced radio receiver technology.

Overall, the battle for control of the edge is multi-dimensional, and outcomes are highly uncertain, particularly given the economy and wide national variations in areas like device subsidy and brand preference. But Telcos need to focus on winnable battles – and exploit Moore’s Law rather than beat against it with futility.

Figure 3: Both hardware and software/UI provide grounds for telco differentiation


Full Article: Handsets – Demolition Derby

Summary: ‘Hyper-competition’ in the mobile handset market, particularly in ‘smartphones’, will drive growth in 2010, but also emaciate profits for the majority of manufacturers. Predicted winners, losers and other market consequences.

This is a Guest Note from Arete Research, a Telco 2.0™ partner specialising in investment analysis. Arete Members can download a PDF of this Note here.

The views in this article are not intended to constitute investment advice from Telco 2.0™ or STL Partners. We are reprinting Arete’s Analysis to give our customers some additional insight into how some Investors see the Telecoms Market.

Handsets: Demolition Derby


Arete’s last annual look at global handset markets (Handsets: Wipe-Out!, Oct. ’08) predicted every vendor would see margins fall by ~500bps. This happened: overall industry profitability dropped, as did industry sales. Now everyone is revving their engines with vastly improved product portfolios for 2010. Even with 15% unit and sales growth in ’10, we see the industry entering a phase of desperate “hyper-competition.” Smartphone vendors (Apple, RIMM, Palm, HTC) should grab $15bn of the $23bn increase in industry sales.

Longer term, the handset space is evolving into a split between partly commoditised hardware and high-margin software and services. Managements face a classic moral hazard problem, incentivised to gain share rather than preserve capital. Each vendor sees 2010 as “their year.” Individually rational strategies are collectively insane: the question is who has deep enough pockets to keep their vehicles in one piece.

Revving the Engines. Every vendor is making huge technology leaps in 2010: high-end devices will have 64/128GB of NAND, 8-12Mpx cameras, OLED nHD capacitive touch displays, and more features than consumers can use. Smartphones should rise 50% to 304m units (while feature phones drop 21% in units). As chipmakers support sub-$200 complete device solutions, we see a race to the bottom in smartphone pricing.

Software Smash-Up. The rush of OEMs into Android will bring differentiation issues (as Symbian faced). Beyond Apple, every software platform faces serious issues, while operators will use “open” platforms to develop their own UIs (360, OPhone, myFaves, etc.). Rising software costs will force some OEMs to adopt a PC-ODM business model, while higher-margin models of RIM and Nokia are most at risk.

Finally, the Asian Invasion. Samsung, HTC and LGE now have 30% ’09E share, with ZTE, Huawei, MTEK customers and PC ODMs all joining the fray. All seek 20%+ growth. Motorola and SonyEricsson are being forced to shrink footprint, and shift risk to ODM partners. Nokia already has an Asian cost base, but lacks new high-end devices outside its emerging markets franchise. Apple looks set to claim 40% of industry profits in ’10, as other OEMs fight a brutal war of attrition, egged on by buoyant demand for fresh products at record low prices.


Forget Defensive Driving

Our thesis for 2010 is as follows: unit volumes will rebound with 15% growth, with highly competitive pricing to keep volumes flowing. This will be driven by highly attractive devices at previously unimaginably low prices. Industry sales will also rise 15%, by $23bn, but half of the extra sales ($11bn) will be taken by Apple. Industry margins will remain under pressure from pricing and rising BoM costs. Every traditional OEM, smartphone pure-play and new entrant is following individually rational strategies: improve portfolios, promise the moon to operators, and price to gain share. Those that fail to secure range-planning slots at leading operators will develop other channels to market. Collectively, the industry is entering a period of desperation and dangerous self-belief. There are few incentives to exercise restraint for the likes of Dell (led by ex-Motorola management), Acer (the consistent PC winner at the low-end), Huawei and ZTE (which view devices as complementary to infrastructure offerings) or Samsung (where rising device units help improve utilisation of its memory and display fabs). Motorola and SonyEricsson must promote themselves actively, just to find sustainable business models on 4% share each.

Table 2 shows industry value; adjusted for the impact of Apple, it shows a continuous 4-5% decline in ASPs (though currencies also play a role). The challenge for mainstream OEMs (Nokia, Samsung, LGE, etc.) is to win back customers now exhibiting high loyalty after switching to iPhone or Blackberry. Excluding gains by Apple and RIM, industry sales are on track to fall 13% in ’09. Apple, RIM, Palm and HTC will collectively account for $15bn of our forecast incremental $23bn in industry sales in ’10E.


Within this base, we see smartphones rising from 162m units in ’08 (13% of the total) to 304m units, or 23% of total ’10E shipments. At the same time, featurephone/mid-range units will drop by 21% in ’09 and 21% again in ’10.
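As a quick sanity check, the total-market sizes implied by these smartphone unit and share figures can be backed out directly. This is a back-of-envelope sketch using only the numbers quoted above; the implied totals are our inference, not figures stated in the report:

```python
# Smartphone units and their share of the total handset market, as quoted.
smart_08, share_08 = 162, 0.13   # '08: 162m units, 13% of total
smart_10, share_10 = 304, 0.23   # '10E: 304m units, 23% of total

# Implied total handset market in each year (millions of units).
total_08 = smart_08 / share_08   # ~1,246m units
total_10 = smart_10 / share_10   # ~1,322m units

print(f"implied '08 total:  {total_08:,.0f}m units")
print(f"implied '10E total: {total_10:,.0f}m units")
```

The implied totals are broadly consistent with the ~170m incremental units forecast for ’10E elsewhere in the note, allowing for rounding in the quoted shares.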

Key Products for 2010

  • Both SonyEricsson and LGE have innovative Android models coming in 1H10, LG with distinctive designs and gesture input, and a new SonyEricsson UI and messaging method.
  • Nokia’s roadmap features slimmer form factors, but a range of capacitive touch models will not come until 2H10. It will update the popular 6300/6700 series with a S40 touch device in 1H10.
  • Samsung has its usual vast array of products, and its plans for 100m touch models in ’10 underline the extent of its form-factor transition.
  • Motorola’s line-up will focus on operator variants, with a lead device shipping in 2Q10, but a number of operators think Motorola lacks distinctive designs and see little need for Blur.
  • RIM will not change its current form factor approach until 2H10, when it moves to a new software platform to enhance its traditional QWERTY base. It faces commercial challenges around activation and services fees with carrier partners.
  • We expect Apple to reach lower price points and also launch CDMA-based iPhones in ’10.
  • HTC must also reduce its costs to address mid-range prices.
  • Every vendor plans to widen its portfolio with several “hero” models in 2010; if anything the window to hype any single launch is narrowing.

Main Trends

Discussions with a wide range of operators, vendors and chipmakers about 2010 device roadmaps point to an explosion of attractive products – a few trends stand out:

  • Operators are now deeply engaging Chinese vendors. Huawei and ZTE have Android devices coming, while TCL and Taiwanese ODMs offer low-end devices. Chipmakers confirm Android devices will drop under $100 BoM levels by YE10. This will pressure both prices and margins. The value chain is shifting rapidly to more compute-intensive devices, with Qualcomm and others enabling Asian ODMs to be active in new PC segments with smartphone-like features (touch, Adobe Flash, 3G connectivity, etc.) in large-screen form factors, to leverage their LCD base.
  • All devices will become “smartphones.” Samsung and Nokia are opening up APIs for mass market phones. The smartphone tag (vs. dumb ones) will be applied to devices of all sorts, the way we formerly spoke of handsets. By the end of 2010, all devices (except basic pre-paid models) will be customisable with popular applications (e.g., search, social networking, IM, etc.) even if they lack hardware for video content (i.e., memory and codecs) or mapping (GPS chipsets). Open OS devices should rise 50% to 304m units, 23% of the total market.
  • Pure play smartphone vendors (RIMM, HTC, Palm) must transition business models to emulate Apple (i.e., linking devices with services and content). Launching lower-cost versions of popular models (RIMM’s 8520, HTC’s Tattoo, Palm’s Pixi) implicitly recognises how crowded the high-end ($400+) is becoming. This will get worse as Motorola and SonyEricsson seek to re-invent themselves with aspirational models, and Android devices hit mid-range prices in ’10.

Fearless Drivers

We had said before that key purchase criteria (design, features, brand) were reaching parity across OEMs, splitting the market into basic “phones” (voice/camera/radio) and Internet devices. The former has room for two to three scale players: Nokia, Samsung, and a third based on a PC-OEM model using standard offerings (e.g., Qualcomm or MTEK chipsets). LG and ZTE are both seeking this position, from which SonyEricsson and Motorola retreated to focus on Internet devices. This does not mean mobile devices are now commodities, like wheat or steel. The complexity of melding software and hardware in tiny, highly functional packages is not the stuff of commodity markets. But we see a split where a narrow range of standard hardware platforms will accommodate an equally narrow set of software choices. Mediatek is blazing a trail here. Some operators (Vodafone, China Mobile, etc.) aim to follow this model for pre-paid and mid-range featurephones. Preserving software and services value-add for consumers in a market where hardware pricing is fairly transparent is a challenge for all OEMs.

This model is not confined to the low-end: In Wipe Out! we said Motorola (among others) would adopt an HTC/Dell model (integrating standard chipsets/software and cutting R&D). This is happening, with Motorola no longer trying to control its software roadmap, having fully adopted Android. SonyEricsson is following suit, with initial Android devices coming in 1Q10.

Recent management changes make it even more likely SonyEricsson gets absorbed into Sony to integrate with content (as its new marketing campaign presages). Internet devices will become even more fragmented by would-be new entrants in ’10. In addition to Nokia, Apple, RIMM, HTC and Palm, LG and Samsung intend to build a presence in smartphones, as do Huawei, ZTE and PC ODMs. We had expected LGE or Samsung to consider M&A (i.e., buying HTC or Palm) to cement their scale or get a native OS platform. We forecast the shift to Internet devices would bring 27m incremental units from RIM, HTC, and Apple in ’09E. This now looks like it will be 21m units (partly due to weaker HTC sales), a growth of 58% vs. an overall market decline of 6%.

Growth: Steaming Again

After a long string of rises in both units and industry value, the global handset market retreated in ’09. We see risk of a weaker 1H10 mitigated in part by trends in China (3G) and India (competition among new operators). The industry had already scaled up for 10-20%+ growth during the ’05-’08 boom; most vendors have highly outsourced business models and/or partly idle capacity, meaning they could produce additional units relatively quickly. Paradoxically, 15% unit and sales growth will further encourage aggressive efforts to gain share.

Our regional forecasts are in Table 3. Emerging markets are two-thirds of volumes in ’09E and ’10E, and will lead growth – at ever lower price points – as they adopt 3G. Market dynamics vary sharply between highly-subsidised, contract-led markets (i.e., the US, Japan/Korea, and W. Europe) and pre-paid-led emerging markets (China, India, E. Europe, MEA and LatAm). In the former, operators are driving smartphone adoption; while price erosion helps limit subsidy budgets, we see growth in handset market value. As Table 4 shows, mobile data handsets hit 10%+ of EU operator sales, but are not yet driving operators’ sales growth.



In emerging markets, the growth in value is led by further volume increases for LCHs. In ’05, we saw an inflection point around Low-Cost Handsets: Every Penny Counts (July ’05) and A Billion Handsets in ’07? (Aug. ’05). Since ’05, there were 1.2bn handsets shipped in China and India alone. LCH chipsets now sell for <$5, with only Infineon and Mediatek actively supplying meaningful volumes. The ongoing mix shift to emerging markets and weak sales of mid-range devices in developed markets were behind the 13% decline in industry value in ’09E, excluding Apple’s sales. Of the extra 170m units we see shipping in ’10E, 105m come from emerging markets, with ~50m sold in China and India.

Costs: Relentless Slamming

In Wipe Out!, Arete laid out four areas where costs might rise in ’09 and beyond, as the source of structural pressure on industry margins. None of these costs are easing or receding. First, the chipset market is increasingly concentrating. TI is exiting, ST-Ericsson continues to lose money, Infineon recovered but still lacks scale in 3G, and Mediatek dominates outside the top five OEMs. This leaves Qualcomm in a de facto leadership position in 3G. This structure does not support meaningful cost reduction for OEMs. Intel may seek an entry to disrupt the market (see Qualcomm v Intel, Fight of the Century, Sept. ’09) but this is unlikely to happen until ’11. Memory may be in short supply in ’10, while high-end OLED displays still face shortages. Capacity cuts and losses at smaller component suppliers in ’09 limit how much OEMs can save. Outsourced manufacturers like Foxconn, Compal, Jabil, BYD, and Flextronics have low margins and poor cash flow. OEMs want to transfer more risk to suppliers that have little room to cut further.

Second, feature creep also thwarts cost reduction efforts: packing more into every phone is needed to stimulate demand, but adds cost. There are rising requirements in the mid-range, going from 2Mpx to 3.2/5Mpx camera modules, and adding touch, more memory, and multi-radio chipsets (3G, WiFi, BT, FM, etc.). Samsung already offers a 2Mpx touchscreen 2G phone for <$100 on pre-paid tariffs.

Third, software remains the fastest-rising element of handset costs. In Mobile Software Home Truths (Sept. ’09), we discussed how software was adding costs, and how many OEMs were struggling to realise value from software investments. Adopting “licence-free” or open source software does not necessarily reduce these costs: it must still be managed within industrial processes. Yet saving licence costs will be the argument used by OEMs forced to limit the number of platforms they support, as Samsung did by recently indicating it would abandon Symbian. We understand WinMo efforts have been largely mothballed at Motorola and SonyEricsson, even as LG is increasing its spend around Microsoft. Costs are also rising for the integration of services. Overall, software costs are not falling; vendors are just shifting them from the handset bill-of-materials (BoM) to other companies’ R&D budgets.

Finally, marketing costs are also rising. Vendors must provide $10m-50m per market of above-the-line marketing support and in-store promotions, to get operators to feature “hero” products. Services adds costs for integration and (often-overlooked) indirect product costs (testing, warranty, logistics, price protection in the channel). SG&A must rise to educate users about new services. OEMs cannot retain or win customers in a mature market without more marketing.

The case for services remains simple and compelling: Nokia’s 33% gross margin on €65 ASPs yields €22 gross profit per device, or €1/month over a two-year lifetime. This is the only way to offset further pressure on device profits. The drive to launch Services is another cost OEMs must bear, with a longer payback than that of 12-18 month design cycles for devices.
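The per-device arithmetic behind that claim is easy to verify. A minimal sketch, using only the ASP, margin and lifetime figures quoted above; the “€22” and “€1/month” figures are the report’s roundings:

```python
# Rough check of the per-device services economics quoted in the text.
asp_eur = 65.0        # Nokia average selling price per device
gross_margin = 0.33   # Nokia gross margin

gross_profit = asp_eur * gross_margin   # ~EUR 21.45, rounded to EUR 22
per_month = gross_profit / 24           # two-year lifetime, ~EUR 0.89/month

print(f"gross profit per device: EUR {gross_profit:.2f}")
print(f"equivalent monthly:      EUR {per_month:.2f}")
```

The point of the comparison is that even a successful services business only needs to clear roughly a euro per user per month to match the entire hardware gross profit of a mid-priced device.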

Margins: Beyond Fender Benders

When Motorola has lost $4bn since ’07 and SonyEricsson may lose as much as €1bn in ’09, we are no longer talking about minor dents. Gross margins for both are already low (sub-20%). The most notable feature of the past few years was how exposed some vendors were when extensions of hit products (or product families) fell flat. SonyEricsson went from 13% 4Q07 margins to breakeven by 2Q08, and RIM saw group gross margins drop 1000bps. Only Nokia (at 33%), RIM, Apple and HTC have gross margins above 30%. Few OEMs managed to raise gross margins after seeing them decline, though we see SonyEricsson and Motorola seeking to do so by vastly reducing their scope of activities.

Having an Asian low-cost base is a necessary but not sufficient condition of survival. Nokia is already the largest Asian producer, with the industry’s two largest plants (in China and India) giving it the lowest cost structure (i.e., the lowest ASPs, but consistently among the highest margins). Few OEMs other than Nokia make money selling LCHs (i.e., sub-€30). Nokia made ~60% of industry profits in ’08, but will be surpassed in profits in ’09 by Apple, which should make 40% of industry profits in ’10, while Nokia has 25%. It is also worth noting that we forecast margins to fall at nearly every vendor in ’10, though Motorola and SonyEricsson must end large losses, and Nokia will benefit from IPR income within its Devices margin.


Software: Mutual Destruction?

The mobile industry is rapidly adopting the IT industry’s software as a service (SaaS) model. The handset is becoming a distribution platform for services and content; vendors aim to monetise a “community” of their device users. Yet for all the attention it gets, software is a means to an end, and not part of the product. Beyond RIM and Apple, only Nokia can afford its own smartphone platform R&D (i.e., Symbian), yet we see Nokia itself moving closer to Microsoft. Money alone cannot solve software or services issues; if so, Nokia’s industry-leading €3bn R&D budget would have yielded more success, while Apple would not have grabbed as much profit share with a $1.3bn group-wide R&D budget.

No vendor yet excels at ease-of-use for multiple applications (voice, SMS, music, video, browsing, navigation, etc.). RIM offers best-in-class messaging, but falls short in other use cases. The iPhone’s Web experience allowed it to overcome shortcomings in multi-threading and voice/text. Samsung has few services to accompany its sleek designs or high-spec displays and cameras. Just going to 70-100m touch-screen devices in ’10 will not resolve ease-of-use issues.

A number of vendors risk getting addicted to “free” software platforms where others reap the benefits (e.g., Android). Few OEMs have embraced regular updates of components (media players, browser plug-ins, etc.) to meet changing requirements. This is Apple’s edge (and in theory Microsoft’s, but it has not managed handset software efficiently). The current slowdown will only hasten moves to abstraction of hardware and software, long the case in PCs. What is the point of OEMs having their own “developer programmes” (e.g., MOTODEV, Samsung Mobile Innovation, SonyEricsson Developer World, etc.) if they adopt Android? To escape high software costs, some vendors are adopting a PC-OEM model: sub-20% gross margins, 1-5% R&D/sales, with little control over how services are implemented on devices.

When the Dust Settles…

After turmoil and consolidation in ’06, industry margins were robust in ’07, then plunged in ’08. Yet a hoped-for recovery in ’09 has given heart to a range of weaker players, sealing the industry’s fate.

Even with a resumption of growth, rising costs and hyper-competition look set to put pressure on margins. The precipitous impact of this may not be seen until 2011; for now, managements are not inclined to call it quits, or admit they lack a services or software play. The handset market has hardly gone ex-growth, but its rules and value chain are shifting, as seen in Apple and Google staking their claims.

The market looks to be falling less than the $11bn we forecast for ’09 (“only” $9bn), but it is Apple’s incremental sales that are changing the dynamics most. We are no fans of M&A, but would welcome moves to remove industry capacity. There are few obvious options, beyond HTC and Palm. We also think Samsung and LGE would benefit from deals that might open up their insular corporate cultures. Nokia has shown how difficult it is for an OEM to assemble a portfolio of Services offerings: none are yet best-in-class. Our verdicts on the key questions for vendors are listed in the following table. We see room for two to three scale players in LCHs/feature-phones (Nokia, Samsung and one other following a PC-OEM model). Smartphones will grow even more fragmented and hotly contested. We are not certain whether the others – SonyEricsson, LGE, Motorola, ZTE, HTC, and Japanese vendors – will emerge from 2010 in one piece.


Richard Kramer, Analyst
Arete Research Services LLP / +44 (0)20 7959 1303

Brett Simpson, Analyst
Arete Research Services LLP / +44 (0)20 7959 1320


Regulation AC – The research analyst(s) whose name(s) appear(s) above certify that: all of the views expressed in this report accurately reflect their personal views about the subject company or companies and its or their securities, and that no part of their compensation was, is, or will be, directly or indirectly, related to the specific recommendations or views expressed in this report.

Required Disclosures

For important disclosure information regarding the companies in this report, please call +44 (0)207 959 1300, or send an email to

Primary Analyst(s) Coverage Group: Alcatel-Lucent, Cisco, Ericsson, HTC, Laird, Motorola, Nokia, Palm, RIM, Starent.

Rating System: Long (L), Positive (+ve), Neutral (N), Negative (-ve), and Short (S) – Analysts recommend stocks as Long or Short for inclusion in Arete Best Ideas, a monthly publication consisting of the firm’s highest conviction recommendations.  Being assigned a Long or Short rating is determined by a stock’s absolute return potential, related investment risks and other factors which may include share liquidity, debt refinancing, estimate risk, economic outlook of principal countries of operation, or other company or industry considerations.  Any stock not assigned a Long or Short rating for inclusion in Arete Best Ideas, may be rated Positive or Negative indicating a directional preference relative to the absolute return potential of the analyst’s coverage group.  Any stock not assigned a Long, Short, Positive or Negative rating is deemed to be Neutral.  A stock’s absolute return potential represents the difference between the current stock price and the target price over a period as defined by the analyst.

Distribution of Ratings – As of 15 October 2009, 10.8% of stocks covered were rated Long, 6.8% Positive, 25.7% Short, 10.8% Negative  and 45.9% deemed Neutral.

Global Research Disclosures – This globally branded report has been prepared by analysts associated with Arete Research Services LLP (“Arete LLP”) and/or Arete Research, LLC (“Arete LLC”), as indicated on the cover page hereof.  This report has been approved for publication and is distributed in the United Kingdom and Europe by Arete LLP (Registered Number: OC303210, Registered Office: Fairfax House, 15 Fulwood Place, London WC1V 6AY), which is authorized and regulated by the UK Financial Services Authority (“FSA”), and in the United States by Arete LLC (3 PO Square, Boston, MA 02109), a wholly owned subsidiary of Arete LLP, registered as a broker-dealer with the Financial Industry Regulatory Authority (“FINRA”).  Additional information is available upon request.  Reports are prepared using sources believed to be wholly reliable and accurate but which cannot be warranted as to accuracy or completeness.  Opinions held are subject to change without prior notice.  No Arete director, employee or representative accepts liability for any loss arising from the use of any advice provided.  Please see for details of any interests held by Arete representatives in securities discussed and for our conflicts of interest policy.

U.S. Disclosures – Arete provides investment research and related services to institutional clients around the world.  Arete receives no compensation from, and purchases no equity securities in, the companies its analysts cover, conducts no investment banking, market-making or proprietary trading, derives no compensation from these activities and will not engage in these activities or receive compensation for these activities in the future.  Arete restricts the distribution of its investment research and related services to approved institutions only.  Analysts associated with Arete LLP are not registered as research analysts with FINRA.  Additionally, these analysts may not be associated persons of Arete LLC and therefore may not be subject to Rule 2711 restrictions on communications with a subject company, public appearances and trading securities held by a research analyst account.

Section 28(e) Safe Harbor – Arete LLC has entered into commission sharing agreements with a number of broker-dealers pursuant to which Arete LLC is involved in “effecting” trades on behalf of its clients by agreeing with the other broker-dealer that Arete LLC will monitor and respond to customer comments concerning the trading process, which is one of the four minimum functions listed by the Securities and Exchange Commission in its latest guidance on client commission practices under Section 28(e).  Arete LLC encourages its clients to contact Anthony W. Graziano, III (+1 617 357 4800) with any comments or concerns they may have concerning the trading process.

General Disclosures – This research is not an offer to sell or the solicitation of an offer to buy any security.  It does not constitute a personal recommendation or take into account the particular investment objectives, financial situation, or needs of individual clients.  Clients should consider whether any advice or recommendation in this research is suitable for their particular circumstances and, if appropriate, seek professional advice.  The price and value of the investments referred to in this research and the income from them may fluctuate.  Past performance is not a guide to future performance, future returns are not guaranteed, and a loss of original capital may occur.  Fluctuations in exchange rates could have adverse effects on the value or price of, or income derived from, certain instruments.

© 2009.  All rights reserved.  No part of this report may be reproduced or distributed in any manner without Arete’s written permission.  Arete specifically prohibits the re-distribution of this report and accepts no liability for the actions of third parties in this respect.