Telco Cloud: Why it hasn’t delivered, and what must change for 5G

Related Webinar – 5G Telco Clouds: Where we are and where we are headed

This research report will be expanded upon in our upcoming webinar, 5G Telco Clouds: Where we are and where we are headed. In this webinar we will argue that 5G will only pay off if telcos find a way to make telco clouds work. We will address the following key questions:

  • Why have telcos struggled to realise the telco cloud promise?
  • What do telcos need to do to unlock the key benefits?
  • Why is now the time for telcos to try again?

Join us on April 8th 16:00 – 17:00 GMT by using this registration link.

Telco cloud: big promises, undelivered

A network running in the cloud

Back in the early 2010s, the idea that a telecoms operator could run its network in the cloud was earth-shattering. Telecoms networks were complicated and highly bespoke, and therefore expensive to build and operate. What if we could find a way to run networks on common, shared resources – like the cloud computing companies do with IT applications? This would be beneficial in a whole host of ways, mostly related to flexibility and efficiency. The industry was sold.

In 2012, ETSI started the ball rolling when it unveiled the Network Functions Virtualisation (NFV) whitepaper, which borrowed the IT world’s concept of server-virtualisation and gave it a networking spin. Network functions would cease to be tied to dedicated pieces of equipment, and instead would run inside “virtual machines” (VMs) hosted on generic computing equipment. In essence, network functions would become software apps, known as virtual network functions (VNFs).

Because the software (the VNF) is not tied to hardware, operators would have much more flexibility over how their network is deployed. As long as we figure out a suitable way to control and configure the apps, we should be able to scale deployments up and down to meet requirements at a given time. And as long as we have enough high-volume servers, switches and storage devices connected together, it’s as simple as spinning up a new instance of the VNF – much simpler than before, when we needed to procure and deploy dedicated pieces of equipment with hefty price tags attached.
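The scale-out logic described above can be sketched as a toy reconciliation loop. This is purely illustrative: the class, thresholds and capacity figures are invented for the example and do not correspond to any real MANO or orchestration API.

```python
import math

# Hypothetical sketch of VNF elastic scaling: an orchestrator adjusts
# the number of VM instances so utilisation stays under ~80%.
class VnfDeployment:
    def __init__(self, name, capacity_per_instance):
        self.name = name
        self.capacity = capacity_per_instance  # e.g. sessions per VM
        self.instances = 1

    def reconcile(self, current_load):
        """Return the scaling action taken and the new instance count."""
        target = max(1, math.ceil(current_load / (self.capacity * 0.8)))
        if target > self.instances:
            action = "scale-out"
        elif target < self.instances:
            action = "scale-in"
        else:
            action = "steady"
        self.instances = target
        return action, self.instances


epc = VnfDeployment("virtual-EPC", capacity_per_instance=10_000)
print(epc.reconcile(35_000))  # demand spike: more instances spun up
print(epc.reconcile(8_000))   # demand drops: instances released
```

The contrast with the hardware world is the point: here "procurement" is a counter increment, where previously it meant buying and installing a dedicated box.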

An additional benefit of moving to a software model is that operators have a far greater degree of control than before over where network functions physically reside. NFV infrastructure can directly replace old-school networking equipment in the operator’s central offices and points of presence, but the software can in theory run anywhere – in the operator’s private centralised data centre, in a datacentre managed by someone else, or even in a public hyperscale cloud. With a bit of re-engineering, it would be possible to distribute resources throughout a network, perhaps placing traffic-intensive user functions in a hub closer to the user, so that less traffic needs to go back and forth to the central control point. The key is that operators are free to choose, and shift workloads around, dependent on what they need to achieve.
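The placement freedom described above can be illustrated with a minimal selection function: given a set of candidate sites, place a traffic-heavy user-plane function at the lowest-latency site that still has spare capacity. All site names, latencies and capacity figures are invented for the sketch.

```python
# Illustrative workload-placement sketch (hypothetical data, not a real
# orchestrator): choose the lowest-latency site with enough headroom.
def place_function(sites, required_gbps):
    candidates = [s for s in sites if s["spare_gbps"] >= required_gbps]
    return min(candidates, key=lambda s: s["user_latency_ms"])["name"]

sites = [
    {"name": "central-dc",   "user_latency_ms": 25, "spare_gbps": 400},
    {"name": "metro-hub",    "user_latency_ms": 8,  "spare_gbps": 40},
    {"name": "public-cloud", "user_latency_ms": 30, "spare_gbps": 1000},
]

print(place_function(sites, required_gbps=20))   # small function lands near users
print(place_function(sites, required_gbps=200))  # big one falls back to central DC
```

The same decision, re-run as traffic patterns change, is what lets an operator shift workloads around rather than being locked to wherever the box was installed.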

The telco cloud promise

Somewhere along the way, we began talking about the telco cloud. This is a term that means many things to many people. At its most basic level, it refers specifically to the data centre resources supporting a carrier-grade telecoms network: hardware and software infrastructure, with NFV as the underlying technology. But over time, the term has also come to be associated with cloud business practices – that is to say, the innovation-focussed business model of successful cloud computing companies.

Figure 2: Telco cloud defined: New technology and new ways of working

Telco cloud: Virtualised & programmable infrastructure together with cloud business practices

Source: STL Partners

In this model, telco infrastructure becomes a flexible technology platform which can be leveraged to enable new ways of working across an operator’s business. Operations become easier to automate. Product development and testing becomes more straightforward – and can happen more quickly than before. With less need for high capital spend on equipment, there is more potential for shorter, success-based funding cycles which promote innovation.

Much has been written about the vast potential of such a telco cloud, by analysts and marketers alike. Indeed, STL Partners has contributed its share. For this reason, we will avoid a thorough investigation here. Instead, we will use a simplified framework covering the four major buckets of value which telco cloud is supposed to help us unlock:

Figure 3: The telco cloud promise: Major buckets of value to be unlocked

Four buckets of value from telco cloud: Openness; Flexibility, visibility & control; Performance at scale; Agile service introduction

Source: STL Partners

These four buckets cover the most commonly-cited expectations of telcos moving to the cloud. Implicit in all of them, to some extent, is a fifth expectation: cost savings, which have been promised as a side-effect. These expectations have their origin in what the analyst and vendor community has promised – and so, in theory, they should be realistic and achievable.

The less-exciting reality

At STL Partners, we track the progress of telco cloud primarily through our NFV Deployment Tracker, a comprehensive database of live deployments of telco cloud technologies (NFV, SDN and beyond) in telecoms networks across the planet. The emphasis is on live rather than those running in testbeds or as proofs of concept, since we believe this is a fairer reflection of how mature the industry really is in this regard.

What we find is that, after a slow start, telcos have really taken to telco cloud, with a surge in deployments since 2017:

Figure 4: Total live deployments of telco cloud technology, 2015-2019
Includes NFVi, VNF, SDN deployments running in live production networks, globally

Telco cloud deployments have risen substantially over the past few years

Source: STL Partners NFV Deployment Tracker

All of the major operator groups around the world are now running telco clouds, as well as a significant long tail of smaller players. As we have explained previously, the primary driving force in that surge has been the move to virtualise mobile core networks in response to data traffic growth, and in preparation for roll-out of 5G networks. To date, most of it is based on NFV: taking existing physical core network functions (components of the Evolved Packet Core or the IP Multimedia Subsystem, in most cases) and running them in virtual machines. No operator has completely decommissioned legacy network infrastructure, but in many cases these deployments are already very ambitious, supporting 50% or more of a mobile operator’s total network traffic.

Yet, despite a surge in deployments, operators we work with are increasingly frustrated by the results. The technology works, but we are a long way from unlocking the value promised in Figure 3. Solutions to date are far from open and vendor-neutral. The ability to monitor, optimise and modify systems is far from ubiquitous. Performance is acceptable, but nothing to write home about, and not yet proven at mass scale. Examples of truly innovative services built on telco cloud platforms are few and far between.

We are continually asked: will telco cloud really deliver? And what needs to change for that to happen?

The problem: flawed approaches to deployment

Learning from those on the front line

The STL Partners hypothesis is that telco cloud, in and of itself, is not the problem. From a theoretical standpoint, there is no reason that virtualised and programmable network and IT infrastructure cannot be a platform for delivering the telco cloud promise. Instead, we believe that the reason it has not yet delivered is linked to how the technology has been deployed, both in terms of the technical architecture, and how the telco has organised itself to operate it.

To test this hypothesis, we conducted primary research with fifteen telecoms operators at different stages in their telco cloud journey. We asked them about their deployments to date, how they have been delivered, the challenges encountered, how successful they have been, and how they see things unfolding in the future.

Our sample includes individuals leading telco cloud deployment at a range of mobile, fixed and converged network operators of all shapes and sizes, and in all regions of the world. Titles vary widely, but include Chief Technology Officers, Heads of Technology Exploration and Chief Network Architects. Our criteria were that individuals needed to be knee-deep in their organisation’s NFV deployments, not just from a strategic standpoint, but also close to the operational complexities of making it happen.

What we found is that most telco cloud deployments to date fall into two categories, driven by the operator’s starting point in making the decision to proceed:

Figure 5: Two starting points for deploying telco cloud

Function-first "we need to virtualise XYZ" vs platform-first "we want to build a cloud platform"

Source: STL Partners

The operators we spoke to were split between these two camps. What we found is that the starting points greatly affect how the technology is deployed. In the coming pages, we will explain both in more detail.

Table of contents

  • Executive Summary
  • Telco cloud: big promises, undelivered
    • A network running in the cloud
    • The telco cloud promise
    • The less-exciting reality
  • The problem: flawed approaches to deployment
    • Learning from those on the front line
    • A function-first approach to telco cloud
    • A platform-first approach to telco cloud
  • The solution: change, collaboration and integration
    • Multi-vendor telco cloud is preferred
    • The internal transformation problem
    • The need to foster collaboration and integration
    • Standards versus blueprints
    • Insufficient management and orchestration solutions
    • Vendor partnerships and pre-integration
  • Conclusions: A better telco cloud is possible, and 5G makes it an urgent priority

Mobile Standards Processes: Do they inhibit business model innovation?

(This is a modified version of a post first published by Dean Bubley on his Disruptive Wireless blog)

An important area for Telco 2.0 strategists to consider is the way that technical standards are created in the communications industry, and the direct and indirect impact this has on future business models.

Either through the deliberate intent of “traditionalists” or through accidental inertia, standards often tend to entrench Telco 1.0 thinking and processes. Going forward, it will be important to influence the way standards (and requirements) are developed, in order to ensure that business model innovation is not “frozen” out of future technological deployments.

Attendance at the recent LTE Summit in Amsterdam stimulated this note, as various presentations and offline discussions highlighted the way that standards bodies (notably 3GPP) operate. Various examples showed risks that could delay ecosystem development, entrenching legacy business models for operators and others.

This is not necessarily intentional: many standards groups are staffed by engineering-type people, who often try to avoid the whole issue of commercial models, whether because they have limited understanding of that side of the industry, limited time, or perhaps worries about regulatory and anti-trust implications. That said, there is also a huge amount of politics involved, ultimately driven by commercial concerns.

Often, this is driven by vendor-centric vested interests looking to promote in-house specialisms and protect future revenue streams, rather than operator-based concerns. Where concerns are operator-based, they tend to reflect “old-school” views from legacy silo business units, rather than the newer “2.0” thinking which is evolving in corporate strategy departments.

The problems arise because certain aspects of technical architecture can act as limiting factors in a market context. Physical SIM cards, for example, need to be distributed physically, which means that a customer has to go to a store, or have them delivered via the post. So what seems like a basic technology-led decision can actually militate against particular business models – for instance, ad-hoc usage of mobile networks – by adding “latency” of hours or days to the customer sign-up process. In this example, there is also an implicit requirement for the distribution of the SIMs, as well as associated production and management costs.

Alternatively, dependencies between otherwise separate sub-systems can cause huge brittleness overall. LTE is being optimised for use with IMS-based core networks and applications. But not all operators want to deploy IMS, even if (in theory) they want LTE – again, restricting business model choices or forcing them towards what is now a non-optimised radio technology. It seems as though 3GPP is trying to use a popular technology (LTE) as a means to crowbar an unpopular one (IMS) into broader adoption.

A specific knock-on impact of this has been the very slow definition of a voice and SMS service for LTE mobile networks – the IMS-based standard called MMtel has been criticised for three years and has achieved almost zero market traction. This now threatens to delay overall deployment of LTE networks – which ironically may mean that vendor politicking and intransigence in the standards processes will ultimately prove counter-productive. It has already drawn pragmatists to work on a non-standard voice-over-LTE technology called VoLGA.

A parallel set of issues is seen in the insistence of many mobile operators on viewing each other only as peers, through the GSMA club and several of its standards initiatives, such as IPX. This reinforces the notion that alternative communications service providers like Skype or Facebook are *not* peers, but deadly enemies, pilloried as “over the top players”. For some operators that competitive stance may be valid, but for others such players might be critical partners or even (whisper it) in a dominant role, with the MNO the junior part of the ecosystem.

Freezing old-fashioned assumptions into standards and architectures, often without even identifying that those assumptions exist, is a recipe for disaster. Numerous other standards processes, such as RCS (Rich Communications Suite), also seem to perpetuate 1.0 models – essentially next-gen walled gardens.

This isn’t to say that standards are bad – but just that there is often no mechanism by which seemingly-sensible technology decisions are double-checked against potential future business models. Having a cycle in which people ask questions like “Will this work with prepay?” or “What’s the wholesale model?” or “What happens if 3 people want to share one ‘account’?” and so forth, would avoid many of these mistakes. You can never account for all eventualities, but you can certainly test for flexibility against quite a range.

Again regarding the LTE event, it is notable that there was not a single mention of the word “MVNO” during the whole conference. Nobody has thought about what an LTE-based MVNO might look like – or whether there might be cool features which could enable such a provider to offer more valuable services, and generate more revenue for the host MNO. One panel of luminaries returned blank stares when asked about implementing open APIs on the radio network, to make it “programmable” for developers or partners. It seems likely that there won’t be latency-optimised virtual mobile networks for gaming any time soon.

Many speakers appeared to view the only mobile broadband business models as traditional contract and prepay mechanisms – there was no talk of sponsored or third-party paid access. No consideration of the importance of Telco 2.0 strategies. No discussion about where in the 4G architecture a content delivery network might interface, and so on.

One option for fixing this problem is via the other industry bodies that don’t set standards themselves, but which can consider use cases and business models a bit more deeply – NGMN, OMTP, Femto Forum and so forth. Many of these groups are pragmatic and much broader in vision than the purely technical standards groups.

Perhaps this is the level to bring in these commercial considerations, so that they can then “suggest” specifications for the standards bodies to work to. They could ask questions like “does this architecture make MVNOs easier or more difficult to create & run?” or “what could be done to the standard to enable richer wholesale propositions?”

So maybe documents which say things like “Future mobile networks MUST be able to support a variety of MVNOs of example types A, B and C” could force the standards groups to refocus their efforts.

In any case, the Telco 2.0 Initiative believes that the future success of business model innovation could be gated by well-meaning but inflexible standards. Strategists and C-level executives must ensure that their organisation’s participants in standards bodies are aligned with a holistic, 2.0 view, not just trying to protect historic silos against the forces of change.

Full Article: Devices 2.0 – Battle for the Edge; Executive Briefing Special


NB A PDF version of this Executive Briefing can be downloaded here.

This special Executive Briefing report summarises the brainstorming output from the Devices 2.0 section of the 6th Telco 2.0 Executive Brainstorm, held on 6-7 May in Nice, France, with over 200 senior participants from across the Telecoms, Media and Technology sectors.

It forms part of our effort to stimulate a structured, ongoing debate within the context of our ‘Telco 2.0’ business model framework.

Each section of the Executive Brainstorm involved short stimulus presentations from leading figures in the industry, group brainstorming using our ‘Mindshare’ interactive technology and method, a panel discussion, and a vote on the best industry strategy for moving forward.

There are six other reports in this post-event series, covering the other sections of the event: Retail Services 2.0, Content Distribution 2.0, Enterprise Services 2.0, Piloting 2.0, Technical Architecture 2.0, and APIs 2.0. In addition there is an overall ‘Executive Summary’ report highlighting the overall messages from the event.

Each report contains:

  • Our independent summary of some of the key points from the stimulus presentations
  • An analysis of the brainstorming output, including a large selection of verbatim comments
  • The ‘next steps’ vote by the participants
  • Our conclusions of the key lessons learnt and our suggestions for industry next steps.

The brainstorm method generated many questions in real-time. Some were covered at the event itself and others we have responded to in each report. In addition we have asked the presenters and other experts to respond to some more specific points. Over the next few weeks we will produce additional ‘Analyst Notes’ with some of these more detailed responses.

NOTE: The presentations referred to in this and other reports, some videos of the presentations themselves, and whole series of post-event reports are available at the event download site.

Access is for event participants only or for subscribers to our Executive Briefing service. If you would like more details on the latter please contact:

Background to this report

The growing profusion of end-user devices creates opportunities and threats for Operators, OTT players, and handset / CPE / operating software companies. Increasing amounts of intelligence and capability are to be found in Netbooks, Smartphones, Set-Top Boxes and routers. What is the consequence of this for Telcos? Are Telcos inevitably going to become dis-intermediated dumb-pipes? Or can operators deploy a device strategy that complements their network capabilities to strengthen their position within the digital value chain?


Brainstorm Topics

Stimulus Presenters and Panellists

  • Anssi Vanjoki, EVP, Nokia
  • Yves Maitre, SVP Devices, Orange Group
  • Alberto Ciarniello, VP Service Innovation, Telecom Italia
  • Rainer Deutschmann, EVP Mobile Internet, T-Mobile International
  • Dean Bubley, CEO, Disruptive Analysis; Associate, Telco 2.0™ Initiative



  • Simon Torrance, CEO, Telco 2.0 Initiative



  • Chris Barraclough, Managing Director, Telco 2.0 Initiative
  • Dean Bubley, Senior Associate, Telco 2.0 Initiative
  • Alex Harrowell, Analyst, Telco 2.0 Initiative


Stimulus Presentation Summaries

Devices 2.0 – Battle for the Edge

Dean Bubley, CEO of Disruptive Analysis and a Senior Associate of the Telco 2.0 Initiative presented on the growing power of devices. Computing power at the edge is rapidly growing; even if the individual devices are still slow, they are speeding up faster than PCs, and there are so many of them. Edge power dwarfs that in the network. Control, responsibility, loyalty, attachment – these will move towards the intelligence, as they always have done. Where is the intelligence? At the edge, as the horde of gadgets gets smarter and smarter.

Soon, he argues, they will be able to tell what the operator is doing – if certain classes of traffic are being favoured or disfavoured, what they are paying for roaming or interconnection, and counteract it through ad-hoc radio networking, P2P, protocol spoofing, and encryption.

Beware! The edge and the cloud could gang up on the smart pipe, Bubley argues. In response, it’s possible to control the device, to control the gateway, or to open up the device but try to maintain a control point somehow. What is certain is that it’s impossible to control everything.

Balancing the home and mobile environments

Yves Maitre, SVP Devices & Mobile Multimedia, Orange Group spoke on the importance of the home environment, which originates with the French experience with Minitel in the 1980s. They hope to integrate this with Orange’s history of mobility. The company is the biggest provider in Europe of VoIP, ADSL, and IPTV; their strategy is based on the Livebox and UMA. 

Netbooks are another “first”. The company was also interested in Maemo, but more as a test than anything else. The device ecosystem remains incredibly complex – so much closed, proprietary stuff is still out there. The industry needs to embrace a small number of strategic open source platforms.

Yves Maitre, SVP Devices & Mobile Multimedia, Orange: “Certain things have enduring relevance – my money, my business, my friends, my health”

It’s hard to say how far to go with the customer; there are serious privacy and security risks.


Converged Services across Three Screens at DTAG

Rainer Deutschmann, EVP Mobile Internet at T-Mobile International said: Our aim is to provide access to digital assets, anywhere, any time, and on any screen, respecting the following principles: simplicity, freedom of choice, virtualisation, openness.

The walled garden was one of the worst things that ever happened; we took the decision to always be open. Digital music sales are about to pass CD sales; we’re looking at the increasing gap between Moore’s and Gilder’s laws. We want increasingly to get rid of storing anything, anywhere. Hence, he says, T-Mobile created its Connected Life and Work product, which is launching this month in Germany.

This provides an integrated multi-device contacts book, e-mail account, and content store; if you have photos on Flickr and other Web services, now you can consolidate them in one place in a very logical fashion. He stressed the importance of mobile widgets for key web services: people pay for them, unlike web pages.

OMTP BONDI and Mobile User Experience 2.0

Alberto Ciarniello, VP Service Innovation, Telecom Italia wanted us to think of a time when it was nearly impossible to move data or applications between PCs. But mobile is like that now! There are lots of opportunities in mobile broadband and mobile applications, he says; there is a sharp drop-off approaching in shipments of basic phones, and for that matter in SMS revenues. And everything will soon have a Web browser.


At the moment, the “what” – the application logic or what you’re trying to achieve – is easy; the “how” – the practical implementation or what you express in software – is much harder. A lifestyle based on devices, he says, can turn into one based on applications; and perhaps to one based on data relationships. Hence, he says, TIM has the notion of “user experience 2.0”. Every application should work on every device. This is, or should be, true of Web/WRT applications in particular. They should be consistent – some people already have 3 or 4 devices, he says. The aim of BONDI is to provide consistent and secure access from Web applications to device and network capabilities. This is an example of successful operator-led change; BONDI and the OMTP’s organisation are designed to represent operators.

Why do we need to end fragmentation? So we can have app stores that don’t make you change your shoes. Therefore, the best approach is through the browser. But this means we need a standard for access to the device’s OS from browserspace.  He offers the example of a click-to-dial e-commerce application, something which traditionally involved big-telco technology.


Security, of course, is a huge issue; without it, BONDI would be giving Web pages access to low-level functionality! He says we need to delegate this to operators, or other security agents, because otherwise the users will struggle to manage all these issues.

BONDI must be open source and free; which will make it the first such thing ever to come from operators. He reminds us that defining the “what”, the application logic, not discussing the “how”, is what really matters; everything must “just work”; porting costs should be zero. Once minimal requirements are satisfied, he says, user experience becomes the ruling factor. So check out TIM’s new dev platform!

How Open do we need to be?

Anssi Vanjöki, EVP New Markets, Nokia believes in one Internet – no fixed or mobile. It’s easy to forecast technology, after all, we make it. But forecasting customers? People? That’s really hard.

In general, he says, future devices will all have good hardware capabilities and native programming support. This has important consequences. According to Nokia R&D’s usability research, 12% of the time an N-Series is switched on is spent making or receiving telephone calls or text messages. The rest is camera, media playback, Web browsing, e-mail, applications.

Most of the network activity involved is over cellular, but all the RF stacks they put in the devices get used. Packet radio of one form or other is now universal. And increasing chunks of total device shipments are meaningfully programmable. Devices will all be networked and programmed.

At the top end this will include 500MHz CPU, 64GB RAM, and more sensors – GPS, accelerometer, RFID/NFC, proximity… Networks will be HSPA+ merging into LTE, but it doesn’t really matter much which of those. So the next evolution of the Web will be highly contextual and semantic, based on the information from these sensors. And devices will be servers as well as clients.

Anssi Vanjöki, EVP New Markets, Nokia: “a lot of devices look a lot like N810s – usability must be the basis of the business model.”

Participant Feedback


The devices section of the event brought together Nokia and three operator representatives, one of whom was speaking on behalf of the Open Mobile Terminal Platforms industry group (OMTP). The session considered the evolving role of both mobile phones and fixed devices like home gateways. It focused on whether or not the Telcos are able to either control their use (for example with 3rd-party Internet applications), or extract extra value from the embedded capabilities and expose these to third parties.

Telco 2.0’s view is that the device space is still poorly understood by many of those tasked with developing next generation business models, many of whom come from a staunchly network-centric background. The shift of computing power and capability towards the “edge” has already been seen in the fixed world with the advent of PCs, and is now happening in mobile with products like the iPhone and 3G dongle modems. This means that the collective power of devices in users’ homes and hands outweighs that of the operator-controlled boxes in the network core.

Operators are faced with a stark choice of either relinquishing control of the edge, attempting instead to monetise “smart pipes”, or trying to reinsert themselves into the device space to ensure greater control and pursue new revenue opportunities by prioritising their own customisations and applications. Some Telcos still seem to feel that network intelligence like DPI can outwit Internet applications running between devices and the web. Others feel that they can offer dedicated devices (fixed or mobile) that are optimised for inhouse services rather than the Internet.

The general feedback from the session highlighted the wariness with which people view devices – and to some extent the relative immaturity of device-level control and business models, versus network-resident platforms and APIs.

Feedback: General (verbatim)

  • Fantastic: 3 MNO’s with 3 different views and Nokia throwing in some disruption! [#6]
  • Why is it that the Telco presentations are so individual, when they will learn and start to learn and work across the industry, all they seem to do is want to own it all? [#7]
  • Fragmented session: device 2.0 is not shared at all. One panellist one opinion…. [#37]
  • The operators leave me cold with their lack of vision and benefit for consumers, they seem to believe they can dominate and force their ways on the consumer, this will not last [#36]
    • Ref 36: I totally agree. They seem to underestimate the power of the user and their ability to get what they want an not what is thrust upon them [#56]
  • Good to hear different views – at the end not the MNOs or device manufacturers will decide but the customers, who in total will be no nerds but just users [#41]
  • The balance within edge and network located intelligence will be set by customer behaviour [#60]
  • How about changing the name of the event to economy 2.0 where the digital consumer is king. The relevance then is how Telco’s can react. But certainly digital consumers (with their gadgets/devices) are king. [#22]
  • Apple have bypassed future value chain for operators and proven the operators could become dumb pipes. How do operators get back into the value chain? [#53]
    • Re 53, the value of the network is the ability to shape experience. The operators need to quickly figure out how they monetise the fact they know who, where, what device and crucially, what access to bandwidth you have at the time you are using a service. If they can’t do that, they become a utility. [#63]
  • re 63, SP’s should avoid being directly involved in any content activities and focus on building a highly dynamic, mass scale transactional networking business, adding experience value to partnered content. [#70]
  • Telcos must open the networks to community… and what about mobile device suppliers? [#25]
  • Will device manufacturer own the value chain, will the operator or will they really work together? [#47]
  • Depends on your view of connectivity. Is it always preferred or even feasible to connect through the internet vs. directly via personal area networks? [#58]
  • Where is the debate on the customer experience and who (and how) owns it between the Telco’s, device suppliers and service/application providers? [#62]


Feedback: T-Mobile/DTAG’s plans

Deutsche Telekom demonstrated its concept of services that run across multiple devices (PC, mobile and TV), intended to help it drive triple-play sales and also compete in the social network / personal portal marketplace. While it was a clear demonstration of multi-device strategy, it was less clear whether it would appeal to users already loyal to services which do not require a tie-in to an access subscription.

  • Interesting plans. [#23]
  • With the different approaches mentioned by DT/TI, aren’t we re-inventing the wheel again? The gentleman from the BBC put it quite eloquently: don’t view it as “build it and they will come”; instead, listen to your customers and build what they want [#11]
  • The T-Mobile service looks interesting, but hasn’t Apple already done this? I can get all my content, e-mail etc. across my iPhone, Apple TV and Mac, seamlessly synced [#16]
  • Re 16: but Apple doesn’t federate other services (except mail) – it’s largely Apple services. This is about aggregating services from other service providers [#33]
  • Why does T-Mobile think it will succeed in the market with yet another UC product? And why do they think users will demand the same interface across all devices? This is not reflective of current practice [#19]
  • Agree on 19 – this looked like a walled-garden approach, despite Rainer saying DT was very opposed to walled gardens. [#29]
  • Re 19: Not my view – current practice doesn’t take the customer into account at all – at least not the mass market. You have to be an expert to make reasonable cross-device usage work, if you succeed at all [#68]
  • T-Mobile: What is the biz model behind Connected Live, and the differentiation versus specialists like GMail, Flickr, Napster, etc.? [#12]
  • How will T-Mobile cover the social network of customers outside their customer base? [#13]
  • Very sexy Rainer. Can you tell us more about the business model? Does the product provide pull through for IPTV, mobile and fixed broadband? or do you charge users a subscription (which will kill it before it takes off)? or is there a two-sided play? [#14]
  • Very interesting model from DT, and very similar to MobileMe from Apple, which has been doing the same for a year now. Will any other SP do the same? [#27]
  • I have 100GB+ of digital assets/content. Will the T-Mobile service provide enough storage in the cloud for all of this? [#28]
  • If T-Mobile is open, why do you block Skype and alike? [#31]
  • Three-screen strategies from operators like T-Mobile and Orange are simply enhanced defensive lock-in strategies; do they really move the open and 2-way business model forward at all? [#59]


Feedback: Nokia

As well as the operator viewpoint on devices, the vendor angle was clearly expounded upon by Nokia’s Anssi Vanjoki, who pointed out the increasing capabilities of mobile phones and other products. Although reference was made to operator-related services, it was also clear that Nokia’s view of future business models did not need to rely on Telco platforms. Several commentators expanded on the implications of this.

  • Nokia at least have a view of the future that is a vision people can buy into, not a vision that seeks to control all [#10]
  • All the intelligence in the mobile phones (Nokia-like) means no intelligence for all!!!! [#40]
  • If Nokia thinks 15% for voice and messaging – does he think subsidy on high end phones will go? [#8]
  • How does Anssi (Nokia) resolve the need to synch between multiple devices (Tera-play)? [#9]
  • If it’s all in the handset, why did Nokia buy Enpocket? [#17]
  • Any idea if Ovi Store will accept PyS60 apps as well as WRT and Objective C? [#39]
  • Mobile Web Server is just a techie toy isn’t it? [#42]
  • Nokia’s view: contextual/real-time awareness. [#43]
  • If Nokia is right, what about the Internet-enabled TVs that are starting to spread and will also have (controlled) access – customers will need device independence! [#54]

Feedback: BONDI/Telecom Italia

Telecom Italia spoke about the OMTP’s BONDI initiative, which involves working with the W3C to develop a new way to run interactive widgets and web applications across phones running different OSs. However, Telco 2.0 believes that this concept (which is alien to a lot of network-centric people) still needs to be explained more clearly and more widely before the industry understands its potential significance. In theory, the ability for an operator to give web-based applications secure access to underlying device capabilities should enable various new business models.
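The core idea – a policy layer sitting between a web application and the device capabilities it wants to use – can be illustrated with a minimal sketch. Note that all names below (`policy`, `capabilities`, `requestFeature`) are invented for illustration; they are not the actual BONDI JavaScript APIs, just a mock of the access-control pattern such frameworks describe.

```javascript
// Hypothetical sketch of policy-mediated device-capability access, in the
// spirit of BONDI-style widget frameworks. Names are illustrative only.

// A simple policy: which capabilities each widget origin may use.
// In a real framework, this would be operator- or user-provisioned.
const policy = {
  "https://widgets.example.com": ["geolocation", "camera"],
  "https://untrusted.example.net": [],
};

// Mocked device capabilities that a real runtime would bridge to hardware.
const capabilities = {
  geolocation: () => ({ lat: 41.9, lon: 12.5 }),
  camera: () => "jpeg-frame",
};

// The mediation layer: grant access only if the policy allows it.
function requestFeature(origin, feature) {
  const allowed = policy[origin] || [];
  if (!allowed.includes(feature)) {
    throw new Error(`Access to '${feature}' denied for ${origin}`);
  }
  return capabilities[feature]();
}

// A trusted widget can read location...
const fix = requestFeature("https://widgets.example.com", "geolocation");
console.log(fix.lat); // prints 41.9

// ...while an untrusted one is refused.
try {
  requestFeature("https://untrusted.example.net", "geolocation");
} catch (e) {
  console.log(e.message);
}
```

The business-model point is that whoever provisions the `policy` table – operator, user or third party – holds a control point over which services reach the device’s capabilities.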

  • The BONDI concept is necessary to eliminate multi-device compatibility issues [#45]
  • BONDI looks like something that limits customer freedom. All limitations on customer freedom will fail quickly! [#50]
  • Is the vision that Bondi becomes an industry standard or a proprietary solution for TI? [#15]
  • Re 15 – BONDI is an industry initiative, not TI only! [#26]
  • Are Apple and Nokia all supporters of BONDI or is this just the Telcos and the open source companies supporting this? [#35]
  • Who owns the application in BONDI? Will the Telco be able to get revenue? [#44]

Feedback: Outstanding Issues

What remained unclear from the session was exactly how devices might fit into two-sided business models. How could developers or other “upstream” players benefit from device capabilities? The emphasis seemed to be much more on Telcos using their device input to exert control, rather than monetising openness, and several of the contributors commented on this.

  • History repeats itself. Brings my thoughts to OSI – ODA etc. Loads of time spent with little result [#38]
  • When will the SP community come together to develop a sustainable model for their APIs that works across all networks, thereby getting the benefits of scale? Why do they all reinvent the wheel? [#24]
  • One item mentioned earlier today was that applications were the way to monetise a ‘service’ or website, but nothing was said about the business model. Where is the enhanced revenue in each of these approaches? [#30]
  • Where is the 2-sided business model? [#32]
  • What is the open mobile platform of the future? [#34]
  • Rainer: we heard Yves telling us that we need standards for the devices. How can we reach a consistent experience across three screens (clients etc.), and how are you able to open up to 3rd parties? [#49]
  • How can the Telco infrastructure be opened up without an open device side? [#51]
  • Don’t we already have a 2-sided business model today between mobile Opcos and device manufacturers in many countries? Opcos buy handsets wholesale and resell them with a rebate (aka handset subsidy) to the end customer. [#52]
  • Note 52: how is that 2-sided? Isn’t the revenue being produced only by the customer? [#57]

General Questions

Perhaps reflecting the lower general emphasis on devices within the Telco 2.0 community, the session also threw up a number of more general questions (some of which we’ve tried to answer in the bracketed Telco 2.0 comments below).

1.     Do we think that the trend is the same for devices in developed markets as in emerging markets? [#18] [Telco 2.0 – devices in emerging markets are slowly becoming more powerful, but are probably 3-4 years behind on average. More interestingly, the majority in markets like India are unsubsidised and do not feature operator-specific features, so will be even more difficult to control. Lack of fixed broadband means that the home gateway is less prominent]

2.     If devices become much more powerful how quickly will batteries die? [#20] [Telco 2.0 – battery life improves much more slowly than processors. However, it is often the screen that draws most power, not the processor. Various initiatives like multi-core processors will appear in handsets to help manage power, but it’s still an important issue]

3.     Where will Dell and other PC manufacturers play in this conversation? [#66] [Telco 2.0 – yes, up to a point, especially with 3G-enabled laptops and MIDs. However, only a small fraction are likely to be directly Telco-controlled, especially in the enterprise. Most PC users are unlikely to accept operator interference in their choice of apps, although this changes a little where cheap PCs are subsidised]

4.     What is the projected battery life, especially when uploading/downloading masses of data? [#67]

5.     Re 65: where will storage companies play? [#69] [Telco 2.0 – There may be a broader role to play in the device space, either for home servers (e.g. Linksys) or flash memory (e.g. SanDisk), in enabling new services, although this sector is still immature]

6.     What about the role of the SIM and can operators leverage the SIM to restrict the power of device manufacturers with consumers and use it to innovate new service models? [#61] [Telco 2.0 – the SIM card is definitely a core element of operator control and some new services. But not all devices have SIMs, and consumers are unlikely to accept SIM-locked PCs or TVs, especially if they connect via non-mobile access channels, or are unsubsidised. SIMs also have issues of legacy replacement, and are awkward for running applications across multiple carriers]

7.     A lot of focus in this session was on mobile devices – but what about a truly open set-top box and EPG, not tied to an operator service but open for the user to choose and subscribe to services as appropriate? [#55] [Telco 2.0 – There is huge potential for a standards-based platform for STBs across multiple operators, which would enable diverse business models for video delivery. The BBC’s Project Canvas is an effort in this direction, and the Linux community has developed several technical solutions for the CPE. However, the regulatory issues have been extremely problematic everywhere this has been tried.]

Participants ‘Next Steps’ Vote

Participants were asked which device strategy would offer Telcos the most realistic opportunity to deliver profitable new services and business models in the future.

  • Telco-designed and controlled smart devices (e.g. custom smartphones, operator-specific digital picture frames).
  • Separate Telco-controlled gateway device (e.g. femtocell, set-top box) used with an open edge device.
  • Open device with Telco control of policy software (e.g. netbook with SIM & operator connection software).
  • Forget about controlling devices, we can manage everything in the network.


Lessons learnt & next steps

Unfortunately, Telco strategists still appear to expend more effort on examining infrastructure and centralised application platforms than on the network “edge”. Although some within operator organisations are obviously focused on users’ hands and homes, there is often little broader recognition of the shifting balance of power – in terms of both influence and computation. The rise of the iPhone and similar devices has helped redress the balance somewhat – but even there, the emphasis has shifted to the more “comfortable”, centralised AppStore as something for operators to emulate.

This is understandable. By and large, few fixed or mobile operators have successfully helped create new types of devices on their own. A few broadband providers have used home gateways as new service platforms, or as ways to reduce churn, but even these have tended to be limited to adding functions like IPTV or VoIP. Few consumers would view their broadband “box” as the central hub of a home network – despite 10+ years of discussion of interconnection with consumer electronics, utility meters and home automation. All the talk of Telcos exploiting connectivity to HiFis or “screen fridges” has been hot air.

Alberto Ciarniello, VP Service Innovation, TIM: “Apple shipped 1bn apps at significant average revenue per user. It’s unprecedented. It’s generated a lot of traffic and a lot of stickiness.”

In the mobile space, the power of Nokia, Apple, RIM and others is always set against operators’ desire to customise applications or the user experience. Although in developed markets a high percentage of phones are sold through operator channels, the use of embedded operator-specific applications and on-device portals has had only limited commercial benefit. Probably the most important customisation has been setting the Telco’s own portal as the default browser home page. If anything, the shift towards smartphones and PC-based mobile broadband has further weakened Telcos’ role – the majority of 3G data traffic goes straight to and from the Internet from “vanilla” devices.

Anssi Vanjoki, EVP New Markets, Nokia: “Our user studies show that 12% of user time on the N-series is telephony or messaging; the rest is Web browsing, camera, media playback, e-mail, and applications.”

The future possibly holds more hope. The audience at the event was strongly in favour of pushing for Telco “control points” in otherwise open devices. This fits well with the heritage of SIM cards (which are expanding in capability), as well as standardisation in areas like the browser and widget frameworks (e.g. OMTP BONDI). Software pre-loaded with PC dongles or embedded 3G modems is another option. [Telco 2.0 is much more sceptical of the benefits of the RCS client advocated by the GSMA and certain operators.] In the converged triple/quad-play space, femtocells offer another point of control and service delivery, close to the customer – although the notion of a separate “gateway” product was viewed with less enthusiasm at the Nice event. New classes of devices, such as MIDs and operator-enabled consumer electronics (Internet TVs, 3G music players, in-car systems etc.), also hold promise, but are seen more as low-risk experiments at this point.

In terms of next steps, the Telco 2.0 team feels that, in the short term (c. 12 months), operators should:

  • Aggressively pursue “must have” devices like the iPhone – even if there is a short-term pain point around loss of control. At the moment, customers are still device-centric.
  • Think twice about pushing end-users towards smartphones – instead, look at data plans coupled to higher-end featurephones, especially those with good browsers, touchscreens etc.
  • Assess the business opportunities around OMTP’s BONDI model at a strategic level.
  • Revisit the realistic opportunities afforded by next-generation SIM cards for both PCs and phones.
  • Beware of certain device categories which will need new business/charging models to succeed broadly in the marketplace – for example, embedded-3G PCs are an “elegant concept”, but fail to meet the needs of mass-market consumers (or enterprises) at present.


Longer term, additional considerations are more pertinent:

  • Look at exploiting devices used by customers on other Telcos’ networks – there is no reason that operators cannot themselves become successful “over the top” players.
  • Look closely at using femtocells (plus handsets) as a new platform for innovative in-home services.
  • Work closely with utility companies on new smart metering / environmental monitoring applications.
  • Remain wary of new technical standards for devices that promise new opportunities – but require the creation of complete new ecosystems, and which potentially compete with other easier technologies. RCS and NFC are particularly exposed, in Telco 2.0’s view.
  • Expect developers to migrate towards the coolest and most computationally-powerful platforms. This may mean that the API strategy of the operator needs to become more device-centric over time.