Augmented reality supports many use cases across industries
Revisiting the themes explored in the AR/VR: Won’t move the 5G needle report STL Partners published in January 2018, this report explores whether augmented reality (AR) could become a catalyst for widespread adoption of 5G, as leading chip supplier Qualcomm and some telcos hope.
It considers how this technology is developing, its relationship with virtual reality (VR), and the implications for telcos trying to find compelling reasons for customers to use low latency 5G networks.
This report draws the following distinction between VR and AR:
Virtual reality: use of an enclosed headset for total immersion in a digital 3D environment.
Augmented reality: superimposition of digital graphics onto images of the real world via a camera viewfinder, a pair of glasses or a screen fixed in the real world.
In other words, AR is used both indoors and outdoors and on a variety of devices. Whereas Wi-Fi/fibre will be the preferred connectivity option in many scenarios, 5G will be required in locations lacking high-speed Wi-Fi coverage. Many AR applications rely on responsive connectivity to interact with the real world. To be compelling, animated images superimposed on those of the real world need to change in a way that is consistent with changes in the real world and in the viewing angle.
AR can be used to create innovative games, such as the 2016 phenomenon Pokémon Go, and educational and informational tools, such as travel guides that give you information about the monument you are looking at. At live sports events, spectators could use AR software to identify players, see how fast they are running, check their heart rates and call up their career statistics.
Note that an advanced form of AR is sometimes referred to as mixed reality or extended reality (XR). In this case, fully interactive digital 3D objects are superimposed on the real world, effectively mixing virtual objects and people with physical objects and people into a seamless interactive scene. For example, an advanced telepresence service could project a live hologram of the person you are talking to into the same room as you. This could be an avatar representing the person or, where the connectivity allows, a live 3D video stream of the person.
Widespread usage of AR services will be a hallmark of the Coordination Age, in the sense that they will bring valuable information to people as and when they need it. First responders, for example, could use smart glasses to help work their way through smoke inside a building, while police officers could be immediately fed information about the owner of a car registration plate. Office workers may use smart glasses to live stream a hologram of a colleague from the other side of the world or a 3D model of a new product or building.
In the home, both AR and VR could be used to generate new entertainment experiences, ranging from highly immersive games to live holograms of sports events or music concerts. Some people may even use these services as a form of escapism, virtually inhabiting alternative realities for several hours a day.
Given sufficient time to develop, STL Partners believes mixed-reality services will ultimately become widely adopted in the developed world. They will become a valuable aid to everyday living, providing the user with information about whatever they are looking at, either on a transparent screen on a pair of glasses or through a wireless earpiece. If you had a device that could give you notifications, such as an alert about a fast-approaching car or a delay to your train, in your ear or eyeline, why wouldn’t you want to use it?
How different AR applications affect mobile networks
One of the key questions for the telecoms industry is how many of these applications will require very low latency, high-speed connectivity. The transmission of high-definition holographic images from one place to another in real time could place enormous demands on telecoms networks, opening up opportunities for telcos to earn additional revenues by providing dedicated/managed connectivity at a premium price. But many AR applications, such as displaying reviews of the restaurant a consumer is looking at, are unlikely to generate much data traffic. The figure below lists some potential AR use cases and indicates how demanding they will be to support.
Examples of AR use cases and the demands they make on connectivity
Source: STL Partners
Although telcos have always struggled to convince people to pay extra for premium connectivity, some of the most advanced AR applications may be sufficiently compelling to bring about this kind of behavioural shift, just as people are prepared to pay more for a better seat at the theatre or in a sports stadium. This could be offered on a pay-as-you-go or a subscription basis.
Drivers for cloud gaming services
Although many people still think of PlayStation and Xbox when they think about gaming, the console market represents only a third of the global games market. From its arcade and console-based beginnings, the gaming industry has come a long way. Over the past 20 years, one of the most significant market trends has been the growth of casual gaming. Whereas hardcore gamers are passionate about frequent play and will pay more to play premium games, casual gamers play to pass the time. With the rapid adoption of smartphones capable of supporting gaming applications over the past decade, the population of casual/occasional gamers has risen dramatically.
This trend has seen the advent of free-to-play business models for games, further expanding the industry’s reach. In our earlier report, STL estimated that 45% of the population in the U.S. are either casual gamers (between 2 and 5 hours a week) or occasional gamers (up to 2 hours a week). By contrast, we estimated that hardcore gamers (more than 15 hours a week) make up 5% of the U.S. population, while regular players (5 to 15 hours a week) account for a further 15% of the population.
The expansion in the number of players is driving interest in ‘cloud gaming’. Instead of games running on a console or PC, cloud gaming involves streaming games onto a device from remote servers. The actual game is stored and run on a remote server, with the results being live-streamed to the player’s device. This has the important advantage of eliminating the need for players to purchase dedicated gaming hardware. Instead, the quality of the internet connection becomes the most important contributor to the gaming experience. While this type of gaming is still in its infancy, and faces a number of challenges, many companies are now entering the cloud gaming fold in an effort to capitalise on the new opportunity.
5G can support cloud gaming traffic growth
Cloud gaming requires not just high bandwidth and low latency, but also a stable connection with consistently low latency (minimal jitter). In theory, 5G promises to deliver stable ultra-low latency. In practice, an enormous amount of infrastructure investment will be required to enable a fully loaded 5G network to perform as well as end-to-end fibre. 5G networks operating in the lower frequency bands would likely buckle under the load if many gamers in a cell each needed a continuous 25Mbps stream. While 5G in millimetre-wave spectrum would have more capacity, it would require small cells and other mechanisms to ensure indoor penetration, given the spectrum is short range and can be blocked by obstacles such as walls.
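The capacity pressure described above can be sketched with some back-of-envelope arithmetic. The 25Mbps per-gamer figure comes from the text; the cell capacities and the share of capacity available for gaming are illustrative assumptions, not measured values.

```python
# Back-of-envelope check: how many concurrent 25 Mbps cloud-gaming streams
# could a single cell sustain? All capacity figures are assumptions for
# illustration only.

STREAM_MBPS = 25                  # per-gamer stream cited in the text
LOW_BAND_CAPACITY_MBPS = 400      # assumed shared downlink, low-band 5G cell
MMWAVE_CAPACITY_MBPS = 3000       # assumed shared downlink, mmWave cell

def max_concurrent_gamers(cell_capacity_mbps: float,
                          gaming_share: float = 0.5) -> int:
    """Gamers a cell can sustain if `gaming_share` of capacity is free for gaming."""
    return int(cell_capacity_mbps * gaming_share // STREAM_MBPS)

print(max_concurrent_gamers(LOW_BAND_CAPACITY_MBPS))  # 8
print(max_concurrent_gamers(MMWAVE_CAPACITY_MBPS))    # 60
```

Even under these generous assumptions, a low-band cell saturates with a handful of simultaneous gamers, which is why the report singles out millimetre-wave capacity and densification.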
A complicated ecosystem
As explained in our earlier report, Cloud gaming: New opportunities for telcos?, the cloud gaming ecosystem is beginning to take shape. This is being accelerated by the growing availability of fibre and high-speed broadband, which is now being augmented by 5G and, in some cases, edge data centres. Early movers in cloud gaming are offering a range of services, from gaming rigs, to game development platforms, cloud computing infrastructure, or an amalgamation of these.
One of the main attractions of cloud gaming is the potential hardware savings for gamers. High-end PC gaming can be an extremely expensive hobby: gaming PCs range from £500 for the very cheapest to over £5,000 for the very top end. They also require frequent hardware upgrades in order to meet the increasing processing demands of new gaming titles. With cloud gaming, you can access the latest graphics processing unit at a much lower cost.
By some estimates, cloud gaming could deliver a high-end gaming environment at a quarter of the cost of a traditional console-based approach, as it would eliminate the need for retailing, packaging and delivering hardware and software to consumers, while also tapping the economies of scale inherent in the cloud. However, in STL Partners’ view that is a best-case scenario and a 50% reduction in costs is probably more realistic.
STL Partners believes adoption of cloud gaming will be gradual and piecemeal for the next few years, as console gamers work their way through another generation of consoles and casual gamers are reluctant to commit to a monthly subscription. However, from 2022, adoption is likely to grow rapidly as cloud gaming propositions improve.
At this stage, it is not yet clear who will dominate the value chain, if anyone. Will the “hyperscalers” be successful in creating a ‘Netflix’ for games? Google is certainly trying to do this with its Stadia platform, which has yet to gain any real traction, due to both its limited games library and its perceived technological immaturity. The established players in the games industry, such as EA, Microsoft (Xbox) and Sony (PlayStation), have launched cloud gaming offerings, or are, at least, in the process of doing so. Some telcos, such as Deutsche Telekom and Sunrise, are developing their own cloud gaming services, while SK Telecom is partnering with Microsoft.
What telcos can learn from Shadow’s cloud gaming proposition
The rest of this report explores the business models being pursued by cloud gaming providers. Specifically, it looks at cloud gaming company Shadow and how it fits into the wider ecosystem, before evaluating how its distinct approach compares with that of the major players in online entertainment, such as Sony and Google. The second half of the report considers the implications for telcos.
Some people in the telecom industry believe that “voice is dead” – or, at least, that traditional phone calls are dying off. Famously, many younger mobile users eschew standalone real-time communications, instead preferring messaging loaded with images and emoji, via apps such as Facebook Messenger and WeChat, or those embedded in, for example, online gaming applications. At the other end of the spectrum, various forms of video-based communications are important, such as Snapchat’s disappearing video stories, as well as other services such as Skype and FaceTime.
Even for basic calling-type access, WhatsApp and Viber have grown huge, while assorted enterprise UC/UCaaS services such as Skype for Business and RingCentral are often “owning” the business customer base. Other instances of voice (and messaging and video) are appearing as secondary features “inside” other applications – games, social networks, enterprise collaboration, mobile apps and more – often enabled by the WebRTC standard and assorted platforms-as-a-service.
Smartphones and the advent of 4G have accelerated all these trends – although 3G networks have seen them as well, especially for messaging in developing markets. Yet despite the broad uptake of Internet-based messaging and voice/video applications, it is still important for mobile operators to provide “boring old phone calls” for mobile handset subscribers, not least in order to enable “ubiquitous connection” to friends, family and businesses – plus also emergency calls. Plenty of businesses still rely on the phone – and normal phone numbers as identifiers – from banks to doctors’ practices. Many VoIP services can “fall back” to normal telephony, or dial out (or in) from the traditional telco network. Many licence terms mandate provision of voice capability.
This is true for both fixed and mobile users – and despite the threat of reaching “peak telephony”, there is a long and mostly-stable tail of calling that won’t be displaced for years, if ever.
Figure 1: Various markets are beyond “peak telephony” despite lower call costs
Source: Disruptive Analysis, National Regulators
In other words, even if usage and revenues are falling, telcos – and especially mobile operators – need to keep Alexander Graham Bell’s 140-year legacy alive. If the network transitions to 4G and all-IP, then the telephony service needs to do so as well – ideally with feature-parity and conformance to all the legacy laws and regulation.
(As a quick aside, it is worth noting that telephony is only one sort of “voice communication”, although people often use the terms synonymously. Other voice use-cases vary from conferencing, push-to-talk, audio captioning for the blind, voice assistants like Siri and Alexa, karaoke, secure encrypted calls and even medical-diagnostics apps that monitor breathing noise. We discuss the relevance of non-telephony voice services for telcos later in this report.)

4G phone calls: what are the options?
CSFB (Circuit-Switched Fallback): The connection temporarily drops from 4G, down to 3G or 2G. This enables a traditional non-IP (CS – circuit-switched) call to be made or received on a 4G phone. This is the way most LTE subscribers access telephony today.
VoLTE: This is a “pure” 4G phone call, made using the phone’s in-built dialler, the cellular IP connection and tightly-managed connectivity with prioritisation of voice packets, to ensure good QoS. It hooks into the telco’s IMS core network, from where it can either be directly connected to the other party (end-to-end over IP), go via a transit provider or exchange, or else it can interwork with the historic circuit-based phone network.
App-based calling: This involves making a VoIP call over the normal, best-efforts, data connection. The function could be provided by a telco itself (eg Reliance Jio’s 4GVoice app), an enterprise UC provider, or an Internet application like Skype or Viber. Increasingly, these applications are also integrated into phones’ native dialler interfaces and can share call logs and other functions. [Note – STL’s Future of The Network research stream does not use the pejorative, obsolete and inaccurate term “OTT”.]
None of these three options is perfect.
Telephony is still necessary in the 4G era
4G phone calls: what are the options?
The history of VoLTE
The Good, the Bad & the Ugly
The motivations for VoLTE deployment
The problems for VoLTE deployment?
Market Status & Forecasts
Business & Strategic Implications
Is VoLTE really just “ToLTE”?
Link to NFV & Cloud
GSMA Universal Profile: Heaven or Hell for Telcos?
Do telcos have a role in video communications?
Intersection with enterprise voice
Figure 1: Various markets are beyond “peak telephony” despite lower call costs
Figure 2: VoLTE, mobile VoIP & LTE timeline
Figure 3: VoLTE coverage is often deployed progressively
Figure 4: LTE subscribers, by voice technology, 2009-2021
Although they make extensive use of WhatsApp, Facebook Messenger, Snapchat and other Internet-based communications services, consumers still expect mobile operators to enable them to make voice calls and send text messages. Indeed, communication services are widely regarded as a fundamental part of a telco’s proposition, but telcos’ telephony and messaging services are losing ground to Internet-based competitors and are generating less and less revenue.
Should telcos allow this business to gradually melt away or should they attempt to rebuild a competitive communications proposition for consumers? How much strategic value is there in providing voice calls and messaging services?
This report explores telcos’ strategic options in the consumer communications market, building on previous STL Partners’ research reports, notably:
This report evaluates telcos’ current position in the consumer market for voice calls and messaging, before considering what they can learn from three leading Internet-based players: Tencent, Facebook and Snap. The report then lays out four strategic options for telcos and recommends which of these options particular types of telcos should pursue.
What do telcos have to lose?
Learning from the competition
Tencent pushes into payments to monetise messaging
Facebook – nurturing network effects with fast footwork
Snapchat – highly-focused innovation
Telcos’ strategic options
Maximise data traffic
Embed communications into other services
Differentiate on reliability, security, privacy and reach
Compete head-on with Internet players
Figure 1: Vodafone still makes large sums from incoming calls & messages
Figure 2: Usage of Vodafone’s voice services is rising in emerging markets
Figure 3: Vodafone Europe sees some growth in voice usage
Figure 4: Internet-based services are overtaking telco services in China
Figure 5: Usage of China Mobile’s voice services is sliding downwards
Figure 6: China Mobile’s SMS traffic shows signs of stabilising
Figure 7: Vodafone’s SMS volumes fall in Europe, but rise in AMAP
Figure 8: Voice & messaging account for 38% of China Mobile’s service revenues
Figure 9: Line is also seeing rapid growth in advertising revenue in Japan
Figure 10: More WeChat users are making purchases through the service
Figure 11: About 20% of WeChat official accounts act as online shops
Figure 12: Line’s new customer service platform harnesses AI
Figure 13: Snapchat’s user growth seems to be slowing down
Figure 14: Vodafone Spain is offering zero-rated access to rival services
Figure 15: Google is integrating communications services into Maps
Figure 16: Xbox Live users can interact with friends and other gamers
Figure 17: RCS is being touted as a business-friendly option
Figure 18: Turkcell’s broad and growing range of digital services
The way in which audiences consume movies and television content appears to be changing. While ‘linear’ viewing of scheduled channels remains robust, the DVD market has collapsed and new pricing and consumption models are opening up.
At the forefront of this is Netflix – with a total of 63M paying subscribers across 50 markets (it is present in a large number of locations in Latin America and the Caribbean) and a penetration of over 34% in the US, Netflix has created a new paradigm for on-demand content.
How this model is going to impact other players in the market in the long term is as yet unclear. To date in the US, pay platform penetration has remained robust, premium channels such as HBO are also performing strongly, and for rights owners and producers a new player bidding for rights is hugely welcome.
So is Netflix a ‘win-win’ opportunity for all concerned? It may not be that straightforward.
For leading pay TV players, Netflix will be yet another component forcing them to invest in innovation to minimise customers churning from bundled packages, and reducing flexibility around price increases;
For TV channels, Netflix could lead to programme rights inflation, as a new player with a distinct business model comes in to bid for premium exclusive content rights;
For both established TV platforms and premium channels, there is the risk that, in price-sensitive markets or demographics, Netflix offers may gain traction, particularly among younger consumers, at the expense of traditional subscription models.
For telcos looking to compete with cable and satellite, while Netflix could offer a cost-effective way to deliver attractive premium content, it also carries the risk of confining telcos to the position of a ‘dumb (or happy) pipe’, not sharing in the upside and not owning the consumer, who deals directly with Netflix.
STL Partners has partnered with Prospero Strategy Consultants who work extensively with content and platform players on new market dynamics to prepare this Briefing. The work has drawn on interviews with key players and analysis of quantitative and qualitative market data, to determine the threats and opportunities emerging from this new content ecosystem and how these are likely to develop.
Overview of Netflix History
Netflix began as a postal DVD business in the US in 1997, launching its US subscription streaming service in 2007. Since 2011 it has focused on rapid expansion into international markets with the biggest growth now coming from international subscribers (67% growth between 2013 and 2014) while its US DVD business is now in decline.
Figure 4: Netflix subscribers 1999–2014 (Q3), in 000s
Note: Netflix changed its reporting methodology from Q1 2011
Consumer Proposition and USPs
The success of the Netflix proposition to consumers has been based on a number of components:
Low Price and refusal to tie users into long-term contracts
Volume and exclusivity of content
Effective User Interface, recommendation engine and multi-device access
Low price

The low monthly price point of Netflix (USD7.99 per month in the US, rising to USD8.99 for new subscribers in 2014) has been a key component of the company’s success. This price point is less than the cost of purchasing a single DVD and significantly less than monthly premium drama channels such as HBO (at ~USD15 per month). This price point (and the fact that users are not tied into long-term contracts) allows Netflix to attract distinct audience groups.
First, the high-end audience who are already pay subscribers. These customers have demonstrated that they are typically price inelastic and willing to pay more, buying Netflix on top of existing services.
Second, the price constrained audiences, for whom traditional pay TV is out of reach but who are interested in expanded choice. These are often younger demographics for whom the concept of non-linear consumption is very familiar.
There is a third audience group, the price sensitive pay TV subscribers for whom Netflix could be an effective substitute and who could churn off traditional pay TV (either completely or partially) as a result. While the evidence around the impact on this group is as yet nascent, it is this segment that is making incumbent pay TV players nervous.
Figure 5: Reasons Netflix streamers subscribe to the Service
As demonstrated in Figure 5 above, a key to success has been offering both range and quality of content. However, over time the shape of the Netflix library has changed as it has used its customer insight and data to inform its rights strategy.
In February 2012 the Netflix US library consisted of ~15k titles (Source: SNL Kagan) of which nearly three quarters were movie titles.
Since 2012 the volume of library titles has declined by approximately 30%, nearly all of which is accounted for by a decline in movie titles. Netflix has increased its focus on long-running drama series which already have brand recognition and which are effective at attracting and keeping audiences.
Interestingly, the volume of content being offered in its international markets is significantly less than in the US (about one-third) as Netflix shifts its focus to quality (as opposed to quantity) of content.
Netflix’s early content deals were typically library rights and non-exclusive. Over time that mix has shifted as Netflix increasingly looks to have a component of exclusivity, with the aim of shifting from a “nice to have” to a “must have” service.
Netflix is investing in original production of a limited number of high-profile, high-end drama series (such as House of Cards, Orange is the New Black and the recently announced Crouching Tiger Hidden Dragon sequel). For these Netflix can retain its exclusive rights indefinitely.
In addition, Netflix is bidding aggressively for exclusive windows for high end content (such as the recently announced deal for exclusive VOD rights in all territories for Gotham and first window rights in several territories for Penny Dreadful).
Figure 6: Netflix’s Evolving Content Proposition
Source: STL Partners & Prospero analysis
Effective consumer interface on multiple devices
Netflix has evolved a highly effective consumer interface, enabling personalisation by individual members in the household, with an easy to manage and visually effective selection mechanism.
Since 2008 Netflix has rolled out its proposition across multiple connected devices, with the most recent development being access on mobile devices and partnerships with 4G operators such as Vodafone. Cross-device functionality gives users a consistent experience.
The consumer is able to choose when and where to consume Netflix content – leading to a new dynamic of series “bingeing” analogous to box set consumption. In addition, Netflix’s deals with smart TV providers give consumers the ability to bypass traditional pay-TV gatekeepers.
Figure 7: Netflix’s user interface
Source: Netflix & SNL Kagan
Underlying much of Netflix’s success is its control of data. This includes knowledge of individuals within households (who have their own profiles) and detailed insight into viewing behaviour (not just what, but when and how much): knowledge that no linear channel can match.
In all markets (regardless of its distribution partners) Netflix retains its customer data and does not share it. This informs its rights negotiations and new programme investments.
Netflix continues to refine its customer understanding using sophisticated A/B testing, where small subgroups are given slightly different user experiences to see how this changes behaviour.
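The A/B testing described above is typically implemented with stable, deterministic user bucketing, so a given user always sees the same variant of an experiment. The sketch below shows one common approach; the experiment name, function names and 5% treatment share are illustrative assumptions, not details from the source.

```python
# Minimal sketch of deterministic A/B bucketing: hash the user and
# experiment together so each user lands in a stable variant, with a
# small subgroup receiving the modified experience.
import hashlib

def assign_variant(user_id: str, experiment: str,
                   treatment_share: float = 0.05) -> str:
    """Stable assignment: the same user/experiment pair always yields the same bucket."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF   # roughly uniform in [0, 1]
    return "treatment" if bucket < treatment_share else "control"

# Re-running the assignment never flips a user between variants.
assert assign_variant("user42", "new_row_layout") == assign_variant("user42", "new_row_layout")
```

Hashing (rather than random assignment stored per user) keeps the bucketing stateless, which matters when the experience must be consistent across many devices, as in Netflix’s multi-device proposition.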
Summary: Changing consumer behaviours and the transition to 4G are likely to bring about a fresh surge of video traffic on many networks. Fortunately, mobile content delivery networks (CDNs), which should deliver both better customer experience and lower costs, are now potentially an option for carriers using a combination of technical advances and new strategic approaches to network design. This briefing examines why, how, and what operators should do, and includes lessons from Akamai, Level 3, Amazon, and Google. (May 2013, Executive Briefing Service).
Content delivery networks (CDNs) are by now a proven pattern for the efficient delivery of heavy content, such as video, and for better user experience in Web applications. Extensively deployed worldwide, they can be optimised to save bandwidth, to provide greater resilience, or to help scale up front-end applications. In the autumn of 2012, it was estimated that CDN providers accounted for 40% of the traffic entering residential ISP networks from the Internet core. This is likely to be an underestimate if anything, as a major use case for CDN is to reduce the volume of traffic that has to transit the Internet and to localise traffic within ISP networks. Craig Labovitz of DeepField Networks, formerly the head of Arbor’s ATLAS instrumentation project, estimates that 35-45% of interdomain Internet traffic is accounted for by CDNs, rising to 60% for some smaller networks, and that 85% of this is video.
Figure 1: CDNs, the supertankers of the Internet, are growing
Source: DeepField, STL
In the past, we have argued that mobile networks could benefit from deploying CDN, both in order to provide CDN services to content providers and in order to reduce their Internet transit and internal backhaul costs. We have also looked at the question of whether telcos should try to compete with major Internet CDN providers directly. In this note, we will review the CDN business model and consider whether the time has come for mobile CDN, in the light of developments at the market leader, Akamai.
The CDN Business Model
Although CDNs account for a very large proportion of Internet traffic and are indispensable to many content and applications providers, they are relatively small businesses. Dan Rayburn of Frost & Sullivan estimates that the video CDN market, not counting services provided by telcos internally, is around $1bn annually. In 2011, Cisco put it at $2bn with a 20% CAGR.
This is largely because much of the economic value created by CDNs accrues to the operators in whose networks they deploy their servers, in the form of efficiency savings, and to the content providers, in the form of improved sales conversions, less downtime, savings on hosting and transit, and generally, as an improvement in the quality of their product. It’s possible to see this as a two-sided business model – although the effective customer is the content provider, whose decisions determine the results of competition, much of the economic value created accrues to the operator and the content provider’s customer.
On top of this, it’s often suggested that margins in the core CDN product, video delivery, are poor and it would be worth moving to supposedly more lucrative “media services”, products like transcoding (converting original video files into the various formats served out of the CDN for networks with more or less bandwidth, mobile versus fixed devices, Apple HLS versus Adobe Flash, etc) and analytics aimed at content creators and rightsholders, or to lower-scale but higher-margin enterprise products. We are not necessarily convinced of this, and we will discuss the point further on page 9. For the time being, note that it is relatively easy to enter the CDN market, and it is influenced by Moore’s law. Therefore, as with most electronic, computing, and telecoms products, there is structural pressure on prices.
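Transcoding, as described above, produces a ladder of renditions at different bitrates; the selection step it enables on the client side can be sketched as follows. The ladder values, function names and the 0.8 headroom factor are illustrative assumptions, not details from the source.

```python
# Hedged sketch of rendition selection over a pre-transcoded bitrate
# ladder: pick the highest rendition that fits within a safety margin
# of the measured bandwidth. Ladder values are illustrative only.

LADDER_KBPS = [400, 1200, 2500, 5000]   # assumed renditions, low to high

def pick_rendition(measured_kbps: float, headroom: float = 0.8) -> int:
    """Highest rendition fitting within `headroom` of measured bandwidth."""
    budget = measured_kbps * headroom
    fitting = [r for r in LADDER_KBPS if r <= budget]
    return max(fitting) if fitting else LADDER_KBPS[0]

print(pick_rendition(3500))  # 2500
```

The point for the business argument is that every rung of this ladder must exist before playback starts, which is exactly the transcoding workload that “media services” products sell.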
The Problem: The Traffic Keeps Coming
A major 4G operator recently released data on the composition of traffic over their new network. As much as 40% of the total, it turned out, was music or video streaming. The great majority of this will attract precisely no revenue for the operator, unless by chance it turns out to represent the marginal byte that induces a user to spend money on out-of-bundle data. However, it all consumes spectrum and needs backhauling and therefore costs money.
The good news is that most, or even all, of this could potentially be distributed via a CDN, and in many cases probably will be distributed by a CDN as far as the mobile operator’s Internet point of presence. Some of this traffic will be uplink, a segment likely to grow fast with better radios and better device cameras, but there are technical options related to CDN that can benefit uplink applications as well.
Figure 2: Video, music, and photos are filling up a 4G mobile network
Source: EE, STL
As 29% of their traffic originates from the top 3 point sources – YouTube, Facebook, and iTunes – it’s also observable that signing up a relatively small subset of content providers as customers will provide considerable benefit. All three use a CDN; two of them – Facebook and iTunes – are customers of Akamai, while YouTube relies on Google’s own solution.
We can re-arrange the last chart to illustrate this more fully. (Note that Skype, as a peer-to-peer application that is also live, is unsuitable for CDN as usually understood.)
Figure 3: The top 9 CDN-able point sources represent 40% of EE’s traffic
Source: EE, STL
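The transit saving implied by these shares is simply the product of the CDN-able fraction of traffic and the cache hit rate achieved on-net. The 40% figure comes from the EE data above; the 80% hit rate is an assumed figure for illustration.

```python
# Rough arithmetic behind the mobile CDN case: transit reduction equals
# the CDN-able share of traffic times the fraction of it served from an
# on-net cache. The hit rate below is an assumption, not a source figure.

def transit_saving(cdnable_share: float, hit_rate: float) -> float:
    """Fraction of total transit traffic removed by on-net caching."""
    return cdnable_share * hit_rate

# 40% of traffic CDN-able (per the EE figures), assumed 80% cache hit rate:
print(f"{transit_saving(0.40, 0.80):.0%}")  # 32%
```

Even a modest hit rate on a large CDN-able share translates into a substantial cut in transit and backhaul costs, which is the core of the operator business case discussed below.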
Looking further afield, the next chart shows the traffic breakdown by application from DeepField’s observations in North American ISP networks.
Figure 4: The Web giants ride on the CDNs
Clearly, the traffic sources and traffic types that are served from CDNs are both the heaviest to transport and also the ones that contribute most to the busy hour; note that these are peak measurements, and the total of the CDN traffic here (Netflix, YouTube, CDN other, Facebook) is substantially more than it is on average.
To read the report in full, including the following sections detailing additional analysis…
Akamai: the World’s No.1 CDN
Financial and KPI review
The Choice for CDN Customers: Akamai, Amazon, or DIY like Google?
CDN depth: the key question
CDN depth and mobile networks
Akamai’s guidelines for deployment
Why has mobile CDN’s time come?
What has held mobile CDN back?
But the world has changed…
…Networks are much less centralised…
…and IP penetrates much more deeply into the network
Licensed or Virtual CDN – a (relatively) new business model
SDN: a disruptive opportunity
So, why right now?
It may be time for telcos to move on mobile CDN
The CDN industry is exhibiting familiar category killer dynamics
Regional point sources remain important
CDN internals are changing the structure of the Internet
Recommendations for action
…and the following figures…
Figure 1: CDNs, the supertankers of the Internet, are growing
Figure 2: Video, music, and photos are filling up a 4G mobile network
Figure 3: The top 9 CDN-able point sources represent 40% of EE’s traffic
Summary: For mobile entertainment services to generate revenues commensurate with the attention they receive, the industry needs to improve ‘discovery’ tools, create more effective creative inventory, and deliver proof of its effectiveness. A summary of the Digital Entertainment 2.0 session of the 2013 Silicon Valley Brainstorm. (April 2013)
Below are the high-level analysis and detailed contents from a 27 page Telco 2.0 Briefing Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service here. The Digital Economy, Consumer Experience (including service ‘discovery’), Digital Commerce and the Internet of Things will also be explored in depth at the EMEA Executive Brainstorm in London, 5-6 June, 2013. Non-members can find out more about subscriptions here, or to find out more about this and other enquiries, please email firstname.lastname@example.org or call +44 (0) 207 247 5003.
Part of the New Digital Economics Executive Brainstorm series, the Digital Entertainment 2.0 session took place at the Intercontinental Hotel, San Francisco, on the 20th March, 2013. The title and objective of the session was ‘How to Make Mobile Work’.
Analysis: What Gets Measured Gets Money
The key steps for mobile entertainment services to generate revenues commensurate with the attention they receive in North America are: to improve measurement of the success of ‘discovery’ tools, to create more effective creative advertising inventory, and to deliver proof of effectiveness, not just attention.
Mobile is a ‘break out’ entertainment medium
Mobile has for some time been an entertainment medium in the eyes of consumers, particularly younger ones, who soak up ‘dead time’ by playing games, using apps and even just communicating for fun, although to date not all these forms of entertainment have been connected.
In the past 3 years there has been a significant increase in ‘on demand’ and mobile consumption in North American and European markets, particularly in these younger segments, although a key challenge has been that monetisation has not kept pace with the time spent on mobile.
Mobile entertainment itself can be defined in relation to a context (e.g. ‘out and about’, ‘dead time’, ‘second screen’), devices (featurephone, smartphone or tablet), or type of connection (e.g. none, 3G, 4G, Wi-Fi). In general, though, there are two main scenarios: mobile as a medium in its own right, and mobile as a ‘second screen’ experience. In either scenario, we think the clearest answer to ‘what is the role of mobile?’ is that it is a ‘break out’ medium, either extending the context of a form of entertainment, or extending the nature of entertainment in the existing context.
For video in North America, TV is still the dominant form of consumption, but mobile is growing rapidly as the ‘second screen’ that controls or supplements the main screen, especially with the explosive growth of tablets since the introduction of the Apple iPad.
Segmentation – there’s no single dominant business model
There has been much debate about the viability of different business models, broadly: advertising funded; consumer ownership; and subscription. While most participants believed that the ownership model would be most successful in music, where there is a higher likelihood that a consumer will want to listen to a track or album numerous times, ‘collectors’ or owners will still exist for videos, books and games.
Equally, demand exists for single ‘on demand’ services (e.g. pay per view), subscription (e.g. Spotify, cable), and advertising funded (e.g. YouTube). The balance is likely to change in video in particular with a move to increasing ‘on demand’ services in line with the current trend in consumer behaviour.
‘Discovery’: finding a model that proves it works
As the previously dominant channel-based model of curation in broadcast media gradually dissolves, and as the screen size, context and characteristics of consumption change, consumers face an increasing challenge finding out what they want to see, play or listen to.
Curation still exists through channel guides, taste-makers and review sites, and indeed through many offline sources, but it is increasingly less the property of the content producer or distributor than it once was.
Content Discovery, one of the great buzz phrases of the industry, is therefore an ongoing challenge, and the application of networked computing power gives connected, interactive devices like smartphones and tablets some advantages here. Approaches used include:
3rd party classification (e.g. by genre, subject), enabling more structured self-selection through menu choices etc.;
Recommendation engines (the Amazon/Netflix model), that can be based on a ‘Big Data’ approach (‘other people who bought this also bought that’) and/or semantic association (‘here’s another sentimental family comedy you might like’);
Social approaches, based on what your friends like or are watching, either through generic social media like Facebook or specialised social media such as Zeebox.com (for video) and Goodread.com (for books);
Search – although this is non-trivial due to the volume of material in existence, and the ever-changing art of Search Engine Optimisation (SEO) – getting the right items at the top of the list.
Hybrid approaches that combine ‘Big Data’ with ‘semantic association’ (e.g. see Jinni.com) and/or other forms e.g. Social (e.g. see this intriguing article on the $1m recommendations challenge from Netflix).
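As a concrete illustration of the ‘other people who bought this also bought that’ approach listed above, here is a minimal item-to-item co-occurrence recommender in Python. The viewing histories are invented for the sketch, and this is a toy version of the general technique, not any vendor’s actual algorithm.

```python
from collections import Counter

# Invented example data: each set is one user's viewing history.
histories = [
    {"film_a", "film_b", "film_c"},
    {"film_a", "film_b"},
    {"film_b", "film_c", "film_d"},
    {"film_a", "film_c"},
    {"film_a", "film_b"},
]

def recommend(seed, histories, n=2):
    """Rank items by how often they appear alongside the seed item."""
    co_counts = Counter()
    for history in histories:
        if seed in history:
            # Count every other item that co-occurs with the seed.
            co_counts.update(history - {seed})
    return [item for item, _ in co_counts.most_common(n)]

print(recommend("film_a", histories))  # ['film_b', 'film_c']
```

A production system would normalise for item popularity and blend in semantic signals (the hybrid approach above), but the core ‘Big Data’ step is exactly this co-occurrence count.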
For all methods, inconsistent metadata (e.g. whether the media is described accurately in the terms a user would use) is frequently a challenging limitation.
To a degree though, content discovery has always been a process of ‘trial and error’. Consumers read, hear or see a load of ideas, try a few out, stick to the ones they like, and grow to trust the means of discovery that is most successful for them.
To this end, an element that appears to be missing in many discovery processes today is the measurement of success rate for the user – “was this a good recommendation for you”? In our view, discovery applications that accurately track success well (easily, with a good UI, and with a tangibly good and improving success rate) will ultimately prove successful. All of the above techniques could and to a greater or lesser extent do adopt this approach, although it isn’t yet clear which will perform the best in the market.
Delivering the goods
The challenges of delivering content, particularly large volume (e.g. HD Video) and/or latency sensitive content (such as multi-player virtual gaming), were not addressed in the Digital Entertainment session, though Software Defined Networking (SDN), which offers the promise of more efficient routing through networks for certain traffic, was discussed in the Digital Economy session. Content Delivery Networks and other Broadband design techniques have also been addressed at length in other STL Partners research and brainstorms.
However, ‘Bandwidth’ was one of the key determinants of success according to the ‘BBC’ heuristic offered by Mitch Berman from his experience as a guide to how mobile entertainment will operate in different markets: Bandwidth; Business Model; and Culture. (NB We think this can also be seen as a shorthand variant of our business model framework, with culture being a key driver of the content proposition, bandwidth of the technical capability, and the business model of the value proposition.)
Getting money commensurate with the time spent on mobile
A major challenge for advertising-funded mobile entertainment is the significant gap between the ratio of time spent on mobile to money spent on it, and the equivalent ratio for other forms of media. Two statistics illustrate this:
For The Weather Channel, mobile is 1.5x the traffic but less than 50% of the revenue;
10% of media consumption occurs online vs. 1% of media spend (from the presentation by Cary Tild, CIO GroupM, in the subsequent Marketing and Advertising session).
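The monetisation gap can be expressed as revenue earned per unit of attention. A quick back-of-the-envelope calculation using the figures quoted above (rounded, and treated as illustrative rather than precise):

```python
# Figures taken from the statistics quoted in the text, rounded.
online_consumption_share = 0.10  # 10% of media consumption is online
online_spend_share = 0.01        # vs. 1% of media spend

# Revenue per unit of attention, relative to media overall:
monetisation_ratio = online_spend_share / online_consumption_share
print(f"Online earns {monetisation_ratio:.0%} of the revenue per unit "
      f"of attention that media overall earns")  # 10%

# The Weather Channel example: mobile is 1.5x the traffic of other
# channels but under 50% of the revenue, so per unit of traffic:
twc_traffic_ratio, twc_revenue_ratio = 1.5, 0.5
twc_per_unit = twc_revenue_ratio / twc_traffic_ratio
print(f"TWC mobile monetises at most {twc_per_unit:.0%} per unit of traffic")
```

On these numbers, mobile/online attention is monetising at roughly a tenth to a third of the rate of other media, which is the gap the rest of this section addresses.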
While this imbalance is genuine, there are important advantages and limitations to mobile as a medium that haven’t yet been fully exploited or overcome, respectively.
One of the major limitations has been the need for more effective commercial inventory on mobile. At the Brainstorm there was, for example, much discussion on the limitations of banner type ads in a mobile environment. Many more innovative forms are now evolving, illustrated by:
The Weather Channel’s experimental creative commercial content within its app, in which, instead of a rectangular banner at the top of the screen, appropriate commercial content is embedded in the background of the weather screen (e.g. a cloud-wrapped image from a mystery film on a cloudy day’s forecast screen);
New trial applications that insert products virtually in existing content (e.g. a soft-drink on a table in an old TV show, as demonstrated in test form by ReinCloud);
And subsequently, the launch of Facebook Home, designed to increase the commercial inventory available to Facebook by taking over the screen of a user’s smartphone.
In the same way that advertising has always evolved (from print to radio, radio to TV, etc.), there is still much to be learned through innovation and experimentation – and of course the related measurements of success.
Charging differently for content rights by content owners, e.g. by the use of content rather than as an upfront fee, was also discussed, although many content owners are reluctant to move to or even test this model as they see it representing a significant risk to existing revenue streams.
The digital economy core themes of ‘big data’ and ‘localisation’ were also raised, and an example given by the Weather Company of a highly effective promotion of grass seeds based on locality and the detection of key seasonal weather changes.
Finally, a key theme in common with the subsequent advertising session was that proving the effectiveness of models to consumers, brands, and investors was the key step for most mobile entertainment concepts. We see thoughtful design, coupled with trial and experimentation, effective measurement and the ongoing application of learning processes to be central to achieving that proof.
Effectiveness in the ‘discovery’ phase of digital service is a key success criterion, particularly in Digital Entertainment. We will continue to research and explore this area in our Executive Brainstorms in Europe, the Middle East, and Asia-Pacific.
To read the note in full, including the following sections detailing additional analysis…
Brainstorm: Stimulus Presentations – summary and key points
Brainstorm: Table Discussions
Verbatim delegate questions & comments
Brainstorm: Panel Session in summary
…and the following figures…
Figure 1 – Traditional linear TV model is facing multiple disruptions
Figure 2 – Non-linear forms of TV becoming a mass-market requirement
Figure 3 – Tablets are changing the TV/video landscape
Figure 4 – The mobile problem
Figure 5 – Whither digital collectors?
Figure 6 – Shine on you crazy diamond?
Figure 7 – Sharing the locker?
Figure 8 – Do we all want libraries?
…Members of the Telco 2.0 Executive Briefing Subscription Service can download the full 27 page report in PDF format here. Non-Members, please subscribe here. The Digital Economy, Consumer Experience (including service ‘discovery’), Digital Commerce and the Internet of Things will also be explored in depth at the EMEA Executive Brainstorm in London, 5-6 June, 2013. For this or any other enquiries, please email email@example.com / call +44 (0) 207 247 5003.
Background & Further Information
The 2013 Silicon Valley Brainstorm used STL’s unique ‘Mindshare’ interactive format, including cutting-edge new research, case studies, use cases and a showcase of innovators, structured small group discussion on round-tables, panel debates and instant voting using on-site collaborative technology. Around 30 executives from entertainment, media, telecoms and technology companies participated in this session in total.
The focus was on looking at “the true role for mobile” in the digital entertainment industry. Opening the session informally, various attendees were canvassed about their intentions & hopes for the day. This yielded a desire for information to assist in business modelling, to learn about the realities of the US entertainment market – or just to experience “inspiration and surprise” from a diverse set of speakers.
Objective: How to Make Mobile Work
The session covered three presentations and a demo, spanning the width of the entertainment business from TV to books, and from user behaviour to advertising. Its principal focus was how content and telecoms companies could generate sustainable businesses by leveraging the trend towards mobility – both devices and networks.
Designing compelling mobile entertainment experiences
4G: The impact on video distribution and consumption economics
Latest models for monetisation
The session included three Stimulus Speakers:
Andre James, Partner, Bain & Co
Alex Linde, Vice President, Mobile & Digital Apps, The Weather Channel
Keith McMahon, Senior Analyst, STL Partners/Telco 2.0
In addition, Dan Reitan, CEO, Reincloud gave an Innovation Showcase demo, after which these four were joined on the debate panel by two other industry luminaries:
David Gale, EVP New Media, MTV
Mitchell Berman, Principal, Blend Digital
We’d like to thank the sponsors of the Brainstorm:
Summary: Google’s shares have made little headway recently despite its dominance in search and advertising, and it faces increasing regulatory threats in this area. It either needs to find new sources of value growth or start paying out dividends, like Microsoft, Apple (or indeed, a telco). Overall, this is resulting in something of a strategic identity crisis. A review of Google’s strategy and implications for Telcos. (March 2012, Executive Briefing Service, Dealing with Disruption Stream).
Below is an extract from this 24 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and the Telco 2.0 Dealing with Disruption Stream here. Non-members can subscribe here, buy a Single User license for this report online here for £595 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email firstname.lastname@example.org / call +44 (0) 207 247 5003. We’ll also be discussing our findings and more on Google at the Silicon Valley (27-28 March) and London (12-13 June) New Digital Economics Brainstorms.
Google appears to be suffering from a strategic identity crisis. It is the giant of search advertising but it also now owns a handset maker, fibre projects, an increasingly fragmented mobile operating system, a social network of questionable success, and a driverless car programme (among other things). It has a great reputation for innovation and creativity, but risks losing direction and value by trying to focus on too many strategies and initiatives.
We believe that Google needs to stop trying to copy what Apple and Facebook are doing, de-prioritise its ‘Hail Mary’ hunt for a strategy (e.g. driverless cars), and continue to build new solutions that serve better the customers who are already willing to pay – namely, advertisers.
It is our view that the companies who have created most value in the market have done so by solving a customer problem really well. Apple’s recent success derives from creating a simpler and more beautiful way (platform + products) for people to manage their digital lives. People pay because it’s appealing and it works.
Google initially solved how people could find relevant information online and then, critically, how to use this to help advertisers get more customers. They do this so well that Google’s $37bn revenues continue to grow at double digit pace, and there’s plenty of headroom in the market for now. While the TV strategy may not yet be paying off, it would seem sensible to keep working at it to try to keep extending the reach of Google’s platform.
While Android keeps Google in the mobile game to a degree, and has certainly helped to constrain certain rivals, we think Google should cast a hard eye over its other competing and distracting activities: Motorola, Payments, Google +, Driverless Cars etc. Its management team should look at the size of the opportunity, the strength of the competition, and their ability to execute in each.
Pruning the projects might also lose Google an adversary or two, and it might also afford some reward to shareholders too. After all, even Apple has recently decided to pay back some cash to investors.
This may be very difficult for Google’s current leadership. Larry Page seems to have the restless instincts of the stereotypical Valley venture capitalist, hunting the latest ideas, and constantly trying to create the next big beautiful thing. The trouble is that this is Google in 2012, not 1995, and it looks to us at least that a degree of ‘sticking to the knitting’ within Google’s huge, profitable and growing search advertising business may be a better bet than the highly speculative (and expensive) ‘Hail Mary’ strategy route.
This may sound surprising coming from us, the inveterate fans of innovation at Telco 2.0, so we’d like to point out some important differences between the situations that Google and the telcos are in:
Google’s core markets are growing, not flat or shrinking, and are at a different life-stage to the telecoms market;
Google is global, rather than being confined to any given geography. There are many opportunities still out there.
We are not saying that Google should stop innovating, but we are saying it should focus its innovative energy more clearly on activities that grow the core business.
In January this year, Google achieved a first – it missed the consensus forecast for its quarterly earnings. There is of course no magic in the consensus, which is an average of highly conventionalised guesses from a bunch of City analysts, but it is as good a moment as any to review Google’s strategic position. If you bought Google stock at the beginning, you may not need to read this, as you’re probably very rich (the return since then is of the order of 400%). The entirety of this return, however, is accounted for by the 2004-2007 bull run. On a five-year basis, Google stock is ahead 30%, which sounds pretty impressive (a 6% annual return), but again, all the growth is accounted for by the last surge upwards over the summer of 2007. The peak was achieved on the 2nd of November, 2007.
As this chart shows, Google stock is still down about 9% from the peak, and perhaps more importantly, its path tracks Microsoft very closely indeed. Plus Microsoft investors get a dividend, whereas Google investors do not.
Larry Page has summed up his ambition for the company as to “invent wild things that will help humanity, get them adopted by users, profit, and then use the corporate structure to keep inventing new things.”
No longer a search company? Take a look at the revenues. Out of Google’s $37.9bn in revenues in 2011, $36bn came from advertising, aka the flip side of Google Search. Despite a whole string of mammoth product launches since 2007, Google’s business is essentially what it was in 2007 – a massive search-based advertising machine.
Since then, Google has launched Google +, closed Google Buzz, and closed Google Wave while releasing it into a second life as an open-source project. It has been involved in major litigation over patents and in regulatory inquiries. It has seen an enormous boom in Android shipments but not necessarily much revenue. It is about to become a major hardware manufacturer by acquiring Motorola. And it has embarked on extensive changes to the core search product and to company-wide UI design.
In this note, we will explore Google’s activities since our last note, summarise key threats to the business and strategies to counter them, and consider if a bearish view of the company is appropriate.
We’ve found it convenient to organise Google’s business into several themed groups as follows:
1: Questionable Victories
A Pyrrhic victory is one so costly that it is indistinguishable from defeat. Although there is nothing quite so bad at Google, the company seems to have a knack for creating products that are hugely successful without necessarily generating cash. Android is exhibit A.
The obvious point here is surging, soaring growth – forecasts for Android shipments have repeatedly been made, beaten on the upside, adjusted upwards, and then beaten again. Android has hugely expanded the market for smartphones overall, caused seismic change in the vendor industry, and triggered an intellectual property war. It has found its way into an awe-inspiring variety of devices and device classes.
But questions are still hanging over how much actual money is involved. During the Q4 results call, a figure for “mobile” revenues of $2.5bn was quoted. This turns out to consist of advertising served to browsers that present a mobile device user-agent string. However, Google lawyer Susan Creighton is on record as saying that 66% of Google mobile web traffic originates from Apple iOS devices. It is hard to see how this can be accounted for as Android revenue.
Further, the much-trailed “fragmentation” began in 2011 with a vengeance. “Forkdroids”, devices using an operating system based on Android but extensively adapted (“forked” from the main development line), appeared in China and elsewhere. Amazon’s Kindle Fire tablet is an example closer to home.
And the intellectual property fights with Oracle, Apple, and others are a constant source of disruption and a potentially sizable leakage of revenue. In so far as Google’s motivation in acquiring Motorola Mobility was to get hold of its patent portfolio, this has already involved very large sums of money. Another counter-strategy is the partnership with Intel and Lenovo to produce x86-based Android devices, which cannot be cheap either and will probably mean even more fragmentation.
This is not the only example, though – think of Google Books, an extremely expensive product which caused a great deal of litigation, eventually got its way (although not all the issues are resolved), and is now an excellent free tool for searching in old books but no kind of profit centre. Further, Google’s patented automatic scanning has the unfortunate feature of pulling in marginalia, etc. from the original text that its rivals (such as Amazon Kindle) don’t. Further, Google has recently been trying to monetise one of its classic products, the Google Maps API that essentially started the Web 2.0 phenomenon, with the result that several heavy users (notably Apple and Foursquare) have migrated to the free OpenStreetMap project and its OpenLayers API.
Like a telco, Google is dependent on one key source of revenue that cross-subsidises the rest of the company – search-based advertising.
Figure 2: Google’s advertising revenues cascade into all other divisions
[NB TAC = Traffic Acquisition Cost, CoNR = Cost of Net Revenues]
Having proven to be a category killer for search and advertising across the whole of the Internet, the twins (search and ads) are hugely critical for Google and also for millions of web sites, content creators, and applications developers. As a result, just like a telco, they are increasingly subject to regulation and political risk.
Google search rankings have always been subject to an arms race between the black art of search-engine optimisation and Google engineers’ efforts to ensure the integrity of their results, but the whole issue has taken a more serious twist with the arrival of a Federal Trade Commission inquiry into Google’s business practices. The potential problems were dramatised by the so-called “white lady from Google” incident at Google Kenya, where Google employees scraped a rival directory website’s customers and cold-called them, misrepresenting their competitors’ services, and further by the $500 million online pharmacy settlement. Similarly, the case of the Spanish camp site that wants to be disassociated from horrific photographs of a disaster demonstrates both that there is a demand for regulation and that sooner or later, a regulator or legislator will be tempted to supply it.
As well as the FTC, there is also substantial regulatory risk in the EU. The European Commission, in giving permission for the Motorola acquisition, also stated that it would consider further transactions involving Google and Motorola’s intellectual property on a case-by-case basis. To put it another way, after the Motorola deal, the Commission has set up a Google Alert for M&A activity involving Google.
3: Look & Feel Problems
Google is in the process of a far-reaching refresh of its user interfaces, graphic design, and core search product. The new look affects Search, GMail, and Google + so far, but is presumably going to roll out across the entire company. At the same time, they have begun to integrate Google + content into the search results.
This is, unsurprisingly, controversial and has attracted much criticism, so far only from the early adopter crowd. There is a need for real data to evaluate it. However, there are some reasons to think that Search is looking in the wrong place.
Since the major release codenamed Caffeine in 2008, Google Search engineers have been optimising the system for speed and for first-hit relevance, while also indexing rapidly-changing content faster by redesigning the process of “spidering” web sites to work in parallel. Since then, Google Instant has further concentrated on speed to the first result. In the Q4 results, it was suggested that mobile users are less valuable to Google than desktop ones. One reason for this may be that “obvious” search – Wikipedia in the first two hits – is well served by mobile apps. Some users find that Google’s “deep web” search has suffered.
Under “Google and your world”, recommendations drawn from Google + are being injected into search results. This is especially controversial for a mixture of privacy and user-experience reasons. Danny Sullivan’s SearchEngineLand, for example, argues that it harms relevance without adding enough private results to be of value. Further, doubt has been cast on Google’s numbers regarding the new policy of integrating Google accounts into G+ and G+ content into search.
Another, cogent criticism is that it introduces an element of personality that will render regulatory issues more troublesome. When Google’s results were visibly the output of an algorithm, it was easier for Google to claim that they were the work of impartial machines. If they are given agency and associated with individuals, it may be harder to deny that there is an element of editorial judgment and hence the possibility of bias involved.
Social search has been repeatedly mooted since the mid-2000s as the next-big-thing, but it seems hard to implement. Yahoo!, Facebook, and several others have tried and failed.
Figure 3: Google + on Google Trends: fading into the noise?
Source: Google Trends
It is possible that Google has a structural weakness in design, as opposed to engineering (which is as excellent as ever). This may explain why a succession of design-focused initiatives have failed – Wave and Buzz have been shut down, Google TV hasn’t gained traction (there are fewer than one million active devices), and feedback on the developer APIs is poor.
4: Palpable Project Proliferation
Google’s tendency to launch new products is as intimidating as ever. However, there is a strong argument that its tireless creativity lacks focus, and the hit-rate is worryingly low. Does Google really need two cut-down OSs for ultra-mobile devices? It has both Android and ChromeOS, and if the first was intended for mobile phones and the second for netbooks, you can now buy a netbook-like (but rather more powerful) Asus PC that runs Android. Further, Google supports a third operating system for its own internal purposes – the highly customised version of Linux that powers the Google Platform – and could be said to support a fourth, as it pays the Mozilla Foundation substantial amounts of money under the terms of their distribution agreement, and Mozilla’s Boot to Gecko project is essentially a mobile OS. IBM also supported four operating systems at its historic peak in the 1980s.
Also, does Google really need to operate an FTTH network, or own a smartphone vendor? The Larry Page quote we opened with tends to suggest that Google’s historical tendency to do experiments is at work, but both Google’s revenue raisers (Ads and YouTube, which from an economic point of view is part of the advertising business) date from the first three years as a public company. The only real hit Google has had for some time is Android, and as we have seen, it’s not clear that it makes serious money.
Google Wallet, for example, was launched with a blaze of publicity, but failed to attract support from either the financial or the telecoms industry, rather like its predecessor Google Checkout. It also failed to gain user adoption, but it has this in common with all NFC-based payments initiatives. Recently, a major security bug was discovered, and key staff have been leaving steadily, including the head of consumer payments. Another shutdown is probably on the cards.
Another heavily hyped project which does not seem to be gaining traction is the Chromebook, the hardware-as-a-service IT offering aimed at enterprises. This has been criticised on the basis that its $28/seat/month pricing is actually rather high. Over a typical 3 year depreciation cycle for IT equipment, it’s on a par with Apple laptops, and has the restriction that all the applications must work in a Web browser on netbook-class hardware. Google management has been promoting small contract wins in US school districts. Meanwhile, it is frequently observed that Google’s own PC fleet consists mostly of Apple hardware. If Google won’t use them itself, why should any other enterprise IT shop do so? The Google Search meeting linked above contains 2 Lenovo ThinkPads and 13 Apple MacBooks of various models and zero Chromebooks, while none other than Eric Schmidt used a Mac for his MWC 2012 keynote. Traditionally, Google insisted on “dogfooding” its products by using them internally.
The Google Fibre project in Kansas City, for its part, has been struggling with regulatory problems related to its access to city-owned civil infrastructure. Kansas City’s utility poles have reserved areas for different services, for example telecoms and electrical power. Google was given the concession to string the fibre in the more spacious electrical section – however, this requires high voltage electricians rather than telecoms installers to do the job and costs substantially more. Google has been trying to change the terms, and use the telecoms section, but (unsurprisingly) local cable and Bell operators are objecting. As with the muni-WLAN projects of the mid-2000s, the abortive attempt to market the Nexus One without the carriers, and Google Voice, Google has had to learn the hard way that telecoms is difficult.
And while all this has been going on, you might wonder where Google Enterprise 2.0 or Google Ads 2.0 are.
5. Google Play – a Collection of Challenges?
Google recently announced its “new ecosystem”, Google Play. This consists of what was historically known as the Android Market, plus Google Books, Google Music, and the web-based elements of Google Wallet (aka Google Checkout). All of these products are more or less challenged. Although the Android Market has been a success in distributing apps to the growing fleets of Android devices, it continues to contain an unusually high percentage of free apps, developer payouts tend to be lower than on its rivals, and it has had repeated problems with malware. Google Books has been an expensive hobby, involving substantial engineering work and litigation, and seems unlikely to be a profit centre. Google Music – as opposed to YouTube – is also no great success, and it is worth asking why both projects continue.
However, it will be the existing manager of Google Music who takes charge, with Android Market management moving out. It is worth noting that in fact there were two heads of the Android Market – Eric Chu for developer relations and David Conway for product management. This is not ideal in itself.
To read the note in full, including the following additional analysis…
On the Other Hand…
Strengths of the Core Business
“Apple vs. Google”
Summary: Key Product Review
Search & Advertising
YouTube and Google TV
Summary: Google Dashboard
Recommendations for Operators
The Telco 2.0™ Initiative
…and the following figures…
Figure 1: Google, Microsoft 2.0?
Figure 2: Google’s advertising revenues cascade into all other divisions
Figure 3: Google + on Google Trends: fading into the noise?
Figure 4: Google’s Diverse Advertiser Base
Figure 5: Google’s Content Acquisition. 2008-2009, the missing data point
Figure 6: Google Product Dashboard
…Members of the Telco 2.0 Executive Briefing Subscription Service and the Telco 2.0 Dealing with Disruption Stream can download the full 24 page report in PDF format here. Non-Members, please subscribe here, buy a Single User license for this report online here for £595 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email email@example.com / call +44 (0) 207 247 5003.
Organisations, geographies, people and products referenced: AdSense, AdWords, Amazon, Android, Apple, Asus, AT&T, Australia, BBVA, Bell Labs, Boot to Gecko, Caffeine, CES, China, Chromebook, ChromeOS, ContentID, David Conway, Eric Chu, Eric Schmidt, European Commission, Facebook, Federal Trade Commission, GMail, Google, Google +, Google Books, Google Buzz, Google Checkout, Google Maps, Google Music, Google Play, Google TV, Google Voice, Google Wave, GSM, IBM, Intel, Kenya, Keyhole Software, Kindle Fire, Larry Page, Lenovo, Linux, MacBooks, Microsoft, Motorola, Mozilla Foundation, Netflix, Nexus, Office 365, OneNet, OpenLayers API, OpenStreetMap, Oracle, Susan Creighton, ThinkPads, VMWare, Vodafone, Western Electric, Wikipedia, Yahoo!, Your World, YouTube, Zynga
Technologies and industry terms referenced: advertisers, API, content acquisition costs, driverless car, Fibre, Forkdroids, M&A, mobile apps, muni-WLAN, NFC, Search, smart TV, spectrum, UI, VoIP, Wallet
This report analyses the strategies behind the success of Amazon, Apple, Facebook, Google and Skype, before going on to consider the key risks they face and how telcos and their partners should deal with these highly-disruptive Internet giants.
As the global economy increasingly goes digital, these five companies are using the Internet to create global brands with much broader followings than those of the traditional telecoms elite, such as Vodafone, AT&T and Nokia. However, the five have markedly different business models that offer important insights into how to create world-beating companies in the digital economy:
Amazon: Amazon’s business-to-business Marketplace and Cloud offerings are text-book examples of how to repurpose assets and infrastructure developed to serve consumers to open up new upstream markets. As the digital economy goes mobile, Amazon’s highly-efficient two-sided commerce platform is enabling it to compete effectively with rivals that control the leading smartphone and tablet platforms – Apple and Google.
Apple: Apple has demonstrated that, with enough vision and staying power, an individual company can single-handedly build an entire ecosystem. By combining intuitive and very desirable products, with a highly-standardised platform for software developers, Apple has managed to create an overall customer experience that is significantly better than that offered by more open ecosystems. But Apple’s strategy depends heavily on it continuing to produce the very best devices on the market, which will be difficult to sustain over the long-term.
Facebook: A compelling example of how to build a business on network effects. It took Facebook four years of hard work to reach a tipping point of 100 million users, but the social networking service has been growing easily and rapidly ever since. Facebook has the potential to attract 1.4 billion users worldwide, but only if it continues to sidestep rising privacy concerns, consumer fatigue or a sudden shift to a more fashionable service.
Google: The search giant’s virtuous circle keeps on spinning to great effect – Google develops scores of free, and often-compelling, Internet services, software platforms and apps, which attract consumers and advertisers, enabling it to create yet more free services. But Google’s acquisition of Motorola Mobility risks destabilising the Android ecosystem on which a big chunk of its future growth depends.
Skype: Like Facebook and Google, Skype sought users first and revenues second. By creating a low-cost, yet feature-rich, product, Skype has attracted more than 660 million users and created sufficient strategic value to persuade Microsoft to hand over $8.5bn. Skype’s share of telephony traffic is rising inexorably, but Google and Apple may go to great lengths to prevent a Microsoft asset gaining a dominant position in peer-to-peer communications.
The strategic challenge
There is a clear and growing risk that consumers’ fixation on the products and services provided by the five leading disruptors could leave telcos providing commoditised connectivity and struggling to make a respectable return on their massive investment in network infrastructure and spectrum.
In developed countries, telcos’ longstanding cash-cows – mobile voice calls and SMS – are already being undermined by Internet-based alternatives offered by Skype, Google, Facebook and others. Competition from these services could see telcos lose as much as one third of their messaging and voice revenues within five years (see Figure 1) based on projections from our global survey, carried out in September 2011.
Figure 1 – The potential combined impact of the disruptors on telcos’ core services
Source: Telco 2.0 online survey, September 2011, 301 respondents
Moreover, most individual telcos lack the scale and the software savvy to compete effectively in other key emerging mobile Internet segments, such as local search, location-based services, digital content, apps distribution/retailing and social-networking.
The challenge for telecoms and media companies is to figure out how to deal with the Internet giants in a strategic manner that both protects their core revenues and enables them to expand into new markets. Realistically, that means a complex, and sometimes nuanced, co-opetition strategy, which we characterise as the “Great Game”.
In Figure 3 below, we’ve mapped the players’ roles and objectives against the markets they operate in, giving an indication of the potential market revenue at stake, and telcos’ generic strategies.
Figure 3 – The Great Game – Positions, Roles and Strategies
Our in-depth analysis, presented in this report, describes the ‘Great Game’ and the strategies that we recommend telcos and others can adopt in summary and in detail. [END OF FIRST EXTRACT]
Executive Summary [5 pages – including partial extract above]
Key Recommendations for telcos and others [20 pages]
Introduction [10 pages – including further extract below]
The report then contains c.50-page sections with detailed analysis of objectives, business model, strategy, and options for co-opetition for each of Amazon, Apple, Facebook, Google and Skype-Microsoft
Conclusions and recommendations [10 pages]
The report includes 124 charts and tables.
The rest of this page comprises an extract from the report’s introduction, covering the ‘new world order’, investor views, the impact of disruptors on telcos, and how telcos are currently fighting back (including pricing, RCS and WAC), and further details of the report’s contents.
The new world order
The onward march of the Internet into daily life, aided and abetted by the phenomenal demand for smartphones since the launch of the first iPhone in 2007, has created a new world order in the telecoms, media and technology (TMT) industry.
Apple, Google and Facebook are making their way to the top of that order, pushing aside some of the world’s biggest telcos, equipment makers and media companies. This trio, together with Amazon and Skype (soon to be a unit of Microsoft), are fundamentally changing consumers’ behaviour and dismantling longstanding TMT value chains, while opening up new markets and building new ecosystems.
Supported by hundreds of thousands of software developers, Apple, Google and Facebook’s platforms are fuelling innovation in consumer and, increasingly, business services on both the fixed and mobile Internet. Amazon has set the benchmark for online retailing and cloud computing services, while Skype is reinventing telephony, using IP technology to provide compelling new functionality and features, as well as low-cost calls.
On their current trajectory, these five companies are set to suck much of the value out of the telecoms services market, substituting relatively expensive and traditional voice and messaging services with low-cost, feature-rich alternatives and leaving telcos simply providing data connectivity. At the same time, Apple, Amazon, Google and Facebook have become major conduits for software applications, games, music and other digital content, rewriting the rules of engagement for the media industry.
In a Telco 2.0 online survey of industry executives conducted in September 2011, respondents said they expect Apple, Google, Facebook and Skype together to have a major impact on telcos’ voice and messaging revenues in the next three to five years. Although these declines will be partially compensated for by rising revenues from mobile data services, the respondents in the survey anticipate that telcos will see a major rise in data carriage costs (see Figure 1 – The potential combined impact of the disruptors on telcos’ core services).
In essence, we consider Amazon, Apple, Facebook, Google and Skype-Microsoft to be the most disruptive players in the TMT ecosystem right now and, to keep this report manageable, we have focused on these five giants. Still, we acknowledge that other companies, such as RIM, Twitter and Baidu, are also shaping consumers’ online behaviour and we will cover these players in more depth in future research.
The Internet is, of course, evolving rapidly and we fully expect new disruptors to emerge, taking advantage of the so-called Social, Local, Mobile (SoLoMo) forces sweeping through the TMT landscape. At the same time, the big five will surely disrupt each other. Google is increasingly in head-to-head competition with Facebook, as well as Microsoft, in the online advertising market, while squaring up to Apple and Microsoft in the smartphone platform segment. In the digital entertainment space, Amazon and Google are trying to challenge Apple’s supremacy, while also attacking the cloud services market.
Unlike telcos, the disruptors are generally growing quickly and are under little, or no, pressure from shareholders to pay dividends. That means they can accumulate large war chests and reinvest their profits in new staff, R&D, more data centres and acquisitions without any major constraints. Investors’ confidence and trust enables the disruptors to spend money freely, keep innovating and outflank dividend-paying telcos, media companies and telecoms equipment suppliers.
By contrast, investors generally don’t expect telcos to reinvest all their profits in their businesses, as they don’t believe telcos can earn a sufficiently high return on capital. Figure 16 shows the dividend yields of the leading telcos (marked in blue). Of the disruptors, only Microsoft (marked in green) pays a dividend to shareholders.
Figure 16: Investors expect dividends, not growth, from telcos
Source: Google Finance 2/9/2011
The top telcos’ turnover and net income is comparable, or superior, to that of the leading disruptors, but this isn’t reflected in their respective market capitalisations. AT&T’s turnover is approximately four times that of Google and its net income twice as great, yet the two companies’ market capitalisations are similar. Even accounting for their different capital structures, investors clearly expect Google to grow much faster than AT&T and syphon off more of the value in the TMT sector.
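As a quick sanity check on the multiples cited above, the ratios can be reproduced from approximate full-year 2010 figures. The dollar amounts below are our own rounded assumptions, not figures taken from the report:

```python
# Approximate full-year 2010 figures in $bn (illustrative assumptions,
# not sourced from the report itself).
att_revenue, att_net_income = 124.3, 19.9
goog_revenue, goog_net_income = 29.3, 8.5

revenue_multiple = att_revenue / goog_revenue        # ~4x, as stated above
income_multiple = att_net_income / goog_net_income   # ~2x, as stated above
```

On these assumptions the revenue multiple comes out at roughly 4.2x and the net-income multiple at roughly 2.3x, consistent with the "four times" and "twice as great" figures in the text.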
More broadly, the disparity between the leading disruptors’ and the leading telcos’ market capitalisations suggests that investors expect Apple, Microsoft and Google’s revenues and profits to keep rising, while they believe telcos’ will be stable or go into decline. Figure 17 shows how the market capitalisation of the disruptors (marked in green) compares with that of the most valuable telcos (marked in blue) at the beginning of September 2011.
Figure 17: Investors value the disruptors highly
Source: Google Finance 2/9/2011 (Facebook valued at $66bn based on IPG sale in August 2011)
Impact of disruptors on telcos
It has taken longer than many commentators expected, but Internet-based messaging and social networking services are finally eroding telcos’ SMS revenues in developed markets. KPN, for example, has admitted that smartphones, equipped with data communications apps (and Whatsapp in particular), are impacting the voice and SMS revenues of its consumer wireless business in its home market of the Netherlands (see Figure 18). Reporting its Q2 2011 results, KPN said that changing consumer behaviour cut its Dutch consumer wireless service revenues by 2% year-on-year.
Figure 18: KPN reveals falling SMS usage
Source: KPN Q2 results
In the second quarter, Vodafone also reported a fall in messaging revenue in Spain and southern Africa, while Orange saw its average revenue per user from data and SMS services fall in Poland.
How telcos are fighting back
Carefully-designed bundles are the most common tactic telcos are using to try to protect their voice and messaging business. Most postpaid monthly contracts now come with hundreds of SMS messages and voice minutes, along with a limited volume of data, bundled into the overall tariff package. This mix encourages consumers to keep using the telcos’ voice and SMS services, which they are paying for anyway, rather than having Skype or another VOIP service soak up their precious data allowance.
To further deter usage of VOIP services, KPN and some other telcos are also creating tiered data tariffs offering different throughput speeds. The lower-priced tariffs tend to have slow uplink speeds, making them unsuitable for VOIP (see Figure 19 below). If consumers want to use VOIP, they will need to purchase a higher-priced data tariff, earning the telco back the lost voice revenue.
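The mechanism described above can be sketched as a simple feasibility check: a VoIP call is only viable if the uplink can carry the codec bitrate plus packet overhead. The codec rate and overhead factor below are illustrative assumptions, not values from KPN's actual tariff documentation:

```python
# Rough check of whether a tariff tier's uplink speed can sustain a VoIP call.
# Codec bitrate and overhead allowance are illustrative assumptions.

def supports_voip(uplink_kbps: float, codec_kbps: float = 64.0,
                  overhead_factor: float = 1.4) -> bool:
    """True if the uplink leaves headroom for the codec payload plus
    RTP/UDP/IP packet overhead (overhead_factor is a rough allowance,
    not a measured figure)."""
    return uplink_kbps >= codec_kbps * overhead_factor
```

On these assumptions, a 64 kbps uplink tier cannot carry a G.711-class call once overhead is included, while a 512 kbps tier comfortably can, which is the pricing lever the text describes.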
Figure 19: How KPN is trying to defend its revenues
Source: KPN’s Q2 results presentation
Of course, such tactics can be undermined by competition – if one mobile operator in a market begins offering generous data-only tariffs, consumers may well gravitate towards that operator, forcing the others to adjust their tariff plans.
Moreover, bundling voice, SMS and data will generally only work for contract customers. Prepaid customers, who only want to pay for what they use, are naturally charged for each minute of calls they make and each message they send. These customers, therefore, have a stronger financial incentive to find a free WiFi network and use that to send messages via Facebook or make calls via Skype.
The Rich Communications Suite (RCS)
To fend off the threat posed by Skype, Facebook, Google and Apple’s multimedia communications services, telcos are also trying to improve their own voice and messaging offerings. Overseen by the GSMA, the mobile operator trade association, the Rich Communications Suite is a set of standards and protocols designed to enable mobile phones to exchange presence information, instant messages, live video footage and files across any mobile network.
In an echo of social networks, the GSMA says RCS will enable consumers to create their own personal community and share content in real time using their mobile device.
From a technical perspective, RCS uses the Session Initiation Protocol (SIP) to manage presence information and relay real-time information to the consumer about which service features they can use with a specific contact. The actual RCS services are carried over an IP-Multimedia Subsystem (IMS), which telcos are using to support a shift to all-IP fixed and mobile networks.
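To make the SIP presence mechanism concrete, the sketch below assembles a minimal SIP SUBSCRIBE request for the "presence" event package, which is how a client asks to be told about a contact's availability. This is a simplified illustration only: a real RCS client would use a full SIP stack and add Via, CSeq, Contact, authentication and route headers negotiated during IMS registration, and the addresses here are hypothetical.

```python
def build_presence_subscribe(watcher: str, target: str, call_id: str) -> str:
    """Assemble a minimal SIP SUBSCRIBE for the 'presence' event package.
    Simplified sketch: omits Via, CSeq, Contact and auth headers that a
    real IMS-registered client would include."""
    return "\r\n".join([
        f"SUBSCRIBE sip:{target} SIP/2.0",
        f"From: <sip:{watcher}>",
        f"To: <sip:{target}>",
        f"Call-ID: {call_id}",
        "Event: presence",               # presence event package
        "Accept: application/pidf+xml",  # presence state returned as PIDF XML
        "Expires: 3600",                 # subscription lifetime in seconds
        "Content-Length: 0",
        "", "",                          # blank line terminates the headers
    ])
```

The network then sends back NOTIFY messages carrying the contact's presence document, which is how the handset learns which RCS features it can use with that contact.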
Deutsche Telekom, Orange, Telecom Italia, Telefonica and Vodafone have publicly committed to deploy RCS services, indicating that the concept has momentum in Europe, in particular. The GSMA says that interoperable RCS services will initially be launched by these operators in Spain, Germany, France and Italy in late 2011 and 2012. [NB We’ll be discussing RCSe with some of the operators at our EMEA event in London in November 2011.]
In theory, at least, RCS will have some advantages over many of the communications services offered by the disruptors. Firstly, it will be interoperable across networks, so you’ll be able to reach people using different service providers. Secondly, the GSMA says RCS service features will be automatically available on mobile devices from late 2011 without the need to download and install software or create an account (by contrast, Apple’s iMessage service, for example, will only be installed on Apple devices).
But questions remain over whether RCS devices will arrive in commercial quantities fast enough, whether RCS services will be priced in an attractive way and will be packaged and marketed effectively. Moreover, it isn’t yet clear whether IMS will be able to handle the huge signalling load that would arise from widespread usage of RCS.
Internet messaging protocols, such as XMPP, require the data channel to remain active continuously. Tearing down and reconnecting generates lots of signalling traffic, but the alternative – maintaining a packet data session – will quickly drain the device’s battery.
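The signalling cost of that trade-off can be illustrated with back-of-envelope arithmetic. All figures below are illustrative assumptions rather than measured network data, but they show why operators worry about always-on messaging clients:

```python
# Illustrative estimate of the signalling load if a device drops and
# re-establishes its radio/data session for every messaging keepalive.
# Both constants are rough assumptions, not measured values.

KEEPALIVE_INTERVAL_MIN = 10         # XMPP-style keepalive every 10 minutes
SIGNALLING_MSGS_PER_RECONNECT = 30  # rough message count for session setup/teardown

keepalives_per_day = 24 * 60 // KEEPALIVE_INTERVAL_MIN
reconnect_signalling_per_day = keepalives_per_day * SIGNALLING_MSGS_PER_RECONNECT

# Holding the session open instead avoids this load, but keeps the radio
# out of its deepest sleep state and so drains the battery.
```

Even on these modest assumptions a single always-on client generates thousands of signalling messages per day if it reconnects for each keepalive, multiplied across every smartphone on the network.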
By 2012, Facebook and Skype may be even more entrenched than they are today and their fans may see no need to use telcos’ RCS services.
Some of the largest mobile operators have tried, and mostly failed, to take on the disruptors at their own game. Vodafone 360, for example, was Vodafone’s much-promoted but ultimately unsuccessful €500 million attempt to insert itself between its customers and social networking and messaging services from the likes of Facebook, Windows Live, Google and Twitter.
As well as aggregating contacts and feeds from several social networks, Vodafone 360 also served as a gateway to the telco’s app and music store. But most Vodafone customers didn’t appear to see the need to have an aggregator sit between them and their Facebook feed. During 2011, the service was stripped back to be just the app and music store. In essence, Vodafone 360 didn’t add enough value to what the disruptors are already offering. We understand, from discussions with executives at Vodafone, that the service is now being mothballed.
A small number of large telcos, mostly in emerging markets where smartphones are not yet commonplace, have successfully built up a portfolio of value-added consumer services that go far beyond voice and messaging. One of the best examples is China Mobile, which claims more than 82 million users for its Fetion instant messaging service, for example (see Figure 20 – China Mobile’s Internet Services).
Figure 20 – China Mobile’s Internet Services
Source: China Mobile’s Q2 2011 results
However, it remains to be seen whether China Mobile will be able to continue to attract so many customers for its (mostly paid-for) Internet services once smartphones with full web access go mass-market in China, making it easier for consumers to access third-parties’ services, such as the popular QQ social network.
Some telcos have tried to compete with the disruptors by buying innovative start-ups. A good example is Telefonica’s acquisition of VOIP provider Jajah for US$207 million in January 2010. Telefonica has since used Jajah’s systems and expertise to launch low-cost international calling services in competition with Skype and companies offering calling cards. Telefonica expects Jajah’s products to generate $280 million of revenue in 2011, primarily from low-cost international calls offered by its German and UK mobile businesses, according to a report in the FT.
The Wholesale Applications Community (WAC)
Concerned about their growing dependence on the leading smartphone platforms, such as Android and Apple’s iOS, many of the world’s leading telcos have banded together to form the Wholesale Applications Community (WAC).
WAC’s goal is to create a platform developers can use to create apps that will run across different device operating systems, while tapping the capabilities of telcos’ networks and messaging and billing systems.
At the Mobile World Congress in February 2011, WAC said that China Mobile, MTS, Orange, Smart, Telefónica, Telenor, Verizon and Vodafone are “connected to the WAC platform”, while adding that Samsung and LG will ensure “that all devices produced by the two companies that are capable of supporting the WAC runtime will do so.”
It also announced the availability of the WAC 2.0 specification, which supports HTML5 web applications, while WAC 3.0, which is designed to enable developers to tap network assets, such as in-app billing and user authentication, is scheduled to be available in September 2011.
Ericsson, the leading supplier of mobile networks, is a particularly active supporter of WAC, which also counts leading vendors Alcatel-Lucent, Huawei, LG Electronics, Qualcomm, Research in Motion, Samsung and ZTE among its members.
In theory, at least, apps developers should also throw their weight behind WAC, which promises the so far unrealised dream of “write once, run anywhere.” But, in reality, games developers, in particular, will probably still want to build specific apps for specific platforms, to give their software a performance and functionality edge over rivals.
Still, the ultimate success or failure of WAC will likely depend on how enthusiastically Apple and Google, in particular, embrace HTML5 and actively support it in their respective smartphone platforms. We discuss this question further in the Apple and Google chapters of this report.
Summarising current telcos’ response to disruptors
Telcos, and their close allies in the equipment market, are clearly alert to the threat posed by the major disruptors, but they have yet to develop a comprehensive game plan that will enable them to protect their voice and messaging revenue, while expanding into new markets.
Collective activities, such as RCS and WAC, are certainly necessary and worthwhile, but are not enough. Telcos, and companies across the broader TMT ecosystem, need to also adapt their individual strategies to the rise of Amazon, Apple, Facebook, Google and Skype-Microsoft. This report is designed to help them do that.
Summary: our in-depth look at the UK’s highly competitive digital TV market which reflects many global trends, such as competition between different types of content distributor (LoveFilm, YouTube, Virgin Media, BBC, BSkyB, BT, etc.), channel proliferation, new devices used for viewing, and the increasing prevalence of connected TVs. What are the key trends and who will be the winners and losers? (August 2011, Executive Briefing Service)
With every wave of innovation, there are always winners and losers. In this note we examine who the winners and losers are likely to be in the UK as TVs increasingly become connected to the internet.
It is difficult to generalise about TV markets across the globe: they differ fundamentally in structure, with key variables such as PayTV penetration, state broadcaster involvement and fast broadband penetration varying widely. Nonetheless, the comprehensive range of players and the highly competitive nature of the UK market make it a useful benchmark for many key global trends.
The UK TV Market
According to OFCOM’s latest research, there are 26.6m TV households in the UK with 60m TV sets or an average of 2.25 TV sets per household.
Figure 1 – Average UK TV Viewing Per Day
TV viewing over the last few years has been remarkably resilient despite the internet and other platforms competing hard for attention. Where the TV market differs is that average consumption rises strongly with age, whereas in typical technology adoption cycles, adoption is inversely proportional to age. This presents a real challenge to the connected TV market: the main TV consumers are more than likely to be averse to technological change.
TV Device Manufacturers
Figure 2 – Annual UK TV Set Sales by Type 2002-2010
The long-term volume trend for TV manufacturers has been healthy. This has mainly been due to innovation in device form and screen quality, with flat screens and HD features becoming the norm. TV manufacturers are now hoping that internet-connected TVs will generate another spurt in growth. Samsung and Sony are the UK market leaders.
But the challenge is the replacement cycle. With a 60m installed base of TVs in the UK, and assuming that all of the 9.5m TVs sold in 2010 were replacements rather than additional sets per household, the implication is that the replacement cycle is currently at least roughly six years. This is slow compared to around two years for mobile phones and three years for laptops, which in turn suggests that the adoption rate for standalone connected TVs will be much slower than the technology cycles for those devices.
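The arithmetic behind these figures can be reproduced directly from the numbers quoted in this section (all inputs are the report's own rounded figures, so the outputs are approximate):

```python
# Reproducing the installed-base arithmetic from the figures quoted above.
installed_base_m = 60.0   # UK installed base of TV sets (millions)
annual_sales_m = 9.5      # TV sets sold in 2010 (millions)
households_m = 26.6       # UK TV households (millions)

# ~2.25 sets per household once the rounded inputs are allowed for
sets_per_household = installed_base_m / households_m

# ~6.3 years if every 2010 sale was a replacement
replacement_cycle_years = installed_base_m / annual_sales_m
```

The six-year-plus replacement cycle is the key output: it caps how quickly connected-TV features can reach the installed base, however standard they become on new sets.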
While we expect internet connectivity to become a standard feature of TVs over the next couple of years, we are sceptical about its active use for viewing video. The content offering is currently too limited. We would be surprised if, within a couple of years, more than 1m homes were regularly using their TVs to watch video over the internet.
Set Top Boxes
The set top box market in the UK falls into two categories: a subsidised segment, in which the consumer generally gets the box either free or heavily discounted from their PayTV provider; and a retail segment, in which the consumer generally pays full price for a box that gives access to a limited set of free to air (FTA) channels and, quite often, DVR features.
In the subsidised segment, the market leader is Sky, which currently manufactures its own boxes. All the current models contain internet connectivity but require a subscription to Sky’s broadband service to access Sky’s closed pull VOD service, Sky Anytime+. Sky has seeded the market for quite a few years with its Sky+ HD boxes, which are currently in a minimum of 3,822k UK homes. We say a minimum because the figure is for homes with a HD subscription, and Sky also installs HD boxes in homes that do not subscribe to HD. This market-seeding strategy accounts for the high initial take-up of the Sky Anytime+ service: 800k in the first quarter after launch. Since the service is effectively free, or rather bundled into the Sky TV and broadband prices, we expect rapid take-up, and within a couple of years Sky will have over 4m homes with their main TV connected to the internet.
Virgin Media has chosen TiVo as its exclusive connected set top box provider. The TiVo box is more open than the Sky box, with the future promise of allowing independent Flash developers to deploy applications. TiVo is off to a steady start, with around 50k homes in the first quarter of 2011. We expect TiVo adoption to be slower than Sky’s because of the need for a new box, priced at £50 with an ongoing service fee of £3/month. We expect these prices to fall over time, but can still envisage take-up of over 2m homes within two to three years, assuming effective promotion by Virgin Media.
Another interesting opportunity is the launch of YouView. YouView is expected to come in two flavours: subsidised by CSPs, and retail. BT and TalkTalk are shareholders, and are committed to launching YouView boxes from Pace and Huawei respectively in time for the London Olympics in 2012. Humax is committed to launching retail boxes. It is too early to properly forecast demand for YouView, as neither the pricing nor the applications have yet been revealed. However, we struggle to see an installed base of over 1m homes, even with the large base of broadband connections that BT and TalkTalk can market the product to. All the original BT Vision set top boxes were manufactured by Pace (through their purchase of Philips) and need to be connected to BT broadband, so all of the 575k subscribers count as connected TVs. We expect BT to replace these BT Vision boxes with YouView boxes over time.
The major problem for YouView is that it is a proprietary UK standard, whereas other European countries are committing to the hbbTV standard. This places other set top box makers in something of a quandary – will the UK market be large enough to support product development costs? Sony, Technicolour and Cisco have already publicly stated that they have no current plans to develop a YouView box.
Other commentators express confidence in Bluray players providing TV connectivity. We are bearish on Bluray players and think the market will be niche at best.
Around half of UK homes contain a games console. The market is dominated by Microsoft, Sony and Nintendo, and a growing number of consoles are connected to the internet, primarily for online gaming, but also for watching video content, either over the internet or through playback of physical media such as DVD or Bluray.
Figure 4 – What UK Consumers use games consoles for
Source: Ofcom residential tracker, w1 2011. Base: all adults 16+ with access to a games console at home (1,793).
We expect games consoles to become the most important method for secondary TV sets to connect to the internet, especially in children’s bedrooms. As more and more gaming moves online, we can easily see 75% of games consoles regularly connecting to the internet (c. 10m). However, the proportion used regularly for viewing video will remain small, perhaps as low as 20%. This means that, although important, games consoles will be secondary to set top boxes for watching video over the internet.
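The "c. 10m" figure follows from combining the household numbers quoted earlier in this section with the assumptions in the paragraph above (half of homes owning a console, 75% of those consoles connecting regularly):

```python
# Reconstructing the "c. 10m" connected-console estimate from the
# section's own figures and stated assumptions.
tv_households_m = 26.6                  # UK TV households (millions)
console_penetration = 0.5               # "around half of UK homes"
regular_connection_share = 0.75         # assumption stated in the text

console_homes_m = tv_households_m * console_penetration   # ~13.3m
connected_consoles_m = console_homes_m * regular_connection_share  # ~10m
```

This treats one console per console-owning home, which is a simplification; multi-console households would push the figure somewhat higher.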
To read the note in full, including additional analysis on…
Communications Service Providers (CSPs)
‘Mainstream’ TV Channels
New Entrants and Online Players
Google – YouTube
…and the following charts…
Figure 1 – Average UK TV Viewing Per Day
Figure 2 – Annual UK TV Set Sales by Type 2002-2010
Figure 3 – Gaming Console Household Penetration
Figure 4 – What UK Consumers use games consoles for
Figure 5 – Main UK CSPs – Broadband and TV reach
Figure 6 – Take-up of multichannel TV on main sets
Figure 7 – Video on demand use in Virgin Media Homes
Figure 8 – Total UK TV Revenue by Sector
Figure 9 – UK TV Channel shares in all homes 1983-2010
Figure 10 – UK Online TV revenues by type of service
Figure 11 – Unique audiences to selected online film and TV sites
Figure 12 – Unique audiences to selected video-sharing sites
Figure 13 – Forecast of Connected TV market by device in 2013
Figure 14 – Table summarising strategy and winners/losers by type
…Members of the Telco 2.0 Executive Briefing Subscription Service can download the full 23 page report in PDF format here. Non-Members, please see here for how to subscribe, here to buy a single user license for £595 (+VAT), or for multi-user licenses and any other enquiries please email firstname.lastname@example.org or call +44 (0) 207 247 5003.
Technologies and industry terms referenced: Bluray, catch-up TV, Connected TV, Digital Terrestrial, DVD, DVR, flat screen, free to air, Games Consoles, hbbTV, HD, IPTV, online, PayTV, regulatory relief, replacement cycles, Set Top Box, Tablets, Video, Video on demand (VOD).
Summary: Content Delivery Networks (CDNs) are becoming familiar in the fixed broadband world as a means to improve the experience and reduce the costs of delivering bulky data like online video to end-users. Is there now a compelling need for their mobile equivalents, and if so, should operators partner with existing players or build / buy their own? (August 2011, Executive Briefing Service, Future of the Networks Stream).
Below is an extract from this 25 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and Future Networks Stream here. Non-members can buy a Single User license for this report online here for £595 (+VAT) or subscribe here. For multiple user licenses, or to find out about interactive strategy workshops on this topic, please email email@example.com or call +44 (0) 207 247 5003.
To share this article easily, please click:
As is widely documented, mobile networks are witnessing huge growth in the volumes of 3G/4G data traffic, primarily from laptops, smartphones and tablets. While Telco 2.0 is wary of some of the headline shock-statistics about forecast “exponential” growth, or “data tsunamis” driven by ravenous consumption of video applications, there is certainly a fast-growing appetite for use of mobile broadband.
That said, many of the actual problems of congestion today can be pinpointed either to a handful of busy cells at peak hour – or, often, the inability of the network to deal with the signalling load from chatty applications or “aggressive” devices, rather than the “tonnage” of traffic. Another large trend in mobile data is the use of transient, individual-centric flows from specific apps or communications tools such as social networking and messaging.
But “tonnage” is not completely irrelevant. Despite the diversity, there is still an inexorable rise in the use of mobile devices for “big chunks” of data, especially the special class of software commonly known as “content” – typically popular/curated standalone video clips or programmes, or streamed music. Images (especially those in web pages) and application files such as software updates fit into a similar group – sizeable lumps of data downloaded by many individuals across the operator’s network.
This one-to-many nature of most types of bulk content highlights inefficiencies in the way mobile networks operate. The same data chunks are downloaded time and again by users, typically going all the way from the public Internet, through the operator’s core network, eventually to the end user. Everyone loses in this scenario – the content publisher needs huge servers to dish up each download individually. The operator has to deal with transport and backhaul load from repeatedly sending the same content across its network (and IP transit from shipping it in from outside, especially over international links). Finally, the user has to deal with all the unpredictability and performance compromises involved in accessing the traffic across multiple intervening points – and ends up paying extra to support the operator’s heavier cost base.
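The scale of that inefficiency is easy to quantify. The back-of-envelope sketch below uses invented numbers (audience size, file size) purely for illustration: without a cache, every request for the same content crosses IP transit and the core; with an on-net copy, transit carries it once.

```python
# Illustrative (invented) numbers: how repeated downloads of the same
# content multiply transit and core-network load.
users = 100_000          # subscribers requesting the same video
size_mb = 50             # size of the video in MB

# Without caching, every request crosses IP transit and the core network.
transit_without = users * size_mb        # 5,000,000 MB (~5 TB)

# With an on-net cache, the content crosses transit once;
# only the last mile still carries one copy per user.
transit_with = size_mb                   # 50 MB

saving = 1 - transit_with / transit_without
print(f"Transit volume without cache: {transit_without:,} MB")
print(f"Transit saving from caching:  {saving:.3%}")
```

The last-mile load is unchanged in both cases; the saving is entirely in the transit and core segments, which is where the repeated-download cost sits.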
In the fixed broadband world, many content companies have availed themselves of a group of specialist intermediaries called CDNs (content delivery networks). These firms on-board large volumes of the most important content served across the Internet, before dropping it “locally” as near to the end user as possible – if possible, served up from cached (pre-saved) copies. Often, the CDN operating companies have struck deals with the end-user facing ISPs, which have often been keen to host their servers in-house, as they have been able to reduce their IP interconnection costs and deliver better user experience to their customers.
In the mobile industry, the use of CDNs is much less mature. Until relatively recently, the overall volumes of data didn’t really move the needle from the point of view of content firms, while operators’ radio-centric cost bases were also relatively immune from those issues as well. Optimising the “middle mile” for mobile data transport efficiency seemed far less of a concern than getting networks built out and handsets and apps perfected, or setting up policy and charging systems to parcel up broadband into tiered plans. Arguably, better-flowing data paths and video streams would only load the radio more heavily, just at a time when operators were having to compress video to limit congestion.
This is now changing significantly. With the rise in smartphone usage – and the expectations around tablets – Internet-based CDNs are pushing much more heavily to have their servers placed inside mobile networks. This is leading to a certain amount of introspection among the operators – do they really want to have Internet companies’ infrastructure inside their own networks, or could this be seen more as a Trojan Horse of some sort, simply accelerating the shift of content sales and delivery towards OTT-style models? Might it not be easier for operators to build internal CDN-type functions instead?
Some of the earlier approaches to video traffic management – especially so-called “optimisation” without the content companies’ permission or involvement – are becoming trickier with new video formats and more scrutiny from a Net Neutrality standpoint. But CDNs by definition involve the publishers, so any necessary compression or other processing can potentially be applied collaboratively, rather than “transparently” and without the publisher’s cooperation.
At the same time, many of the operators’ usual vendors are seeing this transition point as a chance to differentiate their new IP core network offerings, typically combining CDN capability into their routing/switching platforms, often alongside the optimisation functions as well. In common with other recent innovations from network equipment suppliers, there is a dangled promise of Telco 2.0-style revenues that could be derived from “upstream” players. In this case, there is a bit more easily-proved potential, since this would involve direct substitution of the existing revenues already derived from content companies, by the Internet CDN players such as Akamai and Limelight. This also holds the possibility of setting up a two-sided, content-charging business model that fits OK with rules on Net Neutrality – there are few complaints about existing CDNs except from ultra-purist Neutralists.
On the other hand, telco-owned CDNs have existed in the fixed broadband world for some time, with largely indifferent levels of success and adoption. There needs to be a very good reason for content companies to choose to deal with multiple national telcos, rather than simply take the easy route and choose a single global CDN provider.
So, the big question for telcos around CDNs at the moment is “should I build my own, or should I just permit Akamai and others to continue deploying servers into my network?” Linked to that question is what type of CDN operation an operator might choose to run in-house.
There are four main reasons why a mobile operator might want to build its own CDN:
To lower costs of network operation or upgrade, especially in radio network and backhaul, but also through the core and in IP transit.
To improve the user experience of video, web or applications, either in terms of data throughput or latency.
To derive incremental revenue from content or application providers.
For wider strategic or philosophical reasons about “keeping control over the content/apps value chain”
This Analyst Note explores these issues in more details, first giving some relevant contextual information on how CDNs work, especially in mobile.
What is a CDN?
The traditional model for Internet-based content access is straightforward – the user’s browser requests a piece of data (image, video, file or whatever) from a server, which then sends it back across the network, via a series of “hops” between different network nodes. The content typically crosses the boundaries between multiple service providers’ domains, before finally arriving at the user’s access provider’s network, flowing down over the fixed or mobile “last mile” to their device. In a mobile network, that also typically involves transiting the operator’s core network first, which has a variety of infrastructure (network elements) to control and charge for it.
A Content Delivery Network (CDN) is a system for serving Internet content from servers which are located “closer” to the end user either physically, or in terms of the network topology (number of hops). This can result in faster response times, higher overall performance, and potentially lower costs to all concerned.
In most cases in the past, CDNs have been run by specialist third-party providers, such as Akamai and Limelight. This document also considers the role of telcos running their own “on-net” CDNs.
CDNs can be thought of as analogous to the distribution of bulky physical goods – it would be inefficient for a manufacturer to ship all products to customers individually from a single huge central warehouse. Instead, it will set up regional logistics centres that can be more responsive – and, if appropriate, tailor the products or packaging to the needs of specific local markets.
As an example, there might be a million requests for a particular video stream from the BBC. Without using a CDN, the BBC would have to provide sufficient server capacity and bandwidth to handle them all. The company’s immediate downstream ISPs would have to carry this traffic to the Internet backbone, the backbone itself has to carry it, and finally the requesters’ ISPs’ access networks have to deliver it to the end-points. From a media-industry viewpoint, the source network (in this case the BBC) is generally called the “content network” or “hosting network”; the destination is termed an “eyeball network”.
In a CDN scenario, all the data for the video stream has to be transferred across the Internet just once for each participating network, when it is deployed to the downstream CDN servers and stored. After this point, it is only carried over the user-facing eyeball networks, not any others via the public Internet. This also means that the CDN servers may be located strategically within the eyeball networks, in order to use its resources more efficiently. For example, the eyeball network could place the CDN server on the downstream side of its most expensive link, so as to avoid carrying the video over it multiple times. In a mobile context, CDN servers could be used to avoid pushing large volumes of data through expensive core-network nodes repeatedly.
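The mechanics of that “transfer once, serve many times” behaviour can be sketched in a few lines. This is a minimal illustration with invented names, not a real CDN API: the first request for each object crosses the transit link to the origin; every subsequent request is served from inside the eyeball network.

```python
# Minimal sketch of an on-net edge cache (invented names, not a real CDN
# product): the first request for each object is fetched from the origin
# over transit; subsequent requests are served locally.

class EdgeCache:
    def __init__(self):
        self.store = {}
        self.origin_fetches = 0

    def get(self, url, fetch_from_origin):
        if url not in self.store:            # cache miss: cross transit once
            self.store[url] = fetch_from_origin(url)
            self.origin_fetches += 1
        return self.store[url]               # cache hit: served on-net

cache = EdgeCache()
origin = lambda url: f"<video bytes for {url}>"

# A million users requesting the same stream cost one origin fetch,
# not a million.
for _ in range(1_000_000):
    cache.get("bbc/video/123", origin)

print(cache.origin_fetches)   # 1
```

Real CDNs add eviction policies, revalidation and request routing on top, but the economics rest on exactly this asymmetry between the first fetch and the rest.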
When the video or other content is loaded into the CDN, other optimisations such as compression or transcoding into other formats can be applied if desired. There may also be various treatments relating to new forms of delivery such as HTTP streaming, where the video is broken up into “chunks” with several different sizes/resolutions. Collectively, these upfront processes are called “ingestion”.
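The chunking side of ingestion can be illustrated with a simple sketch. This is not a real packager – the function name and figures are invented – but it shows the principle: the source is transcoded into several bitrate renditions, each cut into fixed-duration chunks the player can switch between as conditions change.

```python
# Illustrative sketch of the "ingestion" step for HTTP streaming (invented
# function, not a real packager): transcode into several bitrates, then
# split each rendition into fixed-duration chunks.

def ingest(duration_s, bitrates_kbps, chunk_s=10):
    """Return a manifest: a list of (start_time_s, size_kB) chunks per bitrate."""
    manifest = {}
    for kbps in bitrates_kbps:
        chunk_size_kB = kbps * chunk_s / 8       # kilobits/s -> kilobytes per chunk
        manifest[kbps] = [(t, chunk_size_kB)
                          for t in range(0, duration_s, chunk_s)]
    return manifest

m = ingest(duration_s=60, bitrates_kbps=[400, 1200, 3000])
print(len(m[1200]))     # 6 chunks of 10 s each
print(m[3000][0])       # (0, 3750.0) -> first chunk, ~3.75 MB at 3 Mbit/s
```

Because each chunk exists at several sizes, the player (or the network) can step down to a lighter rendition mid-stream rather than stalling – which is what makes chunked delivery attractive over variable-quality mobile links.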
Figure 1 – Content delivery with and without a CDN
Source: STL Partners / Telco 2.0
Value-added CDN services
It is important to recognise that the fixed-centric CDN business has increased massively in richness and competition over time. Although some of the players have very clever architectures and IPR in the forms of their algorithms and software techniques, the flexibility of modern IP networks has tended to erode away some of the early advantages and margins. Shipping large volumes of content is now starting to become secondary to the provision of associated value-added functions and capabilities around that data. Additional services include:
Analytics and reporting
Content ingestion and management
Website security management
Consulting and professional services
It is no coincidence that the market leader, Akamai, now refers to itself as “provider of cloud optimisation services” in its financial statements, rather than a CDN, with its business being driven by “trends in cloud computing, Internet security, mobile connectivity, and the proliferation of online video”. In particular, it has started refocusing away from dealing with “video tonnage”, and towards application acceleration – for example, speeding up the load times of e-commerce sites, which has a measurable impact on abandonment of purchasing visits. Akamai’s total revenues in 2010 were around $1bn, less than half of which came from “media and entertainment” – the traditional “content industries”. Its H1 2011 revenues were relatively disappointing, with growth coming from non-traditional markets such as enterprise and high-tech (eg software update delivery) rather than media.
This is a critically important consideration for operators that are looking to CDNs to provide them with sizeable uplifts in revenue from upstream customers. Telcos – especially in mobile – will need to invest in various additional capabilities as well as the “headline” video traffic management aspects of the system. They will need to optimise for network latency as well as throughput, for example – which will probably not have the cost-saving impacts expected from managing “data tonnage” more effectively.
Although in theory telcos’ other assets should help – for example mapping download analytics to more generalised customer data – this is likely to involve extra complexity with the IT side of the business. There will also be additional efforts around sales and marketing that go significantly beyond most mobile operators’ normal footprint into B2B business areas. There is also a risk that an analysis of bottlenecks for application delivery / acceleration ends up simply pointing the finger of blame at the network’s inadequacies in terms of coverage. Improving delivery speed, cost or latency is only valuable to an upstream customer if there is a reasonable likelihood of the end-user actually having connectivity in the first place.
Figure 2: Value-added CDN capabilities
An increasingly important aspect of CDNs is their move beyond content/media distribution into a much wider area of “acceleration” and “cloud enablement”. As well as delivering large pieces of data efficiently (e.g. video), there is arguably more tangible value in delivering small pieces of data fast.
There are various manifestations of this, but a couple of good examples illustrate the general principles:
Many web transactions are abandoned because websites (or apps) seem “slow”. Few people would trust an airline’s e-commerce site, or a bank’s online interface, if they’ve had to wait impatiently for images and page elements to load, perhaps repeatedly hitting “refresh” on their browsers. Abandoned transactions can be directly linked to slow or unreliable response times – typically a function of congestion either at the server or various mid-way points in the connection. CDN-style hosting can accelerate the service measurably, leading to increased customer satisfaction and lower levels of abandonment.
Enterprise adoption of cloud computing is becoming exceptionally important, with both cost savings and performance enhancements promised by vendors. Sometimes, such platforms will involve hybrid clouds – a mixture of private (Internal) and public (Internet) resources and connectivity. Where corporates are reliant on public Internet connectivity, they may well want to ensure as fast and reliable service as possible, especially in terms of round-trip latency. Many IT applications are designed to be run on ultra-fast company private networks, with a lot of “hand-shaking” between the user’s PC and the server. This process is very latency-dependent, and especially as companies also mobilise their applications the additional overhead time in cellular networks may otherwise cause significant problems.
Hosting applications at CDN-type cloud acceleration providers achieves much the same effect as for video – they can bring the application “closer”, with fewer hops between the origin server and the consumer. Additionally, the CDN is well-placed to offer additional value-adds such as firewalling and protection against denial-of-service attacks.
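The latency argument above is worth putting numbers on. The figures below are assumed, round-number illustrations rather than measurements, but they show why a chatty LAN-era application degrades so badly over cellular, and why moving the server “closer” helps more than adding bandwidth.

```python
# Back-of-envelope sketch (assumed, illustrative figures) of why round-trip
# latency, not bandwidth, dominates "chatty" application performance.

def transaction_time_ms(round_trips, rtt_ms):
    # Total time is simply the number of hand-shakes times the round trip.
    return round_trips * rtt_ms

chatty_app = 40   # hand-shakes per transaction, typical of LAN-era IT apps

scenarios = [
    ("corporate LAN",                        1),
    ("fixed broadband to distant origin",   80),
    ("3G cellular to distant origin",      200),
    ("3G cellular to nearby CDN node",      60),
]
for label, rtt in scenarios:
    print(f"{label:36s} {transaction_time_ms(chatty_app, rtt):>6} ms")
```

On these assumptions, the same transaction takes 8 seconds over cellular to a distant origin but under 2.5 seconds to a nearby acceleration node – a difference users feel directly, independent of throughput.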
To read the 25 page note in full, including the following additional content…
How do CDNs fit with mobile networks?
Internet CDNs vs. operator CDNs
Why use an operator CDN?
Should delivery mean delivery?
Lessons from fixed operator CDNs
Mobile video: CDNs, offload & optimisation
CDNs, optimisation, proxies and DPI
The role of OVPs
Implementation and planning issues
Conclusion & recommendations
… and the following additional charts…
Figure 3 – Potential locations for CDN caches and nodes
Figure 4 – Distributed on-net CDNs can offer significant data transport savings
Figure 5 – The role of OVPs for different types of CDN player
Figure 6 – Summary of Risk / Benefits of Centralised vs. Distributed and ‘Off Net’ vs. ‘On-Net’ CDN Strategies
……Members of the Telco 2.0 Executive Briefing Subscription Service and Future Networks Stream can download the full 25 page report in PDF format here. Non-Members, please see here for how to subscribe, here to buy a single user license for £595 (+VAT), or for multi-user licenses and any other enquiries please email firstname.lastname@example.org or call +44 (0) 207 247 5003.
NB A full PDF copy of this briefing can be downloaded here.
This special Executive Briefing report summarises the brainstorming output from the Content Distribution 2.0 (Broadband Video) section of the 6th Telco 2.0 Executive Brainstorm, held on 6-7 May in Nice, France, with over 200 senior participants from across the Telecoms, Media and Technology sectors. See: www.telco2.net/event/may2009.
It forms part of our effort to stimulate a structured, ongoing debate within the context of our ‘Telco 2.0′ business model framework (see www.telco2research.com).
Each section of the Executive Brainstorm involved short stimulus presentations from leading figures in the industry, group brainstorming using our ‘Mindshare’ interactive technology and method, a panel discussion, and a vote on the best industry strategy for moving forward.
There are 5 other reports in this post-event series, covering the other sections of the event: Retail Services 2.0, Enterprise Services 2.0, Piloting 2.0, Technical Architecture 2.0, and APIs 2.0. In addition there will be an overall ‘Executive Summary’ report highlighting the overall messages from the event.
Each report contains:
Our independent summary of some of the key points from the stimulus presentations
An analysis of the brainstorming output, including a large selection of verbatim comments
The ‘next steps’ vote by the participants
Our conclusions of the key lessons learnt and our suggestions for industry next steps.
The brainstorm method generated many questions in real-time. Some were covered at the event itself and others we have responded to in each report. In addition we have asked the presenters and other experts to respond to some more specific points.
Background to this report
The demand for internet video is exploding. This is putting significant stress on the current fixed and mobile distribution business model. Infrastructure investments and operating costs required to meet demand are growing faster than revenues. The strategic choices facing operators are to charge consumers more when they expect to pay less, to risk upsetting content providers and users by throttling bandwidth, or to unlock new revenues to support investment and cover operating costs by creating new valuable digital distribution services for the video content industry.
A summary of the new Telco 2.0 Online Video Market Study: Options and Opportunities for Distributors in a time of massive disruption.
What are the most valuable new digital distribution services that telcos could create?
What is the business model for these services – who are the potential buyers and what are the priority opportunity areas?
What progress has been made in new business models for video distribution – including FTTH deployment, content-delivery networking, and P2P?
Preliminary results of the UK cross-carrier trial of sender-pays data
How the TM Forum’s IPSphere programme can support video distribution
Stimulus Presenters and Panellists
Richard D. Titus, Controller, Future Media, BBC
Trudy Norris-Grey, MD Transformation and Strategy, BT Wholesale
Scott Shoaf, Director, Strategy and Planning, Juniper Networks
Ibrahim Gedeon, CTO, Telus
Andrew Bud, Chairman, Mobile Entertainment Forum
Alan Patrick, Associate, Telco 2.0 Initiative
Simon Torrance, CEO, Telco 2.0 Initiative
Chris Barraclough, Managing Director, Telco 2.0 Initiative
Dean Bubley, Senior Associate, Telco 2.0 Initiative
Alex Harrowell, Analyst, Telco 2.0 Initiative
Stimulus Presentation Summaries
Content Distribution 2.0
Scott Shoaf, Director, Strategy and Planning, Juniper Networks opened the session with a comparison of the telecoms industry’s response to massive volumes of video and that of the US cable operators. He pointed out that the cable companies’ raison d’être was to deliver vast amounts of video; therefore their experience should be worth something.
The first question, however, was to define the problem. Was the problem the customer, in which case the answer would be to meter, throttle, and cap bandwidth usage? If we decided this was the solution, though, the industry would be in the position of selling broadband connections and then trying to discourage its customers from using them!
Or was the problem not one of cost, but one of revenue? Networks cost money; the cloud is not actually a cloud, but is made up of cables, trenches, data centres and machines. Surely there wouldn’t be a problem if revenues rose with higher usage? In that case, we ought to be looking at usage-based pricing, but also at alternative business models – like advertising and the two-sided business model.
Or is it an engineering problem? It’s not theoretically impossible to put in bigger pipes until all the HD video from everyone can reach everyone else without contention – but in practice there is always some degree of oversubscription. What if we focused on specific sources of content? Define a standard of user experience, train the users to that, and work backwards?
If it is an engineering problem, the first step is to reduce the problem set. The long tail obviously isn’t the problem; it’s too long, as has been pointed out, and doesn’t account for very much traffic. It’s the ‘big head’ or ‘short tail’ stuff that is the heart of the problem: we need to deal with this short tail of big traffic generators. We need a CDN or something similar to deliver for this.
On cable, the customers are paying for premium content – essentially movies and TV – and the content providers are paying for distribution. We need to escape from the strict distinctions between Internet, IPTV, and broadcast. After all, despite the alarming figures for people leaving cable, many of them are leaving existing cable connections to take a higher grade of service. Consider Comcast’s Fancast – focused on users, not lines, with an integrated social-recommendation system, it integrates traditional cable with subscription video. Remember that broadcast is a really great way to deliver!
Advertising – at the moment, content owners are getting 90% of the ad money.
Getting away from this requires us to standardise the technology and the operational and commercial practices involved. The cable industry is facing this with the SCTE130 and Advanced Advertising 1.0 standards, which provide for fine-grained ad insertion and reporting. We need to blur the definition of TV advertising – the market is much bigger if you include Internet and TV ads together. Further, 20,000 subscribers to IPTV aren’t interesting to anyone – we need to attack this across the industry and learn how to treat the customer as an asset.
The Future of Online Video, 6 months on
Alan Patrick, Associate, Telco 2.0 updated the conference on how things had changed since he introduced the “Pirate World” concept from our Online Video Distribution strategy report at the last Telco 2.0 event. The Pirate World scenario, he said, had set in much faster and more intensely than we had expected, and was working in synergy with the economic crisis.
Richard Titus, Controller, Future Media, BBC: ”I have no problem with carriers making money, in fact, I pay over the odds for a 50Mbits link, but the real difference is between a model that creates opportunities for the public and one which constrains them.”
Ad revenues were falling; video traffic still soaring; rights-holders’ reaction had been even more aggressive than we had expected, but there was little evidence that it was doing any good. Entire categories of content were in crisis.
On the other hand, the first stirrings of the eventual “New Players Emerge” scenario were also observable; note the success of Apple in creating a complete, integrated content distribution and application development ecosystem around its mobile devices.
The importance of CPE is only increasing; especially with the proliferation of devices capable of media playback (or recording) and interacting with Internet resources. There’s a need for a secure gateway to help manage all the gadgets and deliver content efficiently. Similarly, CDNs are only becoming more central – there is no shortage of bandwidth, but only various bottlenecks. It’s possible that this layer of the industry may become a copyright policing point.
We think new forms of CPE and CDNs are happening now; efforts to police copyright in the network are in the near future; VAS platforms are the next wave after that, and then customer data will become a major line of business.
Most of all, time is flying by, and the overleveraged, or undercapitalised, are being eaten first.
The Content Delivery Framework
Ibrahim Gedeon, CTO, Telus introduced some lessons from Telus’s experience deploying both on-demand bandwidth and developer APIs. Telcos aren’t good at content, he said; instead, we need to be the smartest pipe and make use of our trusted relationship with customers, built up over the last 150 years.
We’re working in an environment where cash is scarce and expensive, and pricing is a zero- or even negative-sum game; impossible to raise prices, and hard to cut without furthering the price war. So what should we be doing? A few years ago the buzzword was SDP; now it’s CDN. We’d better learn what those actually mean!
Trudy Norris-Grey, Managing Director, BT Wholesale: “There is no capacity problem in the core, but there is to the consumer – and three bad experiences mean the end of an application or service for that individual user.”
Anyway, we’re both a mobile and fixed operator and ISP, and we’ve got an IPTV network. We’ve learned the hard way that technology isn’t our place in the value chain. When we got the first IPTV system from Microsoft, it used 2,500 servers and far, far too much power. So we’re moving to a CDF (Content Delivery Framework) – which looks a lot like a SDP. Have the vendors just changed the labels on these charts?
So why do we want this? So we can charge for bandwidth, of course; if it was free, we wouldn’t care! But we’re making around $10bn in revenues and spending 20% of that in CAPEX. We need a business case for this continued investment.
We need the CDF to help us to dynamically manage the delivery and charging process for content. There was lots of goodness in IMS, the buzzword of five years ago, and in SDPs. But in the end it’s the APIs that matter. And we like standards because we’re not very big. So, we want to use TM Forum’s IPSphere to extend the CDF and SDF; after all, in roaming we apply different rate cards dynamically and settle transactions, so why not here too, for video or data? I’d happily pay five bucks for good 3G video interconnection.
And we need to do this for developer platforms too, which is why we’re supporting the OneAPI reference architecture. To sum up, let’s not forget subscriber identity, online charging – we’ve got to make money – the need for policy management because not all users are equal, and QoS for a differentiated user experience.
Sender-Pays Data in Practice
Andrew Bud, Chairman, MEF gave an update on the trial of sender-pays data he announced at the last event. This is no longer theoretical, he said; it’s functioning, just with a restricted feature set. Retail-only Internet has just about worked so far, because people pay for access through their subscription and the services themselves appear free. Video breaks this, he said; it will be impossible to be comprehensive, meaningful, and sustainable.
You can’t, he said, put a meaningful customer warning that covers all the possible prices you might encounter due to carrier policy with your content; and everyone is scared of huge bills after the WAP experience. Further, look at the history of post offices, telegraphy and telephony – it’s been sender-pays since the 1850s. Similarly, Amazon.com is sender-pays, as is Akamai.
Hence we need sending-party-pays data – that way, we can have truly free ads: not ones where the poor end user ends up paying the delivery cost!
Our trial: we have relationships with carriers making up 85% of the UK market. We have contracts with them, priced per MB of data. And we have four customers – Jamster (who brought you the Crazy Frog), Shorts, THMBNLS (who produce mobisodes promoting public health), and Creative North (mobile games as a gift from the government). Of course, without sender-pays this is impossible.
We’ve discovered that the carriers have no idea how much data costs; wholesale pricing has some very interesting consequences. Notably the prices are being set too high. Real costs and real prices mean that quality of experience is a real issue; it’s a very complicated system to get right. The positive sign, and ringing endorsement for the trial, is that some carriers are including sender-pays revenue in their budgets now!
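The settlement mechanics described above are straightforward to sketch. The carrier names and per-MB rates below are entirely invented for illustration – the trial’s actual rates are not public – but they show the principle: the content sender, not the end user, is billed at each carrier’s wholesale rate for the data delivered.

```python
# Hypothetical sender-pays settlement sketch: the content sender is billed
# per megabyte at each carrier's wholesale rate. Carrier names and rates
# are invented for illustration.

wholesale_rate_per_mb = {"CarrierA": 0.05, "CarrierB": 0.08}   # GBP/MB, assumed

def sender_bill(deliveries):
    """deliveries: list of (carrier, size_mb) for content pushed to end users."""
    return sum(wholesale_rate_per_mb[carrier] * mb
               for carrier, mb in deliveries)

# A 4 MB mobisode delivered to three users across two carriers:
cost = sender_bill([("CarrierA", 4), ("CarrierA", 4), ("CarrierB", 4)])
print(f"£{cost:.2f}")   # £0.72
```

Getting the rate card right is exactly the problem the trial surfaced: if the assumed wholesale prices are set too high, the model breaks before quality of experience even comes into play.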
The business of video is a prime battleground for Telco 2.0 strategies. It represents the heaviest data flows, the cornerstone of triple/quad-play bundling, powerful entrenched interests from broadcasters and content owners, and a plethora of regulators and industry bodies. For many people, it lies at the heart of home-based service provision and entertainment, as well as encroaching on the mobile space. The growth of P2P and other illegal or semi-legal download mechanisms puts pressure on network capacity – and invites controversial measures around protecting content rights and Net Neutrality.
In theory, operators ought to be able to monetise video traffic, even if they don’t own or aggregate content themselves. There should be options for advertising, prioritised traffic or blended services – but these are all highly dependent on not just capable infrastructure, but realistic business models. Operators also need to find a way to counter the ‘Network Neutrality’ lobbyists who are confounding the real issue (access to the internet for all service providers on a ‘best efforts’ basis) with spurious arguments that operators should not be able to offer premium services, such as QoS and identity, to customers that want to pay for them. Telco 2.0 would argue that the right to offer (and the right to buy) a better service is a cornerstone of capitalism and something that is available in every other industry. Telecoms should be no different. Of course, it remains up to the operators to develop services that customers are willing to pay more for…
A common theme in the discussion was “tempus fugit” – time flies. The pace of evolution has been staggering, especially in Internet video distribution – IPTV, YouTube, iPlayer, Hulu, Qik, P2P, mashups and so forth. Telcos do not have the luxury of time for extended pilot projects or grandiose collaborations that take years to come to fruition.
With this timing issue in mind, the feedback from the audience was collected in three categories, although here the output has been aggregated thematically, as follows:
STOP – What should we stop doing?
START – What should we start doing?
DO MORE – What things should we do more of?
Feedback: STOP the current business model
There was broad agreement that the current model is unsustainable, especially given the demands that “heavy” content like video traffic places on the network…
· [Stop] giving customers bandwidth for free [#5]
· Stop complex pricing models for end-user [#9]
· Stop investing so much in sustaining old order [#18]
· Stop charging mobile subscribers on a per megabyte basis. [#37]
· Current peering agreement/IP neutrality is not sustainable. [#41]
· [Stop] assuming things are free. [#48]
· [Stop] lowering prices for unlimited data. [#61]
· Have to develop more models for upstream charging for data rather than just flat rate to subscribers. [#11]
· Build rational pricing segmentation for data to monetize both sides of the value chain with focus on premium value items. [#32]
Feedback: Transparency and pricing
… with many people suggesting that Telcos first need to educate users and service providers about the “true cost” of transporting data… although whether they actually know the answer themselves is another question, as it is as much an issue of accounting practices as of network architecture.
· Make the service providers aware of the cost they generate to carriers. [#31]
· Make pricing transparency for consumers a must. [#10]
· Mobile operators start being honest with themselves about the true cost of data before they invest in LTE. [#7]
· When resources are limited, then rationing is necessary. Net Neutrality will not work. Today people pay for water in regions where it is limited in supply. Its use is abused when there are no limits. [#17]
· Start being transparent in data charges, it will all stay or fall with cost transparency. [#12]
· You can help people understand usage charges, with meters or regular updates, requires education for a behavioural change, easier for fixed than mobile. [#14]
· Service providers need to have a more honest dialogue with subscribers and give them confidence to use services [#57]
· As an industry we must invest more in educating the market about network economics, end-users as well as service providers. [#58]
· Start charging subscribers a flat rate data fee rather than per megabyte. [#46]
Feedback: Sender-pays data
Andrew Bud’s concept of “sender pays data”, in which a content provider bundles in the notional cost of data transport into the download price for the consumer, generated both enthusiasm and concerns (although very little outright disagreement). Telco 2.0 agrees with the fundamental ‘elegance’ of the notion, but thinks that there are significant practical, regulatory and technical issues that need to be resolved. In particular, the delivery of “monolithic” chunks of content like movies may be limited, especially in mobile networks where data traffic is dominated by PCs with mobile broadband, usually conducting a wide variety of two-way applications like social networking.
· Sender pays is the only sane model. [#6]
· Do sender pays on both ‘sides’ consumer as well…gives ‘control’ and clarity to user. [#54]
· Sender Pays is one specific example of a much larger category of 3rd-party pays data, which also includes venue owners (e.g. hotels or restaurants), advertisers/sponsors (‘thanks for flying Virgin, we’re giving you 10MB free as a thank-you’), software developers, government (e.g. ‘benefit’ data for the unemployed etc) etc. The opportunity for Telcos may be much larger from upstream players outside the content industry [#73]
· We already do sender pays on our mobile portal – on behalf of all partner content providers including Napster mobile. [#77]
· Change the current peering model into an end to end sender pay model where all carriers in the chain receive the appropriate allocation of the sender pay revenue in order to guarantee the QoS for the end user. [#63]
· Focus on the money flows e.g. confirm the sender pays model. [#19]
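The basic arithmetic of sender-pays is simple enough to sketch. The figures below are purely illustrative assumptions (the function name, the 1p/MB wholesale rate and the 40MB file size are ours, not trial data): the content provider bundles the wholesale data cost into the retail price, and the subscriber sees one all-in figure.

```python
# Minimal sketch of sender-pays pricing (all figures hypothetical).
# The content provider, not the subscriber, pays the carrier for the
# megabytes delivered, and recovers that cost in the retail price.

def sender_pays_retail_price(content_price_gbp, size_mb, wholesale_gbp_per_mb):
    """Retail price when the sender bundles in the data transport cost."""
    data_cost = size_mb * wholesale_gbp_per_mb
    return round(content_price_gbp + data_cost, 2)

# A 40 MB music video priced at GBP 1.50, with a hypothetical
# wholesale rate of 1p per MB:
print(sender_pays_retail_price(1.50, 40, 0.01))  # 1.9
```

As feedback item #20 notes, the resulting all-in price has to survive comparison with the next alternative – in this sketch, GBP 1.90 sits right next to the GBP 1.89 iTunes benchmark quoted by that participant.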
Qualified Support/Implementation concerns
· Business models on sender pays must include the fact that roaming is needed; data costs will be quite different across mobile carriers, and the aggregators’ costs and agreements are based on the current carriers. These things need to be solved first. [#26]
· Sender pays is good but needs the option of ‘only deliver via WiFi or femtocell when the user gets home’ at 1/100th the cost of ‘deliver immediately via 3G macro network’. [#15]
· Who pays for AJAX browsers proactively downloading stuff in the background without explicit user request? [#64]
· Be realistic about sender pays data. It will not take off if it is not standard across the market, and the data prices currently break the content business model – you have to compare to the next alternative. A video on iTunes costs 1.89 GBP including data… Operators should either take a long term view or forget about it. [#20]
· Sender-pays data can be used to do anything the eco-system needs, including quality/HD. It doesn’t do so today only because the carriers don’t know how to provide those. [#44]
· Sender pays works for big monolithic chunks like songs or videos. But doesn’t work for mash up or communications content/data like Facebook (my Facebook page has 30 components from different providers – are you going to bill all of them separately?) [#53]
· mBlox: more or less like a free-call number. Doesn’t guarantee quality/HD [#8]
· Stop sender pays because user is inundated with spam. [#23]
o Re 23: At least the sender is charged for the delivery. I do not want to pay for your SPAM! [#30]
A fair amount of the discussion revolved around the thorny issues of capacity, congestion, prioritisation and QoS, although some participants felt this distracted a little from the “bigger picture” of integrated business models.
· Part of bandwidth is dedicated to high quality contents (paid for). Rest is shared/best effort. [#27]
· Start annotating the network, by installing the equivalent of gas meters at all points across the network, in order that they truly understand the nature of traffic passing over the network – to implement QoS. [#56]
o Re: 56 – that’s fine in the fixed world or mobile core, but it doesn’t work in the radio network. Managing QoS in mobile is difficult when you have annoying things like concrete walls and metallised reflective windows in the way [#75]
· [Stop] being telecom focused and move more towards solutions. It is more than bandwidth. [#25]
· Stop pretending that mobile QoS is important, as coverage is still the gating factor for user experience. There’s no point offering 99.9% reliability when you only have 70% coverage, especially indoors [#29]
· Start preparing for a world of fewer, but converged fixed-mobile networks that are shared between operators. In this world there will need to be dynamic model of allocating and charging for network capacity. [#67]
· We need applications that are more aware of network capacity, congestion, cost and quality – and which alter their behaviour to optimise for the conditions at any point in time, e.g. with different codecs, frame rates or image sizes. The intelligence to do this is in the device, not the network. [#68]
o Re: 68, is it really in the CPE? If the buffering of the content is close at the terminal, perhaps, otherwise there is no jitter guarantee. [#78]
§ Re 78 – depends on the situation, and download vs. streaming etc. Forget the word ‘terminal’, it’s 1980s speak; if you have a sufficiently smart endpoint you can manage this – hence PCs being fine for buffering YouTube or iPlayer etc, and some of the video players auto-sensing network conditions [#81]
· QoE – residential [services] cannot fully support devices which are not managed for streamed content. [#71]
· Presumably CDNs and caching have a bit of a problem with customised content, e.g. with inserted/overlaid personalised adverts in a video stream? [#76]
Feedback: platforms, APIs, and infrastructure
However, the network and device architecture is only part of the issue. It is clear that video distribution fits centrally within the wider platform problems of APIs and OSS/BSS architecture, which span the overall Telco 2.0 reach of a given operator.
· Too much focus on investment in the network; where is the innovation in enterprise software to support the network? [#70]
· For operators to open up access to their business assets in a consistent manner to innovative intermediaries who can harmonise APIs across a national or global marketplace. [#13]
· The BSS back office; billing, etc will not support robust interactive media for the most part. [#22]
· Let content providers come directly to Telcos to avoid a middle layer (aggregators) taking the profit. This requires collaboration and standardization among Telcos for the technical interfaces and payment models. [#28]
· More analysis on length of time and cost of managing billing vendor for support of 2-sided business model. Prohibitively expensive in back office to take risks. Why? [#65]
· It doesn’t matter how strong the network is if you can’t monetize it on the back end OSS/BSS. [#40]
Feedback: Business models for video
Irrespective of the technical issues, or specific point commercial innovations like sender pays, there are also assorted problems in managing ecosystem dynamics, or more generalised business models for online video or IPTV. A significant part of the session’s feedback explored the concerns and possible solutions – with the “elephant in the room” of Net Neutrality lurking on the sidelines.
· Open up to lower cost lower risk trials to see what does and doesn’t work. [#35]
· Real multi-quality services in order to monetize high quality services. [#36]
· Transform net neutrality issues into a fair policy approach… meaning that you cannot have equal treatment when some parties abuse the openness. [#39]
o Re 39: I want QoE for content I want to see. Part of this is from speed of access. Net Neutrality comes from the Best Effort approach, letting it fight out in the scarce network. I.e. I do not get the QoE for all the other rubbish in the network. [#69]
· Why not bundling VAS with content transportation to ease migration from a free world to a pay for value world? [#43]
· Do more collaborative models which incorporate the entire value chain. [#55]
· Service providers start partnering to resell long tail content from platform providers with big catalogues. [#59]
· [Start to] combine down- and up-stream models in content. Especially start to get paid to deliver long tail content. [#60]
· Start thinking longer term instead of short term profit, to create a new ecosystem that is bigger and healthier. [#62]
· Exploit better the business models between content providers and carriers. [#16]
· Adapt price to quality of service. [#21]
· Put more attention on quality of end user experience. [#24]
· I am prepared to pay a higher retail DSL subscription if I get a higher quality of experience. – not just monthly download limits. [#38]
· Maximize revenues based on typical Telco capabilities (billing, delivery, assurance on millions of customers) [#50]
· Need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lies in their user data, not their pipes! [#52]
· On our fixed network, DSL resellers pay for the access and for the bandwidth used – this corresponds to the sender pays model; due to rising bandwidth demand the charge for the resellers continuously increases. So we have to adapt bandwidth tariffs every year in order not to suffocate our DSL resellers. Among them are also companies offering TV streaming. [#82]
· More settlement-free peering with content/app suppliers – make the origination point blazingly fast and close to zero cost. Rather, focus on charging for content distribution towards the edge of the access network (smart caching, torrent seeds, multicast nodes etc) [#74]
In addition to these central themes, the session’s participants also offered a variety of other comments concerning regulatory issues, industry collaboration, consumer issues and other non-video services like SMS.
· Start addressing customer data privacy issues now, before it’s too late and there is a backlash from subscribers and the media. [#42]
· Consolidating forums and industry bodies so we end up with one practical solution. [#45]
· Identifying what an operator has the potential to offer content service providers other than a pipe. [#49]
· Getting regulators to stimulate competition by enforcing structural separation – unbundle at layer 1, bring in agile players with low operating cost. Let customers vote with their money – focus on delivering the fastest basic IP pipe at a reasonable price. If the basic price point is reasonable customers will be glad to pay for extra services – either sender or receiver based. [#72]
· IPTV <> Internet TV. In IPTV the Telco chooses my content, Internet TV I choose. [#79]
· Put attention on creating industry collaboration models. [#47]
· Stop milking the SMS cash cow and stop worrying about cannibalising it, otherwise today’s rip-off mobile data services will never take off. [#33]
· SMS combined with the web is going to play a big role in the future, maybe bigger than the role it played in the past. Twitter is just the first of a wave of SMS based social media and comms applications for people. [#51]
Participants ‘Next Steps’ Vote
Participants were then asked: Which of the following do we need to understand better in the next 6 months?
Is there really a capacity problem, and what is the nature of it?
How to tackle the net neutrality debate and develop an acceptable QOS solution for video?
Is there a long term future for IPTV?
How to take on the iPhone regarding mobile video?
More aggressive piloting / roll-out of sending-party-pays data?
Lessons learnt & next steps
The vote itself reflects the nature of the discussions and debates at the event: there are lots of issues and things that the industry is not yet clear on that need to be ironed out. The world is changing fast and how we overcome issues and exploit opportunities is still hazy. And all the time, there is a concern that the speed of change could overtake existing players (including Telcos and ISPs)!
However, there does now seem to be greater clarity on several issues with participants becoming increasingly keen to see the industry tackle the business model issue of flat-rate pricing to consumers and little revenue being attached to the distribution of content (particularly bandwidth hungry video). Overall, most seem to agree that:
1. End users like simple pricing models (hence success of flat rate) but that some ‘heavy users’ will require a variable rate pricing scheme to cover the demands they make;
2. Bandwidth is not free and costs to Telcos and ISPs will continue to rise as video traffic grows;
3. Asking those sending digital goods to pay for the distribution cost is sensible…;
4. …but plenty of work needs to be done on the practicalities of the sender-pays model before it can be widely adopted across fixed and mobile;
5. Operators need to develop a suite of value-added products and services for those sending digital goods over their networks so they can charge incremental revenues that will enable continued network investment;
6. Those pushing the ‘network neutrality’ issue are (deliberately or otherwise) causing confusion over such differential pricing which creates PR and regulatory risks for operators that need to be addressed.
There are clearly details to be ironed out – and probably experiments in pricing and charging to be done. Andrew Bud’s sending-party-pays model (many others, it must be added, have suggested similar) may work, or it may not – but this is an area where experiments need to be tried. The idea of “educating” upstream users is euphemistic – they are well aware of the benefits they are currently accruing, which is why the Net Neutrality debate is being deliberately muddied. Distributors need to be working on disentangling bits that are able to be free from those that pay to ride, not letting anyone get a free ride.
As can be seen in the responses, there is also a growing realisation that the Telco has to understand and deal with the issues of the overall value chain, end-to-end, not just the section under its direct control, if it wishes to add value over and above being a bit pipe. This is essentially moving towards a solution of the “Quality of Service” issue – they need to decide how much of the solution is capacity increase, how much is traffic management, and how much is customer expectation management.
Alan Patrick, Telco 2.0: ”98.7% of users don’t have an iPhone, but 98% of mobile developers code for it because it has an integrated end-to-end experience, rather than a content model based on starving in a garage.”
The “Tempus Fugit” point is well made too – the Telco 2.0 participants are moving towards an answer, but it is not clear that the same urgency is being seen among wider Telco management.
Two areas were skimmed through a little too quickly in the feedback:
Managing a way through the ‘Pirate World’ environment
The economic crisis has helped in that it has reduced the amount of venture capital and other risk equity going into funding plays that need not make revenue, never mind profit. In our view this means that the game will resolve into a battle of deep pockets to fund the early businesses. Incumbents typically suffer from higher cost bases and higher hurdle rates for new ventures. New players typically have less revenue, but lower cost structures. For existing Telcos this means using existing assets as effectively as possible and we suggest a more consolidated approach from operators and associated forums and industry bodies so the industry ends up with one practical solution. This is particularly important when initially tackling the ‘Network Neutrality’ issue and securing customer and regulatory support for differential pricing policies.
Adopting a policing role, particularly in the short-term during Pirate World, may be valuable for operators. Telco 2.0 believes the real value is in managing the supply of content from companies (rather than end users) and ensuring that content is legal (paid for!).
What sort of video solution should Telcos develop?
The temptation for operators to push IPTV is huge – it offers, in theory, steady revenues and control of the set-top box. Unfortunately, all the projected growth is expected to be in Web TV, delivered to PCs or TVs (or both). Providing a suite of value-added distribution services is perhaps a more lucrative strategy for operators:
Operators must better understand the needs of upstream segments and individual customers (media owners, aggregators, broadcasters, retailers, games providers, social networks, etc.) and develop propositions for value-added services in response to these. Managing end user data is likely to be important here. As one participant put it:
o We need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lies in their user data, not their pipes! [#52]
Customer privacy will clearly be an issue if operators develop solutions for upstream customers that involve the management of data flows between both sides of the platform. End users want to know what upstream customers are providing, how they can pay, whether the provider is trusted, etc. and the provider needs to be able to identify and authenticate the customer, as well as understand what content they want and how they want to pay for it. Opt-in is one solution but is complex and time-consuming to build scale so operators need to explore ways of protecting data while using it to add value to transactions over the network.
BSkyB is probably the most misunderstood publicly quoted company in the UK. Most analysts view them as a media company; our theory is that BSkyB is a platform company and comparisons to Apple, Microsoft or Google are more appropriate than UK media players such as the BBC, ITV or even potential new entrants such as BT.
Figure 1: Sky Delivery Platform
Rule #1 – You have to keep loading extra features onto your platform…
Microsoft historically were tops at this – Windows for some grew fatter and fatter release by release, but in reality every release contained new features that appealed to some. BSkyB have done the same – they moved from analogue to digital, introduced interactive TV, took PVRs to the mass-market, and now are doing the same with HD-TV. As soon as that is complete, they will move onto the next thing – 3D TV or even true on-demand VOD.
BSkyB seem to be one step ahead of the competition all the time. The only exception to this is Virgin Media’s networked VOD service, which is far superior to BSkyB’s limited caching of programmes on the Sky+ device – currently a big hole in the portfolio. Notice that BSkyB is totally agnostic as to whether the features are driven by hardware, software or network – their platform contains all three elements.
Rule #2 – You have to design your platform to be the easiest to use…
Apple are the kings of usability – they control both client hardware and software, and in the networked world they are taking more control of delivery, though not by ownership of underlying assets – think of the parallel with BSkyB using 3rd parties for satellite and broadband delivery.
Little attention is paid to BSkyB’s innovation in design – the Sky remote control in its day was a huge advance on TV manufacturers’ efforts. Similarly, they have extended this advantage in the PVR world – still keeping to their own designs. This is just the start of BSkyB’s advantage in usability.
It is not by accident that BSkyB wins awards for customer service – they realise that excellence in usability also needs to consider every touch point with customer – from sales to installation and care. BSkyB have made huge investments here and it pays off – within a couple of years of entering the voice and broadband market, they seem to be winning almost every award on offer.
Rule #3 – You need the lowest cost platform…
This is all about economies of scale and running a tight ship, especially with regard to corporate overheads. Nearly every platform business involves huge upfront risks and investment, with many years of losses followed by a steady ascent to profitability as the platform gains volume and pricing power.
Organisational inefficiency is a disease that afflicts almost all large companies. Very little comparative data is available here, but I suspect that Sky is far more efficient if costs were normalized and compared to other pay-TV (e.g. Virgin Media), fixed (e.g. BT) or even mobile (e.g. Vodafone) competitors for the share of the consumer wallet.
Rule #4 – You need the best content on your platform…
This is probably the most misunderstood element of BSkyB’s business – people still think BSkyB’s advantage is all about exclusive rights to Premier League Football & Movies. The truth for most content owners is that the BSkyB platform is the most profitable mass market route. And the attractiveness grows year-by-year as subscribers and revenues increase compared to their competitors.
People tend to dismiss how flexible the BSkyB platform is for content owners: you can sell your individual rights to your preferred bidder on a particular Sky channel, for example HBO selling rights to “The Wire” to a minority channel, FoxFX; you can build your own channel(s), e.g. MTV and Discovery Channels; or you can get a particular Sky-owned channel to do the production, e.g. sports events.
Strangely, even the major public service broadcasters (BBC, ITV, C4 & Five) are not only happy to have their content attached to the BSkyB platform, but are willing to pay for the pleasure.
Rule #5 – You need to extend your platform into adjacent areas…
Currently the focus on BSkyB is the move into the home broadband and voice market, which is a much bigger market than pure TV. To date, despite the rapid gain in scale, profitability is still an aspiration. The defensive qualities of the play are always underestimated, as is how it constrains the freedom of major competitors, especially Virgin Media and BT, to differentiate.
However, it should never be forgotten that moving into adjacent areas is nothing new to BSkyB: the early digital days featured BSkyB investing/losing money in “t-commerce”; a move into gambling via SkyBet has been more successful, but the retail focus of the service was dwarfed by the emergence of a gambling platform play, BetFair; and Sky has been investing in online properties, especially football, but it is too early to assess their success.
The BSkyB defensive investments pale into insignificance compared to those made by Microsoft in protecting their franchise against the growing encroachment from Google. These defensive investments are also crucial in negotiations with regulators – for every complaint that BT makes about the lack of profitability in the pay-TV market, BSkyB can counter with allegations about the impossibility of making profits in the home broadband and voice market.
Following the Rules
Of course, following the rules is extremely difficult – continual innovation and improvement is hard. But the end result is a platform with plenty of growth levers to play with. Different levers can be used at different times – all of which make the platform more attractive to particular segments of consumers and thereby generate growth. The underlying dynamic is that BSkyB needs to show customer growth for success – or, more importantly, the right type of customer growth: customers who’ll happily pay for the extra features. Building the platform, increasing eyeballs, and increasing diversity are crucial. Traditional TV metrics such as share of viewing for an overall channel are completely irrelevant.
Figure 2: Sky Customer Growth and major platform upgrades
Impact on Regulation
UK Regulators, especially OFCOM, have struggled with the BSkyB business model and tend to examine market share in very narrow vertical segments – e.g. the current ongoing pay-TV consultation. These ways of looking at BSkyB are doomed to fail. OFCOM is not alone: the EU’s battles with Microsoft over its platform business are already legendary and still ongoing. We don’t yet have the answer of how to regulate in a platform world, but we do know that a different approach is required.
Most importantly, BSkyB’s platform shows a multi-sided business model in action, bringing value to both upstream and downstream customers and earning decent returns for shareholders.
Online Video consumption is booming. The good news is that clearer demand patterns are beginning to emerge which should help in capacity planning and improving the user experience; the bad news is that an overall economic model which works for all players in the value chain is about as clear as mud.
We previously analysed the effect of the launch of the BBC iPlayer on the ISP business model, but the truth is that, even in the UK, YouTube traffic still far outweighs the BBC iPlayer in the all-important peak hour slot – even though the bitrate is far lower.
Looking at current usage data at a UK ISP we can see that the number of concurrent people using YouTube is roughly seven times that of the iPlayer. However, our analysis suggests that this situation is set to change quite dramatically as traditional broadcasters increase their presence online, with significant impact for all players. Here’s why:
Streaming Traffic Patterns
Our friends at Plusnet, a small UK ISP, have provided Telco 2.0 with their latest data on traffic patterns. The important measurement for ISPs is peak hour load as this determines variable-cost capacity requirements.
iPlayer accounts for around 7% of total bandwidth at peak hour. The peaks are quite variable and follow the hit shows: the availability of Dr Who episodes or the latest in a long string of British losers at Wimbledon increases traffic.
Included within the iPlayer 7% is the Flash-based streaming traffic. The Kontiki-P2P based free-rental-download iPlayer traffic is included within general streaming volumes. This accounts for 5% of total peak-hour traffic and includes such applications as Real Audio, iChat, Google Video, Joost, Squeezebox, Slingbox, Google Earth, Multicast, DAAP, Kontiki (4OD, SkyPlayer, iPlayer downloads), Quicktime, MS Streaming, Shoutcast, Coral Video, H.323 and IGMP.
The BBC are planning to introduce a “bookmarking” feature to the iPlayer which will allow pre-ordering of content and, hopefully, time-of-day based delivery options. This is a win-win-win enhancement and we can’t see any serious objections to the implementation: for consumers it is great because they can view higher-quality video and schedule the download for when traffic is not counted towards their allowance; for ISPs it is great because it encourages non-peak-hour downloads; and for the BBC it is great as it will potentially reduce their CDN costs.
YouTube traffic accounts for 17% of peak-hour usage – this is despite YouTube streaming at around 200kbps compared to the iPlayer’s 500kbps. There are about seven times as many concurrent users on YouTube as on the iPlayer at peak hour. Concurrency is important here: YouTube users watch short clips whereas iPlayer users watch longer shows of broadcast length.
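The two sets of figures are roughly consistent with each other, as a back-of-envelope check shows (concurrency and bitrates as reported above; the small gap is within the precision of the rounded percentages):

```python
# Back-of-envelope consistency check on the Plusnet figures quoted above.
youtube_kbps, iplayer_kbps = 200, 500   # approximate stream bitrates
concurrency_ratio = 7                    # YouTube vs iPlayer concurrent users

# Predicted ratio of YouTube to iPlayer peak-hour traffic volumes:
predicted = concurrency_ratio * youtube_kbps / iplayer_kbps
reported = 17 / 7                        # 17% vs 7% of peak-hour traffic

print(round(predicted, 1), round(reported, 1))  # 2.8 2.4
```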
P2P is declining in importance
The really interesting part of the Plusnet data is that peak-hour streaming, at around 30%, far outweighs P2P and usenet traffic at around 10%. Admittedly the peak-hour P2P/usenet traffic at Plusnet is probably far lower than at other ISPs, but it goes to show how ISPs can control their destiny and manage consumption through the use of open and transparent traffic shaping policies. Overall, P2P consumption is 26% of Plusnet traffic across a 24-hour window – the policies are obviously working and people are doing their P2P and usenet downloading when the network is not busy.
Quality and therefore bandwidth bound to increase
Both YouTube and the iPlayer are relatively low-bandwidth solutions compared to broadcast-quality shows in either SD (standard definition) or HD (high definition). However, applications are emerging which are real headache material for the ISPs.
The interesting part of the Move Networks technology is dynamic adjustment of the bit-rate according to the quality of the connection. It also does not seem to suffer from the buffering “feature” that unfortunately seems to be part of the YouTube experience. Move Networks achieve this by installing a client in the form of a browser plug-in which switches the video stream according to the connection, much in the same way as the TCP protocol works. We have regularly streamed content at 1.5Mbps, which is good enough to view on a big widescreen TV and is indistinguishable to the naked eye from broadcast TV.
Unlike Akamai, there is no secret sauce in the Move Networks technology and we expect other media players to start to use similar features – after all, every content owner wants the best possible experience for viewers.
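Move Networks' exact mechanism is proprietary, but the behaviour described above – stepping the stream bitrate up and down with measured throughput, much as TCP probes for capacity – can be sketched as follows. The bitrate ladder, headroom factor and function name are illustrative assumptions, not Move Networks' own parameters:

```python
# Hypothetical sketch of client-side adaptive bitrate selection of the
# kind described above; the ladder and threshold are illustrative only.

BITRATE_LADDER_KBPS = [300, 700, 1500, 3000]  # available stream encodings

def pick_bitrate(measured_throughput_kbps, headroom=0.8):
    """Choose the highest encoding that fits within a safety margin
    of the measured connection throughput; fall back to the lowest."""
    budget = measured_throughput_kbps * headroom
    usable = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return usable[-1] if usable else BITRATE_LADDER_KBPS[0]

# On a ~2 Mbps DSL line this client would settle on the 1.5 Mbps stream
# mentioned in the text, rather than stall trying to play 3 Mbps:
print(pick_bitrate(2000))  # 1500
```

Because the decision is re-run as conditions change, the client trades a visible quality shift for the buffering pauses a fixed-rate stream would suffer.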
Clearing the rights
The amount of iPlayer content is also increasing: Wimbledon coverage was available for the first time, with the Beijing Olympics and the British Open golf coming up. We also expect that the BBC will eventually get permission to make content available outside the iPlayer's 7-day window. Clearing the rights for the BBC's vast archive will take many years, but slowly but surely more and more content will become available. The same is true for all major broadcasters in the UK and the rest of the world.
YouTube to shrink in importance
It will be extremely interesting to see how YouTube responds to the challenge from the traditional broadcasters – we can't see a future where YouTube's market share is anywhere near its current level. We believe watching copyright-free user-generated content will always be a niche market.
Online video distribution and its associated economics are a key area of study for the Telco 2.0 team.
The UK’s largest broadcaster finally launched its online video streaming and download service, the iPlayer, on Christmas Day. Plusnet, a small ISP owned by BT, has provided a preliminary analysis of the traffic, and the results should send shivers down the spine of any ISP currently offering an unlimited “all-you-can-eat” service.
The iPlayer service is basically a 7-day catch-up service which enables people who missed (and didn't record) a broadcast to watch the programme at their leisure on a PC connected to the internet. The iPlayer differs from any other internet-based video service in certain key respects:
It is funded by the £135.50 annual licence fee which pays for the majority of BBC activities.
The BBC collected 25.1m licence fees in 2006/7. No advertising is required for the iPlayer business model to work.
It is heavily promoted on the BBC broadcast TV channels. The BBC had a 42.6% share of overall UK viewing in 2006/7, and therefore a lot of people already knew of the iPlayer's existence within a month of launch.
It is a high-quality service, designed for watching whole programmes rather than the consumption of small vignettes.
This is in sharp contrast to the current #1 streaming site, YouTube.
For Plusnet, this equates to total streaming costs rising from £17.2k/month to £51.7k/month, or from 6.1p/user to 18.3p/user – a 200% cost increase in just the first MONTH of the service. If we assume that the Plusnet base of 282k customers is a representative sample of the whole UK internet universe, then we can draw some interesting conclusions about the overall impact of the iPlayer on the UK internet. Across the whole UK IPstream base of 8.5m, the introduction of the iPlayer would equate to an increase in costs from around £500k to £1.5m in January.
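The extrapolation above is simple per-user arithmetic, and it is worth showing the working. All figures are taken from the article; the only step added here is scaling the Plusnet per-user cost up to the 8.5m IPstream base.

```python
# Back-of-envelope check of the Plusnet-to-UK extrapolation.
# Figures are those quoted in the article.

plusnet_users = 282_000
plusnet_cost_jan = 51_700.0   # £/month, after iPlayer launch
plusnet_cost_dec = 17_200.0   # £/month, before launch

# Per-user cost in pence.
cost_per_user_jan = plusnet_cost_jan / plusnet_users * 100
cost_per_user_dec = plusnet_cost_dec / plusnet_users * 100

# Scale up to the whole UK IPstream base.
ipstream_users = 8_500_000
uk_cost_jan = ipstream_users * cost_per_user_jan / 100  # back to £
uk_cost_dec = ipstream_users * cost_per_user_dec / 100

print(f"{cost_per_user_jan:.1f}p/user vs {cost_per_user_dec:.1f}p/user")
print(f"UK IPstream: £{uk_cost_jan/1e6:.2f}m vs £{uk_cost_dec/1e3:.0f}k")
```

The per-user figures come out at 18.3p and 6.1p, and the scaled totals at roughly £1.5m versus £0.5m, matching the numbers in the text.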
Despite access unbundling, ‘middle mile’ costs remain a key bottleneck
IPstream is a wholesale product from BT, with BT responsible for transit of the data from the customer's home to an interconnect point of the ISP's choice. The ISP pays for bandwidth capacity at the point of interconnect; BT Retail acts like an external ISP in the structurally separated model. The overall effect of the iPlayer on BT's IPstream-based customers is roughly neutral, with the increase in revenues at wholesale (external base of 4.2m customers) offset by the increase in costs at BT Retail (total base of 4.2m customers). Of course, this assumes no bandwidth overages at BT Retail, which is probably not the case, as both BT and Plusnet have bandwidth caps. In effect, incremental cost for ISPs using the IPstream product is incurred by ordering extra BT IPstream pipes, which come in 155Mbit/s chunks. The ISP's options are either to allow a degradation in performance or to order more capacity.
Time to buy more pipes
We tested the bandwidth profile using Wireshark while watching a 59-minute documentary celebrating the 50-year anniversary of Sputnik, via both streaming and P2P. The streaming traffic is easy to analyse as it comes through on port 1935, the port used by Flash for streaming. A jitter-free screening ran on average at around 0.5Mbit/s. Given the 155Mbit/s ordering increment, only around 300 people need to be watching the iPlayer at the same time (peak = 8pm-10pm) to fill a pipe. Seeing that IPstream customers are aggregated across the UK to a single point, a lot of ISPs will be thinking about the need to order extra capacity. The BBC also offers a P2P download, which is of higher quality than the stream. We managed to download the 500MB file in just over 20 minutes at an average speed of 3.5Mbit/s. The total traffic (including overhead) was 231MB for the streaming and 544MB for the P2P delivery.
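The "around 300 viewers fill a pipe" claim follows directly from the two measured numbers, as this small sketch shows:

```python
# Capacity arithmetic from the Wireshark measurements above.

pipe_mbps = 155      # IPstream interconnect ordering increment
stream_mbps = 0.5    # observed jitter-free iPlayer stream rate

concurrent_viewers_per_pipe = pipe_mbps / stream_mbps
print(int(concurrent_viewers_per_pipe))  # 310 - roughly the "300" in the text
```

In practice the usable number is lower still, since the same pipe also carries all the ISP's non-iPlayer traffic at peak hour.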
Full unbundling still leaves ISPs at the mercy of backhaul costs
The story for facilities-based LLU (Local Loop Unbundling) players, which account for another 3.7m UK broadband customers, is slightly different, as it depends completely on network design and the distribution of the base across the exchanges. Telco 2.0 market intelligence says that some unbundlers have ordered 1Gbit/s backhaul links and should be unaffected, at least in the short term. However, some unbundlers have ordered only 100Mbit/s links and could be in deep trouble at peak hour, with people really noticing the difference in experience. The only real option for these unbundlers is to order extra capacity on their backhaul links, which could be extremely expensive. The average speed for someone just browsing and doing email is quite low compared to someone sat back watching videos stream.
Cable companies understand sending telly over wires
The story for Virgin Media, the main UK cable operator with 3.3m broadband subscribers, again depends on network design – this time on the load on the UBR (Universal Broadband Router) within each network segment. Virgin Media has a special angle here, as the iPlayer will be coming to its video-on-demand service in the spring, which we assume will take a lot of load off its IP network. The Virgin VoD service runs on dedicated bandwidth within the network and allows the content to be watched on TV rather than a PC – a big bonus for Virgin Media subscribers.
Modelling the cost impact
For both cable and LLU players the cost profile is radically different from that of IPstream players, and it is not a trivial task to calculate the impact. However, we can extrapolate the Plusnet traffic figures to estimate the effect in volumes of data. We have modelled four scenarios: usage the same as in Jan 2008 (i.e. an average of 19min/month/user), rising to 1 hour/month, 1 hour/week and 1 hour/day. These would give an increase in cost of £1,035k/month, £3,243k/month, £14,053k/month and £98,638k/month respectively for the IPstream industry, based on the Plusnet cost assumptions alone. Of course, this assumes the IPstream base stays the same (and they don't all just go bust straight away!). Across the whole of the UK ISP industry, the increase in traffic (TB/month) is 1,166, 3,655, 15,837 and 111,161 respectively. That's a lot of data. The obvious conclusion is that ISP pricing will need to rise and extra capacity will need to be added. The data reinforces our belief, expressed in our recent Broadband Report, that “Video will kill the ISP star”. The problem with the current ISP model is that it is like an all-you-can-eat buffet, where one in ten customers eats all the food, one in a hundred takes his chair home too, and one in a thousand unscrews all the fixtures and fittings and loads them into a van as well.
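The four traffic scenarios can be reproduced from figures already in the article: roughly 231MB of streaming traffic per hour-long programme, and a whole-industry base of about 15.5m broadband users (the stated 8.5m IPstream + 3.7m LLU + 3.3m cable). The exact per-hour volume and rounding are our assumptions, so the outputs match the article's figures only approximately.

```python
# Sketch reproducing the four usage scenarios in the text.
# GB_PER_HOUR and the 15.5m base are assumptions derived from
# figures quoted elsewhere in the article.

GB_PER_HOUR = 0.235              # ~231MB per 59-minute programme
UK_BROADBAND_USERS = 15_500_000  # 8.5m IPstream + 3.7m LLU + 3.3m cable

scenarios_hours_per_month = {
    "Jan 2008 (19 min/month)": 19 / 60,
    "1 hour/month": 1,
    "1 hour/week": 52 / 12,
    "1 hour/day": 365 / 12,
}

results = {}
for name, hours in scenarios_hours_per_month.items():
    tb = UK_BROADBAND_USERS * hours * GB_PER_HOUR / 1000  # GB -> TB
    results[name] = tb
    print(f"{name}: {tb:,.0f} TB/month")
```

Running this gives approximately 1,150, 3,640, 15,800 and 110,800 TB/month, in line with the 1,166 / 3,655 / 15,837 / 111,161 quoted above.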
A trigger for industry structural change?
An interesting corollary to the increase in costs for ISPs is that we believe the iPlayer will actually speed up consolidation across the industry and make life for smaller ISPs even more difficult than it is today. Additionally, because of the high bandwidth needs of the iPlayer, the long copper lengths in rural England and the lack of cable or LLU competition to the IPstream product, we believe the iPlayer will widen the digital divide between rural and suburban UK. The iPlayer also poses an interesting question for the legion of UK small businesses who rely on broadband and yet don't have a full set of telecommunications skills. What do they do about the employee who wants to eat lunch at their desk whilst simultaneously watching last night's episode of top soap EastEnders?
Time to stop the game of ‘pass the distribution cost parcel’
The BBC is actually in quite a difficult situation, especially as publicity mounts over the coming months, with users breaking their bandwidth limits and more and more starting to get charged for overages. UK licence payers expect that they paid for both content and distribution when they handed over their £135.50. In 2006/7, the BBC paid £99.7m for distributing its broadcast TV signal, £42.6m for its radio signal and only £8.8m for its online content – out of total licence fee income of £3.2bn. We would suggest that the easiest way for the BBC to escape the iPlayer conundrum is to pay an equitable fee to the ISPs for distributing its content, with the ISP plan then including unlimited BBC content, possibly with a small retail mark-up. The alternative of traffic-shaping your users to death doesn't seem like a great way of creating high customer satisfaction. The old media saying sums up the situation quite nicely:
“If content is King, then distribution is King Kong”
[Ed – to participate in the debate on sustainable business models in the telecoms-media-tech space, do come to the Telco 2.0 ‘Executive Brainstorm’ on 16-17 April in London.]