Live entertainment spans everything from a handful of people enjoying stand-up comedy in a pub to a football match attended by 100,000 fans. Although there are many different forms and formats of live entertainment, they share three inter-related characteristics – immediacy, interactivity and immersion. The performers make things happen and the audience reacts – clapping, shouting, singing, gesticulating at the performers or interacting with each other. A compelling event will also be immersive in the sense that the spectators will focus entirely on the action.
For telcos, live events present specific challenges and opportunities. Simultaneously providing millions of people with high-quality images and audio from live events can soak up large amounts of network bandwidth, forcing telcos to invest in additional capacity. Yet it should be feasible to make a return on that investment: live events are an enormously popular form of entertainment on which people around the world are prepared to spend vast sums of money. This is a market where demand often outstrips supply: tickets for top-tier sports events or music concerts can cost US$150 or more.
With the advent of 5G and Wi-Fi 6E, telcos have an opportunity to improve spectators’ enjoyment of live events both within a venue and in remote locations. Indeed, telcos could play a key role in enabling many more people to both participate in and appreciate live entertainment, thereby helping them to enjoy more fulfilling and enriching lives.
The opportunities to use new technologies to enhance live events
Source: STL Partners
More broadly, telecoms networks and related services have become fundamental to the smooth running of our increasingly digital economy. Our landmark report The Coordination Age: A third age of telecoms explained how reliable and ubiquitous connectivity can enable companies and consumers to use digital technologies to efficiently allocate and source assets and resources. In the case of live entertainment, telcos can help people to make better use of their leisure time – a precious and very finite resource for most individuals.
This report begins by providing an overview of the live entertainment opportunity for telcos, outlining the services they could provide to support both professional and amateur events. It then considers the growing demand for high-definition, 360-degree coverage of live events, before discussing why it is increasingly important to deliver footage in real-time, rather than near real-time. Subsequent sections explore the expanding role of edge computing in facilitating live broadcasts and how augmented reality and virtual reality could be used to create more immersive and interactive experiences.
This report draws on the experiences and actions of AT&T, BT, NTT and Verizon, which are all very active in the coverage of live sports. It also builds on previous STL Partners research including:
On Wednesday, November 9th 2016, Donald J. Trump was confirmed as the winner of the 58th US presidential election. During his campaign Mr Trump made many statements. Now that he has won, we look beyond the campaign rhetoric and the initial political shock and uncertainty to consider what he might actually do and how this could affect the TMT sector.
This is a difficult task because, to date, Trump has not made many detailed statements about his policies. During the campaign he made many declarations, but these will not necessarily translate into bold policy decisions. Indeed, within one week of being elected his rhetoric has become more measured and he has already changed his stance somewhat on Obamacare and immigration.
Trump is now in the process of choosing his senior advisors and cabinet, and these choices will reveal more about what his presidency will be like than his behaviour during the campaign. At the time of writing, only two senior advisors had been chosen: Reince Priebus as White House Chief of Staff and Steve Bannon as Senior Counselor to the President. Neither position requires Senate approval. Priebus has served as the chair of the Republican National Committee since 2011, so is a reassuring choice for establishment Republicans, but Bannon is much more controversial. Until his appointment, Bannon was the executive chairman of Breitbart News, a far-right website. His appointment has caused dismay among liberals, but leading Republicans have declined to criticise it, calling for party unity instead.
We expect that assembling Trump’s cabinet will be a difficult and turbulent process that will take some weeks to settle, as Trump will have to put some of his own views aside in order to choose a cabinet that the broader Republican party will approve. Although Trump has signalled a willingness to work with the party through his choice of Priebus, his choice of Bannon indicates that he is not afraid of pushing boundaries. We therefore expect some more controversial choices in the coming weeks, but whether these get approved by the Senate is another matter.
From a TMT perspective, the most important appointment will be the Attorney General, which will need Senate approval. Another position of interest to the TMT sector is the chair of the Federal Trade Commission (FTC). Trump can choose a new chair of the FTC from its commissioners, who are confirmed by the Senate. Although there should be five commissioners, there are currently only three, and Trump could decide to replace the current Democratic chair with a Republican and also nominate more Republicans as commissioners. The Attorney General and FTC roles are important because they will influence the administration’s position on data privacy and security, consumer protection, and antitrust, which are key issues for the TMT industry.
Two potential scenarios affecting five key areas for TMT
Given the uncertainty around how Trump will behave as president, rather than trying to definitively predict what he will do, STL Partners has focused on five key areas for the telco industry and developed two scenarios that may play out under a Trump presidency, as outlined in the table below.
Hardline Trump leadership
Trump’s leadership decisions closely match his most extreme campaign rhetoric; he leads the US into a period of right wing, isolationist politics.
Moderate Trump leadership
Trump’s leadership decisions are more moderate; he listens to advice from the wider Republican party, is moderated by Congress, and does not follow through on his most extreme claims, such as building a wall to prevent illegal immigration from Mexico into the US.
Source: STL Partners
We run through the five areas below, discussing what is known about Trump’s views and what might happen under each scenario, as well as highlighting our view on the most likely outcome.
Predictions are difficult with Trump
Two potential scenarios affecting five key areas for TMT
As one of the most regulated sectors of the economy, telecoms services are the product of a complex mix of market forces and a multitude of rules governing everything from prices to the availability of spectrum. Many of these rules date from the days when an incumbent telco, often state-owned, was the dominant player in the market and needed to be carefully scrutinised by regulators. However, some of these rules, such as those governing Net Neutrality, are relatively new and relate to telcos’ role as the gateway to the Internet, which has become so fundamental to modern life. For more on this topic, please see STL Partners’ recent report: Net Neutrality 2021: IoT, NFV and 5G ready?
As telcos’ profitability has come under increasing pressure, they are lobbying hard for greater regulatory freedom. This report outlines and analyses telcos’ various campaigns to improve the business case for infrastructure investment and level the playing field with Internet players, such as Google and Facebook. It also considers whether telcos are actually putting their money where their mouth is. Is the current regulatory and competitive climate actually prompting them to cut back on investment? What will be the impact on 5G?
For their part, governments are increasingly aware of the need to stimulate new investments and new solutions in the digital economy. Greater digitisation could help solve important socio-economic problems. Most governments believe, for example, that digital technologies can improve the business environment and support lower-cost but effective healthcare, education and security services that will help their economies function and grow. The EU is trying to build a Digital Single Market, while the Indian government’s Digital India initiative aims to make all public services available online.
Thus governments need telcos and tech companies to succeed. Given that telcos are typically more national than global in their outlook and organisation, they tend to be a more natural partner for national governments than the giant Internet players, such as Google and Apple.
In light of these factors, this report explores whether policymakers’ priorities are changing and how regulatory principles and competition policy are evolving. In particular, it considers whether policymakers and regulators are now taking a tougher stance with the major Internet platforms. Finally, the report analyses several areas of uncertainty – arenas in which telcos and others are likely to concentrate their lobbying efforts in future – and gives our high-level analysis of where telcos and regulators could make progress.
The regulatory constraints on telcos
Telcos’ lobbying efforts
More than just talk?
Policymakers change their priorities
Taking a tougher line with Internet players
Conclusions and areas of uncertainty
Figure 1: EBIT margins for various segments of the digital economy
Figure 2: ROCE in various segments of the digital value chain
Figure 3: Western Europe isn’t investing enough in telecoms infrastructure
Figure 4: Europe’s big five have stepped up capital spending
Figure 5: Vodafone & Telecom Italia invest more than 20% of revenues
Figure 6: The capital intensity of European telcos has been rising
Figure 7: Europe’s large telcos are seeing ROCE fall
Figure 8: Europe lags behind on LTE availability
Figure 9: In the UK, mobile operators already share infrastructure
Figure 10: The EU alleges Google uses Android to unfairly promote its apps
Figure 11: The key issues in telecoms regulation & their relative importance
Figure 12: The flywheel that can be driven by ROCE-aware regulation
In most emerging markets, which are the focus of this report, mobile networks are fast becoming the primary distribution channel for entertainment content. Although television is popular all over the world, in much of sub-Saharan Africa and developing Asia, terrestrial television coverage is patchy, while cable TV is rare. Satellite television is broadly available, but fewer than half of households can afford to buy a television, meaning many people only watch TV in bars, cafes or in the houses of friends.
In Kenya, for example, only 28% of households have a television, according to the World Bank development indicators, while in Tanzania that figure is just 15%. In some major developing markets, television has a stronger grip: 40% of households have a TV in Nigeria and 47% in India. For sub-Saharan Africa as a whole, television penetration is about 25%, and in South Asia, 36%.
For many people in these regions, purchasing a versatile smartphone, which can be used for communications, information access, commerce and entertainment, is a higher priority than acquiring a television. The advent of sub-US$40 smartphones means more and more people can now afford mobile devices with decent screens capable of displaying multimedia and processors that can run apps and full Internet browsers. In India, 220 million smartphones were in use at the end of 2015, according to one estimate, while Ericsson has forecast that the number of smartphones in use in sub-Saharan Africa will leap from 170 million at the end of 2015 to 690 million in 2021 (see Figure 1).
Figure 1: Predicted smartphone growth in developing regions
Source: Ericsson Mobility Report, November 2015
In emerging markets, most Internet users don’t own a television (see Figure 2) and many rely entirely on a smartphone for digital entertainment. Moreover, a scarcity of fixed line infrastructure means much of the entertainment content is delivered over mobile networks. Mobile trade group the GSMA estimates that 3G networks, which are typically fast enough to transmit reasonable video images, reach about three quarters of the planet’s people. Mobile network supplier Ericsson has forecast that mobile broadband networks (3G and/or 4G) will cover more than 90% of the world’s population by 2021.
Figure 2: Device ownership among Internet users in selected markets
The reliance on cellular infrastructure in developing countries has enabled mobile operators to take on a central role in the provision of online entertainment. The fact that many people rely almost solely on mobile networks for entertainment is presenting mobile operators with a major opportunity to boost their relevance and revenues. Given the capacity constraints on mobile networks and the implications for cellular tariffs, entertainment services need to be optimised to ensure that the costs of bandwidth don’t become prohibitive for consumers. Mobile operators’ understanding and real-time knowledge of their networks means they are in a good position to both manage the optimisation and package connectivity and content (regulation permitting) into one service bundle with a predictable and transparent tariff.
Although the network effects and economies of scale and scope enjoyed by YouTube and Facebook mean that both these players have strong positions in much of developing Asia, Latin America, the Middle East and Africa, some emerging market telcos have also built a solid foundation in the fast growing online entertainment sector. In Africa and India, for example, the leading telcos enable third party content providers to reach new customers through the telcos’ dedicated entertainment platforms, including web portals, individual apps and app stores selling music, TV and games. In return for supporting content offerings with their brands, networks, messaging, billing and payment systems, these telcos typically earn commission and capture valuable behavioural data.
Telcos and the entertainment opportunity
Roles in the online entertainment value chain
Further disruption ahead
Vodafone India faces up to new competition
The land-grab in India’s online entertainment market
Vodafone India combines content and connectivity
Takeaways – greater differentiation required
Music Gives MTN an Edge
Takeaways – music could be a springboard
Figure 1: Predicted smartphone growth in developing regions
Figure 2: Device ownership among Internet users in selected markets
Figure 3: How the key roles in online content are changing
Figure 4: How future-proof are telcos’ entertainment portfolios?
Figure 5: Vodafone India curates a wide range of infotainment content
Figure 6: Smartphone adoption in India will more than double in the next five years
Figure 7: Vodafone Mobile TV enables customers to subscribe to channels
Figure 8: The new Vodafone Play app combines TV, films and music
Figure 9: Vodafone India offers an app that makes it easy to track data usage
Figure 10: Vodafone’s Mobile TV app hasn’t attracted a strong following
Figure 11: Competitive and regulatory pressures are pushing down prices
Figure 12: In 3G, Vodafone India has kept pace with market leader Airtel
Figure 13: Vodafone India’s growth in data traffic compared with that of other telcos
Figure 14: Vodafone’s performance in India this decade
Figure 15: MTN’s Telco 2.0 strategy is focused on digital services
Figure 16: MTN’s growing array of digital services
Figure 17: MTN Play has been localised for each of MTN’s operations
Figure 18: The Ugandan version of MTN Play caters for local tastes
Figure 19: MTN bundles in some data traffic with each music plan
Figure 20: MTN’s digital services are particularly strong in Nigeria
Figure 21: MTN tops a list of most admired brands in Africa in 2015
As telecoms networks are the primary distribution channels for the digital economy, all telcos are in the entertainment business to a certain extent. With more than 3.2 billion people worldwide now connected to the Internet, according to the ITU, entertainment is increasingly delivered online and on-demand over telecoms and cable networks. The major Internet ecosystems – Amazon, Apple, Facebook and Google – are looking to dominate this market. But telcos could also play a pivotal role in an emerging new world order, either by providing enablers or by delivering their own differentiated entertainment offerings.
Many telcos have long flirted with offering their own entertainment services, typically as a retaliatory response to cable television providers’ push into communications. But these flings are now morphing into something more serious: connectivity and entertainment are becoming increasingly intertwined in telcos’ portfolios. Television, in particular, is shifting from the periphery, both in terms of telcos’ revenues and top management focus, onto centre stage. Some of the world’s largest telcos are beginning to invest in securing exclusive drama and sports content, even going as far as developing their own programming. This push is part of telcos’ broader search for ways to remain relevant in the consumer market, as usage of telcos’ voice and messaging services is curbed by over-the-top alternatives.
The central strategic dilemma for telcos is whether they should be selling services directly to the consumer or whether they should be providing enablers to other players (such as Amazon, Google, Netflix and Spotify) who might be prepared to pay for the use of dedicated content delivery networks, messaging, distribution, authentication, billing and payments. In many respects, this is not a new dilemma: Operators have tried to become content developers and distributors in the past, building portals, selling ringtones and games, and establishing app stores. What is new is the size of the table stakes: The expansion of broadband coverage and capacity has put the focus very much on increasingly high definition and immersive television and video. Creating this kind of content can be very expensive, prompting some of the largest telcos to invest billions of dollars, rather than tens of millions, in their entertainment proposition.
It isn’t just telcos undergoing a strategic rethink. The spread of broadband, the proliferation of connected digital devices and the shift to a multimedia Internet are shaking up the entertainment industry itself. Mobile and online entertainment accounts for US$195 billion (almost 11%) of the US$1.8 trillion global entertainment market today. And that proportion is growing. By some estimates, that figure is on course to rise to more than 13% of the global entertainment market, which could be worth US$2.2 trillion in 2019.
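The market-sizing arithmetic above can be sanity-checked with a quick calculation. This is a minimal sketch using only the figures quoted in the text; note the 2019 online share is given only as "more than 13%", so the implied 2019 figure is a lower bound.

```python
# Figures as quoted in the report text (US$ billions)
total_today_bn = 1800       # global entertainment market today
online_today_bn = 195       # mobile and online entertainment today
total_2019_bn = 2200        # projected global market in 2019
online_share_2019 = 0.13    # projected online share (lower bound)

# Online share of the market today: 195 / 1800 ~= 10.8%, i.e. "almost 11%"
share_today = online_today_bn / total_today_bn

# Implied online market in 2019: at least 13% of US$2.2tn ~= US$286bn
online_2019_bn = total_2019_bn * online_share_2019

print(f"Online share today: {share_today:.1%}")
print(f"Implied online market in 2019: at least US${online_2019_bn:.0f}bn")
```

In other words, the projections imply the online segment growing from roughly US$195bn to at least US$286bn over the forecast period, i.e. annual growth in the low double digits.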
For incumbents in the media industry, this is a seismic shift. Cable television companies, for example, have had to rethink their longstanding business model, which involved selling big bundles of television channels encompassing the good, the bad and the ugly. Individual customers typically only watch a small fraction of the cable TV channels they are paying for, prompting a growing number of them to seek out more cost-effective and more targeted propositions from over-the-top players.
Cable companies have responded by offering more choice and expanding across the entertainment value chain. For example, Comcast, a leading US cableco, offers an increasingly broad range of TV packages, ranging from US$16 a month (for about 10 local channels) to US$80 a month (for about 140 channels bundled with high speed Internet access). Moreover, Comcast is making its TV services more flexible, enabling customers to download/record video content to watch on mobile devices and PCs at their convenience. Even so, Comcast has been shedding cable TV subscriptions for most of the past decade. But the cableco’s vertical-integration strategy has more than compensated. Growth in Comcast’s NBCUniversal television and film group, which owns a major Hollywood studio, together with rising demand for high-speed Internet access, has kept the top line growing.
Roles in the online entertainment value chain
Other cablecos and telcos are following a similar playbook to Comcast, increasingly involving themselves in all four of the key roles in the online content value chain, identified by STL Partners. These four key roles are:
Programme: Content creation – producing drama series, movies or live sports programmes.
Package: Packaging programmes into channels or music into playlists and then selling these packages on a subscription basis or providing them free, supported by advertising.
Platform: Distributing TV channels, films or music created and curated by another entity.
Pipe: Providing connectivity, either to the Internet or to a walled content garden.
Clearly, virtually all telcos and cablecos play the pipe role, providing connectivity for online content. Many also operate platforms, essentially reselling television on behalf of others. But now a growing number, including BT, Telefónica and Verizon, are creating packages and even developing their own programming. The pipe and package roles present opportunities to capture behavioural data that can then be used to further hone the entertainment proposition and make personalised recommendations and offers. At the same time, the package and programme roles are becoming increasingly important as the platforms with the best content, the best channels and the best recommendations are likely to attract the most traffic.
Figure 1 illustrates how the package and platform roles, in particular, are increasingly converging, as consumers seek out services that can help them find and discover entertainment that suits their particular tastes. Google’s YouTube platform, for example, increasingly promotes its many channels (packages) to better engage consumers, help them discover content and help viewers navigate their way through the vast amount of video on offer.
By venturing into packaging and programming, telcos are hoping to differentiate their platforms from those of the major global online players – Amazon, Apple, Facebook, Google and Netflix – which benefit from substantial economies of scale and scope. But pursuing such a strategy can involve compromises.
In many cases, regulators force telcos to also make their programming and packaging available on third party TV platforms, including those of direct competitors. In the UK, for example, BT has to wholesale its BT Sports channels to other TV platforms, including that of arch rival Sky. Figure 2 shows how BT’s platform, packaging and programming is intertwined with that of third parties, creating a complex, multi-faceted market in which BT content is available through BT TV/BT Broadband and through other platforms and pipes.
Figure 1: How the key roles in online content are changing
Source: STL Partners analysis
Figure 2: BT has to provide standalone packaging & programming, as well as a platform
Source: STL Partners analysis
Telcos and the entertainment opportunity
Roles in the online entertainment value chain
Further disruption ahead
BT – betting big on sport
Takeaways – sport gives BT a broad springboard
Telefónica – leveraging languages
Takeaways – Telefónica could lead Hispanic entertainment
Verizon – acquiring and accumulating expertise
Takeaways – Verizon needs bigger and better content
Annex: Recommendations for telcos & cablecos in entertainment
Figure 1: How the key roles in online content are changing
Figure 2: BT has to provide standalone packaging & programming, as well as a platform
Figure 3: How future-proof are telcos’ entertainment portfolios?
Figure 4: The extras and upgrades to the free BT TV and BT Sports offer
Figure 5: The differences between BT TV’s free and premium packages
Figure 6: BT’s app enables consumers to watch premium content on handsets
Figure 7: BT Sport has driven broadband net-adds, but the rights bill is also rising
Figure 8: In the UK, BT is still behind the Sky TV platform but on a par with YouTube
Figure 9: How BT Sport creates value for BT
Figure 10: Telefónica offers a selection of bolt-ons to cater for different tastes
Figure 11: Acquisitions boosted Telefónica’s pay TV business in 2015
Figure 12: Pay TV and fibre broadband are the growth engines in Spain
Figure 13: Telefónica TV’s position versus that of Netflix and YouTube in Spain
Figure 14: Verizon’s three-tier strategy envisages providing platforms and solutions
Figure 15: Verizon was attracted by AOL’s growing platforms business
Figure 16: Verizon’s go90 is designed to be a content and social hybrid
Figure 17: AOL ranks sixth in terms of online visitors in the US
Figure 18: Verizon’s new go90 app has had a fairly positive response from users
Figure 19: AOL video trails far behind Internet rivals YouTube and Netflix in terms of usage
Figure 20: How future-proof are telcos’ entertainment portfolios?
This report analyses the market position and strategies of five global online entertainment platforms – Amazon, Apple, Facebook, Google and Netflix.
It also explores how improvements in digital technologies, consumer electronics and bandwidth are changing the online entertainment market, while explaining the ongoing uncertainty around net neutrality. The report then considers how well each of the five major entertainment platforms is prepared for the likely technological and regulatory changes in this market. Finally, it provides a high-level overview of the implications for telcos, paving the way for a forthcoming STL Partners report going into more detail about potential strategies for telcos in online entertainment.
The rise and rise of online entertainment
As in many other sectors, digital technologies are shaking up the global entertainment industry, giving rise to a new world order. Now that 3.2 billion people around the world have Internet access, according to the ITU, entertainment is increasingly delivered online and on-demand.
Mobile and online entertainment accounts for US$195 billion (almost 11%) of the US$1.8 trillion global entertainment market today. By some estimates, that figure is on course to rise to more than 13% of the global entertainment market, which could be worth US$2.2 trillion in 2019.
Two leading distributors of online content – Google and Facebook – have infiltrated the top ten media owners in the world as defined by ZenithOptimedia (see Figure 1). ZenithOptimedia ranks media companies according to all the revenues they derive from businesses that support advertising – television broadcasting, newspaper publishing, Internet search, social media, and so on. As well as advertising revenues, it includes all revenues generated by these businesses, such as circulation revenues for newspapers or magazines. However, for pay-TV providers, only revenues from content in which the company sells advertising are included.
Figure 1 – How Google and Facebook differ from other leading media owners
Source: ZenithOptimedia, May 2015/STL Partners
ZenithOptimedia says this approach provides a clear picture of the size and negotiating power of the biggest global media owners that advertisers and agencies have to deal with. Note that Figure 1 draws on data from the financial year 2013, the latest year for which ZenithOptimedia had consistent revenue figures from all of the publicly listed companies. Facebook, which is growing fast, will almost certainly have climbed up the table since then.
Figure 1 also shows STL Partners’ view of the extent to which each of the top ten media owners is involved in the four key roles in the online content value chain. These four key roles are:
Programme: Content creation, e.g. producing drama series, movies or live sports programmes.
Package: Content curation, e.g. packaging programmes into channels or music into playlists and then selling these packages on a subscription basis or providing them free, supported by advertising.
Platform: Content distribution, e.g. distributing TV channels, films or music created and curated by another entity.
Pipe: Providing connectivity, e.g. providing Internet access.
Increasing vertical integration
Most of the world’s top ten media owners have traditionally focused on programming and packaging, but the rise of the Internet with its global reach has brought unprecedented economies of scale and scope to the platform players, enabling Google and now Facebook to break into the top ten. These digital disruptors earn advertising revenues by providing expansive two-sided platforms that link creators with viewers. However, intensifying competition from other major ecosystems, such as Amazon, and specialists, such as Netflix, is prompting Google, in particular, to seek new sources of differentiation. The search giant is increasingly investing in creating and packaging its own content. The need to support an expanding range of digital devices and multiple distribution networks is also blurring the boundaries between the packaging and platform roles (see Figure 2, below) – platforms increasingly need to package content in different ways for different devices and different distribution networks.
Figure 2 – How the key roles in online content are changing
Source: STL Partners
These forces are prompting most of the major media groups, including Google and, to a lesser extent, Facebook, to expand across the value chain. Some of the largest telcos, including Verizon and BT, are also investing heavily in programming and packaging, as they seek to fend off competition from vertically-integrated media groups, such as Comcast and Sky (part of 21st Century Fox), who are selling broadband connectivity, as well as content.
In summary, the strongest media groups will increasingly create their own exclusive programming, package it for different devices and sell it through expansive distribution platforms that also re-sell third party content. These three elements feed off each other – the behavioural data captured by the platform can be used to improve the programming and packaging, creating a virtuous circle that attracts more customers and advertisers, generating economies of scale.
Although some leading media groups also own pipes, providing connectivity is less strategically important – consumers are increasingly happy to source their entertainment from over-the-top propositions. Instead of investing in networks, the leading media and Internet groups lobby regulators and run public relations campaigns to ensure telcos and cablecos don’t discriminate against over-the-top services. As long as these pipes are delivering adequate bandwidth and are sufficiently responsive, there is little need for the major media groups to become pipes.
The flip-side of this is that if telcos can convince regulators and media owners that there is a consumer and business benefit to differentiated network services (or discrimination, to use the pejorative term), then the value of the pipe role increases. Guaranteed bandwidth and low latency are two areas telcos could pursue here, but they will need to do a significantly better job of lobbying regulators and of marketing the benefits to consumers and content owners/distributors if this strategy is to succeed.
To be sure, Google has deployed some fibre networks in the US and is now acting as an MVNO, reselling airtime on mobile networks in the US. But these moves are largely a public relations exercise – designed primarily to showcase what is possible and to put pressure on telcos to improve connectivity, rather than to mount a serious competitive challenge.
The rise and rise of online entertainment
Increasing vertical integration
The world’s leading online entertainment platforms
A regional breakdown
The future of the online entertainment market
1. Rising investment in exclusive content
2. Back to the future: Live programming
3. The changing face of user generated content
4. Increasingly immersive games and interactive videos
5. The rise of ad blockers & the threat of a privacy backlash
6. Net neutrality uncertainty
How the online platforms are responding
Conclusions and implications for telcos
STL Partners and Telco 2.0: Change the Game
Google is the leading generator of online entertainment traffic in most regions
How future-proof are the major online platforms?
Figure 1: How Google and Facebook differ from other leading media owners
Figure 2: How the key roles in online content are changing
Figure 3: Google leads in most regions in terms of entertainment traffic
Figure 4: YouTube serves up an eclectic mix of music videos, reality TV and animals
Figure 5: Facebook users recommend videos to one another
Figure 6: Apple introduces apps for television
Figure 7: Netflix, Google, Facebook and Amazon all gaining share in North America
Figure 8: YouTube & Facebook increasingly about entertainment, not interaction
Figure 9: YouTube maintains lead over Facebook on American mobile networks
Figure 10: US smartphones may be posting fewer images and videos to Facebook
Figure 11: Over-the-top entertainment is a three-way fight in North America
Figure 12: YouTube, Facebook & Netflix erode BitTorrent usage in Europe
Figure 13: File sharing falling back in Europe
Figure 14: iTunes cedes mobile share to YouTube and Facebook in Europe
Figure 15: Facebook consolidates strong upstream lead on mobile in Europe
Figure 16: YouTube accounts for about one fifth of traffic on Europe’s networks
Summary: Key trends, tactics, and technologies for mobile broadband networks and services that will influence mid-term revenue opportunities, cost structures and competitive threats. Includes consideration of LTE, network sharing, WiFi, next-gen IP (EPC), small cells, CDNs, policy control, business model enablers and more. (March 2012, Executive Briefing Service, Future of the Networks Stream).
Below is an extract from this 44 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and Future Networks Stream here. Non-members can subscribe here, buy a Single User license for this report online here for £795 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email email@example.com / call +44 (0) 207 247 5003. We’ll also be discussing our findings and more at the Silicon Valley (27-28 March) and London (12-13 June) New Digital Economics Brainstorms.
In our recent ‘Under the Floor (UTF) Players’ Briefing we looked at strategies to deal with some of the challenges facing operators as a result of market structure and outsourcing.
This Executive Briefing is intended to complement and extend those efforts, looking specifically at those technical and business trends which are truly “disruptive”, either immediately or in the medium-term future. In essence, the document can be thought of as a checklist for strategists – pointing out key technologies or trends around mobile broadband networks and services that will influence mid-term revenue opportunities and threats. Some of those checklist items are relatively well-known, others more obscure but nonetheless important. What this document doesn’t cover is more straightforward concepts around pricing, customer service, segmentation and so forth – all important to get right, but rarely disruptive in nature.
During 2012, Telco 2.0 will be rolling out a new MBB workshop concept, which will audit operators’ existing technology strategy and planning around mobile data services and infrastructure. This briefing document is a roundup of some of the critical issues we will be advising on, as well as our top-level thinking on the importance of each trend.
It starts by discussing some of the issues which determine the extent of any disruption:
Growth in mobile data usage – and whether the much-vaunted “tsunami” of traffic may be slowing down
The role of standardisation, and whether it is a facilitator or inhibitor of disruption
Whether the most important MBB disruptions are likely to be telco-driven, or will stem from other actors such as device suppliers, IT companies or Internet firms.
The report then drills into a few particular domains where technology is evolving, looking at some of the most interesting and far-reaching trends and innovations. These are split broadly between:
Network infrastructure evolution (radio and core)
Control and policy functions, and business-model enablers
It is not feasible for us to cover all these areas in huge depth in a briefing paper such as this. Some areas such as CDNs and LTE have already been subject to other Telco 2.0 analysis, and this will be linked to where appropriate. Instead, we have drilled down into certain aspects we feel are especially interesting, particularly where these are outside the mainstream of industry awareness and thinking – and tried to map technical evolution paths onto potential business model opportunities and threats.
This report cannot be truly exhaustive – it doesn’t look at the nitty-gritty of silicon components, or antenna design, for example. It also treads a fine line between technological accuracy and ease-of-understanding for the knowledgeable but business-focused reader. For more detail or clarification on any area, please get in touch with us – email firstname.lastname@example.org or call +44 (0) 207 247 5003.
Telco-driven disruption vs. external trends
There are various potential sources of disruption for the mobile broadband marketplace:
New technologies and business models implemented by telcos, which increase revenues, decrease costs, improve performance or alter the competitive dynamics between service providers.
3rd party developments that can either bolster or undermine the operators’ broadband strategies. This includes both direct MBB innovations (new uses of WiFi, for example), or bleed-over from adjacent related marketplaces such as device creation or content/application provision.
External, non-technology effects such as changing regulation, economic backdrop or consumer behaviour.
The majority of this report covers “official” telco-centric innovations – LTE networks, new forms of policy control and so on.
External disruptions to monitor
But the most dangerous form of innovation is that from third parties, which can undermine assumptions about the ways mobile broadband can be used, introduce new mechanisms for arbitrage, or somehow subvert operators’ pricing plans or network controls.
In the voice communications world, there are often regulations in place to protect service providers – such as banning the use of “SIM boxes” to terminate calls and reduce interconnection payments. But in the data environment, it is far less obvious that many work-arounds can either be seen as illegal, or even outside the scope of fair-usage conditions. That said, we have already seen some attempts by telcos to manage these effects – such as charging extra for “tethering” on smartphones.
It is not really possible to predict all possible disruptions of this type – such is the nature of innovation. But by describing a few examples, market participants can gauge their level of awareness, as well as gain motivation for ongoing “scanning” of new developments.
Some of the areas being followed by Telco 2.0 include:
Connection-sharing. This is where users might link devices together locally, perhaps through WiFi or Bluetooth, and share multiple cellular data connections. This is essentially “multi-tethering” – for example, 3 smartphones discovering each other nearby, perhaps each with a different 3G/4G provider, and pooling their connections together for shared use. From the user’s point of view it could improve effective coverage and maximum/average throughput speed. But from the operators’ view it would break the link between user identity and subscription, and essentially offload traffic from poor-quality networks on to better ones.
SoftSIM or SIM-free wireless. Over the last five years, various attempts have been made to decouple mobile data connections from SIM-based authentication. In some ways this is not new – WiFi doesn’t need a SIM, while it’s optional for WiMAX, and CDMA devices have typically been “hard-coded” to just register on a specific operator network. But the GSM/UMTS/LTE world has always relied on subscriber identification through a physical card. At one level, this works very well – SIMs are distributed easily and have enabled a successful prepay ecosystem to evolve. They provide operator control points and the ability to host secure applications on the card itself. However, the need to obtain a physical card restricts business models, especially for transient/temporary use such as a “one day pass”. But the most dangerous potential change is a move to a “soft” SIM, embedded in the device software stack. Companies such as Apple have long dreamed of acting as a virtual network provider, brokering between user and multiple networks. There is even a patent for encouraging bidding per-call (or perhaps per data-connection) with telcos competing head to head on price/quality grounds. Telco 2.0 views this type of least-cost routing as a major potential risk for operators, especially for mobile data – although it could also enable some new business models that have been difficult to achieve in the past.
Encryption. Various of the new business models and technology deployment intentions of operators, vendors and standards bodies are predicated on analysing data flows. Deep packet inspection (DPI) is expected to be used to identify applications or traffic types, enabling differential treatment in the network, or different charging models to be employed. Yet this is rendered largely useless (or at least severely limited) when various types of encryption are used. Various content and application types already secure data in this way – content DRM, BlackBerry traffic, corporate VPN connections and so on. But increasingly, we will see major Internet companies such as Apple, Google, Facebook and Microsoft using such techniques both for their own users’ security and because it hides precise indicators of usage from the network operators. If a future Android phone sends all its mobile data back via a VPN tunnel and breaks it out in Mountain View, California, operators will be unable to discern YouTube video from search or VoIP traffic. This is one of the reasons why application-based charging models – one- or two-sided – are difficult to implement.
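The point can be made concrete with a toy sketch (this is not a real DPI engine, and all hostnames and ports below are hypothetical): a naive classifier that labels flows by visible destination and port works until the same traffic is wrapped in a VPN tunnel, at which point every flow presents the same opaque endpoint.

```python
# Toy flow classifier: labels traffic from the only fields a simple
# inspector can see (destination host and port). Names are illustrative.

def classify(flow):
    """Return an application label for a flow, or 'unknown'."""
    host, port = flow["dst"], flow["port"]
    if "youtube" in host:
        return "video"
    if "google" in host:
        return "search"
    if port == 5060:          # conventional SIP signalling port
        return "voip"
    return "unknown"

flows = [
    {"dst": "youtube.example", "port": 443},
    {"dst": "google.example", "port": 443},
    {"dst": "sip.example", "port": 5060},
]
print([classify(f) for f in flows])      # three distinct labels

# Route everything through a VPN: each flow now shows only the tunnel
# endpoint, so the classifier can no longer tell the applications apart.
tunnelled = [{"dst": "vpn.example", "port": 1194} for _ in flows]
print([classify(f) for f in tunnelled])  # all "unknown"
```

Application-specific pricing or policy built on this kind of visibility fails as soon as the tunnel is in place, which is exactly the risk described above.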
Application evolution speed. One of the largest challenges for operators is the pace of change of mobile applications. The growing penetration of smartphones, appstores and ease of “viral” adoption of new services causes a fundamental problem – applications emerge and evolve on a month-by-month or even week-by-week basis. This is faster than any realistic internal telco processes for developing new pricing plans, or changing network policies. Worse, the nature of “applications” is itself changing, with the advent of HTML5 web-apps, and the ability to “mash up” multiple functions in one app “wrapper”. Is a YouTube video shared and embedded in a Facebook page a “video service”, or “social networking”?
It is also really important to recognise that certain procedures and technologies used in policy and traffic management will likely have unanticipated side-effects. Users, devices and applications are likely to respond to controls that limit their actions, while other developments may spontaneously produce “emergent behaviours”. For instance, there is a risk that too-strict data caps might change usage models for smartphones, making users connect to the network only when absolutely necessary. This is likely to be at the same times and places when other users also feel it necessary, with the unfortunate implication that peaks of usage get “spikier” rather than being ironed out.
There is no easy answer to these types of external threat. Operator strategists and planners simply need to keep watch on emerging trends, and perhaps stress-test their assumptions and forecasts with market observers who keep tabs on such developments.
The mobile data explosion… or maybe not?
It is an undisputed fact that mobile data is growing exponentially around the world. Or is it?
A J-curve or an S-curve?
Telco 2.0 certainly thinks that growth in data usage is occurring, but is starting to see signs that the smooth curves that drive so many other decisions might not be so smooth – or so steep – after all. If this proves to be the case, it could be far more disruptive to operators and vendors than any of the individual technologies discussed later in the report. If operator strategists are not at least scenario-planning for lower data growth rates, they may find themselves in a very uncomfortable position in a year’s time.
In its most recent study of mobile operators’ traffic patterns, Ericsson concluded that Q2 2011 data growth was just 8% globally, quarter-on-quarter, a far cry from the 20%+ growth rates seen previously, and leaving a chart that looks distinctly like the beginning of an S-curve rather than a continued “hockey stick”. Given that the 8% includes a sizeable contribution from undoubted high-growth developing markets like China, it suggests that other markets are maturing quickly. (We are rather sceptical of Ericsson’s suggestion of seasonality in the data). Other data points come from O2 in the UK, which appears to have had essentially zero traffic growth for the past few quarters, or Vodafone, which now cites European data traffic to be growing more slowly (19% year-on-year) than its data revenues (21%). Our view is that current global growth is c.60-70% per annum, c.40% in mature markets and 100%+ in developing markets.
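To put the quarter-on-quarter figures in context, compounding them over four quarters gives the implied annual rate – a quick back-of-envelope calculation:

```python
# Compound an 8% quarter-on-quarter growth rate over four quarters to get
# the implied annualised rate, and compare with the 20%+ QoQ rates seen
# previously. Pure arithmetic on the figures quoted in the text.
qoq_new, qoq_old = 0.08, 0.20

annual_new = (1 + qoq_new) ** 4 - 1   # ~0.36: roughly 36% per year
annual_old = (1 + qoq_old) ** 4 - 1   # ~1.07: traffic more than doubling yearly

print(round(annual_new * 100), round(annual_old * 100))
```

In other words, the slowdown from 20% to 8% per quarter is the difference between traffic more than doubling each year and growing by about a third – a dramatic change for capacity planning.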
Figure 1 – Trends in European data usage
Now it is possible that various one-off factors are at play here – the shift from unlimited to tiered pricing plans, the stronger enforcement of “fair-use” plans and the removal of particularly egregious heavy users. Certainly, other operators are still reporting strong growth in traffic levels. We may see a resumption of growth, for example if cellular-connected tablets start to be used widely for streaming video.
But we should also consider the potential market disruption, if the picture is less straightforward than the famous exponential charts. Even if the chart looks like a 2-stage S, or a “kinked” exponential, the gap may have implications, like a short recession in the economy. Many of the technical and business model innovations in recent years have been responses to the expected continual upward spiral of demand – either controlling users’ access to network resources, pricing it more highly and with greater granularity, or building out extra capacity at a lower price. Even leaving aside the fact that raw, aggregated “traffic” levels are a poor indicator of cost or congestion, any interruption or slow-down of the growth will invalidate a lot of assumptions and plans.
Our view is that the scary forecasts of “explosions” and “tsunamis” have led virtually all parts of the industry to create solutions to the problem. We can probably list more than 20 approaches, most of them standalone “silos”.
Figure 2 – A plethora of mobile data traffic management solutions
What seems to have happened is that at least 10 of those approaches have worked – caps/tiers, video optimisation, WiFi offload, network densification and optimisation, collaboration with application firms to create “network-friendly” software and so forth. Taken collectively, there is actually a risk that they have worked “too well”, to the extent that some previous forecasts have turned into “self-denying prophecies”.
There is also another common forecasting problem occurring – the assumption that later adopters of a technology will behave like earlier users. In many markets we are now reaching 30-50% smartphone penetration. That means that all the most enthusiastic users are already connected, and we’re left with those that are (largely) ambivalent and probably quite light users of data. That will bring the averages down, even if each individual user is still increasing their consumption over time. But even that assumption may be flawed, as caps have made people concentrate much more on their usage, offloading to WiFi and restricting their data flows. There is also some evidence that the growing number of free WiFi hotspots is reducing laptop use of mobile data, which accounts for 70-80% of the total in some markets, while the much-hyped shift to tablets isn’t driving much extra mobile data, as most are WiFi-only.
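The dilution effect is easy to demonstrate with hypothetical numbers (the usage figures below are illustrative, not sourced from any operator): even if every individual grows their consumption, the blended average per user can fall as lighter late adopters join the base.

```python
# Illustrative cohort model: early adopters use 1.0 GB/month, later
# adopters 0.2 GB/month, and every individual grows usage 20% a year.

def blended_avg(early_users, late_users, year):
    """Average GB/month per user across both cohorts in a given year."""
    growth = 1.20 ** year                              # everyone grows 20%/yr
    total = early_users * 1.0 * growth + late_users * 0.2 * growth
    return total / (early_users + late_users)

year0 = blended_avg(30, 0, 0)    # only the enthusiasts are connected
year1 = blended_avg(30, 20, 1)   # 20 light users join; everyone grew 20%
print(round(year0, 2), round(year1, 2))
```

The average falls from 1.00 to 0.82 GB/month despite every user consuming 20% more than the year before – exactly the trap a naive per-user extrapolation falls into.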
So has the industry over-reacted to the threat of a “capacity crunch”? What might be the implications?
The problem is that focusing on a single, narrow metric – “GB of data across the network” – ignores some important nuances and finer detail. From an economics standpoint, network costs tend to be driven by two main criteria:
Network coverage in terms of area or population
Network capacity at the busiest places/times
Coverage is (generally) therefore driven by factors other than data traffic volumes. Many cells have to be built and run anyway, irrespective of whether there’s actually much load – the operators all want to claim good footprints and may be subject to regulatory rollout requirements. Peak capacity in the most popular locations, however, is a different matter. That is where issues such as spectrum availability, cell site locations and the latest high-speed networks become much more important – and hence costs do indeed rise. However, it is far from obvious that the problems at those “busy hours” are always caused by “data hogs” rather than sheer numbers of people each using a small amount of data. (There is also another issue around signalling traffic, discussed later).
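The distinction between tonnage and peak can be made concrete with a toy model (all numbers below are illustrative): two cells carrying identical daily volumes can require very different capacity, because capacity is provisioned for the busy hour, not the daily total.

```python
# Toy model: capacity (and hence cost) is sized for the busiest hour.
# Both hourly profiles below carry the same 240 GB per day.

flat  = [10] * 24          # 10 GB every hour, no pronounced peak
peaky = [4] * 23 + [148]   # quiet all day, one extreme busy hour

for name, profile in (("flat", flat), ("peaky", peaky)):
    total, peak = sum(profile), max(profile)
    # same tonnage, but the peaky cell needs ~15x the hourly capacity
    print(name, total, peak)
```

This is why raw, aggregated traffic is such a poor proxy for cost: the “peaky” cell drives the investment decision even though it moves no more data overall.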
Yes, there is a generally positive correlation between network-wide volume growth and costs, but it is far from perfect, and certainly not a direct causal relationship.
So let’s hypothesise briefly about what might occur if data traffic growth does tail off, at least in mature markets.
Delays to LTE rollout – if 3G networks are filling up less quickly than expected, the urgency of 4G deployment is reduced.
The focus of policy and pricing for mobile data may switch back to encouraging use rather than discouraging/controlling it. Capacity utilisation may become an important metric, given the high fixed costs and low marginal ones. Expect more loyalty-type schemes, plus various methods to drive more usage in quiet cells or off-peak times.
Regulators may start to take different views of traffic management or predicted spectrum requirements.
Prices for mobile data might start to fall again, after a period where we have seen them rise. Some operators might be tempted back to unlimited plans, for example if they offer “unlimited off-peak” or similar options.
Many of the more complex and commercially-risky approaches to tariffing mobile data might be deprioritised. For example, application-specific pricing involving packet-inspection and filtering might get pushed back down the agenda.
In some cases, we may even end up with overcapacity on cellular data networks – not to the degree we saw in fibre in 2001-2004, but there might still be an “overhang” in some places, especially if there are multiple 4G networks.
Steady growth of (say) 20-30% peak data per annum should be manageable with the current trends in price/performance improvement. It should be possible to deploy and run networks to meet that demand with reducing unit “production cost”, for example through use of small cells. That may reduce the pressure to fill the “revenue gap” on the infamous scissors-diagram chart.
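The arithmetic behind “manageable” growth is simple (the rates below are assumed for illustration): if peak traffic grows 25% a year while the unit cost per GB delivered falls 20% a year, total network cost is flat, because 1.25 × 0.80 = 1.0.

```python
# Illustrative only: traffic growing 25%/yr against unit cost falling
# 20%/yr leaves total cost (traffic x unit cost) unchanged.
traffic, unit_cost = 1.0, 1.0    # indexed to 1.0 in the starting year

for year in range(5):
    traffic *= 1.25              # +25% peak traffic per annum
    unit_cost *= 0.80            # -20% "production cost" per GB per annum

total_cost = traffic * unit_cost
print(round(total_cost, 6))      # stays at ~1.0 throughout
```

Growth above that break-even line forces new capex; growth below it turns mobile data into a steadily cheaper product to produce – hence the sensitivity of the whole investment case to the growth rate.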
Overall, it is still a little too early to declare shifting growth patterns for mobile data as a “disruption”. There is a lack of clarity on what is happening, especially in terms of responses to the new controls, pricing and management technologies put recently in place. But operators need to watch extremely closely what is going on – and plan for multiple scenarios.
Specific recommendations will depend on an individual operator’s circumstances – user base, market maturity, spectrum assets, competition and so on. But broadly, we see three scenarios and implications for operators:
“All hands on deck!”: Continued strong growth (perhaps with a small “blip”) which maintains the pressure on networks, threatens congestion, and drives the need for additional capacity, spectrum and capex.
Operators should continue with current multiple strategies for dealing with data traffic – acquiring new spectrum, upgrading backhaul, exploring massive capacity enhancement with small cells and examining a variety of offload and optimisation techniques. Where possible, they should explore two-sided models for charging and use advanced pricing, policy or segmentation techniques to rein in abusers and reward those customers and applications that are parsimonious with their data use. Vigorous lobbying will be needed to gain more spectrum, relax Net Neutrality rules and perhaps “tax” content/Internet companies for traffic injected onto networks.
“Panic over”: Moderating and patchy growth, which settles to a manageable rate – comparable with the patterns seen in the fixed broadband marketplace.
This will mean that operators can “relax” a little, with the respite in explosive growth meaning that the continued capex cycles should be more modest and predictable. Extension of today’s pricing and segmentation strategies should improve margins, with continued innovation in business models able to proceed without rush, and without risking confrontation with Internet/content companies over traffic management techniques. Focus can shift towards monetising customer insight, ensuring that LTE rollouts are strategic rather than tactical, and exploring new content and communications services that exploit the improving capabilities of the network.
“Hangover”: Growth flattens off rapidly, leaving operators with unused capacity and threatening brutal price competition between telcos.
This scenario could prove painful, reminiscent of the early-2000s experience in the fixed-broadband marketplace. Wholesale business models could help generate incremental traffic and revenue, while the emphasis will be on fixed-cost minimisation. Some operators will scale back 4G rollouts until cost and maturity go past the tipping-point for outright replacement of 3G. Restrictive policies on bandwidth use will be lifted, as operators compete to give customers the fastest / most-open access to the Internet on mobile devices. Consolidation – and perhaps bankruptcies – may ensue, as declining data prices coincide with substitution of the core voice and messaging business.
To read the note in full, including the following analysis…
Telco-driven disruption vs. external trends
External disruptions to monitor
The mobile data explosion… or maybe not?
A J-curve or an S-curve?
Evolving the mobile network
Network sharing, wholesale and outsourcing
Next-gen IP core networks (EPC)
Femtocells / small cells / “cloud RANs”
Advanced offload: LIPA, SIPTO & others
Self optimising networks (SON)
M2M-specific broadband innovations
Policy, control & business model enablers
The internal politics of mobile broadband & policy
Two sided business-model enablement
Mobile video networking and CDNs
Controlling signalling traffic
Analytics & QoE awareness
Conclusions & recommendations
…and the following figures…
Figure 1 – Trends in European data usage
Figure 2 – A plethora of mobile data traffic management solutions
Figure 3 – Not all operator WiFi is “offload” – other use cases include “onload”
Figure 4 – Internal ‘power tensions’ over managing mobile broadband
Figure 5 – How a congestion API could work
Figure 6 – Relative Maturity of MBB Management Solutions
Figure 9 – Summary of disruptive network innovations
…Members of the Telco 2.0 Executive Briefing Subscription Service and Future Networks Stream can download the full 44 page report in PDF format here. Non-Members, please subscribe here, buy a Single User license for this report online here for £795 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email email@example.com / call +44 (0) 207 247 5003.
Summary: Content Delivery Networks (CDNs) are becoming familiar in the fixed broadband world as a means to improve the experience and reduce the costs of delivering bulky data like online video to end-users. Is there now a compelling need for their mobile equivalents, and if so, should operators partner with existing players or build / buy their own? (August 2011, Executive Briefing Service, Future of the Networks Stream).
Below is an extract from this 25 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and Future Networks Stream here. Non-members can buy a Single User license for this report online here for £595 (+VAT) or subscribe here. For multiple user licenses, or to find out about interactive strategy workshops on this topic, please email firstname.lastname@example.org or call +44 (0) 207 247 5003.
As is widely documented, mobile networks are witnessing huge growth in the volumes of 3G/4G data traffic, primarily from laptops, smartphones and tablets. While Telco 2.0 is wary of some of the headline shock-statistics about forecast “exponential” growth, or “data tsunamis” driven by ravenous consumption of video applications, there is certainly a fast-growing appetite for use of mobile broadband.
That said, many of the actual problems of congestion today can be pinpointed either to a handful of busy cells at peak hour – or, often, the inability of the network to deal with the signalling load from chatty applications or “aggressive” devices, rather than the “tonnage” of traffic. Another large trend in mobile data is the use of transient, individual-centric flows from specific apps or communications tools such as social networking and messaging.
But “tonnage” is not completely irrelevant. Despite the diversity, there is still an inexorable rise in the use of mobile devices for “big chunks” of data, especially the special class of software commonly known as “content” – typically popular/curated standalone video clips or programmes, or streamed music. Images (especially those in web pages) and application files such as software updates fit into a similar group – sizeable lumps of data downloaded by many individuals across the operator’s network.
This one-to-many nature of most types of bulk content highlights inefficiencies in the way mobile networks operate. The same data chunks are downloaded time and again by users, typically going all the way from the public Internet, through the operator’s core network, eventually to the end user. Everyone loses in this scenario – the content publisher needs huge servers to dish up each download individually. The operator has to deal with transport and backhaul load from repeatedly sending the same content across its network (and IP transit from shipping it in from outside, especially over international links). Finally, the user has to deal with all the unpredictability and performance compromises involved in accessing the traffic across multiple intervening points – and ends up paying extra to support the operator’s heavier cost base.
In the fixed broadband world, many content companies have availed themselves of a group of specialist intermediaries called CDNs (content delivery networks). These firms on-board large volumes of the most important content served across the Internet, before dropping it “locally” as near to the end user as possible – if possible, served up from cached (pre-saved) copies. Often, the CDN operating companies have struck deals with the end-user facing ISPs, which have often been keen to host their servers in-house, as they have been able to reduce their IP interconnection costs and deliver better user experience to their customers.
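The core economics of caching can be sketched in a few lines (this is a deliberately minimal model, not a description of any real CDN's software; sizes and names are illustrative): the first request for an item is pulled once from the origin, paying transit, and every subsequent request is served from the local copy.

```python
# Minimal edge-cache sketch: origin fetch on a miss, local serve on a hit.
cache = {}
transit_mb = served_mb = 0       # MB pulled over transit vs MB delivered

def serve(item, size_mb):
    """Serve one request, fetching from the origin only on a cache miss."""
    global transit_mb, served_mb
    if item not in cache:
        cache[item] = size_mb    # miss: pull the item from the origin once
        transit_mb += size_mb
    served_mb += size_mb         # every request is delivered locally

for _ in range(1000):            # 1,000 users request the same 50 MB clip
    serve("popular_clip", 50)

print(transit_mb, served_mb)     # transit paid once; 1,000 copies delivered
```

With 1,000 requests for the same 50 MB item, transit carries 50 MB while 50,000 MB is delivered – a 1,000:1 saving on the expensive leg, which is the deal that makes ISPs willing to host CDN servers in-house.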
In the mobile industry, the use of CDNs is much less mature. Until relatively recently, the overall volumes of data didn’t really move the needle from the point of view of content firms, while operators’ radio-centric cost bases were also relatively immune from those issues as well. Optimising the “middle mile” for mobile data transport efficiency seemed far less of a concern than getting networks built out and handsets and apps perfected, or setting up policy and charging systems to parcel up broadband into tiered plans. Arguably, better-flowing data paths and video streams would only load the radio more heavily, just at a time when operators were having to compress video to limit congestion.
This is now changing significantly. With the rise in smartphone usage – and the expectations around tablets – Internet-based CDNs are pushing much more heavily to have their servers placed inside mobile networks. This is leading to a certain amount of introspection among the operators – do they really want to have Internet companies’ infrastructure inside their own networks, or could this be seen more as a Trojan Horse of some sort, simply accelerating the shift of content sales and delivery towards OTT-style models? Might it not be easier for operators to build internal CDN-type functions instead?
Some of the earlier approaches to video traffic management – especially so-called “optimisation” without the content companies’ permission or involvement – are becoming trickier with new video formats and more scrutiny from a Net Neutrality standpoint. But CDNs by definition involve the publishers, so any necessary compression or other processing can potentially be handled collaboratively, rather than applied “transparently” without their cooperation.
At the same time, many of the operators’ usual vendors are seeing this transition point as a chance to differentiate their new IP core network offerings, typically combining CDN capability into their routing/switching platforms, often alongside the optimisation functions as well. In common with other recent innovations from network equipment suppliers, there is a dangled promise of Telco 2.0-style revenues that could be derived from “upstream” players. In this case, there is a bit more easily-proved potential, since this would involve direct substitution of the existing revenues already derived from content companies, by the Internet CDN players such as Akamai and Limelight. This also holds the possibility of setting up a two-sided, content-charging business model that fits OK with rules on Net Neutrality – there are few complaints about existing CDNs except from ultra-purist Neutralists.
On the other hand, telco-owned CDNs have existed in the fixed broadband world for some time, with largely indifferent levels of success and adoption. There needs to be a very good reason for content companies to choose to deal with multiple national telcos, rather than simply take the easy route and choose a single global CDN provider.
So, the big question for telcos around CDNs at the moment is “should I build my own, or should I just permit Akamai and others to continue deploying servers into my network?” Linked to that question is what type of CDN operation an operator might choose to run in-house.
There are four main reasons why a mobile operator might want to build its own CDN:
To lower costs of network operation or upgrade, especially in radio network and backhaul, but also through the core and in IP transit.
To improve the user experience of video, web or applications, either in terms of data throughput or latency.
To derive incremental revenue from content or application providers.
For wider strategic or philosophical reasons about “keeping control over the content/apps value chain”.
This Analyst Note explores these issues in more detail, first giving some relevant contextual information on how CDNs work, especially in mobile.
What is a CDN?
The traditional model for Internet-based content access is straightforward – the user’s browser requests a piece of data (image, video, file or whatever) from a server, which then sends it back across the network, via a series of “hops” between different network nodes. The content typically crosses the boundaries between multiple service providers’ domains, before finally arriving at the user’s access provider’s network, flowing down over the fixed or mobile “last mile” to their device. In a mobile network, that also typically involves transiting the operator’s core network first, which has a variety of infrastructure (network elements) to control and charge for it.
A Content Delivery Network (CDN) is a system for serving Internet content from servers which are located “closer” to the end user either physically, or in terms of the network topology (number of hops). This can result in faster response times, higher overall performance, and potentially lower costs to all concerned.
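The “closer server” idea above can be sketched in a few lines. This is a simplified, hypothetical model – real CDN request routing uses DNS redirection, anycast and live latency measurements, and all the server names and hop counts below are invented for illustration:

```python
# Hypothetical sketch: a CDN request router picking the "closest" edge
# server for a user, measured here simply by network hops.
# Server names and hop counts are invented for illustration.

EDGE_SERVERS = {
    "edge-london": 2,        # hops from this user's access network
    "edge-frankfurt": 5,
    "origin-datacentre": 11, # the publisher's own server, many hops away
}

def pick_server(servers):
    """Return the server with the fewest hops to the user."""
    return min(servers, key=servers.get)

print(pick_server(EDGE_SERVERS))  # -> edge-london
```

Fewer hops generally means faster response times and less load on intermediate networks, which is the core of the CDN value proposition described above.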
In most cases in the past, CDNs have been run by specialist third-party providers, such as Akamai and Limelight. This document also considers the role of telcos running their own “on-net” CDNs.
CDNs can be thought of as analogous to the distribution of bulky physical goods – it would be inefficient for a manufacturer to ship all products to customers individually from a single huge central warehouse. Instead, it will set up regional logistics centres that can be more responsive – and, if appropriate, tailor the products or packaging to the needs of specific local markets.
As an example, there might be a million requests for a particular video stream from the BBC. Without using a CDN, the BBC would have to provide sufficient server capacity and bandwidth to handle them all. The company’s immediate downstream ISPs would have to carry this traffic to the Internet backbone, the backbone itself has to carry it, and finally the requesters’ ISPs’ access networks have to deliver it to the end-points. From a media-industry viewpoint, the source network (in this case the BBC) is generally called the “content network” or “hosting network”; the destination is termed an “eyeball network”.
In a CDN scenario, all the data for the video stream has to be transferred across the Internet just once for each participating network, when it is deployed to the downstream CDN servers and stored. After this point, it is carried only over the user-facing eyeball networks, not across the public Internet. This also means that the CDN servers can be located strategically within the eyeball networks, in order to use their resources more efficiently. For example, the eyeball network could place the CDN server on the downstream side of its most expensive link, so as to avoid carrying the video over it multiple times. In a mobile context, CDN servers could be used to avoid pushing large volumes of data through expensive core-network nodes repeatedly.
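The scale of the saving on upstream links can be illustrated with a back-of-envelope model. The request count echoes the BBC example above; the video size and number of participating networks are invented assumptions:

```python
# Illustrative model of the backbone bandwidth saving from caching:
# a million requests for one video, with and without in-network CDN
# servers. Video size and network count are invented assumptions.

REQUESTS = 1_000_000
VIDEO_MB = 500
PARTICIPATING_NETWORKS = 50

# Without a CDN, every request pulls the video from the origin across
# the backbone and intermediate ISPs.
backbone_mb_no_cdn = REQUESTS * VIDEO_MB

# With a CDN, the video crosses the backbone once per participating
# network during ingestion; thereafter it is served from local caches.
backbone_mb_cdn = PARTICIPATING_NETWORKS * VIDEO_MB

print(backbone_mb_no_cdn // backbone_mb_cdn)  # -> 20000 (20,000x less)
```

However crude, the model shows why the economics favour pushing popular content as close to the eyeball networks as possible.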
When the video or other content is loaded into the CDN, other optimisations such as compression or transcoding into other formats can be applied if desired. There may also be various treatments relating to new forms of delivery such as HTTP streaming, where the video is broken up into “chunks” with several different sizes/resolutions. Collectively, these upfront processes are called “ingestion”.
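The chunking step of ingestion can be sketched as follows. This is a minimal model, not a real packager: the chunk duration and bitrate ladder are invented, and real HTTP streaming formats (e.g. HLS, MPEG-DASH) add manifests, codecs and encryption on top:

```python
# Sketch of the "ingestion" step described above: breaking a video into
# fixed-duration chunks at several resolutions/bitrates, as HTTP
# streaming requires. Durations and bitrates are invented assumptions.

def ingest(duration_s, chunk_s=10, bitrates_kbps=(400, 1200, 3000)):
    """Return a simple manifest: one (bitrate, chunk index) entry per chunk."""
    n_chunks = -(-duration_s // chunk_s)   # ceiling division
    return [(rate, i) for rate in bitrates_kbps for i in range(n_chunks)]

manifest = ingest(duration_s=95)
# 95s of video -> 10 chunks per bitrate, 3 bitrates -> 30 entries
print(len(manifest))  # -> 30
```

The player can then switch between bitrates chunk-by-chunk as network conditions change, which is precisely why ingestion must prepare several renditions up front.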
Figure 1 – Content delivery with and without a CDN
Source: STL Partners / Telco 2.0
Value-added CDN services
It is important to recognise that the fixed-centric CDN business has increased massively in richness and competition over time. Although some of the players have very clever architectures and IPR in the forms of their algorithms and software techniques, the flexibility of modern IP networks has tended to erode away some of the early advantages and margins. Shipping large volumes of content is now starting to become secondary to the provision of associated value-added functions and capabilities around that data. Additional services include:
Analytics and reporting
Content ingestion and management
Website security management
Consulting and professional services
It is no coincidence that the market leader, Akamai, now refers to itself as “provider of cloud optimisation services” in its financial statements, rather than a CDN, with its business being driven by “trends in cloud computing, Internet security, mobile connectivity, and the proliferation of online video”. In particular, it has started refocusing away from dealing with “video tonnage”, and towards application acceleration – for example, speeding up the load times of e-commerce sites, which has a measurable impact on abandonment of purchasing visits. Akamai’s total revenues in 2010 were around $1bn, less than half of which came from “media and entertainment” – the traditional “content industries”. Its H1 2011 revenues were relatively disappointing, with growth coming from non-traditional markets such as enterprise and high-tech (eg software update delivery) rather than media.
This is a critically important consideration for operators that are looking to CDNs to provide them with sizeable uplifts in revenue from upstream customers. Telcos – especially in mobile – will need to invest in various additional capabilities as well as the “headline” video traffic management aspects of the system. They will need to optimise for network latency as well as throughput, for example – which will probably not have the cost-saving impacts expected from managing “data tonnage” more effectively.
Although in theory telcos’ other assets should help – for example mapping download analytics to more generalised customer data – this is likely to involve extra complexity with the IT side of the business. There will also be additional efforts around sales and marketing that go significantly beyond most mobile operators’ normal footprint into B2B business areas. There is also a risk that an analysis of bottlenecks for application delivery / acceleration ends up simply pointing the finger of blame at the network’s inadequacies in terms of coverage. Improving delivery speed, cost or latency is only valuable to an upstream customer if there is a reasonable likelihood of the end-user actually having connectivity in the first place.
Figure 2: Value-added CDN capabilities
An increasingly important aspect of CDNs is their move beyond content/media distribution into a much wider area of “acceleration” and “cloud enablement”. As well as delivering large pieces of data efficiently (e.g. video), there is arguably more tangible value in delivering small pieces of data fast.
There are various manifestations of this, but a couple of good examples illustrate the general principles:
Many web transactions are abandoned because websites (or apps) seem “slow”. Few people would trust an airline’s e-commerce site, or a bank’s online interface, if they’ve had to wait impatiently for images and page elements to load, perhaps repeatedly hitting “refresh” on their browsers. Abandoned transactions can be directly linked to slow or unreliable response times – typically a function of congestion either at the server or various mid-way points in the connection. CDN-style hosting can accelerate the service measurably, leading to increased customer satisfaction and lower levels of abandonment.
Enterprise adoption of cloud computing is becoming exceptionally important, with both cost savings and performance enhancements promised by vendors. Sometimes, such platforms will involve hybrid clouds – a mixture of private (internal) and public (Internet) resources and connectivity. Where corporates are reliant on public Internet connectivity, they may well want to ensure as fast and reliable a service as possible, especially in terms of round-trip latency. Many IT applications are designed to be run on ultra-fast company private networks, with a lot of “hand-shaking” between the user’s PC and the server. This process is very latency-dependent, and especially as companies also mobilise their applications, the additional overhead time in cellular networks may otherwise cause significant problems.
Hosting applications at CDN-type cloud acceleration providers achieves much the same effect as for video – they can bring the application “closer”, with fewer hops between the origin server and the consumer. Additionally, the CDN is well-placed to offer additional value-adds such as firewalling and protection against denial-of-service attacks.
To read the 25-page note in full, including the following additional content…
How do CDNs fit with mobile networks?
Internet CDNs vs. operator CDNs
Why use an operator CDN?
Should delivery mean delivery?
Lessons from fixed operator CDNs
Mobile video: CDNs, offload & optimisation
CDNs, optimisation, proxies and DPI
The role of OVPs
Implementation and planning issues
Conclusion & recommendations
… and the following additional charts…
Figure 3 – Potential locations for CDN caches and nodes
Figure 4 – Distributed on-net CDNs can offer significant data transport savings
Figure 5 – The role of OVPs for different types of CDN player
Figure 6 – Summary of Risk / Benefits of Centralised vs. Distributed and ‘Off Net’ vs. ‘On-Net’ CDN Strategies
Members of the Telco 2.0 Executive Briefing Subscription Service and Future Networks Stream can download the full 25 page report in PDF format here. Non-Members, please see here for how to subscribe, here to buy a single user license for £595 (+VAT), or for multi-user licenses and any other enquiries please email email@example.com or call +44 (0) 207 247 5003.
Summary: ‘Net Neutrality’ has gathered increasing momentum as a market issue, with AT&T, Verizon, major European telcos, Google and others all making their points in advance of the Ofcom, EC, and FCC consultation processes. This is Telco 2.0’s input, analysis and recommendations. (September 2010, Foundation 2.0, Executive Briefing Service, Future of the Networks Stream).
NB A PDF copy of this 17 page document can be downloaded in full here. We’ll also be discussing this at the Telco 2.0 Executive Brainstorms. Email firstname.lastname@example.org or call +44 (0) 207 247 5003 to find out more.
In this paper, Telco 2.0 recommends that the appropriate general response to concerns over ‘Net Neutrality’ is to make it easier for customers to understand what they should expect, and what they actually get, from their broadband service, rather than impose strict technical rules or regulation about how ISPs should manage their networks.
In this article we describe in detail why, and provide recommendations for how.
NB We would like to express our thanks to Dean Bubley of Disruptive Analysis, who has worked closely with our team to develop this paper.
‘Net Neutrality’ is an issue manufactured and amplified by lobbyists on the behalf of competing commercial interests. Much of the debate on the issue has become somewhat distracting and artificial as the ‘noise’ of self-interested opinion has become much louder than the ‘signal’ of potential resolutions.
The libertarian ideal that the title implies is a clever piece of PR manipulation of ideas of freedom of access of information, and freedom from interference. For the most part, this is far from the reality of the motives of the players engaged in the debate.
Additionally, the ‘public’ net neutrality debate is being driven by tech-savvy early adopters whose views and ‘use cases’ are not statistically representative of the overall internet population.
This collection of factors has created a strange landscape of idealist and specialised viewpoints congregating around the industry lobbyists’ various positions.
However, behind the scenes, the big commercial players are becoming increasingly tense, and we have recently experienced a marked reluctance from senior telco executives to comment on the issue in public.
Our position is that, beyond the hyperbole, the fair and proper management of contention between Internet Applications and ‘Specialised Services’ is important in the interests of consumers and the potential creation of new business models.
What, exactly, is the ‘problem’ and for whom?
Rapidly increasing use of the Internet and Specialised Services, particularly bandwidth hungry applications like online video, is causing (or, at least, will in theory cause) increasing contention in parts of the network.
The currently expressed primary concerns of net neutrality activists are that some consumers will receive a service whose delivery has been covertly manipulated by an external party, in this case their ISP. Similarly, some application and service providers fear that their services are or will consequently be discriminated against by telcos.
Some telcos think that certain other large and bandwidth-hungry applications are receiving a ‘free ride’ on their networks, and their corporate owners consequently receiving the benefits of expensive network investments without contribution. As a consequence, ISPs argue that they should be entitled to unilaterally constrain certain types of applications unless application providers pay for the additional bandwidth.
It’s a Commercial Issue, not a Moral Issue
One of the areas of obfuscation in the ‘Net Neutrality’ debate is the confusion between two sets of issues in the debate: ‘moral and legal’ and ‘commercial’.
Moral & legal issues include matters such as ‘freedom of expression’ and the right to unfettered internet access, the treatment of pirated content, and censorship of extreme religious or pornographic materials. We regard these as subjects for the law where the service is consumed / produced etc., but that have in some places become entangled in the ‘Net Neutrality’ debate and which should not be its focus.
The commercial issue is whether operators should be regulated in how they prioritise traffic from one commercial application over another without the user’s knowledge.
What causes this problem?
Contention can arise at different points between the service or application and the user, for example:
Caused by bulk traffic from users and applications in the ‘core network’ beyond the local exchange (akin to the general slowing of Internet applications in the evening in Europe due to greater local and U.S. usage at that time);
Between applications on a bandwidth restricted local access route (e.g. ADSL over a copper pair, mobile broadband).
As a service may originate from and be delivered to anywhere globally, the first kind of contention can only truly be managed if there is either a) an Internet-wide standard for prioritising different types of traffic, or b) a specific overlay network for that service which bypasses the internet to a certain ‘outer’ point in the network closer to the consumer, such as a local exchange. This latter class of service delivery may be accompanied by a connection between the exchange and the end-user that is not over the internet – and this is the case in most IPTV services.
To alleviate issues of contention, various ‘Traffic Management’ strategies are available to operators, as shown in the following diagram, with increasingly controversial types of intervention to the right.
Figure 1 – Ofcom’s Traffic Management Continuum
Is It Really a Problem?
Operators already apply traffic management techniques. An example was given by 3UK’s Director of Network Strategy at the recent Broadband Stakeholder Group (BSG) event in London, who explained that at peak times in the busiest cells, 3 limits SS7 signalling and P2P traffic. He explained that these categories were selected because they are essentially ‘background’ applications that have little impact on the consumer’s experience, and it was important to keep latency down so that more interactive applications like Web browsing functioned well. A major ‘use case’ for 3UK was identifying which cells needed investment.
In 3UK’s case, there was perhaps surprisingly more signalling traffic than there was P2P. Though this is a mobile peculiarity, it illustrates that assumptions about traffic management problems can often be wrong, and it is important that decisions are taken on the basis of data rather than prejudice.
While there are vociferous campaigners and powerful commercial interests at stake, it is fair to say that the streets are not often full of angry consumers waving banners reading ‘Hands off my YouTube’ and knocking on the doors of telcos’ HQs. While a quick and entirely non-representative survey of Telco 2.0’s non-technical relatives-of-choice revealed complete ignorance and lack of further interest in the subject, this does not necessarily mean that there is not, or could not be, a problem, and it is possible that consumers could unwittingly suffer. On balance though, Telco 2.0 has not yet seen significant evidence of a market failure. We also believe that the mechanisms of the market are the best means of managing potential conflict.
A case of ‘Terminological Inexactitude’
We broadly agree with Alex Blowers of OFCOM, who said that ‘80% of the net neutrality debate is in the definition’ at the recent BSG conference.
First, the term ‘Net Neutrality’ does not actually distinguish which services it refers to – does ‘Net’ mean ‘The Internet’, ‘The Network’, or something else? To most it is taken to mean ‘The Internet’, so what is ‘The Internet’? Despite the initial sense that the answer to this question seems completely obvious, a short conversation within or outside the industry will reveal an enormous range of definitions. The IT Director will give you a different answer from your non-technical relatives and friends.
These ambiguities have the straightforward consequence that the term ‘Net Neutrality’ can be used to mean whatever its user wants, and it is therefore generally a recipe for circular arguments and confusion. In other words: perfect conditions for lobbyists with partial views.
For most people, ‘the internet’ is “everything I can get or do when my computer or phone is connected online”. A consumer with such a view probably has a broadband line and an internet service and is among those, in theory at least, most in need of protection from unscrupulous policy management that might favour one form of online traffic over another without their knowledge or control. It is their understanding and expectation of what they have bought against the reality of what they get that we see as the key in this matter.
In this paper, we discuss two classes of services that can be delivered via a broadband access line.
1. Access to ‘The Internet’ (note capitalisation), which means being able to see and interact with the full range of websites, applications and services that are legitimate and publicly available. We set out some guiding principles below on a tighter definition of what services described as ‘The Internet’ should deliver.
2. ‘Specialised Services’ are other services that use a broadband line, that often connect to a device other than a PC (e.g. IPTV via set-top boxes, smart meters, RIM’s Blackberry Exchange Server (BES)) or a service that may be connected to a PC but via a VPN, such as corporate video conferencing, Cloud or Enterprise VOIP solutions.
While ‘Specialised Services’ are not by our definition pure Internet services, they can also have an effect in certain circumstances on the provision of ‘The Internet’ to an end-user where they share parts of the connection that are in contention. Additionally, there can be contention between services on ‘The Internet’ from multiple users or applications connected via a common router.
Additionally, fixed and mobile communications present different contexts for the services, with different potential mechanisms for control and management. Mobile services have the particular difference that, other than signalling, there is no connection between device and the network when data services are not being used.
The Internet: ‘Appellation Controlee’?
One possible mechanism to improve consumer understanding and standards of marketing services is to introduce a framework for defining more tightly services sold as “Internet Access”. In our view, services sold as ‘The Internet’ should:
Provide access to all legitimate online services using the ‘public’ internet;
Perform within certain bounds of service performance as marketed (e.g. speed, latency);
Be subject to the minimum necessary ‘traffic management’ by the ISP, which should only be permissible in specific instances, such as contention (e.g. peak-hour use);
Aim to maintain consistent delivery of all services in line with reasonable customer expectation and best possible customer experience (as exemplified in a ‘code of best practice’);
Provide published and accessible performance measures against ‘best practice’ standards.
Where a customer has paid extra for a Specialised Service, e.g. IPTV, it is reasonable to give that service priority to pre-agreed limits while in use.
The point of defining such an experience would be to give consumers a reference point, or perhaps a ‘Kitemark’, to assure them of the nature of the service they are buying. In instances where the service sold is less than that defined, the service would need to be identified, e.g. a ‘Limited Internet Access Service’.
The Internet isn’t really ‘Neutral’
To understand the limitations and possible advantages of ‘traffic management’, and put this into context, it is worth briefly reviewing some of the other ways in which customer and service experiences vary.
Different Services Work in Different Ways
Many Internet services already use complex mechanisms to optimise their delivery to the end-user. For example:
Google has built a huge Content Delivery Network, using fibre to speed communications between data centres, dedicated delivery of traffic to international peering points, and equipment at ISPs for expediting caching and content delivery, to ensure that its content is delivered more rapidly to the outer edges of the network;
BBC News Player uses an Akamai Content Delivery Network (CDN) similarly;
Skype delivers its traffic more effectively by optimising its route through the peer-to-peer network.
Equally, most ISPs are able to ‘tune’ their data services to better match the characteristics of their own network. Although these assets are only available to the services that pay for, own, or create them, none of these techniques actively slows any other service. Indeed, and in theory, by creating or using new non-congested routes, they free capacity for other services so the whole network benefits.
Consumer Experiences are Different too
Today’s consumer experience of ISP services varies widely with local factors. Two neighbours (who happen to be on different nodes) could, in theory, get a very different user experience from the same ISP depending on factors such as:
Local congestion (service node contention, loading, router and backhaul capacity);
Quality and length of local loop (including customer internal wiring);
Physical signal interference at the MDF (potentially a big issue where there is lots of ULL);
Time of day;
Router make and settings (particularly relating to QOS, security).
These factors will, in many cases, massively outweigh performance variation experienced from possible ‘traffic management’ by ISPs.
Internet Protocols try to be ‘Fair’
The Internet runs using a set of data traffic rules or protocols which determine how different pieces of data reach their destinations. These protocols, e.g. TCP/IP, OSPF, BGP, are designed to ensure that traffic from different sources is transmitted with equal priority and efficiency.
Further Technical Fixes Are Possible
Network congestion is not an issue that appeared overnight with the FCC’s 2005 Comcast decision. In fact, the Internet engineering community has been grappling with it with some success since the near-disaster in the late 1980s that led to the introduction of congestion control mechanisms in TCP.
Much more recently, the popular BitTorrent file sharing protocol, frequently criticised for getting around TCP’s congestion control, has been adapted to provide application-level congestion control. The P4P protocol, created at MIT and tested by Verizon and Telefonica, provides means for P2P systems and service provider networks to cooperate better. However, it remains essentially unused.
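The TCP congestion control mentioned above is, at its core, the additive-increase/multiplicative-decrease (AIMD) rule: grow the sending window steadily while the path is clear, and cut it sharply on signs of congestion. A minimal sketch (window measured in segments, real TCP adds slow start, timeouts and more):

```python
# Minimal sketch of TCP's additive-increase / multiplicative-decrease
# (AIMD) congestion control: the congestion window grows by one segment
# per round trip, and roughly halves when packet loss signals congestion.

def aimd(window, loss):
    """One round trip of AIMD: halve on loss, otherwise add one segment."""
    return max(1, window // 2) if loss else window + 1

w = 10
w = aimd(w, loss=False)   # no loss: 10 -> 11
w = aimd(w, loss=True)    # loss:    11 -> 5
print(w)  # -> 5
```

It is this back-off behaviour, applied by every compliant flow, that gives the Internet its rough-and-ready fairness between competing streams – and which protocols like early BitTorrent were criticised for circumventing.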
A further consideration is that it is necessary to be realistic about what can be expected – we have heard the benefit of traffic-shaping cited as extending the upgrade cycle by around 10% in the best-case scenario.
It’s Complex, not Neutral
It is therefore simply not the case that all Internet services progress from point of origin somewhere in the cloud of cyberspace to the end-users via a random and polite system. There are assets that are not equally shared, significant local variations, and there are complex rules and standards.
‘The Internet’ is a highly complex structure with many competing mechanisms of delivery, and this is one of its great strengths – the multiplicity of routes and mechanisms creates a resilient and continually evolving and improving system. But it is not ‘neutral’, although many of its core functions (such as congestion control) are explicitly designed to be fair.
Don’t Block the Pipes, Lubricate the Market
In principle, Telco 2.0 endorses developments that support new business models, but also believes that the rights of end-users should be appropriately protected. They have, after all, already paid for the service, and having done so should have the right to access the services they believe they have paid for within the bounds of legality.
In terms of how to achieve this balance, it’s very difficult to measure and police service levels, and we believe that simply mandating traffic management solutions alone is impractical.
Moreover, we think that creating a fair and efficient market is a better mechanism than any form of regulation on the methods that operators use to prioritise services.
Empower the Customer
There are three basic ways of creating and fulfilling expectations fairly, and empowering end-customers to make better decisions on which service they choose.
Improving Transparency – being clear and honest about what the customer can expect from their service in terms of performance, and making sure that any traffic management approaches are clearly communicated.
Enabling DIY Service Management – some customers, particularly corporate clients and advanced users, are able and can be expected to manage significant components of their Internet services. For example, mechanisms already exist to flag classes of traffic as priority, and many types of CPE are capable of doing so. It’s necessary, however, that the service provider’s routers honour the attribute in question and that users are aware of it. Many customers would need support to manage this effectively, and this could be a role for 3rd parties in the market, though it is unlikely that this alone will result in fairness for all users.
Establishing Protection – for many customers, DIY Service Management is neither interesting nor possible, and we argue that a degree of protection is desirable by defining fair rules or ‘best practice’ for traffic management.
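The ‘DIY’ flagging of priority traffic mentioned above has a concrete endpoint-side mechanism: setting the DSCP code point in the IP header’s TOS byte on a socket. The sketch below shows the standard approach; note, as the text says, that whether any router en route actually honours the marking is entirely up to the networks involved:

```python
# Sketch of "flagging classes of traffic as priority" at the endpoint:
# setting the DSCP field (carried in the IP TOS byte) on a UDP socket.
# Routers are free to ignore the marking unless configured to honour it.
import socket

DSCP_EF = 46          # "Expedited Forwarding" class, per RFC 2474/3246
TOS = DSCP_EF << 2    # DSCP occupies the top six bits of the TOS byte

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS)
print(s.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))  # -> 184
s.close()
```

CPE routers typically do the same job on the subscriber’s behalf, reading or rewriting these markings according to rules the user (or their service provider) configures.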
Not all customers are alike
‘Net Neutrality’ or any form of management of contention is not an issue for corporate customers, most of whom have the ability to configure their IP services at will. For example, a financial services trader is likely to prioritise Bloomberg and trading services above all other services. This is not a new concept, as telcos have been offering managed data services (priority etc.) to enterprise customers for years over their data connections and private IP infrastructure.
Some more advanced consumer users can also prioritise their own services. Some can alter the traffic management rules in their routers as described above. However, these customers are certainly in the minority of Innovators and Early Adopters. Innovation in user-experience design could change this to a degree, especially if customers have a reason to engage rather than being asked to do their service provider’s bottom line a favour.
The issue of unmanaged contention is therefore likely to affect the mass market, but is only likely to arise in certain circumstances. To illustrate this we have selected a number of specific scenarios or use cases in which we will show how we believe the principles we advocate should be applied. But first, what are our principles?
Lubricate the Market
There are broadly three regulatory market approaches.
‘Do nothing’ – the argument for this is that there is no evidence of market failure, and that regulating the service is therefore unnecessary and moreover difficult to do. We have some sympathy for this position, but believe that in practice some degree of direction is needed, as recommended below.
‘Regulate the Market’ – so that telcos can do what they like with the traffic but customers can choose between suppliers on the basis of clear information about their practices and performance. A pure version of this approach would involve the specification of better consumer information at point of sale and published APIs on congestion.
‘Regulate the Method’ – with hard rules on traffic management rather than how the services are sold and presented. The ‘hard’ approach is potentially best suited to where the ‘market’ is insufficiently competitive / open. This method is difficult to police as services blur and ‘the game’ then becomes to be categorised as one type of service but act as another.
Telco 2.0 advocates a hybrid approach that promotes market transparency and liquidity to empower customers in their choices of ISP and services, including:
Guidelines for operators on ‘best practice in traffic management’, which in general would recommend that operators should follow the principle of “minimum intervention”;
Published assessments on how each operator meets these guidelines that make understanding operator’s performance on these issues straightforward for customers.
The criteria of the assessment would include the actual performance of the operator against claimed performance (e.g. speed, latency), and whether they adhere to the ‘Code of Best Practice’.
How might it work?
The communication of this assessment could be as simple as a ‘traffic light’ style indicator, where a full Internet service meeting best practice and consistently achieving say 90% of claimed performance would be ‘Green’, while services meeting lower standards / adherence or failing to report adequately would be signalled ‘Amber’ or ‘Red’. The principles used by the operator should also be published, though utilising this step on its own would run the risk of the “Licence Agreement” problem for software – which is that no-one reads them.
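One hypothetical encoding of that ‘traffic light’ rule is sketched below – the 90% threshold comes from the text above, while the exact inputs and their weighting are illustrative assumptions, not a proposed standard:

```python
# Hypothetical sketch of the "traffic light" indicator described above:
# Green = follows the code of practice and delivers >= 90% of claimed
# performance; Amber = falls short; Red = fails to report adequately.
# Inputs and thresholds are illustrative assumptions only.

def rating(claimed_mbps, measured_mbps, follows_code, reports):
    if not reports:
        return "Red"
    if follows_code and measured_mbps >= 0.9 * claimed_mbps:
        return "Green"
    return "Amber"

print(rating(8.0, 7.4, follows_code=True, reports=True))   # -> Green
print(rating(8.0, 5.0, follows_code=True, reports=True))   # -> Amber
print(rating(8.0, 7.9, follows_code=True, reports=False))  # -> Red
```

The attraction of such a scheme is that a single glanceable signal avoids the “Licence Agreement” problem: consumers need not read the underlying traffic management principles to act on them.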
We’ll be working on refining our guidelines and thoughts on how an indicator or other system might work by working through some specific ‘Use Cases’ outlined below. In the meantime, we recommend the suggestions made by long-time Telco 2.0 Associate Dean Bubley in his Disruptive Analysis’s ‘Draft Code of Conduct for Policy Management and Net Neutrality’.
It is our view that as long as telcos are forced to be open, regulators (and consumer bodies) can question or, ultimately, regulate for/against behaviours that could be beneficial/damaging.
The Role of the Regulator
We believe that the roles of the regulator(s) should be to:
Develop an agreed code of best practice with industry collaboration;
Agree, collect and publish measures of performance against the code;
Make it as easy as possible to switch providers by reducing the ‘hassle factor’ of clumsy processes, and by releasing consumers from onerous contractual obligations in instances of non-compliance with the code or performance at a ‘Red’ standard;
Monitor, publicise, and police market performance in line with appropriate regulatory compliance procedures.
We draw a parallel with what the UK regulator, Ofcom, used to do for telephony:
Force all providers with over a certain market share to report key performance metrics;
Publish these (ideally on the web, real time and by postcode);
Make it as easy to switch providers as possible;
Continuously review the set of performance metrics collected and published.
New ‘Enhanced Service’ Business Models?
Additionally, we see the following possible theoretical service layers within an Internet Service that could be used to create new business models:
‘Best efforts’ – e.g. ‘We try our best to deliver all of your broadband services to maximum speed and performance, but some services may take priority at certain times of the day in order to cope with network demands. The services will not cease to work but you may experience temporarily degraded performance.’
‘Protected’ – akin to an ambulance lane (e.g. Health or SmartGrid – packets that are always delivered; these could be lower or higher bandwidth, e.g. a video health app, but the principle of priority stands for both).
‘Enhanced Service’ – e.g. a TV service that the customer has paid for (e.g. IPTV), or that another party will (or might) pay extra to have delivered with an assured higher degree of quality.
One possibility that we will be exploring is whether it could be possible to create an ‘On-demand Enhanced Service’. For example, to deliver a better video streaming experience, the video provider pays for its traffic to take priority over other services, with the express consent of the customer. This may be achieved by adding a message to the Enhanced Service, e.g. ‘Click here to use our Enhanced Video Service, where we’ll pay to get your video to you quicker. This may temporarily degrade other applications currently active on your broadband line while you are using the Enhanced Video Service’.
We have long thought that there is scope for innovation in service design and pricing – for example, rather than offering a (supposed) continuous 8Mbps throughput (which most UK operators can’t actually support and have no intention of supporting), why not offer a lower average rate and the option to “burst” up to high speed when required? ISPs actually sell each other bandwidth on similar terms, so there is no reason why this should be impossible.
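The ‘lower average rate plus burst’ tariff described above behaves much like a token bucket. The sketch below models it under assumed numbers (2Mbps average, 8Mbps burst ceiling, 100MB of burst credit); none of these figures reflect a real ISP offer:

```python
# Illustrative token-bucket model of a 'lower average rate plus burst'
# tariff: the line sustains `avg` Mbps but can burst to `peak` Mbps while
# accumulated credit lasts. All numbers are assumptions for illustration.

class BurstableLine:
    def __init__(self, avg_rate_mbps=2.0, peak_mbps=8.0, bucket_mb=100.0):
        self.avg = avg_rate_mbps      # guaranteed average throughput
        self.peak = peak_mbps         # burst ceiling
        self.credit = bucket_mb       # megabytes of burst credit available
        self.bucket = bucket_mb       # maximum credit the bucket can hold

    def tick(self, demand_mbps, seconds=1.0):
        """Return the throughput (Mbps) granted for this interval."""
        granted = min(demand_mbps, self.avg)
        extra = min(demand_mbps, self.peak) - granted
        if extra > 0:
            extra_mb = extra * seconds / 8.0          # Mbit -> MB
            usable = min(extra_mb, self.credit)
            self.credit -= usable
            granted += usable * 8.0 / seconds
        else:
            # replenish credit when the line runs below its average rate
            self.credit = min(self.bucket,
                              self.credit + (self.avg - demand_mbps) * seconds / 8.0)
        return granted

line = BurstableLine()
print(line.tick(8.0))  # bursts to the full 8.0 Mbps while credit remains
print(line.tick(1.0))  # light use replenishes the bucket
```

The inter-ISP wholesale deals mentioned above (e.g. 95th-percentile billing) price capacity on broadly similar burst-tolerant terms.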
Example Scenarios / Use Cases
We’ve identified a number of specific scenarios for which we will be researching and developing ‘Use Cases’ to illustrate how these principles would apply. Each case is intended to illustrate different aspects of how a service should be sold to, and managed by or for, the customer, to ensure that expectations are set and met and that consumers are protected appropriately.
Fixed ‘Use Cases’
Contention between Internet services over an ADSL line on a copper pair, e.g. Dad is editing a website, Daughter is watching YouTube videos, with a SmartGrid meter in operation over a shared wireless router. This is interesting because of the limited bandwidth on the ADSL line, plus consideration of the SmartGrid monitoring as a ‘Specialised Service’, and potentially also as a ‘Protected Service’ in our exploratory classification of potential service classes.
Contention between Internet and Specialised Services over an ADSL line on a copper pair, e.g. Dad is streaming an HD video on the internet, daughter is watching IPTV. This is interesting because of the limited bandwidth on the ADSL line and the additional factor of the IPTV service over the broadband connection. Unlike a DOCSIS 3 cable link, where the CATV service is additional to the Internet service and in fact can be used to offload applications like iPlayer, the DSL environment means that “specialised services” will contend with public Internet service.
Managed vs Unmanaged Femtocells over an ADSL connection. An Unmanaged Femtocell is e.g. a Sprint Femtocell over an AT&T ADSL connection, where the Femtocell is treated purely as another source of IP traffic. A Managed Femtocell is e.g. a Softbank Femtocell operating on a Softbank ADSL line, using techniques such as improved synchronisation with the network to produce a better service. An examination of alternative approaches to managing Femtocell traffic is interesting: 1) because a Femtocell inherently involves a combination of mobile and fixed traffic over different networks, and so draws out fixed/mobile issues; and 2) because it is useful to work through how a Managed Femtocell Use Case might work within the market approach we’ve defined.
A comparison of a home worker using videoconferencing with remote colleagues in two scenarios: one using VPN software and a configured router; the second using Skype with no local configuration. The objective here is to explore the relative difference in the quality of user experience as an illustration of what is possible in an advanced user ‘DIY’ management scenario.
The ‘Use Case’ of an ‘On-demand Enhanced Service’ for a professional web video-cast, with the consumer experience as outlined above. The idea here is that the user grants the video provider and the network temporary permission to provide an ‘Enhanced Service’. The role of this ‘Use Case’ is to explore how, and whether, a ‘sender pays’ model could be implemented both technically and commercially in a way that respects consumer concerns.
HDTV to the living room TV. This is interesting because the huge bandwidth requirements needed to deliver HDTV are far beyond those originally envisaged and have a potentially significant impact on network costs. Would user expectations of such a service permit e.g. buffering to deliver it without extra cost, or might this also enable a legitimate ‘two-sided’ sender pays model where the upstream customer (e.g. the media provider) pays?
Mobile ‘Use Case’
VOIP over mobile. Is it right that VOIP over mobile networks should be treated differently from how it is over fixed networks?
Telco 2.0’s Position Vs the Rest
There is reasonably common ground among most analysts and commentators on the need for more transparency in Internet Access service definition, performance and management standards, though there is little clarity yet on how this transparency might be achieved.
The most contentious area is the notion of ‘non-discrimination’ – that is, of allowing ISPs to prioritise one form or source of traffic over another. AT&T is firmly in favour of ‘paid prioritisation’, whereas the Google/Verizon proposal is not, stating that ‘wireline broadband providers would not be able to discriminate against or prioritize lawful Internet content, applications or services in a way that causes harm to users or competition’.
Free Press is a US activist movement that champions ‘Net Neutrality’. While we share their desire for freedom of speech, and understand the imperative to create a more level playing field for media in the US, we are not aligned with enshrining total neutrality globally by regulation.
In terms of the regulators’ positions, the UK’s Ofcom is tentatively against ‘ex-ante’ regulation, whereas the FCC seems to favour non-discrimination as a principle. The FCC is also asking whether mobile and fixed are different – we say they are, although as the example of 3UK shows, the differences may not be the ones you expect. Ofcom is also already looking at how it might make switching easier for customers.
We also note that US-based commentators generally see less competition in fixed internet services than in Europe, and fewer mobile broadband options for customers. Our position is that local competitive conditions are a relevant consideration in these matters, albeit that the starting point should be to regulate the market as described above before considering a stronger stance on intervention where local competition is weak.
Conclusion & Recommendations
‘Net Neutrality’ is largely a clever but distracting lobbyists’ ploy that has gathered enormous momentum on the hype circuit. The debate does create a possible opportunity to market and measure broadband services better, and that’s no bad thing for customers. There may also be opportunities to create new business models, but there’s still work to be done to assess if these are material.
‘Lubricate the Market’
1. “Internet Access” should be more tightly defined to mean a service that:
Provides access to all legitimate online services using the ‘public’ internet;
Performs within certain bounds of service performance as marketed (e.g. speed, latency);
Is subject to the minimum necessary ‘traffic management’ by the ISP, which should only be permissible in specific instances, such as contention (e.g. peak hour use);
Maintains consistent delivery of all services in line with reasonable customer expectation and best possible customer experience (as exemplified in a ‘code of best practice’);
Provides published and accessible performance measures against ‘best practice’ standards.
2. Where a customer has paid extra for a ‘Specialised Service’, e.g. IPTV, it is reasonable to give that service priority to agreed limits while in use. Services not meeting these criteria should be named, e.g. “Limited Internet Access”.
3. ISPs should be:
Able to do ‘what they need’ in terms of traffic management to deliver an effective service, but required to be open and transparent about it;
Realistic about the likely limits to possible benefits from traffic-shaping.
4. The roles of the regulator are to:
Develop an agreed code of best practice with industry collaboration;
Agree, collect and publish measures of performance against the code;
Ensure sufficient competition and ease of switching in the market;
Monitor, publicise, and police market performance in line with appropriate regulatory compliance procedures.
We have also outlined:
Principles for a code of best practice;
A simple ‘traffic light’ system that might be used to signal quality and compliance levels;
‘Use Cases’ for further analysis to help refine the recommended ‘Code of Practice’ and its implementation, including exploration of an ‘On-Demand Enhanced Service’ that could potentially enable new business models within the framework outlined.
NB A full PDF copy of this briefing can be downloaded here.
This special Executive Briefing report summarises the brainstorming output from the Content Distribution 2.0 (Broadband Video) section of the 6th Telco 2.0 Executive Brainstorm, held on 6-7 May in Nice, France, with over 200 senior participants from across the Telecoms, Media and Technology sectors. See: www.telco2.net/event/may2009.
It forms part of our effort to stimulate a structured, ongoing debate within the context of our ‘Telco 2.0′ business model framework (see www.telco2research.com).
Each section of the Executive Brainstorm involved short stimulus presentations from leading figures in the industry, group brainstorming using our ‘Mindshare’ interactive technology and method, a panel discussion, and a vote on the best industry strategy for moving forward.
There are 5 other reports in this post-event series, covering the other sections of the event: Retail Services 2.0, Enterprise Services 2.0, Piloting 2.0, Technical Architecture 2.0, and APIs 2.0. In addition there will be an overall ‘Executive Summary’ report highlighting the overall messages from the event.
Each report contains:
Our independent summary of some of the key points from the stimulus presentations
An analysis of the brainstorming output, including a large selection of verbatim comments
The ‘next steps’ vote by the participants
Our conclusions of the key lessons learnt and our suggestions for industry next steps.
The brainstorm method generated many questions in real-time. Some were covered at the event itself and others we have responded to in each report. In addition we have asked the presenters and other experts to respond to some more specific points.
Background to this report
The demand for internet video is exploding. This is putting significant stress on the current fixed and mobile distribution business model. Infrastructure investments and operating costs required to meet demand are growing faster than revenues. The strategic choices facing operators are to charge consumers more when they expect to pay less, to risk upsetting content providers and users by throttling bandwidth, or to unlock new revenues to support investment and cover operating costs by creating new valuable digital distribution services for the video content industry.
A summary of the new Telco 2.0 Online Video Market Study: Options and Opportunities for Distributors in a time of massive disruption.
What are the most valuable new digital distribution services that telcos could create?
What is the business model for these services – who are the potential buyers and what are the priority opportunity areas?
What progress has been made in new business models for video distribution – including FTTH deployment, content-delivery networking, and P2P?
Preliminary results of the UK cross-carrier trial of sender-pays data
How the TM Forum’s IPSphere programme can support video distribution
Stimulus Presenters and Panellists
Richard D. Titus, Controller, Future Media, BBC
Trudy Norris-Grey, MD Transformation and Strategy, BT Wholesale
Scott Shoaf, Director, Strategy and Planning, Juniper Networks
Ibrahim Gedeon, CTO, Telus
Andrew Bud, Chairman, Mobile Entertainment Forum
Alan Patrick, Associate, Telco 2.0 Initiative
Simon Torrance, CEO, Telco 2.0 Initiative
Chris Barraclough, Managing Director, Telco 2.0 Initiative
Dean Bubley, Senior Associate, Telco 2.0 Initiative
Alex Harrowell, Analyst, Telco 2.0 Initiative
Stimulus Presentation Summaries
Content Distribution 2.0
Scott Shoaf, Director, Strategy and Planning, Juniper Networks opened the session with a comparison of the telecoms industry’s response to massive volumes of video and that of the US cable operators. He pointed out that the cable companies’ raison d’être was to deliver vast amounts of video; therefore their experience should be worth something.
The first question, however, was to define the problem. Was the problem the customer, in which case the answer would be to meter, throttle, and cap bandwidth usage? If we decided this was the solution, though, the industry would be in the position of selling broadband connections and then trying to discourage its customers from using them!
Or was the problem not one of cost, but one of revenue? Networks cost money; the cloud is not actually a cloud, but is made up of cables, trenches, data centres and machines. Surely there wouldn’t be a problem if revenues rose with higher usage? In that case, we ought to be looking at usage-based pricing, but also at alternative business models – like advertising and the two-sided business model.
Or is it an engineering problem? It’s not theoretically impossible to put in bigger pipes until all the HD video from everyone can reach everyone else without contention – but in practice there is always some degree of oversubscription. What if we focused on specific sources of content? Define a standard of user experience, train the users to that, and work backwards?
If it is an engineering problem, the first step is to reduce the problem set. The long tail obviously isn’t the problem; it’s too long, as has been pointed out, and doesn’t account for very much traffic. It’s the ‘big head’ or ‘short tail’ stuff that is the heart of the problem: we need to deal with this short tail of big traffic generators. We need a CDN or something similar to deliver for this.
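The ‘short tail carries most of the traffic’ argument can be illustrated with a Zipf-style popularity model – a common modelling assumption in CDN planning, not a measurement from the presentation; the catalogue size and exponent below are purely illustrative:

```python
# Illustration of the 'short tail' argument: if content popularity follows
# a Zipf-like distribution (a common modelling assumption, exponent s=1.0),
# a small fraction of titles attracts most of the requests.

def zipf_share(top_n, catalogue_size, s=1.0):
    """Fraction of total requests attracted by the top_n most popular titles."""
    weights = [1.0 / (rank ** s) for rank in range(1, catalogue_size + 1)]
    return sum(weights[:top_n]) / sum(weights)

# With an assumed 1,000,000-title catalogue, the top 1% of titles draws
# roughly two-thirds of all requests - hence caching the 'big head' in a
# CDN removes most of the load.
share = zipf_share(10_000, 1_000_000)
print(f"top 1% of titles -> {share:.0%} of requests")
```

If the assumption holds even approximately, a CDN only needs to hold the ‘big head’ to absorb the bulk of the traffic, which is the point being made above.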
On cable, the customers are paying for premium content – essentially movies and TV – and the content providers are paying for distribution. We need to escape from the strict distinctions between Internet, IPTV, and broadcast. After all, despite the alarming figures for people leaving cable, many of them are leaving existing cable connections to take a higher grade of service. Consider Comcast’s Fancast – focused on users, not lines, with an integrated social-recommendation system, it integrates traditional cable with subscription video. Remember that broadcast is a really great way to deliver!
Advertising – at the moment, content owners are getting 90% of the ad money.
Getting away from this requires us to standardise the technology and the operational and commercial practices involved. The cable industry is facing this with the SCTE130 and Advanced Advertising 1.0 standards, which provide for fine-grained ad insertion and reporting. We need to blur the definition of TV advertising – the market is much bigger if you include Internet and TV ads together. Further, 20,000 subscribers to IPTV aren’t interesting to anyone – we need to attack this across the industry and learn how to treat the customer as an asset.
The Future of Online Video, 6 months on
Alan Patrick, Associate, Telco 2.0 updated the conference on how things had changed since he introduced the “Pirate World” concept from our Online Video Distribution strategy report at the last Telco 2.0 event. The Pirate World scenario, he said, had set in much faster and more intensely than we had expected, and was working in synergy with the economic crisis.
Richard Titus, Controller, Future Media, BBC: “I have no problem with carriers making money – in fact, I pay over the odds for a 50Mbit/s link – but the real difference is between a model that creates opportunities for the public and one which constrains them.”
Ad revenues were falling; video traffic still soaring; rights-holders’ reaction had been even more aggressive than we had expected, but there was little evidence that it was doing any good. Entire categories of content were in crisis.
On the other hand, the first stirrings of the eventual “New Players Emerge” scenario were also observable; note the success of Apple in creating a complete, integrated content distribution and application development ecosystem around its mobile devices.
The importance of CPE is only increasing; especially with the proliferation of devices capable of media playback (or recording) and interacting with Internet resources. There’s a need for a secure gateway to help manage all the gadgets and deliver content efficiently. Similarly, CDNs are only becoming more central – there is no shortage of bandwidth, but only various bottlenecks. It’s possible that this layer of the industry may become a copyright policing point.
We think new forms of CPE and CDNs are happening now; efforts to police copyright in the network are in the near future; VAS platforms are the next wave after that, and then customer data will become a major line of business.
Most of all, time is flying by, and the overleveraged, or undercapitalised, are being eaten first.
The Content Delivery Framework
Ibrahim Gedeon, CTO, Telus introduced some lessons from Telus’s experience deploying both on-demand bandwidth and developer APIs. Telcos aren’t good at content, he said; instead, we need to be the smartest pipe and make use of our trusted relationship with customers, built up over the last 150 years.
We’re working in an environment where cash is scarce and expensive, and pricing is a zero- or even negative-sum game; impossible to raise prices, and hard to cut without furthering the price war. So what should we be doing? A few years ago the buzzword was SDP; now it’s CDN. We’d better learn what those actually mean!
Trudy Norris-Grey, Managing Director, BT Wholesale: “There is no capacity problem in the core, but there is to the consumer – and three bad experiences mean the end of an application or service for that individual user.”
Anyway, we’re both a mobile and fixed operator and ISP, and we’ve got an IPTV network. We’ve learned the hard way that technology isn’t our place in the value chain. When we got the first IPTV system from Microsoft, it used 2,500 servers and far, far too much power. So we’re moving to a CDF (Content Delivery Framework) – which looks a lot like a SDP. Have the vendors just changed the labels on these charts?
So why do we want this? So we can charge for bandwidth, of course; if it was free, we wouldn’t care! But we’re making around $10bn in revenues and spending 20% of that in CAPEX. We need a business case for this continued investment.
We need the CDF to help us to dynamically manage the delivery and charging process for content. There was lots of goodness in IMS, the buzzword of five years ago, and in SDPs. But in the end it’s the APIs that matter. And we like standards because we’re not very big. So, we want to use TM Forum’s IPSphere to extend the CDF and SDF; after all, in roaming we apply different rate cards dynamically and settle transactions, so why not here too, for video or data? I’d happily pay five bucks for good 3G video interconnection.
And we need to do this for developer platforms too, which is why we’re supporting the OneAPI reference architecture. To sum up, let’s not forget subscriber identity, online charging – we’ve got to make money – the need for policy management because not all users are equal, and QoS for a differentiated user experience.
Sender-Pays Data in Practice
Andrew Bud, Chairman, MEF gave an update on the trial of sender-pays data he announced at the last event. This is no longer theoretical, he said; it’s functioning, just with a restricted feature set. Retail-only Internet has just about worked so far because people pay for connectivity through their subscription and the services themselves are free. Video breaks this, he said; it will be impossible to be at once comprehensive, meaningful, and sustainable.
You can’t, he said, put a meaningful customer warning that covers all the possible prices you might encounter due to carrier policy with your content; and everyone is scared of huge bills after the WAP experience. Further, look at the history of post offices, telegraphy and telephony – it’s been sender-pays since the 1850s. Similarly, Amazon.com is sender-pays, as is Akamai.
Hence we need sending-party-pays data – that way, we can have truly free ads: not a model where the poor end user ends up paying the delivery cost!
Our trial: we have relationships with carriers making up 85% of the UK market. We have contracts, priced per-MB of data, with them. And we have four customers – Jamster, who brought you the Crazy Frog, Shorts, THMBNLS, who produce mobisodes promoting public health, and Creative North – mobile games as a gift from the government. Of course, without sender-pays this is impossible.
We’ve discovered that the carriers have no idea how much data costs; wholesale pricing has some very interesting consequences. Notably the prices are being set too high. Real costs and real prices mean that quality of experience is a real issue; it’s a very complicated system to get right. The positive sign, and ringing endorsement for the trial, is that some carriers are including sender-pays revenue in their budgets now!
The business of video is a prime battleground for Telco 2.0 strategies. It represents the heaviest data flows, the cornerstone of triple/quad-play bundling, powerful entrenched interests from broadcasters and content owners, and a plethora of regulators and industry bodies. For many people, it lies at the heart of home-based service provision and entertainment, as well as encroaching on the mobile space. The growth of P2P and other illegal or semi-legal download mechanisms puts pressure on network capacity – and invites controversial measures around protecting content rights and Net Neutrality.
In theory, operators ought to be able to monetise video traffic, even if they don’t own or aggregate content themselves. There should be options for advertising, prioritised traffic or blended services – but these are all highly dependent on not just capable infrastructure, but realistic business models. Operators also need to find a way to counter the ‘Network Neutrality’ lobbyists who are confounding the real issue (access to the internet for all service providers on a ‘best efforts’ basis) with spurious arguments that operators should not be able to offer premium services, such as QoS and identity, to customers that want to pay for them. Telco 2.0 would argue that the right to offer (and the right to buy) a better service is a cornerstone of capitalism and something that is available in every other industry. Telecoms should be no different. Of course, it remains up to the operators to develop services that customers are willing to pay more for…
A common theme in the discussion was “tempus fugit” – time flies. The pace of evolution has been staggering, especially in Internet video distribution – IPTV, YouTube, iPlayer, Hulu, Qik, P2P, mashups and so forth. Telcos do not have the luxury of time for extended pilot projects or grandiose collaborations that take years to come to fruition.
With this timing issue in mind, the feedback from the audience was collected in three categories, although here the output has been aggregated thematically, as follows:
STOP – What should we stop doing?
START – What should we start doing?
DO MORE – What things should we do more of?
Feedback: STOP the current business model
There was broad agreement that the current model is unsustainable, especially given the demands that “heavy” content like video traffic places on the network…
· [Stop] giving customers bandwidth for free [#5]
· Stop complex pricing models for end-user [#9]
· Stop investing so much in sustaining old order [#18]
· Stop charging mobile subscribers on a per megabyte basis. [#37]
· Current peering agreement/ip neutrality is not sustainable. [#41]
· [Stop] assuming things are free. [#48]
· [Stop] lowering prices for unlimited data. [#61]
· Have to develop more models for upstream charging for data rather than just flat rate to subscribers. [#11]
· Build rational pricing segmentation for data to monetize both sides of the value chain with focus on premium value items. [#32]
Feedback: Transparency and pricing
… with many people suggesting that Telcos first need to educate users and service providers about the “true cost” of transporting data… although whether they actually know the answer themselves is another question, as it is as much an issue of accounting practices as of network architecture.
· Make the service providers aware of the cost they generate to carriers. [#31]
· Make pricing transparency for consumers a must. [#10]
· Mobile operators start being honest with themselves about the true cost of data before they invest in LTE. [#7]
· When resources are limited, then rationing is necessary. Net Neutrality will not work. Today people pay for water in regions where it is limited in supply. Its use is abused when there are no limits. [#17]
· Start being transparent in data charges, it will all stay or fall with cost transparency. [#12]
· You can help people understand usage charges, with meters or regular updates, requires education for a behavioural change, easier for fixed than mobile. [#14]
· Service providers need to have a more honest dialogue with subscribers and give them confidence to use services [#57]
· As an industry we must invest more in educating the market about network economics, end-users as well as service providers. [#58]
· Start charging subscribers flat rate data fee rather than per megabyte. [#46]
Feedback: Sender-pays data
Andrew Bud’s concept of “sender pays data”, in which a content provider bundles the notional cost of data transport into the download price for the consumer, generated both enthusiasm and concerns (although very little outright disagreement). Telco 2.0 agrees with the fundamental ‘elegance’ of the notion, but thinks that there are significant practical, regulatory and technical issues to be resolved. In particular, it may be limited to the delivery of “monolithic” chunks of content like movies, especially in mobile networks where data traffic is dominated by PCs with mobile broadband, usually running a wide variety of two-way applications like social networking.
· Sender pays is the only sane model. [#6]
· Do sender pays on both ‘sides’ consumer as well…gives ‘control’ and clarity to user. [#54]
· Sender Pays is one specific example of a much larger category of 3rd-party pays data, which also includes venue owners (e.g. hotels or restaurants), advertisers/sponsors (‘thanks for flying Virgin, we’re giving you 10MB free as a thank-you’), software developers, government (e.g. ‘benefit’ data for the unemployed etc) etc. The opportunity for Telcos may be much larger from upstream players outside the content industry [#73]
· We already do sender pays on our mobile portal – on behalf of all partner content providers including Napster mobile. [#77]
· Change the current peering model into an end to end sender pay model where all carriers in the chain receive the appropriate allocation of the sender pay revenue in order to guarantee the QoS for the end user. [#63]
· Focus on the money flows e.g. confirm the sender pays model. [#19]
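As a purely illustrative sketch of the end-to-end settlement proposed in comment #63, the snippet below splits a sender’s per-MB payment across the carriers in a delivery chain. The carrier names, weights and £0.01/MB rate are all hypothetical:

```python
# Hypothetical sketch of end-to-end sender-pays settlement (comment #63):
# the sender pays per MB, and each carrier in the delivery chain receives
# a pro-rata share. Names, weights and the rate are illustrative only.

def settle(megabytes, rate_per_mb, chain_weights):
    """Split the sender's payment across carriers, pro rata by weight."""
    total = megabytes * rate_per_mb
    weight_sum = sum(chain_weights.values())
    return {carrier: round(total * w / weight_sum, 4)
            for carrier, w in chain_weights.items()}

# e.g. a 300 MB video at £0.01/MB crossing an origin ISP, a transit
# network and the terminating mobile carrier (weights reflect assumed
# relative delivery cost, with radio access the most expensive leg)
shares = settle(300, 0.01, {"origin_isp": 1, "transit": 1, "mobile_access": 3})
print(shares)
```

The hard part, as the qualified-support comments below suggest, is not the arithmetic but agreeing the weights and wholesale rates across carriers in the first place.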
Qualified Support/Implementation concerns
· Business models on sender pays, but including the fact, that roaming is needed, data costs will be quite different across mobile carriers and the aggregators costs and agreements are based on the current carriers. These things need to be solved first [#26]
· Sender pays is good but needs the option of ‘only deliver via WiFi or femtocell when the user gets home’ at 1/100th the cost of ‘deliver immediately via 3G macro network’. [#15]
· Who pays for AJAX browsers proactively downloading stuff in the background without explicit user request? [#64]
· Be realistic about sender pays data. It will not take off if it is not standard across the market, and the data prices currently break the content business model – you have to compare to the next alternative. A video on iTunes costs 1.89 GBP including data… Operators should either take a long term view or forget about it. [#20]
· Sender-pays data can be used to do anything the eco-system needs, including quality/HD. It doesn’t yet today only because the carriers don’t know how to provide those. [#44]
· Sender pays works for big monolithic chunks like songs or videos. But doesn’t work for mash up or communications content/data like Facebook (my Facebook page has 30 components from different providers – are you going to bill all of them separately?) [#53]
· mBlox: more or less like a free-call number. doesn’t guarantee quality/HD [#8]
· Stop sender pays because user is inundated with spam. [#23]
o Re 23: At least the sender is charged for the delivery. I do not want to pay for your SPAM! [#30]
A fair amount of the discussion revolved around the thorny issues of capacity, congestion, prioritisation and QoS, although some participants felt this distracted a little from the “bigger picture” of integrated business models.
· Part of bandwidth is dedicated to high quality contents (paid for). Rest is shared/best effort. [#27]
· Start annotating the network, by installing the equivalent of gas meters at all points across the network, in order that they truly understand the nature of traffic passing over the network – to implement QoS. [#56]
o Re: 56 – that’s fine in the fixed world or mobile core, but it doesn’t work in the radio network. Managing QoS in mobile is difficult when you have annoying things like concrete walls and metallised reflective windows in the way [#75]
· [Stop] being telecom focused and move more towards solutions. It is more than bandwidth. [#25]
· Stop pretending that mobile QoS is important, as coverage is still the gating factor for user experience. There’s no point offering 99.9% reliability when you only have 70% coverage, especially indoors [#29]
· Start preparing for a world of fewer, but converged, fixed-mobile networks that are shared between operators. In this world there will need to be a dynamic model of allocating and charging for network capacity. [#67]
· We need applications that are more aware of network capacity, congestion, cost and quality – and which alter their behaviour to optimise for the conditions at any point in time, e.g. with different codecs, frame rates or image sizes. The intelligence to do this is in the device, not the network. [#68]
o Re: 68, is it really in the CPE? If the buffering of the content is close to the terminal, perhaps; otherwise there is no jitter guarantee. [#78]
§ Re 78 – it depends on the situation, and on download vs. streaming etc. Forget the word ‘terminal’, it’s 1980s speak; if you have a sufficiently smart endpoint you can manage this – hence PCs being fine for buffering YouTube or iPlayer etc., and some video players auto-sensing network conditions [#81]
· QoE – residential networks cannot fully support devices which are not managed for streamed content. [#71]
· Presumably CDNs and caching have a bit of a problem with customised content, e.g. with inserted/overlaid personalised adverts in a video stream? [#76]
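The “smart endpoint” idea in #68 and #81 can be sketched as a client-side rendition picker: the player measures recent throughput and selects the richest encoding that fits within a safety margin, falling back rather than stalling. The bitrate ladder and the safety margin below are illustrative assumptions, not figures from the discussion:

```python
# Minimal sketch of client-side rate adaptation (the "smart endpoint"
# idea in #68/#81). The rendition ladder and margin are hypothetical.

LADDER = [  # (label, bitrate in kbit/s), highest quality first
    ("1080p", 5000),
    ("720p", 2500),
    ("480p", 1200),
    ("240p", 400),
]


def pick_rendition(measured_kbps, safety=0.8):
    """Return the best rendition sustainable at `safety` * measured
    throughput; fall back to the lowest rung rather than stalling."""
    budget = measured_kbps * safety
    for label, rate in LADDER:
        if rate <= budget:
            return label
    return LADDER[-1][0]


assert pick_rendition(4000) == "720p"  # 3200 kbit/s budget fits 2500
assert pick_rendition(300) == "240p"   # below the ladder: lowest rung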
Feedback: platforms, APIs, and infrastructure
However, the network and device architecture is only part of the issue. It is clear that video distribution fits centrally within the wider platform problems of APIs and OSS/BSS architecture, which span the overall Telco 2.0 reach of a given operator.
· Too much focus on investment in the network – where is the innovation in enterprise software to support the network? [#70]
· Operators should open up access to their business assets in a consistent manner to innovators and intermediaries who can harmonise APIs across a national or global marketplace. [#13]
· The BSS back office (billing, etc.) will, for the most part, not support robust interactive media. [#22]
· Let content providers come directly to telcos, avoiding a middle layer (aggregators) taking the profit. This requires collaboration and standardization among telcos on technical interfaces and payment models. [#28]
· More analysis on the time and cost of managing a billing vendor to support a two-sided business model. It is prohibitively expensive in the back office to take risks – why? [#65]
· It doesn’t matter how strong the network is if you can’t monetize it on the back end OSS/BSS. [#40]
Feedback: Business models for video
Irrespective of the technical issues, or specific point commercial innovations like sender pays, there are also assorted problems in managing ecosystem dynamics, or more generalised business models for online video or IPTV. A significant part of the session’s feedback explored the concerns and possible solutions – with the “elephant in the room” of Net Neutrality lurking on the sidelines.
· Open up to lower cost lower risk trials to see what does and doesn’t work. [#35]
· Offer real multi-quality services in order to monetize high-quality services. [#36]
· Transform net neutrality issues into a fair policy approach… meaning that you cannot have equal treatment when some parties abuse the openness. [#39]
o Re 39: I want QoE for the content I want to see; part of this comes from speed of access. Net neutrality means best effort, with everything left to fight it out in the scarce network – i.e. I do not get QoE because of all the other rubbish in the network. [#69]
· Why not bundle VAS with content transportation to ease migration from a free world to a pay-for-value world? [#43]
· Do more collaborative models which incorporate the entire value chain. [#55]
· Service providers start partnering to resell long tail content from platform providers with big catalogues. [#59]
· [Start to] combine down- and up-stream models in content. Especially, start getting paid to deliver long-tail content. [#60]
· Start thinking longer term instead of short term profit, to create a new ecosystem that is bigger and healthier. [#62]
· Exploit better the business models between content providers and carriers. [#16]
· Adapt price to quality of service. [#21]
· Put more attention on quality of end user experience. [#24]
· I am prepared to pay a higher retail DSL subscription if I get a higher quality of experience – not just monthly download limits. [#38]
· Maximize revenues based on typical telco capabilities (billing, delivery, assurance for millions of customers) [#50]
· Need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lies in their user data, not their pipes! [#52]
· On our fixed network, DSL resellers pay for the access and for the bandwidth used – this corresponds to the sender-pays model. Due to rising bandwidth demand the charge for the resellers continuously increases, so we have to adapt bandwidth tariffs every year in order not to suffocate our DSL resellers. Among them are also companies offering TV streaming. [#82]
· More settlement-free peering with content/app suppliers – make the origination point blazingly fast and close to zero cost; rather, focus on charging for content distribution towards the edge of the access network (smart caching, torrent seeds, multicast nodes etc.) [#74]
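The squeeze described in #82 is simple arithmetic: if reseller traffic grows each year while wholesale budgets stay roughly flat, the per-unit bandwidth tariff must fall correspondingly just to keep the reseller’s bill constant. A sketch with an assumed growth rate and starting price (both hypothetical, not figures from the discussion):

```python
# Rough arithmetic behind #82: the per-GB tariff that keeps a reseller's
# total wholesale bill flat as their traffic volume grows.
# The growth rate and starting price are illustrative assumptions.

def required_tariff(price_per_gb, growth_rate, years):
    """Per-GB price needed after `years` of volume growth at
    `growth_rate` per year, holding the reseller's total bill flat."""
    return price_per_gb / ((1 + growth_rate) ** years)


p0 = 0.10  # hypothetical starting wholesale price, $/GB
# At ~40% annual traffic growth, the tariff must be cut by ~29% a year.
year1 = required_tariff(p0, 0.40, 1)
```

This is why annual tariff revisions are structural rather than a one-off concession: the cut compounds every year that traffic growth outpaces budget growth.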
In addition to these central themes, the session’s participants also offered a variety of other comments concerning regulatory issues, industry collaboration, consumer issues and other non-video services like SMS.
· Start addressing customer data privacy issues now, before it’s too late and there is a backlash from subscribers and the media. [#42]
· Consolidating forums and industry bodies so we end up with one practical solution. [#45]
· Identifying how an operator can be of use to content service providers, other than as a pipe. [#49]
· Getting regulators to stimulate competition by enforcing structural separation – unbundle at layer 1, bring in agile players with low operating costs. Let customers vote with their money – focus on delivering the fastest basic IP pipe at a reasonable price. If the basic price point is reasonable, customers will be glad to pay for extra services – either sender- or receiver-based. [#72]
· IPTV <> Internet TV: with IPTV the telco chooses my content; with Internet TV, I choose. [#79]
· Put attention on creating industry collaboration models. [#47]
· Stop milking the SMS cash cow and stop worrying about cannibalising it, otherwise today’s rip-off mobile data services will never take off. [#33]
· SMS combined with the web is going to play a big role in the future, maybe bigger than the role it played in the past. Twitter is just the first of a wave of SMS-based social media and comms applications for people. [#51]
Participants’ ‘Next Steps’ Vote
Participants were then asked: Which of the following do we need to understand better in the next 6 months?
· Is there really a capacity problem, and what is the nature of it?
· How to tackle the net neutrality debate and develop an acceptable QoS solution for video?
· Is there a long-term future for IPTV?
· How to take on the iPhone regarding mobile video?
· More aggressive piloting / roll-out of sender-party-pays data?
Lessons learnt & next steps
The vote itself reflects the nature of the discussions and debates at the event: there are many issues on which the industry is not yet clear, and which need to be ironed out. The world is changing fast, and how we overcome issues and exploit opportunities is still hazy. And all the time, there is a concern that the speed of change could overtake existing players (including telcos and ISPs)!
However, there does now seem to be greater clarity on several issues with participants becoming increasingly keen to see the industry tackle the business model issue of flat-rate pricing to consumers and little revenue being attached to the distribution of content (particularly bandwidth hungry video). Overall, most seem to agree that:
1. End users like simple pricing models (hence success of flat rate) but that some ‘heavy users’ will require a variable rate pricing scheme to cover the demands they make;
2. Bandwidth is not free and costs to Telcos and ISPs will continue to rise as video traffic grows;
3. Asking those sending digital goods to pay for the distribution cost is sensible…;
4. …but plenty of work needs to be done on the practicalities of the sender-pays model before it can be widely adopted across fixed and mobile;
5. Operators need to develop a suite of value-added products and services for those sending digital goods over their networks so they can charge incremental revenues that will enable continued network investment;
6. Those pushing the ‘network neutrality’ issue are (deliberately or otherwise) causing confusion over such differential pricing which creates PR and regulatory risks for operators that need to be addressed.
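Point 1 above amounts to a flat fee with a metered overage for heavy users: everyone keeps the simple price, and only usage beyond an allowance is variable. It can be sketched in a few lines; all prices and thresholds here are hypothetical, purely to make the structure concrete:

```python
# Sketch of the "flat rate plus heavy-user overage" model in point 1.
# All prices and thresholds below are hypothetical assumptions.

FLAT_FEE = 20.0        # monthly flat rate, $
ALLOWANCE_GB = 50      # usage covered by the flat fee
OVERAGE_PER_GB = 0.50  # variable rate applied only above the allowance


def monthly_bill(usage_gb):
    """Flat fee for everyone; metered charge only on excess usage."""
    overage_gb = max(0.0, usage_gb - ALLOWANCE_GB)
    return FLAT_FEE + overage_gb * OVERAGE_PER_GB


assert monthly_bill(30) == 20.0   # typical user: simple flat price holds
assert monthly_bill(110) == 50.0  # heavy user funds the demand they create
```

The design choice is that the variable element never touches the majority of subscribers, preserving the simplicity that made flat rate successful while recovering cost from the heaviest users.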
There are clearly details to be ironed out – and probably experiments in pricing and charging to be done. Andrew Bud’s sending-party-pays model (many others, it must be added, have suggested similar) may work, or it may not – but this is an area where experiments need to be tried. The idea of “educating” upstream users is euphemistic – they are well aware of the benefits they are currently accruing, which is why the net neutrality debate is being deliberately muddied. Distributors need to work on disentangling the bits that can travel free from those that pay to ride, not letting anyone get a free ride.
As can be seen in the responses, there is also a growing realisation that the Telco has to understand and deal with the issues of the overall value chain, end-to-end, not just the section under its direct control, if it wishes to add value over and above being a bit pipe. This is essentially moving towards a solution of the “Quality of Service” issue – they need to decide how much of the solution is capacity increase, how much is traffic management, and how much is customer expectation management.
Alan Patrick, Telco 2.0: ”98.7% of users don’t have an iPhone, but 98% of mobile developers code for it because it has an integrated end-to-end experience, rather than a content model based on starving in a garage.”
The “Tempus Fugit” point is well made too – the Telco 2.0 participants are moving towards an answer, but it is not clear that the same urgency is being seen among wider Telco management.
Two areas were skimmed through a little too quickly in the feedback:
Managing a way through the ‘Pirate World’ environment
The economic crisis has helped in that it has reduced the amount of venture capital and other risk equity going into funding plays that need not make revenue, never mind profit. In our view this means that the game will resolve into a battle of deep pockets to fund the early businesses. Incumbents typically suffer from higher cost bases and higher hurdle rates for new ventures. New players typically have less revenue, but lower cost structures. For existing Telcos this means using existing assets as effectively as possible and we suggest a more consolidated approach from operators and associated forums and industry bodies so the industry ends up with one practical solution. This is particularly important when initially tackling the ‘Network Neutrality’ issue and securing customer and regulatory support for differential pricing policies.
Adopting a policing role, particularly in the short-term during Pirate World, may be valuable for operators. Telco 2.0 believes the real value is in managing the supply of content from companies (rather than end users) and ensuring that content is legal (paid for!).
What sort of video solution should Telcos develop?
The temptation for operators to push IPTV is huge – it offers, in theory, steady revenues and control of the set-top box. Unfortunately, all the projected growth is expected to be in Web TV, delivered to PCs or TVs (or both). Providing a suite of value-added distribution services is perhaps a more lucrative strategy for operators:
Operators must better understand the needs of upstream segments and individual customers (media owners, aggregators, broadcasters, retailers, games providers, social networks, etc.) and develop propositions for value-added services in response to these. Managing end user data is likely to be important here. As one participant put it:
o We need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lies in their user data, not their pipes! [#52]
Customer privacy will clearly be an issue if operators develop solutions for upstream customers that involve managing data flows between both sides of the platform. End users want to know what upstream customers are providing, how they can pay, whether the provider is trusted, etc., while the provider needs to be able to identify and authenticate the customer, as well as understand what content they want and how they want to pay for it. Opt-in is one solution, but building scale through opt-in is complex and time-consuming, so operators need to explore ways of protecting data while using it to add value to transactions over the network.