The future of the US market and its 3rd and 4th operators has been a long-running saga. The market, the world’s richest, remains dominated by the duopoly of AT&T and Verizon Wireless. It was long expected that Softbank’s acquisition of Sprint heralded disruption, but in the event, T-Mobile was simply quicker to the punch.
Since the launch of T-Mobile’s “uncarrier” price-war strategy, we have identified signs of a “Free Mobile-like” disruption event, for example, substantial net-adds for the disruptor, falling ARPUs, a shakeout of MVNOs and minor operators, and increased industry-wide subscriber growth. However, other key indicators like a rapid move towards profitability by the disruptor are not yet in evidence, and rather than industry-wide deflation, we observe divergence, with Verizon Wireless increasing its ARPU, revenues, and margins, while AT&T’s are flat, Sprint’s flat to falling, and T-Mobile’s plunging.
This data is summarised in Figure 1.
Figure 1: Revenue and margins in the US. The duopoly is still very much with us
Source: STL Partners, company filings
Compare and contrast Figure 2, which shows the fully developed disruption in France.
Figure 2: Fully-developed disruption. Revenue and margins in France
Source: STL Partners, company filings
T-Mobile: the state of play in Q2 2014
When reading Figure 1, note that T-Mobile’s Q2 2014 accounts contain a negative expense item of $747m, reflecting a spectrum swap with Verizon Wireless, which flatters its margin. Without it, the operating margin would be 2.99%, about a third of Sprint’s. Poor as this is, it is at least positive territory, after a Q1 in which T-Mobile lost money. It is not quite true to say that T-Mobile only made it to profitability thanks to the one-off spectrum deal: excluding it, the carrier would have made $215m in operating income in Q2, a $243m swing from the $28m net loss in Q1. This is explained by a $223m narrowing of T-Mobile’s losses on device sales, as shown in Figure 3, and may explain why the earnings release emphasises adjusted EBITDA and makes no mention of profits, despite it being a positive quarter.
Figure 3: T-Mobile’s return to underlying profitability – caused by moderating its smartphone bonanza somewhat
Source: STL Partners, company filings
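The profitability swing described above is simple to verify; the figures below are the ones quoted in the text.

```python
# Check the Q1->Q2 swing in T-Mobile's underlying result (all figures in $m,
# taken from the discussion above).
q2_underlying_operating_income = 215   # Q2 2014, excluding the $747m spectrum gain
q1_net_loss = -28                      # Q1 2014 result

swing = q2_underlying_operating_income - q1_net_loss
print(swing)  # 243
```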
T-Mobile management likes to cite its ABPU (Average Billings per User) metric in preference to ARPU; ABPU includes the hire-purchase charges on device sales under its quick-upgrade plans. However, as Figure 4 shows, this is less exciting than it sounds. The T-Mobile management story is that as service prices, and hence ARPU, fall in order to bring in net-adds, payments for device sales “decoupled” from service plans will rise and take up the slack. So far, they are only just doing so. Given that T-Mobile is losing money on device sales, this is no surprise.
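The relationship between the two metrics can be sketched with purely illustrative numbers (these are not T-Mobile’s actual figures):

```python
# Illustrative (not actual) numbers showing how ABPU relates to ARPU:
# ABPU adds monthly equipment-instalment billings to service ARPU.
service_arpu = 50.0           # $/month of service revenue per user (hypothetical)
eip_billings_per_user = 10.0  # $/month billed under instalment plans (hypothetical)

abpu = service_arpu + eip_billings_per_user
print(abpu)  # 60.0

# If service ARPU falls by $5/month, device billings must rise by the same
# amount just to hold ABPU flat - the "decoupling" only takes up the slack
# if device billings grow as fast as service prices fall.
new_arpu = service_arpu - 5.0
required_eip = abpu - new_arpu
print(required_eip)  # 15.0
```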
Executive Summary
Free’s Bid for T-Mobile USA
T-Mobile: the state of play in Q2 2014
Free-Mobile: the financials
Indicators of a successful LBO
Free.fr: a modus operandi for disruption
Surprise and audacity
Simple products
The technical edge
Obstacles to the Free modus operandi
Spectrum
Fixed-mobile synergy
Regulation
Summary
Two strategic options
Hypothesis one: change the circumstances via a strategic deal with the cablecos
Hypothesis two: 80s retro LBO
Problems that bite whichever option is taken
The other shareholders
Free’s management capacity and experience
Conclusion
Figure 1: Revenue and margins in the US. The duopoly is still very much with us
Figure 2: Fully-developed disruption. Revenue and margins in France
Figure 3: T-Mobile’s return to underlying profitability – caused by moderating its smartphone bonanza somewhat
Figure 4: Postpaid ARPU falling steadily, while ABPU just about keeps up
Figure 5: T-Mobile’s supposed “decoupling” of devices from service has extended $3.5bn of credit to its customers, rising at $1bn/quarter
Figure 6: Free’s valuation of T-Mobile is at the top end of a rising trend
Figure 7: Example LBO
Figure 8: Free-T-Mobile in the context of notable leveraged buyouts
Figure 9: Free Mobile’s progress towards profitability has been even more impressive than its subscriber growth
Summary: Key trends, tactics, and technologies for mobile broadband networks and services that will influence mid-term revenue opportunities, cost structures and competitive threats. Includes consideration of LTE, network sharing, WiFi, next-gen IP (EPC), small cells, CDNs, policy control, business model enablers and more. (March 2012, Executive Briefing Service, Future of the Networks Stream).
Below is an extract from this 44 page Telco 2.0 Report that can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and Future Networks Stream here. Non-members can subscribe here, buy a Single User license for this report online here for £795 (+VAT for UK buyers), or for multi-user licenses or other enquiries, please email contact@telco2.net / call +44 (0) 207 247 5003. We’ll also be discussing our findings and more on Facebook at the Silicon Valley (27-28 March) and London (12-13 June) New Digital Economics Brainstorms.
In our recent ‘Under the Floor (UTF) Players’ Briefing we looked at strategies to deal with some of the challenges facing operators resulting from market structure and outsourcing.
This Executive Briefing is intended to complement and extend those efforts, looking specifically at those technical and business trends which are truly “disruptive”, either immediately or in the medium-term future. In essence, the document can be thought of as a checklist for strategists – pointing out key technologies or trends around mobile broadband networks and services that will influence mid-term revenue opportunities and threats. Some of those checklist items are relatively well-known, others more obscure but nonetheless important. What this document doesn’t cover is more straightforward concepts around pricing, customer service, segmentation and so forth – all important to get right, but rarely disruptive in nature.
During 2012, Telco 2.0 will be rolling out a new MBB workshop concept, which will audit operators’ existing technology strategy and planning around mobile data services and infrastructure. This briefing document is a roundup of some of the critical issues we will be advising on, as well as our top-level thinking on the importance of each trend.
It starts by discussing some of the issues which determine the extent of any disruption:
Growth in mobile data usage – and whether the much-vaunted “tsunami” of traffic may be slowing down
The role of standardisation, and whether it is a facilitator or inhibitor of disruption
Whether the most important MBB disruptions are likely to be telco-driven, or will stem from other actors such as device suppliers, IT companies or Internet firms.
The report then drills into a few particular domains where technology is evolving, looking at some of the most interesting and far-reaching trends and innovations. These are split broadly between:
Network infrastructure evolution (radio and core)
Control and policy functions, and business-model enablers
It is not feasible for us to cover all these areas in huge depth in a briefing paper such as this. Some areas such as CDNs and LTE have already been subject to other Telco 2.0 analysis, and this will be linked to where appropriate. Instead, we have drilled down into certain aspects we feel are especially interesting, particularly where these are outside the mainstream of industry awareness and thinking – and tried to map technical evolution paths onto potential business model opportunities and threats.
This report cannot be truly exhaustive – it doesn’t look at the nitty-gritty of silicon components, or antenna design, for example. It also treads a fine line between technological accuracy and ease-of-understanding for the knowledgeable but business-focused reader. For more detail or clarification on any area, please get in touch with us – email mailto:contact@stlpartners.com or call +44 (0) 207 247 5003.
Telco-driven disruption vs. external trends
There are various potential sources of disruption for the mobile broadband marketplace:
New technologies and business models implemented by telcos, which increase revenues, decrease costs, improve performance or alter the competitive dynamics between service providers.
3rd party developments that can either bolster or undermine the operators’ broadband strategies. This includes both direct MBB innovations (new uses of WiFi, for example), or bleed-over from adjacent related marketplaces such as device creation or content/application provision.
External, non-technology effects such as changing regulation, economic backdrop or consumer behaviour.
The majority of this report covers “official” telco-centric innovations – LTE networks, new forms of policy control and so on.
External disruptions to monitor
But the most dangerous form of innovation comes from third parties, who can undermine assumptions about the ways mobile broadband can be used, introduce new mechanisms for arbitrage, or otherwise subvert operators’ pricing plans and network controls.
In the voice communications world, there are often regulations in place to protect service providers – such as banning the use of “SIM boxes” to terminate calls and reduce interconnection payments. But in the data environment, it is far less obvious that many work-arounds can either be seen as illegal, or even outside the scope of fair-usage conditions. That said, we have already seen some attempts by telcos to manage these effects – such as charging extra for “tethering” on smartphones.
It is not really possible to predict all possible disruptions of this type – such is the nature of innovation. But by describing a few examples, market participants can gauge their level of awareness, as well as gain motivation for ongoing “scanning” of new developments.
Some of the areas being followed by Telco 2.0 include:
Connection-sharing. This is where users might link devices together locally, perhaps through WiFi or Bluetooth, and share multiple cellular data connections. This is essentially “multi-tethering” – for example, 3 smartphones discovering each other nearby, perhaps each with a different 3G/4G provider, and pooling their connections together for shared use. From the user’s point of view it could improve effective coverage and maximum/average throughput speed. But from the operators’ view it would break the link between user identity and subscription, and essentially offload traffic from poor-quality networks on to better ones.
SoftSIM or SIM-free wireless. Over the last five years, various attempts have been made to decouple mobile data connections from SIM-based authentication. In some ways this is not new – WiFi doesn’t need a SIM, it is optional for WiMAX, and CDMA devices have typically been “hard-coded” to register on a specific operator network. But the GSM/UMTS/LTE world has always relied on subscriber identification through a physical card. At one level, this is very good – SIMs are distributed easily and have enabled a successful prepay ecosystem to evolve. They provide operator control points and the ability to host secure applications on the card itself. However, the need to obtain a physical card restricts business models, especially for transient/temporary use such as a “one day pass”. But the most dangerous potential change is a move to a “soft” SIM, embedded in the device software stack. Companies such as Apple have long dreamed of acting as a virtual network provider, brokering between the user and multiple networks. There is even a patent for per-call (or perhaps per-data-connection) bidding, with telcos competing head to head on price/quality grounds. Telco 2.0 views this type of least-cost routing as a major potential risk for operators, especially for mobile data – although it could also enable some new business models that have been difficult to achieve in the past.
Encryption. Various of the new business models and technology deployment intentions of operators, vendors and standards bodies are predicated on analysing data flows. Deep packet inspection (DPI) is expected to be used to identify applications or traffic types, enabling differential treatment in the network, or different charging models. Yet this is rendered largely useless (or at least severely limited) when encryption is used. Various content and application types already secure data in this way – content DRM, BlackBerry traffic, corporate VPN connections and so on. But increasingly, we will see major Internet companies such as Apple, Google, Facebook and Microsoft using such techniques, both for their own users’ security and because it hides precise indicators of usage from the network operators. If a future Android phone sends all its mobile data back via a VPN tunnel and breaks it out in Mountain View, California, operators will be unable to discern YouTube video from search or VoIP traffic. This is one of the reasons why application-based charging models – one- or two-sided – are difficult to implement.
Application evolution speed. One of the largest challenges for operators is the pace of change of mobile applications. The growing penetration of smartphones, appstores and ease of “viral” adoption of new services causes a fundamental problem – applications emerge and evolve on a month-by-month or even week-by-week basis. This is faster than any realistic internal telco processes for developing new pricing plans, or changing network policies. Worse, the nature of “applications” is itself changing, with the advent of HTML5 web-apps, and the ability to “mash up” multiple functions in one app “wrapper”. Is a YouTube video shared and embedded in a Facebook page a “video service”, or “social networking”?
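The least-cost routing risk raised under “SoftSIM” above can be sketched as a per-connection auction. This is a toy model: the operators, prices and quality scores below are invented for illustration, not drawn from any real offer.

```python
# Toy sketch of per-connection least-cost routing by a hypothetical
# "soft SIM" broker. All operators, prices and quality scores are invented.
offers = [
    {"operator": "OpA", "price_per_mb": 0.020, "quality": 0.90},
    {"operator": "OpB", "price_per_mb": 0.015, "quality": 0.70},
    {"operator": "OpC", "price_per_mb": 0.025, "quality": 0.95},
]

def choose(offers, min_quality=0.8):
    """Pick the cheapest offer meeting a minimum quality threshold."""
    eligible = [o for o in offers if o["quality"] >= min_quality]
    return min(eligible, key=lambda o: o["price_per_mb"])

print(choose(offers)["operator"])  # OpA: cheapest of the acceptable-quality offers
```

The point the sketch makes is that the broker, not the operator, owns the customer relationship: each operator is reduced to the cheapest acceptable bid for each connection.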
It is also important to recognise that certain procedures and technologies used in policy and traffic management will likely have unanticipated side-effects. Users, devices and applications are likely to respond to controls that limit their actions, while other developments may spontaneously produce “emergent behaviours”. For instance, there is a risk that too-strict data caps might change usage models for smartphones, making users connect to the network only when absolutely necessary. This is likely to be at the same times and places when other users also feel it necessary, with the unfortunate implication that peaks of usage get “spikier” rather than being ironed out.
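The “spikier peaks” effect can be illustrated with a deliberately simple simulation: a capped user sends less traffic overall but concentrates it in the few periods when connecting feels unavoidable. All numbers are invented.

```python
# Toy illustration of cap-induced peak concentration: total volume falls,
# but the busy-hour peak rises. All numbers are invented for illustration.
uncapped = [10, 12, 30, 28, 12, 10]   # MB per period for a typical user
# Under a strict cap, the same user defers traffic into two "necessary" periods:
capped = [0, 0, 45, 40, 0, 0]

print(sum(uncapped), max(uncapped))   # 102 30
print(sum(capped), max(capped))       # 85 45 - lower volume, higher peak
```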
There is no easy answer to these types of external threat. Operator strategists and planners simply need to keep watch on emerging trends, and perhaps stress-test their assumptions and forecasts with market observers who keep tabs on such developments.
The mobile data explosion… or maybe not?
It is an undisputed fact that mobile data is growing exponentially around the world. Or is it?
A J-curve or an S-curve?
Telco 2.0 certainly thinks that growth in data usage is occurring, but is starting to see signs that the smooth curves that drive so many other decisions might not be so smooth – or so steep – after all. If this proves to be the case, it could be far more disruptive to operators and vendors than any of the individual technologies discussed later in the report. If operator strategists are not at least scenario-planning for lower data growth rates, they may find themselves in a very uncomfortable position in a year’s time.
In its most recent study of mobile operators’ traffic patterns, Ericsson concluded that Q2 2011 data growth was just 8% globally, quarter-on-quarter, a far cry from the 20%+ growth rates seen previously, and leaving a chart that looks distinctly like the beginning of an S-curve rather than a continued “hockey stick”. Given that the 8% includes a sizeable contribution from undoubted high-growth developing markets like China, it suggests that other markets are maturing quickly. (We are rather sceptical of Ericsson’s suggestion of seasonality in the data.) Other data points come from O2 in the UK, which appears to have had essentially zero traffic growth for the past few quarters, and Vodafone, which now cites European data traffic as growing more slowly (19% year-on-year) than its data revenues (21%). Our view is that current global growth is c.60-70%: c.40% in mature markets and 100%+ in developing markets.
Figure 1 – Trends in European data usage
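To put the quarterly rates in context, compounding makes the difference stark: 8% quarter-on-quarter annualises to roughly 36%, while 20% quarter-on-quarter annualises to over 100%.

```python
# Annualise the quarter-on-quarter growth rates quoted above.
def annualised(qoq):
    """Compound a quarterly growth rate over four quarters."""
    return (1 + qoq) ** 4 - 1

print(round(annualised(0.08) * 100, 1))  # 36.0  -> ~36% a year at 8% QoQ
print(round(annualised(0.20) * 100, 1))  # 107.4 -> ~107% a year at 20% QoQ
```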
Now it is possible that various one-off factors are at play here – the shift from unlimited to tiered pricing plans, the stronger enforcement of “fair-use” plans and the removal of particularly egregious heavy users. Certainly, other operators are still reporting strong growth in traffic levels. We may see a resumption of growth, for example if cellular-connected tablets start to be used widely for streaming video.
But we should also consider the potential market disruption, if the picture is less straightforward than the famous exponential charts. Even if the chart looks like a 2-stage S, or a “kinked” exponential, the gap may have implications, like a short recession in the economy. Many of the technical and business model innovations in recent years have been responses to the expected continual upward spiral of demand – either controlling users’ access to network resources, pricing it more highly and with greater granularity, or building out extra capacity at a lower price. Even leaving aside the fact that raw, aggregated “traffic” levels are a poor indicator of cost or congestion, any interruption or slow-down of the growth will invalidate a lot of assumptions and plans.
Our view is that the scary forecasts of “explosions” and “tsunamis” have led virtually all parts of the industry to create solutions to the problem. We can probably list more than 20 approaches, most of them standalone “silos”.
Figure 2 – A plethora of mobile data traffic management solutions
What seems to have happened is that at least 10 of those approaches have worked – caps/tiers, video optimisation, WiFi offload, network densification and optimisation, collaboration with application firms to create “network-friendly” software and so forth. Taken collectively, there is actually a risk that they have worked “too well”, to the extent that some previous forecasts have turned into “self-denying prophecies”.
There is also another common forecasting problem occurring – the assumption that later adopters of a technology will behave like earlier users. In many markets we are now reaching 30-50% smartphone penetration. That means that all the most enthusiastic users are already connected, and we’re left with those that are (largely) ambivalent and probably quite light users of data. That will bring the averages down, even if each individual user is still increasing their consumption over time. But even that assumption may be flawed, as caps have made people concentrate much more on their usage, offloading to WiFi and restricting their data flows. There is also some evidence that the growing number of free WiFi points is reducing laptop use of mobile data, which accounts for 70-80% of the total in some markets, while the much-hyped shift to tablets isn’t driving much extra mobile data, as most are WiFi-only.
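The late-adopter effect is worth seeing numerically, because it is counter-intuitive: average usage per user can fall even while every existing user’s consumption grows. The numbers below are invented for illustration.

```python
# Toy illustration of the late-adopter dilution effect. Numbers invented.
early_adopters = [2.0] * 10   # GB/month each, already on the network
late_adopters = [0.3] * 10    # lighter users joining now

year1 = early_adopters
# Incumbents each grow 20%, while light late adopters join at 0.3 GB/month:
year2 = [u * 1.2 for u in early_adopters] + late_adopters

avg1 = sum(year1) / len(year1)
avg2 = sum(year2) / len(year2)
print(avg1, round(avg2, 2))  # 2.0 1.35 - the average falls despite growth
```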
So has the industry over-reacted to the threat of a “capacity crunch”? What might be the implications?
The problem is that focusing on a single, narrow metric “GB of data across the network” ignores some important nuances and finer detail. From an economics standpoint, network costs tend to be driven by two main criteria:
Network coverage in terms of area or population
Network capacity at the busiest places/times
Coverage is (generally) therefore driven by factors other than data traffic volumes. Many cells have to be built and run anyway, irrespective of whether there’s actually much load – the operators all want to claim good footprints and may be subject to regulatory rollout requirements. Peak capacity in the most popular locations, however, is a different matter. That is where issues such as spectrum availability, cell site locations and the latest high-speed networks become much more important – and hence costs do indeed rise. However, it is far from obvious that the problems at those “busy hours” are always caused by “data hogs” rather than sheer numbers of people each using a small amount of data. (There is also another issue around signalling traffic, discussed later).
Yes, there is a generally positive correlation between network-wide volume growth and costs, but it is far from perfect, and certainly not a direct causal relationship.
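A minimal cost model reflecting the two drivers above makes the point: coverage is effectively a fixed cost, while capacity cost is driven by the busy-hour peak rather than total volume. All coefficients here are invented for illustration, not derived from any operator’s cost base.

```python
# Minimal network-cost sketch: coverage is fixed, capacity cost follows
# busy-hour peak load, not aggregate traffic. Coefficients are invented.
def network_cost(coverage_sites, busy_hour_gb,
                 cost_per_site=100_000, cost_per_peak_gb=500):
    return coverage_sites * cost_per_site + busy_hour_gb * cost_per_peak_gb

# Doubling off-peak traffic leaves this model's cost unchanged;
# only a higher busy-hour peak moves the needle:
print(network_cost(1000, 2000))  # 101000000
print(network_cost(1000, 4000))  # 102000000 - peak doubled, cost up ~1%
```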
So let’s hypothesise briefly about what might occur if data traffic growth does tail off, at least in mature markets.
Delays to LTE rollout – if 3G networks are filling up less quickly than expected, the urgency of 4G deployment is reduced.
The focus of policy and pricing for mobile data may switch back to encouraging use rather than discouraging/controlling it. Capacity utilisation may become an important metric, given the high fixed costs and low marginal ones. Expect more loyalty-type schemes, plus various methods to drive more usage in quiet cells or off-peak times.
Regulators may start to take different views of traffic management or predicted spectrum requirements.
Prices for mobile data might start to fall again, after a period where we have seen them rise. Some operators might be tempted back to unlimited plans, for example if they offer “unlimited off-peak” or similar options.
Many of the more complex and commercially-risky approaches to tariffing mobile data might be deprioritised. For example, application-specific pricing involving packet-inspection and filtering might get pushed back down the agenda.
In some cases, we may even end up with overcapacity on cellular data networks – not to the degree we saw in fibre in 2001-2004, but there might still be an “overhang” in some places, especially if there are multiple 4G networks.
Steady growth of (say) 20-30% peak data per annum should be manageable with the current trends in price/performance improvement. It should be possible to deploy and run networks to meet that demand with reducing unit “production cost”, for example through use of small cells. That may reduce the pressure to fill the “revenue gap” on the infamous scissors-diagram chart.
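The manageability claim above rests on unit costs falling roughly as fast as peak demand grows. A back-of-envelope check, with assumed (not sourced) figures of 25% annual demand growth and a 20% annual decline in cost per GB:

```python
# Back-of-envelope check: if peak demand grows ~25%/year while unit cost
# per GB falls ~20%/year (both assumed figures), total cost stays flat.
demand_growth = 1.25       # assumed annual growth in busy-hour demand
unit_cost_factor = 0.80    # assumed: cost per GB falls 20% a year

cost_index = 1.0
for year in range(3):
    cost_index *= demand_growth * unit_cost_factor

print(round(cost_index, 3))  # 1.0 -> total cost roughly flat after 3 years
```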
Overall, it is still a little too early to declare shifting growth patterns for mobile data as a “disruption”. There is a lack of clarity on what is happening, especially in terms of responses to the new controls, pricing and management technologies put recently in place. But operators need to watch extremely closely what is going on – and plan for multiple scenarios.
Specific recommendations will depend on an individual operator’s circumstances – user base, market maturity, spectrum assets, competition and so on. But broadly, we see three scenarios and implications for operators:
“All hands on deck!”: Continued strong growth (perhaps with a small “blip”) which maintains the pressure on networks, threatens congestion, and drives the need for additional capacity, spectrum and capex.
Operators should continue with current multiple strategies for dealing with data traffic – acquiring new spectrum, upgrading backhaul, exploring massive capacity enhancement with small cells and examining a variety of offload and optimisation techniques. Where possible, they should explore two-sided models for charging and use advanced pricing, policy or segmentation techniques to rein in abusers and reward those customers and applications that are parsimonious with their data use. Vigorous lobbying activities will be needed, for gaining more spectrum, relaxing Net Neutrality rules and perhaps “taxing” content/Internet companies for traffic injected onto networks.
“Panic over”: Moderating and patchy growth, which settles to a manageable rate – comparable with the patterns seen in the fixed broadband marketplace.
This will mean that operators can “relax” a little, with the respite in explosive growth meaning that the continued capex cycles should be more modest and predictable. Extension of today’s pricing and segmentation strategies should improve margins, with continued innovation in business models able to proceed without rush, and without risking confrontation with Internet/content companies over traffic management techniques. Focus can shift towards monetising customer insight, ensuring that LTE rollouts are strategic rather than tactical, and exploring new content and communications services that exploit the improving capabilities of the network.
“Hangover”: Growth flattens off rapidly, leaving operators with unused capacity and threatening brutal price competition between telcos.
This scenario could prove painful, reminiscent of the early-2000s experience in the fixed-broadband marketplace. Wholesale business models could help generate incremental traffic and revenue, while the emphasis will be on fixed-cost minimisation. Some operators will scale back 4G rollouts until cost and maturity pass the tipping-point for outright replacement of 3G. Restrictive policies on bandwidth use will be lifted, as operators compete to give customers the fastest / most open access to the Internet on mobile devices. Consolidation – and perhaps bankruptcies – may ensue, as declining data prices coincide with substitution of the core voice and messaging business.
To read the note in full, including the following analysis…
Introduction
Telco-driven disruption vs. external trends
External disruptions to monitor
The mobile data explosion… or maybe not?
A J-curve or an S-curve?
Evolving the mobile network
Overview
LTE
Network sharing, wholesale and outsourcing
WiFi
Next-gen IP core networks (EPC)
Femtocells / small cells / “cloud RANs”
HetNets
Advanced offload: LIPA, SIPTO & others
Peer-to-peer connectivity
Self optimising networks (SON)
M2M-specific broadband innovations
Policy, control & business model enablers
The internal politics of mobile broadband & policy
Two sided business-model enablement
Congestion exposure
Mobile video networking and CDNs
Controlling signalling traffic
Device intelligence
Analytics & QoE awareness
Conclusions & recommendations
Index
…and the following figures…
Figure 1 – Trends in European data usage
Figure 2 – A plethora of mobile data traffic management solutions
Figure 3 – Not all operator WiFi is “offload” – other use cases include “onload”
Figure 4 – Internal ‘power tensions’ over managing mobile broadband
Figure 5 – How a congestion API could work
Figure 6 – Relative Maturity of MBB Management Solutions
Figure 9 – Summary of disruptive network innovations
It is over fourteen years since David Isenberg wrote his seminal paper The Rise of the Stupid Network in which he outlined the view that telephony networks would increasingly become dumb pipes as intelligent endpoints came to control how and where data was transported. Many of his predictions have come to fruition. Cheaper computing technology has resulted in powerful ‘smartphones’ in the hands of millions of people and new powerful internet players are using data centres to distribute applications and services ‘over the top’ to users over fixed and mobile networks.
The hypothesis behind this piece of research is that endpoints cannot completely control the network. STL Partners believes that the network itself needs to retain intelligence so it can interpret the information it is transporting between the endpoints. Mobile network operators, quite rightly, will not be able to control how the network is used but must retain the ability within the network to facilitate a better experience for the endpoints. The hypothesis being tested in this research is that ‘smart pipes’ are needed to:
Ensure that data is transported efficiently so that capital and operating costs are minimised and the internet and other networks remain cheap methods of distribution.
Improve user experience by matching the performance of the network to the nature of the application or service being used. ‘Best effort’ is fine for asynchronous communication, such as email or text, but unacceptable for voice. A video call or streamed movie requires guaranteed bandwidth, and real-time gaming demands ultra-low latency;
Charge appropriately for use of the network. It is becoming increasingly clear that the Telco 1.0 business model – that of charging the end-user per minute or per Megabyte – is under pressure as new business models for the distribution of content and transportation of data are being developed. Operators will need to be capable of charging different players – end-users, service providers, third-parties (such as advertisers) – on a real-time basis for provision of broadband and guaranteed quality of service (QoS);
Facilitate interactions within the digital economy. Operators can compete and partner with other players, such as the internet companies, in helping businesses and consumers transact over the internet. Networks are no longer confined to communications but are used to identify and market to prospects, complete transactions, make and receive payments and remittances, and care for customers. The knowledge that operators have about their customers coupled with their skills and assets in identity and authentication, payments, device management, customer care etc. mean that ‘the networks’ can be ‘enablers’ in digital transactions between third-parties – helping them to happen more efficiently and effectively.
Overall, smarter networks will benefit network users – upstream service providers and end users – as well as the mobile network operators and their vendors and partners. Operators will also be competing to be smarter than their peers as, by differentiating here, they gain cost, revenue and performance advantages that will ultimately transform in to higher shareholder returns.
Sponsorship and editorial independence
This report has kindly been sponsored by Tellabs and is freely available. Tellabs developed the initial concepts, and provided STL Partners with the primary input and scope for the report. The research, analysis and writing of the report itself were carried out independently by STL Partners. The views and conclusions contained herein are those of STL Partners.
About Tellabs
Tellabs innovations advance the mobile Internet and help our customers succeed. That’s why 43 of the top 50 global communications service providers choose our mobile, optical, business and services solutions. We help them get ahead by adding revenue, reducing expenses and optimizing networks.
Tellabs (Nasdaq: TLAB) is part of the NASDAQ Global Select Market, Ocean Tomo 300® Patent Index, the S&P 500 and several corporate responsibility indexes including the Maplecroft Climate Innovation Index, FTSE4Good and eight FTSE KLD indexes. http://www.tellabs.com
Executive Summary
Mobile operators no longer growth stocks
Mobile network operators are now valued as utility companies in the US and Europe (less so in APAC). Investors are not expecting future growth to be higher than GDP, and so are demanding that money be returned in the form of high dividends.
Two ‘smart pipes’ strategies available to operators
In his seminal book Competitive Strategy, Michael Porter identified three generic strategies for companies – ‘Cost leadership’, ‘Differentiation’ and ‘Focus’. Two of these are viable in the mobile telecommunications industry: Cost leadership, or Happy Pipe in STL Partners’ parlance, and Differentiation, or Full-service Telco 2.0. No network operator has found a Focus strategy to work, as limiting the customer base to a segment of the market has not yielded sufficient returns on the high capital investment of building a network. Even MVNOs that have pursued this strategy, such as Helio, which targeted Korean nationals in the US, have struggled.
Underpinning the two business strategies are related ‘smart pipe’ approaches – smart network and smart services:
Porter Strategy | Telco 2.0 strategy | Nature of smartness | Characteristics
Cost leadership | Happy Pipe | Smart network | Cost efficiency – minimal network, IT and commercial costs. Simple utility offering.
Differentiation | Full-service Telco 2.0 | Smart services | Technical and commercial flexibility: improve customer experience by integrating network capabilities with own and third-party services and charging either end user or service provider (or both).
Source: STL Partners
It is important to note that, currently at least, a smart network is a prerequisite for smart services. It would be impossible for an operator to implement a Full-service Telco 2.0 strategy without significant network intelligence. Full-service Telco 2.0 is, therefore, an addition to a Happy Pipe strategy.
In a survey conducted for this report, it was clear that operators are pursuing ‘smart’ strategies, whether at the network level or extending beyond this into smart services, for three reasons:
Revenue growth: protecting existing revenue sources and finding new ones. This is seen as the single most important driver of building more intelligence.
Cost savings: reducing capital and operating costs.
Performance improvement: providing customers with an improved customer experience.
Assuming that most mobile operators currently have limited smartness in either network or services, our analysis suggests significant upside in financial performance from successfully implementing either a Happy Pipe or a Full-service Telco 2.0 strategy. Most mobile operators generate Cash Returns on Invested Capital (CROIC) of between 5% and 7%; for the purposes of our analysis, we have assumed a baseline of 5.8%. The lower capital and operating costs of a Happy Pipe strategy could increase this to 7.4%, and successful implementation of a Full-service Telco 2.0 strategy would increase it to a handsome 13.3%:
Telco 2.0 strategy | Nature of smartness | Cash Returns on Invested Capital
As-is – Telco 1.0 | Low – relatively dumb | 5.8%
Happy Pipe | Smart network | 7.4%
Full-service Telco 2.0 | Smart services | 13.3%
Source: STL Partners
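As a rough illustration of how such uplifts arise, CROIC can be approximated as cash flow from operations less capex, divided by invested capital. The sketch below uses purely hypothetical input figures, chosen only to reproduce the report's headline percentages – they are not drawn from any company's accounts:

```python
# Simplified Cash Returns on Invested Capital (CROIC) sketch.
# All input figures are hypothetical illustration values, not company data.

def croic(operating_cash_flow: float, capex: float, invested_capital: float) -> float:
    """One common simplification: (cash from operations - capex) / invested capital."""
    return (operating_cash_flow - capex) / invested_capital

invested_capital = 100.0  # assume a base of 100 units of invested capital

# Telco 1.0 baseline: thin cash margin after heavy capex.
baseline = croic(operating_cash_flow=35.8, capex=30.0, invested_capital=invested_capital)

# Happy Pipe: lower operating costs lift cash flow; network sharing trims capex.
happy_pipe = croic(operating_cash_flow=36.4, capex=29.0, invested_capital=invested_capital)

# Full-service Telco 2.0: smart services add cash flow on the same asset base.
full_service = croic(operating_cash_flow=42.3, capex=29.0, invested_capital=invested_capital)

print(f"Telco 1.0:              {baseline:.1%}")      # 5.8%
print(f"Happy Pipe:             {happy_pipe:.1%}")    # 7.4%
print(f"Full-service Telco 2.0: {full_service:.1%}")  # 13.3%
```

The point of the sketch is that modest moves in cash flow and capex, against a fixed invested-capital base, produce large swings in the ratio.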
STL Partners has identified six opportunity areas for mobile operators to exploit with a Full-service Telco 2.0 strategy. Summarised here, these are outlined in detail in the report:
Opportunity Type | Approach | Typical Services
Core Services | Improving revenues and customer loyalty by better design, analytics, and smart use of data in existing services. | Access, Voice and Messaging, Broadband, Standard Wholesale, Generic Enterprise ICT Services (inc. SaaS)
Vertical industry solutions (SI) | Delivery of ICT projects and support to vertical enterprise sectors. | Systems Integration (SI), Vertical CEBP solutions, Vertical ICT, Vertical M2M solutions, and Private Cloud.
Infrastructure services | Optimising cost and revenue structures by buying and selling core telco ICT asset capacity. | –
Embedded communications | Enabling wider use of voice, messaging, and data by facilitating access to them and embedding them in new products. | Comes with data, Sender pays delivery, Horizontal M2M Platforms, Voice, Messaging and Data APIs for 3rd Parties.
Third-party business enablers | Enabling new telco assets (e.g. customer data) to be leveraged in support of 3rd party business processes. | Telco enabled Identity and Authorisation, Advertising and Marketing, Payments. APIs to non-core services and assets.
Own-brand OTT services | Building value through Telco-owned online properties and ‘Over-the-Top’ services. | Online Media, Enterprise Web Services, Own Brand VOIP services.
Source: STL Partners
Regional approaches to smartness vary
As operators globally experience a slow-down in revenue growth, they are pursuing ways of maintaining margins by reducing costs. Unsurprisingly, therefore, most operators in North America, Europe and Asia-Pacific appear to be pursuing a Happy Pipe/smart network strategy. Lower capital and operating costs and improved network performance are being sought through such approaches as:
Physical network sharing – usually involving passive elements such as towers, air-conditioning equipment, generators, technical premises and pylons.
Peering data traffic rather than charging (and being charged) for transit.
Wi-Fi offload – moving data traffic from the mobile network on to cheaper fixed networks.
Distributing content more efficiently through the use of multicast and CDNs.
Efficient network configuration and provisioning.
Traffic shaping/management via deep-packet inspection (DPI) and policy controls.
Network protection – implementing security procedures for abuse/fraud/spam so that network performance is maximised.
Device management to ameliorate device impact on network and improve customer experience
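Of the approaches above, traffic shaping is perhaps the most algorithmically involved. A classic building block is the token bucket, sketched below; the rate and burst values are illustrative assumptions, not any operator's actual policy:

```python
# Minimal token-bucket traffic shaper: a packet is admitted while tokens
# remain in the bucket, which refills at a fixed rate. Illustrative only.
import time

class TokenBucket:
    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s      # sustained rate
        self.capacity = burst_bytes       # maximum burst allowance
        self.tokens = burst_bytes         # bucket starts full
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True
        return False                      # caller would queue or drop the packet

bucket = TokenBucket(rate_bytes_per_s=125_000, burst_bytes=10_000)  # ~1 Mbit/s
print(bucket.allow(1500))  # True -- first packets pass on the initial burst
```

Policy control in a real network sits above this primitive, choosing per-subscriber or per-application rates; the bucket itself is the enforcement mechanism.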
Vodafone Asia-Pacific is a good example of an operator pursuing these activities aggressively, and as ends in themselves rather than as a basis for a Telco 2.0 strategy. Yota in Russia and LightSquared in the US are similarly content with being Happy Pipers.
In general, Asia-Pacific has the most disparate set of markets and operators. Markets vary radically in terms of maturity, structure and regulation, and operators seem to polarise into extreme Happy Pipers (Vodafone APAC, China Mobile, Bharti) and Full-service Telco 2.0 players (NTT Docomo, SK Telecom, SingTel, Globe).
Europe is home to the operator with the most complete Telco 2.0 vision globally: Telefonica. The company has built and acquired a number of ‘smart services’ which appear to be gaining traction, including O2 Priority Moments, Jajah, Tuenti and Terra. Recent structural changes, in which Telefonica Digital was created to focus on opportunities in the digital economy, further indicate the company’s focus on Telco 2.0 and smart services. Europe also appears to be the most collaborative market: Vodafone, Telefonica, Orange, Telecom Italia and T-Mobile are all working together on a number of Telco 2.0 projects and, in so doing, seek to generate enough scale to attract upstream developers and downstream end-users.
The sheer scale of the two leading mobile operators in the US, AT&T and Verizon, which have over 100 million subscribers each, means that they are taking a different approach to Telco 2.0. They are collaborating on one or two opportunities, notably ISIS, a near-field communications payments solution for mobile, which is a joint offer from AT&T, Verizon and T-Mobile. In the main, however, there is a high degree of what one interviewee described as ‘Big Bell dogma’ – the view that their company is big enough and powerful enough to take on the OTT players and ‘control’ the experiences of end users in the digital economy. The US market is more consolidated than Europe’s (giving the big players more power) but, even so, it seems unlikely that either AT&T or Verizon can keep customers using only their services – the lamented walled-garden approach.
Implementing a Telco 2.0 strategy is important but challenging
STL Partners explored, via industry interviews with operators and a quantitative survey, both how important and how difficult it is to implement the changes required to deliver a Happy Pipe strategy (outlined in the bullets above) and those needed for a Full-service Telco 2.0 strategy. The key findings of this analysis were:
Overall, respondents felt that many activities were important as part of a smart strategy. In our survey, all except two activity areas – Femto/pico underlay and Enhanced switches (vs. routers) – were rated by more than 50% of respondents as either ‘Quite important’ or ‘Very important’ (see chart below).
Activities associated with a Full-service Telco 2.0 strategy were rated as particularly important:
Making operator assets available via APIs, Differentiated pricing and charging and Personalised and differentiated services were ranked 1, 2 and 3 out of the thirteen activities.
Few respondents considered any of the actions dangerous or likely to destroy value, although Physical network sharing and Traffic shaping/DPI were the most frequently cited here.
NOTE: Overall ranking was based on a weighted scoring policy of Very important +4, Quite important +3, Not that important +2, Unimportant +1, Dangerous -4.
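The ranking mechanics are straightforward to reproduce. The sketch below applies the survey's stated weighting policy to made-up response counts, purely to show the calculation:

```python
# Weighted importance score using the survey's scoring policy:
# Very important +4, Quite important +3, Not that important +2,
# Unimportant +1, Dangerous -4.
WEIGHTS = {
    "Very important": 4,
    "Quite important": 3,
    "Not that important": 2,
    "Unimportant": 1,
    "Dangerous": -4,
}

def weighted_score(responses: dict) -> int:
    """Sum of (response count x category weight) for one activity."""
    return sum(WEIGHTS[category] * count for category, count in responses.items())

# Hypothetical response counts for a single activity:
example = {
    "Very important": 20,
    "Quite important": 10,
    "Not that important": 5,
    "Unimportant": 2,
    "Dangerous": 1,
}
print(weighted_score(example))  # 80 + 30 + 10 + 2 - 4 = 118
```

Note how the -4 weight lets even a few "Dangerous" votes drag an otherwise popular activity down the ranking, which is presumably the intent of the asymmetric scale.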
Overall, most survey respondents and interviewees felt that operators had a better chance of delivering a Happy Pipe strategy, and that only a few Tier 1 operators would be successful with a Full-service Telco 2.0 strategy. For both strategies, they were surprisingly sceptical about operators’ ability to implement the necessary changes. Five major barriers to success were cited, looming particularly large for a Full-service Telco 2.0 strategy:
Competition from internet players. Google, Apple, Facebook et al preventing operators from expanding their role in the digital economy.
Difficulty in building a viable ecosystem. Bringing together the required players for such things as near-field communications (NFC) mobile payments and sharing value among them.
Lack of mobile operator skills. The failure of operators to develop or exploit key skills required for facilitating transactions, such as customer data management and privacy.
Culture. Being too wedded to existing products, services and business models to alter the direction of the super-tanker.
Organisation structure. Putting in place the people and processes to manage the change.
Looking at the specific activities required to build smartness, it was clear that those required for a Full-service Telco 2.0/smart services strategy are considered the hardest to implement (see chart below):
Personalised and differentiated services via use of customer data – content, advertising, etc.
Making operator assets available to end users and other service providers – location, presence, ID, payments
Differentiated pricing and charging based on customer segment, service, QoS
NOTE: Overall ranking was based on a weighted scoring policy of Very easy +5, Relatively straightforward +4, Manageable +3, Quite difficult +2, Very difficult -2.
Conclusions and recommendations
By comparing the relative importance of specific activities against how easy they are to implement, we were able to classify them into four categories:
Unfortunately, as the chart above shows, no activities fall clearly into the ‘Forget’ categories but there are some clear priorities:
A Full-service Telco 2.0 strategy is about striving for a new role in the digital economy and is probably most appropriate for Tier 1 MNOs, since it is going to require substantial scale and investment in new skills such as software and application development and customer data. It will also require the development of new partnerships and ecosystems and complex commercial arrangements with players from other industries (e.g. banking).
There is a cluster of smart network activities that are individually relatively straightforward to implement and will yield a big bang for the buck if investments are made – the ‘Must get right’ group:
More efficient network configuration and provisioning;
Strengthen network security to cope with abuse and fraud;
Improve device management (and cooperation with handset manufacturers and content players) to reduce the impact of smartphone burden on the network;
Although deemed more marginal in our survey, we would include as equally important:
Traffic shaping and DPI, which in many cases underpin various smart services opportunities such as differentiated pricing based on QoS;
Multicast and CDNs, which are proven in the fixed world and likely to be equally beneficial in a video-dominated mobile one.
There is a second cluster of smart network activities which appear to be equally easy (or difficult) to implement but are deemed by respondents to be lower value, and therefore fall into a lower ‘Housekeeping’ category:
Wi-Fi offload – we were surprised by this given the emphasis placed on this by NTT Docomo, China Mobile, AT&T, O2 and others;
Peering (vs. transit) and Enhanced switches – this is surely business-as-usual for all MNOs;
Femto/Pico underlay – generally felt to be of limited importance by respondents although a few cited its importance in pushing network intelligence to the edge which would enable MNOs to more easily deliver differentiated QoS and more innovative retail and wholesale revenue models;
Physical network sharing – again, a surprising result given the keenness of the capital markets on this strategy.
Overall, it appears that mobile network operators need to continue to invest resources in developing smart networks but that a clear prioritisation of efforts is needed given the multitude of ‘moving parts’ required to develop a smart network that will deliver a successful Happy Pipe strategy.
A successful Full-service Telco 2.0 strategy is likely to be extremely profitable for a mobile network operator and would result in a substantial increase in share price. But delivering this remains a major challenge, and investors are sceptical. Collaboration, experimentation and investment are important facets of a Telco 2.0 implementation strategy, as they drive scale, learning and innovation respectively. Given investors’ demands for dividend yields, investment is only likely to be available if an operator becomes more efficient, so implementing a Happy Pipe strategy which reduces capital and operating costs is critical.
Report Contents
Executive Summary
Mobile network operator challenges
The future could still be bright
Defining a ‘smart’ network
Understanding operator strategies
Video: Case study in delivering differentiation and cost leadership
The benefits of Smart on CROIC
Implementing a ‘smart’ strategy
Conclusions and recommendations
Report Figures
Figure 1: Pressure from all sides for operators
Figure 2: Vodafone historical dividend yield – from growth to income
Figure 3: Unimpressed capital markets and falling employment levels
Figure 4: Porter and Telco 2.0 competitive strategies
Figure 5: Defining Differentiation/Telco 2.0
Figure 6 – The Six Opportunity Areas – Approach, Typical Services and Examples
Figure 7: Defining Cost Leadership/Happy Pipe
Figure 8: Defining ‘smartness’
Figure 9: Telco 2.0 survey – Defining smartness
Figure 10: NTT’s smart content delivery system – a prelude to mobile CDNs?
Figure 11: Vodafone India’s ARPU levels are now below $4/month, illustrating the need for a ‘smart network’ approach
Figure 12: China Mobile’s WLAN strategy for coverage, capacity and cost control
Figure 13: GCash – Globe’s text-based payments service
Figure 14: PowerOn – SingTel’s on-demand business services
In some quarters of the telecoms industry, the received wisdom is that the network itself is merely an undifferentiated “pipe”, providing commodity connectivity, especially for data services. The value, many assert, is in providing higher-tier services, content and applications, either to end-users, or as value-added B2B services to other parties. The Telco 2.0 view is subtly different. We maintain that:
Increasingly, valuable services will be provided by third parties, but operators can provide a few end-user services themselves. They will, for example, continue to offer voice and messaging services for the foreseeable future.
Operators still have an opportunity to offer enabling services to ‘upstream’ service providers such as personalisation and targeting (of marketing and services) via use of their customer data, payments, identity and authentication and customer care.
Even if operators fail at options 1 and 2 above (or choose not to pursue them), the network must be ‘smart’, and all operators will pursue at least a ‘smart network’ or ‘Happy Pipe’ strategy. This will enable operators to achieve three things:
To ensure that data is transported efficiently so that capital and operating costs are minimised and the Internet and other networks remain cheap methods of distribution.
To improve user experience by matching the performance of the network to the nature of the application or service being used – or indeed vice versa, adapting the application to the actual constraints of the network. ‘Best efforts’ is fine for asynchronous communication, such as email or text, but unacceptable for traditional voice telephony. A video call or streamed movie could exploit guaranteed bandwidth if possible / available, or else they could self-optimise to conditions of network congestion or poor coverage, if well-understood. Other services have different criteria – for example, real-time gaming demands ultra-low latency, while corporate applications may demand the most secure and reliable path through the network.
To charge appropriately for access to and/or use of the network. It is becoming increasingly clear that the Telco 1.0 business model – that of charging the end-user per minute or per Megabyte – is under pressure as new business models for the distribution of content and transportation of data are being developed. Operators will need to be capable of charging different players – end-users, service providers, third-parties (such as advertisers) – on a real-time basis for provision of broadband and maybe various types or tiers of quality of service (QoS). They may also need to offer SLAs (service level agreements), monitor and report actual “as-experienced” quality metrics or expose information about network congestion and availability.
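The matching of network behaviour to application class described above can be thought of as a policy table. The sketch below is purely illustrative – the application classes, latency and bandwidth thresholds are assumptions for the sake of example, not values from any operator's policy system:

```python
# Illustrative mapping of application class to network treatment.
from dataclasses import dataclass
from typing import Optional

@dataclass
class QosProfile:
    max_latency_ms: Optional[int]      # None = no latency guarantee
    min_bandwidth_kbps: Optional[int]  # None = no bandwidth guarantee
    delivery: str                      # treatment class

QOS_POLICY = {
    "email":      QosProfile(None, None, "best-effort"),       # asynchronous: best effort is fine
    "voice_call": QosProfile(150, 64, "guaranteed"),           # traditional telephony expectations
    "video_call": QosProfile(200, 1_500, "guaranteed-if-available"),
    "gaming":     QosProfile(50, 256, "low-latency"),          # real-time gaming: ultra-low latency
    "corporate":  QosProfile(None, None, "secure-path"),       # security/reliability over speed
}

def treatment(app_class: str) -> QosProfile:
    # Unknown applications fall back to best effort.
    return QOS_POLICY.get(app_class, QOS_POLICY["email"])

print(treatment("gaming").delivery)  # low-latency
```

In practice the lookup key would come from traffic classification (DPI or application signalling) rather than a literal string, and the profiles would feed the policy and charging functions discussed below.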
Under-the-floor players threaten control (and smartness)
Either through deliberate actions such as outsourcing, or through external agency (Government, greenfield competition etc), we see the network-part of the telco universe suffering from a creeping loss of control and ownership. There is a steady move towards outsourced networks, as they are shared, or built around the concept of open-access and wholesale. While this would be fine if the telcos themselves remained in control of this trend (we see significant opportunities in wholesale and infrastructure services), in many cases the opposite is occurring. Telcos are losing control, and in our view losing influence over their core asset – the network. They are worrying so much about competing with so-called OTT providers that they are missing the threat from below.
At the point at which many operators, at least in Europe and North America, are seeing the services opportunity ebb away, and ever-greater dependency on new models of data connectivity provision, they are potentially cutting off (or being cut off from) one of their real differentiators.
Given the uncertainties around both fixed and mobile broadband business models, it is sensible for operators to retain as many business model options as possible. Operators are battling with significant commercial and technical questions such as:
Can upstream monetisation really work?
Will regulators permit priority services under Net Neutrality regulations?
What forms of network policy and traffic management are practical, realistic and responsive?
Answers to these and other questions remain opaque. However, it is clear that many of the potential future business models will require networks to be physically or logically re-engineered, as well as flexible back-office functions, like billing and OSS, to be closely integrated with the network.
Outsourcing networks to third-party vendors, particularly when such a network is shared with other operators, is dangerous in these circumstances. Partners that today agree on the principles for network-sharing may have very different strategic views and goals in two years’ time, especially given the unknown use-cases for new technologies like LTE.
This report considers all these issues and gives guidance to operators who may not have considered all the various ways in which network control is being eroded, from Government-run networks through to outsourcing services from the larger equipment providers.
Figure 1 – Competition in the services layer means defending network capabilities is increasingly important for operators
Source: STL Partners
Industry structure is being reshaped
Over the last year, Telco 2.0 has updated its overall map of the telecom industry, to reflect ongoing dynamics seen in both fixed and mobile arenas. In our strategic research reports on Broadband Business Models, and the Roadmap for Telco 2.0 Operators, we have explored the emergence of various new “buckets” of opportunity, such as verticalised service offerings, two-sided opportunities and enhanced variants of traditional retail propositions.
In parallel to this, we’ve also looked again at some changes in the traditional wholesale and infrastructure layers of the telecoms industry. Historically, this has largely comprised basic capacity resale and some “behind the scenes” use of carrier’s-carrier services (roaming hubs, satellite / sub-oceanic transit etc.).
This is an extract from a report by Arete Research, a Telco 2.0™ partner specialising in investment analysis. The views in this article are not intended to constitute investment advice from Telco 2.0™ or STL Partners. We are reprinting Arete’s analysis to give our customers some additional insight into how some investors see the telecoms market.
This report can be downloaded in full in PDF format by members of the Telco 2.0 Executive Briefing service and Future Networks Stream using the links below.
Please use the links or email contact@telco2.net or call +44 (0) 207 247 5003 to find out more.
A New IPR Cold War Begins
Everyone in the technology industry loves “next gen” products: they solve all the problems of the previous iteration! In LTE: Late, Tempting, and Elusive in June ’09, we [Arete Research] forecast delays and said LTE would require intensive R&D and bring minimal near-term sales. Two years later, its impact is limited, mostly driven by market-specific reasons. Now we see operators adopting LTE by moving to single RAN (radio access network) platforms, giving them a choice of how to use spectrum, and sparking de facto concentration of vendor market shares.
The “single RAN” (including LTE) is another example of deflation in wireless infrastructure; peak shipments of HSPA may be five years off, but now come with LTE. Collapsing networks onto single platforms (so-called “network modernisation”) prepares operators to re-farm spectrum, even if short-term spend goes up. The vendor market is consolidating around Ericsson and Huawei (both financially stable), with ZTE and Samsung as new entrants, and ALU, NSN and NEC struggling to make profits (see Fig. 1) while “pioneering” new concepts. All vendors see LTE as their chance to gain share, a dangerous phase. LTE also threatens to add costs in ’12 as networks need optimisation. A recent LTE Asia conference reinforced our three previous meanings for this nascent technology:
Still Late. In ’09 we said “Late is Great,” with no business case for aggressive deployment. Most operators are in “commercial trials”, awaiting firmer spectrum allocations, if not also devices. LTE rollouts have been admirably measured in all but a few markets, and where accelerated, mostly done for market-specific reasons.
Less Tempting? Operators are re-setting pricing and ending unlimited plans. LTE’s better spectral efficiency requires much higher device penetration. Operators are gradually deploying LTE as part of an evolution to single RAN networks (allowing re-farming), but few talk of “enabling new business models” beyond 3G technology.
Elusive Economics. As a new air interface, LTE needs work in spectrum, standards and handsets. Device makers are cagey about ramping LTE volumes at mid-range price points. Vendors are still testing new concepts to lower costs in dense urban areas. Network economics (of any G) are driven by single RAN rollouts, often by low-cost vendors.
Transformation Hardly Happens. For all the US 4G hype, LTE is continuing a decade-old “revolution” in mobile data (DoCoMo launched 3G in ’01), boosted by smartphones since ’07. LTE or not, operators struggle to add value beyond connectivity. Investors should reward operators that reach the lowest long-term cash costs, even with upfront capex.
No Help to Vendor Margins. Despite 175 “commitments” to launch LTE, single RANs will be no bonanza, inviting fresh attempts to “buy” share in a market we see growing ~5-10% in ’12. Ericsson and Huawei are the only vendors now generating returns above their capital costs: LTE will not make this better, while vendors like NSN and ALU must fend off aggressive new entrants like ZTE pricing low to win swap deals.
Figure 1: Vendor “Pro-Forma” Margins ’07-’12E: Only Two Make Likely Cost of Capital
To read the Briefing in full, including in addition to the above analysis of:
Operators: Better Late than Early!
Something New Here?
Standards/Spectrum: Much to Do
Vendors: Challenges Aplenty
… Not Enough Profits for All
Devices: All to Come
Transformation… Not!
…and the following charts and tables…
Figure 1: Vendor “Pro-Forma” Margins ’07-’12E: Only Two Make Likely Cost of Capital
Figure 2: Verizon LTE Just in the Dots
Figure 3: Terminals Needed to Make LTE Work
Figure 4: “Scissor Effect” Facing Operators
Figure 5: Every Bit of the Air: Potential Spectrum to Be Used for LTE
Figure 6: Vendor Scale on ’11 Sales: Clear Gaps
Summary: Content Delivery Networks (CDNs) are becoming familiar in the fixed broadband world as a means to improve the experience and reduce the costs of delivering bulky data like online video to end-users. Is there now a compelling need for their mobile equivalents, and if so, should operators partner with existing players or build / buy their own? (August 2011, Executive Briefing Service, Future of the Networks Stream).
Introduction
As is widely documented, mobile networks are witnessing huge growth in the volumes of 3G/4G data traffic, primarily from laptops, smartphones and tablets. While Telco 2.0 is wary of some of the headline shock-statistics about forecast “exponential” growth, or “data tsunamis” driven by ravenous consumption of video applications, there is certainly a fast-growing appetite for use of mobile broadband.
That said, many of the actual problems of congestion today can be pinpointed either to a handful of busy cells at peak hour – or, often, the inability of the network to deal with the signalling load from chatty applications or “aggressive” devices, rather than the “tonnage” of traffic. Another large trend in mobile data is the use of transient, individual-centric flows from specific apps or communications tools such as social networking and messaging.
But “tonnage” is not completely irrelevant. Despite the diversity, there is still an inexorable rise in the use of mobile devices for “big chunks” of data, especially the special class of software commonly known as “content” – typically popular/curated standalone video clips or programmes, or streamed music. Images (especially those in web pages) and application files such as software updates fit into a similar group – sizeable lumps of data downloaded by many individuals across the operator’s network.
This one-to-many nature of most types of bulk content highlights inefficiencies in the way mobile networks operate. The same data chunks are downloaded time and again by users, typically going all the way from the public Internet, through the operator’s core network, eventually to the end user. Everyone loses in this scenario – the content publisher needs huge servers to dish up each download individually. The operator has to deal with transport and backhaul load from repeatedly sending the same content across its network (and IP transit from shipping it in from outside, especially over international links). Finally, the user has to deal with all the unpredictability and performance compromises involved in accessing the traffic across multiple intervening points – and ends up paying extra to support the operator’s heavier cost base.
In the fixed broadband world, many content companies have availed themselves of a group of specialist intermediaries called CDNs (content delivery networks). These firms on-board large volumes of the most important content served across the Internet, before dropping it “locally” as near to the end user as possible – if possible, served up from cached (pre-saved) copies. Often, the CDN operating companies have struck deals with the end-user facing ISPs, which have often been keen to host their servers in-house, as they have been able to reduce their IP interconnection costs and deliver better user experience to their customers.
In the mobile industry, the use of CDNs is much less mature. Until relatively recently, the overall volumes of data didn’t really move the needle from the point of view of content firms, while operators’ radio-centric cost bases were also relatively immune from those issues as well. Optimising the “middle mile” for mobile data transport efficiency seemed far less of a concern than getting networks built out and handsets and apps perfected, or setting up policy and charging systems to parcel up broadband into tiered plans. Arguably, better-flowing data paths and video streams would only load the radio more heavily, just at a time when operators were having to compress video to limit congestion.
This is now changing significantly. With the rise in smartphone usage – and the expectations around tablets – Internet-based CDNs are pushing much more heavily to have their servers placed inside mobile networks. This is leading to a certain amount of introspection among the operators – do they really want to have Internet companies’ infrastructure inside their own networks, or could this be seen more as a Trojan Horse of some sort, simply accelerating the shift of content sales and delivery towards OTT-style models? Might it not be easier for operators to build internal CDN-type functions instead?
Some of the earlier approaches to video traffic management – especially so-called “optimisation” without the content companies’ permission or involvement – are becoming trickier with new video formats and more scrutiny from a Net Neutrality standpoint. But CDNs by definition involve the publishers, so any necessary compression or other processing can potentially be done collaboratively, rather than “transparently” and without cooperation or willingness.
At the same time, many of the operators’ usual vendors are seeing this transition point as a chance to differentiate their new IP core network offerings, typically combining CDN capability into their routing/switching platforms, often alongside the optimisation functions as well. In common with other recent innovations from network equipment suppliers, there is a dangled promise of Telco 2.0-style revenues that could be derived from “upstream” players. In this case, there is a bit more easily-proved potential, since this would involve direct substitution of the existing revenues already derived from content companies, by the Internet CDN players such as Akamai and Limelight. This also holds the possibility of setting up a two-sided, content-charging business model that fits OK with rules on Net Neutrality – there are few complaints about existing CDNs except from ultra-purist Neutralists.
On the other hand, telco-owned CDNs have existed in the fixed broadband world for some time, with largely indifferent levels of success and adoption. There needs to be a very good reason for content companies to choose to deal with multiple national telcos, rather than simply take the easy route and choose a single global CDN provider.
So, the big question for telcos around CDNs at the moment is “should I build my own, or should I just permit Akamai and others to continue deploying servers into my network?” Linked to that question is what type of CDN operation an operator might choose to run in-house.
There are four main reasons why a mobile operator might want to build its own CDN:
To lower costs of network operation or upgrade, especially in radio network and backhaul, but also through the core and in IP transit.
To improve the user experience of video, web or applications, either in terms of data throughput or latency.
To derive incremental revenue from content or application providers.
For wider strategic or philosophical reasons about “keeping control over the content/apps value chain”
This Analyst Note explores these issues in more detail, first giving some relevant contextual information on how CDNs work, especially in mobile.
What is a CDN?
The traditional model for Internet-based content access is straightforward – the user’s browser requests a piece of data (image, video, file or whatever) from a server, which then sends it back across the network, via a series of “hops” between different network nodes. The content typically crosses the boundaries between multiple service providers’ domains, before finally arriving at the user’s access provider’s network, flowing down over the fixed or mobile “last mile” to their device. In a mobile network, that also typically involves transiting the operator’s core network first, which has a variety of infrastructure (network elements) to control and charge for it.
A Content Delivery Network (CDN) is a system for serving Internet content from servers which are located “closer” to the end user either physically, or in terms of the network topology (number of hops). This can result in faster response times, higher overall performance, and potentially lower costs to all concerned.
In most cases in the past, CDNs have been run by specialist third-party providers, such as Akamai and Limelight. This document also considers the role of telcos running their own “on-net” CDNs.
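The serving logic in the definition above (answer from a nearby cache when possible, fetch from the distant origin only on a miss) can be sketched in a few lines of Python. Everything here, including the EdgeCache class and the fetch_from_origin callback, is purely illustrative rather than any real CDN’s API:

```python
# Minimal sketch of the CDN idea: serve content from a cache "close" to
# the user, falling back to the distant origin server on a miss.
from collections import OrderedDict

class EdgeCache:
    """A tiny LRU cache standing in for a CDN edge server."""
    def __init__(self, capacity=100):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, url, fetch_from_origin):
        if url in self.store:
            self.hits += 1
            self.store.move_to_end(url)     # mark as recently used
            return self.store[url]
        self.misses += 1                    # long round-trip to the origin
        content = fetch_from_origin(url)
        self.store[url] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return content

edge = EdgeCache(capacity=2)
origin = lambda url: f"<content of {url}>"
edge.get("/video/1", origin)    # miss: fetched over the backbone
edge.get("/video/1", origin)    # hit: served locally
print(edge.hits, edge.misses)   # -> 1 1
```

Every request after the first is served without touching the backbone at all, which is the entire economic point of the architecture.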
CDNs can be thought of as analogous to the distribution of bulky physical goods – it would be inefficient for a manufacturer to ship all products to customers individually from a single huge central warehouse. Instead, it will set up regional logistics centres that can be more responsive – and, if appropriate, tailor the products or packaging to the needs of specific local markets.
As an example, there might be a million requests for a particular video stream from the BBC. Without using a CDN, the BBC would have to provide sufficient server capacity and bandwidth to handle them all. The company’s immediate downstream ISPs would have to carry this traffic to the Internet backbone, the backbone itself has to carry it, and finally the requesters’ ISPs’ access networks have to deliver it to the end-points. From a media-industry viewpoint, the source network (in this case the BBC) is generally called the “content network” or “hosting network”; the destination is termed an “eyeball network”.
In a CDN scenario, all the data for the video stream has to be transferred across the Internet just once for each participating network, when it is deployed to the downstream CDN servers and stored. After this point, it is only carried over the user-facing eyeball networks, not any others via the public Internet. This also means that the CDN servers may be located strategically within the eyeball networks, in order to use its resources more efficiently. For example, the eyeball network could place the CDN server on the downstream side of its most expensive link, so as to avoid carrying the video over it multiple times. In a mobile context, CDN servers could be used to avoid pushing large volumes of data through expensive core-network nodes repeatedly.
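A back-of-envelope calculation makes the saving concrete. The request count follows the BBC example above; the stream size and the number of participating eyeball networks are assumptions chosen purely for illustration:

```python
# Without a CDN, every request crosses the backbone; with one, the stream
# crosses it roughly once per participating eyeball network. Illustrative
# figures only.
requests = 1_000_000           # viewers of the video stream
stream_mb = 200                # size of one stream in MB (assumed)
eyeball_networks = 50          # networks hosting a CDN node (assumed)

backbone_without_cdn = requests * stream_mb       # MB over the backbone
backbone_with_cdn = eyeball_networks * stream_mb  # one copy per network

print(f"without CDN: {backbone_without_cdn / 1e6:.0f} TB over the backbone")
print(f"with CDN:    {backbone_with_cdn / 1e6:.3f} TB over the backbone")
```

Under these assumptions the backbone load falls by four orders of magnitude; the remaining traffic is carried inside the eyeball networks, where the operator can place the server on the cheap side of its most expensive link.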
When the video or other content is loaded into the CDN, other optimisations such as compression or transcoding into other formats can be applied if desired. There may also be various treatments relating to new forms of delivery such as HTTP streaming, where the video is broken up into “chunks” with several different sizes/resolutions. Collectively, these upfront processes are called “ingestion”.
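As a rough sketch of what ingestion produces for HTTP streaming, the following breaks a video into fixed-length chunks at several bitrates and emits a simple manifest. The chunk length, bitrates, file-naming scheme, and manifest shape are all assumptions for illustration, not the detail of any particular standard (HLS and DASH differ here):

```python
# Hedged sketch of an "ingestion" step: split a video into fixed-length
# chunks at several bitrates and return a simple manifest structure.
def ingest(duration_s, bitrates_kbps, chunk_s=10):
    manifest = []
    n_chunks = -(-duration_s // chunk_s)   # ceiling division
    for kbps in bitrates_kbps:
        rendition = {
            "bitrate_kbps": kbps,
            "chunks": [f"seg_{kbps}k_{i:04d}.ts" for i in range(n_chunks)],
        }
        manifest.append(rendition)
    return manifest

m = ingest(duration_s=95, bitrates_kbps=[400, 1200, 3000])
print(len(m), len(m[0]["chunks"]))   # 3 renditions, 10 chunks each
```

The player can then switch between renditions chunk-by-chunk as radio conditions change, which is what makes this format attractive for mobile delivery.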
Figure 1 – Content delivery with and without a CDN
Source: STL Partners / Telco 2.0
Value-added CDN services
It is important to recognise that the fixed-centric CDN business has increased massively in richness and competition over time. Although some of the players have very clever architectures and IPR in the forms of their algorithms and software techniques, the flexibility of modern IP networks has tended to erode away some of the early advantages and margins. Shipping large volumes of content is now starting to become secondary to the provision of associated value-added functions and capabilities around that data. Additional services include:
Analytics and reporting
Advert insertion
Content ingestion and management
Application acceleration
Website security management
Software delivery
Consulting and professional services
It is no coincidence that the market leader, Akamai, now refers to itself as “provider of cloud optimisation services” in its financial statements, rather than a CDN, with its business being driven by “trends in cloud computing, Internet security, mobile connectivity, and the proliferation of online video”. In particular, it has started refocusing away from dealing with “video tonnage”, and towards application acceleration – for example, speeding up the load times of e-commerce sites, which has a measurable impact on abandonment of purchasing visits. Akamai’s total revenues in 2010 were around $1bn, less than half of which came from “media and entertainment” – the traditional “content industries”. Its H1 2011 revenues were relatively disappointing, with growth coming from non-traditional markets such as enterprise and high-tech (eg software update delivery) rather than media.
This is a critically important consideration for operators that are looking to CDNs to provide them with sizeable uplifts in revenue from upstream customers. Telcos – especially in mobile – will need to invest in various additional capabilities as well as the “headline” video traffic management aspects of the system. They will need to optimise for network latency as well as throughput, for example – which will probably not have the cost-saving impacts expected from managing “data tonnage” more effectively.
Although in theory telcos’ other assets should help – for example mapping download analytics to more generalised customer data – this is likely to involve extra complexity with the IT side of the business. There will also be additional efforts around sales and marketing that go significantly beyond most mobile operators’ normal footprint into B2B business areas. There is also a risk that an analysis of bottlenecks for application delivery / acceleration ends up simply pointing the finger of blame at the network’s inadequacies in terms of coverage. Improving delivery speed, cost or latency is only valuable to an upstream customer if there is a reasonable likelihood of the end-user actually having connectivity in the first place.
Figure 2: Value-added CDN capabilities
Source: Alcatel-Lucent
Application acceleration
An increasingly important aspect of CDNs is their move beyond content/media distribution into a much wider area of “acceleration” and “cloud enablement”. As well as delivering large pieces of data efficiently (e.g. video), there is arguably more tangible value in delivering small pieces of data fast.
There are various manifestations of this, but a couple of good examples illustrate the general principles:
Many web transactions are abandoned because websites (or apps) seem “slow”. Few people would trust an airline’s e-commerce site, or a bank’s online interface, if they’ve had to wait impatiently for images and page elements to load, perhaps repeatedly hitting “refresh” on their browsers. Abandoned transactions can be directly linked to slow or unreliable response times – typically a function of congestion either at the server or various mid-way points in the connection. CDN-style hosting can accelerate the service measurably, leading to increased customer satisfaction and lower levels of abandonment.
Enterprise adoption of cloud computing is becoming exceptionally important, with both cost savings and performance enhancements promised by vendors. Sometimes, such platforms will involve hybrid clouds – a mixture of private (internal) and public (Internet) resources and connectivity. Where corporates are reliant on public Internet connectivity, they may well want to ensure as fast and reliable a service as possible, especially in terms of round-trip latency. Many IT applications are designed to be run on ultra-fast company private networks, with a lot of “hand-shaking” between the user’s PC and the server. This process is very latency-dependent, and as companies mobilise their applications, the additional overhead time in cellular networks may cause significant problems.
Hosting applications at CDN-type cloud acceleration providers achieves much the same effect as for video – they can bring the application “closer”, with fewer hops between the origin server and the consumer. Additionally, the CDN is well-placed to offer additional value-adds such as firewalling and protection against denial-of-service attacks.
To read the 25-page note in full, including the following additional content…
How do CDNs fit with mobile networks?
Internet CDNs vs. operator CDNs
Why use an operator CDN?
Should delivery mean delivery?
Lessons from fixed operator CDNs
Mobile video: CDNs, offload & optimisation
CDNs, optimisation, proxies and DPI
The role of OVPs
Implementation and planning issues
Conclusion & recommendations
… and the following additional charts…
Figure 3 – Potential locations for CDN caches and nodes
Figure 4 – Distributed on-net CDNs can offer significant data transport savings
Figure 5 – The role of OVPs for different types of CDN player
Figure 6 – Summary of Risk / Benefits of Centralised vs. Distributed and ‘Off Net’ vs. ‘On-Net’ CDN Strategies
……Members of the Telco 2.0 Executive Briefing Subscription Service and Future Networks Stream can download the full 25 page report in PDF format here. Non-Members, please see here for how to subscribe, here to buy a single user license for £595 (+VAT), or for multi-user licenses and any other enquiries please email contact@telco2.net or call +44 (0) 207 247 5003.
Summary: To some, LTE is the latest mobile wonder technology – bigger, faster, better. But how do institutional investors see it?
This is a Guest Briefing from Arete Research, a Telco 2.0™ partner specialising in investment analysis.
The views in this article are not intended to constitute investment advice from Telco 2.0™ or STL Partners. We are reprinting Arete’s Analysis to give our customers some additional insight into how some Investors see the Telecoms Market.
[Members of the Telco 2.0™ Executive Briefing Subscription Service and Future Networks Stream, please see here for the full Briefing report. Non-Members, please see here for how to subscribe or email contact@telco2.net or call +44 (0) 207 247 5003.]
Wireless Infrastructure
[Figure]
LTE is the new HSPA is the new WCDMA: another wave of new air interfaces, network architectures, and enabled services to add mobile data capacity. From 3G to 3.5G to 4G, vendors are pushing technology into a few “pioneer” operators, hoping to boost sales. Yet, like previous “G’s,” LTE will see minimal near-term sales, requires $1bn+ of R&D per vendor, and promises uncertain returns. The LTE hype is adding costs for vendors that saw margins fall for two years.
Despite large projects in China and India, we see wireless infrastructure sales down 5% in ’09, after 10% growth in ’08. As major 2G rollouts near an end, emerging markets 3G pricing should take to new lows. Some 75% of sales are with four vendors (Ericsson, NSN-Nortel, Huawei, and Alcatel-Lucent), but margins have been falling: we do not see consolidation (like the recent NSN-Nortel deal) structurally improving margins. LTE is another chapter in the story of a fundamentally deflationary market, with each successive generation having a shorter lifecycle and yielding lower sales. We expect a period of heightened (and politicised) competition for a few “strategic accounts,” and fresh attempts to “buy” share (as in NSN-Nortel, or by ZTE).
Late Is Great. We think LTE will roll out later, and in a more limited form than is even now being proposed (after delays at Verizon and others). There is little business case for aggressive deployment, even at CDMA operators whose roadmaps are reaching dead ends. HSPA+ further confuses the picture.
Temptations Galore. Like WCDMA, every vendor thinks it can take market share in LTE. And like WCDMA, we think share shifts will prove limited, and the ensuing fight for deals will leave few winners.
Elusive Economics. LTE demands $1bn in R&D spend over three to five years; with extensive testing and sharing of technical data among leading operators, there is little scope to cut corners (or costs). LTE rollouts will not improve poor industry margins, and at 2.6GHz, may force network sharing.
Reaching for the Grapes
Table 1 shows aggregate sales, EBITDA, and capex for the top global and emerging markets operators. It reflects a minefield of M&A, currencies, private equity deals, and changes in reporting structure. Getting more complete data is nearly impossible: GSA says there are 284 GSM/WCDMA operators, and CDG claims another 280 in CDMA. We have long found only limited correlation between aggregate capex numbers and OEM sales (which often lag shipments due to revenue recognition). Despite rising data traffic volumes and emerging markets capex, we think equipment vendor sales will fall 5%+ in US$. We think LTE adds risk by bringing forward R&D spend to lock down key customers, but committing OEMs to roll out immature technology with uncertain commercial demand.
Table 1: Sales and Capex Growth, ’05-’09E

                                     ’05    ’06    ’07    ’08    ’09E
Top 20 Global Operators
  Sales Growth                       13%    16%    15%    10%     5%
  EBITDA Growth                      13%    15%    14%    10%     8%
  Capex Growth                       10%    10%     5%     9%    -1%
Top 25 Emerging Market Operators
  Sales Growth                       35%    38%    29%    20%    11%
  EBITDA Growth                      33%    46%    30%    18%     8%
  Capex Growth                       38%    29%    38%    25%   -12%
Global Capex Total                   16%    18%    13%    14%    -5%

Source: Arete Research
LaTE for Operators
LTE was pushed by the GSM community in a global standards war against CDMA and WiMAX. Since LTE involves new core and radio networks, and raises the prospect of managing three networks (GSM, WCDMA/HSPA, and LTE), it is a major roadmap decision for conservative operators. Added to this are questions about spectrum, IPR, devices, and business cases. These many issues render moot near-term speculation about timing of LTE rollouts.
Verizon and DoCoMo aside, few operators profess an appetite for LTE’s new radio access products, air interfaces, or early-stage handsets and single-mode datacards. We expect plans for “commercial service” in ’10 will be “soft” launches. Reasons for launching early tend to be qualitative: gaining experience with new technology, or a perception of technical superiority. A look at leading operators shows only a few have clear LTE commitments.
Verizon already pushed back its Phase I (fixed access in 20-30 markets) to 2H10, with “rapid deployment” in ’11-’12 at 700MHz, 850MHz, and 1.9GHz bands, and national coverage by ’15, easily met by rolling out at 700MHz. Arguably, Verizon is driven more by concerns over the end of the CDMA roadmap, and management said it would “start out slow and see what we need to do.”
TeliaSonera targets a 2010 data-only launch in two cities (Stockholm and Oslo), a high-profile test between Huawei and Ericsson.
Vodafone’s MetroZone concept uses low-cost femto- or micro-cells for urban areas; it has no firm commitment on launching LTE.
3 is focussing on HSPA+, with HSPA datacards in the UK offering 15GB traffic for £15, on one-month rolling contracts.
TelefónicaO2 is awaiting spectrum auctions in key markets (Germany, UK) before deciding on LTE; it is sceptical about getting datacards for lower frequencies.
Orange says it is investing in backhaul while it “considers LTE network architectures.”
T-Mobile is the most aggressive, aiming for an ’11 LTE rollout to make up for its late start in 3G, and seeks to build an eco-system around VoLGA (Voice over LTE via Generic Access).
China Mobile is backing a China-specific version (TD-LTE), which limits the role for Western vendors until any harmonising of standards.
DoCoMo plans to launch LTE “sometime” in ’10, but was burnt before in launching WCDMA early. LTE business plans submitted to the Japanese regulator expect $11bn of spend in five years, some at unique frequency bands (e.g., 1.5GHz and 1.7GHz).
LTE’s “commercial availability” marks the start of addressing the issue of handling voice, either via fallback to circuit switched networks, or with VoIP over wireless. The lack of LTE voice means operators have to support three networks, or shut down GSM (better coverage than WCDMA) or WCDMA (better data rates than GSM). This is a major roadblock to mass market adoption: operators are unlikely to roll out LTE based on data-only business models. The other hope is that LTE sparks fresh investment in core networks: radio is just 35-40% of Vodafone’s capex and 30% of Orange’s. The rest goes to core, transmission, IT, and other platforms. Yet large OEMs may not benefit from backhaul spend, with cheap wireline bandwidth and acceptance of point-to-multipoint microwave.
HSPA+ is a viable medium-term alternative to LTE, offering similar technical performance and spectral efficiency. (LTE needs 20MHz, vs. 10MHz for HSPA+.) There have been four “commercial” HSPA+ launches at 21Mbps peak downlink speeds, and 20+ others are pending. Canadian CDMA operators Telus and Bell (like the Koreans) adopted HSPA only recently. HSPA+ is favoured by existing vendors: it lacks enough new hardware to be an entry point for the industry’s second-tier (Motorola, NEC, and to a lesser extent Alcatel-Lucent), but HSPA+ will also require new devices. There are also further proposed extensions of GSM, quadrupling capacity (VAMOS, introducing MIMO antennas, and MUROS for multiplexing re-use); these too need new handsets.
Vendors say successive 3G and 4G variants require “just a software upgrade.” This is largely a myth. With both HSPA+ or LTE, the use of 64QAM brings significant throughput degradation with distance, sharply reducing the cell area that can get 21Mbps service to 15%. MIMO antennas and/or multi-carrier solutions with additional power amplifiers are needed to correct this. While products shipping from ’07 onwards can theoretically be upgraded to 21Mbps downlink, both capacity (i.e., extra carriers) and output power (to 60W+) requirements demand extra hardware (and new handsets). Vendors are only now starting to ship newer multi-mode (GSM, WCDMA, and LTE) platforms (e.g., Ericsson’s RBS6000 or Huawei’s Uni-BTS). Reducing the number of sites to run 2G, 3G, and 4G will dampen overall equipment sales.
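The 15% figure above is consistent with simple cell geometry: if only 15% of a circular cell’s area can sustain the 21Mbps (64QAM) rate, the radius of that inner zone is the square root of 0.15, roughly 39% of the cell radius. This is a geometric check on the claim, not a radio propagation model:

```python
# If 15% of the cell *area* supports the peak 64QAM rate, the inner zone's
# radius is sqrt(0.15) of the cell radius, assuming a circular cell.
import math

area_fraction = 0.15
radius_fraction = math.sqrt(area_fraction)
print(f"64QAM zone reaches ~{radius_fraction:.0%} of the cell radius")
```

In other words, the headline 21Mbps service stops well under halfway out from the mast, which is why MIMO and extra power amplifiers are needed to push it further.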
Tempting for Vendors
There are three reasons LTE holds such irresistible charm for vendors. First, OEMs want to shift otherwise largely stagnant market shares. Second, vendor marketing does not allow for “fast followers” on technology roadmaps. Leading vendors readily admit claims of 100-150Mbps throughput are “theoretical” but cannot resist the tendency to technical one-upmanship. Third, we think there will be fewer LTE networks built than in WCDMA, especially at 2.6GHz, as network-sharing concepts take root and operators are capital-constrained. Can the US afford to build 4+ nationwide LTE networks? This scarcity makes it even more crucial for vendors to win deals.
Every vendor expected to gain share in WCDMA. NSN briefly did, but Huawei is surging ahead, while ALU struggled to digest Nortel’s WCDMA unit and Motorola lost ground. Figure 1 shows leading radio vendors’ market share. In ’07, Ericsson and Huawei gained share. In ’08, we again saw Huawei gain, as did ALU (+1ppt), whereas Ericsson was stable and Motorola and NSN lost ground.
Source: Arete Research; others incl. ZTE, Fujitsu, LG, Samsung, and direct sub-systems vendor sales (Powerwave, CommScope, Kathrein, etc.); excludes data and transmission sales from Cisco, Juniper, Harris, Tellabs, and others.
While the industry evolved into an oligopoly structure where four vendors control 75% of sales, this has not eased pricing pressure or boosted margins. Ericsson remains the industry no. 1, but its margins are half ’07 levels; meanwhile, NSN is losing money and seeking further scale buying Nortel’s CDMA and LTE assets. Huawei’s long-standing aggressiveness is being matched by ZTE (now with 1,000 staff in EU), and both hired senior former EU execs from vendors such as Nortel and Motorola. Alcatel-Lucent and Motorola are battling to sustain critical mass, with a mix of technologies for each, within ~$5bn revenue business units.
We had forecast Nortel’s 5% share would dwindle to 3% in ’09 (despite part purchase by NSN) and Motorola seems unlikely to get LTE wins it badly needs, after abandoning direct 3G sales. ALU won a slice of Verizon’s LTE rollout (though it may be struggling with its EPC core product), and hopes for a role in China Mobile’s TD-LTE rollouts, but lacks WCDMA accounts to migrate. Huawei’s market share gains came from radio access more than core networks, but we hear it recently won Telefónica for LTE. NSN was late on its HSPA roadmap (to 7.2Mbps and 14.4Mbps), and lacks traction in packet core. It won new customers in Canada and seeks a role in AT&T’s LTE rollout, but is likely to lose share in ’09. Buying Nortel is a final (further) bid for scale, but invites risks around retaining customers and integrating LTE product lines. Finally, Ericsson’s no. 1 market share looks stable, but it has been forced to respond to fresh lows in pricing from its Asian rivals, now equally adept at producing leading-edge technology, even if their delivery capability is still lagging.
Elusive Economics
The same issues that plagued WCDMA also make LTE elusive: coverage, network performance, terminals, and volume production of standard equipment. Operators have given vendors a list of issues to resolve in networks (esp. around EPC) and terminals. Verizon has its own technical specs relating to transmit output power and receive sensitivity, and requires tri-band support. We think commercialising LTE will require vendors to commit $1bn+ in R&D over three to five years, based on teams of 2-3,000 engineers. LTE comes at a time when every major OEM is seeking €1bn cost savings via restructuring, but must match plunging price levels.
To read the rest of the article, including:
Coping with Traffic
Is There a Role for WiMax?
Will Anyone Get the Grapes?
…Members of the Telco 2.0™ Executive Briefing Service and Future Networks Stream can read on here. Non-Members please see here to subscribe.
NB A full PDF copy of this briefing can be downloaded here.
This special Executive Briefing report summarises the brainstorming output from the Content Distribution 2.0 (Broadband Video) section of the 6th Telco 2.0 Executive Brainstorm, held on 6-7 May in Nice, France, with over 200 senior participants from across the Telecoms, Media and Technology sectors. See: www.telco2.net/event/may2009.
It forms part of our effort to stimulate a structured, ongoing debate within the context of our ‘Telco 2.0′ business model framework (see www.telco2research.com).
Each section of the Executive Brainstorm involved short stimulus presentations from leading figures in the industry, group brainstorming using our ‘Mindshare’ interactive technology and method, a panel discussion, and a vote on the best industry strategy for moving forward.
There are 5 other reports in this post-event series, covering the other sections of the event: Retail Services 2.0, Enterprise Services 2.0, Piloting 2.0, Technical Architecture 2.0, and APIs 2.0. In addition there will be an overall ‘Executive Summary’ report highlighting the overall messages from the event.
Each report contains:
Our independent summary of some of the key points from the stimulus presentations
An analysis of the brainstorming output, including a large selection of verbatim comments
The ‘next steps’ vote by the participants
Our conclusions of the key lessons learnt and our suggestions for industry next steps.
The brainstorm method generated many questions in real-time. Some were covered at the event itself and others we have responded to in each report. In addition we have asked the presenters and other experts to respond to some more specific points.
Background to this report
The demand for internet video is exploding. This is putting significant stress on the current fixed and mobile distribution business model. Infrastructure investments and operating costs required to meet demand are growing faster than revenues. The strategic choices facing operators are to charge consumers more when they expect to pay less, to risk upsetting content providers and users by throttling bandwidth, or to unlock new revenues to support investment and cover operating costs by creating new valuable digital distribution services for the video content industry.
Brainstorm Topics
A summary of the new Telco 2.0 Online Video Market Study: Options and Opportunities for Distributors in a time of massive disruption.
What are the most valuable new digital distribution services that telcos could create?
What is the business model for these services – who are the potential buyers and what are the priority opportunity areas?
What progress has been made in new business models for video distribution – including FTTH deployment, content-delivery networking, and P2P?
Preliminary results of the UK cross-carrier trial of sender-pays data
How the TM Forum’s IPSphere programme can support video distribution
Stimulus Presenters and Panellists
Richard D. Titus, Controller, Future Media, BBC
Trudy Norris-Grey, MD Transformation and Strategy, BT Wholesale
Scott Shoaf, Director, Strategy and Planning, Juniper Networks
Ibrahim Gedeon, CTO, Telus
Andrew Bud, Chairman, Mobile Entertainment Forum
Alan Patrick, Associate, Telco 2.0 Initiative
Facilitator
Simon Torrance, CEO, Telco 2.0 Initiative
Analysts
Chris Barraclough, Managing Director, Telco 2.0 Initiative
Dean Bubley, Senior Associate, Telco 2.0 Initiative
Alex Harrowell, Analyst, Telco 2.0 Initiative
Stimulus Presentation Summaries
Content Distribution 2.0
Scott Shoaf, Director, Strategy and Planning, Juniper Networks opened the session with a comparison of the telecoms industry’s response to massive volumes of video and that of the US cable operators. He pointed out that the cable companies’ raison d’être was to deliver vast amounts of video; therefore their experience should be worth something.
The first question, however, was to define the problem. Was the problem the customer, in which case the answer would be to meter, throttle, and cap bandwidth usage? If we decided this was the solution, though, the industry would be in the position of selling broadband connections and then trying to discourage its customers from using them!
Or was the problem not one of cost, but one of revenue? Networks cost money; the cloud is not actually a cloud, but is made up of cables, trenches, data centres and machines. Surely there wouldn’t be a problem if revenues rose with higher usage? In that case, we ought to be looking at usage-based pricing, but also at alternative business models – like advertising and the two-sided business model.
Or is it an engineering problem? It’s not theoretically impossible to put in bigger pipes until all the HD video from everyone can reach everyone else without contention – but in practice there is always some degree of oversubscription. What if we focused on specific sources of content? Define a standard of user experience, train the users to that, and work backwards?
If it is an engineering problem, the first step is to reduce the problem set. The long tail obviously isn’t the problem; it’s too long, as has been pointed out, and doesn’t account for very much traffic. It’s the ‘big head’ or ‘short tail’ stuff that is the heart of the problem: we need to deal with this short tail of big traffic generators. We need a CDN or something similar to deliver for this.
On cable, the customers are paying for premium content – essentially movies and TV – and the content providers are paying for distribution. We need to escape from the strict distinctions between Internet, IPTV, and broadcast. After all, despite the alarming figures for people leaving cable, many of them are leaving existing cable connections to take a higher grade of service. Consider Comcast’s Fancast – focused on users, not lines, with an integrated social-recommendation system, it integrates traditional cable with subscription video. Remember that broadcast is a really great way to deliver!
Advertising is another angle: at the moment, content owners are getting 90% of the ad money.
Getting away from this requires us to standardise the technology and the operational and commercial practices involved. The cable industry is facing this with the SCTE130 and Advanced Advertising 1.0 standards, which provide for fine-grained ad insertion and reporting. We need to blur the definition of TV advertising – the market is much bigger if you include Internet and TV ads together. Further, 20,000 subscribers to IPTV aren’t interesting to anyone – we need to attack this across the industry and learn how to treat the customer as an asset.
The Future of Online Video, 6 months on
Alan Patrick, Associate, Telco 2.0 updated the conference on how things had changed since he introduced the ”Pirate World” concept from our Online Video Distribution strategy report at the last Telco 2.0 event. The Pirate World scenario, he said, had set in much faster and more intensely than we had expected, and was working in synergy with the economic crisis.
Richard Titus, Controller, Future Media, BBC: ”I have no problem with carriers making money, in fact, I pay over the odds for a 50Mbits link, but the real difference is between a model that creates opportunities for the public and one which constrains them.”
Ad revenues were falling; video traffic still soaring; rights-holders’ reaction had been even more aggressive than we had expected, but there was little evidence that it was doing any good. Entire categories of content were in crisis.
On the other hand, the first stirrings of the eventual “New Players Emerge” scenario were also observable; note the success of Apple in creating a complete, integrated content distribution and application development ecosystem around its mobile devices.
The importance of CPE is only increasing; especially with the proliferation of devices capable of media playback (or recording) and interacting with Internet resources. There’s a need for a secure gateway to help manage all the gadgets and deliver content efficiently. Similarly, CDNs are only becoming more central – there is no shortage of bandwidth, but only various bottlenecks. It’s possible that this layer of the industry may become a copyright policing point.
We think new forms of CPE and CDNs are happening now; efforts to police copyright in the network are in the near future; VAS platforms are the next wave after that, and then customer data will become a major line of business.
Most of all, time is flying by, and the overleveraged, or undercapitalised, are being eaten first.
The Content Delivery Framework
Ibrahim Gedeon, CTO, Telus introduced some lessons from Telus’s experience deploying both on-demand bandwidth and developer APIs. Telcos aren’t good at content, he said; instead, we need to be the smartest pipe and make use of our trusted relationship with customers, built up over the last 150 years.
We’re working in an environment where cash is scarce and expensive, and pricing is a zero- or even negative-sum game; impossible to raise prices, and hard to cut without furthering the price war. So what should we be doing? A few years ago the buzzword was SDP; now it’s CDN. We’d better learn what those actually mean!
Trudy Norris-Gray, Managing Director, BT Wholesale: “There is no capacity problem in the core, but there is to the consumer – and three bad experiences means the end of an application or service for that individual user.”
Anyway, we’re both a mobile and fixed operator and ISP, and we’ve got an IPTV network. We’ve learned the hard way that technology isn’t our place in the value chain. When we got the first IPTV system from Microsoft, it used 2,500 servers and far, far too much power. So we’re moving to a CDF (Content Delivery Framework) – which looks a lot like an SDP. Have the vendors just changed the labels on these charts?
So why do we want this? So we can charge for bandwidth, of course; if it was free, we wouldn’t care! But we’re making around $10bn in revenues and spending 20% of that in CAPEX. We need a business case for this continued investment.
We need the CDF to help us to dynamically manage the delivery and charging process for content. There was lots of goodness in IMS, the buzzword of five years ago, and in SDPs. But in the end it’s the APIs that matter. And we like standards because we’re not very big. So, we want to use TM Forum’s IPSphere to extend the CDF and SDF; after all, in roaming we apply different rate cards dynamically and settle transactions, so why not here too, for video or data? I’d happily pay five bucks for good 3G video interconnection.
And we need to do this for developer platforms too, which is why we’re supporting the OneAPI reference architecture. To sum up, let’s not forget subscriber identity, online charging – we’ve got to make money – the need for policy management because not all users are equal, and QoS for a differentiated user experience.
Sender-Pays Data in Practice
Andrew Bud, Chairman, MEF gave an update on the trial of sender-pays data he announced at the last event. This is no longer theoretical, he said; it’s functioning, just with a restricted feature set. The retail-only Internet has just about worked so far, because people pay for access through their subscription and the services themselves are free. Video breaks this, he said; it will be impossible to be comprehensive, meaningful, and sustainable.
You can’t, he said, put a meaningful customer warning that covers all the possible prices you might encounter due to carrier policy with your content; and everyone is scared of huge bills after the WAP experience. Further, look at the history of post offices, telegraphy and telephony – it’s been sender-pays since the 1850s. Similarly, Amazon.com is sender-pays, as is Akamai.
Hence we need sending-party-pays data – that way, we can have truly free ads: not a model where the poor end user ends up paying the delivery cost!
Our trial: we have relationships with carriers making up 85% of the UK market. We have contracts, priced per-MB of data, with them. And we have four customers – Jamster, who brought you the Crazy Frog, Shorts, THMBNLS, who produce mobisodes promoting public health, and Creative North – mobile games as a gift from the government. Of course, without sender-pays this is impossible.
We’ve discovered that the carriers have no idea how much data costs; wholesale pricing has some very interesting consequences. Notably the prices are being set too high. Real costs and real prices mean that quality of experience is a real issue; it’s a very complicated system to get right. The positive sign, and ringing endorsement for the trial, is that some carriers are including sender-pays revenue in their budgets now!
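The commercial mechanics Bud describes – per-MB contracts under which the content sender, rather than the end user, is billed for delivery – can be sketched in a few lines. The carrier names and pence-per-MB rates below are illustrative assumptions, not figures from the trial:

```python
# Hypothetical sketch of sender-pays billing: the content provider,
# not the end user, is charged per megabyte for each delivery.
# All carrier names and per-MB rates here are illustrative assumptions.

PER_MB_RATE = {          # pence per MB, negotiated per carrier
    "CarrierA": 1.2,
    "CarrierB": 0.9,
}

def sender_charge(carrier: str, megabytes: float) -> float:
    """Charge (in pence) billed to the sender for one delivery."""
    return PER_MB_RATE[carrier] * megabytes

# A 4 MB mobisode delivered over CarrierA is paid for by the sender:
print(f"{sender_charge('CarrierA', 4.0):.1f}p")   # 4.8p
```

The point of the trial is that this charge never appears on the subscriber’s bill, which is why the wholesale price level (as noted above, currently set too high) determines whether the content business model works at all.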
Participant Feedback
Introduction
The business of video is a prime battleground for Telco 2.0 strategies. It represents the heaviest data flows, the cornerstone of triple/quad-play bundling, powerful entrenched interests from broadcasters and content owners, and a plethora of regulators and industry bodies. For many people, it lies at the heart of home-based service provision and entertainment, as well as encroaching on the mobile space. The growth of P2P and other illegal or semi-legal download mechanisms puts pressure on network capacity – and invites controversial measures around protecting content rights and Net Neutrality.
In theory, operators ought to be able to monetise video traffic, even if they don’t own or aggregate content themselves. There should be options for advertising, prioritised traffic or blended services – but these are all highly dependent on not just capable infrastructure, but realistic business models. Operators also need to find a way to counter the ‘Network Neutrality’ lobbyists who are confounding the real issue (access to the internet for all service providers on a ‘best efforts’ basis) with spurious arguments that operators should not be able to offer premium services, such as QoS and identity, to customers that want to pay for them. Telco 2.0 would argue that the right to offer (and the right to buy) a better service is a cornerstone of capitalism and something that is available in every other industry. Telecoms should be no different. Of course, it remains up to the operators to develop services that customers are willing to pay more for…
A common theme in the discussion was “tempus fugit” – time flies. The pace of evolution has been staggering, especially in Internet video distribution – IPTV, YouTube, iPlayer, Hulu, Qik, P2P, mashups and so forth. Telcos do not have the luxury of time for extended pilot projects or grandiose collaborations that take years to come to fruition.
With this timing issue in mind, the feedback from the audience was collected in three categories, although here the output has been aggregated thematically, as follows:
STOP – What should we stop doing?
START – What should we start doing?
DO MORE – What things should we do more of?
Feedback: STOP the current business model
There was broad agreement that the current model is unsustainable, especially given the demands that “heavy” content like video traffic places on the network…
· [Stop] giving customers bandwidth for free [#5]
· Stop complex pricing models for end-user [#9]
· Stop investing so much in sustaining old order [#18]
· Stop charging mobile subscribers on a per megabyte basis. [#37]
· Current peering agreements / IP neutrality are not sustainable. [#41]
· [Stop] assuming things are free. [#48]
· [Stop] lowering prices for unlimited data. [#61]
· Have to develop more models for upstream charging for data rather than just flat rate to subscribers. [#11]
· Build rational pricing segmentation for data to monetize both sides of the value chain with focus on premium value items. [#32]
Feedback: Transparency and pricing
… with many people suggesting that Telcos first need to educate users and service providers about the “true cost” of transporting data… although whether they actually know the answer themselves is another question, as it is as much an issue of accounting practices as of network architecture.
· Make the service providers aware of the cost they generate to carriers. [#31]
· Make pricing transparency for consumers a must. [#10]
· Mobile operators start being honest with themselves about the true cost of data before they invest in LTE. [#7]
· When resources are limited, then rationing is necessary. Net Neutrality will not work. Today people pay for water in regions where it is limited in supply. Its use is abused when there are no limits. [#17]
· Start being transparent in data charges, it will all stay or fall with cost transparency. [#12]
· You can help people understand usage charges, with meters or regular updates, requires education for a behavioural change, easier for fixed than mobile. [#14]
· Service providers need to have a more honest dialogue with subscribers and give them confidence to use services [#57]
· As an industry we must invest more in educating the market about network economics, end-users as well as service providers. [#58]
· Start charging subscribers a flat-rate data fee rather than per megabyte. [#46]
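The usage meters and regular updates suggested in #14 amount to simple threshold alerts against a monthly allowance. A minimal sketch, assuming illustrative thresholds and message wording:

```python
# Hypothetical usage-meter sketch (per feedback #14): notify the
# subscriber at set fractions of their monthly allowance so charges
# are transparent. Thresholds and wording are illustrative assumptions.

def usage_alerts(used_mb: float, allowance_mb: float,
                 thresholds=(0.5, 0.8, 1.0)) -> list[str]:
    """Return the alert messages the current usage level has triggered."""
    fraction = used_mb / allowance_mb
    return [f"You have used {int(t * 100)}% of your {allowance_mb:.0f} MB allowance"
            for t in thresholds if fraction >= t]

# A user at 850 MB of a 1000 MB allowance has passed two thresholds:
print(usage_alerts(850, 1000))
```

As #14 notes, this is easier on fixed than on mobile, where metering at the network edge is less reliable; but the behavioural-change argument is the same either way.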
Feedback: Sender-pays data
Andrew Bud’s concept of “sender pays data”, in which a content provider bundles in the notional cost of data transport into the download price for the consumer, generated both enthusiasm and concerns (although very little outright disagreement). Telco 2.0 agrees with the fundamental ‘elegance’ of the notion, but thinks that there are significant practical, regulatory and technical issues that need to be resolved. In particular, the delivery of “monolithic” chunks of content like movies may be limited, especially in mobile networks where data traffic is dominated by PCs with mobile broadband, usually conducting a wide variety of two-way applications like social networking.
Positive
· Sender pays is the only sane model. [#6]
· Do sender pays on both ‘sides’ consumer as well…gives ‘control’ and clarity to user. [#54]
· Sender Pays is one specific example of a much larger category of 3rd-party pays data, which also includes venue owners (e.g. hotels or restaurants), advertisers/sponsors (‘thanks for flying Virgin, we’re giving you 10MB free as a thank-you’), software developers, government (e.g. ‘benefit’ data for the unemployed etc) etc. The opportunity for Telcos may be much larger from upstream players outside the content industry [#73]
· We already do sender pays on our mobile portal – on behalf of all partner content providers including Napster mobile. [#77]
· Change the current peering model into an end to end sender pay model where all carriers in the chain receive the appropriate allocation of the sender pay revenue in order to guarantee the QoS for the end user. [#63]
· Focus on the money flows e.g. confirm the sender pays model. [#19]
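Comment #63’s end-to-end settlement idea implies a rule for dividing the sender’s payment among every carrier in the delivery chain. One simple candidate rule – pro-rata to each carrier’s declared per-MB cost – is sketched below; the chain, the costs, and the allocation rule itself are assumptions, not an agreed industry mechanism:

```python
# Illustrative sketch of end-to-end sender-pays settlement (feedback #63):
# split the sender's payment across every carrier in the delivery chain,
# pro-rata to each carrier's declared per-MB cost. The chain, costs and
# allocation rule are assumptions, not an agreed industry mechanism.

def settle(payment: float, chain: dict[str, float]) -> dict[str, float]:
    """Allocate `payment` across carriers in proportion to their costs."""
    total_cost = sum(chain.values())
    return {carrier: payment * cost / total_cost
            for carrier, cost in chain.items()}

# Origin CDN -> transit -> access network; per-MB costs in pence.
# The access network carries the most cost, so it takes the largest share:
shares = settle(10.0, {"OriginCDN": 0.2, "Transit": 0.3, "AccessNet": 1.5})
print(shares)
```

Note that any such scheme depends on carriers declaring credible costs – which, per the trial feedback above, they currently struggle to do.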
Qualified Support/Implementation concerns
· Business models on sender pays, but including the fact, that roaming is needed, data costs will be quite different across mobile carriers and the aggregators costs and agreements are based on the current carriers. These things need to be solved first [#26]
· Sender pays is good but needs the option of ‘only deliver via WiFi or femtocell when the user gets home’ at 1/100th the cost of ‘deliver immediately via 3G macro network’. [#15]
· Who pays for AJAX browsers proactively downloading stuff in the background without explicit user request? [#64]
· Be realistic about sender pays data. It will not take off if it is not standard across the market, and the data prices currently break the content business model – you have to compare to the next alternative. A video on iTunes costs 1.89 GBP including data… Operators should either take a long term view or forget about it. [#20]
· Sender-pays data can be used to do anything the eco-system needs, including quality/HD. It doesn’t yet today only because the carriers don’t know how to provide those. [#44]
· Sender pays works for big monolithic chunks like songs or videos. But doesn’t work for mash up or communications content/data like Facebook (my Facebook page has 30 components from different providers – are you going to bill all of them separately?) [#53]
· mBlox: more or less like a free-call number. doesn’t guarantee quality/HD [#8]
Sceptical
· Stop sender pays because user is inundated with spam. [#23]
o Re 23: At least the sender is charged for the delivery. I do not want to pay for your SPAM! [#30]
Feedback: QoS
A fair amount of the discussion revolved around the thorny issues of capacity, congestion, prioritisation and QoS, although some participants felt this distracted a little from the “bigger picture” of integrated business models.
· Part of bandwidth is dedicated to high quality contents (paid for). Rest is shared/best effort. [#27]
· Start annotating the network, by installing the equivalent of gas meters at all points across the network, in order that they truly understand the nature of traffic passing over the network – to implement QoS. [#56]
o Re: 56 – that’s fine in the fixed world or mobile core, but it doesn’t work in the radio network. Managing QoS in mobile is difficult when you have annoying things like concrete walls and metallised reflective windows in the way [#75]
· [Stop] being telecom focused and move more towards solutions. It is more than bandwidth. [#25]
· Stop pretending that mobile QoS is important, as coverage is still the gating factor for user experience. There’s no point offering 99.9% reliability when you only have 70% coverage, especially indoors [#29]
· Start preparing for a world of fewer, but converged fixed-mobile networks that are shared between operators. In this world there will need to be dynamic model of allocating and charging for network capacity. [#67]
· We need applications that are more aware of network capacity, congestion, cost and quality – and which alter their behaviour to optimise for the conditions at any point in time, e.g. with different codecs, frame rates or image sizes. The intelligence to do this is in the device, not the network. [#68]
o Re: 68, is it really in the CPE? If the buffering of the content is close at the terminal, perhaps, otherwise there is no jitter guarantee. [#78]
§ Re 78 – depends on the situation, and download vs. streaming etc. Forget the word ‘terminal’, it’s 1980s speak, if you have a sufficiently smart endpoint you can manage this – hence PCs being fine for buffering YouTube or i-Player etc, and some of the video players auto-sensing network conditions [#81]
· QoE – for residential cannot fully support devices which are not managed for streamed content. [#71]
· Presumably CDNs and caching have a bit of a problem with customised content, e.g. with inserted/overlaid personalised adverts in a video stream? [#76]
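The device-side adaptation proposed in #68 – altering codec, frame rate or image size to match measured network conditions – is essentially what adaptive video players do when auto-sensing network conditions (as #81 observes). A minimal client-side sketch, where the bitrate ladder and the 0.8 headroom factor are illustrative assumptions rather than any specific player’s algorithm:

```python
# Sketch of a client-side adaptive player (feedback #68): measure the
# achievable throughput and pick the highest rendition that fits, with
# some headroom for congestion. The bitrate ladder and 0.8 headroom
# factor are illustrative assumptions.

LADDER_KBPS = [2500, 1200, 600, 300]   # available renditions, high to low

def pick_bitrate(measured_kbps: float, headroom: float = 0.8) -> int:
    """Choose the highest rendition within measured throughput * headroom."""
    budget = measured_kbps * headroom
    for rate in LADDER_KBPS:
        if rate <= budget:
            return rate
    return LADDER_KBPS[-1]   # below the ladder: take the floor and buffer

print(pick_bitrate(1600))   # 1200 (1600 * 0.8 = 1280, so 2500 won't fit)
print(pick_bitrate(200))    # 300  (forced to the lowest rendition)
```

This supports #81’s point: a sufficiently smart endpoint can manage quality itself, which shifts the QoS question from the radio network to the device.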
Feedback: platforms, APIs, and infrastructure
However, the network and device architecture is only part of the issue. It is clear that video distribution fits centrally within the wider platform problems of APIs and OSS/BSS architecture, which span the overall Telco 2.0 reach of a given operator.
· Too much focus on investment in the network, where is the innovation in enterprise software innovation to support the network? [#70]
· Operators should open up access to their business assets in a consistent manner to innovative intermediaries who can harmonise APIs across a national or global marketplace. [#13]
· The BSS back office; billing, etc will not support robust interactive media for the most part. [#22]
· Let content providers come directly to Telcos to avoid a middle layer (aggregators) taking the profit. This requires collaboration and standardisation among Telcos on the technical interfaces and payment models. [#28]
· More analysis on length of time and cost of managing billing vendor for support of 2-sided business model. Prohibitively expensive in back office to take risks. Why? [#65]
· It doesn’t matter how strong the network is if you can’t monetize it on the back end OSS/BSS. [#40]
Feedback: Business models for video
Irrespective of the technical issues, or specific point commercial innovations like sender pays, there are also assorted problems in managing ecosystem dynamics, or more generalised business models for online video or IPTV. A significant part of the session’s feedback explored the concerns and possible solutions – with the “elephant in the room” of Net Neutrality lurking on the sidelines.
· Open up to lower cost lower risk trials to see what does and doesn’t work. [#35]
· Real multi quality services in order to monetize high quality services. [#36]
· Transform net neutrality issues into a fair policy approach… meaning that you cannot have equal treatment when some parties abuse the openness. [#39]
o Re 39: I want QoE for content I want to see. Part of this is from speed of access. Net Neutrality comes from the Best Effort and let is fight out in the scarce network. I.e. I do not get the QoE for all the other rubbish in the network. [#69]
· Why not bundling VAS with content transportation to ease migration from a free world to a pay for value world? [#43]
· Do more collaborative models which incorporate the entire value chain. [#55]
· Service providers start partnering to resell long tail content from platform providers with big catalogues. [#59]
· [Start to] combine down- and up-stream models in content; especially, start getting paid to deliver long-tail content. [#60]
· Start thinking longer term instead of short term profit, to create a new ecosystem that is bigger and healthier. [#62]
· Exploit better the business models between content providers and carriers. [#16]
· Adapt price to quality of service. [#21]
· Put more attention on quality of end user experience. [#24]
· I am prepared to pay a higher retail DSL subscription if I get a higher quality of experience. – not just monthly download limits. [#38]
· Maximize revenues based on typical Telco capabilities (billing, delivery, assurance on millions of customers). [#50]
· Need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lies in their user data, not their pipes! [#52]
· On our fixed network, DSL resellers pay for the access and for the bandwidth used – this corresponds to the sender-pays model; due to rising bandwidth demand, the charge for the resellers continuously increases, so we have to adapt bandwidth tariffs every year in order not to suffocate our DSL resellers. Among them are also companies offering TV streaming. [#82]
· More settlement free peering with content/app suppliers – make the origination point blazingly fast and close to zero cost. rather focus on charging for content distribution towards the edge of the access network (smart caching, torrent seeds, multicast nodes etc) [#74]
Feedback: Others
In addition to these central themes, the session’s participants also offered a variety of other comments concerning regulatory issues, industry collaboration, consumer issues and other non-video services like SMS.
· Start addressing customer data privacy issues now, before it’s too late and there is a backlash from subscribers and the media. [#42]
· Consolidating forums and industry bodies so we end up with one practical solution. [#45]
· Identifying what an operator has potential to be of use for to content SP other than a pipe. [#49]
· Getting regulators to stimulate competition by enforcing structural separation – unbundle at layer 1, bring in agile players with low operating costs. Let customers vote with their money – focus on delivering the fastest basic IP pipe at a reasonable price. If the basic price point is reasonable, customers will be glad to pay for extra services – either sender- or receiver-based. [#72]
· IPTV is not the same as Internet TV: with IPTV the Telco chooses my content; with Internet TV, I choose. [#79]
· Put attention on creating industry collaboration models. [#47]
· Stop milking the SMS cash cow and stop worrying about cannibalising it, otherwise today’s rip-off mobile data services will never take off. [#33]
· SMS combined with the web is going to play a big role in the future, maybe bigger than the role it played in the past. Twitter is just the first of a wave of SMS-based social media and comms applications for people. [#51]
Participants ‘Next Steps’ Vote
Participants were then asked: Which of the following do we need to understand better in the next 6 months?
Is there really a capacity problem, and what is the nature of it?
How to tackle the net neutrality debate and develop an acceptable QOS solution for video?
Is there a long term future for IPTV?
How to take on the iPhone regarding mobile video?
More aggressive piloting / roll-out of sender party pays data?
Lessons learnt & next steps
The vote itself reflects the nature of the discussions and debates at the event: there are lots of issues and things that the industry is not yet clear on that need to be ironed out. The world is changing fast and how we overcome issues and exploit opportunities is still hazy. And all the time, there is a concern that the speed of change could overtake existing players (including Telcos and ISPs)!
However, there does now seem to be greater clarity on several issues, with participants increasingly keen to see the industry tackle the core business-model problem: flat-rate pricing to consumers, with little revenue attached to the distribution of content (particularly bandwidth-hungry video). Overall, most seem to agree that:
1. End users like simple pricing models (hence success of flat rate) but that some ‘heavy users’ will require a variable rate pricing scheme to cover the demands they make;
2. Bandwidth is not free and costs to Telcos and ISPs will continue to rise as video traffic grows;
3. Asking those sending digital goods to pay for the distribution cost is sensible…;
4. …but plenty of work needs to be done on the practicalities of the sender-pays model before it can be widely adopted across fixed and mobile;
5. Operators need to develop a suite of value-added products and services for those sending digital goods over their networks so they can charge incremental revenues that will enable continued network investment;
6. Those pushing the ‘network neutrality’ issue are (deliberately or otherwise) causing confusion over such differential pricing which creates PR and regulatory risks for operators that need to be addressed.
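Points 1 and 2 above suggest a hybrid tariff: a simple flat rate for typical users, with a variable component only for the heaviest. A minimal sketch, where the fee, allowance and overage rate are assumptions for illustration only:

```python
# Illustrative sketch of point 1 above: flat-rate pricing up to an
# allowance, plus a per-GB overage charge so heavy users cover the
# demands they create. The GBP 15 fee, 10 GB allowance and GBP 1/GB
# overage rate are assumptions for illustration, not a proposed tariff.

FLAT_FEE_GBP = 15.0
ALLOWANCE_GB = 10.0
OVERAGE_PER_GB = 1.0

def monthly_bill(usage_gb: float) -> float:
    """Flat fee for everyone; heavy users pay extra per GB over allowance."""
    overage = max(0.0, usage_gb - ALLOWANCE_GB)
    return FLAT_FEE_GBP + overage * OVERAGE_PER_GB

print(monthly_bill(4.0))    # 15.0 -- typical user keeps simple flat rate
print(monthly_bill(25.0))   # 30.0 -- heavy user pays a variable component
```

The design choice matches the feedback: pricing stays simple for the majority (point 1), while the variable element recovers the rising delivery costs of video-heavy users (point 2).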
There are clearly details to be ironed out – and probably experiments in pricing and charging to be done. The sending-party-pays model proposed by Andrew Bud (and, it must be added, by many others who have suggested similar schemes) may work, or it may not – but this is an area where experiments need to be tried. The idea of “educating” upstream users is euphemistic – they are well aware of the benefits they are currently accruing, which is why the Net Neutrality debate is being deliberately muddied. Distributors need to be working on disentangling the bits that can be free from those that should pay to ride, not letting anyone get a free ride.
As can be seen in the responses, there is also a growing realisation that the Telco has to understand and deal with the issues of the overall value chain, end-to-end, not just the section under its direct control, if it wishes to add value over and above being a bit pipe. This is essentially moving towards a solution of the “Quality of Service” issue – they need to decide how much of the solution is capacity increase, how much is traffic management, and how much is customer expectation management.
Alan Patrick, Telco 2.0: “98.7% of users don’t have an iPhone, but 98% of mobile developers code for it because it has an integrated end-to-end experience, rather than a content model based on starving in a garage.”
The “Tempus Fugit” point is well made too – the Telco 2.0 participants are moving towards an answer, but it is not clear that the same urgency is being seen among wider Telco management.
Two areas were skimmed through a little too quickly in the feedback:
Managing a way through the ‘Pirate World’ environment
The economic crisis has helped in that it has reduced the amount of venture capital and other risk equity going into funding plays that need not make revenue, never mind profit. In our view this means that the game will resolve into a battle of deep pockets to fund the early businesses. Incumbents typically suffer from higher cost bases and higher hurdle rates for new ventures. New players typically have less revenue, but lower cost structures. For existing Telcos this means using existing assets as effectively as possible and we suggest a more consolidated approach from operators and associated forums and industry bodies so the industry ends up with one practical solution. This is particularly important when initially tackling the ‘Network Neutrality’ issue and securing customer and regulatory support for differential pricing policies.
Adopting a policing role, particularly in the short-term during Pirate World, may be valuable for operators. Telco 2.0 believes the real value is in managing the supply of content from companies (rather than end users) and ensuring that content is legal (paid for!).
What sort of video solution should Telcos develop?
The temptation for operators to push IPTV is huge – it offers, in theory, steady revenues and control of the set-top box. Unfortunately, all the projected growth is expected to be in Web TV, delivered to PCs or TVs (or both). Providing a suite of value-added distribution services is perhaps a more lucrative strategy for operators:
Operators must better understand the needs of upstream segments and individual customers (media owners, aggregators, broadcasters, retailers, games providers, social networks, etc.) and develop propositions for value-added services in response to these. Managing end user data is likely to be important here. As one participant put it:
o We need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lies in their user data, not their pipes! [#52]
Customer privacy will clearly be an issue if operators develop solutions for upstream customers that involve the management of data flows between both sides of the platform. End users want to know what upstream customers are providing, how they can pay, whether the provider is trusted, etc. and the provider needs to be able to identify and authenticate the customer, as well as understand what content they want and how they want to pay for it. Opt-in is one solution but is complex and time-consuming to build scale so operators need to explore ways of protecting data while using it to add value to transactions over the network.
This briefing summarises key outputs from a recent STL Partners research report and survey on Online Video Distribution. It considers the evolution of the key technologies, and the shifting industry structures, business models and behaviours involved in content creation, distribution, aggregation and viewing.
In this document, online video distribution refers to any video-based material, such as movies, television, sports and user-generated content, distributed via various internet-based (IP) technologies. This includes internet protocol television (IPTV), web streaming and peer-to-peer (P2P) downloading. It includes distribution via any fixed or mobile network, to any device, such as a PC, television or smartphone.
We exclude dynamic two-way video applications such as videoconferencing and video-sharing, as well as traditional broadcasting and physical means of distribution, although the impact of online video distribution on these platforms is explored briefly. Standalone mobile video broadcasting, for example using DMB or DVB-H technologies, is also not considered to be “online”.
In theory, telecom operators should be well-poised to benefit from the evolution of video technology. Fixed and mobile broadband usage are increasing in speed, while phones and set-top boxes are becoming much more sophisticated and user-friendly. Yet apart from some patchy adoption of IPTV as part of broadband triple-play in markets like Japan and France, the agenda is being set by Internet specialists like Google/YouTube and Joost, or offshoots of traditional media players like the BBC’s iPlayer and Hulu. In the background, many consumers are also turning to self-copied and pirated video content found on streaming or P2P sites. And although there is a lot of noise about the creativity of user-generated video and mashups, it is not being matched by revenue, especially from advertisers.
STL used scenario planning to understand the future. The methodology was designed to deal with many moving parts, uncertain times and rapid change. We identified three archetypal future scenarios:
Old order restored: Historic distribution structures and business models are replicated online. Existing actors succeed in reinventing and reasserting themselves against new entrants.
Pirate world: Distribution becomes commoditised, copyright declines in relevance and the Internet destroys value. A new business model is required.
New order emerges: New or “evolved” distributors replace existing ones, with content aggregation becoming more valuable, as well as delivery via a multitude of devices and networks.
In our study we found that these scenarios are not mutually exclusive. In fact, the likelihood is that the current old order will pass through a pirate world phase, before a new order emerges.
In the meantime, the two most important considerations for “distributors” are:
Investing in, and adequately managing, sufficient network access and core network capacity. In many instances this will involve partnering with specialist CDNs (content distribution networks) and deploying appropriate network management / QoS technology.
Developing improved value-added services, based on network and device intelligence and a two-sided “platform” business model, especially to assist in targeting and delivery of adverts.
Contents
Key questions for online video distribution
Online video today
Emerging industry structure
Market size
Future challenges for the industry
Future scenarios for online video
Genre differences
Mobile video evolution
Regional differences
Conclusion
Key questions for online video distribution
When thinking about this report and its ‘big sister’ (a strategy report: Online Video Distribution: Fixing the broken value chain), we asked ourselves some key questions based on what we saw happening in the online video arena. These were:
How will the online video market develop and what are the best strategies for aggregators and distributors?
As broadband pipes have grown fatter and fatter, the capability to deliver a quality video viewing experience over the internet has grown. This broadband capability has driven a tsunami of innovation in hardware, software and services – and the eyeballs have followed. Recent data suggests video is the fastest growing segment of all internet traffic and that the trend will continue for the foreseeable future. This is true, whichever metric is used, be it absolute number of viewers, total time spent viewing or data traffic volumes. In the last 24 months, the same trend has also been seen in mobile video, aided by faster 3.5G networks and more capable handsets and smartphones like the Apple iPhone.
Growth is not limited to a specific content category: adult content; sports; movies; and music are all moving online rapidly. The internet has also led to a new category of user generated content. Home movies have moved out of the privacy of the living room and are becoming more professional, while existing copyright material is being repurposed in the legal grey zone of ‘mash-ups’.
Neither is growth limited to a specific geography. The movement online is a worldwide phenomenon, as the internet has no respect for traditional geographies and boundaries. Certain markets have seen faster adoption of fibre, high-speed DSL, cable or HSPA mobile broadband, which drive more (and higher-quality) video consumption.
All the evidence points towards a future where the internet (and other “closed” IP networks) will be a critical distribution channel for all forms of video. Significantly, so far at least, revenue growth for aggregators and distributors has not followed traffic growth.
Are there historical lessons from which to learn?
Innovation in video distribution is not new. Over the past century, we have seen cinema, broadcast networks and physical media creating temporary shocks to older methods of distributing content. Despite the gloom of some predictions, live events, whether sport, theatre or music, remain popular and co-exist with home entertainment. The transition to, and evolution of, these distribution channels and the associated business models will provide clues about the outcome of video distribution as more content moves online.
However, there is only a certain amount of time in the day available for entertainment in general and watching video specifically. Legacy distribution channels are understandably worried about whether internet video will be additive to, or will cannibalise, their audiences.
A new distribution channel brings opportunities for new entrants to disrupt existing markets and business models. The internet’s defining feature as an interactive distribution channel adds to both the opportunities and the challenges faced by existing players.
User empowerment – for good or ill it is happening, but what will the impact be?
Interactivity has allowed individuals to become distributors in their own right. On the positive side, individuals have generated their own content and made it available to the world. On the negative side, some individuals have used interactivity to distribute content without regard to the rights of copyright holders. Copyright holders have struggled to enforce their rights. Illegal distribution of content not only threatens the absolute value of content, but has also led to the development of unpopular and complicated mechanisms to protect content.
The volume growth of content has placed internet access providers under severe strain. Their attempts to increase prices to compensate for the growth in traffic, and to gain extra revenue by developing additional services, are proving very difficult. Technology-based methods of blocking or prioritising certain traffic types garner a lot of publicity, but also prompt user and legal furore over “Net Neutrality”. These issues have generated a considerable amount of experimentation in the market, especially in the area of pricing models, where subscription, pay-as-you-go, advertising funding, bundles with other distribution channels, and offsets and subsidies all exist in various forms.
The net result is that the video market is in a state of chaos. Will order emerge out of the chaos? What form will this new order take? What will be the impact on existing players in the video value chain? And will powerful new players emerge?
What sort of scenario will emerge?
Applying STL’s scenario planning methodology to the future of the market identified three potential “core” scenarios:
Old order restored: Traditional distribution methods and business models are replicated online. Existing actors succeed in reasserting themselves.
Pirate world: Distribution ceases to be valuable and copyright ceases to be relevant. A new business model is required.
New order emerges: Rather than the total breakdown of pirate world, new distributors will replace existing ones as we still need aggregation to guide us through the jungle.
In our study we found that these scenarios are not mutually exclusive. In fact, the likelihood is that the current old order will pass through a pirate world phase, before a new order emerges.
This allowed us to think about which strategies are relevant in which situations, and which strategic bets can be placed early versus those that should only be placed when the likely path becomes clearer. In addition, this approach allowed us to look at ‘what you need to believe’ for each scenario and to define milestones that will make the path predictable.
The study places the drivers of future internet video distribution in a technological, economic, social and political framework. It then evaluates the implications of these on content type for the value chain of creators, aggregators and distributors. Research includes literature reviews, desk research, industry surveys and interviews with key staff from relevant organisations. In our strategy report, case studies are produced to bring the story to life and to provide a historical context for both successes and failures.
Online video today
The rise of online video as a market in its own right has been driven by two key factors: increasing bandwidth and growing user penetration.
1. Bandwidth
Bandwidth determines the quality of online video that can be consumed in real time via streaming; for video downloaded prior to consumption, it determines download speed rather than quality. This significantly impacts user experience – YouTube really took off when it could be viewed in real time, without lengthy waits for “buffering”. For ISPs and broadband providers, realistic bandwidth is also a key determinant of when IPTV becomes feasible as a mass commercial proposition – it is no coincidence that it tends to track rollout of fibre or higher-speed DSL.
Figure 1 shows the relative quality of media that can be streamed at different broadband speeds, and the relative average broadband speeds of a selection of countries. It should be noted that certain lower-ranked markets have pockets of much faster users, for example those on Verizon’s FiOS network in the US.
Figure 1: Quality of media stream by bandwidth
[Figure]
Source: The Information Technology and Innovation Foundation
At 4Mbit/s broadband speeds, high quality standard definition TV is possible. By 8Mbit/s, high definition (HD) TV is possible. From 24Mbit/s upwards, any normal sized household will have full multimedia capability. In the mobile world, smaller screen sizes mean that lower speeds can generate an acceptable experience, even at 1Mbit/s. Instead, the limiting factors in mobile are more often video processing power, user-friendly interface design and battery life.
Different countries have vastly different average bandwidth, which means there are major differences in the sort of online video services that can be offered around the world. Advanced countries, such as South Korea and Japan, give an indication of how video distribution in other countries should develop over the next five years. In both these countries, a range of video services and new applications has emerged. As a result, people value their broadband connections more highly and often pay more per user than elsewhere.
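A minimal sketch of the bandwidth tiers described above (the thresholds are those quoted in the text; the function name and tier labels are our own shorthand):

```python
def stream_quality(mbit_per_s: float) -> str:
    """Rough media quality achievable by sustained streaming at a given
    rate, using the thresholds quoted in the text (4, 8 and 24 Mbit/s).
    The 1 Mbit/s floor reflects the small-screen mobile case."""
    if mbit_per_s >= 24:
        return "full household multimedia"
    if mbit_per_s >= 8:
        return "high definition TV"
    if mbit_per_s >= 4:
        return "high quality standard definition TV"
    if mbit_per_s >= 1:
        return "acceptable mobile / small-screen video"
    return "sub-broadcast quality"

print(stream_quality(10))  # high definition TV
print(stream_quality(2))   # acceptable mobile / small-screen video
```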
2. Penetration
Penetration of broadband drives the attractiveness of the market for video-focused service providers, as larger user bases can be monetised via:
Advertising: Selling adverts in various formats – against the video on the webpage, on the video media itself, or within the media as product placement.
Offset economics: Players make money from elsewhere and choose to subsidise videos. For example, Google subsidises YouTube via search ad revenue.
Exit: Selling a successful start-up service to a larger player. For example, YouTube’s $1.65bn sale to Google has started a rush in this direction.
Consequently, the greater the penetration, the more and varied the services that can be offered. It should be noted that there are various methods of calculating penetration, including reach as a % of either population or household numbers. In many OECD countries, broadband penetration has now surpassed 50% of homes. The growing use of mobile broadband on laptops or high-end smartphones also means that some households (or even individuals) now possess 2+ separate broadband access channels.
That said, there are also benefits from high levels of acceptance within specific demographic or social niches, even if the broad average across the population is lower. For example, the advent of prepaid mobile broadband is enabling greater penetration into segments such as students or immigrants, for whom particular content types (e.g. foreign-language material) are well suited.
In coming years, broadband penetration should continue to increase in developed markets, as well as certain developing nations which place an emphasis on it, such as China. One side-effect of the current financial crisis is that various countries (including the US) are including broadband in their lists of beneficiaries of “fiscal stimulus” packages. Other markets are also seeing changes in regulatory stance which should benefit fibre rollouts, or wider use of mobile broadband.
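As noted above, penetration can be reckoned against either population or households, and the choice of base changes the headline figure considerably. A sketch with purely hypothetical numbers:

```python
def penetration(subscriptions: int, base: int) -> float:
    """Broadband penetration as a percentage of a chosen base
    (population or number of households)."""
    return 100.0 * subscriptions / base

# Hypothetical country: 15m broadband lines, 60m people, 25m households.
lines, population, households = 15_000_000, 60_000_000, 25_000_000

print(penetration(lines, population))   # 25.0 (% of population)
print(penetration(lines, households))   # 60.0 (% of households)
```

The same subscriber base looks modest on one measure and past the halfway mark on the other, which is why quoted penetration figures should always state their base.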
3. Other factors
In addition to bandwidth and penetration, it is also worth noting that various other factors are driving wider use of online video services and applications today. Briefly, these include:
Improved integration between web and video software. In particular, the use of Adobe’s Flash video has been a major driver in enabling services like YouTube.
Rise in social networking websites like MySpace and FaceBook, which permit easy use of video plug-ins, and encourage users to share video or links, creating viral consumption of content. Other Web 2.0 social media, such as blogs, have also become more video-friendly in recent years, especially as YouTube and other services have made it easy to embed video in web pages.
The falling cost of PCs, especially notebooks, has increased the number of PCs per household in developed markets. In some cases, laptops are replacing second TVs as the device-of-choice in studies, kitchens and bedrooms, increasing the time that “eyeballs” can spend on video.
The falling cost of video cameras, and the increasing usefulness of video features on mobile phones, have stimulated additional user-generated content.
Emerging industry structure
The initial online video market was driven by broadband-led IPTV, but with the increasing bandwidth and uptake of consumer broadband, the emerging market has become far more varied in the past few years. In particular, the ease of use of streamed and web-embedded video has transformed the landscape.
Figure 2 shows the emerging structure of the online video industry.
Figure 2: Structure of the online video industry
[Figure]
Source: Revision 3; STL Partners analysis
The industry is evolving along two main axes.
1. User-generated vs. professional content
The traditional video industry of TV and movies is based on professional, studio-produced content. New media plays – IPTV and web TV services such as Hulu and iPlayer – also operate in this way. But as discussed above, technology-driven cost reductions have opened up new opportunities for lower cost user-generated content, as well as an emerging ‘pro-tail’ sector between these two extremes.
One example of the pro-tail trend is iBall, a high quality, five minute, daily web TV show that is produced and distributed by a UK financial services company, Interactive Investor, rather than by a broadcaster working in tandem with a traditional production company. Its aim is to give a more in-depth view of the financial markets, while maintaining the entertainment focus associated with more mainstream media.
The other side of this middle ground between professional and amateur content is made up of increasingly competent amateurs building high-quality content in the hope of obtaining commercial sponsorship. There is also a small but growing market for business-related video, for example for video webinars or news magazines. Telecom TV, in our own industry, is a good example here.
2. Aggregated vs. curated content
The traditional video and TV industry is based on curated content written, produced, edited and scheduled by professionals. However, technology has introduced automation into this process, allowing various businesses to build simple aggregation-based services. Content is thrown up on to the internet and a search engine enables users to find what they need. YouTube is the best known example of this model.
Increasingly, automation is being used in the curation space too. Joost and Babelgum are examples of curated aggregation with content grouped into channels such as action and sport, animation, comedy and drama. Similarly, Phreadz is a video-based chat system that has a fairly sophisticated threading capability to support conversations built around specific conversation threads – or channels, in video terminology.
Another evolution is the tussle between set-top box based online video, which is mainly IPTV, and pure web TV based online video provided by services such as BBC iPlayer. The set-top box approach usually means subsidising the box, but its subscription-based model delivers more assured revenues. The web TV model is more likely to be ad-supported, and has the advantage of running over open web architecture, so will probably pick up more users. Cable is an interesting hybrid, as the set-top box plays the role of the broadband modem.
Market size
Today, the estimated market size of all online video – IPTV, cable, broadband and mobile – is about $2bn. This figure is made up of an amalgamation of components, including:
Subscription revenues for IPTV, typically as part of triple-play or other bundles from broadband ISPs.
Advertising revenues for IPTV, online video sites like YouTube and other avenues.
Mobile TV subscriptions, or as a component of bundles of services.
Pay-per-use or per-download services.
The figures exclude any standalone consideration of basic “pipe” revenues that can be attributed to video – for example per-MB mobile data fees incurred during video download.
Third-party estimates for the same market in 2012 vary hugely, from $10bn to around $70bn.
Figure 3 shows STL Partners’ (fairly conservative) prediction of the online video market, around $28bn, set against the total size of the global cinema and TV markets.
Figure 3: Total video market versus total online video market ($bn)
[Figure]
Source: STL Partners
In financial terms, online video looks small compared to cinema and TV, but the online video market will also drive major disruption in existing video markets as described in “Future Scenarios” below.
Future challenges for the industry
The online video market can be modelled as a value chain from content creation to customer devices. This is often called the ‘four box model’. Figure 4 shows the four box supply chain model and the key trends for online video.
Figure 4: Four box video supply chain
[Figure]
Source: STL Partners
1. Content creation
There have been two major shifts over the past five years:
Cheaper content recording and production equipment has reduced costs of capture and creation. The resulting emergence of user-generated content has had a major impact. For example, user-generated content has reduced prices in media professions where differentiation is low, for example, photography. It has also resulted in a huge inventory of short-form media on the internet.
The digitisation of video libraries, both by rights owners and increasingly by amateurs with low-cost equipment, has led to a huge back catalogue of long-form video being made available online. This has aided not only the media enterprise operators, but has also driven the market in user copied content – piracy to you and me.
2. Aggregation
The traditional high-cost manual process of content finding, editing and marketing has increasingly been replaced online by low cost, automated aggregation systems. As more media came online, finding content was initially carried out using search engines such as Google.
Now, social media is coming to the fore and networks of friends discover new content. These social networks have also invaded many of the traditional editing functions of selecting, rating and recommending content. Amazon started this with its customer reviews, but the process of peer reviews has become mainstream on the internet, reducing marketing costs for companies as customers do the job that the marketing department used to do.
3. Distribution
Moore’s Law, working open source software, de facto web service standards and a glut of cheap bandwidth and hardware left over from the dot-com bust have meant that distribution costs – per megabyte, teraflop or Mbit/s – have plummeted since 2000.
In addition, distribution options have multiplied. DSL, cable modems and various wireless technologies all compete as online video distribution platforms, while standards such as WiFi and various mobile 3G technologies have expanded mobile bandwidth by 1,000 times.
Over the past few years, fixed line distributors in Europe and the US have engaged in vicious price cutting to fill their huge empty pipes. The overriding strategy has been flat rate pricing on both fixed line and broadband, giving consumers near-unlimited upload and download volumes. However, the rising use of online video is threatening to overload the capacity of these networks in some areas, leading to increasing debates about capacity throttling versus charging more to fund new capacity infrastructure build-out.
One example of this is the launch of the BBC iPlayer a year ago. The average streaming use per customer in the evening increased by 60% within a few days (see Figure 5), and one ISP’s streaming costs tripled within a month. Since then, the problem has been exacerbated by the launch of higher quality video with associated higher bit rates.
Figure 5: Video is soaking up bandwidth and driving up ISP costs
[Figure]
Source: PlusNet
In the mobile domain, the problems can be even greater. Many operators have now launched consumer-oriented mobile broadband services, via smartphones like the iPhone, but especially via cheap HSDPA modems connected to laptops. Whereas in the past even “heavy” business users of 3G data cards tended to generate only around 200MB per month of traffic, it is now not uncommon for consumers to use 5GB, 10GB or even more – especially video traffic. At first, this simply used up 3G capacity that had been built out several years ago and left almost unused – in other words, incremental revenue against existing assets and sunk investment costs.
But the rapidity of adoption of consumer mobile broadband may not be all good news. Given that price points start from as little as €10-15 per month, with video traffic now moving into higher definitions, this is starting to look unsustainable, when set against the cost of mobile network capacity upgrades.
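A rough way to see why this looks unsustainable is to divide the entry price points quoted above by the monthly volumes now being consumed. Both sets of figures come from the text; network carrying costs are deliberately left out, since they vary widely by operator:

```python
def revenue_per_gb(monthly_fee_eur: float, monthly_gb: float) -> float:
    """Flat-rate revenue per gigabyte carried, ignoring network costs."""
    return monthly_fee_eur / monthly_gb

# Plans from as little as EUR 10-15/month, set against consumers
# now using 5GB, 10GB or more of (largely video) traffic per month.
print(revenue_per_gb(15, 5))   # 3.0 EUR/GB at best
print(revenue_per_gb(10, 10))  # 1.0 EUR/GB
```

As volumes climb towards higher-definition video, revenue per gigabyte falls further while the cost of capacity upgrades does not, which is the squeeze the text describes.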
4. Customer environment and devices
The inexorable march of Moore’s Law and increasing adherence to open architectures, together with increasing device interchangeability and application flexibility, have led to the total cost of ownership falling for both hardware and software. This has created new devices and platforms for video, which people describe as the ‘four screens’ of online video – TV, PC, mobile and games consoles.
Each of these screen environments offers a different user experience, leading the environments to be optimised for different media, such as full-length movies or short-form clips. This optimisation, together with bandwidth limitations and business model constraints, will form the principal driver of what video content is offered by the supply side, and consumed by the demand side, on each of the four screens.
5. Supply and demand side issues
The supply side has rushed to serve the perceived new market and, as already noted, a plethora of new businesses and business models has emerged. How value will be extracted is unclear. Some entrants are building audiences on the basis that a large audience will attract advertising, but it remains questionable whether advertising is going to be large enough to sustain this approach, particularly in the current economic climate.
Other start-ups hope they will be able to charge for services that they initially offer free. But history shows that, while consumers are happy to pay for distribution (for example, broadband access) and for consumer electronics equipment (such as an iPod), they are less keen on paying extra for online services such as games, email and content.
Similarly, behaviour on the demand side is not yet clear. For example, a large amount of innovation on the supply side has been in the area of user-generated video (UGV), but most research shows that consumers – and thus advertisers – value long-form and high quality short-form media more. Although volumes of UGV may continue to grow, the bulk of value is expected to be in curated, higher quality content, as shown in Figure 6.
Figure 6: User-generated video drives traffic, long-form video drives revenue
[Figure]
Source: The Diffusion Group
This will have a major impact on distributors. On one hand, there is a risk that distributors will be asked to carry vast amounts of low grade traffic without necessarily being able to generate the returns to upgrade their capacity. On the other hand, there is an opportunity, given the reluctance of consumers to pay directly for upstream services, to broker content and aggregation services to the end customer.
Future scenarios for online video
While distributors potentially face both threat and opportunity, it is still too early to predict exactly what will happen. There are a number of variables – notably economic and regulatory – that are almost impossible to call, at the beginning of 2009. The depth and length of the recession could have a variety of side-effects on mobile video, ranging from reduced broadband rollout and capex, through to increased consumption of ‘free’ services as consumers avoid the costs of more expensive forms of entertainment. There are also assorted unknowns around regulatory shifts, such as the US FCC’s changing attitudes to Net Neutrality and lobbying on copyright issues. In Europe, there remains uncertainty around the ‘digital dividend’ and switch-off of analogue TV. Allocation of spectrum to broadcasters vs. mobile operators is also a particularly thorny issue.
Instead, as in Figure 7, we can consider three future scenarios: understand what they entail, what has to be believed for them to occur, and how to identify when they are occurring.
Figure 7: Scenarios for examination
Old Order restored
This scenario explores a world in which traditional content aggregators control the value chain in the online world.
Traditional aggregators build an online presence that is additive to their existing distribution channels both in terms of overall viewing and revenue.
Pirate World
This scenario explores a world in which content is freely available online.
The short-term impact will be a shock to content creators and traditional aggregators that will see a rapid decline in traditional revenue sources as more and more people move towards acquiring content on the black market. Some of this content will be delivered online, but not all, as the techno savvy will acquire content online and act as distributors to the less savvy in friendship and family groups.
New Order emerges
This scenario explores a world in which traditional aggregators are trapped in a pincer movement by device manufacturers and new, powerful aggregators.
Device manufacturers build secure content delivery into their products and make ease of use and interactivity key features. Individual manufacturers develop suites of devices to serve viewing both in the home and outside. Consumers happily invest in the latest, greatest gadget.
Source: STL Partners
Working through these scenarios it becomes clear that success will be based not on technology per se, but on which players have the ability and rights to monetise content.
This leads to the following assumptions:
Old order restored
Re-establishes control and content rights
Maintains control of sources of funding, such as ads and subscriptions.
Pirate world
Success depends on there being no control of rights: ‘free’ wins
Offset-based funding, including investment and subsidies, that can continue to cover the cost of industry growth.
New order emerges
New copyright model allows pricing control by new aggregators and creators
Control of sources of funding, such as ads and subscriptions, migrates.
Combining these assumptions with evidence from historical case studies, the opinions of industry experts and the outcomes of workshops and surveys suggests that the scenarios are inter-related. Over time, STL believes that the story told in Figure 8 will emerge:
Old order structures will be at risk from disruptive pirate world plays and will continue to come under pressure;
Pirate world will not be sustainable, as there is not enough money to fund an ‘always free’ industry of this size;
Within pirate world, the beginnings of a new order will emerge.
Figure 8: The shift from the old order to the new order
[Figure]
Source: STL Partners
Figure 8 shows a decline in Old Order market share caused by the Pirate World. At this point, the New Order has only a small market share, but over time the New Order gains market share as the Pirate World grows and then collapses to about 10% of the total. The Old Order retreats to its core business and holds about 25% of the total market. The New Order is expected to take about 40% of the market in 2013 and 65% in 2018. This market share will be built on a number of foundations, the most likely being:
Old Order players restructure: Old Order players restructure to compete in the new order markets. Two examples of this occurring are Hulu and BBC iPlayer.
Outside players make a new market: The classic example is Apple, which has consistently entered markets that are confused or emergent and has driven a high value, end-to-end solution early, capturing a small, but useful, 15% to 25% market share and earning higher than average surplus.
Pirates settle down, become gamekeepers: An example here is YouTube. In October 2008, it looked like Google and YouTube would promote the ‘respectable’ face of pirate world, with Google subsidising YouTube as it continued to host pirated content as fast as, or faster than, Digital Millennium Copyright Act takedown notices could remove it. However, in November a new financial reality marked a change in behaviour, with YouTube doing deals with MGM to offer ad-funded movies.
New trusted guides emerge: Increasingly, consumers will look for people who they can trust to help them navigate through the morass of content. These trusted suppliers will accumulate users as early adopters recommend them to others.
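The end-state shares described above (Old Order about 25%, Pirate World collapsing to about 10%, New Order reaching 65% by 2018) can be tabulated as a quick consistency check. This is purely an illustration of the figures stated in the text:

```python
# 2018 market shares as stated in the scenario narrative (percent).
shares_2018 = {
    "Old Order": 25,     # retreats to its core business
    "Pirate World": 10,  # collapses back after its peak
    "New Order": 65,     # stated share for 2018
}

# The three scenario shares should account for the whole market.
assert sum(shares_2018.values()) == 100

for player, share in shares_2018.items():
    print(f"{player}: {share}% of the market")
```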
Genre differences
STL Partners examined a number of genres to understand how things will play out. We looked at movies, sport, user-generated and adult content. Movies and sport are major components of video output today, user-generated content is the new kid on the block and adult content is often a bellwether for the future of more mainstream media.
Movies: These will feel the impact of pirate world, just as music did. However, movie revenues are more diverse and there are already large streams coming from other areas. Music is only starting to develop diverse revenue streams. The endgame will depend on a new settlement for copyrights. We believe a new concord will emerge as, unlike music, it is too difficult to pirate movies cheaply and too costly to produce new content for free. We believe online movies will be delivered by a combination of Old Order players that have redesigned their supply chains, reformed Pirates and emergent New Order players.
Sport: Sport suffers a rapid decline in value after its live date, like news and a few other genres. This makes it less valuable for Pirate World to copy, so the most likely piracy will be illegal live streaming of events where rights already exist. This is an issue, but not on the same scale as rampant copying. However, there are a huge number of events followed globally that are not currently covered. We believe viable New Order businesses can be formed quickly to deliver sport.
User-generated content: As discussed earlier, user-generated content is often early to flower, but it tends to wither over time as professional content takes over and more structured markets emerge. We see this occurring here, with the exception of user-generated communications, such as online video social networks.
Adult content: The adult content industry is being hit hardest by user-generated content. It faces many problems, but has few solutions. For example, advertising, enforcing rights and obtaining subsidies or subscriptions are all tougher tasks for this sort of content. In addition, legislators are clamping down. We believe this is an industry, like photography, where users can and will create content for free themselves, leading to value destruction.
Mobile video evolution
The initial hype around mobile TV and video has largely stalled because of a variety of issues:
High levels of friction through the supply chain, driving poor returns and poor user experiences;
Early reliance on 3.0G networks, often with poor capacity, coverage and latency;
Limitations of handsets, including price, battery, screen, application software and usability. For example, some early services took 5+ seconds to ‘change channel’;
Poor fit with typical mobile payment methods, especially prepay users, for whom regular subscription-based services tend to be unsuitable.
However, the new generation of mobile smartphones that has emerged following the debut of the Apple iPhone is leading to a resurgence in mobile video, albeit from a small base. Newer devices have faster 3.5G and WiFi radios, bigger screens and faster graphics processors. Improved mobile web browsers and native video software are further improving the experience, while the costs and complexities of mobile-oriented broadcast (e.g. via DVB-H) are helping the pendulum swing back towards mobile online video.
The good news for Old Order mobile players is that this market is still well regulated, making it difficult for the Pirate World to take over any large volumes. That said, the growing prevalence of flat-rate data plans, coupled with more-capable browsers and ‘sideloading’ content via memory cards, presents a challenge to monetisation, as it is becoming increasingly possible to see ‘the real Internet’ on handsets. Nevertheless, various technical and regulatory factors tend to mean that content and bandwidth consumption is better-policed in mobile than on fixed broadband.
Regional differences
As Figure 9 shows, countries such as Japan, Korea, the Nordics and France are way ahead in bandwidth and price. There is a strong correlation between bandwidth, price and centrally planned and managed economies. The lesson here is similar to that of mobile. To get ahead, some form of national – and perhaps in Europe international – co-ordination will be required to move bandwidth speeds and prices forward.
Figure 9: Price and speed of broadband by country
[Figure]
Source: The Information Technology and Innovation Foundation
This co-ordination is key for distributors, as one of the lessons of the planned rollouts is that it is far better to have visibility of revenues to justify rolling out large scale infrastructure upgrades.
Strategic options for distributors
Distributors must act quickly to avert the self-imposed threats of inflexible structures and strategies, and to realise the opportunity of entering the market as a broker of content for customers.
Threats
The threats are in distributors’ current structures and strategies. In an STL global survey of 145 telecom and media professionals, there was concern about the ability of distributors to compete, both in terms of creating the right services and in executing quickly if they could create the services.
Figure 10: Online distributors seen as second most likely to lose
[Figure]
Respondents suggested that IPTV would not be the major online video market going forward, with various forms of web-based video services taking the lion’s share of the market. This view is backed up by other forecasts of IPTV against other forms of online video take-up.
In addition, there is a real risk that the sheer volume of online video – and the low value of most of it – will make it uneconomic for distributors to play a dumb-pipe role, especially if this would hand market power to players that will then enter areas of the distributors’ markets, such as edge distribution, service provisioning and orchestration. This is a particular concern in the mobile arena, where it is particularly expensive and time-consuming to add capacity, if it involves acquiring extra spectrum or cell-site locations.
Weaknesses
The major weakness, pointed out by many in a Telco 2.0 brainstorm session, is that distributors, even if they do respond, may move too slowly and with the wrong business models. The rise of web-based video is also making it less likely that subscription-based IPTV will be able to form a cornerstone of future fibre rollout business models.
Strengths
Looking at the scenarios and their possible outcomes, distributors do have one clear strength: they make money in every scenario, which is more than can be said for most of the other players. There is at least some money to be made in providing ‘pipes’, especially under favourable regulatory regimes.
The Pirate World will be one where cash is king and those with deep pockets (like Telcos) will gain market share.
Opportunities
In both Pirate World and the New Order, aggregators’ power diminishes, and the increasing interconnection of CPE devices with the network will drive new opportunities, giving distributors a chance to capture some of that power and value.
Old Order economics will be disrupted by the impact of Pirate World, giving distributors a once-in-a-lifetime chance to move up the value chain and avoid being relegated to dumb pipes.
The requirement is for distributors to use their strengths – cash, valuable users, reach, ownership of a key part of the value chain and willingness of users to pay for distribution and CPE devices – to begin to forge value chains that will maximise the opportunities.
Strategic options
While the emergence of the new order is still unclear, our scenario suggests some activities distributors can plan for and strategic options they can consider.
Figure 11: Strategic map for distributors
[Figure]
Source: STL Partners
Summarising Figure 11, we assume distributors start this model with a flat rate, or perhaps a subsidy, for broadband, as is increasingly common. Added value options then follow.
Moving into Pirate World: We assume there will be little revenue to be gained from upstream players, so the key initially must be to sell extra value-added services to downstream users. For example:
Service bundles: Not just connectivity, but buying and bulk-splitting services and material that downstream users would not buy for themselves. Examples include brokered content (perhaps downloads from Amazon), help with interworking between CPE and devices, access to web services such as VoIP and WiFi, fixed and mobile connectivity, and new web services built around storage, security, social networking and unified directories.
Content delivery networks and quality of service: As users and contention increase, some users will pay more for better quality connectivity and services that allow synchronous broadband use.
One issue to examine in both these propositions is how distributors can optimise services and gain revenue by expanding into the CPE arena. Nearly all the research we have seen – and conducted ourselves – suggests that, for the user, seamless interoperation of CPE devices is a major requirement.
As the New Order emerges: We believe there will be an increased economic surplus in the value chain (especially upstream, from better-protected aggregators and rights owners), so distributors can seek to develop two-sided market strategies. For example:
Higher service levels: Initially, distributors could offer higher service levels for higher value content to upstream service providers. This requires the emergence of a two-sided market.
Developing ecosystems: Over time, distributors could develop ecosystems with upstream, downstream and third-party service providers. These ecosystems could exist on Telco distribution platforms and infrastructure.
With a few exceptions, single operators will not be able to drive these strategies alone. They will need to collaborate with each other, certainly nationally and possibly regionally and globally. In many countries, a concerted effort will be required between distributors and government to define the conditions for investment in better, faster capacity.
Conclusion
There are six key conclusions for distributors:
The growth of online video will have a major impact on internet traffic, which will experience an order-of-magnitude growth over the next five years. Our estimates are conservative compared to those of other analysts, so internet traffic could grow even more.
Although forecast online video revenues of about $28bn in 2013 are not large, they represent an extra revenue stream that will help cover costs in converged services and quad-plays, where distributors always take a share of revenue.
The key opportunity for distributors is to expand their influence in the overall value chain as the aggregation, content and CPE markets undergo disruption in Pirate World.
As all value chain models of online video are sensitive to video traffic pricing, the provision of scale will be essential to upstream players. Distributors must leverage this advantage to adopt two-sided business models.
Distributors need to create the conditions that will allow investment in major capacity upgrades. Where this has been achieved, it has been with some form of government or regulatory influence. Elsewhere, distributors will need to drive this themselves, or prepare society to accept capacity overload.
In the new order, targeted customer advertising and cost per mille (CPM), as well as the ability to charge for value-added services, will create opportunities for distributors to add value by exposing useful network data and added value services.
2 Note: We provide an overview of the scenarios here – for more detail on them and how distributors, in particular, should respond to (or drive) them, please see the strategy report.