Full Article: Apps & Appstores: Litmus Vs Apple Appstore

Summary: As O2 UK’s Litmus developer programme matures into a global corporate project for Telefónica, we analyse the business model challenges it faces in becoming a vibrant community for developers and a value driver for the company.

Back in March, we said that O2’s Litmus developer site was “better than the Apple App Store”. Quite a claim, as it turned out. We based it on the deep integration of Litmus with the range of social and business enablers it provided in addition to the O2 network APIs. As well as a generous revenue share and quick payment, Litmus offered access to O2’s billing system to help cash collection, crowdsourced testing from Mob4Hire, Web-hosting services, and the tantalising prospect of access to an internal Telefonica venture capital group.

How is Litmus doing now?

In terms of product quality, Litmus has recently added some highly interesting APIs: for example, the ability to query the current status and capabilities of a device, whether the user has sufficient credit to make a payment, whether they have an inclusive data plan, whether they are in a WLAN hotspot, and whether or not they are currently roaming.

The importance of this kind of contextual data – call it Level 1 context – for delivering an excellent user experience with mobile applications and content is hard to overestimate, and it avoids most of the political issues that dog some other forms of context, like user behaviour and social graph data (call them Level 2 context). Overall then, the potential quality of application looks encouraging.
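To make this concrete, here is a minimal sketch of how a developer might pull together that Level 1 context before deciding how an application should behave. The base URL, resource names and response fields are illustrative assumptions of ours, not the documented Litmus API.

```python
import requests

# Hypothetical base URL and resource names -- illustrative only, not the
# documented Litmus API.
BASE_URL = "https://api.example-litmus.o2.co.uk/v1"

def get_level1_context(msisdn: str, api_key: str) -> dict:
    """Fetch device and account context for a subscriber (illustrative)."""
    headers = {"Authorization": f"Bearer {api_key}"}
    context = {}
    # One call per capability described above: device status/capabilities,
    # payment credit, data plan, WLAN hotspot presence, and roaming status.
    for resource in ("device", "credit", "data-plan", "wlan-status", "roaming"):
        resp = requests.get(f"{BASE_URL}/subscribers/{msisdn}/{resource}",
                            headers=headers, timeout=5)
        resp.raise_for_status()
        context[resource] = resp.json()
    return context

if __name__ == "__main__":
    ctx = get_level1_context("447700900123", api_key="demo-key")
    # An application could then adapt its behaviour, e.g. defer a large
    # download while the user is roaming and off WLAN.
    if ctx["roaming"].get("isRoaming") and not ctx["wlan-status"].get("onWlan"):
        print("Defer heavy content until the user is home or on WLAN.")
```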

But how about quantity? At the moment, there are 36 pages of apps on sale at Litmus, plus three more pages of apps in testing; at 10 apps to the page, that’s roughly 390 apps in total. Many of them are versions of the same application for different devices or localisations, so the count of active projects is rather lower than that. It’s also true that many developers sensibly submit their applications to every app store going, so there is quite a bit of duplication.

So far, this is a respectable try, but it’s nowhere near Apple’s app count. However, as we’ll see later on, stacking up apps in an app store isn’t the only strategy available.

A further indicator of the quantity of development activity is that the forums at o2litmus.co.uk look worryingly quiet. Another traditional measure of activity at an open-source project is the traffic on the mailing list, and there doesn’t seem to be much going on there either. This is something Litmus has in common with the other mobile developer platforms – the Symbian and Forum Nokia forums are patchy at best. Perhaps this point from The Information Architecture of Social Experience Design‘s list of anti-patterns for Web sites applies:

“a Potemkin Village is an overly elaborated set of empty community discussion areas or other collaborative spaces, created in anticipation of a thriving population rather than grown organically in response to their needs”.

So, why aren’t we seeing much more development activity at Litmus? It’s a big question, especially as Litmus is meant to be under active development. What are the warning signs of a community that might end up looking like this?

[Figure: Litmus screenshot]

The critical challenge is reaching sufficient scale, which is vital to the success of platform business models like Litmus. O2 UK has 18 million subscribers; if 10% of them are conscious of apps, that is an addressable user base of roughly 1.8 million.

Further, it’s probably true that iPhone owners tend to be power users, being a self-selected group of early adopters. (According to Ray da Silva of Vodafone, iPhone users exhibit seven times greater usage than the closest rival group, BlackBerry users.) And O2 has the exclusive right to distribute iPhones in the UK, so the bulk of O2’s power users are probably concentrated in its iPhone base. Those roughly 1.5 million O2 iPhone users already have the App Store, integrated with the hardware and software and prominently placed on the device. If our estimates are close, that leaves around 300,000 app-aware users outside the iPhone base who might use Litmus.
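The arithmetic behind these estimates is rough but easy to reproduce. The sketch below simply restates the article’s own assumptions (18 million subscribers, 10% app-aware, roughly 1.5 million iPhone users); the figures are not independent market data.

```python
# Back-of-envelope estimate of the addressable Litmus audience, using only
# the assumptions stated above -- not independent market data.
o2_uk_subscribers = 18_000_000
app_aware_share = 0.10            # assumed share of subscribers conscious of apps
iphone_users = 1_500_000          # O2 UK iPhone base, already served by the App Store

app_aware_base = o2_uk_subscribers * app_aware_share    # ~1.8 million
litmus_addressable = app_aware_base - iphone_users      # ~300,000

print(f"App-aware base:         {app_aware_base:,.0f}")
print(f"Potential Litmus users: {litmus_addressable:,.0f}")
```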

So, Telefonica / O2 faces a strategic dilemma: how should it balance investment between serving the huge (but ultimately Apple’s) iPhone community and nurturing the nascent, home-grown Litmus eco-system?

And, as we’ve often pointed out, telcos consistently overestimate the degree to which their subscribers constitute a real community or want to have any affinity with their operator. Apple, at least, can claim to be the proud owner of a cult, an image it works extremely hard to maintain. Probably no other hardware vendor in mobile can claim that, and the OS vendors aren’t much better off although Symbian tries hard.

This is important, because active developer communities tend to be driven by a smallish core group of members. Recruiting new members of this group is critical for long term survival. On the other hand, the problems, ideas, feedback, and money coming into such a community usually originate in another community core group – the user elite. The line between the power users and the developer community is necessarily fuzzy, but it’s crucial that you have enough people in the user community who are passionately engaged with the product to support the developer core group.

Fragmentation is another challenge resulting from insufficient scale; it’s a serious problem if you have to keep refactoring your code to work on dozens of different devices and OS platforms. Equally, being fragmented between operators is no better; in terms of scale, developing for Symbian is going to beat developing for O2 UK.

Put together, these issues add up to a serious overall challenge to the viability of Litmus in its current form as anything other than a test of limited scale and ultimately limited value.

Litmus Responds…

So, clearly it was going to be interesting when James Parton and Jose Valles Nunez, from Litmus and Telefonica’s Open Innovation group respectively, dropped into the Telco 2.0 offices.

The first interesting point that arises is that the Litmus group within Telefonica is very keen not to be considered an appstore. You might think this is a brave decision; everyone in the industry is obsessed with them since Apple’s big hit, and a week doesn’t go by without someone launching one – whether an operator, a vendor, a third-party store like Handango or Symbian’s app warehouse, or a gaggle of hackers doing an unofficial one for iPhone apps that Apple don’t like.

The obvious corollary to that is “well, what is it then?” Parton argues that the real role of Litmus isn’t as a first-line product, but rather as a way of crowdsourcing decisions about which applications to promote to the mass market through O2 Active – a form of “co-creation” with the community of power users and developers. Rather than relying on the judgment of product managers in Slough, the idea is to serve up new ideas to a self-selected group of neophiles and to see what sticks. Litmus is hoping that this will both provide useful feedback and also reduce churn by binding their user elite into the company more closely.

So far, they report that the extra features like hosting and testing haven’t been much used, and were perhaps a case of “over-engineering” the product – most of the developers involved are primarily interested in Litmus as a route to market, whether as an app store or as a sort of X-Factor for applications that might make it to the official O2 deck. However, they are keenly concerned about recruiting more developers and about the perception of a lack of critical scale.

Scope for Business Model Innovation

So perhaps Telefonica, and the industry as a whole, should be looking for other organising principles for developer communities – whether to build scale in their own right or just to get to ‘critical mass’ in the communities? Rather than being operator- or vendor- specific, perhaps they should be application-specific or problem-specific?

The main forces that create these communities are either technological opportunity – ‘we can do something new!’ – or else an urgent problem – ‘how can we fix this?’ Examples of opportunity-based communities include the vigorous groups that grew up around major programming languages, or the Linux kernel. These exist because the possibilities of the technology attracted people with all kinds of interesting problems and, quite frequently, just raw curiosity. This is also the case for the iPhone, which opened up all the possibilities of mobile development, whilst preserving the relevance of existing Apple developers’ skills and offering a simple path to market.

Shared problem communities start with a very specific need; I need to get data out of a Web hosting firm that is about to shut down, or visualise water management information for northern Senegal without needing to spend $10,000 a seat, or find an alternative to Microsoft Internet Explorer. The first of these led to Archiveteam, the second to Agepabase, and the third to Mozilla. Exasperation with the telecoms vendors’ products for enterprise voice was what inspired the creation of Asterisk.

Salesforce’s Force.com is a successful example of a problem-specific developer platform; you’re using Salesforce and you have a problem that involves CRM, so off you go to Force.com. You’re trying to solve your problem using voice? Perhaps you might want to try Ribbit, which is an opportunity-based developer platform.

And this makes sense; after all, solving the problem is where the economic value emerges, and it is precisely the application of broad, general-purpose commodity technologies to very specific business problems that we want all those developers to bring, extending the value of the platform.

Litmus: Neither fish nor fowl…?

But there’s a disconnect here relating back to Litmus; communities that form around the possibilities of a particular technology tend to be generalist, global, and attached to the technology rather than to any particular operator or even vendor. Communities that form around a problem are more particular. Neither of these fits Litmus, although you could perhaps say that it’s about the possibilities of telco APIs in general.

It’s all rather reminiscent of J. P. Rangaswami’s notion that the more general-purpose the technology, the more appropriate it is for open source because it can scale better; a technology-motivated community needs breadth and scale.

So, while there is often value in keeping a test tightly managed as a centre of innovation and learning, as O2 UK appear to have done, perhaps Telefonica / O2 will eventually be better off looking at enterprise problems and being less centred on the O2 brand name, or else broadening the addressable market by rolling Litmus out to the whole of Telefonica, including the Latin American markets. Brazil has one of the world’s most vigorous hacker communities – they invented Commwarrior, one of the first mobile worms, after all. Surely there’s innovation to be had there? And it’s absolutely vital to the success of the whole project that it finds a sufficiently large user elite of its own to support the developer core group.

At the moment, though, at least going by the content of a recent call held by the Mobile Entertainment Forum, O2 seems to be mostly interested in using the Litmus APIs for content rather than applications. For example, the key use case for their roaming status API is that content providers can avoid serving content licensed in one territory into another. This is fair enough, as content applications may be part of the solution for Litmus, but we’re slightly concerned that they may be stepping into the vortex of content obsession, like so many others in the industry.
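As a minimal sketch of that use case, the snippet below gates territory-licensed content on the subscriber’s roaming status. The lookup function is a stub standing in for whatever roaming-status API the operator exposes; all names and fields here are our own hypothetical choices.

```python
# Illustrative gate for territory-licensed content based on roaming status.
# The lookup is a stub; a real integration would call the operator's API.

LICENSED_TERRITORIES = {"GB"}   # e.g. content licensed for the UK only

def lookup_roaming_status(msisdn: str) -> dict:
    """Hypothetical stand-in for an operator roaming-status call."""
    return {"isRoaming": False, "visitedCountry": "GB"}

def may_serve_content(msisdn: str) -> bool:
    status = lookup_roaming_status(msisdn)
    # Serve if the user is on the home network, or roaming within a
    # territory the content is licensed for.
    return (not status["isRoaming"]) or status["visitedCountry"] in LICENSED_TERRITORIES

if __name__ == "__main__":
    print("Serve content:", may_serve_content("447700900123"))
```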

Our view is that, as much as we like many elements of Litmus, in its current form and scale it may well produce some useful test results but probably won’t develop into a successful platform business. Building a much bigger user base should therefore be Telefonica / O2’s top priority for Litmus – even if the developer community is the key target audience. No amount of good new apps can deliver this on its own; broader success will take good execution of the kind of radical business model innovation we’ve outlined here.

One option would be to look in the other direction. The existing version of Litmus is targeted at consumers; what about enterprises, or small businesses and power users? This would require a different approach to signing up both developers and customers – in fact, it would be rather more app-store or app-market-like than the current “X-Factor for developers” model, although perhaps there could be a version in which the customers’ problems competed for solutions from the developers. In fact, according to Jose Valles Nunez, Telefonica is indeed considering a “business class Litmus” in the foreseeable future.

A further question is one of credibility. Attracting developers to use a platform requires their confidence that it really will be promoted and that it will stick around – no-one wants to put effort into something that might disappear in a few months’ time, as several hosted Web application environments already have. At BT, spending money on Ribbit was intended to act as what biologists call a costly signal – a signal that is credible precisely because it requires a real investment. Perhaps the first few “picks” for the mainline O2 Active lineup, or the first Telefonica Ventures investment to come out of Litmus, will light the blue touchpaper?

Full Article: LTE: Late, Tempting, and Elusive

Summary: To some, LTE is the latest mobile wonder technology – bigger, faster, better. But how do institutional investors see it?

This is a Guest Briefing from Arete Research, a Telco 2.0™ partner specialising in investment analysis.

The views in this article are not intended to constitute investment advice from Telco 2.0™ or STL Partners. We are reprinting Arete’s Analysis to give our customers some additional insight into how some Investors see the Telecoms Market.

For further information please contact:

Richard Kramer, Analyst
richard.kramer@arete.net
+44 (0)20 7959 1303


NB. This article can be downloaded in PDF format here or browsed on-screen below.

Wireless Infrastructure

[Figure]

LTE is the new HSPA is the new WCDMA: another wave of new air interfaces, network architectures, and enabled services to add mobile data capacity. From 3G to 3.5G to 4G, vendors are pushing technology into a few “pioneer” operators, hoping to boost sales. Yet, like previous “G’s,” LTE will see minimal near-term sales, requires $1bn+ of R&D per vendor, and promises uncertain returns. The LTE hype is adding costs for vendors that saw margins fall for two years.

Despite large projects in China and India, we see wireless infrastructure sales down 5% in ’09, after 10% growth in ’08. As major 2G rollouts near an end, emerging markets 3G pricing should sink to new lows. Some 75% of sales are with four vendors (Ericsson, NSN-Nortel, Huawei, and Alcatel-Lucent), but margins have been falling: we do not see consolidation (like the recent NSN-Nortel deal) structurally improving margins. LTE is another chapter in the story of a fundamentally deflationary market, with each successive generation having a shorter lifecycle and yielding lower sales. We expect a period of heightened (and politicised) competition for a few “strategic accounts,” and fresh attempts to “buy” share (as in NSN-Nortel, or by ZTE).

Late Is Great. We think LTE will roll out later, and in a more limited form than is even now being proposed (after delays at Verizon and others). There is little business case for aggressive deployment, even at CDMA operators whose roadmaps are reaching dead ends. HSPA+ further confuses the picture.

Temptations Galore. Like WCDMA, every vendor thinks it can take market share in LTE. And like WCDMA, we think share shifts will prove limited, and the ensuing fight for deals will leave few winners.

Elusive Economics. LTE demands $1bn in R&D spend over three to five years; with extensive testing and sharing of technical data among leading operators, there is little scope to cut corners (or costs).  LTE rollouts will not improve poor industry margins, and at 2.6GHz, may force network sharing.

Reaching for the Grapes

Table 1 shows aggregate sales, EBITDA, and capex for the top global and emerging markets operators. It reflects a minefield of M&A, currencies, private equity deals, and changes in reporting structure. Getting more complete data is nearly impossible: GSA says there are 284 GSM/WCDMA operators, and CDG claims another 280 in CDMA. We have long found only limited correlation between aggregate capex numbers and OEM sales (which often lag shipments due to revenue recognition). Despite rising data traffic volumes and emerging markets capex, we think equipment vendor sales will fall 5%+ in US$. We think LTE adds risk by bringing forward R&D spend to lock down key customers, while committing OEMs to roll out immature technology with uncertain commercial demand.

Table 1: Sales and Capex Growth, ’05-’09E

                                    ’05    ’06    ’07    ’08    ’09E
Top 20 Global Operators
  Sales Growth                      13%    16%    15%    10%    5%
  EBITDA Growth                     13%    15%    14%    10%    8%
  Capex Growth                      10%    10%    5%     9%     -1%
Top 25 Emerging Market Operators
  Sales Growth                      35%    38%    29%    20%    11%
  EBITDA Growth                     33%    46%    30%    18%    8%
  Capex Growth                      38%    29%    38%    25%    -12%
Global Capex Total                  16%    18%    13%    14%    -5%

Source: Arete Research

LaTE for Operators

LTE was pushed by the GSM community in a global standards war against CDMA and WiMAX. Since LTE involves new core and radio networks, and raises the prospect of managing three networks (GSM, WCDMA/HSPA, and LTE), it is a major roadmap decision for conservative operators. Added to this are questions about spectrum, IPR, devices, and business cases. These many issues render moot near-term speculation about the timing of LTE rollouts.

Verizon and DoCoMo aside, few operators profess an appetite for LTE’s new radio access products, air interfaces, or early-stage handsets and single-mode datacards. We expect plans for “commercial service” in ’10 will be “soft” launches. Reasons for launching early tend to be qualitative: gaining experience with new technology, or a perception of technical superiority. A look at leading operators shows only a few have clear LTE commitments.

  • Verizon already pushed back its Phase I (fixed access in 20-30 markets) to 2H10, with “rapid deployment” in ’11-’12 at 700MHz, 850MHz, and 1.9GHz bands, and national coverage by ’15, easily met by rolling out at 700MHz. Arguably, Verizon is driven more by concerns over the end of the CDMA roadmap, and management said it would “start out slow and see what we need to do.”
  • TeliaSonera targets a 2010 data-only launch in two cities (Stockholm and Oslo), a high-profile test between Huawei and Ericsson.
  • Vodafone’s MetroZone concept uses low-cost femto- or micro-cells for urban areas; it has no firm commitment on launching LTE.
  • 3 is focussing on HSPA+, with HSPA datacards in the UK offering 15GB traffic for £15, on one-month rolling contracts.
  • Telefónica O2 is awaiting spectrum auctions in key markets (Germany, UK) before deciding on LTE; it is sceptical about getting datacards for lower frequencies.
  •  Orange says it is investing in backhaul while it “considers LTE network architectures.”
  • T-Mobile is the most aggressive, aiming for an ’11 LTE rollout to make up for its late start in 3G, and seeks to build an eco-system around VoLGA (Voice over LTE via Generic Access).
  • China Mobile is backing a China-specific version (TD-LTE), which limits the role for Western vendors until any harmonising of standards.
  • DoCoMo plans to launch LTE “sometime” in ’10, but was burnt before in launching WCDMA early. LTE business plans submitted to the Japanese regulator expect $11bn of spend in five years, some at unique frequency bands (e.g., 1.5GHz and 1.7GHz).

LTE’s “commercial availability” marks the start of addressing the issue of handling voice, either via fallback to circuit-switched networks, or with VoIP over wireless. The lack of LTE voice means operators have to support three networks, or shut down GSM (better coverage than WCDMA) or WCDMA (better data rates than GSM). This is a major roadblock to mass market adoption: operators are unlikely to roll out LTE based on data-only business models. The other hope is that LTE sparks fresh investment in core networks: radio is just 35-40% of Vodafone’s capex and 30% of Orange’s. The rest goes to core, transmission, IT, and other platforms. Yet large OEMs may not benefit from backhaul spend, given cheap wireline bandwidth and acceptance of point-to-multipoint microwave.

HSPA+ is a viable medium-term alternative to LTE, offering similar technical performance and spectral efficiency. (LTE needs 20MHz of spectrum, vs. 10MHz for HSPA+.) There have been four “commercial” HSPA+ launches at 21Mbps peak downlink speeds, and 20+ others are pending. Canadian CDMA operators Telus and Bell (like the Koreans) adopted HSPA only recently. HSPA+ is favoured by existing vendors: it lacks enough new hardware to be an entry point for the industry’s second tier (Motorola, NEC, and to a lesser extent Alcatel-Lucent), but HSPA+ will also require new devices. There are also further proposed extensions of GSM that quadruple capacity (VAMOS, introducing MIMO antennas, and MUROS for multiplexing re-use); these too need new handsets.

Vendors say successive 3G and 4G variants require “just a software upgrade.” This is largely a myth. With both HSPA+ and LTE, the use of 64QAM brings significant throughput degradation with distance, sharply reducing the cell area that can get 21Mbps service to around 15%. MIMO antennas and/or multi-carrier solutions with additional power amplifiers are needed to correct this. While products shipping from ’07 onwards can theoretically be upgraded to 21Mbps downlink, both capacity (i.e., extra carriers) and output power (to 60W+) requirements demand extra hardware (and new handsets). Vendors are only now starting to ship newer multi-mode (GSM, WCDMA, and LTE) platforms (e.g., Ericsson’s RBS6000 or Huawei’s Uni-BTS). Reducing the number of sites needed to run 2G, 3G, and 4G will dampen overall equipment sales.
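A rough way to see why the peak rate reaches only a small slice of the cell is to compare the SNR the Shannon bound demands for 21Mbps in a 5MHz HSPA+ carrier with what a cell-edge user typically sees. This is back-of-envelope arithmetic only, not a radio planning model, and real 64QAM links with coding need several dB more than the bound.

```python
import math

def shannon_snr_db(rate_bps: float, bandwidth_hz: float) -> float:
    """Minimum SNR (in dB) needed to hit a target rate under the Shannon bound."""
    spectral_eff = rate_bps / bandwidth_hz           # bits/s/Hz
    snr_linear = 2 ** spectral_eff - 1
    return 10 * math.log10(snr_linear)

# Peak 21Mbps (64QAM) versus more modest rates in a 5MHz HSPA+ carrier.
for rate in (21e6, 7.2e6, 3.6e6):
    print(f"{rate / 1e6:>4.1f} Mbps needs at least "
          f"{shannon_snr_db(rate, 5e6):5.1f} dB SNR (Shannon bound)")

# Cell-edge users commonly see single-digit SNR, so only the area close to
# the site can sustain the 64QAM peak rate -- consistent with the ~15%
# figure above; practical links need even more SNR than the bound implies.
```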

Tempting for Vendors

There are three reasons LTE holds such irresistible charm for vendors. First, OEMs want to shift otherwise largely stagnant market shares. Second, vendor marketing does not allow for “fast followers” on technology roadmaps. Leading vendors readily admit claims of 100-150Mbps throughput are “theoretical” but cannot resist the tendency to technical one-upmanship. Third, we think there will be fewer LTE networks built than in WCDMA, especially at 2.6GHz, as network-sharing concepts take root and operators are capital-constrained. Can the US afford to build four or more nationwide LTE networks? This scarcity makes it even more crucial for vendors to win deals.

Every vendor expected to gain share in WCDMA. NSN briefly did, but Huawei is surging ahead, while ALU struggled to digest Nortel’s WCDMA unit and Motorola lost ground. Figure 1 shows leading radio vendors’ market share. In ’07, Ericsson and Huawei gained share.  In ’08, we again saw Huawei gain, as did ALU (+1ppt), whereas Ericsson was stable and Motorola and NSN lost ground.

Figure 1: Wireless Infrastructure Market Share, ’07E-’09E

[Figure]

Source: Arete Research; others incl. ZTE, Fujitsu, LG, Samsung, and direct sub-systems vendor sales (Powerwave, CommScope, Kathrein, etc.);
excludes data and transmission sales from Cisco, Juniper, Harris, Tellabs, and others.

While the industry has evolved into an oligopoly in which four vendors control 75% of sales, this has not eased pricing pressure or boosted margins. Ericsson remains the industry no. 1, but its margins are half ’07 levels; meanwhile, NSN is losing money and seeking further scale by buying Nortel’s CDMA and LTE assets. Huawei’s long-standing aggressiveness is being matched by ZTE (now with 1,000 staff in the EU), and both have hired senior former execs from vendors such as Nortel and Motorola. Alcatel-Lucent and Motorola are battling to sustain critical mass, each with a mix of technologies, within ~$5bn revenue business units.

We had forecast Nortel’s 5% share would dwindle to 3% in ’09 (despite the part purchase by NSN), and Motorola seems unlikely to get the LTE wins it badly needs after abandoning direct 3G sales. ALU won a slice of Verizon’s LTE rollout (though it may be struggling with its EPC core product), and hopes for a role in China Mobile’s TD-LTE rollouts, but lacks WCDMA accounts to migrate. Huawei’s market share gains came from radio access more than core networks, but we hear it recently won Telefónica for LTE. NSN was late on its HSPA roadmap (to 7.2Mbps and 14.4Mbps), and lacks traction in packet core. It won new customers in Canada and seeks a role in AT&T’s LTE rollout, but is likely to lose share in ’09. Buying Nortel is a final (further) bid for scale, but invites risks around retaining customers and integrating LTE product lines. Finally, Ericsson’s no. 1 market share looks stable, but it has been forced to respond to fresh lows in pricing from its Asian rivals, now equally adept at producing leading-edge technology, even if their delivery capability still lags.

Elusive Economics

The same issues that plagued WCDMA also make LTE elusive: coverage, network performance, terminals, and volume production of standard equipment. Operators have given vendors a list of issues to resolve in networks (especially around EPC) and terminals. Verizon has its own technical specs relating to transmit output power and receive sensitivity, and requires tri-band support. We think commercialising LTE will require vendors to commit $1bn+ in R&D over three to five years, based on teams of 2,000-3,000 engineers. LTE comes at a time when every major OEM is seeking €1bn of cost savings via restructuring, yet must also match plunging price levels.

Recent bids at a host of operators across a range of markets (i.e., emerging and developed) show no easing of pricing pressure. As a benchmark, if pricing starts out at 100, final prices may be <50, given “market entry” strategies, bundling within deals, or “gaming” bids to reduce incumbents’ profits at “house accounts.”  Competition remains intense. KDDI has eight vendors pitching for LTE business (Ericsson, NSN, ALU, Hitachi, Motorola, Samsung, Huawei, and NEC), with pricing “very important.”  Telefónica just awarded a radio and core network LTE deal to Huawei, which has been joined by ZTE in getting large Chinese orders and accessing ample export credit financing (as has Ericsson, via Sweden’s EKN).

Operators are pressuring vendors to add capacity at low incremental costs, with ever more sophisticated purchasing: Vodafone has a Luxembourg office that has run 3,000+ e-auctions; China Unicom did the same for 3G, squeezing prices. Operators are also hiring third-party benchmarking firms, which help unpack complex “black box” software pricing models.

It is no coincidence that every OEM saw a sharp structural decline in profitability during ’07, and none had recovered margins by 1Q09. (We cannot chart this precisely, since ALU, NSN, and others do not disclose wireless equipment-only profits, but Ericsson’s Networks margins offer a clear proxy.)  Vendors’ ongoing restructuring has not rid the industry of overcapacity, only shifted it down the value chain. Every OEM needs 50%+ of its cost base in low-cost countries by decade’s end. While ALU’s and NSN’s painful experience hardly recommends it, some M&A (or partial closures) has already begun with Nortel, and must spread to Motorola.

It took Ericsson six years of commercial WCDMA shipments before it neared the level of 2G sales; indeed, WCDMA base station shipments surpassed GSM in 1Q09, driven by China (with APAC now 40% of the WCDMA market). Figure 2 shows our view that each successive wireless infrastructure generation yields a smaller addressable market, thanks in part to pricing. GSM sales peaked in ’08 and could fall 15% in ’09 as unit shipments peak, then drop sharply in ’11/’12. In WCDMA, shipments should rise 25% in ’09, but sales are likely to increase just 8-10%, led by the US and China, peaking in ’13/’14 on low-cost emerging markets deals.

Figure 2: Deflation and Delays in Successive Technology Generations

[Figure]

Source: Arete Research

We thought there were 300-400k Node Bs shipped by mid-’09; this may surpass 1m by YE’09 but seems unlikely to scale to the 3-4m cumulative GSM BTS deployed. The shrinking of addressable markets between generations and the shift to emerging markets invites further cost pressure. Speeding up LTE may leave a “hole” in OEM earnings.

There are no more “easy wins” for OEMs to boost margins from product re-design or squeezing suppliers. Sub-systems vendors like Powerwave and Commscope are struggling and can no longer afford to make product variants for each OEM. Ancillary costs (commodities, transport, energy) remain volatile and OEMs are often contractually obliged to deliver cost savings under managed services deals. Scores of smaller chipmakers have LTE basebands for base stations, but TI still has 80%+ market share. Cost pressures forced OEMs to adopt third-party products for femto-cells and WiMax. LTE aside, all OEMs are seeking project- and services-led deals (a trend we saw back in Managed Services: Gangland Warfare? June ’06). While it “locks in” customers, this people-intensive approach inherently lacks operating leverage.

LTE also awaits spectrum allocations (2.6GHz, digital dividend, or re-farming of 900MHz) that could affect industry economics, or tilt them towards HSPA+. This wide range of frequency bands limits scale economies and adds RF costs to devices. Terminals are a final challenge: Industry R&D staff were gushing about HSPA-enabled tablet devices back in mid-’07, yet they are only coming at YE’09 or by mid-’10. The same applies to “visionary” LTE device strategies: after a year of single-mode datacards (stretching into ’11), multi-mode handsets might come, followed by CE products with slots for SIM cards (or software solutions for this). Adding LTE modems to all CE devices is cost-prohibitive, and would require new business models from operators, with several iterations needed to cut chipset costs.

IPR remains a contentious and unresolved issue in both LTE and WiMax; QCOM and IDCC declarations to ETSI were preliminary filings; some have already expired, some have continuations, and some were re-filed. Many LTE IPR holders have not yet shown their hand, much as in WiMax, where numerous key companies are not in Intel’s Open Patent Alliance. A sizable number of handset OEMs are working on their own LTE chipsets to build up IPR and avoid future royalties. NGMN speaks for 14 operators, many of whom also have their own IPR portfolios. Ground rules are unclear: will there be FRAND in LTE?

Coping with Traffic

Operators have numerous low-cost ways to add capacity (coding schemes, offloading traffic, adding carriers, etc.). We hear line cards for second carriers in a Node B cost as little as €2,000, before (controversial) associated software costs, which OEMs hoped would scale with subscribers, traffic, and speeds, but operators sought to contractually “cap.”  Most Node Bs are still not capable of handling 7.2Mbps.  Operators are also shifting investment from radio capacity (now in ample supply) to backhaul (which scales more directly with traffic), and seek to avoid new cells (a.k.a. “densification”), which add costs for rent, power, and maintenance. GSM micro-cells were deployed for coverage, but operators will not build 5,000+ 3G micro-cells. Vodafone said ~10% of its sites generate ~50% of its data traffic. On average, 3G networks are currently 10-20% utilised; only “hotspots” (airports, key metro hubs) are near 50–60%. We think mobile broadband depends in part on use of, and integration with, fixed broadband.  This “offload” makes more sense as 3G network traffic originates from “immobile” PCs using USB modems, near a fixed line connection.

Is There a Role for WiMax?

After three years of hype and delays, WiMax is finally getting deployed, with Clearwire, Yota (a Russian Greenfield operator with 75K subs), and UCOM (backed by KDDI, with 8,000 free trial subs) the highest-profile launches. Efforts to cut chipset costs are ongoing. Intel is moving to 45nm in ’10, and its rivals, e.g., Sequans, Beceem, and GCT, are seeing volumes ramp. WiMax chipsets are now $35-50, and must drop under $20 in ‘10 to match HSPA roadmaps. IOT should get easier as 802.16e networks become common, and more devices emerged at May ’09’s Computex fair. The roster of infrastructure vendors is seeing ALU and NSN retreat, leaving Motorola, Alvarion, Samsung, and possibly Cisco (for enterprise networks). Spectrum allocations remain uneven, with most new projects in emerging markets using WiMax as a DSL substitute. WiMax IPR remains controversial, fragmented, and lacking basic ground rules (i.e., FRAND). Intel has not won over potential heavyweight claimants like Qualcomm or Nokia in its Open Patent Alliance. As a data-only service, WiMax has a narrow window in which to reach critical mass before LTE rollouts subsume it. There remain too many differences between LTE and WiMax (frame structure, uplink transmission format, FDD vs. TDD, etc.) to merge them.

One long-promised solution is femto-cells, as part of so-called patch networks, which shift and intelligently re-route traffic onto fixed networks. Femto-cells have been through seemingly endless trials covering issues of distribution, support, network management, pricing, and customer propositions. As ever, femto-cells sit on the cusp of large-scale rollouts (due in ’10 or later) that depend on pricing and whether operators also have converged offerings. Regional incentives vary: The US needs coverage and to limit use of repeaters, Europe needs to ease congestion for specific users, and Japan might use femto-cells to integrate home devices.

All operators are targeting structurally lower capex/sales ratios. In emerging markets, the “mini-boom” in ’08 spending in Russia and Latin America is over. Attention is shifting to hotly contested 3G rollouts in China and India, both highly fragmented markets. India has six large established operators and half a dozen other projects, while China is split by technologies, provinces, and operators. Without over-engineering for “five nines” reliability, will developing world 3G be as profitable as GSM or CDMA? We already saw pricing fall by 30-50% in successive rounds of bids for China Unicom’s vast 3G rollout deal.

Will Anyone Get the Grapes?

Standing back from the hype, we struggle to see who really wants LTE to come in a hurry: Verizon and others are highly profitable, and have years to harvest cash flows from existing networks. Vendors’ R&D teams cannot resist the siren song of a wholly new technology, despite blindingly obvious drawbacks. None of these groups has excess cash to burn, though some are trying to force an end-game (as seen by NSN’s attempt to increase its relevance to US operators by buying Nortel). There is no doubt that wireless infrastructure is a deflationary industry; its last great success at rebuilding margins came from shifting costs onto a now moribund supply chain. We expect LTE and the NSN-Nortel deal (and another likely move involving Motorola) to usher in a period of highly political competition for “strategic accounts” and fresh attempts to “buy” share.




IMPORTANT DISCLOSURES

For important disclosure information regarding the companies in this report, please call +44 (0)207 959 1300, or send an email to michael.pizzi@arete.net.

This publication was produced by Arete Research Services LLP (“Arete”) and is distributed in the US by Arete Research, LLC (“Arete LLC”).

Arete’s Rating System. Long (L), Neutral (N), Short (S). Analysts recommend stocks as Long or Short for inclusion in Arete Best Ideas, a monthly publication consisting of the firm’s highest conviction recommendations. Being assigned a Long or Short rating is determined by a stock’s absolute return potential and other factors, which may include share liquidity, debt refinancing, estimate risk, economic outlook of principal countries of operation, or other company or industry considerations. Any stock not assigned a Long or Short rating for inclusion in Arete Best Ideas is deemed to be Neutral. A stock’s return potential represents the difference between the current stock price and the target price.

Arete’s Recommendation Distribution.  As of 31 March 2009, research analysts at Arete have recommended 20.9% of issuers covered with Long (Buy) ratings, 14.9% with Short (Sell) ratings, with the remaining 64.2% (which are not included in Arete Best Ideas) deemed Neutral. A list of all stocks in each coverage group can be found at www.arete.net.

Required Disclosures. Analyst Certification: the research analyst(s) whose name(s) appear(s) on the front cover of this report certify that: all of the views expressed in this report accurately reflect their personal views about the subject company or companies and its or their securities, and that no part of their compensation was, is, or will be, directly or indirectly, related to the specific recommendations or views expressed in this report.

Research Disclosures. Arete Research Services LLP (“Arete”) provides investment advice for eligible counterparties and professional clients. Arete receives no compensation from the companies its analysts cover, does no investment banking, market making, money management or proprietary trading, derives no compensation from these activities and will not engage in these activities or receive compensation for these activities in the future. Arete’s analysts are based in London, authorized and regulated by the UK’s Financial Services Authority (“FSA”); they are not registered as research analysts with FINRA. Additionally, Arete’s analysts are not associated persons and therefore are not subject to Rule 2711 restrictions on communications with a subject company, public appearances and trading securities held by a research analyst account. Arete restricts the distribution of its research services to approved persons only.

Reports are prepared for non-private customers using sources believed to be wholly reliable and accurate but which cannot be warranted as to accuracy or completeness. Opinions held are subject to change without prior notice. No Arete director, employee or representative accepts liability for any loss arising from the use of any advice provided. Please see www.arete.net for details of any interests held by Arete representatives in securities discussed and for our conflicts of interest policy.

© Arete Research Services LLP 2009. All rights reserved. No part of this report may be reproduced or distributed in any manner without Arete’s written permission. Arete specifically prohibits the re-distribution of this report and accepts no liability for the actions of third parties in this respect.

Arete Research Services LLP, 27 St John’s Lane, London, EC1M 4BU, Tel: +44 (0)20 7959 1300
Registered in England: Number OC303210
Registered Office: Fairfax House, 15 Fulwood Place, London WC1V 6AY
Arete Research Services LLP is authorized and regulated by the Financial Services Authority

US Distribution Disclosures. Distribution in the United States is through Arete Research, LLC (“Arete LLC”), a wholly owned subsidiary of Arete, registered as a broker-dealer with the Securities and Exchange Commission (SEC) and the Financial Industry Regulatory Authority (FINRA). Arete LLC is registered for the purpose of distributing third-party research. It employs no analysts and conducts no equity research. Additionally, Arete LLC conducts no investment banking, market making, money management or proprietary trading, derives no compensation from these activities and will not engage in these activities or receive compensation for these activities in the future. Arete LLC accepts responsibility for the content of this report.

Section 28(e) Safe Harbor.  Arete LLC has entered into commission sharing agreements with a number of broker-dealers pursuant to which Arete LLC is involved in “effecting” trades on behalf of its clients by agreeing with the other broker-dealer that Arete LLC will monitor and respond to customer comments concerning the trading process, which is one of the four minimum functions listed by the Securities and Exchange Commission in its latest guidance on client commission practices under Section 28(e). Arete LLC encourages its clients to contact Anthony W. Graziano, III (+1 617 357 4800 or anthony.graziano@arete.net) with any comments or concerns they may have concerning the trading process.

Arete Research LLC, 3 Post Office Square, 7th Floor, Boston, MA 02109, Tel: +1 617 357 4800


Full Article: Mobile Broadband: Urgent need for new business models

Summary: While the market for mobile broadband services (3G/WiMax/dongles/netbooks etc.) is growing explosively, today’s telco propositions are based on outmoded business models which threaten profitability. Telco 2.0 proposes innovative retail and wholesale approaches to improve returns.

This 30+ page article can be downloaded in PDF format here. The Executive Summary is reproduced below.

Executive summary & recommendations

At present, the majority of mobile broadband subscribers are engaged through traditional monthly contracts, typically over 12-24 month periods. This is true for standalone modems and especially for embedded-3G notebooks. There are also some popular prepaid offerings, especially in markets outside North America.

However, further evolution is necessary. Many consumers will not want another monthly commitment, especially if they are infrequent users. Operators will be wary of subsidising generic computing devices for the non-creditworthy.

We expect a variety of new business models to emerge and take a significant share of the overall user base, including:

  • Session-based access, similar to the familiar WiFi hotspot model;
  • Bundling of mobile broadband with other services, for example as an adjunct to fixed broadband or mobile voice services;
  • Free, guest or “sponsored” mobile broadband, paid for by venue owners or event organisers;
  • “Comes-with-data-included” models, where the upfront device purchase price includes connectivity, perhaps for a year;
  • Two-sided business models, with mobile access subsidised by “upstream” parties like advertisers or governments, rather than direct end-user payment.

Transition to these models will not be easy. There are question marks about the convenience of using physical SIM cards, especially for temporary access. Distribution, billing and support models will need re-evaluation, as will definitions and metrics. Terms like ARPU and “subscription” will have less relevance as conventional “subscribers” drop to perhaps 40% of the overall mobile broadband user base. Operators and vendors need to face up to these challenges as soon as possible.

Figure 3: Mobile broadband can support both subscription & transient models

[Figure]

Source: Telco 2.0

Recommendations for mobile operators & retailers

Business models and business planning

  • Calculate your production cost per GB of data based on the real cost of adding extra new capacity, rather than just using up the “sunk costs” of current radio assets (a worked sketch follows this list);
  • Reinterpret mobile broadband business plans based on potential capex reductions and delayed capacity upgrades during recession;
  • Develop a broad range of business models / payment options, including long-term contracts, prepaid accounts, session-based services, bundles and mechanisms for enabling “free” or “sponsored” connections. Do not think solely in terms of “subscribers” as most future users will not have “subscriptions”;
  • Examine “two-sided” Telco 2.0 business models as mechanisms for gaining mobile broadband revenue streams, for example through advertisers and governments.
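To make the first recommendation above concrete, the sketch below derives a production cost per GB from incremental capacity spend rather than sunk costs. All input figures are illustrative assumptions, not operator data.

```python
# Illustrative marginal cost per GB, based on incremental capacity spend only.
# All figures are assumptions for the sake of the example, not operator data.
incremental_capex = 50_000_000      # spend on new radio/backhaul capacity this year
incremental_opex = 10_000_000       # extra power, backhaul leases and maintenance for that capacity
added_capacity_gb = 40_000_000      # additional GB/year the new capacity can actually carry

cost_per_gb = (incremental_capex + incremental_opex) / added_capacity_gb
print(f"Marginal production cost: {cost_per_gb:.2f} per GB")  # 1.50 per GB in this example

# A "sunk cost" view, which spreads historic spend over all traffic,
# typically understates what the next GB of traffic growth really costs.
```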

Marketing and distribution

  • Be extremely careful about marketing mobile broadband as a direct alternative to DSL / cable. You may also need those wired broadband lines for future femtocells or WiFi offload;
  • Be realistic about the future mix of dongles vs. embedded modules. Customers (and salespeople) like dongles, so despite the theoretical attractions of embedded, don’t kill the golden goose. Instead, look at ways to add value to the dongle proposition;
  • Partner with large IT services and integration firms to deliver mobile broadband solutions to the enterprise, rather than point products.

Network planning

  • In dense areas, spectrum and network capacity are generally too valuable to waste on those users who are not “truly mobile”;
  • Only use application-specific traffic management if you are prepared to openly publish details of your network policies. Vague terms on “fair usage” are likely to be counter-productive and challenged by law and the Internet community;
  • Consider potential scenarios around new high-bandwidth applications appearing across the user base (e.g. high-definition video, enhanced always-on social networking etc). Put in place strong links between your device, web application and radio network departments to anticipate effects.

Technology planning

  • Look at the evolution of devices and software to understand likely opportunities & threats in the way they use the network (e.g. always-on connection whilst “off”, background applications pulling down traffic in “quiet” periods, new browser types or video codecs etc);
  • Push vendors and standards bodies towards mechanisms for enabling session-based access for mobile broadband. This may need compromises on SIMs or roaming / multi-operator partnerships.

Organisation

  • Develop a separate, arm’s-length, wholesale division able to offer mobile broadband to MVNOs, Internet players, device/content vendors or vertical-market specialists on a non-discriminatory basis.

Recommendations for network equipment suppliers

Business models and business planning

  • Better understand the mix of traffic by device type on operator customers’ networks, as this will drive their future upgrade / enhancement plans. A move to PC-dominated networks may need a very different architecture from phone-oriented designs;
  • Develop network-upgrade business cases against realistic growth in device types, application consumption and changing usage patterns.

Product Development

  • Look at new managed service opportunities arising around the MID and “mobilised” broadband consumer electronics device ecosystems, for example in content or application management, service and support etc;
  • Look at mechanisms for supporting non-SIM or multi-SIM models for mobile broadband, especially for users with multiple devices;
  • Optimise backhaul and network-offload solutions to cope with expected trends in mobile broadband. Integrate WiFi or femtocells with “split tunnel” architectures to “dump traffic onto the Internet”;
  • Develop data-mining and analytics solutions to help operators better understand the usage models for mobile broadband, and customise their networks and offerings to target end users more effectively.

Marketing and distribution

  • Be wary of over-hyping peak network speeds in marketing material rather than focusing on overall aggregate network capacity;
  • Position WiMAX networks as ideal platforms for innovative end-to-end device, connectivity and application concepts.

Recommendations for device & component vendors

Business models and business planning

  • Consider issues around macro-network offload, specifically the ability to easily recognise and preferentially connect via femtocells or WiFi;
  • Expect the MID, consumer electronics and M2M markets for mobile broadband to be fragmented and possibly delayed by recession. Focus on partner programmes, tools and consulting/integration services to enable the creation of new device types and business models;
  • Do not expect markets with a heavy prepay bias for mobile phones to be enthusiastic about long-term contracts for notebook-based mobile broadband;
  • Be very wary about operator software acting as a “control point” on the notebook, especially in terms of application monitoring / blocking / advertising. As handsets become more open, there are few arguments for PCs to become closed;
  • Anticipate support questions around issues like network coverage, signal strength etc. and have processes in place to deal with these;
  • Consider new business models for WWAN-enabled notebooks supported by advertisers, content or Internet companies, governments etc;
  • Support WiMAX as well as 3G / LTE in new device platforms – it seems likely that some WiMAX operators will be more open to experimentation with new business models, as they have less legacy to protect from cannibalisation.

Product Development

  • Add value to dongles by supporting other functions like GPS, video, memory, WiFi, MP3 etc. Also use physical design to differentiate and to position external modems as “cool”;
  • Encourage the development of “free” / 3rd-party paid models for mobile broadband to drive modem adoption among users unwilling to pay for access themselves;
  • Consider developing your own portfolio of value-added services that can exploit the WWAN connection – e.g. managed security and backup;
  • Everyone with a WWAN-enabled notebook or MID will have a mobile phone as well. Endeavour to make them work well together and exploit each other’s capabilities.

Marketing and distribution

  • Encourage operator partners to support a broader range of business models to extend the addressable market to customers unwilling to sign 24-month contracts for mobile data;
  • Look at channels for temporary modem rentals / provision to venue or event delegates;
  • Examine non-operator routes to market for “vanilla” modules and modems, and support this usage model. For example, set up a web portal with methods highlighting how to acquire temporary SIM+data plans in different countries;
  • Push OS suppliers towards richer APIs in connection managers that can tell applications various characteristics about the network being used: signal strength, macro vs. femtocell, maybe even measured latencies and packet loss. Maybe also expose details of alternative radio bearers (see the sketch after this list);
  • Push module vendors towards pricing models that are geared into future service uptake / expenditure;
  • Work closely with software vendors to ensure optimised performance of connection managers, browsers and other application environments;
  • Look at bundling opportunities via operators, for example phone + netbook combinations.
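A hypothetical illustration of what such a richer connection-manager API might expose to applications is sketched below; the class and field names are our own invention, not any existing OS or vendor interface.

```python
# Hypothetical sketch of the information a richer connection-manager API could expose
# to applications. Names and fields are illustrative only; no real OS or vendor API is implied.
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class BearerInfo:
    technology: str                 # e.g. "HSPA", "LTE", "WiFi"
    cell_type: str                  # "macro", "femto", "hotspot"
    signal_strength_dbm: int
    measured_latency_ms: Optional[float] = None
    measured_loss_pct: Optional[float] = None
    is_roaming: bool = False

@dataclass
class ConnectionStatus:
    active: BearerInfo
    alternatives: List[BearerInfo]  # other radio bearers currently available

def choose_video_quality(status: ConnectionStatus) -> str:
    """Example of an application adapting its behaviour to network conditions."""
    bearer = status.active
    if bearer.technology == "WiFi" or (bearer.measured_latency_ms or 999) < 60:
        return "high"
    if bearer.is_roaming or (bearer.cell_type == "macro" and bearer.signal_strength_dbm < -100):
        return "low"
    return "medium"
```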

© Copyright 2009. STL Partners. All rights reserved.
STL Partners published this content for the sole use of STL Partners’ customers and Telco 2.0™ subscribers. It may not be duplicated, reproduced or retransmitted in whole or in part without the express permission of STL Partners, Elmwood Road, London SE24 9NU (UK). Phone: +44 (0) 20 3239 7530. E-mail: contact@telco2.net. All rights reserved. All opinions and estimates herein constitute our judgment as of this date and are subject to change without notice.

Full Article: Video Distribution 2.0 – How to fix a broken value chain

NB A full PDF copy of this briefing can be downloaded here.

This special Executive Briefing report summarises the brainstorming output from the Content Distribution 2.0 (Broadband Video) section of the 6th Telco 2.0 Executive Brainstorm, held on 6-7 May in Nice, France, with over 200 senior participants from across the Telecoms, Media and Technology sectors. See: www.telco2.net/event/may2009.

It forms part of our effort to stimulate a structured, ongoing debate within the context of our ‘Telco 2.0’ business model framework (see www.telco2research.com).

Each section of the Executive Brainstorm involved short stimulus presentations from leading figures in the industry, group brainstorming using our ‘Mindshare’ interactive technology and method, a panel discussion, and a vote on the best industry strategy for moving forward.

There are 5 other reports in this post-event series, covering the other sections of the event: Retail Services 2.0, Enterprise Services 2.0, Piloting 2.0, Technical Architecture 2.0, and APIs 2.0. In addition there will be an overall ‘Executive Summary’ report highlighting the overall messages from the event.

Each report contains:

  • Our independent summary of some of the key points from the stimulus presentations
  • An analysis of the brainstorming output, including a large selection of verbatim comments
  • The ‘next steps’ vote by the participants
  • Our conclusions of the key lessons learnt and our suggestions for industry next steps.

 

The brainstorm method generated many questions in real-time. Some were covered at the event itself and others we have responded to in each report. In addition we have asked the presenters and other experts to respond to some more specific points.

 

Background to this report

The demand for internet video is exploding. This is putting significant stress on the current fixed and mobile distribution business model. Infrastructure investments and operating costs required to meet demand are growing faster than revenues. The strategic choices facing operators are to charge consumers more when they expect to pay less, to risk upsetting content providers and users by throttling bandwidth, or to unlock new revenues to support investment and cover operating costs by creating new valuable digital distribution services for the video content industry.

Brainstorm Topics

  • A summary of the new Telco 2.0 Online Video Market Study: Options and Opportunities for Distributors in a time of massive disruption.
  • What are the most valuable new digital distribution services that telcos could create?
  • What is the business model for these services – who are the potential buyers and what are the priority opportunity areas?
  • What progress has been made in new business models for video distribution – including FTTH deployment, content-delivery networking, and P2P?
  • Preliminary results of the UK cross-carrier trial of sender-pays data
  • How the TM Forum’s IPSphere programme can support video distribution

 

Stimulus Presenters and Panellists

  • Richard D. Titus, Controller, Future Media, BBC
  • Trudy Norris-Gray, MD Transformation and Strategy, BT Wholesale
  • Scott Shoaf, Director, Strategy and Planning, Juniper Networks
  • Ibrahim Gedeon, CTO, Telus
  • Andrew Bud, Chairman, Mobile Entertainment Forum
  • Alan Patrick, Associate, Telco 2.0 Initiative

 

Facilitator

  • Simon Torrance, CEO, Telco 2.0 Initiative

 

Analysts

  • Chris Barraclough, Managing Director, Telco 2.0 Initiative
  • Dean Bubley, Senior Associate, Telco 2.0 Initiative
  • Alex Harrowell, Analyst, Telco 2.0 Initiative

 

Stimulus Presentation Summaries

Content Distribution 2.0

Scott Shoaf, Director, Strategy and Planning, Juniper Networks opened the session with a comparison of the telecoms industry’s response to massive volumes of video and that of the US cable operators. He pointed out that the cable companies’ raison d’être was to deliver vast amounts of video; therefore their experience should be worth something.

The first question, however, was to define the problem. Was the problem the customer, in which case the answer would be to meter, throttle, and cap bandwidth usage? If we decided this was the solution, though, the industry would be in the position of selling broadband connections and then trying to discourage its customers from using them!

Or was the problem not one of cost, but one of revenue? Networks cost money; the cloud is not actually a cloud, but is made up of cables, trenches, data centres and machines. Surely there wouldn’t be a problem if revenues rose with higher usage? In that case, we ought to be looking at usage-based pricing, but also at alternative business models – like advertising and the two-sided business model.

Or is it an engineering problem? It’s not theoretically impossible to put in bigger pipes until all the HD video from everyone can reach everyone else without contention – but in practice there is always some degree of oversubscription. What if we focused on specific sources of content? Define a standard of user experience, train the users to that, and work backwards?

If it is an engineering problem, the first step is to reduce the problem set. The long tail obviously isn’t the problem; it’s too long, as has been pointed out, and doesn’t account for very much traffic. It’s the ‘big head’ or ‘short tail’ stuff that is the heart of the problem: we need to deal with this short tail of big traffic generators. We need a CDN or something similar to deliver this content.

On cable, the customers are paying for premium content – essentially movies and TV – and the content providers are paying for distribution. We need to escape from the strict distinctions between Internet, IPTV, and broadcast. After all, despite the alarming figures for people leaving cable, many of them are leaving existing cable connections to take a higher grade of service. Consider Comcast’s Fancast – focused on users, not lines, with an integrated social-recommendation system, it integrates traditional cable with subscription video. Remember that broadcast is a really great way to deliver!

Advertising – at the moment, content owners are getting 90% of the ad money.

Getting away from this requires us to standardise the technology and the operational and commercial practices involved. The cable industry is facing this with the SCTE130 and Advanced Advertising 1.0 standards, which provide for fine-grained ad insertion and reporting. We need to blur the definition of TV advertising – the market is much bigger if you include Internet and TV ads together. Further, 20,000 subscribers to IPTV aren’t interesting to anyone – we need to attack this across the industry and learn how to treat the customer as an asset.

 

The Future of Online Video, 6 months on

Alan Patrick, Associate, Telco 2.0 updated the conference on how things had changed since he introduced the “Pirate World” concept from our Online Video Distribution strategy report at the last Telco 2.0 event. The Pirate World scenario, he said, had set in much faster and more intensely than we had expected, and was working in synergy with the economic crisis.

Richard Titus, Controller, Future Media, BBC: “I have no problem with carriers making money, in fact, I pay over the odds for a 50Mbit/s link, but the real difference is between a model that creates opportunities for the public and one which constrains them.”

Ad revenues were falling; video traffic still soaring; rights-holders’ reaction had been even more aggressive than we had expected, but there was little evidence that it was doing any good. Entire categories of content were in crisis.

On the other hand, the first stirrings of the eventual “New Players Emerge” scenario were also observable; note the success of Apple in creating a complete, integrated content distribution and application development ecosystem around its mobile devices.

The importance of CPE is only increasing; especially with the proliferation of devices capable of media playback (or recording) and interacting with Internet resources. There’s a need for a secure gateway to help manage all the gadgets and deliver content efficiently. Similarly, CDNs are only becoming more central – there is no shortage of bandwidth, but only various bottlenecks. It’s possible that this layer of the industry may become a copyright policing point.

We think new forms of CPE and CDNs are happening now; efforts to police copyright in the network are in the near future; VAS platforms are the next wave after that, and then customer data will become a major line of business.

Most of all, time is flying by, and the overleveraged, or undercapitalised, are being eaten first.

 

The Content Delivery Framework

Ibrahim Gedeon, CTO, Telus introduced some lessons from Telus’s experience deploying both on-demand bandwidth and developer APIs. Telcos aren’t good at content, he said; instead, we need to be the smartest pipe and make use of our trusted relationship with customers, built up over the last 150 years.

We’re working in an environment where cash is scarce and expensive, and pricing is a zero- or even negative-sum game; impossible to raise prices, and hard to cut without furthering the price war. So what should we be doing? A few years ago the buzzword was SDP; now it’s CDN. We’d better learn what those actually mean!

Trudy Norris-Gray, Managing Director, BT Wholesale: “There is no capacity problem in the core, but there is to the consumer – and three bad experiences means the end of an application or service for that individual user.”

Anyway, we’re both a mobile and fixed operator and ISP, and we’ve got an IPTV network. We’ve learned the hard way that technology isn’t our place in the value chain. When we got the first IPTV system from Microsoft, it used 2,500 servers and far, far too much power. So we’re moving to a CDF (Content Delivery Framework) – which looks a lot like a SDP. Have the vendors just changed the labels on these charts?

So why do we want this? So we can charge for bandwidth, of course; if it was free, we wouldn’t care! But we’re making around $10bn in revenues and spending 20% of that in CAPEX. We need a business case for this continued investment.

We need the CDF to help us to dynamically manage the delivery and charging process for content. There was lots of goodness in IMS, the buzzword of five years ago, and in SDPs. But in the end it’s the APIs that matter. And we like standards because we’re not very big. So, we want to use TM Forum’s IPSphere to extend the CDF and SDF; after all, in roaming we apply different rate cards dynamically and settle transactions, so why not here too, for video or data? I’d happily pay five bucks for good 3G video interconnection.

And we need to do this for developer platforms too, which is why we’re supporting the OneAPI reference architecture. To sum up, let’s not forget subscriber identity, online charging – we’ve got to make money – the need for policy management because not all users are equal, and QoS for a differentiated user experience.

 

Sender-Pays Data in Practice

Andrew Bud, Chairman, MEF gave an update on the trial of sender-pays data he announced at the last event. This is no longer theoretical, he said; it’s functioning, just with a restricted feature set. Retail-only Internet has just about worked so far, because people pay for access through their subscriptions and the services themselves are free. Video breaks this, he said; it will be impossible to be comprehensive, meaningful, and sustainable.

You can’t, he said, put a meaningful customer warning that covers all the possible prices you might encounter due to carrier policy with your content; and everyone is scared of huge bills after the WAP experience. Further, look at the history of post offices, telegraphy and telephony – it’s been sender-pays since the 1850s. Similarly, Amazon.com is sender-pays, as is Akamai.

Hence we need sending-party-pays data – that way, we can have truly free ads: not ones where the poor end user ends up paying the delivery cost!

Our trial: we have relationships with carriers making up 85% of the UK market. We have contracts, priced per MB of data, with them. And we have four customers – Jamster (who brought you the Crazy Frog), Shorts, THMBNLS (who produce mobisodes promoting public health), and Creative North (mobile games as a gift from the government). Of course, without sender-pays this is impossible.
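To show why the per-MB wholesale rate matters so much to the content business model, the sketch below works out what delivery adds to the cost of a single mobisode; the file size and per-MB rates are purely illustrative assumptions, not the trial’s actual contract terms.

```python
# Illustrative sender-pays economics for a single piece of content.
# The file size and wholesale per-MB rates are assumptions for the example,
# not the actual terms of the UK trial described above.
mobisode_size_mb = 8.0                 # a short, mobile-optimised video episode
wholesale_rates_per_mb = {             # what the content provider pays the carrier, GBP per MB
    "aggressive": 0.01,
    "typical": 0.05,
    "set_too_high": 0.20,
}

for label, rate in wholesale_rates_per_mb.items():
    delivery_cost = mobisode_size_mb * rate
    print(f"{label:>12}: delivery adds £{delivery_cost:.2f} per download")
# At the high end, delivery alone can rival the retail price of the content itself,
# which is the "prices set too high" problem noted below.
```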

We’ve discovered that the carriers have no idea how much data costs; wholesale pricing has some very interesting consequences. Notably the prices are being set too high. Real costs and real prices mean that quality of experience is a real issue; it’s a very complicated system to get right. The positive sign, and ringing endorsement for the trial, is that some carriers are including sender-pays revenue in their budgets now!

 

Participant Feedback

Introduction

The business of video is a prime battleground for Telco 2.0 strategies. It represents the heaviest data flows, the cornerstone of triple/quad-play bundling, powerful entrenched interests from broadcasters and content owners, and a plethora of regulators and industry bodies. For many people, it lies at the heart of home-based service provision and entertainment, as well as encroaching on the mobile space. The growth of P2P and other illegal or semi-legal download mechanisms puts pressure on network capacity – and invites controversial measures around protecting content rights and Net Neutrality.

In theory, operators ought to be able to monetise video traffic, even if they don’t own or aggregate content themselves. There should be options for advertising, prioritised traffic or blended services – but these are all highly dependent on not just capable infrastructure, but realistic business models.  Operators also need to find a way to counter the ‘Network Neutrality’ lobbyists who are confounding the real issue (access to the internet for all service providers on a ‘best efforts’ basis) with spurious arguments that operators should not be able to offer premium services, such as QoS and identity, to customers that want to pay for them.  Telco 2.0 would argue that the right to offer (and the right to buy) a better service is a cornerstone of capitalism and something that is available in every other industry.  Telecoms should be no different.  Of course, it remains up to the operators to develop services that customers are willing to pay more for…

A common theme in the discussion was “tempus fugit” – time flies. The pace of evolution has been staggering, especially in Internet video distribution – IPTV, YouTube, iPlayer, Hulu, Qik, P2P, mashups and so forth. Telcos do not have the luxury of time for extended pilot projects or grandiose collaborations that take years to come to fruition.

With this timing issue in mind, the feedback from the audience was collected in three categories, although here the output has been aggregated thematically, as follows:

  • STOP – What should we stop doing?
  • START – What should we start doing?
  • DO MORE – What things should we do more of?

 

Feedback: STOP the current business model

There was broad agreement that the current model is unsustainable, especially given the demands that “heavy” content like video traffic places on the network…

·         [Stop] giving customers bandwidth for free [#5]

·         Stop complex pricing models for end-user [#9]

·         Stop investing so much in sustaining old order [#18]

·         Stop charging mobile subscribers on a per megabyte basis. [#37]

·         Current peering agreement/ip neutrality is not sustainable. [#41]

·         [Stop] assuming things are free. [#48]

·         [Stop] lowering prices for unlimited data. [#61]

·         Have to develop more models for upstream charging for data rather than just flat rate to subscribers. [#11]

·         Build rational pricing segmentation for data to monetize both sides of the value chain with focus on premium value items. [#32]

 

Feedback: Transparency and pricing

… with many people suggesting that Telcos first need to educate users and service providers about the “true cost” of transporting data… although whether they actually know the answer themselves is another question, as it is as much an issue of accounting practices as of network architecture.

·         Make the service providers aware of the cost they generate to carriers. [#31]

·         Make pricing transparency for consumers a must. [#10]

·         Mobile operators start being honest with themselves about the true cost of data before they invest in LTE. [#7]

·         When resources are limited, then rationing is necessary. Net Neutrality will not work. Today people pay for water in regions where it is limited in supply. Its use is abused when there are no limits. [#17]

·         Start being transparent in data charges, it will all stay or fall with cost transparency. [#12]

·         You can help people understand usage charges, with meters or regular updates, requires education for a behavioural change, easier for fixed than mobile. [#14]

·         Service providers need to have a more honest dialogue with subscribers and give them confidence to use services [#57]

·         As an industry we must invest more in educating the market about network economics, end-users as well as service providers. [#58]

·         Start charging subscribers flat rate data fee rather than per megabyte. [#46]

Feedback: Sender-pays data

Andrew Bud’s concept of “sender-pays data”, in which a content provider bundles the notional cost of data transport into the download price for the consumer, generated both enthusiasm and concerns (although very little outright disagreement). Telco 2.0 agrees with the fundamental ‘elegance’ of the notion, but thinks that there are significant practical, regulatory and technical issues that need to be resolved. In particular, the opportunity for delivering “monolithic” chunks of content like movies may be limited, especially in mobile networks where data traffic is dominated by PCs with mobile broadband, usually conducting a wide variety of two-way applications like social networking.

Positive

·         Sender pays is the only sane model. [#6]

·         Do sender pays on both ‘sides’ consumer as well…gives ‘control’ and clarity to user. [#54]

·         Sender Pays is one specific example of a much larger category of 3rd-party pays data, which also includes venue owners (e.g. hotels or restaurants), advertisers/sponsors (‘thanks for flying Virgin, we’re giving you 10MB free as a thank-you’), software developers, government (e.g. ‘benefit’ data for the unemployed etc) etc. The opportunity for Telcos may be much larger from upstream players outside the content industry [#73]

·         We already do sender pays on our mobile portal – on behalf of all partner content providers including Napster mobile. [#77]

·         Change the current peering model into an end to end sender pay model where all carriers in the chain receive the appropriate allocation of the sender pay revenue in order to guarantee the QoS for the end user. [#63]

·         Focus on the money flows e.g. confirm the sender pays model. [#19]

Qualified Support/Implementation concerns

·         Business models on sender pays, but including the fact, that roaming is needed, data costs will be quite different across mobile carriers and the aggregators costs and agreements are based on the current carriers. These things need to be solved first [#26]

·         Sender pays is good but needs the option of ‘only deliver via WiFi or femtocell when the user gets home’ at 1/100th the cost of ‘deliver immediately via 3G macro network’. [#15]

·         Who pays for AJAX browsers proactively downloading stuff in the background without explicit user request? [#64]

·         Be realistic about sender pays data. It will not take off it is not standard across the market, and the data prices currently break the content business model – you have to compare to the next alternative. A video on iTunes costs 1.89 GBP including data… Operators should either take a long term view or forget about it. [#20]

·         Sender-pays data can be used to do anything the eco-system needs, including quality/HD. It doesn’t yet today only because the carriers don’t know how to provide those. [#44]

·         Sender pays works for big monolithic chunks like songs or videos. But doesn’t work for mash up or communications content/data like Facebook (my Facebook page has 30 components from different providers – are you going to bill all of them separately?) [#53]

·         mBlox: more or less like a free-call number. doesn’t guarantee quality/HD [#8]

Sceptical

·         Stop sender pays because user is inundated with spam. [#23]

o    Re 23: At least the sender is charged for the delivery. I do not want to pay for your SPAM! [#30]

 

Feedback: QoS

A fair amount of the discussion revolved around the thorny issues of capacity, congestion, prioritisation and QoS, although some participants felt this distracted a little from the “bigger picture” of integrated business models.

·         Part of bandwidth is dedicated to high quality contents (paid for). Rest is shared/best effort. [#27]

·         Start annotating the network, by installing the equivalent of gas meters at all points across the network, in order that they truly understand the nature of traffic passing over the network – to implement QoS. [#56]

o    Re: 56 – that’s fine in the fixed world or mobile core, but it doesn’t work in the radio network. Managing QoS in mobile is difficult when you have annoying things like concrete walls and metallised reflective windows in the way [#75]

·         [Stop] being telecom focused and move more towards solutions. It is more than bandwidth. [#25]

·         Stop pretending that mobile QoS is important, as coverage is still the gating factor for user experience. There’s no point offering 99.9% reliability when you only have 70% coverage, especially indoors [#29]

·         Start preparing for a world of fewer, but converged fixed-mobile networks that are shared between operators. In this world there will need to be dynamic model of allocating and charging for network capacity. [#67]

·         We need applications that are more aware of network capacity, congestion, cost and quality – and which alter their behaviour to optimise for the conditions at any point in time e.g. with different codec’s or frame rate or image size. The intelligence to do this is in the device, not the network. [#68]

o    Re: 68, is it really in the CPE? If the buffering of the content is close at the terminal, perhaps, otherwise there is no jitter guarantee. [#78]

§  Re 78 – depends on the situation, and download vs. streaming etc. Forget the word ‘terminal’, it’s 1980s speak, if you have a sufficiently smart endpoint you can manage this – hence PCs being fine for buffering YouTube or i-Player etc, and some of the video players auto-sensing network conditions [#81]

·         QoE – for residential cannot fully support devices which are not managed for streamed content. [#71]

·         Presumably CDNs and caching have a bit of a problem with customised content, e.g. with inserted/overlaid personalised adverts in a video stream? [#76]
 

Feedback: platforms, APIs, and infrastructure

However, the network and device architecture is only part of the issue. It is clear that video distribution fits centrally within the wider platform problems of APIs and OSS/BSS architecture, which span the overall Telco 2.0 reach of a given operator.

·         Too much focus on investment in the network, where is the innovation in enterprise software innovation to support the network? [#70]

·         For operator to open up access to the business assets in a consistent manner to innovative. Intermediaries who can harmonise APIs across a national or global marketplace. [#13]

·         The BSS back office; billing, etc will not support robust interactive media for the most part. [#22]

·         Let content providers come directly to Telcos to avoid a middle layer (aggregators) to take the profit. This requires collaboration and standardization among Telco’s for the technical interfaces and payment models. [#28]

·         More analysis on length of time and cost of managing billing vendor for support of 2-sided business model. Prohibitively expensive in back office to take risks. Why? [#65]

·         It doesn’t matter how strong the network is if you can’t monetize it on the back end OSS/BSS. [#40]
 

Feedback: Business models for video

Irrespective of the technical issues, or specific commercial innovations like sender-pays, there are also assorted problems in managing ecosystem dynamics, or more generalised business models for online video or IPTV. A significant part of the session’s feedback explored the concerns and possible solutions – with the “elephant in the room” of Net Neutrality lurking on the sidelines.

·         Open up to lower cost lower risk trials to see what does and doesn’t work. [#35]

·         Real multi quality services in order to monetize high quality services. [#36]

·         Transform net neutrality issues into a fair policy approach… meaning that you cannot have equal treatment when some parties abuse the openness. [#39]

o    Re 39: I want QoE for content I want to see. Part of this is from speed of access. Net Neutrality comes from the Best Effort and let is fight out in the scarce network. I.e. I do not get the QoE for all the other rubbish in the network. [#69]

·         Why not bundling VAS with content transportation to ease migration from a free world to a pay for value world? [#43]

·         Do more collaborative models which incorporate the entire value chain. [#55]

·         Service providers start partnering to resell long tail content from platform providers with big catalogues. [#59]

·         [Start to] combine down- and up-stream models in content. Especially starts get paid to deliver long tail content. [#60]

·         Start thinking longer term instead of short term profit, to create a new ecosystem that is bigger and healthier. [#62]

·         Exploit better the business models between content providers and carriers. [#16]

·         Adapt price to quality of service. [#21]

·         Put more attention on quality of end user experience. [#24]

·         I am prepared to pay a higher retail DSL subscription if I get a higher quality of experience. – not just monthly download limits. [#38]

·         maximize revenues based on typical Telco capabilities (billing, delivery, assurance on million of customers) [#50]

·         Need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lays in their user data not their pipes! [#52]

·         On our fixed network, DSL resellers pay for the access and for the bandwidth used – this corresponds to the sender pays model; due to rising bandwidth demand the charge for the resellers continuously increases. so we have to adapt bandwidth tariffs every year in order not to suffocate our DSL resellers. Among them are also companies offering TV streaming. [#82]

·         More settlement free peering with content/app suppliers – make the origination point blazingly fast and close to zero cost. rather focus on charging for content distribution towards the edge of the access network (smart caching, torrent seeds, multicast nodes etc) [#74]
 

Feedback: Others

In addition to these central themes, the session’s participants also offered a variety of other comments concerning regulatory issues, industry collaboration, consumer issues and other non-video services like SMS.

·         Start addressing customer data privacy issues now, before it’s too late and there is a backlash from subscribers and the media. [#42]

·         Consolidating forums and industry bodies so we end up with one practical solution. [#45]

·         Identifying what an operator has potential to be of use for to content SP other than a pipe. [#49]

·         Getting regulators to stimulate competition by enforcing structural separation – unbundle at layer 1, bring in agile players with low operating cost. Let customers vote with their money – focus on deliverable the fastest basic IP pipe at a reasonable price. If the basic price point is reasonable customers will be glad to pay for extra services – either sender or receiver based. [#72]

·         IPTV <> Internet TV. In IPTV the Telco chooses my content, Internet TV I choose. [#79]

·         Put attention on creating industry collaboration models. [#47]

·         Stop milking the SMS cash cow and stop worrying about cannibalising it, otherwise today’s rip-off mobile data services will never take off. [#33]

·         SMS combined with the web is going to play a big role in the future, maybe bigger that the role it played in the past. Twitter is just the first of a wave of SMS based social media and comms applications for people. [#51]

Participants’ ‘Next Steps’ Vote

Participants were then asked: Which of the following do we need to understand better in the next 6 months?

  • Is there really a capacity problem, and what is the nature of it?
  • How to tackle the net neutrality debate and develop an acceptable QOS solution for video?
  • Is there a long term future for IPTV?
  • How to take on the iPhone regarding mobile video?
  • More aggressive piloting / roll-out of sender party pays data?

Lessons learnt & next steps

The vote itself reflects the nature of the discussions and debates at the event:  there are lots of issues and things that the industry is not yet clear on that need to be ironed out.  The world is changing fast and how we overcome issues and exploit opportunities is still hazy.  And all the time, there is a concern that the speed of change could overtake existing players (including Telcos and ISPs)!

However, there does now seem to be greater clarity on several issues with participants becoming increasingly keen to see the industry tackle the business model issue of flat-rate pricing to consumers and little revenue being attached to the distribution of content (particularly bandwidth hungry video).  Overall, most seem to agree that:

1.     End users like simple pricing models (hence the success of flat rate), but some ‘heavy users’ will require a variable-rate pricing scheme to cover the demands they make;

2.     Bandwidth is not free and costs to Telcos and ISPs will continue to rise as video traffic grows;

3.     Asking those sending digital goods to pay for the distribution cost is sensible…;

4.     …but plenty of work needs to be done on the practicalities of the sender-pays model before it can be widely adopted across fixed and mobile;

5.     Operators need to develop a suite of value-added products and services for those sending digital goods over their networks so they can charge incremental revenues that will enable continued network investment;

6.     Those pushing the ‘network neutrality’ issue are (deliberately or otherwise) causing confusion over such differential pricing which creates PR and regulatory risks for operators that need to be addressed.

There are clearly details to be ironed out – and probably experiments in pricing and charging to be done. Andrew Bud’s sending-party-pays model (and many others, it must be added, have suggested something similar) may work, or it may not – but this is an area where experiments need to be tried. The idea of “educating” upstream users is euphemistic – they are well aware of the benefits they are currently accruing, which is why the Net Neutrality debate is being deliberately muddied. Distributors need to be working on disentangling the bits that can be free from those that should pay to ride, not letting anyone get a free ride.

As can be seen in the responses, there is also a growing realisation that the Telco has to understand and deal with the issues of the overall value chain, end-to-end, not just the section under its direct control, if it wishes to add value over and above being a bit pipe. This is essentially moving towards a solution of the “Quality of Service” issue – they need to decide how much of the solution is capacity increase, how much is traffic management, and how much is customer expectation management.

Alan Patrick, Telco 2.0: “98.7% of users don’t have an iPhone, but 98% of mobile developers code for it because it has an integrated end-to-end experience, rather than a content model based on starving in a garage.”

The “Tempus Fugit” point is well made too – the Telco 2.0 participants are moving towards an answer, but it is not clear that the same urgency is being seen among wider Telco management.

Two areas were skimmed through a little too quickly in the feedback:

Managing a way through the ‘Pirate World’ environment

The economic crisis has helped in that it has reduced the amount of venture capital and other risk equity going into funding plays that need not make revenue, never mind profit. In our view this means that the game will resolve into a battle of deep pockets to fund the early businesses. Incumbents typically suffer from higher cost bases and higher hurdle rates for new ventures. New players typically have less revenue, but lower cost structures. For existing Telcos this means using existing assets as effectively as possible and we suggest a more consolidated approach from operators and associated forums and industry bodies so the industry ends up with one practical solution.  This is particularly important when initially tackling the ‘Network Neutrality’ issue and securing customer and regulatory support for differential pricing policies.

Adopting a policing role, particularly in the short-term during Pirate World, may be valuable for operators.  Telco 2.0 believes the real value is in managing the supply of content from companies (rather than end users) and ensuring that content is legal (paid for!). 

What sort of video solution should Telcos develop?

The temptation for operators to push IPTV is huge – it offers, in theory, steady revenues and control of the set-top box. Unfortunately, all the projected growth is expected to be in Web TV, delivered to PCs or TVs (or both). Providing a suite of value-added distribution services is perhaps a more lucrative strategy for operators:

  • Operators must better understand the needs of upstream segments and individual customers (media owners, aggregators, broadcasters, retailers, games providers, social networks, etc.) and develop propositions for value-added services in response to these.  Managing end user data is likely to be important here.  As one participant put it:

    o    We need a deeper understanding of consumer demand which can then be aggregated by the operator (not content aggregators), providing feedback to content producers/owners and then syndicated as premium content to end-users. It comes down to operators understanding that the real value lays in their user data not their pipes! [#52]

  • Customer privacy will clearly be an issue if operators develop solutions for upstream customers that involve the management of data flows between both sides of the platform.  End users want to know what upstream customers are providing, how they can pay, whether the provider is trusted, etc. and the provider needs to be able to identify and authenticate the customer, as well as understand what content they want and how they want to pay for it.  Opt-in is one solution but is complex and time-consuming to build scale so operators need to explore ways of protecting data while using it to add value to transactions over the network.

Full Article: Enterprise Services 2.0: Mastering communications-enabled business processes; Executive Briefing Special

Introduction

NB A PDF version of this Executive Briefing can be downloaded here.

This special Executive Briefing report summarises the brainstorming output from the Enterprise Services 2.0 section of the 6th Telco 2.0 Executive Brainstorm, held on 6-7 May in Nice, France, with over 200 senior participants from across the Telecoms, Media and Technology sectors. See: www.telco2.net/event/may2009.

It forms part of our effort to stimulate a structured, ongoing debate within the context of our ‘Telco 2.0’ business model framework (see www.telco2research.com).

Each section of the Executive Brainstorm involved short stimulus presentations from leading figures in the industry, group brainstorming using our ‘Mindshare’ interactive technology and method, a panel discussion, and a vote on the best industry strategy for moving forward.

There are 6 other reports in this post-event series, covering the other sections of the event: Devices 2.0, Content Distribution 2.0, Retail Services 2.0, Piloting 2.0, Technical Architecture 2.0, and APIs 2.0. In addition there is an overall ‘Executive Summary’ report highlighting the overall messages from the event.

Each report contains:

  • Our independent summary of some of the key points from the stimulus presentations
  • An analysis of the brainstorming output, including a large selection of verbatim comments
  • The ‘next steps’ vote by the participants
  • Our conclusions of the key lessons learnt and our suggestions for industry next steps.

The brainstorm method generated many questions in real-time. Some were covered at the event itself and others we have responded to in each report. In addition we have asked the presenters and other experts to respond to some more specific points. Over the next few weeks we will produce additional ‘Analyst Notes’ with some of these more detailed responses.

NOTE: The presentations referred to in this and other reports, some videos of the presentations themselves, and the whole series of post-event reports are available at the event download site.

Access is for event participants only or for subscribers to our Executive Briefing service. If you would like more details on the latter please contact: andrew.collinson@stlpartners.com.

 

Background to this report

Enterprises are rapidly extending their use of the internet and mobile to promote, sell, deliver and support their products and services and manage their customer and supplier relationships. However, companies involved in the ‘digital economy’ still face substantial challenges in doing business effectively and efficiently. Telcos have a unique mix of assets (user data, voice and messaging, data and connectivity capabilities) that can be re-configured into platform-based services to help reduce the friction in everyday enterprise business processes: Identity, Authentication and Security; Marketing and Advertising; Digital Content Distribution; Offline Logistics; Transactions (billing and payments); and Customer Care.

Research from the Telco 2.0™ team has identified significant potential market demand for these services which could generate new profitable growth opportunities and increase the value of the telecoms industry to investors and government.

Brainstorm Topics

  • What assets should operators be leveraging to help enterprises?
  • What would a platform to support improved customer relationships for enterprises look like?
  • What ecosystem is needed to deliver telco platform services to enterprises?
  • Best practice use cases and case studies
  • Cutting-edge developments in voice-Web integration

Stimulus Presenters and Panellists

  • Joe Hogan, CTO and Founder, Openet
  • Laurence Galligo, VP Communications, Oracle
  • Glenda Akers, SVP, Telecommunications, SAP
  • J.P. Rangaswami, MD, BT Design
  • Werner Vogels, CTO, Amazon
  • Thomas Howe, CEO, Jaduka

 

Facilitator

  • Simon Torrance, CEO, Telco 2.0 Initiative

Analysts

  • Chris Barraclough, Managing Director, Telco 2.0 Initiative
  • Dean Bubley, Senior Associate, Telco 2.0 Initiative
  • Alex Harrowell, Analyst, Telco 2.0 Initiative

 

Stimulus presentation summaries

How to create profitable QoS, bandwidth, and network usage services

Joe Hogan, CTO, Openet said that things happen slowly at telcos, but they also have short memories. Looking back at the beginning of AOL, they provided modems, services, proprietary browsers, and content in their walled garden. Eventually, though, they had a disastrous experience with over-the-top (OTT) players; people started independent ISPs, and you could use your own e-mail and Mosaic (or, later, Netscape), which was actually better.

The lesson is that, if you try to own the entire value chain, you will be disaggregated. For telcos and ISPs today, the equivalent is the dumb pipe phenomenon. We’re now seeing RFPs from operators for serious intelligent pipe projects. We expect them to start coming from cable, and from mobile operators who are seeing their dongles used as broadband access.

I think not having a policy management system will be unusual in 24-36 months’ time; we need throttling, subscriber management, and deep packet inspection.

The second part of our strategy, he said, is to work with the OTT players. We need to impose controls on users who are essentially abusing the service, but only in a back-to-back relationship with the OTT players, so as to open up the network even if significant controls are imposed. For example, if someone uses all their bandwidth in the first three days of the month but goes to Hulu, and Hulu has a relationship with the network, they can still see the video anyway. As a result, we need a highly dynamic infrastructure.


So we’ve formed a new relationship with Cisco – making sure that the infrastructure does stay smart. If you’re a bandwidth hog, you will get shaped; unless you’re on a web site with a relationship.

Policy management, then, is a strategic piece of infrastructure for vertical revenue sharing and competition. In IMS parlance, it’s the PCRF that is responsible; this means that it must be able to process a significant volume of real-time decisions. We’re looking at 3,000-5,000 transactions a second.
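As a purely illustrative sketch of the kind of real-time decision such a PCRF-style function has to make (combining quota state with partner relationships, as in the Hulu example above), consider the following; the rule structure and thresholds are our own assumptions, not Openet’s product logic.

```python
# Minimal sketch of a PCRF-style policy decision combining subscriber quota state
# with upstream partner agreements. Rules and thresholds are illustrative assumptions
# only, not any vendor's actual product logic.
from dataclasses import dataclass

PARTNER_DOMAINS = {"hulu.com"}   # services with a back-to-back deal with the operator

@dataclass
class Subscriber:
    monthly_quota_gb: float
    used_gb: float

def policy_decision(sub: Subscriber, destination_domain: str) -> str:
    """Return 'allow', 'allow_partner_funded' or 'shape' for a new traffic flow."""
    if destination_domain in PARTNER_DOMAINS:
        # Partner has a relationship with the operator: traffic is not counted against the cap.
        return "allow_partner_funded"
    if sub.used_gb >= sub.monthly_quota_gb:
        return "shape"            # throttle heavy users on non-partner traffic
    return "allow"

# Example: a user who burned through their quota in the first three days of the month
heavy_user = Subscriber(monthly_quota_gb=5.0, used_gb=5.2)
print(policy_decision(heavy_user, "example-video-site.com"))  # -> shape
print(policy_decision(heavy_user, "hulu.com"))                # -> allow_partner_funded
```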

J.P. Rangaswami, MD, BT Design: “The old model worked and we were good at it, but the only way we could learn about the new model was by experimenting.”

Looking ahead, the cable industry’s Canoe is the VISA for advertising – a standard for the technical aspects of ad insertion, for the business model, and for the accounting, reporting and management information system. It requires the infrastructure to provide subscriber- and context-aware charging rules, integrated, context-aware policy management (so it can improve QoS in appropriate contexts), multi-dimensional rating & charging, multiple balances for subscribers (general balance, service-specific balance, points balance), notifications/advice of charge, re-direction, and comprehensive auditing and reporting to support customer service. It needs to provide high performance and be essentially invisible to the subscriber.

 

Exploring the e-citizen opportunity

Laurence Galligo, VP, Communications, Oracle, presented results from a survey which suggested there was strong support for Telco 2.0 among Oracle customers. Who, she asked, has had a bad experience interacting with the public sector recently? Oracle has put a lot of focus into this area under Smart City, its project to support an improved citizen experience with government services.

For example, there is the SNEN – Single Non-Emergency Number: a single point of contact for a whole range of government services outside the emergency-response sector, with a value estimated at $1.2bn over five years.

We implemented this as 311 for New York City – the single point of contact led into an integrated “citizen service centre”. This requires a lot of the underlying Telco 2.0 capabilities – to make it work, we need to authenticate the citizen, to federate their data, to carry out e-commerce transactions, to provide location-based services, and to route voice and messaging intelligently.

The result was a unified government platform – including networking, location-based services and GIS, voice and CTI, CRM, reporting, and transaction-processing systems. Working with a system integrator or software company, the Telco could become a leading partner for government in delivering better citizen services.

 

Enabling the Transition to Customer Self-Care

Glenda Akers, SVP, Telecommunications, SAP said that the mobile phone had become the preferred route for individuals to interact with organisations. But call centres are horribly inefficient: there is a need to balance quantity (the rate at which calls are processed) with quality (the outcome of the calls), and many fail at this. CTI systems are frequently very poorly integrated – hence there are lots of mistakes and much routing of calls between multiple call centres.

 

According to an Accenture study, 40% of agents’ time was spent dealing with calls that had gone to the wrong call centre.

And 60-70% of call centre costs are accounted for by labour; anything that can reduce the number of calls is therefore a good deal. Human agents are far too valuable to spend their time just looking up information from the database, so query-only calls must go. So – there is a clear business need to make self-care much better.

High priority sectors are those which have high volumes of traffic, complex queries, distributed resources, and which need to handle contact through multiple channels. Specifically, telecoms/IT itself, finance, retail, media, health, transport, government, unions.

BusinessObjects Mobile is the widget interface for SAP’s enterprise workflow systems. Some use cases are bank accounts, bills, and energy usage statistics. Mostly, they are query-only, or they have a few one-click controls. The iPhone showed the way; now we want to spread it to many other devices. It is web-based, so only a minimal degree of configuration is required.

Telcos could provide this as a hosted service, using their identity, billing, voice switching, and device management capabilities, and perhaps also their call centres.

 

The Front Line of Communications-Enabled Business Processes

Thomas Howe, CEO, Jaduka said that Jaduka, his new company, differs from the Thomas Howe Co. in that it does more voice mashups for more people. Communications-enabled business processes are about making processes faster and more efficient by including real-time communications, which may or may not be voice. There is a traditional heavy approach, using dialers, IVRs and call centres. This involves either heavy CAPEX, or else heavy OPEX on long-term systems integration or outsourcing contracts.

The alternative is to do it on the Web. Think of it as long-tail delivery – many small applications, dealing with highly specific tasks. But the needs involved are more like the short head, because many, many enterprises have the same or similar problems. An important but underestimated market for CEBP is within the enterprise. Many companies lock out suppliers, customers, other stakeholders and even employees in the field from their systems. CEBP breaches this – it extends the enterprise IT system outside the firewall.

Traditionally, there have been about 10 voice apps and about 10,000 non-voice apps. The difference is that group 1 wouldn’t exist without voice but group 2 would. That doesn’t mean, though, that group 2 wouldn’t benefit from voice.

There are 4 fundamental CEBP services, out of which the others are constructed – Notifications, Diary, Click to Call, and Conferencing.

Consider the Ribbit/Salesforce app, which lets salespeople leave voice messages directly in the CRM system, as an example of the Diary service. Click-to-Call allows you to add metadata to the raw voice file. Narrowcast messaging means leaving a particular message for a particular person; it plays a similar role to e-mail in e-commerce, but is better than snail mail, because everyone has a phone number and you can find them – e-mail isn’t the same.
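As a purely hypothetical sketch of how the four primitives named above might be composed into a delivery-confirmation flow, see below; the service interface and function names are our own invention, not Jaduka’s actual API.

```python
# Hypothetical composition of CEBP primitives into a delivery-confirmation flow.
# The interface and function names are invented for illustration only; they do not
# represent Jaduka's (or any vendor's) actual API.

def send_notification(phone: str, message: str) -> None:
    print(f"[notify] {phone}: {message}")        # stand-in for a real voice/SMS notification

def log_to_diary(crm_record_id: str, note: str) -> None:
    print(f"[diary] {crm_record_id}: {note}")    # stand-in for writing a note into the CRM

def click_to_call(agent: str, customer: str) -> None:
    print(f"[call] bridging {agent} <-> {customer}")

def confirm_delivery(crm_record_id: str, customer_phone: str, slot: str, agent_phone: str) -> None:
    """Notify the customer of a delivery slot; escalate to a call if they need to change it."""
    send_notification(customer_phone, f"Your delivery is booked for {slot}. Reply 1 to confirm, 2 to rearrange.")
    log_to_diary(crm_record_id, f"Delivery notification sent for {slot}")
    customer_wants_to_rearrange = False          # would come from the customer's reply in a real flow
    if customer_wants_to_rearrange:
        click_to_call(agent_phone, customer_phone)

confirm_delivery("order-1042", "+447700900123", "Tuesday 09:00-12:00", "+442079460000")
```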

Werner Vogels, CTO, Amazon.com: “You can’t keep the whole value chain in your own hands – you’ve got to be part of the chain and take your one cent!”

Unique telco contributions here are: intelligent routing, distinguishing between mobile and fixed numbers, information about which number is best, and the ability to switch between text and voice.

Voicesage did a solution for a furniture delivery service. Using notifications, confirmations, and post-delivery checks, they achieved a 10-fold reduction in missed deliveries. This reduced truck rolls, but also reduced inventory and accounts receivable. And they also made the customers happy. The solution is software-as-a-service, so there is no capex upfront. And it creates lots of interesting metrics.

Assume an average delivery cost of $70 for furniture in the US and 40,000 deliveries, of which 4% fail: that is an opportunity for a saving of around $120,000, at a cost of 50 cents a trip. This means additional carrier revenue of $20,000 at over 90% margin. Appliance sales in the US are about $20bn a year, with around 50 million deliveries per year.
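
A back-of-the-envelope version of that arithmetic, using only the figures quoted above (a rough sketch, not audited data; the headline $120,000 figure presumably prices failed deliveries slightly above the $70 average):

```python
# Back-of-the-envelope arithmetic for the delivery-notification case,
# restating only the figures quoted above.

deliveries = 40_000
failure_rate = 0.04          # 4% of deliveries fail
cost_per_delivery = 70.0     # average furniture delivery cost, US$
notification_cost = 0.50     # per-trip cost of the notification service, US$

failed = deliveries * failure_rate                # 1,600 failed deliveries
cost_of_failures = failed * cost_per_delivery     # roughly US$112,000 at risk
carrier_revenue = deliveries * notification_cost  # US$20,000 at high margin

print(f"Failed deliveries:    {failed:,.0f}")
print(f"Cost of failures:     ${cost_of_failures:,.0f}")
print(f"Notification revenue: ${carrier_revenue:,.0f}")
```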

 

But there are thousands of segments, thousands of distributors, and thousands of applications. In reality, serving these will require roughly a 50/50 split between ready-made solutions and custom development. Value creation requires applying vertical expertise to horizontal capabilities; one example would be using voice messaging to monitor congestive heart failure patients.

Software as a Service is great for customers, but not so good for systems integrators. Their business model is getting more complicated. We, the developers, want to share revenue with the integrators – but they struggle with the idea. But do they want to be cut out entirely?

We’ve stopped using the term “price per minute” – instead, we refer to value-based pricing. This is the natural way for Telcos to monetise things like location, billing and voice, and to reintroduce variable (value-based) pricing into their business models.

 

Participant Feedback

Introduction

Undoubtedly, some of the more ‘glamorous’ Telco 2.0 propositions revolve around advertising, content and entertainment. New business models from operators and Internet players in the consumer space garner much of the attention of Telco executives and media commentators. Blyk, YouTube, iPlayer, music downloads and P2P video distribution sit at the top of the agenda in terms of driving new revenue opportunities and evolving cost models. Approaches like “sender pays” data are primarily aimed at those sending large chunks of “content”.

Yet historically it has been the corporate marketplace which has driven much of operators’ traffic and profits, through large voice volumes, national and international data networks, and value-added services like system integration and hosted applications. Much of the current hype around Cloud Computing and software-as-a-service is solidly enterprise-driven, while two-sided business models involve deriving extra revenues from large ‘upstream’ organisations rather than consumers. Even the mass-market mobile operators will need to learn to engage with (and sell to) corporate technology representatives.

Although it is possible to see the long-term roadmap of “exposed” network and device capabilities taking friction out of business processes, it seems that the initial group of service options are rather more prosaic. It should be relatively easy to build on existing communications platforms like call centres and customer-service platforms, extending B2C interactions in more intelligent ways.

Feedback (Verbatim comments): The money is in the enterprises

The feedback from the event highlighted some general agreement that the enterprise market offers significant opportunities.

  • Great service examples, how do we show the value based on actual cases and savings, also need to consider the green angle [#22]
  • Great explorations of the dual sided biz models. [#27]
  • This is a good case that shows how Telco assets can be put to use. Helping customers and businesses to interact better is also a good way to diffuse new services to consumers B2B2C. [#10]

Feedback: Where do telcos fit?

But significant doubts remain as to the precise value that the Telcos can contribute, or their fit in the enterprise technology value chain. This is not surprising, especially in mobile, where many operators have shown limited interest in integrating with corporate IT and business processes, often just focusing on bulk sales of phones and minutes.

  • Is this not just about opening the networks to monetisation of long-tail applications utilising Open APIs? [#6]
  • In all the cases we heard the value was in the application not the Telco services, and no obvious reason why the Telco should capture the application. Where is the Telco 2.0 value in all this? [#7]
  • rgd. 7: the question: what is Telco service in the future? The Telco service will include traditional services as well as applications and process support. [#11]
  • How are those examples being translated into service provider revenue and business, maybe in the cloud? [#12]
  • In a flat cost / head count world, what do you stop as an operator to free up people to develop these services with enterprises; governments etc who are often slow in decision making? [#19]
  • Ref 19: this is where vendor expertise matters. Don’t reinvent the wheel. [#36]
  • Telcos’ own self-care offerings are not mature or sophisticated, so although they could help enable this, their ability to market/offer it seems like a stretch. [#25]
  • Re: 7 agree, at a high level, is business process outsourcing a function for a Telco to enable/extract value. [#26]
  • USP of Telco’s unclear. Could all be done by an ASP using Telco wholesale products? [#23]
  • How does this all integrate across the value chain? [#38]
  • Customer service platforms used internally by Telcos should be generic-ised, extended and then exposed to third parties, a bit like the Amazon web services strategy. [#40]
  • CIOs at large Telcos are, now more than ever, in need of cojones (balls). They need to take risk or Telco 2.0 will not be realised. They have the old-school PTT mentality; this may take a generation to achieve. [#54]
  • What are the value-added Telco 2.0 services that these applications need? The examples didn’t focus on this core question. [#32]
  • How will a Telco in these situations deal with enterprise customers who use a different access provider? E.g. if you’re the Telco supporting e-citizens for local govt, do you have to wear lots of interconnect costs to communicate with those citizens using competing cell phone providers? [#49]
  • Re 49: good point. We need to coordinate activity or the costs become prohibitive. Banks solved this for credit cards and ATMs, so it is possible. [#51]

Feedback: Jaduka & Communications-enabled Business Processes

A regular speaker at Telco 2.0 events, “Mr Voice Mashup” Thomas Howe received a lot of attention in his new role as Jaduka CEO.

  • At first I was bored by the Jaduka presentation, but after thinking about it, it was the best example of real-world Telco 2.0. [#35]
  • What is Jaduka’s business model and how do they make money? It was not clear in the presentation. [#9]
  • What is Jaduka’s view on reselling and sharing customer data with partners – is this beginning to happen? [#28]
  • Where does Jaduka see the money coming from: voice apps, data apps, SMS apps? What are the sweet spots? [#15]
  • One wonders whether we are missing some opportunities to span from Jaduka-type capabilities with Bondi-type standards to ensure that there is a logical hand-shake with the end customer. [#37]
  • Does Jaduka create a database of user identities mapped to phone numbers that works across carriers? This would be a powerful resource to enable anonymous communications and business processes. [#42]
  • What are Jaduka’s requirements to Telcos in terms of API and other interfaces in order to enable Telcos to build appropriate wholesale offering? [#43]
  • What can Telcos offer a company like Jaduka for them to make new services? What should Telcos standardise in terms of new APIs to allow a company like Jaduka to reach as many users as possible? [#18]

Feedback: Customer care opportunities beyond call centres

But although there is interest in Voice 2.0 and mashups, it remains unclear what services exist beyond next-generation contact centre-type applications.

  • Machine to machine is an amazing opportunity but business process engineering is more difficult than expected. [#24]
    • re 24: BPR is only part of the problem; legacy infrastructure and proprietary black-box end-to-end systems are holding us back. There needs to be an internal conversation within the Enterprise to rethink the application of technology against new business models. [#47]
  • Some good stuff but maybe too much is just call centre + a bit more. Interesting but hardly revolutionary. [#46]
  • Not enough focus on more advanced assets like GPS in phones, pushing widgets to devices etc. There’s a lot more than just advanced call centres. [#44]
  • What is a little disappointing is the low level of Telco 2.0 insight and vision amongst these enterprise protagonists compared to the entertainment and content people. Is this because there is less Telco 2.0 opportunity here, or because they’ve thought less about it? [#56]
    • re 56 – I think it’s because in the enterprise there’s an issue that most Telcos, especially mobile, don’t really understand the detailed business processes at their corporate customers, so it’s difficult to come up with solutions that exploit Telco assets. Also there’s a big mass of SI’s and VARs/ISVs and outsourcers who sit much closer to the customer than the big apps providers. [#58]

 

Feedback: Telco 2.0 for Government 2.0?

Taken as a whole, it is exceptionally difficult to target the whole enterprise marketplace. The IT industry tends to sell its offerings through industry-specific teams, which take general software or service components and tune them for the requirements of particular verticals. Telcos will need to fit their “two-sided” offers (or just basic single-sided hosted options) into a similar structure, except for the most “vanilla” horizontal service elements. The event threw up some doubts that new upstream customers could be reached easily. One approach that seemed to resonate was Oracle’s pitch around a central contact point for all local government services, or a “311” number in US parlance.

  • These apps need detailed use cases and expertise for the verticals. Where would a Telco get this knowledge, or would they partner with the types of companies we heard from today? [#16]
  • I love the 311 idea. This is like a special 0800 number to the local government call centre. [#34]
  • I don’t think the SAP proposition works well for consumers – who wants to download a customer service app for their gas/electricity company to their mobile phone? [#41]
  • At what size does this make sense as a municipal opportunity for a Telco? 3 million residents? More? [#45]
  • In my discussions with the Telcos they do not believe that the local services are coordinated enough to see the value proposition, we need to widen our industry engagement to include these local service companies. [#17]
  • Does this signal the death of the traditional Telco and the emergence of the local communications provider attached to the local municipality? [#33]
  • Not sure the Telco can cooperate enough with the local government to provide an integrated 311, e.g. providing a location service to find the nearest service. [#39]

 

Feedback: Marketing Telco 2.0 to the enterprise

The engagement model between operators and enterprises remains opaque. Is it about partnering or new channels & marketing techniques? Telco 2.0 believes that many operators need to be realistic – they cannot “own” the enterprise value chain simply via provision of a few APIs, when incumbent integrators and software vendors are already tightly bound to business processes.

  • Can a Telco actually, logistically, work with hundreds of SIs to make this feasible? [#30]
  • As a Telco how do you stop partners taking the majority of the value chain with enterprises and governments? [#31]
  • How does a Telco manage to sell the idea of these services to millions of small businesses? The cost of sale is too high to service a dentist who might spend $100 a year on phone/SMS reminders for appointments. [#48]
  • Re 48: in the same way Google and Amazon can do it – by driving down the cost of bringing companies on to the platform. It doesn’t work if it needs an SI involved; the whole point is that this works if it is plug and play. [#50]
  • If the likely evolution of many Telcos is that network assets are spun off into a few shared netcos and the remaining service operations are left competing for customers (with Google, Nokia etc.), who exploits the two-sided business model – the netco with open APIs, or the service operation leveraging the end-customer relationship? [#60]

Feedback: Competing with Big Technology Solutions

Software vendors like SAP and Oracle could be the bridges between enterprise and Telco IT domains. These companies already have strong footholds in almost all vertical markets – and are also ramping up the reach of their applications for telecoms operators. That said, their incumbency also represents a challenge to the Telco 2.0 model, particularly where the more innovative web- and SaaS-based models conflict with large-scale “owned” in-house application architectures.

  • It was not clear in the SAP presentation how it really fits into the Telco 2.0 initiative – it may have been better received if it had addressed the commercial model the technology allows. [#20]
  • I don’t understand the Oracle or SAP examples. they have a vested interest in complex, heavy apps which are attractive to SIs with very high total cost of ownership. [#52]
  • Web services, cloud computing and virtualization are absolute disruptive advances which will allow operators to save money thus to be more apt to take risks on new biz models. [#21]
  • Is Oracle/SAP interested in providing apps to Telcos on a pure revenue-sharing basis? [#59]
  • Are Oracle and SAP really interested in working with carriers? Why? For sharing revenues? [#61]
  • To Openet: have you ever met someone from a Telco with the job title of ‘policy manager’? Who manages all this stuff, given that you need to understand access, apps, legal issues, behaviour, core networks, issues around false positives/negatives etc.? [#55]

 

Participants’ “Next steps” Vote

Participants were asked which of the following statements best described their view on communications-enabled business processes for the enterprise:

  1. Individual operators should focus their efforts very carefully on specific capabilities (e.g. billing and payments or customer care) and verticals (e.g. government, healthcare) and compete with point providers (such as Paypal) in these markets.
  2. Individual operators should focus their efforts on building a broad set of horizontal capabilities (covering identity, authentication, security, marketing and advertising, content distribution, off-line logistics support, billing and payments and customer care) to a broad range of vertical markets as this will enable a unique value proposition and develop scale.
  3. Telcos should avoid Telco enabled business processes – the market is a red herring.

Lessons learnt & next steps

In theory, the enterprise segment ought to be at the heart of operators’ Telco 2.0 strategies. Irrespective of single-sided corporate retail propositions, in a two-sided world “upstream” providers are generally businesses or governments. But many of the comments during the session identified just how difficult it is to extract the value in a Telco’s inherent assets and capabilities, and apply this to corporate IT and business problems.

The Telco 2.0 Initiative believes that one of the major issues around exploiting the enterprise opportunity is that Telcos need to learn how to develop, sell and support services which are customised, as well as mass-market “basic” applications and APIs. Ideally, the technical platform will be made of underlying components (e.g. the API interface “machinery” and the associated back-office support systems) designed to cope with both “off the shelf” and “bespoke” go-to-market models for new services.

Especially in the two-sided model, there are very few opportunities to gain millions – or even tens of thousands – of B2B customers buying the same basic “product”. Google has managed it for advertising, while Amazon has large numbers of hosting and “cloud computing” customers – but these are the exceptions. Even in the software industry, only a few players have really huge scale for basic APIs (Microsoft, Oracle, Sun, etc.) across millions of developers.

Werner Vogels, CTO, Amazon.com: “Amazon cloud services took off with the creative people and start-ups, but the enterprises came aboard because they could get agility here they couldn’t get anywhere else.”

Operators may indeed have some easily-replicable “upstream” services that could be sold through an online platform in bulk (perhaps authentication or billing, or basic APIs like location), but these often also face competition from alternative technological routes to their provision. They may also need to be “federated” across multiple operators to be truly useful. Perhaps the easiest and most universal horizontals will be enhancements to voice and messaging capabilities – after all, these are the ubiquitous cross-sectoral services today, so it seems likely that any enhancements will follow.

To really exploit unique assets and “take friction out of business processes”, there will also be a need to understand specific companies’ (or at least sectors’) processes in detail – and offer customised or integrated solutions. Although this does not scale up quite as compellingly, the aggregated value involved may be even higher. Even Microsoft and Oracle have dedicated solutions for healthcare or manufacturing, as well as their baseline horizontal products.

J. P. Rangaswami, MD, BT Design: “Our measure of success should be how easy it is for customers to use the network. Margins will be like a retail business – a razor-thin layer of value spread across a huge area of the economy.”

Another interesting example is that of the BlackBerry. Although today we think of mobile email as a generic capability used across the whole of the economy, the original roots of the company (pagers) were highly financial-oriented. The banking sector very much catalysed the subsequent growth in other knowledge industries (e.g. legal / consulting) and then the more general adoption among businesses of all types. This reflected not just the need for (and high value of) real-time messaging, but also other issues that a pure horizontal approach may have neglected. A specialist salesforce, an early focus on enterprise network security integration – and a large target audience of Microsoft Exchange users were all important. Even the “gadget envy” of a well-paid and dense concentration of users (Wall Street) may have helped the device’s early viral adoption.

As yet, this need for customisation and integration has not been fully recognised. The results of the vote at the end of the session were stark – perhaps surprisingly so. The vast majority of survey responses suggested that operators should attempt to build up exposed capabilities across a set of horizontals, rather than focus on the needs of specific markets.

This seems to reflect the hope for more Google/Amazon-style cross-sector offerings. But as discussed above, this may not be easy, nor will it be the whole story. It is also unlikely to work for every operator. Telco 2.0 thinks that the horizontal approach certainly makes sense in terms of the core abilities of the technical platform, but in terms of developing solutions and partnering with particular integrators or influencers, some measure of vertical specialism is often necessary.

That said, the telecom industry has not often been good at “picking winners” from an enterprise stance.

In the short term, Telco 2.0 would recommend the following:

  • Look for “low-hanging fruit” around next-generation contact centres and voice mashups. These are prime targets for horizontal exploitation. Where appropriate, partner with one or more start-ups if existing internal skillsets are weak. ‘Eat your own dog food’: sort out your own call centres first and develop skills and processes that can be applied to other industries.
  • Continue with plans to monetise certain other assets for enterprise utility – especially security, payments, messaging and features that can add value to logistics processes. However, work in parallel on broad commercial platforms (e.g. web-based APIs) and more customised routes to market.
  • Conduct research to identify any particularly attractive near-term addressable target verticals. This can reflect existing skills/services (e.g. within an internal integration business unit), national-specific trends (e.g. major healthcare or environmental projects), local legislation (e.g. banking rules) or wider industry collaboration (e.g. GSMA projects in areas like mobile payments).
  • Build a database of possible acquisition targets (for example, corporate web/telco specialists), especially those with funding vulnerabilities that may make them available at low prices in the recession.
  • Start thinking about the implications of network outsourcing or managed service contracts on the ease of offering exposed service capabilities to upstream enterprise customers.

Longer term, other considerations come into play:

  • Develop separate strategies for high-volume/low-value enterprise services (e.g. servicing thousands of customers via web service platforms for generic “building blocks” like authentication), and low-volume/high-value corporate projects. [Note: volume here means # of customers, not # of transactions or events: imagine a one-off deal with a government, for national health ID & patient records]. Ultimately these may use the same underlying capabilities, but the engagement model is totally different – for example, participation in a Government-led scheme to extend smart metering for utilities, or a one-off deal with a broadcaster for a new advertising and content-delivery partnership.
  • Aim to work closely with one or more top-tier enterprise IT vendors to help add value to their hardware/software solutions. IBM, Microsoft, Oracle, SAP, Cisco, HP, Sun and others have large bases of extremely loyal customers.
  • Look to exploit new device and network capabilities, such as sensors, cameras, enhanced browsers and widgets on phones, or femtocells in B2C customers’ homes. In particular, there are various government/public-sector applications that could benefit from closer integration with citizens’ technology. Examples could include authentication for local services (or even voting), or assorted types of monitoring for environmental, healthcare or public safety reasons.
  • Do a full analysis of applications that can be hosted in the cloud – but beware the integration and “touch points” with corporates’ in-house infrastructure.

Online Video Distribution: how will the market play out?


Overview

The online video distribution business model faces increasing challenges, particularly as explosive traffic growth is driving some costs faster than revenues. This is unsustainable – and there are many other changes: in content creation, aggregation, and distribution; in devices; and in end-user behaviour.

[Figure]

The Online Video Value Chain

This new Briefing summarises the evolution of the key technologies, the shifting industry structures, business models and behaviours, the evolving scenarios and the strategies required to prosper.

Who should read the report

Telco: Group strategy director, business development and strategy teams, data and IPTV product managers, CIO, CTO, CMO; Media companies; Broadcasters; Content players.

Key Challenges

In theory, telecom operators should be well-poised to benefit from the evolution of video technology. Fixed and mobile broadband services are increasing in speed, while phones and set-top boxes are becoming much more sophisticated and user-friendly.

Yet apart from some patchy adoption of IPTV as part of broadband triple-play in markets like Japan and France, the agenda is being set by Internet specialists like Google/YouTube and Joost, or offshoots of traditional media players like the BBC’s iPlayer and Hulu.

Many consumers are also turning to self-copied and pirated video content found on streaming or P2P sites. And although there is a lot of noise about the creativity of user-generated video and mashups, it is not being matched by revenue, especially from advertisers.

These changes present commercial challenges to different players in the value chain. Changes in user demand challenge the economics of “all you can eat” data plans (see “iPlayer nukes ISP business model”); content creators face well-known issues relating to digital piracy and content protection; and aggregators face challenges monetising content.

Which Scenario will win – and who will prosper?

Our new research uses scenario planning to map out and analyse the future. The methodology was designed to deal with many moving parts, uncertain times and rapid change. We identified three archetypal future scenarios:

  • Old order restored: Historic distribution structures and business models are replicated online. Existing actors succeed in reinventing and reasserting themselves against new entrants.
  • Pirate world: Distribution becomes commoditised, copyright declines in relevance and the Internet destroys value. A new business model is required.
  • New order emerges: New or “evolved” distributors replace existing ones, with content aggregation becoming more valuable, as well as delivery via a multitude of devices and networks.

Which of these scenarios will dominate, when, and what can operators and other players do in order to prosper?

Key Topics Covered

  • Current market and variation across national markets
  • Significant changes and trends in content production, aggregation and distribution
  • Significant changes and trends in devices and end-user behaviour
  • Detail on the scenarios and the likely market evolution
  • Consequences of the changes by content genre (movies, sport, user-generated, adult)
  • Strategies to prosper as the scenarios evolve

Contents

Key questions for online video distribution

  • Online video today
  • Bandwidth
  • Penetration
  • Other factors

Emerging industry structure

  • User-generated vs. professional content
  • Aggregated vs. curated content
  • Market size

Future challenges for the industry

  • Content creation
  • Aggregation
  • Distribution
  • Customer environment and devices
  • Supply and demand side issues

Future scenarios for online video

  • Genre differences
  • Mobile video evolution
  • Regional differences

Strategic options for distributors

  • Threats
  • Weaknesses
  • Strengths
  • Opportunities
  • Strategic options

Conclusion


Full Article: Online video distribution


© Copyright 2009. STL Partners. All rights reserved.
STL Partners published this content for the sole use of STL Partners’ customers and Telco 2.0™ subscribers. It may not be duplicated, reproduced or retransmitted in whole or in part without the express permission of STL Partners, Elmwood Road, London SE24 9NU (UK). Phone: +44 (0) 20 3239 7530. E-mail: contact@telco2.net. All rights reserved. All opinions and estimates herein constitute our judgment as of this date and are subject to change without notice.

Executive summary

This briefing summarises key outputs from a recent STL Partners research report 1 and survey on Online Video Distribution. It considers the evolution of the key technologies, and the shifting industry structures, business models and behaviours involved in content creation, distribution, aggregation and viewing.

In this document, online video distribution refers to any video-based material, such as movies, television, sports and user-generated content, distributed via various internet-based (IP) technologies. This includes internet protocol television (IPTV), web streaming and peer-to-peer (P2P) downloading. It includes distribution via any fixed or mobile network, to any device, such as a PC, television or smartphone.

We exclude dynamic two-way video applications such as videoconferencing and video-sharing, as well as traditional broadcasting and physical means of distribution, although the impact of online video distribution on these platforms is explored briefly. Standalone mobile video broadcasting, for example using DMB or DVB-H technologies, is also not considered to be “online”.

In theory, telecom operators should be well-poised to benefit from the evolution of video technology. Fixed and mobile broadband usage are increasing in speed, while phones and set-top boxes are becoming much more sophisticated and user-friendly. Yet apart from some patchy adoption of IPTV as part of broadband triple-play in markets like Japan and France, the agenda is being set by Internet specialists like Google/YouTube and Joost, or offshoots of traditional media players like the BBC’s iPlayer and Hulu. In the background, many consumers are also turning to self-copied and pirated video content found on streaming or P2P sites. And although there is a lot of noise about the creativity of user-generated video and mashups, it is not being matched by revenue, especially from advertisers.

STL used scenario planning to understand the future. The methodology was designed to deal with many moving parts, uncertain times and rapid change. We identified three archetypal future scenarios:

  • Old order restored: Historic distribution structures and business models are replicated online. Existing actors succeed in reinventing and reasserting themselves against new entrants.
  • Pirate world: Distribution becomes commoditised, copyright declines in relevance and the Internet destroys value. A new business model is required.
  • New order emerges: New or “evolved” distributors replace existing ones, with content aggregation becoming more valuable, as well as delivery via a multitude of devices and networks.

In our study we found that these scenarios are not mutually exclusive. In fact, the likelihood is that the current old order will pass through a pirate world phase, before a new order emerges.

In the meantime, the two most important considerations for “distributors” are:

  • Investing in, and adequately managing, sufficient network access and core network capacity. In many instances this will involve partnering with specialist CDNs (content distribution networks) and deploying appropriate network management / QoS technology.
  • Developing improved value-added services, based on network and device intelligence and a two-sided “platform” business model, especially to assist in targeting and delivery of adverts.

Contents

  • Key questions for online video distribution
  • Online video today
  • Emerging industry structure
  • Market size
  • Future challenges for the industry
  • Future scenarios for online video
  • Genre differences
  • Mobile video evolution
  • Regional differences
  • Conclusion

Key questions for online video distribution

When thinking about this report and its ‘big sister’ (a strategy report: Online Video Distribution: Fixing the broken value chain), we asked ourselves some key questions based on what we saw happening in the online video arena. These were:

How will the online video market develop and what are the best strategies for aggregators and distributors?

As broadband pipes have grown fatter and fatter, the capability to deliver a quality video viewing experience over the internet has grown. This broadband capability has driven a tsunami of innovation in hardware, software and services – and the eyeballs have followed. Recent data suggests video is the fastest growing segment of all internet traffic and that the trend will continue for the foreseeable future. This is true, whichever metric is used, be it absolute number of viewers, total time spent viewing or data traffic volumes. In the last 24 months, the same trend has also been seen in mobile video, aided by faster 3.5G networks and more capable handsets and smartphones like the Apple iPhone.

Growth is not limited to a specific content category: adult content, sports, movies and music are all moving online rapidly. The internet has also led to a new category of user-generated content. Home movies have moved out of the privacy of the living room and are becoming more professional, while existing copyright material is being repurposed in the legal grey zone of ‘mash-ups’.

Neither is growth limited to a specific geography. The movement online is a worldwide phenomenon, as the internet has no respect for traditional geographies and boundaries. Certain markets have seen faster adoption of fibre, high-speed DSL, cable or HSPA mobile broadband, which drive more (and higher-quality) video consumption.

All the evidence points towards a future where the internet (and other “closed” IP networks) will be a critical distribution channel for all forms of video. Significantly, so far at least, revenue growth for aggregators and distributors has not followed traffic growth.

Are there historical lessons from which to learn?

Innovation in video distribution is not new. Over the past century, we have seen cinema, broadcast networks and physical media creating temporary shocks to older methods of distributing content. Despite the gloom of some predictions, live events, whether sport, theatre or music, remain popular and co-exist with home entertainment. The transition to, and evolution of, these distribution channels and the associated business models will provide clues about the outcome of video distribution as more content moves online.

However, there is only a certain amount of time in the day available for entertainment in general and watching video specifically. Legacy distribution channels are understandably worried about whether internet video will be additive to, or will cannibalise, their audiences.

A new distribution channel brings opportunities for new entrants to disrupt existing markets and business models. The internet’s interactivity, in particular, adds to both the opportunities and the challenges faced by existing players.

User empowerment – for good or ill it is happening, but what will the impact be?

Interactivity has allowed individuals to become distributors in their own right. On the positive side, individuals have generated their own content and made it available to the world. On the negative side, some individuals have used interactivity to distribute content without regard to the rights of copyright holders. Copyright holders have struggled to enforce their rights. Illegal distribution of content not only threatens the absolute value of content, but has also led to the development of unpopular and complicated mechanisms to protect content.

The volume growth of content has placed internet access providers under severe strain. Their attempts to increase prices to compensate for the growth in traffic, and to gain extra revenue by developing additional services, are proving very difficult. Technology-based methods of blocking or prioritising certain traffic types garner a lot of publicity, but also prompt user and legal furore over “Net Neutrality”. These issues have generated a considerable amount of experimentation in the market, especially around pricing models, where subscription, pay-as-you-go, advertising funding, bundles with other distribution channels, and offsets and subsidies all exist in various forms.

The net result is that the video market is in a state of chaos. Will order emerge out of the chaos? What form will this new order take? What will be the impact on existing players in the video value chain? And will powerful new players emerge?

What sort of scenario will emerge?

STL’s scenario planning methodology to understand the future identified three potential “core” future scenarios:

  • Old order restored: Traditional distribution methods and business models are replicated online. Existing actors succeed in reasserting themselves.
  • Pirate world: Distribution ceases to be valuable and copyright ceases to be relevant. A new business model is required.
  • New order emerges: Rather than the total breakdown of pirate world, new distributors will replace existing ones as we still need aggregation to guide us through the jungle.

In our study we found that these scenarios are not mutually exclusive. In fact, the likelihood is that the current old order will pass through a pirate world phase, before a new order emerges.

This allowed us to think about what strategies are relevant in which situations, which strategic options can be placed early and which should only be placed when the likely path becomes clearer. In addition, this approach allowed us to look at ‘what you need to believe’ for each scenario and to define milestones that will make the path predictable.

The study places the drivers of future internet video distribution in a technological, economic, social and political framework. It then evaluates the implications of these on content type for the value chain of creators, aggregators and distributors. Research includes literature reviews, desk research, industry surveys and interviews with key staff from relevant organisations. In our strategy report, case studies are produced to bring the story to life and to provide a historical context for both successes and failures.

Online video today

The rise of online video as a market in its own right has been driven by two key factors: increasing bandwidth and growing user penetration.

1. Bandwidth

Bandwidth determines the quality of online video that can be consumed in real time via streaming (for video downloaded prior to consumption it affects only the download speed, not the quality). This significantly impacts user experience – YouTube really took off when it could be viewed in real time, without lengthy waits for “buffering”. For ISPs and broadband providers, realistic bandwidth is also a key determinant of when IPTV becomes feasible as a mass commercial proposition – it is no coincidence that it tends to track the rollout of fibre or higher-speed DSL.

Figure 1 shows the relative quality of media that can be streamed at different broadband speeds and the relative average broadband speeds of a selection of different countries. It should be noted that certain lower-ranked markets have pockets of much-faster users, for example those with Verizon’s FIOS network in the US.

Figure 1: Quality of media stream by bandwidth

[Figure]

Source: The Information Technology and Innovation Foundation

At 4Mbit/s broadband speeds, high quality standard definition TV is possible. By 8Mbit/s, high definition (HD) TV is possible. From 24Mbit/s upwards, any normal sized household will have full multimedia capability. In the mobile world, smaller screen sizes mean that lower speeds can generate acceptable experience, even at 1Mbit/s. Instead, the limiting factors in mobile are more often video processing power, user-friendly interface design and battery life.
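
As a rough illustration of those thresholds, here is a sketch that simply restates the cut-offs above; real streaming quality also depends on codec efficiency, contention and the device, so the mapping is indicative only.

```python
def stream_quality(downstream_mbit_s: float, mobile: bool = False) -> str:
    """Map a downstream speed to the roughly achievable streaming quality,
    using the thresholds quoted in the text (illustrative only)."""
    if mobile:
        # Smaller screens mean lower speeds can still give an acceptable experience.
        return ("acceptable mobile video" if downstream_mbit_s >= 1
                else "below acceptable mobile video")
    if downstream_mbit_s >= 24:
        return "full multimedia household (multiple simultaneous streams)"
    if downstream_mbit_s >= 8:
        return "high definition (HD) TV"
    if downstream_mbit_s >= 4:
        return "high quality standard definition TV"
    return "low quality streaming, or download before viewing"


for speed in (2, 4, 8, 24):
    print(f"{speed:>2} Mbit/s -> {stream_quality(speed)}")
```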
Different countries have vastly different average bandwidth, which means there are major differences in the sort of online video services that can be offered around the world. Advanced countries, such as South Korea and Japan, give an indication of how video distribution in other countries should develop over the next five years. In both these countries, a range of video services and new applications has emerged. Due to this, people value their broadband connections more highly and often pay more per user than elsewhere.

2. Penetration

Penetration of broadband drives the attractiveness of the market for video-focused service providers, as larger user bases can be monetised via:

  • Advertising: Selling adverts in various formats against video on the webpage, on the actual video media, or in the media as placement.
  • Offset economics: Players make money from elsewhere and choose to subsidise videos. For example, Google subsidises YouTube via search ad revenue.
  • Exit: Selling a successful start-up service to a larger player. For example, YouTube’s $1.65bn sale to Google has started a rush in this direction.

Consequently, the greater the penetration, the more and varied the services that can be offered. It should be noted that there are various methods of calculating penetration, including reach as a % of either population or household numbers. In many OECD countries, broadband penetration has now surpassed 50% of homes. The growing use of mobile broadband on laptops or high-end smartphones also means that some households (or even individuals) now possess 2+ separate broadband access channels.

That said, there are also benefits from high levels of acceptance within specific demographic or social niches, even if the broad average across the population is lower. So, for example, the advent of prepaid mobile broadband is enabling greater penetration into segments such as students or immigrants, which are well-suited to particular content types (e.g. foreign language).

In coming years, broadband penetration should continue to increase in developed markets, as well as certain developing nations which place an emphasis on it, such as China. One side-effect of the current financial crisis is that various countries (including the US) are including broadband in their lists of beneficiaries of “fiscal stimulus” packages. Other markets are also seeing changes in regulatory stance which should benefit fibre rollouts, or wider use of mobile broadband.

3. Other factors

In addition to bandwidth and penetration, it is also worth noting that various other factors are driving wider use of online video services and applications today. Briefly, these include:

  • Improved integration between web and video software. In particular, the use of Adobe’s Flash video has been a major driver in enabling services like YouTube.
  • Rise of social networking websites like MySpace and Facebook, which permit easy use of video plug-ins and encourage users to share video or links, creating viral consumption of content. Other Web 2.0 social media, such as blogs, have also become more video-friendly in recent years, especially as YouTube and other services have made it easy to embed video in web pages.
  • The falling costs of PCs, especially notebooks, have increased the number of PCs per household in developed markets. In some cases, laptops are replacing second TVs as the device-of-choice in studies, kitchens and bedrooms, increasing the time that “eyeballs” can spend on video.
  • Falling costs of video cameras, and the increasing usefulness of video features on mobile phones, have stimulated additional user-generated content.

Emerging industry structure

The initial online video market was driven by broadband-led IPTV, but with the increasing bandwidth and uptake of consumer broadband, the emerging market has become far more varied in the past few years. In particular, the ease of use of streamed and web-embedded video has transformed the landscape.

Figure 2 shows the emerging structure of the online video industry.

Figure 2: Structure of the online video industry

[Figure]

Source: Revision 3; STL Partners analysis

The industry is evolving along two main axes.

1. User-generated vs. professional content

The traditional video industry of TV and movies is based on professional, studio-produced content. New media plays – IPTV and web TV services such as Hulu and iPlayer – also operate in this way. But as discussed above, technology-driven cost reductions have opened up new opportunities for lower cost user-generated content, as well as an emerging ‘pro-tail’ sector between these two extremes.

One example of the pro-tail trend is iBall, a high quality, five minute, daily web TV show that is produced and distributed by a UK financial services company, Interactive Investor, rather than by a broadcaster working in tandem with a traditional production company. Its aim is to give a more in-depth view of the financial markets, while maintaining the entertainment focus associated with more mainstream media.

The other side of this middle ground between professional and amateur content is made up of increasingly competent amateurs building high-quality content in the hope of obtaining commercial sponsorship. There is also a small but growing market for business-related video, for example for video webinars or news magazines. Telecom TV, in our own industry, is a good example here.

2. Aggregated vs. curated content

The traditional video and TV industry is based on curated content written, produced, edited and scheduled by professionals. However, technology has introduced automation into this process, allowing various businesses to build simple aggregation-based services. Content is thrown up on to the internet and a search engine enables users to find what they need. YouTube is the best known example of this model.

Increasingly, automation is being used in the curation space too. Joost and Babelgum are examples of curated aggregation, with content grouped into channels such as action and sport, animation, comedy and drama. Similarly, Phreadz is a video-based chat system with a fairly sophisticated threading capability to support conversations built around specific threads – or channels, in video terminology.

Another evolution is the tussle between set-top box based online video, which is mainly IPTV, and pure web TV based online video provided by services such as BBC iPlayer. The set-top box approach usually subsidises the box, but it is a subscription-based model that delivers more assured revenues. The web TV model is more likely to be ad-supported, and has the advantage of running over free, existing architecture, so it will probably pick up more users. Cable is an interesting hybrid, as the set-top box is the broadband modem equivalent.

Market size

Today, the estimated market size of all online video – IPTV, cable, broadband and mobile – is about $2bn. This figure is made up of an amalgamation of components, including:

  • Subscription revenues for IPTV, typically as part of triple-play or other bundles from broadband ISPs.
  • Advertising revenues for IPTV, online video sites like YouTube and other avenues.
  • Mobile TV subscriptions, or as a component of bundles of services.
  • Pay-per-use or per-download services.

The figures exclude any standalone consideration of basic “pipe” revenues that can be attributed to video – for example per-MB mobile data fees incurred during video download.

Third-party estimates for the same market in 2012 vary hugely, from $10bn to around $70bn.

Figure 3 shows STL Partners’ (fairly conservative) prediction of the online video market, around $28bn, set against the total size of the global cinema and TV markets.

Figure 3: Total video market versus total online video market ($bn)

[Figure]

Source: STL Partners

In financial terms, online video looks small compared to cinema and TV, but the online video market will also drive major disruption in existing video markets as described in “Future Scenarios” below.

Future challenges for the industry

The online video market can be modelled as a value chain from content creation to customer devices. This is often called the ‘four box model’. Figure 4 shows the four box supply chain model and the key trends for online video.

Figure 4: Four box video supply chain

[Figure]

Source: STL Partners

1. Content creation

There have been two major shifts over the past five years:

  • Cheaper content recording and production equipment has reduced the costs of capture and creation. The resulting emergence of user-generated content has had a major impact. For example, user-generated content has reduced prices in media professions where differentiation is low, such as photography. It has also resulted in a huge inventory of short-form media on the internet.
  • The digitisation of video libraries, both by rights owners and increasingly by amateurs with low-cost equipment, has led to a huge back catalogue of long-form video being made available online. This has aided not only the established media operators, but has also driven the market in user-copied content – piracy to you and me.

2. Aggregation

The traditional high-cost manual process of content finding, editing and marketing has increasingly been replaced online by low cost, automated aggregation systems. As more media came online, finding content was initially carried out using search engines such as Google.

Now, social media is coming to the fore and networks of friends discover new content. These social networks have also invaded many of the traditional editing functions of selecting, rating and recommending content. Amazon started this with its customer reviews, but the process of peer reviews has become mainstream on the internet, reducing marketing costs for companies as customers do the job that the marketing department used to do.

3. Distribution

Moore’s Law, working open-source software, de facto web service standards and a glut of cheap bandwidth and hardware left over from the dot-com bust have meant that distribution costs, per megabyte, teraflop or Mbit/s, have plummeted since 2000.

In addition, distribution options have multiplied. DSL, cable modems and various wireless technologies all compete as online video distribution platforms, while standards such as WiFi and various mobile 3G technologies have expanded mobile bandwidth by 1,000 times.

Over the past few years, fixed line distributors in Europe and the US have engaged in vicious price cutting to fill their huge empty pipes. The overriding strategy has been flat rate pricing on both fixed line and broadband, giving consumers near-unlimited upload and download volumes. However, the rising use of online video is threatening to overload the capacity of these networks in some areas, leading to increasing debates about capacity throttling versus charging more to fund new capacity infrastructure build-out.

One example of this is the launch of the BBC iPlayer a year ago. The average streaming use per customer in the evening increased by 60% within a few days – see Figure 5 – and one ISP’s streaming costs tripled within a month. Since then, the problem has been exacerbated by the launch of higher quality video with correspondingly higher bit rates.

Figure 5: Video is soaking up bandwidth and driving up ISP costs

[Figure]

Source: PlusNet

In the mobile domain, problems can be even greater. Many operators have now launched consumer-oriented mobile broadband services, both via smartphones like the iPhone, but especially via cheap HSDPA modems connected to laptops. Whereas in the past, even “heavy” business users of 3G data cards have only tended to generate around 200MB per month of traffic, it is now not uncommon for consumers to use 5GB, 10GB or even more – especially video traffic. At first, this was just using up 3G capacity that had essentially been built-out several years ago and which had been left almost unused – in other words, incremental revenue against existing assets and sunk investment costs.

But the rapidity of adoption of consumer mobile broadband may not be all good news. Given that price points start from as little as €10-15 per month, with video traffic now moving into higher definitions, this is starting to look unsustainable, when set against the cost of mobile network capacity upgrades.
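
A crude illustration of the squeeze, using only the figures quoted above: implied revenue per gigabyte is simply the quoted entry-level price divided by the quoted usage, and no assumption is made here about actual network cost per gigabyte.

```python
# Implied revenue per GB for consumer mobile broadband, from the figures above.

price_eur_per_month = (10, 15)   # entry-level mobile broadband price points
usage_gb_per_month = (5, 10)     # now-common consumer monthly usage

lowest = price_eur_per_month[0] / usage_gb_per_month[1]   # EUR 10 / 10 GB
highest = price_eur_per_month[1] / usage_gb_per_month[0]  # EUR 15 / 5 GB

print(f"Implied revenue: roughly EUR {lowest:.0f} to EUR {highest:.0f} per GB")
# As traffic shifts to higher-definition video at flat prices, this figure
# keeps falling while capacity upgrades still have to be funded.
```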

4. Customer environment and devices

The inexorable march of Moore’s Law and increasing adherence to open architectures, together with increasing device interchangeability and application flexibility, have led to the total cost of ownership falling for both hardware and software. This has created new devices and platforms for video, which people describe as the ‘four screens’ of online video – TV, PC, mobile and games consoles.

Each of these screen environments offers a different user experience, leading the environments to be optimised for different media such as full-length movies and short-form clips. This optimisation, together with bandwidth limitations and business model constraints, will form the principal driver of what video content is offered by the supply side and consumed by the demand side on each of the four screens.

5. Supply and demand side issues

The supply side has rushed to serve the perceived new market and, as already noted, a plethora of new businesses and business models has emerged. How value will be extracted is unclear. Some entrants are building audiences on the basis that a large audience will attract advertising, but it remains questionable whether advertising is going to be large enough to sustain this approach, particularly in the current economic climate.

Other start-ups hope they will be able to charge for services that they are initially offering free. But history shows that consumers are happy to pay for distribution, for example broadband access, and for consumer electronics equipment, such as an iPod, but are less keen on paying extra for online services, such as games, email and content.

Similarly, behaviour on the demand side is not yet clear. For example, a large amount of innovation on the supply side has been in the area of user-generated video (UGV), but most research shows that consumers – and thus advertisers – value long-form and high quality short-form media more. Although volumes of UGV may continue to grow, the bulk of value is expected to be in curated, higher quality content, as shown in Figure 6.

Figure 6: User-generated video drives traffic, long-form video drives revenue

[Figure]

Source: The Diffusion Group

This will have a major impact on distributors. On one hand, there is a risk that distributors will be asked to carry vast amounts of low-grade traffic, without necessarily being able to generate the returns to upgrade their capacity. On the other hand, there is an opportunity, given the reluctance of consumers to pay directly for upstream services, to broker content and aggregation services to the end customer.

Future scenarios for online video

While distributors potentially face both threat and opportunity, it is still too early to predict exactly what will happen. There are a number of variables – notably economic and regulatory – that are almost impossible to call, at the beginning of 2009. The depth and length of the recession could have a variety of side-effects on mobile video, ranging from reduced broadband rollout and capex, through to increased consumption of ‘free’ services as consumers avoid the costs of more expensive forms of entertainment. There are also assorted unknowns around regulatory shifts, such as the US FCC’s changing attitudes to Net Neutrality and lobbying on copyright issues. In Europe, there remains uncertainty around the ‘digital dividend’ and switch-off of analogue TV. Allocation of spectrum to broadcasters vs. mobile operators is also a particularly thorny issue.

Instead, as in Figure 7, we can consider three future scenarios: understand what they entail, what has to be believed for them to occur, and how to identify when they are occurring.

Figure 7: Scenarios for examination

Old Order restored

This scenario explores a world in which traditional content aggregators control the value chain in the online world.

Traditional aggregators build an online presence that is additive to their existing distribution channels both in terms of overall viewing and revenue.

Pirate World

This scenario explores a world in which content is freely available online.

The short-term impact will be a shock to content creators and traditional aggregators that will see a rapid decline in traditional revenue sources as more and more people move towards acquiring content on the black market. Some of this content will be delivered online, but not all, as the techno savvy will acquire content online and act as distributors to the less savvy in friendship and family groups.

New Order emerges

This scenario explores a world in which traditional aggregators are trapped in a pincer movement by device manufacturers and new, powerful aggregators.

Device manufacturers build secure content delivery into their products and make ease of use and interactivity key features. Individual manufacturers develop suites of devices to serve viewing both in the home and outside. Consumers happily invest in the latest, greatest gadget.

Source: STL Partners

Working through these scenarios it becomes clear that success will be based not on technology per se, but on which players have the ability and rights to monetise content.

This leads to the following assumptions:

  • Old order restored
    • Re-establishes control and content rights
    • Maintains control of sources of funding, such as ads and subscriptions.
  • Pirate world
    • Success factors include a lack of effective control over rights; ‘free’ wins
    • Offset-based funding, including investment and subsidies, continues to cover the cost of industry growth.
  • New order emerges
    • New copyright model allows pricing control by new aggregators and creators
    • Control of sources of funding, such as ads and subscriptions, migrates.

Combining these assumptions with evidence from historical case studies, the opinions of industry experts and the outcomes of workshops and surveys suggests that the scenarios are inter-related. Over time, STL believes that the story told in Figure 8 will emerge.

  • Old order structures will be at risk from disruptive pirate world plays and will continue to come under pressure
  • Pirate world will not be sustainable as there is not enough money to fund an ‘always free’ industry of this size
  • Within pirate world the beginnings of a new order will emerge.

Figure 8: The shift from the old order to the new order

[Figure]

Source: STL Partners

Figure 8 shows a decline in Old Order market share caused by the Pirate World. At this point, the New Order has only a small market share, but over time the New Order gains market share as the Pirate World grows and then collapses to about 10% of the total. The Old Order retreats to its core business and holds about 25% of the total market. The New Order is expected to take about 40% of the market in 2013 and 65% in 2018. This market share will be built on a number of foundations, the most likely being:

  • Old Order players restructure: Old Order players restructure to compete in the new order markets. Two examples of this occurring are Hulu and BBC iPlayer.
  • Outside players make a new market: The classic example is Apple, which has consistently entered markets that are confused or emergent and has driven a high value, end-to-end solution early, capturing a small, but useful, 15% to 25% market share and earning higher than average surplus.
  • Pirates settle down, become gamekeepers: An example here is YouTube. In October 2008, it looked like Google and YouTube would promote the ‘respectable’ face of pirate world, with Google subsidising YouTube as it continued to host pirated content as fast as, or faster than, Digital Millennium Copyright Act takedown notices could remove it. However, in November, a new financial reality marked a change in behaviour, with YouTube doing deals with MGM to offer ad-funded movies.
  • New trusted guides emerge: Increasingly, consumers will look for people who they can trust to help them navigate through the morass of content. These trusted suppliers will accumulate users as early adopters recommend them to others.

Genre differences

STL Partners examined a number of genres to understand how things will play out. We looked at movies, sport, user-generated and adult content. Movies and sport are major components of video output today, user-generated content is the new kid on the block and adult content is often a bellwether for the future of more mainstream media.

  • Movies: These will feel the impact of pirate world, just as music did. However, movie revenues are more diverse and there are already large streams coming from other areas. Music is only starting to develop diverse revenue streams. The endgame will depend on a new settlement for copyrights. We believe a new concord will emerge as, unlike music, it is too difficult to pirate movies cheaply and too costly to produce new content for free. We believe online movies will be delivered by a combination of Old Order players that have redesigned their supply chains, reformed Pirates and emergent New Order players.
  • Sport: Sport suffers a rapid decline in value after its live date, like news and a few other genres. This makes it less valuable for Pirate World to copy, so the most likely piracy will be illegal live streaming of events where rights already exist. This is an issue, but not on the same scale as rampant copying. However, there are a huge number of events followed globally that are not currently covered. We believe viable New Order businesses can be formed quickly to deliver sport.
  • User-generated content: As discussed earlier, user-generated content is often early to flower, but it tends to wither over time as professional content takes over and more structured markets emerge. We see this occurring here, with the exception of user-generated communications, such as online video social networks.
  • Adult content: The adult content industry is being hit hardest by user-generated content. It faces many problems, but has few solutions. For example, advertising, enforcement of rights, obtaining subsidies or subscriptions are tougher tasks for this sort of content. In addition, legislators are clamping down. We believe this is an industry, like photography, where users can and will create content for free themselves, leading to value destruction.

Mobile video evolution

The initial hype around mobile TV and video has largely stalled, because of a variety of issues:

  • High levels of friction through the supply chain, driving poor returns and poor user experiences;
  • Early reliance on 3.0G networks, often with poor capacity, coverage and latency;
  • Limitations of handsets, including price, battery, screen, application software and usability. For example, some early services took 5+ seconds to ‘change channel’;
  • Poor fit with typical mobile payment methods, especially for prepay users, for whom regular subscription-based services tend to be unsuitable.

However, the new generation of mobile smartphones that has emerged following the debut of the Apple iPhone is leading to a resurgence in mobile video, albeit from a small base. Newer devices have faster 3.5G and WiFi radios, bigger screens and faster graphics processors. Improved mobile web browsers and native video software are further improving the experience, while the costs and complexities of mobile-oriented broadcast (e.g. via DVB-H) are helping the pendulum swing back towards mobile online video.

The good news for Old Order mobile players is that this market is still well regulated, making it difficult for the Pirate World to take over any large volumes. That said, the growing prevalence of flat-rate data plans, coupled with more-capable browsers and ‘sideloading’ content via memory cards, presents a challenge to monetisation, as it is becoming increasingly possible to see ‘the real Internet’ on handsets. Nevertheless, various technical and regulatory factors tend to mean that content and bandwidth consumption is better-policed in mobile than on fixed broadband.

Regional differences

As Figure 9 shows, countries such as Japan, Korea, the Nordics and France are way ahead in bandwidth and price. There is a strong correlation between bandwidth, price and centrally planned and managed economies. The lesson here is similar to that of mobile. To get ahead, some form of national – and perhaps in Europe international – co-ordination will be required to move bandwidth speeds and prices forward.

Figure 9: Price and speed of broadband by country

[Figure]

Source: The Information Technology and Innovation Foundation

This co-ordination is key for distributors, as one of the lessons of the planned rollouts is that it is far better to have visibility of revenues to justify rolling out large scale infrastructure upgrades.

Strategic options for distributors

Distributors must act quickly to avert self-imposed threats of inflexible structures and strategies, and realise the opportunities of entering the market by brokering content for customers.

Threats

The threats are in distributors’ current structures and strategies. In an STL global survey of 145 telecom and media professionals, there was concern about the ability of distributors to compete, both in terms of creating the right services and in executing quickly if they could create the services.

Figure 10: Online distributors seen as second most likely to lose

[Figure]

Respondents suggested that IPTV would not be the major online video market going forward, with various forms of web-based video services taking the lion’s share of the market. This view is backed up by other forecasts of IPTV against other forms of online video take-up.

In addition, there is a real risk that the sheer volume of online video – and the low value of most of it – will make it uneconomic for distributors to play a dumb-pipe role, especially if this would hand market power to players that will then enter areas of the distributors’ markets, such as edge distribution, service provisioning and orchestration. This is a particular concern in the mobile arena, where it is particularly expensive and time-consuming to add capacity, if it involves acquiring extra spectrum or cell-site locations.

Weakness

The major weakness pointed out by many in a Telco 2.0 brainstorm session is that distributors, even if they do respond, may move too slowly and with the wrong business models. The rise of web-based video is also making it less likely that subscription-based IPTV will be able to form a cornerstone of future fibre rollout business models.

Strengths

Looking at the potential scenarios and their problematic outcomes, there is also a strength: distributors make money in every scenario, which is more than can be said for most of the other players. There is at least some money to be made in providing ‘pipes’, especially in favourable regulatory regimes.

The Pirate World will be one where cash is king and those with deep pockets (like Telcos) will gain market share.

Opportunities

In both the pirate world and new order, the aggregator’s power diminishes and the increasing interconnection of CPE devices with the network will drive new opportunities, giving distributors an opportunity to capture some of the power and value.

Old World economics will be disrupted by the impact of the Pirate World, giving distributors a once in a lifetime chance to move up the value chain and avoid being relegated to dumb pipes.

The requirement is for distributors to use their strengths – cash, valuable users, reach, ownership of a key part of the value chain and willingness of users to pay for distribution and CPE devices – to begin to forge value chains that will maximise the opportunities.

Strategic options

While the emergence of the new order is still unclear, our scenario analysis suggests some activities distributors can plan for and strategic options they can consider.

Figure 11: Strategic map for distributors

[Figure]

Source: STL Partners

Summarising Figure 11, we assume distributors start this model with a flat rate, or perhaps a subsidy, for broadband, as is increasingly common. Added value options then follow.

Moving into Pirate World: We assume there will be little revenue to be gained from upstream players, so the key initially must be to sell extra value-added services to downstream users. For example:

  • Service bundles: Not just connectivity, but buying and bulk-splitting services and material that downstream users would not buy themselves. Examples could be brokered content, perhaps downloads from Amazon, aiding interworking between CPE and devices, access to web services such as VoIP and WiFi, fixed and mobile connectivity, and new web services based around storage, security, social networking and unified directories.
  • Content delivery networks and quality of service: As users and contention increase, some users will pay more for better quality connectivity and services that allow synchronous broadband use.

One issue to examine in both these propositions is how distributors can optimise services and gain revenue by expanding into the CPE arena. Nearly all the research we have seen and done implies that, for the user, seamless interoperation of CPE devices is a major requirement.

As the New Order emerges: We believe that there will be increased economic surplus in the value chain (especially upstream from better-protected aggregators and rights owners) so that distributors can seek to develop two-sided market strategies. For example:

  • Higher service levels: Initially, distributors could offer higher service levels for higher value content to upstream service providers. This requires the emergence of a two-sided market.
  • Developing ecosystems: Over time, distributors could develop ecosystems with upstream, downstream and third-party service providers. These ecosystems could exist on Telco distribution platforms and infrastructure.

With a few exceptions, single operators will not be able to drive these strategies alone. They will need to collaborate with each other, certainly nationally and possibly regionally and globally. In many countries, a concerted effort will be required between distributors and government to define the conditions for investment in better, faster capacity.

Conclusion

There are six key conclusions for distributors:

  • The growth of online video will have a major impact on internet traffic, which will experience an order of magnitude growth over the next five years. Our estimates are conservative compared to other analysts’, so internet traffic could grow even more.
  • Although forecast online video revenues of about $28bn in 2013 are not large, they represent an extra revenue stream that will cover costs in converged services and quad-plays where distributors always take revenue.
  • The key opportunity for distributors is to expand their influence in the overall value chain as the aggregation, content and CPE markets undergo disruption in Pirate World.
  • As all value chain models of online video are sensitive to video traffic pricing, the provision of scale will be essential to upstream players. Distributors must leverage this advantage to adopt two-sided business models.
  • Distributors need to create conditions that will allow investment in major capacity upgrades. Where this has been done, it has been done with some form of government or regulatory influence. Elsewhere, distributors will need to influence this, or condition society to capacity overload.
  • In the new order, targeted customer advertising and cost per mille (CPM), as well as the ability to charge for value-added services, will create opportunities for distributors to add value by exposing useful network data and added value services.

For more detail on the evolving scenarios, please see our in-depth Strategy Report: “Online Video Market Study: Options and Opportunities for Distributors in a time of massive disruption”

1 For full details of the Online Video Distribution strategy report, please see:
www.stlpartners.com/telco2_online-video-distribution/

2 Note: We provide an overview of the scenarios here – for more detail on them and how distributors, in particular, should respond to (or drive) them, please see the strategy report.

Full Article: A Quick Slick Unpick of Blyk’s 2-Sided Business Model Trick

The title is Doctor Seuss’ fault as his rhymes are lodged in this writer’s head after years of reading them to his children:

Look, sir. Look, sir. Mr. Knox, sir.
Let’s do tricks with bricks and blocks, sir.
Let’s do tricks with chicks and clocks, sir.
First, I’ll make a quick trick brick stack.
Then I’ll make a quick trick block stack.
You can make a quick trick chick stack.
You can make a quick trick clock stack.
Etc…

We thought it might be helpful to review the Blyk business model in a bit more detail following our pre-launch analysis, where we were bearish on the company. Its business model ties in nicely with our 2-sided strategy for operators, about which we have written on numerous occasions on the Telco 2.0™ blog.

This piece, therefore, seeks to answer the following questions:
1. How does Blyk make money?
2. What are the benefits and risks of the business model? (Are we still bearish?)
3. What are the broader ‘Telco 2.0’ lessons for other operators?

News Glorious News

News flow from Blyk has been positive recently. It announced a few weeks ago that it has reached 200,000 customers in its first year of trading (versus its target of 100,000). This follows press releases in June that the company is set to expand operations in 2009 into other European markets, notably the Netherlands, as well as Belgium, Germany and Spain. All this follows investment (of an undisclosed amount) from Goldman Sachs and IFIC in January. The current squeeze on credit can hardly be helpful to an expanding start-up, but it looks like Blyk was lucky in securing funds ahead of the summer problems.

The Blyk Business Model

Blyk is an ad-funded MVNO focused on the 16-24 year old market (although they position themselves as a ‘media company’). It gifts minutes and texts to customers in exchange for the right to send advertisements to them. Users complete a set of questions about themselves when they sign up, giving Blyk information about their preferences. Advertisers market their products and services via text to Blyk users based on this profiling and Blyk gets paid to deliver the advertisement. So, at first glance, Blyk reverses the normal revenue model for operators: it collects money upstream and pays out for delivering services to customers:

Blyk%201.png

But this is too simplistic (and many who have commented on Blyk’s business model have been guilty of this) because Blyk actually makes money from both sides – from end users as well as advertisers:

1. Termination charges from off-net callers. This is effectively shown in the lower diagram of the chart above, where we show operators as both receivers of money from end users (when originating the call) and receivers of money from other operators (when terminating the call). So every time a Blyk user receives a call or text from an off-net customer, the originating operator pays Blyk for termination. In turn, Blyk obviously pays some of this termination charge out to its network supplier (Orange), but we guesstimate that the company still makes some margin on this.

2. Overage. Typically 16-24 year olds, like the rest of us, have a pre-determined communications budget – “I will spend £x on my phone each month”. The fact that Blyk gives users free calls and texts does not stop users from spending this money. Blyk’s users will simply display the same behaviour that every Telco exec is familiar with: increased communications usage as the price reduces (see this excellent piece on elasticity and pricing from the Ericsson Business Review). Because Blyk offers 217 free texts and 43 free minutes, we believe that users will be profligate with their communications. They will use this free allowance up and STILL spend at least some of their previous budget.

Blyk%202.png

So how much revenue and margin does Blyk make? Well, we developed a model of the company and plugged in the following assumptions:

Usage Assumptions (Average per User per Month)

Makes 230 texts (13 more than 217 limit)
Makes 50 minutes of calls (7 more than 43 limit)
Makes 5 minutes of voicemail calls (all above limit)
Consumes 1MB of off-portal web browsing
Receives 100 texts
Receives 50 minutes of inbound calls
Receives 120 advertising SMS
Receives 30 advertising MMS

Pricing Assumptions

Calls to any UK mobile network or landline (over and above free): 15p/min
Calls to Blyk voicemail: 15p/min
Text messages to UK mobile networks (over and above free): 10p each
Browsing off Blyk portal: £1 per MB
Price charged to Advertiser per SMS: 7p
Price charged to Advertiser per MMS: 22p

Cost Assumptions

Off-net texts are terminated at 3p each
On-net texts are terminated at 2p each
80% of outbound texts are off-net
Off-net calls are terminated at 5.1p per minute
On-net calls are terminated at 4p per minute
80% of calls are off-net
Off-portal browsing costs £0.50 per MB
On-net MMS are terminated at 9p each

Results

Our analysis suggests that, by combining user and advertiser revenues, Blyk could be making as much as £26 in revenue per user per month at a gross margin (defined as revenue less network costs only) of around 28%:

Blyk%203.png

In other words, Blyk makes around 2/3rd of its revenue from upstream customers (advertisers) and 1/3rd from users (overage and inbound):

Blyk%204.png
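To make the arithmetic easier to follow, here is a minimal sketch in Python that combines the usage, pricing and cost assumptions above into per-user monthly revenue and gross margin. It is our own simplified reconstruction rather than the model behind the charts: the termination rates Blyk earns on inbound traffic, and the wholesale rates it pays Orange for carrying inbound and advertising traffic, are not stated here and are therefore assumed, so the totals will differ somewhat from the £26 and 28% figures quoted.

    # Simplified reconstruction of Blyk's per-user, per-month economics.
    # The wholesale rates Blyk pays Orange are not published, so the
    # cost-side allocations below are assumptions for illustration only.

    # Usage assumptions (per user per month)
    texts_over_limit     = 230 - 217
    call_mins_over_limit = 50 - 43
    voicemail_mins       = 5
    browsing_mb          = 1
    texts_received       = 100
    call_mins_received   = 50
    ad_sms, ad_mms       = 120, 30

    # Pricing assumptions (GBP)
    p_text, p_call, p_vmail, p_mb = 0.10, 0.15, 0.15, 1.00
    p_ad_sms, p_ad_mms            = 0.07, 0.22

    # Cost assumptions (GBP); 80% of outbound traffic is off-net
    c_text_off, c_text_on = 0.03, 0.02
    c_call_off, c_call_on = 0.051, 0.04
    off_net               = 0.80
    c_mb, c_mms_on        = 0.50, 0.09

    # Revenue: user overage + inbound termination + advertiser payments
    overage    = (texts_over_limit * p_text + call_mins_over_limit * p_call
                  + voicemail_mins * p_vmail + browsing_mb * p_mb)
    inbound    = (texts_received * c_text_off          # assumed SMS termination rate earned
                  + call_mins_received * c_call_off)   # assumed voice termination rate earned
    advertiser = ad_sms * p_ad_sms + ad_mms * p_ad_mms
    revenue    = overage + inbound + advertiser

    # Network costs: outbound traffic, ad delivery and (assumed) inbound carriage
    out_cost = (230 * (off_net * c_text_off + (1 - off_net) * c_text_on)
                + (50 + 5) * (off_net * c_call_off + (1 - off_net) * c_call_on)
                + browsing_mb * c_mb)
    ad_cost  = ad_sms * c_text_on + ad_mms * c_mms_on
    in_cost  = texts_received * c_text_on + call_mins_received * c_call_on
    cost     = out_cost + ad_cost + in_cost

    margin = revenue - cost
    print(f"Revenue per user per month: £{revenue:.2f} (advertisers {advertiser / revenue:.0%})")
    print(f"Gross margin: £{margin:.2f} ({margin / revenue:.0%})")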

It is worth pointing out that Blyk has, thus far, been pretty successful at (a) attracting advertisers and (b) managing campaigns. In fact, response rates across 116 campaigns run over a four-week period towards the end of 2007 were a staggering 29%:

blyk-5.png

29% compares very favourably to other forms of digital advertising (Source: e-consultancy, September 2007) and suggests both that young people are open to this value exchange (receiving ads and giving up information about themselves in exchange for free communications) and that even basic targeting is effective:

* On-line Advertising 0.02%
* Paid Search Advertising 0.2%
* Email 0.1%
* Direct Mail 2.0%
* Magazines 0.2%
* Direct Response TV 0.04%
* Radio 0.01%

Benefits and Risks of the Business Model

There is a lot about Blyk’s business model to admire. Compared with a traditional one-sided mobile operator Blyk has the following strengths:

Higher ARPUs. By introducing a second revenue source, Blyk can potentially more than double the ARPU levels achieved by a traditional one-sided player.

Strong appeal to advertisers. Response rates appear to be so good that advertisers cannot fail to be impressed with the Blyk platform as a means of communicating with a traditionally ‘hard-to-get-at’ segment. They certainly seem to have signed up plenty of high-profile brands including WDK (drinks), Penguin (books), Sky Box Office (TV), Local Government (elections), Brylcreem (male grooming products), Boots (Retail). There are lots of examples on the Blyk media portal.

Strong appeal to youth market. Students on a tight budget will be seeking value for money and Blyk offers this in spades in return for relatively limited intrusion (users receive a maximum of 6 ads per day).

Speed to market. The simple approach to targeting (capturing user preferences when they sign up) is not particularly sophisticated and certainly way short of providing real-time behavioural targeting but it has allowed Blyk to launch and grow quite quickly – no operator has yet launched anything similar.

However, as we pointed out before, there are large risks for Blyk. Specifically:

Network pricing. Because it is an MVNO, Blyk is to a great extent dependent on the prices charged by operators for network usage (for origination, transmission and termination). In a competitive market like the UK, these are unlikely to be excessive, but there is a margin risk for Blyk if they rise. Blyk would presumably be able to pass some of the increase on via the termination revenue it generates on inbound minutes and texts, but this would not be enough to offset the margin hit. In our model, we calculate that a 10% increase in network costs would see gross margin drop from £7.27 per user per month (28%) to £5.95 (22%).

Declining response rates. A 29% response rate is mighty impressive but this figure is likely to come down as the initial enthusiasm for receiving advertising diminishes and as Blyk penetrates more deeply into this segment and captures users who are less wedded to the ad-funded model. This has two potential impacts:

It may make advertisers less inclined to use Blyk, which would reduce the premium prices that Blyk can charge advertisers for SMS and MMS messages.

It will impact the number of SMS and MMS messages sent over the course of a campaign, which could have a substantial impact on advertiser revenues. To illustrate this, suppose that Blyk conducted an SMS campaign for an advertiser to 20,000 of its user base and achieved a 29% response rate overall (additional messages are sent only to those who respond, up to a maximum of 3). We calculate that such a campaign could be worth £2,345 to Blyk. However, if the response rate drops to 10% (still quite high), then Blyk’s revenue drops by nearly 30% to £1,694:

Blyk%205.png

Given that advertisers account for nearly 2/3rds of Blyk’s revenue, this would equate to an 18% revenue hit overall (assuming stable subscriber numbers).
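As a rough illustration of this sensitivity, the sketch below models a hypothetical campaign in Python. The campaign mechanics are our assumptions for illustration only: an initial message to all 20,000 recipients, follow-ups sent only to responders, a maximum of three messages per user, all billed at the 7p SMS rate. The absolute figures will therefore not reproduce the £2,345 and £1,694 quoted above; the point is simply how quickly campaign revenue falls away as the response rate declines.

    # Toy model of an SMS campaign's revenue sensitivity to response rate.
    # Assumed mechanics: each round is sent only to users who responded to
    # the previous one, capped at MAX_MESSAGES rounds per user.
    PRICE_PER_SMS = 0.07   # price charged to the advertiser per message (GBP)
    MAX_MESSAGES  = 3

    def campaign_revenue(users: int, response_rate: float) -> float:
        revenue, recipients = 0.0, float(users)
        for _ in range(MAX_MESSAGES):
            revenue += recipients * PRICE_PER_SMS
            recipients *= response_rate      # only responders get the next message
        return revenue

    for rate in (0.29, 0.10):
        print(f"{rate:.0%} response rate: £{campaign_revenue(20_000, rate):,.0f}")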

Operator competition. To date, no operators have followed Blyk into the youth market with an ad-funded model. But if Blyk shows signs of success, you can be sure that other operators will look for a piece of the action. Orange, Blyk’s network provider, has a youth skew and, if it sees ad-funding as providing incremental value (rather than cannibalising subscriber revenues), it is likely to follow suit. And Virgin also has a strong youth bias and could potentially copy the Blyk model relatively easily. Moves such as these are likely to drive prices down for advertiser media purchases.

Scalability. Even if Blyk could capture a large proportion of 16-24 year olds (which seems unlikely in saturated and competitive European markets), the amount Blyk spends on acquiring customers is likely to mean that EBITDA margins will be razor thin. Our 28% gross margin excludes operations, customer care (where it looks like they have had some problems) and marketing and sales costs. The latter is particularly concerning since Blyk uses people at university campuses to sign up prospects and capture profile information. This simply doesn’t scale effectively, and the sign-up and data capture process will need to be automated as Blyk grows to improve both efficiency and the effectiveness of targeting.

Growth – eats itself. Ironically, it is because Blyk is so small that we calculate that 25% of its revenue could come from inbound termination of off-net calls and messages. If the company grows and more and more calls and texts are on-net, Blyk continues to pick up the costs without the associated termination benefit. Like the voice arbitrage players, which make money by using the internet to cut the cost of mobile and fixed calls, Blyk is to some extent a beneficiary of its small size: if it grows, it loses a key revenue stream.

Lessons for Operators

1. 2-sided market opportunity is real. Perhaps the most obvious lesson for other operators is that there is value in 2-sided markets! Blyk may struggle to make a return for the reasons mentioned above, but it has already done enough to show that for operators with large existing (youth) customer bases the ad-funded model could be fruitful. We think this also shows the wider potential for 2-sided opportunities in the areas outlined in our report on the subject.

2. Different Business Model = Different Business! It is not mere marketing fluff that Blyk refers to itself as a media company rather than an MVNO. It shows that Blyk’s management considers the advertising community as its primary market and end users as ‘members’ rather than customers. This is important – a different business model is a different business. A two-sided approach for operators will require new customers, new metrics, new operational procedures and processes, new skills and assets (see below). It will be very, very difficult to build this within the existing organisation structure, and operators should consider carving out the new unit and making it a customer of the core business.

The core business could even charge the new unit for using the customer and network data and other assets it requires for success. The ‘differentness’ of this future business was brought home to me recently in a meeting with two strategy executives at a leading European mobile operator, who said that one of the key barriers to developing a two-sided business model is the current metrics used for business planning. Unless projects are shown to replicate the 40-50% EBITDA margin enjoyed by the current business, they fall at the first hurdle. The two-sided business is likely to be much less capital intensive than the current business so, while it may not generate such high EBITDA margins, EBIT margins could be equally impressive.

3. Scale for success. We have oft pointed out the need to build scale on at least one side of a platform. I was delighted to see a media agency voicing this recently, when Grant Miller, joint MD of Vizeum, explained why they had chosen AOL’s Platform-A for promoting Oasis’ new album:

“We need a property that has scale, tools and the technology to build a communications platform that delivers on all fronts. By bringing together all its individual properties, Platform-A represents a great opportunity to build a dialogue with the target audience.”

Blyk has done well from a standing start and its 200,000 users are clearly attracting brands.

The real value to advertisers (and merchants, governments, developers, enterprises and other upstream customers) is from seriously large numbers of end-user customers willing to accept advertising and other telco-enabled value-added services. This makes the 2-sided telco opportunity most valuable to larger operators OR the operator community working collaboratively.

4. The power of a 2-sided pricing strategy. Blyk isn’t the first company to give stuff away. Google gives 99% of its products and services away to end users and Microsoft gives away its SDK for Windows to developers. What these companies do is subsidise one side of the platform and charge a premium to the other, and thus seek to maximise value across BOTH sides. In Google’s case, its efficiency means that it can undercut other advertising channels’ prices and still make a handsome return. The ability to understand and use such a pricing strategy makes 2-sided players tremendously powerful as they can attack the markets of competitors that charge for services that they give away.
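The logic of subsidising one side to maximise value across both can be made concrete with a toy model. In the sketch below every coefficient is invented purely for illustration: users join based on the price they pay, advertisers join based on their price and on how many users they can reach, and we grid-search both prices. Under these assumptions the revenue-maximising strategy charges users nothing and recovers far more from advertisers than a one-sided model could from users alone, which is the pattern Blyk and Google follow.

    # Toy two-sided pricing model; prices and volumes are in arbitrary units
    # and every coefficient below is an illustrative assumption.

    def n_users(p_user: float) -> float:
        return max(0.0, 100 - 10 * p_user)            # user demand falls with price

    def n_advertisers(p_adv: float, users: float) -> float:
        return max(0.0, 50 - 5 * p_adv + 2 * users)   # advertiser demand rises with reach

    def revenue(p_user: float, p_adv: float) -> float:
        users = n_users(p_user)
        return p_user * users + p_adv * n_advertisers(p_adv, users)

    grid = [x / 2 for x in range(0, 61)]               # candidate prices 0.0 .. 30.0
    best = max((revenue(pu, pa), pu, pa) for pu in grid for pa in grid)
    print(f"Two-sided optimum: charge users {best[1]}, advertisers {best[2]}, revenue {best[0]:.0f}")

    # For comparison, a one-sided model that can only charge users.
    one_sided = max((revenue(pu, 0.0), pu) for pu in grid)
    print(f"One-sided optimum: charge users {one_sided[1]}, revenue {one_sided[0]:.0f}")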

5. Cost control remains king. You’ve got your customer base on one side and you are building scale on the other side, so you’re sorted, right? Absolutely not. The platform will only thrive if it not only provides an effective service (identification, authentication, advertising, billing, content delivery, customer care, etc.) but also does so more cheaply than can be found elsewhere. Google is winning because advertising is cheap for brands; Microsoft won on Windows partly because the cost of the platform, when bundled in with a PC purchase, was negligible. This means that driving costs out of the platform is critical. The high-cost nature of Blyk’s sales model and customer data acquisition is a worry, and other operators looking to enter the market should seek to ruthlessly drive cost out of the system.

6. Customer data and CRM is core. Even with its relatively low-tech data acquisition approach, Blyk shows that targeting customers with the right message/product/service/solution really does work. Operators should seek to invest heavily in this area whether they pursue a 2-sided strategy or not because understanding their customers better can only improve the delivery of their own retail services anyway. A strong CRM capability becomes a must-have if they wish to become a platform player like Google.

Finally, what is Blyk’s plan for the emerging world of Voice & Messaging 2.0? After all, its target demographic is made up of exactly the same young early-adopter kids who most of the new V&M players are targeting; but its product isn’t really geared to that. For example, they’re keeping a tight grip on the data pipe, and it’s 2G only. And there’s no sign of a developer community.

However, Blyk does have capabilities most MVNOs don’t – it has its own complete Nokia Siemens Networks-provided core network, not just an HLR plugged into a partner’s network. So, how long before there’s a Blyk API to play with? Or do they fear cannibalisation too much?

Full Article: Mobile NGN, a Real Telco 2.0 Opportunity?

The sixty-page document “Next-Generation Mobile Networks (NGMN): Beyond HSPA and EVDO” is the latest white paper of NGMN.org, an initiative by the CTOs of China Mobile, KPN Mobile, NTT DoCoMo, Orange, Sprint Nextel, T-Mobile International and Vodafone Group. It provides a technical requirements framework to vendors for the next iteration of mobile networks.

To be clear, what’s defined is just a technology toolkit. Different carriers may deploy it in different ways with varying business models and services. Until we see the business models, jubilation or damnation is premature. Nonetheless, this is an extremely important document. The “walled gardens” of 3G are starting to look like weed patches, and this is a rare chance to define a truly new Telco 2.0 approach that takes the best of the Internet and traditional telecoms models.

The document avoids wild flights of fancy about sophisticated combinatorial services, and focuses on practical implementation concerns of mobile broadband. It rightly sees the mobile ecosystem as a co-evolution of devices, access and services. This offers a valid and viable parallel/alternative path to the fragmented and sometimes chaotic Internet approach. It’s clear about what generic classes of service are to be offered, and what tradeoffs are likely to be acceptable. The document also outlines a very much evolutionary approach: business-as-usual, only faster and cheaper.

And therein lie the big questions:
* Does it go far enough in addressing the forces tugging apart network access, services and devices?
* Does it react to the counter-forces that would push them back together in order to address deep architectural issues of IP and the Internet (such as weak security and low efficiency)?

Our answer based on our reading is “maybe, if deployed right” — but you need to be a bit of a Kremlinologist to read between the lines and think about what’s left unsaid.

We’ll start with the easy bit: things in the document that make sense about Making Money in an IP world. Then we can delve into the more philosophical and practical limits of that IP world and how a next-generation architecture might address them.

Plenty to praise

There are many positive improvements proposed. Some highlights might include:

  • Self-configuring networks that cost less to run.
  • Improved scheduling algorithms that focus on user “quality of experience” at the periphery of a cell site, rather than RFP-friendly numbers for maximum burst throughput standing under the cell tower at 3am on Christmas morning.
  • Flexible and modular service-oriented architecture to accommodate future change.

Put simply, whatever NGMN turns out to be, operators want OSS and BSS thought through in advance, and for vendors to take responsibility for the operator and user experience post-installation. So far, so good.

Aligned with several Telco 2.0 trends

There are also some features which come with the “Telco 2.0 Approved” stamp because of their reflection of the business trends we see:

  • The ability to share equipment and do more slice-and-dice of the infrastructure similar to MVNOs, but better. We see infrastructure sharing and new modes of financing/ownership as a key Telco 2.0 trend (as we will discuss at our forthcoming Digital Town event workstream).
  • Stronger device and end-to-end security to enable transactions of money or sensitive data. As telcos are already diversifying into the payments and identity business, this can only grow — and depends on such enabling infrastructure. DoCoMo are part of the consortium, and given their trailblazing in payments services, we’re hopeful of seeing diversification successes of operators elsewhere based on their learnings.
  • Detection and mitigation of network traffic resulting from malware or attack. This we feel will be a growth area as the services become less controlled. A limitation of the “intelligence at the edge” concept is the ability of those edges to collaborate to detect and eliminate abuse. The experience of email spam and phishing tells us that not all is wonderful in Internetland.

Moving on, there are several things conspicuous by their absence.

The Internet elephant in the corner

Apart from some in-passing references in a few tables and diagrams, the word “Internet” is wholly absent from the document. It’s a bit like Skype, YouTube and BitTorrent never happened. In fact, you can only conclude this absence is deliberate.

It could very well be that the technology defined can be deployed in very different manners, and operators may take radically different approaches — such as the contrast between 3 and T-Mobile in the UK embracing open Internet access, O2 trying to keep people on-portal, and Vodafone outright banning many popular Internet services such as IM, VoIP and streaming. Will operators want to continue to ride the “Telco 1.0” command-and-control horse, or switch to a more open “Telco 2.0” Internet-centric approach? Will the point of a future mobile network be to channel bits back at all costs to a cell tower, where they can contend for expensive backhaul to be deep-packet-inspected, metered and accounted for? Or will it complement the other infrastructure that exists?

The IMS mouse in the cupboard

Equally conspicuous by its general absence is reference to IMS. Our take is that there could be a polarisation here between “service-centric” operators trying to define interoperable new services and compete against Internet players; and “connectivity-centric” operators who create “smart dumb pipes” and enabling platforms for a wide ecosystem of players. You could deploy NGMN and completely ignore IMS if you chose to do so.

Local connectivity, globally interoperable

At the other extreme of connectivity, another thing not given much ink is the explosion of highly local connectivity. For example, we’ve just passed the billionth Bluetooth-enabled device. Motorola’s Chief Software Architect, John Waclawsky, described this at the last Telco 2.0 event in October in his presentation “From POTS [telephony] to PANS [Personal Area Networks]”. The mobile network itself can still play a part in this, such as offering directories of resources. If you’re sat in Starbucks today and want to print out a document, you’re out of luck — the network can’t help you locate or pay for such services.

Given that this is an integrated vision of handset, network and service evolution, we think it may be gold-plating the long-haul connectivity vision and underspecifying the local connectivity one. The business model will also need to evolve, since there may be no billable event. It has to anyway: products like Truphone will make it ever easier for users to bypass or arbitrage network access.

What’s the commercial vision?

Naturally, the operators can’t write down a collective commercial vision (because of anti-trust), nor an individual one (due to commercial confidentiality). So you have to impute the commercial vision from the technology roadmap.

The stated requirement is for a network that’s low-latency, efficient, high-throughput, more symmetrical, good at unicast, multicast and broadcast, cheap, and interoperates seamlessly with everything that went before it. It’s a bit like low-calorie cream-topped chocolate cake. Sounds like a good idea, until you try making one.

The inevitable billion-dollar question is what are the services and the business model that will pay for all this? The experience from 3G was that “faster” isn’t itself a user benefit of significance (particularly when it doesn’t work indoors!) In fact, given that battery technology follows a curve well below that of Moore’s Law (or its transmission equivalent), there’s the “oven mitt” problem of early 3G handsets still lurking: how to create hand-held devices that are physically capable of sourcing and sinking data at such speeds and over such distances (and high power) — and that create services users care about in the process.

Or, to put in another way, why sync my iPod over the air slowly when I can plug this USB cable into my laptop and do it at 400Mbps for free?

What is a mobile network for, exactly?

There’s a significant difference of expert opinion here that’s worth noting. There isn’t universal agreement on what wireless networks are best used for compared to wireline. For example, Peter Cochrane, the former CTO and head of research at BT has long been keen on forgetting DSL and copper and going all-wireless. NGMN’s ambitions to match and exceed the technical and cost capabilities of DSL suggest a commercial vision of competing against fixed access.

Our take is that success is most likely to come from intelligently blending the best of fixed, mobile and media-based delivery of data, rather than an absolutist approach to any one of these. Furthermore, the unsolved user problems are more to do with identity, provisioning, security and “seamlessness” than speed or even price. Finally, users don’t generally see the up-front value in metered or fixed buckets of IP connectivity, particularly given the anxiety it causes over cost or overage. True unlimited use isn’t technically possible, so the network has to allow connectivity to be bundled into the sale of specific device or application types, where traffic is more predictable.

Stop looking for the platinum bit

The hypothesis seems to be that some bits will be blessed with “End-to-end QoS” and continue to gather super-premium pricing (by many orders of magnitude). The need for this QoS capability is repeatedly stated, at the same time as network capacity, latency and cost improve to near-wireline levels. I think you can spot the problem. I’ve made a successful Skype call to someone 35,000 feet up on a 747 somewhere over central Asia, and there wasn’t any QoS involved.

Our post on Paris Metro Pricing attempts to challenge some of the assumptions that drive this requirement. It sounds esoteric to those from the commercial side of the business, but ignoring this small technical detail is telecom’s equivalent of the frozen O-ring. Set the price high, and at some point all the valuable bits flow around the “premium pipe” and not through it, and the commercial model fails.
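To make this point concrete, here is a toy model in Python, with every number invented for illustration, of a premium QoS pipe priced against an ever-improving best-effort path. Each traffic source buys the premium pipe only if the value it places on the QoS uplift exceeds the price, and that uplift shrinks as best-effort quality approaches wireline levels, so premium revenue collapses long before prices reach the many-orders-of-magnitude levels the platinum-bit hypothesis assumes.

    # Toy model: premium-pipe revenue vs price as the best-effort path improves.
    # Each source values QoS in proportion to how bad best-effort is, and buys
    # the premium pipe only if that value exceeds the price. All figures are
    # illustrative assumptions.
    import random

    random.seed(1)
    sources = [random.uniform(0, 10) for _ in range(10_000)]   # sensitivity to congestion

    def premium_revenue(price: float, quality_gap: float) -> float:
        buyers = sum(1 for s in sources if s * quality_gap > price)
        return price * buyers

    for gap in (1.0, 0.3, 0.1):          # best-effort path closing in on wireline quality
        best = max((premium_revenue(p / 10, gap), p / 10) for p in range(1, 201))
        print(f"quality gap {gap}: premium revenue peaks at price {best[1]:.1f} "
              f"and is zero above price {10 * gap:.1f}")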

NGMN could be part of the solution here, not the problem. If operators can switch to a congestion-based mode of pricing, rather than pure capacity, they could offer users a far better deal.

What are the real sources of value?

Here are some examples of requirements in the document, and how NGMN provides opportunities for product and business innovation:

  • Making user data more seamlessly accessible, blurring the line between online and offline. The specification includes standardised APIs (i.e. not operator- or handset-specific) to sync online and offline data like address books, so the user doesn’t have to care so much about network connection state. This whole process could be taken much further to cover all content. This lecture video by Van Jacobson, former Chief Scientist at Cisco, points to a very different future network architecture based around diffusion of data rather than today’s packet-only networks, where you have to know where every piece of data is located to find it. (Hat tip: Gordon Cook.) This isn’t a theoretical concern: wireless networks readily become congested. Maybe it’s time to reward your neighbours for delivering you the content, rather than backhauling everything across the globe. The Internet’s address space is flat, but its cost structure is not.
  • Deeper coverage, richer business models. The document talks about hub terminals (e.g. femtocells). Deep in-building and local coverage is a clear user desire. The first step is outlined, but there’s no corresponding economic model being included. Companies like FON and Iliad are doing innovative things with user-premises equipment and roaming. We hope NGMN doesn’t repeat the experience of Wi-Fi, where hooks for payment weren’t included (causing a mess of splash screens), and the social aspects were neglected (am I sharing this access point deliberately?). The existence of bottom-up network deployment is an interesting possibility. You need to create new security and payment mechanisms so that local entrepreneurs can extend networks based on local knowledge and marketing. Top-down is becoming top-heavy.
  • Support for a diverse array of charging models. It’s in there, but could get lost in the deep-packet-inspection swamps. The genius of telephony and SMS is to sell connectivity bundled with service in little incremental slices. We’d like to see richer, better and simpler ways of device makers and service providers bundling in connectivity. (See our earlier article on this for more details.) For example, the manifest of a download application could say that Acme Corp. is going to pay for the resulting traffic — and the secure handset will ensure it’s not abused to tunnel unrelated data at Acme’s expense. NGMN could enable this.
  • Uplinks vs. downlinks. Users create as much content as they consume. Devices are equipped with multi-megapixel cameras and video capture, which will be uploaded for online storage and sharing. That media is then often down-sized for viewing (if it is ever viewed at all). Yet the standards continue to emphasise downlink performance. We’ll acknowledge that from a technology perspective uplink engineering is like trying to fire bullets back into the gun barrel from a distance. Somehow this issue needs to be looked at. NGMN takes us closer, at least.
  • Peer-to-peer. A great requirement in the specification is “better support for ‘always on’ devices, with improved battery performance and network resource usage”. We’d second that. But given this requirement, where’s the peer-to-peer specification of the services those devices should host? Or do operators still believe that the purpose of the network remains distribution of professionally authored media entertainment from “central them” to “edge us”?
  • Building an identity-centric business. Another good requirement is for more advanced modes of device authentication, such as sharing a SIM among multiple devices. In some ways it defines an “identity network” that is independent of the NGMN, and potentially fixes some serious problems with the Internet. Mobile networks may happen to use those identities, but they’re equals with other uses. We’d encourage more creative thinking in this area.

Summary thoughts

Overall, it’s a good piece of work. Change doesn’t happen overnight, and given a 3-5 year time horizon, the world will not be beyond recognition. Nonetheless, without a parallel vision of business model evolution, much of the investment in NGMN could become just as stranded as that in 3G. With the right vision, it could make the “mobile Internet” really work, since the “real Internet” continues to be a polluted, expensive and frustrating experience for users.

Telcos’ Role in the Advertising Value Chain

Summary: A report identifying how to build a valuable new business model and customer base.


Background

Fixed and mobile voice and data revenues are in free-fall in most European and North American markets. Since the 3G auctions at the turn of the century, content has long been considered the key future growth area for operators in the consumer segment. However, excluding SMS, the only material content revenues for Telcos to date have been through movement into adjacent markets – particularly acquisitions in the cable and media sectors.

Through advertising, operators have a potential opportunity to:

  • Reduce the price of content and services to end-users;
  • Increase the volume of available content and services, and
  • Provide value to the advertising community

 

To achieve this they must contribute to the development of a differentiated new advertising channel in which users are provided with a portfolio of content and services supported by contextually-relevant advertising.

Operators have an opportunity both to provide their own advertising-funded services as well as become an enabler to the advertising community by helping advertisers interact more effectively with their targets (who may or may not be Telco customers). In this report, we examine both of these opportunities in both the fixed and mobile markets. We explore in detail what advertisers and users really want and the opportunities available to operators to carve out a valuable role in meeting those needs.

Key Questions Answered

This report seeks to help operators and vendors maximise future advertising-funded service opportunities by answering the following questions:

  • What is the rationale for advertising-funded services?
  • When will the market take off and how big will it get?
  • How can operators prevent cannibalising existing revenue streams?
  • What are the needs of the advertising community?
  • How should operators work with Internet enablers (e.g. Google), content providers (e.g. Sony) and aggregators (e.g. Motricity)?
  • What implementation issues need to be resolved?
  • What are the options available to operators to add value and what is the best option available?
  • What are the key factors for success?
  • What value is there in opening-up Telco assets (open APIs etc.)?
  • What can be learned from market leaders in advertising-funded services?
  • What are the attitudes of operators, internet enablers, content providers and aggregators to the market and how to be successful in it?
  • What needs to be done to develop the market and generate near-term benefits?

Contents

  • Executive Summary
  • Background and Key Issues to Date
  • Growing Pressure on the Existing Operator Business Model
  • Content Delivery: Not a Panacea
  • Advertising-Funded Services: Tried and Tested in Adjacent Markets
  • Telcos’ Role in Advertising: Market Scope
  • Activity from Operators to Date
  • Advertising-Funded Services – Threat or Opportunity?
  • The risk of cannibalising existing revenues
  • Internet players – partner or competitor?
  • Show me the Money! – How big could the market be?
  • Understanding the Advertising-Funded Value Chain
  • Value Chain players in Internet Advertising
  • What do Advertisers *really* want?
  • Options for the Operator to add Value
  • Key Skills and Assets Required
  • Issues to resolve
  • Operator role: The Devil in the Detail
  • Who to Partner with and How
  • Meeting Advertiser and Customer Needs:
  • Return on Investment
  • Customer attention & interaction
  • Performance measurement
  • Ubiquity
  • Legal and Regulatory Issues
  • Learning from Web 2.0
  • Content and Communications: Two sides of the same coin
  • Social Networking Communities and Advertising
  • Case studies:
  • Learning from the Master: Google and the Art of Ad-Funding
  • Accelerating the need for Advertising Revenues: The X-Series from 3
  • The Whole Hog: Blyk’s Advertising-Funded MVNO
  • Delivering an Open Platform: Amazon
  • Views from the Industry – new primary research by STL Partners
  • Action steps & Conclusions

This report is now available to members of our Telco 2.0 Research Executive Briefing Service. Above is an introductory extract and list of contents from this strategy report, which can be downloaded in full in PDF format by members of the Executive Briefing Service here. To order or find out more, please email contact@telco2.net or call +44 (0) 207 247 5003.

Full Article: Beyond bundling, the future of broadband

This is an edited version of the keynote presentation of Martin Geddes, Chief Analyst at STL Partners, at the October 2007 Telco 2.0 Executive Brainstorm in London. It provides some initial findings from our research into future business models for broadband service providers (BSPs), including our online survey. (The summary results will be mailed out to respondents in the next few days.) Those wishing to find out more may want to take a look at our forthcoming report, Broadband Business Models 2.0.

To save you the suspense, here’s the headlines for what’s upcoming for the telecoms industry, based on what insiders are saying through our survey and research:

  1. Operators are going to face a slew of non-traditional voice service competition. To corrupt the words of Yogi Berra, “The phone network? Nobody goes there anymore, it’s too crowded.” The volume may linger on, but the margins in personal communication will move elsewhere.
  2. Content delivery is a logistics problem that spans many distribution systems. Those who can solve the delivery problem by sewing together many delivery services, rather than those focused on owning and controlling one channel, will win.
  3. Wholesale markets in telecoms are immature and need to evolve to support new business models.
  4. Investors aren’t up for more “loser takes nothing” facilities-based competition capex splurges. Time to look hard at network sharing models.

So, read on for the background and evidence:

Background to the survey and research

Our ingoing hypothesis is that telecoms – fixed or mobile — is a freight business for valuable bits. This could be via traditional voice networks. Broadband is another means of delivering those bits. It includes Internet ISP access, as well as other services such as private VPNs and IPTV.

Broadband competes with and complements other delivery systems like broadcast TV, circuit-switched phone calls and physical media.

Just as with physical goods, there are lots of delivery systems for information goods. These are based on the bulk, value and urgency of the product – from bicycle couriers to container lorries for atoms; phone calls to broadcast TV for bits.

As part of our research we’ve also been looking at how other communications and delivery systems have evolved commercially, and what the lessons are for the telecoms industry. After all, broadband as a mass-market business is barely a decade old, so we can expect considerable future change. In particular, the container industry has some strong parallels that may hold important lessons.

Physical goods and the telephone system have developed a wide range of payment methods and business models.

With physical goods we have “collect it yourself”, cash-on-delivery, pre-paid envelopes and packages, as well as express parcels, first and second class postage.

The phone system offers freephone, national, non-geographic and various premium-rate billing features. It offers the user a simple, packaged service that includes connectivity, value-added features, interoperability, support and a wide choice of devices.

Likewise, SMS packages together the service and its transport. It’s wildly popular, bringing in more money globally than games software, music and movies combined.

The problem is that this has come within closed systems that don’t enjoy the rich innovation that the open Internet brings.

Internet access, by contrast, offers an abundance of goods but is relatively immature in the commercial models on offer. Broadband service providers typically offer just one product: Internet access. And they generally only offer one payment mechanism for delivery of those online applications: one-size-fits-all metered or unlimited, paid independently of services used. (There are some important exceptions — you can read more here.)

As a small example of how the Internet under-serves its users, when a small non-commercial website suddenly gets a surge of traffic it typically falls over and is swamped. That is because there’s no commercial incentive for everyone to pay for a massively scalable hosting plan just in case of unexpected demand. The telephony system doesn’t suffer this because the termination fee for every call is designed to at least cover the technical cost of carrying the call.

Oh, and don’t expect Google to host it all for free for you either – the error message in the slide above is cut and pasted from a bandwidth-exceeded Google Blogger account.

There is also a lack of incentive for access providers to invest in capacity on behalf of Google to deliver richer, heavier content (where Google collects the revenues).

The question therefore is: How can BSPs find new business models inspired by more mature distribution systems?… whilst at the same time not killing off the innovation commons that is the Internet. BSPs must both create and capture new value in the delivery of online applications and content. Being an NGN or IPTV gatekeeper is not enough.

Fixed voice revenues are declining; mobile voice is peaking; and SMS is slowing down. The theory has always been that broadband ISP services will take up the slack, but in practice margins are thin.

Our research is testing out a wide variety of alternative commercial models. For example, would an advertiser like Google pay for not just the hosting of content (via YouTube, Picasa or Blogger), but also the end-user usage on a fixed or mobile device for receiving that content?

We believe that whilst these alternative models may individually be much smaller than traditional broadband Internet access, collectively they may add up to a larger amount of value.

Survey supporters and respondents

The research would not be possible without the active support of the above sponsoring and supporting organisations, and we thank them all.

We’ve had over 800 respondents, with roughly one third from operators & ISPs; a quarter from vendors; and the rest consultants, analysts, etc. The geographic split is Europe 40%, N America 30%, Emerging 20%, Developed Asia 10%. There is a ratio of around 60:40 fixed:mobile respondents, and mostly people from commercial (rather than technical) functions.

We asked about four main areas:

  • Today’s ISP model — is it sustainable?
  • Future of voice service in a broadband world
  • Future of video service, as the other leg of the “triple play” stool
  • Future business and distribution models

Rather than assault you with dozens of charts and statistical analyses, what follows is the gist of what we’ve discovered.

Furthermore, we’re looking 5-10 years out at macro trends. You might not be able to predict Google, Skype or Facebook; but you can foretell the rise of search, VoIP and socially-enhanced online services. Even in our own industry, there can be large structural changes, such as the creation of Openreach by BT. You could probably have foretold that as vertical integration weakens there would be such organisational upheavals, even if not who and when.

Sustainability of ISP business model

What’s the future business model for broadband?

Around 20% see the current stand-alone ISP business model as sustainable long-term. This includes many senior industry figures, who cite better segmentation, tiered price plans, cost-cutting and reduced competition in more consolidated markets. It may be a minority view, but cannot be dismissed out of hand.

Around a quarter of respondents thought that broadband works as part of a triple or quad-play bundle of voice, video and data – cross-subsidised by its higher-margin cousins. This is the current received wisdom.

However, a majority of respondents say that a new business model is required. These results hold broadly true across fixed and mobile; geographies and sectors.

Which brings us to our first lesson from the container industry. Old product and pricing structures die hard. The equivalent efforts at maintaining a “voice premium” all failed. Trying to price traffic according to the value of what’s inside the container or packet doesn’t scale.

For BSPs, that means technologies like deep packet inspection might be used:

  • for law enforcement (“x-ray the containers”), or
  • to improve user experience (at the user’s request), for example by prioritising latency-sensitive traffic (“perishable goods”)

However, traffic shaping can’t be your only or main tool for the long-term; you can’t reverse-engineer a new business model onto the old structures. It doesn’t, ultimately, contain your costs or generate significant new revenues.

Broadband voice

One of the big surprises of the survey was how quickly respondents see alternative voice networks getting traction. We asked what proportion of voice minutes (volume – not value) will go over four different kinds of telephony in 5 and 10 years from now. Looking at just the growth areas of IP (i.e. non-circuit) voice, you get the following result.

It seems those WiFi phones we laugh at now are more dangerous than previously thought – maybe when 90% of your young customers are communicating via social networking sites, you’ve got some unexpected competition? (Indeed, we note that social network traffic is just overtaking the traditional email portals.)

We were also given a surprise in that respondents saw most of these changes happening over the next 5 years.

Insiders see the growth in voice traffic as being anchored on best-effort Internet delivery, which gets around 1/3 of the IP voice traffic. Using traffic shaping, offering tiered levels of priority, and using traditional end-to-end quality of service guarantees all got roughly equal share.

There are some small differences between fixed and mobile, and mobile operators might like to seriously consider offering tiered “fast dumb pipe” and “slow dumb pipe” products that applications can intelligently choose between.

This all suggests that operators may be over-investing in complex NGN voice networks and services. They urgently need to work out how they can partner with Internet application providers to offer “voice ready” IP connectivity without the costly baggage of telco-specific protocols and platforms.

So what’s the lesson from container shipping for the broadband voice community?

At the same time as containers were being adopted, some ports doubled down on the old business model and built better breakbulk facilities – and lost. Manhattan’s quays are gone; Newark replaced them.

Others waited to become “fast followers”, and lost too. London went from being one of the world’s busiest ports to zero activity. Dubai did the reverse by investing exclusively in the new model, with a low cost base and high volume. (Shades of Iliad’s approach in France.)

The winners were those who staked out the key nodes of the new value chain.

There are some clear lessons here for telcos and their NGN voice networks. The cost of broadband access technology is dropping, capacity is rising, and the voice component’s value is decaying towards zero. Furthermore, session control (the software part of the voice application) is just another IT function that runs inside a big server, and isn’t something you can charge for above hosting costs. It has the economics of email, and that’s mostly given away for free. So IP voice isn’t adding anything to your triple/quad play bundle, and can only be justified on the basis of reducing cost in the old business model. An IP NGN voice service that’s still selling metered minutes does not constitute a new business model.

Broadband video

The survey results for video are a little less dramatic than those for voice and follow received wisdom more closely. Overall, respondents endorsed Internet video as far more of an opportunity than a threat. (Only in telecoms can a significant proportion see more demand for their product as a problem! The potential issue is that video could drive up costs without sufficient compensating revenue.) A long slow decline for broadcast TV and DVDs is matched by a slow ramp-up in various forms of online delivery. Every form of Internet delivery, from multicast IP to peer-to-peer file sharing, gets a roughly equal cut. There were some things to watch out for though…

The opportunity is to become a supplier of advertising, e-commerce, caching and delivery services for a variety of video portals, not just your own. This isn’t surprising; can you imagine a Web where there were only two portals to choose from, both owned by the network owners? The same applies to video.

Economic migration, cultural fragmentation and user-created content ensure that we’ll need a diversity of aggregation, recommendation, filtering and presentation technologies.

Given a choice between building a closed IPTV solution or an open content platform, respondents came down firmly in favour of the latter as the more profitable to run. (The slow ramp-up of BT’s Vision service suggests its success is more likely to rest on the “push” of analogue switch-off than the “pull” of the telco brand as a TV provider. Why do no telco TV plans centre on external entrepreneurial talent and innovation?)

Both options beat the alternative of disinvestment in video delivery technology. So fixed and mobile operators are well positioned to help enable and market video, just not “TV over IP”. That’s the steam-hauled canal boat, when you’re supposed to be using IP to build a railroad. It seems telcos are over-investing in emulating broadcast TV and under-investing in the unique nature of the online medium.

P2P and “over the top” are here to stay. You deal with the costs by offering more profitable alternatives, not by punishing your most voracious customers. (See our article on Playlouder as an example of how to do it right.)

In music, Apple’s iTunes captured the key bottleneck in the distribution chain. Could the same happen for online video?

We gave respondents a choice of four scenarios:

  • Direct to user from the content author or publisher
  • A single dominant player
  • A fragmented market dominated by telecoms companies
  • A fragmented market dominated by non-telcos

Our respondents say that the market is likely to be fragmented, with many aggregators, and that non-carriers will dominate. Again, “triple play” doesn’t capture the richness of the business-to-business model required, with many partners in the distribution and retail value chain. How will Telco TV satisfy my wife’s taste in Lithuanian current affairs and my interest in gadgets and economics lectures? It can’t.

Our take-away from the shipping industry is that when it comes to shifting bulky stuff around, big is good and bigger is, err, gooder. Networked infrastructure businesses have strong increasing returns to scale. There’s no point in building a new port anywhere near Rotterdam, because that’s not where the other ships go. There’s a good reason why Akamai takes the bulk of the profit pool from content delivery networks — theirs is the biggest.

Network ownership models

Compared to today’s dominant models (facilities-based competition and structural separation), respondents rated a third ownership model – co-operatives of telcos – surprisingly highly. The two currently dominant models remain on top.

The issue is how to structure the vehicles for mutual or co-operative asset ownership. The financial industry has already created structures that allow shared operational businesses, either mutually owned or as private special-purpose entities. Furthermore, they’ve managed to preserve barriers to entry: to become a member of the VISA network, you need a banking licence. That costs a lot of money.

Telecoms and the Internet business have some common structures around numbering and interconnect, but could emulate these other models from other industries.

The arrival of containers shifted the balance of profit away from the shipping lines and towards the ports.

In terms of telecoms, it’s where the content is originated or goes between delivery systems that matters – from CDN to broadband access, from broadcast to DVR. That means every Googleplex and content delivery network that gets built puts Google or Akamai at a massive advantage, since everyone wants to peer with them.

Traditionally it has been long-distance and access networks that have dominated telecoms economics. In its early years AT&T was the only owner of a long-distance network and was thus able to negotiate very advantageous terms in buying up local carriers into the Bell system. It then mistakenly held onto the long-distance network just as the bottleneck shifted to the access network. At the moment the US sees a duopoly in access networks, and supernormal profits. Wireless carriers enjoy an oligopoly in most markets as a by-product of spectrum licensing.

However, Europe is moving towards structural separation or open access of fixed networks. Homes and offices offer WiFi or femtocell bypass options for cellular. Over time, local access ceases to be such a bottleneck. Furthermore, there are many physical paths and proliferating technologies and suppliers hauling data between the distant points that want to be connected up — be it transoceanic cables or competing wireless backhaul technologies. So the owners of the transmission networks don’t enjoy the benefits. It’s the owners of the places where traffic is exchanged between delivery systems that do, since those feature increasing returns to scale and dominant suppliers.

What is the product we are selling?

Today operators expect you to go out and buy yet another access plan for every device you touch or place you make your temporary home. They sell “lines”, either physical or virtual (via a SIM card). Is this really the right way for the future?

All I want to do is connect my phone and laptop to the Internet wherever I am – but I get different prices and plans depending on which combination of device and access technologies I use – yet all from a single vendor. (The first is using my phone as a 3G modem over a USB cable; second is a separate 3G USB modem; third is WiFi.) This creates the perverse incentive when I’m sat in Starbucks to use my phone as a modem for my laptop over the expensive 3G network.

Also, I might be a peer-to-peer download lover, and hopelessly unprofitable. Or I might just want to check my email and surf the web a little on my mobile. How can you rationally price this product? What are the alternatives?

We gave users a choice of 3 alternatives (above) as to how broadband connectivity is provisioned. Should we sell you “unlimited browsing”, with listening to Internet radio as a separate charge? Or should we price access according to the device, but not make the plan portable between devices? A data plan on a basic featurephone would differ in price from one for a smartphone, Internet tablet or laptop. Or should we just give the user a set of credentials that activates any device or network they touch and bills that usage back to them?

The preferred option was to offer users a connected lifestyle, regardless of devices, applications or prices.

BT’s deal with FON is an example of a step towards this goal. Picocells too have the potential to upend the access-line model. In terms of immediate actions, mobile operators should recognise the trend towards divergence and users with multiple handsets. Don’t make me swap SIMs around when I go from my “day phone” to my “out on the town phone”. Give them a common number and interface.

New, more liquid, ways of combining together devices and networks for sale would require wholesale markets to evolve.

We asked what impact it would have on BSP revenues if all the friction were taken out of the wholesale market. Anyone who wants to come along and build an application with connectivity included in the price would be able to source their wholesale data from any carrier. You wouldn’t have to be Yahoo!, Google or RIM to negotiate a deal with every carrier in the world or to build one-off billing integrations.

The effect? A 50%+ boost in revenues, which has a commensurately greater effect on profit. How much value is the broadband industry leaving on the table because of its inability to package up and sell its product via multiple channels?

Even more profitable than the ports are the agents who arrange the end-to-end logistics and supply chains for their customers. In telecoms terms, it’s the operator who can assemble a multitude of fixed and mobile networks, content delivery systems and B2B partnerships with the application providers that wins.

For telcos, the critical development to enable personalised packaging of connectivity, applications and devices is to build richer wholesale models. The hot activity will be in the B2B markets, not direct-to-user. The failure of most MVNOs has shown that you don’t just want to create “mini me” telcos, but to enable more granular offerings.

Conclusions and summary

Telecoms is going to move to a multi-sided business model. Google are as likely to be paying for the full delivery of the ad-supported YouTube video as the user is. The telco will also feed Google usage and relationship data to help target advertising. Google might use credit data from the operator to manage its own fraud and chargeback risk on its checkout product. Telcos are logistics companies for data, helping the right data to be at the right place at the right time. This is completely different from being a “dumb pipe”, a wannabe media company or an end-user services provider.

When you buy a new electronic gizmo, it typically comes with batteries included. The battery makers have learnt to supply batteries wholesale to consumer electronics makers, as well as to end users. Broadband needs to evolve to add “connectivity included”, with the right quality and quantity packaged up with the application or content in ways that the user finds easy to buy. Today’s product sells users a raw, unprocessed commodity, which serves the interests of neither users, merchants nor operators.

Full Article: Telco 2.0 Business Model Map: Links & Q&As

At the February 2007 Telco 2.0 event we presented our Business Model Map. Links and Q&As are reproduced here to help clarify the map and its implications.

You can read the background to our Telco 2.0 Business Model Map in the following four-part article series: Introduction, The axes of the map, The business models, and The consequences.

In a nutshell:

  • Network operators are delivery/distribution businesses: they deliver valuable bits from A to B.
  • There are many ways of delivering those bits. For example, a video could be sent on a DVD, via IPTV, a peer-to-peer download, streamed from a content delivery cache, etc.
  • The map documents these distribution channels for valuable bits. Which ones are you as a telco going to invest in?
  • Each one is assessed on two criteria: does payment automatically flow between connectivity and the content/service (“commercial integration”), and is the delivery network hard-wired to that particular media delivery, or general purpose (“technical integration”). A toy sketch of the map as a data structure follows this list.
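
To make the two axes concrete, here is a minimal, purely illustrative sketch of the map as a data structure, using the video delivery channels listed above. The yes/no placements are our own guesses for discussion, not positions taken from the published map.

    # Purely illustrative: a toy encoding of the two axes of the business model map.
    # Channel names come from the bullet list above; the True/False placements are
    # illustrative guesses, not the published Telco 2.0 positions.
    from dataclasses import dataclass

    @dataclass
    class DeliveryChannel:
        name: str
        commercial_integration: bool  # does payment flow between connectivity and content/service?
        technical_integration: bool   # is the network hard-wired to this particular media delivery?

    channels = [
        DeliveryChannel("Video on DVD by post", True, True),
        DeliveryChannel("IPTV", True, True),
        DeliveryChannel("Peer-to-peer download over broadband", False, False),
        DeliveryChannel("Streamed from a content delivery cache", False, False),
    ]

    for c in channels:
        print(f"{c.name}: commercial={'yes' if c.commercial_integration else 'no'}, "
              f"technical={'dedicated' if c.technical_integration else 'general purpose'}")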

Can you clarify the Embedded Business Model?

It’s the reverse of the handset subsidy model. You buy a device which comes with connectivity embedded – a bit like those self-adjusting clocks. As we in-fill the device market between mobile handsets and PCs, we’ll see a huge range of appliances, many with relatively short lifespans (under 2 years). Embedding a fixed period of connectivity into the device eliminates a huge amount of billing and revenue management cost.

In other words, as we have more tailored devices deployed, we’ll see many variants on how the money flows between the hardware, service and network. In this case, it’s the hardware sale that funds the rest.

Is it fair to read the matrix and bubbles in the following way: no single new business model, nor a combination, will compensate for cash flow lost on traditional business (the bubbles are revenue, right?)

That depends on how optimistic you are that operators can grow the “new opportunity” bubbles. In the chart their sizes all add up to the same amount in every scenario, so I’m just showing relative (not absolute) changes.

We do not understand the “protection” zone

You try to protect the existing revenue model of vertically integrated telephony and messaging. Do this by gentle feature innovation, extending the commercial footprint (e.g. non-geographic numbers, short codes) and resisting competition.

We deliberately don’t call it “product innovation” because it’s not about seeking new online services products (best left to Internet players) but extending the lifespan and economic model of the ones you’ve got.

Q&A: Tiered access and QoS

How do we deal with the situation on tiered service (Paris Metro model) where disputes arise – proof of QoS etc.?

The point of systems like Paris Metro Pricing is that they retain the cost structure and efficiency of best-effort delivery. If you don’t make a promise of guaranteed delivery/capacity (just better statistical odds of delivery), there’s no dispute possible or proof needed compared to the “reserved capacity” model of QoS.
You can read more here.
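
To make the statistical point concrete, here is a minimal toy simulation (our own sketch, not something presented at the event): two identically provisioned best-effort pipes, where the pricier one simply attracts fewer flows. All the figures are invented for illustration.

    # Toy Paris Metro Pricing model: two identical best-effort pipes, the dearer
    # one attracting fewer flows through self-selection. Nothing is guaranteed;
    # the dear pipe is simply congested less often.
    import random

    CAPACITY = 100        # traffic units each pipe can carry per time slot
    SLOTS = 10_000        # number of simulated time slots
    CHEAP_FLOWS = 120     # the cheap pipe attracts more flows
    DEAR_FLOWS = 60       # the pricier pipe attracts fewer

    def congested_fraction(flows: int) -> float:
        """Fraction of slots in which offered load exceeds the pipe's capacity."""
        congested = 0
        for _ in range(SLOTS):
            # Each flow offers 0, 1 or 2 traffic units in a given slot.
            load = sum(random.choice((0, 1, 2)) for _ in range(flows))
            if load > CAPACITY:
                congested += 1
        return congested / SLOTS

    random.seed(42)
    print(f"cheap pipe congested in {congested_fraction(CHEAP_FLOWS):.1%} of slots")
    print(f"dear pipe congested in {congested_fraction(DEAR_FLOWS):.1%} of slots")

Neither pipe promises anything; the dear one just offers better odds – which is why there is nothing to dispute or prove.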

Can you detail the Tiered commercial model?

Think “multiple virtual dumb pipes”. Today my computer talks to the Internet on device eth0 or wlan0. What if there were five “virtual” interfaces (“vnet0” etc.), and I could choose any one of them, but some had higher cost/lower contention? And I could dynamically switch between them? Think of it as like TCP/IP’s back-off mechanism, but you step up and down priority levels as well as transmission rate.

A lot of the complexity is going to be hidden from the user. Say Skype get together with a bunch of carriers to offer Mobile Skype. Rather than re-write Skype to fit into IMS (no chance), they use a high-priority virtual pipe for the VoIP part, and a low priority one for the IM/file transfer.
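
As a purely hypothetical sketch of what this might look like to an application developer – the interface names, prices and rate constants are all invented for illustration:

    # Hypothetical sketch: an application that backs off both in priority level and
    # in sending rate, in the spirit of TCP back-off but across priced virtual pipes.
    PIPES = [  # ordered from highest priority/cost to lowest
        {"iface": "vnet0", "price_per_mb": 0.10},
        {"iface": "vnet1", "price_per_mb": 0.05},
        {"iface": "vnet2", "price_per_mb": 0.01},
        {"iface": "vnet3", "price_per_mb": 0.00},  # best-effort, bundled with the plan
    ]

    class AdaptiveSender:
        def __init__(self, max_price_per_mb: float):
            self.max_price = max_price_per_mb
            self.level = len(PIPES) - 1      # start on the cheapest, lowest-priority pipe
            self.rate_kbps = 256

        def on_congestion(self):
            """Step up to a pricier pipe if the budget allows; otherwise back off the rate."""
            next_up = self.level - 1
            if next_up >= 0 and PIPES[next_up]["price_per_mb"] <= self.max_price:
                self.level = next_up                           # pay for less contention
            else:
                self.rate_kbps = max(32, self.rate_kbps // 2)  # multiplicative decrease

        def on_idle_capacity(self):
            """Creep the rate back up; once comfortable, drift back to cheaper pipes."""
            if self.rate_kbps < 2048:
                self.rate_kbps += 64                           # additive increase
            elif self.level < len(PIPES) - 1:
                self.level += 1

        @property
        def iface(self) -> str:
            return PIPES[self.level]["iface"]

    # Mirroring the Skype example: pay for priority on voice, not on file transfer.
    voip = AdaptiveSender(max_price_per_mb=0.10)
    bulk = AdaptiveSender(max_price_per_mb=0.00)

The voice stream pays to step up a level when it hits contention; the bulk transfer simply backs off its rate on the free pipe.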

One day, you might have hundreds or even millions of “virtual Internets” (“VWANs”, to mirror virtual LANs). A bunch of Xbox gaming devices might talk to each other on a virtual network, safe from denial-of-service attacks launched from the general Internet, and with the priority and latency they desire.

Does traffic prioritisation have some value as a business model at least in the short term?

Short term, yes. Enterprises, absolutely. And there will always be some friction in the system between different generations of technology that some network smarts will be needed to smooth over.

The comment was made that it was desirable to move away from deep packet inspection – what model would be better?

DPI isn’t all bad. For example, I’d retain it as a means of identifying “bad” traffic: DDoS attacks, spam, fraud. However, as a means of value-based pricing, it’s kaput. Even from a physics standpoint it’s a non-starter: you move all the bits from the optical transmission domain into the expensive and slow electronic processing domain. I’m sure Cisco would like you to do it, but that doesn’t mean it’s a good idea. You’re just in an arms race with your own customers to cloak the traffic. A better model is either (i) spend the money on more capacity and stick to all-you-can-eat broadband, (ii) go to the Paris Metro Pricing style of tiered dumb pipes plus offload the heavyweight traffic to CDNs, or (iii) lock down the edge devices to eliminate the problem at source.

Was Paris Metro Pricing a way of Bell Labs ingratiating themselves with the new French owners?

Bien sûr! Err, no. The research was done back in the days of Bell Labs, before croissants came on the cafeteria menu.

Q&A: Low vs high integration models

Is the future in the radical advances of end-user devices (high memory, CPU power, client-based apps)? Does this accelerate the ‘dumb pipe’ scenario for infrastructure?

Yes – ever more applications become possible to deliver over a general-purpose network with a general-purpose handheld computing device.

How do you move from a world of one-size-fits-all broadband into a slicing-and-dicing world? If broadband is unlimited, who would want to bundle it into a device? Will users be willing to move off their unlimited plans to tiered plans?

As “legitimate” P2P and video download traffic moves onto content delivery networks and separately packaged offerings (that avoid peak hour, keeping capex down), the price of “unlimited” broadband will probably rise. The average email/web/YouTube user won’t notice a thing. But the broadband “hogs” most certainly will. We’ve already seen the first steps in the UK with BT Wholesale going from an unmetered to a metered product. (I’d expect a congestion-based pricing scheme to be next.)

Do consumers really dislike vertically integrated offers? It is the cost and service that matters. Given a cheap offering with good service, one will probably buy everything from one place.

They love them! However, technical vertical integration makes for incredibly slow progress (e.g. voice telephony network) compared to “dumb pipe” applications (e.g. social networking). The trick is to preserve some of the commercial integration of service and delivery without the technical integration.

Can you really integrate commercially without any technical integration/leverage? Explain…

Sure – you’re just accounting for the packets differently based on their origin and destination. There’s no magic – as long as there’s some compensation that flows between the device, content or service and the underlying connectivity provision, then you have some form of commercial integration.

Packaging service/devices etc with connectivity is a good answer, but the worry is that the examples James gave are actually about startups that help users unbundle.

These two trends are not mutually exclusive. Artificial bundles will be torn apart, but where they create convenience to users then they’ll buy up packaged offerings.

What are the fundamental technical drivers of business model and structural change? For example, do Moore’s Law and smarter end-devices MANDATE that a dumb pipe model will evolve?

Blame Claude Shannon, the father of information theory. The information carried by a signal is independent of how it was delivered. Dumb pipes are less complex than smart ones, all other things being equal. So if you can get the application to work over the cheap pipe, why pay more for a specialised one?
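
For reference – a standard result, not something from the survey – the Shannon–Hartley theorem gives the capacity of a noisy channel as

    C = B \log_2\!\left(1 + \frac{S}{N}\right)

where C is capacity in bits per second, B the bandwidth in hertz and S/N the signal-to-noise ratio. Nothing in the formula depends on whether the bits are voice, video or email, which is exactly why a cheap general-purpose pipe can carry them all.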

Q&A: Telco culture and organisation

How bad does the crisis have to get internally/externally at the telcos before they actually embrace change? Do we need to go through a real period of agony before the willingness to change becomes real?

We’re already there. The ISP industry is already a world of hurt except for some niche players, with exits and M&A on the up. The crisis is as much for the users as for the operators, who can simply delay investment. The users miss out on the experience they require, and the economy never gets to grow new industries.

What is the future for incumbents?

In the short to medium term, they’re OK, as there are such large barriers to entry and a regulatory tar-pit to negotiate. In the longer term most will disappear (consolidated into a few mega-operators with the necessary scale). Many will exit the application services space or become effectively different businesses (e.g. BT Global Services, where the network is just an enabler, not the product). Probably a similar story to what’s already happened in other utility and media industries. British Gas used to retail cookers – but no longer.

What should telcos do today? What’s the first big tipping point?

Decide which core Telco 2.0 strategy you want to follow: diversification, protection, platform or pipe. Then exit the non-core stuff (but preserve the revenue streams via partnerships). “Tipping points” are a bit of a semantic illusion; it’s a continuous process of evolution.

In terms of broadband and mobile who owns the customer today?

The customer owns the customer. At best, you own some expensive glass and electronic boxes, plus some databases of variable quality. If “2.0” is about anything, it’s the users being in control of their destiny.

What might the cultural changes be for Telcos to become more innovative?

Innovation does not equal invention. Innovation is simply applying inventive ideas to bring them to reality. Telcos are plenty innovative, just not in end-user services. Most of the innovation is in deployment and operational management of networks, which you never get to see. The evidence is in how well they’ve adapted to extremely rapid technological change. I don’t think it’s possible to create an innovation or invention culture in a mature organisation which doesn’t have one.

What’s the underlying structure, from your perspective, which makes change so difficult? No moralistic judgement — a structural explanation

I don’t think change is difficult. The telecoms industry has undergone incredible change in the past 25 years (mobile, broadband, optical transmission, regulatory) – as much if not more than most other industries. How much has Google really changed? You underestimate telcos.

Why are Telcos so low in R&D spend?

Shareholders have done really well out of gatekeeping the distribution of other people’s innovations over their pipes. Maybe telcos are like travel agents: high turnover as you act as a handling agent for 3rd party products, but that artificially inflates your revenue and your expectations for research spending. Given the efforts of the Internet companies and telco equipment vendors, overall there’s no lack of R&D in communications.

How do telcos strike a balance between commercial innovation and innovation in technology, which is often their bargaining card/source of competitive advantage?

Easy – exit the services innovation space and partner. Focus on integration and overall user experience across multiple services and payment/delivery methods.

Q&A: Peer-to-peer & video distribution

Can you expand on the telcos’ exploitation of peer to peer and content delivery networks?

“If you can’t beat them, join them.” Telco IPTV isn’t going to deliver the Lithuanian shows I want my kids to watch, or give my brother his dose of British TV out in California. There are going to be many different aggregators, editors and recommendation engines. Operators need to make it easy and advantageous for those video distribution networks to minimise backhaul costs and contention. Today there’s no incentive to make your P2P application operator-network-friendly.

Since P2P is so successful, can Martin and James suggest ways for Telcos to monetise it?

Just deliver the damn bits! Someone else made them valuable, so don’t expect to extract a premium for content creation. Build content delivery networks that are cheap to run and flexible enough to support many different aggregation and delivery models. Work to lower capex by creating incentives to avoid transfers during peak network activity.

What should the service providers do about the cost to them of “Over the Top” content traversing their networks? Deep Packet Inspection does control this, and requires intelligence in the network.

Throttling users in arbitrary ways is just going to lead to network neutrality legislation and consumer backlash. Find ways and incentives to get those P2P apps to deliver their content in cheaper and more efficient ways. Stick 100Tb of BitTorrent cache in every central office!

Operators need to block P2P traffic! The people that pay the bills for broadband won’t care! I would happily pay for fast content delivery…but you need to free up the network first.

This just isn’t feasible in the long run. Users don’t know or care what the delivery technology is, and if you give them a bad experience (or bill shock) then you lose too.

How do we reconcile the cost of carrying bits with flat-rate in a video dominated traffic model?

You can’t, which is why the current Internet model will fragment and video distribution will move to cheaper methods. Just as you send most parcels by parcel post or UPS ground service, not next-day first-class mail.

Q&A: Wholesale

Moving complexity from retail models to wholesale is far from simple. Peering agreements are complex – how can this be driven?

I think this is an unsolved problem, and the industry is going to have to work out new structures for aggregating wholesale connectivity of varying quantities and qualities and packaging it together. (Telecom probably ends up looking more like the mortgage re-sale market – build a network, and then leave the problem of filling the pipes to someone else.)

Please explain further the shift of complexity to wholesale from retail – give a practical example?

Easy – look at the BlackBerry or Three’s X-Series. You don’t need to sign up to a data plan, read the terms and conditions to see if you can do the things you want, or watch the megabyte meter all the time.

Is there an advantage for network operators to break themselves up (like BT)? Is there more shareholder value if operators split into retail and wholesale?

Yes…. But. In the case of BT, despite the Chinese walls, the CEO and board can effectively synchronise investment across the two halves of the business. BT has a good balance between the discipline of separation (wholesale can’t rely on one captive customer, retail can’t rely on a distribution monopoly) and the need for services and capacity to be deployed in sync. If there was no such synergy, we’d probably have seen a break-up already. We heard some evidence at the event that the City rather likes the structural separation model, as utilities have higher multiples than volatile telco services.

Regarding shifting of complexity from retail to wholesale: why should it be done, and how can it be done? Isn’t it just about shifting the risk? The risk is still there!

The WHY is that you want to manage the capex cost associated with the delivery of low-value (and often pirated!) video content. This is generally time-insensitive and needs to be shifted out of peak hour (or, where economically unsustainable, eliminated by raising metered broadband prices at higher usage levels). The HOW is documented in the business model map – see my four articles on it on the Telco 2.0 blog. You’re mitigating risk by taking some control over it, and aggregating risk to smooth it out. For example, with the BT Openreach structure, Openreach makes money no matter which of Sky, Virgin Media, BT Retail, Carphone Warehouse, or anyone else turns out to be the winner.

Q&A: Customer data and relationship

What about the future of data ownership and control? Isn’t this the future cash cow that perhaps telcos, if they can become platform players, can excel in?

Yes, in principle. Just as Google extracted the latent value from the collective effort of hyperlinking, telcos could feed social network applications with the squeezed juice of CDRs. Whether they can finesse all the legal, branding and cultural issues is another matter.

Google has been successful by storing, analyzing and monetizing what users are actually doing. Telcos have/own lots of data, what are the challenges in building business models around this data versus charging for access?

Privacy, regulatory, brand, encryption technology, user experience…

It’s going to be tough.

How do you provide user-centric solutions when the telco model still largely views the household, not the individual, as the customer?

That’s both a problem and an opportunity. Yahoo! don’t have a clue who I am at all, so the telcos should see their glass as half full here.

Q&A: Partners, portals and services

When will web services stop being free? What will be the catalyst?

They’re not free even now: you have to buy an expensive PC and a broadband connection, and then watch a bunch of ads (or give away valuable private data to enable those ads). We’re just moving to a world of many different business models linking the devices, services and connectivity.

Should telcos partner with Google, MSN, Yahoo on Consumer apps and focus on Enterprise / Business opportunities?

Depends on each telco. These aren’t mutually exclusive.

How can the telco better leverage the customer’s various contact directories?

Sync with the network, offer an API, enable third parties to access it, and take out the barriers to usability!

How about this as a business strategy: rather than development and investment in new services, adopt a simpler strategy of “seeding and buying” successful small competitors, using telco capital for acquisitions?

Yes, telcos could become like Cisco and vacuum up the good services ideas. However, what’s the point if they are then limited to one operator’s distribution network? It’s like building supermarkets for Toyota car owners. Totally unnatural. Although maybe Cyworld and SK Telecom is the exception that proves the rule.

Partnering is obviously a critical success factor in the new world – how should this be embraced?

Big question. The first step of anyone embarking on a “platform” route is probably to separate partner management into its own exec position independent of BizDev.

Q&A: Service innovation

Isn’t innovation at big telcos more a case of “we do it because we are expected to”? Is it still for show, or a real business? And if it is real, why are there no ground-breaking innovative products?

It’s been very comfortable selling vanilla voice and messaging. The risk/reward for telco managers doesn’t make it worth trying to jump that shark.

Where is the innovation coming from? Generate it from the inside or stimulate external creation? How to drive, channel and capture innovation?

All the above – depends on each operator’s situation and what opportunities present themselves.

My hypothesis is that in the endgame there will be money in Access and in Devices, but not in the three Cs: Communication, Communities or Content. These will be free, paid for by ads or thrown in for free by the telcos.

I think all five of them will be fine businesses. Content will become more personalised and interactive, meaning much of the copyright crisis will naturally subside. Is World of Warcraft content?

Q&A: …and the rest

What value or cost do users attribute to accessing the Internet to make use of all these services?

They don’t care about the Internet per se. Campaigns to “save the Internet” have zero resonance with the public.

How quickly will regulatory bodies and policy makers react to this change?

About five years too late, if history is any guide.

To what extent can the overall telecoms spend purse be grown? How many other purses can telecoms raid?

Banking is ripe for disruption, and DoCoMo are probably the ones to watch.

What of the telecoms vendors? Especially the big complex integrated ones?

I think I’ll need consulting dollars to answer that one.

It’s going to be a whole lot smaller as an industry. Again, other industries have led the way, e.g. GE moving to “power by the hour” rather than turbine sales, blending services with hardware.

What are we going to do with all the copper?

Put Chilean copper miners out of business by flooding the scrap market… There’s almost 70m tons of it in the legacy access networks.

What are the payment models for future fragmented services?

Equally fragmented, which will probably drive a user experience crisis in the medium term as the user switches between metered, free, ad-funded and “bundled” service connectivity.

What tech features do telcos need on client platforms (PCs) to enable Telco 2.0 services?

The Virtual SIM that Intel described would be a great start, so we could provision applications with network access, rather than whole devices.