Full Article: BSkyB Platform – Lessons

BSkyB is probably the most misunderstood publicly quoted company in the UK. Most analysts view them as a media company; our theory is that BSkyB is a platform company and comparisons to Apple, Microsoft or Google are more appropriate than UK media players such as the BBC, ITV or even potential new entrants such as BT.

bskyb1.png
Figure 1: Sky Delivery Platform

Rule #1 – You have to keep loading extra features onto your platform…

Microsoft historically were tops at this – to some, Windows grew fatter and fatter release by release, but in reality every release contained new features that appealed to someone. BSkyB have done the same – they moved from analogue to digital, introduced interactive TV, took PVRs to the mass market, and are now doing the same with HD-TV. As soon as that is complete, they will move onto the next thing – 3D TV or even true on-demand VOD.

BSkyB seem to be one step ahead of the competition all the time. The only exception is Virgin Media's networked VOD service, which is far superior to BSkyB's limited caching of programmes on the Sky+ device – currently a big hole in the portfolio. Notice that BSkyB is totally agnostic as to whether the features are driven by hardware, software or network – their platform contains all three elements.

Rule #2 – You have to design your platform to be the easiest to use…

Apple are the kings of usability – they control the client hardware and software, and in the networked world they are taking more control of delivery, though not by ownership of the underlying assets – think of the parallel with BSkyB using third parties for satellite and broadband delivery.

BSkyB's innovation in design gets surprisingly little recognition – the Sky remote control in its day was a huge advance on TV manufacturers' efforts. Similarly, they have extended this advantage into the PVR world – still keeping to their own designs. This is just the start of BSkyB's advantage in usability.

It is not by accident that BSkyB wins awards for customer service – they realise that excellence in usability also needs to consider every touch point with the customer – from sales to installation and care. BSkyB have made huge investments here and it pays off – within a couple of years of entering the voice and broadband market, they seem to be winning almost every award on offer.

Rule #3 – You need the lowest cost platform…

This is all about economies of scale and running a tight ship, especially with regard to corporate overheads. Nearly every platform business involves huge upfront risk and investment, with many years of losses followed by a steady ascent to profitability as the platform gains volume and pricing power.

Organisational inefficiency is a disease that afflicts almost all large companies. Very little comparative data is available here, but I suspect that Sky would prove far more efficient if costs were normalized and compared to other payTV (e.g. Virgin Media), fixed (e.g. BT) or even mobile (e.g. Vodafone) competitors for the share of the consumer wallet.

Rule #4 – You need the best content on your platform…

This is probably the most misunderstood element of BSkyB's business – people still think BSkyB's advantage is all about exclusive rights to Premier League football and movies. The truth for most content owners is that the BSkyB platform is the most profitable mass-market route. And the attractiveness grows year by year as subscribers and revenues increase relative to competing platforms.

People tend to underestimate how flexible the BSkyB platform is for content owners: you can sell your individual rights to your preferred bidder on a particular Sky channel, for example HBO selling rights to "The Wire" to a minority channel, FoxFX; you can build your own channel(s), e.g. MTV and the Discovery Channels; or you can get a particular Sky-owned channel to do the production, e.g. sports events.

Strangely, even the major public service broadcasters (BBC, ITV, C4 & Five) are not only happy to have their content attached to the BSkyB platform, but are willing to pay for the pleasure.

Rule #5 – You need to extend your platform into adjacent areas…

Currently the focus at BSkyB is the move into the home broadband and voice market, which is a much bigger market than pure TV. To date, despite the rapid gain in scale, profitability is still an aspiration. The defensive qualities of the play are always underestimated, as is how it constrains the freedom of major competitors, especially Virgin Media and BT, to differentiate.

However, it should never be forgotten that moving into adjacent areas is nothing new for BSkyB: the early digital days featured BSkyB investing (and losing) money in "t-commerce"; a move into gambling via SkyBet has been more successful, though the retail focus of the service was dwarfed by the emergence of a gambling platform play, Betfair; and Sky has been investing in online properties, especially football, but it is too early to assess the success.

BSkyB's defensive investments pale into insignificance compared to those made by Microsoft in protecting their franchise against the growing encroachment of Google. These defensive investments are also crucial in negotiations with regulators – for every complaint BT makes about the lack of profitability in the payTV market, BSkyB can counter with allegations about the impossibility of making profits in the home broadband and voice market.

Following the Rules

Of course, following the rules is extremely difficult – continual innovation and improvement is hard. But the end result is a platform with plenty of levers for growth. Different levers can be pulled at different times – each making the platform more attractive to particular segments of consumers and thereby generating growth. The underlying dynamic is that BSkyB needs to show customer growth for success – or more importantly the right type of customer growth: customers who'll happily pay for the extra features. Building the platform, increasing eyeballs, and increasing diversity are crucial. Traditional TV metrics such as share of viewing for an overall channel are completely irrelevant.

bskyb2.png
Figure 2: Sky Customer Growth and major platform upgrades

Impact on Regulation

UK regulators, especially OFCOM, have struggled with the BSkyB business model and tend to examine market share in very narrow vertical segments – e.g. the current ongoing payTV consultation. These ways of looking at BSkyB are doomed to fail. OFCOM is not alone: the EU's battles with Microsoft over its platform business are already legendary and still ongoing. We don't yet have the answer to how to regulate in a platform world, but we do know that a different approach is required.

Most importantly, BSkyB’s platform shows a multi-sided business model in action, bringing value to both upstream and downstream customers and earning decent returns for shareholders.

Full Article: The Long Tail, Debunked?

The fifth Telco 2.0 Executive Brainstorm, in November 2008, continued its theme of business model innovation at the intersection of telecoms, media and technology by welcoming back Will Page, Chief Economist at the MCPS PRS Alliance, a copyright collection society that represents over 50,000 songwriters and 5,000 publishers.


Will took the opportunity to present, exclusively to Telco 2.0, new research – based on an unprecedented analysis of digital music sales data gathered over a year – that calls into question the received wisdom around the 'Long Tail' theory, and helps to redefine what it actually means and for whom. The presentation created quite a stir at the event, in the media and in the blogosphere. Here, Telco 2.0 discusses the presentation and the reaction to it at length with Will Page.

Telco 2.0: At previous Telco 2.0 executive brainstorms you've covered file sharing, the economics of two-sided markets and now the long tail. Coming from outside the Telco world, how useful do you find the events?

Will: Very! It's ironic that we live in a world of convergence, yet too many of the disparate camps like to remain in their pigeon holes and preach to the converted. What Telco 2.0 events do – allowing someone from a copyright collecting society to speak openly to an audience predominantly made up of ISPs and Telco operators – is invaluable to both content and connectivity businesses.

You can see the importance of this more and more now. We have a truly awesome CEO of UK Music (the newly formed music industry trade body) in place now – Feargal Sharkey – and he's spending an increasing amount of time at Ofcom. Two years ago, that simply would not have been happening. So, in many ways, what the Telco 2.0 Initiative does in terms of bringing different industries together is ahead of the curve.

Last week’s Long Tail presentation was a good example of this. I first met Andrew Bud, Executive Chairman of MBlox (and now Chair of the Mobile Entertainment Forum) and a key collaborator on my analysis, at a Telco 2.0 event back in October 2007. I can’t think of another ‘platform’ where our paths would have crossed, even though both our businesses share surprisingly similar characteristics.

Telco 2.0: For those who follow Telco 2.0 but missed the presentation, could you bring them up to speed on what exactly you, your colleague Gary Eggleton and Andrew Bud have been working on over the past few months?

Will: Sure, it's worth going back to the beginning as it's been an interesting journey. Firstly, like so many others, I read the original Wired article on the Long Tail in December 2004 and was genuinely inspired by it. For the next two years I was active on the blog-to-a-book website, and I have to credit the concept as one of the principal reasons behind my moving to London to work in the music industry in the summer of 2006 – ironically, just when the book came out.

I guess the presentation I gave last week reflects what I've learnt in the two years since from working in a collecting society, an organisation which by default is in the long tail business. Indeed, the Performing Right Society (PRS) has been dealing with long tail markets since 1914. The whole purpose of constructing and offering a collective licence is that it doesn't matter whether a song is a hit or a niche – all the tracks are licensed under a blanket agreement.

Given the clear relevance of collecting societies to the 'long tail' debate, I was surprised to see so little mention of them in Chris's book – or the blogs that followed. For example, our US equivalents, ASCAP and BMI, don't appear once in the book's index.

For those who weren't there, let me break the presentation down into three parts. It began by looking at the evidence in terms of actual historical data. I drew upon a great expression that I learned whilst in the Government Economic Service, which is to always strive for 'evidence-based policy making' and resist the temptation of 'policy-based evidence making'. Increasingly, when I hear those words "here's another great example of the long tail at work", I'm inclined to expect that claim to lean towards the latter of the two.

I made the point that looking at volume-based Rhapsody data, which much of the long tail application to music has been 'built' around, is like a glass half empty – at best. We also need to consider value, and by that I mean not just retail spend but marginal profitability – what actually gets back to the artist and songwriter – and also 'displacement'.

One achievement of my two years at the MCPS PRS Alliance is getting 'displacement' into the everyday lexicon of the UK music industry. Is every digital track sold to be celebrated (a P2P user now gone legitimate)? Or regretted (a £9.95 album sale lost)? The reality of the long tail is now being uncovered by many stakeholders in both the music industry's head (hit makers) and its tail (poor sellers).

The second part revisited 'histograms' as a way of plotting the long tail. Andrew Bud, who's been like a professor to me throughout this project, put me onto a fantastic book published in 1956 by Brown, entitled 'Statistical Forecasting for Inventory Control', which described the importance of the log normal distribution as an analysis method.

This concept is not radically new and is still discussed today (for example, Chris Anderson refers to log normal distributions in his speeches and blog too). But for me this book was fifty years ahead of its time.

Using this approach our team constructed log normal intervals and plotted an unprecedented amount of digital music data over a significant time period. The basic shape of consumer demand for digital music clearly fits the log normal distribution – "with eye-watering accuracy". It was really striking. There are many new schools of thought, but the old rules seem to hold truest.
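The binning approach can be sketched with synthetic data. To be clear, this is purely illustrative – the parameters and sample below are assumptions standing in for the confidential sales data, not the study's figures:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sales-per-track figures drawn from a log normal distribution,
# standing in for the (confidential) digital music data in the analysis
sales = rng.lognormal(mean=3.0, sigma=2.0, size=100_000)

# Brown-style log normal intervals: bin edges double each time,
# so the bins are equal-width on a logarithmic axis
edges = 2.0 ** np.arange(0, 25)
counts, _ = np.histogram(sales, bins=edges)

# For log normal demand the histogram peaks in a middle bin and falls
# away on both sides; a pure Pareto/power-law histogram would only decay
peak_bin = int(counts.argmax())
```

Plotting `counts` against the log-scaled bin edges gives the bell shape Will describes; fitting with a statistical package (e.g. `scipy.stats.lognorm`) would formalise the comparison with a Pareto fit.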

The difference between a Pareto-style distribution and a log normal is neatly summarised by Chris Anderson in his recent response to my analysis, below:

"…The two distributions look similar at first glance, and you have to plot them log-log (or fit them with a statistical package) to tell the difference. Long Tails are 'heavy-tailed' distributions, where a lot of the total volume is in the tail, while lognormals are more like the classic top-heavy hit distribution…"

The third part of my presentation at Telco 2.0 last week concluded with two important slides. The first one plotted two heads. The first 'head' was the concentration of tracks which sold very little or nothing at all. It questioned whether the net revenue generated from these tracks covers the real sunk and commission-based costs of a.) getting the song there and b.) getting money back out of the system.

The second 'head' was to show the effective average revenue per track in each 'bin' (or statistical grouping of data). This was a crude averaging method but it proved highly illustrative. The inequality in revenue between hits and niches was jaw-droppingly stark, justifying Andrew's observation that "in this tail, you starve".

For example, we found that only 20% of tracks in our sample were 'active', that is to say they sold at least one copy; hence, 80% of the tracks sold nothing at all. Moreover, approximately 80% of sales revenue came from around 3% of the active tracks. Factor in the dormant tail and you're looking at an 80/0.38% rule for all the inventory on the digital shelf.
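The concentration arithmetic can be reproduced on synthetic data. The distribution parameters here are illustrative assumptions chosen to mimic the shape described, not the study's actual numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
n_tracks = 1_000_000

# Hypothetical digital shelf: 80% of tracks are dormant (sell nothing),
# the active 20% earn log normal revenues - an assumed shape, not real data
active = rng.random(n_tracks) < 0.20
revenue = np.where(active, rng.lognormal(mean=0.0, sigma=2.5, size=n_tracks), 0.0)

# Share of ALL inventory needed to capture 80% of total revenue
rev_desc = np.sort(revenue)[::-1]
cumulative = np.cumsum(rev_desc) / rev_desc.sum()
share_for_80pct = (np.searchsorted(cumulative, 0.80) + 1) / n_tracks
```

With a heavy enough skew, `share_for_80pct` lands well under 1% of the whole shelf, which is the spirit of the 80/0.38% observation.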

Finally, only 40 tracks sold more than 100,000 copies, accounting for 8% of the business. Think about that – back in the physical world, forty tracks could be just 4 albums, or the top slice of the best-selling "Now That's What I Call Music, Volume 70", which bundles up 43 'hits' into one perennially popular customer offering!

This chart really drove home the theme of the presentation: what does the ‘long tail’ actually mean, and for whom?

If you're a for-profit aggregator it means one thing; if you're an individual copyright holder it means another. Again, this is something the debate has largely overlooked to date, yet everyone 'down here on the ground' increasingly recognises it.

As a not-for-profit, membership-governed collecting society, I'm extremely fortunate to be in a unique position to make a balanced interpretation of the facts, and what they mean to both sides – individuals and firms. My interpretation is in no way gospel, but I can at least build an argument that's based on evidence from the coal face.

My argument, in summary, was that the future of business is definitely not selling 'less of more'. Scale matters. To tee up the interactive debate with the brainstorm participants I concluded with a final slide posing the dilemma facing firms in the content value chain as regards their investment strategies. Thanks to the way Telco 2.0 allows participants to 'blog live' with the presenters using laptops and special software [Ed. – we call the format 'Mindshare'], a mountain of comments and questions flooded in.

Telco 2.0: Of the (literally) hundreds of questions the audience threw at you, your colleague Gary Eggleton and Andrew Bud, which would you like to answer in more detail here?

Will: Firstly, I'm genuinely grateful for that 'blogging facility' you have at Telco 2.0. The day after the presentation, your CEO emailed me every question, idea and challenge your audience threw at me – that's a fantastic facility, a "free lunch" of excellent feedback and advice. My thanks to everyone who posted.

Now, I think there are three themes which I can draw from your excellent interactive facility at the conference and expand on here: (i) the black market, (ii) digital inventory costs and (iii) scarcity.

1.) The Black Market (P2P)
There were lots of really insightful questions on this topic which can be summarised by this one: ‘is the P2P market more or less concentrated around hits than the legal one?’

To help answer it I would direct readers to a now infamous paper I published with Eric Garland, CEO of Big Champagne, titled ‘In Rainbows, On Torrents’ (pdf).

My hunch, based on the evidence we presented in that paper (pointing out the 2.3 million illegal downloads of Radiohead's new album when it was also available 'for free' on their own website), is that the black market is even more hit-centric.

As Eric would argue, popular music is popular wherever it is popular, in that you can't be a hit on iTunes without being a hit on BitTorrent, and vice versa. For further reading, I'd suggest the sociologist William McPhee's groundbreaking theory of exposure, found in his 1963 book 'Formal Theories of Mass Behavior'.

2.) Digital Inventory Costs
These costs are often overlooked by those claiming the long tail is a 'panacea' for artists. Making recorded music has many independent costs; some have gone down, others have gone up (what economists call 'cost disease').

Similarly, there are administrative costs to uploading tracks onto digital sales platforms and getting the money back to the creator. For example, indie ‘niche’ labels need ‘aggregators’ before they can join the main digital music platforms, which is a wholesale market, just like in any other business, digital or bricks and mortar. The same old rules of transaction costs and economies of scale apply there too.

I wanted the audience last week to consider another old rule of economics – cost-benefit analysis. Do the net benefits outweigh the costs (both upfront and commission-based) of joining the long tail? It's a simple question which few bother to ask, and hence it's infuriating when you read propositions like 'all you need is 1,000 true fans'.

3.) Scarcity
There was a wonderful comment from one anonymous participant in the room who said:
"…Scarcity forces a 'competition'-like structure to pass the cut-off point, which paradoxically creates value by increasing the effort of content suppliers to win…"

This really sums up the point of my presentation. What I said was not particularly new – in fact it's basic economic common sense, of which we sometimes need a reminder. This quote points to where I'm going to take the research next. Not the 'head', nor the 'tail' – but what happens to the 'body'?

My hunch is that without scarcity, the body is underexploited. The quote provides context for a point I made on stage and in an article in The Register: "Is the 'future of business' really selling less of more? Absolutely not. If Top of the Pops still existed, it would feature the Top 14, not the Top 40."

Telco 2.0: You’ve definitely got the debate started as there’s been significant press coverage since the publication. How have you found the reaction in the media by those who were and weren’t there?

Will: Tricky. Many journalists have agendas – some you agree with, some you don't. Sometimes the meaning of your work gets lost in the differences between those agendas. My agenda is an academic one, as the great Scottish philosopher Hume would have wanted – one of conjectures and refutations. Let's take a theory and put it to an impartial evidence-based test.

Firstly, it was great to see Eric Schmidt, CEO of Google, putting forward a strikingly similar argument to mine on McKinsey's website recently.

In addition, I was pleased to see The Register pick up on the role of a collecting society, an institution that receives surprisingly little coverage in the Long Tail debate, yet has pioneered the creation of long tail markets for musical copyright through rights pooling and blanket licensing for almost a century.

It was great to finally get a mention on Chris Anderson’s blog – given that I published my first set of long tail statistics (pdf) back in November 2006!

Given that the source of my data cannot be disclosed at this stage and the slide deck from the Telco 2.0 event cannot be circulated (and I’m genuinely grateful to those in the audience for understanding this point), I thought he did a pretty good job at blogging about a presentation he wasn’t present at. Hopefully this interview will help him fill the gaps.

However, I still think he's focusing his arguments on the less relevant volume-based data, and not looking at value in all of its definitions. Or, as an impatient CEO might say, 'show me the money!' Volume-based discounting has been, is, and always will be prevalent in any market, be it online or offline. It is a simple and widely accepted fact, and once you accept it, it becomes increasingly difficult to hold a conversation about why the future of business is selling less of more.

On the downside, one of your participants published a blog article that was so far off the mark it made me wonder if he was actually paying attention (or, more likely, whether he understood the complexity of the music industry). For example, he describes my focus on individual sales as very 'old economy'. Yet the erosion of the unit value of musical copyright is the biggest issue facing the membership of the MCPS PRS Alliance.

Why? Because we're a membership-governed not-for-profit organisation that licenses, collects, processes and pays out royalties for our 50,000 individual songwriters. He also goes on to say I used a data set where the concept of margin is irrelevant, which is the complete reverse of what I actually did. From the outset I presented data and then introduced the concepts of marginal costs (the real costs of managing and processing digital inventory) and marginal benefits (how much of that unit value actually gets back to the creator).

Trying to have a balanced debate about the long tail, and avoiding knee jerk reactions and hysterical claims, is hard, very hard! Everyone immediately becomes an expert in a specific market or a statistical rule that they actually know relatively little about.

I feel for Chris, who pioneered this concept, as he must have had to stare this problem in the face for a lot longer than me. In the music industry, which has experienced the force majeure of disruptive technologies like no other – and for over a decade now – you get a little tired of armchair critics telling you what to do with the benefit of hindsight and little understanding of what options were available at the time.

Nevertheless, as the legendary Peter Jenner would say every time meetings between the content and connectivity industries collapsed into disagreement and disarray: the important thing is that we keep all the parties talking, exchanging ideas, evidence and advice.

Telco 2.0: Finally, how transferable is your work? What does it mean for other areas which Telco companies are looking to get involved in, such as Television, Film and Books as well as applications?

Will: Very! Discovering a log normal distribution in one area of media provides a template for evidence-based gathering, interpretation and decision making in others.

I'd stress caution though – you need to order the questions correctly. Just as when you look at international comparisons to learn lessons for domestic issues, you need to ask 'what works over there?' and ONLY then ask 'of what works over there, what could work here?'.

There's been far too much decision-based evidence making to date, along the lines of "the long tail must work, so find me a great example of it working". That's in no way the fault of Chris Anderson. He (like myself) goes to great pains to correct people's knee-jerk reactions, but it's standard fare when a new economic theory comes along: people get a wee bit hysterical about it.

I think that the most important lesson to come out of this work is a real back-to-basics question: is scarcity a constraint or is it a discipline? I think that you can ask this question from the outset, regardless of what type (and what size) of media inventory you intend to carry across your network.

Finally, I'd like to reemphasise the importance of impartial evidence-based analysis. My work is not about trying to prove anyone wrong. I'm looking at how well their case stands up when presented with evidence. On that note, perhaps it would be apt to end with a suggestion to those proponents of the long tail theory, by drawing upon a quote from the late great John Maynard Keynes: "When the facts change, I change my mind. What do you do, sir?"

[Ed. – After the interview we asked the Telco 2.0 analysts to comment on the transferability of Will’s analysis and the ‘so what’ for telcos:

Chris Barraclough, Consulting Director: Value comes from: catalogue breadth/depth + distribution (which includes searchability and multiple customer touchpoints, including affiliates) + evaluation. Amazon has market power because it works hard on both distribution and evaluation. You can exploit a longer tail than your competitors if you can a.) lower your cost base further than them (i.e. afford to carry more inventory than them) and b.) price cleverly (link price to volume so that lower-volume items are priced higher).

Martin Geddes, Chief Analyst: We must appreciate the uniqueness of different content types and distribution networks. On the latter, for example, iTunes differs from last.fm due to pricing policies and content recommendation systems. In terms of content types, music is weak in metadata (people don't write much about individual songs), unlike richer media such as movies (where there are lots of reviews and information on the participants), games, software apps, and TV shows. So, my advice is to be careful about extrapolating lessons between different media, and indeed even between sub-genres within the same medium. Sport, porn, and news video all have very different dynamics, for example.

The big ‘so what?’ for telcos is that a lot of ‘long tail’ content may have no commercial value, but may have considerable social value to users (e.g. photos of your kids). It still needs to be transported. This makes it all the more important to cater not just for ‘high QoS’ material like streaming HD movies, but also to be able to dynamically subload other content. Watch this space for interesting developments in this latter category…!

James Enck, Senior Associate Analyst: It would be useful to analyse how the value of content changes over time. Van Gogh didn’t sell much during his lifetime, Grateful Dead fans favor bootlegs rather than studio recordings, and we all know the story of the Arctic Monkeys. In other words, it’s conceivable that content moves from the tail to the head over time, and those who don’t spend time in the tail will always be surprised at what appears in the head.

The depressing truth for telcos is that replicating the head does nothing for differentiation. Moreover, if content strategies are geared to churn reduction rather than incremental revenue, then what disincentive to churn is there if all competitors have the same 4,000-film library? Long tail content can be highly appealing as a differentiator if it maintains a local flavor. www.pod3.tv is one example in the UK which, to my knowledge, no telco has sought to engage with. I'm baffled as to why Telekom Austria seems to have stopped the Buntes Fernsehen project, which to my mind was a very interesting way to differentiate on long-tail content in a way that's highly relevant to a local customer base.

Keith McMahon, Senior Analyst, Content Distribution 2.0: The key message for me is that there is no silver bullet in merely loading content onto the net. The challenge is beyond simply distribution. The promotional aspects will be a really hard skill for telcos to replicate over a wide range of content. They are probably much better partnering, developing ‘two-sided’ enabling business models and shifting the demand risk to parties who know better.

Alan Patrick, Senior Associate Analyst, Content Distribution 2.0: I did Mechanical Engineering at university and studied inventory-management theory. The thing I recall is that nearly every inventory-based demand curve was log normal. The big issue in the online world is the lower transaction costs, which support a "positive returns" power-law dynamic, i.e. the big get bigger. This drives an increased rush to the 'Hit Head'. In other words, any service which had a long tail distribution would rapidly move to a bigger hit head in any online world.]

Full Article: Online Video Usage – YouTube thrashes iPlayer, but for how long?

Online Video consumption is booming. The good news is that clearer demand patterns are beginning to emerge which should help in capacity planning and improving the user experience; the bad news is that an overall economic model which works for all players in the value chain is about as clear as mud.

We previously analysed the effect of the launch of the BBC iPlayer on the ISP business model, but the truth is that, even in the UK, YouTube traffic still far outweighs BBC iPlayer traffic in the all-important peak-hour slot – even though the bitrate is far lower.

Looking at current usage data at a UK ISP we can see that the number of concurrent people using YouTube is roughly seven times that of the iPlayer. However, our analysis suggests that this situation is set to change quite dramatically as traditional broadcasters increase their presence online, with significant impact for all players. Here’s why:

Streaming Traffic Patterns

Our friends at Plusnet, a small UK ISP, have provided Telco 2.0 with their latest data on traffic patterns. The important measurement for ISPs is peak-hour load, as this determines variable-cost capacity requirements.

iplayer_7_days.PNG

iPlayer accounts for around 7% of total bandwidth at peak hour. The peaks are quite variable and follow the hit shows: the availability of Dr Who episodes or the latest in a long string of British losers at Wimbledon increases traffic.

Included within the iPlayer 7% is the Flash-based streaming traffic. The Kontiki-P2P based free-rental-download iPlayer traffic is included within general streaming volumes. This accounts for 5% of total peak-hour traffic and includes such applications as Real Audio, iChat, Google Video, Joost, Squeezebox, Slingbox, Google Earth, Multicast, DAAP, Kontiki (4OD, SkyPlayer, iPlayer downloads), Quicktime, MS Streaming, Shoutcast, Coral Video, H.323 and IGMP.

The BBC are planning to introduce a "bookmarking" feature to the iPlayer which will allow pre-ordering of content and hopefully time-of-day based delivery options. This is a win-win-win enhancement and we can't see any serious objections to the implementation: for consumers it is great because they can view higher-quality video and allow the download when traffic is not counted towards their allowance; for ISPs it is great because it encourages non-peak-hour downloads; and for the BBC it is great as it will potentially reduce their CDN costs.

youtube_7_days.PNG

YouTube traffic accounts for 17% of peak-hour usage – despite YouTube streaming at around 200kbps compared to the iPlayer's 500kbps. There are about seven times as many concurrent users on YouTube as on the iPlayer at peak hour. Concurrency is important here: YouTubers watch short clips whereas iPlayer users watch longer, broadcast-length shows.
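The two sets of figures can be cross-checked with back-of-envelope arithmetic, since peak-hour traffic share is roughly concurrent users multiplied by bitrate:

```python
# Back-of-envelope check: traffic share ~ concurrent users x bitrate.
# Figures are the approximate values quoted for the Plusnet peak hour.
youtube_kbps, iplayer_kbps = 200, 500      # approximate stream bitrates
youtube_share, iplayer_share = 0.17, 0.07  # share of peak-hour traffic

# Implied ratio of concurrent YouTube users to concurrent iPlayer users
implied_user_ratio = (youtube_share / youtube_kbps) / (iplayer_share / iplayer_kbps)
# ~6x, consistent with the "roughly seven times" figure given the rounding
```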

P2P is declining in importance

The really interesting part of the Plusnet data is that peak-hour streaming, at around 30%, far outweighs p2p and usenet traffic at around 10%. Admittedly, peak-hour p2p/usenet traffic at Plusnet is probably far lower than at other ISPs, but it goes to show how ISPs can control their destiny and manage consumption through open and transparent traffic-shaping policies. Overall, p2p accounts for 26% of Plusnet traffic across a 24-hour window – the policies are obviously working, and people are doing their p2p and usenet downloading when the network is not busy.

Quality and therefore bandwidth bound to increase

Both YouTube and the iPlayer are relatively low-bandwidth solutions compared to broadcast-quality shows in either SD (standard definition) or HD (high definition). However, applications are emerging which are real headache material for the ISPs.

The most interesting emerging application is the Move Networks media player. This player is already in use by Fox, ABC, ESPN, Discovery and Televisa — amongst others. In the UK, it is currently only used by ChannelBee, which is a new online channel launched by Tim Lovejoy of Soccer AM fame.

The interesting part of the Move Networks technology is dynamic adjustment of the bit-rate according to the quality of the connection. It also does not seem to suffer from the buffering “feature” that unfortunately seems to be part of the YouTube experience. Move Networks achieve this by installing a client in the form of a browser plug-in which switches the video stream according to the connection, much in the same way as the TCP protocol adapts to congestion. We have regularly streamed content at 1.5Mbps, which is good enough to view on a big widescreen TV and is indistinguishable to the naked eye from broadcast TV.
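The stream-switching behaviour can be sketched as a simple feedback loop. This is an illustrative model only – the bitrate ladder, thresholds and function name below are our assumptions, not Move Networks’ actual algorithm:

```python
# Illustrative adaptive-bitrate selection, loosely TCP-like: step up one
# rung when the connection comfortably sustains a higher rate, drop down
# as soon as the playback buffer runs low. A sketch of the general
# technique, not Move Networks' actual implementation.
BITRATES_KBPS = [300, 700, 1100, 1500]  # assumed encoding ladder

def next_bitrate(current_idx: int, measured_kbps: float, buffer_secs: float) -> int:
    """Pick the bitrate index to request for the next video chunk."""
    if buffer_secs < 2.0:
        # Buffer nearly empty: back off immediately.
        return max(0, current_idx - 1)
    can_step_up = current_idx + 1 < len(BITRATES_KBPS)
    if can_step_up and measured_kbps > 1.5 * BITRATES_KBPS[current_idx + 1]:
        # Plenty of headroom above the next rung: probe one step up.
        return current_idx + 1
    return current_idx

idx = 1  # start conservatively at 700kbps
idx = next_bitrate(idx, measured_kbps=2400, buffer_secs=8.0)
print(BITRATES_KBPS[idx])  # → 1100
```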

Unlike Akamai there is no secret sauce in the Move Networks technology and we expect other Media Players to start to use similar features — after all every content owner wants the best possible experience for viewers.

Clearing the rights

The amount of iPlayer content is also increasing: Wimbledon coverage was available for the first time, and the Beijing Olympics and the British Golf Open are upcoming. We also expect that the BBC will eventually get permission to make content available outside the iPlayer’s 7-day window. Clearing the rights for the BBC’s vast archive will take many years, but slowly and surely more and more content will become available. This is true for all major broadcasters in the UK and the rest of the world.

YouTube to shrink in importance

It will be extremely interesting to see how YouTube responds to the challenge from the traditional broadcasters – we can’t see a future where YouTube’s market share is anywhere near its current level. We believe watching user-generated, copyright-free content will always be a niche market.

Online video distribution and the associated economics are a key area of study for the Telco 2.0 team.

Full Article: BBC’s iPlayer nukes “all you can eat” ISP business model

The UK’s largest broadcaster finally launched its online video streaming and download service on Christmas Day. Plusnet, a small ISP owned by BT, has provided a preliminary analysis of the traffic, and the results should send shivers down the spine of any ISP currently offering an unlimited “all-you-can-eat” service.

The iPlayer service is basically a 7-day catch-up service which enables people who missed and didn’t record a broadcast to watch the programme at their leisure on a PC connected to the internet. The iPlayer differs from any other internet-based video service in certain key respects:

  1. It is funded by the £135.50 annual licence fee, which pays for the majority of BBC activities. The BBC collected 25.1m licence fees in 2006/7, so no advertising is required for the iPlayer business model to work.
  2. It is heavily promoted on the BBC broadcast TV channels. The BBC had a 42.6% share of overall UK viewing in 2006/7, so a lot of people already knew of the iPlayer’s existence within a month of launch.
  3. It is a high-quality service designed for watching whole programmes rather than consuming small vignettes.

This is in sharp contrast to the current #1 streaming site, YouTube.

A massive rise in costs

The key findings from the Plusnet data for January are that:

  1. more customers are streaming;
  2. streamers are using more; and, most importantly,
  3. peak usage is being pushed up.

For Plusnet, this equates to total streaming costs increasing to £51.7k/month from £17.2k/month, or to 18.3p/user from 6.1p/user. That is a 200% cost increase in just the first MONTH of the service. If we assume that the Plusnet base of 282k customers is a representative sample of the whole UK internet universe, then we can draw some interesting conclusions about the overall impact of the iPlayer on the UK internet. Across the whole UK IPstream base of 8.5m, the introduction of the iPlayer would equate to an increase in costs to roughly £1.5m in January from £0.5m.
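The extrapolation is a straight linear scaling of the Plusnet numbers; running it through gives a shade over £1.5m for January. A minimal sketch:

```python
# Extrapolating Plusnet's January streaming costs to the whole BT IPstream
# base – a naive scaling that assumes Plusnet's 282k customers are
# representative of all 8.5m IPstream users.
plusnet_users = 282_000
ipstream_users = 8_500_000

cost_before = 17_200  # £/month, December (pre-iPlayer)
cost_after = 51_700   # £/month, January (with iPlayer)

scale = ipstream_users / plusnet_users
print(f"Industry cost before: £{cost_before * scale / 1e6:.1f}m/month")  # → £0.5m
print(f"Industry cost after:  £{cost_after * scale / 1e6:.1f}m/month")   # → £1.6m
print(f"Per-user: {cost_before / plusnet_users * 100:.1f}p -> "
      f"{cost_after / plusnet_users * 100:.1f}p")                        # 6.1p -> 18.3p
```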

Despite access unbundling, ‘middle mile’ costs remain a key bottleneck

IPstream is a wholesale product from BT, with BT responsible for the transit of data from the customer’s home to an interconnect point of the ISP’s choice. The ISP pays for bandwidth capacity at the point of interconnect; BT Retail acts like an external ISP in the structurally separated model. The overall effect of the iPlayer on BT’s IPstream-based customers is roughly neutral, with the increase in revenues at wholesale (external base of 4.2m customers) being offset by the increase in costs at BT Retail (total base of 4.2m customers). Of course, this assumes no bandwidth overages at BT Retail, which is probably not the case as both BT and Plusnet have bandwidth caps. In effect, the incremental cost for ISPs using the IPstream product comes from ordering extra BT IPstream pipes, which are sold in 155Mbit/s chunks. The option for the ISP is either to allow a degradation in performance or to order more capacity.

Time to buy more pipes

We tested the bandwidth profile using Wireshark while watching a 59-minute documentary celebrating the 50-year anniversary of Sputnik, via both streaming and P2P delivery. The streaming traffic is easy to analyse as it comes through on port 1935, the port used by Flash for streaming. A jitter-free screening ran on average at around 0.5Mbit/s. Given the 155Mbit/s ordering slice, this means only around 300 people need to be watching the iPlayer at the same time (peak = 8pm-10pm) to fill a pipe. Seeing that IPstream customers are aggregated across the UK to a single point, a lot of ISPs will be thinking about ordering extra capacity. The BBC also offers a P2P download of higher quality than the stream. We managed to download the 500MB file in just over 20 minutes at an average speed of 3.5Mbit/s. The total traffic (including overhead) was 231MB for the streaming and 544MB for the P2P delivery.
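The capacity arithmetic is simple enough to check directly from the measured figures (we take “just over 20 minutes” as roughly 20.5 minutes):

```python
# Back-of-envelope capacity arithmetic from the Wireshark measurements.
pipe_mbps = 155    # BT IPstream interconnect ordering increment
stream_mbps = 0.5  # observed jitter-free iPlayer stream rate

concurrent_streams_per_pipe = pipe_mbps / stream_mbps
print(concurrent_streams_per_pipe)  # → 310.0, i.e. ~300 viewers fill a pipe

# Cross-check the P2P download figures: 544MB of traffic in ~20.5 minutes.
total_mbytes = 544
minutes = 20.5
avg_mbps = total_mbytes * 8 / (minutes * 60)
print(f"{avg_mbps:.1f} Mbit/s")  # → 3.5 Mbit/s, matching the observed speed
```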

Full unbundling still leaves ISPs at the mercy of backhaul costs

The story for facilities-based LLU (Local Loop Unbundling) players, which account for another 3.7m UK broadband customers, is slightly different, as it depends completely on network design and the distribution of the base across the exchanges. Telco 2.0 market intelligence says that some unbundlers have ordered 1-gig backhaul links and should be unaffected, at least in the short term. However, some unbundlers have only ordered 100-meg links and could be in deep trouble, with users really noticing the difference in experience at peak hour. The only real option for these unbundlers is to order extra capacity on their backhaul links, which could be extremely expensive. The average speed for someone just browsing and doing email is quite low compared to someone sat back watching video streams.

Cable companies understand sending telly over wires

The story for Virgin Media, the main UK cable operator with 3.3m broadband subscribers, again depends on network design – this time on the load on the UBR (Universal Broadband Router) within each network segment. Virgin Media have a special angle here, as the iPlayer will be coming to their video-on-demand service in the spring, and we assume this will take a lot of load off their IP network. The Virgin VoD service runs on dedicated bandwidth within their network and allows the content to be watched on a TV rather than a PC – a big bonus for Virgin Media subscribers.

Modelling the cost impact

For both cable and LLU players the cost profile is radically different from IPstream players, and it is not a trivial task to calculate the impact. However, we can extrapolate the Plusnet traffic figures to estimate the effect on data volumes. We have modelled four scenarios: usage staying at the January 2008 level (an average of 19 min/month/user), then rising to 1 hour/month, 1 hour/week and 1 hour/day. These would give an increase in cost of £1,035k/month, £3,243k/month, £14,053k/month and £98,638k/month respectively for the IPstream industry, based solely on Plusnet cost assumptions. Of course, this assumes the IPstream base stays the same (and they don’t all just go bust straight away!). Across the whole of the UK ISP industry, the increase in traffic (GB/month) is 1,166, 3,655, 15,837 and 111,161 respectively. That’s a lot of data. The obvious conclusion is that ISP pricing will need to rise and extra capacity will need to be added. The data reinforces the belief expressed in our recent Broadband Report that “Video will kill the ISP star”. The problem with the current ISP model is that it is like an all-you-can-eat buffet, where one in ten customers eats all the food, one in a hundred takes his chair home too, and one in a thousand unscrews all the fixtures and fittings and loads them into a van as well.
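The four scenarios are a linear scaling of cost and traffic with viewing minutes from the January baseline; running the scaling reproduces the quoted figures to within a couple of percent (the residual differences are presumably rounding in the underlying Plusnet numbers). A minimal sketch:

```python
# Scale cost and traffic linearly with viewing minutes from the
# January 2008 baseline of 19 min/month/user.
base_minutes = 19        # Jan 2008 average viewing, min/month/user
base_cost_k = 1_035      # £k/month increase, IPstream industry
base_traffic_gb = 1_166  # GB/month increase, whole UK ISP industry

scenarios = {
    "Jan 2008 level": 19,
    "1 hour/month": 60,
    "1 hour/week": 60 * 52 / 12,   # ~260 min/month
    "1 hour/day": 60 * 365 / 12,   # ~1825 min/month
}

for name, minutes in scenarios.items():
    factor = minutes / base_minutes
    print(f"{name:>15}: ~£{base_cost_k * factor:,.0f}k/month, "
          f"~{base_traffic_gb * factor:,.0f} GB/month")
```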

A trigger for industry structural change?

An interesting corollary to the increase in costs for the ISPs is that we believe the iPlayer will actually speed up consolidation across the industry and make the life of smaller ISPs even more difficult than it is today. Additionally, because of the high bandwidth needs of the iPlayer, the long copper lengths in rural England and the lack of cable or LLU competition to the IPstream product, we believe the iPlayer will widen the digital divide between rural and suburban UK. The iPlayer also poses an interesting question for the legion of UK small businesses who rely on broadband and yet don’t have a full set of telecommunications skills. What do they do about the employee who wants to eat lunch at their desk whilst simultaneously watching last night’s episode of top soap EastEnders?

Time to stop the game of ‘pass the distribution cost parcel’

The BBC is actually in quite a difficult position, especially as publicity mounts over the coming months when users break their bandwidth limits and more and more start to get charged for overages. UK licence payers expect that they paid for both content and distribution when they handed over their £135.50. In 2006/7, the BBC paid £99.7m for distributing its broadcast TV signal, £42.6m for its radio signal and only £8.8m for its online content – out of a total of £3.2bn licence fee income. We would suggest that the easiest way for the BBC to escape the iPlayer conundrum is to pay an equitable fee to the ISPs for distributing its content, so that the ISP plan comes with unlimited BBC content, possibly with a small retail mark-up. The alternative of traffic-shaping your users to death doesn’t seem like a great way of creating high customer satisfaction. The old media saying sums up the situation quite nicely:

“If content is King, then distribution is King Kong”

[Ed – to participate in the debate on sustainable business models in the telecoms-media-tech space, do come to the Telco 2.0 ‘Executive Brainstorm’ on 16-17 April in London.]