
Friday, August 24, 2012

Telcos will suffer because of "subscription myopia". WebRTC & WiFi don't need subs

I've been thinking a lot about WebRTC recently. How and where it will become important, and what it might do to our concepts of voice/video communications and the existing telecom value chain.

It's still very early days, but the momentum and the detail suggest that it will be incredibly important. There are certainly complexities - not least Apple not yet revealing its intentions - but overall the general premise "feels" right. There are no obvious irreversible "gotchas", there are plenty of interesting use-cases, and there is a whole plethora of innovators, from small companies and large ones alike.

See more recent posts on WebRTC here and here, and watch out for the forthcoming Disruptive Analysis research report here.

This is diametrically opposite to things like NFC payments or RCS, for which there are plenty of hard, easily-described and unfixable flaws in the basic concept, and where support and innovation are thin.

WebRTC fits well with the idea that much of what we consider as communications "services" are in fact just "applications", increasingly drifting further down to become "features" and eventually "functions". Messaging is already a long way down that curve - IM chat inside apps such as Facebook, Yammer or Bloomberg is not a "service", any more than the bold-type button is a service. It just sends words from A to B, rather than highlighting them on the page.

WebRTC extends that metaphor to spoken words and visual images. They will just be sent via a browser or web widget (obviously needing access to the camera, microphone, codecs & acoustic processing). It is already possible to have direct browser-to-browser conversations without plug-ins or downloaded applications on the desktop. Mass-market versions of Chrome, Firefox and IE are all likely to support WebRTC during 2013, with a steady move onto mobile over the next couple of years.
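To make "voice as a browser feature" concrete, here is a minimal sketch of what a web page could do to start an audio session. It assumes the standardised getUserMedia / RTCPeerConnection APIs (current browser builds still expose vendor-prefixed variants), and the signalling channel and STUN server shown are placeholders, not any particular product's API:

```typescript
// Minimal sketch: in-page voice with WebRTC (assumes standardised APIs;
// the signalling transport and STUN server are placeholders)
async function startVoiceWidget(signalling: { send: (msg: string) => void }) {
  // Ask the browser for microphone access - no plug-in or download needed
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  // Create a peer connection; the STUN server here is just an example value
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.example.org" }],
  });

  // Attach the local audio track so it gets sent to the far end
  stream.getTracks().forEach(track => pc.addTrack(track, stream));

  // Play whatever audio arrives from the remote browser
  pc.ontrack = (event) => {
    const audio = new Audio();
    audio.srcObject = event.streams[0];
    audio.play();
  };

  // Create an SDP offer and hand it to whatever signalling channel the site uses
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signalling.send(JSON.stringify(pc.localDescription));
}
```

The point is not the specific calls, but that a website or web app can embed a real-time voice session with a few lines of script, no phone number, SIM or subscription involved.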

This will mean that voice communications (and in some cases video, although I think that will be minor) will become much more pervasive, cropping up in all sorts of interesting contexts. I've long talked about "non-telephony" forms of voice, such as Siri, in-game voice chat, push-to-talk, business-process integrated voice and so on. WebRTC is likely to be the single biggest catalyst enabling "voice as a feature" to be used by web developers in the same fashion as any other aspect of HTML.

Maybe in two years' time, you'll be on the Amazon website and you'll suddenly hear a voice saying "Hey, congratulations to all of you browsing right now - there's a 10% discount on everything for the next 5 minutes!". It could be the web equivalent of a supermarket tannoy: "Special on Aisle 3!". That's not a phone call. It's not a service, either. But it is voice communications. Other possibilities are too numerous to mention, but many have observed that this means that "the website becomes the call centre". Not "click to call" or even "Skype me", but just having an in-browser real-time voice interaction in the same fashion we already see with IM chat. Adding WebRTC voice to LinkedIn, Facebook and numerous other sites is obvious, and so are things like web karaoke without plugins, or voiceprint-based authentication instead of passwords.

This is disruptive to both traditional phone calls, and also to "legacy" standalone VoIP clients such as Skype's. It is doubly disruptive to new VoIP platforms such as telcos' IMS-based VoLTE, which is mostly just a recreation of the old telephone mindset, and is having enough problems even doing that.

At the core of this is a central problem for the telecoms industry. It is addicted to - perhaps even enslaved by - the idea of the "subscription". All operators report subscriber numbers, the word SIM means Subscriber Identity Module, and many of the technology elements, such as HSSs and most billing systems, assume subscription-type relationships. Regulation is also heavily subscriber-centric.

Now, subscriptions are a very valuable business model. Ongoing payments are attractive for companies, and predictable for users. Many businesses - including a lot of technology analyst firms - are heavily dependent on subscription revenue streams.

But they're certainly not the only business model, and neither are they without flaws. They mandate an ongoing customer relationship. They assume that the capability being provided is an identifiable and separable service.

While that has been fine for the past 100 years of telephony, it is clear that the landscape is changing. Voice or video communications is going to appear in lots of contexts - service, application, feature, function. Sometimes it will be based on the need for enduring relationships and "reachability", for example with a phone number and subscription. Sometimes it will be transient and in-app.

Some communications capabilities will continue with ongoing identities and billing relationships. Others will be sponsored, free, ad-hoc, one-offs, occasional use, ambient, ad-supported and so on. I'll get my spoken words delivered - and paid for - in as many ways as I get my italic words. Sometimes I'll get italics in my subscribed and paid-for magazines. Sometimes they'll be on a website or billboard for free.

If I want to speak to an Ikea customer service agent with a query on how to put my cupboard together, I don't need their number, and they won't need mine. I'll just click the "help!" button in the Ikea app which has already tried to show me where I'm going wrong, perhaps with a one-off fee associated with it.

Now, it's possible that could be done with telco APIs, hooking into an IMS core and telephony app server. But I might be using a WiFi-only tablet with no associated phone number or operator relationship. And Ikea, in this example, is not going to want to deal with either 100 telcos or the constraints of a collaboration like OneAPI, when it could just add the function simply and easily into the browser or app, with no cost or hassle.

My view is that WebRTC will ultimately be the "ubiquitous" voice and video communications service. There will be more browsers and voice-embedded websites and apps than mobile and fixed phones. The telco/IMS world will be a subset of this, constrained by the narrow formula of subscription-style relationships and defined identities.

Yes, there will be security issues around the perceived dangers of anonymised voice communications. Yes, in some cases network quality will be too poor to support good-enough voice using best-efforts connections. But those will be (fixable) corner cases, and not things to derail the wider trend.

We already see service providers looking at opportunities around WebRTC - addressing services, legacy interoperability, premium billing, perhaps quality enhancement or emergency-calling-as-a-service. AT&T, China Mobile, Telefonica and others have spoken publicly about WebRTC, and I know many more that are watching or involved in the standards work. Vendors like Ericsson are looking too - this is not just a Google / Microsoft / Apple (??) fight, with traditional telecoms getting squashed in the middle.

There are still plenty of questions, and this won't all happen overnight. But one thing is, to my mind, utterly inevitable. Those companies who refuse to see beyond the "subscription" - and those technologies which cannot flex enough for non-subscription relationships - are facing decline into niches or outright irrelevance.

(Footnote: WiFi doesn't need a subscription either. LTE does)
(Footnote #2: One good way for Telcos to get around the legacy subscription mindset & infrastructure base is to pursue Telco-OTT services and business models. Buy the report!)

Thursday, August 23, 2012

Upcoming Telco-OTT & Future of Voice events: US, UK, Asia & MidEast

The debate about Telcos & OTT / Telcos vs. OTT services refuses to die down, spanning VoIP, messaging, content and cloud services.

Recent months have seen continued debate at all levels of the industry. There have been countless articles written with various levels of apocalyptic and/or messianic tone. Skype, Google, Apple, Amazon and Facebook are becoming more entrenched in users' minds and smartphones, along with newcomers such as WhatsApp, Pinterest and Twitter.  We've felt the rumblings of WebRTC (more on that in coming months from me). We have seen operator CEOs opine on both perceived threats and opportunities (most notably Telefonica). We've seen organisations like ETNO try to flex political muscles, lobbying the ITU about the whole structure of the Internet and about permitting operators to levy transport charges (more like telecom-style termination fees) on 3rd-party applications and content.

We've also seen the difficulties of some Internet business models - notably Facebook, Netflix and Zynga. As such, these firms are unlikely ever to pay rent-seeking telcos any form of transport toll without additional value-add that helps their business. Google, the BBC and others have ways around network-quality issues and are unlikely ever to pay for QoS, even if it becomes feasible. The only ways to "monetise" OTT services are those which enhance their current business and revenues, not tax what they're doing already.

We might see some traction in areas like customer data intelligence, identity, congestion or billing APIs, but the operators are desperately slow, especially where they attempt to collaborate.

In the meantime, customer expectations from voice and messaging are changing significantly. Users seem entirely happy without the need for "ubiquity", except as a last-ditch common denominator for people they contact outside their normal network. For almost every use-case of communications, there's something better, cheaper or cooler than phone calls or SMS.

With all this in mind, I'm going to be on the road over the next couple of months, participating in a broad variety of events, and speaking on the central themes of:

  • Clash of ecosystems: telecoms standards, apps & web
  • Telco-OTT services and strategies
  • Future of Voice & Messaging
  • Why telcos cannot hope to "monetise" OTT comms/content/cloud services if all they are offering is data transport
  • At some of the events I'll also be looking at network-side issues like policy & charging, or WiFi offload/onload models

The formats of these events vary. Some are private vendor-led customer conferences at which I am a "stimulus speaker". Some are paid public conferences. I'm also doing various private, behind-closed-doors workshops.
 
First up, on 20th-21st September, I'll be running a 2-day telecoms excellence course in Singapore with Clariden Events. Titled "Managing and Understanding Disruptive New Technologies in the Mobile Telecommunication Business", it will cover a broad array of trends around both communications services (voice, Telco-OTT, WebRTC etc), and the underlying infrastructure. It will cover both global and Asia-specific developments.

On 27th September I'm doing a webinar on Telco-OTT with Acme Packet. Details here.

Next up is the US, where Martin Geddes & I will be running a shortened version of our Future of Voice / Telco-OTT workshops as the pre-conference for Metaswitch's customer forum in Orlando, on October 1st. We're both also speaking or moderating panels in the main part of the conference.

Then, from 14-18th October, I'll be in Dubai at the ITU Telecom World Summit, attended by a variety of global telecom luminaries, including national telecom ministers and operator CEOs. I'm on a couple of panel sessions, including the "Battle of the Ecosystems", which will examine telco business models in the world of OTT services and consumer data. I'll be voicing a number of opinions, including the pivotal role of OTT, and the need to maintain a strict view of the "Real Internet" alongside any other non-neutral varieties of data service. My other session will be on "Service delivery", which will cover areas like IMS, RCS and WebRTC. I'll be arguing that operators and standards bodies need to look beyond legacy platforms such as IMS, if they are to survive the next decade.

October 23rd & 24th in London is the next iteration of my and Martin's full 2-day workshop series on Future of Voice / Telco-OTT. We'll be revamping the material to cover recent developments - from WebRTC, through outcomes from ITU, to updates on RCS/VoLTE launches. Full details are at www.futureofcomms.com and sign-up is here. These interactive, small-group events feature a careful mix of operators, vendors, Internet companies and developers - often with some regulators and investors thrown in as well. We've got a very strict 25-person maximum so we can give personal attention to everyone's specific situation, and encourage collaboration between people in the room.

I'll also be chairing the Total Telecom World conference in London on November 13th, which will also examine OTT disruption, and how to rebuild the telecom ecosystem to recapture growth and revenue.

There will probably also be a few other events I'll be at in 2012 - I've already got a Telco-OTT webinar (details soon) and a couple of private presentations/workshops booked in. I'll also probably be wearing my "Telco 2.0" hat at STL's Digital Asia event in Singapore on 3-5 December.

If you're interested in booking me as a stimulus speaker, event chair or panel moderator, please get in touch via information AT disruptive-analysis DOT com.

Lastly, one bit of advance warning. I will NOT be attending MWC'13 in Barcelona next year, for all the same reasons that Alan Quayle eloquently discusses in this blog post. I think the move to the new out-of-town venue will destroy the nature of the event, and I've got no desire to suffer the "commute" to and from central Barcelona. However, I'll be making extra time available for meetings in London the week before - and if anyone fancies joining me, maybe we can organise some sort of Mobile London Congress instead, or at least a few drinks.

Saturday, August 04, 2012

London & Technology: The Mayoral 2012 Debate & my city's future direction

I'm a native of the world's greatest city. And despite my extensive travel schedule, I still live a mile from where I was born, right in the centre of London.

The Olympics - and Team GB's performance - are making me doubly proud of my home. And so I'm absolutely delighted to have been asked to take part in one of the "Mayor of London 2012 Debates" this afternoon - unsurprisingly, the one called "Technology: Disruption or Convergence". The lead speaker is Jimmy Wales, of Wikipedia fame. I'm down as one of eight "spotlight speakers" who will assist the debate with questions to him.

The theme of the event overall is this: "London has demonstrated resilience over many economic cycles, but what role will it play in the global economy as we shift to meet new challenges, and how can it incubate innovation?"

I've watched with interest in recent years as the technology industry in London has surged once again, especially in the parts of East London around Shoreditch - sometimes called "Silicon Roundabout", in reference to the Old Street road system. A bunch of interesting startups have emerged, companies like Google have set up shop, and - since it's just a mile from the City of London - sources of investment have filtered in. There's definitely a sense that innovation is indeed being "incubated", and there's certainly no shortage of encouragement for digital businesses in areas like e-commerce, social media and so on.

But I worry slightly that London focuses too much on the glossy - and rather ephemeral - part of the tech value chain. It's very much "all about digital" - an extension of the city's heritage in media, advertising and trade. What's missing, to my mind, is hardware and other more engineering-led disciplines. While design is clearly critical to many firms' success (as Jony Ive's recent knighthood highlights), there is also a need for nuts and bolts to underpin the sexier, flashier parts of the Internet and mobile experience.

Ironically, there are no companies involved in silicon anywhere near Silicon Roundabout.

To me, the reason that Silicon Valley in the US has been so successful is that it has had everything from university-led research at Stanford, to silicon vendors such as Intel, to leading IT/networking players such as HP, Cisco, Apple and Sun, all alongside software (enterprise and consumer), finance and assorted supporting functions. Yes, media in the US tends to congregate in New York or LA, but the Bay Area has had pretty much everything else "on site".

I worry that London doesn't have the same depth - and despite having centres of excellence in places such as Bristol and Cambridge, there isn't the same "corridor" effect. Cambridge is almost exactly the same distance from Old Street as San Jose is from Market Street in San Francisco. Yet the M11 motorway most certainly isn't Highway 101. While a drive from SF to SJ takes you past numerous famous tech locations - Redwood Shores, Cupertino, Palo Alto, Santa Clara - the equivalent trip in the UK embraces Walthamstow, Bishop's Stortford and lots of pretty scenery, as well as some rather unpleasant bits of Northeast London. Although Stansted Airport is directly in between, most international travellers go via Heathrow, which means a battle with traffic and transport right across town.

In the past, UK technology seemed centred on the "M4 Corridor", from West London, past Heathrow, out towards Reading, Swindon and Bristol. Yet with a few exceptions, most of the many offices there are just the UK sales and marketing HQs of US or other international players. Not that much innovation and R&D happens there, and the corridor also lacks investment firepower (not many VCs would want offices in Slough or Basingstoke - they're hardly Sand Hill Road equivalents). Not to mention the fact that London's rather sniffy media industry doesn't really fit with the business parks of Berkshire.

My question this afternoon is therefore going to be on these lines:

At the moment, London has a great deal of creativity and investment in the digital space, from mobile to e-commerce to social media. Yet despite the name ‘Silicon Roundabout’ to refer to the Old Street and East London technology hub, the one thing London lacks is expertise in the ‘silicon’ aspect – there are no IT hardware or semiconductor firms, unlike California. Can London’s technology sector continue to prosper without that hardware-engineering baseline – or can it rely on satellites such as Cambridge and Bristol to supplement its software and design competencies?
 
Personally, I'd like to see an East London / Cambridge corridor being considered in more concrete terms, with better transport links and perhaps enterprise investment incentives being extended there, probably also including the Olympic Park legacy area around Stratford. Otherwise, I fear that for all the hype around London's tech renaissance, we're going to lose out on the synergies and self-reinforcement gained from combining all parts of the value chain. Semiconductors, network design and enterprise software might not appeal to Number 10 Downing Street's desire for photo-opportunities, but that's where a lot of success - and employment and tax revenues - could come from.
 
It might be the Internet age, but Silicon Valley proves that geography - and the chance for people to meet and travel - remains an incredibly strong factor in sustaining innovation and the relevance of a local economy on a global stage. And the valuations of Apple, Cisco, Intel and their peers also illustrate that engineering and hardware make up for their relative lack of sexiness against social media in other - perhaps more important - ways.
 
I'd like to see London build on its current success by extending its investment and innovation into both physical location and new parts of the broader technology industry.
 

Friday, August 03, 2012

Mobile data traffic growth - a thought experiment & forecast

I'm deeply skeptical about a lot of the rhetoric about "mobile data explosions" and "tsunamis". In particular, I believe that a lot of the forecasts are unrealistic and often self-serving. Cisco's VNI is the best-known, but many other vendors (eg Ericsson, Huawei) and analysts put out their own data as well.

The predictions of several more years of 100% traffic growth seem a particularly poor fit, given that numerous operators (eg Vodafone) have reported notable falls in growth rate, often to below 50% annualised in developed markets with tiered/capped plans.

Clearly, suggesting that networks might get overwhelmed is a great way to suggest that operators should "buy more kit". There's also usually a particular focus on video and the percentage of traffic it makes up.

I also think that overstated & misrepresented data traffic forecasts are misused by operators and industry bodies, especially when it comes to talking about "spectrum shortages", or the supposedly onerous effects of Internet content that should justify non-neutrality of service provision. There is a bigger battle being fought here, with the telecoms industry trying to claim spectrum previously used for TV or government functions. This serves two purposes - it makes network expansion simpler than using alternative approaches, and it also reduces the strategic and competitive power of the broadcast industry, which is a gating factor on telcos' IPTV opportunity.


So it is easy to understand why estimates of future mobile data growth get high-balled. There is also a desire to try and create - or at least influence - self-fulfilling prophecies about the role of mobile broadband.

In any case, market forecasting is imprecise, because often the market itself is dependent on decisions made in the light of people reading them. To be accurate, you'd really need to forecast what new actions will occur as a result of your forecasts being believed, which is clearly a circular argument.

In any event, these discussions generally overlook numerous inconvenient issues that ought to be front-and-centre for mobile data before we run to the hills (or the regulators) from the "tsunami":

- Tiered/capped pricing seems to "work" very well in limiting data consumption and congestion, especially if users have a "fuel gauge" and some idea of what activities burn the most of their quota
- Gross measures of traffic "tonnage" don't translate either to costs or congestion. It's traffic in busy hours and busy cells that matters. An extra 10GB of video downloaded at 3am in a rural cell is essentially free. An "offpeak" dataplan might increase reported traffic volumes but have no impact on costs or congestion. In fact, it might generate incremental and very profitable revenue by increasing capacity utilisation during quiet periods.
- For many networks, signalling load (against both the RAN and the core network) is the problem, not data tonnage. Multiple short bursts of data or "pings" clog up the network in different ways to a single, consistent stream. But it's harder to measure and bill for signalling, so it tends to get ignored
- Smaller cells give greater capacity density, at a lower price. They allow better spectrum re-use, reducing the need for new bands. In the long run, we get much more extra capacity by reducing cell size rather than adding extra radio channels - but this conflicts with the desire to grab more spectrum from alternative/competing users.
- Other new technologies and processes will improve network efficiency too: beam-forming, better sectorisation, MIMO and so on. But these are less well-proven than simply adding extra bands.
- "Video" isn't an application, it's hundreds. Amalgamating 500+ different applications and services under a single banner is completely arbitrary and meaningless. It's like saying the web is disproportionately dominated by the colour blue, and it should therefore be treated differently. (It might be green, I don't know).
- The dynamics of demand for mobile broadband are over-simplified. Much of the historic growth has been from "more users" rather than "more use per user". There are additional issues (discussed below) about coverage, device capability and so on.
- Demand growth and capacity growth are not directly linked - especially because "capacity" is impacted by numerous factors such as backhaul as well as radio-network scale. We also find that some base stations are "congested" because they haven't yet been upgraded to the maximum number of radio carriers already available.
- The dynamics of the mobile data market differ for post-paid and pre-paid users. As PAYG (which makes up the bulk of the planet's mobile customer base) becomes predominant, the idea of a monthly allowance will shift to a more usage-linked model. Early evidence suggests that PAYG data users - with smartphones - consume much less data than those on fixed plans. This is not factored into most forecasts.
- Some forecasts start from 2009 or 2010, therefore building in a huge initial leap from a low base. Ignore anything that doesn't use 2011 or 2012 as a start year, as otherwise the statistics will get swamped by vague and patchy measurements of historic data.


We're already satisfying much of the latent demand

I'd argue that in developed markets such as the US and UK, we are already at 50%+ of potential mobile data usage saturation given TODAY's devices, data-plans, apps and user behaviour. Almost anyone who really wants a smartphone already has one.

Many people who want cellular-connected tablets or laptops already have them too. Sure, there are still a few demographics that want them but can't afford them, but that group is diminishing rapidly. The remainder are mostly "laggards" - folk who might like a new device, but who are likely to be "unenthusiastic" in usage behaviour, at least in comparison to those with a 4S or S3 or L900 in their hands 24x7. There are surprising numbers of mobile data "refuseniks" whose attitude is unlikely to change.

If you magically increased the smartphone penetration of all developed countries to 100% right NOW, I'd be surprised if it added more than 50% to tomorrow's data consumption stats.


A thought experiment

Consider a fictional place where 50% of people have mobile broadband (smartphones with data plans, plus some fraction have 2nd/3rd devices), and 50% don't have MBB, either because it's too expensive, or because they live outside coverage, or they're just apathetic.

Let's say that the 50% of current data users are using 1GB/month as an average (mean). There's a mix of capped and uncapped plans, some people are heavy users (with one or two devices), others are more parsimonious, or perhaps just use WiFi a lot.

What is the "unconstrained demand" from these people at around current data-plan prices? I think we can now assume that the mobile broadband marketplace is now pretty efficient at giving people roughly what they want, at roughly the price they're prepared to spend. I'd be surprised if the true unconstrained demand from existing mobile data users would be much more than 50-100% higher than today.  (Yes, if we dropped the average price massively there'd be an elasticity effect and demand would rise, but let's leave that option for a moment).

What are those constraints?


Thinking about Mobile Data Demand Constraints

A good way to think about this is to consider "what would happen to overall data traffic if we removed certain constraints?". This is often a counter-intuitive thought process because it involves going beyond the raw statistics and thinking about the real world and user behaviour.

So it's tempting to say that going from 50% penetration to 100% overnight would result in an instant doubling of traffic. But actually it wouldn't, because the remaining 50% would be much less enthusiastic users than the early adopters, especially on Day 1. Similarly, if we had perfect 3G/4G coverage tomorrow, we'd see traffic growth, but not that much overall, because all the busiest areas are already covered. What's left are big zones of occasional use (rural), and quiet corners of some indoor spaces.

This type of analysis is inherently much more complex, and goes beyond most statisticians' comfort zones. But it's a critical application of common sense. It's also a sanity-checking phase that often seems absent from a lot of the mobile data forecasts I've seen.

For the thought experiment, let's consider I've got a magic wand to remove constraints. Each time, I'm going to leave the other variables untouched, especially price. What might happen?

1) Device penetration - if you gave everyone a smartphone tomorrow with a dataplan they could afford, plus cellular tablets/dongles at a pro-rata penetration to the early-adopter base, I expect we'd get around another 40% of traffic. (Heavily dependent on existing penetration of smartphones, eg 30% vs 50% - by the time some people read this post, we'll be nearing saturation anyway as it's moving so fast). In fact, the figure might be much lower - maybe just +20-30%, because most of those users will be prepay subscribers who tend to have lower data consumption anyway.

2) If we had perfect cellular coverage everywhere, I reckon we'd get a 20% uplift in aggregate traffic. Network planners aren't stupid - they know where the demand is, and the economics of satisfying it. Providing coverage to every mile of road and rail, or to every small village, would definitely be nice - but added together it doesn't compare to a big metro area.

3) Now for a biggie. If we improved network speeds to 4G-type rates, with better latency, what would that do to user behaviour? More video streaming? Probably. More web use? Sure. More cloud-based apps? Perhaps. This is a tough one to predict, but we see some indications from people who move from 3G to LTE (although some of that is about upgrading to a new & better device rather than having a better network). Some stats say that LTE users typically use 50% more data than 3G users - but that is skewed by early adopters switching first. Given today's apps, data plans and user expectations - which are often met pretty well on 3G, after all - I'd say 40% is reasonable if the speed constraints were removed.

4) Device performance is another variable here. A lot of people have quite old devices that are slow, clunky, have low-res screens or are otherwise constrained, irrespective of network capabilities or coverage. But how much of a big deal is this really? Again, most of the real heavy users and enthusiasts do have the latest devices. If you waved the proverbial magic wand and upgraded all the old 3GS's and Galaxy S2's and assorted BlackBerries to today's state of the art, what would happen? Not much, I reckon, again if all other variables were kept constant. (There's a bit of a co-dependency with LTE availability as noted above, though). Across the user base as a whole, we'd see perhaps a 30% uplift in data usage.

Now let's bring all these together to see what might happen if we removed the constraints (except price; again, bear in mind this is with today's typical apps and behaviours):

Device penetration = 1.4x (maybe lower)
Coverage = 1.2x
Network speed = 1.4x
Device performance = 1.3x

Multiplying through, we get an estimate of unconstrained demand = 3x today's constrained demand
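Here's the same arithmetic as a minimal sketch - the four multipliers are simply the rough estimates from points 1-4 above, not measured data:

```typescript
// Sketch of the "remove the constraints" multiplication (estimates from the
// thought experiment above, not measurements)
const constraints = {
  devicePenetration: 1.4,  // everyone gets a smartphone/dataplan tomorrow
  coverage: 1.2,           // perfect cellular coverage
  networkSpeed: 1.4,       // universal 4G-class speeds and latency
  devicePerformance: 1.3,  // all handsets upgraded to today's state of the art
};

const unconstrained = Object.values(constraints).reduce((acc, x) => acc * x, 1);
console.log(unconstrained.toFixed(2));  // ~3.06, i.e. roughly 3x today's demand
```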

But some of this is - in all reality - never going to happen. We're always going to have a spread of device ages and capabilities. We're never going to get 100% coverage. Some people will never use Facebook or Netflix or Dropbox, no matter how fast the network.  PAYG prepay users will use less data for various reasons. And some people will hold onto their Nokia 6310 from 2003, even if you try and bribe them with the latest smartphone for free.

So in the confines of this thought experiment, we're probably addressing 50% of *current* unconstrained demand for mobile data, at current prices.

That's probably true of a bunch of other industries as well. I'd guess that we're at around 50% of unconstrained, price-constant demand for anything from flights (people don't have enough holiday time, are scared of flying, etc) to beer (can't drink at lunchtime before a meeting, too young, health/religious issues, etc).

It also passes the "taste test". Most people don't spend all day moaning about how they're only getting a fraction of the mobile data they want. Generally, apart from a few minor gripes (coverage mostly, plus congestion in some hot-spots), people seem pretty happy that their mobile data demands are being met.

Obviously, this is macro-level stuff. Specific places (eg Olympic Park) clearly see much faster growth in demand as there will be localised drivers. Also, the calculation in the thought experiment above will vary by country a lot too - there are different levels of network rollout, smartphone/dataplan adoption and so on. India, for example, is starting from a much lower base for traffic, and so many of the variables will be considerably higher to work out latent demand.


Mapping future demand

So. Let's say that with current devices, networks, apps, behaviour and pricing, we are dampening consumption of mobile data by a factor of maybe two from the theoretical realistic demand in today's developed markets. That's a useful number to bear in mind, as it means that:

Any future mobile data traffic growth is primarily going to come from new demand, not from satisfying current latent demand.

That's important in technology. It's often said that "usage always expands to fill the capability available", or "build it and they will come", but that's not actually true. The reason that computer processor speed has always been exploited (and so quickly) has been that companies such as Intel have spent lots of time and resources on "demand creation", seeding developers with new technology, running marketing programmes and so on.

So where are all these forecasts for 10x, 20x - even 1000x - growth for mobile data coming from?

Is it just the mobile industry's normal ridiculous arrogance (almost a sense of entitlement about growth) and its propensity to ignore lessons painfully learnt elsewhere in the technology industry?

Well, firstly, there's still a lot of untapped growth in emerging markets, although it again needs to be borne in mind that at current network costs and dataplan prices, per-capita data use there is likely to be lower. A $2 prepay data ARPU is not going to pay for 100% coverage, a $300 subsidy on an iPhone 4S, and a 1GB-per-month plan. In general, that part of the market will be using cheaper/less-capable devices, on thinner/slower networks, with more restrictive tariffs than elsewhere. Those users will also likely adopt different behaviour to "squeeze more from less".

And, as mentioned above, we'll also see growth in smartphone use among late-adopters in developed markets.

So while this means we should see mobile data user numbers continue to grow rapidly, it will also, paradoxically, drive down average data consumption, as heavy users (who may well be using more data year-on-year) get diluted by newer and ever more numerous lighter users.

I have not seen this mathematical inevitability called out in any forecasts, yet it happens in virtually all markets as they progress towards maturity. A grandparent in rural Bolivia getting their first smartphone is unlikely to be using Facebook and streaming video 24x7 on Day 1.
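A hypothetical worked example (the numbers are invented purely for illustration, not taken from any forecast) shows how total traffic can keep rising while the headline per-user average falls:

```typescript
// "Dilution of the average": heavy existing users grow, but numerous light
// newcomers pull the per-user figure down (illustrative numbers only)
const existingUsers = 100;
const existingUsage = 1.0 * 1.3;  // GB/month after 30% year-on-year growth
const newUsers = 60;
const newUsage = 0.2;             // GB/month for late adopters / prepay users

const totalTraffic = existingUsers * existingUsage + newUsers * newUsage; // 142 GB
const averagePerUser = totalTraffic / (existingUsers + newUsers);         // ~0.89 GB

console.log(totalTraffic, averagePerUser.toFixed(2));
// Total traffic still rises (100 -> 142 GB), but the average per user falls
// from 1.0 GB to ~0.89 GB as light newcomers dilute it.
```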

So, where else is the extra usage going to come from? Various sources are possible:

- Even faster / better networks making apps more usable
- Device improvements like bigger screens
- New apps and services (eg mobile cloud-based offers)
- More cellular devices per person (either "attach rates" of personal devices like 3G tablets, or others like M2M that bump up the total)
- Behaviour changes meaning greater time spent on mobile apps
- Lower prices driving elasticity (eg through changing behaviour faster/further)
- Better structured data-plans driving off-peak usage volumes
- Switching from using fixed broadband or WiFi to LTE


Many of these are interlinked, obviously - if the network is faster, it enables new apps and alters people's behaviour.

It is also important to note some downward pressures on users' average consumption of mobile data:

- Better devices and OS's that compress data (eg similar to BlackBerry, Opera Mini & Nokia Asha which route "optimised" data via a server), or third-party software like Onavo's
- Various types of network-based optimisation and compression, especially for streamed video
- More use of adaptive applications (eg HLS-encoded video) which self-optimise to network conditions, or more efficient codecs
- Substitution of 3G/4G data with WiFi, either as true "offload", or (much more importantly) user-driven preference for accessing private WiFi, usually for free
- Older devices being retired (for example, I'm about to cancel my 3G dongle contract as I never use it - I get WiFi almost everywhere I want it)
- App developers becoming wiser and more parsimonious about how their software consumes data, especially if they get better development/testing tools, or are "shamed" into it in appstore ratings

I don't have quantitative forecasts for all of these drivers and constraints. But some appear to me to be especially important:

- More usage per person driven by better apps and devices, and behavioural change. Let's tackle the latter first. I honestly don't think that the average mobile data user is going to increase their time spent on mobile devices by another 2x, 3x, whatever. We're close to saturation on that one already. Better apps and devices? Definitely. I agree that there will be more video and cloud-app usage. I can certainly see scope for an extra 2x or 3x over a 5-8 year period. However, the swing factor here is likely to be tablets, and the evidence suggests more of that extra usage will be WiFi-based.
- WiFi is to my mind the biggest "decelerant" here. Operators are (often) trying to do their own controlled offload of traffic, although I still believe that most examples are going down the wrong path of "seamlessness" with things like ANDSF and Hotspot 2.0. However, that is becoming less relevant anyway, given the huge global explosion of non-operator WiFi and increasing sophistication of users in exploiting it. Partly because of data pricing and caps, users are actively seeking "free WiFi" wherever they go, and becoming adept at using it - especially the high-end power users that normally generate the most traffic. Unless we see lots of WiFi congestion (possible), that move now seems irreversible. Recent Ofcom data shows that more people are becoming "WiFi-primary", just using 3G/4G where they have to.
- More devices per person. Yes, this is going to rise, even though most tablets will likely remain WiFi-only. We'll see various new mobile-enabled gadgets in all walks of life. M2M can, however, be dismissed as a major traffic driver, as the vast bulk of products are low-bandwidth. Despite a few high-consuming categories like digital signage or in-car telemetry, there are none that obviously have the scope to scale to billions of units. Against phones and, to a lesser degree, PCs, tablets & MiFis, they are lost in the noise.
- Despite better uplink speeds and the slow rollout of fibre, I don't see signs of much shift from ADSL/cable to mobile-only. Coverage limitations and the need for IPTV and WiFi are likely to keep fixed broadband largely protected, except in a few niches.
- I do see quite a lot of traffic growth being driven by "off-peak" data plans. Marketeers and their billing systems are becoming smarter. However, this is essentially irrelevant from the point of view of network capex, spectrum needs and so on.

The price elasticity issue is a difficult one to address. If 4G was completely free & ubiquitous, you'd find people getting LTE-enabled 42" TVs in their home, running HDTV over cellular even when there is nobody in the room. In that case, even the most bullish forecasts would still be too low. Clearly, that's an extreme example, but various other milder scenarios are possible.

This is the paradox in all the forecasts: there seem to be no clear assumptions on pricing. Cisco's VNI projections probably are achievable, but only if mobile data is priced at levels from which nobody would make any profit. As we found with flatrate, it's easy to drive usage if you're throwing away money. That said, we may see users being encouraged to migrate to LTE with offers of larger data bundles, which effectively reduces prices.

Taken together, my belief is that the bulk of forecasts are over-hyped. I think that too many of the projections are being made without considering how devices, apps or users are actually changing. There also seems little recognition of the "dilution of the average" as late-adopters, prepay users and developing-market subscribers bring down the headline "per-user" numbers.


Quantitative estimates and forecasts

I haven't done a full spreadsheet model analysis & forecast, but since I'm sure everyone's going to ask me anyway, I'll "take a punt" on overall global data traffic volume growth rates, based on existing published stats & bearing in mind the qualitative factors described in this post. It's worth noting that most spreadsheets I've seen are long on detail but a bit short on common sense, especially with assumptions of per-user data growth continuing inexorably, despite the "average dilution" effects described here. There is also a general assumption that the "subscription" remains the main business model, rather than more ad-hoc usage through PAYG. (Although, obviously, the overhyped 1-800 tollfree models won't be happening)

My prediction for overall global data traffic growth (Indexed to full-year 2011) is as follows:

2012: 70%
2013: 55%
2014: 45%
2015: 40%
2016: 35%
2017-2020: CAGR 30%

In other words, 2011-2016 growth is 7.2x globally - probably more like 4x in developed markets - and by 2015 we should see growth rates in developed markets broadly comparable to those in fixed broadband.
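For anyone wanting to check the headline multiples, here's a minimal sketch that simply compounds the yearly growth rates above from the 2011 baseline (2018-2020 are filled in at the stated 30% CAGR):

```typescript
// Compound the yearly growth rates in the prediction above, indexed to 2011
const yearlyGrowth: Record<number, number> = {
  2012: 0.70, 2013: 0.55, 2014: 0.45, 2015: 0.40, 2016: 0.35,
  2017: 0.30, 2018: 0.30, 2019: 0.30, 2020: 0.30,  // 2017-2020: CAGR 30%
};

let index = 1.0;  // full-year 2011 baseline
for (const [year, growth] of Object.entries(yearlyGrowth)) {
  index *= 1 + growth;
  console.log(`${year}: ${index.toFixed(1)}x 2011 traffic`);
}
// 2016 comes out at ~7.2x and 2017 at ~9.4x - the figures used in the
// comparisons with other forecasts below.
```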

Given advances in technology, this should be "sustainable" with ongoing evolution of networks to LTE, especially small cells & better radio technology like beamforming. Some of the traffic growth is likely to be off-peak, as pricing & policy becomes smarter.

This means that growth will be more of a regular tide than a "tsunami", and definitely not something for operators to panic about - and regulators need to learn to be skeptical of shrill demands for more spectrum.



Some comparisons with other forecasts:

Cisco VNI expects 18x growth from 2011-2016 (vs. 7.2x from my best estimates)
Ericsson predicts 15x growth 2011-2017 (me: 9.4x for that period)
Reading from a chart in this ALU presentation suggests 30x from 2010-2015, or about 10x from 2011-2015 (me: 5.5x)

My peers over at ABI reckon 8x from 2012-2017 (me: 5.5x for that period)
Informa predicts 10x for 2011 to 2016 (me: 7.2x)
Morgan Stanley gives scenarios for 5x, 9x & 23x for 2011-2015 (me: 5.4x)
 
Edit, 14th September - Analysys Mason makes very similar arguments to mine (and has been similarly pessimistic for a while). They reckon 5.5x global growth for 2012-17 (me: 5.6x) and "dangerously low" for Europe. I agree - overcapacity is the risk, not a mythical "spectrum crunch".