Measuring Subscriber Experience in the Hyper-Connected World

In both emerging and developed markets, heavy competition among service providers is driving down margins. Innovative offerings such as Google Fi and eSIMs introduce new forms of competition that are exciting for the subscriber but potentially alarming for telecommunications providers: they threaten the proverbial goose that lays the golden egg, the captive subscriber. By making the underlying infrastructure invisible, within the subscriber's limits of tolerance for perceived QoS, these innovations shift the focus to "knowing" the subscriber better in order to drive revenue. The emphasis becomes collecting as much data as possible about the subscriber and using it to own and drive the connectivity experience. In what could become a replay of the disruption wrought by e-commerce companies on the brick-and-mortar retail industry, behavioural data becomes the currency with which subscribers pay for access to nearly frictionless, quality-controlled services provided by the same brands they already trust to empower their digital selves.

It is instructive to examine why such innovations have disruptive potential in the first place. Telecommunications is an essential service that many subscribers now depend on to run their daily lives. Since critical as well as non-essential activities hinge on staying connected, the service must not only be available but also deliver an experience that meets expectations. Different geographic regions, cultures and user groups have differing levels of tolerance for loss of connectivity. Nevertheless, subscriber experience fundamentally rests on being connected, whether by voice or by data, for the vast majority of the time. Subscriber experience in turn drives churn (or stickiness), thus impacting revenue directly.

Measuring subscriber experience at the individual level is a daunting challenge for a host of reasons. A typical subscriber uses a variety of services layered atop the connectivity enabled by the telecommunications provider. At the lowest level, voice and data power the most typical usage of the network. Within data, there is a wide choice of services, powered by OTT applications, each catering to specific sets of preferred activities: chatting with friends, group discussions, watching videos, playing games, working remotely and the like.

The second factor adding to the complexity of measuring subscriber experience is that different groups of users across regions and cultures have differing expectations of service when using particular applications. A chat user expects near-instantaneous message delivery but is largely insulated from transient network-related packet losses, which are masked by retransmission. A video watcher in a market with developed telecommunications infrastructure, on the other hand, expects high fidelity in the streaming experience, and transient packet losses can drastically degrade it. Even among video watchers, the content of the video shapes QoS expectations: a subscriber catching up on favourite TV episodes online might not be as frustrated by a temporary loss of fidelity as one watching highlights of a favourite football team, where the loss blurs fast movements and the ball itself.
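To make the idea concrete, the mapping from a single network impairment to very different perceived experiences can be sketched as per-application tolerance profiles. All threshold values and names below are hypothetical, chosen only to illustrate the point, not drawn from any real QoS standard:

```python
# Illustrative sketch: per-application QoS tolerance profiles.
# Every threshold here is a made-up value for illustration only.
QOS_PROFILES = {
    # max tolerable packet loss (%) and latency (ms) before the
    # perceived experience degrades noticeably (assumed figures)
    "chat":     {"max_loss_pct": 5.0, "max_latency_ms": 800},
    "video_hd": {"max_loss_pct": 0.5, "max_latency_ms": 150},
    "gaming":   {"max_loss_pct": 1.0, "max_latency_ms": 80},
}

def experience_score(app: str, loss_pct: float, latency_ms: float) -> float:
    """Return a 0..1 score: 1.0 while measurements stay within the
    application's tolerance, falling linearly once they exceed it."""
    profile = QOS_PROFILES[app]
    loss_ratio = loss_pct / profile["max_loss_pct"]
    latency_ratio = latency_ms / profile["max_latency_ms"]
    worst = max(loss_ratio, latency_ratio)
    return 1.0 if worst <= 1.0 else max(0.0, 1.0 - (worst - 1.0))

# The same transient 1% loss at 200 ms latency barely registers for a
# chat user but pushes an HD video session entirely out of tolerance:
print(experience_score("chat", 1.0, 200))      # -> 1.0
print(experience_score("video_hd", 1.0, 200))  # -> 0.0
```

The key design point is that the raw network measurements are identical in both calls; only the application context changes the verdict.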

A third compounding factor is the size of the subscriber base and the massive volume of data generated by each subscriber. To measure subscriber experience at the individual level, it is necessary to first collect the data from each subscriber. This is already large in volume, yet data from the network and the OSS in general must also be added to the mix. Some of the actions arising from the measured experience must potentially be taken immediately: for instance, if a subscriber is predicted to churn given a recent bad experience, suitable action must be taken at once rather than waiting for all the data to land somewhere before being analysed.
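The "act immediately" path described above can be sketched as a per-event check against a rolling window, evaluated as each measurement arrives rather than in a later batch job. The window size, threshold and event shape here are all assumptions for illustration:

```python
from collections import defaultdict, deque

# Hypothetical sketch of the streaming path: each experience event is
# checked on arrival instead of waiting for a batch analysis run.
WINDOW = 5          # recent events kept per subscriber (assumed)
BAD_THRESHOLD = 3   # bad events in the window that trigger action (assumed)

recent = defaultdict(lambda: deque(maxlen=WINDOW))

def on_event(subscriber_id: str, experience_ok: bool) -> bool:
    """Ingest one experience measurement; return True if the subscriber
    should be flagged for immediate churn-prevention action."""
    recent[subscriber_id].append(experience_ok)
    bad = sum(1 for ok in recent[subscriber_id] if not ok)
    return bad >= BAD_THRESHOLD

# A burst of poor experiences flags the subscriber without any batch job:
events = [("sub-42", True), ("sub-42", False),
          ("sub-42", False), ("sub-42", False)]
flags = [on_event(sid, ok) for sid, ok in events]
print(flags)  # [False, False, False, True]
```

In a production pipeline the same logic would sit behind a stream processor; the point of the sketch is only that the decision is made per event, with history kept small and local.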

Traditional approaches to subscriber experience measurement have operated either at an aggregate level or, where individually measured, on an as-needed basis. Aggregate-level measurement is typically done using OSS data and therefore loses valuable individual-level information. Individual assessment today usually happens only when poor QoS is escalated to the service provider, for example during a call to the service centre. Until the advent of big data, the technologies required to address these three challenges were not affordable and could not be flexibly adapted to serve customised needs. At the same time, it would be foolhardy to imagine that introducing specific technologies like Hadoop to store large amounts of data would by itself solve the challenges.

Solving these three challenges involves applying big data techniques and tools to engineer large-scale data pipelines that not only store the vast amounts of data but also process them both immediately and in large batches that compare against past histories. What's more, a holistic framework for tracking subscriber behaviour over time is the need of the hour, so that sense can be made of every new piece of information obtained from the collected data. Ontologies provide useful frameworks for organising subscriber information. Combined with the right skill sets and solutions, big data and large-scale applied data science can then prove fruitful in assessing subscriber experience and acting on the insights generated.
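The role an ontology plays can be hinted at with a minimal sketch: entities and their relationships are named explicitly, so every new measurement has a well-defined place to attach and can be interpreted against accumulated history. The classes, fields and values below are illustrative inventions, not a real subscriber ontology:

```python
from dataclasses import dataclass, field

# Minimal ontology-like subscriber model (all names are illustrative).
@dataclass
class Session:
    service: str            # e.g. "video", "chat"
    duration_s: int
    experience_score: float # 0..1, from some upstream scoring step

@dataclass
class Subscriber:
    subscriber_id: str
    region: str
    sessions: list = field(default_factory=list)

    def add_session(self, session: Session) -> None:
        """Attach a new measurement at its defined place in the model."""
        self.sessions.append(session)

    def mean_experience(self, service: str) -> float:
        """Interpret a new data point against accumulated history."""
        scores = [s.experience_score for s in self.sessions
                  if s.service == service]
        return sum(scores) / len(scores) if scores else float("nan")

sub = Subscriber("sub-42", "APAC")
sub.add_session(Session("video", 600, 0.5))
sub.add_session(Session("video", 300, 1.0))
print(sub.mean_experience("video"))  # -> 0.75
```

A real deployment would use a far richer model (devices, plans, locations, network cells), but the organising principle is the same: structure first, so each incoming fact lands somewhere meaningful.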
