Telstra is willing to be publicly measured on its levels of customer service, so long as what is being measured is not operational metrics but actual customer satisfaction levels, the Australian Communications and Media Authority's (ACMA) Reconnecting the Customer public hearing in Melbourne has heard.
Speaking at the hearing, Telstra group managing director, public policy, David Quilty, said clarity over exactly what was being publicly measured was required if the sector’s customer service levels were to improve.
“The question is: What is the right measurement?” he said. “There has been a lot of talk about net promoter scores, and my understanding is that net promoter scores are a commercial measurement, and not focused on customer service.”
According to Quilty, existing industry codes and industry-driven solutions — already in development by the telecommunications sector — were the key to resolving the issue of poor customer service.
“That work is going on as we speak as part of a review of the code; one of the working groups has been specifically tasked to look at the issue of compliance monitoring and enforcement,” he said.
“I think the industry is reasonably well placed [to agree on, and share data on, public customer service], as an industry we measure everything. We measure ourselves to death in some ways, so there is certainly no shortage of options as to measurement, but the question is to get the right ones.”
Also speaking at the hearing, Telstra director of customer service and satisfaction, Jules Scarlett, said the telco was already increasingly publicly measured as it transitioned from an engineering company to a customer-driven one.
“Traditionally [measurement] has been [around] operational metrics and they haven’t actually reflected what the true customer experience is. So, what you need to do is find proxies in measures which do reflect the true customer experience.”
An example of this, Scarlett said, was the telco’s metric around the time to activate a new account, which examined the time taken for specific business processes, but not the overall time it took for a customer to “log on”.
“General satisfaction questions are easily understood by customers, and then there is … some room too to look at the ‘customer effort score’ which is a score [that] is really about … how the customer perceives the amount of effort they have to put in to get an issue resolved.
“I think that is a powerful measure of understanding what the true experience is like. But, I would definitely not be a proponent of heavy operational metrics that truly don’t give you the perspective of what the customer might have experienced.”
Scarlett said that for any measure of customer service among service providers, consumer education — on just what the measurements meant — would be vital.
“I do find that in our industry, even for example with some of the [elements] of complaints statistics that are used, they are easily taken out of context, and if you don’t give the education as well as the measure, you may not have informed consumers any better,” she said.
Commenting on the implementation of a customer service measurement for smaller service providers, Quilty said regulators needed to be mindful that different sized companies had different levels of resourcing and capabilities.
“We have to be careful not to require them to undertake overly burdensome performance measurement compliance,” he said. “We need to, I think, in terms of the things which are most critical to customers, in hopefully a relatively simple way, find ways to measure and apply those across the board.”
Telstra’s comments at the hearing follow those of the Telecommunications Industry Ombudsman, who argued that a lack of clear information about the product being sold and the price consumers would pay was at the heart of the current crisis of confidence between telecommunications customers and providers.
Optus also appeared at the hearing, arguing that upgrading IT systems and ensuring a 'level playing field' would lift customer service levels.