By Tim Moynihan
The Affordable Care Act won’t directly touch most Americans until 2014. For healthcare and insurance companies, the ACA touched down in 2010 with the realization that they would have to bring 48.6 million uninsured Americans into the healthcare system.
Contact centers will be on the front lines of this massive effort. Consumers will need help from phone-based customer service reps and interactive voice systems to determine which plan best meets their unique needs. Corporate customers will also require ready answers to complex coverage questions.
In response, healthcare companies are investing heavily in contact center technology. Many are migrating from traditional public switched telephone network (PSTN) environments to flexible Session Initiation Protocol (SIP) environments that scale more cost-effectively. Others are adding IVR ports, creating menu options, and updating routing solutions to provide faster access to increasingly diverse choices of information and agent services.
With the public uncertain about the changes the ACA brings, much is riding on these contact center upgrades. Insurers and healthcare providers realize they must build trust through contact center operations. People need to feel safe and confident when making such life-affecting choices – especially those who are enrolling in a health plan for the first time. Companies sponsoring employee health plans need to know they are properly administered.
Dropped or incorrectly transferred calls, dead-end IVR menus, and voice quality issues damage these important trust-building opportunities. Expanding and reconfiguring networks inevitably causes these kinds of mistakes. However, they don’t have to cause problems with customers. Adhering to best-practice technology deployment and management methodologies reduces the number and severity of implementation issues, yielding smooth processes that build credibility.
Assess Readiness: Every technology implementation has issues: programming bugs, vendor delays, knowledge gaps, and more. However, contact center projects are uniquely susceptible to problems. Voice and video services are resource intensive, which means that every system must be properly provisioned, sized, and integrated to ensure a great experience. A four-phase, pre-deployment testing program helps identify issues with contact center systems before they affect users.
- Phase 1, Network Assessment: To start, organizations need to validate foundational elements to ensure that the carrier connection and IP network functions properly as a whole. This assessment should include a test of security vulnerabilities and the session border controller (SBC) configurations. Once validated, these tests provide important baseline metrics for evaluating performance as additional applications are brought online.
- Phase 2, Evaluating Real-Time (Synchronous) Communications: Voice, chat, and video services are highly vulnerable to quality issues, such as delays and jitter. Companies need to assess the performance of contact center applications (IVR, CTI, and routing) from the caller’s perspective and collect quality of service (QoS) metrics for each application. This phase is critical for revealing interoperability issues.
- Phase 3, Evaluating Non-Real-Time (Asynchronous) Systems: Once real-time services are working properly, companies should then test the data-driven applications the contact center must support (such as email messaging and agent desktop applications). This phase helps companies determine if the additional traffic will affect the quality of real-time communications and ensure that all services are properly provisioned.
- Phase 4, End-to-end Validation: The only way to ensure a great experience is to evaluate the entire system under expected call volumes and traffic conditions. For example, if a company expects 1,000 concurrent calls and 500 concurrent chat sessions, the test needs to be configured accordingly. It is also important to test voice quality from the customer to the agent and back to the customer to ensure a clear conversation for every call.
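As a rough illustration of Phase 4, a load test drives the system at the expected mix of concurrent sessions and flags any that fall below quality targets. This is a minimal sketch: the `run_test_session` stub, the session counts, and the thresholds (MOS ≥ 3.5 for voice, ≤ 1.5 seconds for chat responses) are illustrative assumptions, not taken from any particular testing product.

```python
import concurrent.futures
import random

def run_test_session(kind):
    # Hypothetical stub: a real harness would place a SIP call or open a
    # chat session and return measured quality metrics. Here we simulate.
    if kind == "voice":
        return {"kind": kind, "mos": round(random.uniform(3.0, 4.5), 2)}
    return {"kind": kind, "response_s": round(random.uniform(0.2, 2.0), 2)}

def load_test(voice_sessions=1000, chat_sessions=500, max_workers=50):
    # Build the job mix the company expects, e.g. 1,000 calls + 500 chats.
    jobs = ["voice"] * voice_sessions + ["chat"] * chat_sessions
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(run_test_session, jobs))
    # Flag sessions below target: MOS under 3.5 or chat response over 1.5 s.
    failures = [r for r in results
                if r.get("mos", 5.0) < 3.5 or r.get("response_s", 0.0) > 1.5]
    return results, failures

results, failures = load_test(voice_sessions=100, chat_sessions=50)
print(f"{len(results)} sessions run, {len(failures)} below threshold")
```

An automated harness like this can be re-run after every configuration change, which is what makes it repeatable in a way manual test calls are not.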
Unfortunately, many companies view pre-deployment testing as a luxury and do not adequately plan for it. In fact, a study conducted by the Customer Experience Foundation revealed that poor or absent testing increases costs and causes delays in 79 percent of all projects. To keep projects on time and on budget, companies should allocate five to ten percent of the total project cost for pre-deployment testing. Sadly, the “agent pizza party” approach – employees making manual test calls into the center – is not as effective as an automated, repeatable testing solution for assuring today’s more complex environments.
Provide Ongoing Assurance: In an ideal world, new technologies are released into production environments, and they work perfectly forever. In the real world, real-time communications and customer-facing solutions need to be closely monitored to maintain their performance. However, companies that only look at health statistics for their individual components do not get a complete picture. To understand customer experience, companies need a wider perspective with more meaningful metrics, such as:
- Call Blockage Rate: This measures how well customers can access services. When solutions are not working properly or the contact center cannot handle the volume of customer inquiries, calls are not answered. A high blockage rate has an immediate, negative effect on customer satisfaction.
- Call Abandonment Rate: High abandonment rates indicate application problems, incorrect routing, latencies in back-end communications, or inefficient management of customer service resources. These conditions frustrate customers who are unable to get their problems fixed quickly and efficiently.
- Call Quality: Poor voice quality – a low mean opinion score (MOS) – reflects badly on any company. It also increases call length when customers and agents cannot understand each other and are forced to repeat themselves. In extreme cases, customers will hang up and try again. Either way, these delays are costly in terms of both customer loyalty and overall cost per call.
- Repeat Calls: This is a measurement of how many times a customer contacts the company before his or her issue is resolved. A variety of technical issues can lead to higher repeat call rates: improper routing, long queues, and dropped calls. This key performance indicator (KPI) also reflects how successfully agents are able to satisfy customers the first time.
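The four metrics above can all be derived from per-call records. The sketch below is illustrative only: the record fields (`caller_id`, `outcome`, `mos`) are hypothetical and would map onto whatever a given reporting platform actually exposes.

```python
from collections import Counter

def contact_center_kpis(calls):
    # calls: list of dicts, one per call attempt.
    offered = len(calls)
    blocked = sum(1 for c in calls if c["outcome"] == "blocked")
    abandoned = sum(1 for c in calls if c["outcome"] == "abandoned")
    answered = [c for c in calls if c["outcome"] == "answered"]
    # Average voice quality (MOS) across answered calls only.
    avg_mos = sum(c["mos"] for c in answered) / len(answered) if answered else None
    # Repeat calls: every contact beyond a caller's first counts as a repeat.
    counts = Counter(c["caller_id"] for c in calls)
    repeats = sum(n - 1 for n in counts.values() if n > 1)
    return {
        "blockage_rate": blocked / offered,
        "abandonment_rate": abandoned / offered,
        "avg_mos": avg_mos,
        "repeat_calls": repeats,
    }

sample = [
    {"caller_id": "a", "outcome": "answered", "mos": 4.2},
    {"caller_id": "a", "outcome": "answered", "mos": 3.9},  # second contact
    {"caller_id": "b", "outcome": "abandoned", "mos": None},
    {"caller_id": "c", "outcome": "blocked", "mos": None},
]
kpis = contact_center_kpis(sample)
print(kpis)
```

Computing the KPIs together from one record stream, rather than from separate component dashboards, is what gives the wider customer-experience perspective described above.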
Companies fully committed to quality take a proactive approach, using active monitoring to dial into the contact center and measure a wide range of customer experience-oriented KPIs, including response times, IVR availability, menu functionality, and voice quality. The results are compared to baseline performance measures, and any anomalies are automatically reported to the support staff for immediate investigation. Armed with intelligence on which test failed, and where, these companies can take corrective action to minimize the negative impact on customers.
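The baseline-comparison step of such an active monitoring loop can be sketched as follows. The metric names and threshold values here are hypothetical examples; a real deployment would use the baselines captured during pre-deployment testing.

```python
# Assumed baselines, e.g. captured during the Phase 1 network assessment.
BASELINES = {
    "ivr_answer_s": 2.0,     # max seconds until the IVR answers
    "menu_response_s": 1.0,  # max seconds for a DTMF menu selection to respond
    "mos": 3.8,              # minimum acceptable voice quality score
}

def check_against_baseline(measurements, baselines=BASELINES):
    """Return (metric, measured, limit) tuples for every out-of-range metric."""
    anomalies = []
    for metric, value in measurements.items():
        limit = baselines.get(metric)
        if limit is None:
            continue  # no baseline recorded for this metric
        # MOS is "higher is better"; the timing metrics are "lower is better".
        failed = value < limit if metric == "mos" else value > limit
        if failed:
            anomalies.append((metric, value, limit))
    return anomalies

# A synthetic test call that answered slowly but had good voice quality:
anomalies = check_against_baseline(
    {"ivr_answer_s": 3.4, "menu_response_s": 0.8, "mos": 4.1}
)
print(anomalies)
```

Each returned tuple names the failing test, the measured value, and the baseline it violated, which is exactly the "which test failed, and where" intelligence the support staff needs.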
Prepare for Success: The variety and complexity of options makes it difficult for most individuals to choose the right health plan. In fact, a March 2013 poll conducted by the nonpartisan, nonprofit Kaiser Family Foundation found that 57 percent of Americans didn’t understand how they would be affected by the Affordable Care Act. Being able to have an informed, clear conversation with a knowledgeable service agent will go a long way towards giving people the confidence they need to make such an important decision.
These conversations will take place through contact centers and, by extension, contact center technology. Validating technology deployments through testing and monitoring is only one aspect of creating a reassuring experience, but it is one area that companies can control. Committing to best-practice quality assurance methodologies will pay off now and in the future as ACA regulations take effect and the estimated 48.6 million uninsured seek to comply.
Tim Moynihan is vice president of marketing at Empirix.
[From the August/September 2013 issue of AnswerStat magazine]