
Ventana Research has just released its 2013 Value Index for Agent Desktop Management, in which we evaluate the competency and maturity of vendors and products that support the management of the desktop systems that agents use to handle customer interactions. Our firm has researched this software category for many years, and our benchmark research into customer service and the agent desktop shows the impact the agent desktop has on agent satisfaction and efficiency and the business outcome of such interactions. Because of its increasing importance, we have taken agent desktop management out of our Customer Experience Value Index and created a separate category for it.

I am excited to provide research and education on this critical software. It is essential in every contact center and industry, but its importance is often not recognized by businesses or other analyst firms. Our research on organizations using this software not only uncovered best practices and trends but also highlighted what businesses can do to improve competencies across their workforce and processes. The new Value Index for Agent Desktop Management assesses vendors and their products and whether they meet companies’ needs, based on what participants in our agent desktop benchmark research told us was important to them.

The Ventana Research methodology uses a request-for-proposal and assessment-based approach that looks more closely at the software itself than at a vendor’s vision or ability to sell it. Each Value Index takes six months to complete; unlike other analyst firms, we look at the product details that matter most for successful use and benefits. We evaluate agent desktop vendors on seven categories that are essential for achieving expected benefits: usability, reliability, manageability, adaptability and capability of the products, as well as the customer assurance areas of validation and TCO and ROI. We assign weight to each category according to its priority to buyers, and total the results to 100 percent for scoring purposes. In the process, we identify best and worst practices to further refine how we assess technology vendors in each category. For instance, this year we placed a heavier emphasis on usability, a factor that organizations in our 2013 benchmark research indicated is becoming more important to the value of software used. You can read the details on our methodology and process in the 2013 Agent Desktop Management Value Index market report.
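To make the weighted-category scoring concrete, here is a small sketch. The seven category names come from the text, but the weights and scores below are invented for illustration; the actual Value Index weights are published in the market report.

```python
# Hypothetical illustration of weighted-category scoring; the category names
# match the Value Index, but these weights and scores are invented examples.
CATEGORY_WEIGHTS = {
    "usability": 0.20,
    "reliability": 0.15,
    "manageability": 0.15,
    "adaptability": 0.10,
    "capability": 0.20,
    "validation": 0.10,
    "tco_roi": 0.10,
}  # weights total 1.0, i.e. 100 percent


def weighted_score(category_scores: dict[str, float]) -> float:
    """Combine per-category scores (0-100) into one overall score."""
    assert abs(sum(CATEGORY_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(CATEGORY_WEIGHTS[c] * category_scores.get(c, 0.0)
               for c in CATEGORY_WEIGHTS)


vendor = {c: 80.0 for c in CATEGORY_WEIGHTS}
print(weighted_score(vendor))  # a vendor scoring 80 everywhere totals ~80
```

Because the weights sum to 100 percent, a heavier usability weight, as in this year's index, shifts the overall score toward vendors that do well in that category.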

Our Value Index analysis for agent desktop management looks at a range of needs across industries and across companies and contact centers of all sizes. The Value Index examines requirements by job role, including management, contact center managers, supervisors and agents, and IT groups that support contact center systems. We also examine whether a system provides in-depth capabilities and features such as easy-to-use interfaces to optimize desktop use; workflow alerts and triggers to align operations; access to multiple channels of communication; the ability to find customer information easily so agents can provide personalized responses and complete after-call tasks; agent access to training information and dashboards; optimization of back-office processes; data capture and analysis; integration with communication systems and other business applications; analysis of interaction handling performance; and administrative capabilities to help create agent-specific desktops and manage use of the software.

Our analysis this year rates eight vendors Hot, which, as the highest value level, demonstrates product maturity. Upstream Works ranks at the top, followed by Cicero, OpenSpan, Jacada, Cincom, Altitude Software and KANA. Upstream Works retains the top position it achieved last year, while Altitude Software significantly improves its position and KANA enters as a Hot vendor. NICE Systems enters as a Warm vendor with a product that focuses more on optimizing agent performance and less on supporting multiple communication channels and integration with other systems. Genesys remains a Warm vendor, as its product focuses on telephony and lacks sophisticated integration capabilities. SmartPoint and RiverStar dropped out of the analysis, declining to participate.

In line with the increasing importance of the agent desktop and its impact on the agent and overall customer experiences, the agent desktop management market has become highly competitive. Much has changed since companies such as Jacada and Genesys released their early systems, which focused on telephony and, in the case of Jacada, hid all systems from the user behind a replacement user interface. Today’s most advanced systems allow companies to choose the style of interface they want, support more channels of communication, guide users on the next best action, offer point-and-click capabilities to support ease of integration with other systems, and support more advanced analysis and presentation of performance information. All of the Hot vendors have released updated versions of their products in an effort to keep up with these requirements. As we note in the report, Upstream Works has released a new product, based on Cisco’s new offering, that changes the user interface and mode of operation, and Enghouse Interactive has entered the market with a system that focuses on optimizing access to multiple communication channels. The agent desktop management category thus is maturing rapidly, with new vendors entering the market and new capabilities to support emerging requirements, such as social customer service and mobility. This Value Index offers a guide to which vendors are in the market and their products’ maturity levels, providing a good starting point as companies evaluate how to improve customer interaction performance.

We take pride in our Value Index, and we believe it’s cool to be a Hot vendor. Unlike other analyst firms, we recognize the impact the agent desktop has on agent satisfaction and thus the customer experience. If you look at our research or talk to contact center managers, you will see that the agent desktop is becoming more complex as agents need to access more channels of communication and applications. Today’s desktop systems can therefore play a part in not only making processes easier and more efficient, but also ensuring that more agents follow best practices and achieve the best outcomes. To different degrees, our Hot vendors demonstrate capabilities that support these objectives, and each should be recognized for its efforts.

Congratulations to the vendors that stood up to our detailed assessment processes and granular analysis, which represent how organizations assess and select vendors. We’re proud of our objective and in-depth analysis, which we publish without review or editing by the technology vendors, unlike other analyst firms. While some vendors may object to the results, our independence provides the basis for the most trusted research in the industry. If you want further information, please download the executive summary and let us know if you need help selecting the right vendor for your agent desktop needs. We look forward to continuing to offer guidance to buyers in this critical application category, and to helping business and IT professionals who need to run the most efficient, results-oriented and profitable organizations.


Richard J. Snow

VP & Research Director

In my research area, a lot is said and written these days about optimizing the customer experience. Some say it is done by improving key performance metrics such as customer satisfaction (CSAT), net promoter score (NPS) and customer effort score (CES). Others say customer experience management (CEM) is the “new CRM”; some think it is part of a multichannel service strategy, and for others it is as simple as managing social media. In my view it takes all of these, and other efforts, to optimize the customer experience, and thus it is difficult for companies to achieve. Customer experience management is the practice of managing the effectiveness of customer interactions so the outcome meets the customer’s and the company’s expectations. In any case, the key question is how companies achieve this goal. 

We consider CEM as a form of performance management, which Ventana Research defines as the strategy, methodologies and process of managing the performance of the organization and its business network by leveraging assets to achieve a common set of goals and objectives. In practice performance management requires measuring what happened in the past and what is currently happening, understanding why things happened and then taking action to improve the efforts of people, processes, information and technology. In the context of customer experience management this means measuring the outcome of all types of interactions (ads, marketing campaigns, calls to the contact center, IVR menus, visits to the website, emails, letters, text messages, video calls, watching videos, instant message sessions and even one-on-one meetings), identifying the reason for the interaction (such as a product complaint, a request for missing information or an inquiry as the result of a marketing email), analyzing why the outcome was what it was, and then making the necessary changes so that subsequent customer experiences match up to the customer’s and the company’s expectations. 

Measuring the outcome of customer interactions is in many instances straightforward; examples include the time a call took to complete, the sales value of the interaction, sales conversion rates on the website or whether the customer’s issue was closed. For metrics such as NPS and CES it is a matter of following the required process – usually a simple poll question – and companies can derive concrete answers. But rather often what appears to be simple is actually complex. Take first-call (or contact) resolution (FCR) for instance. Determining whether the customer’s issue was resolved during the first interaction can be complex because companies need to link past interactions across different communication channels to be sure that customers don’t have to get back in touch because the issue wasn’t resolved on the first channel or the underlying issue wasn’t really resolved. Another increasingly popular measure is the lifetime value of customers; in general, the higher the value, the higher the likelihood that the customer is satisfied because he or she continues to spend. The complexity here derives from the number of systems that have to be accessed to arrive at the full cost of doing business with a customer (including marketing, sales and support across multiple channels) and to a lesser extent the number of systems required to determine the revenue per customer.
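The FCR linkage problem described above can be sketched in a few lines: group interactions by customer and issue across all channels, then count an issue as resolved first-contact only if no follow-up contact arrives within some window. The field names and the seven-day window here are assumptions for illustration, not any vendor's actual logic.

```python
# Hypothetical sketch of cross-channel first-contact-resolution (FCR) measurement.
# Assumes each interaction record carries a customer id, an issue id, a channel
# and a timestamp; these fields and the follow-up window are invented examples.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Interaction:
    customer_id: str
    issue_id: str
    channel: str  # "phone", "email", "web", ...
    at: datetime


def fcr_rate(interactions: list[Interaction],
             window: timedelta = timedelta(days=7)) -> float:
    """Share of issues with no repeat contact (on any channel) within the window."""
    by_issue: dict[tuple[str, str], list[datetime]] = {}
    for i in interactions:
        by_issue.setdefault((i.customer_id, i.issue_id), []).append(i.at)

    resolved_first = 0
    for times in by_issue.values():
        times.sort()
        # resolved on first contact if there was no follow-up within the window
        if len(times) == 1 or times[1] - times[0] > window:
            resolved_first += 1
    return resolved_first / len(by_issue) if by_issue else 0.0
```

The key design point is that the grouping ignores the channel: a phone call followed by an email about the same issue still counts against FCR, which is exactly why single-channel measurement overstates it.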

One of the most important outcomes is satisfied customers; indeed, my research into CEM shows this is the most important metric for most companies. This research also shows that companies use numerous methods of measuring CSAT, from simply getting agents to click on a smiley face, to outbound calls, to IVR surveys and several others; some of these produce more objective and consistent results than others. The most reliable way is to solicit feedback by asking customers to complete a survey after each interaction. Letting customers choose the survey method increases completion rates. The most efficient way to analyze completed surveys is to use a text analytics tool that can be programmed to analyze all text-based inputs to spot hot issues, trends and customer sentiments. The most mature feedback management tools, based on the Ventana Research Customer Experience Management Value Index, can also produce CSAT scores based on rules applied across all interactions. The leading vendors according to this index include Verint, Confirmit, ResponseTek and MarketTools.
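As a toy illustration of the "spot hot issues" idea, the sketch below counts issue keywords in free-text survey comments. A real text analytics product would use far richer NLP than keyword matching; the keyword list here is entirely hypothetical.

```python
# Toy illustration of surfacing "hot issues" from free-text survey comments
# by keyword frequency; real text analytics tools go far beyond this.
from collections import Counter
import re

ISSUE_KEYWORDS = {"billing", "outage", "refund", "delivery"}  # hypothetical list


def hot_issues(comments: list[str], top_n: int = 3) -> list[tuple[str, int]]:
    """Return the most frequently mentioned issue keywords across comments."""
    counts: Counter[str] = Counter()
    for text in comments:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word in ISSUE_KEYWORDS:
                counts[word] += 1
    return counts.most_common(top_n)
```

Even this crude version shows the payoff: once comments from every channel flow into one analysis, recurring issues surface automatically instead of waiting for someone to read thousands of surveys.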

Understanding the outcome of interactions is important. But it is equally important to understand why customers contact the company, as this is likely to have had the initial impact on the customer’s satisfaction and mood. Was it, for example, because their bill was wrong or they couldn’t understand it? To gain this understanding the most mature companies use root-cause analysis that can track trends and events across business units, processes and communication channels. It can, for example, explain an increase in complaints resulting from a mobile phone cell failing, a particular email campaign having generated more than the expected number of calls, or an increase in complaints because queue lengths grow when agents call in sick. With this type of understanding, companies can address the underlying causes, leading to fewer calls to the contact center and less adverse social media comment.

The final step is to take action for improvement. This step can be automated by deploying software that raises alerts based on rules (for example, if a key metric falls outside a set range it sends a message to a designated person), includes workflow tools that create tasks and monitor their flow through a predefined process, and produces a dashboard that shows not only metrics but also trends in how they change over time. 
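The rule-based alerting described above can be sketched simply: each key metric gets an acceptable range, and any value that falls outside it triggers a notification to a designated person. The metric names, ranges and notification mechanism below are assumptions for illustration only.

```python
# Minimal sketch of metric-threshold alerting as described above; the metric
# names, acceptable ranges and notification target are all hypothetical.
from typing import Callable

RULES = {
    # metric name -> (minimum acceptable, maximum acceptable)
    "csat": (75.0, 100.0),
    "avg_handle_time_sec": (0.0, 360.0),
}


def check_metrics(metrics: dict[str, float],
                  notify: Callable[[str], None]) -> list[str]:
    """Alert the designated person for every metric outside its set range."""
    breached = []
    for name, value in metrics.items():
        lo, hi = RULES.get(name, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            breached.append(name)
            notify(f"ALERT: {name}={value} outside range [{lo}, {hi}]")
    return breached
```

In a workflow tool the `notify` callback would create a task and route it through a predefined process; a dashboard would plot the same metric values over time to show trends, not just the current breach.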

This is not to say that improving the customer experience is just about technology; technology’s role is to support people, processes and information. Outcome analysis, for example, should be linked to people’s performance: If a particular customer was satisfied because this agent acted this way, gave this information and said these things, that might constitute a best practice for interaction-handling. Armed with this analysis, a company could create focused training and coaching that address each individual’s needs and bring everyone as close as possible to using a best practice.  

The same is true of processes: Analysis enables you to discover that the best outcomes were achieved by people who followed this path and took these actions when handling interactions of a certain type. Process or desktop analytics products can map how different people handle different interaction types and generate the best practice processes. Companies can then use a combination of training, coaching and technology (for example, a smart agent desktop from top-performing vendors such as Upstream Works, OpenSpan, Cicero, and Cincom) to increase the number of people following best practices. Finally, companies shouldn’t ignore the impact of metrics and the way they reward people for achieving certain metrics; for example, I have seen how agents who are rewarded for holding down average handling times (AHT) find ways of staying within target, often to the detriment of customer satisfaction. Companies need to have a balanced set of efficiency and effectiveness (outcome) metrics that relate to their targeted business goals and to reward people for achieving them. 

It has never been more important to ensure that customers feel satisfied after every interaction, not least because companies can quickly find themselves exposed on social media. And it has never been more difficult to achieve this goal because customers are now more demanding and want to communicate over many more channels. Technology can help, but successful CEM requires companies to be more customer-focused and willing to change.

Is your company engaged in customer experience management? If so, I’d love to know how you do it, so please come and collaborate with me. 


Richard Snow – VP & Research Director
