From organizational-level LLM management and advanced BYOLLM support to more enterprise flexibility in handover management and Voice Preview upgrades, Cognigy.AI v4.97 is packed with exciting updates you don’t want to miss!
Dynamic Routing to Multiple Contact Centers
Cognigy streamlines AI-to-human handovers with seamless integration to your preferred contact center platforms. This release takes it a step further, introducing more flexibility and efficiency than ever in chat-to-chat and voice-to-chat handover scenarios.
What’s New:
Previously, the handover settings were tied directly to individual Endpoints, meaning AI Agents could only route conversations on a specific channel to a single contact center.
The new Handover Providers Resource introduces a centralized interface for managing all your contact center integrations in one place, enabling dynamic handovers to multiple contact centers. By decoupling the handover logic from the Endpoint logic, it lets you configure multiple handover providers in a single Agent Flow, deployable on any channel.
Why It Matters:
Enterprises often rely on multiple contact centers for several strategic and operational reasons:
- Specialized Expertise: Different contact centers can be staffed with agents trained in specific areas (technical support, billing, multilingual support, etc.), ensuring customers receive the most accurate and efficient assistance.
- Tiered Services: Multiple contact centers allow for differentiated handling of distinct customer segments (premium, business, or standard), ensuring high-value customers receive faster response times through dedicated agents or priority routing.
- Localized Service: Enterprises operating across diverse regions and time zones can offer localized support with agents fluent in local languages and cultural norms.
- Regulatory Considerations: Different regions may have varying data protection laws, making it necessary to route interactions to a contact center within the correct jurisdiction to comply with local regulations.
- Load Balancing & Business Continuity: Distributing inquiries across several centers helps manage high volumes of interactions while ensuring continuous service delivery in the event of an outage, technical issues, or natural disasters.
Note: This feature only applies to chat-to-chat and voice-to-chat handovers. For voice-to-voice handovers, Cognigy Voice Gateway already supports multiple carrier configurations for routing to various contact centers as required.
Interested in seeing a demo? Register for our Q1 Tech Update
Organization-Wide LLM Resource Management
Cognigy.AI v4.97 also unveils a new Global LLMs Resource in the Admin Center, streamlining LLM management at the organizational level. Administrators can now manage LLM connections centrally and assign a shared configuration to multiple projects.
By removing the need to set up LLMs separately for each project, this feature ensures consistency, eliminates duplicated efforts, and simplifies LLM deployment and maintenance at scale.
Universal Connector for OpenAI-Compatible LLMs
Reinforcing our commitment to best-of-breed LLM flexibility, this release features advanced BYOLLM support with a universal connector for OpenAI-compatible LLMs.
Available within the LLM Resource, the new OpenAI-Compatible LLM type allows you to plug in any provider and model that adheres to OpenAI’s API standards, including self-hosted or fine-tuned solutions.
This offers enterprise users unprecedented LLM interoperability and freedom of choice to balance cost, latency, accuracy, and compliance, all while maintaining a unified agent development experience.
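In practice, an "OpenAI-compatible" provider is one that accepts the Chat Completions request shape at a configurable base URL. The sketch below illustrates that contract only; the helper name, endpoint URL, and model name are hypothetical and not Cognigy's internal implementation:

```python
import json


def build_chat_request(base_url: str, model: str, messages: list) -> tuple:
    """Build the URL and JSON body for an OpenAI-style chat completion call.

    Illustrative helper: any server honoring OpenAI's API contract -- a hosted
    provider or a self-hosted, fine-tuned model -- accepts this request shape.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body


# Hypothetical self-hosted endpoint and model name, for illustration only.
url, body = build_chat_request(
    "https://my-llm.example.com/v1",
    "my-finetuned-model",
    [{"role": "user", "content": "Hello"}],
)
```

Because the request shape is identical across providers, swapping models becomes a configuration change rather than a flow redesign.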
Deepgram Support Added to Voice Preview
Voice Preview in Cognigy.AI enables instant testing and fine-tuning of voice output at any step in a conversation flow without running a full test call. You can experiment with TTS voices and SSML settings in real time to optimize voice experience for specific use cases.
In addition to AWS, Microsoft and Google speech services, you can now also preview the TTS experience of Deepgram to design the perfect voice persona for your AI Agents.
Other Improvements
Cognigy.AI
- Renamed the option from Log Token Count to Show Token Count to more accurately reflect its functionality. This change affects the following Nodes: Search Extract Output, LLM Prompt, LLM Entity Extract, and AI Agent
- Added the Exclude from Transcript option to the Advanced section in Say and Question Nodes to hide an output message from AI Agents
- Added streamed AI Agent messages to the same output bubble in the Interaction Panel, improving readability
- Added the `notifyUrl` parameter to the list of parameters for creating an Outbound Call API request, enabling status updates for the call
- Improved speech fallback to cover additional cases, including invalid SSML in streaming TTS APIs
- Added the `PRE_CACHE_IGNORE_LIST` environment variable to allow specifying a list of vendors that shouldn't be pre-cached. Valid values: `aws`, `deepgram`, `elevenlabs`, `google`, `ibm`, `microsoft`, `nuance`, `wellsaid`, `whisper`, and `playh`. By default, `deepgram` is in the ignore list due to potentially corrupted audio during pre-caching
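As a rough sketch of how such a comma-separated ignore list is typically consumed (the parsing logic below is illustrative, not Cognigy's actual code):

```python
import os

# Illustrative only: exclude Deepgram and Whisper from TTS pre-caching.
os.environ["PRE_CACHE_IGNORE_LIST"] = "deepgram,whisper"

# A service would typically split the value into a vendor set and skip
# pre-caching for any vendor found in it.
ignored = {
    v.strip()
    for v in os.environ.get("PRE_CACHE_IGNORE_LIST", "").split(",")
    if v.strip()
}
```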
Cognigy Webchat
- Added the capability to configure translations for Webchat UI elements in the Webchat Custom Settings of the Webchat v3 Endpoint
- Added the option to customize the Start new conversation button text in the Webchat v3 Endpoint
- Updated Webchat v3 and Demo Webchat to 3.16.0
Cognigy Insights
- Introduced the capability to delete session transcripts in the Transcript Explorer, giving you greater control over your data. This feature enhances data management, boosts confidentiality, and ensures you can easily maintain a clean and secure transcript history.
For further information, check out our complete Release Notes here.