Native Google Gemini & Amazon Nova Support for Agentic AI and More with Cognigy.AI v4.92

8 min read
Nhu Ho · January 10, 2025

Happy New Year! Cognigy.AI’s first release of 2025 delivers significant upgrades, exemplifying our LLM-agnostic approach with built-in support for Google Gemini and Amazon Nova in Agentic AI.

Other highlights include project pinning for easier management, Knowledge Source downloads as .ctxt files for easier refinement and migration, and real-time Flow execution tracking for AI observability.

Extended Multi-Model Support for Agentic AI

Cognigy employs a model-agnostic approach to Large Language Models, ensuring maximum flexibility and adaptability for enterprises to stay ahead in advanced AI development.

With this update, our Agentic AI capabilities now support Google Gemini and Amazon Nova models, alongside OpenAI, Microsoft Azure OpenAI, and Anthropic Claude.

Key benefits of multi-model LLM orchestration include:

  • Performance Optimization: Not all LLMs are optimized for the same tasks or languages. A model-agnostic solution lets you leverage the best-performing model for each use case.
  • Cost Efficiency: The flexibility to use different models for different tasks enables cost optimization based on token consumption and performance requirements.
  • Resilience & Uptime: When a model encounters limitations, errors, or service outages, the availability of a fallback solution ensures uninterrupted AI operations.
  • Best-of-Market AI: Cognigy lets you instantly connect to the latest emerging models, enabling access to state-of-the-art AI.
  • Future-Proofing: As market dynamics and regulations evolve, the ability to select and switch models ensures the long-term viability and compliance of your AI deployment.
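The resilience benefit above rests on a simple mechanism: try providers in a preferred order and fall back when one fails. A minimal sketch of that idea (illustrative only, not Cognigy's internal implementation; the provider names and stub functions are made up):

```python
# Illustrative sketch of multi-model fallback: try an ordered list of LLM
# providers and move to the next one when a call errors out.

def complete_with_fallback(prompt, providers):
    """`providers` is a list of (name, call_fn) pairs tried in order."""
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # outage, rate limit, model error, etc.
            errors.append((name, exc))
    raise RuntimeError(f"All providers failed: {errors}")

# Stub providers standing in for real LLM clients:
def flaky(prompt):
    raise TimeoutError("service outage")

def healthy(prompt):
    return f"answer to: {prompt}"

used, answer = complete_with_fallback("hello", [("gemini", flaky), ("nova", healthy)])
print(used, answer)  # nova answer to: hello
```

In production the ordering would typically encode the cost and performance trade-offs described above, not just availability.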

In addition to Agentic AI, you can leverage multiple LLMs for a wide range of conversational design and orchestration use cases with Cognigy. See the full list here.

Streamlined Project Handling

The new ability to pin and unpin Projects simplifies project handling, especially in complex enterprise deployments.

This feature allows you to quickly prioritize and access frequently used projects, creating a personalized and organized workspace that reduces the time spent searching for important projects.

Easier Optimization and Migration of Knowledge Sources

The ability to download Knowledge Sources as .ctxt files unlocks several advantages:

  • Effortless Migration Across Projects: Besides the option to import/export entire Knowledge Stores using Packages, you can now transfer granular Knowledge Sources between Projects. This feature facilitates scaling, resource reuse, and consistent implementation.
  • Flexible Data Refinement: By downloading Knowledge Sources as .ctxt files, you can more easily refine, restructure, or optimize extracted chunks offline (especially those from complex formats like URLs or PowerPoint files), using familiar tools.

Real-Time Flow Tracking for AI Observability

To facilitate agent testing and debugging, Cognigy highlights the Flow execution path with each conversation turn in the Interaction Panel, detailing every step an AI Agent takes toward resolution, including Tool calls.

Unlike the previous version, where path highlights appeared only after the Flow had finished executing, the latest release introduces granular tracking that updates every 300 ms. Ideal for Agentic workflows, this enhancement lets you follow each step in real time, pinpoint latency bottlenecks, and identify where messages are generated.

Real-time flow tracking

Other Improvements

Cognigy.AI 

  • Added support for current Google Gemini and Amazon Nova models
  • Increased the button limit from 6 to 15 in the Text with Buttons output type
  • Blocked inactive users from using SSO login
  • Updated the links on the Get Started cards on the Projects page
  • Implemented the prevention of user impersonation in the Management UI
  • Added the capability to track total token usage in a conversation or chat session via API, allowing users to monitor their LLM token usage per session
  • Added a Temperature slider to the Advanced section of the AI Agent Node
  • Added the Hide References to External Resources in Transcripts setting to remove tags that can reference third-party APIs (<a>, <img>) from inputs to protect users
  • Reduced application loading times by splitting the JS bundle into chunks with lazy loading
  • Redesigned and resized the AI Agent wizard
  • Added the Generate Search Prompt parameter to the Grounding Knowledge section in the AI Agent Node. This parameter is enabled by default and allows you to generate a context-aware search prompt before executing the knowledge search
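As a companion to the per-session token tracking item above, here is a hedged sketch of how the returned usage data might be totaled client-side. The record shape and field names (`tokenUsage`, `inputTokens`, `outputTokens`) are assumptions for illustration, not the documented API contract; consult the API reference for the real response format.

```python
# Hypothetical sketch: totaling LLM token usage for a chat session.
# The record structure below is an assumption, not Cognigy's documented schema.

def total_session_tokens(analytics_records):
    """Sum input/output token counts across a session's LLM calls.

    `analytics_records` is assumed to be a list of per-turn records, each
    carrying a `tokenUsage` dict -- the real field names may differ.
    """
    total = 0
    for record in analytics_records:
        usage = record.get("tokenUsage", {})
        total += usage.get("inputTokens", 0) + usage.get("outputTokens", 0)
    return total

# Example with made-up data:
records = [
    {"tokenUsage": {"inputTokens": 250, "outputTokens": 80}},
    {"tokenUsage": {"inputTokens": 310, "outputTokens": 95}},
]
print(total_session_tokens(records))  # 735
```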

Cognigy Voice Gateway

  • Added the capability to set recognizer, fillerNoise, and synthesizer via the Advanced Session Parameters in the Set Session Config Node and via the Set Activity Parameters in the Say or Question Nodes
  • Added Third Party-Call-Control (3PCC) support for INVITE without Session Description Protocol (SDP)
  • Added the capability to define the user parameter in the Contact header via the SIP From User carrier, following the pattern <[scheme]:[user]@[host][port];[transport]>, for example: "Cognigy Bot" <sip:+4912345@18.198.97.129:5061;transport=tls>
    By default, this capability is disabled. To enable it, set FEAT_COMPATIBILITY_RFC3261=
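The header pattern above can be illustrated with a small formatter. This is a sketch of the string format only, using the example values from the release notes; the function name and signature are my own, not part of Cognigy or Voice Gateway:

```python
# Illustrative helper that builds a SIP Contact header value matching the
# <[scheme]:[user]@[host][port];[transport]> pattern shown above.

def format_sip_contact(display_name, user, host, port=None, transport=None):
    uri = f"sip:{user}@{host}"
    if port:
        uri += f":{port}"
    if transport:
        uri += f";transport={transport}"
    return f'"{display_name}" <{uri}>'

print(format_sip_contact("Cognigy Bot", "+4912345", "18.198.97.129", 5061, "tls"))
# "Cognigy Bot" <sip:+4912345@18.198.97.129:5061;transport=tls>
```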

Cognigy Insights

  • Allowed the use of apikey as a request header for the OData API
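Passing the API key as a header keeps it out of request URLs and logs. A minimal sketch of constructing such a request, assuming a placeholder endpoint URL (the real OData base URL and entity path come from your Cognigy Insights deployment):

```python
from urllib.request import Request

# Hypothetical sketch: calling the Cognigy Insights OData API with the API key
# passed as the `apikey` request header instead of a query parameter.
# The base URL below is a placeholder, not a documented value.
base_url = "https://odata.example-cognigy-host.com/v2.2/Inputs"  # placeholder
req = Request(base_url, headers={"apikey": "YOUR_API_KEY"})

# The request is only constructed here, not sent. Note that urllib
# normalizes header names via str.capitalize(), hence "Apikey";
# HTTP header names are case-insensitive on the wire.
print(req.get_header("Apikey"))  # YOUR_API_KEY
```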

For further information, check out our complete Release Notes here.
