Amazon Bedrock Native Integration and More with Cognigy.AI v4.82

Nhu Ho | August 11, 2024 | 2 min read

Dive into Cognigy’s latest LLM orchestration enhancement with our native integration for Amazon Bedrock.

Unified API Access to Diverse LLMs via Amazon Bedrock

Amazon Bedrock is a fully managed service that provides comprehensive developer tools for fine-tuning and using pre-trained foundation models to build GenAI applications. In addition to AWS’s own Amazon Titan, it gives you access to foundation models from providers such as Anthropic (Claude), Cohere, AI21 Labs, Meta (Llama), and Mistral AI.

Besides seamless integration with the AWS ecosystem, Amazon Bedrock offers unique advantages compared to standalone GenAI solutions, including:

  • Multi-Model Testing: Simultaneously evaluate and compare the performance and latency of various models in a unified environment.
  • Unified API Access: Utilize a single API to interact with multiple LLMs across different vendors and modalities, including text and images.

This integration underscores Cognigy’s commitment to helping you leverage top-tier Generative AI for advanced customer service automation with ease of use and the fastest time to market.

Note: Cognigy currently supports only models that use the Converse API on Amazon Bedrock. For a detailed configuration guide, refer to our documentation.
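
To give a feel for what the Converse API unifies, here is a minimal TypeScript sketch that calls Amazon Bedrock directly via the AWS SDK for JavaScript v3. It is an illustration only, not Cognigy’s internal implementation; the region and model ID are placeholder examples, and Cognigy issues this call for you once the LLM connection is configured.

```typescript
import { BedrockRuntimeClient, ConverseCommand } from "@aws-sdk/client-bedrock-runtime";

// Region and model ID are example values; swap in the model you enabled in Bedrock.
const client = new BedrockRuntimeClient({ region: "us-east-1" });

async function askModel(prompt: string): Promise<string | undefined> {
  // The Converse API uses the same request/response shape for every supported model,
  // so switching vendors is essentially a one-line change of the modelId.
  const response = await client.send(
    new ConverseCommand({
      modelId: "anthropic.claude-3-haiku-20240307-v1:0",
      messages: [{ role: "user", content: [{ text: prompt }] }],
      inferenceConfig: { maxTokens: 256, temperature: 0.5 },
    })
  );
  return response.output?.message?.content?.[0]?.text;
}

askModel("Summarize our return policy in two sentences.").then(console.log);
```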

In addition to Amazon Bedrock, gpt-4o and gpt-4o-mini models are now also natively available for Azure OpenAI and OpenAI providers.

Other Improvements

Cognigy.AI

  • The Knowledge AI feature is now publicly available
  • Deprecated the old Azure OpenAI connection type
  • Limited the Any Slot matching to sentences of a certain length
  • Removed the Chatbase analytics configuration from the Endpoint Settings, as Chatbase no longer exists
  • Hid the Enable Input Sanitization option in the Endpoint Settings when the feature is not globally activated
  • Removed the New badge from the Knowledge AI feature in the Cognigy.AI UI
  • Added the NICE CXone Endpoint

Cognigy Voice Gateway

  • Improved the setting for the Call Completed event within the Voice Gateway Endpoint. Now, when the Execute Flow action is selected in Call Event Action: Call Completed > Call Event Settings > Call Event Action and the Call Completed event is enabled, the input.data Input object in the executed Flow contains the last event data (see the sketch below)
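
As a hedged illustration, the sketch below shows how a Code Node in the executed Flow could inspect that event data. It assumes the standard Code Node runtime, where input and actions are injected globals; the exact payload fields under input.data depend on the event, so log them before relying on any particular key.

```typescript
// Runs inside a Cognigy.AI Code Node; `input` and `actions` are injected by the runtime.
// The shape of the Call Completed payload is not guaranteed here - inspect it first and
// branch on the fields you actually receive.
const lastEvent = input.data;

if (lastEvent && Object.keys(lastEvent).length > 0) {
  // Echo the raw event data so you can see which fields the Call Completed event delivers.
  actions.output(`Call Completed event data: ${JSON.stringify(lastEvent)}`, lastEvent);
} else {
  actions.output("No Call Completed event data found on input.data.", {});
}
```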

Cognigy Webchat

  • Added support for RTL (right-to-left) to enhance usability for languages such as Arabic, Hebrew, Urdu, and more

Cognigy Live Agent

  • Improved Live Agent performance by changing the handling of Live Agent WebSocket events
  • Revised the notification logic to enhance retrieval speed
  • Improved the style of the Contact Profile on the Details tab within the conversation interface by adding space between keys and values so they no longer run together

For further information, check out our complete Release Notes here.
