How to Configure the Embedding LLM in Genie

Introduction 

The Embedding LLM in Genie is a mandatory configuration that enables Genie to understand and process the semantic meaning of your documents. It works by transforming uploaded documents into vector representations, allowing Genie to perform accurate context-based searches and retrieve the most relevant information during a chat or query. 

This setup is essential for enabling document intelligence within Genie, helping it interpret relationships, topics, and meanings beyond just keywords. 

Once configured, the Embedding LLM supports Genie’s Document Explorer and other agents that rely on vectorized data to deliver precise and context-aware answers. 
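To illustrate the idea behind vector-based retrieval (this is a minimal sketch of the general technique, not Genie's internal implementation), the snippet below ranks toy document vectors against a query vector by cosine similarity; real embedding models produce vectors with hundreds or thousands of dimensions:

```python
import math

def cosine_similarity(a, b):
    # Measures how closely two embedding vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" standing in for real model output.
doc_vectors = {
    "invoice policy": [0.9, 0.1, 0.0],
    "travel guide":   [0.1, 0.8, 0.3],
}
query_vector = [0.85, 0.15, 0.05]

# Retrieval picks the document whose vector is most similar to the query.
best = max(doc_vectors, key=lambda name: cosine_similarity(query_vector, doc_vectors[name]))
print(best)  # → invoice policy
```

This is why a configured embedding model is mandatory: without vectors, Genie would have nothing to compare beyond raw keywords.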

The following section explains the steps to configure the Embedding LLM. 

Supported LLM Providers 

Genie currently supports the following Embedding LLM providers: 

  - Azure – Open AI 
  - AWS Bedrock 
  - Open AI 

Steps to Access Embedding LLM Configuration 

  1. Navigate to Configuration and choose LLMs → Embedding LLM. 
  2. Locate the Select your LLM provider dropdown. The default provider is Azure – Open AI. 
  3. Selecting a different provider automatically updates the page to show the configuration fields required for that provider. 

Configuring Embedding LLM Providers 

To configure the Embedding LLM, the following fields are mandatory: 

Azure – Open AI 

Required Fields 

| Name | Description |
| --- | --- |
| API Endpoint* | The endpoint URL where requests are sent to your Azure OpenAI service. |
| API Key* | The authentication key used to securely connect with your Azure OpenAI resource. |
| Deployment Name* | The name of the deployed Azure OpenAI model configured in your Azure portal. |
| API Version* | The version of the Azure OpenAI API to be used for communication. |

Steps 

  1. Enter all required details for the selected LLM provider. 
  2. Click “Save” to validate and store the configuration securely. 

Note: For security purposes, the API Key will be hidden after validation and saving, ensuring it is not visible or accessible from the interface. 
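As a rough illustration of how these four fields fit together, the sketch below builds (but does not send) an Azure OpenAI embeddings request using only Python's standard library. All values are hypothetical placeholders; Genie performs the equivalent wiring internally when you save the configuration:

```python
from urllib.request import Request

# Hypothetical values standing in for your own Azure configuration.
api_endpoint = "https://my-resource.openai.azure.com"  # API Endpoint
api_key      = "<your-api-key>"                        # API Key
deployment   = "my-embedding-deployment"               # Deployment Name
api_version  = "2024-02-01"                            # API Version

# Azure OpenAI routes embedding calls through the deployment name,
# with the API version as a query parameter and the key in an
# "api-key" header.
url = (f"{api_endpoint}/openai/deployments/{deployment}"
       f"/embeddings?api-version={api_version}")
request = Request(url, headers={"api-key": api_key,
                                "Content-Type": "application/json"})

print(request.full_url)
```

If any of the four fields is wrong, the request cannot be routed or authenticated, which is why all of them are validated on Save.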

AWS Bedrock 

Required Fields 

| Name | Description |
| --- | --- |
| Model Id* | The identifier of the embedding model deployed in AWS Bedrock. |
| Region Name* | The AWS region where the Bedrock model is hosted and accessed. |

Steps 

  1. Enter all required details for the selected LLM provider. 
  2. Click “Save” to validate and store the configuration securely. 

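To show how the two fields determine where an embedding call goes, the sketch below assembles the Bedrock runtime invocation path from a hypothetical model id and region. In practice an SDK such as boto3 signs and sends the request; this only illustrates the mapping from configuration to endpoint:

```python
# Hypothetical values standing in for your own Bedrock configuration.
model_id    = "amazon.titan-embed-text-v2:0"  # Model Id (example identifier)
region_name = "us-east-1"                     # Region Name

# The Bedrock runtime resolves requests against a regional endpoint,
# with the model id embedded in the invocation path.
endpoint = f"https://bedrock-runtime.{region_name}.amazonaws.com"
invoke_path = f"/model/{model_id}/invoke"

print(endpoint + invoke_path)
```

This is why Region Name is mandatory alongside Model Id: the same model identifier resolves to different endpoints in different regions.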
Open AI 

Required Fields 

| Name | Description |
| --- | --- |
| Model Id* | The identifier of the OpenAI embedding model to be used. |
| API Key* | The authentication key used to securely connect with the OpenAI API. |

Steps 

  1. Enter all required details for the selected LLM provider. 
  2. Click “Save” to validate and store the configuration securely. 

Note: For security purposes, the API Key will be hidden after validation and saving, ensuring it is not visible or accessible from the interface. 
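For context on how these two fields are used, the sketch below builds (but does not send) an OpenAI embeddings request with Python's standard library. The values are hypothetical placeholders; Genie handles the equivalent call internally:

```python
import json
from urllib.request import Request

# Hypothetical values standing in for your own OpenAI configuration.
model_id = "text-embedding-3-small"  # Model Id (example identifier)
api_key  = "<your-api-key>"          # API Key

# OpenAI's embeddings endpoint takes the model id in the JSON body
# and the key as a bearer token.
body = json.dumps({"model": model_id,
                   "input": "example document text"}).encode()
request = Request(
    "https://api.openai.com/v1/embeddings",
    data=body,
    headers={"Authorization": f"Bearer {api_key}",
             "Content-Type": "application/json"},
)

print(request.get_method())  # → POST, because a data payload is attached
```

The Model Id selects which embedding model vectorizes your documents, while the API Key authenticates every request, which is why both fields are validated on Save.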
