Predefined LLM
In xFlow, the AI Agent State leverages predefined large language model (LLM) configurations to streamline the integration of AI capabilities into business workflows. These configurations are designed for reliability: where a fallback strategy is defined, requests that fail at the primary provider are automatically retried against a secondary provider. This page details the predefined LLM configurations that developers can select by setting the aiModel parameter in the AI Agent State.
Common Parameters
All predefined LLM configurations in xFlow share the following common parameters:
Temperature: 0.0 (deterministic responses, no creative variation between runs)
Timeout: PT60S (an ISO-8601 duration of 60 seconds per LLM call)
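For reference, the fragment below is purely illustrative of the effective values; these settings are fixed inside every predefined configuration and are not fields you set in the AI Agent State definition.

temperature: 0.0   # deterministic sampling
timeout: PT60S     # ISO-8601 duration: each LLM call is aborted after 60 seconds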
Predefined LLM Configurations
1. GPT-4o
aiModel: gpt-4o
Strategy: Single provider (no fallback)
Provider: openai
Model: gpt-4o
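A minimal AI Agent State that selects this configuration only needs aiModel set to gpt-4o. The sketch below reuses the field names from the Usage Example further down; the state name, agent name, and message values are placeholders.

- name: QuickAnswerState          # placeholder state name
  type: aiagent
  agentName: QuickAnswerAgent     # placeholder agent name
  aiModel: gpt-4o                 # selects the single-provider GPT-4o configuration
  systemMessage: "You are an assistant designed to provide accurate answers."
  userMessage: '${ "User: " + .request.question }'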
2. LLaMA3-70B-8192
aiModel: llama3-70b-8192
Strategy: Fallback on status codes 429 and 400
Providers:
Primary: groq
Secondary: openai
Models:
Primary: llama3-70b-8192
Fallback: gpt-4o
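The sketch below is a hypothetical illustration of what this fallback chain amounts to; the providers key and its entries are not configurable fields of the AI Agent State, since the routing is predefined in xFlow and selected only through aiModel.

# Illustration only: selected via aiModel: llama3-70b-8192
providers:
  - provider: groq
    model: llama3-70b-8192      # primary: every request is sent here first
  - provider: openai
    model: gpt-4o               # fallback: used when groq answers with HTTP 429 or 400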
3. Mixtral-8x7B-32768
aiModel: mixtral-8x7b-32768
Strategy: Fallback on status codes 429 and 400
Providers:
Primary: groq
Secondary: openai
Models:
Primary: mixtral-8x7b-32768
Fallback: gpt-4o
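This chain is structurally identical to the LLaMA3 one above; only the primary model differs. Reusing the same hypothetical notation:

providers:
  - provider: groq
    model: mixtral-8x7b-32768   # primary
  - provider: openai
    model: gpt-4o               # fallback on HTTP 429 or 400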
4. Gemini-1.5-Pro-Latest
aiModel: gemini-1.5-pro-latest
Strategy: Fallback on status codes 429 and 400
Providers:
Primary: google
Secondary: openai
Models:
Primary: gemini-1.5-pro-latest
Fallback: gpt-4o
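Because every predefined configuration is addressed through the same aiModel field, different states can target different models. The sketch below assumes a workflow may define more than one aiagent state (the state definition is a YAML list, as in the Usage Example below); all names, messages, and expressions are placeholders.

- name: DraftAnswerState
  type: aiagent
  agentName: DraftAgent
  aiModel: gemini-1.5-pro-latest   # Google primary, gpt-4o fallback
  systemMessage: "Draft a detailed answer to the user's question."
  userMessage: '${ "User: " + .request.question }'
- name: ReviewAnswerState
  type: aiagent
  agentName: ReviewAgent
  aiModel: gpt-4o                  # single-provider configuration
  systemMessage: "Review the draft answer for factual accuracy."
  userMessage: '${ .draft.response }'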
Usage Example
To use one of these predefined LLM configurations in your AI Agent State, simply specify the aiModel parameter in the state definition. Here is an example of how to configure an AI Agent State using the gpt-4o model:
- name: ExampleAIState
  type: aiagent
  agentName: ExampleAgent
  aiModel: gpt-4o
  systemMessage: "You are an assistant designed to provide accurate answers."
  userMessage: '${ "User: " + .request.question }'
  output:
    {
      "type": "object",
      "properties": {
        "response": {
          "type": "string",
          "description": "The AI's response to the user question"
        }
      },
      "required": ["response"]
    }
  maxToolExecutions: 5
  memory:
    memoryId: "session123"
    memoryType: "message_window"
    maxMessages: 10
  tools:
    - name: SEARCH_DOCUMENTS
      description: "Search for relevant documents based on the user's query."
      parameters:
        {
          "type": "object",
          "properties": {
            "query": {
              "type": "string",
              "description": "The search query"
            }
          },
          "required": ["query"]
        }
      output:
        {
          "type": "object",
          "properties": {
            "documents": {
              "type": "array",
              "items": {
                "type": "string",
                "format": "uri"
              }
            }
          },
          "required": ["documents"]
        }
  agentOutcomes:
    - condition: '${ $agentOutcome.returnValues.response != null }'
      transition: SuccessState
    - condition: '${ $agentOutcome.returnValues.response == null }'
      transition: ErrorState
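To route the same state through another predefined configuration, only the aiModel line needs to change, for example to aiModel: llama3-70b-8192 or aiModel: gemini-1.5-pro-latest; the messages, output schema, tools, memory, and agent outcomes can stay as shown above.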