Predefined LLM
In xFlow, the AI Agent State leverages predefined large language model (LLM) configurations to streamline the integration of AI capabilities into business workflows. These configurations are designed for performance and reliability, incorporating strategies such as fallback mechanisms. This page documents the predefined LLM configurations that developers can select by setting the aiModel parameter in the AI Agent State.
Common Parameters
All predefined LLM configurations in xFlow share the following common parameters:
Temperature: 0.0
Timeout: PT60S (an ISO 8601 duration of 60 seconds)
Predefined LLM Configurations
1. GPT-4o
aiModel: gpt-4o
Strategy: single
Provider: openai
Model: gpt-4o
2. LLaMA3-70B-8192
aiModel: llama3-70b-8192
Strategy: Fallback on status codes 429 and 400
Providers:
Primary: groq
Secondary: openai
Models:
Primary: llama3-70b-8192
Fallback: gpt-4o
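The fallback strategy above can be illustrated in isolation. The sketch below is not xFlow's internal implementation; it only demonstrates the routing rule the configurations describe: if the primary provider answers with status 429 (rate limited) or 400, the same request is retried against the fallback provider. The provider functions are hypothetical stubs standing in for the groq and openai endpoints.

```python
# Status codes that trigger a retry against the fallback provider,
# as listed in the configurations above.
FALLBACK_STATUS_CODES = {429, 400}

def call_with_fallback(primary, fallback, prompt):
    """Try the primary provider; on a fallback status code, retry with the fallback."""
    status, text = primary(prompt)
    if status in FALLBACK_STATUS_CODES:
        status, text = fallback(prompt)
    return status, text

# Hypothetical stubs for illustration only.
def groq_llama3(prompt):
    return 429, ""  # simulate a rate-limit response from the primary

def openai_gpt4o(prompt):
    return 200, f"gpt-4o answer to: {prompt}"

status, answer = call_with_fallback(groq_llama3, openai_gpt4o, "Summarize the order")
print(status, answer)  # 200 gpt-4o answer to: Summarize the order
```

Any status outside the fallback set (including non-429/400 errors) is returned as-is, which matches the narrow "fallback on status codes 429 and 400" rule rather than a blanket retry.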
3. Mixtral-8x7B-32768
aiModel: mixtral-8x7b-32768
Strategy: Fallback on status codes 429 and 400
Providers:
Primary: groq
Secondary: openai
Models:
Primary: mixtral-8x7b-32768
Fallback: gpt-4o
4. Gemini-1.5-Pro-Latest
aiModel: gemini-1.5-pro-latest
Strategy: Fallback on status codes 429 and 400
Providers:
Primary: google
Secondary: openai
Models:
Primary: gemini-1.5-pro-latest
Fallback: gpt-4o
Usage Example
To use one of these predefined LLM configurations in your AI Agent State, simply specify the aiModel
parameter in the state definition. Here is an example of how to configure an AI Agent State using the gpt-4o
model:
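The original code sample did not survive extraction, so the following is an illustrative sketch. Only the aiModel parameter is documented above; the other fields (name, type, prompt) are assumptions about what an AI Agent State definition might look like, not the confirmed xFlow schema.

```json
{
  "name": "summarizeOrder",
  "type": "aiAgent",
  "aiModel": "gpt-4o",
  "prompt": "Summarize the customer's order history."
}
```

Swapping aiModel for any other identifier listed above (for example llama3-70b-8192) would also switch on that configuration's fallback strategy with no further changes to the state definition.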