AI Configuration
The AI Configuration section lets you manage the parameters involved in the agent's logic flow without editing them in chaingraph.
The setup has predefined parameters that you can alter.
Choose the LLM of your preference. Don't feel any pressure: you can change it at any time later, or even use different LLMs for different parts of the scenario (configured in chaingraph).
Review and edit the system prompt. This is the main instruction for your agent. By default, it is configured to act as a generic AI assistant. Tailor it to your needs.
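For example, instead of the generic default you might write something like: "You are a support assistant for an online bookstore. Answer questions about orders and delivery, and politely hand anything else off to a human operator." (The wording here is purely illustrative.)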
The last four parameters govern the chat history and the model's output; a short sketch after the list shows how they typically interact.
MESSAGES_COUNT defines the number of messages stored in the agent's chat history. Adjusting this parameter balances context depth against memory efficiency.
TOKENS_LIMIT sets the maximum token count the model can process in one interaction. A higher limit allows longer, more detailed exchanges but requires more computational resources.
TEMPERATURE_CONTROL controls the randomness of the AI's responses. Lower values (e.g., 0.3) produce more precise, deterministic outputs, while higher values (closer to 1) allow more creative and varied answers.
MAX_TOKENS specifies the maximum token count of each response.
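To build intuition for how these four values interact, here is a minimal sketch in Python. It is not the platform's API; every name below is hypothetical, and in practice chaingraph wires these values into the LLM call for you.

```python
# Minimal sketch, not the platform's API: illustrates how parameters
# like these typically shape a chat request. All names are hypothetical.

MESSAGES_COUNT = 20        # keep the 20 most recent messages as context
TOKENS_LIMIT = 4096        # rough cap on tokens the model processes per call
TEMPERATURE_CONTROL = 0.3  # lower -> more deterministic replies
MAX_TOKENS = 512           # cap on the length of each generated reply

def build_request(history: list[dict], user_message: str) -> dict:
    """Trim stored history and assemble a chat-completion style request."""
    recent = history[-MESSAGES_COUNT:]
    # Crude estimate (~4 characters per token), purely for illustration:
    # drop the oldest messages until the prompt fits under the limit.
    while recent and sum(len(m["content"]) for m in recent) // 4 > TOKENS_LIMIT - MAX_TOKENS:
        recent = recent[1:]
    return {
        "messages": recent + [{"role": "user", "content": user_message}],
        "temperature": TEMPERATURE_CONTROL,
        "max_tokens": MAX_TOKENS,
    }

# Example: a low temperature keeps answers consistent between runs.
print(build_request([{"role": "assistant", "content": "Hello!"}], "What can you do?"))
```

Note the trade-off encoded here: raising MESSAGES_COUNT or TOKENS_LIMIT gives the agent more context to work with, at the cost of larger, slower, and more expensive model calls.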
Keep in mind that all of these parameters correspond to nodes in chaingraph. For example, the predefined set of parameters corresponds to the Agent Profile node. You don't have to use it, though: just unlink the node and configure your own graph however you like.
The AI Configuration tab makes your life easier when dealing with tangled graphs and dozens of parameters. If you want to set up your own parameters in this section, add the keys to the node called "Template Param", and you'll be able to configure those keys in the AI Configuration tab.
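For instance, suppose you want two custom keys editable from the tab. A plausible setup might look like the sketch below; the key names and value types are invented for illustration, and the actual "Template Param" node fields may differ.

```python
# Hypothetical keys added via a "Template Param" node. Once the node is in
# your graph, these keys surface in the AI Configuration tab, where
# teammates can edit them without opening chaingraph.
template_params = {
    "GREETING_TEXT": "Hi! How can I help you today?",  # first message shown to users
    "ESCALATION_THRESHOLD": 3,  # invented: unanswered turns before human handoff
}
```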
Don't forget to Save the agent every time you change the AI configuration.