autogen.agentchat.group.llm_condition.StringLLMCondition
StringLLMCondition
Simple string-based LLM condition.
This condition provides a static string prompt to be evaluated by an LLM. Initialize it with the prompt string as a positional parameter.
Parameters
Name | Description |
---|---|
prompt | The static prompt string to be evaluated by the LLM. Type: str |
**data | Additional keyword arguments for the underlying model. Type: Any |
Class Attributes
model_config
prompt
Instance Methods
get_prompt
Return the static prompt string.
Parameters
Name | Description |
---|---|
agent | The agent evaluating the condition (not used). Type: ConversableAgent |
messages | The conversation history (not used). Type: list[dict[str, typing.Any]] |
Returns
Type | Description |
---|---|
str | The static prompt string |
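Since the page documents the full interface, the class's behavior can be sketched in a few lines. This is an illustrative re-implementation based only on the signatures above, not the library source (the real class is a pydantic model, as its model_config attribute suggests); the prompt text used here is a made-up example.

```python
from typing import Any


class StringLLMCondition:
    """Sketch of a condition that supplies a static prompt for LLM evaluation."""

    def __init__(self, prompt: str, **data: Any) -> None:
        # The prompt is fixed at construction time.
        self.prompt = prompt

    def get_prompt(self, agent: Any, messages: list[dict[str, Any]]) -> str:
        # Both arguments are part of the interface but unused:
        # the prompt does not depend on the agent or the conversation.
        return self.prompt


# Usage: the same string comes back regardless of agent or history.
condition = StringLLMCondition(prompt="Has the user confirmed the order?")
result = condition.get_prompt(agent=None, messages=[])
```

Because get_prompt ignores its arguments, this condition suits checks whose wording never changes; a dynamic condition would instead build the prompt from the agent or message history.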