Ask LLM
Send a prompt to an LLM and receive a response. Supports multiple providers.
Authentication
Authorization
Bearer authentication of the form Bearer <token>, where <token> is your auth token.
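As a minimal sketch, the header can be built like this in Python; the token value is a placeholder for your own auth token.

```python
# Example Authorization header for requests to this API.
# The token value is a placeholder; substitute your own auth token.
headers = {
    "Authorization": "Bearer <token>",
    "Content-Type": "application/json",
}
```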
Path parameters
group_id
Request
This endpoint expects an object with the following fields; a request sketch follows the field list.
message
The message or prompt to send to the LLM.
provider
The LLM provider to use.
Allowed values:
model
Specific model version to use. If not provided, uses the provider's default model.
system_prompt
Optional system prompt to set context for the LLM.
temperature
Controls randomness in the response. A value of 0.0 is deterministic; higher values produce more creative output.
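A minimal request sketch in Python using the requests library is shown below. The base URL and route are assumptions, since this section only names the group_id path parameter; the provider, model, and token values are placeholders.

```python
import requests

BASE_URL = "https://api.example.com"  # assumed base URL (not given in this section)
GROUP_ID = "your-group-id"            # value for the group_id path parameter

# Assumed route: the section documents the group_id path parameter,
# not the full endpoint path.
url = f"{BASE_URL}/groups/{GROUP_ID}/ask-llm"

payload = {
    "message": "Summarize the key points of this release.",  # prompt to send
    "provider": "<one of the allowed values>",               # LLM provider to use
    "model": "<provider-specific model id>",                 # optional; provider default if omitted
    "system_prompt": "You are a concise assistant.",         # optional context for the LLM
    "temperature": 0.2,                                      # 0.0 = deterministic
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": "Bearer <token>"},
)
response.raise_for_status()
print(response.json()["content"])
```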
Response
Successful Response
content
The LLM's response content.
provider
The LLM provider that was used.
Allowed values:
model
The specific model that was used.
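Continuing the request sketch above, the documented response fields (content, provider, model) can be read from the JSON body; the exact shape is an assumption based on this field list.

```python
# Read the documented response fields from the JSON body.
data = response.json()

print(data["content"])   # the LLM's response content
print(data["provider"])  # the provider that served the request
print(data["model"])     # the specific model that was used
```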

