Ask LLM

Send a prompt to an LLM and receive a response. Supports multiple providers.

Authentication

Authorization (Bearer)

Bearer authentication of the form Bearer <token>, where <token> is your auth token.

Path parameters

group_id (integer, Required)

Request

This endpoint expects an object.
message (string, Required, 1-32000 characters)

The message/prompt to send to the LLM

provider (enum, Optional)
The LLM provider to use
Allowed values:
model (string or null, Optional)
Specific model version to use. If not provided, uses the provider's default model.
system_prompt (string or null, Optional, <=16000 characters)
Optional system prompt to set context for the LLM
temperature (double, Optional, 0-2, defaults to 0)
Controls randomness in the response. 0.0 is deterministic, higher values are more creative.
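The request can be sketched in Python. The base URL and path shape ("/groups/{group_id}/ask-llm") are assumptions, since the source does not show the full route; substitute your API's actual host and path. Only the constraints documented above are enforced here.

```python
# Minimal sketch of building a request to this endpoint. The base URL and
# the "/groups/{group_id}/ask-llm" path are assumptions, not confirmed by
# the reference above. Validation mirrors the documented field constraints.
import json


def build_ask_llm_request(token, group_id, message, provider=None,
                          model=None, system_prompt=None, temperature=0.0):
    # message: Required, 1-32000 characters
    if not (1 <= len(message) <= 32000):
        raise ValueError("message must be 1-32000 characters")
    # system_prompt: Optional, <=16000 characters
    if system_prompt is not None and len(system_prompt) > 16000:
        raise ValueError("system_prompt must be <= 16000 characters")
    # temperature: Optional, 0-2, defaults to 0
    if not (0.0 <= temperature <= 2.0):
        raise ValueError("temperature must be between 0 and 2")

    url = f"https://api.example.com/groups/{group_id}/ask-llm"  # assumed
    headers = {
        "Authorization": f"Bearer {token}",   # Bearer auth as documented
        "Content-Type": "application/json",
    }
    body = {"message": message, "temperature": temperature}
    # Optional fields are omitted rather than sent as null.
    if provider is not None:
        body["provider"] = provider
    if model is not None:
        body["model"] = model
    if system_prompt is not None:
        body["system_prompt"] = system_prompt
    return url, headers, json.dumps(body)


url, headers, payload = build_ask_llm_request("my-token", 42, "Hello!")
```

The returned triple can be passed to any HTTP client; omitting unset optional fields keeps the body minimal and lets the server apply its own defaults (e.g. the provider's default model).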

Response

Successful Response
content (string)
The LLM's response content
provider (enum)
The LLM provider that was used
Allowed values:
model (string)
The specific model that was used
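A successful response can be handled as below. The field names follow the documented response shape; the "openai" and "gpt-4o" values are placeholders, since the allowed provider and model values are not listed in this reference.

```python
# Sketch of parsing a successful response body. The JSON keys match the
# documented response fields; the concrete values are illustrative only.
import json

sample = '{"content": "Hi there!", "provider": "openai", "model": "gpt-4o"}'
resp = json.loads(sample)

answer = resp["content"]      # the LLM's response content
used_provider = resp["provider"]  # the provider that was used
used_model = resp["model"]    # the specific model that was used
```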

Errors