LLM completion (internal)

POST /v1/internal/ai/complete

Generates an LLM completion. Service-to-service endpoint.

Responses:
  200      OK
  default  Error
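The request schema is not shown here, so the sketch below only illustrates the call shape: the base URL, the `prompt` field, and the JSON content type are assumptions, not part of the documented contract.

```python
import json
import urllib.request

def build_complete_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the internal completion endpoint.

    The body field name ("prompt") is hypothetical -- the request
    schema is not documented in this section.
    """
    url = f"{base_url}/v1/internal/ai/complete"
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct (but do not send) a request against a local instance.
req = build_complete_request("http://localhost:8080", "Hello")
```

Sending the request with `urllib.request.urlopen(req)` would return the completion on a 200 response; any other status falls under the default Error response.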