Using the OpenSesame Endpoint
Export agents as endpoints so you can use them directly in your product
The OpenSesame agent endpoint allows users to execute AI-driven workflows involving multiple components, such as LLMs (Large Language Models) and integrated tools (e.g., Gmail). This documentation covers how to use the endpoint via cURL and Python.
- Click Generate Endpoint to generate the endpoint for your agent.
- Copy the endpoint.
Endpoint Overview
Base URL:
https://opensesame--{name}-modal-endpoint.modal.run/
HTTP Method:
POST
Headers:
- `accept: application/json`
- `Content-Type: application/json`
Request Body Parameters:
- `agent_config`: Defines the agents and tools used in the workflow. Key parameters:
  - `id`: Unique identifier for each agent/tool.
  - `type`: Specifies whether the component is an LLM or a tool.
  - `name`: Name of the agent or tool.
  - `provider_name`: For LLMs, specifies the AI provider (e.g., OpenAI).
  - `model_name`: Specifies the AI model to use (e.g., `gpt-4o-mini`).
  - `tool_name` and `url`: For tools, provide integration information and authentication URLs.
  - `children`: Defines the execution order by specifying downstream agents/tools.
- `user_query`: A natural-language instruction for what the workflow should accomplish.
- `user_id`: Identifier for the user making the request.
- `conversation_id`: Unique conversation or workflow identifier.
Example cURL Request
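A request might look like the following sketch. The agent ids, names, the `agents` wrapper inside `agent_config`, and the Gmail auth URL are illustrative assumptions — substitute the endpoint URL and configuration you copied from your own generated endpoint.

```shell
# Placeholder endpoint — replace {name} with your agent's generated endpoint name.
curl -X POST "https://opensesame--{name}-modal-endpoint.modal.run/" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -d '{
    "agent_config": {
      "agents": [
        {
          "id": "1",
          "type": "llm",
          "name": "Email Assistant",
          "provider_name": "OpenAI",
          "model_name": "gpt-4o-mini",
          "children": ["2"]
        },
        {
          "id": "2",
          "type": "tool",
          "name": "Gmail",
          "tool_name": "gmail",
          "url": "https://example.com/auth/gmail",
          "children": []
        }
      ]
    },
    "user_query": "Draft a follow-up email to my last client.",
    "user_id": "user_123",
    "conversation_id": "conv_456"
  }'
```

Here `children: ["2"]` routes the LLM's output to the Gmail tool, defining the execution order described above.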
Using the Endpoint in Python
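The same request can be made with the `requests` library. As above, the `agent_config` contents and the Gmail auth URL are illustrative placeholders — use the values from your own generated endpoint.

```python
import requests

# Hypothetical workflow: an LLM agent whose output feeds a Gmail tool.
# The "agents" wrapper and field values below are assumptions — copy the
# real agent_config from your generated endpoint.
agent_config = {
    "agents": [
        {
            "id": "1",
            "type": "llm",
            "name": "Email Assistant",
            "provider_name": "OpenAI",
            "model_name": "gpt-4o-mini",
            "children": ["2"],  # route this agent's output to the Gmail tool
        },
        {
            "id": "2",
            "type": "tool",
            "name": "Gmail",
            "tool_name": "gmail",
            "url": "https://example.com/auth/gmail",  # placeholder auth URL
            "children": [],
        },
    ]
}

payload = {
    "agent_config": agent_config,
    "user_query": "Draft a follow-up email to my last client.",
    "user_id": "user_123",
    "conversation_id": "conv_456",
}

def run_workflow(endpoint_url: str) -> dict:
    """POST the payload to the generated endpoint and return the JSON response."""
    response = requests.post(
        endpoint_url,
        headers={"accept": "application/json", "Content-Type": "application/json"},
        json=payload,
    )
    response.raise_for_status()
    return response.json()

# Replace {name} with your agent's generated endpoint name, then call:
# result = run_workflow("https://opensesame--{name}-modal-endpoint.modal.run/")
```

`requests` serializes the `json=` payload and no extra encoding step is needed; `raise_for_status()` surfaces HTTP errors instead of silently returning an error body.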
Response
A successful response returns a JSON object containing the results of the workflow execution, including any outputs from the AI agents and tools used. Check `response.json()` for specifics.