OpenTools provides an API for LLM tool use. Using an OpenAI-compatible completion API, you can access thousands of tools from our registry through any supported LLM via OpenRouter. You can also pass your own custom tools ("type": "function") just as you would with the standard OpenAI API.

Let’s say you’re building a “deep research” assistant for space news. Your API request might look like this:

```shell
curl -X POST https://api.opentools.com/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_OPENTOOLS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-3.7-sonnet",
    "messages": [
      { "role": "system", "content": "You are a research assistant specializing in space exploration and rocketry news. Provide in-depth, factual information by searching for the latest developments in space missions, astronomical discoveries, and space technology, analyzing multiple sources, and summarizing key findings and scientific implications." },
      {
        "role": "user",
        "content": "What is the latest on the two American astronauts on the Starliner mission in the ISS?"
      }
    ],
    "tools": [{ "type": "mcp", "ref": "exa" }]
  }'
```

Let’s break down a few key parts of the request:

| Line | Description |
| --- | --- |
| `-H "Authorization: Bearer $YOUR_OPENTOOLS_API_KEY"` | You’ll first need to create an API key, which we authenticate with Bearer auth. |
| `"model": "anthropic/claude-3.7-sonnet"` | We use OpenRouter to route requests to an LLM provider that natively supports tool use. You can find their selection of models here. |
| `"tools": [{ "type": "mcp", "ref": "exa" }]` | We’re equipping our LLM with web search using the official exa MCP server. OpenTools handles authenticating to the Exa API on your behalf. |

For the space-news request above, instead of the usual “since my knowledge cutoff is April 2023, I don’t have real-time information” caveat, you’ll get a response like:

```json
// ...
"content": "Based on my search, here's the latest information about the two American astronauts on the International Space Station (ISS):...",
"refusal": null,
"role": "assistant"
// ...
```
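Because the response follows the standard OpenAI chat-completion shape, you can pull the answer straight out of the JSON from the shell. A small sketch, assuming jq is installed and that request.json holds the same body as the request above:

```shell
# Send the request (body saved in request.json) and print just the assistant's reply
curl -s -X POST https://api.opentools.com/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_OPENTOOLS_API_KEY" \
  -H "Content-Type: application/json" \
  -d @request.json \
  | jq -r '.choices[0].message.content'
```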