Tool Calling
Let models call functions in your code. Define tools, handle calls, and build agentic workflows — all through one API.
Tributary uses the OpenAI tool calling format as its unified interface. Tools are defined as JSON Schema functions and passed in the tools array. Requests are automatically translated for each provider.
How it works
You send a request with a tools array describing available functions.
The model decides whether to call a tool. If it does, the response contains tool_calls instead of text content, and finish_reason is "tool_calls".
You execute the function locally, then send the result back as a tool message with the matching tool_call_id.
The model uses the tool result to generate its final response. You can repeat this loop for multi-step agentic workflows.
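The four steps above form a loop that can be sketched in a few lines. This is a minimal sketch in Python, assuming plain dicts in the OpenAI chat-completions shape; `complete`, `registry`, and `max_turns` are illustrative names, not part of any SDK.

```python
import json

def agent_loop(complete, messages, tools, registry, max_turns=5):
    """Repeat the request/execute cycle until the model answers with text.

    `complete(messages, tools)` stands in for any chat-completions call
    (SDK or raw HTTP) that returns the assistant message as a dict.
    `registry` maps tool names to local Python functions.
    """
    for _ in range(max_turns):
        message = complete(messages, tools)
        messages.append(message)
        # No tool_calls means the model produced its final text answer.
        if not message.get("tool_calls"):
            return message.get("content")
        # Execute every requested call and append a matching tool message.
        for call in message["tool_calls"]:
            fn = registry[call["function"]["name"]]
            args = json.loads(call["function"]["arguments"])
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": json.dumps(fn(**args)),
            })
    raise RuntimeError("model kept requesting tools; giving up")
```

Capping the number of turns (`max_turns` here) is a common safeguard against a model that keeps requesting tools indefinitely.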
1. Define your tools
Tools are defined using JSON Schema, following the OpenAI format. Pass them in the tools array.
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://api.tributary.cc/openai/v1",
apiKey: "<TRIBUTARY_API_KEY>",
});
const tools: OpenAI.ChatCompletionTool[] = [
{
type: "function",
function: {
name: "get_weather",
description: "Get the current weather for a location",
parameters: {
type: "object",
properties: {
location: {
type: "string",
description: "City name, e.g. San Francisco",
},
},
required: ["location"],
},
},
},
];
2. Send a request with tools
Pass the tools array alongside your messages. The model will decide whether to call a tool.
const response = await client.chat.completions.create({
model: "anthropic:claude-opus-4.6",
messages: [
{ role: "user", content: "What is the weather in San Francisco?" }
],
tools,
});
const message = response.choices[0].message;
// The model may return tool calls instead of text
if (message.tool_calls) {
console.log("Tool call:", message.tool_calls[0].function);
// { name: "get_weather", arguments: '{"location":"San Francisco"}' }
}
3. Send the tool result back
Execute the function, then send the result back as a tool message to get the final response.
// Execute your function with the parsed arguments
const args = JSON.parse(message.tool_calls[0].function.arguments);
const weatherResult = await getWeather(args.location);
// Send the result back to the model
const finalResponse = await client.chat.completions.create({
model: "anthropic:claude-opus-4.6",
messages: [
{ role: "user", content: "What is the weather in San Francisco?" },
message, // The assistant message with tool_calls
{
role: "tool",
tool_call_id: message.tool_calls[0].id,
content: JSON.stringify(weatherResult),
},
],
tools,
});
console.log(finalResponse.choices[0].message.content);
// "The current weather in San Francisco is 62°F and sunny." 1. Send a request with tools
Include tool definitions in the request body. No dependencies required.
const response = await fetch(
"https://api.tributary.cc/openai/v1/chat/completions",
{
method: "POST",
headers: {
"Authorization": "Bearer <TRIBUTARY_API_KEY>",
"Content-Type": "application/json",
},
body: JSON.stringify({
model: "anthropic:claude-opus-4.6",
messages: [
{ role: "user", content: "What is the weather in San Francisco?" }
],
tools: [
{
type: "function",
function: {
name: "get_weather",
description: "Get the current weather for a location",
parameters: {
type: "object",
properties: {
location: { type: "string" },
},
required: ["location"],
},
},
},
],
}),
}
);
const data = await response.json();
const message = data.choices[0].message;
if (message.tool_calls) {
console.log("Tool call:", message.tool_calls[0].function);
}
2. Send the tool result back
Execute the function, then send the conversation history including the tool result.
const args = JSON.parse(message.tool_calls[0].function.arguments);
const weatherResult = await getWeather(args.location);
const finalResponse = await fetch(
"https://api.tributary.cc/openai/v1/chat/completions",
{
method: "POST",
headers: {
"Authorization": "Bearer <TRIBUTARY_API_KEY>",
"Content-Type": "application/json",
},
body: JSON.stringify({
model: "anthropic:claude-opus-4.6",
messages: [
{ role: "user", content: "What is the weather in San Francisco?" },
message, // The assistant message with tool_calls
{
role: "tool",
tool_call_id: message.tool_calls[0].id,
content: JSON.stringify(weatherResult),
},
],
tools: [/* same tools array */],
}),
}
);
const finalData = await finalResponse.json();
console.log(finalData.choices[0].message.content);
1. Define your tools
Tools are defined using JSON Schema, following the OpenAI format. Pass them in the tools parameter.
from openai import OpenAI
import json
client = OpenAI(
base_url="https://api.tributary.cc/openai/v1",
api_key="<TRIBUTARY_API_KEY>",
)
tools = [
{
"type": "function",
"function": {
"name": "get_weather",
"description": "Get the current weather for a location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "City name, e.g. San Francisco",
},
},
"required": ["location"],
},
},
},
]
2. Send a request with tools
Pass the tools list alongside your messages. The model will decide whether to call a tool.
response = client.chat.completions.create(
model="anthropic:claude-opus-4.6",
messages=[
{"role": "user", "content": "What is the weather in San Francisco?"}
],
tools=tools,
)
message = response.choices[0].message
# The model may return tool calls instead of text
if message.tool_calls:
tool_call = message.tool_calls[0]
print(f"Tool call: {tool_call.function.name}({tool_call.function.arguments})")
# Tool call: get_weather({"location":"San Francisco"})
3. Send the tool result back
Execute the function, then send the result back as a tool message to get the final response.
# Execute your function with the parsed arguments
args = json.loads(tool_call.function.arguments)
weather_result = get_weather(args["location"])
# Send the result back to the model
final_response = client.chat.completions.create(
model="anthropic:claude-opus-4.6",
messages=[
{"role": "user", "content": "What is the weather in San Francisco?"},
message, # The assistant message with tool_calls
{
"role": "tool",
"tool_call_id": tool_call.id,
"content": json.dumps(weather_result),
},
],
tools=tools,
)
print(final_response.choices[0].message.content)
# "The current weather in San Francisco is 62°F and sunny." 1. Send a request with tools
Include tool definitions in the request body using the requests library.
import requests
import json
tools = [
{
"type": "function",
"function": {
"name": "get_weather",
"description": "Get the current weather for a location",
"parameters": {
"type": "object",
"properties": {
"location": {"type": "string"},
},
"required": ["location"],
},
},
},
]
response = requests.post(
"https://api.tributary.cc/openai/v1/chat/completions",
headers={
"Authorization": "Bearer <TRIBUTARY_API_KEY>",
"Content-Type": "application/json",
},
json={
"model": "anthropic:claude-opus-4.6",
"messages": [
{"role": "user", "content": "What is the weather in San Francisco?"}
],
"tools": tools,
},
)
message = response.json()["choices"][0]["message"]
if "tool_calls" in message:
print("Tool call:", message["tool_calls"][0]["function"]) 2. Send the tool result back
Execute the function, then send the conversation history including the tool result.
args = json.loads(message["tool_calls"][0]["function"]["arguments"])
weather_result = get_weather(args["location"])
final_response = requests.post(
"https://api.tributary.cc/openai/v1/chat/completions",
headers={
"Authorization": "Bearer <TRIBUTARY_API_KEY>",
"Content-Type": "application/json",
},
json={
"model": "anthropic:claude-opus-4.6",
"messages": [
{"role": "user", "content": "What is the weather in San Francisco?"},
message, # The assistant message with tool_calls
{
"role": "tool",
"tool_call_id": message["tool_calls"][0]["id"],
"content": json.dumps(weather_result),
},
],
"tools": tools,
},
)
print(final_response.json()["choices"][0]["message"]["content"]) 1. Send a request with tools
Include the tools array in your request body. The model will decide whether to call a tool.
curl https://api.tributary.cc/openai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TRIBUTARY_API_KEY" \
-d '{
"model": "anthropic:claude-opus-4.6",
"messages": [
{"role": "user", "content": "What is the weather in San Francisco?"}
],
"tools": [
{
"type": "function",
"function": {
"name": "get_weather",
"description": "Get the current weather for a location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "City name, e.g. San Francisco"
}
},
"required": ["location"]
}
}
}
]
}'
2. Send the tool result back
After executing the function, send the result back with the full conversation history.
curl https://api.tributary.cc/openai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TRIBUTARY_API_KEY" \
-d '{
"model": "anthropic:claude-opus-4.6",
"messages": [
{"role": "user", "content": "What is the weather in San Francisco?"},
{
"role": "assistant",
"tool_calls": [
{
"id": "call_abc123",
"type": "function",
"function": {
"name": "get_weather",
"arguments": "{\"location\":\"San Francisco\"}"
}
}
]
},
{
"role": "tool",
"tool_call_id": "call_abc123",
"content": "{\"temperature\": 62, \"condition\": \"sunny\"}"
}
]
}'