Tool Calling
staik supports OpenAI-compatible tool calling. Send tools and tool_choice in your request, and the model responds with tool_calls in the same format as OpenAI.
Define a tool
Request with tool (curl):

```bash
curl https://api.staik.se/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-st-your-key" \
  -d '{
    "model": "gemma4:31b",
    "messages": [
      {"role": "user", "content": "What is the weather in Stockholm?"}
    ],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
          },
          "required": ["city"]
        }
      }
    }],
    "tool_choice": "auto"
  }'
```

The model responds with tool_calls instead of text when it wants to call a tool:
Response (JSON):

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "gemma4:31b",
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": null,
      "tool_calls": [{
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "get_weather",
          "arguments": "{\"city\": \"Stockholm\", \"unit\": \"celsius\"}"
        }
      }]
    },
    "finish_reason": "tool_calls"
  }]
}
```

Full tool loop (Python)
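One detail worth noting before the loop: function.arguments arrives as a JSON-encoded string, not a parsed object, so decode it with json.loads before dispatching. A minimal sketch over a plain dict shaped like the response above (the payload is hard-coded here purely for illustration):

```python
import json

# Hard-coded payload shaped like the response above, for illustration only.
response = {
    "choices": [{
        "message": {
            "tool_calls": [{
                "id": "call_abc123",
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "arguments": "{\"city\": \"Stockholm\", \"unit\": \"celsius\"}",
                },
            }],
        },
        "finish_reason": "tool_calls",
    }],
}

for tc in response["choices"][0]["message"]["tool_calls"]:
    args = json.loads(tc["function"]["arguments"])  # JSON string -> dict
    print(tc["function"]["name"], args["city"], args["unit"])
    # -> get_weather Stockholm celsius
```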
Python (OpenAI SDK):

```python
import json

from openai import OpenAI

client = OpenAI(
    base_url="https://api.staik.se/v1",
    api_key="sk-st-your-key",
)


def get_weather(city: str, unit: str = "celsius") -> dict:
    # Call your real weather API here
    return {"city": city, "temp": 4, "unit": unit, "conditions": "cloudy"}


tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What is the weather in Stockholm?"}]

# 1. Model decides to call the tool
response = client.chat.completions.create(
    model="gemma4:31b",
    messages=messages,
    tools=tools,
)
msg = response.choices[0].message
messages.append(msg)

# 2. Execute each tool call and feed the result back
for tc in msg.tool_calls or []:
    args = json.loads(tc.function.arguments)
    result = get_weather(**args)
    messages.append({
        "role": "tool",
        "tool_call_id": tc.id,
        "content": json.dumps(result),
    })

# 3. Model formulates the final answer using the tool result
final = client.chat.completions.create(
    model="gemma4:31b",
    messages=messages,
)
print(final.choices[0].message.content)
```

Tool calling (Node.js)
Node.js (OpenAI SDK):

```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.staik.se/v1",
  apiKey: "sk-st-your-key",
});

const tools = [{
  type: "function",
  function: {
    name: "get_weather",
    description: "Get current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] },
      },
      required: ["city"],
    },
  },
}];

const response = await client.chat.completions.create({
  model: "gemma4:31b",
  messages: [{ role: "user", content: "What is the weather in Stockholm?" }],
  tools,
});

const toolCalls = response.choices[0].message.tool_calls;
if (toolCalls) {
  for (const tc of toolCalls) {
    const args = JSON.parse(tc.function.arguments);
    console.log("Calling", tc.function.name, "with", args);
  }
}
```

Tips
- Set tool_choice: "required" to force the model to call a tool, or {"type": "function", "function": {"name": "..."}} to force a specific tool.
- Streaming (SSE) is supported: tool_calls arrive in the delta field, using the same indexed format as OpenAI.
- Clear tool descriptions and parameter names (in English or Swedish) yield better tool calls. Use gemma4:31b or qwen3.6:35b-a3b for the best tool-following.
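For the streaming case in the second tip, the deltas belonging to one tool call share an index, and the arguments string arrives in fragments that must be concatenated in order. A sketch of the accumulation step, written against plain delta dicts (the shapes below mirror OpenAI's indexed format but are hand-built for illustration):

```python
def accumulate_tool_calls(delta_lists):
    """Merge streamed tool_call deltas into complete tool calls.

    `delta_lists` is one list of tool_call deltas per stream chunk.
    Each delta carries an `index`; `id` and `name` appear once, while
    `arguments` arrives as string fragments to concatenate in order.
    """
    calls = {}
    for tool_deltas in delta_lists:
        for d in tool_deltas or []:
            i = d["index"]
            call = calls.setdefault(
                i, {"id": "", "type": "function",
                    "function": {"name": "", "arguments": ""}}
            )
            if d.get("id"):
                call["id"] = d["id"]
            fn = d.get("function") or {}
            if fn.get("name"):
                call["function"]["name"] = fn["name"]
            if fn.get("arguments"):
                call["function"]["arguments"] += fn["arguments"]
    return [calls[i] for i in sorted(calls)]


# Hand-built deltas: one tool call split across two chunks.
deltas = [
    [{"index": 0, "id": "call_abc123",
      "function": {"name": "get_weather", "arguments": "{\"city\": "}}],
    [{"index": 0, "function": {"arguments": "\"Stockholm\"}"}}],
]
calls = accumulate_tool_calls(deltas)
print(calls[0]["function"]["name"], calls[0]["function"]["arguments"])
# -> get_weather {"city": "Stockholm"}
```

In a real stream you would collect chunk.choices[0].delta.tool_calls from each SSE chunk into delta_lists, then run the accumulator once the stream finishes with finish_reason "tool_calls".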