TOOL MANIFEST LANGUAGE

Any AI. Any API.

TML is an open format that makes it easy for AI agents to discover and use public APIs — using a fraction of the tokens of existing standards.

Thousands of public APIs exist — weather, maps, finance, government data. AI agents need a simple way to know what's available and how to call it. TML gives every API a lightweight, portable spec file that any LLM can read. No server. No SDK. Just a file.

OpenAPI: 1047 tokens vs .min: 243 tokens

View on GitHub · Try TML Studio
THE PROBLEM

There's no simple way to tell AI what APIs exist

Today, developers hardcode tools or use verbose specs designed for humans, not agents. There's no standard for API discovery that's lightweight enough for AI. And the formats that do exist waste thousands of tokens on schema overhead — every message, every turn.

3 tools (weather API)

  OpenAPI  1047 tokens
  OpenAI    670 tokens
  .min      243 tokens

20 tools (projected)

  OpenAPI  ~6,980 tokens
  OpenAI   ~4,467 tokens
  .min     ~1,620 tokens

Projected from 3-tool benchmarks. Actual token counts vary by API complexity.

THE SOLUTION

One file per API. Any AI can read it.

A .min file is a complete, portable tool spec. Publish it on GitHub, share it in a prompt, or load it into any agent framework. Every AI gets the same tools — in a fraction of the tokens.

OpenAPI JSON 1047 tokens
{
  "openapi": "3.0.0",
  "paths": {
    "/forecast": {
      "get": {
        "operationId": "getCurrentWeather",
        "summary": "Get current weather...",
        "parameters": [
          {
            "name": "latitude",
            "in": "query",
            "required": true,
            "schema": { "type": "number" },
            "description": "Location latitude"
          },
          {
            "name": "longitude",
            "in": "query",
            "required": true,
            "schema": { "type": "number" },
            "description": "Location longitude"
          },
          // ... 5 more params, each 6 lines
        ],
        "responses": {
          "200": {
            "content": {
              "application/json": {
                "schema": { /* nested */ }
              }
  // ... repeat for 3 endpoints
.min Index 243 tokens
Same 3 weather tools. 243 tokens vs 1047 tokens. 100% tool selection accuracy.
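As a sketch of what such an index looks like, here is a hypothetical .min file for three weather tools, using the pipe syntax shown in the Quick Start below. The tool names, parameter shapes, and endpoints are illustrative, not the benchmarked file:

```
#oatr1|Open-Meteo|https://api.open-meteo.com/v1
weather.current|GET /forecast|latitude:n!,longitude:n!|{temperature,windspeed}|weather current
weather.forecast|GET /forecast|latitude:n!,longitude:n!,daily:s|{daily:[{date,tmax,tmin}]}|weather forecast daily
weather.history|GET /archive|latitude:n!,longitude:n!,start:s!,end:s!|{daily:[...]}|weather history archive
```

One line per tool: name, method and path, params, response shape, and search tags.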
BENCHMARK

7 formats tested. Only one wins on both size and accuracy.

5 real scenarios, 35 API calls, all against the live Open-Meteo weather API. Every format was given the same 3 tools.

.min Index      243
Plain English   359
TypeScript      396
Minimal JSON    399
TML Source      422
OpenAI Schema   670
OpenAPI JSON   1047

Token counts measured from Claude API input_tokens for 3 weather tools, 5 live scenarios (Open-Meteo, March 2026). TML achieved 100% tool selection accuracy across all tested scenarios.

VS MCP

Complementary, not competitors

TML and MCP solve different layers of the tool stack. Use both.

243 .min tokens (3 tools) · 670 OpenAI Schema tokens (same 3 tools) · 0 servers to deploy

Where MCP wins

  • Live bidirectional runtime protocol
  • Established ecosystem with thousands of servers
  • Server-side auth (credentials never leave server)
  • Rich returns (images, files, structured MIME types)

Where TML wins

  • 243 tokens vs 1047 for OpenAPI (same 3 tools)
  • Zero infrastructure — a .min file on GitHub is the entire spec
  • Works with any SDK (Anthropic, OpenAI, Mistral, any LLM)
  • Cacheable and works offline
TML (.min): tool discovery + selection → MCP: tool execution protocol
FORMAT

Two formats, one spec

TML Source is human-readable. .min Index is token-optimized. Both describe the same tools.

TML Source
.min Index
@tool.id
desc: What this tool does and when to use it
GET /endpoint
auth: none|bearer|apikey
in:
  param:str!          "description"
  param:num=default   "description"
out: {response_shape}
tags: keyword tags for search
---

Human-readable and AI-readable. Write tools by hand or compile from OpenAPI. Rich descriptions help agents select the right tool.

str   string
num   number
int   integer
bool  boolean
arr   array
obj   object
!     required
=val  default value
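To make the shorthand concrete, here is a minimal Python sketch that parses a single TML `in:` parameter line into its parts. The regex and field names are illustrative, not part of the TML spec:

```python
import re

# Parse one TML "in:" parameter line, e.g.:
#   page:int=1    "Page number"
#   id:str!       "User ID"
PARAM_RE = re.compile(
    r'^\s*(?P<name>\w+):(?P<type>str|num|int|bool|arr|obj)'
    r'(?P<req>!)?(?:=(?P<default>\S+))?\s*(?:"(?P<desc>[^"]*)")?\s*$'
)

def parse_param(line: str) -> dict:
    m = PARAM_RE.match(line)
    if not m:
        raise ValueError(f"not a TML param line: {line!r}")
    return {
        "name": m["name"],
        "type": m["type"],
        "required": m["req"] == "!",   # '!' marks a required param
        "default": m["default"],       # value after '=', if any
        "description": m["desc"],
    }

print(parse_param('page:int=1       "Page number"'))
print(parse_param('id:str!          "User ID"'))
```

A real compiler would also handle nested `out:` shapes, but the per-parameter grammar is this small.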
EXAMPLES

Real registries, real APIs

These .min files describe production APIs. Each one is a complete tool spec.

Open-Meteo Weather

no auth

Airtable CRUD

bearer auth
TML STUDIO

Compile, test, and chat with your tools

A local dev tool with three panels: compile OpenAPI to .min, manage your registry, and test tools with a live AI chatbot.

Compiler

Paste an OpenAPI spec or describe any API in natural language. AI generates TML + .min instantly. Pick which tools to keep.

Registry

Browse all your .min files. Add authentication (bearer, API key, OAuth2). One click to load into the chatbot.

Chatbot

Chat with a live AI agent that uses your tools. See tool calls, params, and real API responses in real time.

Try TML Studio
QUICK START

Start in 3 steps

1. Write your TML source

Describe your API tools in TML — human-readable, with rich descriptions that help the AI select the right tool.

@users.list
desc: List all users with pagination. Use when user asks to see users or search the directory.
GET /users
auth: bearer
in:
  page:int=1       "Page number"
  limit:int=10     "Results per page"
out: {users:[{id,name,email}]}
tags: users list directory search
---

@users.get
desc: Get a single user by ID. Use when user asks about a specific person.
GET /users/{id}
auth: bearer
in:
  id:str!          "User ID"
out: {id,name,email,role}
tags: users get profile detail
---
2. Compile to .min

The .min format strips TML down to the minimum tokens an AI needs. Compile with TML Studio or write it by hand.

Same tools, 100% accuracy — 243 tokens vs 1047. The .min is what you send to the LLM.
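One piece of that compilation step can be sketched in a few lines: compressing the `in:` block by mapping full type names to the single-letter codes used in .min (`i` for int, `s` for str in the example below; the rest of the mapping is an assumption):

```python
# Hypothetical TML -> .min type-code mapping; only 'int' -> 'i' and
# 'str' -> 's' appear in the .min examples on this page.
TYPE_CODES = {"str": "s", "num": "n", "int": "i", "bool": "b", "arr": "a", "obj": "o"}

def compress_params(params: list[str]) -> str:
    """Turn ['page:int=1', 'id:str!'] into 'page:i=1,id:s!'."""
    out = []
    for p in params:
        name, rest = p.split(":", 1)
        # split off the '!' or '=default' suffix from the type name
        for i, ch in enumerate(rest):
            if ch in "!=":
                t, suffix = rest[:i], rest[i:]
                break
        else:
            t, suffix = rest, ""
        out.append(f"{name}:{TYPE_CODES[t]}{suffix}")
    return ",".join(out)

print(compress_params(["page:int=1", "limit:int=10"]))  # page:i=1,limit:i=10
```

The same idea, applied to every field of every tool, is where the token savings come from.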

3. Give .min to any LLM

Inject into the system prompt or use as native tool definitions. Any model that can read text can read .min.

You have access to these API tools:

#oatr1|My API|https://api.example.com
users.list|GET /users|page:i=1,limit:i=10|{users:[{id,name,email}]}|users list
users.get|GET /users/{id}|id:s!|{id,name,email,role}|users get profile

Select the right tool and construct the API call.
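Assembling that system prompt takes only a few lines in any SDK. A minimal Python sketch, where the prompt wording and inline index are illustrative:

```python
# Build a system prompt from a .min index; pass the result as the system
# message to any chat-completion SDK (Anthropic, OpenAI, Mistral, ...).
MIN_INDEX = """\
#oatr1|My API|https://api.example.com
users.list|GET /users|page:i=1,limit:i=10|{users:[{id,name,email}]}|users list
users.get|GET /users/{id}|id:s!|{id,name,email,role}|users get profile
"""

def build_system_prompt(min_index: str) -> str:
    return (
        "You have access to these API tools:\n\n"
        + min_index.strip()
        + "\n\nSelect the right tool and construct the API call."
    )

prompt = build_system_prompt(MIN_INDEX)
print(prompt)
```

In practice you would read the .min from a file or a raw GitHub URL instead of inlining it.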
Claude · GPT-4 · Gemini · Mistral · Llama · Any LLM

Publish your .min on GitHub and any AI agent in the world can use your API.

Our goal: every public API has a .min file that any AI can read.