Context distillation, not blunt compression
TrexAPI extracts the facts, constraints, and conclusions a task actually depends on, then compacts the low-signal narrative around them, instead of simply truncating context.
TrexAPI is the first platform built for the TokenZip Protocol. Most teams no longer need to start with custom HMAC signing or per-request upstream keys. Create one Trex key, bind your own OpenAI or Anthropic key in the dashboard, then point your client base URL to the Trex-compatible endpoint.
TrexAPI is not only a payload API and not only a provider proxy: it is a production interface that combines context distillation, a TrexID memory layer, and Semantic Edge routing in one surface.
Processed context is organized as a TokenZip payload and stored behind a TrexID so downstream systems can pass it by reference and expand it on demand.
When you proxy your own model traffic through Trex, the Worker reconciles original tokens, actual upstream billed tokens, and savings in one place.
This is the default integration path we recommend. Switch to the advanced sections below only if you need direct payload lifecycle control or custom signing.
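The reconciliation the Worker performs is, at its core, arithmetic over two token counts. As a sketch (a hypothetical helper for illustration, not the Worker's implementation):

```typescript
// Hypothetical reconciliation helper (a sketch, not the Worker's actual code):
// compare the original token count against what the upstream provider billed.
interface ReconciledUsage {
  originalInputTokens: number; // tokens the request would have cost without Trex
  billedInputTokens: number;   // tokens the provider actually billed
  savedInputTokens: number;    // input tokens avoided
  savedRatio: number;          // fraction of original input tokens avoided
}

function reconcileUsage(originalInputTokens: number, billedInputTokens: number): ReconciledUsage {
  const savedInputTokens = Math.max(0, originalInputTokens - billedInputTokens);
  return {
    originalInputTokens,
    billedInputTokens,
    savedInputTokens,
    savedRatio: originalInputTokens > 0 ? savedInputTokens / originalInputTokens : 0,
  };
}

// With the figures used later in this guide (6200 original, 2020 optimized),
// the saved-input-token credit would be 4180.
const demoUsage = reconcileUsage(6200, 2020);
```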
Create an API key in the dashboard and copy the returned `Trex Proxy Key`. That is the value you paste into the `apiKey` field of OpenAI-compatible clients.
Save your OpenAI or Anthropic key in the dashboard. Trex will automatically use it on proxy requests so you do not need to send `X-Upstream-Api-Key` every time.
Point the client base URL to `https://api.trexapi.com/v1/proxy/openai` or `.../anthropic`, replace the `apiKey` with your Trex proxy key, and traffic will start flowing through the Trex managed proxy.
If you do not want to hand-build `trex_accounting`, signature headers, and provider passthrough parameters yourself, use the `trexapi` SDK. It automates Trex proxy keys, OpenAI / Anthropic passthrough headers, and HMAC signing for payload routes.
```js
const { createTrexClient } = require('trexapi');

const trex = createTrexClient({
  baseUrl: 'https://api.trexapi.com',
  proxyKey: process.env.TREX_PROXY_KEY,
  openai: {
    apiKey: process.env.OPENAI_API_KEY,
    organization: process.env.OPENAI_ORG,
    project: process.env.OPENAI_PROJECT,
  },
});

const response = await trex.openai.responses.create(
  {
    model: 'gpt-5-mini',
    input: 'Summarize the latest deployment notes.',
  },
  {
    originalInput: fullPromptBeforeTrex,
  }
);

console.log(response._trex);
```

Whether you are pushing a payload or reconciling baseline usage inside the managed proxy, the underlying job is the same: turning long context into a reusable TrexID semantic object.
Identify the semantic material that actually changes model behavior.
IDs, amounts, clauses, code, and structured fields move through an exact-preservation path.
Organize the distilled result into a signable, storable, reusable semantic payload.
Systems pass the TrexID and expand only when needed instead of resending full long context every time.
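The distilled result in the third step corresponds to the `/v1/payloads` request body shown later in this guide. As a TypeScript sketch of that shape (field names copied from that example):

```typescript
// Sketch of the TokenZip payload shape, mirroring the /v1/payloads request
// body shown later in this guide. The interface itself is illustrative.
interface TokenZipPayload {
  tzp_version: '1.0';
  payload: {
    vector_seq_b64: string[]; // base64-encoded quantized vector chunks
    quant_params: { min: number; max: number; method: string };
    dimensions: number;
    chunk_count: number;
    summary: string; // compact distilled summary
  };
  metadata: {
    ttl_seconds: number;          // payload lifetime before expiry
    allowed_receivers: string[];  // agent IDs permitted to expand it
  };
}

const demo: TokenZipPayload = {
  tzp_version: '1.0',
  payload: {
    vector_seq_b64: ['AAECAwQ='],
    quant_params: { min: -1, max: 1, method: 'uniform' },
    dimensions: 384,
    chunk_count: 1,
    summary: 'demo payload',
  },
  metadata: { ttl_seconds: 86400, allowed_receivers: ['agent_xxx'] },
};
```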
For any active subscriber, retrieving existing cached content through a TrexID is unlimited and free. TrexAPI is designed so you do not keep paying to re-read the same semantic memory once it has settled behind a TrexID.
If you need direct `/v1/payloads` control or custom signing, the Worker still supports the advanced `Authorization: Bearer <agent_id>:<timestamp>:<nonce>:<signature>` mode. Requests whose timestamp falls outside a 5-minute clock window, or that reuse a nonce, are rejected.
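A minimal sketch of those freshness checks, assuming an in-memory nonce set (the Worker's actual storage and logic may differ):

```typescript
// Hypothetical server-side freshness checks for the advanced HMAC mode:
// reject timestamps outside a 5-minute window and nonces already seen.
const WINDOW_MS = 5 * 60 * 1000;
const seenNonces = new Set<string>();

function checkFreshness(timestamp: string, nonce: string, now: Date = new Date()): boolean {
  const ts = Date.parse(timestamp);
  if (Number.isNaN(ts)) return false;                         // malformed timestamp
  if (Math.abs(now.getTime() - ts) > WINDOW_MS) return false; // outside clock window
  if (seenNonces.has(nonce)) return false;                    // nonce replay
  seenNonces.add(nonce);
  return true;
}
```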
```
POST
/v1/payloads
2026-03-12T10:30:00.000Z
nonce_demo_123
```

This is the default entry point we recommend. Once your provider key is stored in the dashboard, Trex forwards the request, parses exact usage, applies official pricing, and credits saved input tokens plus saved token cost into your guarantee cycle.
Pass through the OpenAI Responses API with the original provider body plus an optional `trex_accounting` object.
Supports Chat Completions requests; cached prompt tokens are split from standard input tokens before pricing.
Supports Anthropic Messages and uses `cache_write_ttl` to distinguish 5-minute vs 1-hour prompt cache write pricing.
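The cached-token split described above reduces to simple arithmetic. A sketch with placeholder per-token rates (`priceInputCost` and both rates are illustrative, not official provider pricing):

```typescript
// Sketch of pricing input tokens after splitting cached prompt tokens from
// standard input tokens. Rates are placeholders, not official pricing.
function priceInputCost(
  standardInputTokens: number, // input tokens billed at the standard rate
  cachedInputTokens: number,   // input tokens served from the prompt cache
  standardRate: number,        // hypothetical standard per-token rate
  cachedRate: number           // hypothetical discounted cached per-token rate
): number {
  return standardInputTokens * standardRate + cachedInputTokens * cachedRate;
}

// Using the split from the trex_accounting example in this guide
// (1900 standard + 120 cached) with placeholder rates:
const demoCost = priceInputCost(1900, 120, 0.000001, 0.0000005);
```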
`X-Trex-Usage-Event-Id` still returns the reconciliation event ID.
`X-Trex-Guarantee-Status` still returns the current guarantee state.

This is the most common setup: create a Trex proxy key, replace the base URL, and keep the rest of your OpenAI request structure intact.
```ts
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.TREX_PROXY_KEY!,
  baseURL: 'https://api.trexapi.com/v1/proxy/openai',
});

const response = await client.responses.create({
  model: 'gpt-5-mini',
  input: 'Summarize the latest deployment notes for the team.',
  trex_accounting: {
    baseline_usage: {
      input_tokens: 6200,
      output_tokens: 380,
    },
    trex: {
      original_input_tokens: 6200,
      request_bytes_before: 18342,
    },
  },
});

console.log(response.output_text);
```

If you want to temporarily bypass the stored dashboard binding or explicitly override the provider key in a script, use this pattern.
```ts
const proxyResponse = await fetch('https://api.trexapi.com/v1/proxy/openai/responses', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // Your Trex Authorization value (proxy key bearer or signed HMAC header)
    Authorization: authorization,
    'X-Upstream-Api-Key': process.env.OPENAI_API_KEY!,
  },
  body: JSON.stringify({
    model: 'gpt-5-mini',
    input: 'Summarize the latest deployment notes for the team.',
    trex_accounting: {
      baseline_usage: {
        input_tokens: 6200,
        output_tokens: 380,
      },
    },
  }),
});
```

`baseline_usage` tells the Worker what the provider would have consumed without Trex, while `actual_usage_override` is available for edge cases where you need to explicitly override the provider usage split.
```json
{
  "trex_accounting": {
    "baseline_usage": {
      "input_tokens": 6200,
      "output_tokens": 380
    },
    "actual_usage_override": {
      "input_tokens": 1900,
      "cached_input_tokens": 120
    },
    "cache_write_ttl": "5m",
    "trex": {
      "original_input_tokens": 6200,
      "optimized_input_tokens": 2020,
      "request_bytes_before": 18342,
      "request_bytes_after": 6021
    }
  }
}
```

This example matches the current Worker signature implementation and the `/v1/payloads` request schema.
```ts
import crypto from 'node:crypto';

const agentId = 'agent_xxx';
const secret = 'tsk_xxx';
const timestamp = new Date().toISOString();
const nonce = 'nonce_demo_123';
const path = '/v1/payloads';

const canonical = ['POST', path, timestamp, nonce].join('\n');
const signature = crypto.createHmac('sha256', secret).update(canonical).digest('base64url');
const authorization = `Bearer ${agentId}:${timestamp}:${nonce}:${signature}`;

const response = await fetch('http://127.0.0.1:8787/v1/payloads', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: authorization,
  },
  body: JSON.stringify({
    tzp_version: '1.0',
    payload: {
      vector_seq_b64: ['AAECAwQ='],
      quant_params: { min: -1, max: 1, method: 'uniform' },
      dimensions: 384,
      chunk_count: 1,
      summary: 'demo payload',
    },
    metadata: {
      ttl_seconds: 86400,
      allowed_receivers: ['agent_xxx'],
    },
  }),
});
```

The same HMAC authorization format applies across the full TokenZip payload lifecycle.
Push a new payload and receive `trex_id`, `expires_at`, and checksum metadata.
Retrieve the payload and metadata if the record exists, is active, and your agent is allowed.
Check payload metadata headers or revoke a payload when the sender needs to invalidate it.
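Retrieval and revocation reuse the same HMAC format as the push example. A sketch of a reusable signing helper (the `GET` route and path shape in the usage comment are assumptions for illustration; check the payload lifecycle reference for exact paths):

```typescript
import crypto from 'node:crypto';

// Build the signed Authorization header for any payload-lifecycle request,
// using the same canonical string as the push example in this guide.
function signedHeaders(
  method: string,
  path: string,
  agentId: string,
  secret: string,
  timestamp: string = new Date().toISOString(),
  nonce: string = crypto.randomUUID()
): { Authorization: string } {
  const canonical = [method, path, timestamp, nonce].join('\n');
  const signature = crypto.createHmac('sha256', secret).update(canonical).digest('base64url');
  return { Authorization: `Bearer ${agentId}:${timestamp}:${nonce}:${signature}` };
}

// Hypothetical usage against a locally running Worker (route path assumed):
// const res = await fetch(`http://127.0.0.1:8787/v1/payloads/${trexId}`, {
//   headers: signedHeaders('GET', `/v1/payloads/${trexId}`, 'agent_xxx', 'tsk_xxx'),
// });
```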
The Worker returns structured JSON errors for most failure cases.
Missing signature, invalid HMAC, expired timestamp, reused nonce, or inactive credentials.
Your payload declared a TZP version other than `1.0` or `1.0.0`.
Your agent is not included in `allowed_receivers`, or you are trying to revoke a payload you did not send.
The payload does not exist, has expired, or was revoked before your request.
You called `/v1/proxy/...` without a stored dashboard provider key and without an `X-Upstream-Api-Key` override.
Streaming is not yet supported on the managed proxy because usage reconciliation requires the final JSON response body.
Start in the dashboard for API key provisioning, then work from this quickstart. For account, access, or rollout issues, contact support@trexapi.com.