What's new
Customer-facing release notes for the Codec stack. The deeper engineering changelog (commit lists, fork SHAs, image digests) lives in GitHub Releases; this page is the one-line-per-change summary.
-
feature v0.4 v0.4 — safety-policy negotiation as a TLS-style capability axis
Codec gains a sixth negotiation axis on the HELLO/READY handshake — a sanitized, hash-anchored `safety_policy` descriptor that lets servers advertise enforcement (categories, actions, classifier family) without leaking operator-internal banned-id lists or thresholds. Adds an optional `@codecai/web-safety` client package (prefilter + classifier registry), full operator-side enforcement in codec-supervisor (logits processor, multi-token matcher, classifier registry with three v1 implementations), and per-language tokenize/detok benchmarks across all six client libs. Wire numbers unchanged from v0.3.x — v0.4 is wire-additive.
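As a rough illustration of the hash-anchoring idea, here is a hypothetical sketch of what a `safety_policy` descriptor could look like: the server advertises *that* enforcement exists (categories, action, classifier family) while the operator-internal banned-id list is anchored only by digest. All field names and values here are illustrative assumptions, not the actual Codec wire schema.

```python
import hashlib
import json

# Operator-internal banned token IDs: never sent on the wire.
banned_ids = [48_219, 50_001, 91_404]

# Anchor the list by content hash so clients can detect policy changes
# without ever seeing the list itself. (Hypothetical encoding: sorted
# IDs as 4-byte little-endian words.)
banned_digest = hashlib.sha256(
    b"".join(i.to_bytes(4, "little") for i in sorted(banned_ids))
).hexdigest()

# Illustrative descriptor shape, not the real schema.
safety_policy = {
    "categories": ["self_harm", "csam"],       # what is enforced
    "action": "refuse",                        # what happens on a hit
    "classifier_family": "prefilter+logits",   # how it is enforced
    "banned_ids_sha256": banned_digest,        # anchor, not the list
}
print(json.dumps(safety_policy, indent=2))
```

The point of the digest field is that two servers running the same policy advertise the same anchor, while the thresholds and IDs behind it stay private.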
-
improvement v0.3.2 Cross-stack bench cleanup — 24/24 unanimous on every engine
Re-ran the full cross-stack matrix after patching two bench-driver bugs (C/TS token-decode fallback, vllm REPS=1 noise). All three engines × six client languages now produce byte-identical Codec frames per cell — including vllm, which previously read as 0/24 unanimous in the post-mortem.
-
improvement v0.3.0 v0.3 bench numbers from the lab — 3.6× on tools/list, 18× on text streams
First end-to-end run of the v0.3 stack against codec-metamcp:v0.3.0 on a real lab box. tools/list compresses 3.6× over JSON-RPC; text streams hit 18× over JSON-SSE on protobuf framing.
-
fix v0.3.1 codec-metamcp v0.3.1 — leaf-mode validator fix; Codec-aware tools 4.2× e2e
First end-to-end run with codec-time-leaf in a metamcp namespace surfaced (and we fixed) a CallToolResult validator bug that was rejecting all leaf-mode results. Codec-aware tool calls now compress 4.2× through the gateway.
-
feature v0.3.0 v0.3 latent modality — VAE latents on the wire
Image and video diffusion models now stream VAE latents instead of decoded pixels. 48× smaller wire weight, decode at the leaf.
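The 48× figure is easy to sanity-check with back-of-envelope arithmetic, assuming SD-1.5 geometry (a decoded 512×512 RGB frame vs. the 4×64×64 latent the VAE actually produces) and one byte per latent element. This is illustrative arithmetic, not measured wire traffic.

```python
# Decoded at the client: 512x512 RGB, one byte per channel.
pixel_bytes = 512 * 512 * 3        # 786,432 bytes

# On the wire: the 4x64x64 VAE latent, int8-quantized.
latent_bytes = 4 * 64 * 64 * 1     # 16,384 bytes

print(pixel_bytes / latent_bytes)  # -> 48.0
```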
-
feature v0.3.0 Codec-aware MCP gateway
Tool authors can now ship pre-tokenized results that bypass the gateway's back-compat shim. ~4.7× wire-byte reduction on real MCP traffic.
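To make the bypass concrete, here is a hypothetical sketch of the two result shapes a tool author chooses between. The field names are illustrative assumptions, not the actual MCP/Codec schema: the leaf-mode result carries token IDs for the serving model's vocabulary, so the gateway forwards an opaque ID stream instead of re-tokenizing text.

```python
# Legacy path: plain text, re-tokenized by the gateway's back-compat shim.
legacy_result = {"content": [{"type": "text", "text": "2025-01-01T00:00:00Z"}]}

# Hypothetical leaf-mode path: pre-tokenized payload plus a reference to the
# tokenizer map these IDs target, so the gateway can stay a transparent pipe.
leaf_result = {
    "tokenizer_map_sha256": "…",               # which vocab the IDs assume
    "token_ids": [1049, 12, 4107, 220, 3456],  # illustrative IDs, not real
}
```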
-
improvement v0.3.0 tool_calling block in tokenizer maps
Tokenizer maps now carry the model's tool-calling convention. Auto-derived from the chat template — no per-deployment config.
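For a sense of what such a block might contain, here is a hypothetical sketch of a `tool_calling` entry as it could be derived from a chat template: which markers bracket a tool call and how arguments are encoded. Field names and values are illustrative assumptions, not the actual tokenizer-map format.

```python
# Illustrative tool_calling block, auto-derived from the chat template.
tool_calling = {
    "style": "hermes_json",         # convention family (assumed name)
    "call_open": "<tool_call>",     # marker that opens a tool call
    "call_close": "</tool_call>",   # marker that closes it
    "arguments_encoding": "json",   # how call arguments are serialized
}
```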
-
feature v0.3.4 v0.3 latent bench — pipeline math validates byte-for-byte
First end-to-end latent run against codec-diffusers with real SD-1.5 latents on the wire. The seven-pipeline registry collapses bytes exactly as the spec promises — int4 packs 3.9× over raw, ~5-10× smaller than JPEG.
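The ~3.9× (rather than a clean 4×) is consistent with packing fp16 latents down to 4 bits plus a small fixed quantization header. The header size below is a guess purely for illustration; the real framing may differ.

```python
elems = 4 * 64 * 64                # SD-1.5 latent element count
raw_fp16 = elems * 2               # raw bytes at 2 bytes/element
packed_int4 = elems // 2           # two 4-bit values per byte
header = 256                       # hypothetical scales/zero-points overhead

print(raw_fp16 / (packed_int4 + header))   # ~3.88, i.e. "3.9x over raw"
```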
-
feature v0.3.2 v0.3.2 — leaf-mode bypass observable end-to-end on real MCP traffic
The Codec leaf-mode bypass — the architectural target the entire v0.3 contract was designed for — fires end-to-end. `[Codec][leaf]` log line confirms the gateway is a transparent ID pipe; the tokenizer sits at the leaf where it belongs.
-
feature Java, Rust, and .NET clients reach feature parity
Six client libraries (TypeScript, Python, Java, Rust, .NET, C) are now byte-identical across the cross-stack benchmark matrix. 36 cells × 3 sizes, all green.
-
improvement zstd dictionary negotiation via Codec-Zstd-Dict header
Servers advertise the active zstd dict on the wire; clients fetch it once and decompress every frame against it. Identification by sha256, not URL.
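A minimal client-side sketch of the digest-keyed caching this implies, assuming the header value is the dictionary's sha256 hex digest. `fetch` stands in for however the client actually downloads dictionary bytes; because the cache key is the content hash, a verified copy is fetched at most once and a changed URL never invalidates it.

```python
import hashlib

# Cache of verified dictionary bytes, keyed by sha256 hex digest.
_dict_cache: dict[str, bytes] = {}

def get_zstd_dict(digest: str, fetch) -> bytes:
    """Return the dictionary advertised in Codec-Zstd-Dict, fetching once."""
    if digest not in _dict_cache:
        raw = fetch(digest)
        # Identification is by content hash, not URL: verify before caching.
        if hashlib.sha256(raw).hexdigest() != digest:
            raise ValueError("dictionary bytes do not match advertised sha256")
        _dict_cache[digest] = raw
    return _dict_cache[digest]
```

With python-zstandard, for example, the returned bytes would feed something like `zstandard.ZstdDecompressor(dict_data=zstandard.ZstdCompressionDict(raw))` for per-frame decompression.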