
§ Quick Start · Three Protocols

A side-by-side view of the three protocols.

HiPMM Engine speaks three wire protocols at once. Anthropic clients change base_url, OpenAI clients change baseURL: one line of configuration, five minutes to integrate.

01

Endpoint table

| Protocol | Endpoint | Client SDK |
| --- | --- | --- |
| HiPMM native | POST /v1/codename · /v1/codename/symbolic · /v1/analyze/individual | hipmm-python / hipmm-ts |
| Anthropic Messages | POST /v1/messages | anthropic-sdk (change base_url) |
| OpenAI Chat | POST /v1/chat/completions | openai-sdk (change baseURL) |
| OpenAI Responses | POST /v1/responses | openai-sdk (.responses.create) |

All three protocols **share the same** API keys (hipmm_sk_*), the same quota, and the same audit log. Switching protocols requires no reconfiguration.

02

model field routing table

The compatible protocols (Anthropic / OpenAI) use the model field to decide which HiPMM module handles the request:

| model field | Corresponding HiPMM endpoint | cost (managed / BYOM) |
| --- | --- | --- |
| hipmm/codename | /v1/codename | 1 / 1 |
| hipmm/codename-symbolic | /v1/codename/symbolic | 2 / 2 |
| hipmm/individual-brief | /v1/analyze/individual?depth=brief | 10 / 4 |
| hipmm/individual-standard | /v1/analyze/individual?depth=standard | 10 / 4 |
| hipmm/individual-deep | /v1/analyze/individual?depth=deep | 10 / 4 |
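The routing table above amounts to a plain lookup. A minimal sketch; `resolve_route` and its return shape are illustrative, not the engine's actual internals:

```python
# Sketch of model-field routing, mirroring the table above.
# resolve_route is a hypothetical helper, not part of any HiPMM SDK.
ROUTES = {
    "hipmm/codename": ("/v1/codename", 1, 1),
    "hipmm/codename-symbolic": ("/v1/codename/symbolic", 2, 2),
    "hipmm/individual-brief": ("/v1/analyze/individual?depth=brief", 10, 4),
    "hipmm/individual-standard": ("/v1/analyze/individual?depth=standard", 10, 4),
    "hipmm/individual-deep": ("/v1/analyze/individual?depth=deep", 10, 4),
}

def resolve_route(model: str) -> tuple[str, int, int]:
    """Map a compatible-protocol model field to (endpoint, managed_cost, byom_cost)."""
    try:
        return ROUTES[model]
    except KeyError:
        raise ValueError(f"unknown model: {model}")
```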
03

Anthropic-compatible example

Python · anthropic SDK

from anthropic import AsyncAnthropic

client = AsyncAnthropic(
    base_url="https://api.hipmm.com/v1",  # the only line you need to change
    api_key="hipmm_sk_...",
)

msg = await client.messages.create(
    model="hipmm/codename",
    messages=[{"role": "user",
               "content": "I-A-L-X-ID-AR(PR)-SE-LT"}],
    max_tokens=1024,
)
print(msg.content[0].text)

Tip

The Anthropic SDK sends the key in the x-api-key header by default. HiPMM's AuthMiddleware accepts both Authorization: Bearer ... and x-api-key, so clients need no changes.
04

OpenAI-compatible example

TypeScript · openai SDK

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.hipmm.com/v1",
  apiKey: "hipmm_sk_..."
});

// chat.completions (the mainstream path)
const r = await client.chat.completions.create({
  model: "hipmm/codename",
  messages: [{ role: "user", content: "I-A-L-X-ID-AR(PR)-SE-LT" }],
});

// or responses (OpenAI's 2025 protocol)
const r2 = await client.responses.create({
  model: "hipmm/codename-symbolic",
  input: "I-A-L-X-ID-AR(PR)-SE-LT",
});
05

Which protocol when

| Scenario | Recommended protocol |
| --- | --- |
| New project, integrating with HiPMM directly | HiPMM native SDK (typed fields, BYOM as a first-class citizen) |
| Existing anthropic SDK code | change base_url, hit /v1/messages |
| Existing openai SDK code | change baseURL, hit /v1/chat/completions |
| Want OpenAI Responses (2025) | /v1/responses |
| No SDK at all, just curl | HiPMM native POST /v1/codename |
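For the SDK-free row above, a raw HTTP request is all that's needed. A minimal stdlib sketch that only builds the request; the JSON body field name ("input") is an assumption for illustration, so consult the native API reference for the actual request schema:

```python
# SDK-free call to the HiPMM native endpoint using only the stdlib.
# NOTE: the body field name "input" is a guess, not a documented schema.
import json
import urllib.request

req = urllib.request.Request(
    "https://api.hipmm.com/v1/codename",
    data=json.dumps({"input": "I-A-L-X-ID-AR(PR)-SE-LT"}).encode(),
    headers={
        "Authorization": "Bearer hipmm_sk_...",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; here we only construct the request.
```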
06

Protocol differences at a glance

The three protocols differ in wire format, but the HiPMM business fields are identical:

HiPMM native response (200 OK)

{
  "success": true,
  "data": { "short": "深思的孤勇者", "long": "深思的颠覆者 · 孤勇者" },
  "metadata": { "engine_version": "0.3.0", "cost_units": 1, ... }
}

Anthropic /v1/messages response

{
  "id": "msg_xxx",
  "type": "message",
  "role": "assistant",
  "content": [{
    "type": "text",
    "text": "{\"short\":\"深思的孤勇者\",\"long\":\"...\"}"
  }],
  "usage": { "input_tokens": 12, "output_tokens": 24 }
}

OpenAI /v1/chat/completions response

{
  "object": "chat.completion",
  "choices": [{
    "message": {
      "role": "assistant",
      "content": "{\"short\":\"深思的孤勇者\",\"long\":\"...\"}"
    }
  }],
  "usage": { "prompt_tokens": 12, "completion_tokens": 24 }
}

HiPMM native vs compatible protocols

In the compatible protocols, content / message.content is a JSON string (matching LLM API convention). Clients parse it themselves: JSON.parse(content) yields the same object the native response carries under data. The HiPMM native SDK handles this layer for you (typed fields), so no parsing is needed.
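That unwrapping step can be sketched against the example responses shown above; `extract_data` is a hypothetical helper, not part of any HiPMM SDK:

```python
# Unwrap the business payload from any of the three response shapes above.
import json

def extract_data(resp: dict) -> dict:
    """Return the native `data` object from an Anthropic-, OpenAI-, or native-style response."""
    if resp.get("type") == "message":              # Anthropic /v1/messages
        text = resp["content"][0]["text"]
    elif resp.get("object") == "chat.completion":  # OpenAI /v1/chat/completions
        text = resp["choices"][0]["message"]["content"]
    else:                                          # HiPMM native: already structured
        return resp["data"]
    return json.loads(text)
```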