# API Overview
ClawNex exposes three integration surfaces for programmatic access:
| Surface | Purpose | Transport |
|---|---|---|
| Public REST API (`/api/v1/*`) | HTTP endpoints for shield scanning, fleet monitoring, alerts, audit logs, and agent inventory | HTTPS / JSON |
| OpenAI-Compatible Endpoint (`/api/v1/chat/completions`) | Drop-in replacement for the OpenAI chat completions API with automatic shield scanning | HTTPS / JSON |
| MCP Server | Tool/resource server for AI assistants (Claude Code, etc.) | stdio (JSON-RPC 2.0) or HTTP SSE |
## When to Use Each
- Public REST API — automations, CI/CD integrations, SIEM connectors, custom dashboards
- OpenAI-Compatible Endpoint — route LLM traffic through ClawNex's shield transparently; works with the Python `openai` library, LangChain, and curl
- MCP Server — AI assistants that need to call ClawNex tools interactively during conversations
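As a sketch of what a call to the OpenAI-compatible endpoint looks like, the helper below assembles the request without sending it. The bearer-token header, the `YOUR_API_KEY` placeholder, and the model name are illustrative assumptions, not documented values:

```python
import json

BASE_URL = "http://127.0.0.1:5001"  # ClawNex dashboard base URL

def build_chat_request(messages, model="gpt-4o-mini"):
    """Assemble the URL, headers, and JSON body for a shield-scanned
    chat completion call. Nothing is sent; this only builds the request."""
    return {
        "url": f"{BASE_URL}/api/v1/chat/completions",
        "headers": {
            # Placeholder: the exact auth scheme is an assumption here.
            "Authorization": "Bearer YOUR_API_KEY",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"model": model, "messages": messages}),
    }

req = build_chat_request([{"role": "user", "content": "Hello"}])
print(req["url"])  # http://127.0.0.1:5001/api/v1/chat/completions
```

Because the endpoint mirrors the OpenAI wire format, the same payload can be posted with any HTTP client, or the official `openai` client can be pointed at `BASE_URL` directly.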
## Base URL

`http://127.0.0.1:5001`

## Response Format
All Public API endpoints return JSON with a standard envelope:
```json
{
  "ok": true,
  "data": { },
  "meta": {
    "requestId": "uuid",
    "timestamp": "ISO-8601"
  }
}
```

Error responses use the same envelope with `"ok": false` and an `"error"` field.
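A minimal client-side helper for consuming this envelope might look like the sketch below; the sample payload is illustrative, not real API output:

```python
import json

def unwrap(envelope_text):
    """Parse a ClawNex envelope string: return `data` when ok is true,
    otherwise raise with the envelope's error field."""
    envelope = json.loads(envelope_text)
    if not envelope.get("ok"):
        raise RuntimeError(envelope.get("error", "unknown error"))
    return envelope["data"]

# Illustrative success and error payloads, not captured API responses.
sample_ok = json.dumps({
    "ok": True,
    "data": {"agents": 3},
    "meta": {"requestId": "uuid", "timestamp": "ISO-8601"},
})
print(unwrap(sample_ok))  # {'agents': 3}
```

Centralizing the unwrap step keeps endpoint-specific code working only with the `data` payload, while error envelopes surface uniformly as exceptions.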
## Architecture
All three surfaces share the same underlying services: the shield scanner, the SQLite database, the alert manager, and the audit logger. The Public API and OpenAI endpoint authenticate via API keys. The MCP server calls the dashboard’s internal API on 127.0.0.1:5001.