
Adding Providers

ClawNex supports 15 provider types for connecting to LLM backends. Providers are managed in Configuration > Model Providers.

Supported Providers

LM Studio, OpenRouter, OpenAI, Anthropic, Azure OpenAI, Google Vertex, AWS Bedrock, Mistral, Groq, Together AI, Fireworks, Replicate, Ollama, HuggingFace, and Custom/Other.

Adding a Provider

All providers are added from the same screen: go to Configuration > Model Providers and click Add Provider. The remaining steps depend on the provider type.

Local (LM Studio)

  1. Go to Configuration > Model Providers

  2. Click Add Provider

  3. Select LM Studio as the type

  4. Enter the API base URL (e.g., http://192.168.1.100:1234/v1)

  5. API key: use lmstudio-local (static)

  6. Configure model settings (context window, capabilities)

  7. Click Save

Cloud (OpenAI, Anthropic)

  1. Go to Configuration > Model Providers

  2. Click Add Provider

  3. Select OpenAI or Anthropic

  4. Enter your API key from the provider dashboard

  5. The API base URL is pre-filled for cloud providers

  6. Click Save

OpenRouter

  1. Go to Configuration > Model Providers

  2. Click Add Provider

  3. Select OpenRouter

  4. Enter your OpenRouter API key from openrouter.ai

  5. Click Save

OpenRouter gives you access to 100+ models through a single API key.
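Since OpenRouter (like LM Studio) exposes an OpenAI-compatible API, any provider you add this way can be exercised with a standard chat-completions request. The sketch below builds such a request without sending it; the model name and key are placeholders, not values from this guide.

```python
import json

# OpenRouter's OpenAI-compatible base URL.
OPENROUTER_BASE = "https://openrouter.ai/api/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Return (url, headers, body) for a POST to the chat completions endpoint."""
    url = f"{OPENROUTER_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

# Placeholder key and model for illustration only.
url, headers, body = build_chat_request("sk-or-...", "openai/gpt-4o-mini", "ping")
print(url)
```

Point the same helper at an LM Studio base URL (with the static lmstudio-local key) and it works unchanged, which is what makes these providers interchangeable behind the proxy.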

LiteLLM Configuration

Providers configured through the dashboard are synced to litellm/config.yaml automatically. For manual configuration, edit:

  • ~/sentinel/litellm/config.yaml — update api_base URLs and model list
  • ~/sentinel/litellm/start.sh — update API key exports
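For reference, a manually added entry in config.yaml typically looks like the fragment below. This follows LiteLLM's standard model_list schema; the model names and address are illustrative, not values from your deployment.

```yaml
model_list:
  - model_name: local-llama          # name agents will request
    litellm_params:
      model: openai/llama-3.1-8b     # openai/ prefix = OpenAI-compatible backend
      api_base: http://192.168.1.100:1234/v1
      api_key: lmstudio-local
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY  # resolved from the export in start.sh
```

Keeping keys in start.sh exports (referenced via os.environ/) rather than inline in config.yaml avoids committing secrets to the config file.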

Testing the Connection

After adding a provider:

  1. Go to Infrastructure panel
  2. Verify the LiteLLM Proxy shows ONLINE (green)
  3. Go to Prompt Shield and run a manual scan — if it returns results, the shield engine is working
  4. Send a test message through an agent to confirm traffic flows through the proxy
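Step 2 can also be checked programmatically against the proxy's health endpoint. The JSON shape below (healthy_endpoints / unhealthy_endpoints) is an assumption about the proxy's response format; adjust the field names to match your LiteLLM version.

```python
import json

def proxy_is_healthy(body: str) -> bool:
    """Classify a /health response body: healthy only if at least one
    endpoint is up and none are reported as unhealthy (assumed schema)."""
    data = json.loads(body)
    healthy = data.get("healthy_endpoints", [])
    unhealthy = data.get("unhealthy_endpoints", [])
    return len(healthy) > 0 and len(unhealthy) == 0

sample = '{"healthy_endpoints": [{"model": "gpt-4o"}], "unhealthy_endpoints": []}'
print(proxy_is_healthy(sample))  # → True
```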
⚠️ Provider API keys are stored in the database and masked in GET responses. They are never exposed in plaintext through the API.
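ClawNex's exact masking format is not specified here; as an illustrative sketch, masking of the kind described usually keeps only the last few characters of the key:

```python
def mask_key(key: str, visible: int = 4) -> str:
    """Illustrative masking: replace all but the last `visible` characters
    with asterisks, as a GET response might."""
    if len(key) <= visible:
        return "*" * len(key)
    return "*" * (len(key) - visible) + key[-visible:]

print(mask_key("sk-or-abcdef123456"))  # → **************3456
```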