Adding Providers
ClawNex supports 15 provider types for connecting to LLM backends. Providers are managed in Configuration > Model Providers.
Supported Providers
LM Studio, OpenRouter, OpenAI, Anthropic, Azure OpenAI, Google Vertex, AWS Bedrock, Mistral, Groq, Together AI, Fireworks, Replicate, Ollama, HuggingFace, and Custom/Other.
Adding a Provider

LM Studio

- Go to Configuration > Model Providers
- Click Add Provider
- Select LM Studio as the type
- Enter the API base URL (e.g., http://192.168.1.100:1234/v1)
- API key: use lmstudio-local (static)
- Configure model settings (context window, capabilities)
- Click Save
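Before saving, you can confirm the LM Studio server is reachable from the ClawNex host. This sketch assumes the example address above and LM Studio's OpenAI-compatible /v1/models endpoint:

```shell
# List the models LM Studio is currently serving (OpenAI-compatible endpoint).
# The host and port below are the example values; substitute your own.
curl http://192.168.1.100:1234/v1/models
```

A JSON model list means the base URL is correct; a connection refused error usually means the LM Studio server isn't running or the port differs.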
OpenAI / Anthropic

- Go to Configuration > Model Providers
- Click Add Provider
- Select OpenAI or Anthropic
- Enter your API key from the provider dashboard
- The API base URL is pre-filled for cloud providers
- Click Save
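If you want to verify a key before entering it, you can query the provider's model list directly. Both calls below use the providers' public APIs; the environment variable names are placeholders:

```shell
# OpenAI: returns a JSON model list if the key is valid
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"

# Anthropic: uses the x-api-key header plus a required version header
curl https://api.anthropic.com/v1/models \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01"
```

A 401 response from either endpoint indicates a bad or revoked key.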
OpenRouter

- Go to Configuration > Model Providers
- Click Add Provider
- Select OpenRouter
- Enter your OpenRouter API key from openrouter.ai
- Click Save
OpenRouter gives you access to 100+ models through a single API key.
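OpenRouter's model catalog endpoint is public, so you can browse the available models before (or without) configuring a key:

```shell
# List OpenRouter's model catalog (no API key required for this endpoint)
curl https://openrouter.ai/api/v1/models
```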
LiteLLM Configuration
Providers configured through the dashboard are synced to litellm/config.yaml automatically. For manual configuration, edit:
- ~/sentinel/litellm/config.yaml: update api_base URLs and the model list
- ~/sentinel/litellm/start.sh: update API key exports
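For reference, a minimal config.yaml entry might look like the following. The model names and address are illustrative, not values ClawNex generates; adjust them to your deployment:

```yaml
model_list:
  - model_name: local-llama                   # alias agents request through the proxy
    litellm_params:
      model: openai/llama-3.1-8b-instruct     # openai/ prefix = OpenAI-compatible backend
      api_base: http://192.168.1.100:1234/v1
      api_key: lmstudio-local                 # LM Studio accepts any static key
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514
      api_key: os.environ/ANTHROPIC_API_KEY   # read from the environment set in start.sh
```

The `os.environ/VAR` syntax tells LiteLLM to read the key from the environment, which is why start.sh must export the matching variables.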
Testing the Connection
After adding a provider:
- Go to Infrastructure panel
- Verify the LiteLLM Proxy shows ONLINE (green)
- Go to Prompt Shield and run a manual scan — if it returns results, the shield engine is working
- Send a test message through an agent to confirm traffic flows through the proxy
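The last check can also be done from the command line. This sketch assumes LiteLLM's default port 4000 and a model alias from your config.yaml (both are assumptions; check your deployment):

```shell
# Send one chat completion through the LiteLLM proxy.
# Port 4000 is LiteLLM's default; the model alias must match config.yaml.
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-llama", "messages": [{"role": "user", "content": "ping"}]}'
```

A completion response here confirms the proxy can reach the backend provider end to end.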
Provider API keys are stored in the database and masked in GET responses. They are never exposed in plaintext through the API.