Providers Overview

AI Supreme Council connects to large language model (LLM) providers directly from your browser. There is no proxy server in between -- your API keys and conversations go straight to the provider's API endpoint. This is the BYOK (Bring Your Own Key) model.

How It Works

  1. You obtain an API key from a provider (e.g., Anthropic, OpenAI, Google)
  2. You paste the key into AI Supreme Council's settings
  3. The key is stored locally in your browser (localStorage) -- it never touches our servers
  4. When you send a message, the browser calls the provider's API directly
  5. Responses stream back to you in real time via Server-Sent Events (SSE)
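
For illustration, here is a minimal sketch of steps 4 and 5, assuming an OpenAI-compatible endpoint. The endpoint URL, model name, and parsing details are examples, not the app's actual implementation.

```ts
// Sketch of the BYOK flow: the browser calls the provider directly and
// reads the SSE stream. Endpoint and model name are illustrative.
async function streamChat(apiKey: string, userMessage: string): Promise<void> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // key goes straight to the provider
    },
    body: JSON.stringify({
      model: "gpt-4.1",
      stream: true, // request Server-Sent Events
      messages: [{ role: "user", content: userMessage }],
    }),
  });

  // Read the SSE stream chunk by chunk and print content deltas as they arrive.
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith("data: ") || line.slice(6).trim() === "[DONE]") continue;
      const delta = JSON.parse(line.slice(6)).choices[0]?.delta?.content;
      if (delta) console.log(delta);
    }
  }
}
```
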
Key Security

API keys are stored exclusively in your browser's localStorage. They are never included in shared bot URLs, never sent to AI Supreme Council servers, and never logged. Only the provider you are chatting with receives your key.
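
As a sketch of what localStorage-only storage amounts to (the storage key name below is an assumption, not the app's actual schema):

```ts
// Keys live only in the browser's localStorage; "apiKeys" is a
// hypothetical storage key used for illustration.
function saveApiKey(provider: string, apiKey: string): void {
  const keys = JSON.parse(localStorage.getItem("apiKeys") ?? "{}");
  keys[provider] = apiKey;
  localStorage.setItem("apiKeys", JSON.stringify(keys)); // never leaves the browser
}

function loadApiKey(provider: string): string | undefined {
  return JSON.parse(localStorage.getItem("apiKeys") ?? "{}")[provider];
}
```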

Provider Comparison

| Provider | API Key Required | Free Tier | Notable Models | Reasoning | Vision |
| --- | --- | --- | --- | --- | --- |
| Google Gemini | Yes | Yes (no credit card) | Gemini 2.5 Flash, 2.5 Pro, 3 Flash Preview | Yes | Yes |
| OpenRouter | Yes | Yes (20+ free models) | 300+ models from all providers | Yes | Yes |
| Groq | Yes | Yes (rate limited) | Llama 3.3 70B, DeepSeek R1 Distill, Compound Beta | Yes | Yes |
| Anthropic | Yes | No | Claude Opus 4.6, Sonnet 4.5, Haiku 4.5 | Yes | Yes |
| OpenAI | Yes | No | GPT-5, GPT-4.1, o3, o4-mini | Yes | Yes |
| xAI | Yes | No | Grok 4.1 Fast, Grok 4, Grok 3 | Yes | Yes |
| DeepSeek | Yes | No | DeepSeek V3.2, R1, V3.2 Reasoner | Yes | No |
| Mistral | Yes | No | Mistral Large 3, Codestral, Devstral 2 | No | Yes |
| Ollama | No | Free (local) | Any model you install locally | Varies | Varies |

Start for Free

The fastest way to get started is with Google Gemini (free API key, no credit card) or OpenRouter (20+ free models including DeepSeek R1, Qwen 3, and Llama 3.3). See Getting Started for a walkthrough.

Adding an API Key

  1. Open AI Supreme Council at aiscouncil.com
  2. Click the Settings gear icon in the sidebar
  3. Go to the AI Model tab
  4. Find the provider you want and paste your API key
  5. The key is saved immediately and persisted in your browser

You can also enter API keys during the first-run wizard when creating your first bot profile.

Per-Bot API Keys

Each bot profile can have its own API key that overrides the global key for that provider. This is useful if you have separate keys for different projects or billing accounts. Set a per-bot key in the bot's configuration panel (right sidebar).
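
Conceptually, the override works like the sketch below; the type and field names are illustrative, not the app's actual data model.

```ts
// A bot-level key wins over the global key for the same provider.
interface BotProfile {
  provider: string;
  model: string;
  apiKey?: string; // optional per-bot override
}

function resolveApiKey(
  bot: BotProfile,
  globalKeys: Record<string, string>,
): string | undefined {
  return bot.apiKey ?? globalKeys[bot.provider];
}
```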

How Provider Selection Works

When you create a bot profile, you choose a provider and a model. The provider determines which API endpoint is called, and the model determines which specific AI you are chatting with.

Models are loaded from the community model registry, which is updated independently of the app. New models appear automatically when the registry is refreshed (every 24 hours, or on page reload).
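
A rough sketch of that refresh rule, with a placeholder registry URL and cache keys (the real registry location and schema may differ):

```ts
// Reuse a cached registry for up to 24 hours, otherwise fetch a fresh copy.
const REGISTRY_URL = "https://example.com/model-registry.json"; // placeholder
const TTL_MS = 24 * 60 * 60 * 1000;

async function loadModelRegistry(): Promise<unknown> {
  const cachedAt = Number(localStorage.getItem("registryFetchedAt") ?? 0);
  const cached = localStorage.getItem("registryData");
  if (cached && Date.now() - cachedAt < TTL_MS) {
    return JSON.parse(cached); // still fresh: no network call needed
  }
  const registry = await (await fetch(REGISTRY_URL)).json();
  localStorage.setItem("registryData", JSON.stringify(registry));
  localStorage.setItem("registryFetchedAt", String(Date.now()));
  return registry;
}
```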

API Formats

Most providers use the OpenAI-compatible Chat Completions API format. The two exceptions are Anthropic and Google Gemini:

| Format | Providers | Notes |
| --- | --- | --- |
| OpenAI-compatible | OpenAI, xAI, OpenRouter, DeepSeek, Mistral, Groq, Ollama, and others | Standard POST /v1/chat/completions with Bearer auth |
| Anthropic | Anthropic | Custom Messages API with x-api-key header |
| Gemini | Google Gemini | Native generateContent API with ?key= query param |

These differences are handled automatically -- you do not need to worry about API formats when using the app.
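
For the curious, the sketch below shows roughly how the three request shapes differ. Model names and API versions are examples only.

```ts
// Rough comparison of the three request formats the app abstracts away.
async function callOpenAICompatible(key: string, base: string, text: string) {
  return fetch(`${base}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${key}` },
    body: JSON.stringify({ model: "gpt-4.1", messages: [{ role: "user", content: text }] }),
  });
}

async function callAnthropic(key: string, text: string) {
  return fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": key, // key goes in a header, not Bearer auth
      "anthropic-version": "2023-06-01",
    },
    body: JSON.stringify({
      model: "claude-sonnet-4-5",
      max_tokens: 1024,
      messages: [{ role: "user", content: text }],
    }),
  });
}

async function callGemini(key: string, text: string) {
  const url =
    "https://generativelanguage.googleapis.com/v1beta/models/" +
    `gemini-2.5-flash:generateContent?key=${key}`; // key as query parameter
  return fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ contents: [{ parts: [{ text }] }] }),
  });
}
```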

Reasoning / Thinking Support

Several providers support reasoning or "thinking" modes where the model shows its step-by-step thought process before answering:

| Provider | Feature Name | How to Enable |
| --- | --- | --- |
| Anthropic | Extended Thinking | Set reasoning effort in config panel (budget tokens or preset) |
| Google Gemini | Thinking Config | Set reasoning effort in config panel (budget tokens or preset) |
| OpenAI-compatible | Reasoning Effort | Set to low, medium, or high in config panel |

Reasoning output appears in a collapsible "thinking" block above the model's response.
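
As an illustration, these are roughly the request-body fragments involved for each style. Exact field names and supported values vary by model, so treat them as examples rather than a spec.

```ts
// Anthropic Messages API: extended thinking with a token budget.
const anthropicBody = {
  model: "claude-sonnet-4-5",
  max_tokens: 4096,
  thinking: { type: "enabled", budget_tokens: 2048 },
  messages: [{ role: "user", content: "..." }],
};

// Gemini generateContent: thinking budget inside generationConfig.
const geminiBody = {
  contents: [{ parts: [{ text: "..." }] }],
  generationConfig: { thinkingConfig: { thinkingBudget: 2048 } },
};

// OpenAI-compatible Chat Completions: a reasoning effort preset.
const openAIBody = {
  model: "o4-mini",
  reasoning_effort: "medium", // "low" | "medium" | "high"
  messages: [{ role: "user", content: "..." }],
};
```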

Custom Providers

You can add any OpenAI-compatible API endpoint as a custom provider:

  1. Open Settings > AI Model
  2. Scroll to Custom Providers
  3. Enter a name, API endpoint URL, and API key
  4. The custom provider appears in the provider dropdown when creating bot profiles

Custom providers are persisted in localStorage and support all standard features (streaming, tool calling, etc.) as long as the endpoint implements the OpenAI Chat Completions format.
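
Conceptually, a custom provider entry boils down to a name, a base URL, and a key persisted locally, as in the sketch below. The field and storage names are illustrative, not the app's actual schema.

```ts
// Hypothetical shape of a custom provider entry kept in localStorage.
interface CustomProvider {
  name: string;
  baseUrl: string; // must serve the OpenAI Chat Completions format
  apiKey: string;
}

function saveCustomProvider(provider: CustomProvider): void {
  const existing: CustomProvider[] = JSON.parse(
    localStorage.getItem("customProviders") ?? "[]",
  );
  localStorage.setItem("customProviders", JSON.stringify([...existing, provider]));
}

// Example: a self-hosted OpenAI-compatible server (URL is a placeholder).
saveCustomProvider({
  name: "my-local-vllm",
  baseUrl: "http://localhost:8000/v1",
  apiKey: "sk-placeholder",
});
```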

Usage Tracking

AI Supreme Council tracks token usage per provider in Settings > Usage. You can see input tokens, output tokens, and estimated costs across all your chat sessions. This helps you monitor spending without needing to check each provider's dashboard separately.
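
A minimal sketch of how per-provider totals can be accumulated from the usage block that OpenAI-compatible responses return (the tracker's shape here is an assumption, not the app's actual bookkeeping):

```ts
// Accumulate input/output token counts per provider.
interface UsageTotals {
  inputTokens: number;
  outputTokens: number;
}

const usageByProvider: Record<string, UsageTotals> = {};

function recordUsage(
  provider: string,
  usage: { prompt_tokens: number; completion_tokens: number },
): void {
  const totals = (usageByProvider[provider] ??= { inputTokens: 0, outputTokens: 0 });
  totals.inputTokens += usage.prompt_tokens;
  totals.outputTokens += usage.completion_tokens;
}

// Example: after a non-streaming Chat Completions call
// recordUsage("openai", responseJson.usage);
```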