Configure CodeGate

Customize CodeGate's behavior

The CodeGate container runs with defaults that support the Ollama, Anthropic, and OpenAI APIs. To customize its behavior, supply extra configuration parameters to the container as environment variables:

docker run --name codegate -d -p 8989:8989 -p 9090:80 \
[-e KEY=VALUE ...] \
--restart unless-stopped ghcr.io/stacklok/codegate

Config parameters

CodeGate supports the following parameters:

| Parameter | Default value | Description |
| --- | --- | --- |
| CODEGATE_OLLAMA_URL | http://host.docker.internal:11434 | Specifies the URL of an Ollama instance. Used when the provider in your plugin config is ollama. |
| CODEGATE_VLLM_URL | http://localhost:8000 | Specifies the URL of a model hosted by a vLLM server. Used when the provider in your plugin config is vllm. |
| CODEGATE_ANTHROPIC_URL | https://api.anthropic.com/v1 | Specifies the Anthropic engine API endpoint URL. |
| CODEGATE_OPENAI_URL | https://api.openai.com/v1 | Specifies the OpenAI engine API endpoint URL. |
| CODEGATE_APP_LOG_LEVEL | WARNING | Sets the logging level. Valid values: ERROR, WARNING, INFO, DEBUG (case sensitive). |
| CODEGATE_LOG_FORMAT | TEXT | Sets the log format. Valid values: TEXT, JSON (case sensitive). |
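For example, to turn up logging verbosity and emit JSON-formatted logs, pass both parameters from the table above when launching the container:

docker run --name codegate -d -p 8989:8989 -p 9090:80 \
-e CODEGATE_APP_LOG_LEVEL=DEBUG \
-e CODEGATE_LOG_FORMAT=JSON \
--restart unless-stopped ghcr.io/stacklok/codegate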

Example: Use CodeGate with OpenRouter

OpenRouter is a unified interface to many large language models. Because its API is OpenAI-compatible, CodeGate's vLLM provider can proxy to it when used with the Continue IDE plugin.

To use OpenRouter, set the vLLM URL when you launch CodeGate:

docker run --name codegate -d -p 8989:8989 -p 9090:80 \
-e CODEGATE_VLLM_URL=https://openrouter.ai/api \
--restart unless-stopped ghcr.io/stacklok/codegate

Then configure the Continue IDE plugin with the vLLM endpoint (http://localhost:8989/vllm/), the model you want to use, and your OpenRouter API key.
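In Continue's config.json, the model entry might look like the following sketch. The model name and API key are placeholders, and the vllm provider name and field layout are assumptions about the plugin's configuration format rather than settings confirmed by this guide:

{
  "models": [
    {
      "title": "CodeGate-OpenRouter",
      "provider": "vllm",
      "model": "anthropic/claude-3.5-sonnet",
      "apiBase": "http://localhost:8989/vllm/",
      "apiKey": "YOUR_OPENROUTER_API_KEY"
    }
  ]
}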