Advanced configuration

Customize CodeGate's behavior

The CodeGate container ships with defaults that work with supported LLM providers in typical setups. To customize CodeGate's application settings, such as provider endpoints and the logging level, pass extra configuration parameters to the container as environment variables:

docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
[-e KEY=VALUE ...] \
--mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
--restart unless-stopped ghcr.io/stacklok/codegate
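For example, to enable debug logging with JSON-formatted output (both parameters are described in the table below), you might launch the container like this:

docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
-e CODEGATE_APP_LOG_LEVEL=DEBUG \
-e CODEGATE_LOG_FORMAT=JSON \
--mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
--restart unless-stopped ghcr.io/stacklok/codegate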

Config parameters

CodeGate supports the following parameters:

Parameter                 Default value                       Description
CODEGATE_APP_LOG_LEVEL    WARNING                             Sets the logging level. Valid values: ERROR, WARNING, INFO, DEBUG (case sensitive).
CODEGATE_LOG_FORMAT       TEXT                                Type of log formatting. Valid values: TEXT, JSON (case sensitive).
CODEGATE_ANTHROPIC_URL    https://api.anthropic.com/v1        Specifies the Anthropic engine API endpoint URL.
CODEGATE_LM_STUDIO_URL    http://host.docker.internal:1234    Specifies the URL of your LM Studio server.
CODEGATE_OLLAMA_URL       http://host.docker.internal:11434   Specifies the URL of your Ollama instance.
CODEGATE_OPENAI_URL       https://api.openai.com/v1           Specifies the OpenAI engine API endpoint URL.
CODEGATE_VLLM_URL         http://localhost:8000               Specifies the URL of the vLLM server to use.
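To confirm which values a running container was started with, you can inspect its environment. This is generic Docker usage, not a CodeGate feature:

docker inspect --format '{{json .Config.Env}}' codegate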

Example: Use CodeGate with a remote Ollama server

Set the Ollama server's URL when you launch CodeGate:

docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
-e CODEGATE_OLLAMA_URL=https://my.ollama-server.example \
--mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
--restart unless-stopped ghcr.io/stacklok/codegate
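Note that environment variables are fixed when a container is created, so changing a setting afterward means removing and recreating the container. The named volume preserves CodeGate's data across recreations. A minimal sketch:

docker rm -f codegate
# ...then repeat the docker run command above with the updated -e values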