Use CodeGate with Open Interpreter

Open Interpreter lets LLMs run code locally through a ChatGPT-like interface in your terminal.

CodeGate works with OpenAI and compatible APIs through Open Interpreter.

You can also configure CodeGate muxing to select your provider and model using workspaces.

note

This guide assumes you have already installed Open Interpreter using their installation instructions.

Configure Open Interpreter to use CodeGate

To configure Open Interpreter to send requests through CodeGate, run interpreter with the API base set to CodeGate's local API address, http://localhost:8989/<provider>.
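For example, to route requests directly through a single provider instead of muxing, substitute that provider's route name for `<provider>`. The following is a sketch only; the `openai` route, the API key placeholder, and the model name are illustrative, not prescriptive:

```shell
# Illustrative example: send Open Interpreter traffic through CodeGate's
# OpenAI provider route (assumes CodeGate is listening on localhost:8989).
# Replace the key placeholder with your real OpenAI API key and pick the
# model you actually want to use.
interpreter --api_base http://localhost:8989/openai --api_key YOUR_OPENAI_API_KEY --model openai/gpt-4o
```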

First, configure your provider(s) and select a model for each of your workspace(s) in the CodeGate dashboard.

When you run interpreter, the API key parameter is required but its value is ignored. The --model setting must start with openai/; the actual model is determined by your CodeGate workspace.

interpreter --api_base http://localhost:8989/v1/mux --api_key fake-value-not-used --model openai/fake-value-not-used
info

The --model parameter value must start with openai/ for CodeGate to properly handle the request.

Verify configuration

To verify that you've successfully connected Open Interpreter to CodeGate, type codegate version into the Open Interpreter chat. You should receive a response like "CodeGate version 0.1.16".

Next steps

Learn more about CodeGate's features and explore the dashboard.

Remove CodeGate

If you decide to stop using CodeGate, follow these steps to remove it and revert your environment.

  1. Quit Open Interpreter (Ctrl+C) and re-run it without the --api_base parameter.

  2. Stop and remove the CodeGate container:

    docker rm -f codegate
  3. If you launched CodeGate with a persistent volume, delete it to remove the CodeGate database and other files:

    docker volume rm codegate_volume