LiteLLM & Cline (using Codestral)


Using LiteLLM with Cline

This guide demonstrates how to run the LiteLLM proxy locally with the Codestral model and connect it to Cline.

Prerequisites

  • Docker CLI or Docker Desktop installed to run the LiteLLM image locally

  • For this example config: a Codestral API key (note that this is different from a Mistral API key)

Setup

  1. Create a .env file and fill in the appropriate fields:

# Tip: Use the following command to generate a random alphanumeric key:
# openssl rand -base64 32 | tr -dc 'A-Za-z0-9' | head -c 32
LITELLM_MASTER_KEY=YOUR_LITELLM_MASTER_KEY
CODESTRAL_API_KEY=YOUR_CODESTRAL_API_KEY

Note: Although the proxy is only exposed on localhost, it's good practice to set LITELLM_MASTER_KEY to something secure.

  2. Configuration

We'll create a config.yaml file to hold the LiteLLM configuration. In this case we'll define a single model, codestral/codestral-latest, and expose it under the alias codestral:

model_list:
  - model_name: codestral
    litellm_params:
      model: codestral/codestral-latest
      api_key: os.environ/CODESTRAL_API_KEY
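
Here model_name is the alias clients will request, litellm_params.model uses LiteLLM's provider/model format, and api_key is read from the environment. If you later want to serve more models through the same proxy, model_list accepts additional entries. A hypothetical sketch (the second entry assumes a separate MISTRAL_API_KEY in your .env, which this demo does not require):

model_list:
  - model_name: codestral
    litellm_params:
      model: codestral/codestral-latest
      api_key: os.environ/CODESTRAL_API_KEY
  # Hypothetical second model; would require MISTRAL_API_KEY in .env
  - model_name: mistral-large
    litellm_params:
      model: mistral/mistral-large-latest
      api_key: os.environ/MISTRAL_API_KEY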

Running the Demo

  1. Start the LiteLLM Docker container:

docker run \
    --env-file .env \
    -v $(pwd)/config.yaml:/app/config.yaml \
    -p 127.0.0.1:4000:4000 \
    ghcr.io/berriai/litellm:main-latest \
    --config /app/config.yaml --detailed_debug
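
Before configuring Cline, it's worth confirming the proxy is actually serving the model. A minimal check, assuming the container started with the command above (substitute the master key you set in .env):

curl http://127.0.0.1:4000/v1/models \
    -H "Authorization: Bearer YOUR_LITELLM_MASTER_KEY"

The response should be a JSON list that includes the codestral alias.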
  2. Set up Cline

    Once the LiteLLM server is up and running, you can point Cline at it:

    • Base URL should be http://127.0.0.1:4000/v1 (the address the Docker command above publishes the proxy on)

    • API Key should be the value you set for LITELLM_MASTER_KEY in .env

    • Model ID is codestral, or whatever model_name you gave the model in config.yaml
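
If Cline can't connect, a quick way to isolate the problem is to hit the same OpenAI-compatible endpoint Cline uses with a direct request. This is a sketch; substitute your own master key:

curl http://127.0.0.1:4000/v1/chat/completions \
    -H "Authorization: Bearer YOUR_LITELLM_MASTER_KEY" \
    -H "Content-Type: application/json" \
    -d '{
      "model": "codestral",
      "messages": [{"role": "user", "content": "Say hello"}]
    }'

A successful JSON response here means the proxy and Codestral key are working, so any remaining issue is in the Cline settings.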

Getting Help

Author: mdp

  • LiteLLM Documentation
  • Mistral AI Console
  • Cline Discord Community