📋 Prerequisites

  • Windows, macOS, or Linux computer
  • Cline installed in VS Code

🚀 Setup Steps

1. Install Ollama

  • Visit ollama.com
  • Download and install for your operating system
[Image: Ollama download page]

2. Choose and Download a Model

  • Browse models at ollama.com/search
  • Select a model and copy its run command:
    ollama run [model-name]
    
[Image: Selecting a model in Ollama]
  • Open your Terminal and run the command:
    • Example:
      ollama run llama2
      
[Image: Running Ollama in terminal]
✨ Your model is now ready to use within Cline!
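If you'd rather download a model without opening an interactive chat, or confirm which models are already installed, the Ollama CLI also provides `pull` and `list` (shown here with the `llama2` example model from above):

```shell
# Download the model without starting an interactive session
ollama pull llama2

# List locally installed models; the model you plan to use in Cline
# should appear in this output
ollama list
```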

3. Configure Cline

  1. Open VS Code
  2. Click Cline settings icon
  3. Select โ€œOllamaโ€ as API provider
  4. Enter configuration:
    • Base URL: http://localhost:11434/ (default value, can be left as is)
    • Select the model from your available options
[Image: Configuring Cline with Ollama]

โš ๏ธ Important Notes

  • Start Ollama before using with Cline
  • Keep Ollama running in background
  • First model download may take several minutes
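On macOS and Windows, the Ollama desktop app normally keeps the server running in the background. On Linux, or if the app isn't running, you can start the server manually from a terminal:

```shell
# Start the Ollama server in the foreground
# (it listens on localhost:11434 by default)
ollama serve
```

Leave this terminal open while you work in Cline; closing it stops the server.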

🔧 Troubleshooting

If Cline can't connect to Ollama:
  1. Verify Ollama is running
  2. Check base URL is correct
  3. Ensure model is downloaded
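A quick way to check all three points from the command line, assuming the default base URL:

```shell
# The Ollama server replies "Ollama is running" at its root URL
curl http://localhost:11434/

# List the models the server has available; the model selected
# in Cline must appear in this response
curl http://localhost:11434/api/tags
```

If the first request fails, Ollama isn't running; if the second response doesn't include your model, download it with `ollama pull`.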
Need more info? Read the Ollama Docs.