πŸ“‹ Prerequisites

  • Windows, macOS, or Linux computer
  • Cline installed in VS Code

πŸš€ Setup Steps

1. Install Ollama

  • Visit ollama.com
  • Download and install for your operating system
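
  • Optionally, verify the install from a terminal; if Ollama is on your PATH, this prints the installed version:

    ollama --version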

2. Choose and Download a Model

  • Browse models at ollama.com/search

  • Select a model and copy its run command:

    ollama run [model-name]

  • Open your terminal and run the command, for example:

    ollama run llama2

✨ Your model is now downloaded and ready; the next step connects it to Cline!
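
To confirm the download, you can list the models stored locally (the tag shown may differ slightly from what you typed, e.g. llama2:latest):

    ollama list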

3. Configure Cline

  1. Open VS Code
  2. Click the Cline settings icon
  3. Select “Ollama” as the API provider
  4. Enter the configuration:
    • Base URL: http://localhost:11434/ (the default; can be left as is)
    • Model: select the model you downloaded from the available options
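
To check the endpoint Cline will talk to, a quick request to Ollama’s local API should list the models you have pulled (this assumes the default port 11434):

    curl http://localhost:11434/api/tags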

⚠️ Important Notes

  • Start Ollama before using it with Cline
  • Keep Ollama running in the background (see the command below)
  • The first model download may take several minutes
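
On macOS and Windows the desktop app normally keeps the server running for you; if it is not running (for example, on a headless Linux machine), you can start it manually and leave it in the background:

    ollama serve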

πŸ”§ Troubleshooting

If Cline can’t connect to Ollama:

  1. Verify Ollama is running
  2. Check that the base URL is correct
  3. Ensure the model is downloaded
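
Each check above maps to a quick command; if one fails, revisit the corresponding setup step (this assumes the default base URL):

    # Checks 1 and 2: is the server reachable at the base URL?
    curl http://localhost:11434/api/version
    # Check 3: is the model downloaded locally?
    ollama list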

Need more info? Read the Ollama Docs.