LM Studio
A quick guide to setting up LM Studio for local AI model execution with Cline.
🤖 Setting Up LM Studio with Cline
Run AI models locally using LM Studio with Cline.
📋 Prerequisites
- Windows, macOS, or Linux computer with AVX2 support
- Cline installed in VS Code
🚀 Setup Steps
1. Install LM Studio
- Visit lmstudio.ai
- Download and install for your operating system
2. Launch LM Studio
- Open the installed application
- You’ll see four tabs in the left sidebar: Chat, Developer (where you start the local server), My Models (where your downloaded models are stored), and Discover (where you browse and add new models)
3. Download a Model
- Browse the “Discover” page
- Select and download your preferred model
- Wait for download to complete
4. Start the Server
- Navigate to the “Developer” tab
- Toggle the server switch to “Running”
- Note: The server will run at http://localhost:1234
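Once the server is running, you can confirm it is reachable before configuring Cline. A minimal sketch using only the Python standard library, assuming the default address http://localhost:1234 from the step above (LM Studio's server exposes an OpenAI-compatible `/v1/models` endpoint):

```python
import json
import urllib.error
import urllib.request


def list_local_models(base_url="http://localhost:1234", timeout=5):
    """Return the model IDs the LM Studio server currently exposes,
    or an empty list if the server is not reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/v1/models", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["id"] for m in data.get("data", [])]
    except (urllib.error.URLError, OSError):
        # Server not running, wrong port, or connection refused
        return []


print(list_local_models())
```

If this prints an empty list while LM Studio is open, double-check that the server toggle in the Developer tab is set to “Running”.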
5. Configure Cline
- Open VS Code
- Click the Cline settings icon
- Select “LM Studio” as the API provider
- Select your model from the available options
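Under the hood, Cline talks to LM Studio over its OpenAI-compatible chat completions endpoint. The sketch below shows the shape of such a request, assuming the default address http://localhost:1234; the `model` value would be the ID of whichever model you loaded (the function name `chat_once` is illustrative, not part of either tool):

```python
import json
import urllib.request


def chat_once(prompt, model, base_url="http://localhost:1234", timeout=60):
    """Send a single OpenAI-style chat completion request to the
    local LM Studio server and return the assistant's reply text."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

You normally never call this yourself; Cline handles it once the provider is configured. It is shown here only to make clear why LM Studio's server must be running first.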
⚠️ Important Notes
- Start LM Studio before using it with Cline
- Keep LM Studio running in background
- First model download may take several minutes depending on size
- Models are stored locally after download
🔧 Troubleshooting
- If Cline can’t connect to LM Studio:
  - Verify the LM Studio server is running (check the Developer tab)
  - Ensure a model is loaded
  - Check that your system meets the hardware requirements
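The first troubleshooting check can be scripted. A small probe using the Python standard library, assuming the default server address localhost:1234:

```python
import socket


def lmstudio_server_up(host="localhost", port=1234, timeout=2.0):
    """Return True if something is accepting TCP connections on the
    LM Studio server port, False otherwise."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


print(lmstudio_server_up())
```

A `True` here means the port is open but does not guarantee a model is loaded; if Cline still fails, check the Developer tab for a loaded model.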