LM Studio
A quick guide to setting up LM Studio for local AI model execution with Cline.
Run AI models locally using LM Studio with Cline.
A Windows, macOS, or Linux computer with AVX2 support
Cline installed in VS Code
Visit lmstudio.ai
Download and install for your operating system
Open the installed application
You'll see four tabs on the left: Chat, Developer (where you start the local server), My Models (where your downloaded models are stored), and Discover (where you browse and add new models)
Browse the "Discover" page
Select and download your preferred model
Wait for the download to complete
Navigate to the "Developer" tab
Toggle the server switch to "Running"
Note: The server will run at http://localhost:1234
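Once the server shows "Running", you can confirm it is reachable before configuring Cline. LM Studio exposes an OpenAI-compatible API, so its `/v1/models` endpoint lists the models currently available. A minimal sketch, assuming the default address above:

```python
import json
from urllib import request
from urllib.error import URLError

# Default LM Studio server address (see the Developer tab)
BASE_URL = "http://localhost:1234"

def list_models(base_url: str = BASE_URL) -> list[str]:
    """Return model IDs the local server reports, or [] if unreachable."""
    try:
        with request.urlopen(f"{base_url}/v1/models", timeout=5) as resp:
            payload = json.load(resp)
    except (URLError, OSError):
        return []  # server not running or not reachable
    return [m["id"] for m in payload.get("data", [])]

if __name__ == "__main__":
    models = list_models()
    print(models or "LM Studio server not reachable -- is it running?")
```

An empty result here usually means the server toggle is off or no model is loaded yet.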
Open VS Code
Click the Cline settings icon
Select "LM Studio" as API provider
Select your model from the available options
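Under the hood, Cline talks to LM Studio through its OpenAI-compatible API, so any OpenAI-style client can exercise the same endpoint. A hedged sketch of a chat-completion request against the default address (the model name is a placeholder; use whichever model you downloaded):

```python
import json
from urllib import request
from urllib.error import URLError

def build_chat_request(model: str, prompt: str) -> bytes:
    """Build an OpenAI-style chat-completion payload as JSON bytes."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }).encode()

def chat(model: str, prompt: str, base_url: str = "http://localhost:1234"):
    """Send one chat turn to the local server; None if it is unreachable."""
    req = request.Request(
        f"{base_url}/v1/chat/completions",
        data=build_chat_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    try:
        with request.urlopen(req, timeout=60) as resp:
            return json.load(resp)["choices"][0]["message"]["content"]
    except (URLError, OSError):
        return None  # server not running
```

If this returns a reply, Cline's connection to the same server should work as well.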
Start LM Studio before using it with Cline
Keep LM Studio running in background
The first model download may take several minutes, depending on model size
Models are stored locally after download
If Cline can't connect to LM Studio:
Verify the LM Studio server is running (check the Developer tab)
Ensure a model is loaded
Check your system meets hardware requirements
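The first troubleshooting step above can be automated with a plain TCP check: if nothing is accepting connections on port 1234, the server toggle is off (or the server is bound to a different port). A small sketch, assuming the default host and port:

```python
import socket

def server_reachable(host: str = "localhost", port: int = 1234,
                     timeout: float = 2.0) -> bool:
    """True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if server_reachable():
        print("Port 1234 is open -- check that a model is loaded next.")
    else:
        print("Nothing listening on port 1234 -- start the server in the Developer tab.")
```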