How to use LM Studio in BoltAI
Written By Daniel Nguyen
Last updated 3 months ago
LM Studio is a powerful, user-friendly desktop application for macOS, Windows, and Linux that lets you discover, download, and run large language models (LLMs) on your own computer, entirely offline.
Follow these quick steps to run local models from LM Studio inside BoltAI.
Install & enable LM Studio server
Open LM Studio and switch to “Power User” mode.
Open the Developer tab.
Toggle Enable local server on.
(Optional) Copy the server address if you've customized the server port.

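If you want to confirm the server is actually listening before moving on, you can query its OpenAI-compatible models endpoint. The sketch below assumes LM Studio's default address, `http://localhost:1234`; if you changed the port, substitute the address shown in the Developer tab.

```python
import json
import urllib.request

# Assumed default: LM Studio's local server at http://localhost:1234
# exposes an OpenAI-compatible API under /v1.
SERVER = "http://localhost:1234/v1/models"

def list_models(url: str):
    """Return the model IDs the server reports, or None if it is unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return [m["id"] for m in json.load(resp).get("data", [])]
    except OSError:  # connection refused, timeout, etc.
        return None

models = list_models(SERVER)
if models is None:
    print("Could not reach LM Studio - is Enable local server toggled on?")
else:
    print("Server is up; available models:", models)
```

A non-`None` result means the server is reachable and BoltAI should be able to connect to it.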
Load your favorite AI model(s)
Click “Select a model to load” (or press ⌘L).
Select your model from the list.
Click “Load Model”.
LM Studio should show a “READY” badge.

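To verify the loaded model responds before wiring it into BoltAI, you can send a request to LM Studio's chat completions endpoint directly. This sketch assumes the default server address and uses a placeholder model name; replace both with the values LM Studio shows.

```python
import json
import urllib.request

# Assumed default base URL for LM Studio's OpenAI-compatible server.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the server's /chat/completions endpoint."""
    body = json.dumps({
        "model": model,  # placeholder; use the identifier LM Studio displays
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("your-model-name", "Say hello in one word.")
# Uncomment to send once the model shows the READY badge:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this returns a completion, the model is loaded and serving correctly.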
Add LM Studio in BoltAI
The LM Studio server is now ready. Let’s configure it in BoltAI.
Open BoltAI → Settings → AI Services.
Click Add → choose LM Studio → Next.
Set a Default Model → Add Service.

Start a new chat with LM Studio models
Click New Chat.
Choose the LM Studio service/model in the model switcher and start your conversation.

Quick checks (if it doesn’t connect)
LM Studio is running and Enable local server is toggled on.
The Base URL/port in BoltAI matches what LM Studio shows (Developer tab).
The model you picked is downloaded and loaded in LM Studio.